Sample records for traditional processing techniques

  1. [Contention on the theory of processing techniques of Chinese materia medica in the Ming-Qing period].

    PubMed

    Chen, Bin; Jia, Tianzhu

    2015-03-01

    Building on the golden age of development of medicinal processing techniques in the Song dynasty, the theory and techniques of processing developed further and matured in the Ming-Qing dynasties. The views of some physicians on the processing of common medicinals, such as Radix rehmanniae and Radix ophiopogonis, were questioned, with new processing methods put forward and argued against those insisting on traditional ones, marking the progress of the art of processing. By reviewing the contention over the technical theory of medicinal processing in the Ming-Qing period, useful references can be provided for the inheritance and development of the traditional art of processing medicinals.

  2. Symmetric Phase Only Filtering for Improved DPIV Data Processing

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    2006-01-01

    The standard approach in Digital Particle Image Velocimetry (DPIV) data processing is to use Fast Fourier Transforms to obtain the cross-correlation of two single-exposure subregions, where the location of the cross-correlation peak is representative of the most probable particle displacement across the subregion. This standard DPIV processing technique is analogous to Matched Spatial Filtering, a technique commonly used in optical correlators to perform the cross-correlation operation. Phase-only filtering is a well-known variation of Matched Spatial Filtering which, when used to process DPIV image data, yields correlation peaks that are narrower and up to an order of magnitude larger than those obtained using traditional DPIV processing. In addition to possessing desirable correlation plane features, phase-only filters also provide superior performance in the presence of DC noise in the correlation subregion. When DPIV image subregions contaminated with surface flare light or high background noise levels are processed using phase-only filters, the correlation peak pertaining only to the particle displacement is readily detected above any signal stemming from the DC objects. Tedious image masking or background image subtraction is not required. Both theoretical and experimental analyses of the signal-to-noise ratio performance of the filter functions are presented. In addition, a new Symmetric Phase Only Filtering (SPOF) technique, a variation on the traditional phase-only filtering technique, is described and demonstrated. The SPOF technique exceeds the performance of the traditionally accepted phase-only filtering techniques and is easily implemented in standard DPIV FFT-based correlation processing with no significant computational performance penalty. An "Automatic" SPOF algorithm is presented which determines when the SPOF is able to provide better signal-to-noise results than traditional PIV processing. The SPOF-based optical correlation processing approach is presented as a new paradigm for more robust cross-correlation processing of low signal-to-noise ratio DPIV image data.
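
    The FFT-based correlation described above, with and without a phase-only (unit-magnitude) spectrum, can be sketched in a few lines of NumPy. This is an illustrative sketch on synthetic data with an assumed 64x64 subregion size, not the SPOF implementation from the paper:

```python
import numpy as np

def cross_correlate(a, b, phase_only=False):
    # FFT-based circular cross-correlation of two subregions.
    # With phase_only=True the cross-power spectrum is normalized to
    # unit magnitude (a phase-only filter), which sharpens the
    # displacement peak and suppresses DC/background contributions.
    spectrum = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    if phase_only:
        mag = np.abs(spectrum)
        spectrum = spectrum / np.where(mag == 0, 1.0, mag)
    return np.fft.fftshift(np.real(np.fft.ifft2(spectrum)))

# Synthetic 64x64 "subregions": frame2 is frame1 shifted by (3, 5) pixels.
rng = np.random.default_rng(0)
frame1 = rng.random((64, 64))
frame2 = np.roll(frame1, shift=(3, 5), axis=(0, 1))

corr = cross_correlate(frame1, frame2, phase_only=True)
peak = np.unravel_index(np.argmax(corr), corr.shape)
displacement = (32 - peak[0], 32 - peak[1])  # lag relative to the centre
print(displacement)  # recovers the (3, 5) pixel shift
```

    With exact synthetic data the phase-only correlation plane is essentially a delta function at the displacement, which is the peak-narrowing effect the abstract describes.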

  3. Comparison of an automated Most Probable Number (MPN) technique to traditional plating methods for estimating populations of total aerobes, coliforms and E. coli associated with freshly processed broiler chickens

    USDA-ARS's Scientific Manuscript database

    Traditional microbiological techniques for estimating populations of viable bacteria can be laborious and time consuming. The Most Probable Number (MPN) technique is especially tedious as multiple series of tubes must be inoculated at several different dilutions. Recently, an instrument (TEMPOTM) ...
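
    The MPN estimate behind the tube counts is a maximum-likelihood calculation: find the concentration at which the observed numbers of positive tubes are most probable. A minimal pure-Python sketch (the helper name and bisection bounds are assumptions, not from the manuscript):

```python
import math

def mpn_ml(volumes, positives, total):
    # Most Probable Number by maximum likelihood.
    #   volumes   : sample volume inoculated per tube at each dilution
    #   positives : number of positive tubes at each dilution
    #   total     : number of tubes at each dilution
    # Solves  sum_i g_i*v_i / (1 - exp(-lam*v_i)) = sum_i n_i*v_i
    # for lam (organisms per unit volume) by bisection in log space.
    target = sum(n * v for n, v in zip(total, volumes))

    def f(lam):
        return sum(g * v / (1.0 - math.exp(-lam * v))
                   for g, v in zip(positives, volumes) if g > 0) - target

    lo, hi = 1e-9, 1e9
    for _ in range(200):
        mid = math.sqrt(lo * hi)  # geometric midpoint
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Classic 3-tube series at 0.1, 0.01 and 0.001 g with 3, 1, 0 positives
print(round(mpn_ml([0.1, 0.01, 0.001], [3, 1, 0], [3, 3, 3]), 1))
```

    For this classic 3-1-0 pattern the solver gives roughly 43 organisms per gram, in line with standard published MPN tables.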

  4. A novel data processing technique for image reconstruction of penumbral imaging

    NASA Astrophysics Data System (ADS)

    Xie, Hongwei; Li, Hongyun; Xu, Zeping; Song, Guzhou; Zhang, Faqiang; Zhou, Lin

    2011-06-01

    A CT image reconstruction technique was applied to the data processing of penumbral imaging. Compared with traditional processing techniques for penumbral coded-pinhole images, such as Wiener, Lucy-Richardson and blind deconvolution, this approach is new: the coded-aperture processing method was used for the first time independently of the point spread function of the imaging diagnostic system. In this way, the technical obstacle in traditional coded-pinhole image processing caused by the uncertainty of that point spread function was overcome. Based on the theoretical study, simulations of penumbral imaging and image reconstruction were carried out and gave fairly good results. In the visible-light experiment, a point source of light was used to irradiate a 5 mm × 5 mm object after diffuse scattering and volume scattering, and penumbral images were made with an aperture size of ~20 mm. Finally, the CT image reconstruction technique was used for image reconstruction and gave a fairly good result.

  5. Additive Manufacturing Infrared Inspection

    NASA Technical Reports Server (NTRS)

    Gaddy, Darrell

    2014-01-01

    Additive manufacturing is a rapid prototyping technology that allows parts to be built in a series of thin layers from plastics, ceramics, and metals. Metallic additive manufacturing is an emerging form of rapid prototyping that allows complex structures to be built using various metallic powders, and significant time and cost savings have been observed compared with traditional techniques. Although the technology has advanced significantly over the last decade, many of the techniques used to inspect parts made by these processes have not advanced significantly or have limitations. Several external geometry inspection techniques exist, such as Coordinate Measurement Machines (CMM), laser scanners, structured light scanning systems, and even traditional calipers and gages. All of these are limited to external geometry and contours, or must use a contact probe to inspect limited internal dimensions. This presentation documents the development of a real-time dimensional inspection technique and digital quality record for the additive manufacturing process using infrared camera imaging and processing techniques.

  6. Comparison of an automated Most Probable Number (MPN) technique to traditional plating methods for estimating populations of total aerobes, coliforms and E. coli associated with freshly processed broiler chickens

    USDA-ARS's Scientific Manuscript database

    Recently, an instrument (TEMPOTM) has been developed to automate the Most Probable Number (MPN) technique and reduce the effort required to estimate some bacterial populations. We compared the automated MPN technique to traditional microbiological plating methods or PetrifilmTM for estimating the t...

  7. An overview of selected information storage and retrieval issues in computerized document processing

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Ihebuzor, Valentine U.

    1984-01-01

    The rapid development of computerized information storage and retrieval techniques has introduced the possibility of extending the word processing concept to document processing. A major advantage of computerized document processing is the relief of the tedious task of manual editing and composition usually encountered by traditional publishers through the immense speed and storage capacity of computers. Furthermore, computerized document processing provides an author with centralized control, the lack of which is a handicap of the traditional publishing operation. A survey of some computerized document processing techniques is presented with emphasis on related information storage and retrieval issues. String matching algorithms are considered central to document information storage and retrieval and are also discussed.
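
    String matching of the kind the survey treats as central to document retrieval can be illustrated with the classic Knuth-Morris-Pratt algorithm (a textbook sketch, not an algorithm taken from the survey itself):

```python
def kmp_search(text, pattern):
    # Knuth-Morris-Pratt: return all start indices of pattern in text
    # in O(len(text) + len(pattern)) time.
    if not pattern:
        return []
    # Failure table: length of the longest proper prefix of
    # pattern[:i+1] that is also a suffix of it.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    hits, k = [], 0
    for i, ch in enumerate(text):
        while k and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            hits.append(i - k + 1)
            k = fail[k - 1]  # allow overlapping matches
    return hits

print(kmp_search("document processing and document retrieval", "document"))
# → [0, 24]
```

    Because the failure table avoids re-scanning the text after a mismatch, the search is linear in the document length, which is what makes such algorithms practical for full-text retrieval.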

  8. Rapid and low-cost prototyping of medical devices using 3D printed molds for liquid injection molding.

    PubMed

    Chung, Philip; Heller, J Alex; Etemadi, Mozziyar; Ottoson, Paige E; Liu, Jonathan A; Rand, Larry; Roy, Shuvo

    2014-06-27

    Biologically inert elastomers such as silicone are favorable materials for medical device fabrication, but forming and curing these elastomers using traditional liquid injection molding processes can be an expensive process due to tooling and equipment costs. As a result, it has traditionally been impractical to use liquid injection molding for low-cost, rapid prototyping applications. We have devised a method for rapid and low-cost production of liquid elastomer injection molded devices that utilizes fused deposition modeling 3D printers for mold design and a modified desiccator as an injection system. Low costs and rapid turnaround time in this technique lower the barrier to iteratively designing and prototyping complex elastomer devices. Furthermore, CAD models developed in this process can be later adapted for metal mold tooling design, enabling an easy transition to a traditional injection molding process. We have used this technique to manufacture intravaginal probes involving complex geometries, as well as overmolding over metal parts, using tools commonly available within an academic research laboratory. However, this technique can be easily adapted to create liquid injection molded devices for many other applications.

  9. Proposed correlation of modern processing principles for Ayurvedic herbal drug manufacturing: A systematic review.

    PubMed

    Jain, Rahi; Venkatasubramanian, Padma

    2014-01-01

    Quality Ayurvedic herbal medicines are potential, low-cost solutions for addressing the contemporary healthcare needs of both the Indian and the global community. Correlating Ayurvedic herbal preparations with modern processing principles (MPPs) can help develop new, and use appropriate, technology for scaling up production of the medicines, which is necessary to meet the growing demand. Understanding the fundamental Ayurvedic principles behind formulation and processing is also important for improving the dosage forms. Even though the Ayurvedic industry has adopted technologies from the food, chemical and pharmaceutical industries, there is no systematic study correlating the traditional and modern processing methods. This study is an attempt to provide a possible correlation between Ayurvedic processing methods and MPPs. A systematic literature review was performed to identify the Ayurvedic processing methods by collecting information from English editions of classical Ayurveda texts on medicine preparation methods. Correlation between traditional processing and MPPs was done based on the techniques used in Ayurvedic drug processing. It was observed that Ayurvedic medicine preparation involves two major types of processes, namely extraction and separation. Extraction uses membrane-rupturing and solute-diffusion principles, while separation uses volatility, adsorption, and size-exclusion principles. The study provides systematic documentation of the methods used in Ayurveda for herbal drug preparation along with their interpretation in terms of MPPs. This is the first step toward improving or replacing traditional techniques: new or existing technologies can be used to improve the dosage forms and scale up production while maintaining Ayurvedic principles similar to the traditional techniques.

  10. Quality assurance paradigms for artificial intelligence in modelling and simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oren, T.I.

    1987-04-01

    New classes of quality assurance concepts and techniques are required for advanced knowledge-processing paradigms (such as artificial intelligence, expert systems, or knowledge-based systems) and for the complex problems that only simulative systems can cope with. A systematization of quality assurance problems is given, along with examples of traditional and cognizant quality assurance techniques in traditional and cognizant modelling and simulation.

  11. The patient relationship and therapeutic techniques of the South Sotho traditional healer.

    PubMed

    Pinkoane, M G; Greeff, M; Williams, M J S

    2005-11-01

    Until 1996 the practice of traditional healers was outlawed in South Africa and not afforded a legal position in the community of health care providers. In 1978 the World Health Organization (WHO) identified traditional healers as people forming an essential core of primary health care workers for rural people in Third World countries, and in 1994 the new South African government likewise identified traditional healers as an essential element of the primary health care workforce. It is estimated that 80% of the black population uses traditional medicine because it is deeply rooted in their culture, which is linked to their religion. The traditional healer shares with the patient a world view that is largely alien to biomedical personnel, and therapeutic techniques typically used in traditional healing conflict with those used in biomedicine. The patients' perceptions of traditional healing, their needs and expectations, may be the driving force behind their persistence in consulting a traditional healer, even after they have sought the therapeutic techniques of biomedical personnel. The operation of both systems in the same society creates a problem for both providers and recipients of health care: confusion arises, and the consumer consequently chooses the service closer to her. The researcher aimed to investigate the characteristics of the relationship between traditional healers and patients, to explore the therapeutic techniques used in the South Sotho traditional healing process, and to investigate the views of both the traditional healers and the patients about the South Sotho traditional healing process, in order to facilitate incorporation of traditional healers into the National Health Care Delivery System. A qualitative research design was followed. Participants were identified by means of a non-probable, purposive voluntary sample. Data were collected by means of a video camera and semi-structured interviews with the six traditional healers and twelve patients, as well as by taking field notes after each session. Data analysis was achieved by using a checklist for the video recordings, and decoding was done for the interviews. A co-coder and the researcher analysed the data independently, after which three consensus discussions took place to finalise the analysed data. The researcher drew conclusions, identified shortcomings, and made recommendations for application to nursing education, nursing research and nursing practice. The recommendations for nursing are reflected in the form of guidelines for the incorporation of traditional healers into the National Health Care Delivery System.

  12. Rapid and Low-cost Prototyping of Medical Devices Using 3D Printed Molds for Liquid Injection Molding

    PubMed Central

    Chung, Philip; Heller, J. Alex; Etemadi, Mozziyar; Ottoson, Paige E.; Liu, Jonathan A.; Rand, Larry; Roy, Shuvo

    2014-01-01

    Biologically inert elastomers such as silicone are favorable materials for medical device fabrication, but forming and curing these elastomers using traditional liquid injection molding processes can be an expensive process due to tooling and equipment costs. As a result, it has traditionally been impractical to use liquid injection molding for low-cost, rapid prototyping applications. We have devised a method for rapid and low-cost production of liquid elastomer injection molded devices that utilizes fused deposition modeling 3D printers for mold design and a modified desiccator as an injection system. Low costs and rapid turnaround time in this technique lower the barrier to iteratively designing and prototyping complex elastomer devices. Furthermore, CAD models developed in this process can be later adapted for metal mold tooling design, enabling an easy transition to a traditional injection molding process. We have used this technique to manufacture intravaginal probes involving complex geometries, as well as overmolding over metal parts, using tools commonly available within an academic research laboratory. However, this technique can be easily adapted to create liquid injection molded devices for many other applications. PMID:24998993

  13. Extending the knowledge in histochemistry and cell biology.

    PubMed

    Heupel, Wolfgang-Moritz; Drenckhahn, Detlev

    2010-01-01

    Central to modern Histochemistry and Cell Biology stands the need for visualization of cellular and molecular processes. In the past several years, a variety of techniques has been achieved bridging traditional light microscopy, fluorescence microscopy and electron microscopy with powerful software-based post-processing and computer modeling. Researchers now have various tools available to investigate problems of interest from bird's- up to worm's-eye of view, focusing on tissues, cells, proteins or finally single molecules. Applications of new approaches in combination with well-established traditional techniques of mRNA, DNA or protein analysis have led to enlightening and prudent studies which have paved the way toward a better understanding of not only physiological but also pathological processes in the field of cell biology. This review is intended to summarize articles standing for the progress made in "histo-biochemical" techniques and their manifold applications.

  14. Way to nanogrinding technology

    NASA Astrophysics Data System (ADS)

    Miyashita, Masakazu

    1990-11-01

    The precision finishing of hard and brittle material components, such as single-crystal silicon wafers and magnetic heads, consists of lapping and polishing, which depend heavily on skilled labor. This process is based on traditional optical production technology and is entirely different from the automated mass production techniques of automobile manufacturing. Instead of traditional lapping and polishing, nanogrinding is proposed as a new stock-removal machining process to generate optical surfaces on brittle materials. With this new technology, damage-free surfaces equivalent to those produced by lapping and polishing can be obtained on brittle materials, and free curvatures can also be generated. The technology is based on the motion-copying principle, the same as in metal parts machining. The new nanogrinding technology is anticipated to be adopted as a machining technique suitable for automated mass production, because it is expected to deliver stable machining at the quality level traditionally obtained only by lapping and polishing.

  15. Towards Online Delivery of Process-Oriented Guided Inquiry Learning Techniques in Information Technology Courses

    ERIC Educational Resources Information Center

    Trevathan, Jarrod; Myers, Trina

    2013-01-01

    Process-Oriented Guided Inquiry Learning (POGIL) is a technique used to teach in large lectures and tutorials. It invokes interaction, team building, learning and interest through highly structured group work. Currently, POGIL has only been implemented in traditional classroom settings where all participants are physically present. However,…

  16. What Makes School Ethnography "Ethnographic?"

    ERIC Educational Resources Information Center

    Erickson, Frederick

    Ethnography as an inquiry process guided by a point of view rather than as a reporting process guided by a standard technique or set of techniques is the main point of this essay which suggests the application of Malinowski's theories and methods to an ethnology of the school, indicates reasons why traditional ethnography is inadequate to the…

  17. [Applications of near-infrared spectroscopy to analysis of traditional Chinese herbal medicine].

    PubMed

    Li, Yan-Zhou; Min, Shun-Geng; Liu, Xia

    2008-07-01

    Analysis of traditional Chinese herbal medicine is of great importance to its quality control. Conventional analysis methods cannot meet the requirement of rapid and on-line analysis because of their complex procedures and the experience they require. In recent years, the near-infrared spectroscopy technique has been used for rapid determination of active components, on-line quality control, identification of counterfeits, discrimination of the geographical origins of herbal medicines, and so on, due to its advantages of simple pretreatment, high efficiency, and the convenience of solid diffuse-reflection spectroscopy and fiber optics. In the present paper, the principles and methods of the near-infrared spectroscopy technique are introduced concisely. In particular, the applications of this technique in quantitative and qualitative analysis of traditional Chinese herbal medicine are reviewed.

  18. Comparative Study of Powdered Ginger Drink Processed by Different Method:Traditional and using Evaporation Machine

    NASA Astrophysics Data System (ADS)

    Apriyana, Wuri; Taufika Rosyida, Vita; Nur Hayati, Septi; Darsih, Cici; Dewi Poeloengasih, Crescentiana

    2017-12-01

    Ginger drink is a traditional beverage that has become one of the products of interest to consumers in Indonesia, and is believed to have excellent properties for bodily health. In this study, we compared the moisture content, ash content, metal content and identified compounds of the product processed by the traditional technique and using an evaporator machine. The results show that both products fulfilled several parameters of the Indonesian National Standard for traditional powdered drinks. GC-MS analysis showed the identified compounds of both products. The major hydrocarbon groups that influence the flavor, such as zingiberene, camphene, beta-phellandrene, beta-sesquiphellandrene, curcumene, and beta-bisabolene, were found to be higher in ginger drink powder processed with the machine than in that processed traditionally.

  19. Traditional Chinese food technology and cuisine.

    PubMed

    Li, Jian-rong; Hsieh, Yun-Hwa P

    2004-01-01

    From ancient wisdom to modern science and technology, Chinese cuisine has been established from a long history of the country and gained a global reputation of its sophistication. Traditional Chinese foods and cuisine that exhibit Chinese culture, art and reality play an essential role in Chinese people's everyday lives. Recently, traditional Chinese foods have drawn a great degree of attention from food scientists and technologists, the food industry, and health promotion institutions worldwide due to the extensive values they offer beyond being merely another ethnic food. These traditional foods comprise a wide variety of products, such as pickled vegetables, salted fish and jellyfish, tofu and tofu derived products, rice and rice snack foods, fermented sauces, fish balls and thousand-year-old eggs. An overview of selected popular traditional Chinese foods and their processing techniques are included in this paper. Further development of the traditional techniques for formulation and production of these foods is expected to produce economic, social and health benefits.

  20. A study for high accuracy measurement of residual stress by deep hole drilling technique

    NASA Astrophysics Data System (ADS)

    Kitano, Houichi; Okano, Shigetaka; Mochizuki, Masahito

    2012-08-01

    The deep hole drilling (DHD) technique has received much attention in recent years as a method for measuring through-thickness residual stresses. However, some accuracy problems occur when residual stress evaluation is performed by the DHD technique. One reason is that the traditional DHD evaluation formula applies to the plane stress condition; another is that the effects of the plastic deformation produced in the drilling process and the deformation produced in the trepanning process are ignored. In this study, a modified evaluation formula, applicable to the plane strain condition, is proposed. In addition, a new procedure is proposed which can account for the effects of the deformation produced in the DHD process, with those effects investigated in detail by finite element (FE) analysis. The evaluation results obtained by the new procedure are then compared with those obtained by the traditional DHD procedure in FE analysis. As a result, the new procedure evaluates the residual stress fields better than the traditional DHD procedure when the measured object is thick enough that the stress state can be assumed to be plane strain, as in the model used in this study.
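
    The move from a plane-stress to a plane-strain evaluation formula can be illustrated with the standard elasticity substitution; this is a generic sketch of the underlying Hooke's-law relations, not the authors' actual DHD expressions:

```latex
% Plane stress (thin object):
\varepsilon_x \;=\; \frac{1}{E}\left(\sigma_x - \nu\,\sigma_y\right)

% Plane strain (thick object), equivalent to replacing
% E \to E/(1-\nu^2) and \nu \to \nu/(1-\nu) in the plane-stress form:
\varepsilon_x \;=\; \frac{1+\nu}{E}\left((1-\nu)\,\sigma_x - \nu\,\sigma_y\right)
```

    In broad terms, applying the same substitution to a plane-stress DHD evaluation formula yields its plane-strain counterpart, which is why the correction matters for thick measured objects.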

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pawlowski, Alex E.; Splitter, Derek A.; Muth, Thomas R.

    Additive manufacturing by itself provides many benefits, but by combining different materials processing techniques like traditional casting with additive manufacturing to create hybrid processes, custom materials can be tailor-made and mass produced for applications with specific performance needs.

  2. 75 FR 57263 - New Policy Announcing That Traditional Horizontal Survey Projects Performed With Terrestrial...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-20

    ... DEPARTMENT OF COMMERCE National Oceanic and Atmospheric Administration New Policy Announcing That Traditional Horizontal Survey Projects Performed With Terrestrial Survey Techniques Will No Longer Be Accepted for Processing or Loading Into NGS Databases AGENCY: National Geodetic Survey (NGS), National Ocean...

  3. A systematic mapping study of process mining

    NASA Astrophysics Data System (ADS)

    Maita, Ana Rocío Cárdenas; Martins, Lucas Corrêa; López Paz, Carlos Ramón; Rafferty, Laura; Hung, Patrick C. K.; Peres, Sarajane Marques; Fantinato, Marcelo

    2018-05-01

    This study systematically assesses the process mining scenario from 2005 to 2014. The analysis of 705 papers showed 'discovery' (71%) to be the main type of process mining addressed and 'categorical prediction' (25%) the main mining task solved. The most applied traditional techniques are 'graph structure-based' ones (38%). Concerning computational intelligence and machine learning techniques specifically, we concluded that little attention has been given to them; the most applied are 'evolutionary computation' (9%) and 'decision tree' (6%), respectively. Process mining challenges, such as balancing robustness, simplicity, accuracy and generalization, could benefit from greater use of such techniques.
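
    The graph-structure-based discovery techniques the survey identifies as dominant typically start from a directly-follows graph built over an event log. A minimal pure-Python sketch with an invented toy log (the activity names are illustrative, not from the study):

```python
from collections import Counter

def directly_follows(log):
    # Build a directly-follows graph from an event log.
    #   log: list of traces, each trace a list of activity names.
    # Returns a Counter mapping (a, b) to how often activity a is
    # directly followed by activity b — the starting point of
    # graph-structure-based discovery algorithms such as the alpha miner.
    dfg = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

log = [["register", "check", "pay"],
       ["register", "check", "reject"],
       ["register", "check", "pay"]]
print(directly_follows(log))
```

    From these pairwise frequencies a discovery algorithm infers ordering, choice and concurrency relations, which is what the 'graph structure-based' label in the survey refers to.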

  4. Free Surface Downgoing VSP Multiple Imaging

    NASA Astrophysics Data System (ADS)

    Maula, Fahdi; Dac, Nguyen

    2018-03-01

    A vertical seismic profile (VSP) is commonly used to capture the reflection wavefield (upgoing wavefield) for well ties or other interpretations. Borehole seismic (VSP) receivers capture reflections from below the well trajectory, so traditionally there is no seismic image information above the trajectory. Non-traditional processing of VSP multiples can be used to extend imaging above the well trajectory, and this paper presents a case study using VSP downgoing multiples for such non-traditional imaging applications. In general VSP processing, upgoing and downgoing arrivals are separated: the upgoing wavefield is used for subsurface illumination, whereas the downgoing wavefield and multiples are normally excluded. In situations where the downgoing wavefield passes the reflectors several times (as a multiple), it carries reflection information, which can be used for seismic ties up to the seabed and possibly for shallow-hazard identification. One concept of downgoing imaging is widely known as the mirror-imaging technique. This paper presents a case study from deep water offshore Vietnam to demonstrate the robustness of the technique and the limitations encountered during its processing.

  5. The Effectiveness of Active and Traditional Teaching Techniques in the Orthopedic Assessment Laboratory

    ERIC Educational Resources Information Center

    Nottingham, Sara; Verscheure, Susan

    2010-01-01

    Active learning is a teaching methodology with a focus on student-centered learning that engages students in the educational process. This study implemented active learning techniques in an orthopedic assessment laboratory and examined the effects of these teaching techniques. Mean scores from written exams, practical exams, and final course evaluations…

  6. Adopting best practices: "Agility" moves from software development to healthcare project management.

    PubMed

    Kitzmiller, Rebecca; Hunt, Eleanor; Sproat, Sara Breckenridge

    2006-01-01

    It is time for a change in mindset in how nurses operationalize system implementations and manage projects. Computers and systems have evolved over time from unwieldy mysterious machines of the past to ubiquitous computer use in every aspect of daily lives and work sites. Yet, disconcertingly, the process used to implement these systems has not evolved. Technology implementation does not need to be a struggle. It is time to adapt traditional plan-driven implementation methods to incorporate agile techniques. Agility is a concept borrowed from software development and is presented here because it encourages flexibility, adaptation, and continuous learning as part of the implementation process. Agility values communication and harnesses change to an advantage, which facilitates the natural evolution of an adaptable implementation process. Specific examples of agility in an implementation are described, and plan-driven implementation stages are adapted to incorporate relevant agile techniques. This comparison demonstrates how an agile approach enhances traditional implementation techniques to meet the demands of today's complex healthcare environments.

  7. Comparing digital data processing techniques for surface mine and reclamation monitoring

    NASA Technical Reports Server (NTRS)

    Witt, R. G.; Bly, B. G.; Campbell, W. J.; Bloemer, H. H. L.; Brumfield, J. O.

    1982-01-01

    The results of three techniques used for processing Landsat digital data are compared for their utility in delineating areas of surface mining and subsequent reclamation. An unsupervised clustering algorithm (ISOCLS), a maximum-likelihood classifier (CLASFY), and a hybrid approach utilizing canonical analysis (ISOCLS/KLTRANS/ISOCLS) were compared by means of a detailed accuracy assessment with aerial photography at NASA's Goddard Space Flight Center. Results show that the hybrid approach was superior to the traditional techniques in distinguishing strip mined and reclaimed areas.
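
    A maximum-likelihood classifier of the kind compared above assigns each pixel to the class whose Gaussian model makes its band values most probable. The following is a minimal pure-Python sketch with a diagonal covariance and invented two-band training data, not the CLASFY implementation itself:

```python
import math

def train(samples):
    # Per-class mean and variance (diagonal covariance) from labelled
    # pixel vectors: {class_name: [[band1, band2, ...], ...]}.
    stats = {}
    for cls, pixels in samples.items():
        nbands = len(pixels[0])
        mean = [sum(p[b] for p in pixels) / len(pixels) for b in range(nbands)]
        var = [max(sum((p[b] - mean[b]) ** 2 for p in pixels) / len(pixels), 1e-6)
               for b in range(nbands)]
        stats[cls] = (mean, var)
    return stats

def classify(pixel, stats):
    # Maximum likelihood: pick the class maximizing the Gaussian
    # log-likelihood of the pixel's band values.
    def loglik(mean, var):
        return -0.5 * sum(math.log(2 * math.pi * v) + (x - m) ** 2 / v
                          for x, m, v in zip(pixel, mean, var))
    return max(stats, key=lambda c: loglik(*stats[c]))

# Hypothetical two-band training signatures for two cover classes.
training = {"mine":      [[60, 55], [62, 58], [59, 54]],
            "reclaimed": [[30, 80], [28, 84], [33, 79]]}
stats = train(training)
print(classify([61, 56], stats))  # falls near the 'mine' signature
```

    Unsupervised clustering (as in ISOCLS) instead groups pixels without labels first; the hybrid approach in the study uses a canonical transform between two clustering passes.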

  8. Healthcare Learning Community and Student Retention

    ERIC Educational Resources Information Center

    Johnson, Sherryl W.

    2014-01-01

    Teaching, learning, and retention processes have evolved historically to include multifaceted techniques beyond the traditional lecture. This article presents related results of a study using a healthcare learning community in a southwest Georgia university. The value of novel techniques and tools in promoting student learning and retention…

  9. Structural and morphological approach of Co-Cr dental alloys processed by alternative manufacturing technologies

    NASA Astrophysics Data System (ADS)

    Porojan, Sorin; Bîrdeanu, Mihaela; Savencu, Cristina; Porojan, Liliana

    2017-08-01

    The integration of digitalized processing technologies into traditional dental restoration manufacturing is an emerging application. The objective of this study was to identify the different structural and morphological characteristics of Co-Cr dental alloys processed by alternative manufacturing techniques, in order to understand the influence of microstructure on restoration properties and clinical behavior. Metallic specimens made of Co-Cr dental alloys were prepared using traditional casting (CST), computerized milling (MIL), selective laser sintering (SLS) and selective laser melting (SLM). The structural information of the samples was obtained by X-ray diffraction; the morphology and topography of the samples were investigated by Scanning Electron Microscopy and Atomic Force Microscopy. Given that the microstructures were significantly different, corresponding differences in the clinical behavior of prosthetic restorations manufactured using additive techniques are anticipated.

  10. Microscale patterning of thermoplastic polymer surfaces by selective solvent swelling.

    PubMed

    Rahmanian, Omid; Chen, Chien-Fu; DeVoe, Don L

    2012-09-04

    A new method for the fabrication of microscale features in thermoplastic substrates is presented. Unlike traditional thermoplastic microfabrication techniques, in which bulk polymer is displaced from the substrate by machining or embossing, a unique process termed orogenic microfabrication has been developed in which selected regions of a thermoplastic surface are raised from the substrate by an irreversible solvent swelling mechanism. The orogenic technique allows thermoplastic surfaces to be patterned using a variety of masking methods, resulting in three-dimensional features that would be difficult to achieve through traditional microfabrication methods. Using cyclic olefin copolymer as a model thermoplastic material, several variations of this process are described to realize growth heights ranging from several nanometers to tens of micrometers, with patterning techniques including direct photoresist masking, patterned UV/ozone surface passivation, elastomeric stamping, and noncontact spotting. Orogenic microfabrication is also demonstrated by direct inkjet printing as a facile photolithography-free masking method for rapid desktop thermoplastic microfabrication.

  11. Optimizing spacecraft design - optimization engine development : progress and plans

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Dunphy, Julia R; Salcedo, Jose; Menzies, Tim

    2003-01-01

    At JPL and NASA, a process has been developed to perform life cycle risk management. This process requires users to identify: goals and objectives to be achieved (and their relative priorities), the various risks to achieving those goals and objectives, and options for risk mitigation (prevention, detection ahead of time, and alleviation). Risks are broadly defined to include the risk of failing to design a system with adequate performance, compatibility and robustness in addition to more traditional implementation and operational risks. The options for mitigating these different kinds of risks can include architectural and design choices, technology plans and technology back-up options, test-bed and simulation options, engineering models and hardware/software development techniques and other more traditional risk reduction techniques.

  12. Simulation Techniques in Training College Administrators.

    ERIC Educational Resources Information Center

    Fincher, Cameron

    Traditional methods of recruitment and selection in academic administration have not placed an emphasis on formal training or preparation but have relied heavily on informal notions of experiential learning. Simulation as a device for representing complex processes in a manageable form, gaming as an organizing technique for training and…

  13. Reducing software mass through behavior control. [of planetary roving robots

    NASA Technical Reports Server (NTRS)

    Miller, David P.

    1992-01-01

    Attention is given to the tradeoff between communication and computation for a planetary rover (both subsystems are very power-intensive, and either can be the major driver of the rover's power subsystem, and therefore of the minimum mass and size of the rover). Software techniques that can reduce the requirements on both communication and computation, allowing the overall robot mass to be greatly reduced, are discussed. Novel approaches to autonomous control, called behavior control, employ an entirely different approach and, for many tasks, will yield a similar or superior level of autonomy to traditional control techniques while greatly reducing the computational demand. Traditional systems have several expensive processes that operate serially, while behavior techniques employ robot capabilities that run in parallel. Traditional systems build extensive world models, while behavior control systems use minimal world models or none at all.

  14. Claim audits: a relic of the indemnity age?

    PubMed

    Ellender, D E

    1997-09-01

    Traditional claim audits offering quick fixes to specific problems or to recover overpayments will not provide benefit managers with the data and action plan they need to make informed decisions about cost-effective benefit administration. Today's benefits environment calls for a comprehensive review of claim administration, incorporating traditional audit techniques into a quality improvement audit process.

  15. Producing Hybrid Metal Composites by Combining Additive Manufacturing and Casting

    DOE PAGES

    Pawlowski, Alex E.; Splitter, Derek A.; Muth, Thomas R.; ...

    2017-10-01

    Additive manufacturing by itself provides many benefits, but combining different materials processing techniques, such as traditional casting, with additive manufacturing to create hybrid processes allows custom materials to be tailor-made and mass produced for applications with specific performance needs.

  16. The research on surface characteristics of optical lens by 3D printing technique and precise diamond turning technique

    NASA Astrophysics Data System (ADS)

    Huang, Chien-Yao; Chang, Chun-Ming; Ho, Cheng-Fong; Lee, Tai-Wen; Lin, Ping-Hung; Hsu, Wei-Yao

    2017-06-01

    The advantage of the 3D printing technique is its flexibility in design and fabrication. With 3D printing, the limitations of traditional manufacturing no longer apply. The optical lens is the key component in an optical system. The traditional process for manufacturing plastic optical lenses is injection molding. However, injection molding is suitable only for plastic lenses and cannot fabricate optical and mechanical components at the same time. Fabricating optical and mechanical components simultaneously can effectively reduce the assembly error of an optical system. A process for printing optical and mechanical components simultaneously was proposed in previous papers, but the optical surfaces of the printed components were not transparent. If the transmittance of the optical surface is increased, components fabricated by the 3D printing process can achieve high transmission. Therefore, in this paper a precise diamond turning technique is used to turn the surfaces of 3D-printed optical lenses. Precise diamond turning can process component surfaces to meet the requirements of an optical system. A 3D printing machine, a Stratasys Connex 500, and a precise diamond turning machine, a Precitech Freeform705XG, were used in this work. The dimensions, roughness, transmission and printing types of the 3D-printed components are discussed. After the turning and polishing process, the roughness of the 3D-printed component is below 0.05 μm and the transmittance increases to above 80%. This optical module can be used in hand-held telescopes and other systems that need lenses and special mechanical structures fabricated simultaneously.

  17. Physical, physicochemical and nutritional characteristics of Bhoja chaul, a traditional ready-to-eat dry heat parboiled rice product processed by an improvised soaking technique.

    PubMed

    Dutta, Himjyoti; Mahanta, Charu Lata; Singh, Vasudeva; Das, Barnali Baruah; Rahman, Narzu

    2016-01-15

    Bhoja chaul is a traditional whole-rice product, processed by dry heat parboiling of low-amylose or waxy paddy, that is eaten after soaking in water and requires no cooking. The essential steps in Bhoja chaul making are soaking paddy in water, roasting with sand, drying and milling. In this study, the product was prepared from a low-amylose variety and a waxy rice variety by an improvised laboratory-scale technique, and the resulting Bhoja chaul was studied for its physical, physicochemical and textural properties. The improvised method shortened the processing time and gave a product with good textural characteristics. The rice kernels became bolder in shape on processing. RVA studies and DSC endotherms suggested molecular damage and amylose-lipid complex formation by the linear B-chains of amylopectin, respectively. X-ray diffractography indicated the formation of a partial B-type pattern. The shifting of the crystalline region of the XRD curve towards lower values of Bragg's angle was attributed to an overall increase in the inter-planar spacing of the crystalline lamellae. Resistant starch was negligible. Bhoja chaul may be useful for children and people with poor digestibility. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Plasma spectroscopy analysis technique based on optimization algorithms and spectral synthesis for arc-welding quality assurance.

    PubMed

    Mirapeix, J; Cobo, A; González, D A; López-Higuera, J M

    2007-02-19

    A new plasma spectroscopy analysis technique based on the generation of synthetic spectra by means of optimization processes is presented in this paper. The technique has been developed for its application in arc-welding quality assurance. The new approach has been checked through several experimental tests, yielding results in reasonably good agreement with the ones offered by the traditional spectroscopic analysis technique.
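    The core idea, generating a synthetic spectrum and optimizing its parameters against the measured one, can be sketched with a toy single-line model. The Gaussian line profile and every number below are assumptions for illustration, not the authors' spectral model.

```python
# Toy synthetic-spectrum fit: recover emission-line parameters by
# least-squares optimization of a parameterized profile.
import numpy as np
from scipy.optimize import curve_fit

def line(x, amp, center, width):
    """Synthetic Gaussian emission-line profile."""
    return amp * np.exp(-0.5 * ((x - center) / width) ** 2)

x = np.linspace(690, 710, 400)          # wavelength grid [nm], assumed
rng = np.random.default_rng(1)
# "Measured" spectrum: a known line plus detector-like noise.
observed = line(x, 5.0, 700.0, 0.8) + rng.normal(0, 0.05, x.size)

# Optimization step: adjust the synthetic profile until it matches.
popt, _ = curve_fit(line, x, observed, p0=[1.0, 699.0, 1.0])
amp, center, width = popt
print(round(center, 2))                 # recovered line center [nm]
```

In the paper's setting the optimized parameters would feed a plasma diagnostic (e.g. line intensities for quality monitoring); here the point is only the generate-and-optimize loop.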

  19. Optimising the Use of Note-Taking as an External Cognitive Aid for Increasing Learning

    ERIC Educational Resources Information Center

    Makany, Tamas; Kemp, Jonathan; Dror, Itiel E.

    2009-01-01

    Taking notes is of utmost importance in academic and commercial use and success. Different techniques for note-taking utilise different cognitive processes and strategies. This experimental study examined ways to enhance cognitive performance via different note-taking techniques. By comparing performances of traditional, linear style…

  20. Koji--where East meets West in fermentation.

    PubMed

    Zhu, Yang; Tramper, Johannes

    2013-12-01

    Almost all biotechnological processes originate from traditional food fermentations, i.e. the many indigenous processes already found in written records thousands of years old. We still consume many of these fermented foods and beverages on a daily basis today. The evolution of these traditional processes, in particular since the 19th century, stimulated and influenced the development of modern biotechnological processes. In return, the development of modern biotechnology and related advanced techniques will no doubt improve the processes, product quality and safety of our favourite fermented foods and beverages. In this article, we describe the relationship between these traditional food fermentations and modern biotechnology. Using Koji and its derived product soy sauce as examples, we address the mutual influences that will provide us with a better future concerning the quality, safety and nutritional effect of many fermented food products. © 2013.

  1. Optimisation of an oak chips-grape mix maceration process. Influence of chip dose and maceration time.

    PubMed

    Gordillo, Belén; Baca-Bocanegra, Berta; Rodriguez-Pulído, Francisco J; González-Miret, M Lourdes; García Estévez, Ignacio; Quijada-Morín, Natalia; Heredia, Francisco J; Escribano-Bailón, M Teresa

    2016-09-01

    Oak chip-related phenolics are able to modify the composition of red wine and modulate its colour stability. In this study, the effect of two maceration techniques, traditional and an oak chips-grape mix process, on the phenolic composition and colour of Syrah red wines from a warm climate was studied. Two doses of oak chips (3 and 6 g/L) at two maceration times (5 and 10 days) during fermentation were considered. Changes in phenolic composition (HPLC-DAD-MS), copigmentation/polymerisation (spectrophotometry), and colour (Tristimulus and Differential Colorimetry) were assessed by multivariate statistical techniques. The addition of oak chips at shorter maceration times enhanced phenolic extraction, colour and its stabilisation in comparison to the traditional maceration. In contrast, increasing the chip dose at extended maceration times resulted in wines with lighter and less stable colour. The results open the possibility of optimising alternative technological applications to traditional grape maceration to avoid the common loss of colour in wines from warm climates. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Beyond the standard plate count: genomic views into microbial food ecology

    USDA-ARS?s Scientific Manuscript database

    Food spoilage is a complex process that involves multiple species with specific niches and metabolic processes; bacterial culturing techniques are the traditional methods for identifying the microbes responsible. These culture-dependent methods may be considered selective, targeting the isolation of...

  3. Bioremediation techniques applied to aqueous media contaminated with mercury.

    PubMed

    Velásquez-Riaño, Möritz; Benavides-Otaya, Holman D

    2016-12-01

    In recent years, the environmental and human health impacts of mercury contamination have driven the search for alternative, eco-efficient techniques different from the traditional physicochemical methods for treating this metal. One of these alternative processes is bioremediation. A comprehensive analysis of the different variables that can affect this process is presented. It focuses on determining the effectiveness of different techniques of bioremediation, with a specific consideration of three variables: the removal percentage, time needed for bioremediation and initial concentration of mercury to be treated in an aqueous medium.
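    Two of the review's three comparison variables (removal percentage and treatment time) are linked once a kinetic model is assumed. As a purely illustrative assumption, not a model used in the review, a first-order decay of dissolved mercury gives:

```python
# First-order removal: C(t) = C0 * exp(-k t); removal percentage follows.
import math

def removal_percent(c0_mg_l: float, k_per_day: float, t_days: float) -> float:
    """Percent mercury removed after t_days, assuming first-order kinetics."""
    c_t = c0_mg_l * math.exp(-k_per_day * t_days)
    return 100.0 * (c0_mg_l - c_t) / c0_mg_l

# Under this model the removal percentage is independent of the initial
# concentration; e.g. k = 0.3/day over 10 days removes ~95%:
print(round(removal_percent(2.0, 0.3, 10.0), 1))  # → 95.0
```

A real bioremediation system can deviate strongly from first-order behavior (toxicity at high initial mercury, lag phases), which is precisely why the review treats initial concentration as a separate variable.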

  4. Numerical characterization of landing gear aeroacoustics using advanced simulation and analysis techniques

    NASA Astrophysics Data System (ADS)

    Redonnet, S.; Ben Khelil, S.; Bulté, J.; Cunha, G.

    2017-09-01

    With the objective of aircraft noise mitigation, we address here the numerical characterization of the aeroacoustics generated by a simplified nose landing gear (NLG), through the use of advanced simulation and signal processing techniques. To this end, the NLG noise physics is first simulated through an advanced hybrid approach, which relies on Computational Fluid Dynamics (CFD) and Computational AeroAcoustics (CAA) calculations. Compared to more traditional hybrid methods (e.g. those relying on an Acoustic Analogy), and although it is used here with some approximations (e.g. in the design of the CFD-CAA interface), the present approach does not rely on restrictive assumptions (e.g. an equivalent noise source, or a homogeneous propagation medium), which allows more realism to be incorporated into the prediction. In a second step, the outputs of these CFD-CAA hybrid calculations are processed through both traditional and advanced post-processing techniques, offering further insight into the NLG's noise source mechanisms. Among other things, this work highlights how advanced computational methodologies are now mature enough not only to simulate realistic problems of airframe noise emission, but also to investigate their underlying physics.
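    A typical traditional post-processing step for such CAA outputs is spectral analysis of the acoustic pressure signal. The sketch below applies Welch's averaged-periodogram estimate to a synthetic signal (a tone in broadband noise, standing in for a tonal landing-gear source; sampling rate and amplitudes are assumed):

```python
# Welch power spectral density of a synthetic acoustic pressure signal.
import numpy as np
from scipy.signal import welch

fs = 51200                              # assumed sampling rate [Hz]
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(2)
# A 1 kHz tone buried in broadband noise.
p = 0.5 * np.sin(2 * np.pi * 1000 * t) + rng.normal(0, 0.2, t.size)

# Averaged periodogram: trades frequency resolution for variance reduction.
f, psd = welch(p, fs=fs, nperseg=4096)
print(int(f[np.argmax(psd)]))           # dominant frequency [Hz]
```

Advanced techniques (e.g. source localization or modal decompositions) build on the same time-series inputs; the PSD is the usual first look at tonal versus broadband content.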

  5. Graphical Technique to Support the Teaching/Learning Process of Software Process Reference Models

    NASA Astrophysics Data System (ADS)

    Espinosa-Curiel, Ismael Edrein; Rodríguez-Jacobo, Josefina; Fernández-Zepeda, José Alberto

    In this paper, we propose a set of diagrams to visualize software process reference models (PRMs). The diagrams, called dimods, are the combination of some visual and process modeling techniques such as rich pictures, mind maps, IDEF and RAD diagrams. We show the use of this technique by designing a set of dimods for the Mexican Software Industry Process Model (MoProSoft). Additionally, we perform an evaluation of the usefulness of dimods. The result of the evaluation shows that dimods may be a support tool that facilitates the understanding, memorization, and learning of software PRMs in both software development organizations and universities. The results also show that dimods may have advantages over the traditional description methods for these types of models.

  6. AI tools in computer based problem solving

    NASA Technical Reports Server (NTRS)

    Beane, Arthur J.

    1988-01-01

    The use of computers to solve value-oriented, deterministic, algorithmic problems has evolved into a structured life-cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different, much less structured model. Traditionally, the two approaches have been used completely independently. With the advent of low-cost, high-performance 32-bit workstations executing software identical to that of large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples of both development and deployment of applications involving a blending of AI and traditional techniques are given.

  7. Investigation of Laser Welding of Ti Alloys for Cognitive Process Parameters Selection.

    PubMed

    Caiazzo, Fabrizia; Caggiano, Alessandra

    2018-04-20

    Laser welding of titanium alloys is attracting increasing interest as an alternative to traditional joining techniques for industrial applications, with particular reference to the aerospace sector, where welded assemblies allow for the reduction of the buy-to-fly ratio, compared to other traditional mechanical joining techniques. In this research work, an investigation on laser welding of Ti-6Al-4V alloy plates is carried out through an experimental testing campaign, under different process conditions, in order to perform a characterization of the produced weld bead geometry, with the final aim of developing a cognitive methodology able to support decision-making about the selection of the suitable laser welding process parameters. The methodology is based on the employment of artificial neural networks able to identify correlations between the laser welding process parameters, with particular reference to the laser power, welding speed and defocusing distance, and the weld bead geometric features, on the basis of the collected experimental data.
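    The neural-network idea, learning the mapping from (laser power, welding speed, defocusing distance) to a weld bead geometric feature, can be sketched on synthetic data. The parameter ranges and the toy response function below are invented for illustration; the paper's actual measurements and network are not reproduced here.

```python
# Feed-forward network regressing a bead-width-like response on three
# laser welding process parameters (all data synthetic).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
# power [kW], speed [mm/s], defocus [mm] -- ranges assumed for illustration.
X = rng.uniform([1.0, 20.0, -2.0], [3.0, 60.0, 2.0], size=(300, 3))
# Assumed toy response: width grows with power, shrinks with speed,
# varies quadratically with defocus, plus measurement noise.
y = 0.8 * X[:, 0] - 0.02 * X[:, 1] + 0.1 * X[:, 2] ** 2 \
    + rng.normal(0, 0.02, 300)

# Standardize inputs (speed dwarfs the other ranges), then fit the MLP.
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 16),
                                   max_iter=5000, random_state=0))
model.fit(X[:250], y[:250])
r2 = model.score(X[250:], y[250:])   # R^2 on held-out process conditions
print(round(r2, 3))
```

Once trained on real weld-bead measurements, such a model can be queried in reverse: scan candidate parameter settings and keep those predicted to give the target bead geometry.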

  8. Investigation of Laser Welding of Ti Alloys for Cognitive Process Parameters Selection

    PubMed Central

    2018-01-01

    Laser welding of titanium alloys is attracting increasing interest as an alternative to traditional joining techniques for industrial applications, with particular reference to the aerospace sector, where welded assemblies allow for the reduction of the buy-to-fly ratio, compared to other traditional mechanical joining techniques. In this research work, an investigation on laser welding of Ti–6Al–4V alloy plates is carried out through an experimental testing campaign, under different process conditions, in order to perform a characterization of the produced weld bead geometry, with the final aim of developing a cognitive methodology able to support decision-making about the selection of the suitable laser welding process parameters. The methodology is based on the employment of artificial neural networks able to identify correlations between the laser welding process parameters, with particular reference to the laser power, welding speed and defocusing distance, and the weld bead geometric features, on the basis of the collected experimental data. PMID:29677114

  9. Thermally evaporated conformal thin films on non-traditional/non-planar substrates

    NASA Astrophysics Data System (ADS)

    Pulsifer, Drew Patrick

    Conformal thin films have a wide variety of uses in the microelectronics, optics, and coatings industries. The ever-increasing capabilities of these conformal thin films have enabled tremendous technological advancement in the last half century. During this period, new thin-film deposition techniques have been developed and refined. While these techniques have remarkable performance for traditional applications which utilize planar substrates such as silicon wafers, they are not suitable for the conformal coating of non-traditional substrates such as biological material. The process of thermally evaporating a material under vacuum conditions is one of the oldest thin-film deposition techniques which is able to produce functional film morphologies. A drawback of thermally evaporated thin films is that they are not intrinsically conformal. To overcome this, while maintaining the advantages of thermal evaporation, a procedure for varying the substrate's orientation with respect to the incident vapor flux during deposition was developed immediately prior to the research undertaken for this doctoral dissertation. This process was shown to greatly improve the conformality of thermally evaporated thin films. This development allows for several applications of thermally evaporated conformal thin films on non-planar/non-traditional substrates. Three settings in which to evaluate the improved conformal deposition of thermally evaporated thin films were investigated for this dissertation. In these settings the thin-film morphologies are of different types. In the first setting, a bioreplication approach was used to fabricate artificial visual decoys for the invasive species Agrilus planipennis, commonly known as the emerald ash borer (EAB). The mating behavior of this species involves an overflying EAB male pouncing on an EAB female at rest on an ash leaflet before copulation. 
The male spots the female on the leaflet by visually detecting the iridescent green color of the female's elytra. As rearing EAB and then deploying dead females as decoys is both arduous and inconvenient, the development of an artificial decoy would be of great interest to entomologists and foresters. A dead female EAB was used to make a negative die of nickel and a positive die of epoxy. The process of fabricating the paired dies utilized thermally evaporated conformal thin films in several critical steps. In order to conformally coat the EAB with nickel, the substrate stage holding the female EAB was periodically rocked and rotated during the deposition. This process was designed to result in a uniform thin film of ~500 nm thickness with dense morphology. The nickel film was then reinforced through an electroforming process and mounted in a fixture which allowed it to be heated electrically. The corresponding positive die was replicated from the negative die through a series of successive castings. The final EAB positive die was fabricated from a hard epoxy material and attached to a fixture which allowed it to be heated while being pressed into the negative die. Decoys were then made by first depositing a quarter-wave-stack Bragg reflector on a polymer sheet and then stamping it with the pair of matched negative and positive dies to take the shape of the upper surface of an EAB female. As nearly 100 decoys were fabricated from just one EAB female, this bioreplication process is industrially scalable. Preliminary results from field trapping tests are indicative of success. For the second setting, a method of developing latent fingermarks with thermally evaporated conformal thin films was developed. Fingermarks have long been used to identify the individual who left them behind when he/she touched an object with the friction ridges of his/her hands. 
In many cases the fingermark which is left behind consists of sebaceous secretions which are not clearly visible under normal conditions. In order to make the fingermarks visible and identifiable, they are traditionally developed by either a physical technique which relies on a material preferentially sticking to sebaceous materials or a chemical technique which relies on a reaction with material within the fingermark. In this application, a columnar thin film (CTF) is deposited conformally over both the fingermark and the underlying substrate. The CTF is produced by the conformal-evaporated-film-by-rotation method, wherein the substrate with the fingermark upon it is held obliquely with respect to a vapor flux in a vacuum chamber. The substrate is then rapidly rotated about its surface normal resulting in a conformal film with columnar morphology. This technique was optimized for several substrates and compared with traditional development techniques. CTF development was found to be superior to traditional techniques in several cases. Use of the CTF was investigated for several types of particularly difficult to develop fingermarks such as those which consist of both bloody and nonbloody areas, and fingermarks on fired cartridge casings. The CTF technique's sensitivity was also compared to that of traditional development techniques. Finally, the CTF technique was compared with another thin film deposition technique called vacuum-metal deposition. (Abstract shortened by UMI.).

  10. Process Mining Online Assessment Data

    ERIC Educational Resources Information Center

    Pechenizkiy, Mykola; Trcka, Nikola; Vasilyeva, Ekaterina; van der Aalst, Wil; De Bra, Paul

    2009-01-01

    Traditional data mining techniques have been extensively applied to find interesting patterns, build descriptive and predictive models from large volumes of data accumulated through the use of different information systems. The results of data mining can be used for getting a better understanding of the underlying educational processes, for…

  11. Detecting subtle hydrochemical anomalies with multivariate statistics: an example from homogeneous groundwaters in the Great Artesian Basin, Australia

    NASA Astrophysics Data System (ADS)

    O'Shea, Bethany; Jankowski, Jerzy

    2006-12-01

    The major ion composition of Great Artesian Basin groundwater in the lower Namoi River valley is relatively homogeneous. Traditional graphical techniques have been combined with multivariate statistical methods to determine whether subtle differences in the chemical composition of these waters can be delineated. Hierarchical cluster analysis and principal components analysis were successful in delineating minor variations within the groundwaters of the study area that were not visually identifiable with the graphical techniques applied. Hydrochemical interpretation allowed geochemical processes to be identified in each statistically defined water type and illustrated how these groundwaters differ from one another. Three main geochemical processes were identified in the groundwaters: ion exchange, precipitation, and mixing between waters from different sources. Both statistical methods delineated an anomalous sample suspected of being influenced by magmatic CO2 input. The use of statistical methods to complement traditional graphical techniques for waters that appear homogeneous is emphasized for all investigations of this type.
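    The clustering-plus-PCA workflow can be sketched as follows. The ion concentrations are synthetic stand-ins with invented means and spreads, not the Namoi valley data; the point is that standardized multivariate methods separate two water types whose raw analyses look nearly identical.

```python
# Hierarchical clustering (Ward) and PCA on synthetic major-ion data.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
# Two nearly identical water types (mg/L of Na, Ca, Mg, Cl, HCO3), with a
# subtle Na/HCO3 shift of the kind ion exchange would produce.
type_a = rng.normal([300, 10, 5, 150, 600], [10, 1, 1, 10, 15], size=(20, 5))
type_b = rng.normal([330, 8, 4, 150, 640], [10, 1, 1, 10, 15], size=(20, 5))
samples = StandardScaler().fit_transform(np.vstack([type_a, type_b]))

# Agglomerative clustering, cut at two groups.
clusters = fcluster(linkage(samples, method="ward"), t=2, criterion="maxclust")
# PCA scores for plotting/inspection of the same subtle grouping.
scores = PCA(n_components=2).fit_transform(samples)
# How many of each water type land in a single cluster:
print(np.bincount(clusters[:20]).max(), np.bincount(clusters[20:]).max())
```

On a Piper or Schoeller diagram these two synthetic populations would largely overlap, which mirrors the paper's argument for statistics as a complement to graphical techniques.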

  12. Microscale Patterning of Thermoplastic Polymer Surfaces by Selective Solvent Swelling

    PubMed Central

    Rahmanian, Omid; Chen, Chien-Fu; DeVoe, Don L.

    2012-01-01

    A new method for the fabrication of microscale features in thermoplastic substrates is presented. Unlike traditional thermoplastic microfabrication techniques, in which bulk polymer is displaced from the substrate by machining or embossing, a unique process termed orogenic microfabrication has been developed in which selected regions of a thermoplastic surface are raised from the substrate by an irreversible solvent swelling mechanism. The orogenic technique allows thermoplastic surfaces to be patterned using a variety of masking methods, resulting in three-dimensional features that would be difficult to achieve through traditional microfabrication methods. Using cyclic olefin copolymer as a model thermoplastic material, several variations of this process are described to realize growth heights ranging from several nanometers to tens of microns, with patterning techniques including direct photoresist masking, patterned UV/ozone surface passivation, elastomeric stamping, and noncontact spotting. Orogenic microfabrication is also demonstrated by direct inkjet printing as a facile photolithography-free masking method for rapid desktop thermoplastic microfabrication. PMID:22900539

  13. Effectiveness assessment of soil conservation measures in reducing soil erosion in Baiquan County of Northeastern China by using (137)Cs techniques.

    PubMed

    Zhang, Qing-Wen; Li, Yong

    2014-05-01

    Accelerated soil erosion is considered as a major land degradation process resulting in increased sediment production and sediment-associated nutrient inputs to the rivers. Over the last decade, several soil conservation programs for erosion control have been conducted throughout Northeastern China. Reliable information on soil erosion rates is an essential prerequisite to assess the effectiveness of soil conservation measures. A study was carried out in Baiquan County of Northeastern China to assess the effectiveness of soil conservation measures in reducing soil erosion using the (137)Cs tracer technique and related techniques. This study reports the use of (137)Cs measurements to quantify medium-term soil erosion rates in traditional slope farmland, contour cropping farmland and terrace farmland in the Dingjiagou catchment and the Xingsheng catchment of Baiquan County. The (137)Cs reference inventory of 2532 ± 670 Bq m(-2) was determined. Based on the principle of the (137)Cs tracer technique, soil erosion rates were estimated. The results showed that severe erosion on traditional slope farmland is the dominant soil erosion process in the area. The terrace measure reduced soil erosion rates by 16% for the entire slope. Typical net soil erosion rates are estimated to be 28.97 Mg per hectare per year for traditional slope farmland and 25.04 Mg per hectare per year for terrace farmland in the Dingjiagou catchment. In contrast to traditional slope farmland with a soil erosion rate of 34.65 Mg per hectare per year, contour cultivation reduced the soil erosion rate by 53% resulting in a soil erosion rate of 22.58 Mg per hectare per year in the Xingsheng catchment. These results indicated that soil losses can be controlled by changing tillage practices from the traditional slope farmland cultivation to the terrace or contour cultivation.
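    One widely used simple conversion from ¹³⁷Cs inventory loss to an erosion rate on cultivated land is the proportional model; whether this study used it or a more refined mass-balance model is not stated, so the sketch below is purely illustrative. It does reuse the paper's reference inventory of 2532 Bq m⁻², but the plough depth, bulk density and accumulation time are assumed values.

```python
# Proportional model: Y = 10 * d * B * X / (100 * T), where d is plough
# depth [m], B bulk density [kg m^-3], X the percent reduction in 137Cs
# inventory relative to the local reference, and T years of accumulation.
def erosion_rate(inv_sample_bq_m2, inv_ref_bq_m2=2532.0,
                 plough_depth_m=0.2, bulk_density_kg_m3=1300.0,
                 years=50.0):
    """Net soil erosion rate in Mg per hectare per year (assumed inputs)."""
    x_percent = 100.0 * (inv_ref_bq_m2 - inv_sample_bq_m2) / inv_ref_bq_m2
    return (10.0 * plough_depth_m * bulk_density_kg_m3 * x_percent
            / (100.0 * years))

# A plot retaining only 60% of the reference inventory (a 40% loss):
print(round(erosion_rate(0.6 * 2532.0), 1))  # → 20.8 Mg ha^-1 yr^-1
```

Comparing rates computed this way for slope, contour and terrace plots against the reference inventory is what yields reduction figures like those reported in the abstract.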

  14. The remote supervisory and controlling experiment system of traditional Chinese medicine production based on Fieldbus

    NASA Astrophysics Data System (ADS)

    Zhan, Jinliang; Lu, Pei

    2006-11-01

    Since the quality of traditional Chinese medicine products is affected by raw materials, processing and many other factors, it is difficult for the traditional Chinese medicine production process, especially the extraction process, to ensure steady and homogeneous quality. At the same time, some quality-control blind spots exist owing to the lack of on-line quality detection means. If infrared spectrum analysis technology were used in the production process, on the basis of off-line analysis, to detect the quality of semi-manufactured goods in real time, assisted by advanced automatic control techniques, steady and homogeneous quality could be obtained. The on-line detection of the extraction process thus plays an important role in the development of the Chinese patent medicine industry. In this paper, the design and implementation of a monitoring experiment system for the traditional Chinese medicine extraction process, based on the PROFIBUS-DP fieldbus, OPC and Internet technology, is introduced. The system integrates intelligent nodes that gather data with a supervisory sub-system providing graphical configuration and remote supervision; during the production process it monitors temperature, pressure, quality and other parameters, and it can be controlled by remote nodes over a VPN (Virtual Private Network). Experiments and applications have proved that the system fully achieves the anticipated effect, with the merits of operational stability, real-time and reliable performance, and convenient, simple manipulation.

  15. NLPIR: A Theoretical Framework for Applying Natural Language Processing to Information Retrieval.

    ERIC Educational Resources Information Center

    Zhou, Lina; Zhang, Dongsong

    2003-01-01

    Proposes a theoretical framework called NLPIR that integrates natural language processing (NLP) into information retrieval (IR) based on the assumption that there exists representation distance between queries and documents. Discusses problems in traditional keyword-based IR, including relevance, and describes some existing NLP techniques.…

  16. Conventional and dense gas techniques for the production of liposomes: a review.

    PubMed

    Meure, Louise A; Foster, Neil R; Dehghani, Fariba

    2008-01-01

The aim of this review paper is to compare the potential of various techniques developed for the production of homogeneous, stable liposomes. Traditional techniques, such as the Bangham, detergent depletion, ether/ethanol injection, reverse-phase evaporation and emulsion methods, were compared with the recent advanced techniques developed for liposome formation. The major hurdles for scaling up the traditional methods are the consumption of large quantities of volatile organic solvent, the stability and homogeneity of the liposomal product, and the lengthy multiple steps involved. The new methods have been designed to alleviate the current issues in liposome formulation. Dense gas liposome techniques are still in their infancy; however, they have remarkable advantages in reducing the use of organic solvents, providing fast, single-stage production and producing stable, uniform liposomes. Techniques such as the membrane contactor and heating methods are also promising as they eliminate the use of organic solvent; however, high temperatures are still required for processing.

  17. The Successive Contributions of Computers to Education: A Survey.

    ERIC Educational Resources Information Center

    Lelouche, Ruddy

    1998-01-01

    Shows how education has successively benefited from traditional information processing through programmed instruction and computer-assisted instruction (CAI), artificial intelligence, intelligent CAI, intelligent tutoring systems, and hypermedia techniques. Contains 29 references. (DDR)

  18. Feminist Pedagogy, Body Image, and the Dance Technique Class

    ERIC Educational Resources Information Center

    Barr, Sherrie; Oliver, Wendy

    2016-01-01

    This paper investigates the evolution of feminist consciousness in dance technique class as related to body image, the myth of the perfect body, and the development of feminist pedagogy. Western concert dance forms have often been taught in a manner where imitating the teacher is primary in the learning process. In this traditional scenario,…

  19. The Videotape As a Teaching Aid in State and Local Government.

    ERIC Educational Resources Information Center

    Shelly, Walter L.

    In order to educate students in state and local government and to create a better appreciation of the political process, the author contends that the traditional approach to teaching in Texas must be supplemented with innovative techniques. One successful technique is the use of the videotape as a teaching aid. Extension of the vote to the…

  20. User-Centered Innovation: A Model for "Early Usability Testing."

    ERIC Educational Resources Information Center

    Sugar, William A.; Boling, Elizabeth

    The goal of this study is to show how some concepts and techniques from disciplines outside Instructional Systems Development (ISD) have the potential to extend and enhance the traditional view of ISD practice when they are employed very early in the ISD process. The concepts and techniques employed were user-centered in design and usability, and…

  1. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    PubMed Central

    Hernandez, Wilmar

    2007-01-01

In this paper a survey of recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is presented. A comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art of applying robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is presented through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way forward. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be made overnight because some open research issues still have to be solved. This paper draws attention to one of these open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.

  2. Novel sonar signal processing tool using Shannon entropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quazi, A.H.

    1996-06-01

Traditionally, conventional signal processing extracts information from sonar signals using amplitude, signal energy or frequency-domain quantities obtained using spectral analysis techniques. The object is to investigate an alternate approach which is entirely different from traditional signal processing: to utilize the Shannon entropy as a tool for the processing of sonar signals, with emphasis on detection, classification, and localization, leading to superior sonar system performance. Traditionally, sonar signals are processed coherently, semi-coherently, or incoherently, depending upon the a priori knowledge of the signals and noise. Here, the detection, classification, and localization technique is based on the concept of the entropy of the random process. Under a constant energy constraint, the entropy of a received process bearing a finite number of sample points is maximum when hypothesis H0 (that the received process consists of noise alone) is true and decreases when a correlated signal is present (H1). Therefore, the strategy used for detection is: (I) calculate the entropy of the received data; (II) compare the entropy with the maximum value; and (III) make a decision: H1 is assumed if the difference is large compared to a pre-assigned threshold, and H0 is assumed otherwise. The test statistics will differ between the entropies under H0 and H1. We show simulated results for detecting stationary and non-stationary signals in noise, and results on the detection of defects in a Plexiglas bar using an ultrasonic experiment conducted by Hughes. © 1996 American Institute of Physics.

  3. Effect of a new tension system, used in acrylic resin flasking, on the dimensional stability of denture bases.

    PubMed

    Consani, Rafael Leonardo Xediek; Domitti, Saide Sarckis; Consani, Simonides

    2002-09-01

    The pressure of final closure may be released when the flask is removed from the mechanical or pneumatic press and placed in the spring clamp. This release in pressure may result in dimensional changes that distort the denture base. The purpose of this study was to investigate differences between the dimensional stability of standardized simulated denture bases processed by traditional moist heat-polymerization and those processed by use of a new tension system. A metal master die was fabricated to simulate an edentulous maxillary arch without irregularities in the alveolar ridge walls. A silicone mold of this metallic die was prepared, and 40 stone casts were formed from the mold with type III dental stone. The casts were randomly assigned to 4 test groups (A-D) of 10 specimens each. A uniform denture base pattern was made on each stone cast with a 1.5-mm thickness of base-plate wax, measured with a caliper. The patterns were invested for traditional hot water processing. A polymethyl methacrylate dough was prepared and packed for processing. The flasks in groups A and B were closed with the traditional pressure technique and placed in spring clamps after final closure. The flasks in groups C and D were pressed between the metallic plates of the new tension system after the final closure. The group A and C flasks were immediately immersed in the water processing unit at room temperature (25 degrees +/- 2 degrees C). The unit was programmed to raise the temperature to 74 degrees C over 1 hour, and then maintained the temperature at 74 degrees C for 8 hours. The group B and D flasks were bench stored at room temperature (25 degrees +/- 2 degrees C) for 6 hours and were then subjected to the same moist heat polymerization conditions as groups A and C. All processed dentures were bench cooled for 3 hours. 
After recovery from the flasks, the base-cast sets were transversally sectioned into 3 parts, corresponding to 3 zones: (1) distal of the canines, (2) mesial of the first molars, and (3) mesial of the posterior palate. These areas had been previously established and standardized by use of a pattern denture in the sawing device to determine the sections in each base-cast set. Base-cast gaps were measured at 5 predetermined points on each section with an optical micrometer that had a tolerance of 0.001 mm. Collected data were analyzed with analysis of variance and Tukey's test. Denture bases processed with the new tension system exhibited significantly better base adaptation than those processed with traditional acrylic resin packing. Immediately after polymerization (Groups A and C), mean dimensional change values were 0.213 +/- 0.055 mm for the traditional packing technique and 0.173 +/- 0.050 mm for the new tension system. After delayed polymerization (Groups B and D), the values were 0.216 +/- 0.074 mm for the traditional packing technique and 0.164 +/- 0.032 mm for the new tension system. With both techniques, dimensional changes in the posterior palatal zone were greater (conventional = 0.286 +/- 0.038 mm; new system = 0.214 +/- 0.024 mm) than those elsewhere on the base-cast set. Within the limitations of this study, the new tension packing system was associated with decreased dimensional changes in the simulated maxillary denture bases processed with heat-polymerization.

  4. Evaluation of MALDI-TOF mass spectrometry for differentiation of Pichia kluyveri strains isolated from traditional fermentation processes.

    PubMed

    De la Torre González, Francisco Javier; Gutiérrez Avendaño, Daniel Oswaldo; Gschaedler Mathis, Anne Christine; Kirchmayr, Manuel Reinhart

    2018-06-06

Non-Saccharomyces yeasts are widespread microorganisms that not long ago were considered contaminants in the beverage industry. Nowadays, however, they have gained importance for their ability to produce aromatic compounds, which in alcoholic beverages improve aromatic complexity and therefore the overall quality. Thus, identification and differentiation of the species involved in fermentation processes is vital and can be carried out with traditional methods or with techniques based on molecular biology. Traditional methods, however, can be expensive, laborious and/or unable to accurately discriminate at the strain level. In the present study, a total of 19 strains of Pichia kluyveri isolated from mezcal, tejuino and cacao fermentations were analyzed with rep-PCR fingerprinting and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). The comparative analysis between MS spectra and rep-PCR patterns obtained from these strains showed a high similarity between both methods, although minimal differences between the obtained rep-PCR and MALDI-TOF MS clusters could be observed. The data suggest that MALDI-TOF MS is a promising alternative technique for rapid, reliable and cost-effective differentiation of native yeast strains isolated from different traditional fermented foods and beverages. This article is protected by copyright. All rights reserved.

  5. West Java Snack Mapping based on Snack Types, Main Ingredients, and Processing Techniques

    NASA Astrophysics Data System (ADS)

    Nurani, A. S.; Subekti, S.; Ana

    2016-04-01

The research was motivated by a lack of literature on archipelago snacks, especially those from West Java. It aims to explore the snack types, processing techniques, and main ingredients in order to plan learning material on archipelago snacks, especially those from West Java. The research methods used are descriptive observations and interviews. The samples were randomly chosen from all regions in West Java. The findings identify traditional snacks from West Java including: 1. snack types that are similar in all sampled regions, namely opak, rangginang, nagasari, aliagrem, cuhcur, keripik, semprong, wajit, dodol, kecimpring, combro, tape ketan, and surabi; typical regional snack types include burayot (Garut), simping kaum (Purwakarta), surabi hejo (Karawang), papais cisaat (Subang), papais moyong and opak bakar (Kuningan), opak oded and ranggesing (Sumedang), gapit and tapel (Cirebon), gulampo and kue aci (Tasikmalaya), wajit cililin and gurilem (West Bandung), and borondong (Bandung District); 2. various processing techniques, namely steaming, boiling, frying, caramelizing, baking, grilling, roasting, and sugaring; 3. various main ingredients, namely rice, local glutinous rice, rice flour, glutinous rice flour, starch, wheat flour, hunkue flour, cassava, sweet potato, banana, nuts, and corn; 4. a snack classification for West Java, namely (1) traditional snacks, (2) creation snacks, (3) modification snacks, and (4) outside-influence snacks.

  6. [Construction and realization of a real-world integrated data warehouse from HIS for re-evaluation of post-marketing traditional Chinese medicine].

    PubMed

    Zhuang, Yan; Xie, Bangtie; Weng, Shengxin; Xie, Yanming

    2011-10-01

To construct a real-world integrated data warehouse for the re-evaluation of post-marketing traditional Chinese medicine, supporting research on the key techniques of clinical re-evaluation, which mainly include the indications of traditional Chinese medicine, dosage and usage, course of treatment, unit medication, combined diseases and adverse reactions; the warehouse provides data for retrospective research on safety, effectiveness and economy, and provides a foundation for prospective research. The integrated data warehouse extracts and integrates data from HIS by means of an information collection system and data warehouse techniques, forming standardized structures and data on which further research proceeds. A data warehouse and several sub data warehouses were built, focused on patients' main records, doctor orders, disease diagnoses, laboratory results and economic indicators in hospital. These data warehouses can provide research data for the re-evaluation of post-marketing traditional Chinese medicine, and they have clinical value. Besides, this work points out directions for further research.

  7. The influence of surface finishing methods on touch-sensitive reactions

    NASA Astrophysics Data System (ADS)

    Kukhta, M. S.; Sokolov, A. P.; Krauinsh, P. Y.; Kozlova, A. D.; Bouchard, C.

    2017-02-01

This paper describes modern technological development trends in jewelry design. In the jewelry industry, new trends associated with the introduction of non-traditional materials and finishing techniques are appearing. Today's information-oriented society enhances the visual aesthetics of new jewelry forms, decoration techniques (depth and surface) and the synthesis of different materials, which together reveal a bias towards the positive effects of visual design. The jewelry industry now includes not only traditional techniques but also improved techniques such as computer-assisted design, 3D prototyping and other alternatives that raise the level of jewelry material processing. The authors present the specific features of ornamental pattern design, decoration types (depth and surface) and a comparative analysis of different approaches to surface finishing. The appearance and effect of a piece of jewelry are identified on the basis of proposed evaluation criteria, and the provision of an advanced visual-aesthetic basis is predicated on touch-sensitive responses.

  8. Hospital positioning: a strategic tool for the 1990s.

    PubMed

    San Augustine, A J; Long, W J; Pantzallis, J

    1992-03-01

    The authors extend the process of market positioning in the health care sector by focusing on the simultaneous utilization of traditional research methods and emerging new computer-based adaptive perceptual mapping technologies and techniques.

  9. Analysis of preparation of Chinese traditional medicine based on the fiber fingerprint drop trace

    NASA Astrophysics Data System (ADS)

    Zhang, Zhilin; Wang, Jialu; Sun, Weimin; Yan, Qi

    2010-11-01

The purpose of the fiber micro-drop analyzing technique is to measure the characteristics of liquids using optical methods. The fiber fingerprint drop trace (FFDT) is a curve of light intensity vs. time that indicates the forming, growing and dripping processes of liquid drops. A pair of fibers was used to monitor the dripping process, and the FFDTs were acquired and analyzed by a computer. Liquid samples of many kinds of preparations of Chinese traditional medicine were tested using the fiber micro-drop sensor in the experiments, and the FFDTs of preparations with different concentrations were analyzed in different ways. Based on the characteristics of the FFDTs, a novel method is proposed to identify different preparations of Chinese traditional medicine and their concentrations through the corresponding relationship between the FFDTs and the physical and chemical parameters of the liquids.

  10. [Near infrared spectroscopy based process trajectory technology and its application in monitoring and controlling of traditional Chinese medicine manufacturing process].

    PubMed

    Li, Wen-Long; Qu, Hai-Bin

    2016-10-01

In this paper, the principle of NIRS (near infrared spectroscopy)-based process trajectory technology is introduced. The main steps of the technique include: ① in-line collection of process spectra from the different techniques; ② unfolding of the 3-D process spectra; ③ determination of the process trajectories and their normal limits; ④ monitoring of new batches with the established MSPC (multivariate statistical process control) models. Applications of the technology to chemical and biological medicines are reviewed briefly. Through a comprehensive introduction of our feasibility research on monitoring traditional Chinese medicine manufacturing processes using NIRS-based multivariate process trajectories, several important problems of practical application that need urgent solutions are identified, and the application prospects of NIRS-based process trajectory technology are fully discussed at the end. Copyright© by the Chinese Pharmaceutical Association.
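Steps ① to ④ can be illustrated with a compact numerical sketch: batch-wise unfolding of the three-way spectral array, PCA to define the normal operating trajectory, and a Hotelling-type T² statistic with an empirical limit. This is a generic MSPC skeleton under stated simplifications (a 3-sigma limit instead of the textbook F-distribution limit, and no SPE/Q statistic), not the authors' implementation:

```python
import numpy as np

def build_mspc_model(batch_spectra, n_components=3):
    """batch_spectra: array of shape (batches, time, wavelengths) holding
    in-line NIR spectra of normal batches.  Batch-wise unfolding followed by
    PCA; returns the pieces needed for Hotelling T-squared monitoring."""
    B = batch_spectra.reshape(batch_spectra.shape[0], -1)    # step 2: unfold 3-D spectra
    mean = B.mean(axis=0)
    U, s, Vt = np.linalg.svd(B - mean, full_matrices=False)  # step 3: PCA trajectory
    scores = U[:, :n_components] * s[:n_components]
    var = scores.var(axis=0, ddof=1)
    t2 = np.sum(scores**2 / var, axis=1)
    limit = t2.mean() + 3.0 * t2.std(ddof=1)   # simplified 3-sigma normal limit
    return {"mean": mean, "loadings": Vt[:n_components], "var": var, "limit": limit}

def monitor_batch(model, new_batch):
    """Step 4: True when the new batch's T-squared stays within the limit."""
    score = model["loadings"] @ (new_batch.ravel() - model["mean"])
    t2 = np.sum(score**2 / model["var"])
    return t2 <= model["limit"]
```

A faulted batch whose trajectory departs along a direction the model has seen will exceed the T² limit; faults in directions outside the retained components would need the complementary SPE/Q statistic omitted here.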

  11. Autologous Fat Grafting to the Breast Using REVOLVE System to Reduce Clinical Costs.

    PubMed

    Brzezienski, Mark A; Jarrell, John A

    2016-09-01

With the increasing popularity of fat grafting over the past decade, the techniques for harvest, processing and preparation, and transfer of the fat cells have evolved to improve efficiency and consistency. The REVOLVE System is a fat processing device used in autologous fat grafting which eliminates much of the specialized equipment as well as the labor-intensive and time-consuming efforts of the original Coleman technique of fat processing. This retrospective study evaluates the economics of fat grafting, comparing traditional Coleman processing to the REVOLVE System. From June 2013 through December 2013, 88 fat grafting cases by a single surgeon were reviewed. Timed procedures using either the REVOLVE System or Coleman technique were extracted from the group. Data including fat grafting procedure time, harvested volume, harvest and recipient sites, and concurrent procedures were gathered. Cost and utilization assessments were performed comparing the economics between the groups using standard values of operating room costs provided by the study hospital. Thirty-seven patients with timed procedures were identified, 13 of whom were Coleman technique patients and 24 of whom were REVOLVE System patients. The average rate of fat transfer was 1.77 mL/minute for the Coleman technique and 4.69 mL/minute for the REVOLVE System, a statistically significant difference (P < 0.0001) between the 2 groups. Cost analysis comparing the REVOLVE System and Coleman techniques demonstrates a dramatic divergence in the price per mL of transferred fat at 75 mL when using the previously calculated rates for each group. This single surgeon's experience with the REVOLVE System for fat processing establishes economic support for its use in specific high-volume fat grafting cases.
Cost analysis comparing the REVOLVE System and Coleman techniques suggests that in cases of planned fat transfer of 75 mL or more, using the REVOLVE System for fat processing is more economically beneficial. This study may serve as a guide to plastic surgeons in deciding which cases might be appropriate for the use of the REVOLVE System and is the first report comparing economics of fat grafting with the traditional Coleman technique and the REVOLVE System.
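The cost crossover near 75 mL can be reproduced with simple arithmetic. Only the transfer rates (1.77 and 4.69 mL/min) come from the abstract; the per-minute operating-room cost and device cost below are hypothetical placeholders, since the hospital's standard values are not given in the record:

```python
COLEMAN_RATE_ML_MIN = 1.77   # mL/min, from the study
REVOLVE_RATE_ML_MIN = 4.69   # mL/min, from the study

def processing_cost(volume_ml, rate_ml_min, or_cost_per_min, device_cost=0.0):
    """Operating-room time cost of the fat-processing step at a given
    transfer rate, plus any single-use device cost (hypothetical dollars)."""
    return (volume_ml / rate_ml_min) * or_cost_per_min + device_cost

def breakeven_volume_ml(or_cost_per_min, revolve_device_cost):
    """Transfer volume at which the REVOLVE System's device cost is offset
    by the OR minutes it saves relative to the Coleman technique."""
    minutes_saved_per_ml = 1.0 / COLEMAN_RATE_ML_MIN - 1.0 / REVOLVE_RATE_ML_MIN
    return revolve_device_cost / (minutes_saved_per_ml * or_cost_per_min)

# with hypothetical figures ($40/min OR time, $1055 device) the break-even
# volume lands close to the 75 mL threshold reported in the study
volume = breakeven_volume_ml(40.0, 1055.0)
```

Below the break-even volume the faster device cannot recoup its cost; above it, every additional millilitre widens the gap in REVOLVE's favor.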

  12. How we process trephine biopsy specimens: epoxy resin embedded bone marrow biopsies

    PubMed Central

    Krenacs, T; Bagdi, E; Stelkovics, E; Bereczki, L; Krenacs, L

    2005-01-01

Improved cytomorphology of semithin resin sections over paraffin wax embedded sections may be important in diagnostic haematopathology. However, resin embedding can make immunohistochemical antigen detection or DNA isolation for clonal gene rearrangement assays difficult. This review describes the processing of bone marrow biopsies using buffered formaldehyde based fixation and epoxy resin embedding, with or without EDTA decalcification. Traditional semithin resin sections are completely rehydrated after etching in homemade sodium methoxide solution. Resin elimination allows high resolution staining of tissue components with common histological stains. Efficient antigen retrieval and the Envision-HRP system permit the immunohistological detection of many antigens of diagnostic relevance, with retention of high quality cytomorphology. Furthermore, DNA can be extracted for clonality analysis. The technique can be completed within a similar time period to that of paraffin wax processing with only ∼30% increase in cost. This technique has been used for diagnosis in over 4000 bone marrow biopsies over the past 14 years. By meeting traditional and contemporary demands on the haematopathologist, it offers a powerful alternative to paraffin wax processing for diagnosis and research. PMID:16126867

  13. Development of materials for the rapid manufacture of die cast tooling

    NASA Astrophysics Data System (ADS)

    Hardro, Peter Jason

    The focus of this research is to develop a material composition that can be processed by rapid prototyping (RP) in order to produce tooling for the die casting process. Where these rapidly produced tools will be superior to traditional tooling production methods by offering one or more of the following advantages: reduced tooling cost, shortened tooling creation time, reduced man-hours for tool creation, increased tool life, and shortened die casting cycle time. By utilizing RP's additive build process and vast material selection, there was a prospect that die cast tooling may be produced quicker and with superior material properties. To this end, the material properties that influence die life and cycle time were determined, and a list of materials that fulfill these "optimal" properties were highlighted. Physical testing was conducted in order to grade the processability of each of the material systems and to optimize the manufacturing process for the downselected material system. Sample specimens were produced and microscopy techniques were utilized to determine a number of physical properties of the material system. Additionally, a benchmark geometry was selected and die casting dies were produced from traditional tool materials (H13 steel) and techniques (machining) and from the newly developed materials and RP techniques (selective laser sintering (SLS) and laser engineered net shaping (LENS)). Once the tools were created, a die cast alloy was selected and a preset number of parts were shot into each tool. During tool creation, the manufacturing time and cost was closely monitored and an economic model was developed to compare traditional tooling to RP tooling. This model allows one to determine, in the early design stages, when it is advantageous to implement RP tooling and when traditional tooling would be best. 
The results of the physical testing and economic analysis have shown that RP tooling is able to achieve a number of the research objectives, namely, reduced tooling cost, shortened tooling creation time, and reduced man-hours for tool creation. Identifying the appropriate time to use RP tooling appears to be the most important aspect in achieving successful implementation.
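The economic model itself is not reproduced in the record; a simplified break-even sketch of the kind of comparison described (amortizing tool cost over a production run, replacing tools as they reach end of life) is shown below. All dollar figures, tool lives, and cycle times are hypothetical placeholders, not data from the study:

```python
import math

def cost_per_part(tool_cost, tool_life_parts, cycle_time_s,
                  machine_rate_per_hr, n_parts):
    """Die-casting cost per part: tooling amortized over the run (tools are
    replaced as they wear out) plus machine time at the given cycle time."""
    tools_needed = math.ceil(n_parts / tool_life_parts)
    machine_cost = n_parts * cycle_time_s / 3600.0 * machine_rate_per_hr
    return (tools_needed * tool_cost + machine_cost) / n_parts

# hypothetical comparison: a cheap, short-lived RP tool with a slightly
# faster cycle vs. an expensive, long-lived machined H13 tool
def rp_tool_cost_per_part(n_parts):
    return cost_per_part(8000.0, 20000, 55.0, 90.0, n_parts)

def h13_tool_cost_per_part(n_parts):
    return cost_per_part(20000.0, 100000, 60.0, 90.0, n_parts)
```

Under these placeholder numbers, short runs favor the low-cost RP tool while very long runs favor the durable machined tool, which is exactly the early-design-stage decision the author's model is meant to support.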

  14. Decomposition-Based Decision Making for Aerospace Vehicle Design

    NASA Technical Reports Server (NTRS)

Borer, Nicholas K.; Mavris, Dimitri N.

    2005-01-01

Most practical engineering systems design problems have multiple and conflicting objectives. Furthermore, the satisfactory attainment level for each objective (requirement) is likely uncertain early in the design process. Systems with long design cycle times will exhibit more of this uncertainty throughout the design process. This is further complicated if the system is expected to perform for a relatively long period of time, as now it will need to grow as new requirements are identified and new technologies are introduced. These points identify a need for a systems design technique that enables decision making amongst multiple objectives in the presence of uncertainty. Traditional design techniques deal with a single objective or a small number of objectives that are often aggregates of the overarching goals sought through the generation of a new system. Other requirements, although uncertain, are viewed as static constraints to this single or multiple objective optimization problem. With either of these formulations, enabling tradeoffs between the requirements, objectives, or combinations thereof is a slow, serial process that becomes increasingly complex as more criteria are added. This research proposal outlines a technique that attempts to address these and other idiosyncrasies associated with modern aerospace systems design. The proposed formulation first recasts systems design into a multiple criteria decision making problem. The now multiple objectives are decomposed to discover the critical characteristics of the objective space. Tradeoffs between the objectives are considered amongst these critical characteristics by comparison to a probabilistic ideal tradeoff solution. The proposed formulation represents a radical departure from traditional methods. A pitfall of this technique is in the validation of the solution: in a multi-objective sense, how can a decision maker justify a choice between non-dominated alternatives?
A series of examples help the reader to observe how this technique can be applied to aerospace systems design and compare the results of this so-called Decomposition-Based Decision Making to more traditional design approaches.
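The "comparison to a probabilistic ideal tradeoff solution" is reminiscent of ideal-point multi-criteria methods such as TOPSIS; the deterministic sketch below is offered only as an analogy to that family of techniques, not as the author's formulation, and the criteria, weights, and alternatives are invented:

```python
import numpy as np

def ideal_point_closeness(scores, weights, benefit):
    """TOPSIS-style ranking: normalize the decision matrix, then score each
    alternative by closeness to an ideal solution and distance from the
    anti-ideal one.  scores: (alternatives, criteria); benefit[j] is True
    where a larger value of criterion j is better."""
    v = scores / np.linalg.norm(scores, axis=0) * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_plus = np.linalg.norm(v - ideal, axis=1)
    d_minus = np.linalg.norm(v - anti, axis=1)
    return d_minus / (d_plus + d_minus)   # 1 = ideal tradeoff, 0 = anti-ideal

# e.g. three vehicle concepts scored on (range: higher better, cost: lower better)
concepts = np.array([[0.9, 100.0],
                     [0.5, 300.0],
                     [0.4, 400.0]])
closeness = ideal_point_closeness(concepts, np.array([0.5, 0.5]),
                                  np.array([True, False]))
```

Among non-dominated alternatives the closeness score gives the decision maker a single defensible ordering, which is precisely the validation question the proposal raises.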

  15. The Medicine of Coming to Center: Use of the Native American Centering Technique--Ayeli--To Promote Wellness and Healing in Group Work

    ERIC Educational Resources Information Center

    Garrett, Michael Tlanusta; Brubaker, Michael; Torres-Rivera, Edil; West-Olatunji, Cirecie; Conwill, William L.

    2008-01-01

    This article provides group counselors a description of Ayeli, a culturally-based centering technique rooted in Native American traditions. Ayeli is a process that allows participants an opportunity to experience and reflect on four crucial elements relevant to wellness from a Native American perspective: belonging, mastery, independence, and…

  16. Utilising three-dimensional printing techniques when providing unique assistive devices: A case report.

    PubMed

    Day, Sarah Jane; Riley, Shaun Patrick

    2018-02-01

The evolution of three-dimensional printing into prosthetics has opened conversations about the availability and cost of prostheses. This report will discuss how a prosthetic team incorporated additive manufacturing techniques into the treatment of a patient with a partial hand amputation to create and test a unique assistive device which he could use to hold his French horn. Case description and methods: Using a process of shape capture, photogrammetry, computer-aided design and finite element analysis, a suitable assistive device was designed and tested. The design was fabricated using three-dimensional printing. Patient satisfaction was measured using a Pugh's Matrix™, and a cost comparison was made between the process used and traditional manufacturing. Findings and outcomes: Patient satisfaction was high. The three-dimensional printed devices were 56% cheaper to fabricate than a similar laminated device. Computer-aided design and three-dimensional printing proved to be an effective method for designing, testing and fabricating a unique assistive device. Clinical relevance: CAD and 3D printing techniques can enable devices to be designed, tested and fabricated more cheaply than when using traditional techniques. This may lead to improvements in quality and accessibility.

  17. Removable partial denture alloys processed by laser-sintering technique.

    PubMed

    Alageel, Omar; Abdallah, Mohamed-Nur; Alsheghri, Ammar; Song, Jun; Caron, Eric; Tamimi, Faleh

    2018-04-01

Removable partial dentures (RPDs) are traditionally made using a casting technique. New additive manufacturing processes based on laser sintering have been developed for quick fabrication of RPD metal frameworks at low cost. The objective of this study was to characterize the mechanical, physical, and biocompatibility properties of RPD cobalt-chromium (Co-Cr) alloys produced by two laser-sintering systems and compare them to those prepared using traditional casting methods. The laser-sintered Co-Cr alloys were processed by the selective laser-sintering (SLS) method and the direct metal laser-sintering (DMLS) method using the Phenix system (L-1) and the EOS system (L-2), respectively. The L-1 and L-2 techniques were 8 and 3.5 times more precise than the casting (CC) technique (p < 0.05). Co-Cr alloys processed by L-1 and L-2 showed higher (p < 0.05) hardness (14-19%), yield strength (10-13%), and fatigue resistance (71-72%) compared to CC alloys, probably due to their smaller grain size and higher microstructural homogeneity. All Co-Cr alloys exhibited low porosity (2.1-3.3%); however, pore distribution was more homogeneous in L-1 and L-2 alloys than in CC alloys. Both laser-sintered and cast alloys were biocompatible. In conclusion, laser-sintered alloys are more precise and present better mechanical and fatigue properties than cast alloys for RPDs. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 106B: 1174-1185, 2018.

  18. Continuous welding of unidirectional fiber reinforced thermoplastic tape material

    NASA Astrophysics Data System (ADS)

    Schledjewski, Ralf

    2017-10-01

Continuous welding techniques like thermoplastic tape placement with in situ consolidation offer several advantages over traditional manufacturing processes like autoclave consolidation, thermoforming, etc. However, several important processing issues still need to be solved before it becomes an economically viable process. Intensive process analysis and optimization have been carried out in the past through experimental investigation, model definition and simulation development. Today, process simulation is capable of predicting the resulting consolidation quality, and the effects of material imperfections or process parameter variations are well known. But using this knowledge to control the process, based on online process monitoring and corresponding adaptation of the process parameters, is still challenging: solving inverse problems and using methods for automated code generation that allow fast implementation of algorithms on targets are required. The paper explains the placement technique in general. Process-material-property relationships and typical material imperfections are described. Furthermore, online monitoring techniques and how to use them for a model-based process control system are presented.

  19. Humidity Measurements: A Psychrometer Suitable for On-Line Data Acquisition.

    ERIC Educational Resources Information Center

    Caporaloni, Marina; Ambrosini, Roberto

    1992-01-01

    Explains the typical design, operation, and calibration of a traditional psychrometer. Presents the method utilized for this class project with design considerations, calibration techniques, remote data sensing schematic, and specifics of the implementation process. (JJK)

  20. Process engineering of polynanomeric layered and infused composites

    NASA Astrophysics Data System (ADS)

    Williams, Ebonee Porche Marie

    As the application of advanced polymeric composites expands, the continued adaptation of traditional technologies, as well as the incorporation and/or implementation of new ones, continues to be at the core of development for engineers. One of these traditional technologies is honeycomb sandwich composites. This technology has been around for more than fifty years, and there have been minimal alterations to the materials used to produce the parts and the process used to manufacture the structures. This is where this work focused. Traditional honeycomb core dip resin systems were modified to incorporate nanoscale fillers. This adaptation is one of the defining aspects of polynanomeric systems, the concept of which is that nanoscale modifications in a polymer system create nano-layered structures that emulate the properties of both the polymer and the nanofiller: a nanocomposite. The modified resin systems were characterized to investigate morphology, thermal and mechanical properties, and electrical characteristics. It was observed that the nano-altered resin system exhibited increased mechanical properties (50 to 60%) and thermal properties (burn temperatures extended by 30°C), while also demonstrating improved electrical properties. These were significant results given that the main applications of honeycomb sandwich structures are on the interiors of aircraft, and they could open the door to new applications of the modified resin system. This work also implemented a new processing technique to produce honeycomb sandwich structures: Vacuum Assisted Resin Transfer Molding (VARTM), which has gained interest over the last decade due to the reduced up-front cost to initiate production, the ease of processing, and the overall health benefits. This process was successfully performed to produce sandwich structures by incorporating a permeable scrim layer at the core face sheet interface.
This was the first successful production of an unfilled honeycomb core sandwich part. Overall, this work is at the forefront of implementing new materials and processing techniques into honeycomb core and honeycomb sandwich composite structures.

  1. Recent advances in electronic nose techniques for monitoring of fermentation process.

    PubMed

    Jiang, Hui; Zhang, Hang; Chen, Quansheng; Mei, Congli; Liu, Guohai

    2015-12-01

    A microbial fermentation process is often sensitive to even slight changes in conditions, which may result in unacceptable end-product quality. Thus, monitoring the process is critical for discovering unfavorable deviations as early as possible and taking appropriate measures. However, traditional analytical techniques are often time-consuming and labor-intensive. In this sense, the most effective way of developing a rapid, accurate and relatively economical method for quality assurance in microbial fermentation is the use of novel chemical sensor systems. Electronic nose techniques have particular advantages in non-invasive monitoring of microbial fermentation processes. Therefore, in this review, we present an overview of the most important contributions dealing with quality control in microbial fermentation using electronic nose techniques. After a brief description of the fundamentals of the sensor techniques, some examples of potential monitoring applications are provided, including the implementation of control strategies and the combination with other monitoring tools (i.e. sensor fusion). Finally, on the basis of the review, the electronic nose techniques are critically discussed, with their strengths and weaknesses highlighted. In addition, on the basis of the observed trends, we also outline the technical challenges and future outlook for electronic nose techniques.

  2. The value of remote sensing techniques in supporting effective extrapolation across multiple marine spatial scales.

    PubMed

    Strong, James Asa; Elliott, Michael

    2017-03-15

    The reporting of ecological phenomena and environmental status routinely requires point observations, collected with traditional sampling approaches, to be extrapolated to larger reporting scales. This process encompasses difficulties that can quickly entrain significant errors. Remote sensing techniques offer insights and exceptional spatial coverage for observing the marine environment. This review provides guidance on (i) the structures and discontinuities inherent within the extrapolative process, (ii) how to extrapolate effectively across multiple spatial scales, and (iii) remote sensing techniques and data sets that can facilitate this process. This evaluation illustrates that remote sensing techniques are a critical component in extrapolation and likely to underpin the production of high-quality assessments of ecological phenomena and the regional reporting of environmental status. Ultimately, it is hoped that this guidance will aid the production of robust and consistent extrapolations that also make full use of the techniques and data sets that expedite this process. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Microwave-Assisted Hydro-Distillation of Essential Oil from Rosemary: Comparison with Traditional Distillation

    PubMed Central

    Moradi, Sara; Fazlali, Alireza; Hamedi, Hamid

    Background: Hydro-distillation (HD) is a traditional technique used in most industrial companies. Microwave-assisted hydro-distillation (MAHD) is an advanced HD technique utilizing a microwave oven in the extraction process. Methods: In this research, MAHD of essential oils from the aerial parts (leaves) of rosemary (Rosmarinus officinalis L.) was studied and the results were compared with those of conventional HD in terms of extraction time, extraction efficiency, chemical composition, quality of the essential oils and cost of the operation. Results: Microwave hydro-distillation was superior in terms of saving energy and extraction time (30 min, compared to 90 min in HD). Chromatography was used for quantitative analysis of the essential oil composition. The quality of the essential oil improved in the MAHD method due to an increase of 17% in oxygenated compounds. Conclusion: Consequently, microwave hydro-distillation can be used as a substitute for traditional hydro-distillation. PMID:29296263

  4. Microwave-Assisted Hydro-Distillation of Essential Oil from Rosemary: Comparison with Traditional Distillation.

    PubMed

    Moradi, Sara; Fazlali, Alireza; Hamedi, Hamid

    2018-01-01

    Hydro-distillation (HD) is a traditional technique used in most industrial companies. Microwave-assisted hydro-distillation (MAHD) is an advanced HD technique utilizing a microwave oven in the extraction process. In this research, MAHD of essential oils from the aerial parts (leaves) of rosemary (Rosmarinus officinalis L.) was studied and the results were compared with those of conventional HD in terms of extraction time, extraction efficiency, chemical composition, quality of the essential oils and cost of the operation. Microwave hydro-distillation was superior in terms of saving energy and extraction time (30 min, compared to 90 min in HD). Chromatography was used for quantitative analysis of the essential oil composition. The quality of the essential oil improved in the MAHD method due to an increase of 17% in oxygenated compounds. Consequently, microwave hydro-distillation can be used as a substitute for traditional hydro-distillation.

  5. GAPscreener: an automatic tool for screening human genetic association literature in PubMed using the support vector machine technique.

    PubMed

    Yu, Wei; Clyne, Melinda; Dolan, Siobhan M; Yesupriya, Ajay; Wulf, Anja; Liu, Tiebin; Khoury, Muin J; Gwinn, Marta

    2008-04-22

    Synthesis of data from published human genetic association studies is a critical step in the translation of human genome discoveries into health applications. Although genetic association studies account for a substantial proportion of the abstracts in PubMed, identifying them with standard queries is not always accurate or efficient. Further automating the literature-screening process can reduce the burden of a labor-intensive and time-consuming traditional literature search. The Support Vector Machine (SVM), a well-established machine learning technique, has been successful in classifying text, including biomedical literature. GAPscreener, a free SVM-based software tool, can be used to assist in screening PubMed abstracts for human genetic association studies. The data source for this research was the HuGE Navigator, formerly known as the HuGE Pub Lit database. Weighted SVM feature selection based on a keyword list obtained by the two-way z score method demonstrated the best screening performance, achieving 97.5% recall, 98.3% specificity and 31.9% precision in performance testing. Compared with the traditional screening process based on a complex PubMed query, the SVM tool reduced the number of abstracts requiring individual review by the database curator by about 90%. The tool also identified 47 articles that were missed by the traditional literature screening process during the 4-week test period. We examined the literature on genetic associations with preterm birth as an example. Compared with the traditional, manual process, GAPscreener both reduced effort and improved accuracy. GAPscreener is the first free SVM-based application available for screening the human genetic association literature in PubMed with high recall and specificity. The user-friendly graphical user interface makes this a practical, stand-alone application. The software can be downloaded at no charge.
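    The "two-way z score" keyword selection mentioned above can be read as a pooled-proportion two-sample z test on a keyword's document frequency in relevant versus irrelevant abstracts. A minimal sketch, assuming token-set abstracts and the standard pooled-proportion formula (the record does not spell out GAPscreener's exact statistic, so this is an illustration, not the tool's implementation):

```python
import math

def two_way_z(word, pos_docs, neg_docs):
    """Pooled-proportion z score comparing a word's document frequency
    in positive (relevant) vs. negative (irrelevant) abstracts."""
    n1, n2 = len(pos_docs), len(neg_docs)
    x1 = sum(word in d for d in pos_docs)
    x2 = sum(word in d for d in neg_docs)
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se if se else 0.0     # 0 when the word carries no signal

# hypothetical toy corpora (each abstract as a set of tokens)
pos = [{"polymorphism", "gene"}] * 8 + [{"gene"}] * 2
neg = [{"gene"}] * 9 + [{"polymorphism", "gene"}] * 1
z_marker = two_way_z("polymorphism", pos, neg)  # discriminative keyword
z_common = two_way_z("gene", pos, neg)          # present everywhere: no signal
```

    High-|z| words would then serve as weighted features for the SVM screening step.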

  6. Expert system and process optimization techniques for real-time monitoring and control of plasma processes

    NASA Astrophysics Data System (ADS)

    Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.

    1991-03-01

    To meet the ever-increasing demands of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring, and adaptive process control. To this end, we have developed an integrated knowledge-based approach combining the latest expert system technology, machine learning methods, and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the task of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level, qualitative descriptions of processes and thus make the process behavior easy to monitor, predict, and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and incorporated with two commercially available packages: G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).
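    The traditional SPC component referenced above rests on control-chart logic: estimate limits from in-control baseline data, then flag excursions for the expert system to act on. A minimal Shewhart-style sketch (illustrative only, with hypothetical numbers; not the RIST or KARSM implementation):

```python
import statistics

def control_limits(baseline, k=3.0):
    """Shewhart-style limits from in-control baseline data: mean ± k·sigma."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mu - k * sigma, mu + k * sigma

def out_of_control(points, lo, hi):
    """Indices of observations falling outside the control limits."""
    return [i for i, p in enumerate(points) if not (lo <= p <= hi)]

# hypothetical etch-rate baseline (nm/min) from an in-control plasma process
baseline = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.05, 9.95]
lo, hi = control_limits(baseline)
alarms = out_of_control([10.0, 10.1, 12.0], lo, hi)  # the drifted run is flagged
```

    In an integrated system such as the one described, a flagged excursion would trigger the rule-based layer to diagnose and adapt process parameters.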

  7. Real time automatic detection of bearing fault in induction machine using kurtogram analysis.

    PubMed

    Tafinine, Farid; Mokrani, Karim

    2012-11-01

    A signal processing technique for incipient real-time bearing fault detection based on kurtogram analysis is presented in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. The technique starts by investigating the resonance signatures over selected frequency bands to extract the representative features. Traditional spectral analysis is not appropriate for non-stationary vibration signals or for real-time diagnosis. The performance of the proposed technique is examined by a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective automatic bearing fault detection method and gives a good basis for an integrated induction machine condition monitor.
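    The kurtogram builds on spectral kurtosis: band-filter the vibration signal and measure how impulsive each band is, since a fault-excited resonance band is spiky while tonal or Gaussian bands are not. A minimal sketch using a simple FFT band mask and the normalized fourth-moment definition (approximately 0 for complex Gaussian noise, -1 for a pure tone); the band edges and signal below are hypothetical, and this is not the authors' implementation:

```python
import numpy as np

def band_kurtosis(x, fs, f_lo, f_hi):
    """Spectral kurtosis of one frequency band: isolate the (one-sided)
    band with an FFT mask and normalize the fourth moment of its envelope."""
    n = len(x)
    X = np.fft.fft(x)
    freqs = np.fft.fftfreq(n, 1 / fs)
    Xb = np.where((freqs >= f_lo) & (freqs < f_hi), X, 0)  # keep one band
    xb = np.fft.ifft(Xb)                   # complex band-limited signal
    p2 = np.mean(np.abs(xb) ** 2)
    p4 = np.mean(np.abs(xb) ** 4)
    return p4 / p2 ** 2 - 2.0              # ~0 Gaussian, -1 pure tone, >0 impulsive

fs = 10_000
t = np.arange(0, 1, 1 / fs)
sig = np.sin(2 * np.pi * 100 * t)          # smooth shaft-related tone
sig[::500] += 5.0                          # repetitive fault-like shocks

sk_tone = band_kurtosis(sig, fs, 50, 150)      # tonal band: near -1
sk_fault = band_kurtosis(sig, fs, 2500, 3500)  # impulsive band: strongly positive
```

    Scanning such band-wise kurtosis values over a grid of center frequencies and bandwidths yields the kurtogram, whose maximum indicates the band best suited for envelope analysis.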

  8. Benefits from remote sensing data utilization in urban planning processes and system recommendations

    NASA Technical Reports Server (NTRS)

    Mallon, H. J.; Howard, J. Y.

    1972-01-01

    The benefits of utilizing remote sensor data in the urban planning process of the Metropolitan Washington Council of Governments are investigated. An evaluation of sensor requirements and a description/comparison of the costs, benefits, levels of accuracy, ease of attainment, and frequency of update possible using sensor versus traditional data acquisition techniques are discussed.

  9. A Data-Driven Solution for Performance Improvement

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Marketed as the "Software of the Future," Optimal Engineering Systems P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user- friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.

  10. Main Oxidizer Valve Design

    NASA Technical Reports Server (NTRS)

    Addona, Brad; Eddleman, David

    2015-01-01

    A developmental Main Oxidizer Valve (MOV) was designed by NASA-MSFC using additive manufacturing processes. The MOV is a pneumatically actuated poppet valve to control the flow of liquid oxygen to an engine's injector. A compression spring is used to return the valve to the closed state when pneumatic pressure is removed from the valve. The valve internal parts are cylindrical in shape, which lends itself to traditional lathe and milling operations. However, the valve body represents a complicated shape and contains the majority of the mass of the valve. Additive manufacturing techniques were used to produce a part that optimized mass and allowed for design features not practical with traditional machining processes.

  11. The cophylogeny of populations and cultures: reconstructing the evolution of Iranian tribal craft traditions using trees and jungles

    PubMed Central

    Tehrani, Jamshid J.; Collard, Mark; Shennan, Stephen J.

    2010-01-01

    Phylogenetic approaches to culture have shed new light on the role played by population dispersals in the spread and diversification of cultural traditions. However, the fact that cultural inheritance is based on separate mechanisms from genetic inheritance means that socially transmitted traditions have the potential to diverge from population histories. Here, we suggest that associations between these two systems can be reconstructed using techniques developed to study cospeciation between hosts and parasites and related problems in biology. Relationships among the latter are patterned by four main processes: co-divergence, intra-host speciation (duplication), intra-host extinction (sorting) and horizontal transfers. We show that patterns of cultural inheritance are structured by analogous processes, and then demonstrate the applicability of the host–parasite model to culture using empirical data on Iranian tribal populations. PMID:21041211

  12. Connected Text Reading and Differences in Text Reading Fluency in Adult Readers

    PubMed Central

    Wallot, Sebastian; Hollis, Geoff; van Rooij, Marieke

    2013-01-01

    The process of connected text reading has received very little attention in contemporary cognitive psychology. This lack of attention is in part due to a research tradition that emphasizes the role of basic lexical constituents, which can be studied in isolated words or sentences. However, it is also in part due to the lack of statistical analysis techniques that accommodate interdependent time series. In this study, we investigate text reading performance with traditional and nonlinear analysis techniques and show how outcomes from multiple analyses can be used to create a more detailed picture of the process of text reading. Specifically, we investigate the reading performance of groups of literate adult readers who differ in reading fluency during a self-paced text reading task. Our results indicate that classical metrics of reading (such as word frequency) do not capture text reading very well, and that classical measures of reading fluency (such as average reading time) distinguish relatively poorly between participant groups. Nonlinear analyses of distribution tails and reading time fluctuations provide more fine-grained information about the reading process and reading fluency. PMID:23977177

  13. Inherent secure communications using lattice based waveform design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pugh, Matthew Owen

    2013-12-01

    The wireless communications channel is innately insecure due to the broadcast nature of the electromagnetic medium. Many techniques have been developed and implemented in order to combat insecurities and ensure the privacy of transmitted messages. Traditional methods include encrypting the data via cryptographic methods, hiding the data in the noise floor as in wideband communications, or nulling the signal in the spatial direction of the adversary using array processing techniques. This work analyzes the design of signaling constellations, i.e. modulation formats, to combat eavesdroppers from correctly decoding transmitted messages. It has been shown that in certain channel models the ability of an adversary to decode the transmitted messages can be degraded by a clever signaling constellation based on lattice theory. This work attempts to optimize certain lattice parameters in order to maximize the security of the data transmission. These techniques are of interest because they are orthogonal to, and can be used in conjunction with, traditional security techniques to create a more secure communication channel.

  14. Applying traditional signal processing techniques to social media exploitation for situational understanding

    NASA Astrophysics Data System (ADS)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.

  15. An Effective Methodology for Processing and Analyzing Large, Complex Spacecraft Data Streams

    ERIC Educational Resources Information Center

    Teymourlouei, Haydar

    2013-01-01

    The emerging large datasets have made efficient data processing a much more difficult task for the traditional methodologies. Invariably, datasets continue to increase rapidly in size with time. The purpose of this research is to give an overview of some of the tools and techniques that can be utilized to manage and analyze large datasets. We…

  16. DNA Barcoding for the Identification and Authentication of Animal Species in Traditional Medicine.

    PubMed

    Yang, Fan; Ding, Fei; Chen, Hong; He, Mingqi; Zhu, Shixin; Ma, Xin; Jiang, Li; Li, Haifeng

    2018-01-01

    Animal-based traditional medicine not only plays a significant role in therapeutic practices worldwide but also provides a potential compound library for drug discovery. However, persistent hunting and illegal trade markedly threaten numerous medicinal animal species, and increasing demand further provokes the emergence of various adulterants. As the conventional methods are difficult and time-consuming to detect processed products or identify animal species with similar morphology, developing novel authentication methods for animal-based traditional medicine represents an urgent need. During the last decade, DNA barcoding offers an accurate and efficient strategy that can identify existing species and discover unknown species via analysis of sequence variation in a standardized region of DNA. Recent studies have shown that DNA barcoding as well as minibarcoding and metabarcoding is capable of identifying animal species and discriminating the authentics from the adulterants in various types of traditional medicines, including raw materials, processed products, and complex preparations. These techniques can also be used to detect the unlabelled and threatened animal species in traditional medicine. Here, we review the recent progress of DNA barcoding for the identification and authentication of animal species used in traditional medicine, which provides a reference for quality control and trade supervision of animal-based traditional medicine.

  17. DNA Barcoding for the Identification and Authentication of Animal Species in Traditional Medicine

    PubMed Central

    Yang, Fan; Ding, Fei; Chen, Hong; He, Mingqi; Zhu, Shixin; Ma, Xin; Jiang, Li

    2018-01-01

    Animal-based traditional medicine not only plays a significant role in therapeutic practices worldwide but also provides a potential compound library for drug discovery. However, persistent hunting and illegal trade markedly threaten numerous medicinal animal species, and increasing demand further provokes the emergence of various adulterants. As the conventional methods are difficult and time-consuming to detect processed products or identify animal species with similar morphology, developing novel authentication methods for animal-based traditional medicine represents an urgent need. During the last decade, DNA barcoding offers an accurate and efficient strategy that can identify existing species and discover unknown species via analysis of sequence variation in a standardized region of DNA. Recent studies have shown that DNA barcoding as well as minibarcoding and metabarcoding is capable of identifying animal species and discriminating the authentics from the adulterants in various types of traditional medicines, including raw materials, processed products, and complex preparations. These techniques can also be used to detect the unlabelled and threatened animal species in traditional medicine. Here, we review the recent progress of DNA barcoding for the identification and authentication of animal species used in traditional medicine, which provides a reference for quality control and trade supervision of animal-based traditional medicine. PMID:29849709

  18. Task-Driven Dynamic Text Summarization

    ERIC Educational Resources Information Center

    Workman, Terri Elizabeth

    2011-01-01

    The objective of this work is to examine the efficacy of natural language processing (NLP) in summarizing bibliographic text for multiple purposes. Researchers have noted the accelerating growth of bibliographic databases. Information seekers using traditional information retrieval techniques when searching large bibliographic databases are often…

  19. COMPUTERIZED RISK AND BIOACCUMULATION SYSTEM (VERSION 1.0)

    EPA Science Inventory

    CRABS is a combination of a rule-based expert system and more traditional procedural programming techniques. Rule-based expert systems attempt to emulate the decision-making process of human experts within a clearly defined subject area. Expert systems consist of an "inference engi...

  20. A hierarchical network-based algorithm for multi-scale watershed delineation

    NASA Astrophysics Data System (ADS)

    Castronova, Anthony M.; Goodall, Jonathan L.

    2014-11-01

    Watershed delineation is a process for defining the land area that contributes surface water flow to a single outlet point. It is commonly used in water resources analysis to define the domain in which hydrologic process calculations are applied. There has been a growing effort over the past decade to improve surface elevation measurements in the U.S., which has had a significant impact on the accuracy of hydrologic calculations. Traditional watershed processing on these elevation rasters, however, becomes more burdensome as data resolution increases. As a result, processing of these datasets can be troublesome on standard desktop computers. This challenge has resulted in numerous works that aim to provide high performance computing solutions to large data, high resolution data, or both. This work proposes an efficient watershed delineation algorithm for use in desktop computing environments that leverages existing data, the U.S. Geological Survey (USGS) National Hydrography Dataset Plus (NHD+), and open source software tools to construct watershed boundaries. This approach makes use of U.S. national-level hydrography data that has been precomputed using raster processing algorithms coupled with quality control routines. Our approach uses carefully arranged data and mathematical graph theory to traverse river networks and identify catchment boundaries. We demonstrate this new watershed delineation technique, compare its accuracy with traditional algorithms that derive watersheds solely from digital elevation models, and then extend our approach to address subwatershed delineation. Our findings suggest that the open-source hierarchical network-based delineation procedure presented in this work is a promising approach to watershed delineation that can be used to summarize publicly available datasets for hydrologic model input pre-processing.
Through our analysis, we explore the benefits of reusing the NHD+ datasets for watershed delineation, and find that our technique offers greater flexibility and extensibility than traditional raster algorithms.
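    The graph traversal at the core of such a network-based approach can be sketched as a reverse breadth-first search over reach connectivity: store each reach's downstream neighbor, invert the relation, and walk upstream from the outlet. The toy reach IDs below are hypothetical, not NHD+ identifiers:

```python
from collections import defaultdict, deque

def upstream_trace(flows, outlet):
    """Return the set of reaches draining to `outlet`.
    `flows` maps reach id -> downstream reach id (None at a terminal reach)."""
    upstream = defaultdict(list)      # reverse adjacency: reach -> upstream reaches
    for reach, down in flows.items():
        if down is not None:
            upstream[down].append(reach)
    seen, queue = {outlet}, deque([outlet])
    while queue:                      # breadth-first walk against the flow direction
        for up in upstream[queue.popleft()]:
            if up not in seen:
                seen.add(up)
                queue.append(up)
    return seen

# hypothetical network: two headwater pairs joining at confluences 3 and 5
flows = {1: 3, 2: 3, 3: 5, 4: 5, 5: None, 6: 7, 7: None}
```

    Subwatershed delineation falls out of the same routine by tracing from an interior reach instead of the terminal outlet.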

  1. Identifying risks in the realm of enterprise risk management.

    PubMed

    Carroll, Roberta

    2016-01-01

    An enterprise risk management (ERM) discipline is comprehensive and organization-wide. The effectiveness of ERM is governed in part by the strength and breadth of its practices and processes. An essential element in decision making is a thorough process by which organizational risks and value opportunities can be identified. This article will offer identification techniques that go beyond those used in traditional risk management programs and demonstrate how these techniques can be used to identify risks and opportunity in the ERM environment. © 2016 American Society for Healthcare Risk Management of the American Hospital Association.

  2. A Compact, Solid-State UV (266 nm) Laser System Capable of Burst-Mode Operation for Laser Ablation Desorption Processing

    NASA Technical Reports Server (NTRS)

    Arevalo, Ricardo, Jr.; Coyle, Barry; Paulios, Demetrios; Stysley, Paul; Feng, Steve; Getty, Stephanie; Binkerhoff, William

    2015-01-01

    Compared to wet chemistry and pyrolysis techniques, in situ laser-based methods of chemical analysis provide an ideal way to characterize precious planetary materials without requiring extensive sample processing. In particular, laser desorption and ablation techniques allow for rapid, reproducible and robust data acquisition over a wide mass range, plus: quantitative, spatially-resolved measurements of elemental and molecular (organic and inorganic) abundances; low analytical blanks and limits of detection (ng g-1); and the destruction of minimal quantities of sample (µg) compared to traditional solution and/or pyrolysis analyses (mg).

  3. An Intelligent Systems Approach to Automated Object Recognition: A Preliminary Study

    USGS Publications Warehouse

    Maddox, Brian G.; Swadley, Casey L.

    2002-01-01

    Attempts at fully automated object recognition systems have met with varying levels of success over the years. However, none of the systems have achieved high enough accuracy rates to be run unattended. One of the reasons for this may be that they are designed from the computer's point of view and rely mainly on image-processing methods. A better solution to this problem may be to make use of modern advances in computational intelligence and distributed processing to try to mimic how the human brain is thought to recognize objects. As humans combine cognitive processes with detection techniques, such a system would combine traditional image-processing techniques with computer-based intelligence to determine the identity of various objects in a scene.

  4. Parallel k-means++

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A parallelization of the k-means++ seed selection algorithm on three distinct hardware platforms: GPU, multicore CPU, and multithreaded architecture. K-means++ was developed by David Arthur and Sergei Vassilvitskii in 2007 as an extension of the k-means data clustering technique. These algorithms allow people to cluster multidimensional data by attempting to minimize the mean distance of data points within a cluster. K-means++ improved upon traditional k-means by using a more intelligent approach to selecting the initial seeds for the clustering process. While k-means++ has become a popular alternative to traditional k-means clustering, little work has been done to parallelize this technique. We have developed original C++ code for parallelizing the algorithm on three unique hardware architectures: GPU using NVidia's CUDA/Thrust framework, multicore CPU using OpenMP, and the Cray XMT multithreaded architecture. By parallelizing the process for these platforms, we are able to perform k-means++ clustering much more quickly than it could be done before.
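    The seeding step that k-means++ adds to plain k-means is D² sampling: each new center is drawn with probability proportional to a point's squared distance from its nearest existing center. A serial reference sketch of the Arthur-Vassilvitskii seeding step (not the parallel C++/CUDA/OpenMP/XMT code described in this record):

```python
import numpy as np

def kmeans_pp_seeds(X, k, rng):
    """k-means++ seeding: D^2-weighted sampling of k initial centers."""
    n = len(X)
    centers = [X[rng.integers(n)]]     # first seed: uniform at random
    for _ in range(k - 1):
        # squared distance from each point to its nearest chosen center
        d2 = np.min([np.sum((X - c) ** 2, axis=1) for c in centers], axis=0)
        centers.append(X[rng.choice(n, p=d2 / d2.sum())])  # D^2 sampling
    return np.array(centers)

# three well-separated hypothetical clusters; D^2 sampling almost surely
# picks one seed per cluster
rng = np.random.default_rng(7)
X = np.vstack([rng.normal((cx, cy), 0.1, size=(50, 2))
               for cx, cy in [(0, 0), (100, 0), (0, 100)]])
seeds = kmeans_pp_seeds(X, 3, rng)
```

    The parallel versions described in the record mainly distribute the distance computation (`d2`), which dominates the cost for large datasets.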

  5. Application of hydrometallurgy techniques in quartz processing and purification: a review

    NASA Astrophysics Data System (ADS)

    Lin, Min; Lei, Shaomin; Pei, Zhenyu; Liu, Yuanyuan; Xia, Zhangjie; Xie, Feixiang

    2018-04-01

    Although there have been numerous studies on the separation and purification of metallic minerals by hydrometallurgy techniques, applications of these chemical techniques to the separation and purification of non-metallic minerals are rarely reported. This paper reviews disparate areas of study on the processing and purification of quartz (a typical non-metallic ore) in an attempt to summarize current work, as well as to suggest potential for future consolidation in the field. The review encompasses chemical techniques of quartz processing, including the current status, progress, leaching mechanisms, scopes of application, and advantages and drawbacks of micro-bioleaching, high temperature leaching, high temperature pressure leaching, and catalyzed high temperature pressure leaching. Traditional leaching techniques, including micro-bioleaching and high temperature leaching, cannot meet the modern glass industry's demands for the quality of quartz concentrate, because the quartz products have to be further processed. High temperature pressure leaching and catalyzed high temperature pressure leaching provide new ways to produce high-grade quartz sand in a single process with lower acid consumption. Furthermore, catalyzed high temperature pressure leaching achieves effective purification of quartz with extremely low acid consumption (without using HF or any fluoride). It is proposed that, by integrating the different chemical processes of quartz processing and expounding their leaching mechanisms and scopes of application, both the research field and what is currently a monopolized industry would benefit.

  6. A seamless acquisition digital storage oscilloscope with three-dimensional waveform display

    NASA Astrophysics Data System (ADS)

    Yang, Kuojun; Tian, Shulin; Zeng, Hao; Qiu, Lei; Guo, Lianping

    2014-04-01

    In a traditional digital storage oscilloscope (DSO), sampled data need to be processed after each acquisition. During data processing, the acquisition is stopped and the oscilloscope is blind to the input signal; this duration is therefore called dead time. With the rapid development of modern electronic systems, the effect of infrequent events becomes significant. To capture these occasional events in a shorter time, the dead time in a traditional DSO, which causes the loss of measured signal, needs to be reduced or even eliminated. In this paper, a seamless acquisition oscilloscope without dead time is proposed. In this oscilloscope, a three-dimensional waveform mapping (TWM) technique, which converts sampled data to a displayed waveform, is proposed. With this technique, not only is the processing speed improved, but the probability information of the waveform is also displayed with different brightness, so a three-dimensional waveform is shown to the user. To reduce processing time further, parallel TWM, which processes several sampled points simultaneously, and a pipelining technique based on dual-port random access memory, which can process one sampled point per clock period, are proposed. Furthermore, two DDR3 (double-data-rate type three synchronous dynamic random access memory) devices are used to store sampled data alternately, so the acquisition can continue during data processing. Therefore, the dead time of the DSO is eliminated. In addition, a double-pulse test method is adopted to test the waveform capturing rate (WCR) of the oscilloscope, and a combined-pulse test method is employed to evaluate the oscilloscope's capture ability comprehensively. The experimental results show that the WCR of the designed oscilloscope is 6,250,000 wfms/s (waveforms per second), the highest value among existing oscilloscopes. The testing results also prove that there is no dead time in our oscilloscope, thus realizing seamless acquisition.
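The core of the TWM idea, accumulating many acquired waveforms into a time-by-amplitude hit-count grid whose counts drive display brightness, can be illustrated with a minimal software sketch (an analogy only; the oscilloscope described above implements this in hardware):

```python
def waveform_map(waveforms, n_levels, v_min, v_max):
    """Map many acquired waveforms onto a (level, time) hit-count grid.

    Each cell counts how often a sample fell into that amplitude bin at
    that time index; the counts drive display brightness, providing the
    'third dimension' (probability) of the displayed waveform."""
    n_samples = len(waveforms[0])
    grid = [[0] * n_samples for _ in range(n_levels)]
    scale = (n_levels - 1) / (v_max - v_min)
    for wf in waveforms:
        for t, v in enumerate(wf):
            # clamp to the display range, then quantize to a level bin
            level = int((min(max(v, v_min), v_max) - v_min) * scale)
            grid[level][t] += 1
    return grid

# two identical ramps and one flat trace: repeated cells get higher counts
grid = waveform_map([[0.0, 0.5, 1.0], [0.0, 0.5, 1.0], [0.0, 0.0, 0.0]],
                    n_levels=4, v_min=0.0, v_max=1.0)
```

Cells visited by both ramps accumulate a count of 2, while the cell at time 0 (shared by all three traces) reaches 3, so a renderer mapping counts to brightness would show the common trajectory most brightly.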

  7. A seamless acquisition digital storage oscilloscope with three-dimensional waveform display

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Kuojun, E-mail: kuojunyang@gmail.com; Guo, Lianping; School of Electrical and Electronic Engineering, Nanyang Technological University

    In a traditional digital storage oscilloscope (DSO), sampled data need to be processed after each acquisition. During data processing, the acquisition is stopped and the oscilloscope is blind to the input signal; this duration is therefore called dead time. With the rapid development of modern electronic systems, the effect of infrequent events becomes significant. To capture these occasional events in a shorter time, the dead time in a traditional DSO, which causes the loss of measured signal, needs to be reduced or even eliminated. In this paper, a seamless acquisition oscilloscope without dead time is proposed. In this oscilloscope, a three-dimensional waveform mapping (TWM) technique, which converts sampled data to a displayed waveform, is proposed. With this technique, not only is the processing speed improved, but the probability information of the waveform is also displayed with different brightness, so a three-dimensional waveform is shown to the user. To reduce processing time further, parallel TWM, which processes several sampled points simultaneously, and a pipelining technique based on dual-port random access memory, which can process one sampled point per clock period, are proposed. Furthermore, two DDR3 (double-data-rate type three synchronous dynamic random access memory) devices are used to store sampled data alternately, so the acquisition can continue during data processing. Therefore, the dead time of the DSO is eliminated. In addition, a double-pulse test method is adopted to test the waveform capturing rate (WCR) of the oscilloscope, and a combined-pulse test method is employed to evaluate the oscilloscope's capture ability comprehensively. The experimental results show that the WCR of the designed oscilloscope is 6,250,000 wfms/s (waveforms per second), the highest value among existing oscilloscopes. The testing results also prove that there is no dead time in our oscilloscope, thus realizing seamless acquisition.

  8. Conformity does not perpetuate suboptimal traditions in a wild population of songbirds

    PubMed Central

    Aplin, Lucy M.; Sheldon, Ben C.; McElreath, Richard

    2017-01-01

    Social learning is important to the life history of many animals, helping individuals to acquire new adaptive behavior. However, despite long-running debate, it remains an open question whether a reliance on social learning can also lead to mismatched or maladaptive behavior. In a previous study, we experimentally induced traditions for opening a bidirectional door puzzle box in replicate subpopulations of the great tit Parus major. Individuals were conformist social learners, resulting in stable cultural behaviors. Here, we vary the rewards gained by these techniques to ask to what extent established behaviors are flexible to changing conditions. When subpopulations with established foraging traditions for one technique were subjected to a reduced foraging payoff, 49% of birds switched their behavior to a higher-payoff foraging technique after only 14 days, with younger individuals showing a faster rate of change. We elucidated the decision-making process for each individual, using a mechanistic learning model to demonstrate that, perhaps surprisingly, this population-level change was achieved without significant asocial exploration and without any evidence for payoff-biased copying. Rather, by combining conformist social learning with payoff-sensitive individual reinforcement (updating of experience), individuals and populations could both acquire adaptive behavior and track environmental change. PMID:28739943

  9. Contrasting faith-based and traditional substance abuse treatment programs.

    PubMed

    Neff, James Alan; Shorkey, Clayton T; Windsor, Liliane Cambraia

    2006-01-01

    This article (a) discusses the definition of faith-based substance abuse treatment programs, (b) juxtaposes Durkheim's theory regarding religion with a treatment process model to highlight key dimensions of faith-based and traditional programs, and (c) presents results from a study of seven programs to identify key program dimensions and differences/similarities between program types. Focus group/Concept Mapping techniques yielded a clear "spiritual activities, beliefs, and rituals" dimension, rated as significantly more important to faith-based programs. Faith-based program staff also rated "structure and discipline" as more important and "work readiness" as less important. No differences were found for "group activities/cohesion," "role modeling/mentoring," "safe, supportive environment," and "traditional treatment modalities." Programs showed substantial similarities with regard to core social processes of treatment such as mentoring, role modeling, and social cohesion. Implications are considered for further research on treatment engagement, retention, and other outcomes.

  10. [Modified Misgav-Ladach at a tertiary hospital].

    PubMed

    Martínez Ceccopieri, David Alejandro; Barrios Prieto, Ernesto; Martínez Ríos, David

    2012-08-01

    According to several studies from around the globe, the modified Misgav Ladach technique simplifies the surgical procedure for cesarean section; reduces operation time, costs, and complications; and optimizes obstetric and perinatal outcomes. Our objective was to compare obstetric outcomes between patients operated on using the traditional cesarean section technique and those operated on using the modified Misgav Ladach technique. The study included 49 patients operated on using the traditional technique and 47 patients operated on using the modified Misgav Ladach technique. The modified Misgav Ladach technique was associated with more benefits than the traditional technique: less surgical bleeding, shorter operation time, lower total analgesic doses, fewer rescue analgesic doses, and less need for more than one analgesic drug. The modified Misgav Ladach surgical technique was associated with better obstetric results than the traditional surgical technique; this concurs with the results reported by other national and international studies.

  11. Process simulation during the design process makes the difference: process simulations applied to a traditional design.

    PubMed

    Traversari, Roberto; Goedhart, Rien; Schraagen, Jan Maarten

    2013-01-01

    The objective was to evaluate a traditionally designed operating room using simulation of various surgical workflows. A literature search showed that there is no evidence for an optimal operating room layout regarding the position and size of an ultraclean ventilation (UCV) canopy with a separate preparation room for laying out instruments and in which patients are induced in the operating room itself. Nor was literature found reporting on process simulation being used for this application. Many technical guidelines and designs have mainly evolved over time, and there is no evidence on whether the proposed measures are also effective for optimizing the layout for workflows. The study was conducted by applying observational techniques to simulated typical surgical procedures. Process simulations, which included complete surgical teams and the equipment required for the intervention, were carried out for four typical interventions. Four observers used a form to record conflicts with the clean-area boundaries and the height of the supply bridge. Preferences for particular layouts were discussed with the surgical team after each simulated procedure. We established that a clean area measuring 3 × 3 m and a supply bridge height of 2.05 m were satisfactory for most situations, provided a movable operating table is used. The only cases in which conflicts with the supply bridge were observed were during the use of a surgical robot (Da Vinci) and a surgical microscope. During multiple-trauma interventions, bottlenecks regarding the dimensions of the clean area will probably arise. The process simulation of four typical interventions led to significantly different operating room layouts than were arrived at through the traditional design process. Key words: evidence-based design, human factors, work environment, operating room, traditional design, process simulation, surgical workflows. Preferred citation: Traversari, R., Goedhart, R., & Schraagen, J. M. (2013). Process simulation during the design process makes the difference: Process simulations applied to a traditional design. Health Environments Research & Design Journal, 6(2), 58-76.

  12. A comprehensive statistical investigation of schlieren image velocimetry (SIV) using high-velocity helium jet

    NASA Astrophysics Data System (ADS)

    Biswas, Sayan; Qiao, Li

    2017-03-01

    A detailed statistical assessment of seedless velocity measurement using Schlieren Image Velocimetry (SIV) was carried out using the open-source Robust Phase Correlation (RPC) algorithm. A well-known flow field, an axisymmetric turbulent helium jet, was analyzed in the near and intermediate regions (0 ≤ x/d ≤ 20) for two different Reynolds numbers, Re_d = 11,000 and Re_d = 22,000, using schlieren with a horizontal knife-edge, schlieren with a vertical knife-edge, and the shadowgraph technique, and the resulting velocity fields from the SIV techniques were compared to traditional Particle Image Velocimetry (PIV) measurements. A novel, inexpensive, easy-to-set-up two-camera SIV technique was demonstrated for measuring a high-velocity turbulent jet, with jet exit velocities of 304 m/s (Mach 0.3) and 611 m/s (Mach 0.6), respectively. Several image restoration and enhancement techniques were tested to improve the signal-to-noise ratio (SNR) in schlieren and shadowgraph images. Processing and post-processing parameters for the SIV techniques were examined in detail. A quantitative comparison between the self-seeded SIV techniques and traditional PIV was made using correlation statistics. While the resulting flow fields from schlieren with a horizontal knife-edge and from shadowgraph showed excellent agreement with PIV measurements, schlieren with a vertical knife-edge performed poorly. The performance of spatial cross-correlations at different jet locations using the SIV techniques and PIV was evaluated. Turbulence quantities such as turbulence intensity, mean velocity fields, and Reynolds shear stress heavily influenced the spatial correlations and the correlation-plane SNR. Several performance metrics, such as the primary peak ratio (PPR), peak to correlation energy (PCE), and the probability distributions of signal and noise, were used to compare the capability and potential of the different SIV techniques.
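Locating a correlation peak to recover displacement, the operation underlying both PIV and SIV processing, can be illustrated with a 1-D direct cross-correlation sketch (an illustrative analogue on made-up data, not the RPC algorithm used in the study):

```python
def shift_by_correlation(a, b, max_shift):
    """Estimate the displacement of signal b relative to a as the lag
    that maximizes their direct cross-correlation -- the 1-D analogue
    of locating the correlation peak between two PIV/SIV subregions."""
    best_lag, best_val = 0, float("-inf")
    n = len(a)
    for lag in range(-max_shift, max_shift + 1):
        # correlate a against b shifted by `lag`, ignoring out-of-range samples
        v = sum(a[i] * b[i + lag] for i in range(n) if 0 <= i + lag < n)
        if v > best_val:
            best_lag, best_val = lag, v
    return best_lag

sig = [0, 0, 1, 3, 1, 0, 0, 0]
moved = [0, 0, 0, 0, 1, 3, 1, 0]   # the same pulse shifted right by 2 samples
lag = shift_by_correlation(sig, moved, max_shift=3)
```

In practice the 2-D correlation is computed via FFTs for speed, and metrics such as the primary peak ratio compare the tallest peak against the second tallest to judge how trustworthy the detected displacement is.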

  13. Cockpit System Situational Awareness Modeling Tool

    NASA Technical Reports Server (NTRS)

    Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara

    2004-01-01

    This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.

  14. Total Quality Management: Institutional Research Applications.

    ERIC Educational Resources Information Center

    Heverly, Mary Ann

    Total Quality Management (TQM), a technique traditionally reserved for the manufacturing sector, has recently spread to service companies, government agencies, and educational institutions. TQM places responsibility for quality problems with management rather than on the workers. A principal concept of TQM is the management of Process Variation,…

  15. Classroom Control: Some Cybernetic Comments on the Possible and the Impossible.

    ERIC Educational Resources Information Center

    Robinson, Michael

    1979-01-01

    Application of cybernetic laws and information processing principles suggests that traditional and modern teaching methods are radically incompatible, in the sense that techniques developed in the one cannot be transferred to the other without dislocation of the system as a whole. (Author/WBC)

  16. Optimization and Characterization of the Friction Stir Welded Sheets of AA 5754-H111: Monitoring of the Quality of Joints with Thermographic Techniques.

    PubMed

    De Filippis, Luigi Alberto Ciro; Serio, Livia Maria; Palumbo, Davide; De Finis, Rosa; Galietti, Umberto

    2017-10-11

    Friction Stir Welding (FSW) is a solid-state welding process, based on frictional and stirring phenomena, that offers many advantages with respect to traditional welding methods. However, several parameters can affect the quality of the produced joints. In this work, an experimental approach has been used for studying and optimizing the FSW process, applied to 5754-H111 aluminum plates. In particular, the thermal behavior of the material during the process has been investigated, and two thermal indexes correlated to the frictional power input, the maximum temperature and the heating rate of the material, were investigated for different configurations of the process parameters (the tool travel and rotation speeds). Moreover, other techniques (micrographs, macrographs, and destructive tensile tests) were carried out to support, in a quantitative way, the analysis of the quality of the welded joints. The potential of the thermographic technique has been demonstrated both for monitoring the FSW process and for predicting the quality of joints in terms of tensile strength.
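The two thermal indexes named above, the maximum temperature and the heating rate, are straightforward to extract from a thermographic temperature-time trace; the following sketch is illustrative only, with a made-up trace and sampling interval:

```python
def thermal_indexes(temps, dt):
    """Extract two thermal indexes from a temperature-time trace:
    the maximum temperature reached, and the peak heating rate
    (largest forward difference divided by the sampling interval dt)."""
    t_max = max(temps)
    heating_rate = max((temps[i + 1] - temps[i]) / dt
                       for i in range(len(temps) - 1))
    return t_max, heating_rate

# hypothetical trace (deg C) sampled every 0.5 s during tool plunge and dwell
trace = [25.0, 80.0, 190.0, 310.0, 390.0, 420.0, 415.0]
t_max, rate = thermal_indexes(trace, dt=0.5)
```

Both indexes rise with the frictional power input, which is why they can serve as process-monitoring proxies for joint quality.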

  17. Wind Gust Measurement Techniques-From Traditional Anemometry to New Possibilities.

    PubMed

    Suomi, Irene; Vihma, Timo

    2018-04-23

    Information on wind gusts is needed for the assessment of wind-induced damage and risks to safety. The measurement of wind gust speed requires a high temporal resolution of the anemometer system, because a gust is defined as a short-duration (seconds) maximum of the fluctuating wind speed. Until the digitalization of wind measurements in the 1990s, wind gust measurements suffered from limited recording and data processing resources. Therefore, the majority of continuous wind gust records date back at most 30 years. Although the response characteristics of anemometer systems are good enough today, the traditional measurement techniques at weather stations based on cup and sonic anemometers are limited to heights and regions that the supporting structures can reach. Therefore, existing measurements are mainly concentrated over densely populated land areas, whereas from remote locations, such as the marine Arctic, wind gust information is available only from sparse coastal locations. Recent developments of wind gust measurement techniques based on turbulence measurements from research aircraft and from Doppler lidar can potentially provide new information from heights and locations unreachable by traditional measurement techniques. Moreover, fast-developing measurement methods based on Unmanned Aircraft Systems (UASs) may add to better coverage of wind gust measurements in the future. In this paper, we provide an overview of the history and the current status of anemometry from the perspective of wind gusts. Furthermore, a discussion on the potential future directions of wind gust measurement techniques is provided.
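The definition of a gust as a short-duration maximum of the fluctuating wind speed can be made concrete with a small sketch: given 1-Hz samples, the gust is the largest short-window moving mean. The 3-s window and sampling rate here are illustrative assumptions, not values taken from the paper:

```python
def wind_gust(speeds, window=3):
    """Gust speed as the maximum of a short moving average:
    here, a 3-sample (i.e. 3-s at 1 Hz) moving mean over the record."""
    means = [sum(speeds[i:i + window]) / window
             for i in range(len(speeds) - window + 1)]
    return max(means)

# hypothetical 1-Hz wind record (m/s) containing one brief gust
speeds = [8.0, 9.0, 14.0, 16.0, 15.0, 9.0, 8.0]
gust = wind_gust(speeds)
```

This is why anemometer response time matters: a slow sensor smears the short-lived peak across a longer interval and under-reports the gust.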

  18. Visualization techniques for tongue analysis in traditional Chinese medicine

    NASA Astrophysics Data System (ADS)

    Pham, Binh L.; Cai, Yang

    2004-05-01

    Visual inspection of the tongue has been an important diagnostic method of Traditional Chinese Medicine (TCM). Clinical data have shown significant connections between various visceral cancers and abnormalities in the tongue and the tongue coating. Visual inspection of the tongue is simple and inexpensive, but the current practice in TCM is mainly experience-based, and the quality of the visual inspection varies between individuals. The computerized inspection method provides quantitative models to evaluate color, texture, and surface features of the tongue. In this paper, we investigate visualization techniques and processes that allow interactive data analysis, with the aim of merging computerized measurements with human experts' diagnostic variables based on a five-class scale of diagnostic conditions: Healthy (H), History of Cancers (HC), History of Polyps (HP), Polyps (P), and Colon Cancer (C).

  19. Electrospinning for nano- to mesoscale photonic structures

    NASA Astrophysics Data System (ADS)

    Skinner, Jack L.; Andriolo, Jessica M.; Murphy, John P.; Ross, Brandon M.

    2017-08-01

    The fabrication of photonic and electronic structures and devices has directed the manufacturing industry for the last 50 years. Currently, the majority of small-scale photonic devices are created by traditional microfabrication techniques that create features by processes such as lithography and electron- or ion-beam direct writing. Microfabrication techniques are often expensive and slow. In contrast, the use of electrospinning (ES) in the fabrication of micro- and nano-scale devices for the manipulation of photons and electrons provides a relatively simple and economically viable alternative. ES involves the delivery of a polymer solution to a capillary held at a high voltage relative to the fiber deposition surface. The electrostatic force developed between the collection plate and the polymer promotes fiber deposition onto the collection plate. Issues with ES fabrication arise primarily from an instability region between the capillary and the collection plate, characterized by chaotic motion of the depositing polymer fiber. Material limitations to ES also exist; not all polymers of interest are amenable to the ES process, due to process dependencies on molecular weight and chain entanglement, incompatibility with other polymers, and overall process compatibility. Passive and active electronic and photonic fibers fabricated through ES have great potential for use in light generation and collection in optical and electronic structures/devices. ES produces fiber devices that can be combined with inorganic, metallic, biological, or organic materials for novel device designs. Synergistic material selection and post-processing techniques are also utilized for broad-ranging applications of organic nanofibers that span from the biological to the electronic, photovoltaic, or photonic. As the ability to electrospin optically and/or electronically active materials in a controlled manner continues to improve, the complexity and diversity of devices fabricated by this process can be expected to grow rapidly and provide an alternative to traditional resource-intensive fabrication techniques.

  20. GAPscreener: An automatic tool for screening human genetic association literature in PubMed using the support vector machine technique

    PubMed Central

    Yu, Wei; Clyne, Melinda; Dolan, Siobhan M; Yesupriya, Ajay; Wulf, Anja; Liu, Tiebin; Khoury, Muin J; Gwinn, Marta

    2008-01-01

    Background Synthesis of data from published human genetic association studies is a critical step in the translation of human genome discoveries into health applications. Although genetic association studies account for a substantial proportion of the abstracts in PubMed, identifying them with standard queries is not always accurate or efficient. Further automating the literature-screening process can reduce the burden of a labor-intensive and time-consuming traditional literature search. The Support Vector Machine (SVM), a well-established machine learning technique, has been successful in classifying text, including biomedical literature. The GAPscreener, a free SVM-based software tool, can be used to assist in screening PubMed abstracts for human genetic association studies. Results The data source for this research was the HuGE Navigator, formerly known as the HuGE Pub Lit database. Weighted SVM feature selection based on a keyword list obtained by the two-way z score method demonstrated the best screening performance, achieving 97.5% recall, 98.3% specificity and 31.9% precision in performance testing. Compared with the traditional screening process based on a complex PubMed query, the SVM tool reduced by about 90% the number of abstracts requiring individual review by the database curator. The tool also ascertained 47 articles that were missed by the traditional literature screening process during the 4-week test period. We examined the literature on genetic associations with preterm birth as an example. Compared with the traditional, manual process, the GAPscreener both reduced effort and improved accuracy. Conclusion GAPscreener is the first free SVM-based application available for screening the human genetic association literature in PubMed with high recall and specificity. The user-friendly graphical user interface makes this a practical, stand-alone application. The software can be downloaded at no charge. PMID:18430222
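The two-way z-score keyword selection mentioned above can be illustrated with a simplified stand-in: score each word by a two-proportion z statistic comparing its document frequency in relevant versus irrelevant abstracts. This sketch (with made-up toy corpora) only covers the keyword-weighting step; the actual GAPscreener feeds such weighted keywords into an SVM classifier:

```python
import math
from collections import Counter

def keyword_z_scores(pos_docs, neg_docs):
    """Two-way z score per word: a two-proportion z statistic comparing
    the word's document frequency in relevant (positive) abstracts
    against irrelevant (negative) ones."""
    def doc_freq(docs):
        c = Counter()
        for d in docs:
            c.update(set(d.lower().split()))   # count each word once per doc
        return c
    fp, fn = doc_freq(pos_docs), doc_freq(neg_docs)
    n_pos, n_neg = len(pos_docs), len(neg_docs)
    scores = {}
    for w in set(fp) | set(fn):
        p1, p2 = fp[w] / n_pos, fn[w] / n_neg
        p = (fp[w] + fn[w]) / (n_pos + n_neg)          # pooled proportion
        denom = math.sqrt(p * (1 - p) * (1 / n_pos + 1 / n_neg))
        scores[w] = 0.0 if denom == 0 else (p1 - p2) / denom
    return scores

# toy corpora: hypothetical abstract fragments, not real PubMed records
pos = ["polymorphism associated with disease risk",
       "genotype association study of cancer"]
neg = ["surgical technique for knee repair",
       "imaging protocol for knee assessment"]
z = keyword_z_scores(pos, neg)
```

Words with strongly positive scores (e.g. "polymorphism" here) become high-weight features for identifying genetic association studies, while strongly negative words indicate irrelevant literature.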

  1. Quality assessment of internet pharmaceutical products using traditional and non-traditional analytical techniques.

    PubMed

    Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F

    2005-12-08

    This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.

  2. Trichotomous processes in early memory development, aging, and neurocognitive impairment: a unified theory.

    PubMed

    Brainerd, C J; Reyna, V F; Howe, M L

    2009-10-01

    One of the most extensively investigated topics in the adult memory literature, dual memory processes, has had virtually no impact on the study of early memory development. The authors remove the key obstacles to such research by formulating a trichotomous theory of recall that combines the traditional dual processes of recollection and familiarity with a reconstruction process. The theory is then embedded in a hidden Markov model that measures all 3 processes with low-burden tasks that are appropriate for even young children. These techniques are applied to a large corpus of developmental studies of recall, yielding stable findings about the emergence of dual memory processes between childhood and young adulthood and generating tests of many theoretical predictions. The techniques are extended to the study of healthy aging and to the memory sequelae of common forms of neurocognitive impairment, resulting in a theoretical framework that is unified over 4 major domains of memory research: early development, mainstream adult research, aging, and neurocognitive impairment. The techniques are also extended to recognition, creating a unified dual process framework for recall and recognition.

  3. Application of TRIZ Theory in Patternless Casting Manufacturing Technique

    NASA Astrophysics Data System (ADS)

    Yang, Weidong; Gan, Dequan; Jiang, Ping; Tian, Yumei

    The ultimate goal of Patternless Casting Manufacturing (PCM) is to obtain castings by forming the sand mold directly. In previous PCM practice, the resin content of the sand mold is much higher than that required by traditional resin sand, so it is difficult for the resulting castings to be sound, qualified products, which greatly limits the application of this technique. In this paper, the TRIZ algorithm is systematically introduced into the innovation process of PCM.

  4. Online Microteaching: A Multifaceted Approach to Teacher Professional Development

    ERIC Educational Resources Information Center

    Kusmawan, Udan

    2017-01-01

    In this paper, the author proposes that microteaching may be practiced through online media. The core concept of traditional microteaching is that it is a manipulative technique used to facilitate self-reflective and critical thinking processes while teaching. Preliminary research was conducted with elementary teachers who were participating in…

  5. Language and Related Approaches to the Writing Process.

    ERIC Educational Resources Information Center

    Seesholtz, Melvin C.

    Providing instruction in language theory is an innovative technique for use in remedial and other composition courses in the two-year college. Such innovations provide intellectually stimulating material to students who lose interest when confronted with traditional grammar and composition. Students are acquainted with American Edited English in…

  6. Integrating PCR Theory and Bioinformatics into a Research-oriented Primer Design Exercise

    ERIC Educational Resources Information Center

    Robertson, Amber L.; Phillips, Allison R.

    2008-01-01

    Polymerase chain reaction (PCR) is a conceptually difficult technique that embodies many fundamental biological processes. Traditionally, students have struggled to analyze PCR results due to an incomplete understanding of the biological concepts (theory) of DNA replication and strand complementarity. Here we describe the design of a novel…

  7. Highly oriented carbon fiber–polymer composites via additive manufacturing

    DOE PAGES

    Tekinalp, Halil L.; Kunc, Vlastimil; Velez-Garcia, Gregorio M.; ...

    2014-10-16

    Additive manufacturing, diverging from traditional manufacturing techniques such as casting and machining, can handle complex shapes with great design flexibility and without the typical waste. Although this technique has mainly been used for rapid prototyping, interest is growing in using it to directly manufacture actual parts of complex shape. For 3D-printing additive manufacturing to see widespread application, the technique and the feedstock materials require improvements to meet the mechanical requirements of load-bearing components. Thus, we investigated short-fiber (0.2 mm to 0.4 mm) reinforced acrylonitrile-butadiene-styrene composites as a feedstock for 3D printing in terms of their processibility, microstructure, and mechanical performance, and also provided a comparison with traditional compression-molded composites. The tensile strength and modulus of 3D-printed samples increased ~115% and ~700%, respectively. The 3D printer yielded samples with very high fiber orientation in the printing direction (up to 91.5%), whereas the compression molding process yielded samples with significantly less fiber orientation. Microstructure-mechanical property relationships revealed that although relatively high porosity is observed in the 3D-printed composites compared to those produced by the conventional compression molding technique, both exhibited comparable tensile strength and modulus. This phenomenon is explained based on the changes in fiber orientation, dispersion, and void formation.

  8. An Analog Macroscopic Technique for Studying Molecular Hydrodynamic Processes in Dense Gases and Liquids.

    PubMed

    Dahlberg, Jerry; Tkacik, Peter T; Mullany, Brigid; Fleischhauer, Eric; Shahinian, Hossein; Azimi, Farzad; Navare, Jayesh; Owen, Spencer; Bisel, Tucker; Martin, Tony; Sholar, Jodie; Keanini, Russell G

    2017-12-04

    An analog, macroscopic method for studying molecular-scale hydrodynamic processes in dense gases and liquids is described. The technique applies a standard fluid dynamic diagnostic, particle image velocimetry (PIV), to measure: i) velocities of individual particles (grains), extant on short, grain-collision time-scales, ii) velocities of systems of particles, on both short collision-time- and long, continuum-flow-time-scales, iii) collective hydrodynamic modes known to exist in dense molecular fluids, and iv) short- and long-time-scale velocity autocorrelation functions, central to understanding particle-scale dynamics in strongly interacting, dense fluid systems. The basic system is composed of an imaging system, light source, vibrational sensors, vibrational system with a known media, and PIV and analysis software. Required experimental measurements and an outline of the theoretical tools needed when using the analog technique to study molecular-scale hydrodynamic processes are highlighted. The proposed technique provides a relatively straightforward alternative to photonic and neutron beam scattering methods traditionally used in molecular hydrodynamic studies.
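    The core PIV step named above, recovering the most probable displacement of a particle pattern between two frames, can be sketched as a standard FFT cross-correlation. This is a generic illustration of the PIV principle, not the authors' vibrational test-bed software; the function name is hypothetical.

```python
import numpy as np

def piv_displacement(frame_a, frame_b):
    """Most probable (integer-pixel) displacement between two
    interrogation windows via circular FFT cross-correlation."""
    A = np.fft.fft2(frame_a - frame_a.mean())
    B = np.fft.fft2(frame_b - frame_b.mean())
    # corr(k) = sum_x a(x) * b(x + k): peaks at the displacement
    corr = np.real(np.fft.ifft2(np.conj(A) * B))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap indices above N/2 back to negative shifts
    shift = [int(p) if p <= n // 2 else int(p) - n
             for p, n in zip(peak, corr.shape)]
    return tuple(shift)
```

    Applied to a particle image and a copy shifted by a few pixels, the correlation peak lands at the shift, which is exactly how grain velocities are recovered on short collision time-scales.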

  9. Conservation of batik: Conceptual framework of design and process development

    NASA Astrophysics Data System (ADS)

    Syamwil, Rodia

    2018-03-01

    Development of the Conservation Batik concept has become critical owing to the decline of traditional batik, an intangible cultural heritage of humanity. The existence of printed batik, polluting processes, and new design streams are consequences of the batik industry's transformation into a creative industry. Conservation Batik was proposed to answer these threats to traditional batik in the aspects of technique, process, and motif; at the same time, creativity is essential to meet consumer satisfaction. Research and development was conducted, starting with initial research to formulate the concept and an exploration of ideas to develop conservation motif designs. In the development steps, a cyclical process was used to complete motifs with high preference in the aspects of aesthetics, productivity, and efficiency. Data were collected through bibliography, documentation, observation, and interview, and analyzed with qualitative methods. The Conservation Batik concept was adopted from the principles of the Universitas Negeri Semarang (UNNES) vision, together with theoretical analyses and expert judgment. Conservation Batik is assessed from three aspects: design, process, and consumer preferences. Conservation means the effort of safeguarding, promoting, maintaining, and preserving. The Conservation Batik concept can thus be interpreted as batik with: (1) traditional values and authenticity; (2) philosophical meanings; (3) an eco-friendly process with minimum waste; (4) conservation as a resource of design ideas; and (5) the revival of classic motifs.

  10. Assessment of the SRI Gasification Process for Syngas Generation with HTGR Integration -- White Paper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A.M. Gandrik

    2012-04-01

    This white paper is intended to compare the technical and economic feasibility of syngas generation using the SRI gasification process coupled to several high-temperature gas-cooled reactors (HTGRs) with more traditional HTGR-integrated syngas generation techniques, including: (1) Gasification with high-temperature steam electrolysis (HTSE); (2) Steam methane reforming (SMR); and (3) Gasification with SMR with and without CO2 sequestration.

  11. Low-temperature technique for thick film resist stabilization and curing

    NASA Astrophysics Data System (ADS)

    Minter, Jason P.; Wong, Selmer S.; Marlowe, Trey; Ross, Matthew F.; Narcy, Mark E.; Livesay, William R.

    1999-06-01

    For a range of thick film photoresist applications, including MeV ion implant processing, thin film head manufacturing, and microelectromechanical systems processing, there is a need for a low-temperature method for resist stabilization and curing. Traditional methods of stabilizing or curing resist films have relied on thermal cycling, which may not be desirable due to device temperature limitations or thermally-induced distortion of the resist features.

  12. Recycling-oriented characterization of plastic frames and printed circuit boards from mobile phones by electronic and chemical imaging.

    PubMed

    Palmieri, Roberta; Bonifazi, Giuseppe; Serranti, Silvia

    2014-11-01

    This study characterizes the composition of plastic frames and printed circuit boards from end-of-life mobile phones. This knowledge may help define an optimal processing strategy for using these items as potential raw materials. Correct handling of such waste is essential for its further "sustainable" recovery, especially to maximize the extraction of base, rare and precious metals while minimizing the environmental impact of the entire process chain. A combination of electronic and chemical imaging techniques was thus examined, applied and critically evaluated in order to optimize the processing, through the identification and topological assessment of the materials of interest and their quantitative distribution. To reach this goal, end-of-life mobile phone derived wastes were systematically characterized, with reference to frames and printed circuit boards, adopting both "traditional" (e.g. scanning electron microscopy combined with microanalysis, and Raman spectroscopy) and innovative (e.g. hyperspectral imaging in the short-wave infrared range) techniques. Results showed that combining the two approaches (i.e. traditional and innovative) could dramatically improve the setup of recycling strategies as well as final product recovery. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Scaling images using their background ratio. An application in statistical comparisons of images.

    PubMed

    Kalemis, A; Binnie, D; Bailey, D L; Flower, M A; Ott, R J

    2003-06-07

    Comparison of two medical images often requires image scaling as a pre-processing step. This is usually done with the scaling-to-the-mean or scaling-to-the-maximum techniques, which in quantitative applications may, under certain circumstances, contribute a significant amount of bias. In this paper, we present a simple scaling method which assumes only that the most predominant values in the corresponding images belong to their background structure. The ratio of the two images to be compared is calculated and its frequency histogram is plotted. The scaling factor is given by the position of the peak in this histogram that belongs to the background structure. The method was tested against the traditional scaling-to-the-mean technique on simulated planar gamma-camera images, which were compared using pixelwise statistical parametric tests. Both sensitivity and specificity were measured over a range of contrasts and sizes of inhomogeneity for the two scaling techniques. The new method preserved sensitivity in all cases, while the traditional technique resulted in significant degradation of sensitivity in certain cases.
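    The scaling procedure described in the abstract (ratio image, frequency histogram, peak position) can be sketched as follows. The function name and bin count are illustrative, not taken from the paper.

```python
import numpy as np

def background_ratio_scale(img_a, img_b, bins=256):
    """Estimate the scaling factor between two images as the peak of
    the histogram of their pixel-wise ratio, which is assumed to be
    dominated by background pixels."""
    mask = img_b > 0                      # avoid division by zero
    ratio = img_a[mask] / img_b[mask]
    counts, edges = np.histogram(ratio, bins=bins)
    peak = np.argmax(counts)
    # the bin centre at the histogram peak is the scaling factor
    return 0.5 * (edges[peak] + edges[peak + 1])
```

    Because the factor comes from the background peak rather than the whole-image mean, a small high-intensity inhomogeneity (e.g. a lesion) barely shifts the estimate, which is the claimed advantage over scaling-to-the-mean.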

  14. Dietary Assessment on a Mobile Phone Using Image Processing and Pattern Recognition Techniques: Algorithm Design and System Prototyping.

    PubMed

    Probst, Yasmine; Nguyen, Duc Thanh; Tran, Minh Khoi; Li, Wanqing

    2015-07-27

    Dietary assessment, while traditionally based on pen-and-paper, is rapidly moving towards automatic approaches. This study describes an Australian automatic food record method and its prototype for dietary assessment via the use of a mobile phone and techniques of image processing and pattern recognition. Common visual features including scale invariant feature transformation (SIFT), local binary patterns (LBP), and colour are used for describing food images. The popular bag-of-words (BoW) model is employed for recognizing the images taken by a mobile phone for dietary assessment. Technical details are provided together with discussions on the issues and future work.
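    The bag-of-words step mentioned in the abstract, assigning each local descriptor (e.g. a SIFT or LBP vector) to its nearest visual word and histogramming the assignments, can be sketched as below. The codebook is assumed to have been learned offline (typically by k-means), and all names are illustrative rather than taken from the study's prototype.

```python
import numpy as np

def bow_histogram(descriptors, codebook):
    """Quantize local descriptors against a visual-word codebook and
    return the normalized bag-of-words histogram of the image."""
    # squared distances between every descriptor and every codeword
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    words = d2.argmin(axis=1)             # nearest visual word per descriptor
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()              # normalized image representation
```

    The resulting fixed-length histogram is what a classifier would consume to recognize the food item in the photographed meal.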

  15. Review of conventional and novel food processing methods on food allergens.

    PubMed

    Vanga, Sai Kranthi; Singh, Ashutosh; Raghavan, Vijaya

    2017-07-03

    With the turn of this century, novel food processing techniques have become commercially very important because of their profound advantages over traditional methods. These novel processing methods tend to preserve the characteristic properties of food, including its organoleptic and nutritional qualities, better than conventional food processing methods. Over the same period, there has been a clear rise in the populations suffering from food allergies, especially infants and children. Though this rise is widely attributed to changing lifestyles in both developed and developing nations and to the introduction of new food habits with the advent of novel foods and new processing techniques, their exact role is still uncertain. Under these circumstances, it is very important to understand the structural changes in proteins as food is processed, to determine whether a specific processing technique (conventional or novel) increases or mitigates allergenicity. Various modern means are now being employed to understand the conformational changes in proteins that can affect allergenicity. In this review, the effects of processing on protein structure and allergenicity are discussed, along with the implications of recent studies and techniques, to establish a platform for investigating future pathways to reduce or eliminate allergenicity in the population.

  16. Modified glass fibre reinforced polymer composites

    NASA Astrophysics Data System (ADS)

    Cao, Yumei

    A high ratio of strength to density and relatively low cost are some of the significant features of glass fibre reinforced polymer composites (GFRPCs) that have made them one of the most rapidly developed materials in recent years. They are widely used as materials of construction in aerospace, marine and everyday applications, such as airplanes, helicopters, boats, canoes, fishing rods and rackets. Traditionally, researchers have tried to raise the mechanical properties while keeping a high strength/weight ratio using some or all of the following methods: increasing the volume fraction of the fibre, using different polymeric matrix materials, or changing the curing conditions. In recent years, new techniques and processing methods were developed to further improve the mechanical properties of glass fibre (GF) reinforced polymer composites. For example, by modifying the surface condition of the GF, both the interface strength between the GF and the polymer matrix and the shear strength of the final composite can be significantly increased. Also, by prestressing the fibre during the curing process, the tensile, flexural and impact properties of the composite can be greatly improved. In this research project, a new method of preparing GFRPCs, combining several traditional and modern techniques, was developed. This new method includes modification of the GF surface with silica particles, application of different levels of prestress on the GF during the curing process, and changes of the fibre volume fraction and curing conditions in different sets of experiments. The results of the new processing were tested by the three-point bend test, the short beam shear test and the impact test to determine the new set of properties so formed in the composite material. Scanning electron microscopy (SEM) was used to study the fracture surfaces of the new materials after the mechanical tests were performed.
By taking advantage of traditional and modern techniques at the same time, the newly developed modified glass fibre reinforced epoxy matrix composites (MGFRECs) have much improved overall properties. The flexural strength, flexural modulus, shear modulus and impact energy (Izod impact test) of the composites improved by up to 87%, 74%, 30% and 89%, respectively, when modified samples were compared to samples made by traditional methods.

  17. Automatic phase aberration compensation for digital holographic microscopy based on deep learning background detection.

    PubMed

    Nguyen, Thanh; Bui, Vy; Lam, Van; Raub, Christopher B; Chang, Lin-Ching; Nehmetallah, George

    2017-06-26

    We propose a fully automatic technique to obtain aberration-free quantitative phase imaging in digital holographic microscopy (DHM) based on deep learning. Traditional DHM solves the phase aberration compensation problem by manually detecting the background for quantitative measurement. This is a drawback for real-time implementation and for dynamic processes such as cell migration. A recent automatic aberration compensation approach using principal component analysis (PCA) in DHM avoids human intervention regardless of the cells' motion; however, it corrects only spherical/elliptical aberration and disregards higher-order aberrations. Traditional image segmentation techniques can be employed to spatially detect cell locations, and ideally, automatic image segmentation would make real-time measurement possible. However, existing automatic unsupervised segmentation techniques perform poorly on DHM phase images because of aberrations and speckle noise. In this paper, we propose a novel method that combines a supervised deep learning technique, using a convolutional neural network (CNN), with Zernike polynomial fitting (ZPF). The CNN performs automatic background region detection, which allows ZPF to compute the self-conjugated phase that compensates for most aberrations.
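    The compensation step, fitting Zernike polynomials over the detected background and subtracting the fit, reduces to an ordinary least-squares problem once the background mask is known. This sketch uses only the first six Zernike terms in Cartesian form and assumes the mask has already been produced by some segmenter (the paper's CNN, or any other); the function name is illustrative.

```python
import numpy as np

def zernike_compensate(phase, background_mask):
    """Fit low-order Zernike terms to the phase over background pixels
    and subtract the fitted aberration over the full field."""
    h, w = phase.shape
    y, x = np.mgrid[-1:1:h * 1j, -1:1:w * 1j]
    # first six Zernike terms (Cartesian): piston, tilts, defocus,
    # and the two primary astigmatism terms
    basis = np.stack([np.ones_like(x), x, y,
                      2 * (x**2 + y**2) - 1,
                      x**2 - y**2, 2 * x * y], axis=-1)
    A = basis[background_mask]            # (n_background_pixels, 6)
    coeffs, *_ = np.linalg.lstsq(A, phase[background_mask], rcond=None)
    return phase - basis @ coeffs         # aberration-compensated phase
```

    Because the fit uses only background pixels, phase structure from the cells themselves does not bias the aberration estimate.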

  18. Recombinant organisms for production of industrial products

    PubMed Central

    Adrio, Jose-Luis

    2010-01-01

    A revolution in industrial microbiology was sparked by the discovery of the double-stranded structure of DNA and the development of recombinant DNA technology. Traditional industrial microbiology was merged with molecular biology to yield improved recombinant processes for the industrial production of primary and secondary metabolites, protein biopharmaceuticals and industrial enzymes. Novel genetic techniques such as metabolic engineering, combinatorial biosynthesis and molecular breeding, and their modifications, are contributing greatly to the development of improved industrial processes. In addition, functional genomics, proteomics and metabolomics are being exploited for the discovery of valuable novel small molecules for medicine as well as enzymes for catalysis. The sequencing of industrial microbial genomes is being carried out, which bodes well for future process improvement and the discovery of new industrial products. PMID:21326937

  19. Agile waveforms for joint SAR-GMTI processing

    NASA Astrophysics Data System (ADS)

    Jaroszewski, Steven; Corbeil, Allan; McMurray, Stephen; Majumder, Uttam; Bell, Mark R.; Corbeil, Jeffrey; Minardi, Michael

    2016-05-01

    Wideband radar waveforms that employ spread-spectrum techniques were investigated and experimentally tested. The waveforms combine bi-phase coding with a traditional LFM chirp and are applicable to joint SAR-GMTI processing. After de-spreading, the received signals can be processed to support simultaneous GMTI and high-resolution SAR imaging missions by airborne radars. The spread-spectrum coding techniques can provide nearly orthogonal waveforms and offer enhanced operation in some environments by distributing the transmitted energy over a large instantaneous bandwidth, while the LFM component offers the desired Doppler tolerance. In this paper, the waveforms are formulated and a shift-register approach for de-spreading the received signals is described. Hardware loop-back testing has shown the feasibility of using these waveforms in an experimental radar test bed.

  20. Recent advances in nuclear magnetic resonance quantum information processing.

    PubMed

    Criger, Ben; Passante, Gina; Park, Daniel; Laflamme, Raymond

    2012-10-13

    Quantum information processors have the potential to drastically change the way we communicate and process information. Nuclear magnetic resonance (NMR) has been one of the first experimental implementations of quantum information processing (QIP) and continues to be an excellent testbed to develop new QIP techniques. We review the recent progress made in NMR QIP, focusing on decoupling, pulse engineering and indirect nuclear control. These advances have enhanced the capabilities of NMR QIP, and have useful applications in both traditional NMR and other QIP architectures.

  1. Effects of processing adjuvants on traditional Chinese herbs.

    PubMed

    Chen, Lin-Lin; Verpoorte, Robert; Yen, Hung-Rong; Peng, Wen-Huang; Cheng, Yung-Chi; Chao, Jung; Pao, Li-Heng

    2018-04-01

    Processing of Chinese medicines is a pharmaceutical technique that transforms medicinal raw materials into decoction pieces for use in different therapies. Various adjuvants, such as vinegar, wine, honey, and brine, are used in the processing to enhance the efficacy and reduce the toxicity of crude drugs. Proper processing is essential to ensure the quality and safety of traditional Chinese medicines (TCMs). Therefore, sound knowledge of processing principles is crucial to the standardized use of these processing adjuvants and to facilitate the production and clinical use of decoction pieces. Many scientific reports have indicated the synergistic effects of processing mechanisms on the chemistry, pharmacology, and pharmacokinetics of the active ingredients in TCMs. Under certain conditions, adjuvants change the content of active or toxic components in drugs by chemical or physical transformation, increase or decrease drug dissolution, exert their own pharmacological effects, or alter drug pharmacokinetics. This review summarizes various processing methods adopted in the last two decades, and highlights current approaches to identify the effects of processing parameters on TCMs. Copyright © 2018. Published by Elsevier B.V.

  2. Reproducing the old masters: applying colour mixing and painting methodologies to inkjet printing

    NASA Astrophysics Data System (ADS)

    Olen, Melissa; Padfield, Joseph; Parraman, Carinna

    2014-01-01

    This research investigates multi-channel inkjet printing methods that deviate from standard colour management workflows by reflecting on art-historical processes, including the construction of colour in old master works, to reproduce specific colour pigment mixes in print. This is approached by incorporating artists' colour mixing principles from traditional art-making processes through direct n-channel printing and multiple-pass printing. By specifying which ink colourants are employed in print, and by mixing colour through layering, we can mimic the effects of the traditional processes. These printing methods also generate colour through mixtures that may not have been employed or achieved by the printer driver. The objective of this research is to explore colour mixing and layering techniques in inkjet reproductions of original artworks so as to maintain subtle colour transitions in dark shadow regions. While these colours are lost in traditional inkjet reproduction, using direct n-channel editing to reproduce a painted original with high dynamic range can improve colour variation in the shadow regions.

  3. Iodosodalite Waste Forms from Low-Temperature Aqueous Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nam, Junghune; Chong, Saehwa; Riley, Brian J.

    Nuclear energy is one option to meet rising electricity demands, although one concern of this technology is the proper capture and storage of radioisotopes produced during fission processes. One of the more difficult radioisotopes is 129I, due to its volatility and poor solubility in traditional waste forms such as borosilicate glass. Iodosodalite has previously been proposed as a viable candidate to immobilize iodine due to its high iodine loading and good chemical durability. Iodosodalite has traditionally been synthesized using solid-state and hydrothermal techniques, but this paper discusses an aqueous synthesis approach to optimize and maximize the iodosodalite yield. Products were pressed into pellets and fired with glass binders. Chemical durability and iodine retention results are included.

  4. Antioxidant Properties of “Natchez” and “Triple Crown” Blackberries Using Korean Traditional Winemaking Techniques

    PubMed Central

    Maness, Niels; McGlynn, William

    2017-01-01

    This research evaluated blackberries grown in Oklahoma and wines produced using a modified traditional Korean technique employing relatively oxygen-permeable earthenware fermentation vessels. The fermentation variables were temperature (21.6°C versus 26.6°C) and yeast inoculation versus wild fermentation. Wild fermented wines had higher total phenolic concentration than yeast fermented wines. Overall, wines had a relatively high concentration of anthocyanin (85–320 mg L−1 malvidin-3-monoglucoside) and antioxidant capacity (9776–37845 µmol Trolox equivalent g−1). “Natchez” berries had a higher anthocyanin concentration than “Triple Crown” berries. Higher fermentation temperature at the start of the winemaking process followed by the use of lower fermentation/storage temperature for aging wine samples maximized phenolic compound extraction/retention. The Korean winemaking technique used in this study produced blackberry wines that were excellent sources of polyphenolic compounds as well as being high in antioxidant capacity as measured by the Oxygen Radical Absorbance Capacity (ORAC) test. PMID:28713820

  5. Additive manufacturing: Toward holistic design

    DOE PAGES

    Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.; ...

    2017-03-18

    Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Enthused by these capabilities, optimization design tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, while an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.

  7. Microstructures and Mechanical Properties of Co-Cr Dental Alloys Fabricated by Three CAD/CAM-Based Processing Techniques

    PubMed Central

    Kim, Hae Ri; Jang, Seong-Ho; Kim, Young Kyung; Son, Jun Sik; Min, Bong Ki; Kim, Kyo-Han; Kwon, Tae-Yub

    2016-01-01

    The microstructures and mechanical properties of cobalt-chromium (Co-Cr) alloys produced by three CAD/CAM-based processing techniques were investigated in comparison with those produced by the traditional casting technique. Four groups of disc-shaped (microstructures) or dumbbell-shaped (mechanical properties) specimens made of Co-Cr alloys were prepared using casting (CS), milling (ML), selective laser melting (SLM), and milling/post-sintering (ML/PS). For each technique, the corresponding commercial alloy material was used. The microstructures of the specimens were evaluated via X-ray diffractometry, optical and scanning electron microscopy with energy-dispersive X-ray spectroscopy, and electron backscattered diffraction pattern analysis. The mechanical properties were evaluated using a tensile test according to ISO 22674 (n = 6). The microstructure of the alloys was strongly influenced by the manufacturing process. Overall, the SLM group showed superior mechanical properties, with the ML/PS group nearly comparable, while the mechanical properties of the ML group were inferior to those of the CS group. The microstructures and mechanical properties of Co-Cr alloys were greatly dependent on the manufacturing technique as well as the chemical composition. The SLM and ML/PS techniques may be considered promising alternatives to the Co-Cr alloy casting process. PMID:28773718

  8. Electronic speckle pattern interferometry using vortex beams.

    PubMed

    Restrepo, René; Uribe-Patarroyo, Néstor; Belenguer, Tomás

    2011-12-01

    We show that it is possible to perform electronic speckle pattern interferometry (ESPI) using, for the first time to our knowledge, vortex beams as the reference beam. The technique we propose is easy to implement, and the advantages obtained are, among others, environmental stability, lower processing time, and the possibility to switch between traditional ESPI and spiral ESPI. The experimental results clearly show the advantages of using the proposed technique for deformation studies of complex structures. © 2011 Optical Society of America

  9. Visualisation of urban airborne laser scanning data with occlusion images

    NASA Astrophysics Data System (ADS)

    Hinks, Tommy; Carr, Hamish; Gharibi, Hamid; Laefer, Debra F.

    2015-06-01

    Airborne Laser Scanning (ALS) was introduced to provide rapid, high resolution scans of landforms for computational processing. More recently, ALS has been adapted for scanning urban areas. The greater complexity of urban scenes necessitates the development of novel methods to exploit urban ALS to best advantage. This paper presents occlusion images: a novel technique that exploits the geometric complexity of the urban environment to improve visualisation of small details for better feature recognition. The algorithm is based on an inversion of traditional occlusion techniques.

  10. ROLES OF REMOTE SENSING AND CARTOGRAPHY IN THE USGS NATIONAL MAPPING DIVISION.

    USGS Publications Warehouse

    Southard, Rupert B.; Salisbury, John W.

    1983-01-01

    The inseparable roles of remote sensing and photogrammetry have been recognized to be consistent with the aims and interests of the American Society of Photogrammetry. In particular, spatial data storage, data merging and manipulation methods and other techniques originally developed for remote sensing applications also have applications for digital cartography. Also, with the introduction of much improved digital processing techniques, even relatively low resolution (80 m) traditional Landsat images can now be digitally mosaicked into excellent quality 1:250,000-scale image maps.

  11. Applications for Gradient Metal Alloys Fabricated Using Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Hofmann, Douglas C.; Borgonia, John Paul C.; Dillon, Robert P.; Suh, Eric J.; Mulder, Jerry L.; Gardner, Paul B.

    2013-01-01

    Recently, additive manufacturing (AM) techniques have been developed that may shift the paradigm of traditional metal production by allowing complex net-shaped hardware to be built up layer-by-layer, rather than being machined from a billet. The AM process is ubiquitous with polymers due to their low melting temperatures, fast curing, and controllable viscosity, and 3D printers are widely available as commercial or consumer products. 3D printing with metals is inherently more complicated than with polymers due to their higher melting temperatures and reactivity with air, particularly when heated or molten. The process generally requires a high-power laser or other focused heat source, like an electron beam, for precise melting and deposition. Several promising metal AM techniques have been developed, including laser deposition (also called laser engineered net shaping or LENS® and laser deposition technology (LDT)), direct metal laser sintering (DMLS), and electron beam free-form (EBF). These machines typically use powders or wire feedstock that are melted and deposited using a laser or electron beam. Complex net-shape parts have been widely demonstrated using these (and other) AM techniques and the process appears to be a promising alternative to machining in some cases. Rather than simply competing with traditional machining for cost and time savings, the true advantage of AM involves the fabrication of hardware that cannot be produced using other techniques. This could include parts with "blind" features (like foams or trusses), parts that are difficult to machine conventionally, or parts made from materials that do not exist in bulk forms. In this work, the inventors identify that several AM techniques can be used to develop metal parts that change composition from one location in the part to another, allowing for complete control over the mechanical or physical properties. 
This changes the paradigm for conventional metal fabrication, which relies on an assortment of "post-processing" methods to locally alter properties (such as coating, heat treating, work hardening, shot peening, etching, anodizing, among others). Building the final part in an additive process allows for the development of an entirely new class of metals, so-called "functionally graded metals" or "gradient alloys." By carefully blending feedstock materials with different properties in an AM process, hardware can be developed with properties that cannot be obtained using other techniques but with the added benefit of the net-shaped fabrication that AM allows.

  12. Characterization of a Viking Blade Fabricated by Traditional Forging Techniques

    NASA Astrophysics Data System (ADS)

    Vo, H.; Frazer, D.; Bailey, N.; Traylor, R.; Austin, J.; Pringle, J.; Bickel, J.; Connick, R.; Connick, W.; Hosemann, P.

    2016-12-01

    A team of students from the University of California, Berkeley, participated in a blade-smithing competition hosted by the Minerals, Metals, and Materials Society at the TMS 2015 144th annual meeting and exhibition. Motivated by ancient forging methods, the UC Berkeley team chose to fabricate its blade using historical smithing techniques and naturally-occurring deposits of iron ore. This approach earned the "Best Example of a Traditional Blade Process/Ore Smelting Technique" award for the blade, named "Berkelium." First, iron-enriched sand was collected from local beaches. Magnetite (Fe3O4) was then extracted from the sand and smelted into individual high- and low-carbon steel ingots. Layers of high- and low-carbon steel were forge-welded together, predominantly by hand, to form a composite material. Optical microscopy, energy dispersive spectroscopy, and Vickers hardness testing were conducted at different stages throughout the blade-making process to evaluate the microstructure and hardness evolution during formation. The pre-heat-treated blade microstructure was composed of ferrite and pearlite and contained many nonmetallic inclusions. A final heat treatment caused the average hardness of the blade edge to increase by more than a factor of two, indicating a martensitic transformation.

  13. Using the Halstead-Reitan Battery to diagnose brain damage: a comparison of the predictive power of traditional techniques to Rohling's Interpretive Method.

    PubMed

    Rohling, Martin L; Williamson, David J; Miller, L Stephen; Adams, Russell L

    2003-11-01

    The aim of this project was to validate an alternative global measure of neurocognitive impairment (Rohling Interpretive Method, or RIM) that could be generated from data gathered from a flexible battery approach. A critical step in this process is to establish the utility of the technique against current standards in the field. In this paper, we compared results from the Rohling Interpretive Method to those obtained from the General Neuropsychological Deficit Scale (GNDS; Reitan & Wolfson, 1988) and the Halstead-Russell Average Impairment Rating (AIR; Russell, Neuringer & Goldstein, 1970) on a large previously published sample of patients assessed with the Halstead-Reitan Battery (HRB). Findings support the use of the Rohling Interpretive Method in producing summary statistics similar in diagnostic sensitivity and specificity to the traditional HRB indices.

  14. Modeling of additive manufacturing processes for metals: Challenges and opportunities

    DOE PAGES

    Francois, Marianne M.; Sun, Amy; King, Wayne E.; ...

    2017-01-09

    Here, with the technology being developed to manufacture metallic parts using increasingly advanced additive manufacturing processes, a new era has opened up for designing novel structural materials, from designing shapes and complex geometries to controlling the microstructure (alloy composition and morphology). The material properties used within specific structural components are also designable in order to meet specific performance requirements that are not imaginable with traditional metal forming and machining (subtractive) techniques.

  15. Mechanical properties of wood fiber composites under the influence of temperature and humidity

    Treesearch

    Yibin Xue; David Veazie; Cindy Glinsey; Meagan Wright; Roger M. Rowell

    2003-01-01

Woodfiber-thermoplastic composites (WPC) have received considerable attention from the forest products industry for civil engineering applications due to their superior properties over wood and plastics alone. In particular, WPCs can be easily fabricated using traditional plastic processing techniques. The major limitation in the application of WPCs is the poor...

  16. "How Seeing Helps Doing, and Doing Allows to See More": The Process of Imitation in the Dance Class

    ERIC Educational Resources Information Center

    Harbonnier-Topin, Nicole; Barbier, Jean-Marie

    2012-01-01

    Our field research in five contemporary dance technique classes, observing and describing the complexity and diversity involved in the traditional "demonstration-reproduction" pedagogical relationship, has led us to reconsider the role of modeling behaviour in the dance teaching context. We have also outlined recent neuroscientific…

  17. Mobile Formative Assessment Tool Based on Data Mining Techniques for Supporting Web-Based Learning

    ERIC Educational Resources Information Center

    Chen, Chih-Ming; Chen, Ming-Chuan

    2009-01-01

    Current trends clearly indicate that online learning has become an important learning mode. However, no effective assessment mechanism for learning performance yet exists for e-learning systems. Learning performance assessment aims to evaluate what learners learned during the learning process. Traditional summative evaluation only considers final…

  18. Comparison of traditional nondestructive analysis of RERTR fuel plates with digital radiographic techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidsmeier, T.; Koehl, R.; Lanham, R.

    2008-07-15

The current design and fabrication process for RERTR fuel plates uses film radiography for nondestructive testing and characterization. Digital radiographic methods offer potential increases in efficiency and accuracy. The traditional and digital radiographic methods are described and demonstrated on a fuel plate constructed with an average of 51% fuel by volume using the dispersion method. Fuel loading data from each method are analyzed and compared to a third baseline method to assess accuracy. The new digital method is shown to be more accurate, to save hours of work, and to provide additional information not easily available with the traditional method. Additional possible improvements suggested by the new digital method are also raised. (author)

  19. [Construction of biopharmaceutics classification system of Chinese materia medica].

    PubMed

    Liu, Yang; Wei, Li; Dong, Ling; Zhu, Mei-Ling; Tang, Ming-Min; Zhang, Lei

    2014-12-01

Based on the multicomponent character of traditional Chinese medicine and drawing on the concepts, methods and techniques of the biopharmaceutics classification system (BCS) for chemical drugs, this study proposes a scientific framework for a biopharmaceutics classification system of Chinese materia medica (CMMBCS). Using comparison at the multicomponent level and the CMMBCS method for whole traditional Chinese medicines, the study constructs the methodological process while setting forth its academic rationale and underlying theory. The basic role of this system is to reveal the interactions and related absorption mechanisms of the multiple components of traditional Chinese medicine. It also provides new ideas and methods for improving the quality of Chinese materia medica and for new drug research and development.

  20. Distributed cooperating processes in a mobile robot control system

    NASA Technical Reports Server (NTRS)

    Skillman, Thomas L., Jr.

    1988-01-01

A mobile inspection robot has been proposed for the NASA Space Station. It will be a free-flying autonomous vehicle that will leave a berthing unit to accomplish a variety of inspection tasks around the Space Station, and then return to its berth to recharge, refuel, and transfer information. The Flying Eye robot will receive voice commands to change its attitude, move at a constant velocity, and move to a predefined location along a self-generated path. This mobile robot control system requires the integration of traditional command and control techniques with a number of AI technologies. Speech recognition, natural language understanding, task and path planning, sensory abstraction and pattern recognition are all required for successful implementation. The interface between the traditional numeric control techniques and the symbolic processing of the AI technologies must be developed, and a distributed computing approach will be needed to meet the real-time computing requirements. To study the integration of these elements, a novel mobile robot control architecture and simulation based on the blackboard architecture was developed. The control system's operation and structure are discussed.

  1. Microearthquake Studies at the Salton Sea Geothermal Field

    DOE Data Explorer

    Templeton, Dennise

    2013-10-01

    The objective of this project is to detect and locate microearthquakes to aid in the characterization of reservoir fracture networks. Accurate identification and mapping of the large numbers of microearthquakes induced in EGS is one technique that provides diagnostic information when determining the location, orientation and length of underground crack systems for use in reservoir development and management applications. Conventional earthquake location techniques often are employed to locate microearthquakes. However, these techniques require labor-intensive picking of individual seismic phase onsets across a network of sensors. For this project we adapt the Matched Field Processing (MFP) technique to the elastic propagation problem in geothermal reservoirs to identify more and smaller events than traditional methods alone.
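The matched-filter idea underlying event detection can be illustrated in miniature. The single-channel sketch below is an illustrative stand-in, not the project's full multi-sensor Matched Field Processing: it slides a known waveform template along a continuous trace and flags normalized-correlation peaks as candidate events. The function names and synthetic data are hypothetical.

```python
import numpy as np

def detect_events(trace, template, threshold=0.8):
    """Slide a zero-mean, unit-norm template along the trace and report the
    local normalized-correlation maxima that exceed the detection threshold."""
    n = len(template)
    t = template - template.mean()
    t = t / np.linalg.norm(t)
    cc = np.empty(len(trace) - n + 1)
    for i in range(len(cc)):
        w = trace[i:i + n] - trace[i:i + n].mean()
        nw = np.linalg.norm(w)
        cc[i] = (w @ t) / nw if nw > 0 else 0.0
    hits = []
    # suppress side lobes: keep only points that are the maximum of their
    # half-template-length neighborhood
    for i in np.flatnonzero(cc > threshold):
        lo, hi = max(0, i - n // 2), i + n // 2 + 1
        if cc[i] == cc[lo:hi].max():
            hits.append(int(i))
    return hits
```

Full MFP generalizes this idea by matching modeled wavefields across an entire sensor network, which is what allows it to pull smaller events out of the noise than single-channel picking.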

  2. Determination of maize hardness by biospeckle and fuzzy granularity.

    PubMed

    Weber, Christian; Dai Pra, Ana L; Passoni, Lucía I; Rabal, Héctor J; Trivi, Marcelo; Poggio Aguerre, Guillermo J

    2014-09-01

In recent years there has been renewed interest in the development of novel grain classification methods that could complement traditional empirical tests. A speckle pattern occurs when a laser beam illuminates an optically rough surface; when the object is biologically active the pattern flickers, a phenomenon called biospeckle. In this work, we use laser biospeckle to classify maize (Zea mays L.) kernel hardness. Grains of three types of maize were cut and illuminated by a laser, and series of images were registered, stored, and processed. The results were compared with those obtained by the floating test. The laser speckle technique was effective in discriminating grains by the presence of floury or vitreous endosperm and showed higher discrimination capability than the traditional test. It could be useful for maize classification and for increasing the efficiency of dry-milling corn processing.
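One of the simplest biospeckle activity measures is the per-pixel temporal standard deviation of the image stack: regions that flicker more show a larger temporal spread. The sketch below uses that measure as an illustrative stand-in for the paper's fuzzy-granularity descriptor, whose details the abstract does not give; the function names and threshold are hypothetical.

```python
import numpy as np

def activity_map(frames):
    """Per-pixel temporal standard deviation over a biospeckle image stack.

    frames: array of shape (t, h, w). Pixels whose intensity fluctuates
    strongly in time (active speckle) get a high value; frozen speckle
    gets a value near zero.
    """
    return np.asarray(frames, dtype=float).std(axis=0)

def classify_region(frames, threshold):
    """Label a region 'active' if its mean temporal activity exceeds threshold."""
    return "active" if activity_map(frames).mean() > threshold else "inactive"
```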

  3. Balancing Between Aging and Cancer: Molecular Genetics Meets Traditional Chinese Medicine.

    PubMed

    Liu, Jing; Peng, Lei; Huang, Wenhui; Li, Zhiming; Pan, Jun; Sang, Lei; Lu, Siqian; Zhang, Jihong; Li, Wanyi; Luo, Ying

    2017-09-01

The biological consequences of cellular senescence and immortalization in aging and cancer are in conflict. Organisms have developed common cellular signaling pathways and surveillance mechanisms to balance the processes of aging against tumorigenesis. An imbalance in any of the signals involved may result in either premature aging or tumorigenesis and reduce the life span of the organism. In contrast, balancing aging and tumorigenesis at a higher, homeostatic level may grant the organism tumor-free longevity. The focus of this perspective is to review the literature on the balance between "Yin" and "Yang" in traditional Chinese medicine. Modern cellular and molecular techniques now permit a more robust system to screen herbs in traditional Chinese medicine for their activity in balancing aging and tumorigenesis. Understanding the crosstalk between aging and tumorigenesis, together with new perspectives on the application of Chinese medicine, might shed light on anti-aging and tumor-free strategies. J. Cell. Biochem. 118: 2581-2586, 2017. © 2017 Wiley Periodicals, Inc.

  4. Using Dispersed Modes During Model Correlation

    NASA Technical Reports Server (NTRS)

    Stewart, Eric C.; Hathcock, Megan L.

    2017-01-01

    The model correlation process for the modal characteristics of a launch vehicle is well established. After a test, parameters within the nominal model are adjusted to reflect structural dynamics revealed during testing. However, a full model correlation process for a complex structure can take months of man-hours and many computational resources. If the analyst only has weeks, or even days, of time in which to correlate the nominal model to the experimental results, then the traditional correlation process is not suitable. This paper describes using model dispersions to assist the model correlation process and decrease the overall cost of the process. The process creates thousands of model dispersions from the nominal model prior to the test and then compares each of them to the test data. Using mode shape and frequency error metrics, one dispersion is selected as the best match to the test data. This dispersion is further improved by using a commercial model correlation software. In the three examples shown in this paper, this dispersion based model correlation process performs well when compared to models correlated using traditional techniques and saves time in the post-test analysis.
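The abstract's "mode shape and frequency error metrics" are not spelled out; a standard choice for the shape metric in modal correlation is the Modal Assurance Criterion (MAC). Under that assumption, the sketch below scores each dispersed model against the test data by combining mean frequency error with mean (1 - MAC) and picks the best-matching dispersion. The equal weighting and the function names are illustrative, not the paper's.

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two mode-shape vectors (0..1)."""
    num = np.abs(phi_a @ phi_b) ** 2
    den = (phi_a @ phi_a) * (phi_b @ phi_b)
    return num / den

def rank_dispersions(test_freqs, test_shapes, dispersions):
    """Select the dispersed model that best matches the test data.

    dispersions: list of (freqs, shapes) pairs; shapes is (n_dof, n_modes).
    Score = mean relative frequency error + mean (1 - MAC); lower is better.
    Returns the index of the best-scoring dispersion.
    """
    scores = []
    for freqs, shapes in dispersions:
        freq_err = np.mean(np.abs(freqs - test_freqs) / test_freqs)
        mac_vals = [mac(test_shapes[:, i], shapes[:, i])
                    for i in range(shapes.shape[1])]
        scores.append(freq_err + (1.0 - np.mean(mac_vals)))
    return int(np.argmin(scores))
```

Because each dispersion is scored independently, thousands of pre-computed dispersions can be screened cheaply before any fine-tuning in commercial correlation software.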

  5. Additive Manufacturing at LANL: Advanced Characterization to Explore the Science of a New Manufacturing Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carpenter, John S.; Brown, Donald William; Clausen, Bjorn

Additive manufacturing (AM), or three-dimensional (3D) printing as it is more commonly known, is defined as the process of joining materials and creating objects by melting, sintering, or fusing material in a layer-by-layer fashion coordinated via 3D model data. Subtractive, or traditional, manufacturing methodologies often consist of machining/removing material—like a sculptor—or forming material through the application of pressure—like a potter. Conversely, in an AM process, material is added in individual volume elements and built up in a way similar to interlocking building blocks, but with volume elements that are typically the size of a grain of sand. The additive process often involves less waste when compared to subtractive techniques because material is only added when and where it is needed. Adjustments to the final structure are relatively straightforward and can be achieved simply by adjusting the 3D computer model. This makes the technology much more flexible than traditional, subtractive techniques, where new tooling or forming equipment is usually needed to accommodate design changes. AM processes are also beneficial because they permit the fabrication of unique geometries, such as miniaturized metal lattice structures, that cannot be achieved using traditional techniques. An example of a metal lattice structure is the Eiffel Tower, with its geometric, interconnecting struts that reduce the overall weight of the tower while maintaining strength. In AM, the size of the struts can be made smaller than the diameter of a human hair, which further reduces weight while maintaining strength—a combination of properties that can benefit many applications. (Figure 1 of the original article is a schematic showing the required linkages between experimental and modeling thrusts to achieve science-based qualification, with arrows color-coded by funded project.)

  6. Additive Manufacturing at LANL: Advanced Characterization to Explore the Science of a New Manufacturing Method

    DOE PAGES

    Carpenter, John S.; Brown, Donald William; Clausen, Bjorn; ...

    2017-03-01

Additive manufacturing (AM), or three-dimensional (3D) printing as it is more commonly known, is defined as the process of joining materials and creating objects by melting, sintering, or fusing material in a layer-by-layer fashion coordinated via 3D model data. Subtractive, or traditional, manufacturing methodologies often consist of machining/removing material—like a sculptor—or forming material through the application of pressure—like a potter. Conversely, in an AM process, material is added in individual volume elements and built up in a way similar to interlocking building blocks, but with volume elements that are typically the size of a grain of sand. The additive process often involves less waste when compared to subtractive techniques because material is only added when and where it is needed. Adjustments to the final structure are relatively straightforward and can be achieved simply by adjusting the 3D computer model. This makes the technology much more flexible than traditional, subtractive techniques, where new tooling or forming equipment is usually needed to accommodate design changes. AM processes are also beneficial because they permit the fabrication of unique geometries, such as miniaturized metal lattice structures, that cannot be achieved using traditional techniques. An example of a metal lattice structure is the Eiffel Tower, with its geometric, interconnecting struts that reduce the overall weight of the tower while maintaining strength. In AM, the size of the struts can be made smaller than the diameter of a human hair, which further reduces weight while maintaining strength—a combination of properties that can benefit many applications. (Figure 1 of the original article is a schematic showing the required linkages between experimental and modeling thrusts to achieve science-based qualification, with arrows color-coded by funded project.)

  7. A Systematic Approach to Applying Lean Techniques to Optimize an Office Process at the Y-12 National Security Complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Credille, Jennifer; Owens, Elizabeth

This capstone offers the introduction of Lean concepts to an office activity to demonstrate the versatility of Lean. Traditionally, Lean has been associated with process improvements in an industrial setting. However, this paper will demonstrate that implementing Lean concepts within an office activity can result in significant process improvements. Lean first emerged with the conception of the Toyota Production System, an innovative concept designed to improve productivity in the automotive industry by eliminating waste and variation. Lean has also been applied to office environments; however, the limited literature reveals that most Lean applications within an office are restricted to one or two techniques. Our capstone confronts these restrictions by introducing a systematic approach that utilizes multiple Lean concepts. The approach incorporates system analysis, system reliability, system requirements, and system feasibility. The methodical Lean outline provides tools for a successful outcome, which ensures the process is thoroughly dissected and can be achieved for any process in any work environment.

  8. Optimization and Characterization of the Friction Stir Welded Sheets of AA 5754-H111: Monitoring of the Quality of Joints with Thermographic Techniques

    PubMed Central

    De Filippis, Luigi Alberto Ciro; Serio, Livia Maria; Galietti, Umberto

    2017-01-01

Friction Stir Welding (FSW) is a solid-state welding process, based on frictional and stirring phenomena, that offers many advantages over traditional welding methods. However, several parameters can affect the quality of the produced joints. In this work, an experimental approach has been used to study and optimize the FSW process, applied to 5754-H111 aluminum plates. In particular, the thermal behavior of the material during the process has been investigated: two thermal indexes correlated with the frictional power input, the maximum temperature and the heating rate of the material, were investigated for different configurations of the process parameters (tool travel and rotation speeds). Moreover, other techniques (micrographs, macrographs and destructive tensile tests) were carried out to support a quantitative analysis of the quality of the welded joints. The potential of the thermographic technique has been demonstrated both for monitoring the FSW process and for predicting the quality of joints in terms of tensile strength. PMID:29019948

  9. [Computer-assisted image processing for quantifying histopathologic variables in the healing of colonic anastomosis in dogs].

    PubMed

    Novelli, M D; Barreto, E; Matos, D; Saad, S S; Borra, R C

    1997-01-01

The authors present experimental results from the computerized quantification of the tissue structures involved in the repair of colonic anastomoses performed with manual suture and with a biofragmentable ring. The variables quantified were: edema fluid, myofiber tissue, blood vessels and cell nuclei. Image-processing software developed at the Laboratório de Informática Dedicado à Odontologia (LIDO) was used to quantify the pathognomonic alterations of the inflammatory process in colonic anastomoses performed in 14 dogs. As a counterproof, the results were compared with conventional diagnoses made by two pathologists, graded as absent, light, moderate or intense, which were then compared with the computer analysis. There was a statistically significant difference between the two techniques: the biofragmentable-ring technique showed less edema fluid, better-organized myofiber tissue and a higher number of elongated cell nuclei than the manual-suture technique. Analysis of histometric variables through computational image processing was considered efficient and powerful for quantifying the main inflammatory and reparative tissue changes.

  10. Wind Gust Measurement Techniques—From Traditional Anemometry to New Possibilities

    PubMed Central

    2018-01-01

Information on wind gusts is needed for assessment of wind-induced damage and risks to safety. The measurement of wind gust speed requires a high temporal resolution of the anemometer system, because the gust is defined as a short-duration (seconds) maximum of the fluctuating wind speed. Until the digitalization of wind measurements in the 1990s, wind gust measurements suffered from limited recording and data processing resources. Therefore, the majority of continuous wind gust records date back at most 30 years. Although the response characteristics of anemometer systems are good enough today, the traditional measurement techniques at weather stations based on cup and sonic anemometers are limited to heights and regions that supporting structures can reach. Therefore, existing measurements are mainly concentrated over densely-populated land areas, whereas from remote locations, such as the marine Arctic, wind gust information is available only from sparse coastal locations. Recent developments of wind gust measurement techniques based on turbulence measurements from research aircraft and from Doppler lidar can potentially provide new information from heights and locations unreachable by traditional measurement techniques. Moreover, fast-developing measurement methods based on Unmanned Aircraft Systems (UASs) may further improve the coverage of wind gust measurements in the future. In this paper, we provide an overview of the history and the current status of anemometry from the perspective of wind gusts. Furthermore, a discussion on the potential future directions of wind gust measurement techniques is provided. PMID:29690647
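The definition quoted above, a gust as a short-duration maximum of the fluctuating wind speed, can be computed directly from a sampled anemometer series. The sketch below uses a 3-s moving average, the common WMO convention, though the exact window length and function name are assumptions here, not taken from the paper.

```python
import numpy as np

def gust_speed(u, fs, window_s=3.0):
    """Peak gust: maximum of a moving average of the wind-speed series.

    u: 1-D array of wind-speed samples (m/s)
    fs: sampling rate (Hz)
    window_s: averaging window in seconds (3 s is the usual WMO convention)
    """
    n = max(1, int(round(window_s * fs)))
    kernel = np.ones(n) / n                       # boxcar averaging kernel
    smoothed = np.convolve(u, kernel, mode="valid")
    return float(smoothed.max())
```

This is exactly why gust records need high temporal resolution: with fs below ~1 Hz the 3-s window contains too few samples to resolve the peak.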

  11. Intensity-modulated radiation therapy and volumetric-modulated arc therapy for adult craniospinal irradiation—A comparison with traditional techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Studenski, Matthew T., E-mail: matthew.studenski@jeffersonhospital.org; Shen, Xinglei; Yu, Yan

    2013-04-01

Craniospinal irradiation (CSI) poses a challenging planning problem because of the complex target volume. Traditional 3D conformal CSI does not spare any critical organs, resulting in toxicity in patients. Here the dosimetric advantages of intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) are compared with classic conformal planning in adults for both cranial and spine fields to develop a clinically feasible technique that is both effective and efficient. Ten adult patients treated with CSI were retrospectively identified. For the cranial fields, 5-field IMRT and dual 356° VMAT arcs were compared with opposed-lateral 3D conformal radiotherapy (3D-CRT) fields. For the spine fields, traditional posterior-anterior (PA) fields were compared with isocentric 5-field IMRT plans and single 200° VMAT arcs. Two adult patients have been treated using this IMRT technique to date, and extensive quality assurance, especially for the junction regions, was performed. For the cranial fields, the IMRT technique had the highest planned target volume (PTV) maximum and was the least efficient, whereas the VMAT technique provided the greatest parotid sparing with better efficiency. 3D-CRT provided the most efficient delivery but with the highest parotid dose. For the spine fields, VMAT provided the best PTV coverage but had the highest mean dose to all organs at risk (OAR). 3D-CRT had the highest PTV and OAR maximum doses but was the most efficient. IMRT provided the greatest OAR sparing but the longest delivery time. For those patients with unresectable disease who can benefit from a higher, definitive dose, 3D-CRT opposed laterals are the most clinically feasible technique for the cranial fields. For the spine fields, although inefficient, the IMRT technique is the most clinically feasible because of the increased mean OAR dose with the VMAT technique. Quality assurance of the beams, especially in the junction regions, is essential.

  12. Recombinant organisms for production of industrial products.

    PubMed

    Adrio, Jose-Luis; Demain, Arnold L

    2010-01-01

A revolution in industrial microbiology was sparked by the discovery of the double-stranded structure of DNA and the development of recombinant DNA technology. Traditional industrial microbiology was merged with molecular biology to yield improved recombinant processes for the industrial production of primary and secondary metabolites, protein biopharmaceuticals and industrial enzymes. Novel genetic techniques such as metabolic engineering, combinatorial biosynthesis and molecular breeding techniques and their modifications are contributing greatly to the development of improved industrial processes. In addition, functional genomics, proteomics and metabolomics are being exploited for the discovery of novel valuable small molecules for medicine as well as enzymes for catalysis. The sequencing of industrial microbial genomes is being carried out, which bodes well for future process improvement and the discovery of new industrial products. © 2010 Landes Bioscience

  13. Dietary Assessment on a Mobile Phone Using Image Processing and Pattern Recognition Techniques: Algorithm Design and System Prototyping

    PubMed Central

    Probst, Yasmine; Nguyen, Duc Thanh; Tran, Minh Khoi; Li, Wanqing

    2015-01-01

Dietary assessment, while traditionally based on pen-and-paper, is rapidly moving towards automatic approaches. This study describes an Australian automatic food record method and its prototype for dietary assessment via the use of a mobile phone and techniques of image processing and pattern recognition. Common visual features including the scale-invariant feature transform (SIFT), local binary patterns (LBP), and colour are used for describing food images. The popular bag-of-words (BoW) model is employed for recognizing the images taken by a mobile phone for dietary assessment. Technical details are provided together with discussions on the issues and future work. PMID:26225994

  14. An AK-LDMeans algorithm based on image clustering

    NASA Astrophysics Data System (ADS)

    Chen, Huimin; Li, Xingwei; Zhang, Yongbin; Chen, Nan

    2018-03-01

Clustering is an effective analytical technique for handling unlabeled data in value mining; its ultimate goal is to label unclassified data quickly and correctly. We use road maps from current image processing as the experimental background. In this paper, we propose an AK-LDMeans algorithm that automatically locks the K value by designing a K-cost polyline and then uses a long-distance, high-density method to select the clustering centers, replacing the traditional selection of initial cluster centers and thereby improving the efficiency and accuracy of the traditional K-Means algorithm. The experimental results are compared with those of current clustering algorithms. The algorithm can provide an effective reference in the fields of image processing, machine vision and data mining.
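Locking K from the shape of a cost-versus-K polyline is closely related to the classic elbow rule for K-Means. The sketch below is an illustration of that underlying idea, not the authors' AK-LDMeans itself: it substitutes a simple farthest-first initialization for their long-distance high-density center selection and picks K at the sharpest bend (maximum discrete curvature) of the SSE curve. All names are hypothetical.

```python
import numpy as np

def _init_centers(X, k):
    # farthest-first traversal: spreads the initial centers across the data
    centers = [X[0].astype(float)]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[int(np.argmax(d))].astype(float))
    return np.array(centers)

def kmeans_sse(X, k, iters=50):
    """Lloyd's K-Means; returns the total within-cluster sum of squared errors."""
    centers = _init_centers(X, k)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):        # guard against empty clusters
                centers[j] = X[labels == j].mean(axis=0)
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return float((d.min(axis=1) ** 2).sum())

def elbow_k(X, k_max=6):
    """Pick K at the sharpest bend of the SSE-versus-K polyline."""
    sse = np.array([kmeans_sse(X, k) for k in range(1, k_max + 1)])
    curvature = sse[:-2] - 2 * sse[1:-1] + sse[2:]  # defined for K = 2..k_max-1
    return int(np.argmax(curvature)) + 2            # index 0 corresponds to K = 2
```

On data with three well-separated clusters, the SSE drops sharply until K = 3 and then flattens, so the curvature peak lands at K = 3.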

  15. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    PubMed Central

    Huq, M. Saiful; Fraass, Benedick A.; Dunscombe, Peter B.; Gibbons, John P.; Mundt, Arno J.; Mutic, Sasa; Palta, Jatinder R.; Rath, Frank; Thomadsen, Bruce R.; Williamson, Jeffrey F.; Yorke, Ellen D.

    2016-01-01

The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. 
The development and implementation of risk-assessment techniques will make radiation therapy safer and more efficient. PMID:27370140
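TG-100's risk-based design rests on failure mode and effects analysis (FMEA), in which each failure mode in the process map is scored for occurrence (O), severity (S), and lack of detectability (D) on 1-10 scales and ranked by its Risk Priority Number, RPN = O × S × D. The sketch below shows that ranking step; the failure modes and their scores are invented for illustration, not taken from the report.

```python
# Hypothetical failure modes for an IMRT planning step, each scored
# (occurrence O, severity S, detectability D) on TG-100-style 1-10 scales.
failure_modes = [
    ("wrong CT dataset loaded",        2, 9, 4),
    ("contour drawn on wrong organ",   4, 7, 5),
    ("plan exported to wrong machine", 1, 8, 2),
]

def rank_by_rpn(modes):
    """Rank failure modes by Risk Priority Number, RPN = O * S * D.

    Higher RPN -> higher priority for QM resources.
    Returns a list of (name, rpn) sorted from highest to lowest risk.
    """
    scored = [(name, o * s * d) for name, o, s, d in modes]
    return sorted(scored, key=lambda t: t[1], reverse=True)
```

Ranking by RPN is what lets a clinic direct its limited QM effort at the failure modes that matter most, rather than spreading checks uniformly over every device parameter.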

  16. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huq, M. Saiful, E-mail: HUQS@UPMC.EDU

The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. 
The development and implementation of risk-assessment techniques will make radiation therapy safer and more efficient.« less

  17. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management.

    PubMed

    Huq, M Saiful; Fraass, Benedick A; Dunscombe, Peter B; Gibbons, John P; Ibbott, Geoffrey S; Mundt, Arno J; Mutic, Sasa; Palta, Jatinder R; Rath, Frank; Thomadsen, Bruce R; Williamson, Jeffrey F; Yorke, Ellen D

    2016-07-01

The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for "intensity modulated radiation therapy (IMRT)" as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic.
The development and implementation of risk-assessment techniques will make radiation therapy safer and more efficient.
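The FMEA step at the heart of this kind of risk analysis ranks failure modes by a risk priority number, RPN = O × S × D (occurrence, severity, and lack of detectability, each scored on a 1-10 scale), so that QM effort goes to the highest-risk steps first. A minimal sketch, with invented failure modes and scores:

```python
# Illustrative FMEA ranking: each failure mode gets occurrence (O),
# severity (S), and lack-of-detectability (D) scores on a 1-10 scale;
# the risk priority number RPN = O * S * D ranks where QM effort goes.
# The failure modes and scores below are invented for illustration.

def rpn(o, s, d):
    """Risk priority number for one failure mode."""
    for score in (o, s, d):
        if not 1 <= score <= 10:
            raise ValueError("FMEA scores must lie in 1..10")
    return o * s * d

failure_modes = {
    "wrong CT dataset imported":        (2, 9, 7),
    "MLC leaf position miscalibrated":  (4, 6, 3),
    "plan transferred to wrong record": (1, 10, 8),
}

ranked = sorted(failure_modes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: RPN={rpn(*scores)}")
```

Note that a rare but severe, hard-to-detect failure can outrank a frequent but easily caught one, which is exactly the shift away from purely prescriptive device checks.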

  18. Application of continuous substrate feeding to the ABE fermentation: Relief of product inhibition using extraction, perstraction, stripping, and pervaporation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qureshi, N.; Maddox, I.S.; Friedl, A.

    1992-09-01

The technique of continuous substrate feeding has been applied to the batch fermentation process using freely suspended cells, for ABE (acetone-butanol-ethanol) production. To avoid the product inhibition which normally restricts ABE production to less than 20 g/L and sugar utilization to 60 g/L, a product removal technique has been integrated into the fermentation process. The techniques investigated were liquid-liquid extraction, perstraction, gas-stripping, and pervaporation. By using a substrate of whey permeate, the reactor productivity has been improved over that observed in a traditional batch fermentation, while equivalent lactose utilization and ABE production values of 180 g and 69 g, respectively, have been achieved in a 1-L culture volume. 17 refs., 14 figs., 5 tabs.

  19. The Classification of Tongue Colors with Standardized Acquisition and ICC Profile Correction in Traditional Chinese Medicine

    PubMed Central

    Tu, Li-ping; Chen, Jing-bo; Hu, Xiao-juan; Zhang, Zhi-feng

    2016-01-01

Background and Goal. The application of digital image processing techniques and machine learning methods to tongue image classification in Traditional Chinese Medicine (TCM) has been widely studied. However, it is difficult for the outcomes to generalize because of a lack of color reproducibility and image standardization. Our study aims to explore tongue color classification with a standardized tongue image acquisition process and color correction. Methods. Three traditional Chinese medical experts are chosen to identify the selected tongue pictures, taken by the TDA-1 tongue imaging device in TIFF format and corrected with an ICC profile. We then compare the mean L*a*b* values of the different tongue colors and evaluate the performance of machine learning methods for tongue color classification. Results. The L*a*b* values of the five tongue colors are statistically different. The random forest method performs better than SVM in classification. The SMOTE algorithm can increase classification accuracy by addressing the imbalance among the color samples. Conclusions. Given standardized tongue acquisition and color reproduction, preliminary objectification of tongue color classification in Traditional Chinese Medicine (TCM) is feasible. PMID:28050555

  20. An exploration of tutors' experiences of facilitating problem-based learning. Part 1--an educational research methodology combining innovation and philosophical tradition.

    PubMed

    Haith-Cooper, Melanie

    2003-01-01

The use of problem-based learning (PBL) in Health Professional curricula is becoming more widespread. Although the way in which the tutor facilitates PBL can have a major impact on students' learning (Andrews and Jones 1996), the literature provides little consistency as to how the tutor can effectively facilitate PBL (Haith-Cooper 2000). It is therefore important to examine the facilitation role to promote effective learning through the use of PBL. This article is the first of two parts exploring a study that was undertaken to investigate tutors' experiences of facilitating PBL. This part focuses on the methodology and the combining of innovative processes with established philosophical traditions to develop a systematic educational research methodology. The study was undertaken respecting the philosophy of hermeneutic phenomenology but utilised alternative data collection and analysis techniques. Video conferencing and e-mail were used in conjunction with more traditional processes to access a worldwide sample. This paper explores some of the issues that arose when undertaking such a study. The second article then focuses on exploring the findings of the study and their implications for the facilitation of PBL.

  1. The classification and application of toxic Chinese materia medica.

    PubMed

    Liu, Xinmin; Wang, Qiong; Song, Guangqing; Zhang, Guangping; Ye, Zuguang; Williamson, Elizabeth M

    2014-03-01

Many important drugs in the Chinese materia medica (CMM) are known to be toxic, and it has long been recognized in classical Chinese medical theory that toxicity can arise directly from the components of a single CMM or may be induced by an interaction between combined CMM. Traditional Chinese Medicine presents a unique set of pharmaceutical theories that include particular methods for processing, combining and decocting, and these techniques contribute to reducing toxicity as well as enhancing efficacy. The current classification of toxic CMM drugs, the traditional methods for processing toxic CMM, and the prohibited use of certain combinations are based on traditional experience and ancient texts and monographs, but accumulating evidence increasingly supports their use to eliminate or reduce toxicity. Modern methods are now being used to evaluate the safety of CMM; however, a new system for describing the toxicity of Chinese herbal medicines may need to be established to take into account those herbs whose toxicity is delayed or otherwise hidden, and which have not been incorporated into the traditional classification. This review explains the existing classification and justifies it where appropriate, using experimental results often originally published in Chinese and previously not available outside China. Copyright © 2013 John Wiley & Sons, Ltd.

  2. The Classification of Tongue Colors with Standardized Acquisition and ICC Profile Correction in Traditional Chinese Medicine.

    PubMed

    Qi, Zhen; Tu, Li-Ping; Chen, Jing-Bo; Hu, Xiao-Juan; Xu, Jia-Tuo; Zhang, Zhi-Feng

    2016-01-01

Background and Goal. The application of digital image processing techniques and machine learning methods to tongue image classification in Traditional Chinese Medicine (TCM) has been widely studied. However, it is difficult for the outcomes to generalize because of a lack of color reproducibility and image standardization. Our study aims to explore tongue color classification with a standardized tongue image acquisition process and color correction. Methods. Three traditional Chinese medical experts are chosen to identify the selected tongue pictures, taken by the TDA-1 tongue imaging device in TIFF format and corrected with an ICC profile. We then compare the mean L*a*b* values of the different tongue colors and evaluate the performance of machine learning methods for tongue color classification. Results. The L*a*b* values of the five tongue colors are statistically different. The random forest method performs better than SVM in classification. The SMOTE algorithm can increase classification accuracy by addressing the imbalance among the color samples. Conclusions. Given standardized tongue acquisition and color reproduction, preliminary objectification of tongue color classification in Traditional Chinese Medicine (TCM) is feasible.
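The feasibility claim rests on the class mean L*a*b* values being statistically distinct. A minimal illustration of why that matters is a nearest-centroid classifier over class means; this is not the paper's random forest + SMOTE pipeline, and all L*a*b* values below are invented:

```python
import math

# Hypothetical mean L*a*b* values for five tongue color classes
# (invented for illustration; the study only establishes that such
# class means are statistically distinct, which is the property that
# makes any classifier of this kind workable).
CLASS_MEANS = {
    "pale":      (75.0,  8.0, 12.0),
    "light red": (62.0, 18.0, 14.0),
    "red":       (52.0, 28.0, 16.0),
    "dark red":  (42.0, 32.0, 14.0),
    "purple":    (45.0, 20.0,  2.0),
}

def classify(lab):
    """Assign an (L*, a*, b*) triple to the class with the nearest mean."""
    return min(CLASS_MEANS, key=lambda c: math.dist(lab, CLASS_MEANS[c]))

print(classify((53.0, 27.0, 15.5)))  # lands nearest the "red" centroid
```

If the class means were not well separated relative to within-class spread, the nearest-centroid rule (and any more sophisticated classifier) would degrade, which is why the color-corrected, standardized acquisition matters.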

  3. Aconitum in traditional Chinese medicine: a valuable drug or an unpredictable risk?

    PubMed

    Singhuber, Judith; Zhu, Ming; Prinz, Sonja; Kopp, Brigitte

    2009-10-29

Aconitum species have been used in China as an essential drug in Traditional Chinese Medicine (TCM) for 2000 years. Reviewing the clinical application of Aconitum, its pharmacological effects, toxicity and detoxifying measures, herb-herb interactions, clinical taboos, famous herbal formulas, and traditional and current herbal processing methods based upon a wide range of literature investigations serves as a case study to explore the multidisciplinary implications of botanicals used in TCM. The toxicological risk of improper usage of Aconitum remains very high, especially in countries like China, India and Japan. The toxicity of Aconitum mainly derives from the diester diterpene alkaloids (DDAs) including aconitine (AC), mesaconitine (MA) and hypaconitine (HA). They can be decomposed into less or non-toxic derivatives through Chinese traditional processing methods (Paozhi), which play an essential role in detoxification. Using Paozhi, the three main forms of processed aconite -- yanfuzi, heishunpian and baifupian -- can be obtained (CPCommission, 2005). Moreover, some new processing techniques have been developed in China such as pressure-steaming. The current development of fingerprint assays, in particular HPLC, has set a good basis to conduct an appropriate quality control for TCM crude herbs and their ready-made products. Therefore, a stipulation for a maximum level of DDA content of Aconitum is highly desirable in order to guarantee the clinical safety and its low toxicity in decoctions. Newly developed HPLC methods have made the accurate and simultaneous determination and quantification of DDA content interesting.

  4. Quantitative Aspects of Single Molecule Microscopy

    PubMed Central

    Ober, Raimund J.; Tahmasbi, Amir; Ram, Sripad; Lin, Zhiping; Ward, E. Sally

    2015-01-01

    Single molecule microscopy is a relatively new optical microscopy technique that allows the detection of individual molecules such as proteins in a cellular context. This technique has generated significant interest among biologists, biophysicists and biochemists, as it holds the promise to provide novel insights into subcellular processes and structures that otherwise cannot be gained through traditional experimental approaches. Single molecule experiments place stringent demands on experimental and algorithmic tools due to the low signal levels and the presence of significant extraneous noise sources. Consequently, this has necessitated the use of advanced statistical signal and image processing techniques for the design and analysis of single molecule experiments. In this tutorial paper, we provide an overview of single molecule microscopy from early works to current applications and challenges. Specific emphasis will be on the quantitative aspects of this imaging modality, in particular single molecule localization and resolvability, which will be discussed from an information theoretic perspective. We review the stochastic framework for image formation, different types of estimation techniques and expressions for the Fisher information matrix. We also discuss several open problems in the field that demand highly non-trivial signal processing algorithms. PMID:26167102
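The simplest localization estimator this literature starts from is the intensity-weighted centroid of the pixelated spot; the maximum-likelihood fitting and Fisher-information bounds discussed above refine it. A noise-free sketch with an invented Gaussian PSF:

```python
import numpy as np

# Sketch of the simplest single-molecule localization estimator: the
# intensity-weighted centroid of a pixelated Gaussian spot. Numbers are
# illustrative; real data add shot noise and background, which is where
# ML fitting and the Fisher-information analysis become essential.
x0, y0, sigma = 6.3, 4.7, 1.5          # "true" emitter position (pixels)
yy, xx = np.mgrid[0:12, 0:12]          # 12x12 pixel grid
psf = np.exp(-((xx - x0)**2 + (yy - y0)**2) / (2 * sigma**2))

est_x = (psf * xx).sum() / psf.sum()   # intensity-weighted centroid
est_y = (psf * yy).sum() / psf.sum()
print(round(est_x, 2), round(est_y, 2))
```

Even without noise, the estimate carries a small bias from pixelation and truncation at the image edges, one reason the quantitative treatment of resolvability matters.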

  5. Microbial bioinformatics for food safety and production

    PubMed Central

    Alkema, Wynand; Boekhorst, Jos; Wels, Michiel

    2016-01-01

    In the production of fermented foods, microbes play an important role. Optimization of fermentation processes or starter culture production traditionally was a trial-and-error approach inspired by expert knowledge of the fermentation process. Current developments in high-throughput ‘omics’ technologies allow developing more rational approaches to improve fermentation processes both from the food functionality as well as from the food safety perspective. Here, the authors thematically review typical bioinformatics techniques and approaches to improve various aspects of the microbial production of fermented food products and food safety. PMID:26082168

  6. Expansion of transient operating data

    NASA Astrophysics Data System (ADS)

    Chipman, Christopher; Avitabile, Peter

    2012-08-01

Real-time operating data is very important for understanding actual system response. Unfortunately, the number of physical data points typically collected is very small, and interpretation of the data is often difficult. Expansion techniques have been developed using traditional experimental modal data to augment this limited set of data. This expansion process allows for a much improved description of the real-time operating response. This paper presents the results from several different structures to show the robustness of the technique. Comparisons are made to a more complete set of measured data to validate the approach. Both analytical simulations and actual experimental data are used to illustrate the usefulness of the technique.
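The general expansion idea, sketched here in generic SEREP style (not necessarily the authors' exact formulation), projects the few measured DOFs onto the measured partition of the mode shape matrix and reconstructs the full-field response from the full mode shapes:

```python
import numpy as np

# SEREP-style expansion sketch: the measured response x_a at a few DOFs
# is mapped to modal coordinates via the pseudo-inverse of the measured
# partition phi_a, then expanded with the full mode shapes phi_full.
# All matrices here are synthetic, for illustration only.
rng = np.random.default_rng(0)

n_full, n_modes = 20, 3
phi_full = rng.standard_normal((n_full, n_modes))   # full mode shape matrix
meas_dofs = [0, 4, 9, 14, 19]                       # instrumented DOFs
phi_a = phi_full[meas_dofs, :]                      # measured partition

# Synthetic operating response: a combination of the retained modes.
q_true = np.array([1.0, -0.5, 0.25])
x_full_true = phi_full @ q_true
x_meas = x_full_true[meas_dofs]

# Expansion: x_full ~= phi_full @ pinv(phi_a) @ x_meas
x_expanded = phi_full @ np.linalg.pinv(phi_a) @ x_meas
print(np.max(np.abs(x_expanded - x_full_true)))
```

When the retained modes span the operating response, the reconstruction is exact; in practice the quality of the expansion depends on how well the modal basis captures the actual response, which is what the paper's comparisons against fuller measurement sets test.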

  7. A Comparison of Collaborative and Traditional Instruction in Higher Education

    ERIC Educational Resources Information Center

    Gubera, Chip; Aruguete, Mara S.

    2013-01-01

    Although collaborative instructional techniques have become popular in college courses, it is unclear whether collaborative techniques can replace more traditional instructional methods. We examined the efficacy of collaborative courses (in-class, collaborative activities with no lectures) compared to traditional lecture courses (in-class,…

  8. New efforts in eastern cottonwood biomass production through breeding and clonal refinement

    Treesearch

    Jason W. Cromer; Randall J. Rousseau; B. Landis Herrin

    2014-01-01

First generation biofuels (also known as traditional biofuels) primarily use corn to produce ethanol. Newer techniques and knowledge are now allowing ethanol production from renewable resources such as trees that have more complex molecular structures that inhibit access to sugars. Ethanol production is through an enzymatic process which uses cellulose, or pyrolysis...

  9. Enhancing Listening Comprehension through a Group Work Guessing Game

    ERIC Educational Resources Information Center

    Baleghizadeh, Sasan; Arabtabar, Fatemeh

    2010-01-01

    The present paper is an attempt to introduce an innovative technique for a more effective teaching of L2 listening comprehension through a process-oriented approach. Much of what is traditionally known as listening practice is in fact testing material in which students are required to listen to a recording and answer a number of comprehension…

  10. Elevating Virtual Machine Introspection for Fine-Grained Process Monitoring: Techniques and Applications

    ERIC Educational Resources Information Center

    Srinivasan, Deepa

    2013-01-01

    Recent rapid malware growth has exposed the limitations of traditional in-host malware-defense systems and motivated the development of secure virtualization-based solutions. By running vulnerable systems as virtual machines (VMs) and moving security software from inside VMs to the outside, the out-of-VM solutions securely isolate the anti-malware…

  11. Neural net diagnostics for VLSI test

    NASA Technical Reports Server (NTRS)

    Lin, T.; Tseng, H.; Wu, A.; Dogan, N.; Meador, J.

    1990-01-01

    This paper discusses the application of neural network pattern analysis algorithms to the IC fault diagnosis problem. A fault diagnostic is a decision rule combining what is known about an ideal circuit test response with information about how it is distorted by fabrication variations and measurement noise. The rule is used to detect fault existence in fabricated circuits using real test equipment. Traditional statistical techniques may be used to achieve this goal, but they can employ unrealistic a priori assumptions about measurement data. Our approach to this problem employs an adaptive pattern analysis technique based on feedforward neural networks. During training, a feedforward network automatically captures unknown sample distributions. This is important because distributions arising from the nonlinear effects of process variation can be more complex than is typically assumed. A feedforward network is also able to extract measurement features which contribute significantly to making a correct decision. Traditional feature extraction techniques employ matrix manipulations which can be particularly costly for large measurement vectors. In this paper we discuss a software system which we are developing that uses this approach. We also provide a simple example illustrating the use of the technique for fault detection in an operational amplifier.
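A toy version of such an adaptive decision rule is a single-layer logistic "network" trained by gradient descent on simulated good and faulty measurement vectors; the data and parameters below are invented for illustration and stand in for the feedforward networks the paper describes:

```python
import numpy as np

# Toy adaptive fault-detection rule: logistic regression (a one-layer
# network) trained on simulated test measurements from nominal and
# faulty circuits. Distributions and gains are invented for illustration.
rng = np.random.default_rng(1)
good = rng.normal(0.0, 0.3, size=(50, 4))   # nominal measurement vectors
bad = rng.normal(1.0, 0.3, size=(50, 4))    # fault-shifted measurements
X = np.vstack([good, bad])
y = np.array([0] * 50 + [1] * 50)

w, b = np.zeros(4), 0.0
for _ in range(500):                        # plain batch gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid output
    grad = p - y                            # cross-entropy gradient
    w -= 0.1 * (X.T @ grad) / len(y)
    b -= 0.1 * grad.mean()

accuracy = ((1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5) == y).mean()
print(accuracy)
```

The point of the paper's multi-layer networks is that they also capture non-Gaussian, nonlinearly separable distributions arising from process variation, which this linear sketch cannot.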

  12. A procedure to achieve fine control in MW processing of foods

    NASA Astrophysics Data System (ADS)

    Cuccurullo, G.; Cinquanta, L.; Sorrentino, G.

    2007-01-01

A two-dimensional analytical model for predicting the unsteady temperature field in a cylindrical shaped body affected by spatially varying heat generation is presented. The dimensionless problem is solved analytically by using both partial solutions and the variation of parameters techniques. With industrial microwave heating for food pasteurization in mind, the easy-to-handle solution is used to confirm the intrinsic lack of spatial uniformity of such a treatment in comparison to the traditional one. From an experimental point of view, a batch pasteurization treatment was realized to compare the effect of two different control techniques, both based on IR thermography readout: the former provided classical PID control, while the latter was based on a "shadowing" technique, consisting of covering portions of the sample that are hot enough with a mobile metallic screen. A measure of the effectiveness of the two control techniques was obtained by evaluating the thermal death curves of a strain of Lactobacillus plantarum submitted to pasteurization temperatures. Preliminary results showed meaningful increases in the microwave thermal inactivation of L. plantarum and similar significant decreases in thermal inactivation time with respect to the traditional pasteurization thermal treatment.
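For reference, the classical PID loop used as the baseline control can be sketched as follows; the gains and the first-order "plant" are invented for illustration, not taken from the study:

```python
# Minimal discrete PID controller of the kind used for the classical
# baseline control loop. Gains, time step, and the crude first-order
# heating model are all invented for illustration.
def make_pid(kp, ki, kd, dt):
    state = {"integral": 0.0, "prev_err": None}
    def step(setpoint, measured):
        err = setpoint - measured
        state["integral"] += err * dt
        deriv = 0.0 if state["prev_err"] is None else (err - state["prev_err"]) / dt
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv
    return step

# Toy first-order "sample temperature" plant driven toward 72 degC.
pid = make_pid(kp=2.0, ki=0.5, kd=0.1, dt=1.0)
temp = 20.0
for _ in range(200):
    power = pid(72.0, temp)
    temp += 0.05 * (power - 0.1 * (temp - 20.0))  # heating minus ambient loss
print(round(temp, 1))
```

A PID loop like this regulates one scalar readout, which is precisely its limitation here: it cannot flatten a spatially non-uniform microwave field, motivating the spatially selective "shadowing" technique.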

  13. High-resolution, 2- and 3-dimensional imaging of uncut, unembedded tissue biopsy samples.

    PubMed

    Torres, Richard; Vesuna, Sam; Levene, Michael J

    2014-03-01

    Despite continuing advances in tissue processing automation, traditional embedding, cutting, and staining methods limit our ability for rapid, comprehensive visual examination. These limitations are particularly relevant to biopsies for which immediate therapeutic decisions are most necessary, faster feedback to the patient is desired, and preservation of tissue for ancillary studies is most important. The recent development of improved tissue clearing techniques has made it possible to consider use of multiphoton microscopy (MPM) tools in clinical settings, which could address difficulties of established methods. To demonstrate the potential of MPM of cleared tissue for the evaluation of unembedded and uncut pathology samples. Human prostate, liver, breast, and kidney specimens were fixed and dehydrated by using traditional histologic techniques, with or without incorporation of nucleic acid fluorescent stains into dehydration steps. A benzyl alcohol/benzyl benzoate clearing protocol was substituted for xylene. Multiphoton microscopy was performed on a home-built system. Excellent morphologic detail was achievable with MPM at depths greater than 500 μm. Pseudocoloring produced images analogous to hematoxylin-eosin-stained images. Concurrent second-harmonic generation detection allowed mapping of collagen. Subsequent traditional section staining with hematoxylin-eosin did not reveal any detrimental morphologic effects. Sample immunostains on renal tissue showed preservation of normal reactivity. Complete reconstructions of 1-mm cubic samples elucidated 3-dimensional architectural organization. Multiphoton microscopy on cleared, unembedded, uncut biopsy specimens shows potential as a practical clinical tool with significant advantages over traditional histology while maintaining compatibility with gold standard techniques. Further investigation to address remaining implementation barriers is warranted.

  14. Meeting the Challenge: Using Cytological Profiling to Discover Chemical Probes from Traditional Chinese Medicines against Parkinson's Disease.

    PubMed

    Wang, Chao; Yang, Xinzhou; Mellick, George D; Feng, Yunjiang

    2016-12-21

    Parkinson's disease (PD) is an incurable neurodegenerative disorder with a high prevalence rate worldwide. The fact that there are currently no proven disease-modifying treatments for PD underscores the urgency for a more comprehensive understanding of the underlying disease mechanism. Chemical probes have been proven to be powerful tools for studying biological processes. Traditional Chinese medicine (TCM) contains a huge reservoir of bioactive small molecules as potential chemical probes that may hold the key to unlocking the mystery of PD biology. The TCM-sourced chemical approach to PD biology can be advanced through the use of an emerging cytological profiling (CP) technique that allows unbiased characterization of small molecules and their cellular responses. This comprehensive technique, applied to chemical probe identification from TCM and used for studying the molecular mechanisms underlying PD, may inform future therapeutic target selection and provide a new perspective to PD drug discovery.

  15. Rethinking Extinction

    PubMed Central

    Dunsmoor, Joseph E.; Niv, Yael; Daw, Nathaniel; Phelps, Elizabeth A.

    2015-01-01

    Extinction serves as the leading theoretical framework and experimental model to describe how learned behaviors diminish through absence of anticipated reinforcement. In the past decade, extinction has moved beyond the realm of associative learning theory and behavioral experimentation in animals and has become a topic of considerable interest in the neuroscience of learning, memory, and emotion. Here, we review research and theories of extinction, both as a learning process and as a behavioral technique, and consider whether traditional understandings warrant a re-examination. We discuss the neurobiology, cognitive factors, and major computational theories, and revisit the predominant view that extinction results in new learning that interferes with expression of the original memory. Additionally, we reconsider the limitations of extinction as a technique to prevent the relapse of maladaptive behavior, and discuss novel approaches, informed by contemporary theoretical advances, that augment traditional extinction methods to target and potentially alter maladaptive memories. PMID:26447572

  16. Make no mistake—errors can be controlled*

    PubMed Central

    Hinckley, C

    2003-01-01

    

 Traditional quality control methods identify "variation" as the enemy. However, the control of variation by itself can never achieve the remarkably low non-conformance rates of world class quality leaders. Because the control of variation does not achieve the highest levels of quality, an inordinate focus on these techniques obscures key quality improvement opportunities and results in unnecessary pain and suffering for patients, and embarrassment, litigation, and loss of revenue for healthcare providers. Recent experience has shown that mistakes are the most common cause of problems in health care as well as in other industrial environments. Excessive product and process complexity contributes to both excessive variation and unnecessary mistakes. The best methods for controlling variation, mistakes, and complexity are each a form of mistake proofing. Using these mistake proofing techniques, virtually every mistake and non-conformance can be controlled at a fraction of the cost of traditional quality control methods. PMID:14532368

  17. Green-noise halftoning with dot diffusion

    NASA Astrophysics Data System (ADS)

    Lippens, Stefaan; Philips, Wilfried

    2007-02-01

Dot diffusion is a halftoning technique that is based on the traditional error diffusion concept, but offers a high degree of parallel processing through its block-based approach. Traditional dot diffusion, however, suffers from periodicity artifacts. To limit the visibility of these artifacts, we propose grid diffusion, which applies different class matrices to different blocks. Furthermore, in this paper we discuss two approaches in the dot diffusion framework for generating green-noise halftone patterns. The first approach is based on output-dependent feedback (hysteresis), analogous to the standard green-noise error diffusion techniques. We observe that the resulting halftones are rather coarse and highly dependent on the dot diffusion class matrices used. In the second approach we do not limit the diffusion to the nearest neighbors. This leads to less coarse halftones than the first approach, but it can cope only with rather limited cluster sizes. These drawbacks can be reduced by combining the two approaches.
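Both dot diffusion variants build on classic error diffusion. The serial Floyd-Steinberg baseline (not the block-parallel dot diffusion proposed above) thresholds each pixel and pushes the quantization error onto not-yet-processed neighbors:

```python
import numpy as np

# Classic serial Floyd-Steinberg error diffusion: threshold each pixel
# in raster order and distribute the quantization error to unprocessed
# neighbors with the standard 7/16, 3/16, 5/16, 1/16 weights. Dot
# diffusion replaces the raster order with a per-block class matrix
# order to gain parallelism.
def floyd_steinberg(gray):
    img = gray.astype(float).copy()
    h, w = img.shape
    out = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            out[y, x] = 1.0 if img[y, x] >= 0.5 else 0.0
            err = img[y, x] - out[y, x]
            if x + 1 < w:               img[y, x + 1] += err * 7 / 16
            if y + 1 < h and x > 0:     img[y + 1, x - 1] += err * 3 / 16
            if y + 1 < h:               img[y + 1, x] += err * 5 / 16
            if y + 1 < h and x + 1 < w: img[y + 1, x + 1] += err * 1 / 16
    return out

flat = np.full((32, 32), 0.25)   # mid-dark gray test patch
halftone = floyd_steinberg(flat)
print(halftone.mean())           # close to 0.25: mean tone preserved
```

The strict serial dependence of this loop (each pixel needs its predecessors' errors) is exactly what the block-based dot diffusion framework relaxes.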

  18. Lessons from Trees.

    ERIC Educational Resources Information Center

    Elrick, Mike

    2003-01-01

    Traditional techniques and gear are better suited for comfortable extended wilderness trips with high school students than are emerging technologies and techniques based on low-impact camping and petroleum-based clothing, which send students the wrong messages about ecological relatedness and sustainability. Traditional travel techniques and…

  19. Automation in the Teaching of Descriptive Geometry and CAD. High-Level CAD Templates Using Script Languages

    NASA Astrophysics Data System (ADS)

    Moreno, R.; Bazán, A. M.

    2017-10-01

The main purpose of this work is to study improvements to the learning of technical drawing and descriptive geometry, whose exercises are usually solved manually with traditional techniques, by applying automated processes assisted by high-level CAD templates (HLCts). Given that an exercise can be solved with traditional procedures, detailed step by step in technical drawing and descriptive geometry manuals, CAD applications allow us to do the same and later generalize it by incorporating references. Traditional teaching methods have become less prominent and have been relegated in current curricula; however, they can be applied in certain automation processes. The use of geometric references (as variables in script languages) and their incorporation into HLCts allow the automation of drawing processes. Instead of repeatedly creating similar exercises or modifying data in the same exercises, users can employ HLCts to generate future modifications of these exercises. This paper introduces the automation process for generating exercises based on CAD script files, aided by parametric geometry calculation tools. The proposed method allows us to design new exercises without user intervention. The integration of CAD, mathematics, and descriptive geometry facilitates their joint learning. Automation in the generation of exercises not only saves time but also increases the quality of the statements and reduces the possibility of human error.
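The parametrization idea behind HLCts, geometric references held as script variables and instantiated to generate exercise variants, can be sketched with a trivial template; the command names below are invented for illustration, not an actual CAD script dialect:

```python
import string

# Sketch of the HLCt parametrization idea: a script template whose
# geometric references are variables, instantiated to produce exercise
# variants without re-drawing. The "CIRCLE"/"LINE" command syntax is
# invented, standing in for a real CAD script language.
template = string.Template(
    "CIRCLE center=($cx,$cy) radius=$r\n"
    "LINE from=($cx,$cy) to=($cx,$ty)\n"
)

def make_exercise(cx, cy, r):
    """Instantiate one exercise variant; derived values (here the line
    endpoint) are computed from the geometric references."""
    return template.substitute(cx=cx, cy=cy, r=r, ty=cy + r)

print(make_exercise(0, 0, 5))
```

Each new parameter set yields a consistent exercise statement automatically, which is the time-saving and error-reducing effect the paper describes.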

  20. Ultrasonic Processing Technique as a Green Preparation Approach for Diacerein-Loaded Niosomes.

    PubMed

    Khan, Muhammad Imran; Madni, Asadullah; Hirvonen, Jouni; Peltonen, Leena

    2017-07-01

    In this study, the feasibility of ultrasonic processing (UP) technique as green preparation method for production of poorly soluble model drug substance, diacerein, loaded niosomes was demonstrated. Also, the effects of different surfactant systems on niosomes' characteristics were analyzed. Niosomes were prepared using both the green UP technique and traditional thin-film hydration (TFH) technique, which requires the use of environmentally hazardous organic solvents. The studied surfactant systems were Span 20, Pluronic L64, and their mixture (Span 20 and Pluronic L64). Both the production techniques produced well-defined spherical vesicles, but the UP technique produced smaller and more monodisperse niosomes than TFH. The entrapment efficiencies with the UP method were lower than with TFH, but still at a feasible level. All the niosomal formulations released diacerein faster than pure drug, and the drug release rates from the niosomes produced by the UP method were higher than those from the TFH-produced niosomes. With UP technique, the optimum process conditions for small niosomal products with low PDI values and high entrapment efficiencies were obtained when 70% amplitude and 45-min sonication time were used. The overall results demonstrated the potency of UP technique as an alternative fast, cost-effective, and green preparation approach for production of niosomes, which can be utilized as drug carrier systems for poorly soluble drug materials.

  1. Optical coherence tomography for embryonic imaging: a review

    PubMed Central

    Raghunathan, Raksha; Singh, Manmohan; Dickinson, Mary E.; Larin, Kirill V.

    2016-01-01

Embryogenesis is a highly complex and dynamic process, and its visualization is crucial for understanding basic physiological processes during development and for identifying and assessing possible defects, malformations, and diseases. While traditional imaging modalities, such as ultrasound biomicroscopy, micro-magnetic resonance imaging, and micro-computed tomography, have long been adapted for embryonic imaging, these techniques generally have limitations in their speed, spatial resolution, and contrast to capture processes such as cardiodynamics during embryogenesis. Optical coherence tomography (OCT) is a noninvasive imaging modality with micrometer-scale spatial resolution and imaging depth up to a few millimeters in tissue. OCT has bridged the gap between ultrahigh resolution imaging techniques with limited imaging depth like confocal microscopy and modalities, such as ultrasound sonography, which have deeper penetration but poorer spatial resolution. Moreover, the noninvasive nature of OCT has enabled live imaging of embryos without any external contrast agents. We review how OCT has been utilized to study developing embryos and also discuss advances in techniques used in conjunction with OCT to understand embryonic development. PMID:27228503

  2. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  3. Polymer Waveguide Fabrication Techniques

    NASA Astrophysics Data System (ADS)

    Ramey, Delvan A.

    1985-01-01

    The ability of integrated optic systems to compete in signal processing applications with more traditional analog and digital electronic systems is discussed. The Acousto-Optic Spectrum Analyzer is an example which motivated the particular work discussed herein. Provided that real-time processing is more critical than absolute accuracy, such integrated optic systems fulfill a design need. Fan-out waveguide arrays allow crosstalk in system detector arrays to be controlled without directly limiting system resolution. A polyurethane pattern definition process was developed in order to demonstrate fan-out arrays. This novel process is discussed, along with further research needs. Integrated optic system market penetration would be enhanced by the development of commercial processes of this type.

  4. Comparison of DGT with traditional extraction methods for assessing arsenic bioavailability to Brassica chinensis in different soils.

    PubMed

    Dai, Yunchao; Nasir, Mubasher; Zhang, Yulin; Gao, Jiakai; Lv, Yamin; Lv, Jialong

    2018-01-01

    Several predictive models and methods have been used to assess heavy metal bioavailability, but there is no universally accepted approach for evaluating the bioavailability of arsenic (As) in soil. The technique of diffusive gradients in thin-films (DGT) is a promising tool, but there is considerable debate about its suitability. The DGT method was compared with traditional chemical extraction techniques (soil solution, NaHCO3, NH4Cl, HCl, and total As) for estimating As bioavailability in soil, based on a greenhouse experiment using Brassica chinensis grown in soils from 15 provinces in China. In addition, we assessed whether these methods are independent of soil properties. The correlations between plant and soil As concentrations measured with traditional extraction techniques were pH- and iron oxide (Feox)-dependent, indicating that these methods are influenced by soil properties. In contrast, DGT measurements were independent of soil properties and also showed a better correlation coefficient than the traditional techniques. Thus, the DGT technique is superior to traditional techniques and should be preferred for evaluating As bioavailability in different types of soil. Copyright © 2017 Elsevier Ltd. All rights reserved.
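
    Method comparisons like the one above typically rest on correlation coefficients between a soil test and plant uptake. A minimal sketch of such a comparison, using a hand-rolled Pearson r and purely hypothetical concentration values (not data from the study):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical soil-test vs. plant-tissue As concentrations (mg/kg);
# a method whose r stays high across soils is the better bioavailability proxy.
dgt_as   = [0.12, 0.30, 0.45, 0.80, 1.10]
plant_as = [0.05, 0.14, 0.20, 0.38, 0.52]
r = pearson_r(dgt_as, plant_as)
```

    In the study itself the key extra step is checking whether r degrades when soils of different pH and Feox are pooled, which is what distinguishes DGT from the extraction methods.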

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Song

    CFD (Computational Fluid Dynamics) is a widely used technique in the engineering design field. It uses mathematical methods to simulate and predict flow characteristics in a given physical space. Since the numerical results of CFD computations are very hard to interpret, VR (virtual reality) and data visualization techniques are introduced into CFD post-processing to improve the understandability and functionality of CFD computation. In many cases CFD datasets are very large (multi-gigabyte), and more and more interaction between the user and the datasets is required. For traditional VR applications, limited computing power is a major obstacle to visualizing large datasets effectively. This thesis presents a new system design that speeds up the traditional VR application by using parallel and distributed computing, along with the idea of using a hand-held device to enhance the interaction between the user and the VR CFD application. Techniques from different research areas, including scientific visualization, parallel computing, distributed computing, and graphical user interface design, are used in the development of the final system. As a result, the new system can be flexibly built on a heterogeneous computing environment and dramatically shortens the computation time.
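
    The speed-up strategy described, splitting a large dataset across workers and processing chunks concurrently, can be sketched with Python's standard concurrent.futures module. The chunking scheme and the per-chunk "post-processing" function here are illustrative placeholders, not the thesis's actual pipeline:

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    """Stand-in for per-chunk CFD post-processing (e.g., deriving a scalar field)."""
    return [v * 0.5 for v in chunk]  # placeholder transformation

def parallel_postprocess(dataset, n_workers=4):
    """Split the dataset into roughly n_workers chunks, process them concurrently,
    and reassemble the results in order."""
    size = max(1, len(dataset) // n_workers)
    chunks = [dataset[i:i + size] for i in range(0, len(dataset), size)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(process_chunk, chunks)  # map preserves chunk order
    return [v for chunk in results for v in chunk]

field = parallel_postprocess(list(range(8)), n_workers=2)
```

    For CPU-bound numerical work a process pool (or a distributed framework across machines, as in the thesis) would replace the thread pool; the chunk-and-merge structure is the same.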

  6. Modelling and multi-objective optimization of WEDM of commercial Monel super alloy using evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Varun, Sajja; Reddy, Kalakada Bhargav Bal; Vardhan Reddy, R. R. Vishnu

    2016-09-01

    In this research work, a multi-response optimization technique has been developed using traditional desirability analysis and the non-traditional particle swarm optimization technique (for different customers' priorities) in wire electrical discharge machining (WEDM). Monel 400 was selected as the work material for experimentation. The effects of key process parameters such as pulse on time (TON), pulse off time (TOFF), peak current (IP), and wire feed (WF) on material removal rate (MRR) and surface roughness (SR) in the WEDM operation were investigated. Further, the responses MRR and SR were modelled empirically through regression analysis. The developed models can be used by machinists to predict MRR and SR over a wide range of input parameters. The optimization of multiple responses has been carried out to satisfy the priorities of multiple users by using the Taguchi-desirability function method and the particle swarm optimization technique. Analysis of variance (ANOVA) is also applied to investigate the effect of the influential parameters. Finally, confirmation experiments were conducted for the optimal set of machining parameters, and the improvement was verified.
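
    The desirability approach used above combines responses such as MRR (larger-the-better) and SR (smaller-the-better) into one score via Derringer-Suich-style transforms and a geometric mean. A minimal sketch with hypothetical response bounds, not the paper's fitted regression models:

```python
def d_larger(y, lo, hi, w=1.0):
    """Desirability for a larger-the-better response (e.g., MRR)."""
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return ((y - lo) / (hi - lo)) ** w

def d_smaller(y, lo, hi, w=1.0):
    """Desirability for a smaller-the-better response (e.g., SR)."""
    if y <= lo:
        return 1.0
    if y >= hi:
        return 0.0
    return ((hi - y) / (hi - lo)) ** w

def composite(ds):
    """Overall desirability: geometric mean of the individual desirabilities."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Hypothetical bounds: MRR in [5, 25] mm^3/min, SR in [1, 5] um.
D = composite([d_larger(18.0, 5, 25), d_smaller(2.2, 1, 5)])
```

    Different customer priorities are expressed by the weights w (or by weighting the geometric mean), which is the knob the paper varies between users.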

  7. Measurement and modeling of the acoustic field near an underwater vehicle and implications for acoustic source localization.

    PubMed

    Lepper, Paul A; D'Spain, Gerald L

    2007-08-01

    The performance of traditional techniques of passive localization in ocean acoustics such as time-of-arrival (phase differences) and amplitude ratios measured by multiple receivers may be degraded when the receivers are placed on an underwater vehicle due to effects of scattering. However, knowledge of the interference pattern caused by scattering provides a potential enhancement to traditional source localization techniques. Results based on a study using data from a multi-element receiving array mounted on the inner shroud of an autonomous underwater vehicle show that scattering causes the localization ambiguities (side lobes) to decrease in overall level and to move closer to the true source location, thereby improving localization performance, for signals in the frequency band 2-8 kHz. These measurements are compared with numerical modeling results from a two-dimensional time domain finite difference scheme for scattering from two fluid-loaded cylindrical shells. Measured and numerically modeled results are presented for multiple source aspect angles and frequencies. Matched field processing techniques quantify the source localization capabilities for both measurements and numerical modeling output.

  8. Post Processing Methods used to Improve Surface Finish of Products which are Manufactured by Additive Manufacturing Technologies: A Review

    NASA Astrophysics Data System (ADS)

    Kumbhar, N. N.; Mulay, A. V.

    2016-08-01

    Additive Manufacturing (AM) processes open the possibility of going directly from Computer-Aided Design (CAD) to a physical prototype. These prototypes are used as test models before a design is finalized, and sometimes even as final products. Additive manufacturing has many advantages over traditional product development processes, such as allowing early customer involvement, generating complex shapes, and saving both time and money. Additive manufacturing also poses some special challenges that are usually worth overcoming, such as poor surface quality, limited physical properties, and the need for specific raw materials. To improve surface quality, several attempts have been made by controlling various process parameters of additive manufacturing and also by applying different post-processing techniques to components manufactured by additive manufacturing. The main objective of this work is to document an extensive literature review in the general area of post-processing techniques used in additive manufacturing.

  9. Sterilization by oxygen plasma

    NASA Astrophysics Data System (ADS)

    Moreira, Adir José; Mansano, Ronaldo Domingues; Andreoli Pinto, Terezinha de Jesus; Ruas, Ronaldo; Zambon, Luis da Silva; da Silva, Mônica Valero; Verdonck, Patrick Bernard

    2004-07-01

    The use of polymeric medical devices has stimulated the development of new sterilization methods. The traditional techniques rely on ethylene oxide, but there are many questions concerning the carcinogenic properties of the ethylene oxide residues adsorbed on materials after processing. Another common technique is gamma irradiation, but it is costly, its safe operation requires an isolated site, and it also affects the bulk properties of polymers. The use of a gas plasma is an elegant alternative sterilization technique. The plasma promotes efficient inactivation of micro-organisms, minimises damage to the materials, and presents very little danger to personnel and the environment. Pure oxygen reactive-ion-etching plasmas were applied to inactivate a biological indicator, Bacillus stearothermophilus, to confirm the efficiency of this process. The sterilization processes took little time: mortality was complete within a few minutes. In situ analysis of the micro-organisms' inactivation time was possible using emission spectrophotometry. The increase in the intensity of the 777.5 nm oxygen line marks the end of the oxidation of the biological material. The results were also observed and corroborated by scanning electron microscopy.

  10. Microstructure and Magnetic Properties of Magnetic Material Fabricated by Selective Laser Melting

    NASA Astrophysics Data System (ADS)

    Jhong, Kai Jyun; Huang, Wei-Chin; Lee, Wen Hsi

    Selective Laser Melting (SLM) is a powder-based additive manufacturing process capable of producing parts layer by layer from a 3D CAD model. The aim of this study is to adopt the selective laser melting technique for magnetic material fabrication. For the SLM process to be practical in industrial use, highly specific mechanical properties of the final product must be achieved. The integrity of the manufactured components depends strongly on each single laser-melted track and every single layer, as well as on the strength of the connections between them. In this study, the effects of processing parameters, such as the spacing distance, on surface morphology are analyzed. Our hypothesis is that when a magnetic product is made by the selective laser melting technique instead of traditional techniques, the finished component will have more precise and effective properties. This study analyzed the magnitudes of the magnetic properties for different parameters in the SLM process and compiled a completed product to investigate its efficiency in contrast with products made with existing manufacturing processes.

  11. Distillation Designs for the Lunar Surface

    NASA Technical Reports Server (NTRS)

    Boul, Peter J.; Lange,Kevin E.; Conger, Bruce; Anderson, Molly

    2010-01-01

    Gravity-based distillation methods may be applied to the purification of wastewater at a lunar base. These solutions to water processing are robust physical separation techniques, which may be more advantageous than many other techniques for their simplicity in design and operation, and they can be used in conjunction with each other to obtain high-purity water. The components and feed compositions for modeling wastewater streams are presented in conjunction with the Aspen property system for traditional stage distillation. While the individual components of each waste stream will vary naturally within certain bounds, an analog model for wastewater processing is suggested based on typical concentration ranges for these components. Target purity levels for recycled water are determined for each individual component based on NASA's required maximum contaminant levels for potable water. Optimum parameters such as reflux ratio, feed stage location, and processing rates are determined with respect to the power consumption of the process. Multistage distillation is evaluated to determine the minimum number of stages necessary for each of 65 components in mixed humidity condensate and urine wastewater streams.
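
    For a first estimate of the minimum stage count mentioned above, the Fenske shortcut equation is the standard textbook tool. This sketch assumes a single binary water/contaminant split with a hypothetical constant relative volatility, not the 65-component Aspen model of the study:

```python
import math

def fenske_min_stages(x_dist, x_bott, alpha):
    """Fenske equation: minimum theoretical stages at total reflux for a
    binary split with constant relative volatility alpha.
    x_dist, x_bott: light-key mole fractions in distillate and bottoms."""
    sep = (x_dist / (1 - x_dist)) * ((1 - x_bott) / x_bott)
    return math.log(sep) / math.log(alpha)

# Hypothetical spec: 99.9% purity overhead, 1% light key left in the bottoms,
# relative volatility alpha = 3.
n_min = fenske_min_stages(0.999, 0.01, 3.0)
```

    The actual stage count at a finite reflux ratio is then traded off against power consumption, which is the optimization the paper performs per component.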

  12. Fast algorithm for spectral processing with application to on-line welding quality assurance

    NASA Astrophysics Data System (ADS)

    Mirapeix, J.; Cobo, A.; Jaúregui, C.; López-Higuera, J. M.

    2006-10-01

    A new technique is presented in this paper for the analysis of welding process emission spectra to accurately estimate the plasma electron temperature in real time. The estimation of the electron temperature of the plasma, through the analysis of the emission lines from multiple atomic species, may be used to monitor possible perturbations during the welding process. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the recursive Levenberg-Marquardt method, sub-pixel algorithms are used to more accurately estimate the central wavelength of the peaks. Three different sub-pixel algorithms are analysed and compared, and it is shown that the LPO (linear phase operator) sub-pixel algorithm is the better solution within the proposed system. Experimental tests during TIG welding, using a fibre optic to capture the arc light together with a low-cost CCD-based spectrometer, show that some typical defects associated with perturbations in the electron temperature can be easily detected and identified with this technique. A typical processing time for multiple-peak analysis is less than 20 ms running on a conventional PC.
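
    The idea of locating a spectral line's centre to better than one CCD pixel can be illustrated with the common three-point parabolic estimator. This is not the paper's LPO operator, just the simplest member of the sub-pixel family it compares:

```python
def subpixel_peak(y, i):
    """Refine an integer peak index i to sub-pixel accuracy by fitting a
    parabola through the samples at (i-1, i, i+1); returns a fractional index."""
    a, b, c = y[i - 1], y[i], y[i + 1]
    denom = a - 2 * b + c
    if denom == 0:          # flat top: no refinement possible
        return float(i)
    return i + 0.5 * (a - c) / denom

# Samples of a parabola peaking at x = 3.3; the estimator recovers the vertex.
samples = [-(x - 3.3) ** 2 for x in range(7)]
i_max = max(range(7), key=lambda k: samples[k])
peak = subpixel_peak(samples, i_max)
```

    For real emission lines the profile is a Voigt rather than a parabola, so the estimate is approximate near the apex; the attraction is that it costs three samples instead of a recursive fit.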

  13. Temporomandibular joint arthroscopy technique using a single working cannula.

    PubMed

    Srouji, S; Oren, D; Zoabi, A; Ronen, O; Zraik, H

    2016-11-01

    The traditional arthroscopy technique includes the creation of three ports in order to enable visualization, operation, and arthrocentesis. The aim of this study was to assess an advanced temporomandibular joint (TMJ) arthroscopy technique that requires only a single cannula, through which a one-piece instrument containing a visualization canal, irrigation canal, and a working canal is inserted, as an alternative to the traditional double-puncture technique. This retrospective study assessed eight patients (13 TMJs) with pain and/or limited range of movement that was refractory to conservative therapy, who were treated between June 2015 and December 2015. The temporomandibular joint disorder (TMD) was diagnosed by physical examination and mouth opening measurements. The duration of surgery was recorded and compared to that documented for traditional arthroscopies performed by the same surgeon. Operative single-cannula arthroscopy (OSCA) was performed using a holmium YAG (Ho:YAG) 230 μm fibre laser for ablation. The OSCA technique proved effective in improving mouth opening in all patients (mean increase 9.12 ± 1.96 mm) and in reducing pain (mean visual analogue scale decrease of 3.25 ± 1.28). The operation time was approximately half that of the traditional technique. The OSCA technique is as efficient as the traditional technique, is simple to learn, and is simpler to execute. Copyright © 2016 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  14. A bio-behavioral model of addiction treatment: applying dual representation theory to craving management and relapse prevention.

    PubMed

    Matto, Holly

    2005-01-01

    A bio-behavioral approach to drug addiction treatment is outlined. The presented treatment model uses dual representation theory as a guiding framework for understanding the bio-behavioral processes activated during the application of expressive therapeutic methods. Specifically, the treatment model explains how visual processing techniques can supplement traditional relapse prevention therapy protocols, to help clients better manage cravings and control triggers in hard-to-treat populations such as chronic substance-dependent persons.

  15. Knowledge-based simulation for aerospace systems

    NASA Technical Reports Server (NTRS)

    Will, Ralph W.; Sliwa, Nancy E.; Harrison, F. Wallace, Jr.

    1988-01-01

    Knowledge-based techniques, which offer many features that are desirable in the simulation and development of aerospace vehicle operations, exhibit many similarities to traditional simulation packages. The eventual solution of these systems' current symbolic processing/numeric processing interface problem will lead to continuous and discrete-event simulation capabilities in a single language, such as TS-PROLOG. Qualitative, totally symbolic simulation methods are noted to possess several intrinsic characteristics that are especially revelatory of the system being simulated, and capable of ensuring that all possible behaviors are considered.

  16. Process monitoring and visualization solutions for hot-melt extrusion: a review.

    PubMed

    Saerens, Lien; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2014-02-01

    Hot-melt extrusion (HME) is applied as a continuous pharmaceutical manufacturing process for the production of a variety of dosage forms and formulations. To ensure the continuity of this process, the quality of the extrudates must be assessed continuously during manufacturing. The objective of this review is to provide an overview and evaluation of the available process analytical techniques which can be applied in hot-melt extrusion. Pharmaceutical extruders are equipped with traditional (univariate) process monitoring tools, observing barrel and die temperatures, throughput, screw speed, torque, drive amperage, melt pressure and melt temperature. The relevance of several spectroscopic process analytical techniques for monitoring and control of pharmaceutical HME has been explored recently. Nevertheless, many other sensors visualizing HME and measuring diverse critical product and process parameters with potential use in pharmaceutical extrusion are available, and have been thoroughly studied in polymer extrusion. The implementation of process analytical tools in HME serves two purposes: (1) improving process understanding by monitoring and visualizing the material behaviour and (2) monitoring and analysing critical product and process parameters for process control, making it possible to maintain a desired process state and guarantee the quality of the end product. This review is the first to provide an evaluation of the process analytical tools applied for pharmaceutical HME monitoring and control, and discusses techniques that have been used in polymer extrusion having potential for monitoring and control of pharmaceutical HME. © 2013 Royal Pharmaceutical Society.
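
    The univariate monitoring described above (melt temperature, pressure, torque) is often reduced in practice to control-chart logic: establish limits from an in-control run, then flag excursions. A minimal Shewhart-style sketch with hypothetical readings, not any specific extruder's parameters:

```python
def control_limits(baseline):
    """3-sigma control limits estimated from an in-control baseline run."""
    n = len(baseline)
    mean = sum(baseline) / n
    var = sum((v - mean) ** 2 for v in baseline) / (n - 1)  # sample variance
    sigma = var ** 0.5
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(readings, lo, hi):
    """Indices of readings that fall outside the control limits."""
    return [i for i, v in enumerate(readings) if not lo <= v <= hi]

# Hypothetical melt-temperature baseline (deg C) and a run with one excursion.
lo, hi = control_limits([150.1, 149.8, 150.3, 150.0, 149.9, 150.2])
alarms = out_of_control([150.1, 150.4, 153.9, 150.0], lo, hi)
```

    The multivariate spectroscopic tools the review surveys generalize this idea to whole spectra (e.g., via PCA-based charts) rather than single sensor channels.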

  17. In vivo analysis of insertional torque during pedicle screwing using cortical bone trajectory technique.

    PubMed

    Matsukawa, Keitaro; Yato, Yoshiyuki; Kato, Takashi; Imabayashi, Hideaki; Asazuma, Takashi; Nemoto, Koichi

    2014-02-15

    The insertional torque of pedicle screws using the cortical bone trajectory (CBT) was measured in vivo, to investigate the effectiveness of the CBT technique. The CBT follows a mediolateral and caudocephalad directed path, engaging with cortical bone maximally from the pedicle to the vertebral body. Some biomechanical studies have demonstrated favorable characteristics of the CBT technique in the cadaveric lumbar spine, but no in vivo study has been reported on the mechanical behavior of this new trajectory. The insertional torque of pedicle screws using the CBT and traditional techniques was measured intraoperatively in 48 consecutive patients. A total of 162 screws using the CBT technique and 36 screws using the traditional technique were compared. In 8 of the 48 patients, a side-by-side comparison of the 2 insertional techniques for each vertebra was performed; these patients formed the H group. In addition, the insertional torque was correlated with bone mineral density. The mean maximum insertional torque of the CBT screws and the traditional screws was 2.49 ± 0.99 Nm and 1.24 ± 0.54 Nm, respectively. The CBT screws showed 2.01 times higher torque, and the difference between the 2 techniques was significant (P < 0.01). In the H group, the insertional torque was 2.71 ± 1.36 Nm for the CBT screws and 1.58 ± 0.44 Nm for the traditional screws; the CBT screws demonstrated 1.71 times higher torque, and statistical significance was achieved (P < 0.01). Positive linear correlations between maximum insertional torque and bone mineral density were found for both techniques; the correlation coefficient of the traditional screws (r = 0.63, P < 0.01) was higher than that of the CBT screws (r = 0.59, P < 0.01). The insertional torque using the CBT technique is about 1.7 times higher than with the traditional technique.

  18. The Effect of High Pressure Techniques on the Stability of Anthocyanins in Fruit and Vegetables

    PubMed Central

    Marszałek, Krystian; Woźniak, Łukasz; Kruszewski, Bartosz; Skąpska, Sylwia

    2017-01-01

    Anthocyanins are a group of phenolic compounds responsible for red, blue and violet colouration of many fruits, vegetables and flowers. The high content of these pigments is important as it influences directly their health promoting properties as well as the sensory quality of the product; however they are prone to degradation by, inter alia, elevated temperature and tissue enzymes. The traditional thermal methods of food preservation cause significant losses of these pigments. Thus, novel non-thermal techniques such as high pressure processing, high pressure carbon dioxide and high pressure homogenization are under consideration. In this review, the authors attempted to summarize the current knowledge of the impact of high pressure techniques on the stability of anthocyanins during processing and storage of fruit and vegetable products. Furthermore, the effect of the activity of enzymes involved in the degradation of these compounds has been described. The conclusions including comparisons of pressure-based methods with high temperature preservation techniques were presented. PMID:28134807

  19. The Effect of High Pressure Techniques on the Stability of Anthocyanins in Fruit and Vegetables.

    PubMed

    Marszałek, Krystian; Woźniak, Łukasz; Kruszewski, Bartosz; Skąpska, Sylwia

    2017-01-27

    Anthocyanins are a group of phenolic compounds responsible for red, blue and violet colouration of many fruits, vegetables and flowers. The high content of these pigments is important as it influences directly their health promoting properties as well as the sensory quality of the product; however they are prone to degradation by, inter alia, elevated temperature and tissue enzymes. The traditional thermal methods of food preservation cause significant losses of these pigments. Thus, novel non-thermal techniques such as high pressure processing, high pressure carbon dioxide and high pressure homogenization are under consideration. In this review, the authors attempted to summarize the current knowledge of the impact of high pressure techniques on the stability of anthocyanins during processing and storage of fruit and vegetable products. Furthermore, the effect of the activity of enzymes involved in the degradation of these compounds has been described. The conclusions including comparisons of pressure-based methods with high temperature preservation techniques were presented.

  20. Intraoral Digital Impressioning for Dental Implant Restorations Versus Traditional Implant Impression Techniques.

    PubMed

    Wilk, Brian L

    2015-01-01

    Over the course of the past two to three decades, intraoral digital impression systems have gained acceptance due to high accuracy and ease of use as they have been incorporated into the fabrication of dental implant restorations. The use of intraoral digital impressions enables the clinician to produce accurate restorations without the unpleasant aspects of traditional impression materials and techniques. This article discusses the various types of digital impression systems and their accuracy compared to traditional impression techniques. The cost, time, and patient satisfaction components of both techniques will also be reviewed.

  1. Wavelet Filter Banks for Super-Resolution SAR Imaging

    NASA Technical Reports Server (NTRS)

    Sheybani, Ehsan O.; Deshpande, Manohar; Memarsadeghi, Nargess

    2011-01-01

    This paper discusses innovative wavelet-based filter banks designed to enhance the analysis of super-resolution Synthetic Aperture Radar (SAR) images using parametric spectral methods and signal classification algorithms. SAR finds applications in many of NASA's earth science fields such as deformation, ecosystem structure, dynamics of ice, snow and cold land processes, and surface water and ocean topography. Traditionally, standard methods such as the Fast Fourier Transform (FFT) and Inverse Fast Fourier Transform (IFFT) have been used to extract images from SAR radar data. Due to the non-parametric nature of these methods, their resolution limitations, and their observation-time dependence, the use of spectral estimation together with wavelet-based signal pre- and post-processing techniques to process SAR radar data has been proposed. Multi-resolution wavelet transforms and advanced spectral estimation techniques have proven to offer efficient solutions to this problem.
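
    The multi-resolution idea behind such filter banks can be illustrated with a single level of the Haar wavelet transform, the simplest low-pass/high-pass filter pair. This is a sketch of the decomposition concept only, not the paper's SAR-specific bank design:

```python
import math

def haar_level(signal):
    """One level of the Haar wavelet transform: split an even-length signal
    into approximation (low-pass) and detail (high-pass) coefficients."""
    s = 1 / math.sqrt(2)
    approx = [s * (signal[i] + signal[i + 1]) for i in range(0, len(signal), 2)]
    detail = [s * (signal[i] - signal[i + 1]) for i in range(0, len(signal), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Perfectly reconstruct the signal from one level of Haar coefficients."""
    s = 1 / math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out.extend((s * (a + d), s * (a - d)))
    return out

x = [4.0, 2.0, 5.0, 5.0]
approx, detail = haar_level(x)   # coarse trend + local differences
rec = haar_inverse(approx, detail)
```

    A full filter bank recurses on the approximation coefficients, giving the multi-resolution view that the paper pairs with parametric spectral estimators.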

  2. Process Properties of Electronic High Voltage Discharges Triggered by Ultra-short Pulsed Laser Filaments

    NASA Astrophysics Data System (ADS)

    Cvecek, Kristian; Gröschel, Benjamin; Schmidt, Michael

    Remote processing of metallic workpieces by techniques based on electric arc discharges or laser irradiation for joining or cutting has a long tradition and is still being intensively investigated in present-day research. In applications that require high-power processing, both approaches exhibit certain advantages and disadvantages that make them suited to a given task. While several hybrid approaches exist that try to combine the benefits of both techniques, none were as successful in providing a fixed electric discharge direction as discharges triggered by plasma filaments generated by ultra-short pulsed lasers. In this work we investigate spatial and temporal aspects of laser-filament-guided discharges and determine an upper bound on the time delay between filament creation and the electrical build-up of a dischargeable voltage for a successful filament-triggered discharge.

  3. Group Support Systems (GSS)

    NASA Technical Reports Server (NTRS)

    Hamel, Gary P.; Wijesinghe, R.

    1996-01-01

    Groupware is a term describing an emerging computer software technology enhancing the ability of people to work together as a group (a software-driven 'group support system'). This project originated at the beginning of 1992, and reports were issued describing the activity through May 1995. These reports stressed the need for process as well as technology. That is, while the technology represented a computer-assisted method for groups to work together, the Group Support System (GSS) technology also required an understanding of the facilitation process that electronic meetings demand. Even people trained in traditional facilitation techniques did not necessarily find it easy to adopt groupware techniques. The latest phase of this activity attempted to (1) improve the facilitation process by developing training support for a portable groupware computer system, and (2) explore settings and uses for the portable groupware system using different software, such as Lotus Notes.

  4. Use of Iba Techniques to Characterize High Velocity Thermal Spray Coatings

    NASA Astrophysics Data System (ADS)

    Trompetter, W.; Markwitz, A.; Hyland, M.

    Spray coatings are being used in an increasingly wide range of industries to improve the abrasive, erosive, and sliding wear of machine components. Over the past decade industries have moved to the application of supersonic high velocity thermal spray techniques. These coating techniques produce superior coating quality in comparison to other traditional techniques such as plasma spraying. To date, knowledge of the bonding processes and the structure of the particles within thermal spray coatings is very subjective. The aim of this research is to improve our understanding of these materials through the use of IBA techniques in conjunction with other materials analysis techniques. Samples were prepared by spraying a widely used commercial NiCr powder onto substrates using an HVAF (high velocity air fuel) thermal spraying technique. Detailed analysis of the composition and structure of the powder particles revealed two distinct types: the majority were NiCr particles, with a significant minority composed of SiO2/CrO3. When the particles were investigated both as raw powder and in the sprayed coating, it was surprising to find that the composition of the coating material remained unchanged during the coating process despite the high-velocity application.

  5. Extreme Learning Machine and Particle Swarm Optimization in optimizing CNC turning operation

    NASA Astrophysics Data System (ADS)

    Janahiraman, Tiagrajah V.; Ahmad, Nooraziah; Hani Nordin, Farah

    2018-04-01

    A CNC machine is controlled by manipulating cutting parameters that directly influence process performance. Many optimization methods have been applied to obtain the optimal cutting parameters for a desired performance function; nonetheless, industry still uses traditional techniques to obtain those values, largely because of a lack of knowledge of optimization techniques. Therefore, a simple yet easy-to-implement Optimal Cutting Parameters Selection System is introduced to help manufacturers easily understand and determine the best optimal parameters for their turning operations. This new system consists of two stages: modelling and optimization. For modelling the input-output and in-process parameters, a hybrid of Extreme Learning Machine and Particle Swarm Optimization is applied. This modelling technique tends to converge faster than other artificial intelligence techniques and gives accurate results. For the optimization stage, Particle Swarm Optimization is again used to obtain the optimal cutting parameters based on the performance function preferred by the manufacturer. Overall, the system can reduce the gap between academia and industry by introducing a simple yet easy-to-implement optimization technique that gives accurate results quickly.
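
    Particle Swarm Optimization, used above for both model training and parameter search, can be sketched in a few lines. This minimal 1-D version minimizes a toy objective standing in for a machining cost model; the hyperparameters are illustrative defaults, not the paper's settings:

```python
import random

def pso(objective, lo, hi, n_particles=20, iters=60,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal 1-D Particle Swarm Optimization over the interval [lo, hi]."""
    rng = random.Random(seed)
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                          # each particle's best position
    pbest_f = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g], pbest_f[g]   # swarm's best so far
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vel[i] = (w * vel[i] + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (gbest - pos[i]))
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))   # clamp to bounds
            f = objective(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i], f
    return gbest, gbest_f

# Toy objective with a known minimum at x = 2.
best_x, best_f = pso(lambda x: (x - 2.0) ** 2, lo=-10.0, hi=10.0)
```

    In a real turning application, x would be a vector of cutting parameters (speed, feed, depth of cut) and the objective would be the ELM-predicted performance function.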

  6. Fluorescence correlation spectroscopy: novel variations of an established technique.

    PubMed

    Haustein, Elke; Schwille, Petra

    2007-01-01

    Fluorescence correlation spectroscopy (FCS) is one of the major biophysical techniques used for unraveling molecular interactions in vitro and in vivo. It allows minimally invasive study of dynamic processes in biological specimens with extremely high temporal and spatial resolution. By recording and correlating the fluorescence fluctuations of single labeled molecules through the exciting laser beam, FCS gives information on molecular mobility and photophysical and photochemical reactions. By using dual-color fluorescence cross-correlation, highly specific binding studies can be performed. These have been extended to four reaction partners accessible by multicolor applications. Alternative detection schemes shift accessible time frames to slower processes (e.g., scanning FCS) or higher concentrations (e.g., TIR-FCS). Despite its long tradition, FCS is by no means dated. Rather, it has proven to be a highly versatile technique that can easily be adapted to solve specific biological questions, and it continues to find exciting applications in biology and medicine.

  7. A case-based reasoning tool for breast cancer knowledge management with data mining concepts and techniques

    NASA Astrophysics Data System (ADS)

    Demigha, Souâd.

    2016-03-01

    The paper presents a Case-Based Reasoning tool for breast cancer knowledge management, aimed at improving breast cancer screening. To develop this tool, we combine concepts and techniques from both Case-Based Reasoning (CBR) and Data Mining (DM). Physicians and radiologists ground their diagnoses in expertise (past experience) built from clinical cases. Case-Based Reasoning is the process of solving new problems based on the solutions of similar past problems structured as cases, which makes it well suited to medical use. On the other hand, existing traditional Hospital Information Systems (HIS), Radiological Information Systems (RIS), and Picture Archiving and Communication Systems (PACS) do not allow medical information to be managed efficiently because of its complexity and heterogeneity. Data Mining is the process of extracting information from a data set and transforming it into an understandable structure for further use. Combining CBR with Data Mining techniques will facilitate the diagnosis and decision-making of medical experts.
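    The "retrieve" step of CBR can be sketched as weighted nearest-neighbour matching over a case base. The features, weights, and toy cases below are invented for illustration and are not taken from the paper's actual system.

```python
def similarity(query, case, weights):
    """Weighted similarity in [0, 1]; numeric features are range-normalized."""
    score = 0.0
    for feat, (weight, span) in weights.items():
        if span:  # numeric feature with a known value range
            score += weight * (1 - abs(query[feat] - case[feat]) / span)
        else:     # categorical feature: exact match or not
            score += weight * (query[feat] == case[feat])
    return score / sum(w for w, _ in weights.values())

def retrieve(query, case_base, weights, k=1):
    """Return the k most similar past cases (the CBR 'retrieve' step)."""
    return sorted(case_base, key=lambda c: similarity(query, c, weights),
                  reverse=True)[:k]

# Invented toy case base: feature name -> (weight, numeric range or None).
weights = {"age": (1.0, 60), "mass_shape": (2.0, None), "density": (1.0, None)}
cases = [
    {"age": 45, "mass_shape": "round", "density": "low", "outcome": "benign"},
    {"age": 62, "mass_shape": "spiculated", "density": "high", "outcome": "malignant"},
    {"age": 55, "mass_shape": "oval", "density": "low", "outcome": "benign"},
]
query = {"age": 60, "mass_shape": "spiculated", "density": "high"}
best = retrieve(query, cases, weights, k=1)[0]
```

    In a full CBR cycle the retrieved case's solution would then be reused, revised against the new problem, and retained back into the case base.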

  8. Terminology model discovery using natural language processing and visualization techniques.

    PubMed

    Zhou, Li; Tao, Ying; Cimino, James J; Chen, Elizabeth S; Liu, Hongfang; Lussier, Yves A; Hripcsak, George; Friedman, Carol

    2006-12-01

    Medical terminologies are important for unambiguous encoding and exchange of clinical information. The traditional manual method of developing terminology models is time-consuming and limited in the number of phrases that a human developer can examine. In this paper, we present an automated method for developing medical terminology models based on natural language processing (NLP) and information visualization techniques. Surgical pathology reports were selected as the testing corpus for developing a pathology procedure terminology model. The use of a general NLP processor for the medical domain, MedLEE, provides an automated method for acquiring semantic structures from a free text corpus and sheds light on a new high-throughput method of medical terminology model development. The use of an information visualization technique supports the summarization and visualization of the large quantity of semantic structures generated from medical documents. We believe that a general method based on NLP and information visualization will facilitate the modeling of medical terminologies.
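    The aggregation step the authors describe (collecting NLP-derived semantic structures across a corpus to seed a terminology model) can be illustrated with a deliberately crude toy: treat the last word of each phrase as the head term and the preceding words as modifiers. MedLEE's actual output is far richer; the phrases below are invented.

```python
from collections import Counter

# Invented pathology-procedure phrases standing in for an NLP-parsed corpus.
phrases = [
    "needle core biopsy", "excisional biopsy", "needle biopsy",
    "total mastectomy", "partial mastectomy",
]

# Toy "semantic structure": (head term, tuple of modifiers), with counts.
structures = Counter()
for phrase in phrases:
    words = phrase.split()
    head, modifiers = words[-1], tuple(words[:-1])
    structures[(head, modifiers)] += 1

# Group modifiers under each head term: the skeleton of a terminology model,
# which could then be summarized visually for a human modeler to review.
model = {}
for (head, modifiers), _count in structures.items():
    model.setdefault(head, set()).update(modifiers)
```
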

  9. "Teaches People That I'm More Than a Disability": Using Nominal Group Technique in Patient-Oriented Research for People With Intellectual and Developmental Disabilities.

    PubMed

    Spassiani, Natasha A; Sawyer, Amanda R; Chacra, Megan S Abou; Koch, Kimberley; Muñoz, Yasmin A; Lunsky, Yona

    2016-04-01

    Individuals with intellectual and developmental disabilities (IDD) have complex healthcare needs, which are often unmet. Nominal group technique (NGT) uses a mixed-methods approach, which may engage the IDD population in the research process in a person-centered manner and address the shortcomings of traditional research methods with this population. NGT was used with a group of 10 self-advocates to evaluate a series of healthcare tools created by and for individuals with IDD. Participants provided helpful input about the strengths of these tools and suggestions to improve them. NGT was found to be an effective way to engage all participants in the research process.

  10. Computer enhancement through interpretive techniques

    NASA Technical Reports Server (NTRS)

    Foster, G.; Spaanenburg, H. A. E.; Stumpf, W. E.

    1972-01-01

    The improvement in the usage of the digital computer through the use of the technique of interpretation rather than the compilation of higher ordered languages was investigated by studying the efficiency of coding and execution of programs written in FORTRAN, ALGOL, PL/I and COBOL. FORTRAN was selected as the high level language for examining programs which were compiled, and A Programming Language (APL) was chosen for the interpretive language. It is concluded that APL is competitive, not because it and the algorithms being executed are well written, but rather because the batch processing is less efficient than has been admitted. There is not a broad base of experience founded on trying different implementation strategies which have been targeted at open competition with traditional processing methods.

  11. Decorin content and near infrared spectroscopy analysis of dried collagenous biomaterial samples.

    PubMed

    Aldema-Ramos, Mila L; Castell, Joan Carles; Muir, Zerlina E; Adzet, Jose Maria; Sabe, Rosa; Schreyer, Suzanne

    2012-12-14

    The efficient removal of proteoglycans, such as decorin, from the hide when processing it to leather by traditional means is generally acceptable and beneficial for leather quality, especially for softness and flexibility. A patented waterless or acetone dehydration method that can generate a product similar to leather called Dried Collagenous Biomaterial (known as BCD) was developed but has no effect on decorin removal efficiency. The Alcian Blue colorimetric technique was used to assay the sulfated glycosaminoglycan (sGAG) portion of decorin. The corresponding residual decorin content was correlated to the mechanical properties of the BCD samples and was comparable to the control leather made traditionally. The waterless dehydration and instantaneous chrome tanning process is a good eco-friendly alternative to transforming hides to leather because no additional effects were observed after examination using NIR spectroscopy and additional chemometric analysis.

  12. Diagnostic Emplotment in Q'eqchi' Maya Medicine.

    PubMed

    Hatala, Andrew R; Waldram, James B

    2017-04-01

    Medical diagnosis is a process of illness discrimination, categorization, and identification on the basis of careful observation and is central in biomedicine and many traditional medical systems around the world. Through a detailed analysis of several illness episodes and healer interviews among Maya communities in southern Belize, we observe that the diagnostic processes of traditional Q'eqchi' healers reflect patterns of narrative 'emplotment' that engage not simply the individual patient but also significant spiritual and cosmological forces. Three diagnostic techniques of the Q'eqchi' Maya healers are described and their connections to Maya concepts of personhood and cosmovision are presented. This research fosters an appreciation of how Indigenous knowledge systems shape clinical encounters and healing dramas, widening the spheres of clinical narrative co-construction and dialogue beyond the material and physical contexts implicit within Western clinical encounters.

  13. Potential hazards in smoke-flavored fish

    NASA Astrophysics Data System (ADS)

    Lin, Hong; Jiang, Jie; Li, Donghua

    2008-08-01

    Smoking is widely used in fish processing for the color and flavor. Smoke flavorings have evolved as a successful alternative to traditional smoking. The hazards of the fish products treated by liquid-smoking process are discussed in this review. The smoke flavoring is one important ingredient in the smoke-flavored fish. This paper gives the definition of smoke flavorings and the hazard of polycyclic aromatic hydrocarbons (PAHs) residue in the smoke flavorings on the market. It gives also an assessment of chemical hazards such as carcinogenic PAHs, especially Benzo-[ a]pyrene, as well as biological hazards such as Listeria monocytogenes, Clostridium botulinum, histamine and parasites in smoke-flavored fish. The limitations in regulations or standards are discussed. Smoke flavored fish have lower content of PAHs as compared with the traditional smoking techniques if the PAHs residue in smoke flavorings is controlled by regulations or standards.

  14. Superresolution Interferometric Imaging with Sparse Modeling Using Total Squared Variation: Application to Imaging the Black Hole Shadow

    NASA Astrophysics Data System (ADS)

    Kuramochi, Kazuki; Akiyama, Kazunori; Ikeda, Shiro; Tazaki, Fumie; Fish, Vincent L.; Pu, Hung-Yi; Asada, Keiichi; Honma, Mareki

    2018-05-01

    We propose a new imaging technique for interferometry using sparse modeling, utilizing two regularization terms: the ℓ1-norm and a new function named total squared variation (TSV) of the brightness distribution. First, we demonstrate that our technique may achieve a superresolution of ∼30% compared with the traditional CLEAN beam size using synthetic observations of two point sources. Second, we present simulated observations of three physically motivated static models of Sgr A* with the Event Horizon Telescope (EHT) to show the performance of the proposed techniques in greater detail. Remarkably, in both the image and gradient domains, the optimal beam size minimizing root-mean-squared errors is ≲10% of the traditional CLEAN beam size for ℓ1+TSV regularization, and non-convolved reconstructed images have smaller errors than beam-convolved reconstructed images. This indicates that TSV is well matched to the expected physical properties of astronomical images and that the traditional post-processing technique of Gaussian convolution in interferometric imaging may not be required. We also propose a feature-extraction method to detect circular features from the image of a black hole shadow and use it to evaluate the performance of the image reconstruction. With this method and reconstructed images, the EHT can constrain the radius of the black hole shadow with an accuracy of ∼10%–20% in present simulations for Sgr A*, suggesting that the EHT would be able to provide useful independent measurements of the mass of the supermassive black holes in Sgr A* and also in another primary target, M87.
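    The distinction between TSV and ordinary total variation (TV) is easy to make concrete: TSV sums squared adjacent-pixel differences, TV absolute ones, so TV scores a sharp step and a smooth ramp equally while TSV prefers the ramp. A minimal sketch with an illustrative one-row "image":

```python
def tsv(img):
    """Total squared variation: sum of squared adjacent-pixel differences."""
    rows, cols = len(img), len(img[0])
    total = 0.0
    for i in range(rows):
        for j in range(cols):
            if i + 1 < rows:
                total += (img[i + 1][j] - img[i][j]) ** 2
            if j + 1 < cols:
                total += (img[i][j + 1] - img[i][j]) ** 2
    return total

def tv(img):
    """Anisotropic total variation: sum of absolute adjacent-pixel differences."""
    rows, cols = len(img), len(img[0])
    total = 0.0
    for i in range(rows):
        for j in range(cols):
            if i + 1 < rows:
                total += abs(img[i + 1][j] - img[i][j])
            if j + 1 < cols:
                total += abs(img[i][j + 1] - img[i][j])
    return total

# A sharp step and a smooth ramp with the same endpoints: TV scores them
# equally, while TSV penalizes the step more -- the property that favours
# the smooth-edged brightness distributions expected for black hole shadows.
step = [[0.0, 0.0, 1.0, 1.0]]
ramp = [[0.0, 1 / 3, 2 / 3, 1.0]]
```

    In the full method this penalty is added to the ℓ1 term and a data-fidelity term on the visibilities, and the sum is minimized over candidate images.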

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saunders, P.

    The majority of general-purpose low-temperature handheld radiation thermometers are severely affected by the size-of-source effect (SSE). Calibration of these instruments is pointless unless the SSE is accounted for in the calibration process. Traditional SSE measurement techniques, however, are costly and time consuming, and because the instruments are direct-reading in temperature, traditional SSE results are not easily interpretable, particularly by the general user. This paper describes a simplified method for measuring the SSE, suitable for second-tier calibration laboratories and requiring no additional equipment, and proposes a means of reporting SSE results on a calibration certificate that should be easily understood by the non-specialist user.
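    One common way to quantify the SSE is to normalize the thermometer's signal, measured through apertures of increasing diameter, against the signal at a reference diameter. The sketch below follows that generic scheme (not necessarily the paper's simplified method), and the readings are invented for illustration.

```python
def sse_curve(signals, ref_diameter):
    """Size-of-source effect as signal relative to a reference aperture.

    `signals` maps aperture diameter (mm) to the measured radiance signal;
    SSE(d) = S(d) / S(d_ref), so SSE(d_ref) = 1 by construction.
    """
    ref = signals[ref_diameter]
    return {d: s / ref for d, s in sorted(signals.items())}

# Invented readings for a thermometer with a noticeable SSE: the signal
# keeps climbing well past the nominal target size.
signals = {10: 0.90, 20: 0.95, 30: 0.98, 50: 1.00, 100: 1.03}
sse = sse_curve(signals, ref_diameter=50)
```

    A flat curve near 1.0 would indicate a well-behaved instrument; the continuing rise here is the behaviour that makes uncorrected calibration meaningless.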

  16. Effects-Based Operations in the Cyber Domain

    DTIC Science & Technology

    2017-05-03

    as the joint targeting methodology. The description that Batschelet gave the traditional targeting methodology included a process of, "Decide, Detect...technology, requires new planning and methodology to fight back. This paper evaluates current Department of Defense doctrine to look at ways to conduct...developing its cyber tactics, techniques, and procedures, which includes various targeting methodologies, such as the use of effects-based

  17. A non-parametric, supervised classification of vegetation types on the Kaibab National Forest using decision trees

    Treesearch

    Suzanne M. Joy; R. M. Reich; Richard T. Reynolds

    2003-01-01

    Traditional land classification techniques for large areas that use Landsat Thematic Mapper (TM) imagery are typically limited to the fixed spatial resolution of the sensors (30m). However, the study of some ecological processes requires land cover classifications at finer spatial resolutions. We model forest vegetation types on the Kaibab National Forest (KNF) in...

  18. Non-Profit/Higher Education Project Management Series: Project Management (PM) Foundations

    ERIC Educational Resources Information Center

    Burgher, Karl E.; Snyder, Michael B.

    2012-01-01

    This is the first in a series of forum articles on applying project management (PM) techniques and tools to the nonprofit sector with a focus on higher education. The authors will begin with a traditional look at project management because they believe that the integration of the tools and the processes associated with PM into many campus offices…

  19. Laparoscopic Appendectomy.

    DTIC Science & Technology

    1992-04-14

    is a minimally invasive endoscopic surgical procedure to remove the appendix. From December 1990 to February 1991, Tripler Army Medical Center...appendectomy appears to be a safe, cost-effective, minimally invasive surgical technique that in skilled hands may be used to remove most diseased appendices...requiring surgical intervention [1]. Management of this disease process has traditionally involved the surgical removal of the appendix through a right

  20. "Mushin": Learning in Technique-Intensive Sports as a Process of Uniting Mind and Body through Complex Learning Theory

    ERIC Educational Resources Information Center

    Light, Richard L.; Kentel, Jeanne Adéle

    2015-01-01

    Background: Interest in the use of learning theory to inform sport and physical-education pedagogy over the past decade beyond games and team sports has been limited. Purpose: Following on from recent interest within the literature in Eastern philosophic traditions, this article draws on the Japanese concept of "mushin" and complex…

  1. Supercritical fluid processing of drug nanoparticles in stable suspension.

    PubMed

    Pathak, Pankaj; Meziani, Mohammed J; Desai, Tarang; Foster, Charles; Diaz, Julian A; Sun, Ya-Ping

    2007-07-01

    Significant effort has been directed toward the development of drug formulation and delivery techniques, especially for drugs with poor or no aqueous solubility. Among various strategies to address the solubility issue, the reduction of drug particle sizes to the nanoscale has been identified as a potentially effective and broadly applicable approach. Complementary to traditional methods, supercritical fluid techniques have found unique applications in the production and processing of drug particles. Here we report the application of a newly developed supercritical fluid processing technique, Rapid Expansion of a Supercritical Solution into a Liquid Solvent, to the nanosizing of particles of the potent antiparasitic drug Amphotericin B. A supercritical carbon dioxide-cosolvent system was used for the solubilization and processing of the drug. The process produced well-dispersed nanoscale Amphotericin B particles suspended in an aqueous solution, and the suspension was intrinsically stable or could be further stabilized in the presence of water-soluble polymers. The properties of the drug nanoparticles were found to depend on the type of cosolvent used. The results on the use of dimethyl sulfoxide and methanol as cosolvents, and their effects on the properties of nanosized Amphotericin B particles, are presented and discussed.

  2. Forest control and regulation ... a comparison of traditional methods and alternatives

    Treesearch

    LeRoy C. Hennes; Michael J. Irving; Daniel I. Navon

    1971-01-01

    Two traditional techniques of forest control and regulation, formulas and area-volume check, are compared to linear programming as used in a new computerized planning system called the Timber Resource Allocation Method (Timber RAM). Inventory data from a National Forest in California illustrate how each technique is used. The traditional methods are simpler to apply and...

  3. The role of traditional health practitioners in Rural KwaZulu-Natal, South Africa: generic or mode specific?

    PubMed

    Zuma, Thembelihle; Wight, Daniel; Rochat, Tamsen; Moshabela, Mosa

    2016-08-22

    Traditional health practitioners (THPs) play a vital role in the health care of the majority of the South African population and elsewhere on the African continent. However, many studies have challenged the role of THPs in health care. Concerns raised in the literature include the rationale, safety and effectiveness of traditional health practices and methods, as well as what informs them. This paper explores the processes followed in becoming a traditional healer and how these processes are related to THP roles. A qualitative research design was adopted, using four repeat group discussions with nine THPs, as part of a larger qualitative study conducted within the HIV Treatment as Prevention trial in rural South Africa. THPs were sampled through the local THP association and snowballing techniques. Data collection approaches included photo-voice and community walks. The role identity theory and content analysis were used to explore the data following transcription and translation. In the context of rural Northern KwaZulu-Natal, three types of THPs were identified: 1) Isangoma (diviner); 2) Inyanga (one who focuses on traditional medical remedies) and 3) Umthandazi (faith healer). Findings revealed that THPs are called by ancestors to become healers and/or go through an intensive process of learning about traditional medicines including plant, animal or mineral substances to provide health care. Some THPs identified themselves primarily as one type of healer, while most occupied multiple healing categories, that is, they practiced across different healing types. Our study also demonstrates that THPs fulfil roles that are not specific to the type of healer they are, these include services that go beyond the uses of herbs for physical illnesses or divination. THPs serve roles which include, but are not limited to, being custodians of traditional African religion and customs, educators about culture, counsellors, mediators and spiritual protectors. 
THPs' mode specific roles are influenced by the processes by which they become healers. However, whichever type of healer they identified as, most THPs used similar, generic methods and practices to focus on the physical, spiritual, cultural, psychological, emotional and social elements of illness.

  4. A service based adaptive U-learning system using UX.

    PubMed

    Jeong, Hwa-Young; Yi, Gangman

    2014-01-01

    In recent years, traditional development techniques for e-learning systems have been changing to become more convenient and efficient. One new direction in the development of application systems combines cloud and ubiquitous computing: cloud computing can support learning-system processes through services, while ubiquitous computing provides system operation and management via high-performance technical processes and networks. In the cloud computing environment, a learning service application can provide a business module or process to the user via the internet. This research focuses on providing learning materials and course processes, by learning unit, using services in a ubiquitous computing environment. We also investigate functions that tailor materials to each user's learning style; that is, we analyzed users' data and characteristics in accordance with their user experience, and then adapted the learning process to fit their learning performance and preferences. Finally, we demonstrate that the proposed system delivers better learning effects to learners than existing techniques.

  5. A Service Based Adaptive U-Learning System Using UX

    PubMed Central

    Jeong, Hwa-Young

    2014-01-01

    In recent years, traditional development techniques for e-learning systems have been changing to become more convenient and efficient. One new direction in the development of application systems combines cloud and ubiquitous computing: cloud computing can support learning-system processes through services, while ubiquitous computing provides system operation and management via high-performance technical processes and networks. In the cloud computing environment, a learning service application can provide a business module or process to the user via the internet. This research focuses on providing learning materials and course processes, by learning unit, using services in a ubiquitous computing environment. We also investigate functions that tailor materials to each user's learning style; that is, we analyzed users' data and characteristics in accordance with their user experience, and then adapted the learning process to fit their learning performance and preferences. Finally, we demonstrate that the proposed system delivers better learning effects to learners than existing techniques. PMID:25147832

  6. In-situ spectroscopic analysis of the traditional dyeing pigment Turkey red inside textile matrix

    NASA Astrophysics Data System (ADS)

    Meyer, M.; Huthwelker, T.; Borca, C. N.; Meßlinger, K.; Bieber, M.; Fink, R. H.; Späth, A.

    2018-03-01

    Turkey red is a traditional pigment for textile dyeing and its use has been proven for various cultures within the last three millennia. The pigment is a dye-mordant complex consisting of Al and an extract from R. tinctorum that contains mainly the anthraquinone derivative alizarin. The chemical structure of the complex has been analyzed by various spectroscopic and crystallographic techniques for extractions from textiles or directly in solution. We present an in-situ study of Turkey red by means of μ-XRF mapping and NEXAFS spectroscopy on textile fibres dyed according to a traditional process to gain insight into the coordination chemistry of the pigment in a realistic matrix. We find an octahedral coordination of Al that corresponds well to the commonly accepted structure of the Al alizarin complex derived from ex-situ studies.

  7. Manufacturing with the Sun

    NASA Astrophysics Data System (ADS)

    Murphy, L. M.; Hauser, S. G.; Clyne, R. J.

    1992-05-01

    Concentrated solar radiation is now a viable alternative energy source for many advanced manufacturing processes. Researchers at the National Renewable Energy Laboratory (NREL) have demonstrated the feasibility of processes such as solar-induced surface transformation of materials (SISTM), solar-based manufacturing, and solar-pumped lasers. Researchers are also using sunlight to decontaminate water and soils polluted with organic compounds; these techniques could provide manufacturers with innovative alternatives to traditional methods of waste management. The solar technology that is now being integrated into today's manufacturing processes offers even greater potential for tomorrow, especially as applied to the radiation-abundant environment available in space and on the lunar surface.

  8. Manufacturing with the Sun

    NASA Astrophysics Data System (ADS)

    Murphy, Lawrence M.; Hauser, Steven G.; Clyne, Richard J.

    1991-12-01

    Concentrated solar radiation is now a viable alternative energy source for many advanced manufacturing processes. Researchers at the National Renewable Energy Laboratory (NREL) have demonstrated the feasibility of processes such as solar-induced surface transformation of materials (SISTM), solar-based manufacturing, and solar-pumped lasers. Researchers are also using sunlight to decontaminate water and soils polluted with organic compounds; these techniques could provide manufacturers with innovative alternatives to traditional methods of waste management. The solar technology that is now being integrated into today's manufacturing processes offers even greater potential for tomorrow, especially as applied to the radiation-abundant environment available in space and on the lunar surface.

  9. Manufacturing with the Sun

    NASA Technical Reports Server (NTRS)

    Murphy, Lawrence M.; Hauser, Steven G.; Clyne, Richard J.

    1991-01-01

    Concentrated solar radiation is now a viable alternative energy source for many advanced manufacturing processes. Researchers at the National Renewable Energy Laboratory (NREL) have demonstrated the feasibility of processes such as solar-induced surface transformation of materials (SISTM), solar-based manufacturing, and solar-pumped lasers. Researchers are also using sunlight to decontaminate water and soils polluted with organic compounds; these techniques could provide manufacturers with innovative alternatives to traditional methods of waste management. The solar technology that is now being integrated into today's manufacturing processes offers even greater potential for tomorrow, especially as applied to the radiation-abundant environment available in space and on the lunar surface.

  10. Event-Based Processing of Neutron Scattering Data

    DOE PAGES

    Peterson, Peter F.; Campbell, Stuart I.; Reuter, Michael A.; ...

    2015-09-16

    Many of the world's time-of-flight spallation neutron sources are migrating to the recording of individual neutron events. This provides new opportunities in data processing, not the least of which is filtering events by correlating them with logs of the sample environment and other ancillary equipment. This paper describes techniques for processing neutron scattering data acquired in event mode that preserve event information all the way to the final spectrum, including any necessary corrections or normalizations. This results in smaller final errors while significantly reducing processing time and memory requirements in typical experiments. Results with traditional histogramming techniques are shown for comparison.
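    Filtering recorded events against a sample-environment log reduces to an interval lookup on event timestamps: for each event, find the most recent log entry and test its value. A minimal sketch with invented events and a temperature log:

```python
from bisect import bisect_right

def filter_events(events, log, predicate):
    """Keep events whose sample-environment value (as of the event time)
    satisfies `predicate`. `log` is a time-sorted list of (time, value)."""
    times = [t for t, _ in log]
    kept = []
    for ev_time, tof in events:
        idx = bisect_right(times, ev_time) - 1  # last log entry at or before the event
        if idx >= 0 and predicate(log[idx][1]):
            kept.append((ev_time, tof))
    return kept

# Invented data: (wall-clock time s, time-of-flight us) events and a
# temperature log; keep only events recorded while T was within 1 K of 300 K.
events = [(0.5, 1200.0), (1.5, 1500.0), (2.5, 900.0), (3.5, 1100.0)]
temperature_log = [(0.0, 295.0), (1.0, 300.2), (3.0, 304.0)]
stable = filter_events(events, temperature_log,
                       lambda temp: abs(temp - 300.0) <= 1.0)
```

    Because the events themselves are preserved, the same recording can later be re-filtered against any other log without re-acquiring data, which is the flexibility histogramming gives up.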

  11. Endoscopic versus traditional saphenous vein harvesting: a prospective, randomized trial.

    PubMed

    Allen, K B; Griffith, G L; Heimansohn, D A; Robison, R J; Matheny, R G; Schier, J J; Fitzgerald, E B; Shaar, C J

    1998-07-01

    Saphenous vein harvested with a traditional longitudinal technique often results in leg wound complications. An alternative endoscopic harvest technique may decrease these complications. One hundred twelve patients scheduled for elective coronary artery bypass grafting were prospectively randomized to have vein harvested using either an endoscopic (group A, n = 54) or traditional technique (group B, n = 58). Groups A and B, respectively, were similar with regard to length of vein harvested (41 +/- 8 cm versus 40 +/- 14 cm), bypasses done (4.1 +/- 1.1 versus 4.2 +/- 1.4), age, preoperative risk stratification, and risks for wound complication (diabetes, sex, obesity, preoperative anemia, hypoalbuminemia, and peripheral vascular disease). Leg wound complications were significantly (p < or = 0.02) reduced in group A (4% [2 of 51] versus 19% [11 of 58]). Univariate analysis identified traditional incision (p < or = 0.02) and diabetes (p < or = 0.05) as wound complication risk factors. Multiple logistic regression analysis identified only the traditional harvest technique as a risk factor for leg wound complications with no significant interaction between harvest technique and any preoperative risk factor (p < or = 0.03). Harvest rate (0.9 +/- 0.4 cm/min versus 1.2 +/- 0.5 cm/min) was slower for group A (p < or = 0.02) and conversion from endoscopic to a traditional harvest occurred in 5.6% (3 of 54) of patients. In a prospective, randomized trial, saphenous vein harvested endoscopically was associated with fewer wound complications than the traditional longitudinal method.
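    The headline comparison (2 of 51 versus 11 of 58 wound complications, p <= 0.02) can be checked with a one-sided Fisher exact test built from hypergeometric probabilities. The paper's own analysis may have used a different test; this is only an illustration of how such a 2x2 comparison is computed.

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    P(X <= a) where X is hypergeometric, i.e. the probability of seeing
    this few or fewer events in the first group under the null."""
    row1 = a + b            # size of the first group
    col1 = a + c            # total events across both groups
    n = a + b + c + d       # total subjects
    denom = comb(n, row1)
    return sum(comb(col1, k) * comb(n - col1, row1 - k)
               for k in range(0, a + 1)) / denom

# Group A (endoscopic): 2 complications of 51 analyzed;
# group B (traditional): 11 complications of 58.
p = fisher_one_sided(2, 49, 11, 47)
```
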

  12. Evaluation of stabilization techniques for ion implant processing

    NASA Astrophysics Data System (ADS)

    Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Narcy, Mark E.; Livesay, William R.

    1999-06-01

    With the integration of high current ion implant processing into volume CMOS manufacturing, the need for photoresist stabilization to achieve a stable ion implant process is critical. This study compares electron beam stabilization, a non-thermal process, with more traditional thermal stabilization techniques such as hot plate baking and vacuum oven processing. The electron beam processing is carried out in a flood exposure system with no active heating of the wafer. These stabilization techniques are applied to typical ion implant processes that might be found in a CMOS production process flow. The stabilization processes are applied to a 1.1 micrometers thick PFI-38A i-line photoresist film prior to ion implant processing. Post stabilization CD variation is detailed with respect to wall slope and feature integrity. SEM photographs detail the effects of the stabilization technique on photoresist features. The thermal stability of the photoresist is shown for different levels of stabilization and post stabilization thermal cycling. Thermal flow stability of the photoresist is detailed via SEM photographs. A significant improvement in thermal stability is achieved with the electron beam process, such that photoresist features are stable to temperatures in excess of 200 degrees C. Ion implant processing parameters are evaluated and compared for the different stabilization methods. Ion implant system end-station chamber pressure is detailed as a function of ion implant process and stabilization condition. The ion implant process conditions are detailed for varying factors such as ion current, energy, and total dose. A reduction in the ion implant systems end-station chamber pressure is achieved with the electron beam stabilization process over the other techniques considered. This reduction in end-station chamber pressure is shown to provide a reduction in total process time for a given ion implant dose. 
Improvements in the ion implant process are detailed across several combinations of current and energy.

  13. Traditional versus rule-based programming techniques: Application to the control of optional flight information

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.; Abbott, Kathy H.

    1987-01-01

    To the software design community, the concern over the costs associated with a program's execution time and implementation is great. It is always desirable, and sometimes imperative, that the proper programming technique is chosen which minimizes all costs for a given application or type of application. A study is described that compared cost-related factors associated with traditional programming techniques to rule-based programming techniques for a specific application. The results of this study favored the traditional approach regarding execution efficiency, but favored the rule-based approach regarding programmer productivity (implementation ease). Although this study examined a specific application, the results should be widely applicable.

  14. Focused attention, open monitoring and automatic self-transcending: Categories to organize meditations from Vedic, Buddhist and Chinese traditions.

    PubMed

    Travis, Fred; Shear, Jonathan

    2010-12-01

    This paper proposes a third meditation category, automatic self-transcending, to extend the dichotomy of focused attention and open monitoring proposed by Lutz. Automatic self-transcending includes techniques designed to transcend their own activity. This contrasts with focused attention, which keeps attention focused on an object, and open monitoring, which keeps attention involved in the monitoring process. Each category was assigned EEG bands, based on reported brain patterns during mental tasks, and meditations were categorized based on their reported EEG. Focused attention, characterized by beta/gamma activity, included meditations from Tibetan Buddhist, Buddhist, and Chinese traditions. Open monitoring, characterized by theta activity, included meditations from Buddhist, Chinese, and Vedic traditions. Automatic self-transcending, characterized by alpha1 activity, included meditations from Vedic and Chinese traditions. Between categories, the included meditations differed in focus, subject/object relation, and procedures. These findings shed light on the common mistake of averaging meditations together to determine mechanisms or clinical effects. Copyright © 2010 Elsevier Inc. All rights reserved.
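    The band-to-category mapping follows directly from the abstract (beta/gamma -> focused attention, theta -> open monitoring, alpha1 -> automatic self-transcending) and is simple enough to encode directly. The frequency ranges below are conventional approximations, not values taken from the paper.

```python
# Approximate conventional EEG frequency bands (Hz); not from the paper.
BANDS = {
    "theta": (4.0, 8.0),
    "alpha1": (8.0, 10.0),
    "beta": (13.0, 30.0),
    "gamma": (30.0, 80.0),
}

# Category assignment as reported in the abstract.
CATEGORY = {
    "beta": "focused attention",
    "gamma": "focused attention",
    "theta": "open monitoring",
    "alpha1": "automatic self-transcending",
}

def categorize(dominant_freq_hz):
    """Map a reported dominant EEG frequency to a proposed meditation category."""
    for band, (lo, hi) in BANDS.items():
        if lo <= dominant_freq_hz < hi:
            return CATEGORY[band]
    return "unclassified"
```
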

  15. Traditional versus rule-based programming techniques - Application to the control of optional flight information

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.; Abbott, Kathy H.

    1987-01-01

    A traditional programming technique for controlling the display of optional flight information in a civil transport cockpit is compared with a rule-based technique for the same function. This application required complex decision logic and a frequently modified rule base. The techniques are evaluated for execution efficiency and implementation ease: the criterion used to calculate execution efficiency is the total number of steps required to isolate the hypotheses that were true, and the criteria used to evaluate implementability are ease of modification, ease of verification, and explanation capability. It is observed that the traditional program is more efficient than the rule-based program; however, the rule-based programming technique is better suited to improving programmer productivity.
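    The trade-off this study measured can be illustrated with a toy forward-chaining rule interpreter. The rule and fact names below are hypothetical, a minimal sketch of rule-based control of optional display information rather than the study's actual implementation:

    ```python
    def run_rules(facts, rules):
        """Forward-chaining interpreter: repeatedly fire any rule whose
        conditions all hold until no new facts can be derived.
        Each rule is a (conditions, conclusion) pair."""
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for conditions, conclusion in rules:
                if conclusion not in facts and all(c in facts for c in conditions):
                    facts.add(conclusion)
                    changed = True
        return facts

    # Hypothetical rules for optional cockpit display information.
    RULES = [
        ({"climb_phase", "engine_caution"}, "show_engine_page"),
        ({"show_engine_page", "pilot_busy"}, "defer_checklist"),
    ]
    ```

    An equivalent traditional implementation would hard-code the same logic as nested if-statements, which typically executes in fewer steps but is harder to modify, verify, and explain, mirroring the study's two evaluation criteria.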

  16. Comparison of ultrasound-assisted and traditional caustic leaching of spent cathode carbon (SCC) from aluminum electrolysis.

    PubMed

    Xiao, Jin; Yuan, Jie; Tian, Zhongliang; Yang, Kai; Yao, Zhen; Yu, Bailie; Zhang, Liuyun

    2018-01-01

    The spent cathode carbon (SCC) from aluminum electrolysis was subjected to caustic leaching to investigate the different effects of ultrasound-assisted and traditional methods on the fluorine (F) leaching rate and the carbon content of the leaching residue. Sodium hydroxide (NaOH) dissolved in deionized water was used as the reaction system. Through single-factor experiments and a comparison of the two leaching techniques, the optimum F leaching rate and residue carbon content for the ultrasound-assisted leaching process were obtained at a temperature of 70 °C, a leaching time of 40 min, an initial mass ratio of alkali to SCC (initial alkali-to-material ratio) of 0.6, a liquid-to-solid ratio of 10 mL/g, and an ultrasonic power of 400 W. Under the optimal conditions, the leaching residue carbon content was 94.72%, 2.19% higher than the carbon content of the traditional leaching residue. Leaching wastewater was treated with calcium chloride (CaCl2) and bleaching powder, and the treated wastewater was recycled as caustic solution. All in all, benefiting from the ultrasonication effects, ultrasound-assisted caustic leaching of spent cathode carbon required a 55.6% shorter leaching time than the traditional process while achieving a higher impurity removal rate. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Survey Of Lossless Image Coding Techniques

    NASA Astrophysics Data System (ADS)

    Melnychuck, Paul W.; Rabbani, Majid

    1989-04-01

    Many image transmission/storage applications requiring some form of data compression additionally require that the decoded image be an exact replica of the original. Lossless image coding algorithms meet this requirement by generating a decoded image that is numerically identical to the original. Several lossless coding techniques are modifications of well-known lossy schemes, whereas others are new. Traditional Markov-based models and newer arithmetic coding techniques are applied to predictive coding, bit plane processing, and lossy-plus-residual coding. Generally speaking, the compression ratios offered by these techniques are in the range of 1.6:1 to 3:1 for 8-bit pictorial images. Compression ratios for 12-bit radiological images approach 3:1, as these images have less detailed structure; their higher pel correlation therefore leads to greater removal of image redundancy.
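    The predictive-coding family surveyed here can be sketched in a few lines. This is a generic one-dimensional DPCM example using left-neighbor prediction, not any specific algorithm from the survey: residuals cluster near zero for correlated image rows, which is what lets a subsequent Markov-model or arithmetic coder spend fewer bits on them, and decoding is numerically exact, hence lossless.

    ```python
    def encode(row):
        """Lossless predictive (DPCM) coding: transmit the difference
        between each pixel and its left neighbor (predictor = previous
        pixel, 0 for the first)."""
        prev = 0
        residuals = []
        for p in row:
            residuals.append(p - prev)
            prev = p
        return residuals

    def decode(residuals):
        """Exactly invert encode() by accumulating the residuals."""
        prev = 0
        row = []
        for r in residuals:
            prev += r
            row.append(prev)
        return row
    ```

    For a correlated row such as `[100, 101, 101, 103, 110]`, the residuals are `[100, 1, 0, 2, 7]`: small magnitudes that an entropy coder can represent compactly, while `decode` recovers the original row bit for bit.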

  18. Using Innovative Technologies for Manufacturing and Evaluating Rocket Engine Hardware

    NASA Technical Reports Server (NTRS)

    Betts, Erin M.; Hardin, Andy

    2011-01-01

    Many of the manufacturing and evaluation techniques that are currently used for rocket engine component production are traditional methods that have been proven through years of experience and historical precedence. As we enter into a new space age where new launch vehicles are being designed and propulsion systems are being improved upon, it is sometimes necessary to adopt new and innovative techniques for manufacturing and evaluating hardware. With a heavy emphasis on cost reduction and improvements in manufacturing time, manufacturing techniques such as Direct Metal Laser Sintering (DMLS) and white light scanning are being adopted and evaluated for their use on J-2X, with hopes of employing both technologies on a wide variety of future projects. DMLS has the potential to significantly reduce the processing time and cost of engine hardware, while achieving desirable material properties by using a layered powdered metal manufacturing process in order to produce complex part geometries. The white light technique is a non-invasive method that can be used to inspect for geometric feature alignment. Both the DMLS manufacturing method and the white light scanning technique have proven to be viable options for manufacturing and evaluating rocket engine hardware, and further development and use of these techniques is recommended.

  19. The effect of various veneering techniques on the marginal fit of zirconia copings.

    PubMed

    Torabi, Kianoosh; Vojdani, Mahroo; Giti, Rashin; Taghva, Masumeh; Pardis, Soheil

    2015-06-01

    This study aimed to evaluate the fit of zirconia ceramics before and after veneering, using 3 different veneering processes (layering, press-over, and CAD-on techniques). Thirty standardized zirconia CAD/CAM frameworks were constructed and divided into three groups of 10 each. The first group was veneered using the traditional layering technique; press-over and CAD-on techniques were used to veneer the second and third groups. The marginal gap of the specimens was measured before and after the veneering process at 18 sites on the master die using a digital microscope. A paired t-test was used to evaluate mean marginal gap changes, and one-way ANOVA and post hoc tests were employed for comparison among the 3 groups (α=.05). The marginal gaps of all 3 groups increased after porcelain veneering. The mean marginal gap after veneering in the layering group (63.06 µm) was higher than in the press-over (50.64 µm) and CAD-on (51.50 µm) groups (P<.001). All three veneering methods altered the marginal fit of the zirconia copings, and the conventional layering technique increased the marginal gap of the zirconia framework more than the pressing and CAD-on techniques. All-ceramic crowns made by the three veneering methods nevertheless showed clinically acceptable marginal fit.

  20. Classroom Activities: Simple Strategies to Incorporate Student-Centered Activities within Undergraduate Science Lectures

    PubMed Central

    Lom, Barbara

    2012-01-01

    The traditional science lecture, where an instructor delivers a carefully crafted monolog to a large audience of students who passively receive the information, has been a popular mode of instruction for centuries. Recent evidence on the science of teaching and learning indicates that learner-centered, active teaching strategies can be more effective learning tools than traditional lectures. Yet most colleges and universities retain lectures as their central instructional method. This article highlights several simple collaborative teaching techniques that can be readily deployed within traditional lecture frameworks to promote active learning. Specifically, this article briefly introduces the techniques of: reader’s theatre, think-pair-share, roundtable, jigsaw, in-class quizzes, and minute papers. Each technique is broadly applicable well beyond neuroscience courses and easily modifiable to serve an instructor’s specific pedagogical goals. The benefits of each technique are described along with specific examples of how each technique might be deployed within a traditional lecture to create more active learning experiences. PMID:23494568

  1. On the manufacturing of a gas turbine engine part through metal spinning process

    NASA Astrophysics Data System (ADS)

    Hassanin, A. El; Astarita, A.; Scherillo, F.; Velotti, C.; Squillace, A.; Liguori, A.

    2018-05-01

    Metal spinning represents an interesting alternative to traditional sheet metal forming processes in several industrial contexts, such as automotive and aerospace. In this work, the production of a combustion chamber liner top prototype using AISI 304L stainless steel is proposed, in order to evaluate the feasibility of the process for the required part geometry. Prototype production was carried out using a two-stage semiautomatic spinning process, and the effects in terms of wall thickness reduction were investigated. The microstructural behavior of the metal subjected to the forming process was investigated using optical microscopy and scanning electron microscopy (SEM), while Vickers micro-indentation tests were performed to evaluate the influence on the mechanical properties. The main result of the process, as observed with all the investigation techniques adopted, is the formation of strain-induced martensite due to the severe plastic deformation and cold reduction of the material, which in this case ranged from 30% to 50%. In some areas of the part section, rips indicating excessive tensile stress were also detected.

  2. [Advancements of computer chemistry in separation of Chinese medicine].

    PubMed

    Li, Lingjuan; Hong, Hong; Xu, Xuesong; Guo, Liwei

    2011-12-01

    Separation techniques for Chinese medicine are not only a key technology in the research and development of Chinese medicine, but also a significant step in the modernization of Chinese medicinal preparations. Computer chemistry can build models and find regularities in the Chinese medicine system, which is full of complicated data. This paper analyzed the applicability, key technology, basic mode and common algorithms of computer chemistry applied to the separation of Chinese medicine, introduced the mathematical models and setting methods of extraction kinetics, investigated several problems based on traditional Chinese medicine membrane processes, and forecast the application prospects.

  3. Using parallel evolutionary development for a biologically-inspired computer vision system for mobile robots.

    PubMed

    Wright, Cameron H G; Barrett, Steven F; Pack, Daniel J

    2005-01-01

    We describe a new approach to attacking the problem of robust computer vision for mobile robots. The overall strategy is to mimic the biological evolution of animal vision systems. Our basic imaging sensor is based upon the eye of the common house fly, Musca domestica. The computational algorithms are a mix of traditional image processing, subspace techniques, and multilayer neural networks.

  4. Visual Purple, the Next Generation Crisis Management Decision Training Tool

    DTIC Science & Technology

    2001-09-01

    talents of professional Hollywood screenwriters during the scripting and writing process of the simulations. Additionally, cinematic techniques learned...cultural, and language experts for research development. Additionally, GTA provides country specific support in script writing and cinematic resources as...The result is an entirely new dimension of realism that traditional exercises often fail to capture. The scenario requires the participant to make the

  5. Processing of pulse oximeter signals using adaptive filtering and autocorrelation to isolate perfusion and oxygenation components

    NASA Astrophysics Data System (ADS)

    Ibey, Bennett; Subramanian, Hariharan; Ericson, Nance; Xu, Weijian; Wilson, Mark; Cote, Gerard L.

    2005-03-01

    A blood perfusion and oxygenation sensor has been developed for in situ monitoring of transplanted organs. When processing in situ data, motion artifacts due to increased perfusion can create invalid oxygenation saturation values. In order to remove the unwanted artifacts from the pulsatile signal, adaptive filtering was employed using a third wavelength source centered at 810 nm as a reference signal. The 810 nm source resides approximately at the isosbestic point of the hemoglobin absorption curve, where the absorbance of light is nearly equal for oxygenated and deoxygenated hemoglobin. Using an autocorrelation-based algorithm, oxygenation saturation values can be obtained without the need for large sampling data sets, allowing near real-time processing. This technique has been shown to be more reliable than traditional techniques and has been proven to adequately improve the measurement of oxygenation values in varying perfusion states.
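    The artifact-removal step, using the isosbestic 810 nm channel as the reference input to an adaptive filter, can be sketched with a standard least-mean-squares (LMS) noise canceller. The step size and tap count below are illustrative assumptions, not the authors' parameters:

    ```python
    def lms_cancel(primary, reference, mu=0.05, taps=4):
        """LMS adaptive noise canceller (illustrative sketch). `primary`
        is the pulsatile channel contaminated by motion; `reference` is
        the 810 nm isosbestic channel, which carries mostly the motion
        artifact. The filter learns to predict the artifact from the
        reference history; the error signal is the artifact-free output."""
        w = [0.0] * taps
        out = []
        for n in range(len(primary)):
            # Tap-delay vector of the reference (zero-padded at the start).
            x = [reference[n - i] if n - i >= 0 else 0.0 for i in range(taps)]
            y = sum(wi * xi for wi, xi in zip(w, x))    # predicted artifact
            e = primary[n] - y                          # cleaned sample
            w = [wi + 2 * mu * e * xi for wi, xi in zip(w, x)]  # LMS update
            out.append(e)
        return out
    ```

    Because the artifact is correlated with the reference while the pulsatile oxygenation component is not, the filter converges to subtract only the motion contribution, after which the autocorrelation-based saturation estimate operates on the cleaned signal.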

  6. Membrane processing technology in the food industry: food processing, wastewater treatment, and effects on physical, microbiological, organoleptic, and nutritional properties of foods.

    PubMed

    Kotsanopoulos, Konstantinos V; Arvanitoyannis, Ioannis S

    2015-01-01

    Membrane processing technology (MPT) is nowadays increasingly used in a wide range of applications (demineralization, desalination, stabilization, separation, deacidification, reduction of microbial load, purification, etc.) in the food industry. The most frequently applied techniques are electrodialysis (ED), reverse osmosis (RO), nanofiltration (NF), ultrafiltration (UF), and microfiltration (MF). Membrane characteristics such as pore size, flow properties, and the applied hydraulic pressure mainly determine a membrane's potential uses. In this review paper, the basic membrane techniques, their potential applications to a large number of fields and products in the food industry, the main advantages and disadvantages of these methods, fouling phenomena, and their effects on the organoleptic, qualitative, and nutritional value of foods are synoptically described. Some representative examples of traditional and modern membrane applications, in both tabular and figural form, are also provided.

  7. Vectorization with SIMD extensions speeds up reconstruction in electron tomography.

    PubMed

    Agulleiro, J I; Garzón, E M; García, I; Fernández, J J

    2010-06-01

    Electron tomography allows structural studies of cellular structures at molecular detail. Large 3D reconstructions are needed to meet the resolution requirements, and the processing time to compute these large volumes may be considerable, so high performance computing techniques have traditionally been used. This work presents a vector approach to tomographic reconstruction that relies on the exploitation of the SIMD extensions available in modern processors in combination with other single-processor optimization techniques. This approach succeeds in producing full resolution tomograms with an important reduction in processing time, as evaluated with the most common reconstruction algorithms, namely WBP and SIRT. The main advantage stems from the fact that this approach can be run on standard computers without the need for specialized hardware, which facilitates the development, use and management of programs. Future trends in processor design open excellent opportunities for vector processing with processors' SIMD extensions in the field of 3D electron microscopy.

  8. A hybrid, auto-adaptive and rule-based multi-agent approach using evolutionary algorithms for improved searching

    NASA Astrophysics Data System (ADS)

    Izquierdo, Joaquín; Montalvo, Idel; Campbell, Enrique; Pérez-García, Rafael

    2016-08-01

    Selecting the most appropriate heuristic for solving a specific problem is not easy, for many reasons. This article focuses on one of these reasons: traditionally, the solution search process has operated in a given manner regardless of the specific problem being solved, and the process has been the same regardless of the size, complexity and domain of the problem. To cope with this situation, search processes should mould the search into areas of the search space that are meaningful for the problem. This article builds on previous work in the development of a multi-agent paradigm using techniques derived from knowledge discovery (data-mining techniques) on databases of so-far visited solutions. The aim is to improve the search mechanisms, increase computational efficiency and use rules to enrich the formulation of optimization problems, while reducing the search space and catering to realistic problems.

  9. Explosively generated shock wave processing of metal powders by instrumented detonics

    NASA Astrophysics Data System (ADS)

    Sharma, A. D.; Sharma, A. K.; Thakur, N.

    2013-06-01

    The highest pressures generated by dynamic processes, resulting either from high velocity impact or from the spontaneous release of high-energy-rate substances in direct contact with a metal, find superior applications over normal mechanical means. The special feature of explosive loading of powder materials over traditional methods is its controlled detonation pressure, which directly transmits shock energy to the material entrapped inside the powder, resulting in several microstructural changes and hence improved mechanical properties. Superalloy powders have been compacted to near theoretical density by shock wave consolidation. In a single experimental set-up, compaction of metal powder and measurement of detonation velocity have been achieved successfully by using instrumented detonics. The thrust of the work is to obtain uniform, crack-free and fracture-free compacts of superalloys with intact crystalline structure, as examined by FE-SEM, XRD and mechanical studies. Shock wave processing is an emerging technique receiving much attention from materials scientists and engineers owing to its advantages over traditional metallurgical methods: short processing time, scale-up advantage and controlled detonation pressure.

  10. Self-Reported Alcohol Consumption and Sexual Behavior in Males and Females: Using the Unmatched-Count Technique to Examine Reporting Practices of Socially Sensitive Subjects in a Sample of University Students

    ERIC Educational Resources Information Center

    Walsh, Jeffrey A.; Braithwaite, Jeremy

    2008-01-01

    This work, drawing on the literature on alcohol consumption, sexual behavior, and researching sensitive topics, tests the efficacy of the unmatched-count technique (UCT) in establishing higher rates of truthful self-reporting when compared to traditional survey techniques. Traditional techniques grossly underestimate the scope of problems…
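    The unmatched-count technique reduces to a simple difference-of-means estimator: control respondents report how many of k innocuous items apply to them, treatment respondents see the same k items plus the sensitive item, and the mean difference estimates the prevalence of the sensitive behavior without any individual ever revealing it. A minimal sketch (the sample counts in the test are invented):

    ```python
    from statistics import mean

    def uct_prevalence(control_counts, treatment_counts):
        """Unmatched-count technique (UCT) estimator. `control_counts` are
        item counts from respondents shown only the k innocuous items;
        `treatment_counts` are counts from respondents shown the same k
        items plus one sensitive item. The difference in means estimates
        the proportion endorsing the sensitive item."""
        return mean(treatment_counts) - mean(control_counts)
    ```

    Because each respondent reports only a total count, never which items apply, the design removes the incentive to under-report that plagues direct questioning on sensitive subjects such as alcohol use and sexual behavior.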

  11. A preliminary microbiological assessment of process hygiene of traditional outdoor camel slaughter in Sahrawi refugee camps.

    PubMed

    Corrò, M; Saleh-Mohamed-Lamin, S; Jatri-Hamdi, S; Slut-Ahmed, B; Mohamed-Lejlifa, S; Di Lello, S; Rossi, D; Broglia, A; Vivas-Alegre, L

    2012-10-01

    The aim of this study was to investigate the hygiene performance of a camel (Camelus dromedarius) slaughtering process as carried out with the traditional method in the Sahrawi refugee camps located in southwestern Algeria. The camel slaughtering process in this region differs significantly from that carried out in commercial abattoirs: slaughtering is performed outdoors in desert areas, and dehiding of the carcass is approached via the dorsoventral route rather than the classic ventrodorsal route. Samples were taken from 10 camel carcasses from three different areas: the hide, the carcass meat immediately after dehiding, and the meat after final cutting. Enterobacteriaceae counts (EC) were enumerated employing conventional laboratory techniques. Carcass meat samples yielded EC below the detection limit more frequently when the hide samples from the same carcass also had EC below the detection limit. Because of the low number of trials, calculation of the statistical significance of the results was not possible. Further experimental research is needed in order to validate the results presented in this study. A comparison of the microbiological hygiene performance of dorsal dehiding and traditional ventral dehiding of slaughtered animals could serve to validate the hypothesis that the dorsal dehiding method has a positive impact on carcass meat hygiene.

  12. Characterization of heavy-metal-contaminated sediment by using unsupervised multivariate techniques and health risk assessment.

    PubMed

    Wang, Yeuh-Bin; Liu, Chen-Wuing; Wang, Sheng-Wei

    2015-03-01

    This study characterized the sediment quality of the severely contaminated Erjen River in Taiwan by using multivariate analysis methods, including factor analysis (FA), self-organizing maps (SOMs), and positive matrix factorization (PMF), together with health risk assessment. The SOMs classified the dataset with similar heavy-metal-contaminated sediment into five groups. FA extracted three major factors, namely a traditional electroplating and metal-surface-processing factor, a nontraditional heavy-metal-industry factor, and a natural geological factor, which together accounted for 80.8% of the variance. The SOMs and FA revealed the heavy-metal-contaminated-sediment hotspots in the middle and upper reaches of the major tributary in the dry season. The hazard index value for health risk via ingestion was 0.302. PMF further quantified the source apportionment, indicating that traditional electroplating and metal-surface-processing industries contributed 47% of the health risk posed by heavy-metal-contaminated sediment. Contaminants discharged from traditional electroplating and metal-surface-processing industries in the middle and upper reaches of the major tributary must be eliminated first to improve the sediment quality of the Erjen River. The proposed assessment framework for heavy-metal-contaminated sediment can be applied to contaminated-sediment river sites in other regions. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. [Comparative trial between traditional cesarean section and Misgav-Ladach technique].

    PubMed

    Gutiérrez, José Gabriel Tamayo; Coló, José Antonio Sereno; Arreola, María Sandra Huape

    2008-02-01

    The cesarean section was designed to extract the newborn when childbirth by the natural route becomes difficult. Institutional obstetrical work demands long surgical times and substantial supplies; therefore, simpler procedures must be implemented. The objective was to compare the traditional cesarean section with the Misgav-Ladach technique in terms of surgical time, hospital stay and costs. Forty-eight pregnant patients at term with an obstetrical indication for cesarean delivery were randomized into two groups: 24 underwent traditional cesarean section and 24 the Misgav-Ladach technique. The outcomes included surgical time, bleeding, amount of sutures used, pain intensity and other adverse effects. The surgical time with the Misgav-Ladach technique was shorter than with the traditional cesarean section, bleeding was consistently lower, and pain was also lower. No adverse effects were registered in either group. Although the short follow-up showed a significant reduction in operative time and less bleeding, a longer follow-up would be desirable to confirm the absence of abdominal adhesions.

  14. Microwave, Millimeter, Submillimeter, and Far Infrared Spectral Databases

    NASA Technical Reports Server (NTRS)

    Pearson, J. C.; Pickett, H. M.; Drouin, B. J.; Chen, P.; Cohen, E. A.

    2002-01-01

    The spectrum of most known astrophysical molecules is derived from transitions between a few hundred to a few hundred thousand energy levels populated at room temperature. In the microwave and millimeter wave regions, spectroscopy is almost always performed with traditional microwave techniques. In the submillimeter and far infrared, the microwave technique becomes progressively more technologically challenging, and infrared techniques become more widely employed as the wavelength gets shorter. Infrared techniques are typically one to two orders of magnitude less precise, but they do generate all the strong features in the spectrum. With the microwave technique, it is generally impossible and rarely necessary to measure every single transition of a molecular species, so careful fitting of quantum mechanical Hamiltonians to the measured transitions is required to produce the complete spectral picture of the molecule needed by astronomers. The fitting process produces the most precise data possible and is required to interpret heterodyne observations. The drawback of the traditional microwave technique is that precise knowledge of the band origins of low-lying excited states is rarely gained. The fitting of data interpolates well over the range of quantum numbers where there is laboratory data, but extrapolation is almost never precise. The majority of high resolution spectroscopic data is at millimeter or longer wavelengths, and a very limited number of molecules have ever been studied with microwave techniques at wavelengths shorter than 0.3 millimeters. The situation with the infrared technique is similarly dire in the submillimeter and far infrared because the blackbody sources used compete with a very significant thermal background, making the signal to noise poor. Regardless of the technique used, the data must be archived in a way useful for the interpretation of observations.

  15. Decorin Content and Near Infrared Spectroscopy Analysis of Dried Collagenous Biomaterial Samples

    PubMed Central

    Aldema-Ramos, Mila L.; Castell, Joan Carles; Muir, Zerlina E.; Adzet, Jose Maria; Sabe, Rosa; Schreyer, Suzanne

    2012-01-01

    The efficient removal of proteoglycans, such as decorin, from the hide when processing it to leather by traditional means is generally acceptable and beneficial for leather quality, especially for softness and flexibility. A patented waterless or acetone dehydration method that can generate a product similar to leather called Dried Collagenous Biomaterial (known as BCD) was developed but has no effect on decorin removal efficiency. The Alcian Blue colorimetric technique was used to assay the sulfated glycosaminoglycan (sGAG) portion of decorin. The corresponding residual decorin content was correlated to the mechanical properties of the BCD samples and was comparable to the control leather made traditionally. The waterless dehydration and instantaneous chrome tanning process is a good eco-friendly alternative to transforming hides to leather because no additional effects were observed after examination using NIR spectroscopy and additional chemometric analysis. PMID:24970152

  16. Optimization technique for rolled edge control process based on the acentric tool influence functions.

    PubMed

    Du, Hang; Song, Ci; Li, Shengyi; Xu, Mingjin; Peng, Xiaoqiang

    2017-05-20

    In the process of computer controlled optical surfacing (CCOS), the uncontrollable rolled edge restricts further improvements of machining accuracy and efficiency. Two reasons are responsible for the rolled edge problem during small tool polishing. One is that the edge areas cannot be processed because of the orbit movement. The other is that the changing tool influence function (TIF) is difficult to compensate for in algorithms, since a pressure step appears in the local pressure distribution at the surface edge. In this paper, an acentric tool influence function (A-TIF) is designed to remove the rolled edge after CCOS polishing. The model of the A-TIF is analyzed theoretically, and a control point translation dwell time algorithm is used to verify that the full aperture of the workpiece can be covered by the peak removal point of the tool influence functions. Thus, surface residual error over the full aperture can be effectively corrected. Finally, experiments were carried out: two fused silica glass samples of 100 mm × 100 mm were polished by traditional CCOS and by the A-TIF method, respectively. The rolled edge was clearly produced in the sample polished by traditional CCOS, while the residual errors of the sample polished by the A-TIF method did not show this problem. Therefore, the rolled edge caused by the traditional CCOS process is successfully suppressed by the A-TIF process, and the ability of the designed A-TIF to suppress the rolled edge has been confirmed.

  17. Comparative study of presurgical hand hygiene with hydroalcoholic solution versus traditional presurgical hand hygiene.

    PubMed

    López Martín, M Beatriz; Erice Calvo-Sotelo, Alejo

    To compare presurgical hand hygiene with hydroalcoholic solution following the WHO protocol with traditional presurgical hand hygiene. Cultures of the hands of surgeons and surgical nurses were performed before and after presurgical hand hygiene and after removing gloves at the end of surgery. Cultures were done on 2 different days: the first day after traditional presurgical hand hygiene, and the second day after presurgical hand hygiene with hydroalcoholic solution following the WHO protocol. The duration of traditional hand hygiene was measured and compared with the duration (3 min) of the WHO protocol. The cost of the products used in the traditional technique was compared with the cost of the hydroalcoholic solution used. The variability of the traditional technique was determined by observation. Following presurgical hand hygiene with hydroalcoholic solution, colony-forming units (CFU) were detected in 5 (7.3%) subjects, whereas after traditional presurgical hand hygiene CFU were detected in 14 subjects (20.5%) (p < 0.05). After glove removal, the numbers of CFU were similar. The time employed in hand hygiene with hydroalcoholic solution (3 min) was shorter than the time employed in the traditional technique (p < 0.05), its cost was less than half, and there was no variability. Compared with other techniques, presurgical hand hygiene with hydroalcoholic solution significantly decreases CFU, has a similar latency time, a lower cost, and saves time. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  18. Toward Magnetorheological Finishing of Magnetic Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shafrir, S.N.; Lambropoulos, J.C.; Jacobs, S.D.

    2007-10-24

    Magnetorheological finishing (MRF) is a precision finishing process traditionally limited to processing only nonmagnetic materials, e.g., optical glasses, ceramics, polymers, and metals. Here we demonstrate that MRF can be used for material removal from magnetic material surfaces. Our approach is to place an MRF spot on machined surfaces of magnetic WC-Co materials. The resulting surface roughness is comparable to that produced on nonmagnetic materials. This spotting technique may be used to evaluate the depth of subsurface damage, or deformed layer, induced by earlier manufacturing steps, such as grinding and lapping.

  19. [Online endpoint detection algorithm for blending process of Chinese materia medica].

    PubMed

    Lin, Zhao-Zhou; Yang, Chan; Xu, Bing; Shi, Xin-Yuan; Zhang, Zhi-Qiang; Fu, Jing; Qiao, Yan-Jiang

    2017-03-01

    The blending process, an essential part of pharmaceutical preparation, has a direct influence on the homogeneity and stability of solid dosage forms. Since the official release of the Guidance for Industry on PAT, online process analysis techniques have been increasingly reported in applications to blending processes, but research on endpoint detection algorithms is still at an initial stage. By progressively increasing the window size of the moving block standard deviation (MBSD), a novel endpoint detection algorithm was proposed to extend plain MBSD from the off-line to the online scenario, and it was used to determine the endpoint in the blending process of Chinese medicine dispensing granules. Through online tuning of the window size, status changes of the materials during blending are reflected in the standard deviation calculation in real time. The proposed method was tested separately in the blending processes of dextrin and three extracts of traditional Chinese medicine. All of the results showed that, compared with the traditional MBSD method, the progressively increasing window size of the proposed method more clearly reflects the status changes of the materials in the blending process, so it is suitable for online application. Copyright© by the Chinese Pharmaceutical Association.
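    The endpoint logic can be sketched as follows. The window bounds and threshold are hypothetical illustrations of a moving block standard deviation whose window grows as data arrive, not the authors' tuned values:

    ```python
    from statistics import pstdev

    def mbsd_endpoint(signal, s_min=3, s_max=10, threshold=0.05):
        """Online blend-endpoint detection (illustrative sketch). At each
        new sample, compute the standard deviation over a trailing window
        whose size grows from `s_min` up to `s_max` as more data arrive
        (the progressively increasing window); declare the endpoint at the
        first sample where the MBSD drops below `threshold`."""
        for i in range(s_min, len(signal) + 1):
            window = min(s_max, i)              # progressively increasing window
            block = signal[i - window:i]        # most recent measurements
            if pstdev(block) < threshold:
                return i                        # blend judged homogeneous here
        return None                             # endpoint never reached
    ```

    The endpoint is declared once the trailing-window standard deviation of the online measurements (for example, scores derived from in-line spectra) falls below the homogeneity threshold; a still-mixing signal keeps the MBSD high and returns no endpoint.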

  20. Resolution-improved in situ DNA hybridization detection based on microwave photonic interrogation.

    PubMed

    Cao, Yuan; Guo, Tuan; Wang, Xudong; Sun, Dandan; Ran, Yang; Feng, Xinhuan; Guan, Bai-ou

    2015-10-19

    An in situ bio-sensing system with improved resolution, based on a microwave photonic filter (MPF) interrogation method, is proposed and experimentally demonstrated. A microfiber Bragg grating (mFBG) is used as the sensing probe for DNA hybridization detection. In contrast to the traditional wavelength-monitoring technique, we use a frequency interrogation scheme for resolution-improved bio-sensing detection. Experimental results show that the frequency shift of the MPF notch responds linearly to changes in the surrounding refractive index (SRI) over the range 1.33 to 1.38, with an SRI resolution of up to 2.6 × 10⁻⁵ RIU, almost two orders of magnitude better than the traditional fundamental-mode monitoring technique (~3.6 × 10⁻³ RIU). Owing to the high Q value (about 27), the whole process of DNA hybridization can be monitored in situ. The proposed MPF-based bio-sensing system provides a new frequency-domain interrogation method with improved sensing resolution and a rapid interrogation rate for biochemical and environmental measurement.

  1. Surveying for architectural students: as simple as possible - as much as necessary

    NASA Astrophysics Data System (ADS)

    Mayer, I.; Mitterecker, T.

    2017-08-01

    More and more, existing buildings - and particularly historic buildings - are becoming part of the daily business of every architect. Planning and designing in the field of architectural heritage requires not only knowledge of contemporary building techniques, design processes and national and international guidelines, but also a deep understanding of architectural heritage, its evolution and genesis, the building techniques that have been applied, the materials used, traditions, etc. In many cases, it is indispensable to perform a detailed building survey and building research to achieve an adequate design concept. The Department of History of Architecture and Building Archaeology of TU Wien has an extensive tradition of building research and, over the course of the past 10 years, has developed a teaching workflow to introduce architectural students to building archaeology and surveying methods for building research. A sophisticated, temporally interwoven combination of courses and lectures on different topics related to building archaeology and surveying rapidly gives the architectural students the right tools for this important but often neglected task.

  2. Parallel plan execution with self-processing networks

    NASA Technical Reports Server (NTRS)

    Dautrechy, C. Lynne; Reggia, James A.

    1989-01-01

    A critical issue for space operations is how to develop and apply advanced automation techniques to reduce the cost and complexity of working in space. In this context, it is important to examine how recent advances in self-processing networks can be applied for planning and scheduling tasks. For this reason, the feasibility of applying self-processing network models to a variety of planning and control problems relevant to spacecraft activities is being explored. Goals are to demonstrate that self-processing methods are applicable to these problems, and that MIRRORS/II, a general purpose software environment for implementing self-processing models, is sufficiently robust to support development of a wide range of application prototypes. Using MIRRORS/II and marker passing modelling techniques, a model of the execution of a Spaceworld plan was implemented. This is a simplified model of the Voyager spacecraft which photographed Jupiter, Saturn, and their satellites. It is shown that plan execution, a task usually solved using traditional artificial intelligence (AI) techniques, can be accomplished using a self-processing network. The fact that self-processing networks were applied to other space-related tasks, in addition to the one discussed here, demonstrates the general applicability of this approach to planning and control problems relevant to spacecraft activities. It is also demonstrated that MIRRORS/II is a powerful environment for the development and evaluation of self-processing systems.

  3. Comparison of a new hydro-surgical technique to traditional methods for the preparation of full-thickness skin grafts from canine cadaveric skin and report of a single clinical case.

    PubMed

    Townsend, F I; Ralphs, S C; Coronado, G; Sweet, D C; Ward, J; Bloch, C P

    2012-01-01

    To compare the hydro-surgical technique to traditional techniques for removal of subcutaneous tissue in the preparation of full-thickness skin grafts. Ex vivo experimental study and a single clinical case report. Four canine cadavers and a single clinical case. Four sections of skin were harvested from the lateral flank of recently euthanatized dogs. Traditional preparation methods included both a blade and a scissors technique, each of which was compared to the hydro-surgical technique individually. Preparation methods were compared based on length of time for removal of the subcutaneous tissue from the graft, histologic grading, and measurable thickness as compared to an untreated sample. The hydro-surgical technique had the shortest skin graft preparation time compared to the traditional techniques (p = 0.002). There was no significant difference in histological grading or measurable subcutaneous thickness between skin specimens. The hydro-surgical technique provides rapid, effective debridement of subcutaneous tissue in the preparation of full-thickness skin grafts, with no significant differences in histological grade or remaining subcutaneous tissue among treatment types. Additionally, the hydro-surgical technique was successfully used to prepare a full-thickness meshed free skin graft in the reconstruction of a traumatic medial tarsal wound in a dog.

  4. [Exploration and practice of genetics teaching assisted by network technology platform].

    PubMed

    Li, Ya-Xuan; Zhang, Fei-Xiong; Zhao, Xin; Cai, Min-Hua; Yan, Yue-Ming; Hu, Ying-Kao

    2010-04-01

    More teaching techniques have gradually emerged along with the development of new technologies. On the basis of traditional teaching methods, a new platform for the teaching process has been set up using network technology. In genetics teaching, the network platform can be used to guide students' study, promote their interest in learning, and encourage them to study independently. Years of exploration and application have shown that network teaching is one of the most useful methods and has unique advantages over traditional ones in genetics teaching. The establishment of the network teaching platform, its advantages and deficiencies, and relevant strategies are introduced in this paper.

  5. But Are They Learning? Getting Started in Classroom Evaluation

    PubMed Central

    Dancy, Melissa H; Beichner, Robert J

    2002-01-01

    There are increasing numbers of traditional biologists, untrained in educational research methods, who want to develop and assess new classroom innovations. In this article we argue the necessity of formal research over normal classroom feedback. We also argue that traditionally trained biologists can make significant contributions to biology pedagogy. We then offer some guidance to the biologist with no formal educational research training who wants to get started. Specifically, we suggest ways to find out what others have done, we discuss the difference between qualitative and quantitative research, and we elaborate on the process of gaining insights from student interviews. We end with an example of a project that has used many different research techniques. PMID:12459792

  6. Pharmaceutical drug marketing strategies and tactics: a comparative analysis of attitudes held by pharmaceutical representatives and physicians.

    PubMed

    Parker, R Stephen; Pettijohn, Charles E

    2005-01-01

    A variety of promotional strategies have been used to stimulate sales of pharmaceutical drugs. Traditionally, push techniques have been the predominant means of encouraging physicians to prescribe drugs and thus increase sales. Recently, the traditional push strategy has been supplemented by a pull strategy: direct-to-consumer advertising is increasingly used to encourage consumers to request advertised drugs from their physicians. This research compares the attitudes of two of the most affected participants in the prescription sales process: physicians and pharmaceutical sales representatives. The findings indicate differences between physicians and pharmaceutical sales representatives regarding the efficacy and ethical considerations of various promotional strategies.

  7. Focal brain lesions induced with ultraviolet irradiation.

    PubMed

    Nakata, Mariko; Nagasaka, Kazuaki; Shimoda, Masayuki; Takashima, Ichiro; Yamamoto, Shinya

    2018-05-22

    Lesion and inactivation methods have played important roles in neuroscience studies. However, traditional techniques for creating a brain lesion are highly invasive, and control of lesion size and shape using these techniques is not easy. Here, we developed a novel method for creating a lesion on the cortical surface via 365 nm ultraviolet (UV) irradiation without breaking the dura mater. We demonstrated that 2.0 mWh UV irradiation, but not the same amount of non-UV light irradiation, induced an inverted bell-shaped lesion with neuronal loss and accumulation of glial cells. Moreover, the volume of the UV irradiation-induced lesion depended on the UV light exposure amount. We further succeeded in visualizing the lesioned site in a living animal using magnetic resonance imaging (MRI). Importantly, we also observed using an optical imaging technique that the spread of neural activation evoked by adjacent cortical stimulation disappeared only at the UV-irradiated site. In summary, UV irradiation can induce a focal brain lesion with a stable shape and size in a less invasive manner than traditional lesioning methods. This method is applicable to not only neuroscientific lesion experiments but also studies of the focal brain injury recovery process.

  8. Instrumentation for studying binder burnout in an immobilized plutonium ceramic wasteform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, M; Pugh, D; Herman, C

    The Plutonium Immobilization Program produces a ceramic wasteform that utilizes organic binders. Several techniques and instruments were developed to study binder burnout on full-size ceramic samples in a production environment. This approach provides a method for developing process parameters at production scale to optimize throughput, product quality, offgas behavior, and plant emissions. These instruments allow for offgas analysis, large-scale TGA, product quality observation, and thermal modeling. Using these tools, results from lab-scale techniques such as laser dilatometry studies and traditional TGA/DTA analysis can be integrated. Often, the sintering step of a ceramification process is the limiting process step that controls the production throughput. Therefore, optimization of sintering behavior is important for overall process success. Furthermore, the capabilities of this instrumentation allow a better understanding of plant emissions of key gases: volatile organic compounds (VOCs), volatile inorganics including some halide compounds, NO{sub x}, SO{sub x}, carbon dioxide, and carbon monoxide.

  9. Trends in non-stationary signal processing techniques applied to vibration analysis of wind turbine drive train - A contemporary survey

    NASA Astrophysics Data System (ADS)

    Uma Maheswari, R.; Umamaheswari, R.

    2017-02-01

    Condition Monitoring Systems (CMS) offer substantial economic benefits and enable prognostic maintenance for wind turbine-generator failure prevention. Vibration monitoring and analysis is a powerful tool in drive train CMS, enabling the early detection of impending failure or damage. In variable speed drives such as wind turbine-generator drive trains, the acquired vibration signal is non-stationary and non-linear. Traditional stationary signal processing techniques are inefficient at diagnosing machine faults under time-varying conditions. Current research trends in CMS for drive trains focus on developing and improving non-linear, non-stationary feature extraction and fault classification algorithms to improve fault detection/prediction sensitivity and selectivity, thereby reducing misdetection and false alarm rates. In the literature, stationary signal processing algorithms employed in vibration analysis have been reviewed at great length. In this paper, an attempt is made to review the recent research advances in non-linear, non-stationary signal processing algorithms particularly suited to variable speed wind turbines.
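
    A minimal illustration of why stationary analysis falls short for variable speed drives: for a simulated speed sweep, a single whole-record FFT smears the energy, while short-time windows recover the drifting dominant frequency. The sweep range, sample rate, and window length below are arbitrary choices for the sketch, not values from any turbine.

```python
import numpy as np

fs = 1000.0                      # sample rate, Hz
t = np.arange(0.0, 2.0, 1 / fs)
f0, f1 = 50.0, 150.0             # simulated shaft-speed sweep, Hz
phase = 2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * t[-1]))
x = np.sin(phase)                # non-stationary "vibration" signal

def dominant_freq(segment, fs):
    """Dominant frequency of one short, Hann-windowed segment."""
    spectrum = np.abs(np.fft.rfft(segment * np.hanning(len(segment))))
    return np.fft.rfftfreq(len(segment), 1 / fs)[np.argmax(spectrum)]

early = dominant_freq(x[:256], fs)   # near the start of the sweep
late = dominant_freq(x[-256:], fs)   # near the end of the sweep
```

    Tracking `dominant_freq` over successive windows is the germ of the short-time techniques; the non-linear, non-stationary methods the survey reviews go well beyond this sketch.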

  10. The production of fine grained magnesium alloys through thermomechanical processing for the optimization of microstructural and mechanical properties

    NASA Astrophysics Data System (ADS)

    Young, John Paul

    The low density and high strength to weight ratio of magnesium alloys makes them ideal candidates to replace many of the heavier steel and aluminum alloys currently used in the automotive and other industries. Although cast magnesium alloys components have a long history of use in the automotive industry, the integration of wrought magnesium alloys components has been hindered by a number of factors. Grain refinement through thermomechanical processing offers a possible solution to many of the inherent problems associated with magnesium alloys. This work explores the development of several thermomechanical processing techniques and investigates their impact on the microstructural and mechanical properties of magnesium alloys. In addition to traditional thermomechanical processing, this work includes the development of new severe plastic deformation techniques for the production of fine grain magnesium plate and pipe and develops a procedure by which the thermal microstructural stability of severely plastically deformed microstructures can be assessed.

  11. Online differentiation of mineral phase in aerosol particles by ion formation mechanism using a LAAP-TOF single-particle mass spectrometer

    NASA Astrophysics Data System (ADS)

    Marsden, Nicholas A.; Flynn, Michael J.; Allan, James D.; Coe, Hugh

    2018-01-01

    Mineralogy of silicate mineral dust has a strong influence on climate and ecosystems due to variation in physiochemical properties that result from differences in composition and crystal structure (mineral phase). Traditional offline methods of analysing mineral phase are labour intensive and the temporal resolution of the data is much longer than many atmospheric processes. Single-particle mass spectrometry (SPMS) is an established technique for the online size-resolved measurement of particle composition by laser desorption ionisation (LDI) followed by time-of-flight mass spectrometry (TOF-MS). Although non-quantitative, the technique is able to identify the presence of silicate minerals in airborne dust particles from markers of alkali metals and silicate molecular ions in the mass spectra. However, the differentiation of mineral phase in silicate particles by traditional mass spectral peak area measurements is not possible. This is because instrument function and matrix effects in the ionisation process result in variations in instrument response that are greater than the differences in composition between common mineral phases. In this study, we introduce a novel technique that enables the differentiation of mineral phase in silicate mineral particles by ion formation mechanism, measured from subtle changes in ion arrival times at the TOF-MS detector. Using a combination of peak area and peak centroid measurements, we show that the arrangement of the interstitial alkali metals in the crystal structure, an important property in silicate mineralogy, influences the ion arrival times of elemental and molecular ion species in the negative ion mass spectra. A classification scheme is presented that allowed for the differentiation of illite-smectite, kaolinite and feldspar minerals on a single-particle basis. Online analysis of mineral dust aerosol generated from clay mineral standards produced mineral fractions that are in agreement with bulk measurements reported by traditional XRD (X-ray diffraction) analysis.
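
    The two measurements that the classification scheme combines can be sketched on a synthetic peak. The Gaussian peaks and time axis below are invented for illustration and have nothing to do with the instrument's actual calibration; they only show why a centroid shift is visible when peak areas are identical.

```python
import numpy as np

def peak_area_and_centroid(t, intensity):
    """Integrated peak area and intensity-weighted centroid (mean
    arrival time) of a single mass-spectral peak."""
    dt = t[1] - t[0]
    area = np.sum(intensity) * dt
    centroid = np.sum(t * intensity) / np.sum(intensity)
    return area, centroid

# Two synthetic peaks of identical area whose centroids differ slightly,
# mimicking an ion-formation shift that peak area alone cannot see.
t = np.linspace(0.0, 10.0, 2001)
peak_a = np.exp(-(t - 4.00) ** 2 / (2 * 0.1**2))
peak_b = np.exp(-(t - 4.05) ** 2 / (2 * 0.1**2))
area_a, cen_a = peak_area_and_centroid(t, peak_a)
area_b, cen_b = peak_area_and_centroid(t, peak_b)
```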

  12. Hepatitis Diagnosis Using Facial Color Image

    NASA Astrophysics Data System (ADS)

    Liu, Mingjia; Guo, Zhenhua

    Facial color diagnosis is an important diagnostic method in traditional Chinese medicine (TCM). However, due to its qualitative, subjective and experience-based nature, traditional facial color diagnosis has very limited application in clinical medicine. To circumvent the subjective and qualitative problems of facial color diagnosis in traditional Chinese medicine, in this paper we present a novel computer-aided facial color diagnosis method (CAFCDM). The method has three parts: a face image database, an image preprocessing module and a diagnosis engine. The face image database was built from a group of 116 patients affected by two kinds of liver disease and 29 healthy volunteers. Quantitative color features are extracted from the facial images by using popular digital image processing techniques. Then, a KNN classifier is employed to model the relationship between the quantitative color features and the diseases. The results show that the method can properly identify three groups: healthy, severe hepatitis with jaundice, and severe hepatitis without jaundice, with accuracy higher than 73%.
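
    The classification step can be sketched with a plain nearest-neighbour vote. The mean-RGB features and class labels below are fabricated toy values, not data from the study's subject database.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Majority vote among the k training samples nearest to x
    (Euclidean distance in feature space)."""
    distances = np.linalg.norm(X_train - x, axis=1)
    nearest_labels = y_train[np.argsort(distances)[:k]]
    values, counts = np.unique(nearest_labels, return_counts=True)
    return values[np.argmax(counts)]

# Toy mean-RGB facial colour features (invented for illustration).
X_train = np.array([
    [200.0, 150.0, 140.0],   # pinkish, healthy-looking complexion
    [205.0, 155.0, 145.0],   # pinkish, healthy-looking complexion
    [210.0, 190.0,  90.0],   # yellowish (jaundice-like)
    [215.0, 195.0,  95.0],   # yellowish (jaundice-like)
])
y_train = np.array(["healthy", "healthy", "jaundice", "jaundice"])
prediction = knn_predict(X_train, y_train, np.array([208.0, 188.0, 92.0]))
```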

  13. Pre-Nursing Students' Perceptions of Traditional and Inquiry Based Chemistry Laboratories

    NASA Astrophysics Data System (ADS)

    Rogers, Jessica

    This paper describes a process that attempted to meet the needs of undergraduate students in a pre-nursing chemistry class. The laboratory was taught in traditional verification style and students were surveyed to assess their perceptions of the educational goals of the laboratory. A literature review resulted in an inquiry based method and analysis of the needs of nurses resulted in more application based activities. This new inquiry format was implemented the next semester, the students were surveyed at the end of the semester and results were compared to the previous method. Student and instructor response to the change in format was positive. Students in the traditional format placed goals concerning technique above critical thinking and felt the lab was easy to understand and carry out. Students in the inquiry based lab felt they learned more critical thinking skills and enjoyed the independence of designing experiments and answering their own questions.

  14. Chemomics-based marker compounds mining and mimetic processing for exploring chemical mechanisms in traditional processing of herbal medicines, a continuous study on Rehmanniae Radix.

    PubMed

    Zhou, Li; Xu, Jin-Di; Zhou, Shan-Shan; Shen, Hong; Mao, Qian; Kong, Ming; Zou, Ye-Ting; Xu, Ya-Yun; Xu, Jun; Li, Song-Lin

    2017-12-29

    Exploring processing chemistry, in particular the chemical transformation mechanisms involved, is a key step to elucidate the scientific basis in traditional processing of herbal medicines. Previously, taking Rehmanniae Radix (RR) as a case study, the holistic chemome (secondary metabolome and glycome) difference between raw and processed RR was revealed by integrating hyphenated chromatographic techniques-based targeted glycomics and untargeted metabolomics. Nevertheless, the complex chemical transformation mechanisms underpinning the holistic chemome variation in RR processing remain to be extensively clarified. As a continuous study, here a novel strategy by combining chemomics-based marker compounds mining and mimetic processing is proposed for further exploring the chemical mechanisms involved in herbal processing. First, the differential marker compounds between raw and processed herbs were rapidly discovered by untargeted chemomics-based mining approach through multivariate statistical analysis of the chemome data obtained by integrated metabolomics and glycomics analysis. Second, the marker compounds were mimetically processed under the simulated physicochemical conditions as in the herb processing, and the final reaction products were chemically characterized by targeted chemomics-based mining approach. Third, the main chemical transformation mechanisms involved were clarified by linking up the original marker compounds and their mimetic processing products. Using this strategy, a set of differential marker compounds including saccharides, glycosides and furfurals in raw and processed RR was rapidly found, and the major chemical mechanisms involved in RR processing were elucidated as stepwise transformations of saccharides (polysaccharides, oligosaccharides and monosaccharides) and glycosides (iridoid glycosides and phenethylalcohol glycosides) into furfurals (glycosylated/non-glycosylated hydroxymethylfurfurals) by deglycosylation and/or dehydration. 
The research deliverables indicated that the proposed strategy could advance the understanding of RR processing chemistry, and therefore may be considered a promising approach for delving into the scientific basis in traditional processing of herbal medicines. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Water-Assisted Production of Thermoplastic Nanocomposites: A Review.

    PubMed

    Karger-Kocsis, József; Kmetty, Ákos; Lendvai, László; Drakopoulos, Stavros X; Bárány, Tamás

    2014-12-29

    Water-assisted, or more generally liquid-mediated, melt compounding of nanocomposites is essentially a combination of solution-assisted and traditional melt mixing, an emerging technique that overcomes several disadvantages of both. Water, or an aqueous liquid with additives, does not merely serve as a temporary carrier for suitable nanofillers: during batchwise and continuous compounding these liquids are fully or partly evaporated, and in the latter case the residual liquid acts as a plasticizer. This processing technique contributes to a better dispersion of the nanofillers and markedly affects the morphology and properties of the resulting nanocomposites. A survey is given below of current practice and possible future developments of water-assisted melt mixing techniques for the production of thermoplastic nanocomposites.

  16. Water-Assisted Production of Thermoplastic Nanocomposites: A Review

    PubMed Central

    Karger-Kocsis, József; Kmetty, Ákos; Lendvai, László; Drakopoulos, Stavros X.; Bárány, Tamás

    2014-01-01

    Water-assisted, or more generally liquid-mediated, melt compounding of nanocomposites is essentially a combination of solution-assisted and traditional melt mixing, an emerging technique that overcomes several disadvantages of both. Water, or an aqueous liquid with additives, does not merely serve as a temporary carrier for suitable nanofillers: during batchwise and continuous compounding these liquids are fully or partly evaporated, and in the latter case the residual liquid acts as a plasticizer. This processing technique contributes to a better dispersion of the nanofillers and markedly affects the morphology and properties of the resulting nanocomposites. A survey is given below of current practice and possible future developments of water-assisted melt mixing techniques for the production of thermoplastic nanocomposites. PMID:28787925

  17. Dual resin bonded joints in polyetheretherketone (PEEK) matrix composites

    NASA Astrophysics Data System (ADS)

    Zelenak, Steve; Radford, Donald W.; Dean, Michael W.

    1993-04-01

    The paper describes applications of the dual resin (miscible polymer) bonding technique (Smiley, 1989) developed as an alternative to traditional bonding approaches for joining thermoplastic matrix composite subassemblies into structures. In the experiments, the performance of joint geometries, such as those that could be used to assemble large truss structures in space, is investigated using truss joint models consisting of woven carbon fiber/PEEK tubes of about 1 mm wall thickness. Specific process conditions and hand-held hardware used to apply heat and pressure were chosen to simulate a field assembly technique. Results are presented on tube/cruciform double lap shear tests, pinned-pinned tube compression tests, and single lap shear bond tests of joints obtained using the dual resin bonding technique.

  18. Integrating multi-criteria techniques with geographical information systems in waste facility location to enhance public participation.

    PubMed

    Higgs, Gary

    2006-04-01

    Despite recent U.K. Government commitments to encourage public participation in environmental decision making, those exercises conducted to date have been largely confined to 'traditional' modes of participation such as the dissemination of information and the encouragement of feedback on proposals through, for example, questionnaires or surveys. It is the premise of this paper that IT-based participative approaches, combining geographical information systems (GIS) with multi-criteria evaluation techniques to involve the public in the decision-making process, have the potential to build consensus and to reduce disputes and conflicts such as those arising from the siting of different types of waste facilities. The potential of these techniques is documented through a review of the existing literature in order to highlight the opportunities and challenges facing decision makers in increasing the involvement of the public at different stages of the waste facility management process. It is concluded that there are important lessons to be learned by researchers, consultants, managers and decision makers if the barriers hindering the wider use of such techniques are to be overcome.
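
    The core GIS-plus-multi-criteria operation discussed in such reviews is often a weighted linear combination of normalised criterion rasters. A minimal sketch follows; the two criteria, their 3 × 3 rasters, and the weights are invented purely for illustration.

```python
import numpy as np

def suitability(criteria, weights):
    """Weighted linear combination: normalise each criterion raster to
    [0, 1], then blend with weights rescaled to sum to 1."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    normalised = np.stack(
        [(c - c.min()) / (c.max() - c.min()) for c in criteria]
    )
    return np.tensordot(w, normalised, axes=1)

# Toy rasters: distance from housing (larger is better for a waste
# facility) and road access score (larger is better).
distance = np.array([[1.0, 2.0, 3.0],
                     [2.0, 4.0, 6.0],
                     [3.0, 6.0, 9.0]])
access = np.array([[9.0, 6.0, 3.0],
                   [6.0, 4.0, 2.0],
                   [3.0, 2.0, 1.0]])
score = suitability([distance, access], weights=[0.6, 0.4])
```

    In a participative setting the weights are exactly what stakeholders negotiate; re-running the overlay with each group's weights makes the trade-offs explicit.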

  19. Lightweight and Statistical Techniques for Petascale Debugging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leaving a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems were purchased.
    We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that work efficiently at the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine learning based approaches to perform root cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundational infrastructure work on our MRNet multicast-reduction framework for scalability and on the Dyninst binary analysis and instrumentation toolkits.
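
    The scale-reduction idea, grouping tasks into equivalence classes by stack trace and attaching a traditional debugger only to sampled representatives, can be sketched as follows. The trace strings and task ids are invented stand-ins, not STAT output.

```python
import random
from collections import defaultdict

def equivalence_classes(task_traces):
    """Group task ids whose stack traces are identical."""
    classes = defaultdict(list)
    for task_id, trace in task_traces.items():
        classes[tuple(trace)].append(task_id)
    return dict(classes)

def sample_debug_targets(classes, seed=0):
    """Pick one representative task per class for the full debugger."""
    rng = random.Random(seed)
    return [rng.choice(tasks) for tasks in classes.values()]

# Hypothetical traces from six MPI tasks: most wait in a barrier,
# a minority is stuck elsewhere -- usually the interesting class.
traces = {
    0: ["main", "solver", "MPI_Barrier"],
    1: ["main", "solver", "MPI_Barrier"],
    2: ["main", "solver", "MPI_Barrier"],
    3: ["main", "io_flush", "write"],
    4: ["main", "solver", "MPI_Barrier"],
    5: ["main", "io_flush", "write"],
}
classes = equivalence_classes(traces)
targets = sample_debug_targets(classes)
```

    However many tasks the job has, the debugger is attached to only one target per class, which is what keeps the approach tractable at petascale.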

  20. INFORMATION STORAGE AND RETRIEVAL, A STATE-OF-THE-ART REPORT

    DTIC Science & Technology

    The objective of the study was to compile relevant background and interpretive material and prepare a state-of-the-art report which would put the...to-person communications. Section III presents basic IS and R concepts and techniques. It traces the history of traditional librarianship through...the process of communication between the originators and users of information. Section V categorizes the information system operations required to

  1. Introduction to bioengineering: melding of engineering and biological sciences.

    PubMed

    Shoureshi, Rahmat A

    2005-04-01

    Engineering has traditionally focused on the external extensions of organisms, such as transportation systems, high-rise buildings, and entertainment systems. In contrast, bioengineering is concerned with inward processes of biologic organisms. Utilization of engineering principles and techniques in the analysis and solution of problems in medicine and biology is the basis for bioengineering. This article discusses subspecialties in bioengineering and presents examples of projects in this discipline.

  2. 3D direct writing fabrication of electrodes for electrochemical storage devices

    NASA Astrophysics Data System (ADS)

    Wei, Min; Zhang, Feng; Wang, Wei; Alexandridis, Paschalis; Zhou, Chi; Wu, Gang

    2017-06-01

    Among different printing techniques, direct ink writing is commonly used to fabricate 3D battery and supercapacitor electrodes. The major advantages of direct ink writing include effectively building 3D structures for energy storage devices and providing higher power density and higher energy density than traditional techniques due to the increased surface area of the electrode. Nevertheless, direct ink writing places high demands on the printing inks, which require high viscosity, high yield stress under shear and compression, and well-controlled viscoelasticity. Recently, a number of 3D-printed energy storage devices have been reported, and it is very important to understand the printing process and the ink preparation process for further material design and technology development. We discuss the current progress of direct ink writing technologies using various electrode materials including carbon nanotube-based materials, graphene-based materials, LTO (Li4Ti5O12), LFP (LiFePO4), LiMn1-xFexPO4, and Zn-based metallic oxides. In terms of achieved electrochemical performance, these 3D-printed devices are comparable to energy storage devices fabricated using traditional methods, while still leaving large room for further improvement. Finally, perspectives are provided on potential future directions of 3D printing for all-solid-state electrochemical energy storage devices.
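
    The ink requirements named above, a high yield stress combined with shear-thinning flow, are commonly captured by a Herschel-Bulkley model. A minimal sketch follows; the parameter values are invented for illustration and do not describe any particular ink from the literature.

```python
def herschel_bulkley_stress(shear_rate, tau_y, K, n):
    """Shear stress of a yield-stress fluid: tau = tau_y + K * rate**n
    (n < 1 gives the shear-thinning behaviour printable inks need)."""
    return tau_y + K * shear_rate**n

def apparent_viscosity(shear_rate, tau_y, K, n):
    """Stress divided by shear rate: high near rest (shape retention
    after deposition), low in the nozzle (extrudability)."""
    return herschel_bulkley_stress(shear_rate, tau_y, K, n) / shear_rate

# Illustrative parameters: tau_y in Pa, K in Pa*s**n, dimensionless n.
TAU_Y, K, N = 200.0, 30.0, 0.4
eta_rest = apparent_viscosity(1.0, TAU_Y, K, N)      # near rest
eta_nozzle = apparent_viscosity(100.0, TAU_Y, K, N)  # in the nozzle
```

    The large drop in apparent viscosity between rest and nozzle shear rates is the qualitative behaviour a printable electrode ink is formulated for.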

  3. Study of supersonic plasma technology jets

    NASA Astrophysics Data System (ADS)

    Selezneva, Svetlana; Gravelle, Denis; Boulos, Maher; van de Sanden, Richard; Schram, Dc

    2001-10-01

    Recently, several new techniques using remote thermal plasma for thin-film deposition and plasma chemistry processes have been developed. These techniques include PECVD of diamond, diamond-like and polymer films, and a-C:H and a-Si:H films. The latter are of special interest because of their applications in the solar cell production industry. In remote plasma deposition, a thermal plasma is formed by means of one of the traditional plasma sources. The chamber pressure is reduced with the help of continuous pumping, so that the flow is accelerated to supersonic speed. The plasma expansion is controlled using a specific torch nozzle design. To optimize the deposition process, detailed knowledge of the gas-dynamic structure of the jet and the chemical kinetics mechanisms is required. In this paper, we show how the flow pattern and the character of the deviations from local thermodynamic equilibrium differ in plasmas generated by different plasma sources, such as an induction plasma torch, a traditional direct-current arc, and a cascaded arc. We study the effects of the chamber pressure, nozzle design and carrier gas on the resulting plasma properties. The analysis is performed by means of numerical modeling using the commercially available FLUENT program with incorporated user-defined subroutines for a two-temperature model. The results of the continuum mechanics approach are compared with those of the kinetic Monte Carlo method and with experimental data.

  4. Automatic optimization of well locations in a North Sea fractured chalk reservoir using a front tracking reservoir simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rian, D.T.; Hage, A.

    1994-12-31

    A numerical simulator is often used as a reservoir management tool. One of its main purposes is to aid in the evaluation of the number of wells, well locations and start times for wells. Traditionally, the optimization of a field development is done by a manual trial-and-error process. In this paper, an example of an automated technique is given. The core of the automation process is the reservoir simulator Frontline. Frontline is based on front tracking techniques, which make it fast and accurate compared to traditional finite-difference simulators. Due to its CPU efficiency, the simulator has been coupled with an optimization module, which enables automatic optimization of the location of wells, the number of wells and start-up times. The simulator was used as an alternative method in the evaluation of waterflooding in a North Sea fractured chalk reservoir. Since Frontline, in principle, is 2D, Buckley-Leverett pseudo functions were used to represent the third dimension. The areal full-field simulation model was run with up to 25 wells for 20 years in less than one minute of Vax 9000 CPU time. The automatic Frontline evaluation indicated that a peripheral waterflood could double incremental recovery compared to a central pattern drive.
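    Coupling a fast simulator with an optimization module lends itself to a simple search loop. The sketch below is only a hedged illustration of the general idea (the abstract does not describe Frontline's optimizer in detail): `simulate` is a hypothetical callable standing in for a fast reservoir simulation that maps a set of well sites to a predicted recovery, and the loop greedily adds the well that most improves that prediction.

    ```python
    import itertools

    def optimize_wells(candidate_sites, n_wells, simulate):
        """Greedy well-placement search (illustrative only).

        candidate_sites: list of possible well locations.
        simulate: hypothetical stand-in for a fast reservoir simulator;
                  maps a tuple of well sites to predicted recovery.
        """
        chosen = []
        remaining = list(candidate_sites)
        for _ in range(n_wells):
            # Add, one well at a time, the site that most improves
            # the predicted recovery of the current configuration.
            best = max(remaining, key=lambda s: simulate(tuple(chosen) + (s,)))
            chosen.append(best)
            remaining.remove(best)
        return chosen, simulate(tuple(chosen))
    ```

    A CPU-cheap simulator is what makes such an exhaustive inner loop affordable at all; with a conventional finite-difference simulator, each `simulate` call would be far too expensive to place inside a search.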

  5. Spinal cord injuries functional rehabilitation - Traditional approaches and new strategies in physiotherapy.

    PubMed

    de Almeida, Patrícia Maria Duarte

    2006-02-01

    Considering the loss of function of body structures and systems after a spinal cord injury, with its respective activity limitations and restrictions on social participation, the goals of the rehabilitation process are to achieve the maximal functional independence and quality of life allowed by the clinical lesion. This requires a rehabilitation period with a rehabilitation team, including the physiotherapist, whose interventions will depend on factors such as the degree of completeness or incompleteness of the lesion and the patient's clinical stage. The physiotherapy approach includes several procedures and techniques related either to a traditional model or to the recent perspective of neuronal regeneration. Following a traditional model, the intervention in complete (A) and incomplete (B) lesions is based on a compensatory method of functional rehabilitation using the non-affected muscles. In incomplete (C and D) lesions, motor re-education below the lesion, using key points to facilitate normal and selective patterns of movement, is preferable. Alternatively, if neuronal regeneration is possible, with respective functional improvement, the goals of the physiotherapy approach are to maintain muscle trophism and improve the recruitment of motor units using intensive techniques. In both cases, there is no scientific evidence to support the procedures; there is a lack of investigation and most of the research is methodologically poor. © 2006 Sociedade Portuguesa de Pneumologia/SPP.

  6. Assessment of three-dimensional high-definition visualization technology to perform microvascular anastomosis.

    PubMed

    Wong, Alex K; Davis, Gabrielle B; Nguyen, T JoAnna; Hui, Kenneth J W S; Hwang, Brian H; Chan, Linda S; Zhou, Zhao; Schooler, Wesley G; Chandrasekhar, Bala S; Urata, Mark M

    2014-07-01

    Traditional visualization techniques in microsurgery require strict positioning in order to maintain the field of visualization. However, static posturing over time may lead to musculoskeletal strain and injury. Three-dimensional high-definition (3DHD) visualization technology may be a useful adjunct for limiting static posturing and improving ergonomics in microsurgery. In this study, we aimed to investigate the benefits of using the 3DHD technology over traditional techniques. A total of 14 volunteers, consisting of novice and experienced microsurgeons, performed femoral anastomoses on male Sprague-Dawley retired breeder rats using both traditional techniques and the 3DHD technology, and compared the two. Participants subsequently completed a questionnaire regarding their preference in terms of operational parameters, ergonomics, overall quality, and educational benefits. Efficiency was also evaluated by the mean time to complete the anastomosis with each technique. A total of 27 anastomoses were performed, 14 of 14 using the traditional microscope and 13 of 14 using the 3DHD technology. Preference toward the traditional modality was noted with respect to the parameters of precision, field adjustments, zoom and focus, depth perception, and overall quality. The 3DHD technique was preferred for improved stamina and less back and eye strain. Participants believed that the 3DHD technique was the better method for learning microsurgery. A longer mean time to anastomosis completion was noted in participants utilizing the 3DHD technique. The 3DHD technology may prove valuable in improving ergonomics in microsurgery. In addition, it may be useful in medical education when applied to the learning of new microsurgical skills. More studies are warranted to determine its efficacy and safety in a clinical setting. Copyright © 2014 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  7. Application of Ensemble Detection and Analysis to Modeling Uncertainty in Non Stationary Process

    NASA Technical Reports Server (NTRS)

    Racette, Paul

    2010-01-01

    Characterization of nonstationary and nonlinear processes is a challenge in many engineering and scientific disciplines. Climate change modeling and projection, retrieving information from Doppler measurements of hydrometeors, and modeling calibration architectures and algorithms in microwave radiometers are example applications that can benefit from improvements in the modeling and analysis of nonstationary processes. Analyses of measured signals have traditionally been limited to a single measurement series. Ensemble detection is a technique whereby mixing calibrated noise into the measurement produces an ensemble measurement set. The collection of ensemble data sets enables new methods for analyzing random signals and offers powerful new approaches to studying and analyzing nonstationary processes. The derived information contained in the dynamic stochastic moments of a process will enable many novel applications.

  8. In situ TEM near-field optical probing of nanoscale silicon crystallization.

    PubMed

    Xiang, Bin; Hwang, David J; In, Jung Bin; Ryu, Sang-Gil; Yoo, Jae-Hyuck; Dubon, Oscar; Minor, Andrew M; Grigoropoulos, Costas P

    2012-05-09

    Laser-based processing enables a wide variety of device configurations comprising thin films and nanostructures on sensitive, flexible substrates that are not possible with more traditional thermal annealing schemes. In near-field optical probing, only small regions of a sample are illuminated by the laser beam at any given time. Here we report a new technique that couples the optical near-field of the laser illumination into a transmission electron microscope (TEM) for real-time observations of the laser-materials interactions. We apply this technique to observe the transformation of an amorphous confined Si volume to a single crystal of Si using laser melting. By confinement of the material volume to nanometric dimensions, the entire amorphous precursor is within the laser spot size and transformed into a single crystal. This observation provides a path for laser processing of single-crystal seeds from amorphous precursors, a potentially transformative technique for the fabrication of solar cells and other nanoelectronic devices.

  9. Artifact Noise Removal Techniques on Seismocardiogram Using Two Tri-Axial Accelerometers

    PubMed Central

    Luu, Loc; Dinh, Anh

    2018-01-01

    The aim of this study is to investigate motion noise removal techniques using a two-accelerometer sensor system and various placements of the sensors during gentle movement and walking of the patients. A Wi-Fi based data acquisition system and a framework in Matlab were developed to collect and process data while the subjects are in motion. The tests include eight volunteers who have no record of heart disease. The walking and running data of the subjects are analyzed to find the minimal-noise bandwidth of the SCG signal. This bandwidth is used to design filters for the motion noise removal techniques and peak signal detection. There are two main techniques for combining signals from the two sensors to mitigate the motion artifact: analog processing and digital processing. The analog processing comprises analog circuits performing adding or subtracting functions and a bandpass filter to remove artifact noise before entering the data acquisition system. The digital processing processes all the data using combinations of total acceleration and z-axis-only acceleration. The two techniques are tested on three placements of the accelerometer sensors, including horizontal, vertical, and diagonal, during gentle motion and walking. In general, total acceleration and z-axis acceleration are the best techniques for dealing with gentle motion on all sensor placements, improving the average systolic signal-to-noise ratio (SNR) around 2 times and the average diastolic SNR around 3 times compared to traditional methods using only one accelerometer. With walking motion, the ADDER and z-axis acceleration are the best techniques on all placements of the sensors on the body, enhancing the average systolic SNR about 7 times and the average diastolic SNR about 11 times compared to the one-accelerometer method. Among the sensor placements, the performance of horizontal placement of the sensors is outstanding compared with the other positions for all motions. PMID:29614821
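    The digital combination techniques described above (ADDER-style summing of the two channels, total acceleration, and band-limiting to the minimal-noise SCG band) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' Matlab code; the 5-30 Hz band and the filter order are assumed values for the sketch, not figures taken from the paper.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def bandpass(sig, fs, lo=5.0, hi=30.0, order=4):
        # Band-limit to an assumed minimal-motion-noise SCG band
        # (lo-hi Hz are illustrative values, not the paper's).
        b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        return filtfilt(b, a, sig)

    def adder_combine(z1, z2, fs):
        # ADDER-style combination emulated digitally: summing the two
        # z-axis channels reinforces the common cardiac signal while
        # uncorrelated motion artifact partially averages out.
        return bandpass(z1 + z2, fs)

    def total_acceleration(xyz, fs):
        # Total-acceleration technique: magnitude of the tri-axial
        # vector (N x 3 array), band-limited to the SCG band.
        mag = np.sqrt((xyz ** 2).sum(axis=1))
        return bandpass(mag - mag.mean(), fs)
    ```

    The same structure applies whether the combination is done in analog circuitry before acquisition or digitally afterwards; only the point in the chain where the adder and bandpass sit changes.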

  10. [Studies on preparation of sustained-release Shuxiong formulation, a traditional Chinese medicine compound recipe, using time-controlled release techniques].

    PubMed

    Song, Hong-Tao; Zhang, Qian; Jiang, Peng; Guo, Tao; Chen, Da-Wei; He, Zhong-Gui

    2006-09-01

    To prepare a sustained-release formulation of a traditional Chinese medicine compound recipe by adopting time-controlled release techniques. Shuxiong tablets were chosen as the model drug. The prescription and technique of the core tablets were formulated using the disintegration time and swelling volume of the core tablets in water as indices. The time-controlled release tablets were prepared by press-coating techniques, using PEG6000, HCO and EVA as coating materials. The influences of composition, preparation process and in vitro dissolution conditions on the lag time (T(lag)) of drug release were investigated. The composition of the core tablets was as follows: 30% drug, 50% MCC and 20% CMS-Na. The T(lag) of the time-controlled release tablets was altered remarkably by the PEG6000 content of the outer layer, the amount of the outer layer and the hardness of the tablet. The viscosity of the dissolution media and the basket rotation had less influence on the T(lag) but more on the rate of drug release. The core tablets pressed with the optimized composition had preferable swelling and disintegrating properties. The Shuxiong sustained-release formulation, which contained a core tablet and two kinds of time-controlled release tablets with T(lag) of 3 h and 6 h, could release drug successively at 0 h, 3 h and 6 h in vitro. The technique made it possible for various components with extremely different physicochemical properties in these preparations to be released synchronously.

  11. Generation of programmable temporal pulse shape and applications in micromachining

    NASA Astrophysics Data System (ADS)

    Peng, X.; Jordens, B.; Hooper, A.; Baird, B. W.; Ren, W.; Xu, L.; Sun, L.

    2009-02-01

    In this paper we present a pulse shaping technique for regular solid-state lasers and its application to semiconductor micromachining. With a conventional Q-switched laser, all of the parameters can be adjusted over only limited ranges, especially the pulse width and pulse shape. However, some laser link processes using traditional laser pulses, with pulse widths of a few nanoseconds to a few tens of nanoseconds, tend to over-crater thicker overlying passivation layers and thereby cause IC reliability problems. Use of a laser pulse with a special shape and a fast leading edge, such as a tailored pulse, is one technique for controlling link processing. The pulse shaping technique is based on light-loop-controlled optical modulation to shape conventional Q-switched solid-state laser pulses. One advantage of the pulse shaping technique is that it provides a tailored pulse shape that can be programmed to have more than one amplitude value. Moreover, it has the capability of providing programmable tailored pulse shapes with discrete amplitude and time-duration components. In addition, it provides fast rise and fall times for each pulse at a fairly high repetition rate at 355 nm with good beam quality. The regular-to-shaped efficiency is up to 50%. We conclude with a discussion of current results for laser processing of semiconductor memory link structures using programmable temporal pulse shapes. The processing experiments showed promising results with shaped pulses.

  12. Minimizing Alteration of Posterior Tibial Slope During Opening Wedge High Tibial Osteotomy: a Protocol with Experimental Validation in Paired Cadaveric Knees

    PubMed Central

    Westermann, Robert W; DeBerardino, Thomas; Amendola, Annunziato

    2014-01-01

    Introduction: High tibial osteotomy (HTO) is a reliable procedure for addressing unicompartmental arthritis with associated coronal deformities. With osteotomy of the proximal tibia, there is a risk of altering the tibial slope in the sagittal plane. Surgical techniques continue to evolve, with trends towards procedure reproducibility and simplification. We evaluated a modification of the Arthrex iBalance technique in 18 paired cadaveric knees with the goals of maintaining sagittal slope, increasing procedure efficiency, and decreasing use of intraoperative fluoroscopy. Methods: Nine paired cadaveric knees (18 legs) underwent iBalance medial opening wedge high tibial osteotomies. In each pair, the right knee underwent an HTO using the modified technique, while all left knees underwent the traditional technique. Independent observers evaluated postoperative factors including tibial slope, placement of the hinge pin, and implant placement. Specimens were then dissected to evaluate for any gross muscle, nerve or vessel injury. Results: Changes to posterior tibial slope were similar with each technique. The change in slope with the traditional iBalance technique was -0.3° ±2.3° and with the modified iBalance technique was -0.4° ±2.3° (p=0.29). Furthermore, we detected no differences in posterior tibial slope between preoperative and postoperative specimens (p=0.74 traditional, p=0.75 modified). No differences in implant placement were detected between the traditional and modified techniques (p=0.85). No intraoperative iatrogenic complications (i.e. lateral cortex fracture, blood vessel or nerve injury) were observed in either group after gross dissection. Discussion & Conclusions: Alterations in posterior tibial slope are associated with HTOs. Both traditional and modified iBalance techniques appear reliable for coronal plane corrections without changing posterior tibial slope.
The present modification of the Arthrex iBalance technique may increase the efficiency of the operation and decrease radiation exposure to patients without compromising implant placement or global knee alignment. PMID:25328454

  13. GROUND WATER MONITORING AND SAMPLING: MULTI-LEVEL VERSUS TRADITIONAL METHODS – WHAT’S WHAT?

    EPA Science Inventory

    Recent studies have been conducted to evaluate different sampling techniques for determining VOC concentrations in groundwater. Samples were obtained using multi-level and traditional sampling techniques in three monitoring wells at the Raymark Superfund site in Stratford, CT. Ve...

  14. Spectroscopic analysis technique for arc-welding process control

    NASA Astrophysics Data System (ADS)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. In particular, it has been demonstrated that analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable, as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks of multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple-peak analysis is less than 20 ms running on a conventional PC.
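    The idea of replacing iterative Voigt fitting with interpolation-based sub-pixel peak location can be illustrated simply. The sketch below uses a plain cubic-spline maximum search as a stand-in; it is not the LPO algorithm from the paper, only a minimal example of locating a peak centre at finer-than-pixel resolution on a sampled spectrum.

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    def subpixel_peak(wavelength, intensity, oversample=20):
        # Fit a cubic spline through the sampled spectrum and locate its
        # maximum on a finer wavelength grid, giving a sub-pixel estimate
        # of the peak centre without any iterative nonlinear fitting.
        cs = CubicSpline(wavelength, intensity)
        fine = np.linspace(wavelength[0], wavelength[-1],
                           oversample * len(wavelength))
        smooth = cs(fine)
        i = int(np.argmax(smooth))
        return fine[i], smooth[i]  # estimated centre and height
    ```

    A non-iterative estimator like this has a fixed, predictable cost per peak, which is the property that matters for the real-time constraint discussed in the abstract; Levenberg-Marquardt fits, by contrast, take a variable number of iterations per peak.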

  15. Process tool monitoring and matching using interferometry technique

    NASA Astrophysics Data System (ADS)

    Anberg, Doug; Owen, David M.; Mileham, Jeffrey; Lee, Byoung-Ho; Bouche, Eric

    2016-03-01

    The semiconductor industry makes dramatic device technology changes over short time periods. As the semiconductor industry advances towards the 10 nm device node, more precise management and control of processing tools has become a significant manufacturing challenge. Some processes require multiple tool sets and some tools have multiple chambers for mass production. Tool and chamber matching has become a critical consideration for meeting today's manufacturing requirements. Additionally, process tool and chamber conditions have to be monitored to ensure uniform process performance across the tool and chamber fleet. There are many parameters for managing and monitoring tools and chambers. Particle defect monitoring is a well-known and established example, where defect inspection tools can directly detect particles on the wafer surface. However, leading-edge processes are driving the need to also monitor invisible defects (e.g. stress, contamination), because some device failures cannot be directly correlated with traditional visualized defect maps or other known sources. Some failure maps show the same signatures as stress or contamination maps, which implies a correlation with device performance or yield. In this paper we present process tool monitoring and matching using an interferometry technique. There are many types of interferometry techniques used for various process monitoring applications. We use a Coherent Gradient Sensing (CGS) interferometer, which is self-referencing and enables high-throughput measurements. Using this technique, we can quickly measure the topography of an entire wafer surface and obtain stress and displacement data from the topography measurement. For improved tool and chamber matching and reduced device failure, wafer stress measurements can be implemented as a regular tool or chamber monitoring test, for either unpatterned or patterned wafers, as a good criterion for improved process stability.

  16. Role of traditional healers in psychosocial support in caring for the orphans: a case of Dar-es Salaam City, Tanzania.

    PubMed

    Kayombo, Edmund J; Mbwambo, Zakaria H; Massila, Mariam

    2005-07-29

    Orphans are an increasing problem in developing countries, particularly in Africa, due to the HIV/AIDS pandemic, and intervention processes need a collective effort that includes all stakeholders right from the grass-roots level. This paper attempts to present the role of traditional healers in psychosocial support for orphan children in Dar-es-Salaam City, with special focus on those whose parents have died because of HIV/AIDS. Six traditional healers who were involved in taking care of orphans were visited at their "vilinge" (traditional clinics). In total they had 72 orphans, 31 boys and 41 girls, with an age range from 3 to 19 years. It was learned that traditional healers, besides providing remedies for the illnesses/diseases of orphans, also provided other basic needs. Further, they even provided psychosocial support, allowing children to cope with orphanhood with ease. Traditional healers live within communities at the grass-roots level and appear to be unnoticed, hidden forces involved in taking care of orphans. This role of traditional healers in taking care of orphans needs to be recognised and even scaled up by empowering them both financially and through training in basic psychosocial techniques for handling orphans, in order to reduce discrimination and stigmatisation in the communities where they live.

  17. Super-smooth processing x-ray telescope application research based on the magnetorheological finishing (MRF) technology

    NASA Astrophysics Data System (ADS)

    Zhong, Xianyun; Hou, Xi; Yang, Jinshan

    2016-09-01

    Nickel is a unique material for X-ray telescopes. It has typical soft-material characteristics: low hardness, high surface damage and low thermal stability. Traditional fabrication techniques suffer from many problems, including severe surface scratches, high sub-surface damage, poor surface roughness and so on. The current fabrication technology for nickel aspherics mainly adopts single-point diamond turning (SPDT), which has many advantages, such as high efficiency, ultra-precision surface figure and low sub-surface damage. However, the residual surface texture of SPDT causes large scattering losses and falls far short of the requirements of X-ray applications. This paper mainly investigates magnetorheological finishing (MRF) techniques for super-smooth processing of nickel optics. Through the study of MRF polishing techniques, we obtained an ideal super-smooth polishing technique based on the self-controlled MRF fluid NS-1, and achieved a high-precision surface figure better than RMS λ/80 (λ=632.8 nm), with super-smooth roughness better than Ra 0.3 nm on a plane reflector and better than Ra 0.4 nm on a convex cone. This study of MRF techniques significantly advances the state of the art in nickel material processing for X-ray optical system applications.

  18. Comparison of repair techniques in small and medium-sized rotator cuff tears in cadaveric sheep shoulders.

    PubMed

    Onay, Ulaş; Akpınar, Sercan; Akgün, Rahmi Can; Balçık, Cenk; Tuncay, Ismail Cengiz

    2013-01-01

    The aim of this study was to compare new knotless single-row and double-row suture anchor techniques with traditional transosseous suture techniques for different-sized rotator cuff tears in an animal model. The study included 56 cadaveric sheep shoulders. Supraspinatus cuff tears of 1 cm repaired with the new knotless single-row suture anchor technique, and supraspinatus and infraspinatus rotator cuff tears of 3 cm repaired with the double-row suture anchor technique, were compared to traditional transosseous suture techniques and control groups. The repaired tendons were loaded at a static velocity of 5 mm/min with a 2.5 kgN load cell in an Instron 8874 machine until repair failure. The 1 cm transosseous group was statistically superior to the 1 cm control group (p=0.021, p<0.05) and the 3 cm SpeedBridge group was statistically superior to the 1 cm SpeedFix group (p=0.012, p<0.05). The differences between the other groups were not statistically significant. No significant difference was found between the new knotless suture anchor techniques and traditional transosseous suture techniques.

  19. Direct-Write Printing on Three-Dimensional Geometries for Miniaturized Detector and Electronic Assemblies

    NASA Technical Reports Server (NTRS)

    Paquette, Beth; Samuels, Margaret; Chen, Peng

    2017-01-01

    Direct-write printing techniques will enable new detector assemblies that were not previously possible with traditional assembly processes. Detector concepts were manufactured using this technology to validate repeatability. Additional detector applications and printed wires on a 3-dimensional magnetometer bobbin will be designed for print. This effort focuses on evaluating the performance of direct-write manufacturing techniques on 3-dimensional surfaces. Direct-write manufacturing has the potential to reduce the mass and volume of fabrication and assembly for advanced detector concepts by reducing trace widths down to 10 microns, printing on complex geometries, allowing new electronic concept production, and reducing production times for these complex electronics.

  20. An Autonomous Sensor System Architecture for Active Flow and Noise Control Feedback

    NASA Technical Reports Server (NTRS)

    Humphreys, William M, Jr.; Culliton, William G.

    2008-01-01

    Multi-channel sensor fusion represents a powerful technique to simply and efficiently extract information from complex phenomena. While the technique has traditionally been used for military target tracking and situational awareness, a study has been successfully completed that demonstrates that sensor fusion can be applied equally well to aerodynamic applications. A prototype autonomous hardware processor was successfully designed and used to detect in real-time the two-dimensional flow reattachment location generated by a simple separated-flow wind tunnel model. The success of this demonstration illustrates the feasibility of using autonomous sensor processing architectures to enhance flow control feedback signal generation.

  1. Optimal focal-plane restoration

    NASA Technical Reports Server (NTRS)

    Reichenbach, Stephen E.; Park, Stephen K.

    1989-01-01

    Image restoration can be implemented efficiently by calculating the convolution of the digital image and a small kernel during image acquisition. Processing the image in the focal-plane in this way requires less computation than traditional Fourier-transform-based techniques such as the Wiener filter and constrained least-squares filter. Here, the values of the convolution kernel that yield the restoration with minimum expected mean-square error are determined using a frequency analysis of the end-to-end imaging system. This development accounts for constraints on the size and shape of the spatial kernel and all the components of the imaging system. Simulation results indicate the technique is effective and efficient.
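    The core claim, that restoration can be done as one small spatial convolution instead of a Fourier-domain filter, can be illustrated as follows. The 3x3 kernel here is only a hedged example of a mild inverse-blur kernel; the paper's minimum-MSE kernel is derived from a frequency analysis of the end-to-end imaging system, which is not reproduced in this sketch.

    ```python
    import numpy as np
    from scipy.signal import convolve2d

    def restore_focal_plane(image, kernel):
        # Focal-plane restoration as described in the abstract: a single
        # small-kernel convolution, far cheaper than a full FFT-based
        # Wiener or constrained least-squares filter for small kernels.
        return convolve2d(image, kernel, mode="same", boundary="symm")

    # Illustrative 3x3 sharpening kernel with unit DC gain, approximating
    # the inverse of a mild blur (an assumption for this sketch, not the
    # optimal kernel from the paper).
    K = np.array([[ 0.0, -0.25,  0.0],
                  [-0.25,  2.0, -0.25],
                  [ 0.0, -0.25,  0.0]])
    ```

    For an N x N image, a k x k kernel costs O(N²k²) multiply-adds and can be applied line by line during readout, whereas an FFT-based filter costs O(N² log N) and needs the whole frame in memory, which is the efficiency argument the abstract makes.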

  2. Using Movies to Analyse Gene Circuit Dynamics in Single Cells

    PubMed Central

    Locke, James CW; Elowitz, Michael B

    2010-01-01

    Preface Many bacterial systems rely on dynamic genetic circuits to control critical processes. A major goal of systems biology is to understand these behaviours in terms of individual genes and their interactions. However, traditional techniques based on population averages wash out critical dynamics that are either unsynchronized between cells or driven by fluctuations, or ‘noise,’ in cellular components. Recently, the combination of time-lapse microscopy, quantitative image analysis, and fluorescent protein reporters has enabled direct observation of multiple cellular components over time in individual cells. In conjunction with mathematical modelling, these techniques are now providing powerful insights into genetic circuit behaviour in diverse microbial systems. PMID:19369953

  3. Cognitive balanced model: a conceptual scheme of diagnostic decision making.

    PubMed

    Lucchiari, Claudio; Pravettoni, Gabriella

    2012-02-01

    Diagnostic reasoning is a critical aspect of clinical performance, having a high impact on quality and safety of care. Although diagnosis is fundamental in medicine, we still have a poor understanding of the factors that determine its course. According to traditional understanding, all information used in diagnostic reasoning is objective and logically driven. However, these conditions are not always met. Although we would be less likely to make an inaccurate diagnosis when following rational decision making, as described by normative models, the real diagnostic process works in a different way. Recent work has described the major cognitive biases in medicine as well as a number of strategies for reducing them, collectively called debiasing techniques. However, advances have encountered obstacles to implementation in clinical practice. While traditional understanding of clinical reasoning has failed to consider contextual factors, most debiasing techniques seem to fall short of producing sounder and safer medical practice. Technological solutions, being data driven, are fundamental to increasing care safety, but they need to consider human factors. Thus, balanced models, cognitively driven and technology based, are needed in day-to-day applications to actually improve the diagnostic process. The purpose of this article, then, is to provide insight into cognitive influences that have resulted in wrong, delayed or missed diagnoses. Using a cognitive approach, we describe the basis of medical error, with particular emphasis on diagnostic error. We then propose a conceptual scheme of the diagnostic process using fuzzy cognitive maps. © 2011 Blackwell Publishing Ltd.

  4. System Considerations and Challenges in 3D Mapping and Modeling Using Low-Cost UAV Systems

    NASA Astrophysics Data System (ADS)

    Lari, Z.; El-Sheimy, N.

    2015-08-01

    In the last few years, low-cost UAV systems have been acknowledged as an affordable technology for geospatial data acquisition that can meet the needs of a variety of traditional and non-traditional mapping applications. In spite of its proven potential, UAV-based mapping still lacks what is needed to become an accepted mapping tool: a well-designed system architecture that considers payload restrictions as well as the specifications of the direct geo-referencing component and the imaging systems in light of the required mapping accuracy and intended application. Moreover, efficient data-processing workflows are still needed that can deliver mapping products of the specified quality while accounting for the synergistic characteristics of the onboard sensors, the wide range of potential users who might lack deep knowledge of mapping activities, and the time constraints of emerging applications. Therefore, the challenges introduced by carrying low-cost imaging and georeferencing sensors on UAVs with limited payload capacity, the need for efficient data-processing techniques that deliver the products required by intended applications, and the diversity of potential users with insufficient mapping-related expertise need to be fully investigated and addressed by UAV-based mapping research efforts. This paper addresses these challenges and reviews system considerations, adaptive processing techniques, and quality assurance/quality control procedures for achieving accurate mapping products from these systems.

  5. Development of a systematic strategy for the global identification and classification of the chemical constituents and metabolites of Kai-Xin-San based on liquid chromatography with quadrupole time-of-flight mass spectrometry combined with multiple data-processing approaches.

    PubMed

    Wang, Xiaotong; Liu, Jing; Yang, Xiaomei; Zhang, Qian; Zhang, Yiwen; Li, Qing; Bi, Kaishun

    2018-03-30

    To rapidly identify and classify the complicated components and metabolites of traditional Chinese medicines, a liquid chromatography with quadrupole time-of-flight mass spectrometry method combined with multiple data-processing approaches was established. In this process, Kai-Xin-San, a widely used classic traditional Chinese medicine preparation, was chosen as a model prescription. Initially, the fragmentation patterns, diagnostic product ions and neutral losses of each category of compounds were summarized by collision-induced dissociation analysis of representative standards. In vitro, the multiple product-ion filtering technique was utilized to identify the chemical constituents, globally covering trace components. With this strategy, 108 constituents were identified, and a compound database was successfully established. In vivo, the prototype compounds were extracted on the basis of the established database, and the neutral-loss filtering technique combined with drug metabolism reaction rules was employed to identify metabolites. Overall, 69 constituents, including prototypes and metabolites, were characterized in rat plasma, and nine constituents were characterized for the first time in rat brain; these may be potential active constituents whose synergistic interaction produces the curative effects. In conclusion, this study provides a generally applicable strategy for the global identification of metabolites among complicated components in complex matrices and a chemical basis for further pharmacological research on Kai-Xin-San. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
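    The multiple product-ion filtering step described above can be sketched as a simple membership test: a spectrum is assigned to a compound class only if every diagnostic product ion of that class is found within a mass tolerance. The sketch below is illustrative only; the class names and m/z values in the usage are hypothetical placeholders, not values from the study.

```python
def matches_class(spectrum_mz, diagnostic_ions, tol=0.01):
    """True if every diagnostic product ion is present in the spectrum
    within +/- tol Da (simplified multiple product-ion filter)."""
    return all(any(abs(mz - d) <= tol for mz in spectrum_mz)
               for d in diagnostic_ions)

def classify(spectrum_mz, class_ions, tol=0.01):
    """Return the names of compound classes whose diagnostic ions all match."""
    return [name for name, ions in class_ions.items()
            if matches_class(spectrum_mz, ions, tol)]

# Hypothetical classes and m/z values, for illustration only:
class_ions = {"class_A": [161.045, 203.055], "class_B": [99.0]}
spectrum = [161.046, 203.054, 423.2]
print(classify(spectrum, class_ions))  # only class_A has all ions present
```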

  6. Superchilling of muscle foods: Potential alternative for chilling and freezing.

    PubMed

    Banerjee, Rituparna; Maheswarappa, Naveena Basappa

    2017-12-05

    Superchilling is an attractive technique for the preservation of muscle foods: it freezes part of the water and insulates the food products from temperature fluctuations, thereby enhancing shelf-life during storage, transportation and retailing. The superchilling process synergistically improves product shelf-life when used in combination with vacuum or modified-atmosphere packaging. The shelf-life of muscle foods has been reported to increase by 1.5 to 4.0 times relative to traditional chilling techniques. The advantages of superchilling and its ability to maintain the freshness of muscle foods better than freezing are discussed, and its potential for industrial application is highlighted. The present review also unravels the mechanistic basis of ice-crystal formation during superchilling and measures to ameliorate drip loss. Future challenges, especially automation of the superchilling process for large-scale industrial application, are presented.

  7. Non-Destructive Spectroscopic Techniques and Multivariate Analysis for Assessment of Fat Quality in Pork and Pork Products: A Review

    PubMed Central

    Kucha, Christopher T.; Liu, Li; Ngadi, Michael O.

    2018-01-01

    Fat is one of the most important traits determining the quality of pork. The composition of the fat greatly influences the quality of pork and its processed products and contributes to defining the overall carcass value. However, establishing an efficient method for assessing fat quality parameters such as fatty acid composition, solid fat content, oxidative stability, iodine value, and fat color remains a challenge that must be addressed. Conventional methods such as visual inspection, mechanical methods, and chemical methods are used off the production line, which often results in an inaccurate representation of the process because the dynamics are lost due to the time required to perform the analysis. Consequently, rapid and non-destructive alternative methods are needed. In this paper, the traditional fat quality assessment techniques are discussed, with emphasis on spectroscopic techniques as an alternative. Potential spectroscopic techniques include infrared spectroscopy, nuclear magnetic resonance and Raman spectroscopy. Hyperspectral imaging, an emerging advanced spectroscopy-based technology, is introduced and discussed in light of recent developments in the assessment of fat quality attributes. All techniques are described in terms of their operating principles and the research advances involving their application to pork fat quality parameters. Future trends for the non-destructive spectroscopic techniques are also discussed. PMID:29382092

  8. A review of recent developments in parametric based acoustic emission techniques applied to concrete structures

    NASA Astrophysics Data System (ADS)

    Vidya Sagar, R.; Raghu Prasad, B. K.

    2012-03-01

    This article presents a review of recent developments in parametric based acoustic emission (AE) techniques applied to concrete structures. It recapitulates the significant milestones achieved by previous researchers, including various methods and models developed in AE testing of concrete structures. The aim is to provide an overview of the specific features of parametric based AE techniques for concrete structures carried out over the years. Emphasis is given to traditional parameter-based AE techniques applied to concrete structures. A significant amount of research on AE techniques applied to concrete structures has already been published, and considerable attention has been given to those publications. Some recent studies, such as AE energy analysis and b-value analysis used to assess damage of concrete bridge beams, are also discussed. The formation of the fracture process zone and the AE energy released during the fracture process in concrete beam specimens are summarised. A large body of experimental data on AE characteristics of concrete has accumulated over the last three decades. This review of parametric based AE techniques applied to concrete structures may help researchers and engineers better understand the failure mechanism of concrete and develop more useful methods and approaches for diagnostic inspection of structural elements and failure prediction/prevention of concrete structures.

  9. The effect of various veneering techniques on the marginal fit of zirconia copings

    PubMed Central

    Torabi, Kianoosh; Vojdani, Mahroo; Giti, Rashin; Pardis, Soheil

    2015-01-01

    PURPOSE This study aimed to evaluate the fit of zirconia ceramics before and after veneering, using 3 different veneering processes (layering, press-over, and CAD-on techniques). MATERIALS AND METHODS Thirty standardized zirconia CAD/CAM frameworks were constructed and divided into three groups of 10 each. The first group was veneered using the traditional layering technique. Press-over and CAD-on techniques were used to veneer the second and third groups. The marginal gap of the specimens was measured before and after the veneering process at 18 sites on the master die using a digital microscope. A paired t-test was used to evaluate mean marginal gap changes. One-way ANOVA and post hoc tests were also employed for comparison among the 3 groups (α=.05). RESULTS The marginal gap of all 3 groups increased after porcelain veneering. The mean marginal gap after veneering in the layering group (63.06 µm) was higher than in the press-over (50.64 µm) and CAD-on (51.50 µm) groups (P<.001). CONCLUSION The three veneering methods altered the marginal fit of zirconia copings. The conventional layering technique increased the marginal gap of the zirconia framework more than the pressing and CAD-on techniques. All-ceramic crowns made through the three different veneering methods showed clinically acceptable marginal fit. PMID:26140175

  10. Evaluating the use of laser radiation in cleaning of copper embroidery threads on archaeological Egyptian textiles

    NASA Astrophysics Data System (ADS)

    Abdel-Kareem, Omar; Harith, M. A.

    2008-07-01

    Cleaning of copper embroidery threads on archaeological textiles is still a complicated conservation process, as most textile conservators believe that the advantages of traditional cleaning techniques are outweighed by their disadvantages. In this study, a laser cleaning method and two modified wet-cleaning recipes were evaluated for cleaning the corroded copper embroidery threads on an archaeological Egyptian textile fabric. Some corroded copper-thread samples were cleaned using the modified wet-cleaning recipes; other samples were cleaned with Q-switched Nd:YAG laser radiation at a wavelength of 532 nm. All tested metal-thread samples were examined before and after cleaning using a light microscope and a scanning electron microscope with an energy-dispersive X-ray analysis unit. The laser-induced breakdown spectroscopy (LIBS) technique was also used for elemental analysis of the laser-cleaned samples to follow up the laser cleaning procedure. The results show that laser cleaning is the most effective of all the tested methods for cleaning corroded copper threads. It can safely remove the corrosion products without damaging either the metal strips or the fibrous core. The tested laser cleaning technique solved the problems caused by the traditional cleaning techniques commonly used for metal threads on museum textiles.

  11. Epileptic seizure detection in EEG signal using machine learning techniques.

    PubMed

    Jaiswal, Abeg Kumar; Banka, Haider

    2018-03-01

    Epilepsy is a well-known nervous system disorder characterized by seizures. Electroencephalograms (EEGs), which capture brain neural activity, can detect epilepsy. Traditional methods for analyzing an EEG signal for epileptic seizure detection are time-consuming. Recently, several automated seizure detection frameworks using machine learning techniques have been proposed to replace these traditional methods. The two basic steps involved in machine learning are feature extraction and classification. Feature extraction reduces the input pattern space by keeping informative features, and the classifier assigns the appropriate class label. In this paper, we propose two effective approaches involving subpattern-based PCA (SpPCA) and cross-subpattern correlation-based PCA (SubXPCA) with a Support Vector Machine (SVM) for automated seizure detection in EEG signals. Feature extraction was performed using SpPCA and SubXPCA. Both techniques explore the subpattern correlation of EEG signals, which helps in the decision-making process. An SVM is used for classification of seizure and non-seizure EEG signals. The SVM was trained with a radial basis function kernel. All experiments were carried out on the benchmark epilepsy EEG dataset. The entire dataset consists of 500 EEG signals recorded under different scenarios. Seven different experimental cases for classification were conducted. Classification accuracy was evaluated using tenfold cross-validation. The classification results of the proposed approaches were compared with those of several existing techniques from the literature to establish the claim.
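    As a rough illustration of the subpattern idea behind SpPCA, the sketch below splits each signal into fixed-length subpatterns and projects each block onto its first principal component, computed by power iteration. This is a minimal stdlib-only sketch under that reading of the method, not the authors' implementation; the helper names are invented.

```python
def first_pc(rows, iters=200):
    """First principal component of mean-centered rows via power iteration."""
    d = len(rows[0])
    means = [sum(r[j] for r in rows) / len(rows) for j in range(d)]
    x = [[r[j] - means[j] for j in range(d)] for r in rows]
    v = [1.0] * d
    for _ in range(iters):
        xv = [sum(row[j] * v[j] for j in range(d)) for row in x]   # X v
        w = [sum(x[i][j] * xv[i] for i in range(len(x))) for j in range(d)]  # X^T X v
        norm = sum(c * c for c in w) ** 0.5 or 1.0
        v = [c / norm for c in w]
    return means, v

def sppca_features(signals, n_sub):
    """Split each signal into n_sub subpatterns, run PCA per block,
    and concatenate the per-block projections into one feature vector."""
    d = len(signals[0])
    step = d // n_sub
    blocks = [[s[k * step:(k + 1) * step] for s in signals] for k in range(n_sub)]
    pcs = [first_pc(b) for b in blocks]
    feats = []
    for s in signals:
        f = []
        for k, (means, v) in enumerate(pcs):
            seg = s[k * step:(k + 1) * step]
            f.append(sum((seg[j] - means[j]) * v[j] for j in range(step)))
        feats.append(f)
    return feats
```

    The resulting low-dimensional feature vectors would then be fed to a classifier such as an SVM.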

  12. Percutaneous Repair Technique for Acute Achilles Tendon Rupture with Assistance of Kirschner Wire.

    PubMed

    He, Ze-yang; Chai, Ming-xiang; Liu, Yue-ju; Zhang, Xiao-ran; Zhang, Tao; Song, Lian-xin; Ren, Zhi-xin; Wu, Xi-rui

    2015-11-01

    The aim of this study is to introduce a self-designed, minimally invasive technique for repairing an acute Achilles tendon rupture percutaneously. Compared with traditional open repair, the new technique offers obvious advantages: minimized operation-related lesions, fewer wound complications, and a higher healing rate. However, a percutaneous technique without direct vision may be criticized for insufficient anastomosis of the Achilles tendon, and may also lead to lengthening of the tendon and a reduction in the strength of the gastrocnemius. To address these potential problems, we improved our technique with a percutaneous Kirschner-wire leverage step before suturing, which effectively restores the length of the Achilles tendon and ensures that the broken ends are in tight contact. With this improvement, we are confident the technique will become a treatment of choice for acute Achilles tendon ruptures. © 2015 Chinese Orthopaedic Association and Wiley Publishing Asia Pty Ltd.

  13. Radiation dose reduction using a neck detection algorithm for single spiral brain and cervical spine CT acquisition in the trauma setting.

    PubMed

    Ardley, Nicholas D; Lau, Ken K; Buchan, Kevin

    2013-12-01

    Cervical spine injuries occur in 4-8 % of adults with head trauma. A dual acquisition technique has traditionally been used for CT scanning of the brain and cervical spine. The purpose of this study was to determine the efficacy of radiation dose reduction using a single acquisition technique that incorporated both anatomical regions with a dedicated neck detection algorithm. Thirty trauma patients referred for brain and cervical spine CT were included and were scanned with the single acquisition technique. The radiation doses from the single CT acquisition technique with the neck detection algorithm, which allowed appropriate independent dose administration for the brain and cervical spine regions, were recorded. Comparison was made both to doses calculated from a simulation of the traditional dual acquisitions with matching parameters and to the doses of the retrospective dual-acquisition legacy technique with the same sample size. The mean simulated dose for the traditional dual acquisition technique was 3.99 mSv, comparable to the average dose of 4.2 mSv from 30 previous patients who had CT of the brain and cervical spine as dual acquisitions. The mean dose from the single acquisition technique was 3.35 mSv, a 16 % overall dose reduction. The images from the single acquisition technique were of excellent diagnostic quality. The new single acquisition CT technique incorporating the neck detection algorithm for the brain and cervical spine significantly reduces the overall radiation dose by eliminating the otherwise unavoidable overlap between the 2 anatomical regions that occurs with the traditional dual acquisition technique.

  14. Coronal Axis Measurement of the Optic Nerve Sheath Diameter Using a Linear Transducer.

    PubMed

    Amini, Richard; Stolz, Lori A; Patanwala, Asad E; Adhikari, Srikar

    2015-09-01

    The true optic nerve sheath diameter cutoff value for detecting elevated intracranial pressure is variable. The variability may stem from the technique used to acquire sonographic measurements of the optic nerve sheath diameter as well as sonographic artifacts inherent to the technique. The purpose of this study was to compare the traditional visual axis technique to an infraorbital coronal axis technique for assessing the optic nerve sheath diameter using a high-frequency linear array transducer. We conducted a cross-sectional study at an academic medical center. Timed optic nerve sheath diameter measurements were obtained on both eyes of healthy adult volunteers with a 10-5-MHz broadband linear array transducer using both traditional visual axis and coronal axis techniques. Optic nerve sheath diameter measurements were obtained by 2 sonologists who graded the difficulty of each technique and were blinded to each other's measurements for each participant. A total of 42 volunteers were enrolled, yielding 84 optic nerve sheath diameter measurements. There were no significant differences in the measurements between the techniques on either eye (P = .23 [right]; P = .99 [left]). Additionally, there was no difference in the degree of difficulty obtaining the measurements between the techniques (P = .16). There was a statistically significant difference in the time required to obtain the measurements between the traditional and coronal techniques (P < .05). Infraorbital coronal axis measurements are similar to measurements obtained in the traditional visual axis. The infraorbital coronal axis technique is slightly faster to perform and is not technically challenging. © 2015 by the American Institute of Ultrasound in Medicine.

  15. Sediment-generated noise (SGN): Comparison with physical bedload measurements in a small semi-arid watershed

    USDA-ARS?s Scientific Manuscript database

    Passive acoustic techniques for the measurement of Sediment-Generated Noise (SGN) in gravel-bed rivers present a promising alternative to traditional bedload measurement techniques. Where traditional methods are often prohibitively costly, particularly in labor requirements, and produce point-scale ...

  16. Comparison of alternative image reformatting techniques in micro-computed tomography and tooth clearing for detailed canal morphology.

    PubMed

    Lee, Ki-Wook; Kim, Yeun; Perinpanayagam, Hiran; Lee, Jong-Ki; Yoo, Yeon-Jee; Lim, Sang-Min; Chang, Seok Woo; Ha, Byung-Hyun; Zhu, Qiang; Kum, Kee-Yeon

    2014-03-01

    Micro-computed tomography (MCT) shows detailed root canal morphology that is not seen with traditional tooth clearing. However, alternative image reformatting techniques in MCT involving 2-dimensional (2D) minimum intensity projection (MinIP) and 3-dimensional (3D) volume-rendering reconstruction have not been directly compared with clearing. The aim was to compare alternative image reformatting techniques in MCT with tooth clearing on the mesiobuccal (MB) root of maxillary first molars. Eighteen maxillary first molar MB roots were scanned, and 2D MinIP and 3D volume-rendered images were reconstructed. Subsequently, the same MB roots were processed by traditional tooth clearing. Images from 2D, 3D, 2D + 3D, and clearing techniques were assessed by 4 endodontists to classify canal configuration and to identify fine anatomic structures such as accessory canals, intercanal communications, and loops. All image reformatting techniques in MCT showed detailed configurations and numerous fine structures, such that none were classified as simple type I or II canals; several were classified as types III and IV according to Weine classification or types IV, V, and VI according to Vertucci; and most were nonclassifiable because of their complexity. The clearing images showed less detail, few fine structures, and numerous type I canals. Classification of canal configuration was in 100% intraobserver agreement for all 18 roots visualized by any of the image reformatting techniques in MCT but for only 4 roots (22.2%) classified according to Weine and 6 (33.3%) classified according to Vertucci, when using the clearing technique. The combination of 2D MinIP and 3D volume-rendered images showed the most detailed canal morphology and fine anatomic structures. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  17. Improved Power System Stability Using Backtracking Search Algorithm for Coordination Design of PSS and TCSC Damping Controller.

    PubMed

    Niamul Islam, Naz; Hannan, M A; Mohamed, Azah; Shareef, Hussain

    2016-01-01

    Power system oscillation is a serious threat to the stability of multimachine power systems. The coordinated control of power system stabilizers (PSS) and thyristor-controlled series compensation (TCSC) damping controllers is a commonly used technique to provide the required damping over different modes of growing oscillations. However, their coordinated design is a complex multimodal optimization problem that is very hard to solve using traditional tuning techniques. In addition, several limitations of traditionally used techniques prevent the optimum design of coordinated controllers. In this paper, an alternate technique for robust damping over oscillation is presented using the backtracking search algorithm (BSA). A 5-area 16-machine benchmark power system is considered to evaluate the design efficiency. The complete design process is conducted in a linear time-invariant (LTI) model of a power system. It includes formulation of the design as a multi-objective function from the system eigenvalues. Later on, nonlinear time-domain simulations are used to compare the damping performances for different local and inter-area modes of power system oscillations. The performance of the BSA technique is compared against that of the popular particle swarm optimization (PSO) for coordinated design efficiency. Damping performances using different design techniques are compared in terms of settling time and overshoot of oscillations. The results obtained verify that the BSA-based design improves the system stability significantly. The stability of the multimachine power system is improved by up to 74.47% and 79.93% for an inter-area mode and a local mode of oscillation, respectively. Thus, the proposed technique for coordinated design has great potential to improve power system stability and to maintain its secure operation.
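    A minimal sketch of the backtracking search algorithm's core loop (historical population, permutation, scaled mutation, dimension-wise crossover, and greedy selection) is shown below on a toy sphere objective. It is a simplified educational version under those assumptions, not the controller-tuning formulation used in the paper.

```python
import random

def bsa_minimize(f, dim, bounds, pop=30, gens=300, seed=3):
    """Simplified Backtracking Search Algorithm sketch (minimization)."""
    rng = random.Random(seed)
    lo, hi = bounds
    P = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    oldP = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    fit = [f(x) for x in P]
    for _ in range(gens):
        if rng.random() < rng.random():      # Selection-I: maybe refresh history
            oldP = [x[:] for x in P]
        rng.shuffle(oldP)                    # permute the historical population
        F = 3.0 * rng.gauss(0, 1)            # mutation scale factor
        for i in range(pop):
            trial = P[i][:]
            for j in range(dim):             # crossover: perturb a random subset
                if rng.random() < 0.5:
                    trial[j] = P[i][j] + F * (oldP[i][j] - P[i][j])
                    trial[j] = min(hi, max(lo, trial[j]))  # boundary control
            ft = f(trial)
            if ft < fit[i]:                  # greedy selection
                P[i], fit[i] = trial, ft
    best = min(range(pop), key=lambda i: fit[i])
    return P[best], fit[best]
```

    In the paper's setting the objective would instead score the eigenvalue-based damping criteria of the LTI power system model for a candidate set of PSS/TCSC parameters.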

  18. Hypopharyngeal perforation near-miss during transesophageal echocardiography.

    PubMed

    Aviv, Jonathan E; Di Tullio, Marco R; Homma, Shunichi; Storper, Ian S; Zschommler, Anne; Ma, Guoguang; Petkova, Eva; Murphy, Mark; Desloge, Rosemary; Shaw, Gary; Benjamin, Stanley; Corwin, Steven

    2004-05-01

    The traditional blind passage of a transesophageal echocardiography probe transorally through the hypopharynx is considered safe. Yet, severe hypopharyngeal complications during transesophageal echocardiography at several institutions led the authors to investigate whether traditional probe passage results in a greater incidence of hypopharyngeal injuries when compared with probe passage under direct visualization. Randomized, prospective clinical study. In 159 consciously sedated adults referred for transesophageal echocardiography, the authors performed transesophageal echocardiography with concomitant transnasal videoendoscopic monitoring of the hypopharynx. Subjects were randomly assigned to receive traditional (blind) or experimental (optical) transesophageal echocardiography. The primary outcome measure was frequency of hypopharyngeal injuries (hypopharyngeal lacerations or hematomas), and the secondary outcome measure was number of hypopharyngeal contacts. No perforation occurred with either technique. However, hypopharyngeal lacerations or hematomas occurred in 19 of 80 (23.8%) patients with the traditional technique (11 superficial lacerations of pyriform sinus, 1 laceration of pharynx, 12 arytenoid hematomas, 2 vocal fold hematomas, and 1 pyriform hematoma) and in 1 of 79 patients (1.3%) with the optical technique (superficial pyriform laceration) (P =.001). All traumatized patients underwent flexible laryngoscopy, but none required additional intervention. Respectively, hypopharyngeal contacts were more frequent with the traditional than with the optical technique at the pyriform sinus (70.0% vs. 10.1% [P =.001]), arytenoid (55.0% vs. 3.8% [P =.001]), and vocal fold (15.0% vs. 3.86% [P =.016]). Optically guided trans-esophageal echocardiography results in significantly fewer hypopharyngeal injuries and fewer contacts than traditional, blind transesophageal echocardiography. 
The optically guided technique may result in decreased frequency of potentially significant complications and therefore in improved patient safety.

  19. Vision-based system for the control and measurement of wastewater flow rate in sewer systems.

    PubMed

    Nguyen, L S; Schaeli, B; Sage, D; Kayal, S; Jeanbourquin, D; Barry, D A; Rossi, L

    2009-01-01

    Combined sewer overflows and stormwater discharges represent an important source of contamination to the environment. However, the harsh environment inside sewers and the particular hydraulic conditions during rain events reduce the reliability of traditional flow measurement probes. In the following, we present and evaluate an in situ system for monitoring water flow in sewers based on video images. This paper focuses on the measurement of the water level using image-processing techniques. The developed image-based water level algorithms identify the wall/water interface in sewer images and measure its position with respect to real-world coordinates. A web-based user interface and a 3-tier system architecture enable remote configuration of the cameras and the image-processing algorithms. Images acquired and processed by our system were found to measure water levels reliably, thereby providing crucial information leading to a better understanding of particular hydraulic behaviors. In terms of robustness and accuracy, the water level algorithm provided results equal to or better than traditional water level probes in three different in situ configurations.
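    The wall/water-interface idea can be illustrated with a toy row-scan: treat the image as rows of grayscale intensities, take the first row whose mean brightness falls below a threshold as the interface, and map that row to a real-world level through a linear calibration. This is a hypothetical simplification, not the paper's actual algorithm; the function names and calibration are invented.

```python
def detect_water_level_row(image, threshold=100.0):
    """Return the index of the first row whose mean intensity falls below
    the threshold (a crude proxy for the wall/water interface in a
    grayscale sewer image); None if no such row exists."""
    for i, row in enumerate(image):
        if sum(row) / len(row) < threshold:
            return i
    return None

def row_to_level(row_idx, row_at_zero, metres_per_row):
    """Map an image row to a water level via a linear calibration
    (row_at_zero is the row corresponding to the channel invert)."""
    return (row_at_zero - row_idx) * metres_per_row
```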

  20. Acoustics based assessment of respiratory diseases using GMM classification.

    PubMed

    Mayorga, P; Druzgalski, C; Morelos, R L; Gonzalez, O H; Vidales, J

    2010-01-01

    The focus of this paper is to present a method utilizing lung sounds for a quantitative assessment of patient health as it relates to respiratory disorders. To accomplish this, applicable traditional techniques from the speech-processing domain were used to evaluate lung sounds obtained with a digital stethoscope. Traditional methods for the evaluation of asthma involve auscultation and spirometry, but the use of more sensitive electronic stethoscopes, which are currently available, and the application of quantitative signal analysis methods offer opportunities for improved diagnosis. In particular, we propose an acoustic evaluation methodology based on Gaussian Mixture Models (GMM), which should assist in the broader analysis, identification, and diagnosis of asthma based on frequency-domain analysis of wheezes and crackles.
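    GMM classification rests on fitting Gaussian mixtures by expectation-maximization. A self-contained one-dimensional, two-component EM sketch (far simpler than the multivariate acoustic-feature models used in practice) looks like:

```python
import math

def fit_gmm_1d(data, iters=50):
    """EM for a two-component 1-D Gaussian mixture (educational sketch)."""
    mu = [min(data), max(data)]     # spread the initial means apart
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in (0, 1)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, and variances
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(1e-6, sum(r[k] * (x - mu[k]) ** 2
                                   for r, x in zip(resp, data)) / nk)
    return w, mu, var
```

    For classification, one mixture would be fitted per class (e.g., healthy vs. wheeze) and a new recording assigned to the class whose mixture gives it the highest likelihood.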

  1. Earth Observation Services (Forest Imaging)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Two university professors used EOCAP funding to demonstrate that satellite data can generate forest classifications with equal or better accuracy than traditional aerial photography techniques. This comparison had not been previously available. CALFIRST, the resulting processing package, will be marketed to forest companies and government agencies. The EOCAP program provides government co-funding to encourage private investment in, and to broaden the use of, NASA- developed technology for analyzing information about Earth and ocean resources.

  2. Measurement and Image Processing Techniques for Particle Image Velocimetry Using Solid-Phase Carbon Dioxide

    DTIC Science & Technology

    2014-03-27

    between the nozzle/shroud tube interface, where the liquid is allowed to rapidly expand from the smaller diameter of the nozzle into the larger diameter... the CO2(l) freezes and agglomerates in the shroud tube, producing particles that are larger than if the liquid were expanded through a single nozzle... Traditional seeding materials used for gas flows... Example correlation peak for one IR in PIV

  3. Measuring the In-Process Figure, Final Prescription, and System Alignment of Large Optics and Segmented Mirrors Using Lidar Metrology

    NASA Technical Reports Server (NTRS)

    Ohl, Raymond; Slotwinski, Anthony; Eegholm, Bente; Saif, Babak

    2011-01-01

    The fabrication of large optics is traditionally a slow process, and fabrication capability is often limited by measurement capability. While techniques exist to measure mirror figure with nanometer precision, measurements of large-mirror prescription are typically limited to submillimeter accuracy. Using a lidar instrument enables one to measure the optical surface rough figure and prescription in virtually all phases of fabrication without moving the mirror from its polishing setup. This technology improves the uncertainty of mirror prescription measurement to the micron regime.

  4. A simulation technique for predicting thickness of thermal sprayed coatings

    NASA Technical Reports Server (NTRS)

    Goedjen, John G.; Miller, Robert A.; Brindley, William J.; Leissler, George W.

    1995-01-01

    The complexity of many of the components being coated today using the thermal spray process makes the trial and error approach traditionally followed in depositing a uniform coating inadequate, thereby necessitating a more analytical approach to developing robotic trajectories. A two dimensional finite difference simulation model has been developed to predict the thickness of coatings deposited using the thermal spray process. The model couples robotic and component trajectories and thermal spraying parameters to predict coating thickness. Simulations and experimental verification were performed on a rotating disk to evaluate the predictive capabilities of the approach.
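    A deposition model of this kind can be caricatured in one dimension: each discrete gun position along the robotic trajectory deposits material with a Gaussian footprint, and thickness accumulates by superposition. The function below is an illustrative sketch under that assumption, not the NASA simulation model; the footprint shape and parameter names are invented.

```python
import math

def coating_thickness(points, gun_path, rate, sigma):
    """Accumulate coating thickness at surface points from discrete gun
    positions, assuming a Gaussian deposition footprint (1-D sketch)."""
    thick = [0.0] * len(points)
    for g in gun_path:
        for i, x in enumerate(points):
            thick[i] += rate * math.exp(-(x - g) ** 2 / (2 * sigma ** 2))
    return thick
```

    Sweeping the gun along overlapping passes and summing the footprints is what lets such a simulation predict thickness uniformity before committing to a robotic trajectory.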

  5. Development of a Low-cost, FPGA-based, Delay Line Particle Detector for Satellite and Sounding Rocket Applications

    NASA Astrophysics Data System (ADS)

    Harrington, M.; Kujawski, J. T.; Adrian, M. L.; Weatherwax, A. T.

    2013-12-01

    Electrons are, by definition, a fundamental chemical and electromagnetic constituent of any plasma. This is especially true within the partially ionized plasmas of Earth's ionosphere, where electrons are a critical component of a vast array of plasma processes. Siena College is working on a novel method of processing information from electron spectrometer anodes using delay line techniques and inexpensive COTS electronics to track the movement of high-energy particles. Electron spectrometers use a variety of techniques to determine where an amplified electron cloud falls onto a collecting surface. One traditional method divides the collecting surface into sectors and uses a single detector for each sector. However, as the angular and spatial resolution increases, so does the number of detectors, increasing the power consumption, cost, size, and weight of the system. An alternative approach is to connect the sectors with a delay line built within the PCB material, shielded from cross talk by a flooded ground plane. Only one pair of detectors (e.g., one at each end of the chain) is needed with the delay line technique, which differs from traditional delay line detectors that use either Application Specific Integrated Circuits (ASICs) or very fast clocks. In this paper, we report on the implementation and testing of a delay line detector using a low-cost Xilinx FPGA and a thirty-two sector delay system. This delay line detector has potential satellite and rocket flight applications due to its low cost, small size and power efficiency.
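    The delay-line readout reduces to timing: a hit's position along the chain is encoded in how the propagation time splits between the two ends, since the sum of the two arrival times is constant. A toy decoder under that idealization (function and parameter names invented) is:

```python
def hit_sector(t_left, t_right, n_sectors):
    """Sector index from the arrival times at the two ends of a delay-line
    chain: the fraction t_left / (t_left + t_right) varies linearly with
    the hit position along the chain (idealized, lossless line)."""
    total = t_left + t_right            # constant total propagation time
    frac = t_left / total               # fractional position along the chain
    return min(n_sectors - 1, int(frac * n_sectors))
```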

  6. Automated quantitative micro-mineralogical characterization for environmental applications

    USGS Publications Warehouse

    Smith, Kathleen S.; Hoal, K.O.; Walton-Day, Katherine; Stammer, J.G.; Pietersen, K.

    2013-01-01

    Characterization of ore and waste-rock material using automated quantitative micro-mineralogical techniques (e.g., QEMSCAN® and MLA) has the potential to complement traditional acid-base accounting and humidity cell techniques when predicting acid generation and metal release. These characterization techniques, which most commonly are used for metallurgical, mineral-processing, and geometallurgical applications, can be broadly applied throughout the mine-life cycle to include numerous environmental applications. Critical insights into mineral liberation, mineral associations, particle size, particle texture, and mineralogical residence phase(s) of environmentally important elements can be used to anticipate potential environmental challenges. Resources spent on initial characterization result in lower uncertainties of potential environmental impacts and possible cost savings associated with remediation and closure. Examples illustrate mineralogical and textural characterization of fluvial tailings material from the upper Arkansas River in Colorado.

  7. [Service quality in health care: the application of the results of marketing research].

    PubMed

    Verheggen, F W; Harteloh, P P

    1993-01-01

    This paper deals with quality assurance in health care and its relation to quality assurance in trade and industry. We present the service quality model, a model of quality from marketing research, and discuss how it can be applied to health care. Traditional quality assurance appears to have serious flaws: it lacks a general theory of the sources of hazards in the complex process of patient care, and it tends to stagnate, for no real improvement takes place. Departing from this criticism, modern quality assurance in health care is marked by: defining quality in a preferential sense as "fitness for use"; the use of theories and models from trade and industry (process control); an emphasis on analyzing the process, instead of merely inspecting it; use of the Deming problem-solving technique (plan, do, check, act); and improvement of the process of care by altering the perceptions of the parties involved. We describe the application of this method in the University Hospital Maastricht, The Netherlands. Successful application of this model requires a favorable corporate culture and motivated health care workers. The model provides a useful framework for improving upon the traditional approach to quality assurance in health care.

  8. Flavonoid content in ethanolic extracts of selected raw and traditionally processed indigenous foods consumed by vulnerable groups of Kenya: antioxidant and type II diabetes-related functional properties.

    PubMed

    Kunyanga, Catherine N; Imungi, Jasper K; Okoth, Michael W; Biesalski, Hans K; Vadivel, Vellingiri

    2011-08-01

    The present study evaluated the flavonoid content, antioxidant as well as type II diabetes-related enzyme inhibition activities of ethanolic extract of certain raw and traditionally processed indigenous food ingredients including cereals, legumes, oil seeds, tubers, vegetables and leafy vegetables, which are commonly consumed by vulnerable groups in Kenya. The vegetables exhibited higher flavonoid content (50-703 mg/100 g) when compared with the grains (47-343 mg/100 g). The ethanolic extract of presently studied food ingredients revealed 33-93% DPPH radical scavenging capacity, 486-6,389 mmol Fe(II)/g reducing power, 19-43% α-amylase inhibition activity and 14-68% α-glucosidase inhibition activity. Among the different food-stuffs, the drumstick and amaranth leaves exhibited significantly higher flavonoid content with excellent functional properties. Roasting of grains and cooking of vegetables were found to be suitable processing methods in preserving the functional properties. Hence, such viable processing techniques for respective food samples will be considered in the formulation of functional supplementary foods for vulnerable groups in Kenya.

  9. Fluence map optimization (FMO) with dose-volume constraints in IMRT using the geometric distance sorting method.

    PubMed

    Lan, Yihua; Li, Cunhua; Ren, Haozheng; Zhang, Yong; Min, Zhifang

    2012-10-21

    A new heuristic algorithm based on the so-called geometric distance sorting technique is proposed for solving the fluence map optimization with dose-volume constraints, which is one of the most essential tasks for inverse planning in IMRT. The framework of the proposed method is basically an iterative process which begins with a simple linearly constrained quadratic optimization model without considering any dose-volume constraints; dose constraints for the voxels violating the dose-volume constraints are then gradually added into the quadratic optimization model, step by step, until all the dose-volume constraints are satisfied. In each iteration step, an interior point method is adopted to solve each new linearly constrained quadratic program. To choose the proper candidate voxels for the current round of constraint adding, a so-called geometric distance, defined in the transformed standard quadratic form of the fluence map optimization model, is used to guide the selection of the voxels. The new geometric distance sorting technique largely reduces the unexpected increase of the objective function value that is inevitably caused by constraint adding, and it can be regarded as an upgrade of the traditional dose sorting technique. The geometric explanation for the proposed method is also given, and a proposition is proved to support our heuristic idea. In addition, a smart constraint adding/deleting strategy is designed to ensure stable iteration convergence. The new algorithm is tested on four cases (head-and-neck, prostate, lung, and oropharyngeal) and compared with the algorithm based on the traditional dose sorting technique. Experimental results showed that the proposed method is more suitable for guiding the selection of new constraints than the traditional dose sorting method, especially for cases whose target regions have non-convex shapes. 
It is, to some extent, a more efficient optimization technique for choosing constraints than the dose sorting method. By integrating a smart constraint adding/deleting scheme within the iteration framework, the new technique builds up an improved algorithm for solving the fluence map optimization with dose-volume constraints.
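    The iterative constraint-adding framework can be illustrated with a deliberately tiny toy problem. Everything below is illustrative: the interior point solver is replaced by a closed-form two-variable weighted least squares, the penalty formulation stands in for hard dose constraints, and the candidate voxel is chosen by simple dose sorting (the traditional baseline) rather than the paper's geometric distance sorting.

```python
# Toy sketch of the iterative constraint-adding framework (all numbers
# illustrative): two fluence weights, two target voxels, two OAR voxels,
# and a dose-volume constraint allowing at most one OAR voxel above CAP.
A = [(1.0, 0.3), (0.3, 1.0),   # rows 0-1: target voxels, prescribed dose 2.0
     (0.8, 0.6), (0.7, 0.7)]   # rows 2-3: organ-at-risk voxels, cap 1.0
TARGETS, OARS = (0, 1), (2, 3)
PRESCRIBED, CAP, PENALTY, TOL, ALLOWED = 2.0, 1.0, 1000.0, 0.05, 1

def solve(capped):
    """Minimise the weighted quadratic objective via 2x2 normal equations."""
    rows = [(A[i], PRESCRIBED, 1.0) for i in TARGETS]
    rows += [(A[j], CAP, PENALTY) for j in capped]   # added dose constraints
    m11 = m12 = m22 = b1 = b2 = 0.0
    for (a1, a2), d, w in rows:
        m11 += w * a1 * a1; m12 += w * a1 * a2; m22 += w * a2 * a2
        b1 += w * a1 * d;   b2 += w * a2 * d
    det = m11 * m22 - m12 * m12
    return ((b1 * m22 - b2 * m12) / det, (m11 * b2 - m12 * b1) / det)

capped = set()
while True:
    x = solve(capped)
    dose = [a[0] * x[0] + a[1] * x[1] for a in A]
    over = [j for j in OARS if dose[j] > CAP + TOL]
    if len(over) <= ALLOWED:     # dose-volume constraint satisfied: stop
        break
    # dose sorting: cap the uncapped violating voxel with the lowest dose
    capped.add(min((j for j in over if j not in capped), key=lambda j: dose[j]))
```

    The first (unconstrained) solve over-doses both OAR voxels; one cap constraint is then added and the re-solve satisfies the dose-volume constraint. The point of the paper's geometric distance sorting is precisely to make this voxel-selection step smarter than the plain dose sort used here.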

  10. [Acceptance and mindfulness-based cognitive-behavioral therapies].

    PubMed

    Ngô, Thanh-Lan

    2013-01-01

    Cognitive behavioral therapy (CBT) is one of the main approaches in psychotherapy. It teaches the patient to examine the link between dysfunctional thoughts and maladaptive behaviors and to re-evaluate the cognitive biases involved in the maintenance of symptoms by using strategies such as guided discovery. CBT is constantly evolving, in part to improve its effectiveness and accessibility. Thus, in the last decade, increasingly popular approaches based on mindfulness and acceptance have emerged. These therapies do not attempt to modify cognitions, even when they are biased and dysfunctional, but rather seek a change in the relationship between the individual and the symptoms. This article aims to present the historical context that allowed the emergence of this trend, the points of convergence and divergence with traditional CBT, and a brief presentation of the different therapies based on mindfulness meditation and acceptance. Hayes (2004) described three successive waves in behavior therapy, each characterized by "dominant assumptions, methods and goals": traditional behavior therapy, cognitive therapy, and therapies based on mindfulness meditation and acceptance. The latter consider that human suffering occurs when the individual lives a restricted life in order to avoid pain and immediate discomfort, to the detriment of his global wellbeing. These therapies combine mindfulness, experiential, and acceptance strategies with traditional behavioral principles in order to attain lasting results. There are significant points of convergence between traditional CBT and therapies based on mindfulness meditation and acceptance. Both are empirically validated, both are based upon a theoretical model postulating that avoidance is key in the maintenance of psychopathology, and both recommend an approach strategy in order to overcome the identified problem. 
They both use behavioral techniques in the context of a collaborative relationship in order to identify precise problems and to achieve specific goals. They focus on the present moment rather than on historical causes. However, they also present significant differences: control vs acceptance of thoughts, focus on cognition vs behavior, focus on the relationship between the individual and his thoughts vs cognitive content, the goal of modifying dysfunctional beliefs vs metacognitive processes, use of experiential vs didactic methods, focus on symptoms vs quality of life, and strategies used before vs after the unfolding of the full emotional response. The main interventions based on mindfulness meditation and acceptance are: Acceptance and Commitment Therapy, Functional Analytic Psychotherapy, the expanded model of Behavioral Activation, Metacognitive Therapy, Mindfulness-Based Cognitive Therapy, Dialectical Behavior Therapy, Integrative Behavioral Couples Therapy, and Compassionate Mind Training. These are described in this article. They offer concepts and techniques that might enhance therapeutic efficacy. They teach a new way to deploy attention and to enter into a relationship with current experience (for example, defusion) in order to diminish cognitive reactivity, a maintenance factor for psychopathology, and to enhance psychological flexibility. The focus on cognitive processes and metacognition, as well as cognitive content, might yield additional benefits in therapy. It is possible to combine traditional CBT with third wave approaches by using psychoeducation and cognitive restructuring in the beginning phases of therapy in order to identify thought biases, and then to encourage acceptance of internal experiences as well as exposure to feared stimuli rather than continuing to use cognitive restructuring techniques. 
Traditional CBT and third wave approaches seem to impact different processes: the former enhances the capacity to observe and describe experiences, while the latter diminish experiential avoidance and increase conscious action as well as acceptance. The identification of personal values helps to motivate the individual to undertake the actions required to enhance quality of life. In the case of chronic illness, it diminishes suffering by increasing acceptance. Although the evidence base supporting the efficacy of third wave approaches is less robust than for traditional cognitive or behavior therapy, therapies based on mindfulness meditation and acceptance are promising interventions that might help to elucidate change processes and offer complementary strategies to help patients.

  11. Combined PDF and Rietveld studies of ADORable zeolites and the disordered intermediate IPC-1P.

    PubMed

    Morris, Samuel A; Wheatley, Paul S; Položij, Miroslav; Nachtigall, Petr; Eliášová, Pavla; Čejka, Jiří; Lucas, Tim C; Hriljac, Joseph A; Pinar, Ana B; Morris, Russell E

    2016-09-28

    The disordered intermediate of the ADORable zeolite UTL has been structurally confirmed using the pair distribution function (PDF) technique. The intermediate, IPC-1P, is a disordered layered compound formed by the hydrolysis of UTL in 0.1 M hydrochloric acid solution. Its structure cannot be solved by traditional X-ray diffraction techniques. The PDF technique was first benchmarked against high-quality synchrotron Rietveld refinements of IPC-2 (OKO) and IPC-4 (PCR), two end products of IPC-1P condensation that share very similar structural features. An IPC-1P starting model derived from density functional theory was used for the PDF refinement, which yielded a final fit of Rw = 18% and a geometrically reasonable structure. This confirms that the layers remain intact throughout the ADOR process and shows that PDF is a viable technique for layered zeolite structure determination.
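    Conceptually, a pair distribution function is a histogram of all interatomic distances, so any candidate structural model predicts a set of peak positions that can be compared against the experimental curve. The sketch below is a simplified illustration of that idea only (a bare distance histogram for an assumed cubic fragment), not the paper's refinement procedure.

```python
# Simplified illustration (not the paper's refinement): histogram all
# pairwise interatomic distances of a model structure. Bins are stored as
# integer indices (distance / bin_width) to avoid floating-point keys.
import math
from collections import Counter

def distance_histogram(atoms, bin_width=0.1):
    """Histogram of pairwise distances for a list of (x, y, z) positions."""
    hist = Counter()
    for i in range(len(atoms)):
        for j in range(i + 1, len(atoms)):
            hist[round(math.dist(atoms[i], atoms[j]) / bin_width)] += 1
    return hist

# A 3x3x3 fragment of a cubic lattice with 4.0 Angstrom spacing: the first
# peak should fall at the nearest-neighbour separation (bin index 40).
atoms = [(4.0 * i, 4.0 * j, 4.0 * k)
         for i in range(3) for j in range(3) for k in range(3)]
hist = distance_histogram(atoms)
```

    A real PDF additionally weights pairs by scattering power and normalises against the average density; refinement then adjusts the model (here, it would be the DFT-derived IPC-1P layers) to match the measured curve.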

  12. Component Pin Recognition Using Algorithms Based on Machine Learning

    NASA Astrophysics Data System (ADS)

    Xiao, Yang; Hu, Hong; Liu, Ze; Xu, Jiangchang

    2018-04-01

    The purpose of machine vision for a plug-in machine is to improve the machine’s stability and accuracy, and recognition of the component pin is an important part of the vision system. This paper focuses on component pin recognition using three different techniques. The first technique involves traditional image processing using the core algorithm for binary large object (BLOB) analysis. The second technique uses the histogram of oriented gradients (HOG) to experimentally compare the effects of the support vector machine (SVM) and adaptive boosting (AdaBoost) learning meta-algorithm classifiers. The third technique uses a deep learning method known as a convolutional neural network (CNN), which identifies the pin by comparing a sample with its training data. The main purpose of the research presented in this paper is to increase the knowledge of learning methods used in the plug-in machine industry in order to achieve better results.
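    The BLOB-analysis step of the first technique reduces to labeling connected foreground regions in a thresholded image and measuring each region (size, centroid, etc.). The following is an illustrative, dependency-free sketch of that operation, not the paper's implementation:

```python
# Illustrative BLOB-analysis step: label 4-connected components in a binary
# image via flood fill and report each blob's size and centroid, which is
# the basic measurement behind pin detection.
def find_blobs(image):
    """Return a list of (size, (row_centroid, col_centroid)) per blob."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] and not seen[r][c]:
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:                      # flood fill one blob
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and image[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                blobs.append((len(pixels), (cy, cx)))
    return blobs

# Two separate 2x2 "pins" in a small binary image
img = [[0, 0, 0, 0, 0, 0],
       [0, 1, 1, 0, 0, 0],
       [0, 1, 1, 0, 1, 1],
       [0, 0, 0, 0, 1, 1],
       [0, 0, 0, 0, 0, 0]]
blobs = find_blobs(img)
```

    In practice the centroids would be filtered by expected pin size and pitch; the HOG+classifier and CNN techniques replace this hand-crafted stage with learned features.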

  13. 150-nm DR contact holes die-to-database inspection

    NASA Astrophysics Data System (ADS)

    Kuo, Shen C.; Wu, Clare; Eran, Yair; Staud, Wolfgang; Hemar, Shirley; Lindman, Ofer

    2000-07-01

    A failure-analysis-driven yield-enhancement concept, based on optimization of the mask manufacturing process and UV reticle inspection, is studied and shown to improve contact layer quality. This is achieved by relating various manufacturing processes to very finely tuned contact defect detection. In this way, selecting an optimized manufacturing process with a fine-tuned inspection setup is achieved in a controlled manner. This paper presents a study, performed on a specially designed test reticle, which simulates production contact layers with design rules of 250 nm, 180 nm, and 150 nm. The paper focuses on the use of advanced UV reticle inspection techniques as part of the process optimization cycle. Current inspection equipment uses traditional, insufficient methods of small contact-hole inspection and review.

  14. Product-oriented Software Certification Process for Software Synthesis

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate the use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until it is deemed bug-free, rather than being proven to possess certain properties. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics, such as set theory, automata theory, and formal logic, as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle, resulting in a more streamlined and accurate development process.
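    The distinction between sampling behavior and proving a property can be shown with a toy example, which has no connection to the document's synthesis toolchain: exhaustively checking a property over a bounded domain yields a proof for that domain, not just a handful of test cases.

```python
# Toy flavour of property proving (not the document's formal-methods
# toolchain): exhaustively verify a property of `clamp` over a bounded
# domain, which proves the property for every value in that domain.
from itertools import product

def clamp(value, low, high):
    """Restrict value to the closed interval [low, high]."""
    return max(low, min(high, value))

# Property: the result always lies in [low, high], and equals the input
# whenever the input was already inside the interval.
domain = range(-5, 6)
for value, low, high in product(domain, repeat=3):
    if low <= high:
        result = clamp(value, low, high)
        assert low <= result <= high
        assert result == value or not (low <= value <= high)
```

    Real formal methods generalize this from finite enumeration to symbolic proof over unbounded domains, which is what makes them applicable to safety-critical software.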

  15. Assembly processes comparison for a miniaturized laser used for the Exomars European Space Agency mission

    NASA Astrophysics Data System (ADS)

    Ribes-Pleguezuelo, Pol; Inza, Andoni Moral; Basset, Marta Gilaberte; Rodríguez, Pablo; Rodríguez, Gemma; Laudisio, Marco; Galan, Miguel; Hornaff, Marcel; Beckert, Erik; Eberhardt, Ramona; Tünnermann, Andreas

    2016-11-01

    A miniaturized diode-pumped solid-state laser (DPSSL), designed as part of the Raman laser spectrometer (RLS) instrument for the European Space Agency (ESA) ExoMars 2020 mission, is assembled and tested against the mission requirements. Two different processes were tried for assembling the laser: one based on adhesives, following traditional laser manufacturing processes, and another based on a low-stress, organic-free soldering technique called solderjet bumping. The manufactured devices were validated by passing mechanical, thermal-cycle, radiation, and optical functional tests. The comparative analysis showed that the soldered device improved on the adhesive-assembled device in terms of the reliability of its optical performance.

  16. Organo-metallic elements for associative information processing

    NASA Astrophysics Data System (ADS)

    Potember, Richard S.; Poehler, Theodore O.

    1989-01-01

    In the three years of the program we have: (1) built and tested a 4-bit element matrix device for possible use in high-density content-addressable memory systems; (2) established a test and evaluation laboratory to examine optical materials for nonlinear effects, saturable absorption, harmonic generation, and photochromism; (3) successfully designed, constructed, and operated a codeposition processing system that enables organic materials to be deposited on a variety of substrates to produce optical-grade coatings and films (this system is also compatible with other traditional microelectronic techniques); (4) used the sol-gel process with colloidal AgTCNQ to fabricate high-speed photochromic switches; and (5) developed VO2 optical switching materials made via sol-gel processing using vanadium(IV) alkoxide compounds, and applied for patent coverage on them.

  17. Trajectory Optimization for Missions to Small Bodies with a Focus on Scientific Merit.

    PubMed

    Englander, Jacob A; Vavrina, Matthew A; Lim, Lucy F; McFadden, Lucy A; Rhoden, Alyssa R; Noll, Keith S

    2017-01-01

    Trajectory design for missions to small bodies is tightly coupled both with the selection of targets for a mission and with the choice of spacecraft power, propulsion, and other hardware. Traditional methods of trajectory optimization have focused on finding the optimal trajectory for an a priori selection of destinations and spacecraft parameters. Recent research has expanded the field of trajectory optimization to multidisciplinary systems optimization that includes spacecraft parameters. The logical next step is to extend the optimization process to include target selection based not only on engineering figures of merit but also on scientific value. This paper presents a new technique for solving the multidisciplinary mission optimization problem for small-body missions, including classical trajectory design, the choice of spacecraft power and propulsion systems, and the scientific value of the targets. This technique, when combined with modern parallel computers, enables a holistic view of the small-body mission design process that previously required iteration among several different design processes.
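    The idea of folding science value into the design loop can be caricatured as ranking target/hardware combinations by a combined engineering-plus-science figure of merit. All names, numbers, and the merit function below are hypothetical; the paper's actual optimizer works over continuous trajectories, not a small enumeration like this.

```python
# Hedged sketch of combined engineering/science mission selection. All
# targets, delta-v values, science scores, and the merit function are
# hypothetical illustrations, not data from the paper.
from itertools import product

targets = {"asteroid A": {"dv": 4.2, "science": 7.0},   # delta-v in km/s
           "asteroid B": {"dv": 5.8, "science": 9.5},
           "comet C":    {"dv": 7.5, "science": 8.0}}
power_options = {"small array": 0.8, "large array": 1.0}  # efficiency factor

def merit(target, power, science_weight=1.0):
    """Higher is better: science return minus a propellant-driven cost."""
    t = targets[target]
    effective_dv = t["dv"] / power_options[power]   # weaker power -> costlier
    return science_weight * t["science"] - effective_dv

best = max(product(targets, power_options), key=lambda tp: merit(*tp))
```

    The point of the multidisciplinary formulation is that the best choice depends jointly on target and hardware: here the highest-science target only wins when paired with the larger power system.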

  18. Use of micro computed-tomography and 3D printing for reverse engineering of mouse embryo nasal capsule

    NASA Astrophysics Data System (ADS)

    Tesařová, M.; Zikmund, T.; Kaucká, M.; Adameyko, I.; Jaroš, J.; Paloušek, D.; Škaroupka, D.; Kaiser, J.

    2016-03-01

    Imaging of increasingly complex cartilage in vertebrate embryos is one of the key tasks of developmental biology. This is especially important for studying shape-organizing processes during initial skeletal formation and growth. Advanced imaging techniques that reflect biological needs give a powerful impulse to push the boundaries of biological visualization. Recently, techniques for contrasting tissues and organs have improved considerably, extending traditional 2D imaging approaches to 3D. X-ray micro computed tomography (μCT), which allows 3D imaging of biological objects including their internal structures with a resolution in the micrometer range, in combination with contrasting techniques seems to be the most suitable approach for non-destructive imaging of embryonic developing cartilage. Although there are many software-based ways to visualize 3D data sets, having a real solid model of the studied object might give novel opportunities to fully understand the shape-organizing processes in the developing body. In this feasibility study we demonstrate the full procedure of creating a real 3D object of the mouse embryo nasal capsule, i.e., staining, μCT scanning combined with advanced data processing, and 3D printing.

  19. Nanostructured silicon via metal assisted catalyzed etch (MACE): chemistry fundamentals and pattern engineering

    NASA Astrophysics Data System (ADS)

    Toor, Fatima; Miller, Jeffrey B.; Davidson, Lauren M.; Nichols, Logan; Duan, Wenqi; Jura, Michael P.; Yim, Joanne; Forziati, Joanne; Black, Marcie R.

    2016-10-01

    There is a range of methods for generating a nanostructured surface on silicon (Si), but the most cost-effective and optically interesting is metal-assisted wet chemical etching (MACE) (Koynov et al 2006 Appl. Phys. Lett. 88 203107). MACE of Si is a controllable, room-temperature wet-chemical technique that uses a thin layer of metal to etch the surface of Si, leaving behind various nano- and micro-scale surface features or ‘black silicon’. MACE-fabricated nanowires (NWs) provide improved antireflection and light trapping functionality (Toor et al 2016 Nanoscale 8 15448-66) compared with traditional ‘iso-texturing’ (Campbell and Green 1987 J. Appl. Phys. 62 243-9). The resulting lower reflection and improved light trapping can lead to higher short circuit currents in NW solar cells (Toor et al 2011 Appl. Phys. Lett. 99 103501). In addition, NW cells can have higher fill factors and voltages than traditionally processed cells, thus leading to increased solar cell efficiencies (Cabrera et al 2013 IEEE J. Photovolt. 3 102-7). MACE NW processing also has synergy with next generation Si solar cell designs, such as thin epitaxial-Si and passivated emitter rear contact (Toor et al 2016 Nanoscale 8 15448-66). While several companies have begun manufacturing black Si, and many more are researching these techniques, much of the work has not been published in traditional journals and is publicly available only through conference proceedings and patent publications, which makes learning the field challenging. Three specialized review articles on certain aspects of MACE or black Si have been published recently, but they do not present the full review that would benefit the industry (Liu et al 2014 Energy Environ. Sci. 7 3223-63; Yusufoglu et al 2015 IEEE J. Photovolt. 5 320-8; Huang et al 2011 Adv. Mater. 23 285-308). 
In this feature article, we review the chemistry of MACE and explore how changing parameters in the wet etch process affects the resulting texture on the Si surface. We then review efforts to increase the uniformity and reproducibility of the MACE process, which is critical for commercializing black Si technology.

  20. Structured dyadic behavior therapy processes for ADHD intervention.

    PubMed

    Curtis, David F

    2014-03-01

    Children with Attention-Deficit/Hyperactivity Disorder (ADHD) present significant problems with behavioral disinhibition that often negatively affect their peer relationships. Although behavior therapies for ADHD have traditionally aimed to help parents and teachers better manage children's ADHD-related behaviors, therapy processes seldom use peer relationships to implement evidence-based behavioral principles. This article introduces Structured Dyadic Behavior Therapy as a milieu for introducing effective behavioral techniques within a socially meaningful context. Establishing collaborative behavioral goals, benchmarking, and redirection strategies are discussed to highlight how in-session dyadic processes can be used to promote more meaningful reinforcement and change for children with ADHD. Implications for improving patient care, access to care, and therapist training are also discussed.

  1. Sentiment analysis of feature ranking methods for classification accuracy

    NASA Astrophysics Data System (ADS)

    Joseph, Shashank; Mugauri, Calvin; Sumathy, S.

    2017-11-01

    Text pre-processing and feature selection are important and critical steps in text mining. Pre-processing large volumes of data is a difficult task, as unstructured raw data must be converted into a structured format. Traditional methods of processing and weighting took much time and were less accurate. To overcome this challenge, feature ranking techniques have been devised. A feature set from text pre-processing is fed as input to feature selection, which helps improve text classification accuracy. Of the three feature selection categories available, the filter category is the focus here. Five feature ranking methods, namely document frequency, standard deviation, information gain, chi-square, and weighted log-likelihood ratio, are analyzed.
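    The simplest of the filter methods listed, document frequency, can be sketched in a few lines. The tiny corpus below is purely illustrative:

```python
# Minimal sketch (illustrative corpus): rank candidate features by document
# frequency, i.e. the number of documents a term occurs in, not its total
# occurrence count -- the simplest filter-category ranking method.
from collections import Counter

docs = ["the battery life is great",
        "great screen but poor battery",
        "poor service and poor battery life",
        "the screen is great"]

def document_frequency(corpus):
    """Number of documents each term appears in (duplicates in one
    document count once, hence the set())."""
    df = Counter()
    for doc in corpus:
        df.update(set(doc.split()))
    return df

df = document_frequency(docs)
ranked = sorted(df, key=lambda t: (-df[t], t))   # ties broken alphabetically
```

    The other ranking methods (information gain, chi-square, and so on) keep the same interface, scoring each term against class labels instead of raw presence, so they slot into the same sort-and-truncate selection step.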

  2. Nonlinear, non-stationary image processing technique for eddy current NDE

    NASA Astrophysics Data System (ADS)

    Yang, Guang; Dib, Gerges; Kim, Jaejoon; Zhang, Lu; Xin, Junjun; Udpa, Lalita

    2012-05-01

    Automatic analysis of eddy current (EC) data has facilitated the analysis of the large volumes of data generated in the inspection of steam generator tubes in nuclear power plants. The traditional procedure for analysis of EC data includes data calibration, pre-processing, region of interest (ROI) detection, feature extraction, and classification. Accurate ROI detection is enhanced by pre-processing, which involves reducing noise and other undesirable components as well as enhancing defect indications in the raw measurement. This paper presents the Hilbert-Huang transform (HHT) for feature extraction and the support vector machine (SVM) for classification. The performance is shown to be significantly better than that of the existing rule-based classification approach used in industry.

  3. Recombinant antibodies and their use in biosensors.

    PubMed

    Zeng, Xiangqun; Shen, Zhihong; Mernaugh, Ray

    2012-04-01

    Inexpensive, noninvasive immunoassays can be used to quickly detect disease in humans. Immunoassay sensitivity and specificity are decidedly dependent upon high-affinity, antigen-specific antibodies. Antibodies are produced biologically; as such, antibody quality and suitability for use in immunoassays cannot be readily determined or controlled by human intervention. However, the process through which high-quality antibodies can be obtained has been shortened and streamlined by the use of genetic engineering and recombinant antibody techniques. Antibodies that traditionally take several months or more to produce when animals are used can now be developed in a few weeks as recombinant antibodies produced in bacteria, yeast, or other cell types. Typically, most immunoassays use two or more antibodies or antibody fragments to detect antigens that are indicators of disease. However, a label-free biosensor, for example a quartz-crystal microbalance (QCM), needs only one antibody. As such, the cost and time needed to design and develop an immunoassay can be substantially reduced if recombinant antibodies and biosensors are used rather than traditional antibody and assay (e.g., enzyme-linked immunosorbent assay, ELISA) methods. Unlike traditional antibodies, recombinant antibodies can be genetically engineered to self-assemble on biosensor surfaces, at high density and correctly oriented, to enhance antigen-binding activity and to increase assay sensitivity, specificity, and stability. Additionally, biosensor surface chemistry and physical and electronic properties can be modified to further increase immunoassay performance above and beyond that obtained by use of traditional methods. This review describes some of the techniques investigators have used to develop highly specific and sensitive, recombinant antibody-based biosensors for detection of antigens in simple or complex biological samples.

  4. Traditional Agriculture and Permaculture.

    ERIC Educational Resources Information Center

    Pierce, Dick

    1997-01-01

    Discusses benefits of combining traditional agricultural techniques with the concepts of "permaculture," a framework for revitalizing traditions, culture, and spirituality. Describes school, college, and community projects that have assisted American Indian communities in revitalizing sustainable agricultural practices that incorporate…

  5. Detection of micro gap weld joint by using magneto-optical imaging and Kalman filtering compensated with RBF neural network

    NASA Astrophysics Data System (ADS)

    Gao, Xiangdong; Chen, Yuquan; You, Deyong; Xiao, Zhenlin; Chen, Xiaohui

    2017-02-01

    An approach for seam tracking of a micro gap weld, whose width is less than 0.1 mm, based on the magneto-optical (MO) imaging technique during butt-joint laser welding of steel plates is investigated. Kalman filtering (KF) with a radial basis function (RBF) neural network was applied to MO-sensor weld detection to track the weld center position. Because the laser welding system process noises and the MO sensor measurement noises were colored, the estimation accuracy of traditional KF for seam tracking was degraded: the system model contains extreme nonlinearities that a linear state-space model cannot capture, and the noise statistics cannot be accurately obtained in actual welding. Thus, an RBF neural network was applied within the KF technique to compensate for the weld tracking errors. The neural network can restrain filter divergence and improve system robustness. Compared with the traditional KF algorithm, the RBF-compensated KF not only improved the weld tracking accuracy more effectively but also reduced noise disturbance. Experimental results showed that the magneto-optical imaging technique can detect a micro gap weld accurately, which provides a novel approach for micro gap seam tracking.
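    The baseline component of such a tracker, a scalar Kalman filter fusing noisy per-frame offset measurements, can be sketched as follows. The noise variances and readings are illustrative, and the RBF compensation described above is omitted entirely:

```python
# Bare-bones scalar Kalman filter sketch (illustrative values; the RBF
# error compensation described above is omitted) tracking a weld-centre
# offset assumed roughly constant between frames.
def kalman_track(measurements, q=1e-4, r=0.04, x=0.0, p=1.0):
    """Fuse noisy per-frame offset measurements into a smoothed estimate.

    q: process-noise variance, r: measurement-noise variance,
    x: initial state estimate, p: initial estimate variance."""
    for z in measurements:
        p += q                      # predict: constant-position model
        k = p / (p + r)             # Kalman gain
        x += k * (z - x)            # update with the new measurement
        p *= (1 - k)                # shrink the estimate variance
    return x

# Simulated MO-sensor readings of a true 2.0 (arbitrary units) weld offset
readings = [2.1, 1.9, 2.05, 1.95, 2.0]
estimate = kalman_track(readings)
```

    When the noise is white and the model linear, this filter is near-optimal; the paper's point is that colored noise and nonlinear dynamics break those assumptions, which is what the RBF network is added to correct.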

  6. Dereplicating and Spatial Mapping of Secondary Metabolites from Fungal Cultures in Situ.

    PubMed

    Sica, Vincent P; Raja, Huzefa A; El-Elimat, Tamam; Kertesz, Vilmos; Van Berkel, Gary J; Pearce, Cedric J; Oberlies, Nicholas H

    2015-08-28

    Ambient ionization mass spectrometry techniques have recently become prevalent in natural product research due to their ability to examine secondary metabolites in situ. These techniques retain invaluable spatial and temporal details that are lost through traditional extraction processes. However, most ambient ionization techniques do not collect mutually supportive data, such as chromatographic retention times and/or UV/vis spectra, and this can limit the ability to identify certain metabolites, such as differentiating isomers. To overcome this, the droplet-liquid microjunction-surface sampling probe (droplet-LMJ-SSP) was coupled with UPLC-PDA-HRMS-MS/MS, thus providing separation, retention times, MS data, and UV/vis data used in traditional dereplication protocols. By capturing these mutually supportive data, the identity of secondary metabolites can be confidently and rapidly assigned in situ. Using the droplet-LMJ-SSP, a protocol was constructed to analyze the secondary metabolite profile of fungal cultures without any sample preparation. The results demonstrate that fungal cultures can be dereplicated from the Petri dish, thus identifying secondary metabolites, including isomers, and confirming them against reference standards. Furthermore, heat maps, similar to mass spectrometry imaging, can be used to ascertain the location and relative concentration of secondary metabolites directly on the surface and/or surroundings of a fungal culture.

  7. The History of Nontraditional or Ectopic Placement of Reservoirs in Prosthetic Urology.

    PubMed

    Perito, Paul; Wilson, Steven

    2016-04-01

    Reservoir placement during implantation of prosthetic urology devices has been problematic throughout the history of the surgical treatment of erectile dysfunction and urinary incontinence. We thought it would be interesting to review the history of reservoir placement leading up to current surgical techniques. To provide an overview of the past and present techniques for reservoir placement and discuss the evolutionary process leading to safe and effective placement of prosthetic reservoirs. We reviewed data pertaining to inflatable penile prosthesis (IPP) reservoirs and pressure-regulating balloons (PRB) in a chronological fashion, spanning 25 years. Main outcomes included a historical review of techniques for IPP reservoir and PRB placement leading to the subsequent incremental improvements in safety and efficacy when performing penile implants and artificial urinary sphincters. Prosthetic urologic reservoirs have traditionally been placed in the retropubic space. Over the years, urologists have attempted use of alternative spaces including peritoneal, epigastric, "ectopic," posterior to transversalis, and high submuscular. Current advances in prosthetic urologic reservoir placement allow safe and effective abdominal wall placement of reservoirs. These novel approaches appear to be so effective that urologists may now be able to cease using the traditional retropubic space for reservoir placement, even in the case of virgin pelves. Copyright © 2016 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.

  8. Polyelectrolyte and carbon nanotube multilayers made from ionic liquid solutions

    NASA Astrophysics Data System (ADS)

    Nakashima, Takuya; Zhu, Jian; Qin, Ming; Ho, Szushen; Kotov, Nicholas A.

    2010-10-01

    The inevitable contact of substrates with water during the traditional practice of layer-by-layer assembly (LBL) creates problems for multiple potential applications of LBL films in electronics. To resolve this issue, we demonstrate here the possibility of a LBL process using ionic liquids (ILs), which potentially eliminates corrosion and hydration processes related to aqueous media and opens additional possibilities in structural control of LBL films. ILs are also considered to be one of the best ``green'' processing solvents, and hence, are advantageous with respect to traditional organic solvents. Poly(ethyleneimine) (PEI) and poly(sodium styrenesulfonate) (PSS) were dispersed in a hydrophilic IL and successfully deposited in the LBL fashion. To produce electroactive thin films with significance to electronics, a similar process was realized for PSS-modified single-walled carbon nanotubes (SWNT-PSS) and poly(vinyl alcohol) (PVA). Characterization of the coating using standard spectroscopy and microscopy techniques typical of the multilayer field indicated that there are both similarities and differences in the structure and properties of LBL films built from ILs and aqueous solutions. The films exhibited electrical conductivity of 10² S m⁻¹ with transparency as high as 98% for visible light, which is comparable to similar parameters for many carbon nanotube and graphene films prepared by both aqueous LBL and other methods.

  9. On the possibility of producing true real-time retinal cross-sectional images using a graphics processing unit enhanced master-slave optical coherence tomography system.

    PubMed

    Bradu, Adrian; Kapinchev, Konstantin; Barnes, Frederick; Podoleanu, Adrian

    2015-07-01

    In a previous report, we demonstrated master-slave optical coherence tomography (MS-OCT), an OCT method that does not need resampling of data and can be used to deliver en face images from several depths simultaneously. In a separate report, we have also demonstrated MS-OCT's capability of producing cross-sectional images of a quality similar to those provided by the traditional Fourier domain (FD) OCT technique, but at a much slower rate. Here, we demonstrate that by taking advantage of the parallel processing capabilities offered by the MS-OCT method, cross-sectional OCT images of the human retina can be produced in real time. We analyze the conditions that ensure a true real-time B-scan imaging operation and demonstrate in vivo real-time images from the human fovea and the optic nerve, with resolution and sensitivity comparable to those produced using the traditional FD-based method, but without the need for data resampling.
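
    The depth-parallel structure that makes real-time MS-OCT possible can be illustrated with a toy calculation. The sketch below is not the authors' code; the mask shapes, the chirp term standing in for non-linear wavenumber sampling, and all sizes are assumptions. Each depth is recovered by one independent dot product with a pre-stored mask, which is what maps naturally onto a GPU.

```python
import math

# Toy master-slave OCT sketch; mask shapes, chirp, and sizes are assumptions.
N = 512                        # spectral samples per A-scan
depths = range(1, 40)          # candidate depth indices (arbitrary units)
CHIRP = 0.3                    # stands in for non-linear wavenumber sampling

def spectrum(d):
    # Channeled spectrum of a reflector at depth d; a real MS-OCT system
    # records these masks experimentally, so the same non-linearity appears
    # in both masks and measurements and never needs resampling.
    out = []
    for i in range(N):
        k = i / N + CHIRP * (i / N) ** 2
        out.append(math.cos(2 * math.pi * d * k))
    return out

masks = {d: spectrum(d) for d in depths}
measured = spectrum(17)        # reflector actually at depth index 17

# One independent dot product per depth: trivially parallel (GPU-friendly)
ascan = {d: abs(sum(m * s for m, s in zip(masks[d], measured))) / N
         for d in depths}

best = max(ascan, key=ascan.get)
print("strongest reflectivity at depth index:", best)   # -> 17
```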

  10. DICOM to print, 35-mm slides, web, and video projector: tutorial using Adobe Photoshop.

    PubMed

    Gurney, Jud W

    2002-10-01

    Preparing images for publication has traditionally involved film and the photographic process. With picture archiving and communication systems, many departments will no longer produce film, which changes how images are produced for publication. DICOM, the file format for radiographic images, has to be converted and then prepared for traditional print publication, 35-mm slides, the newer technique of video projection, and the World Wide Web. Tagged Image File Format (TIFF) is the common format for traditional print publication, whereas Joint Photographic Experts Group (JPEG) is the current file format for the World Wide Web. Each medium has specific requirements that can be met with a common image-editing program such as Adobe Photoshop (Adobe Systems, San Jose, CA). High-resolution images are required for print, a process that requires interpolation; the Internet, in contrast, requires images with a small file size for rapid transmission. The resolution of each output differs, and the image resolution must be optimized to match the output of the publishing medium.
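
    The resolution requirements discussed above reduce to simple arithmetic. The sketch below uses common rule-of-thumb values (300 dpi for print, 72 ppi for web, roughly 10:1 JPEG compression) that are assumptions, not figures from the article:

```python
# Required pixel dimensions for different output media (illustrative numbers:
# 300 dpi for print and 72 ppi for screen/web are common rules of thumb).
def pixels_needed(width_in, height_in, dpi):
    return round(width_in * dpi), round(height_in * dpi)

# A radiograph crop that should appear 5 x 4 inches on the printed page:
print("print (300 dpi):", pixels_needed(5, 4, 300))   # (1500, 1200)
print("web    (72 ppi):", pixels_needed(5, 4, 72))    # (360, 288)

# File-size estimate: uncompressed 8-bit grayscale TIFF for print vs. a
# small web JPEG (assuming roughly 10:1 JPEG compression -- an assumption).
w, h = pixels_needed(5, 4, 300)
tiff_bytes = w * h                    # one byte per 8-bit grayscale pixel
jpeg_bytes = (360 * 288) // 10
print("print TIFF ~", tiff_bytes // 1024, "KiB, web JPEG ~",
      jpeg_bytes // 1024, "KiB")
```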

  11. Quasi-Isentropic Compression of Wrought and Additively Manufactured 304L Stainless Steel

    NASA Astrophysics Data System (ADS)

    Specht, Paul; Brown, Justin; Wise, Jack; Furnish, Michael; Adams, David

    2017-06-01

    The thermodynamic and constitutive responses of both additively manufactured (AM) and traditional wrought processed 304L stainless steel (SS) were investigated through quasi-isentropic compression to peak stresses near 1 Mbar using Sandia National Laboratories' Z machine. The AM 304L SS samples were made with a laser engineered net shaping (LENS™) technique. Compared to traditional wrought processed 304L SS, the AM samples were highly textured with larger grain sizes (i.e., near 1 mm) and residual stresses (> 100 MPa). Interferometric measurements of interface velocities enabled inference of the quasi-isentropes for each fabrication type of 304L SS. Release from peak stress provided flow strength measurements of the wrought and AM 304L SS. Sandia National Laboratories is a multi-mission laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. Approved For Unclassified Unlimited Release SAND2017-2040A.

  12. Bacterial dynamics and metabolite changes in solid-state acetic acid fermentation of Shanxi aged vinegar.

    PubMed

    Li, Sha; Li, Pan; Liu, Xiong; Luo, Lixin; Lin, Weifeng

    2016-05-01

    Solid-state acetic acid fermentation (AAF), a natural or semi-controlled fermentation process driven by reproducible microbial communities, is an important technique for producing traditional Chinese cereal vinegars. Highly complex microbial communities and metabolites are involved in traditional Chinese solid-state AAF, but the association between microbiota and metabolites during this process is still poorly understood. In this study, we performed amplicon 16S rRNA gene sequencing on the Illumina MiSeq platform, PCR-denaturing gradient gel electrophoresis, and metabolite analysis to trace the bacterial dynamics and metabolite changes during the AAF process. A succession of bacterial assemblages was observed during the AAF process. Lactobacillales dominated all the stages; however, Acetobacter species in Rhodospirillales increased considerably during AAF until the end of fermentation. Quantitative PCR results indicated that the biomass of total bacteria showed a "system microbe self-domestication" process in the first 3 days, and then peaked on the seventh day before gradually decreasing until the end of AAF. Moreover, a total of 88 metabolites, including 8 organic acids, 16 free amino acids, and 66 aroma compounds, were detected during AAF. Principal component analysis and cluster analyses revealed the high correlation between the dynamics of the bacterial community and the metabolites.

  13. Monitoring the microbial community during solid-state acetic acid fermentation of Zhenjiang aromatic vinegar.

    PubMed

    Xu, Wei; Huang, Zhiyong; Zhang, Xiaojun; Li, Qi; Lu, Zhenming; Shi, Jinsong; Xu, Zhenghong; Ma, Yanhe

    2011-09-01

    Zhenjiang aromatic vinegar is one of the most famous Chinese traditional vinegars. In this study, changes in the microbial community during its fermentation process were investigated. DGGE results showed that the microbial community was comparatively stable and that its diversity followed a regular series of changes during the fermentation process. It is suggested that the domestication of microbes and the unique cycle-inoculation style used in the fermentation of Zhenjiang aromatic vinegar were responsible for the comparative stability of the microbial community. Furthermore, two clone libraries were constructed. The results showed that the bacteria present in the fermentation belonged to the genera Lactobacillus, Acetobacter, Gluconacetobacter, Staphylococcus, Enterobacter, Pseudomonas, Flavobacterium, and Sinorhizobium, while the fungi belonged to the genus Saccharomyces. DGGE combined with clone library analysis was an effective and credible technique for analyzing the microbial community during the fermentation process of Zhenjiang aromatic vinegar. Real-time PCR results suggested that the biomass showed a "system microbes self-domestication" process in the first 5 days, then reached a higher level on the 7th day before gradually decreasing until the fermentation ended on the 20th day. This is the first report on the changes of the microbial community during the fermentation process of Chinese traditional solid-state fermentation of vinegar. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Comparison of Directigen Group A Strep Test with a traditional culture technique for detection of group A beta-hemolytic streptococci.

    PubMed Central

    McCusker, J J; McCoy, E L; Young, C L; Alamares, R; Hirsch, L S

    1984-01-01

    The Directigen Group A Strep Test (DGAST), a new rapid method of detecting group A beta-hemolytic streptococci directly from throat swabs, was compared with a traditional culture technique for the detection of group A beta-hemolytic streptococci. Five hundred oropharyngeal swabs from pediatric and adult patients were cultured and then processed by using the DGAST. Of the 144 specimens positive by culture, 131 were DGAST positive (sensitivity, 90.9%). Of the 356 specimens negative by culture, 353 were DGAST negative (specificity, 99.2%). Twelve of the 13 false-negative DGAST results were from pediatric patients. One hundred isolates of non-group A beta-hemolytic streptococci were recovered, primarily groups C, F, and G. The DGAST is easy to perform, rapid, sensitive, and very specific for detection of group A beta-hemolytic streptococci directly from swabs. Supplementing the DGAST with a culture on a 5% sheep blood agar plate would enhance detection of group A beta-hemolytic streptococci, especially in pediatric patients. PMID:6386884
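
    The reported figures follow from confusion-matrix arithmetic on the counts in the abstract (131 of 144 culture-positives detected; 353 of 356 culture-negatives correctly negative); the predictive values below are computed here for illustration, not quoted from the paper:

```python
# Confusion-matrix arithmetic behind the reported sensitivity/specificity
# (counts taken from the abstract above).
tp, fn = 131, 13     # culture-positive: DGAST positive / DGAST negative
tn, fp = 353, 3      # culture-negative: DGAST negative / DGAST positive

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)          # positive predictive value
npv = tn / (tn + fn)          # negative predictive value

print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
print(f"PPV {ppv:.1%}, NPV {npv:.1%}")
```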

  15. Passive Resistor Temperature Compensation for a High-Temperature Piezoresistive Pressure Sensor.

    PubMed

    Yao, Zong; Liang, Ting; Jia, Pinggang; Hong, Yingping; Qi, Lei; Lei, Cheng; Zhang, Bin; Li, Wangwang; Zhang, Diya; Xiong, Jijun

    2016-07-22

    The main limitation of high-temperature piezoresistive pressure sensors is the variation of output voltage with operating temperature, which seriously reduces their measurement accuracy. This paper presents a passive resistor temperature compensation technique whose parameters are calculated using differential equations. Unlike traditional experiential arithmetic, the differential equations are independent of the parameter deviation among the piezoresistors of the microelectromechanical pressure sensor and the residual stress caused by the fabrication process or a mismatch in the thermal expansion coefficients. The differential equations are solved using calibration data from uncompensated high-temperature piezoresistive pressure sensors. Tests conducted on the calibrated equipment at various temperatures and pressures show that the passive resistor temperature compensation produces a remarkable effect. Additionally, a high-temperature signal-conditioning circuit is used to improve the output sensitivity of the sensor, which can be reduced by the temperature compensation. Compared to traditional experiential arithmetic, the proposed passive resistor temperature compensation technique exhibits less temperature drift and is expected to be highly applicable for pressure measurements in harsh environments with large temperature variations.
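
    The idea of solving for a compensation resistor from calibration data, rather than by trial and error, can be illustrated with a deliberately simplified model. All numbers and the single-series-resistor topology below are assumptions for illustration; the paper's networks and differential equations are more general. The bridge resistance rises with temperature while sensitivity falls, and a series resistor is chosen so the two effects cancel:

```python
# Illustrative model of passive series-resistor compensation (all numbers
# are assumptions, not values from the paper). The bridge resistance Rb
# rises with temperature while the pressure sensitivity S falls; a series
# resistor Rs makes the bridge supply voltage rise with Rb, cancelling the
# sensitivity loss. (A positive Rs exists only when S2*Rb2 > S1*Rb1.)
S1, Rb1 = 1.00, 5000.0    # sensitivity (a.u.) and bridge resistance at T1
S2, Rb2 = 0.85, 6500.0    # the same quantities at a higher temperature T2

# Require equal overall output at both temperatures:
#   S1 * Rb1/(Rb1+Rs) = S2 * Rb2/(Rb2+Rs)   ->  solve for Rs
Rs = Rb1 * Rb2 * (S2 - S1) / (S1 * Rb1 - S2 * Rb2)
print(f"compensating series resistor Rs = {Rs:.1f} ohm")

gain1 = S1 * Rb1 / (Rb1 + Rs)
gain2 = S2 * Rb2 / (Rb2 + Rs)
print(f"overall gain at T1: {gain1:.4f}, at T2: {gain2:.4f}")  # now equal
```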

  16. Passive Resistor Temperature Compensation for a High-Temperature Piezoresistive Pressure Sensor

    PubMed Central

    Yao, Zong; Liang, Ting; Jia, Pinggang; Hong, Yingping; Qi, Lei; Lei, Cheng; Zhang, Bin; Li, Wangwang; Zhang, Diya; Xiong, Jijun

    2016-01-01

    The main limitation of high-temperature piezoresistive pressure sensors is the variation of output voltage with operating temperature, which seriously reduces their measurement accuracy. This paper presents a passive resistor temperature compensation technique whose parameters are calculated using differential equations. Unlike traditional experiential arithmetic, the differential equations are independent of the parameter deviation among the piezoresistors of the microelectromechanical pressure sensor and the residual stress caused by the fabrication process or a mismatch in the thermal expansion coefficients. The differential equations are solved using calibration data from uncompensated high-temperature piezoresistive pressure sensors. Tests conducted on the calibrated equipment at various temperatures and pressures show that the passive resistor temperature compensation produces a remarkable effect. Additionally, a high-temperature signal-conditioning circuit is used to improve the output sensitivity of the sensor, which can be reduced by the temperature compensation. Compared to traditional experiential arithmetic, the proposed passive resistor temperature compensation technique exhibits less temperature drift and is expected to be highly applicable for pressure measurements in harsh environments with large temperature variations. PMID:27455271

  17. Promotion of family-centered birth with gentle cesarean delivery.

    PubMed

    Magee, Susanna R; Battle, Cynthia; Morton, John; Nothnagle, Melissa

    2014-01-01

    In this commentary we describe our experience developing a "gentle cesarean" program at a community hospital housing a family medicine residency program. The gentle cesarean technique has been popularized in recent obstetrics literature as a viable option to enhance the experience and outcomes of women and families undergoing cesarean delivery. Skin-to-skin placement of the infant in the operating room with no separation of mother and infant, reduction of extraneous noise, and initiation of breastfeeding in the operating room distinguish this technique from traditional cesarean delivery. Collaboration among family physicians, obstetricians, midwives, pediatricians, neonatologists, anesthesiologists, nurses, and operating room personnel facilitated the provision of gentle cesarean delivery to families requiring an operative birth. Among 144 gentle cesarean births performed from 2009 to 2012, complication rates were similar to or lower than those for traditional cesarean births. Gentle cesarean delivery is now standard of care at our institution. By sharing our experience, we hope to help other hospitals develop gentle cesarean programs. Family physicians should play an integral role in this process. © Copyright 2014 by the American Board of Family Medicine.

  18. [Research on engine remaining useful life prediction based on oil spectrum analysis and particle filtering].

    PubMed

    Sun, Lei; Jia, Yun-xian; Cai, Li-ying; Lin, Guo-yu; Zhao, Jin-song

    2013-09-01

    Spectrometric oil analysis (SOA) is an important technique for machine state monitoring, fault diagnosis, and prognosis, and SOA-based remaining useful life (RUL) prediction has the advantage of finding the optimal maintenance strategy for a machine system. Because of the complexity of machine systems, their health-state degradation process cannot be simply characterized by a linear model, while particle filtering (PF) possesses obvious advantages over traditional Kalman filtering in dealing with nonlinear and non-Gaussian systems. The PF approach was therefore applied to state forecasting by SOA, and an RUL prediction technique based on SOA and the PF algorithm is proposed. In the prediction model, the estimate of the system's posterior probability is used to update its prior probability distribution, and a multi-step-ahead prediction model based on the PF algorithm is established. Finally, practical SOA data from an engine were analyzed and forecasted using the above method, and the forecasting result was compared with that of the traditional Kalman filtering method. The results fully show the superiority and effectiveness of the proposed method.
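
    A bootstrap particle filter for RUL prediction can be sketched in a few dozen lines. The degradation model, noise levels, and failure threshold below are invented for illustration and are not the paper's engine model: particles are propagated through the state model, weighted by the measurement likelihood, systematically resampled, and then run forward until a threshold crossing to obtain an RUL distribution.

```python
import math, random

random.seed(1)

# Illustrative degradation model (not the paper's engine model): wear-debris
# concentration x grows multiplicatively with process noise; SOA measures it
# with noise; failure is declared when x crosses THRESH.
RATE, Q_STD, R_STD, THRESH = 0.05, 0.05, 0.4, 10.0

def step(x):
    return x * (1 + RATE) + random.gauss(0, Q_STD)

def likelihood(z, x):                      # assumed measurement model p(z|x)
    return math.exp(-0.5 * ((z - x) / R_STD) ** 2)

# Synthetic "true" trajectory and noisy SOA readings
true_x, zs = 1.0, []
for _ in range(30):
    true_x = step(true_x)
    zs.append(true_x + random.gauss(0, R_STD))

# Bootstrap particle filter: propagate, weight, systematically resample
N = 2000
parts = [random.uniform(0.5, 1.5) for _ in range(N)]
for z in zs:
    parts = [step(p) for p in parts]
    ws = [likelihood(z, p) for p in parts]
    total = sum(ws)
    u, cum, new, j = random.random() / N, ws[0] / total, [], 0
    for i in range(N):
        target = u + i / N
        while cum < target and j < N - 1:
            j += 1
            cum += ws[j] / total
        new.append(parts[j])
    parts = new

# Multi-step-ahead prediction: run each particle forward to the threshold
def rul(x):
    t = 0
    while x < THRESH and t < 200:
        x = step(x)
        t += 1
    return t

ruls = sorted(rul(p) for p in parts)
print("median predicted RUL:", ruls[N // 2], "steps")
```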

  19. Residential roof condition assessment system using deep learning

    NASA Astrophysics Data System (ADS)

    Wang, Fan; Kerekes, John P.; Xu, Zhuoyi; Wang, Yandong

    2018-01-01

    The emergence of high resolution (HR) and ultra high resolution (UHR) airborne remote sensing imagery is enabling humans to move beyond traditional land cover analysis applications to the detailed characterization of surface objects. A residential roof condition assessment method using techniques from deep learning is presented. The proposed method operates on individual roofs and divides the task into two stages: (1) roof segmentation, followed by (2) condition classification of the segmented roof regions. As the first step in this process, a self-tuning method is proposed to segment the images into small homogeneous areas. The segmentation is initialized with simple linear iterative clustering followed by deep learned feature extraction and region merging, with the optimal result selected by an unsupervised index, Q. After the segmentation, a pretrained residual network is fine-tuned on the augmented roof segments using a proposed k-pixel extension technique for classification. The effectiveness of the proposed algorithm was demonstrated on both HR and UHR imagery collected by EagleView over different study sites. The proposed algorithm has yielded promising results and has outperformed traditional machine learning methods using hand-crafted features.

  20. Impacts of Oil and Gas Production on Winter Ozone Pollution in the Uintah Basin Using Model Source Apportionment

    NASA Astrophysics Data System (ADS)

    Tran, H. N. Q.; Tran, T. T.; Mansfield, M. L.; Lyman, S. N.

    2014-12-01

    Contributions of emissions from oil and gas activities to elevated ozone concentrations in the Uintah Basin, Utah, were evaluated using the CMAQ Integrated Source Apportionment Method (CMAQ-ISAM) technique and compared with the results of traditional budgeting methods. Unlike the traditional budgeting method, which compares simulations with and without emissions of the source(s) in question to quantify their impacts, the CMAQ-ISAM technique assigns tags to the emissions of each source and tracks their evolution through physical and chemical processes to quantify the final ozone product yield from the source. Model simulations were performed for two episodes of low and high ozone in winter 2013 to provide a better understanding of source contributions under different weather conditions. Due to the highly nonlinear ozone chemistry, results obtained from the two methods differed significantly. The growing oil and gas industry in the Uintah Basin is the largest contributor to the elevated ozone (>75 ppb) observed in the Basin. This study therefore provides insight into the impact of the oil and gas industry on the ozone issue and helps in determining effective control strategies.
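
    The reason the two apportionment methods can disagree is the nonlinearity of ozone chemistry, which a toy calculation makes concrete. The response function and emission numbers below are invented, and the proportional split is only a stand-in for ISAM's species tagging:

```python
# Toy illustration of why zero-out budgeting and tagged apportionment can
# disagree when the chemistry is nonlinear (the function below is invented,
# not CMAQ chemistry). Ozone responds nonlinearly to total emissions.
def ozone(nox):
    return 120 * nox / (1 + nox)        # saturating, nonlinear response

oil_gas, other = 3.0, 1.0               # emission amounts (arbitrary units)
total_o3 = ozone(oil_gas + other)

# Traditional budgeting: rerun with the source removed, take the difference.
budget_oil_gas = total_o3 - ozone(other)

# Tagging-style apportionment: split total ozone by emission share (the real
# ISAM tags species through the chemistry; a proportional split stands in).
tagged_oil_gas = total_o3 * oil_gas / (oil_gas + other)

print(f"total ozone          : {total_o3:.1f}")
print(f"zero-out contribution: {budget_oil_gas:.1f}")
print(f"tagged contribution  : {tagged_oil_gas:.1f}")
```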

  1. Classroom Assessment Techniques: Checking for Student Understanding in an Introductory University Success Course

    ERIC Educational Resources Information Center

    Holbeck, Rick; Bergquist, Emily; Lees, Sheila

    2014-01-01

    Classroom Assessment Techniques (CATs) have been used in traditional university classrooms as a strategy to check for student understanding (Angelo & Cross, 1993). With the emergence of online learning and its popularity for non-traditional students, it is equally important that instructors in the online environment check for student…

  2. Traditional living and cultural ways as protective factors against suicide: perceptions of Alaska Native university students.

    PubMed

    DeCou, Christopher R; Skewes, Monica C; López, Ellen D S

    2013-01-01

    Native peoples living in Alaska have one of the highest rates of suicide in the world. This represents a significant health disparity for indigenous populations living in Alaska. This research was part of a larger study that explored qualitatively the perceptions of Alaska Native university students from rural communities regarding suicide. This analysis explored the resilience that arose from participants' experiences of traditional ways, including subsistence activities. Previous research has indicated the importance of traditional ways in preventing suicide and strengthening communities. Semi-structured interviews were conducted with 25 university students who had migrated to Fairbanks, Alaska, from rural Alaskan communities. An interview protocol was developed in collaboration with cultural and community advisors. Interviews were audio-recorded and transcribed. Participants were asked specific questions concerning the strengthening of traditional practices towards the prevention of suicide. Transcripts were analysed using the techniques of grounded theory. Participants identified several resilience factors against suicide, including traditional practices and subsistence activities, meaningful community involvement and an active lifestyle. Traditional practices and subsistence activities were perceived to create the context for important relationships, promote healthy living to prevent suicide, contrast with current challenges and transmit important cultural values. Participants considered the strengthening of these traditional ways as important in suicide prevention efforts. However, subsistence and traditional practices were viewed as a diminishing aspect of daily living in rural Alaska. Many college students from rural Alaska have been affected by suicide but are strong enough to cope with such tragic events. 
Subsistence living and traditional practices were perceived as important social and cultural processes with meaningful lifelong benefits for participants. Future research should continue to explore the ways in which traditional practices can contribute towards suicide prevention, as well as the far-reaching benefits of subsistence living.

  3. Natural Inspired Intelligent Visual Computing and Its Application to Viticulture.

    PubMed

    Ang, Li Minn; Seng, Kah Phooi; Ge, Feng Lu

    2017-05-23

    This paper presents an investigation of natural inspired intelligent computing and its corresponding application towards visual information processing systems for viticulture. The paper has three contributions: (1) a review of visual information processing applications for viticulture; (2) the development of natural inspired computing algorithms based on artificial immune system (AIS) techniques for grape berry detection; and (3) the application of the developed algorithms towards real-world grape berry images captured in natural conditions from vineyards in Australia. The AIS algorithms in (2) were developed based on a nature-inspired clonal selection algorithm (CSA) which is able to detect the arcs in the berry images with precision, based on a fitness model. The arcs detected are then extended to perform the multiple arcs and ring detectors information processing for the berry detection application. The performance of the developed algorithms was compared with traditional image processing algorithms such as the circular Hough transform (CHT) and other well-known circle detection methods. The proposed AIS approach gave an F-score of 0.71, compared with F-scores of 0.28 and 0.30 for the CHT and a parameter-free circle detection technique (RPCD), respectively.
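
    The CHT baseline the AIS algorithms were compared against can be sketched minimally: each edge pixel votes for every candidate centre lying at the known radius, and the accumulator peak gives the detected centre. The synthetic image and all parameters below are assumptions for illustration:

```python
import math

# Minimal circular Hough transform (the CHT baseline) on a synthetic image;
# image size, radius, and centre are arbitrary choices for illustration.
SIZE, R = 41, 9
cx_true, cy_true = 20, 18

# Edge pixels of a circle, as an edge detector might produce them
edges = set()
for a in range(360):
    edges.add((round(cx_true + R * math.cos(math.radians(a))),
               round(cy_true + R * math.sin(math.radians(a)))))

# Voting: every edge pixel votes for all centres at distance R from it
acc = {}
for (x, y) in edges:
    for a in range(360):
        c = (round(x - R * math.cos(math.radians(a))),
             round(y - R * math.sin(math.radians(a))))
        if 0 <= c[0] < SIZE and 0 <= c[1] < SIZE:
            acc[c] = acc.get(c, 0) + 1

(best_cx, best_cy), votes = max(acc.items(), key=lambda kv: kv[1])
print("detected centre:", (best_cx, best_cy))   # at or next to (20, 18)
```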

  4. Applications of emerging imaging techniques for meat quality and safety detection and evaluation: A review.

    PubMed

    Xiong, Zhenjie; Sun, Da-Wen; Pu, Hongbin; Gao, Wenhong; Dai, Qiong

    2017-03-04

    With improvement in people's living standards, many people nowadays pay more attention to quality and safety of meat. However, traditional methods for meat quality and safety detection and evaluation, such as manual inspection, mechanical methods, and chemical methods, are tedious, time-consuming, and destructive, which cannot meet the requirements of modern meat industry. Therefore, seeking out rapid, non-destructive, and accurate inspection techniques is important for the meat industry. In recent years, a number of novel and noninvasive imaging techniques, such as optical imaging, ultrasound imaging, tomographic imaging, thermal imaging, and odor imaging, have emerged and shown great potential in quality and safety assessment. In this paper, a detailed overview of advanced applications of these emerging imaging techniques for quality and safety assessment of different types of meat (pork, beef, lamb, chicken, and fish) is presented. In addition, advantages and disadvantages of each imaging technique are also summarized. Finally, future trends for these emerging imaging techniques are discussed, including integration of multiple imaging techniques, cost reduction, and developing powerful image-processing algorithms.

  5. Asymmetric Dual-Band Tracking Technique for Optimal Joint Processing of BDS B1I and B1C Signals

    PubMed Central

    Wang, Chuhan; Cui, Xiaowei; Ma, Tianyi; Lu, Mingquan

    2017-01-01

    Along with the rapid development of the Global Navigation Satellite System (GNSS), satellite navigation signals have become more diversified, complex, and agile in adapting to increasing market demands. Various techniques have been developed for processing multiple navigation signals to achieve better performance in terms of accuracy, sensitivity, and robustness. This paper focuses on a technique for processing two signals with separate but adjacent center frequencies, such as B1I and B1C signals in the BeiDou global system. The two signals may differ in modulation scheme, power, and initial phase relation and can be processed independently by user receivers; however, the propagation delays of the two signals from a satellite are nearly identical as they are modulated on adjacent frequencies, share the same reference clock, and undergo nearly identical propagation paths to the receiver, resulting in strong coherence between the two signals. Joint processing of these signals can achieve optimal measurement performance due to the increased Gabor bandwidth and power. In this paper, we propose a universal scheme of asymmetric dual-band tracking (ASYM-DBT) to take advantage of the strong coherence, the increased Gabor bandwidth, and power of the two signals in achieving much-reduced thermal noise and more accurate ranging results when compared with the traditional single-band algorithm. PMID:29035350

  6. Asymmetric Dual-Band Tracking Technique for Optimal Joint Processing of BDS B1I and B1C Signals.

    PubMed

    Wang, Chuhan; Cui, Xiaowei; Ma, Tianyi; Zhao, Sihao; Lu, Mingquan

    2017-10-16

    Along with the rapid development of the Global Navigation Satellite System (GNSS), satellite navigation signals have become more diversified, complex, and agile in adapting to increasing market demands. Various techniques have been developed for processing multiple navigation signals to achieve better performance in terms of accuracy, sensitivity, and robustness. This paper focuses on a technique for processing two signals with separate but adjacent center frequencies, such as B1I and B1C signals in the BeiDou global system. The two signals may differ in modulation scheme, power, and initial phase relation and can be processed independently by user receivers; however, the propagation delays of the two signals from a satellite are nearly identical as they are modulated on adjacent frequencies, share the same reference clock, and undergo nearly identical propagation paths to the receiver, resulting in strong coherence between the two signals. Joint processing of these signals can achieve optimal measurement performance due to the increased Gabor bandwidth and power. In this paper, we propose a universal scheme of asymmetric dual-band tracking (ASYM-DBT) to take advantage of the strong coherence, the increased Gabor bandwidth, and power of the two signals in achieving much-reduced thermal noise and more accurate ranging results when compared with the traditional single-band algorithm.
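
    One reason joint processing of two coherent signals reduces thermal noise can be shown with inverse-variance fusion of the two delay estimates. This is a simplified stand-in for ASYM-DBT (which combines the signals coherently and also gains Gabor bandwidth); the sigma values below are illustrative, not BDS B1I/B1C parameters:

```python
import math

# Inverse-variance fusion of two delay estimates that share one true delay
# (the two signals traverse nearly identical propagation paths).
var_b1i = 0.40 ** 2     # delay-estimate variance from B1I alone (m^2)
var_b1c = 0.25 ** 2     # delay-estimate variance from B1C alone (m^2)

w_i = (1 / var_b1i) / (1 / var_b1i + 1 / var_b1c)
w_c = (1 / var_b1c) / (1 / var_b1i + 1 / var_b1c)
var_joint = 1 / (1 / var_b1i + 1 / var_b1c)   # always below either input

print(f"fusion weights: B1I {w_i:.2f}, B1C {w_c:.2f}")
print(f"sigma: B1I 0.40 m, B1C 0.25 m, joint {math.sqrt(var_joint):.2f} m")
```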

  7. Investigation of FPGA-Based Real-Time Adaptive Digital Pulse Shaping for High-Count-Rate Applications

    NASA Astrophysics Data System (ADS)

    Saxena, Shefali; Hawari, Ayman I.

    2017-07-01

    Digital signal processing techniques have been widely used in radiation spectrometry to provide improved stability, performance, and compact physical size compared with traditional analog signal processing. In this paper, field-programmable gate array (FPGA)-based adaptive digital pulse shaping techniques are investigated for real-time signal processing. A National Instruments (NI) NI 5761 14-bit, 250-MS/s adaptor module is used for digitizing a high-purity germanium (HPGe) detector's preamplifier pulses. Digital pulse processing algorithms are implemented on the NI PXIe-7975R reconfigurable FPGA (Kintex-7) using the LabVIEW FPGA module. Based on the time separation between successive input pulses, the adaptive shaping algorithm selects the optimum shaping parameters (rise time and flat-top time of the trapezoid-shaping filter) for each incoming signal. A digital Sallen-Key low-pass filter is implemented to enhance the signal-to-noise ratio and reduce baseline drifting in trapezoid shaping. A recursive trapezoid-shaping filter algorithm is employed for pole-zero compensation of exponentially decayed (with two decay constants) preamplifier pulses of an HPGe detector. It allows extraction of pulse height information at the beginning of each pulse, thereby reducing pulse pileup and increasing throughput. The algorithms for the RC-CR2 timing filter, baseline restoration, pile-up rejection, and pulse height determination are digitally implemented for radiation spectroscopy. Traditionally, at high-count-rate conditions, a shorter shaping time is preferred to achieve high throughput, which deteriorates energy resolution. In this paper, experimental results are presented for varying count-rate and pulse shaping conditions. Using adaptive shaping, increased throughput is achieved while preserving the energy resolution observed using the longer shaping times.
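
    The recursive trapezoid shaper at the heart of such pipelines can be written in a few lines. The sketch below follows the widely used Jordanov-style recursion with illustrative parameters (rise, flat-top, and decay constants are assumptions, not the paper's values, and a single decay constant is used rather than the paper's two); the pole-zero factor M makes the flat top exactly proportional to the pulse amplitude for a single-exponential input:

```python
import math

# Jordanov-style recursive trapezoid shaper; parameters are illustrative.
k, m, tau = 40, 10, 500.0     # rise, flat-top, preamp decay (in samples)
l = k + m
M = 1.0 / (math.exp(1.0 / tau) - 1.0)   # pole-zero correction factor

# Synthetic single-exponential preamplifier pulse of amplitude A
N, t0, A = 400, 50, 1.0
v = [0.0] * N
for n in range(t0, N):
    v[n] = A * math.exp(-(n - t0) / tau)

def vv(n):                    # zero-padded access for negative indices
    return v[n] if n >= 0 else 0.0

trap, p, s = [], 0.0, 0.0
for n in range(N):
    d = vv(n) - vv(n - k) - vv(n - l) + vv(n - k - l)
    p += d                    # first accumulator
    s += p + M * d            # second accumulator with pole-zero term
    trap.append(s / (k * (M + 1)))   # flat top normalised to amplitude A

peak = max(trap)
print(f"flat-top height = {peak:.3f}")   # -> 1.000 (equals A)
```

The second accumulator's `M * d` term cancels the exponential tail exactly when M matches the preamp decay, which is why the flat top is flat rather than sloped.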

  8. Which is the preferred revision technique for loosened iliac screw? A novel technique of boring cement injection from the outer cortical shell.

    PubMed

    Yu, Bin-Sheng; Yang, Zhan-Kun; Li, Ze-Min; Zeng, Li-Wen; Wang, Li-Bing; Lu, William Weijia

    2011-08-01

An in vitro biomechanical cadaver study. To evaluate the pull-out strength after 5000 loading cycles among 4 revision techniques for the loosened iliac screw: corticocancellous bone, a longer screw, traditional cement augmentation, and boring cement augmentation. Iliac screw loosening is still a clinical problem for lumbo-iliac fusion. Although many revision techniques using corticocancellous bone, a larger screw, and polymethylmethacrylate (PMMA) augmentation have been applied to repair pedicle screw loosening, their biomechanical effects on the loosened iliac screw remain undetermined. Eight fresh human cadaver pelvises with bone mineral density values ranging from 0.83 to 0.97 g/cm² were used in this study. After testing the primary screw of 7.5 mm diameter and 70 mm length, 4 revision techniques were sequentially established and tested on the same pelvis as follows: corticocancellous bone, a longer screw of 100 mm length, traditional PMMA augmentation, and boring PMMA augmentation. The boring technique differs from traditional PMMA augmentation in that PMMA was injected into the screw tract through 3 boring holes in the outer cortical shell without removing the screw. On an MTS machine, after 5000 cycles of compressive loading of -200∼-500 N applied to the screw head, the axial maximum pull-out strengths of the 5 screws were measured and analyzed. The pull-out strengths of the primary screw and the 4 revised screws with corticocancellous bone, longer screw, and traditional and boring PMMA augmentation were 1167 N, 361 N, 854 N, 1954 N, and 1820 N, respectively. Although the longer-screw method obtained significantly higher pull-out strength than corticocancellous bone (P<0.05), the screws revised using these 2 techniques exhibited notably lower pull-out strength than the primary screw and the 2 PMMA-augmented screws (P<0.05). Both the traditional and boring PMMA screws showed markedly higher pull-out strength than the primary screw (P<0.05); however, no significant difference in pull-out strength was detected between the 2 PMMA screws (P>0.05). Wadding corticocancellous bone and increasing screw length failed to provide sufficient anchoring strength for a loosened iliac screw; however, both the traditional and boring PMMA-augmented techniques effectively increased the fixation strength. From the viewpoint of minimal invasion, boring PMMA augmentation may serve as a suitable salvage technique for iliac screw loosening.

  9. Photocontrollable Fluorescent Proteins for Superresolution Imaging

    PubMed Central

    Shcherbakova, Daria M.; Sengupta, Prabuddha; Lippincott-Schwartz, Jennifer; Verkhusha, Vladislav V.

    2014-01-01

    Superresolution fluorescence microscopy permits the study of biological processes at scales small enough to visualize fine subcellular structures that are unresolvable by traditional diffraction-limited light microscopy. Many superresolution techniques, including those applicable to live cell imaging, utilize genetically encoded photocontrollable fluorescent proteins. The fluorescence of these proteins can be controlled by light of specific wavelengths. In this review, we discuss the biochemical and photophysical properties of photocontrollable fluorescent proteins that are relevant to their use in superresolution microscopy. We then describe the recently developed photoactivatable, photoswitchable, and reversibly photoswitchable fluorescent proteins, and we detail their particular usefulness in single-molecule localization–based and nonlinear ensemble–based superresolution techniques. Finally, we discuss recent applications of photocontrollable proteins in superresolution imaging, as well as how these applications help to clarify properties of intracellular structures and processes that are relevant to cell and developmental biology, neuroscience, cancer biology and biomedicine. PMID:24895855

  10. Noncontact Microembossing Technology for Fabricating Thermoplastic Optical Polymer Microlens Array Sheets

    PubMed Central

    Chang, Xuefeng; Ge, Xiaohong; Li, Hui

    2014-01-01

Thermoplastic optical polymers have replaced traditional optical glass in many applications due to their superior optical performance, mechanical characteristics, low cost, and efficient production process. This paper investigates noncontact microembossing technology for producing microlens arrays made of PMMA (polymethyl methacrylate), PS (polystyrene), and PC (polycarbonate) from a quartz mold with microhole arrays. An array of planoconvex microlenses is formed by surface tension when pressure is applied at the edge of a hole at a suitable glass transition temperature. We studied the principle of noncontact microembossing using finite element analysis, in addition to the thermal and mechanical properties of the three polymers. Then, independently developed hot-embossing equipment was used to fabricate microlens arrays on PMMA, PS, and PC sheets. This is a promising technique for fabricating diverse thermoplastic optical polymer microlens array sheets, with a simple technological process and low production costs. PMID:25162063

  11. Spall Response of Additive Manufactured Ti-6Al-4V

    NASA Astrophysics Data System (ADS)

    Brown, Andrew; Gregg, Adam; Escobedo, Jp; Hazell, Paul; East, Daniel; Quadir, Zakaria

    2017-06-01

Additive manufactured (AM) Ti-6Al-4V was produced via electron beam melting (EBM) and laser melting deposition (LMD) techniques. The dynamic responses of AM varieties of common aerospace and infrastructure metals have yet to be fully characterized and compared to their traditionally processed counterparts. Spall damage is one of the primary failure modes in metals subjected to shock loading from high velocity impact. Both EBM and LMD Ti-6Al-4V were shock loaded via flyer-target plate impact using a single-stage light gas gun. Target plates were subjected to pressures just above the spall strength of the material (3-5 GPa) to investigate the early onset of damage nucleation as a function of processing technique and shock orientation with respect to the AM-build direction. Post-mortem characterization of the spall damage and surrounding microstructure was performed using a combination of optical microscopy, scanning electron microscopy, and electron backscatter diffraction.

  12. Laser-based surface preparation of composite laminates leads to improved electrodes for electrical measurements

    NASA Astrophysics Data System (ADS)

    Almuhammadi, Khaled; Selvakumaran, Lakshmi; Alfano, Marco; Yang, Yang; Bera, Tushar Kanti; Lubineau, Gilles

    2015-12-01

Electrical impedance tomography (EIT) is a low-cost, fast and effective structural health monitoring technique that can be used on carbon fiber reinforced polymers (CFRP). Electrodes are a key component of any EIT system and as such they should feature low resistivity as well as high robustness and reproducibility. Surface preparation is required prior to bonding of electrodes. Currently this task is mostly carried out by traditional sanding. However, this is a time-consuming procedure that can also damage surface fibers and lead to spurious electrode properties. Here we propose an alternative processing technique based on the use of pulsed laser irradiation. The processing parameters that result in selective removal of the electrically insulating resin with minimum surface fiber damage are identified. A quantitative analysis of the electrical contact resistance is presented and the results are compared with those obtained using sanding.

  13. Single Molecule Enzymology via Nanoelectronic Circuits

    NASA Astrophysics Data System (ADS)

    Collins, Philip

    Traditional single-molecule techniques rely on fluorescence or force transduction to monitor conformational changes and biochemical activity. Recent demonstrations of single-molecule monitoring with electronic transistors are poised to add to the single-molecule research toolkit. The transistor-based technique is sensitive to the motion of single charged side chain residues and can transduce those motions with microsecond resolution, opening the doors to single-molecule enzymology with unprecedented resolution. Furthermore, the solid-state platform provides opportunities for parallelization in arrays and long-duration monitoring of one molecule's activity or processivity, all without the limitations caused by photo-oxidation or mutagenic fluorophore incorporation. This presentation will review some of these advantages and their particular application to DNA polymerase I processing single-stranded DNA templates. This research was supported financially by the NIH NCI (R01 CA133592-01), the NIH NIGMS (1R01GM106957-01) and the NSF (DMR-1104629 and ECCS-1231910).

  14. Drowning in Data: Going Beyond Traditional Data Archival to Educate Data Users

    NASA Astrophysics Data System (ADS)

    Weigel, A. M.; Smith, T.; Smith, D. K.; Bugbee, K.; Sinclair, L.

    2017-12-01

Increasing quantities of Earth science data and information prove overwhelming to new and unfamiliar users. Data discovery and use challenges faced by these users are compounded with atmospheric science field campaign data collected by a variety of instruments and stored, visualized, processed and analyzed in different ways. To address data and user needs assessed through annual surveys and user questions, the NASA Global Hydrology Resource Center Distributed Active Archive Center (GHRC DAAC), in collaboration with a graphic designer, has developed a series of resources to help users learn about GHRC science focus areas, field campaigns, instruments, data, and data processing techniques. In this talk, GHRC data recipes, micro articles, interactive data visualization techniques, and artistic science outreach and education efforts, such as ESRI story maps and research as art, will be overviewed. The objective of this talk is to stress the importance that artistic information visualization has in communicating with and educating Earth science data users.

  15. Simulation and Development of Internal Model Control Applications in the Bayer Process

    NASA Astrophysics Data System (ADS)

    Colombé, Ph.; Dablainville, R.; Vacarisas, J.

Traditional PID feedback control is of limited use in the Bayer cycle because of the large and omnipresent time delays, which can lead to stability problems and sluggish response. Advanced modern control techniques are available, but in an industrial environment they suffer from a lack of simplicity and robustness. In this respect the Internal Model Control (IMC) method may be considered an exception. After a brief review of the basic theoretical principles behind IMC, an IMC scheme is developed for single-input, single-output, discrete-time, nonlinear systems. Two applications of IMC in the Bayer process, both in simulation and on industrial plants, are then described: control of the caustic soda concentration of the aluminate liquor and control of the Al2O3/Na2O caustic ratio of the digested slurry. Finally, the results obtained make this technique quite attractive for the alumina industry.
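The IMC structure for a single-input, single-output discrete-time loop can be sketched as follows. This is a generic first-order-plus-delay illustration with hypothetical parameter values, not the plant models used in the paper:

```python
def simulate_imc(a, b, delay, f, r, steps):
    """Internal Model Control on a first-order plant y[n+1] = a*y[n] + b*u[n-delay].

    The controller inverts the delay-free model and smooths the inversion with
    a first-order filter (pole f). The plant-model mismatch signal e is fed
    back, which is what yields offset-free tracking of the setpoint r.
    """
    y = ym = x = 0.0              # plant, delayed internal model, delay-free model
    u_hist = [0.0] * delay        # pending delayed inputs u[n-delay] ... u[n-1]
    for _ in range(steps):
        e = y - ym                # model mismatch (zero with a perfect model)
        v = r - e                 # mismatch-corrected setpoint
        u = ((f - a) * x + (1.0 - f) * v) / b   # filtered model inverse
        u_hist.append(u)
        ud = u_hist.pop(0)        # u[n-delay]
        y = a * y + b * ud        # true plant
        ym = a * ym + b * ud      # internal model (same structure here)
        x = a * x + b * u         # delay-free model used for the inversion
    return y
```

With a perfect model the mismatch e stays zero and the output converges to the setpoint at the filter rate f; with plant-model mismatch, the fed-back e still removes the steady-state offset, which is the robustness property that makes IMC attractive for delay-dominated loops like the Bayer cycle.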

  16. High volume fabrication of laser targets using MEMS techniques

    NASA Astrophysics Data System (ADS)

    Spindloe, C.; Arthur, G.; Hall, F.; Tomlinson, S.; Potter, R.; Kar, S.; Green, J.; Higginbotham, A.; Booth, N.; Tolley, M. K.

    2016-04-01

The latest techniques for the fabrication of high power laser targets, using processes developed for the manufacture of Micro-Electro-Mechanical System (MEMS) devices, are discussed. These laser targets are designed to meet the needs of the increased shot numbers that are available in the latest design of laser facilities. Traditionally, laser targets have been fabricated using conventional machining or coarse etching processes and have been produced in quantities of 10s to low 100s. Such targets can be used for high complexity experiments such as Inertial Fusion Energy (IFE) studies and can have many complex components that need assembling and characterisation with high precision. Using the techniques that are common to MEMS devices and integrating these with an existing target fabrication capability we are able to manufacture and deliver targets to these systems. It also enables us to manufacture novel targets that have not been possible using other techniques. In addition, developments in the positioning systems that are required to deliver these targets to the laser focus are also required, and a system to deliver the target to the focus of an F2 beam at 0.1 Hz is discussed.

  17. [Application of rational ant colony optimization to improve the reproducibility degree of laser three-dimensional copy].

    PubMed

    Cui, Xiao-Yan; Huo, Zhong-Gang; Xin, Zhong-Hua; Tian, Xiao; Zhang, Xiao-Dong

    2013-07-01

Three-dimensional (3D) copying of artificial ears and pistol printing are taking the laser three-dimensional copying technique to a new level. Laser three-dimensional scanning is a fresh field in laser applications and plays an irreplaceable part in three-dimensional copying; its accuracy is the highest among all present copying techniques. The reproducibility degree characterizes the geometric agreement of the copied object with the original, and is the most important performance index in laser three-dimensional copying. In the present paper, the error of laser three-dimensional copying was analyzed. The conclusion is that data processing of the laser-scanned point cloud is the key to reducing the error and increasing the reproducibility degree. The main innovation of this paper is that the rational ant colony optimization algorithm proposed by the authors, built on traditional ant colony optimization, was applied to laser three-dimensional copying as a new algorithm and put into practice. Compared with the customary algorithm, rational ant colony optimization shows distinct advantages in the data processing of laser three-dimensional copying, reducing the error and increasing the reproducibility degree of the copy.

  18. An evaluation of semi-automated methods for collecting ecosystem-level data in temperate marine systems.

    PubMed

    Griffin, Kingsley J; Hedge, Luke H; González-Rivero, Manuel; Hoegh-Guldberg, Ove I; Johnston, Emma L

    2017-07-01

Historically, marine ecologists have lacked efficient tools that are capable of capturing detailed species distribution data over large areas. Emerging technologies such as high-resolution imaging and associated machine-learning image-scoring software are providing new tools to map species over large areas in the ocean. Here, we combine a novel diver propulsion vehicle (DPV) imaging system with free-to-use machine-learning software to semi-automatically generate dense and widespread abundance records of a habitat-forming alga over ~5,000 m² of temperate reef. We employ replicable spatial techniques to test the effectiveness of traditional diver-based sampling, and to better understand the distribution and spatial arrangement of one key algal species. We found that the effectiveness of a traditional survey depended on the level of spatial structuring, and generally 10-20 transects (50 × 1 m) were required to obtain reliable results. This represents 2-20 times greater replication than in previous studies. Furthermore, we demonstrate the usefulness of fine-resolution distribution modeling for understanding patterns in canopy algae cover at multiple spatial scales, and discuss applications to other marine habitats. Our analyses demonstrate that semi-automated methods of data gathering and processing provide more accurate results than traditional methods for describing habitat structure at seascape scales, and therefore represent vastly improved techniques for understanding and managing marine seascapes.

  19. Insights into the microbial diversity and community dynamics of Chinese traditional fermented foods from using high-throughput sequencing approaches*

    PubMed Central

    He, Guo-qing; Liu, Tong-jie; Sadiq, Faizan A.; Gu, Jing-si; Zhang, Guo-hua

    2017-01-01

    Chinese traditional fermented foods have a very long history dating back thousands of years and have become an indispensable part of Chinese dietary culture. A plethora of research has been conducted to unravel the composition and dynamics of microbial consortia associated with Chinese traditional fermented foods using culture-dependent as well as culture-independent methods, like different high-throughput sequencing (HTS) techniques. These HTS techniques enable us to understand the relationship between a food product and its microbes to a greater extent than ever before. Considering the importance of Chinese traditional fermented products, the objective of this paper is to review the diversity and dynamics of microbiota in Chinese traditional fermented foods revealed by HTS approaches. PMID:28378567

  20. Impact energy and retained dose uniformity in enhanced glow discharge plasma immersion ion implantation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Q. Y.; Fu, Ricky K. Y.; Chu, Paul K.

    2009-08-10

The implantation energy and retained dose uniformity in enhanced glow discharge plasma immersion ion implantation (EGD-PIII) are investigated numerically and experimentally. Depth profiles obtained from different samples processed by EGD-PIII and traditional PIII are compared. The retained doses under different pulse widths are calculated by integrating the area under the depth profiles. Our results indicate that the improvement in the impact energy and retained dose uniformity by this technique is remarkable.

  1. A novel data reduction technique for single slanted hot-wire measurements used to study incompressible compressor tip leakage flows

    NASA Astrophysics Data System (ADS)

    Berdanier, Reid A.; Key, Nicole L.

    2016-03-01

    The single slanted hot-wire technique has been used extensively as a method for measuring three velocity components in turbomachinery applications. The cross-flow orientation of probes with respect to the mean flow in rotating machinery results in detrimental prong interference effects when using multi-wire probes. As a result, the single slanted hot-wire technique is often preferred. Typical data reduction techniques solve a set of nonlinear equations determined by curve fits to calibration data. A new method is proposed which utilizes a look-up table method applied to a simulated triple-wire sensor with application to turbomachinery environments having subsonic, incompressible flows. Specific discussion regarding corrections for temperature and density changes present in a multistage compressor application is included, and additional consideration is given to the experimental error which accompanies each data reduction process. Hot-wire data collected from a three-stage research compressor with two rotor tip clearances are used to compare the look-up table technique with the traditional nonlinear equation method. The look-up table approach yields velocity errors of less than 5 % for test conditions deviating by more than 20 °C from calibration conditions (on par with the nonlinear solver method), while requiring less than 10 % of the computational processing time.
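The look-up-table idea can be illustrated in miniature: precompute simulated sensor responses over a calibration grid, then invert a measurement by a nearest-residual search instead of solving a set of nonlinear calibration equations. The Jorgensen-style yaw response and King's-law constants below are illustrative assumptions, not the authors' calibration:

```python
import math

A, B, EXP, KTAN = 1.2, 0.8, 0.45, 0.2   # illustrative King's-law / yaw constants

def wire_voltage_sq(speed, angle, orientation):
    """Squared bridge voltage for a slanted wire at a given probe orientation.
    The effective cooling velocity uses a Jorgensen-style yaw response."""
    rel = angle - orientation
    u_eff = speed * math.sqrt(math.cos(rel) ** 2 + KTAN ** 2 * math.sin(rel) ** 2)
    return A + B * u_eff ** EXP          # King's law

ORIENTATIONS = [0.0, 2.0 * math.pi / 3.0, 4.0 * math.pi / 3.0]

def build_table(speeds, angles):
    """Precompute responses at all probe orientations over a (speed, angle) grid."""
    return [(u, th, [wire_voltage_sq(u, th, o) for o in ORIENTATIONS])
            for u in speeds for th in angles]

def lookup(table, measured):
    """Return the (speed, angle) grid point minimizing the response residual."""
    return min(table,
               key=lambda row: sum((m - e) ** 2 for m, e in zip(measured, row[2])))[:2]
```

The table search trades memory for the repeated nonlinear solves, which is the source of the order-of-magnitude reduction in processing time the abstract reports.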

  2. Cu doping concentration effect on the physical properties of CdS thin films obtained by the CBD technique

    NASA Astrophysics Data System (ADS)

    Albor Aguilera, M. L.; Flores Márquez, J. M.; Remolina Millan, A.; Matsumoto Kuwabara, Y.; González Trujillo, M. A.; Hernández Vásquez, C.; Aguilar Hernandez, J. R.; Hernández Pérez, M. A.; Courel-Piedrahita, M.; Madeira, H. T. Yee

    2017-08-01

Cu(In, Ga)Se2 (CIGS) and Cu2ZnSnS4 (CZTS) semiconductors are direct band gap materials; when these materials are used in solar cells, they provide efficiencies of 22.1% and 12.6%, respectively. Most traditional fabrication methods involve expensive vacuum processes, including co-evaporation and sputtering techniques, where film growth and doping are conducted separately. On the other hand, the chemical bath deposition (CBD) technique allows an in situ process. Cu-doped CdS thin films working as a buffer layer on solar cells provide well-performing devices, and they may be deposited by low-cost techniques such as chemical methods. In this work, Cu-doped CdS thin films were deposited using the CBD technique on SnO2:F (FTO) substrates. The elemental analysis and mapping reconstruction were conducted by EDXS. Morphological, optical and electrical properties were studied, and they revealed that Cu doping modified the CdS structure, band-gap value and electrical properties. Cu-doped CdS films show high resistivity compared to non-doped CdS. The appropriate parameters of Cu-doped CdS films were determined to obtain an adequate window or buffer layer on CIGS and CZTS photovoltaic solar cells.

  3. Comparative numerical study on the optimal vulcanization of rubber compounds through traditional curing and microwaves

    NASA Astrophysics Data System (ADS)

    Milani, Gabriele; Milani, Federico

    2012-12-01

The main problem in the industrial production of thick EPM/EPDM elements is the temperature difference between the internal (cooler) and external regions. While internal layers remain essentially under-vulcanized, the external coating is always over-vulcanized, resulting in an overall average tensile strength insufficient to permit use of the items in several applications where a certain level of performance is required. Possible ways to improve the output mechanical properties of the rubber include a careful calibration of exposure time and curing temperature in traditional heating, or vulcanization through innovative techniques such as microwaves. In the present paper, a comprehensive numerical model able to predict the optimized final mechanical properties of vulcanized 2D and 3D thick rubber items is presented and applied to a meaningful example of engineering interest. A detailed comparative numerical study is finally presented to establish the pros and cons of traditional vulcanization vs microwave curing.
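The core difficulty described above, interior layers lagging far behind the mold temperature, can be illustrated with a minimal explicit finite-difference model of 1D heat conduction in a slab (illustrative material constants, not the paper's EPDM cure model):

```python
def heat_slab(n=51, alpha=1e-7, thickness=0.02, t_mold=160.0, t0=20.0, t_end=60.0):
    """Explicit FTCS solution of dT/dt = alpha * d2T/dx2 on a slab whose two
    faces are held at the mold temperature. Returns the temperature profile.

    alpha     : thermal diffusivity, m^2/s (rubber is roughly 1e-7)
    thickness : slab thickness, m; t_end : simulated time, s
    """
    dx = thickness / (n - 1)
    dt = 0.4 * dx * dx / alpha          # stability requires dt <= 0.5*dx^2/alpha
    T = [t0] * n
    T[0] = T[-1] = t_mold               # faces clamped at mold temperature
    t = 0.0
    while t < t_end:
        Tn = T[:]
        for i in range(1, n - 1):
            Tn[i] = T[i] + alpha * dt / (dx * dx) * (T[i + 1] - 2 * T[i] + T[i - 1])
        T = Tn
        t += dt
    return T
```

After a minute of heating, the midplane of a 20 mm slab is still near its initial temperature while the near-surface layers are close to the mold temperature, which is precisely why surface layers over-cure while the core under-cures; microwave heating deposits energy volumetrically and reduces this gradient.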

  4. A case study on the labeling of bottarga produced in Sardinia from ovaries of grey mullets (Mugil cephalus and Mugil capurrii) caught in Eastern Central Atlantic coasts

    PubMed Central

    Piras, Pierluigi; Sardu, Francesco; Meloni, Domenico; Riina, Maria Vittoria; Beltramo, Chiara; Acutis, Pier Luigi

    2018-01-01

    The aim of this case study is to show how traditional and molecular methods can be employed to identify the Mugilidae species currently used in Sardinia (Italy) to produce the traditional bottarga for the processing of their ovaries. A total of six specimens of Mugil cephalus (n=3) and Mugil capurrii (n=3) were subjected to external morphology and meristic measurements. Subsequently, tissue samples of white muscle and ovaries from three individuals per species were underwent PCR-sequencing assay of mitochondrial DNA cytochrome oxidase subunit I (COI). The external morphology and meristic characters showed a sufficient level of reliability in the identification between the two species. At the same time, the molecular techniques showed the discriminatory power and confirmed the correct species identification in all the sampling units. DNA barcoding may be an effective aid to traditional taxonomy and can facilitate accurate species identification among the Mugilidae. PMID:29732322

  5. Overcoming rule-based rigidity and connectionist limitations through massively-parallel case-based reasoning

    NASA Technical Reports Server (NTRS)

    Barnden, John; Srinivas, Kankanahalli

    1990-01-01

    Symbol manipulation as used in traditional Artificial Intelligence has been criticized by neural net researchers for being excessively inflexible and sequential. On the other hand, the application of neural net techniques to the types of high-level cognitive processing studied in traditional artificial intelligence presents major problems as well. A promising way out of this impasse is to build neural net models that accomplish massively parallel case-based reasoning. Case-based reasoning, which has received much attention recently, is essentially the same as analogy-based reasoning, and avoids many of the problems leveled at traditional artificial intelligence. Further problems are avoided by doing many strands of case-based reasoning in parallel, and by implementing the whole system as a neural net. In addition, such a system provides an approach to some aspects of the problems of noise, uncertainty and novelty in reasoning systems. The current neural net system (Conposit), which performs standard rule-based reasoning, is being modified into a massively parallel case-based reasoning version.

  6. Micromechanical Structures Fabrication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajic, S

    2001-05-08

Work in materials other than silicon for MEMS applications has typically been restricted to metals and metal oxides instead of more ''exotic'' semiconductors. However, group III-V and II-VI semiconductors form a very important and versatile collection of material and electronic parameters available to the MEMS and MOEMS designer. With these materials, not only are the traditional mechanical material variables (thermal conductivity, thermal expansion, Young's modulus, etc.) available, but also chemical constituents can be varied in ternary and quaternary materials. This flexibility can be extremely important for both friction and chemical compatibility issues for MEMS. In addition, the ability to continually vary the bandgap energy can be particularly useful for many electronics and infrared detection applications. However, there are two major obstacles associated with alternate semiconductor material MEMS. The first issue is the actual fabrication of non-silicon micro-devices and the second impediment is communicating with these novel devices. We have implemented an essentially material independent fabrication method that is amenable to most group III-V and II-VI semiconductors. This technique uses a combination of non-traditional direct write precision fabrication processes such as diamond turning, ion milling, laser ablation, etc. This type of deterministic fabrication approach lends itself to an almost trivial assembly process. We also implemented a mechanical, electrical, and optical self-aligning hybridization technique for these alternate-material MEMS substrates.

  7. Novel Fabrication and Simple Hybridization of Exotic Material MEMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Datskos, P.G.; Rajic, S.

    1999-11-13

Work in materials other than silicon for MEMS applications has typically been restricted to metals and metal oxides instead of more ''exotic'' semiconductors. However, group III-V and II-VI semiconductors form a very important and versatile collection of material and electronic parameters available to the MEMS and MOEMS designer. With these materials, not only are the traditional mechanical material variables (thermal conductivity, thermal expansion, Young's modulus, etc.) available, but also chemical constituents can be varied in ternary and quaternary materials. This flexibility can be extremely important for both friction and chemical compatibility issues for MEMS. In addition, the ability to continually vary the bandgap energy can be particularly useful for many electronics and infrared detection applications. However, there are two major obstacles associated with alternate semiconductor material MEMS. The first issue is the actual fabrication of non-silicon devices and the second impediment is communicating with these novel devices. We will describe an essentially material independent fabrication method that is amenable to most group III-V and II-VI semiconductors. This technique uses a combination of non-traditional direct write precision fabrication processes such as diamond turning, ion milling, laser ablation, etc. This type of deterministic fabrication approach lends itself to an almost trivial assembly process. We will also describe in detail the mechanical, electrical, and optical self-aligning hybridization technique used for these alternate-material MEMS.

  8. Phase discrepancy induced from least squares wavefront reconstruction of wrapped phase measurements with high noise or large localized wavefront gradients

    NASA Astrophysics Data System (ADS)

    Steinbock, Michael J.; Hyde, Milo W.

    2012-10-01

Adaptive optics is used in applications such as laser communication, remote sensing, and laser weapon systems to estimate and correct for atmospheric distortions of propagated light in real-time. Within an adaptive optics system, a reconstruction process interprets the raw wavefront sensor measurements and calculates an estimate for the unwrapped phase function to be sent through a control law and applied to a wavefront correction device. This research is focused on adaptive optics using a self-referencing interferometer wavefront sensor, which directly measures the wrapped wavefront phase. Therefore, its measurements must be reconstructed for use on a continuous facesheet deformable mirror. In testing and evaluating a novel class of branch-point-tolerant wavefront reconstructors based on the post-processing congruence operation technique, an increase in Strehl ratio compared to a traditional least squares reconstructor was noted even in non-scintillated fields. To investigate this further, this paper uses wave-optics simulations to eliminate many of the variables from a hardware adaptive optics system, so as to focus on the reconstruction techniques alone. The simulation results along with a discussion of the physical reasoning for this phenomenon are provided. For any applications using a self-referencing interferometer wavefront sensor with low signal levels or high localized wavefront gradients, understanding this phenomenon is critical when applying a traditional least squares wavefront reconstructor.
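A minimal least-squares reconstructor of the kind used as the baseline here solves for the phase that best fits measured phase differences (a Hudgin-style geometry). This sketch uses numpy's least-squares solver on a small grid and is illustrative only, not the paper's reconstructor:

```python
import numpy as np

def lsq_reconstruct(gx, gy):
    """Least-squares phase reconstruction from x/y phase differences.

    gx[i, j] = phi[i, j+1] - phi[i, j]   (shape (n, n-1))
    gy[i, j] = phi[i+1, j] - phi[i, j]   (shape (n-1, n))
    Returns phi up to an arbitrary piston (mean removed).
    """
    n = gx.shape[0]
    rows, rhs = [], []
    for i in range(n):                     # one equation per x-difference
        for j in range(n - 1):
            r = np.zeros(n * n)
            r[i * n + j + 1], r[i * n + j] = 1.0, -1.0
            rows.append(r); rhs.append(gx[i, j])
    for i in range(n - 1):                 # one equation per y-difference
        for j in range(n):
            r = np.zeros(n * n)
            r[(i + 1) * n + j], r[i * n + j] = 1.0, -1.0
            rows.append(r); rhs.append(gy[i, j])
    phi, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    phi = phi.reshape(n, n)
    return phi - phi.mean()                # remove the unobservable piston
```

Because the difference operator annihilates branch points (hidden phase), a least-squares fit of this kind cannot represent them, which is why branch-point-tolerant reconstructors are needed in scintillated fields.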

  9. Line identification studies using traditional techniques and wavelength coincidence statistics

    NASA Technical Reports Server (NTRS)

    Cowley, Charles R.; Adelman, Saul J.

    1990-01-01

Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected spectrum that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results is to be expected. The dangers of small-number statistics are illustrated. WCS is at its best relative to traditional methods in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum.
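The WCS idea, counting coincidences between observed lines and laboratory wavelengths and judging significance against a random expectation, can be sketched as follows (synthetic wavelengths, purely illustrative):

```python
import bisect
import random

def coincidences(observed, lab_sorted, tol):
    """Count observed lines lying within tol of any laboratory wavelength."""
    hits = 0
    for w in observed:
        k = bisect.bisect_left(lab_sorted, w)
        near = [lab_sorted[m] for m in (k - 1, k) if 0 <= m < len(lab_sorted)]
        hits += any(abs(w - v) <= tol for v in near)
    return hits

def wcs_significance(observed, lab_sorted, tol, lo, hi, trials=500, seed=1):
    """Compare the observed coincidence count against a Monte Carlo null of
    randomly placed lines, as in wavelength coincidence statistics."""
    rng = random.Random(seed)
    obs = coincidences(observed, lab_sorted, tol)
    null = [coincidences([rng.uniform(lo, hi) for _ in observed], lab_sorted, tol)
            for _ in range(trials)]
    mean = sum(null) / trials
    var = sum((c - mean) ** 2 for c in null) / trials
    return obs, mean, var ** 0.5           # count, null mean, null std
```

An observed count many standard deviations above the null mean flags the species as present; the null mean also quantifies the "predictable number of spurious results" mentioned above.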

  10. Time-dependent spatial intensity profiles of near-infrared idler pulses from nanosecond optical parametric oscillators

    NASA Astrophysics Data System (ADS)

    Olafsen, L. J.; Olafsen, J. S.; Eaves, I. K.

    2018-06-01

    We report on an experimental investigation of the time-dependent spatial intensity distribution of near-infrared idler pulses from an optical parametric oscillator measured using an infrared (IR) camera, in contrast to beam profiles obtained using traditional knife-edge techniques. Comparisons show that the information gained by utilizing the thermal camera provides more detail than the spatially- or time-averaged measurements from a knife-edge profile. Synchronization, averaging, and thresholding techniques are applied to enhance the acquired images. The additional information obtained can improve the process by which semiconductor devices and other IR lasers are characterized for their beam quality and output response, and thereby result in IR devices with higher performance.
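    Of the enhancement steps mentioned, frame averaging and thresholding are straightforward to sketch: averaging synchronized frames suppresses uncorrelated noise, and thresholding removes residual background. A toy sketch with hypothetical 2x2 "frames" (illustrative only, not the authors' processing code):

```python
def enhance(frames, threshold):
    """Average a stack of synchronized frames to suppress uncorrelated
    noise, then zero out pixels below an intensity threshold."""
    n = len(frames)
    rows, cols = len(frames[0]), len(frames[0][0])
    avg = [[sum(f[i][j] for f in frames) / n for j in range(cols)]
           for i in range(rows)]
    return [[v if v >= threshold else 0.0 for v in row] for row in avg]

# Three noisy frames of the same pulse (hypothetical counts):
frames = [
    [[10.0, 1.0], [0.0, 9.0]],
    [[12.0, 0.0], [1.0, 11.0]],
    [[11.0, 2.0], [2.0, 10.0]],
]
img = enhance(frames, 5.0)
```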

  11. Requirements and principles for the implementation and construction of large-scale geographic information systems

    NASA Technical Reports Server (NTRS)

    Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.

    1987-01-01

    This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.

  12. Propagation Techniques and Agronomic Requirements for the Cultivation of Barbados Aloe (Aloe vera (L.) Burm. F.)—A Review

    PubMed Central

    Cristiano, Giuseppe; Murillo-Amador, Bernardo; De Lucia, Barbara

    2016-01-01

    Barbados aloe (Aloe vera (L.) Burm. F.) has traditionally been used for healing in natural medicine. However, aloe is now attracting great interest in the global market due to its bioactive chemicals, which are extracted from the leaves and used in industrial preparations for pharmaceutical, cosmetic, and food products. Aloe originated in tropical and sub-tropical Africa, but it is now also cultivated in warm climatic areas of Asia, Europe, and America. In this review, the most important factors affecting aloe production are described. We focus on propagation techniques, sustainable agronomic practices and efficient post-harvest and processing systems. PMID:27721816

  13. The Application of Collaborative Business Intelligence Technology in the Hospital SPD Logistics Management Model.

    PubMed

    Liu, Tongzhu; Shen, Aizong; Hu, Xiaojian; Tong, Guixian; Gu, Wei

    2017-06-01

    We aimed to apply a collaborative business intelligence (BI) system to the hospital supply, processing and distribution (SPD) logistics management model. We searched the Engineering Village database, China National Knowledge Infrastructure (CNKI) and Google for articles (published from 2011 to 2016), books, Web pages, etc., to understand SPD and BI related theories and recent research status. We applied collaborative BI technology to the hospital SPD logistics management model by leveraging data mining techniques to discover knowledge from complex data and collaborative techniques to improve business processes. For the application of the BI system, we: (i) proposed a layered structure of a collaborative BI system for intelligent management in hospital logistics; (ii) built a data warehouse for the collaborative BI system; (iii) improved data mining techniques such as support vector machines (SVM) and the swarm intelligence firefly algorithm to solve key problems in the hospital logistics collaborative BI system; (iv) researched collaborative techniques oriented to data and business process optimization to improve the business processes of hospital logistics management. A proper combination of the SPD model and the BI system will improve the management of logistics in hospitals. The successful implementation of the study requires: (i) innovating and improving the traditional SPD model and making appropriate implementation plans and schedules for the application of the BI system according to the actual situations of hospitals; (ii) the collaborative participation of internal hospital departments, including the information, logistics, nursing, medical and financial departments; (iii) timely response of external suppliers.

  14. Micromechanical Machining Processes and their Application to Aerospace Structures, Devices and Systems

    NASA Technical Reports Server (NTRS)

    Friedrich, Craig R.; Warrington, Robert O.

    1995-01-01

    Micromechanical machining processes are those microfabrication techniques which directly remove workpiece material by either a physical cutting tool or an energy process. These processes are direct, and therefore they can help reduce the cost and time for prototype development of micromechanical components and systems. This is especially true for aerospace applications, where size and weight are critical, and reliability and the operating environment are an integral part of the design and development process. The micromechanical machining processes are rapidly being recognized as a complementary set of tools to traditional lithographic processes (such as LIGA) for the fabrication of micromechanical components. Worldwide efforts in the U.S., Germany, and Japan are leading to results which sometimes rival lithography at a fraction of the time and cost. Efforts to develop processes and systems specific to aerospace applications are well underway.

  15. Regularized Embedded Multiple Kernel Dimensionality Reduction for Mine Signal Processing.

    PubMed

    Li, Shuang; Liu, Bing; Zhang, Chen

    2016-01-01

    Traditional multiple kernel dimensionality reduction models are generally based on graph embedding and the manifold assumption. But such an assumption might be invalid for some high-dimensional or sparse data due to the curse of dimensionality, which has a negative influence on the performance of multiple kernel learning. In addition, some models might be ill-posed if the rank of the matrices in their objective functions is not high enough. To address these issues, we extend the traditional graph embedding framework and propose a novel regularized embedded multiple kernel dimensionality reduction method. Different from the conventional convex relaxation technique, the proposed algorithm directly takes advantage of a binary search and an alternative optimization scheme to obtain optimal solutions efficiently. The experimental results demonstrate the effectiveness of the proposed method for supervised, unsupervised, and semisupervised scenarios.
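    The starting point of any multiple kernel method is a weighted combination of base Gram matrices, K = Σ_m w_m K_m, whose weights the learning algorithm then optimizes. A minimal sketch of forming such a combined kernel (the base kernels, weights, and data below are illustrative; the paper's regularized embedding and binary-search optimization are not reproduced here):

```python
import math

def linear_kernel(x, y):
    return sum(a * b for a, b in zip(x, y))

def rbf_kernel(x, y, gamma=0.5):
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * d2)

def combined_gram(X, kernels, weights):
    """Weighted combination of base kernels, K = sum_m w_m * K_m:
    the Gram matrix that multiple kernel learning builds on."""
    n = len(X)
    K = [[0.0] * n for _ in range(n)]
    for w, k in zip(weights, kernels):
        for i in range(n):
            for j in range(n):
                K[i][j] += w * k(X[i], X[j])
    return K

X = [[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]]
K = combined_gram(X, [linear_kernel, rbf_kernel], [0.3, 0.7])
```

With nonnegative weights the combination stays symmetric and positive semidefinite, so it remains a valid kernel for any downstream embedding.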

  16. Manufacturing PDMS micro lens array using spin coating under a multiphase system

    NASA Astrophysics Data System (ADS)

    Sun, Rongrong; Yang, Hanry; Rock, D. Mitchell; Danaei, Roozbeh; Panat, Rahul; Kessler, Michael R.; Li, Lei

    2017-05-01

    The development of micro lens arrays has garnered much interest due to increased demand of miniaturized systems. Traditional methods for manufacturing micro lens arrays have several shortcomings. For example, they require expensive facilities and long lead time, and traditional lens materials (i.e. glass) are typically heavy, costly and difficult to manufacture. In this paper, we explore a method for manufacturing a polydimethylsiloxane (PDMS) micro lens array using a simple spin coating technique. The micro lens array, formed under an interfacial tension dominated system, and the influence of material properties and process parameters on the fabricated lens shape are examined. The lenses fabricated using this method show comparable optical properties—including surface finish and image quality—with a reduced cost and manufacturing lead time.

  17. A review of automated image understanding within 3D baggage computed tomography security screening.

    PubMed

    Mouton, Andre; Breckon, Toby P

    2015-01-01

    Baggage inspection is the principal safeguard against the transportation of prohibited and potentially dangerous materials at airport security checkpoints. Although traditionally performed by 2D X-ray based scanning, increasingly stringent security regulations have led to a growing demand for more advanced imaging technologies. The role of X-ray Computed Tomography is thus rapidly expanding beyond the traditional materials-based detection of explosives. The development of computer vision and image processing techniques for the automated understanding of 3D baggage-CT imagery is, however, complicated by poor image resolution, image clutter and high levels of noise and artefacts. We discuss the recent and most pertinent advancements and identify topics for future research within the challenging domain of automated image understanding for baggage security screening CT.

  18. Efficacy of the core DNA barcodes in identifying processed and poorly conserved plant materials commonly used in South African traditional medicine

    PubMed Central

    Mankga, Ledile T.; Yessoufou, Kowiyou; Moteetee, Annah M.; Daru, Barnabas H.; van der Bank, Michelle

    2013-01-01

    Medicinal plants cover a broad range of taxa, which may be phylogenetically less related but morphologically very similar. Such morphological similarity between species may lead to misidentification and inappropriate use. Also, the substitution of a medicinal plant by a cheaper alternative (e.g. other non-medicinal plant species), either due to misidentification or deliberately to cheat consumers, is an issue of growing concern. In this study, we used DNA barcoding to identify commonly used medicinal plants in South Africa. Using the core plant barcodes, matK and rbcLa, obtained from processed and poorly conserved materials sold at the muthi traditional medicine market, we tested the efficacy of the barcodes in species discrimination. Based on genetic divergence, PCR amplification efficiency and the BLAST algorithm, we revealed varied discriminatory potential for the DNA barcodes. In general, the barcodes exhibited high discriminatory power, indicating their effectiveness in verifying the identity of the most common plant species traded in South African medicinal markets. The BLAST algorithm successfully matched 61% of the queries against a reference database, suggesting that most of the information supplied by sellers at traditional medicinal markets in South Africa is correct. Our findings reinforce the utility of the DNA barcoding technique in limiting false identification that can harm public health. PMID:24453559
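    A common first step in barcode-based identification is computing the genetic divergence between a query sequence and each reference, then assigning the query to its nearest reference. A toy sketch using the uncorrected p-distance (the sequences and species names are hypothetical, not the study's data):

```python
def p_distance(a, b):
    """Uncorrected p-distance between two aligned sequences of equal
    length: the fraction of compared sites that differ (gaps '-' are
    skipped)."""
    pairs = [(x, y) for x, y in zip(a, b) if x != '-' and y != '-']
    return sum(x != y for x, y in pairs) / len(pairs)

def best_match(query, reference_db):
    """Assign the query to the reference with the smallest distance."""
    return min(reference_db, key=lambda name: p_distance(query, reference_db[name]))

refs = {  # hypothetical rbcLa fragments
    "Species A": "ATGGCTTACCGT",
    "Species B": "ATGGATTACGGT",
}
query = "ATGGCTTACCGA"  # one substitution away from Species A
match = best_match(query, refs)
```

Real barcoding pipelines compare against curated databases (e.g. via BLAST) and use model-corrected distances, but the nearest-reference logic is the same.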

  19. Traditional and Constructivist Teaching Techniques: Comparing Two Groups of Undergraduate Nonscience Majors in a Biology Lab

    ERIC Educational Resources Information Center

    Travis, Holly; Lord, Thomas

    2004-01-01

    Constructivist teaching techniques work well in various instructional settings, but many teachers remain skeptical because there is a lack of quantitative data supporting this model. This study compared an undergraduate nonmajors biology lab section taught in a traditional teacher-centered style to a similar section taught as a constructivist…

  20. Electromyographic evaluation in children orthodontically treated for skeletal Class II malocclusion: Comparison of two treatment techniques.

    PubMed

    Ortu, Eleonora; Pietropaoli, Davide; Adib, Fray; Masci, Chiara; Giannoni, Mario; Monaco, Annalisa

    2017-11-16

    Objective: To compare the clinical efficacy of two techniques for fabricating a Bimler device by assessing the patient's surface electromyography (sEMG) activity at rest before treatment and six months after treatment. Methods: Twenty-four patients undergoing orthodontic treatment were enrolled in the study; 12 formed the test group and wore a Bimler device fabricated with a Myoprint impression using the neuromuscular orthodontic technique, and 12 formed the control group and were treated by the traditional orthodontic technique with a wax bite in protrusion. The "rest" sEMG of each patient was recorded prior to treatment and six months after treatment. Results: The neuromuscular-designed Bimler device was more comfortable and provided better treatment results than the traditional Bimler device. Conclusion: This study suggests that the patient group subjected to neuromuscular orthodontic treatment had a treatment outcome with more relaxed masticatory muscles and better function versus the traditional orthodontic treatment.

  1. Rapid Separation of Bacteria from Blood—Review and Outlook

    PubMed Central

    Alizadeh, Mahsa; Husseini, Ghaleb A.; McClellan, Daniel S.; Buchanan, Clara M.; Bledsoe, Colin G.; Robison, Richard A.; Blanco, Rae; Roeder, Beverly L.; Melville, Madison; Hunter, Alex K.

    2017-01-01

    The high morbidity and mortality rate of bloodstream infections involving antibiotic-resistant bacteria necessitate a rapid identification of the infectious organism and its resistance profile. Traditional methods based on culturing the blood typically require at least 24 h, and genetic amplification by PCR in the presence of blood components has been problematic. The rapid separation of bacteria from blood would facilitate their genetic identification by PCR or other methods so that the proper antibiotic regimen can quickly be selected for the septic patient. Microfluidic systems that separate bacteria from whole blood have been developed, but these are designed to process only microliter quantities of whole blood or only highly diluted blood. However, symptoms of clinical blood infections can manifest at bacterial burdens perhaps as low as 10 CFU/mL, and thus milliliter quantities of blood must be processed to collect enough bacteria for reliable genetic analysis. This review considers the advantages and shortcomings of various methods to separate bacteria from blood, with emphasis on techniques that can be done in less than 10 min on milliliter quantities of whole blood. These techniques include filtration, screening, centrifugation, sedimentation, hydrodynamic focusing, chemical capture on surfaces or beads, field-flow fractionation, and dielectrophoresis. The techniques with the most promise include screening, sedimentation, and magnetic bead capture, as they allow large quantities of blood to be processed quickly. Some microfluidic techniques can be scaled up. PMID:27160415

  2. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.
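    A state-transition model of the kind hand-executed in this study can be captured as a transition table mapping (state, event) pairs to next states; "executing" the model is then folding an event sequence through the table. A minimal sketch with hypothetical states and events (not the actual Ward/Mellor models from the project):

```python
# Transition table: (state, event) -> next state. The states and events
# are hypothetical stand-ins for a controller's behavioral model.
TRANSITIONS = {
    ("idle", "start"): "running",
    ("running", "stop"): "idle",
    ("running", "fault"): "safe_stop",
    ("safe_stop", "reset"): "idle",
}

def execute(state, events):
    """Hand-execution of the state model: fold a sequence of events
    through the transition table, rejecting undefined transitions."""
    trace = [state]
    for e in events:
        if (state, e) not in TRANSITIONS:
            raise ValueError(f"undefined transition: {state!r} on {e!r}")
        state = TRANSITIONS[(state, e)]
        trace.append(state)
    return trace
```

Walking test scenarios through such a table is a cheap way to check that the specified behavior matches the as-built software, which is essentially what the report describes doing by hand.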

  3. Application of gray level mapping in computed tomographic colonography: a pilot study to compare with traditional surface rendering method for identification and differentiation of endoluminal lesions

    PubMed Central

    Chen, Lih-Shyang; Hsu, Ta-Wen; Chang, Shu-Han; Lin, Chih-Wen; Chen, Yu-Ruei; Hsieh, Chin-Chiang; Han, Shu-Chen; Chang, Ku-Yaw; Hou, Chun-Ju

    2017-01-01

    Objective: In traditional surface rendering (SR) computed tomographic endoscopy, only the shape of an endoluminal lesion is depicted, without gray-level information, unless the volume rendering technique is used. However, volume rendering is relatively slow and complex in terms of computation time and parameter setting. We use computed tomographic colonography (CTC) images as examples and report a new visualization technique using three-dimensional gray level mapping (GM) to better identify and differentiate endoluminal lesions. Methods: Thirty-three endoluminal cases from 30 patients were evaluated in this clinical study. These cases were segmented using a gray-level threshold. The marching cubes algorithm was used to detect isosurfaces in the volumetric data sets. GM is applied using the surface gray level of CTC. Radiologists conducted the clinical evaluation of the SR and GM images. The Wilcoxon signed-rank test was used for data analysis. Results: Clinical evaluation confirms GM is significantly superior to SR in terms of gray-level pattern and spatial shape presentation of endoluminal cases (p < 0.01) and significantly improves the confidence of identification and clinical classification of endoluminal lesions (p < 0.01). The specificity and diagnostic accuracy of GM are significantly better than those of SR in diagnostic performance evaluation (p < 0.01). Conclusion: GM can reduce confusion in three-dimensional CTC and correlates CTC well with sectional images by location as well as gray-level value. Hence, GM improves identification and differentiation of endoluminal lesions and facilitates the diagnostic process. Advances in knowledge: GM significantly improves the traditional SR method by providing reliable gray-level information for the surface points and is helpful in identification and differentiation of endoluminal lesions according to their shape and density. PMID:27925483
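    The segmentation step described in the Methods, thresholding the volume by gray level before isosurface extraction, can be illustrated on a single slice. A toy sketch with hypothetical Hounsfield-like values (the threshold and data are illustrative only, not the study's parameters):

```python
def segment(slice_hu, threshold):
    """Binary segmentation of a CT slice by gray-level threshold, as
    done before surface extraction: 1 where the value reaches the
    threshold (tissue), 0 elsewhere (air-filled lumen)."""
    return [[1 if v >= threshold else 0 for v in row] for row in slice_hu]

# Toy 4x4 slice: a small soft-tissue region surrounded by lumen air.
slice_hu = [
    [-900, -880, -870, -900],
    [-900,   40,   55, -890],
    [-880,   60,   45, -900],
    [-900, -870, -860, -880],
]
mask = segment(slice_hu, -500)
```

In the paper's pipeline, an isosurface is then extracted from the mask boundary with marching cubes, and GM colors each surface point by the underlying gray value instead of discarding it as SR does.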

  4. Meliponiculture in Quilombola communities of Ipiranga and Gurugi, Paraíba state, Brazil: an ethnoecological approach

    PubMed Central

    2014-01-01

    Background The Quilombola communities of Ipiranga and Gurugi, located in the Atlantic Rainforest in the south of Paraíba state, have stories that are interwoven throughout time. The practice of meliponiculture has been carried out for generations in these social groups and provides an elaborate body of ecological knowledge based on native stingless bees, the melliferous flora and the management techniques used. The traditional knowledge that the Quilombola have of stingless bees is of utmost importance for the establishment of conservation strategies for many species. Methods To deepen the study of the beekeepers' ecological knowledge, participant observation was used together with structured and semi-structured interviews, as well as the collection of the entomological and botanical categories of bees and plants mentioned. With the aim of recording the knowledge related to the meliponiculture previously practiced by the residents, the oral history method was employed. Results and discussion Results show that the informants sampled possess knowledge of twelve categories of stingless bees (Apidae: Meliponini), classified according to morphological, behavioral and ecological characteristics. Their management techniques are represented by the making of the traditional cortiço, and the melliferous flora is composed of many species predominant in the Atlantic Rainforest. From recording the memories and recollections of the individuals, it was observed that an intricate system of beliefs has permeated the keeping of uruçu bees (Melipona scutellaris) for generations. Conclusion According to the management techniques used by beekeepers, the keeping of stingless bees in the communities is considered a traditional activity that is embedded within a network of ecological knowledge and beliefs accumulated by generations over time, and is undergoing a process of transformation that provides new meanings to such knowledge, as can be observed in the practices of young people.
PMID:24410767

  5. Modeling and Hazard Analysis Using STPA

    NASA Astrophysics Data System (ADS)

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, which describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as that of FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis does. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process, where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique.
Because the HTV was originally developed using fault tree analysis and following the NASA standards for safety-critical systems, the results of our experimental application of STPA can be compared with these more traditional safety engineering approaches in terms of the problems identified and the resources required to use it.

  6. Alternative oil extraction methods from Echium plantagineum L. seeds using advanced techniques and green solvents.

    PubMed

    Castejón, Natalia; Luna, Pilar; Señoráns, Francisco J

    2018-04-01

    The edible oil processing industry involves large losses of organic solvent into the atmosphere and long extraction times. In this work, fast and environmentally friendly alternatives for the production of echium oil using green solvents are proposed. Advanced extraction techniques such as Pressurized Liquid Extraction (PLE), Microwave Assisted Extraction (MAE) and Ultrasound Assisted Extraction (UAE) were evaluated to efficiently extract omega-3 rich oil from Echium plantagineum seeds. Extractions were performed with ethyl acetate, ethanol, water and ethanol:water to develop a hexane-free processing method. Optimal PLE conditions with ethanol at 150 °C for 10 min produced a very similar oil yield (31.2%) to Soxhlet extraction using hexane for 8 h (31.3%). The optimized UAE method with ethanol under mild conditions (55 °C) produced a high oil yield (29.1%). Consequently, the advanced extraction techniques showed good lipid yields, and furthermore, the echium oil produced had the same omega-3 fatty acid composition as traditionally extracted oil. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Industrial and occupational ergonomics in the petrochemical process industry: a regression trees approach.

    PubMed

    Bevilacqua, M; Ciarapica, F E; Giacchetta, G

    2008-07-01

    This work is an attempt to apply classification tree methods to data regarding accidents in a medium-sized refinery, so as to identify important relationships between the variables, which can be considered as decision-making rules when adopting any measures for improvement. The results obtained using the CART (Classification And Regression Trees) method proved to be the most precise and, in general, they are encouraging concerning the use of tree diagrams as preliminary explorative techniques for the assessment of the ergonomic, management and operational parameters which influence high accident risk situations. The occupational injury analysis carried out in this paper was planned as a dynamic process and can be repeated systematically. The CART technique, which considers a very wide set of objective and predictive variables, shows new cause-effect correlations in occupational safety which had never been previously described, highlighting possible injury risk groups and supporting decision-making in these areas. The use of classification trees must not, however, be seen as an attempt to supplant other techniques, but as a complementary method which can be integrated into traditional types of analysis.
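    CART grows a tree by repeatedly choosing the split that most reduces impurity, typically measured by the Gini index. A minimal sketch of scoring one binary split on a numeric feature (the accident records and feature name are hypothetical, not the refinery data):

```python
def gini(labels):
    """Gini impurity of a multiset of class labels."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(rows, labels, feature):
    """Best binary split on one numeric feature, chosen by the weighted
    Gini impurity of the two child nodes (the CART criterion)."""
    best = None
    values = sorted({r[feature] for r in rows})
    for t in values[:-1]:  # candidate thresholds
        left = [l for r, l in zip(rows, labels) if r[feature] <= t]
        right = [l for r, l in zip(rows, labels) if r[feature] > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(rows)
        if best is None or score < best[1]:
            best = (t, score)
    return best

# Hypothetical accident records: hours into shift -> injury severity.
rows = [{"hours": 1}, {"hours": 2}, {"hours": 7}, {"hours": 9}]
labels = ["minor", "minor", "severe", "severe"]
threshold, impurity = best_split(rows, labels, "hours")
```

A full CART implementation applies this search recursively over all features and then prunes; here the split at 2 hours separates the classes perfectly, giving zero impurity.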

  8. The Use of MMF Screws: Surgical Technique, Indications, Contraindications, and Common Problems in Review of the Literature

    PubMed Central

    Cornelius, Carl-Peter; Ehrenfeld, Michael

    2010-01-01

    Mandibulo-maxillary fixation (MMF) screws are inserted into the bony base of both jaws in the process of fracture realignment and immobilisation. The screw heads act as anchor points to fasten wire loops or rubber bands connecting the mandible to the maxilla. Traditional interdental chain-linked wiring or arch bar techniques provide the anchorage by attached cleats, hooks, or eyelets. In comparison to these tooth-borne appliances MMF screws facilitate and shorten the way to achieve intermaxillary fixation considerably. In addition, MMF screws help to reduce the hazards of glove perforation and wire stick injuries. On the downside, MMF screws are attributed with the risk of tooth root damage and a lack of versatility beyond the pure maintenance of occlusion such as stabilizing loose teeth or splinting fragments of the alveolar process. The surgical technique of MMF screws as well as the pros and cons of the clinical application are reviewed. The adequate screw placement to prevent serious tooth root injuries is still an issue to rethink and modify conceptual guidelines. PMID:22110819

  9. Additive Manufacturing Design Considerations for Liquid Engine Components

    NASA Technical Reports Server (NTRS)

    Whitten, Dave; Hissam, Andy; Baker, Kevin; Rice, Darron

    2014-01-01

    The Marshall Space Flight Center's Propulsion Systems Department has gained significant experience in the last year designing, building, and testing liquid engine components using additive manufacturing. The department has developed valve, duct, turbo-machinery, and combustion device components using this technology. Many valuable lessons were learned during this process. These lessons will be the focus of this presentation. We will present criteria for selecting part candidates for additive manufacturing. Some part characteristics are 'tailor made' for this process. Selecting the right parts for the process is the first step to maximizing productivity gains. We will also present specific lessons we learned about feature geometry that can and cannot be produced using additive manufacturing machines. Most liquid engine components were made using a two-step process. The base part was made using additive manufacturing and then traditional machining processes were used to produce the final part. The presentation will describe design accommodations needed to make the base part and lessons we learned about which features could be built directly and which require the final machine process. Tolerance capabilities, surface finish, and material thickness allowances will also be covered. Additive Manufacturing can produce internal passages that cannot be made using traditional approaches. It can also eliminate a significant amount of manpower by reducing part count and leveraging model-based design and analysis techniques. Information will be shared about performance enhancements and design efficiencies we experienced for certain categories of engine parts.

  10. Laser-induced Forward Transfer of Ag Nanopaste.

    PubMed

    Breckenfeld, Eric; Kim, Heungsoo; Auyeung, Raymond C Y; Piqué, Alberto

    2016-03-31

    Over the past decade, there has been much development of non-lithographic methods(1-3) for printing metallic inks or other functional materials. Many of these processes such as inkjet(3) and laser-induced forward transfer (LIFT)(4) have become increasingly popular as interest in printable electronics and maskless patterning has grown. These additive manufacturing processes are inexpensive, environmentally friendly, and well suited for rapid prototyping, when compared to more traditional semiconductor processing techniques. While most direct-write processes are confined to two-dimensional structures and cannot handle materials with high viscosity (particularly inkjet), LIFT can transcend both constraints if performed properly. Congruent transfer of three dimensional pixels (called voxels), also referred to as laser decal transfer (LDT)(5-9), has recently been demonstrated with the LIFT technique using highly viscous Ag nanopastes to fabricate freestanding interconnects, complex voxel shapes, and high-aspect-ratio structures. In this paper, we demonstrate a simple yet versatile process for fabricating a variety of micro- and macroscale Ag structures. Structures include simple shapes for patterning electrical contacts, bridging and cantilever structures, high-aspect-ratio structures, and single-shot, large area transfers using a commercial digital micromirror device (DMD) chip.

  11. Laser-induced Forward Transfer of Ag Nanopaste

    PubMed Central

    Breckenfeld, Eric; Kim, Heungsoo; Auyeung, Raymond C. Y.; Piqué, Alberto

    2016-01-01

    Over the past decade, there has been much development of non-lithographic methods (1-3) for printing metallic inks or other functional materials. Many of these processes such as inkjet (3) and laser-induced forward transfer (LIFT) (4) have become increasingly popular as interest in printable electronics and maskless patterning has grown. These additive manufacturing processes are inexpensive, environmentally friendly, and well suited for rapid prototyping, when compared to more traditional semiconductor processing techniques. While most direct-write processes are confined to two-dimensional structures and cannot handle materials with high viscosity (particularly inkjet), LIFT can transcend both constraints if performed properly. Congruent transfer of three dimensional pixels (called voxels), also referred to as laser decal transfer (LDT) (5-9), has recently been demonstrated with the LIFT technique using highly viscous Ag nanopastes to fabricate freestanding interconnects, complex voxel shapes, and high-aspect-ratio structures. In this paper, we demonstrate a simple yet versatile process for fabricating a variety of micro- and macroscale Ag structures. Structures include simple shapes for patterning electrical contacts, bridging and cantilever structures, high-aspect-ratio structures, and single-shot, large area transfers using a commercial digital micromirror device (DMD) chip. PMID:27077645

  12. Image processing for safety assessment in civil engineering.

    PubMed

    Ferrer, Belen; Pomares, Juan C; Irles, Ramon; Espinosa, Julian; Mas, David

    2013-06-20

    Behavior analysis of construction safety systems is of fundamental importance to avoid accidental injuries. Traditionally, measurements of dynamic actions in civil engineering have been made with accelerometers, but high-speed cameras and image processing techniques can play an important role in this area. Here, we propose using morphological image filtering and the Hough transform on high-speed video sequences as tools for dynamic measurements in this field. The presented method is applied to obtain the trajectory and acceleration of a cylindrical ballast falling from a building and trapped by a thread net. Results show that safety recommendations given in construction codes can be potentially dangerous for workers.
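    The measurement chain the abstract describes (isolate the object in each frame, then differentiate its trajectory) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the synthetic frames, the simple global threshold standing in for the morphological filtering, and the pixel units are all assumptions.

```python
import numpy as np

def track_centroids(frames, thresh=0.5):
    # Binarize each frame and take the centroid of the bright blob
    pts = []
    for f in frames:
        ys, xs = np.nonzero(f > thresh)
        pts.append((xs.mean(), ys.mean()))
    return np.array(pts)

def acceleration(pts, fps):
    # Second finite difference of position -> acceleration (px/s^2)
    dt = 1.0 / fps
    return np.diff(pts, n=2, axis=0) / dt**2

# Synthetic falling object: blob at row k**2 in frame k,
# i.e. a constant acceleration of 2 px/frame^2
frames = []
for k in range(5):
    img = np.zeros((64, 64))
    img[k**2:k**2 + 2, 31:33] = 1.0
    frames.append(img)

pts = track_centroids(frames)
acc = acceleration(pts, fps=1.0)  # columns: (x, y) acceleration
```

With real footage one would replace the synthetic frames by video frames and convert pixels to metres with a known scale before comparing against the expected gravitational acceleration.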

  13. Signal Processing Methods for Liquid Rocket Engine Combustion Spontaneous Stability and Rough Combustion Assessments

    NASA Technical Reports Server (NTRS)

    Kenny, R. Jeremy; Casiano, Matthew; Fischbach, Sean; Hulka, James R.

    2012-01-01

    Liquid rocket engine combustion stability assessments are traditionally broken into three categories: dynamic stability, spontaneous stability, and rough combustion. This work focuses on comparing the spontaneous stability and rough combustion assessments for several liquid engine programs. The techniques used are those developed at Marshall Space Flight Center (MSFC) for the J-2X Workhorse Gas Generator program. Stability assessment data from the Integrated Powerhead Demonstrator (IPD), FASTRAC, and Common Extensible Cryogenic Engine (CECE) programs are compared against previously processed J-2X Gas Generator data. Prior metrics for spontaneous stability assessments are updated based on the compilation of all data sets.

  14. Texture Feature Extraction and Classification for Iris Diagnosis

    NASA Astrophysics Data System (ADS)

    Ma, Lin; Li, Naimin

    Applying computer-aided techniques in iris image processing, and combining occidental iridology with traditional Chinese medicine, is a challenging research area in digital image processing and artificial intelligence. This paper proposes an iridology model that consists of iris image pre-processing, texture feature analysis and disease classification. For pre-processing, a 2-step iris localization approach is proposed; a 2-D Gabor filter based texture analysis and a texture fractal dimension estimation method are proposed for pathological feature extraction; and finally, support vector machines are constructed to recognize 2 typical diseases, alimentary canal disease and nerve system disease. Experimental results show that the proposed iridology diagnosis model is quite effective and promising for medical diagnosis and health surveillance for both hospital and public use.
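    As a concrete illustration of the 2-D Gabor filtering step, a minimal kernel construction is sketched below; the size, wavelength and orientation values are arbitrary assumptions, not the paper's parameters.

```python
import numpy as np

def gabor_kernel(size=15, sigma=3.0, theta=0.0, lam=6.0, psi=0.0):
    # Real part of a 2-D Gabor filter: an isotropic Gaussian envelope
    # times an oriented sinusoidal carrier. Convolving an iris patch
    # with a bank of such kernels (varying theta, lam) yields
    # orientation-selective texture features.
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return (np.exp(-(x**2 + y**2) / (2 * sigma**2))
            * np.cos(2 * np.pi * xr / lam + psi))

k = gabor_kernel()  # 15x15 kernel, unit response at the centre
```

A filter bank would typically sweep theta over several orientations and feed the per-orientation response energies to the classifier.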

  15. Safety Assessment of Electronic Cigarettes and Their Relationship with Cardiovascular Disease

    PubMed Central

    Zhang, Guangwei; Zhang, Kai; Hou, Rui; Xing, Chunli; Yu, Qi; Liu, Enqi

    2018-01-01

    Smoking leads to the occurrence and development of a variety of diseases. Most importantly, it is an independent risk factor for cardiovascular atherosclerosis. In recent years, electronic cigarettes have become a popular alternative to traditional cigarettes, since modern micro-electronic techniques make it possible to simulate the process of traditional smoking; they are also convenient and fashionable. Nevertheless, opinions about the safety of electronic cigarettes remain controversial. Although research on electronic cigarettes has increased exponentially, there has been no systematic study of their safety. The aim of the current study is to review the literature on the safety of electronic cigarettes, and to understand their hazards and disadvantages. It was found that most of the current research on electronic cigarettes comprises short-term and in vitro studies; there are few in vivo, long-term studies. Notably, the levels of harmful components such as volatile organic compounds, tobacco-specific nitrosamines and heavy metals in electronic cigarettes are even higher than in traditional cigarettes. Therefore, the harm of electronic cigarettes should not be underestimated. In conclusion, the question of whether electronic cigarettes are a safe and sufficient substitute for traditional smoking needs further investigation. PMID:29304018

  16. Acceleration of atmospheric Cherenkov telescope signal processing to real-time speed with the Auto-Pipe design system

    NASA Astrophysics Data System (ADS)

    Tyson, Eric J.; Buckley, James; Franklin, Mark A.; Chamberlain, Roger D.

    2008-10-01

    The imaging atmospheric Cherenkov technique for high-energy gamma-ray astronomy is emerging as an important new technique for studying the high energy universe. Current experiments have data rates of ≈20 TB/year and duty cycles of about 10%. In the future, more sensitive experiments may produce up to 1000 TB/year. The data analysis task for these experiments requires keeping up with this data rate in close to real-time. Such data analysis is a classic example of a streaming application with very high performance requirements. This class of application often benefits greatly from the use of non-traditional approaches for computation, including special purpose hardware (FPGAs and ASICs) or sophisticated parallel processing techniques. However, designing, debugging, and deploying to these architectures is difficult, and thus they are not widely used by the astrophysics community. This paper presents the Auto-Pipe design toolset that has been developed to address many of the difficulties in taking advantage of complex streaming computer architectures for such applications. Auto-Pipe incorporates a high-level coordination language, functional and performance simulation tools, and the ability to deploy applications to sophisticated architectures. Using the Auto-Pipe toolset, we have implemented the front-end portion of an imaging Cherenkov data analysis application, suitable for real-time or offline analysis. The application operates on data from the VERITAS experiment, and shows how Auto-Pipe can greatly ease performance optimization and application deployment on a wide variety of platforms. We demonstrate a performance improvement over a traditional software approach of 32x using an FPGA solution and 3.6x using a multiprocessor based solution.

  17. Development of a framework for resilience measurement: Suggestion of fuzzy Resilience Grade (RG) and fuzzy Resilience Early Warning Grade (REWG).

    PubMed

    Omidvar, Mohsen; Mazloumi, Adel; Mohammad Fam, Iraj; Nirumand, Fereshteh

    2017-01-01

    Resilience engineering (RE) can be an alternative technique to the traditional risk assessment and management techniques, to predict and manage safety conditions of modern socio-technical organizations. While traditional risk management approaches are retrospective and highlight error calculation and computation of malfunction possibilities, resilience engineering seeks ways to improve capacity at all levels of organizations in order to build strong yet flexible processes. Considering resilience potential measurement as a concern in complex working systems, the aim of this study was to quantify resilience with the help of fuzzy sets and Multi-Criteria Decision-Making (MCDM) techniques. In this paper, we adopted the fuzzy analytic hierarchy process (FAHP) method to measure resilience in a gas refinery plant. A resilience assessment framework containing six indicators, each with its own sub-indicators, was constructed. Then, the fuzzy weights of the indicators and the sub-indicators were derived from pair-wise comparisons conducted by experts. The fuzzy evaluating vectors of the indicators and the sub-indicators were computed according to the initial assessment data. Finally, the Comprehensive Resilience Index (CoRI), Resilience Grade (RG), and Resilience Early Warning Grade (REWG) were established. To demonstrate the applicability of the proposed method, an illustrative example in a gas refinery complex (an instance of socio-technical systems) was provided. CoRI of the refinery ranked as "III". In addition, for the six main indicators, RG and REWG ranked as "III" and "NEWZ", respectively, except for C3, in which RG ranked as "II", and REWG ranked as "OEWZ". The results revealed the engineering practicability and usefulness of the proposed method in resilience evaluation of socio-technical systems.
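    The fuzzy pairwise-comparison step can be sketched with triangular fuzzy numbers and the fuzzy geometric mean (Buckley's method, one common FAHP variant; the paper does not state which variant it uses). The 3x3 matrix below is a hypothetical single-expert example, not the study's data.

```python
import numpy as np

# Triangular fuzzy numbers (l, m, u); hypothetical pairwise comparison
# matrix for three resilience indicators from one expert.
M = np.array([
    [[1, 1, 1],         [1, 2, 3],       [2, 3, 4]],
    [[1/3, 1/2, 1],     [1, 1, 1],       [1, 2, 3]],
    [[1/4, 1/3, 1/2],   [1/3, 1/2, 1],   [1, 1, 1]],
])

r = np.prod(M, axis=1) ** (1 / 3)   # fuzzy geometric mean of each row
total = r.sum(axis=0)               # component-wise sum over rows
w = r / total[::-1]                 # fuzzy weight: (l/u_tot, m/m_tot, u/l_tot)
crisp = w.mean(axis=1)              # defuzzify by averaging (l, m, u)
crisp /= crisp.sum()                # normalised crisp indicator weights
```

Aggregating such weights over indicators and sub-indicators, then combining them with the fuzzy evaluation vectors, yields a composite index in the spirit of the paper's CoRI.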

  18. Computer animation challenges for computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Vines, Mauricio; Lee, Won-Sook; Mavriplis, Catherine

    2012-07-01

    Computer animation requirements differ from those of traditional computational fluid dynamics (CFD) investigations in that visual plausibility and rapid frame update rates trump physical accuracy. We present an overview of the main techniques for fluid simulation in computer animation, starting with Eulerian grid approaches, the Lattice Boltzmann method, Fourier transform techniques and Lagrangian particle introduction. Adaptive grid methods, precomputation of results for model reduction, parallelisation and computation on graphical processing units (GPUs) are reviewed in the context of accelerating simulation computations for animation. A survey of current specific approaches for the application of these techniques to the simulation of smoke, fire, water, bubbles, mixing, phase change and solid-fluid coupling is also included. Adding plausibility to results through particle introduction, turbulence detail and concentration on regions of interest by level set techniques has elevated the degree of accuracy and realism of recent animations. Basic approaches are described here. Techniques to control the simulation to produce a desired visual effect are also discussed. Finally, some references to rendering techniques and haptic applications are mentioned to provide the reader with a complete picture of the challenges of simulating fluids in computer animation.
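    Of the grid-based techniques surveyed, the semi-Lagrangian advection step popularized by Stam's "stable fluids" is the workhorse of animation solvers because it remains stable at the large time steps animation demands. A minimal 2-D version is sketched below (bilinear sampling, clamped boundaries); this is an illustrative reduction, not any surveyed paper's code.

```python
import numpy as np

def advect(field, u, v, dt):
    # Trace each grid point backwards along the velocity field and
    # bilinearly sample the old field there (unconditionally stable).
    n, m = field.shape
    ys, xs = np.mgrid[0:n, 0:m].astype(float)
    xb = np.clip(xs - dt * u, 0, m - 1)   # backtraced x, clamped to grid
    yb = np.clip(ys - dt * v, 0, n - 1)   # backtraced y, clamped to grid
    x0, y0 = np.floor(xb).astype(int), np.floor(yb).astype(int)
    x1, y1 = np.minimum(x0 + 1, m - 1), np.minimum(y0 + 1, n - 1)
    sx, sy = xb - x0, yb - y0
    return ((1 - sy) * ((1 - sx) * field[y0, x0] + sx * field[y0, x1])
            + sy * ((1 - sx) * field[y1, x0] + sx * field[y1, x1]))

# A density spike in a uniform rightward flow moves one cell per step
d = np.zeros((8, 8))
d[2, 3] = 1.0
d1 = advect(d, u=np.ones((8, 8)), v=np.zeros((8, 8)), dt=1.0)
```

A full animation solver alternates this advection with pressure projection and external forces; the numerical diffusion it introduces is one reason the turbulence-detail and particle techniques mentioned above are layered on top.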

  19. Implementing Target Value Design.

    PubMed

    Alves, Thais da C L; Lichtig, Will; Rybkowski, Zofia K

    2017-04-01

    An alternative to the traditional way of designing projects is the process of target value design (TVD), which takes different departure points to start the design process. The TVD process starts with the client defining an allowable cost that needs to be met by the design and construction teams. An expected cost in the TVD process is defined through multiple interactions between multiple stakeholders who define wishes and others who define ways of achieving these wishes. Finally, a target cost is defined based on the profit the design and construction teams expect to make. TVD follows a series of continuous improvement efforts aimed at reaching the desired goals for the project and its associated target value cost. The process takes advantage of rapid cycles of suggestions, analyses, and implementation that starts with the definition of value for the client. In the traditional design process, the goal is to identify user preferences and find solutions that meet the needs of the client's expressed preferences. In the lean design process, the goal is to educate users about their values and advocate for a better facility over the long run; this way owners can help contractors and designers to identify better solutions. This article aims to inform the healthcare community about tools and techniques commonly used during the TVD process and how they can be used to educate and support project participants in developing better solutions to meet their needs now as well as in the future.

  20. Pacifying the Open Abdomen with Concomitant Intestinal Fistula: A Novel Approach

    DTIC Science & Technology

    2009-05-08

    have also effectively used the same technique to aid in fistula control and bolstering of our split-thickness skin graft during the process of initial...graft in-growth. Once the skin graft has taken, more traditional stoma appliances or other methods can be used to...Figure 3 Appropriate positioning of... skin grafting at a lower labor cost than traditional wet-to-dry dressing changes. The use of the wider aperture of the nipple also permits for the adequate

  1. The Progress of CDAS

    NASA Technical Reports Server (NTRS)

    Zhu, Renjie; Zhang, Xiuzhong; Wei, Wenren; Xiang, Ying; Li, Bin; Wu, Yajun; Shu, Fengchun; Luo, Jintao; Wang, Jinqing; Xue, Zhuhe; hide

    2010-01-01

    The Chinese Data Acquisition System (CDAS) based on FPGA techniques has been developed in China for the purpose of replacing the traditional analog baseband converter. CDAS is a high speed data acquisition and processing system with a 1024 Msps sample rate for 512 MHz bandwidth input and up to 16 channels (both USB and LSB) of output compatible with the VSI interface. The instrument is a flexible environment which can be updated easily. In this paper, the construction, the performance, the experiment results, and the future plans of CDAS will be reported.

  2. Creative reflections-the strategic use of reflections in multitrack music production

    NASA Astrophysics Data System (ADS)

    Case, Alexander

    2005-09-01

    There is a long tradition of deliberately capturing and even synthesizing early reflections to enhance the music intended for loudspeaker playback. The desire to improve or at least alter the quality, audibility, intelligibility, stereo width, and/or uniqueness of the audio signal guides the recording engineer's use of the recording space, influences their microphone selection and placement, and inspires countless signal-processing approaches. This paper reviews contemporary multitrack production techniques that specifically take advantage of reflected sound energy for musical benefit.

  3. Colour Based Image Processing Method for Recognizing Ribbed Smoked Sheet Grade

    NASA Astrophysics Data System (ADS)

    Fibriani, Ike; Sumardi; Bayu Satriya, Alfredo; Budi Utomo, Satryo

    2017-03-01

    This research proposes a colour based image processing technique to recognize the Ribbed Smoked Sheet (RSS) grade so that the RSS sorting process can be faster and more accurate than the traditional one. The RSS sheet image captured by the camera is transformed into a grayscale image to simplify the recognition of rust and mould on the RSS sheet. Then the grayscale image is transformed into a binary image using a threshold value obtained from the RSS 1 reference colour. The grade recognition is determined by counting the white pixel percentage. The result shows that the system has 88% accuracy. Most faults occur in RSS 2 recognition. This is due to the illumination distribution, which is not uniform over the RSS image.
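    The grading computation described reduces to a threshold and a pixel count. A minimal sketch follows; the toy 4x4 "sheet" and the threshold of 128 standing in for the RSS 1 reference colour are assumptions for illustration.

```python
import numpy as np

def white_pixel_percentage(gray, thresh=128):
    # Binarize the grayscale sheet image against the reference threshold
    # and return the percentage of white (defect-free) pixels.
    binary = gray >= thresh
    return 100.0 * binary.sum() / binary.size

# Toy sheet: 12 clean (bright) pixels, 4 dark pixels (rust/mould)
sheet = np.array([[200, 200, 200, 200],
                  [200,  40, 200, 200],
                  [200,  40,  40, 200],
                  [200, 200,  40, 200]], dtype=np.uint8)

pct = white_pixel_percentage(sheet)
```

In a grading system, `pct` would be compared against per-grade cut-off percentages to assign RSS 1 through RSS 5.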

  4. Technology Solutions Case Study: Excavationless: Exterior-Side Foundation Insulation for Existing Homes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Building science research supports installing exterior (soil side) foundation insulation as the optimal method to enhance the hygrothermal performance of new homes. With exterior foundation insulation, water management strategies are maximized while insulating the basement space and ensuring a more even temperature at the foundation wall. This project describes an innovative, minimally invasive foundation insulation upgrade technique on an existing home that uses hydrovac excavation technology combined with a liquid insulating foam. Cost savings over the traditional excavation process ranged from 23% to 50%. The excavationless process could result in even greater savings since replacement of building structures, exterior features, utility meters, and landscaping would be minimal or non-existent.

  5. The Integration of COTS/GOTS within NASA's HST Command and Control System

    NASA Technical Reports Server (NTRS)

    Pfarr, Thomas; Reis, James E.; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    NASA's mission critical Hubble Space Telescope (HST) command and control system has been re-engineered with COTS/GOTS and minimal custom code. This paper focuses on the design of this new HST Control Center System (CCS) and the lessons learned throughout its development. CCS currently utilizes 31 COTS/GOTS products with an additional 12 million lines of custom glueware code; the new CCS exceeds the capabilities of the original system while significantly reducing the lines of custom code by more than 50%. The lifecycle of COTS/GOTS products will be examined, including the package selection process, evaluation process, and integration process. The advantages, disadvantages, issues, concerns, and lessons learned for integrating COTS/GOTS into NASA's mission critical HST CCS will be examined in detail. Command and control systems designed with traditional custom code development efforts will be compared with command and control systems designed with new development techniques relying heavily on COTS/GOTS integration. This paper will reveal the many hidden costs of COTS/GOTS solutions when compared to traditional custom code development efforts, including training expenses, consulting fees, and long-term maintenance expenses.

  6. Kinematic real-time feedback is more effective than traditional teaching method in learning ankle joint mobilisation: a randomised controlled trial.

    PubMed

    González-Sánchez, Manuel; Ruiz-Muñoz, Maria; Ávila-Bolívar, Ana Belén; Cuesta-Vargas, Antonio I

    2016-10-06

    To analyse the effect of kinematic real-time feedback (KRTF) when learning two ankle joint mobilisation techniques, comparing the results with the traditional teaching method. Double-blind randomised trial at a Faculty of Health Sciences with undergraduate students with no experience in manual therapy. Each student practised intensely for 90 min (45 min for each mobilisation) according to the randomly assigned method (G1: traditional method group; G2: KRTF group). G1: an expert professor supervised the students' practice at a professor-student ratio of 1:8. G2: students were placed in front of a station where, while performing the manoeuvre, they received KRTF on a laptop. Outcome variables: total time of mobilisation, time to reach maximum amplitude, maximum angular displacement in the three axes, maximum and average velocity to reach the maximum angular displacement, and average velocity during the mobilisation. Among the pre-post intervention measurements, there were significant differences within both groups for all outcome variables; however, G2 (KRTF) achieved significantly greater improvements in kinematic parameters for the two mobilisations (significant increase in displacement and velocity and significant reduction in mobilisation runtime) than G1. Ankle plantar flexion: G1's measurement stability (post-intervention) ranged between 0.491 and 0.687, while G2's ranged between 0.899 and 0.984. Ankle dorsal flexion mobilisation: G1's measurement stability (post-intervention) ranged between 0.543 and 0.684, while G2's ranged between 0.899 and 0.974. KRTF proved to be a more effective tool than the traditional teaching method in the teaching-learning process of two joint mobilisation techniques. NCT02504710.

  7. A Fundamental Study of Inorganic Clathrate and Other Open-Framework Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nolas, George

    Due to formidable synthetic challenges, many materials of scientific and technological interest are first obtained as microcrystalline powders. High purity, high yield processing techniques are often lacking and thus care must be taken in interpretation of the observed structural, chemical, and physical properties of powder or polycrystalline materials, which can be strongly influenced by extrinsic properties. Furthermore, the preparation of high-quality single crystals for many materials by traditional techniques can be especially challenging in cases where the elemental constituents have greatly differing melting points and/or vapor pressures, when the desired compound is thermodynamically metastable, or where growth with participation of the melt is generally not possible. New processing techniques are therefore imperative in order to investigate the intrinsic properties of these materials and elucidate their fundamental physical properties. Intermetallic clathrates constitute one such class of materials. The complex crystal structures of intermetallic clathrates are characterized by mainly group 14 host frameworks encapsulating guest-ions in polyhedral cages. The unique features of clathrate structures are intimately related to their physical properties, offering ideal systems for the study of structure-property relationships in crystalline solids. Moreover, intermetallic clathrates are being actively investigated due to their potential for application in thermoelectrics, photovoltaics and opto-electronics, superconductivity, and magnetocaloric technologies. We have developed different processing techniques in order to synthesize phase-pure high yield clathrates reproducibly, as well as grow single crystals for the first time. We also employed these techniques to synthesize new "open-framework" compounds. These advances in materials processing and crystal growth allowed for the investigation of the physical properties of a variety of different clathrate compositions for the first time.

  8. A microarchitecture for resource-limited superscalar microprocessors

    NASA Astrophysics Data System (ADS)

    Basso, Todd David

    1999-11-01

    Microelectronic components in space and satellite systems must be resistant to total dose radiation, single-event upset, and latchup in order to accomplish their missions. The demand for inexpensive, high-volume, radiation hardened (rad-hard) integrated circuits (ICs) is expected to increase dramatically as the communication market continues to expand. Motorola's Complementary Gallium Arsenide (CGaAsTM) technology offers superior radiation tolerance compared to traditional CMOS processes, while being more economical than dedicated rad-hard CMOS processes. The goals of this dissertation are to optimize a superscalar microarchitecture suitable for CGaAsTM microprocessors, develop circuit techniques for such applications, and evaluate the potential of CGaAsTM for the development of digital VLSI circuits. Motorola's 0.5 mum CGaAsTM process is summarized and circuit techniques applicable to digital CGaAsTM are developed. Direct coupled FET, complementary, and domino logic circuits are compared based on speed, power, area, and noise margins. These circuit techniques are employed in the design of a 600 MHz PowerPCTM arithmetic logic unit. The dissertation emphasizes CGaAsTM-specific design considerations, specifically, low integration level. A baseline superscalar microarchitecture is defined and SPEC95 integer benchmark simulations are used to evaluate the applicability of advanced architectural features to microprocessors having low integration levels. The performance simulations center around the optimization of a simple superscalar core, small-scale branch prediction, instruction prefetching, and an off-chip primary data cache. The simulation results are used to develop a superscalar microarchitecture capable of outperforming a comparable sequential pipeline, while using only 500,000 transistors. The architecture, running at 200 MHz, is capable of achieving an estimated 153 MIPS, translating to a 27% performance increase over a comparable traditional pipelined microprocessor. The proposed microarchitecture is process independent and can be applied to low-cost, or transistor-limited applications. The proposed microarchitecture is implemented in the design of a 0.35 mum CMOS microprocessor, and the design of a 0.5 mum CGaAsTM microprocessor. The two technologies and designs are compared to ascertain the state of CGaAsTM for digital VLSI applications.

  9. Ultrasound Guidance for Botulinum Neurotoxin Chemodenervation Procedures.

    PubMed

    Alter, Katharine E; Karp, Barbara I

    2017-12-28

    Injections of botulinum neurotoxins (BoNTs) are prescribed by clinicians for a variety of disorders that cause over-activity of muscles, glands, and other structures, as well as pain. Accurately targeting the structure for injection is one of the principal goals when performing BoNT procedures. Traditionally, injections have been guided by anatomic landmarks, palpation, range of motion, electromyography or electrical stimulation. Ultrasound (US) imaging-based guidance overcomes some of the limitations of traditional techniques. US, alone or combined with traditional guidance techniques, is utilized and recommended by many expert clinicians and authors, and in practice guidelines by professional academies. This article reviews the advantages and disadvantages of available guidance techniques including US, as well as technical aspects of US guidance and a focused literature review related to US guidance for chemodenervation procedures including BoNT injection.

  10. Intelligent Vision On The SM9O Mini-Computer Basis And Applications

    NASA Astrophysics Data System (ADS)

    Hawryszkiw, J.

    1985-02-01

    Distinction has to be made between image processing and vision. Image processing finds its roots in the strong tradition of linear signal processing and promotes geometrical transform techniques, such as filtering, compression, and restoration. Its purpose is to transform an image so that a human observer can easily extract the information significant for him, for example edges after a gradient operator, or a specific direction after a directional filtering operation. Image processing consists, in fact, of a set of local or global space-time transforms. The interpretation of the final image is done by the human observer. The purpose of vision is to extract the semantic content of the image. The machine can then understand that content and run a process of decision, which turns into an action. Thus, intelligent vision depends on image processing, pattern recognition, and artificial intelligence.

  11. Decision insight into stakeholder conflict for ERN.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siirola, John; Tidwell, Vincent Carroll; Benz, Zachary O.

    Participatory modeling has become an important tool in facilitating resource decision making and dispute resolution. Approaches to modeling that are commonly used in this context often do not adequately account for important human factors. Current techniques provide insights into how certain human activities and variables affect resource outcomes; however, they do not directly simulate the complex variables that shape how, why, and under what conditions different human agents behave in ways that affect resources and human interactions related to them. Current approaches also do not adequately reveal how the effects of individual decisions scale up to have systemic level effects in complex resource systems. This lack of integration prevents the development of more robust models to support decision making and dispute resolution processes. Development of integrated tools is further hampered by the fact that collection of primary data for decision-making modeling is costly and time consuming. This project seeks to develop a new approach to resource modeling that incorporates both technical and behavioral modeling techniques into a single decision-making architecture. The modeling platform is enhanced by use of traditional and advanced processes and tools for expedited data capture. Specific objectives of the project are: (1) develop a proof of concept for a new technical approach to resource modeling that combines the computational techniques of system dynamics and agent based modeling, (2) develop an iterative, participatory modeling process, supported with traditional and advanced data capture techniques, that may be utilized to facilitate decision making, dispute resolution, and collaborative learning processes, and (3) examine potential applications of this technology and process. The development of this decision support architecture included both the engineering of the technology and the development of a participatory method to build and apply the technology. Stakeholder interaction with the model and associated data capture was facilitated through two very different modes of engagement: one a standard interface involving radio buttons, slider bars, graphs and plots, while the other utilized an immersive serious gaming interface. The decision support architecture developed through this project was piloted in the Middle Rio Grande Basin to examine how these tools might be utilized to promote enhanced understanding and decision-making in the context of complex water resource management issues. Potential applications of this architecture and its capacity to lead to enhanced understanding and decision-making were assessed through qualitative interviews with study participants who represented key stakeholders in the basin.

  12. Manufacturing Precise, Lightweight Paraboloidal Mirrors

    NASA Technical Reports Server (NTRS)

    Hermann, Frederick Thomas

    2006-01-01

    A process for fabricating a precise, diffraction-limited, ultra-lightweight, composite-material (matrix/fiber) paraboloidal telescope mirror has been devised. Unlike the traditional process of fabrication of heavier glass-based mirrors, this process involves a minimum of manual steps and subjective judgment. Instead, this process involves objectively controllable, repeatable steps; hence, this process is better suited for mass production. Other processes that have been investigated for fabrication of precise composite-material lightweight mirrors have resulted in print-through of fiber patterns onto reflecting surfaces, and have not provided adequate structural support for maintenance of stable, diffraction-limited surface figures. In contrast, this process does not result in print-through of the fiber pattern onto the reflecting surface and does provide a lightweight, rigid structure capable of maintaining a diffraction-limited surface figure in the face of changing temperature, humidity, and air pressure. The process consists mainly of the following steps: 1. A precise glass mandrel is fabricated by conventional optical grinding and polishing. 2. The mandrel is coated with a release agent and covered with layers of a carbon-fiber composite material. 3. The outer surface of the outer layer of the carbon-fiber composite material is coated with a surfactant chosen to provide for the proper flow of an epoxy resin to be applied subsequently. 4. The mandrel as thus covered is mounted on a temperature-controlled spin table. 5. The table is heated to a suitable temperature and spun at a suitable speed as the epoxy resin is poured onto the coated carbon-fiber composite material. 6. The surface figure of the optic is monitored and adjusted by use of traditional Ronchi, Foucault, and interferometric optical measurement techniques while the speed of rotation and the temperature are adjusted to obtain the desired figure.
    The proper selection of surfactant, speed of rotation, viscosity of the epoxy, and temperature make it possible to obtain the desired diffraction-limited, smooth (1/50th wave) parabolic outer surface, suitable for reflective coating. 7. A reflective coat is applied by use of conventional coating techniques. 8. Once the final figure is set, a lightweight structural foam is applied to the rear of the optic to ensure stability of the figure.

  13. Drawing lithography for microneedles: a review of fundamentals and biomedical applications.

    PubMed

    Lee, Kwang; Jung, Hyungil

    2012-10-01

    A microneedle is a three-dimensional (3D) micromechanical structure that has recently been in the spotlight as a drug delivery system (DDS). Because a microneedle delivers the target drug after penetrating the skin barrier, the therapeutic effects of microneedles proceed from their 3D structural geometry. Various types of microneedles have been fabricated using subtractive micromanufacturing methods, which are based on inherently planar two-dimensional (2D) geometries. However, traditional subtractive processes are ill-suited to flexible microneedle structures and make functional biomedical applications for efficient drug delivery difficult. The authors of the present study propose drawing lithography as a unique additive process for the fabrication of a microneedle directly from 2D planar substrates, thus overcoming this subtractive-process shortcoming. The present article provides the first overview of the principal drawing lithography technology: fundamentals and biomedical applications. The continuous drawing technique for an ultrahigh-aspect-ratio (UHAR) hollow microneedle, the stepwise controlled drawing technique for a dissolving microneedle, and the drawing technique with antidromic isolation for a hybrid electro-microneedle (HEM) are reviewed, and efficient biomedical applications of drawing lithography-mediated microneedles as an innovative drug and gene delivery system are described. Drawing lithography can provide a great breakthrough in the development of materials science and biotechnology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Grayscale lithography-automated mask generation for complex three-dimensional topography

    NASA Astrophysics Data System (ADS)

    Loomis, James; Ratnayake, Dilan; McKenna, Curtis; Walsh, Kevin M.

    2016-01-01

    Grayscale lithography is a relatively underutilized technique that enables fabrication of three-dimensional (3-D) microstructures in photosensitive polymers (photoresists). By spatially modulating ultraviolet (UV) dosage during the writing process, one can vary the depth at which photoresist is developed. This means complex structures and bioinspired designs can readily be produced that would otherwise be cost prohibitive or too time intensive to fabricate. The main barrier to widespread grayscale implementation, however, stems from the laborious generation of mask files required to create complex surface topography. We present a process and associated software utility for automatically generating grayscale mask files from 3-D models created within industry-standard computer-aided design (CAD) suites. By shifting the microelectromechanical systems (MEMS) design onus to commonly used CAD programs ideal for complex surfacing, engineering professionals already familiar with traditional 3-D CAD software can readily utilize their pre-existing skills to make valuable contributions to the MEMS community. Our conversion process is demonstrated by prototyping several samples on a laser pattern generator, capital equipment already in use in many foundries. Finally, an empirical calibration technique is shown that compensates for the nonlinear relationship between UV exposure intensity and photoresist development depth, as well as a thermal reflow technique that helps smooth microstructure surfaces.
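The nonlinear dose-to-depth calibration mentioned above can be sketched in code. The following is a minimal illustration, not the authors' utility: it assumes a hypothetical power-law calibration depth = a · doseᵞ (with a and γ standing in for values fitted from test exposures) and inverts it to map a target develop-depth array onto 8-bit grayscale mask values.

```python
import numpy as np

# Assumed power-law calibration: develop depth (um) = a * dose**gamma.
# a and gamma are placeholders for values fitted from test exposures.
def dose_for_depth(depth_um, a=0.08, gamma=1.4):
    """Invert the assumed calibration to get the UV dose for a target depth."""
    return (np.asarray(depth_um, dtype=float) / a) ** (1.0 / gamma)

def heightmap_to_mask(target_depth_um, max_dose=255.0):
    """Map a target develop-depth array (um) to 8-bit grayscale mask values."""
    dose = np.clip(dose_for_depth(target_depth_um), 0.0, max_dose)
    return np.round(dose).astype(np.uint8)

# A linear ramp in desired depth maps to a *nonlinear* ramp in gray level.
ramp = np.linspace(0.0, 5.0, 6)   # desired depths in micrometres
mask = heightmap_to_mask(ramp)
```

In a real flow the CAD-exported heightmap would replace the ramp, and the calibration curve would be measured per resist and exposure tool.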

  15. A Survey and Proposed Framework on the Soft Biometrics Technique for Human Identification in Intelligent Video Surveillance System

    PubMed Central

    Kim, Min-Gu; Moon, Hae-Min; Chung, Yongwha; Pan, Sung Bum

    2012-01-01

    Biometrics verification can be efficiently used for intrusion detection and intruder identification in video surveillance systems. Biometrics techniques can be largely divided into traditional and the so-called soft biometrics. Whereas traditional biometrics deals with physical characteristics such as face features, eye iris, and fingerprints, soft biometrics is concerned with such information as gender, national origin, and height. Traditional biometrics is versatile and highly accurate, but it is very difficult to acquire traditional biometric data from a distance and without personal cooperation. Soft biometrics, although less accurate, can be collected much more freely. Recently, many studies have been conducted on human identification using soft biometrics data collected from a distance. In this paper, we use both traditional and soft biometrics for human identification and propose a framework for solving such problems as lighting, occlusion, and shadowing. PMID:22919273

  16. A survey and proposed framework on the soft biometrics technique for human identification in intelligent video surveillance system.

    PubMed

    Kim, Min-Gu; Moon, Hae-Min; Chung, Yongwha; Pan, Sung Bum

    2012-01-01

    Biometrics verification can be efficiently used for intrusion detection and intruder identification in video surveillance systems. Biometrics techniques can be largely divided into traditional and the so-called soft biometrics. Whereas traditional biometrics deals with physical characteristics such as face features, eye iris, and fingerprints, soft biometrics is concerned with such information as gender, national origin, and height. Traditional biometrics is versatile and highly accurate, but it is very difficult to acquire traditional biometric data from a distance and without personal cooperation. Soft biometrics, although less accurate, can be collected much more freely. Recently, many studies have been conducted on human identification using soft biometrics data collected from a distance. In this paper, we use both traditional and soft biometrics for human identification and propose a framework for solving such problems as lighting, occlusion, and shadowing.

  17. An Integrated Strategy to Qualitatively Differentiate Components of Raw and Processed Viticis Fructus Based on NIR, HPLC and UPLC-MS Analysis.

    PubMed

    Diao, Jiayin; Xu, Can; Zheng, Huiting; He, Siyi; Wang, Shumei

    2018-06-21

    Viticis Fructus is a traditional Chinese herbal drug processed by various methods to achieve different clinical purposes. Thermal treatment potentially alters its chemical composition, which may affect effectiveness and toxicity. In order to interpret the constituent discrepancies between raw and processed (stir-fried) Viticis Fructus, a multivariate detection method (NIR, HPLC, and UPLC-MS) based on metabonomics and chemometrics was developed. Firstly, synergy interval partial least squares and partial least squares-discriminant analysis were employed to screen the distinctive wavebands (4319–5459 cm⁻¹) based on preprocessed near-infrared spectra. Then, HPLC with principal component analysis was performed to characterize the distinction. Subsequently, a total of 49 compounds were identified by UPLC-MS, among which 42 were eventually characterized as changing significantly during processing via semiquantitative volcano plot analysis. Moreover, based on the partial least squares-discriminant analysis, 16 compounds were chosen as characteristic markers in close correlation with the discriminatory near-infrared wavebands. Together, these characterization techniques effectively discriminated raw and processed products of Viticis Fructus. In general, our work provides an integrated way of classifying Viticis Fructus, and a strategy for exploring discriminatory chemical markers for other traditional Chinese herbs, thus ensuring safety and efficacy for consumers. Georg Thieme Verlag KG Stuttgart · New York.
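The volcano-plot screening step can be sketched generically. The code below is not the authors' pipeline: the thresholds (|log2 fold change| ≥ 1, t-test p < 0.05) and the synthetic peak areas are assumptions chosen only to illustrate how compounds significantly altered by processing are flagged.

```python
import numpy as np
from scipy import stats

def volcano_screen(raw, processed, fc_thresh=1.0, p_thresh=0.05):
    """Boolean mask of compounds significantly changed by processing.

    raw, processed: arrays of shape (n_compounds, n_replicates) of peak areas.
    Keeps compounds with |log2 fold change| >= fc_thresh and t-test p < p_thresh.
    """
    raw = np.asarray(raw, dtype=float)
    proc = np.asarray(processed, dtype=float)
    log2fc = np.log2(proc.mean(axis=1) / raw.mean(axis=1))
    _, p = stats.ttest_ind(proc, raw, axis=1)   # per-compound two-sample t-test
    return (np.abs(log2fc) >= fc_thresh) & (p < p_thresh)

# Synthetic peak areas: compound 0 quadruples on processing, compound 1 is stable.
raw_areas = np.array([[100.0, 110.0, 90.0], [100.0, 105.0, 95.0]])
processed_areas = np.array([[400.0, 420.0, 380.0], [101.0, 99.0, 104.0]])
changed = volcano_screen(raw_areas, processed_areas)
```

Only the first compound passes both the fold-change and significance gates, mirroring how the 42 processing-sensitive compounds would be selected from the 49 identified.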

  18. Applications of High and Ultra High Pressure Homogenization for Food Safety.

    PubMed

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low-temperature long-time and high-temperature short-time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, several non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have great potential to provide "fresh-like" products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350-400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. Accordingly, this review deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species are reviewed in relation to the food matrix and to chemico-physical and process variables. The combined use of this alternative technology with other non-thermal technologies is also considered.

  19. Applications of High and Ultra High Pressure Homogenization for Food Safety

    PubMed Central

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low-temperature long-time and high-temperature short-time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, several non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have great potential to provide “fresh-like” products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350–400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. Accordingly, this review deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species are reviewed in relation to the food matrix and to chemico-physical and process variables. The combined use of this alternative technology with other non-thermal technologies is also considered. PMID:27536270

  20. A Comparison of FPGA and GPGPU Designs for Bayesian Occupancy Filters.

    PubMed

    Medina, Luis; Diez-Ochoa, Miguel; Correal, Raul; Cuenca-Asensi, Sergio; Serrano, Alejandro; Godoy, Jorge; Martínez-Álvarez, Antonio; Villagra, Jorge

    2017-11-11

    Grid-based perception techniques in the automotive sector, based on fusing information from different sensors to obtain robust perceptions of the environment, are proliferating in the industry. However, one of the main drawbacks of these techniques is their high computational cost, traditionally prohibitive for embedded automotive systems. In this work, the capabilities of new computing architectures that embed these algorithms are assessed in a real car. The paper compares two ad hoc optimized designs of the Bayesian Occupancy Filter; one for General Purpose Graphics Processing Unit (GPGPU) and the other for Field-Programmable Gate Array (FPGA). The resulting implementations are compared in terms of development effort, accuracy and performance, using datasets from a realistic simulator and from a real automated vehicle.
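The core update of a Bayesian occupancy filter can be sketched independently of the FPGA/GPGPU question. The following minimal log-odds grid update is a generic textbook formulation, not the paper's ad hoc designs; the inverse-sensor-model probabilities p_hit and p_miss, the grid size, and the repeated identical scan are all illustrative assumptions.

```python
import numpy as np

def update_grid(log_odds, occupied_meas, p_hit=0.7, p_miss=0.4):
    """One Bayesian update of an occupancy grid in log-odds form.

    occupied_meas: boolean grid, True where the sensor reports the cell
    occupied, False where it reports it free (unobserved cells would be
    skipped in a full filter). p_hit/p_miss form the inverse sensor model.
    """
    l_hit = np.log(p_hit / (1.0 - p_hit))     # evidence for occupancy
    l_miss = np.log(p_miss / (1.0 - p_miss))  # evidence against occupancy
    return log_odds + np.where(occupied_meas, l_hit, l_miss)

def to_probability(log_odds):
    return 1.0 / (1.0 + np.exp(-log_odds))

grid = np.zeros((2, 2))                        # prior p = 0.5 everywhere
scan = np.array([[True, False], [False, True]])
for _ in range(3):                             # three consistent scans
    grid = update_grid(grid, scan)
prob = to_probability(grid)
```

Because the update is an independent, identical operation per cell, it maps naturally onto both GPGPU threads and FPGA pipelines, which is what makes the two designs comparable.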

  1. Size analysis of polyglutamine protein aggregates using fluorescence detection in an analytical ultracentrifuge.

    PubMed

    Polling, Saskia; Hatters, Danny M; Mok, Yee-Foong

    2013-01-01

    Defining the aggregation process of proteins formed by poly-amino acid repeats in cells remains a challenging task due to a lack of robust techniques for their isolation and quantitation. Sedimentation velocity methodology using fluorescence-detected analytical ultracentrifugation is one approach that can offer significant insight into aggregate formation and kinetics. While this technique has traditionally been used with purified proteins, substantial information can now be collected from studies using cell lysates expressing a GFP-tagged protein of interest. In this chapter, we describe protocols for sample preparation and for setting up the fluorescence detection system in an analytical ultracentrifuge to perform sedimentation velocity experiments on cell lysates containing aggregates formed by poly-amino acid repeat proteins.

  2. Meteor tracking via local pattern clustering in spatio-temporal domain

    NASA Astrophysics Data System (ADS)

    Kukal, Jaromír.; Klimt, Martin; Švihlík, Jan; Fliegel, Karel

    2016-09-01

    Reliable meteor detection is one of the crucial disciplines in astronomy. A variety of imaging systems is used for meteor path reconstruction. The traditional approach is based on analysis of 2D image sequences obtained from a double-station video observation system. Precise localization of the meteor path is difficult due to atmospheric turbulence and other factors causing spatio-temporal fluctuations of the image background. The proposed technique performs non-linear preprocessing of image intensity using the Box-Cox transform, as recommended in our previous work. Both symmetric and asymmetric spatio-temporal differences are designed to be robust in the statistical sense. The resulting local patterns are processed by a data whitening technique, and the obtained vectors are classified via cluster analysis and a Self-Organizing Map (SOM).
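Two of the preprocessing steps named above, the Box-Cox intensity transform and data whitening, can be sketched generically. This is not the authors' implementation: the fixed λ values and the synthetic "local pattern" vectors are assumptions, and PCA whitening stands in for whatever whitening variant the paper uses.

```python
import numpy as np

def box_cox(x, lam):
    """Box-Cox transform of positive intensities (lam = 0 gives the log)."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def whiten(patterns):
    """PCA-whiten local-pattern vectors: zero mean, identity covariance."""
    p = np.asarray(patterns, dtype=float)
    p = p - p.mean(axis=0)
    cov = np.cov(p, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    return p @ (vecs / np.sqrt(vals + 1e-12))   # scale each principal axis

# Synthetic correlated "local patterns" in 3 dimensions.
rng = np.random.default_rng(0)
mixing = np.array([[2.0, 1.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 3.0]])
raw = rng.normal(size=(500, 3)) @ mixing
white = whiten(raw)
```

After whitening, the pattern vectors have (sample) identity covariance, so a subsequent cluster analysis or SOM is not dominated by one high-variance direction.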

  3. A Distance Measure for Attention Focusing and Anomaly Detection in Systems Monitoring

    NASA Technical Reports Server (NTRS)

    Doyle, R.

    1994-01-01

    Any attempt to introduce automation into the monitoring of complex physical systems must start from a robust anomaly detection capability. This task is far from straightforward, for a single definition of what constitutes an anomaly is difficult to come by. In addition, to make the monitoring process efficient, and to avoid the potential for information overload on human operators, attention focusing must also be addressed. When an anomaly occurs, more often than not several sensors are affected, and the partially redundant information they provide can be confusing, particularly in a crisis situation where a response is needed quickly. Previous results on extending traditional anomaly detection techniques are summarized. The focus of this paper is a new technique for attention focusing.

  4. Determining Training Device Requirements in Army Aviation Systems

    NASA Technical Reports Server (NTRS)

    Poumade, M. L.

    1984-01-01

    A decision-making methodology which applies the systems approach to the training problem is discussed. Training is viewed as a total system instead of a collection of individual devices and unrelated techniques. The core of the methodology is the use of optimization techniques, such as the transportation algorithm and multiobjective goal programming, with training-task and training-device specific data. The role of computers, especially automated databases and computer simulation models, in the development of training programs is also discussed. The approach can provide significant training enhancement and cost savings over the more traditional, intuitive approach to training development and device requirements definition. While given from an aviation perspective, the methodology is equally applicable to other training development efforts.
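The transportation algorithm mentioned above can be illustrated with a small balanced instance: allocate device-hours from training devices (supplies) to training tasks (demands) at minimum cost. All numbers below are hypothetical, and `scipy.optimize.linprog` stands in for a dedicated transportation solver.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical balanced instance: 3 training devices, 3 training tasks.
cost = np.array([[4.0, 6.0, 9.0],      # cost[i, j]: device i covering task j
                 [5.0, 4.0, 7.0],
                 [6.0, 3.0, 4.0]])
supply = np.array([30.0, 40.0, 30.0])  # device-hours available
demand = np.array([20.0, 50.0, 30.0])  # device-hours required

m, n = cost.shape
A_eq, b_eq = [], []
for i in range(m):                      # each device allocates exactly its supply
    row = np.zeros(m * n)
    row[i * n:(i + 1) * n] = 1.0
    A_eq.append(row); b_eq.append(supply[i])
for j in range(n):                      # each task receives exactly its demand
    col = np.zeros(m * n)
    col[j::n] = 1.0
    A_eq.append(col); b_eq.append(demand[j])

res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
              bounds=[(0, None)] * (m * n), method="highs")
plan = res.x.reshape(m, n)              # optimal device-hour allocation
```

The optimal plan respects every capacity and requirement exactly; in the methodology described, the costs would come from training-task and training-device specific data rather than these toy values.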

  5. Restoration of high-resolution AFM images captured with broken probes

    NASA Astrophysics Data System (ADS)

    Wang, Y. F.; Corrigan, D.; Forman, C.; Jarvis, S.; Kokaram, A.

    2012-03-01

    A type of artefact is induced by damage to the scanning probe when the Atomic Force Microscope (AFM) captures a material surface structure at nanoscale resolution. This artefact takes the form of dramatic distortion rather than traditional blurring. In practice, it is not easy to prevent damage to the scanning probe. However, by using natural-image deblurring techniques from the image processing domain, a comparatively reliable estimate of the real sample surface structure can be generated. This paper introduces a novel Hough Transform technique as well as a Bayesian deblurring algorithm to remove this type of artefact. The deblurring result successfully removes blur artefacts in the AFM artefact images, and the details of the fibril surface topography are well preserved.
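The Hough transform at the heart of the proposed technique can be sketched for the simplest case: detecting a straight line among 2-D points by voting in (θ, ρ) space. This is a generic illustration of the voting scheme, not the paper's algorithm; the grid resolutions, ρ range, and synthetic points are assumptions (the code assumes |ρ| stays below rho_max).

```python
import numpy as np

def hough_peak(points, n_theta=180, n_rho=201, rho_max=100.0):
    """Vote in (theta, rho) space for lines rho = x*cos(theta) + y*sin(theta)
    and return the strongest (theta, rho) cell."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for x, y in points:
        rho = x * np.cos(thetas) + y * np.sin(thetas)
        # Quantize rho into accumulator bins (assumes |rho| < rho_max).
        idx = np.round((rho + rho_max) / (2.0 * rho_max) * (n_rho - 1)).astype(int)
        acc[np.arange(n_theta), idx] += 1
    t, r = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[t], 2.0 * rho_max * r / (n_rho - 1) - rho_max

# Points on the vertical line x = 10 (theta = 0, rho = 10), plus one outlier.
pts = [(10.0, float(y)) for y in range(20)] + [(3.0, 5.0)]
theta, rho = hough_peak(pts)
```

The collinear points pile their votes into one accumulator cell while the outlier's vote is spread elsewhere, which is why Hough voting is robust to the isolated spurious features a broken probe introduces.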

  6. [RESEARCH PROGRESS OF THREE-DIMENSIONAL PRINTING TECHNIQUE FOR SPINAL IMPLANTS].

    PubMed

    Lu, Qi; Yu, Binsheng

    2016-09-08

    To summarize the current research progress of the three-dimensional (3D) printing technique for spinal implant manufacture, the recent original literature concerning the technology, materials, process, clinical applications, and development direction of 3D printing for spinal implants was reviewed and analyzed. At present, 3D printing technologies used to manufacture spinal implants include selective laser sintering, selective laser melting, and electron beam melting. Titanium and its alloys are mainly used. 3D-printed spinal implants manufactured with the above materials and technologies have been successfully used in clinical practice. But the problems regarding safety, related complications, cost-benefit analysis, efficacy compared with traditional spinal implants, and the lack of relevant policies and regulations remain to be solved. The 3D printing technique is able to provide individualized, customized spinal implants for patients, which helps clinicians perform operations more accurately and safely. With the rapid development of 3D printing technology and new materials, more and more 3D-printed spinal implants will be developed and used clinically.

  7. Evolution and enabling capabilities of spatially resolved techniques for the characterization of heterogeneously catalyzed reactions

    DOE PAGES

    Morgan, Kevin; Touitou, Jamal; Choi, Jae -Soon; ...

    2016-01-15

    The development and optimization of catalysts and catalytic processes requires knowledge of reaction kinetics and mechanisms. In traditional catalyst kinetic characterization, the gas composition is known at the inlet, and the exit flow is measured to determine changes in concentration. As such, the progression of the chemistry within the catalyst is not known. Technological advances in electromagnetic and physical probes have made visualizing the evolution of the chemistry within catalyst samples a reality, as part of a methodology commonly known as spatial resolution. Herein, we discuss and evaluate the development of spatially resolved techniques, including the evolutions and achievements of this growing area of catalytic research. The impact of such techniques is discussed in terms of the invasiveness of physical probes on catalytic systems, as well as how experimentally obtained spatial profiles can be used in conjunction with kinetic modeling. Moreover, some aims and aspirations for further evolution of spatially resolved techniques are considered.

  8. Development Context Driven Change Awareness and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha

    2014-01-01

    Recent work on workspace monitoring allows conflict prediction early in the development process; however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. There are various client applications such as task prioritization, early conflict detection, and providing advice on testing that can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.

  9. Development Context Driven Change Awareness and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha; Wang, Yurong; Elbaum, Sebastian

    2014-01-01

    Recent work on workspace monitoring allows conflict prediction early in the development process; however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. There are various client applications such as task prioritization, early conflict detection, and providing advice on testing that can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.

  10. Modified McCash Technique for Management of Dupuytren Contracture.

    PubMed

    Lesiak, Alex C; Jarrett, Nicole J; Imbriglia, Joseph E

    2017-05-01

    Despite recent advancements in the nonsurgical treatment of Dupuytren contracture, a number of patients remain poor nonsurgical candidates or elect for surgical management. The traditional McCash technique releases contractures while leaving open palmar wounds. Although the technique is successful in alleviating contractures, these wounds are traditionally large, transverse incisions across the palm. A modification of this technique has been performed that permits the surgeon to utilize smaller wounds while eliminating debilitating contractures. Copyright © 2017 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  11. The influence of ArF excimer laser micromachining on physicochemical properties of bioresorbable poly(L-lactide)

    NASA Astrophysics Data System (ADS)

    Stepak, Bogusz D.; Antończak, Arkadiusz J.; Szustakiewicz, Konrad; Pezowicz, Celina; Abramski, Krzysztof M.

    2016-03-01

    The main advantage of laser processing is the non-contact character of material removal and the high precision attainable thanks to small laser beam dimensions. This technique enables forming complex, submillimeter geometries, such as vascular stents, which cannot be manufactured using traditional techniques, e.g., injection moulding or mechanical treatment. Among nanosecond laser sources, the ArF excimer laser appears to be a good candidate for laser micromachining of bioresorbable polymers such as poly(L-lactide). Due to the long pulse duration, however, there is a risk of heat diffusion and accumulation in the material. In addition, due to the short wavelength (193 nm), photochemical processes can modify the chemical composition of ablated surfaces. The motivation for this research was to evaluate the influence of laser micromachining on the physicochemical properties of poly(L-lactide). We performed calorimetric analysis of laser-machined samples using differential scanning calorimetry (DSC), which allowed us to find the optimal process parameters for reducing the heat-affected zone (HAZ). The chemical composition of the ablated surface was investigated by FTIR in attenuated total reflectance (ATR) mode.

  12. Fabricating Superior NiAl Bronze Components through Wire Arc Additive Manufacturing.

    PubMed

    Ding, Donghong; Pan, Zengxi; van Duin, Stephen; Li, Huijun; Shen, Chen

    2016-08-03

    Cast nickel aluminum bronze (NAB) alloy is widely used for large engineering components in marine applications due to its excellent mechanical properties and corrosion resistance. However, casting porosity and coarse microstructure degrade the mechanical properties of cast NAB components. Although heat treatment, friction stir processing, and fusion welding have been employed to eliminate porosity, improve mechanical properties, and refine the microstructure of as-cast metal, their applications are limited to either surface modification or component repair. Instead of traditional casting techniques, this study focuses on developing NAB components using the recently expanded wire arc additive manufacturing (WAAM). Consumable welding wire is melted and deposited layer-by-layer on substrates, producing near-net-shaped NAB components. Additively manufactured NAB components without post-processing are fully dense and exhibit fine microstructure, as well as mechanical properties comparable to as-cast NAB alloy. The effects of heat input from the welding process and of post-weld heat treatment (PWHT) are shown to give uniform NAB alloys with superior mechanical properties, revealing potential marine applications of the WAAM technique in NAB production.

  13. Fabricating Superior NiAl Bronze Components through Wire Arc Additive Manufacturing

    PubMed Central

    Ding, Donghong; Pan, Zengxi; van Duin, Stephen; Li, Huijun; Shen, Chen

    2016-01-01

    Cast nickel aluminum bronze (NAB) alloy is widely used for large engineering components in marine applications due to its excellent mechanical properties and corrosion resistance. However, casting porosity and coarse microstructure degrade the mechanical properties of cast NAB components. Although heat treatment, friction stir processing, and fusion welding have been employed to eliminate porosity, improve mechanical properties, and refine the microstructure of as-cast metal, their applications are limited to either surface modification or component repair. Instead of traditional casting techniques, this study focuses on developing NAB components using the recently expanded wire arc additive manufacturing (WAAM). Consumable welding wire is melted and deposited layer-by-layer on substrates, producing near-net-shaped NAB components. Additively manufactured NAB components without post-processing are fully dense and exhibit fine microstructure, as well as mechanical properties comparable to as-cast NAB alloy. The effects of heat input from the welding process and of post-weld heat treatment (PWHT) are shown to give uniform NAB alloys with superior mechanical properties, revealing potential marine applications of the WAAM technique in NAB production. PMID:28773774

  14. Application of High Speed Digital Image Correlation in Rocket Engine Hot Fire Testing

    NASA Technical Reports Server (NTRS)

    Gradl, Paul R.; Schmidt, Tim

    2016-01-01

    Hot fire testing of rocket engine components and rocket engine systems is a critical aspect of the development process for understanding performance, reliability and system interactions. Ground testing provides the opportunity for highly instrumented development testing to validate analytical model predictions and determine necessary design changes and process improvements. To properly obtain discrete measurements for model validation, instrumentation must survive the highly dynamic and extreme-temperature environment of hot fire testing. Digital Image Correlation has been investigated and is being evaluated as a technique to augment traditional instrumentation during component and engine testing, providing further data for additional performance improvements and cost savings. The feasibility of digital image correlation techniques was demonstrated in subscale and full scale hot fire testing. This incorporated a pair of high speed cameras, installed and operated under the extreme environments present on the test stand, to measure three-dimensional, real-time displacements and strains. The development process, setup and calibrations, data collection, hot fire test data collection, and post-test analysis and results are presented in this paper.

  15. The Application of Collaborative Business Intelligence Technology in the Hospital SPD Logistics Management Model

    PubMed Central

    LIU, Tongzhu; SHEN, Aizong; HU, Xiaojian; TONG, Guixian; GU, Wei

    2017-01-01

    Background: We aimed to apply a collaborative business intelligence (BI) system to the hospital supply, processing and distribution (SPD) logistics management model. Methods: We searched the Engineering Village database, China National Knowledge Infrastructure (CNKI) and Google for articles (published from 2011 to 2016), books, Web pages, etc., to understand SPD- and BI-related theories and the recent research status. For the application of collaborative BI technology in the hospital SPD logistics management model, we leveraged data mining techniques to discover knowledge from complex data and collaborative techniques to improve business process theories. Results: For the application of the BI system, we: (i) proposed a layered structure of a collaborative BI system for intelligent management in hospital logistics; (ii) built a data warehouse for the collaborative BI system; (iii) improved data mining techniques such as support vector machines (SVM) and the swarm intelligence firefly algorithm to solve key problems in the hospital logistics collaborative BI system; (iv) researched collaborative techniques oriented to data and business process optimization to improve the business processes of hospital logistics management. Conclusion: A proper combination of the SPD model and a BI system will improve the management of logistics in hospitals. Successful implementation of the study requires: (i) innovating and improving the traditional SPD model and making appropriate implementation plans and schedules for the application of the BI system according to the actual situations of hospitals; (ii) the collaborative participation of internal hospital departments, including information, logistics, nursing, medical and financial departments; (iii) timely response of external suppliers. PMID:28828316

  16. Microbial Burden Approach : New Monitoring Approach for Measuring Microbial Burden

    NASA Technical Reports Server (NTRS)

    Venkateswaran, Kasthuri; Vaishampayan, Parag; Barmatz, Martin

    2013-01-01

    Advantages of a new approach for differentiating live cells/spores from dead cells/spores. Four examples of Salmonella outbreaks leading to costly destruction of dairy products. List of possible collaboration activities between JPL and other industries (for future discussion). Limitations of traditional microbial monitoring approaches. Introduction to the new approach for rapid measurement of viable (live) bacterial cells/spores and its areas of application. Detailed example of determining live spores using the new approach (a similar procedure applies for determining live cells). JPL has developed a patented approach for measuring the amounts of live and dead cells/spores. This novel "molecular" method takes less than 5 to 7 hours, compared to the seven days required using conventional techniques. Conventional "molecular" techniques cannot discriminate live cells/spores from dead cells/spores. The JPL-developed novel method eliminates the false positive results obtained from conventional "molecular" techniques, which lead to unnecessary delay in processing and to unnecessary destruction of food products.

  17. Data classification using metaheuristic Cuckoo Search technique for Levenberg Marquardt back propagation (CSLM) algorithm

    NASA Astrophysics Data System (ADS)

    Nawi, Nazri Mohd.; Khan, Abdullah; Rehman, M. Z.

    2015-05-01

    Nature-inspired metaheuristic techniques provide derivative-free solutions to complex problems. One of the latest additions to this group of optimization procedures is the Cuckoo Search (CS) algorithm. Artificial Neural Network (ANN) training is an optimization task, since the goal is to find the optimal weight set of the network during the training process. Traditional training algorithms have limitations such as getting trapped in local minima and slow convergence rates. This study proposes a new technique, CSLM, which combines the best features of two known algorithms, back-propagation (BP) and Levenberg-Marquardt (LM), to improve the convergence speed of ANN training and to avoid the local minima problem. Selected benchmark classification datasets are used for simulation. The experimental results show that the proposed Cuckoo Search with Levenberg-Marquardt algorithm performs better than the other algorithms used in this study.
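
    CSLM couples Cuckoo Search with Levenberg-Marquardt weight updates; the exact coupling is not given in the abstract. As a hedged illustration of the Cuckoo Search half only, here is a minimal sketch with Lévy-flight steps and nest abandonment on a toy objective (all parameter values are illustrative):

```python
import math
import random

def sphere(x):
    # Benchmark objective: sum of squares, minimum 0 at the origin.
    return sum(v * v for v in x)

def levy_step(beta=1.5):
    # Mantegna's algorithm for a heavy-tailed, Levy-distributed step length.
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return random.gauss(0, sigma) / abs(random.gauss(0, 1)) ** (1 / beta)

def cuckoo_search(f, dim=3, n_nests=15, pa=0.25, iters=200, alpha=0.01, seed=1):
    random.seed(seed)
    nests = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_nests)]
    fit = [f(x) for x in nests]
    best = min(range(n_nests), key=lambda k: fit[k])
    for _ in range(iters):
        # New cuckoo egg: Levy flight from a random nest, biased toward the best.
        i = random.randrange(n_nests)
        cand = [x + alpha * levy_step() * (x - nests[best][d])
                for d, x in enumerate(nests[i])]
        fc = f(cand)
        j = random.randrange(n_nests)  # replace a random nest if the egg is better
        if fc < fit[j]:
            nests[j], fit[j] = cand, fc
        # Abandon a fraction pa of the worst nests, replacing them at random.
        worst_first = sorted(range(n_nests), key=lambda k: fit[k], reverse=True)
        for k in worst_first[:int(pa * n_nests)]:
            nests[k] = [random.uniform(-5, 5) for _ in range(dim)]
            fit[k] = f(nests[k])
        best = min(range(n_nests), key=lambda k: fit[k])
    return nests[best], fit[best]
```

    In the ANN-training setting of the paper, the decision vector would be the network's weights and the objective the training error, with LM steps substituted for (or interleaved with) the Lévy flights.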

  18. Fabrication of Fe1.1Se0.5Te0.5 bulk by a high energy ball milling technique

    NASA Astrophysics Data System (ADS)

    Liu, Jixing; Li, Chengshan; Zhang, Shengnan; Feng, Jianqing; Zhang, Pingxiang; Zhou, Lian

    2017-11-01

    Fe1.1Se0.5Te0.5 superconducting bulks were successfully synthesized by a high energy ball milling (HEBM) aided sintering technique. Two advantages of this new technique over the traditional solid-state sintering method have been revealed. One is the greatly increased density of the sintered bulks: because precursor powders containing β-Fe(Se, Te) and δ-Fe(Se, Te) are obtained directly by the HEBM process without the formation of liquid Se (and Te), the huge volume expansion is avoided. The other is an obvious decrease in sintering temperature and dwell time due to effectively shortened diffusion paths. The superconducting critical temperature Tc of 14.2 K in our sample is comparable with those in previous reports, and further optimization of the chemical composition is under way.

  19. A noncontact laser technique for circular contouring accuracy measurement

    NASA Astrophysics Data System (ADS)

    Wang, Charles; Griffin, Bob

    2001-02-01

    Worldwide competition in manufacturing frequently requires high-speed machine tools to deliver contouring accuracy on the order of a few micrometers while moving at relatively high feed rates. Traditional test equipment is rather limited in its capability to measure contours of small radius at high speed. Described here is a new noncontact laser measurement technique for testing circular contouring accuracy. The technique is based on a single-aperture laser Doppler displacement meter with a flat mirror as the target. It is noncontact, with the ability to vary the circular path radius continuously at data rates of up to 1000 Hz. Using this instrument, the actual radius, feed rate, velocity, and acceleration profiles can also be determined. The basic theory of operation, the hardware setup, the data collection, the data processing, and the error budget are discussed.

  20. A review of demodulation techniques for amplitude-modulation atomic force microscopy

    PubMed Central

    Harcombe, David M; Ragazzon, Michael R P; Moheimani, S O Reza; Fleming, Andrew J

    2017-01-01

    In this review paper, traditional and novel demodulation methods applicable to amplitude-modulation atomic force microscopy are implemented on a widely used digital processing system. As a crucial bandwidth-limiting component in the z-axis feedback loop of an atomic force microscope, the purpose of the demodulator is to obtain estimates of amplitude and phase of the cantilever deflection signal in the presence of sensor noise or additional distinct frequency components. Specifically for modern multifrequency techniques, where higher harmonic and/or higher eigenmode contributions are present in the oscillation signal, the fidelity of the estimates obtained from some demodulation techniques is not guaranteed. To enable a rigorous comparison, the performance metrics tracking bandwidth, implementation complexity and sensitivity to other frequency components are experimentally evaluated for each method. Finally, the significance of an adequate demodulator bandwidth is highlighted during high-speed tapping-mode atomic force microscopy experiments in constant-height mode. PMID:28900596
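
    The workhorse among the traditional methods surveyed in this area is synchronous (lock-in) demodulation: mix the deflection signal with quadrature references at the drive frequency, then low-pass the products to recover amplitude and phase. A minimal sketch, assuming the sample window spans an integer number of reference cycles so a plain average acts as the low-pass filter:

```python
import math

def lockin_demodulate(samples, fs, f_ref):
    # Mix with quadrature references at f_ref, then low-pass by averaging.
    # Exact (up to rounding) when the window covers whole reference cycles.
    n = len(samples)
    i_avg = sum(s * math.cos(2 * math.pi * f_ref * k / fs)
                for k, s in enumerate(samples)) / n
    q_avg = sum(s * math.sin(2 * math.pi * f_ref * k / fs)
                for k, s in enumerate(samples)) / n
    amplitude = 2 * math.hypot(i_avg, q_avg)
    phase = math.atan2(-q_avg, i_avg)
    return amplitude, phase
```

    For s(t) = A cos(2π f_ref t + φ) the mixer outputs average to (A/2) cos φ and -(A/2) sin φ, so the estimates above are exact; a real demodulator replaces the average with a low-pass filter whose cutoff sets the tracking bandwidth discussed in the review.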

  1. Robot-aided electrospinning toward intelligent biomedical engineering.

    PubMed

    Tan, Rong; Yang, Xiong; Shen, Yajing

    2017-01-01

    The rapid development of robotics offers new opportunities for traditional biofabrication with higher accuracy and controllability, holding great potential for intelligent biomedical engineering. This paper reviews the state of the art of robotics in a widely used biomaterial fabrication process, electrospinning, including its working principle, main applications, challenges, and prospects. First, the principle and technique of electrospinning are introduced, categorized into melt electrospinning, solution electrospinning, and near-field electrospinning. Then, the applications of electrospinning in biomedical engineering are introduced briefly from the aspects of drug delivery, tissue engineering, and wound dressing. After that, we summarize the existing problems in traditional electrospinning, such as low production rate, rough nanofibers, and uncontrolled morphology, and discuss how those problems are addressed by robotics via four case studies. Lastly, the challenges and outlook of robotics in electrospinning are discussed.

  2. A front-end wafer-level microsystem packaging technique with micro-cap array

    NASA Astrophysics Data System (ADS)

    Chiang, Yuh-Min

    2002-09-01

    The back-end packaging process is the remaining challenge for the micromachining industry in commercializing microsystem technology (MST) devices at low cost. This dissertation presents a novel wafer-level protection technique applied as a final step of the front-end fabrication process for MSTs. It facilitates improved manufacturing throughput and automation in package assembly, wafer-level testing of devices, and enhanced device performance. The method uses a wafer-sized micro-cap array: an assortment of small caps micro-molded with adjustable shapes and sizes to serve as protective structures against the hostile environments encountered during packaging. The micro-cap array is first constructed by a micromachining process with a micro-molding technique, then sealed to the device wafer at wafer level. An epoxy-based wafer-level micro-cap array has been successfully fabricated and showed good compatibility with conventional back-end packaging processes. An adhesive transfer technique was demonstrated to seal the micro-cap array to a MEMS device wafer. No damage or gross leak was observed during wafer dicing or in a subsequent gross-leak test. Applications of the micro-cap array are demonstrated on MEMS microactuators fabricated using the CRONOS MUMPS process. Depending on application needs, the micro-molded cap can be designed and modified to provide additional component functions, such as optical, electrical, mechanical, and chemical functions, which are not easily achieved in the device by traditional means. Successful fabrication of a micro-cap array comprising microlenses can provide active functions as well as passive protection. An optical tweezer array is one possible application of a micro-cap with microlenses, and the micro-cap itself could serve as a micro-well for DNA or bacteria amplification.

  3. An Evaluation of Understandability of Patient Journey Models in Mental Health.

    PubMed

    Percival, Jennifer; McGregor, Carolyn

    2016-07-28

    There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information systems and health information technology can benefit the patients, staff, and the delivery of care. This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing-based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Five business process modeling techniques were used to represent a selected patient journey. A mix of qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. The preliminary evaluation of the usability of the 5 modeling techniques has shown increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased the engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches. 
Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers.

  4. Development of lead zirconate titanate cantilevers on the micrometer length scale

    NASA Astrophysics Data System (ADS)

    Martin, Christopher Robert

    The objective of this research project was to fabricate a functional ferroelectric microcantilever from patterned lead zirconate titanate (PZT) thin films. Cantilevers fabricated from ferroelectric materials have tremendous potential in sensing applications, particularly due to the increased sensitivity that miniaturized devices offer. This thesis highlights and explores a number of the processing issues that hindered the production of a working prototype. PZT is patterned using soft lithography-inspired techniques from a PZT chemical precursor solution derived by the chelation synthesis route. As the ability to pattern ceramic materials derived from sol-gels on the micrometer scale is a relatively new technology, this thesis aims to expand the scientific understanding of the new issues that arise when working with these patterned films. For example, the use of Micromolding in Capillaries (MIMIC) to pattern the PZT thin films results in topographical distortions from the shape of the original mold as the patterned thin film shrinks during drying and sintering. The factors contributing to this effect are explained, and a new processing technique called MicroChannel Molding (muCM) was developed. This new process combines the advantages of soft lithography with traditional silicon microfabrication techniques to ensure compatibility with current industrial practices. This work lays the foundation for the future production of working ferroelectric microcantilevers. The proposed microfabrication process is described along with each processing difficulty that was encountered. Modifications to the process are proposed, along with descriptions of alternative processing techniques that were attempted, for the benefit of future researchers. This dissertation concludes with the electronic characterization of micropatterned PZT thin films. 
To our knowledge, the ferroelectric properties of patterned PZT thin films have never been directly characterized before. The properties are measured with a commercial ferroelectric test system connected through a conductive Atomic Force Microscope tip. The films patterned by MIMIC and muCM are compared to large-area spin cast films to identify the role that the processing method has on the resulting properties.

  5. Dysphagia Screening: Contributions of Cervical Auscultation Signals and Modern Signal-Processing Techniques

    PubMed Central

    Dudik, Joshua M.; Coyle, James L.

    2015-01-01

    Cervical auscultation is the recording, at the throat, of the sounds and vibrations produced by the human body during swallowing. While cervical auscultation has traditionally been performed by a trained clinician with a stethoscope, much work has been put towards developing more sensitive and clinically useful methods to characterize the data obtained with this technique. The eventual goal of the field is to improve the effectiveness of screening algorithms designed to predict the risk that swallowing disorders pose to individual patients' health and safety. This paper provides an overview of these signal processing techniques and summarizes recent advances made with digital transducers, in the hope of organizing the highly varied research on cervical auscultation. It investigates where on the body these transducers are placed in order to record a signal, as well as the collection of analog and digital filtering techniques used to further improve signal quality. It also presents the wide array of methods and features used to characterize these signals, ranging from simply counting the number of swallows that occur over a period of time to calculating various descriptive features in the time, frequency, and phase-space domains. Finally, this paper presents the algorithms that have been used to classify these data into 'normal' and 'abnormal' categories. Both linear and non-linear techniques are presented in this regard. PMID:26213659
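
    As a concrete, simplified illustration of the descriptive features mentioned above (not any specific published feature set), the sketch below computes a few time-domain statistics and a naive-DFT dominant frequency for a short signal window:

```python
import math

def signal_features(x, fs):
    # A few descriptive features of a 1-D signal window sampled at fs Hz.
    n = len(x)
    mean = sum(x) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    # Count strict sign changes about the mean (a crude activity measure).
    crossings = sum(1 for a, b in zip(x, x[1:]) if (a - mean) * (b - mean) < 0)
    # Dominant frequency via a naive DFT (adequate for short windows).
    best_k, best_power = 0, -1.0
    for k in range(1, n // 2):
        re = sum(v * math.cos(2 * math.pi * k * i / n) for i, v in enumerate(x))
        im = sum(v * math.sin(2 * math.pi * k * i / n) for i, v in enumerate(x))
        if re * re + im * im > best_power:
            best_k, best_power = k, re * re + im * im
    return {"mean": mean, "std": std, "zero_crossings": crossings,
            "dominant_freq_hz": best_k * fs / n}
```

    A screening pipeline would compute such features per candidate swallow segment and feed them to a (linear or non-linear) classifier of the kind surveyed in the paper.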

  6. A machine-learned computational functional genomics-based approach to drug classification.

    PubMed

    Lötsch, Jörn; Ultsch, Alfred

    2016-12-01

    The public accessibility of "big data" about the molecular targets of drugs and the biological functions of genes allows novel data science-based approaches to pharmacology that link drugs directly with their effects on pathophysiologic processes. This provides a phenotypic path to drug discovery and repurposing. This paper compares the performance of a functional genomics-based criterion to the traditional drug target-based classification. Knowledge discovery in the DrugBank and Gene Ontology databases allowed the construction of a "drug target versus biological process" matrix as a combination of "drug versus genes" and "genes versus biological processes" matrices. As a canonical example, such matrices were constructed for classical analgesic drugs. These matrices were projected onto a toroid grid of 50 × 82 artificial neurons using a self-organizing map (SOM). The distance and cluster structure of the high-dimensional feature space of the matrices was visualized on top of this SOM using a U-matrix. The cluster structure emerging on the U-matrix provided a correct classification of the analgesics into the two main classes of opioid and non-opioid analgesics. The classification was flawless with both the functional genomics and the traditional target-based criterion. The functional genomics approach inherently included the drugs' modulatory effects on biological processes. The main pharmacological actions known from pharmacological science were captured, e.g., actions on lipid signaling for the non-opioid analgesics, which comprised many NSAIDs, and actions on neuronal signal transmission for the opioid analgesics. Using machine-learning techniques for computational drug classification in a comparative assessment, a functional genomics-based criterion was found to be as suitable for drug classification as the traditional target-based criterion. This supports the utility of functional genomics-based approaches to computational systems pharmacology for drug discovery and repurposing.
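
    The matrix combination step described above reduces, in the Boolean case, to a sparse matrix product: a drug is linked to a biological process if any of its target genes carries that annotation. A toy sketch with set-valued rows (the gene symbols are real, but the tiny annotation table is invented for illustration, not extracted from DrugBank or GO):

```python
def drug_process_matrix(drug_genes, gene_processes):
    # Boolean "matrix product" of (drug x gene) and (gene x process) relations:
    # a drug is linked to every process annotated to any of its target genes.
    return {drug: set().union(*(gene_processes.get(g, set()) for g in genes))
            for drug, genes in drug_genes.items()}

# Toy relations; real ones would be mined from DrugBank and Gene Ontology.
drug_genes = {"morphine": {"OPRM1"}, "ibuprofen": {"PTGS1", "PTGS2"}}
gene_processes = {
    "OPRM1": {"neuronal signal transmission"},
    "PTGS1": {"lipid signaling"},
    "PTGS2": {"lipid signaling", "inflammatory response"},
}
matrix = drug_process_matrix(drug_genes, gene_processes)
```

    The rows of such a matrix (one binary process-profile per drug) are what the paper projects onto the SOM for clustering.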

  7. Effect of magnetic polarity on surface roughness during magnetic field assisted EDM of tool steel

    NASA Astrophysics Data System (ADS)

    Efendee, A. M.; Saifuldin, M.; Gebremariam, MA; Azhari, A.

    2018-04-01

    Electrical discharge machining (EDM) is a non-traditional machining technique that offers a wide range of parameter manipulation and machining applications. However, surface roughness, material removal rate, electrode wear and operating costs are among the topmost issues with this technique. Placing a magnetic device around the machining area offers intriguing output to be investigated, and the effect of magnetic polarity on EDM has remained unexamined. The aim of this research is to investigate the effect of magnetic polarity on surface roughness during magnetic field assisted electrical discharge machining (MFAEDM) of tool steel (AISI 420 mod.) using a graphite electrode. A magnet with a strength of 18 Tesla was applied to the EDM process at selected parameters. The sparks under magnetic field assisted EDM produced a better surface finish than the normal conventional EDM process. In the presence of the high magnetic field, the spark was squeezed, and the discharge craters generated on the machined surface were tiny and shallow. The correct magnetic polarity combination in the MFAEDM process is highly useful for attaining high-efficiency machining and an improved quality of surface finish to meet the demands of modern industrial applications.

  8. Real-time continuous visual biofeedback in the treatment of speech breathing disorders following childhood traumatic brain injury: report of one case.

    PubMed

    Murdoch, B E; Pitt, G; Theodoros, D G; Ward, E C

    1999-01-01

    The efficacy of traditional and physiological biofeedback methods for modifying abnormal speech breathing patterns was investigated in a child with persistent dysarthria following severe traumatic brain injury (TBI). An A-B-A-B single-subject experimental research design was utilized to provide the subject with two exclusive periods of therapy for speech breathing, based on traditional therapy techniques and physiological biofeedback methods, respectively. Traditional therapy techniques included establishing optimal posture for speech breathing, explanation of the movement of the respiratory muscles, and a hierarchy of non-speech and speech tasks focusing on establishing an appropriate level of sub-glottal air pressure and improving the subject's control of inhalation and exhalation. The biofeedback phase of therapy utilized variable inductance plethysmography (or Respitrace) to provide real-time, continuous visual biofeedback of ribcage circumference during breathing. As in traditional therapy, a hierarchy of non-speech and speech tasks was devised to improve the subject's control of his respiratory pattern. Throughout the project, the subject's respiratory support for speech was assessed both instrumentally and perceptually. Instrumental assessment included kinematic and spirometric measures, and perceptual assessment included the Frenchay Dysarthria Assessment, Assessment of Intelligibility of Dysarthric Speech, and analysis of a speech sample. The results of the study demonstrated that real-time continuous visual biofeedback techniques for modifying speech breathing patterns were not only effective, but superior to the traditional therapy techniques for modifying abnormal speech breathing patterns in a child with persistent dysarthria following severe TBI. These results show that physiological biofeedback techniques are potentially useful clinical tools for the remediation of speech breathing impairment in the paediatric dysarthric population.

  9. Alternatives to current flow cytometry data analysis for clinical and research studies.

    PubMed

    Gondhalekar, Carmen; Rajwa, Bartek; Patsekin, Valery; Ragheb, Kathy; Sturgis, Jennifer; Robinson, J Paul

    2018-02-01

    Flow cytometry has well-established methods for data analysis based on traditional data collection techniques. These techniques typically involved manual insertion of tube samples into an instrument that, historically, could only measure 1-3 colors. The field has since evolved to incorporate new technologies for faster and highly automated sample preparation and data collection. For example, the use of microwell plates on benchtop instruments is now a standard on virtually every new instrument, and so users can easily accumulate multiple data sets quickly. Further, because the user must carefully define the layout of the plate, this information is already defined when considering the analytical process, expanding the opportunities for automated analysis. Advances in multi-parametric data collection, as demonstrated by the development of hyperspectral flow-cytometry, 20-40 color polychromatic flow cytometry, and mass cytometry (CyTOF), are game-changing. As data and assay complexity increase, so too does the complexity of data analysis. Complex data analysis is already a challenge to traditional flow cytometry software. New methods for reviewing large and complex data sets can provide rapid insight into processes difficult to define without more advanced analytical tools. In settings such as clinical labs where rapid and accurate data analysis is a priority, rapid, efficient and intuitive software is needed. This paper outlines opportunities for analysis of complex data sets using examples of multiplexed bead-based assays, drug screens and cell cycle analysis - any of which could become integrated into the clinical environment. Copyright © 2017. Published by Elsevier Inc.

  10. Visualization-by-Sketching: An Artist's Interface for Creating Multivariate Time-Varying Data Visualizations.

    PubMed

    Schroeder, David; Keefe, Daniel F

    2016-01-01

    We present Visualization-by-Sketching, a direct-manipulation user interface for designing new data visualizations. The goals are twofold: First, make the process of creating real, animated, data-driven visualizations of complex information more accessible to artists, graphic designers, and other visual experts with traditional, non-technical training. Second, support and enhance the role of human creativity in visualization design, enabling visual experimentation and workflows similar to what is possible with traditional artistic media. The approach is to conceive of visualization design as a combination of processes that are already closely linked with visual creativity: sketching, digital painting, image editing, and reacting to exemplars. Rather than studying and tweaking low-level algorithms and their parameters, designers create new visualizations by painting directly on top of a digital data canvas, sketching data glyphs, and arranging and blending together multiple layers of animated 2D graphics. This requires new algorithms and techniques to interpret painterly user input relative to data "under" the canvas, balance artistic freedom with the need to produce accurate data visualizations, and interactively explore large (e.g., terabyte-sized) multivariate datasets. Results demonstrate that a variety of multivariate data visualization techniques can be rapidly recreated using the interface. More importantly, results and feedback from artists support the potential for interfaces in this style to attract new, creative users to the challenging task of designing more effective data visualizations and to help these users stay "in the creative zone" as they work.

  11. Data-driven in computational plasticity

    NASA Astrophysics Data System (ADS)

    Ibáñez, R.; Abisset-Chavanne, E.; Cueto, E.; Chinesta, F.

    2018-05-01

    Computational mechanics is assuming enormous importance in industry nowadays. On the one hand, numerical simulations allow industry to perform fewer experiments, reducing costs. On the other hand, the physical processes to be simulated are becoming more complex, requiring new constitutive relationships to capture their behavior. Therefore, when a new material is to be characterized, an open question remains: which constitutive equation should be calibrated? In the present work, model order reduction techniques are exploited to identify the plastic behavior of a material, opening an alternative route to traditional calibration methods. The main objective is to provide a plastic yield function such that the mismatch between experiments and simulations is minimized. Once the experimental results and the parameterization of the plastic yield function are provided, finding the optimal plastic yield function can be seen either as a traditional optimization problem or as an interpolation problem. It is important to highlight that the dimensionality of the problem equals the number of dimensions in the parameterization of the yield function, so the use of sparse interpolation techniques is almost compulsory.
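
    Viewed as a traditional optimization problem, calibration picks, from a parameterized family of yield functions, the member that minimizes the experiment-simulation mismatch. A deliberately tiny sketch with a hypothetical linear-hardening parameterization and a brute-force grid (real yield-surface parameterizations are higher-dimensional, which is why the abstract points to sparse interpolation):

```python
def linear_hardening(strain, params):
    # Hypothetical one-curve stand-in for a yield function: sigma_y + H * strain.
    sigma_y, hardening = params
    return sigma_y + hardening * strain

def calibrate(model, param_grid, experiments):
    # Choose the parameter set minimizing squared experiment-model mismatch.
    def mismatch(p):
        return sum((model(strain, p) - stress) ** 2
                   for strain, stress in experiments)
    return min(param_grid, key=mismatch)

# Synthetic "measurements" consistent with sigma_y = 200, H = 1000 (units arbitrary):
experiments = [(0.00, 200.0), (0.01, 210.0), (0.02, 220.0)]
grid = [(sy, h) for sy in (150.0, 200.0, 250.0) for h in (500.0, 1000.0, 2000.0)]
best_params = calibrate(linear_hardening, grid, experiments)
```

    The grid here has 9 points; with a d-parameter yield surface a full grid grows exponentially in d, motivating the sparse-grid interpolation the authors advocate.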

  12. Josephson frequency meter for millimeter and submillimeter wavelengths

    NASA Technical Reports Server (NTRS)

    Anischenko, S. E.; Larkin, S. Y.; Chaikovsky, V. I.; Kabayev, P. V.; Kamyshin, V. V.

    1995-01-01

    Frequency measurements of electromagnetic oscillations in the millimeter and submillimeter wavebands become increasingly difficult as frequency grows, for several reasons. First, these frequencies are considered cutoffs for semiconductor converting devices, so optical measurement methods must be used instead of traditional frequency-transfer methods. Second, resonance measurement methods are characterized by relatively narrow bands, while optical methods are limited in frequency and time resolution by the limited range and velocity of their moving mechanical elements; moreover, the efficiency of optical techniques decreases with increasing wavelength due to diffraction losses. These approaches also require a priori information on the radiation frequency band of the source involved. A method of measuring the frequency of harmonic microwave signals in the millimeter and submillimeter wavebands based on the ac Josephson effect in superconducting contacts is free of all the above drawbacks. This approach offers a number of major advantages over the more traditional measurement methods, namely those based on frequency conversion, resonance, and interferometry. It is characterized by high potential accuracy, a wide range of measurable frequencies, prompt measurement, the opportunity to obtain a panoramic display of results, and full automation of the measuring process.
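
    The physics behind the method is the ac Josephson relation f = 2eV/h: a junction biased at voltage V oscillates at a frequency proportional to V, about 483.6 GHz per millivolt, which lands squarely in the submillimeter band. A quick numerical check using the exact SI defining constants:

```python
E = 1.602176634e-19   # elementary charge in coulombs (exact SI value)
H = 6.62607015e-34    # Planck constant in joule-seconds (exact SI value)
C = 299792458.0       # speed of light in m/s (exact SI value)

def josephson_frequency(voltage):
    # ac Josephson effect: a junction at bias V radiates at f = 2 e V / h.
    return 2.0 * E * voltage / H

f_1mV = josephson_frequency(1e-3)   # roughly 4.836e11 Hz, i.e. ~483.6 GHz
wavelength = C / f_1mV              # roughly 0.62 mm: submillimeter band
```

    Inverting the relation, a measured voltage across the junction yields the unknown signal frequency, which is why the method needs no a priori knowledge of the source band.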

  13. A Big Data Guide to Understanding Climate Change: The Case for Theory-Guided Data Science.

    PubMed

    Faghmous, James H; Kumar, Vipin

    2014-09-01

    Global climate change and its impact on human life has become one of our era's greatest challenges. Despite the urgency, data science has had little impact on furthering our understanding of our planet in spite of the abundance of climate data. This is a stark contrast from other fields such as advertising or electronic commerce where big data has been a great success story. This discrepancy stems from the complex nature of climate data as well as the scientific questions climate science brings forth. This article introduces a data science audience to the challenges and opportunities to mine large climate datasets, with an emphasis on the nuanced difference between mining climate data and traditional big data approaches. We focus on data, methods, and application challenges that must be addressed in order for big data to fulfill their promise with regard to climate science applications. More importantly, we highlight research showing that solely relying on traditional big data techniques results in dubious findings, and we instead propose a theory-guided data science paradigm that uses scientific theory to constrain both the big data techniques as well as the results-interpretation process to extract accurate insight from large climate data.

  14. Disaster victim identification of military aircrew, 1945-2002.

    PubMed

    Smith, Adrian

    2003-11-01

    Aviation accident fatalities are characterized by substantial tissue disruption and fragmentation, limiting the usefulness of traditional identification methods. This study examines the success of disaster victim identification (DVI) in military aviation accident fatalities in the Australian Defence Force (ADF). Accident reports and autopsy records of aircrew fatalities during the period 1945-2002 were examined to identify difficulties experienced during the DVI process or injuries that would prevent identification of remains using non-DNA methods. The ADF sustained 301 aircrew fatalities in 144 accidents during the period 1945-2002. The autopsy reports for 117 fatalities were reviewed (covering 73.7% of aircrew fatalities from 1960-2002). Of the 117 victims, 38 (32.4%) sustained injuries severe enough to prevent identification by traditional (non-DNA) comparative scientific DVI techniques of fingerprint and dental analysis. Many of the ADF fatalities who could not be positively identified in the past could be identified today through the use of DNA techniques. Successful DNA identification, however, depends on having a reference DNA profile. This paper recommends the establishment of a DNA repository to store reference blood samples to facilitate the identification of ADF aircrew remains without causing additional distress to family members.

  15. How "Flipping" the Classroom Can Improve the Traditional Lecture

    ERIC Educational Resources Information Center

    Berrett, Dan

    2012-01-01

    In this article, the author discusses a teaching technique called "flipping" and describes how "flipping" the classroom can improve the traditional lecture. As its name suggests, flipping describes the inversion of expectations in the traditional college lecture. It takes many forms, including interactive engagement, just-in-time teaching (in…

  16. Use Hierarchical Storage and Analysis to Exploit Intrinsic Parallelism

    NASA Astrophysics Data System (ADS)

    Zender, C. S.; Wang, W.; Vicente, P.

    2013-12-01

    Big Data is an ugly name for the scientific opportunities and challenges created by the growing wealth of geoscience data. How do we weave large, disparate datasets together to best reveal their underlying properties, to exploit their strengths and minimize their weaknesses, and to continually aggregate more information than the world knew yesterday and less than we will learn tomorrow? Data analytics techniques (statistics, data mining, machine learning, etc.) can accelerate pattern recognition and discovery. However, researchers must often organize multiple related datasets into a coherent framework prior to analysis. Hierarchical organization permits an entire dataset to be stored in nested groups that reflect its intrinsic relationships and similarities. Hierarchical data can be simpler and faster to analyze when operators are coded to automatically parallelize processing over isomorphic storage units, i.e., groups. The newest generation of netCDF Operators (NCO) embodies this hierarchical approach, while still supporting traditional analysis approaches. We will use NCO to demonstrate the trade-offs involved in processing a prototypical Big Data application (analysis of CMIP5 datasets) using hierarchical and traditional analysis approaches.
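
    The "same operator per group" pattern is what makes hierarchical storage easy to parallelize. A small Python sketch of the idea, using a dict that stands in for netCDF4-style groups (the group paths and variable values are invented for illustration; NCO applies this pattern internally over real files):

```python
from concurrent.futures import ThreadPoolExecutor
from statistics import mean

# Invented hierarchy standing in for netCDF4 groups (e.g., one group per model):
dataset = {
    "/cmip5/modelA": {"tas": [288.1, 288.4, 288.9]},
    "/cmip5/modelB": {"tas": [287.6, 287.8, 288.0]},
}

def group_mean(item):
    # The identical operator is applied independently to each isomorphic group,
    # so the groups can be processed in parallel with no coordination.
    path, variables = item
    return path, mean(variables["tas"])

with ThreadPoolExecutor() as pool:
    results = dict(pool.map(group_mean, dataset.items()))
```

    Because each group is self-contained, the worker pool needs no shared state; the traditional flat-file alternative would require the analyst to script the per-model bookkeeping by hand.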

  17. Decision Support System Requirements Definition for Human Extravehicular Activity Based on Cognitive Work Analysis

    PubMed Central

    Miller, Matthew James; McGuire, Kerry M.; Feigh, Karen M.

    2016-01-01

    The design and adoption of decision support systems within complex work domains is a challenge for cognitive systems engineering (CSE) practitioners, particularly at the onset of project development. This article presents an example of applying CSE techniques to derive design requirements compatible with traditional systems engineering to guide decision support system development. Specifically, it demonstrates the requirements derivation process based on cognitive work analysis for a subset of human spaceflight operations known as extravehicular activity. The results are presented in two phases. First, a work domain analysis revealed a comprehensive set of work functions and constraints that exist in the extravehicular activity work domain. Second, a control task analysis was performed on a subset of the work functions identified by the work domain analysis to articulate the translation of subject matter states of knowledge to high-level decision support system requirements. This work emphasizes an incremental requirements specification process as a critical component of CSE analyses to better situate CSE perspectives within the early phases of traditional systems engineering design. PMID:28491008

  18. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although rapid prototyping is the closer of the two to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at the JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules and processes for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method and the current status of the successful incorporation into the current JPL development policies and processes.

  19. Decision Support System Requirements Definition for Human Extravehicular Activity Based on Cognitive Work Analysis.

    PubMed

    Miller, Matthew James; McGuire, Kerry M; Feigh, Karen M

    2017-06-01

    The design and adoption of decision support systems within complex work domains is a challenge for cognitive systems engineering (CSE) practitioners, particularly at the onset of project development. This article presents an example of applying CSE techniques to derive design requirements compatible with traditional systems engineering to guide decision support system development. Specifically, it demonstrates the requirements derivation process based on cognitive work analysis for a subset of human spaceflight operations known as extravehicular activity. The results are presented in two phases. First, a work domain analysis revealed a comprehensive set of work functions and constraints that exist in the extravehicular activity work domain. Second, a control task analysis was performed on a subset of the work functions identified by the work domain analysis to articulate the translation of subject matter states of knowledge to high-level decision support system requirements. This work emphasizes an incremental requirements specification process as a critical component of CSE analyses to better situate CSE perspectives within the early phases of traditional systems engineering design.

  20. Analysis of Coaxial Soil Cell in Reflection and Transmission

    PubMed Central

    Pelletier, Mathew G.; Viera, Joseph A.; Schwartz, Robert C.; Evett, Steven R.; Lascano, Robert J.; McMichael, Robert L.

    2011-01-01

    Accurate measurement of moisture content is a prime requirement in hydrological, geophysical and biogeochemical research, as well as for material characterization and process control. Within these areas, accurate measurement of surface area and bound water content is becoming increasingly important for answering fundamental questions ranging from characterization of cotton fiber maturity, to characterization of soil water content in soil water conservation research, to bio-plant water utilization, to chemical reactions and diffusion of ionic species across membranes in cells as well as in the dense suspensions that occur in surface films. In these bound-water materials, the errors of the traditional time-domain reflectometer, “TDR”, exceed the full span of the material’s permittivity being measured. Thus, there is a critical need to re-examine the TDR system and identify the sources of error to direct future research. One promising technique for meeting the increasing demands for higher-accuracy water content measurements is electrical permittivity characterization of materials. This technique has enjoyed a strong following in the soil-science and geological community through measurements of apparent permittivity via time-domain reflectometry, as well as in many process control applications. Recent research, however, indicates a need for accuracy beyond that available from traditional TDR. The most logical pathway is then a transition from TDR-based measurements to network analyzer measurements of absolute permittivity, which remove the adverse effects that high-surface-area soils and conductivity impart on measurements of apparent permittivity in traditional TDR applications. This research examines the theoretical basis of the coaxial probe, from which the modern TDR probe originated, to provide a basis on which to perform absolute permittivity measurements.
The research reveals that formulations currently used in accepted permittivity measurement techniques violate the underlying assumptions of the basic models, because the TDR acts as an antenna, radiating energy off the end of the probe rather than returning it to the source as the models assume. To remove the effects of radiation from the experimental results obtained herein, this research used custom-designed coaxial probes of various diameters and probe lengths to test the accuracy of the coaxial cell technique for determining absolute permittivity. In doing so, the research reveals that the basic models available in the literature all omit a key correction factor, hypothesized here to arise from fringe capacitance. To test this theory, a Poisson model of a coaxial cell was formulated to calculate the effective extra length contributed by the fringe capacitance, which is then used to correct the experimental results. After this correction, measurements made with differing coaxial cell diameters and probe lengths all produce the same results, thereby supporting the use of the augmented measurement technique described herein for measurement of absolute permittivity, as opposed to the traditional TDR measurement of apparent permittivity. PMID:22163757
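    The apparent-permittivity measurement the abstract critiques rests on the standard TDR relation between two-way travel time and probe length, epsilon_a = (c * t / (2 * L))^2. The sketch below illustrates that relation and how an effective extra length (the fringe-capacitance correction the paper hypothesizes) would enter it; the function name and the delta-length parameter are illustrative, and no numeric values here are taken from the paper.

```python
# Standard TDR relation: apparent relative permittivity from the
# two-way travel time of a pulse along a probe of known length.
C = 299_792_458.0  # speed of light in vacuum, m/s

def apparent_permittivity(travel_time_s, probe_len_m, delta_len_m=0.0):
    """epsilon_a = (c * t / (2 * L_eff))**2.

    delta_len_m models the effective extra electrical length attributed
    to fringe capacitance at the probe end (illustrative correction,
    analogous in role to the paper's Poisson-model-derived factor).
    """
    eff_len = probe_len_m + delta_len_m
    return (C * travel_time_s / (2.0 * eff_len)) ** 2
```

    Ignoring the extra length (delta_len_m = 0) inflates the computed permittivity, which is one way a small end-of-probe effect can bias apparent-permittivity results across probe geometries.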
