Sample records for existing techniques include the following:

  1. Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.

    PubMed

    Tute, Erik; Steiner, Jochen

    2018-01-01

    Literature describes a big potential for reuse of clinical patient data, and a clinical data warehouse (CDWH) is a means to that end. The objective was to support the management and maintenance of the processes that extract, transform and load (ETL) data into CDWHs, and to ease the reuse of metadata between regular IT management, the CDWH and secondary data users, by providing a modeling approach. Methods comprised an expert survey and a literature review to identify requirements and existing modeling techniques. An ETL modeling technique was developed by extending existing modeling techniques, and was evaluated by modeling an existing ETL process as an example and by a second expert survey. Nine experts participated in the first survey. The literature review yielded 15 included publications, from which six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated; seven experts participated in the evaluation. The developed approach can help in the management and maintenance of ETL processes and could serve as an interface between regular IT management, the CDWH and secondary data users.

  2. Photomorphic analysis techniques: An interim spatial analysis using satellite remote sensor imagery and historical data

    NASA Technical Reports Server (NTRS)

    Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.

    1977-01-01

    The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.

  3. Sensing Applied Load and Damage Effects in Composites with Nondestructive Techniques

    DTIC Science & Technology

    2017-05-01

    evaluation (NDE) techniques. Evaluation using piezoelectrically induced guided waves, acoustic emission, thermography, and X-ray imaging were compared...nondestructive inspection to further understanding of the material itself and the capabilities of various nondestructive evaluation (NDE) techniques...materials because of their inherent differences. NDE techniques exist that can evaluate composite structures for damage including C-Scan

  4. NASA reliability preferred practices for design and test

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Given here is a manual that was produced to communicate within the aerospace community design practices that have contributed to NASA mission success. The information represents the best technical advice that NASA has to offer on reliability design and test practices. Topics covered include reliability practices, including design criteria, test procedures, and analytical techniques that have been applied to previous space flight programs; and reliability guidelines, including techniques currently applied to space flight projects, where sufficient information exists to certify that the technique will contribute to mission success.

  5. Feasibility study on the design of a probe for rectal cancer detection

    NASA Technical Reports Server (NTRS)

    Anselm, V. J.; Frazer, R. E.; Lecroisset, D. H.; Roseboro, J. A.; Smokler, M. I.

    1977-01-01

    Rectal examination techniques are considered in terms of detection capability, patient acceptance, and cost reduction. A review of existing clinical techniques and of relevant aerospace technology included evaluation of the applicability of visual, thermal, ultrasound, and radioisotope modalities of examination. The desired improvements can be obtained by redesigning the proctosigmoidoscope to have reduced size, additional visibility, and the capability of readily providing a color photograph of the entire rectosigmoid mucosa in a single composite view.

  6. University role in astronaut life support systems: Portable thermal control systems

    NASA Technical Reports Server (NTRS)

    Ephrath, A. R.

    1971-01-01

    One of the most vital life support systems is that used to provide the astronaut with an adequate thermal environment. State-of-the-art techniques are reviewed for collecting and rejecting excess heat loads from the suited astronaut. Emphasis is placed on problem areas which exist and which may be suitable topics for university research. Areas covered include thermal control requirements and restrictions, methods of heat absorption and rejection or storage, and comparison between existing methods and possible future techniques.

  7. Efficient morse decompositions of vector fields.

    PubMed

    Chen, Guoning; Mischaikow, Konstantin; Laramee, Robert S; Zhang, Eugene

    2008-01-01

    Existing topology-based vector field analysis techniques rely on the ability to extract individual trajectories such as fixed points, periodic orbits, and separatrices, which are sensitive to noise and to errors introduced by simulation and interpolation. This can make such vector field analysis unsuitable for rigorous interpretation. We advocate the use of Morse decompositions, which are robust with respect to perturbations, to encode the topological structure of a vector field in the form of a directed graph, called a Morse connection graph (MCG). While an MCG exists for every vector field, it need not be unique. Previous techniques for computing MCGs, while fast, are overly conservative and usually result in MCGs that are too coarse to be useful in applications. To address this issue, we present a new technique for performing Morse decomposition based on the concept of tau-maps, which typically provides finer MCGs than existing techniques. Furthermore, the choice of tau provides a natural tradeoff between the fineness of the MCGs and the computational cost. We provide efficient implementations of Morse decomposition based on tau-maps, which include the use of forward and backward mapping techniques and an adaptive approach to constructing better approximations of the images of the triangles in the meshes used for simulation. Furthermore, we propose the use of spatial tau-maps in addition to the original temporal tau-maps. These techniques provide additional trade-offs between the quality of the MCGs and the speed of computation. We demonstrate the utility of our technique with various examples in the plane and on surfaces, including engine simulation data sets.
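    The tau-map construction described above can be sketched in a few lines: discretize the domain into cells, record which cells each cell's image overlaps after flowing for time tau, and take the strongly connected components of the resulting directed graph as candidate Morse sets. The following toy sketch (the 1-D "flow" and all names are illustrative assumptions, not the authors' implementation) shows the graph-condensation step:

```python
def tau_map_graph(cells, flow_to):
    """Directed graph: cell -> set of cells its image overlaps after time tau."""
    return {c: set(flow_to(c)) for c in cells}

def strongly_connected_components(graph):
    """Tarjan's algorithm (recursive; fine for a toy mesh)."""
    index, low, on_stack, stack = {}, {}, set(), []
    sccs, counter = [], [0]

    def visit(v):
        index[v] = low[v] = counter[0]; counter[0] += 1
        stack.append(v); on_stack.add(v)
        for w in graph[v]:
            if w not in index:
                visit(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:
            comp = set()
            while True:
                w = stack.pop(); on_stack.discard(w); comp.add(w)
                if w == v:
                    break
            sccs.append(comp)

    for v in graph:
        if v not in index:
            visit(v)
    return sccs

# Toy example: cells 0..4 on a line; the discrete "flow" pushes everything
# toward cell 2, which maps into itself (an attracting Morse set).
cells = range(5)
flow = {0: [1], 1: [2], 2: [2], 3: [2], 4: [3]}
graph = tau_map_graph(cells, lambda c: flow[c])

# Keep only recurrent components: self-looping cells or nontrivial SCCs.
morse_sets = [s for s in strongly_connected_components(graph)
              if any(v in graph[v] for v in s) or len(s) > 1]
print(morse_sets)  # -> [{2}]
```

    In the real technique the edges come from geometric overlap of advected triangles, and the condensed graph of SCCs is the MCG; the toy above only shows the combinatorial core.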

  8. Surgical Techniques for the Reconstruction of Medial Collateral Ligament and Posteromedial Corner Injuries of the Knee: A Systematic Review.

    PubMed

    DeLong, Jeffrey M; Waterman, Brian R

    2015-11-01

    To systematically review reconstruction techniques of the medial collateral ligament (MCL) and associated medial structures of the knee (e.g., posterior oblique ligament). A systematic review of the Medline/PubMed database (1966 to November 2013), reference list scanning and citation searches of included articles, and manual searches of high-impact journals (2000 to July 2013) and conference proceedings (2009 to July 2013) were performed to identify publications describing MCL reconstruction techniques of the knee. Exclusion criteria included (1) MCL primary repair techniques or advancement procedures, (2) lack of a clear description of the MCL reconstruction technique, (3) animal models, (4) nonrelevant study design, and (5) foreign-language articles without available translation. After review of 4,600 references, 25 publications with 359 of 388 patients (92.5%) were isolated for analysis, including 18 single-bundle MCL and 10 double-bundle reconstruction techniques. Only 2 techniques were classified as anatomic reconstructions, and their clinical and objective outcomes (n = 28; 100% <3 mm side-to-side difference [SSD]) were superior to those with nonanatomic reconstruction (n = 182; 79.1% <3 mm SSD) and tendon transfer techniques (n = 114; 52.6% <3 mm SSD). This systematic review demonstrated that numerous medial reconstruction techniques have been used in the treatment of isolated and combined medial knee injuries in the existing literature. Many variations exist among reconstruction techniques, which may differ by graft choice, method of fixation, number of bundles, tensioning protocol, and degree of anatomic restoration of medial and posteromedial corner knee restraints. Further studies are required to better ascertain the comparative clinical outcomes of anatomic, nonanatomic, and tendon transfer techniques for medial knee reconstruction. Level IV, systematic review of Level IV studies and surgical techniques.

  9. NLPIR: A Theoretical Framework for Applying Natural Language Processing to Information Retrieval.

    ERIC Educational Resources Information Center

    Zhou, Lina; Zhang, Dongsong

    2003-01-01

    Proposes a theoretical framework called NLPIR that integrates natural language processing (NLP) into information retrieval (IR) based on the assumption that there exists representation distance between queries and documents. Discusses problems in traditional keyword-based IR, including relevance, and describes some existing NLP techniques.…

  10. Proceedings of the Workshop on an Electromagnetic Positioning System in Space

    NASA Technical Reports Server (NTRS)

    Oran, W. A. (Editor)

    1978-01-01

    A workshop was convened to help determine if sufficient justification existed to proceed with the design of an electromagnetic (EM) positioning device for use in space. Those in attendance included experts in crystal growth, nucleation phenomena, containerless processing techniques, properties of materials, metallurgical techniques, and glass technology. Specific areas mentioned included the study of metallic glasses and investigations of the properties of high temperature materials.

  11. A ROle-Oriented Filtering (ROOF) approach for collaborative recommendation

    NASA Astrophysics Data System (ADS)

    Ghani, Imran; Jeong, Seung Ryul

    2016-09-01

    In collaborative filtering (CF) recommender systems, existing techniques frequently focus on determining similarities among users' historical interests. This generally refers to situations in which each user normally plays a single role and his/her taste remains consistent over the long term. However, we note that existing techniques have not been significantly employed in a role-oriented context. This is especially so in situations where users may change their roles over time or play multiple roles simultaneously, while still expecting to access relevant information resources accordingly. Such systems include enterprise architecture management systems, e-commerce sites or journal management systems. In scenarios involving existing techniques, each user needs to build up very different profiles (preferences and interests) based on multiple roles which change over time. Should this not occur to a satisfactory degree, their previous information will either be lost or not utilised at all. To limit the occurrence of such issues, we propose a ROle-Oriented Filtering (ROOF) approach focusing on the manner in which multiple user profiles are obtained and maintained over time. We conducted a number of experiments using an enterprise architecture management scenario. In so doing, we observed that the ROOF approach performs better in comparison with other existing collaborative filtering-based techniques.
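    The role-oriented idea above can be illustrated with a toy sketch: keep a separate preference profile per (user, role) pair, and restrict neighbour similarity and recommendation to the active role, so switching roles neither loses nor pollutes the other profiles. The data, names, and the cosine-similarity recommender are illustrative assumptions, not the ROOF algorithm itself:

```python
from math import sqrt

# profiles[user][role] -> {item: rating}; a user may hold several roles.
profiles = {
    "ana":  {"architect": {"togaf": 5, "archimate": 4},
             "reviewer":  {"jats": 3}},
    "ben":  {"architect": {"togaf": 4, "bpmn": 5}},
    "cara": {"reviewer":  {"jats": 4, "latex": 5}},
}

def cosine(a, b):
    """Cosine similarity between two sparse rating vectors."""
    num = sum(a[i] * b[i] for i in set(a) & set(b))
    den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def recommend(user, role, k=2):
    """Rank items unseen by `user`, using only neighbours acting in `role`."""
    own = profiles[user].get(role, {})
    scores = {}
    for other, roles in profiles.items():
        if other == user or role not in roles:
            continue
        sim = cosine(own, roles[role])
        for item, rating in roles[role].items():
            if item not in own:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("ana", "architect"))  # -> ['bpmn']
```

    Note that "ana" in the reviewer role would instead be matched against "cara", illustrating how role-restricted profiles keep recommendations relevant as roles change.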

  12. Application of multivariable search techniques to the optimization of airfoils in a low speed nonlinear inviscid flow field

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Merz, A. W.

    1975-01-01

    Multivariable search techniques are applied to a particular class of airfoil optimization problems: the maximization of lift and the minimization of disturbance pressure magnitude in an inviscid nonlinear flow field. A variety of multivariable search techniques contained in an existing nonlinear optimization code, AESOP, are applied to this design problem. These techniques include elementary single-parameter perturbation methods, organized searches such as steepest-descent, quadratic, and Davidon methods, randomized procedures, and a generalized search acceleration technique. The airfoil design variables are seven in number and define perturbations to the profile of an existing NACA airfoil. The relative efficiency of the techniques is compared. It is shown that elementary one-parameter-at-a-time and random techniques compare favorably with organized searches in the class of problems considered. It is also shown that significant reductions in disturbance pressure magnitude can be made while retaining reasonable lift coefficient values at low free-stream Mach numbers.
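    The "one parameter at a time" search mentioned above can be sketched generically: perturb each design variable in turn, keep any improving move, and shrink the step once a full sweep yields no improvement. The quadratic objective below is a stand-in with a known optimum for checking; AESOP and the aerodynamic objective are not reproduced:

```python
def one_at_a_time(f, x, step=0.5, shrink=0.5, outer=200):
    """Perturb each variable up/down in turn; keep any move that lowers f;
    halve the step when no single-axis move helps."""
    x, best = list(x), f(x)
    for _ in range(outer):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                val = f(trial)
                if val < best:
                    x, best, improved = trial, val, True
        if not improved:
            step *= shrink          # refine once the current scale is exhausted
            if step < 1e-6:
                break
    return x, best

# Stand-in objective over seven design variables (cf. the seven airfoil
# perturbation variables), with a known optimum for verification.
target = [0.3, -1.2, 0.0, 2.5, 0.7, -0.4, 1.1]
f = lambda v: sum((a - b) ** 2 for a, b in zip(v, target))
x_opt, f_opt = one_at_a_time(f, [0.0] * 7)
print(f_opt < 1e-6)  # the search recovers the optimum to high precision
```

    On smooth objectives this elementary scheme needs no gradients, which is part of why it compared favorably with organized searches in the study above.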

  13. ASSURED CLOUD COMPUTING UNIVERSITY CENTER OF EXCELLENCE (ACC UCOE)

    DTIC Science & Technology

    2018-01-18

    average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed...infrastructure security -Design of algorithms and techniques for real-time assuredness in cloud computing -Map-reduce task assignment with data locality...46 DESIGN OF ALGORITHMS AND TECHNIQUES FOR REAL-TIME ASSUREDNESS IN CLOUD COMPUTING

  14. The manufacture of flat conductor cable

    NASA Technical Reports Server (NTRS)

    Angele, W.

    1974-01-01

    The major techniques are described for fabricating flat conductor cable (FCC). Various types of FCC, including unshielded, shielded, power, and signal, in both existing and conceptual constructions, are covered.

  15. A forecast of new test capabilities using Magnetic Suspension and Balance Systems

    NASA Technical Reports Server (NTRS)

    Lawing, Pierce L.; Johnson, William G., Jr.

    1988-01-01

    This paper outlines the potential of Magnetic Suspension and Balance System (MSBS) technology to solve existing problems related to support interference in wind tunnels. Improvement of existing test techniques and exciting new techniques are envisioned as a result of applying MSBS. These include improved data accuracy, dynamic stability testing, two-body/stores release testing, and pilot/designer-in-the-loop tests. It also discusses the use of MSBS for testing exotic configurations such as hybrid hypersonic vehicles. A new facility concept that combines features of ballistic tubes, magnetic suspension, and cryogenic tunnels is described.

  16. The application of magnetic gradiometry and electromagnetic induction at a former radioactive waste disposal site.

    PubMed

    Rucker, Dale Franklin

    2010-04-01

    A former radioactive waste disposal site is surveyed with two non-intrusive geophysical techniques: magnetic gradiometry and electromagnetic induction. Data were gathered over the site by towing the geophysical equipment mounted on a non-electrically-conductive and non-magnetic fibre-glass cart. Magnetic gradiometry, which detects the location of ferromagnetic material, including iron and steel, was used to map the existence of a previously unknown buried pipeline formerly used in the delivery of liquid waste to a number of surface disposal trenches and concrete vaults. The existence of a possible pipeline is reinforced by historical engineering drawings and photographs. The electromagnetic induction (EMI) technique was used to map areas of high and low electrical conductivity, which coincide with the magnetic gradiometry data. The EMI also provided information on areas of high electrical conductivity unrelated to a pipeline network. Both data sets demonstrate the usefulness of surface geophysical surveillance techniques in minimizing the risk of exposure in the event of future remediation efforts.

  17. Extending radiative transfer models by use of Bayes rule [in atmospheric science]

    NASA Technical Reports Server (NTRS)

    Whitney, C.

    1977-01-01

    This paper presents a procedure that extends some existing radiative transfer modeling techniques to problems in atmospheric science where curvature and layering of the medium and dynamic range and angular resolution of the signal are important. Example problems include twilight and limb scan simulations. Techniques that are extended include successive orders of scattering, matrix operator, doubling, Gauss-Seidel iteration, discrete ordinates and spherical harmonics. The procedure for extending them is based on Bayes' rule from probability theory.

  18. Prognosis model for stand development

    Treesearch

    Albert R. Stage

    1973-01-01

    Describes a set of computer programs for developing prognoses of the development of existing stands under alternative regimes of management. Calibration techniques, modeling procedures, and a procedure for including stochastic variation are described. Implementation of the system for lodgepole pine, including assessment of losses attributed to an infestation of mountain...

  19. Applications of integrated human error identification techniques on the chemical cylinder change task.

    PubMed

    Cheng, Ching-Min; Hwang, Sheue-Ling

    2015-03-01

    This paper outlines the human error identification (HEI) techniques that currently exist to assess latent human errors. Many formal error identification techniques have existed for years, but few have been validated to cover latent human error analysis in different domains. This study considers many possible error modes and influential factors, including external error modes, internal error modes, psychological error mechanisms, and performance shaping factors, and integrates several execution procedures and frameworks of HEI techniques. The case study in this research was the operational process of changing chemical cylinders in a factory. In addition, the integrated HEI method was used to assess the operational processes and the system's reliability. It was concluded that the integrated method is a valuable aid in developing much safer operational processes and can be used to predict human error rates on critical tasks in the plant.

  20. Deformations of the gyroid and Lidinoid minimal surfaces using flat structures

    NASA Astrophysics Data System (ADS)

    Weyhaupt, Adam

    2015-03-01

    Mathematically, the challenge in proving the existence of a purported triply periodic minimal surface is in computing parameter values that depend on a system of equations defined by elliptic integrals. This is generally very difficult. In the presence of some symmetry, however, a technique developed by Weber and Wolf can reduce these elliptic integrals to basic algebra and the geometry of polygons. These techniques can easily prove the existence of some surfaces and the presence of a family of solutions. Families of surfaces are important mathematically, but recent work by Seddon et al. experimentally confirms that these families of surfaces can occur physically as well. In this talk, we give a brief overview of the technique and show how it can be applied to prove the existence of several families of surfaces, including lower-symmetry variants of the gyroid and Lidinoid such as the rG, rPD, tG, and rL. We also conjecture a map of the moduli space of triply periodic minimal surfaces of genus 3.

  1. Modeling the Environmental Impact of Air Traffic Operations

    NASA Technical Reports Server (NTRS)

    Chen, Neil

    2011-01-01

    There is increased interest to understand and mitigate the impacts of air traffic on the climate, since greenhouse gases, nitrogen oxides, and contrails generated by air traffic can have adverse impacts on the climate. The models described in this presentation are useful for quantifying these impacts and for studying alternative environmentally aware operational concepts. These models have been developed by leveraging and building upon existing simulation and optimization techniques developed for the design of efficient traffic flow management strategies. Specific enhancements to the existing simulation and optimization techniques include new models that simulate aircraft fuel flow, emissions and contrails. To ensure that these new models are beneficial to the larger climate research community, the outputs of these new models are compatible with existing global climate modeling tools like the FAA's Aviation Environmental Design Tool.

  2. Advantages and Disadvantages of Transtibial, Anteromedial Portal, and Outside-In Femoral Tunnel Drilling in Single-Bundle Anterior Cruciate Ligament Reconstruction: A Systematic Review.

    PubMed

    Robin, Brett N; Jani, Sunil S; Marvil, Sean C; Reid, John B; Schillhammer, Carl K; Lubowitz, James H

    2015-07-01

    Controversy exists regarding the best method for creating the knee anterior cruciate ligament (ACL) femoral tunnel or socket. The purpose of this study was to systematically review the risks, benefits, advantages, and disadvantages of the endoscopic transtibial (TT) technique, anteromedial portal technique, outside-in technique, and outside-in retrograde drilling technique for creating the ACL femoral tunnel. A PubMed search of English-language studies published between January 1, 2000, and February 17, 2014, was performed using the following keywords: "anterior cruciate ligament" AND "femoral tunnel." Included were studies reporting risks, benefits, advantages, and/or disadvantages of any ACL femoral technique. In addition, references of included articles were reviewed to identify potential studies missed in the original search. A total of 27 articles were identified through the search. TT technique advantages include familiarity and proven long-term outcomes; disadvantages include the risk of nonanatomic placement because of constrained (TT) drilling. Anteromedial portal technique advantages include unconstrained anatomic placement; disadvantages include technical challenges, short tunnels or sockets, and posterior-wall blowout. Outside-in technique advantages include unconstrained anatomic placement; disadvantages include the need for 2 incisions. Retrograde drilling technique advantages include unconstrained anatomic placement, as well as all-epiphyseal drilling in skeletally immature patients; disadvantages include the need for fluoroscopy for all-epiphyseal drilling. There is no one, single, established "gold-standard" technique for creation of the ACL femoral socket. Four accepted techniques show diverse and subjective advantages, disadvantages, risks, and benefits. Level V, systematic review of Level II through V evidence.

  3. Inter-satellite time transfer: Techniques and applications

    NASA Technical Reports Server (NTRS)

    Detoma, Edoardo; Wardrip, S. Clark

    1990-01-01

    A brief review is presented of the well-known time transfer techniques that have been studied and tested throughout the years. The applicability of time transfer techniques to a timing service as provided through a TDRS/DRS system, the problems related to the choice of the timing signal within the constraints imposed by the existing systems, and the possible practical implementations, including a description of the time synchronization support via TDRSS to the Gamma Ray Observatory (GRO), are discussed.

  4. Multidimensional chromatography in food analysis.

    PubMed

    Herrero, Miguel; Ibáñez, Elena; Cifuentes, Alejandro; Bernal, Jose

    2009-10-23

    In this work, the main developments and applications of multidimensional chromatographic techniques in food analysis are reviewed. Different aspects related to the existing couplings involving chromatographic techniques are examined. These couplings include multidimensional GC, multidimensional LC, multidimensional SFC as well as all their possible combinations. Main advantages and drawbacks of each coupling are critically discussed and their key applications in food analysis described.

  5. A survey of fault diagnosis technology

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel

    1989-01-01

    Existing techniques and methodologies for fault diagnosis are surveyed. The techniques run the gamut from theoretical artificial intelligence work to conventional software engineering applications. They are shown to define a spectrum of implementation alternatives where tradeoffs determine their position on the spectrum. Various tradeoffs include execution time limitations and memory requirements of the algorithms as well as their effectiveness in addressing the fault diagnosis problem.

  6. Face lift.

    PubMed

    Warren, Richard J; Aston, Sherrell J; Mendelson, Bryan C

    2011-12-01

    After reading this article, the participant should be able to: 1. Identify and describe the anatomy of and changes to the aging face, including changes in bone mass and structure and changes to the skin, tissue, and muscles. 2. Assess each individual's unique anatomy before embarking on face-lift surgery and incorporate various surgical techniques, including fat grafting and other corrective procedures in addition to shifting existing fat to a higher position on the face, into discussions with patients. 3. Identify risk factors and potential complications in prospective patients. 4. Describe the benefits and risks of various techniques. The ability to surgically rejuvenate the aging face has progressed in parallel with plastic surgeons' understanding of facial anatomy. In turn, a more clear explanation now exists for the visible changes seen in the aging face. This article and its associated video content review the current understanding of facial anatomy as it relates to facial aging. The standard face-lift techniques are explained and their various features, both good and bad, are reviewed. The objective is for surgeons to make a better aesthetic diagnosis before embarking on face-lift surgery, and to have the ability to use the appropriate technique depending on the clinical situation.

  7. How Effective are Existing Arsenic Removal Techniques

    EPA Science Inventory

    This presentation will summarize the system performance results of the technologies demonstrated in the arsenic demonstration program. The technologies include adsorptive media, iron removal, iron removal with iron additions, iron removal followed by adsorptive media, coagulatio...

  8. Automated rejection of parasitic frequency sidebands in heterodyne-detection LIDAR applications

    NASA Technical Reports Server (NTRS)

    Esproles, Carlos; Tratt, David M.; Menzies, Robert T.

    1989-01-01

    A technique is described for the detection of the sporadic onset of multiaxial mode behavior of a normally single-mode TEA CO2 laser. The technique is implemented using primarily commercial circuit modules; it incorporates a peak detector that displays the RF detector output on a digital voltmeter, and a LED bar graph. The technique was successfully demonstrated with an existing coherent atmospheric LIDAR facility utilizing an injection-seeded single-mode TEA CO2 laser. The block schematic diagram is included.

  9. CAPSULE REPORT: HARD CHROME FUME ...

    EPA Pesticide Factsheets

    All existing information, which includes the information extrapolated from the Hard Chrome Pollution Prevention Demonstration Project(s) and other sources derived from plating facilities and industry contacts, will be condensed and featured in this document. At least five chromium emission prevention/control devices have been tested, covering a wide spectrum of techniques currently in use at small and large chrome metal plating shops. The goals for limiting chromium emissions to levels specified in the MACT Standards are: (1) 0.030 milligrams per dry standard cubic meter of air (mg/dscm) for small facilities with existing tanks, and (2) 0.015 mg/dscm for small facilities with new tanks or large facilities with existing or new tanks. It should be emphasized that chemical mist suppressants still have quality issues and work practices that need to be addressed when they are used. Some of the mist suppressants currently in use are: one-, two-, and three-stage mesh pad mist eliminators; composite mesh pad mist eliminators; packed-bed scrubbers; and polyballs. This capsule report should predominantly emphasize pollution prevention techniques and include, but not be restricted to, the aforementioned devices.

  10. The Effect of a Self-Monitored Relaxation Breathing Exercise on Male Adolescent Aggressive Behavior

    ERIC Educational Resources Information Center

    Gaines, Trudi; Barry, Leasha M.

    2008-01-01

    This study sought to contribute to the identification of effective interventions in the area of male adolescent aggressive behavior. Existing research includes both group- and single-case studies implementing treatments which typically include an anger-management component and its attendant relaxation and stress-reduction techniques. The design of…

  11. Existing and Emerging Technologies in Education: A Descriptive Overview. CREATE Monograph Series.

    ERIC Educational Resources Information Center

    Bakke, Thomas W.

    Second in a series of six monographs on the use of new technologies in the instruction of learning disabled students, the paper offers a descriptive overview of new technologies. Topics addressed include the following: (1) techniques for sharing computer resources (including aspects of networking, sharing information through databases, and the use…

  12. Overview of Sparse Graph for Multiple Access in Future Mobile Networks

    NASA Astrophysics Data System (ADS)

    Lei, Jing; Li, Baoguo; Li, Erbao; Gong, Zhenghui

    2017-10-01

    Multiple access via sparse graph, such as low density signature (LDS) and sparse code multiple access (SCMA), is a promising technique for future wireless communications. This survey presents an overview of developments in this burgeoning field, including transmitter structures, extrinsic information transfer (EXIT) chart analysis and comparisons with existing multiple access techniques. The technique enables multiple access under overloaded conditions while achieving satisfactory performance. A message passing algorithm is utilized for multi-user detection in the receiver, and structures of the sparse graph are illustrated in detail. Outlooks and challenges of this technique are also presented.
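    As a toy illustration of the message passing detection mentioned above, consider a single resource shared by two users with different signature amplitudes. On such a tiny (tree-shaped) factor graph, the factor-to-variable message of the sum-product algorithm reduces to marginalizing the interfering user out of the observation likelihood. The amplitudes, noise model, and names below are assumptions for demonstration, not LDS/SCMA as specified:

```python
import math

# Received sample: y = 1.0*x1 + 0.5*x2 + noise, with BPSK symbols x in {+1, -1}.
y, sigma = 1.4, 0.5

def likelihood(x1, x2):
    """Gaussian observation factor connecting both users to the shared resource."""
    return math.exp(-(y - (1.0 * x1 + 0.5 * x2)) ** 2 / (2 * sigma ** 2))

# Factor-to-variable message for user 1: sum the other user out.
# (One exact sum-product step, since this toy graph is a tree.)
m1 = {x1: sum(likelihood(x1, x2) for x2 in (+1, -1)) for x1 in (+1, -1)}
p1_plus = m1[+1] / (m1[+1] + m1[-1])   # posterior that user 1 sent +1
x1_hat = +1 if p1_plus >= 0.5 else -1

print(round(p1_plus, 3), x1_hat)
```

    In a real LDS/SCMA receiver the graph has cycles, so these variable/factor messages are iterated until the marginals converge rather than computed in one step.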

  13. Technology and Technique Standards for Camera-Acquired Digital Dermatologic Images: A Systematic Review.

    PubMed

    Quigley, Elizabeth A; Tokay, Barbara A; Jewell, Sarah T; Marchetti, Michael A; Halpern, Allan C

    2015-08-01

    Photographs are invaluable dermatologic diagnostic, management, research, teaching, and documentation tools. Digital Imaging and Communications in Medicine (DICOM) standards exist for many types of digital medical images, but there are no DICOM standards for camera-acquired dermatologic images to date. To identify and describe existing or proposed technology and technique standards for camera-acquired dermatologic images in the scientific literature. Systematic searches of the PubMed, EMBASE, and Cochrane databases were performed in January 2013 using photography and digital imaging, standardization, and medical specialty and medical illustration search terms and augmented by a gray literature search of 14 websites using Google. Two reviewers independently screened titles of 7371 unique publications, followed by 3 sequential full-text reviews, leading to the selection of 49 publications with the most recent (1985-2013) or detailed description of technology or technique standards related to the acquisition or use of images of skin disease (or related conditions). No universally accepted existing technology or technique standards for camera-based digital images in dermatology were identified. Recommendations are summarized for technology imaging standards, including spatial resolution, color resolution, reproduction (magnification) ratios, postacquisition image processing, color calibration, compression, output, archiving and storage, and security during storage and transmission. Recommendations are also summarized for technique imaging standards, including environmental conditions (lighting, background, and camera position), patient pose and standard view sets, and patient consent, privacy, and confidentiality. Proposed standards for specific-use cases in total body photography, teledermatology, and dermoscopy are described. 
The literature is replete with descriptions of obtaining photographs of skin disease, but universal imaging standards have not been developed, validated, and adopted to date. Dermatologic imaging is evolving without defined standards for camera-acquired images, leading to variable image quality and limited exchangeability. The development and adoption of universal technology and technique standards may first emerge in scenarios when image use is most associated with a defined clinical benefit.

  14. Autologous Collagen Matrix (ACM): Lower Pole Support With a Supero-Anterior Capsular Flap in Secondary Subpectoral Breast Augmentation.

    PubMed

    Montemurro, Paolo; Cheema, Mubashir; Hedén, Per; Avvedimento, Stefano; Agko, Mouchammed; Quattrini Li, Alessandro

    2017-05-01

Secondary aesthetic breast surgery is a complex and challenging scenario. It requires the surgeon to identify contributing factors, provide patient education, make a further management plan, and optimize the conditions for a favorable result. Various techniques have been described in the literature, but the rate of reoperation remains high. The first author has been using a supero-anterior capsular flap with a neopectoral subcapsular pocket and an implant change in these cases. To review the patient characteristics, indications, and early results of using part of the existing implant capsule for secondary subpectoral breast augmentations. All patients who underwent secondary breast augmentation over a period of 2 years by the first author (P.M.), using the supero-anterior capsular flap technique, were included. The technique involves dissection of a new subpectoral pocket and uses the existing implant capsule as an internal brassiere. A total of 36 patients were operated on using this technique. Of these, 17 patients had developed a complication, while 19 patients wanted a change in size only. At a mean follow-up of 10.2 months, there was no bottoming out, double bubble, or capsular contracture. This reliable technique provides stable results, as shown by a low rate of complications over the existing follow-up period. © 2017 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com

  15. Full-color large-scaled computer-generated holograms for physical and non-physical objects

    NASA Astrophysics Data System (ADS)

    Matsushima, Kyoji; Tsuchiyama, Yasuhiro; Sonobe, Noriaki; Masuji, Shoya; Yamaguchi, Masahiro; Sakamoto, Yuji

    2017-05-01

Several full-color high-definition CGHs are created for reconstructing 3D scenes that include real, physically existing objects. The fields of the physical objects are generated or captured using three techniques: 3D scanning, synthetic-aperture digital holography, and multi-viewpoint images. Full-color reconstruction of the high-definition CGHs is realized with RGB color filters. Optical reconstructions are presented to verify these techniques.

  16. Carbon Coating Of Copper By Arc-Discharge Pyrolysis

    NASA Technical Reports Server (NTRS)

    Ebihara, Ben T.; Jopek, Stanley

    1988-01-01

    Adherent, abrasion-resistant coat deposited with existing equipment. Carbon formed and deposited as coating on copper substrate by pyrolysis of hydrocarbon oil in electrical-arc discharges. Technique for producing carbon deposits on copper accomplished with electrical-discharge-machining equipment used for cutting metals. Applications for new coating technique include the following: solar-energy-collecting devices, coating of metals other than copper with carbon, and carburization of metal surfaces.

  17. Periodontal considerations for esthetics: edentulous ridge augmentation.

    PubMed

    Rosenberg, E S; Cutler, S A

    1993-01-01

    Edentulous ridge augmentation is a plastic surgical technique that is performed to improve patient esthetics when unsightly, deformed ridges exist. This article describes the etiology of ridge deformities and the many procedures that can be executed to achieve an esthetic, functional result. Historically, soft-tissue mucogingival techniques were described to augment collapsed ridges. Pedicle grafts, free soft-tissue grafts, and subepithelial connective tissue grafts are predictable forms of therapy. More recently, ridge augmentation techniques were developed that regenerate the lost periodontium. These include allografts, bioglasses, guided tissue regenerative procedures, and tissue expansion.

  18. Selected photographic techniques, a compilation

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A selection has been made of methods, devices, and techniques developed in the field of photography during implementation of space and nuclear research projects. These items include many adaptations, variations, and modifications to standard hardware and practice, and should prove interesting to both amateur and professional photographers and photographic technicians. This compilation is divided into two sections. The first section presents techniques and devices that have been found useful in making photolab work simpler, more productive, and higher in quality. Section two deals with modifications to and special applications for existing photographic equipment.

  19. Robust volcano plot: identification of differential metabolites in the presence of outliers.

    PubMed

    Kumar, Nishith; Hoque, Md Aminul; Sugimoto, Masahiro

    2018-04-11

The identification of differential metabolites in metabolomics is still a big challenge and plays a prominent role in metabolomics data analyses. Metabolomics datasets often contain outliers because of analytical, experimental, and biological ambiguity, but the currently available differential metabolite identification techniques are sensitive to outliers. We propose a kernel-weight-based outlier-robust volcano plot for identifying differential metabolites from noisy metabolomics datasets. Two numerical experiments are used to evaluate the performance of the proposed technique against nine existing techniques, including the t-test and the Kruskal-Wallis test. Artificially generated data with outliers reveal that the proposed method results in a lower misclassification error rate and a greater area under the receiver operating characteristic curve compared with existing methods. An experimentally measured breast cancer dataset to which outliers were artificially added reveals that our proposed method produces only two non-overlapping differential metabolites whereas the other nine methods produced between seven and 57 non-overlapping differential metabolites. Our data analyses show that the performance of the proposed differential metabolite identification technique is better than that of existing methods. Thus, the proposed method can contribute to analysis of metabolomics data with outliers. The R package and user manual of the proposed method are available at https://github.com/nishithkumarpaul/Rvolcano.
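The kernel-weighting idea described in this abstract can be illustrated with a minimal pure-Python sketch (this is not the authors' R implementation): measurements far from the group median receive small Gaussian-kernel weights, so the fold-change estimate plotted on a volcano plot's x-axis is not dominated by a single aberrant sample. The function names, bandwidth, and data are illustrative assumptions.

```python
import math

def kernel_weights(values, bandwidth=1.0):
    """Gaussian kernel weights centred on the median: points far from
    the median (potential outliers) get exponentially smaller weight."""
    med = sorted(values)[len(values) // 2]
    return [math.exp(-((v - med) / bandwidth) ** 2) for v in values]

def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def robust_log2_fold_change(group_a, group_b, bandwidth=1.0):
    """Log2 fold change between two groups using kernel-weighted means,
    so one extreme measurement cannot dominate the estimate."""
    mean_a = weighted_mean(group_a, kernel_weights(group_a, bandwidth))
    mean_b = weighted_mean(group_b, kernel_weights(group_b, bandwidth))
    return math.log2(mean_b / mean_a)

# A metabolite with one gross outlier in group A:
a = [10.0, 11.0, 10.5, 95.0]   # 95.0 is an outlier
b = [20.0, 21.0, 19.5, 20.5]

naive = math.log2((sum(b) / len(b)) / (sum(a) / len(a)))
robust = robust_log2_fold_change(a, b, bandwidth=2.0)
```

With the outlier present, the naive fold change even flips sign, while the kernel-weighted estimate stays close to the roughly two-fold difference visible in the clean measurements.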

  20. Multiple degree of freedom optical pattern recognition

    NASA Technical Reports Server (NTRS)

    Casasent, D.

    1987-01-01

    Three general optical approaches to multiple degree of freedom object pattern recognition (where no stable object rest position exists) are advanced. These techniques include: feature extraction, correlation, and artificial intelligence. The details of the various processors are advanced together with initial results.

  1. Cost effectiveness as applied to the Viking Lander systems-level thermal development test program

    NASA Technical Reports Server (NTRS)

    Buna, T.; Shupert, T. C.

    1974-01-01

    The economic aspects of thermal testing at the systems-level as applied to the Viking Lander Capsule thermal development program are reviewed. The unique mission profile and pioneering scientific goals of Viking imposed novel requirements on testing, including the development of a simulation technique for the Martian thermal environment. The selected approach included modifications of an existing conventional thermal vacuum facility, and improved test-operational techniques that are applicable to the simulation of the other mission phases as well, thereby contributing significantly to the cost effectiveness of the overall thermal test program.

  2. The BAPE 2 balloon-borne CO2

    NASA Technical Reports Server (NTRS)

    Degnan, J. J.; Walker, H. E.; Peruso, C. J.; Johnson, E. H.; Klein, B. J.; Mcelroy, J. H.

    1972-01-01

    The systems and techniques which were utilized in the experiment to establish an air-to-ground CO2 laser heterodyne link are described along with the successes and problems encountered when the heterodyne receiver and laser transmitter package were removed from the controlled environment of the laboratory. Major topics discussed include: existing systems and the underlying principles involved in their operation; experimental techniques and optical alignment methods which were found to be useful; theoretical calculations of signal strengths expected under a variety of test conditions and in actual flight; and the experimental results including problems encountered and their possible solutions.

  3. An Expertise Recommender using Web Mining

    NASA Technical Reports Server (NTRS)

    Joshi, Anupam; Chandrasekaran, Purnima; ShuYang, Michelle; Ramakrishnan, Ramya

    2001-01-01

This report explored techniques to mine the web pages of scientists to extract information regarding their expertise, build expertise chains and referral webs, and semi-automatically combine this information with directory information services to create a recommender system that permits query by expertise. The approach included experimenting with existing techniques reported in the research literature in the recent past, and adapting them as needed. In addition, software tools were developed to capture and use this information.

  4. Teaching old spacecraft new tricks

    NASA Technical Reports Server (NTRS)

    Farquhar, Robert; Dunham, David

    1988-01-01

    The technique of sending existing space probes on extended mission by altering their orbital paths with gravity-assist maneuvers and relatively brief rocket firings is examined. The use of the technique to convert the International Sun-Earth Explorer 3 mission into the International Cometary Explorer mission is discussed. Other examples are considered, including the extension of the Giotto mission and the retargeting of the Sakigake spacecraft. The original and altered trajectories of these three missions are illustrated.

  5. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
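As a toy illustration of the metamodeling idea surveyed in this abstract (not any specific method from the paper), the sketch below fits a quadratic response surface to a handful of samples of a stand-in "expensive" analysis, using the least-squares normal equations. The analysis function and design points are invented for illustration.

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y ~ c0 + c1*x + c2*x^2 via the normal
    equations (A^T A) c = A^T y, solved by Gaussian elimination."""
    A = [[1.0, x, x * x] for x in xs]
    # Build the 3x3 normal-equation system.
    ata = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(3)]
           for i in range(3)]
    aty = [sum(A[k][i] * ys[k] for k in range(len(A))) for i in range(3)]
    # Forward elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for row in range(col + 1, 3):
            f = ata[row][col] / ata[col][col]
            for j in range(col, 3):
                ata[row][j] -= f * ata[col][j]
            aty[row] -= f * aty[col]
    # Back substitution.
    c = [0.0, 0.0, 0.0]
    for row in (2, 1, 0):
        c[row] = (aty[row] - sum(ata[row][j] * c[j]
                                 for j in range(row + 1, 3))) / ata[row][row]
    return c

# Stand-in for an expensive analysis code, sampled at a few design points:
def expensive_analysis(x):
    return 2.0 - 3.0 * x + 0.5 * x * x

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
c = fit_quadratic(xs, [expensive_analysis(x) for x in xs])
# c recovers (2.0, -3.0, 0.5) up to floating-point round-off.
```

Once fitted, the cheap polynomial `c0 + c1*x + c2*x**2` stands in for the expensive code during optimization or concept exploration, which is the essence of the metamodeling approach the survey reviews.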

  6. Optical control and diagnostics sensors for gas turbine machinery

    NASA Astrophysics Data System (ADS)

    Trolinger, James D.; Jenkins, Thomas P.; Heeg, Bauke

    2012-10-01

    There exists a vast range of optical techniques that have been under development for solving complex measurement problems related to gas-turbine machinery and phenomena. For instance, several optical techniques are ideally suited for studying fundamental combustion phenomena in laboratory environments. Yet other techniques hold significant promise for use as either on-line gas turbine control sensors, or as health monitoring diagnostics sensors. In this paper, we briefly summarize these and discuss, in more detail, some of the latter class of techniques, including phosphor thermometry, hyperspectral imaging and low coherence interferometry, which are particularly suited for control and diagnostics sensing on hot section components with ceramic thermal barrier coatings (TBCs).

  7. Accuracy of lagoon gas emissions using an inverse dispersion method

    USDA-ARS?s Scientific Manuscript database

    Measuring gas emissions from treatment lagoons and storage ponds poses challenging conditions for existing micrometeorological techniques because of non-ideal wind conditions. These include those induced by trees and crops surrounding the lagoons, and lagoons with dimensions too small to establish ...

  8. Rethinking developmental toxicity testing: Evolution or revolution?

    PubMed

    Scialli, Anthony R; Daston, George; Chen, Connie; Coder, Prägati S; Euling, Susan Y; Foreman, Jennifer; Hoberman, Alan M; Hui, Julia; Knudsen, Thomas; Makris, Susan L; Morford, LaRonda; Piersma, Aldert H; Stanislaus, Dinesh; Thompson, Kary E

    2018-06-01

    Current developmental toxicity testing adheres largely to protocols suggested in 1966 involving the administration of test compound to pregnant laboratory animals. After more than 50 years of embryo-fetal development testing, are we ready to consider a different approach to human developmental toxicity testing? A workshop was held under the auspices of the Developmental and Reproductive Toxicology Technical Committee of the ILSI Health and Environmental Sciences Institute to consider how we might design developmental toxicity testing if we started over with 21st century knowledge and techniques (revolution). We first consider what changes to the current protocols might be recommended to make them more predictive for human risk (evolution). The evolutionary approach includes modifications of existing protocols and can include humanized models, disease models, more accurate assessment and testing of metabolites, and informed approaches to dose selection. The revolution could start with hypothesis-driven testing where we take what we know about a compound or close analog and answer specific questions using targeted experimental techniques rather than a one-protocol-fits-all approach. Central to the idea of hypothesis-driven testing is the concept that testing can be done at the level of mode of action. It might be feasible to identify a small number of key events at a molecular or cellular level that predict an adverse outcome and for which testing could be performed in vitro or in silico or, rarely, using limited in vivo models. Techniques for evaluating these key events exist today or are in development. Opportunities exist for refining and then replacing current developmental toxicity testing protocols using techniques that have already been developed or are within reach. © 2018 The Authors. Birth Defects Research Published by Wiley Periodicals, Inc.

  9. Feasibility study of the application of existing techniques to remotely monitor hydrochloric acid in the atmosphere

    NASA Technical Reports Server (NTRS)

    Zwick, H.; Ward, V.; Beaudette, L.

    1973-01-01

    A critical evaluation of existing optical remote sensors for HCl vapor detection in solid propellant rocket plumes is presented. The P branch of the fundamental vibration-rotation band was selected as the most promising spectral feature to sense. A computation of transmittance for HCl vapor, an estimation of interferent spectra, the application of these spectra to computer modelled remote sensors, and a trade-off study for instrument recommendation are also included.

  10. Which causal structures might support a quantum-classical gap?

    NASA Astrophysics Data System (ADS)

    Pienaar, Jacques

    2017-04-01

A causal scenario is a graph that describes the cause and effect relationships between all relevant variables in an experiment. A scenario is deemed ‘not interesting’ if there is no device-independent way to distinguish the predictions of classical physics from any generalised probabilistic theory (including quantum mechanics). Conversely, an interesting scenario is one in which there exists a gap between the predictions of different operational probabilistic theories, as occurs for example in Bell-type experiments. Henson, Lal and Pusey (HLP) recently proposed a sufficient condition for a causal scenario to not be interesting. In this paper we supplement their analysis with some new techniques and results. We first show that existing graphical techniques due to Evans can be used to confirm by inspection that many graphs are interesting without having to explicitly search for inequality violations. For three exceptional cases—the graphs numbered #15, 16, 20 in HLP—we show that there exist non-Shannon type entropic inequalities that imply these graphs are interesting. In doing so, we find that existing methods of entropic inequalities can be greatly enhanced by conditioning on the specific values of certain variables.

  11. 75 FR 39710 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-12

    ... information on respondents, including through the use of automated collection techniques or other forms of... Commission, Office of Investor Education and Advocacy, Washington, DC 20549-0213. Extension: Rule 303, SEC... (``Commission'') is soliciting comments on the existing collection of information provided for in Rule 303 (17...

  12. Science Safety Procedure Handbook.

    ERIC Educational Resources Information Center

    Lynch, Mervyn A.; Offet, Lorna

    This booklet outlines general safety procedures in the areas of: (1) student supervision; (2) storage safety regulations, including lists of incompatible chemicals, techniques of disposal and storage; (3) fire; and (4) first aid. Specific sections exist for elementary, junior high school, senior high school, in which special procedures are…

  13. Development and application of numerical techniques for general-relativistic magnetohydrodynamics simulations of black hole accretion

    NASA Astrophysics Data System (ADS)

    White, Christopher Joseph

    We describe the implementation of sophisticated numerical techniques for general-relativistic magnetohydrodynamics simulations in the Athena++ code framework. Improvements over many existing codes include the use of advanced Riemann solvers and of staggered-mesh constrained transport. Combined with considerations for computational performance and parallel scalability, these allow us to investigate black hole accretion flows with unprecedented accuracy. The capability of the code is demonstrated by exploring magnetically arrested disks.

  14. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.

  15. DSN system performance test Doppler noise models; noncoherent configuration

    NASA Technical Reports Server (NTRS)

    Bunce, R.

    1977-01-01

The newer model for variance, the Allan technique, now adopted for testing, is analyzed in the subject mode. A model is generated (including a considerable contribution from the station secondary frequency standard) and reconciled with existing data. The variance model is definitely sound; the Allan technique mates theory and measurement. The mean-frequency model is an estimate; this problem is yet to be rigorously resolved. The unaltered defining expressions are nonconvergent, and the observed mean is quite erratic.
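The two-sample (Allan) variance adopted here has a simple defining formula: for a sequence of M averaged fractional-frequency values y_k, AVAR = (1/(2(M-1))) * sum of (y_{k+1} - y_k)^2. The sketch below is a minimal illustration of that definition, not the DSN test implementation; the example data are invented.

```python
def allan_variance(freqs):
    """Two-sample (Allan) variance of a sequence of fractional-frequency
    averages y_k: AVAR = 1/(2(M-1)) * sum((y_{k+1} - y_k)**2)."""
    m = len(freqs)
    return sum((freqs[k + 1] - freqs[k]) ** 2
               for k in range(m - 1)) / (2.0 * (m - 1))

# A perfectly stable frequency record has zero Allan variance;
# residuals alternating between +1 and -1 give a large one.
stable = allan_variance([5.0] * 8)        # 0.0
noisy = allan_variance([1.0, -1.0] * 4)   # 2.0
```

Because the estimator differences adjacent samples rather than referencing a global mean, it converges for noise types (such as flicker frequency noise) for which the classical sample variance does not, which is why it replaced the older variance model for this testing.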

  16. Hyperspectral face recognition with spatiospectral information fusion and PLS regression.

    PubMed

    Uzair, Muhammad; Mahmood, Arif; Mian, Ajmal

    2015-03-01

Hyperspectral imaging offers new opportunities for face recognition via improved discrimination along the spectral dimension. However, it poses new challenges, including low signal-to-noise ratio, interband misalignment, and high data dimensionality. Due to these challenges, the literature on hyperspectral face recognition is not only sparse but is limited to ad hoc dimensionality reduction techniques and lacks comprehensive evaluation. We propose a hyperspectral face recognition algorithm using a spatiospectral covariance for band fusion and partial least square regression for classification. Moreover, we extend 13 existing face recognition techniques, for the first time, to perform hyperspectral face recognition. We formulate hyperspectral face recognition as an image-set classification problem and evaluate the performance of seven state-of-the-art image-set classification techniques. We also test six state-of-the-art grayscale and RGB (color) face recognition algorithms after applying fusion techniques on hyperspectral images. Comparison with the 13 extended and five existing hyperspectral face recognition techniques on three standard data sets shows that the proposed algorithm outperforms all by a significant margin. Finally, we perform band selection experiments to find the most discriminative bands in the visible and near infrared response spectrum.

  17. A review of techniques to determine alternative selection in design for remanufacturing

    NASA Astrophysics Data System (ADS)

    Noor, A. Z. Mohamed; Fauadi, M. H. F. Md; Jafar, F. A.; Mohamad, N. R.; Yunos, A. S. Mohd

    2017-10-01

This paper discusses techniques used for optimization in manufacturing systems. Although the problem domain is focused on sustainable manufacturing, techniques used to optimize general manufacturing systems are also discussed. Important aspects of Design for Remanufacturing (DFReM) considered include indexes, weighted averages, grey decision making, and Fuzzy TOPSIS. A limitation of most existing techniques is that they depend heavily on the decision maker's perspective: different experts may understand a problem differently and consequently scale it differently. The objective of this paper is therefore to survey the available techniques and identify the features they lack. Once all the techniques have been reviewed, a new technique is proposed to address the shortcomings of those discussed. This paper shows that a hybrid of Fuzzy Analytic Hierarchy Process (AHP) and Artificial Neural Network (ANN) computation is suitable and fills the gap left by the techniques discussed.
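To illustrate one ingredient of the hybrid named in this abstract, the sketch below computes crisp (non-fuzzy) AHP priority weights from a pairwise comparison matrix using the common geometric-mean approximation to the principal eigenvector. This is a simplified stand-in, not the paper's Fuzzy AHP/ANN method, and the criteria and judgments are invented.

```python
import math

def ahp_weights(pairwise):
    """Priority weights from a pairwise comparison matrix using the
    geometric-mean (row) method, a standard approximation to the
    principal-eigenvector weights in AHP."""
    gmeans = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Three criteria: A judged 3x as important as B and 5x as important
# as C; B judged 2x as important as C. Reciprocals fill the lower
# triangle, as AHP requires.
matrix = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
w = ahp_weights(matrix)  # weights sum to 1; largest for criterion A
```

The subjectivity the paper criticizes is visible here: the weights are determined entirely by the expert's 1-to-9 judgments in the matrix, which is what motivates combining AHP with a learned (ANN) component.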

  18. Understanding and Optimizing Asynchronous Low-Precision Stochastic Gradient Descent

    PubMed Central

    De Sa, Christopher; Feldman, Matthew; Ré, Christopher; Olukotun, Kunle

    2018-01-01

    Stochastic gradient descent (SGD) is one of the most popular numerical algorithms used in machine learning and other domains. Since this is likely to continue for the foreseeable future, it is important to study techniques that can make it run fast on parallel hardware. In this paper, we provide the first analysis of a technique called Buckwild! that uses both asynchronous execution and low-precision computation. We introduce the DMGC model, the first conceptualization of the parameter space that exists when implementing low-precision SGD, and show that it provides a way to both classify these algorithms and model their performance. We leverage this insight to propose and analyze techniques to improve the speed of low-precision SGD. First, we propose software optimizations that can increase throughput on existing CPUs by up to 11×. Second, we propose architectural changes, including a new cache technique we call an obstinate cache, that increase throughput beyond the limits of current-generation hardware. We also implement and analyze low-precision SGD on the FPGA, which is a promising alternative to the CPU for future SGD systems. PMID:29391770
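A minimal sketch of the low-precision idea described in this abstract (not the Buckwild! implementation, and ignoring asynchrony): gradients are quantized onto a coarse grid with unbiased stochastic rounding before each SGD update. The quantization scale, learning rate, and objective are illustrative assumptions.

```python
import math
import random

def quantize(x, scale=64.0):
    """Map x onto a low-precision grid of spacing 1/scale using
    unbiased stochastic rounding: round up with probability equal to
    the fractional part, so the quantizer's expected value equals x."""
    scaled = x * scale
    low = math.floor(scaled)
    q = low + (1 if random.random() < scaled - low else 0)
    return q / scale

def low_precision_sgd(grad, x0, lr=0.1, steps=200):
    """Plain SGD, but each gradient is quantized before it is applied."""
    x = x0
    for _ in range(steps):
        x -= lr * quantize(grad(x))
    return x

random.seed(0)
# Minimise f(x) = (x - 3)^2, whose gradient is 2*(x - 3).
x_star = low_precision_sgd(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

Because the rounding is unbiased, the quantized gradient is a noisy but correct-in-expectation estimate, so the iterate still settles near the minimizer at x = 3; the residual jitter shrinks with the grid spacing, mirroring the precision/throughput trade-off the paper analyzes.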

  19. Progress in multidisciplinary design optimization at NASA Langley

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.

    1993-01-01

    Multidisciplinary Design Optimization refers to some combination of disciplinary analyses, sensitivity analysis, and optimization techniques used to design complex engineering systems. The ultimate objective of this research at NASA Langley Research Center is to help the US industry reduce the costs associated with development, manufacturing, and maintenance of aerospace vehicles while improving system performance. This report reviews progress towards this objective and highlights topics for future research. Aerospace design problems selected from the author's research illustrate strengths and weaknesses in existing multidisciplinary optimization techniques. The techniques discussed include multiobjective optimization, global sensitivity equations and sequential linear programming.

  20. Nonpharmacological, Blood Conservation Techniques for Preventing Neonatal Anemia—Effective and Promising Strategies for Reducing Transfusion

    PubMed Central

    Carroll, Patrick D.; Widness, John A.

    2012-01-01

    The development of anemia after birth in very premature, critically ill newborn infants is a universal well-described phenomenon. Although preventing anemia in this population, along with efforts to establish optimal red blood cell (RBC) transfusion and pharmacologic therapy continue to be actively investigated, the present review focuses exclusively on nonpharmacological approaches to the prevention and treatment of neonatal anemia. We begin with an overview of topics relevant to nonpharmacological techniques. These topics include neonatal and fetoplacental hemoglobin levels and blood volumes, clinical and laboratory practices applied in critically ill neonates, and current RBC transfusion practice guidelines. This is followed by a discussion of the most effective and promising nonpharmacological blood conservation strategies and techniques. Fortunately, many of these techniques are feasible in most neonatal intensive care units. When applied together, these techniques are more effective than existing pharmacotherapies in significantly decreasing neonatal RBC transfusions. They include increasing hemoglobin endowment and circulating blood volume at birth; removing less blood for laboratory testing; and optimizing nutrition. PMID:22818543

  1. Turbulent drag reduction for external flows

    NASA Technical Reports Server (NTRS)

    Bushnell, D. M.

    1985-01-01

    A summary of turbulent drag reduction approaches applicable to external flows is given. Because relatively recent and exhaustive reviews exist for laminar flow control and polymer (hydrodynamic) drag reduction, the focus here is upon the emerging areas of nonplanar geometry and large-eddy alteration. Turbulent control techniques for air generally result in modest (but technologically significant) drag reductions (order of 20 percent or less), whereas hydrodynamic approaches can yield drag reductions the order of 70 percent. Suggestions are included for alternative concepts and optimization of existing approaches.

  2. Turbulent drag reduction for external flows

    NASA Technical Reports Server (NTRS)

    Bushnell, D. M.

    1983-01-01

    Paper presents a review and summary of turbulent drag reduction approaches applicable to external flows. Because relatively recent and exhaustive reviews exist for laminar flow control and polymer (hydrodynamic) drag reduction, the paper focuses upon the emerging areas of nonplanar geometry and large eddy alteration. Turbulent control techniques for air generally result in modest (but technologically significant) drag reductions (order of 20 percent or less) whereas hydrodynamic approaches can yield drag reductions the order of 70 percent. Paper also includes suggestions for alternative concepts and optimization of existing approaches.

  3. Voltammetry Method Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoyt, N.; Pereira, C.; Willit, J.

    2016-07-29

The purpose of the ANL MPACT Voltammetry project is to evaluate the suitability of previously developed cyclic voltammetry techniques to provide electroanalytical measurements of actinide concentrations in realistic used fuel processing scenarios. The molten salts in these scenarios are very challenging as they include high concentrations of multiple electrochemically active species, thereby creating a variety of complications. Some of the problems that arise therein include issues related to uncompensated resistance, cylindrical diffusion, and alloying of the electrodeposited metals. Improvements to the existing voltammetry technique to account for these issues have been implemented, resulting in good measurements of actinide concentrations across a wide range of adverse conditions.

  4. Energy Options: Challenge for the Future

    ERIC Educational Resources Information Center

    Hammond, Allen L.

    1972-01-01

Summarizes alternative technological possibilities for ensuring a supply of energy for the United States, including nuclear technology, solar energy, shale oil and coal gasification, low-pollutant techniques for burning coal, and a fuel cell suitable for commercial use. Reports the extent of existing research and development efforts. (AL)

  5. Study of advanced techniques for determining the long-term performance of components

    NASA Technical Reports Server (NTRS)

    1972-01-01

A study was conducted of techniques capable of determining the performance and reliability of components for spacecraft liquid propulsion applications on long-term missions. The study utilized two major approaches: improvement of existing technology and the evolution of new technology. The criteria established and methods evolved are applicable to valve components. Primary emphasis was placed on the oxygen difluoride and diborane propellant combination. The investigation included analysis, fabrication, and tests of experimental equipment to provide data and performance criteria.

  6. 76 FR 20996 - Agency Information Collection Activities: Extension of an Existing Information Collection...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-14

    ...-training institution, to include approved private elementary and secondary schools and public secondary..., mechanical, or other technological collection techniques or other forms of information technology, e.g... directs the Attorney General, in consultation with the Secretary of State and the Secretary of Education...

  7. 76 FR 40918 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-12

    ... respondents, including through the use of automated collection techniques or other forms of information... health at work for all people through research and prevention. Under Public Law 91-596, Section 20 and 22... materials. Typically, this need has been met by translating existing, English-language training materials...

  8. Behavioral Self-Control and Career Development.

    ERIC Educational Resources Information Center

    Thoresen, Carl E.; Ewart, Craig K.

    A broader view of the career problem and the counselor's role through teaching clients behavioral self-control techniques is offered in this paper. Preliminary discussion includes a review of existing vocational theories and research, in particular, Holland's typology and Super's self-concept theory. It is concluded from these reviews that the…

  9. Evidence-based surgery: barriers, solutions, and the role of evidence synthesis.

    PubMed

    Garas, George; Ibrahim, Amel; Ashrafian, Hutan; Ahmed, Kamran; Patel, Vanash; Okabayashi, Koji; Skapinakis, Petros; Darzi, Ara; Athanasiou, Thanos

    2012-08-01

    Surgery is a rapidly evolving field, making the rigorous testing of emerging innovations vital. However, most surgical research fails to employ randomized controlled trials (RCTs) and has largely been based on low-quality study designs. Consequently, the analysis of data through meta-analysis and evidence synthesis is particularly difficult. Through a systematic review of the literature, this article explores the barriers to achieving a strong evidence base in surgery and offers potential solutions to overcome them. Many barriers to evidence-based surgical research exist. They include enabling factors, such as funding, time, infrastructure, patient preference, and ethical issues, as well as barriers associated with specific attributes of researchers, methodologies, or interventions. Novel evidence synthesis techniques in surgery are discussed, including graphics synthesis, treatment networks, and network meta-analyses, which help overcome many of the limitations associated with existing techniques. They offer the opportunity to assess gaps and quantitatively present inconsistencies within the existing evidence from RCTs. Poorly or inadequately performed RCTs and meta-analyses can give rise to incorrect results and thus fail to inform clinical practice or revise policy. The above barriers can be overcome by providing academic leadership and good organizational support to ensure that adequate personnel, resources, and funding are allocated to the researcher. Training in research methodology and data interpretation can ensure that trials are conducted correctly and that evidence is adequately synthesized and disseminated. The ultimate goal of overcoming the barriers to evidence-based surgery is improved quality of patient care together with enhanced patient outcomes.

  10. Using the EXIST Active Shields for Earth Occultation Observations of X-Ray Sources

    NASA Technical Reports Server (NTRS)

    Wilson, Colleen A.; Fishman, Gerald; Hong, Jae-Sub; Grindlay, Jonathan; Krawczynski, Henric

    2005-01-01

    The EXIST active shields, now being planned for the main detectors of the coded aperture telescope, will have approximately 15 times the area of the BATSE detectors, and they will have a good geometry on the spacecraft for viewing both the leading and trailing Earth's limb for occultation observations. These occultation observations will complement the imaging observations of EXIST and can extend them to higher energies. Earth occultation observations of the hard X-ray sky with BATSE on the Compton Gamma Ray Observatory developed and demonstrated the capabilities of large, flat, uncollimated detectors for this method. With BATSE, a catalog of 179 X-ray sources was monitored twice every spacecraft orbit for 9 years at energies above about 25 keV, resulting in 83 definite detections and 36 possible detections with 5-sigma detection sensitivities of 3.5-20 mcrab (20-430 keV) depending on the sky location. This catalog included four transients discovered with this technique and many variable objects (galactic and extragalactic). This poster will describe the Earth occultation technique, summarize the BATSE occultation observations, and compare the basic observational parameters of the occultation detector elements of BATSE and EXIST.

  11. Modeling of switching regulator power stages with and without zero-inductor-current dwell time

    NASA Technical Reports Server (NTRS)

    Lee, F. C. Y.; Yu, Y.

    1979-01-01

    State-space techniques are employed to derive accurate models for the three basic switching converter power stages: buck, boost, and buck/boost operating with and without zero-inductor-current dwell time. A generalized procedure is developed which treats the continuous-inductor-current mode without dwell time as a special case of the discontinuous-current mode when the dwell time vanishes. Abrupt changes of system behavior, including a reduction of the system order when the dwell time appears, are shown both analytically and experimentally. Merits resulting from the present modeling technique in comparison with existing modeling techniques are illustrated.
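The state-space averaging idea for the continuous-inductor-current case can be sketched numerically: the switching network is replaced by its duty-cycle-weighted average. The buck-converter simulation below, with its component values and simple Euler integration, is an illustrative approximation and not the paper's derivation (which also covers the discontinuous-current and dwell-time cases).

```python
def buck_averaged(duty, vin=12.0, L=100e-6, C=100e-6, R=10.0,
                  dt=1e-6, steps=50_000):
    """State-space averaged buck converter in continuous conduction:
    the switch network is replaced by its duty-cycle-weighted average,
    so no zero-inductor-current dwell time is modeled."""
    i_l, v_c = 0.0, 0.0  # inductor current, capacitor (output) voltage
    for _ in range(steps):
        di = (duty * vin - v_c) / L   # averaged inductor equation
        dv = (i_l - v_c / R) / C      # capacitor / load equation
        i_l += di * dt
        v_c += dv * dt
    return v_c

# In steady state the averaged model settles to the ideal
# conversion ratio of the buck stage, Vout = D * Vin.
```

When a dwell time appears, one averaged state equation degenerates and the effective system order drops, which is the abrupt behavior change the paper demonstrates.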

  12. Emergency treatment of exertional heatstroke and comparison of whole body cooling techniques.

    PubMed

    Costrini, A

    1990-02-01

    This manuscript compares whole-body cooling techniques in the emergency treatment of heatstroke. Historically, the use of cold water immersion with skin massage has been quite successful in rapidly lowering body temperature and in avoiding severe complications or death. Recent studies have suggested alternative therapies, including the use of a warm air spray, the use of helicopter downdraft, and pharmacological agents. While evidence exists to support these methods, they have not been shown to reduce fatalities as effectively as ice water immersion. Although several cooling methods may have clinical use, all techniques rely on the prompt recognition of symptoms and immediate action in the field.

  13. Surface defect detection in tiling Industries using digital image processing methods: analysis and evaluation.

    PubMed

    Karimi, Mohammad H; Asemani, Davud

    2014-05-01

    Ceramic and tile industries should indispensably include a grading stage to quantify the quality of products. In practice, human inspection is often used for grading purposes. An automatic grading system is essential to enhance the quality control and marketing of the products. Since there generally exist six different types of defects, originating from various stages of tile manufacturing lines and having distinct textures and morphologies, many image processing techniques have been proposed for defect detection. In this paper, a survey is made of the pattern recognition and image processing algorithms that have been used to detect surface defects. Each method appears to be limited to detecting some subgroup of defects. The detection techniques may be divided into three main groups: statistical pattern recognition, feature vector extraction and texture/image classification. Methods such as the wavelet transform, filtering, morphology and the contourlet transform are more effective for pre-processing tasks. Others, including statistical methods, neural networks and model-based algorithms, can be applied to extract the surface defects. Although statistical methods are often appropriate for identifying large defects such as spots, techniques such as wavelet processing provide an acceptable response for detecting small defects such as pinholes. A thorough survey is made in this paper of the existing algorithms in each subgroup. Also, the evaluation parameters are discussed, including supervised and unsupervised parameters. Using various performance parameters, different defect detection algorithms are compared and evaluated. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
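As a concrete illustration of the statistical subgroup mentioned above, the sketch below flags "spot"-like defects as pixels that deviate strongly from the tile's background statistics. The threshold rule (mean ± k standard deviations) and the synthetic tile are assumptions for illustration only, not the algorithms surveyed in the paper.

```python
import numpy as np

def detect_spots(tile, k=4.0):
    """Flag pixels whose intensity deviates from the tile's overall
    statistics by more than k standard deviations -- the simplest of
    the statistical approaches for large defects such as spots."""
    mu, sigma = tile.mean(), tile.std()
    return np.abs(tile - mu) > k * sigma

rng = np.random.default_rng(3)
tile = 0.5 + 0.01 * rng.standard_normal((64, 64))  # uniform background
tile[10:13, 20:23] = 1.0                           # synthetic 3x3 spot defect
defect_mask = detect_spots(tile)
```

Small defects such as pinholes would escape this global test, which is why the survey pairs statistical methods with multiscale (e.g. wavelet) pre-processing.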

  14. Machine learning in heart failure: ready for prime time.

    PubMed

    Awan, Saqib Ejaz; Sohel, Ferdous; Sanfilippo, Frank Mario; Bennamoun, Mohammed; Dwivedi, Girish

    2018-03-01

    The aim of this review is to present an up-to-date overview of the application of machine learning methods in heart failure including diagnosis, classification, readmissions and medication adherence. Recent studies have shown that the application of machine learning techniques may have the potential to improve heart failure outcomes and management, including cost savings by improving existing diagnostic and treatment support systems. Recently developed deep learning methods are expected to yield even better performance than traditional machine learning techniques in performing complex tasks by learning the intricate patterns hidden in big medical data. The review summarizes the recent developments in the application of machine and deep learning methods in heart failure management.

  15. Characterization of agricultural land using singular value decomposition

    NASA Astrophysics Data System (ADS)

    Herries, Graham M.; Danaher, Sean; Selige, Thomas

    1995-11-01

    A method is defined and tested for the characterization of agricultural land from multi-spectral imagery, based on singular value decomposition (SVD) and key vector analysis. The SVD technique, which bears a close resemblance to multivariate statistical techniques, has previously been applied successfully to signal-extraction problems for marine data and to forestry species classification. In this study the SVD technique is used as a classifier for agricultural regions, using airborne Daedalus ATM data with 1 m resolution. The specific region chosen is an experimental research farm in Bavaria, Germany. This farm grows a large number of crops within a very small region and hence is not amenable to existing techniques. There are a number of other significant factors that render existing techniques, such as the maximum likelihood algorithm, less suitable for this area. These include highly varied terrain and a tessellated pattern of soil differences, which together cause large variations in the growth characteristics of the crops. The SVD technique is applied to this data set using a multi-stage classification approach, removing unwanted land-cover classes one step at a time. Typical classification accuracies for SVD are of the order of 85-100%. Preliminary results indicate that it is a fast and efficient classifier with the ability to differentiate between crop types such as wheat, rye, potatoes and clover. The results of characterizing three sub-classes of winter wheat are also shown.
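One way to read the SVD/key-vector idea is as subspace classification: compute an SVD per training class and assign a pixel to the class whose leading singular vectors ("key vectors") reconstruct it with the smallest residual. The band values, noise level and class names below are synthetic illustrations, not the paper's Daedalus ATM data or its exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic multispectral training pixels: rows are pixels, columns bands.
def make_class(center, n=50, noise=0.02):
    return center + noise * rng.standard_normal((n, center.size))

training = {
    "wheat":  make_class(np.array([0.2, 0.4, 0.6, 0.3])),
    "clover": make_class(np.array([0.1, 0.6, 0.3, 0.5])),
}

# "Key vectors": the leading right-singular vectors of each class.
subspaces = {}
for name, X in training.items():
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    subspaces[name] = (mean, vt[:2])   # keep two key vectors per class

def classify(pixel):
    """Assign the class whose key-vector subspace reconstructs the
    (mean-removed) pixel with the smallest residual."""
    best, best_err = None, np.inf
    for name, (mean, vt) in subspaces.items():
        centered = pixel - mean
        residual = centered - vt.T @ (vt @ centered)
        err = np.linalg.norm(residual)
        if err < best_err:
            best, best_err = name, err
    return best
```

A multi-stage version, as in the paper, would apply such a classifier repeatedly, removing one unwanted land-cover class at each step.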

  16. Effective evaluation of privacy protection techniques in visible and thermal imagery

    NASA Astrophysics Data System (ADS)

    Nawaz, Tahir; Berg, Amanda; Ferryman, James; Ahlberg, Jörgen; Felsberg, Michael

    2017-09-01

    Privacy protection may be defined as replacing the original content of an image region with new, less intrusive content whose target appearance information is modified so that the target is less recognizable. The development of privacy protection techniques needs to be complemented by an established objective evaluation method to facilitate their assessment and comparison. Generally, existing evaluation methods rely on subjective judgments, or they assume a specific target type in the image data and use target detection and recognition accuracies to assess privacy protection. An annotation-free evaluation method that is neither subjective nor assumes a specific target type is proposed. It assesses two key aspects of privacy protection: "protection" and "utility." Protection is quantified as an appearance similarity, and utility is measured as a structural similarity between the original and privacy-protected image regions. We performed extensive experiments using six challenging datasets (comprising 12 video sequences), including a new dataset (with six sequences) that contains visible and thermal imagery. The new dataset is made available online for the community. We demonstrate the effectiveness of the proposed method by evaluating six image-based privacy protection techniques and also compare the proposed method with existing methods.
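A minimal sketch of the two-axis evaluation idea follows, using histogram intersection as a stand-in appearance measure and a single-window SSIM-style index for structure. These particular formulas and the synthetic "protected" region are assumptions for illustration; the paper's exact formulations differ.

```python
import numpy as np

def appearance_similarity(a, b, bins=32):
    """Histogram intersection of gray-level distributions (0..1).
    Lower similarity -> stronger appearance change -> more protection."""
    ha, _ = np.histogram(a, bins=bins, range=(0, 1))
    hb, _ = np.histogram(b, bins=bins, range=(0, 1))
    ha = ha / ha.sum()
    hb = hb / hb.sum()
    return float(np.minimum(ha, hb).sum())

def structural_similarity(a, b, c1=1e-4, c2=9e-4):
    """Single-window SSIM-style index; higher -> more retained utility."""
    mu_a, mu_b = a.mean(), b.mean()
    va, vb = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return float(((2 * mu_a * mu_b + c1) * (2 * cov + c2)) /
                 ((mu_a**2 + mu_b**2 + c1) * (va + vb + c2)))

rng = np.random.default_rng(1)
region = rng.random((32, 32))                  # original image region
blurred = (region + rng.random((32, 32))) / 2  # crude "protected" region

protection = 1.0 - appearance_similarity(region, blurred)
utility = structural_similarity(region, blurred)
```

A good privacy filter would score high on protection while keeping utility acceptable; plotting the two axes lets techniques be compared without annotations.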

  17. Coping with Multi-Level Classes Effectively and Creatively.

    ERIC Educational Resources Information Center

    Strasheim, Lorraine A.

    This paper includes a discussion of the problem of multilevel Latin classes, a description of various techniques and perspectives the teacher might use in dealing with these classes, and copies of materials and exercises that have proved useful in multilevel classes. Because the reasons for the existence of such classes are varied, it is suggested…

  18. The Present and Future State of Blended Learning in Workplace Learning Settings in the United States

    ERIC Educational Resources Information Center

    Kim, Kyong-Jee; Bonk, Curtis J.; Oh, Eunjung

    2008-01-01

    This article reports on a survey about blended learning in workplace learning settings. The survey found that blended learning has gained popularity in many organizations but also that several barriers exist to implementing it. The survey also includes predictions on instructional strategies, emerging technologies, and evaluation techniques for blended…

  19. A summary and evaluation of semi-empirical methods for the prediction of helicopter rotor noise

    NASA Technical Reports Server (NTRS)

    Pegg, R. J.

    1979-01-01

    Existing prediction techniques are compiled and described. The descriptions include input and output parameter lists, required equations and graphs, and the range of validity for each part of the prediction procedures. Examples are provided illustrating the analysis procedure and the degree of agreement with experimental results.

  20. Using the Earth to Heat and Cool Homes.

    ERIC Educational Resources Information Center

    Thomas, Stephen G.

    The heat-collecting capacity of the earth and/or the earth's ground waters and surface waters exists as a potential energy source for home heating and cooling. Techniques and devices associated with the use of the earth's thermal energy are presented and evaluated in this four-chapter report. Included in these chapters are: (1) descriptions…

  1. Strategic Long Range Planning for Universities. AIR Forum 1980 Paper.

    ERIC Educational Resources Information Center

    Baker, Michael E.

    The use of strategic long-range planning at Carnegie-Mellon University (CMU) is discussed. A structure for strategic planning analysis that integrates existing techniques is presented, and examples of planning activities at CMU are included. The key concept in strategic planning is competitive advantage: if a university has a competitive…

  2. Study of different concentric rings inside gallstones with LIBS.

    PubMed

    Pathak, Ashok Kumar; Singh, Vivek Kumar; Rai, Nilesh Kumar; Rai, Awadhesh Kumar; Rai, Pradeep Kumar; Rai, Pramod Kumar; Rai, Suman; Baruah, G D

    2011-07-01

    Gallstones obtained from patients from the north-east region of India (Assam) were studied using the laser-induced breakdown spectroscopy (LIBS) technique. LIBS spectra of the different layers (in cross-section) of the gallstones were recorded in the spectral region 200-900 nm. Several elements, including calcium, magnesium, manganese, copper, silicon, phosphorus, iron, sodium and potassium, were detected in the gallstones. Lighter elements, including carbon, hydrogen, nitrogen and oxygen, were also detected, which demonstrates the superiority of the LIBS technique over other existing analytical techniques. The LIBS technique was applied to investigate the evolution of C(2) Swan bands and CN violet bands in the LIBS spectra of the gallstones in air and in an argon atmosphere. The different layers (dark and light) of the gallstones were discriminated on the basis of the presence and intensities of the spectral lines of carbon, hydrogen, nitrogen, oxygen and copper. An attempt was also made to correlate the presence of major and minor elements in the gallstones with the common diet of the population of Assam.

  3. The impact of machine learning techniques in the study of bipolar disorder: A systematic review.

    PubMed

    Librenza-Garcia, Diego; Kotzian, Bruno Jaskulski; Yang, Jessica; Mwangi, Benson; Cao, Bo; Pereira Lima, Luiza Nunes; Bermudez, Mariane Bagatin; Boeira, Manuela Vianna; Kapczinski, Flávio; Passos, Ives Cavalcante

    2017-09-01

    Machine learning techniques provide new methods to predict diagnosis and clinical outcomes at an individual level. We aim to review the existing literature on the use of machine learning techniques in the assessment of subjects with bipolar disorder. We systematically searched PubMed, Embase and Web of Science for articles published in any language up to January 2017. We found 757 abstracts and included 51 studies in our review. Most of the included studies used multiple levels of biological data to distinguish the diagnosis of bipolar disorder from other psychiatric disorders or healthy controls. We also found studies that assessed the prediction of clinical outcomes and studies using unsupervised machine learning to build more consistent clinical phenotypes of bipolar disorder. We concluded that given the clinical heterogeneity of samples of patients with BD, machine learning techniques may provide clinicians and researchers with important insights in fields such as diagnosis, personalized treatment and prognosis orientation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Flood Detection/Monitoring Using Adjustable Histogram Equalization Technique

    PubMed Central

    Riaz, Muhammad Mohsin; Ghafoor, Abdul

    2014-01-01

    A flood monitoring technique using adjustable histogram equalization is proposed. The technique overcomes the limitations (over-enhancement, artifacts, and an unnatural look) of the existing technique by adjusting the contrast of images. The proposed technique takes pre- and post-event images and applies different processing steps to generate a flood map without user interaction. The resultant flood maps can be used for flood monitoring and detection. Simulation results show that the proposed technique provides better output quality than the existing state-of-the-art technique. PMID:24558332
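One way to realize "adjustable" equalization is to blend the identity intensity mapping with the full-equalization mapping, so that a blending parameter limits over-enhancement. The parameter `alpha` below is an assumption for illustration; the paper's exact adjustment scheme may differ.

```python
import numpy as np

def adjustable_hist_eq(img, alpha=0.5, levels=256):
    """Blend the identity mapping with classical histogram equalization.
    alpha=0 leaves the image unchanged; alpha=1 is full equalization.
    Intermediate alpha curbs over-enhancement and unnatural contrast."""
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    cdf = hist.cumsum() / hist.sum()
    eq_map = np.round(cdf * (levels - 1))          # equalization LUT
    identity = np.arange(levels, dtype=np.float64)  # no-op LUT
    lut = (1 - alpha) * identity + alpha * eq_map
    return lut[img.astype(np.intp)].astype(np.uint8)
```

Applying the mapping to pre- and post-event images with the same `alpha` keeps their contrasts comparable before the change (flood) map is computed.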

  5. Direct measurement of local material properties within living embryonic tissues

    NASA Astrophysics Data System (ADS)

    Serwane, Friedhelm; Mongera, Alessandro; Rowghanian, Payam; Kealhofer, David; Lucio, Adam; Hockenbery, Zachary; Campàs, Otger

    The shaping of biological matter requires the control of its mechanical properties across multiple scales, ranging from single molecules to cells and tissues. Despite their relevance, measurements of the mechanical properties of sub-cellular, cellular and supra-cellular structures within living embryos pose severe challenges to existing techniques. We have developed a technique that uses magnetic droplets to measure the mechanical properties of complex fluids, including in situ and in vivo measurements within living embryos, across multiple length and time scales. By actuating the droplets with magnetic fields and recording their deformation, we probe the local mechanical properties at any length scale we choose by varying the droplets' diameter. We use the technique to determine the subcellular mechanics of individual blastomeres of zebrafish embryos, and bridge the gap to the tissue scale by measuring the local viscosity and elasticity of zebrafish embryonic tissues. Using this technique, we show that embryonic zebrafish tissues are viscoelastic with a fluid-like behavior at long time scales. This technique will enable mechanobiology and mechanotransduction studies in vivo, including the study of diseases correlated with tissue stiffness, such as cancer.

  6. Nonpharmacological, blood conservation techniques for preventing neonatal anemia--effective and promising strategies for reducing transfusion.

    PubMed

    Carroll, Patrick D; Widness, John A

    2012-08-01

    The development of anemia after birth in very premature, critically ill newborn infants is a universal, well-described phenomenon. Although preventing anemia in this population, along with efforts to establish optimal red blood cell (RBC) transfusion practice and pharmacologic therapy, continues to be actively investigated, the present review focuses exclusively on nonpharmacological approaches to the prevention and treatment of neonatal anemia. We begin with an overview of topics relevant to nonpharmacological techniques. These topics include neonatal and fetoplacental hemoglobin levels and blood volumes, clinical and laboratory practices applied in critically ill neonates, and current RBC transfusion practice guidelines. This is followed by a discussion of the most effective and promising nonpharmacological blood conservation strategies and techniques. Fortunately, many of these techniques are feasible in most neonatal intensive care units. When applied together, these techniques are more effective than existing pharmacotherapies in significantly decreasing neonatal RBC transfusions. They include increasing hemoglobin endowment and circulating blood volume at birth, removing less blood for laboratory testing, and optimizing nutrition. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. Advanced techniques for determining long term compatibility of materials with propellants

    NASA Technical Reports Server (NTRS)

    Green, R. L.

    1972-01-01

    The search for advanced measurement techniques for determining long-term compatibility of materials with propellants was conducted in several parts. A comprehensive survey of the existing measurement and testing technology for determining material-propellant interactions was performed. Selections were made from those existing techniques determined to meet, or to be adaptable to meet, the requirements. Areas of refinement or change were recommended to improve others. Investigations were also performed to determine the feasibility and advantages of developing and using new techniques to achieve significant improvements over existing ones. The most interesting demonstration was that of a new technique, volatile metal chelate analysis. Rivaling neutron activation analysis in terms of sensitivity and specificity, the volatile metal chelate technique was fully demonstrated.

  8. Retinal Image Simulation of Subjective Refraction Techniques.

    PubMed

    Perches, Sara; Collados, M Victoria; Ares, Jorge

    2016-01-01

    Refraction techniques make it possible to determine the most appropriate sphero-cylindrical lens prescription to achieve the best possible visual quality. Among these techniques, subjective refraction (i.e., patient's response-guided refraction) is the most commonly used approach. In this context, this paper's main goal is to present simulation software that implements, in a virtual manner, various subjective-refraction techniques--including the Jackson Cross-Cylinder test (JCC)--all relying on the observation of computer-generated retinal images. This software has also been used to evaluate visual quality when the JCC test is performed in multifocal-contact-lens wearers. The results reveal this software's usefulness for simulating the retinal image quality that a particular visual compensation provides. Moreover, it can help to gain deeper insight into and improve existing refraction techniques, and it can be used for simulated training.

  9. Development of a Process for a High Capacity Arc Heater Production of Silicon for Solar Arrays

    NASA Technical Reports Server (NTRS)

    Reed, W. H.

    1979-01-01

    A program was established to develop a high temperature silicon production process using existing electric arc heater technology. Silicon tetrachloride and a reductant (sodium) are injected into an arc heated mixture of hydrogen and argon. Under these high temperature conditions, a very rapid reaction is expected to occur and proceed essentially to completion, yielding silicon and gaseous sodium chloride. Techniques for high temperature separation and collection were developed. Included in this report are: test system preparation; testing; injection techniques; kinetics; reaction demonstration; conclusions; and the project status.

  10. Geometric and shading correction for images of printed materials using boundary.

    PubMed

    Brown, Michael S; Tsoi, Yau-Chat

    2006-06-01

    A novel technique that uses boundary interpolation to correct geometric distortion and shading artifacts present in images of printed materials is presented. Unlike existing techniques, our algorithm can simultaneously correct a variety of geometric distortions, including skew, fold distortion, binder curl, and combinations of these. In addition, the same interpolation framework can be used to estimate the intrinsic illumination component of the distorted image to correct shading artifacts. We detail our algorithm for geometric and shading correction and demonstrate its usefulness on real-world and synthetic data.

  11. Theoretical and software considerations for nonlinear dynamic analysis

    NASA Technical Reports Server (NTRS)

    Schmidt, R. J.; Dodds, R. H., Jr.

    1983-01-01

    In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to dynamic analysis and defines the requirements for a general-purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general-purpose structural software system is presented.

  12. Pelvic floor muscle training for urgency urinary incontinence in women: a systematic review.

    PubMed

    Greer, Joy A; Smith, Ariana L; Arya, Lily A

    2012-06-01

    The objective of this study is to evaluate the effectiveness of existing physiotherapy modalities for the treatment of urge urinary incontinence (UUI). A systematic review was performed for primary studies of physiotherapy techniques for UUI published in English between 1996 and August 2010 in major electronic databases. Only randomized clinical trials that reported outcomes separately for women with UUI were included. Outcomes assessed were reduction in UUI, urinary frequency, and nocturia. Data from 13 full-text trials including the modalities of pelvic floor muscles exercises with or without biofeedback, vaginal electrical stimulation, magnetic stimulation, and vaginal cones were analyzed. The methodologic quality of these trials was fair. Significant improvement in UUI was reported for all physiotherapy techniques except vaginal cone therapy. There are insufficient data to determine if pelvic physiotherapy improves urinary frequency or nocturia. Evidence suggests that physiotherapy techniques may be beneficial for the treatment of UUI.

  13. Economics of polysilicon processes

    NASA Technical Reports Server (NTRS)

    Yaws, C. L.; Li, K. Y.; Chou, S. M.

    1986-01-01

    Techniques are being developed to provide lower-cost polysilicon material for solar cells. Existing technology, which normally supplies semiconductor-industry polysilicon, is undergoing changes and is also being used to provide polysilicon material for solar cells. The economics of new and existing technologies for producing polysilicon are presented, based primarily on the preliminary process design of a plant producing 1,000 metric tons/year of silicon. The polysilicon processes include: the Siemens process (hydrogen reduction of trichlorosilane); the Union Carbide process (silane decomposition); and the Hemlock Semiconductor process (hydrogen reduction of dichlorosilane). The economics include cost estimates of the capital investment and product cost to produce polysilicon via each technology. Sensitivity analysis results are also presented to disclose the effect of major parameters such as utilities, labor, raw materials and capital investment.

  14. Computing Challenges in Coded Mask Imaging

    NASA Technical Reports Server (NTRS)

    Skinner, Gerald

    2009-01-01

    This slide presentation reviews the complications and challenges in developing computer systems for coded mask imaging telescopes. The coded mask technique is used when there is no other way to build the telescope, i.e., when wide fields of view and very good angular resolution are required at energies too high for focusing optics or too low for Compton/tracker techniques. The coded mask telescope is described, and the mask is reviewed. The coded masks for the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) instruments are shown, and a chart showing the types of position-sensitive detectors used for coded mask telescopes is also reviewed. Slides describe the mechanism of recovering an image from the masked pattern. The correlation with the mask pattern is described. The matrix approach is reviewed, and other approaches to image reconstruction are described. Included in the presentation is a review of the Energetic X-ray Imaging Survey Telescope (EXIST) / High Energy Telescope (HET), with information about the mission, the operation of the telescope, comparison of the EXIST/HET with the SWIFT/BAT, and details of the design of the EXIST/HET.
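The correlation approach mentioned above can be sketched in one dimension: a point source casts a shifted copy of the mask pattern onto the detector, and a balanced cross-correlation with the mask recovers the source position. The random 0/1 mask and noiseless detector below are illustrative assumptions; real instruments use carefully designed 2-D patterns (e.g. uniformly redundant arrays) to suppress sidelobes.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 64
mask = rng.integers(0, 2, n)          # 1 = open element, 0 = opaque

# A point source at some offset casts a shifted copy of the mask
# pattern onto the detector (no noise, fully coded field).
source_offset = 17
detector = np.roll(mask, source_offset).astype(float)

# Balanced correlation: weight open elements +1 and closed elements -1
# so the flat background cancels in the decoded image.
weights = 2.0 * mask - 1.0
decoded = np.array([np.dot(np.roll(weights, s), detector)
                    for s in range(n)])

recovered = int(np.argmax(decoded))   # peak marks the source position
```

The matrix approach on the slides generalizes this: decoding becomes a linear operator applied to the detector vector, which is what makes the computation expensive for large masks.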

  15. A New Adaptive Framework for Collaborative Filtering Prediction

    PubMed Central

    Almosallam, Ibrahim A.; Shang, Yi

    2010-01-01

    Collaborative filtering is one of the most successful techniques for recommendation systems and has been used in many commercial services provided by major companies including Amazon, TiVo and Netflix. In this paper we focus on memory-based collaborative filtering (CF). Existing CF techniques work well on dense data but poorly on sparse data. To address this weakness, we propose to use z-scores instead of explicit ratings and introduce a mechanism that adaptively combines global statistics with item-based values based on data density level. We present a new adaptive framework that encapsulates various CF algorithms and the relationships among them. An adaptive CF predictor is developed that can self adapt from user-based to item-based to hybrid methods based on the amount of available ratings. Our experimental results show that the new predictor consistently obtained more accurate predictions than existing CF methods, with the most significant improvement on sparse data sets. When applied to the Netflix Challenge data set, our method performed better than existing CF and singular value decomposition (SVD) methods and achieved 4.67% improvement over Netflix’s system. PMID:21572924
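The z-score idea described above can be sketched for user-based prediction: each neighbour's rating is normalized by that neighbour's own mean and standard deviation before being combined. The toy rating matrix and plain Pearson-similarity weighting are illustrative assumptions; the paper's adaptive framework additionally blends global and item-based statistics by data density.

```python
import numpy as np

ratings = np.array([      # rows: users, columns: items; 0 means unrated
    [5., 3., 0., 4.],
    [4., 0., 4., 5.],
    [1., 5., 4., 0.],
])

def predict(ratings, user, item, eps=1e-9):
    """User-based CF on z-scores: a neighbour's rating is centred and
    scaled by that neighbour's own statistics, weighted by Pearson
    similarity, then mapped back to the target user's rating scale."""
    rated = ratings > 0
    means = np.array([r[m].mean() for r, m in zip(ratings, rated)])
    stds = np.maximum([r[m].std() for r, m in zip(ratings, rated)], eps)

    num = den = 0.0
    for v in range(len(ratings)):
        if v == user or not rated[v, item]:
            continue
        shared = rated[user] & rated[v]
        if shared.sum() < 2:
            continue
        sim = np.corrcoef(ratings[user, shared], ratings[v, shared])[0, 1]
        if np.isnan(sim):
            continue
        z = (ratings[v, item] - means[v]) / stds[v]  # neighbour z-score
        num += sim * z
        den += abs(sim)
    if den == 0:
        return means[user]                 # fall back to the user's mean
    return means[user] + stds[user] * num / den
```

Using z-scores rather than raw ratings compensates for users who rate systematically high or low, which is one reason such normalization helps on sparse data.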

  17. Geophysical monitoring in a hydrocarbon reservoir

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Bokelmann, Goetz

    2016-04-01

    Extraction of hydrocarbons from reservoirs demands ever-increasing technological effort, and there is a need for geophysical monitoring to better understand phenomena occurring within the reservoir. Significant deformation processes occur when man-made stimulation is performed, in combination with effects deriving from existing natural conditions such as the in-situ stress regime or pre-existing fracturing. Keeping track of such changes in the reservoir is important, on the one hand for improving recovery of hydrocarbons, and on the other hand for assuring a safe and proper mode of operation. Monitoring becomes particularly important when hydraulic fracturing (HF) is used, especially in the form of the much-discussed "fracking". HF is a sophisticated technique that is widely applied in low-porosity geological formations to enhance the production of natural hydrocarbons. In principle, similar HF techniques have been applied in Europe for a long time in conventional reservoirs, and they will probably be intensified in the near future; this suggests an increasing demand for technological development, including updating and adapting the existing monitoring techniques in applied geophysics. We review currently available geophysical techniques for reservoir monitoring across the different fields of reservoir analysis. First, the properties of the hydrocarbon reservoir are identified; here we consider geophysical monitoring exclusively. The second step is to define the quantities, associated with those properties, that can be monitored. We then describe the geophysical monitoring techniques, from the oldest ones, in practical use for 40-50 years, to the most recent technological developments, in distinct groups according to the field of reservoir analysis. This work is performed as part of the FracRisk consortium (www.fracrisk.eu); this project, funded by the Horizon2020 research programme, aims at helping minimize the environmental footprint of shale-gas exploration and exploitation.

  18. Evaluation of a satellite laser ranging technique using pseudonoise code modulated laser diodes

    NASA Technical Reports Server (NTRS)

    Ball, Carolyn Kay

    1987-01-01

    Several types of Satellite Laser Ranging systems exist, operating with pulsed, high-energy lasers. The distance between a ground point and an orbiting satellite can be determined to within a few centimeters. A new technique substitutes pseudonoise code modulated laser diodes, which are much more compact, reliable and less costly, for the lasers now used. Since laser diode technology is only now achieving sufficiently powerful lasers, the capabilities of the new technique are investigated. Also examined are the effects of using an avalanche photodiode detector instead of a photomultiplier tube. The influence of noise terms (including background radiation, detector dark and thermal noise and speckle) that limit the system range and performance is evaluated.

  19. Middle Atmosphere Program. Handbook for MAP, volume 9

    NASA Technical Reports Server (NTRS)

    Bowhill, S. A. (Editor); Edwards, B. (Editor)

    1983-01-01

    The term Mesosphere-Stratosphere-Troposphere radar (MST) was invented to describe the use of a high power radar transmitter together with a large vertically, or near vertically, pointing antenna to study the dynamics and structure of the atmosphere from about 10 to 100 km, using the very weak coherently scattered radiation returned from small scale irregularities in refractive index. Nine topics were addressed including: meteorological and dynamic requirements for MST radar networks; interpretation of radar returns for clear air; techniques for the measurement of horizontal and vertical velocities; techniques for studying gravity waves and turbulence; capabilities and limitations of existing MST radar; design considerations for high power VHF radar transceivers; optimum radar antenna configurations; and data analysis techniques.

  20. A Double Whammy: Health Promotion Among Cancer Survivors with Pre-Existing Functional Limitations

    PubMed Central

    Volker, Deborah L.; Becker, Heather; Kang, Sook Jung; Kullberg, Vicki

    2012-01-01

    Purpose/Objectives: To explore the experience of living with a cancer diagnosis within the context of a pre-existing functional disability and to identify strategies to promote health in this growing population of cancer survivors. Research Approach: Qualitative descriptive. Setting: Four sites in the United States. Participants: 19 female cancer survivors with pre-existing disabling conditions. Methodologic Approach: Four focus groups were conducted. The audiotapes were transcribed and analyzed using content analysis techniques. Main Research Variables: Cancer survivor, disability, health promotion. Findings: Analytic categories included living with a cancer diagnosis, health promotion strategies, and wellness program development for survivors with pre-existing functional limitations. Participants described many challenges associated with managing a cancer diagnosis on top of living with a chronic disabling functional limitation. They identified strategies they used to maintain their health and topics to be included in health promotion programs tailored for this unique group of cancer survivors. Conclusions: The “double whammy” of a cancer diagnosis for persons with pre-existing functional limitations requires modification of health promotion strategies and programs to promote wellness in this group of cancer survivors. Interpretation: Nurses and other health care providers must attend to patients’ pre-existing conditions as well as the challenges of the physical, emotional, social, and economic sequelae of a cancer diagnosis. PMID:23269771

  1. Laboratory reptile surgery: principles and techniques.

    PubMed

    Alworth, Leanne C; Hernandez, Sonia M; Divers, Stephen J

    2011-01-01

    Reptiles used for research and instruction may require surgical procedures, including biopsy, coelomic device implantation, ovariectomy, orchidectomy, and esophagostomy tube placement, to accomplish research goals. Providing veterinary care for unanticipated clinical problems may require surgical techniques such as amputation, bone or shell fracture repair, and coeliotomy. Although many principles of surgery are common between mammals and reptiles, important differences in anatomy and physiology exist. Veterinarians who provide care for these species should be aware of these differences. Most reptiles undergoing surgery are small and require specific instrumentation and positioning. In addition, because of the wide variety of unique physiologic and anatomic characteristics among snakes, chelonians, and lizards, different techniques may be necessary for different reptiles. This overview describes many common reptile surgery techniques and their application for research purposes or to provide medical care to research subjects.

  2. Novel method for fog monitoring using cellular networks infrastructures

    NASA Astrophysics Data System (ADS)

    David, N.; Alpert, P.; Messer, H.

    2012-08-01

    A major detrimental effect of fog is visibility limitation, which can result in serious transportation accidents, traffic delays and therefore economic damage. Existing monitoring techniques, including satellites, transmissometers and human observers, suffer from low spatial resolution, high cost or lack of precision when measuring near ground level. Here we show a novel technique for fog monitoring using wireless communication systems. Communication networks widely deploy commercial microwave links across the terrain at ground level. Operating at frequencies of tens of GHz, they are affected by fog and are, effectively, an existing, spatially world-wide distributed sensor network that can provide crucial information about fog concentration and visibility. Fog monitoring potential is demonstrated for a heavy fog event that took place in Israel. The correlations of the visibility estimates from the nearby microwave links with transmissometer and human-eye observations were found to be 0.53 and 0.61, respectively. These values indicate the high potential of the proposed method.
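    The abstract gives no retrieval formula. As a purely illustrative sketch of the two-step idea (link attenuation → fog liquid water content → visibility), consider the toy function below; all coefficients are placeholders chosen for illustration and are not taken from the paper:

```python
def fog_visibility_km(specific_atten_db_km, k_l=1.0, a=0.05, b=0.9):
    """Toy fog retrieval from a commercial microwave link.

    Step 1: in the Rayleigh scattering regime, fog-specific attenuation
    (dB/km) is roughly proportional to liquid water content W (g/m^3):
    atten = k_l * W.
    Step 2: an empirical power law maps W to visibility (km).
    All coefficients here are placeholders for illustration only.
    """
    w = max(specific_atten_db_km, 1e-9) / k_l    # liquid water content, g/m^3
    return (a / w) ** b                          # visibility in km (toy relation)
```

    The key property the sketch preserves is monotonicity: heavier fog produces higher link attenuation and therefore a lower visibility estimate.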

  3. Application of Three Existing Stope Boundary Optimisation Methods in an Operating Underground Mine

    NASA Astrophysics Data System (ADS)

    Erdogan, Gamze; Yavuz, Mahmut

    2017-12-01

    The underground mine planning and design optimisation process has received little attention because of the complexity and variability of problems in underground mines. Although a number of optimisation studies and software tools are available, and some of them in particular have been implemented effectively to determine the ultimate-pit limits in an open pit mine, there is still a lack of studies on optimisation of ultimate stope boundaries in underground mines. The proposed approaches for this purpose aim at maximizing the economic profit by selecting the best possible layout under operational, technical and physical constraints. In this paper, three existing heuristic techniques, namely the Floating Stope Algorithm, the Maximum Value Algorithm and the Mineable Shape Optimiser (MSO), are examined for optimisation of the stope layout in a case study. Each technique is assessed in terms of applicability, algorithm capabilities and limitations, considering the underground mine planning challenges. Finally, the results are evaluated and compared.

  4. ASRDI oxygen technology survey. Volume 5: Density and liquid level measurement instrumentation for the cryogenic fluids oxygen, hydrogen, and nitrogen

    NASA Technical Reports Server (NTRS)

    Roder, H. M.

    1974-01-01

    Information is presented on instrumentation for density measurement, liquid level measurement, quantity gauging, and phase measurement. Coverage of existing information directly concerned with oxygen was given primary emphasis. A description of the physical principle of measurement for each instrumentation type is included. The basic materials of construction are listed if available from the source document for each instrument discussed. Cleaning requirements, procedures, and verification techniques are included.

  5. Towards Optimal Platform-Based Robot Design for Ankle Rehabilitation: The State of the Art and Future Prospects

    PubMed Central

    Li, Hongsheng

    2018-01-01

    This review aims to compare existing robot-assisted ankle rehabilitation techniques in terms of robot design. Included studies mainly consist of selected papers in two published reviews involving a variety of robot-assisted ankle rehabilitation techniques. A free search was also made in Google Scholar and Scopus by using keywords “ankle∗,” and “robot∗,” and (“rehabilitat∗” or “treat∗”). The search is limited to English-language articles published between January 1980 and September 2016. Results show that existing robot-assisted ankle rehabilitation techniques can be classified into wearable exoskeleton and platform-based devices. Platform-based devices are mostly developed for the treatment of a variety of ankle musculoskeletal and neurological injuries, while wearable ones focus more on ankle-related gait training. In terms of robot design, comparative analysis indicates that an ideal ankle rehabilitation robot should have aligned rotation center as the ankle joint, appropriate workspace, and actuation torque, no matter how many degrees of freedom (DOFs) it has. Single-DOF ankle robots are mostly developed for specific applications, while multi-DOF devices are more suitable for comprehensive ankle rehabilitation exercises. Other factors including posture adjustability and sensing functions should also be considered to promote related clinical applications. An ankle rehabilitation robot with reconfigurability to maximize its functions will be a new research point towards optimal design, especially on parallel mechanisms. PMID:29736230

  6. Towards Optimal Platform-Based Robot Design for Ankle Rehabilitation: The State of the Art and Future Prospects.

    PubMed

    Miao, Qing; Zhang, Mingming; Wang, Congzhe; Li, Hongsheng

    2018-01-01

    This review aims to compare existing robot-assisted ankle rehabilitation techniques in terms of robot design. Included studies mainly consist of selected papers in two published reviews involving a variety of robot-assisted ankle rehabilitation techniques. A free search was also made in Google Scholar and Scopus by using keywords "ankle∗," and "robot∗," and ("rehabilitat∗" or "treat∗"). The search is limited to English-language articles published between January 1980 and September 2016. Results show that existing robot-assisted ankle rehabilitation techniques can be classified into wearable exoskeleton and platform-based devices. Platform-based devices are mostly developed for the treatment of a variety of ankle musculoskeletal and neurological injuries, while wearable ones focus more on ankle-related gait training. In terms of robot design, comparative analysis indicates that an ideal ankle rehabilitation robot should have aligned rotation center as the ankle joint, appropriate workspace, and actuation torque, no matter how many degrees of freedom (DOFs) it has. Single-DOF ankle robots are mostly developed for specific applications, while multi-DOF devices are more suitable for comprehensive ankle rehabilitation exercises. Other factors including posture adjustability and sensing functions should also be considered to promote related clinical applications. An ankle rehabilitation robot with reconfigurability to maximize its functions will be a new research point towards optimal design, especially on parallel mechanisms.

  7. Manual of Considerations and Techniques for Start-Up of Municipal Wastewater Treatment Facilities.

    ERIC Educational Resources Information Center

    Rader, R. D.; And Others

    This manual provides guidance for putting into initial operation a new municipal wastewater treatment plant, a new addition to an existing treatment plant, or a change in the mode of a treatment plant's operation. Information is provided on preparing for actual treatment plant start-up. Preparation for start-up includes: staffing the plant,…

  8. Using Geospatial Techniques to Address Institutional Objectives: St. Petersburg College Geo-Demographic Analysis. IR Applications, Volume 27

    ERIC Educational Resources Information Center

    Morris, Phillip; Thrall, Grant

    2010-01-01

    Geographic analysis has been adopted by businesses, especially the retail sector, since the early 1990s (Thrall, 2002). Institutional research can receive the same benefits businesses have by adopting geographic analysis and technology. The commonalities between businesses and higher education institutions include the existence of trade areas, the…

  9. Application of Fourier transforms for microwave radiometric inversions

    NASA Technical Reports Server (NTRS)

    Holmes, J. J.; Balanis, C. A.; Truman, W. M.

    1975-01-01

    Existing microwave radiometer technology now provides a suitable method for remote determination of the ocean surface's absolute brightness temperature. To extract the brightness temperature of the water from the antenna temperature, an unstable Fredholm integral equation of the first kind is solved. Fourier transform techniques are used to invert the integral after it is placed into a cross correlation form. Application and verification of the methods to a two-dimensional modeling of a laboratory wave tank system are included. The instability of the ill-posed Fredholm equation is examined and a restoration procedure is included which smooths the resulting oscillations. With the recent availability and advances of fast Fourier transform (FFT) techniques, the method presented becomes very attractive in the evaluation of large quantities of data.
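    As a hedged sketch of this approach, suppose the Fredholm kernel acts as a circular convolution of the brightness temperature with the antenna gain pattern; the FFT-based inversion with a simple damping term (standing in for the paper's restoration/smoothing procedure) might look like this:

```python
import numpy as np

def fft_invert(t_antenna, gain_kernel, eps=1e-3):
    """Recover brightness temperature from antenna temperature.

    Models the Fredholm integral as a circular convolution T_A = G * T_B,
    so in the Fourier domain T_B = F(T_A) / F(G).  The eps term is a
    Wiener-style regularization that suppresses the near-zero kernel
    frequencies responsible for the oscillations of the ill-posed inversion.
    """
    Ta = np.fft.fft(t_antenna)
    G = np.fft.fft(gain_kernel)
    Tb = Ta * np.conj(G) / (np.abs(G) ** 2 + eps)
    return np.real(np.fft.ifft(Tb))
```

    Forward-convolving a known brightness profile with a normalized gain kernel and then inverting recovers the profile; increasing eps trades fidelity for stability, just as the restoration step smooths the resulting oscillations.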

  10. Glucagon Is a Safe and Inexpensive Initial Strategy in Esophageal Food Bolus Impaction.

    PubMed

    Haas, Jason; Leo, Julia; Vakil, Nimish

    2016-03-01

    Controversy exists about the utility of pharmacologic agents and endoscopic technique used for esophageal food bolus impaction. To evaluate the utility of glucagon and the technique used for endoscopic removal, including the rate of success and the adverse events of the techniques. The database of the largest healthcare provider in southeastern Wisconsin was retrospectively reviewed for patients presenting with esophageal food bolus impaction. Data extracted included glucagon administration and its success rate, outcome of radiographic studies, and the endoscopic method of removal and adverse events associated with it, including 30-day mortality. A total of 750 patients were identified with food bolus impaction from 2007 to 2012. Glucagon was administered in 440 patients and was successful in 174 (39.5%). Endoscopic removal was performed in 470 patients and was successful in 469 (99.8%). The push technique was utilized in 209 patients, reduction in the bolus size by piecemeal removal followed by the push technique was utilized in 97 patients, and the pull technique was utilized in 107 patients. There were no perforations with endoscopic removal. Only 4.5% of the X-rays performed reported a possible foreign body within the esophagus. Glucagon was a significantly less-expensive strategy than endoscopic therapy (p < 0.0001). Glucagon is low cost, is moderately effective, and may be considered as an initial strategy. Endoscopic removal regardless of technique is safe and effective. The yield of radiography is poor in the setting of food bolus impaction.

  11. Fluorescence techniques in agricultural applications

    NASA Astrophysics Data System (ADS)

    McMurtrey, James E.; Corp, Lawrence A.; Kim, Moon S.; Chappelle, Emmett W.; Daughtry, Craig S. T.; DiBenedetto, J. D.

    2001-03-01

    Intellectual property licensing is an important issue facing all technology companies. Before entering into license agreements a number of issues need to be addressed, including invention ownership, obtaining and identifying licensable subject matter, and developing a licensing strategy. There are a number of important provisions that are included in most intellectual property license agreements. These provisions include definitions, the license grant, consideration, audit rights, confidentiality, warranties, indemnification, and limitation of liability. Special licensing considerations exist relative to each type of intellectual property, and when the other party is a foreign company or a university.

  12. GalWeight: A New and Effective Weighting Technique for Determining Galaxy Cluster and Group Membership

    NASA Astrophysics Data System (ADS)

    Abdullah, Mohamed H.; Wilson, Gillian; Klypin, Anatoly

    2018-07-01

    We introduce GalWeight, a new technique for assigning galaxy cluster membership. This technique is specifically designed to simultaneously maximize the number of bona fide cluster members while minimizing the number of contaminating interlopers. The GalWeight technique can be applied to both massive galaxy clusters and poor galaxy groups. Moreover, it is effective in identifying members in both the virial and infall regions with high efficiency. We apply the GalWeight technique to MDPL2 and Bolshoi N-body simulations, and find that it is >98% accurate in correctly assigning cluster membership. We show that GalWeight compares very favorably against four well-known existing cluster membership techniques (shifting gapper, den Hartog, caustic, SIM). We also apply the GalWeight technique to a sample of 12 Abell clusters (including the Coma cluster) using observations from the Sloan Digital Sky Survey. We conclude by discussing GalWeight’s potential for other astrophysical applications.

  13. A LITERATURE REVIEW OF WIPE SAMPLING METHODS ...

    EPA Pesticide Factsheets

    Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, wetting solvent, and determinative step to be used, depending upon the contaminant of concern. The objective of this report is to concisely summarize the findings of a literature review that was conducted to identify the state-of-the-art wipe sampling techniques for a target list of compounds. This report describes the methods used to perform the literature review; a brief review of wipe sampling techniques in general; an analysis of physical and chemical properties of each target analyte; an analysis of wipe sampling techniques for the target analyte list; and a summary of the wipe sampling techniques for the target analyte list, including existing data gaps. In general, no overwhelming consensus can be drawn from the current literature on how to collect a wipe sample for the chemical warfare agents, organophosphate pesticides, and other toxic industrial chemicals of interest to this study. Different methods, media, and wetting solvents have been recommended and used by various groups and different studies. For many of the compounds of interest, no specific wipe sampling methodology has been established for their collection. Before a wipe sampling method (or methods) can be established for the co

  14. Lunar surface magnetometer experiment

    NASA Technical Reports Server (NTRS)

    Dyal, P.; Parkin, C. W.; Colburn, D. S.; Schubert, G.

    1972-01-01

    The Apollo 16 lunar surface magnetometer (LSM) activation completed the network installation of magnetic observatories on the lunar surface and initiated simultaneous measurements of the global response of the moon to large-scale solar and terrestrial magnetic fields. Fossil remanent magnetic fields have been measured at nine locations on the lunar surface, including the Apollo 16 LSM site in the Descartes highlands area. This fossil record indicates the possible existence of an ancient lunar dynamo or a solar or terrestrial field much stronger than exists at present. The experimental technique and operation of the LSM are described and the results obtained are discussed.

  15. Measuring Children’s Media Use in the Digital Age

    PubMed Central

    Vandewater, Elizabeth A.; Lee, Sook-Jung

    2009-01-01

    In this new and rapidly changing era of digital technology, there is increasing consensus among media scholars that there is an urgent need to develop measurement approaches which more adequately capture media use. The overarching goal of this paper is to facilitate the development of measurement approaches appropriate for capturing children’s media use in the digital age. The paper outlines various approaches to measurement, focusing mainly on those which have figured prominently in major existing studies of children’s media use. We identify issues related to each technique, including advantages and disadvantages. We also include a review of existing empirical comparisons of various methodologies. The paper is intended to foster discussion of the best ways to further research and knowledge regarding the impact of media on children. PMID:19763246

  16. Launch team training system

    NASA Technical Reports Server (NTRS)

    Webb, J. T.

    1988-01-01

    A new approach to the training, certification, recertification, and proficiency maintenance of the Shuttle launch team is proposed. Previous training approaches are first reviewed. Short term program goals include expanding current training methods, improving the existing simulation capability, and scheduling training exercises with the same priority as hardware tests. Long-term goals include developing user requirements which would take advantage of state-of-the-art tools and techniques. Training requirements for the different groups of people to be trained are identified, and future goals are outlined.

  17. Photoinduced force microscopy: A technique for hyperspectral nanochemical mapping

    NASA Astrophysics Data System (ADS)

    Murdick, Ryan A.; Morrison, William; Nowak, Derek; Albrecht, Thomas R.; Jahng, Junghoon; Park, Sung

    2017-08-01

    Advances in nanotechnology have intensified the need for tools that can characterize newly synthesized nanomaterials. A variety of techniques combining atomic force microscopy (AFM) with optical illumination has recently been demonstrated, including tip-enhanced Raman spectroscopy (TERS), scattering-type scanning near-field optical microscopy (sSNOM), and photothermal induced resonance microscopy (PTIR). To varying degrees, these existing techniques enable optical spectroscopy with the nanoscale spatial resolution inherent to AFM, thereby providing nanochemical interrogation of a specimen. Here we discuss photoinduced force microscopy (PiFM), a recently developed technique for nanoscale optical spectroscopy that exploits image forces acting between an AFM tip and sample to detect wavelength-dependent polarization within the sample to generate absorption spectra. This approach enables ∼10 nm spatial resolution with spectra that show correlation with macroscopic optical absorption spectra. Unlike other techniques, PiFM achieves this high resolution with virtually no constraints on sample or substrate properties. The applicability of PiFM to a variety of archetypal systems is reported here, highlighting the potential of PiFM as a useful tool for a wide variety of industrial and academic investigations, including semiconducting nanoparticles, nanocellulose, block copolymers, and low dimensional systems, as well as chemical and morphological mixing at interfaces.

  18. Optimising Laser Tattoo Removal

    PubMed Central

    Sardana, Kabir; Ranjan, Rashmi; Ghunawat, Sneha

    2015-01-01

    Lasers are the standard modality for tattoo removal. Though there are various factors that determine the results, we have divided them into three logical headings: laser-dependent factors such as the type of laser and beam modifications; tattoo-dependent factors such as size, depth and colour of pigment; and lastly host-dependent factors, which include primarily the presence of a robust immune response. Modifications in the existing techniques may help achieve better clinical outcomes with minimal risk of complications. This article provides an insight into some of these techniques along with a detailed account of the factors involved in tattoo removal. PMID:25949018

  19. 3D shape measurement of automotive glass by using a fringe reflection technique

    NASA Astrophysics Data System (ADS)

    Skydan, O. A.; Lalor, M. J.; Burton, D. R.

    2007-01-01

    In automotive and glass making industries, there is a need for accurately measuring the 3D shapes of reflective surfaces to speed up and ensure product development and manufacturing quality by using non-contact techniques. This paper describes a technique for the measurement of non-full-field reflective surfaces of automotive glass by using a fringe reflection technique. Physical properties of the measurement surfaces do not allow us to apply optical geometries used in existing techniques for surface measurement based upon direct fringe pattern illumination. However, this property of surface reflectivity can be used to implement similar ideas from existing techniques in a new improved method. In other words, the reflective surface can be used as a mirror to reflect illuminated fringe patterns onto a screen behind. It has been found that in the case of implementing the reflective fringe technique, the phase-shift distribution depends not only on the height of the object but also on the slope at each measurement point. This requires the solving of differential equations to find the surface slope and height distributions in the x and y directions and the development of additional height-reconstruction algorithms. The main focus has been on developing a mathematical model of the optical sub-system and discussing ways for its practical implementation, including calibration procedures. A number of implemented image processing algorithms for system calibration and data analysis are discussed and two experimental results are given for automotive glass surfaces with different shapes and defects. The proposed technique showed the ability to provide accurate non-destructive measurement of 3D shapes of the reflective automotive glass surfaces and can be used as a key element for a glass shape quality control system on-line or in a laboratory environment.
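    The abstract notes that the measured phase encodes both slope and height, so reconstruction requires integrating the recovered slope maps. One standard way to perform such least-squares slope integration (a Frankot-Chellappa-style Fourier method, not necessarily the authors' own algorithm) can be sketched as:

```python
import numpy as np

def integrate_slopes(p, q):
    """Least-squares integration of slope maps into a height map.

    p and q are the surface slopes dz/dx and dz/dy (as would be recovered
    from the phase of the reflected fringes, with x along columns and y
    along rows, in per-sample units); the integrable height surface is
    obtained in the Fourier domain.
    """
    rows, cols = p.shape
    wx = np.fft.fftfreq(cols) * 2 * np.pi
    wy = np.fft.fftfreq(rows) * 2 * np.pi
    WX, WY = np.meshgrid(wx, wy)
    denom = WX ** 2 + WY ** 2
    denom[0, 0] = 1.0                    # avoid division by zero at DC
    Z = (-1j * WX * np.fft.fft2(p) - 1j * WY * np.fft.fft2(q)) / denom
    Z[0, 0] = 0.0                        # height is defined up to a constant
    return np.real(np.fft.ifft2(Z))
```

    Because the DC component is discarded, the result is the surface height up to an additive constant, which is all a shape or defect inspection needs.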

  20. Optimization of monitoring networks based on uncertainty quantification of model predictions of contaminant transport

    NASA Astrophysics Data System (ADS)

    Vesselinov, V. V.; Harp, D.

    2010-12-01

    The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system, such as (1) measurement and computational errors, (2) uncertainties in the conceptual model and model-parameter estimates, and (3) simplifications in model setup and numerical representation of governing processes, contribute to the uncertainties in the model predictions. Due to this combination of factors, the sources of predictive uncertainties are generally difficult to quantify individually. Decision support related to optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, and (2) optimization of the proposed monitoring network locations in terms of their efficiency to detect contaminants and provide early warning. We apply existing and newly-proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of a newly-developed optimization technique based on the coupling of Particle Swarm and Levenberg-Marquardt optimization methods, which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection. The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
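    The abstract does not spell out how MADS couples the Particle Swarm and Levenberg-Marquardt methods. A common and plausible coupling (a coarse global swarm search followed by a local LM polish) can be sketched as below, with a hypothetical exponential-decay calibration problem standing in for a real contaminant-transport model:

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(theta, t_obs, c_obs):
    """Hypothetical calibration residual: exponential-decay concentration model."""
    amp, rate = theta
    return amp * np.exp(-rate * t_obs) - c_obs

def pso_then_lm(t_obs, c_obs, bounds, n_particles=30, n_iter=60, seed=0):
    """Coarse global Particle Swarm search, then a local Levenberg-Marquardt polish."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds[0], float), np.array(bounds[1], float)
    x = rng.uniform(lo, hi, size=(n_particles, lo.size))
    v = np.zeros_like(x)
    cost = lambda th: np.sum(residuals(th, t_obs, c_obs) ** 2)
    pbest = x.copy()
    pbest_cost = np.array([cost(p) for p in x])
    gbest = pbest[np.argmin(pbest_cost)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)               # keep particles inside the box
        c = np.array([cost(p) for p in x])
        improved = c < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], c[improved]
        gbest = pbest[np.argmin(pbest_cost)].copy()
    # polish the swarm's best candidate with Levenberg-Marquardt
    fit = least_squares(residuals, gbest, args=(t_obs, c_obs), method='lm')
    return fit.x
```

    The division of labor matches the robustness claim in the abstract: the swarm explores the bounded parameter space without gradients, while the gradient-based LM step converges quickly once a good basin has been found.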

  1. Multi-physics damage sensing in nano-engineered structural composites.

    PubMed

    de Villoria, Roberto Guzmán; Yamamoto, Namiko; Miravete, Antonio; Wardle, Brian L

    2011-05-06

    Non-destructive evaluation techniques can offer viable diagnostic and prognostic routes to mitigating failures in engineered structures such as bridges, buildings and vehicles. However, existing techniques have significant drawbacks, including poor spatial resolution and limited in situ capabilities. We report here a novel approach where structural advanced composites containing electrically conductive aligned carbon nanotubes (CNTs) are ohmically heated via simple electrical contacts, and damage is visualized via thermographic imaging. Damage, in the form of cracks and other discontinuities, usefully increases resistance to both electrical and thermal transport in these materials, which enables tomographic full-field damage assessment in many cases. Characteristics of the technique include the ability for real-time measurement of the damage state during loading, low-power operation (e.g. 15 °C rise at 1 W), and beyond state-of-the-art spatial resolution for sensing damage in composites. The enhanced thermographic technique is a novel and practical approach for in situ monitoring to ascertain structural health and to prevent structural failures in engineered structures such as aerospace and automotive vehicles and wind turbine blades, among others.

  2. Multi-physics damage sensing in nano-engineered structural composites

    NASA Astrophysics Data System (ADS)

    Guzmán de Villoria, Roberto; Yamamoto, Namiko; Miravete, Antonio; Wardle, Brian L.

    2011-05-01

    Non-destructive evaluation techniques can offer viable diagnostic and prognostic routes to mitigating failures in engineered structures such as bridges, buildings and vehicles. However, existing techniques have significant drawbacks, including poor spatial resolution and limited in situ capabilities. We report here a novel approach where structural advanced composites containing electrically conductive aligned carbon nanotubes (CNTs) are ohmically heated via simple electrical contacts, and damage is visualized via thermographic imaging. Damage, in the form of cracks and other discontinuities, usefully increases resistance to both electrical and thermal transport in these materials, which enables tomographic full-field damage assessment in many cases. Characteristics of the technique include the ability for real-time measurement of the damage state during loading, low-power operation (e.g. 15 °C rise at 1 W), and beyond state-of-the-art spatial resolution for sensing damage in composites. The enhanced thermographic technique is a novel and practical approach for in situ monitoring to ascertain structural health and to prevent structural failures in engineered structures such as aerospace and automotive vehicles and wind turbine blades, among others.

  3. Geomagnetism. Volume I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobs, J.A.

    1987-01-01

    The latest attempt to summarise the wealth of knowledge now available on geomagnetic phenomena has resulted in this multi-volume treatise, with contributions and reviews from many scientists. The first volume in the series contains a thorough review of all existing information on measuring the Earth's magnetic field, both on land and at sea, and includes a comparative analysis of the techniques available for this purpose.

  4. Determining the Elastic Modulus of Compliant Thin Films Supported on Substrates from Flat Punch Indentation Measurements

    Treesearch

    M.J. Wald; J.M. Considine; K.T. Turner

    2013-01-01

    Instrumented indentation is a technique that can be used to measure the elastic properties of soft thin films supported on stiffer substrates, including polymer films, cellulosic sheets, and thin layers of biological materials. When measuring thin film properties using indentation, the effect of the substrate must be considered. Most existing models for determining the...

  5. The challenge of measuring emergency preparedness: integrating component metrics to build system-level measures for strategic national stockpile operations.

    PubMed

    Jackson, Brian A; Faith, Kay Sullivan

    2013-02-01

    Although significant progress has been made in measuring public health emergency preparedness, system-level performance measures are lacking. This report examines a potential approach to such measures for Strategic National Stockpile (SNS) operations. We adapted an engineering analytic technique used to assess the reliability of technological systems, failure mode and effects analysis, to assess preparedness. That technique, which includes systematic mapping of the response system and identification of possible breakdowns that affect performance, provides a path to use data from existing SNS assessment tools to estimate likely future performance of the system overall. Systems models of SNS operations were constructed and failure mode analyses were performed for each component. Linking data from existing assessments, including the technical assistance review and functional drills, to reliability assessment was demonstrated using publicly available information. The use of failure mode and effects estimates to assess overall response system reliability was demonstrated with a simple simulation example. Reliability analysis appears to be an attractive way to integrate information from the substantial investment in detailed assessments for stockpile delivery and dispensing to provide a view of likely future response performance.
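    The report's reliability idea, composing component-level failure estimates into a system-level measure, can be illustrated with a toy calculation. The component names and probabilities below are invented for this sketch and are not taken from the report:

```python
import numpy as np

# Hypothetical success probabilities for a stockpile response chain
# (request -> shipment -> staging -> dispensing); illustrative only.
components = {"request": 0.99, "shipment": 0.95, "staging": 0.90, "dispensing": 0.85}

def series_reliability(rel):
    """A series system succeeds only if every component succeeds."""
    out = 1.0
    for r in rel.values():
        out *= r
    return out

analytic = series_reliability(components)

# Monte Carlo cross-check: simulate many responses, count full successes
rng = np.random.default_rng(1)
n = 200_000
draws = rng.random((n, len(components))) < np.array(list(components.values()))
simulated = draws.all(axis=1).mean()
```

Even with individually reliable components, the chained system reliability here is only about 0.72, which is exactly the kind of system-level insight the component metrics alone do not show.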

  6. Laboratory Reptile Surgery: Principles and Techniques

    PubMed Central

    Alworth, Leanne C; Hernandez, Sonia M; Divers, Stephen J

    2011-01-01

    Reptiles used for research and instruction may require surgical procedures, including biopsy, coelomic device implantation, ovariectomy, orchidectomy, and esophagostomy tube placement, to accomplish research goals. Providing veterinary care for unanticipated clinical problems may require surgical techniques such as amputation, bone or shell fracture repair, and coeliotomy. Although many principles of surgery are common between mammals and reptiles, important differences in anatomy and physiology exist. Veterinarians who provide care for these species should be aware of these differences. Most reptiles undergoing surgery are small and require specific instrumentation and positioning. In addition, because of the wide variety of unique physiologic and anatomic characteristics among snakes, chelonians, and lizards, different techniques may be necessary for different reptiles. This overview describes many common reptile surgery techniques and their application for research purposes or to provide medical care to research subjects. PMID:21333158

  7. Computer image processing in marine resource exploration

    NASA Technical Reports Server (NTRS)

    Paluzzi, P. R.; Normark, W. R.; Hess, G. R.; Hess, H. D.; Cruickshank, M. J.

    1976-01-01

    Pictographic data or imagery is commonly used in marine exploration. Pre-existing image processing techniques (software) similar to those used on imagery obtained from unmanned planetary exploration were used to improve marine photography and side-scan sonar imagery. Features and details not visible by conventional photo processing methods were enhanced by filtering and noise removal on selected deep-sea photographs. Information gained near the periphery of photographs allows improved interpretation and facilitates construction of bottom mosaics where overlapping frames are available. Similar processing techniques were applied to side-scan sonar imagery, including corrections for slant range distortion and along-track scale changes. The use of digital data processing and storage techniques greatly extends the quantity of information that can be handled, stored, and processed.
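    The slant-range correction mentioned for side-scan sonar reduces, under a flat-seafloor assumption, to simple right-triangle geometry: the horizontal (ground) range is the leg whose hypotenuse is the slant range and whose other leg is the towfish altitude. A minimal sketch of that correction (an illustration, not the original software):

```python
import numpy as np

def slant_to_ground_range(slant_range, altitude):
    """Convert side-scan sonar slant range to horizontal ground range
    assuming a flat seafloor: ground = sqrt(slant^2 - altitude^2).
    Returns 0 for returns arriving before the first bottom echo."""
    slant = np.asarray(slant_range, dtype=float)
    g2 = slant ** 2 - altitude ** 2
    return np.sqrt(np.clip(g2, 0.0, None))

# towfish 30 m above the bottom: a 50 m slant range maps to a 40 m ground range
gr = slant_to_ground_range([30.0, 50.0, 130.0], altitude=30.0)
```

Applying this per ping, together with a resampling for along-track scale, is what turns raw sonar records into geometrically consistent imagery.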

  8. Accelerated testing of space mechanisms

    NASA Technical Reports Server (NTRS)

    Murray, S. Frank; Heshmat, Hooshang

    1995-01-01

    This report contains a review of various existing life prediction techniques used for a wide range of space mechanisms. Life prediction techniques utilized in other non-space fields, such as turbine engine design, are also reviewed for applicability to many space mechanism issues. The development of new concepts on how various tribological processes are involved in the life of the complex mechanisms used for space applications is examined. A 'roadmap' for the complete implementation of a tribological prediction approach for complex mechanical systems, including standard procedures for test planning, analytical models for life prediction, and experimental verification of the life prediction and accelerated testing techniques, is discussed. A plan is presented to demonstrate a method for predicting the life and/or performance of a selected space mechanism mechanical component.

  9. Water reuse systems: A review of the principal components

    USGS Publications Warehouse

    Lucchetti, G.; Gray, G.A.

    1988-01-01

    Principal components of water reuse systems include ammonia removal, disease control, temperature control, aeration, and particulate filtration. Effective ammonia removal techniques include air stripping, ion exchange, and biofiltration. Selection of a particular technique largely depends on site-specific requirements (e.g., space, existing water quality, and fish densities). Disease control, although often overlooked, is a major problem in reuse systems. Pathogens can be controlled most effectively with ultraviolet radiation, ozone, or chlorine. Simple and inexpensive methods are available to increase oxygen concentration and eliminate gas supersaturation; these include commercial aerators, air injectors, and packed columns. Temperature control is a major advantage of reuse systems, but the equipment required can be expensive, particularly if water temperature must be rigidly controlled and ambient air temperature fluctuates. Filtration can be readily accomplished with a hydrocyclone or sand filter that increases overall system efficiency. Based on the criteria of adaptability, efficiency, and reasonable cost, we recommend components for a small water reuse system.

  10. Capabilities of the RENEB network for research and large scale radiological and nuclear emergency situations.

    PubMed

    Monteiro Gil, Octávia; Vaz, Pedro; Romm, Horst; De Angelis, Cinzia; Antunes, Ana Catarina; Barquinero, Joan-Francesc; Beinke, Christina; Bortolin, Emanuela; Burbidge, Christopher Ian; Cucu, Alexandra; Della Monaca, Sara; Domene, Mercedes Moreno; Fattibene, Paola; Gregoire, Eric; Hadjidekova, Valeria; Kulka, Ulrike; Lindholm, Carita; Meschini, Roberta; M'Kacher, Radhia; Moquet, Jayne; Oestreicher, Ursula; Palitti, Fabrizio; Pantelias, Gabriel; Montoro Pastor, Alegria; Popescu, Irina-Anca; Quattrini, Maria Cristina; Ricoul, Michelle; Rothkamm, Kai; Sabatier, Laure; Sebastià, Natividad; Sommer, Sylwester; Terzoudi, Georgia; Testa, Antonella; Trompier, François; Vral, Anne

    2017-01-01

    To identify and assess, among the participants in the RENEB (Realizing the European Network of Biodosimetry) project, the emergency preparedness, response capabilities and resources that can be deployed in the event of a radiological or nuclear accident/incident affecting a large number of individuals. These capabilities include available biodosimetry techniques, infrastructure, human resources (existing trained staff), financial and organizational resources (including the role of national contact points and their articulation with other stakeholders in emergency response) as well as robust quality control/assurance systems. A survey was prepared and sent to the RENEB partners in order to acquire information about the existing, operational techniques and infrastructure in the laboratories of the different RENEB countries and to assess the capacity of response in the event of radiological or nuclear accident involving mass casualties. The survey focused on several main areas: laboratory's general information, country and staff involved in biological and physical dosimetry; retrospective assays used, the number of assays available per laboratory and other information related to biodosimetry and emergency preparedness. Following technical intercomparisons amongst RENEB members, an update of the survey was performed one year later concerning the staff and the available assays. The analysis of RENEB questionnaires allowed a detailed assessment of existing capacity of the RENEB network to respond to nuclear and radiological emergencies. This highlighted the key importance of international cooperation in order to guarantee an effective and timely response in the event of radiological or nuclear accidents involving a considerable number of casualties. 
The deployment of the scientific and technical capabilities existing within the RENEB network seems essential to help countries with little or no capacity for biological or physical dosimetry, or countries overwhelmed in the event of a radiological or nuclear accident involving a large number of individuals.

  11. Overview of Supersonic Aerodynamics Measurement Techniques in the NASA Langley Unitary Plan Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Erickson, Gary E.

    2007-01-01

    An overview is given of selected measurement techniques used in the NASA Langley Research Center (NASA LaRC) Unitary Plan Wind Tunnel (UPWT) to determine the aerodynamic characteristics of aerospace vehicles operating at supersonic speeds. A broad definition of a measurement technique is adopted in this paper and is any qualitative or quantitative experimental approach that provides information leading to the improved understanding of the supersonic aerodynamic characteristics. On-surface and off-surface measurement techniques used to obtain discrete (point) and global (field) measurements and planar and global flow visualizations are described, and examples of all methods are included. The discussion is limited to recent experiences in the UPWT and is, therefore, not an exhaustive review of existing experimental techniques. The diversity and high quality of the measurement techniques and the resultant data illustrate the capabilities of a ground-based experimental facility and the key role that it plays in the advancement of our understanding, prediction, and control of supersonic aerodynamics.

  12. Intelligent and integrated techniques for coalbed methane (CBM) recovery and reduction of greenhouse gas emission.

    PubMed

    Qianting, Hu; Yunpei, Liang; Han, Wang; Quanle, Zou; Haitao, Sun

    2017-07-01

    Coalbed methane (CBM) recovery is a crucial approach to realizing the exploitation of a clean energy source and the reduction of greenhouse gas emissions. In the past 10 years, remarkable achievements in CBM recovery have been obtained in China. However, some key difficulties still exist, such as long borehole drilling in complicated geological conditions and poor gas drainage performance due to low permeability. In this study, intelligent and integrated techniques for CBM recovery are introduced. These integrated techniques mainly include underground CBM recovery techniques and ground well CBM recovery techniques. The underground CBM recovery techniques consist of the borehole formation technique, gas concentration improvement technique, and permeability enhancement technique. According to the division of the mining-induced disturbance area, the ground well arrangement area and well structure type in the mining-induced disturbance developing area and mining-induced disturbance stable area are optimized to significantly improve ground well CBM recovery. Besides, automatic devices such as a drilling pipe installation device are also developed to achieve remote control of data recording, which makes the integrated techniques intelligent. These techniques can provide key solutions to some long-term difficulties in CBM recovery.

  13. Pest control: A modelling approach. Comment on “Multiscale approach to pest insect monitoring: Random walks, pattern formation, synchronization, and networks” by S. Petrovskii, N. Petrovskaya and D. Bearup

    NASA Astrophysics Data System (ADS)

    Tyson, Rebecca C.

    2014-09-01

    Successful food production results in the delivery to market of beautiful produce, free of damage from insects. All of that produce, however, is an excellent and plentiful food source, and nature has evolved a multitude of insects that compete with humans for access. There exist a number of management strategies to combat pests, including traditional crop rotation and companion planting techniques, as well as more sophisticated techniques including mating disruption using pheromones and the application of chemical sprays. Chemical sprays are extremely effective, and are in widespread use around the globe [1,12,20]. Indeed, pesticides are the dominant form of pest management in current use [10,20].

  14. Chemical vapor deposition growth

    NASA Technical Reports Server (NTRS)

    Ruth, R. P.; Manasevit, H. M.; Kenty, J. L.; Moudy, L. A.; Simpson, W. I.; Yang, J. J.

    1976-01-01

    The chemical vapor deposition (CVD) method for the growth of Si sheet on inexpensive substrate materials is investigated. The objective is to develop CVD techniques for producing large areas of Si sheet on inexpensive substrate materials, with sheet properties suitable for fabricating solar cells meeting the technical goals of the Low Cost Silicon Solar Array Project. Specific areas covered include: (1) modification and test of existing CVD reactor system; (2) identification and/or development of suitable inexpensive substrate materials; (3) experimental investigation of CVD process parameters using various candidate substrate materials; (4) preparation of Si sheet samples for various special studies, including solar cell fabrication; (5) evaluation of the properties of the Si sheet material produced by the CVD process; and (6) fabrication and evaluation of experimental solar cell structures, using standard and near-standard processing techniques.

  15. Environmental stress cracking of polymers

    NASA Technical Reports Server (NTRS)

    Mahan, K. I.

    1980-01-01

    A two-point bending method for use in studying the environmental stress cracking and crazing phenomena is described and demonstrated for a variety of polymer/solvent systems. Critical strain values obtained from these curves are reported for various polymer/solvent systems, including a considerable number of systems for which critical strain values have not been previously reported. Polymers studied using this technique include polycarbonate (PC), ABS, high impact styrene (HIS), polyphenylene oxide (PPO), and polymethyl methacrylate (PMMA). Critical strain values obtained using this method compared favorably with existing data. The major advantage of the technique is the ability to obtain time vs. strain curves over a short period of time. The data obtained suggest that over a short period of time the transition in most of the polymer/solvent systems is more gradual than previously believed.
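    For context, the outer-fiber strain of a thin strip bent to a known radius follows from beam geometry, and that is the quantity a bending jig controls. A minimal sketch using the standard thin-strip approximation (textbook geometry under the assumption t << R, not the paper's exact fixture formula):

```python
def bending_strain(thickness, radius):
    """Outer-fiber strain of a thin strip bent to radius `radius`
    (measured to the neutral axis): epsilon = t / (2 * R).
    Standard thin-strip approximation; valid for t << R."""
    return thickness / (2.0 * radius)

# a 1 mm sheet bent to a 50 mm radius puts about 1% strain on the outer fiber
eps = bending_strain(1.0, 50.0)
```

Sweeping the bend radius along a jig is what produces the strain axis of the time-vs-strain curves described in the abstract.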

  16. Digital I and C system upgrade integration technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, H. W.; Shih, C.; Wang, J. R.

    2012-07-01

    This work developed an integration technique for digital I and C system upgrades, by which a utility can replace its I and C systems step by step in a systematic manner. The Inst. of Nuclear Energy Research (INER) developed a digital Instrumentation and Control (I and C) replacement integration technique on the basis of the requirements of the three existing nuclear power plants (NPPs) in Taiwan, which are Chin-Shan (CS) NPP, Kuo-Sheng (KS) NPP, and Maanshan (MS) NPP, and also developed the related Critical Digital Review (CDR) procedure. The digital I and C replacement integration technique includes: (1) Establishment of Nuclear Power Plant Digital Replacement Integration Guideline, (2) Preliminary Investigation on I and C System Digitalization, (3) Evaluation on I and C System Digitalization, and (4) Establishment of I and C System Digitalization Architectures. These works can serve as a reference for performing I and C system digital replacement integration at the three existing NPPs of Taiwan Power Company (TPC). A CDR is the review for a critical system digital I and C replacement. The major reference for this procedure is EPRI TR-1011710 (2005), 'Handbook for Evaluating Critical Digital Equipment and Systems', published by the Electric Power Research Inst. (EPRI). With this document, INER developed a TPC-specific CDR procedure. Currently, CDR is one of the policies for digital I and C replacement in TPC. The contents of this CDR procedure include: Scope, Responsibility, Operation Procedure, Operation Flow Chart, and CDR review items. The CDR review items include the comparison of the design change, Software Verification and Validation (SV&V), Failure Mode and Effects Analysis (FMEA), Evaluation of Diversity and Defense-in-depth (D3), Evaluation of Watchdog Timer, Evaluation of Electromagnetic Compatibility (EMC), Evaluation of Grounding for System/Component, Seismic Evaluation, Witness and Inspection, and Lessons Learnt from Digital I and C Failure Events. A solid review can assure the quality of the digital I and C system replacement.

  17. From Phenomena to Objects: Segmentation of Fuzzy Objects and its Application to Oceanic Eddies

    NASA Astrophysics Data System (ADS)

    Wu, Qingling

    A challenging image analysis problem that has received limited attention to date is the isolation of fuzzy objects---i.e. those with inherently indeterminate boundaries---from continuous field data. This dissertation seeks to bridge the gap between, on the one hand, the recognized need for Object-Based Image Analysis of fuzzy remotely sensed features, and on the other, the optimization of existing image segmentation techniques for the extraction of more discretely bounded features. Using mesoscale oceanic eddies as a case study of a fuzzy object class evident in Sea Surface Height Anomaly (SSHA) imagery, the dissertation demonstrates firstly, that the widely used region-growing and watershed segmentation techniques can be optimized and made comparable in the absence of ground truth data using the principle of parsimony. However, they both have significant shortcomings, with the region growing procedure creating contour polygons that do not follow the shape of eddies while the watershed technique frequently subdivides eddies or groups together separate eddy objects. Secondly, it was determined that these problems can be remedied by using a novel Non-Euclidian Voronoi (NEV) tessellation technique. NEV is effective in isolating the extrema associated with eddies in SSHA data while using a non-Euclidian cost-distance based procedure (based on cumulative gradients in ocean height) to define the boundaries between fuzzy objects. Using this procedure as the first stage in isolating candidate eddy objects, a novel "region-shrinking" multicriteria eddy identification algorithm was developed that includes consideration of shape and vorticity. Eddies identified by this region-shrinking technique compare favorably with those identified by existing techniques, while simplifying and improving existing automated eddy detection algorithms. 
However, it also tends to find a larger number of eddies as a result of its ability to separate what other techniques identify as connected eddies. The research presented here is of significance not only to eddy research in oceanography, but also to other areas of Earth System Science for which the automated detection of features lacking rigid boundary definitions is of importance.
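    The cost-distance idea behind the Non-Euclidian Voronoi tessellation can be sketched with a small Dijkstra-style flood: each pixel is assigned to whichever seed (extremum) it can reach at least cumulative cost, with the step cost taken here as the absolute height difference between neighboring pixels. This is a simplified stand-in for the dissertation's cumulative-gradient cost, run on invented synthetic data:

```python
import heapq
import numpy as np

def cost_distance_labels(field, seeds):
    """Assign each pixel to the nearest seed under a non-Euclidean cost
    distance, where stepping between 4-neighbors costs the absolute
    height difference (a simplified cumulative-gradient cost)."""
    rows, cols = field.shape
    dist = np.full(field.shape, np.inf)
    label = np.full(field.shape, -1, dtype=int)
    heap = []
    for k, (r, c) in enumerate(seeds):
        dist[r, c] = 0.0
        label[r, c] = k
        heapq.heappush(heap, (0.0, r, c, k))
    while heap:
        d, r, c, k = heapq.heappop(heap)
        if d > dist[r, c]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + abs(field[nr, nc] - field[r, c])
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    label[nr, nc] = k
                    heapq.heappush(heap, (nd, nr, nc, k))
    return label

# two synthetic "eddies": a high and a low separated by a steep gradient
y, x = np.mgrid[0:40, 0:40]
ssha = np.exp(-((x - 10) ** 2 + (y - 20) ** 2) / 30.0) \
     - np.exp(-((x - 30) ** 2 + (y - 20) ** 2) / 30.0)
labels = cost_distance_labels(ssha, [(20, 10), (20, 30)])
```

Pixels on either side of the high-gradient region between the two Gaussian bumps land in different cells, because crossing it is expensive in cost distance even when it is short in Euclidean distance.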

  18. Cell Membrane Coating Nanotechnology.

    PubMed

    Fang, Ronnie H; Kroll, Ashley V; Gao, Weiwei; Zhang, Liangfang

    2018-06-01

    Nanoparticle-based therapeutic, prevention, and detection modalities have the potential to greatly impact how diseases are diagnosed and managed in the clinic. With the wide range of nanomaterials available, the rational design of nanocarriers on an application-specific basis has become increasingly commonplace. Here, a comprehensive overview is provided on an emerging platform: cell-membrane-coating nanotechnology. As a fundamental unit of biology, cells carry out a wide range of functions, including the remarkable ability to interface and interact with their surrounding environment. Instead of attempting to replicate such functions via synthetic techniques, researchers are now directly leveraging naturally derived cell membranes as a means of bestowing nanoparticles with enhanced biointerfacing capabilities. This top-down technique is facile, highly generalizable, and has the potential to greatly augment existing nanocarriers. Further, the introduction of a natural membrane substrate onto nanoparticle surfaces has enabled additional applications beyond those traditionally associated with nanomedicine. Despite its relative youth, there exists an impressive body of literature on cell membrane coating, which is covered here in detail. Overall, there is still significant room for development, as researchers continue to refine existing workflows while finding new and exciting applications that can take advantage of this developing technology. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Editing Transgenic DNA Components by Inducible Gene Replacement in Drosophila melanogaster

    PubMed Central

    Lin, Chun-Chieh; Potter, Christopher J.

    2016-01-01

    Gene conversions occur when genomic double-strand DNA breaks (DSBs) trigger unidirectional transfer of genetic material from a homologous template sequence. Exogenous or mutated sequence can be introduced through this homology-directed repair (HDR). We leveraged gene conversion to develop a method for genomic editing of existing transgenic insertions in Drosophila melanogaster. The clustered regularly interspaced short palindromic repeats (CRISPR)/Cas9 system is used in the homology assisted CRISPR knock-in (HACK) method to induce DSBs in a GAL4 transgene, which is repaired by a single-genomic transgenic construct containing GAL4 homologous sequences flanking a T2A-QF2 cassette. With two crosses, this technique converts existing GAL4 lines, including enhancer traps, into functional QF2-expressing lines. We used HACK to convert the most commonly-used GAL4 lines (labeling tissues such as neurons, fat, glia, muscle, and hemocytes) to QF2 lines. We also identified regions of the genome that exhibited differential efficiencies of HDR. The HACK technique is robust and readily adaptable for targeting and replacement of other genomic sequences, and could be a useful approach to repurpose existing transgenes as new genetic reagents become available. PMID:27334272

  20. A step-by-step guide to office-based sperm retrieval for obstructive azoospermia

    PubMed Central

    Mills, Jesse N.

    2017-01-01

    A variety of surgical options exists for sperm retrieval in the setting of obstructive azoospermia (OA). With appropriate preparation, the majority of these techniques can safely be performed in the office with local anesthesia and with or without monitored anesthesia care (MAC). The available techniques include percutaneous options such as percutaneous epididymal sperm aspiration (PESA) and testicular sperm aspiration (TESA), as well as open techniques that include testicular sperm extraction (TESE) and microsurgical epididymal sperm aspiration (MESA). In addition to providing a step-by-step description of each available approach, we introduce and describe a new technique for sperm retrieval for OA called minimally invasive epididymal sperm aspiration (MIESA). The MIESA utilizes a tiny keyhole incision, and the epididymis is exposed without testicular delivery. Epididymal aspiration is performed in the style of MESA, except using loupe magnification rather than an operating microscope. MIESA is a safe, office-based procedure in which millions of motile sperm can be retrieved for cryopreservation. While we prefer the MIESA technique for OA, there remain distinct advantages of each open and percutaneous approach. In the current era of assisted reproductive technology, sperm retrieval rates for OA should approach 100% regardless of the technique. This reference provides a roadmap for both advanced and novice male reproductive surgeons to guide them through every stage of sperm retrieval for OA, including preoperative evaluation, patient selection, procedural techniques, and complications. With the incredible advances in in vitro fertilization (IVF), combined with innovative surgical treatment for male factor infertility in recent years, OA is no longer a barrier for men to become biologic fathers. PMID:28904906

  1. Automated Prescription of Oblique Brain 3D MRSI

    PubMed Central

    Ozhinsky, Eugene; Vigneron, Daniel B.; Chang, Susan M.; Nelson, Sarah J.

    2012-01-01

    Two major difficulties encountered in implementing Magnetic Resonance Spectroscopic Imaging (MRSI) in a clinical setting are limited coverage and difficulty in prescription. The goal of this project was to completely automate the process of 3D PRESS MRSI prescription, including placement of the selection box, saturation bands and shim volume, while maximizing the coverage of the brain. The automated prescription technique included acquisition of an anatomical MRI image, optimization of the oblique selection box parameters, optimization of the placement of OVS saturation bands, and loading of the calculated parameters into a customized 3D MRSI pulse sequence. To validate the technique and compare its performance with existing protocols, 3D MRSI data were acquired from 6 exams from 3 healthy volunteers. To assess the performance of the automated 3D MRSI prescription for patients with brain tumors, the data were collected from 16 exams from 8 subjects with gliomas. This technique demonstrated robust coverage of the tumor, high consistency of prescription and very good data quality within the T2 lesion. PMID:22692829

  2. Minimizing radiation exposure during percutaneous nephrolithotomy.

    PubMed

    Chen, T T; Preminger, G M; Lipkin, M E

    2015-12-01

    Given the recent trend of growing per capita radiation dose from medical sources, there have been increasing concerns over patient radiation exposure. Patients with kidney stones undergoing percutaneous nephrolithotomy (PNL) are at particular risk for high radiation exposure. Several risk factors for increased radiation exposure during PNL exist, including high body mass index, multiple access tracts, and increased stone burden. We herein review recent trends in radiation exposure, radiation exposure during PNL to both patients and urologists, and various approaches to reduce radiation exposure. We discuss incorporating the principles of As Low As Reasonably Achievable (ALARA) into clinical practice and review imaging techniques such as ultrasound and air contrast to guide PNL access. Alternative surgical techniques and approaches to reducing radiation exposure, including retrograde intra-renal surgery, retrograde nephrostomy, endoscopic-guided PNL, and minimally invasive PNL, are also highlighted. It is important for urologists to be aware of these concepts and techniques when treating stone patients with PNL. The discussions outlined will assist urologists in providing patient counseling and a high quality of care.

  3. Applying manifold learning techniques to the CAESAR database

    NASA Astrophysics Data System (ADS)

    Mendoza-Schrock, Olga; Patrick, James; Arnold, Gregory; Ferrara, Matthew

    2010-04-01

Understanding and organizing data is the first step toward exploiting sensor phenomenology for dismount tracking. What image features are good for distinguishing people, and what measurements, or combination of measurements, can be used to classify the dataset by demographics including gender, age, and race? A particular technique, Diffusion Maps, has demonstrated the potential to extract features that intuitively make sense [1]. We want to develop an understanding of this tool by validating existing results on the Civilian American and European Surface Anthropometry Resource (CAESAR) database. This database, provided by the Air Force Research Laboratory (AFRL) Human Effectiveness Directorate and SAE International, is a rich dataset which includes 40 traditional anthropometric measurements of 4400 human subjects. If we can identify the defining features for classification from this database, the next question will be to determine a subset of these features that can be measured from imagery. This paper briefly describes the Diffusion Map technique, shows its potential for dimension reduction of the CAESAR database, and describes interesting problems to be further explored.
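
    The Diffusion Map step mentioned above can be sketched in a few lines. The kernel width, synthetic cluster data, and two-component embedding below are illustrative assumptions, not the CAESAR measurements:

    ```python
    import numpy as np

    def diffusion_map(X, eps, n_components=2, t=1):
        # Gaussian affinity kernel on pairwise squared distances
        sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        K = np.exp(-sq / eps)
        # Row-normalize into a Markov transition matrix
        P = K / K.sum(axis=1, keepdims=True)
        vals, vecs = np.linalg.eig(P)
        order = np.argsort(-vals.real)
        vals, vecs = vals.real[order], vecs.real[:, order]
        # Skip the trivial constant eigenvector (eigenvalue 1)
        return vecs[:, 1:n_components + 1] * vals[1:n_components + 1] ** t

    # Two synthetic "demographic" clusters in a 40-dimensional feature space
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 0.1, (20, 40)), rng.normal(3, 0.1, (20, 40))])
    Y = diffusion_map(X, eps=200.0)
    print(Y.shape)  # (40, 2)
    ```

    On this toy data the first diffusion coordinate cleanly separates the two clusters, which is the kind of low-dimensional structure extraction the abstract reports pursuing on the CAESAR measurements.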

  4. Evaluation of DSS-14 pedestal-review of top surface repair procedures

    NASA Technical Reports Server (NTRS)

    Oesterle, R. G.; Musser, D. W.; Salse, E. A. B.

    1983-01-01

Proposed repair procedures for the top surface of the pedestal supporting the hydrostatic bearing runner for the 64m Antenna are presented. These procedures included: (1) removal of existing grout and concrete to approximately 8 in. below the original concrete surface using a presplitting technique with expansive cement followed by secondary breaking; (2) preparation of the exposed concrete surface, including application of an epoxy bonding agent; and (3) replacement of the removed material with 8 in. of new concrete and 4 in. of new grout.

  5. A Survey of Terrestrial Approaches to the Challenge of Lunar Dust Containment

    NASA Technical Reports Server (NTRS)

    Aguilera, Tatiana; Perry, Jay L.

    2009-01-01

    Numerous technical challenges exist to successfully extend lunar surface exploration beyond the tantalizing first steps of Apollo. Among these is the challenge of lunar dust intrusion into the cabin environment. Addressing this challenge includes the design of barriers to intrusion as well as techniques for removing the dust from the cabin atmosphere. Opportunities exist for adapting approaches employed in dusty industrial operations and pristine manufacturing environments to cabin environmental quality maintenance applications. A survey of process technologies employed by the semiconductor, pharmaceutical, food processing, and mining industries offers insight into basic approaches that may be suitable for adaptation to lunar surface exploration applications.

  6. Rigorous results for the minimal speed of Kolmogorov-Petrovskii-Piscounov monotonic fronts with a cutoff

    NASA Astrophysics Data System (ADS)

    Benguria, Rafael D.; Depassier, M. Cristina; Loss, Michael

    2012-12-01

    We study the effect of a cutoff on the speed of pulled fronts of the one-dimensional reaction diffusion equation. To accomplish this, we first use variational techniques to prove the existence of a heteroclinic orbit in phase space for traveling wave solutions of the corresponding reaction diffusion equation under conditions that include discontinuous reaction profiles. This existence result allows us to prove rigorous upper and lower bounds on the minimal speed of monotonic fronts in terms of the cut-off parameter ɛ. From these bounds we estimate the range of validity of the Brunet-Derrida formula for a general class of reaction terms.
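
    The Brunet-Derrida formula referred to above is, for the classic FKPP nonlinearity f(u) = u(1-u) with unit diffusion (uncut minimal speed 2), the well-known small-ɛ asymptotic; it is quoted here from the broader front-propagation literature rather than from this paper's bounds:

    ```latex
    v(\varepsilon) \simeq 2 - \frac{\pi^{2}}{(\ln \varepsilon)^{2}},
    \qquad \varepsilon \to 0^{+}
    ```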

  7. Image-guided optical measurement of blood oxygen saturation within capillary vessels (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Akons, Kfir; Zeidan, Adel; Yeheskely-Hayon, Daniella; Minai, Limor; Yelin, Dvir

    2016-03-01

    Values of blood oxygenation levels are useful for assessing heart and lung conditions, and are frequently monitored during routine patient care. Independent measurement of the oxygen saturation in capillary blood, which is significantly different from that of arterial blood, is important for diagnosing tissue hypoxia and for increasing the accuracy of existing techniques that measure arterial oxygen saturation. Here, we developed a simple, non-invasive technique for measuring the reflected spectra from individual capillary vessels within a human lip, allowing local measurement of the blood oxygen saturation. The optical setup includes a spatially incoherent broadband light that was focused onto a specific vessel below the lip surface. Backscattered light was imaged by a camera for identifying a target vessel and pointing the illumination beam to its cross section. Scattered light from the vessel was then collected by a single-mode fiber and analyzed by a fast spectrometer. Spectra acquired from small capillary vessels within a volunteer lip showed the characteristic oxyhemoglobin absorption bands in real time and with a high signal-to-noise ratio. Measuring capillary oxygen saturation using this technique would potentially be more accurate compared to existing pulse oximetry techniques due to its insensitivity to the patient's skin color, pulse rate, motion, and medical condition. It could be used as a standalone endoscopic technique for measuring tissue hypoxia or in conjunction with conventional pulse oximetry for a more accurate measurement of oxygen transport in the body.
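
    The local saturation estimate described above amounts to unmixing the measured spectrum into oxy- and deoxyhemoglobin contributions. A minimal least-squares sketch, using synthetic placeholder basis spectra rather than real extinction coefficients:

    ```python
    import numpy as np

    # Placeholder absorbance bases for HbO2 and Hb over 500-600 nm
    # (invented shapes for illustration, not physiological data)
    wavelengths = np.linspace(500, 600, 50)
    eps_hbo2 = 1.0 + np.sin(wavelengths / 10.0)
    eps_hb = 1.0 + np.cos(wavelengths / 10.0)

    # Simulate a measured spectrum at 80% oxygen saturation
    true_so2 = 0.8
    measured = true_so2 * eps_hbo2 + (1 - true_so2) * eps_hb

    # Least-squares unmixing: concentrations of each component
    B = np.column_stack([eps_hbo2, eps_hb])
    c_hbo2, c_hb = np.linalg.lstsq(B, measured, rcond=None)[0]
    so2 = c_hbo2 / (c_hbo2 + c_hb)
    print(round(so2, 2))  # → 0.8
    ```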

  8. A study of microwave downconverters operating in the Ku band

    NASA Technical Reports Server (NTRS)

    Fellers, R. G.; Simpson, T. L.; Tseng, B.

    1982-01-01

A computer program for parametric amplifier design is developed with special emphasis on practical design considerations for microwave integrated circuit degenerate amplifiers. Precision measurement techniques are developed to obtain a more realistic varactor equivalent circuit. The existing theory of the parametric amplifier is modified to include this equivalent circuit, and microwave properties, such as loss characteristics and circuit discontinuities, are investigated.

  9. Research directions in large scale systems and decentralized control

    NASA Technical Reports Server (NTRS)

    Tenney, R. R.

    1980-01-01

    Control theory provides a well established framework for dealing with automatic decision problems and a set of techniques for automatic decision making which exploit special structure, but it does not deal well with complexity. The potential exists for combining control theoretic and knowledge based concepts into a unified approach. The elements of control theory are diagrammed, including modern control and large scale systems.

  10. Social Circles Detection from Ego Network and Profile Information

    DTIC Science & Technology

    2014-12-19

...algorithm used to infer k-clique communities is exponential, which makes this technique unfeasible when treating egonets with a large number of users...atic when considering RBMs. This issue was solved by implementing a sparsity treatment with the RBM algorithm. (ii) The ground truth was...

  11. Incorporating energy conservation techniques in the operation of existing LeRC R and D facilities. [energy policy/NASA programs]

    NASA Technical Reports Server (NTRS)

    Nieberding, W. C.

    1975-01-01

A general discussion of various methods which can be used to reduce energy consumption is presented. A very brief description of Lewis Research Center facilities is given and the energy reduction methods are discussed relative to them. Some specific examples (i.e., automated equipment and data systems) of the implementation of the energy reduction methods are included.

  12. N-Sulfinylimine compounds, R-NSO: a chemistry family with strong temperament

    NASA Astrophysics Data System (ADS)

    Romano, R. M.; Della Védova, C. O.

    2000-04-01

In this review, an update on the structural properties and theoretical studies of N-sulfinylimine compounds (R-NSO) is reported. These were deduced using several experimental techniques: gas electron diffraction (GED), X-ray diffraction, 17O NMR, ultraviolet-visible absorption spectroscopy (UV-Vis), FTIR (including matrix studies of molecular randomisation) and Raman (including pre-resonant Raman spectra). The data are compared with those obtained from theoretical calculations. With these tools, the excited-state geometry was calculated for these kinds of compounds using time-dependent theory. The existence of a pre-resonant Raman effect in R-NSO compounds was reported recently. The configuration of R-NSO compounds was checked for this series, confirming the existence of only one syn configuration. This finding is corroborated by theoretical calculations. The method of preparation is also summarised.

  13. The influence of surface finishing methods on touch-sensitive reactions

    NASA Astrophysics Data System (ADS)

    Kukhta, M. S.; Sokolov, A. P.; Krauinsh, P. Y.; Kozlova, A. D.; Bouchard, C.

    2017-02-01

This paper describes modern technological development trends in jewelry design. In the jewelry industry, new trends associated with the introduction of non-traditional materials and finishing techniques are appearing. The existing information-oriented society heightens the visual aesthetics of new jewelry forms, decoration techniques (depth and surface) and the synthesis of different materials, which together reveal a bias towards positive effects of visual design. Today, the jewelry industry includes not only traditional techniques, but also such improved techniques as computer-assisted design, 3D prototyping and other alternatives that raise the level of jewelry material processing. The authors present the specific features of ornamental pattern design, decoration types (depth and surface) and a comparative analysis of different approaches to surface finishing. Identification of the appearance and effect of jewelry is based on proposed evaluation criteria, in which an advanced basis for visual aesthetics is predicated on touch-sensitive responses.

  14. Comparing the frequency of physical examination techniques performed by associate and baccalaureate degree prepared nurses in clinical practice: does education make a difference?

    PubMed

    Giddens, Jean

    2006-03-01

Rapid changes in health care have underscored the need for reform in health professions education, including nursing education. One of many problems cited in the nursing and other health sciences education literature is overcrowded curricula; therefore, an evaluation of content is necessary. The purpose of this study was to determine whether differences exist in the frequency with which physical examination techniques are performed by associate and baccalaureate degree prepared nurses. Participants completed a survey on their performance of various physical examination techniques. A Mann-Whitney test showed no differences between the two groups in the frequency of techniques performed. A small negative correlation was found between frequency and years of experience for the nutrition assessment category. A comparison of physical examination content covered in baccalaureate and associate degree nursing programs is needed to further understand these findings.

  15. Retinal Image Simulation of Subjective Refraction Techniques

    PubMed Central

    Perches, Sara; Collados, M. Victoria; Ares, Jorge

    2016-01-01

    Refraction techniques make it possible to determine the most appropriate sphero-cylindrical lens prescription to achieve the best possible visual quality. Among these techniques, subjective refraction (i.e., patient’s response-guided refraction) is the most commonly used approach. In this context, this paper’s main goal is to present a simulation software that implements in a virtual manner various subjective-refraction techniques—including Jackson’s Cross-Cylinder test (JCC)—relying all on the observation of computer-generated retinal images. This software has also been used to evaluate visual quality when the JCC test is performed in multifocal-contact-lens wearers. The results reveal this software’s usefulness to simulate the retinal image quality that a particular visual compensation provides. Moreover, it can help to gain a deeper insight and to improve existing refraction techniques and it can be used for simulated training. PMID:26938648

  16. Labral cuff refixation in the hip: rationale and operative technique for preserving the chondrolabral interface for labral repair: a case series

    PubMed Central

    Filan, David

    2018-01-01

    ABSTRACT Arthroscopic labral ‘takedown’ and refixation is utilized to permit adequate visualization and resection of the acetabular rim deformity, in patients with pincer or mixed femoroacetabular impingement. Deficiencies exist in present techniques, which include disruption of vital anatomical support and vascular structures to the labrum and chondrolabral junction, drill or anchor articular penetration risk, bunching, elevation and instability of the labrum. A new operative technique is described which preserves the important chondrolabral interface, accurately restoring the ‘flap seal’ of the acetabular labrum while minimizing vascular disruption and reducing the risk of drill and anchor penetration. A prospective series of 123 consecutive cases of pincer or mixed femoroacetabular impingement, treated with arthroscopic labral cuff refixation and preservation of the chondrolabral interface, is reported; operative technique and 2-year outcomes are presented. PMID:29423255

  17. How Many Wolves (Canis lupus) Fit into Germany? The Role of Assumptions in Predictive Rule-Based Habitat Models for Habitat Generalists

    PubMed Central

    Fechter, Dominik; Storch, Ilse

    2014-01-01

    Due to legislative protection, many species, including large carnivores, are currently recolonizing Europe. To address the impending human-wildlife conflicts in advance, predictive habitat models can be used to determine potentially suitable habitat and areas likely to be recolonized. As field data are often limited, quantitative rule based models or the extrapolation of results from other studies are often the techniques of choice. Using the wolf (Canis lupus) in Germany as a model for habitat generalists, we developed a habitat model based on the location and extent of twelve existing wolf home ranges in Eastern Germany, current knowledge on wolf biology, different habitat modeling techniques and various input data to analyze ten different input parameter sets and address the following questions: (1) How do a priori assumptions and different input data or habitat modeling techniques affect the abundance and distribution of potentially suitable wolf habitat and the number of wolf packs in Germany? (2) In a synthesis across input parameter sets, what areas are predicted to be most suitable? (3) Are existing wolf pack home ranges in Eastern Germany consistent with current knowledge on wolf biology and habitat relationships? Our results indicate that depending on which assumptions on habitat relationships are applied in the model and which modeling techniques are chosen, the amount of potentially suitable habitat estimated varies greatly. Depending on a priori assumptions, Germany could accommodate between 154 and 1769 wolf packs. The locations of the existing wolf pack home ranges in Eastern Germany indicate that wolves are able to adapt to areas densely populated by humans, but are limited to areas with low road densities. Our analysis suggests that predictive habitat maps in general, should be interpreted with caution and illustrates the risk for habitat modelers to concentrate on only one selection of habitat factors or modeling technique. PMID:25029506
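
    A rule-based habitat model of the kind the abstract compares can be sketched as simple raster thresholding. The grids, thresholds, home-range area, and cell size below are invented for illustration, not taken from the study:

    ```python
    import numpy as np

    # Toy raster layers: road density (km of road per km^2) and forest
    # cover fraction on a 100x100 grid (synthetic, uniformly random)
    rng = np.random.default_rng(1)
    road_density = rng.uniform(0, 2, (100, 100))
    forest_cover = rng.uniform(0, 1, (100, 100))

    # A priori habitat rules: a cell is suitable when roads are sparse
    # and forest cover is high (thresholds are hypothetical)
    suitable = (road_density < 0.6) & (forest_cover > 0.4)

    # Translate suitable area into a pack estimate (assumed parameters)
    home_range_km2 = 200.0   # assumed area needed per wolf pack
    cell_km2 = 4.0           # assumed raster resolution
    packs = suitable.sum() * cell_km2 / home_range_km2
    print(int(packs))
    ```

    Changing either threshold shifts the pack estimate substantially, which illustrates the abstract's point that a priori assumptions drive the predicted number of packs.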

  18. A review of risk management process in construction projects of developing countries

    NASA Astrophysics Data System (ADS)

    Bahamid, R. A.; Doh, S. I.

    2017-11-01

In the construction industry, the risk management concept is a less widely used technique. There are three main stages in the systematic approach to risk management in the construction industry: a) risk response; b) risk analysis and evaluation; and c) risk identification. The high risk associated with the construction business affects each of its participants, while operational analysis and management of construction-related risks remain an enormous task for practitioners of the industry. This paper reviews the existing literature on construction project risk management in developing countries, with a specific focus on the risk management process. The literature lacks a comprehensive risk management process approach capable of capturing risk impacts on diverse project objectives. This review aims to discover the techniques most frequently used in risk identification and analysis, to clarify the different classifications of risk sources in the existing literature of developing countries, and to identify future research directions on project risk in the construction sector of developing countries.

  19. ANALYSIS OF RADON MITIGATION TECHNIQUES USED IN EXISTING U.S. HOUSES

    EPA Science Inventory

    This paper reviews the full range of techniques that have been installed in existing US houses for the purpose of reducing indoor radon concentrations resulting from soil gas entry. The review addresses the performance, installation and operating costs, applicability, mechanisms,...

  20. The HVT technique and the 'uncertainty' relation for central potentials

    NASA Astrophysics Data System (ADS)

    Grypeos, M. E.; Koutroulos, C. G.; Oyewumi, K. J.; Petridou, Th

    2004-08-01

The quantum mechanical hypervirial theorems (HVT) technique is used to treat the so-called 'uncertainty' relation for quite a general class of central potential wells, including the (reduced) Poeschl-Teller and the Gaussian one. It is shown that this technique is quite suitable for deriving an approximate analytic expression, in the form of a truncated power series expansion, for the dimensionless product P_{nl} ≡ ⟨r²⟩_{nl}⟨p²⟩_{nl}/ħ², for every (deeply) bound state of a particle moving non-relativistically in the well, provided that a (dimensionless) parameter s is sufficiently small. Attention is also paid to a number of cases, among the limited existing ones, in which exact analytic or semi-analytic expressions for P_{nl} can be derived. Finally, numerical results are given and discussed.

  1. The fix for tough spots

    NASA Technical Reports Server (NTRS)

    Anders, John B.; Walsh, Michael J.; Bushnell, Dennis M.

    1988-01-01

Modern turbulence-control techniques are discussed. Particular attention is given to retrofit techniques such as riblets and large-eddy breakup (LEBU) devices which use passive elements suitable for a variety of existing vehicles with minimum added complexity. Riblets are small flow-aligned grooves in the aircraft skin that damp turbulence and reduce skin friction; the mechanism of riblet drag reduction derives from the enhancement of turbulence-altering, transverse viscous forces by strong spanwise surface geometry gradients. LEBUs are thin plates or ribbons suspended in a turbulent boundary layer to sever or break up the large vortices that form the convoluted outer edge of the layer. Other turbulence-control techniques are discussed, including one that involves the injection of control vortices into the turbulent boundary layer to modify or substitute for large-eddy structures.

  2. Cellular-based preemption system

    NASA Technical Reports Server (NTRS)

    Bachelder, Aaron D. (Inventor)

    2011-01-01

    A cellular-based preemption system that uses existing cellular infrastructure to transmit preemption related data to allow safe passage of emergency vehicles through one or more intersections. A cellular unit in an emergency vehicle is used to generate position reports that are transmitted to the one or more intersections during an emergency response. Based on this position data, the one or more intersections calculate an estimated time of arrival (ETA) of the emergency vehicle, and transmit preemption commands to traffic signals at the intersections based on the calculated ETA. Additional techniques may be used for refining the position reports, ETA calculations, and the like. Such techniques include, without limitation, statistical preemption, map-matching, dead-reckoning, augmented navigation, and/or preemption optimization techniques, all of which are described in further detail in the above-referenced patent applications.
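
    The core ETA calculation described above can be sketched as great-circle distance divided by current speed. The coordinates and speed below are hypothetical, and a real system would refine this estimate with the map-matching and dead-reckoning techniques the patent mentions:

    ```python
    import math

    def eta_seconds(vehicle, intersection, speed_mps):
        """Haversine great-circle distance / speed -> ETA in seconds.
        `vehicle` and `intersection` are (lat, lon) pairs in degrees."""
        R = 6371000.0  # mean Earth radius, meters
        lat1, lon1, lat2, lon2 = map(math.radians, (*vehicle, *intersection))
        dlat, dlon = lat2 - lat1, lon2 - lon1
        a = (math.sin(dlat / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
        dist = 2 * R * math.asin(math.sqrt(a))
        return dist / speed_mps

    # Vehicle ~1.1 km due south of an intersection, travelling at 15 m/s
    print(round(eta_seconds((34.19, -118.17), (34.20, -118.17), 15.0)))  # → 74
    ```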

  3. Preparation and characterization of maghemite nanoparticles from mild steel for magnetically guided drug therapy.

    PubMed

    Kumar, Nitesh; Kulkarni, Kaustubh; Behera, Laxmidhar; Verma, Vivek

    2017-08-01

Maghemite (γ-Fe2O3) nanoparticles for therapeutic applications are prepared from mild steel, but the existing synthesis technique is very cumbersome. The entire process takes around 100 days with multiple steps which lack proper understanding. In the current work, maghemite nanoparticles of cuboidal and spheroidal morphologies were prepared from mild steel chips by a novel cost-effective oil reduction technique for magnetically guided intravascular drug delivery. The technique developed in this work yields isometric sized γ-Fe2O3 nanoparticles in 6 h with higher saturation magnetization as compared to the existing similar solid-state synthesis route. Mass and heat flow kinetics during the heating and quenching steps were studied with the help of finite element simulations. Qualitative and quantitative analysis of the γ-Fe2O3 phase is performed with the help of x-ray diffraction, transmission electron microscopy and x-ray photoelectron spectroscopy. The mechanism for the α-Fe2O3 (haematite) to γ-Fe2O3 (maghemite) phase evolution during the synthesis process is also investigated. Maghemite (γ-Fe2O3) nanoparticles were prepared by a novel cost-effective oil reduction technique, as shown in the accompanying figure. The raw materials included mild steel chips, one of the most abundant engineering materials. These particles can be used as ideal nanocarriers for targeted drug delivery through the vascular network.

  4. Modeling biology using relational databases.

    PubMed

    Peitzsch, Robert M

    2003-02-01

    There are several different methodologies that can be used for designing a database schema; no one is the best for all occasions. This unit demonstrates two different techniques for designing relational tables and discusses when each should be used. These two techniques presented are (1) traditional Entity-Relationship (E-R) modeling and (2) a hybrid method that combines aspects of data warehousing and E-R modeling. The method of choice depends on (1) how well the information and all its inherent relationships are understood, (2) what types of questions will be asked, (3) how many different types of data will be included, and (4) how much data exists.
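
    The hybrid approach described above can be illustrated with a small schema that joins a normalized E-R pair to a warehouse-style fact table; the tables and the biological example are hypothetical, not taken from the unit:

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    # Normalized E-R entities (gene, experiment) plus a warehouse-style
    # fact table of measurements keyed to both dimensions
    con.executescript("""
    CREATE TABLE gene (gene_id INTEGER PRIMARY KEY, symbol TEXT UNIQUE);
    CREATE TABLE experiment (exp_id INTEGER PRIMARY KEY, run_date TEXT);
    CREATE TABLE expression_fact (
        gene_id INTEGER REFERENCES gene(gene_id),
        exp_id  INTEGER REFERENCES experiment(exp_id),
        level   REAL,
        PRIMARY KEY (gene_id, exp_id)
    );
    """)
    con.execute("INSERT INTO gene VALUES (1, 'TP53')")
    con.execute("INSERT INTO experiment VALUES (1, '2003-01-15')")
    con.execute("INSERT INTO expression_fact VALUES (1, 1, 2.7)")

    # A typical analytical question: expression level per gene
    row = con.execute("""
        SELECT g.symbol, f.level FROM expression_fact f
        JOIN gene g ON g.gene_id = f.gene_id
    """).fetchone()
    print(row)  # ('TP53', 2.7)
    ```

    The E-R side keeps entities and relationships explicit, while the fact table keeps the bulk measurement data in one wide, queryable place, matching the trade-off the unit discusses.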

  5. Progress in The Semantic Analysis of Scientific Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.

  6. State of the Art Assessment of Simulation in Advanced Materials Development

    NASA Technical Reports Server (NTRS)

    Wise, Kristopher E.

    2008-01-01

    Advances in both the underlying theory and in the practical implementation of molecular modeling techniques have increased their value in the advanced materials development process. The objective is to accelerate the maturation of emerging materials by tightly integrating modeling with the other critical processes: synthesis, processing, and characterization. The aims of this report are to summarize the state of the art of existing modeling tools and to highlight a number of areas in which additional development is required. In an effort to maintain focus and limit length, this survey is restricted to classical simulation techniques including molecular dynamics and Monte Carlo simulations.

  7. The Commercial Challenges Of Pacs

    NASA Astrophysics Data System (ADS)

    Vanden Brink, John A.

    1984-08-01

The increasing use of digital imaging techniques creates a need for improved methods of digital processing, communication and archiving. However, the commercial opportunity depends on the resolution of a number of issues. These issues include proof that digital processes are more cost effective than present techniques, implementation of information system support in the imaging activity, implementation of industry standards, conversion of analog images to digital formats, definition of clinical needs, the implications of the purchase decision, and technology requirements. In spite of these obstacles, a market is emerging, served by new and existing companies, that may reach $500 million (U.S.) by 1990 for equipment and supplies.

  8. A method for digital image registration using a mathematical programming technique

    NASA Technical Reports Server (NTRS)

    Yao, S. S.

    1973-01-01

    A new algorithm based on a nonlinear programming technique to correct the geometrical distortions of one digital image with respect to another is discussed. This algorithm promises to be superior to existing ones in that it is capable of treating localized differential scaling, translational and rotational errors over the whole image plane. A series of piece-wise 'rubber-sheet' approximations are used, constrained in such a manner that a smooth approximation over the entire image can be obtained. The theoretical derivation is included. The result of using the algorithm to register four channel S065 Apollo IX digitized photography over Imperial Valley, California, is discussed in detail.
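
    A single 'rubber-sheet' piece of the kind described can be sketched as a least-squares affine fit between control points; the smoothness constraints that tie neighboring pieces together in the actual algorithm are omitted here, and the points are synthetic:

    ```python
    import numpy as np

    def fit_affine(src, dst):
        """Least-squares affine transform mapping src -> dst control points.
        One such local fit per image patch approximates a piecewise
        rubber-sheet correction (a sketch, not the paper's constrained
        formulation)."""
        A = np.hstack([src, np.ones((len(src), 1))])    # rows [x, y, 1]
        coef, *_ = np.linalg.lstsq(A, dst, rcond=None)  # 3x2 parameter matrix
        return coef

    # Synthetic control points: a small rotation/scale plus a translation
    src = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
    dst = src @ np.array([[1.0, 0.1], [-0.1, 1.0]]) + np.array([2.0, 3.0])

    M = fit_affine(src, dst)
    warped = np.hstack([src, np.ones((4, 1))]) @ M
    print(np.allclose(warped, dst))  # True
    ```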

  9. Report of the Terrestrial Bodies Science Working Group. Volume 9: Complementary research and development

    NASA Technical Reports Server (NTRS)

    Fanale, F. P.; Kaula, W. M.; Mccord, T. B.; Trombka, J. L.

    1977-01-01

    Topics discussed include the need for: the conception and development of a wide spectrum of experiments, instruments, and vehicles in order to derive the proper return from an exploration program; the effective use of alternative methods of data acquisition involving ground-based, airborne and near Earth orbital techniques to supplement spacraft mission; and continued reduction and analysis of existing data including laboratory and theoretical studies in order to benefit fully from experiments and to build on the past programs toward a logical and efficient exploration of the solar system.

  10. Advanced Curation Preparation for Mars Sample Return and Cold Curation

    NASA Technical Reports Server (NTRS)

    Fries, M. D.; Harrington, A. D.; McCubbin, F. M.; Mitchell, J.; Regberg, A. B.; Snead, C.

    2017-01-01

    NASA Curation is tasked with the care and distribution of NASA's sample collections, such as the Apollo lunar samples and cometary material collected by the Stardust spacecraft. Curation is also mandated to perform Advanced Curation research and development, which includes improving the curation of existing collections as well as preparing for future sample return missions. Advanced Curation has identified a suite of technologies and techniques that will require attention ahead of Mars sample return (MSR) and missions with cold curation (CCur) requirements, perhaps including comet sample return missions.

  11. Application of Behavior Change Techniques in a Personalized Nutrition Electronic Health Intervention Study: Protocol for the Web-Based Food4Me Randomized Controlled Trial

    PubMed Central

    Macready, Anna L; Fallaize, Rosalind; Butler, Laurie T; Ellis, Judi A; Kuznesof, Sharron; Frewer, Lynn J; Celis-Morales, Carlos; Livingstone, Katherine M; Araújo-Soares, Vera; Fischer, Arnout RH; Stewart-Knox, Barbara J; Mathers, John C

    2018-01-01

    Background To determine the efficacy of behavior change techniques applied in dietary and physical activity intervention studies, it is first necessary to record and describe techniques that have been used during such interventions. Published frameworks used in dietary and smoking cessation interventions undergo continuous development, and most are not adapted for Web-based delivery. The Food4Me study (N=1607) provided the opportunity to use existing frameworks to describe standardized Web-based techniques employed in a large-scale, internet-based intervention to change dietary behavior and physical activity. Objective The aims of this study were (1) to describe techniques embedded in the Food4Me study design and explain the selection rationale and (2) to demonstrate the use of behavior change technique taxonomies, develop standard operating procedures for training, and identify strengths and limitations of the Food4Me framework that will inform its use in future studies. Methods The 6-month randomized controlled trial took place simultaneously in seven European countries, with participants receiving one of four levels of personalized advice (generalized, intake-based, intake+phenotype–based, and intake+phenotype+gene–based). A three-phase approach was taken: (1) existing taxonomies were reviewed and techniques were identified a priori for possible inclusion in the Food4Me study, (2) a standard operating procedure was developed to maintain consistency in the use of methods and techniques across research centers, and (3) the Food4Me behavior change technique framework was reviewed and updated post intervention. An analysis of excluded techniques was also conducted. Results Of 46 techniques identified a priori as being applicable to Food4Me, 17 were embedded in the intervention design; 11 were from a dietary taxonomy, and 6 from a smoking cessation taxonomy. 
In addition, the four-category smoking cessation framework structure was adopted for clarity of communication. Smoking cessation texts were adapted for dietary use where necessary. A posteriori, a further 9 techniques were included. Examination of excluded items highlighted the distinction between techniques considered appropriate for face-to-face versus internet-based delivery. Conclusions The use of existing taxonomies facilitated the description and standardization of techniques used in Food4Me. We recommend that for complex studies of this nature, technique analysis should be conducted a priori to develop standardized procedures and training and reviewed a posteriori to audit the techniques actually adopted. The present framework description makes a valuable contribution to future systematic reviews and meta-analyses that explore technique efficacy and underlying psychological constructs. This was a novel application of the behavior change taxonomies and was the first internet-based personalized nutrition intervention to use such a framework remotely. Trial Registration ClinicalTrials.gov NCT01530139; https://clinicaltrials.gov/ct2/show/NCT01530139 (Archived by WebCite at http://www.webcitation.org/6y8XYUft1) PMID:29631993

  12. Image analysis software for following progression of peripheral neuropathy

    NASA Astrophysics Data System (ADS)

    Epplin-Zapf, Thomas; Miller, Clayton; Larkin, Sean; Hermesmeyer, Eduardo; Macy, Jenny; Pellegrini, Marco; Luccarelli, Saverio; Staurenghi, Giovanni; Holmes, Timothy

    2009-02-01

A relationship has been reported by several research groups [1-4] between the density and shapes of nerve fibers in the cornea and the existence and severity of peripheral neuropathy. Peripheral neuropathy is a complication of several prevalent diseases or conditions, including diabetes, HIV, prolonged alcohol overconsumption and aging. A common clinical technique for confirming the condition is intramuscular electromyography (EMG), which is invasive, so a noninvasive technique like the one proposed here carries important potential advantages for the physician and patient. A software program that automatically detects the nerve fibers, counts them and measures their shapes is being developed and tested. Tests were carried out with a database of subjects with levels of severity of diabetic neuropathy as determined by EMG testing. Results from this testing, which include a linear regression analysis, are shown.

  13. Optimization technique for problems with an inequality constraint

    NASA Technical Reports Server (NTRS)

    Russell, K. J.

    1972-01-01

The general technique uses a modified version of an existing method termed the pattern search technique. A new procedure, called the parallel move strategy, permits the pattern search technique to be used with problems involving an inequality constraint.
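The abstract gives no implementation details, but the flavour of a pattern search coupled with an inequality constraint can be sketched as follows. This is a plain Hooke-Jeeves-style exploratory search with a quadratic penalty, not Russell's parallel move strategy; the function names and the penalty weight are illustrative assumptions.

```python
def pattern_search(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=10_000):
    """Minimise f by coordinate-wise exploratory moves (Hooke-Jeeves style)."""
    x = list(x0)
    fx = f(x)
    it = 0
    while step > tol and it < max_iter:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = x[:]
                trial[i] += d
                ft = f(trial)
                if ft < fx:          # accept any improving move
                    x, fx, improved = trial, ft, True
                    break
        if not improved:             # no move helped: refine the step size
            step *= shrink
        it += 1
    return x, fx

def penalised(f, g, mu=1e3):
    """Fold an inequality constraint g(x) <= 0 into the objective."""
    return lambda x: f(x) + mu * max(0.0, g(x)) ** 2

# example: minimise (x - 2)^2 subject to x <= 1
obj = lambda x: (x[0] - 2.0) ** 2
con = lambda x: x[0] - 1.0           # g(x) <= 0  <=>  x <= 1
x_best, f_best = pattern_search(penalised(obj, con), [0.0])
```

The unconstrained minimum lies at x = 2, but the penalised search settles just above the constraint boundary x = 1.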

  14. The Effects of Practice-Based Training on Graduate Teaching Assistants’ Classroom Practices

    PubMed Central

    Becker, Erin A.; Easlon, Erin J.; Potter, Sarah C.; Guzman-Alvarez, Alberto; Spear, Jensen M.; Facciotti, Marc T.; Igo, Michele M.; Singer, Mitchell; Pagliarulo, Christopher

    2017-01-01

    Evidence-based teaching is a highly complex skill, requiring repeated cycles of deliberate practice and feedback to master. Despite existing well-characterized frameworks for practice-based training in K–12 teacher education, the major principles of these frameworks have not yet been transferred to instructor development in higher educational contexts, including training of graduate teaching assistants (GTAs). We sought to determine whether a practice-based training program could help GTAs learn and use evidence-based teaching methods in their classrooms. We implemented a weekly training program for introductory biology GTAs that included structured drills of techniques selected to enhance student practice, logic development, and accountability and reduce apprehension. These elements were selected based on their previous characterization as dimensions of active learning. GTAs received regular performance feedback based on classroom observations. To quantify use of target techniques and levels of student participation, we collected and coded 160 h of video footage. We investigated the relationship between frequency of GTA implementation of target techniques and student exam scores; however, we observed no significant relationship. Although GTAs adopted and used many of the target techniques with high frequency, techniques that enforced student participation were not stably adopted, and their use was unresponsive to formal feedback. We also found that techniques discussed in training, but not practiced, were not used at quantifiable frequencies, further supporting the importance of practice-based training for influencing instructional practices. PMID:29146664

  15. A Review of New Surgical and Endoscopic Therapies for Gastroesophageal Reflux Disease.

    PubMed

    Ganz, Robert A

    2016-07-01

Treatment of gastroesophageal reflux disease in the United States today is binary, with the majority of patients with gastroesophageal reflux disease being treated with antisecretory medications and a minority of patients, typically those with volume regurgitation, undergoing Nissen fundoplication. However, there has been increasing dissatisfaction with proton pump inhibitor therapy among a significant number of patients with gastroesophageal reflux disease owing to cost, side effects, and refractory symptoms, and there has been a general reluctance to undergo surgical fundoplication due to its attendant side-effect profile. As a result, a therapy gap exists for many patients with gastroesophageal reflux disease. Alternative techniques are available for these gap patients, including 2 endoscopic fundoplication techniques, an endoscopic radiofrequency energy delivery technique, and 2 minimally invasive surgical procedures. These alternative techniques have been extensively evaluated; however, there are limitations to published studies, including arbitrary definitions of success, variable efficacy measurements, deficient reporting tools, inconsistent study designs, inconsistent lengths of follow-up postintervention, and lack of comparison data across techniques. Although all of the techniques appear to be safe, the endoscopic techniques lack demonstrable reflux control and show variable symptom improvement and variable decreases in proton pump inhibitor use. The surgical techniques are more robust, with evidence for adequate reflux control, symptom improvement, and decreased proton pump inhibitor use; however, these techniques are more difficult to perform and are more intrusive. Additionally, these alternative techniques have only been studied in patients with relatively normal anatomy. 
The field of gastroesophageal reflux disease treatment is in need of consistent definitions of efficacy, standardized study design and outcome measurements, and improved reporting tools before the role of these techniques can be fully ascertained.

  16. ECG-derived respiration based on iterated Hilbert transform and Hilbert vibration decomposition.

    PubMed

    Sharma, Hemant; Sharma, K K

    2018-06-01

Monitoring respiration from the electrocardiogram (ECG) is desirable because it allows simultaneous study of cardiac activity and respiration while improving the comfort, mobility, and cost of the healthcare system. This paper proposes a new approach for deriving respiration from single-lead ECG based on the iterated Hilbert transform (IHT) and the Hilbert vibration decomposition (HVD). The ECG signal is first decomposed into multicomponent sinusoidal signals using the IHT technique. Afterward, the lower-order amplitude components obtained from the IHT are filtered using the HVD to extract the respiration information. Experiments were performed on the Fantasia and Apnea-ECG datasets. The performance of the proposed ECG-derived respiration (EDR) approach was compared with existing techniques, including principal component analysis (PCA), R-peak amplitudes (RPA), respiratory sinus arrhythmia (RSA), slopes of the QRS complex, and the R-wave angle. The proposed technique showed higher median correlation values (first and third quartiles) for the Fantasia and Apnea-ECG datasets, at 0.699 (0.55, 0.82) and 0.57 (0.40, 0.73), respectively. The proposed algorithm also yielded the lowest mean absolute error and average percentage error between the EDR and reference (recorded) respiration signals for the Fantasia and Apnea-ECG datasets, at 1.27 and 9.3%, and 1.35 and 10.2%, respectively. In experiments across the different age groups of the Fantasia dataset, the proposed algorithm provided effective results in the younger population and outperformed the existing techniques for elderly subjects. The proposed EDR technique offers better agreement in respiratory rates than existing techniques and, in particular, removes the extra step of detecting fiducial points in the ECG for the estimation of respiration, which makes the process effective and less complex. These performance results, obtained from two different datasets, validate that the proposed approach can be used for monitoring respiration from single-lead ECG.
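The IHT/HVD pipeline itself is involved, but the simpler R-peak-amplitude (RPA) baseline it is compared against is easy to sketch on synthetic data: respiration modulates the R-wave amplitude, so sampling the peak amplitudes and interpolating recovers a respiration surrogate. Everything below (sampling rate, breathing rate, modulation depth, the toy ECG) is invented for illustration.

```python
import numpy as np

fs = 250                                     # sampling rate (Hz), assumed
t = np.arange(0, 30, 1 / fs)
resp = 0.2 * np.sin(2 * np.pi * 0.25 * t)    # reference: 15 breaths/min

# toy ECG: impulse-like R waves once per second, amplitude-modulated by respiration
ecg = np.zeros_like(t)
r_idx = np.arange(0, len(t), fs)             # one beat per second
ecg[r_idx] = 1.0 + resp[r_idx]

# RPA-based EDR: take the R-peak amplitudes and interpolate to a uniform grid
peaks = r_idx                                # in practice found by a QRS detector
edr = np.interp(t, t[peaks], ecg[peaks]) - 1.0

corr = np.corrcoef(edr, resp)[0, 1]          # agreement with the reference signal
```

The interpolated peak-amplitude series tracks the reference respiration closely even with only one sample per heartbeat, which is why RPA is a standard comparator for EDR methods.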

  17. Current state of the art of vision based SLAM

    NASA Astrophysics Data System (ADS)

    Muhammad, Naveed; Fofi, David; Ainouz, Samia

    2009-02-01

The ability of a robot to localise itself and simultaneously build a map of its environment (Simultaneous Localisation and Mapping, or SLAM) is a fundamental characteristic required for autonomous operation of the robot. Vision sensors are very attractive for application in SLAM because of their rich sensory output and cost effectiveness. Different issues are involved in the problem of vision based SLAM, and many different approaches exist to address them. This paper gives a classification of state-of-the-art vision based SLAM techniques in terms of (i) imaging systems used for performing SLAM, which include single cameras, stereo pairs, multiple camera rigs and catadioptric sensors, (ii) features extracted from the environment in order to perform SLAM, which include point features and line/edge features, (iii) initialisation of landmarks, which can either be delayed or undelayed, (iv) SLAM techniques used, which include Extended Kalman Filtering, Particle Filtering, biologically inspired techniques like RatSLAM, and other techniques like Local Bundle Adjustment, and (v) use of wheel odometry information. The paper also presents the implementation and analysis of stereo pair based EKF SLAM for synthetic data. Results show the technique to work successfully in the presence of considerable amounts of sensor noise. We believe that the state of the art presented in the paper can serve as a basis for future research in the area of vision based SLAM. It will permit further research in the area to be carried out in an efficient and application specific way.
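As a minimal illustration of the EKF machinery underlying EKF-SLAM, the following 1-D toy jointly estimates a robot position and a single static landmark from odometry and signed-range measurements. All noise levels and the geometry are invented; a real vision-based system would track many landmarks extracted from image features.

```python
import numpy as np

# state = [robot position, landmark position]
x = np.array([0.0, 5.0])        # initial estimate (landmark guess is off)
P = np.diag([0.1, 4.0])         # landmark poorly known at first
Q = np.diag([0.05, 0.0])        # motion noise (landmark is static)
R = 0.01                        # measurement noise variance

def predict(x, P, u):
    F = np.eye(2)
    x = x + np.array([u, 0.0])  # robot moves by u, landmark stays put
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    H = np.array([[-1.0, 1.0]])          # measurement model: landmark - robot
    y = z - (x[1] - x[0])                # innovation
    S = H @ P @ H.T + R
    K = (P @ H.T) / S                    # Kalman gain
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# robot drives past a landmark at 6.0, measuring the signed range each step
true_robot, true_lm = 0.0, 6.0
rng = np.random.default_rng(0)
for _ in range(20):
    true_robot += 0.5
    x, P = predict(x, P, 0.5)
    z = (true_lm - true_robot) + rng.normal(0, 0.1)
    x, P = update(x, P, z)
```

After a handful of updates the landmark estimate converges from its poor prior, and the off-diagonal terms of P capture the robot-landmark correlation that EKF-SLAM maintains.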

  18. Technological advances in radiotherapy of rectal cancer: opportunities and challenges.

    PubMed

    Appelt, Ane L; Sebag-Montefiore, David

    2016-07-01

    This review summarizes the available evidence for the use of modern radiotherapy techniques for chemoradiotherapy for rectal cancer, with specific focus on intensity-modulated radiotherapy (IMRT) and volumetric arc therapy (VMAT) techniques. The dosimetric benefits of IMRT and VMAT are well established, but prospective clinical studies are limited, with phase I-II studies only. Recent years have seen the publication of a few larger prospective patient series as well as some retrospective cohorts, several of which include much needed late toxicity data. Overall results are encouraging, as toxicity levels - although varying across reports - appear lower than for 3D conformal radiotherapy. Innovative treatment techniques and strategies which may be facilitated by the use of IMRT/VMAT include simultaneously integrated tumour boost, adaptive treatment, selective sparing of specific organs to enable chemotherapy escalation, and nonsurgical management. Few prospective studies of IMRT and VMAT exist, which causes uncertainty not just in regards to the clinical benefit of these technologies but also in the optimal use. The priority for future research should be subgroups of patients who might receive relatively greater benefit from innovative treatment techniques, such as patients receiving chemoradiotherapy with definitive intent and patients treated with dose escalation.

  19. New simulation model of multicomponent crystal growth and inhibition.

    PubMed

    Wathen, Brent; Kuiper, Michael; Walker, Virginia; Jia, Zongchao

    2004-04-02

We review a novel computational model for the study of crystal structures both on their own and in conjunction with inhibitor molecules. The model advances existing Monte Carlo (MC) simulation techniques by extending them from modeling 3D crystal surface patches to modeling entire 3D crystals, and by including the use of "complex" multicomponent molecules within the simulations. These advances make it possible to incorporate the 3D shape and non-uniform surface properties of inhibitors into simulations, and to study what effect these inhibitor properties have on the growth of whole crystals containing up to tens of millions of molecules. The application of this extended MC model to the study of antifreeze proteins (AFPs) and their effects on ice formation is reported, including the success of the technique in achieving AFP-induced ice-growth inhibition with concurrent changes to ice morphology that mimic experimental results. Simulations of ice-growth inhibition suggest that the degree of inhibition afforded by an AFP is a function of its ice-binding position relative to the underlying anisotropic growth pattern of ice. This extended MC technique is applicable to other crystal and crystal-inhibitor systems, including more complex crystal systems such as clathrates.
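The whole-crystal MC model described above is far richer, but its core loop, stochastic attachment at crystal-adjacent sites with attachment poisoned near adsorbed inhibitors, can be caricatured on a 2-D lattice. Grid size, inhibitor count and step count below are arbitrary choices for illustration.

```python
import random

random.seed(4)
N = 41
EMPTY, ICE, AFP = 0, 1, 2
grid = [[EMPTY] * N for _ in range(N)]
grid[N // 2][N // 2] = ICE                       # seed crystal

# scatter immobile "inhibitor" molecules; attachment next to one is blocked,
# a crude stand-in for an adsorbed antifreeze protein pinning the surface
for _ in range(80):
    r, c = random.randrange(N), random.randrange(N)
    if grid[r][c] == EMPTY:
        grid[r][c] = AFP

def nbrs(r, c):
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= r + dr < N and 0 <= c + dc < N:
            yield r + dr, c + dc

for _ in range(20_000):                          # growth Monte Carlo steps
    r, c = random.randrange(N), random.randrange(N)
    if grid[r][c] != EMPTY:
        continue
    touches_ice = any(grid[i][j] == ICE for i, j in nbrs(r, c))
    blocked = any(grid[i][j] == AFP for i, j in nbrs(r, c))
    if touches_ice and not blocked:
        grid[r][c] = ICE

grown = sum(row.count(ICE) for row in grid)      # final crystal size
```

Even this caricature reproduces the qualitative behaviour in the abstract: the crystal grows around inhibitor sites and its final morphology depends on where the inhibitors sit relative to the growth front.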

  20. Diagnostic and Therapeutic Management of Nasal Airway Obstruction: Advances in Diagnosis and Treatment.

    PubMed

    Mohan, Suresh; Fuller, Jennifer C; Ford, Stephanie Friree; Lindsay, Robin W

    2018-05-10

    Nasal airway obstruction (NAO) is a common complaint in the otolaryngologist's office and can have a negative influence on quality of life (QOL). Existing diagnostic methods have improved, but little consensus exists on optimal tools. Furthermore, although surgical techniques for nasal obstruction continue to be developed, effective outcome measurement is lacking. An update of recent advances in diagnostic and therapeutic management of NAO is warranted. To review advances in diagnosis and treatment of NAO from the last 5 years. PubMed, Embase, CINAHL, the Cochrane Library, LILACS, Web of Science, and Guideline.gov were searched with the terms nasal obstruction and nasal blockage and their permutations from July 26, 2012, through October 23, 2017. Studies were included if they evaluated NAO using a subjective and an objective technique, and in the case of intervention-based studies, the Nasal Obstruction Symptom Evaluation (NOSE) scale and an objective technique. Exclusion criteria consisted of animal studies; patients younger than 14 years; nasal foreign bodies; nasal masses including polyps; choanal atresia; sinus disease; obstructive sleep apnea or sleep-disordered breathing; allergic rhinitis; and studies not specific to nasal obstruction. The initial search resulted in 942 articles. After independent screening by 2 investigators, 46 unique articles remained, including 2 randomized clinical trials, 3 systematic reviews, 3 meta-analyses, and 39 nonrandomized cohort studies (including a combined systematic review and meta-analysis). An aggregate of approximately 32 000 patients were reviewed (including meta-analyses). Of the subjective measures available for NAO, the NOSE scale is outstanding with regard to disease-specific validation and correlation with symptoms. No currently available objective measure can be considered a criterion standard. Structural measures of flow, pressure, and volume appear to be necessary but insufficient to assess NAO. 
Therefore, novel variables and techniques must continue to be explored in search of an ideal instrument to aid in assessment of surgical outcomes. Nasal airway obstruction is a clinical diagnosis with considerable effects on QOL. An adequate diagnosis begins with a focused history and physical examination and requires a patient QOL measure such as the NOSE scale. Objective measures should be adjunctive and require further validation for widespread adoption. These results are limited by minimal high-quality evidence among studies and the risk of bias in observational studies. NA.

  1. Estimating acreage by double sampling using LANDSAT data

    NASA Technical Reports Server (NTRS)

    Pont, F.; Horwitz, H.; Kauth, R. (Principal Investigator)

    1982-01-01

Double sampling techniques employing LANDSAT data for estimating the acreage of corn and soybeans were investigated and evaluated. The evaluation was based on estimated costs and correlations between two existing procedures having differing cost/variance characteristics, and included consideration of their individual merits when coupled with a fictional 'perfect' procedure of zero bias and variance. Two features of the analysis are: (1) the simultaneous estimation of two or more crops; and (2) the imposition of linear cost constraints among two or more types of resource. A reasonably realistic operational scenario was postulated. The costs were estimated from current experience with the measurement procedures involved, and the correlations were estimated from a set of 39 LACIE-type sample segments located in the U.S. Corn Belt. For a fixed variance of the estimate, double sampling with the two existing LANDSAT measurement procedures can result in a 25% or 50% cost reduction. Double sampling which included the fictional perfect procedure results in a more cost effective combination when it is used with the lower cost/higher variance representative of the existing procedures.
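A standard way to combine a cheap, biased procedure with an expensive, accurate one in double sampling is the regression estimator: measure the cheap variable on a large phase-1 sample, measure both variables on a small phase-2 subsample, and correct the phase-2 mean using the phase-1 information. The sketch below uses invented numbers and is not the LACIE procedure itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Phase 1: cheap, biased LANDSAT-style classification on many segments
n1 = 500
truth = rng.normal(40.0, 8.0, n1)                 # true % corn per segment
cheap = 0.9 * truth + 5.0 + rng.normal(0, 2.0, n1)

# Phase 2: expensive ground measurement on a small subsample
n2 = 50
sub = rng.choice(n1, n2, replace=False)
y, x = truth[sub], cheap[sub]

# Regression (double-sampling) estimator of the mean:
# correct the phase-2 mean by the slope times the phase-1/phase-2 gap
b = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
y_ds = y.mean() + b * (cheap.mean() - x.mean())

naive = y.mean()                                  # phase-2-only estimator
```

Because the cheap and expensive measurements are highly correlated, the double-sampling estimator inherits most of the precision of the large phase-1 sample at a fraction of its cost, which is the trade-off the abstract quantifies.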

  2. Chemical Fracturing of Refractory-Metal Vessels

    NASA Technical Reports Server (NTRS)

    Campana, R. J.

    1986-01-01

Localized reactions cause refractory-metal vessels to break up at predetermined temperatures. A device following this concept is designed to break up along predetermined lines into smaller pieces at a temperature significantly below the melting point of the metal from which it is made. Possible applications include fire extinguishers that break up to release extinguishing gas in enclosed areas, pressure vessels that could otherwise burst dangerously in a fire, and self-destroying devices. The technique is particularly suitable as a modification to existing structures.

  3. Laser ultrasonic multi-component imaging

    DOEpatents

    Williams, Thomas K [Federal Way, WA; Telschow, Kenneth [Des Moines, WA

    2011-01-25

    Techniques for ultrasonic determination of the interfacial relationship of multi-component systems are discussed. In implementations, a laser energy source may be used to excite a multi-component system including a first component and a second component at least in partial contact with the first component. Vibrations resulting from the excitation may be detected for correlation with a resonance pattern indicating if discontinuity exists at the interface of the first and second components.

  4. UNDERWATER MAPPING USING GLORIA AND MIPS.

    USGS Publications Warehouse

    Chavez, Pat S.; Anderson, Jeffrey A.; Schoonmaker, James W.

    1987-01-01

Advances in digital image processing of GLORIA (Geological Long-Range Inclined Asdic) sidescan-sonar image data have made it technically and economically possible to map large areas of the ocean floor, including the Exclusive Economic Zone. Software was written to correct both geometric and radiometric distortions that exist in the original raw GLORIA data. A digital mosaicking technique was developed enabling 2 degree by 2 degree quadrangles to be generated.

  5. Improving the Rainbow Attack by Reusing Colours

    NASA Astrophysics Data System (ADS)

    Ågren, Martin; Johansson, Thomas; Hell, Martin

    Hashing or encrypting a key or a password is a vital part in most network security protocols. The most practical generic attack on such schemes is a time memory trade-off attack. Such an attack inverts any one-way function using a trade-off between memory and execution time. Existing techniques include the Hellman attack and the rainbow attack, where the latter uses different reduction functions ("colours") within a table.
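The rainbow-table idea the authors build on can be sketched in miniature. This toy uses a 4-digit numeric "password" space, SHA-256, and an arbitrary reduction function; real tables use far larger spaces and tuned chain counts, and the distinguishing feature of the rainbow construction is the colour-dependent reduction applied at each chain step.

```python
import hashlib

SPACE = 10_000           # toy password space: "0000".."9999"
CHAIN_LEN = 100

def H(pw: str) -> str:
    return hashlib.sha256(pw.encode()).hexdigest()

def reduce_fn(digest: str, colour: int) -> str:
    # colour-dependent reduction maps a hash back into the password space;
    # varying it per step is what distinguishes rainbow tables from
    # Hellman's original single-reduction construction
    return str((int(digest[:12], 16) + colour) % SPACE).zfill(4)

def build_chain(start: str) -> str:
    pw = start
    for colour in range(CHAIN_LEN):
        pw = reduce_fn(H(pw), colour)
    return pw                        # only (start, end) is stored

table = {}
for s in range(0, SPACE, 50):        # precompute 200 chains
    start = str(s).zfill(4)
    table.setdefault(build_chain(start), []).append(start)

def invert(target_hash: str):
    # assume the target hash sits at each chain position, latest first
    for pos in range(CHAIN_LEN - 1, -1, -1):
        pw = reduce_fn(target_hash, pos)
        for colour in range(pos + 1, CHAIN_LEN):
            pw = reduce_fn(H(pw), colour)
        for start in table.get(pw, []):
            cand = start             # regenerate chain to recover preimage
            for colour in range(CHAIN_LEN):
                if H(cand) == target_hash:
                    return cand
                cand = reduce_fn(H(cand), colour)
    return None
```

Storing only (start, end) pairs is the time-memory trade-off: inversion costs on the order of chain-length-squared hash evaluations instead of storing a full dictionary of the space.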

  6. Cellular uptake and intracellular fate of engineered nanoparticles: a review on the application of imaging techniques.

    PubMed

    Tantra, Ratna; Knight, Alex

    2011-09-01

    The use of imaging tools to probe nanoparticle-cell interactions will be crucial to elucidating the mechanisms of nanoparticle-induced toxicity. Of particular interest are mechanisms associated with cell penetration, translocation and subsequent accumulation inside the cell, or in cellular compartments. The objective of the present paper is to review imaging techniques that have been previously used in order to assess such interactions, and new techniques with the potential to be useful in this area. In order to identify the most suitable techniques, they were evaluated and matched against a list of evaluation criteria. We conclude that limitations exist with all of the techniques and the ultimate choice will thus depend on the needs of end users, and their particular application. The state-of-the-art techniques appear to have the least limitations, despite the fact that they are not so well established and still far from being routine. For example, super-resolution microscopy techniques appear to have many advantages for understanding the details of the interactions between nanoparticles and cells. Future research should concentrate on further developing or improving such novel techniques, to include the development of standardized methods and appropriate reference materials.

  7. Transgender Phonosurgery: A Systematic Review and Meta-analysis.

    PubMed

    Song, Tara Elena; Jiang, Nancy

    2017-05-01

    Objectives Different surgical techniques have been described in the literature to increase vocal pitch. The purpose of this study is to systematically review these surgeries and perform a meta-analysis to determine which technique increases pitch the most. Data Sources CINAHL, Cochrane, Embase, Medline, PubMed, and Science Direct. Review Methods A systematic review and meta-analysis of the literature was performed using the CINAHL, Cochrane, Embase, Medline, PubMed, and Science Direct databases. Studies were eligible for inclusion if they evaluated pitch-elevating phonosurgical techniques in live humans and performed pre- and postoperative acoustic analysis. Data were gathered regarding surgical technique, pre- and postoperative fundamental frequencies, perioperative care measures, and complications. Results Twenty-nine studies were identified. After applying inclusion and exclusion criteria, a total of 13 studies were included in the meta-analysis. Mechanisms of pitch elevation included increasing vocal cord tension (cricothyroid approximation), shortening the vocal cord length (cold knife glottoplasty, laser-shortening glottoplasty), and decreasing mass (laser reduction glottoplasty). The most common interventions were shortening techniques and cricothyroid approximation (6 studies each). The largest increase in fundamental frequency was seen with techniques that shortened the vocal cords. Preoperative speech therapy, postoperative voice rest, and reporting of patient satisfaction were inconsistent. Many of the studies were limited by low power and short length of follow-up. Conclusions Multiple techniques for elevation of vocal pitch exist, but vocal cord shortening procedures appear to result in the largest increase in fundamental frequency.

  8. "Relative CIR": an image enhancement and visualization technique

    USGS Publications Warehouse

    Fleming, Michael D.

    1993-01-01

Many techniques exist to spectrally and spatially enhance digital multispectral scanner data. One technique enhances an image while keeping the colors as they would appear in a color-infrared (CIR) image. This "relative CIR" technique generates an image that is both spectrally and spatially enhanced, while displaying a maximum range of colors. The technique enables an interpreter to visualize either spectral or land cover classes by their relative CIR characteristics. A relative CIR image is generated by developing spectral statistics for each class in the classification; then, using a nonparametric approach for spectral enhancement, the means of the classes for each band are ranked. A 3 by 3 pixel smoothing filter is applied to the classification for spatial enhancement and the classes are mapped to the representative rank for each band. Practical applications of the technique include displaying an image classification product as a CIR image that was not derived directly from a spectral image, visualizing how a land cover classification would look as a CIR image, and displaying a spectral classification or intermediate product that will be used to label spectral classes.
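The ranking step at the heart of the technique is compact in array form. The sketch below ranks per-band class means and maps each classified pixel to its rank value; the 3 by 3 smoothing filter mentioned in the abstract is omitted, and the class count, band order and data are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# toy classification: 100x100 pixels labelled with one of 5 spectral classes
classes = rng.integers(0, 5, (100, 100))
# class mean reflectance per band (rows: class; cols: NIR, red, green)
means = rng.uniform(0, 255, (5, 3))

# rank the class means within each band (nonparametric spectral enhancement)
ranks = means.argsort(axis=0).argsort(axis=0)
# stretch the ranks to the full display range
display = (ranks / (ranks.shape[0] - 1) * 255).astype(np.uint8)

# map every pixel's class to its per-band rank value -> relative CIR image
cir = display[classes]           # shape (100, 100, 3)
```

Because only the rank order of class means survives, classes are maximally separated in display space while their relative CIR character (e.g. vegetation brightest in NIR) is preserved.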

  9. RADON REDUCTION TECHNIQUES FOR EXISTING DETACHED HOUSES - TECHNICAL GUIDANCE (THIRD EDITION) FOR ACTIVE SOIL DEPRESSURIZATION SYSTEMS

    EPA Science Inventory

    This technical guidance document is designed to aid in the selection, design, installation and operation of indoor radon reduction techniques using soil depressurization in existing houses. Its emphasis is on active soil depressurization; i.e., on systems that use a fan to depre...

  10. Phases and stability of non-uniform black strings

    NASA Astrophysics Data System (ADS)

    Emparan, Roberto; Luna, Raimon; Martínez, Marina; Suzuki, Ryotaku; Tanabe, Kentaro

    2018-05-01

We construct solutions of non-uniform black strings in dimensions from D ≈ 9 all the way up to D = ∞, and investigate their thermodynamics and dynamical stability. Our approach employs the large-D perturbative expansion beyond the leading order, including corrections up to 1/D^4. Combining both analytical techniques and relatively simple numerical solution of ODEs, we map out the ranges of parameters in which non-uniform black strings exist in each dimension and compute their thermodynamics and quasinormal modes with accuracy. We establish with very good precision the existence of Sorkin's critical dimension and we prove that not only the thermodynamic stability, but also the dynamic stability of the solutions changes at it.

  11. Management of Dynamic Biomedical Terminologies: Current Status and Future Challenges

    PubMed Central

    Dos Reis, J. C.; Pruski, C.

    2015-01-01

    Summary Objectives Controlled terminologies and their dependent artefacts provide a consensual understanding of a domain while reducing ambiguities and enabling reasoning. However, the evolution of a domain’s knowledge directly impacts these terminologies and generates inconsistencies in the underlying biomedical information systems. In this article, we review existing work addressing the dynamic aspect of terminologies as well as their effects on mappings and semantic annotations. Methods We investigate approaches related to the identification, characterization and propagation of changes in terminologies, mappings and semantic annotations including techniques to update their content. Results and conclusion Based on the explored issues and existing methods, we outline open research challenges requiring investigation in the near future. PMID:26293859

  12. Application of nonlinear least-squares regression to ground-water flow modeling, west-central Florida

    USGS Publications Warehouse

    Yobbi, D.K.

    2000-01-01

A nonlinear least-squares regression technique for estimation of ground-water flow model parameters was applied to an existing model of the regional aquifer system underlying west-central Florida. The regression technique minimizes the differences between measured and simulated water levels. Regression statistics, including parameter sensitivities and correlations, were calculated for reported parameter values in the existing model. Optimal parameter values for selected hydrologic variables of interest were estimated by nonlinear regression. Optimal parameter estimates ranged from about 0.01 times to about 140 times the reported values. Independently estimating all parameters by nonlinear regression was impossible, given the existing zonation structure and number of observations, because of parameter insensitivity and correlation. Although the model yields parameter values similar to those estimated by other methods and reproduces the measured water levels reasonably accurately, a simpler parameter structure should be considered. Some possible ways of improving model calibration are to: (1) modify the defined parameter-zonation structure by omitting and/or combining parameters to be estimated; (2) carefully eliminate observation data based on evidence that they are likely to be biased; (3) collect additional water-level data; (4) assign values to insensitive parameters; and (5) estimate the most sensitive parameters first, then, using the optimized values for these parameters, estimate the entire data set.
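As an illustration of calibrating model parameters against observed heads by nonlinear least squares, and of the parameter-correlation problem the abstract reports, consider a deliberately over-parameterised toy: the simulated heads depend only on the ratio R/T, so transmissivity T and recharge R cannot be estimated independently, yet the ratio is recovered precisely. The analytic head profile and all numbers are invented and have nothing to do with the Florida model.

```python
import numpy as np
from scipy.optimize import least_squares

x_obs = np.linspace(0.0, 1000.0, 25)             # observation wells (m)

def simulate_heads(params, x):
    T, R = params                                # transmissivity, recharge
    return 50.0 - (R / (2.0 * T)) * x**2         # toy analytic head profile

true = np.array([500.0, 0.02])
rng = np.random.default_rng(3)
h_obs = simulate_heads(true, x_obs) + rng.normal(0, 0.05, x_obs.size)

def residuals(params):
    # measured-minus-simulated water levels, the quantity being minimised
    return simulate_heads(params, x_obs) - h_obs

fit = least_squares(residuals, x0=[100.0, 0.01])
T_hat, R_hat = fit.x
```

The fit reproduces the observed heads, but only the combination R/T is constrained; T_hat and R_hat individually depend on the starting guess, which is exactly the insensitivity-and-correlation failure mode that motivates simplifying the parameter structure.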

  13. Overview of existing cartilage repair technology.

    PubMed

    McNickle, Allison G; Provencher, Matthew T; Cole, Brian J

    2008-12-01

    Currently, autologous chondrocyte implantation and osteochondral grafting bridge the gap between palliation of cartilage injury and resurfacing via arthroplasty. Emerging technologies seek to advance first generation techniques and accomplish several goals including predictable outcomes, cost-effective technology, single-stage procedures, and creation of durable repair tissue. The biologic pipeline represents a variety of technologies including synthetics, scaffolds, cell therapy, and cell-infused matrices. Synthetic constructs, an alternative to biologic repair, resurface a focal chondral defect rather than the entire joint surface. Scaffolds are cell-free constructs designed as a biologic "net" to augment marrow stimulation techniques. Minced cartilage technology uses stabilized autologous or allogeneic fragments in 1-stage transplantation. Second and third generation cell-based methods include alternative membranes, chondrocyte seeding, and culturing onto scaffolds. Despite the promising early results of these products, significant technical obstacles remain along with unknown long-term durability. The vast array of developing technologies has exceptional promise and the potential to revolutionize the cartilage treatment algorithm within the next decade.

  14. FluoRender: joint freehand segmentation and visualization for many-channel fluorescence data analysis.

    PubMed

    Wan, Yong; Otsuna, Hideo; Holman, Holly A; Bagley, Brig; Ito, Masayoshi; Lewis, A Kelsey; Colasanto, Mary; Kardon, Gabrielle; Ito, Kei; Hansen, Charles

    2017-05-26

    Image segmentation and registration techniques have enabled biologists to place large amounts of volume data from fluorescence microscopy, morphed three-dimensionally, onto a common spatial frame. Existing tools built on volume visualization pipelines for single channel or red-green-blue (RGB) channels have become inadequate for the new challenges of fluorescence microscopy. For a three-dimensional atlas of the insect nervous system, hundreds of volume channels are rendered simultaneously, whereas fluorescence intensity values from each channel need to be preserved for versatile adjustment and analysis. Although several existing tools have incorporated support of multichannel data using various strategies, the lack of a flexible design has made true many-channel visualization and analysis unavailable. The most common practice for many-channel volume data presentation is still converting and rendering pseudosurfaces, which are inaccurate for both qualitative and quantitative evaluations. Here, we present an alternative design strategy that accommodates the visualization and analysis of about 100 volume channels, each of which can be interactively adjusted, selected, and segmented using freehand tools. Our multichannel visualization includes a multilevel streaming pipeline plus a triple-buffer compositing technique. Our method also preserves original fluorescence intensity values on graphics hardware, a crucial feature that allows graphics-processing-unit (GPU)-based processing for interactive data analysis, such as freehand segmentation. We have implemented the design strategies as a thorough restructuring of our original tool, FluoRender. The redesign of FluoRender not only maintains the existing multichannel capabilities for a greatly extended number of volume channels, but also enables new analysis functions for many-channel data from emerging biomedical-imaging techniques.

  15. High-Temperature Strain Sensing for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Piazza, Anthony; Richards, Lance W.; Hudson, Larry D.

    2008-01-01

    Thermal protection systems (TPS) and hot structures are utilizing advanced materials that operate at temperatures beyond the reach of current strain-measurement capabilities. Robust strain sensors that operate accurately and reliably beyond 1800 F are needed but do not exist. These shortcomings hinder the ability to validate analysis and modeling techniques and to optimize structural designs. This presentation examines high-temperature strain sensing for aerospace applications and, more specifically, seeks to provide strain data for validating finite element models and thermal-structural analyses. Efforts have been made to develop sensor attachment techniques for relevant structural materials at the small-test-specimen level and to perform laboratory tests that characterize the sensors and generate corrections to apply to indicated strains. Areas highlighted in this presentation include sensors, sensor attachment techniques, laboratory evaluation and characterization of strain measurements, and sensor use in large-scale structures.

  16. Detecting drug-target binding in cells using fluorescence-activated cell sorting coupled with mass spectrometry analysis.

    PubMed

    Wilson, Kris; Webster, Scott P; Iredale, John P; Zheng, Xiaozhong; Homer, Natalie Z; Pham, Nhan T; Auer, Manfred; Mole, Damian J

    2017-12-15

    The assessment of drug-target engagement for determining the efficacy of a compound inside cells remains challenging, particularly for difficult target proteins. Existing techniques are more suited to soluble protein targets. Difficult target proteins include those with challenging in vitro solubility, stability or purification properties that preclude target isolation. Here, we report a novel technique that measures intracellular compound-target complex formation, as well as cellular permeability, specificity and cytotoxicity: the toxicity-affinity-permeability-selectivity (TAPS) technique. The TAPS assay is exemplified here using human kynurenine 3-monooxygenase (KMO), a challenging intracellular membrane protein target of significant current interest. TAPS confirmed target binding of known KMO inhibitors inside cells. We conclude that the TAPS assay can be used to facilitate intracellular hit validation on most, if not all, intracellular drug targets.

  17. Detecting drug-target binding in cells using fluorescence-activated cell sorting coupled with mass spectrometry analysis

    NASA Astrophysics Data System (ADS)

    Wilson, Kris; Webster, Scott P.; Iredale, John P.; Zheng, Xiaozhong; Homer, Natalie Z.; Pham, Nhan T.; Auer, Manfred; Mole, Damian J.

    2018-01-01

    The assessment of drug-target engagement for determining the efficacy of a compound inside cells remains challenging, particularly for difficult target proteins. Existing techniques are more suited to soluble protein targets. Difficult target proteins include those with challenging in vitro solubility, stability or purification properties that preclude target isolation. Here, we report a novel technique that measures intracellular compound-target complex formation, as well as cellular permeability, specificity and cytotoxicity: the toxicity-affinity-permeability-selectivity (TAPS) technique. The TAPS assay is exemplified here using human kynurenine 3-monooxygenase (KMO), a challenging intracellular membrane protein target of significant current interest. TAPS confirmed target binding of known KMO inhibitors inside cells. We conclude that the TAPS assay can be used to facilitate intracellular hit validation on most, if not all, intracellular drug targets.

  18. High lift selected concepts

    NASA Technical Reports Server (NTRS)

    Henderson, M. L.

    1979-01-01

    The benefits to high lift system maximum lift and, alternatively, to high lift system complexity, of applying analytic design and analysis techniques to the design of high lift sections for flight conditions were determined, and two high lift sections were designed to flight conditions. The influence of the high lift section on the sizing and economics of a specific energy efficient transport (EET) was clarified using a computerized sizing technique and an existing advanced airplane design data base. The impact of the best design resulting from the design application studies on EET sizing and economics was evaluated. Flap technology trade studies, climb and descent studies, and augmented stability studies are included, along with a description of the baseline high lift system geometry, a calculation of lift and pitching moment when separation is present, and an inverse boundary layer technique for pressure distribution synthesis and optimization.

  19. IGA: A Simplified Introduction and Implementation Details for Finite Element Users

    NASA Astrophysics Data System (ADS)

    Agrawal, Vishal; Gautam, Sachin S.

    2018-05-01

    Isogeometric analysis (IGA) is a recently introduced technique that employs the Computer Aided Design (CAD) concept of Non-Uniform Rational B-Splines (NURBS) to bridge the substantial bottleneck between the CAD and finite element analysis (FEA) fields. The direct transition of exact CAD models into the analysis alleviates the issues originating from geometrical discontinuities and thus significantly reduces the design-to-analysis time in comparison to the traditional FEA technique. Since its origination, research in the field of IGA has accelerated, and the method has been applied to various problems. However, the employment of CAD tools in the area of FEA requires adapting the existing implementation procedures to the framework of IGA. The usage of IGA also requires in-depth knowledge of both the CAD and FEA fields, which can be overwhelming for a beginner. Hence, in this paper, a simplified introduction and implementation details for incorporating the NURBS-based IGA technique within an existing FEA code are presented. It is shown that, with little modification, the available standard code structure of FEA can be adapted for IGA. For a clear and concise explanation of these modifications, a step-by-step implementation of a benchmark plate with a circular hole under the action of in-plane tension is included.
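
    To make the CAD-to-FEA connection concrete, the sketch below evaluates NURBS basis functions via the Cox-de Boor recursion, the ingredient an IGA code substitutes for the Lagrange shape functions of standard FEA. This is a minimal illustration written for this summary, not code from the paper.

```python
# Cox-de Boor recursion for B-spline basis functions, plus the rational
# (NURBS) basis built from them. Clear but deliberately unoptimized.

def bspline_basis(i, p, u, knots):
    """Value of the i-th B-spline basis function of degree p at parameter u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    den1 = knots[i + p] - knots[i]
    if den1 > 0.0:
        left = (u - knots[i]) / den1 * bspline_basis(i, p - 1, u, knots)
    den2 = knots[i + p + 1] - knots[i + 1]
    if den2 > 0.0:
        right = (knots[i + p + 1] - u) / den2 * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_basis(p, u, knots, weights):
    """Rational basis: weighted B-splines normalized to sum to one."""
    vals = [w * bspline_basis(i, p, u, knots) for i, w in enumerate(weights)]
    total = sum(vals)
    return [v / total for v in vals]

```

    For an open knot vector with unit weights the rational basis reduces to Bernstein polynomials; for degree 2 on [0, 1] the values at u = 0.5 are (0.25, 0.5, 0.25).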

  20. Costs of Limiting Route Optimization to Published Waypoints in the Traffic Aware Planner

    NASA Technical Reports Server (NTRS)

    Karr, David A.; Vivona, Robert A.; Wing, David J.

    2013-01-01

    The Traffic Aware Planner (TAP) is an airborne advisory tool that generates optimized, traffic-avoiding routes to support the aircraft crew in making strategic reroute requests to Air Traffic Control (ATC). TAP is derived from a research-prototype self-separation tool, the Autonomous Operations Planner (AOP), in which optimized route modifications that avoid conflicts with traffic and weather, using waypoints at explicit latitudes and longitudes (a technique supported by self-separation concepts), are generated by maneuver patterns applied to the existing route. For use in current-day operations, in which trajectory changes must be requested from ATC via voice communication, TAP produces optimized routes described by advisories that use only published waypoints prior to a reconnection waypoint on the existing route. We describe how the relevant algorithms of AOP have been modified to implement this requirement. The modifications include techniques for finding appropriate published waypoints in a maneuver pattern and a method for combining the genetic algorithm of AOP with an exhaustive search over certain types of advisories. We demonstrate methods to investigate the increased computation required by these techniques and to estimate other costs (measured in terms such as time to destination and fuel burned) that may be incurred when only published waypoints are used.

  1. Automated prescription of oblique brain 3D magnetic resonance spectroscopic imaging.

    PubMed

    Ozhinsky, Eugene; Vigneron, Daniel B; Chang, Susan M; Nelson, Sarah J

    2013-04-01

    Two major difficulties encountered in implementing Magnetic Resonance Spectroscopic Imaging (MRSI) in a clinical setting are limited coverage and difficulty of prescription. The goal of this project was to completely automate the process of 3D PRESS MRSI prescription, including placement of the selection box, saturation bands and shim volume, while maximizing the coverage of the brain. The automated prescription technique included acquisition of an anatomical MRI image, optimization of the oblique selection box parameters, optimization of the placement of outer-volume suppression saturation bands, and loading of the calculated parameters into a customized 3D MRSI pulse sequence. To validate the technique and compare its performance with existing protocols, 3D MRSI data were acquired from six exams of three healthy volunteers. To assess the performance of the automated 3D MRSI prescription for patients with brain tumors, data were collected from 16 exams of 8 subjects with gliomas. The technique demonstrated robust coverage of the tumor, high consistency of prescription and very good data quality within the T2 lesion. Copyright © 2012 Wiley Periodicals, Inc.

  2. A processing centre for the CNES CE-GPS experimentation

    NASA Technical Reports Server (NTRS)

    Suard, Norbert; Durand, Jean-Claude

    1994-01-01

    CNES is involved in a GPS (Global Positioning System) geostationary overlay experimentation. The purpose of this experimentation is to test various new techniques in order to select the optimal station synchronization method, as well as the geostationary spacecraft orbitography method. These new techniques are needed to develop the Ranging GPS Integrity Channel services. The CNES experimentation includes three transmitting/receiving ground stations (manufactured by IN-SNEC), one INMARSAT 2 C/L band transponder and a processing center named STE (Station de Traitements de l'Experimentation). Not all the techniques to be tested are implemented, but the experimental system has to include several functions: some of the future system's simulation functions, such as a servo-loop function, and in particular a data collection function that provides rapid monitoring of system operation, analysis of existing ground station processes, and several weeks of data coverage for other scientific studies. This paper discusses the system architecture and some criteria used in its design, the monitoring function, the approach used to develop a low-cost, short-lifetime processing center in collaboration with a CNES sub-contractor (ATTDATAID), and some results.

  3. Novel Passive Clearing Methods for the Rapid Production of Optical Transparency in Whole CNS Tissue.

    PubMed

    Woo, Jiwon; Lee, Eunice Yoojin; Park, Hyo-Suk; Park, Jeong Yoon; Cho, Yong Eun

    2018-05-08

    Since the development of CLARITY, a bioelectrochemical clearing technique that allows for three-dimensional phenotype mapping within transparent tissues, a multitude of novel clearing methodologies including CUBIC (clear, unobstructed brain imaging cocktails and computational analysis), SWITCH (system-wide control of interaction time and kinetics of chemicals), MAP (magnified analysis of the proteome), and PACT (passive clarity technique), have been established to further expand the existing toolkit for the microscopic analysis of biological tissues. The present study aims to improve upon and optimize the original PACT procedure for an array of intact rodent tissues, including the whole central nervous system (CNS), kidneys, spleen, and whole mouse embryos. Termed psPACT (process-separate PACT) and mPACT (modified PACT), these novel techniques provide highly efficacious means of mapping cell circuitry and visualizing subcellular structures in intact normal and pathological tissues. In the following protocol, we provide a detailed, step-by-step outline on how to achieve maximal tissue clearance with minimal invasion of their structural integrity via psPACT and mPACT.

  4. Accurately modeling Gaussian beam propagation in the context of Monte Carlo techniques

    NASA Astrophysics Data System (ADS)

    Hokr, Brett H.; Winblad, Aidan; Bixler, Joel N.; Elpers, Gabriel; Zollars, Byron; Scully, Marlan O.; Yakovlev, Vladislav V.; Thomas, Robert J.

    2016-03-01

    Monte Carlo simulations are widely considered to be the gold standard for studying the propagation of light in turbid media. However, traditional Monte Carlo methods fail to account for diffraction because they treat light as a particle. This results in converging beams focusing to a point instead of a diffraction-limited spot, greatly affecting the accuracy of Monte Carlo simulations near the focal plane. Here, we present a technique capable of simulating a focusing beam in accordance with the rules of Gaussian optics, resulting in a diffraction-limited focal spot. This technique can be easily implemented in any traditional Monte Carlo simulation, allowing existing models to be converted to include accurate focusing geometries with minimal effort. We present results for a focusing beam in a layered tissue model, demonstrating that in different scenarios the region of highest intensity, and thus the greatest heating, can shift from the surface to the focus. The ability to simulate accurate focusing geometries will greatly enhance the usefulness of Monte Carlo methods for countless applications, including studying laser-tissue interactions in medical applications and light propagation through turbid media.
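
    The launch step of such a technique can be sketched as follows: each photon enters the medium at a point sampled from the Gaussian profile at the surface and is aimed at a point sampled from the diffraction-limited waist in the focal plane, so the ensemble converges to a finite spot rather than a geometric point. All names and parameters here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical photon-launch step for a Gaussian focusing beam in a Monte
# Carlo photon-transport code. The rest of the transport loop (scattering,
# absorption) would proceed as in any conventional simulation.
import math, random

def launch_photon(w_surface, w0, focal_depth, rng=random):
    """Return (x, y, z, ux, uy, uz) for one photon at the surface (z = 0).

    w_surface: 1/e^2 beam radius at the surface; w0: waist radius at the
    focal plane; focal_depth: depth of the focal plane below the surface."""
    # Radial intensity ~ exp(-2 r^2 / w^2), so each Cartesian coordinate
    # is normally distributed with standard deviation w / 2.
    xs = rng.gauss(0.0, w_surface / 2.0)
    ys = rng.gauss(0.0, w_surface / 2.0)
    xf = rng.gauss(0.0, w0 / 2.0)
    yf = rng.gauss(0.0, w0 / 2.0)
    # Direction connects the surface entry point to the sampled focal point.
    dx, dy, dz = xf - xs, yf - ys, focal_depth
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    return xs, ys, 0.0, dx / norm, dy / norm, dz / norm

```

    Because the focal-plane points are spread over the waist w0 rather than collapsed to the axis, the unscattered bundle reproduces a diffraction-limited spot instead of a geometric focus.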

  5. Pastoral crisis intervention with children: recognizing and responding to the spiritual reaction of children.

    PubMed

    McPherson, Kenneth F

    2004-01-01

    Many individuals struggle to express their thoughts and feelings following a crisis situation. When these feelings include questions related to spiritual issues such as bad things happening to good people, meaning in life and its tragedies, and the very existence of a caring and loving God, people shut down even more tightly. Imagine how much greater this difficulty becomes for those who lack the ability to verbalize what they are experiencing. Many of our most widely used crisis intervention models rely on verbal techniques to elicit people's thoughts and feelings about stressful incidents they've just experienced. The main focus of this paper is to provide alternative techniques for eliciting the thoughts and feelings of children during traumatic times. The paper reviews basic principles of pastoral crisis intervention (PCI), presents typical spiritual reactions of children to trauma by age groups, presents Crisis Response Play Therapy (CRPT) as one alternative method that bypasses the need for verbalization, and proposes the use of similar experiential techniques for special needs populations, including adults, who have difficulty giving voice to their experiences.

  6. [Optimization of the pseudorandom input signals used for the forced oscillation technique].

    PubMed

    Liu, Xiaoli; Zhang, Nan; Liang, Hong; Zhang, Zhengbo; Li, Deyu; Wang, Weidong

    2017-10-01

    The forced oscillation technique (FOT) is an active pulmonary function measurement technique that identifies the mechanical properties of the respiratory system using external excitation signals. FOT commonly uses single-frequency sine, pseudorandom and periodic impulse excitation signals. To prevent the time-domain amplitude overshoot that can occur when multiple sinusoids are combined into a pseudorandom signal, this paper studied the phase optimization of pseudorandom signals. We tried two methods to solve this problem, random phase combination and the time-frequency domain swapping algorithm, and used the crest factor to estimate the effect of optimization. Furthermore, in order to make the pseudorandom signals meet the requirements of respiratory system identification in the 4-40 Hz band, we compensated the input signals' amplitudes at the low frequency band (4-18 Hz) according to the frequency-response curve of the oscillation unit. Results showed that the time-frequency domain swapping algorithm could effectively optimize the phase combination of pseudorandom signals. Moreover, when the amplitudes at low frequencies were compensated, the expected stimulus signals meeting the performance requirements were obtained.
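
    A minimal sketch of the time-frequency domain swapping idea (details assumed, since the abstract gives only the outline): the target magnitude spectrum is held fixed while the phases are iteratively refined by clipping the time-domain waveform and swapping back to the frequency domain, which lowers the crest factor of the multi-sinusoidal excitation.

```python
# Crest-factor reduction by time-frequency domain swapping (a sketch under
# assumed details): alternate between domains, soft-clip the waveform's
# peaks, and restore the target magnitude spectrum each iteration.
import numpy as np

def crest_factor(x):
    """Peak amplitude divided by RMS value."""
    return np.max(np.abs(x)) / np.sqrt(np.mean(x ** 2))

def optimize_phases(mags, n_iter=200, clip=0.8, seed=0):
    """mags: desired magnitude at each rfft bin; returns the time waveform."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(-np.pi, np.pi, len(mags))
    spec = mags * np.exp(1j * phase)
    for _ in range(n_iter):
        x = np.fft.irfft(spec)                     # to time domain
        limit = clip * np.max(np.abs(x))
        x = np.clip(x, -limit, limit)              # tame the peaks
        spec = np.fft.rfft(x)
        spec = mags * np.exp(1j * np.angle(spec))  # restore target magnitudes
    return np.fft.irfft(spec)

```

    By construction the returned waveform has exactly the requested magnitude spectrum; only the phases, and hence the crest factor, change.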

  7. Emergency cricothyrotomy-a comparative study of different techniques in human cadavers.

    PubMed

    Schober, Patrick; Hegemann, Martina C; Schwarte, Lothar A; Loer, Stephan A; Noetges, Peter

    2009-02-01

    Emergency cricothyrotomy is the final lifesaving option in "cannot intubate-cannot ventilate" situations. Fast, efficient and safe management is indispensable to reestablish oxygenation, thus the quickest, most reliable and safest technique should be used. Several cricothyrotomy techniques exist, which can be grouped into two categories: anatomical-surgical and puncture. We studied success rate, tracheal tube insertion time and complications of different techniques, including a novel cricothyrotomy scissors technique in human cadavers. Sixty-three inexperienced health care providers were randomly assigned to apply either an anatomical-surgical technique (standard surgical technique, n=18; novel cricothyrotomy scissors technique, n=14) or a puncture technique (catheter-over-needle technique, n=17; wire-guided technique, n=14). Airway access was almost always successful with the anatomical-surgical techniques (success rate in standard surgical group 94%, scissors group 100%). In contrast, the success rate was smaller (p<0.05) with the puncture techniques (catheter-over-needle group 82%, wire-guided technique 71%). Tracheal tube insertion time was faster overall (p<0.05) with anatomical-surgical techniques (standard surgical 78s [54-135], novel cricothyrotomy scissors technique 60s [42-82]; median [IQR]) than with puncture techniques (catheter-over-needle technique 74s [48-145], wire-guided technique 135s [116-307]). We observed fewer complications with anatomical-surgical techniques than with puncture techniques (p<0.001). In inexperienced health care personnel, anatomical-surgical techniques showed a higher success rate, a faster tracheal tube insertion time and a lower complication rate compared with puncture techniques, suggesting that they may be the techniques of choice in emergencies.

  8. Nanomechanical effects of light unveil photons momentum in medium

    PubMed Central

    Verma, Gopal; Chaudhary, Komal; Singh, Kamal P.

    2017-01-01

    Precision measurement of momentum transfer between light and a fluid interface has many implications, including resolving the intriguing nature of photon momentum in a medium. For example, the existence of the Abraham pressure of light under specific experimental configurations and the predictions of the Chau-Amperian formalism of optical momentum for TE and TM polarizations remain untested. Here, we quantitatively and cleanly measure the nanomechanical dynamics of a water surface excited by the radiation pressure of a laser beam. We systematically scanned a wide range of experimental parameters, including long exposure times, angle of incidence, spot size and laser polarization, and used two independent pump-probe techniques to validate a nano-bump on the water surface under all the tested conditions, in quantitative agreement with the Minkowski momentum of light. With careful experiments, we demonstrate the advantages and limitations of nanometer-resolved optical probing techniques and narrow down the actual manifestation of optical momentum in a medium. PMID:28198468

  9. Application and sensitivity investigation of Fourier transforms for microwave radiometric inversions

    NASA Technical Reports Server (NTRS)

    Holmes, J. J.; Balanis, C. A.

    1974-01-01

    Existing microwave radiometer technology provides a suitable method for remote determination of the ocean surface's absolute brightness temperature. To extract the brightness temperature of the water from the antenna temperature equation, an unstable Fredholm integral equation of the first kind was solved. Fast Fourier Transform (FFT) techniques were used to invert the integral after it was placed into a cross-correlation form. Application and verification of the method on a two-dimensional model of a laboratory wave tank system are included. The instability of the Fredholm equation is then demonstrated, and a restoration procedure is included that smooths the resulting oscillations. With the recent availability and advances of FFT techniques, the method presented becomes very attractive for the evaluation of large quantities of data. Actual radiometric measurements of sea water are inverted using the restoration method, incorporating the advantages of the FFT algorithm for computation.
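
    The restoration idea can be sketched as a regularized FFT inversion: writing the antenna temperature as a convolution of the brightness temperature with the antenna gain pattern, division in the frequency domain recovers the brightness temperature, and a small damping term stands in for the smoothing step that controls the Fredholm instability. The names and the specific regularizer are our assumptions, not the paper's.

```python
# Illustrative sketch: once the antenna temperature equation is written as
# a (circular) convolution t_antenna = gain * t_brightness, the brightness
# temperature can be recovered by a regularized inverse filter.
import numpy as np

def restore(t_antenna, gain, eps=1e-3):
    """Recover the brightness-temperature profile from antenna temperatures."""
    G = np.fft.fft(gain)
    TA = np.fft.fft(t_antenna)
    # Damps bins where |G| is small, which is exactly where the instability
    # of the Fredholm equation would otherwise appear as oscillation.
    TB = TA * np.conj(G) / (np.abs(G) ** 2 + eps)
    return np.real(np.fft.ifft(TB))

```

    With eps set to zero this is plain deconvolution and the unstable bins blow up; the regularizer trades a little bias for stability, in the spirit of the paper's smoothing restoration.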

  10. UMA/GAN network architecture analysis

    NASA Astrophysics Data System (ADS)

    Yang, Liang; Li, Wensheng; Deng, Chunjian; Lv, Yi

    2009-07-01

    This paper critically analyzes the architecture of UMA, one of the Fixed Mobile Convergence (FMC) solutions, also adopted by the Third Generation Partnership Project (3GPP). In the UMA/GAN network architecture, the UMA Network Controller (UNC) is the key equipment connecting the cellular core network and the mobile station (MS). A UMA network can be easily integrated into existing cellular networks without affecting the mobile core network, and can provide high-quality mobile services with preferentially priced indoor voice and data usage. This helps to improve the subscriber's experience. On the other hand, the UMA/GAN architecture helps to integrate other radio techniques into the cellular network, including WiFi, Bluetooth, WiMAX and so on. This offers traditional mobile operators an opportunity to integrate the WiMAX technique into the cellular network. At the end of this article, we also give an analysis of the potential influence exerted by a UMA network on the cellular core networks.

  11. An experimental study of nonlinear dynamic system identification

    NASA Technical Reports Server (NTRS)

    Stry, Greselda I.; Mook, D. Joseph

    1990-01-01

    A technique for robust identification of nonlinear dynamic systems is developed and illustrated using both simulations and analog experiments. The technique is based on the Minimum Model Error optimal estimation approach. A detailed literature review is included, in which fundamental differences between the current approach and previous work are described. The most significant feature of the current work is the ability to identify nonlinear dynamic systems without prior assumptions regarding the form of the nonlinearities, in contrast to existing nonlinear identification approaches, which usually require detailed assumptions about the nonlinearities. The example illustrations indicate that the method is robust with respect to prior ignorance of the model, and with respect to measurement noise, measurement frequency, and measurement record length.

  12. Processing techniques for digital sonar images from GLORIA.

    USGS Publications Warehouse

    Chavez, P.S.

    1986-01-01

    Image processing techniques have been developed to handle data from one of the newest members of the remote sensing family of digital imaging systems. This paper discusses software to process data collected by the GLORIA (Geological Long Range Inclined Asdic) sonar imaging system, designed and built by the Institute of Oceanographic Sciences (IOS) in England, to correct for both geometric and radiometric distortions that exist in the original 'raw' data. Preprocessing algorithms that are GLORIA-specific include corrections for slant-range geometry, water column offset, aspect ratio distortion, changes in the ship's velocity, speckle noise, and shading problems caused by the power drop-off which occurs as a function of range.-from Author
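
    Two of the GLORIA-specific corrections named above, the water-column offset and the slant-range geometry, amount to simple geometry for a flat seafloor at known depth. The sketch below shows a nearest-neighbour version; the real processor presumably interpolates, and all names here are illustrative, not from the GLORIA software.

```python
# Hedged sketch of slant-range to ground-range correction for sidescan
# sonar: returns arriving before the nadir echo are water column (no
# seafloor), and each later sample maps to a horizontal distance.
import math

def ground_range(slant, depth):
    """Horizontal distance for a return at slant range `slant` (same units)."""
    if slant <= depth:      # still inside the water column: no seafloor yet
        return 0.0
    return math.sqrt(slant * slant - depth * depth)

def resample_line(pixels, pixel_size, depth):
    """Rebin one across-track scan line from slant-range to ground-range
    spacing by nearest-neighbour lookup."""
    out = []
    for j in range(len(pixels)):
        g = j * pixel_size                       # desired ground range
        s = math.sqrt(g * g + depth * depth)     # slant range mapping there
        k = round(s / pixel_size)
        out.append(pixels[k] if k < len(pixels) else 0)
    return out

```

    Near nadir the mapping stretches the image strongly (many ground-range pixels draw from few slant-range samples), which is why sidescan imagery looks compressed at the track line before this correction.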

  13. High-Temperature Piezoelectric Sensing

    PubMed Central

    Jiang, Xiaoning; Kim, Kyungrim; Zhang, Shujun; Johnson, Joseph; Salazar, Giovanni

    2014-01-01

    Piezoelectric sensing is of increasing interest for high-temperature applications in aerospace, automotive, power plants and material processing due to its low cost, compact sensor size and simple signal conditioning, in comparison with other high-temperature sensing techniques. This paper presents an overview of high-temperature piezoelectric sensing techniques. First, different types of high-temperature piezoelectric single crystals and electrode materials, along with their pros and cons, are discussed. Second, recent work on high-temperature piezoelectric sensors, including accelerometers, surface acoustic wave sensors, ultrasound transducers, acoustic emission sensors, gas sensors, and pressure sensors for temperatures up to 1,250 °C, is reviewed. Finally, existing challenges and future work for high-temperature piezoelectric sensing are discussed. PMID:24361928

  14. ATMOS Spacelab 1 science investigation

    NASA Technical Reports Server (NTRS)

    Park, J. H.; Smith, M. A. H.; Twitty, J. T.; Russell, J. M., III

    1979-01-01

    Existing infrared spectra from high speed interferometer balloon flights were analyzed, and experimental analysis techniques applicable to similar data from the ATMOS experiment (Spacelab 3) were investigated. Specific techniques under investigation included line-by-line simulation of the spectra to aid in the identification of absorbing gases, simultaneous retrieval of pressure and temperature profiles using carefully chosen pairs of CO2 absorption lines, and the use of these pressures and temperatures in the retrieval of gas concentration profiles for many absorbing species. A search for new absorption features was also carried out, and special attention was given to identification of absorbing gases in the spectral bandpass regions to be measured by the halogen occultation experiment.

  15. Infrared thermographic detection of buried grave sites

    NASA Astrophysics Data System (ADS)

    Weil, Gary J.; Graf, Richard J.

    1992-04-01

    Since time began, people have been born and people have died. For a variety of reasons, grave sites have had to be located and investigated. These reasons have included legal, criminal, religious, construction and even simple-curiosity problems. Destructive testing methods, such as shovels and backhoes, have traditionally been used to determine grave site locations in fields, under pavements, and behind hidden locations. These existing techniques are slow, inconvenient, dirty, destructive, visually obtrusive, irritating to relatives, explosive to the media and expensive. A new, nondestructive, non-contact technique, infrared thermography, has been developed to address these problems. This paper describes how infrared thermography works and illustrates it with several case histories.

  16. Comparative evaluation of border molding using two different techniques in maxillary edentulous arches: A clinical study

    PubMed Central

    Qanungo, Anchal; Aras, Meena Ajay; Chitre, Vidya; Coutinho, Ivy; Rajagopal, Praveen; Mysore, Ashwin

    2016-01-01

    Purpose: The aim of this in vivo study was to compare the single-step border molding technique, using injectable heavy viscosity addition silicone, with the sectional border molding technique, using low fusing impression compound, by evaluating the retention of heat cure trial denture bases. Materials and Methods: Ten completely edentulous patients in need of prostheses were included in this study. Two border molding techniques, single-step (Group 1) and sectional (Group 2), were compared for retention. Both border molding techniques were performed in each patient. In both techniques, the definitive wash impression was made with light viscosity addition silicone. The final results were analyzed using a paired t-test to determine whether significant differences existed between the groups. Results: The t-value (3.031) indicates a significant difference between Group 1 and Group 2 (P = 0.014). The retention obtained in Group 2 (mean = 9.05 kgf) was significantly higher than that of Group 1 (mean = 8.26 kgf). Conclusion: The sectional border molding technique proved to be more retentive than the single-step technique, although clinically the retention appeared comparable. PMID:27746597
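
    The reported statistic is a standard paired t-test on per-patient retention values; a minimal sketch of the computation is below, written for illustration rather than taken from the study.

```python
# Paired t-test statistic: each patient contributes one within-subject
# difference, and the test asks whether the mean difference is nonzero.
import math

def paired_t(a, b):
    """t statistic for paired samples a and b (same length n, n - 1 df)."""
    n = len(a)
    d = [y - x for x, y in zip(a, b)]
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n)

```

    With the study's n = 10 patients the statistic has 9 degrees of freedom, which is where t = 3.031 maps to P = 0.014 (two-sided).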

  17. Conservation in the energy industry

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The basic energy supply and utilization problems faced by the United States were described. Actions which might alleviate the domestic shortfall of petroleum and natural gas are described, analyzed and overall impacts are assessed. Specific actions included are coal gasification, in situ shale oil production, improved oil and gas recovery, importation of liquid natural gas and deregulation of natural gas prices. These actions are weighed against each other as alternate techniques of alleviating or overcoming existing shortfalls.

  18. Computing Flow through Well Screens Using an Embedded Well Technique

    DTIC Science & Technology

    2015-08-01

    Computing flow through well screens conventionally requires solving the continuity equation and the momentum equation using small time steps. Under the assumption that the well flow reaches a steady state, the embedded well technique represents the well system with a 1D steady-state well equation, so that much greater time steps can be used for computation.

  19. Extreme winds and tornadoes: design and evaluation of buildings and structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, J.R.

    1985-01-01

    The general provisions of ANSI A58.1-1982 are explained in detail. As mentioned above, these procedures may be used to determine design wind loads on structures from extreme winds, hurricane winds and tornado winds. The treatment of atmospheric pressure change loads is discussed, including recommendations for venting a building, if necessary, and the effects of the rate of pressure change on HVAC systems. Finally, techniques for evaluating existing facilities are described.

  20. Development of Techniques for Spent Fuel Assay – Differential Dieaway Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swinhoe, Martyn Thomas; Goodsell, Alison; Ianakiev, Kiril Dimitrov

    This report summarizes the work done under a DNDO R&D funded project on the development of the differential dieaway method to measure plutonium in spent fuel. There are large amounts of plutonium contained in spent fuel assemblies, and currently there is no way to make a quantitative non-destructive assay. This has led NA24, under the Next Generation Safeguards Initiative (NGSI), to establish a multi-year program to investigate, develop and implement measurement techniques for spent fuel. The techniques being experimentally tested by the existing NGSI project do not include any pulsed neutron active techniques. The present work covers the active neutron differential dieaway technique and has advanced the state of knowledge of this technique as well as produced a design for a practical active neutron interrogation instrument for spent fuel. Monte Carlo results from the NGSI effort show that much higher accuracy (1-2%) for the Pu content in spent fuel assemblies can be obtained with active neutron interrogation techniques than with passive techniques, and this would allow their use for nuclear material accountancy independently of any information from the operator. The main purpose of this work was to develop an active neutron interrogation technique for spent nuclear fuel.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boucher, Laurel

    In an era of budget cuts and declining resources, an increased need exists for government agencies to develop formal and informal partnerships. Such partnerships are a means through which government agencies can use their resources to accomplish together what they cannot accomplish on their own. Interagency partnerships may involve multiple government agencies, private contractors, national laboratories, technology developers, public representatives, and other stakeholders. Four elements of strong and healthy interagency partnerships are presented as well as three needs that must be satisfied for the partnership to last. A diagnostic tool to measure the strength of these building blocks within an existing partnership is provided. Tools, techniques, and templates to develop these fundamental elements within a new partnership or to strengthen those within an already existing partnership are presented. This includes a comprehensive template for a partnership agreement along with practical suggestions on membership, operations, and decision-making. (authors)

  2. Synthesis of Greedy Algorithms Using Dominance Relations

    NASA Technical Reports Server (NTRS)

    Nedunuri, Srinivas; Smith, Douglas R.; Cook, William R.

    2010-01-01

    Greedy algorithms exploit problem structure and constraints to achieve linear-time performance. Yet there is still no completely satisfactory way of constructing greedy algorithms. For example, the Greedy Algorithm of Edmonds depends upon translating a problem into an algebraic structure called a matroid, but the existence of such a translation can be as hard to determine as the existence of a greedy algorithm itself. An alternative characterization of greedy algorithms is in terms of dominance relations, a well-known algorithmic technique used to prune search spaces. We demonstrate a process by which dominance relations can be methodically derived for a number of greedy algorithms, including activity selection and prefix-free codes. By incorporating our approach into an existing framework for algorithm synthesis, we demonstrate that it could be the basis for an effective engineering method for greedy algorithms. We also compare our approach with other characterizations of greedy algorithms.
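The activity-selection problem mentioned above has a classic greedy solution (always take the compatible activity with the earliest finish time); a minimal sketch of that standard algorithm, independent of the authors' synthesis framework:

```python
def select_activities(activities):
    """Greedy activity selection: sort by finish time, then take each
    activity whose start is no earlier than the previously selected
    activity's finish."""
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:
            chosen.append((start, finish))
            last_finish = finish
    return chosen

# Example: overlapping time slots given as (start, finish) pairs
slots = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
print(select_activities(slots))  # [(1, 4), (5, 7), (8, 11)]
```

The earliest-finish choice is exactly the kind of locally dominant decision the paper's dominance-relation derivation is meant to justify formally.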

  3. Review of Bioassays for Monitoring Fate and Transport ofEstrogenic Endocrine Disrupting Compounds in Water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CGCampbell@lbl.gov

    Endocrine disrupting compounds (EDCs) are recognized contaminants threatening water quality. Despite efforts in source identification, few strategies exist for characterization or treatment of this environmental pollution. Given that there are numerous EDCs that can negatively affect humans and wildlife, general screening techniques like bioassays and biosensors provide an essential rapid and intensive analysis capacity. Commonly applied bioassays include the ELISA and YES assays, but promising technologies include ER-CALUX, ELRA, Endotect, RIANA, and IR-bioamplification. Two biosensors, Endotect and RIANA, are field portable using non-cellular biological detection strategies. Environmental management of EDCs in water requires integration of biosensors and bioassays for monitoring and assessment.

  4. Crosscutting Airborne Remote Sensing Technologies for Oil and Gas and Earth Science Applications

    NASA Technical Reports Server (NTRS)

    Aubrey, A. D.; Frankenberg, C.; Green, R. O.; Eastwood, M. L.; Thompson, D. R.; Thorpe, A. K.

    2015-01-01

    Airborne imaging spectroscopy has evolved dramatically since the 1980s as a robust remote sensing technique used to generate 2-dimensional maps of surface properties over large spatial areas. Traditional applications for passive airborne imaging spectroscopy include interrogation of surface composition, such as mapping of vegetation diversity and surface geological composition. Two recent applications are particularly relevant to the needs of both the oil and gas as well as government sectors: quantification of surficial hydrocarbon thickness in aquatic environments and mapping atmospheric greenhouse gas components. These techniques provide valuable capabilities for mapping petroleum seepage in addition to detecting and quantifying fugitive emissions. New empirical data that provides insight into the source strength of anthropogenic methane will be reviewed, with particular emphasis on the evolving constraints enabled by new methane remote sensing techniques. Contemporary studies attribute high-strength point sources as significantly contributing to the national methane inventory and underscore the need for high performance remote sensing technologies that provide quantitative leak detection. Imaging sensors that map spatial distributions of methane anomalies provide effective techniques to detect, localize, and quantify fugitive leaks. Airborne remote sensing instruments provide the unique combination of high spatial resolution (<1 m) and large coverage required to directly attribute methane emissions to individual emission sources. This capability cannot currently be achieved using spaceborne sensors. In this study, results from recent NASA remote sensing field experiments focused on point-source leak detection will be highlighted. This includes existing quantitative capabilities for oil and methane using state-of-the-art airborne remote sensing instruments.
While these capabilities are of interest to NASA for assessment of environmental impact and global climate change, industry similarly seeks to detect and localize leaks of both oil and methane across operating fields. In some cases, higher sensitivities desired for upstream and downstream applications can only be provided by new airborne remote sensing instruments tailored specifically for a given application. There exists a unique opportunity for alignment of efforts between commercial and government sectors to advance the next generation of instruments to provide more sensitive leak detection capabilities, including those for quantitative source strength determination.
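The point-source localization described above rests, at its core, on thresholding a 2-D methane-enhancement map and grouping contiguous anomalous pixels. The sketch below illustrates that generic step on synthetic data; it is not the actual NASA processing chain, and the background and threshold values are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def localize_plumes(xch4_map, background, threshold):
    """Flag pixels whose methane enhancement exceeds `threshold` above
    background, group contiguous pixels into plumes, and report each
    plume's pixel count, summed enhancement, and centroid (row, col)."""
    enhancement = xch4_map - background
    mask = enhancement > threshold
    labels, n = ndimage.label(mask)
    plumes = []
    for i in range(1, n + 1):
        blob = labels == i
        plumes.append({
            "pixels": int(blob.sum()),
            "total_enhancement": float(enhancement[blob].sum()),
            "centroid": ndimage.center_of_mass(blob),
        })
    return plumes

# Synthetic scene: flat ~1800 ppb background with one point-source anomaly
scene = np.full((50, 50), 1800.0)
scene[20:23, 30:33] += 500.0        # localized 500 ppb enhancement
plumes = localize_plumes(scene, 1800.0, 100.0)
print(plumes)
```

The summed enhancement per blob is the quantity that, combined with wind information, feeds source-strength estimates in quantitative leak-detection work.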

  5. Non-destructive evaluation of laboratory scale hydraulic fracturing using acoustic emission

    NASA Astrophysics Data System (ADS)

    Hampton, Jesse Clay

    The primary objective of this research is to develop techniques to characterize hydraulic fractures and fracturing processes using acoustic emission monitoring based on laboratory scale hydraulic fracturing experiments. Individual microcrack AE source characterization is performed to understand the failure mechanisms associated with small failures along pre-existing discontinuities and grain boundaries. Individual microcrack analysis methods include moment tensor inversion techniques to elucidate the mode of failure, crack slip and crack normal direction vectors, and relative volumetric deformation of an individual microcrack. Differentiation between individual microcrack analysis and AE cloud based techniques is studied in efforts to refine discrete fracture network (DFN) creation and regional damage quantification of densely fractured media. Regional damage estimations from combinations of individual microcrack analyses and AE cloud density plotting are used to investigate the usefulness of weighting cloud based AE analysis techniques with microcrack source data. Two granite types were used in several sample configurations including multi-block systems. Laboratory hydraulic fracturing was performed with sample sizes ranging from 15 x 15 x 25 cm to 30 x 30 x 25 cm in both unconfined and true-triaxially confined stress states using different types of materials. Hydraulic fracture testing in rock block systems containing a large natural fracture was investigated in terms of AE response throughout fracture interactions. Investigations of differing scale analyses showed the usefulness of individual microcrack characterization as well as DFN and cloud based techniques. Cloud based techniques weighted by individual microcrack characterization correlated well with post-test damage evaluations.

  6. Improved 3-omega measurement of thermal conductivity in liquid, gases, and powders using a metal-coated optical fiber.

    PubMed

    Schiffres, Scott N; Malen, Jonathan A

    2011-06-01

    A novel 3ω thermal conductivity measurement technique called metal-coated 3ω is introduced for use with liquids, gases, powders, and aerogels. This technique employs a micron-scale metal-coated glass fiber as a heater/thermometer that is suspended within the sample. Metal-coated 3ω exceeds alternate 3ω based fluid sensing techniques in a number of key metrics enabling rapid measurements of small samples of materials with very low thermal effusivity (gases), using smaller temperature oscillations with lower parasitic conduction losses. Its advantages relative to existing fluid measurement techniques, including transient hot-wire, steady-state methods, and solid-wire 3ω are discussed. A generalized n-layer concentric cylindrical periodic heating solution that accounts for thermal boundary resistance is presented. Improved sensitivity to boundary conductance is recognized through this model. Metal-coated 3ω was successfully validated through a benchmark study of gases and liquids spanning two orders of magnitude in thermal conductivity. © 2011 American Institute of Physics.

  7. Volumetric velocimetry for fluid flows

    NASA Astrophysics Data System (ADS)

    Discetti, Stefano; Coletti, Filippo

    2018-04-01

    In recent years, several techniques have been introduced that are capable of extracting 3D three-component velocity fields in fluid flows. Fast-paced developments in both hardware and processing algorithms have generated a diverse set of methods, with a growing range of applications in flow diagnostics. This has been further enriched by the increasingly marked trend of hybridization, in which the differences between techniques are fading. In this review, we carry out a survey of the prominent methods, including optical techniques and approaches based on medical imaging. An overview of each is given with an example of an application from the literature, while focusing on their respective strengths and challenges. A framework for the evaluation of velocimetry performance in terms of dynamic spatial range is discussed, along with technological trends and emerging strategies to exploit 3D data. While critical challenges still exist, these observations highlight how volumetric techniques are transforming experimental fluid mechanics, and that the possibilities they offer have just begun to be explored.

  8. Generation of Well-Defined Micro/Nanoparticles via Advanced Manufacturing Techniques for Therapeutic Delivery

    PubMed Central

    Zhang, Peipei; Xia, Junfei; Luo, Sida

    2018-01-01

    Micro/nanoparticles have great potential in biomedical applications, especially for drug delivery. Existing studies identified that major micro/nanoparticle features including size, shape, surface property and component materials play vital roles in their in vitro and in vivo applications. However, a demanding challenge is that most conventional particle synthesis techniques such as emulsion can only generate micro/nanoparticles with a very limited number of shapes (i.e., spherical or rod shapes) and have very loose control in terms of particle sizes. We reviewed the advanced manufacturing techniques for producing micro/nanoparticles with precisely defined characteristics, emphasizing the use of these well-controlled micro/nanoparticles for drug delivery applications. Additionally, to illustrate the vital roles of particle features in therapeutic delivery, we also discussed how the above-mentioned micro/nanoparticle features impact in vitro and in vivo applications. Through this review, we highlighted the unique opportunities in generating controllable particles via advanced manufacturing techniques and the great potential of using these micro/nanoparticles for therapeutic delivery. PMID:29670013

  9. A review on methods of regeneration of spent pickling solutions from steel processing.

    PubMed

    Regel-Rosocka, Magdalena

    2010-05-15

    The review presents various techniques of regeneration of spent pickling solutions, including the methods with acid recovery, such as diffusion dialysis, electrodialysis, membrane electrolysis and membrane distillation, evaporation, precipitation and spray roasting, as well as those with acid and metal recovery: ion exchange, retardation, crystallization, solvent and membrane extraction. Advantages and disadvantages of the techniques are presented, discussed and confronted with the best available techniques requirements. Most of the methods presented meet the BAT requirements. The best available techniques are electrodialysis, diffusion dialysis and crystallization; however, in practice spray roasting and retardation/ion-exchange are applied most frequently for spent pickling solution regeneration. Solvent extraction, non-dispersive solvent extraction and membrane distillation should be indicated as "waiting for their chance" because they are well investigated and developed. Environmental and economic benefits of the methods presented in the review depend on the cost of chemicals and wastewater treatment, legislative regulations and cost of modernization of existing technologies or implementation of new ones. Copyright (c) 2009 Elsevier B.V. All rights reserved.

  10. Discrete-time modelling of musical instruments

    NASA Astrophysics Data System (ADS)

    Välimäki, Vesa; Pakarinen, Jyri; Erkut, Cumhur; Karjalainen, Matti

    2006-01-01

    This article describes physical modelling techniques that can be used for simulating musical instruments. The methods are closely related to digital signal processing. They discretize the system with respect to time, because the aim is to run the simulation using a computer. The physics-based modelling methods can be classified as mass-spring, modal, wave digital, finite difference, digital waveguide and source-filter models. We present the basic theory and a discussion on possible extensions for each modelling technique. For some methods, a simple model example is chosen from the existing literature demonstrating a typical use of the method. For instance, in the case of the digital waveguide modelling technique a vibrating string model is discussed, and in the case of the wave digital filter technique we present a classical piano hammer model. We tackle some nonlinear and time-varying models and include new results on the digital waveguide modelling of a nonlinear string. Current trends and future directions in physical modelling of musical instruments are discussed.
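As a concrete instance of the digital waveguide family surveyed above, the simplest plucked-string model (the Karplus-Strong form) is a delay line roughly one period long, closed in a loop through a two-point average that damps high frequencies each round trip. This is a generic textbook sketch, not one of the article's specific models:

```python
import numpy as np

def waveguide_string(freq, duration, fs=44100, decay=0.996):
    """Minimal digital-waveguide plucked string: a delay line of length
    ~fs/freq with an averaging loop filter; the average mimics the
    frequency-dependent losses at the string terminations."""
    n = int(fs / freq)                   # delay-line length ~ one period
    rng = np.random.default_rng(0)
    line = rng.uniform(-1.0, 1.0, n)     # "pluck": random initial shape
    out = np.empty(int(fs * duration))
    for i in range(out.size):
        out[i] = line[0]
        fed_back = decay * 0.5 * (line[0] + line[1])
        line = np.roll(line, -1)         # advance the delay line one sample
        line[-1] = fed_back              # loop-filter output re-enters
    return out

tone = waveguide_string(440.0, 0.5)      # half a second of A4
print(tone.shape)
```

The delay-line length sets the pitch (fs/n ≈ 441 Hz here, since n is truncated to an integer); fractional-delay filters, which the waveguide literature covers in depth, remove that tuning error.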

  11. Exploring 3D Human Action Recognition: from Offline to Online.

    PubMed

    Liu, Zhenyu; Li, Rui; Tan, Jianrong

    2018-02-20

    With the introduction of cost-effective depth sensors, a tremendous amount of research has been devoted to studying human action recognition using 3D motion data. However, most existing methods work in an offline fashion, i.e., they operate on a segmented sequence. There are a few methods specifically designed for online action recognition, which continually predict action labels as a stream sequence proceeds. In view of this fact, we propose a question: can we draw inspiration and borrow techniques or descriptors from existing offline methods, and then apply these to online action recognition? Note that extending offline techniques or descriptors to online applications is not straightforward, since at least two problems-including real-time performance and sequence segmentation-are usually not considered in offline action recognition. In this paper, we give a positive answer to the question. To develop applicable online action recognition methods, we carefully explore feature extraction, sequence segmentation, computational costs, and classifier selection. The effectiveness of the developed methods is validated on the MSR 3D Online Action dataset and the MSR Daily Activity 3D dataset.
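The online setting described above amounts to classifying a sliding window over the incoming frame stream instead of a pre-segmented sequence. The following minimal sketch shows that scheme in outline; the window/stride values and the classifier are hypothetical stand-ins, not the paper's method:

```python
from collections import deque

def online_recognize(frame_stream, classify, window=30, stride=10):
    """Keep a fixed-length buffer of incoming frames and emit a label
    every `stride` frames, so prediction proceeds while the stream is
    still arriving (the essence of online recognition)."""
    buf = deque(maxlen=window)
    labels = []
    for i, frame in enumerate(frame_stream, 1):
        buf.append(frame)
        if len(buf) == window and i % stride == 0:
            labels.append(classify(list(buf)))   # classify current window
    return labels

# Toy stream of scalar "frames"; a dummy classifier thresholds the mean
stream = [0.0] * 40 + [1.0] * 40
dummy = lambda win: "wave" if sum(win) / len(win) > 0.5 else "idle"
print(online_recognize(stream, dummy))
```

Real systems replace the scalar frames with skeleton/depth features and the dummy classifier with a trained model; the stride then trades latency against per-frame compute, one of the real-time constraints the paper highlights.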

  12. Exploring 3D Human Action Recognition: from Offline to Online

    PubMed Central

    Li, Rui; Liu, Zhenyu; Tan, Jianrong

    2018-01-01

    With the introduction of cost-effective depth sensors, a tremendous amount of research has been devoted to studying human action recognition using 3D motion data. However, most existing methods work in an offline fashion, i.e., they operate on a segmented sequence. There are a few methods specifically designed for online action recognition, which continually predict action labels as a stream sequence proceeds. In view of this fact, we propose a question: can we draw inspiration and borrow techniques or descriptors from existing offline methods, and then apply these to online action recognition? Note that extending offline techniques or descriptors to online applications is not straightforward, since at least two problems—including real-time performance and sequence segmentation—are usually not considered in offline action recognition. In this paper, we give a positive answer to the question. To develop applicable online action recognition methods, we carefully explore feature extraction, sequence segmentation, computational costs, and classifier selection. The effectiveness of the developed methods is validated on the MSR 3D Online Action dataset and the MSR Daily Activity 3D dataset. PMID:29461502

  13. Combining density functional theory (DFT) and pair distribution function (PDF) analysis to solve the structure of metastable materials: the case of metakaolin.

    PubMed

    White, Claire E; Provis, John L; Proffen, Thomas; Riley, Daniel P; van Deventer, Jannie S J

    2010-04-07

    Understanding the atomic structure of complex metastable (including glassy) materials is of great importance in research and industry, however, such materials resist solution by most standard techniques. Here, a novel technique combining thermodynamics and local structure is presented to solve the structure of the metastable aluminosilicate material metakaolin (calcined kaolinite) without the use of chemical constraints. The structure is elucidated by iterating between least-squares real-space refinement using neutron pair distribution function data, and geometry optimisation using density functional modelling. The resulting structural representation is both energetically feasible and in excellent agreement with experimental data. This accurate structural representation of metakaolin provides new insight into the local environment of the aluminium atoms, with evidence of the existence of tri-coordinated aluminium. By the availability of this detailed chemically feasible atomic description, without the need to artificially impose constraints during the refinement process, there exists the opportunity to tailor chemical and mechanical processes involving metakaolin and other complex metastable materials at the atomic level to obtain optimal performance at the macro-scale.
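The alternating scheme described in this abstract (real-space PDF refinement, then DFT geometry optimisation, repeated to convergence) can be sketched abstractly. Both step functions below are hypothetical stand-ins, not real crystallographic or DFT code; the toy example only illustrates the control flow of iterating two refinement operators until the energy stabilises.

```python
def iterative_refinement(structure, refine_pdf, relax_dft, max_iter=20, tol=1e-3):
    """Alternate a data-fitting step and an energy-relaxation step,
    stopping when the energy change between cycles falls below `tol`.
    `refine_pdf` and `relax_dft` are injected callables (stand-ins for
    PDF least-squares refinement and DFT geometry optimisation)."""
    prev_energy = None
    for _ in range(max_iter):
        structure = refine_pdf(structure)          # real-space fit step
        structure, energy = relax_dft(structure)   # relaxation + energy
        if prev_energy is not None and abs(energy - prev_energy) < tol:
            break
        prev_energy = energy
    return structure, energy

# Toy stand-ins: "structure" is one coordinate relaxing toward zero
pdf_step = lambda x: x * 0.5            # pull the model toward the data
dft_step = lambda x: (x * 0.9, x * x)   # relax slightly, report "energy" x^2
final, e = iterative_refinement(4.0, pdf_step, dft_step)
print(final, e)
```

The point of the loop, as in the paper, is that neither operator alone suffices: the data fit alone can land on energetically implausible geometries, and the relaxation alone drifts from experiment, while alternating them converges to a model satisfying both.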

  14. Novel Technique for Making Measurements of SO2 with a Standalone Sonde

    NASA Astrophysics Data System (ADS)

    Flynn, J. H., III; Morris, G. A.; Kotsakis, A.; Alvarez, S. L.

    2017-12-01

    A novel technique has been developed to measure SO2 using the existing electrochemical concentration cell (ECC) ozonesonde technology. An interference in the ozone measurement occurs when SO2 is introduced to the iodide redox reaction, causing the signal to decrease and go to zero when [O3] < [SO2]. The original method of measuring SO2 with ozonesondes involves launching two ozonesondes together with one ozonesonde unmodified and one with an SO2 filter [Morris et al., 2010]. By taking the difference between these profiles, the SO2 profile could be determined as long as [O3] > [SO2]. A new method allows for making a direct measurement of SO2 without the need for the dual payload by modifying the existing design. The ultimate goal is to be able to measure SO2 vertical profiles in the atmosphere, such as in plumes from anthropogenic or natural sources (e.g., volcanic eruptions). The benefits of an SO2 sonde include the ability to make measurements where aircraft cannot safely fly, such as in volcanic plumes, and to provide validation of SO2 columns from satellites.
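The original dual-payload method described above reduces to differencing the filtered and unfiltered profiles. A minimal sketch with hypothetical profile values, assuming the SO2 interference suppresses the unfiltered sonde's signal approximately one-for-one:

```python
import numpy as np

def so2_from_dual_sondes(o3_filtered, o3_unfiltered):
    """Dual-sonde differencing: the SO2-filtered sonde reads true O3,
    while SO2 suppresses the unfiltered sonde's signal, so the profile
    difference estimates [SO2]. Valid only while [O3] > [SO2], i.e.
    before the unfiltered signal floors at zero."""
    diff = np.asarray(o3_filtered) - np.asarray(o3_unfiltered)
    return np.clip(diff, 0.0, None)   # negative differences treated as noise

# Hypothetical profiles (ppbv) at a few altitudes crossing an SO2 plume
o3_true = [40.0, 45.0, 50.0, 48.0]   # filtered sonde
o3_seen = [40.0, 30.0, 20.0, 48.0]   # unfiltered, suppressed in the plume
print(so2_from_dual_sondes(o3_true, o3_seen))   # SO2 estimate per level
```

The new single-sonde design in this abstract removes the need for this differencing, along with the matched-pair launch it requires.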

  15. Cross-calibration of Medium Resolution Earth Observing Satellites by Using EO-1 Hyperion-derived Spectral Surface Reflectance from "Lunar Cal Sites"

    NASA Astrophysics Data System (ADS)

    Ungar, S.

    2017-12-01

    Over the past 3 years, the Earth Observing-1 (EO-1) Hyperion imaging spectrometer was used to slowly scan the lunar surface at a rate which results in up to 32X oversampling to effectively increase the SNR. Several strategies, including comparison against the USGS RObotic Lunar Observatory (ROLO) model, are being employed to estimate the absolute and relative accuracy of the measurement set. There is an existing need to resolve discrepancies as high as 10% between ROLO and solar based calibration of current NASA EOS assets. Although the EO-1 mission was decommissioned at the end of March 2017, the development of a well-characterized exoatmospheric spectral radiometric database, for a range of lunar phase angles surrounding the fully illuminated moon, continues. Initial studies include a comprehensive analysis of the existing 17-year collection of more than 200 monthly lunar acquisitions. Specific lunar surface areas, such as a lunar mare, are being characterized as potential "lunar calibration sites" in terms of their radiometric stability in the presence of lunar nutation and libration. Site-specific Hyperion-derived lunar spectral reflectances are being compared against spectrographic measurements made during the Apollo program. Techniques developed through this activity can be employed by future high-quality orbiting imaging spectrometers (such as HyspIRI and EnMap) to further refine calibration accuracies. These techniques will enable the consistent cross calibration of existing and future earth observing systems (spectral and multi-spectral) including those that do not have lunar viewing capability. When direct lunar viewing is not an option for an earth observing asset, orbiting imaging spectrometers can serve as transfer radiometers relating that asset's sensor response to lunar values through near contemporaneous observations of well characterized stable CEOS test sites.
Analysis of this dataset will lead to the development of strategies to ensure more accurate cross calibrations when employing the more capable, future imaging spectrometers.

  16. Nanoscience

    DTIC Science & Technology

    2011-07-22

    Allara, David L., Pennsylvania State University, Upgrading of Existing X-Ray Photoelectron Spectrometer Capabilities for Development and Analysis of Novel Energetic NanoCluster Materials (DURIP), from the Technical Reports database. Of the techniques discussed (scanning probe, X-ray), the most popularly used is the scanning probe, also known as the Dip-Pen Nanolithography (DPN) technique.

  17. Sci-Fri PM: Radiation Therapy, Planning, Imaging, and Special Techniques - 10: Results from Canada Wide Survey on Total Body Irradiation Practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Studinski, Ryan; Fraser, Danielle; Samant, Rajiv

    Purpose: Total Body Irradiation (TBI) is delivered to a relatively small number of patients with a variety of techniques; it has been a challenge to develop consensus studies for best practice. This survey was created to assess the current state of TBI in Canada. Methods: The survey was created with questions focusing on the radiation prescription, delivery technique and resources involved. The survey was circulated electronically to the heads of every clinical medical physics department in Canada. Responses were gathered and collated, and centres that were known to deliver TBI were urged to respond. Results: Responses from 20 centres were received, including 12 from centres that perform TBI. Although a variety of TBI dose prescriptions were reported, 12 Gy in 6 fractions was used in 11 centres while 5 centres use unique prescriptions. For dose rate, a range of 9 to 51 cGy/min was reported. Most centres use an extended SSD technique, with the patient standing or lying down against a wall. The rest use either a “sweeping” technique or a more complicated multi-field technique. All centres but one indicated that they shield the lungs, and only a minority shield other organs. The survey also showed that considerable resources are used for TBI including extra staffing, extended planning and treatment times and the use of locally developed hardware or software. Conclusions: This survey highlights that both similarities and important discrepancies exist between TBI techniques across the country, and is an opportunity to prompt more collaboration between centres.

  18. When Less Is More: The indications for MIS Techniques and Separation Surgery in Metastatic Spine Disease.

    PubMed

    Zuckerman, Scott L; Laufer, Ilya; Sahgal, Arjun; Yamada, Yoshiya J; Schmidt, Meic H; Chou, Dean; Shin, John H; Kumar, Naresh; Sciubba, Daniel M

    2016-10-15

    Systematic review. The aim of this study was to review the techniques, indications, and outcomes of minimally invasive surgery (MIS) and separation surgery with subsequent radiosurgery in the treatment of patients with metastatic spine disease. The utilization of MIS techniques in patients with spine metastases is a growing area within spinal oncology. Separation surgery represents a novel paradigm where radiosurgery provides long-term control after tumor is surgically separated from the neural elements. PubMed, Embase, and CINAHL databases were systematically queried for literature reporting MIS techniques or separation surgery in patients with metastatic spine disease. PRISMA guidelines were followed. Of the initial 983 articles found, 29 met inclusion criteria. Twenty-five articles discussed MIS techniques and were grouped according to the primary objective: percutaneous stabilization (8), tubular retractors (4), mini-open approach (8), and thoracoscopy/endoscopy (5). The remaining 4 studies reported separation surgery. Indications were similar across all studies and included patients with instability, refractory pain, or neurologic compromise. Intraoperative variables, outcomes, and complications were similar in MIS studies compared to traditional approaches, and some MIS studies showed a statistically significant improvement in outcomes. Studies of mini-open techniques had the strongest evidence for superiority. Low-quality evidence currently exists for MIS techniques and separation surgery in the treatment of metastatic spine disease. Given the early promising results, the next iteration of research should include higher-quality studies with sufficient power, and will be able to provide higher-level evidence on the outcomes of MIS approaches and separation surgery. N/A.

  19. A Passive System Reliability Analysis for a Station Blackout

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunett, Acacia; Bucknor, Matthew; Grabaskas, David

    2015-05-03

    The latest iterations of advanced reactor designs have included increased reliance on passive safety systems to maintain plant integrity during unplanned sequences. While these systems are advantageous in reducing the reliance on human intervention and availability of power, the phenomenological foundations on which these systems are built require a novel approach to a reliability assessment. Passive systems possess the unique ability to fail functionally without failing physically, a result of their explicit dependency on existing boundary conditions that drive their operating mode and capacity. Argonne National Laboratory is performing ongoing analyses that demonstrate various methodologies for the characterization of passive system reliability within a probabilistic framework. Two reliability analysis techniques are utilized in this work. The first approach, the Reliability Method for Passive Systems, provides a mechanistic technique employing deterministic models and conventional static event trees. The second approach, a simulation-based technique, utilizes discrete dynamic event trees to treat time-dependent phenomena during scenario evolution. For this demonstration analysis, both reliability assessment techniques are used to analyze an extended station blackout in a pool-type sodium fast reactor (SFR) coupled with a reactor cavity cooling system (RCCS). This work demonstrates the entire process of a passive system reliability analysis, including identification of important parameters and failure metrics, treatment of uncertainties and analysis of results.

  20. Lateral Patellar Instability in the Skeletally Mature Patient: Evaluation and Surgical Management.

    PubMed

    Diduch, David R; Kandil, Abdurrahman; Burrus, M Tyrrell

    2018-05-18

    Lateral patellar instability is a common disease process that affects all types of patients. Depending on the patient's anatomy and the results of preoperative imaging, surgical management options include medial patellofemoral ligament reconstruction, tibial tubercle osteotomy, and sulcus-deepening trochleoplasty. Medial patellofemoral ligament reconstruction or repair is useful for almost all patients, whereas tibial tubercle osteotomy is helpful to correct a lateralized tibial tubercle and the associated elevated lateral pull of the extensor mechanism. For a select subset of patients with severe trochlear dysplasia, a sulcus-deepening trochleoplasty can be a useful option to prevent future patellar instability. Many technical considerations exist for each procedure, and in most situations, no consensus exists to direct surgeons on the superior technique.

  1. 6-carboxydihydroresveratrol 3-O-β-glucopyranoside--a novel natural product from the Cretaceous relict Metasequoia glyptostroboides.

    PubMed

    Nguyen, Xuan Hong Thy; Juvik, Ole Johan; Øvstedal, Dag Olav; Fossen, Torgils

    2014-06-01

    Metasequoia glyptostroboides, a tree native to China, is described as a living fossil and has existed for millions of years. The oldest fossils recorded have been dated to the late Cretaceous era. During the time of its existence, the molecular defence system of the tree has apparently resisted millions of generations of pathogens, which encouraged the search for novel natural products from this source. Eight compounds have been characterised from needles of M. glyptostroboides, including the novel natural product 6-carboxydihydroresveratrol 3-O-β-glucopyranoside. The structure determinations were based on extensive use of 2D NMR spectroscopic techniques and high-resolution mass spectrometry. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Recent development of recycling lead from scrap CRTs: A technological review.

    PubMed

    Yu-Gong; Tian, Xiang-Miao; Wu, Yu-Feng; Zhe-Tan; Lei-Lv

    2016-11-01

    Cathode ray tubes (CRTs) contain numerous harmful substances with different functions. Lead is found in the funnel glass of CRTs. Improperly treated toxic lead may pose significant risks to human health and the environment. This paper reviews and summarizes existing technological processes on the recycling of lead from waste CRTs, including pyrometallurgy, hydrometallurgy, and product-regeneration. The present situation, advantages, and disadvantages of these techniques are described in detail. Generally, pyrometallurgy shows better practicability in recovering lead from waste CRTs than hydrometallurgy, in view of environmental impact, energy consumption, product formats, and safety and maturity of technology. Moreover, the gaps in the existing technologies were identified and recommendations for future research were provided. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Radiotherapy dosimetry audit: three decades of improving standards and accuracy in UK clinical practice and trials.

    PubMed

    Clark, Catharine H; Aird, Edwin G A; Bolton, Steve; Miles, Elizabeth A; Nisbet, Andrew; Snaith, Julia A D; Thomas, Russell A S; Venables, Karen; Thwaites, David I

    2015-01-01

    Dosimetry audit plays an important role in the development and safety of radiotherapy. National and large scale audits are able to set, maintain and improve standards, as well as having the potential to identify issues which may cause harm to patients. They can support implementation of complex techniques and can facilitate awareness and understanding of any issues which may exist by benchmarking centres with similar equipment. This review examines the development of dosimetry audit in the UK over the past 30 years, including the involvement of the UK in international audits. A summary of audit results is given, with an overview of methodologies employed and lessons learnt. Recent and forthcoming more complex audits are considered, with a focus on future needs including the arrival of proton therapy in the UK and other advanced techniques such as four-dimensional radiotherapy delivery and verification, stereotactic radiotherapy and MR linear accelerators. The work of the main quality assurance and auditing bodies is discussed, including how they are working together to streamline audit and to ensure that all radiotherapy centres are involved. Undertaking regular external audit motivates centres to modernize and develop techniques and provides assurance, not only that radiotherapy is planned and delivered accurately but also that the patient dose delivered is as prescribed.

  4. Radiotherapy dosimetry audit: three decades of improving standards and accuracy in UK clinical practice and trials

    PubMed Central

    Aird, Edwin GA; Bolton, Steve; Miles, Elizabeth A; Nisbet, Andrew; Snaith, Julia AD; Thomas, Russell AS; Venables, Karen; Thwaites, David I

    2015-01-01

    Dosimetry audit plays an important role in the development and safety of radiotherapy. National and large scale audits are able to set, maintain and improve standards, as well as having the potential to identify issues which may cause harm to patients. They can support implementation of complex techniques and can facilitate awareness and understanding of any issues which may exist by benchmarking centres with similar equipment. This review examines the development of dosimetry audit in the UK over the past 30 years, including the involvement of the UK in international audits. A summary of audit results is given, with an overview of methodologies employed and lessons learnt. Recent and forthcoming more complex audits are considered, with a focus on future needs including the arrival of proton therapy in the UK and other advanced techniques such as four-dimensional radiotherapy delivery and verification, stereotactic radiotherapy and MR linear accelerators. The work of the main quality assurance and auditing bodies is discussed, including how they are working together to streamline audit and to ensure that all radiotherapy centres are involved. Undertaking regular external audit motivates centres to modernize and develop techniques and provides assurance, not only that radiotherapy is planned and delivered accurately but also that the patient dose delivered is as prescribed. PMID:26329469

  5. Seismic data fusion anomaly detection

    NASA Astrophysics Data System (ADS)

    Harrity, Kyle; Blasch, Erik; Alford, Mark; Ezekiel, Soundararajan; Ferris, David

    2014-06-01

    Detecting anomalies in non-stationary signals has valuable applications in many fields, including medicine and meteorology. Examples include identifying possible heart conditions from electrocardiography (ECG) signals or predicting earthquakes via seismographic data. Given the many available anomaly detection algorithms, it is important to compare possible methods. In this paper, we examine and compare two approaches to anomaly detection and see how data fusion methods may improve performance. The first approach involves using an artificial neural network (ANN) to detect anomalies in a wavelet de-noised signal. The other method uses a perspective neural network (PNN) to analyze an arbitrary number of "perspectives" or transformations of the observed signal for anomalies. Possible perspectives may include wavelet de-noising, Fourier transform, peak-filtering, etc. In order to evaluate these techniques via signal fusion metrics, we must apply signal preprocessing techniques such as de-noising methods to the original signal and then use a neural network to find anomalies in the generated signal. From this secondary result it is possible to use data fusion techniques that can be evaluated via existing data fusion metrics for single and multiple perspectives. The results show which anomaly detection method, according to the metrics, is better suited overall for anomaly detection applications. The method used in this study could be applied to compare other signal processing algorithms.

  6. The development of additive manufacturing technique for nickel-base alloys: A review

    NASA Astrophysics Data System (ADS)

    Zadi-Maad, Ahmad; Basuki, Arif

    2018-04-01

    Nickel-base alloys are attractive due to their excellent mechanical properties and high resistance to creep deformation, corrosion, and oxidation. However, controlling performance when casting or forging this material is a hard task. In recent years, the additive manufacturing (AM) process has been implemented to replace the conventional directional solidification process for the production of nickel-base alloys. Due to its potentially lower cost and flexible manufacturing process, AM is considered a substitute for existing techniques. This paper provides a comprehensive review of previous work related to AM techniques for Ni-base alloys, highlighting current challenges and methods for solving them. The properties of conventionally manufactured Ni-base alloys are also compared with those of AM-fabricated alloys. The mechanical properties obtained from tension, hardness and fatigue tests are included, along with discussions of the effect of post-treatment processes. Recommendations for further work are also provided.

  7. Advances in paper-based sample pretreatment for point-of-care testing.

    PubMed

    Tang, Rui Hua; Yang, Hui; Choi, Jane Ru; Gong, Yan; Feng, Shang Sheng; Pingguan-Murphy, Belinda; Huang, Qing Sheng; Shi, Jun Ling; Mei, Qi Bing; Xu, Feng

    2017-06-01

    In recent years, paper-based point-of-care testing (POCT) has been widely used in medical diagnostics, food safety and environmental monitoring. However, a high-cost, time-consuming and equipment-dependent sample pretreatment technique is generally required for raw sample processing, which is impractical for low-resource and disease-endemic areas. Therefore, there is an escalating demand for a cost-effective, simple and portable pretreatment technique to be coupled with the commonly used paper-based assays (e.g. lateral flow assays) in POCT. In this review, we focus on the importance of using paper as a platform for sample pretreatment. We first discuss the beneficial use of paper for sample pretreatment, including sample collection and storage, separation, extraction, and concentration. We then highlight the working principle and fabrication of each sample pretreatment device, the existing challenges and the future perspectives for developing paper-based sample pretreatment techniques.

  8. Correlation techniques to determine model form in robust nonlinear system realization/identification

    NASA Technical Reports Server (NTRS)

    Stry, Greselda I.; Mook, D. Joseph

    1991-01-01

    The fundamental challenge in identification of nonlinear dynamic systems is determining the appropriate form of the model. A robust technique is presented which essentially eliminates this problem for many applications. The technique is based on the Minimum Model Error (MME) optimal estimation approach. A detailed literature review is included, in which fundamental differences between the current approach and previous work are described. The most significant feature is the ability to identify nonlinear dynamic systems without prior assumption regarding the form of the nonlinearities, in contrast to existing nonlinear identification approaches, which usually require detailed assumptions of the nonlinearities. Model form is determined via statistical correlation of the MME optimal state estimates with the MME optimal model error estimates. The example illustrations indicate that the method is robust with respect to prior ignorance of the model, and with respect to measurement noise, measurement frequency, and measurement record length.
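    The correlation step described above can be sketched numerically: correlate an estimated model-error signal against candidate nonlinear terms of the state and keep the best-correlated term. This is only an illustration of the selection idea; the MME estimator itself is not reproduced, and the state samples, candidate terms, and coefficients are invented.

```python
# Sketch of model-form selection by statistical correlation: given state
# estimates and model-error estimates, score candidate nonlinear terms by
# their Pearson correlation with the error signal and keep the best one.
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def best_model_term(state, model_error, candidates):
    """candidates: dict of name -> function applied to each state sample."""
    scores = {name: abs(pearson([f(s) for s in state], model_error))
              for name, f in candidates.items()}
    return max(scores, key=scores.get), scores

# Synthetic example: the "unknown" nonlinearity is cubic in the state.
state = [0.1 * i - 2.0 for i in range(41)]
model_error = [0.5 * s ** 3 for s in state]
candidates = {"linear": lambda s: s,
              "quadratic": lambda s: s * s,
              "cubic": lambda s: s ** 3}
name, scores = best_model_term(state, model_error, candidates)
print(name)  # → cubic
```

    In the actual method the error signal comes from the MME optimal estimates rather than being constructed directly, but the term-selection logic is the same.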

  9. New insights in the treatment of acromioclavicular separation

    PubMed Central

    van Bergen, Christiaan J A; van Bemmel, Annelies F; Alta, Tjarco D W; van Noort, Arthur

    2017-01-01

    A direct force on the superior aspect of the shoulder may cause acromioclavicular (AC) dislocation or separation. Severe dislocations can lead to chronic impairment, especially in the athlete and high-demand manual laborer. The dislocation is classified according to Rockwood. Types I and II are treated nonoperatively, while types IV, V and VI are generally treated operatively. Controversy exists regarding the optimal treatment of type III dislocations in the high-demand patient. Recent evidence suggests that these should be treated nonoperatively initially. Classic surgical techniques were associated with high complication rates, including recurrent dislocations and hardware breakage. In recent years, many new techniques have been introduced in order to improve the outcomes. Arthroscopic reconstruction or repair techniques have promising short-term results. This article aims to provide a current concepts review on the treatment of AC dislocations with emphasis on recent developments. PMID:29312844

  10. Choosing Objectives in Over-Subscription Planning

    NASA Technical Reports Server (NTRS)

    Smith, David E.

    2003-01-01

    Many NASA planning problems are over-subscription problems - that is, there are a large number of possible goals of differing value, and the planning system must choose a subset that can be accomplished within the limited time and resources available. Examples include planning for telescopes like Hubble, SIRTF, and SOFIA; scheduling for the Deep Space Network; and planning science experiments for a Mars rover. Unfortunately, existing planning systems are not designed to deal with problems like this - they expect a well-defined conjunctive goal and terminate in failure unless the entire goal is achieved. In this paper we develop techniques for over-subscription problems that assist a classical planner in choosing which goals to achieve, and the order in which to achieve them. These techniques use plan graph cost-estimation techniques to construct an orienteering problem, which is then used to provide heuristic advice on the goals and goal order that should be considered by a planner.
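    As a rough illustration of the goal-selection problem (not the paper's plan-graph or orienteering machinery), a greedy value-to-cost heuristic under a resource budget might look like the sketch below; all goal names and numbers are invented.

```python
# Minimal sketch of goal selection for an over-subscription problem:
# greedily pick goals by value-to-cost ratio until the budget is exhausted.
# A simple stand-in for the orienteering-problem heuristic in the paper.

def choose_goals(goals, budget):
    """goals: list of (name, value, cost). Returns (chosen names, total value)."""
    ranked = sorted(goals, key=lambda g: g[1] / g[2], reverse=True)
    chosen, total_value, spent = [], 0.0, 0.0
    for name, value, cost in ranked:
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
            total_value += value
    return chosen, total_value

# Invented rover-style goals: (name, science value, resource cost).
goals = [("image_crater", 10.0, 4.0),
         ("drill_sample", 25.0, 12.0),
         ("spectrometer", 8.0, 2.0),
         ("drive_to_ridge", 15.0, 9.0)]
print(choose_goals(goals, budget=18.0))
```

    Greedy selection ignores goal interactions and ordering, which is exactly what the orienteering formulation adds: it also recommends the order in which the chosen goals should be visited.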

  11. Multi-Resolution Unstructured Grid-Generation for Geophysical Applications on the Sphere

    NASA Technical Reports Server (NTRS)

    Engwirda, Darren

    2015-01-01

    An algorithm for the generation of non-uniform unstructured grids on ellipsoidal geometries is described. This technique is designed to generate high quality triangular and polygonal meshes appropriate for general circulation modelling on the sphere, including applications to atmospheric and ocean simulation, and numerical weather prediction. Using a recently developed Frontal-Delaunay-refinement technique, a method for the construction of high-quality unstructured ellipsoidal Delaunay triangulations is introduced. A dual polygonal grid, derived from the associated Voronoi diagram, is also optionally generated as a by-product. Compared to existing techniques, it is shown that the Frontal-Delaunay approach typically produces grids with near-optimal element quality and smooth grading characteristics, while imposing relatively low computational expense. Initial results are presented for a selection of uniform and non-uniform ellipsoidal grids appropriate for large-scale geophysical applications. The use of user-defined mesh-sizing functions to generate smoothly graded, non-uniform grids is discussed.

  12. The use of remote sensing in solving Florida's geological and coastal engineering problems

    NASA Technical Reports Server (NTRS)

    Brooks, H. K.; Ruth, B. E.; Wang, Y. H.; Ferguson, R. L.

    1977-01-01

    LANDSAT imagery and NASA high altitude color infrared (CIR) photography were used to select suitable sites for sanitary landfill in Volusia County, Florida and to develop techniques for preventing sand deposits in the Clearwater inlet. Activities described include the acquisition of imagery, its analysis by the IMAGE 100 system, conventional photointerpretation, evaluation of existing data sources (vegetation, soil, and ground water maps), site investigations for ground truth, and preparation of displays for reports.

  13. Further Validation of Simulated Dynamic Interface Testing Techniques as a Tool in the Forecasting of Air Vehicle Deck Limits

    DTIC Science & Technology

    2010-01-01

    UAV Autonomy program which includes intelligent reasoning for autonomy, technologies to enhance see and avoid capabilities, object identification ...along the ship’s base recovery course (BRC). The pilot then flies toward the stern of the ship, aligning his approach path with the ship’s lineup line...quiescent point identification . CONCLUSIONS The primary goal for conducting dynamic interface analysis is to expand existing operating envelopes and

  14. Project Cyclops: a Design Study of a System for Detecting Extraterrestrial Intelligent Life

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The requirements in hardware, manpower, time and funding to conduct a realistic effort aimed at detecting the existence of extraterrestrial intelligent life are examined. The methods used are limited to present or near term future state-of-the-art techniques. Subjects discussed include: (1) possible methods of contact, (2) communication by electromagnetic waves, (3) antenna array and system facilities, (4) antenna elements, (5) signal processing, (6) search strategy, and (7) radio and radar astronomy.

  15. Automated Techniques for Rapid Analysis of Momentum Exchange Devices

    DTIC Science & Technology

    2013-12-01

    ...[1, pg. 12]. It allows a vector expressed in one frame to be differentiated with respect to another frame.

  16. Airborne Observation of Ocean Surface Roughness Variations Using a Combination of Microwave Radiometer and Reflectometer Systems: The Second Virginia Offshore (Virgo II) Experiment

    DTIC Science & Technology

    2014-03-06

    from scattered satellite transmissions, was first demonstrated using Global Navigation Satellite System (GNSS) reflections. Recently, reflectometry has...Earth’s atmosphere. The 2012 GNSS+R workshop provided an opportunity for engineers and Earth scientists to assess the state of the art, demonstrate new...bi-static radar technique utilizes signals of opportunity transmitted from existing L-band Global Navigation Satellite Systems (GNSS), including GPS

  17. Effectiveness of two reflection crack attenuation techniques.

    DOT National Transportation Integrated Search

    2015-09-01

    Asphalt overlays are one of the most common tools for rehabilitating existing asphalt and concrete pavements. : However, the performance of new overlays is often jeopardized by the cracking distress in the existing : pavement. This existing cracking ...

  18. [Soft-ridged bench terrace design in hilly loess region].

    PubMed

    Cao, Shixiong; Chen, Li; Gao, Wangsheng

    2005-08-01

    Reconfiguration of hillside fields into terraces is regarded as one of the key techniques for water and soil conservation in mountainous regions. On slopes exceeding 30 degrees, the traditional techniques of terracing are difficult to apply, because the risers (i.e., backslopes), if not reinforced, are so steep that they easily collapse under gravity alone, damaging the terrace. To improve the reconfiguration of hillside fields into terraces, holistic techniques of soft-ridged bench terrace engineering, including revegetation with trees and grasses on riser slopes, were tested between 1997 and 2001 in the Xiabiangou watershed of Yan'an, Shaanxi Province. In a "working with Nature" engineering approach, riser slopes of 45 degrees, close to the pre-existing slope of 35 degrees, were employed to radically reduce gravity erosion. Based on the concepts of biodiversity and the principles of landscape ecology, terrace benches, bunds, and risers were planted with trees, shrubs, forage grasses, and crops, serving to generate a diverse array of plants and a semi-forested area, and to stabilize terrace bunds. Soft-ridged bench terracing significantly reduced the hazards arising from gravity erosion, and reduced the costs of individual bench construction and maintenance by 24.9% and 55.5%, respectively, compared with traditional techniques. Such a construction allowed an enrichment and concentration of nutrients in the soils of terrace bunds, providing an ideal environment for a range of plants to grow and develop. The terrace riser could be planted with drought-resistant plants ranging from forage grasses to trees, and this riser vegetation would turn the bunds and risers, left exposed under traditional techniques, into plant-covered belts, great green ribbons decorating farmland and enhancing the landscape.

  19. [Effects of 2-chlorophenol-acclimation on microbial community structure in anaerobic granular sludge].

    PubMed

    Huang, Ai-Qun; Dai, Ya-Lei; Chen, Ling; Chen, Hao; Zhang, Wen

    2008-03-01

    The microbial community structures in 2-chlorophenol-acclimated anaerobic granular sludge and in the inoculating sludge were analyzed by a 16S rDNA-based approach. Total DNA was extracted directly from the inoculating sludge and the 2-CP-acclimated anaerobic sludge, and then amplified by the polymerase chain reaction (PCR) technique with the specific primer pairs ARC21F/ARC958R for Archaea and 31F/907R for Acidobacteria, respectively. The positive PCR products were cloned and sequenced. Sequence analysis shows that common Archaea exist in both sludges, including Methanothrix soehngenii, Methanosaeta concilii and an uncultured euryarchaeote. Some Archaea appear only in the 2-CP-acclimated sludge, such as Methanobacterium aarhusense, Methanobacterium curvum and Methanobacterium beijingense, while others originally present in the inoculating sludge disappeared after acclimation. Common Acidobacteria are found in both sludges, including an uncultured bacterium, an uncultured Acidobacterium and an unknown Actinomycete (MC 9). Some microbes originally present in the inoculating sludge, such as Desulfotomaculum sp. 176, an uncultured Deltaproteobacterium (n8d) and an uncultured hydrocarbon seep bacterium, disappeared after acclimation, and an uncultured Holophaga/Acidobacterium, an uncultured Acidobacteria bacterium and an unidentified Acidobacterium are found after 2-CP-acclimation.

  20. Surgery of adult bilateral vocal fold paralysis in adduction: history and trends.

    PubMed

    Sapundzhiev, Nikolay; Lichtenberger, György; Eckel, Hans Edmund; Friedrich, Gerhard; Zenev, Ivan; Toohill, Robert J; Werner, Jochen Alfred

    2008-12-01

    Bilateral vocal fold paralysis (BVFP) in adduction is characterised by inspiratory dyspnea, due to the paramedian position of the vocal folds with narrowing of the airway at the glottic level. The condition is often life threatening and therefore requires surgical intervention to prevent acute asphyxiation or pulmonary consequences of chronic airway obstruction. Aside from corticosteroid administration and intubation, which are only temporary measures, the standard approach for improving respiration is to perform a tracheotomy. Over the past century, a wide variety of surgical interventions has been developed and applied to restore the patency of the airway and achieve decannulation. Surgeons can generally choose for every individual patient from various well-established treatment options with a predictable outcome. An overview of the surgical techniques for laryngeal airway enlargement in BVFP is presented. Included are operative techniques which have found application in clinical practice, and only to a small extent those restricted to purely anatomic or animal studies. The focus is on two major groups of interventions - temporary and definitive glottic enlargement. The major types of interventions include the following: (1) resection of anatomical structures; (2) retailoring and displacing the existing structures, with minimal tissue removal; (3) displacing existing structures without tissue resection; (4) restoration or substitution of the missing innervation of the laryngeal musculature. These four major types of interventions have always followed the development of medical equipment and anaesthesia. At the beginning of the twentieth century, when medicine was unable to counteract surgical infection, endoscopic or extramucosal surgical techniques were dominant. In the 1950s, microscopic endoscopic laryngeal surgery boomed.
At the end of the twentieth century many of the classical endoscopic operations were performed either with the help of surgical lasers alone, or in combination with other interventions.

  1. Scalable graphene production: perspectives and challenges of plasma applications

    NASA Astrophysics Data System (ADS)

    Levchenko, Igor; Ostrikov, Kostya (Ken); Zheng, Jie; Li, Xingguo; Keidar, Michael; B. K. Teo, Kenneth

    2016-05-01

    Graphene, a newly discovered and extensively investigated material, has many unique and extraordinary properties which promise major technological advances in fields ranging from electronics to mechanical engineering and food production. Unfortunately, complex techniques and high production costs hinder commonplace applications. Scaling of existing graphene production techniques to the industrial level without compromising its properties is a current challenge. This article focuses on the perspectives and challenges of scalability, equipment, and technological perspectives of the plasma-based techniques which offer many unique possibilities for the synthesis of graphene and graphene-containing products. The plasma-based processes are amenable for scaling and could also be useful to enhance the controllability of the conventional chemical vapour deposition method and some other techniques, and to ensure a good quality of the produced graphene. We examine the unique features of the plasma-enhanced graphene production approaches, including the techniques based on inductively-coupled and arc discharges, in the context of their potential scaling to mass production following the generic scaling approaches applicable to the existing processes and systems. This work analyses a large amount of the recent literature on graphene production by various techniques and summarizes the results in a tabular form to provide a simple and convenient comparison of several available techniques. Our analysis reveals a significant potential of scalability for plasma-based technologies, based on the scaling-related process characteristics. Among other processes, a greater yield of 1 g × h⁻¹ m⁻² was reached for the arc discharge technology, whereas the other plasma-based techniques show process yields comparable to the neutral-gas based methods. 
Selected plasma-based techniques show lower energy consumption than in thermal CVD processes, and the ability to produce graphene flakes of various sizes reaching hundreds of square millimetres, and the thickness varying from a monolayer to 10-20 layers. Additional factors such as electrical voltage and current, not available in thermal CVD processes could potentially lead to better scalability, flexibility and control of the plasma-based processes. Advantages and disadvantages of various systems are also considered.

  2. Scalable graphene production: perspectives and challenges of plasma applications.

    PubMed

    Levchenko, Igor; Ostrikov, Kostya Ken; Zheng, Jie; Li, Xingguo; Keidar, Michael; B K Teo, Kenneth

    2016-05-19

    Graphene, a newly discovered and extensively investigated material, has many unique and extraordinary properties which promise major technological advances in fields ranging from electronics to mechanical engineering and food production. Unfortunately, complex techniques and high production costs hinder commonplace applications. Scaling of existing graphene production techniques to the industrial level without compromising its properties is a current challenge. This article focuses on the perspectives and challenges of scalability, equipment, and technological perspectives of the plasma-based techniques which offer many unique possibilities for the synthesis of graphene and graphene-containing products. The plasma-based processes are amenable for scaling and could also be useful to enhance the controllability of the conventional chemical vapour deposition method and some other techniques, and to ensure a good quality of the produced graphene. We examine the unique features of the plasma-enhanced graphene production approaches, including the techniques based on inductively-coupled and arc discharges, in the context of their potential scaling to mass production following the generic scaling approaches applicable to the existing processes and systems. This work analyses a large amount of the recent literature on graphene production by various techniques and summarizes the results in a tabular form to provide a simple and convenient comparison of several available techniques. Our analysis reveals a significant potential of scalability for plasma-based technologies, based on the scaling-related process characteristics. Among other processes, a greater yield of 1 g × h⁻¹ m⁻² was reached for the arc discharge technology, whereas the other plasma-based techniques show process yields comparable to the neutral-gas based methods. 
Selected plasma-based techniques show lower energy consumption than in thermal CVD processes, and the ability to produce graphene flakes of various sizes reaching hundreds of square millimetres, and the thickness varying from a monolayer to 10-20 layers. Additional factors such as electrical voltage and current, not available in thermal CVD processes could potentially lead to better scalability, flexibility and control of the plasma-based processes. Advantages and disadvantages of various systems are also considered.

  3. Comparison of Metal-Backed Free-Space and Open-Ended Coaxial Probe Techniques for the Dielectric Characterization of Aeronautical Composites †

    PubMed Central

    López-Rodríguez, Patricia; Escot-Bocanegra, David; Poyatos-Martínez, David; Weinmann, Frank

    2016-01-01

    The trend in the last few decades is that current unmanned aerial vehicles are completely made of composite materials rather than metallic ones, such as carbon-fiber or fiberglass composites. From the electromagnetic point of view, this fact forces engineers and scientists to assess how these materials may affect their radar response or their electronics in terms of electromagnetic compatibility. In order to evaluate this, electromagnetic characterization of different composite materials has become a need. Several techniques exist to perform this characterization, all of them based on the utilization of different sensors for measuring different parameters. In this paper, an implementation of the metal-backed free-space technique, based on the employment of antenna probes, is utilized for the characterization of composite materials that belong to an actual drone. Their extracted properties are compared with those given by a commercial solution, an open-ended coaxial probe (OECP). The discrepancies found between the two techniques, together with a further evaluation of the methodologies including measurements with a split-cavity resonator, lead to the conclusion that the implemented free-space technique provides more reliable results for this kind of composite than the OECP technique. PMID:27347966

  4. The Coordinate Orthogonality Check (corthog)

    NASA Astrophysics Data System (ADS)

    Avitabile, P.; Pechinsky, F.

    1998-05-01

    A new technique referred to as the coordinate orthogonality check (CORTHOG) helps to identify how each physical degree of freedom contributes to the overall orthogonality relationship between analytical and experimental modal vectors on a mass-weighted basis. Using the CORTHOG technique together with the pseudo-orthogonality check (POC) clarifies where potential discrepancies exist between the analytical and experimental modal vectors. CORTHOG improves the understanding of the correlation (or lack of correlation) that exists between modal vectors. The CORTHOG theory is presented along with the evaluation of several cases to show the use of the technique.
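    The mass-weighted orthogonality behind the POC and CORTHOG can be sketched numerically. Assuming a diagonal (lumped) mass matrix, each POC entry is a sum over physical degrees of freedom, and the idea of CORTHOG is to inspect those per-DOF summands; the 3-DOF mass and mode-shape values below are invented for illustration.

```python
# Sketch of the pseudo-orthogonality check (POC) and a per-DOF breakdown in
# the spirit of CORTHOG: with a diagonal mass matrix M, one POC entry is
#   POC = sum_k M[k] * phi_a[k] * phi_e[k]
# so the k-th summand is the contribution of physical DOF k to the entry.

def poc_entry_contributions(mass_diag, phi_a_col, phi_e_col):
    """Per-DOF contributions to one POC entry; they sum to the entry itself."""
    return [m * a * e for m, a, e in zip(mass_diag, phi_a_col, phi_e_col)]

mass_diag = [2.0, 1.0, 3.0]     # diagonal of the mass matrix M (invented)
phi_a_col = [0.5, 0.7, 0.2]     # one analytical mode shape (invented)
phi_e_col = [0.48, 0.72, 0.19]  # the paired experimental mode shape (invented)

contrib = poc_entry_contributions(mass_diag, phi_a_col, phi_e_col)
poc = sum(contrib)
print(contrib, poc)  # each DOF's share, and the full POC entry
```

    When analytical and experimental vectors disagree, scanning the per-DOF contributions shows which physical coordinates drive the discrepancy, rather than only reporting the aggregate POC value.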

  5. Management of fluid mud in estuaries, bays, and lakes. II: Measurement, modeling, and management

    USGS Publications Warehouse

    McAnally, W.H.; Teeter, A.; Schoellhamer, David H.; Friedrichs, C.; Hamilton, D.; Hayter, E.; Shrestha, P.; Rodriguez, H.; Sheremet, A.; Kirby, R.

    2007-01-01

    Techniques for measurement, modeling, and management of fluid mud are available, but research is needed to improve them. Fluid mud can be difficult to detect, measure, or sample, which has led to new instruments and new ways of using existing instruments. Multifrequency acoustic fathometers sense neither density nor viscosity and are, therefore, unreliable in measuring fluid mud. Nuclear density probes, towed sleds, seismic methods, and drop probes equipped with density meters offer the potential for accurate measurements. Numerical modeling of fluid mud requires solving governing equations for flow velocity, density, pressure, salinity, and the water surface, plus sediment submodels. A number of such models exist in one-, two-, and three-dimensional form, but they rely on empirical relationships that require substantial site-specific validation against observations. Fluid mud management techniques can be classified as those that accomplish source control, formation control, and removal. Nautical depth, a fourth category, defines the channel bottom as a specific fluid mud density or alternative parameter that is safe for navigation. Source control includes watershed management measures to keep fine sediment out of waterways and in-water measures such as structures and traps. Formation control methods include streamlined channels and structures plus other measures to reduce flocculation and structures that train currents. Removal methods include the traditional dredging and transport of dredged material plus agitation that contributes to formation control and/or nautical depth. Conditioning of fluid mud by dredging and aerating offers the possibility of improved navigability. Two examples, the Atchafalaya Bar Channel and Savannah Harbor, illustrate the use of measurements and management of fluid mud.

  6. High-speed real-time animated displays on the ADAGE (trademark) RDS 3000 raster graphics system

    NASA Technical Reports Server (NTRS)

    Kahlbaum, William M., Jr.; Ownbey, Katrina L.

    1989-01-01

    Techniques which may be used to increase the animation update rate of real-time computer raster graphic displays are discussed. They were developed on the ADAGE RDS 3000 graphic system in support of the Advanced Concepts Simulator at the NASA Langley Research Center. These techniques involve the use of a special-purpose parallel processor for high-speed character generation. The description of the parallel processor includes the Barrel Shifter, which is part of the hardware and is the key to the high-speed character rendition. The final result of this total effort was a fourfold increase in the update rate of an existing primary flight display, from 4 to 16 frames per second.
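    The role of a barrel shifter can be illustrated in software: it rotates an N-bit word by any amount using log2(N) conditional power-of-two stages, all of which combinational hardware evaluates at once. The sketch below is a generic software model of the concept, not the ADAGE hardware design.

```python
# Illustrative software model of a barrel shifter. Each stage conditionally
# rotates by a power-of-two amount; in hardware all stages are multiplexers
# evaluated combinationally, which is what makes per-pixel character
# positioning fast.

WIDTH = 32

def rotate_left(word, amount, width=WIDTH):
    """Rotate `word` left by `amount` bits via power-of-two stages."""
    mask = (1 << width) - 1
    stage = 0
    while (1 << stage) < width:
        if amount & (1 << stage):  # this stage's mux selects the rotated path
            n = 1 << stage
            word = ((word << n) | (word >> (width - n))) & mask
        stage += 1
    return word
```

A 32-bit shifter needs only five such stages to cover every shift amount from 0 to 31.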

  7. High resolution optical DNA mapping

    NASA Astrophysics Data System (ADS)

    Baday, Murat

    Many types of diseases, including cancer and autism, are associated with copy-number variations in the genome. Most of these variations could not be identified with existing sequencing and optical DNA mapping methods. We have developed a multi-color super-resolution technique, with potential for high throughput and low cost, which allows us to recognize more of these variations. Our technique has achieved a 10-fold improvement in the resolution of optical DNA mapping. Using a 180 kb BAC clone as a model system, we resolved dense patterns from 108 fluorescent labels of two different colors representing two different sequence motifs. Overall, a detailed DNA map with 100 bp resolution was achieved, which has the potential to reveal detailed information about genetic variance and to facilitate medical diagnosis of genetic disease.

  8. Highly precise Re-Os dating for molybdenite using alkaline fusion and NTIMS.

    PubMed

    Markey, R; Stein, H; Morgan, J

    1998-03-01

    The technique described in this paper represents the modification and combination of two previously existing methods, alkaline fusion and negative thermal ion mass spectrometry (NTIMS). We have used this technique to analyze repeatedly a homogeneous molybdenite powder used as a reference standard in our laboratory. Analyses were made over a period of 18 months, using four different calibrations of two different spike solutions. The age of this standard reproduces at a level of +/-0.13%. Each individual age analysis carries an uncertainty of about 0.4% that includes the uncertainty in the decay constant for (187)Re. This new level of resolution has allowed us to recognize real differences in ages for two grain-size populations of molybdenite from some Archean samples.
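    The model age behind such Re-Os analyses follows directly from the decay law. A minimal sketch, assuming the commonly cited 187Re decay constant of 1.666e-11 per year and the usual molybdenite assumption of no initial radiogenic Os:

```python
import math

# Commonly cited decay constant for 187Re (in 1/yr); the abstract notes that
# its uncertainty dominates the ~0.4% uncertainty of an individual analysis.
LAMBDA_RE187 = 1.666e-11

def model_age_ma(os187_re187):
    """Molybdenite Re-Os model age in Ma, assuming no initial radiogenic Os:
    t = ln(1 + 187Os/187Re) / lambda."""
    return math.log1p(os187_re187) / LAMBDA_RE187 / 1e6
```

For Archean samples the measured 187Os/187Re ratio is large enough that a 0.13% reproducibility translates into an age resolution of a few million years.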

  9. Highly precise Re-Os dating for molybdenite using alkaline fusion and NTIMS

    USGS Publications Warehouse

    Markey, R.; Stein, H.; Morgan, J.

    1998-01-01

    The technique described in this paper represents the modification and combination of two previously existing methods, alkaline fusion and negative thermal ion mass spectrometry (NTIMS). We have used this technique to analyze repeatedly a homogeneous molybdenite powder used as a reference standard in our laboratory. Analyses were made over a period of 18 months, using four different calibrations of two different spike solutions. The age of this standard reproduces at a level of ±0.13%. Each individual age analysis carries an uncertainty of about 0.4% that includes the uncertainty in the decay constant for 187Re. This new level of resolution has allowed us to recognize real differences in ages for two grain-size populations of molybdenite from some Archean samples.

  10. Oceanographic applications of laser technology

    NASA Technical Reports Server (NTRS)

    Hoge, F. E.

    1988-01-01

    Oceanographic activities with the Airborne Oceanographic Lidar (AOL) for the past several years have primarily been focused on using active (laser-induced pigment fluorescence) and concurrent passive ocean color spectra to improve existing ocean color algorithms for estimating primary production in the world's oceans. The most significant results were the development of a technique for selecting optimal passive wavelengths for recovering phytoplankton photopigment concentration and the application of this technique, termed active-passive correlation spectroscopy (APCS), to various forms of passive ocean color algorithms. Included in this activity is the use of airborne laser and passive ocean color measurements in the development of advanced satellite ocean color sensors. Promising on-wavelength subsurface scattering layer measurements were recently obtained. A partial summary of these results is presented.
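    The core idea of active-passive correlation spectroscopy, correlating each passive ocean-color band with the concurrent laser-induced fluorescence signal and selecting the best-correlated band, can be sketched as follows. This is a simplified illustration, not the AOL processing chain; `best_passive_band` is a hypothetical helper.

```python
import numpy as np

def best_passive_band(fluorescence, passive_spectra):
    """Pick the passive ocean-color band whose along-track signal correlates
    best (in absolute value) with the laser-induced pigment fluorescence.
    fluorescence: (n_samples,); passive_spectra: (n_samples, n_bands).
    Returns (best_band_index, per-band correlation coefficients)."""
    r = np.array([np.corrcoef(fluorescence, passive_spectra[:, b])[0, 1]
                  for b in range(passive_spectra.shape[1])])
    return int(np.argmax(np.abs(r))), r
```

In practice the selected band (or band ratio) then feeds a purely passive pigment-retrieval algorithm, which is what makes the approach transferable to satellite sensors.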

  11. Control techniques for an automated mixed traffic vehicle

    NASA Technical Reports Server (NTRS)

    Meisenholder, G. W.; Johnston, A. R.

    1977-01-01

    The paper describes an automated mixed traffic vehicle (AMTV), a driverless low-speed tram designed to operate in mixed pedestrian and vehicular traffic. The vehicle is a six-passenger electric tram equipped with sensing and control which permit it to function on existing streets in an automatic mode. The design includes established wire-following techniques for steering and near-IR headway sensors. A 7-mph cruise speed is reduced to 2 mph or a complete stop in response to sensor (or passenger) inputs. The AMTV performance is evaluated by operation on a loop route and by simulation. Some necessary improvements involving sensors, sensor pattern, use of an audible signal, and control lag are discussed. It is suggested that appropriate modifications will eliminate collision incidents.
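    The speed logic described (a 7-mph cruise reduced to 2 mph or a complete stop in response to sensor or passenger inputs) can be sketched as a simple decision rule. The headway thresholds below are assumptions for illustration; the paper does not give them.

```python
def commanded_speed_mph(headway_ft, passenger_stop=False):
    """Illustrative AMTV speed command (thresholds are assumed, not from the
    paper): cruise at 7 mph, slow to 2 mph when the near-IR headway sensor
    detects an obstacle ahead, and stop for a close obstacle or a passenger
    stop request."""
    if passenger_stop or headway_ft < 10:
        return 0
    if headway_ft < 30:
        return 2
    return 7
```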

  12. A NASA Perspective and Validation and Testing of Design Hardening for the Natural Space Radiation Environment (GOMAC Tech 03)

    NASA Technical Reports Server (NTRS)

    Day, John H. (Technical Monitor); LaBel, Kenneth A.; Howard, James W.; Carts, Martin A.; Seidleck, Christine

    2003-01-01

    With the dearth of dedicated radiation-hardened foundries, new and novel techniques are being developed for hardening designs using non-dedicated foundry services. In this paper, we will discuss the implications of validating these methods for the natural space radiation environment issues: total ionizing dose (TID) and single event effects (SEE). Topics of discussion include: the types of tests that are required; design coverage (i.e., design libraries: do they need validating for each application?); and a new task within NASA to compare existing designs. This latter task is a new effort in FY03 utilizing an 8051 microcontroller core from multiple design hardening developers as a test vehicle to evaluate each mitigative technique.

  13. [The future of forensic DNA analysis for criminal justice].

    PubMed

    Laurent, François-Xavier; Vibrac, Geoffrey; Rubio, Aurélien; Thévenot, Marie-Thérèse; Pène, Laurent

    2017-11-01

    In the criminal framework, the analysis of approximately 20 DNA microsatellites enables the establishment of a genetic profile with a high statistical power of discrimination. This technique gives us the possibility to establish or exclude a match between a biological trace detected at a crime scene and a suspect whose DNA was collected via an oral swab. However, conventional techniques tend to complicate the interpretation of complex DNA samples, such as degraded DNA and DNA mixtures. The aim of this review is to highlight the power of new forensic DNA methods (including high-throughput sequencing and single-cell sequencing) to facilitate expert interpretation in full compliance with existing French legislation. © 2017 médecine/sciences – Inserm.

  14. Comparison of analysis and flight test data for a drone aircraft with active flutter suppression

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Pototzky, A. S.

    1981-01-01

    A drone aircraft equipped with an active flutter suppression system is considered with emphasis on the comparison of modal dampings and frequencies as a function of Mach number. Results are presented for both symmetric and antisymmetric motion with flutter suppression off. Only symmetric results are given for flutter suppression on. Frequency response functions of the vehicle are presented from both flight test data and analysis. The analysis correlation is improved by using an empirical aerodynamic correction factor which is proportional to the ratio of experimental to analytical steady-state lift curve slope. The mathematical models are included and existing analytical techniques are described as well as an alternative analytical technique for obtaining closed-loop results.

  15. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions, as well as Monte Carlo analysis capability, is included to enable statistical performance evaluations.
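    The kind of Kalman filtering such a toolkit builds on can be illustrated with a minimal scalar filter. This is a generic textbook sketch, not the toolkit's desensitized or sigma-point variants:

```python
def kalman_1d(measurements, r, q=1e-5, x0=0.0, p0=1.0):
    """Minimal scalar Kalman filter for a (nearly) constant state.
    r = measurement noise variance, q = process noise variance,
    x0/p0 = initial state estimate and its variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: state unchanged, uncertainty grows
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with the innovation
        p = (1 - k) * p
        estimates.append(x)
    return estimates
```

The "desensitized" variant mentioned in the abstract additionally penalizes the estimate's sensitivity to uncertain model parameters when choosing the gain, trading some nominal optimality for robustness.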

  16. Numerical Modelling of Extended Leak-Off Test with a Pre-Existing Fracture

    NASA Astrophysics Data System (ADS)

    Lavrov, A.; Larsen, I.; Bauer, A.

    2016-04-01

    Extended leak-off test (XLOT) is one of the few techniques available for stress measurements in oil and gas wells. Interpretation of the test is often difficult since the results depend on a multitude of factors, including the presence of natural or drilling-induced fractures in the near-well area. Coupled numerical modelling of XLOT has been performed to investigate the pressure behaviour during the flowback phase as well as the effect of a pre-existing fracture on the test results in a low-permeability formation. Essential features of XLOT known from field measurements are captured by the model, including the saw-tooth shape of the pressure vs injected volume curve, and the change of slope in the pressure vs time curve during flowback used by operators as an indicator of the bottomhole pressure reaching the minimum in situ stress. Simulations with a pre-existing fracture running from the borehole wall in the radial direction have revealed that the results of XLOT are quite sensitive to the orientation of the pre-existing fracture. In particular, the fracture initiation pressure and the formation breakdown pressure increase steadily with decreasing angle between the fracture and the minimum in situ stress. Our findings seem to invalidate the use of the fracture initiation pressure and the formation breakdown pressure for stress measurements or rock strength evaluation purposes.

  17. A review of DTCA techniques: Appraising their success and potential impact on medication users.

    PubMed

    Babar, Zaheer-Ud-Din; Siraj, Ashna Medina; Curley, Louise

    2018-03-01

    Direct-to-consumer advertising (DTCA) has been present in some countries for nearly two decades. Its success and ramifications have been examined but have not recently been cataloged in a comprehensive manner. To review existing literature studies on the topic of DTCA techniques to provide an analysis of the current methods considered by drug marketers to enhance the effect of pharmaceutical product promotion and its success, as well as to examine ramifications for the drug use process. A search of 7 electronic databases including MEDLINE and SCOPUS was conducted in December 2015, and updated until February 2016. A scientific review of the literature (2008-2015) was performed to identify and collate information from relevant, peer-reviewed original study articles investigating various DTCA techniques commonly employed in pharmaceutical promotion. A thematic analysis was undertaken to identify categories of drug promotion techniques and to assess their salience and impact. Nineteen original study articles were included in this review. All articles were based in the U.S. and New Zealand, where DTCA is legal. After reviewing all the articles, 4 themes with 11 subcategories were generated. These themes included disease mongering and medicalization, drug references, advertisement strategies, and eDTCA. The themes describe different categories of techniques used to augment DTC advertisements to increase their impact and overall success in promoting a pharmaceutical product. Many DTCA techniques utilized by pharmaceutical marketers are beneficial to the success of DTC promotion of a drug. These techniques include the use of drug efficacy information, comparative claims, non-branded help-seeking advertisements, formatted risk information, celebrity or expert endorsers, and website trust factors.
    Through their use, public perception of the drug is made more favorable, increased attention is drawn to the advertisement, and the pharmaceutical product gains greater credibility and subsequent success in sales. However, some techniques, although beneficial to pharmaceutical promotion, need to be monitored by policymakers and regulatory advisors, as they have the potential to negatively impact consumer health knowledge. Overall, through this review it is evident that there are a number of techniques employed by pharmaceutical marketers to augment the success of pharmaceutical promotion. While these techniques may be beneficial to pharmaceutical companies and might increase awareness amongst consumers, it is important to be critical of them, as they have the potential to be exploited by pharmaceutical marketers. This review indicated that although some techniques are successful and appear to be satisfactory in providing information to consumers, other techniques need to be appraised more closely. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Single-molecule techniques in biophysics: a review of the progress in methods and applications.

    PubMed

    Miller, Helen; Zhou, Zhaokun; Shepherd, Jack; Wollman, Adam J M; Leake, Mark C

    2018-02-01

    Single-molecule biophysics has transformed our understanding of biology, but also of the physics of life. More exotic than simple soft matter, biomatter lives far from thermal equilibrium, covering multiple lengths from the nanoscale of single molecules to up to several orders of magnitude higher in cells, tissues and organisms. Biomolecules are often characterized by underlying instability: multiple metastable free energy states exist, separated by levels of just a few multiples of the thermal energy scale k_B T, where k_B is the Boltzmann constant and T the absolute temperature, implying complex inter-conversion kinetics in the relatively hot, wet environment of active biological matter. A key benefit of single-molecule biophysics techniques is their ability to probe heterogeneity of free energy states across a molecular population, too challenging in general for conventional ensemble average approaches. Parallel developments in experimental and computational techniques have catalysed the birth of multiplexed, correlative techniques to tackle previously intractable biological questions. Experimentally, progress has been driven by improvements in sensitivity and speed of detectors, and the stability and efficiency of light sources, probes and microfluidics. We discuss the motivation and requirements for these recent experiments, including the underpinning mathematics. These methods are broadly divided into tools which detect molecules and those which manipulate them. For the former we discuss the progress of super-resolution microscopy, transformative for addressing many longstanding questions in the life sciences, and for the latter we include progress in 'force spectroscopy' techniques that mechanically perturb molecules. We also consider in silico progress of single-molecule computational physics, and how simulation and experimentation may be drawn together to give a more complete understanding. 
Increasingly, combinatorial techniques are now used, including correlative atomic force microscopy and fluorescence imaging, to probe questions closer to native physiological behaviour. We identify the trade-offs, limitations and applications of these techniques, and discuss exciting new directions.

  19. Single-molecule techniques in biophysics: a review of the progress in methods and applications

    NASA Astrophysics Data System (ADS)

    Miller, Helen; Zhou, Zhaokun; Shepherd, Jack; Wollman, Adam J. M.; Leake, Mark C.

    2018-02-01

    Single-molecule biophysics has transformed our understanding of biology, but also of the physics of life. More exotic than simple soft matter, biomatter lives far from thermal equilibrium, covering multiple lengths from the nanoscale of single molecules to up to several orders of magnitude higher in cells, tissues and organisms. Biomolecules are often characterized by underlying instability: multiple metastable free energy states exist, separated by levels of just a few multiples of the thermal energy scale k_B T, where k_B is the Boltzmann constant and T the absolute temperature, implying complex inter-conversion kinetics in the relatively hot, wet environment of active biological matter. A key benefit of single-molecule biophysics techniques is their ability to probe heterogeneity of free energy states across a molecular population, too challenging in general for conventional ensemble average approaches. Parallel developments in experimental and computational techniques have catalysed the birth of multiplexed, correlative techniques to tackle previously intractable biological questions. Experimentally, progress has been driven by improvements in sensitivity and speed of detectors, and the stability and efficiency of light sources, probes and microfluidics. We discuss the motivation and requirements for these recent experiments, including the underpinning mathematics. These methods are broadly divided into tools which detect molecules and those which manipulate them. For the former we discuss the progress of super-resolution microscopy, transformative for addressing many longstanding questions in the life sciences, and for the latter we include progress in ‘force spectroscopy’ techniques that mechanically perturb molecules. We also consider in silico progress of single-molecule computational physics, and how simulation and experimentation may be drawn together to give a more complete understanding. 
Increasingly, combinatorial techniques are now used, including correlative atomic force microscopy and fluorescence imaging, to probe questions closer to native physiological behaviour. We identify the trade-offs, limitations and applications of these techniques, and discuss exciting new directions.

  20. The pre-image problem for Laplacian Eigenmaps utilizing L 1 regularization with applications to data fusion

    NASA Astrophysics Data System (ADS)

    Cloninger, Alexander; Czaja, Wojciech; Doster, Timothy

    2017-07-01

    As the popularity of non-linear manifold learning techniques such as kernel PCA and Laplacian Eigenmaps grows, vast improvements have been seen in many areas of data processing, including heterogeneous data fusion and integration. One problem with the non-linear techniques, however, is the lack of an easily calculable pre-image. Existence of such pre-image would allow visualization of the fused data not only in the embedded space, but also in the original data space. The ability to make such comparisons can be crucial for data analysts and other subject matter experts who are the end users of novel mathematical algorithms. In this paper, we propose a pre-image algorithm for Laplacian Eigenmaps. Our method offers major improvements over existing techniques, which allow us to address the problem of noisy inputs and the issue of how to calculate the pre-image of a point outside the convex hull of training samples; both of which have been overlooked in previous studies in this field. We conclude by showing that our pre-image algorithm, combined with feature space rotations, allows us to recover occluded pixels of an imaging modality based off knowledge of that image measured by heterogeneous modalities. We demonstrate this data recovery on heterogeneous hyperspectral (HS) cameras, as well as by recovering LIDAR measurements from HS data.
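    A minimal sketch of the setting: a dense Laplacian Eigenmaps embedding plus a naive nearest-neighbor pre-image heuristic. This is illustrative only; the paper's contribution is an L1-regularized pre-image algorithm, which is not reproduced here.

```python
import numpy as np

def laplacian_eigenmaps(X, dim=2, sigma=1.0):
    """Dense heat-kernel Laplacian Eigenmaps embedding of the rows of X,
    via the symmetric normalized graph Laplacian."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))       # heat-kernel affinities
    np.fill_diagonal(W, 0.0)
    d_is = 1.0 / np.sqrt(W.sum(1))
    L_sym = np.eye(len(X)) - (d_is[:, None] * W) * d_is[None, :]
    vals, vecs = np.linalg.eigh(L_sym)         # eigenvalues ascending
    return vecs[:, 1:dim + 1]                  # skip the trivial eigenvector

def naive_preimage(y, Y_train, X_train, k=5):
    """Naive pre-image heuristic (an assumed baseline, not the paper's L1
    method): average the k training inputs whose embeddings are nearest to y."""
    idx = np.argsort(((Y_train - y) ** 2).sum(1))[:k]
    return X_train[idx].mean(0)
```

The pre-image step is the hard part: a point in the embedded space generally has no exact inverse, which is why regularized formulations are needed for points outside the convex hull of the training samples.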

  1. New opportunities in quasi elastic neutron scattering spectroscopy

    NASA Astrophysics Data System (ADS)

    Mezei, F.; Russina, M.

    2001-07-01

    The high energy resolution usually required in quasi elastic neutron scattering (QENS) spectroscopy is commonly achieved by the use of cold neutrons. This is one of the important research areas where the majority of current work is done on instruments at continuous reactor sources. One particular reason for this is the capability of continuous-source time-of-flight spectrometers to use instrumental parameters optimally adapted for best data collection efficiency in each experiment. These parameters include the pulse repetition rate and the length of the pulses, chosen to achieve an optimal balance between resolution and intensity. In addition, the disc chopper systems used provide perfectly symmetrical line shapes with no tails and low background. The recent development of a set of novel techniques enhances the efficiency of cold neutron spectroscopy on existing and future spallation sources in a dramatic fashion. These techniques involve the use of extended-pulse-length, high-intensity coupled moderators, disc chopper systems and advanced neutron optical beam delivery, and they will enable the Lujan Center at Los Alamos to surpass the best existing reactor instruments in time-of-flight QENS work by more than one order of magnitude in terms of beam flux on the sample. Other applications of the same techniques will allow us to combine the advantages of backscattering spectroscopy on continuous and pulsed sources in order to deliver μeV resolution over a very broad energy transfer range.

  2. Effects of mechanical loading on human mesenchymal stem cells for cartilage tissue engineering.

    PubMed

    Choi, Jane Ru; Yong, Kar Wey; Choi, Jean Yu

    2018-03-01

    Today, articular cartilage damage is a major health problem, affecting people of all ages. The existing conventional articular cartilage repair techniques, such as autologous chondrocyte implantation (ACI), microfracture, and mosaicplasty, have many shortcomings which negatively affect their clinical outcomes. Therefore, it is essential to develop an alternative and efficient articular repair technique that can address those shortcomings. Cartilage tissue engineering, which aims to create a tissue-engineered cartilage derived from human mesenchymal stem cells (MSCs), shows great promise for improving articular cartilage defect therapy. However, the use of tissue-engineered cartilage for the clinical therapy of articular cartilage defects still remains challenging. Although the importance of mechanical loading in creating a functional cartilage has been well demonstrated, the specific type of mechanical loading and its optimal loading regime are still under investigation. This review summarizes the most recent advances in the effects of mechanical loading on human MSCs. First, the existing conventional articular repair techniques and their shortcomings are highlighted. The important parameters for the evaluation of the tissue-engineered cartilage, including chondrogenic and hypertrophic differentiation of human MSCs, are briefly discussed. The influence of mechanical loading on human MSCs is subsequently reviewed and the possible mechanotransduction signaling is highlighted. The development of non-hypertrophic chondrogenesis in response to the changing mechanical microenvironment will aid in the establishment of a tissue-engineered cartilage for efficient articular cartilage repair. © 2017 Wiley Periodicals, Inc.

  3. Application of Behavior Change Techniques in a Personalized Nutrition Electronic Health Intervention Study: Protocol for the Web-Based Food4Me Randomized Controlled Trial.

    PubMed

    Macready, Anna L; Fallaize, Rosalind; Butler, Laurie T; Ellis, Judi A; Kuznesof, Sharron; Frewer, Lynn J; Celis-Morales, Carlos; Livingstone, Katherine M; Araújo-Soares, Vera; Fischer, Arnout Rh; Stewart-Knox, Barbara J; Mathers, John C; Lovegrove, Julie A

    2018-04-09

    To determine the efficacy of behavior change techniques applied in dietary and physical activity intervention studies, it is first necessary to record and describe techniques that have been used during such interventions. Published frameworks used in dietary and smoking cessation interventions undergo continuous development, and most are not adapted for Web-based delivery. The Food4Me study (N=1607) provided the opportunity to use existing frameworks to describe standardized Web-based techniques employed in a large-scale, internet-based intervention to change dietary behavior and physical activity. The aims of this study were (1) to describe techniques embedded in the Food4Me study design and explain the selection rationale and (2) to demonstrate the use of behavior change technique taxonomies, develop standard operating procedures for training, and identify strengths and limitations of the Food4Me framework that will inform its use in future studies. The 6-month randomized controlled trial took place simultaneously in seven European countries, with participants receiving one of four levels of personalized advice (generalized, intake-based, intake+phenotype-based, and intake+phenotype+gene-based). A three-phase approach was taken: (1) existing taxonomies were reviewed and techniques were identified a priori for possible inclusion in the Food4Me study, (2) a standard operating procedure was developed to maintain consistency in the use of methods and techniques across research centers, and (3) the Food4Me behavior change technique framework was reviewed and updated post intervention. An analysis of excluded techniques was also conducted. Of 46 techniques identified a priori as being applicable to Food4Me, 17 were embedded in the intervention design; 11 were from a dietary taxonomy, and 6 from a smoking cessation taxonomy. In addition, the four-category smoking cessation framework structure was adopted for clarity of communication. 
Smoking cessation texts were adapted for dietary use where necessary. A posteriori, a further 9 techniques were included. Examination of excluded items highlighted the distinction between techniques considered appropriate for face-to-face versus internet-based delivery. The use of existing taxonomies facilitated the description and standardization of techniques used in Food4Me. We recommend that for complex studies of this nature, technique analysis should be conducted a priori to develop standardized procedures and training and reviewed a posteriori to audit the techniques actually adopted. The present framework description makes a valuable contribution to future systematic reviews and meta-analyses that explore technique efficacy and underlying psychological constructs. This was a novel application of the behavior change taxonomies and was the first internet-based personalized nutrition intervention to use such a framework remotely. ClinicalTrials.gov NCT01530139; https://clinicaltrials.gov/ct2/show/NCT01530139 (Archived by WebCite at http://www.webcitation.org/6y8XYUft1). ©Anna L Macready, Rosalind Fallaize, Laurie T Butler, Judi A Ellis, Sharron Kuznesof, Lynn J Frewer, Carlos Celis-Morales, Katherine M Livingstone, Vera Araújo-Soares, Arnout RH Fischer, Barbara J Stewart-Knox, John C Mathers, Julie A Lovegrove. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 09.04.2018.

  4. [Key informers. When and How?].

    PubMed

    Martín González, R

    2009-03-01

    When information obtained through duly designed and developed studies is not available, the solution to certain problems that affect the population or that respond to certain questions may be approached by using the information and experience provided by the so-called key informer. The key informer is defined as a person who is in contact with the community or with the problem to be studied, who is considered to have good knowledge of the situation and therefore who is considered an expert. The search for consensus is the basis to obtain information through the key informers. The techniques used have different characteristics based on whether the experts chosen meet together or not, whether they are guided or not, whether they interact with each other or not. These techniques include the survey, the Delphi technique, the nominal group technique, brainwriting, brainstorming, the Phillips 66 technique, the 6-3-5 technique, the community forum and the community impressions technique. Information provided by key informers through the search for consensus is relevant when this is not available or cannot be obtained by other methods. It has permitted the analysis of the existing neurological care model, elaboration of recommendations on visit times for the out-patient neurological care, and the elaboration of guidelines and recommendations for the management of prevalent neurological problems.

  5. Application of filtering techniques in preprocessing magnetic data

    NASA Astrophysics Data System (ADS)

    Liu, Haijun; Yi, Yongping; Yang, Hongxia; Hu, Guochuang; Liu, Guoming

    2010-08-01

    High-precision magnetic exploration is a popular geophysical technique because of its simplicity and effectiveness. Interpretation in high-precision magnetic exploration is always difficult because of noise and disturbance factors, so it is necessary to find an effective preprocessing method to remove the influence of interference factors before further processing. The common way to do this is filtering, and many kinds of filtering methods exist. In this paper we introduce in detail three popular filtering techniques: the regularized filtering technique, the sliding-average filtering technique, and the compensation smoothing filtering technique. We then designed the workflow of a filtering program based on these techniques and implemented it in DELPHI. To verify it, we applied it to preprocess magnetic data from an area in China. Comparing the initial contour map with the filtered contour map clearly shows the effectiveness of our program: the filtered contour map is very smooth, and the high-frequency parts of the data have been removed. After filtering, we separated useful signals from noisy signals, minor anomalies from major anomalies, and local anomalies from regional anomalies, which makes it easier to focus on the useful information. Our program can be used to preprocess magnetic data, and the results demonstrate its effectiveness.
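    A sliding-average filter, one of the three techniques mentioned, can be sketched in a few lines. This is a generic illustration of the method, not the authors' Delphi implementation:

```python
import numpy as np

def sliding_average(signal, window=5):
    """Sliding-average (moving mean) low-pass filter. `window` should be odd
    so the output stays aligned with the input; edges are handled by
    replicating the boundary samples before convolving."""
    pad = window // 2
    padded = np.pad(signal, pad, mode="edge")
    kernel = np.ones(window) / window
    return np.convolve(padded, kernel, mode="valid")
```

The window length sets the cutoff: a longer window suppresses more high-frequency noise but also smears narrow anomalies, which is exactly the trade-off a preprocessing workflow has to balance.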

  6. Detecting and classifying method based on similarity matching of Android malware behavior with profile.

    PubMed

    Jang, Jae-Wook; Yun, Jaesung; Mohaisen, Aziz; Woo, Jiyoung; Kim, Huy Kang

    2016-01-01

    Mass-market mobile security threats have increased recently due to the growth of mobile technologies and the popularity of mobile devices. Accordingly, techniques have been introduced for identifying, classifying, and defending against mobile threats utilizing static, dynamic, on-device, and off-device techniques. Static techniques are easy to evade, while dynamic techniques are expensive; on-device techniques burden the device itself, while off-device techniques need to be always online. To address some of those shortcomings, we introduce Andro-profiler, a hybrid behavior-based analysis and classification system for mobile malware. Andro-profiler's main goals are efficiency, scalability, and accuracy. To that end, Andro-profiler classifies malware by exploiting behavior profiles extracted from the integrated system logs, including system calls. Andro-profiler executes a malicious application on an emulator in order to generate the integrated system logs, and creates human-readable behavior profiles by analyzing those logs. By comparing the behavior profile of a malicious application with the representative behavior profile of each malware family using a weighted similarity matching technique, Andro-profiler detects the malware and classifies it into a family. The experimental results demonstrate that Andro-profiler is scalable, performs well in detecting and classifying malware with accuracy greater than 98%, outperforms the existing state-of-the-art work, and is capable of identifying 0-day mobile malware samples.
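    The weighted similarity matching step can be sketched as follows. The metric and the weighting scheme here are assumptions for illustration; the paper's exact formulation may differ.

```python
def weighted_similarity(profile, family_profile, weights):
    """Weighted cosine-style similarity between behavior profiles, where each
    profile maps a behavior (e.g. a system-call category) to a frequency.
    Weights let security-critical behaviors count more."""
    keys = set(profile) | set(family_profile)
    num = sum(weights.get(k, 1.0) * profile.get(k, 0.0) * family_profile.get(k, 0.0)
              for k in keys)
    na = sum(weights.get(k, 1.0) * profile.get(k, 0.0) ** 2 for k in keys) ** 0.5
    nb = sum(weights.get(k, 1.0) * family_profile.get(k, 0.0) ** 2 for k in keys) ** 0.5
    return num / (na * nb) if na and nb else 0.0

def classify(profile, families, weights, threshold=0.5):
    """Assign the sample to the most similar family, or 'unknown' if no
    family is similar enough (the threshold value is an assumption)."""
    best = max(families, key=lambda f: weighted_similarity(profile, families[f], weights))
    if weighted_similarity(profile, families[best], weights) >= threshold:
        return best
    return "unknown"
```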

  7. Fast Component Pursuit for Large-Scale Inverse Covariance Estimation.

    PubMed

    Han, Lei; Zhang, Yu; Zhang, Tong

    2016-08-01

    The maximum likelihood estimation (MLE) for the Gaussian graphical model, which is also known as the inverse covariance estimation problem, has gained increasing interest recently. Most existing works assume that inverse covariance estimators contain sparse structure and then construct models with ℓ1 regularization. In this paper, different from existing works, we study the inverse covariance estimation problem from another perspective by efficiently modeling the low-rank structure in the inverse covariance, which is assumed to be a combination of a low-rank part and a diagonal matrix. One motivation for this assumption is that the low-rank structure is common in many applications, including climate and financial analysis; another is that such an assumption can reduce the computational complexity of computing the inverse. Specifically, we propose an efficient COmponent Pursuit (COP) method to obtain the low-rank part, where each component can be sparse. For optimization, the COP method greedily learns a rank-one component in each iteration by maximizing the log-likelihood. Moreover, the COP algorithm enjoys several appealing properties, including the existence of an efficient solution in each iteration and a theoretical guarantee on the convergence of this greedy approach. Experiments on large-scale synthetic and real-world datasets, including thousands of millions of variables, show that the COP method is faster than the state-of-the-art techniques for the inverse covariance estimation problem while achieving comparable log-likelihood on test data.
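The computational motivation for the diagonal-plus-low-rank assumption can be made concrete with the Sherman-Morrison identity for a single rank-one component; this illustrates the structural idea only, not the COP algorithm itself:

```python
# Why a diagonal-plus-low-rank matrix is cheap to invert: for a single
# rank-one component, the Sherman-Morrison identity gives
#   (D + u u^T)^{-1} = D^{-1} - (D^{-1} u)(D^{-1} u)^T / (1 + u^T D^{-1} u)
# in O(n^2) work, versus O(n^3) for a generic dense inverse.
# (This is the structural idea behind COP, not the COP algorithm.)

def sherman_morrison_inverse(d, u):
    """Inverse of diag(d) + u u^T, returned as a dense list-of-lists."""
    n = len(d)
    w = [u[i] / d[i] for i in range(n)]                   # D^{-1} u
    denom = 1.0 + sum(u[i] * w[i] for i in range(n))      # 1 + u^T D^{-1} u
    return [[(1.0 / d[i] if i == j else 0.0) - w[i] * w[j] / denom
             for j in range(n)] for i in range(n)]

d = [2.0, 3.0, 4.0]
u = [1.0, 1.0, 2.0]
inv = sherman_morrison_inverse(d, u)

# Check: (diag(d) + u u^T) @ inv should be the identity matrix.
A = [[(d[i] if i == j else 0.0) + u[i] * u[j] for j in range(3)] for i in range(3)]
prod = [[sum(A[i][k] * inv[k][j] for k in range(3)) for j in range(3)] for i in range(3)]
```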

  8. Assessing the Queuing Process Using Data Envelopment Analysis: an Application in Health Centres.

    PubMed

    Safdar, Komal A; Emrouznejad, Ali; Dey, Prasanta K

    2016-01-01

    Queuing is one of the most important criteria for assessing the performance and efficiency of any service industry, including healthcare. Data Envelopment Analysis (DEA) is one of the most widely-used techniques for performance measurement in healthcare. However, no queue management application has been reported in the health-related DEA literature. Most of the studies regarding patient flow systems had the objective of improving an already existing appointment system. The current study presents a novel application of DEA for assessing the queuing process at an outpatients' department of a large public hospital in a developing country where appointment systems do not exist. The main aim of the current study is to demonstrate the usefulness of DEA modelling in the evaluation of a queue system. The patient flow pathway considered for this study consists of two stages: consultation with a doctor and pharmacy. The DEA results indicated that waiting times and other related queuing variables need considerable minimisation at both stages.

  9. High temperature gas-cooled reactor (HTGR) graphite pebble fuel: Review of technologies for reprocessing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mcwilliams, A. J.

    2015-09-08

    This report reviews literature on reprocessing high temperature gas-cooled reactor graphite fuel components. A basic review of the various fuel components used in pebble bed type reactors is provided, along with a survey of synthesis methods for the fabrication of the fuel components. Several disposal options are considered for the graphite pebble fuel elements, including the storage of intact pebbles, volume reduction by separating the graphite from fuel kernels, and complete processing of the pebbles for waste storage. Existing methods for graphite removal are presented and generally consist of mechanical separation techniques, such as crushing and grinding, and chemical techniques, such as acid digestion and oxidation. Potential methods for reprocessing the graphite pebbles include improvements to existing methods and novel technologies that have not previously been investigated for nuclear graphite waste applications. The best overall method will depend on the desired final waste form and needs to factor in technical efficiency, political concerns, cost, and implementation.

  10. Concepts for the translation of genome-based innovations into public health: a comprehensive overview.

    PubMed

    Syurina, Elena V; Schulte In den Bäumen, Tobias; Brand, Angela; Ambrosino, Elena; Feron, Frans Jm

    2013-03-01

    Recent vast and rapid development of genome-related sciences has been followed by the development of different assessment techniques, or by attempts to adapt existing ones. The aim of this article is to give an overview of existing concepts for the assessment and translation of innovations into healthcare, applying a descriptive analysis of their present use by public health specialists and policy makers. The international literature review identified eight concepts, including Health Technology Assessment; analytic validity; clinical validity; clinical utility; ethical, legal and social implications; the Public Health Wheel; and others. The overview covers the level of current use of each concept. Despite the heterogeneity of the analyzed concepts and differences in their everyday use in healthcare practice, cross-integration of these concepts is important in order to improve translation speed and quality. Finally, some recommendations are made regarding the most applicable translational concepts.

  11. Flank and Lumbar Hernia Repair.

    PubMed

    Beffa, Lucas R; Margiotta, Alyssa L; Carbonell, Alfredo M

    2018-06-01

    Flank and lumbar hernias are challenging because of their rarity and anatomic location. Several challenges exist when approaching these specific abdominal wall defects, including location, innervation of the lateral abdominal wall musculature, and their proximity to bony landmarks. These hernias are confined by the costal margin, spine, and pelvic brim, which makes closure of the defect, including mesh placement, difficult. This article discusses the anatomy of lumbar and flank hernias, the various etiologies for these hernias, and the procedural steps for open and robotic preperitoneal approaches. The available clinical evidence regarding outcomes for various repair techniques is also reviewed. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. Survey of methods for soil moisture determination

    NASA Technical Reports Server (NTRS)

    Schmugge, T. J.; Jackson, T. J.; Mckim, H. L.

    1979-01-01

    Existing and proposed methods for soil moisture determination are discussed. These include: (1) in situ investigations including gravimetric, nuclear, and electromagnetic techniques; (2) remote sensing approaches that use the reflected solar, thermal infrared, and microwave portions of the electromagnetic spectrum; and (3) soil physics models that track the behavior of water in the soil in response to meteorological inputs (precipitation) and demands (evapotranspiration). The capacities of these approaches to satisfy various user needs for soil moisture information vary from application to application, but a conceptual scheme for merging these approaches into integrated systems to provide soil moisture information is proposed that has the potential for meeting various application requirements.

  13. Anterior total hip arthroplasty using a metaphyseal bone-sparing stem: component alignment and early complications.

    PubMed

    Ahmed, Mohammed M; Otto, Thomas J; Moed, Berton R

    2016-04-22

    Limited-incision total hip arthroplasty (THA) preserves hip abductors, posterior capsule, and external rotators potentially diminishing dislocation risk. However, potential complications also exist, such as component malposition. Specific implants have been manufactured that enhance compatibility with this technique, while preserving metaphyseal bone; however, little data exists documenting early complications and component position. The purpose was to evaluate primary THA using a curved, bone-sparing stem inserted through the anterior approach with respect to component alignment and early complications. In a retrospective analysis of 108 cases, the surgical technique was outlined and the occurrence of intraoperative fractures, postoperative dislocations, infection, and limb length inequality was determined. Femoral stem and acetabular cup alignment was quantified using the initial postoperative radiographs. Patient follow-up averaged 12.9 (range 2 to 36) months. There were eight (7.4 %) complications requiring revision surgery in three (2.8 %) patients with three (2.8 %) infections and three (2.8 %) dislocations. Intraoperative complications included one calcar fracture above the lesser trochanter. Leg length inequality >5 mm was present in three (2.8 %) patients. Radiographic analysis showed that femoral neutral alignment was achieved in 95 hips (88.0 %). All femoral stems demonstrated satisfactory fit and fill and no evidence of subsidence, osteolysis, or loosening. An average abduction angle of 44.8° (± 5.3) and average cup anteversion of 16.2° (± 4.2) were also noted. Although the technique with this implant and approach is promising, it does not appear to offer important advantages over standard techniques. However, the findings merit further, long-term study.

  14. Nanomaterial characterization through image treatment, 3D reconstruction and AI techniques

    NASA Astrophysics Data System (ADS)

    Lopez de Uralde Huarte, Juan Jose

    Nanotechnology is not only the science of the future; it is the science of today. It is used in all sectors, from health to energy, including information technologies and transport. For the present investigation, we have taken carbon black as a use case. This nanomaterial is mixed with a wide variety of materials to improve their properties, such as abrasion resistance and wear in tires and plastics, or tinting strength in pigments. Nowadays, indirect methods of analysis, such as oil absorption or nitrogen adsorption, are the most common techniques in the nanomaterial industry. These procedures measure the change in physical state while adding oil or nitrogen; in this way, the surface area is estimated and related to the properties of the material. We have instead chosen to improve the existing direct methods, which consist of analysing microscopy images of nanomaterials. We have made progress in the image-processing treatments and in the extracted features; in fact, some of them outperform the features existing in the literature. In addition, we have applied, for the first time in the literature, machine learning to aggregate categorization. In this way, we automatically identify their morphology, which will determine the final properties of the material they are mixed with. Finally, we have presented an aggregate-reconstruction genetic algorithm that, with only two orthogonal images, provides more information than a tomography, which needs many images. To summarize, we have improved the state of the art in direct analysis techniques, allowing in the near future the replacement of the current indirect techniques.

  15. Nouvelles techniques pratiques pour la modelisation du comportement dynamique des systèmes eau-structure

    NASA Astrophysics Data System (ADS)

    Miquel, Benjamin

    The dynamic or seismic behavior of hydraulic structures is, as for conventional structures, essential to assure protection of human lives. These analyses also aim at limiting structural damage caused by an earthquake to prevent rupture or collapse of the structure. The particularity of hydraulic structures is that internal displacements are caused not only by the earthquake but also by the hydrodynamic loads resulting from fluid-structure interaction. This thesis reviews the existing complex and simplified methods for performing such dynamic analyses of hydraulic structures. For the complex existing methods, attention is placed on the difficulties arising from their use. In particular, this work examines the use of transmitting boundary conditions to simulate the semi-infinite extent of reservoirs. A procedure has been developed to estimate the error that these boundary conditions can introduce in finite element dynamic analysis. Depending on their formulation and location, we show that they can considerably affect the response of such fluid-structure systems. For practical engineering applications, simplified procedures are still needed to evaluate the dynamic behavior of structures in contact with water. A review of the existing simplified procedures showed that these methods rest on numerous simplifications that can affect the predicted dynamic behavior of such systems. One of the main objectives of this thesis has therefore been to develop new simplified methods that are more accurate than the existing ones. First, a new spectral analysis method is proposed. Expressions for the fundamental frequency of fluid-structure systems, the key parameter of spectral analysis, have been developed. We show that this new technique can easily be implemented in a spreadsheet or program and that its calculation time is nearly instantaneous. When compared to more complex analytical or numerical methods, this new procedure yields excellent predictions of the dynamic behavior of fluid-structure systems. Spectral analyses ignore the transient and oscillatory nature of vibrations. When such dynamic analyses show that some areas of the studied structure undergo excessive stresses, time history analyses allow a better estimate of the extent of these zones as well as of the time evolution of these excessive stresses. Furthermore, the existing spectral analysis methods for fluid-structure systems account only for the static effect of higher modes. Though this is generally sufficient for dams, for flexible structures the dynamic effect of these modes should be accounted for. New methods have been developed for fluid-structure systems to address these observations as well as the flexibility of foundations. A first method was developed to study structures in contact with one or two finite or infinite water domains. This new technique includes the flexibility of structures and foundations, the dynamic effect of higher vibration modes, and variations of the levels of the water domains. The method was then extended to beam structures in contact with fluids. These developments have also allowed extending existing analytical formulations of the dynamic properties of a dry beam to a new formulation that includes the effect of fluid-structure interaction. The method yields a very good estimate of the dynamic behavior of beam-fluid systems or beam-like structures in contact with fluid. Finally, a Modified Accelerogram Method (MAM) has been developed to modify the design earthquake into a new accelerogram that directly accounts for the effect of fluid-structure interaction. This new accelerogram can therefore be applied directly to the dry structure (i.e., without water) in order to calculate the dynamic response of the fluid-structure system. This original technique can include numerous parameters that influence the dynamic response of such systems and makes it possible to treat the fluid-structure interaction analytically while keeping the advantages of finite element modeling.
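One way to see why fluid-structure interaction lowers the fundamental frequency is the single-degree-of-freedom added-mass approximation, in which hydrodynamic pressure acts like extra mass on the structure. This is a textbook simplification, not the thesis's expressions, and all values are invented:

```python
import math

# Single-degree-of-freedom added-mass view of fluid-structure interaction:
# hydrodynamic pressure acts like an added mass m_a on the structure, so
#   f_wet = f_dry / sqrt(1 + m_a / m_s)
# i.e., the wet fundamental frequency is always below the dry one.

def wet_frequency(f_dry, m_structure, m_added):
    return f_dry / math.sqrt(1.0 + m_added / m_structure)

f_dry = 5.0     # Hz, dry fundamental frequency (assumed)
m_s = 1.0e6     # kg, modal mass of the structure (assumed)
m_a = 0.5e6     # kg, hydrodynamic added mass (assumed)

f_wet = wet_frequency(f_dry, m_s, m_a)  # lower than f_dry
```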

  16. Rapid Automated Aircraft Simulation Model Updating from Flight Data

    NASA Technical Reports Server (NTRS)

    Brian, Geoff; Morelli, Eugene A.

    2011-01-01

    Techniques to identify aircraft aerodynamic characteristics from flight measurements and compute corrections to an existing simulation model of a research aircraft were investigated. The purpose of the research was to develop a process enabling rapid automated updating of aircraft simulation models using flight data and apply this capability to all flight regimes, including flight envelope extremes. The process presented has the potential to improve the efficiency of envelope expansion flight testing, revision of control system properties, and the development of high-fidelity simulators for pilot training.
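A minimal sketch of the equation-error idea behind such model updating: fit a correction to the residual between flight-identified and simulated aerodynamic coefficients. The linear correction form, the coefficients, and the data below are assumptions for illustration, not the paper's actual method:

```python
# Equation-error sketch: fit a linear correction dCL = a0 + a1*alpha to the
# residual between flight-derived lift coefficient and an existing
# simulation model. All numbers are invented (noise-free for clarity).

def fit_line(x, y):
    """Ordinary least squares for y = a0 + a1*x via the normal equations."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a0 = (sy - a1 * sx) / n
    return a0, a1

alpha = [0.00, 0.02, 0.04, 0.06, 0.08]       # angle of attack, rad
cl_sim = [0.20 + 5.0 * a for a in alpha]     # existing simulation model
cl_flight = [0.23 + 5.4 * a for a in alpha]  # flight-identified values

residual = [f - s for f, s in zip(cl_flight, cl_sim)]
a0, a1 = fit_line(alpha, residual)           # correction terms
cl_updated = [s + a0 + a1 * a for s, a in zip(cl_sim, alpha)]
```

With noise-free data the fit recovers the offset and slope of the residual exactly; with real flight data the same least-squares machinery yields the best-fitting correction.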

  17. Nanonization strategies for poorly water-soluble drugs.

    PubMed

    Chen, Huabing; Khemtong, Chalermchai; Yang, Xiangliang; Chang, Xueling; Gao, Jinming

    2011-04-01

    Poor water solubility for many drugs and drug candidates remains a major obstacle to their development and clinical application. Conventional formulations to improve solubility suffer from low bioavailability and poor pharmacokinetics, with some carriers rendering systemic toxicities (e.g. Cremophor(®) EL). In this review, several major nanonization techniques that seek to overcome these limitations for drug solubilization are presented. Strategies including drug nanocrystals, nanoemulsions and polymeric micelles are reviewed. Finally, perspectives on existing challenges and future opportunities are highlighted. Published by Elsevier Ltd.

  18. Assessing Cybercrime Through the Eyes of the WOMBAT

    NASA Astrophysics Data System (ADS)

    Dacier, Marc; Leita, Corrado; Thonnard, Olivier; van Pham, Hau; Kirda, Engin

    The WOMBAT project is a collaborative European funded research project that aims at providing new means to understand the existing and emerging threats that are targeting the Internet economy and the net citizens. The approach carried out by the partners includes a data collection effort as well as some sophisticated analysis techniques. In this chapter, we present one of the threat-related data collection systems in use by the project, as well as some of the early results obtained when digging into these data sets.

  19. Subsynchronous instability of a geared centrifugal compressor of overhung design

    NASA Technical Reports Server (NTRS)

    Hudson, J. H.; Wittman, L. J.

    1980-01-01

    The original design analysis and shop test data are presented for a three-stage (poster) air compressor with impellers mounted on the extensions of a twin-pinion gear and driven by an 8000 hp synchronous motor. Also included are field test data, subsequent rotor dynamics analysis, modifications, and final rotor behavior. A subsynchronous instability existed on a geared, overhung rotor. State-of-the-art rotor dynamics analysis techniques provided a reasonable analytical model of the rotor. A bearing modification, arrived at analytically, eliminated the instability.

  20. [From literature to academic history: the position and pathway of acupuncture theory research].

    PubMed

    Zhang, Shujian

    2017-03-12

    There are two threads in the academic inheritance of acupuncture: theoretical inheritance and empirical inheritance. Up to now, the mainstream of acupuncture theory has not been consistent with empirical clinical practice and cannot explain new clinical techniques. The existing acupuncture theories are in need of strict re-examination, and new academic achievements should be carefully absorbed. Literature review, concept research, and academic history study are considered key pathways for acupuncture theory research.

  1. Configurations and calibration methods for passive sampling techniques.

    PubMed

    Ouyang, Gangfeng; Pawliszyn, Janusz

    2007-10-19

    Passive sampling technology has developed very quickly in the past 15 years, and is widely used for the monitoring of pollutants in different environments. The design and quantification of passive sampling devices require an appropriate calibration method. Current calibration methods that exist for passive sampling, including equilibrium extraction, linear uptake, and kinetic calibration, are presented in this review. A number of state-of-the-art passive sampling devices that can be used for aqueous and air monitoring are introduced according to their calibration methods.
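For kinetic (pre-equilibrium) calibration, a commonly used description is first-order uptake toward equilibrium; a measured sampler mass at a known exposure time can then be inverted for the ambient concentration. The sketch below assumes this simple one-compartment model and uses invented constants, not values from the review:

```python
import math

# First-order uptake model often used for kinetic calibration of passive
# samplers: the mass accumulated on the sampler is
#   n(t) = n_eq * (1 - exp(-k * t)),   with n_eq = K * V * C
# so a measured mass at a known exposure time can be inverted for the
# ambient analyte concentration C. All constants below are assumptions.

def concentration_from_uptake(n_t, t, K, V, k):
    """Invert n(t) = K*V*C*(1 - exp(-k*t)) for the ambient concentration C."""
    return n_t / (K * V * (1.0 - math.exp(-k * t)))

K = 500.0     # sampler/water partition coefficient (assumed)
V = 1e-4      # sampler phase volume, L (assumed)
k = 0.05      # first-order rate constant, 1/h (assumed)
C_true = 2.0  # ng/L, the concentration we pretend to measure

# Simulated measurement after 10 h of exposure (well before equilibrium):
t = 10.0
n_meas = K * V * C_true * (1.0 - math.exp(-k * t))
C_est = concentration_from_uptake(n_meas, t, K, V, k)
```

At long exposure times the exponential term vanishes and the same formula reduces to the equilibrium-extraction case, n_eq = K*V*C.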

  2. Determining a young dancer's readiness for dancing on pointe.

    PubMed

    Shah, Selina

    2009-01-01

    Ballet is one of the most popular youth activities in the United States. Many ballet students eventually train to dance "en pointe," the French words for "on pointe," or "on the tips of their toes." No research exists to define criteria for determining when a young dancer can transition from dancing in ballet slippers to dancing in pointe shoes. However, dancers can be evaluated for this progression based on a number of factors, including adequate foot and ankle plantarflexion, technique, training, proprioception, alignment, and strength.

  3. Computer modelling of grain microstructure in three dimensions

    NASA Astrophysics Data System (ADS)

    Narayan, K. Lakshmi

    We present a program that generates two-dimensional micrographs of a three-dimensional grain microstructure. The code utilizes a novel scanning, pixel-mapping technique to obtain statistical distributions of surface areas, grain sizes, aspect ratios, perimeters, numbers of nearest neighbors, and volumes of the randomly nucleated particles. The program can be used for comparing the existing theories of grain growth and for interpreting two-dimensional micrographs of three-dimensional samples. Special features have been included to minimize the computation time and resource requirements.
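The pixel-mapping idea can be sketched as a nearest-nucleus (Voronoi) labeling of a 2D section through randomly nucleated 3D grains. This toy version uses invented parameters, ignores grain-growth kinetics, and only tallies section areas:

```python
import random

# Toy pixel-mapping sketch: nucleate grains at random 3D sites, label each
# pixel of a 2D section plane by its nearest nucleus (a Voronoi section),
# and tally per-grain section areas. Parameters are illustrative only.

random.seed(0)
N, SIZE = 12, 32  # number of grains, cube edge in pixels
nuclei = [(random.uniform(0, SIZE), random.uniform(0, SIZE), random.uniform(0, SIZE))
          for _ in range(N)]

def nearest_grain(x, y, z):
    """Index of the nucleus closest to the point (x, y, z)."""
    return min(range(N),
               key=lambda g: (nuclei[g][0] - x) ** 2
                           + (nuclei[g][1] - y) ** 2
                           + (nuclei[g][2] - z) ** 2)

z_cut = SIZE / 2.0  # section plane through the middle of the cube
micrograph = [[nearest_grain(x + 0.5, y + 0.5, z_cut) for x in range(SIZE)]
              for y in range(SIZE)]

# Per-grain section areas (in pixels); not every grain intersects the plane.
areas = {}
for row in micrograph:
    for label in row:
        areas[label] = areas.get(label, 0) + 1
```

The same scan can accumulate perimeters, aspect ratios, and neighbor counts by inspecting label transitions between adjacent pixels.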

  4. Revitalization of open apex teeth with apical periodontitis using a collagen-hydroxyapatite scaffold.

    PubMed

    Nevins, Alan J; Cymerman, Jerome J

    2015-06-01

    An enhanced revision of the revitalization endodontic technique for immature teeth with apical periodontitis has been described. It includes the addition of collagen-hydroxyapatite scaffold to the currently practiced revascularization technique. Four cases treated in series are presented in this report, 1 case involving 2 teeth. Periapical diagnoses of immature teeth included "asymptomatic apical periodontitis," "symptomatic apical periodontitis," and "acute apical abscess." Additionally, 1 fully developed tooth that had undergone root canal treatment that failed had a periapical diagnosis of acute apical abscess. An established revascularization protocol was used for all teeth. In addition to stimulating blood clots, all teeth were filled with collagen-hydroxyapatite scaffolds. Periapical radiolucencies healed in all teeth, and diffuse radiopacity developed within the coronal portions of canal spaces. Root development with root lengthening occurred in the immature nonvital maxillary premolar that had not undergone prior treatment. The technique of adding a collagen-hydroxyapatite scaffold to the existing revitalization protocol has been described in which substantial hard tissue repair has occurred. This may leave teeth more fully developed and less likely to fracture. Copyright © 2015 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  5. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Mary A.; Tangyunyong, Paiboon; Cole, Edward I.

    2016-01-14

    Laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  6. A photogrammetric technique for generation of an accurate multispectral optical flow dataset

    NASA Astrophysics Data System (ADS)

    Kniaz, V. V.

    2017-06-01

    The availability of an accurate dataset is the key requirement for successful development of an optical flow estimation algorithm. A large number of freely available optical flow datasets were developed in recent years and gave rise to many powerful algorithms. However, most of the datasets include only images captured in the visible spectrum. This paper is focused on the creation of a multispectral optical flow dataset with an accurate ground truth. The generation of an accurate ground truth optical flow is a rather complex problem, as no device for error-free optical flow measurement has been developed to date. Existing methods for ground truth optical flow estimation are based on hidden textures, 3D modelling, or laser scanning. Such techniques either work only with synthetic optical flow or provide only a sparse ground truth. In this paper a new photogrammetric method for generation of an accurate ground truth optical flow is proposed. The method combines the accuracy and density of synthetic optical flow datasets with the flexibility of laser-scanning-based techniques. A multispectral dataset including various image sequences was generated using the developed method. The dataset is freely available on the accompanying web site.
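The core of any geometry-based ground-truth method is that, with known scene structure and camera poses, the flow at a pixel follows from projecting the same 3D point before and after the motion. A minimal pinhole sketch (focal length, principal point, scene point, and camera motion are all invented):

```python
# Ground-truth flow from known geometry: project the same static 3D point
# through a pinhole camera before and after a known camera translation;
# the pixel displacement is the ground-truth optical flow at that pixel.

def project(point, cam_pos, focal=500.0, cx=320.0, cy=240.0):
    """Pinhole projection of a 3D point (camera axes aligned with world)."""
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    return (focal * x / z + cx, focal * y / z + cy)

P = (1.0, 0.5, 10.0)    # static scene point, 10 units in front of the camera
cam0 = (0.0, 0.0, 0.0)
cam1 = (0.2, 0.0, 0.0)  # camera translates 0.2 units to the right

u0, v0 = project(P, cam0)
u1, v1 = project(P, cam1)
flow = (u1 - u0, v1 - v0)  # ground-truth flow vector for this pixel
```

A rightward camera translation produces a leftward image motion; repeating this for every reconstructed surface point yields a dense flow field.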

  7. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    DOE PAGES

    Miller, Mary A.; Tangyunyong, Paiboon; Edward I. Cole, Jr.

    2016-01-12

    In this study, laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  8. Current and future methods for evaluating the allergenic potential of proteins: international workshop report 23-25 October 2007.

    PubMed

    Thomas, Karluss; Herouet-Guicheney, Corinne; Ladics, Gregory; McClain, Scott; MacIntosh, Susan; Privalle, Laura; Woolhiser, Mike

    2008-09-01

    The International Life Science Institute's Health and Environmental Sciences Institute's Protein Allergenicity Technical Committee hosted an international workshop October 23-25, 2007, in Nice, France, to review and discuss existing and emerging methods and techniques for improving the current weight-of-evidence approach for evaluating the potential allergenicity of novel proteins. The workshop included over 40 international experts from government, industry, and academia. Their expertise represented a range of disciplines including immunology, chemistry, molecular biology, bioinformatics, and toxicology. Among participants, there was consensus that (1) current bioinformatic approaches are highly conservative; (2) advances in bioinformatics using structural comparisons of proteins may be helpful as the availability of structural data increases; (3) proteomics may prove useful for monitoring the natural variability in a plant's proteome and assessing the impact of biotechnology transformations on endogenous levels of allergens, but only when analytical techniques have been standardized and additional data are available on the natural variation of protein expression in non-transgenic bred plants; (4) basophil response assays are promising techniques, but need additional evaluation around specificity, sensitivity, and reproducibility; (5) additional research is required to develop and validate an animal model for the purpose of predicting protein allergenicity.

  9. Refinement and application of acoustic impulse technique to study nozzle transmission characteristics

    NASA Technical Reports Server (NTRS)

    Salikuddin, M.; Brown, W. H.; Ramakrishnan, R.; Tanna, H. K.

    1983-01-01

    An improved acoustic impulse technique was developed and used to study the transmission characteristics of duct/nozzle systems. To accomplish this objective, various problems associated with the existing spark-discharge impulse technique were first studied. These included (1) the nonlinear behavior of high intensity pulses, (2) the contamination of the signal with flow noise, (3) low signal-to-noise ratio at high exhaust velocities, and (4) the inability to control or shape the signal generated by the source, especially when multiple spark points were used as the source. The first step in resolving these problems was the replacement of the spark-discharge source with electroacoustic driver(s). Further refinements included (1) synthesizing an acoustic impulse with acoustic driver(s) to control and shape the output signal, (2) time domain signal averaging to remove flow noise from the contaminated signal, (3) signal editing to remove unwanted portions of the time history, (4) spectral averaging, and (5) numerical smoothing. The acoustic power measurement technique was improved by taking multiple in-duct measurements and by a modal decomposition process to account for the contribution of higher order modes in the power computation. The improved acoustic impulse technique was then validated by comparing its results with those derived by an impedance tube method. The mechanism of acoustic power loss that occurs when sound is transmitted through nozzle terminations was investigated. Finally, the refined impulse technique was applied to obtain more accurate results for the acoustic transmission characteristics of a conical nozzle and a multi-lobe multi-tube suppressor nozzle.
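Time-domain signal averaging relies on the impulse being repeatable while the flow noise is uncorrelated between records, so averaging N synchronized records attenuates the noise by roughly 1/sqrt(N). An illustrative simulation with an invented pulse shape and noise level:

```python
import math
import random

# Time-domain signal averaging: average N synchronized noisy records of a
# repeatable impulse; uncorrelated noise shrinks by ~1/sqrt(N) while the
# deterministic pulse is preserved. Pulse shape and noise level are invented.

random.seed(1)
T = 200
pulse = [math.exp(-((i - 50) / 8.0) ** 2) for i in range(T)]  # repeatable impulse

def noisy_record():
    """One measurement: the pulse buried in Gaussian flow noise."""
    return [s + random.gauss(0.0, 0.5) for s in pulse]

N = 400
avg = [0.0] * T
for _ in range(N):
    rec = noisy_record()
    for i in range(T):
        avg[i] += rec[i] / N

def rms_err(sig):
    """RMS deviation from the true pulse."""
    return (sum((sig[i] - pulse[i]) ** 2 for i in range(T)) / T) ** 0.5

single = rms_err(noisy_record())  # noise level of one raw record
averaged = rms_err(avg)           # residual noise after averaging
```

With N = 400 the residual noise is roughly a factor of 20 below a single record, which is why the averaged impulse response is usable even at high exhaust velocities.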

  10. How adolescent girls interpret weight-loss advertising.

    PubMed

    Hobbs, Renee; Broder, Sharon; Pope, Holly; Rowe, Jonelle

    2006-10-01

    While they demonstrate some ability to critically analyze the more obvious forms of deceptive weight-loss advertising, many girls do not recognize how advertising evokes emotional responses or how visual and narrative techniques are used to increase identification in weight-loss advertising. This study examined how girls aged 9-17 years interpreted magazine advertising, television (TV) advertising, and infomercials for weight-loss products in order to determine whether deceptive advertising techniques were recognized and to assess pre-existing media-literacy skills. A total of 42 participants were interviewed in seven geographic regions of the United States. In groups of three, participants were shown seven print and TV advertisements (ads) for weight-loss products and asked to share their interpretations of each ad. Common factors in girls' interpretation of weight-loss advertising included responding to texts emotionally by identifying with characters; comparing and contrasting persuasive messages with real-life experiences with family members; using prior knowledge about nutrition management; and recognizing obvious deceptive claims like 'rapid' or 'permanent' weight loss. Girls were less able to demonstrate skills such as recognizing persuasive construction strategies (message purpose, target audience, and subtext) or awareness of economic factors (financial motives, credibility enhancement, and branding).

  11. Historical shoreline mapping (I): improving techniques and reducing positioning errors

    USGS Publications Warehouse

    Thieler, E. Robert; Danforth, William W.

    1994-01-01

    A critical need exists among coastal researchers and policy-makers for a precise method to obtain shoreline positions from historical maps and aerial photographs. A number of methods that vary widely in approach and accuracy have been developed to meet this need. None of the existing methods, however, address the entire range of cartographic and photogrammetric techniques required for accurate coastal mapping. Thus, their application to many typical shoreline mapping problems is limited. In addition, no shoreline mapping technique provides an adequate basis for quantifying the many errors inherent in shoreline mapping using maps and air photos. As a result, current assessments of errors in air photo mapping techniques generally (and falsely) assume that errors in shoreline positions are represented by the sum of a series of worst-case assumptions about digitizer operator resolution and ground control accuracy. These assessments also ignore altogether other errors that commonly approach ground distances of 10 m. This paper provides a conceptual and analytical framework for improved methods of extracting geographic data from maps and aerial photographs. We also present a new approach to shoreline mapping using air photos that revises and extends a number of photogrammetric techniques. These techniques include (1) developing spatially and temporally overlapping control networks for large groups of photos; (2) digitizing air photos for use in shoreline mapping; (3) preprocessing digitized photos to remove lens distortion and film deformation effects; (4) simultaneous aerotriangulation of large groups of spatially and temporally overlapping photos; and (5) using a single-ray intersection technique to determine geographic shoreline coordinates and express the horizontal and vertical error associated with a given digitized shoreline. 
As long as historical maps and air photos are used in studies of shoreline change, there will be a considerable amount of error (on the order of several meters) present in shoreline position and rate-of- change calculations. The techniques presented in this paper, however, provide a means to reduce and quantify these errors so that realistic assessments of the technological noise (as opposed to geological noise) in geographic shoreline positions can be made.
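
    The abstract's point about error budgets can be made concrete: summing worst-case bounds overstates the expected error, whereas independent error sources combine in quadrature (root-sum-square). A sketch with purely illustrative error magnitudes:

```python
import math

# Hypothetical independent error sources (in meters) when mapping a
# shoreline position from an air photo; the values are illustrative only.
errors_m = {
    "digitizer_operator": 1.0,
    "ground_control": 3.0,
    "film_deformation": 2.0,
    "lens_distortion": 1.5,
}

# Worst-case sum: assumes every source errs maximally in the same direction.
worst_case = sum(errors_m.values())

# Root-sum-square: the expected combined error for independent sources.
root_sum_square = math.sqrt(sum(e * e for e in errors_m.values()))

print(worst_case, round(root_sum_square, 2))  # 7.5 vs ~4.03
```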

  12. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to conduct a study that assesses the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, quantitative management techniques, and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. 
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
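
    The simulation approach described above can be sketched in a few lines: draw driver scores from assumed distributions, combine them with assumed weights, and read off a baseline and prediction interval. The weights and distributions below are illustrative placeholders, not the actual ACSI model:

```python
import random
import statistics

random.seed(42)

# Illustrative ACSI-style drivers and weights; NOT the published model.
weights = {"quality": 0.5, "expectations": 0.2, "value": 0.3}

def simulate_score():
    # Each driver score (0-100 scale) drawn from an assumed distribution.
    drivers = {
        "quality": random.gauss(80, 5),
        "expectations": random.gauss(75, 8),
        "value": random.gauss(70, 6),
    }
    return sum(weights[k] * v for k, v in drivers.items())

scores = sorted(simulate_score() for _ in range(10_000))
baseline = statistics.mean(scores)
lo, hi = scores[250], scores[9750]   # ~95% prediction interval
print(round(baseline, 1), round(lo, 1), round(hi, 1))
```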

  13. A fuzzy optimal threshold technique for medical images

    NASA Astrophysics Data System (ADS)

    Thirupathi Kannan, Balaji; Krishnasamy, Krishnaveni; Pradeep Kumar Kenny, S.

    2012-01-01

    A new fuzzy thresholding method for medical images, especially cervical cytology images having blob and mosaic structures, is proposed in this paper. Many existing thresholding algorithms can segment either blob or mosaic images, but no single algorithm handles both. In the proposed approach, an input cervical cytology image is binarized and preprocessed, and the pixel value with the minimum Fuzzy Gaussian Index is identified as the optimal threshold value and used for segmentation. The technique is tested on various cervical cytology images having blob or mosaic structures and compared with several existing algorithms, which it is shown to outperform.
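
    The idea of scanning candidate gray levels and keeping the one that minimizes a fuzziness measure can be sketched as follows. The membership function and entropy measure here are generic stand-ins from the fuzzy thresholding literature, not the paper's exact Fuzzy Gaussian Index:

```python
import numpy as np

def fuzzy_threshold(image):
    """Pick the gray level minimizing a fuzziness index. Sketch only:
    uses a Gaussian-shaped membership and fuzzy entropy, inspired by
    (but not identical to) the paper's Fuzzy Gaussian Index."""
    pixels = image.ravel().astype(float)
    span = pixels.max() - pixels.min()
    best_t, best_index = None, np.inf
    for t in np.arange(pixels.min() + 1, pixels.max(), 1.0):
        lo, hi = pixels[pixels <= t], pixels[pixels > t]
        if lo.size == 0 or hi.size == 0:
            continue
        index = 0.0
        for cls in (lo, hi):
            # Membership in [0.5, 1]: 1 at the class mean, ~0.5 far away.
            mu = 0.5 * (1.0 + np.exp(-(((cls - cls.mean()) / span) ** 2)))
            mu = np.clip(mu, 0.5, 1.0 - 1e-9)
            # Fuzzy (Shannon) entropy: largest for ambiguous memberships.
            index += -np.sum(mu * np.log(mu) + (1 - mu) * np.log(1 - mu))
        index /= pixels.size
        if index < best_index:
            best_t, best_index = float(t), index
    return best_t

# Synthetic bimodal "image": dark blobs on a bright background.
rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(60, 10, 500), rng.normal(180, 10, 500)])
t = fuzzy_threshold(img)
print(t)  # lands between the two intensity modes
```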

  14. Demonstration of a Safety Analysis on a Complex System

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey

    1997-01-01

    For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: there exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.

  15. Reconstructing gravitational wave source parameters via direct comparisons to numerical relativity I: Method

    NASA Astrophysics Data System (ADS)

    Lange, Jacob; O'Shaughnessy, Richard; Healy, James; Lousto, Carlos; Shoemaker, Deirdre; Lovelace, Geoffrey; Scheel, Mark; Ossokine, Serguei

    2016-03-01

    In this talk, we describe a procedure to reconstruct the parameters of sufficiently massive coalescing compact binaries via direct comparison with numerical relativity simulations. For sufficiently massive sources, existing numerical relativity simulations are long enough to cover the observationally accessible part of the signal. Due to the signal's brevity, the posterior parameter distribution it implies is broad, simple, and easily reconstructed from information gained by comparing to only the sparse sample of existing numerical relativity simulations. We describe how followup simulations can corroborate and improve our understanding of a detected source. Since our method can include all physics provided by full numerical relativity simulations of coalescing binaries, it provides a valuable complement to alternative techniques which employ approximations to reconstruct source parameters. Supported by NSF Grant PHY-1505629.

  16. Remote sensing in Michigan for land resource management: Highway impact assessment

    NASA Technical Reports Server (NTRS)

    1972-01-01

    An existing section of M-14 freeway constructed in 1964 and a potential extension from Ann Arbor to Plymouth, Michigan provided an opportunity for investigating the potential uses of remote sensing techniques in providing projective information needed for assessing the impact of highway construction. Remote sensing data included multispectral scanner imagery and aerial photography. Only minor effects on vegetation, soils, and land use were found to have occurred in the existing corridor. Adverse changes expected to take place in the corridor proposed for extension of the freeway can be minimized by proper design of drainage ditches and attention to good construction practices. Remote sensing can be used to collect and present many types of data useful for highway impact assessment on land use, vegetation categories and species, soil properties and hydrologic characteristics.

  17. Special-effect edit detection using VideoTrails: a comparison with existing techniques

    NASA Astrophysics Data System (ADS)

    Kobla, Vikrant; DeMenthon, Daniel; Doermann, David S.

    1998-12-01

    Video segmentation plays an integral role in many multimedia applications, such as digital libraries, content management systems, and various other video browsing, indexing, and retrieval systems. Many algorithms for segmentation of video have appeared within the past few years. Most of these algorithms perform well on cuts, but yield poor performance on gradual transitions or special-effect edits. A complete video segmentation system must also achieve good performance on special-effect edit detection. In this paper, we compare the performance of our VideoTrails-based algorithms with that of other special-effect edit-detection algorithms in the literature. We present results from experiments testing the ability to detect edits in TV programs ranging from commercials to news magazine programs, including the diverse special-effect edits they contain.

  18. Method and apparatus for converting static in-ground vehicle scales into weigh-in-motion systems

    DOEpatents

    Muhs, Jeffrey D.; Scudiere, Matthew B.; Jordan, John K.

    2002-01-01

    An apparatus and method for converting in-ground static weighing scales for vehicles to weigh-in-motion systems. The apparatus upon conversion includes the existing in-ground static scale, peripheral switches and an electronic module for automatic computation of the weight. By monitoring the velocity, tire position, axle spacing, and real time output from existing static scales as a vehicle drives over the scales, the system determines when an axle of a vehicle is on the scale at a given time, monitors the combined weight output from any given axle combination on the scale(s) at any given time, and from these measurements automatically computes the weight of each individual axle and gross vehicle weight by an integration, integration approximation, and/or signal averaging technique.
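
    The weight computation reduces to averaging (or integrating) the dynamic scale output over the interval an axle is on the platform, which cancels the oscillation of the moving load about its static value. A toy sketch of the signal-averaging idea, with made-up numbers rather than the patented algorithm:

```python
import numpy as np

def axle_weight(scale_signal, on_scale_mask):
    """Average the dynamic scale output while the axle is on the platform;
    the oscillatory part of the load tends to cancel, leaving the static
    weight. Illustrative sketch only."""
    return scale_signal[on_scale_mask].mean()

# Synthetic example: a 5000 kg axle whose dynamic load oscillates
# around its static value as the vehicle drives over the scale.
dt = 0.001
t = np.arange(0.0, 0.5, dt)
signal = 5000.0 + 300.0 * np.sin(2 * np.pi * 12 * t)  # dynamic ripple
mask = (t > 0.05) & (t < 0.45)                        # axle on the scale
w = axle_weight(signal, mask)
print(round(w, 1))  # close to the 5000 kg static weight
```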

  19. Managing Nonpoint Source Pollution in Western Washington: Landowner Learning Methods and Motivations

    NASA Astrophysics Data System (ADS)

    Ryan, Clare M.

    2009-06-01

    States, territories, and tribes identify nonpoint source pollution as responsible for more than half of the Nation’s existing and threatened water quality impairments, making it the principal remaining cause of water quality problems across the United States. Combinations of education, technical and financial assistance, and regulatory measures are used to inform landowners about nonpoint source pollution issues, and to stimulate the use of best management practices. A mail survey of non-commercial riparian landowners investigated how they learn about best management practices, the efficacy of different educational techniques, and what motivates them to implement land management activities. Landowners experience a variety of educational techniques, and rank those that include direct personal contact as more effective than brochures, advertisements, radio, internet, or television. The most important motivations for implementing best management practices were linked with elements of a personal stewardship ethic, accountability, personal commitment, and feasibility. Nonpoint source education and social marketing campaigns should include direct interpersonal contacts, and appeal to landowner motivations of caring, responsibility, and personal commitment.

  20. Airborne remote sensing of ultraviolet-absorbing aerosols during the NASA ATom, SEAC4RS and DC3 campaigns

    NASA Astrophysics Data System (ADS)

    Hall, S. R.; Ullmann, K.; Commane, R.; Crounse, J. D.; Daube, B. C.; Diskin, G. S.; Dollner, M.; Froyd, K. D.; Katich, J. M.; Kim, M. J.; Madronich, S.; Murphy, D. M.; Podolske, J. R.; Schwarz, J. P.; Teng, A.; Weber, R. J.; Weinzierl, B.; Wennberg, P. O.; Sachse, G.; Wofsy, S.

    2017-12-01

    Spectrally resolved up- and down-welling actinic flux was measured from the NASA DC-8 aircraft by the Charge-coupled device Actinic Flux Spectroradiometers (CAFS) during recent campaigns including ATom, DC3 and SEAC4RS. The primary purpose is the retrieval of 40 photolysis frequencies to complement the in situ chemistry. However, the spectra also provide the opportunity to examine absorption trends in the UV, where few other measurements exist. In particular, absorption by brown carbon (BrC) and black carbon (BC) aerosols results in characteristic UV signatures. A new technique exploits these spectral changes to detect the presence of such aerosols for qualitative, real-time remote sensing of biomass burning (BB). The data may prove useful for examining the evolution of BrC, including chemical processing and hygroscopic growth. The induced UV changes also feed back to the photolysis frequencies, affecting the chemistry. Further work will determine the robustness of the technique and whether quantitative spectral absorption retrievals are possible.

  1. Power measurement system of ECRH on HL-2A

    NASA Astrophysics Data System (ADS)

    Wang, He; Lu, Zhihong; Kubo, Shin; Chen, Gangyu; Wang, Chao; Zhou, Jun; Huang, Mei; Rao, Jun

    2015-03-01

    Electron Cyclotron Resonance Heating (ECRH) is one of the main auxiliary heating systems for the HL-2A tokamak. The ECRH system equipped on HL-2A has a total output power of 5 MW, comprising six 0.5 MW/1.0 s units at a frequency of 68 GHz and two 1 MW/3 s units at a frequency of 140 GHz. Power is one of the important parameters of the ECRH system. In this paper, the methods for measuring the power of the ECRH system on HL-2A are introduced, which include calorimetric techniques and directional couplers. The calorimetric technique is an established method that has been used successfully in ECRH commissioning and experiments; with it, the transmission efficiency of the ECRH system is obtained by measuring the microwave power absorbed in the Matching Optics Unit (MOU), the gyrotron output window and the torus window of the EC system. Now, based on the theory of electromagnetic coupling through apertures, directional couplers are being designed, providing a new measurement approach.
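
    The calorimetric method reduces to a heat balance on the cooling water: absorbed microwave power equals mass flow times specific heat times temperature rise. A minimal sketch with illustrative numbers (not HL-2A data):

```python
# Calorimetric power measurement sketch: the microwave power absorbed in
# a cooled component equals the heat carried away by the cooling water.

def absorbed_power_kw(flow_l_per_min, delta_t_kelvin):
    c_p = 4186.0                          # J/(kg K), water
    m_dot = flow_l_per_min * 1.0 / 60.0   # kg/s (1 kg per liter of water)
    return m_dot * c_p * delta_t_kelvin / 1000.0  # kW

# e.g. 30 L/min of cooling water with a 4 K rise across the load:
print(round(absorbed_power_kw(30.0, 4.0), 2))  # 8.37 kW
```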

  2. Parental Self-Assessment of Behavioral Effectiveness in Young Children and Views on Corporal Punishment in an Academic Pediatric Practice.

    PubMed

    Irons, Lance B; Flatin, Heidi; Harrington, Maya T; Vazifedan, Turaj; Harrington, John W

    2018-03-01

    This article assesses parental confidence and current behavioral techniques used by mostly African American caregivers of young children in an urban Southeastern setting, including their use of and attitudes toward corporal punishment (CP). Two hundred and fifty parental participants of children aged 18 months to 5 years completed a survey on factors affecting their behavioral management and views on CP. Statistical analysis included the χ2 test and logistic regression, with significance determined at P < .05. Significant associations with CP usage were found for parents who were themselves exposed to CP and for parental level of frustration with child disobedience. A total of 40.2% of respondents answered that they had not received any discipline strategies from pediatricians and 47.6% were interested in receiving more behavioral strategies. Clear opportunities exist for pediatricians to provide information on evidence-based disciplinary techniques, and these discussions may be facilitated through the creation of a No Hit Zone program in the pediatric practice.

  3. Successful Treatment of Early Talar Osteonecrosis by Core Decompression Combined with Intraosseous Stem Cell Injection: A Case Report.

    PubMed

    Nevalainen, Mika T; Repo, Jussi P; Pesola, Maija; Nyrhinen, Jukka P

    2018-01-01

    Osteonecrosis of the talus is a fairly rare condition. Many predisposing factors have been identified, including previous trauma, use of corticosteroids, alcoholism, and smoking. As the gold standard, magnetic resonance imaging (MRI) is the most sensitive and specific diagnostic examination for detecting osteonecrosis. While many treatment options for talar osteonecrosis exist, core decompression is suggested for young patients, with good outcome results. More recently, intraosseous stem cell and platelet-rich plasma (PRP) injection has been added to the core decompression procedure. We report the successful treatment of early talar osteonecrosis, ARCO (Association Research Circulation Osseous) stage I, by core decompression combined with stem cell and PRP injection. At 3-month and 15-month follow-up, MRI showed complete resolution of the osteonecrotic changes together with clinical improvement. This modified technique is a viable treatment option for early talar osteonecrosis. Nevertheless, future work should include a study comparing this combined technique with plain core decompression.

  4. Using ProHits to store, annotate and analyze affinity purification - mass spectrometry (AP-MS) data

    PubMed Central

    Liu, Guomin; Zhang, Jianping; Choi, Hyungwon; Lambert, Jean-Philippe; Srikumar, Tharan; Larsen, Brett; Nesvizhskii, Alexey I.; Raught, Brian; Tyers, Mike; Gingras, Anne-Claude

    2012-01-01

    Affinity purification coupled with mass spectrometry (AP-MS) is a robust technique used to identify protein-protein interactions. With recent improvements in sample preparation, and dramatic advances in MS instrumentation speed and sensitivity, this technique is becoming more widely used throughout the scientific community. To meet the needs of research groups both large and small, we have developed software solutions for tracking, scoring and analyzing AP-MS data. Here, we provide details for the installation and utilization of ProHits, a Laboratory Information Management System designed specifically for AP-MS interaction proteomics. This protocol explains: (i) how to install the complete ProHits system, including modules for the management of mass spectrometry files and the analysis of interaction data, and (ii) alternative options for the use of pre-existing search results in simpler versions of ProHits, including a virtual machine implementation of our ProHits Lite software. We also describe how to use the main features of the software to analyze AP-MS data. PMID:22948730

  5. Study of synthesis techniques for insensitive aircraft control systems

    NASA Technical Reports Server (NTRS)

    Harvey, C. A.; Pope, R. E.

    1977-01-01

    Insensitive flight control system design criteria were defined in terms of maximizing performance (handling qualities, RMS gust response, transient response, stability margins) over a defined parameter range. Wing load alleviation for the C-5A was chosen as a design problem. The C-5A model was a 79-state, two-control structure with uncertainties assumed to exist in dynamic pressure, structural damping and frequency, and the stability derivative M sub w. Five new techniques (mismatch estimation, uncertainty weighting, finite dimensional inverse, maximum difficulty, dual Lyapunov) were developed. Six existing techniques (additive noise, minimax, multiplant, sensitivity vector augmentation, state dependent noise, residualization) and the new mismatch estimation and uncertainty weighting techniques were synthesized and evaluated on the design example. Evaluation and comparison of these eight techniques indicated that the minimax and the uncertainty weighting techniques were superior to the other six, and of these two, uncertainty weighting has lower computational requirements. Techniques based on the three remaining new concepts appear promising and are recommended for further research.

  6. Robotic Lunar Rover Technologies and SEI Supporting Technologies at Sandia National Laboratories

    NASA Technical Reports Server (NTRS)

    Klarer, Paul R.

    1992-01-01

    Existing robotic rover technologies at Sandia National Laboratories (SNL) can be applied toward the realization of a robotic lunar rover mission in the near term. Recent activities at the SNL-RVR have demonstrated the utility of existing rover technologies for performing remote field geology tasks similar to those envisioned on a robotic lunar rover mission. Specific technologies demonstrated include low-data-rate teleoperation, multivehicle control, remote site and sample inspection, standard bandwidth stereo vision, and autonomous path following based on both internal dead reckoning and an external position location update system. These activities serve to support the use of robotic rovers for an early return to the lunar surface by demonstrating capabilities that are attainable with off-the-shelf technology and existing control techniques. The breadth of technical activities at SNL provides many supporting technology areas for robotic rover development. These range from core competency areas and microsensor fabrication facilities, to actual space qualification of flight components that are designed and fabricated in-house.

  7. Laparoscopic liver resection: when to use the laparoscopic stapler device

    PubMed Central

    Gumbs, Andrew A.; Gayet, Brice

    2008-01-01

    Minimally invasive hepatic resection was first described by Gagner et al. in the early 1990s and since then has become increasingly adopted by hepatobiliary and liver transplant surgeons. Several techniques exist to transect the hepatic parenchyma laparoscopically and include transection with stapler and/or energy devices, such as ultrasonic shears, radiofrequency ablation and bipolar devices. We believe that coagulative techniques allow for superior anatomic resections and ultimately permit for the performance of more complex hepatic resections. In the stapling technique, Glisson's capsule is usually incised with an energy device until the parenchyma is thinned out and multiple firings of the staplers are then used to transect the remaining parenchyma and larger bridging segmental vessels and ducts. Besides the economic constraints of using multiple stapler firings, the remaining staples have the disadvantage of hindering and even preventing additional hemostasis of the raw liver surface with monopolar and bipolar electrocautery. The laparoscopic stapler device is, however, useful for transection of the main portal branches and hepatic veins during minimally invasive major hepatic resections. Techniques to safely perform major hepatic resection with the above techniques will be described with an emphasis on when and how laparoscopic vascular staplers should be used. PMID:18773113

  8. A novel numerical framework for self-similarity in plasticity: Wedge indentation in single crystals

    NASA Astrophysics Data System (ADS)

    Juul, K. J.; Niordson, C. F.; Nielsen, K. L.; Kysar, J. W.

    2018-03-01

    A novel numerical framework for analyzing self-similar problems in plasticity is developed and demonstrated. Self-similar problems of this kind include processes such as stationary cracks, void growth, and indentation. The proposed technique offers a simple and efficient method for handling this class of complex problems by avoiding issues related to traditional Lagrangian procedures. Moreover, the proposed technique allows for focusing the mesh in the region of interest. In the present paper, the technique is exploited to analyze the well-known wedge indentation problem of an elastic-viscoplastic single crystal. However, the framework may be readily adapted to any constitutive law of interest. The main focus herein is the development of the self-similar framework, while the indentation study serves primarily as verification of the technique by comparison to existing numerical and analytical studies. In this study, the three most common metal crystal structures are investigated, namely the face-centered cubic (FCC), body-centered cubic (BCC), and hexagonal close-packed (HCP) structures, and the stress and slip rate fields around the moving contact point singularity are presented.

  9. Electroconductivity technique for the measurement of axial variation of holdups in three-phase fluidized beds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Begovich, J.M.; Watson, J.S.

    1978-03-01

    An electroconductivity technique is described which can be used not only for determining the overall phase holdups in a three-phase fluidized bed, but, more importantly, also for determining the local holdups as a function of height in the column. One disadvantage of the technique is that it can only be applied to systems with electroconductive liquids. However, since most real or prototype systems either use water or can be simulated with a fluid that can readily be made electroconductive, this handicap does not seem to be too severe. The technique has been applied successfully to a number of systems, including porous alumina beads, if a correction is made for their internal porosity. It has shown the existence of the transition region as the bed goes from a three-phase to a two-phase system. Further work should result in correlations for the distribution of the three phases throughout the entire column. These predictive equations will help in the rational design of reactors in which local conditions throughout the bed must be considered.
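
    A common way such conductivity data are reduced (a simple linear model; the paper's own calibration may differ) takes the liquid holdup as the ratio of bed to liquid conductivity, with the three phase holdups constrained to sum to one:

```python
def holdups(k_bed, k_liquid, solids_holdup):
    """Illustrative linear conductivity model: liquid holdup equals the
    bed-to-liquid conductivity ratio; gas holdup follows from the
    constraint that the three phase holdups sum to one."""
    eps_liquid = k_bed / k_liquid
    eps_gas = 1.0 - eps_liquid - solids_holdup
    return eps_gas, eps_liquid, solids_holdup

# Made-up readings at one height in the column:
eps_g, eps_l, eps_s = holdups(k_bed=0.45, k_liquid=1.0, solids_holdup=0.35)
print(round(eps_g, 2), round(eps_l, 2), round(eps_s, 2))  # 0.2 0.45 0.35
```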

  10. Background adaptive division filtering for hand-held ground penetrating radar

    NASA Astrophysics Data System (ADS)

    Lee, Matthew A.; Anderson, Derek T.; Ball, John E.; White, Julie L.

    2016-05-01

    The challenge in detecting explosive hazards is that there are multiple types of targets buried at different depths in a highly cluttered environment. A wide array of target and clutter signatures exists, which makes detection algorithm design difficult. Such explosive hazards are typically deployed in past and present war zones and pose a grave threat to the safety of civilians and soldiers alike. This paper focuses on a new image enhancement technique for hand-held ground penetrating radar (GPR). Advantages of the proposed technique are that it runs in real time and does not require the radar to remain at a constant distance from the ground. Herein, we evaluate the performance of the proposed technique using data collected from a U.S. Army test site, which includes targets with varying amounts of metal content, placement depths, clutter and times of day. Receiver operating characteristic (ROC) curve-based results are presented for the detection of shallow, medium and deeply buried targets. Preliminary results are very encouraging and demonstrate the usefulness of the proposed filtering technique.
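
    The division-filtering idea can be sketched as dividing each incoming A-scan by a running estimate of the background, so that target responses stand out as ratios above one while slowly varying ground clutter maps to roughly one. The adaptation rule and parameters below are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def division_filter(bscan, alpha=0.95):
    """Background-adaptive division filtering sketch for GPR B-scans:
    maintain an exponential running estimate of the background A-scan
    and divide each new A-scan by it. Illustrative only."""
    background = bscan[0].astype(float).copy()
    out = np.empty_like(bscan, dtype=float)
    for i, ascan in enumerate(bscan):
        out[i] = ascan / np.maximum(background, 1e-9)
        background = alpha * background + (1 - alpha) * ascan
    return out

# Synthetic B-scan: constant ground clutter plus one buried target.
bscan = np.full((100, 64), 10.0)
bscan[60, 30] = 25.0                    # target response
filtered = division_filter(bscan)
print(round(filtered[60, 30], 2))       # target enhanced (~2.5)
print(round(filtered[59, 29], 2))       # clutter suppressed (~1.0)
```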

  11. Custom-Made Finger Guard to Prevent Wire-Stick Injury to the Operator's Finger while Performing Intermaxillary Fixation.

    PubMed

    Kumaresan, Ramesh; Ponnusami, Karthikeyan; Karthikeyan, Priyadarshini

    2014-12-01

    The treatment of maxillofacial fractures involves different methods, from bandages and splinting to open reduction and internal fixation, and usually requires control of the dental occlusion with the help of intermaxillary fixation (IMF). Different wiring techniques have been used to aid in IMF, including placement of custom-made arch bars, eyelets, etc. However, these wiring techniques carry a constant danger of trauma to the surgeon's fingers from their sharp wire ends. Though a variety of commercially available barrier products and customized techniques exist to prevent wire-stick injury, cost, touch sensitivity, and comfort considerations limit their adoption and use. This technical note describes the construction of a simple and economical finger guard made of soft thermoplastic material that provides added protection to the fingers from wire-stick injuries, while its flexible nature permits comfortable finger flexion and acceptable touch sensitivity. With this simple, economical, reusable, puncture- and cut-resistant finger guard, wire-stick injury to the operator's fingers during wiring can be avoided.

  12. Apparatus for improving performance of electrical insulating structures

    DOEpatents

    Wilson, Michael J.; Goerz, David A.

    2004-08-31

    Removing the electrical field from the internal volume of high-voltage structures, e.g., bushings, connectors, capacitors, and cables. The electrical field is removed from inherently weak regions of the interconnect, such as between the center conductor and the solid dielectric, and placed in the primary insulation. This is accomplished by providing a conductive surface on the inside surface of the principal solid dielectric insulator surrounding the center conductor and connecting the center conductor to this conductive surface. Removing the electric fields from the weaker dielectric region to a stronger area improves reliability, increases component life and operating levels, reduces noise and losses, and allows for a smaller, more compact design. This electric-field control approach is currently possible on many existing products at a modest cost. Several techniques are available to provide the level of electric-field control needed. Choosing the optimum technique depends on material, size, and surface accessibility. The simplest deposition method uses a standard electroless plating technique, but other metalization techniques include vapor and energetic deposition, plasma spraying, conductive painting, and other controlled coating methods.

  13. Apparatus for improving performance of electrical insulating structures

    DOEpatents

    Wilson, Michael J.; Goerz, David A.

    2002-01-01

This invention removes the electrical field from the internal volume of high-voltage structures, e.g., bushings, connectors, capacitors, and cables. The electrical field is removed from inherently weak regions of the interconnect, such as between the center conductor and the solid dielectric, and placed in the primary insulation. This is accomplished by providing a conductive surface on the inside surface of the principal solid dielectric insulator surrounding the center conductor and connecting the center conductor to this conductive surface. Removing the electric field from the weaker dielectric region to a stronger one improves reliability, increases component life and operating levels, reduces noise and losses, and allows a smaller, more compact design. This electric field control approach can be applied to many existing products at modest cost. Several techniques are available to provide the level of electric field control needed; choosing the optimum technique depends on material, size, and surface accessibility. The simplest deposition method uses a standard electroless plating technique, but other metalization techniques include vapor and energetic deposition, plasma spraying, conductive painting, and other controlled coating methods.

  14. Method for improving performance of highly stressed electrical insulating structures

    DOEpatents

    Wilson, Michael J.; Goerz, David A.

    2002-01-01

This invention removes the electrical field from the internal volume of high-voltage structures, e.g., bushings, connectors, capacitors, and cables. The electrical field is removed from inherently weak regions of the interconnect, such as between the center conductor and the solid dielectric, and placed in the primary insulation. This is accomplished by providing a conductive surface on the inside surface of the principal solid dielectric insulator surrounding the center conductor and connecting the center conductor to this conductive surface. Removing the electric field from the weaker dielectric region to a stronger one improves reliability, increases component life and operating levels, reduces noise and losses, and allows a smaller, more compact design. This electric field control approach can be applied to many existing products at modest cost. Several techniques are available to provide the level of electric field control needed; choosing the optimum technique depends on material, size, and surface accessibility. The simplest deposition method uses a standard electroless plating technique, but other metalization techniques include vapor and energetic deposition, plasma spraying, conductive painting, and other controlled coating methods.

  15. Human tracking over camera networks: a review

    NASA Astrophysics Data System (ADS)

    Hou, Li; Wan, Wanggen; Hwang, Jenq-Neng; Muhammad, Rizwan; Yang, Mingyang; Han, Kang

    2017-12-01

In recent years, automated human tracking over camera networks has become essential for video surveillance. Tracking humans over camera networks is not only inherently challenging due to changing human appearance, but also has enormous potential for a wide range of practical applications, ranging from security surveillance to retail and health care. This review paper surveys the most widely used techniques and recent advances for human tracking over camera networks. Two important functional modules for human tracking over camera networks are addressed: human tracking within a camera and human tracking across non-overlapping cameras. The core techniques of human tracking within a camera are discussed based on two aspects, i.e., generative trackers and discriminative trackers. The core techniques of human tracking across non-overlapping cameras are then discussed based on the aspects of human re-identification, camera-link model-based tracking and graph model-based tracking. Our survey aims to address existing problems, challenges, and future research directions based on analyses of the current progress made toward human tracking techniques over camera networks.

  16. Location of planar targets in three space from monocular images

    NASA Technical Reports Server (NTRS)

    Cornils, Karin; Goode, Plesent W.

    1987-01-01

    Many pieces of existing and proposed space hardware that would be targets of interest for a telerobot can be represented as planar or near-planar surfaces. Examples include the biostack modules on the Long Duration Exposure Facility, the panels on Solar Max, large diameter struts, and refueling receptacles. Robust and temporally efficient methods for locating such objects with sufficient accuracy are therefore worth developing. Two techniques that derive the orientation and location of an object from its monocular image are discussed and the results of experiments performed to determine translational and rotational accuracy are presented. Both the quadrangle projection and elastic matching techniques extract three-space information using a minimum of four identifiable target points and the principles of the perspective transformation. The selected points must describe a convex polygon whose geometric characteristics are prespecified in a data base. The rotational and translational accuracy of both techniques was tested at various ranges. This experiment is representative of the sensing requirements involved in a typical telerobot target acquisition task. Both techniques determined target location to an accuracy sufficient for consistent and efficient acquisition by the telerobot.

  17. A new surgical technique for concealed penis using an advanced musculocutaneous scrotal flap.

    PubMed

    Han, Dong-Seok; Jang, Hoon; Youn, Chang-Shik; Yuk, Seung-Mo

    2015-06-19

    Until recently, no single, universally accepted surgical method has existed for all types of concealed penis repairs. We describe a new surgical technique for repairing concealed penis by using an advanced musculocutaneous scrotal flap. From January 2010 to June 2014, we evaluated 12 patients (12-40 years old) with concealed penises who were surgically treated with an advanced musculocutaneous scrotal flap technique after degloving through a ventral approach. All the patients were scheduled for regular follow-up at 6, 12, and 24 weeks postoperatively. The satisfaction grade for penile size, morphology, and voiding status were evaluated using a questionnaire preoperatively and at all of the follow-ups. Information regarding complications was obtained during the postoperative hospital stay and at all follow-ups. The patients' satisfaction grades, which included the penile size, morphology, and voiding status, improved postoperatively compared to those preoperatively. All patients had penile lymphedema postoperatively; however, this disappeared within 6 weeks. There were no complications such as skin necrosis and contracture, voiding difficulty, or erectile dysfunction. Our advanced musculocutaneous scrotal flap technique for concealed penis repair is technically easy and safe. In addition, it provides a good cosmetic appearance, functional outcomes and excellent postoperative satisfaction grades. Lastly, it seems applicable in any type of concealed penis, including cases in which the ventral skin defect is difficult to cover.

  18. Clinical veterinary proteomics: Techniques and approaches to decipher the animal plasma proteome.

    PubMed

    Ghodasara, P; Sadowski, P; Satake, N; Kopp, S; Mills, P C

    2017-12-01

    Over the last two decades, technological advancements in the field of proteomics have advanced our understanding of the complex biological systems of living organisms. Techniques based on mass spectrometry (MS) have emerged as powerful tools to contextualise existing genomic information and to create quantitative protein profiles from plasma, tissues or cell lines of various species. Proteomic approaches have been used increasingly in veterinary science to investigate biological processes responsible for growth, reproduction and pathological events. However, the adoption of proteomic approaches by veterinary investigators lags behind that of researchers in the human medical field. Furthermore, in contrast to human proteomics studies, interpretation of veterinary proteomic data is difficult due to the limited protein databases available for many animal species. This review article examines the current use of advanced proteomics techniques for evaluation of animal health and welfare and covers the current status of clinical veterinary proteomics research, including successful protein identification and data interpretation studies. It includes a description of an emerging tool, sequential window acquisition of all theoretical fragment ion mass spectra (SWATH-MS), available on selected mass spectrometry instruments. This newly developed data acquisition technique combines advantages of discovery and targeted proteomics approaches, and thus has the potential to advance the veterinary proteomics field by enhancing identification and reproducibility of proteomics data. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Fringe pattern demodulation with a two-dimensional digital phase-locked loop algorithm.

    PubMed

    Gdeisat, Munther A; Burton, David R; Lalor, Michael J

    2002-09-10

    A novel technique called a two-dimensional digital phase-locked loop (DPLL) for fringe pattern demodulation is presented. This algorithm is more suitable for demodulation of fringe patterns with varying phase in two directions than the existing DPLL techniques that assume that the phase of the fringe patterns varies only in one direction. The two-dimensional DPLL technique assumes that the phase of a fringe pattern is continuous in both directions and takes advantage of the phase continuity; consequently, the algorithm has better noise performance than the existing DPLL schemes. The two-dimensional DPLL algorithm is also suitable for demodulation of fringe patterns with low sampling rates, and it outperforms the Fourier fringe analysis technique in this aspect.
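The feedback idea behind a DPLL demodulator can be sketched in a few lines of Python. This is a minimal one-dimensional, first-order loop for illustration only, not the paper's two-dimensional algorithm; the function name `dpll_demodulate`, the gain value, and the synthetic fringe signal are all assumptions:

```python
import math

def dpll_demodulate(signal, omega0, gain=0.2, phi0=0.0):
    """Track the phase of a unit-amplitude fringe signal with a
    first-order digital phase-locked loop. Returns the running
    phase estimates, one per input sample."""
    est = phi0
    phases = []
    for s in signal:
        phases.append(est)
        # phase-detector output: -signal * sin(estimate); its
        # low-frequency part is ~0.5*sin(true_phase - estimate)
        est += omega0 + gain * (-s * math.sin(est))
    return phases

# synthetic fringe with a known linear phase; the loop starts 1 rad off
omega = 0.5
true_phase = [omega * n for n in range(400)]
signal = [math.cos(p) for p in true_phase]
phases = dpll_demodulate(signal, omega, phi0=1.0)
# the phase error shrinks from 1 rad to a small residual ripple
```

The detector output contains a double-frequency term that averages out over the loop, leaving a restoring force proportional to the sine of the phase error, which pulls the estimate toward the true phase.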

  20. Bounding the Resource Availability of Partially Ordered Events with Constant Resource Impact

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy

    2004-01-01

We compare existing techniques to bound the resource availability of partially ordered events. We first show that, contrary to intuition, two existing techniques, one due to Laborie and one due to Muscettola, are not strictly comparable in terms of the size of the search trees generated under chronological search with a fixed heuristic. We describe a generalization of these techniques, called the Flow Balance Constraint, that tightly bounds the amount of available resource for a set of partially ordered events with piecewise constant resource impact. We prove that the new technique generates smaller proof trees under chronological search with a fixed heuristic, at little increase in computational expense. We then show how to construct tighter resource bounds, but at increased computational cost.
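The intuition behind balance-style resource bounds can be sketched as follows: for an optimistic upper bound on the resource level just after an event, add every impact that must have occurred plus every positive impact that might have occurred. This toy sketch is not the Flow Balance Constraint itself, and all names are hypothetical:

```python
# Events have constant resource impacts (positive = production,
# negative = consumption). Optimistic upper bound on the level just
# after event e: initial level, plus e's own impact, plus the impacts
# of all events that must precede e, plus only the *positive* impacts
# of events whose order relative to e is undecided (they might come
# first).

def upper_bound(event, impacts, preceders, unordered, initial=0):
    ub = initial + impacts[event]
    ub += sum(impacts[p] for p in preceders[event])
    ub += sum(impacts[u] for u in unordered[event] if impacts[u] > 0)
    return ub

# tiny partial order: a must precede c; b is unordered with both
impacts = {'a': 2, 'b': -1, 'c': 3}
preceders = {'a': set(), 'b': set(), 'c': {'a'}}
unordered = {'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b'}}
print(upper_bound('c', impacts, preceders, unordered))  # -> 5
```

A matching pessimistic lower bound would instead add only the negative impacts of unordered events; the gap between the two bounds is what tighter techniques shrink.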

  1. Locality-Aware CTA Clustering For Modern GPUs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Ang; Song, Shuaiwen; Liu, Weifeng

    2017-04-08

In this paper, we proposed a novel clustering technique for tapping into the performance potential of a largely ignored type of locality: inter-CTA locality. We first demonstrated the capability of the existing GPU hardware to exploit such locality, both spatially and temporally, on L1 or L1/Tex unified cache. To verify the potential of this locality, we quantified its existence in a broad spectrum of applications and discussed its sources of origin. Based on these insights, we proposed the concept of CTA-Clustering and its associated software techniques. Finally, we evaluated these techniques on all modern generations of NVIDIA GPU architectures. The experimental results showed that our proposed clustering techniques could significantly improve on-chip cache performance.

  2. Rapid, topology-based particle tracking for high-resolution measurements of large complex 3D motion fields.

    PubMed

    Patel, Mohak; Leggett, Susan E; Landauer, Alexander K; Wong, Ian Y; Franck, Christian

    2018-04-03

    Spatiotemporal tracking of tracer particles or objects of interest can reveal localized behaviors in biological and physical systems. However, existing tracking algorithms are most effective for relatively low numbers of particles that undergo displacements smaller than their typical interparticle separation distance. Here, we demonstrate a single particle tracking algorithm to reconstruct large complex motion fields with large particle numbers, orders of magnitude larger than previously tractably resolvable, thus opening the door for attaining very high Nyquist spatial frequency motion recovery in the images. Our key innovations are feature vectors that encode nearest neighbor positions, a rigorous outlier removal scheme, and an iterative deformation warping scheme. We test this technique for its accuracy and computational efficacy using synthetically and experimentally generated 3D particle images, including non-affine deformation fields in soft materials, complex fluid flows, and cell-generated deformations. We augment this algorithm with additional particle information (e.g., color, size, or shape) to further enhance tracking accuracy for high gradient and large displacement fields. These applications demonstrate that this versatile technique can rapidly track unprecedented numbers of particles to resolve large and complex motion fields in 2D and 3D images, particularly when spatial correlations exist.
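The nearest-neighbour feature idea can be illustrated with a toy sketch: describe each particle by the sorted distances to its k nearest neighbours, then match particles across frames by feature similarity. This is only a greedy caricature of the paper's algorithm (it omits the outlier removal and iterative deformation warping), and all names are hypothetical:

```python
import math

def knn_feature(p, points, k=3):
    # feature vector: sorted distances to the k nearest other particles
    return sorted(math.dist(p, q) for q in points if q is not p)[:k]

def match(frame_a, frame_b, k=3):
    # greedily pair each particle in frame_a with the frame_b particle
    # whose neighbour-distance feature is most similar
    feats_b = [knn_feature(q, frame_b, k) for q in frame_b]
    pairs = []
    for p in frame_a:
        fa = knn_feature(p, frame_a, k)
        j = min(range(len(frame_b)),
                key=lambda j: sum((x - y) ** 2
                                  for x, y in zip(fa, feats_b[j])))
        pairs.append(j)
    return pairs

# a small rigid shift leaves the neighbour features nearly unchanged,
# so each particle finds its counterpart even though all positions moved
frame_a = [(0.0, 0.0), (1.0, 0.0), (5.0, 5.0)]
frame_b = [(0.1, 0.0), (1.1, 0.0), (5.1, 5.0)]
print(match(frame_a, frame_b, k=2))  # -> [0, 1, 2]
```

The point of neighbour-based features is exactly what the example shows: they are invariant to displacements that are large compared with interparticle spacing, as long as the local topology is preserved.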

  3. Earthquake prediction evaluation standards applied to the VAN Method

    NASA Astrophysics Data System (ADS)

    Jackson, David D.

    Earthquake prediction research must meet certain standards before it can be suitably evaluated for potential application in decision making. For methods that result in a binary (on or off) alarm condition, requirements include (1) a quantitative description of observables that trigger an alarm, (2) a quantitative description, including ranges of time, location, and magnitude, of the predicted earthquakes, (3) documented evidence of all previous alarms, (4) a complete list of predicted earthquakes, (5) a complete list of unpredicted earthquakes. The VAN technique [Varotsos and Lazaridou, 1991; Varotsos et al., 1996] has not yet been stated as a testable hypothesis. It fails criteria (1) and (2) so it is not ready to be evaluated properly. Although telegrams were transmitted in advance of claimed successes, these telegrams did not fully specify the predicted events, and all of the published statistical evaluations involve many subjective ex post facto decisions. Lacking a statistically demonstrated relationship to earthquakes, a candidate prediction technique should satisfy several plausibility criteria, including: (1) a reasonable relationship between the location of the candidate precursor and that of the predicted earthquake, (2) some demonstration that the candidate precursory observations are related to stress, strain, or other quantities related to earthquakes, and (3) the existence of co-seismic as well as pre-seismic variations of the candidate precursor. The VAN technique meets none of these criteria.
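Requirements (3)-(5) above amount to keeping a complete contingency table for an alarm-based prediction method. A minimal sketch of that bookkeeping, with hypothetical names and times as plain numbers:

```python
def alarm_scores(alarm_windows, quake_times):
    """Contingency counts for an alarm-based prediction method:
    hits (quakes inside some alarm window), misses (quakes outside
    all windows), and false alarms (windows containing no quake)."""
    hits = sum(any(s <= q <= e for s, e in alarm_windows)
               for q in quake_times)
    misses = len(quake_times) - hits
    false_alarms = sum(not any(s <= q <= e for q in quake_times)
                       for s, e in alarm_windows)
    return hits, misses, false_alarms

# one quake inside an alarm, one missed, one empty alarm window
print(alarm_scores([(0, 10), (20, 30)], [5, 40]))  # -> (1, 1, 1)
```

Only with all three counts recorded in advance can a method's skill be compared against a random or rate-based baseline, which is the crux of the evaluation standards discussed above.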

  4. Peroral endoscopic myotomy as salvation technique post-Heller: International experience.

    PubMed

    Tyberg, Amy; Sharaiha, Reem Z; Familiari, Pietro; Costamagna, Guido; Casas, Fernando; Kumta, Nikhil A; Barret, Maximilien; Desai, Amit P; Schnoll-Sussman, Felice; Saxena, Payal; Martínez, Guadalupe; Zamarripa, Felipe; Gaidhane, Monica; Bertani, Helga; Draganov, Peter V; Balassone, Valerio; Sharata, Ahmed; Reavis, Kevin; Swanstrom, Lee; Invernizzi, Martina; Seewald, Stefan; Minami, Hitomi; Inoue, Haruhiro; Kahaleh, Michel

    2018-01-01

    Treatment for achalasia has traditionally been Heller myotomy (HM). Despite its excellent efficacy rate, a number of patients remain symptomatic post-procedure. Limited data exist as to the best management for recurrence of symptoms post-HM. We present an international, multicenter experience evaluating the efficacy and safety of post-HM peroral endoscopic myotomy (POEM). Patients who underwent POEM post-HM from 13 centers from January 2012 to January 2017 were included as part of a prospective registry. Technical success was defined as successful completion of the myotomy. Clinical success was defined as an Eckardt score of ≤3 on 12-month follow up. Adverse events (AE) including anesthesia-related, operative, and postoperative complications were recorded. Fifty-one patients were included in the study (mean age 54.2, 47% male). Technical success was achieved in 100% of patients. Clinical success on long-term follow up was achieved in 48 patients (94%), with a mean change in Eckardt score of 6.25. Seven patients (13%) had AE: six experienced periprocedural mucosal defect treated endoscopically and two patients developed mediastinitis treated conservatively. For patients with persistent symptoms after HM, POEM is a safe salvation technique with good short-term efficacy. As a result of the challenge associated with repeat HM, POEM might become the preferred technique in this patient population. Further studies with longer follow up are needed. © 2017 Japan Gastroenterological Endoscopy Society.

  5. Improvement of Railroad Roller Bearing Test Procedures & Development of Roller Bearing Diagnostic Techniques. Volume 2.

    DOT National Transportation Integrated Search

    1982-04-01

    A comprehensive review of existing basic diagnostic techniques applicable to the railcar roller bearing defect and failure problem was made. Of the potentially feasible diagnostic techniques identified, high frequency vibration was selected for exper...

  6. Some aspects of robotics calibration, design and control

    NASA Technical Reports Server (NTRS)

    Tawfik, Hazem

    1990-01-01

    The main objective is to introduce techniques in the areas of testing and calibration, design, and control of robotic systems. A statistical technique is described that analyzes a robot's performance and provides quantitative three-dimensional evaluation of its repeatability, accuracy, and linearity. Based on this analysis, a corrective action should be taken to compensate for any existing errors and enhance the robot's overall accuracy and performance. A comparison between robotics simulation software packages that were commercially available (SILMA, IGRIP) and that of Kennedy Space Center (ROBSIM) is also included. These computer codes simulate the kinematics and dynamics patterns of various robot arm geometries to help the design engineer in sizing and building the robot manipulator and control system. A brief discussion on an adaptive control algorithm is provided.

  7. An interactive user-friendly approach to surface-fitting three-dimensional geometries

    NASA Technical Reports Server (NTRS)

    Cheatwood, F. Mcneil; Dejarnette, Fred R.

    1988-01-01

    A surface-fitting technique has been developed which addresses two problems with existing geometry packages: computer storage requirements and the time required of the user for the initial setup of the geometry model. Coordinates of cross sections are fit using segments of general conic sections. The next step is to blend the cross-sectional curve-fits in the longitudinal direction using general conics to fit specific meridional half-planes. Provisions are made to allow the fitting of fuselages and wings so that entire wing-body combinations may be modeled. This report includes the development of the technique along with a User's Guide for the various menus within the program. Results for the modeling of the Space Shuttle and a proposed Aeroassist Flight Experiment geometry are presented.

  8. Modelling of high-frequency structure-borne sound transmission on FEM grids using the Discrete Flow Mapping technique

    NASA Astrophysics Data System (ADS)

    Hartmann, Timo; Tanner, Gregor; Xie, Gang; Chappell, David; Bajars, Janis

    2016-09-01

Dynamical Energy Analysis (DEA) combined with the Discrete Flow Mapping technique (DFM) has recently been introduced as a mesh-based high-frequency method for modelling structure-borne sound in complex built-up structures. This has proven to enhance vibro-acoustic simulations considerably by making it possible to work directly on existing finite element meshes, circumventing time-consuming and costly re-modelling strategies. In addition, DFM provides detailed spatial information about the vibrational energy distribution within a complex structure in the mid-to-high frequency range. We will present here progress in the development of the DEA method towards handling complex FEM meshes including Rigid Body Elements. In addition, structure-borne transmission paths due to spot welds are considered. We will present applications for a car floor structure.

  9. An Experiment in Scientific Program Understanding

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.; Owen, Karl (Technical Monitor)

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. Results are shown for three intensively studied codes and seven blind test cases; all test cases are state of the art scientific codes. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.

  10. Thin Film Physical Sensor Instrumentation Research and Development at NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Wrbanek, John D.; Fralick, Gustave C.

    2006-01-01

A range of thin film sensor technology has been demonstrated, enabling measurement of multiple parameters either individually or in sensor arrays, including temperature, strain, heat flux, and flow. Multiple techniques exist for refractory thin film fabrication, for fabrication and integration on complex surfaces, and for multilayered thin film insulation. Leveraging expertise in thin films and high temperature materials, investigations into the application of thin film ceramic sensors have begun. The current challenges of instrumentation technology are to further develop systems packaging and component testing of specialized sensors, further develop instrumentation techniques on complex surfaces, improve sensor durability, and address needs for extreme temperature applications. The technology research and development ongoing at NASA Glenn for applications to future launch vehicles, space vehicles, and ground systems is outlined.

  11. Monitoring genetic damage to ecosystems from hazardous waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, S.L.

    1992-03-01

Applications of ecological toxicity testing to hazardous waste management have increased dramatically over the last few years, resulting in a greater awareness of the need for improved biomonitoring techniques. Our laboratory is developing advanced techniques to assess the genotoxic effects of environmental contamination on ecosystems. We have developed a novel mutagenesis assay using the nematode Caenorhabditis elegans, which is potentially applicable for multimedia studies in soil, sediment, and water. In addition, we are conducting validation studies of a previously developed anaphase aberration test that utilizes sea urchin embryos. Other related efforts include field validation studies of the new tests, evaluation of their potential ecological relevance, and analysis of their sensitivity relative to that of existing toxicity tests that assess only lethal effects, rather than genetic damage.

  12. A study of actions in operative notes.

    PubMed

    Wang, Yan; Pakhomov, Serguei; Burkart, Nora E; Ryan, James O; Melton, Genevieve B

    2012-01-01

    Operative notes contain rich information about techniques, instruments, and materials used in procedures. To assist development of effective information extraction (IE) techniques for operative notes, we investigated the sublanguage used to describe actions within the operative report 'procedure description' section. Deep parsing results of 362,310 operative notes with an expanded Stanford parser using the SPECIALIST Lexicon resulted in 200 verbs (92% coverage) including 147 action verbs. Nominal action predicates for each action verb were gathered from WordNet, SPECIALIST Lexicon, New Oxford American Dictionary and Stedman's Medical Dictionary. Coverage gaps were seen in existing lexical, domain, and semantic resources (Unified Medical Language System (UMLS) Metathesaurus, SPECIALIST Lexicon, WordNet and FrameNet). Our findings demonstrate the need to construct surgical domain-specific semantic resources for IE from operative notes.

  13. Radar-cross-section reduction of wind turbines. part 1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brock, Billy C.; Loui, Hung; McDonald, Jacob J.

    2012-03-05

In recent years, increasing deployment of large wind-turbine farms has become an issue of growing concern for the radar community. The large radar cross section (RCS) presented by wind turbines interferes with radar operation, and the Doppler shift caused by blade rotation causes problems identifying and tracking moving targets. Each new wind-turbine farm installation must be carefully evaluated for potential disruption of radar operation for air defense, air traffic control, weather sensing, and other applications. Several approaches currently exist to minimize conflict between wind-turbine farms and radar installations, including procedural adjustments, radar upgrades, and proper choice of low-impact wind-farm sites, but each has problems with limited effectiveness or prohibitive cost. An alternative approach, heretofore not technically feasible, is to reduce the RCS of wind turbines to the extent that they can be installed near existing radar installations. This report summarizes efforts to reduce wind-turbine RCS, with a particular emphasis on the blades. The report begins with a survey of the wind-turbine RCS-reduction literature to establish a baseline for comparison. The following topics are then addressed: electromagnetic model development and validation, novel material development, integration into wind-turbine fabrication processes, integrated-absorber design, and wind-turbine RCS modeling. Related topics of interest, including alternative mitigation techniques (procedural, at-the-radar, etc.), an introduction to RCS and electromagnetic scattering, and RCS-reduction modeling techniques, can be found in a previous report.

  14. [A review of water and carbon flux partitioning and coupling in SPAC using stable isotope techniques].

    PubMed

    Xu, Xiao Wu; Yu, Xin Xiao; Jia, Guo Dong; Li, Han Zhi; Lu, Wei Wei; Liu, Zi Qiang

    2017-07-18

Soil-vegetation-atmosphere continuum (SPAC) is one of the important research objects in the fields of terrestrial hydrology, ecology and global change. The processes of water and carbon cycling, and their coupling mechanism, are frontier issues. With their tracing, integrative and indicative characteristics, stable isotope techniques contribute to the estimation of the relationship between carbon sequestration and water consumption in ecosystems. In this review, after a brief introduction to stable isotope principles and techniques, the applications of optical stable isotope techniques to water and carbon exchange in SPAC are explained, including: partitioning of net carbon exchange into photosynthesis and respiration; partitioning of evapotranspiration into transpiration and evaporation; and coupling of the water and carbon cycles at the ecosystem scale. Advanced techniques and methods provide long-term and high-frequency measurements of isotope signals at the ecosystem scale, but issues remain concerning measurement precision and accuracy, partitioning of ecosystem respiration, model adaptability under non-steady state, scaling up, and the coupling mechanism of the water and carbon cycles. The main existing research findings, limitations and future research prospects are discussed, which might help new research and technology development in the field of stable isotope ecology.
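The evapotranspiration-partitioning application mentioned above commonly rests on a two-end-member isotope mixing model: given the isotopic signatures of evapotranspiration (δ_ET), transpiration (δ_T) and evaporation (δ_E), the transpiration fraction is (δ_ET − δ_E)/(δ_T − δ_E). A minimal sketch with made-up δ18O values:

```python
def transpiration_fraction(delta_et, delta_t, delta_e):
    """Two-end-member mixing: fraction of evapotranspiration (ET)
    attributable to plant transpiration (T), given isotopic
    signatures (e.g. delta-18O in permil) of ET, T and evaporation."""
    return (delta_et - delta_e) / (delta_t - delta_e)

# illustrative (made-up) delta-18O values in permil
f_t = transpiration_fraction(delta_et=-8.0, delta_t=-5.0, delta_e=-20.0)
print(round(f_t, 2))  # -> 0.8
```

In practice δ_ET is typically obtained from a Keeling-plot intercept, and the end members δ_T and δ_E from plant water and soil water measurements with an evaporation fractionation model.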

  15. Identifying configurations of behavior change techniques in effective medication adherence interventions: a qualitative comparative analysis.

    PubMed

    Kahwati, Leila; Viswanathan, Meera; Golin, Carol E; Kane, Heather; Lewis, Megan; Jacobs, Sara

    2016-05-04

Interventions to improve medication adherence are diverse and complex. Consequently, synthesizing this evidence is challenging. We aimed to extend the results from an existing systematic review of interventions to improve medication adherence by using qualitative comparative analysis (QCA) to identify necessary or sufficient configurations of behavior change techniques among effective interventions. We used data from 60 studies in a completed systematic review to examine the combinations of nine behavior change techniques (increasing knowledge, increasing awareness, changing attitude, increasing self-efficacy, increasing intention formation, increasing action control, facilitation, increasing maintenance support, and motivational interviewing) among studies demonstrating improvements in adherence. Among the 60 studies, 34 demonstrated improved medication adherence. Among effective studies, increasing patient knowledge was a necessary but not sufficient technique. We identified seven configurations of behavior change techniques sufficient for improving adherence, which together accounted for 26 (76%) of the effective studies. The intervention configuration that included increasing knowledge and self-efficacy was the most empirically relevant, accounting for 17 studies (50%) and uniquely accounting for 15 (44%). This analysis extends the completed review findings by identifying multiple combinations of behavior change techniques that improve adherence. Our findings offer direction for policy makers, practitioners, and future comparative effectiveness research on improving adherence.
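The necessary/sufficient logic of crisp-set QCA can be sketched directly: a technique is necessary if every effective study used it, and a configuration is sufficient if every study containing it was effective. This is a simplification (real QCA applications use truth tables and consistency/coverage thresholds), and the data below are invented:

```python
def necessary(technique, studies):
    # necessary: every effective study used the technique
    return all(technique in s['techniques']
               for s in studies if s['effective'])

def sufficient(config, studies):
    # sufficient: every study containing the configuration was effective
    covered = [s for s in studies if config <= s['techniques']]
    return bool(covered) and all(s['effective'] for s in covered)

studies = [  # invented toy data
    {'techniques': {'knowledge', 'self-efficacy'}, 'effective': True},
    {'techniques': {'knowledge', 'awareness'},     'effective': True},
    {'techniques': {'awareness'},                  'effective': False},
]
print(necessary('knowledge', studies))                      # -> True
print(sufficient({'knowledge', 'self-efficacy'}, studies))  # -> True
print(sufficient({'awareness'}, studies))                   # -> False
```

Note that `knowledge` here is necessary but not sufficient on its own (it never appears without a companion technique), mirroring the review's finding about increasing patient knowledge.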

  16. Current limitations and recommendations to improve testing ...

    EPA Pesticide Factsheets

In this paper existing regulatory frameworks and test systems for assessing potential endocrine-active chemicals are described, and associated challenges discussed, along with proposed approaches to address these challenges. Regulatory frameworks vary somewhat across organizations, but all basically evaluate whether a chemical possesses endocrine activity and whether this activity can result in adverse outcomes either to humans or the environment. Current test systems include in silico, in vitro and in vivo techniques focused on detecting potential endocrine activity, and in vivo tests that collect apical data to detect possible adverse effects. These test systems are currently designed to robustly assess endocrine activity and/or adverse effects in the estrogen, androgen, and thyroid hormonal pathways; however, there are some limitations of current test systems for evaluating endocrine hazard and risk. These limitations include a lack of certainty regarding: 1) adequately sensitive species and life-stages, 2) mechanistic endpoints that are diagnostic for endocrine pathways of concern, and 3) the linkage between mechanistic responses and apical, adverse outcomes. Furthermore, some existing test methods are resource intensive in regard to time, cost, and use of animals. However, based on recent experiences, there are opportunities to improve approaches to, and guidance for, existing test methods, and to reduce uncertainty. For example, in vitro high throughput

  17. Draft SEI Program Plans: 1995-1999

    DTIC Science & Technology

    1994-08-01

    risk management because we believe that (a) structured techniques, even quite simple ones, can be effective in identifying and quantifying risk; and (b...belief that (1) structured techniques, even quite simple ones, could be effective in identifying and quantifying risk; and (2) techniques existed to

  18. Wilderness campsite monitoring methods: a sourcebook

    Treesearch

    David N. Cole

    1989-01-01

    Summarizes information on techniques available for monitoring the condition of campsites, particularly those in wilderness. A variety of techniques are described and evaluated; sources of information are also listed. Problems with existing monitoring systems and places where refinement of technique is required are highlighted.

  19. Designing and Undertaking a Health Economics Study of Digital Health Interventions.

    PubMed

    McNamee, Paul; Murray, Elizabeth; Kelly, Michael P; Bojke, Laura; Chilcott, Jim; Fischer, Alastair; West, Robert; Yardley, Lucy

    2016-11-01

    This paper introduces and discusses key issues in the economic evaluation of digital health interventions. The purpose is to stimulate debate so that existing economic techniques may be refined or new methods developed. The paper does not seek to provide definitive guidance on appropriate methods of economic analysis for digital health interventions. This paper describes existing guides and analytic frameworks that have been suggested for the economic evaluation of healthcare interventions. Using selected examples of digital health interventions, it assesses how well existing guides and frameworks align to digital health interventions. It shows that digital health interventions may be best characterized as complex interventions in complex systems. Key features of complexity relate to intervention complexity, outcome complexity, and causal pathway complexity, with much of this driven by iterative intervention development over time and uncertainty regarding likely reach of the interventions among the relevant population. These characteristics imply that more-complex methods of economic evaluation are likely to be better able to capture fully the impact of the intervention on costs and benefits over the appropriate time horizon. This complexity includes wider measurement of costs and benefits, and a modeling framework that is able to capture dynamic interactions among the intervention, the population of interest, and the environment. The authors recommend that future research should develop and apply more-flexible modeling techniques to allow better prediction of the interdependency between interventions and important environmental influences. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  20. On static solutions of the Einstein-Scalar Field equations

    NASA Astrophysics Data System (ADS)

    Reiris, Martín

    2017-03-01

    In this article we study self-gravitating static solutions of the Einstein-Scalar Field system in arbitrary dimensions. We discuss the existence of geodesically complete solutions depending on the form of the scalar field potential V(φ), and provide full global geometric estimates when the solutions exist. The most complete results are obtained for the physically important Klein-Gordon field and are summarised as follows. When V(φ) = m²|φ|², it is proved that geodesically complete solutions have Ricci-flat spatial metric, have constant lapse and are vacuum (that is, φ is constant and equal to zero if m ≠ 0). In particular, when the spatial dimension is three, the only such solutions are either Minkowski or a quotient thereof (no nontrivial solutions exist). When V(φ) = m²|φ|² + 2Λ, that is, when a vacuum energy or a cosmological constant is included, it is proved that no geodesically complete solution exists when Λ > 0, whereas when Λ < 0 it is proved that no non-vacuum geodesically complete solution exists unless m² < −2Λ/(n−1), where n is the spatial dimension, and the spatial manifold is non-compact. The proofs are based on novel techniques in comparison geometry à la Bakry-Émery that have their own interest.

  1. Current limitations and recommendations to improve testing for the environmental assessment of endocrine active substances

    USGS Publications Warehouse

    Coady, Katherine K.; Biever, Ronald C.; Denslow, Nancy D.; Gross, Melanie; Guiney, Patrick D.; Holbech, Henrik; Karouna-Renier, Natalie K.; Katsiadaki, Ioanna; Krueger, Hank; Levine, Steven L.; Maack, Gerd; Williams, Mike; Wolf, Jeffrey C.; Ankley, Gerald T.

    2017-01-01

    In the present study, existing regulatory frameworks and test systems for assessing potential endocrine active chemicals are described, and associated challenges are discussed, along with proposed approaches to address these challenges. Regulatory frameworks vary somewhat across geographies, but all basically evaluate whether a chemical possesses endocrine activity and whether this activity can result in adverse outcomes either to humans or to the environment. Current test systems include in silico, in vitro, and in vivo techniques focused on detecting potential endocrine activity, and in vivo tests that collect apical data to detect possible adverse effects. These test systems are currently designed to robustly assess endocrine activity and/or adverse effects in the estrogen, androgen, and thyroid hormone signaling pathways; however, there are some limitations of current test systems for evaluating endocrine hazard and risk. These limitations include a lack of certainty regarding: 1) adequately sensitive species and life stages; 2) mechanistic endpoints that are diagnostic for endocrine pathways of concern; and 3) the linkage between mechanistic responses and apical, adverse outcomes. Furthermore, some existing test methods are resource intensive with regard to time, cost, and use of animals. However, based on recent experiences, there are opportunities to improve approaches to and guidance for existing test methods and to reduce uncertainty. For example, in vitro high-throughput screening could be used to prioritize chemicals for testing and provide insights as to the most appropriate assays for characterizing hazard and risk. 
Other recommendations include adding endpoints for elucidating connections between mechanistic effects and adverse outcomes, identifying potentially sensitive taxa for which test methods currently do not exist, and addressing key endocrine pathways of possible concern in addition to those associated with estrogen, androgen, and thyroid signaling. 

  2. Techniques for assessing water resource potentials in the developing countries: with emphasis on streamflow, erosion and sediment transport, water movement in unsaturated soils, ground water, and remote sensing in hydrologic applications

    USGS Publications Warehouse

    Taylor, George C.

    1971-01-01

    Hydrologic instrumentation and methodology for assessing water-resource potentials have originated largely in the developed countries of the temperate zone. The developing countries lie largely in the tropic zone, which contains the full gamut of the earth's climatic environments, including most of those of the temperate zone. For this reason, most hydrologic techniques have world-wide applicability. Techniques for assessing water-resource potentials for the high priority goals of economic growth are well established in the developing countries--but much more so in some than in others. Conventional techniques for measurement and evaluation of basic hydrologic parameters are now well-understood in the developing countries and are generally adequate for their current needs and those of the immediate future. Institutional and economic constraints, however, inhibit growth of sustained programs of hydrologic data collection and application of the data to problems in engineering technology. Computer-based technology, including processing of hydrologic data and mathematical modelling of hydrologic parameters, is also well-begun in many developing countries and has much wider potential application. In some developing countries, however, there is a tendency to look on the computer as a panacea for deficiencies in basic hydrologic data collection programs. This fallacy must be discouraged, as the computer is a tool and not a "magic box." There is no real substitute for sound programs of basic data collection. Nuclear and isotopic techniques are being used increasingly in the developed countries in the measurement and evaluation of virtually all hydrologic parameters in which conventional techniques have been used traditionally. Even in the developed countries, however, many hydrologists are not using nuclear techniques, simply because they lack knowledge of the principles involved and of the potential benefits. 
Nuclear methodology in hydrologic applications is generally more complex than the conventional and hence requires a high level of technical expertise for effective use. Application of nuclear techniques to hydrologic problems in the developing countries is likely to be marginal for some years to come, owing to the higher costs involved and expertise required. Nuclear techniques, however, would seem to have particular promise in studies of water movement in unsaturated soils and of erosion and sedimentation, where conventional techniques are inadequate, inefficient and in some cases costly. Remote sensing offers great promise for synoptic evaluations of water resources and hydrologic processes, including the transient phenomena of the hydrologic cycle. Remote sensing is not, however, a panacea for deficiencies in hydrologic data programs in the developing countries. Rather it is a means for extending and augmenting on-the-ground observations and surveys (ground truth) to evaluate water resources and hydrologic processes on a regional or even continental scale. With respect to economic growth goals in developing countries, there are few identifiable gaps in existing hydrologic instrumentation and methodology insofar as appraisal, development and management of available water resources are concerned. What is needed is acceleration of institutional development and professional motivation toward more effective use of existing and proven methodology. Moreover, much sophisticated methodology can be applied effectively in the developing countries only when adequate levels of indigenous scientific skills have been reached and supportive institutional frameworks are evolved to viability.

  3. Agent independent task planning

    NASA Technical Reports Server (NTRS)

    Davis, William S.

    1990-01-01

    Agent-Independent Planning is a technique that allows the construction of activity plans without regard to the agent that will perform them. Once generated, a plan is then validated and translated into instructions for a particular agent, whether a robot, crewmember, or software-based control system. Because Space Station Freedom (SSF) is planned for orbital operations for approximately thirty years, it will almost certainly experience numerous enhancements and upgrades, including upgrades in robotic manipulators. Agent-Independent Planning provides the capability to construct plans for SSF operations, independent of specific robotic systems, by combining techniques of object oriented modeling, nonlinear planning and temporal logic. Since a plan is validated using the physical and functional models of a particular agent, new robotic systems can be developed and integrated with existing operations in a robust manner. This technique also provides the capability to generate plans for crewmembers with varying skill levels, and later apply these same plans to more sophisticated robotic manipulators made available by evolutions in technology.

  4. Portable Infrared Laser Spectroscopy for On-site Mycotoxin Analysis.

    PubMed

    Sieger, Markus; Kos, Gregor; Sulyok, Michael; Godejohann, Matthias; Krska, Rudolf; Mizaikoff, Boris

    2017-03-09

    Mycotoxins are toxic secondary metabolites of fungi that spoil food, and severely impact human health (e.g., causing cancer). Therefore, the rapid determination of mycotoxin contamination including deoxynivalenol and aflatoxin B1 in food and feed samples is of prime interest for commodity importers and processors. While chromatography-based techniques are well established in laboratory environments, only very few (i.e., mostly immunochemical) techniques exist enabling direct on-site analysis for traders and manufacturers. In this study, we present MYCOSPEC - an innovative approach for spectroscopic mycotoxin contamination analysis at EU regulatory limits for the first time utilizing mid-infrared tunable quantum cascade laser (QCL) spectroscopy. This analysis technique facilitates on-site mycotoxin analysis by combining QCL technology with GaAs/AlGaAs thin-film waveguides. Multivariate data mining strategies (i.e., principal component analysis) enabled the classification of deoxynivalenol-contaminated maize and wheat samples, and of aflatoxin B1 affected peanuts at EU regulatory limits of 1250 μg kg−1 and 8 μg kg−1, respectively.

  5. Portable Infrared Laser Spectroscopy for On-site Mycotoxin Analysis

    PubMed Central

    Sieger, Markus; Kos, Gregor; Sulyok, Michael; Godejohann, Matthias; Krska, Rudolf; Mizaikoff, Boris

    2017-01-01

    Mycotoxins are toxic secondary metabolites of fungi that spoil food, and severely impact human health (e.g., causing cancer). Therefore, the rapid determination of mycotoxin contamination including deoxynivalenol and aflatoxin B1 in food and feed samples is of prime interest for commodity importers and processors. While chromatography-based techniques are well established in laboratory environments, only very few (i.e., mostly immunochemical) techniques exist enabling direct on-site analysis for traders and manufacturers. In this study, we present MYCOSPEC - an innovative approach for spectroscopic mycotoxin contamination analysis at EU regulatory limits for the first time utilizing mid-infrared tunable quantum cascade laser (QCL) spectroscopy. This analysis technique facilitates on-site mycotoxin analysis by combining QCL technology with GaAs/AlGaAs thin-film waveguides. Multivariate data mining strategies (i.e., principal component analysis) enabled the classification of deoxynivalenol-contaminated maize and wheat samples, and of aflatoxin B1 affected peanuts at EU regulatory limits of 1250 μg kg−1 and 8 μg kg−1, respectively. PMID:28276454

  6. A new coherent demodulation technique for land-mobile satellite communications

    NASA Technical Reports Server (NTRS)

    Yoshida, Shousei; Tomita, Hideho

    1990-01-01

    An advanced coherent demodulation technique is described for land mobile satellite (LMS) communications. The proposed technique features a combined narrow/wide band dual open loop carrier phase estimator, which effectively compensates for fast carrier phase fluctuations caused by fading at the cost of some increase in phase slip rate. The open loop structure also provides quick carrier and clock reacquisition after shadowing. Its bit error rate (BER) performance is superior to that of existing detection schemes, showing a BER of 1 × 10−2 at 6.3 dB Eb/No over the Rician channel with 10 dB C/M and 200 Hz (1/16 modulation rate) fading pitch fd for QPSK. The proposed scheme consists of a fast response carrier recovery and a quick bit timing recovery with an interpolation. An experimental terminal model was developed to evaluate its performance under fading conditions. The results are quite satisfactory, giving prospects for future LMS applications.

  7. Portable Infrared Laser Spectroscopy for On-site Mycotoxin Analysis

    NASA Astrophysics Data System (ADS)

    Sieger, Markus; Kos, Gregor; Sulyok, Michael; Godejohann, Matthias; Krska, Rudolf; Mizaikoff, Boris

    2017-03-01

    Mycotoxins are toxic secondary metabolites of fungi that spoil food, and severely impact human health (e.g., causing cancer). Therefore, the rapid determination of mycotoxin contamination including deoxynivalenol and aflatoxin B1 in food and feed samples is of prime interest for commodity importers and processors. While chromatography-based techniques are well established in laboratory environments, only very few (i.e., mostly immunochemical) techniques exist enabling direct on-site analysis for traders and manufacturers. In this study, we present MYCOSPEC - an innovative approach for spectroscopic mycotoxin contamination analysis at EU regulatory limits for the first time utilizing mid-infrared tunable quantum cascade laser (QCL) spectroscopy. This analysis technique facilitates on-site mycotoxin analysis by combining QCL technology with GaAs/AlGaAs thin-film waveguides. Multivariate data mining strategies (i.e., principal component analysis) enabled the classification of deoxynivalenol-contaminated maize and wheat samples, and of aflatoxin B1 affected peanuts at EU regulatory limits of 1250 μg kg-1 and 8 μg kg-1, respectively.
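
    The classification step described above rests on principal component analysis. As a sketch of the underlying computation (synthetic data standing in for the actual mid-infrared spectra), PCA scores can be obtained from the singular value decomposition of the mean-centered data matrix:

```python
import numpy as np

# PCA sketch for separating two classes of "spectra". The data are
# synthetic; the study's actual spectra and preprocessing are not
# reproduced here.

def pca_scores(X, n_components=2):
    """Project mean-centered rows of X onto the top principal components."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data: principal axes are the rows of Vt
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(0)
# Two synthetic classes ("blank" vs "contaminated"), 50 channels each
blank = rng.normal(0.0, 0.1, size=(20, 50))
contaminated = rng.normal(0.0, 0.1, size=(20, 50))
contaminated[:, 10] += 1.0  # an absorption-band-like offset in one channel

X = np.vstack([blank, contaminated])
scores = pca_scores(X, n_components=2)
# The dominant variance is the class offset, so the groups separate on PC1
print(scores[:20, 0].mean(), scores[20:, 0].mean())
```

    In the study itself, such score plots are what allow contaminated and uncontaminated samples to be distinguished at the regulatory limits.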

  8. High-sensitivity determination of Zn(II) and Cu(II) in vitro by fluorescence polarization

    NASA Astrophysics Data System (ADS)

    Thompson, Richard B.; Maliwal, Badri P.; Feliccia, Vincent; Fierke, Carol A.

    1998-04-01

    Recent work has suggested that free Cu(II) may play a role in syndromes such as Crohn's and Wilson's diseases, as well as being a pollutant toxic at low levels to shellfish and sheep. Similarly, Zn(II) has been implicated in some neural damage in the brain resulting from epilepsy and ischemia. Several high-sensitivity methods exist for determining these ions in solution, including GFAAS, ICP-MS, ICP-ES, and electrochemical techniques. However, these techniques are generally slow and costly, require pretreatment of the sample, require complex instruments and skilled personnel, and are incapable of imaging at the cellular and subcellular level. To address these shortcomings we developed fluorescence polarization (anisotropy) biosensing methods for these ions which are very sensitive, highly selective, require simple instrumentation and little pretreatment, and are inexpensive. Thus free Cu(II) or Zn(II) can be determined at picomolar levels by changes in fluorescence polarization, lifetime, or wavelength ratio using these methods; these techniques may be adapted to microscopy.
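
    For context, the polarization and anisotropy quantities underlying such measurements are standard ratios of the parallel and perpendicular emission intensities. The intensity values below are illustrative only:

```python
# Standard definitions relating the measured parallel (I_par) and
# perpendicular (I_perp) emission intensities to polarization (P)
# and anisotropy (r).

def polarization(i_par, i_perp):
    # P = (I_par - I_perp) / (I_par + I_perp)
    return (i_par - i_perp) / (i_par + i_perp)

def anisotropy(i_par, i_perp):
    # r uses the total intensity I_par + 2*I_perp in the denominator
    return (i_par - i_perp) / (i_par + 2 * i_perp)

# Example: a bound, slowly rotating fluorophore gives a high ratio
print(polarization(3.0, 1.0))  # 0.5
print(anisotropy(3.0, 1.0))    # 0.4
```

    Analyte binding changes the rotational mobility or lifetime of the labeled sensor, which shifts these ratios and provides the readout.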

  9. Conditional Random Field-Based Offline Map Matching for Indoor Environments

    PubMed Central

    Bataineh, Safaa; Bahillo, Alfonso; Díez, Luis Enrique; Onieva, Enrique; Bataineh, Ikram

    2016-01-01

    In this paper, we present an offline map matching technique designed for indoor localization systems based on conditional random fields (CRF). The proposed algorithm can refine the results of existing indoor localization systems and match them with the map, using loose coupling between the existing localization system and the proposed map matching technique. The purpose of this research is to investigate the efficiency of using the CRF technique in offline map matching problems for different scenarios and parameters. The algorithm was applied to several real and simulated trajectories of different lengths. The results were then refined and matched with the map using the CRF algorithm. PMID:27537892

  10. Conditional Random Field-Based Offline Map Matching for Indoor Environments.

    PubMed

    Bataineh, Safaa; Bahillo, Alfonso; Díez, Luis Enrique; Onieva, Enrique; Bataineh, Ikram

    2016-08-16

    In this paper, we present an offline map matching technique designed for indoor localization systems based on conditional random fields (CRF). The proposed algorithm can refine the results of existing indoor localization systems and match them with the map, using loose coupling between the existing localization system and the proposed map matching technique. The purpose of this research is to investigate the efficiency of using the CRF technique in offline map matching problems for different scenarios and parameters. The algorithm was applied to several real and simulated trajectories of different lengths. The results were then refined and matched with the map using the CRF algorithm.
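
    The general idea of CRF/HMM-style offline map matching can be sketched with Viterbi decoding over a small map graph: emission scores penalize the distance from each noisy position fix, and transitions are restricted to edges of the map. The toy graph and Gaussian emission model below are assumptions for illustration, not the paper's actual formulation:

```python
# Offline map-matching sketch: snap a noisy position trace to the nodes
# of a small indoor graph with Viterbi decoding (illustrative model only).

nodes = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (2.0, 0.0), 3: (1.0, 1.0)}
edges = {0: {0, 1}, 1: {0, 1, 2, 3}, 2: {1, 2}, 3: {1, 3}}  # allowed moves

def viterbi_match(trace, sigma=0.5):
    def emission(node, z):
        # log-Gaussian score: closer map nodes score higher
        dx = nodes[node][0] - z[0]
        dy = nodes[node][1] - z[1]
        return -(dx * dx + dy * dy) / (2 * sigma * sigma)

    # score[n] = best log-score of any path ending at node n
    score = {n: emission(n, trace[0]) for n in nodes}
    back = []
    for z in trace[1:]:
        prev = score
        score, ptr = {}, {}
        for n in nodes:
            # only transitions along map edges are allowed
            best = max((p for p in nodes if n in edges[p]), key=lambda p: prev[p])
            score[n] = prev[best] + emission(n, z)
            ptr[n] = best
        back.append(ptr)
    # backtrack from the best final node
    end = max(score, key=lambda n: score[n])
    path = [end]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

trace = [(0.1, -0.1), (0.9, 0.2), (2.1, 0.1)]
print(viterbi_match(trace))  # snapped path: [0, 1, 2]
```

    A CRF generalizes this by learning feature weights for the emission and transition terms rather than fixing them, but the offline decoding over the whole trajectory is the same dynamic program.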

  11. Coalbed methane: Clean energy for the world

    USGS Publications Warehouse

    Ahmed, A.-J.; Johnston, S.; Boyer, C.; Lambert, S.W.; Bustos, O.A.; Pashin, J.C.; Wray, A.

    2009-01-01

    Coalbed methane (CBM) has the potential to emerge as a significant clean energy resource. It also has the potential to replace other diminishing hydrocarbon reserves. The latest developments in technologies and methodologies are playing a key role in harnessing this unconventional resource. Some of these developments include adaptations of existing technologies used in conventional oil and gas operations, while others include new applications designed specifically to address coal's unique properties. Completion techniques have been developed that cause less damage to the production mechanisms of coal seams, such as those occurring during cementing operations. Stimulation fluids have also been engineered specifically to enhance CBM production. Deep coal deposits that remain inaccessible by conventional mining operations offer CBM development opportunities.

  12. Advances in Radiotherapy Management of Esophageal Cancer.

    PubMed

    Verma, Vivek; Moreno, Amy C; Lin, Steven H

    2016-10-21

    Radiation therapy (RT) as part of multidisciplinary oncologic care has been marked by profound advancements over the past decades. As part of multimodality therapy for esophageal cancer (EC), a prime goal of RT is to minimize not only treatment toxicities, but also postoperative complications and hospitalizations. Herein, discussion commences with the historical approaches to treating EC, including seminal trials supporting multimodality therapy. Subsequently, the impact of RT techniques, including three-dimensional conformal RT, intensity-modulated RT, and proton beam therapy, is examined through available data. We further discuss existing data and the potential for further development in the future, with an appraisal of the future outlook of technological advancements of RT for EC.

  13. Advances in Radiotherapy Management of Esophageal Cancer

    PubMed Central

    Verma, Vivek; Moreno, Amy C.; Lin, Steven H.

    2016-01-01

    Radiation therapy (RT) as part of multidisciplinary oncologic care has been marked by profound advancements over the past decades. As part of multimodality therapy for esophageal cancer (EC), a prime goal of RT is to minimize not only treatment toxicities, but also postoperative complications and hospitalizations. Herein, discussion commences with the historical approaches to treating EC, including seminal trials supporting multimodality therapy. Subsequently, the impact of RT techniques, including three-dimensional conformal RT, intensity-modulated RT, and proton beam therapy, is examined through available data. We further discuss existing data and the potential for further development in the future, with an appraisal of the future outlook of technological advancements of RT for EC. PMID:27775643

  14. Characterizing high-energy-density propellants for space propulsion applications

    NASA Astrophysics Data System (ADS)

    Kokan, Timothy

    There exists wide ranging research interest in high-energy-density matter (HEDM) propellants as a potential replacement for existing industry standard fuels for liquid rocket engines. The U.S. Air Force Research Laboratory, the U.S. Army Research Lab, the NASA Marshall Space Flight Center, and the NASA Glenn Research Center each either recently concluded or currently has ongoing programs in the synthesis and development of these potential new propellants. In order to perform conceptual designs using these new propellants, most conceptual rocket engine powerhead design tools (e.g. NPSS, ROCETS, and REDTOP-2) require several thermophysical properties of a given propellant over a wide range of temperature and pressure. These properties include enthalpy, entropy, density, viscosity, and thermal conductivity. Very little thermophysical property data exists for most of these potential new HEDM propellants. Experimental testing of these properties is both expensive and time consuming and is impractical in a conceptual vehicle design environment. A new technique for determining these thermophysical properties of potential new rocket engine propellants is presented. The technique uses a combination of three different computational methods to determine these properties. Quantum mechanics and molecular dynamics are used to model new propellants at a molecular level in order to calculate density, enthalpy, and entropy. Additivity methods are used to calculate the kinematic viscosity and thermal conductivity of new propellants. This new technique is validated via a series of verification experiments of HEDM compounds. Results are provided for two HEDM propellants: quadricyclane and 2-azido-N,N-dimethylethanamine (DMAZ). In each case, the new technique does a better job than the best current computational methods at accurately matching the experimental data of the HEDM compounds of interest. 
A case study is provided to help quantify the vehicle level impacts of using HEDM propellants. The case study consists of the National Aeronautics and Space Administration's (NASA) Exploration Systems Architecture Study (ESAS) Lunar Surface Access Module (LSAM). The results of this study show that the use of HEDM propellants instead of hypergolic propellants can lower the gross weight of the LSAM and may be an attractive alternative to the current baseline hypergolic propellant choice.
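
    The additivity methods mentioned above estimate a property as a sum of contributions from molecular fragments. The sketch below illustrates the bookkeeping only; the group names and contribution values are invented for illustration, not published group-contribution data:

```python
# Group-contribution ("additivity") sketch: estimate a molecular property
# as a sum of per-fragment contributions. GROUP_VALUES below are invented
# placeholders, not real published contributions.

GROUP_VALUES = {"CH3": 1.5, "CH2": 1.0, "N3": 2.7, "ring_strain": 3.2}

def estimate_property(groups):
    """groups: mapping of group name -> count of that group in the molecule."""
    return sum(GROUP_VALUES[g] * n for g, n in groups.items())

# A made-up molecule with two CH3 groups, three CH2 groups, and one azide
print(estimate_property({"CH3": 2, "CH2": 3, "N3": 1}))  # 2*1.5 + 3*1.0 + 2.7 = 8.7
```

    Published additivity schemes differ in which fragments they define and include correction terms (e.g., for ring strain), but the estimate is always this kind of weighted sum.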

  15. An Evaluation of Understandability of Patient Journey Models in Mental Health.

    PubMed

    Percival, Jennifer; McGregor, Carolyn

    2016-07-28

    There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information system and health information technology can benefit the patients, staff, and the delivery of care. This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Five business process modeling techniques were used to represent a selected patient journey. A mix of both qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. The preliminary evaluation of the usability of the 5 modeling techniques has shown increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased the engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches. 
Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers.

  16. Global Design Optimization for Aerodynamics and Rocket Propulsion Components

    NASA Technical Reports Server (NTRS)

    Shyy, Wei; Papila, Nilay; Vaidyanathan, Rajkumar; Tucker, Kevin; Turner, James E. (Technical Monitor)

    2000-01-01

    Modern computational and experimental tools for aerodynamics and propulsion applications have matured to a stage where they can provide substantial insight into engineering processes involving fluid flows, and can be fruitfully utilized to help improve the design of practical devices. In particular, rapid and continuous development in aerospace engineering demands that new design concepts be regularly proposed to meet goals for increased performance, robustness and safety while concurrently decreasing cost. To date, the majority of the effort in design optimization of fluid dynamics has relied on gradient-based search algorithms. Global optimization methods can utilize the information collected from various sources and by different tools. These methods offer multi-criterion optimization, handle the existence of multiple design points and trade-offs via insight into the entire design space, can easily perform tasks in parallel, and are often effective in filtering the noise intrinsic to numerical and experimental data. However, a successful application of the global optimization method needs to address issues related to data requirements with an increase in the number of design variables, and methods for predicting the model performance. In this article, we review recent progress made in establishing suitable global optimization techniques employing neural network and polynomial-based response surface methodologies. Issues addressed include techniques for construction of the response surface, design of experiment techniques for supplying information in an economical manner, optimization procedures and multi-level techniques, and assessment of relative performance between polynomials and neural networks. Examples drawn from wing aerodynamics, turbulent diffuser flows, gas-gas injectors, and supersonic turbines are employed to help demonstrate the issues involved in an engineering design context. 
Both the usefulness of the existing knowledge to aid current design practices and the need for future research are identified.
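
    A polynomial response surface of the kind reviewed above is a low-order model fit to sampled design points by least squares. The following sketch, with a synthetic two-variable quadratic standing in for the article's design cases, shows the basic construction:

```python
import numpy as np

# Quadratic response-surface sketch: fit
#   y ≈ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# by least squares to sampled design points. Synthetic, noise-free data;
# not any of the article's actual design cases.

def quadratic_design_matrix(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(30, 2))               # design of experiments
y = 2 + 0.5 * X[:, 0] - X[:, 1] + 3 * X[:, 0]**2   # "experimental" responses

coef, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
print(np.round(coef, 3))  # should recover ≈ [2, 0.5, -1, 3, 0, 0]
```

    With noisy data from real simulations or experiments the same fit smooths the noise, which is one of the filtering properties the article attributes to response surfaces; neural networks replace the fixed polynomial basis with a learned one.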

  17. Global existence of the three-dimensional viscous quantum magnetohydrodynamic model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Jianwei, E-mail: yangjianwei@ncwu.edu.cn; Ju, Qiangchang, E-mail: qiangchang-ju@yahoo.com

    2014-08-15

    The global-in-time existence of weak solutions to the viscous quantum magnetohydrodynamic equations in a three-dimensional torus with large data is proved. The proof uses the Faedo-Galerkin method and weak compactness techniques.

  18. Methods for assessing the quality of mammalian embryos: How far we are from the gold standard?

    PubMed

    Rocha, José C; Passalia, Felipe; Matos, Felipe D; Maserati, Marc P; Alves, Mayra F; Almeida, Tamie G de; Cardoso, Bruna L; Basso, Andrea C; Nogueira, Marcelo F G

    2016-08-01

    Morphological embryo classification is of great importance for many laboratory techniques, from basic research to those applied to assisted reproductive technology. However, the standard classification method for both human and cattle embryos is based on quality parameters that reflect the overall morphological quality of the embryo (in cattle) or the quality of the individual embryonic structures (more relevant in human embryo classification). This assessment method is biased by the subjectivity of the evaluator, and even though several guidelines exist to standardize the classification, it is not a method capable of giving reliable and trustworthy results. The latest approaches for the improvement of quality assessment include the use of data from cellular metabolism, a new morphological grading system, development kinetics and cleavage symmetry, embryo cell biopsy followed by pre-implantation genetic diagnosis, zona pellucida birefringence, ion release by the embryo cells, and so forth. Nowadays there exists a great need for evaluation methods that are practical and non-invasive while being accurate and objective. A method along these lines would be of great importance to embryo evaluation by embryologists, clinicians and other professionals who work with assisted reproductive technology. Several techniques show promising results in this sense, one being the use of digital images of the embryo as the basis for feature extraction and classification by means of artificial intelligence techniques (such as genetic algorithms and artificial neural networks). This process has the potential to become an accurate and objective standard for embryo quality assessment.

  19. Methods for assessing the quality of mammalian embryos: How far we are from the gold standard?

    PubMed Central

    Rocha, José C.; Passalia, Felipe; Matos, Felipe D.; Maserati Jr, Marc P.; Alves, Mayra F.; de Almeida, Tamie G.; Cardoso, Bruna L.; Basso, Andrea C.; Nogueira, Marcelo F. G.

    2016-01-01

    Morphological embryo classification is of great importance for many laboratory techniques, from basic research to those applied in assisted reproductive technology. However, the standard classification method for both human and cattle embryos is based on quality parameters that reflect either the overall morphological quality of the embryo (in cattle) or the quality of individual embryonic structures (more relevant in human embryo classification). This assessment method is biased by the subjectivity of the evaluator, and even though several guidelines exist to standardize the classification, it is not capable of giving reliable and trustworthy results. The latest approaches to improving quality assessment include the use of data from cellular metabolism, a new morphological grading system, developmental kinetics and cleavage symmetry, embryo cell biopsy followed by pre-implantation genetic diagnosis, zona pellucida birefringence, ion release by the embryo cells, and so forth. There is now a great need for evaluation methods that are practical and non-invasive while being accurate and objective. A method along these lines would be of great value to embryologists, clinicians and other professionals who work with assisted reproductive technology. Several techniques show promising results in this sense, one being the use of digital images of the embryo as the basis for feature extraction and classification by means of artificial intelligence techniques (such as genetic algorithms and artificial neural networks). This process has the potential to become an accurate and objective standard for embryo quality assessment. PMID:27584609

  20. Compression of Probabilistic XML Documents

    NASA Astrophysics Data System (ADS)

    Veldman, Irma; de Keijzer, Ander; van Keulen, Maurice

    Database techniques to store, query and manipulate data that contains uncertainty receive increasing research interest. Such UDBMSs can be classified according to their underlying data model: relational, XML, or RDF. We focus on uncertain XML DBMSs, with the Probabilistic XML model (PXML) of [10,9] as a representative example. The size of a PXML document is obviously a factor in performance. There are PXML-specific techniques to reduce the size, such as a push-down mechanism that produces equivalent but more compact PXML documents. It can only be applied, however, where possibilities are dependent. For normal XML documents there also exist several techniques for compressing a document. Since Probabilistic XML is (a special form of) normal XML, it might benefit from these methods even more. In this paper, we show that existing compression mechanisms can be combined with PXML-specific compression techniques. We also show that the best compression rates are obtained by combining a PXML-specific technique with a rather simple generic DAG-compression technique.
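    The generic DAG-compression mentioned above can be illustrated by hash-consing: identical subtrees are stored once and shared between parents, turning the tree into a directed acyclic graph. A minimal sketch (not the authors' implementation; the `(label, children)` tree representation and function name are assumptions for illustration):

```python
def dag_compress(tree):
    """Deduplicate identical subtrees by hash-consing.

    A tree is a (label, [children]) pair; returns (root_id, nodes), where
    nodes maps id -> (label, tuple_of_child_ids) and structurally identical
    subtrees share a single id.
    """
    nodes = {}     # id -> (label, tuple_of_child_ids)
    interned = {}  # canonical subtree key -> id

    def visit(node):
        label, children = node
        key = (label, tuple(visit(c) for c in children))
        if key not in interned:          # first time this subtree is seen
            new_id = len(nodes)
            interned[key] = new_id
            nodes[new_id] = key
        return interned[key]             # reuse the shared id otherwise

    return visit(tree), nodes
```

Repeated subtrees collapse to one stored node, which is where the compression comes from on documents with redundant fragments.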

  1. The Effects of Practice-Based Training on Graduate Teaching Assistants' Classroom Practices.

    PubMed

    Becker, Erin A; Easlon, Erin J; Potter, Sarah C; Guzman-Alvarez, Alberto; Spear, Jensen M; Facciotti, Marc T; Igo, Michele M; Singer, Mitchell; Pagliarulo, Christopher

    2017-01-01

    Evidence-based teaching is a highly complex skill, requiring repeated cycles of deliberate practice and feedback to master. Despite existing well-characterized frameworks for practice-based training in K-12 teacher education, the major principles of these frameworks have not yet been transferred to instructor development in higher educational contexts, including training of graduate teaching assistants (GTAs). We sought to determine whether a practice-based training program could help GTAs learn and use evidence-based teaching methods in their classrooms. We implemented a weekly training program for introductory biology GTAs that included structured drills of techniques selected to enhance student practice, logic development, and accountability and reduce apprehension. These elements were selected based on their previous characterization as dimensions of active learning. GTAs received regular performance feedback based on classroom observations. To quantify use of target techniques and levels of student participation, we collected and coded 160 h of video footage. We investigated the relationship between frequency of GTA implementation of target techniques and student exam scores; however, we observed no significant relationship. Although GTAs adopted and used many of the target techniques with high frequency, techniques that enforced student participation were not stably adopted, and their use was unresponsive to formal feedback. We also found that techniques discussed in training, but not practiced, were not used at quantifiable frequencies, further supporting the importance of practice-based training for influencing instructional practices. © 2017 E. A. Becker et al. CBE—Life Sciences Education © 2017 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). 
It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  2. Terahertz microfluidic sensing using a parallel-plate waveguide sensor.

    PubMed

    Astley, Victoria; Reichel, Kimberly; Mendis, Rajind; Mittleman, Daniel M

    2012-08-30

    Refractive index (RI) sensing is a powerful noninvasive and label-free sensing technique for the identification, detection and monitoring of microfluidic samples with a wide range of possible sensor designs such as interferometers and resonators. Most of the existing RI sensing applications focus on biological materials in aqueous solutions in visible and IR frequencies, such as DNA hybridization and genome sequencing. At terahertz frequencies, applications include quality control, monitoring of industrial processes and sensing and detection applications involving nonpolar materials. Several potential designs for refractive index sensors in the terahertz regime exist, including photonic crystal waveguides, asymmetric split-ring resonators, and photonic band gap structures integrated into parallel-plate waveguides. Many of these designs are based on optical resonators such as rings or cavities. The resonant frequencies of these structures are dependent on the refractive index of the material in or around the resonator. By monitoring the shifts in resonant frequency the refractive index of a sample can be accurately measured and this in turn can be used to identify a material, monitor contamination or dilution, etc. The sensor design we use here is based on a simple parallel-plate waveguide. A rectangular groove machined into one face acts as a resonant cavity (Figures 1 and 2). When terahertz radiation is coupled into the waveguide and propagates in the lowest-order transverse-electric (TE1) mode, the result is a single strong resonant feature with a tunable resonant frequency that is dependent on the geometry of the groove. This groove can be filled with nonpolar liquid microfluidic samples which cause a shift in the observed resonant frequency that depends on the amount of liquid in the groove and its refractive index. 
Our technique has an advantage over other terahertz techniques in its simplicity, both in fabrication and implementation, since the procedure can be accomplished with standard laboratory equipment without the need for a clean room or any special fabrication or experimental techniques. It can also be easily expanded to multichannel operation by the incorporation of multiple grooves. In this video we will describe our complete experimental procedure, from the design of the sensor to the data analysis and determination of the sample refractive index.
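    As a rough illustration of how a cavity resonance encodes refractive index, consider an idealized quarter-wave model in which the groove resonates when its depth equals a quarter wavelength in the filling medium. The real TE1-mode resonance depends on the full groove geometry, so the formula, function names, and numbers below are illustrative assumptions only:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def groove_resonance_ghz(depth_m, n=1.0):
    """Resonant frequency (GHz) of an idealized quarter-wave cavity of the
    given depth, filled with a medium of refractive index n."""
    return C / (4.0 * n * depth_m) / 1e9

def index_from_shift(f_empty_ghz, f_filled_ghz):
    """Under the same idealized model, the empty/filled frequency ratio
    gives the sample's refractive index directly (empty groove: n = 1)."""
    return f_empty_ghz / f_filled_ghz
```

A higher-index liquid lowers the resonance, so tracking the downshift of the resonant feature recovers n, which is the working principle the abstract describes.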

  3. Stretchable Kirigami Polyvinylidene Difluoride Thin Films for Energy Harvesting: Design, Analysis, and Performance

    NASA Astrophysics Data System (ADS)

    Hu, Nan; Chen, Dajing; Wang, Dong; Huang, Shicheng; Trase, Ian; Grover, Hannah M.; Yu, Xiaojiao; Zhang, John X. J.; Chen, Zi

    2018-02-01

    Kirigami, a modified form of origami which includes cutting, has been used to improve material stretchability and compliance. However, this technique is, so far, underexplored in patterning piezoelectric materials towards developing efficient and mechanically flexible thin-film energy generators. Motivated by existing kirigami-based applications, we introduce interdigitated cuts to polyvinylidene fluoride (PVDF) films to evaluate the effect on voltage generation and stretchability. Our results from theoretical analysis, numerical simulations, and experimental tests show that kirigami PVDF films exhibit an extended strain range while still maintaining significant voltage generation compared to films without cuts. Various cutting patterns are studied, and it is found that films with denser cuts have a larger voltage output. This kirigami design can enhance the properties of existing piezoelectric materials and help to integrate tunable PVDF generators into biomedical devices.

  4. Mobile text messaging solutions for obesity prevention

    NASA Astrophysics Data System (ADS)

    Akopian, David; Jayaram, Varun; Aaleswara, Lakshmipathi; Esfahanian, Moosa; Mojica, Cynthia; Parra-Medina, Deborah; Kaghyan, Sahak

    2011-02-01

    Cellular telephony has become a vivid example of the co-evolution of human society and information technology. This trend is also reflected in health care and health promotion projects, which include cell phones in the data collection and communication chain. While many successful projects have been realized, a review of phone-based data collection techniques reveals that the existing technologies do not completely address health promotion research needs. This paper presents approaches which close this gap by extending existing versatile platforms. The messaging systems are designed for health-promotion research to prevent obesity and obesity-related health disparities among low-income Latino adolescent girls. Messaging and polling mechanisms are used to communicate and automatically process response data for the target constituency. Preliminary survey data provide insight into phone availability and technology perception for the study group.

  5. Zebra Crossing Spotter: Automatic Population of Spatial Databases for Increased Safety of Blind Travelers

    PubMed Central

    Ahmetovic, Dragan; Manduchi, Roberto; Coughlan, James M.; Mascetti, Sergio

    2016-01-01

    In this paper we propose a computer vision-based technique that mines existing spatial image databases for discovery of zebra crosswalks in urban settings. Knowing the location of crosswalks is critical for a blind person planning a trip that includes street crossing. By augmenting existing spatial databases (such as Google Maps or OpenStreetMap) with this information, a blind traveler may make more informed routing decisions, resulting in greater safety during independent travel. Our algorithm first searches for zebra crosswalks in satellite images; all candidates thus found are validated against spatially registered Google Street View images. This cascaded approach enables fast and reliable discovery and localization of zebra crosswalks in large image datasets. While fully automatic, our algorithm could also be complemented by a final crowdsourcing validation stage for increased accuracy. PMID:26824080
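    The intensity signature a zebra crosswalk leaves in an overhead image is a run of regular light/dark bands. A toy check along a 1-D intensity profile (not the authors' cascaded satellite/Street View algorithm; the function names and thresholds are assumptions) conveys the idea behind the candidate-search stage:

```python
def stripe_band_count(profile, threshold=128):
    """Number of alternating light/dark bands along a 1-D intensity profile."""
    bands = 1
    light = profile[0] >= threshold
    for v in profile[1:]:
        s = v >= threshold
        if s != light:        # a light/dark transition starts a new band
            bands += 1
            light = s
    return bands

def looks_like_crosswalk(profile, threshold=128, min_bands=6):
    """Toy classifier: enough alternating bands suggests a zebra pattern."""
    return stripe_band_count(profile, threshold) >= min_bands
```

A real detector would additionally check band widths for regularity and validate candidates against a second image source, as the paper's cascade does.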

  6. A survey and taxonomy on energy efficient resource allocation techniques for cloud computing systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hameed, Abdul; Khoshkbarforoushha, Alireza; Ranjan, Rajiv

    In a cloud computing paradigm, energy efficient allocation of different virtualized ICT resources (servers, storage disks, networks, and the like) is a complex problem due to the presence of heterogeneous application workloads (e.g., content delivery networks, MapReduce, web applications) with contentious allocation requirements in terms of ICT resource capacities (e.g., network bandwidth, processing speed, response time). Several recent papers have tried to address the issue of improving energy efficiency in allocating cloud resources to applications, with varying degrees of success. However, to the best of our knowledge there is no published literature on this subject that clearly articulates the research problem and provides a research taxonomy for succinct classification of existing techniques. Hence, the main aim of this paper is to identify open challenges associated with energy efficient resource allocation. In this regard, the study first outlines the problem and the existing hardware- and software-based techniques available for this purpose. Furthermore, available techniques already presented in the literature are summarized based on the energy-efficient research dimension taxonomy. The advantages and disadvantages of the existing techniques are comprehensively analyzed against the proposed research dimension taxonomy, namely: resource adaption policy, objective function, allocation method, allocation operation, and interoperability.

  7. [The methods of assessment of health risk from exposure to radon and radon daughters].

    PubMed

    Demin, V F; Zhukovskiy, M V; Kiselev, S M

    2014-01-01

    A critical analysis of existing dose-effect relationship (RDE) models for the effect of radon exposure on human health has been performed. It is concluded that improving these models is both necessary and possible, and a new, improved version of the RDE has been developed. A technique for assessing the human health risk of radon exposure is described, including a method for estimating radon exposure doses, the improved RDE model, and a proper risk-assessment methodology. The methodology is proposed for use in the territory of Russia.

  8. Investigation of parabolic computational techniques for internal high-speed viscous flows

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Power, G. D.

    1985-01-01

    A feasibility study was conducted to assess the applicability of an existing parabolic analysis (ADD-Axisymmetric Diffuser Duct), developed previously for subsonic viscous internal flows, to mixed supersonic/subsonic flows with heat addition simulating a SCRAMJET combustor. A study was conducted with the ADD code modified to include additional convection effects in the normal momentum equation when supersonic expansion and compression waves were present. It is concluded from the present study that, for the class of problems where strong viscous/inviscid interactions are present, a global iteration procedure is required.

  9. Blade frequency program for nonuniform helicopter rotors, with automated frequency search

    NASA Technical Reports Server (NTRS)

    Sadler, S. G.

    1972-01-01

    A computer program for determining the natural frequencies and normal modes of a lumped parameter model of a rotating, twisted beam with nonuniform mass and elastic properties was developed. The program is used to solve the conditions existing in a helicopter rotor where the outboard end of the rotor has zero forces and moments. Three frequency search methods have been implemented, including an automatic search technique which allows the program to find up to the fifteen lowest natural frequencies without requiring input estimates of these frequencies.
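    An automated frequency search of this kind can be sketched generically: scan a characteristic function whose zeros are the natural frequencies for sign changes, then refine each bracketed root by bisection. This is a hedged illustration of the general pattern, not the program's actual algorithm; `char_fn` stands in for its frequency determinant:

```python
def find_natural_frequencies(char_fn, w_max, n_modes, step=0.01, tol=1e-9):
    """Scan char_fn over (0, w_max] for sign changes and bisect each
    bracket, returning up to n_modes roots in ascending order."""
    roots = []
    w_prev, f_prev = step, char_fn(step)
    w = w_prev + step
    while w <= w_max and len(roots) < n_modes:
        f = char_fn(w)
        if f_prev * f < 0:                     # sign change brackets a root
            a, b = w_prev, w
            while b - a > tol:                 # bisection refinement
                m = 0.5 * (a + b)
                if char_fn(a) * char_fn(m) <= 0:
                    b = m
                else:
                    a = m
            roots.append(0.5 * (a + b))
        w_prev, f_prev = w, f
        w += step
    return roots
```

No initial frequency estimates are needed, which mirrors the convenience the abstract highlights; the scan step must just be finer than the closest mode spacing.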

  10. Clogging of Manifolds with Evaporatively Frozen Propellants. Part 2; Analysis

    NASA Technical Reports Server (NTRS)

    Simmon, J. A.; Gift, R. D.; Spurlock, J. M.

    1966-01-01

    The mechanisms of evaporative freezing of leaking propellant and the creation of flow stoppages within injector manifolds are discussed. A quantitative analysis of the conditions for the accumulation of evaporatively frozen propellant, including the existence of minimum and maximum leak rates, is presented. Clogging of the injector manifolds of the Apollo SPS and the Gemini OAMS engines by the freezing of leaking propellant is predicted, and the seriousness of the consequences is discussed. Based on the analysis, a realistic evaluation of selected techniques to eliminate flow stoppages by frozen propellant is made.

  11. Expert systems tools for Hubble Space Telescope observation scheduling

    NASA Technical Reports Server (NTRS)

    Miller, Glenn; Rosenthal, Don; Cohen, William; Johnston, Mark

    1987-01-01

    The utility of expert systems techniques for the Hubble Space Telescope (HST) planning and scheduling is discussed and a plan for development of expert system tools which will augment the existing ground system is described. Additional capabilities provided by these tools will include graphics-oriented plan evaluation, long-range analysis of the observation pool, analysis of optimal scheduling time intervals, constructing sequences of spacecraft activities which minimize operational overhead, and optimization of linkages between observations. Initial prototyping of a scheduler used the Automated Reasoning Tool running on a LISP workstation.

  12. Linear programming computational experience with onyx

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atrek, E.

    1994-12-31

    ONYX is a linear programming software package based on an efficient variation of the gradient projection method. When fully configured, it is intended for application to industrial size problems. While the computational experience is limited at the time of this abstract, the technique is found to be robust and competitive with existing methodology in terms of both accuracy and speed. An overview of the approach is presented together with a description of program capabilities, followed by a discussion of up-to-date computational experience with the program. Conclusions include advantages of the approach and envisioned future developments.

  13. Diagnosis and management of acute complications in patients with colon cancer: bleeding, obstruction, and perforation

    PubMed Central

    Yang, Xue-Fei

    2014-01-01

    Among colorectal cancers, the incidence of colon cancer has increased markedly; it now exceeds that of rectal cancer, dramatically changing the long-standing epidemiological profile. The acute complications of colon cancer include bleeding, obstruction, and perforation, which are among the common acute abdominal surgical conditions. Rapid and accurate diagnosis of these acute complications is very important, and laparoscopic techniques can be applied in abdominal surgery for their management. PMID:25035661

  14. Meniscal tears, repairs and replacement: their relevance to osteoarthritis of the knee.

    PubMed

    McDermott, Ian

    2011-04-01

    The menisci of the knee are important load sharers and shock absorbers in the joint. Meniscal tears are common, and whenever possible meniscal tears should be surgically repaired. Meniscectomy leads to a significant increased risk of osteoarthritis, and various options now exist for replacing missing menisci, including the use of meniscal scaffolds or the replacement of the entire meniscus by meniscal allograft transplantation. The field of meniscal surgery continues to develop apace, and the future may lie in growing new menisci by tissue engineering techniques.

  15. Method of generating a surface mesh

    DOEpatents

    Shepherd, Jason F [Albuquerque, NM; Benzley, Steven [Provo, UT; Grover, Benjamin T [Tracy, CA

    2008-03-04

    A method and machine-readable medium provide a technique to generate and modify a quadrilateral finite element surface mesh using dual creation and modification. After generating a dual of a surface (mesh), a predetermined algorithm may be followed to generate and modify a surface mesh of quadrilateral elements. The predetermined algorithm may include the steps of generating two-dimensional cell regions in dual space, determining existing nodes in primal space, generating new nodes in the dual space, and connecting nodes to form the quadrilateral elements (faces) for the generated and modifiable surface mesh.

  16. Study of sandy soil grain-size distribution on its deformation properties

    NASA Astrophysics Data System (ADS)

    Antropova, L. B.; Gruzin, A. V.; Gildebrandt, M. I.; Malaya, L. D.; Nikulina, V. B.

    2018-04-01

    As a rule, the development of new oil and gas fields faces the challenge of providing construction sites with material and mineral resources, for example medium sand soil for the footings of the buildings and facilities of the technological infrastructure under construction. The solution appears to lie in rational use of existing environmental resources, soils included. The impact of a medium sand soil's grain-size distribution on its deformation properties was studied. Based on the investigations performed, a technique for controlling the deformation properties of sandy soil was developed.

  17. [Research advances on cortical functional and structural deficits of amblyopia].

    PubMed

    Wu, Y; Liu, L Q

    2017-05-11

    Previous studies have observed functional deficits in the primary visual cortex. With the development of functional magnetic resonance imaging and electrophysiological techniques, research on the striate, extra-striate and higher-order cortical deficits underlying amblyopia has reached a new stage. The neural mechanisms of amblyopia show that anomalous responses exist throughout the visual processing hierarchy, including both functional and structural abnormalities. This review summarizes current knowledge about the structural and functional deficits of brain regions associated with amblyopia. (Chin J Ophthalmol, 2017, 53: 392-395).

  18. Data::Downloader

    NASA Technical Reports Server (NTRS)

    Duggan, Brian

    2012-01-01

    Downloading and organizing large amounts of files is challenging, and often done using ad hoc methods. This software is capable of downloading and organizing files as an OpenSearch client. It can subscribe to RSS (Really Simple Syndication) feeds and Atom feeds containing arbitrary metadata, and maintains a local content addressable data store. It uses existing standards for obtaining the files, and uses efficient techniques for storing the files. Novel features include symbolic links to maintain a sane directory structure, checksums for validating file integrity during transfer and storage, and flexible use of server-provided metadata.
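    The combination of a checksum-keyed (content-addressable) store with human-readable symbolic links that this description mentions can be sketched in a few lines. This is Python rather than the tool's Perl, and the function name and layout are invented for illustration, not Data::Downloader's actual API:

```python
import hashlib
import os

def store_file(data: bytes, store_dir: str, link_path: str) -> str:
    """Store data under its SHA-256 digest and expose it via a symlink.

    Identical content is written only once; the symlink gives it a sane,
    human-readable location. Returns the hex digest."""
    digest = hashlib.sha256(data).hexdigest()
    blob = os.path.join(store_dir, digest)
    if not os.path.exists(blob):            # deduplicate identical content
        with open(blob, "wb") as f:
            f.write(data)
    parent = os.path.dirname(link_path)
    if parent:
        os.makedirs(parent, exist_ok=True)
    if os.path.islink(link_path) or os.path.exists(link_path):
        os.remove(link_path)
    os.symlink(blob, link_path)             # human-readable name -> blob
    with open(link_path, "rb") as f:        # integrity check after storage
        if hashlib.sha256(f.read()).hexdigest() != digest:
            raise IOError("checksum mismatch for " + link_path)
    return digest
```

Re-downloading the same file becomes a no-op at the storage layer, and the checksum verifies integrity across transfer and storage, as the abstract describes.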

  19. Behavior based safety. A different way of looking at an old problem.

    PubMed

    Haney, L; Anderson, M

    1999-09-01

    1. The occupational and environmental health nurse's role in behavioral safety initiatives can vary to include: serving as a leader, change agent, collaborator with safety professionals, consultant, team participant, educator, coach, and supporter of employees and management. 2. Behavior based safety and health initiatives add to existing knowledge and techniques for improving the health and safety of workers. 3. Behavior based safety relies on employee involvement and places a strong emphasis on observation, measurement, feedback, positive reinforcement, and evaluation. It focuses on identification of system improvements and prevention.

  20. Water quality studied in areas of unconventional oil and gas development, including areas where hydraulic fracturing techniques are used, in the United States

    USGS Publications Warehouse

    Susong, David D.; Gallegos, Tanya J.; Oelsner, Gretchen P.

    2012-01-01

    The U.S. Geological Survey (USGS) John Wesley Powell Center for Analysis and Synthesis is hosting an interdisciplinary working group of USGS scientists to conduct a temporal and spatial analysis of surface-water and groundwater quality in areas of unconventional oil and gas development. The analysis uses existing national and regional datasets to describe water quality, evaluate water-quality changes over time where there are sufficient data, and evaluate spatial and temporal data gaps.

  1. Structure identification methods for atomistic simulations of crystalline materials

    DOE PAGES

    Stukowski, Alexander

    2012-05-28

    Here, we discuss existing and new computational analysis techniques to classify local atomic arrangements in large-scale atomistic computer simulations of crystalline solids. This article includes a performance comparison of typical analysis algorithms such as common neighbor analysis (CNA), centrosymmetry analysis, bond angle analysis, bond order analysis and Voronoi analysis. In addition we propose a simple extension to the CNA method that makes it suitable for multi-phase systems. Finally, we introduce a new structure identification algorithm, the neighbor distance analysis, which is designed to identify atomic structure units in grain boundaries.
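    A much simpler relative of the techniques compared above is a plain coordination-number count within a distance cutoff; the sketch below (a free cluster with no periodic boundary images, which is an assumed simplification) shows the kind of per-atom neighbor classification these algorithms build on:

```python
def coordination_numbers(positions, cutoff):
    """Count neighbors within `cutoff` of each atom.

    positions: list of (x, y, z) tuples; no periodic images are considered,
    so this is a diagnostic sketch, far simpler than full CNA."""
    c2 = cutoff * cutoff
    counts = [0] * len(positions)
    for i in range(len(positions)):
        xi, yi, zi = positions[i]
        for j in range(i + 1, len(positions)):   # each pair examined once
            xj, yj, zj = positions[j]
            d2 = (xi - xj) ** 2 + (yi - yj) ** 2 + (zi - zj) ** 2
            if d2 <= c2:
                counts[i] += 1
                counts[j] += 1
    return counts
```

Methods like CNA go further by classifying the bond topology among common neighbors, but the cutoff-based neighbor list above is the shared starting point.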

  2. Periodicity computation of generalized mathematical biology problems involving delay differential equations.

    PubMed

    Jasim Mohammed, M; Ibrahim, Rabha W; Ahmad, M Z

    2017-03-01

    In this paper, we consider a low initial population model. Our aim is to study the periodicity computation of this model by using neutral differential equations, which appear in various fields of study, including biology. We generalize the neutral Rayleigh equation to third order by exploiting fractional calculus, in particular the Riemann-Liouville differential operator. We establish the existence and uniqueness of a periodic computational outcome. The technique depends on the continuation theorem of coincidence degree theory. In addition, an example is presented to demonstrate the finding.
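    For reference, the Riemann-Liouville fractional derivative invoked here is standard: for an order \(\alpha\) with \(n-1 < \alpha \le n\),

```latex
D^{\alpha} f(t) \;=\; \frac{1}{\Gamma(n-\alpha)}\,\frac{d^{n}}{dt^{n}}
\int_{0}^{t} (t-s)^{\,n-\alpha-1}\, f(s)\, ds ,
\qquad n-1 < \alpha \le n ,
```

so the third-order generalization described in the abstract plausibly corresponds to taking \(2 < \alpha \le 3\).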

  3. Autoregressive linear least square single scanning electron microscope image signal-to-noise ratio estimation.

    PubMed

    Sim, Kok Swee; NorHisham, Syafiq

    2016-11-01

    A technique based on a linear Least Squares Regression (LSR) model is applied to estimate the signal-to-noise ratio (SNR) of scanning electron microscope (SEM) images. In order to test the accuracy of this technique for SNR estimation, a number of SEM images are initially corrupted with white noise. The autocorrelation functions (ACF) of the original and the corrupted SEM images are formed to serve as the reference point for estimating the SNR value of the corrupted image. The LSR technique is then compared with three existing techniques: nearest neighbourhood, first-order interpolation, and the combination of both nearest neighbourhood and first-order interpolation. The actual and the estimated SNR values of all these techniques are then calculated for comparison. It is shown that the LSR technique attains the highest accuracy of the four, as the absolute difference between the actual and the estimated SNR value is relatively small. SCANNING 38:771-782, 2016. © 2016 Wiley Periodicals, Inc.
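    The underlying idea is that white noise contributes to the autocorrelation only at zero lag, so extrapolating the ACF from nonzero lags back to lag 0 by least squares separates signal power from noise power. The sketch below is a simplified 1-D version under that assumption; the paper's LSR method operates on 2-D SEM image ACFs, and this function name is invented for illustration:

```python
def estimate_snr(samples, fit_lags=4):
    """Estimate SNR by linearly extrapolating the ACF to zero lag.

    White noise adds power only at lag 0, so a least-squares line through
    lags 1..fit_lags, extrapolated back to lag 0, approximates the
    noise-free signal power; the excess at lag 0 is the noise power."""
    n = len(samples)
    mean = sum(samples) / n
    x = [v - mean for v in samples]

    def acf(k):
        # biased sample autocorrelation at lag k
        return sum(x[i] * x[i + k] for i in range(n - k)) / n

    r0 = acf(0)                                  # signal + noise power
    lags = list(range(1, fit_lags + 1))
    ys = [acf(k) for k in lags]
    # least-squares line y = a + b*k through the nonzero-lag points
    m = len(lags)
    sx, sy = sum(lags), sum(ys)
    sxx = sum(k * k for k in lags)
    sxy = sum(k * y for k, y in zip(lags, ys))
    b = (m * sxy - sx * sy) / (m * sxx - sx * sx)
    a = (sy - b * sx) / m                        # extrapolated ACF at lag 0
    return a / (r0 - a)                          # signal power / noise power
```

The nearest-neighbour and interpolation techniques the paper compares differ only in how they extrapolate the ACF to zero lag; the regression variant uses several lags rather than one or two.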

  4. Noncontaminating technique for making holes in existing process systems

    NASA Technical Reports Server (NTRS)

    Hecker, T. P.; Czapor, H. P.; Giordano, S. M.

    1972-01-01

    Technique is developed for making cleanly-contoured holes in assembled process systems without introducing chips or other contaminants into system. Technique uses portable equipment and does not require dismantling of system. Method was tested on Inconel, stainless steel, ASTMA-53, and Hastelloy X in all positions.

  5. Final scientific and technical report: New experiments to measure the neutrino mass scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Monreal, Benjamin

    In this work, we made material progress towards future measurements of the mass of the neutrino. The neutrino is a fundamental particle, first observed in the 1950s and subjected to particularly intense study over the past 20 years. It is now known to have some non-zero mass, but we are in the unusual situation of knowing the mass exists without knowing its value. The mass may be determined by precise measurements of certain radioactive decay distributions, particularly the beta decay of tritium. The KATRIN experiment is an international project which is nearing the beginning of a tritium measurement campaign using a large electrostatic spectrometer. This research included participation in KATRIN, including construction and delivery of a key calibration subsystem, the ``Rear Section''. To obtain sensitivity beyond KATRIN's, new techniques are required; this work therefore also included R&D on a new technique we call CRES (Cyclotron Resonance Electron Spectroscopy), which promises to enable even more sensitive tritium decay measurements. We successfully carried out CRES spectroscopy in a model system in 2014, an important step towards the design of a next-generation tritium experiment with new neutrino mass measurement abilities.

  6. Atomistic determination of flexoelectric properties of crystalline dielectrics

    NASA Astrophysics Data System (ADS)

    Maranganti, R.; Sharma, P.

    2009-08-01

    Upon application of a uniform strain, internal sublattice shifts within the unit cell of a noncentrosymmetric dielectric crystal result in the appearance of a net dipole moment: a phenomenon well known as piezoelectricity. A macroscopic strain gradient on the other hand can induce polarization in dielectrics of any crystal structure, even those which possess a centrosymmetric lattice. This phenomenon, called flexoelectricity, has both bulk and surface contributions: the strength of the bulk contribution can be characterized by means of a material property tensor called the bulk flexoelectric tensor. Several recent studies suggest that strain-gradient induced polarization may be responsible for a variety of interesting and anomalous electromechanical phenomena in materials including electromechanical coupling effects in nonuniformly strained nanostructures, “dead layer” effects in nanocapacitor systems, and “giant” piezoelectricity in perovskite nanostructures among others. In this work, adopting a lattice dynamics based microscopic approach we provide estimates of the flexoelectric tensor for certain cubic crystalline ionic salts, perovskite dielectrics, III-V and II-VI semiconductors. We compare our estimates with experimental/theoretical values wherever available and also revisit the validity of an existing empirical scaling relationship for the magnitude of flexoelectric coefficients in terms of material parameters. It is interesting to note that two independent groups report values of flexoelectric properties for perovskite dielectrics that are orders of magnitude apart: Cross and co-workers from Penn State have carried out experimental studies on a variety of materials including barium titanate while Catalan and co-workers from Cambridge used theoretical ab initio techniques as well as experimental techniques to study paraelectric strontium titanate as well as ferroelectric barium titanate and lead titanate. 
We find that, in the case of perovskite dielectrics, our estimates agree to an order of magnitude with the experimental and theoretical estimates for strontium titanate. For barium titanate however, while our estimates agree to an order of magnitude with existing ab initio calculations, there exists a large discrepancy with experimental estimates. The possible reasons for the observed deviations are discussed.
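    In the standard continuum description (index conventions vary between the groups cited), the induced polarization combines a piezoelectric term and a strain-gradient (flexoelectric) term:

```latex
P_{l} \;=\; d_{lij}\,\varepsilon_{ij}
\;+\; \mu_{lijk}\,\frac{\partial \varepsilon_{ij}}{\partial x_{k}} ,
```

where the piezoelectric tensor \(d\) vanishes for centrosymmetric crystals while the flexoelectric tensor \(\mu\) does not, which is why a strain gradient can polarize a dielectric of any crystal structure.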

  7. Techniques of monitoring blood glucose during pregnancy for women with pre-existing diabetes.

    PubMed

    Moy, Foong Ming; Ray, Amita; Buckley, Brian S; West, Helen M

    2017-06-11

    Self-monitoring of blood glucose (SMBG) is recommended as a key component of the management plan for diabetes therapy during pregnancy. No existing systematic reviews consider the benefits/effectiveness of various techniques of blood glucose monitoring on maternal and infant outcomes among pregnant women with pre-existing diabetes. The effectiveness of the various monitoring techniques is unclear. To compare techniques of blood glucose monitoring and their impact on maternal and infant outcomes among pregnant women with pre-existing diabetes. We searched the Cochrane Pregnancy and Childbirth Group's Trials Register (30 November 2016), searched reference lists of retrieved studies and contacted trial authors. Randomised controlled trials (RCTs) and quasi-RCTs comparing techniques of blood glucose monitoring, including SMBG, continuous glucose monitoring (CGM) or clinic monitoring, among pregnant women with pre-existing diabetes mellitus (type 1 or type 2) were included. Trials investigating timing and frequency of monitoring were also included. RCTs using a cluster-randomised design were eligible for inclusion but none were identified. Two review authors independently assessed study eligibility, extracted data and assessed the risk of bias of included studies. Data were checked for accuracy. The quality of the evidence was assessed using the GRADE approach. This review update includes a total of 10 trials (538 women: 468 with type 1 diabetes and 70 with type 2 diabetes). The trials took place in Europe and the USA. Five of the 10 included studies were at moderate risk of bias, four studies were at low to moderate risk of bias, and one study was at high risk of bias. The trials were too small to show differences in important outcomes such as macrosomia, preterm birth, miscarriage or death of baby. Almost all the reported GRADE outcomes were assessed as being very low-quality evidence.
This was due to design limitations in the studies, wide confidence intervals, small sample sizes, and few events. In addition, there was high heterogeneity for some outcomes. Various methods of glucose monitoring were compared in the trials. Neither pooled analyses nor individual trial analyses showed any clear advantages of one monitoring technique over another for primary and secondary outcomes. Many important outcomes were not reported.

1. Self-monitoring versus standard care (two studies, 43 women): there was no clear difference for caesarean section (risk ratio (RR) 0.78, 95% confidence interval (CI) 0.40 to 1.49; one study, 28 women) or glycaemic control (both very low-quality), and not enough evidence to assess perinatal mortality or the neonatal mortality and morbidity composite. Hypertensive disorders of pregnancy, large-for-gestational age, neurosensory disability, and preterm birth were not reported in either study.

2. Self-monitoring versus hospitalisation (one study, 100 women): there was no clear difference for hypertensive disorders of pregnancy (pre-eclampsia: RR 4.26, 95% CI 0.52 to 35.16; hypertension: RR 0.43, 95% CI 0.08 to 2.22; both very low-quality). There was no clear difference in caesarean section or preterm birth less than 37 weeks' gestation (both very low-quality), and the sample size was too small to assess perinatal mortality (very low-quality). Large-for-gestational age, mortality or morbidity composite, neurosensory disability and preterm birth less than 34 weeks were not reported.

3. Pre-prandial versus post-prandial glucose monitoring (one study, 61 women): there was no clear difference between groups for caesarean section (RR 1.45, 95% CI 0.92 to 2.28; very low-quality), large-for-gestational age (RR 1.16, 95% CI 0.73 to 1.85; very low-quality) or glycaemic control (very low-quality). The results for hypertensive disorders of pregnancy (pre-eclampsia) and perinatal mortality are not meaningful because these outcomes were too rare to show differences in a small sample (all very low-quality). The study did not report the outcomes mortality or morbidity composite, neurosensory disability or preterm birth.

4. Automated telemedicine monitoring versus conventional system (three studies, 84 women): there was no clear difference for caesarean section (RR 0.96, 95% CI 0.62 to 1.48; one study, 32 women; very low-quality), or for mortality or morbidity composite in the one study that reported these outcomes. There were no clear differences for glycaemic control (very low-quality). No studies reported hypertensive disorders of pregnancy, large-for-gestational age, perinatal mortality (stillbirth and neonatal mortality), neurosensory disability or preterm birth.

5. CGM versus intermittent monitoring (two studies, 225 women): there was no clear difference for pre-eclampsia (RR 1.37, 95% CI 0.52 to 3.59; low-quality), caesarean section (average RR 1.00, 95% CI 0.65 to 1.54; I² = 62%; very low-quality) or large-for-gestational age (average RR 0.89, 95% CI 0.41 to 1.92; I² = 82%; very low-quality). Glycaemic control indicated by mean maternal HbA1c was lower for women in the continuous monitoring group (mean difference (MD) -0.60%, 95% CI -0.91 to -0.29; one study, 71 women; moderate-quality). There was not enough evidence to assess perinatal mortality, and there were no clear differences for preterm birth less than 37 weeks' gestation (low-quality). Mortality or morbidity composite, neurosensory disability and preterm birth less than 34 weeks were not reported.

6. Constant CGM versus intermittent CGM (one study, 25 women): there was no clear difference between groups for caesarean section (RR 0.77, 95% CI 0.33 to 1.79; very low-quality), glycaemic control (mean blood glucose in the third trimester: MD -0.14 mmol/L, 95% CI -2.00 to 1.72; very low-quality) or preterm birth less than 37 weeks' gestation (RR 1.08, 95% CI 0.08 to 15.46; very low-quality). The other primary outcomes (hypertensive disorders of pregnancy, large-for-gestational age, perinatal mortality (stillbirth and neonatal mortality), mortality or morbidity composite, and neurosensory disability) and GRADE outcomes (preterm birth less than 34 weeks' gestation) were not reported.

This review found no evidence that any glucose monitoring technique is superior to any other among pregnant women with pre-existing type 1 or type 2 diabetes. The evidence base for the effectiveness of monitoring techniques is weak, and additional evidence from large, well-designed randomised trials is required to inform choices of glucose monitoring techniques.
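The risk ratios and confidence intervals quoted throughout this abstract come from standard 2×2 contingency-table arithmetic. A minimal sketch of the calculation (a Wald-type interval on the log scale; the counts in the usage example below are illustrative, not the actual trial data):

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of group A vs group B with a Wald 95% CI on the log scale."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of log(RR) for a 2x2 table
    se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Illustrative counts only: 7/14 events vs 9/14 events
rr, lo, hi = risk_ratio(7, 14, 9, 14)
```

A CI that straddles 1 (as in most comparisons above) is what the abstract reports as "no clear difference".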

  8. Comparison of baseline removal methods for laser-induced breakdown spectroscopy of geological samples

    NASA Astrophysics Data System (ADS)

    Dyar, M. Darby; Giguere, Stephen; Carey, CJ; Boucher, Thomas

    2016-12-01

    This project examines the causes, effects, and optimization of continuum removal in laser-induced breakdown spectroscopy (LIBS) to produce the best possible prediction accuracy of elemental composition in geological samples. We compare prediction accuracy resulting from several different techniques for baseline removal, including asymmetric least squares (ALS), adaptive iteratively reweighted penalized least squares (Air-PLS), fully automatic baseline correction (FABC), continuous wavelet transformation, median filtering, polynomial fitting, the iterative thresholding Dietrich method, convex hull/rubber band techniques, and a newly-developed technique for Custom baseline removal (BLR). We assess the predictive performance of these methods using partial least-squares analysis for 13 elements of geological interest, expressed as the weight percentages of SiO2, Al2O3, TiO2, FeO, MgO, CaO, Na2O, K2O, and the parts per million concentrations of Ni, Cr, Zn, Mn, and Co. We find that previously published methods for baseline subtraction generally produce equivalent prediction accuracies for major elements. When those pre-existing methods are used, automated optimization of their adjustable parameters is always necessary to wring the best predictive accuracy out of a data set; ideally, it should be done for each individual variable. The new technique of Custom BLR produces significant improvements in prediction accuracy over existing methods across varying geological data sets, instruments, and varying analytical conditions. These results also demonstrate the dual objectives of the continuum removal problem: removing a smooth underlying signal to fit individual peaks (univariate analysis) versus using feature selection to select only those channels that contribute to best prediction accuracy for multivariate analyses. Overall, the current practice of using generalized, one-method-fits-all-spectra baseline removal results in poorer predictive performance for all methods. 
The extra steps needed to optimize baseline removal for each predicted variable and empower multivariate techniques with the best possible input data for optimal prediction accuracy are shown to be well worth the slight increase in necessary computations and complexity.
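Of the baseline-removal methods compared above, asymmetric least squares (ALS) is compact enough to sketch. The following is a minimal dense-matrix illustration of the idea (iteratively reweighted least squares with a second-difference smoothness penalty), not the optimized sparse implementation a real LIBS pipeline would use, and the parameter defaults are illustrative:

```python
import numpy as np

def als_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline estimate (dense sketch).

    Minimizes sum_i w_i (y_i - z_i)^2 + lam * ||D2 z||^2, where D2 is the
    second-difference operator. Weights are asymmetric: points above the
    current baseline get weight p, points below get 1 - p, so emission
    peaks are largely ignored while the smooth continuum is tracked.
    """
    y = np.asarray(y, dtype=float)
    n = y.size
    D = np.diff(np.eye(n), 2, axis=0)      # (n-2) x n second-difference operator
    penalty = lam * D.T @ D
    w = np.ones(n)
    for _ in range(n_iter):
        z = np.linalg.solve(np.diag(w) + penalty, w * y)
        w = np.where(y > z, p, 1.0 - p)    # reweight asymmetrically
    return z
```

Subtracting `als_baseline(y)` from a spectrum leaves the peaks sitting on a near-zero continuum; `lam` and `p` are exactly the kind of adjustable parameters the study argues should be optimized per predicted variable rather than fixed globally.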

  9. Severity-Based Adaptation with Limited Data for ASR to Aid Dysarthric Speakers

    PubMed Central

    Mustafa, Mumtaz Begum; Salim, Siti Salwah; Mohamed, Noraini; Al-Qatab, Bassam; Siong, Chng Eng

    2014-01-01

    Automatic speech recognition (ASR) is currently used in many assistive technologies, such as helping individuals with speech impairment in their communication ability. One challenge in ASR for speech-impaired individuals is the difficulty in obtaining a good speech database of impaired speakers for building an effective speech acoustic model. Because there are very few existing databases of impaired speech, which are also limited in size, the obvious solution for building a speech acoustic model of impaired speech is to employ adaptation techniques. However, issues that have not been addressed in existing studies in the area of adaptation for speech impairment are as follows: (1) identifying the most effective adaptation technique for impaired speech; and (2) the use of suitable source models to build an effective impaired-speech acoustic model. This research investigates the above-mentioned two issues on dysarthria, a type of speech impairment affecting millions of people. We applied both unimpaired and impaired speech as the source model with well-known adaptation techniques such as maximum likelihood linear regression (MLLR) and constrained MLLR (C-MLLR). The recognition accuracy of each impaired-speech acoustic model is measured in terms of word error rate (WER), with further assessments including phoneme insertion, substitution and deletion rates. Unimpaired speech, when combined with limited high-quality speech-impaired data, improves the performance of ASR systems in recognising severely impaired dysarthric speech. The C-MLLR adaptation technique was also found to be better than MLLR in recognising mildly and moderately impaired speech, based on statistical analysis of the WER. It was found that phoneme substitution was the biggest contributing factor to WER in dysarthric speech for all levels of severity.
The results show that the speech acoustic models derived from suitable adaptation techniques improve the performance of ASR systems in recognising impaired speech with limited adaptation data. PMID:24466004
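The WER and the phoneme insertion/substitution/deletion breakdown used in this study come from a standard Levenshtein alignment between a reference and a hypothesis sequence. A minimal sketch (not the authors' scoring tool; it works the same whether the tokens are words or phonemes):

```python
def edit_ops(ref, hyp):
    """Count substitutions, insertions and deletions aligning hyp to ref
    (standard Levenshtein alignment, as used for WER and phoneme error rates)."""
    n, m = len(ref), len(hyp)
    # dp[i][j] = minimum edits to turn ref[:i] into hyp[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        dp[i][0] = i
    for j in range(m + 1):
        dp[0][j] = j
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j - 1] + cost,  # match / substitution
                           dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1)         # insertion
    # Backtrace to attribute each edit to an operation type
    subs = ins = dels = 0
    i, j = n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and dp[i][j] == dp[i - 1][j - 1] + (0 if ref[i - 1] == hyp[j - 1] else 1):
            if ref[i - 1] != hyp[j - 1]:
                subs += 1
            i, j = i - 1, j - 1
        elif i > 0 and dp[i][j] == dp[i - 1][j] + 1:
            dels += 1
            i -= 1
        else:
            ins += 1
            j -= 1
    wer = (subs + ins + dels) / max(n, 1)
    return subs, ins, dels, wer
```

For example, aligning "the bat sat down" against the reference "the cat sat" yields one substitution, one insertion, and a WER of 2/3, which is how a substitution-heavy error profile like the one reported for dysarthric speech would show up.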

  10. Aqua splint suture technique in isolated zygomatic arch fractures.

    PubMed

    Kim, Dong-Kyu; Kim, Seung Kyun; Lee, Jun Ho; Park, Chan Hum

    2014-04-01

    Various methods have been used to treat zygomatic arch fractures, but no optimal modality exists for reducing these fractures and supporting the depressed bone fragments without causing esthetic problems or long-term discomfort. We developed a novel aqua splint and suture technique for stabilizing isolated zygomatic arch fractures. The objective of this study was to evaluate the effect of this novel aqua splint and suture technique in isolated zygomatic arch fractures. Patients with isolated zygomatic arch fractures were treated by a single surgeon in a single center from January 2000 through December 2012. The classic Gillies approach without external fixation was performed from January 2000 to December 2003, while the novel technique has been performed since 2004. Sixty-seven consecutive patients were included (classic method, n = 32; novel method, n = 35). Informed consent was obtained from all patients. The novel aqua splint and suture technique was performed in the following fashion: first, we evaluated the bony alignment intraoperatively by ultrasonography and then reduced the depressed fracture surgically using the Gillies approach. Thereafter, to stabilize the fracture and obtain a smooth facial figure, we made an aqua splint that fit the facial contour and placed monofilament nonabsorbable sutures around the fractured zygomatic arch. The novel aqua splint and suture technique was significantly associated with better cosmetic and functional results. In conclusion, the aqua splint suture technique is simple, quick, safe, and effective for stabilizing repositioned zygomatic arch fractures, and can be a good alternative procedure in isolated zygomatic arch fractures.

  11. Risk factor analysis of new brain lesions associated with carotid endarterectomy.

    PubMed

    Lee, Jae Hoon; Suh, Bo Yang

    2014-01-01

    Carotid endarterectomy (CEA) is the standard treatment for carotid artery stenosis. New brain ischemia is a major concern associated with CEA, and diffusion weighted imaging (DWI) is a good imaging modality for detecting early ischemic brain lesions. We aimed to investigate the surgical complications and identify the potential risk factors for the incidence of new brain lesions (NBL) on DWI after CEA. From January 2006 to November 2011, 94 patients who had been studied by magnetic resonance imaging including DWI within 1 week after CEA were included in this study. Data were retrospectively investigated by review of a vascular registry protocol. Seven clinical variables and three procedural variables were analyzed as risk factors for NBL after CEA. The incidence of periprocedural NBL on DWI was 27.7%. There were no fatal complications, such as ipsilateral disabling stroke, myocardial infarction or mortality. A significantly higher incidence of NBL was found in ulcer-positive patients as opposed to ulcer-negative patients (P = 0.029). The incidence of NBL after operation was significantly higher in patients treated with the conventional technique than with the eversion technique (P = 0.042). Our data show that CEA has acceptable periprocedural complication rates, and that ulcerative plaque and the conventional endarterectomy technique are risk factors for NBL development after CEA.

  12. Breakthrough Propulsion Physics Project: Project Management Methods

    NASA Technical Reports Server (NTRS)

    Millis, Marc G.

    2004-01-01

    To leap past the limitations of existing propulsion, the NASA Breakthrough Propulsion Physics (BPP) Project seeks further advancements in physics from which new propulsion methods can eventually be derived. Three visionary breakthroughs are sought: (1) propulsion that requires no propellant, (2) propulsion that circumvents existing speed limits, and (3) breakthrough methods of energy production to power such devices. Because these propulsion goals are presumably far from fruition, a special emphasis is placed on identifying credible research that will make measurable progress toward these goals in the near term. The management techniques to address this challenge are presented, with particular attention to the process used to review, prioritize, and select research tasks. This selection process includes these key features: (a) research tasks are constrained to only address the immediate unknowns, curious effects or critical issues; (b) reliability of assertions is more important than the implications of the assertions, which includes the practice whereby reviewers judge credibility rather than feasibility; and (c) total scores are obtained by multiplying the criteria scores rather than by adding them. Lessons learned and revisions planned are discussed.
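The multiplicative scoring in feature (c) has a concrete consequence worth spelling out: a near-zero score on any single criterion suppresses the whole total, so credibility acts as a gate that ambition elsewhere cannot outvote. A toy illustration (the scores and the 0-5 scale are hypothetical, not the BPP Project's actual rubric):

```python
def total_score(criteria):
    """Combine per-criterion scores multiplicatively: any zero criterion
    (e.g. zero credibility) zeroes the total, which addition would not."""
    total = 1.0
    for score in criteria:
        total *= score
    return total

# Hypothetical review scores on a 0-5 scale
strong_but_incredible = [5, 5, 0]   # high impact claimed, zero credibility
modest_but_sound      = [3, 3, 3]
```

Under addition the first proposal wins (10 vs 9); under multiplication it scores zero and the modest-but-sound proposal wins (0 vs 27), which is exactly the prioritization behavior the abstract describes.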

  13. SEARCHING FOR EXTRATERRESTRIAL INTELLIGENCE SIGNALS IN ASTRONOMICAL SPECTRA, INCLUDING EXISTING DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borra, Ermanno F., E-mail: borra@phy.ulaval.ca

    The main purpose of this article is to make astronomers aware that Searches for Extraterrestrial Intelligence (SETIs) can be carried out by analyzing standard astronomical spectra, including those they have already taken. Simplicity is the outstanding advantage of a search in spectra. The spectra can be analyzed by simple eye inspection or by a few lines of code that use Fourier transform software. Theory, confirmed by published experiments, shows that periodic signals in spectra can be easily generated by sending light pulses separated by constant time intervals. While part of this article, like all articles on SETIs, is highly speculative, the basic physics is sound. In particular, technology now available on Earth could be used to send signals having the required energy to be detected at a target located 1000 lt-yr away. Extraterrestrial Intelligence (ETI) could use these signals to make us aware of their existence. For an ETI, the technique would also have the advantage that the signals could be detected both in spectra and in searches for intensity pulses like those currently carried out on Earth.
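The "few lines of code" the author mentions can be made concrete: Fourier transform the flux-versus-channel array of a spectrum and look for a dominant periodicity. A hedged sketch (the function name and the synthetic test signal are illustrative, not from the article):

```python
import numpy as np

def periodic_component(spectrum):
    """Search a 1-D spectrum for a periodic modulation by Fourier
    transforming flux vs. channel and locating the strongest peak.

    Returns (frequency in cycles per spectral channel,
             fraction of non-DC power in that single bin)."""
    flux = np.asarray(spectrum, dtype=float)
    flux = flux - flux.mean()             # remove the DC level
    power = np.abs(np.fft.rfft(flux)) ** 2
    freqs = np.fft.rfftfreq(flux.size)    # cycles per channel
    k = np.argmax(power[1:]) + 1          # skip the zero-frequency bin
    return freqs[k], power[k] / power[1:].sum()
```

A spectrum modulated by pulses at constant time intervals, as the article describes, would concentrate most of its non-DC power in one Fourier bin, whereas ordinary astrophysical spectra spread power across many frequencies.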

  14. Counterfeit deterrence and digital imaging technology

    NASA Astrophysics Data System (ADS)

    Church, Sara E.; Fuller, Reese H.; Jaffe, Annette B.; Pagano, Lorelei W.

    2000-04-01

    The US government recognizes the growing problem of counterfeiting currency using digital imaging technology, as desktop systems become more sophisticated, less expensive and more prevalent. As the rate of counterfeiting with this type of equipment has grown, the need for specific prevention methods has become apparent to the banknote authorities. As a result, the Treasury Department and Federal Reserve have begun to address issues related specifically to this type of counterfeiting. The technical representatives of these agencies are taking a comprehensive approach to minimize counterfeiting using digital technology. This approach includes identification of current technology solutions for banknote recognition, data stream intervention and output marking, outreach to the hardware and software industries, and enhancement of public education efforts. Other aspects include strong support of and cooperation with existing international efforts to prevent counterfeiting, review and amendment of existing anti-counterfeiting legislation, and investigation of currency design techniques to make faithful reproduction more difficult. Implementation of these steps and others is intended to lead to the establishment of a formal, permanent policy to address and prevent the use of emerging technologies to counterfeit currency.

  15. Surface-supported metal-organic framework thin films: fabrication methods, applications, and challenges.

    PubMed

    Liu, Jinxuan; Wöll, Christof

    2017-10-02

    Surface-supported metal-organic framework thin films are receiving increasing attention as a novel form of nanotechnology. New deposition techniques that enable the control of the film thickness, homogeneity, morphology, and dimensions with a huge number of metal-organic framework compounds offer tremendous opportunities in a number of different application fields. In response to increasing demands for environmental sustainability and cleaner energy, much effort in recent years has been devoted to the development of MOF thin films for applications in photovoltaics, CO2 reduction, energy storage, water splitting, and electronic devices, as well as for the fabrication of membranes. Although existing applications are promising and encouraging, MOF thin films still face numerous challenges, including the need for a more thorough understanding of the thin-film growth mechanism, stability of the internal and external interfaces, strategies for doping and models for charge carrier transport. In this paper, we review the recent advances in MOF thin films, including fabrication and patterning strategies and existing nanotechnology applications. We conclude by listing the most attractive future opportunities as well as the most urgent challenges.

  16. Piezoelectric technology in otolaryngology, and head and neck surgery: a review.

    PubMed

    Meller, C; Havas, T E

    2017-07-01

    Piezoelectric technology has existed for many years as a surgical tool for precise removal of soft tissue and bone. The existing literature regarding its use specifically for otolaryngology, and head and neck surgery was reviewed. The databases Medline, the Cochrane Central Register of Controlled Trials, PubMed, Embase and Cambridge Scientific Abstracts were searched. Studies were selected and reviewed based on relevance. Sixty studies were identified and examined for evidence of benefits and disadvantages of piezoelectric surgery and its application in otolaryngology. The technique was compared with traditional surgical methods, in terms of intra-operative bleeding, histology, learning curve, operative time and post-operative pain. Piezoelectric technology has been successfully employed, particularly in otology and skull base surgery, where its specific advantages versus traditional drills include a lack of 'blunting' and tissue selectivity. Technical advantages include ease of use, a short learning curve and improved visibility. Its higher cost warrants consideration given that clinically significant improvements in operative time and morbidity have not yet been proven. Further studies may define the evolving role of piezoelectric surgery in otolaryngology, and head and neck surgery.

  17. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  18. Mississippi Sound Remote Sensing Study

    NASA Technical Reports Server (NTRS)

    Atwell, B. H.

    1973-01-01

    The Mississippi Sound Remote Sensing Study was initiated as part of the research program of the NASA Earth Resources Laboratory. The objective of this study is development of remote sensing techniques to study near-shore marine waters. Included within this general objective are the following: (1) evaluate existing techniques and instruments used for remote measurement of parameters of interest within these waters; (2) develop methods for interpretation of state-of-the-art remote sensing data which are most meaningful to an understanding of processes taking place within near-shore waters; (3) define hardware development requirements and/or system specifications; (4) develop a system combining data from remote and surface measurements which will most efficiently assess conditions in near-shore waters; (5) conduct projects in coordination with appropriate operating agencies to demonstrate applicability of this research to environmental and economic problems.

  19. Advanced MR Imaging of the Placenta: Exploring the in utero placenta-brain connection

    PubMed Central

    Andescavage, Nickie Niforatos; DuPlessis, Adre; Limperopoulos, Catherine

    2015-01-01

    The placenta is a vital organ necessary for the healthy neurodevelopment of the fetus. Despite the known associations between placental dysfunction and neurologic impairment, there is a paucity of tools available to reliably assess in vivo placental health and function. Existing clinical tools for placental assessment remain insensitive in predicting and assessing placental well-being. Advanced MRI techniques hold significant promise for the dynamic, non-invasive, real-time assessment of placental health and identification of early placental-based disorders. In this review, we summarize the available clinical tools for placental assessment, including ultrasound, Doppler, and conventional MRI. We then explore the emerging role of advanced placental MR imaging techniques in supporting the developing fetus, and appraise the strengths and limitations of quantitative MRI in identifying early markers of placental dysfunction for improved pregnancy monitoring and fetal outcomes. PMID:25765905

  20. Approximate heating analysis for the windward-symmetry plane of Shuttle-like bodies at large angle of attack

    NASA Technical Reports Server (NTRS)

    Zoby, E. V.

    1981-01-01

    An engineering method has been developed for computing the windward-symmetry plane convective heat-transfer rates on Shuttle-like vehicles at large angles of attack. The engineering code includes an approximate inviscid flowfield technique, laminar and turbulent heating-rate expressions, an approximation to account for the variable-entropy effects on the surface heating and the concept of an equivalent axisymmetric body to model the windward-ray flowfields of Shuttle-like vehicles at angles of attack from 25 to 45 degrees. The engineering method is validated by comparing computed heating results with corresponding experimental data measured on Shuttle and advanced transportation models over a wide range of flow conditions and angles of attack from 25 to 40 degrees and also with results of existing prediction techniques. The comparisons are in good agreement.

  1. Enhancing audiovisual experience with haptic feedback: a survey on HAV.

    PubMed

    Danieau, F; Lecuyer, A; Guillotel, P; Fleureau, J; Mollet, N; Christie, M

    2013-01-01

    Haptic technology has been widely employed in applications ranging from teleoperation and medical simulation to art and design, including entertainment, flight simulation, and virtual reality. Today there is a growing interest among researchers in integrating haptic feedback into audiovisual systems. A new medium emerges from this effort: haptic-audiovisual (HAV) content. This paper presents the techniques, formalisms, and key results pertinent to this medium. We first review the three main stages of the HAV workflow: the production, distribution, and rendering of haptic effects. We then highlight the pressing necessity for evaluation techniques in this context and discuss the key challenges in the field. By building on existing technologies and tackling the specific challenges of the enhancement of audiovisual experience with haptics, we believe the field presents exciting research perspectives whose financial and societal stakes are significant.

  2. Program for narrow-band analysis of aircraft flyover noise using ensemble averaging techniques

    NASA Technical Reports Server (NTRS)

    Gridley, D.

    1982-01-01

    A package of computer programs was developed for analyzing acoustic data from an aircraft flyover. The package assumes the aircraft is flying at constant altitude and constant velocity in a fixed attitude over a linear array of ground microphones. Aircraft position is provided by radar and an option exists for including the effects of the aircraft's rigid-body attitude relative to the flight path. Time synchronization between radar and acoustic recording stations permits ensemble averaging techniques to be applied to the acoustic data thereby increasing the statistical accuracy of the acoustic results. Measured layered meteorological data obtained during the flyovers are used to compute propagation effects through the atmosphere. Final results are narrow-band spectra and directivities corrected for the flight environment to an equivalent static condition at a specified radius.
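The statistical benefit of ensemble averaging described above is that averaging the periodograms of K time-synchronized, independently noisy records keeps the spectral estimate unbiased while reducing its variance roughly as 1/K. A minimal sketch (the array shapes and function name are assumptions, not the program package's actual interface):

```python
import numpy as np

def ensemble_spectrum(records):
    """Average the periodograms of K time-synchronized records.

    records: sequence of K equal-length time series (shape (K, N)).
    Each record's periodogram is an unbiased but noisy spectral estimate;
    averaging across the ensemble cuts the estimate's variance by ~1/K,
    which is the point of ensemble averaging repeated flyover data.
    """
    records = np.asarray(records, dtype=float)
    n = records.shape[1]
    spectra = np.abs(np.fft.rfft(records, axis=1)) ** 2 / n
    return spectra.mean(axis=0)
```

Time synchronization between the radar and acoustic stations is what makes the records alignable in the first place; without it, the segments could not be treated as repeated realizations of the same flyover geometry.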

  3. Study for incorporating time-synchronized approach control into the CH-47/VALT digital navigation system

    NASA Technical Reports Server (NTRS)

    Mcconnell, W. J., Jr.

    1979-01-01

    Techniques for obtaining time-synchronized (4D) approach control in the VALT research helicopter are described. Various 4D concepts and their compatibility with the existing VALT digital computer navigation and guidance system hardware and software are examined. Modifications to various techniques were investigated in order to take advantage of the unique operating characteristics of the helicopter in the terminal area. A 4D system is proposed, combining the direct-to maneuver with the existing VALT curved path generation capability.

  4. Prediction of flow-induced failures of braided flexible hoses and bellows

    NASA Technical Reports Server (NTRS)

    Sack, L. E.; Nelson, R. L.; Mason, D. R.; Cooper, R. A.

    1972-01-01

    Analytical techniques were developed to evaluate braided hoses and bellows for the possibility of flow-induced resonance. These techniques determine the likelihood of high-cycle fatigue failure when such resonance exists.

  5. Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses

    PubMed Central

    Stephen, Emily P.; Lepage, Kyle Q.; Eden, Uri T.; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S.; Guenther, Frank H.; Kramer, Mark A.

    2014-01-01

    The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty—both in the functional network edges and the corresponding aggregate measures of network topology—are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here—appropriate for static and dynamic network inference and different statistical measures of coupling—permits the evaluation of confidence in network measures in a variety of settings common to neuroscience. PMID:24678295
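The trial-level resampling strategy described in this abstract can be sketched in a few lines (an illustration of the general idea, not the authors' implementation; Pearson correlation stands in for whatever coupling measure is used, and the array shapes are assumptions):

```python
import numpy as np

def edge_confidence(trials, n_boot=200, seed=0):
    """Bootstrap trial-level resampling of functional network edges.

    trials: array of shape (n_trials, n_sensors, n_samples).
    Edges are Pearson correlations between sensors computed from the
    trial-concatenated data; resampling trials with replacement yields a
    distribution, and hence a confidence interval, for every edge.
    Returns (bootstrap mean, 2.5th percentile, 97.5th percentile),
    each of shape (n_sensors, n_sensors).
    """
    rng = np.random.default_rng(seed)
    n_trials = trials.shape[0]
    boots = []
    for _ in range(n_boot):
        idx = rng.integers(0, n_trials, size=n_trials)
        # Concatenate the resampled trials along time, then correlate
        data = np.concatenate(trials[idx], axis=1)   # (n_sensors, n_trials * n_samples)
        boots.append(np.corrcoef(data))
    boots = np.array(boots)                          # (n_boot, n_sensors, n_sensors)
    lo, hi = np.percentile(boots, [2.5, 97.5], axis=0)
    return boots.mean(axis=0), lo, hi
```

Because resampling happens at the level of whole trials, the within-trial temporal structure of the recordings is preserved, which is what makes this a statistically principled way to assess uncertainty in task-related data.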

  6. Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses.

    PubMed

    Stephen, Emily P; Lepage, Kyle Q; Eden, Uri T; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S; Guenther, Frank H; Kramer, Mark A

    2014-01-01

The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty, both in the functional network edges and the corresponding aggregate measures of network topology, are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here, appropriate for static and dynamic network inference and different statistical measures of coupling, permits the evaluation of confidence in network measures in a variety of settings common to neuroscience.
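The trial-based resampling idea generalizes beyond this particular paper. Below is a minimal sketch, assuming trials are stacked as (trials x channels x samples) and using cross-channel correlation as the edge statistic; the function name, the choice of statistic, and the percentile-interval rule are illustrative, not the authors' exact procedure.

```python
import numpy as np

def bootstrap_edges(trials, n_boot=200, alpha=0.05, seed=0):
    """Bootstrap confidence intervals for functional network edges.

    trials: array (n_trials, n_channels, n_samples) of task-related data.
    Edge statistic: the per-trial channel correlation matrix, averaged
    over trials. Resampling whole trials (with replacement) preserves
    within-trial temporal structure while quantifying cross-trial
    variability in each edge.
    """
    rng = np.random.default_rng(seed)
    n_trials = trials.shape[0]

    def edge_stat(idx):
        # mean over the selected trials of the channel correlation matrix
        mats = [np.corrcoef(trials[i]) for i in idx]
        return np.mean(mats, axis=0)

    boots = np.array([edge_stat(rng.integers(0, n_trials, n_trials))
                      for _ in range(n_boot)])
    lo = np.percentile(boots, 100 * alpha / 2, axis=0)
    hi = np.percentile(boots, 100 * (1 - alpha / 2), axis=0)
    return edge_stat(np.arange(n_trials)), lo, hi
```

Edges whose interval excludes zero (after any multiple-comparison control) would be retained in the network; aggregate measures such as network density can be bootstrapped the same way by computing them inside the resampled statistic.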

  7. Meta-analysis suggests that the electromagnetic technique is better than the free-hand method for the distal locking during intramedullary nailing procedures.

    PubMed

    Zhu, Yanbin; Chang, Hengrui; Yu, Yiyang; Chen, Wei; Liu, Song; Zhang, Yingze

    2017-05-01

To evaluate the comparative effectiveness and accuracy of the electromagnetic technique (EM) versus the free-hand method (FH) for distal locking in intramedullary nailing procedures. Relevant original studies were searched in Medline, Pubmed, Embase, China National Knowledge Infrastructure, and the Cochrane Central Database (all through October 2015). Comparative studies providing sufficient data of interest were included in this meta-analysis. Stata 11.0 was used to analyze all data. Eight studies involving 611 participants were included, with 305 in the EM group and 306 in the FH group. EM outperformed FH with a reduced distal locking time of 4.1 minutes [standardized mean difference (SMD), 1.61; 95% confidence interval (95% CI), 0.81 to 2.41] and a reduced fluoroscopy time of 25.3 seconds (SMD, 2.64; 95% CI, 2.12 to 3.16). Regarding the accuracy of distal screw placement, no significant difference was observed between the two techniques (OR, 2.39; 95% CI, 0.38 to 15.0). There was a trend toward longer operative time in FH versus EM by 10 minutes (79.0 vs. 69.0 minutes), although the difference was not statistically significant (SMD, 0.341; 95% CI, -0.02 to 0.703). The existing evidence suggests that the EM technique is a better alternative for distal locking in intramedullary nailing procedures, which might aid in the management of diaphyseal fractures of the lower extremities.
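The pooled SMDs quoted above come from standard inverse-variance meta-analysis (the paper used Stata 11.0). A minimal DerSimonian-Laird random-effects sketch, not a reproduction of the authors' analysis:

```python
import math

def pool_smd(effects, variances):
    """Pool standardized mean differences with a DerSimonian-Laird
    random-effects model: inverse-variance weights are inflated by the
    estimated between-study variance tau^2. Returns the pooled
    estimate and a 95% confidence interval."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]  # random-effects weights
    est = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return est, (est - 1.96 * se, est + 1.96 * se)
```

With homogeneous inputs tau^2 collapses to zero and the result equals the fixed-effect pooled estimate.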

  8. Including the effect of motion artifacts in noise and performance analysis of dual-energy contrast-enhanced mammography

    NASA Astrophysics Data System (ADS)

    Allec, N.; Abbaszadeh, S.; Scott, C. C.; Lewin, J. M.; Karim, K. S.

    2012-12-01

In contrast-enhanced mammography (CEM), the dual-energy dual-exposure technique, which can leverage existing conventional mammography infrastructure, relies on acquiring the low- and high-energy images using two separate exposures. The finite time between image acquisition leads to motion artifacts in the combined image. Motion artifacts can lead to greater anatomical noise in the combined image due to increased mismatch of the background tissue in the images to be combined; however, the impact has not yet been quantified. In this study we investigate a method to include motion artifacts in the dual-energy noise and performance analysis. The motion artifacts are included via an extended cascaded systems model. To validate the model, noise power spectra of a previous dual-energy clinical study are compared to that of the model. The ideal observer detectability is used to quantify the effect of motion artifacts on tumor detectability. It was found that the detectability can be significantly degraded when motion is present (e.g., detectability of 2.5 mm radius tumor decreased by approximately a factor of 2 for translation motion on the order of 1000 μm). The method presented may be used for a more comprehensive theoretical noise and performance analysis and fairer theoretical performance comparison between dual-exposure techniques, where motion artifacts are present, and single-exposure techniques, where low- and high-energy images are acquired simultaneously and motion artifacts are absent.

  9. Including the effect of motion artifacts in noise and performance analysis of dual-energy contrast-enhanced mammography.

    PubMed

    Allec, N; Abbaszadeh, S; Scott, C C; Lewin, J M; Karim, K S

    2012-12-21

In contrast-enhanced mammography (CEM), the dual-energy dual-exposure technique, which can leverage existing conventional mammography infrastructure, relies on acquiring the low- and high-energy images using two separate exposures. The finite time between image acquisition leads to motion artifacts in the combined image. Motion artifacts can lead to greater anatomical noise in the combined image due to increased mismatch of the background tissue in the images to be combined; however, the impact has not yet been quantified. In this study we investigate a method to include motion artifacts in the dual-energy noise and performance analysis. The motion artifacts are included via an extended cascaded systems model. To validate the model, noise power spectra of a previous dual-energy clinical study are compared to that of the model. The ideal observer detectability is used to quantify the effect of motion artifacts on tumor detectability. It was found that the detectability can be significantly degraded when motion is present (e.g., detectability of 2.5 mm radius tumor decreased by approximately a factor of 2 for translation motion on the order of 1000 μm). The method presented may be used for a more comprehensive theoretical noise and performance analysis and fairer theoretical performance comparison between dual-exposure techniques, where motion artifacts are present, and single-exposure techniques, where low- and high-energy images are acquired simultaneously and motion artifacts are absent.
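In its simplest Fourier-domain form, the ideal observer detectability used here reduces to integrating the signal power spectrum over the noise power spectrum (NPS). A toy sketch with hypothetical spectra, illustrating how motion-inflated low-frequency noise degrades d'; this is a generic prewhitening-observer formula, not the paper's full cascaded systems model:

```python
import numpy as np

def dprime(signal_spectrum, nps, df):
    """Ideal (prewhitening) observer detectability index:
    d'^2 = sum over frequency bins of |S(f)|^2 / NPS(f) * df."""
    return np.sqrt(np.sum(np.abs(signal_spectrum) ** 2 / nps) * df)

# Hypothetical 1D spectra: a tumor signal concentrated at low frequencies,
# a flat quantum-noise NPS, and an extra low-frequency term standing in
# for motion-induced anatomical noise.
f = np.linspace(0.1, 5.0, 50)          # spatial frequency (cycles/mm)
df = f[1] - f[0]
sig = np.exp(-f)                        # tumor signal spectrum (arb. units)
nps_static = np.full(f.size, 0.01)      # quantum noise only
nps_motion = nps_static + 0.5 * np.exp(-f / 0.5)  # + motion artifact noise
```

Evaluating `dprime(sig, nps_motion, df)` gives a smaller value than `dprime(sig, nps_static, df)`, mirroring the paper's finding that motion degrades detectability.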

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hameed, Abdul; Khoshkbarforoushha, Alireza; Ranjan, Rajiv

In a cloud computing paradigm, energy efficient allocation of different virtualized ICT resources (servers, storage disks, and networks, and the like) is a complex problem due to the presence of heterogeneous application (e.g., content delivery networks, MapReduce, web applications, and the like) workloads having contentious allocation requirements in terms of ICT resource capacities (e.g., network bandwidth, processing speed, response time, etc.). Several recent papers have tried to address the issue of improving energy efficiency in allocating cloud resources to applications with varying degrees of success. However, to the best of our knowledge there is no published literature on this subject that clearly articulates the research problem and provides a research taxonomy for succinct classification of existing techniques. Hence, the main aim of this paper is to identify open challenges associated with energy efficient resource allocation. In this regard, the study first outlines the problem and existing hardware- and software-based techniques available for this purpose. Furthermore, available techniques already presented in the literature are summarized based on the energy-efficient research dimension taxonomy. The advantages and disadvantages of the existing techniques are comprehensively analyzed against the proposed research dimension taxonomy, namely: resource adaption policy, objective function, allocation method, allocation operation, and interoperability.

  11. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operations components.
We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, and high level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.

  12. The development of flux-split algorithms for flows with non-equilibrium thermodynamics and chemical reactions

    NASA Technical Reports Server (NTRS)

    Grossman, B.; Cinella, P.

    1988-01-01

    A finite-volume method for the numerical computation of flows with nonequilibrium thermodynamics and chemistry is presented. A thermodynamic model is described which simplifies the coupling between the chemistry and thermodynamics and also results in the retention of the homogeneity property of the Euler equations (including all the species continuity and vibrational energy conservation equations). Flux-splitting procedures are developed for the fully coupled equations involving fluid dynamics, chemical production and thermodynamic relaxation processes. New forms of flux-vector split and flux-difference split algorithms are embodied in a fully coupled, implicit, large-block structure, including all the species conservation and energy production equations. Several numerical examples are presented, including high-temperature shock tube and nozzle flows. The methodology is compared to other existing techniques, including spectral and central-differenced procedures, and favorable comparisons are shown regarding accuracy, shock-capturing and convergence rates.
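For the scalar advection equation, the essence of flux-vector splitting is to split the flux by characteristic sign and difference each part in its upwind direction. A one-dimensional periodic sketch of that idea (the paper's fully coupled, implicit treatment of chemistry and thermodynamic relaxation is far beyond this toy):

```python
import numpy as np

def fvs_step(u, a, dx, dt):
    """One explicit step of scalar advection u_t + a*u_x = 0 using
    flux-vector splitting: the flux f = a*u is split into a
    right-running part f+ (from max(a, 0)) and a left-running part
    f- (from min(a, 0)), each differenced upwind. Periodic domain."""
    ap, am = max(a, 0.0), min(a, 0.0)
    fp, fm = ap * u, am * u
    dfp = fp - np.roll(fp, 1)      # backward difference for f+
    dfm = np.roll(fm, -1) - fm     # forward difference for f-
    return u - dt / dx * (dfp + dfm)
```

At a Courant number of exactly one this scheme shifts the solution by one cell per step, so advecting a profile once around a periodic domain returns it unchanged, a convenient correctness check.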

  13. Technique for Radiometer and Antenna Array Calibration with Two Antenna Noise Diodes

    NASA Technical Reports Server (NTRS)

    Srinivasan, Karthik; Limaye, Ashutosh; Laymon, Charles; Meyer, Paul

    2011-01-01

This paper presents a new technique to calibrate a microwave radiometer and phased array antenna system. This calibration technique uses a radiated noise source in addition to an injected noise source for calibration. The plane of reference for this calibration technique is the face of the antenna, and it can therefore effectively calibrate the gain fluctuations in active phased array antennas. This paper gives the mathematical formulation for the technique and discusses the improvements brought by the method over existing calibration techniques.
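The core of any two-reference calibration, whether the references are injected or radiated noise diodes, is solving a linear gain/offset model from two known brightness temperatures. A generic sketch of that logic (the paper's antenna-plane formulation is more involved):

```python
def two_point_calibration(c1, t1, c2, t2):
    """Solve counts = gain * T + offset from two reference states
    (e.g., two noise-diode brightness temperatures t1, t2 producing
    counts c1, c2), and return a function inverting counts to scene
    brightness temperature."""
    gain = (c2 - c1) / (t2 - t1)
    offset = c1 - gain * t1
    return lambda counts: (counts - offset) / gain
```

Because both references here would be radiated through the antenna, gain drift anywhere between the antenna face and the receiver is absorbed into `gain` and removed on inversion.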

  14. Improved numerical methods for turbulent viscous recirculating flows

    NASA Technical Reports Server (NTRS)

    Vandoormaal, J. P.; Turan, A.; Raithby, G. D.

    1986-01-01

    The objective of the present study is to improve both the accuracy and computational efficiency of existing numerical techniques used to predict viscous recirculating flows in combustors. A review of the status of the study is presented along with some illustrative results. The effort to improve the numerical techniques consists of the following technical tasks: (1) selection of numerical techniques to be evaluated; (2) two dimensional evaluation of selected techniques; and (3) three dimensional evaluation of technique(s) recommended in Task 2.

  15. Change Detection Analysis of Water Pollution in Coimbatore Region using Different Color Models

    NASA Astrophysics Data System (ADS)

    Jiji, G. Wiselin; Devi, R. Naveena

    2017-12-01

The data acquired through remote sensing satellites furnish facts about land and water at varying resolutions and have been widely used for change detection studies. Although many change detection methodologies and techniques already exist, new ones continue to emerge. Existing change detection techniques exploit images that are either in gray scale or in the RGB color model. In this paper we introduce color models for performing change detection for water pollution. Polluted lakes are classified, post-classification change detection techniques are applied to the RGB images, and the results are analysed for whether changes exist. Furthermore, RGB images obtained after classification, when converted to either of the two color models YCbCr and YIQ, are found to produce the same results as the RGB model images. Thus other color models such as YCbCr and YIQ can be used as substitutes for the RGB color model when analysing change detection with regard to water pollution.
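A minimal sketch of the color-model conversion plus a thresholded change mask: the BT.601 YCbCr transform below is standard, while the luma-difference rule and the threshold value are illustrative stand-ins for the paper's post-classification comparison.

```python
import numpy as np

def rgb_to_ycbcr(img):
    """ITU-R BT.601 full-range RGB -> YCbCr conversion (img in [0, 255])."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def change_mask(date1, date2, threshold=30.0):
    """Image-differencing change mask: pixels whose luma (Y) difference
    between the two acquisition dates exceeds a threshold are flagged
    as changed."""
    d = np.abs(rgb_to_ycbcr(date1)[..., 0] - rgb_to_ycbcr(date2)[..., 0])
    return d > threshold
```

Because Y, Cb, Cr are a fixed linear transform of R, G, B, a classifier operating on the converted channels sees the same information, which is consistent with the paper's finding that YCbCr and YIQ reproduce the RGB results.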

  16. Pathogen Decontamination of Food Crop Soil: A Review.

    PubMed

    Gurtler, Joshua B

    2017-09-01

    The purpose of this review is to delineate means of decontaminating soil. This information might be used to mitigate soil-associated risks of foodborne pathogens. The majority of the research in the published literature involves inactivation of plant pathogens in soil, i.e., those pathogens harmful to fruit and vegetable production and ornamental plants. Very little has been published regarding the inactivation of foodborne human pathogens in crop soil. Nevertheless, because decontamination techniques for plant pathogens might also be useful methods for eliminating foodborne pathogens, this review also includes inactivation of plant pathogens, with appropriate discussion and comparisons, in the hopes that these methods may one day be validated against foodborne pathogens. Some of the major soil decontamination methods that have been investigated and are covered include chemical decontamination (chemigation), solarization, steaming, biofumigation, bacterial competitive exclusion, torch flaming, microwave treatment, and amendment with biochar. Other innovative means of inactivating foodborne pathogens in soils may be discovered and explored in the future, provided that these techniques are economically feasible in terms of chemicals, equipment, and labor. Food microbiology and food safety researchers should reach out to soil scientists and plant pathologists to create links where they do not currently exist and strengthen relationships where they do exist to take advantage of multidisciplinary skills. In time, agricultural output and the demand for fresh produce will increase. With advances in the sensitivity of pathogen testing and epidemiological tracebacks, the need to mitigate preharvest bacterial contamination of fresh produce will become paramount. Hence, soil decontamination technologies may become more economically feasible and practical in light of increasing the microbial safety of fresh produce.

  17. Topical Antimicrobials for Burn Infections – An Update

    PubMed Central

    Sevgi, Mert; Toklu, Ani; Vecchio, Daniela; Hamblin, Michael R

    2014-01-01

    The relentless rise in antibiotic resistance among pathogenic bacteria and fungi, coupled with the high susceptibility of burn wounds to infection, and the difficulty of systemically administered antibiotics to reach damaged tissue, taken together have made the development of novel topical antimicrobials for burn infections a fertile area of innovation for researchers and companies. We previously covered the existing patent literature in this area in 2010, but the notable progress made since then, has highlighted the need for an update to bring the reader up to date on recent developments. New patents in the areas of topically applied antibiotics and agents that can potentiate the action of existing antibiotics may extend their useful lifetime. Developments have also been made in biofilm-disrupting agents. Antimicrobial peptides are nature’s way for many life forms to defend themselves against attack by pathogens. Silver has long been known to be a highly active antimicrobial but new inorganic metal derivatives based on bismuth, copper and gallium have emerged. Halogens such as chlorine and iodine can be delivered by novel technologies. A variety of topically applied antimicrobials include chitosan preparations, usnic acid, ceragenins and XF porphyrins. Natural product derived antimicrobials such as tannins and essential oils have also been studied. Novel techniques to deliver reactive oxygen species and nitric oxide in situ have been developed. Light-mediated techniques include photodynamic therapy, ultraviolet irradiation, blue light, low-level laser therapy and titania photocatalysis. Passive immunotherapy employs antibodies against pathogens and their virulence factors. Finally an interesting new area uses therapeutic microorganisms such as phages, probiotic bacteria and protozoa to combat infections. PMID:24215506

  18. Can Communicating Personalised Disease Risk Promote Healthy Behaviour Change? A Systematic Review of Systematic Reviews.

    PubMed

    French, David P; Cameron, Elaine; Benton, Jack S; Deaton, Christi; Harvie, Michelle

    2017-10-01

The assessment and communication of disease risk that is personalised to the individual is widespread in healthcare contexts. Despite several systematic reviews of RCTs, it is unclear under what circumstances personalised risk estimates promote change in four key health-related behaviours: smoking, physical activity, diet and alcohol consumption. The present research aims to systematically identify, evaluate and synthesise the findings of existing systematic reviews. This systematic review of systematic reviews followed published guidance. A search of four databases and a two-stage screening procedure with good reliability identified nine eligible systematic reviews. The nine reviews each included between three and 15 primary studies, containing 36 unique studies. Methods of personalising risk feedback included imaging/visual feedback, genetic testing, and numerical estimation from risk algorithms. The reviews were generally high quality. For a broad range of methods of estimating and communicating risk, the reviews found no evidence that risk information had strong or consistent effects on health-related behaviours. The most promising effects came from interventions using visual or imaging techniques and with smoking cessation and dietary behaviour as outcomes, but results were inconsistent. Few interventions explicitly used theory, few targeted self-efficacy or response efficacy, and a limited range of Behaviour Change Techniques were used. Presenting risk information on its own, even when highly personalised, does not produce strong effects on health-related behaviours or changes that are sustained. Future research in this area should build on the existing knowledge base about increasing the effects of risk communication on behaviour.

  19. Orthopoxvirus Genome Evolution: The Role of Gene Loss

    PubMed Central

    Hendrickson, Robert Curtis; Wang, Chunlin; Hatcher, Eneida L.; Lefkowitz, Elliot J.

    2010-01-01

Poxviruses are highly successful pathogens, known to infect a variety of hosts. The family Poxviridae includes Variola virus, the causative agent of smallpox, which has been eradicated as a public health threat but could potentially reemerge as a bioterrorist threat. The risk scenario includes other animal poxviruses and genetically engineered manipulations of poxviruses. Studies of orthologous gene sets have established the evolutionary relationships of members within the Poxviridae family. It is not clear, however, how variations between family members arose in the past, an important issue in understanding how these viruses may vary and possibly produce future threats. Using a newly developed poxvirus-specific tool, we predicted accurate gene sets for viruses with completely sequenced genomes in the genus Orthopoxvirus. Employing sensitive sequence comparison techniques together with comparison of syntenic gene maps, we established the relationships between all viral gene sets. These techniques allowed us to unambiguously identify the gene loss/gain events that have occurred over the course of orthopoxvirus evolution. It is clear that for all existing Orthopoxvirus species, no individual species has acquired protein-coding genes unique to that species. All existing species contain only genes that are also present in members of the species Cowpox virus, and cowpox virus strains contain every gene present in any other orthopoxvirus strain. These results support a theory of reductive evolution in which the reduction in size of the core gene set of a putative ancestral virus played a critical role in speciation and confined any newly emerging virus species to a particular environmental (host or tissue) niche. PMID:21994715

  20. Posterior lamellar reconstruction: a comprehensive review of the literature.

    PubMed

    Fin, Alessandra; De Biasio, Fabrizio; Lanzetta, Paolo; Mura, Sebastiano; Tarantini, Anna; Parodi, Pier Camillo

    2018-05-21

The aim of the review is to describe the different techniques and materials available to reconstruct the tarsoconjunctival layer of the eyelid, and to analyze their indications, advantages, and disadvantages. We searched the Cochrane, PubMed, and Ovid MEDLINE databases for English articles published between January 1990 and January 2017 using variations of the following key words: "posterior lamella," "eyelid reconstruction," "tarsoconjunctival," "flap," and "graft." Two reviewers checked the abstracts of the articles found to eliminate redundant or irrelevant articles. The references of the identified articles were screened manually to include relevant works not found through the initial search. The search identified 174 articles. Only a few articles with a therapeutic level of evidence were found. Techniques for posterior lamellar reconstruction can be categorized as local, regional, and distant flaps; and tarsoconjunctival, heterotopic, homologous, and heterologous grafts. Several techniques and variations on the techniques exist to reconstruct the posterior lamella, and, for similar indications, there is no evidence of the primacy of one over another. Defect size and location as well as patient features must guide the oculoplastic surgeon's choice. The use of biomaterials can avoid possible complications of the donor site.

  1. Heating and thermal control of brazing technique to break contamination path for potential Mars sample return

    NASA Astrophysics Data System (ADS)

    Bao, Xiaoqi; Badescu, Mircea; Sherrit, Stewart; Bar-Cohen, Yoseph; Campos, Sergio

    2017-04-01

The potential return of Mars sample material is of great interest to the planetary science community, as it would enable extensive analysis of samples with highly sensitive laboratory instruments. It is important to make sure such a mission concept would not bring any living microbes, which may possibly exist on Mars, back to Earth's environment. In order to ensure the isolation of Mars microbes from Earth's atmosphere, a brazing sealing and sterilizing technique was proposed to break the Mars-to-Earth contamination path. Effectively heating the brazing zone in the high vacuum of space and controlling the sample temperature to preserve sample integrity are key challenges to the implementation of this technique. The break-the-chain procedures for the container configurations under consideration were simulated by multi-physics finite element models. Different heating methods, including induction and resistive/radiation heating, were evaluated. The temperature profiles of Martian samples in a proposed container structure were predicted. The results show that the sealing and sterilizing process can be controlled such that the sample temperature is maintained below the level that may cause damage, and that the brazing technique is a feasible approach to breaking the contamination path.
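The thermal-control question, whether the sample end stays cool while the braze joint is heated, can be illustrated with a one-dimensional explicit finite-difference conduction model. The geometry, diffusivity, temperatures, and boundary conditions below are hypothetical placeholders, not the paper's multi-physics FEM:

```python
import numpy as np

def heat_profile(n=50, length=0.05, alpha=1e-5, t_braze=1000.0,
                 t0=20.0, t_end=30.0):
    """Explicit finite-difference solution of 1D heat conduction
    dT/dt = alpha * d2T/dx2 over a rod of the given length (m), with
    the braze-side end held at the brazing temperature and the far
    (sample) end treated as insulated. alpha is thermal diffusivity
    (m^2/s); t_end is simulated time (s). Returns the temperature
    profile along the rod."""
    dx = length / (n - 1)
    dt = 0.4 * dx * dx / alpha        # stable: dt <= dx^2 / (2*alpha)
    temp = np.full(n, t0)
    for _ in range(int(t_end / dt)):
        temp[0] = t_braze                               # heated braze joint
        temp[1:-1] += alpha * dt / dx ** 2 * np.diff(temp, 2)
        temp[-1] = temp[-2]                             # insulated sample end
    return temp
```

Even in this crude sketch the profile stays monotone, so checking the far-end temperature against a damage threshold after a prescribed braze duration is a one-line test; the real design question is choosing the heating schedule and stand-off length so that margin holds.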

  2. VOLATILE ORGANIC COMPOUNDS (VOC) RECOVERY ...

    EPA Pesticide Factsheets

The purpose of the seminar was to bring researchers, technology developers, and industry representatives together to discuss recovery technologies and techniques for VOCs. The seminar focused on the specific VOC recovery needs of industry and on case studies that summarize effective VOC product recovery techniques applicable to air, water, and solid waste. The case studies highlighted examples in which existing and new recovery technologies resulted in significant cost savings to industry. The seminar focused on the following key issues: the status and future direction of EPA, DOE, and other major research programs; the latest technology innovations in VOC treatment and recovery; the performance and cost effectiveness of VOC recovery techniques; and how recovery techniques are applied to air, water, and solid waste. Presenters were from industry, academia, EPA, and various consulting firms. The presentations were followed by several facilitated breakout sessions; these sessions allowed participants an opportunity to discuss their needs and opinions on VOC recovery trends, research, and other issues. This document contains summaries of the presentations and discussions during the seminar. It does not constitute an actual proceedings, since the presentations were informal and no written versions were required. The list of participants and contact information are included in Appendix A.

  3. Flight test evaluation of predicted light aircraft drag, performance, and stability

    NASA Technical Reports Server (NTRS)

    Smetana, F. O.; Fox, S. R.

    1979-01-01

A technique was developed which permits simultaneous extraction of complete lift, drag, and thrust power curves from time histories of a single aircraft maneuver such as a pullup (from Vmax to Vstall) and pushover (to Vmax for level flight). The technique is an extension to non-linear equations of motion of the parameter identification methods of Iliff and Taylor and includes provisions for internal data compatibility improvement as well. The technique was shown to be capable of correcting random errors in the most sensitive data channel and yielding highly accurate results. This technique was applied to flight data taken on the ATLIT aircraft. The drag and power values obtained from the initial least-squares estimate are about 15% less than the 'true' values. If one takes into account the rather dirty wing and fuselage existing at the time of the tests, however, the predictions are reasonably accurate. The steady-state lift measurements agree well with the extracted values only for small values of alpha. The predicted value of the lift at alpha = 0 is about 33% below that found in steady-state tests, while the predicted lift slope is 13% below the steady-state value.
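A much-simplified version of the extraction idea: once lift and drag coefficient time histories are available from a maneuver, the parabolic drag polar CD = CD0 + k*CL^2 follows from linear least squares. The Iliff-Taylor approach itself handles full nonlinear equations of motion and data-compatibility corrections; this sketch only shows the final polar fit.

```python
import numpy as np

def fit_drag_polar(cl, cd):
    """Least-squares fit of the parabolic drag polar
    CD = CD0 + k * CL^2 from lift/drag coefficient histories.
    Linear in the unknowns (CD0, k), so an ordinary linear
    least-squares solve suffices."""
    A = np.column_stack([np.ones_like(cl), cl ** 2])
    (cd0, k), *_ = np.linalg.lstsq(A, cd, rcond=None)
    return cd0, k
```

Sweeping CL from near-stall to Vmax in a single pullup/pushover is precisely what populates the polar across its whole range in one maneuver.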

  4. DIFFUSION-WEIGHTED IMAGING OF THE LIVER: TECHNIQUES AND APPLICATIONS

    PubMed Central

    Lewis, Sara; Dyvorne, Hadrien; Cui, Yong; Taouli, Bachir

    2014-01-01

Diffusion weighted MRI (DWI) is a technique that assesses the cellularity, tortuosity of the extracellular/extravascular space and cell membrane density based upon differences in water proton mobility in tissues. The strength of the diffusion weighting is reflected by the b-value. DWI using several b-values enables quantification of the apparent diffusion coefficient (ADC). DWI is increasingly employed in liver imaging for multiple reasons: it can add useful qualitative and quantitative information to conventional imaging sequences, it is acquired relatively quickly, it is easily incorporated into existing clinical protocols, and it is a non-contrast technique. DWI is useful for focal liver lesion detection and characterization, for the assessment of post-treatment tumor response and for evaluation of diffuse liver disease. ADC quantification can be used to characterize lesions as cystic/necrotic or solid and for predicting tumor response to therapy. Advanced diffusion methods such as IVIM (intravoxel incoherent motion) may have potential for detection, staging and evaluation of the progression of liver fibrosis and for liver lesion characterization. The lack of standardization of the DWI technique, including the choice of b-values and sequence parameters, has somewhat limited its widespread adoption. PMID:25086935
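The ADC quantification mentioned above follows from the monoexponential decay model S(b) = S0 * exp(-b * ADC); with two b-values it is a one-line calculation. The b-values below are typical liver-protocol choices, not prescriptive:

```python
import math

def adc(s_low, s_high, b_low=0.0, b_high=800.0):
    """Apparent diffusion coefficient from signals at two b-values,
    assuming monoexponential decay S(b) = S0 * exp(-b * ADC).
    Result is in mm^2/s when b is given in s/mm^2."""
    return math.log(s_low / s_high) / (b_high - b_low)
```

With more than two b-values the same model is usually fitted by linear regression of ln S(b) against b, and IVIM extends it with a separate perfusion (pseudodiffusion) compartment at low b-values.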

  5. A rapid ATR-FTIR spectroscopic method for detection of sibutramine adulteration in tea and coffee based on hierarchical cluster and principal component analyses.

    PubMed

    Cebi, Nur; Yilmaz, Mustafa Tahsin; Sagdic, Osman

    2017-08-15

Sibutramine may be illicitly included in herbal slimming foods and supplements marketed as "100% natural" to enhance weight loss. Considering public health and legal regulations, there is an urgent need for effective, rapid and reliable techniques to detect sibutramine in dietetic herbal foods, teas and dietary supplements. This research comprehensively explored, for the first time, detection of sibutramine in green tea, green coffee and mixed herbal tea using the ATR-FTIR spectroscopic technique combined with chemometrics. Hierarchical cluster analysis and principal component analysis (PCA) techniques were employed in the spectral range of 2746-2656 cm(-1) for classification and discrimination through Euclidean distance and Ward's algorithm. Unadulterated and adulterated samples were classified and discriminated with respect to their sibutramine contents with perfect accuracy, without any false prediction. The results suggest that the existence of the active substance could be successfully determined at levels in the range of 0.375-12 mg in a total of 1.75 g of green tea, green coffee and mixed herbal tea by using the FTIR-ATR technique combined with chemometrics. Copyright © 2017 Elsevier Ltd. All rights reserved.
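A sketch of the chemometric pipeline: PCA by SVD on the absorbance matrix, followed by a nearest-class-centroid call in score space as a simplified stand-in for the paper's hierarchical cluster analysis (Euclidean distance, Ward linkage). The synthetic "spectra" and class labels are ours:

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    """Project spectra (rows = samples, columns = wavenumber
    absorbances) onto their leading principal components via SVD
    of the mean-centered data matrix."""
    x = spectra - spectra.mean(axis=0)
    _, _, vt = np.linalg.svd(x, full_matrices=False)
    return x @ vt[:n_components].T

def discriminate(scores, labels, query_score):
    """Assign a query to the class whose centroid in PC-score space
    is nearest (Euclidean distance); a simplified stand-in for
    cutting a hierarchical-clustering dendrogram."""
    centroids = {c: scores[labels == c].mean(axis=0) for c in set(labels)}
    return min(centroids,
               key=lambda c: np.linalg.norm(query_score - centroids[c]))
```

In the paper's setting, class 1 would correspond to sibutramine-spiked samples, whose extra absorbance features in the 2746-2656 cm(-1) window separate along the leading components.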

  6. Advanced Signal Processing for High Temperatures Health Monitoring of Condensed Water Height in Steam Pipes

    NASA Technical Reports Server (NTRS)

    Lih, Shyh-Shiuh; Bar-Cohen, Yoseph; Lee, Hyeong Jae; Takano, Nobuyuki; Bao, Xiaoqi

    2013-01-01

    An advanced signal processing methodology is being developed to monitor the height of condensed water through the wall of a steel pipe operating at temperatures as high as 250 deg. Using existing techniques, a previous study indicated that, when the water height is low or there is disturbance in the environment, the predicted water height may not be accurate. In recent years, the use of autocorrelation and envelope techniques in signal processing has been demonstrated to be a very useful tool for practical applications. In this paper, various signal processing techniques, including autocorrelation, the Hilbert transform, and the Shannon energy envelope method, were studied and implemented to determine the water height in the steam pipe. The results show that the developed method provides a good capability for monitoring the height under regular conditions. For shallow-water or no-water conditions, an alternative solution is suggested: a hybrid method based on the Hilbert transform (HT) with a high-pass filter and an optimized windowing technique. Further development of the reported methods would provide a powerful tool for identifying disturbances of the water height inside the pipe.
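
    The high-pass-plus-Hilbert-envelope step can be sketched with SciPy; the echo model below, a decaying 2 kHz burst riding on a slow 20 Hz disturbance, is a made-up stand-in for a real pipe signal:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 10_000                                    # sampling rate, Hz
t = np.arange(0, 0.1, 1 / fs)
# hypothetical echo: decaying 2 kHz burst plus a 20 Hz disturbance
echo = np.exp(-60 * t) * np.sin(2 * np.pi * 2000 * t) \
       + 0.3 * np.sin(2 * np.pi * 20 * t)

sos = butter(4, 500, btype="highpass", fs=fs, output="sos")
cleaned = sosfiltfilt(sos, echo)               # suppress the low-frequency disturbance
envelope = np.abs(hilbert(cleaned))            # Hilbert-transform envelope
arrival = t[np.argmax(envelope)]               # envelope peak marks the echo arrival
```

    The envelope peak gives the echo arrival time, from which a water height would be derived given the sound speed and pipe geometry.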

  7. A disease management programme for patients with diabetes mellitus is associated with improved quality of care within existing budgets.

    PubMed

    Steuten, L M G; Vrijhoef, H J M; Landewé-Cleuren, S; Schaper, N; Van Merode, G G; Spreeuwenberg, C

    2007-10-01

    To assess the impact of a disease management programme for patients with diabetes mellitus (Type 1 and Type 2) on cost-effectiveness, quality of life and patient self-management. By organizing care in accordance with the principles of disease management, the aim is to increase quality of care within existing budgets. Single-group, pre-post design with 2-year follow-up in 473 patients. Substantial significant improvements in glycaemic control, health-related quality of life (HRQL) and patient self-management were found. No significant changes were detected in total costs of care. The probability that the disease management programme is cost-effective compared with usual care amounts to 74%, expressed as an average saving of 117 per additional life year at 5% improved HRQL. Introduction of a disease management programme for patients with diabetes is associated with improved intermediate outcomes within existing budgets. Further research should focus on long-term cost-effectiveness, including diabetic complications and mortality, in a controlled setting or by using decision-analytic modelling techniques.

  8. Meclofenamic acid blocks the gap junction communication between the retinal pigment epithelial cells.

    PubMed

    Ning, N; Wen, Y; Li, Y; Li, J

    2013-11-01

    Nonsteroidal anti-inflammatory drugs (NSAIDs) are commonly used to manage pain and inflammation. NSAIDs can cause serious side effects, including vision problems. However, the underlying mechanisms are still unclear. Therefore, we aimed to investigate the effect of meclofenamic acid (MFA) on the retinal pigment epithelium (RPE). In our study, we applied image analysis and whole-cell patch clamp recording to directly measure the effect of MFA on the gap junctional coupling between RPE cells. Analysis of Lucifer yellow (LY) transfer revealed that gap junction communication existed between RPE cells. Functional experiments using the whole-cell configuration of the patch clamp technique showed that a gap junction conductance also existed between these cells. Importantly, MFA largely inhibited the gap junction conductance and induced the uncoupling of RPE cells. Other NSAIDs, such as aspirin and flufenamic acid (FFA), had the same effect. Gap junctions are functionally present in RPE cells and can be blocked by MFA. These findings may explain, at least partially, the vision problems associated with certain clinically used NSAIDs.

  9. Concrete Open-Wall Systems Wrapped with FRP under Torsional Loads

    PubMed Central

    Mancusi, Geminiano; Feo, Luciano; Berardi, Valentino P.

    2012-01-01

    The static behavior of reinforced concrete (RC) beams plated with layers of fiber-reinforced composite material (FRP) is widely investigated in the current literature, which deals with both numerical modeling and experiments. Scientific interest in this topic is explained by the increasingly widespread use of composite materials in retrofitting techniques, as well as in the consolidation and upgrading of existing reinforced concrete elements to new service conditions. The effectiveness of these techniques is typically influenced by the debonding of the FRP at the interface with concrete, where the transfer of stresses occurs from one element (RC member) to the other (FRP strengthening). In fact, the activation of the well-known premature failure modes can be regarded as a consequence of high peak values of the interfacial interactions. Until now, typical applications of FRP structural plating have included cases of flexural or shear-flexural strengthening. Within this context, the present study aims at extending the investigation to the case of wall-systems with open cross-section under torsional loads. It includes the results of some numerical analyses carried out by means of a finite element approximation.

  10. Clinical applications of textural analysis in non-small cell lung cancer.

    PubMed

    Phillips, Iain; Ajaz, Mazhar; Ezhil, Veni; Prakash, Vineet; Alobaidli, Sheaka; McQuaid, Sarah J; South, Christopher; Scuffham, James; Nisbet, Andrew; Evans, Philip

    2018-01-01

    Lung cancer is the leading cause of cancer mortality worldwide. Treatment pathways include regular cross-sectional imaging, generating large data sets which present intriguing possibilities for exploitation beyond standard visual interpretation. This additional data mining has been termed "radiomics" and includes semantic and agnostic approaches. Textural analysis (TA) is an example of the latter, and uses a range of mathematically derived features to describe an image or region of an image. Often TA is used to describe a suspected or known tumour. TA is an attractive tool as large existing image sets can be submitted to diverse techniques for data processing, presentation, interpretation and hypothesis testing with annotated clinical outcomes. There is a growing anthology of published data using different TA techniques to differentiate between benign and malignant lung nodules, differentiate tissue subtypes of lung cancer, prognosticate and predict outcome and treatment response, as well as predict treatment side effects and potentially aid radiotherapy planning. The aim of this systematic review is to summarize the current published data and understand the potential future role of TA in managing lung cancer.
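
    As a toy illustration of the agnostic texture features the review discusses, the gray-level co-occurrence contrast below is computed by hand in NumPy; the two 8×8 "images" are synthetic patterns, not CT data:

```python
import numpy as np

def glcm_contrast(img, levels=4):
    """Contrast of the gray-level co-occurrence matrix for horizontal
    neighbour pairs: sum over (i - j)^2 * P(i, j)."""
    q = np.minimum((img * levels).astype(int), levels - 1)  # quantize a [0, 1] image
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    return float(((i - j) ** 2 * glcm).sum())

flat = np.full((8, 8), 0.5)                           # homogeneous "tissue"
checker = (np.indices((8, 8)).sum(axis=0) % 2) * 0.9  # high-frequency texture
```

    A homogeneous region scores zero contrast while a rapidly varying one scores high, which is the kind of per-region feature that TA pipelines feed into outcome models.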

  11. Metal ion reactive thin films using spray electrostatic LbL assembly.

    PubMed

    Krogman, Kevin C; Lyon, Katharine F; Hammond, Paula T

    2008-11-20

    By using the spray-layer-by-layer (Spray-LbL) technique, the number of metal counterions trapped within LbL coatings is significantly increased by kinetically freezing the film short of equilibrium, potentially limiting interchain penetration and forcing chains to remain extrinsically compensated to a much greater degree than observed in the traditional dipped LbL technique. The basis for the enhanced entrapment of metal ions such as Cu2+, Fe2+, and Ag+ is addressed, including the equilibrium driving force for extrinsic compensation by soft versus hard metal ions and the impact of Spray-LbL on the kinetics of polymer-ion complexation. These polymer-bound metal-ion coatings are also demonstrated to be effective treatments for air filtration, functionalizing existing filters with the ability to strongly bind toxic industrial compounds such as ammonia or cyanide gases, as well as chemical warfare agent simulants such as chloroethyl ethyl sulfide. On the basis of results reported here, future work could extend this method to include other toxic soft-base ligands such as carbon monoxide, benzene, or organophosphate nerve agents.

  12. Colloids with high-definition surface structures

    PubMed Central

    Chen, Hsien-Yeh; Rouillard, Jean-Marie; Gulari, Erdogan; Lahann, Joerg

    2007-01-01

    Compared with the well equipped arsenal of surface modification methods for flat surfaces, techniques that are applicable to curved, colloidal surfaces are still in their infancy. This technological gap exists because spin-coating techniques used in traditional photolithographic processes are not applicable to the curved surfaces of spherical objects. By replacing spin-coated photoresist with a vapor-deposited, photodefinable polymer coating, we have now fabricated microstructured colloids with a wide range of surface patterns, including asymmetric and chiral surface structures, that so far were typically reserved for flat substrates. This high-throughput method can yield surface-structured colloidal particles at a rate of ≈10⁷ to 10⁸ particles per operator per day. Equipped with spatially defined binding pockets, microstructured colloids can engage in programmable interactions, which can lead to directed self-assembly. The ability to create a wide range of colloids with both simple and complex surface patterns may contribute to the genesis of previously unknown colloidal structures and may have important technological implications in a range of different applications, including photonic and phononic materials or chemical sensors. PMID:17592149

  13. A radiographic template for a two-implant mandibular overdenture using the patient’s existing denture

    PubMed Central

    Huynh-Ba, G; Alexander, P; Vargas, A; Vierra, M; Oates, TW

    2012-01-01

    This article introduces a technique for modifying an existing mandibular complete denture for use as a radiographic template with a radiopaque light-activated calcium hydroxide (Ca(OH)2) preparation. This allows prosthetically-driven treatment planning and surgical placement of 2 implants to support the existing mandibular denture. PMID:23328197

  14. A frame selective dynamic programming approach for noise robust pitch estimation.

    PubMed

    Yarra, Chiranjeevi; Deshmukh, Om D; Ghosh, Prasanta Kumar

    2018-04-01

    The principles of the existing pitch estimation techniques are often different and complementary in nature. In this work, a frame selective dynamic programming (FSDP) method is proposed which exploits the complementary characteristics of two existing methods, namely, sub-harmonic to harmonic ratio (SHR) and sawtooth-wave inspired pitch estimator (SWIPE). Using variants of SHR and SWIPE, the proposed FSDP method classifies all the voiced frames into two classes: the first class consists of the frames where a confidence score maximization criterion is used for pitch estimation, while for the second class, a dynamic programming (DP) based approach is proposed. Experiments are performed on speech signals separately from KEELE, CSLU, and PaulBaghsaw corpora under clean and additive white Gaussian noise at 20, 10, 5, and 0 dB SNR conditions using four baseline schemes including SHR, SWIPE, and two DP based techniques. The pitch estimation performance of FSDP, when averaged over all SNRs, is found to be better than those of the baseline schemes suggesting the benefit of applying smoothness constraint using DP in selected frames in the proposed FSDP scheme. The VuV classification error from FSDP is also found to be lower than that from all four baseline schemes in almost all SNR conditions on three corpora.
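
    The DP idea can be illustrated with a minimal Viterbi-style smoother over per-frame pitch candidates. The candidate pitches, scores, and jump penalty below are invented; this is a generic sketch, not the authors' FSDP implementation:

```python
import numpy as np

def dp_pitch_track(cand_scores, cand_pitches, jump_penalty=0.1):
    """Viterbi-style DP: choose one pitch candidate per frame, maximizing
    total score minus a penalty on pitch jumps between frames."""
    T, K = cand_scores.shape
    cost = np.empty((T, K))
    back = np.zeros((T, K), dtype=int)
    cost[0] = cand_scores[0]
    for t in range(1, T):
        for k in range(K):
            trans = cost[t - 1] - jump_penalty * np.abs(cand_pitches[t, k] - cand_pitches[t - 1])
            back[t, k] = int(np.argmax(trans))
            cost[t, k] = trans[back[t, k]] + cand_scores[t, k]
    path = [int(np.argmax(cost[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    path.reverse()
    return np.array([cand_pitches[t, k] for t, k in enumerate(path)])

pitches = np.tile([100.0, 200.0], (4, 1))   # two candidates per frame (Hz)
scores = np.array([[1.0, 0.0],
                   [1.0, 0.0],
                   [0.9, 1.0],              # a spurious octave-jump candidate
                   [1.0, 0.0]])
track = dp_pitch_track(scores, pitches)
```

    The jump penalty outweighs the one-frame score gain, so the smoothed track stays at 100 Hz instead of jumping an octave, which is exactly the smoothness constraint the abstract credits for the gains in selected frames.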

  15. Intralesional cryotherapy for hypertrophic scars and keloids: a review

    PubMed Central

    O’Boyle, Ciaran P; Shayan-Arani, Holleh; Hamada, Maha Wagdy

    2017-01-01

    Introduction: Hypertrophic and keloid scarring remain notoriously troublesome for patients to tolerate and frustratingly difficult for clinicians to treat. Many different treatment modalities exist, signifying the failure of any method to achieve consistently excellent results. Intralesional cryotherapy is a relatively recent development that uses a double lumen needle, placed through the core of a keloid or hypertrophic scar, to deliver nitrogen vapour, which freezes the scar from its core, outwards. Methods: This article provides a comprehensive review of the literature on intralesional cryotherapy for hypertrophic scars and keloids. A systematic review or meta-analysis was not possible, since the existing articles did not permit this. Results: A search of English language, peer-reviewed literature was carried out. The evidence base was found to be low (level 4). In addition, much of the published evidence comes from a very few groups. Despite this, consistent findings from case series suggest that the technique is safe and achieves good scar reduction with very few treatments. Adverse effects include depigmentation, recurrence and pain. Pain and recurrence appear to be uncommon and depigmentation may be temporary. Discussion: Well-constructed, prospectively recruited comparative trials are absent from the literature. These are strongly encouraged, in order to strengthen general confidence in this technique and in the repeatability of outcomes reported thus far. PMID:29799581

  16. A Secure Information Framework with APRQ Properties

    NASA Astrophysics Data System (ADS)

    Rupa, Ch.

    2017-08-01

    The Internet of Things is one of the most trending topics in the digital world. Security issues are rampant. In the corporate or institutional setting, security risks are apparent from the outset. Market leaders are unable to use cryptographic techniques due to their complexity. Hence many bits of private information, including IDs, are readily available for third parties to see and to utilize. There is a need to decrease the complexity and increase the robustness of cryptographic approaches. In view of this, a new cryptographic technique, a good encryption pact with adjacency, random-prime-number and quantum-code properties, has been proposed. Here, encryption can be done by using quantum photons with Gray code. This approach uses concepts from physics and mathematics, with no external key exchange, to improve the security of the data. It also reduces key attacks by generating the key on the party's side instead of sharing it. This method makes the security more robust than the existing approach. Important properties of Gray code and quantum encoding are the adjacency property and the mapping of different photons to a single bit (0 or 1). These can reduce the avalanche effect. Cryptanalysis of the proposed method shows that it is resistant to various attacks and stronger than existing approaches.

  17. Detecting and visualizing weak signatures in hyperspectral data

    NASA Astrophysics Data System (ADS)

    MacPherson, Duncan James

    This thesis evaluates existing techniques for detecting weak spectral signatures from remotely sensed hyperspectral data. Algorithms are presented that successfully detect hard-to-find 'mystery' signatures in unknown cluttered backgrounds. The term 'mystery' is used to describe a scenario where the spectral target and background endmembers are unknown. Sub-pixel analysis and background suppression are used to find deeply embedded signatures which can be less than 10% of the total signal strength. Existing 'mystery target' detection algorithms are derived and compared. Several techniques are shown to be superior both visually and quantitatively. Detection performance is evaluated using confidence metrics that are developed. A multiple algorithm approach is shown to improve detection confidence significantly. Although the research focuses on remote sensing applications, the algorithms presented can be applied to a wide variety of diverse fields such as medicine, law enforcement, manufacturing, earth science, food production, and astrophysics. The algorithms are shown to be general and can be applied to both the reflective and emissive parts of the electromagnetic spectrum. The application scope is a broad one and the final results open new opportunities for many specific applications including: land mine detection, pollution and hazardous waste detection, crop abundance calculations, volcanic activity monitoring, detecting diseases in food, automobile or airplane target recognition, cancer detection, mining operations, extracting galactic gas emissions, etc.
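
    A common baseline for this kind of sub-pixel detection with background suppression is the matched filter, sketched below on synthetic data; the target signature, background statistics, and abundance are all invented, and the thesis's own algorithms are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)
bands, n_pix = 50, 2000
# synthetic background clutter and a hypothetical unit-norm target signature
pixels = rng.normal(0.0, 1.0, size=(n_pix, bands))
target = np.ones(bands) / np.sqrt(bands)
pixels[:10] += 2.0 * target                    # embed 10 weak sub-pixel targets

mu = pixels.mean(axis=0)                       # background statistics
cov = np.cov(pixels, rowvar=False)
w = np.linalg.solve(cov, target)               # matched-filter weights C^-1 t
scores = (pixels - mu) @ w / (target @ w)      # normalized detection scores
```

    Whitening by the estimated background covariance is what suppresses the clutter; target pixels score near their embedded abundance while background scores stay near zero.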

  18. Research in the Restricted Problems of Three and Four Bodies Final Scientific Report

    NASA Technical Reports Server (NTRS)

    Richards, Paul B.; Bernstein, Irwin S.; Chai, Winchung A.; Cronin, Jane; Ellis, Jordan; Fine, William E.; Kass, Sheldon; Musa, Samuel A.; Russell, Lawrence H.

    1968-01-01

    Seven studies have been conducted on research in the existence and nature of solutions of the restricted problems of three and four bodies. The details and results of five of these research investigations have already been published, and the latest two studies will be published shortly. A complete bibliography of publications is included in this report. This research has been primarily qualitative and has yielded new information on the behavior of trajectories near the libration points in the Earth-Moon-Sun and Sun-Jupiter-Saturn systems, and on the existence of periodic trajectories about the libration points of the circular and elliptical restricted four-body models. We have also implemented Birkhoff's normalization process for conservative and nonconservative Hamiltonian systems with equilibrium points. This makes available a technique for analyzing stability properties of certain nonlinear dynamical systems, and we have applied this technique to the circular and elliptical restricted three-body models. A related study was also conducted to determine the feasibility of using cislunar periodic trajectories for various space missions. Preliminary results suggest that this concept is attractive for space flight safety operations in cislunar space. Results of this research will be of interest to mathematicians, particularly those working in ordinary differential equations, dynamical systems and celestial mechanics; to astronomers; and to space guidance and mission analysts.

  19. Spectral unmixing of urban land cover using a generic library approach

    NASA Astrophysics Data System (ADS)

    Degerickx, Jeroen; Lordache, Marian-Daniel; Okujeni, Akpona; Hermy, Martin; van der Linden, Sebastian; Somers, Ben

    2016-10-01

    Remote sensing based land cover classification in urban areas generally requires the use of subpixel classification algorithms to take into account the high spatial heterogeneity. These spectral unmixing techniques often rely on spectral libraries, i.e. collections of pure material spectra (endmembers, EM), which ideally cover the large EM variability typically present in urban scenes. Despite the advent of several (semi-) automated EM detection algorithms, the collection of such image-specific libraries remains a tedious and time-consuming task. As an alternative, we suggest the use of a generic urban EM library, containing material spectra under varying conditions, acquired from different locations and sensors. This approach requires an efficient EM selection technique, capable of only selecting those spectra relevant for a specific image. In this paper, we evaluate and compare the potential of different existing library pruning algorithms (Iterative Endmember Selection and MUSIC) using simulated hyperspectral (APEX) data of the Brussels metropolitan area. In addition, we develop a new hybrid EM selection method which is shown to be highly efficient in dealing with both image-specific and generic libraries, subsequently yielding more robust land cover classification results compared to existing methods. Future research will include further optimization of the proposed algorithm and additional tests on both simulated and real hyperspectral data.
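
    Once a (pruned) endmember library is in hand, the unmixing step itself is commonly posed as nonnegative least squares; the library spectra and abundance fractions below are random stand-ins for real urban endmembers:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
# hypothetical pruned library: 4 endmember spectra over 20 bands
library = rng.uniform(0.0, 1.0, size=(20, 4))
true_fractions = np.array([0.6, 0.4, 0.0, 0.0])   # mixed pixel: 60/40 of two materials
pixel = library @ true_fractions

fractions, residual = nnls(library, pixel)         # nonnegative abundance estimate
fractions /= fractions.sum()                       # impose sum-to-one
```

    With a noiseless pixel and a well-conditioned library, NNLS recovers the abundances exactly; with real spectra, the quality of the pruned library drives the result, which is the paper's point.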

  20. Using the Delphi Technique to Support Curriculum Development

    ERIC Educational Resources Information Center

    Sitlington, Helen Barbara; Coetzer, Alan John

    2015-01-01

    Purpose: The purpose of this paper is to present an analysis of the use of the Delphi technique to support curriculum development with a view to enhancing existing literature on use of the technique for renewal of business course curricula. Design/methodology/approach: The authors outline the Delphi process for obtaining consensus amongst a…

  1. Segmentation Techniques for Expanding a Library Instruction Market: Evaluating and Brainstorming.

    ERIC Educational Resources Information Center

    Warren, Rebecca; Hayes, Sherman; Gunter, Donna

    2001-01-01

    Describes a two-part segmentation technique applied to an instruction program for an academic library during a strategic planning process. Discusses a brainstorming technique used to create a list of existing and potential audiences, and then describes a follow-up review session that evaluated the past years' efforts. (Author/LRW)

  2. Techniques for Forecasting Air Passenger Traffic

    NASA Technical Reports Server (NTRS)

    Taneja, N.

    1972-01-01

    The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.

  3. A Survey on Anomaly Based Host Intrusion Detection System

    NASA Astrophysics Data System (ADS)

    Jose, Shijoe; Malathi, D.; Reddy, Bharath; Jayaseeli, Dorathi

    2018-04-01

    An intrusion detection system (IDS) is hardware, software or a combination of the two that monitors network or system activities to detect malicious signs. In computer security, designing a robust intrusion detection system is one of the most fundamental and important problems. The primary function of the system is to detect intrusions and issue alerts in a timely manner when a user attempts an intrusion; when the IDS detects an intrusion, it sends an alert message to the system administrator. Anomaly detection is an important problem that has been researched within diverse research areas and application domains. This survey tries to provide a structured and comprehensive overview of the research on anomaly detection. Each of the existing anomaly detection techniques has relative strengths and weaknesses. The current state of experimental practice in the field of anomaly-based intrusion detection is reviewed, and recent studies are surveyed. This survey provides a study of existing anomaly detection techniques and how the techniques used in one area can be applied in another application domain.
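
    A minimal sketch of the anomaly-detection idea behind such systems is a robust z-score over a host metric; the per-second system-call counts and the threshold below are invented for illustration:

```python
import numpy as np

def anomaly_flags(values, threshold=3.5):
    """Flag observations whose robust z-score (median/MAD based)
    exceeds the threshold."""
    v = np.asarray(values, dtype=float)
    med = np.median(v)
    mad = np.median(np.abs(v - med))
    if mad == 0.0:
        mad = 1.0                          # degenerate case: no spread
    z = 0.6745 * (v - med) / mad
    return np.abs(z) > threshold

# hypothetical per-second system-call counts for one host process
rates = [12, 11, 13, 12, 11, 95, 12]
flags = anomaly_flags(rates)
```

    The burst of 95 calls/s is flagged while normal variation is not; real anomaly-based IDSs build far richer behaviour profiles, but the deviation-from-baseline principle is the same.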

  4. [Aging explosive detection using terahertz time-domain spectroscopy].

    PubMed

    Meng, Kun; Li, Ze-ren; Liu, Qiao

    2011-05-01

    Detecting the aging of stockpiled explosive is essential to research on the performance, security and stability of explosives. Existing aging-explosive detection techniques, such as scanning microscopy, Fourier transform infrared spectroscopy and gas chromatography-mass spectrometry, are either unable to differentiate whether the explosive is aged or unable to image the structural change of the molecule. In the present paper, using density functional theory (DFT), the absorption-spectrum changes after explosive aging were calculated, from which the spectral differences between the explosive molecule and its aged counterparts can clearly be seen in the terahertz band. The terahertz time-domain spectroscopy (THz-TDS) system, as well as its frequency-spectrum resolution and measurement range, is analyzed. Combined with the existing experimental results and the essential characteristics of the terahertz wave, the application of the THz-TDS technique to the detection of aging explosive is demonstrated from the aspects of feasibility, accuracy and practicability. On that basis, the authors propose a new method of aging-explosive detection using the terahertz time-domain spectroscopy technique.

  5. Noise suppression in surface microseismic data

    USGS Publications Warehouse

    Forghani-Arani, Farnoush; Batzle, Mike; Behura, Jyoti; Willis, Mark; Haines, Seth S.; Davidson, Michael

    2012-01-01

    We introduce a passive noise suppression technique, based on the τ − p transform. In the τ − p domain, one can separate microseismic events from surface noise based on distinct characteristics that are not visible in the time-offset domain. By applying the inverse τ − p transform to the separated microseismic event, we suppress the surface noise in the data. Our technique significantly improves the signal-to-noise ratios of the microseismic events and is superior to existing techniques for passive noise suppression in the sense that it preserves the waveform.
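
    The τ − p separation rests on the slant stack, which a minimal discrete implementation makes concrete; the trace geometry and slowness values below are invented, and only the forward transform is sketched, not the authors' full workflow:

```python
import numpy as np

def tau_p(data, offsets, dt, slownesses):
    """Discrete slant stack: u(tau, p) = sum over offsets x of d(tau + p*x, x)."""
    nt, nx = data.shape
    out = np.zeros((nt, len(slownesses)))
    for j, p in enumerate(slownesses):
        for ix, x in enumerate(offsets):
            shift = int(round(p * x / dt))
            if 0 <= shift < nt:
                out[: nt - shift, j] += data[shift:, ix]
    return out

# a linear event with slowness 1e-4 s/m arriving at tau index 10
nt, dt = 100, 0.004
offsets = np.arange(10) * 40.0                # ten traces, 40 m apart
data = np.zeros((nt, len(offsets)))
for ix in range(len(offsets)):
    data[10 + ix, ix] = 1.0                   # moveout of one sample per trace
panel = tau_p(data, offsets, dt, slownesses=[0.0, 1e-4, 2e-4])
```

    The linear event stacks coherently into a single point at its (τ, p) coordinates, while energy with a different slowness smears out, which is what makes events separable from surface noise in this domain.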

  6. Innovative use of technologies and methods to redesign care: the problem of care transitions.

    PubMed

    Richman, Mark; Sklaroff, Laura Myerchin; Hoang, Khathy; Wasson, Elijah; Gross-Schulman, Sandra

    2014-01-01

    Organizations are redesigning models of care in today's rapidly changing health care environment. Using proven innovation techniques maximizes the likelihood of effective change. Our safety-net hospital aims to reduce high emergency department visit, admission, and readmission rates, key components of health care cost control. Twenty-five clinical stakeholders participated in mixed-methods innovation exercises to understand stakeholders, frame problems, and explore solutions. We identified existing barriers and means to improve post-emergency department/post-inpatient discharge care coordination/communication among patient-centered medical home care team members, including patients. Physicians and staff preferred automated e-mail notifications, including patient identifiers, medical home/primary care provider information, and relevant clinical documentation, to improve communication efficiency/efficacy.

  7. The balanced scorecard: an incremental approach model to health care management.

    PubMed

    Pineno, Charles J

    2002-01-01

    The balanced scorecard represents a technique used in strategic management to translate an organization's mission and strategy into a comprehensive set of performance measures that provide the framework for implementation of strategic management. This article develops an incremental approach for decision making by formulating a specific balanced scorecard model with an index of nonfinancial as well as financial measures. The incremental approach to costs, including profit contribution analysis and probabilities, allows decision makers to assess, for example, how their desire to meet different health care needs will cause changes in service design. This incremental approach to the balanced scorecard may prove to be useful in evaluating the existence of causality relationships between different objective and subjective measures to be included within the balanced scorecard.

  8. A method to model latent heat for transient analysis using NASTRAN

    NASA Technical Reports Server (NTRS)

    Harder, R. L.

    1982-01-01

    A sample heat transfer analysis is demonstrated which includes the heat of fusion. The method can be used to analyze a system with nonconstant specific heat. The enthalpy is introduced as an independent degree of freedom at each node. The user input consists of a curve of temperature as a function of enthalpy, which may include a constant temperature phase change. The basic NASTRAN heat transfer capability is used to model the effects of latent heat with existing direct matrix output and nonlinear load data cards. Although some user care is required, the numerical stability of the integration is quite good when the given recommendations are followed. The theoretical equations used and the NASTRAN techniques are shown.

  9. Contribution of concentrator photovoltaic installations to grid stability and power quality

    NASA Astrophysics Data System (ADS)

    del Toro García, Xavier; Roncero-Sánchez, Pedro; Torres, Alfonso Parreño; Vázquez, Javier

    2012-10-01

    Large-scale integration of Photovoltaic (PV) generation systems, including Concentrator Photovoltaic (CPV) technologies, will require the contribution and support of these technologies to the management and stability of the grid. New regulations and grid codes for PV installations in countries such as Spain have recently included dynamic voltage control support during faults. The PV installation must stay connected to the grid during voltage dips and inject reactive power in order to enhance the stability of the system. The existing PV inverter technologies based on the Voltage-Source Converter (VSC) are in general well suited to provide advanced grid-support characteristics. Nevertheless, new advanced control schemes and monitoring techniques will be necessary to meet the most demanding requirements.

  10. Surface scattering plasmon resonance fibre sensors: demonstration of rapid influenza A virus detection

    NASA Astrophysics Data System (ADS)

    François, A.; Boehm, J.; Oh, S. Y.; Kok, T.; Monro, T. M.

    2011-06-01

    The management of threats such as pandemics and explosives, and of health and the environment, requires the rapid deployment of highly sensitive detection tools. Sensors based on Surface Plasmon Resonance (SPR) allow rapid, label-free, highly sensitive detection, and indeed this phenomenon underpins the only label-free optical biosensing technology that is available commercially. In these sensors, the existence of surface plasmons is inferred indirectly from absorption features that correspond to the coupling of light to the surface plasmon. Although SPR is not intrinsically a radiative process, under certain conditions the surface plasmon can itself couple to the local photon states and emit light, as first described by Kretschmann. Here we show that by collecting and characterising this re-emitted light, it is possible to realise new SPR sensing architectures that are more compact, versatile and robust than existing approaches. This approach addresses existing practical limitations associated with current SPR technologies, including bulk, cost and calibration. It is applicable to a range of SPR geometries, including optical fibres, planar waveguides and prism configurations, and is in principle capable of detecting multiple analytes simultaneously. Moreover, this technique allows SPR sensing and fluorescence sensing to be combined into a single platform, which has never been demonstrated before, and consequently the two methods can be used for a more reliable diagnosis. As an example, this approach has been used to demonstrate the rapid detection of the seasonal influenza virus.

  11. FAA center for aviation systems reliability: an overview

    NASA Astrophysics Data System (ADS)

    Brasche, Lisa J. H.

    1996-11-01

    The FAA Center for Aviation Systems Reliability has as its objectives: to develop quantitative nondestructive evaluation (NDE) methods for aircraft structures and materials, including prototype instrumentation, software, techniques and procedures; and to develop and maintain comprehensive education and training programs specific to the inspection of aviation structures. The program, which includes contributions from Iowa State University, Northwestern University, Wayne State University, Tuskegee University, AlliedSignal Propulsion Engines, General Electric Aircraft Engines and Pratt and Whitney, has been in existence since 1990. Efforts under way include: development of inspections for adhesively bonded structures; detection of corrosion; development of advanced NDE concepts that form the basis for an inspection simulator; improvement of titanium inspection as part of the Engine Titanium Consortium; and development of education and training programs. An overview of these efforts is provided, with a focus on the technologies closest to technology transfer.

  12. Physical-level synthesis for digital lab-on-a-chip considering variation, contamination, and defect.

    PubMed

    Liao, Chen; Hu, Shiyan

    2014-03-01

    Microfluidic lab-on-a-chips have been widely utilized in biochemical analysis and human health studies due to high detection accuracy, high timing efficiency, and low cost. The increasing design complexity of lab-on-a-chips necessitates the computer-aided design (CAD) methodology in contrast to the classical manual design methodology. A key part in lab-on-a-chip CAD is physical-level synthesis. It includes the lab-on-a-chip placement and routing, where placement is to determine the physical location and the starting time of each operation and routing is to transport each droplet from the source to the destination. In the lab-on-a-chip design, variation, contamination, and defect need to be considered. This work designs a physical-level synthesis flow which simultaneously considers variation, contamination, and defect of the lab-on-a-chip design. It proposes a maze-routing-based, variation-, contamination-, and defect-aware droplet routing technique, which is seamlessly integrated into an existing placement technique. The proposed technique improves the placement solution for routing and achieves the placement and routing co-optimization to handle variation, contamination, and defect. The simulation results demonstrate that our technique does not use any defective/contaminated grids, while the technique without considering contamination and defect uses 17.0% of the defective/contaminated grids on average. In addition, our routing variation aware technique significantly improves the average routing yield by 51.2% with only 3.5% increase in completion time compared to a routing variation unaware technique.
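    A minimal sketch of the defect-aware routing idea at the heart of such flows: breadth-first maze routing on the electrode grid that simply refuses to visit defective or contaminated cells. This is an illustrative baseline, not the authors' co-optimized technique; the grid dimensions and the `blocked` set are hypothetical.

    ```python
    from collections import deque

    def route_droplet(grid_w, grid_h, src, dst, blocked):
        """Breadth-first maze routing on a digital-microfluidic grid.

        Cells in `blocked` (defective or contaminated electrodes) are
        never visited, mirroring the defect/contamination awareness in
        the abstract. Returns the shortest droplet path, or None if the
        destination is unreachable around the blocked cells.
        """
        frontier = deque([src])
        parent = {src: None}
        while frontier:
            cell = frontier.popleft()
            if cell == dst:
                # Walk parents back to the source to recover the path.
                path = []
                while cell is not None:
                    path.append(cell)
                    cell = parent[cell]
                return path[::-1]
            x, y = cell
            for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                nxt = (nx, ny)
                if 0 <= nx < grid_w and 0 <= ny < grid_h \
                        and nxt not in blocked and nxt not in parent:
                    parent[nxt] = cell
                    frontier.append(nxt)
        return None
    ```

    Because BFS explores cells in order of distance, the first time the destination is popped the path is guaranteed shortest among defect-free routes.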

  13. Redefining the Practice of Peer Review Through Intelligent Automation Part 1: Creation of a Standardized Methodology and Referenceable Database.

    PubMed

    Reiner, Bruce I

    2017-10-01

    Conventional peer review practice is compromised by a number of well-documented biases, which in turn limit standard of care analysis, which is fundamental to determination of medical malpractice. In addition to these intrinsic biases, current peer review suffers from other deficiencies, including lack of standardization and objectivity, its retrospective practice, and lack of automation. An alternative model to address these deficiencies would be one which is completely blinded to the peer reviewer, requires independent reporting from both parties, utilizes automated data mining techniques for neutral and objective report analysis, and provides data reconciliation for resolution of finding-specific report differences. If properly implemented, this peer review model could result in creation of a standardized referenceable peer review database which could further assist in customizable education, technology refinement, and implementation of real-time context and user-specific decision support.

  14. Prethermal Phases of Matter Protected by Time-Translation Symmetry

    NASA Astrophysics Data System (ADS)

    Else, Dominic V.; Bauer, Bela; Nayak, Chetan

    2017-01-01

    In a periodically driven (Floquet) system, there is the possibility for new phases of matter, not present in stationary systems, protected by discrete time-translation symmetry. This includes topological phases protected in part by time-translation symmetry, as well as phases distinguished by the spontaneous breaking of this symmetry, dubbed "Floquet time crystals." We show that such phases of matter can exist in the prethermal regime of periodically driven systems, which exists generically for sufficiently large drive frequency, thereby eliminating the need for integrability or strong quenched disorder, which limited previous constructions. We prove a theorem that states that such a prethermal regime persists until times that are nearly exponentially long in the ratio of certain couplings to the drive frequency. By similar techniques, we can also construct stationary systems that spontaneously break continuous time-translation symmetry. Furthermore, we argue that for driven systems coupled to a cold bath, the prethermal regime could potentially persist to infinite time.

  15. Characterization of Orbital Debris Via Hyper-Velocity Ground-Based Tests

    NASA Technical Reports Server (NTRS)

    Cowardin, Heather

    2015-01-01

    The goal is to replicate a hyper-velocity fragmentation event using modern-day spacecraft materials and construction techniques in order to improve the existing DoD and NASA breakup models. DebriSat is intended to be representative of modern LEO satellites. Major design decisions were reviewed and approved by Aerospace subject matter experts from different disciplines. DebriSat includes seven major subsystems: attitude determination and control system (ADCS), command and data handling (C&DH), electrical power system (EPS), payload, propulsion, telemetry tracking and command (TT&C), and thermal management. To reduce cost, most components are emulated based on existing designs of flight hardware and fabricated with the same materials. A key laboratory-based test, the Satellite Orbital debris Characterization Impact Test (SOCIT), supporting the development of the DoD and NASA satellite breakup models, was conducted at AEDC in 1992. Breakup models based on SOCIT have supported many applications and have matched on-orbit events reasonably well over the years.

  16. The Research Doesn't Always Apply: Practical Solutions to Evidence-Based Training-Load Monitoring in Elite Team Sports.

    PubMed

    Burgess, Darren J

    2017-04-01

    Research describing load-monitoring techniques for team sport is plentiful. Much of this research is conducted retrospectively and typically involves recreational or semielite teams. Load-monitoring research conducted on professional team sports is largely observational. Challenges exist for the practitioner in implementing peer-reviewed research into the applied setting. These challenges include match scheduling, player adherence, manager/coach buy-in, sport traditions, and staff availability. External-load monitoring often attracts questions surrounding technology reliability and validity, while internal-load monitoring makes some assumptions about player adherence, as well as having some uncertainty around the impact these measures have on player performance. This commentary outlines examples of load-monitoring research, discusses the issues associated with the application of this research in an elite team-sport setting, and suggests practical adjustments to the existing research where necessary.

  17. Multilingual Sentiment Analysis: State of the Art and Independent Comparison of Techniques.

    PubMed

    Dashtipour, Kia; Poria, Soujanya; Hussain, Amir; Cambria, Erik; Hawalah, Ahmad Y A; Gelbukh, Alexander; Zhou, Qiang

    With the advent of the Internet, people actively express their opinions about products, services, events, political parties, etc., in social media, blogs, and website comments. The amount of research work on sentiment analysis is growing explosively. However, the majority of research efforts are devoted to English-language data, while a great share of information is available in other languages. We present a state-of-the-art review of multilingual sentiment analysis. More importantly, we compare our own implementations of existing approaches on common data. The precision observed in our experiments is typically lower than that reported by the original authors, which we attribute to the lack of detail in the original presentations of those approaches. Thus, we compare the existing works by what they really offer to the reader, including whether they allow for accurate implementation and reliable reproduction of the reported results.

  18. Max-margin multiattribute learning with low-rank constraint.

    PubMed

    Zhang, Qiang; Chen, Lin; Li, Baoxin

    2014-07-01

    Attribute learning has attracted a lot of interest in recent years for its advantage of being able to model high-level concepts with a compact set of midlevel attributes. Real-world objects often demand multiple attributes for effective modeling. Most existing methods learn attributes independently without explicitly considering their intrinsic relatedness. In this paper, we propose max-margin multiattribute learning with low-rank constraint, which learns a set of attributes simultaneously, using only relative ranking of the attributes for the data. By learning all the attributes simultaneously through low-rank constraint, the proposed method is able to capture their intrinsic correlation for improved learning; by requiring only relative ranking, the method avoids restrictive binary labels of attributes that are often assumed by many existing techniques. The proposed method is evaluated on both synthetic data and real visual data including a challenging video data set. Experimental results demonstrate the effectiveness of the proposed method.

  19. Plant xylem hydraulics: What we understand, current research, and future challenges.

    PubMed

    Venturas, Martin D; Sperry, John S; Hacke, Uwe G

    2017-06-01

    Herein we review the current state-of-the-art of plant hydraulics in the context of plant physiology, ecology, and evolution, focusing on current and future research opportunities. We explain the physics of water transport in plants and the limits of this transport system, highlighting the relationships between xylem structure and function. We describe the great variety of existing techniques for evaluating xylem resistance to cavitation. We address several methodological issues and their connection with current debates on conduit refilling and exponentially shaped vulnerability curves. We analyze the trade-offs existing between water transport safety and efficiency. We also stress how little information is available on molecular biology of cavitation and the potential role of aquaporins in conduit refilling. Finally, we draw attention to how plant hydraulic traits can be used for modeling stomatal responses to environmental variables and climate change, including drought mortality. © 2017 Institute of Botany, Chinese Academy of Sciences.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miall, A.D.

    The basic premise of the recent Exxon cycle chart, that there exists a globally correlatable suite of third-order eustatic cycles, remains unproven. Many of the tests of this premise are based on circular reasoning. The implied precision of the Exxon global cycle chart is not supportable, because it is greater than that of the best available chronostratigraphic techniques, such as those used to construct the global standard time scale. Correlations of new stratigraphic sections with the Exxon chart will almost always succeed, because there are so many Exxon sequence-boundary events from which to choose. This is demonstrated by the use of four synthetic sections constructed from tables of random numbers. A minimum of 77% successful correlations of random events with the Exxon chart was achieved. The existing cycle chart represents an amalgam of regional and local tectonic events and probably also includes unrecognized miscorrelations. It is of questionable value as an independent standard of geologic time.
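    The random-correlation argument is easy to reproduce in miniature: against a sufficiently dense event chart, even purely random "sections" correlate at high rates. A hedged sketch, where the 40-event chart, 100 Myr span, and ±1 Myr tolerance are illustrative assumptions, not Miall's actual numbers:

    ```python
    import random

    def correlation_success_rate(chart_events, test_events, tolerance):
        """Fraction of test events falling within `tolerance` Myr of
        some chart event -- i.e. 'successful' correlations by chance."""
        hits = sum(
            any(abs(t - c) <= tolerance for c in chart_events)
            for t in test_events
        )
        return hits / len(test_events)

    rng = random.Random(42)
    # Hypothetical dense chart: ~40 sequence boundaries over 100 Myr.
    chart = sorted(rng.uniform(0.0, 100.0) for _ in range(40))
    # A 'synthetic section' of 20 purely random events.
    section = [rng.uniform(0.0, 100.0) for _ in range(20)]
    rate = correlation_success_rate(chart, section, tolerance=1.0)
    ```

    With 40 boundaries in 100 Myr and a ±1 Myr match window, roughly half the time axis lies within tolerance of some boundary, so random events "correlate" at rates far above what blind chance would intuitively suggest.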

  1. E-Cigarettes: The Science Behind the Smoke and Mirrors.

    PubMed

    Cobb, Nathan K; Sonti, Rajiv

    2016-08-01

    E-cigarettes are a diverse set of devices that are designed for pulmonary delivery of nicotine through an aerosol, usually consisting of propylene glycol, nicotine, and flavorings. The devices heat the nicotine solution using a battery-powered circuit and deliver the resulting vapor into the proximal airways and lung. Although the current devices on the market appear to be safer than smoking combusted tobacco, they have their own inherent risks, which remain poorly characterized due to widespread product variability. Despite rising use throughout the United States, predominantly by smokers, limited evidence exists for their efficacy in smoking cessation. Pending regulation by the FDA will enforce limited disclosures on the industry but will not directly impact safety or efficacy. Meanwhile, respiratory health practitioners will need to tailor their discussions with patients, taking into account the broad range of existing effective smoking cessation techniques, including pharmaceutical nicotine replacement therapy. Copyright © 2016 by Daedalus Enterprises.

  2. Reachability analysis of real-time systems using time Petri nets.

    PubMed

    Wang, J; Deng, Y; Xu, G

    2000-01-01

    Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
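    The reachability analysis that CS-classes build upon can be illustrated on an ordinary (untimed) Petri net. This sketch enumerates reachable markings by breadth-first search; the clock stamps that make the paper's technique suitable for end-to-end delay analysis are omitted.

    ```python
    from collections import deque

    def reachable_markings(marking, transitions):
        """Enumerate reachable markings of a plain Petri net by BFS.

        `marking` is a tuple of token counts per place; each transition
        is a (consume, produce) pair of per-place token vectors. A
        transition is enabled when every place holds at least the
        tokens it consumes.
        """
        seen = {marking}
        frontier = deque([marking])
        while frontier:
            m = frontier.popleft()
            for consume, produce in transitions:
                if all(m[i] >= consume[i] for i in range(len(m))):
                    nxt = tuple(m[i] - consume[i] + produce[i]
                                for i in range(len(m)))
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append(nxt)
        return seen
    ```

    For a TPN, each state would additionally carry firing-interval (or, in the CS-class approach, clock-stamp) information, so that timing properties such as end-to-end delay can be read off the reachability tree.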

  3. Signal-to-noise ratio estimation using adaptive tuning on the piecewise cubic Hermite interpolation model for images.

    PubMed

    Sim, K S; Yeap, Z X; Tso, C P

    2016-11-01

    An improvement to the existing technique of quantifying signal-to-noise ratio (SNR) of scanning electron microscope (SEM) images using piecewise cubic Hermite interpolation (PCHIP) technique is proposed. The new technique uses an adaptive tuning onto the PCHIP, and is thus named as ATPCHIP. To test its accuracy, 70 images are corrupted with noise and their autocorrelation functions are then plotted. The ATPCHIP technique is applied to estimate the uncorrupted noise-free zero offset point from a corrupted image. Three existing methods, the nearest neighborhood, first order interpolation and original PCHIP, are used to compare with the performance of the proposed ATPCHIP method, with respect to their calculated SNR values. Results show that ATPCHIP is an accurate and reliable method to estimate SNR values from SEM images. SCANNING 38:502-514, 2016. © 2015 Wiley Periodicals, Inc.
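    The core of the PCHIP-based estimate, fitting the autocorrelation at nonzero lags and extrapolating back to lag zero to recover the noise-free peak, can be sketched as follows. The decaying ACF and the noise variance are made-up numbers, and the adaptive-tuning step of ATPCHIP is omitted.

    ```python
    import numpy as np
    from scipy.interpolate import PchipInterpolator

    # Hypothetical autocorrelation of a noisy image: the noise-free
    # signal ACF is 4*exp(-k/3); noise adds variance only at lag 0.
    lags = np.arange(1, 7)
    acf = 4.0 * np.exp(-lags / 3.0)   # noise-free part, lags 1..6
    acf0_noisy = 4.0 + 1.0            # lag-0 value inflated by noise

    # PCHIP through lags 1..6, extrapolated back to lag 0, estimates
    # the uncorrupted zero-offset point; SNR is then the ratio of the
    # estimated signal power to the remaining (noise) power at lag 0.
    est0 = float(PchipInterpolator(lags, acf)(0.0))
    snr_est = est0 / (acf0_noisy - est0)
    ```

    Here the true signal power is 4 and the noise variance is 1 (true SNR = 4); the extrapolated estimate lands close to that, which is the property the nearest-neighborhood and first-order baselines approximate less well.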

  4. Symbolically Modeling Concurrent MCAPI Executions

    NASA Technical Reports Server (NTRS)

    Fischer, Topher; Mercer, Eric; Rungta, Neha

    2011-01-01

    Improper use of Inter-Process Communication (IPC) within concurrent systems often creates data races which can lead to bugs that are challenging to discover. Techniques that use Satisfiability Modulo Theories (SMT) problems to symbolically model possible executions of concurrent software have recently been proposed for use in the formal verification of software. In this work we describe a new technique for modeling executions of concurrent software that use a message passing API called MCAPI. Our technique uses an execution trace to create an SMT problem that symbolically models all possible concurrent executions and follows the same sequence of conditional branch outcomes as the provided execution trace. We check if there exists a satisfying assignment to the SMT problem with respect to specific safety properties. If such an assignment exists, it provides the conditions that lead to the violation of the property. We show how our method models behaviors of MCAPI applications that are ignored in previously published techniques.

  5. BSDWormer; an Open Source Implementation of a Poisson Wavelet Multiscale Analysis for Potential Fields

    NASA Astrophysics Data System (ADS)

    Horowitz, F. G.; Gaede, O.

    2014-12-01

    Wavelet multiscale edge analysis of potential fields (a.k.a. "worms") has been known since Moreau et al. (1997) and was independently derived by Hornby et al. (1999). The technique is useful for producing a scale-explicit overview of the structures beneath a gravity or magnetic survey, including establishing the location and estimating the attitude of surface features, as well as incorporating information about the geometric class (point, line, surface, volume, fractal) of the underlying sources — in a fashion much like traditional structural indices from Euler solutions albeit with better areal coverage. Hornby et al. (2002) show that worms form the locally highest concentration of horizontal edges of a given strike — which in conjunction with the results from Mallat and Zhong (1992) induces a (non-unique!) inversion where the worms are physically interpretable as lateral boundaries in a source distribution that produces a close approximation of the observed potential field. The technique has enjoyed widespread adoption and success in the Australian mineral exploration community — including "ground truth" via successfully drilling structures indicated by the worms. Unfortunately, to our knowledge, all implementations of the code to calculate the worms/multiscale edges (including Horowitz' original research code) are either part of commercial software packages, or have copyright restrictions that impede the use of the technique by the wider community. The technique is completely described mathematically in Hornby et al. (1999) along with some later publications. This enables us to re-implement from scratch the code required to calculate and visualize the worms. We are freely releasing the results under an (open source) BSD two-clause software license. A git repository is available at . 
We will give an overview of the technique, show code snippets using the codebase, and present visualization results for example datasets (including the Surat basin of Australia, and the Lake Ontario region of North America). We invite you to join us in creating and using the best worming software for potential fields in existence — as both gratis and libre software!
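    A toy 1-D version of multiscale edge picking can convey the idea: smooth the field at several scales, then keep local maxima of the horizontal-gradient magnitude at each scale. Gaussian smoothing here is a stand-in for the upward continuation used in real potential-field worming (Hornby et al., 1999), and the scales are arbitrary.

    ```python
    import numpy as np

    def multiscale_edges(profile, scales):
        """Pick multiscale edges ('worms') along a 1-D field profile.

        For each scale, convolve with a normalized Gaussian kernel and
        return the indices of local maxima of |gradient| -- the points
        interpreted as lateral source boundaries at that scale.
        """
        edges = {}
        for s in scales:
            x = np.arange(-3 * s, 3 * s + 1)
            kernel = np.exp(-x**2 / (2.0 * s * s))
            kernel /= kernel.sum()
            smooth = np.convolve(profile, kernel, mode="same")
            grad = np.abs(np.gradient(smooth))
            # Interior local maxima of the gradient magnitude.
            locmax = (grad[1:-1] > grad[:-2]) & (grad[1:-1] >= grad[2:])
            edges[s] = np.nonzero(locmax)[0] + 1
        return edges
    ```

    On a step-like anomaly, every scale picks an edge at the step; tracking how the picked locations migrate with scale is what gives worms their depth/attitude information in the full 2-D technique.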

  6. Percutaneous Management of Accidentally Retained Foreign Bodies During Image-Guided Non-vascular Procedures: Novel Technique Using a Large-Bore Biopsy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cazzato, Roberto Luigi, E-mail: gigicazzato@hotmail.it; Garnon, Julien, E-mail: juleiengarnon@gmail.com; Ramamurthy, Nitin, E-mail: nitin-ramamurthy@hotmail.com

    Objective: To describe a novel percutaneous image-guided technique using a large-bore biopsy system to retrieve foreign bodies (FBs) accidentally retained during non-vascular interventional procedures. Materials and Methods: Between May 2013 and October 2015, five patients underwent percutaneous retrieval of five iatrogenic FBs, including a biopsy needle tip in the femoral head following osteoblastoma biopsy and radiofrequency ablation (RFA); a co-axial needle shaft within a giant desmoid tumour following cryoablation; and three post-vertebroplasty cement tails within paraspinal muscles. All FBs were retrieved immediately following the original procedures under local or general anaesthesia, using combined computed tomography (CT) and fluoroscopic guidance. The basic technique involved positioning a 6G trocar sleeve around the FB long axis and co-axially advancing an 8G biopsy needle to retrieve the FB within the biopsy core. Retrospective chart review facilitated analysis of procedures, FBs, technical success, and complications. Results: Mean FB size was 23 mm (range 8–74 mm). Four FBs were located within 10 mm of significant non-vascular anatomic structures. The basic technique was successful in three cases; two cases required technical modifications, including using a stiff guide-wire to facilitate retrieval of the post-cryoablation FB, and using the central mandrin of the 6G trocar to push a cement tail back into an augmented vertebra when initial retrieval failed. Overall technical success (FB retrieval or removal to a non-hazardous location) was 100%, with no complications. Conclusion: Percutaneous image-guided retrieval of iatrogenic FBs using a large-bore biopsy system is a feasible, safe, effective, and versatile technique, with potential advantages over existing methods.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooker, A.; Gonder, J.; Lopp, S.

    The Automotive Deployment Option Projection Tool (ADOPT) is a light-duty vehicle consumer choice and stock model supported by the U.S. Department of Energy’s Vehicle Technologies Office. It estimates the impacts of technology improvements on U.S. light-duty vehicle sales, petroleum use, and greenhouse gas emissions. ADOPT uses techniques from the multinomial logit and mixed logit methods to estimate sales. Specifically, it estimates sales based on the weighted value of key attributes, including vehicle price, fuel cost, acceleration, range, and usable volume. The average importance of several attributes changes nonlinearly across its range and with income. For several attributes, a distribution of importance around the average value is used to represent consumer heterogeneity. The majority of existing vehicle makes, models, and trims are included to fully represent the market. The Corporate Average Fuel Economy regulations are enforced. The sales feed into the ADOPT stock model, which captures the key aspects of summing petroleum use and greenhouse gas emissions: the change in vehicle miles traveled with vehicle age, the creation of new model options based on the success of existing vehicles, limits on the rate at which new vehicle options are introduced, and survival rates by vehicle age. ADOPT has been extensively validated with historical sales data, which it matches in key dimensions including sales by fuel economy, acceleration, price, vehicle size class, and powertrain across multiple years. A graphical user interface provides easy and efficient use; it manages the inputs, simulation, and results.
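    The multinomial-logit sales step can be sketched as a softmax over weighted attribute utilities. The attribute names and weights below are illustrative stand-ins, not ADOPT's calibrated values (which also vary nonlinearly and with income).

    ```python
    import math

    def logit_shares(vehicles, weights):
        """Multinomial-logit market shares from weighted attributes.

        Utility is a weighted sum of attributes (price, fuel cost, and
        0-60 time enter with negative weights); shares are the softmax
        of the utilities across the choice set.
        """
        utils = [sum(weights[k] * v[k] for k in weights) for v in vehicles]
        exps = [math.exp(u) for u in utils]
        total = sum(exps)
        return [e / total for e in exps]

    # Two hypothetical vehicles: A is pricier but quicker, B cheaper.
    vehicles = [
        {"price_k": 30.0, "fuel_cents_mi": 9.0, "zero_to_60_s": 7.0},
        {"price_k": 25.0, "fuel_cents_mi": 10.0, "zero_to_60_s": 9.0},
    ]
    weights = {"price_k": -0.15, "fuel_cents_mi": -0.10, "zero_to_60_s": -0.20}
    shares = logit_shares(vehicles, weights)
    ```

    Mixed logit extends this by drawing the weights from a distribution per consumer, which is how ADOPT represents heterogeneity in attribute importance.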

  8. Multivariate statistical analysis software technologies for astrophysical research involving large data bases

    NASA Technical Reports Server (NTRS)

    Djorgovski, George

    1993-01-01

    The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.

  9. Multivariate statistical analysis software technologies for astrophysical research involving large data bases

    NASA Technical Reports Server (NTRS)

    Djorgovski, Stanislav

    1992-01-01

    The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.

  10. An element search ant colony technique for solving virtual machine placement problem

    NASA Astrophysics Data System (ADS)

    Srija, J.; Rani John, Rose; Kanaga, Grace Mary, Dr.

    2017-09-01

    The data centres in a cloud environment play a key role in providing infrastructure for ubiquitous, pervasive, and mobile computing. Such services must make the best use of the available resources, so maintaining high resource utilization without wasteful power consumption has become a challenging task for researchers. In this paper we propose a direct-guidance ant colony system for effectively mapping virtual machines to physical machines with maximal resource utilization and minimal power consumption. The proposed algorithm is compared with an existing ant colony approach to the virtual machine placement problem and provides better results than the existing technique.
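    For contrast with the ACO approach, a common deterministic baseline for the same placement problem is greedy first-fit-decreasing. The sketch below is that baseline, not the proposed ant colony system; the two-resource (CPU, memory) model and the demands are illustrative.

    ```python
    def first_fit_decreasing(vms, host_capacity):
        """Greedy VM placement: sort VMs by total demand, place each on
        the first open host with room, and open a new host otherwise.

        `vms` is a list of (cpu, mem) demands; `host_capacity` is the
        (cpu, mem) capacity of every physical machine. Returns the
        (vm_index, host_index) assignment and the number of hosts used,
        which is the quantity ACO-style placement tries to reduce.
        """
        hosts = []       # remaining (cpu, mem) per open host
        assignment = []
        for i, (cpu, mem) in sorted(enumerate(vms),
                                    key=lambda t: -(t[1][0] + t[1][1])):
            for h, (c, m) in enumerate(hosts):
                if cpu <= c and mem <= m:
                    hosts[h] = (c - cpu, m - mem)
                    assignment.append((i, h))
                    break
            else:
                hosts.append((host_capacity[0] - cpu,
                              host_capacity[1] - mem))
                assignment.append((i, len(hosts) - 1))
        return assignment, len(hosts)
    ```

    An ant colony system explores many candidate assignments probabilistically, using pheromone trails to bias future ants toward packings that used fewer hosts than this greedy baseline.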

  11. Techniques for Updating Pedestrian Network Data Including Facilities and Obstructions Information for Transportation of Vulnerable People

    PubMed Central

    Park, Seula; Bang, Yoonsik; Yu, Kiyun

    2015-01-01

    Demand for Pedestrian Navigation Services (PNS) is on the rise. To provide a PNS for the transportation of vulnerable people, more detailed information on pedestrian facilities and obstructions should be included in the Pedestrian Network Data (PND) used for the PNS. Such data can be constructed efficiently by collecting GPS trajectories and integrating them with the existing PND. However, these two kinds of data have geometric differences and topological inconsistencies that need to be addressed. In this paper, we provide a methodology for integrating pedestrian facilities and obstructions information with an existing PND. First, we extracted the significant points from a user-collected GPS trajectory by identifying the geometric difference index and attributes of each point. The extracted points were then used to build an initial matching between the trajectory and the PND. Two geometrical algorithms were proposed and applied to reduce two kinds of errors in the matching: on dual lines and on intersections. Using the final matching, we reconstructed the node/link structure of the PND, including the facilities and obstructions information. Finally, performance was assessed with a test site, and 79.2% of the collected data were correctly integrated with the PND. PMID:26404307

  12. Conceptual design optimization study

    NASA Technical Reports Server (NTRS)

    Hollowell, S. J.; Beeman, E. R., II; Hiyama, R. M.

    1990-01-01

    The feasibility of applying multilevel functional decomposition and optimization techniques to conceptual design of advanced fighter aircraft was investigated. Applying the functional decomposition techniques to the conceptual design phase appears to be feasible. The initial implementation of the modified design process will optimize wing design variables. A hybrid approach, combining functional decomposition techniques for generation of aerodynamic and mass properties linear sensitivity derivatives with existing techniques for sizing mission performance and optimization, is proposed.

  13. Urine sampling techniques in symptomatic primary-care patients: a diagnostic accuracy review.

    PubMed

    Holm, Anne; Aabenhus, Rune

    2016-06-08

    Choice of urine sampling technique in urinary tract infection may impact diagnostic accuracy and thus lead to possible over- or undertreatment. Currently no evidence-based consensus exists regarding correct sampling technique of urine from women with symptoms of urinary tract infection in primary care. The aim of this study was to determine the accuracy of urine culture from different sampling techniques in symptomatic non-pregnant women in primary care. A systematic review was conducted by searching Medline and Embase for clinical studies conducted in primary care using a randomized or paired design to compare the result of urine culture obtained with two or more collection techniques in adult, female, non-pregnant patients with symptoms of urinary tract infection. We evaluated the quality of the studies and compared accuracy based on dichotomized outcomes. We included seven studies investigating urine sampling technique in 1062 symptomatic patients in primary care. Mid-stream-clean-catch had a positive predictive value of 0.79 to 0.95 and a negative predictive value close to 1 compared to sterile techniques. Two randomized controlled trials found no difference in infection rate between mid-stream-clean-catch, mid-stream-urine and random samples. At present, no evidence suggests that sampling technique affects the accuracy of the microbiological diagnosis in non-pregnant women with symptoms of urinary tract infection in primary care. However, the evidence presented is indirect, and the difference between mid-stream-clean-catch, mid-stream-urine and random samples remains to be investigated in a paired design to verify the present findings.

  14. Automated web service composition supporting conditional branch structures

    NASA Astrophysics Data System (ADS)

    Wang, Pengwei; Ding, Zhijun; Jiang, Changjun; Zhou, Mengchu

    2014-01-01

    The creation of value-added services by automatic composition of existing ones is gaining significant momentum as the potential silver bullet in service-oriented architecture. However, service composition faces two aspects of difficulties. First, users' needs present such characteristics as diversity, uncertainty and personalisation; second, the existing services run in a real-world environment that is highly complex and dynamically changing. These difficulties may cause the emergence of nondeterministic choices in the process of service composition, which has gone beyond what the existing automated service composition techniques can handle. According to most of the existing methods, the process model of composite service includes sequence constructs only. This article presents a method to introduce conditional branch structures into the process model of composite service when needed, in order to satisfy users' diverse and personalised needs and adapt to the dynamic changes of the real-world environment. UML activity diagrams are used to represent dependencies in composite service. Two types of user preferences are considered in this article, which have been ignored by previous work, and a simple programming-language-style expression is adopted to describe them. Two different algorithms are presented to deal with different situations. A real-life case is provided to illustrate the proposed concepts and methods.

  15. Transitional properties of supersolitons in a two electron temperature warm multi-ion plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Varghese, Steffy S., E-mail: steffy13@iigs.iigm.res.in; Ghosh, S. S., E-mail: sukti@iigs.iigm.res.in

    The existence domain of an ion acoustic supersoliton and its transition to a regular kind of solitary wave have been explored in detail using the Sagdeev pseudopotential technique for a two electron temperature warm multi-ion plasma having two species of ions. It was found that both the cold to hot electron temperature ratio and their respective ambient densities play a deterministic role for the existence of a supersoliton, as well as its transitional processes to a regular solitary wave. Analogous to a double layer solution, which often marks the boundary of the existence domain of a regular solitary wave, a “curve of inflection” determines the boundary of the existence domain of a supersoliton. The characteristics of the “curve of inflection,” in turn, depend on the respective concentrations of the two ion species. It is observed that the supersolitons are actually a subset of a more general kind of solutions which are characterized by a fluctuation in the corresponding charge separation which precedes their maximum amplitude. It is also observed that these novel kinds of solitary structures, including supersolitons, occur only for a very narrow range of parameters near constant amplitude beyond which the wave breaks.
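    The Sagdeev pseudopotential technique reduces the fluid equations to an energy-integral form, (1/2)(dφ/dx)² + V(φ) = 0, and reads soliton existence off the shape of V. A sketch for the textbook cold-ion, Boltzmann-electron case (far simpler than the paper's warm multi-ion, two-electron-temperature model, where the extra species reshape V and permit supersolitons):

```python
import numpy as np

def sagdeev_potential(phi, M):
    """Sagdeev pseudopotential for cold-ion, Boltzmann-electron
    ion-acoustic waves (normalized units):
        V(phi) = 1 - exp(phi) + M^2 * (1 - sqrt(1 - 2*phi/M^2))
    A soliton of amplitude phi_m exists when V(0) = V'(0) = 0 and
    V(phi) < 0 for 0 < phi < phi_m."""
    return 1.0 - np.exp(phi) + M**2 * (1.0 - np.sqrt(1.0 - 2.0 * phi / M**2))

M = 1.3  # Mach number; must exceed 1 for an ion-acoustic soliton
phi = np.linspace(0.0, 0.5, 1000)
V = sagdeev_potential(phi, M)
print("V(0) =", V[0], " min V =", V.min())  # V(0) = 0, min V < 0
```

In the multi-species case studied in the paper, V develops extra local extrema between 0 and the amplitude; the "curve of inflection" marks where those extrema merge and the supersoliton turns into a regular soliton.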

  16. Surface-geophysical techniques used to detect existing and infilled scour holes near bridge piers

    USGS Publications Warehouse

    Placzek, Gary; Haeni, F.P.

    1995-01-01

    Surface-geophysical techniques were used with a position-recording system to study riverbed scour near bridge piers from May 1989 to May 1993. Fathometers, fixed- and swept-frequency continuous seismic-reflection profiling (CSP) systems, and a ground-penetrating radar (GPR) system were used with a laser-positioning system to measure the depth and extent of existing and infilled scour holes near bridge piers. Equipment was purchased commercially and modified when necessary to interface the components and (or) to improve their performance. Three 200-kHz black-and-white chart-recording Fathometers produced profiles of the riverbed that included existing scour holes and exposed pier footings. The Fathometers were used in conjunction with other geophysical techniques to help interpret the geophysical data. A 20-kHz color Fathometer delineated scour-hole geometry and, in some cases, the thickness of fill material in the hole. The signal provided subbottom information as deep as 10 ft in fine-grained materials and resolved layers of fill material as thin as 1 ft. Fixed-frequency and swept-frequency CSP systems were evaluated. The fixed-frequency system used a 3.5-, 7.0-, or 14-kHz signal. The 3.5-kHz signal penetrated up to 50 ft of fine-grained material and resolved layers as thin as 2.5 ft. The 14-kHz signal penetrated up to 20 ft of fine-grained material and resolved layers as thin as 1 ft. The swept-frequency system used a signal that swept from 2 to 16 kHz. With this system, up to 50 ft of penetration was achieved, and fill material as thin as 1 ft was resolved. Scour-hole geometry, exposed pier footings, and fill thickness in scour holes were detected with both CSP systems. The GPR system used an 80-, 100-, or 300-MHz signal. The technique produced records in water up to 15 ft deep that had a specific conductance less than 200 µS/cm. The 100-MHz signal penetrated up to 40 ft of resistive granular material and resolved layers as thin as 2 ft. Scour-hole geometry, the thickness of fill material in scour holes, and riverbed deposition were detected using this technique. Processing techniques were applied after data collection to assist with the interpretation of the data. Data were transferred from the color Fathometer, CSP, and GPR systems to a personal computer, and a commercially available software package designed to process GPR data was used to process the GPR and CSP data. Digital filtering, predictive-deconvolution, and migration algorithms were applied to some of the data. The processed data were displayed and printed as color amplitude or wiggle-trace plots. These processing methods eased and improved the interpretation of some of the data, but some interference from side echoes from bridge piers and multiple reflections remained in the data. The surface-geophysical techniques were applied at six bridge sites in Connecticut. Each site had different water depths, specific conductance, and riverbed materials. Existing and infilled scour holes, exposed pier footings, and riverbed deposition were detected by the surveys. The interpretations of the geophysical data were confirmed by comparing the data with lithologic and (or) probing data.
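    The basic reduction behind both the Fathometer and GPR profiles described above is converting two-way travel time to depth. A minimal sketch; the velocities are illustrative round numbers, not values from the report:

```python
def depth_from_travel_time(t_seconds, velocity):
    """Depth (in the velocity's length unit) from two-way travel time:
    the pulse travels down and back, so depth = v * t / 2."""
    return velocity * t_seconds / 2.0

V_WATER_SOUND = 4800.0       # ft/s, rough speed of sound in fresh water (~1460 m/s)
V_WATER_RADAR = 3.3e7 / 0.3048  # ft/s, rough radar velocity in fresh water (~0.033 m/ns)

# A fathometer echo returning 5 ms after the pulse:
print(depth_from_travel_time(0.005, V_WATER_SOUND), "ft")  # 12.0 ft
```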

  17. A formulation of convection for stellar structure and evolution calculations without the mixing-length theory approximations. I - Application to the sun

    NASA Technical Reports Server (NTRS)

    Lydon, Thomas J.; Fox, Peter A.; Sofia, Sabatino

    1992-01-01

    The problem of treating convective energy transport without mixing-length theory (MLT) approximations is approached here by formulating the results of numerical simulations of convection in terms of energy fluxes. This revised treatment of convective transport can be easily incorporated within existing stellar structure codes. As an example, the technique is applied to the sun. The treatment does not include any free parameters, making the models extremely sensitive to the accuracy of the opacities, the chemical abundances, the treatment of the solar atmosphere, and the equation of state.

  18. Analysis of high aspect ratio jet flap wings of arbitrary geometry.

    NASA Technical Reports Server (NTRS)

    Lissaman, P. B. S.

    1973-01-01

    Paper presents a design technique for rapidly computing lift, induced drag, and spanwise loading of unswept jet flap wings of arbitrary thickness, chord, twist, blowing, and jet angle, including discontinuities. Linear theory is used, extending Spence's method for elliptically loaded jet flap wings. Curves for uniformly blown rectangular wings are presented for direct performance estimation. Arbitrary planforms require a simple computer program. Method of reducing wing to equivalent stretched, twisted, unblown planform for hand calculation is also given. Results correlate with limited existing data, and show lifting line theory is reasonable down to aspect ratios of 5.
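    The abstract's closing comparison rests on classical lifting-line theory. A minimal sketch of the unblown elliptic-loading result that theory reduces to, not of Spence's jet-flap extension developed in the paper:

```python
import math

def induced_drag_coefficient(CL, aspect_ratio, e=1.0):
    """Classical lifting-line induced drag: C_Di = CL^2 / (pi * AR * e).
    e = 1 corresponds to elliptic spanwise loading; blowing and arbitrary
    planform effects (the paper's subject) are not modeled here."""
    return CL**2 / (math.pi * aspect_ratio * e)

# Unblown rectangular wing at AR = 5, the lower limit the abstract cites
print(round(induced_drag_coefficient(CL=1.0, aspect_ratio=5.0), 4))  # 0.0637
```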

  19. Multivariate Analysis, Retrieval, and Storage System (MARS). Volume 1: MARS System and Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Vanderberg, J. D.; Woodbury, N. W.

    1974-01-01

    A method for rapidly examining the probable applicability of weight estimating formulae to a specific aerospace vehicle design is presented. The Multivariate Analysis Retrieval and Storage System (MARS) comprises three computer programs which sequentially operate on the weight and geometry characteristics of past aerospace vehicle designs. Weight and geometric characteristics are stored in a set of data bases which are fully computerized. Additional data bases are readily added to the MARS system, and/or the existing data bases may be easily expanded to include additional vehicles or vehicle characteristics.
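    Weight estimating formulae of this kind are typically power laws fit to historical designs by multivariate regression in log space. A hypothetical sketch of that fitting step, with fabricated round-number data rather than MARS data base contents:

```python
import numpy as np

# Fabricated historical records: wing weight vs. gross weight and span
gross_weight = np.array([20e3, 50e3, 100e3, 200e3, 350e3])       # lb
span         = np.array([60.0, 90.0, 120.0, 160.0, 200.0])       # ft
wing_weight  = np.array([2.1e3, 5.8e3, 12.5e3, 27.0e3, 50.0e3])  # lb

# Fit a weight estimating formula W = a * GW^b * span^c by linear
# least squares on log-transformed data, the usual multivariate approach.
X = np.column_stack([np.ones_like(gross_weight),
                     np.log(gross_weight), np.log(span)])
coef, *_ = np.linalg.lstsq(X, np.log(wing_weight), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]

predicted = a * gross_weight**b * span**c
print("exponents b, c:", round(b, 2), round(c, 2))
```

Checking such a fitted formula against a new design's characteristics is what "examining the probable applicability" amounts to in practice: large residuals flag designs outside the data base's envelope.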

  20. An exact solution of a simplified two-phase plume model. [for solid propellant rocket

    NASA Technical Reports Server (NTRS)

    Wang, S.-Y.; Roberts, B. B.

    1974-01-01

    An exact solution of a simplified two-phase, gas-particle, rocket exhaust plume model is presented. It may be used to make an upper-bound estimate of the heat flux and pressure loads due to particle impingement on objects in the rocket exhaust plume. By including correction factors to be determined experimentally, the present technique will provide realistic data concerning the heat and aerodynamic loads on these objects for design purposes. Excellent agreement in trend between the best available computer solution and the present exact solution is shown.
