CMMI® for Acquisition, Version 1.3 (CMMI-ACQ, V1.3)
2010-11-01
and Software Engineering – System Life Cycle Processes [ISO 2008b]; ISO/IEC 27001:2005 Information technology – Security techniques – Information...International Organization for Standardization and International Electrotechnical Commission. ISO/IEC 27001 Information Technology – Security Techniques...International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) body of standards. CMMs focus on improving processes
Overlay metrology for double patterning processes
NASA Astrophysics Data System (ADS)
Leray, Philippe; Cheng, Shaunee; Laidler, David; Kandel, Daniel; Adel, Mike; Dinu, Berta; Polli, Marco; Vasconi, Mauro; Salski, Bartlomiej
2009-03-01
The double patterning (DPT) process is foreseen by the industry to be the main solution for the 32 nm technology node and even beyond. Meanwhile, process compatibility has to be maintained and the performance of overlay metrology has to improve. To achieve this for Image Based Overlay (IBO), usually the optics of overlay tools are improved. It was also demonstrated that these requirements are achievable with a Diffraction Based Overlay (DBO) technique named SCOL™ [1]. In addition, we believe that overlay measurements with respect to a reference grid are required to achieve the required overlay control [2]. This induces at least a three-fold increase in the number of measurements (two from the double patterned layers to the reference grid and one between the double patterned layers). The requirements of process compatibility, enhanced performance, and a large number of measurements make the choice of overlay metrology for DPT very challenging. In this work we use different flavors of the standard overlay metrology technique (IBO) as well as the new technique (SCOL) to address these three requirements. The compatibility of the corresponding overlay targets with double patterning processes (Litho-Etch-Litho-Etch (LELE), Litho-Freeze-Litho-Etch (LFLE), spacer defined) is tested. The process impact on different target types is discussed (CD bias for LELE, contrast for LFLE). We compare the standard imaging overlay metrology with non-standard imaging techniques dedicated to double patterning processes (multilayer imaging targets allowing one overlay target instead of three, very small imaging targets). In addition to standard designs already discussed [1], we investigate SCOL target designs specific to double patterning processes. The feedback to the scanner is determined using the different techniques. The final overlay results obtained are compared accordingly. We conclude with the pros and cons of each technique and suggest the optimal metrology strategy for overlay control in double patterning processes.
48 CFR 9904.414-50 - Techniques for application.
Code of Federal Regulations, 2010 CFR
2010-10-01
... the case of process cost accounting systems, the contracting parties may agree to substitute an.... 9904.414-50 Section 9904.414-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.414-50 Techniques for application. (a) The investment...
Common Workflow Service: Standards Based Solution for Managing Operational Processes
NASA Astrophysics Data System (ADS)
Tinio, A. W.; Hollins, G. A.
2017-06-01
The Common Workflow Service is a collaborative and standards-based solution for managing mission operations processes using techniques from the Business Process Management (BPM) discipline. This presentation describes the CWS and its benefits.
Fosness, Ryan L.; Dietsch, Benjamin J.
2015-10-21
This report presents the surveying techniques and data-processing methods used to collect, process, and disseminate topographic and hydrographic data. All standard and non-standard data-collection methods, techniques, and data-processing methods were documented. Additional discussion describes the quality-assurance and quality-control elements used in this study, along with the limitations for the Torrinha-Itacoatiara study reach data. The topographic and hydrographic geospatial data are published along with associated metadata.
Ar+ and CuBr laser-assisted chemical bleaching of teeth: estimation of whiteness degree
NASA Astrophysics Data System (ADS)
Dimitrov, S.; Todorovska, Roumyana; Gizbrecht, Alexander I.; Raychev, L.; Petrov, Lyubomir P.
2003-11-01
In this work, the results of adapting objective methods of color determination to develop techniques for estimating the whiteness degree of human teeth, sufficiently handy for common use in clinical practice, are presented. To test and illustrate the techniques, standards of teeth colors were used, as well as model and naturally discolored human teeth treated with two bleaching chemical compositions, each activated by three light sources: Ar+ and CuBr lasers and a standard halogen photopolymerization lamp. Typical reflection and fluorescence spectra of some samples are presented; the sample colors were estimated by standard computer processing in RGB and B coordinates. The results of the applied spectral and colorimetric techniques are in good agreement with those of the standard computer processing of the corresponding digital photographs and comply with the visually estimated degree of teeth whiteness judged against the standard reference scale commonly used in aesthetic dentistry.
Symmetric Phase Only Filtering for Improved DPIV Data Processing
NASA Technical Reports Server (NTRS)
Wernet, Mark P.
2006-01-01
The standard approach in Digital Particle Image Velocimetry (DPIV) data processing is to use Fast Fourier Transforms to obtain the cross-correlation of two single-exposure subregions, where the location of the cross-correlation peak is representative of the most probable particle displacement across the subregion. This standard DPIV processing technique is analogous to Matched Spatial Filtering, a technique commonly used in optical correlators to perform the cross-correlation operation. Phase-only filtering is a well-known variation of Matched Spatial Filtering which, when used to process DPIV image data, yields correlation peaks that are narrower and up to an order of magnitude larger than those obtained using traditional DPIV processing. In addition to possessing desirable correlation plane features, phase-only filters also provide superior performance in the presence of DC noise in the correlation subregion. When DPIV image subregions contaminated with surface flare light or high background noise levels are processed using phase-only filters, the correlation peak pertaining only to the particle displacement is readily detected above any signal stemming from the DC objects. Tedious image masking or background image subtraction is not required. Both theoretical and experimental analyses of the signal-to-noise ratio performance of the filter functions are presented. In addition, a new Symmetric Phase Only Filtering (SPOF) technique, which is a variation on the traditional phase-only filtering technique, is described and demonstrated. The SPOF technique exceeds the performance of the traditionally accepted phase-only filtering techniques and is easily implemented in standard DPIV FFT-based correlation processing with no significant computational performance penalty. An "automatic" SPOF algorithm is presented which determines when the SPOF is able to provide better signal-to-noise results than traditional PIV processing. The SPOF-based optical correlation processing approach is presented as a new paradigm for more robust cross-correlation processing of low signal-to-noise ratio DPIV image data.
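For readers unfamiliar with the filter variants, the sketch below illustrates the idea in NumPy under simplifying assumptions: standard matched-filter correlation, phase-only filtering, and a symmetric variant that weights the cross-spectrum by the geometric mean of the two magnitude spectra. Function and parameter names are illustrative, not taken from the paper.

```python
import numpy as np

def dpiv_correlate(a, b, mode="standard", eps=1e-12):
    # FFT-based cross-correlation of two single-exposure subregions.
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    spectrum = Fa * np.conj(Fb)
    if mode == "pof":       # phase-only: discard all magnitude information
        spectrum = spectrum / (np.abs(spectrum) + eps)
    elif mode == "spof":    # symmetric variant: geometric mean of magnitudes
        spectrum = spectrum / (np.sqrt(np.abs(Fa) * np.abs(Fb)) + eps)
    corr = np.real(np.fft.ifft2(spectrum))
    return np.fft.fftshift(corr)
```

In use, the most probable displacement is the offset of the peak (via np.argmax and np.unravel_index) from the centre of the returned correlation plane.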
Code of Federal Regulations, 2010 CFR
2010-07-01
... being used will be based on information available to the Administrator, which may include, but is not... techniques, or the control system and process monitoring equipment during a malfunction in a manner... the process and control system monitoring equipment, and shall include a standardized checklist to...
Code of Federal Regulations, 2011 CFR
2011-07-01
... being used will be based on information available to the Administrator, which may include, but is not... techniques, or the control system and process monitoring equipment during a malfunction in a manner... the process and control system monitoring equipment, and shall include a standardized checklist to...
Design techniques for low-voltage analog integrated circuits
NASA Astrophysics Data System (ADS)
Rakús, Matej; Stopjaková, Viera; Arbet, Daniel
2017-08-01
In this paper, a review and analysis of different design techniques for (ultra-)low-voltage integrated circuits (ICs) are performed. This analysis shows that the most suitable design methods for low-voltage analog IC design in a standard CMOS process include techniques using bulk-driven MOS transistors, dynamic threshold MOS transistors, and MOS transistors operating in the weak or moderate inversion regions. The main advantage of such techniques is that there is no need for any modification of the standard CMOS structure or process. Basic circuit building blocks like differential amplifiers or current mirrors designed using these approaches are able to operate with a power supply voltage of 600 mV (or even lower), which is a key feature for integrated systems in modern portable applications.
Flat-plate solar array project process development area: Process research of non-CZ silicon material
NASA Technical Reports Server (NTRS)
Campbell, R. B.
1986-01-01
Several different techniques to simultaneously diffuse the front and back junctions in dendritic web silicon were investigated. A successful simultaneous diffusion reduces the cost of the solar cell by reducing the number of processing steps, the amount of capital equipment, and the labor cost. The three techniques studied were: (1) simultaneous diffusion at standard temperatures and times using a tube-type diffusion furnace or a belt furnace; (2) diffusion using excimer laser drive-in; and (3) simultaneous diffusion at high temperature and short times using a pulse of high-intensity light as the heat source. The excimer laser drive-in and the high-temperature, short-time diffusion were both more successful than diffusion at standard temperatures and times. The three techniques are described in detail and a cost analysis of the more successful techniques is provided.
AWS breaks new ground with soldering specification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vianco, Paul Thomas
Joining technologies continue to advance with new materials, process innovations, and inspection techniques. An increasing number of high-valued, high-reliability applications -- from boilers and ship hulls to rocket motors and medical devices -- have required the development of industry standards and specifications in order to ensure that the best design and manufacturing practices are being used to produce safe, durable products and assemblies. Standards writing has always had an important role at the American Welding Society (AWS). The AWS standards and specifications cover such topics as filler materials, joining processes, inspection techniques, and qualification methods that are used in welding and brazing technologies. These AWS standards and specifications, all of which are approved by the American National Standards Institute (ANSI), have also provided the basis for many similar documents used in Europe and in Pacific Rim countries.
Method for formation of thin film transistors on plastic substrates
Carey, Paul G.; Smith, Patrick M.; Sigmon, Thomas W.; Aceves, Randy C.
1998-10-06
A process for formation of thin film transistors (TFTs) on plastic substrates replaces standard thin film transistor fabrication techniques, and uses sufficiently lower processing temperatures so that inexpensive plastic substrates may be used in place of standard glass, quartz, and silicon wafer-based substrates. The process relies on techniques for depositing semiconductors, dielectrics, and metals at low temperatures; crystallizing and doping semiconductor layers in the TFT with a pulsed energy source; and creating top-gate self-aligned as well as back-gate TFT structures. The process enables the fabrication of amorphous and polycrystalline channel silicon TFTs at temperatures sufficiently low to prevent damage to plastic substrates. The process has use in large area low cost electronics, such as flat panel displays and portable electronics.
Ebola Virus and Marburg Virus in Human Milk Are Inactivated by Holder Pasteurization.
Hamilton Spence, Erin; Huff, Monica; Shattuck, Karen; Vickers, Amy; Yun, Nadezda; Paessler, Slobodan
2017-05-01
Potential donors of human milk are screened for Ebola virus (EBOV) using standard questions, but testing for EBOV and Marburg virus (MARV) is not part of routine serological testing performed by milk banks. Research aim: This study tested the hypothesis that EBOV would be inactivated in donor human milk (DHM) by standard pasteurization techniques (Holder) used in all North American nonprofit milk banks. Milk samples were obtained from a nonprofit milk bank. They were inoculated with EBOV (Zaire strain) and MARV (Angola strain) and processed by standard Holder pasteurization technique. Plaque assays for EBOV and MARV were performed to detect the presence of virus after pasteurization. Neither EBOV nor MARV was detectable by viral plaque assay in DHM or culture media samples, which were pasteurized by the Holder process. EBOV and MARV are safely inactivated in human milk by standard Holder pasteurization technique. Screening for EBOV or MARV beyond questionnaire and self-deferral is not needed to ensure safety of DHM for high-risk infants.
Bahadori, Mohammadkarim; Ravangard, Ramin; Yaghoubi, Maryam; Alimohammadzadeh, Khalil
2014-01-01
Background: Military hospitals are responsible for preserving, restoring, and improving the health of not only armed forces but also other people. According to the military organizations' strategy of being a leader and pioneer in all areas, providing quality health services is one of the main goals of military health care organizations. This study aimed to evaluate the service quality of selected military hospitals in Iran based on the Joint Commission International (JCI) standards, comparing these hospitals with each other and ranking them using the analytic hierarchy process (AHP) technique in 2013. Materials and Methods: This was a cross-sectional and descriptive study conducted on five military hospitals, selected using the purposive sampling method, in 2013. Required data were collected using checklists of accreditation standards and the nominal group technique. The AHP technique was used for prioritizing. Furthermore, Expert Choice 11.0 was used to analyze the collected data. Results: Among the JCI standards, the standards of access to care and continuity of care (weight = 0.122), quality improvement and patient safety (weight = 0.121), and leadership and management (weight = 0.117) had the greatest importance, respectively. Furthermore, in the overall ranking, BGT (weight = 0.369), IHM (0.238), SAU (0.202), IHK (weight = 0.125), and SAB (weight = 0.066) ranked first to fifth, respectively. Conclusion: AHP is an appropriate technique for measuring the overall performance of hospitals and their quality of services. It is a holistic approach that takes all hospital processes into consideration. The results of the present study can be used to improve hospital performance by identifying areas in need of focus for quality improvement and by selecting strategies to improve service quality. PMID:25250364
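As a sketch of the prioritization step, the snippet below computes AHP priority weights as the principal eigenvector of a pairwise-comparison matrix (Saaty's eigenvector method); the 3x3 matrix is hypothetical and merely stands in for the study's Expert Choice analysis.

```python
import numpy as np

def ahp_weights(pairwise):
    # Priority weights = principal eigenvector of the pairwise-comparison
    # matrix, normalized to sum to one.
    vals, vecs = np.linalg.eig(pairwise)
    w = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
    return w / w.sum()

# Hypothetical comparison of three JCI standard areas (values illustrative)
A = np.array([[1.0, 2.0, 3.0],
              [0.5, 1.0, 2.0],
              [1.0 / 3.0, 0.5, 1.0]])
print(ahp_weights(A))  # roughly [0.54, 0.30, 0.16]
```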
NASA Astrophysics Data System (ADS)
Pasetti Monizza, G.; Matt, D. T.; Benedetti, C.
2016-11-01
According to the Wortmann classification, the Building Industry (BI) can be defined as an engineer-to-order (ETO) industry: the engineering process starts only when an order is acquired. This definition implies that every final product (building) is almost unique, and processes cannot be easily standardized or automated. Because of this, BI is one of the least efficient industries today, still largely led by craftsmanship. In recent years, several improvements in process efficiency have been made, focusing on manufacturing and installation processes only. In order to improve the efficiency of design and engineering processes as well, the scientific community agrees that the most fruitful strategy should be Front-End Design (FED). Nevertheless, effective techniques and tools are missing. This paper discusses outcomes of a research activity that aims at highlighting whether Parametric and Generative Design techniques allow reducing wastes of resources and improving the overall efficiency of the BI by pushing the digitalization of design and engineering processes of products. Focusing on the glued-laminated-timber industry, the authors show how Parametric and Generative Design techniques can be introduced in a standard supply-chain system, highlighting potentials and criticisms of the supply-chain system as a whole.
Toward energy harvesting using active materials and conversion improvement by nonlinear processing.
Guyomar, Daniel; Badel, Adrien; Lefeuvre, Elie; Richard, Claude
2005-04-01
This paper presents a new technique of electrical energy generation using mechanically excited piezoelectric materials and a nonlinear process. This technique, called synchronized switch harvesting (SSH), is derived from synchronized switch damping (SSD), a nonlinear technique previously developed to address the problem of vibration damping on mechanical structures. This technique results in a significant increase in the electromechanical conversion capability of piezoelectric materials. Compared with the standard technique, the harvested electrical power may be increased by more than 900%. The performance of the nonlinear processing is demonstrated on structures excited at their resonance frequency as well as out of resonance.
Method for formation of thin film transistors on plastic substrates
Carey, P.G.; Smith, P.M.; Sigmon, T.W.; Aceves, R.C.
1998-10-06
A process for formation of thin film transistors (TFTs) on plastic substrates replaces standard thin film transistor fabrication techniques, and uses sufficiently lower processing temperatures so that inexpensive plastic substrates may be used in place of standard glass, quartz, and silicon wafer-based substrates. The process relies on techniques for depositing semiconductors, dielectrics, and metals at low temperatures; crystallizing and doping semiconductor layers in the TFT with a pulsed energy source; and creating top-gate self-aligned as well as back-gate TFT structures. The process enables the fabrication of amorphous and polycrystalline channel silicon TFTs at temperatures sufficiently low to prevent damage to plastic substrates. The process has use in large area low cost electronics, such as flat panel displays and portable electronics. 5 figs.
ERIC Educational Resources Information Center
Bailey, Anthony
2013-01-01
The nominal group technique (NGT) is a structured process to gather information from a group. The technique was first described in 1975 and has since become a widely-used standard to facilitate working groups. The NGT is effective for generating large numbers of creative new ideas and for group priority setting. This paper describes the process of…
What Makes School Ethnography "Ethnographic?"
ERIC Educational Resources Information Center
Erickson, Frederick
Ethnography as an inquiry process guided by a point of view rather than as a reporting process guided by a standard technique or set of techniques is the main point of this essay which suggests the application of Malinowski's theories and methods to an ethnology of the school, indicates reasons why traditional ethnography is inadequate to the…
NASA Technical Reports Server (NTRS)
Dabney, James B.; Arthur, James Douglas
2017-01-01
Agile methods have gained wide acceptance over the past several years, to the point that they are now a standard management and execution approach for small-scale software development projects. While conventional Agile methods are not generally applicable to large multi-year and mission-critical systems, Agile hybrids are now being developed (such as SAFe) to exploit the productivity improvements of Agile while retaining the necessary process rigor and coordination needs of these projects. From the perspective of Independent Verification and Validation (IVV), however, the adoption of these hybrid Agile frameworks is becoming somewhat problematic. Hence, we find it prudent to question the compatibility of conventional IVV techniques with (hybrid) Agile practices. This paper documents our investigation of (a) relevant literature, (b) the modification and adoption of Agile frameworks to accommodate the development of large-scale, mission-critical systems, and (c) the compatibility of standard IVV techniques within hybrid Agile development frameworks. Specific to the latter, we found that the IVV methods employed within a hybrid Agile process can be divided into three groups: (1) early-lifecycle IVV techniques that are fully compatible with the hybrid lifecycles; (2) IVV techniques that focus on tracing requirements, test objectives, etc., which are somewhat incompatible but can be tailored with modest effort; and (3) IVV techniques involving an assessment requiring artifact completeness that are simply not compatible with hybrid Agile processes, e.g., those that assume complete requirement specification early in the development lifecycle.
Modified Redundancy based Technique—a New Approach to Combat Error Propagation Effect of AES
NASA Astrophysics Data System (ADS)
Sarkar, B.; Bhunia, C. T.; Maulik, U.
2012-06-01
The Advanced Encryption Standard (AES) is a great research challenge. It was developed to replace the Data Encryption Standard (DES). AES suffers from a major limitation: the error propagation effect. To tackle this limitation, two methods are available. One is the redundancy-based technique and the other is the bit-based parity technique. The first has a significant advantage over the second in that it can correct any error deterministically, but at the cost of a higher level of overhead and hence a lower processing speed. In this paper, a new approach based on the redundancy-based technique is proposed that would speed up the process of reliable encryption and hence secured communication.
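The fragment below is a minimal sketch of the redundancy idea, not the authors' exact scheme: each ciphertext block is transmitted three times and a byte-wise majority vote removes a single corrupted copy before decryption, so the channel error never propagates through AES. It assumes the pycryptodome package for the AES primitive; key and message are demo values.

```python
from Crypto.Cipher import AES  # provided by pycryptodome

key = bytes(16)                                  # demo key (all zeros)
block = AES.new(key, AES.MODE_ECB).encrypt(b"sixteen byte msg")

# Without redundancy, one flipped ciphertext bit garbles the whole block.
copies = [bytearray(block) for _ in range(3)]    # triple transmission
copies[1][5] ^= 0x40                             # channel corrupts one copy
voted = bytes(max(set(col), key=col.count) for col in zip(*copies))
print(AES.new(key, AES.MODE_ECB).decrypt(voted))  # b'sixteen byte msg'
```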
A simple 2D composite image analysis technique for the crystal growth study of L-ascorbic acid.
Kumar, Krishan; Kumar, Virender; Lal, Jatin; Kaur, Harmeet; Singh, Jasbir
2017-06-01
This work presents 2D crystal growth studies of L-ascorbic acid using a composite image analysis technique. Growth experiments on the L-ascorbic acid crystals were carried out by standard (optical) microscopy, laser diffraction analysis, and composite image analysis. For image analysis, the growth of L-ascorbic acid crystals was captured as digital 2D RGB images, which were then processed to composite images. After processing, the crystal boundaries emerged as white lines against the black (cancelled) background. The crystal boundaries were well differentiated by peaks in the intensity graphs generated for the composite images. The lengths of crystal boundaries measured from the intensity graphs of composite images were in good agreement (correlation coefficient "r" = 0.99) with the lengths measured by standard microscopy. In contrast, the lengths measured by laser diffraction were poorly correlated with both techniques. Therefore, the composite image analysis can replace the standard microscopy technique for crystal growth studies of L-ascorbic acid. © 2017 Wiley Periodicals, Inc.
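A minimal sketch of the measurement step described above: boundaries appear as bright lines against the cancelled background, so peaks in a row's intensity graph locate them, and their spacing gives the length. The threshold and synthetic row are illustrative.

```python
import numpy as np
from scipy.signal import find_peaks

def boundary_positions(intensity_row, min_height=200):
    # Crystal boundaries emerge as white lines, i.e. intensity peaks
    # against the black (cancelled) background of the composite image.
    peaks, _ = find_peaks(intensity_row, height=min_height)
    return peaks

row = np.zeros(500)              # hypothetical intensity graph (0-255 scale)
row[[120, 380]] = 255            # two boundary lines
print(np.diff(boundary_positions(row)))  # boundary-to-boundary length: [260] px
```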
Shnorhavorian, Margarett; Kroon, Leah; Jeffries, Howard; Johnson, Rebecca
2012-11-01
There is limited literature on strategies to overcome the barriers to sperm banking among adolescent and young adult (AYA) males with cancer. By standardizing our process for offering sperm banking to AYA males before cancer treatment, we aimed to improve rates of sperm banking at our institution. Continuous process improvement is a technique that has recently been applied to improve health care delivery. We used continuous process improvement methodologies to create a standard process for fertility preservation for AYA males with cancer at our institution. We compared rates of sperm banking before and after standardization. In the 12-month period after implementation of a standardized process, 90% of patients were offered sperm banking. We demonstrated an 8-fold increase in the proportion of AYA males' sperm banking, and a 5-fold increase in the rate of sperm banking at our institution. Implementation of a standardized process for sperm banking for AYA males with cancer was associated with increased rates of sperm banking at our institution. This study supports the role of standardized health care in decreasing barriers to sperm banking.
Propellant for the NASA Standard Initiator
NASA Technical Reports Server (NTRS)
Hohmann, Carl; Tipton, Bill, Jr.; Dutton, Maureen
2000-01-01
This paper discusses processes employed in manufacturing zirconium-potassium perchlorate propellant for the NASA Standard Initiator (NSI). It provides a historical background on the NSI device, detailing problem areas and their resolution, as well as an account of propellant blending techniques. Emphasis is placed on the precipitation blending method. The findings on mixing equipment, processing, and raw materials are described. Also detailed are findings on the bridgewire slurry operation, one of the critical steps in the production of the NASA Standard Initiator.
Cost minimizing of cutting process for CNC thermal and water-jet machines
NASA Astrophysics Data System (ADS)
Tavaeva, Anastasia; Kurennov, Dmitry
2015-11-01
This paper deals with the optimization of the cutting process for CNC thermal and water-jet machines. The accuracy of calculating the objective function parameters for the optimization problem is investigated. This paper shows that the working tool path speed is not a constant value; it depends on several parameters described in this paper. Relations of the working tool path speed to the number of NC program frames, the length of straight cuts, and the part configuration are presented. Based on the results obtained, correction coefficients for the working tool speed are defined. Additionally, the optimization problem may be solved using a mathematical model. The model takes into account the additional restrictions of thermal cutting (choice of piercing and output tool points, precedence conditions, thermal deformations). The second part of the paper considers non-standard cutting techniques, which may reduce cutting cost and time compared with standard cutting techniques, and examines the effectiveness of their application. The paper closes by indicating future research work.
Characterisation of Ductile Prepregs
NASA Astrophysics Data System (ADS)
Pinto, F.; White, A.; Meo, M.
2013-04-01
This study is focused on the analysis of micro-perforated prepregs created from standard, off-the-shelf prepregs modified by a particular laser process to enhance ductility for better formability and drapability. Fibres are shortened through the use of laser cutting in a predetermined pattern intended to maintain alignment, and therefore mechanical properties, yet increase ductility at the working temperature. The increase in ductility allows the product to be more effectively optimised for specific forming techniques. Tensile tests were conducted on several specimens in order to understand the ductility enhancement offered by this process with different micro-perforation patterns over standard prepregs. Furthermore, the effect of forming temperature was also analysed to assess the applicability of this material to hot draping techniques and other heated processes.
Retinal Information Processing for Minimum Laser Lesion Detection and Cumulative Damage
1992-09-17
...possible beneficial visual function of the small retinal image movements. B. Visual System Models: prior models of visual system information processing have...against standard secondary sources whose calibrations can be traced to the National Bureau of Standards. B. Electrophysiological Techniques: extracellular...
Quality Space and Launch Requirements Addendum to AS9100C
2015-03-05
...8.9.1 Statistical Process Control (SPC); 8.9.1.1 Out of Control...SME Subject Matter Expert; SOW Statement of Work; SPC Statistical Process Control; SPO System Program Office; SRP Standard Repair...individual data exceeding the control limits. Control limits are developed using standard statistical methods or other approved techniques and are based on...
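As an illustration of the control-limit sentence above, here is a simplified Shewhart X-bar computation; real SPC practice typically estimates sigma from range or standard-deviation charts, so treat this as a sketch, with all data synthetic.

```python
import numpy as np

def xbar_limits(subgroups):
    # Centre line +/- three standard errors of the subgroup means;
    # individual means outside the limits are "out of control".
    means = subgroups.mean(axis=1)
    centre, sigma = means.mean(), means.std(ddof=1)
    return centre - 3 * sigma, centre, centre + 3 * sigma

rng = np.random.default_rng(1)
data = rng.normal(10.0, 0.2, size=(25, 5))      # 25 subgroups of 5 readings
lcl, cl, ucl = xbar_limits(data)
means = data.mean(axis=1)
print(np.flatnonzero((means < lcl) | (means > ucl)))  # flagged subgroups
```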
Field methods and data processing techniques associated with mapped inventory plots
William A. Bechtold; Stanley J. Zarnoch
1999-01-01
The U.S. Forest Inventory and Analysis (FIA) and Forest Health Monitoring (FHM) programs utilize a fixed-area mapped-plot design as the national standard for extensive forest inventories. The mapped-plot design is explained, as well as the rationale for its selection as the national standard. Ratio-of-means estimators are presented as a method to process data from...
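For readers unfamiliar with the estimator, a ratio-of-means estimate divides the total of the attribute by the total of the auxiliary variable; the sketch below, with hypothetical plot values, shows the form.

```python
import numpy as np

def ratio_of_means(y, x):
    # R = sum(y) / sum(x): attribute total per unit of auxiliary total
    # (e.g., tree attribute per unit of mapped plot area actually sampled).
    return np.sum(y) / np.sum(x)

y = np.array([3.1, 2.4, 4.0])    # hypothetical attribute totals per plot
x = np.array([0.9, 0.7, 1.0])    # hypothetical sampled areas per plot
print(ratio_of_means(y, x))      # ~3.65 per unit area
```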
Software component quality evaluation
NASA Technical Reports Server (NTRS)
Clough, A. J.
1991-01-01
The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.
Processing Techniques for Intelligibility Improvement to Speech with Co-Channel Interference.
1983-09-01
...processing was found to be always less than in the original unprocessed co-channel signal; also, as the length of the comb filter increased, the...PROCESSING TECHNIQUES FOR INTELLIGIBILITY IMPROVEMENT TO SPEECH WITH CO-C...(U) SIGNAL TECHNOLOGY INC, GOLETA CA; B. A. HANSON ET AL; SEP 1983
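Since only fragments of this abstract survive, the sketch below shows generic pitch-based comb filtering of the kind the report studies, assuming the target talker's pitch period is known; the tap count stands in for the filter-length trade-off the fragment mentions.

```python
import numpy as np

def pitch_comb_filter(x, period, taps=3):
    # Average the signal at multiples of the target talker's pitch period:
    # the target's harmonics add coherently, the co-channel interferer's don't.
    y = np.zeros(x.size)
    for k in range(taps):
        y[k * period:] += x[: x.size - k * period]
    return y / taps

rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * np.arange(8000) / 80) + rng.normal(0, 1.0, 8000)
enhanced = pitch_comb_filter(x, period=80)   # pitch period of 80 samples
```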
Combinative Particle Size Reduction Technologies for the Production of Drug Nanocrystals
Salazar, Jaime; Müller, Rainer H.; Möschwitzer, Jan P.
2014-01-01
Nanosizing is a suitable method to enhance the dissolution rate and therefore the bioavailability of poorly soluble drugs. The success of the particle size reduction processes depends on critical factors such as the employed technology, equipment, and drug physicochemical properties. High pressure homogenization and wet bead milling are standard comminution techniques that have been already employed to successfully formulate poorly soluble drugs and bring them to market. However, these techniques have limitations in their particle size reduction performance, such as long production times and the necessity of employing a micronized drug as the starting material. This review article discusses the development of combinative methods, such as the NANOEDGE, H 96, H 69, H 42, and CT technologies. These processes were developed to improve the particle size reduction effectiveness of the standard techniques. These novel technologies can combine bottom-up and/or top-down techniques in a two-step process. The combinative processes lead in general to improved particle size reduction effectiveness. Faster production of drug nanocrystals and smaller final mean particle sizes are among the main advantages. The combinative particle size reduction technologies are very useful formulation tools, and they will continue acquiring importance for the production of drug nanocrystals. PMID:26556191
78 FR 47784 - Notice of Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
... Standards and Technology (NIST) Federal Information Processing Standard (FIPS) 201: Personal Identity...), address, employment history, biometric identifiers (e.g. fingerprints), signature, digital photograph... collection techniques or the use of other forms of information technology. Comments submitted in response to...
New Techniques to Evaluate the Incendiary Behavior of Insulators
NASA Technical Reports Server (NTRS)
Buhler, Charles; Calle, Carlos; Clements, Sid; Trigwell, Steve; Ritz, Mindy
2008-01-01
New techniques for evaluating the incendiary behavior of insulators are presented. The onset of incendive brush discharges in air is evaluated using standard spark probe techniques for the case simulating approaches of an electrically grounded sphere to a charged insulator in the presence of a flammable atmosphere. However, this standard technique is unsuitable for the case of brush discharges that may occur during the charging-separation process for two insulator materials. We present experimental techniques to evaluate this hazard in the presence of a flammable atmosphere, ideally suited to measuring the incendiary nature of micro-discharges upon separation, a measurement never before performed. Other measurement techniques unique to this study include: surface potential measurements of insulators before, during, and after contact and separation, as well as methods to verify fieldmeter calibrations using a charged insulator surface as opposed to standard high-voltage plates. Key words: Kapton polyimide film, incendiary discharges, brush discharges, contact and frictional electrification, ignition hazards, insulators, contact angle, surface potential measurements.
Quigley, Elizabeth A; Tokay, Barbara A; Jewell, Sarah T; Marchetti, Michael A; Halpern, Allan C
2015-08-01
Photographs are invaluable dermatologic diagnostic, management, research, teaching, and documentation tools. Digital Imaging and Communications in Medicine (DICOM) standards exist for many types of digital medical images, but there are no DICOM standards for camera-acquired dermatologic images to date. To identify and describe existing or proposed technology and technique standards for camera-acquired dermatologic images in the scientific literature. Systematic searches of the PubMed, EMBASE, and Cochrane databases were performed in January 2013 using photography and digital imaging, standardization, and medical specialty and medical illustration search terms and augmented by a gray literature search of 14 websites using Google. Two reviewers independently screened titles of 7371 unique publications, followed by 3 sequential full-text reviews, leading to the selection of 49 publications with the most recent (1985-2013) or detailed description of technology or technique standards related to the acquisition or use of images of skin disease (or related conditions). No universally accepted existing technology or technique standards for camera-based digital images in dermatology were identified. Recommendations are summarized for technology imaging standards, including spatial resolution, color resolution, reproduction (magnification) ratios, postacquisition image processing, color calibration, compression, output, archiving and storage, and security during storage and transmission. Recommendations are also summarized for technique imaging standards, including environmental conditions (lighting, background, and camera position), patient pose and standard view sets, and patient consent, privacy, and confidentiality. Proposed standards for specific-use cases in total body photography, teledermatology, and dermoscopy are described. The literature is replete with descriptions of obtaining photographs of skin disease, but universal imaging standards have not been developed, validated, and adopted to date. Dermatologic imaging is evolving without defined standards for camera-acquired images, leading to variable image quality and limited exchangeability. The development and adoption of universal technology and technique standards may first emerge in scenarios when image use is most associated with a defined clinical benefit.
Signal processing for ION mobility spectrometers
NASA Technical Reports Server (NTRS)
Taylor, S.; Hinton, M.; Turner, R.
1995-01-01
Signal processing techniques for systems based upon Ion Mobility Spectrometry will be discussed in the light of 10 years of experience in the design of real-time IMS. Among the topics to be covered are compensation techniques for variations in the number density of the gas: the use of an internal standard (a reference peak) or pressure and temperature sensors. Sources of noise and methods for noise reduction will be discussed, together with resolution limitations and the ability of deconvolution techniques to improve resolving power. The use of neural networks (either by themselves or as a component part of a processing system) will be reviewed.
Single crystals and nonlinear process for outstanding vibration-powered electrical generators.
Badel, Adrien; Benayad, Abdelmjid; Lefeuvre, Elie; Lebrun, Laurent; Richard, Claude; Guyomar, Daniel
2006-04-01
This paper compares the performances of vibration-powered electrical generators using a piezoelectric ceramic and a piezoelectric single crystal associated with several power conditioning circuits. A new approach to piezoelectric power conversion based on nonlinear voltage processing is presented, leading to three novel high-performance power conditioning interfaces. Theoretical predictions and experimental results show that the nonlinear processing technique may increase the power harvested by a factor of 8 compared to standard techniques. Moreover, it is shown that, for a given energy harvesting technique, generators using single crystals deliver 20 times more power than generators using piezoelectric ceramics.
Use of fractography and sectioning techniques to study fracture mechanisms
NASA Technical Reports Server (NTRS)
Van Stone, R. H.; Cox, T. B.
1976-01-01
Recent investigations of the effect of microstructure on the fracture mechanisms and fracture toughness of steels, aluminum alloys, and titanium alloys have used standard fractographic techniques and a sectioning technique on specimens plastically deformed to various strains up to fracture. The specimens are prepared metallographically for observation in both optical and electron beam instruments. This permits observations to be made about the fracture mechanism as it occurs in thick sections and helps remove speculation from the interpretation of fractographic features. This technique may be used in conjunction with other standard techniques such as extraction replicas and microprobe analyses. Care must be taken to make sure that the microstructural features which are observed to play a role in the fracture process using the sectioning technique can be identified with fractography.
Zweerink, Alwin; Allaart, Cornelis P; Kuijer, Joost P A; Wu, LiNa; Beek, Aernout M; van de Ven, Peter M; Meine, Mathias; Croisille, Pierre; Clarysse, Patrick; van Rossum, Albert C; Nijveldt, Robin
2017-12-01
Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. • Myocardial strain analysis could potentially improve patient selection for CRT. • Currently a well validated clinical approach to derive segmental strains is lacking. • The novel SLICE technique derives segmental strains from standard CMR cine images. • SLICE-derived strain markers of CRT response showed close agreement with CMR-TAG. • Future studies will focus on the prognostic value of SLICE in CRT candidates.
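A minimal reading of the SLICE idea in code, with hypothetical segment lengths: strain per frame is the fractional change of segment length relative to the reference frame. This is a sketch, not the validated post-processing pipeline.

```python
import numpy as np

def slice_strain(segment_lengths):
    # Segmental strain per cine frame: e(t) = (L(t) - L0) / L0, with the
    # first frame as reference; negative values indicate shortening.
    L = np.asarray(segment_lengths, dtype=float)
    return (L - L[0]) / L[0]

lengths_mm = [52.0, 50.4, 48.9, 49.6, 51.1]   # hypothetical septal lengths
print(slice_strain(lengths_mm))
```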
On the Performance Evaluation of 3D Reconstruction Techniques from a Sequence of Images
NASA Astrophysics Data System (ADS)
Eid, Ahmed; Farag, Aly
2005-12-01
The performance evaluation of 3D reconstruction techniques is not a simple problem to solve. This is not only due to the increased dimensionality of the problem but also due to the lack of standardized and widely accepted testing methodologies. This paper presents a unified framework for the performance evaluation of different 3D reconstruction techniques. This framework includes a general problem formalization, different measuring criteria, and a classification method as a first step in standardizing the evaluation process. Performance characterization of two standard 3D reconstruction techniques, stereo and space carving, is also presented. The evaluation is performed on the same data set using an image reprojection testing methodology to reduce the dimensionality of the evaluation domain. Also, different measuring strategies are presented and applied to the stereo and space carving techniques. These measuring strategies have shown consistent results in quantifying the performance of these techniques. Additional experiments are performed on the space carving technique to study the effect of the number of input images and the camera pose on its performance.
Molenaar, Peter C M
2008-01-01
It is argued that general mathematical-statistical theorems imply that standard statistical analysis techniques of inter-individual variation are invalid to investigate developmental processes. Developmental processes have to be analyzed at the level of individual subjects, using time series data characterizing the patterns of intra-individual variation. It is shown that standard statistical techniques based on the analysis of inter-individual variation appear to be insensitive to the presence of arbitrary large degrees of inter-individual heterogeneity in the population. An important class of nonlinear epigenetic models of neural growth is described which can explain the occurrence of such heterogeneity in brain structures and behavior. Links with models of developmental instability are discussed. A simulation study based on a chaotic growth model illustrates the invalidity of standard analysis of inter-individual variation, whereas time series analysis of intra-individual variation is able to recover the true state of affairs. (c) 2007 Wiley Periodicals, Inc.
Chemical vapor deposition growth
NASA Technical Reports Server (NTRS)
Ruth, R. P.; Manasevit, H. M.; Kenty, J. L.; Moudy, L. A.; Simpson, W. I.; Yang, J. J.
1976-01-01
The chemical vapor deposition (CVD) method for the growth of Si sheet on inexpensive substrate materials is investigated. The objective is to develop CVD techniques for producing large areas of Si sheet on inexpensive substrate materials, with sheet properties suitable for fabricating solar cells meeting the technical goals of the Low Cost Silicon Solar Array Project. Specific areas covered include: (1) modification and test of existing CVD reactor system; (2) identification and/or development of suitable inexpensive substrate materials; (3) experimental investigation of CVD process parameters using various candidate substrate materials; (4) preparation of Si sheet samples for various special studies, including solar cell fabrication; (5) evaluation of the properties of the Si sheet material produced by the CVD process; and (6) fabrication and evaluation of experimental solar cell structures, using standard and near-standard processing techniques.
NASA Technical Reports Server (NTRS)
Castle, J. G.
1976-01-01
A selective bibliography is given on electrical characterization techniques for semiconductors. Emphasis is placed on noncontacting techniques for the standard electrical parameters for monitoring crystal growth in space, preferably in real time with high resolution.
Chemical vapor deposition growth
NASA Technical Reports Server (NTRS)
Ruth, R. P.; Manasevit, H. M.; Campbell, A. G.; Johnson, R. E.; Kenty, J. L.; Moudy, L. A.; Shaw, G. L.; Simpson, W. I.; Yang, J. J.
1978-01-01
The objective was to investigate and develop chemical vapor deposition (CVD) techniques for the growth of large areas of Si sheet on inexpensive substrate materials, with resulting sheet properties suitable for fabricating solar cells that would meet the technical goals of the Low Cost Silicon Solar Array Project. The program involved six main technical tasks: (1) modification and test of an existing vertical-chamber CVD reactor system; (2) identification and/or development of suitable inexpensive substrate materials; (3) experimental investigation of CVD process parameters using various candidate substrate materials; (4) preparation of Si sheet samples for various special studies, including solar cell fabrication; (5) evaluation of the properties of the Si sheet material produced by the CVD process; and (6) fabrication and evaluation of experimental solar cell structures, using impurity diffusion and other standard and near-standard processing techniques supplemented late in the program by the in situ CVD growth of n(+)/p/p(+) sheet structures subsequently processed into experimental cells.
The Timeseries Toolbox - A Web Application to Enable Accessible, Reproducible Time Series Analysis
NASA Astrophysics Data System (ADS)
Veatch, W.; Friedman, D.; Baker, B.; Mueller, C.
2017-12-01
The vast majority of data analyzed by climate researchers are repeated observations of physical processes, i.e., time series data. These data lend themselves to a common set of statistical techniques and models designed to determine the trends and variability (e.g., seasonality) of the repeated observations. Often, these same techniques and models can be applied to a wide variety of different time series data. The Timeseries Toolbox is a web application designed to standardize and streamline these common approaches to time series analysis and modeling, with particular attention to hydrologic time series used in climate preparedness and resilience planning and design by the U.S. Army Corps of Engineers. The application performs much of the pre-processing of time series data necessary for more complex techniques (e.g., interpolation, aggregation). With this tool, users can upload any dataset that conforms to a standard template and immediately begin applying these techniques to analyze their time series data.
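A sketch of the kind of pre-processing such a toolbox automates, using pandas on a synthetic daily stage record: gap-filling by interpolation, then aggregation to monthly means before trend estimation. The series, gap limits, and slope method are illustrative only.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
idx = pd.date_range("2000-01-01", periods=3650, freq="D")
stage = pd.Series(10 + np.sin(2 * np.pi * idx.dayofyear / 365.25)
                  + rng.normal(0, 0.3, idx.size), index=idx)
stage[rng.random(idx.size) < 0.05] = np.nan          # gaps in the record

clean = stage.interpolate(limit=7, limit_direction="both")  # fill short gaps
monthly = clean.resample("MS").mean()                       # aggregate
slope = np.polyfit(np.arange(monthly.size), monthly.to_numpy(), 1)[0]
print(f"trend: {slope:.4f} units/month")
```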
MMX-I: data-processing software for multimodal X-ray imaging and tomography.
Bergamaschi, Antoine; Medjoubi, Kadda; Messaoudi, Cédric; Marco, Sergio; Somogyi, Andrea
2016-05-01
A new multi-platform freeware has been developed for the processing and reconstruction of scanning multi-technique X-ray imaging and tomography datasets. The software platform aims to treat different scanning imaging techniques: X-ray fluorescence, phase, absorption and dark field and any of their combinations, thus providing an easy-to-use data processing tool for the X-ray imaging user community. A dedicated data input stream copes with the input and management of large datasets (several hundred GB) collected during a typical multi-technique fast scan at the Nanoscopium beamline and even on a standard PC. To the authors' knowledge, this is the first software tool that aims at treating all of the modalities of scanning multi-technique imaging and tomography experiments.
Estimation of correlation functions by stochastic approximation.
NASA Technical Reports Server (NTRS)
Habibi, A.; Wintz, P. A.
1972-01-01
Consideration of the autocorrelation function of a zero-mean stationary random process. The techniques are applicable to processes with nonzero mean provided the mean is estimated first and subtracted. Two recursive techniques are proposed, both of which are based on the method of stochastic approximation and assume a functional form for the correlation function that depends on a number of parameters that are recursively estimated from successive records. One technique uses a standard point estimator of the correlation function to provide estimates of the parameters that minimize the mean-square error between the point estimates and the parametric function. The other technique provides estimates of the parameters that maximize a likelihood function relating the parameters of the function to the random process. Examples are presented.
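In the spirit of the first technique, the sketch below recursively nudges the parameters of an assumed exponential correlation model toward each new record's point estimates with a 1/n Robbins-Monro gain; the AR(1) test signal and the model form are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def point_acf(x, max_lag):
    # Standard point estimate of the autocovariance at lags 0..max_lag.
    x = x - x.mean()
    return np.array([x[: x.size - k] @ x[k:] / x.size for k in range(max_lag + 1)])

def ar1_record(n=500, phi=0.6):
    # One record of a zero-mean stationary AR(1) process.
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

# Assumed parametric form R(tau) = v * r**tau; each successive record nudges
# the parameter estimates toward its point estimates (stochastic approximation).
v, r = 1.0, 0.0
for n in range(1, 201):
    R = point_acf(ar1_record(), 5)
    gain = 1.0 / n
    v += gain * (R[0] - v)          # variance parameter
    r += gain * (R[1] / R[0] - r)   # decay parameter

print(v, r)   # approaches ~1.56 and ~0.6 for this AR(1) example
```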
In situ spectroradiometric quantification of ERTS data. [Prescott and Phoenix, Arizona
NASA Technical Reports Server (NTRS)
Yost, E. F. (Principal Investigator)
1975-01-01
The author has identified the following significant results. Analyses of ERTS-1 photographic data were made to quantitatively relate ground reflectance measurements to the photometric characteristics of the images. Digital image processing of photographic data resulted in a nomograph to correct for atmospheric effects over arid terrain. Optimum processing techniques to derive maximum geologic information from desert areas were established. Additive color techniques providing quantitative measurements of surface water between different orbits were developed and were accepted as the standard flood-mapping techniques using ERTS.
Standardizing Quality Assessment of Fused Remotely Sensed Images
NASA Astrophysics Data System (ADS)
Pohl, C.; Moellmann, J.; Fries, K.
2017-09-01
The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment depends on the chosen criteria and indices; the result varies with them. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective image fusion quality evaluation. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.
The Use of Nominal Group Technique: Case Study in Vietnam
ERIC Educational Resources Information Center
Dang, Vi Hoang
2015-01-01
The Nominal Group Technique (NGT) is a structured process to gather information from a group. The technique was first described in early 1970s and has since become a widely-used standard to facilitate working groups. The NGT is effective for generating large numbers of creative new ideas and for group priority setting. This article reports on a…
Expert system verification and validation study. Delivery 3A and 3B: Trip summaries
NASA Technical Reports Server (NTRS)
French, Scott
1991-01-01
Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to the verification of rule bases and on techniques for capturing information about the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-15
... Classification System. \\2\\ Maximum Achievable Control Technology. Table 2 is not intended to be exhaustive, but..., methods, systems, or techniques that reduce the volume of or eliminate HAP emissions through process changes, substitution of materials, or other modifications; enclose systems or processes to eliminate...
Preparation and application of in-fibre internal standardization solid-phase microextraction.
Zhao, Wennan; Ouyang, Gangfeng; Pawliszyn, Janusz
2007-03-01
The in-fibre standardization method is a novel approach that has been developed for field sampling/sample preparation, in which an internal standard is pre-loaded onto a solid-phase microextraction (SPME) fibre for calibration of the extraction of target analytes in field samples. The same method can also be used for in-vial sample analysis. In this study, different techniques to load the standard to a non-porous SPME fibre were investigated. It was found that the appropriateness of the technique depends on the physical properties of the standards that are used for the analysis. Headspace extraction of the standard dissolved in pumping oil works well for volatile compounds. Conversely, headspace extraction of the pure standard is an effective approach for semi-volatile compounds. For compounds with low volatility, a syringe-fibre transfer method and direct extraction of the standard dissolved in a solvent exhibited a good reproducibility (<5% RSD). The main advantage of the approaches investigated in this study is that the standard generation vials can be reused for hundreds of analyses without exhibiting significant loss. Moreover, most of the standard loading processes studied can be performed automatically, which is efficient and precise. Finally, the standard loading technique and in-fibre standardization method were applied to a complex matrix (milk) and the results illustrated that the matrix effect can be effectively compensated for with this approach.
Impervious surfaces mapping using high resolution satellite imagery
NASA Astrophysics Data System (ADS)
Shirmeen, Tahmina
In recent years, impervious surfaces have emerged not only as an indicator of the degree of urbanization but also as an indicator of environmental quality. As impervious surface area increases, storm water runoff increases in velocity, quantity, temperature, and pollution load. Any of these attributes can contribute to the degradation of natural hydrology and water quality. Various image processing techniques have been used to identify impervious surfaces; however, most existing impervious surface mapping tools use moderate-resolution imagery. In this project, the potential of standard image processing techniques to generate impervious surface (IS) data for change detection analysis using high-resolution satellite imagery was evaluated. The city of Oxford, MS was selected as the study site. Standard image processing techniques, including the Normalized Difference Vegetation Index (NDVI), Principal Component Analysis (PCA), a combination of NDVI and PCA, and image classification algorithms, were used to generate impervious surfaces from multispectral IKONOS and QuickBird imagery acquired in both leaf-on and leaf-off conditions. Accuracy assessments were performed, using truth data generated by manual classification, with Kappa statistics and zonal statistics to select the most appropriate image processing techniques for impervious surface mapping. The performance of the selected image processing techniques was enhanced by incorporating the Soil Brightness Index (SBI) and Greenness Index (GI) derived from Tasseled Cap Transformed (TCT) IKONOS and QuickBird imagery. A time series of impervious surfaces for the time frame between 2001 and 2007 was generated using the refined image processing techniques to analyze the changes in IS in Oxford. It was found that NDVI and the combined NDVI-PCA methods are the most suitable image processing techniques for mapping impervious surfaces in leaf-off and leaf-on conditions, respectively, using high-resolution multispectral imagery. It was also found that IS data generated by these techniques can be refined by removing conflicting dry soil patches using the SBI and GI obtained from TCT of the same imagery used for IS data generation. The change detection analysis of the IS time series shows that Oxford experienced the major changes in IS from 2001 to 2004 and from 2006 to 2007.
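As a sketch of the NDVI step, the fragment below computes the index from near-infrared and red bands and thresholds it to flag candidate impervious pixels; the arrays and the 0.2 threshold are illustrative, and in the project such masks were further refined with the soil brightness and greenness indices.

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    # Normalized Difference Vegetation Index from NIR and red reflectance.
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + eps)

rng = np.random.default_rng(0)
red = rng.random((200, 200))          # stand-ins for IKONOS/QuickBird bands
nir = rng.random((200, 200))
candidate_is = ndvi(nir, red) < 0.2   # low NDVI -> candidate impervious surface
```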
Processing of Nanostructured Devices Using Microfabrication Techniques
NASA Technical Reports Server (NTRS)
Xu, Jennifer C (Inventor); Kulis, Michael H (Inventor); Berger, Gordon M (Inventor); Hunter, Gary W (Inventor); Vander Wal, Randall L (Inventor); Evans, Laura J (Inventor)
2014-01-01
Systems and methods that incorporate nanostructures into microdevices are discussed herein. These systems and methods can allow for standard microfabrication techniques to be extended to the field of nanotechnology. Sensors incorporating nanostructures can be fabricated as described herein, and can be used to reliably detect a range of gases with high response.
Perls with Gloria Re-reviewed: Gestalt Techniques and Perls's Practices.
ERIC Educational Resources Information Center
Dolliver, Robert H.
1991-01-01
Reviews the filmed interview with Gloria by Perls (1965), which demonstrated some standard Gestalt therapy techniques, and presents examples from the film. Identifies discrepancies between Perls's description of Gestalt therapeutic processes and his interview behavior. Reflects on the inherent difficulties with the concept of the emerging…
MMX-I: data-processing software for multimodal X-ray imaging and tomography
Bergamaschi, Antoine; Medjoubi, Kadda; Messaoudi, Cédric; Marco, Sergio; Somogyi, Andrea
2016-01-01
A new multi-platform freeware package has been developed for the processing and reconstruction of scanning multi-technique X-ray imaging and tomography datasets. The software platform aims to treat different scanning imaging techniques: X-ray fluorescence, phase, absorption and dark field, and any of their combinations, thus providing an easy-to-use data processing tool for the X-ray imaging user community. A dedicated data input stream copes with the input and management of large datasets (several hundred GB) collected during a typical multi-technique fast scan at the Nanoscopium beamline, even on a standard PC. To the authors’ knowledge, this is the first software tool that aims at treating all of the modalities of scanning multi-technique imaging and tomography experiments. PMID:27140159
Beyond the standard plate count: genomic views into microbial food ecology
USDA-ARS?s Scientific Manuscript database
Food spoilage is a complex process that involves multiple species with specific niches and metabolic processes; bacterial culturing techniques are the traditional methods for identifying the microbes responsible. These culture-dependent methods may be considered selective, targeting the isolation of...
Localized analysis of paint-coat drying using dynamic speckle interferometry
NASA Astrophysics Data System (ADS)
Sierra-Sosa, Daniel; Tebaldi, Myrian; Grumel, Eduardo; Rabal, Hector; Elmaghraby, Adel
2018-07-01
Paint-coating is part of several industrial processes, including the automotive industry, architectural coatings, and machinery and appliances. These paint-coatings must comply with high quality standards; for this reason, evaluation techniques for paint-coatings are in constant development. One important factor in the paint-coating process is drying, as it influences the quality of the final result. In this work we present an assessment technique based on optical dynamic speckle interferometry, which allows temporal evaluation of activity during the paint-coating drying process, providing localized information on drying. This localized information is relevant for addressing drying homogeneity, optimal drying, and quality control. The technique relies on the definition of a new temporal history of the speckle patterns to obtain the local activity; this information is then clustered to provide a convenient indicator of the different drying process stages. The experimental results presented were validated using gravimetric drying curves.
NASA Technical Reports Server (NTRS)
Navard, Sharon E.
1989-01-01
In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.
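The four-to-one criterion mentioned in this abstract can be expressed as a simple test uncertainty ratio (TUR) check. A minimal sketch, with hypothetical tolerance and uncertainty values (the function name and figures are illustrative, not from the NASA report):

```python
def test_uncertainty_ratio(instrument_tolerance: float,
                           standard_uncertainty: float) -> float:
    """Ratio of the unit-under-test's tolerance to the calibrating
    standard's uncertainty; a common acceptance criterion is >= 4."""
    return instrument_tolerance / standard_uncertainty

# Hypothetical values: a 0.04-unit instrument spec calibrated against a
# standard with 0.008 units of uncertainty.
tur = test_uncertainty_ratio(instrument_tolerance=0.04,
                             standard_uncertainty=0.008)
print(f"TUR = {tur:.1f}:1 ->",
      "meets 4:1" if tur >= 4 else "compute exact calibration uncertainty")
```

When the ratio falls below 4:1, as the abstract notes, the fallback is to compute the exact uncertainty of the calibration and look for ways to reduce it.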
Packet telemetry and packet telecommand - The new generation of spacecraft data handling techniques
NASA Technical Reports Server (NTRS)
Hooke, A. J.
1983-01-01
Because of the rising costs and reduced reliability associated with customized spacecraft and ground network hardware and software, standardized Packet Telemetry and Packet Telecommand concepts are emerging as viable alternatives. Within each concept, autonomous data packets created by ground and space application processes using standard formatting techniques are switched end-to-end through the space data network to their destination application processes using standard transfer protocols. Because the intermediate data networks can be designed to be completely mission-independent, this approach may facilitate a high degree of automation and interoperability. The goals of the Consultative Committee for Space Data Systems are the adoption of an international guideline for future space telemetry formatting based on the Packet Telemetry concept, and the advancement of the NASA-ESA Working Group's Packet Telecommand concept to a level of maturity parallel to that of Packet Telemetry. Both the Packet Telemetry and Packet Telecommand concepts are reviewed.
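For context, the sketch below packs a six-octet packet primary header in the layout later standardized by CCSDS for space packets. The field layout follows the modern CCSDS recommendation, which postdates this 1983 paper, so treat it as illustrative background rather than the paper's exact format.

```python
import struct

def ccsds_primary_header(apid: int, seq_count: int, data_len: int,
                         version: int = 0, pkt_type: int = 0,
                         sec_hdr: int = 0, seq_flags: int = 3) -> bytes:
    """Pack a 6-octet space packet primary header (CCSDS-style layout):
      version(3) | type(1) | sec_hdr_flag(1) | apid(11)
      seq_flags(2) | seq_count(14)
      data field length in octets, minus one (16)
    """
    word1 = (version << 13) | (pkt_type << 12) | (sec_hdr << 11) | (apid & 0x7FF)
    word2 = (seq_flags << 14) | (seq_count & 0x3FFF)
    word3 = (data_len - 1) & 0xFFFF
    return struct.pack(">HHH", word1, word2, word3)  # big-endian 16-bit words

header = ccsds_primary_header(apid=0x1AB, seq_count=42, data_len=128)
print(header.hex())  # 01abc02a007f
```

The mission-independence the abstract describes follows from exactly this kind of fixed, application-agnostic header: intermediate networks route on the header alone and never inspect the payload.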
Image-Based 3d Reconstruction and Analysis for Orthodontia
NASA Astrophysics Data System (ADS)
Knyaz, V. A.
2012-08-01
Among the main tasks of orthodontia are the analysis of dental arches and treatment planning to provide the correct position for every tooth. The treatment plan is based on measurement of tooth parameters and on designing the ideal arch curve that the teeth are to form after treatment. The most common technique for moving teeth uses standard brackets, which are placed on the teeth, and a wire of given shape, clamped by these brackets, to produce the forces needed to move each tooth in the required direction. The disadvantages of the standard bracket technique are the low accuracy of tooth dimension measurements and the difficulty of applying the standard approach to the wide variety of complex orthodontic cases. An image-based technique for orthodontic planning, treatment, and documentation aimed at overcoming these disadvantages is proposed. The proposed approach enables accurate measurement of the tooth parameters needed for adequate planning, design of the correct tooth positions, and monitoring of the treatment process. The developed technique applies photogrammetric means for 3D model generation of the dental arch, determination of bracket positions, and analysis of tooth movement.
Weaving the tapestry of learning: simulation, standardized patients, and virtual communities.
Holland, Brian; Landry, Karen; Mountain, Angela; Middlebrooks, Mary Alice; Heim, Deborah; Missildine, Kathy
2013-01-01
Using situated cognition learning theory, nursing faculty developed simulated clinical learning experiences integrating virtual communities and standardized patients. These learning experiences provide authenticity and realism not easily achieved using the individual techniques in isolation. The authors describe the process of weaving these strategies into a rich learning experience for students.
76 FR 16728 - Announcement of the American Petroleum Institute's Standards Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-25
... voluntary standards for equipment, materials, operations, and processes for the petroleum and natural gas... Techniques for Designing and/or Optimizing Gas-lift Wells and Systems, 1st Ed. RP 13K, Chemical Analysis of... Q2, Quality Management Systems for Service Supply Organizations for the Petroleum and Natural Gas...
This presentation will describe the U.S. EPA’s drinking water and ambient water method development program in relation to the process employed and the typical challenges encountered in developing standardized LC/MS/MS methods for chemicals of emerging concern. The EPA…
CMOS array design automation techniques. [metal oxide semiconductors
NASA Technical Reports Server (NTRS)
Ramondetta, P.; Feller, A.; Noto, R.; Lombardi, T.
1975-01-01
A low cost, quick turnaround technique for generating custom metal oxide semiconductor arrays using the standard cell approach was developed, implemented, tested and validated. Basic cell design topology and guidelines are defined based on an extensive analysis that includes circuit, layout, process, array topology, and required performance considerations, particularly high circuit speed.
PRELIMINARY COMPARATIVE STUDY OF METHODS TO EXTRACT VIRUS FROM RAW AND PROCESSED SEWAGE SLUDGES
Two simple virus extraction techniques were compared to an EPA standard method for detection of human enteric viruses in raw sewage sludge and class A biosolids. The techniques were used to detect both indigenous and seeded virus from a plant that distributes class A material pr...
40 CFR 426.135 - Standards of performance for new sources.
Code of Federal Regulations, 2013 CFR
2013-07-01
... greater than 50 gallons per day of process waste water, and employs hydrofluoric acid finishing techniques... any 1 day Average of daily values for 30 consecutive days shall not exceed— Lead 0.2 0.1 Fluoride 26.0... waste water, and employs hydrofluoric acid finishing techniques shall meet the following limitations...
40 CFR 426.135 - Standards of performance for new sources.
Code of Federal Regulations, 2012 CFR
2012-07-01
... greater than 50 gallons per day of process waste water, and employs hydrofluoric acid finishing techniques... any 1 day Average of daily values for 30 consecutive days shall not exceed— Lead 0.2 0.1 Fluoride 26.0... waste water, and employs hydrofluoric acid finishing techniques shall meet the following limitations...
40 CFR 426.135 - Standards of performance for new sources.
Code of Federal Regulations, 2014 CFR
2014-07-01
... greater than 50 gallons per day of process waste water, and employs hydrofluoric acid finishing techniques... any 1 day Average of daily values for 30 consecutive days shall not exceed— Lead 0.2 0.1 Fluoride 26.0... waste water, and employs hydrofluoric acid finishing techniques shall meet the following limitations...
Gang, Wei-juan; Wang, Xin; Wang, Fang; Dong, Guo-feng; Wu, Xiao-dong
2015-08-01
The national standard "Regulations of Acupuncture-needle Manipulating Techniques" is one of the national criteria of acupuncturology, for which a total of 22 items have already been established. In the process of formulation, a series of common and specific problems were encountered. In the present paper, the authors expound these problems from 3 aspects, namely the principles for formulation, the methods for formulating criteria, and considerations about some problems. The formulating principles include the selection and regulation of principles for technique classification and technique-related key factors. The main methods for formulating criteria are 1) taking the literature as the theoretical foundation, 2) taking clinical practice as the supporting evidence, and 3) taking expounded suggestions or conclusions through peer review.
Parrilla, Inma; del Olmo, David; Sijses, Laurien; Martinez-Alborcia, María J; Cuello, Cristina; Vazquez, Juan M; Martinez, Emilio A; Roca, Jordi
2012-05-01
The present study aimed to evaluate the ability of spermatozoa from individual boar ejaculates to withstand different semen-processing techniques. Eighteen sperm-rich ejaculate samples from six boars (three per boar) were diluted in Beltsville Thawing Solution and split into three aliquots. The aliquots were (1) further diluted to 3×10⁷ sperm/mL and stored as a liquid at 17°C for 72 h, (2) frozen-thawed (FT) at 1×10⁹ sperm/mL using standard 0.5-mL straw protocols, or (3) sex-sorted with subsequent liquid storage (at 17°C for 6 h) or FT (2×10⁷ sperm/mL using a standard 0.25-mL straw protocol). The sperm quality was evaluated based on total sperm motility (the CASA system), viability (plasma membrane integrity assessed using flow cytometry and the LIVE/DEAD Sperm Viability Kit), lipid peroxidation (assessed via indirect measurement of the generation of malondialdehyde (MDA) using the BIOXYTECH MDA-586 Assay Kit) and DNA fragmentation (sperm chromatin dispersion assessed using the Sperm-Sus-Halomax® test). Data were normalized to the values assessed for the fresh (for liquid-stored and FT samples) or the sorted semen samples (for liquid stored and the FT sorted spermatozoa). All of the four sperm-processing techniques affected sperm quality (P<0.01), regardless of the semen donor, with reduced percentages of motile and viable sperm and increased MDA generation and percentages of sperm with fragmented DNA. Significant (P<0.05) inter-boar (effect of boars within each semen-processing technique) and intra-boar (effect of semen-processing techniques within each boar) differences were evident for all of the sperm quality parameters assessed, indicating differences in the ability of spermatozoa from individual boars to withstand the semen-processing techniques. These results are the first evidence that ejaculate spermatozoa from individual boars can respond in a boar-dependent manner to different semen-processing techniques. Copyright © 2012 Elsevier B.V. All rights reserved.
Bauer, Daniel R; Otter, Michael; Chafin, David R
2018-01-01
Studying and developing preanalytical tools and technologies for the purpose of obtaining high-quality samples for histological assays is a growing field. Currently, there does not exist a standard practice for collecting, fixing, and monitoring these precious samples. There has been some advancement in standardizing collection for the highest profile tumor types, such as breast, where HER2 testing drives therapeutic decisions. This review examines the area of tissue collection, transport, and monitoring of formalin diffusion and details a prototype system that could be used to help standardize tissue collection efforts. We have surveyed recent primary literature sources and conducted several site visits to understand the most error-prone processes in histology laboratories. This effort identified errors that resulted from sample collection techniques and subsequent transport delays from the operating room (OR) to the histology laboratories. We have therefore devised a prototype sample collection and transport concept. The system consists of a custom data logger and cold transport box and takes advantage of a novel cold + warm (named 2 + 2) fixation method. This review highlights the beneficial aspects of standardizing tissue collection, fixation, and monitoring. In addition, a prototype system is introduced that could help standardize these processes and is compatible with use directly in the OR and from remote sites.
2004-06-01
Information Systems, Faculty of ICT, International Islamic University, Malaysia. Several techniques for evaluating a groupware... inspection-based techniques couldn't be carried out in other parts of Pakistan, where the IT industry has mushroomed in the past few years. Nevertheless... there are no set standards for using any particular technique. Evaluating a groupware interface is an evolving process and requires more investigation.
NASA Astrophysics Data System (ADS)
Łazarek, Łukasz; Antończak, Arkadiusz J.; Wójcik, Michał R.; Drzymała, Jan; Abramski, Krzysztof M.
2014-07-01
Laser-induced breakdown spectroscopy (LIBS), like many other spectroscopic techniques, is a comparative method. Typically, in qualitative analysis, a synthetic certified standard with a well-known elemental composition is used to calibrate the system. Nevertheless, in all laser-induced techniques, such calibration can affect the accuracy through differences in the overall composition of the chosen standard. There are also some intermediate factors which can cause imprecision in measurements, such as optical absorption, surface structure and thermal conductivity. In this work the calibration performed for the LIBS technique utilizes pellets made directly from the tested materials (old, well-characterized samples). This choice produces a considerable improvement in the accuracy of the method. The technique was adopted for the determination of trace elements in industrial copper concentrates, standardized by conventional atomic absorption spectroscopy with a flame atomizer. A series of copper flotation concentrate samples was analyzed for three elements: silver, cobalt and vanadium. We also propose a method of post-processing the measurement data to minimize matrix effects and permit reliable analysis. It has been shown that the described technique can be used in qualitative and quantitative analyses of complex inorganic materials, such as copper flotation concentrates. It was noted that the final validation of such a methodology is limited mainly by the accuracy of the characterization of the standards.
NASA Astrophysics Data System (ADS)
Anees, Amir; Khan, Waqar Ahmad; Gondal, Muhammad Asif; Hussain, Iqtadar
2013-07-01
The aim of this work is to make use of the mean absolute deviation (MAD) method for the evaluation of substitution boxes used in the Advanced Encryption Standard. In this paper, we use the MAD technique to analyze some popular and prevailing substitution boxes used in encryption processes. In particular, MAD is applied to the Advanced Encryption Standard (AES), affine power affine (APA), Gray, Lui J., Residue Prime, S8 AES, SKIPJACK, and Xyi substitution boxes.
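The MAD statistic itself is a one-line formula. The sketch below computes it for an illustrative 4-bit S-box; note that MAD of the raw outputs is constant for any bijective S-box, so the paper presumably applies it to derived quantities rather than the outputs alone. The sketch only illustrates the formula, not the paper's full evaluation procedure.

```python
import numpy as np

def mean_absolute_deviation(values) -> float:
    """MAD: the average absolute distance of each value from the mean."""
    values = np.asarray(values, dtype=np.float64)
    return float(np.mean(np.abs(values - values.mean())))

# Illustrative 4-bit S-box (a permutation of 0..15; not any standard S-box):
sbox = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]
print(mean_absolute_deviation(sbox))  # 4.0 for any permutation of 0..15
```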
41 CFR 102-118.35 - What definitions apply to this part?
Code of Federal Regulations, 2010 CFR
2010-07-01
... published formats and codes as authorized by the applicable Federal Information Processing Standards... techniques for carrying out transportation transactions using electronic transmissions of the information...
Recovery of hygiene water by multifiltration. [in space shuttle orbiters
NASA Technical Reports Server (NTRS)
Putnam, David F.; Jolly, Clifford D.; Colombo, Gerald V.; Price, Don
1989-01-01
A multifiltration hygiene water reclamation process that utilizes adsorption and particulate filtration techniques is described and evaluated. The applicability of the process is tested using a simulation of a 4-man subsystem operation for 240 days. It is proposed that the process has a 10-year life, weighs 236 kg, and uses 88 kg of expendable filters and adsorption beds to process 8424 kg of water. The data reveal that multifiltration is an efficient non-phase-change technique for hygiene water recovery and that the chemical and microbiological purity of the product water is within the standards specified for Space Station hygiene water.
The manual lists and describes the instruments and techniques that are available for measuring the concentration or size distribution of particles suspended in process streams. The standard, official, well established methods are described as well as some experimental methods and...
Effective Report Preparation: Streamlining the Reporting Process. AIR 1999 Annual Forum Paper.
ERIC Educational Resources Information Center
Dalrymple, Margaret; Wang, Mindy; Frost, Jacquelyn
This paper describes the processes and techniques used to improve and streamline the standard student reports used at Purdue University (Indiana). Various models for analyzing reporting processes are described, especially the model used in the study, the Shewhart or Deming Cycle, a method that aids in continuous analysis and improvement through a…
Application of AIS Technology to Forest Mapping
NASA Technical Reports Server (NTRS)
Yool, S. R.; Star, J. L.
1985-01-01
Concerns about the environmental effects of large-scale deforestation have prompted efforts to map forests over large areas using various remote sensing data and image processing techniques. Basic research on the spectral characteristics of forest vegetation is required to form a basis for the development of new techniques and for image interpretation. Examination of LANDSAT data and image processing algorithms over a portion of boreal forest has demonstrated the complexity of the relations between the various expressions of forest canopies, environmental variability, and the relative capacities of different image processing algorithms to achieve high classification accuracies under these conditions. Airborne Imaging Spectrometer (AIS) data may in part provide the means to interpret the responses of standard data and techniques to the vegetation, owing to their relatively high spectral resolution.
NASA Astrophysics Data System (ADS)
Wang, Huarui; Shen, Jianqi
2014-05-01
The size of nanoparticles is measured by laser diode self-mixing interferometry, which employs a sensitive, compact, and simple optical setup. However, the signal processing for this interferometry is slow or expensive. In this article, a fast and economical signal processing technique is introduced, in which the self-mixing AC signal is transformed into DC signals with an analog circuit consisting of 16 channels. These DC signals are obtained as a spectrum from which the size of nanoparticles can be retrieved. The technique is verified by measuring standard nanoparticles. Further experiments are performed to compare skimmed milk with whole milk, and fresh skimmed milk with rotten skimmed milk.
The development of a method of producing etch resistant wax patterns on solar cells
NASA Technical Reports Server (NTRS)
Pastirik, E.
1980-01-01
A potentially attractive technique for wax masking of solar cells prior to etching processes was studied. The technique made use of a reusable wax composition applied to the solar cell in patterned form by means of a letterpress printing method. After standard wet etching, wax removal by means of hot water was investigated. Application of the letterpress wax printing process to silicon met with a number of difficulties. The most serious shortcoming of the process was its inability to produce consistently well-defined printed patterns on the hard silicon cell surface.
Dynamic Environmental Qualification Techniques.
1981-12-01
environments peculiar to military operations and requirements. Numerous dynamic qualification test methods have been established. It was the purpose... requires the achievement of the highest practicable degree in the standardization of items, materials and engineering practices within the... standard is described as "A document that establishes engineering and technical requirements for processes, procedures, practices and methods that have
Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...
2016-07-05
Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validation of analytical techniques highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.
Microcoupon Assay Of Adhesion And Growth Of Bacterial Films
NASA Technical Reports Server (NTRS)
Pierson, Duane L.; Koenig, David W.
1994-01-01
A microbiological assay technique facilitates determination of some characteristics of sessile bacteria, such as those that attach to and coat the interior walls of water-purification systems. Biofilms cause sickness and interfere with the purification process. The technique enables direct measurement of the rate of attachment of bacterial cells, their metabolism, and the effects of chemicals on them. It has been used to quantify the effects of both bactericides and growth-stimulating agents, and in place of older standard plate-count and tube-dilution techniques.
Label free sensing of creatinine using a 6 GHz CMOS near-field dielectric immunosensor.
Guha, S; Warsinke, A; Tientcheu, Ch M; Schmalz, K; Meliani, C; Wenger, Ch
2015-05-07
In this work we present a CMOS high frequency direct immunosensor operating at 6 GHz (C-band) for label-free determination of creatinine. The sensor is fabricated in a standard 0.13 μm SiGe:C BiCMOS process. The report also demonstrates the ability to immobilize creatinine molecules on a Si3N4 passivation layer of the standard BiCMOS/CMOS process, thereby avoiding any need for cumbersome post-processing of the fabricated sensor chip. The sensor is based on capacitive detection of the amount of non-creatinine-bound antibodies binding to an immobilized creatinine layer on the passivated sensor. The chip-bound antibody amount in turn corresponds indirectly to the creatinine concentration used in the incubation phase. The determination of creatinine in the concentration range of 0.88-880 μM is successfully demonstrated in this work. A sensitivity of 35 MHz per 10-fold increase in creatinine concentration (during incubation) at the centre frequency of 6 GHz is achieved by the immunosensor. The results are compared with a standard optical measurement technique, and the dynamic range and sensitivity are of the order of those of the established optical indication technique. The C-band immunosensor chip, comprising an area of 0.3 mm², reduces the sensing area considerably, requiring a sample volume as low as 2 μl. The small analyte sample volume and label-free approach also reduce the experimental costs, in addition to the low fabrication costs offered by the batch fabrication of the CMOS/BiCMOS process.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-02
... Flat Wood Paneling Surface Coating Processes AGENCY: Environmental Protection Agency (EPA). ACTION... by EPA's Control Techniques Guidelines (CTG) standards for flat wood paneling surface coating processes. EPA is approving this revision concerning the adoption of the EPA CTG requirements for flat wood...
An array processing system for lunar geochemical and geophysical data
NASA Technical Reports Server (NTRS)
Eliason, E. M.; Soderblom, L. A.
1977-01-01
A computerized array processing system has been developed to reduce, analyze, display, and correlate a large number of orbital and earth-based geochemical, geophysical, and geological measurements of the moon on a global scale. The system supports the activities of a consortium of about 30 lunar scientists involved in data synthesis studies. The system was modeled after standard digital image-processing techniques but differs in that processing is performed with floating point precision rather than integer precision. Because of flexibility in floating-point image processing, a series of techniques that are impossible or cumbersome in conventional integer processing were developed to perform optimum interpolation and smoothing of data. Recently color maps of about 25 lunar geophysical and geochemical variables have been generated.
Align-and-shine photolithography
NASA Astrophysics Data System (ADS)
Petrusis, Audrius; Rector, Jan H.; Smith, Kristen; de Man, Sven; Iannuzzi, Davide
2009-10-01
At the beginning of 2009, our group introduced a new technique that allows fabrication of photolithographic patterns on the cleaved end of an optical fibre: the align-and-shine photolithography technique (see A. Petrušis et al., "The align-and-shine technique for series production of photolithography patterns on optical fibres", J. Micromech. Microeng. 19, 047001, 2009). Align-and-shine photolithography combines standard optical lithography with image-based active fibre alignment processes. The technique adapts well to series production, opening the way to batch fabrication of fibre-top devices (D. Iannuzzi et al., "Monolithic fibre-top cantilever for critical environments and standard applications", Appl. Phys. Lett. 88, 053501, 2006) and all other devices that rely on suitable machining of engineered parts on the tip of a fibre. In this paper we review our results and briefly discuss its potential applications.
Acoustical standards in engineering acoustics
NASA Astrophysics Data System (ADS)
Burkhard, Mahlon D.
2004-05-01
The Engineering Acoustics Technical Committee is concerned with the evolution and improvement of acoustical techniques and apparatus, and with the promotion of new applications of acoustics. As cited in the Membership Directory and Handbook (2002), the interest areas include transducers and arrays; underwater acoustic systems; acoustical instrumentation and monitoring; applied sonics, promotion of useful effects, information gathering and transmission; audio engineering; acoustic holography and acoustic imaging; acoustic signal processing (equipment and techniques); and ultrasound and infrasound. Evident connections between engineering and standards include needs for calibration, consistent terminology, uniform presentation of data, reference levels, and design targets for product development. Thus, for the acoustical engineer, standards are a tool for practice, for communication, and for comparison of one's efforts with those of others. Development of many standards depends on knowledge of the way products are put together for the marketplace, and acoustical engineers provide important input to the development of standards. Acoustical engineers and members of the Engineering Acoustics arm of the Society both benefit from and contribute to the Acoustical Standards of the Acoustical Society.
Hajihosseini, Payman; Anzehaee, Mohammad Mousavi; Behnam, Behzad
2018-05-22
Early fault detection and isolation in industrial systems is a critical factor in preventing equipment damage. In the proposed method, instead of using the time signals of the sensors directly, the 2D image obtained by placing these signals next to each other in a matrix has been used; a novel fault detection and isolation procedure has then been carried out based on image processing techniques. Different features, including texture, wavelet transform, and the mean and standard deviation of the image, together with MLP and RBF neural network classifiers, have been used for this purpose. The results obtained indicate the notable efficacy and success of the proposed method in detecting and isolating faults of the Tennessee Eastman benchmark process, and its superiority over previous techniques. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
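A minimal sketch of the signal-to-image idea described above: sensor time series are stacked row-wise into a grey-level matrix, from which simple mean and standard-deviation features are extracted. The scaling scheme and feature set are simplified assumptions; the paper also uses texture and wavelet features, and feeds the features to MLP and RBF classifiers.

```python
import numpy as np

def signals_to_image(signals: np.ndarray) -> np.ndarray:
    """Stack sensor time series row-wise into a 2D matrix and scale
    it to the 0..255 grey-level range (a simplified normalization)."""
    lo, hi = signals.min(), signals.max()
    img = (signals - lo) / max(hi - lo, 1e-12)
    return (img * 255).astype(np.uint8)

def basic_features(img: np.ndarray) -> np.ndarray:
    """Mean and standard deviation of grey levels; the paper's full
    feature set (texture, wavelets) is omitted here for brevity."""
    return np.array([img.mean(), img.std()])

rng = np.random.default_rng(0)
signals = rng.normal(size=(52, 500))  # e.g. 52 Tennessee Eastman channels
features = basic_features(signals_to_image(signals))
print(features)
```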
Study of Variable Frequency Induction Heating in Steel Making Process
NASA Astrophysics Data System (ADS)
Fukutani, Kazuhiko; Umetsu, Kenji; Itou, Takeo; Isobe, Takanori; Kitahara, Tadayuki; Shimada, Ryuichi
Induction heating technologies have been standard in steel making processes because they are clean, have a high energy density, and are highly controllable. However, there is a problem in using them: in general, the frequencies of the electric circuits have to be kept fixed to improve their power factors, and this constraint makes the processes inflexible. In order to overcome this problem, we have developed a new heating technique: a variable frequency power supply with magnetic energy recovery switching. This technique helps improve the quality of steel products as well as productivity. We have also performed numerical calculations and experiments to evaluate its effect on the temperature distributions of heated steel plates. The obtained results indicate that the application of the technique in steel making processes would be advantageous.
A Cost-effective and Reliable Method to Predict Mechanical Stress in Single-use and Standard Pumps
Dittler, Ina; Dornfeld, Wolfgang; Schöb, Reto; Cocke, Jared; Rojahn, Jürgen; Kraume, Matthias; Eibl, Dieter
2015-01-01
Pumps are mainly used when transferring sterile culture broths in biopharmaceutical and biotechnological production processes. However, during the pumping process shear forces occur which can lead to qualitative and/or quantitative product loss. To calculate the mechanical stress with limited experimental expense, an oil-water emulsion system was used, whose suitability for drop size detection in bioreactors has been demonstrated [1]. As drop breakup in the oil-water emulsion system is a function of mechanical stress, drop sizes need to be counted over the experimental time of the shear stress investigations. In previous studies, inline endoscopy has been shown to be an accurate and reliable measurement technique for drop size detection in liquid/liquid dispersions. The aim of this protocol is to show the suitability of the inline endoscopy technique for drop size measurements in pumping processes. In order to express the drop size, the Sauter mean diameter d32 was used as the representative diameter of drops in the oil-water emulsion. The results showed low variation in the Sauter mean diameters, quantified by standard deviations below 15%, indicating the reliability of the measurement technique. PMID:26274765
Milner, Rafał; Rusiniak, Mateusz; Lewandowska, Monika; Wolak, Tomasz; Ganc, Małgorzata; Piątkowska-Janko, Ewa; Bogorodzki, Piotr; Skarżyński, Henryk
2014-01-01
Background: The neural underpinnings of auditory information processing have often been investigated using the odd-ball paradigm, in which infrequent sounds (deviants) are presented within a regular train of frequent stimuli (standards). Traditionally, this paradigm has been applied using either high temporal resolution (EEG) or high spatial resolution (fMRI, PET). However, used separately, these techniques cannot provide information on both the location and time course of particular neural processes. The goal of this study was to investigate the neural correlates of auditory processes with a fine spatio-temporal resolution. A simultaneous auditory evoked potentials (AEP) and functional magnetic resonance imaging (fMRI) technique (AEP-fMRI), together with an odd-ball paradigm, were used. Material/Methods: Six healthy volunteers, aged 20–35 years, participated in an odd-ball simultaneous AEP-fMRI experiment. AEP in response to acoustic stimuli were used to model bioelectric intracerebral generators, and electrophysiological results were integrated with fMRI data. Results: fMRI activation evoked by standard stimuli was found to occur mainly in the primary auditory cortex. Activity in these regions overlapped with intracerebral bioelectric sources (dipoles) of the N1 component. Dipoles of the N1/P2 complex in response to standard stimuli were also found in the auditory pathway between the thalamus and the auditory cortex. Deviant stimuli induced fMRI activity in the anterior cingulate gyrus, insula, and parietal lobes. Conclusions: The present study showed that neural processes evoked by standard stimuli occur predominantly in subcortical and cortical structures of the auditory pathway. Deviants activate areas non-specific for auditory information processing. PMID:24413019
A real time quality control application for animal production by image processing.
Sungur, Cemil; Özkan, Halil
2015-11-01
Standards of hygiene and health are of major importance in food production, and quality control has become obligatory in this field. Thanks to rapidly developing technologies, automatic and safe quality control of food production is now possible. For this purpose, image-processing-based quality control systems used in industrial applications are being employed to analyze the quality of food products. In this study, quality control of chicken (Gallus domesticus) eggs was achieved using a real-time image-processing technique. To execute the quality control processes, a conveying mechanism was used. Eggs passing on a conveyor belt were continuously photographed in real time by cameras located above the belt. The images obtained were processed by various methods and techniques. Using digital instrumentation, the volume of the eggs was measured, broken/cracked eggs were separated, and dirty eggs were identified. In accordance with international standards for classifying egg quality, the class of the separated eggs was determined through a fuzzy implication model. According to tests carried out on thousands of eggs, a quality control process with an accuracy of 98% was possible. © 2014 Society of Chemical Industry.
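A toy sketch of threshold-based defect screening in the spirit of this abstract; the thresholds and the two-rule classifier are invented for illustration, whereas the published system measures volume and applies a fuzzy implication model over several attributes.

```python
import numpy as np

def dirt_ratio(gray: np.ndarray, dark_threshold: int = 60) -> float:
    """Fraction of egg-surface pixels darker than a threshold;
    a high ratio suggests dirt or shell damage (threshold is illustrative)."""
    return float((gray < dark_threshold).mean())

def classify_egg(gray: np.ndarray) -> str:
    """Hypothetical two-rule classifier standing in for the paper's
    fuzzy implication model."""
    r = dirt_ratio(gray)
    if r > 0.10:
        return "reject (dirty/cracked)"
    return "class A" if r < 0.02 else "class B"

egg = np.full((100, 100), 180, dtype=np.uint8)  # bright, clean shell
egg[40:60, 20:80] = 30                          # simulate a dark dirty region
print(classify_egg(egg))                        # reject (dirty/cracked)
```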
A primer on standards setting as it applies to surgical education and credentialing.
Cendan, Juan; Wier, Daryl; Behrns, Kevin
2013-07-01
Surgical technological advances in the past three decades have led to dramatic reductions in the morbidity associated with abdominal procedures and permanently altered the surgical practice landscape. Significant changes continue apace, including surgical robotics, natural orifice-based surgery, and single-incision approaches. These disruptive technologies have on occasion been injurious to patients, and high-stakes assessment before adoption of new technologies would be reasonable. We reviewed the drivers for the well-established psychometric techniques available for the standards-setting process. We present a series of examples that are relevant in the surgical domain, including standards setting for knowledge and skills assessments. Defensible standards for knowledge and procedural skills will likely become part of surgical clinical practice. Understanding the methodology for determining standards should position the surgical community to assist in the process and lead within their clinical settings as standards are considered that may affect patient safety and physician credentialing.
50 CFR 600.330 - National Standard 5-Efficiency.
Code of Federal Regulations, 2010 CFR
2010-10-01
...) Efficiency. In theory, an efficient fishery would harvest the OY with the minimum use of economic inputs such... techniques of harvesting, processing, or marketing, and should avoid creating strong incentives for excessive...
Recent Advances in Techniques for Hyperspectral Image Processing
NASA Technical Reports Server (NTRS)
Plaza, Antonio; Benediktsson, Jon Atli; Boardman, Joseph W.; Brazile, Jason; Bruzzone, Lorenzo; Camps-Valls, Gustavo; Chanussot, Jocelyn; Fauvel, Mathieu; Gamba, Paolo; Gualtieri, Anthony;
2009-01-01
Imaging spectroscopy, also known as hyperspectral imaging, has been transformed in less than 30 years from being a sparse research tool into a commodity product available to a broad user community. Currently, there is a need for standardized data processing techniques able to take into account the special properties of hyperspectral data. In this paper, we provide a seminal view on recent advances in techniques for hyperspectral image processing. Our main focus is on the design of techniques able to deal with the high-dimensional nature of the data and to integrate the spatial and spectral information. Performance of the discussed techniques is evaluated in different analysis scenarios. To satisfy time-critical constraints in specific applications, we also develop efficient parallel implementations of some of the discussed algorithms. Combined, these parts provide an excellent snapshot of the state-of-the-art in those areas, and offer a thoughtful perspective on future potentials and emerging challenges in the design of robust hyperspectral imaging algorithms.
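As one concrete example of taming the high-dimensional nature of hyperspectral data, the sketch below projects a cube onto its leading principal components. PCA is only one of many techniques the paper surveys, and the function and cube dimensions here are illustrative.

```python
import numpy as np

def pca_reduce(cube: np.ndarray, n_components: int = 10) -> np.ndarray:
    """Project a hyperspectral cube (rows x cols x bands) onto its
    leading principal components, a common first dimensionality step."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(np.float64)
    X -= X.mean(axis=0)                       # center each band
    cov = np.cov(X, rowvar=False)             # bands x bands covariance
    eigvals, eigvecs = np.linalg.eigh(cov)
    top = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]
    return (X @ top).reshape(rows, cols, n_components)

cube = np.random.default_rng(2).normal(size=(64, 64, 200))  # synthetic scene
print(pca_reduce(cube, 10).shape)  # (64, 64, 10)
```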
NASA Astrophysics Data System (ADS)
Paek, Seung Weon; Kang, Jae Hyun; Ha, Naya; Kim, Byung-Moo; Jang, Dae-Hyun; Jeon, Junsu; Kim, DaeWook; Chung, Kun Young; Yu, Sung-eun; Park, Joo Hyun; Bae, SangMin; Song, DongSup; Noh, WooYoung; Kim, YoungDuck; Song, HyunSeok; Choi, HungBok; Kim, Kee Sup; Choi, Kyu-Myung; Choi, Woonhyuk; Jeon, JoongWon; Lee, JinWoo; Kim, Ki-Su; Park, SeongHo; Chung, No-Young; Lee, KangDuck; Hong, YoungKi; Kim, BongSeok
2012-03-01
A set of design for manufacturing (DFM) techniques has been developed and applied to 45nm, 32nm and 28nm logic process technologies. A novel methodology combined a number of potentially conflicting DFM techniques into a comprehensive solution. These techniques work in three phases for design optimization and one phase for silicon diagnostics. In the DFM prevention phase, foundation IP such as standard cells, IO, and memory, and the P&R tech file, are optimized. In the DFM solution phase, which happens during the ECO step, auto-fixing of process-weak patterns and advanced RC extraction are performed. In the DFM polishing phase, post-layout tuning is done to improve manufacturability. DFM analysis enables prioritization of random and systematic failures. The DFM technique presented in this paper has been silicon-proven with three successful tape-outs in Samsung 32nm processes; about 5% improvement in yield was achieved without any notable side effects. Visual inspection of silicon also confirmed the positive effect of the DFM techniques.
ERIC Educational Resources Information Center
Lindle, Jane Clark; Stalion, Nancy; Young, Lu
2005-01-01
Kentucky's accountability system includes a school-processes audit known as Standards and Indicators for School Improvement (SISI), which is in a nascent stage of validation. Content validity methods include comparison to instruments measuring similar constructs as well as other techniques such as job analysis. This study used a two-phase process…
Application of the Near Miss Strategy and Edit Distance to Handle Dirty Data
NASA Astrophysics Data System (ADS)
Varol, Cihan; Bayrak, Coskun; Wagner, Rick; Goff, Dana
In today’s information age, processing customer information in a standardized and accurate manner is known to be a difficult task. Data collection methods vary from source to source in format, volume, and media type. Therefore, it is advantageous to deploy customized data hygiene techniques to standardize the data for meaningfulness and usefulness within the organization.
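The near-miss idea pairs naturally with Levenshtein edit distance. A minimal sketch, assuming a dictionary of canonical values and a maximum allowed distance of 2 (both illustrative, not parameters from the paper):

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character edits turning a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def standardize(value: str, dictionary: list, max_dist: int = 2) -> str:
    """Replace a dirty field with its nearest dictionary entry when
    the near miss falls within the allowed edit distance."""
    best = min(dictionary, key=lambda w: levenshtein(value, w))
    return best if levenshtein(value, best) <= max_dist else value

print(standardize("Arkansa", ["Arkansas", "Kansas", "Arizona"]))  # Arkansas
```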
NASA Technical Reports Server (NTRS)
Leveson, Nancy
1987-01-01
Software safety and its relationship to other qualities are discussed. It is shown that standard reliability and fault tolerance techniques will not, for the present, solve the safety problem. A new attitude is required: looking at what you do NOT want the software to do along with what you want it to do, and assuming things will go wrong. New procedures and changes to the entire software development process are necessary: special software safety analysis techniques are needed, and design techniques, especially eliminating complexity, can be very helpful.
2007-12-01
Poka-yoke... Systems for... Standard operating procedures • Visual displays for workflow and communication • Total productive maintenance • Poka-yoke techniques to prevent... process step or eliminating non-value-added steps, and reducing the seven common wastes, will decrease the total time of a process.
A Distributed Processing Approach to Payroll Time Reporting for a Large School District.
ERIC Educational Resources Information Center
Freeman, Raoul J.
1983-01-01
Describes a system for payroll reporting from geographically disparate locations in which data is entered, edited, and verified locally on minicomputers and then uploaded to a central computer for the standard payroll process. Communications and hardware, time-reporting software, data input techniques, system implementation, and its advantages are…
Rehearsal Processes in Children's Memory.
ERIC Educational Resources Information Center
Ornstein, Peter A.; Liberty, Charles
This study investigates developmental trends in free recall, with emphasis on rehearsal processes. An overt rehearsal technique was used in which 28 children in grades 3, 6, and 8 were instructed to rehearse out loud while trying to memorize a list of unrelated nouns. Control groups at each age level received standard free recall instructions,…
How To Better Track Effective School Indicators: The Control Chart Techniques.
ERIC Educational Resources Information Center
Coutts, Douglas
1998-01-01
Control charts are practical tools to monitor various school indicators (attendance rates, standardized test scores, grades, and graduation rates) by displaying data on the same scale over time. This article shows how principals can calculate the upper natural-process limit, lower natural-process limit, and upper control limit for attendance. (15…
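A sketch of the attendance-chart arithmetic using the standard individuals (XmR) chart constants 2.66 and 3.268; the article's exact formulas are not reproduced here, and the attendance figures are invented for illustration.

```python
import numpy as np

def xmr_limits(x) -> dict:
    """Individuals (XmR) chart limits for a series of observations.

    2.66 and 3.268 are the conventional XmR constants for moving ranges
    of size 2; the article's notation may differ.
    """
    x = np.asarray(x, dtype=float)
    mr_bar = np.abs(np.diff(x)).mean()   # average moving range
    center = x.mean()
    return {"UNPL": center + 2.66 * mr_bar,   # upper natural-process limit
            "LNPL": center - 2.66 * mr_bar,   # lower natural-process limit
            "UCL_mR": 3.268 * mr_bar}         # upper control limit for ranges

attendance = [94.2, 95.1, 93.8, 94.7, 95.4, 93.9, 94.5]  # hypothetical rates (%)
print(xmr_limits(attendance))
```

Points falling outside the natural-process limits signal a change worth investigating rather than routine variation, which is the monitoring use the article describes.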
Cellulase producing microorganism ATCC 55702
Dees, H. Craig
1997-01-01
Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques.
Cellulase-containing cell-free fermentate produced from microorganism ATCC 55702
Dees, H. Craig
1997-12-16
Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques.
Cellulase producing microorganism ATCC 55702
Dees, H.C.
1997-12-30
Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques. 5 figs.
Ensemble analyses improve signatures of tumour hypoxia and reveal inter-platform differences
2014-01-01
Background: The reproducibility of transcriptomic biomarkers across datasets remains poor, limiting clinical application. We and others have suggested that this is in part caused by differential error-structure between datasets, and its incomplete removal by pre-processing algorithms. Methods: To test this hypothesis, we systematically assessed the effects of pre-processing on biomarker classification using 24 different pre-processing methods and 15 distinct signatures of tumour hypoxia in 10 datasets (2,143 patients). Results: We confirm strong pre-processing effects for all datasets and signatures, and find that these differ between microarray versions. Importantly, exploiting different pre-processing techniques in an ensemble technique improved classification for a majority of signatures. Conclusions: Assessing biomarkers using an ensemble of pre-processing techniques shows clear value across multiple diseases, datasets and biomarkers. Importantly, ensemble classification improves biomarkers with initially good results but does not result in spuriously improved performance for poor biomarkers. While further research is required, this approach has the potential to become a standard for transcriptomic biomarkers. PMID:24902696
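A minimal sketch of the ensemble idea: classify one profile under several pre-processing pipelines and combine the calls by majority vote. The pipelines and the threshold "signature" classifier below are toy stand-ins, not the study's 15 hypoxia signatures or its exact combination rule.

```python
import numpy as np

def ensemble_classify(sample, pipelines, classifier):
    """Run one expression profile through several pre-processing
    pipelines and take the majority vote over the resulting calls."""
    votes = [classifier(p(sample)) for p in pipelines]
    return max(set(votes), key=votes.count)

# Illustrative pipelines and a toy threshold 'signature' classifier:
pipelines = [
    lambda x: np.log2(x + 1),                       # log transform
    lambda x: (x - x.mean()) / x.std(),             # z-score
    lambda x: np.argsort(np.argsort(x)) / len(x),   # rank transform
]
classifier = lambda x: int(x[:5].mean() > x[5:].mean())  # toy gene signature

sample = np.random.default_rng(1).lognormal(size=20)
print(ensemble_classify(sample, pipelines, classifier))  # 0 or 1
```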
Sample and data processing considerations for the NIST quantitative infrared database
NASA Astrophysics Data System (ADS)
Chu, Pamela M.; Guenther, Franklin R.; Rhoderick, George C.; Lafferty, Walter J.; Phillips, William
1999-02-01
Fourier-transform infrared (FT-IR) spectrometry has become a useful real-time in situ analytical technique for quantitative gas phase measurements. In fact, the U.S. Environmental Protection Agency (EPA) has recently approved open-path FT-IR monitoring for the determination of hazardous air pollutants (HAP) identified in EPA's Clean Air Act of 1990. To support infrared-based sensing technologies, the National Institute of Standards and Technology (NIST) is currently developing a standard quantitative spectral database of the HAPs based on gravimetrically prepared standard samples. The procedures developed to ensure the quantitative accuracy of the reference data are discussed, including sample preparation, residual sample contaminants, data processing considerations, and estimates of error.
Iterative categorization (IC): a systematic technique for analysing qualitative data
2016-01-01
The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. PMID:26806155
SHAW, Anugrah; COLEONE-CARVALHO, Ana Carla; HOLLINGSHURST, Julien; DRAPER, Michael; MACHADO NETO, Joaquim Gonçalves
2017-01-01
A collaborative approach, involving resources and expertise from several countries, was used to develop a test cell to measure cumulative permeation by a solid-state collection technique. The new technique was developed to measure the permeation of pesticide active ingredients and other chemicals with low vapor pressure that would otherwise be difficult to test via standard techniques. The development process is described and the results from the final chosen test method are reported. Inter-laboratory studies were conducted to further refine the new method and determine repeatability and reliability. The revised test method has been approved as a new ISO/EN standard to measure permeation of chemicals with low vapor pressure and/or solubility in water. PMID:29033403
Novel secret key generation techniques using memristor devices
NASA Astrophysics Data System (ADS)
Abunahla, Heba; Shehada, Dina; Yeun, Chan Yeob; Mohammad, Baker; Jaoude, Maguy Abi
2016-02-01
This paper proposes novel secret key generation techniques using memristor devices. The approach depends on using the initial profile of a memristor as a master key. In addition, session keys are generated using the master key and other specified parameters. In contrast to existing memristor-based security approaches, the proposed development is cost-effective and power-efficient, since the operation can be achieved with a single device rather than a crossbar structure. An algorithm is suggested and demonstrated using a physics-based MATLAB model. It is shown that the generated keys can have dynamic size, which provides perfect security. Moreover, the proposed encryption and decryption technique using the memristor-based generated keys outperforms Triple Data Encryption Standard (3DES) and Advanced Encryption Standard (AES) in terms of processing time. This paper is enriched by providing characterization results of a fabricated microscale Al/TiO2/Al memristor prototype in order to prove the concept of the proposed approach and to study the impacts of process variations. The work proposed in this paper is a milestone towards System On Chip (SOC) memristor-based security.
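A generic sketch of the master-key-to-session-key pattern the abstract describes, using an HMAC-based expand loop in place of the paper's memristor model; in the paper the master key would be the device's measured initial profile, and every name and parameter here is an illustrative assumption.

```python
import hashlib
import hmac
import os

def session_key(master_key: bytes, session_params: bytes,
                length: int = 32) -> bytes:
    """Derive a session key of the requested length from a master key
    and per-session parameters (HKDF-expand-style loop; a stand-in for
    the paper's memristor-based derivation)."""
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(master_key,
                         block + session_params + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

master = os.urandom(32)  # stand-in for a measured device profile
print(session_key(master, b"session-001", length=16).hex())
```

The dynamic key size the abstract mentions corresponds here to the caller-chosen `length` parameter; the security claims, of course, rest on the memristor profile's physical unclonability rather than on this generic derivation.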
Amberg, Alexander; Barrett, Dave; Beale, Michael H.; Beger, Richard; Daykin, Clare A.; Fan, Teresa W.-M.; Fiehn, Oliver; Goodacre, Royston; Griffin, Julian L.; Hankemeier, Thomas; Hardy, Nigel; Harnly, James; Higashi, Richard; Kopka, Joachim; Lane, Andrew N.; Lindon, John C.; Marriott, Philip; Nicholls, Andrew W.; Reily, Michael D.; Thaden, John J.; Viant, Mark R.
2013-01-01
There is a general consensus supporting the need for standardized reporting of metadata or information describing large-scale metabolomics and other functional genomics data sets. Reporting of standard metadata provides a biological and empirical context for the data, facilitates experimental replication, and enables the re-interrogation and comparison of data by others. Accordingly, the Metabolomics Standards Initiative is building a general consensus concerning the minimum reporting standards for metabolomics experiments; the Chemical Analysis Working Group (CAWG) is part of this community effort. This article proposes the minimum reporting standards related to the chemical analysis aspects of metabolomics experiments, including: sample preparation, experimental analysis, quality control, metabolite identification, and data pre-processing. These minimum standards currently focus mostly upon mass spectrometry and nuclear magnetic resonance spectroscopy due to the popularity of these techniques in metabolomics. However, additional input concerning other techniques is welcomed and can be provided via the CAWG on-line discussion forum at http://msi-workgroups.sourceforge.net/ or http://Msi-workgroups-feedback@lists.sourceforge.net. Further, community input related to this document can also be provided via this electronic forum. PMID:24039616
Real-time catheter localization and visualization using three-dimensional echocardiography
NASA Astrophysics Data System (ADS)
Kozlowski, Pawel; Bandaru, Raja Sekhar; D'hooge, Jan; Samset, Eigil
2017-03-01
Real-time three-dimensional transesophageal echocardiography (RT3D-TEE) is increasingly used during minimally invasive cardiac surgeries (MICS). In many cath labs, RT3D-TEE is already one of the requisite tools for image guidance during MICS. However, the visualization of the catheter is not always satisfactory, making 3D-TEE challenging to use as the only modality for guidance. We propose a novel technique for better visualization of the catheter along with the cardiac anatomy using TEE alone, exploiting both beamforming and post-processing methods. We extended our earlier method, called Delay and Standard Deviation (DASD) beamforming, to 3D in order to enhance specular reflections. The beamformed image was further post-processed by the Frangi filter to segment the catheter. Multivariate visualization techniques enabled us to render both the standard tissue image and the DASD beamformed image on a clinical ultrasound scanner simultaneously. A frame rate of 15 FPS was achieved.
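The post-processing stage can be illustrated with a standard vesselness filter. Below is a minimal sketch assuming the scikit-image Frangi filter and invented parameters; the authors' DASD beamforming stage is not reproduced, and the random volume merely stands in for beamformed 3D-TEE data.

```python
# Hedged sketch: enhance a tubular, catheter-like structure in a 3D volume
# with the Frangi vesselness filter, then threshold to get a rough mask.
# Parameters and the stand-in volume are assumptions, not the authors'
# DASD-beamformed data or settings.
import numpy as np
from skimage.filters import frangi

volume = np.random.rand(64, 64, 64)                   # stand-in 3D-TEE volume
vesselness = frangi(volume, sigmas=range(1, 4), black_ridges=False)
catheter_mask = vesselness > 0.5 * vesselness.max()   # crude threshold
print(catheter_mask.sum(), "voxels flagged")
```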
Development of polypyrrole based solid-state on-chip microactuators using photolithography
NASA Astrophysics Data System (ADS)
Zhong, Yong; Lundemo, Staffan; Jager, Edwin W. H.
2018-07-01
There is a need for soft microactuators, especially for biomedical applications. We have developed a microfabrication process to create soft, on-chip polymer-based microactuators that can operate in air. The on-chip microactuators were fabricated using standard photolithographic techniques and wet etching, combined with a specially designed process to micropattern the electroactive polymer polypyrrole that drives the microactuators. By immobilizing a UV-patternable gel containing a liquid electrolyte on top of the electroactive polypyrrole layer, actuation in air was achieved, although with reduced movement. Further optimization of the processing is currently ongoing. The results show the possibility of batch-fabricating complex microsystems, such as microrobots and micromanipulators, based on these solid-state on-chip microactuators using microfabrication methods including standard photolithographic processes.
Evaluation of standard methods for collecting and processing fuel moisture samples
Sally M. Haase; José Sánchez; David R. Weise
2016-01-01
A variety of techniques for collecting and processing samples to determine the moisture content of wildland fuels in support of fire management activities were evaluated. The effects of using a chainsaw or handsaw to collect samples of large-diameter wood, containers for storing and transporting collected samples, and quick-response ovens for estimating moisture content...
ERIC Educational Resources Information Center
Barajas-Saavedra, Arturo; Álvarez-Rodriguez, Francisco J.; Mendoza-González, Ricardo; Oviedo-De-Luna, Ana C.
2015-01-01
Development of digital resources is difficult due to their particular complexity, which relies on pedagogical aspects. Another difficulty is the lack of well-defined development processes, documented experiences, and standard methodologies to guide and organize game development. Added to this, there is no documented technique to ensure correct…
Warner, Courtney J; Walsh, Daniel B; Horvath, Alexander J; Walsh, Teri R; Herrick, Daniel P; Prentiss, Steven J; Powell, Richard J
2013-11-01
Lean process improvement techniques are used in industry to improve efficiency and quality while controlling costs. These techniques are less commonly applied in health care. This study assessed the effectiveness of Lean principles on first case on-time operating room starts and quantified effects on resident work hours. Standard process improvement techniques (DMAIC methodology: define, measure, analyze, improve, control) were used to identify causes of delayed vascular surgery first case starts. Value stream maps and process flow diagrams were created. Process data were analyzed with Pareto and control charts. High-yield changes were identified and simulated in computer and live settings prior to implementation. The primary outcome measure was the proportion of on-time first case starts; secondary outcomes included hospital costs, resident rounding time, and work hours. Data were compared with existing benchmarks. Prior to implementation, 39% of first cases started on time. Process mapping identified late resident arrival in preoperative holding as a cause of delayed first case starts. Resident rounding process inefficiencies were identified and changed through the use of checklists, standardization, and elimination of nonvalue-added activity. Following implementation of process improvements, first case on-time starts improved to 71% at 6 weeks (P = .002). Improvement was sustained with an 86% on-time rate at 1 year (P < .001). Resident rounding time was reduced by 33% (from 70 to 47 minutes). At 9 weeks following implementation, these changes generated an opportunity cost potential of $12,582. Use of Lean principles allowed rapid identification and implementation of perioperative process changes that improved efficiency and resulted in significant cost savings. This improvement was sustained at 1 year. Downstream effects included improved resident efficiency with decreased work hours. Copyright © 2013 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
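As an illustration of the reported comparison of on-time start proportions, the following sketch runs a chi-squared test on a 2x2 before/after table; the counts are invented for illustration and are not the study's raw data.

```python
# Hedged sketch with invented counts: does the on-time proportion differ
# before (39/100) and after (71/100) a Lean intervention?
from scipy.stats import chi2_contingency

table = [[39, 61],    # before: on-time, late
         [71, 29]]    # after:  on-time, late
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.4f}")   # small p suggests a real improvement
```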
Laser-assisted selection and passaging of human pluripotent stem cell colonies.
Terstegge, Stefanie; Rath, Barbara H; Laufenberg, Iris; Limbach, Nina; Buchstaller, Andrea; Schütze, Karin; Brüstle, Oliver
2009-09-10
The derivation of somatic cell products from human embryonic stem cells (hESCs) requires a highly standardized production process with sufficient throughput. To date, the most common technique for hESC passaging is the manual dissection of colonies, which is a gentle but laborious and time-consuming process and is consequently inappropriate for standardized maintenance of hESCs. Here, we present a laser-based technique for the contact-free dissection and isolation of living hESCs (laser microdissection and pressure catapulting, LMPC). Following LMPC treatment, 80.6 ± 8.7% of the cells remained viable, compared to 88.6 ± 1.7% of manually dissected hESCs. Furthermore, there was no significant difference in the expression of pluripotency-associated markers when compared to the control. Flow cytometry revealed that 83.8 ± 4.1% of hESCs isolated by LMPC expressed the surface marker Tra-1-60 (control: 83.9 ± 3.6%). The in vitro differentiation potential of LMPC-treated hESCs, as determined by embryoid body formation and multi-germ-layer formation, was not impaired. Moreover, we could not detect any overt karyotype alterations as a result of the LMPC process. Our data demonstrate the feasibility of standardized laser-based passaging of hESC cultures. This technology should facilitate both colony selection and maintenance culture of pluripotent stem cells.
NASA Astrophysics Data System (ADS)
Balbin, Jessie R.; Fausto, Janette C.; Janabajab, John Michael M.; Malicdem, Daryl James L.; Marcelo, Reginald N.; Santos, Jan Jeffrey Z.
2017-06-01
Mango production is highly vital in the Philippines. It is very essential in the food industry as it is used in markets and restaurants daily. The quality of mangoes can affect the income of a mango farmer; thus, harvesting at the wrong time results in the loss of quality mangoes and income. Scientific farming and new instrumentation are needed because mango wastage increases annually due to poor quality. This research paper focuses on profiling and sorting of Mangifera indica using image processing techniques and pattern recognition. The image of a mango is captured on a weekly basis from its early stage. In this study, the researchers monitor the growth and color transition of a mango for profiling purposes. Actual dimensions of the mango are determined through image conversion and determination of pixel and RGB values computed in MATLAB. A program is developed to determine the range of the maximum size of a standard ripe mango. Hue, saturation, lightness (HSL) correction is used in the filtering process to assure the exactness of the RGB values of a mango subject. By pattern recognition techniques, the program can determine whether a mango is standard and ready to be exported.
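A toy version of the size-and-colour profiling step is sketched below in Python rather than MATLAB; the file name, background threshold, and "standard ripe" criteria are all assumptions, not the authors' calibrated values.

```python
# Hedged sketch in Python (the study used MATLAB). File name, threshold,
# and ripeness criteria are hypothetical, not the authors' trained values.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("mango.jpg").convert("RGB"), dtype=float)
mask = img.sum(axis=2) > 120                    # crude figure/background split
size_px = int(mask.sum())                       # fruit area in pixels
r, g, b = (img[..., c][mask].mean() for c in range(3))
is_ripe = (r > g) and (size_px > 50_000)        # invented "standard ripe" rule
print(size_px, (round(r), round(g), round(b)), is_ripe)
```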
Microscopic assessment of the sealing ability of three endodontic filling techniques
Cueva-Goig, Roger; Llena-Puy, Mª Carmen
2016-01-01
Background: Several techniques have been proposed for root canal filling. New rotary files with non-standardized tapers are appearing, so points adapted to the taper of the last instrument used to prepare the canal can help in the obturation process. The aim of this study is to assess the sealing ability of different root canal filling techniques. Material and Methods: Root canals from 30 teeth were shaped with Mtwo and divided into three groups: A, standard lateral condensation with size 35 and 20 gutta-percha points; B, standard lateral condensation and injected gutta-percha; C, single gutta-percha point (standardized 35 Mtwo), continuous wave technique and injected gutta-percha. Root surfaces were covered with nail varnish, except for the apical 2 mm, and submerged in an AgNO3 (silver nitrate) solution; apical stain penetration was measured in mm. Data were compared using the Kruskal-Wallis test with a 90% confidence interval. Results: Groups A and B showed stain leakage in 90% of the cases, whereas it was 80% for group C. Stain leakage intervals were 1-5 mm for groups A and B and 1-3 mm for group C. There were no statistically significant differences between the three studied groups (p>.05). Conclusions: All the analyzed root canal filling techniques showed some apical stain leakage, without significant differences among them. Key words: Gutta-percha filling, microleakage, single cone, injected gutta-percha, warm gutta-percha. PMID:26855702
Accuracy of the One-Stage and Two-Stage Impression Techniques: A Comparative Analysis
Jamshidy, Ladan; Faraji, Payam; Sharifi, Roohollah
2016-01-01
Introduction. One of the main steps of impression is the selection and preparation of an appropriate tray. Hence, the present study aimed to analyze and compare the accuracy of one- and two-stage impression techniques. Materials and Methods. A resin laboratory-made model of the first molar was prepared by a standard method for full crowns, with a processed preparation finish line of 1 mm depth and a convergence angle of 3-4°. Impressions were made 20 times with the one-stage technique and 20 times with the two-stage technique using an appropriate tray. To measure the marginal gap, the distance between the restoration margin and the preparation finish line of the plaster dies was determined vertically in the mid mesial, distal, buccal, and lingual (MDBL) regions by a stereomicroscope using a standard method. Results. The results of the independent t-test showed that the mean marginal gap obtained by the one-stage impression technique was higher than that of the two-stage impression technique. Further, there was no significant difference between the one- and two-stage impression techniques in the mid buccal region, but a significant difference was found between the two impression techniques in the MDL regions and overall. Conclusion. The findings of the present study indicated higher accuracy for the two-stage impression technique than for the one-stage impression technique. PMID:28003824
Advanced techniques and technology for efficient data storage, access, and transfer
NASA Technical Reports Server (NTRS)
Rice, Robert F.; Miller, Warner
1991-01-01
Advanced techniques for efficiently representing most forms of data are being implemented in practical hardware and software form through the joint efforts of three NASA centers. These techniques adapt to local statistical variations to continually provide near-optimum coding efficiency when representing data without error. Demonstrated in several earlier space applications, these techniques are the basis of initial NASA data compression standards specifications. Since the techniques clearly apply to most NASA science data, NASA invested in the development of both hardware and software implementations for general use. This investment includes high-speed single-chip very large scale integration (VLSI) coding and decoding modules as well as machine-transferable software routines. The hardware chips were tested in the laboratory at data rates as high as 700 Mbit/s. A coding module's definition includes a predictive preprocessing stage and a powerful adaptive coding stage. The function of the preprocessor is to optimally process incoming data into a standard-form data source that the second stage can handle. The built-in preprocessor of the VLSI coder chips is ideal for high-speed sampled-data applications such as imaging and high-quality audio, but the second-stage adaptive coder can also be used separately with any source that can be externally preprocessed into the 'standard form'. This generic functionality assures that the applicability of these techniques and their recent high-speed implementations should be equally broad outside of NASA.
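The two-stage coder described above can be illustrated compactly. The sketch below shows the general idea, a predictive preprocessor that maps samples to small non-negative integers followed by Golomb-Rice coding, with a fixed Rice parameter k for simplicity; the actual flight hardware adapts its coding to local statistics.

```python
# Hedged sketch of the two-stage idea: delta prediction plus zigzag mapping
# produces small non-negative residuals, which a Golomb-Rice coder then
# represents compactly. A fixed k is a simplification of the adaptive stage.
def preprocess(samples):
    """Delta-predict, then zigzag-map signed residuals to non-negative ints."""
    out, prev = [], 0
    for s in samples:
        d = s - prev
        prev = s
        out.append(2 * d if d >= 0 else -2 * d - 1)
    return out

def rice_encode(value, k):
    """Unary-coded quotient, a stop bit, then a k-bit binary remainder."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

bits = "".join(rice_encode(v, k=2) for v in preprocess([10, 11, 11, 13, 12, 12]))
print(bits)
```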
Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung
2014-08-01
Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity stems from the compliance of HISs with different healthcare standards. Its solution demands a mediation system for the accurate interpretation of data in different heterogeneous formats. We propose an adaptive AdapteR Interoperability ENgine mediation system called ARIEN, which arbitrates between HISs compliant with different healthcare standards for accurate and seamless information exchange to achieve data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and the Personalized-Detailed Clinical Model (P-DCM) approach to guarantee the accuracy of mappings. The effectiveness of the mappings stored in the MBO is realized by evaluating the accuracy of the transformation process among different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved over 90% accuracy in conversion between the CDA and vMR standards using a pattern-oriented approach from the MBO. The proposed mediation system improves the overall communication process between HISs. It provides accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
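A toy illustration of mapping-driven transformation between record formats is sketched below; the field paths and mapping table are hypothetical stand-ins for the ontology-driven CDA-to-vMR mappings stored in the MBO, not the actual standards' schemas.

```python
# Hedged toy sketch: a mapping table (stand-in for MBO mappings) drives the
# transformation of a flat source record into a nested target record. Field
# names are invented, not real CDA or vMR paths.
CDA_TO_VMR = {
    "patient/name/given": "demographics.firstName",
    "patient/name/family": "demographics.lastName",
    "observation/code": "observationResult.code",
}

def transform(cda_record: dict) -> dict:
    vmr: dict = {}
    for src, dst in CDA_TO_VMR.items():
        if src in cda_record:
            node = vmr
            *parents, leaf = dst.split(".")
            for p in parents:                 # build the nested target path
                node = node.setdefault(p, {})
            node[leaf] = cda_record[src]
    return vmr

print(transform({"patient/name/given": "Ada", "observation/code": "718-7"}))
```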
Trapped rubber processing for advanced composites
NASA Technical Reports Server (NTRS)
Marra, P. J.
1976-01-01
Trapped rubber processing is a molding technique for composites in which precast silicone rubber is placed within a closed cavity, where it thermally expands against the composite's surface supported by the vessel walls. The method has been applied by the Douglas Aircraft Company, under contract to NASA-Langley, to the design and fabrication of 10 DC-10 graphite/epoxy upper aft rudder assemblies. A three-bay development tool form mold die has been designed and manufactured, and tooling parameters have been established. Fabrication procedures include graphite layup, assembly of details in the tool, and a cure cycle. The technique has made possible the cocured fabrication of complex primary box structures that would otherwise be impracticable with standard composite-material processes.
Resource recycling technique of abandoned TNT-RDX-AL mixed explosive
NASA Astrophysics Data System (ADS)
Chen, Siyang; Ding, Yukui
2017-08-01
TNT-RDX-AL mixed explosive is a kind of high-energy mixed explosive. It retains its detonation characteristics even after reaching the scrapping standard, and inappropriate disposal often causes serious accidents. Employing resource recycling techniques, abandoned TNT-RDX-AL mixed explosive can be recycled. This paper summarizes the progress in recycling abandoned mixed explosives and introduces three kinds of technological processes for resource recycling of abandoned TNT-RDX-AL mixed explosives. The authors analyze the current recovery processes and provide a reference for the recycling of other explosives of the same type.
Dees, H. Craig
1998-01-01
Bacteria which produce large amounts of a cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques.
Interagency Report: Astrogeology 58, television cartography
Batson, Raymond M.
1973-01-01
The purpose of this paper is to illustrate the processing of digital television pictures into base maps. In this context, a base map is defined as a pictorial representation of planetary surface morphology accurately reproduced on standard map projections. Topographic contour lines, albedo or geologic overprints may be superimposed on these base maps. The compilation of geodetic map controls, the techniques of mosaic compilation, computer processing and airbrush enhancement, and the compilation of contour lines are discussed elsewhere by the originators of these techniques. A bibliography of applicable literature is included for readers interested in more detailed discussions.
Cellulase-containing cell-free fermentate produced from microorganism ATCC 55702
Dees, H.C.
1997-12-16
Bacteria which produce large amounts of cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations using standard mutagenesis techniques. 5 figs.
Jayakody, Chatura; Hull-Ryde, Emily A
2016-01-01
Well-defined quality control (QC) processes are used to determine whether a certain procedure or action conforms to a widely accepted standard and/or set of guidelines, and are important components of any laboratory quality assurance program (Popa-Burke et al., J Biomol Screen 14: 1017-1030, 2009). In this chapter, we describe QC procedures useful for monitoring the accuracy and precision of laboratory instrumentation, most notably automated liquid dispensers. Two techniques, gravimetric QC and photometric QC, are highlighted in this chapter. When used together, these simple techniques provide a robust process for evaluating liquid handler accuracy and precision, and critically underpin high-quality research programs.
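The gravimetric QC arithmetic is simple enough to sketch. The example below assumes hypothetical per-dispense masses and the density of water, converts mass to volume, and reports percent bias (accuracy) and percent CV (precision) against a 50 µL target; none of the numbers come from the chapter.

```python
# Hedged sketch of gravimetric QC arithmetic with invented data: dispensed
# masses are converted to volumes via liquid density, then accuracy (% bias
# from the target) and precision (%CV) are computed.
import statistics

masses_mg = [49.6, 50.1, 49.8, 50.3, 49.9]   # hypothetical per-dispense masses
density_mg_per_ul = 0.998                     # water at roughly 20 degC
target_ul = 50.0

volumes = [m / density_mg_per_ul for m in masses_mg]
mean_v = statistics.mean(volumes)
bias_pct = 100 * (mean_v - target_ul) / target_ul
cv_pct = 100 * statistics.stdev(volumes) / mean_v
print(f"bias={bias_pct:.2f}%, CV={cv_pct:.2f}%")
```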
Dahlberg, Jerry; Tkacik, Peter T; Mullany, Brigid; Fleischhauer, Eric; Shahinian, Hossein; Azimi, Farzad; Navare, Jayesh; Owen, Spencer; Bisel, Tucker; Martin, Tony; Sholar, Jodie; Keanini, Russell G
2017-12-04
An analog, macroscopic method for studying molecular-scale hydrodynamic processes in dense gases and liquids is described. The technique applies a standard fluid dynamic diagnostic, particle image velocimetry (PIV), to measure: i) velocities of individual particles (grains), extant on short, grain-collision time-scales, ii) velocities of systems of particles, on both short collision-time- and long, continuum-flow-time-scales, iii) collective hydrodynamic modes known to exist in dense molecular fluids, and iv) short- and long-time-scale velocity autocorrelation functions, central to understanding particle-scale dynamics in strongly interacting, dense fluid systems. The basic system is composed of an imaging system, light source, vibrational sensors, vibrational system with a known media, and PIV and analysis software. Required experimental measurements and an outline of the theoretical tools needed when using the analog technique to study molecular-scale hydrodynamic processes are highlighted. The proposed technique provides a relatively straightforward alternative to photonic and neutron beam scattering methods traditionally used in molecular hydrodynamic studies.
Muhogora, Wilbroad E; Msaki, Peter; Padovani, Renato
2015-03-08
The objective of this study was to improve the visibility of anatomical details by applying off-line image post-processing in chest computed radiography (CR). Four spatial-domain-based external image processing techniques were developed using MATLAB software version 7.0.0.19920 (R14) and image processing tools. The developed techniques were applied to sample images, and their visual appearances were confirmed by two consultant radiologists to be clinically adequate. The techniques were then applied to 200 chest clinical images and randomized with 100 other images previously processed online. These 300 images were presented to three experienced radiologists for image quality assessment using standard quality criteria. The means and ranges of the average scores of the three radiologists were characterized for each developed technique and imaging system. The Mann-Whitney U-test was used to test the difference in detail visibility between the images processed using each developed technique and the corresponding images processed using default algorithms. The results show that the visibility of anatomical features improved significantly (0.005 ≤ p ≤ 0.02) with combinations of intensity-value adjustment and/or spatial linear filtering techniques for images acquired using 60 ≤ kVp ≤ 70. However, there was no improvement for images acquired using 102 ≤ kVp ≤ 107 (0.127 ≤ p ≤ 0.48). In conclusion, the use of external image processing for optimization can be effective in chest CR, but should be implemented in consultation with radiologists.
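The two operations combined in the study, intensity-value adjustment and spatial linear filtering, can be sketched as follows; the percentile limits, Gaussian sigma, and sharpening amount are assumptions rather than the study's tuned parameters, and the random array merely stands in for a CR image.

```python
# Hedged sketch (assumed parameters) of the two operations the study pairs:
# intensity-value adjustment (percentile contrast stretching) and spatial
# linear filtering (unsharp masking with a Gaussian kernel).
import numpy as np
from scipy.ndimage import gaussian_filter

def adjust_and_sharpen(img: np.ndarray, low_pct=2, high_pct=98, amount=1.0):
    lo, hi = np.percentile(img, [low_pct, high_pct])
    stretched = np.clip((img - lo) / (hi - lo + 1e-9), 0, 1)
    blurred = gaussian_filter(stretched, sigma=2)
    return np.clip(stretched + amount * (stretched - blurred), 0, 1)

chest = np.random.rand(512, 512)   # stand-in for a CR chest image
enhanced = adjust_and_sharpen(chest)
```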
Establishment of metrological traceability in porosity measurements by x-ray computed tomography
NASA Astrophysics Data System (ADS)
Hermanek, Petr; Carmignato, Simone
2017-09-01
Internal porosity is inherent to many manufacturing processes, such as casting, additive manufacturing, and others. Since these defects cannot be completely avoided by improving production processes, it is important to have a reliable method to detect and evaluate them accurately. Accurate evaluation becomes even more important given current industrial trends to minimize product size and weight on the one hand and to enhance complexity and performance on the other. X-ray computed tomography (CT) has emerged as a promising instrument for holistic porosity measurements, offering several advantages over equivalent methods already established in the detection of internal defects. The main shortcomings of the conventional techniques are that they provide only general information about total porosity content (e.g., the Archimedes method) or are destructive (e.g., microscopy of cross-sections). In contrast, CT is a nondestructive technique providing complete information about the size, shape and distribution of internal porosity. However, due to the lack of international standards and the fact that it is a relatively new measurement technique, CT as a measurement technology has not yet reached maturity. This study proposes a procedure for the establishment of measurement traceability in porosity measurements by CT, including the necessary evaluation of measurement uncertainty. The traceability transfer is carried out through a novel reference standard calibrated by optical and tactile coordinate measuring systems. The measurement uncertainty is calculated following international standards and guidelines. In addition, the accuracy of porosity measurements by CT, with the associated measurement uncertainty, is evaluated using the reference standard.
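The uncertainty evaluation the study calls for typically follows a GUM-style budget. As a minimal sketch, assume just three contributions (reference-standard calibration, measurement procedure, and sample effects) that combine in quadrature; the paper's actual budget may differ:

```latex
% Hedged sketch of a GUM-style budget: assumed contributions combine in
% quadrature and are expanded with coverage factor k = 2 (~95 % confidence).
U = k\,\sqrt{u_{\mathrm{cal}}^{2} + u_{\mathrm{proc}}^{2} + u_{\mathrm{samp}}^{2}},
\qquad k = 2
```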
NASA Astrophysics Data System (ADS)
Michalicek, M. Adrian; Comtois, John H.; Schriner, Heather K.
1998-04-01
This paper describes the design and characterization of several types of micromirror devices, including process capabilities, device modeling, and test data resulting in deflection-versus-applied-potential curves and surface contour measurements. These devices are the first to be fabricated in the state-of-the-art four-level planarized polysilicon process available at Sandia National Laboratories, known as the Sandia Ultra-planar Multi-level MEMS Technology (SUMMiT). This enabling process permits the development of micromirror devices with near-ideal characteristics that have previously been unrealizable in standard three-layer polysilicon processes, such as elevated address electrodes, various address wiring techniques, mirror surfaces planarized using chemical mechanical polishing, unique post-process metallization, and the best active surface area to date.
NASA Astrophysics Data System (ADS)
Eaton, Adam; Vincely, Vinoin; Lloyd, Paige; Hugenberg, Kurt; Vishwanath, Karthik
2017-03-01
Video photoplethysmography (VPPG) is a numerical technique that processes standard RGB video of exposed human skin to extract the heart rate (HR) from the skin areas. Being a non-contact technique, VPPG has the potential to provide estimates of a subject's heart rate, respiratory rate, and even heart-rate variability, with potential applications ranging from infant monitors to remote healthcare and psychological experiments, particularly given the non-contact and sensor-free nature of the technique. Though several previous studies have reported successful correlations between HR obtained using VPPG algorithms and HR measured using the gold-standard electrocardiograph, others have reported that these correlations depend on controlling for the duration of the video data analyzed, subject motion, and ambient lighting. Here, we investigate the ability of two commonly used VPPG algorithms to extract human heart rates under three different laboratory conditions. We compare the VPPG HR values extracted across these three sets of experiments to the gold-standard values acquired using an electrocardiogram or a commercially available pulse oximeter. The two VPPG algorithms were applied with and without KLT facial-feature tracking and detection algorithms from the Computer Vision MATLAB® toolbox. Results indicate that VPPG-based numerical approaches can provide robust estimates of subject HR values and are relatively insensitive to the devices used to record the video data. However, they are highly sensitive to the conditions of video acquisition, including subject motion, the location, size, and averaging techniques applied to regions of interest, as well as the number of video frames used for data processing.
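The core numerical idea, recovering HR from the dominant spectral peak of a skin-averaged colour signal, can be sketched briefly; the frame rate, physiological band, and use of the green channel are common conventions assumed here for illustration, and the random array stands in for real video of a skin region.

```python
# Hedged VPPG-style sketch (assumed frame rate, band, and ROI): average the
# green channel over a skin region per frame, then take the dominant FFT
# peak in the physiological band (0.7-4 Hz) as the heart rate.
import numpy as np

def heart_rate_bpm(frames: np.ndarray, fps: float) -> float:
    """frames: (T, H, W, 3) RGB video of a skin region-of-interest."""
    signal = frames[..., 1].mean(axis=(1, 2))        # mean green per frame
    signal = signal - signal.mean()                  # remove DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

video = np.random.rand(300, 64, 64, 3)               # 10 s of 30 fps stand-in
print(f"{heart_rate_bpm(video, fps=30.0):.1f} bpm")
```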
Solution Deposition Methods for Carbon Nanotube Field-Effect Transistors
2009-06-01
…processed into FETs using standard microelectronics processing techniques. The resulting devices were characterized using a semiconductor parameter…method will help to determine which conditions are useful for producing CNT devices for chemical sensing and electronic applications.
Calibration and assessment of full-field optical strain measurement procedures and instrumentation
NASA Astrophysics Data System (ADS)
Kujawinska, Malgorzata; Patterson, E. A.; Burguete, R.; Hack, E.; Mendels, D.; Siebert, T.; Whelan, Maurice
2006-09-01
There are no international standards or norms for the use of optical techniques for full-field strain measurement. In this paper, the rationale and design of a reference material and a set of standardized materials for the calibration and evaluation of optical systems for full-field measurements of strain are outlined. A classification system for the steps in the measurement process is also proposed, which allows the development of a unified approach to diagnostic testing of components in an optical system for strain measurement based on any optical technique. The results described arise from a European study known as SPOTS, whose objectives were to begin to fill the gap caused by a lack of standards.
Chianea, Denis; Renard, Christophe; Garcia, Carine; Mbongo, Elvire; Monpeurt, Corine; Vest, Philippe
2010-01-01
The accreditation process, according to NF EN ISO 15189, implies a prior on-site evaluation of the new reagent for the implementation of each new assay technique. Thus, our new standardized method for the determination of creatinine (non-compensated method) in plasma or serum on the UniCel DxC 600 (Beckman Coulter) was tested according to the LAB GTA 04 protocol. The reagent meets the quality criteria recommended by the Valtec protocol, except for fidelity with the low-concentration standard (50 micromol/L). In addition, there is no problem of result transferability with the two other techniques used in the laboratory (compensated Jaffe and enzymatic methods performed on the Cobas Integra 800).
Nursing concerns and hospital product sterilization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rock, R.B. Jr.; Anderson, N.A.
Nurses and other health care professionals must be aware of the rationale and methodology for in-hospital health care product standardization, including consideration of hospital standardization committee composition, pilot-study prerequisites, and general evaluation criteria. They must be familiar with the techniques of product sterilization, their effectiveness, and the materials required to maintain sterile product shelf-life until a product is used. Hospital standardization committees can assist in the product-use decision-making process. Product evaluation criteria should include considerations pertaining to cost, quality, service, and comparison to similar products.
Trout, Andrew T; Batie, Matthew R; Gupta, Anita; Sheridan, Rachel M; Tiao, Gregory M; Towbin, Alexander J
2017-11-01
Radiogenomics promises to identify tumour imaging features indicative of genomic or proteomic aberrations that can be therapeutically targeted, allowing precision personalised therapy. An accurate radiological-pathological correlation is critical to the process of radiogenomic characterisation of tumours. An accurate correlation, however, is difficult to achieve with current pathological sectioning techniques, which result in sectioning in non-standard planes. The purpose of this work is to present a technique to standardise hepatic sectioning to facilitate radiological-pathological correlation. We describe a process in which three-dimensional (3D)-printed specimen boxes based on preoperative cross-sectional imaging (CT and MRI) can be used to facilitate pathological sectioning in standard planes immediately on hepatic resection, enabling improved tumour mapping. We have applied this process in 13 patients undergoing hepatectomy and have observed close correlation between imaging and gross pathology in patients with both unifocal and multifocal tumours. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Contamination detection NDE for cleaning process inspection
NASA Technical Reports Server (NTRS)
Marinelli, W. J.; Dicristina, V.; Sonnenfroh, D.; Blair, D.
1995-01-01
In the joining of multilayer materials, and in welding, the cleanliness of the joining surface may play a large role in the quality of the resulting bond. No non-intrusive techniques are currently available for the rapid measurement of contamination on large or irregularly shaped structures prior to the joining process. An innovative technique for the measurement of contaminant levels in these structures using laser based imaging is presented. The approach uses an ultraviolet excimer laser to illuminate large and/or irregular surface areas. The UV light induces fluorescence and is scattered from the contaminants. The illuminated area is viewed by an image-intensified CCD (charge coupled device) camera interfaced to a PC-based computer. The camera measures the fluorescence and/or scattering from the contaminants for comparison with established standards. Single shot measurements of contamination levels are possible. Hence, the technique may be used for on-line NDE testing during manufacturing processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reyhan, M; Yue, N
Purpose: To validate an automated image processing algorithm designed to detect the center of radiochromic film used for in vivo film dosimetry against the current gold standard of manual selection. Methods: An image processing algorithm was developed to automatically select the region of interest (ROI) in *.tiff images that contain multiple pieces of radiochromic film (0.5 × 1.3 cm²). After a user has linked a calibration file to the processing algorithm and selected a *.tiff file for processing, an ROI is automatically detected for all films by a combination of thresholding and erosion, which removes edges and any additional markings for orientation. Calibration is applied to the mean pixel values from the ROIs, and a *.tiff image is output displaying the original image with an overlay of the ROIs and the measured doses. Validation of the algorithm was determined by comparing in vivo dose determined using the current gold standard (manually drawn ROIs) versus automated ROIs for n=420 scanned films. Bland-Altman analysis, paired t-test, and linear regression were performed to demonstrate agreement between the processes. Results: The measured doses ranged from 0.2 to 886.6 cGy. Bland-Altman analysis of the two techniques (automatic minus manual) revealed a bias of -0.28 cGy and a 95% confidence interval of (-6.1 cGy, 5.5 cGy). These values demonstrate excellent agreement between the two techniques. Paired t-test results showed no statistical differences between the two techniques, p=0.98. Linear regression with a forced zero intercept demonstrated that Automatic = 0.997 × Manual, with a Pearson correlation coefficient of 0.999. The minimal differences between the two techniques may be explained by the fact that the hand-drawn ROIs were not identical to the automatically selected ones. The average processing time was 6.7 seconds in MATLAB on an Intel Core 2 Duo processor. Conclusion: An automated image processing algorithm has been developed and validated, which will help minimize user interaction and processing time of radiochromic film used for in vivo dosimetry.
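The described threshold-erode-label pipeline is straightforward to sketch; the threshold, erosion depth, and linear dose calibration below are invented for illustration, and the synthetic array stands in for a scanned *.tiff.

```python
# Hedged sketch of the described ROI pipeline: threshold the scan, erode
# away film edges and orientation markings, label each film piece, and
# convert mean pixel values to dose. Threshold, erosion depth, and the
# linear calibration are invented, not the validated algorithm's values.
import numpy as np
from scipy import ndimage

scan = np.ones((400, 600))                      # white scanner background
scan[100:160, 50:200] = 0.35                    # two stand-in film pieces
scan[250:310, 300:450] = 0.20

mask = ndimage.binary_erosion(scan < 0.5, iterations=5)
labels, n = ndimage.label(mask)
means = ndimage.mean(scan, labels=labels, index=range(1, n + 1))
doses = [2.0 * (1.0 - m) for m in means]        # hypothetical calibration
print(n, "films found; doses:", [round(d, 2) for d in doses])
```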
Nonlinear relaxation algorithms for circuit simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saleh, R.A.
Circuit simulation is an important Computer-Aided Design (CAD) tool in the design of Integrated Circuits (ICs). However, the standard techniques used in programs such as SPICE result in very long computer run times when applied to large problems. In order to reduce the overall run time, a number of new approaches to circuit simulation were developed and are described. These methods are based on nonlinear relaxation techniques and exploit the relative inactivity of large circuits. Simple waveform-processing techniques are described to determine the maximum possible speed improvement that can be obtained by exploiting this property of large circuits. Three simulation algorithms are described, two of which are based on the Iterated Timing Analysis (ITA) method and a third based on the Waveform-Relaxation Newton (WRN) method. New programs that incorporate these techniques were developed and used to simulate a variety of industrial circuits. The results from these simulations are provided. The techniques are shown to be much faster than the standard approach. In addition, a number of parallel aspects of these algorithms are described, and a general space-time model of parallel-task scheduling is developed.
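The flavour of nonlinear relaxation can be shown on a toy circuit. The sketch below applies Gauss-Seidel sweeps to a two-node source-resistor-diode network, solving the linear node in closed form and the diode node with local Newton steps; component values are invented, and nothing here reproduces the ITA or WRN algorithms themselves.

```python
# Hedged toy example of nonlinear Gauss-Seidel relaxation (not ITA/WRN
# themselves): a 5 V source feeds node 1 through R1; R2 joins node 1 to
# node 2, which a diode ties to ground. Each sweep solves the linear node
# in closed form and the diode node with local Newton steps.
import math

R1, R2 = 1e3, 1e3
VS, IS, VT = 5.0, 1e-12, 0.025
v1 = v2 = 0.0
for sweep in range(200):
    v1_new = (VS / R1 + v2 / R2) / (1 / R1 + 1 / R2)  # linear node, closed form
    v = v2                                            # diode node, Newton steps
    for _ in range(100):
        f = (v - v1_new) / R2 + IS * (math.exp(v / VT) - 1)
        df = 1 / R2 + (IS / VT) * math.exp(v / VT)
        v -= f / df
    v2_new = v
    if max(abs(v1_new - v1), abs(v2_new - v2)) < 1e-9:
        break
    v1, v2 = v1_new, v2_new
print(f"v1={v1:.4f} V, v2={v2:.4f} V after {sweep + 1} sweeps")
```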
Muscle activity characterization by laser Doppler Myography
NASA Astrophysics Data System (ADS)
Scalise, Lorenzo; Casaccia, Sara; Marchionni, Paolo; Ercoli, Ilaria; Primo Tomasini, Enrico
2013-09-01
Electromyography (EMG) is the gold-standard technique for the evaluation of muscle activity. This technique is used in biomechanics, sports medicine, neurology and rehabilitation therapy, and it provides the electrical activity produced by skeletal muscles. Among the parameters measured with EMG, important quantities include signal amplitude, duration of muscle contraction, muscle fatigue, and maximum muscle power. Recently, a new measurement procedure, named Laser Doppler Myography (LDMi), has been proposed for the non-contact assessment of muscle activity by measuring the vibro-mechanical behaviour of the muscle. The aim of this study is to present the LDMi technique and to evaluate its capacity to measure some characteristic features of the muscle. In this paper LDMi is compared with standard surface EMG (sEMG), which requires the application of sensors on the skin of each patient. sEMG and LDMi signals were simultaneously acquired and processed to test correlations. Three parameters have been analyzed to compare these techniques: muscle activation timing, signal amplitude, and muscle fatigue. LDMi appears to be a reliable and promising measurement technique, allowing measurements without contact with the patient's skin.
Blind retrospective motion correction of MR images.
Loktyushin, Alexander; Nickisch, Hannes; Pohmann, Rolf; Schölkopf, Bernhard
2013-12-01
Subject motion can severely degrade MR images. A retrospective motion correction algorithm, gradient-based motion correction, is proposed, which significantly reduces ghosting and blurring artifacts due to subject motion. The technique uses the raw data of standard imaging sequences; no sequence modifications or additional equipment, such as tracking devices, are required. Rigid motion is assumed. The approach iteratively searches for the motion trajectory yielding the sharpest image, as measured by the entropy of spatial gradients. The vast space of motion parameters is efficiently explored by gradient-based optimization with a convergence guarantee. The method has been evaluated on both synthetic and real data in two and three dimensions using standard imaging techniques. MR images are consistently improved over different kinds of motion trajectories. Using a graphics processing unit implementation, computation times are on the order of a few minutes for a full three-dimensional volume. The presented technique can be an alternative or a complement to prospective motion correction methods and is able to improve images with strong motion artifacts from standard imaging sequences without requiring additional data. Copyright © 2013 Wiley Periodicals, Inc., a Wiley company.
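The sharpness objective, entropy of spatial gradients, is compact enough to sketch; the normalization details and epsilon below are assumptions, and the full method additionally searches rigid-motion parameters to minimize this metric.

```python
# Hedged sketch of the image-sharpness objective (entropy of spatial
# gradients); normalization and epsilon are assumptions. The full method
# searches rigid-motion parameters that minimize this metric.
import numpy as np
from scipy.ndimage import gaussian_filter

def gradient_entropy(img: np.ndarray) -> float:
    gx, gy = np.gradient(img)
    mag = np.sqrt(gx**2 + gy**2).ravel()
    p = mag / (mag.sum() + 1e-12)         # treat magnitudes as a distribution
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())  # lower entropy = sharper image

sharp = np.zeros((64, 64)); sharp[20:40, 20:40] = 1.0
blurry = gaussian_filter(sharp, sigma=3)  # stand-in for motion corruption
print(gradient_entropy(sharp) < gradient_entropy(blurry))  # expected: True
```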
An Example of Process Evaluation.
ERIC Educational Resources Information Center
Karl, Marion C.
The inappropriateness of standard experimental research design, which can stifle innovations, is discussed in connection with the problems of designing practical techniques for evaluating a Title III curriculum development project. The project, involving 12 school districts and 2,500 students, teaches concept understanding, critical thinking, and…
Manufacturing and quality control of interconnecting wire harnesses, Volume 2
NASA Technical Reports Server (NTRS)
1972-01-01
Interconnecting wire harnesses defined in the design standard are considered, including type 4, open bundle (not enclosed). Knowledge gained through experience on the Saturn V program, coupled with recent advances in techniques, materials, and processes, was incorporated into the document.
The application of welat latino for creating paes in solo wedding bride
NASA Astrophysics Data System (ADS)
Ihsani, Ade Novi Nurul; Krisnawati, Maria; Prasetyaningtyas, Wulansari; Anggraeni, Puput; Bela, Herlina Tria; Zunaedah, Putri Wahyu
2018-03-01
The purposes of this research were: 1) to determine the process of creating the innovative welat, and 2) to determine how to use the innovative welat to create paes for the Solo wedding bride. The method used in the research was research and development (R&D). The sampling technique was purposive sampling, with 13 people as models. Data were collected through observation and documentation and analysed using a descriptive technique. The results of the study showed that: 1) the welat design was revised twice during validation; 2) each product passed through several stages of designing, forming, selecting the material, and printing; and 3) in use, the welat determines the spacing of the dots between the cengkorongan of both forms according to the existing mold. In conclusion, the innovative welat can produce paes that conform to the standard and shorten the process.
Liu, Hesheng; Schimpf, Paul H; Dong, Guoya; Gao, Xiaorong; Yang, Fusheng; Gao, Shangkai
2005-10-01
This paper presents a new algorithm called Standardized Shrinking LORETA-FOCUSS (SSLOFO) for solving the electroencephalogram (EEG) inverse problem. Multiple techniques are combined in a single procedure to robustly reconstruct the underlying source distribution with high spatial resolution. This algorithm uses a recursive process which takes the smooth estimate of sLORETA as initialization and then employs the re-weighted minimum norm introduced by FOCUSS. An important technique called standardization is involved in the recursive process to enhance the localization ability. The algorithm is further improved by automatically adjusting the source space according to the estimate of the previous step, and by the inclusion of temporal information. Simulation studies are carried out on both spherical and realistic head models. The algorithm achieves very good localization ability on noise-free data. It is capable of recovering complex source configurations with arbitrary shapes and can produce high quality images of extended source distributions. We also characterized the performance with noisy data in a realistic head model. An important feature of this algorithm is that the temporal waveforms are clearly reconstructed, even for closely spaced sources. This provides a convenient way to estimate neural dynamics directly from the cortical sources.
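The reweighted-minimum-norm recursion at the heart of the approach can be sketched on a toy problem; the standardization step and sLORETA initialization are omitted, the lead field is random rather than a head model, and the regularization constant is an assumption.

```python
# Hedged toy sketch of the FOCUSS-style reweighted-minimum-norm recursion
# (standardization and the sLORETA initialization are omitted; the lead
# field is random and lambda is an assumed regularization constant).
import numpy as np

rng = np.random.default_rng(0)
L = rng.standard_normal((32, 200))       # 32 sensors, 200 candidate sources
x_true = np.zeros(200); x_true[[50, 120]] = [1.0, -0.8]
y = L @ x_true                           # noise-free sensor measurements

x, lam = np.ones(200), 1e-6              # flat initial source estimate
for _ in range(25):
    W = np.diag(np.abs(x))               # reweight by the previous estimate
    A = L @ W
    x = W @ A.T @ np.linalg.solve(A @ A.T + lam * np.eye(32), y)
print(sorted(np.argsort(np.abs(x))[-2:]))  # should report [50, 120]
```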
NASA Astrophysics Data System (ADS)
Kajiyama, Yoshitaka; Joseph, Kevin; Kajiyama, Koichi; Kudo, Shuji; Aziz, Hany
2014-02-01
A shadow mask technique capable of realizing high-resolution (>330 pixel-per-inch), ~100% aperture ratio Organic Light-Emitting Diode (OLED) full-color displays is demonstrated. The technique utilizes polyimide contact shadow masks patterned by laser ablation. Red, green, and blue OLEDs with very small feature sizes (<25 μm) are fabricated side by side on one substrate. OLEDs fabricated via this technique have the same performance as those made by established technology. This technique has a strong potential to achieve high-resolution OLED displays via standard vacuum deposition processes, even on flexible substrates.
Gorgolewski, Krzysztof J; Auer, Tibor; Calhoun, Vince D; Craddock, R Cameron; Das, Samir; Duff, Eugene P; Flandin, Guillaume; Ghosh, Satrajit S; Glatard, Tristan; Halchenko, Yaroslav O; Handwerker, Daniel A; Hanke, Michael; Keator, David; Li, Xiangrui; Michael, Zachary; Maumet, Camille; Nichols, B Nolan; Nichols, Thomas E; Pellman, John; Poline, Jean-Baptiste; Rokem, Ariel; Schaefer, Gunnar; Sochat, Vanessa; Triplett, William; Turner, Jessica A; Varoquaux, Gaël; Poldrack, Russell A
2016-06-21
The development of magnetic resonance imaging (MRI) techniques has defined modern neuroimaging. Since its inception, tens of thousands of studies using techniques such as functional MRI and diffusion weighted imaging have allowed for the non-invasive study of the brain. Despite the fact that MRI is routinely used to obtain data for neuroscience research, there has been no widely adopted standard for organizing and describing the data collected in an imaging experiment. This renders sharing and reusing data (within or between labs) difficult if not impossible and unnecessarily complicates the application of automatic pipelines and quality assurance protocols. To solve this problem, we have developed the Brain Imaging Data Structure (BIDS), a standard for organizing and describing MRI datasets. The BIDS standard uses file formats compatible with existing software, unifies the majority of practices already common in the field, and captures the metadata necessary for most common data processing operations.
Vacuum Processing Technique for Development of Primary Standard Blackbodies
Navarro, M.; Bruce, S. S.; Johnson, B. Carol; Murthy, A. V.; Saunders, R. D.
1999-01-01
Blackbody sources with nearly unity emittance that are in equilibrium with a pure freezing metal such as gold, silver, or copper are used as primary standard sources in the International Temperature Scale of 1990 (ITS-90). Recently, a facility using radio-frequency induction heating for melting and filling the blackbody crucible with the freeze metal under vacuum conditions was developed at the National Institute of Standards and Technology (NIST). The blackbody development under a vacuum environment eliminated the possibility of contamination of the freeze metal during the process. The induction heating, compared to a resistively heated convection oven, provided faster heating of crucible and resulted in shorter turn-around time of about 7 h to manufacture a blackbody. This paper describes the new facility and its application to the development of fixed-point blackbodies.
NASA Astrophysics Data System (ADS)
Omega, Dousmaris; Andika, Aditya
2017-12-01
This paper discusses the results of research conducted on the production process of an Indonesian pharmaceutical company. The company is experiencing low performance on the Overall Equipment Effectiveness (OEE) metric: the OEE of the company's machines is below the world-class standard, and the machine with the lowest OEE is the filler machine. Through observation and analysis, it was found that the cleaning process of the filler machine consumes a significant amount of time. The long duration of the cleaning process occurs because there is no structured division of tasks among the cleaning operators, the operators' abilities differ, and the operators do not fully utilize the available cleaning equipment. The company therefore needs to improve the cleaning process. Critical Path Method (CPM) analysis was conducted to find out which activities are critical, in order to shorten and simplify the cleaning process through the division of tasks. Afterwards, the Maynard Operation Sequence Technique (MOST) was used to reduce ineffective movement and to specify the standard time of the cleaning process. From CPM and MOST, the shortest time obtained for the cleaning process is 1 hour 28 minutes, and the standard time is 1 hour 38.826 minutes.
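The CPM forward pass is easy to illustrate; the cleaning activities, precedence relations, and durations below are invented, not the company's measured data.

```python
# Hedged toy CPM forward pass; activities, precedence, and durations are
# invented, not the company's measured cleaning data.
durations = {"drain": 10, "dismantle": 25, "wash_parts": 30,
             "wipe_machine": 20, "reassemble": 23}
predecessors = {"drain": [], "dismantle": ["drain"],
                "wash_parts": ["dismantle"], "wipe_machine": ["dismantle"],
                "reassemble": ["wash_parts", "wipe_machine"]}

earliest_finish = {}
for task in durations:                    # insertion order is topological here
    start = max((earliest_finish[p] for p in predecessors[task]), default=0)
    earliest_finish[task] = start + durations[task]
print(earliest_finish["reassemble"])      # 88 min: length of the critical path
```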
Analysis of Lightweight Materials for the AM2 System
2014-06-01
and fatigue behavior in magnesium alloys. Materials Science & Engineering A (Structural Materials: Properties, Microstructure and Processing), v 434...Table 7. Tensile properties of the alloys AA2024 or the T3 and T81 temper designations (Kuo et al. 2005...using a powder metallurgy technique, such as a standard cold compacting press and sintering process. However, the fatigue life of the liquid-based
NASA Astrophysics Data System (ADS)
Kulse, P.; Sasai, K.; Schulz, K.; Wietstruck, M.
2017-06-01
In recent decades, semiconductor technology has been driven by Moore's law, leading to high-performance CMOS technologies with feature sizes of less than 10 nm [1]. It has been pointed out that not only scaling but also the integration of novel components and technology modules into CMOS/BiCMOS technologies is becoming more attractive to realize smart and miniaturized systems [2]. Driven by new applications in the areas of communication, health and automation, new components and technology modules such as BiCMOS-embedded RF-MEMS, high-Q passives, Si-based microfluidics and InP-SiGe BiCMOS heterointegration have been demonstrated [3-6]. In contrast to standard VLSI processes fabricated on the front side of the silicon wafer, these new technology modules require additional backside processing of the wafer; thus an accurate alignment between the front and back sides of the wafer is mandatory. In previous work, an advanced back-to-front-side alignment technique and its implementation into IHP's 0.25/0.13 μm high-performance SiGe:C BiCMOS backside process module have been presented [7]. The developed technique enables high-resolution, accurate lithography on the backside of BiCMOS wafers for additional backside processing. In addition to the aforementioned backside process technologies, new applications like Through-Silicon Vias (TSVs) for interposers and advanced substrate technologies for 3D heterogeneous integration demand not only single-wafer fabrication but also processing of wafer stacks provided by temporary and permanent wafer bonding [8]. The available overlay measurement techniques are therefore not suitable if overlay and alignment marks are realized at the bonding interface of a wafer stack which consists of both a silicon device and a silicon carrier wafer. The previously used EVG 40NT automated overlay measurement system, which uses two oppositely positioned microscopes to inspect the wafer back and front sides simultaneously, is not capable of measuring embedded overlay marks. In this work, the non-contact infrared alignment system of the Nikon i-line stepper NSR-SF150 is used for both the alignment and the overlay determination of bonded wafer stacks with embedded alignment marks, to achieve an accurate alignment between the different wafer sides. The embedded field image alignment (FIA) marks of the interface and the device wafer top layer are measured in a single measurement job. By taking the offsets between all the different FIA marks into account, after correcting the wafer-rotation-induced FIA position errors, an overlay for the stacked wafers can be determined. The developed approach has been validated on a standard back-to-front-side application. The overlay was measured and determined using both the EVG 40NT automated measurement system with special overlay marks and the measurement of the FIA marks of the front- and backside layers. A comparison of both results shows mismatches in x and y translations smaller than 200 nm, which is relatively small compared to the overlay tolerances of ±500 nm for the back-to-front-side process. After the successful validation of the developed technique, special wafer stacks with FIA alignment marks in the bonding interface were fabricated. Due to the high IR transparency of both double-side-polished wafers, the embedded FIA marks generate a stable and clear signal for accurate x and y wafer coordinate positioning.
The FIA marks of the device wafer top layer were measured under standard conditions in a developed photoresist mask without IR illumination. The subsequent overlay calculation shows an overlay of less than 200 nm, which enables very accurate process conditions for highly scaled TSV integration and advanced substrate integration in IHP's 0.25/0.13 μm SiGe:C BiCMOS technology. The presented method can be applied both to the standard back-to-front-side process technologies and to new temporary and permanent wafer bonding applications.
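The overlay determination described above reduces to estimating and removing the wafer-rotation error before differencing mark positions. The following is a minimal sketch of that step with hypothetical FIA coordinates and a plain least-squares rotation estimate; the mark layout, noise levels and the injected 0.12/-0.08 μm interlayer offset are invented for illustration, not taken from the paper.

```python
import numpy as np

def rotation_angle(design, measured):
    """Least-squares in-plane rotation that best maps the measured mark
    positions onto their design positions (translation removed by centering)."""
    P = measured - measured.mean(axis=0)
    Q = design - design.mean(axis=0)
    num = np.sum(P[:, 0] * Q[:, 1] - P[:, 1] * Q[:, 0])   # cross terms
    den = np.sum(P[:, 0] * Q[:, 0] + P[:, 1] * Q[:, 1])   # dot terms
    return np.arctan2(num, den)

def rotate(points, angle):
    c, s = np.cos(angle), np.sin(angle)
    return points @ np.array([[c, -s], [s, c]]).T

# Hypothetical FIA positions (um): a design grid, the marks read on the
# device-wafer top layer, and the marks embedded at the bonding interface
# (carrying an interlayer offset plus a small wafer-rotation error).
rng = np.random.default_rng(7)
design = np.array([[0, 0], [10000, 0], [0, 10000], [10000, 10000]], float)
top = design + rng.normal(0, 0.02, design.shape)
buried = rotate(design, 5e-6) + [0.12, -0.08] + rng.normal(0, 0.02, design.shape)

# Correct each layer's rotation error, then difference corresponding marks.
top_c = rotate(top, rotation_angle(design, top))
buried_c = rotate(buried, rotation_angle(design, buried))
overlay = (buried_c - top_c).mean(axis=0)
print("overlay x/y (um):", np.round(overlay, 3))   # ~ [0.12, -0.08]
```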
A new polarimetric active radar calibrator and calibration technique
NASA Astrophysics Data System (ADS)
Tang, Jianguo; Xu, Xiaojian
2015-10-01
The polarimetric active radar calibrator (PARC) is one of the most important calibrators offering a high radar cross section (RCS) for polarimetric measurement. In this paper, a new double-antenna polarimetric active radar calibrator (DPARC) is proposed, consisting of two rotatable antennas with wideband electromagnetic polarization filters (EMPF) that achieve lower cross-polarization in transmission and reception. With two antennas that are rotatable about the radar line of sight (LOS), the DPARC provides a variety of standard polarimetric scattering matrices (PSM) through rotation combinations of the receiving and transmitting polarizations, which are useful for polarimetric calibration in different applications. In addition, a technique based on Fourier analysis is proposed for the calibration processing. Numerical simulation results are presented to demonstrate the superior performance of the proposed DPARC and processing technique.
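The abstract does not spell out the Fourier-analysis step, but the general idea behind rotating calibrators is that the measured response varies periodically with the rotation angle, so the calibration terms can be read off as harmonic coefficients. The sketch below is a generic stand-in, not the authors' algorithm; the harmonic content and noise level are invented.

```python
import numpy as np

# A calibrator rotated about the LOS modulates the co-/cross-pol response
# periodically in the rotation angle theta; a DFT over one full revolution
# isolates the harmonic coefficients.
N = 64                                     # rotation steps over 360 degrees
theta = 2 * np.pi * np.arange(N) / N

# Hypothetical complex response: DC term plus 2nd/4th harmonics
true = {0: 1.0 + 0.2j, 2: 0.5 - 0.1j, 4: 0.05j}
s = sum(c * np.exp(1j * k * theta) for k, c in true.items())
s += 0.01 * (np.random.randn(N) + 1j * np.random.randn(N))   # receiver noise

C = np.fft.fft(s) / N                      # Fourier coefficients c_k
for k in (0, 2, 4):
    print(f"harmonic {k}: estimated {C[k]:.3f} vs true {true[k]:.3f}")
```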
Iterative categorization (IC): a systematic technique for analysing qualitative data.
Neale, Joanne
2016-06-01
The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
NASA Technical Reports Server (NTRS)
Nashman, Marilyn; Chaconas, Karen J.
1988-01-01
The sensory processing system for the NASA/NBS Standard Reference Model (NASREM) for telerobotic control is described. This control system architecture was adopted by NASA for the Flight Telerobotic Servicer. The control system is hierarchically designed and consists of three parallel systems: task decomposition, world modeling, and sensory processing. The sensory processing system is examined, with particular attention to the image processing hardware and software used to extract features at low levels of sensory processing for tasks representative of those envisioned for the Space Station, such as assembly and maintenance.
Investigations in adaptive processing of multispectral data
NASA Technical Reports Server (NTRS)
Kriegler, F. J.; Horwitz, H. M.
1973-01-01
Adaptive data processing procedures are applied to the problem of classifying objects in a scene scanned by a multispectral sensor. These procedures show a performance improvement over standard nonadaptive techniques. Some sources of classification error are identified, and those correctable by adaptive processing are discussed. Experiments in adapting signature means by decision-directed methods are described. Some of these methods assume correlation between the trajectories of different signature means; for others this assumption is not made.
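A decision-directed update of signature means can be sketched in a few lines: each pixel is classified to the nearest current mean, and that mean is then nudged toward the pixel so the signatures track slow drift. The adaptation rate and the toy two-class data below are hypothetical, not taken from the report.

```python
import numpy as np

def decision_directed_update(pixels, means, alpha=0.05):
    """One pass of decision-directed adaptation: classify each pixel to the
    nearest signature mean, then move that mean toward the pixel."""
    means = means.copy()
    for x in pixels:
        k = np.argmin(((means - x) ** 2).sum(axis=1))   # minimum-distance class
        means[k] += alpha * (x - means[k])              # drift-tracking update
    return means

# Toy multispectral data: two signatures whose true means have drifted
rng = np.random.default_rng(0)
true_means = np.array([[1.2, 2.1], [3.9, 1.0]])
pixels = np.vstack([rng.normal(m, 0.2, (200, 2)) for m in true_means])
rng.shuffle(pixels)

stale = np.array([[1.0, 2.0], [4.0, 1.2]])              # out-of-date signatures
print(decision_directed_update(pixels, stale))          # converges near true_means
```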
An introduction to chaotic and random time series analysis
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.
1989-01-01
The origin of chaotic behavior and the relation of chaos to randomness are explained. Two mathematical results are described: (1) a representation theorem guarantees the existence of a specific time-domain model for chaos and addresses the relation between chaotic, random, and strictly deterministic processes; (2) a theorem assures that information on the behavior of a physical system in its complete state space can be extracted from time-series data on a single observable. Focus is placed on an important connection between the dynamical state space and an observable time series. These two results lead to a practical deconvolution technique combining standard random-process modeling methods with new embedding techniques.
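The second result is usually exploited through time-delay embedding: delay vectors built from a single observable reconstruct a trajectory equivalent to the full state space. A minimal sketch, using the logistic map as a stand-in observable (the map and parameters are illustrative only):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Reconstruct a state-space trajectory from one observable x(t) using
    delay vectors [x(t), x(t + tau), ..., x(t + (dim - 1) * tau)]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Observable from a chaotic map (logistic map with r = 4)
x = np.empty(5000)
x[0] = 0.3
for i in range(4999):
    x[i + 1] = 4 * x[i] * (1 - x[i])

traj = delay_embed(x, dim=2, tau=1)   # recovers the parabola x_{t+1} = 4 x_t (1 - x_t)
print(traj.shape)
```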
Metadynamics studies of crystal nucleation
Giberti, Federico; Salvalaglio, Matteo; Parrinello, Michele
2015-01-01
Crystallization processes are characterized by activated events and long timescales. These characteristics prevent standard molecular dynamics techniques from being used efficiently for the direct investigation of processes such as nucleation. This short review provides an overview of the use of metadynamics, a state-of-the-art enhanced sampling technique, for the simulation of phase transitions involving the production of a crystalline solid. In particular, the principles of metadynamics are outlined, several order parameters that have been or could be used with metadynamics to sample nucleation events are described, and an overview is given of recent metadynamics results in the field of crystal nucleation. PMID:25866662
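In its simplest form, metadynamics deposits repulsive Gaussian hills along a collective variable s, so the accumulated bias gradually fills free-energy minima and accelerates barrier crossing. Below is a minimal one-dimensional sketch with overdamped Langevin dynamics in a double well; all parameters (hill height and width, deposition stride, temperature) are illustrative and not taken from any cited simulation.

```python
import numpy as np

rng = np.random.default_rng(1)
grad_V = lambda s: 4 * s * (s**2 - 1)          # V(s) = (s^2 - 1)^2, barrier at s = 0
hills = []                                      # centres of deposited hills
w, sigma, dt, kT = 0.05, 0.2, 1e-3, 0.1         # hill height/width, step, temperature

def grad_bias(s):
    """Derivative of the accumulated bias sum_c w * exp(-(s - c)^2 / (2 sigma^2))."""
    return sum(-w * (s - c) / sigma**2 * np.exp(-(s - c)**2 / (2 * sigma**2))
               for c in hills)

s = -1.0                                        # start in the left well
for step in range(20000):
    force = -grad_V(s) - grad_bias(s)           # physical force plus bias force
    s += force * dt + np.sqrt(2 * kT * dt) * rng.standard_normal()
    if step % 200 == 0:
        hills.append(s)                         # deposit a new hill

# The deposited bias approximates -F(s); well-to-well crossings occur far
# sooner than in the unbiased dynamics at this temperature.
print("hills deposited:", len(hills), "final s:", round(s, 2))
```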
The Commercial Challenges Of Pacs
NASA Astrophysics Data System (ADS)
Vanden Brink, John A.
1984-08-01
The increasing use of digital imaging techniques creates a need for improved methods of digital processing, communication, and archiving. However, the commercial opportunity depends on the resolution of a number of issues. These issues include proof that digital processes are more cost effective than present techniques, implementation of information system support in the imaging activity, implementation of industry standards, conversion of analog images to digital formats, definition of clinical needs, the implications of the purchase decision, and technology requirements. In spite of these obstacles, a market is emerging, served by new and existing companies, that may reach $500 million (U.S.) by 1990 for equipment and supplies.
Dees, H.C.
1998-07-14
Bacteria which produce large amounts of a cellulase-containing cell-free fermentate have been identified. The original bacterium (ATCC 55703) was genetically altered using nitrosoguanidine (MNNG) treatment to produce the enhanced cellulase-producing bacterium (ATCC 55702), which was identified through replicate plating. ATCC 55702 has improved characteristics and qualities for the degradation of cellulosic waste materials for fuel production, food processing, textile processing, and other industrial applications. ATCC 55702 is an improved bacterial host for genetic manipulations using recombinant DNA techniques, and is less likely to destroy genetic manipulations performed using standard mutagenesis techniques. 5 figs.
Rapid adhesive bonding of advanced composites and titanium
NASA Technical Reports Server (NTRS)
Stein, B. A.; Tyeryar, J. R.; Hodges, W. T.
1985-01-01
Rapid adhesive bonding (RAB) concepts utilize a toroid induction technique to heat the adhesive bond line directly. This technique was used to bond titanium overlap shear specimens with 3 advanced thermoplastic adhesives and APC-2 (graphite/PEEK) composites with PEEK film. Bond strengths equivalent to standard heated-platen press bonds were produced with large reductions in process time. RAB produced very strong bonds in APC-2 adherend specimens; the APC-2 adherends were highly resistant to delamination. Thermal cycling did not significantly affect the shear strengths of RAB titanium bonds with polyimide adhesives. A simple ultrasonic non-destructive evaluation process was found promising for evaluating bond quality.
Standard work for room entry: Linking lean, hand hygiene, and patient-centeredness.
O'Reilly, Kristin; Ruokis, Samantha; Russell, Kristin; Teves, Tim; DiLibero, Justin; Yassa, David; Berry, Hannah; Howell, Michael D
2016-03-01
Healthcare-associated infections are costly and fatal. Substantial front-line, administrative, regulatory, and research efforts have focused on improving hand hygiene. While broad agreement exists that hand hygiene is the most important single approach to infection prevention, compliance with hand hygiene is typically only about 40% (1). Our aim was to develop a standard process for room entry in the intensive care unit that improved compliance with hand hygiene and allowed for maximum efficiency. We recognized that hand hygiene is a single step in a substantially more complicated process of room entry. We applied Lean engineering techniques to develop a standard process that included both physical steps and standard communication elements from provider to patients and families, and we created a physical environment to support this. We observed meaningful improvement in performance of the new standard as well as time savings for clinical providers with each room entry. We also observed an increase in room entries that included verbal communication and an explanation of what the clinician was entering the room to do. The design and implementation of a standardized room-entry process, and the creation of an environment that supports that new process, have resulted in measurable positive outcomes on the medical intensive care unit, including quality, patient experience, efficiency, and staff satisfaction. Designing a process, rather than viewing tasks that need to happen in close proximity in time (either serially or in parallel) as unrelated, simplifies work for staff and results in higher compliance with individual tasks. Copyright © 2015 Elsevier Inc. All rights reserved.
Elnaggar, Yosra Shaaban R; El-Massik, Magda A; Abdallah, Ossama Y; Ebian, Abd Elazim R
2010-06-01
The recent challenge in orally disintegrating tablet (ODT) manufacturing encompasses the compromise between instantaneous disintegration, sufficient hardness, and standard processing equipment. The current investigation constitutes one attempt to meet this challenge. Maltodextrin was utilized in the present work as a novel excipient to prepare ODT of meclizine. Tablets were prepared by both direct compression and wet granulation techniques. The effect of maltodextrin concentration on ODT characteristics--manifested as hardness and disintegration time--was studied. The effect of conditioning (40 degrees C and 75% relative humidity) as a post-compression treatment on ODT characteristics was also assessed. Furthermore, the pronounced hardening effect of maltodextrin was investigated using differential scanning calorimetry (DSC) and X-ray analysis. Results revealed that in both techniques, rapid disintegration (30-40 s) was achieved at the cost of tablet hardness (about 1 kg). Post-compression conditioning of the tablets resulted in an increase in hardness (3 kg) while retaining rapid disintegration (30-40 s) in accordance with FDA guidance for ODT. However, the direct compression-conditioning technique exhibited the drawbacks of a long conditioning time and the appearance of the so-called patch effect. These problems were absent in the wet granulation-conditioning technique. DSC and X-ray analysis suggested the involvement of glass-elastic deformation in the maltodextrin hardening effect. High-performance liquid chromatography analysis of meclizine ODT indicated no degradation of the drug under the applied conditions of temperature and humidity. The overall results propose maltodextrin as a promising saccharide for the production of ODT with an accepted hardness-disintegration time compromise, utilizing standard processing equipment and the phenomenon of phase transition.
Connatser, Raynella M.; Lewis, Sr., Samuel Arthur; Keiser, James R.; ...
2014-10-03
Integrating biofuels with conventional petroleum products requires improvements in processing to increase blendability with existing fuels. This work demonstrates analysis techniques for more hydrophilic bio-oil liquids that give improved quantitative and qualitative description of the total acid content and organic acid profiles. To protect infrastructure from damage and reduce the cost associated with upgrading, accurate determination of acid content and representative chemical compound analysis are central imperatives in assessing both the corrosivity and the progress toward removing oxygen and acidity in processed biomass liquids. Established techniques form an ample basis for bio-liquids evaluation. However, early in the upgrading process, the unique physical phases and varied hydrophilicity of many pyrolysis liquids can render analytical methods originally designed for use in petroleum-derived oils inadequate. In this work, the water solubility of the organic acids present in bio-oils is exploited in a novel extraction and titration technique followed by analysis on the water-based capillary electrophoresis (CE) platform. The modification of ASTM D664, the standard for Total Acid Number (TAN), to include aqueous carrier solvents improves the utility of that approach for quantifying acid content in hydrophilic bio-oils. Termed AMTAN (modified Total Acid Number), this technique offers 1.2% relative standard deviation and a dynamic range comparable to the conventional ASTM method. Furthermore, the results of corrosion product evaluations using several different sources of real bio-oil are discussed in the context of the unique AMTAN and CE analytical approaches developed to facilitate those measurements.
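Whatever the carrier solvent, an acid number still reduces to the KOH-equivalence arithmetic of ASTM D664. A minimal sketch of that calculation; the titration volumes, blank and sample mass below are hypothetical, for illustration only.

```python
# Acid number (mg KOH per gram of sample) from a potentiometric titration,
# as in ASTM D664 and its aqueous AMTAN variant.
MW_KOH = 56.1  # molar mass of KOH, g/mol

def acid_number(v_titrant_ml, conc_mol_l, v_blank_ml, sample_g):
    """TAN = (V - V_blank) * c * 56.1 / m."""
    return (v_titrant_ml - v_blank_ml) * conc_mol_l * MW_KOH / sample_g

# e.g. 4.20 mL of 0.1 M KOH against a 0.05 mL blank for a 0.5 g bio-oil cut
print(f"TAN = {acid_number(4.20, 0.1, 0.05, 0.5):.1f} mg KOH/g")
```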
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-29
... Promulgation of Air Quality Implementation Plans; Maryland; Control of Volatile Organic Compounds Emissions... Maryland's Volatile Organic Compounds from Specific Processes Regulation. Maryland has adopted standards... (RACT) requirements for sources of volatile organic compounds (VOCs) covered by control techniques...
EVALUATION OF BIOSOLID SAMPLE PROCESSING TECHNIQUES TO MAXIMIZE RECOVERY OF BACTERIA
Current federal regulations (40 CFR 503) require enumeration of fecal coliform or Salmonella prior to land application of Class A biosolids. This regulation specifies use of enumeration methods included in "Standard Methods for the Examination of Water and Wastewater 18th Edition,...
Du, Baoqiang; Dong, Shaofeng; Wang, Yanfeng; Guo, Shuting; Cao, Lingzhi; Zhou, Wei; Zuo, Yandi; Liu, Dan
2013-11-01
A wide-frequency-range, high-resolution frequency measurement method based on the quantized phase-step law is presented in this paper. Utilizing the variation law of the phase differences, direct different-frequency phase processing, and the phase group synchronization phenomenon, and combining an A/D converter with the adaptive phase-shifting principle, a counter gate is established at the phase coincidences occurring at one-group intervals, which eliminates the ±1 count error of the traditional frequency measurement method. More importantly, direct phase comparison, measurement, and control between arbitrary periodic signals are realized without frequency normalization. Experimental results show that sub-picosecond resolution can easily be obtained in frequency measurement, frequency-standard comparison, and phase-locked control based on the phase quantization processing technique. The method may find wide use in navigation and positioning, space techniques, communication, radar, astronomy, atomic frequency standards, and other high-tech fields.
NASA Astrophysics Data System (ADS)
Jolivet, S.; Mezghani, S.; El Mansori, M.
2016-09-01
The replication of topography has generally been restricted to optimizing material-processing technologies in terms of statistical, single-scale features such as roughness. By contrast, manufactured surface topography is highly complex, irregular, and multiscale. In this work, we demonstrate the use of multiscale analysis on replicates of surface finish to assess how precisely the finished replica is controlled. Five commercial resins used for surface replication were compared. The topographies of five standard surfaces representative of common finishing processes were acquired both directly and by a replication technique. They were then characterized using the ISO 25178 standard and a multiscale decomposition based on a continuous wavelet transform, to compare the roughness transfer quality at different scales. Additionally, the atomic force microscope force-modulation mode was used to compare the resins' stiffness properties. The results showed that less stiff resins are able to replicate the surface finish over a larger wavelength band. The method was then tested for non-destructive quality control of automotive gear tooth surfaces.
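Scale-by-scale comparison of an original and its replica can be sketched with a discrete wavelet decomposition (here via PyWavelets) standing in for the paper's continuous wavelet transform: the RMS amplitude of the detail coefficients at each level gives a roughness-versus-scale spectrum, and the replica-to-original ratio is the transfer quality. The profile, wavelet and smoothing used below are all invented for illustration.

```python
import numpy as np
import pywt

def multiscale_roughness(profile, wavelet="db4", levels=6):
    """RMS of wavelet detail coefficients per level, ordered fine -> coarse."""
    coeffs = pywt.wavedec(profile, wavelet, level=levels)
    return [np.sqrt(np.mean(d**2)) for d in coeffs[1:][::-1]]

rng = np.random.default_rng(0)
surface = np.cumsum(rng.standard_normal(4096)) * 0.01     # toy master profile
replica = np.convolve(surface, np.ones(9) / 9, mode="same")  # replica loses fine detail

for lvl, (a, b) in enumerate(zip(multiscale_roughness(surface),
                                 multiscale_roughness(replica)), start=1):
    print(f"level {lvl}: original {a:.4f}  replica {b:.4f}  transfer {b/a:.2f}")
```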
A study of software standards used in the avionics industry
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.
1994-01-01
Within the past decade, software has become an increasingly common element in computing systems. In particular, the role of software used in the aerospace industry, especially in life- or safety-critical applications, is rapidly expanding. This intensifies the need to use effective techniques for achieving and verifying the reliability of avionics software. Although certain software development processes and techniques are mandated by government regulating agencies, no single methodology has been shown to consistently produce reliable software. The knowledge base for designing reliable software simply has not reached the maturity of its hardware counterpart. In an effort to increase our understanding of software, the Langley Research Center conducted a series of experiments over 15 years with the goal of understanding why and how software fails. As part of this program, the effectiveness of current industry standards for the development of avionics software is being investigated. This study involves the generation of a controlled environment for conducting scientific experiments on software processes.
Digital techniques for processing Landsat imagery
NASA Technical Reports Server (NTRS)
Green, W. B.
1978-01-01
An overview is presented of the basic techniques used to process Landsat images with a digital computer, and of the VICAR image processing software developed at JPL and available to users through the NASA-sponsored COSMIC computer program distribution center. Examples are given of subjective processing performed to improve the information display for the human observer, such as contrast enhancement, pseudocolor display and band ratioing, and of quantitative processing using mathematical models, such as classification based on multispectral signatures of different areas within a given scene and geometric transformation of imagery into standard mapping projections. The examples are illustrated by Landsat scenes of the Andes mountains and the Altyn-Tagh fault zone in China before and after contrast enhancement, and by a classification of land use in Portland, Oregon. The VICAR image processing software system, which consists of a language translator that simplifies execution of image processing programs and provides a general-purpose format so that imagery from a variety of sources can be processed by the same basic set of general application programs, is described.
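Two of the subjective operations named above are one-liners on raster arrays. A minimal sketch with a synthetic two-band scene standing in for Landsat data (all values hypothetical): contrast stretching rescales a percentile window to the display range, and band ratioing cancels the illumination factor common to both bands.

```python
import numpy as np

def linear_stretch(band, p_lo=2, p_hi=98):
    """Contrast enhancement: stretch the 2nd-98th percentile range
    of a band linearly onto the full 0-255 display range."""
    lo, hi = np.percentile(band, [p_lo, p_hi])
    return np.clip((band - lo) / (hi - lo) * 255, 0, 255).astype(np.uint8)

def band_ratio(b1, b2, eps=1e-6):
    """Band ratioing suppresses topographic shading, which scales both
    bands equally, while preserving spectral differences."""
    return b1.astype(float) / (b2.astype(float) + eps)

# Toy scene: two spectral classes modulated by the same illumination field
rng = np.random.default_rng(6)
illum = 0.5 + 0.5 * rng.random((64, 64))
b_red = 80 * illum
b_nir = (120 + 60 * (rng.random((64, 64)) > 0.5)) * illum

ratio = band_ratio(b_nir, b_red)        # illumination cancels in the ratio
print("stretched dtype:", linear_stretch(ratio).dtype)
print("ratio contrast (std/mean):", round(float(ratio.std() / ratio.mean()), 3))
```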
Jakob, Severin; Pfeifenberger, Manuel J.; Hohenwarter, Anton; Pippan, Reinhard
2017-01-01
The standard preparation technique for micro-sized samples is focused ion beam milling, most frequently using Ga+ ions. The main drawbacks are the required processing time and the possibility and risks of ion implantation. In contrast, ultrashort pulsed laser ablation can process any type of material with ideally negligible damage to the surrounding volume and provides 4 to 6 orders of magnitude higher ablation rates than the ion beam technique. In this work, a femtosecond laser was used to prepare wood samples from spruce for mechanical testing at the micrometre level. After optimization of the different laser parameters, tensile and compressive specimens were produced from microtomed radial-tangential and longitudinal-tangential sections. Additionally, laser-processed samples were exposed to an electron beam prior to testing to study possible beam damage. The specimens originating from these different preparation conditions were mechanically tested. Advantages and limitations of the femtosecond laser preparation technique and the deformation and fracture behaviour of the samples are discussed. The results prove that femtosecond laser processing is a fast and precise preparation technique, which enables the fabrication of pristine biological samples with dimensions at the microscale. PMID:28970867
New signal processing technique for density profile reconstruction using reflectometry.
Clairet, F; Ricaud, B; Briolle, F; Heuraux, S; Bottereau, C
2011-08-01
Reflectometry profile measurement requires an accurate determination of the plasma-reflected signal. Along with good resolution and a high signal-to-noise ratio in the phase measurement, adequate data analysis is required. A new data processing method based on a time-frequency tomographic representation is used. It provides a clearer separation between multiple components and improves isolation of the relevant signals. In this paper, this data processing technique is applied to two sets of signals coming from two different reflectometers used on the Tore Supra tokamak. For standard density profile reflectometry, it improves the initialization process and its reliability, providing a more accurate profile determination in the far scrape-off layer with density measurements as low as 10^16 m^-3. For a second reflectometer, which provides measurements in front of a lower hybrid launcher, this method improves the separation of the relevant plasma signal from multi-reflection processes due to the proximity of the plasma.
Manufacture and quality control of interconnecting wire harnesses, Volume 1
NASA Technical Reports Server (NTRS)
1972-01-01
A standard is presented for the manufacture, installation, and quality control of eight types of interconnecting wire harnesses. The processes, process controls, and inspection and test requirements reflected are based on acknowledgment of harness design requirements, acknowledgment of harness installation requirements, identification of the various parts, materials, etc., utilized in harness manufacture, and formulation of a typical manufacturing flow diagram identifying each manufacturing and quality control process, operation, inspection, and test. The document covers interconnecting wire harnesses defined in the design standard, including type 1, enclosed in fluorocarbon elastomer convolute tubing; type 2, enclosed in TFE convolute tubing lined with fiberglass braid; type 3, enclosed in TFE convolute tubing; and type 5, a combination of types 3 and 4. Knowledge gained through experience on the Saturn 5 program, coupled with recent advances in techniques, materials, and processes, was incorporated.
Image processing techniques for digital orthophotoquad production
Hood, Joy J.; Ladner, L. J.; Champion, Richard A.
1989-01-01
Orthophotographs have long been recognized for their value as supplements or alternatives to standard maps. Recent trends toward digital cartography have resulted in efforts by the US Geological Survey to develop a digital orthophotoquad production system. Digital image files were created by scanning color infrared photographs on a microdensitometer. Rectification techniques were applied to remove tilt and relief displacement, thereby creating digital orthophotos. Image mosaicking software was then used to join the rectified images, producing digital orthophotos in quadrangle format.
Porosity and Permeability of Chondritic Materials
NASA Technical Reports Server (NTRS)
Zolensky, Michael E.; Corrigan, Catherine M.; Dahl, Jason; Long, Michael
1996-01-01
We have investigated the porosity of a large number of chondritic interplanetary dust particles (IDPs) and meteorites by three techniques: standard liquid/gas flow techniques, a new, non-invasive ultrasonic technique, and image processing of backscattered images. The latter technique is obviously best suited to sub-kilogram samples. We have also measured the gas and liquid permeabilities of some chondrites by two techniques: standard liquid/gas flow techniques and a new, non-destructive pressure-release technique. We find that chondritic IDPs have a somewhat bimodal porosity distribution. Peaks are present at 0 and 4% porosity; a tail then extends to 53%. These values suggest IDP bulk densities of 1.1 to 3.3 g/cc. Type 1-3 chondrite matrix porosities range up to 30%, with a peak at 2%. The bulk porosities for type 1-3 chondrites have the same approximate range as exhibited by the matrix, indicating that other components of the bulk meteorites (including chondrules and aggregates) have the same average porosity as the matrix. These results reveal that the porosity of primitive materials at scales ranging from nanogram to kilogram is similar, implying that similar accretion dynamics operated through 12 orders of magnitude in size. Permeabilities of the investigated chondrites vary by several orders of magnitude, and there appears to be no simple dependence of permeability on degree of aqueous alteration or chondrite type.
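For the image-processing route, porosity is essentially a point count: segment pores from matrix and take the areal fraction of pore pixels. A minimal sketch on a synthetic backscattered frame; the threshold, pore fraction and grey-level statistics are invented, and real work would add denoising and an automatic threshold such as Otsu's method.

```python
import numpy as np

def porosity_from_image(image, threshold):
    """Areal porosity from a backscattered image: pores image dark,
    so porosity is the fraction of pixels below the threshold."""
    return float(np.mean(image < threshold))

# Synthetic 8-bit-like frame: bright matrix with ~4% dark pores
rng = np.random.default_rng(2)
img = rng.normal(180, 15, (512, 512))
pores = rng.random((512, 512)) < 0.04
img[pores] = rng.normal(40, 10, pores.sum())

print(f"porosity = {100 * porosity_from_image(img, threshold=100):.1f}%")
```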
Mäkitie, A A; Salmi, M; Lindford, A; Tuomi, J; Lassus, P
2016-12-01
Prosthetic mask restoration of the donor face is essential in current facial transplant protocols. The aim was to develop a new three-dimensional (3D) printing (additive manufacturing; AM) process for the production of a donor face mask that fulfilled the requirements for facial restoration after facial harvest. A digital image of a single test person's face was obtained in a standardized setting and subjected to three different image processing techniques. These data were used for the 3D modeling and printing of a donor face mask. The process was also tested in a cadaver setting and ultimately used clinically in a donor patient after facial allograft harvest. All three developed and tested techniques enabled the timely 3D printing of a custom-made face mask that is almost an exact replica of the donor patient's face. The technique was successfully used in a facial allotransplantation donor patient. Copyright © 2016 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Vangenderen, J. L. (Principal Investigator); Lock, B. F.
1976-01-01
The author has identified the following significant results. The scope of the preprocessing techniques was restricted to standard material from the EROS Data Center, accompanied by some enlarging procedures and the use of the diazo process. The investigation showed that the most appropriate sampling strategy for this study is the stratified random technique. A viable sampling procedure is presented, together with a method for determining the minimum number of sample points needed to test the results of any interpretation.
How to Meet Water Cleanup Deadlines
ERIC Educational Resources Information Center
Schmidt, Richard K.
1976-01-01
Most waste treatment techniques conceived to meet the 1977 standards can be separated into three distinct phases: primary, secondary and tertiary treatment. An examination of the four heaviest industrial water users, pulp and paper, steel, plating, and food processing, demonstrates these treatments use proven technology to meet specific…
Internal Medicine House Officers' Performance as Assessed by Experts and Standardized Patients.
ERIC Educational Resources Information Center
Calhoun, Judith G.; And Others
1987-01-01
Three chronically ill patients were trained to evaluate the performance of 31 second-year internal medicine house officers based upon: a checklist for the medical data elicited during the medical interview; the process of the interview; and the physical examination technique. (Author/MLW)
Action Research: Enhancing Classroom Practice and Fulfilling Educational Responsibilities
ERIC Educational Resources Information Center
Young, Mark R.; Rapp, Eve; Murphy, James W.
2010-01-01
Action Research is an applied scholarly paradigm resulting in action for continuous improvement in our teaching and learning techniques offering faculty immediate classroom payback and providing documentation of meeting our educational responsibilities as required by AACSB standards. This article reviews the iterative action research process of…
Continuous internal channels formed in aluminum fusion welds
NASA Technical Reports Server (NTRS)
Gault, J.; Sabo, W.
1967-01-01
Process produces continuous internal channel systems on a repeatable basis in 2014-T6 aluminum. Standard machining forms the initial channel, which is filled with tungsten carbide powder. TIG machine fusion welding completes formation of the channel. Chem-mill techniques enlarge it to the desired size.
Selmi, Giuliana da Fontoura Rodrigues; Trapé, Angelo Zanaga
2014-05-01
Quantification of dermal exposure to pesticides in rural workers, used in risk assessment, can be performed with different techniques such as patches or whole body evaluation. However, the wide variety of methods can jeopardize the process by producing disparate results, depending on the principles in sample collection. A critical review was thus performed on the main techniques for quantifying dermal exposure, calling attention to this issue and the need to establish a single methodology for quantification of dermal exposure in rural workers. Such harmonization of different techniques should help achieve safer and healthier working conditions. Techniques that can provide reliable exposure data are an essential first step towards avoiding harm to workers' health.
EXPERIMENTS IN LITHOGRAPHY FROM REMOTE SENSOR IMAGERY.
Kidwell, R. H.; McSweeney, J.; Warren, A.; Zang, E.; Vickers, E.
1983-01-01
Imagery from remote sensing systems such as the Landsat multispectral scanner and return beam vidicon, as well as synthetic aperture radar and conventional optical camera systems, contains information at resolutions far in excess of that which can be reproduced by the lithographic printing process. The data often require special handling to produce both standard and special map products. Some conclusions have been drawn regarding processing techniques, procedures for production, and printing limitations.
Allison, A.G.
1959-09-01
A process is described for preparing a magnesium oxide slip-casting slurry which, when used in conjunction with standard casting techniques, results in a very strong "green" slip casting and a fired piece of very close dimensional tolerance. The process involves aging an aqueous magnesium oxide slurry, having a basic pH value, until it attains a specified critical viscosity, at which time a deflocculating agent is added without upsetting the basic pH value.
Three-dimensional measurement system for crime scene documentation
NASA Astrophysics Data System (ADS)
Adamczyk, Marcin; Hołowko, Elwira; Lech, Krzysztof; Michoński, Jakub; Mączkowski, Grzegorz; Bolewicki, Paweł; Januszkiewicz, Kamil; Sitnik, Robert
2017-10-01
Three-dimensional measurement techniques (such as photogrammetry, time of flight, structure from motion, or structured light) are becoming a standard in the crime scene documentation process. The use of 3D measurement techniques provides an opportunity to prepare a more insightful investigation and helps to show every trace in the context of the entire crime scene. In this paper we present a hierarchical, three-dimensional measurement system designed for the crime scene documentation process. Our system reflects current standards in crime scene documentation: it performs measurements in two stages. The first stage of documentation, the most general, is prepared with a scanner of relatively low spatial resolution but large measuring volume, and is used to document the whole scene. The second stage is much more detailed: high resolution but a smaller measuring volume, for areas that require a more detailed approach. The documentation process is supervised by a specialized application, CrimeView3D, a software platform for measurement management (connection with the scanners, carrying out measurements, and automatic or semi-automatic data registration in real time) and data visualization (3D visualization of the documented scenes). It also provides a series of useful tools for forensic technicians: a virtual measuring tape, searching for sources of blood spatter, a virtual walk through the crime scene, and many others. We also report the outcome of research on the metrological validation of the scanners, performed according to the VDI/VDE standard, and results from measurement sessions conducted at real crime scenes in cooperation with technicians from the Central Forensic Laboratory of the Police.
Evidence for the associated production of a W boson and a top quark at ATLAS
NASA Astrophysics Data System (ADS)
Koll, James
This thesis discusses a search for the Standard Model single top Wt-channel process. An analysis has been performed searching for the Wt-channel process using 4.7 fb^-1 of integrated luminosity collected with the ATLAS detector at the Large Hadron Collider. A boosted decision tree is trained using machine learning techniques to increase the separation between signal and background. A profile likelihood fit is used to measure the cross-section of the Wt-channel process as σ(pp → Wt + X) = 16.8 ± 2.9 (stat) ± 4.9 (syst) pb, consistent with the Standard Model prediction. This fit is also used to generate pseudoexperiments to calculate the significance, finding an observed (expected) 3.3σ (3.4σ) excess over background.
Fiber Bragg grating sensor to monitor stress kinetics in drying process of commercial latex paints.
de Lourenço, Ivo; Possetti, Gustavo R C; Muller, Marcia; Fabris, José L
2010-01-01
In this paper, we report a study on the application of packaged fiber Bragg gratings used as strain sensors to monitor the stress kinetics during the drying process of commercial latex paints. Three stages of drying with distinct mechanical deformations and temporal behaviors were identified for the samples, with mechanical deformations from 15 μm to 21 μm in the longitudinal film dimension over time intervals from 370 to 600 minutes. Drying-time tests based on the human-sense technique described by the Brazilian Technical Standard NBR 9558 were also performed. The results obtained show that the human-sense technique has a limited perception of the drying process and that the proposed optical measurement system can correctly characterize the dry-through stage of the paint. The influence of solvent (water) addition on the drying process was also investigated. The paint was diluted with four parts paint and one part water (80% paint), and with one part paint and one part water (50% paint). It was observed that increasing the water ratio mixed into the paint decreases both the magnitude of the mechanical deformation and the paint dry-through time. Contractions of 5.2 μm and 10.4 μm were measured for concentrations of 50% and 80% paint in the mixture, respectively. For both diluted paints the dry-through time was approximately 170 minutes less than for the undiluted paint. The optical technique proposed in this work can contribute to the development of new standards to specify the drying time of paint coatings.
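The grating converts film contraction into a Bragg wavelength shift, so strain follows from the standard relation Δλ/λ_B = (1 - p_e)·ε when temperature effects are neglected. A minimal sketch of that conversion; the photo-elastic coefficient is the textbook value for silica fiber, and the shift and gauge length are invented rather than the paper's calibration.

```python
# Strain from a fiber Bragg grating wavelength shift (temperature ignored):
#   delta_lambda / lambda_B = (1 - p_e) * strain
P_E = 0.22  # typical effective photo-elastic coefficient of silica fiber

def fbg_strain(lambda_b_nm, shift_pm):
    """Dimensionless strain from the Bragg wavelength and its shift."""
    return (shift_pm * 1e-3 / lambda_b_nm) / (1 - P_E)

strain = fbg_strain(1550.0, shift_pm=-25.0)   # hypothetical drying-induced shift
gauge_mm = 20.0                               # hypothetical bonded gauge length
print(f"strain = {strain:.2e}, contraction = {abs(strain) * gauge_mm * 1000:.2f} um")
```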
A Novel Catalyst Deposition Technique for the Growth of Carbon Nanotubes
NASA Technical Reports Server (NTRS)
Delzeit, Lance; Cassell, A.; Stevens, R.; Nguyen, C.; Meyyappan, M.; DeVincenzi, Donald L. (Technical Monitor)
2001-01-01
This viewgraph presentation provides information on the development of a technique at NASA's Ames Research Center by which carbon nanotubes (NT) can be grown. The project had several goals which included: 1) scaleability, 2) ability to control single wall nanotube (SWNT) and multiwall nanotube (MWNT) formation, 3) ability to control the density of nanotubes as they grow, 4) ability to apply standard masking techniques for NT patterning. Information regarding the growth technique includes its use of a catalyst deposition process. SWNTs of varying thicknesses can be grown by changing the catalyst composition. Demonstrations are given of various methods of masking including the use of transmission electron microscopic (TEM) grids.
Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom
2015-01-01
It has been shown that gait speed and transfer times are good measures of functional ability in the elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual review by healthcare workers. This reviewing process is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control (SPC) methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques, tabular CUSUM, standardized CUSUM and EWMA, known for their ability to detect small shifts in the data, are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method, with a detection accuracy of 82% and an average detection time of 9.64 days.
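An EWMA chart smooths each new observation into an exponentially weighted average and signals when that average leaves its control limits, which is what makes it sensitive to small sustained shifts. A minimal sketch on simulated daily transfer times; the shift size, chart parameters and seed are illustrative, not the paper's settings.

```python
import numpy as np

def ewma_chart(x, target, sigma, lam=0.2, L=3.0):
    """Standard EWMA chart: z_i = lam*x_i + (1-lam)*z_{i-1}; alarm when z
    leaves target +/- L * sigma_z (exact time-varying variance used)."""
    z, alarms = target, []
    for i, xi in enumerate(x):
        z = lam * xi + (1 - lam) * z
        var = sigma**2 * lam / (2 - lam) * (1 - (1 - lam) ** (2 * (i + 1)))
        if abs(z - target) > L * np.sqrt(var):
            alarms.append(i)
    return alarms

# Simulated daily transfer times (s): upward shift after day 60
rng = np.random.default_rng(3)
times = np.r_[rng.normal(20, 2, 60), rng.normal(23, 2, 40)]
print("alarm days:", ewma_chart(times, target=20, sigma=2))
```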
NASA Astrophysics Data System (ADS)
He, Yang; Sun, Yajuan; Zhang, Ruili; Wang, Yulei; Liu, Jian; Qin, Hong
2016-09-01
We construct high-order symmetric volume-preserving methods for the relativistic dynamics of a charged particle using the splitting technique with processing. By expanding the phase space to include the time t, we give a more general construction of volume-preserving methods that can be applied to systems with time-dependent electromagnetic fields. The newly derived methods provide numerical solutions with good accuracy and conservation properties over long simulation times. Furthermore, thanks to the accuracy-enhancing processing technique, the explicit methods attain high-order accuracy and are more efficient than methods derived from standard compositions. The results are verified by numerical experiments. Linear stability analysis of the methods shows that the high-order processed method allows a larger time step in numerical integrations.
Comparison of denture base adaptation between CAD-CAM and conventional fabrication techniques.
Goodacre, Brian J; Goodacre, Charles J; Baba, Nadim Z; Kattadiyil, Mathew T
2016-08-01
Currently, no data comparing the denture base adaptation of CAD-CAM and conventional denture processing techniques have been reported. The purpose of this in vitro study was to compare the denture base adaptation of pack-and-press, pour, injection, and CAD-CAM fabrication techniques to determine which process produces the most accurate and reproducible adaptation. A definitive cast was duplicated to create 40 gypsum casts that were laser scanned before any fabrication procedures were initiated. A master denture was made using the CAD-CAM process and was then used to create a putty mold for the fabrication of 30 standardized wax festooned dentures, 10 for each of the conventional processing techniques (pack and press, pour, injection). Scan files from 10 casts were sent to Global Dental Science, LLC for fabrication of the CAD-CAM test specimens. After the specimens for each of the 4 techniques had been fabricated, they were hydrated for 24 hours and the intaglio surface was laser scanned. The scan file of each denture was superimposed on the scan file of the corresponding preprocessing cast using surface-matching software. Measurements were made at 60 locations, providing an evaluation of fit discrepancies at the following areas: apex of the denture border, 6 mm from the denture border, crest of the ridge, palate, and posterior palatal seal. Medians and interquartile ranges were used to assess accuracy and reproducibility. The Levene and Kruskal-Wallis analyses of variance were used to evaluate differences between processing techniques at the 5 specified locations (α=.05). The ranking of results based on median and interquartile range determined that the accuracy and reproducibility of the CAD-CAM technique were more consistently localized around zero at 3 of the 5 locations. Therefore, the CAD-CAM technique showed the best combination of accuracy and reproducibility among the tested fabrication techniques. The pack-and-press technique was more accurate at 2 of the 5 locations; however, its interquartile range (reproducibility) was the largest of the 4 tested processing techniques. The pour technique was the most reproducible at 2 of the 5 locations; however, its accuracy was the lowest of the tested techniques. The CAD-CAM fabrication process was the most accurate and reproducible denture fabrication technique when compared with the pack-and-press, pour, and injection denture base processing techniques. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Shaeri, Mohammad Ali; Sodagar, Amir M
2015-05-01
This paper proposes an efficient data compression technique dedicated to implantable intra-cortical neural recording devices. The proposed technique benefits from processing neural signals in the Discrete Haar Wavelet Transform space, a new spike extraction approach, and a novel data framing scheme to telemeter the recorded neural information to the outside world. Based on the proposed technique, a 64-channel neural signal processor was designed and prototyped as part of a wireless implantable extra-cellular neural recording microsystem. Designed in a 0.13-μm standard CMOS process, the 64-channel neural signal processor occupies approximately 0.206 mm^2 of silicon area and consumes 94.18 μW when operating from a 1.2-V supply at a master clock frequency of 1.28 MHz.
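The Haar transform itself is just pairwise sums and differences, which is what makes it attractive for on-implant hardware; compression then amounts to keeping the few large coefficients that carry spike energy. A minimal sketch of that idea, with one decomposition level, an invented spike waveform and an arbitrary 90th-percentile threshold, not the chip's actual pipeline:

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal discrete Haar transform: pairwise
    averages (approximation) and differences (detail), each scaled by 1/sqrt(2)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

# Synthetic neural channel: background noise plus one spike-like event
rng = np.random.default_rng(4)
signal = rng.normal(0, 1, 1024)
signal[500:508] += np.hanning(8) * 40      # injected spike

a, d = haar_dwt(signal)
coeffs = np.concatenate([a, d])
keep = np.abs(coeffs) >= np.quantile(np.abs(coeffs), 0.90)  # keep top 10%
print(f"kept {keep.sum()} of {coeffs.size} coefficients "
      f"({100 * keep.mean():.0f}%); the spike coefficients exceed the threshold")
```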
Stocka, Jolanta; Tankiewicz, Maciej; Biziuk, Marek; Namieśnik, Jacek
2011-01-01
Pesticides are among the most dangerous environmental pollutants because of their stability, mobility and long-term effects on living organisms. Their presence in the environment is a particular danger. It is therefore crucial to monitor pesticide residues using all available analytical methods. The analysis of environmental samples for the presence of pesticides is very difficult: the processes involved in sample preparation are labor-intensive and time-consuming. To date, it has been standard practice to use large quantities of organic solvents in the sample preparation process; but as these solvents are themselves hazardous, solvent-less and solvent-minimized techniques are becoming popular. The application of Green Chemistry principles to sample preparation is primarily leading to the miniaturization of procedures and the use of solvent-less techniques, and these are discussed in the paper. PMID:22174632
Liquid argon TPC signal formation, signal processing and reconstruction techniques
NASA Astrophysics Data System (ADS)
Baller, B.
2017-07-01
This document describes a reconstruction chain that was developed for the ArgoNeuT and MicroBooNE experiments at Fermilab. These experiments study accelerator neutrino interactions that occur in a liquid argon time projection chamber. Reconstructing the properties of particles produced in these interactions benefits from knowledge of the micro-physics processes that affect the creation and transport of ionization electrons to the readout system. A wire-signal deconvolution technique was developed to convert wire signals to a standard form for hit reconstruction, to remove artifacts in the electronics chain, and to remove coherent noise. A unique clustering algorithm reconstructs line-like trajectories and vertices in two dimensions, which are then matched to create 3D objects. These techniques and algorithms are available to all experiments that use the LArSoft suite of software.
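Deconvolution of this kind divides out the response in the frequency domain, with regularization so that noise-dominated frequencies are suppressed rather than amplified. A schematic sketch with invented pulse and response shapes; the real chain uses measured field and electronics responses per wire plane.

```python
import numpy as np

def wiener_deconvolve(signal, response, noise_power=1e-3):
    """Regularized (Wiener-style) frequency-domain deconvolution: divide out
    the response where it is strong, damp frequencies where it is weak."""
    n = len(signal)
    S = np.fft.rfft(signal, n)
    R = np.fft.rfft(response, n)
    filt = np.conj(R) / (np.abs(R) ** 2 + noise_power)
    return np.fft.irfft(S * filt, n)

# Toy example: a unipolar "ionization" pulse convolved with a bipolar
# induction-style response, plus white noise (shapes are illustrative).
t = np.arange(256)
truth = np.exp(-0.5 * ((t - 128) / 4.0) ** 2)
resp = np.gradient(np.exp(-0.5 * ((t - 20) / 6.0) ** 2))
raw = np.convolve(truth, resp, mode="full")[:256] + np.random.normal(0, 0.01, 256)

rec = wiener_deconvolve(raw, resp)
print("recovered peak near sample:", int(np.argmax(rec)))   # ~128
```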
A RAPID Method for Blood Processing to Increase the Yield of Plasma Peptide Levels in Human Blood.
Teuffel, Pauline; Goebel-Stengel, Miriam; Hofmann, Tobias; Prinz, Philip; Scharner, Sophie; Körner, Jan L; Grötzinger, Carsten; Rose, Matthias; Klapp, Burghard F; Stengel, Andreas
2016-04-28
Research in the field of food intake regulation is gaining importance. This often includes the measurement of peptides regulating food intake. For the correct determination of a peptide's concentration, the peptide should be stable during blood processing. However, this is not the case for several peptides, which are quickly degraded by endogenous peptidases. Recently, we developed a blood processing method employing Reduced temperatures, Acidification, Protease inhibition, Isotopic exogenous controls and Dilution (RAPID) for use in rats. Here, we have established this technique for use in humans and investigated the recovery, molecular form and circulating concentration of food intake regulatory hormones. The RAPID method significantly improved the recovery of 125I-labeled somatostatin-28 (+39%), glucagon-like peptide-1 (+35%), acyl ghrelin and glucagon (+32%), insulin and kisspeptin (+29%), nesfatin-1 (+28%), leptin (+21%) and peptide YY3-36 (+19%) compared to standard processing (EDTA blood on ice, p <0.001). High performance liquid chromatography showed elution of endogenous acyl ghrelin at the expected position after RAPID processing, whereas after standard processing 62% of the acyl ghrelin was degraded, resulting in an earlier peak likely representing desacyl ghrelin. After RAPID processing the acyl/desacyl ghrelin ratio in the blood of normal weight subjects was 1:3, compared to 1:23 following standard processing (p = 0.03). Endogenous kisspeptin levels were also higher after RAPID compared to standard processing (+99%, p = 0.02). The RAPID blood processing method can be used in humans, yields higher peptide levels and allows for assessment of the correct molecular form.
Jabs, Douglas A; Nussenblatt, Robert B; Rosenbaum, James T
2005-09-01
To begin a process of standardizing the methods for reporting clinical data in the field of uveitis. Consensus workshop. Members of an international working group were surveyed about diagnostic terminology, inflammation grading schema, and outcome measures, and the results were used to develop a series of proposals to better standardize the use of these entities. Small groups employed nominal group techniques to achieve consensus on several of these issues. The group affirmed that an anatomic classification of uveitis should be used as a framework for subsequent work on diagnostic criteria for specific uveitic syndromes, and that the classification of uveitis entities should be based on the location of the inflammation and not on the presence of structural complications. Issues regarding the use of the terms "intermediate uveitis," "pars planitis," "panuveitis," and descriptors of the onset and course of the uveitis were addressed. The following were adopted: standardized grading schema for anterior chamber cells, anterior chamber flare, and vitreous haze; standardized methods of recording structural complications of uveitis; standardized definitions of outcomes, including "inactive" inflammation, "improvement" and "worsening" of the inflammation, and "corticosteroid sparing"; and standardized guidelines for reporting visual acuity outcomes. A process of standardizing the approach to reporting clinical data in uveitis research has begun, and several terms have been standardized.
Moyer, Jason T; Gnatkovsky, Vadym; Ono, Tomonori; Otáhal, Jakub; Wagenaar, Joost; Stacey, William C; Noebels, Jeffrey; Ikeda, Akio; Staley, Kevin; de Curtis, Marco; Litt, Brian; Galanopoulou, Aristea S
2017-11-01
Electroencephalography (EEG)-the direct recording of the electrical activity of populations of neurons-is a tremendously important tool for diagnosing, treating, and researching epilepsy. Although standard procedures for recording and analyzing human EEG exist and are broadly accepted, there are no such standards for research in animal models of seizures and epilepsy-recording montages, acquisition systems, and processing algorithms may differ substantially among investigators and laboratories. The lack of standard procedures for acquiring and analyzing EEG from animal models of epilepsy hinders the interpretation of experimental results and reduces the ability of the scientific community to efficiently translate new experimental findings into clinical practice. Accordingly, the intention of this report is twofold: (1) to review current techniques for the collection and software-based analysis of neural field recordings in animal models of epilepsy, and (2) to offer pertinent standards and reporting guidelines for this research. Specifically, we review current techniques for signal acquisition, signal conditioning, signal processing, data storage, and data sharing, and include applicable recommendations to standardize collection and reporting. We close with a discussion of challenges and future opportunities, and include a supplemental report of currently available acquisition systems and analysis tools. This work represents a collaboration on behalf of the American Epilepsy Society/International League Against Epilepsy (AES/ILAE) Translational Task Force (TASK1-Workgroup 5), and is part of a larger effort to harmonize video-EEG interpretation and analysis methods across studies using in vivo and in vitro seizure and epilepsy models. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.
Nanowire and microwire fabrication technique and product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sumant, Anirudha V.; Zach, Michael; Marten, Alan David
A continuous or semi-continuous process for fabricating nanowires or microwires makes use of a substantially planar template that may be moved through an electrochemical solution to grow nanowires or microwires on exposed conductive edges on the surface of that template. The planar template allows fabrication of the template using standard equipment and techniques. Adhesive transfer may be used to remove the wires from the template and, in one embodiment, to draw a continuous wire from the template to be wound around a drum.
Digital radiography: spatial and contrast resolution
NASA Astrophysics Data System (ADS)
Bjorkholm, Paul; Annis, M.; Frederick, E.; Stein, J.; Swift, R.
1981-07-01
The addition of digital image collection and storage to standard and newly developed x-ray imaging techniques has allowed spectacular improvements in some diagnostic procedures. There is no reason to expect that the developments in this area are yet complete. But no matter what further developments occur in this field, all the techniques will share a common element: digital image storage and processing. This common element alone determines some of the important imaging characteristics. These will be discussed using one system, the Medical MICRODOSE System, as an example.
Engineering Encounters: An Engineering Design Process for Early Childhood
ERIC Educational Resources Information Center
Lottero-Perdue, Pamela; Bowditch, Michelle; Kagan, Michelle; Robinson-Cheek, Linda; Webb, Tedra; Meller, Megan; Nosek, Theresa
2016-01-01
This column presents ideas and techniques to enhance your science teaching. This month's issue shares information about trying (again) to engineer an egg package. Engineering is an essential part of science education, as emphasized in the "Next Generation Science Standards" (NGSS Lead States 2013). Engineering practices and performance…
On the numerical treatment of Coulomb forces in scattering problems
NASA Astrophysics Data System (ADS)
Randazzo, J. M.; Ancarani, L. U.; Colavecchia, F. D.; Gasaneo, G.; Frapiccini, A. L.
2012-11-01
We investigate the limiting procedures to obtain Coulomb interactions from short-range potentials. The application of standard techniques used for the two-body case (exponential and sharp cutoff) to the three-body break-up problem is illustrated numerically by considering the Temkin-Poet (TP) model of e-H processes.
Dataflow Integration and Simulation Techniques for DSP System Design Tools
2007-01-01
Manufacture and quality control of interconnecting wire harnesses, Volume 3
NASA Technical Reports Server (NTRS)
1972-01-01
The document covers interconnecting wire harnesses defined in the design standard, including type 6, enclosed in TFE heat shrink tubing; and type 7, flexible armored. Knowledge gained through experience on the Saturn 5 program coupled with recent advances in techniques, materials, and processes was incorporated into this document.
Total Quality Management: Application in Vocational Education. ERIC Digest No. 125.
ERIC Educational Resources Information Center
Lankard, Bettina A.
Total Quality Management (TQM) establishes business and industry standards and techniques that ensure the quality of products leaving and reaching firms through continuous actions rather than one final inspection. Deming, Juran, and Crosby, who initiated the process, share a common theme of participatory management. Management participation and…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-04
Excerpt fragments: … program helps to ensure that requested data can be provided in the desired format, reporting burden (time …) … provides the coal mining industry with a standardized reporting format that expedites the certification process … appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of …
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-14
Excerpt fragments: Flat Wood Paneling Surface Coating Processes. AGENCY: Environmental Protection Agency (EPA). ACTION: … sources covered by EPA's Control Techniques Guidelines (CTG) standards for flat wood paneling surface coating … the Pennsylvania Department of Environmental Protection (PADEP) submitted to EPA a SIP revision concerning the adoption of the CTG for flat wood paneling …
Statistical segmentation of multidimensional brain datasets
NASA Astrophysics Data System (ADS)
Desco, Manuel; Gispert, Juan D.; Reig, Santiago; Santos, Andres; Pascau, Javier; Malpica, Norberto; Garcia-Barreno, Pedro
2001-07-01
This paper presents an automatic segmentation procedure for MRI neuroimages that overcomes part of the problems involved in multidimensional clustering techniques, such as partial volume effects (PVE), processing speed, and the difficulty of incorporating a priori knowledge. The method is a three-stage procedure: 1) Exclusion of background and skull voxels using threshold-based region growing techniques with fully automated seed selection. 2) Expectation Maximization algorithms are used to estimate the probability density function (PDF) of the remaining voxels, which are assumed to be mixtures of Gaussians. These voxels can then be classified into cerebrospinal fluid (CSF), white matter and grey matter. Using this procedure, our method takes advantage of the full covariance matrix (instead of the diagonal) for the joint PDF estimation. On the other hand, logistic discrimination techniques are more robust against violation of multi-Gaussian assumptions. 3) A priori knowledge is added using Markov Random Field techniques. The algorithm has been tested with a dataset of 30 brain MRI studies (co-registered T1 and T2 MRI). Our method was compared with clustering techniques and with template-based statistical segmentation, using manual segmentation as a gold standard. Our results were more robust and closer to the gold standard.
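A minimal sketch of the second stage described above, EM fitting of a Gaussian mixture with full covariance matrices to classify masked voxels into three tissue classes; scikit-learn is assumed here, and the two-channel (co-registered T1/T2) feature layout is illustrative:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def segment_brain_voxels(t1, t2, brain_mask):
    """Classify masked voxels into 3 tissue classes (CSF, WM, GM)
    by EM-fitting a Gaussian mixture with full covariance matrices."""
    # Stack co-registered T1/T2 intensities into an (n_voxels, 2) feature matrix.
    features = np.column_stack([t1[brain_mask], t2[brain_mask]])
    gmm = GaussianMixture(n_components=3, covariance_type="full", max_iter=200)
    gmm.fit(features)                  # EM estimation of the joint PDF
    labels = gmm.predict(features)     # hard class assignment per voxel
    segmentation = np.full(t1.shape, -1, dtype=int)  # -1 = background/skull
    segmentation[brain_mask] = labels
    return segmentation
```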
NASA Astrophysics Data System (ADS)
Miller, Timothy M.; Abrahams, John H.; Allen, Christine A.
2006-04-01
We report a fabrication process for deep etching silicon to different depths with a single masking layer, using standard masking and exposure techniques. Using this technique, we have incorporated a deep notch in the support walls of a transition-edge-sensor (TES) bolometer array during the detector back-etch, while simultaneously creating a cavity behind the detector. The notches serve to receive the support beams of a separate component, the Backshort-Under-Grid (BUG), an array of adjustable height quarter-wave backshorts that fill the cavities behind each pixel in the detector array. The backshort spacing, set prior to securing to the detector array, can be controlled from 25 to 300 μm by adjusting only a few process steps. In addition to backshort spacing, the interlocking beams and notches provide positioning and structural support for the ˜1 mm pitch, 8×8 array. This process is being incorporated into developing a TES bolometer array with an adjustable backshort for use in far-infrared astronomy. The masking technique and machining process used to fabricate the interlocking walls will be discussed.
Neutrino oscillation processes in a quantum-field-theoretical approach
NASA Astrophysics Data System (ADS)
Egorov, Vadim O.; Volobuev, Igor P.
2018-05-01
It is shown that neutrino oscillation processes can be consistently described in the framework of quantum field theory using only the plane wave states of the particles. Namely, the oscillating electron survival probabilities in experiments with neutrino detection by charged-current and neutral-current interactions are calculated in the quantum field-theoretical approach to neutrino oscillations based on a modification of the Feynman propagator in the momentum representation. The approach is most similar to the standard Feynman diagram technique. It is found that the oscillating distance-dependent probabilities of detecting an electron in experiments with neutrino detection by charged-current and neutral-current interactions exactly coincide with the corresponding probabilities calculated in the standard approach.
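As a point of reference, in the standard two-flavor approximation the distance-dependent electron survival probability that such calculations reproduce takes the familiar form (the paper's field-theoretical derivation is more general than this textbook expression):

```latex
P_{e\to e}(L) \;=\; 1 - \sin^{2}(2\theta)\,
\sin^{2}\!\left(\frac{\Delta m^{2}\,L}{4E}\right).
```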
Additive Manufacturing of Metal Structures at the Micrometer Scale.
Hirt, Luca; Reiser, Alain; Spolenak, Ralph; Zambelli, Tomaso
2017-05-01
Currently, the focus of additive manufacturing (AM) is shifting from simple prototyping to actual production. One driving factor of this process is the ability of AM to build geometries that are not accessible by subtractive fabrication techniques. While these techniques often call for a geometry that is easiest to manufacture, AM enables the geometry required for best performance to be built by freeing the design process from restrictions imposed by traditional machining. At the micrometer scale, the design limitations of standard fabrication techniques are even more severe. Microscale AM thus holds great potential, as confirmed by the rapid success of commercial micro-stereolithography tools as an enabling technology for a broad range of scientific applications. For metals, however, there is still no established AM solution at small scales. To tackle the limited resolution of standard metal AM methods (a few tens of micrometers at best), various new techniques aimed at the micrometer scale and below are presently under development. Here, we review these recent efforts. Specifically, we feature the techniques of direct ink writing, electrohydrodynamic printing, laser-assisted electrophoretic deposition, laser-induced forward transfer, local electroplating methods, laser-induced photoreduction and focused electron or ion beam induced deposition. Although these methods have proven to facilitate the AM of metals with feature sizes in the range of 0.1-10 µm, they are still in a prototype stage and their potential is not fully explored yet. For instance, comprehensive studies of material availability and material properties are often lacking, yet compulsory for actual applications. We address these items while critically discussing and comparing the potential of current microscale metal AM techniques. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Taylor, Christopher T.; Hutchinson, Simon; Salmon, Neil A.; Wilkinson, Peter N.; Cameron, Colin D.
2014-06-01
Image processing techniques can be used to improve the cost-effectiveness of future interferometric Passive MilliMetre Wave (PMMW) imagers. The implementation of such techniques will allow for a reduction in the number of collecting elements whilst ensuring adequate image fidelity is maintained. Various techniques have been developed by the radio astronomy community to enhance the imaging capability of sparse interferometric arrays. The most prominent are Multi-Frequency Synthesis (MFS) and non-linear deconvolution algorithms, such as the Maximum Entropy Method (MEM) and variations of the CLEAN algorithm. This investigation focuses on the implementation of these methods in the de facto standard for radio astronomy image processing, the Common Astronomy Software Applications (CASA) package, building upon the discussion presented in Taylor et al., SPIE 8362-0F. We describe the image conversion process into a CASA-suitable format, followed by a series of simulations that exploit the highlighted deconvolution and MFS algorithms assuming far-field imagery. The primary target application used for this investigation is an outdoor security scanner for soft-sided Heavy Goods Vehicles. A quantitative analysis of the effectiveness of the aforementioned image processing techniques is presented, with thoughts on the potential cost-savings such an approach could yield. Consideration is also given to how the implementation of these techniques in CASA might be adapted to operate in a near-field target environment. This may enable much wider usability by the imaging community outside of radio astronomy and thus would be directly relevant to portal screening security systems in the microwave and millimetre wave bands.
NASA Astrophysics Data System (ADS)
Fitri, Noor; Yandi, Nefri; Hermawati, Julianto, Tatang Shabur
2017-03-01
A comparative study of the quality of patchouli oil produced by the Water-Steam Distillation (WSD) and Water Bubble Distillation (WBD) techniques was carried out. The raw materials were patchouli plants from Samigaluh village, Kulon Progo district, Yogyakarta. The study aimed to compare the two distillation techniques in order to find the optimal technique for increasing the content of patchouli alcohol (patchoulol) and the quality of patchouli oil. Pretreatments such as withering, drying, size reduction and light fermentation were intended to increase the yield. One kilogram of patchouli was moisturized with 500 mL of aquadest. The light fermentation process was carried out for 20 hours in a dark container. Fermented patchouli was extracted for 6 hours using the Water-Steam and Water Bubble Distillation techniques. Physical and chemical property tests of the patchouli oil were performed against SNI standard No. SNI-06-2385-2006, and the chemical composition of the oil was analysed by GC-MS. Water-Steam Distillation gave the higher oil yield, 5.9% versus 2.4%. The specific gravity, refractive index and acid number of the Water-Steam Distillation oil did not meet the SNI standard (0.991, 1.623 and 13.19, respectively), while the Water Bubble Distillation oil met the standard (0.955, 1.510 and 6.61). The patchoulol content obtained with the Water Bubble Distillation technique, 61.53%, is significantly higher than that obtained with Water-Steam Distillation, 38.24%. Thus, Water Bubble Distillation is a promising technique for increasing the patchoulol content of patchouli oil.
Thin Film Transistors On Plastic Substrates
Carey, Paul G.; Smith, Patrick M.; Sigmon, Thomas W.; Aceves, Randy C.
2004-01-20
A process for formation of thin film transistors (TFTs) on plastic substrates replaces standard thin film transistor fabrication techniques and uses sufficiently low processing temperatures that inexpensive plastic substrates may be used in place of standard glass, quartz, and silicon wafer-based substrates. The silicon based thin film transistor produced by the process includes a low temperature substrate incapable of withstanding sustained processing temperatures greater than about 250 °C, an insulating layer on the substrate, a layer of silicon on the insulating layer having sections of doped silicon, undoped silicon, and poly-silicon, a gate dielectric layer on the layer of silicon, a layer of gate metal on the dielectric layer, a layer of oxide on sections of the layer of silicon and the layer of gate metal, and metal contacts on sections of the layer of silicon and layer of gate metal defining source, gate, and drain contacts, and interconnects.
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Lewicki, Scott; Morgan, Scott
2011-01-01
The measurement techniques for organizations which have achieved the Software Engineering Institute's CMMI Maturity Levels 4 and 5 are well documented. On the other hand, how to effectively measure when an organization is at Maturity Level 3 is less well understood, especially when there is no consistency in tool use and there is extensive tailoring of the organizational software processes. Most organizations fail in their attempts to generate, collect, and analyze standard process improvement metrics under these conditions. But at JPL, NASA's prime center for deep space robotic exploration, we have a long history of proving there is always a solution: it just may not be what you expected. In this paper we describe the wide variety of qualitative and quantitative techniques we have been implementing over the last few years, including the various approaches used to communicate the results to both software technical managers and senior managers.
Translation of scales in cross-cultural research: issues and techniques.
Cha, Eun-Seok; Kim, Kevin H; Erlen, Judith A
2007-05-01
This paper is a report of a study designed to: (i) describe issues and techniques in the translation of standard measures for use in international research; (ii) identify a user-friendly and valid translation method for researchers with limited resources during the translation procedure; and (iii) discuss translation issues using data from a pilot study as an example. The process of translation is an important part of cross-cultural studies. Cross-cultural researchers are often confronted by the need to translate scales from one language to another and to do this with limited resources. The lessons learned from our experience in a pilot study are presented to underline the importance of using appropriate translation procedures. The issues with the back-translation method are discussed to identify strategies to ensure success when translating measures. A combined technique is an appropriate method to maintain content equivalence between the original and translated instruments in international research. There are several possible combinations of translation techniques. However, there is no gold standard of translation technique, because research environments (e.g. accessibility and availability of bilingual people) and research questions differ. It is important to use appropriate translation procedures and to employ a combined translation technique suited to the research environment and questions.
Neonatal Jaundice Detection System.
Aydın, Mustafa; Hardalaç, Fırat; Ural, Berkan; Karap, Serhat
2016-07-01
Neonatal jaundice is a common condition that occurs in newborn infants in the first week of life. The techniques currently used for detection require blood samples and other clinical testing with special equipment. The aim of this study is to create a non-invasive system to monitor and detect jaundice periodically and to help doctors with early diagnosis. In this work, first, a patient group consisting of jaundiced babies and a control group consisting of healthy babies were prepared; then, between 24 and 48 h after birth, 40 jaundiced and 40 healthy newborns were chosen. Second, advanced image processing techniques were applied to images taken with a standard smartphone and a color calibration card. Segmentation, pixel similarity and white balancing methods were used as the image processing techniques, and RGB values and important pixel information were obtained exactly. Third, during the feature extraction stage, using colormap transformations and feature calculation, comparisons were made in the RGB plane between color change values and a specially designed 8-color calibration card. Finally, in the bilirubin level estimation stage, kNN and SVR machine learning regressions were applied to the dataset obtained from feature extraction. At the end of the process, with the control group as the baseline for comparison, jaundice was successfully detected in the 40 jaundiced infants, with a success rate of 85%. The bilirubin estimates obtained are consistent with the bilirubin results from the standard blood test, with a compliance rate of 85%.
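A minimal sketch of the final estimation stage described above (kNN and SVR regression from calibrated skin-color features to a bilirubin level), using scikit-learn; the feature vectors and training values below are placeholders, not the study's dataset:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR

# Placeholder data: each row is a calibrated RGB feature vector extracted
# from a skin patch; y holds the matching blood-test bilirubin levels (mg/dL).
X_train = np.array([[180, 160, 90], [200, 170, 80],
                    [150, 140, 110], [210, 180, 70]], dtype=float)
y_train = np.array([12.0, 15.5, 6.2, 17.1])

knn = KNeighborsRegressor(n_neighbors=3).fit(X_train, y_train)
svr = SVR(kernel="rbf").fit(X_train, y_train)

new_patch = np.array([[190, 165, 85]], dtype=float)
print(knn.predict(new_patch), svr.predict(new_patch))  # two estimates to compare
```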
Micro-quantity tissue digestion for metal measurements by use of a microwave acid-digestion bomb.
Nicholson, J R; Savory, M G; Savory, J; Wills, M R
1989-03-01
We describe a simple and convenient method for processing small amounts of tissue samples for trace-metal measurements by atomic absorption spectrometry, by use of a modified Parr microwave digestion bomb. Digestion proceeds rapidly (less than or equal to 90 s) in a sealed Teflon-lined vessel that eliminates contamination or loss from volatilization. Small quantities of tissue (5-100 mg dry weight) are digested in high-purity nitric acid, yielding concentrations of analyte that can be measured directly without further sample manipulation. We analyzed National Institute of Standards and Technology bovine liver Standard Reference Material to verify the accuracy of the technique. We assessed the applicability of the technique to analysis for aluminum in bone by comparison with a dry ashing procedure.
Nikendei, C; Ganschow, P; Groener, J B; Huwendiek, S; Köchel, A; Köhl-Hackert, N; Pjontek, R; Rodrian, J; Scheibe, F; Stadler, A-K; Steiner, T; Stiepak, J; Tabatabai, J; Utz, A; Kadmon, M
2016-01-01
The competent physical examination of patients and the safe and professional implementation of clinical procedures constitute essential components of medical practice in nearly all areas of medicine. The central objective of the projects "Heidelberg standard examination" and "Heidelberg standard procedures", which were initiated by students, was to establish uniform interdisciplinary standards for physical examination and clinical procedures, and to distribute them in coordination with all clinical disciplines at the Heidelberg University Hospital. The presented project report illuminates the background of the initiative and its methodological implementation. Moreover, it describes the multimedia documentation in the form of pocketbooks and a multimedia internet-based platform, as well as the integration into the curriculum. The project presentation aims to provide orientation and action guidelines to facilitate similar processes in other faculties.
Measuring Operational Resilience Using the CERT(Registered) Resilience Management Model
2010-09-01
Excerpt fragments: … (such as ISO 27002 [ISO 2005]) and then measure the implementation and performance of practices contained in the standard. This checklist-based ap… ISO/IEC 27002:2005, Information technology – Security techniques – Code of practice for information security management, June 2005 (also known as ISO/IEC 17799:2005). [ISO 2007] … Table 23: ISO 15939 Process Activities and Tasks; Table 24: CERT-RMM Measurement and Analysis Process Area Goals and Practices.
Wafer hot spot identification through advanced photomask characterization techniques
NASA Astrophysics Data System (ADS)
Choi, Yohan; Green, Michael; McMurran, Jeff; Ham, Young; Lin, Howard; Lan, Andy; Yang, Richer; Lung, Mike
2016-10-01
As device manufacturers progress through advanced technology nodes, limitations in standard 1-dimensional (1D) mask Critical Dimension (CD) metrics are becoming apparent. Historically, 1D metrics such as Mean to Target (MTT) and CD Uniformity (CDU) have been adequate for end users to evaluate and predict the mask impact on the wafer process. However, the wafer lithographer's process margin is shrinking at advanced nodes to a point that the classical mask CD metrics are no longer adequate to gauge the mask contribution to wafer process error. For example, wafer CDU error at advanced nodes is impacted by mask factors such as 3-dimensional (3D) effects and mask pattern fidelity on subresolution assist features (SRAFs) used in Optical Proximity Correction (OPC) models of ever-increasing complexity. These items are not quantifiable with the 1D metrology techniques of today. Likewise, the mask maker needs advanced characterization methods in order to optimize the mask process to meet the wafer lithographer's needs. These advanced characterization metrics are what is needed to harmonize mask and wafer processes for enhanced wafer hot spot analysis. In this paper, we study advanced mask pattern characterization techniques and their correlation with modeled wafer performance.
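For readers unfamiliar with the two classical 1D metrics mentioned above, commonly used definitions are sketched below; conventions vary between mask shops, and the 3-sigma form of CDU is only one common choice:

```latex
\mathrm{MTT} \;=\; \frac{1}{N}\sum_{i=1}^{N}\bigl(\mathrm{CD}_i - \mathrm{CD}_{\mathrm{target}}\bigr),
\qquad
\mathrm{CDU} \;=\; 3\,\sigma_{\mathrm{CD}}
\;=\; 3\sqrt{\frac{1}{N-1}\sum_{i=1}^{N}\bigl(\mathrm{CD}_i - \overline{\mathrm{CD}}\bigr)^{2}}.
```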
NASA Astrophysics Data System (ADS)
Dyakov, Y. A.; Kazaryan, M. A.; Golubkov, M. G.; Gubanova, D. P.; Bulychev, N. A.; Kazaryan, S. M.
2018-04-01
Studying the processes occurring in biological systems under irradiation is critically important for understanding the principles by which biological systems work. One of the main problems that stimulates interest in the processes of photo-induced excitation and ionization of biomolecules is the necessity of identifying them by various mass spectrometry (MS) methods. While simple analysis of small molecules became a standard MS technique long ago, recognition of large molecules, especially carbohydrates, is still a difficult problem and requires sophisticated techniques and complicated computer analysis. Due to the large variety of substances in the samples, as well as the complexity of the processes occurring after excitation/ionization of the molecules, the recognition efficiency of MS techniques for carbohydrates is still not high enough. Additional theoretical and experimental analysis of ionization and dissociation processes in various kinds of polysaccharides, beginning with the simplest ones, is necessary. In this work, we extend previous theoretical and experimental studies of saccharides and concentrate our attention on protonated glucose. In this article we pay particular attention to the cross-ring dissociation and water loss reactions because of their importance for identifying various isomers of carbohydrate molecules (for example, distinguishing α- and β-glucose).
Study Methods to Standardize Thermography NDE
NASA Technical Reports Server (NTRS)
Walker, James L.; Workman, Gary L.
1998-01-01
The purpose of this work is to develop thermographic inspection methods and standards for use in evaluating structural composites and aerospace hardware. Qualification techniques and calibration methods are investigated to standardize the thermographic method for use in the field. Along with the inspections of test standards and structural hardware, support hardware is designed and fabricated to aid in the thermographic process. Also, a standard operating procedure is developed for performing inspections with the Bales Thermal Image Processor (TIP). Inspections are performed on a broad range of structural composites. These materials include various graphite/epoxies, graphite/cyanide-ester, graphite/silicon-carbide, graphite phenolic and Kevlar/epoxy. Also metal honeycomb (titanium and aluminum faceplates over an aluminum honeycomb core) structures are investigated. Various structural shapes are investigated, and the thicknesses of the structures vary from as few as 3 plies to as many as 80 plies. Special emphasis is placed on characterizing defects in attachment holes and bondlines, in addition to those resulting from impact damage and the inclusion of foreign matter. Image processing through statistical analysis and digital filtering is investigated to enhance the quality of and quantify the NDE thermal images when necessary.
Implementing ISO/IEEE 11073: proposal of two different strategic approaches.
Martínez-Espronceda, M; Serrano, L; Martínez, I; Escayola, J; Led, S; Trigo, J; García, J
2008-01-01
This paper explains the challenges encountered during the ISO/IEEE 11073 standard implementation process. The complexity of the standard imposes heavy requirements that have not encouraged software engineers to adopt it. Our evaluation of this development complexity leads us to propose two possible implementation strategies that cover almost all possible use cases and ease handling of the standard by non-expert users. The first is focused on medical devices (MDs) and proposes a low-memory and low-processor-usage technique. It is based on message patterns that allow simple functions to generate ISO/IEEE 11073 messages and to process them easily. In this way a framework for MDs can be obtained. The second is focused on more powerful machines such as data loggers or gateways (aka computer engines (CEs)), which do not have the MDs' memory and processor usage constraints. For CEs a more intelligent and adaptive Plug&Play (P&P) solution is provided. It consists of a general platform that can access any device supported by the standard. Combining both strategies will cut development time for applications based on ISO/IEEE 11073.
Proceedings of the 21st DOE/NRC Nuclear Air Cleaning Conference; Sessions 1--8
DOE Office of Scientific and Technical Information (OSTI.GOV)
First, M.W.
1991-02-01
Separate abstracts have been prepared for the papers presented at the meeting on nuclear facility air cleaning technology in the following specific areas of interest: air cleaning technologies for the management and disposal of radioactive wastes; Canadian waste management program; radiological health effects models for nuclear power plant accident consequence analysis; filter testing; US standard codes on nuclear air and gas treatment; European community nuclear codes and standards; chemical processing off-gas cleaning; incineration and vitrification; adsorbents; nuclear codes and standards; mathematical modeling techniques; filter technology; safety; containment system venting; and nuclear air cleaning programs around the world. (MB)
Size reduction techniques for vital compliant VHDL simulation models
Rich, Marvin J.; Misra, Ashutosh
2006-08-01
A method and system select delay values from a VHDL standard delay file that correspond to an instance of a logic gate in a logic model. The system then collects all the delay values of the selected instance and builds super generics for the rise-time and the fall-time of that instance. The system repeats this process for every delay value in the standard delay file that corresponds to every instance of every logic gate in the logic model. The system then outputs a reduced-size standard delay file containing the super generics for every instance of every logic gate in the logic model.
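A minimal sketch of the reduction idea in this patent abstract: gather all rise/fall delay values per gate instance and emit one "super generic" pair per instance. The worst-case max rule and the simplified record format are assumptions on our part; real SDF parsing, and the patent's exact combination rule, may differ:

```python
from collections import defaultdict

def build_super_generics(delay_records):
    """delay_records: iterable of (instance_name, rise_ps, fall_ps) tuples
    collected from a standard delay file. Returns one worst-case
    (rise, fall) pair -- a 'super generic' -- per gate instance."""
    rise = defaultdict(float)
    fall = defaultdict(float)
    for inst, r, f in delay_records:
        rise[inst] = max(rise[inst], r)   # keep the slowest rise delay seen
        fall[inst] = max(fall[inst], f)   # keep the slowest fall delay seen
    return {inst: (rise[inst], fall[inst]) for inst in rise}

records = [("u1.nand2", 120.0, 95.0), ("u1.nand2", 135.0, 90.0), ("u2.inv", 60.0, 70.0)]
print(build_super_generics(records))
# {'u1.nand2': (135.0, 95.0), 'u2.inv': (60.0, 70.0)}
```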
Quantitative optical metrology with CMOS cameras
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.
2004-08-01
Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, and full field of view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence selection and applicability of an optical technique include the required sensitivity, accuracy, and precision that are necessary for a particular application. In this paper, sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.
Full-Field Strain Methods for Investigating Failure Mechanisms in Triaxial Braided Composites
NASA Technical Reports Server (NTRS)
Littell, Justin D.; Binienda, Wieslaw K.; Goldberg, Robert K.; Roberts, Gary D.
2008-01-01
Recent advancements in braiding technology have led to commercially viable manufacturing approaches for making large structures with complex shape out of triaxial braided composite materials. In some cases, the static load capability of structures made using these materials has been higher than expected based on material strength properties measured using standard coupon tests. A more detailed investigation of deformation and failure processes in large-unit-cell-size triaxial braid composites is needed to evaluate the applicability of standard test methods for these materials and to develop alternative testing approaches. This report presents some new techniques that have been developed to investigate local deformation and failure using digital image correlation techniques. The methods were used to measure both local and global strains during standard straight-sided coupon tensile tests on composite materials made with 12- and 24-k yarns and a 0°/+60°/-60° triaxial braid architecture. Local deformation and failure within fiber bundles was observed and correlations were made between these local failures and global composite deformation and strength.
NDE standards for high temperature materials
NASA Technical Reports Server (NTRS)
Vary, Alex
1991-01-01
High temperature materials include monolithic ceramics for automotive gas turbine engines and also metallic/intermetallic and ceramic matrix composites for a range of aerospace applications. These are materials that can withstand the extreme operating temperatures that will prevail in advanced high-efficiency gas turbine engines. High temperature engine components are very likely to consist of complex composite structures with three-dimensionally interwoven and variously intermixed ceramic fibers. The thermomechanical properties of components made of these materials are actually created in place during the processing and fabrication stages. The complex nature of these new materials creates strong incentives for exact standards for unambiguous evaluation of defects and microstructural characteristics. NDE techniques and standards that will ultimately be applicable to production and quality control of high temperature materials and structures are still emerging. The needs range from flaw detection at below-100-micron levels in monolithic ceramics to global imaging of fiber architecture and matrix densification anomalies in composites. The needs differ depending on the processing stage, fabrication method, and nature of the finished product. The standards that must be developed in concert with advances in NDE technology, materials processing research, and fabrication development are discussed. High temperature materials and structures that fail to meet stringent specifications and standards are unlikely to compete successfully either technologically or in international markets.
Aerogel to simulate delamination and porosity defects in carbon-fiber reinforced polymer composites
NASA Astrophysics Data System (ADS)
Juarez, Peter; Leckey, Cara A. C.
2018-04-01
Representative defect standards are essential for the validation and calibration of new and existing inspection techniques. However, commonly used methods of simulating delaminations in carbon-fiber reinforced polymer (CFRP) composites do not accurately represent the behavior of real-world defects for several widely used NDE techniques. For instance, it is common practice to create a delamination standard by inserting Polytetrafluoroethylene (PTFE) between ply layers. However, PTFE can transmit more ultrasonic energy than actual delaminations, leading to an unrealistic representation of the defect inspection. PTFE can also deform/wrinkle during the curing process and has a thermal effusivity two orders of magnitude higher than air (almost equal to that of a CFRP). It is therefore not effective in simulating a delamination for thermography. Currently there is also no standard practice for producing or representing a known porosity in composites. This paper presents a novel method of creating delamination and porosity standards using aerogel. Insertion of thin sheets of solid aerogel between ply layers during layup is shown to produce air-gap-like delaminations creating realistic ultrasonic and thermographic inspection responses. Furthermore, it is shown that depositing controlled amounts of aerogel powder can represent porosity. Micrograph data verify the structural integrity of the aerogel through the composite curing process. This paper presents data from multiple NDE methods, including X-ray computed tomography, immersion ultrasound, and flash thermography, to demonstrate the effectiveness of aerogel as a delamination and porosity simulant.
Distributed geospatial model sharing based on open interoperability standards
Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin
2009-01-01
Numerous geospatial computational models have been developed based on sound principles and published in journals or presented at conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering the development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required by the Web Processing Service (WPS) standard, we developed an interactive interface for model sharing to help reduce interoperability problems in model use. Geospatial computational models are shared as model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
Rahman, Zia Ur; Sethi, Pooja; Murtaza, Ghulam; Virk, Hafeez Ul Hassan; Rai, Aitzaz; Mahmod, Masliza; Schoondyke, Jeffrey; Albalbissi, Kais
2017-01-01
Cardiovascular disease is a leading cause of morbidity and mortality globally. Early diagnostic markers are gaining popularity for better patient care and disease outcomes. There is increasing interest in noninvasive cardiac imaging biomarkers to diagnose subclinical cardiac disease. Feature tracking cardiac magnetic resonance imaging is a novel post-processing technique that is increasingly being employed to assess global and regional myocardial function. This technique has numerous applications in structural and functional diagnostics. It has been validated in multiple studies, although there is still a long way to go before it becomes routine standard of care.
Yimer, Mulat; Hailu, Tadesse; Mulu, Wondemagegn; Abera, Bayeh
2015-12-26
Although the sensitivity of the Wet mount technique is questionable, it is the major technique for routine diagnosis of intestinal parasitosis in Ethiopia. Therefore, the aim of this study was to evaluate the performance of diagnostic methods for intestinal parasitosis in school age children in Ethiopia. A cross sectional study was conducted from May to June 2013. A single stool sample per child was processed by the direct wet mount, Formol ether concentration (FEC) and Kato Katz methods. The sensitivity and negative predictive value (NPV) of the diagnostic tests were calculated against the "gold" standard method (the combined result of the three methods altogether). A total of 422 school age children participated in this study. The prevalence of intestinal parasites was high (74.6%) with the Kato Katz technique. The sensitivity of the Wet mount, FEC and Kato Katz tests against the gold standard was 48.9, 63.1 and 93.7%, respectively. The Kato Katz technique revealed a better NPV, 80.4% (80.1-80.6), compared to the Wet mount (33.7%) and FEC (41.3%) techniques. In this study, the Kato Katz technique outperformed the other two methods, but the true values for sensitivity, specificity and other diagnostic values are not known. Moreover, it is labor intensive and not easily accessible. Hence, it is preferable to use the FEC technique to complement the Wet mount test.
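The two figures of merit reported above follow directly from the 2x2 comparison against the combined gold standard; a minimal sketch (the counts are illustrative, not the study's raw data):

```python
def sensitivity(tp, fn):
    # proportion of gold-standard positives that the test detects
    return tp / (tp + fn)

def npv(tn, fn):
    # proportion of test negatives that are truly negative
    return tn / (tn + fn)

# Illustrative counts for one technique vs. the combined gold standard:
tp, fn, tn = 295, 20, 82
print(f"sensitivity = {sensitivity(tp, fn):.1%}, NPV = {npv(tn, fn):.1%}")
```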
Virtual lab demonstrations improve students' mastery of basic biology laboratory techniques.
Maldarelli, Grace A; Hartmann, Erica M; Cummings, Patrick J; Horner, Robert D; Obom, Kristina M; Shingles, Richard; Pearlman, Rebecca S
2009-01-01
Biology laboratory classes are designed to teach concepts and techniques through experiential learning. Students who have never performed a technique must be guided through the process, which is often difficult to standardize across multiple lab sections. Visual demonstration of laboratory procedures is a key element in teaching pedagogy. The main goals of the study were to create videos explaining and demonstrating a variety of lab techniques that would serve as teaching tools for undergraduate and graduate lab courses and to assess the impact of these videos on student learning. Demonstrations of individual laboratory procedures were videotaped and then edited with iMovie. Narration for the videos was edited with Audacity. Undergraduate students were surveyed anonymously prior to and following screening to assess the impact of the videos on student lab performance by completion of two Participant Perception Indicator surveys. A total of 203 and 171 students completed the pre- and posttesting surveys, respectively. Statistical analyses were performed to compare student perceptions of knowledge of, confidence in, and experience with the lab techniques before and after viewing the videos. Eleven demonstrations were recorded. Chi-square analysis revealed a significant increase in the number of students reporting increased knowledge of, confidence in, and experience with the lab techniques after viewing the videos. Incorporation of instructional videos as prelaboratory exercises has the potential to standardize techniques and to promote successful experimental outcomes.
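A minimal sketch of the chi-square comparison used on the survey data, with scipy; the pre/post response counts below are invented for illustration:

```python
from scipy.stats import chi2_contingency

# Rows: pre-video vs. post-video survey; columns: counts of students
# reporting low / medium / high confidence in a lab technique.
table = [[90, 80, 33],   # pre  (n = 203)
         [30, 60, 81]]   # post (n = 171)
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4g}")
```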
Blob-enhanced reconstruction technique
NASA Astrophysics Data System (ADS)
Castrillo, Giusy; Cafiero, Gioacchino; Discetti, Stefano; Astarita, Tommaso
2016-09-01
A method to enhance the quality of the tomographic reconstruction and, consequently, the 3D velocity measurement accuracy, is presented. The technique is based on integrating information on the objects to be reconstructed within the algebraic reconstruction process. A first-guess intensity distribution is produced with a standard algebraic method, then the distribution is rebuilt as a sum of Gaussian blobs, based on the location, intensity and size of agglomerates of light intensity surrounding local maxima. The blob substitution regularizes the particle shape, allowing a reduction of the particle discretization errors and of their elongation in the depth direction. The performance of the blob-enhanced reconstruction technique (BERT) is assessed with a 3D synthetic experiment. The results have been compared with those obtained by applying the standard camera simultaneous multiplicative reconstruction technique (CSMART) to the same volume. Several blob-enhanced reconstruction processes, both substituting the blobs at the end of the CSMART algorithm and during the iterations (i.e. using the blob-enhanced reconstruction as a predictor for the following iterations), have been tested. The results confirm the enhancement in the velocity measurement accuracy, demonstrating a reduction of the bias error due to ghost particles. The improvement is more remarkable at the largest tested seeding densities. Additionally, using the blob distributions as a predictor enables further improvement of the convergence of the reconstruction algorithm, the improvement being more considerable when substituting the blobs more than once during the process. The BERT process is also applied to multi-resolution (MR) CSMART reconstructions, permitting remarkable improvements in the flow field measurements to be achieved while simultaneously benefiting from the reduction in computational time due to the MR approach. Finally, BERT is also tested on experimental data, obtaining an increase of the signal-to-noise ratio in the reconstructed flow field and a higher value of the correlation factor in the velocity measurements with respect to the volume in which the particles are not replaced.
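A minimal sketch of the blob substitution step at the heart of BERT: locate local maxima in a first-guess 3D reconstruction and rebuild the field as a sum of Gaussian blobs. The peak detection and the fixed blob size below are simplified placeholders for the agglomerate location/intensity/size analysis described in the paper:

```python
import numpy as np
from scipy.ndimage import maximum_filter

def blob_enhance(volume, sigma=1.0, min_intensity=0.1):
    """Replace a first-guess tomographic field (3D array) with a sum of
    Gaussian blobs centred on its local maxima (simplified BERT-style step)."""
    # A voxel is a peak if it equals the max over its 3x3x3 neighbourhood.
    peaks = (volume == maximum_filter(volume, size=3)) & (volume > min_intensity)
    zz, yy, xx = np.indices(volume.shape)
    enhanced = np.zeros_like(volume)
    for z, y, x in np.argwhere(peaks):
        r2 = (zz - z) ** 2 + (yy - y) ** 2 + (xx - x) ** 2
        # One isotropic Gaussian blob per peak, scaled by the peak intensity.
        enhanced += volume[z, y, x] * np.exp(-r2 / (2.0 * sigma ** 2))
    return enhanced
```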
Milewski, Robert J; Kumagai, Yutaro; Fujita, Katsumasa; Standley, Daron M; Smith, Nicholas I
2010-11-19
Macrophages represent the front lines of our immune system; they recognize and engulf pathogens or foreign particles thus initiating the immune response. Imaging macrophages presents unique challenges, as most optical techniques require labeling or staining of the cellular compartments in order to resolve organelles, and such stains or labels have the potential to perturb the cell, particularly in cases where incomplete information exists regarding the precise cellular reaction under observation. Label-free imaging techniques such as Raman microscopy are thus valuable tools for studying the transformations that occur in immune cells upon activation, both on the molecular and organelle levels. Due to extremely low signal levels, however, Raman microscopy requires sophisticated image processing techniques for noise reduction and signal extraction. To date, efficient, automated algorithms for resolving sub-cellular features in noisy, multi-dimensional image sets have not been explored extensively. We show that hybrid z-score normalization and standard regression (Z-LSR) can highlight the spectral differences within the cell and provide image contrast dependent on spectral content. In contrast to typical Raman imaging processing methods using multivariate analysis, such as singular value decomposition (SVD), our implementation of the Z-LSR method can operate nearly in real-time. In spite of its computational simplicity, Z-LSR can automatically remove background and bias in the signal, improve the resolution of spatially distributed spectral differences and enable sub-cellular features to be resolved in Raman microscopy images of mouse macrophage cells. Significantly, the Z-LSR processed images automatically exhibited subcellular architectures whereas SVD, in general, requires human assistance in selecting the components of interest. The computational efficiency of Z-LSR enables automated resolution of sub-cellular features in large Raman microscopy data sets without compromise in image quality or information loss in associated spectra. These results motivate further use of label free microscopy techniques in real-time imaging of live immune cells.
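A minimal sketch of the hybrid step the authors call Z-LSR, as we read it from the abstract: z-score normalize each pixel spectrum, then least-squares regress it against reference spectra to obtain per-component contrast maps. The data shapes and the use of explicit reference spectra are assumptions for illustration:

```python
import numpy as np

def z_lsr(hypercube, references):
    """hypercube: (ny, nx, n_wavenumbers) Raman image; references:
    (n_components, n_wavenumbers) spectra. Returns (ny, nx, n_components)
    contrast maps via z-score normalization + least-squares regression."""
    ny, nx, nw = hypercube.shape
    spectra = hypercube.reshape(-1, nw)
    # Z-score each spectrum: removes per-pixel background/bias and scale.
    z = (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)
    # Least-squares fit of each normalized spectrum to the reference set.
    coeffs, *_ = np.linalg.lstsq(references.T, z.T, rcond=None)
    return coeffs.T.reshape(ny, nx, -1)
```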
Yuzbasioglu, Emir; Kurt, Hanefi; Turunc, Rana; Bilir, Halenur
2014-01-30
The purpose of this study was to compare two impression techniques from the perspective of patient preferences and treatment comfort. Twenty-four (12 male, 12 female) subjects who had no previous experience with either conventional or digital impressions participated in this study. Conventional impressions of the maxillary and mandibular dental arches were taken with a polyether impression material (Impregum, 3M ESPE), and bite registrations were made with a polysiloxane bite registration material (Futar D, Kettenbach). Two weeks later, digital impressions and bite scans were performed using an intra-oral scanner (CEREC Omnicam, Sirona). Immediately after the impressions were made, the subjects' attitudes, preferences and perceptions of the impression techniques were evaluated using a standardized questionnaire. The perceived source of stress was evaluated using the State-Trait Anxiety Scale. Processing steps of the impression techniques (tray selection, working time, etc.) were recorded in seconds. Statistical analyses were performed with the Wilcoxon rank test, and p < 0.05 was considered significant. There were significant differences among the groups (p < 0.05) in terms of total working time and processing steps. Patients stated that digital impressions were more comfortable than conventional techniques. Digital impressions resulted in a more time-efficient technique than conventional impressions. Patients preferred the digital impression technique over conventional techniques.
Multiscale Image Processing of Solar Image Data
NASA Astrophysics Data System (ADS)
Young, C.; Myers, D. C.
2001-12-01
It is often said that the blessing and curse of solar physics is too much data. Solar missions such as Yohkoh, SOHO and TRACE have shown us the Sun with amazing clarity but have also increased the amount of highly complex data. We have improved our view of the Sun, yet we have not improved our analysis techniques. The standard techniques used for analysis of solar images generally consist of observing the evolution of features in a sequence of byte-scaled images or a sequence of byte-scaled difference images. The determination of features and structures in the images is done qualitatively by the observer. There is little quantitative and objective analysis done with these images. Many advances in image processing techniques have occurred in the past decade. Many of these methods are possibly suited for solar image analysis. Multiscale/multiresolution methods are perhaps the most promising. These methods have been used to formulate the human ability to view and comprehend phenomena on different scales. Such techniques could thus be used to quantify the image processing done by the observers' eyes and brains. In this work we present several applications of multiscale techniques applied to solar image data. Specifically, we discuss uses of the wavelet, curvelet, and related transforms to define a multiresolution support for EIT, LASCO and TRACE images.
Automated installation methods for photovoltaic arrays
NASA Astrophysics Data System (ADS)
Briggs, R.; Daniels, A.; Greenaway, R.; Oster, J., Jr.; Racki, D.; Stoeltzing, R.
1982-11-01
Since installation expenses constitute a substantial portion of the cost of a large photovoltaic power system, methods for reduction of these costs were investigated. The installation of the photovoltaic arrays includes all areas, starting with site preparation (i.e., trenching, wiring, drainage, foundation installation, lightning protection, grounding and installation of the panel) and concluding with the termination of the bus at the power conditioner building. To identify the optimum combination of standard installation procedures and automated/mechanized techniques, the installation process was investigated including the equipment and hardware available, the photovoltaic array structure systems and interfaces, and the array field and site characteristics. Preliminary designs of hardware for both the standard installation method, the automated/mechanized method, and a mix of standard installation procedures and mechanized procedures were identified to determine which process effectively reduced installation costs. In addition, costs associated with each type of installation method and with the design, development and fabrication of new installation hardware were generated.
The FBI compression standard for digitized fingerprint images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brislawn, C.M.; Bradley, J.N.; Onyshczak, R.J.
1996-10-01
The FBI has formulated national standards for digitization and compression of gray-scale fingerprint images. The compression algorithm for the digitized images is based on adaptive uniform scalar quantization of a discrete wavelet transform subband decomposition, a technique referred to as the wavelet/scalar quantization method. The algorithm produces archival-quality images at compression ratios of around 15 to 1 and will allow the current database of paper fingerprint cards to be replaced by digital imagery. A compliance testing program is also being implemented to ensure high standards of image quality and interchangeability of data between different implementations. We will review the current status of the FBI standard, including the compliance testing process and the details of the first-generation encoder.
FBI compression standard for digitized fingerprint images
NASA Astrophysics Data System (ADS)
Brislawn, Christopher M.; Bradley, Jonathan N.; Onyshczak, Remigius J.; Hopper, Thomas
1996-11-01
The FBI has formulated national standards for digitization and compression of gray-scale fingerprint images. The compression algorithm for the digitized images is based on adaptive uniform scalar quantization of a discrete wavelet transform subband decomposition, a technique referred to as the wavelet/scalar quantization method. The algorithm produces archival-quality images at compression ratios of around 15 to 1 and will allow the current database of paper fingerprint cards to be replaced by digital imagery. A compliance testing program is also being implemented to ensure high standards of image quality and interchangeability of data between different implementations. We will review the current status of the FBI standard, including the compliance testing process and the details of the first-generation encoder.
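The wavelet/scalar quantization idea can be sketched compactly; the following is a minimal illustration, not the FBI WSQ specification (which fixes the filter bank, quantizer bin design, and Huffman coding). PyWavelets is assumed, and the uniform step size and bior4.4 filter are illustrative choices:

```python
import numpy as np
import pywt

def wsq_like_quantize(image, step=8.0, levels=4, wavelet="bior4.4"):
    """Decompose an image into wavelet subbands and uniformly scalar-quantize
    each one. Returns the quantized coefficients and a lossy reconstruction."""
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=levels)
    quantized = [np.round(coeffs[0] / step)]          # approximation subband
    for detail_level in coeffs[1:]:                   # (H, V, D) detail bands
        quantized.append(tuple(np.round(band / step) for band in detail_level))
    # Dequantize and invert the transform to inspect the quality loss.
    dequant = [quantized[0] * step]
    for detail_level in quantized[1:]:
        dequant.append(tuple(band * step for band in detail_level))
    return quantized, pywt.waverec2(dequant, wavelet)
```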
I-line stepper based overlay evaluation method for wafer bonding applications
NASA Astrophysics Data System (ADS)
Kulse, P.; Sasai, K.; Schulz, K.; Wietstruck, M.
2018-03-01
In recent decades, semiconductor technology has been driven by Moore's law, leading to high performance CMOS technologies with feature sizes of less than 10 nm [1]. It has been pointed out that not only scaling but also the integration of novel components and technology modules into CMOS/BiCMOS technologies is becoming more attractive to realize smart and miniaturized systems [2]. Driven by new applications in the areas of communication, health and automation, new components and technology modules such as BiCMOS embedded RF-MEMS, high-Q passives, Si-based microfluidics and InP-SiGe BiCMOS heterointegration have been demonstrated [3-6]. In contrast to standard VLSI processes fabricated on the front side of the silicon wafer, these new technology modules additionally require processing of the backside of the wafer and thus require an accurate alignment between the front and backside of the wafer. In previous work, an advanced back-to-front-side alignment technique and its implementation into IHP's 0.25/0.13 µm high performance SiGe:C BiCMOS backside process module were presented [7]. The developed technique enables high-resolution and accurate lithography on the backside of a BiCMOS wafer for additional backside processing. In addition to the aforementioned backside process technologies, new applications like Through-Silicon Vias (TSV) for interposers and advanced substrate technologies for 3D heterogeneous integration demand not only single wafer fabrication but also processing of wafer stacks provided by temporary and permanent wafer bonding [8-9]. In this work, the non-contact infrared alignment system of the Nikon® i-line Stepper NSR-SF150 is used for both alignment and overlay determination of bonded wafer stacks with embedded alignment marks, to achieve an accurate alignment between the different wafer sides. The embedded field image alignment (FIA) marks of the interface and the device wafer top layer are measured in a single measurement job. By taking the offsets between all the different FIAs into account, after correcting the wafer-rotation-induced FIA position errors, an overlay for the stacked wafers can be determined. The developed approach has been validated by a standard front side resist-in-resist experiment. After the successful validation of the developed technique, special wafer stacks with FIA alignment marks in the bonding interface were fabricated and exposed. The subsequent overlay calculation shows an overlay of less than 200 nm, which enables very accurate process conditions for highly scaled TSV integration and advanced substrate integration into IHP's 0.25/0.13 µm SiGe:C BiCMOS technology. The developed technique also allows significantly smaller alignment marks (i.e. standard FIA alignment marks) to be used. Furthermore, in cases of wafer-bow-related overlay tool problems, the presented method is used for the overlay evaluation of the last two metal layers from production wafers prepared in IHP's standard 0.25/0.13 µm SiGe:C BiCMOS technology. In conclusion, the exposure and measurement jobs can be done with the same tool, minimizing the back-to-front-side/interface top layer misalignment, which leads to a significant device performance improvement of backside/TSV-integrated components and technologies.
On application of image analysis and natural language processing for music search
NASA Astrophysics Data System (ADS)
Gwardys, Grzegorz
2013-10-01
In this paper, I investigate the problem of finding the most similar music tracks using techniques popular in Natural Language Processing, such as TF-IDF and LDA. I defined a document as a music track. Each music track is transformed into a spectrogram; thanks to that, I can use well-known techniques to get words from images. I used the SURF operator to detect characteristic points and a novel approach for their description. Standard k-means was used for clustering. Clustering here is identical to dictionary building, so afterwards I can transform spectrograms into text documents and apply TF-IDF and LDA. Finally, I can make a query in the obtained vector space. The research was done on 16 music tracks for training and 336 for testing, split into four categories: Hiphop, Jazz, Metal and Pop. Although the technique used is completely unsupervised, the results are satisfactory and encourage further research.
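A minimal sketch of the bag-of-visual-words pipeline described above: cluster local descriptors with k-means to form a dictionary, turn each track's spectrogram descriptors into a "text" of visual words, weight with TF-IDF, and query by cosine similarity. scikit-learn is assumed, and the descriptor extraction step (SURF on spectrograms) is left as an input because it depends on the image library at hand:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def descriptors_to_documents(per_track_descriptors, n_words=200):
    """per_track_descriptors: list of (n_i, d) arrays of local descriptors
    (e.g. SURF points from spectrograms). Returns one 'text' per track."""
    all_desc = np.vstack(per_track_descriptors)
    km = KMeans(n_clusters=n_words, n_init=10).fit(all_desc)  # visual dictionary
    return [" ".join(f"w{w}" for w in km.predict(d)) for d in per_track_descriptors]

def most_similar(docs, query_index):
    """Rank all tracks against the query track in TF-IDF space."""
    tfidf = TfidfVectorizer().fit_transform(docs)
    sims = cosine_similarity(tfidf[query_index], tfidf).ravel()
    return np.argsort(-sims)  # track indices, most similar first
```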
Ardila-Rey, Jorge Alfredo; Montaña, Johny; de Castro, Bruno Albuquerque; Schurch, Roger; Covolan Ulson, José Alfredo; Muhammad-Sukki, Firdaus; Bani, Nurul Aini
2018-03-29
Partial discharges (PDs) are one of the most important classes of ageing processes that occur within electrical insulation. PD detection is a standardized technique to qualify the state of the insulation in electric assets such as machines and power cables. Generally, the classical phase-resolved partial discharge (PRPD) patterns are used to perform the identification of the type of PD source when they are related to a specific degradation process and when the electrical noise level is low compared to the magnitudes of the PD signals. However, in practical applications such as measurements carried out in the field or in industrial environments, several PD sources and large noise signals are usually present simultaneously. In this study, three different inductive sensors have been used to evaluate and compare their performance in the detection and separation of multiple PD sources by applying the chromatic technique to each of the measured signals.
Image processing for x-ray inspection of pistachio nuts
NASA Astrophysics Data System (ADS)
Casasent, David P.
2001-03-01
A review is provided of image processing techniques that have been applied to the inspection of pistachio nuts using X-ray images. X-ray sensors provide non-destructive internal product detail not available from other sensors. The primary concern with these data is detecting the presence of worm infestations in nuts, since they have been linked to the presence of aflatoxin. We describe new techniques for segmentation, feature selection, selection of product categories (clusters), classifier design, etc. Specific novel results include: a new segmentation algorithm to produce images of isolated product items; preferable classifier operating points (the classifier with the best probability of correct recognition Pc is not necessarily the best); higher-order discrimination information present in standard features (thus, high-order features appear useful); and classifiers that use new cluster categories of samples to achieve improved performance. Results are presented for X-ray images of pistachio nuts; however, all techniques have use in other product inspection applications.
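The paper's own segmentation algorithm is not specified in the abstract; as a generic illustration of producing images of isolated product items, a threshold-plus-connected-components pass might look like this (all values synthetic):

```python
import numpy as np
from scipy import ndimage

# Toy "X-ray image": dark background with two bright blobs standing in for nuts.
img = np.zeros((64, 64))
img[10:25, 10:25] = 1.0
img[35:55, 30:50] = 0.8

mask = img > 0.5                      # separate product from background
labels, n = ndimage.label(mask)       # connected components = individual items
for sl in ndimage.find_objects(labels):
    print(img[sl].shape)              # each slice is one isolated item image
```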
Analysis of peptides using an integrated microchip HPLC-MS/MS system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirby, Brian J.; Chirica, Gabriela S.; Reichmuth, David S.
Hyphenated LC-MS techniques are quickly becoming the standard tool for proteomic analyses. For large homogeneous samples, bulk processing methods and capillary injection and separation techniques are suitable. However, for the analysis of small or heterogeneous samples, techniques that can manipulate picoliter samples without dilution are required, or samples will be lost or corrupted; further, static nanospray-type flow rates are required to maximize SNR. Microchip-level integration of sample injection with separation and mass spectrometry allows small-volume analytes to be processed on chip and immediately injected without dilution for analysis. An on-chip HPLC was fabricated using in situ polymerization of both fixed and mobile polymer monoliths. Integration of the chip with a nanospray MS emitter enables identification of peptides by the use of tandem MS. The chip is capable of analyzing very small sample volumes (< 200 pl) in short times (< 3 min).
Development of a New VLBI Data Analysis Software
NASA Technical Reports Server (NTRS)
Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.
2010-01-01
We present an overview of a new VLBI analysis software package under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in the future. On the other hand, it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.
Fruit Sorting Using Fuzzy Logic Techniques
NASA Astrophysics Data System (ADS)
Elamvazuthi, Irraivan; Sinnadurai, Rajendran; Aftab Ahmed Khan, Mohamed Khan; Vasant, Pandian
2009-08-01
The fruit and vegetable market is becoming highly selective, requiring suppliers to distribute goods according to very strict standards of quality and presentation. In recent years, a number of fruit sorting and grading systems have appeared to fulfill the needs of the fruit processing industry. However, most of them are overly complex and too costly for the small and medium scale industries (SMIs) in Malaysia. In order to address these shortcomings, a prototype machine was developed by integrating the fruit sorting, labeling and packing processes. To realise the prototype, many design issues were dealt with. Special attention was paid to the electronic weighing sub-system for measuring weight, and the opto-electronic sub-system for determining the height and width of the fruits. Specifically, this paper discusses the application of fuzzy logic techniques in the sorting process.
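The paper's rule base is not given in the abstract; a minimal sketch of fuzzy grading from weight and height, with invented membership functions and class boundaries, could look like this:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def grade(weight_g, height_mm):
    # Hypothetical membership sets for a small/medium/large fruit classifier;
    # min() implements the fuzzy AND of the weight and height conditions.
    small = min(tri(weight_g, 0, 60, 120), tri(height_mm, 0, 40, 70))
    large = min(tri(weight_g, 100, 180, 260), tri(height_mm, 60, 90, 120))
    medium = max(0.0, 1.0 - small - large)
    return max((small, "small"), (medium, "medium"), (large, "large"))[1]

print(grade(weight_g=150, height_mm=80))  # falls in the "large" rule region
```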
Wavelet Filter Banks for Super-Resolution SAR Imaging
NASA Technical Reports Server (NTRS)
Sheybani, Ehsan O.; Deshpande, Manohar; Memarsadeghi, Nargess
2011-01-01
This paper discusses innovative wavelet-based filter banks designed to enhance the analysis of super-resolution Synthetic Aperture Radar (SAR) images using parametric spectral methods and signal classification algorithms. SAR finds applications in many of NASA's earth science fields, such as deformation, ecosystem structure, and dynamics of ice, snow and cold land processes, and surface water and ocean topography. Traditionally, standard methods such as the Fast Fourier Transform (FFT) and Inverse Fast Fourier Transform (IFFT) have been used to extract images from SAR radar data. Due to the non-parametric features of these methods, their resolution limitations and their observation time dependence, the use of spectral estimation and signal pre- and post-processing techniques based on wavelets to process SAR radar data has been proposed. Multi-resolution wavelet transforms and advanced spectral estimation techniques have proven to offer efficient solutions to this problem.
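For orientation, a minimal multi-resolution wavelet filter-bank pass on a toy SAR-like image, using PyWavelets; the basis choice, level count and threshold are illustrative assumptions, not the paper's design:

```python
import numpy as np
import pywt

# Toy SAR-like magnitude image; real SAR data would be complex-valued.
img = np.random.default_rng(1).rayleigh(size=(128, 128))

# Two-level 2-D discrete wavelet decomposition (Daubechies-2 basis).
coeffs = pywt.wavedec2(img, wavelet="db2", level=2)
cA2, (cH2, cV2, cD2), (cH1, cV1, cD1) = coeffs

# Simple denoising: soft-threshold the detail sub-bands, then reconstruct.
thr = 0.5 * np.std(cD1)
den = [cA2] + [tuple(pywt.threshold(c, thr, mode="soft") for c in lvl)
               for lvl in coeffs[1:]]
print(pywt.waverec2(den, wavelet="db2").shape)
```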
Wongtriratanachai, Prasit; Pruksakorn, Dumnoensun; Pothacharoen, Peraphan; Nimkingratana, Puwapong; Pattamapaspong, Nuttaya; Phornphutkul, Chanakarn; Setsitthakun, Sasiwariya; Fongsatitkul, Ladda; Phrompaet, Sureeporn
2013-11-01
Autologous chondrocyte implantation (ACI) has become one of the standard procedures for the treatment of articular cartilage defects. The technique provides promising results; however, the procedure requires several steps carried out by multidisciplinary teams. Although the success of this procedure has been reported by Srinakharinwirot University since 2007, the application of ACI is still limited in Thailand due to the complexity of the processes and stringent quality control. This report presents the first case of cartilage defect treatment using first-generation ACI under Chiang Mai University's (CMU) own facility and Ethics Committee. This paper also reviews the biotechnology procedures, patient selection, and surgical and rehabilitation techniques. The success of this first case is an important milestone for the further development of the CMU Human Translational Research Laboratory in the near future.
INcreasing Security and Protection through Infrastructure REsilience: The INSPIRE Project
NASA Astrophysics Data System (ADS)
D'Antonio, Salvatore; Romano, Luigi; Khelil, Abdelmajid; Suri, Neeraj
The INSPIRE project aims at enhancing the European potential in the field of security by ensuring the protection of critical information infrastructures through (a) the identification of their vulnerabilities and (b) the development of innovative techniques for securing networked process control systems. To increase the resilience of such systems, INSPIRE will develop traffic engineering algorithms, diagnostic processes and self-reconfigurable architectures along with recovery techniques. Hence, the core idea of the INSPIRE project is to protect critical information infrastructures by appropriately configuring, managing, and securing the communication network which interconnects the distributed control systems. A working prototype will be implemented as a final demonstrator of selected scenarios. Controls/communication experts will support project partners in the validation and demonstration activities. INSPIRE will also contribute to the standardization process in order to foster multi-operator interoperability and coordinated strategies for securing lifeline systems.
ERIC Educational Resources Information Center
Gayles, Jochebed G.; Molenaar, Peter C. M.
2013-01-01
The fields of psychology and human development are experiencing a resurgence of scientific inquiries about phenomena that unfold at the level of the individual. This article addresses the issues of analyzing intraindividual psychological/developmental phenomena using standard analytical techniques for interindividual variation. When phenomena are…
Global Journal of Computer Science and Technology. Volume 1.2
ERIC Educational Resources Information Center
Dixit, R. K.
2009-01-01
Articles in this issue of "Global Journal of Computer Science and Technology" include: (1) Input Data Processing Techniques in Intrusion Detection Systems--Short Review (Suhair H. Amer and John A. Hamilton, Jr.); (2) Semantic Annotation of Stock Photography for CBIR Using MPEG-7 standards (R. Balasubramani and V. Kannan); (3) An Experimental Study…
Advanced telemetry systems for payloads. Technology needs, objectives and issues
NASA Technical Reports Server (NTRS)
1990-01-01
The current trends in advanced payload telemetry are the new developments in advanced modulation/coding, the applications of intelligent techniques, data distribution processing, and advanced signal processing methodologies. Concerted efforts will be required to design ultra-reliable man-rated software to cope with these applications. The intelligence embedded and distributed throughout various segments of the telemetry system will need to be overridden by an operator in case of life-threatening situations, making it a real-time integration issue. Suitable MIL standards on physical interfaces and protocols will be adopted to suit the payload telemetry system. New technologies and techniques will be developed for fast retrieval of mass data. Currently, these technology issues are being addressed to provide more efficient, reliable, and reconfigurable systems. There is a need, however, to change the operation culture. The current role of NASA as a leader in developing all the new innovative hardware should be altered to save both time and money. We should use all the available hardware/software developed by the industry and use the existing standards rather than inventing our own.
A Single-Block TRL Test Fixture for the Cryogenic Characterization of Planar Microwave Components
NASA Technical Reports Server (NTRS)
Mejia, M.; Creason, A. S.; Toncich, S. S.; Ebihara, B. T.; Miranda, F. A.
1996-01-01
The High-Temperature-Superconductivity (HTS) group of the RF Technology Branch, Space Electronics Division, is actively involved in the fabrication and cryogenic characterization of planar microwave components for space applications. This process requires fast, reliable, and accurate measurement techniques not readily available. A new calibration standard/test fixture that enhances the integrity and reliability of the component characterization process has been developed. The fixture consists of 50 omega thru, reflect, delay, and device-under-test gold lines etched onto a 254 micron (0.010 in) thick alumina substrate. The Thru-Reflect-Line (TRL) fixture was tested at room temperature using a 30 omega, 7.62 mm (300 mil) long gold line as a known standard. Good agreement between the experimental data and the data modelled using Sonnet's em(C) software was obtained for both the return (S(sub 11)) and insertion (S(sub 21)) losses. A gold two-pole bandpass filter with a 7.3 GHz center frequency was used as our Device Under Test (DUT), and the results compared with those obtained using a Short-Open-Load-Thru (SOLT) calibration technique.
Towards Application of NASA Standard for Models and Simulations in Aeronautical Design Process
NASA Astrophysics Data System (ADS)
Vincent, Luc; Dunyach, Jean-Claude; Huet, Sandrine; Pelissier, Guillaume; Merlet, Joseph
2012-08-01
Even powerful computational techniques like simulation are subject to limitations in their validity domain. Consequently, using simulation models requires caution to avoid making biased design decisions for new aeronautical products on the basis of inadequate simulation results. Thus the fidelity, accuracy and validity of simulation models shall be monitored in context all along the design phases to build confidence in achieving the goals of modelling and simulation. In the CRESCENDO project, we adapt the Credibility Assessment Scale method from the NASA standard for models and simulations, developed for the space programme, to aircraft design in order to assess the quality of simulations. The proposed eight quality assurance metrics aggregate information to indicate the levels of confidence in results. They are displayed in a management dashboard and can secure design trade-off decisions at programme milestones. The application of this technique is illustrated in an aircraft design context with a specific thermal Finite Elements Analysis. This use case shows how to judge the fitness-for-purpose of simulation as a virtual testing means and then green-light the continuation of the Simulation Lifecycle Management (SLM) process.
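For orientation, the NASA standard (NASA-STD-7009) scores eight credibility factors on a 0 to 4 scale. The factor names below follow the standard, while the scores and the conservative minimum-style roll-up are illustrative assumptions rather than the CRESCENDO project's actual aggregation:

```python
# Hypothetical scores (0-4) for the eight Credibility Assessment Scale
# factors of NASA-STD-7009; values are invented for illustration.
scores = {
    "verification": 3, "validation": 2, "input_pedigree": 3,
    "results_uncertainty": 2, "results_robustness": 1,
    "use_history": 2, "ms_management": 3, "people_qualifications": 4,
}

# Conservative roll-up: the weakest factor governs overall confidence.
worst = min(scores, key=scores.get)
print(f"credibility roll-up = {scores[worst]}/4 (limited by {worst})")
```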
The effect of various veneering techniques on the marginal fit of zirconia copings.
Torabi, Kianoosh; Vojdani, Mahroo; Giti, Rashin; Taghva, Masumeh; Pardis, Soheil
2015-06-01
This study aimed to evaluate the fit of zirconia ceramics before and after veneering, using 3 different veneering processes (layering, press-over, and CAD-on techniques). Thirty standardized zirconia CAD/CAM frameworks were constructed and divided into three groups of 10 each. The first group was veneered using the traditional layering technique. Press-over and CAD-on techniques were used to veneer the second and third groups. The marginal gap of the specimens was measured before and after the veneering process at 18 sites on the master die using a digital microscope. A paired t-test was used to evaluate mean marginal gap changes. One-way ANOVA and post hoc tests were also employed for comparison among the 3 groups (α=.05). The marginal gaps of all 3 groups increased after porcelain veneering. The mean marginal gap value after veneering in the layering group (63.06 µm) was higher than in the press-over (50.64 µm) and CAD-on (51.50 µm) groups (P<.001). The three veneering methods altered the marginal fit of zirconia copings. The conventional layering technique increased the marginal gap of the zirconia framework more than the pressing and CAD-on techniques. All-ceramic crowns made through the three different veneering methods revealed clinically acceptable marginal fit.
Gratacós, Jordi; Luelmo, Jesús; Rodríguez, Jesús; Notario, Jaume; Marco, Teresa Navío; de la Cueva, Pablo; Busquets, Manel Pujol; Font, Mercè García; Joven, Beatriz; Rivera, Raquel; Vega, Jose Luis Alvarez; Álvarez, Antonio Javier Chaves; Parera, Ricardo Sánchez; Carrascosa, Jose Carlos Ruiz; Martínez, Fernando José Rodríguez; Sánchez, José Pardo; Olmos, Carlos Feced; Pujol, Conrad; Galindez, Eva; Barrio, Silvia Pérez; Arana, Ana Urruticoechea; Hergueta, Mercedes; Coto, Pablo; Queiro, Rubén
2018-06-01
To define and give priority to standards of care and quality indicators of multidisciplinary care for patients with psoriatic arthritis (PsA). A systematic literature review on PsA standards of care and quality indicators was performed. An expert panel of rheumatologists and dermatologists who provide multidisciplinary care was established. In a consensus meeting, the expert group discussed and developed the standards of care and quality indicators and graded their priority and agreement, as well as the feasibility (only for quality indicators), following qualitative methodology and a Delphi process. Afterwards, these results were discussed with 2 focus groups, 1 with patients and another with health managers. A descriptive analysis is presented. We obtained 25 standards of care (9 of structure, 9 of process, 7 of results) and 24 quality indicators (2 of structure, 5 of process, 17 of results). The standards of care include relevant aspects of the multidisciplinary care of PsA patients, such as an appropriate physical infrastructure and technical equipment, access to nursing care, labs and imaging techniques, other health professionals and treatments, and the development of care plans. Regarding quality indicators, the definition of multidisciplinary care model objectives and referral criteria, the establishment of responsibilities and coordination among professionals, and the active evaluation of patients and data collection were given a high priority. Patients considered all of them important. This set of standards of care and quality indicators for the multidisciplinary care of patients with PsA should help improve the quality of care in these patients.
Low cost composite manufacturing utilizing intelligent pultrusion and resin transfer molding (IPRTM)
NASA Astrophysics Data System (ADS)
Bradley, James E.; Wysocki, Tadeusz S., Jr.
1993-02-01
This article describes an innovative method for the economical manufacturing of large, intricately-shaped tubular composite parts. Proprietary intelligent process control techniques are combined with standard pultrusion and RTM methodologies to provide high part throughput, performance, and quality while substantially reducing scrap, rework costs, and labor requirements. On-line process monitoring and control is achieved through a smart tooling interface consisting of modular zone tiles installed on part-specific die assemblies. Real-time archiving of process run parameters provides enhanced SPC and SQC capabilities.
Second Aerospace Environmental Technology Conference
NASA Technical Reports Server (NTRS)
Whitaker, A. F. (Editor); Clark-Ingram, M. (Editor)
1997-01-01
The mandated elimination of CFCs, Halons, TCA, and other ozone depleting chemicals and specific hazardous materials has required changes and new developments in aerospace materials and processes. The aerospace industry has been involved for several years in providing product substitutions, redesigning entire production processes, and developing new materials that minimize or eliminate damage to the environment. These activities emphasize replacement cleaning solvents and their application and verification, compliant coatings including corrosion protection systems and removal techniques, chemical propulsion effects on the environment, and the initiation of modifications to relevant processing and manufacturing specifications and standards.
Second Aerospace Environmental Technology Conference
NASA Technical Reports Server (NTRS)
Whitaker, A. F.; Clark-Ingram, M.; Hessler, S. L.
1997-01-01
The mandated elimination of CFCs, Halons, TCA, and other ozone depleting chemicals and specific hazardous materials has required changes and new developments in aerospace materials and processes. The aerospace industry has been involved for several years in providing product substitutions, redesigning entire production processes, and developing new materials that minimize or eliminate damage to the environment. These activities emphasize replacement cleaning solvents and their application verifications, compliant coatings including corrosion protection systems and removal techniques, chemical propulsion effects on the environment, and the initiation of modifications to relevant processing and manufacturing specifications and standards.
A portable detection instrument based on DSP for beef marbling
NASA Astrophysics Data System (ADS)
Zhou, Tong; Peng, Yankun
2014-05-01
Beef marbling is one of the most important indices for assessing beef quality. Marbling is graded by measuring the fat distribution density in the rib-eye region. However, in most beef slaughterhouses and businesses, quality grading depends on trained assessors using their visual senses or comparing the beef slice to the Chinese standard sample cards. Manual grading is not only labor intensive but also lacks objectivity and accuracy. To meet the needs of beef slaughterhouses and businesses, a beef marbling detection instrument was designed. The instrument employs Charge-Coupled Device (CCD) imaging, digital image processing, Digital Signal Processor (DSP) control and processing, and Liquid Crystal Display (LCD) screen display techniques. The TMS320DM642 digital signal processor from Texas Instruments (TI) forms the core, combining high-speed data processing capabilities with real-time processing features. All processes, such as image acquisition, data transmission, image processing algorithms and display, were implemented on this instrument for quick, efficient, and non-invasive detection of beef marbling. The structure of the system, its working principle, and its hardware and software are described in detail. The device is compact and easy to transport. The instrument can determine the grade of beef marbling reliably and correctly.
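The instrument's algorithms are not detailed in the abstract; one simple reading of "fat distribution density" is the fraction of bright (fat) pixels within the rib-eye region, sketched below with an invented threshold and a synthetic image:

```python
import numpy as np

def marbling_fraction(ribeye, fat_threshold=180):
    """Fraction of rib-eye pixels classified as fat (bright) by a fixed
    grey-level threshold; the threshold value here is only illustrative."""
    roi = np.asarray(ribeye)
    return float((roi >= fat_threshold).mean())

# Toy 8-bit rib-eye region: mostly lean (dark) with streaks of fat (bright).
rng = np.random.default_rng(2)
roi = rng.integers(40, 120, size=(100, 100))
roi[::10, :] = 220  # horizontal "marbling" streaks
print(f"fat fraction: {marbling_fraction(roi):.2%}")
```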
Cloud services on an astronomy data center
NASA Astrophysics Data System (ADS)
Solar, Mauricio; Araya, Mauricio; Farias, Humberto; Mardones, Diego; Wang, Zhong
2016-08-01
The research on computational methods for astronomy performed during the first phase of the Chilean Virtual Observatory (ChiVO) led to the development of functional prototypes, implementing state-of-the-art computational methods and proposing new algorithms and techniques. The ChiVO software architecture is based on the use of the IVOA protocols and standards. These protocols and standards are grouped in layers, with emphasis on the application and data layers, because their basic standards define the minimum operations that a VO should support. As an initial verification, the current implementation works with a 1 TB dataset derived from the reduction of ALMA cycle 0 data. This research focused mainly on spectroscopic data cubes from the ALMA cycle 0 public data. As the dataset grows, with ALMA cycle 1 public data arriving every month, data processing is becoming a major bottleneck for scientific research in astronomy. When designing the ChiVO, we focused on improving both computation and I/O costs, which led us to configure a data center with 424 high-speed cores at 2.6 GHz, 1 PB of storage (distributed across hard disk drives (HDD) and solid state drives (SSD)) and a high-speed InfiniBand interconnect. We are developing a cloud-based e-infrastructure for ChiVO services, in order to have a coherent framework for developing novel web services for on-line data processing in the ChiVO. We are currently parallelizing these new algorithms and techniques using HPC tools to speed up big data processing, and we will report our results in terms of data size, data distribution, number of cores and response time, in order to compare different processing and storage configurations.
Segmented Gamma Scanner for Small Containers of Uranium Processing Waste- 12295
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, K.E.; Smith, S.K.; Gailey, S.
2012-07-01
The Segmented Gamma Scanner (SGS) is commonly utilized in the assay of 55-gallon drums containing radioactive waste. Successfully deployed calibration methods include measurement of vertical line source standards in representative matrices and mathematical efficiency calibrations. The SGS technique can also be utilized to assay smaller containers, such as those used for criticality safety in uranium processing facilities. For such an application, a Can SGS System is aptly suited for the identification and quantification of radionuclides present in fuel processing wastes. Additionally, since the significant presence of uranium lumping can confound even a simple 'pass/fail' measurement regimen, the high-resolution gamma spectroscopy allows for the use of lump-detection techniques. In this application a lump correction is not required, but a differential peak approach is applied simply to identify the presence of U-235 lumps. The Can SGS is similar to current drum SGSs, but differs in the methodology for vertical segmentation. In the current drum SGS, the drum is placed on a rotator at a fixed vertical position while the detector, collimator, and transmission source are moved vertically to effect vertical segmentation. For the Can SGS, segmentation is more efficiently done by raising and lowering the rotator platform upon which the small container is positioned. This also reduces the complexity of the system mechanism. The application of the Can SGS introduces new challenges to traditional calibration and verification approaches. In this paper, we revisit SGS calibration methodology in the context of smaller waste containers, as applied to fuel processing wastes. Specifically, we discuss solutions to the challenges introduced by requiring source standards to fit within the confines of the small containers and by the unavailability of high-enriched uranium source standards. We also discuss the implementation of a previously used technique for identifying the presence of uranium lumping. The SGS technique is a well-accepted NDA technique applicable to containers of almost any size. It assumes a homogeneous matrix and activity distribution throughout the entire container, an assumption that is at odds with the detection of lumps within the assay item typical of uranium-processing waste. This fact, in addition to the difficulty of constructing small reference standards of uranium-bearing materials, required the methodology used for performing an efficiency curve calibration to be altered. The solution discussed in this paper is demonstrated to provide good results for both the segment activity and the full container activity when measuring heterogeneous source distributions. The application of this approach will need to be based on process knowledge of the assay items, as biases can be introduced if it is used with homogeneous, or nearly homogeneous, activity distributions. The bias will need to be quantified for each combination of container geometry and SGS scanning settings. One recommended approach for using the heterogeneous calibration discussed here is to assay each item using a homogeneous calibration initially. Review of the segment activities compared to the full container activity will signal the presence of a non-uniform activity distribution, as the segment activity will be grossly disproportionate to the full container activity. Upon seeing this result, the assay should either be reanalyzed or repeated using the heterogeneous calibration.
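A toy sketch of the segment-versus-total comparison described above; the count rates and efficiencies are invented, and only the 57.2% emission probability of the 185.7 keV U-235 gamma line is standard nuclear data:

```python
import numpy as np

# Toy segmented scan: net peak count rate (cps) per vertical segment and an
# assumed per-segment efficiency calibration (counts per emitted gamma).
rates = np.array([12.0, 85.0, 9.0, 4.0])        # grossly non-uniform activity
eff = np.array([1.2e-3, 1.1e-3, 1.2e-3, 1.3e-3])
gamma_yield = 0.572                              # 185.7 keV line of U-235

segment_activity = rates / (eff * gamma_yield)   # decays/s per segment
total = segment_activity.sum()
# One segment carrying most of the total flags a non-uniform distribution,
# the condition under which the heterogeneous calibration is recommended.
print(segment_activity / total)
```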
Allanite age-dating: Non-matrix-matched standardization in quadrupole LA-ICP-MS
NASA Astrophysics Data System (ADS)
Burn, M.; Lanari, P.; Pettke, T.; Engi, M.
2014-12-01
Allanite Th-U-Pb age-dating has recently been found to be powerful in unraveling the timing of geological processes such as the metamorphic dynamics in subduction zones and the crystallization velocity of magmas. However, inconsistencies among analytical techniques have raised doubts about the accuracy of allanite age data. Spot analysis techniques such as LA-ICP-MS are claimed to be crucially dependent on matrix-matched standards, the quality of which is variable. We present a new approach in LA-ICP-MS data reduction that allows non-matrix-matched standardization via well constrained zircon reference materials as primary standards. Our data were obtained using a GeoLas Pro 193 nm ArF excimer laser ablation system coupled to an ELAN DRC-e quadrupole ICP-MS. We use 32 μm and 24 μm spot sizes; laser operating conditions of 9 Hz repetition rate and 2.5 J/cm2 fluence have proven advantageous. The matrix-dependent downhole fractionation evolution is empirically determined by analyzing 208Pb/232Th and 206Pb/238U and applied prior to standardization. The new data reduction technique was tested on three magmatic allanite reference materials (SISSb, CAPb, TARA); within error these show the same downhole fractionation evolution for all allanite types and in different analytical sessions, provided measurement conditions remain the same. Although the downhole evolution of allanite and zircon differs significantly, a link between zircon and allanite matrix is established by assuming CAPb and TARA to be fixed at the corresponding reference ages. Our weighted mean 208Pb/232Th ages are 30.06 ± 0.22 Ma (2σ) for SISSb, 275.4 ± 1.3 Ma (2σ) for CAPb, and 409.9 ± 1.8 Ma (2σ) for TARA. The precision of single spot age data varies between 1.5 and 8% (2σ), depending on spot size and common lead concentrations. Quadrupole LA-ICP-MS allanite age-dating thus has uncertainties similar to those of other spot analysis techniques. The new data reduction technique is much less dependent on the quality and homogeneity of allanite standard reference materials. This method of correcting for matrix-dependent downhole fractionation evolution opens new possibilities in the field of LA-ICP-MS data acquisition, e.g. the use of a NIST standard glass to date all material types given a set of well constrained reference materials.
Standardized pivot shift test improves measurement accuracy.
Hoshino, Yuichi; Araujo, Paulo; Ahlden, Mattias; Moore, Charity G; Kuroda, Ryosuke; Zaffagnini, Stefano; Karlsson, Jon; Fu, Freddie H; Musahl, Volker
2012-04-01
The variability of the pivot shift test techniques greatly interferes with achieving a quantitative and generally comparable measurement. The purpose of this study was to compare the variation of the quantitative pivot shift measurements with different surgeons' preferred techniques to a standardized technique. The hypothesis was that standardizing the pivot shift test would improve consistency in the quantitative evaluation when compared with surgeon-specific techniques. A whole lower body cadaveric specimen was prepared to have a low-grade pivot shift on one side and high-grade pivot shift on the other side. Twelve expert surgeons performed the pivot shift test using (1) their preferred technique and (2) a standardized technique. Electromagnetic tracking was utilized to measure anterior tibial translation and acceleration of the reduction during the pivot shift test. The variation of the measurement was compared between the surgeons' preferred technique and the standardized technique. The anterior tibial translation during pivot shift test was similar between using surgeons' preferred technique (left 24.0 ± 4.3 mm; right 15.5 ± 3.8 mm) and using standardized technique (left 25.1 ± 3.2 mm; right 15.6 ± 4.0 mm; n.s.). However, the variation in acceleration was significantly smaller with the standardized technique (left 3.0 ± 1.3 mm/s(2); right 2.5 ± 0.7 mm/s(2)) compared with the surgeons' preferred technique (left 4.3 ± 3.3 mm/s(2); right 3.4 ± 2.3 mm/s(2); both P < 0.01). Standardizing the pivot shift test maneuver provides a more consistent quantitative evaluation and may be helpful in designing future multicenter clinical outcome trials. Diagnostic study, Level I.
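The abstract reports acceleration of the reduction derived from electromagnetic tracking; a minimal sketch of how such a value could be computed from a tracked translation signal, using an entirely synthetic waveform and sampling rate:

```python
import numpy as np

# Synthetic anterior tibial translation trace (mm) sampled at 100 Hz: a
# smooth rise followed by the sharper reduction event of a pivot shift.
t = np.arange(0, 2, 0.01)
translation = (20 / (1 + np.exp(-40 * (t - 0.8)))
               - 20 / (1 + np.exp(-80 * (t - 1.4))))

velocity = np.gradient(translation, t)      # mm/s
acceleration = np.gradient(velocity, t)     # mm/s^2
print(f"peak |acceleration of reduction| = {np.abs(acceleration).max():.0f} mm/s^2")
```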
Development and Validation of Instruments to Measure Learning of Expert-Like Thinking
NASA Astrophysics Data System (ADS)
Adams, Wendy K.; Wieman, Carl E.
2011-06-01
This paper describes the process for creating and validating an assessment test that measures the effectiveness of instruction by probing how well that instruction causes students in a class to think like experts about specific areas of science. The design principles and process are laid out and it is shown how these align with professional standards that have been established for educational and psychological testing and the elements of assessment called for in a recent National Research Council study on assessment. The importance of student interviews for creating and validating the test is emphasized, and the appropriate interview procedures are presented. The relevance and use of standard psychometric statistical tests are discussed. Additionally, techniques for effective test administration are presented.
Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A
2017-12-19
As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.
Multiphoton spectral analysis of benzo[a]pyrene uptake and metabolism in a rat liver cell line
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barhoumi, Rola, E-mail: rmouneimne@cvm.tamu.edu; Mouneimne, Youssef; Ramos, Ernesto
2011-05-15
Dynamic analysis of the uptake and metabolism of polycyclic aromatic hydrocarbons (PAHs) and their metabolites within live cells in real time has the potential to provide novel insights into genotoxic and non-genotoxic mechanisms of cellular injury caused by PAHs. The present work, combining the use of metabolite spectra generated from metabolite standards using multiphoton spectral analysis and an 'advanced unmixing process', identifies and quantifies the uptake, partitioning, and metabolite formation of one of the most important PAHs (benzo[a]pyrene, BaP) in viable cultured rat liver cells over a period of 24 h. The application of the advanced unmixing process resulted in the simultaneous identification of 8 metabolites in live cells at any single time. The accuracy of this unmixing process was verified using specific microsomal epoxide hydrolase inhibitors, glucuronidation and sulfation inhibitors, as well as several mixtures of metabolite standards. Our findings show that two-photon microscopy imaging surpasses conventional fluorescence imaging techniques and that the unmixing process is a mathematical technique that seems applicable to the analysis of BaP metabolites in living cells, especially for the analysis of changes in the ultimate carcinogen benzo[a]pyrene-r-7,t-8-dihydrodiol-t-9,10-epoxide. Therefore, the combination of two-photon acquisition with the unmixing process should provide important insights into the cellular and molecular mechanisms by which BaP and other PAHs alter cellular homeostasis.
Beam shaping as an enabler for new applications
NASA Astrophysics Data System (ADS)
Guertler, Yvonne; Kahmann, Max; Havrilla, David
2017-02-01
For many years, laser beam shaping has enabled users to achieve optimized process results as well as manage challenging applications. The latest advancements in industrial lasers and processing optics have taken this a step further, as users are able to adapt the beam shape to meet specific application requirements in a very flexible way. TRUMPF has gained wide-ranging experience in creating beam profiles at the workpiece for optimized material processing. This technology is based on the physical model of wave optics and can be used with ultra-short pulse lasers as well as multi-kW cw lasers. In principle, the beam shape can be adapted in all three spatial dimensions, which allows maximum flexibility. Besides adaptation of the intensity profile, even multi-spot geometries can be produced. This approach is very cost efficient, because a standard laser source and (in the case of cw lasers) a standard fiber can be used without any special modifications. Based on this innovative beam shaping technology, TRUMPF has developed new and optimized processes. Two of the most recent application developments using these techniques are the cutting of glass and synthetic sapphire with ultra-short pulse lasers and enhanced brazing of hot-dip zinc-coated steel for automotive applications. Both developments lead to more efficient and flexible production processes enabled by laser technology and open the door to new opportunities. They also indicate the potential of beam shaping techniques, since they can be applied to both single-mode laser sources (TOP Cleave) and multi-mode laser sources (brazing).
Tensile strength of various nylon PA6 specimen modes
NASA Astrophysics Data System (ADS)
Raz, Karel; Zahalka, Martin
2017-05-01
This article explores the influence of production technique on the strength of nylon parts. Identical specimens were manufactured by various techniques. The material of specimens was nylon PA6. 3D printing and injection molding were used, with various orientations of printed layers, and various orientations of specimens in the working space of the 3D printer. The variants are described in detail. A special mold was used for the injection molding process in order to make specimens with and without a weld line. The effect of this weld line was evaluated. All specimens were tested using the standard tensile test configuration. The strength was compared. It was found that the same plastic material has very different mechanical properties depending on the production process.
New approach to estimating variability in visual field data using an image processing technique.
Crabb, D P; Edgar, D F; Fitzke, F W; McNaught, A I; Wynn, H P
1995-01-01
AIMS--A new framework for evaluating pointwise sensitivity variation in computerised visual field data is demonstrated. METHODS--A measure of local spatial variability (LSV) is generated using an image processing technique. Fifty five eyes from a sample of normal and glaucomatous subjects, examined on the Humphrey field analyser (HFA), were used to illustrate the method. RESULTS--Significant correlations between LSV and conventional estimates--namely, HFA pattern standard deviation and short term fluctuation--were found. CONCLUSION--LSV does not depend on normals' reference data or repeated threshold determinations, thus potentially reducing test time. Also, the illustrated pointwise maps of LSV could provide a method for identifying areas of fluctuation commonly found in early glaucomatous field loss. PMID:7703196
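The paper's exact LSV operator is not given in the abstract; one plausible image-processing measure of pointwise spatial variability is a local standard deviation filter over the sensitivity grid, sketched here with synthetic data:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_sd(field, size=3):
    """Local standard deviation in a size x size window: one plausible
    image-processing measure of pointwise spatial variability."""
    f = field.astype(float)
    mean = uniform_filter(f, size=size)
    mean_sq = uniform_filter(f ** 2, size=size)
    return np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))

# Toy visual-field sensitivity grid (dB) with one noisy, fluctuating patch.
rng = np.random.default_rng(3)
field = np.full((8, 8), 30.0)
field[4:7, 4:7] += rng.normal(0, 5, size=(3, 3))
print(local_sd(field).round(1))  # high values mark the fluctuating area
```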
Kropf, Stefan; Chalopin, Claire; Lindner, Dirk; Denecke, Kerstin
2017-06-28
Access to patient data within a hospital or between hospitals is still problematic, since a variety of information systems is in use, applying different vendor-specific terminologies and underlying knowledge models. Moreover, the development of electronic health record systems (EHRSs) is time and resource consuming. Thus, there is a substantial need for a development strategy for standardized EHRSs. We apply a reuse-oriented process model and demonstrate its feasibility and realization on a practical medical use case: an EHRS holding all relevant data arising in the context of treatment of tumors of the sella region. In this paper, we describe the development process and our practical experiences. Requirements for the development of the EHRS were collected through interviews with a neurosurgeon and patient data analysis. For the modelling of patient data, we selected openEHR as the standard and exploited the software tools provided by the openEHR Foundation. The patient information model forms the core of the development process, which comprises the EHR generation and the implementation of an EHRS architecture. Moreover, a reuse-oriented process model from the business domain was adapted to the development of the EHRS. The reuse-oriented process model provides a suitable abstraction of both the modeling and the development of an EHR-centralized EHRS. The information modeling process resulted in 18 archetypes that were aggregated in a template and formed the basis of the model-driven development. The EHRs and the EHRS were developed using openEHR and W3C standards, tightly supported by well-established XML techniques. The GUI of the final EHRS integrates and visualizes information from various examinations, medical reports, findings and laboratory test results. We conclude that the development of a standardized overarching EHR and an EHRS is feasible using openEHR and W3C standards, enabling a high degree of semantic interoperability. The standardized representation visualizes data and can in this way support the decision process of clinicians.
Nikendei, C.; Ganschow, P.; Groener, J. B.; Huwendiek, S.; Köchel, A.; Köhl-Hackert, N.; Pjontek, R.; Rodrian, J.; Scheibe, F.; Stadler, A.-K.; Steiner, T.; Stiepak, J.; Tabatabai, J.; Utz, A.; Kadmon, M.
2016-01-01
The competent physical examination of patients and the safe and professional implementation of clinical procedures constitute essential components of medical practice in nearly all areas of medicine. The central objective of the projects “Heidelberg standard examination” and “Heidelberg standard procedures”, which were initiated by students, was to establish uniform interdisciplinary standards for physical examination and clinical procedures, and to distribute them in coordination with all clinical disciplines at the Heidelberg University Hospital. The presented project report illuminates the background of the initiative and its methodological implementation. Moreover, it describes the multimedia documentation in the form of pocketbooks and a multimedia internet-based platform, as well as the integration into the curriculum. The project presentation aims to provide orientation and action guidelines to facilitate similar processes in other faculties. PMID:27579354
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Anders T., E-mail: andehans@rm.dk; Lukacova, Slavka; Lassen-Ramshad, Yasmin
2015-01-01
When a standard conformal x-ray technique for craniospinal irradiation is used, it is a challenge to achieve satisfactory dose coverage of the target, including the area of the cribriform plate, while sparing organs at risk. We present a new intensity-modulated radiation therapy (IMRT), noncoplanar technique for delivering irradiation to the cranial part and compare it with 3 other techniques and previously published results. A total of 13 patients who had previously received craniospinal irradiation with the standard conformal x-ray technique were reviewed. New treatment plans were generated for each patient using the noncoplanar IMRT-based technique, a coplanar IMRT-based technique, and a coplanar volumetric-modulated arc therapy (VMAT) technique. Dosimetry data for all patients were compared with the corresponding data from the conventional treatment plans. The new noncoplanar IMRT technique substantially reduced the mean dose to organs at risk compared with the standard radiation technique. The 2 other coplanar techniques also reduced the mean dose to some of the critical organs. However, this reduction was not as substantial as the reduction obtained by the noncoplanar technique. Furthermore, compared with the standard technique, the IMRT techniques reduced the total calculated radiation dose delivered to the normal tissue, whereas the VMAT technique increased this dose. Additionally, the coverage of the target was significantly improved by the noncoplanar IMRT technique. Compared with the standard technique, the coplanar IMRT and the VMAT techniques did not improve the coverage of the target significantly. All the new planning techniques increased the number of monitor units (MU) used, the noncoplanar IMRT technique by 99%, the coplanar IMRT technique by 122%, and the VMAT technique by 26%, causing concern for leakage radiation. The noncoplanar IMRT technique covered the target better and decreased doses to organs at risk compared with the other techniques. All the new techniques increased the number of MU compared with the standard technique.
Basu, Anirban; Kumar, Gopinatha Suresh
2014-05-30
The interaction of the synthetic azo dye and food colorant carmoisine with human and bovine serum albumins was studied by microcalorimetric techniques. A complete thermodynamic profile of the interaction was obtained from isothermal titration calorimetry studies. The equilibrium constant of the complexation process was of the order of 10(6)M(-1) and the binding stoichiometry was found to be 1:1 with both serum albumins. The binding was driven by a negative standard molar enthalpy and a positive standard molar entropy contribution. The binding affinity was lower at higher salt concentrations in both cases, but the binding was dominated by mostly non-electrostatic forces at all salt concentrations. The polyelectrolytic forces contributed only 5-8% of the total standard molar Gibbs energy change. The standard molar enthalpy change increased, whereas the standard molar entropic contribution decreased, with rising temperature, but they compensated each other to keep the standard molar Gibbs energy change almost invariant. The negative standard molar heat capacity values suggested the involvement of a significant hydrophobic contribution in the complexation process. Besides, an enthalpy-entropy compensation phenomenon was also observed in both systems. The thermal stability of the serum proteins was found to be remarkably enhanced on binding to carmoisine. Copyright © 2014 Elsevier B.V. All rights reserved.
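A short worked example of the thermodynamic relations underlying such an analysis, using the reported order of magnitude for K and standard constants (the temperature choice is an assumption):

```python
import math

R = 8.314    # gas constant, J mol^-1 K^-1
T = 298.15   # assumed temperature, K
K = 1e6      # equilibrium constant of the order reported in the abstract

# Standard molar Gibbs energy change from the binding constant.
dG = -R * T * math.log(K)   # J/mol; about -34 kJ/mol
print(f"dG = {dG / 1000:.1f} kJ/mol")

# With dG = dH - T*dS, a negative dH and a positive dS both favour binding;
# opposite shifts in dH and T*dS with temperature (compensation) leave dG
# nearly invariant, as the abstract reports.
```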
MEMS scanning micromirror for optical coherence tomography.
Strathman, Matthew; Liu, Yunbo; Keeler, Ethan G; Song, Mingli; Baran, Utku; Xi, Jiefeng; Sun, Ming-Ting; Wang, Ruikang; Li, Xingde; Lin, Lih Y
2015-01-01
This paper describes an endoscopic-inspired imaging system employing a micro-electromechanical system (MEMS) micromirror scanner to achieve beam scanning for optical coherence tomography (OCT) imaging. Miniaturization of a scanning mirror using MEMS technology can allow a fully functional imaging probe to be contained in a package sufficiently small for utilization in a working channel of a standard gastroesophageal endoscope. This work employs advanced image processing techniques to enhance the images acquired using the MEMS scanner to correct non-idealities in mirror performance. The experimental results demonstrate the effectiveness of the proposed technique.
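The paper's correction algorithms are not detailed in the abstract; one generic way to compensate a known nonlinear mirror trajectory is to resample each acquired scan line onto a uniform lateral grid, sketched here with synthetic positions and a toy image line:

```python
import numpy as np

# Assumed sinusoidal MEMS scan trajectory: the mirror's actual (nonlinear)
# positions versus the uniform grid the image should be sampled on.
n = 256
ideal = np.linspace(-1, 1, n)
actual = np.sin(ideal * np.pi / 2)

line = np.cos(8 * np.pi * ideal)       # toy image line vs. true position
acquired = np.cos(8 * np.pi * actual)  # what the distorted scan records
corrected = np.interp(ideal, actual, acquired)  # resample onto uniform grid
print(f"max residual after correction: {np.abs(corrected - line).max():.3f}")
```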
MEMS scanning micromirror for optical coherence tomography
Strathman, Matthew; Liu, Yunbo; Keeler, Ethan G.; Song, Mingli; Baran, Utku; Xi, Jiefeng; Sun, Ming-Ting; Wang, Ruikang; Li, Xingde; Lin, Lih Y.
2014-01-01
This paper describes an endoscopic-inspired imaging system employing a micro-electromechanical system (MEMS) micromirror scanner to achieve beam scanning for optical coherence tomography (OCT) imaging. Miniaturization of a scanning mirror using MEMS technology can allow a fully functional imaging probe to be contained in a package sufficiently small for utilization in a working channel of a standard gastroesophageal endoscope. This work employs advanced image processing techniques to enhance the images acquired using the MEMS scanner to correct non-idealities in mirror performance. The experimental results demonstrate the effectiveness of the proposed technique. PMID:25657887
Baker, Jay B; Maskell, Kevin F; Matlock, Aaron G; Walsh, Ryan M; Skinner, Carl G
2015-07-01
We compared intubating with a preloaded bougie (PB) against standard bougie technique in terms of success rates, time to successful intubation and provider preference on a cadaveric airway model. In this prospective, crossover study, healthcare providers intubated a cadaver using the PB technique and the standard bougie technique. Participants were randomly assigned to start with either technique. Following standardized training and practice, procedural success and time for each technique was recorded for each participant. Subsequently, participants were asked to rate their perceived ease of intubation on a visual analogue scale of 1 to 10 (1=difficult and 10=easy) and to select which technique they preferred. 47 participants with variable experience intubating were enrolled at an emergency medicine intern airway course. The success rate of all groups for both techniques was equal (95.7%). The range of times to completion for the standard bougie technique was 16.0-70.2 seconds, with a mean time of 29.7 seconds. The range of times to completion for the PB technique was 15.7-110.9 seconds, with a mean time of 29.4 seconds. There was a non-significant difference of 0.3 seconds (95% confidence interval -2.8 to 3.4 seconds) between the two techniques. Participants rated the relative ease of intubation as 7.3/10 for the standard technique and 7.6/10 for the preloaded technique (p=0.53, 95% confidence interval of the difference -0.97 to 0.50). Thirty of 47 participants subjectively preferred the PB technique (p=0.039). There was no significant difference in success or time to intubation between standard bougie and PB techniques. The majority of participants in this study preferred the PB technique. Until a clear and clinically significant difference is found between these techniques, emergency airway operators should feel confident in using the technique with which they are most comfortable.
Toth, Thomas L; Lee, Malinda S; Bendikson, Kristin A; Reindollar, Richard H
2017-04-01
To better understand practice patterns and opportunities for standardization of ET. Cross-sectional survey. Not applicable. Not applicable. An anonymous 82-question survey was emailed to the medical directors of 286 Society for Assisted Reproductive Technology member IVF practices. A follow-up survey composed of three questions specific to ET technique was emailed to the same medical directors. Descriptive statistics of the results were compiled. The survey assessed policies, protocols, restrictions, and specifics pertinent to the technique of ET. There were 117 (41%) responses; 32% practice in academic settings and 68% in private practice. Responders were experienced clinicians, half of whom had performed <10 procedures during training. Ninety-eight percent of practices allowed all practitioners to perform ET; half did not follow a standardized ET technique. Multiple steps in the ET process were identified as "highly conserved;" others demonstrated discordance. ET technique is divided among [1] trial transfer followed immediately with ET (40%); [2] afterload transfer (30%); and [3] direct transfer without prior trial or afterload (27%). Embryos are discharged in the upper (66%) and middle thirds (29%) of the endometrial cavity and not closer than 1-1.5 cm from fundus (87%). Details of each step were reported and allowed the development of a "common" practice ET procedure. ET training and practices vary widely. Improved training and standardization based on outcomes data and best practices are warranted. A common practice procedure is suggested for validation by a systematic literature review. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
Bringing Standardized Processes in Atom-Probe Tomography: I Establishing Standardized Terminology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Ian M; Danoix, F; Forbes, Richard
2011-01-01
Defining standardized methods requires careful consideration of the entire field and its applications. The International Field Emission Society (IFES) has elected a Standards Committee, whose task is to determine the steps needed to establish atom-probe tomography as an accepted metrology technique. Specific tasks include developing protocols or standards for: terminology and nomenclature; metrology and instrumentation, including specifications for reference materials; test methodologies; modeling and simulations; and science-based health, safety, and environmental practices. The Committee is currently working on defining terminology related to atom-probe tomography, with the goal of including the terms in a document published by the International Organization for Standardization (ISO). Many terms also used in other disciplines have already been defined and will be discussed for adoption in the context of atom-probe tomography.
Hollow Core Bragg Waveguide Design and Fabrication for Enhanced Raman Spectroscopy
NASA Astrophysics Data System (ADS)
Ramanan, Janahan
Raman spectroscopy is a widely used technique to unambiguously ascertain the chemical composition of a sample. The caveat with this technique is its extremely weak optical cross-section, making it difficult to measure Raman signal with standard optical setups. In this thesis, a novel hollow core Bragg Reflection Waveguide was designed to simultaneously increase the generation and collection of Raman scattered photons. A robust fabrication process of this waveguide was developed employing flip-chip bonding methods to securely seal the hollow core channel. The waveguide air-core propagation loss was experimentally measured to be 0.17 dB/cm, and the Raman sensitivity limit was measured to be 3 mmol/L for glycerol solution. The waveguide was also shown to enhance Raman modes of standard household aerosols that could not be seen with other devices.
Salazar, Jaime; Müller, Rainer H; Möschwitzer, Jan P
2013-07-16
Standard particle size reduction techniques such as high pressure homogenization or wet bead milling are frequently used in the production of nanosuspensions. The need for micronized starting material and long process times are their evident disadvantages. Combinative particle size reduction technologies have been developed to overcome the drawbacks of the standard techniques. The H 42 combinative technology consists of a drug pre-treatment by means of spray-drying followed by standard high pressure homogenization. In the present paper, spray-drying process parameters influencing the diminution effectiveness, such as drug and surfactant concentration, were systematically analyzed. Subsequently, the untreated and pre-treated drug powders were homogenized for 20 cycles at 1500 bar. For untreated, micronized glibenclamide, the particle size analysis revealed a mean particle size of 772 nm and volume-based size distribution values of 2.686 μm (d50%) and 14.423 μm (d90%). The use of pre-treated material (10:1 glibenclamide/docusate sodium salt ratio spray-dried as ethanolic solution) resulted in a mean particle size of 236 nm and volume-based size distribution values of 0.131 μm (d50%) and 0.285 μm (d90%). These results were markedly improved compared to the standard process. The nanosuspensions were further transferred into tablet formulations. Wet granulation, freeze-drying and spray-drying were investigated as downstream methods to produce dry intermediates. Regarding the dissolution rate, the rank order of the downstream processes was as follows: Spray-drying>freeze-drying>wet granulation. The best drug release (90% within 10 min) was obtained for tablets produced with spray-dried nanosuspension containing 2% mannitol as matrix former. In comparison, the tablets processed with micronized glibenclamide showed a drug release of only 26% after 10 min. The H 42 combinative technology could be successfully applied in the production of small drug nanocrystals. A nanosuspension transfer to tablets that maintained the fast dissolution properties of the drug nanocrystals was successfully achieved. Copyright © 2013 Elsevier B.V. All rights reserved.
Development of an Uncertainty Model for the National Transonic Facility
NASA Technical Reports Server (NTRS)
Walter, Joel A.; Lawrence, William R.; Elder, David W.; Treece, Michael D.
2010-01-01
This paper introduces an uncertainty model being developed for the National Transonic Facility (NTF). The model uses a Monte Carlo technique to propagate standard uncertainties of measured values through the NTF data reduction equations to calculate the combined uncertainties of the key aerodynamic force and moment coefficients and freestream properties. The uncertainty propagation approach to assessing data variability is compared with ongoing data quality assessment activities at the NTF, notably check standard testing using statistical process control (SPC) techniques. It is shown that the two approaches are complementary and both are necessary tools for data quality assessment and improvement activities. The SPC approach is the final arbiter of variability in a facility. Its result encompasses variation due to people, processes, test equipment, and test article. The uncertainty propagation approach is limited mainly to the data reduction process. However, it is useful because it helps to assess the causes of variability seen in the data and consequently provides a basis for improvement. For example, it is shown that Mach number random uncertainty is dominated by static pressure variation over most of the dynamic pressure range tested. However, the random uncertainty in the drag coefficient is generally dominated by axial and normal force uncertainty with much less contribution from freestream conditions.
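A toy version of the propagation idea, substituting the textbook isentropic Mach relation for the full NTF data reduction equations; the pressure values and their standard uncertainties are invented:

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000
gamma = 1.4  # ratio of specific heats for air

# Measured total and static pressure (Pa) with assumed standard uncertainties,
# sampled as normal distributions for the Monte Carlo propagation.
p0 = rng.normal(120_000, 120, N)   # total pressure
p = rng.normal(100_000, 100, N)    # static pressure

# Isentropic data-reduction equation for Mach number.
M = np.sqrt(2 / (gamma - 1) * ((p0 / p) ** ((gamma - 1) / gamma) - 1))

print(f"M = {M.mean():.4f} +/- {M.std(ddof=1):.4f} (1-sigma, Monte Carlo)")
```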
Goffin, N J; Higginson, R L; Tyrer, J R
2016-12-01
In laser cladding, the potential benefits of wire feeding are considerable. Typical problems with the use of powder, such as gas entrapment, sub-100% material density and low deposition rate are all avoided with the use of wire. However, the use of a powder-based source material is the industry standard, with wire-based deposition generally regarded as an academic curiosity. This is because, although wire-based methods have been shown to be capable of superior quality results, the wire-based process is more difficult to control. In this work, the potential for wire shaping techniques, combined with existing holographic optical element knowledge, is investigated in order to further improve the processing characteristics. Experiments with pre-placed wire showed the ability of shaped wire to provide uniformity of wire melting compared with standard round wire, giving reduced power density requirements and superior control of clad track dilution. When feeding with flat wire, the resulting clad tracks showed a greater level of quality consistency and became less sensitive to alterations in processing conditions. In addition, a 22% increase in deposition rate was achieved. Stacking of multiple layers demonstrated the ability to create fully dense, three-dimensional structures, with directional metallurgical grain growth and uniform chemical structure.
Chatake, Toshiyuki; Fujiwara, Satoru
2016-01-01
A difference in the neutron scattering length between hydrogen and deuterium leads to a high density contrast in neutron Fourier maps. In this study, a technique for determining the deuterium/hydrogen (D/H) contrast map in neutron macromolecular crystallography is developed and evaluated using ribonuclease A. The contrast map between the D2O-solvent and H2O-solvent crystals is calculated in real space, rather than in reciprocal space as performed in previous neutron D/H contrast crystallography. The present technique can thus utilize all of the amplitudes of the neutron structure factors for both D2O-solvent and H2O-solvent crystals. The neutron D/H contrast maps clearly demonstrate the powerful detectability of H/D exchange in proteins. In fact, alternative protonation states and alternative conformations of hydroxyl groups are observed at medium resolution (1.8 Å). Moreover, water molecules can be categorized into three types according to their tendency towards rotational disorder. These results directly indicate improvement in the neutron crystal structure analysis. This technique is suitable for incorporation into the standard structure-determination process used in neutron protein crystallography; consequently, more precise and efficient determination of the D-atom positions is possible using a combination of this D/H contrast technique and standard neutron structure-determination protocols.
Cheng, Cynthia; Lee, Chadd W; Daskalakis, Constantine
2015-10-27
Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data (1). This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique.
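As a rough illustration of what a semi-automated counting algorithm of this kind involves (not the authors' published method), the sketch below thresholds a grayscale nailfold image and counts connected dark regions; all parameter values are hypothetical.

```python
import numpy as np
from scipy import ndimage

def count_capillaries(image, threshold=0.5, min_area=20):
    """Toy capillary counter: threshold a grayscale image (values in [0, 1],
    capillaries dark) and count connected blobs above a minimum size.
    Threshold and size cutoff are placeholders, not published values."""
    mask = image < threshold                     # capillaries are darker than skin
    labels, n = ndimage.label(mask)              # connected-component labeling
    sizes = ndimage.sum_labels(mask, labels, index=np.arange(1, n + 1))
    return int(np.sum(sizes >= min_area))        # discard speckle noise

# Synthetic demo image: uniform background with three dark blobs.
img = np.ones((64, 64))
for cx in (10, 30, 50):
    img[20:40, cx:cx + 5] = 0.2
print(count_capillaries(img))  # -> 3
```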
Ultrasonic Imaging Techniques for Breast Cancer Detection
NASA Astrophysics Data System (ADS)
Goulding, N. R.; Marquez, J. D.; Prewett, E. M.; Claytor, T. N.; Nadler, B. R.
2008-02-01
Improving the resolution and specificity of current ultrasonic imaging technology is needed to enhance its relevance to breast cancer detection. A novel ultrasonic imaging reconstruction method is described that exploits classical straight-ray migration. This novel method improves signal processing for better image resolution and uses novel staging hardware options using a pulse-echo approach. A breast phantom with various inclusions is imaged using the classical migration method and is compared to standard computed tomography (CT) scans. These innovative ultrasonic methods incorporate ultrasound data acquisition, beam profile characterization, and image reconstruction. For an ultrasonic frequency of 2.25 MHz, imaged inclusions of approximately 1 cm are resolved and identified. Better resolution is expected with minor modifications. Improved image quality and resolution enables earlier detection and more accurate diagnoses of tumors thus reducing the number of biopsies performed, increasing treatment options, and lowering remission percentages. Using these new techniques the inclusions in the phantom are resolved and compared to the results of standard methods. Refinement of this application using other imaging techniques such as time-reversal mirrors (TRM), synthetic aperture focusing technique (SAFT), decomposition of the time reversal operator (DORT), and factorization methods is also discussed.
Comparison of denture tooth movement between CAD-CAM and conventional fabrication techniques.
Goodacre, Brian J; Goodacre, Charles J; Baba, Nadim Z; Kattadiyil, Mathew T
2018-01-01
Data comparing the denture tooth movement of computer-aided design and computer-aided manufacturing (CAD-CAM) and conventional denture processing techniques are lacking. The purpose of this in vitro study was to compare the denture tooth movement of pack-and-press, fluid resin, injection, CAD-CAM-bonded, and CAD-CAM monolithic techniques for fabricating dentures to determine which process produces the most accurate and reproducible prosthesis. A total of 50 dentures were evaluated, 10 for each of the 5 groups. A master denture was fabricated and milled from prepolymerized poly(methyl methacrylate). For the conventional processing techniques (pack-and-press, fluid resin, and injection) a polyvinyl siloxane putty mold of the master denture was made in which denture teeth were placed and molten wax injected. The cameo surface of each wax-festooned denture was laser scanned, resulting in a standard tessellation language (STL) format file. The CAD-CAM dentures included 2 subgroups: CAD-CAM-bonded teeth in which the denture teeth were bonded into the milled denture base and CAD-CAM monolithic teeth in which the denture teeth were milled as part of the denture base. After all specimens had been fabricated, they were hydrated for 24 hours, and the cameo surface laser scanned. The preprocessing and postprocessing scan files of each denture were superimposed using surface-matching software. Measurements were made at 64 locations, allowing evaluation of denture tooth movement in the buccal, lingual, mesial-distal, and occlusal directions. Median and interquartile range values were used to assess accuracy and reproducibility. Levene and Kruskal-Wallis analyses of variance were used to evaluate differences between processing techniques (α=.05). The CAD-CAM monolithic technique was the most accurate, followed by fluid resin, CAD-CAM-bonded, pack-and-press, and injection. The CAD-CAM monolithic technique was the most reproducible, followed by pack-and-press, CAD-CAM-bonded, injection, and fluid resin. Techniques involving compression during processing showed increased positive occlusal tooth movement compared with techniques not involving compression. CAD-CAM monolithic dentures produced the best combination of accuracy and reproducibility of the tested techniques. The results from this study demonstrate that varying amounts of tooth movement can be expected depending on the processing technique. However, the clinical significance of these differences is unknown. Copyright © 2017 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Deffenbaugh, Paul Issac
3D printing has garnered immense attention from many fields including in-office rapid prototyping of mechanical parts, outer-space satellite replication, garage functional firearm manufacture, and NASA rocket engine component fabrication. 3D printing allows increased design flexibility in the fabrication of electronics, microwave circuits and wireless antennas and has reached a level of maturity which allows functional parts to be printed. Much more work is necessary in order to perfect the processes of 3D printed electronics especially in the area of automation. Chapter 1 shows several finished prototypes of 3D printed electronics as well as newly developed techniques in fabrication. Little is known about the RF and microwave properties and applications of the standard materials which have been developed for 3D printing. Measurement of a wide variety of materials over a broad spectrum of frequencies up to 10 GHz using a variety of well-established measurement methods is performed throughout chapter 2. Several types of high frequency RF transmission lines are fabricated and valuable model-matched data is gathered and provided in chapter 3 for future designers' use. Of particular note is a fully 3D printed stripline which was automatically fabricated in one process on one machine. Some core advantages of 3D printing RF/microwave components include rapid manufacturing of complex, dimensionally sensitive circuits (such as antennas and filters which are often iteratively tuned) and the ability to create new devices that cannot be made using standard fabrication techniques. Chapter 4 describes an exemplary fully 3D printed curved inverted-F antenna.
NASA Astrophysics Data System (ADS)
Vanvyve, E.; Magontier, P.; Vandenberghe, F. C.; Delle Monache, L.; Dickinson, K.
2012-12-01
Wind energy is amongst the fastest growing sources of renewable energy in the U.S. and could supply up to 20% of U.S. power production by 2030. An accurate and reliable wind resource assessment for prospective wind farm sites is a challenging task, yet is crucial for evaluating the long-term profitability and feasibility of a potential development. We have developed an accurate and computationally efficient wind resource assessment technique for prospective wind farm sites, which incorporates innovative statistical techniques and the new NASA Earth science dataset MERRA. This technique produces a wind resource estimate that is more accurate than that obtained by the wind energy industry's standard technique, while providing a reliable quantification of its uncertainty. The focus now is on evaluating the socio-economic value of this new technique relative to the industry's standard technique. Would it yield lower financing costs? Could it result in lower electricity prices? Are there further down-the-line positive consequences, e.g., job creation, time saved, reduced greenhouse gas emissions? Ultimately, we expect our results will inform efforts to refine and disseminate the new technique to support the development of the U.S. renewable energy infrastructure. In order to address the above questions, we are carrying out a cost-benefit analysis based on the net present worth of the technique. We will describe this approach, including the cash-flow process of wind farm financing and how the wind resource assessment factors in, and will present current results for various hypothetical candidate wind farm sites.
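The cost-benefit analysis rests on standard net-present-worth arithmetic. A minimal sketch, with entirely hypothetical cash flows since the abstract reports no figures:

```python
def net_present_worth(cash_flows, discount_rate):
    """Discount a series of yearly cash flows (year 0 first) to present value."""
    return sum(cf / (1.0 + discount_rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical wind farm: capital cost up front, then net revenue each year.
flows = [-50e6] + [6e6] * 20      # USD, illustrative values only
print(f"NPV at 8%: {net_present_worth(flows, 0.08) / 1e6:.1f} M$")
```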
NASA Astrophysics Data System (ADS)
Miller, C. J.; Yoder, T. S.
2010-06-01
Explosive trace detection equipment has been deployed to airports for more than a decade. During this time, the need for standardized procedures and calibrated trace amounts for ensuring that the systems are operating properly and detecting the correct explosive has been apparent but a standard representative of a fingerprint has been elusive. Standards are also necessary to evaluate instrumentation in the laboratories during development and prior to deployment to determine sample throughput, probability of detection, false positive/negative rates, ease of use by operator, mechanical and/or software problems that may be encountered, and other pertinent parameters that would result in the equipment being unusable during field operations. Since many laboratories do not have access to nor are allowed to handle explosives, the equipment is tested using techniques aimed at simulating the actual explosives fingerprint. This laboratory study focused on examining the similarities and differences in three different surface contamination techniques that are used to performance test explosive trace detection equipment in an attempt to determine how effective the techniques are at replicating actual field samples and to offer scenarios where each contamination technique is applicable. The three techniques used were dry transfer deposition of standard solutions using the Transportation Security Laboratory’s (TSL) patented dry transfer techniques (US patent 6470730), direct deposition of explosive standards onto substrates, and fingerprinting of actual explosives onto substrates. RDX was deposited on the surface of one of five substrates using one of the three different deposition techniques. The process was repeated for each substrate type using each contamination technique. The substrate types used were: 50% cotton/50% polyester as found in T-shirts, 100% cotton with a smooth surface such as that found in a cotton dress shirt, 100% cotton on a rough surface such as that found on canvas or denim, suede leather such as might be found on jackets, purses, or shoes, and painted metal obtained from a car hood at a junk yard. The samples were not pre-cleaned prior to testing and contained sizing agents, and in the case of the metal, oil and dirt. The substrates were photographed using a Zeiss Discover V12 stereoscope with Axiocam ICc1 3 megapixel digital camera to determine the difference in the crystalline structure and surface contamination in an attempt to determine differences and similarities associated with current contamination deposition techniques. Some samples were analyzed using scanning electron microscopy (SEM) and some were extracted and analyzed with high performance liquid chromatography (HPLC) or gas chromatography with an electron capture detector (GC-ECD) to quantify the data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Danielson, Gary R.; Augustenborg, Elsa C.; Beck, Andrew E.
2010-10-29
The IAEA is challenged with limited availability of human resources for inspection and data analysis while proliferation threats increase. PNNL has a variety of IT solutions and techniques (at varying levels of maturity and development) that take raw data closer to useful knowledge, thereby assisting with and standardizing the analytical processes. This paper highlights some PNNL tools and techniques which are applicable to the international safeguards community, including:
• Intelligent in-situ triage of data prior to reliable transmission to an analysis center, resulting in the transmission of smaller and more relevant data sets
• Capture of expert knowledge in re-usable search strings tailored to specific mission outcomes
• Image based searching fused with text based searching
• Use of gaming to discover unexpected proliferation scenarios
• Process modeling (e.g. Physical Model) as the basis for an information integration portal, which links to data storage locations along with analyst annotations, categorizations, geographic data, search strings and visualization outputs
Human Language Technology: Opportunities and Challenges
2005-01-01
because of the connections to and reliance on signal processing. Audio diarization critically includes indexing of speakers [12], since speaker ... to reduce inter-speaker variability in training. Standard techniques include vocal-tract length normalization, adaptation of acoustic models using ... maximum likelihood linear regression (MLLR), and speaker-adaptive training based on MLLR. The acoustic models are mixtures of Gaussians, typically with
Unit Planning Grids for Visual Arts--Grade 9-12 Advanced.
ERIC Educational Resources Information Center
Delaware State Dept. of Education, Dover.
This planning grid for teaching visual arts (advanced) in grades 9-12 in Delaware outlines the following six standards for students to complete: (1) students will select and use form, media, techniques, and processes to create works of art and communicate meaning; (2) students will create ways to use visual, spatial, and temporal concepts in…
Standardization of pitch-range settings in voice acoustic analysis.
Vogel, Adam P; Maruff, Paul; Snyder, Peter J; Mundt, James C
2009-05-01
Voice acoustic analysis is typically a labor-intensive, time-consuming process that requires the application of idiosyncratic parameters tailored to individual aspects of the speech signal. Such processes limit the efficiency and utility of voice analysis in clinical practice as well as in applied research and development. In the present study, we analyzed 1,120 voice files, using standard techniques (case-by-case hand analysis), taking roughly 10 work weeks of personnel time to complete. The results were compared with the analytic output of several automated analysis scripts that made use of preset pitch-range parameters. After pitch windows were selected to appropriately account for sex differences, the automated analysis scripts reduced processing time of the 1,120 speech samples to less than 2.5 h and produced results comparable to those obtained with hand analysis. However, caution should be exercised when applying the suggested preset values to pathological voice populations.
Peak-picking fundamental period estimation for hearing prostheses.
Howard, D M
1989-09-01
A real-time peak-picking fundamental period estimation device is described which is used in advanced hearing prostheses for the totally and profoundly deafened. The operation of the peak picker is compared with three well-established fundamental frequency estimation techniques: the electrolaryngograph, which is used as a "standard"; hardware implementations of the cepstral technique; and the Gold/Rabiner parallel processing algorithm. These comparisons illustrate and highlight some of the important advantages and disadvantages that characterize the operation of these techniques. The special requirements of the hearing prostheses are discussed with respect to the operation of each device, and the choice of the peak picker is found to be felicitous in this application.
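For readers unfamiliar with the class of algorithm, a peak-picking period estimator can be sketched in a few lines: pick prominent waveform peaks and treat inter-peak intervals as period estimates. This is a generic illustration, not the hardware device described above; the voicing limits and peak-height threshold are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def peak_pick_f0(x, fs, f0_min=50.0, f0_max=500.0):
    """Estimate fundamental frequency by picking waveform peaks.
    Assumes x is a voiced, roughly periodic frame; limits are placeholders."""
    min_dist = int(fs / f0_max)                      # refuse peaks closer than 1/f0_max
    peaks, _ = find_peaks(x, distance=min_dist,
                          height=0.3 * np.max(x))    # keep only prominent positive peaks
    if len(peaks) < 2:
        return None                                  # unvoiced / no estimate
    periods = np.diff(peaks) / fs                    # inter-peak intervals, seconds
    periods = periods[(periods > 1 / f0_max) & (periods < 1 / f0_min)]
    return 1.0 / np.median(periods) if len(periods) else None

fs = 8000
t = np.arange(0, 0.05, 1 / fs)
x = np.sin(2 * np.pi * 120 * t) + 0.3 * np.sin(2 * np.pi * 240 * t)
print(f"{peak_pick_f0(x, fs):.1f} Hz")               # ~120 Hz
```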
SU-E-I-27: Establishing Target Exposure Index Values for Computed Radiography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, N; Tchou, P; Belcher, K
2014-06-01
Purpose: To develop a standard set of target exposure index (TEI) values to be applied to Agfa Computed Radiography (CR) readers in accordance with International Electrotechnical Commission (IEC) standard 62494-1 (ed. 1.0). Methods: A large data cohort was collected from six USAF Medical Treatment Facilities that exclusively use Agfa CR readers. Dose monitoring statistics were collected from each reader. The data were analyzed based on anatomic region, view, and processing speed class. The Agfa-specific exposure metric, logarithmic mean (LGM), was converted to exposure index (EI) for each data set. The optimum TEI value was determined by minimizing the number of studies that fell outside the acceptable deviation index (DI) range of ±2 for phototimed techniques or a range of ±3 for fixed techniques. An anthropomorphic radiographic phantom was used to corroborate the TEI recommendations. Images were acquired of several anatomic regions and views using standard techniques. The images were then evaluated by two radiologists as either acceptable or unacceptable. The acceptable image with the lowest exposure and EI value was compared to the recommended TEI values using a passing DI range. Results: Target EI values were determined for a comprehensive list of anatomic regions and views. Conclusion: Target EI values must be established on each CR unit in order to provide a positive feedback system for the technologist. This system will serve as a mechanism to prevent under- or overexposure of patients. The TEI recommendations are a first attempt at a large-scale process improvement with the goal of setting reasonable and standardized TEI values. The implementation and effectiveness of the recommended TEI values should be monitored and adjustments made as necessary.
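The deviation index used for the acceptance windows is defined in IEC 62494-1 as DI = 10·log10(EI/EI_T). A minimal sketch of the flagging logic described above (the Agfa LGM-to-EI conversion is vendor-specific and omitted here):

```python
import math

def deviation_index(ei, target_ei):
    """IEC 62494-1 deviation index: DI = 10 * log10(EI / EI_T)."""
    return 10.0 * math.log10(ei / target_ei)

def flag(ei, target_ei, phototimed=True):
    """Flag an exposure whose DI falls outside the acceptance window used in
    the study: +/-2 for phototimed techniques, +/-3 for fixed techniques."""
    di = deviation_index(ei, target_ei)
    limit = 2.0 if phototimed else 3.0
    return di, abs(di) > limit

for ei in (250, 450):                    # hypothetical EI readings, TEI = 400
    di, out = flag(ei, 400)
    print(f"EI={ei}: DI={di:+.2f}, out of range: {out}")
```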
Methods of measurement for semiconductor materials, process control, and devices
NASA Technical Reports Server (NTRS)
Bullis, W. M. (Editor)
1972-01-01
Activities directed toward the development of methods of measurement for semiconductor materials, process control, and devices are described. Topics investigated include: measurements of transistor delay time; application of the infrared response technique to the study of radiation-damaged, lithium-drifted silicon detectors; and identification of a condition that minimizes wire flexure and reduces the failure rate of wire bonds in transistors and integrated circuits under slow thermal cycling conditions. Supplementary data concerning staff, standards committee activities, technical services, and publications are included as appendixes.
1988-11-01
system, using graphic techniques which enable users, analysts, and designers to get a clear and common picture of the system and how its parts fit ... boxes into hierarchies suitable for computer implementation. ... Structured Design uses tools, especially graphic ones, to render systems readily ... Keywords: LSA, processes, data flows, data stores, external entities, overall systems design process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Synovec, R.E.; Johnson, E.L.; Bahowick, T.J.
1990-08-01
This paper describes a new technique for data analysis in chromatography, based on taking the point-by-point ratio of sequential chromatograms that have been baseline corrected. This ratio chromatogram provides a robust means for the identification and the quantitation of analytes. In addition, the appearance of an interferent is made highly visible, even when it coelutes with desired analytes. For quantitative analysis, the region of the ratio chromatogram corresponding to the pure elution of an analyte is identified and is used to calculate a ratio value equal to the ratio of concentrations of the analyte in sequential injections. For the ratio value calculation, a variance-weighted average is used, which compensates for the varying signal-to-noise ratio. This ratio value, or equivalently the percent change in concentration, is the basis of a chromatographic standard addition method and an algorithm to monitor analyte concentration in a process stream. In the case of overlapped peaks, a spiking procedure is used to calculate both the original concentration of an analyte and its signal contribution to the original chromatogram. Thus, quantitation and curve resolution may be performed simultaneously, without peak modeling or curve fitting. These concepts are demonstrated by using data from ion chromatography, but the technique should be applicable to all chromatographic techniques.
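The core computation is compact: take the point-by-point ratio of two baseline-corrected chromatograms over the analyte's pure-elution window and form a variance-weighted average. The sketch below uses synthetic Gaussian peaks; the signal-to-noise weighting is a plausible reading of the abstract, not the authors' exact formula.

```python
import numpy as np

t = np.linspace(0, 10, 1000)
peak = np.exp(-0.5 * ((t - 5.0) / 0.3) ** 2)          # analyte peak shape

rng = np.random.default_rng(1)
noise = 0.01
c1 = 1.0 * peak + rng.normal(0, noise, t.size)        # injection 1 (baseline-corrected)
c2 = 1.4 * peak + rng.normal(0, noise, t.size)        # injection 2: 40% more analyte

window = (t > 4.2) & (t < 5.8)                        # pure-elution region of the analyte
ratio = c2[window] / c1[window]                       # point-by-point ratio chromatogram

# Weight each ratio point by its approximate precision: the variance of a quotient
# is dominated by the noise on the smaller signal, so weight ~ (signal/noise)^2.
w = (c1[window] / noise) ** 2
ratio_value = np.sum(w * ratio) / np.sum(w)
print(f"estimated concentration ratio: {ratio_value:.3f}")   # ~1.4
```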
Testing single point incremental forming moulds for rotomoulding operations
NASA Astrophysics Data System (ADS)
Afonso, Daniel; de Sousa, Ricardo Alves; Torcato, Ricardo
2017-10-01
Low-pressure polymer processes such as thermoforming or rotational moulding use much simpler moulds than high-pressure processes such as injection moulding. However, despite the low forces involved in the process, mould manufacturing for these applications is still a very material-, energy- and time-consuming operation. Particularly in rotational moulding there is no standard for mould manufacture and very different techniques are applicable. The goal of this research is to develop and validate a method for manufacturing plastically formed sheet metal moulds by single point incremental forming (SPIF) for rotomoulding and rotocasting operations. A Stewart-platform-based SPIF machine allows the forming of thick metal sheets, granting the required structural stiffness for the mould surface while keeping a short manufacturing lead time and low thermal inertia. The experimental work involves the proposal of a hollow part, design and fabrication of a sheet metal mould using dieless incremental forming techniques, and testing its operation in the production of prototype parts.
Vectorization with SIMD extensions speeds up reconstruction in electron tomography.
Agulleiro, J I; Garzón, E M; García, I; Fernández, J J
2010-06-01
Electron tomography allows structural studies of cellular structures at molecular detail. Large 3D reconstructions are needed to meet the resolution requirements. The processing time to compute these large volumes may be considerable, so high-performance computing techniques have traditionally been used. This work presents a vector approach to tomographic reconstruction that relies on the exploitation of the SIMD extensions available in modern processors in combination with other single-processor optimization techniques. This approach succeeds in producing full-resolution tomograms with a substantial reduction in processing time, as evaluated with the most common reconstruction algorithms, namely WBP and SIRT. The main advantage stems from the fact that this approach runs on standard computers without the need for specialized hardware, which facilitates the development, use and management of programs. Future trends in processor design open excellent opportunities for vector processing with processors' SIMD extensions in the field of 3D electron microscopy.
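The gain comes from data-level parallelism: the same arithmetic applied across many voxels at once in a CPU's SIMD lanes. The idea can be conveyed with whole-array NumPy expressions standing in for SIMD registers; the toy backprojection below is a conceptual sketch, not the authors' optimized WBP/SIRT code.

```python
import numpy as np

def backproject(sinogram, angles, size):
    """Toy parallel-beam backprojection onto a size x size grid.
    Whole-array (vectorized) arithmetic stands in for per-voxel loops."""
    xs = np.arange(size) - size / 2.0
    X, Y = np.meshgrid(xs, xs)                       # voxel coordinates, all at once
    recon = np.zeros((size, size))
    for proj, theta in zip(sinogram, angles):
        # Detector coordinate of every voxel for this view, computed in one shot.
        s = X * np.cos(theta) + Y * np.sin(theta) + size / 2.0
        idx = np.clip(s.astype(int), 0, proj.size - 1)
        recon += proj[idx]                           # gather + accumulate, vectorized
    return recon / len(angles)

angles = np.linspace(0, np.pi, 60, endpoint=False)
sino = np.ones((60, 64))                             # flat dummy sinogram
print(backproject(sino, angles, 64).shape)           # (64, 64)
```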
Higher resolution satellite remote sensing and the impact on image mapping
Watkins, Allen H.; Thormodsgard, June M.
1987-01-01
Recent advances in spatial, spectral, and temporal resolution of civil land remote sensing satellite data are presenting new opportunities for image mapping applications. The U.S. Geological Survey's experimental satellite image mapping program is evolving toward larger scale image map products with increased information content as a result of improved image processing techniques and increased resolution. Thematic mapper data are being used to produce experimental image maps at 1:100,000 scale that meet established U.S. and European map accuracy standards. Availability of high quality, cloud-free, 30-meter ground resolution multispectral data from the Landsat thematic mapper sensor, along with 10-meter ground resolution panchromatic and 20-meter ground resolution multispectral data from the recently launched French SPOT satellite, present new cartographic and image processing challenges. The need to fully exploit these higher resolution data increases the complexity of processing the images into large-scale image maps. The removal of radiometric artifacts and noise prior to geometric correction can be accomplished by using a variety of image processing filters and transforms. Sensor modeling and image restoration techniques allow maximum retention of spatial and radiometric information. An optimum combination of spectral information and spatial resolution can be obtained by merging different sensor types. These processing techniques are discussed and examples are presented.
Ringuet, Stephanie; Sassano, Lara; Johnson, Zackary I
2011-02-01
A sensitive, accurate and rapid analysis of major nutrients in aquatic systems is essential for monitoring and maintaining healthy aquatic environments. In particular, monitoring ammonium (NH4+) concentrations is necessary for maintenance of many fish stocks, while accurate monitoring and regulation of ammonium, orthophosphate (PO4(3-)), silicate (Si(OH)4) and nitrate (NO3-) concentrations are required for regulating algae production. Monitoring of wastewater streams is also required for many aquaculture, municipal and industrial wastewater facilities to comply with local, state or federal water quality effluent regulations. Traditional methods for quantifying these nutrient concentrations often require laborious techniques or expensive specialized equipment, making these analyses difficult. Here we present four alternative microcolorimetric assays that are based on a standard 96-well microplate format and microplate reader that simplify the quantification of each of these nutrients. Each method uses small sample volumes (200 µL), has a detection limit ≤ 1 µM in freshwater and ≤ 2 µM in saltwater, precision of at least 8% and compares favorably with standard analytical procedures. Routine use of these techniques in the laboratory and at an aquaculture facility to monitor nutrient concentrations associated with microalgae growth demonstrates that they are rapid, accurate and highly reproducible among different users. These techniques offer an alternative to standard nutrient analyses and because they are based on the standard 96-well format, they significantly decrease the cost and time of processing while maintaining high precision and sensitivity.
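Assays of this kind reduce to a standard-curve regression: fit absorbance against known standards, then invert the fit for unknown wells. A minimal sketch with made-up absorbance values, assuming Beer-Lambert linearity:

```python
import numpy as np

# Known standards (uM) and their measured absorbances -- illustrative values.
conc = np.array([0.0, 2.0, 5.0, 10.0, 20.0])
absorbance = np.array([0.012, 0.055, 0.131, 0.262, 0.518])

slope, intercept = np.polyfit(conc, absorbance, 1)   # linear standard curve

def to_concentration(a):
    """Invert the standard curve for an unknown well's absorbance."""
    return (a - intercept) / slope

for a in (0.080, 0.300):
    print(f"A={a:.3f} -> {to_concentration(a):.2f} uM")
```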
Methods for assessing the quality of mammalian embryos: How far we are from the gold standard?
Rocha, José C; Passalia, Felipe; Matos, Felipe D; Maserati, Marc P; Alves, Mayra F; Almeida, Tamie G de; Cardoso, Bruna L; Basso, Andrea C; Nogueira, Marcelo F G
2016-08-01
Morphological embryo classification is of great importance for many laboratory techniques, from basic research to the ones applied to assisted reproductive technology. However, the standard classification method for both human and cattle embryos is based on quality parameters that reflect the overall morphological quality of the embryo in cattle, or the quality of the individual embryonic structures, more relevant in human embryo classification. This assessment method is biased by the subjectivity of the evaluator, and even though several guidelines exist to standardize the classification, it is not a method capable of giving reliable and trustworthy results. The latest approaches for the improvement of quality assessment include the use of data from cellular metabolism, a new morphological grading system, development kinetics and cleavage symmetry, embryo cell biopsy followed by pre-implantation genetic diagnosis, zona pellucida birefringence, ion release by the embryo cells and so forth. Nowadays there exists a great need for evaluation methods that are practical and non-invasive while being accurate and objective. A method along these lines would be of great importance to embryo evaluation by embryologists, clinicians and other professionals who work with assisted reproductive technology. Several techniques show promising results in this sense, one being the use of digital images of the embryo as a basis for feature extraction and classification by means of artificial intelligence techniques (such as genetic algorithms and artificial neural networks). This process has the potential to become an accurate and objective standard for embryo quality assessment.
Characterization of Metal Powders Used for Additive Manufacturing.
Slotwinski, J A; Garboczi, E J; Stutzman, P E; Ferraris, C F; Watson, S S; Peltz, M A
2014-01-01
Additive manufacturing (AM) techniques can produce complex, high-value metal parts, with potential applications as critical parts, such as those found in aerospace components. The production of AM parts with consistent and predictable properties requires input materials (e.g., metal powders) with known and repeatable characteristics, which in turn requires standardized measurement methods for powder properties. First, based on our previous work, we assess the applicability of current standardized methods for powder characterization for metal AM powders. Then we present the results of systematic studies carried out on two different powder materials used for additive manufacturing: stainless steel and cobalt-chrome. The characterization of these powders is important in NIST efforts to develop appropriate measurements and standards for additive materials and to document the property of powders used in a NIST-led additive manufacturing material round robin. An extensive array of characterization techniques was applied to these two powders, in both virgin and recycled states. The physical techniques included laser diffraction particle size analysis, X-ray computed tomography for size and shape analysis, and optical and scanning electron microscopy. Techniques sensitive to structure and chemistry, including X-ray diffraction, energy dispersive analytical X-ray analysis using the X-rays generated during scanning electron microscopy, and X-Ray photoelectron spectroscopy were also employed. The results of these analyses show how virgin powder changes after being exposed to and recycled from one or more Direct Metal Laser Sintering (DMLS) additive manufacturing build cycles. In addition, these findings can give insight into the actual additive manufacturing process.
Utilizing commercial microwave for rapid and effective immunostaining.
Owens, Katrina; Park, Ji H; Kristian, Tibor
2013-09-30
There is an accumulating literature demonstrating the application of microwaves across a wide spectrum of histological techniques. Although exposure to microwaves for short periods results in substantial acceleration of all procedures, this technique is still not widely adopted. In part, this may be due to concerns over solutions that will avoid induction of thermal damage to the tissue when using a standard microwave. Here, we offer a cooling setup that can be used with conventional microwave ovens. We utilized dry ice for effective cooling during microwave irradiation of tissue samples. To prevent overheating, the cups containing tissue were surrounded with powdered dry ice during exposure to microwaves. Since the dry ice does not touch the walls of the cups, freezing is prevented. Overheating is avoided by alternating the microwave treatment with 1-2 min periods when the cups are cooled outside of the microwave oven. This technique was used on mouse brain sections that were immunostained with microglia-specific CD68 antiserum and astrocyte-labeling GFAP antibody. Both standard and microwave-assisted immunolabeling gave comparable results, visualizing cells with fine processes and low background signal. Short incubation time in the microwave requires high concentrations of antibody for tissue immunostaining. We show that by prolonging the microwaving procedure we were able to reduce the antibody concentration to the levels used in the standard immunostaining protocol. In summary, our technique makes it possible to use a conventional microwave for rapid and effective immunolabeling, resulting in a reduced amount of antibody required for satisfactory immunostaining. Published by Elsevier B.V.
Enhancement of Satellite Image Compression Using a Hybrid (DWT-DCT) Algorithm
NASA Astrophysics Data System (ADS)
Shihab, Halah Saadoon; Shafie, Suhaidi; Ramli, Abdul Rahman; Ahmad, Fauzan
2017-12-01
Discrete Cosine Transform (DCT) and Discrete Wavelet Transform (DWT) image compression techniques have been utilized in most of the earth observation satellites launched during the last few decades. However, these techniques have some issues that should be addressed. The DWT method has proven to be more efficient than DCT for several reasons. Nevertheless, the DCT can be exploited to improve high-resolution satellite image compression when combined with the DWT technique. Hence, a proposed hybrid (DWT-DCT) method was developed and implemented in the current work, simulating an on-board image compression system for a small remote sensing satellite, with the aim of achieving a higher compression ratio to decrease the on-board data storage and the downlink bandwidth, while avoiding further complex levels of DWT. This method also succeeded in maintaining the reconstructed satellite image quality by replacing the standard forward DWT thresholding and quantization processes with an alternative process that employed the zero-padding technique, which also helped to reduce the processing time of DWT compression. The DCT, DWT and the proposed hybrid methods were implemented individually, for comparison, on three LANDSAT 8 images, using the MATLAB software package. A comparison was also made between the proposed method and three other previously published hybrid methods. The evaluation of all the objective and subjective results indicated the feasibility of using the proposed hybrid (DWT-DCT) method to enhance the image compression process on board satellites.
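One plausible arrangement of such a hybrid cascade (a single DWT level, DCT on the approximation subband, then discarding small coefficients) can be sketched with PyWavelets and SciPy. The ordering, wavelet choice, and retention fraction here are assumptions, not the published pipeline:

```python
import numpy as np
import pywt
from scipy.fft import dctn, idctn

def hybrid_compress(img, keep=0.10):
    """Sketch of a DWT+DCT hybrid: one 2-D DWT level, DCT on the approximation
    band, then keep only the largest `keep` fraction of DCT coefficients."""
    LL, (LH, HL, HH) = pywt.dwt2(img, "haar")        # single-level 2-D DWT
    C = dctn(LL, norm="ortho")                       # DCT of the low-pass band
    thresh = np.quantile(np.abs(C), 1.0 - keep)
    C[np.abs(C) < thresh] = 0.0                      # discard small coefficients
    LL_rec = idctn(C, norm="ortho")
    return pywt.idwt2((LL_rec, (LH, HL, HH)), "haar")

img = np.random.default_rng(2).random((64, 64))      # stand-in for a satellite image
rec = hybrid_compress(img)
print(f"RMSE: {np.sqrt(np.mean((img - rec) ** 2)):.4f}")
```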
Experimental quantification of the true efficiency of carbon nanotube thin-film thermophones.
Bouman, Troy M; Barnard, Andrew R; Asgarisabet, Mahsa
2016-03-01
Carbon nanotube thermophones can create acoustic waves from 1 Hz to 100 kHz. The thermoacoustic effect that allows for this non-vibrating sound source is naturally inefficient. Prior efforts have not explored their true efficiency (i.e., the ratio of the total acoustic power to the electrical input power). All previous works have used the ratio of sound pressure to input electrical power. A method for true power efficiency measurement is shown using a fully anechoic technique. True efficiency data are presented for three different drive signal processing techniques: standard alternating current (AC), direct current added to alternating current (DCAC), and amplitude modulation of an alternating current (AMAC) signal. These signal processing techniques are needed to limit the frequency-doubling non-linear effects inherent to carbon nanotube thermophones. Each type of processing affects the true efficiency differently. Using a 72 W (rms) input signal, the measured efficiency ranges were 4.3 × 10⁻⁶-319 × 10⁻⁶, 1.7 × 10⁻⁶-308 × 10⁻⁶, and 1.2 × 10⁻⁶-228 × 10⁻⁶ % for AC, DCAC, and AMAC, respectively. These data were measured in the frequency range of 100 Hz to 10 kHz. In addition, the effects of these processing techniques relative to sound quality are presented in terms of total harmonic distortion.
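True efficiency as defined above is total radiated acoustic power divided by electrical input power; in a free field the acoustic power follows from integrating p_rms²/(ρc) over a measurement surface. A minimal sketch with hypothetical microphone readings:

```python
import numpy as np

RHO_C = 413.0  # characteristic impedance of air, Pa*s/m (approx., 20 C)

def true_efficiency(p_rms, patch_areas, electrical_power):
    """Total acoustic power from free-field intensity I = p_rms^2 / (rho*c),
    summed over patches covering a measurement sphere, over input power."""
    acoustic_power = np.sum(p_rms ** 2 / RHO_C * patch_areas)
    return acoustic_power / electrical_power

# Hypothetical: 20 equal patches on a 1 m radius sphere, 0.1 Pa rms everywhere.
area = 4.0 * np.pi * 1.0 ** 2 / 20.0
p = np.full(20, 0.1)                             # Pa rms per microphone position
print(f"{true_efficiency(p, area, 72.0):.2e}")   # fraction; multiply by 100 for percent
```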
Autologous Fat Grafting to the Breast Using REVOLVE System to Reduce Clinical Costs.
Brzezienski, Mark A; Jarrell, John A
2016-09-01
With the increasing popularity of fat grafting over the past decade, the techniques for harvest, processing and preparation, and transfer of the fat cells have evolved to improve efficiency and consistency. The REVOLVE System is a fat processing device used in autologous fat grafting which eliminates much of the specialized equipment as well as the labor-intensive and time-consuming efforts of the original Coleman technique of fat processing. This retrospective study evaluates the economics of fat grafting, comparing traditional Coleman processing to the REVOLVE System. From June 2013 through December 2013, 88 fat grafting cases by a single surgeon were reviewed. Timed procedures using either the REVOLVE System or Coleman technique were extracted from the group. Data including fat grafting procedure time, harvested volume, harvest and recipient sites, and concurrent procedures were gathered. Cost and utilization assessments were performed comparing the economics between the groups using standard values of operating room costs provided by the study hospital. Thirty-seven patients with timed procedures were identified: 13 were Coleman technique patients and 24 were REVOLVE System patients. The average rate of fat transfer was 1.77 mL/minute for the Coleman technique and 4.69 mL/minute for the REVOLVE System, a statistically significant difference (P < 0.0001) between the 2 groups. Cost analysis comparing the REVOLVE System and Coleman techniques demonstrates a dramatic divergence in the price per mL of transferred fat at 75 mL when using the previously calculated rates for each group. This single surgeon's experience with the REVOLVE System for fat processing establishes economic support for its use in specific high-volume fat grafting cases. Cost analysis comparing the REVOLVE System and Coleman techniques suggests that in cases of planned fat transfer of 75 mL or more, using the REVOLVE System for fat processing is more economically beneficial. This study may serve as a guide to plastic surgeons in deciding which cases might be appropriate for the use of the REVOLVE System and is the first report comparing the economics of fat grafting with the traditional Coleman technique and the REVOLVE System.
2014-01-01
Background: The purpose of this study was to compare two impression techniques from the perspective of patient preferences and treatment comfort.
Methods: Twenty-four (12 male, 12 female) subjects who had no previous experience with either conventional or digital impressions participated in this study. Conventional impressions of maxillary and mandibular dental arches were taken with a polyether impression material (Impregum, 3M ESPE), and bite registrations were made with polysiloxane bite registration material (Futar D, Kettenbach). Two weeks later, digital impressions and bite scans were performed using an intra-oral scanner (CEREC Omnicam, Sirona). Immediately after the impressions were made, the subjects' attitudes, preferences and perceptions towards impression techniques were evaluated using a standardized questionnaire. The perceived source of stress was evaluated using the State-Trait Anxiety Scale. Processing steps of the impression techniques (tray selection, working time etc.) were recorded in seconds. Statistical analyses were performed with the Wilcoxon Rank test, and p < 0.05 was considered significant.
Results: There were significant differences among the groups (p < 0.05) in terms of total working time and processing steps. Patients stated that digital impressions were more comfortable than conventional techniques.
Conclusions: Digital impressions resulted in a more time-efficient technique than conventional impressions. Patients preferred the digital impression technique rather than conventional techniques. PMID:24479892
Metabolomic analysis using porcine skin: a pilot study of analytical techniques.
Wu, Julie; Fiehn, Oliver; Armstrong, April W
2014-06-15
Metabolic byproducts serve as indicators of the chemical processes and can provide valuable information on pathogenesis by measuring the amplified output. Standardized techniques for metabolome extraction of skin samples serve as a critical foundation to this field but have not been developed. We sought to determine the optimal cell lysage techniques for skin sample preparation and to compare GC-TOF-MS and UHPLC-QTOF-MS for metabolomic analysis. Using porcine skin samples, we pulverized the skin via various combinations of mechanical techniques for cell lysage. After extraction, the samples were subjected to GC-TOF-MS and/or UHPLC-QTOF-MS. Signal intensities from GC-TOF-MS analysis showed that ultrasonication (2.7 × 10⁷) was most effective for cell lysage when compared to mortar-and-pestle (2.6 × 10⁷), ball mill followed by ultrasonication (1.6 × 10⁷), mortar-and-pestle followed by ultrasonication (1.4 × 10⁷), and homogenization (trial 1: 8.4 × 10⁶; trial 2: 1.6 × 10⁷). Due to the similar signal intensities, ultrasonication and mortar-and-pestle were applied to additional samples and subjected to GC-TOF-MS and UHPLC-QTOF-MS. Ultrasonication yielded greater signal intensities than mortar-and-pestle for 92% of detected metabolites following GC-TOF-MS and for 68% of detected metabolites following UHPLC-QTOF-MS. Overall, ultrasonication is the preferred method for efficient cell lysage of skin tissue for both metabolomic platforms. With standardized sample preparation, metabolomic analysis of skin can serve as a powerful tool in elucidating underlying biological processes in dermatological conditions.
NASA Astrophysics Data System (ADS)
Lo, Li; Shen, Chuan-Chou; Lu, Chia-Jung; Chen, Yi-Chi; Chang, Ching-Chih; Wei, Kuo-Yen; Qu, Dingchuang; Gagan, Michael K.
2014-02-01
We have developed a rapid and precise procedure for measuring multiple elements in foraminifera and corals by inductively coupled plasma sector field mass spectrometry (ICP-SF-MS) with both cold- [800 W radio frequency (RF) power] and hot- (1200 W RF power) plasma techniques. Our quality control program includes careful subsampling protocols, contamination-free workbench spaces, and refined plastic-ware cleaning process. Element/Ca ratios are calculated directly from ion beam intensities of 24Mg, 27Al, 43Ca, 55Mn, 57Fe, 86Sr, and 138Ba, using a standard bracketing method. A routine measurement time is 3-5 min per dissolved sample. The matrix effects of nitric acid, and Ca and Sr levels, are carefully quantified and overcome. There is no significant difference between data determined by cold- and hot-plasma methods, but the techniques have different advantages. The cold-plasma technique offers a more stable plasma condition and better reproducibility for ppm-level elements. Long-term 2-sigma relative standard deviations (2-RSD) for repeat measurements of an in-house coral standard are 0.32% for Mg/Ca and 0.43% for Sr/Ca by cold-plasma ICP-SF-MS, and 0.69% for Mg/Ca and 0.51% for Sr/Ca by hot-plasma ICP-SF-MS. The higher sensitivity and enhanced measurement precision of the hot-plasma procedure yields 2-RSD precision for μmol/mol trace elements of 0.60% (Mg/Ca), 9.9% (Al/Ca), 0.68% (Mn/Ca), 2.7% (Fe/Ca), 0.50% (Sr/Ca), and 0.84% (Ba/Ca) for an in-house foraminiferal standard. Our refined ICP-SF-MS technique, which has the advantages of small sample size (2-4 μg carbonate consumed) and fast sample throughput (5-8 samples/hour), should open the way to the production of high precision and high resolution geochemical records for natural carbonate materials.
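Standard bracketing corrects instrument drift by interpolating the sensitivity measured on a reference standard before and after each unknown. A minimal single-element sketch; the intensity ratios and standard value below are hypothetical:

```python
def bracketed_ratio(sample_ratio, std_ratio_before, std_ratio_after, std_true_ratio):
    """Drift-correct a measured ion-beam intensity ratio (e.g. 24Mg/43Ca) by
    linear interpolation between bracketing measurements of a known standard."""
    drift = 0.5 * (std_ratio_before + std_ratio_after)   # sensitivity at sample time
    return sample_ratio / drift * std_true_ratio

# Hypothetical intensity ratios; standard known to be Mg/Ca = 4.20 mmol/mol.
print(f"{bracketed_ratio(0.01310, 0.01248, 0.01252, 4.20):.3f} mmol/mol")
```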
Method to Estimate the Dissolved Air Content in Hydraulic Fluid
NASA Technical Reports Server (NTRS)
Hauser, Daniel M.
2011-01-01
In order to verify the air content in hydraulic fluid, an instrument was needed to measure the dissolved air content before the fluid was loaded into the system. The instrument also needed to measure the dissolved air content in situ and in real time during the de-aeration process. The current methods used to measure the dissolved air content require the fluid to be drawn from the hydraulic system, and additional offline laboratory processing time is involved. During laboratory processing, there is a potential for contamination to occur, especially when subsaturated fluid is to be analyzed. A new method measures the amount of dissolved air in hydraulic fluid through the use of a dissolved oxygen meter. The device measures the dissolved air content through an in situ, real-time process that requires no additional offline laboratory processing time. The method utilizes an instrument that measures the partial pressure of oxygen in the hydraulic fluid. By using a standardized calculation procedure that relates the oxygen partial pressure to the volume of dissolved air in solution, the dissolved air content is estimated. The technique employs luminescent quenching technology to determine the partial pressure of oxygen in the hydraulic fluid. An estimated Henry's law coefficient for oxygen and nitrogen in hydraulic fluid is calculated using a standard method to estimate the solubility of gases in lubricants. The amount of dissolved oxygen in the hydraulic fluid is estimated using the Henry's solubility coefficient and the measured partial pressure of oxygen in solution. The amount of dissolved nitrogen that is in solution is estimated by assuming that the ratio of dissolved nitrogen to dissolved oxygen is equal to the ratio of the gas solubility of nitrogen to oxygen at atmospheric pressure and temperature. The technique was performed at atmospheric pressure and room temperature. The technique could theoretically be carried out at higher pressures and elevated temperatures.
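The estimation chain reduces to Henry's law for oxygen plus the assumed nitrogen-to-oxygen scaling described above. A minimal sketch with placeholder coefficients (real values must come from the solubility-estimation method cited in the abstract):

```python
def dissolved_air_fraction(p_o2, k_o2, n2_o2_ratio):
    """Estimate total dissolved air (volume of gas per volume of fluid).
    p_o2: measured O2 partial pressure in the fluid, atm.
    k_o2: assumed solubility coefficient for O2 in the fluid
          (vol gas / vol fluid / atm) -- placeholder, fluid-specific.
    n2_o2_ratio: assumed ratio of dissolved N2 to dissolved O2, taken per the
          abstract from gas solubilities at atmospheric pressure and temperature."""
    v_o2 = k_o2 * p_o2              # Henry's law estimate of dissolved oxygen
    v_n2 = v_o2 * n2_o2_ratio       # dissolved nitrogen scaled from oxygen
    return v_o2 + v_n2

# Placeholder inputs: p_O2 = 0.21 atm, k_O2 = 0.30, N2:O2 dissolved ratio = 2.0.
print(f"{dissolved_air_fraction(0.21, 0.30, 2.0):.3f} vol/vol")
```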
Laser-assisted focused He+ ion beam induced etching with and without XeF2 gas assist
Stanford, Michael G.; Mahady, Kyle; Lewis, Brett B.; ...
2016-10-04
Focused helium ion (He+) milling has been demonstrated as a high-resolution nanopatterning technique; however, it can be limited by its low sputter yield as well as the introduction of undesired subsurface damage. Here, we introduce pulsed laser- and gas-assisted processes to enhance the material removal rate and patterning fidelity. A pulsed laser-assisted He+ milling process is shown to enable high-resolution milling of titanium while reducing subsurface damage in situ. Gas-assisted focused ion beam induced etching (FIBIE) of Ti is also demonstrated, in which the XeF2 precursor provides a chemical assist for enhanced material removal rate. In conclusion, a pulsed laser-assisted and gas-assisted FIBIE process is shown to increase the etch yield by ~9× relative to the pure He+ sputtering process. These He+ induced nanopatterning techniques improve material removal rate, in comparison to standard He+ sputtering, while simultaneously decreasing subsurface damage, thus extending the applicability of the He+ probe as a nanopatterning tool.
Interpreting international governance standards for health IT use within general medical practice.
Mahncke, Rachel J; Williams, Patricia A H
2014-01-01
General practices in Australia recognise the importance of comprehensive protective security measures. Some elements of information security governance are incorporated into recommended standards, however the governance component of information security is still insufficiently addressed in practice. The International Organisation for Standardisation (ISO) released a new global standard in May 2013 entitled ISO/IEC 27014:2013 Information technology - Security techniques - Governance of information security. This standard, applicable to organisations of all sizes, offers a framework against which to assess and implement the governance components of information security. The standard demonstrates the relationship between governance and the management of information security, provides strategic principles and processes, and forms the basis for establishing a positive information security culture. An analysis and interpretation of this standard for use in Australian general practice was performed. This work is unique, as such an interpretation for the Australian healthcare environment has not been undertaken before. It demonstrates an application of the standard at a strategic level to inform existing development of an information security governance framework.
Aerospace Environmental Technology Conference
NASA Technical Reports Server (NTRS)
Whitaker, A. F. (Editor)
1995-01-01
The mandated elimination of CFCs, Halons, TCA, and other ozone-depleting chemicals and specific hazardous materials has required changes and new developments in aerospace materials and processes. The aerospace industry has been involved for several years in providing product substitutions, redesigning entire production processes, and developing new materials that minimize or eliminate damage to the environment. These activities emphasize replacement cleaning solvents and their application verification; compliant coatings, including corrosion protection systems, and their removal techniques; chemical propulsion effects on the environment; and the initiation of modifications to relevant processing and manufacturing specifications and standards. The Executive Summary of this Conference is published as NASA CP-3297.
Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel
NASA Technical Reports Server (NTRS)
Shalkhauser, Mary Jo; Whyte, Wayne A., Jr.
1989-01-01
Advances in very large-scale integration and recent work in the field of bandwidth efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the CODEC are described, and performance results are provided.
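The bit savings in a DPCM CODEC like the one above come from quantizing prediction errors rather than raw samples. Below is a minimal sketch of generic previous-sample DPCM with a uniform error quantizer; the predictor and step size are illustrative assumptions, not the NASA hardware algorithm, which uses a more sophisticated scheme to reach broadcast quality at 1.8 bits/pixel.

```python
import numpy as np

def dpcm_encode(samples, step=8):
    """Toy DPCM: predict each sample as the previous reconstruction and
    quantize only the prediction error (the source of the bit savings)."""
    codes, recon = [], 0.0
    for x in samples:
        q = int(round((x - recon) / step))  # quantized prediction error
        codes.append(q)
        recon += q * step                   # track the decoder's reconstruction
    return codes

def dpcm_decode(codes, step=8):
    """Rebuild the waveform by accumulating de-quantized errors."""
    recon, out = 0.0, []
    for q in codes:
        recon += q * step
        out.append(recon)
    return np.array(out)

signal = 100 * np.sin(np.linspace(0, 2 * np.pi, 32)) + 128
decoded = dpcm_decode(dpcm_encode(signal))
# Error feedback keeps each sample's error within step/2; it does not accumulate.
print("max reconstruction error:", np.abs(signal - decoded).max())
```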
Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel
NASA Technical Reports Server (NTRS)
Shalkhauser, Mary Jo; Whyte, Wayne A.
1991-01-01
Advances in very large-scale integration and recent work in the field of bandwidth-efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM (differential pulse code modulation)-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the CODEC are described, and performance results are provided.
Tensile-Creep Test Specimen Preparation Practices of Surface Support Liners
NASA Astrophysics Data System (ADS)
Guner, Dogukan; Ozturk, Hasan
2017-12-01
Ground support has always been considered a challenging issue in all underground operations. Many forms of support systems and supporting techniques are available in the mining/tunnelling industry. In the last two decades, a new polymer-based material, Thin Spray-on Liner (TSL), has attained a place in the market as an alternative to current areal ground support systems. Although TSL provides numerous merits and has different application purposes, knowledge of the mechanical properties and performance of this material is still limited. In laboratory studies, since tensile rupture is the most commonly observed failure mechanism in field applications, researchers have generally studied the tensile testing of TSLs with modifications of the American Society for Testing and Materials (ASTM) D-638 standard. For tensile-creep testing, the specimen preparation process also follows the ASTM standards. Two different specimen dimension types (Type I, Type IV) that conform to the related standards are widely preferred in TSL tensile testing. Moreover, molding and die cutting are commonly used specimen preparation techniques. In the literature, there is great variability in test results due to differences in specimen preparation techniques and practices. In this study, a ductile TSL product was tested in order to investigate the effects of both specimen preparation technique and specimen dimensions under a 7-day curing time. As a result, ultimate tensile strength, tensile yield strength, tensile modulus, and elongation-at-break values were obtained for 4 different test series. It is concluded that Type IV specimens have higher strength values compared to Type I specimens, and molded specimens have lower values than those prepared using a die cutter. Moreover, specimens prepared by molding techniques have scattered test results. Type IV specimens prepared by the die-cutter technique are suggested for tensile tests, and Type I specimens prepared by the die-cutter technique should be preferred for tensile-creep tests.
Multiscale image processing and antiscatter grids in digital radiography.
Lo, Winnie Y; Hornof, William J; Zwingenberger, Allison L; Robertson, Ian D
2009-01-01
Scatter radiation is a source of noise and results in decreased signal-to-noise ratio and thus decreased image quality in digital radiography. We determined subjectively whether a digitally processed image made without a grid would be of similar quality to an image made with a grid but without image processing. Additionally, the effects of exposure dose and of using a grid with digital radiography on overall image quality were studied. Thoracic and abdominal radiographs of five dogs of various sizes were made. Four acquisition techniques were included: (1) with a grid, standard exposure dose, digital image processing; (2) without a grid, standard exposure dose, digital image processing; (3) without a grid, half the exposure dose, digital image processing; and (4) with a grid, standard exposure dose, no digital image processing (to mimic a film-screen radiograph). Full-size radiographs as well as magnified images of specific anatomic regions were generated. Nine reviewers rated the overall image quality subjectively using a five-point scale. All digitally processed radiographs had higher overall scores than nondigitally processed radiographs regardless of patient size, exposure dose, or use of a grid. The images made at half the exposure dose had a slightly lower quality than those made at full dose, but this was only statistically significant in magnified images. Using a grid with digital image processing led to a slight but statistically significant increase in overall quality when compared with digitally processed images made without a grid, but whether this increase in quality is clinically significant is unknown.
NASA Astrophysics Data System (ADS)
Sanger, Demas S.; Haneishi, Hideaki; Miyake, Yoichi
1995-08-01
This paper proposed a simple and automatic method for recognizing the light sources used with various color negative film brands by means of digital image processing. First, we stretched the image obtained from a negative based on the standardized scaling factors, then extracted the dominant color component among the red, green, and blue components of the stretched image. The dominant color component became the discriminator for the recognition. The experimental results verified that any one of the three techniques could recognize the light source from negatives of any single film brand and of all brands with greater than 93.2% and 96.6% correct recognition, respectively. This method is significant for the automation of color quality control in color reproduction from color negative film in mass processing and printing machines.
Uncertainty Analysis of Instrument Calibration and Application
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine the precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for the inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for the estimation of both bias and precision uncertainties using replication are developed. Statistical tests for the stationarity of calibration parameters over time are obtained.
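The propagation of individual measurement uncertainties through a defining functional expression, as described in the abstract above, follows the first-order rule σ_f² = Σᵢ (∂f/∂xᵢ)² σᵢ² for independent inputs. A minimal numerical sketch (the function and uncertainty values are illustrative assumptions, not from the paper):

```python
import numpy as np

def propagate(f, x, sigma, h=1e-6):
    """First-order propagation for independent inputs:
    sigma_f^2 = sum_i (df/dx_i)^2 * sigma_i^2, using central-difference
    estimates of the partial derivatives."""
    x = np.asarray(x, dtype=float)
    var = 0.0
    for i in range(len(x)):
        xp, xm = x.copy(), x.copy()
        xp[i] += h
        xm[i] -= h
        var += (((f(xp) - f(xm)) / (2 * h)) * sigma[i]) ** 2
    return np.sqrt(var)

# Illustrative: dynamic pressure q = 0.5 * rho * V^2 from measured rho and V
q = lambda v: 0.5 * v[0] * v[1] ** 2
print(propagate(q, x=[1.225, 50.0], sigma=[0.01, 0.5]))
```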
Data Mining of Macromolecular Structures.
van Beusekom, Bart; Perrakis, Anastassis; Joosten, Robbie P
2016-01-01
The use of macromolecular structures is widespread for a variety of applications, from teaching protein structure principles all the way to ligand optimization in drug development. Applying data mining techniques on these experimentally determined structures requires a highly uniform, standardized structural data source. The Protein Data Bank (PDB) has evolved over the years toward becoming the standard resource for macromolecular structures. However, the process of selecting the data most suitable for specific applications is still very much based on personal preferences and understanding of the experimental techniques used to obtain these models. In this chapter, we will first explain the challenges with data standardization, annotation, and uniformity in the PDB entries determined by X-ray crystallography. We then discuss the specific effect that crystallographic data quality and model optimization methods have on structural models and how validation tools can be used to make informed choices. We also discuss specific advantages of using the PDB_REDO databank as a resource for structural data. Finally, we will provide guidelines on how to select the most suitable protein structure models for detailed analysis and how to select a set of structure models suitable for data mining.
The effect of various veneering techniques on the marginal fit of zirconia copings
Torabi, Kianoosh; Vojdani, Mahroo; Giti, Rashin; Pardis, Soheil
2015-01-01
PURPOSE This study aimed to evaluate the fit of zirconia ceramics before and after veneering, using 3 different veneering processes (layering, press-over, and CAD-on techniques). MATERIALS AND METHODS Thirty standardized zirconia CAD/CAM frameworks were constructed and divided into three groups of 10 each. The first group was veneered using the traditional layering technique. Press-over and CAD-on techniques were used to veneer the second and third groups. The marginal gap of the specimens was measured before and after the veneering process at 18 sites on the master die using a digital microscope. A paired t-test was used to evaluate mean marginal gap changes. One-way ANOVA and post hoc tests were also employed for comparison among the 3 groups (α=.05). RESULTS The marginal gap of all 3 groups increased after porcelain veneering. The mean marginal gap value after veneering in the layering group (63.06 µm) was higher than in the press-over (50.64 µm) and CAD-on (51.50 µm) groups (P<.001). CONCLUSION The three veneering methods altered the marginal fit of zirconia copings. The conventional layering technique increased the marginal gap of the zirconia framework more than the pressing and CAD-on techniques. All-ceramic crowns made through the three different veneering methods revealed clinically acceptable marginal fit. PMID:26140175
Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.
2012-01-01
New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616
Upright Imaging of Drosophila Egg Chambers
Manning, Lathiena; Starz-Gaiano, Michelle
2015-01-01
Drosophila melanogaster oogenesis provides an ideal context for studying varied developmental processes since the ovary is relatively simple in architecture, is well-characterized, and is amenable to genetic analysis. Each egg chamber consists of germ-line cells surrounded by a single epithelial layer of somatic follicle cells. Subsets of follicle cells undergo differentiation during specific stages to become several different cell types. Standard techniques primarily allow for a lateral view of egg chambers, and therefore a limited view of follicle cell organization and identity. The upright imaging protocol describes a mounting technique that enables a novel, vertical view of egg chambers with a standard confocal microscope. Samples are first mounted between two layers of glycerin jelly in a lateral (horizontal) position on a glass microscope slide. The jelly with encased egg chambers is then cut into blocks, transferred to a coverslip, and flipped to position egg chambers upright. Mounted egg chambers can be imaged on either an upright or an inverted confocal microscope. This technique enables the study of follicle cell specification, organization, molecular markers, and egg development with new detail and from a new perspective. PMID:25867882
Salerno, Stephen M; Arnett, Michael V; Domanski, Jeremy P
2009-01-01
Prior research on reducing variation in housestaff handoff procedures has depended on proprietary checkout software. The use of low-technology standardization techniques has not been widely studied. We wished to determine whether standardizing the process of intern sign-out using low-technology sign-out tools could reduce the perception of errors and missing handoff data. We conducted a pre-post prospective study of a cohort of 34 interns on a general internal medicine ward. Night interns coming off duty and day interns reassuming care were surveyed on their perception of erroneous sign-out data, mistakes made by the night intern overnight, and occurrences unanticipated by sign-out. Trainee satisfaction with the sign-out process was assessed with a 5-point Likert survey. There were 399 intern surveys performed 8 weeks before and 6 weeks after the introduction of a standardized sign-out form. The response rate was 95% for the night interns and 70% for the interns reassuming care in the morning. After the standardized form was introduced, night interns were significantly (p < .003) less likely to detect missing sign-out data, including missing important diseases, contingency plans, or medications. Standardized sign-out did not significantly alter the frequency of dropped tasks or missed lab and X-ray data as perceived by the night intern. However, the day teams perceived significantly fewer errors on the part of the night intern (p = .001) after the introduction of the standardized sign-out sheet. There was no difference in mean Likert scores of resident satisfaction with sign-out before and after the intervention. Standardized written sign-out sheets significantly improve the completeness and effectiveness of handoffs between night and day interns. Further research is needed to determine whether these process improvements are related to better patient outcomes.
Economic Techniques of Occupational Health and Safety Management
NASA Astrophysics Data System (ADS)
Sidorov, Aleksandr I.; Beregovaya, Irina B.; Khanzhina, Olga A.
2016-10-01
The article deals with the issues of economic techniques of occupational health and safety management. The authors' definition of safety management is given. It is represented as a task-oriented process to identify, establish and maintain such a state of the work environment in which there are no possible effects of hazardous and harmful factors, or their influence does not go beyond certain limits. It is noted that management techniques that are part of the control mechanism are divided into administrative, organizational-administrative, social-psychological, and economic. The economic management techniques are proposed to be classified depending on the management subject, the management object, the relation to the enterprise environment, and the control action. Technoeconomic study, feasibility study, planning, financial incentives, preferential crediting of enterprises, pricing, profit and equity sharing, preferential tax treatment for enterprises, and the setting of economic regulations and standards are distinguished as economic techniques.
User's manual for computer program BASEPLOT
Sanders, Curtis L.
2002-01-01
The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality control checks and produces plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.
Improving Focal Depth Estimates: Studies of Depth Phase Detection at Regional Distances
NASA Astrophysics Data System (ADS)
Stroujkova, A.; Reiter, D. T.; Shumway, R. H.
2006-12-01
The accurate estimation of the depth of small, regionally recorded events continues to be an important and difficult explosion monitoring research problem. Depth phases (free surface reflections) are the primary tool that seismologists use to constrain the depth of a seismic event. When depth phases from an event are detected, an accurate source depth is easily found by using the delay times of the depth phases relative to the P wave and a velocity profile near the source. Cepstral techniques, including cepstral F-statistics, represent a class of methods designed for depth-phase detection and identification; however, they offer only a moderate level of success at epicentral distances less than 15°. This is due to complexities in the Pn coda, which can lead to numerous false detections in addition to the true phase detection. Therefore, cepstral methods cannot be used independently to reliably identify depth phases. Other evidence, such as apparent velocities, amplitudes and frequency content, must be used to confirm whether the phase is truly a depth phase. In this study we used a variety of array methods to estimate apparent phase velocities and arrival azimuths, including beam-forming, semblance analysis, MUltiple SIgnal Classification (MUSIC) (e.g., Schmidt, 1979), and cross-correlation (e.g., Cansi, 1995; Tibuleac and Herrin, 1997). To facilitate the processing and comparison of results, we developed a MATLAB-based processing tool, which allows application of all of these techniques (i.e., augmented cepstral processing) in a single environment. The main objective of this research was to combine the results of three focal-depth estimation techniques and their associated standard errors into a statistically valid unified depth estimate. The three techniques include: 1. Direct focal depth estimate from the depth-phase arrival times picked via augmented cepstral processing. 2. Hypocenter location from direct and surface-reflected arrivals observed on sparse networks of regional stations using a Grid-search, Multiple-Event Location method (GMEL; Rodi and Toksöz, 2000; 2001). 3. Surface-wave dispersion inversion for event depth and focal mechanism (Herrmann and Ammon, 2002). To validate our approach and provide quality control for our solutions, we applied the techniques to moderate-sized events (mb between 4.5 and 6.0) with known focal mechanisms. We illustrate the techniques using events observed at regional distances from the KSAR (Wonju, South Korea) teleseismic array and other nearby broadband three-component stations. Our results indicate that the techniques can produce excellent agreement between the various depth estimates. In addition, combining the techniques into a "unified" estimate greatly reduced location errors and improved the robustness of the solution, even if results from the individual methods yielded large standard errors.
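The abstract does not spell out how the three depth estimates and their standard errors are merged; a standard choice for such a unified estimate (an assumption here, not necessarily the authors' exact scheme) is the inverse-variance weighted mean:

```python
import numpy as np

def unified_depth(estimates, std_errs):
    """Inverse-variance weighted combination of independent estimates;
    returns the unified value and its (smaller) standard error."""
    w = 1.0 / np.square(std_errs)
    combined = np.sum(w * estimates) / np.sum(w)
    combined_se = np.sqrt(1.0 / np.sum(w))
    return combined, combined_se

# Illustrative depths (km) from cepstral picks, GMEL, and surface-wave inversion
print(unified_depth(np.array([12.0, 9.5, 11.0]), np.array([2.0, 3.0, 1.5])))
```

Because the weights are inverse variances, the best-constrained method dominates, and the combined standard error is never larger than the smallest individual one.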
Dugdale, Stephanie; Ward, Jonathan; Hernen, Jan; Elison, Sarah; Davies, Glyn; Donkor, Daniel
2016-07-22
In recent years, research within the field of health psychology has made significant progress in terms of advancing and standardizing the science of developing, evaluating and reporting complex behavioral change interventions. A major part of this work has involved the development of an evidence-based Behavior Change Technique Taxonomy v1 (BCTTv1), as a means of describing the active components contained within such complex interventions. To date, however, this standardized approach derived from health psychology research has not been applied to the development of complex interventions for the treatment of substance use disorders (SUD). Therefore, this paper uses Breaking Free Online (BFO), a computer-assisted therapy program for SUD, as an example of how the clinical techniques contained within such an intervention might be mapped onto the BCTTv1. The developers of BFO were able to produce a full list of the clinical techniques contained within BFO. Exploratory mapping of the BCTTv1 onto the clinical content of the BFO program was conducted separately by the authors of the paper. This included the developers of the BFO program and psychology professionals working within the SUD field. These coded techniques were reviewed by the authors and any discrepancies in the coding were discussed between all authors until an agreement was reached. The BCTTv1 was mapped onto the clinical content of the BFO program. At least one behavioral change technique was found in 12 out of 16 grouping categories within the BCTTv1. A total of 26 out of 93 behavior change techniques were identified across the clinical content of the program. This exploratory mapping exercise has identified the specific behavior change techniques contained within BFO, and has provided a means of describing these techniques in a standardized way using the BCTTv1 terminology. It has also provided an opportunity for the BCTTv1 mapping process to be reported to the wider SUD treatment community, as it may have real utility in the development and evaluation of other psychosocial and behavioral change interventions within this field.
A Versatile Multichannel Digital Signal Processing Module for Microcalorimeter Arrays
NASA Astrophysics Data System (ADS)
Tan, H.; Collins, J. W.; Walby, M.; Hennig, W.; Warburton, W. K.; Grudberg, P.
2012-06-01
Different techniques have been developed for reading out microcalorimeter sensor arrays: individual outputs for small arrays, and time-division or frequency-division or code-division multiplexing for large arrays. Typically, raw waveform data are first read out from the arrays using one of these techniques and then stored on computer hard drives for offline optimum filtering, leading not only to requirements for large storage space but also limitations on achievable count rate. Thus, a read-out module that is capable of processing microcalorimeter signals in real time will be highly desirable. We have developed multichannel digital signal processing electronics that are capable of on-board, real time processing of microcalorimeter sensor signals from multiplexed or individual pixel arrays. It is a 3U PXI module consisting of a standardized core processor board and a set of daughter boards. Each daughter board is designed to interface a specific type of microcalorimeter array to the core processor. The combination of the standardized core plus this set of easily designed and modified daughter boards results in a versatile data acquisition module that not only can easily expand to future detector systems, but is also low cost. In this paper, we first present the core processor/daughter board architecture, and then report the performance of an 8-channel daughter board, which digitizes individual pixel outputs at 1 MSPS with 16-bit precision. We will also introduce a time-division multiplexing type daughter board, which takes in time-division multiplexing signals through fiber-optic cables and then processes the digital signals to generate energy spectra in real time.
Wavelet-Based Signal Processing for Monitoring Discomfort and Fatigue
2008-06-01
Wigner-Ville distribution (WVD), the short-time Fourier transform (STFT) or spectrogram, the Choi-Williams distribution (CWD), the smoothed pseudo Wigner ... has the advantage of being computationally less expensive than other standard techniques, such as the Wigner-Ville distribution (WVD), the spectrogram ... slopes derived from the spectrogram and the smoothed pseudo Wigner-Ville distribution. Furthermore, slopes derived from the filter bank
What Does CALL Have to Offer Computer Science and What Does Computer Science Have to Offer CALL?
ERIC Educational Resources Information Center
Cushion, Steve
2006-01-01
We will argue that CALL can usefully be viewed as a subset of computer software engineering and can profit from adopting some of the recent progress in software development theory. The unified modelling language has become the industry standard modelling technique and the accompanying unified process is rapidly gaining acceptance. The manner in…
Manufacturing and quality control of interconnecting wire harnesses, Volume 4
NASA Technical Reports Server (NTRS)
1972-01-01
The document covers interconnecting wire harnesses defined in the design standard, including type 8, flat conductor cable. This volume covers installations of groups of harnesses in a major assembly and the associated post-installation inspections and electrical tests. Knowledge gained through experience on the Saturn 5 program, coupled with recent advances in techniques, materials, and processes, was incorporated into this document.
Elemental Analysis in Biological Matrices Using ICP-MS.
Hansen, Matthew N; Clogston, Jeffrey D
2018-01-01
The increasing exploration of metallic nanoparticles for use as cancer therapeutic agents necessitates a sensitive technique to track the clearance and distribution of the material once introduced into a living system. Inductively coupled plasma mass spectrometry (ICP-MS) provides a sensitive and selective tool for tracking the distribution of metal components from these nanotherapeutics. This chapter presents a standardized method for processing biological matrices, ensuring complete homogenization of tissues, and outlines the preparation of appropriate standards and controls. The method described herein utilized gold nanoparticle-treated samples; however, the method can easily be applied to the analysis of other metals.
NASA Astrophysics Data System (ADS)
Nardi, F.; Grimaldi, S.; Petroselli, A.
2012-12-01
Remotely sensed Digital Elevation Models (DEMs), largely available at high resolution, and advanced terrain analysis techniques built into Geographic Information Systems (GIS) provide unique opportunities for DEM-based hydrologic and hydraulic modelling in data-scarce river basins, paving the way for flood mapping at the global scale. This research is based on the implementation of a fully continuous hydrologic-hydraulic model optimized for ungauged basins with limited river flow measurements. The proposed procedure is characterized by a rainfall generator that feeds a continuous rainfall-runoff model producing flow time series, which are routed along the channel using a two-dimensional hydraulic model for the detailed representation of the inundation process. The main advantage of the proposed approach is the characterization of the entire physical process during hydrologic extreme events: channel runoff generation, propagation, and overland flow within the floodplain domain. This physically based model eliminates the need for synthetic design hyetograph and hydrograph estimation, which constitutes the main source of subjective analysis and uncertainty in standard methods for flood mapping. Selected case studies show the results and performance of the proposed procedure with respect to standard event-based approaches.
Measurement Challenges for Carbon Nanotube Material
NASA Technical Reports Server (NTRS)
Sosa, Edward; Arepalli, Sivaram; Nikolaev, Pasha; Gorelik, Olga; Yowell, Leonard
2006-01-01
The advances in large-scale applications of carbon nanotubes demand a reliable supply of raw and processed materials. It is imperative to have consistent quality control of these nanomaterials to distinguish material inconsistency from the modifications induced by processing of nanotubes for any application. NASA Johnson Space Center recognized this need five years ago and started a program to standardize the characterization methods. The JSC team conducted two workshops (2003 and 2005) in collaboration with NIST focusing on purity and dispersion measurement issues of carbon nanotubes [1]. In 2004, the NASA-JSC protocol was developed by combining the analytical techniques of SEM, TEM, UV-VIS-NIR absorption, Raman, and TGA [2]. This protocol is routinely used by several researchers across the world as a first step in characterizing raw and purified carbon nanotubes. A suggested practice guide consisting of detailed chapters on TGA, Raman, electron microscopy and NIR absorption is in the final stages and is undergoing revisions with input from the nanotube community [3]. The possible addition of other techniques such as XPS and ICP to the existing protocol will be presented. Recent activities at ANSI and ISO towards implementing these protocols as nanotube characterization standards will be discussed.
NASA Technical Reports Server (NTRS)
Schilling, D. L.
1974-01-01
Digital multiplication of two waveforms using delta modulation (DM) is discussed. It is shown that while conventional multiplication of two N-bit words requires complexity of order N², multiplication using DM requires complexity which increases linearly with N. Bounds on the signal-to-quantization-noise ratio (SNR) resulting from this multiplication are determined and compared with the SNR obtained using standard multiplication techniques. The phase-locked loop (PLL) system, consisting of a phase detector, voltage-controlled oscillator, and a linear loop filter, is discussed in terms of its design and system advantages. Areas requiring further research are identified.
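For readers unfamiliar with DM, the sketch below shows the basic 1-bit delta modulator that underlies the multiplication scheme: the signal is represented by a ±1 bit stream that nudges a staircase approximation up or down by a fixed step. The step size and test signal are illustrative assumptions; the linear-complexity multiplication of two such bit streams is not shown.

```python
import numpy as np

def delta_modulate(samples, step=0.08):
    """1-bit delta modulation: emit +1/-1 depending on whether the input is
    above or below the running staircase approximation, then step toward it."""
    bits, track, approx = [], [], 0.0
    for x in samples:
        b = 1 if x > approx else -1
        approx += b * step
        bits.append(b)
        track.append(approx)
    return np.array(bits), np.array(track)

t = np.linspace(0, 1, 200)
signal = np.sin(2 * np.pi * 2 * t)
bits, approx = delta_modulate(signal)
print("mean tracking error:", np.abs(signal - approx).mean())
```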
Sampling methods for microbiological analysis of red meat and poultry carcasses.
Capita, Rosa; Prieto, Miguel; Alonso-Calleja, Carlos
2004-06-01
Microbiological analysis of carcasses at slaughterhouses is required in the European Union for evaluating the hygienic performance of carcass production processes as required for effective hazard analysis critical control point implementation. The European Union microbial performance standards refer exclusively to the excision method, even though swabbing using the wet/dry technique is also permitted when correlation between both destructive and nondestructive methods can be established. For practical and economic reasons, the swab technique is the most extensively used carcass surface-sampling method. The main characteristics, advantages, and limitations of the common excision and swabbing methods are described here.
Label-free evanescent microscopy for membrane nano-tomography in living cells.
Bon, Pierre; Barroca, Thomas; Lévèque-Fort, Sandrine; Fort, Emmanuel
2014-11-01
We show that through-the-objective evanescent microscopy (epi-EM) is a powerful technique for imaging membranes in living cells. Readily implementable on a standard inverted microscope, this technique enables full-field, real-time tracking of membrane processes without labeling, and thus without signal fading. In addition, we demonstrate that the membrane/interface distance can be retrieved with 10 nm precision using a multilayer Fresnel model. We apply this nano-axial tomography of living cell membranes to retrieve quantitative information on membrane invagination dynamics.
Low-loss slot waveguides with silicon (111) surfaces realized using anisotropic wet etching
NASA Astrophysics Data System (ADS)
Debnath, Kapil; Khokhar, Ali; Boden, Stuart; Arimoto, Hideo; Oo, Swe; Chong, Harold; Reed, Graham; Saito, Shinichi
2016-11-01
We demonstrate low-loss slot waveguides on a silicon-on-insulator (SOI) platform. Waveguides oriented along the (11-2) direction on the Si (110) plane were first fabricated by a standard e-beam lithography and dry etching process. A TMAH-based anisotropic wet etching technique was then used to remove any residual sidewall roughness. Using this fabrication technique, propagation loss as low as 3.7 dB/cm was realized in silicon slot waveguides for wavelengths near 1550 nm. We also realized a low propagation loss of 1 dB/cm for silicon strip waveguides.
NASA Astrophysics Data System (ADS)
Zhao, Libo; Xia, Yong; Hebibul, Rahman; Wang, Jiuhong; Zhou, Xiangyang; Hu, Yingjie; Li, Zhikang; Luo, Guoxi; Zhao, Yulong; Jiang, Zhuangde
2018-03-01
This paper presents an experimental study using image processing to investigate the width and width uniformity of sub-micrometer polyethylene oxide (PEO) lines fabricated by the near-field electrospinning (NFES) technique. An adaptive thresholding method was developed to determine the optimal gray values for accurately extracting the profiles of printed lines from the original optical images, and its feasibility was demonstrated. The proposed thresholding method takes advantage of statistical properties of the image and eliminates halo-induced errors. The triangular method and the relative standard deviation (RSD) were introduced to calculate line width and width uniformity, respectively. Based on these image processing methods, the effects of process parameters, including substrate speed (v), applied voltage (U), nozzle-to-collector distance (H), and syringe pump flow rate (Q), on the width and width uniformity of printed lines were discussed. The research results are helpful in promoting the NFES technique for fabricating high-resolution micro- and sub-micrometer lines, and also in optical image processing at the sub-micrometer level.
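The width-uniformity metric used above is simple to reproduce: the RSD is the sample standard deviation of the per-location widths divided by their mean. A minimal sketch with made-up width measurements (the triangular width extraction itself is not reproduced here):

```python
import numpy as np

def line_width_stats(widths_um):
    """Mean width and width uniformity of a printed line, where
    uniformity is the relative standard deviation RSD = std/mean * 100%."""
    w = np.asarray(widths_um, dtype=float)
    return w.mean(), 100.0 * w.std(ddof=1) / w.mean()

# Illustrative per-location width measurements (micrometers) along one PEO line
mean_w, rsd = line_width_stats([0.82, 0.79, 0.85, 0.81, 0.80, 0.84])
print(f"mean width = {mean_w:.3f} um, RSD = {rsd:.1f}%")
```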
Ardila-Rey, Jorge Alfredo; Montaña, Johny; Schurch, Roger; Covolan Ulson, José Alfredo; Bani, Nurul Aini
2018-01-01
Partial discharges (PDs) are one of the most important classes of ageing processes that occur within electrical insulation. PD detection is a standardized technique to qualify the state of the insulation in electric assets such as machines and power cables. Generally, the classical phase-resolved partial discharge (PRPD) patterns are used to perform the identification of the type of PD source when they are related to a specific degradation process and when the electrical noise level is low compared to the magnitudes of the PD signals. However, in practical applications such as measurements carried out in the field or in industrial environments, several PD sources and large noise signals are usually present simultaneously. In this study, three different inductive sensors have been used to evaluate and compare their performance in the detection and separation of multiple PD sources by applying the chromatic technique to each of the measured signals. PMID:29596337
Tackling sampling challenges in biomolecular simulations.
Barducci, Alessandro; Pfaendtner, Jim; Bonomi, Massimiliano
2015-01-01
Molecular dynamics (MD) simulations are a powerful tool to give an atomistic insight into the structure and dynamics of proteins. However, the time scales accessible in standard simulations, which often do not match those in which interesting biological processes occur, limit their predictive capabilities. Many advanced sampling techniques have been proposed over the years to overcome this limitation. This chapter focuses on metadynamics, a method based on the introduction of a time-dependent bias potential to accelerate sampling and recover equilibrium properties of a few descriptors that are able to capture the complexity of a process at a coarse-grained level. The theory of metadynamics and its combination with other popular sampling techniques such as the replica exchange method is briefly presented. Practical applications of these techniques to the study of the Trp-Cage miniprotein folding are also illustrated. The examples contain a guide for performing these calculations with PLUMED, a plugin to perform enhanced sampling simulations in combination with many popular MD codes.
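The core of metadynamics is easy to state: a history-dependent bias V(s,t) = Σₖ w·exp(−(s−sₖ)²/2σ²) is built from Gaussian hills deposited at previously visited values of the collective variable s, progressively filling free-energy minima. Below is a toy 1D sketch on a double-well potential; all parameters are illustrative, and production runs use PLUMED coupled to an MD engine rather than anything like this.

```python
import numpy as np

def toy_metadynamics(n_steps=20000, w=0.1, sigma=0.2, stride=200,
                     dt=5e-3, kT=1.0, seed=0):
    """Overdamped Langevin dynamics on the double well V(s) = (s^2 - 1)^2,
    with Gaussian hills deposited every `stride` steps. The growing bias
    flattens the barrier and drives hopping between the two wells."""
    rng = np.random.default_rng(seed)
    centers = []                              # deposited hill centers
    s = -1.0                                  # start in the left well
    for step in range(n_steps):
        c = np.array(centers) if centers else np.empty(0)
        # derivative of the bias (sum of Gaussians) with respect to s
        dbias = np.sum(-w * (s - c) / sigma**2
                       * np.exp(-(s - c) ** 2 / (2 * sigma**2)))
        force = -4 * s * (s**2 - 1) - dbias   # -dV/ds - dVbias/ds
        s += force * dt + np.sqrt(2 * kT * dt) * rng.standard_normal()
        if step % stride == 0:
            centers.append(s)                 # drop a new hill at the current s
    return np.array(centers)

hills = toy_metadynamics()
print("fraction of hills deposited in the right well:", (hills > 0).mean())
```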
NASA Astrophysics Data System (ADS)
Hancher, M.
2017-12-01
Recent years have seen promising results from many research teams applying deep learning techniques to geospatial data processing. In that same timeframe, TensorFlow has emerged as the most popular framework for deep learning in general, and Google has assembled petabytes of Earth observation data from a wide variety of sources and made them available in analysis-ready form in the cloud through Google Earth Engine. Nevertheless, developing and applying deep learning to geospatial data at scale has been somewhat cumbersome to date. We present a new set of tools and techniques that simplify this process. Our approach combines the strengths of several underlying tools: TensorFlow for its expressive deep learning framework; Earth Engine for data management, preprocessing, postprocessing, and visualization; and other tools in Google Cloud Platform to train TensorFlow models at scale, perform additional custom parallel data processing, and drive the entire process from a single familiar Python development environment. These tools can be used to easily apply standard deep neural networks, convolutional neural networks, and other custom model architectures to a variety of geospatial data structures. We discuss our experiences applying these and related tools to a range of machine learning problems, including classic problems like cloud detection, building detection, land cover classification, as well as more novel problems like illegal fishing detection. Our improved tools will make it easier for geospatial data scientists to apply modern deep learning techniques to their own problems, and will also make it easier for machine learning researchers to advance the state of the art of those techniques.
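As a concrete starting point, the kind of model such a pipeline trains can be as small as the sketch below: a convolutional classifier for multi-band image patches. The patch size, band count, and class names are illustrative assumptions; exporting training tiles from Earth Engine and scaling training on Cloud infrastructure are not shown.

```python
import tensorflow as tf

# Minimal CNN for classifying 64x64 patches with 4 bands (e.g., RGB + NIR)
# into 3 illustrative classes such as cloud / land / water.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 4)),
    tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu", padding="same"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_patches, train_labels, ...) would then run on exported tiles.
```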
Diversity of thermophilic populations during thermophilic aerobic digestion of potato peel slurry.
Ugwuanyi, J O; Harvey, L M; McNeil, B
2008-01-01
To study the diversity of thermophiles during thermophilic aerobic digestion (TAD) of agro-food waste slurries under conditions similar to full-scale processes. Population diversity and development in TAD were studied by standard microbiological techniques, and the processes were monitored by standard fermentation procedures. Facultative thermophiles were identified as Bacillus coagulans and B. licheniformis, while obligate thermophiles were identified as B. stearothermophilus. They developed rapidly to peaks of 10⁷ to 10⁸ in
Optical fiber sensors measurement system and special fibers improvement
NASA Astrophysics Data System (ADS)
Jelinek, Michal; Hrabina, Jan; Hola, Miroslava; Hucl, Vaclav; Cizek, Martin; Rerucha, Simon; Lazar, Josef; Mikel, Bretislav
2017-06-01
We present a method for improving the measurement accuracy of optical frequency spectra measurements based on tunable optical filters. The optical filter was used during the design and realization of a measurement system for the inspection of fiber Bragg gratings. The system incorporates a reference block for the compensation of environmental influences, an interferometric verification subsystem, and PC-based control software implemented in LabView. Preliminary experimental verification of the measurement principle and of the measurement system functionality was carried out on a testing rig with a specially prepared concrete console at UJV Řež. The presented system is the laboratory version of a special system for measuring nuclear power plant containment shape deformation, which was installed in the Temelin power plant last year. Building on this research, we began preparing other optical fiber sensors for nuclear power plant measurements. These sensors will be based on microstructured and polarization-maintaining optical fibers. We have developed new methods and techniques for splicing and shaping optical fibers. We are able to make optical tapers ranging from ultra-short, so-called adiabatic tapers around 400 µm long, up to long tapers up to 6 mm in length. We developed new techniques for splicing standard single-mode (SM) and multimode (MM) optical fibers, and for splicing optical fibers with different diameters, in the wavelength range from 532 to 1550 nm. Alongside these, we prepared techniques for splicing and shaping special optical fibers, such as polarization-maintaining (PM) fibers and hollow-core photonic crystal fibers (PCF), and their cross-splicing methods, with a focus on minimizing back-reflection and attenuation. Splicing of special optical fibers, especially PCFs, to standard telecommunication and other SM fibers can be done with the techniques we developed. The splicing process has to be adjusted for each new optical fiber and fiber combination; splicing of the same types of fibers from different manufacturers can be adjusted by several tested changes in the splicing process. We are able to splice PCF to standard telecommunication fiber with an attenuation of up to 2 dB; the method is also presented. These splicing techniques and methods were developed with a view to using the fibers in further research and development in the field of optical fiber sensors, laser frequency stabilization, and fiber-based laser interferometry. In particular, for laser frequency stabilization we developed and present new techniques for sealing microstructured fibers with gas inside.
Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H
2015-12-01
An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.
Accelerated testing of space mechanisms
NASA Technical Reports Server (NTRS)
Murray, S. Frank; Heshmat, Hooshang
1995-01-01
This report contains a review of various existing life prediction techniques used for a wide range of space mechanisms. Life prediction techniques utilized in other, non-space fields, such as turbine engine design, are also reviewed for applicability to many space mechanism issues. The development of new concepts on how various tribological processes are involved in the life of the complex mechanisms used for space applications is examined. A 'roadmap' for the complete implementation of a tribological prediction approach for complex mechanical systems, including standard procedures for test planning, analytical models for life prediction, and experimental verification of the life prediction and accelerated testing techniques, is discussed. A plan is presented to demonstrate a method for predicting the life and/or performance of a selected space mechanism mechanical component.
Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor
2016-09-01
In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper are intended to show the benefits of high-efficiency electric motors over standard-efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data were collected and then processed by means of formulas to show the cost effectiveness of energy-efficient motors in terms of three important parameters: annual energy saving, cost saving, and payback period. These data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].
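The three parameters reduce to simple arithmetic once the motor ratings are known: the energy saved comes from the difference in electrical input power (mechanical output divided by efficiency). A sketch with illustrative numbers (ratings, efficiencies, tariff and running hours are assumptions, not the paper's data):

```python
def motor_savings(p_out_kw, eta_std, eta_eff, hours_per_year,
                  tariff_per_kwh, price_premium):
    """Annual energy saving (kWh), cost saving, and simple payback period
    for replacing a standard-efficiency motor with a high-efficiency one."""
    # Electrical input power = mechanical output power / efficiency
    kwh_saved = p_out_kw * hours_per_year * (1 / eta_std - 1 / eta_eff)
    cost_saved = kwh_saved * tariff_per_kwh
    payback_years = price_premium / cost_saved
    return kwh_saved, cost_saved, payback_years

# Illustrative: 15 kW motor, 90% vs 93.5% efficient, 6000 h/year of operation
print(motor_savings(15, 0.90, 0.935, 6000, tariff_per_kwh=0.12,
                    price_premium=400))
```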
Energy resolution improvement of CdTe detectors by using the principal component analysis technique
NASA Astrophysics Data System (ADS)
Alharbi, T.
2018-02-01
In this paper, we report on the application of the Principal Component Analysis (PCA) technique for improving the γ-ray energy resolution of CdTe detectors. The PCA technique is used to estimate the amount of charge trapping, which is reflected in the shape of each detector pulse, and thereby correct for its effect. The details of the method are described and the results obtained with a CdTe detector are shown. We have achieved an energy resolution of 1.8% (FWHM) at 662 keV with full detection efficiency from a 1 mm thick CdTe detector, which gives an energy resolution of 4.5% (FWHM) with the standard pulse processing method.
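The general idea can be sketched as follows: digitized pulses are projected onto their leading principal components, whose scores parameterize the charge-trapping-induced shape variation; regressing the raw amplitude on those scores and subtracting the fitted part removes the shape-correlated spread. This is a hedged illustration of the approach, not the authors' exact algorithm; `pulses` and `amplitudes` are assumed inputs from the digitizer.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

def shape_corrected_amplitudes(pulses, amplitudes, n_components=3):
    """Sketch of a PCA-based charge-trapping correction: leading PC scores
    describe pulse-shape variation; a linear fit of amplitude vs. scores
    removes the shape-correlated component of the amplitude spread."""
    scores = PCA(n_components=n_components).fit_transform(pulses)
    fit = LinearRegression().fit(scores, amplitudes)
    shape_part = fit.predict(scores) - fit.intercept_  # score-dependent part
    return np.asarray(amplitudes) - shape_part

# pulses: (n_events, n_samples) baseline-subtracted waveforms
# amplitudes: raw pulse heights for the same events
```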
NASA Technical Reports Server (NTRS)
Krueger, Ronald; Goetze, Dirk; Ransom, Jonathon (Technical Monitor)
2006-01-01
Strain energy release rates were computed along straight delamination fronts of Double Cantilever Beam, End-Notched Flexure and Single Leg Bending specimens using the Virtual Crack Closure Technique (VCCT). The results were based on finite element analyses using ABAQUS and ANSYS and were calculated from the finite element results using the same post-processing routine to assure a consistent procedure. Mixed-mode strain energy release rates obtained from post-processing finite element results were in good agreement for all element types used and all specimens modeled. Compared to previous studies, the models made of solid twenty-node hexahedral elements and solid eight-node incompatible-mode elements yielded excellent results. For both codes, models made of standard brick elements and elements with reduced integration did not correctly capture the distribution of the energy release rate across the width of the specimens for the models chosen. The results suggested that element types with similar formulation yield matching results independent of the finite element software used. For comparison, mixed-mode strain energy release rates were also calculated within ABAQUS/Standard using the VCCT for ABAQUS add-on. For all specimens modeled, mixed-mode strain energy release rates obtained from ABAQUS finite element results using post-processing were almost identical to results calculated using the VCCT for ABAQUS add-on.
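For reference, the quantities such post-processing routines evaluate are, in the standard two-dimensional VCCT form (with Δa the length of the crack-tip element), a well-known textbook result rather than anything specific to the codes above:

```latex
G_{I}  = \frac{1}{2\,\Delta a}\, Z_i \left( w_\ell - w_{\ell'} \right), \qquad
G_{II} = \frac{1}{2\,\Delta a}\, X_i \left( u_\ell - u_{\ell'} \right), \qquad
G_T = G_I + G_{II}
```

where Z_i and X_i are the opening and sliding forces at the crack-tip node, and (w_ℓ − w_ℓ') and (u_ℓ − u_ℓ') are the relative opening and sliding displacements of the node pair immediately behind the tip.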
48 CFR 9904.401-50 - Techniques for application.
Code of Federal Regulations, 2010 CFR
2010-10-01
.... 9904.401-50 Section 9904.401-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.401-50 Techniques for application. (a) The standard...
Forrester, Jared A; Koritsanszky, Luca A; Amenu, Demisew; Haynes, Alex B; Berry, William R; Alemu, Seifu; Jiru, Fekadu; Weiser, Thomas G
2018-06-01
Surgical infections cause substantial morbidity and mortality in low- and middle-income countries (LMICs). To improve adherence to critical perioperative infection prevention standards, we developed Clean Cut, a checklist-based quality improvement program to improve compliance with best practices. We hypothesized that process mapping infection prevention activities can help clinicians identify strategies for improving surgical safety. We introduced Clean Cut at a tertiary hospital in Ethiopia. Infection prevention standards included skin antisepsis, ensuring a sterile field, instrument decontamination/sterilization, prophylactic antibiotic administration, routine swab/gauze counting, and use of a surgical safety checklist. Processes were mapped by a visiting surgical fellow and local operating theater staff to facilitate the development of contextually relevant solutions; processes were reassessed for improvements. Process mapping helped identify barriers to using alcohol-based hand solution due to skin irritation, inconsistent administration of prophylactic antibiotics due to variable delivery outside of the operating theater, inefficiencies in assuring sterility of surgical instruments through lack of confirmatory measures, and occurrences of retained surgical items through inappropriate guidelines, staffing, and training in proper routine gauze counting. Compliance with most processes improved significantly following organizational changes to align tasks with specific process goals. Enumerating the steps involved in surgical infection prevention using a process mapping technique helped identify opportunities for improving adherence and plotting contextually relevant solutions, resulting in superior compliance with antiseptic standards. Simplifying these process maps into an adaptable tool could be a powerful strategy for improving safe surgery delivery in LMICs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fournel, B.; Barre, Y.; Lepeytre, C.
2012-07-01
Liquid waste decontamination processes are mainly based on two techniques: bulk processes and the so-called cartridge processes. The first technique has been developed for the French nuclear fuel reprocessing industry since the 1960s in Marcoule and La Hague. It is a proven and mature technology which has been successfully and quickly implemented by AREVA at the Fukushima site for the processing of contaminated waters. The second technique, involving cartridge processes, offers new opportunities for the use of innovative adsorbents. The AREVA process developed for Fukushima and some results obtained on site will be presented, as well as laboratory-scale results obtained in CEA laboratories. Examples of new adsorbent developments for liquid waste decontamination are also given. A chemical process unit based on the co-precipitation technique has been successfully and quickly implemented by AREVA at the Fukushima site for the processing of contaminated waters. The asset of this technique is its ability to process large volumes in a continuous mode. Several chemical products can be used to address specific radioelements such as Cs, Sr, and Ru. Its drawback is the production of sludge (about 1% of the initial liquid volume). CEA developed strategies to model the co-precipitation phenomena in order, firstly, to minimize the quantity of added chemical reactants and, secondly, to minimize the size of co-precipitation units. We are on the way to designing compact units that could be mobilized very quickly and efficiently in case of an accidental situation. Addressing the problem of sludge conditioning, cementation appears to be a very attractive solution. The Fukushima accident has focused attention on optimizations that should be taken into account in future studies: - To better take account of non-typical aqueous matrixes like seawater; - To enlarge the spectrum of radioelements that can be efficiently processed, especially short-lived radioelements that are usually less present in standard effluents resulting from nuclear activities; - To develop reversible solid adsorbents for cartridge-type applications in order to minimize wastes. (authors)
Fabrication of five-level ultraplanar micromirror arrays by flip-chip assembly
NASA Astrophysics Data System (ADS)
Michalicek, M. Adrian; Bright, Victor M.
2001-10-01
This paper reports a detailed study of the fabrication of various piston, torsion, and cantilever style micromirror arrays using a novel, simple, and inexpensive flip-chip assembly technique. Several rectangular and polar arrays were commercially prefabricated in the MUMPs process and then flip-chip bonded to form advanced micromirror arrays where adverse effects typically associated with surface micromachining were removed. These arrays were bonded by directly fusing the MUMPs gold layers with no complex preprocessing. The modules were assembled using a computer-controlled, custom-built flip-chip bonding machine. Topographically opposed bond pads were designed to correct for slight misalignment errors during bonding and typically result in less than 2 micrometers of lateral alignment error. Although flip-chip micromirror performance is briefly discussed, the means used to create these arrays is the focus of the paper. A detailed study of flip-chip process yield is presented which describes the primary failure mechanisms for flip-chip bonding. Studies of alignment tolerance, bonding force, stress concentration, module planarity, bonding machine calibration techniques, prefabrication errors, and release procedures are presented in relation to specific observations in process yield. Ultimately, the standard thermo-compression flip-chip assembly process remains a viable technique to develop highly complex prototypes of advanced micromirror arrays.
Orbitopterional Craniotomy Resection of Pediatric Suprasellar Craniopharyngioma.
LeFever, Devon; Storey, Chris; Guthikonda, Bharat
2018-04-01
The orbitopterional approach provides an excellent combination of basal access and suprasellar access. This approach also allows for less brain retraction when resecting larger suprasellar tumors that are more superiorly projecting due to a more frontal and inferior trajectory. In this operative video, the authors thoroughly detail an orbitopterional craniotomy utilizing a one-piece modified orbitozygomatic technique. This technique involves opening the craniotomy through a standard pterional incision. The craniotomy is performed using the standard three burr holes of a pterional approach; however, the osteotomy is extended anteriorly through the frontal process of the zygomatic bone as well as through the supraorbital rim. In this operative video atlas, the authors illustrate the operative anatomy, as well as surgical strategy and techniques to resect a large suprasellar craniopharyngioma in a 4-year-old male. Other reasonable approach options for a lesion of this size would include a standard pterional approach, a supraorbital approach, or expanded endoscopic transsphenoidal approach. The lesion was quite high and thus, the supraorbital approach may confine access to the superior portion of the tumor. While recognizing that some groups may have chosen the endoscopic expanded transsphenoidal approach for this lesion, the authors describe more confidence in achieving the goal of a safe and maximal resection with the orbitopterional approach. The link to the video can be found at: https://youtu.be/eznsK16BzR8 .
Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo
2016-01-01
The importance of quality assurance in the software development process cannot be overemphasized, because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, process standardization, and improvement, among others. The aim of this work is to further investigate the software quality assurance practices of practitioners in Nigeria. While our previous work covered quality planning, adherence to standardized processes, and the inherent challenges, this work has been extended to include quality control, software process improvement, and membership in international quality standards organizations. It also makes a comparison based on a similar study carried out in Turkey. The goal is to generate more robust findings that can properly support decision making by the software community. A qualitative research approach, specifically questionnaire instruments, was applied to acquire data from software practitioners. In addition to the previous results, it was observed that quality assurance practices are quite neglected, and this can be the cause of low patronage. Moreover, software practitioners are neither aware of international standards organizations nor of the required process improvement techniques; as such, their claimed standards are not aligned with those of accredited bodies and are limited to their local experience and knowledge, which makes them questionable. The comparison with Turkey also yielded similar findings, making the results typical of developing countries. The research instrument used was tested for internal consistency using Cronbach's alpha and proved reliable. For the software industry in developing countries to grow strong and be a viable source of external revenue, software assurance practices have to be taken seriously, because their effect is evident in the final product. Moreover, quality frameworks and tools which require minimal time and cost are highly needed in these countries.
Hughes, Sarah A; Huang, Rongfu; Mahaffey, Ashley; Chelme-Ayala, Pamela; Klamerth, Nikolaus; Meshref, Mohamed N A; Ibrahim, Mohamed D; Brown, Christine; Peru, Kerry M; Headley, John V; Gamal El-Din, Mohamed
2017-11-01
There are several established methods for the determination of naphthenic acids (NAs) in waters associated with oil sands mining operations. Due to their highly complex nature, the measured concentration and composition of NAs vary depending on the method used. This study compared different common sample preparation techniques, analytical instrument methods, and analytical standards to measure NAs in groundwater and process water samples collected from an active oil sands operation. In general, the high- and ultrahigh-resolution methods, namely ultra-performance liquid chromatography time-of-flight mass spectrometry (UPLC-TOF-MS) and Orbitrap mass spectrometry (Orbitrap-MS), were within an order of magnitude of the Fourier transform infrared spectroscopy (FTIR) methods. The gas chromatography mass spectrometry (GC-MS) methods consistently had the highest NA concentrations and greatest standard error. Total NA concentration was not statistically different between sample preparation by solid phase extraction and liquid-liquid extraction. Calibration standards influenced quantitation results. This work provided a comprehensive understanding of the inherent differences in the various techniques available to measure NAs and hence the potential differences in measured amounts of NAs in samples. Results from this study will contribute to analytical method standardization for NA analysis in oil sands related water samples. Copyright © 2017 Elsevier Ltd. All rights reserved.
Dyer, Brandon A; Jenshus, Abriel; Mayadev, Jyoti S
2018-02-28
Radiation therapy (RT) plays a definitive role in locally advanced vulvar cancer, and in the adjuvant setting with high-risk postoperative features after wide local excision. There is significant morbidity associated with traditional, large RT fields using 2D or 3D techniques, and the use of intensity-modulated radiation therapy (IMRT) in vulvar cancer is increasing. However, there remains a paucity of technical information regarding the prevention of a marginal miss during the treatment planning process. The use of an integrated skin flash (ISF) during RT planning can account for anatomic variation and the intra- and interfraction motion seen during treatment. Herein we present the case of a patient with T1aN0M0, Stage IA vulvar cancer to illustrate the progressive vulvar swelling and lymphedema seen during treatment, and retrospectively evaluate the dosimetric effects of using an ISF RT plan vs standard RT planning techniques. Standard planning techniques to treat vulvar cancer patients with IMRT do not sufficiently account for the change in patient anatomy and can lead to a marginal miss. ISF is an RT planning technique that can decrease the risk of a marginal miss, and the technique is easily implemented during the planning stages of RT treatment. Furthermore, use of an ISF technique can improve vulvar clinical target volume coverage and plan homogeneity. Based on our experience, and this study, a 2-cm ISF is suggested to account for variations in daily clinical setup and changes in patient anatomy during treatment. Published by Elsevier Inc.
48 CFR 9904.413-50 - Techniques for application.
Code of Federal Regulations, 2010 CFR
2010-10-01
.... 9904.413-50 Section 9904.413-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.413-50 Techniques for application. (a) Assignment of actuarial gains and losses. (1) In accordance with the provisions of Cost Accounting Standard 9904.412...
MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG
Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong
2017-01-01
Reference electrode standardization technique (REST) has been increasingly acknowledged and applied as a re-referencing technique to transform actual multi-channel recordings to approximately zero-reference ones in the electroencephalography/event-related potentials (EEG/ERPs) community in recent years. However, an easy-to-use toolbox for re-referencing scalp EEG data to zero reference has been lacking. We have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version of REST is closely integrated into EEGLAB, a popular MATLAB toolbox for processing EEG data; the other is a batch version that is more convenient and efficient for experienced users. Both are designed to provide ease of use for novice researchers and flexibility for experienced ones. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, where detailed information including publications, comments, and documents on REST can also be found. An example of usage is given with comparative results of REST and average reference. We hope these user-friendly REST toolboxes make the relatively novel technique of REST easier to study, especially for applications in various EEG studies.
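As a rough illustration of what re-referencing does, the sketch below implements the average reference that the toolboxes use as a comparison baseline; REST itself additionally requires a lead-field matrix computed from a head model, which is omitted here. This is a minimal numpy sketch under assumed array shapes, not the toolbox's MATLAB code.

```python
import numpy as np

def average_reference(eeg):
    """Re-reference an EEG recording to the average reference.

    eeg : (n_channels, n_samples) array of potentials measured against
          the original (e.g., vertex) reference. Returns the same data
          expressed against the mean of all channels.
    """
    return eeg - eeg.mean(axis=0, keepdims=True)

# Toy example: 4 channels, 1000 samples of synthetic data.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((4, 1000))
reref = average_reference(eeg)
print(reref.mean(axis=0)[:5])  # ~0 at every sample, by construction
```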
Characterization of Metal Powders Used for Additive Manufacturing
Slotwinski, JA; Garboczi, EJ; Stutzman, PE; Ferraris, CF; Watson, SS; Peltz, MA
2014-01-01
Additive manufacturing (AM) techniques can produce complex, high-value metal parts, with potential applications as critical parts, such as those found in aerospace components. The production of AM parts with consistent and predictable properties requires input materials (e.g., metal powders) with known and repeatable characteristics, which in turn requires standardized measurement methods for powder properties. First, based on our previous work, we assess the applicability of current standardized methods for powder characterization for metal AM powders. Then we present the results of systematic studies carried out on two different powder materials used for additive manufacturing: stainless steel and cobalt-chrome. The characterization of these powders is important in NIST efforts to develop appropriate measurements and standards for additive materials and to document the properties of powders used in a NIST-led additive manufacturing material round robin. An extensive array of characterization techniques was applied to these two powders, in both virgin and recycled states. The physical techniques included laser diffraction particle size analysis, X-ray computed tomography for size and shape analysis, and optical and scanning electron microscopy. Techniques sensitive to structure and chemistry, including X-ray diffraction, energy dispersive analytical X-ray analysis using the X-rays generated during scanning electron microscopy, and X-ray photoelectron spectroscopy, were also employed. The results of these analyses show how virgin powder changes after being exposed to and recycled from one or more Direct Metal Laser Sintering (DMLS) additive manufacturing build cycles. In addition, these findings can give insight into the actual additive manufacturing process.
Optically Remote Noncontact Heart Rates Sensing Technique
NASA Astrophysics Data System (ADS)
Thongkongoum, W.; Boonduang, S.; Limsuwan, P.
2017-09-01
Heart rate monitoring via an optically remote, noncontact technique is reported in this research. A green laser (5 mW, 532±10 nm) was projected onto the left carotid artery. The reflected laser light on the screen carried the deviation of the interference patterns. The interference patterns were recorded by a digital camera. The recorded videos of the interference patterns were analysed frame by frame using two standard digital image processing (DIP) techniques: block matching (BM) and optical flow (OF). The region-of-interest (ROI) pixels within the interference patterns were analysed for periodic changes of the interference patterns due to the heart pumping action. Results from both the BM and OF techniques were compared with a reference medical heart rate monitoring device, a contact measurement using the pulse transit technique. The result obtained from the BM technique was 74.67 bpm (beats per minute) and from the OF technique 75.95 bpm. Compared with the reference value of 75.43±1 bpm, the errors were found to be 1.01% and 0.69%, respectively.
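The abstract does not give the spectral details of the BM/OF post-processing; the sketch below shows one common way to turn a frame-by-frame motion trace into a heart rate, via the dominant FFT peak in a physiological band. The function name, band limits, and synthetic signal are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def heart_rate_bpm(displacement, fps):
    """Estimate heart rate from a frame-by-frame motion signal.

    displacement : 1-D array, e.g. the mean block-matching shift (pixels)
                   of the ROI in each video frame.
    fps          : camera frame rate in frames per second.
    """
    x = displacement - displacement.mean()       # remove the DC offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fps)
    band = (freqs > 0.7) & (freqs < 3.0)         # ~42-180 bpm band
    f_peak = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * f_peak

# Synthetic check: a 1.25 Hz pulsation (75 bpm) sampled at 30 fps.
t = np.arange(0, 30, 1 / 30)
sig = np.sin(2 * np.pi * 1.25 * t) + 0.2 * np.random.randn(t.size)
print(round(heart_rate_bpm(sig, fps=30), 1))  # ~75.0 bpm
```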
NBS (National Bureau of Standards): Materials measurements
NASA Technical Reports Server (NTRS)
Manning, J. R.
1984-01-01
Work in support of NASA's Microgravity Science and Applications Program is described. The results of the following three tasks are given in detail: (1) surface tensions and their variations with temperature and impurities; (2) convection during unidirectional solidification; and (3) measurement of high temperature thermophysical properties. Tasks 1 and 2 were directed toward determining how the reduced gravity obtained in space flight can affect convection and solidification processes. Emphasis in task 3 was on development of levitation and containerless processing techniques which can be applied in space flight to provide thermodynamic measurements of reactive materials.
1982-01-01
[Garbled bibliographic index entry. Recoverable report titles include: "An In-Depth Compliance and Performance Analysis of the RBC (Rotating Biological Contactor)" for wastewaters in Hoboken and North Bergen, New Jersey; "Inhibition of Nitrification by Chromium in a Biodisc System"; and "Scale-Up and Process Analysis Techniques for Plastic..."; analyses were performed in accordance with "Standard Methods for the Examination of Water and Wastewater" or "Methods for Chemical Analysis of Water and Wastes".]
Overlap junctions for high coherence superconducting qubits
NASA Astrophysics Data System (ADS)
Wu, X.; Long, J. L.; Ku, H. S.; Lake, R. E.; Bal, M.; Pappas, D. P.
2017-07-01
Fabrication of sub-micron Josephson junctions is demonstrated using standard processing techniques for high-coherence, superconducting qubits. These junctions are made in two separate lithography steps with normal-angle evaporation. Most significantly, this work demonstrates that it is possible to achieve high coherence with junctions formed on aluminum surfaces cleaned in situ by Ar plasma before junction oxidation. This method eliminates the angle-dependent shadow masks typically used for small junctions. Therefore, this is conducive to the implementation of typical methods for improving margins and yield using conventional CMOS processing. The current method uses electron-beam lithography and an additive process to define the top and bottom electrodes. Extension of this work to optical lithography and subtractive processes is discussed.
Development of a manualized protocol of massage therapy for clinical trials in osteoarthritis.
Ali, Ather; Kahn, Janet; Rosenberger, Lisa; Perlman, Adam I
2012-10-04
Clinical trial design of manual therapies may be especially challenging as techniques are often individualized and practitioner-dependent. This paper describes our methods in creating a standardized Swedish massage protocol tailored to subjects with osteoarthritis of the knee while remaining respectful of the individualized nature of massage therapy, as well as the implementation of this protocol in two randomized clinical trials. Manualization involved collaboration between methodologic and clinical experts, with the explicit goals of creating a reproducible semi-structured protocol for massage therapy, while allowing some latitude for therapists' clinical judgment and maintaining consistency with a prior pilot study. The manualized protocol addressed identical specified body regions with distinct 30- and 60-min protocols, using standard Swedish strokes. Each protocol specifies the time allocated to each body region. The manualized 30- and 60-min protocols were implemented in a dual-site 24-week randomized dose-finding trial in patients with osteoarthritis of the knee, and the protocol is currently being implemented in a three-site 52-week efficacy trial of manualized Swedish massage therapy. In the dose-finding study, therapists adhered to the protocols and significant treatment effects were demonstrated. The massage protocol was manualized using standard techniques and made flexible for individual practitioner and subject needs. The protocol has been applied in two randomized clinical trials. This manualized Swedish massage protocol has real-world utility and can be readily utilized in both research and clinical settings. Clinicaltrials.gov NCT00970008 (18 August 2009).
Business Model for the Security of a Large-Scale PACS, Compliance with ISO/27002:2013 Standard.
Gutiérrez-Martínez, Josefina; Núñez-Gaona, Marco Antonio; Aguirre-Meneses, Heriberto
2015-08-01
Data security is a critical issue in an organization; a proper information security management (ISM) is an ongoing process that seeks to build and maintain programs, policies, and controls for protecting information. A hospital is one of the most complex organizations, where patient information has not only legal and economic implications but, more importantly, an impact on the patient's health. Imaging studies include medical images, patient identification data, and proprietary information of the study; these data are contained in the storage device of a PACS. This system must preserve the confidentiality, integrity, and availability of patient information. There are techniques such as firewalls, encryption, and data encapsulation that contribute to the protection of information. In addition, the Digital Imaging and Communications in Medicine (DICOM) standard and the requirements of the Health Insurance Portability and Accountability Act (HIPAA) regulations are also used to protect the patient clinical data. However, these techniques are not systematically applied to the picture and archiving and communication system (PACS) in most cases and are not sufficient to ensure the integrity of the images and associated data during transmission. The ISO/IEC 27001:2013 standard has been developed to improve the ISM. Currently, health institutions lack effective ISM processes that enable reliable interorganizational activities. In this paper, we present a business model that accomplishes the controls of ISO/IEC 27002:2013 standard and criteria of security and privacy from DICOM and HIPAA to improve the ISM of a large-scale PACS. The methodology associated with the model can monitor the flow of data in a PACS, facilitating the detection of unauthorized access to images and other abnormal activities.
Knowledge Management Orientation: An Innovative Perspective to Hospital Management.
Ghasemi, Matina; Ghadiri Nejad, Mazyar; Bagzibagli, Kemal
2017-12-01
When innovation is considered as a new project in hospitals, all the standard steps of project management should be followed in its execution. This study investigated the validation of a new set of measures that provide a procedure for knowledge management-oriented innovation to enrich the hospital management system. The relation between innovation and all the knowledge management areas, as the main constructs of project management, was illustrated by referring to the standard steps of project management and previous studies. Through consultations and meetings with a committee of professional project managers, a questionnaire was developed to measure ten knowledge management areas in the hospital innovation process. Additionally, a group of expert hospital managers was invited to comment on the applicability of the questionnaire by considering whether the items are practically measurable in hospitals. Close-ended, Likert-type scale items, consisting of ten sections, were developed based on the project management body of knowledge through the Delphi technique. The instrument enables managers to evaluate a hospital's situation and determine whether the organization follows knowledge management standards in its innovation process. In a pilot study, confirmatory factor analysis and exploratory factor analysis were conducted to ensure the validity and reliability of the measurement items. The developed items have the potential to help hospital managers deliver new products and services successfully based on standard procedures in their organization. In all innovation processes, the knowledge management areas and their standard steps help hospital managers through a new tool in questionnaire format.
NASA Astrophysics Data System (ADS)
Morton, Kenneth D., Jr.; Torrione, Peter A.; Collins, Leslie
2011-05-01
Laser induced breakdown spectroscopy (LIBS) can provide rapid, minimally destructive chemical analysis of substances with the benefit of little to no sample preparation. Therefore, LIBS is a viable technology for the detection of substances of interest in near real-time fielded remote sensing scenarios. Of particular interest to military and security operations is the detection of explosive residues on various surfaces. It has been demonstrated that LIBS is capable of detecting such residues; however, the surface or substrate on which the residue is present can alter the observed spectra. Standard chemometric techniques such as principal components analysis and partial least squares discriminant analysis have previously been applied to explosive residue detection; however, the classification techniques developed on such data perform best against residue/substrate pairs that were included in model training and do not perform well when the residue/substrate pairs are not in the training set. Specifically, residues in the training set may not be correctly detected if they are presented on a previously unseen substrate. In this work, we explicitly model LIBS spectra resulting from the residue and substrate to attempt to separate the response from each of the two components. This separation process is performed jointly with classifier design to ensure that the resulting classifier is able to detect residues of interest without being confused by variations in the substrates. We demonstrate that the proposed classification algorithm provides improved robustness to variations in substrate compared to standard chemometric techniques for residue detection.
NASA Astrophysics Data System (ADS)
Neji, N.; Jridi, M.; Alfalou, A.; Masmoudi, N.
2016-02-01
The double random phase encryption (DRPE) method is a well-known all-optical architecture which has many advantages, especially in terms of encryption efficiency. However, the method presents some vulnerabilities against attacks and requires a large quantity of information to encode the complex output plane. In this paper, we present an innovative hybrid technique to enhance the performance of the DRPE method in terms of compression and encryption. An optimized simultaneous compression and encryption method is applied to the real and imaginary components of the DRPE output plane. The compression and encryption technique consists of an innovative randomized arithmetic coder (RAC) that can compress the DRPE output planes well and at the same time enhance the encryption. The RAC is obtained by an appropriate selection of some conditions in the binary arithmetic coding (BAC) process and by using a pseudo-random number to encrypt the corresponding outputs. The proposed technique has the capability to process video content and to be standard compliant with modern video coding standards such as H.264 and HEVC. Simulations demonstrate that the proposed crypto-compression system overcomes the drawbacks of the DRPE method: the cryptographic properties of DRPE are enhanced while a compression rate of one-sixth can be achieved. FPGA implementation results show the high performance of the proposed method in terms of maximum operating frequency, hardware occupation, and dynamic power consumption.
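For readers unfamiliar with DRPE, the sketch below shows the classic double random phase encryption in numpy. It illustrates only the optical-encryption stage whose complex output plane the paper's RAC then compresses, not the RAC itself; generating the phase masks from a seeded RNG is an assumption made for reproducibility.

```python
import numpy as np

def drpe_encrypt(img, seed=42):
    """Classic double random phase encryption (numpy sketch).

    The image is multiplied by a random phase mask in the spatial
    domain, Fourier transformed, multiplied by a second mask in the
    frequency domain, and inverse transformed. The complex result is
    the output plane a later compression stage would have to encode.
    """
    rng = np.random.default_rng(seed)
    phi1 = np.exp(2j * np.pi * rng.random(img.shape))  # input-plane mask
    phi2 = np.exp(2j * np.pi * rng.random(img.shape))  # Fourier-plane mask
    return np.fft.ifft2(np.fft.fft2(img * phi1) * phi2)

def drpe_decrypt(cipher, seed=42):
    rng = np.random.default_rng(seed)                  # regenerate masks
    phi1 = np.exp(2j * np.pi * rng.random(cipher.shape))
    phi2 = np.exp(2j * np.pi * rng.random(cipher.shape))
    return np.fft.ifft2(np.fft.fft2(cipher) * np.conj(phi2)) * np.conj(phi1)

img = np.random.rand(64, 64)
restored = drpe_decrypt(drpe_encrypt(img))
print(np.allclose(np.abs(restored), img))  # True: lossless round trip
```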
Li, Chen; Habler, Gerlinde; Baldwin, Lisa C; Abart, Rainer
2018-01-01
The focused ion beam (FIB) sample preparation technique in plan-view geometry allows direct correlation of atomic-structure studies via transmission electron microscopy with micrometer-scale property measurements. However, one main technical difficulty is that a large amount of material must be removed underneath the specimen. Furthermore, directly monitoring the milling process is difficult unless very large material volumes surrounding the TEM specimen site are removed. In this paper, a new cutting geometry is introduced for FIB lift-out sample preparation with plan-view geometry. Firstly, an "isolated" cuboid-shaped specimen is cut out, leaving a "bridge" connecting it with the bulk material. Subsequently, the two long sides of the "isolated" cuboid are wedged, forming a triangular prism shape. A micromanipulator needle is used for in-situ transfer of the specimen to a FIB TEM grid, which has been mounted parallel with the specimen surface using a simple custom-made sample slit. Finally, the grid is transferred to the standard FIB grid holder for final thinning with standard procedures. This new cutting geometry provides clear viewing angles for monitoring the milling process, which solves the difficulty of judging whether the specimen has been entirely detached from the bulk material, with the least possible damage to the surrounding material. With an improved success rate and efficiency, this plan-view FIB lift-out specimen preparation technique should find wide application in materials science. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Humayun, Q.; Hashim, U.; Ruzaidi, C. M.; Noriman, N. Z.
2017-03-01
The fabrication and characterization of a sensitive and selective fluid delivery system for nano lab-on-a-chip applications remains a challenging task to date. This paper is one of the initial attempts to resolve this challenge by using a simple, cost-effective, and reproducible technique for patterning microchannel structures on SU-8 resist. The objective of the research is to design, fabricate, and characterize a polydimethylsiloxane (PDMS) microchannel. The proposed device mask was designed initially using AutoCAD software, and the design was then transferred to a transparency sheet and to a commercial chrome mask for a better photomasking process. Standard photolithography coupled with wet chemical etching was used for the fabrication of the proposed microchannel. This is a low-cost fabrication technique for forming microchannel structures in resist. The fabrication process starts with microchannel formation; the structure is then transferred to a PDMS substrate, the microchannel structure is cured and released from the mold, and the cured mold is bonded to a glass substrate by a plasma oxidation bonding process. The surface morphology was characterized by high-power microscopy (HPM) and the structure was characterized by a Hawk 3D surface nanoprofiler. The next part of the research will focus on device testing and validation with real biological samples using a simple manual injection technique.
New integration concept of PIN photodiodes in 0.35μm CMOS technologies
NASA Astrophysics Data System (ADS)
Jonak-Auer, I.; Teva, J.; Park, J. M.; Jessenig, S.; Rohrbacher, M.; Wachmann, E.
2012-06-01
We report on a new and very cost-effective way to integrate PIN photodetectors into a standard CMOS process. Starting with lowly p-doped (intrinsic) EPI, we need just one additional mask and ion implantation in order to provide doping concentrations very similar to standard CMOS substrates in areas outside the photoactive regions. Thus full functionality of the standard CMOS logic can be guaranteed, while the photodetectors benefit greatly from the low doping concentrations of the intrinsic EPI. The major advantage of this integration concept is that the complete modularity of the CMOS process remains untouched by the implementation of PIN photodiodes. Functionality of the implanted region as host of logic components was confirmed by electrical measurements of relevant standard transistors as well as ESD protection devices. We also succeeded in establishing an EPI deposition process in the austriamicrosystems 200 mm wafer fabrication which guarantees the formation of very lowly p-doped intrinsic layers, which major semiconductor vendors could not provide. With our EPI deposition process we achieve doping levels as low as 1×10^12/cm^3. In order to maintain those doping levels during CMOS processing we employed special surface protection techniques. After complete CMOS processing, doping concentrations were about 4×10^13/cm^3 at the EPI surface, while the bulk EPI kept its original low doping concentrations. Photodiode parameters could be further improved by bottom antireflective coatings and a special implant to reduce dark currents. For 100×100 μm^2 photodiodes in 20 μm thick intrinsic EPI on highly p-doped substrates, we achieved responsivities of 0.57 A/W at λ=675 nm, capacitances of 0.066 pF, and dark currents of 0.8 pA at 2 V reverse voltage.
Hanke, Alexander T; Tsintavi, Eleni; Ramirez Vazquez, Maria Del Pilar; van der Wielen, Luuk A M; Verhaert, Peter D E M; Eppink, Michel H M; van de Sandt, Emile J A X; Ottens, Marcel
2016-09-01
Knowledge-based development of chromatographic separation processes requires efficient techniques to determine the physicochemical properties of the product and of the impurities to be removed. These characterization techniques are usually divided into approaches that determine molecular properties, such as charge, hydrophobicity, and size, or molecular interactions with auxiliary materials, commonly in the form of adsorption isotherms. In this study we demonstrate the application of a three-dimensional liquid chromatography approach to a clarified cell homogenate containing a therapeutic enzyme. Each separation dimension determines a molecular property relevant to the chromatographic behavior of each component. Matching the peaks across the different separation dimensions and against a high-resolution reference chromatogram makes it possible to assign the determined parameters to pseudo-components and to identify the most promising technique for the removal of each impurity. More detailed process design using mechanistic models requires isotherm parameters. For this purpose, the second dimension consists of multiple linear gradient separations on columns in a high-throughput screening compatible format, which allow regression of isotherm parameters with an average standard error of 8%. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1283-1291, 2016. © 2016 American Institute of Chemical Engineers.
Evaluation of Improved Pushback Forecasts Derived from Airline Ground Operations Data
NASA Technical Reports Server (NTRS)
Carr, Francis; Theis, Georg; Feron, Eric; Clarke, John-Paul
2003-01-01
Accurate and timely predictions of airline pushbacks can potentially lead to improved performance of automated decision-support tools for airport surface traffic, thus reducing the variability and average duration of costly airline delays. One factor which affects the realization of these benefits is the level of uncertainty inherent in the turn processes. To characterize this inherent uncertainty, three techniques are developed for predicting time-to-go until pushback as a function of available ground-time; elapsed ground-time; and the status (not-started/in-progress/completed) of individual turn processes (cleaning, fueling, etc.). These techniques are tested against a large and detailed dataset covering approximately 10^4 real-world turn operations obtained through collaboration with Deutsche Lufthansa AG. Even after the dataset is filtered to obtain a sample of turn operations with minimal uncertainty, the standard deviation of forecast error for all three techniques is lower-bounded away from zero, indicating that turn operations have a significant stochastic component. This lower-bound result shows that decision-support tools must be designed to incorporate robust mechanisms for coping with pushback demand stochasticity, rather than treating the pushback demand process as a known deterministic input.
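A minimal sketch of the kind of regression such a technique might use is shown below, with entirely synthetic data; the feature set, model choice, and coefficients are illustrative assumptions, not Lufthansa data or the paper's actual predictors. The persistent residual spread mimics the irreducible stochastic component the study reports.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical features for each observation of a turn in progress:
# [available ground time (min), elapsed ground time (min),
#  and done/not-done flags for three turn processes].
rng = np.random.default_rng(1)
n = 500
X = np.column_stack([
    rng.uniform(45, 120, n),        # available ground time
    rng.uniform(0, 90, n),          # elapsed ground time
    rng.integers(0, 2, (n, 3)),     # cleaning / fueling / boarding done
])
# Synthetic time-to-go with irreducible noise built in.
y = (0.6 * X[:, 0] - 0.5 * X[:, 1]
     - 4.0 * X[:, 2:].sum(axis=1) + rng.normal(0, 5, n))

model = LinearRegression().fit(X, y)
resid = y - model.predict(X)
print("forecast-error std (min):", resid.std().round(2))  # bounded away from 0
```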
Potential hazards in smoke-flavored fish
NASA Astrophysics Data System (ADS)
Lin, Hong; Jiang, Jie; Li, Donghua
2008-08-01
Smoking is widely used in fish processing for color and flavor. Smoke flavorings have evolved as a successful alternative to traditional smoking. The hazards of fish products treated by the liquid-smoking process are discussed in this review. The smoke flavoring is one important ingredient in smoke-flavored fish. This paper gives the definition of smoke flavorings and the hazard of polycyclic aromatic hydrocarbon (PAH) residues in the smoke flavorings on the market. It also gives an assessment of chemical hazards such as carcinogenic PAHs, especially benzo[a]pyrene, as well as biological hazards such as Listeria monocytogenes, Clostridium botulinum, histamine, and parasites in smoke-flavored fish. The limitations in regulations and standards are discussed. Smoke-flavored fish have a lower content of PAHs than fish smoked by traditional techniques, provided the PAH residues in smoke flavorings are controlled by regulations or standards.
Ferrero, A; Campos, J; Rabal, A M; Pons, A; Hernanz, M L; Corróns, A
2011-09-26
The Bidirectional Reflectance Distribution Function (BRDF) is essential to characterize an object's reflectance properties. This function depends both on the various illumination-observation geometries as well as on the wavelength. As a result, the comprehensive interpretation of the data becomes rather complex. In this work we assess the use of the multivariable analysis technique of Principal Components Analysis (PCA) applied to the experimental BRDF data of a ceramic colour standard. It will be shown that the result may be linked to the various reflection processes occurring on the surface, assuming that the incoming spectral distribution is affected by each one of these processes in a specific manner. Moreover, this procedure facilitates the task of interpolating a series of BRDF measurements obtained for a particular sample. © 2011 Optical Society of America
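A minimal sketch of the PCA step, assuming the BRDF measurements are arranged as a geometry-by-wavelength matrix (the sizes here are invented), might look like this; the principal components then play the role of the spectral signatures of the individual reflection processes discussed above.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical BRDF data: rows = illumination/observation geometries,
# columns = spectral BRDF values at each wavelength.
rng = np.random.default_rng(0)
n_geometries, n_wavelengths = 120, 31      # e.g. 400-700 nm in 10 nm steps
spectra = rng.random((n_geometries, n_wavelengths))

pca = PCA(n_components=3)
scores = pca.fit_transform(spectra)        # geometry-dependent weights
components = pca.components_               # candidate spectral signatures
print(pca.explained_variance_ratio_)       # variance captured per component
```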
Idbeaa, Tarik; Abdul Samad, Salina; Husain, Hafizah
2016-01-01
This paper presents a novel secure and robust steganographic technique in the compressed video domain, namely embedding-based byte differencing (EBBD). Unlike most current video steganographic techniques, which take into account only the intra frames for data embedding, the proposed EBBD technique aims to hide information in both intra and inter frames. The information is embedded into a compressed video by simultaneously manipulating the quantized AC coefficients (AC-QTCs) of the luminance components of the frames during the MPEG-2 encoding process. Later, during the decoding process, the embedded information can be detected and extracted completely. Furthermore, the EBBD basically deals with two security concepts: data encryption and data concealing. Hence, during the embedding process, secret data is encrypted using the simplified data encryption standard (S-DES) algorithm to provide better security to the implemented system. The security of the method lies in selecting candidate AC-QTCs within each non-overlapping 8 × 8 sub-block using a pseudo-random key. The basic performance of this steganographic technique was verified through experiments on various existing MPEG-2 encoded videos over a wide range of embedded payload rates. Overall, the experimental results verify the excellent performance of the proposed EBBD with a better trade-off in terms of imperceptibility and payload, as compared with previous techniques, while at the same time ensuring minimal bitrate increase and negligible degradation of PSNR values.
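The paper's exact embedding rule is not reproduced here; the following toy sketch only illustrates the general idea of hiding (already encrypted) bits in pseudo-randomly selected quantized AC coefficients by parity adjustment. The key handling and coefficient layout are assumptions.

```python
import numpy as np

def embed_bits(ac_coeffs, bits, key=1234):
    """Toy parity embedding into quantized AC coefficients.

    ac_coeffs : 1-D int array of nonzero quantized AC coefficients
                from one 8x8 luminance sub-block.
    bits      : iterable of 0/1 payload bits (in the paper these would
                already be S-DES encrypted before this stage).
    key       : seed of the pseudo-random candidate selection.
    """
    coeffs = ac_coeffs.copy()
    rng = np.random.default_rng(key)
    candidates = rng.choice(len(coeffs), size=len(bits), replace=False)
    for pos, bit in zip(candidates, bits):
        if (coeffs[pos] & 1) != bit:            # force parity to match bit
            coeffs[pos] += 1 if coeffs[pos] > 0 else -1
    return coeffs

def extract_bits(coeffs, n_bits, key=1234):
    rng = np.random.default_rng(key)            # same key -> same positions
    candidates = rng.choice(len(coeffs), size=n_bits, replace=False)
    return [int(coeffs[pos] & 1) for pos in candidates]

block = np.array([12, -7, 5, 3, -2, 2, 1, -1])
stego = embed_bits(block, [1, 0, 1])
print(extract_bits(stego, 3))  # [1, 0, 1]
```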
Integrated Broadband Quantum Cascade Laser
NASA Technical Reports Server (NTRS)
Mansour, Kamjou (Inventor); Soibel, Alexander (Inventor)
2016-01-01
A broadband, integrated quantum cascade laser is disclosed, comprising ridge waveguide quantum cascade lasers formed by applying standard semiconductor process techniques to a monolithic structure of alternating layers of claddings and active region layers. The resulting ridge waveguide quantum cascade lasers may be individually controlled by independent voltage potentials, resulting in control of the overall spectrum of the integrated quantum cascade laser source. Other embodiments are described and claimed.
Using machine-learning methods to analyze economic loss function of quality management processes
NASA Astrophysics Data System (ADS)
Dzedik, V. A.; Lontsikh, P. A.
2018-05-01
During analysis of quality management systems, their economic component is often analyzed insufficiently. To overcome this issue, the concept of economic loss functions needs to be taken beyond tolerance thinking and addressed directly. Input data about economic losses in processes have a complex form, so using standard tools to solve this problem is difficult. The use of machine learning techniques allows one to obtain precise models of the economic loss function from even the most complex input data. The results of such an analysis contain data about the true efficiency of a process and can be used to make investment decisions.
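A minimal sketch of the idea, under the assumption that losses are observed as a function of process deviation and that the true loss curve is asymmetric (synthetic data throughout, not the paper's dataset or model):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical data: process deviation from target vs. observed loss.
# Unlike the symmetric quadratic assumed under tolerance thinking, the
# real loss curve may be asymmetric; a boosted-tree model recovers it
# directly from the data.
rng = np.random.default_rng(7)
deviation = rng.uniform(-3, 3, (2000, 1))
loss = np.where(deviation[:, 0] > 0,
                5.0 * deviation[:, 0] ** 2,     # costly on the high side
                1.0 * deviation[:, 0] ** 2) + rng.normal(0, 0.5, 2000)

model = GradientBoostingRegressor().fit(deviation, loss)
print(model.predict([[1.0], [-1.0]]))  # asymmetry learned from the data
```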
Error-proofing test system of industrial components based on image processing
NASA Astrophysics Data System (ADS)
Huang, Ying; Huang, Tao
2018-05-01
Due to rising standards of industrial precision and accuracy, conventional manual testing fails to satisfy enterprises' test standards, so digital image processing techniques should be utilized to gather and analyze information on the surface of industrial components in order to achieve the purpose of testing. To test the installation of parts on an automotive engine, this paper employs a camera to capture images of the components. After these images are preprocessed, including denoising, an image processing algorithm relying on the flood-fill algorithm is used to test the installation of the components. The results prove that this system has very high test accuracy.
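A hedged sketch of such a check using OpenCV's floodFill is given below; the file name, seed point, and area tolerance are invented placeholders, not values from the paper.

```python
import cv2
import numpy as np

# Hypothetical check of an engine mounting hole: flood-fill from a seed
# inside the hole and compare the filled area against a tolerance.
img = cv2.imread("component.png", cv2.IMREAD_GRAYSCALE)  # assumed image
img = cv2.GaussianBlur(img, (5, 5), 0)                   # denoise first

# floodFill requires a mask 2 pixels larger than the image in each axis.
mask = np.zeros((img.shape[0] + 2, img.shape[1] + 2), np.uint8)
seed = (120, 80)                  # (x, y) inside the expected feature
num_filled, _, _, rect = cv2.floodFill(
    img.copy(), mask, seed, newVal=255, loDiff=10, upDiff=10)

expected_area = 1500              # pixels, from a reference part
if abs(num_filled - expected_area) > 0.1 * expected_area:
    print("component rejected: feature area out of tolerance", num_filled)
else:
    print("component passed; bounding box of filled region:", rect)
```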
Aerospace Environmental Technology Conference: Executive summary
NASA Technical Reports Server (NTRS)
Whitaker, A. F. (Editor)
1995-01-01
The mandated elimination of CFC's, Halons, TCA, and other ozone depleting chemicals and specific hazardous materials has required changes and new developments in aerospace materials and processes. The aerospace industry has been involved for several years in providing product substitutions, redesigning entire production processes, and developing new materials that minimize or eliminate damage to the environment. These activities emphasize replacement cleaning solvents and their application verifications, compliant coatings including corrosion protection systems, and removal techniques, chemical propulsion effects on the environment, and the initiation of modifications to relevant processing and manufacturing specifications and standards. The papers from this conference are being published in a separate volume as NASA CP-3298.
Quaranta, Alessandro; DʼIsidoro, Orlando; Bambini, Fabrizio; Putignano, Angelo
2016-02-01
To compare the available potential bone-implant contact (PBIC) area of standard and short dental implants by micro-computed tomography (μCT) assessment. Three short implants with different diameters (4.5 × 6 mm, 4.1 × 7 mm, and 4.1 × 6 mm) and 2 standard implants (3.5 × 10 mm and 3.3 × 9 mm) with diverse design and surface features were scanned with μCT. Cross-sectional images were obtained. Image data were manually processed to find the plane that corresponds to the most coronal contact point between the crestal bone and implant. The available PBIC was calculated for each sample. Later on, the cross-sectional slices were processed by 3-dimensional (3D) software, and 3D images of each sample were used for descriptive analysis and to display the microtopography and macrotopography. The wide-diameter short implant (4.5 × 6 mm) showed the highest PBIC value (210.89 mm²), followed by the standard implants (178.07 mm² and 185.37 mm²) and the short implants (130.70 mm² and 110.70 mm²). Wide-diameter short implants show a surface area comparable with that of standard implants. Micro-CT analysis is a promising technique to evaluate surface area in dental implants with different macrodesigns, microdesigns, and surface features.
Laser ablation ICP-MS applications using the timescales of geologic and biologic processes
NASA Astrophysics Data System (ADS)
Ridley, W. I.
2003-04-01
Geochemists commonly examine geologic processes on timescales of 10^4--10^9 years, and accept that often age relations, e.g., chemical zoning in minerals, can only be measured in a relative sense. The progression of a geologic process that involves geochemical changes may be assessed using trace element microbeam techniques, because the textural, and therefore spatial context, of the analytical scheme can be preserved. However, quantification requires appropriate calibration standards. Laser ablation ICP-MS (LA-ICP-MS) is proving particularly useful now that appropriate standards are becoming available. For instance, trace element zoning patterns in primary sulfides (e.g., pyrite, sphalerite, chalcopyrite, galena) and secondary phases can be inverted to examine relative changes in fluid composition during cycles of hydrothermal mineralization. In turn such information provides insights into fluid sources, migration pathways and depositional processes. These studies have only become possible with the development of appropriate sulfide calibration standards. Another example, made possible with the development of appropriate silicate calibration standards, is the quantitative spatial mapping of REE variations in amphibolite-grade garnets. The recognition that the trace and major elements are decoupled provides a better understanding of the various sources of elements during metamorphic re-equilibration. There is also a growing realization that LA-ICP-MS has potential in biochemical studies, and geochemists have begun to turn their attention in this direction, working closely with biologists. Unlike many geologic processes, the timescales of biologic processes are measured in years to centuries and are frequently amenable to absolute dating. Examples that can be cited where LA-ICP-MS has been applied include annual trace metal variations in tree rings, corals, teeth, bones, bird feathers and various animal vibrissae (sea lion, walrus, wolf). The aim of such studies is to correlate trace element variations with changes in environmental variables. Such studies are proving informative in climate change and habitat management. Again, such variations have been quantified with the availability of appropriate organic, carbonate and phosphate calibration standards.
Cell-Detection Technique for Automated Patch Clamping
NASA Technical Reports Server (NTRS)
McDowell, Mark; Gray, Elizabeth
2008-01-01
A unique and customizable machine-vision and image-data-processing technique has been developed for use in automated identification of cells that are optimal for patch clamping. [Patch clamping (in which patch electrodes are pressed against cell membranes) is an electrophysiological technique widely applied for the study of ion channels, and of membrane proteins that regulate the flow of ions across the membranes. Patch clamping is used in many biological research fields such as neurobiology, pharmacology, and molecular biology.] While there exist several hardware techniques for automated patch clamping of cells, very few of those techniques incorporate machine vision for locating cells that are ideal subjects for patch clamping. In contrast, the present technique is embodied in a machine-vision algorithm that, in practical application, enables the user to identify good and bad cells for patch clamping in an image captured by a charge-coupled-device (CCD) camera attached to a microscope, within a processing time of one second. Hence, the present technique can save time, thereby increasing efficiency and reducing cost. The present technique involves the utilization of cell-feature metrics to accurately make decisions on the degree to which individual cells are "good" or "bad" candidates for patch clamping. These metrics include position coordinates (x,y) in the image plane, major-axis length, minor-axis length, area, elongation, roundness, smoothness, angle of orientation, and degree of inclusion in the field of view. The present technique does not require any special hardware beyond commercially available, off-the-shelf patch-clamping hardware: a standard patch-clamping microscope system with an attached CCD camera, a personal computer with an image-data-processing board, and some experience in utilizing image-data-processing software are all that are needed. A cell image is first captured by the microscope CCD camera and image-data-processing board, then the image data are analyzed by software that implements the present machine-vision technique. This analysis results in the identification of cells that are "good" candidates for patch clamping. Once a "good" cell is identified, a patch clamp can be effected by an automated patch-clamping apparatus or by a human operator. This technique has been shown to enable reliable identification of "good" and "bad" candidate cells for patch clamping. The ultimate goal in further development of this technique is to combine artificial-intelligence processing with instrumentation and controls in order to produce a complete "turnkey" automated patch-clamping system capable of accurately and reliably patch clamping cells with minimum intervention by a human operator. Moreover, this technique can be adapted to virtually any cellular-analysis procedure that includes repetitive operation of microscope hardware by a human.
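A rough equivalent of the metric-extraction step using scikit-image is sketched below; the segmentation method, size filter, and metric definitions (e.g., roundness as 4πA/P²) are assumptions standing in for the original algorithm's details.

```python
import numpy as np
from skimage import filters, measure

def cell_metrics(frame):
    """Compute per-cell feature metrics from one microscope frame.

    Mirrors the kind of metrics listed above (position, axis lengths,
    area, elongation, roundness, orientation); the thresholds used to
    score "good" vs "bad" cells are application-specific assumptions.
    """
    binary = frame > filters.threshold_otsu(frame)   # segment cells
    labels = measure.label(binary)
    metrics = []
    for r in measure.regionprops(labels):
        if r.area < 50:                              # discard debris
            continue
        roundness = 4 * np.pi * r.area / (r.perimeter ** 2 + 1e-9)
        metrics.append({
            "centroid": r.centroid,                  # (y, x) position
            "major_axis": r.major_axis_length,
            "minor_axis": r.minor_axis_length,
            "area": r.area,
            "elongation": r.major_axis_length / (r.minor_axis_length + 1e-9),
            "roundness": roundness,
            "orientation": r.orientation,
        })
    return metrics
```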
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solaimani, Mohiuddin; Iftekhar, Mohammed; Khan, Latifur
Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviated patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomalies. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to get higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint with higher accuracy by implementing a cluster-based technique to detect sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.
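A single-node sketch of the window-based statistical idea is shown below; the real framework distributes this over Apache Spark and adds a cluster-based test, neither of which is reproduced here. The window size and threshold are assumptions.

```python
import numpy as np
from collections import deque

class WindowAnomalyDetector:
    """Window-based z-score detector for one streaming metric."""

    def __init__(self, window=100, threshold=3.0):
        self.buf = deque(maxlen=window)   # recent history of the metric
        self.threshold = threshold

    def update(self, x):
        is_anomaly = False
        if len(self.buf) >= 10:           # wait for a minimal baseline
            mu = np.mean(self.buf)
            sigma = np.std(self.buf) + 1e-9
            is_anomaly = abs(x - mu) / sigma > self.threshold
        self.buf.append(x)
        return is_anomaly

det = WindowAnomalyDetector()
stream = list(np.random.normal(50, 2, 300)) + [95]  # e.g. VM CPU %
flags = [det.update(v) for v in stream]
print(flags[-1])  # True: the spike deviates from the recent window
```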
Accurate Rapid Lifetime Determination on Time-Gated FLIM Microscopy with Optical Sectioning.
Silva, Susana F; Domingues, José Paulo; Morgado, António Miguel
2018-01-01
Time-gated fluorescence lifetime imaging microscopy (FLIM) is a powerful technique to assess the biochemistry of cells and tissues. When applied to living thick samples, it is hampered by the lack of optical sectioning and the need of acquiring many images for an accurate measurement of fluorescence lifetimes. Here, we report on the use of processing techniques to overcome these limitations, minimizing the acquisition time, while providing optical sectioning. We evaluated the application of the HiLo and the rapid lifetime determination (RLD) techniques for accurate measurement of fluorescence lifetimes with optical sectioning. HiLo provides optical sectioning by combining the high-frequency content from a standard image, obtained with uniform illumination, with the low-frequency content of a second image, acquired using structured illumination. Our results show that HiLo produces optical sectioning on thick samples without degrading the accuracy of the measured lifetimes. We also show that instrument response function (IRF) deconvolution can be applied with the RLD technique on HiLo images, improving greatly the accuracy of the measured lifetimes. These results open the possibility of using the RLD technique with pulsed diode laser sources to determine accurately fluorescence lifetimes in the subnanosecond range on thick multilayer samples, providing that offline processing is allowed.
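For reference, the standard two-gate RLD estimator computes τ = Δt / ln(I₀/I₁) for equal-width gates separated by Δt; a minimal numpy sketch with a synthetic mono-exponential check follows. The gate width and lifetime are invented for the example, and IRF deconvolution is omitted.

```python
import numpy as np

def rld_lifetime(gate0, gate1, dt):
    """Two-gate rapid lifetime determination (RLD), pixel-wise.

    gate0, gate1 : images integrated over two equal-width time gates,
                   the second delayed by dt relative to the first.
    For a mono-exponential decay, tau = dt / ln(I0 / I1).
    """
    with np.errstate(divide="ignore", invalid="ignore"):
        tau = dt / np.log(gate0 / gate1)
    return np.where((gate0 > 0) & (gate1 > 0) & (gate0 > gate1), tau, np.nan)

# Synthetic check: a 2.5 ns decay sampled with 2 ns gates.
tau_true, dt = 2.5, 2.0
t = np.linspace(0, 10, 1001)
decay = np.exp(-t / tau_true)
g0 = np.trapz(decay[(t >= 0) & (t < 2)], t[(t >= 0) & (t < 2)])
g1 = np.trapz(decay[(t >= 2) & (t < 4)], t[(t >= 2) & (t < 4)])
print(rld_lifetime(np.array([g0]), np.array([g1]), dt))  # ~2.5
```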
NASA Astrophysics Data System (ADS)
Cherry, M.; Dierken, J.; Boehnlein, T.; Pilchak, A.; Sathish, S.; Grandhi, R.
2018-01-01
A new technique for performing quantitative scanning acoustic microscopy imaging of Rayleigh surface wave (RSW) velocity was developed based on b-scan processing. In this technique, the focused acoustic beam is moved through many defocus distances over the sample and excited with an impulse excitation, and advanced algorithms based on frequency filtering and the Hilbert transform are used to post-process the b-scans to estimate the Rayleigh surface wave velocity. The new method was used to estimate the RSW velocity on an optically flat E6 glass sample; the velocity was measured to within ±2 m/s and the scanning time per point was on the order of 1.0 s, both improvements over the previous two-point defocus method. The new method was also applied to the analysis of two titanium samples, and the velocity was estimated with very low standard deviation in certain large grains of the samples. A new behavior was observed with the b-scan analysis technique, in which the amplitude of the surface wave decayed dramatically for certain crystallographic orientations. The new technique was also compared with previous results and was found to be much more reliable and to offer higher contrast than was previously possible with impulse excitation.
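A sketch of the envelope-based arrival-time step such a b-scan pipeline might use is given below; the filter band, sampling rate, and synthetic pulse are assumptions, and the mapping from arrival times at different defocus distances to an RSW velocity is only indicated in the comments.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def rsw_arrival_time(ascan, fs, band=(20e6, 60e6)):
    """Estimate the surface-wave arrival in one A-scan of a b-scan.

    Band-pass the RF signal, take the Hilbert envelope, and return the
    envelope-peak time. Repeating this across defocus distances z gives
    the slope dz/dt from which the RSW velocity follows.
    """
    nyq = fs / 2
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    envelope = np.abs(hilbert(filtfilt(b, a, ascan)))
    return np.argmax(envelope) / fs

# Synthetic A-scan: a 50 MHz tone burst arriving at 1.2 us.
fs = 500e6
t = np.arange(0, 4e-6, 1 / fs)
pulse = (np.exp(-((t - 1.2e-6) ** 2) / (2 * (40e-9) ** 2))
         * np.sin(2 * np.pi * 50e6 * t))
print(rsw_arrival_time(pulse, fs) * 1e6, "us")  # ~1.2
```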
Falat, Lukas; Marcek, Dusan; Durisova, Maria
2016-01-01
This paper deals with the application of quantitative soft computing prediction models in the financial area, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine the ability to forecast exchange rate values for a horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model with autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can help eliminate the risk of making bad decisions in the decision-making process.
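A compact sketch of the general architecture, with k-means standing in for the genetic-algorithm optimization of the RBF parameters (so this is not the authors' exact model), might look like this:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

class RBFMovingAverage:
    """Minimal RBF network with a moving-average error correction.

    A sketch of the idea only: centers come from k-means rather than a
    genetic algorithm, and a moving average of recent training residuals
    is added back to the network output, echoing the paper's MA stage.
    """
    def __init__(self, n_centers=10, width=1.0, ma_window=5):
        self.n_centers, self.width, self.ma_window = n_centers, width, ma_window

    def _features(self, X):
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * self.width ** 2))   # Gaussian RBF units

    def fit(self, X, y):
        self.centers = KMeans(self.n_centers, n_init=10).fit(X).cluster_centers_
        self.readout = Ridge(alpha=1e-3).fit(self._features(X), y)
        resid = y - self.readout.predict(self._features(X))
        self.ma_correction = resid[-self.ma_window:].mean()
        return self

    def predict(self, X):
        return self.readout.predict(self._features(X)) + self.ma_correction

# Toy one-step-ahead exchange-rate forecast from 3 lagged values.
rng = np.random.default_rng(2)
rate = np.cumsum(rng.normal(0, 0.002, 500)) + 1.3    # synthetic USD/CAD
X = np.column_stack([rate[0:-3], rate[1:-2], rate[2:-1]])
y = rate[3:]
model = RBFMovingAverage().fit(X[:-50], y[:-50])
print(np.abs(model.predict(X[-50:]) - y[-50:]).mean())  # out-of-sample MAE
```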
[Online endpoint detection algorithm for blending process of Chinese materia medica].
Lin, Zhao-Zhou; Yang, Chan; Xu, Bing; Shi, Xin-Yuan; Zhang, Zhi-Qiang; Fu, Jing; Qiao, Yan-Jiang
2017-03-01
The blending process, an essential part of pharmaceutical preparation, directly influences the homogeneity and stability of solid dosage forms. Since the official release of the Guidance for Industry PAT, online process analysis techniques have been increasingly reported for blending applications, but research on endpoint detection algorithms is still at an early stage. By progressively increasing the window size of the moving block standard deviation (MBSD), a novel endpoint detection algorithm was proposed to extend plain MBSD from the off-line to the online scenario and used to determine the endpoint of the blending process for Chinese medicine dispensing granules. Through online tuning of the window size, the status changes of the materials during blending were reflected in the calculation of the standard deviation in real time. The proposed method was tested separately on the blending processes of dextrin and three other traditional Chinese medicine extracts. All results showed that, compared with the traditional MBSD method, the window size changes of the proposed MBSD method (progressively increasing the window size) more clearly reflected the status changes of the materials during blending, making it suitable for online application. Copyright © by the Chinese Pharmaceutical Association.
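The abstract does not give the algorithm's exact formulation; a minimal sketch of the growing-window MBSD idea it describes might look as follows in Python (the parameter names and the homogeneity threshold are illustrative assumptions, not values from the paper):

    import numpy as np

    def mbsd_endpoint(signal, w0=5, grow=1, threshold=0.01):
        # Online endpoint detection by moving block standard deviation (MBSD)
        # with a progressively increasing window, per the idea in the abstract.
        # `signal` could be, e.g., a per-scan spectral similarity value.
        window = w0
        for t in range(w0, len(signal)):
            block = signal[t - window:t]            # most recent `window` samples
            if np.std(block, ddof=1) < threshold:   # blend judged homogeneous
                return t                            # endpoint index
            window = min(window + grow, t + 1)      # enlarge window, never past the start
        return None                                 # endpoint not reached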
NASA Technical Reports Server (NTRS)
Polzin, Kurt A.; Hill, Carrie S.
2013-01-01
Inductive magnetic field probes (also known as B-dot probes and sometimes as B-probes or magnetic probes) are useful for performing measurements in electric space thrusters and various plasma accelerator applications where a time-varying magnetic field is present. Magnetic field probes have proven to be a mainstay in diagnosing plasma thrusters where changes occur rapidly with respect to time, providing the means to measure the magnetic fields produced by time-varying currents and even an indirect measure of the plasma current density through the application of Ampère's law. Examples of applications where this measurement technique has been employed include pulsed plasma thrusters and quasi-steady magnetoplasmadynamic thrusters. The Electric Propulsion Technical Committee (EPTC) of the American Institute of Aeronautics and Astronautics (AIAA) was asked to assemble a Committee on Standards (CoS) for Electric Propulsion Testing. The assembled CoS was tasked with developing Standards and Recommended Practices for various diagnostic techniques used in the evaluation of plasma thrusters. These include measurements that can yield either global information related to a thruster and its performance or detailed, local data related to the specific physical processes occurring in the plasma. This paper presents a summary of the standard, describing the preferred methods for fabrication, calibration, and usage of inductive magnetic field probes for use in diagnosing plasma thrusters. Inductive magnetic field probes (also called B-dot probes throughout this document) are commonly used in electric propulsion (EP) research and testing to measure unsteady magnetic fields produced by time-varying currents. The B-dot probe is relatively simple and inexpensive to construct, making it readily accessible to most researchers. While relatively simple, the design of a B-dot probe is not trivial and there are many opportunities for errors in probe construction, calibration, and usage, and in the post-processing of the data produced by the probe. There are typically several ways in which each of these steps can be approached, and different applications may require more or less rigorous attention to various issues.
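The AIAA standard itself defines the preferred procedures; as a generic illustration of the underlying physics only (not the recommended practice), the field follows from integrating the probe voltage, since V = -N·A·dB/dt for a probe of N turns and loop area A. A minimal sketch, with the offset-estimation window as an assumption:

    import numpy as np

    def bdot_to_field(t, v, turns, area):
        # Recover B(t) from a B-dot probe trace: V = -N * A * dB/dt,
        # so B is the running time integral of -V / (N * A).
        # Real measurements also need a calibration factor and care with
        # integrator droop; this is only the textbook relation.
        v = v - v[:50].mean()          # subtract DC offset from pre-trigger samples (assumed window)
        dt = np.gradient(t)            # sample spacing (handles mild nonuniformity)
        return -np.cumsum(v * dt) / (turns * area)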
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
As fracture mechanics material testing evolves, the governing test standards continue to be refined to better reflect the latest understanding of the physics of the fracture processes involved. The traditional format of ASTM fracture testing standards, utilizing equations expressed directly in the text of the standard to assess the experimental result, is self-limiting in the complexity that can be reasonably captured. The use of automated analysis techniques to draw upon a rich, detailed solution database for assessing fracture mechanics tests provides a foundation for a new approach to testing standards that enables routine users to obtain highly reliable assessments of tests involving complex, non-linear fracture behavior. Herein, the case for automating the analysis of tests of surface cracks in tension in the elastic-plastic regime is utilized as an example of how such a database can be generated and implemented for use in the ASTM standards framework. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
Design and fabrication of self-assembled thin films
NASA Astrophysics Data System (ADS)
Topasna, Daniela M.; Topasna, Gregory A.
2015-10-01
Students experience the entire process of designing, fabricating and testing thin films during their capstone course. The films are fabricated by the ionic self-assembled monolayer (ISAM) technique, which is suited to a short class and is relatively rapid, inexpensive and environmentally friendly. The materials used are polymers, nanoparticles, and small organic molecules that, in various combinations, can create films with nanometer thickness and with specific properties. These films have various potential applications such as pH optical sensors or antibacterial coatings. This type of project offers students an opportunity to go beyond the standard lecture and labs and to experience firsthand the design and fabrication processes. They learn new techniques and procedures, as well as familiarize themselves with new instruments and optical equipment. For example, students learn how to characterize the films by using UV-Vis-NIR spectrophotometry and in the process learn how the instruments operate. This work complements a previous exercise that we introduced where students use MATHCAD to numerically model the transmission and reflection of light from thin films.
Stable aesthetic standards delusion: changing 'artistic quality' by elaboration.
Carbon, Claus-Christian; Hesslinger, Vera M
2014-01-01
The present study challenges the notion that judgments of artistic quality are based on stable aesthetic standards. We propose that such standards are a delusion and that judgments of artistic quality are the combined result of exposure, elaboration, and discourse. We ran two experiments using elaboration tasks based on the repeated evaluation technique in which different versions of the Mona Lisa had to be elaborated deeply. During the initial task either the version known from the Louvre or an alternative version owned by the Prado was elaborated; during the second task both versions were elaborated in a comparative fashion. After both tasks multiple blends of the two versions had to be evaluated concerning several aesthetic key variables. Judgments of artistic quality of the blends were significantly different depending on the initially elaborated version of the Mona Lisa, indicating experience-based aesthetic processing, which contradicts the notion of stable aesthetic standards.
Yang, Yunfeng; Zhu, Mengxia; Wu, Liyou; Zhou, Jizhong
2008-09-16
Using genomic DNA as a common reference in microarray experiments has recently been tested by different laboratories. Conflicting results have been reported with regard to the reliability of microarray results obtained using this method. To explain this, we hypothesize that data processing is a critical element that impacts data quality. Microarray experiments were performed in the gamma-proteobacterium Shewanella oneidensis. Pair-wise comparisons of three experimental conditions were obtained either with two labeled cDNA samples co-hybridized to the same array, or by employing Shewanella genomic DNA as a standard reference. Various data processing techniques were exploited to reduce the amount of inconsistency between the two methods, and the results were assessed. We discovered that data quality was significantly improved by imposing a minimal-replicate constraint, logarithmic transformation and random error analyses. These findings demonstrate that data processing significantly influences data quality, which provides an explanation for the conflicting evaluations in the literature. This work could serve as a guideline for microarray data analysis using genomic DNA as a standard reference.
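The abstract names the processing steps but not their implementation; one plausible reading (log transformation plus a minimal-replicate filter), sketched in Python with illustrative names, is:

    import numpy as np

    def ratios_vs_gdna(sample, gdna, min_reps=3):
        # Per-gene log2(sample / genomic-DNA reference) with a
        # minimal-number-of-replicates constraint, as one plausible
        # reading of the steps named in the abstract.
        # sample, gdna: arrays of shape (replicates, genes).
        logratio = np.log2(sample) - np.log2(gdna)
        finite = np.isfinite(logratio)
        ok = finite.sum(axis=0) >= min_reps              # enough usable replicates
        mean = np.nanmean(np.where(finite, logratio, np.nan), axis=0)
        return np.where(ok, mean, np.nan)                # NaN where data are insufficient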
NASA Astrophysics Data System (ADS)
Anastasopoulos, Dimitrios; Moretti, Patrizia; Geernaert, Thomas; De Pauw, Ben; Nawrot, Urszula; De Roeck, Guido; Berghmans, Francis; Reynders, Edwin
2017-03-01
The presence of damage in a civil structure alters its stiffness and consequently its modal characteristics. The identification of these changes can provide engineers with useful information about the condition of a structure and constitutes the basic principle of vibration-based structural health monitoring. While eigenfrequencies and mode shapes are the most commonly monitored modal characteristics, their sensitivity to structural damage may be low relative to their sensitivity to environmental influences. Modal strains or curvatures could offer an attractive alternative, but current measurement techniques encounter difficulties in capturing the very small strain (sub-microstrain) levels occurring during ambient or operational excitation with sufficient accuracy. This paper investigates the ability to obtain sub-microstrain accuracy with standard fiber-optic Bragg gratings using a novel optical signal processing algorithm that identifies the wavelength shift with high accuracy and precision. The novel technique is validated in an extensive experimental modal analysis test on a steel I-beam instrumented with FBG sensors at its top and bottom flanges. The raw wavelength FBG data are processed into strain values using both a novel correlation-based processing technique and a conventional peak-tracking technique. Subsequently, the strain time series are used to identify the beam's modal characteristics. Finally, the accuracy of both algorithms in identifying modal characteristics is extensively investigated.
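The paper's correlation-based algorithm is not specified in the abstract; a generic stand-in for correlation-based wavelength-shift estimation (cross-correlation of the reference and measured FBG spectra, refined with three-point parabolic peak interpolation) could look like this in Python:

    import numpy as np

    def fbg_shift(ref, meas, d_lambda):
        # Estimate the Bragg wavelength shift between a reference and a
        # measured FBG spectrum by cross-correlation, refined to sub-sample
        # precision with a parabolic fit around the correlation peak.
        # d_lambda: wavelength spacing of the spectral samples.
        ref = ref - ref.mean()
        meas = meas - meas.mean()
        corr = np.correlate(meas, ref, mode="full")
        k = int(np.argmax(corr))
        y0, y1, y2 = corr[k - 1], corr[k], corr[k + 1]   # assumes peak not at an edge
        frac = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)    # parabolic sub-sample offset
        lag = k - (len(ref) - 1) + frac                  # zero lag sits at index len(ref)-1
        return lag * d_lambda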
Vapor hydrogen peroxide as alternative to dry heat microbial reduction
NASA Astrophysics Data System (ADS)
Chung, S.; Kern, R.; Koukol, R.; Barengoltz, J.; Cash, H.
2008-09-01
The Jet Propulsion Laboratory (JPL), in conjunction with the NASA Planetary Protection Officer, has selected the vapor phase hydrogen peroxide (VHP) sterilization process for continued development as a NASA-approved sterilization technique for spacecraft subsystems and systems. The goal was to include this technique, with an appropriate specification, in NASA Procedural Requirements 8020.12 as a low-temperature technique complementary to the dry heat sterilization process. The VHP process is widely used by the medical industry to sterilize surgical instruments and biomedical devices, but high doses of VHP may degrade the performance of flight hardware or compromise material compatibility. The goal of this study was to determine the minimum VHP process conditions for microbial reduction levels acceptable for planetary protection. Experiments were conducted by the STERIS Corporation, under contract to JPL, to evaluate the effectiveness of vapor hydrogen peroxide for the inactivation of the standard spore challenge, Geobacillus stearothermophilus. VHP process parameters were determined that provide significant reductions in spore viability while allowing survival of sufficient spores for statistically significant enumeration. In addition to the obvious process parameters of interest (hydrogen peroxide concentration, number of injection cycles, and exposure duration), the investigation also considered the possible effect on lethality of environmental parameters: temperature, absolute humidity, and material substrate. This study delineated a range of test sterilizer process conditions: VHP concentration, process duration, a process temperature range and a process humidity range for which the worst-case D-value may be imposed, and the dependence on selected spacecraft material substrates. The derivation of D-values from the lethality data permitted conservative planetary protection recommendations.
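The abstract does not define the D-value; under the standard assumption of log-linear (first-order) inactivation, it is the exposure time producing a one-log10 reduction in survivors, and it can be estimated from survivor-curve data as in this sketch:

    import numpy as np

    def d_value(times, survivors):
        # D-value from a survivor curve, assuming log-linear inactivation:
        # log10 N(t) = log10 N0 - t / D, hence D = -1 / slope.
        slope, _ = np.polyfit(times, np.log10(survivors), 1)
        return -1.0 / slope

    # e.g. d_value([0, 10, 20, 30], [1e6, 1e5, 1e4, 1e3]) -> 10.0 (time units per log)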
NASA Astrophysics Data System (ADS)
Bhuiya, M. M. K.; Rasul, M. G.; Khan, M. M. K.; Ashwath, N.
2016-07-01
The beauty leaf tree (Calophyllum inophyllum) is regarded as an alternative energy source for producing second-generation biodiesel because of its potential and the high oil yield of its seed kernels. The treating process is indispensable in biodiesel production because it can increase both the yield and the quality of the product. Oil extracted by mechanical screw press and by solvent extraction using n-hexane was refined. Five replications of 25 g of crude oil each for the screw press and five replications of 25 g each for n-hexane were selected for the refining and biodiesel conversion processes. The oil refining process consists of degumming, neutralization and dewaxing, which were performed to remove the gums (phosphorus-based compounds), free fatty acids and waxes, respectively, from the fresh crude oil before the biodiesel conversion process was carried out. The results indicated mass conversion efficiencies of the refined oil of up to 73% and 81% for the screw press and n-hexane refining processes, respectively. It was also found that the transesterification process yielded up to 88% and 90% biodiesel in terms of mass conversion efficiency for the screw press and n-hexane techniques, respectively. When the entire chain (refining and transesterification) was considered, the conversion of beauty leaf tree (BLT) refined oil into biodiesel yielded up to 65% and 73% mass conversion efficiency for the screw press and n-hexane techniques, respectively, consistent with the product of the stage efficiencies (e.g., 0.81 × 0.90 ≈ 0.73). Physico-chemical properties of the crude and refined oils and of the biodiesel were characterized according to ASTM standards. Overall, BLT has the potential to contribute as an alternative energy source because of its high mass conversion efficiency.
Morgan, David G; Ramasse, Quentin M; Browning, Nigel D
2009-06-01
Zone axis images recorded using high-angle annular dark-field scanning transmission electron microscopy (HAADF-STEM or Z-contrast imaging) reveal the atomic structure with a resolution that is defined by the probe size of the microscope. In most cases, the full images contain many sub-images of the crystal unit cell and/or interface structure. Thanks to the repetitive nature of these images, it is possible to apply standard image processing techniques that were developed for the electron crystallography of biological macromolecules and have been used widely in other fields of electron microscopy for both organic and inorganic materials. These methods can be used to enhance the signal-to-noise ratio of the original images, to remove distortions that arise from either the instrumentation or the specimen itself, and to quantify properties of the material in ways that are difficult without such data processing. In this paper, we briefly describe the theory behind these image processing techniques and demonstrate them for aberration-corrected, high-resolution HAADF-STEM images of Si46 clathrates developed for hydrogen storage.
Efficient dynamic events discrimination technique for fiber distributed Brillouin sensors.
Galindez, Carlos A; Madruga, Francisco J; Lopez-Higuera, Jose M
2011-09-26
A technique to detect real-time variations of temperature or strain in Brillouin-based distributed fiber sensors is proposed and investigated in this paper. The technique is based on anomaly detection methods such as the RX algorithm. Detection and isolation of dynamic events from static ones are demonstrated by proper processing of the Brillouin gain values obtained with a standard BOTDA system. Results also suggest that better signal-to-noise ratio, dynamic range and spatial resolution can be obtained. For a pump pulse of 5 ns the spatial resolution is enhanced (from 0.541 m obtained by direct gain measurement to 0.418 m obtained with the proposed technique), since the analysis concentrates on the variation of the Brillouin gain and not only on the averaging of the signal over time. © 2011 Optical Society of America
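The abstract cites the RX algorithm by name; in its standard global form, RX scores each observation by its Mahalanobis distance from the background. A sketch, applying it hypothetically to vectors of Brillouin gain values per fiber position:

    import numpy as np

    def rx_scores(X):
        # Global RX anomaly detector: Mahalanobis distance of each row of X
        # from the mean under the sample covariance. Rows could be the
        # Brillouin gain across frequencies at one fiber position (an
        # illustrative use, not necessarily the authors' formulation).
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False)
        inv = np.linalg.pinv(cov)                   # pseudo-inverse for near-singular cov
        d = X - mu
        return np.einsum("ij,jk,ik->i", d, inv, d)  # (x-mu)^T C^-1 (x-mu) per row

    # Positions whose score exceeds a chosen percentile are flagged as dynamic events.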
Techniques for optimal crop selection in a controlled ecological life support system
NASA Technical Reports Server (NTRS)
Mccormack, Ann; Finn, Cory; Dunsky, Betsy
1993-01-01
A Controlled Ecological Life Support System (CELSS) utilizes a plant's natural ability to regenerate air and water while being grown as a food source in a closed life support system. Current plant research is directed toward obtaining quantitative empirical data on the regenerative ability of each species of plant and the system volume and power requirements. Two techniques were adapted to optimize crop species selection while at the same time minimizing the system volume and power requirements. Each allows the level of life support supplied by the plants to be selected, as well as other system parameters. The first technique uses decision analysis in the form of a spreadsheet. The second method, which is used as a comparison with and validation of the first, utilizes standard design optimization techniques. Simple models of plant processes are used in the development of these methods.
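The abstract does not show the spreadsheet; a toy decision-analysis scoring of candidate crops with volume and power penalties (all crop names, criteria, weights and normalizing constants below are invented for illustration) might be:

    # Toy weighted-scoring decision analysis for crop selection (illustrative only).
    crops = {
        # crop: (O2/CO2 regeneration score, food value score, volume m^3, power kW)
        "wheat":   (0.9, 0.8, 4.0, 2.5),
        "potato":  (0.7, 0.9, 3.0, 1.8),
        "lettuce": (0.5, 0.4, 1.5, 0.9),
    }
    weights = {"regen": 0.4, "food": 0.3, "volume": -0.2, "power": -0.1}

    def score(regen, food, volume, power):
        # Higher regeneration/food is better; volume and power are penalized
        # after scaling by assumed reference values (4.0 m^3, 2.5 kW).
        return (weights["regen"] * regen + weights["food"] * food
                + weights["volume"] * volume / 4.0 + weights["power"] * power / 2.5)

    best = max(crops, key=lambda c: score(*crops[c]))
    print(best)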
Modelling Errors in Automatic Speech Recognition for Dysarthric Speakers
NASA Astrophysics Data System (ADS)
Caballero Morales, Santiago Omar; Cox, Stephen J.
2009-12-01
Dysarthria is a motor speech disorder characterized by weakness, paralysis, or poor coordination of the muscles responsible for speech. Although automatic speech recognition (ASR) systems have been developed for disordered speech, factors such as low intelligibility and limited phonemic repertoire decrease speech recognition accuracy, making conventional speaker adaptation algorithms perform poorly on dysarthric speakers. In this work, rather than adapting the acoustic models, we model the errors made by the speaker and attempt to correct them. For this task, two techniques have been developed: (1) a set of "metamodels" that incorporate a model of the speaker's phonetic confusion matrix into the ASR process; (2) a cascade of weighted finite-state transducers at the confusion matrix, word, and language levels. Both techniques attempt to correct the errors made at the phonetic level and make use of a language model to find the best estimate of the correct word sequence. Our experiments show that both techniques outperform standard adaptation techniques.
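As a toy illustration of the first idea (rescoring candidate words through a speaker-specific phone confusion matrix, in a noisy-channel spirit), with all names and probabilities invented and no insertion/deletion handling, unlike the paper's full metamodel/WFST treatment:

    def best_word(recognized, lexicon, confusion, prior):
        # Score each candidate word w as P(w) * prod_t P(recognized_t | intended_t)
        # using the speaker's phone confusion matrix; return the argmax.
        # Toy version: assumes equal-length phone strings (no ins/del).
        best, best_p = None, 0.0
        for word, phones in lexicon.items():
            if len(phones) != len(recognized):
                continue
            p = prior.get(word, 1e-6)
            for intended, heard in zip(phones, recognized):
                p *= confusion.get(intended, {}).get(heard, 1e-6)
            if p > best_p:
                best, best_p = word, p
        return best

    # best_word(["b", "ih", "n"],
    #           {"bin": ["b", "ih", "n"], "pin": ["p", "ih", "n"]},
    #           {"b": {"b": 0.6, "p": 0.4}, "p": {"p": 0.7, "b": 0.3},
    #            "ih": {"ih": 1.0}, "n": {"n": 1.0}},
    #           {"bin": 0.5, "pin": 0.5})  -> "bin"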
Polynomial-interpolation algorithm for van der Pauw Hall measurement in a metal hydride film
NASA Astrophysics Data System (ADS)
Koon, D. W.; Ares, J. R.; Leardini, F.; Fernández, J. F.; Ferrer, I. J.
2008-10-01
We apply a four-term polynomial-interpolation extension of the van der Pauw Hall measurement technique to a 330 nm Mg-Pd bilayer during both absorption and desorption of hydrogen at room temperature. We show that standard versions of the van der Pauw DC Hall measurement technique produce an error of over 100% due to a drifting offset signal and can lead to unphysical interpretations of the physical processes occurring in this film. The four-term technique effectively removes this source of error, even when the offset signal is drifting by an amount larger than the Hall signal in the time interval between successive measurements. This technique can be used to increase the resolution of transport studies of any material in which the resistivity is rapidly changing, particularly when the material is changing from metallic to insulating behavior.
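The exact four-term weighting is in the paper rather than the abstract; one standard way to cancel an offset drifting polynomially in time is to alternate the field polarity over four equally spaced readings and apply third-finite-difference weights, which annihilate constant, linear and quadratic drift. A sketch of that idea (not necessarily the authors' exact coefficients):

    def hall_four_term(v1, v2, v3, v4):
        # Four successive readings with field polarity +,-,+,- at equal time
        # spacing. The weights (1, -3, 3, -1) form a third finite difference:
        # they cancel any offset varying as a quadratic polynomial in time,
        # while the alternating Hall signal survives with total weight 8.
        return (v1 - 3.0 * v2 + 3.0 * v3 - v4) / 8.0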
In vitro evaluation of marginal adaptation in five ceramic restoration fabricating techniques.
Ural, Cağri; Burgaz, Yavuz; Saraç, Duygu
2010-01-01
To compare in vitro the marginal adaptation of crowns manufactured using five ceramic restoration fabrication techniques. Fifty standardized master steel dies simulating molars were produced and divided into five groups, each containing 10 specimens. Test specimens were fabricated with CAD/CAM, heat-press, glass-infiltration, and conventional lost-wax techniques according to manufacturer instructions. Marginal adaptation of the test specimens was measured vertically before and after cementation using SEM. Data were statistically analyzed by one-way ANOVA with Tukey HSD tests (α = .05). Marginal adaptation of ceramic crowns was affected by fabrication technique and cementation process (P < .001). The lowest marginal opening values were obtained with Cerec-3 crowns before and after cementation (P < .001). The highest marginal discrepancy values were obtained with PFM crowns before and after cementation. Marginal adaptation values obtained in the compared systems were within clinically acceptable limits. Cementation causes a significant increase in the vertical marginal discrepancies of the test specimens.
Xiong, Zhenjie; Sun, Da-Wen; Pu, Hongbin; Gao, Wenhong; Dai, Qiong
2017-03-04
With improvements in living standards, many people nowadays pay more attention to the quality and safety of meat. However, traditional methods for meat quality and safety detection and evaluation, such as manual inspection, mechanical methods, and chemical methods, are tedious, time-consuming, and destructive, and cannot meet the requirements of the modern meat industry. Therefore, seeking rapid, non-destructive, and accurate inspection techniques is important for the meat industry. In recent years, a number of novel and noninvasive imaging techniques, such as optical imaging, ultrasound imaging, tomographic imaging, thermal imaging, and odor imaging, have emerged and shown great potential in quality and safety assessment. In this paper, a detailed overview of advanced applications of these emerging imaging techniques for quality and safety assessment of different types of meat (pork, beef, lamb, chicken, and fish) is presented. In addition, the advantages and disadvantages of each imaging technique are summarized. Finally, future trends for these emerging imaging techniques are discussed, including the integration of multiple imaging techniques, cost reduction, and the development of powerful image-processing algorithms.
High-resolution interferometric microscope for traceable dimensional nanometrology in Brazil
NASA Astrophysics Data System (ADS)
Malinovski, I.; França, R. S.; Lima, M. S.; Bessa, M. S.; Silva, C. R.; Couceiro, I. B.
2016-07-01
A two-color interferometric microscope was developed for the nanometrology of step-height standards traceable to the metre definition via primary laser wavelength standards. The setup is based on two stabilized lasers to provide traceable measurements of the highest possible resolution, down to the physical limits of the optical instruments, in the sub-nanometer to micrometer range of heights. The wavelength reference is a He-Ne 633 nm stabilized laser; the secondary source is a blue-green 488 nm grating laser diode. The fractional fringe is measured accurately by a modulated phase-shift technique combined with imaging interferometry and Fourier processing. Self-calibrating methods were developed to correct systematic interferometric errors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rockward, Tommy
2012-07-16
For the past 6 years, open discussions and/or meetings have been held and are still on-going with OEMs, hydrogen suppliers, other test facilities from the North America Team and international collaborators regarding experimental results, fuel clean-up cost, modeling, and analytical techniques to help determine levels of constituents for the development of an international standard for hydrogen fuel quality (ISO TC197 WG-12). Significant progress has been made. The process for the fuel standard is entering its final stages as a result of the technical accomplishments. The objectives are to: (1) determine the allowable levels of hydrogen fuel contaminants in support of the development of science-based international standards for hydrogen fuel quality (ISO TC197 WG-12); and (2) validate the ASTM test method for determining low levels of non-hydrogen constituents.
a Study of Oxygen Precipitation in Heavily Doped Silicon.
NASA Astrophysics Data System (ADS)
Graupner, Robert Kurt
Gettering of impurities with oxygen precipitates is widely used during the fabrication of semiconductors to improve the performance and yield of devices. Since the effectiveness of the gettering process is largely dependent on the initial interstitial oxygen concentration, accurate measurements of this parameter are of considerable importance. Measurements of interstitial oxygen following thermal cycles are required for the development of semiconductor fabrication processes and for research into the mechanisms of oxygen precipitate nucleation and growth. Efforts by industrial associations have led to the development of standard procedures for the measurement of interstitial oxygen in wafers. However, practical oxygen measurements often do not satisfy the requirements of such standard procedures. An additional difficulty arises when the silicon wafer has a low resistivity (high dopant concentration). In such cases the infrared light used for the measurement is severely attenuated by the electrons or holes introduced by the dopant. Since such wafers are the substrates used for the production of widely used epitaxial wafers, this measurement problem is economically important. Alternative methods such as Secondary Ion Mass Spectroscopy or Gas Fusion Analysis have been developed to measure oxygen in these cases. However, neither of these methods is capable of distinguishing interstitial oxygen from precipitated oxygen as required for precipitation studies. In addition to the commercial interest in heavily doped silicon substrates, they are also of interest for research into the role of point defects in nucleation and precipitation processes. Despite considerable research effort, there is still disagreement concerning the type of point defect and its role in semiconductor processes. Studies of changes in the interstitial oxygen concentration of heavily doped and lightly doped silicon wafers could help clarify the role of point defects in oxygen nucleation and precipitation processes. This could lead to more effective control and use of oxygen precipitation for gettering. One of the principal purposes of this thesis is the extension of the infrared interstitial oxygen measurement technique to situations outside the measurement capabilities of the standard technique. These situations include silicon slices exhibiting interfering precipitate absorption bands and heavily doped n-type silicon wafers. A new method is presented for correcting for the effect of multiple reflections in silicon wafers with optically rough surfaces. The technique for the measurement of interstitial oxygen in heavily doped n-type wafers is then used to perform a comparative study of oxygen precipitation in heavily antimony-doped (0.035 ohm-cm) silicon and lightly doped p-type silicon. A model is presented to quantitatively explain the observed suppression of defect formation in heavily doped n-type wafers.
CICADA, CCD and Instrument Control Software
NASA Astrophysics Data System (ADS)
Young, Peter J.; Brooks, Mick; Meatheringham, Stephen J.; Roberts, William H.
Computerised Instrument Control and Data Acquisition (CICADA) is a software system for control of telescope instruments in a distributed computing environment. It is designed using object-oriented techniques and built with standard computing tools such as RPC, SysV IPC, Posix threads, Tcl, and GUI builders. The system is readily extensible to new instruments and currently supports the Astromed 3200 CCD controller and MSSSO's new tip-tilt system. Work is currently underway to provide support for the SDSU CCD controller and MSSSO's Double Beam Spectrograph. A core set of processes handle common communication and control tasks, while specific instruments are "bolted" on using C++ inheritance techniques.
Introduction to the mining of clinical data.
Harrison, James H
2008-03-01
The increasing volume of medical data online, including laboratory data, represents a substantial resource that can provide a foundation for improved understanding of disease presentation, response to therapy, and health care delivery processes. Data mining supports these goals by providing a set of techniques designed to discover similarities and relationships between data elements in large data sets. Currently, medical data have several characteristics that increase the difficulty of applying these techniques, although there have been notable medical data mining successes. Future developments in integrated medical data repositories, standardized data representation, and guidelines for the appropriate research use of medical data will decrease the barriers to mining projects.
[A method for inducing standardized spiral fractures of the tibia in the animal experiment].
Seibold, R; Schlegel, U; Cordey, J
1995-07-01
A method for the deliberate weakening of cortical bone has been developed on the basis of an already established technique for creating butterfly fractures. It makes it possible to create the same type of fracture, i.e., a spiral fracture, every time. The fracturing process is recorded as a force-strain curve. The results of the in vitro investigations form a basis for the preparation of experimental tasks aimed at demonstrating internal fixation techniques and their influence on the vascularity of the bone in simulated fractures. Animal protection law requires that this fracture model not fail in animal experiments.
Recent Developments in Microsystems Fabricated by the Liga-Technique
NASA Technical Reports Server (NTRS)
Schulz, J.; Bade, K.; El-Kholi, A.; Hein, H.; Mohr, J.
1995-01-01
As an example of microsystems fabricated by the LIGA technique (x-ray lithography, electroplating and molding), three systems are described and characterized: a triaxial acceleration sensor system, a micro-optical switch, and a microsystem for the analysis of pollutants. The fabrication technologies are reviewed with respect to the key components of the three systems: an acceleration sensor, an electrostatic actuator, and a spectrometer made by the LIGA technique. A micro-pump and micro-valve made using micromachined tools for molding and optical fiber imaging are made possible by combining LIGA and anisotropic etching of silicon in a batch process. These examples show that the combination of technologies and components is the key to complex microsystems. The design of such microsystems will be facilitated if standardized interfaces are available.
Differential dynamic microscopy to characterize Brownian motion and bacteria motility
NASA Astrophysics Data System (ADS)
Germain, David; Leocmach, Mathieu; Gibaud, Thomas
2016-03-01
We have developed a lab module for undergraduate students that involves quantifying the dynamics of a suspension of microscopic particles using Differential Dynamic Microscopy (DDM). DDM is a relatively new technique that constitutes an alternative to more classical techniques such as dynamic light scattering (DLS) or video particle tracking (VPT). The technique consists of imaging a particle dispersion with a standard light microscope and a camera, and analyzing the images using a digital Fourier transform to obtain the intermediate scattering function, an autocorrelation function that characterizes the dynamics of the dispersion. We first illustrate DDM in the textbook case of colloids under Brownian motion, where we measure the diffusion coefficient. We then show that DDM is a pertinent tool for characterizing biological systems such as motile bacteria.
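The core DDM computation the abstract alludes to (averaging the Fourier power of image differences over time, then over rings of equal |q|) can be sketched in a few lines of Python; the bin count and data layout are arbitrary choices:

    import numpy as np

    def ddm(frames, lags, nbins=50):
        # Image structure function D(q, tau): for each lag tau, average
        # |FFT2(I(t+tau) - I(t))|^2 over t, then azimuthally over |q| rings.
        # frames: float array of shape (n_frames, ny, nx).
        ny, nx = frames.shape[1:]
        q = np.hypot(np.fft.fftfreq(nx)[None, :], np.fft.fftfreq(ny)[:, None])
        bins = np.linspace(0, q.max(), nbins)
        idx = np.digitize(q.ravel(), bins)
        counts = np.maximum(np.bincount(idx), 1)
        out = {}
        for tau in lags:
            diff = frames[tau:] - frames[:-tau]
            power = (np.abs(np.fft.fft2(diff, axes=(1, 2))) ** 2).mean(axis=0)
            out[tau] = np.bincount(idx, power.ravel()) / counts
        return out  # dict: lag -> radially averaged D(q, tau)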
NASA Astrophysics Data System (ADS)
Liu, Leibo; Chen, Yingjie; Yin, Shouyi; Lei, Hao; He, Guanghui; Wei, Shaojun
2014-07-01
A VLSI architecture for an entropy decoder, inverse quantiser and predictor is proposed in this article. The architecture decodes video streams of three standards on a single chip: H.264/AVC, AVS (China National Audio Video coding Standard) and MPEG2. The proposed scheme is called MPMP (Macro-block-Parallel based Multilevel Pipeline) and is intended to improve decoding performance to satisfy real-time requirements while maintaining reasonable area and power consumption. Several techniques, such as a slice-level pipeline, MB (Macro-Block) level pipeline and MB-level parallelism, are adopted. Input and output buffers for the inverse quantiser and predictor are shared by the decoding engines for H.264, AVS and MPEG2, effectively reducing the implementation overhead. Simulation shows that the decoding process consumes 512, 435 and 438 clock cycles per MB in H.264, AVS and MPEG2, respectively. Owing to the proposed techniques, the video decoder can support H.264 HP (High Profile) 1920 × 1088@30fps (frames per second) streams, AVS JP (Jizhun Profile) 1920 × 1088@41fps streams and MPEG2 MP (Main Profile) 1920 × 1088@39fps streams at a 200 MHz working frequency.
Measuring the uncertainties of discharge measurements: interlaboratory experiments in hydrometry
NASA Astrophysics Data System (ADS)
Le Coz, Jérôme; Blanquart, Bertrand; Pobanz, Karine; Dramais, Guillaume; Pierrefeu, Gilles; Hauet, Alexandre; Despax, Aurélien
2015-04-01
Quantifying the uncertainty of streamflow data is key for the hydrological sciences. The conventional uncertainty analysis based on error propagation techniques is restricted by the absence of traceable discharge standards and by the weight of difficult-to-predict errors related to the operator, procedure and measurement environment. Field interlaboratory experiments recently emerged as an efficient, standardized method to 'measure' the uncertainties of a given streamgauging technique in given measurement conditions. Both uncertainty approaches are compatible and should be developed jointly in the field of hydrometry. In recent years, several interlaboratory experiments have been reported by different hydrological services. They involved different streamgauging techniques, including acoustic profilers (ADCP), current-meters and handheld radars (SVR). Uncertainty analysis was not always their primary goal: most often, testing the proficiency and homogeneity of instruments, makes and models, procedures and operators was the original motivation. When interlaboratory experiments are processed for uncertainty analysis, once outliers have been discarded, all participants are assumed to be equally skilled and to apply the same streamgauging technique in equivalent conditions. A universal requirement is that all participants simultaneously measure the same discharge, which shall be kept constant within negligible variations. To the best of our knowledge, we were the first to apply the interlaboratory method for computing the uncertainties of streamgauging techniques according to the authoritative international documents (ISO standards). Several specific issues arise due to the measurement conditions in outdoor canals and rivers. The main limitation is that the best available river discharge references are usually too uncertain to quantify the bias of the streamgauging technique, i.e. the systematic errors that are common to all participants in the experiment. A reference, or a sensitivity analysis to the fixed parameters of the streamgauging technique, remains very useful for estimating the uncertainty related to the (non-quantified) bias correction. In the absence of a reference, the uncertainty estimate is referenced to the average of all discharge measurements in the interlaboratory experiment, ignoring the technique bias. Simple equations can be used to assess the uncertainty of the uncertainty results, as a function of the number of participants and of repeated measurements. The interlaboratory method was applied to several interlaboratory experiments on ADCPs and current-meters mounted on wading rods, in streams of different sizes and aspects, with typically 10 to 30 instruments. The uncertainty results were consistent with usual expert judgment and highly depended on the measurement environment. Approximately, the expanded uncertainties (within the 95% probability interval) were ±5% to ±10% for ADCPs in good or poor conditions, and ±10% to ±15% for current-meters in shallow creeks. Due to the specific limitations related to a slow measurement process and to small, natural streams, uncertainty results for current-meters were more uncertain than for ADCPs, for which the site-specific errors were significantly evidenced. The proposed method can be applied to a wide range of interlaboratory experiments conducted in contrasted environments for different streamgauging techniques, in a standardized way.
Ideally, an international open database would enhance the investigation of hydrological data uncertainties, according to the characteristics of the measurement conditions and procedures. Such a dataset could be used for implementing and validating uncertainty propagation methods in hydrometry.
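A hedged sketch of the ISO 5725-style variance decomposition underlying such interlaboratory uncertainty estimates (a balanced-design approximation; the cited ISO documents define the authoritative procedure):

    import numpy as np

    def interlab_uncertainty(measurements):
        # measurements: list of numpy arrays, one per participant, of repeated
        # discharge measurements of the same (steady) flow.
        # Returns grand mean, reproducibility s_R, and expanded U = 2 * s_R,
        # referenced to the participants' average (technique bias excluded).
        k = len(measurements)
        n = np.array([len(m) for m in measurements])
        means = np.array([m.mean() for m in measurements])
        grand = np.average(means, weights=n)
        # repeatability: pooled within-participant variance
        s_r2 = sum(((m - mu) ** 2).sum() for m, mu in zip(measurements, means)) / (n.sum() - k)
        # between-participant variance, balanced-design approximation
        s_L2 = max(np.var(means, ddof=1) - s_r2 / n.mean(), 0.0)
        s_R = np.sqrt(s_L2 + s_r2)
        return grand, s_R, 2.0 * s_R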
Mishra, Apurva; Pandey, Ramesh K; Manickam, Natesan
2015-01-01
Rapid phylogenetic and functional gene (gtfB) identification of S. mutans from dental plaque derived from children. Dental plaque collected from fifteen patients aged 7-12 underwent centrifugation followed by genomic DNA extraction for S. mutans. The genomic DNA was processed with S. mutans-specific primers under suitable PCR conditions for phylogenetic and functional gene (gtfB) identification. The yield and results were confirmed by agarose gel electrophoresis. 1% agarose gel electrophoresis showed positive PCR amplification at 1,485 bp when compared with the 1 kbp standard, indicating the presence of S. mutans in the test sample. Another PCR reaction was set up using gtfB primers specific for S. mutans for functional gene identification; 1.2% agarose gel electrophoresis showed positive amplification at 192 bp when compared with the 100 bp standard. With advances in molecular biology techniques, PCR-based identification and quantification of the bacterial load can be done within hours using species-specific primers and DNA probes. Thus, this technique may reduce the laboratory time spent on conventional culture methods, reduces the possibility of colony identification errors, and is more sensitive than culture techniques.
Aboal, J R; Boquete, M T; Carballeira, A; Casanova, A; Debén, S; Fernández, J A
2017-05-01
In this study we examined 6080 data gathered by our research group during more than 20 years of research on the moss biomonitoring technique, in order to quantify the variability generated by different aspects of the protocol and to calculate the overall measurement uncertainty associated with the technique. The median variance of the concentrations of different pollutants measured in moss tissues attributed to the different methodological aspects was high, reaching values of 2851 (ng·g -1 ) 2 for Cd (sample treatment), 35.1 (μg·g -1 ) 2 for Cu (sample treatment), 861.7 (ng·g -1 ) 2 and for Hg (material selection). These variances correspond to standard deviations that constitute 67, 126 and 59% the regional background levels of these elements in the study region. The overall measurement uncertainty associated with the worst experimental protocol (5 subsamples, refrigerated, washed, 5 × 5 m size of the sampling area and once a year sampling) was between 2 and 6 times higher than that associated with the optimal protocol (30 subsamples, dried, unwashed, 20 × 20 m size of the sampling area and once a week sampling), and between 1.5 and 7 times higher than that associated with the standardized protocol (30 subsamples and once a year sampling). The overall measurement uncertainty associated with the standardized protocol could generate variations of between 14 and 47% in the regional background levels of Cd, Cu, Hg, Pb and Zn in the study area and much higher levels of variation in polluted sampling sites. We demonstrated that although the overall measurement uncertainty of the technique is still high, it can be reduced by using already well defined aspects of the protocol. Further standardization of the protocol together with application of the information on the overall measurement uncertainty would improve the reliability and comparability of the results of different biomonitoring studies, thus extending use of the technique beyond the context of scientific research. Copyright © 2017 Elsevier Ltd. All rights reserved.
UIAGM Ropehandling Techniques.
ERIC Educational Resources Information Center
Cloutier, K. Ross
The Union Internationale des Associations des Guides de Montagne's (UIAGM) rope handling techniques are intended to form the standard for guiding ropework worldwide. These techniques have become the legal standard for instructional institutions and commercial guiding organizations in UIAGM member countries: Austria, Canada, France, Germany, Great…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirsch, Matthias
2009-06-29
At particle accelerators the Standard Model has been tested and will be tested further to great precision. The data analyzed in this thesis were collected at the world's highest-energy collider, the Tevatron, located at the Fermi National Accelerator Laboratory (FNAL) in the vicinity of Chicago, IL, USA. There, protons and antiprotons are collided at a center-of-mass energy of √s = 1.96 TeV. The discovery of the top quark was one of the remarkable results not only for the CDF and D0 experiments at the Tevatron collider, but also for the Standard Model, which had predicted the existence of the top quark long before, because of symmetry arguments. Still, the Tevatron is the only facility able to produce top quarks. The predominant production mechanism of top quarks is the production of a top-antitop quark pair via the strong force. However, the Standard Model also allows the production of single top quarks via the electroweak interaction. This process features the unique opportunity to measure the |Vtb| matrix element of the Cabibbo-Kobayashi-Maskawa (CKM) matrix directly, without assuming unitarity of the matrix or assuming that the number of quark generations is three. Hence, the measurement of the cross section of electroweak top quark production is more than the technical challenge of extracting a physics process that occurs only once in ten billion collisions. It is also an important test of the V-A structure of the electroweak interaction and a potential window to physics beyond the Standard Model in the case where the measurement of |Vtb| would result in a value significantly different from 1, the value predicted by the Standard Model. At the Tevatron two production processes contribute significantly to the production of single top quarks: production via the t-channel, also called W-gluon fusion, and production via the s-channel, also known as the W* process. This analysis searches for the combined s+t channel production cross section, assuming the ratio of s-channel to t-channel production is realized in nature as predicted by the Standard Model. A data set of approximately 1 fb⁻¹ is analyzed, the data set used by the D0 collaboration to claim evidence for single top quark production. Events with two, three, and four jets are used in the analysis if they contain one or two jets tagged as originating from the decay of a b hadron, an isolated muon or electron, and a significant amount of missing transverse energy. This selection of events follows the signature that single top quark events are expected to show in the detector. In the meantime, both collaborations, D0 and CDF, have analyzed a larger data set and have celebrated the joint observation of single top quark production. The novelty of the analysis presented here is the way discriminating observables are determined. A so-called Multi-Process Factory evaluates each event under several hypotheses. A common analysis technique, for example in top quark properties studies, is to reconstruct the intermediate particles in the decay chain of the signal process from the final-state objects measured in the various subdetectors. An essential part of such a method is to resolve the ambiguities that arise in the assignment of the final-state objects to the partons of the decay chain. In a Multi-Process Factory this approach is extended: not only the decay chain of the signal process is reconstructed, but also the decay chains of the most important background processes.
From the numerous possible event configurations for each of the signal and background decay chains, the most probable configuration is selected based on a likelihood measure. Properties of this configuration, such as the mass of the reconstructed top quark, are then used in a multivariate analysis technique to separate the expected signal contribution from the background processes. The technique used is called Boosted Decision Trees and has only recently been introduced in high energy physics analyses. A Bayesian approach is used to finally extract the cross section from the discriminant output of the Boosted Decision Trees.
DLA-X Total Quality Management (TQM) Implementation Plan
1989-07-01
TQM (Total Quality Management), Continuous Process Improvement. DLA-X Total Quality Management (TQM) Implementation Plan: application of proven Total Quality Management techniques. Quality Policy: Responsibility for quality is delegated to every employee in DLA-X.
Metal-Semiconductor Nanocomposites for High Efficiency Thermoelectric Power Generation
2013-12-07
standard III–V compound semiconductor processing techniques with terbium-doped InGaAs of high terbium concentration, Journal of Vacuum Science...even lower the required temperature for strong covalent bonding. We performed the oxide bonding for this substrate transfer task (see Figure 16 for...appropriate controls for assessing ErSb:InGaSb and other nanocomposites of p-type III-V compound semiconductors and their alloys. The UCSC group calculated
Semi-Markov Models for Degradation-Based Reliability
2010-01-01
standard analysis techniques for Markov processes can be employed (cf. Whitt (1984), Altiok (1985), Perros (1994), and Osogami and Harchol-Balter...We want to approximate X by a PH random variable, say Y, with c.d.f. Ĥ. Marie (1980), Altiok (1985), Johnson (1993), Perros (1994), and Osogami and...provides a minimal representation when matching only two moments. By considering the guidance provided by Marie (1980), Whitt (1984), Altiok (1985), Perros
Collisional & Nonlinear Radiative Processes for Development of Coherent UV & XUV Sources.
1987-04-01
in the vicinity of an atomic unit (e/a₀). Extant theoretical work, however, predicted ridiculously low rates...of 10¹⁴ W/cm². These experiments clearly demonstrated that standard theoretical techniques were incapable, by a discrepancy as great as several...experiments were clearly in contradiction to all theoretical treatments, of which there is a considerable number (16-21). This unexpected result, of course
Mapping modern software process engineering techniques onto an HEP development environment
NASA Astrophysics Data System (ADS)
Wellisch, J. P.
2003-04-01
One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means, in our context, to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement and continual improvement in the efficiency of the working environment, with the goal of facilitating great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has now been progressing for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within the CMS off-line community, as the only viable route to a successful software process improvement program in HEP. We present the CMS approach to software process improvement in this process R&D, describe lessons learned, and mistakes made. We demonstrate the benefits gained and the current status of the software processes established in CMS off-line software.
Capture of Fluorescence Decay Times by Flow Cytometry
Naivar, Mark A.; Jenkins, Patrick; Freyer, James P.
2012-01-01
In flow cytometry, the fluorescence decay time of an excitable species has been largely underutilized and is not likely found as a standard parameter on any imaging cytometer, sorting, or analyzing system. Most cytometers lack fluorescence lifetime hardware mainly owing to two central issues. Foremost, research and development with lifetime techniques has lacked proper exploitation of modern laser systems, data acquisition boards, and signal processing techniques. Secondly, a lack of enthusiasm for fluorescence lifetime applications in cells and with bead-based assays has persisted among the greater cytometry community. In this unit, we describe new approaches that address these issues and demonstrate the simplicity of digitally acquiring fluorescence relaxation rates in flow. The unit is divided into protocol and commentary sections in order to provide a most comprehensive discourse on acquiring the fluorescence lifetime with frequency-domain methods. The unit covers (i) standard fluorescence lifetime acquisition (protocol-based) with frequency-modulated laser excitation, (ii) digital frequency-domain cytometry analyses, and (iii) interfacing fluorescence lifetime measurements onto sorting systems. Within the unit is also a discussion on how digital methods are used for aliasing in order to harness higher frequency ranges. Also, a final discussion is provided on heterodyning and processing of waveforms for multi-exponential decay extraction. PMID:25419263
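For context (standard frequency-domain relations, not specific to this unit's protocol): for a single-exponential decay measured at modulation frequency f, the phase shift φ and modulation depth m give two lifetime estimates, τ_φ = tan(φ)/ω and τ_m = sqrt(1/m² − 1)/ω with ω = 2πf, which agree when the decay is truly mono-exponential. A minimal sketch:

    import numpy as np

    def fd_lifetimes(phase_rad, mod_depth, f_mod_hz):
        # Phase and modulation lifetimes from frequency-domain data:
        # tau_phi = tan(phi)/omega, tau_m = sqrt(1/m^2 - 1)/omega.
        # Divergence between the two flags a multi-exponential decay.
        omega = 2.0 * np.pi * f_mod_hz
        tau_phi = np.tan(phase_rad) / omega
        tau_m = np.sqrt(1.0 / mod_depth**2 - 1.0) / omega
        return tau_phi, tau_m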
Applying CLIPS to control of molecular beam epitaxy processing
NASA Technical Reports Server (NTRS)
Rabeau, Arthur A.; Bensaoula, Abdelhak; Jamison, Keith D.; Horton, Charles; Ignatiev, Alex; Glover, John R.
1990-01-01
A key element of U.S. industrial competitiveness in the 1990's will be the exploitation of advanced technologies which involve low-volume, high-profit manufacturing. The demands of such manufacture limit participation to a few major entities in the U.S. and elsewhere, and offset the lower manufacturing costs of other countries which have, for example, captured much of the consumer electronics market. One such technology is thin-film epitaxy, a technology which encompasses several techniques such as Molecular Beam Epitaxy (MBE), Chemical Beam Epitaxy (CBE), and Vapor-Phase Epitaxy (VPE). Molecular Beam Epitaxy (MBE) is a technology for creating a variety of electronic and electro-optical materials. Compared to standard microelectronic production techniques (including gaseous diffusion, ion implantation, and chemical vapor deposition), MBE is much more exact, though much slower. Although newer than the standard technologies, MBE is the technology of choice for fabrication of ultraprecise materials for cutting-edge microelectronic devices and for research into the properties of new materials.
Imaging biomarkers in liver fibrosis.
Berzigotti, A; França, M; Martí-Aguado, D; Martí-Bonmatí, L
There is a need for early identification of patients with chronic liver diseases due to their increasing prevalence, morbidity and mortality. The degree of liver fibrosis determines the prognosis and therapeutic options in this population. Liver biopsy represents the reference standard for fibrosis staging. However, given its limitations and complications, different non-invasive methods have recently been developed for the in vivo quantification of fibrosis. Due to their precision and reliability, biomarker measurements derived from ultrasound and magnetic resonance stand out. This article reviews the different acquisition techniques and image processing methods currently used in the evaluation of liver fibrosis, focusing on their diagnostic performance, applicability and clinical value. In order to properly interpret their results in the appropriate clinical context, it is necessary to understand the techniques and their quality parameters, the standardization and validation of the measurement units, and the quality control of the methodological problems. Copyright © 2017 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
Optimized protocol for combined PALM-dSTORM imaging.
Glushonkov, O; Réal, E; Boutant, E; Mély, Y; Didier, P
2018-06-08
Multi-colour super-resolution localization microscopy is an efficient technique to study a variety of intracellular processes, including protein-protein interactions. This technique requires specific labels that display transition between fluorescent and non-fluorescent states under given conditions. For the most commonly used label types, photoactivatable fluorescent proteins and organic fluorophores, these conditions are different, making experiments that combine both labels difficult. Here, we demonstrate that changing the standard imaging buffer of thiols/oxygen scavenging system, used for organic fluorophores, to the commercial mounting medium Vectashield increased the number of photons emitted by the fluorescent protein mEos2 and enhanced the photoconversion rate between its green and red forms. In addition, the photophysical properties of organic fluorophores remained unaltered with respect to the standard imaging buffer. The use of Vectashield together with our optimized protocol for correction of sample drift and chromatic aberrations enabled us to perform two-colour 3D super-resolution imaging of the nucleolus and resolve its three compartments.
NASA Astrophysics Data System (ADS)
Taer, Erman; Taslim, Rika
2018-02-01
The synthesis of activated carbon monolith electrodes from biomass material, using hydraulic pressing or the pelletization of pre-carbonized material, is one of the standard reported methods. Several steps, such as pre-carbonization, milling, chemical activation, hydraulic pressing, carbonization, physical activation, polishing and washing, must be accomplished to produce electrodes by this method. This is a relatively long process that needs to be simplified. In this paper we present the standard method and then introduce several alternative methods for the synthesis of activated carbon monolith electrodes. The alternative methods emphasize the selection of suitable biomass materials. Carbon electrodes prepared by the different methods were analyzed for physical and electrochemical properties: density, degree of crystallinity and surface morphology as physical properties, and specific capacitance as the electrochemical property. These alternative methods offered specific capacitances in the range of 10 to 171 F/g.
Application of EAP materials toward a refreshable Braille display
NASA Astrophysics Data System (ADS)
Di Spigna, N.; Chakraborti, P.; Yang, P.; Ghosh, T.; Franzon, P.
2009-03-01
The development of a multiline, refreshable Braille display will assist with the full inclusion and integration of blind people into society. Both polyvinylidene fluoride (PVDF) film planar bending mode actuators and silicone dielectric elastomer cylindrical tube actuators have been investigated for their potential use in a Braille cell. A liftoff process that allows aggressive scaling of miniature bimorph actuators has been developed using standard semiconductor lithography techniques. The PVDF bimorphs have been demonstrated to provide enough displacement to raise a Braille dot at biases below 1000 V while operating at 10 Hz. Silicone tube actuators have also been demonstrated to achieve the necessary displacement, though they require higher voltages. The choice of electrodes and prestrain conditions aimed at maximizing axial strain in tube actuators is discussed. Characterization techniques measuring actuation displacement and blocking forces appropriate for standard Braille cell specifications are presented. Finally, the integration of these materials into novel cell designs and the fabrication of a prototype Braille cell are discussed.
Wei, Rongfei; Guo, Qingjun; Wen, Hanjie; Peters, Marc; Yang, Junxing; Tian, Liyan; Han, Xiaokun
2017-01-01
In this study, key factors affecting the chromatographic separation of Cd from plants, such as the resin column and the digestion and purification procedures, were experimentally investigated. A technique for separating Cd from plant samples based on single ion-exchange chromatography has been developed, which is suitable for the high-precision analysis of Cd isotopes by multiple-collector inductively coupled plasma mass spectrometry (MC-ICP-MS). The robustness of the technique was assessed by replicate analyses of Cd standard solutions and plant samples. The Cd yields of the whole separation process were higher than 95%, and the δ114/110Cd values of three secondary Cd standard solutions (Münster Cd, Spex Cd, Spex-1 Cd) relative to NIST SRM 3108 were measured accurately, enabling comparison with Cd isotope results obtained in other laboratories. Hence, stable Cd isotope analyses represent a powerful tool for fingerprinting specific Cd sources and/or examining biogeochemical reactions in ecological and environmental systems.
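For reference, the delta notation used above follows the standard per-mil convention relative to the NIST SRM 3108 reference material; this is the conventional definition, not a formula taken from the study itself:

```latex
% Standard per-mil isotope delta notation (conventional definition):
\delta^{114/110}\mathrm{Cd} =
  \left(
    \frac{\bigl(^{114}\mathrm{Cd}/\,^{110}\mathrm{Cd}\bigr)_{\mathrm{sample}}}
         {\bigl(^{114}\mathrm{Cd}/\,^{110}\mathrm{Cd}\bigr)_{\mathrm{NIST\ SRM\ 3108}}}
    - 1
  \right) \times 1000
```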
Preservation of normal lung regions in the adult respiratory distress syndrome
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maunder, R.J.; Shuman, W.P.; McHugh, J.W.
1986-05-09
In this report, the authors challenge the commonly held assumption that the adult respiratory distress syndrome (ARDS) is a homogeneous process associated with generalized and relatively uniform damage to the alveolar capillary membrane. They studied 13 patients with ARDS, comparing the pulmonary parenchymal changes seen by standard bedside chest roentgenograms with those seen by computed tomography of the chest. Three patients demonstrated generalized lung involvement by both radiologic techniques. In another eight patients, despite the appearance of generalized involvement on the standard chest x-ray film, the computed tomographic scans showed patchy infiltrates interspersed with areas of normal-appearing lung. Two patients showed patchy involvement by both techniques. The fact that ARDS spares some regions of lung parenchyma is useful knowledge in understanding the gas-exchange abnormalities of ARDS, the variable responsiveness to positive end-expiratory pressure, and the occurrence of oxygen toxicity. The problem of regional inhomogeneity should also be kept in mind when interpreting lung biopsy specimens or bronchoalveolar lavage fluid in patients with ARDS.
Basu, Anirban; Kumar, Gopinatha Suresh
2015-05-15
The thermodynamics of the interaction of the food colourant tartrazine with two homologous serum proteins, HSA and BSA, were investigated employing microcalorimetric techniques. At T = 298.15 K the equilibrium constants for the tartrazine-BSA and tartrazine-HSA complexation processes were evaluated to be (1.92 ± 0.05) × 10^5 M^-1 and (1.04 ± 0.05) × 10^5 M^-1, respectively. The binding was driven by a large negative standard molar enthalpic contribution. The binding was dominated essentially by non-polyelectrolytic forces, which remained largely invariant at all salt concentrations. The polyelectrolytic contribution was weak at all salt concentrations and accounted for only 6-18% of the total standard molar Gibbs energy change in the salt concentration range 10-50 mM. The negative standard molar heat capacity values, in conjunction with the enthalpy-entropy compensation phenomenon observed, established the involvement of dominant hydrophobic forces in the complexation process. Tartrazine enhanced the stability of both serum albumins against thermal denaturation.
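As a quick arithmetic check on the reported equilibrium constants, the standard molar Gibbs energy change follows from dG0 = -RT ln K. A minimal sketch using the values quoted in the abstract:

```python
import math

R, T = 8.314, 298.15                        # J/(mol K); temperature from the abstract
for protein, K in (("BSA", 1.92e5), ("HSA", 1.04e5)):
    dG = -R * T * math.log(K) / 1000.0      # standard molar Gibbs energy, kJ/mol
    print(f"{protein}: dG0 = {dG:.1f} kJ/mol")  # ~ -30.2 (BSA) and -28.6 (HSA)
```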
Improved Bat Algorithm Applied to Multilevel Image Thresholding
2014-01-01
Multilevel image thresholding is a very important image processing technique that serves as a basis for image segmentation and further higher-level processing. However, the required computational time for exhaustive search grows exponentially with the number of desired thresholds. Swarm intelligence metaheuristics are well known as successful and efficient optimization methods for intractable problems. In this paper, we adjusted one of the latest swarm intelligence algorithms, the bat algorithm, for the multilevel image thresholding problem. Testing on standard benchmark images shows that the bat algorithm is comparable with other state-of-the-art algorithms. We then improved the standard bat algorithm by adding elements from differential evolution and from the artificial bee colony algorithm. Our improved bat algorithm proved to be better than five other state-of-the-art algorithms, improving the quality of results in all cases and significantly improving convergence speed. PMID:25165733
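The objective such swarm methods optimize is worth making concrete. The sketch below shows multilevel thresholding with the between-class-variance (Otsu-style) criterion on a synthetic three-mode histogram; a plain random search stands in for the bat update rules, so this illustrates the problem, not the authors' improved algorithm:

```python
import numpy as np

def between_class_variance(hist, thresholds):
    """Otsu-style objective: weighted between-class variance of the partition."""
    bins = np.arange(256)
    p = hist / hist.sum()
    mu_total = (bins * p).sum()
    edges = [0] + sorted(int(t) for t in thresholds) + [256]
    var = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = p[lo:hi].sum()
        if w > 0:
            mu = (bins[lo:hi] * p[lo:hi]).sum() / w
            var += w * (mu - mu_total) ** 2
    return var

rng = np.random.default_rng(0)
levels = rng.choice([60, 120, 200], size=100_000)          # three-mode test image
img = np.clip(rng.normal(levels, 12), 0, 255).astype(np.uint8)
hist, _ = np.histogram(img, bins=256, range=(0, 256))

best, best_fit = None, -1.0                                 # random search stand-in
for _ in range(5000):
    cand = rng.integers(1, 255, size=2)
    fit = between_class_variance(hist, cand)
    if fit > best_fit:
        best, best_fit = sorted(int(t) for t in cand), fit
print("thresholds:", best)                                  # expect roughly [90, 160]
```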
ANSI/AIAA S-081A, Pressure Vessel Standards Implementation Guidelines
NASA Technical Reports Server (NTRS)
Greene, Nathanael J.
2009-01-01
The stress rupture specification for Composite Overwrapped Pressure Vessels (COPVs) is discussed. The composite shell of the COPV shall be designed to meet the design life, considering the time it is under sustained load. A Mechanical Damage Control Plan (MDCP) shall be created and implemented to assure that the COPV will not fail due to mechanical damage incurred during manufacturing, testing, shipping, installation, or flight. Proven processes and procedures for fabrication and repair shall be used to preclude damage or material degradation during material processing, manufacturing operations, and refurbishment. Selected NDI techniques for the liner and/or boss(es) shall be performed before overwrapping with composite. When visual inspection reveals mechanical damage or defects exceeding manufacturing specification levels (and standard repair procedures), the damaged COPV shall be submitted to a material review board (MRB) for disposition. Every COPV shall be subjected to visual and other non-destructive inspection (NDI) per the inspection plan.
Prakash, Amol; Peterman, Scott; Ahmad, Shadab; Sarracino, David; Frewen, Barbara; Vogelsang, Maryann; Byram, Gregory; Krastins, Bryan; Vadali, Gouri; Lopez, Mary
2014-12-05
Data-dependent acquisition (DDA) and data-independent acquisition (DIA) strategies have both improved the understanding of proteomics samples. Both strategies have well-documented advantages and disadvantages: DDA is typically applied for deep discovery, while DIA may be used to create sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows the objective assessment of DIA performance with respect to interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.
GSFC Cutting Edge Avionics Technologies for Spacecraft
NASA Technical Reports Server (NTRS)
Luers, Philip J.; Culver, Harry L.; Plante, Jeannette
1998-01-01
With the launch of NASA's first fiber optic bus on SAMPEX in 1992, GSFC ushered in an era of new technology development and insertion into flight programs. Predating such programs as the Lewis and Clark missions and the New Millennium Program, GSFC has spearheaded the drive to use cutting-edge technologies on spacecraft for three reasons: to enable next-generation Space and Earth Science, to shorten spacecraft development schedules, and to reduce the cost of NASA missions. The technologies developed have addressed three focus areas: standard interface components, high-performance processing, and high-density packaging techniques enabling lower cost systems. To realize the benefits of standard interface components, GSFC has developed and utilized radiation hardened/tolerant devices such as PCI target ASICs, Parallel Fiber Optic Data Bus terminals, MIL-STD-1773 and AS1773 transceivers, and the Essential Services Node. High-performance processing has been the focus of the Mongoose I and Mongoose V rad-hard 32-bit processor programs as well as the SMEX-Lite Computation Hub. High-density packaging techniques have resulted in 3-D stacked DRAM packages and Chip-On-Board processes. Lower cost systems have been demonstrated by judiciously using all of these technology developments to enable "plug and play" scalable architectures. The paper presents a survey of development and insertion experiences for the above technologies, future plans to enable more "better, faster, cheaper" spacecraft, and details of ongoing GSFC programs such as Ultra-Low Power electronics, Rad-Hard FPGAs, PCI master ASICs, and Next Generation Mongoose processors.
Functional Data Analysis for Dynamical System Identification of Behavioral Processes
Trail, Jessica B.; Collins, Linda M.; Rivera, Daniel E.; Li, Runze; Piper, Megan E.; Baker, Timothy B.
2014-01-01
Efficient new technology has made it straightforward for behavioral scientists to collect anywhere from several dozen to several thousand dense, repeated measurements on one or more time-varying variables. These intensive longitudinal data (ILD) are ideal for examining complex change over time, but present new challenges that illustrate the need for more advanced analytic methods. For example, in ILD the temporal spacing of observations may be irregular, and individuals may be sampled at different times. Also, it is important to assess both how the outcome changes over time and the variation between participants' time-varying processes to make inferences about a particular intervention's effectiveness within the population of interest. The methods presented in this article integrate two innovative ILD analytic techniques: functional data analysis and dynamical systems modeling. An empirical application is presented using data from a smoking cessation clinical trial. Study participants provided 42 daily assessments of pre-quit and post-quit withdrawal symptoms. Regression splines were used to approximate smooth functions of craving and negative affect and to estimate the variables' derivatives for each participant. We then modeled the dynamics of nicotine craving using standard input-output dynamical systems models. These models provide a more detailed characterization of the post-quit craving process than do traditional longitudinal models, including information regarding the type, magnitude, and speed of the response to an input. The results, in conjunction with standard engineering control theory techniques, could potentially be used by tobacco researchers to develop a more effective smoking intervention. PMID:24079929
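A minimal sketch of the two-stage pipeline described above, with invented numbers (quit day, time constant, and noise level are all hypothetical): smooth a noisy 42-day series with a regression spline, take the analytic derivative of the fit, and estimate a first-order input-output model dy/dt = a*y + b*u by least squares:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
t = np.arange(42.0)                                 # 42 daily assessments, as in the trial
u = (t >= 14).astype(float)                         # hypothetical input: quit day at day 14
y_true = 2.0 * (1 - np.exp(-(t - 14) / 5.0)) * u    # first-order step response
y = y_true + rng.normal(scale=0.15, size=t.size)    # noisy "craving" scores

# Regression-spline smoothing and the analytic derivative of the fit.
spl = UnivariateSpline(t, y, k=4, s=len(t) * 0.02)
y_hat, dy_hat = spl(t), spl.derivative()(t)

# Fit dy/dt = a*y + b*u by ordinary least squares; recover gain and time constant.
A = np.column_stack([y_hat, u])
a, b = np.linalg.lstsq(A, dy_hat, rcond=None)[0]
tau, K = -1.0 / a, -b / a
print(f"time constant ~ {tau:.1f} days, steady-state gain ~ {K:.2f}")
```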
International standards for programmes of training in intensive care medicine in Europe.
2011-03-01
To develop internationally harmonised standards for programmes of training in intensive care medicine (ICM). Standards were developed by using consensus techniques. A nine-member nominal group of European intensive care experts developed a preliminary set of standards. These were revised and refined through a modified Delphi process involving 28 European national coordinators representing national training organisations using a combination of moderated discussion meetings, email, and a Web-based tool for determining the level of agreement with each proposed standard, and whether the standard could be achieved in the respondent's country. The nominal group developed an initial set of 52 possible standards which underwent four iterations to achieve maximal consensus. All national coordinators approved a final set of 29 standards in four domains: training centres, training programmes, selection of trainees, and trainers' profiles. Only three standards were considered immediately achievable by all countries, demonstrating a willingness to aspire to quality rather than merely setting a minimum level. Nine proposed standards which did not achieve full consensus were identified as potential candidates for future review. This preliminary set of clearly defined and agreed standards provides a transparent framework for assuring the quality of training programmes, and a foundation for international harmonisation and quality improvement of training in ICM.
48 CFR 9905.505-50 - Techniques for application.
Code of Federal Regulations, 2010 CFR
2010-10-01
... this cost accounting principle does not require that allocation of unallowable costs to final cost.... 9905.505-50 Section 9905.505-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS FOR EDUCATIONAL INSTITUTIONS 9905.505-50 Techniques for...
48 CFR 9904.403-50 - Techniques for application.
Code of Federal Regulations, 2010 CFR
2010-10-01
.... 9904.403-50 Section 9904.403-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.403-50 Techniques for application. (a)(1) Separate...
Alyami, Hamad; Dahmash, Eman; Bowen, James; Mohammed, Afzal R
2017-01-01
Powder blend homogeneity is a critical attribute in the formulation development of low-dose and potent active pharmaceutical ingredients (APIs), yet blending is a complex process with multiple contributing factors. Excipient characteristics play a key role in an efficient blending process and in final product quality. In this work the effects of excipient type and properties, blending technique and processing time on content uniformity were investigated. Powder characteristics for three commonly used excipients (starch, pregelatinised starch and microcrystalline cellulose) were initially explored using a laser diffraction particle size analyser and angle of repose for flowability, followed by thorough evaluation of surface topography employing scanning electron microscopy and interferometry. Blend homogeneity was evaluated based on content uniformity analysis of the model API, ergocalciferol, using a validated analytical technique. Flowability of the powders was directly related to particle size and shape, while surface topography results revealed the relationship between surface roughness and the ability of an excipient with high surface roughness to lodge fine API particles within surface grooves, resulting in superior uniformity of content. Of the two blending techniques, geometric blending produced homogeneous blends at low dilution when processed for longer durations, whereas manual ordered blending failed to achieve the compendial requirement for content uniformity despite mixing for 32 minutes. Employing the novel dry powder hybrid mixer device developed at the Aston University laboratory, results revealed the superiority of the device, which enabled the production of homogeneous blends irrespective of excipient type and particle size. Lower dilutions of the API (1% and 0.5% w/w) were examined using non-sieved excipients, and the dry powder hybrid mixing device enabled the development of successful blends within compendial requirements and with low relative standard deviation. PMID:28609454
Deriving Global Convection Maps From SuperDARN Measurements
NASA Astrophysics Data System (ADS)
Gjerloev, J. W.; Waters, C. L.; Barnes, R. J.
2018-04-01
A new statistical modeling technique for determining the global ionospheric convection is described. The technique is based on Super Dual Auroral Radar Network (SuperDARN) observations and is an advanced version of the principal component regression (PCR) technique that Waters et al. (https://doi.org/10.1002/2015JA021596) used for SuperMAG data. While SuperMAG ground magnetic field perturbations are vector measurements, SuperDARN provides line-of-sight measurements of the ionospheric convection flow. Each line-of-sight flow has a known azimuth (or direction), which must be converted into the actual vector flow; the component perpendicular to the azimuth direction is unknown. Our method uses historical data from the SuperDARN database and PCR to determine a fill-in model convection distribution for any given universal time. The fill-in process is driven by a list of state descriptors (magnetic indices and the solar zenith angle). The final solution is then derived from a spherical cap harmonic fit to the SuperDARN measurements and the fill-in model. When compared with the standard SuperDARN fill-in model, we find that our fill-in model provides improved solutions, and the final solutions are in better agreement with the SuperDARN measurements. Our solutions are far less dynamic than the standard SuperDARN solutions, which we interpret as being due to a lack of magnetosphere-ionosphere inertia and communication delays in the standard SuperDARN technique, while these are inherently included in our approach. We argue that the magnetosphere-ionosphere system has inertia that prevents the global convection from changing abruptly in response to an interplanetary magnetic field change.
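For readers unfamiliar with PCR itself, the following is a generic sketch (not the SuperDARN implementation): project the centered predictors onto their leading principal components, then regress the response on those scores:

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal component regression: PCA on X, then OLS on the leading scores."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    U, s, Vt = np.linalg.svd(X - x_mean, full_matrices=False)
    V = Vt[:n_components].T                        # leading principal directions
    beta, *_ = np.linalg.lstsq((X - x_mean) @ V, y - y_mean, rcond=None)
    return V @ beta, x_mean, y_mean                # coefficients in original space

rng = np.random.default_rng(2)
latent = rng.normal(size=(500, 3))                 # three underlying "state" factors
X = latent @ rng.normal(size=(3, 20)) + 0.05 * rng.normal(size=(500, 20))
y = latent @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=500)

coef, x0, y0 = pcr_fit(X, y, n_components=3)
y_hat = (X - x0) @ coef + y0
print("R^2 =", round(1 - np.var(y - y_hat) / np.var(y), 3))
```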
The quantitative analysis of silicon carbide surface smoothing by Ar and Xe cluster ions
NASA Astrophysics Data System (ADS)
Ieshkin, A. E.; Kireev, D. S.; Ermakov, Yu. A.; Trifonov, A. S.; Presnov, D. E.; Garshev, A. V.; Anufriev, Yu. V.; Prokhorova, I. G.; Krupenin, V. A.; Chernysh, V. S.
2018-04-01
The gas cluster ion beam (GCIB) technique was used for smoothing the surface of silicon carbide crystals. The effects of processing with two inert cluster ion species, argon and xenon, were quantitatively compared. While argon is a standard element for GCIB, results for xenon clusters had not previously been reported. Scanning probe microscopy and high-resolution transmission electron microscopy were used to analyze the surface roughness and the quality of the surface crystal layer. Gas cluster ion beam processing smooths the surface relief down to an average roughness of about 1 nm for both elements. Xenon was shown to be the more effective working gas: the sputtering rate for xenon clusters is 2.5 times higher than for argon at the same beam energy. High-resolution transmission electron microscopy analysis of the surface defect layer gives values of 7 ± 2 nm and 8 ± 2 nm for treatment with argon and xenon clusters, respectively.
Gahlawat, P; Sehgal, S
1998-01-01
A technique for the development of potato flour was standardized. Five products, viz. cake, biscuit, weaning food, panjiri and ladoo, were prepared incorporating potato flour, defatted soy flour and corn flour. Baking and roasting were the major processing techniques employed in the development of these products. The protein, ash and fat contents of potato flour were almost similar to those of raw potatoes. Significant differences in the protein, ash and fat contents of all the products were observed. Protein and starch digestibility of potato flour were significantly higher than those of raw potatoes. Protein digestibility increased by 12 to 17 percent on baking or roasting of products. Processed products had significantly higher starch digestibility and mineral availability compared to raw products. Thus, it can be concluded that roasting and baking are effective means of improving the starch and protein digestibility and mineral availability of products.
3D visualization of Thoraco-Lumbar Spinal Lesions in German Shepherd Dog
NASA Astrophysics Data System (ADS)
Azpiroz, J.; Krafft, J.; Cadena, M.; Rodríguez, A. O.
2006-09-01
Computed tomography (CT) has been found to be an excellent imaging modality due to its sensitivity to characterize the morphology of the spine in dogs. This technique is considered to be particularly helpful for diagnosing spinal cord atrophy and spinal stenosis. The three-dimensional visualization of organs and bones can significantly improve the diagnosis of certain diseases in dogs. CT images were acquired of a German shepherd's dog spinal cord to generate stacks and digitally process them to arrange them in a volume image. All imaging experiments were acquired using standard clinical protocols on a clinical CT scanner. The three-dimensional visualization allowed us to observe anatomical structures that otherwise are not possible to observe with two-dimensional images. The combination of an imaging modality like CT together with imaging processing techniques can be a powerful tool for the diagnosis of a number of animal diseases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenger, Andreas
2009-01-01
The study of processes involving flavour-changing neutral currents provides a particularly promising probe for New Physics beyond the Standard Model of particle physics. These processes are forbidden at tree level and proceed through loop processes, which are strongly suppressed in the Standard Model. Cross-sections for these processes can be significantly enhanced by contributions from new particles, as proposed in most extensions of the Standard Model. This thesis presents searches for two flavour-changing neutral current decays, B± → K±μ+μ- and B0_d → K*μ+μ-. The analysis was performed on 4.1 fb^-1 of data collected by the DØ detector in Run II of the Fermilab Tevatron. Candidate events for the decay B± → K±μ+μ- were selected using a multivariate analysis technique, and the number of signal events was determined by a fit to the invariant mass spectrum. Normalising to the known branching fraction for B± → J/ψK±, a branching fraction of B(B± → K±μ+μ-) = (6.45 ± 2.24 (stat) ± 1.19 (syst)) × 10^-7 was measured. The branching fraction for the decay B0_d → K*μ+μ- was determined in a similar way. Normalising to the known branching fraction for B0_d → J/ψK*, a branching fraction of B(B0_d → K*μ+μ-) = (11.15 ± 3.05 (stat) ± 1.94 (syst)) × 10^-7 was measured. All measurements are in agreement with the Standard Model.
Standardization of Laser Methods and Techniques for Vibration Measurements and Calibrations
NASA Astrophysics Data System (ADS)
von Martens, Hans-Jürgen
2010-05-01
The realization and dissemination of the SI units of motion quantities (vibration and shock) have been based on laser interferometer methods specified in international documentary standards. New and refined laser methods and techniques developed by national metrology institutes and by leading manufacturers in the past two decades have been swiftly specified as standard methods for inclusion in the ISO 16063 series of international documentary standards. A survey of ISO standards for the calibration of vibration and shock transducers demonstrates the extended ranges and improved accuracy (measurement uncertainty) of laser methods and techniques for vibration and shock measurements and calibrations. The first standard for the calibration of laser vibrometers by laser interferometry, or by a reference accelerometer calibrated by laser interferometry (ISO 16063-41), is at the Draft International Standard (DIS) stage and may be issued by the end of 2010. The standard methods with refined techniques proved to achieve wider measurement ranges and smaller measurement uncertainties than those specified in the ISO standards. The applicability of different standardized interferometer methods to vibrations at high frequencies was recently demonstrated up to 347 kHz (acceleration amplitudes up to 350 km/s^2). The relative deviations between the amplitude measurement results of the different interferometer methods, applied simultaneously, were less than 1% in all cases.
Rapid screening and species identification of E. coli, Listeria, and Salmonella by SERS technique
NASA Astrophysics Data System (ADS)
Liu, Yongliang; Chao, Kuanglin; Kim, Moon S.; Nou, Xiangwu
2008-04-01
Techniques for routine and rapid screening for the presence of foodborne bacteria are needed, and this study reports the feasibility of citrate-reduced silver colloidal SERS for identifying E. coli, Listeria, and Salmonella. The relative standard deviation (RSD) of SERS spectra from silver colloidal suspensions and the ratios of P-O SERS peaks from a small molecule (K3PO4) were used to assess the reproducibility, stability, and binding effectiveness of citrate-reduced silver colloids across batches and over storage. The results indicated that the silver colloids were reproducible across batches and showed stable, consistent binding effectiveness over a 60-day storage period. Notably, although the silver colloidal nanoparticles were stable for at least 90 days, their binding effectiveness began to decrease slightly after 60 days of storage, with a binding reduction of about 12% by the 90th day. Colloidal silver SERS, as demonstrated here, could be an important alternative technique for the rapid and simultaneous screening for the presence of the three bacteria most associated with outbreaks, owing to their exclusive biomarkers and the label-free, easy-sampling attributes of the method.
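The reproducibility metric used here, relative standard deviation, is straightforward; a minimal sketch with invented peak intensities for one SERS band across replicate colloid batches:

```python
import numpy as np

rng = np.random.default_rng(3)
peak = rng.normal(loc=1000.0, scale=45.0, size=10)   # hypothetical band intensities
rsd = 100.0 * peak.std(ddof=1) / peak.mean()         # RSD as a percentage
print(f"RSD = {rsd:.1f}%")   # lower RSD -> better batch-to-batch reproducibility
```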
Environmental Development Plan (EDP). Enhanced gas recovery, FY 1977
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-03-01
This Enhanced Gas Recovery (EGR) EDP addresses the environmental impacts of enhanced gas recovery processes in shale and sandstone, methane drainage from coalbeds, and methane recovery from geopressured aquifers. The EDP addresses planning in two basic areas: environmental research and environmental assessment. Environmental research can be categorized as follows: characterization of pollutants from EGR processes; selective application of monitoring and measuring techniques; evaluation of control/mitigation techniques; and evaluation of the synergistic impacts of the development of EGR techniques. Environmental assessment activities scheduled by the EDP include: assessment of ecological impacts; assessment of socioeconomic effects; EIA/EIS preparation; evaluation of control technology needs; and analysis of applicable and proposed emission, effluent, and health and safety standards. The EGR EDP includes an EGR technology overview (Section 2), a discussion of EGR environmental issues and requirements (Section 3), an environmental action plan (Section 4), an environmental management strategy for the EGR program (Section 5), and supporting appendices which present information on Federal legislation applicable to EGR technology, a summary of ongoing and completed research, and future research and assessment projects.
Demographic management in a federated healthcare environment.
Román, I; Roa, L M; Reina-Tosina, J; Madinabeitia, G
2006-09-01
The purpose of this paper is to provide a further step toward the decentralization of identification and demographic information about persons by solving issues related to the integration of demographic agents in a federated healthcare environment. The aim is to identify a particular person in every system of a federation and to obtain a unified view of his/her demographic information stored in different locations. This work is based on semantic models and techniques, and pursues the reconciliation of several current standardization efforts, including ITU-T's Open Distributed Processing, CEN's prEN 12967, OpenEHR's dual and reference models, CEN's General Purpose Information Components and CORBAmed's PID service. We propose a new paradigm for the management of person identification and demographic data, based on the development of an open architecture of specialized distributed components, together with techniques for the efficient management of domain ontologies, in order to provide a federated demographic service. This new service enhances previous correlation solutions, sharing ideas with different standards and domains such as semantic techniques and database systems. The federation philosophy requires us to devise solutions to the semantic, functional and instance incompatibilities in our approach. Although this work is based on several models and standards, we have improved on them by combining their contributions and developing a federated architecture that does not require the centralization of demographic information. The solution is thus a good approach to integration problems, and the methodology applied can easily be extended to other tasks involved in the healthcare organization.
NASA Astrophysics Data System (ADS)
Meng, Haoran; Ben-Zion, Yehuda
2018-01-01
We present a technique to detect small earthquakes not included in standard catalogues using data from a dense seismic array. The technique is illustrated with continuous waveforms recorded on a test day by 1108 vertical geophones in a tight array on the San Jacinto fault zone. Waveforms are first stacked without time-shift in nine non-overlapping subarrays to increase the signal-to-noise ratio. The nine envelope functions of the stacked records are then multiplied with each other to suppress signals associated with sources affecting only some of the nine subarrays. Running a short-term moving average/long-term moving average (STA/LTA) detection algorithm on the product leads to 723 triggers in the test day. Using a local P-wave velocity model derived for the surface layer from Betsy gunshot data, 5-s-long waveforms of all sensors around each STA/LTA trigger are beamformed for various incident directions. Of the 723 triggers, 220 are found to have localized energy sources, and 103 of these are confirmed as earthquakes by verifying their observation at 4 or more stations of the regional seismic network. This demonstrates the general validity of the method and allows further processing of the validated events using standard techniques. The number of validated events in the test day is more than 5 times larger than that in the standard catalogue. Using these events as templates can lead to detections of many more earthquakes.
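A minimal sketch of this detection chain on synthetic data (sampling rate, window lengths, and event amplitude are assumptions): envelopes of nine subarray stacks are multiplied, and a causal STA/LTA ratio is run on the product:

```python
import numpy as np
from scipy.signal import hilbert

def sta_lta(x, nsta, nlta):
    """Causal STA/LTA ratio computed with cumulative sums."""
    c = np.cumsum(x, dtype=float)
    sta = (c[nsta:] - c[:-nsta]) / nsta
    lta = (c[nlta:] - c[:-nlta]) / nlta
    n = min(sta.size, lta.size)                     # align both windows at the end
    return sta[-n:] / np.maximum(lta[-n:], 1e-12)

rng = np.random.default_rng(4)
fs, nsub, nsamp = 500, 9, 30_000
stacks = rng.normal(size=(nsub, nsamp))             # stand-ins for 9 subarray stacks
burst = np.sin(2 * np.pi * 20 * np.arange(250) / fs)
stacks[:, 15_000:15_250] += 1.5 * burst             # a weak event seen by all subarrays

# Multiply the nine envelopes to suppress sources seen by only some subarrays.
product = np.prod(np.abs(hilbert(stacks, axis=1)), axis=0)
ratio = sta_lta(product, nsta=fs // 2, nlta=10 * fs)
peak = np.argmax(ratio) + (nsamp - ratio.size)      # map back to a sample index
print(f"peak STA/LTA {ratio.max():.1f} near sample {peak}")
```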
Trigo, Jesús Daniel; Martínez, Ignacio; Alesanco, Alvaro; Kollmann, Alexander; Escayola, Javier; Hayn, Dieter; Schreier, Günter; García, José
2012-07-01
This paper investigates the application of the enterprise information system (EIS) paradigm to standardized cardiovascular condition monitoring. There are many specifications in cardiology, particularly in the ECG standardization arena. The existence of ECG formats, however, does not guarantee the implementation of homogeneous, standardized solutions for ECG management. In fact, hospital management services need to cope with various ECG formats and, moreover, several different visualization applications. This heterogeneity hampers the normalization of integrated, standardized healthcare information systems, hence the need for finding an appropriate combination of ECG formats and a suitable EIS-based software architecture that enables standardized exchange and homogeneous management of ECG formats. Determining such a combination is one objective of this paper. The second aim is to design and develop the integrated healthcare information system that satisfies the requirements posed by the previous determination. The ECG formats selected include ISO/IEEE11073, Standard Communications Protocol for Computer-Assisted Electrocardiography, and an ECG ontology. The EIS-enabling techniques and technologies selected include web services, simple object access protocol, extensible markup language, or business process execution language. Such a selection ensures the standardized exchange of ECGs within, or across, healthcare information systems while providing modularity and accessibility.
Knowledge Management Orientation: An Innovative Perspective to Hospital Management
GHASEMI, Matina; GHADIRI NEJAD, Mazyar; BAGZIBAGLI, Kemal
2017-01-01
Background: When innovation is considered as a new project in hospitals, all of the standard project management steps should be followed in its execution. This study investigated the validation of a new set of measures that provide a procedure for knowledge management-oriented innovation to enrich the hospital management system. Methods: The relation between innovation and all the knowledge management areas, as the main constructs of project management, was illustrated by reference to standard project management steps and previous studies. Through consultations and meetings with a committee of professional project managers, a questionnaire was developed to measure ten knowledge management areas in hospitals' innovation processes. Additionally, a group of expert hospital managers was invited to comment on the applicability of the questionnaire, considering whether the items are practically measurable in hospitals. Results: Close-ended, Likert-type scale items, consisting of ten sections, were developed based on the project management body of knowledge through the Delphi technique. They enable managers to evaluate a hospital's situation and determine whether the organization follows knowledge management standards in its innovation process. In a pilot study, confirmatory factor analysis and exploratory factor analysis were conducted to ensure the validity and reliability of the measurement items. Conclusion: The developed items have the potential to help hospital managers deliver new products/services successfully based on standard procedures in their organization. In all innovation processes, the knowledge management areas and their standard steps assist hospital managers through a new tool in questionnaire format. PMID:29259938
Modernized Techniques for Dealing with Quality Data and Derived Products
NASA Astrophysics Data System (ADS)
Neiswender, C.; Miller, S. P.; Clark, D.
2008-12-01
"I just want a picture of the ocean floor in this area" is expressed all too often by researchers, educators, and students in the marine geosciences. As more sophisticated systems are developed to handle data collection and processing, the demand for quality data, and standardized products continues to grow. Data management is an invisible bridge between science and researchers/educators. The SIOExplorer digital library presents more than 50 years of ocean-going research. Prior to publication, all data is checked for quality using standardized criterion developed for each data stream. Despite the evolution of data formats and processing systems, SIOExplorer continues to present derived products in well- established formats. Standardized products are published for each cruise, and include a cruise report, MGD77 merged data, multi-beam flipbook, and underway profiles. Creation of these products is made possible by processing scripts, which continue to change with ever-evolving data formats. We continue to explore the potential of database-enabled creation of standardized products, such as the metadata-rich MGD77 header file. Database-enabled, automated processing produces standards-compliant metadata for each data and derived product. Metadata facilitates discovery and interpretation of published products. This descriptive information is stored both in an ASCII file, and a searchable digital library database. SIOExplorer's underlying technology allows focused search and retrieval of data and products. For example, users can initiate a search of only multi-beam data, which includes data-specific parameters. This customization is made possible with a synthesis of database, XML, and PHP technology. The combination of standardized products and digital library technology puts quality data and derived products in the hands of scientists. Interoperable systems enable distribution these published resources using technology such as web services. By developing modernized strategies to deal with data, Scripps Institution of Oceanography is able to produce and distribute well-formed, and quality-tested derived products, which aid research, understanding, and education.
All-Digital Baseband 65nm PLL/FPLL Clock Multiplier using 10-cell Library
NASA Technical Reports Server (NTRS)
Shuler, Robert L., Jr.; Wu, Qiong; Liu, Rui; Chen, Li; Madala, Shridhar
2014-01-01
PLLs for clock generation are essential for modern circuits, to generate specialized frequencies for many interfaces and high frequencies for chip internal operation. These circuits depend on analog circuits and careful tailoring for each new process, and making them fault tolerant is an incompletely solved problem. Until now, all digital PLLs have been restricted to sampled data DSP techniques and not available for the highest frequency baseband applications. This paper presents the design and preliminary evaluation of an all-digital baseband technique built entirely with an easily portable 10-cell digital library. The library is also described, as it aids in research and low volume design porting to new processes. The advantages of the digital approach are the wide variety of techniques available to give varying degrees of fault tolerance, and the simplicity of porting the design to new processes, even to exotic processes that may not have analog capability. The only tuning parameter is digital gate delay. An all-digital approach presents unique problems and standard analog loop stability design criteria cannot be directly used. Because of the quantization of frequency, there is effectively infinite gain for very small loop error feedback. The numerically controlled oscillator (NCO) based on a tapped delay line cannot be reliably updated while a pulse is active in the delay line, and ordinarily does not have enough frequency resolution for a low-jitter output.
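To make the frequency-quantization point concrete: if the oscillation period of a tapped-delay-line NCO must be an integer number of gate delays, adjacent tap counts give coarsely spaced frequencies. The numbers below are illustrative assumptions, not values from the paper:

```python
t_gate = 25e-12                        # assumed single-gate delay (illustrative)
for n_taps in (100, 101, 102):         # period is quantized to integer gate delays
    f = 1.0 / (n_taps * t_gate)
    print(f"{n_taps} taps -> {f / 1e9:.3f} GHz")
# Adjacent achievable frequencies differ by ~1%, so a frequency-locked loop sees
# effectively infinite gain for errors smaller than one quantization step.
```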
Neural systems and time course of proactive interference in working memory.
Du, Yingchun; Zhang, John X; Xiao, Zhuangwei; Wu, Renhua
2007-01-01
The storage of information in working memory suffers as a function of proactive interference. Many neuroimaging studies have investigated the brain mechanisms of interference resolution, but less is known about the time course of this process. The event-related potential (ERP) method and standardized low-resolution brain electromagnetic tomography (sLORETA) were used in this study to uncover the time course of interference resolution in working memory. The anterior P2 component was thought to reflect interference resolution; if so, this process occurs earlier in working memory than in long-term memory.
Liquid Argon TPC Signal Formation, Signal Processing and Hit Reconstruction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baller, Bruce
2017-03-11
This document describes the early stage of the reconstruction chain that was developed for the ArgoNeuT and MicroBooNE experiments at Fermilab. These experiments study accelerator neutrino interactions that occur in a Liquid Argon Time Projection Chamber. Reconstructing the properties of particles produced in these interactions requires knowledge of the micro-physics processes that affect the creation and transport of ionization electrons to the readout system. A wire signal deconvolution technique was developed to convert wire signals to a standard form for hit reconstruction, to remove artifacts in the electronics chain and to remove coherent noise.
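The wire-response deconvolution can be sketched generically in the frequency domain; the following is a toy Wiener-style example (response shape, noise level, and regularization constant are all assumptions, not the ArgoNeuT/MicroBooNE code):

```python
import numpy as np

def wiener_deconvolve(measured, response, noise_power=1e-2):
    """Frequency-domain deconvolution with a Wiener-style regularizer."""
    n = len(measured)
    M = np.fft.rfft(measured)
    R = np.fft.rfft(response, n)
    filt = np.conj(R) / (np.abs(R) ** 2 + noise_power)   # avoids dividing by ~0
    return np.fft.irfft(M * filt, n)

rng = np.random.default_rng(5)
true = np.zeros(2048); true[700] = 1.0; true[1100] = 0.6        # ionization arrivals
resp = np.exp(-np.arange(200) / 25.0) * np.sin(np.arange(200) / 10.0)  # toy response
measured = np.convolve(true, resp)[:2048] + rng.normal(scale=0.01, size=2048)

recovered = wiener_deconvolve(measured, resp)
print("strongest arrival recovered near sample", int(np.argmax(recovered)))
# A second, smaller peak appears near sample 1100.
```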
Techniques for video compression
NASA Technical Reports Server (NTRS)
Wu, Chwan-Hwa
1995-01-01
In this report, we present our study on the multiprocessor implementation of an MPEG2 encoding algorithm. First, we compare two approaches to implementing video standards, VLSI technology and multiprocessor processing, in terms of design complexity, applications, and cost. Then we evaluate the functional modules of the MPEG2 encoding process in terms of their computation time. Two crucial modules are identified based on this evaluation. We then present our experimental study on the multiprocessor implementation of the two crucial modules. Data partitioning is used for job assignment. Experimental results show that high speedup ratios and good scalability can be achieved by using this kind of job assignment strategy.
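A minimal sketch of the data-partitioning job assignment the report describes: a frame is split into row bands, and a stand-in for a crucial module (here a toy sum-of-absolute-differences cost, not the actual MPEG2 kernels) runs on each partition in parallel:

```python
import numpy as np
from multiprocessing import Pool

def encode_slice(frame_rows):
    """Stand-in for a crucial module (e.g., motion estimation) on one partition."""
    # Toy workload: sum of absolute differences against a shifted copy.
    return float(np.abs(frame_rows[:, 1:] - frame_rows[:, :-1]).sum())

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    frame = rng.integers(0, 256, size=(480, 720)).astype(np.int16)
    parts = np.array_split(frame, 4, axis=0)     # data partitioning: 4 row bands
    with Pool(4) as pool:                        # one worker per partition
        costs = pool.map(encode_slice, parts)
    print("per-partition cost:", [round(c) for c in costs])
```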
The exact fundamental solution for the Benes tracking problem
NASA Astrophysics Data System (ADS)
Balaji, Bhashyam
2009-05-01
The universal continuous-discrete tracking problem requires the solution of a Fokker-Planck-Kolmogorov forward equation (FPKfe) for an arbitrary initial condition. Using results from quantum mechanics, the exact fundamental solution for the FPKfe is derived for the state model of arbitrary dimension with Benes drift; it requires only the computation of elementary transcendental functions and standard linear algebra techniques, with no ordinary or partial differential equations to be solved. The measurement process may be an arbitrary, discrete-time nonlinear stochastic process, and the time step size can be arbitrary. Numerical examples are included, demonstrating its utility in practical implementation.
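For context, the scalar form of the FPKfe and the Benes drift can be stated as follows; this is the textbook statement, not an excerpt from the paper:

```latex
% FPK forward equation for the scalar diffusion dx_t = f(x_t)\,dt + \sigma\,dW_t:
\frac{\partial p(t,x)}{\partial t}
  = -\frac{\partial}{\partial x}\bigl[f(x)\,p(t,x)\bigr]
    + \frac{\sigma^{2}}{2}\,\frac{\partial^{2} p(t,x)}{\partial x^{2}},
\qquad f(x) = \tanh(x).
% The Benes drift satisfies f'(x) + f(x)^2 = 1, the quadratic condition that
% allows the fundamental solution to be written in elementary functions.
```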
García, Patricia; Balcells, M Elvira; Castillo, Claudia; Miranda, Carolina; Geoffroy, Enrique; Román, Juan C; Wozniak, Aniela
2017-08-01
Extra-pulmonary tuberculosis (TB) represents 26.2% of total TB cases in Chile. Culture is the gold standard method, but the process is extremely slow. The Xpert®MTB/RIF technique detects the Mycobacterium tuberculosis complex (MTBc) through real-time PCR in less than 3 h. However, it has been validated only for respiratory specimens. We aimed to determine the performance of the Xpert®MTB/RIF test in detecting MTBc in extra-respiratory specimens compared with a combined gold standard consisting of a positive (liquid and solid) mycobacterial culture and/or a positive validated molecular method (q-PCR, Cobas®TaqMan®-MTB). Fifty extra-respiratory specimens were analyzed, of which 25 were positive and 25 negative for MTBc based on the combined gold standard. All 25 positive specimens had a positive result by Xpert®MTB/RIF; of the 25 negative specimens, 24 had a negative result and one had a positive result. We obtained an overall concordance of 98% between Xpert®MTB/RIF and the combined gold standard. The Xpert®MTB/RIF test was able to detect 12 smear-negative specimens and 3 culture-negative specimens, all of them corresponding to extra-pulmonary TB cases. Xpert®MTB/RIF showed sensitivity similar to that of q-PCR in detecting MTBc in extra-respiratory specimens. This procedure allowed a substantial reduction in time to diagnosis.
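As an arithmetic check, the performance figures follow directly from the reported 2x2 counts:

```python
tp, fn = 25, 0          # positives by the combined gold standard
tn, fp = 24, 1          # negatives by the combined gold standard
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
concordance = (tp + tn) / (tp + fn + tn + fp)
print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}, "
      f"overall concordance {concordance:.0%}")   # -> 100%, 96%, 98%
```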
Better infrastructure for critical care trials: nomenclature, etymology, and informatics.
Singh, Jeffrey M; Ferguson, Niall D
2009-01-01
The goals of this review article are to examine the importance and value of standardized definitions in clinical research, and to propose the tools and infrastructure needed to advance nosology and medical taxonomy so as to improve the quality of clinical trials in the field of critical care. We searched MEDLINE for relevant articles, reviewed those selected and their reference lists, and consulted personal files for relevant information. When the pathobiology of a disease is well understood, standard disease definitions can be extremely specific and precise; however, when the pathobiology is less well understood or more complex, biological markers may not be diagnostically useful or even available. In these cases, syndromic definitions effectively classify and group illnesses with similar symptoms and clinical signs. There is no clear gold standard for the diagnosis of many clinical entities in the intensive care unit, notably including both acute respiratory distress syndrome and sepsis. Several types of consensus methods can be used to make explicit the judgmental approach that is often needed in these cases, including interactive or consensus groups, the nominal group technique, and the Delphi technique. Ideally, the definition development process will create clear and unambiguous language in which each definition accurately reflects the current understanding of the disease state. The development, implementation, evaluation, revision, and reevaluation of standardized definitions are key to advancing the quality of clinical trials in the critical care arena.
Garbarino, J.R.; Taylor, Howard E.
1996-01-01
An inductively coupled plasma-mass spectrometry method was developed for the determination of dissolved Al, As, B, Ba, Be, Cd, Co, Cr, Cu, Li, Mn, Mo, Ni, Pb, Sr, Tl, U, V, and Zn in natural waters. Detection limits are generally in the 50-100 picogram per milliliter (pg/mL) range, with the exception of As, which is in the 1 microgram per liter (µg/L) range. Interferences associated with spectral overlap from concomitant isotopes or molecular ions and with sample matrix composition have been identified. Procedures for interference correction and reduction related to isotope selection, instrumental operating conditions, and mathematical data processing techniques are described. Internal standards are used to minimize instrumental drift. The average analytical precision attainable at 5 times the detection limit is about 16 percent. The accuracy of the method was tested using a series of U.S. Geological Survey Standard Reference Water Samples (SRWS), the National Research Council Canada Riverine Water Standard, and National Institute of Standards and Technology (NIST) Trace Elements in Water standards. Average accuracies range from 90 to 110 percent of the published mean values.
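The internal-standard drift correction mentioned above works by ratioing; a minimal sketch with invented count rates (the drift largely cancels in the analyte-to-internal-standard ratio):

```python
import numpy as np

# Hypothetical count rates over a long, drifting run.
analyte = np.array([9800.0, 9400.0, 8900.0])      # analyte counts/s
internal = np.array([49000.0, 47000.0, 44500.0])  # internal-standard counts/s
cal_ratio, cal_conc = 0.201, 1.0                  # one-point calibration (ratio, µg/L)

conc = (analyte / internal) / cal_ratio * cal_conc
print(conc.round(3))   # nearly constant: the instrumental drift cancels in the ratio
```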
Emwas, Abdul-Hamid; Luchinat, Claudio; Turano, Paola; Tenori, Leonardo; Roy, Raja; Salek, Reza M; Ryan, Danielle; Merzaban, Jasmeen S; Kaddurah-Daouk, Rima; Zeri, Ana Carolina; Nagana Gowda, G A; Raftery, Daniel; Wang, Yulan; Brennan, Lorraine; Wishart, David S
The metabolic composition of human biofluids can provide important diagnostic and prognostic information. Among the biofluids most commonly analyzed in metabolomic studies, urine appears to be particularly useful. It is abundant, readily available, easily stored and can be collected by simple, noninvasive techniques. Moreover, given its chemical complexity, urine is particularly rich in potential disease biomarkers. This makes it an ideal biofluid for detecting or monitoring disease processes. Among the metabolomic tools available for urine analysis, NMR spectroscopy has proven to be particularly well-suited, because the technique is highly reproducible and requires minimal sample handling. As it permits the identification and quantification of a wide range of compounds, independent of their chemical properties, NMR spectroscopy has been frequently used to detect or discover disease fingerprints and biomarkers in urine. Although protocols for NMR data acquisition and processing have been standardized, no consensus on protocols for urine sample selection, collection, storage and preparation in NMR-based metabolomic studies have been developed. This lack of consensus may be leading to spurious biomarkers being reported and may account for a general lack of reproducibility between laboratories. Here, we review a large number of published studies on NMR-based urine metabolic profiling with the aim of identifying key variables that may affect the results of metabolomics studies. From this survey, we identify a number of issues that require either standardization or careful accounting in experimental design and provide some recommendations for urine collection, sample preparation and data acquisition.
A New Active Cavitation Mapping Technique for Pulsed HIFU Applications – Bubble Doppler
Li, Tong; Khokhlova, Tatiana; Sapozhnikov, Oleg; Hwang, Joo Ha; O’Donnell, Matthew
2015-01-01
In this work, a new active cavitation mapping technique for pulsed high-intensity focused ultrasound (pHIFU) applications, termed bubble Doppler, is proposed and its feasibility tested in tissue-mimicking gel phantoms. pHIFU therapy uses short pulses, delivered at low pulse repetition frequency, to cause transient bubble activity that has been shown to enhance drug and gene delivery to tissues. The current gold standard for detecting and monitoring cavitation activity during pHIFU treatments is passive cavitation detection (PCD), which provides minimal information on the spatial distribution of the bubbles. B-mode imaging can detect hyperecho formation, but has very limited sensitivity, especially to small, transient microbubbles. The bubble Doppler method proposed here fuses adaptations of three Doppler techniques previously developed for imaging of ultrasound contrast agents: color Doppler, pulse inversion Doppler, and decorrelation Doppler. Doppler ensemble pulses were interleaved with therapeutic pHIFU pulses using three different pulse sequences, and standard Doppler processing was applied to the received echoes. The information yielded by each technique on the distribution and characteristics of pHIFU-induced cavitation bubbles was evaluated separately and found to be complementary. The unified approach, bubble Doppler, was then proposed both to spatially map the presence of transient bubbles and to estimate their sizes and the degree of nonlinearity. PMID:25265178
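A minimal sketch of two of the fused ingredients on synthetic slow-time IQ data (ensemble size, signal model, and bubble amplitude are assumptions): a Kasai-style lag-one autocorrelation gives a color-Doppler phase estimate, and its normalized magnitude gives a decorrelation measure:

```python
import numpy as np

rng = np.random.default_rng(7)
n_ens, n_gates = 8, 256                   # Doppler ensemble x range gates (synthetic IQ)
k = np.arange(n_ens)[:, None]
tissue = np.exp(1j * 0.05 * k) * np.ones((1, n_gates))   # slow, correlated motion
bubbles = rng.normal(size=(n_ens, n_gates)) + 1j * rng.normal(size=(n_ens, n_gates))
iq = tissue + 0.8 * bubbles               # transient bubbles decorrelate the ensemble

# Kasai-style lag-one autocorrelation along the ensemble (slow-time) axis.
r1 = (iq[1:] * np.conj(iq[:-1])).mean(axis=0)
phase = np.angle(r1)                       # color-Doppler phase shift (rad/pulse)
norm = (np.abs(iq[1:]) * np.abs(iq[:-1])).mean(axis=0)
decorr = 1.0 - np.abs(r1) / norm           # ~0 for tissue alone, larger with bubbles
print(f"mean phase {phase.mean():.3f} rad/pulse, mean decorrelation {decorr.mean():.2f}")
```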