Coring Sample Acquisition Tool
NASA Technical Reports Server (NTRS)
Haddad, Nicolas E.; Murray, Saben D.; Walkemeyer, Phillip E.; Badescu, Mircea; Sherrit, Stewart; Bao, Xiaoqi; Kriechbaum, Kristopher L.; Richardson, Megan; Klein, Kerry J.
2012-01-01
A sample acquisition tool (SAT) has been developed that can be used to autonomously drill and capture rock cores. The tool is designed to accommodate core transfer, using a sample tube, to the IMSAH (integrated Mars sample acquisition and handling) SHEC (sample handling, encapsulation, and containerization) subsystem without ever touching the pristine core sample during the transfer process.
Collecting cometary soil samples? Development of the ROSETTA sample acquisition system
NASA Technical Reports Server (NTRS)
Coste, P. A.; Fenzi, M.; Eiden, Michael
1993-01-01
In the reference scenario of the ROSETTA CNSR (Comet Nucleus Sample Return) mission, the Sample Acquisition System is mounted on the Comet Lander. Its tasks are to acquire three kinds of cometary samples and to transfer them to the Earth Return Capsule. Operations are to be performed in vacuum and microgravity, on a probably rough and dusty surface, in a largely unknown material, at temperatures on the order of 100 K. The concept and operation of the Sample Acquisition System are presented. The design of the prototype corer and surface sampling tool, and of the equipment for testing them at cryogenic temperatures in ambient conditions and in vacuum in various materials representing cometary soil, are described. Results of recent preliminary tests performed in low-temperature thermal vacuum in a cometary analog ice-dust mixture are provided.
Neutron Tomography at the Los Alamos Neutron Science Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myers, William Riley
Neutron imaging is an incredibly powerful tool for non-destructive sample characterization and materials science. Neutron tomography is one technique that results in a three-dimensional model of the sample, representing the interaction of the neutrons with the sample. This relies both on reliable data acquisition and on image processing after acquisition. Over the course of the project, the focus has changed from the former to the latter, culminating in a large-scale reconstruction of a meter-long fossilized skull. The full reconstruction is not yet complete, though tools have been developed to improve the speed and accuracy of the reconstruction. This project helps to improve the capabilities of LANSCE and LANL with regard to imaging large or unwieldy objects.
Device Acquires and Retains Rock or Ice Samples
NASA Technical Reports Server (NTRS)
Giersch, Louis R.; Backes, Paul G.
2009-01-01
The Rock Baller is a sample acquisition tool that improves sample retention. The basic elements of the Rock Baller are the tool rotation axis, the hub, the two jaws, and the cutting blades, which are located on each of the jaws. The entire device rotates about the tool rotation axis, which is aligned parallel to the nominal normal direction of the parent rock surface. Both jaws also rotate about the jaw axis, which is perpendicular to the tool rotation axis, at a rate much slower than the rotation about the tool rotation axis. This movement gradually closes the jaws into a nearly continuous hemispherical shell that encloses the sample as it is cut from the parent rock. When required, the jaws are opened to release the sample. The hemispherical cutting method eliminates the sample retention problems associated with existing sample acquisition methods that employ conventional cylindrical cutting. The resulting samples are hemispherical, or nearly hemispherical, and, as a result, the aspect ratio (sample depth relative to sample radius) is essentially fixed. This fixed sample aspect ratio may be considered a drawback of the Rock Baller method, as samples with a higher aspect ratio (more depth, less width) may be considered more scientifically valuable because such samples would allow for a broader inspection of the geological record. This aspect-ratio issue can be ameliorated if the Rock Baller is paired with a device similar to the Rock Abrasion Tool (RAT) used on the Mars Exploration Rovers. The RAT could be used to first grind into the surface of the parent rock, after which the Rock Baller would extract a sample from a depth inside the rock that would not have been possible without first using the RAT. Other potential applications for this technology include medical applications such as the removal of tissue samples or tumors from the body, particularly during endoscopic, laparoscopic, or thoracoscopic surgeries.
A multicenter study benchmarks software tools for label-free proteome quantification.
Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan
2016-11-01
Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
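The evaluation logic behind such hybrid-proteome benchmarks is simple to state: for each species, the observed log-ratios between the two samples are compared against the ratios fixed by the mixing design. A minimal sketch of that metric computation in Python (the table values and expected ratios below are illustrative assumptions, not the LFQbench implementation):

```python
import numpy as np
import pandas as pd

# Hypothetical quantification table: species label and protein intensity
# in the two hybrid samples A and B (values are made up for illustration).
df = pd.DataFrame({
    "species": ["human", "human", "yeast", "yeast", "ecoli", "ecoli"],
    "A": [1.0e6, 5.0e5, 4.0e5, 3.0e5, 1.0e5, 2.0e5],
    "B": [1.05e6, 4.8e5, 1.9e5, 1.6e5, 4.1e5, 7.8e5],
})

# Expected log2(A/B) per species, fixed by the mixing design
# (these particular values are assumptions, not the study's design).
expected = {"human": 0.0, "yeast": 1.0, "ecoli": -2.0}

df["log2_ratio"] = np.log2(df["A"] / df["B"])
for sp, grp in df.groupby("species"):
    bias = grp["log2_ratio"].median() - expected[sp]  # accuracy: deviation from truth
    spread = grp["log2_ratio"].std()                  # precision: ratio dispersion
    print(f"{sp}: median log2-ratio error = {bias:+.2f}, sd = {spread:.2f}")
```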
Kuharev, Jörg; Navarro, Pedro; Distler, Ute; Jahn, Olaf; Tenzer, Stefan
2015-09-01
Label-free quantification (LFQ) based on data-independent acquisition workflows is currently growing in popularity. Several software tools have recently been published or are commercially available. The present study focuses on the evaluation of three different software packages (Progenesis, synapter, and ISOQuant) supporting ion-mobility-enhanced data-independent acquisition data. In order to benchmark the LFQ performance of the different tools, we generated two hybrid proteome samples of defined quantitative composition containing tryptically digested proteomes of three different species (mouse, yeast, Escherichia coli). This model dataset simulates complex biological samples containing large numbers of both unregulated (background) proteins and up- and downregulated proteins with exactly known ratios between samples. We determined the number and dynamic range of quantifiable proteins and analyzed the influence of applied algorithms (retention time alignment, clustering, normalization, etc.) on quantification results. Analysis of technical reproducibility revealed median coefficients of variation of reported protein abundances below 5% for MS(E) data for Progenesis and ISOQuant. Regarding accuracy of LFQ, evaluation with synapter and ISOQuant yielded superior results compared to Progenesis. In addition, we discuss reporting formats and user friendliness of the software packages. The data generated in this study have been deposited to the ProteomeXchange Consortium with identifier PXD001240 (http://proteomecentral.proteomexchange.org/dataset/PXD001240). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Crosscutting Development- EVA Tools and Geology Sample Acquisition
NASA Technical Reports Server (NTRS)
2011-01-01
Exploration to all destinations has at one time or another involved the acquisition and return of samples and context data. Gathered at the summit of the highest mountain, the floor of the deepest sea, or the ice of a polar surface, samples and their value (both scientific and symbolic) have been a mainstay of Earthly exploration. In manned spaceflight exploration, the gathering of samples and their contextual information has continued. With the extension of collecting activities to spaceflight destinations comes the need for geology tools and equipment uniquely designed for use by suited crew members in radically different environments from conventional field geology. Beginning with the first Apollo Lunar Surface Extravehicular Activity (EVA), EVA Geology Tools were successfully used to enable the exploration and scientific sample gathering objectives of the lunar crew members. These early designs were a step in the evolution of Field Geology equipment, and the evolution continues today. Contemporary efforts seek to build upon and extend the knowledge gained in not only the Apollo program but a wealth of terrestrial field geology methods and hardware that have continued to evolve since the last lunar surface EVA. This paper is presented with intentional focus on documenting the continuing evolution and growing body of knowledge for both engineering and science team members seeking to further the development of EVA Geology. Recent engineering development and field testing efforts of EVA Geology equipment for surface EVA applications are presented, including the 2010 Desert Research and Technology Studies (Desert RATs) field trial. An executive summary of findings will also be presented, detailing efforts recommended for exotic sample acquisition and pre-return curation development regardless of planetary or microgravity destination.
Non-destructive sampling of a comet
NASA Astrophysics Data System (ADS)
Jessberger, H. L.; Kotthaus, M.
1991-04-01
Various conditions which must be met for the development of a nondestructive sampling and acquisition system are outlined and the development of a new robotic sampling system suited for use on a cometary surface is briefly discussed. The Rosetta mission of ESA will take samples of a comet nucleus and return both core and volatile samples to earth. Various considerations which must be taken into account for such a project are examined including the identification of design parameters for sample quality; the identification of the most probable site conditions; the development of a sample acquisition system with respect to these conditions; the production of model materials and model conditions; and the investigation of the relevant material properties. An adequate sampling system should also be designed and built, including various tools, and the system should be tested under simulated cometary conditions.
Planetary Sample Caching System Design Options
NASA Technical Reports Server (NTRS)
Collins, Curtis; Younse, Paulo; Backes, Paul
2009-01-01
Potential Mars Sample Return missions would aspire to collect small core and regolith samples using a rover with a sample acquisition tool and sample caching system. Samples would need to be stored in individual sealed tubes in a canister that could be transferred to a Mars ascent vehicle and returned to Earth. A sample handling, encapsulation and containerization system (SHEC) has been developed as part of an integrated system for acquiring and storing core samples for application to future potential MSR and other potential sample return missions. Requirements and design options for the SHEC system were studied and a recommended design concept developed. Two families of solutions were explored: 1) transfer of a raw sample from the tool to the SHEC subsystem and 2) transfer of a tube containing the sample to the SHEC subsystem. The recommended design utilizes sample tool bit changeout as the mechanism for transferring tubes to, and samples in tubes from, the tool. The SHEC subsystem design, called the Bit Changeout Caching (BiCC) design, is intended for operations on a MER-class rover.
Lightweight Low Force Rotary Percussive Coring Tool for Planetary Applications
NASA Technical Reports Server (NTRS)
Hironaka, Ross; Stanley, Scott
2010-01-01
A prototype low-force rotary-percussive rock coring tool for use in acquiring samples for geological surveys in future planetary missions was developed. The coring tool could eventually enable a lightweight robotic system to operate from a relatively small (less than 200 kg) mobile or fixed platform to acquire and cache Mars or other planetary rock samples for eventual return to Earth for analysis. To gain the insight needed to design an integrated coring tool, the coring ability of commercially available coring bits was evaluated for the effectiveness of varying key parameters: weight-on-bit, rotation speed, and percussive rate and force. Trade studies were performed for different methods of breaking a core at its base and for retaining the core in a sleeve to facilitate sample transfer. This led to a custom coring tool design that incorporated coring, core breakage, core retention, and core extraction functions. The coring tool was tested on several types of rock and demonstrated the overall feasibility of this approach for robotic rock sample acquisition.
RTSPM: real-time Linux control software for scanning probe microscopy.
Chandrasekhar, V; Mehta, M M
2013-01-01
Real-time computer control is an essential feature of scanning probe microscopes, which have become important tools for the characterization and investigation of nanometer-scale samples. Most commercial (and some open-source) scanning probe data acquisition software uses digital signal processors to handle the real-time data processing and control, which adds to the expense and complexity of the control software. We describe here scan control software that uses a single computer and a data acquisition card to acquire scan data. The computer runs an open-source real-time Linux kernel, which permits fast acquisition and control while maintaining a responsive graphical user interface. Images from a simulated tuning-fork-based microscope as well as a standard topographical sample are also presented, showing some of the capabilities of the software.
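At its core, such scan software generates a raster of tip positions, drives the outputs at each point, and reads back the sample signal. A schematic sketch of that loop in Python (the hardware callbacks here are stand-ins, since the real DAQ calls are card-specific; this is not the RTSPM code):

```python
import numpy as np

def raster_scan(nx, ny, set_position, read_sample):
    """Serpentine raster over an nx-by-ny grid; returns the acquired image."""
    image = np.zeros((ny, nx))
    for j in range(ny):
        cols = range(nx) if j % 2 == 0 else range(nx - 1, -1, -1)  # serpentine rows
        for i in cols:
            set_position(i, j)           # would write scan voltages to the DAQ card
            image[j, i] = read_sample()  # would read the topography/feedback signal
    return image

# Stand-in "hardware" for demonstration: a smooth synthetic surface.
state = {"x": 0, "y": 0}
def set_position(i, j): state["x"], state["y"] = i, j
def read_sample(): return np.sin(state["x"] / 5.0) * np.cos(state["y"] / 7.0)

image = raster_scan(64, 64, set_position, read_sample)
print(image.shape)  # (64, 64)
```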
Quantitative nanoscopy: Tackling sampling limitations in (S)TEM imaging of polymers and composites.
Gnanasekaran, Karthikeyan; Snel, Roderick; de With, Gijsbertus; Friedrich, Heiner
2016-01-01
Sampling limitations in electron microscopy raise the question of whether the analysis of a bulk material is representative, especially when analyzing hierarchical morphologies that extend over multiple length scales. We tackled this problem by automatically acquiring a large series of partially overlapping (S)TEM images with sufficient resolution, subsequently stitched together to generate a large-area map using an in-house developed acquisition toolbox (TU/e Acquisition ToolBox) and stitching module (TU/e Stitcher). In addition, we show that quantitative image analysis of the large-scale maps provides representative information that can be related to the synthesis and process conditions of hierarchical materials, which moves electron microscopy analysis towards becoming a bulk characterization tool. We demonstrate the power of such an analysis by examining two different multi-phase materials that are structured over multiple length scales. Copyright © 2015 Elsevier B.V. All rights reserved.
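Stitching partially overlapping tiles of this kind rests on estimating the translation between neighboring images, for example by phase correlation. A minimal NumPy sketch of that step (a generic illustration, not the TU/e Stitcher code):

```python
import numpy as np

def phase_correlation_offset(img_a, img_b):
    """Estimate the integer (dy, dx) shift of img_a relative to img_b."""
    Fa, Fb = np.fft.fft2(img_a), np.fft.fft2(img_b)
    cross = Fa * np.conj(Fb)
    cross /= np.abs(cross) + 1e-12              # normalize -> phase correlation
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > img_a.shape[0] // 2:                # wrap large indices to negative shifts
        dy -= img_a.shape[0]
    if dx > img_a.shape[1] // 2:
        dx -= img_a.shape[1]
    return dy, dx

# Demo: shift a random "tile" by a known amount and recover the offset.
rng = np.random.default_rng(0)
tile = rng.random((256, 256))
shifted = np.roll(tile, shift=(17, -23), axis=(0, 1))
print(phase_correlation_offset(shifted, tile))  # -> (17, -23)
```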
MemAxes Visualization Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardware advancements such as Intel's PEBS and AMD's IBS, as well as software developments such as the perf_event API in Linux have made available the acquisition of memory access samples with performance information. MemAxes is a visualization and analysis tool for memory access sample data. By mapping the samples to their associated code, variables, node topology, and application dataset, MemAxes provides intuitive views of the data.
MetaBar - a tool for consistent contextual data acquisition and standards compliant submission.
Hankeln, Wolfgang; Buttigieg, Pier Luigi; Fink, Dennis; Kottmann, Renzo; Yilmaz, Pelin; Glöckner, Frank Oliver
2010-06-30
Environmental sequence datasets are increasing at an exponential rate; however, the vast majority of them lack appropriate descriptors like sampling location, time and depth/altitude, generally referred to as metadata or contextual data. The consistent capture and structured submission of these data is crucial for integrated data analysis and ecosystems modeling. The application MetaBar has been developed to support consistent contextual data acquisition. MetaBar is a spreadsheet- and web-based software tool designed to assist users in the consistent acquisition, electronic storage, and submission of contextual data associated with their samples. A preconfigured Microsoft Excel spreadsheet is used to initiate structured contextual data storage in the field or laboratory. Each sample is given a unique identifier, and at any stage the sheets can be uploaded to the MetaBar database server. To label samples, identifiers can be printed as barcodes. An intuitive web interface provides quick access to the contextual data in the MetaBar database as well as user and project management capabilities. Export functions facilitate contextual and sequence data submission to the International Nucleotide Sequence Database Collaboration (INSDC), comprising the DNA Data Bank of Japan (DDBJ), the European Molecular Biology Laboratory database (EMBL) and GenBank. MetaBar requests and stores contextual data in compliance with the Genomic Standards Consortium specifications. The MetaBar open source code base for local installation is available under the GNU General Public License version 3 (GNU GPL3). The MetaBar software supports the typical workflow from data acquisition and field-sampling to contextual-data-enriched sequence submission to an INSDC database. The integration with the megx.net marine Ecological Genomics database and portal facilitates georeferenced data integration and metadata-based comparisons of sampling sites as well as interactive data visualization. The ample export functionalities and the INSDC submission support enable the exchange of data across disciplines and safeguard contextual data.
Rockballer Sample Acquisition Tool
NASA Technical Reports Server (NTRS)
Giersch, Louis R.; Cook, Brant T.
2013-01-01
It would be desirable to acquire rock and/or ice samples that extend below the surface of the parent rock or ice in extraterrestrial environments such as the Moon, Mars, comets, and asteroids. Such samples would allow measurements to be made further back into the geologic history of the rock, providing critical insight into the history of the local environment and the solar system. Such samples could also be necessary for sample return mission architectures that would acquire samples from extraterrestrial environments for return to Earth for more detailed scientific investigation.
Improving Ramsey spectroscopy in the extreme-ultraviolet region with a random-sampling approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eramo, R.; Bellini, M.; European Laboratory for Non-linear Spectroscopy
2011-04-15
Ramsey-like techniques, based on the coherent excitation of a sample by delayed and phase-correlated pulses, are promising tools for high-precision spectroscopic tests of QED in the extreme-ultraviolet (xuv) spectral region, but currently suffer experimental limitations related to long acquisition times and critical stability issues. Here we propose a random subsampling approach to Ramsey spectroscopy that, by allowing experimentalists to reach a given spectral resolution goal in a fraction of the usual acquisition time, leads to substantial improvements in high-resolution spectroscopy and may open the way to a widespread application of Ramsey-like techniques to precision measurements in the xuv spectral region.
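The benefit of random subsampling can be seen in a toy reconstruction: the fringe frequency of a Ramsey-like signal is recovered from a small random subset of delay points via a least-squares periodogram (an illustration of the principle, not the authors' analysis; all values are made up):

```python
import numpy as np

rng = np.random.default_rng(1)
f0 = 0.37                          # true fringe frequency (per delay step)
delays = np.arange(4096)           # a full, regular Ramsey delay scan
signal = 1 + np.cos(2 * np.pi * f0 * delays)

# Measure only a 5% random subset of the delay points.
keep = rng.choice(delays.size, size=delays.size // 20, replace=False)
t, y = delays[keep], signal[keep] - signal[keep].mean()

# Least-squares periodogram on the irregularly sampled subset.
freqs = np.linspace(0.30, 0.45, 2001)
power = [abs(np.sum(y * np.exp(-2j * np.pi * f * t))) ** 2 for f in freqs]
print(f"recovered fringe frequency: {freqs[np.argmax(power)]:.4f}")  # ~0.3700
```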
NASA Technical Reports Server (NTRS)
Hudson, Nicolas; Lin, Ying; Barengoltz, Jack
2010-01-01
A method for evaluating the probability of a Viable Earth Microorganism (VEM) contaminating a sample during the sample acquisition and handling (SAH) process of a potential future Mars Sample Return mission is developed. A scenario where multiple core samples would be acquired using a rotary percussive coring tool, deployed from an arm on a MER-class rover, is analyzed. The analysis is conducted in a structured way by decomposing the sample acquisition and handling process into a series of discrete time steps, and breaking the physical system into a set of relevant components. At each discrete time step, two key functions are defined: the probability of a VEM being released from each component, and the transport matrix, which represents the probability of VEM transport from one component to another. By defining the expected number of VEMs on each component at the start of the sampling process, these decompositions allow the expected number of VEMs on each component at each sampling step to be represented as a Markov chain. This formalism provides a rigorous mathematical framework in which to analyze the probability of a VEM entering the sample chain, as well as making the analysis tractable by breaking the process down into small analyzable steps.
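The expected-count propagation described above reduces to a linear recursion over the discrete steps. A minimal sketch in Python (the component names, release probabilities, and transport matrix are all illustrative assumptions, not the mission analysis):

```python
import numpy as np

# Hypothetical components of the sample chain (illustrative, not the mission list).
components = ["coring_bit", "sample_tube", "rover_arm"]

# Expected number of VEMs on each component at the start of sampling
# (made-up starting bioburden values).
v = np.array([10.0, 1.0, 100.0])

# release[i]: probability that a VEM on component i is released during one step.
release = np.array([0.05, 0.01, 0.02])

# T[i, j]: probability that a VEM released from component i lands on component j.
# Rows need not sum to 1; the remainder is lost to the environment.
T = np.array([
    [0.0, 0.3, 0.1],
    [0.2, 0.0, 0.1],
    [0.1, 0.2, 0.0],
])

# Propagate expected counts through the discrete time steps of the SAH process.
for step in range(1, 6):
    released = release * v            # expected VEMs leaving each component
    v = v - released + released @ T   # redistribute per the transport matrix
    print(f"step {step}: expected VEMs = {np.round(v, 3)}")
```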
NASA Astrophysics Data System (ADS)
Tarmizi, H. B.; Daulay, M.; Muda, I.
2017-03-01
This study aims to test the aggregation of the economic growth of North Sumatra and the influence of the Tax on Acquisition of Land and Building on the Construction Cost Index in North Sumatra. This type of research is an explanatory survey with quantitative methods. The population and sample are districts in North Sumatra, observed as time-series and cross-sectional data. The analysis tool used is multiple regression. The results show that the economic growth of North Sumatra aggregates and that the Tax on Acquisition of Land and Building affects the Construction Cost Index.
NASA Astrophysics Data System (ADS)
Jandura, L.; Burke, K.; Kennedy, B.; Melko, J.; Okon, A.; Sunshine, D.
2009-12-01
The Sample Acquisition/Sample Processing and Handling (SA/SPaH) subsystem for the Mars Science Laboratory (MSL) is a rover-based sampling system scheduled to launch in 2011. The SA/SPaH consists of a powdering drill and a scooping, sieving, and portioning device mounted on a turret at the end of a robotic arm. Also on the turret is a dust removal tool for clearing the surface of scientific targets, and two science instruments mounted on vibration isolators. The SA/SPaH can acquire powder from rocks at depths of 20 to 50 mm and can also pick up loose regolith with its scoop. The acquired sample is sieved and portioned and delivered to one of two instruments inside the rover for analysis. The functionality of the system will be described along with the targets the system can acquire and the sample that can be delivered. (Figure: top view of the SA/SPaH on the rover.)
Comet sample acquisition for ROSETTA lander mission
NASA Astrophysics Data System (ADS)
Marchesi, M.; Campaci, R.; Magnani, P.; Mugnuolo, R.; Nista, A.; Olivier, A.; Re, E.
2001-09-01
ROSETTA/Lander is being developed through a combined effort of European countries, coordinated by German institutes. The commitment to such a challenging probe will provide a unique opportunity for in-situ analysis of a comet nucleus. The payload for coring, sampling and investigations of comet materials is called SD2 (Sampling Drilling and Distribution). The paper presents the drill/sampler tool and the sample transfer through modeling, design and testing phases. Expected drilling parameters are then compared with experimental data; limited torque consumption and axial thrust on the tool constrain the operation and determine the success of tests. The qualification campaign involved the structural part and related vibration test, the auger/bit parts and drilling test, and the coring mechanism with related sampling test. Mechanical check of specimen volume is also reported, with emphasis on the measurement procedure and on the mechanical unit. The drill tool and all parts of the transfer chain were tested in the hypothetical comet environment, characterized by frozen material at extremely low temperature and high vacuum (-160 °C, 10^-3 Pa).
NASA Technical Reports Server (NTRS)
Badescu, Mircea; Bonitz, Robert; Kulczycki, Erick; Aisen, Norman; Dandino, Charles M.; Cantrell, Brett S.; Gallagher, William; Shevin, Jesse; Ganino, Anthony; Haddad, Nicolas;
2013-01-01
The 2011 Decadal Survey for planetary science released by the National Research Council of the National Academies identified Comet Surface Sample Return (CSSR) as one of five high-priority potential New Frontiers-class missions in the next decade. The main objectives of the research described in this publication are: develop a concept for an end-to-end system for collecting and storing a comet sample to be returned to Earth; design, fabricate and test a prototype Dynamic Acquisition and Retrieval Tool (DART) capable of collecting a 500 cc sample in a canister and ejecting the canister at a predetermined speed; and identify a set of simulants with physical properties at room temperature that suitably match the physical properties of the comet surface as it would be sampled. We propose the use of a dart that would be launched from the spacecraft to impact and penetrate the comet surface. After collecting the sample, the sample canister would be ejected at a speed greater than the comet's escape velocity and captured by the spacecraft, packaged into a return capsule and returned to Earth. The dart would be composed of an inner tube or sample canister, an outer tube, a decelerator, a means of capturing and retaining the sample, and a mechanism to eject the canister with the sample for later rendezvous with the spacecraft. One of the significant unknowns is the physical properties of the comet surface. Based on new findings from the recent Deep Impact comet encounter mission, we have limited our search for sampling-material solutions to materials with 10 to 100 kPa shear strength in loose or consolidated form. As the possible range of comet surface temperatures is also significantly different from room temperature, and testing at conditions other than room temperature can become resource intensive, we sought sample simulants with physical properties at room temperature similar to the expected physical properties of the comet surface material. The chosen DART configuration, the efforts to identify a test simulant and the properties of these simulants, and the results of the preliminary testing are described in this paper.
Marchand, Jérémy; Martineau, Estelle; Guitton, Yann; Dervilly-Pinel, Gaud; Giraudeau, Patrick
2017-02-01
Multi-dimensional NMR is an appealing approach for dealing with the challenging complexity of biological samples in metabolomics. This article describes how spectroscopists have recently challenged their imagination in order to make 2D NMR a powerful tool for quantitative metabolomics, based on innovative pulse sequences combined with meticulous analytical chemistry approaches. Clever time-saving strategies have also been explored to make 2D NMR a high-throughput tool for metabolomics, relying on alternative data acquisition schemes such as ultrafast NMR. Currently, much work is aimed at drastically boosting the NMR sensitivity thanks to hyperpolarisation techniques, which have been used in combination with fast acquisition methods and could greatly expand the application potential of NMR metabolomics. Copyright © 2016 Elsevier Ltd. All rights reserved.
Raman spectroscopic analysis of real samples: Brazilian bauxite mineralogy
NASA Astrophysics Data System (ADS)
Faulstich, Fabiano Richard Leite; Castro, Harlem V.; de Oliveira, Luiz Fernando Cappa; Neumann, Reiner
2011-10-01
In this investigation, Raman spectroscopy with 1064 and 632.8 nm excitation was used to investigate real mineral samples of bauxite ore from mines of Northern Brazil, together with Raman mapping and X-ray diffraction. The obtained results clearly show that micro-Raman spectroscopy is a powerful tool for the identification of all the minerals usually found in bauxites: gibbsite, kaolinite, goethite, hematite, anatase and quartz. Bulk samples can also be analysed, and FT-Raman is more adequate due to its better signal-to-noise ratio and representativity, although it is not efficient for kaolinite. The identification of fingerprint vibrations for all the minerals allows the acquisition of Raman-based chemical maps, potentially powerful tools for process mineralogy applied to bauxite ores.
Image processing tools dedicated to quantification in 3D fluorescence microscopy
NASA Astrophysics Data System (ADS)
Dieterlen, A.; De Meyer, A.; Colicchio, B.; Le Calvez, S.; Haeberlé, O.; Jacquey, S.
2006-05-01
3-D optical fluorescence microscopy has become an efficient tool for the volume investigation of living biological samples. Developments in instrumentation have made it possible to beat the conventional Abbe limit. In any case, the recorded image can be described by the convolution equation between the original object and the Point Spread Function (PSF) of the acquisition system. Due to the finite resolution of the instrument, the original object is recorded with distortions and blurring, and contaminated by noise. As a result, relevant biological information cannot be extracted directly from raw data stacks. If the goal is 3-D quantitative analysis, then system characterization is mandatory to assess optimal performance of the instrument and to ensure the reproducibility of data acquisition. The PSF represents the properties of the image acquisition system; we have proposed the use of statistical tools and Zernike moments to describe a 3-D PSF system and to quantify the variation of the PSF. This first step toward standardization is helpful to define an acquisition protocol optimizing exploitation of the microscope depending on the studied biological sample. Before the extraction of geometrical information and/or the quantification of intensities, data restoration is mandatory. Reduction of out-of-focus light is carried out computationally by a deconvolution process. But other phenomena occur during acquisition, such as fluorescence photodegradation (known as "bleaching"), inducing an alteration of the information needed for restoration. Therefore, we have developed a protocol to pre-process data before the application of deconvolution algorithms. A large number of deconvolution methods have been described and are now available in commercial packages. One major difficulty in using this software is the user's choice of the "best" regularization parameters. We have found that automating the choice of the regularization level greatly improves the reliability of the measurements while also making the software easier to use. Furthermore, pre-filtering the images improves the stability of the deconvolution process and thereby the quality and repeatability of quantitative measurements. In the same way, PSF pre-filtering stabilizes the deconvolution process. We have shown that Zernike polynomials can be used to reconstruct experimental PSFs, preserving system characteristics and removing the noise contained in the PSF.
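The image-formation model referred to above is, in standard notation (a generic statement of the model, not instrument-specific):

\[ i(\mathbf{r}) = (o \otimes h)(\mathbf{r}) + n(\mathbf{r}), \qquad \mathbf{r} = (x, y, z), \]

where \(i\) is the recorded image stack, \(o\) the original object, \(h\) the 3-D PSF, \(\otimes\) denotes convolution, and \(n\) the noise term; deconvolution is the ill-posed inversion of this equation, which is why a regularization level must be chosen.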
Apollo Lunar Sample Integration into Google Moon: A New Approach to Digitization
NASA Technical Reports Server (NTRS)
Dawson, Melissa D.; Todd, Nancy S.; Lofgren, Gary E.
2011-01-01
The Google Moon Apollo Lunar Sample Data Integration project is part of a larger, LASER-funded, four-year lunar rock photo restoration project by NASA's Acquisition and Curation Office [1]. The objective of this project is to enhance the Apollo mission data already available on Google Moon with information about the lunar samples collected during the Apollo missions. To this end, we have combined rock sample data from various sources, including Curation databases, mission documentation and lunar sample catalogs, with newly available digital photography of rock samples to create a user-friendly, interactive tool for learning about the Apollo Moon samples.
Cleft audit protocol for speech (CAPS-A): a comprehensive training package for speech analysis.
Sell, D; John, A; Harding-Bell, A; Sweeney, T; Hegarty, F; Freeman, J
2009-01-01
The previous literature has largely focused on speech analysis systems and ignored process issues, such as the nature of adequate speech samples, data acquisition, recording and playback. Although there has been recognition of the need for training on tools used in speech analysis associated with cleft palate, little attention has been paid to this issue. To design, execute, and evaluate a training programme for speech and language therapists on the systematic and reliable use of the Cleft Audit Protocol for Speech-Augmented (CAPS-A), addressing issues of standardized speech samples, data acquisition, recording, playback, and listening guidelines. Thirty-six specialist speech and language therapists undertook the training programme over four days. This consisted of two days' training on the CAPS-A tool followed by a third day, making independent ratings and transcriptions on ten new cases which had been previously recorded during routine audit data collection. This task was repeated on day 4, a minimum of one month later. Ratings were made using the CAPS-A record form with the CAPS-A definition table. An analysis was made of the speech and language therapists' CAPS-A ratings at occasion 1 and occasion 2 and the intra- and inter-rater reliability calculated. Trained therapists showed consistency in individual judgements on specific sections of the tool. Intraclass correlation coefficients were calculated for each section with good agreement on eight of 13 sections. There were only fair levels of agreement on anterior oral cleft speech characteristics, non-cleft errors/immaturities and voice. This was explained, at least in part, by their low prevalence which affects the calculation of the intraclass correlation coefficient statistic. Speech and language therapists benefited from training on the CAPS-A, focusing on specific aspects of speech using definitions of parameters and scalar points, in order to apply the tool systematically and reliably. Ratings are enhanced by ensuring a high degree of attention to the nature of the data, standardizing the speech sample, data acquisition, the listening process together with the use of high-quality recording and playback equipment. In addition, a method is proposed for maintaining listening skills following training as part of an individual's continuing education.
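The agreement statistic reported here, the intraclass correlation coefficient, can be computed directly from the ratings matrix. A generic ICC(2,1) (two-way random effects, single rater) sketch in Python, not the study's own analysis script; the example ratings are fabricated:

```python
import numpy as np

def icc_2_1(ratings):
    """Two-way random, single-rater ICC(2,1); ratings: n_subjects x k_raters."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)          # per-subject means
    col_means = ratings.mean(axis=0)          # per-rater means
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_err = np.sum((ratings - grand) ** 2) - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                   # between-subjects mean square
    msc = ss_cols / (k - 1)                   # between-raters mean square
    mse = ss_err / ((n - 1) * (k - 1))        # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Example: 10 speech samples each rated twice (hypothetical scalar ratings).
rng = np.random.default_rng(42)
truth = rng.integers(0, 4, size=10).astype(float)
ratings = np.column_stack([truth + rng.normal(0, 0.3, 10),
                           truth + rng.normal(0, 0.3, 10)])
print(f"ICC(2,1) = {icc_2_1(ratings):.2f}")
```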
Development of a knowledge acquisition tool for an expert system flight status monitor
NASA Technical Reports Server (NTRS)
Disbrow, J. D.; Duke, E. L.; Regenie, V. A.
1986-01-01
Two of the main issues in artificial intelligence today are knowledge acquisition and knowledge representation. The Dryden Flight Research Facility of NASA's Ames Research Center is presently involved in the design and implementation of an expert system flight status monitor that will provide expertise and knowledge to aid the flight systems engineer in monitoring today's advanced high-performance aircraft. The flight status monitor can be divided into two sections: the expert system itself and the knowledge acquisition tool. The knowledge acquisition tool, the means it uses to extract knowledge from the domain expert, and how that knowledge is represented for computer use are discussed. An actual aircraft system has been codified by this tool with great success. Future real-time use of the expert system has been facilitated by using the knowledge acquisition tool to easily generate a logically consistent and complete knowledge base.
Meta-tools for software development and knowledge acquisition
NASA Technical Reports Server (NTRS)
Eriksson, Henrik; Musen, Mark A.
1992-01-01
The effectiveness of tools that provide support for software development is highly dependent on the match between the tools and their task. Knowledge-acquisition (KA) tools constitute a class of development tools targeted at knowledge-based systems. Generally, KA tools that are custom-tailored for particular application domains are more effective than general KA tools that cover a large class of domains. The high cost of custom-tailoring KA tools manually has encouraged researchers to develop meta-tools for KA tools. Current research issues in meta-tools for knowledge acquisition are the specification styles, or meta-views, used for target KA tools, and the relationships between the specification entered in the meta-tool and other specifications for the target program under development. We examine different types of meta-views and meta-tools. Our current project is to provide meta-tools that produce KA tools from multiple specification sources--for instance, from a task analysis of the target application.
NeuroPG: open source software for optical pattern generation and data acquisition
Avants, Benjamin W.; Murphy, Daniel B.; Dapello, Joel A.; Robinson, Jacob T.
2015-01-01
Patterned illumination using a digital micromirror device (DMD) is a powerful tool for optogenetics. Compared to a scanning laser, DMDs are inexpensive and can easily create complex illumination patterns. Combining these complex spatiotemporal illumination patterns with optogenetics allows DMD-equipped microscopes to probe neural circuits by selectively manipulating the activity of many individual cells or many subcellular regions at the same time. To use DMDs to study neural activity, scientists must develop specialized software to coordinate optical stimulation patterns with the acquisition of electrophysiological and fluorescence data. To meet this growing need we have developed an open-source optical pattern generation software for neuroscience, NeuroPG, that combines DMD control, sample visualization, and data acquisition in one application. Built on a MATLAB platform, NeuroPG can also process, analyze, and visualize data. The software is designed specifically for the Mightex Polygon400; however, as an open-source package, NeuroPG can be modified to incorporate any data acquisition, imaging, or illumination equipment that is compatible with MATLAB's Data Acquisition and Image Acquisition toolboxes. PMID:25784873
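Generating a stimulation pattern for a DMD ultimately amounts to building a binary micromirror mask. A language-agnostic sketch in Python (NeuroPG itself is MATLAB-based; the mirror-array dimensions and spot positions below are illustrative assumptions, not device specifications):

```python
import numpy as np

# Hypothetical micromirror array size; use the actual mirror count of your DMD.
H, W = 684, 608
pattern = np.zeros((H, W), dtype=np.uint8)

# Switch on two circular illumination spots, e.g. to target two cells at once.
yy, xx = np.mgrid[0:H, 0:W]
for cy, cx, r in [(200, 150, 20), (450, 400, 15)]:
    pattern[(yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2] = 1  # mirrors "on"

print(f"{int(pattern.sum())} mirrors on out of {pattern.size}")
```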
Borehole Tool for the Comprehensive Characterization of Hydrate-bearing Sediments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Sheng; Santamarina, J. Carlos
Reservoir characterization and simulation require reliable parameters to anticipate hydrate deposit responses and production rates. The acquisition of the required fundamental properties currently relies on wireline logging, pressure core testing, and/or laboratory observations of synthesized specimens, which are challenged by testing capabilities and innate sampling disturbances. The project reviews the properties of hydrate-bearing sediments and the inherent sampling effects, which are lessened by developments in pressure core technology, in order to develop robust correlations with index parameters. The resulting information is incorporated into a tool for optimal field characterization and parameter selection with uncertainty analyses. Ultimately, the project develops a borehole tool for the comprehensive characterization of hydrate-bearing sediments in situ, with a design that recognizes past developments and characterization experience and benefits from the inspiration of nature and sensor miniaturization.
Compact, Non-Pneumatic Rock-Powder Samplers
NASA Technical Reports Server (NTRS)
Sherrit, Stewart; Bar-Cohen, Yoseph; Badescu, Mircea; Bao, Xiaoqi; Chang, Zensheu; Jones, Christopher; Aldrich, Jack
2008-01-01
Tool bits that automatically collect powdered rock, permafrost, or other hard material generated by repeated hammering action have been invented. The present invention pertains to the special case in which it is desired to collect samples in powder form for analysis by x-ray diffraction and possibly other techniques. The present invention eliminates the need for both the mechanical collection equipment and crushing chamber of some prior approaches and the pneumatic collection equipment of others, so that it becomes possible to make the overall sample-acquisition apparatus more compact.
HTAPP: High-Throughput Autonomous Proteomic Pipeline
Yu, Kebing; Salomon, Arthur R.
2011-01-01
Recent advances in the speed and sensitivity of mass spectrometers and in analytical methods, the exponential acceleration of computer processing speeds, and the availability of genomic databases from an array of species and protein information databases have led to a deluge of proteomic data. The development of a lab-based automated proteomic software platform for the automated collection, processing, storage, and visualization of expansive proteomic datasets is critically important. The high-throughput autonomous proteomic pipeline (HTAPP) described here is designed from the ground up to provide critically important flexibility for diverse proteomic workflows and to streamline the total analysis of a complex proteomic sample. This tool comprises software that controls the acquisition of mass spectral data along with automation of post-acquisition tasks such as peptide quantification, clustered MS/MS spectral database searching, statistical validation, and data exploration within a user-configurable lab-based relational database. The software design of HTAPP focuses on accommodating diverse workflows and providing missing software functionality to a wide range of proteomic researchers to accelerate the extraction of biological meaning from immense proteomic data sets. Although individual software modules in our integrated technology platform may have some similarities to existing tools, the true novelty of the approach described here is in the synergistic and flexible combination of these tools to provide an integrated and efficient analysis of proteomic samples. PMID:20336676
Mahmud, Mufti; Vassanelli, Stefano
2016-01-01
In recent years, multichannel neuronal signal acquisition systems have allowed scientists to focus on research questions which were otherwise impossible. They act as a powerful means to study brain (dys)functions in in-vivo and in-vitro animal models. Typically, each session of electrophysiological experiments with multichannel data acquisition systems generates a large amount of raw data. For example, a 128-channel signal acquisition system with 16-bit A/D conversion and a 20 kHz sampling rate will generate approximately 17 GB of data per hour (uncompressed). This poses an important and challenging problem of inferring conclusions from the large amounts of acquired data. Thus, automated signal processing and analysis tools are becoming a key component in neuroscience research, facilitating extraction of relevant information from neuronal recordings in a reasonable time. The purpose of this review is to introduce the reader to the current state-of-the-art of open-source packages for (semi)automated processing and analysis of multichannel extracellular neuronal signals (i.e., neuronal spikes, local field potentials, electroencephalogram, etc.), and the existing Neuroinformatics infrastructure for tool and data sharing. The review is concluded by pinpointing some major challenges that are being faced, which include the development of novel benchmarking techniques, cloud-based distributed processing and analysis tools, as well as defining novel means to share and standardize data. PMID:27313507
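The quoted data rate follows directly from the acquisition parameters; a quick arithmetic check (interpreting the figure as binary gigabytes):

```python
channels, bits, rate_hz = 128, 16, 20_000
bytes_per_hour = channels * (bits // 8) * rate_hz * 3600
print(f"{bytes_per_hour / 2**30:.1f} GiB per hour")  # ~17.2 GiB, matching the ~17 GB quoted
```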
An Assessment Tool of Performance Based Logistics Appropriateness
2012-03-01
weighted tool score. The reason might be the willingness to use PBL as an acquisition method. An 8.51% positive difference is present. Figure 20 shows... performance-based acquisition methods to the maximum extent practicable when acquiring services with little exclusion' is mandated. Although PBL... determines the factors affecting success in selecting PBL as an acquisition method. Each factor is examined in detail and built into a spreadsheet tool
Data Independent Acquisition analysis in ProHits 4.0.
Liu, Guomin; Knight, James D R; Zhang, Jian Ping; Tsou, Chih-Chiang; Wang, Jian; Lambert, Jean-Philippe; Larsen, Brett; Tyers, Mike; Raught, Brian; Bandeira, Nuno; Nesvizhskii, Alexey I; Choi, Hyungwon; Gingras, Anne-Claude
2016-10-21
Affinity purification coupled with mass spectrometry (AP-MS) is a powerful technique for the identification and quantification of physical interactions. AP-MS requires careful experimental design, appropriate control selection and quantitative workflows to successfully identify bona fide interactors amongst a large background of contaminants. We previously introduced ProHits, a Laboratory Information Management System for interaction proteomics, which tracks all samples in a mass spectrometry facility, initiates database searches and provides visualization tools for spectral counting-based AP-MS approaches. More recently, we implemented Significance Analysis of INTeractome (SAINT) within ProHits to provide scoring of interactions based on spectral counts. Here, we provide an update to ProHits to support Data Independent Acquisition (DIA) with identification software (DIA-Umpire and MSPLIT-DIA), quantification tools (through DIA-Umpire, or externally via targeted extraction), and assessment of quantitative enrichment (through mapDIA) and scoring of interactions (through SAINT-intensity). With additional improvements, notably support of the iProphet pipeline, facilitated deposition into ProteomeXchange repositories and enhanced export and viewing functions, ProHits 4.0 offers a comprehensive suite of tools to facilitate affinity proteomics studies. It remains challenging to score, annotate and analyze proteomics data in a transparent manner. ProHits was previously introduced as a LIMS to enable storing, tracking and analysis of standard AP-MS data. In this revised version, we expand ProHits to include integration with a number of identification and quantification tools based on Data-Independent Acquisition (DIA). ProHits 4.0 also facilitates data deposition into public repositories, and the transfer of data to new visualization tools. Copyright © 2016 Elsevier B.V. All rights reserved.
Computer-Aided Process and Tools for Mobile Software Acquisition
2013-04-01
Christopher Bonine, Man-Tak Shing, and Thomas W. Otani, Naval Postgraduate School. Published April 1, 2013. Approved for public... ManTech International Corporation... Bonine is a lieutenant in the United States Navy. He is currently assigned to the Navy Cyber Defense...
An automated field phenotyping pipeline for application in grapevine research.
Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard
2015-02-26
Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale.
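Berry size and color extraction of the kind BIVcolor performs can be sketched with standard segmentation and labelling operations (a generic illustration on a synthetic frame, not the published tool; the threshold and colors are assumptions):

```python
import numpy as np
from scipy import ndimage

# Synthetic RGB "field image" with two blue-ish berries (illustration only).
img = np.zeros((200, 200, 3))
img[40:80, 50:90, 2] = 0.8
img[120:150, 120:160, 2] = 0.7

# Segment berry pixels with a simple color threshold (blue channel dominant).
mask = (img[..., 2] > 0.5) & (img[..., 2] > img[..., 0] + img[..., 1])

# Label connected components and measure per-berry area and mean color.
labels, n = ndimage.label(mask)
areas = ndimage.sum(mask, labels, index=range(1, n + 1))
mean_blue = ndimage.mean(img[..., 2], labels, index=range(1, n + 1))
for i, (a, b) in enumerate(zip(areas, mean_blue), start=1):
    print(f"berry {i}: area = {int(a)} px, mean blue = {b:.2f}")
```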
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
... Authorization Act for Fiscal Year 2009. Section 815 requires acquisition plans for major weapons systems to... hardware for major defense acquisition programs through the end of the service life of the related weapons... affects all contracts for major weapons that will require special tooling associated with the production...
48 CFR 31.205-40 - Special tooling and special test equipment costs.
Code of Federal Regulations, 2011 CFR
2011-10-01
48 CFR 31.205-40: Special tooling and special test equipment costs. Federal Acquisition Regulations System, Federal Acquisition Regulation, General Contracting Requirements, Contract Cost Principles and Procedures, Contracts With Commercial Organizations.
Touch and Go Surface Sampler (TGSS)
NASA Technical Reports Server (NTRS)
Gorevan, S. P.; Rafeek, S.
2001-01-01
The Touch and Go Surface Sampler (TGSS) is a new class of planetary and small-body sample acquisition tool that can be used for the surface exploration of Europa, Titan and comets. TGSS in its basic configuration consists of a high-speed sampling head attached to the end of a flexible shaft. The sampling head consists of counter-rotating cutters that rotate at speeds of 3000 to 15000 RPM. The attractive feature of this touch-and-go type of sampler is that there is no requirement for a lander-type spacecraft. Additional information is contained in the original extended abstract.
Parallel multispot smFRET analysis using an 8-pixel SPAD array
NASA Astrophysics Data System (ADS)
Ingargiola, A.; Colyer, R. A.; Kim, D.; Panzeri, F.; Lin, R.; Gulinatti, A.; Rech, I.; Ghioni, M.; Weiss, S.; Michalet, X.
2012-02-01
Single-molecule Förster resonance energy transfer (smFRET) is a powerful tool for extracting distance information between two fluorophores (a donor and acceptor dye) on a nanometer scale. This method is commonly used to monitor binding interactions or intra- and intermolecular conformations in biomolecules freely diffusing through a focal volume or immobilized on a surface. The diffusing geometry has the advantages of not interfering with the molecules and of giving access to fast time scales. However, separating photon bursts from individual molecules requires low sample concentrations. This results in long acquisition times (several minutes to an hour) to obtain sufficient statistics. It also prevents studying dynamic phenomena happening on time scales larger than the burst duration and smaller than the acquisition time. Parallelization of acquisition overcomes this limit by increasing the acquisition rate using the same low concentrations required for individual molecule burst identification. In this work we present a new two-color smFRET approach using multispot excitation and detection. The donor excitation pattern is composed of 4 spots arranged in a linear pattern. The fluorescent emission of donor and acceptor dyes is then collected and refocused on two separate areas of a custom 8-pixel SPAD array. We report smFRET measurements performed on DNA samples synthesized with various distances between the donor and acceptor fluorophores. We demonstrate that our approach provides FRET efficiency values identical to those of a conventional single-spot acquisition approach, but with a reduced acquisition time. Our work thus opens the way to high-throughput smFRET analysis on freely diffusing molecules.
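The central per-burst quantity in such measurements is the FRET efficiency computed from donor and acceptor photon counts. A minimal sketch of that calculation (not the authors' code; the gamma factor and the example counts are hypothetical):

```python
import numpy as np

def fret_efficiency(donor_counts, acceptor_counts, gamma=1.0):
    """Per-burst FRET efficiency from background-corrected photon counts:
    E = I_A / (I_A + gamma * I_D), where gamma corrects for differences in
    detection efficiency and quantum yield between the two channels."""
    donor = np.asarray(donor_counts, dtype=float)
    acceptor = np.asarray(acceptor_counts, dtype=float)
    return acceptor / (acceptor + gamma * donor)

# Bursts pooled from several SPAD pixels (hypothetical counts); histogram
# the resulting values to resolve FRET subpopulations.
print(fret_efficiency([120, 80, 200], [60, 160, 50]))
```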
Dual Raman-Brillouin spectroscopic investigation of plant stress response and development
NASA Astrophysics Data System (ADS)
Coker, Zachary; Troyanova-Wood, Maria; Marble, Kassie; Yakovlev, Vladislav
2018-03-01
Raman and Brillouin spectroscopy are powerful tools for non-invasive and non-destructive investigation of the chemical and mechanical properties of materials. In this study, we use a newly developed custom-built dual Raman-Brillouin microspectroscopy instrument to build on previous work studying the in-vivo stress response of live plants using only a Raman spectroscopy system. This dual Raman-Brillouin spectroscopy system is capable of fast simultaneous spectra acquisition from single-point locations. Shifts and changes in a sample's Brillouin spectrum indicate a change in the physical characteristics of the sample, namely its mechano-elasticity; by measuring this change, we can establish a relationship between the mechanical properties of a sample and known stress response agents, such as reactive oxygen species and other chemical constituents indicated by peaks in the Raman spectra of the same acquisition point. Simultaneous application of these spectroscopic techniques offers great promise for future development and applications in agricultural and biological studies and can help to improve our understanding of mechanochemical changes of plants and other biological samples in response to environmental and chemically induced stresses at the microscopic or cellular level.
Automatic differential analysis of NMR experiments in complex samples.
Margueritte, Laure; Markov, Petar; Chiron, Lionel; Starck, Jean-Philippe; Vonthron-Sénécheau, Catherine; Bourjot, Mélanie; Delsuc, Marc-André
2018-06-01
Liquid state nuclear magnetic resonance (NMR) is a powerful tool for the analysis of complex mixtures of unknown molecules. This capacity has been used in many analytical approaches: metabolomics, identification of active compounds in natural extracts, and characterization of species; such studies require the acquisition of many diverse NMR measurements on series of samples. Although acquisition can easily be performed automatically, the number of NMR experiments involved in these studies increases very rapidly, and this data avalanche requires resorting to automatic processing and analysis. We present here a program that allows the autonomous, unsupervised processing of a large corpus of 1D, 2D, and diffusion-ordered spectroscopy experiments from a series of samples acquired in different conditions. The program provides all the signal processing steps, as well as peak-picking and bucketing of 1D and 2D spectra; the program and its components are fully available. In an experiment mimicking the search for a bioactive species in a natural extract, we use it for the automatic detection of small amounts of artemisinin added to a series of plant extracts and for the generation of the spectral fingerprint of this molecule. This program, called Plasmodesma, is a novel tool that should be useful for deciphering complex mixtures, particularly in the discovery of biologically active natural products from plant extracts, but also in drug discovery or metabolomics studies. Copyright © 2017 John Wiley & Sons, Ltd.
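Bucketing, one of the processing steps mentioned above, reduces each spectrum to a fixed-length fingerprint by integrating intensities over fixed-width chemical-shift bins. A minimal sketch of that step (not Plasmodesma's implementation; names and the bucket width are illustrative):

```python
import numpy as np

def bucket_1d(ppm, intensities, width=0.04):
    """Integrate a 1D spectrum into fixed-width chemical-shift buckets,
    returning bucket centers and the integrated intensity per bucket."""
    ppm = np.asarray(ppm, dtype=float)
    intensities = np.asarray(intensities, dtype=float)
    edges = np.arange(ppm.min(), ppm.max() + width, width)
    idx = np.clip(np.digitize(ppm, edges), 1, len(edges) - 1)
    buckets = np.bincount(idx - 1, weights=intensities, minlength=len(edges) - 1)
    centers = (edges[:-1] + edges[1:]) / 2
    return centers, buckets
```

Comparing such bucket vectors across samples is what allows a differential analysis to flag signals present in some extracts and absent in others.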
HECTOR: A 240kV micro-CT setup optimized for research
NASA Astrophysics Data System (ADS)
Masschaele, Bert; Dierick, Manuel; Van Loo, Denis; Boone, Matthieu N.; Brabant, Loes; Pauwels, Elin; Cnudde, Veerle; Van Hoorebeke, Luc
2013-10-01
X-ray micro-CT has become a very powerful and common tool for non-destructive three-dimensional (3D) visualization and analysis of objects. Many systems are commercially available, but they are typically limited in terms of operational freedom, both from a mechanical point of view and in their acquisition routines. HECTOR is the latest system developed by the Ghent University Centre for X-ray Tomography (http://www.ugct.ugent.be) in collaboration with X-Ray Engineering (XRE bvba, Ghent, Belgium). It consists of a mechanical setup with nine motorized axes and a modular acquisition software package, and combines a microfocus directional-target X-ray source of up to 240 kV with a large flat-panel detector. Provisions are made to install a line detector for a maximal operational range. The system can accommodate samples up to 80 kg, 1 m long and 80 cm in diameter, while it is also suited for high-resolution (down to 4 μm) tomography. The bi-directional detector tiling is suited for large samples, while the variable source-detector distance optimizes the signal-to-noise ratio (SNR) for every type of sample, even with peripheral equipment such as compression stages or climate chambers. The large vertical travel of 1 m can be used for helical scanning, and a vertical detector rotation axis allows laminography experiments. The setup is installed in a large concrete bunker to allow accommodation of peripheral equipment such as pumps, chillers, etc., which can be integrated into the modular acquisition software to obtain maximal correlation between the environmental control and the CT data taken. The acquisition software not only allows tight coupling with the peripheral equipment; its scripting feature is also particularly useful for testing new and exotic acquisition routines.
Electric Motors Maintenance Planning From Its Operating Variables
NASA Astrophysics Data System (ADS)
Rodrigues, Francisco; Fonseca, Inácio; Farinha, José Torres; Ferreira, Luís; Galar, Diego
2017-09-01
Maintenance planning seeks to maximize the availability of equipment and, consequently, to increase the competitiveness of companies by increasing production time. This paper presents a maintenance planning approach based on operating variables (number of hours worked, duty cycles, number of revolutions) to maximize the operational availability of electric motors. The operating variables are sampled at predetermined cycles, and the data are then analyzed with time-series algorithms so that work orders can be issued before the variables reach their limit values. The approach is supported by software applications that provide a graphical user interface (HMI, Human Machine Interface) for access to relevant information about the physical asset, including control and supervision through SCADA (Supervisory Control And Data Acquisition), as well as the communication protocols among the different applications.
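The core of this kind of planning is predicting when a monitored variable will cross its limit so that a work order can be launched in advance. The paper relies on time-series algorithms; a deliberately simplified linear-trend sketch of the same idea (all names illustrative):

```python
import numpy as np

def hours_to_limit(samples, sample_period_h, limit):
    """Fit a linear trend to a monitored operating variable (e.g. cumulative
    duty cycles) and estimate the hours remaining until it crosses its limit,
    so a work order can be issued before the limit value is reached."""
    samples = np.asarray(samples, dtype=float)
    t = np.arange(len(samples)) * sample_period_h
    slope, intercept = np.polyfit(t, samples, 1)
    if slope <= 0:
        return np.inf  # no upward trend, no predicted crossing
    return (limit - intercept) / slope - t[-1]
```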
USDA-ARS's Scientific Manuscript database
Predictive models are valuable tools for assessing food safety. Existing thermal inactivation models for Salmonella and ground chicken do not provide predictions above 71 degrees C, which is below the recommended final cooked temperature of 73.9 degrees C. They also do not predict when all Salmone...
A Fixed-Wing Aircraft Simulation Tool for Improving the Efficiency of DoD Acquisition
2015-10-05
simulation tool, CREATE™-AV Helios [12-14], a high-fidelity rotary-wing vehicle simulation tool, and CREATE™-AV DaVinci [15-16], a conceptual through... Oct 2008-Sep 2015: A Fixed-Wing Aircraft Simulation Tool for Improving the Efficiency of DoD Acquisition, Scott A. Morton and David R... multi-disciplinary fixed-wing virtual aircraft simulation tool incorporating aerodynamics, structural dynamics, kinematics, and kinetics. Kestrel allows
Accelerated radial Fourier-velocity encoding using compressed sensing.
Hilbert, Fabian; Wech, Tobias; Hahn, Dietbert; Köstler, Herbert
2014-09-01
Phase Contrast Magnetic Resonance Imaging (MRI) is a tool for non-invasive determination of flow velocities inside blood vessels. Because Phase Contrast MRI only measures a single mean velocity per voxel, it is only applicable to vessels significantly larger than the voxel size. In contrast, Fourier Velocity Encoding measures the entire velocity distribution inside a voxel, but requires a much longer acquisition time. For accurate diagnosis of stenosis in vessels on the scale of the spatial resolution, it is important to know the velocity distribution of a voxel. Our aim was to determine velocity distributions with accelerated Fourier Velocity Encoding in the acquisition time required for a conventional Phase Contrast image. We imaged the femoral artery of healthy volunteers with ECG-triggered, radial CINE acquisition. Data acquisition was accelerated by undersampling, while missing data were reconstructed by Compressed Sensing. Velocity spectra of the vessel were evaluated by high-resolution Phase Contrast images and compared to spectra from fully sampled and undersampled Fourier Velocity Encoding. By means of undersampling, it was possible to reduce the scan time for Fourier Velocity Encoding to the duration required for a conventional Phase Contrast image. Acquisition time for a fully sampled data set with 12 different Velocity Encodings was 40 min. By applying a 12.6-fold retrospective undersampling, a data set was generated corresponding to an acquisition time of 3:10 min, which is similar to a conventional Phase Contrast measurement. Velocity spectra from fully sampled and undersampled Fourier Velocity Encoded images are in good agreement and show the same maximum velocities as the velocity maps from Phase Contrast measurements. Compressed Sensing proved to reliably reconstruct Fourier Velocity Encoded data. Our results indicate that Fourier Velocity Encoding allows an accurate determination of the velocity distribution in vessels on the order of the voxel size. Thus, compared to normal Phase Contrast measurements delivering only mean velocities, no additional scan time is necessary to retrieve meaningful velocity spectra in small vessels. Copyright © 2013. Published by Elsevier GmbH.
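Compressed Sensing reconstruction of this kind alternates between enforcing consistency with the measured k-space samples and promoting sparsity. A minimal iterative soft-thresholding sketch under strong simplifications (a Cartesian sampling mask rather than the radial trajectories used here, and sparsity assumed directly in the image domain; not the authors' implementation):

```python
import numpy as np

def cs_reconstruct(kspace, mask, n_iter=100, lam=0.01):
    """Recover an image from undersampled k-space: kspace holds measured
    values (zero where unsampled), mask is the boolean sampling pattern."""
    x = np.fft.ifft2(kspace)  # zero-filled starting point
    for _ in range(n_iter):
        k = np.fft.fft2(x)
        k[mask] = kspace[mask]  # enforce consistency with measured data
        x = np.fft.ifft2(k)
        mag = np.abs(x)
        x *= np.maximum(mag - lam, 0.0) / np.maximum(mag, 1e-12)  # complex soft-threshold
    return x
```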
Ferrando-Climent, L; Rodriguez-Mozaz, S; Barceló, D
2013-07-01
In the present work, the development, optimization, and validation (including a whole stability study) of a fast, reliable, and comprehensive method for the analysis of ten anticancer drugs in hospital and urban wastewater is described. Extraction of these pharmaceutical compounds was performed using automated off-line solid-phase extraction followed by their determination by ultra-performance liquid chromatography coupled to a triple quadrupole-linear ion trap mass spectrometer. Target compounds include nine cytotoxic agents: cyclophosphamide, ifosfamide, docetaxel, paclitaxel, etoposide, vincristine, tamoxifen, methotrexate, and azathioprine; and the cytotoxic quinolone, ciprofloxacin. Method detection limits (MDL) ranged from 0.8 to 24 ng/L. Levels of cytostatic agents found in the hospital and wastewater influents did not differ significantly, and therefore hospitals cannot be considered the primary source of this type of contaminant. All the target compounds were detected in at least one of the influent samples analyzed: ciprofloxacin, cyclophosphamide, tamoxifen, and azathioprine were found in most of them, reaching maximum levels of 14.725, 0.201, 0.133, and 0.188 μg/L, respectively. The remaining target cancer drugs were less frequently detected, at values ranging between the MDL and 0.406 μg/L. Furthermore, a feasible, useful, and advantageous approach based on an information-dependent acquisition tool was used for the screening of human metabolites in hospital effluents, where hydroxytamoxifen, endoxifen, and carboxyphosphamide were detected.
Sex differences in tool use acquisition in bonobos (Pan paniscus).
Boose, Klaree J; White, Frances J; Meinelt, Audra
2013-09-01
All the great ape species are known tool users in both the wild and captivity, although there is great variation in ability and behavioral repertoire. Differences in tool use acquisition between chimpanzees and gorillas have been attributed to differing levels of social tolerance as a result of differences in social structure. Chimpanzees also show sex differences in acquisition and both chimpanzees and bonobos demonstrate a female bias in tool use behaviors. Studies of acquisition are limited in the wild and between species comparisons are complicated in captivity by contexts that often do not reflect natural conditions. Here we investigated tool use acquisition in a captive group of naïve bonobos by simulating naturalistic conditions. We constructed an artificial termite mound fashioned after those that occur in the wild and tested individuals within a social group context. We found sex differences in latencies to attempt and to succeed where females attempted to fish, were successful more quickly, and fished more frequently than males. We compared our results to those reported for chimpanzees and gorillas. Males across all three species did not differ in latency to attempt or to succeed. In contrast, bonobo and chimpanzee females succeeded more quickly than did female gorillas. Female bonobos and female chimpanzees did not differ in either latency to attempt or to succeed. We tested the social tolerance hypothesis by investigating the relationship between tool behaviors and number of neighbors present. We also compared these results to those reported for chimpanzees and gorillas and found that bonobos had the fewest numbers of neighbors present. The results of this study do not support the association between number of neighbors and tool behavior reported for chimpanzees. However, bonobos demonstrated a similar sex difference in tool use acquisition, supporting the hypothesis of a female bias in tool use in Pan. © 2013 Wiley Periodicals, Inc.
MilQuant: a free, generic software tool for isobaric tagging-based quantitation.
Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo
2012-09-18
Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and are especially useful for samples that demand in vitro labeling. Due to the diversity in choices of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator, a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging-based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated by four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.
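At the heart of any isobaric-tagging quantitator is the deduction of relative abundances from reporter-ion intensities followed by normalization. A minimal sketch of one common scheme (median normalization; this is not necessarily MilQuant's algorithm, and all names are illustrative):

```python
import numpy as np

def reporter_ratios(intensities, reference_channel=0):
    """Per-spectrum reporter-ion ratios relative to a reference channel,
    median-centered per channel to correct unequal sample loading.
    intensities: array of shape (n_spectra, n_channels)."""
    I = np.asarray(intensities, dtype=float) + 1e-9  # guard against zeros
    ratios = I / I[:, [reference_channel]]
    ratios /= np.median(ratios, axis=0)
    return ratios
```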
Identification and restoration in 3D fluorescence microscopy
NASA Astrophysics Data System (ADS)
Dieterlen, Alain; Xu, Chengqi; Haeberle, Olivier; Hueber, Nicolas; Malfara, R.; Colicchio, B.; Jacquey, Serge
2004-06-01
3-D optical fluorescence microscopy has become an efficient tool for volumetric investigation of living biological samples. The 3-D data can be acquired by Optical Sectioning Microscopy, which is performed by axial stepping of the object versus the objective. For any instrument, each recorded image can be described by a convolution equation between the original object and the Point Spread Function (PSF) of the acquisition system. To assess performance and ensure data reproducibility, as for any 3-D quantitative analysis, system identification is mandatory. The PSF characterizes the properties of the image acquisition system; it can be computed or acquired experimentally. Statistical tools and Zernike moments are shown to be appropriate and complementary for describing a 3-D system PSF and quantifying the variation of the PSF as a function of the optical parameters. Some critical experimental parameters can be identified with these tools. This helps biologists define an acquisition protocol optimizing the use of the system. Reduction of out-of-focus light is the task of 3-D microscopy; it is carried out computationally by a deconvolution process. Pre-filtering the images improves the stability of deconvolution results, making them less dependent on the regularization parameter and helping biologists use the restoration process.
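Because each recorded image is modeled as the convolution of the object with the PSF, deconvolution amounts to inverting that model. A minimal Richardson-Lucy sketch for the 2D case, one classical choice (the abstract does not specify this particular algorithm; a library routine such as skimage.restoration.richardson_lucy could be used instead):

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=30):
    """Iteratively deconvolve image = object (*) PSF, assuming a
    normalized, non-negative PSF and Poisson-dominated noise."""
    estimate = np.full(image.shape, float(image.mean()))
    psf_mirror = psf[::-1, ::-1]  # adjoint kernel in 2D
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode='same')
        ratio = image / (blurred + 1e-12)
        estimate *= fftconvolve(ratio, psf_mirror, mode='same')
    return estimate
```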
ERIC Educational Resources Information Center
Van Liew, Gayle Dorothy
Data collected from library acquisition records on microfiche helped ascertain the effects of the Thor Power Tool Company Supreme Court decision in 1979 on a sampling of four randomly selected schools in the Atlanta Public School System between 1978 and 1985. Prior to this decision, some publishers kept a substantial inventory of backlist titles…
Searle, Brian C.; Egertson, Jarrett D.; Bollinger, James G.; Stergachis, Andrew B.; MacCoss, Michael J.
2015-01-01
Targeted mass spectrometry is an essential tool for detecting quantitative changes in low abundant proteins throughout the proteome. Although selected reaction monitoring (SRM) is the preferred method for quantifying peptides in complex samples, the process of designing SRM assays is laborious. Peptides have widely varying signal responses dictated by sequence-specific physiochemical properties; one major challenge is in selecting representative peptides to target as a proxy for protein abundance. Here we present PREGO, a software tool that predicts high-responding peptides for SRM experiments. PREGO predicts peptide responses with an artificial neural network trained using 11 minimally redundant, maximally relevant properties. Crucial to its success, PREGO is trained using fragment ion intensities of equimolar synthetic peptides extracted from data independent acquisition experiments. Because of similarities in instrumentation and the nature of data collection, relative peptide responses from data independent acquisition experiments are a suitable substitute for SRM experiments because they both make quantitative measurements from integrated fragment ion chromatograms. Using an SRM experiment containing 12,973 peptides from 724 synthetic proteins, PREGO exhibits a 40–85% improvement over previously published approaches at selecting high-responding peptides. These results also represent a dramatic improvement over the rules-based peptide selection approaches commonly used in the literature. PMID:26100116
Kandaswamy, Umasankar; Rotman, Ziv; Watt, Dana; Schillebeeckx, Ian; Cavalli, Valeria; Klyachko, Vitaly
2013-01-01
High-resolution live-cell imaging studies of neuronal structure and function are characterized by large variability in image acquisition conditions due to background and sample variations as well as low signal-to-noise ratio. The lack of automated image analysis tools that can be generalized across varying image acquisition conditions represents one of the main challenges in the field of biomedical image analysis. Specifically, segmentation of axonal/dendritic arborizations in brightfield or fluorescence imaging studies is extremely labor-intensive and still performed mostly manually. Here we describe a fully automated machine-learning approach based on textural analysis algorithms for segmenting neuronal arborizations in high-resolution brightfield images of live cultured neurons. We compare the performance of our algorithm to manual segmentation and show that it combines 90% accuracy with similarly high levels of specificity and sensitivity. Moreover, the algorithm maintains high performance levels under a wide range of image acquisition conditions, indicating that it is largely condition-invariant. We further describe an application of this algorithm to fully automated synapse localization and classification in fluorescence imaging studies based on synaptic activity. This textural analysis-based machine-learning approach thus offers a high-performance, condition-invariant tool for automated neurite segmentation. PMID:23261652
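A common way to realize such textural analysis is to compute gray-level co-occurrence (Haralick-style) features per image patch and feed them to a standard classifier. A minimal sketch under that assumption (the authors' exact feature set and classifier may differ; the stand-in patch is random data):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def texture_features(patch):
    """Haralick-style texture features for one grayscale (uint8) patch."""
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ('contrast', 'homogeneity', 'energy', 'correlation')
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Feature vectors from labeled patches (neurite vs. background) can be fed
# to any classifier, then applied to sliding-window patches of a new image.
patch = (np.random.rand(32, 32) * 255).astype(np.uint8)  # stand-in patch
print(texture_features(patch))
```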
Automated sample area definition for high-throughput microscopy.
Zeder, M; Ellrott, A; Amann, R
2011-04-01
High-throughput screening platforms based on epifluorescence microscopy are powerful tools in a variety of scientific fields. Although some applications are based on imaging geometrically defined samples such as microtiter plates, multiwell slides, or spotted gene arrays, others need to cope with inhomogeneously located samples on glass slides. The analysis of microbial communities in aquatic systems by sample filtration on membrane filters followed by multiple fluorescent staining, or the investigation of tissue sections are examples. Therefore, we developed a strategy for flexible and fast definition of sample locations by the acquisition of whole slide overview images and automated sample recognition by image analysis. Our approach was tested on different microscopes and the computer programs are freely available (http://www.technobiology.ch). Copyright © 2011 International Society for Advancement of Cytometry.
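A minimal sketch of the sample-recognition step, assuming simple global thresholding of the overview image followed by connected-component filtering (the published tool's image analysis may differ; names are illustrative):

```python
import numpy as np
from skimage import filters, measure

def find_sample_regions(overview, min_area=5000):
    """Locate candidate sample areas on a slide overview image; returns
    bounding boxes for the stage to visit at high magnification."""
    mask = overview > filters.threshold_otsu(overview)
    labels = measure.label(mask)
    return [r.bbox for r in measure.regionprops(labels) if r.area >= min_area]
```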
The Simultaneous Medicina-Planck Experiment: data acquisition, reduction and first results
NASA Astrophysics Data System (ADS)
Procopio, P.; Massardi, M.; Righini, S.; Zanichelli, A.; Ricciardi, S.; Libardi, P.; Burigana, C.; Cuttaia, F.; Mack, K.-H.; Terenzi, L.; Villa, F.; Bonavera, L.; Morgante, G.; Trigilio, C.; Trombetti, T.; Umana, G.
2011-10-01
The Simultaneous Medicina-Planck Experiment (SiMPlE) is aimed at observing a selected sample of 263 extragalactic and Galactic sources with the Medicina 32-m single-dish radio telescope in the same epoch as the Planck satellite observations. The data, acquired with a frequency coverage down to 5 GHz and combined with Planck at frequencies above 30 GHz, will constitute a useful reference catalogue of bright sources over the whole Northern hemisphere. Furthermore, source observations performed in different epochs and comparisons with other catalogues will allow the investigation of source variabilities on different time-scales. In this work, we describe the sample selection, the ongoing data acquisition campaign, the data reduction procedures, the developed tools and the comparison with other data sets. We present 5 and 8.3 GHz data for the SiMPlE Northern sample, consisting of 79 sources with δ≥ 45° selected from our catalogue and observed during the first 6 months of the project. A first analysis of their spectral behaviour and long-term variability is also presented.
Development of Data Acquisition Set-up for Steady-state Experiments
NASA Astrophysics Data System (ADS)
Srivastava, Amit K.; Gupta, Arnab D.; Sunil, S.; Khan, Ziauddin
2017-04-01
For short-duration experiments, digitized data are generally transferred for processing and storage after the experiment, whereas in a steady-state experiment the data are acquired, processed, displayed and stored continuously in a pipelined manner. This requires special acquisition techniques for storage and on-the-go viewing of data to display the current trends of various physical parameters. A small data acquisition set-up was developed for continuously acquiring signals from various physical parameters at different sampling rates for long-duration experiments. This includes the hardware set-up for signal digitization, a Field Programmable Gate Array (FPGA) based timing system for clock synchronization and event/trigger distribution, time slicing of data streams for storage of data chunks to enable viewing of data during acquisition, and channel profile display through down-sampling. In order to store a data stream of indefinite or long duration, the stream is divided into slices/chunks of user-defined duration. Data chunks avoid the problem of server data being inaccessible until the channel data file is closed at the end of a long-duration experiment. A graphical user interface has been developed in the LabVIEW application development environment for configuring the data acquisition hardware and storing data chunks on the local machine as well as on a remote data server through Python for further data access. The data plotting and analysis utilities have been developed with Python software, which provides tools for further data processing. This paper describes the development and implementation of data acquisition for steady-state experiments.
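A minimal sketch of the chunking idea, writing each fixed-duration slice to its own file so that clients can read completed chunks while acquisition continues (read_block() is a hypothetical driver call; all names are illustrative):

```python
import time
import numpy as np

def acquire_in_chunks(read_block, n_chunks, chunk_seconds=60):
    """Slice a continuous acquisition into fixed-duration chunk files so
    stored data become accessible during, not only after, a long experiment."""
    for i in range(n_chunks):
        blocks, t0 = [], time.time()
        while time.time() - t0 < chunk_seconds:
            blocks.append(read_block())  # one block of samples per call
        np.save(f"chunk_{i:05d}.npy", np.concatenate(blocks))
        # each closed chunk file is immediately readable by display clients
```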
NASA Astrophysics Data System (ADS)
Rosu-Hamzescu, Mihnea; Polonschii, Cristina; Oprea, Sergiu; Popescu, Dragos; David, Sorin; Bratu, Dumitru; Gheorghiu, Eugen
2018-06-01
Electro-optical measurements, i.e., optical waveguide- and plasmonics-based electrochemical impedance spectroscopy (P-EIS), are based on the sensitive dependence of the refractive index of electro-optical sensors on surface charge density, modulated by an AC electrical field applied to the sensor surface. Recently, P-EIS has emerged as a new analytical tool that can resolve local impedance with high, optical spatial resolution, without using microelectrodes. This study describes a high-speed image acquisition and processing system for electro-optical measurements, based on a high-speed complementary metal-oxide semiconductor (CMOS) sensor and a field-programmable gate array (FPGA) board. The FPGA is used to configure CMOS parameters, as well as to receive and locally process the acquired images by performing Fourier analysis for each pixel, deriving the real and imaginary parts of the Fourier coefficients at the AC field frequencies. An AC field generator, for single or multi-sine signals, is synchronized with the high-speed acquisition system for phase measurements. The system was successfully used for real-time angle-resolved electro-plasmonic measurements from 30 Hz up to 10 kHz, providing results consistent with those obtained by a conventional electrical impedance approach. The system was able to detect amplitude variations with a relative variation of ±1%, even for rather low sampling rates per period (i.e., 8 samples per period). The PC (personal computer) acquisition and control software allows synchronized acquisition for multiple FPGA boards, making it also suitable for simultaneous angle-resolved P-EIS imaging.
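The per-pixel Fourier analysis amounts to a digital lock-in: each pixel's time trace is projected onto a reference oscillation at the known AC frequency. A minimal offline sketch (the FPGA would compute this incrementally per frame; names are illustrative):

```python
import numpy as np

def lockin_coefficients(frames, freq, fps):
    """Per-pixel complex Fourier coefficient at a known modulation frequency.
    frames: (n_frames, h, w) image stack; freq in Hz; fps is the frame rate."""
    n = frames.shape[0]
    t = np.arange(n) / fps
    ref = np.exp(-2j * np.pi * freq * t)  # reference oscillation
    coeff = np.tensordot(ref, frames, axes=(0, 0)) * 2.0 / n
    return coeff.real, coeff.imag  # amplitude and phase follow directly
```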
Sinkó, József; Kákonyi, Róbert; Rees, Eric; Metcalf, Daniel; Knight, Alex E.; Kaminski, Clemens F.; Szabó, Gábor; Erdélyi, Miklós
2014-01-01
Localization-based super-resolution microscopy image quality depends on several factors such as dye choice and labeling strategy, microscope quality and user-defined parameters such as frame rate and number as well as the image processing algorithm. Experimental optimization of these parameters can be time-consuming and expensive so we present TestSTORM, a simulator that can be used to optimize these steps. TestSTORM users can select from among four different structures with specific patterns, dye and acquisition parameters. Example results are shown and the results of the vesicle pattern are compared with experimental data. Moreover, image stacks can be generated for further evaluation using localization algorithms, offering a tool for further software developments. PMID:24688813
Knowledge Acquisition: A Review of Tools and Ideas.
1987-08-01
tools. However, none could be applied directly to solving the problem of acquiring knowledge for the ASPA. RECOMMENDATIONS: Develop a tool based on... the social sciences. BACKGROUND: Because of the newness and complexity of the knowledge acquisition problem, the background of the knowledge... 4. Minimal (does not incorporate any unnecessary complexities); 5. Expected (experts are not in disagreement over any important aspect) (Grover 1983)
Paiva, Anthony; Shou, Wilson Z
2016-08-01
The last several years have seen the rapid adoption of the high-resolution MS (HRMS) for bioanalytical support of high throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft spot identification workflows using HRMS have greatly enhanced the quality and efficiency of the structure elucidation process for high throughput metabolite ID in early in vitro ADME profiling. Finally, novel approaches in data acquisition and compression, as well as tools for transferring, archiving and retrieving HRMS data, are being continuously refined to tackle the issue of large data file size typical for HRMS analyses.
Public data and open source tools for multi-assay genomic investigation of disease.
Kannan, Lavanya; Ramos, Marcel; Re, Angela; El-Hachem, Nehme; Safikhani, Zhaleh; Gendoo, Deena M A; Davis, Sean; Gomez-Cabrero, David; Castelo, Robert; Hansen, Kasper D; Carey, Vincent J; Morgan, Martin; Culhane, Aedín C; Haibe-Kains, Benjamin; Waldron, Levi
2016-07-01
Molecular interrogation of a biological sample through DNA sequencing, RNA and microRNA profiling, proteomics and other assays, has the potential to provide a systems level approach to predicting treatment response and disease progression, and to developing precision therapies. Large publicly funded projects have generated extensive and freely available multi-assay data resources; however, bioinformatic and statistical methods for the analysis of such experiments are still nascent. We review multi-assay genomic data resources in the areas of clinical oncology, pharmacogenomics and other perturbation experiments, population genomics and regulatory genomics and other areas, and tools for data acquisition. Finally, we review bioinformatic tools that are explicitly geared toward integrative genomic data visualization and analysis. This review provides starting points for accessing publicly available data and tools to support development of needed integrative methods. © The Author 2015. Published by Oxford University Press.
A Focused Observation Tool Using Dreyfus Stages of Skill Acquisition as an Evaluative Scale.
Driver, Richard; Grose, Brian; Serafini, Mario; Cottrell, Scott; Sizemore, Daniel; Vallejo, Manuel
2017-01-01
Focused Observation (FO) is associated with assessing complex skills and differs from generalized observations and evaluations. We have developed an FO tool for assessing clinical procedural skills, using Hubert Dreyfus' Stages of Skill Acquisition as descriptive anchors. This study sought to analyze the effectiveness of this measure of skill progression. During week 1 and week 4 of training, FO was performed repetitively on 6 residents during endotracheal intubation. Skill stage ratings were converted to numerical scores. A dependent, paired-samples t-test was calculated using total mean score (dependent variable), and an effect size (Cohen's d) was computed to ascertain the standardized mean difference between observations. A significant improvement in mean scores occurred between Week 1 (AVG 1.2, STDV ± 0.1) and Week 4 (AVG 2.0, STDV ± 0.1) (t = -3.9, p < 0.05). The calculated Cohen's d indicates that this difference was meaningful. This study demonstrates success in adapting a Focused Observation technique and an innovative evaluative scale based upon the Dreyfus stages of skill acquisition.
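The reported statistics can be reproduced with standard tools. A minimal sketch with hypothetical per-resident scores (the study's raw data are not given here):

```python
import numpy as np
from scipy import stats

week1 = np.array([1.1, 1.2, 1.3, 1.2, 1.1, 1.3])  # hypothetical stage scores
week4 = np.array([2.0, 1.9, 2.1, 2.0, 2.1, 1.9])

t, p = stats.ttest_rel(week1, week4)  # dependent (paired-samples) t-test
diff = week4 - week1
d = diff.mean() / diff.std(ddof=1)    # Cohen's d for paired observations
print(t, p, d)
```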
Acoustically levitated droplets: a contactless sampling method for fluorescence studies.
Leiterer, Jork; Grabolle, Markus; Rurack, Knut; Resch-Genger, Ute; Ziegler, Jan; Nann, Thomas; Panne, Ulrich
2008-01-01
Acoustic levitation is used as a new tool to study concentration-dependent processes in fluorescence spectroscopy. With this technique, small amounts of liquid and solid samples can be measured without the need for sample supports or containers, which often limits signal acquisition and can even alter sample properties due to interactions with the support material. We demonstrate that, because of the small sample volume, fluorescence measurements at high concentrations of an organic dye are possible without the limitation of inner-filter effects, which hamper such experiments in conventional, cuvette-based measurements. Furthermore, we show that acoustic levitation of liquid samples provides an experimentally simple way to study distance-dependent fluorescence modulations in semiconductor nanocrystals. The evaporation of the solvent during levitation leads to a continuous increase of solute concentration and can easily be monitored by laser-induced fluorescence.
Milani, S; Wright, C; Purcell, O; Macleod, I; Gerasimidis, K
2013-06-01
Acquisition of anthropometric measurements and assessment of growth in paediatric inpatients remain poor. The introduction of malnutrition screening tools that incorporate weight and height/length measurements might improve their acquisition and their utilisation in other aspects of patient care. Documentation of weight and height/length measurements and their plotting on growth charts was evaluated using a case-notes review in paediatric inpatients who were admitted before (n = 146), during (n = 154) and after the pilot (n = 151) and official (n = 128) clinical use of a screening tool. Documentation of weight was high in all periods (> 97% of patients). Height/length measurement documentation was negligible (4% of patients) but improved after the introduction of the screening tool (> 62%; P < 0.0001), except in infants, who were not part of the screening programme. Introduction of a screening tool improved the acquisition of anthropometric measurements by nursing staff, although their utilisation by medical staff remained poor. © 2013 The Authors Journal of Human Nutrition and Dietetics © 2013 The British Dietetic Association Ltd.
The key to using a learning or skill acquisition plan.
Nicholls, Delwyn; Sweet, Linda; Westerway, Sue Campbell; Gibbins, Annie
2014-11-01
A learning plan is a tool to guide the development of the knowledge, skills and professional attitudes required for practice. A learning plan is an ideal tool for both supervisors and mentors to guide the process of teaching and learning a medical ultrasound examination. A good learning plan will state the learning goal, identify the learning activities and resources needed to achieve this goal, and highlight the outcome measures which, when achieved, indicate the goal has been accomplished. A skill acquisition plan provides a framework for task acquisition and skill stratification, and is an extension of the application of the student learning plan. One unique feature of a skill acquisition plan is that it requires the tutor to first undertake a task analysis. The task steps are progressively learnt in sequence, termed scaffolding. The skills to develop and use a learning or skill acquisition plan are also learnt, and are an integral component of the ultrasound tutor's skill set. This paper provides an outline of how to use and apply a learning and skill acquisition plan. We review how these tools can be personalised to each student and skill-teaching environment.
NASA Astrophysics Data System (ADS)
Wason, H.; Herrmann, F. J.; Kumar, R.
2016-12-01
Current efforts towards dense shot (or receiver) sampling and full azimuthal coverage to produce high-resolution images have led to the deployment of multiple source vessels (or streamers) across marine survey areas. Densely sampled marine seismic data acquisition, however, is expensive, and hence necessitates the adoption of sampling schemes that save acquisition cost and time. Compressed sensing is a sampling paradigm that aims to reconstruct a signal--one that is sparse or compressible in some transform domain--from relatively fewer measurements than required by the Nyquist sampling criterion. Leveraging ideas from the field of compressed sensing, we show how marine seismic acquisition can be set up as a compressed sensing problem. A step beyond multi-source seismic acquisition is simultaneous source acquisition--an emerging technology that is stimulating both geophysical research and commercial efforts--where multiple source arrays/vessels fire shots simultaneously, resulting in better coverage in marine surveys. Following the design principles of compressed sensing, we propose a pragmatic simultaneous time-jittered, time-compressed marine acquisition scheme where single or multiple source vessels sail across an ocean-bottom array firing airguns at jittered times and source locations, resulting in better spatial sampling and faster acquisition. Our acquisition is low cost since our measurements are subsampled. Simultaneous source acquisition generates data with overlapping shot records, which need to be separated for further processing. We show that conventional seismic data can be reconstructed from jittered data with high quality, and demonstrate successful recovery by sparsity promotion. In contrast to random (sub)sampling, acquisition via jittered (sub)sampling helps in controlling the maximum gap size, which is a practical requirement of wavefield reconstruction with localized sparsifying transforms. We illustrate our results with simulations of simultaneous time-jittered marine acquisition for 2D and 3D ocean-bottom cable surveys.
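The jittered sampling scheme itself is simple to state: place one shot per coarse grid cell, perturbed randomly within the cell, which bounds the maximum gap, unlike pure random subsampling. A minimal sketch (names and parameters illustrative):

```python
import numpy as np

def jittered_shots(n_shots, grid_spacing, jitter_factor=0.5, seed=0):
    """One shot position per coarse grid cell, randomly perturbed inside
    the cell; bounds the maximum gap between consecutive shots."""
    rng = np.random.default_rng(seed)
    cells = np.arange(n_shots) * grid_spacing
    return cells + rng.uniform(-jitter_factor, jitter_factor, n_shots) * grid_spacing
```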
EXPECT: Explicit Representations for Flexible Acquisition
NASA Technical Reports Server (NTRS)
Swartout, BIll; Gil, Yolanda
1995-01-01
To create more powerful knowledge acquisition systems, we not only need better acquisition tools, but we also need to change the architecture of the knowledge-based systems we create so that their structure will provide better support for acquisition. Current acquisition tools permit users to modify factual knowledge, but they provide limited support for modifying problem-solving knowledge. In this paper, we argue that this limitation (and others) stems from the use of incomplete models of problem-solving knowledge and inflexible specification of the interdependencies between problem-solving and factual knowledge. We describe the EXPECT architecture, which addresses these problems by providing an explicit representation of problem-solving knowledge and intent. Using this more explicit representation, EXPECT can automatically derive the interdependencies between problem-solving and factual knowledge. By deriving these interdependencies from the structure of the knowledge-based system itself, EXPECT supports more flexible and powerful knowledge acquisition.
Video Game Telemetry as a Critical Tool in the Study of Complex Skill Learning
Thompson, Joseph J.; Blair, Mark R.; Chen, Lihan; Henrey, Andrew J.
2013-01-01
Cognitive science has long shown interest in expertise, in part because prediction and control of expert development would have immense practical value. Most studies in this area investigate expertise by comparing experts with novices. The reliance on contrastive samples in studies of human expertise only yields deep insight into development where differences are important throughout skill acquisition. This reliance may be pernicious where the predictive importance of variables is not constant across levels of expertise. Before the development of sophisticated machine learning tools for data mining larger samples, and indeed, before such samples were available, it was difficult to test the implicit assumption of static variable importance in expertise development. To investigate if this reliance may have imposed critical restrictions on the understanding of complex skill development, we adopted an alternative method, the online acquisition of telemetry data from a common daily activity for many: video gaming. Using measures of cognitive-motor, attentional, and perceptual processing extracted from game data from 3360 Real-Time Strategy players at 7 different levels of expertise, we identified 12 variables relevant to expertise. We show that the static variable importance assumption is false - the predictive importance of these variables shifted as the levels of expertise increased - and, at least in our dataset, that a contrastive approach would have been misleading. The finding that variable importance is not static across levels of expertise suggests that large, diverse datasets of sustained cognitive-motor performance are crucial for an understanding of expertise in real-world contexts. We also identify plausible cognitive markers of expertise. PMID:24058656
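One way to probe the static-importance assumption with such data is to train a classifier on each pair of adjacent expertise levels and compare which variables carry the discriminative weight. A minimal sketch with random stand-in data (not the authors' analysis pipeline):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(3360, 12))         # stand-in: 12 telemetry measures
league = rng.integers(1, 8, size=3360)  # stand-in: 7 expertise leagues

# If importance were static, the same variables would dominate at every level.
for lo in range(1, 7):
    rows = np.isin(league, [lo, lo + 1])
    clf = RandomForestClassifier(n_estimators=100).fit(X[rows], league[rows])
    print(lo, np.round(clf.feature_importances_, 3))
```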
Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.
Pearl, Lisa S; Sprouse, Jon
2015-06-01
Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.
NASA Technical Reports Server (NTRS)
Dolgin, B.; Yarbrough, C.; Carson, J.; Troy, R.
2000-01-01
The proposed Mars Sample Transfer Chain Architecture provides Planetary Protection Officers with the clean samples that are required for the eventual release from confinement of the returned Martian samples. At the same time, no absolute cleanliness and sterility requirement is placed on any part of the Lander (including the deep drill), the Mars Ascent Vehicle (MAV), any part of the Orbiting Sample container (OS), the Rover mobility platform, any part of the Minicorer, the robotic arm (including instrument sensors), or most of the caching equipment on the Rover. The removal of the strict requirements in excess of Category IVa cleanliness (Pathfinder clean) is expected to lead to significant cost savings. The proposed architecture assumes that cross-contamination renders all surfaces in the vicinity of the rover(s) and the lander(s) contaminated. Thus, no accessible surface of Martian rock or soil is free of Earth contamination. As a result, only subsurface samples (either rock or soil) can and will be collected for eventual return to Earth. Uncontaminated samples can be collected from a Category IVa clean platform. Both subsurface soil and rock samples can be kept clean if they are collected by devices that are self-contained and clean and sterile inside only. The top layer of the sample is removed in a manner that does not contaminate the collection tools. A biobarrier (e.g., aluminum foil) covering the moving parts of these devices may be used as the only self-removing bio-blanket required. The samples never leave the collection tools. The lids are placed on these tools inside the collection device. These single-use tools, with the lid and the sample inside, are brought to Earth in the OS. The lids have to be designed to be impenetrable to Earth organisms; the latter is a well-established art.
NASA Astrophysics Data System (ADS)
Omoragbon, Amen
Although the Aerospace and Defense (A&D) industry is a significant contributor to the United States' economy, national prestige and national security, it experiences significant cost and schedule overruns. This problem is related to the differences between technology acquisition assessments and aerospace vehicle conceptual design. Acquisition assessments evaluate broad sets of alternatives with mostly qualitative techniques, while conceptual design tools evaluate narrow sets of alternatives with multidisciplinary tools. In order for these two fields to communicate effectively, a common platform for both concerns is desired. This research is an original contribution to a three-part solution to this problem. It discusses the decomposition step of an innovative technology and sizing tool generation framework. It identifies complex multidisciplinary system definitions as a bridge between acquisition and conceptual design. It establishes complex multidisciplinary building blocks that can be used to build synthesis systems as well as technology portfolios. It also describes a graphical user interface designed to aid in the decomposition process. Finally, it demonstrates an application of the methodology to a relevant acquisition and conceptual design problem posed by the US Air Force.
Lunar Processing Cabinet 2.0: Retrofitting Gloveboxes into the 21st Century
NASA Technical Reports Server (NTRS)
Calaway, M. J.
2015-01-01
In 2014, the Apollo 16 Lunar Processing Glovebox (cabinet 38) in the Lunar Curation Laboratory at NASA JSC received an upgrade including new technology interfaces. A Jacobs Technology Innovation Project provided the primary resources to retrofit this glovebox for the 21st century. The NASA Astromaterials Acquisition & Curation Office continues the over 40-year heritage of preserving lunar materials for future scientific studies in state-of-the-art facilities. This enhancement has not only modernized the contamination controls but also provides new innovative tools for processing and characterizing lunar samples, and supports real-time exchange of sample images and information with the scientific community throughout the world.
Cryo-tomography Tilt-series Alignment with Consideration of the Beam-induced Sample Motion
Fernandez, Jose-Jesus; Li, Sam; Bharat, Tanmay A. M.; Agard, David A.
2018-01-01
Recent evidence suggests that the beam-induced motion of the sample during tilt-series acquisition is a major resolution-limiting factor in electron cryo-tomography (cryoET). It causes suboptimal tilt-series alignment and thus deterioration of the reconstruction quality. Here we present a novel approach to tilt-series alignment and tomographic reconstruction that considers the beam-induced sample motion through the tilt-series. It extends the standard fiducial-based alignment approach in cryoET by introducing quadratic polynomials to model the sample motion. The model can be used during reconstruction to yield a motion-compensated tomogram. We evaluated our method on various datasets with different sample sizes. The results demonstrate that our method could be a useful tool to improve the quality of tomograms and the resolution in cryoET. PMID:29410148
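The modeling idea is that each fiducial's track deviates from its rigid-projection prediction by a low-order polynomial in the tilt variable. A toy per-marker version of that idea (illustrative only; the actual method fits the polynomial motion jointly within the projection model rather than per marker):

```python
import numpy as np

def remove_quadratic_drift(tilt_index, residuals):
    """Fit a marker's alignment residuals (observed minus reprojected
    position, one coordinate) as a quadratic in tilt index and subtract it,
    approximating compensation of smooth beam-induced sample motion."""
    coeffs = np.polyfit(tilt_index, residuals, deg=2)
    return residuals - np.polyval(coeffs, tilt_index)
```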
Open source cardiology electronic health record development for DIGICARDIAC implementation
NASA Astrophysics Data System (ADS)
Dugarte, Nelson; Medina, Rubén.; Huiracocha, Lourdes; Rojas, Rubén.
2015-12-01
This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under Health Level-7 (HL7) international standards. The novelty of the system is the integration of high-resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools and telecardiology tools. The acquisition tools manage and control the DIGICARDIAC electrocardiograph functions. The processing tools manage HRECG signal analysis, searching for patterns indicative of cardiovascular pathologies. The incorporation of telecardiology tools allows the system to communicate with other health care centers, decreasing the time needed to access patient information. The CEHR system was developed entirely with open source software. Preliminary results of process validation showed the system's efficiency.
Designing a Graphical Decision Support Tool to Improve System Acquisition Decisions
2009-06-01
relationships within the data [9]. Displaying acquisition data in a graphical manner was chosen because graphical formats, in general, have been... acquisition plan which includes information pertaining to the acquisition objectives, the required capability of the system, design trade-offs, budgeting... which introduce artificial neural networks to approximate the real-world experience of an acquisition manager [8]. However, these strategies lack a
ESKAPE/CF: A Knowledge Acquisition Tool for Expert Systems Using Cognitive Feedback
1991-03-01
NAVAL POSTGRADUATE SCHOOL, Monterey, California. AD-A241 815. THESIS: ESKAPE/CF: A KNOWLEDGE... 11. TITLE (include Security Classification): ESKAPE/CF: A KNOWLEDGE ACQUISITION TOOL FOR EXPERT SYSTEMS USING COGNITIVE FEEDBACK (U)... tool using Cognitive Feedback (ESKAPE/CF), based on Lens model techniques which have demonstrated effectiveness in capturing policy knowledge. The
Oberacher, Herbert; Schubert, Birthe; Libiseller, Kathrin; Schweissgut, Anna
2013-04-03
Systematic toxicological analysis (STA) is aimed at detecting and identifying all substances of toxicological relevance (i.e. drugs, drugs of abuse, poisons and/or their metabolites) in biological material. In particular, gas chromatography-mass spectrometry (GC/MS) represents a competent and commonly applied screening and confirmation tool. Herein, we present an untargeted liquid chromatography-tandem mass spectrometry (LC/MS/MS) assay aimed at complementing existing GC/MS screening for the detection and identification of drugs in blood, plasma and urine samples. Solid-phase extraction was accomplished on mixed-mode cartridges. LC was based on gradient elution in a miniaturized C18 column. High-resolution electrospray ionization-MS/MS in positive ion mode with data-dependent acquisition control was used to generate tandem mass spectral information that enabled compound identification via automated library search in the "Wiley Registry of Tandem Mass Spectral Data, MSforID". The fitness of the developed LC/MS/MS method for application in STA in terms of selectivity, detection capability and reliability of identification (sensitivity/specificity) was demonstrated with blank samples, certified reference materials, proficiency test samples, and authentic casework samples. Copyright © 2013 Elsevier B.V. All rights reserved.
Tayyeb, Rakhshanda
2013-01-01
To assess the effectiveness of PBL as an instructional tool in the clinical years, in terms of undergraduate students' acquisition of content knowledge and development of critical thinking and problem-solving skills, compared with the traditional way of teaching. Quasi-experimental study. Fatima Jinnah Medical College for Women, Lahore, from October 2009 to April 2010. Final-year medical students attending Obstetrics and Gynaecology and Surgery rotations were inducted as participants in this study. Two batches of 50 students each attended the Gynaecology rotation and two batches attended the Surgery rotation, i.e. 100 students in each. Each batch was divided into two groups, A and B, of 25 students each. Group A learnt through traditional teaching, involving bedside teaching and lectures in wards, and Group B learnt the relevant clinical knowledge through a modified PBL process. Content knowledge was tested by MCQs testing recall, while clinical reasoning and problem solving were assessed by MCQs testing analysis and critical thinking. Intra-group comparison of pre- and post-test mean scores was done using paired-sample t-tests, while inter-group comparison of mean scores was done using independent-sample t-tests. Teaching through the traditional method significantly improved content knowledge (p = 0.001) but did not considerably improve clinical reasoning and problem-solving skills (p = 0.093), whereas the content knowledge of students who studied through PBL remained the same (p = 0.202) but there was marked improvement in their clinical reasoning and problem-solving skills (p < 0.001). PBL is an effective instructional tool to foster critical thinking and problem-solving skills among medical students.
Advances and challenges in cryo ptychography at the Advanced Photon Source.
Deng, J; Vine, D J; Chen, S; Nashed, Y S G; Jin, Q; Peterka, T; Vogt, S; Jacobsen, C
Ptychography has emerged as a nondestructive tool to quantitatively study extended samples at a high spatial resolution. In this manuscript, we report on recent developments from our team. We have combined cryo ptychography and fluorescence microscopy to provide simultaneous views of ultrastructure and elemental composition, we have developed multi-GPU parallel computation to speed up ptychographic reconstructions, and we have implemented fly-scan ptychography to allow for faster data acquisition. We conclude with a discussion of future challenges in high-resolution 3D ptychography.
Tu, S W; Eriksson, H; Gennari, J H; Shahar, Y; Musen, M A
1995-06-01
PROTEGE-II is a suite of tools and a methodology for building knowledge-based systems and domain-specific knowledge-acquisition tools. In this paper, we show how PROTEGE-II can be applied to the task of providing protocol-based decision support in the domain of treating HIV-infected patients. To apply PROTEGE-II, (1) we construct a decomposable problem-solving method called episodic skeletal-plan refinement, (2) we build an application ontology that consists of the terms and relations in the domain, and of method-specific distinctions not already captured in the domain terms, and (3) we specify mapping relations that link terms from the application ontology to the domain-independent terms used in the problem-solving method. From the application ontology, we automatically generate a domain-specific knowledge-acquisition tool that is custom-tailored for the application. The knowledge-acquisition tool is used for the creation and maintenance of domain knowledge used by the problem-solving method. The general goal of the PROTEGE-II approach is to produce systems and components that are reusable and easily maintained. This is the rationale for constructing ontologies and problem-solving methods that can be composed from a set of smaller-grained methods and mechanisms. This is also why we tightly couple the knowledge-acquisition tools to the application ontology that specifies the domain terms used in the problem-solving systems. Although our evaluation is still preliminary, for the application task of providing protocol-based decision support, we show that these goals of reusability and easy maintenance can be achieved. We discuss design decisions and the tradeoffs that have to be made in the development of the system.
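To make the mapping idea concrete, here is a toy sketch of how mapping relations might link method-level terms to an application ontology; all names and structures are hypothetical and do not reflect PROTEGE-II's actual representation.

```python
# Hypothetical domain terms from a protocol-based care ontology
domain_ontology = {
    "AZT_regimen": {"kind": "protocol", "steps": ["monitor_CD4", "adjust_dose"]},
    "monitor_CD4": {"kind": "lab_test", "interval_weeks": 4},
}

# Method-level vocabulary expected by episodic skeletal-plan refinement
# (names here are illustrative only)
mapping_relations = {
    "skeletal_plan": "AZT_regimen",
    "plan_step": "monitor_CD4",
}

def resolve(method_term):
    """Follow a mapping relation from a method term to domain knowledge."""
    return domain_ontology[mapping_relations[method_term]]

print(resolve("skeletal_plan"))   # the domain protocol the method refines
```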
ERIC Educational Resources Information Center
Sherman, Tracy; Shulman, Brian B.
1999-01-01
This study examined test characteristics of the Pediatric Language Acquisition Screening Tool for Early Referral-Revised (PLASTER-R), a set of developmental questionnaires for children 3 to 60 months of age. The PLASTER-R was moderately to highly successful in identifying children within normal limits for language development. Test-retest…
Tools to Support Human Factors and Systems Engineering Interactions During Early Analysis
NASA Technical Reports Server (NTRS)
Thronesbery, Carroll; Malin, Jane T.; Holden, Kritina; Smith, Danielle Paige
2005-01-01
We describe an approach and existing software tool support for effective interactions between human factors engineers and systems engineers in early analysis activities during system acquisition. We examine the tasks performed during this stage, emphasizing those tasks where system engineers and human engineers interact. The Concept of Operations (ConOps) document is an important product during this phase, and particular attention is paid to its influences on subsequent acquisition activities. Understanding this influence helps ConOps authors describe a complete system concept that guides subsequent acquisition activities. We identify commonly used system engineering and human engineering tools and examine how they can support the specific tasks associated with system definition. We identify possible gaps in the support of these tasks, the largest of which appears to be creating the ConOps document itself. Finally, we outline the goals of our future empirical investigations of tools to support system concept definition.
Development of a software tool to support chemical and biological terrorism intelligence analysis
NASA Astrophysics Data System (ADS)
Hunt, Allen R.; Foreman, William
1997-01-01
AKELA has developed a software tool which uses a systems-analytic approach to model the critical processes that support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these four model components.
Manufactured Porous Ambient Surface Simulants
NASA Technical Reports Server (NTRS)
Carey, Elizabeth M.; Peters, Gregory H.; Chu, Lauren; Zhou, Yu Meng; Cohen, Brooklin; Panossian, Lara; Green, Jacklyn R.; Moreland, Scott; Backes, Paul
2016-01-01
The planetary science decadal survey for 2013-2022 (Vision and Voyages, NRC 2011) has promoted mission concepts for sample acquisition from small solar system bodies. Numerous comet-sampling tools are in development to meet this standard. Manufactured Porous Ambient Surface Simulants (MPASS) materials provide an opportunity to simulate variable features at ambient temperatures and pressures to appropriately test potential sample acquisition systems for comets, asteroids, and planetary surfaces. The original "flavor" of MPASS materials is known as Manufactured Porous Ambient Comet Simulants (MPACS), which was developed in parallel with the development of the Biblade Comet Sampling System (Backes et al., in review). The current suite of MPACS materials was developed through research into the physical and mechanical properties of comets from past comet mission results and modeling efforts, coordination with the science community at the Jet Propulsion Laboratory, and testing of a wide range of materials and formulations. These simulants were required to represent the physical and mechanical properties of cometary nuclei, based on the current understanding of the science community. Working with cryogenic simulants can be tedious and costly; thus MPACS is a suite of ambient simulants that yields a brittle failure mode similar to that of cryogenic icy materials. Here we describe our suite of comet simulants known as MPACS that will be used to test and validate the Biblade Comet Sampling System (Backes et al., in review).
48 CFR 218.170 - Additional acquisition flexibilities.
Code of Federal Regulations, 2010 CFR
2010-10-01
... SYSTEM, DEPARTMENT OF DEFENSE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Available... vessels. The contracting officer, without soliciting offers, may issue a written job order for emergency..., Restrictions on food, clothing, fabrics, specialty metals, and hand or measuring tools: (1) Acquisitions at or...
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Policy. 22.503 Section 22.503 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS... Projects 22.503 Policy. (a) Project labor agreements are a tool that agencies may use to promote economy...
NASA Technical Reports Server (NTRS)
Okon, Avi B.; Brown, Kyle M.; McGrath, Paul L.; Klein, Kerry J.; Cady, Ian W.; Lin, Justin Y.; Ramirez, Frank E.; Haberland, Matt
2012-01-01
This drill (see Figure 1) is the primary sample acquisition element of the Mars Science Laboratory (MSL) that collects powdered samples from various types of rock (from clays to massive basalts) at depths up to 50 mm below the surface. A rotary-percussive sample acquisition device was developed with an emphasis on toughness and robustness to handle the harsh environment on Mars. It is the first rover-based sample acquisition device to be flight-qualified (see Figure 2). This drill features autonomous tool change-out on a mobile robot and novel voice-coil-based percussion. The drill comprises seven subelements. Starting at the end of the drill, there is a bit assembly that cuts the rock and collects the sample. Supporting the bit is a subassembly comprising a chuck mechanism to engage and release the new and worn bits, respectively, and a spindle mechanism to rotate the bit. Just aft of that is a percussion mechanism, which generates hammer blows to break the rock and create the dynamic environment used to flow the powdered sample. These components are mounted to a translation mechanism, which provides linear motion and senses weight-on-bit with a force sensor. There is a passive-contact sensor/stabilizer mechanism that secures the drill's position on the rock surface, and flex harness management hardware to provide the power and signals to the translating components. The drill housing serves as the primary structure of the turret, to which the additional tools and instruments are attached. The drill bit assembly (DBA) is a passive device that is rotated and hammered in order to cut rock (i.e., science targets) and collect the cuttings (powder) in a sample chamber until ready for transfer to the CHIMRA (Collection and Handling for Interior Martian Rock Analysis). The DBA consists of a 5/8-in. (~1.6-cm) commercial hammer drill bit whose shank has been turned down and machined with deep flutes designed for aggressive cuttings removal. Surrounding the shank of the bit is a thick-walled maraging steel collection tube allowing the powdered sample to be augered up the hole into the sample chamber. For robustness, the wall thickness of the DBA was maximized while still ensuring effective sample collection. There are four recesses in the bit tube that are used to retain the fresh bits in their bit box. The rotating bit is supported by a back-to-back duplex bearing pair within a housing that is connected to the outer DBA housing by two titanium diaphragms. The only bearings on the drill in the sample flow are protected by a spring-energized seal and an integrated shield that diverts the ingested powdered sample from the moving interface. The DBA diaphragms provide radial constraint of the rotating bit and form the sample chambers. Between the diaphragms there is a sample exit tube from which the sample is transferred to the CHIMRA. To ensure that the entire collected sample is retained, no matter the orientation of the drill with respect to gravity during sampling, the pass-through from the forward to the aft chamber resides opposite the exit tube.
Gas chromatography-vacuum ultraviolet spectroscopy for analysis of fatty acid methyl esters.
Fan, Hui; Smuts, Jonathan; Bai, Ling; Walsh, Phillip; Armstrong, Daniel W; Schug, Kevin A
2016-03-01
A new vacuum ultraviolet (VUV) detector for gas chromatography was recently developed and applied to fatty acid methyl ester (FAME) analysis. VUV detection features full spectral acquisition in a wavelength range of 115-240 nm, where virtually all chemical species absorb. VUV absorption spectra of 37 FAMEs, including saturated, monounsaturated, and polyunsaturated types, were recorded. Unsaturated FAMEs show significantly different gas-phase absorption profiles than saturated ones, and these classes can be easily distinguished with the VUV detector. Other advantages include differentiating cis/trans-isomeric FAMEs (e.g. oleic acid methyl ester and linoleic acid methyl ester isomers) and the ability to use VUV data analysis software for deconvolution of co-eluting signals. As a universal detector, VUV also provides high specificity, sensitivity, and a fast data acquisition rate, making it a powerful tool for fatty acid screening when combined with gas chromatography. The fatty acid profiles of several food oil samples (olive, canola, vegetable, corn, sunflower and peanut oils) were analyzed in this study to demonstrate applicability to real-world samples.
SignalPlant: an open signal processing software platform.
Plesinger, F; Jurco, J; Halamek, J; Jurak, P
2016-07-01
The growing technical standard of acquisition systems allows the acquisition of large records, often reaching gigabytes or more in size, as is the case with whole-day electroencephalograph (EEG) recordings, for example. Although current 64-bit software for signal processing is able to process (e.g. filter, analyze) such data, visual inspection and labeling will probably suffer from rather long latency during the rendering of large portions of recorded signals. For this reason, we have developed SignalPlant, a stand-alone application for signal inspection, labeling and processing. The main motivation was to supply investigators with a tool allowing fast and interactive work with large multichannel records produced by EEG, electrocardiograph and similar devices. The rendering latency was compared with EEGLAB and proved significantly faster when displaying an image from a large number of samples (e.g. 163 times faster for 75 × 10^6 samples). The presented SignalPlant software is available free and does not depend on any other computation software. Furthermore, it can be extended with plugins by third parties, ensuring its adaptability to future research tasks and new data formats.
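Low rendering latency over tens of millions of samples is commonly achieved by min-max decimation, drawing only the per-pixel-column envelope of the signal; the sketch below illustrates that generic technique and is not SignalPlant's actual renderer.

```python
import numpy as np

def minmax_decimate(signal, n_columns):
    """Reduce a long signal to per-pixel-column (min, max) pairs for drawing.

    Rendering 2*n_columns points preserves the visual envelope of the
    waveform while touching each sample only once.
    """
    usable = len(signal) - len(signal) % n_columns   # make length divisible
    cols = signal[:usable].reshape(n_columns, -1)
    return cols.min(axis=1), cols.max(axis=1)

samples = np.random.randn(10_000_000).astype(np.float32)   # long record
lo, hi = minmax_decimate(samples, n_columns=1920)           # one pair per pixel
```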
Test and Validation of the Mars Science Laboratory Robotic Arm
NASA Technical Reports Server (NTRS)
Robinson, M.; Collins, C.; Leger, P.; Kim, W.; Carsten, J.; Tompkins, V.; Trebi-Ollennu, A.; Florow, B.
2013-01-01
The Mars Science Laboratory Robotic Arm (RA) is a key component for achieving the primary scientific goals of the mission. The RA supports sample acquisition by precisely positioning a scoop above loose regolith or accurately preloading a percussive drill on Martian rocks or rover-mounted organic check materials. It assists sample processing by orienting a sample processing unit called CHIMRA through a series of gravity-relative orientations and sample delivery by positioning the sample portion door above an instrument inlet or the observation tray. In addition the RA facilitates contact science by accurately positioning the dust removal tool, Alpha Particle X-Ray Spectrometer (APXS) and the Mars Hand Lens Imager (MAHLI) relative to surface targets. In order to fulfill these seemingly disparate science objectives the RA must satisfy a variety of accuracy and performance requirements. This paper describes the necessary arm requirement specification and the test campaign to demonstrate these requirements were satisfied.
Low Cost Electroencephalographic Acquisition Amplifier to serve as Teaching and Research Tool
Jain, Ankit; Kim, Insoo; Gluckman, Bruce J.
2012-01-01
We describe the development and testing of a low cost, easily constructed electroencephalographic acquisition amplifier for noninvasive Brain Computer Interface (BCI) education and research. The acquisition amplifier is constructed from newly available off-the-shelf integrated circuit components, and readily sends a 24-bit data stream via USB bus to a computer platform. We demonstrate here the hardware's use in the analysis of a visually evoked P300 paradigm for a choose one-of-eight task. This clearly shows the applicability of this system as a low cost teaching and research tool.
Peptide Identification by Database Search of Mixture Tandem Mass Spectra
Wang, Jian; Bourne, Philip E.; Bandeira, Nuno
2011-01-01
In high-throughput proteomics the development of computational methods and novel experimental strategies often rely on each other. In certain areas, mass spectrometry methods for data acquisition are ahead of computational methods to interpret the resulting tandem mass spectra. In particular, although there are numerous situations in which a mixture tandem mass spectrum can contain fragment ions from two or more peptides, nearly all database search tools still make the assumption that each tandem mass spectrum comes from one peptide. Common examples include mixture spectra from co-eluting peptides in complex samples, spectra generated from data-independent acquisition methods, and spectra from peptides with complex post-translational modifications. We propose a new database search tool (MixDB) that is able to identify mixture tandem mass spectra from more than one peptide. We show that peptides can be reliably identified with up to 95% accuracy from mixture spectra while considering only 0.01% of all possible peptide pairs (a four-orders-of-magnitude speedup). Comparison with current database search methods indicates that our approach has better or comparable sensitivity and precision at identifying single-peptide spectra while simultaneously being able to identify 38% more peptides from mixture spectra at significantly higher precision.
Reengineering the Acquisition/Procurement Process: A Methodology for Requirements Collection
NASA Technical Reports Server (NTRS)
Taylor, Randall; Vanek, Thomas
2011-01-01
This paper captures the systematic approach taken by JPL's Acquisition Reengineering Project team, the methodology used, challenges faced, and lessons learned. It provides pragmatic "how-to" techniques and tools for collecting requirements and for identifying areas of improvement in an acquisition/procurement process or other core process of interest.
Broecker, Sebastian; Herre, Sieglinde; Wüst, Bernhard; Zweigenbaum, Jerry; Pragst, Fritz
2011-04-01
A library of collision-induced dissociation (CID) accurate mass spectra has been developed for efficient use of liquid chromatography in combination with hybrid quadrupole time-of-flight mass spectrometry (LC-QTOF-MS) as a tool in systematic toxicological analysis. The mass spectra (Δm < 3 ppm) of more than 2,500 illegal and therapeutic drugs, pesticides, alkaloids, other toxic chemicals and metabolites were measured, by use of an Agilent 6530 instrument, by flow-injection of 1 ng of the pure substances in aqueous ammonium formate-formic acid-methanol, with positive and negative electrospray-ionization (ESI), selection of the protonated or deprotonated molecules [M+H](+) or [M-H](-) by the quadrupole, and collision induced dissociation (CID) with nitrogen as collision gas at CID energies of 10, 20, and 40 eV. The fragment mass spectra were controlled for structural plausibility, corrected by recalculation to the theoretical fragment masses and added to a database of accurate mass data and molecular formulas of more than 7,500 toxicologically relevant substances to form the "database and library of toxic compounds". For practical evaluation, blood and urine samples were spiked with a mixture of 33 drugs at seven concentrations between 0.5 and 500 ng mL(-1), prepared by dichloromethane extraction or protein precipitation, and analyzed by LC-QTOF-MS in data-dependent acquisition mode. Unambiguous identification by library search was possible for typical basic drugs down to 0.5-2 ng mL(-1) and for benzodiazepines down to 2-20 ng mL(-1). The efficiency of the method was also demonstrated by re-analysis of venous blood samples from 50 death cases and comparison with previous results. In conclusion, LC-QTOF-MS in data-dependent acquisition mode combined with an accurate mass database and CID spectra library seemed to be one of the most efficient tools for systematic toxicological analysis.
Brama, Elisabeth; Peddie, Christopher J; Wilkes, Gary; Gu, Yan; Collinson, Lucy M; Jones, Martin L
2016-12-13
In-resin fluorescence (IRF) protocols preserve fluorescent proteins in resin-embedded cells and tissues for correlative light and electron microscopy, aiding interpretation of macromolecular function within the complex cellular landscape. Dual-contrast IRF samples can be imaged in separate fluorescence and electron microscopes, or in dual-modality integrated microscopes for high resolution correlation of fluorophore to organelle. IRF samples also offer a unique opportunity to automate correlative imaging workflows. Here we present two new locator tools for finding and following fluorescent cells in IRF blocks, enabling future automation of correlative imaging. The ultraLM is a fluorescence microscope that integrates with an ultramicrotome, which enables 'smart collection' of ultrathin sections containing fluorescent cells or tissues for subsequent transmission electron microscopy or array tomography. The miniLM is a fluorescence microscope that integrates with serial block face scanning electron microscopes, which enables 'smart tracking' of fluorescent structures during automated serial electron image acquisition from large cell and tissue volumes.
Wan, Xiaohua; Katchalski, Tsvi; Churas, Christopher; Ghosh, Sreya; Phan, Sebastien; Lawrence, Albert; Hao, Yu; Zhou, Ziying; Chen, Ruijuan; Chen, Yu; Zhang, Fa; Ellisman, Mark H
2017-05-01
Because of the significance of electron microscope tomography in the investigation of biological structure at nanometer scales, ongoing improvement efforts have been continuous over recent years. This is particularly true in the case of software developments. Nevertheless, verification of improvements delivered by new algorithms and software remains difficult. Current analysis tools do not provide adaptable and consistent methods for quality assessment. This is particularly true with images of biological samples, due to image complexity, variability, low contrast and noise. We report an electron tomography (ET) simulator with accurate ray optics modeling of image formation that includes curvilinear trajectories through the sample, warping of the sample and noise. As a demonstration of the utility of our approach, we have concentrated on providing verification of the class of reconstruction methods applicable to wide field images of stained plastic-embedded samples. Accordingly, we have also constructed digital phantoms derived from serial block face scanning electron microscope images. These phantoms are also easily modified to include alignment features to test alignment algorithms. The combination of more realistic phantoms with more faithful simulations facilitates objective comparison of acquisition parameters, alignment and reconstruction algorithms and their range of applicability. With proper phantoms, this approach can also be modified to include more complex optical models, including distance-dependent blurring and phase contrast functions, such as may occur in cryotomography.
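As a simplified illustration of tilt-series simulation, the sketch below projects a 2D digital phantom along straight parallel rays at each tilt angle; the simulator described above additionally models curvilinear trajectories, sample warping and noise, all omitted here.

```python
import numpy as np
from scipy import ndimage

def simulate_tilt_series(phantom, tilt_angles_deg):
    """Project a 2D phantom along vertical rays at each tilt angle.

    A straight-ray stand-in for the curvilinear optics of the real
    simulator; each projection is the column sum of the rotated sample.
    """
    return np.stack([
        ndimage.rotate(phantom, angle, reshape=False, order=1).sum(axis=0)
        for angle in tilt_angles_deg
    ])

phantom = np.zeros((128, 128))
phantom[40:90, 50:80] = 1.0                       # a simple block "organelle"
tilts = np.arange(-60, 61, 2)                     # typical +/-60 degree range
sinogram = simulate_tilt_series(phantom, tilts)   # shape (61, 128)
print(sinogram.shape)
```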
NASA Technical Reports Server (NTRS)
Pedings, Marc
2007-01-01
RT-Display is a MATLAB-based data acquisition environment designed to use a variety of commercial off-the-shelf (COTS) hardware to digitize analog signals to a standard data format usable by other post-acquisition data analysis tools. This software presents the acquired data in real time using a variety of signal-processing algorithms. The acquired data is stored in a standard Operator Interactive Signal Processing Software (OISPS) data-formatted file. RT-Display is primarily configured to use the Agilent VXI (or equivalent) data acquisition boards used in such systems as MIDDAS (Multi-channel Integrated Dynamic Data Acquisition System). The software is generalized and deployable in almost any testing environment, without limitations or proprietary configuration for a specific test program or project. With the Agilent hardware configured and in place, users can start the program and, in one step, immediately begin digitizing multiple channels of data. Once the acquisition is completed, data is converted into a common binary format that also can be translated to specific formats used by external analysis software, such as OISPS and PC-Signal (product of AI Signal Research Inc.). RT-Display at the time of this reporting was certified on Agilent hardware capable of acquisition up to 196,608 samples per second. Data signals are presented to the user on-screen simultaneously for 16 channels. Each channel can be viewed individually, with a maximum capability of 160 signal channels (depending on hardware configuration). Current signal presentations include: time data, fast Fourier transforms (FFT), and power spectral density plots (PSD). Additional processing algorithms can be easily incorporated into this environment.
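The on-screen signal presentations mentioned above correspond to standard computations; here is a minimal sketch in Python (RT-Display itself is MATLAB-based), with the 1 kHz test tone being an assumption for illustration.

```python
import numpy as np
from scipy import signal

fs = 196_608                       # samples per second, as certified above
t = np.arange(fs) / fs             # one second of data
x = np.sin(2 * np.pi * 1000 * t) + 0.1 * np.random.randn(fs)  # tone + noise

# FFT magnitude spectrum
spectrum = np.abs(np.fft.rfft(x)) / len(x)
freqs = np.fft.rfftfreq(len(x), d=1 / fs)

# Power spectral density via Welch's method
f_psd, psd = signal.welch(x, fs=fs, nperseg=4096)
print(freqs[np.argmax(spectrum)], f_psd[np.argmax(psd)])  # both near 1000 Hz
```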
NASA Astrophysics Data System (ADS)
Sherman, Justin; Azzari, Phillip; Crilly, P. B.; Duke-Tinson, Omar; James, Royce W.; Karama, Jackson; Page, E. J.; Schlank, Carter; Zuniga, Jonathan
2014-10-01
CGAPL is conducting small investigations in plasma physics and magneto-hydrodynamic buoy positioning. For data management, we are developing the capability to analyze/digitize data with a National Instruments data acquisition board (2 MS/s sampling rate, long time scale) and an Express Octopus card (125 MS/s sampling rate, short time scale). Sampling at 12-bit precision, we use LabVIEW as the programming language; GUIs will control variables in one or more concurrent runs and monitor diagnostics. HPX utilizes high-density (10^13 cm^-3 and up), low-pressure (0.01 T) Ar gas (fill pressure on the order of 10^4 mTorr). Helicon/W-mode plasmas become a diagnostics test-bed for other investigations and a tool for future spacecraft propulsion devices. Plasmas are created by directing energy into a gas-filled Pyrex tube; a power supply and matching box, providing up to 250 W of power at 20-100 MHz, supply the energy to ignite. A uniform magnetic field is needed to reach the W-mode. We employ an electromagnet to supply the B-field while an acceleration coil positions the plasma in the vacuum chamber, facilitating analysis. Initial field requirements and accuracy calibration have been completed. Progress on the development and implementation of probes and the DAQ/GUI system will be reported. Supported by U.S. DEPS Grant [HEL-JTO] PRWJFY13.
Knowledge-Acquisition Tool For Expert System
NASA Technical Reports Server (NTRS)
Disbrow, James D.; Duke, Eugene L.; Regenie, Victoria A.
1988-01-01
Digital flight-control systems are monitored by a computer program that evaluates and recommends. Flight-systems engineers for advanced, high-performance aircraft use this knowledge-acquisition tool for an expert-system flight-status monitor supplying interpretative data. The interpretative function is especially important in time-critical, high-stress situations because it facilitates problem identification and corrective strategy. Conditions are evaluated and recommendations made by ground-based engineers having essential knowledge for analysis and monitoring of the performance of advanced aircraft systems.
48 CFR 26.205 - Disaster Response Registry.
Code of Federal Regulations, 2012 CFR
2012-10-01
....acquisition.gov to determine the availability of contractors for debris removal, distribution of supplies... retrieved using the CCR Search tool, which can be accessed via https://www.acquisition.gov. These vendors...
Problem-Based Learning in Instrumentation: Synergism of Real and Virtual Modular Acquisition Chains
ERIC Educational Resources Information Center
Nonclercq, A.; Biest, A. V.; De Cuyper, K.; Leroy, E.; Martinez, D. L.; Robert, F.
2010-01-01
As part of an instrumentation course, a problem-based learning framework was selected for laboratory instruction. Two acquisition chains were designed to help students carry out realistic instrumentation problems. The first tool is a virtual (simulated) modular acquisition chain that allows rapid overall understanding of the main problems in…
Suzuki, Yuki; Sakai, Nobuaki; Yoshida, Aiko; Uekusa, Yoshitsugu; Yagi, Akira; Imaoka, Yuka; Ito, Shuichi; Karaki, Koichi; Takeyasu, Kunio
2013-01-01
A hybrid atomic force microscopy (AFM)-optical fluorescence microscopy is a powerful tool for investigating cellular morphologies and events. However, the slow data acquisition rates of the conventional AFM unit of the hybrid system limit the visualization of structural changes during cellular events. Therefore, high-speed AFM units equipped with an optical/fluorescence detection device have been a long-standing wish. Here we describe the implementation of high-speed AFM coupled with an optical fluorescence microscope. This was accomplished by developing a tip-scanning system, instead of a sample-scanning system, which operates on an inverted optical microscope. This novel device enabled the acquisition of high-speed AFM images of morphological changes in individual cells. Using this instrument, we conducted structural studies of living HeLa and 3T3 fibroblast cell surfaces. The improved time resolution allowed us to image dynamic cellular events.
Peckner, Ryan; Myers, Samuel A; Jacome, Alvaro Sebastian Vaca; Egertson, Jarrett D; Abelin, Jennifer G; MacCoss, Michael J; Carr, Steven A; Jaffe, Jacob D
2018-05-01
Mass spectrometry with data-independent acquisition (DIA) is a promising method to improve the comprehensiveness and reproducibility of targeted and discovery proteomics, in theory by systematically measuring all peptide precursors in a biological sample. However, the analytical challenges involved in discriminating between peptides with similar sequences in convoluted spectra have limited its applicability in important cases, such as the detection of single-nucleotide polymorphisms (SNPs) and alternative site localizations in phosphoproteomics data. We report Specter (https://github.com/rpeckner-broad/Specter), an open-source software tool that uses linear algebra to deconvolute DIA mixture spectra directly through comparison to a spectral library, thus circumventing the problems associated with typical fragment-correlation-based approaches. We validate the sensitivity of Specter and its performance relative to that of other methods, and show that Specter is able to successfully analyze cases involving highly similar peptides that are typically challenging for DIA analysis methods.
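The core linear-algebra idea, expressing an observed mixture spectrum as a non-negative combination of library spectra, can be sketched generically as follows; the toy matrix and the use of non-negative least squares are illustrative assumptions, not Specter's exact formulation.

```python
import numpy as np
from scipy.optimize import nnls

# Columns of L are binned library spectra; y is the observed DIA mixture
# spectrum over the same m/z bins (toy numbers, not real spectra).
L = np.array([
    [1.0, 0.0],
    [0.5, 0.2],
    [0.0, 1.0],
    [0.3, 0.6],
])
y = 2.0 * L[:, 0] + 0.5 * L[:, 1]      # mixture of both library peptides

coeffs, residual = nnls(L, y)          # non-negative least squares
print(coeffs)                          # ~[2.0, 0.5]: per-peptide contributions
```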
On the acquisition and representation of procedural knowledge
NASA Technical Reports Server (NTRS)
Saito, T.; Ortiz, C.; Loftin, R. B.
1992-01-01
Historically, knowledge acquisition has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent the knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, little attention has been devoted to procedural knowledge. NASA personnel frequently perform tasks that are primarily procedural in nature. Previous work in the field of knowledge acquisition is reviewed, followed by a focus on knowledge acquisition for procedural tasks, with special attention devoted to the Navy's VISTA tool. The design and development of a system for the acquisition and representation of procedural knowledge, TARGET (Task Analysis and Rule Generation Tool), are then described. TARGET is intended as a tool that permits experts to visually describe procedural tasks and as a common medium for knowledge refinement by the expert and knowledge engineer. The system is designed to represent the acquired knowledge in the form of production rules. Systems such as TARGET have the potential to profoundly reduce the time, difficulties, and costs of developing knowledge-based systems for the performance of procedural tasks.
Development and Flight Testing of an Adaptive Vehicle Health-Monitoring Architecture
NASA Technical Reports Server (NTRS)
Woodard, Stanley E.; Coffey, Neil C.; Gonzalez, Guillermo A.; Taylor, B. Douglas; Brett, Rube R.; Woodman, Keith L.; Weathered, Brenton W.; Rollins, Courtney H.
2002-01-01
Ongoing development and testing of an adaptable vehicle health-monitoring architecture is presented. The architecture is being developed for a fleet of vehicles. It has three operational levels: one or more remote data acquisition units located throughout the vehicle; a command and control unit located within the vehicle; and a terminal collection unit to collect analysis results from all vehicles. Each level is capable of performing autonomous analysis with a trained expert system. The expert system is parameterized, which makes it adaptable to be trained to both a user's subjective reasoning and existing quantitative analytic tools. Communication between all levels is done with wireless radio frequency interfaces. The remote data acquisition unit has an eight-channel programmable digital interface that gives the user discretion in choosing the type of sensors, the number of sensors, the sensor sampling rate, and the sampling duration for each sensor. The architecture provides a framework for tributary analysis. All measurements at the lowest operational level are reduced to provide analysis results necessary to gauge changes from established baselines. These are then collected at the next level to identify any global trends or common features from the prior level. This process is repeated until the results are reduced at the highest operational level. In the framework, only analysis results are forwarded to the next level, to reduce telemetry congestion. The system's remote data acquisition hardware and non-analysis software have been flight tested on the NASA Langley B757's main landing gear. The flight tests were performed to validate the following: the wireless radio frequency communication capabilities of the system; the hardware design; command and control; software operation; and data acquisition, storage and retrieval.
Robot Manipulator Technologies for Planetary Exploration
NASA Technical Reports Server (NTRS)
Das, H.; Bao, X.; Bar-Cohen, Y.; Bonitz, R.; Lindemann, R.; Maimone, M.; Nesnas, I.; Voorhees, C.
1999-01-01
NASA exploration missions to Mars, initiated by the Mars Pathfinder mission in July 1997, will continue over the next decade. The missions require challenging innovations in robot design and improvements in autonomy to meet ambitious objectives under tight budget and time constraints. The authors are developing design tools, component technologies and capabilities to address these needs for manipulation with robots for planetary exploration. The specific developments are: 1) a software analysis tool to reduce robot design iteration cycles and optimize design solutions, 2) new piezoelectric ultrasonic motors (USM) for light-weight and high-torque actuation in planetary environments, 3) use of advanced materials and structures for strong and light-weight robot arms, and 4) intelligent camera-image-coordinated autonomous control of robot arms for instrument placement and sample acquisition from a rover vehicle.
Blattmann, Peter; Heusel, Moritz; Aebersold, Ruedi
2016-01-01
SWATH-MS is an acquisition and analysis technique of targeted proteomics that enables measuring several thousand proteins with high reproducibility and accuracy across many samples. OpenSWATH is popular open-source software for peptide identification and quantification from SWATH-MS data. For downstream statistical and quantitative analysis there exist different tools such as MSstats, mapDIA and aLFQ. However, the transfer of data from OpenSWATH to the downstream statistical tools is currently technically challenging. Here we introduce the R/Bioconductor package SWATH2stats, which allows convenient processing of the data into a format directly readable by the downstream analysis tools. In addition, SWATH2stats allows annotation, analyzing the variation and the reproducibility of the measurements, FDR estimation, and advanced filtering before submitting the processed data to downstream tools. These functionalities are important to quickly analyze the quality of the SWATH-MS data. Hence, SWATH2stats is a new open-source tool that summarizes several practical functionalities for analyzing, processing, and converting SWATH-MS data and thus facilitates the efficient analysis of large-scale SWATH/DIA datasets.
Computer-Aided Process and Tools for Mobile Software Acquisition
2013-07-30
Computer-Aided Process and Tools for Mobile Software Acquisition. 30 July 2013. LT Christopher Bonine, USN, Dr... Christopher Bonine is a lieutenant in the United States Navy. He is currently assigned to the Navy Cyber Defense Operations Command in Norfolk, VA. His interests are in the development and implementation of cyber security policy. Bonine has a master's in computer science from the Naval Postgraduate School.
NASA Technical Reports Server (NTRS)
Howard, S. D.
1987-01-01
Effective user interface design in software systems is a complex task that takes place without adequate modeling tools. By combining state transition diagrams and the storyboard technique of filmmakers, State Transition Storyboards were developed to provide a detailed modeling technique for the Goldstone Solar System Radar Data Acquisition System human-machine interface. Illustrations are included with a description of the modeling technique.
A Task-oriented Approach for Hydrogeological Site Characterization
NASA Astrophysics Data System (ADS)
Rubin, Y.; Nowak, W.; de Barros, F.
2010-12-01
Hydrogeological site characterization is a challenging task for several reasons: (1) the large spatial variability and scarcity of prior information render the outcome of any planned sampling campaign uncertain; (2) there are no simple tools for comparing the many alternative measurement techniques and data acquisition strategies; and (3) data acquisition is subject to physical and budgetary constraints. This paper presents several ideas on how to plan sampling campaigns in a rational manner while addressing these challenges. The first idea is to recognize that different sites and different problems require different characterization strategies. Hence the idea is to plan data acquisition according to its capability for meeting site-specific goals. For example, the characterization needs at a "research problem" site (e.g., a site intended to investigate the transport of uranium in the subsurface, such as Hanford) are different from those of a "problem" site (e.g., a contaminated site associated with a health risk to humans, such as Camp Lejeune, or determining the safe yield of an aquifer). This distinction requires planners to define the characterization goal(s) in a quantitative manner. The second idea is to define metrics that could link specific data types and data acquisition strategies to the site-specific goals in a way that would allow planners to compare strongly different alternative strategies at the design stage (even prior to data acquisition) and to modify the strategies as more data become available. To meet this goal, we developed the concept of the (comparative) information yield curve. Finally, we propose to look at site characterization from the perspective of statistical hypothesis testing, whereby data acquisition strategies could be evaluated in terms of their ability to support or refute various hypotheses made with regard to the characterization goals, and the strategies could be modified once the test is completed. Accept/reject regions for hypothesis testing can be determined based on goals set by regulations or by agreement between the stakeholders. Hypothesis-driven design could help in minimizing the chances of making a wrong decision (false positives or false negatives) with regard to the site-specific goals.
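As a toy illustration of hypothesis-driven design, the sketch below tests whether sampled concentrations support exceeding a regulatory threshold at an agreed false-positive rate; the data, threshold, units and test choice are all hypothetical.

```python
import numpy as np
from scipy import stats

threshold = 5.0          # regulatory limit (hypothetical units)
alpha = 0.05             # agreed-upon false-positive rate

samples = np.array([4.2, 5.1, 4.8, 5.6, 4.9, 5.3])   # data from one strategy

# H0: site mean <= threshold; reject if the one-sided p-value < alpha
t_stat, p_two_sided = stats.ttest_1samp(samples, threshold)
p_one_sided = p_two_sided / 2 if t_stat > 0 else 1 - p_two_sided / 2
print("reject H0" if p_one_sided < alpha else "cannot reject H0")
```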
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-02
... tooling, but should include ``all property, i.e., special test equipment, ground support equipment, machine tools and machines and other intangibles to maintain capability.'' Response: DoD is fully...
Data Acquisition for Modular Biometric Monitoring System
NASA Technical Reports Server (NTRS)
Grodsinsky, Carlos M. (Inventor); Chmiel, Alan J. (Inventor); Humphreys, Bradley T. (Inventor)
2014-01-01
A modular system for acquiring biometric data includes a plurality of data acquisition modules configured to sample biometric data from at least one respective input channel at a data acquisition rate. A representation of the sampled biometric data is stored in memory of each of the plurality of data acquisition modules. A central control system is in communication with each of the plurality of data acquisition modules through a bus. The central control system is configured to collect data asynchronously, via the bus, from the memory of the plurality of data acquisition modules according to a relative fullness of the memory of the plurality of data acquisition modules.
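A minimal sketch of the fullness-prioritized collection idea; the buffer sizes, polling scheme and all names are illustrative assumptions rather than the patented design.

```python
import random
from dataclasses import dataclass, field

@dataclass
class AcquisitionModule:
    """Stand-in for one data acquisition module with a bounded buffer."""
    name: str
    capacity: int = 1024
    buffer: list = field(default_factory=list)

    def sample(self):                      # called at the acquisition rate
        self.buffer.append(random.random())

    def fullness(self):
        return len(self.buffer) / self.capacity

def collect_round(modules):
    """Central controller: drain modules in order of relative fullness."""
    for m in sorted(modules, key=lambda m: m.fullness(), reverse=True):
        data, m.buffer = m.buffer, []      # asynchronous bulk read over the bus
        yield m.name, data

modules = [AcquisitionModule(f"ch{i}") for i in range(3)]
for _ in range(100):
    for m in modules:
        m.sample()
for name, data in collect_round(modules):
    print(name, len(data))
```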
Purdue ionomics information management system. An integrated functional genomics platform.
Baxter, Ivan; Ouzzani, Mourad; Orcun, Seza; Kennedy, Brad; Jandhyala, Shrinivas S; Salt, David E
2007-02-01
The advent of high-throughput phenotyping technologies has created a deluge of information that is difficult to deal with without the appropriate data management tools. These data management tools should integrate defined workflow controls for genomic-scale data acquisition and validation, data storage and retrieval, and data analysis, indexed around the genomic information of the organism of interest. To maximize the impact of these large datasets, it is critical that they are rapidly disseminated to the broader research community, allowing open access for data mining and discovery. We describe here a system that incorporates such functionalities developed around the Purdue University high-throughput ionomics phenotyping platform. The Purdue Ionomics Information Management System (PiiMS) provides integrated workflow control, data storage, and analysis to facilitate high-throughput data acquisition, along with integrated tools for data search, retrieval, and visualization for hypothesis development. PiiMS is deployed as a World Wide Web-enabled system, allowing for integration of distributed workflow processes and open access to raw data for analysis by numerous laboratories. PiiMS currently contains data on shoot concentrations of P, Ca, K, Mg, Cu, Fe, Zn, Mn, Co, Ni, B, Se, Mo, Na, As, and Cd in over 60,000 shoot tissue samples of Arabidopsis (Arabidopsis thaliana), including ethyl methanesulfonate, fast-neutron and defined T-DNA mutants, and natural accession and populations of recombinant inbred lines from over 800 separate experiments, representing over 1,000,000 fully quantitative elemental concentrations. PiiMS is accessible at www.purdue.edu/dp/ionomics.
Documentation and knowledge acquisition
NASA Technical Reports Server (NTRS)
Rochowiak, Daniel; Moseley, Warren
1990-01-01
Traditional approaches to knowledge acquisition have focused on interviews. An alternative focuses on the documentation associated with a domain. Adopting a documentation approach provides some advantages during familiarization. A knowledge management tool was constructed to gain these advantages.
Miller, C.; Waddell, K.; Tang, N.
2010-01-01
Peptide quantitation using multiple reaction monitoring (MRM) has been established as an important methodology for biomarker verification and validation. This requires high throughput combined with high sensitivity to analyze potentially thousands of target peptides in each sample. Dynamic MRM allows the system to acquire the MRMs of a peptide only during a retention window corresponding to when that peptide is eluting. This reduces the number of concurrent MRMs and therefore improves quantitation and sensitivity. MRM Selector allows the user to generate an MRM transition list with retention-time information from discovery data obtained on a QTOF MS system. This list can be directly imported into the triple-quadrupole acquisition software. However, situations can exist where (a) the list contains more MRM transitions than allowable under the ideal acquisition conditions chosen (allowing for cycle time and chromatography conditions), or (b) too many transitions fall in a certain retention-time region, which would result in an unacceptably low dwell time and cycle time. A new tool, MRM Viewer, has been developed to help users automatically generate multiple dynamic MRM methods from a single MRM list. In this study, a list of 3293 MRM transitions from a human plasma sample was compiled. A single dynamic MRM method with 3293 transitions results in a minimum dwell time of 2.18 ms. Using MRM Viewer, we can generate three dynamic MRM methods with a minimum dwell time of 20 ms, which can give better-quality MRM quantitation. This tool facilitates both high throughput and high sensitivity for MRM quantitation.
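The dwell-time arithmetic behind dynamic MRM can be sketched as follows: the worst-case dwell time is the cycle time divided by the peak number of concurrently scheduled transitions. The cycle time and retention windows below are hypothetical, not values from the study.

```python
def min_dwell_time(transitions, cycle_time_s=0.5):
    """Worst-case dwell time for a dynamic MRM schedule.

    transitions: list of (rt_start_s, rt_end_s) retention windows.
    The dwell time at any moment is the cycle time divided by the number
    of transitions whose windows overlap that moment.
    """
    events = sorted((t, delta) for start, end in transitions
                    for t, delta in ((start, +1), (end, -1)))
    concurrent = peak = 0
    for _, delta in events:
        concurrent += delta
        peak = max(peak, concurrent)
    return cycle_time_s / peak * 1000.0   # milliseconds

windows = [(100, 160), (120, 180), (150, 210), (155, 215)]  # toy windows
print(f"minimum dwell time: {min_dwell_time(windows):.1f} ms")  # 4 overlap
```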
NASA Astrophysics Data System (ADS)
Allen, Wes M.; Chin, Lixin; Sampson, David D.; Kennedy, Brendan F.
2016-03-01
Incomplete excision of tumour margins is a major issue in breast-conserving surgery. Currently, 20-60% of cases require a second surgical procedure as a result of cancer recurrence. A number of techniques have been proposed to assess margin status, including frozen section analysis and imprint cytology. However, the recurrence rate after using these techniques remains very high. Over the last several years, our group has been developing optical coherence elastography (OCE) as a tool for the intraoperative assessment of tumour margins in breast cancer. We have reported a feasibility study on 65 ex vivo samples from patients undergoing mastectomy or wide local excision that demonstrates the potential of OCE in differentiating benign from malignant tissue. In that study, malignant tissue was readily distinguished from the surrounding benign tissue by a distinctive heterogeneous pattern in micro-elastograms. To date the largest field of view for a micro-elastogram is 20 × 20 mm; however, lumpectomy samples are typically ~50 × 50 × 30 mm. For OCE to progress as a useful clinical tool, elastograms must be acquired over larger areas to allow a greater portion of the surface area of lumpectomies to be assessed. Here, we propose a wide-field OCE scanner that utilizes a piezoelectric transducer with an internal diameter of 65 mm. In this approach, partially overlapped elastograms are stitched together, forming a mosaic with overall dimensions of 50 × 50 mm in a total acquisition time of 15-30 minutes. We present results using this approach on both tissue-mimicking phantoms and tissue, and discuss prospects for shorter acquisition times.
Potentials for the use of tool-integrated in-line data acquisition systems in press shops
NASA Astrophysics Data System (ADS)
Maier, S.; Schmerbeck, T.; Liebig, A.; Kautz, T.; Volk, W.
2017-09-01
Robust in-line data acquisition systems are required for the realization of process monitoring and control systems in press shops. A promising approach is the integration of sensors directly into the press tools, where they are easy to integrate and maintain and achieve the robustness necessary for the rough press environment. Such concepts have already been investigated for the measurement of geometrical accuracy as well as of the material flow in inner part areas. They enable the monitoring of the quality of each produced part. An important success factor is a practical approach to using this new process information in press shops. This work presents various applications of these measuring concepts, based on real car body components of the BMW Group. For example, the procedure of retroactive error analysis is explained for a side frame. It is also shown how this data acquisition can be used for the optimization of drawing tools in tool shops. With the skid line, there is a continuous value that can be monitored from planning to serial production.
Scanning transmission electron microscopy through-focal tilt-series on biological specimens.
Trepout, Sylvain; Messaoudi, Cédric; Perrot, Sylvie; Bastin, Philippe; Marco, Sergio
2015-10-01
Since scanning transmission electron microscopy can produce high signal-to-noise ratio bright-field images of thick (≥500 nm) specimens, this tool is emerging as the method of choice to study thick biological samples via tomographic approaches. However, in a convergent-beam configuration, the depth of field is limited because only a thin portion of the specimen (from a few nanometres to tens of nanometres depending on the convergence angle) can be imaged in focus. A method known as through-focal imaging enables recovery of the full depth of information by combining images acquired at different levels of focus. In this work, we compare tomographic reconstruction with the through-focal tilt-series approach (a multifocal series of images per tilt angle) with reconstruction with the classic tilt-series acquisition scheme (one single-focus image per tilt angle). We visualised the base of the flagellum in the protist Trypanosoma brucei via an acquisition and image-processing method tailored to obtain quantitative and qualitative descriptors of reconstruction volumes. Reconstructions using through-focal imaging contained more contrast and more details for thick (≥500 nm) biological samples.
Autonomous Sample Acquisition for Planetary and Small Body Explorations
NASA Technical Reports Server (NTRS)
Ghavimi, Ali R.; Serricchio, Frederick; Dolgin, Ben; Hadaegh, Fred Y.
2000-01-01
Robotic drilling and autonomous sample acquisition are considered key technology requirements for future planetary and small-body exploration missions. Core sampling or subsurface drilling is envisioned to be performed from rovers or landers. These supporting platforms are inherently flexible and light, and can withstand only a limited amount of reaction force and torque. This, together with the unknown properties of the sampled materials, makes the sampling operation a tedious and quite challenging task. This paper highlights recent advancements in the design and development of the sample acquisition control system for the in situ scientific exploration of planetary and small interplanetary missions.
The Acquisition Cost-Estimating Workforce. Census and Characteristics
2009-01-01
Abbreviations: AAC, Air Armament Center; ACAT, acquisition category; ACEIT, Automated Cost Estimating Integrated Tools; AF, Air Force; AFB, Air Force Base; AFCAA, Air... [flattened table of cost-estimator training sources (ACEIT, Tecolote training, other, no training) omitted; the column structure is not recoverable] ...other sources, including AFIT, ACEIT, or the contracting agency that employed them. The remaining 29 percent reported having received no training.
Autonomous Surface Sample Acquisition for Planetary and Lunar Exploration
NASA Astrophysics Data System (ADS)
Barnes, D. P.
2007-08-01
Surface science sample acquisition is a critical activity within any planetary and lunar exploration mission, and our research is focused upon the design, implementation, experimentation and demonstration of an onboard autonomous surface sample acquisition capability for a rover equipped with a robotic arm upon which are mounted appropriate science instruments. Images captured by a rover stereo camera system can be processed using shape-from-stereo methods and a digital elevation model (DEM) generated. We have developed a terrain feature identification algorithm that can determine autonomously from DEM data suitable regions for instrument placement and/or surface sample acquisition. Once identified, surface normal data can be generated autonomously, which are then used to calculate an arm trajectory for instrument placement and sample acquisition. Once an instrument placement and sample acquisition trajectory has been calculated, a collision detection algorithm is required to ensure the safe operation of the arm during sample acquisition. We have developed a novel adaptive 'bounding spheres' approach to this problem. Once potential science targets have been identified, and these are within the reach of the arm and will not cause any undesired collision, then the 'cost' of executing the sample acquisition activity is required. Such information, which includes power expenditure and duration, can be used to select the 'best' target from a set of potential targets. We have developed a science sample acquisition resource requirements calculation that utilises differential inverse kinematics methods to yield a high-fidelity result, thus improving upon simple first-order approximations. To test our algorithms, a new Planetary Analogue Terrain (PAT) Laboratory has been created that has a terrain region composed of Mars Soil Simulant-D from DLR Germany, and rocks that have been fully characterised in the laboratory. These have been donated by the UK Planetary Analogue Field Study network, and constitute the science targets for our autonomous sample acquisition work. Our PAT Lab terrain has been designed to support our new rover chassis, which is based upon the ExoMars rover Concept-E mechanics investigated during the ESA ExoMars Phase A study. The rover has 6-wheel drive, 6-wheel steering, and a 6-wheel walking capability. Mounted on the rover chassis are the UWA robotic arm and mast. We have designed and built a PanCam system complete with a computer-controlled pan and tilt mechanism. The UWA PanCam is based upon the ExoMars PanCam (Phase A study) and hence supports two Wide Angle Cameras (WAC - 64 degree FOV) and a High Resolution Camera (HRC - 5 degree FOV). WAC separation is 500 mm. Software has been developed to capture images, which form the data input into our onboard autonomous surface sample acquisition algorithms.
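The basic pairwise test underlying a bounding-spheres collision check is sketched below; the adaptive refinement of the sphere sets described above is omitted, and all geometry is hypothetical.

```python
import numpy as np

def spheres_collide(centers_a, radii_a, centers_b, radii_b):
    """Conservative collision test between two bodies approximated by spheres.

    Each body (e.g. an arm link and a rover structure) is covered by a set
    of bounding spheres; any overlapping pair flags a potential collision.
    """
    for ca, ra in zip(centers_a, radii_a):
        for cb, rb in zip(centers_b, radii_b):
            if np.linalg.norm(np.asarray(ca) - np.asarray(cb)) < ra + rb:
                return True
    return False

# Hypothetical spheres along an arm link vs. one around the rover mast
arm_spheres = ([[0.2, 0.0, 0.3], [0.4, 0.0, 0.35]], [0.06, 0.06])
mast_spheres = ([[0.42, 0.02, 0.33]], [0.05])
print(spheres_collide(*arm_spheres, *mast_spheres))   # True: too close
```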
Ruppert, Kai; Amzajerdian, Faraz; Hamedani, Hooman; Xin, Yi; Loza, Luis; Achekzai, Tahmina; Duncan, Ian F; Profka, Harrilla; Siddiqui, Sarmad; Pourfathi, Mehrdad; Cereda, Maurizio F; Kadlecek, Stephen; Rizi, Rahim R
2018-04-22
To demonstrate the feasibility of using a 3D radial double golden-means acquisition with variable flip angles to monitor pulmonary gas transport in a single breath hold with hyperpolarized xenon-129 MRI. Hyperpolarized xenon-129 MRI scans with interleaved gas-phase and dissolved-phase excitations were performed using a 3D radial double golden-means acquisition in mechanically ventilated rabbits. The flip angle was either held fixed at 15° or 5°, or it was varied linearly in ascending or descending order between 5° and 15° over a sampling interval of 1000 spokes. Dissolved-phase and gas-phase images were reconstructed at high resolution (32 × 32 × 32 matrix size) using all 1000 spokes, or at low resolution (22 × 22 × 22 matrix size) using 400 spokes at a time in a sliding-window fashion. Based on these sliding-window images, relative change maps were obtained using the highest mean flip angle as the reference, and aggregated pixel-based changes were tracked. Although the signal intensities in the dissolved-phase maps were mostly constant in the fixed flip-angle acquisitions, they varied significantly as a function of average flip angle in the variable flip-angle acquisitions. The latter trend reflects the underlying changes in observed dissolved-phase magnetization distribution due to pulmonary gas uptake and transport. 3D radial double golden-means acquisitions with variable flip angles provide a robust means for rapidly assessing lung function during a single breath hold, thereby constituting a particularly valuable tool for imaging uncooperative or pediatric patient populations.
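For orientation, one commonly cited construction of 3D radial double golden-means spoke directions (after Chan et al.) is sketched below; whether this matches the exact convention used in this study is an assumption.

```python
import numpy as np

# 2D golden means as commonly cited (Chan et al., MRM 2009); the exact
# convention used in this study is an assumption here.
PHI1, PHI2 = 0.4656, 0.6823

def golden_means_spokes(n_spokes):
    """Endpoint directions of successive 3D radial spokes.

    Each new spoke fills k-space quasi-uniformly, so any contiguous
    window of spokes (as in the sliding-window reconstruction above)
    yields a roughly isotropic sampling pattern.
    """
    n = np.arange(n_spokes)
    kz = np.mod(n * PHI1, 1.0)                 # polar coordinate on [0, 1)
    alpha = 2 * np.pi * np.mod(n * PHI2, 1.0)  # azimuthal angle
    r = np.sqrt(1.0 - kz**2)
    return np.stack([r * np.cos(alpha), r * np.sin(alpha), kz], axis=1)

spokes = golden_means_spokes(1000)
window = spokes[300:700]   # any 400-spoke sliding window stays well spread
```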
NASA Astrophysics Data System (ADS)
Arevalo, S.; Atwood, C.; Bell, P.; Blacker, T. D.; Dey, S.; Fisher, D.; Fisher, D. A.; Genalis, P.; Gorski, J.; Harris, A.; Hill, K.; Hurwitz, M.; Kendall, R. P.; Meakin, R. L.; Morton, S.; Moyer, E. T.; Post, D. E.; Strawn, R.; Veldhuizen, D. v.; Votta, L. G.; Wynn, S.; Zelinski, G.
2008-07-01
In FY2008, the U.S. Department of Defense (DoD) initiated the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program, a $360M program with a two-year planning phase and a ten-year execution phase. CREATE will develop and deploy three computational engineering tool sets for DoD acquisition programs to use to design aircraft, ships and radio-frequency antennas. The planning and execution of CREATE are based on the 'lessons learned' from case studies of large-scale computational science and engineering projects. The case studies stress the importance of a stable, close-knit development team; a focus on customer needs and requirements; verification and validation; flexible and agile planning, management, and development processes; risk management; realistic schedules and resource levels; balanced short- and long-term goals and deliverables; and stable, long-term support by the program sponsor. Since it began in FY2008, the CREATE program has built a team and project structure, developed requirements and begun validating them, identified candidate products, established initial connections with the acquisition programs, begun detailed project planning and development, and generated the initial collaboration infrastructure necessary for success by its multi-institutional, multidisciplinary teams.
NASA Astrophysics Data System (ADS)
Nguyen, Tien M.; Guillen, Andy T.; Hant, James J.; Kizer, Justin R.; Min, Inki A.; Siedlak, Dennis J. L.; Yoh, James
2017-05-01
The U.S. Air Force (USAF) has recognized the needs for owning the program and technical knowledge within the Air Force concerning the systems being acquired to ensure success. This paper extends the previous work done by the authors [1-2] on the "Resilient Program Technical Baseline Framework for Future Space Systems" and "Portfolio Decision Support Tool (PDST)" to the development and implementation of the Program and Technical Baseline (PTB) Tracking Tool (PTBTL) for the DOD acquisition life cycle. The paper describes the "simplified" PTB tracking model with a focus on the preaward phases and discusses how to implement this model in PDST.
Multispectral analysis tools can increase utility of RGB color images in histology
NASA Astrophysics Data System (ADS)
Fereidouni, Farzad; Griffin, Croix; Todd, Austin; Levenson, Richard
2018-04-01
Multispectral imaging (MSI) is increasingly finding application in the study and characterization of biological specimens. However, the methods typically used come with challenges on both the acquisition and the analysis front. MSI can be slow and photon-inefficient, leading to long imaging times and possible phototoxicity and photobleaching. The resulting datasets can be large and complex, prompting the development of a number of mathematical approaches for segmentation and signal unmixing. We show that under certain circumstances, just three spectral channels provided by standard color cameras, coupled with multispectral analysis tools, including a more recent spectral phasor approach, can efficiently provide useful insights. These findings are supported with a mathematical model relating spectral bandwidth and spectral channel number to achievable spectral accuracy. The utility of 3-band RGB and MSI analysis tools is demonstrated on images acquired using brightfield and fluorescence techniques, as well as a novel microscopy approach employing UV surface excitation. Supervised linear unmixing, automated non-negative matrix factorization and phasor analysis tools all provide useful results, with phasors generating particularly helpful spectral display plots for sample exploration.
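For readers unfamiliar with the phasor approach, the following sketch (an illustration of the general technique, not the authors' code) computes first-harmonic spectral phasor coordinates directly from the three RGB channels, mapping each pixel to a point on the phasor plot:

```python
import numpy as np

def rgb_spectral_phasor(image):
    """First-harmonic spectral phasor coordinates (G, S) from an RGB image.

    image: (H, W, 3) array; the three channels are treated as a
    three-sample spectrum."""
    n_ch = image.shape[-1]
    k = np.arange(n_ch)
    cos_w = np.cos(2 * np.pi * k / n_ch)
    sin_w = np.sin(2 * np.pi * k / n_ch)
    total = image.sum(axis=-1) + 1e-12      # avoid division by zero
    g = (image * cos_w).sum(axis=-1) / total
    s = (image * sin_w).sum(axis=-1) / total
    return g, s  # per-pixel phasor coordinates for 2D histogramming
```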
48 CFR 1852.223-76 - Federal Automotive Statistical Tool Reporting.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Statistical Tool Reporting. 1852.223-76 Section 1852.223-76 Federal Acquisition Regulations System NATIONAL... Provisions and Clauses 1852.223-76 Federal Automotive Statistical Tool Reporting. As prescribed at 1823.271 and 1851.205, insert the following clause: Federal Automotive Statistical Tool Reporting (JUL 2003) If...
Development of decision-making support tools for early right-of-way acquisitions.
DOT National Transportation Integrated Search
2010-01-01
This report documents the work performed during phase two of Project 0-5534, Asset Management Texas : Style. This phase included gathering historical Texas Department of Transportation (TxDOT) right-of-way : acquisition information, analyzi...
Drilling and Caching Architecture for the Mars2020 Mission
NASA Astrophysics Data System (ADS)
Zacny, K.
2013-12-01
We present a Sample Acquisition and Caching (SAC) architecture for the Mars2020 mission and detail how the architecture meets the sampling requirements described in the Mars2020 Science Definition Team (SDT) report. The architecture uses a 'One Bit per Core' approach. Having a dedicated bit for each rock core allows a reduction in the number of core transfer steps and actuators, and this reduces overall mission risk. It also alleviates the bit-life problem, eliminates cross contamination, and aids in hermetic sealing. Added advantages are faster drilling time, lower power, lower energy, and lower Weight on Bit (which reduces arm preload requirements). To enable release of the core samples, the drill bits are based on the BigTooth bit design. The BigTooth bit cuts a core diameter slightly smaller than the imaginary hole inscribed by the inner surfaces of the bits; hence the rock core can be ejected much more easily along the gravity vector. The architecture also has three additional types of bits that allow analysis of rocks. The Rock Abrasion and Brushing Bit (RABBit) allows brushing and grinding of rocks in the same way as the Rock Abrasion Tool does on MER. The PreView bit allows viewing and analysis of rock core surfaces. The Powder and Regolith Acquisition Bit (PRABit) captures regolith and rock powder either for in situ analysis or sample return; PRABit also provides sieving capability. The architecture can be viewed here: http://www.youtube.com/watch?v=_-hOO4-zDtE
Technical advances in proteomics: new developments in data-independent acquisition.
Hu, Alex; Noble, William S; Wolf-Yadlin, Alejandro
2016-01-01
The ultimate aim of proteomics is to fully identify and quantify the entire complement of proteins and post-translational modifications in biological samples of interest. For the last 15 years, liquid chromatography-tandem mass spectrometry (LC-MS/MS) in data-dependent acquisition (DDA) mode has been the standard for proteomics when sampling breadth and discovery were the main objectives; multiple reaction monitoring (MRM) LC-MS/MS has been the standard for targeted proteomics when precise quantification, reproducibility, and validation were the main objectives. Recently, improvements in mass spectrometer design and bioinformatics algorithms have resulted in the rediscovery and development of another sampling method: data-independent acquisition (DIA). DIA comprehensively and repeatedly samples every peptide in a protein digest, producing a complex set of mass spectra that is difficult to interpret without external spectral libraries. Currently, DIA approaches the identification breadth of DDA while achieving the reproducible quantification characteristic of MRM or its newest version, parallel reaction monitoring (PRM). In comparative de novo identification and quantification studies in human cell lysates, DIA identified up to 89% of the proteins detected in a comparable DDA experiment while providing reproducible quantification of over 85% of them. DIA analysis aided by spectral libraries derived from prior DIA experiments or auxiliary DDA data produces identification and quantification as reproducible and precise as that achieved by MRM/PRM, except on low‑abundance peptides that are obscured by stronger signals. DIA is still a work in progress toward the goal of sensitive, reproducible, and precise quantification without external spectral libraries. New software tools applied to DIA analysis have to deal with deconvolution of complex spectra as well as proper filtering of false positives and false negatives. However, the future outlook is positive, and various researchers are working on novel bioinformatics techniques to address these issues and increase the reproducibility, fidelity, and identification breadth of DIA.
Modular Biometric Monitoring System
NASA Technical Reports Server (NTRS)
Chmiel, Alan J. (Inventor); Humphreys, Bradley T. (Inventor)
2017-01-01
A modular system for acquiring biometric data includes a plurality of data acquisition modules configured to sample biometric data from at least one respective input channel at a data acquisition rate. A representation of the sampled biometric data is stored in memory of each of the plurality of data acquisition modules. A central control system is in communication with each of the plurality of data acquisition modules through a bus. The central control system is configured to control communication of data, via the bus, with each of the plurality of data acquisition modules.
Dumont, Elodie; De Bleye, Charlotte; Sacré, Pierre-Yves; Netchacovitch, Lauranne; Hubert, Philippe; Ziemons, Eric
2016-05-01
Over recent decades, growing environmental concern has driven the expansion of green chemistry analytical tools. Vibrational spectroscopy, belonging to this class of analytical tools, is particularly interesting given its numerous advantages, such as fast data acquisition and no sample preparation. In this context, near-infrared, Raman and, mainly, surface-enhanced Raman spectroscopy (SERS) have gained interest in many fields, including bioanalysis. The former two techniques only allow the analysis of concentrated compounds in simple matrices, whereas the emergence of SERS has extended the performance of vibrational spectroscopy to very sensitive and selective analyses. Complex SERS substrates have also been developed that enable biomarker measurements, paving the way for SERS immunoassays. Therefore, in this paper, the strengths and weaknesses of these techniques are highlighted, with a focus on recent progress.
Allenby, Mark C; Misener, Ruth; Panoskaltsis, Nicki; Mantalaris, Athanasios
2017-02-01
Three-dimensional (3D) imaging techniques provide spatial insight into environmental and cellular interactions and are implemented in various fields, including tissue engineering, but have been restricted by limited quantification tools that misrepresent or underutilize the cellular phenomena captured. This study develops image postprocessing algorithms pairing complex Euclidean metrics with Monte Carlo simulations to quantitatively assess cell and microenvironment spatial distributions while utilizing, for the first time, the entire 3D image captured. Although current methods only analyze a central fraction of presented confocal microscopy images, the proposed algorithms can utilize 210% more cells to calculate 3D spatial distributions that can span a 23-fold longer distance. These algorithms seek to leverage the high sample cost of 3D tissue imaging techniques by extracting maximal quantitative data throughout the captured image.
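A minimal sketch of the general idea, comparing an observed nearest-neighbour statistic against a Monte Carlo null model of complete spatial randomness (illustrative only; the paper's Euclidean metrics and simulation details are more elaborate):

```python
import numpy as np

def mean_nn_distance(points):
    """Mean nearest-neighbour distance within a 3D point set."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)          # exclude self-pairs
    return d.min(axis=1).mean()

def csr_null(n_cells, volume_shape, n_iter=200, seed=0):
    """Monte Carlo null: mean NN distance under complete spatial randomness."""
    rng = np.random.default_rng(seed)
    shape = np.asarray(volume_shape, dtype=float)
    return np.array([mean_nn_distance(rng.uniform(0, 1, (n_cells, 3)) * shape)
                     for _ in range(n_iter)])

# Usage: compare segmented cell centroids against the random null
cells = np.random.default_rng(1).uniform(0, 1, (200, 3)) * [512, 512, 100]
null = csr_null(len(cells), (512, 512, 100))
p = (np.sum(null <= mean_nn_distance(cells)) + 1) / (len(null) + 1)
print(f"clustering p-value ~ {p:.3f}")
```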
Application of ZigBee sensor network to data acquisition and monitoring
NASA Astrophysics Data System (ADS)
Terada, Mitsugu
2009-01-01
A ZigBee sensor network for data acquisition and monitoring is presented in this paper. It is configured using a commercially available ZigBee solution. A ZigBee module is connected via a USB interface to a Microsoft Windows PC, which works as a base station in the sensor network. Data collected by remote devices are sent to the base station PC, which is set as a data sink. Each remote device is built of a commercially available ZigBee module product and a sensor. The sensor is a thermocouple connected to a cold junction compensator amplifier. The signal from the amplifier is input to an AD converter port on the ZigBee module. Temperature data are transmitted according to the ZigBee protocol from the remote device to the data sink PC. The data sampling rate is one sampling per second; the highest possible rate is four samplings per second. The data are recorded in the hexadecimal number format by device control software, and the data file is stored in text format on the data sink PC. Time-dependent data changes can be monitored using the macro function of spreadsheet software. The system is considered a useful tool in the field of education, based on the results of trial use for measurement in an undergraduate laboratory class at a university.
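The conversion from the hexadecimal log format to temperature might look like the following sketch (the reference voltage, ADC resolution, 10 mV/degC amplifier gain, and the log-line layout are all illustrative assumptions, not values from the paper):

```python
def adc_hex_to_celsius(hex_word, vref=2.5, adc_bits=10, mv_per_c=10.0):
    """Convert a hex-encoded ADC reading from the remote node to degrees C.

    vref, adc_bits and the 10 mV/degC cold-junction-compensated amplifier
    gain are assumptions for illustration."""
    counts = int(hex_word, 16)
    volts = counts * vref / (2 ** adc_bits - 1)
    return volts * 1000.0 / mv_per_c

# Parse one line of the text log on the data-sink PC
# (the "timestamp,node,hex" layout is a hypothetical format).
ts, node, hex_val = "12:00:01,node03,01F4".split(",")
print(node, round(adc_hex_to_celsius(hex_val), 1), "degC")
```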
NASA Astrophysics Data System (ADS)
Mitchell, Garrett A.; Orange, Daniel L.; Gharib, Jamshid J.; Kennedy, Paul
2018-06-01
Marine seep hunting surveys are a current focus of hydrocarbon exploration due to recent advances in offshore geophysical surveying, geochemical sampling, and analytical technologies. Hydrocarbon seeps are ephemeral, small, discrete, and therefore difficult to sample on the deep seafloor. Multibeam echosounders are an efficient seafloor exploration tool to remotely locate and map seep features. Geophysical signatures from hydrocarbon seeps are acoustically evident in bathymetric, seafloor backscatter, and midwater backscatter datasets. Interpretation of these signatures in backscatter datasets is a fundamental component of commercial seep hunting campaigns. Degradation of backscatter datasets resulting from environmental, geometric, and system noise can interfere with the detection and delineation of seeps. We present a relative backscatter intensity normalization method and an oversampling acquisition technique that can improve the geological resolvability of hydrocarbon seeps. We use Green Canyon (GC) Block 600 in the Northern Gulf of Mexico as a seep calibration site for a Kongsberg EM302 30 kHz MBES prior to the start of the Gigante seep hunting program to analyze these techniques. At GC600, we evaluate the results of a backscatter intensity normalization, assess the effectiveness of 2X seafloor coverage in resolving seep-related features in backscatter data, and determine the off-nadir detection limits of bubble plumes using the EM302. Incorporating these techniques into seep hunting surveys can improve the detectability and sampling of seafloor seeps.
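A generic angular normalization of the kind described is sketched below (our simplified illustration; the bin width, reference angle, and per-bin mean removal are assumptions, not the survey's production algorithm):

```python
import numpy as np

def normalize_backscatter(intensity_db, incidence_deg, bin_width=1.0,
                          ref_angle=45.0):
    """Relative backscatter normalization: remove the mean angular response
    per incidence-angle bin, then restore the mean level near ref_angle so
    values stay in a familiar dB range."""
    intensity_db = np.asarray(intensity_db, dtype=float)
    incidence_deg = np.asarray(incidence_deg, dtype=float)
    bins = np.round(incidence_deg / bin_width) * bin_width
    ref_mean = intensity_db[np.abs(incidence_deg - ref_angle) < bin_width].mean()
    out = np.empty_like(intensity_db)
    for b in np.unique(bins):
        mask = bins == b
        out[mask] = intensity_db[mask] - intensity_db[mask].mean() + ref_mean
    return out
```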
New Tools and Methods for Assessing Risk-Management Strategies
2004-03-01
The study applied Expected Value and Multi-Attribute Utility Theories to evaluate the risks and benefits of various acquisition alternatives, and allowed the researchers to monitor the process subjects used to arrive at decisions; this revealed distinct risk-management strategies.
Amazon Business And GSA Advantage: A Comparative Analysis
2017-12-01
...training for businesses or a customer-ordering guide; however, the site does offer a help center where businesses and users can submit questions... a component of GSA Advantage, is an online procurement tool that allows customers to request quotes for (1) commercial supplies and services under...
In Situ Strategy of the 2011 Mars Science Laboratory to Investigate the Habitability of Ancient Mars
NASA Technical Reports Server (NTRS)
Mahaffy, Paul R.
2011-01-01
The ten science investigations of the 2011 Mars Science Laboratory (MSL) Rover named "Curiosity" seek to provide a quantitative assessment of habitability through chemical and geological measurements from a highly capable robotic platform. This mission seeks to understand if the conditions for life on ancient Mars are preserved in the near-surface geochemical record. The substantial payload resources enabled by MSL's new entry, descent, and landing (EDL) system have allowed the inclusion of instrument types new to the Mars surface, including those that can accept delivered samples from rocks and soils and perform a wide range of chemical, isotopic, and mineralogical analyses. The Chemistry and Mineralogy (CheMin) experiment that is located in the interior of the rover is a powder X-ray diffraction (XRD) and X-ray fluorescence (XRF) instrument that provides elemental and mineralogical information. The Sample Analysis at Mars (SAM) suite of instruments complements this experiment by analyzing the volatile component of identically processed samples and by analyzing atmospheric composition. Other MSL payload tools such as the Mast Camera (Mastcam) and the Chemistry & Camera (ChemCam) instruments are utilized to identify targets for interrogation, first by the arm tools and subsequently by ingestion into SAM and CheMin using the Sample Acquisition, Processing, and Handling (SA/SPaH) subsystem. The arm tools include the Mars Hand Lens Imager (MAHLI) and the Alpha Particle X-ray Spectrometer (APXS). The Dynamic Albedo of Neutrons (DAN) instrument provides subsurface identification of hydrogen such as that contained in hydrated minerals.
NASA Astrophysics Data System (ADS)
Carey, Elizabeth M.; Peters, Gregory H.; Choukroun, Mathieu; Chu, Lauren; Carpenter, Emma; Cohen, Brooklin; Panossian, Lara; Zhou, Yu Meng; Sarkissian, Ani; Moreland, Scott; Shiraishi, Lori R.; Backes, Paul; Zacny, Kris; Green, Jacklyn R.; Raymond, Carol
2017-11-01
Comets are icy remnants of the Solar System formation, and as such contain some of the most primitive volatiles and organic materials. Sampling the surface of a comet is a high priority for the New Frontiers program. Planetary simulants are crucial to the development of adequate in situ instruments and sample acquisition systems. A high-fidelity comet surface simulant has been developed to support hardware design and development for one Comet Surface Sample Return tool, the BiBlade Comet Sampler. Mechanical Porous Ambient Comet Simulants (MPACS) can be manufactured to cover a wide range of desired physical properties, such as density and cone penetration resistance, and exhibit a brittle fracture mode. The structure of the MPACS materials is an aggregated composite structure of weakly-bonded grains of very small size (diameter ≤ 40 μm) that are most relevant to the structure of the surface of a comet nucleus.
A Methodology for Developing Army Acquisition Strategies for an Uncertain Future
2007-01-01
...manuscript for publication. For example, they employ the Automated Cost Estimating Integrated Tools (ACEIT) to simplify life cycle cost estimates; other tools are...
PAnalyzer: a software tool for protein inference in shotgun proteomics.
Prieto, Gorka; Aloria, Kerman; Osinalde, Nerea; Fullaondo, Asier; Arizmendi, Jesus M; Matthiesen, Rune
2012-11-05
Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently, data independent acquisition (DIA) approaches have emerged as an alternative to the traditional data dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used to name one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However, the software currently available does not group the identified proteins in a transparent way that takes peptide evidence categories into account. Furthermore, the inspection, comparison and reporting of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. In this paper we present PAnalyzer, a software tool focused on the protein inference process of shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary, indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are considered as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analysis by ProteinLynx Global Server and technical replicates integration. PAnalyzer is an easy-to-use, multiplatform, free software tool.
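The grouping logic can be illustrated with a short sketch (inspired by, but not identical to, PAnalyzer's published evidence categories; function and category names are ours):

```python
def group_proteins(pept_to_prots):
    """Evidence-based protein grouping sketch: conclusive (owns a unique
    peptide), indistinguishable (identical peptide evidence as another
    protein), or non-conclusive (only shared peptides).

    pept_to_prots: dict mapping each identified peptide to the set of
    proteins that could have produced it."""
    prot_to_pepts = {}
    for pep, prots in pept_to_prots.items():
        for p in prots:
            prot_to_pepts.setdefault(p, set()).add(pep)

    categories = {}
    for prot, pepts in prot_to_pepts.items():
        if any(len(pept_to_prots[pep]) == 1 for pep in pepts):
            categories[prot] = "conclusive"
        elif any(other != prot and prot_to_pepts[other] == pepts
                 for pep in pepts for other in pept_to_prots[pep]):
            categories[prot] = "indistinguishable"
        else:
            categories[prot] = "non-conclusive"
    return categories

peps = {"pepA": {"P1"}, "pepB": {"P1", "P2"}, "pepC": {"P2", "P3"}, "pepD": {"P3"}}
print(group_proteins(peps))  # P1/P3 conclusive, P2 non-conclusive
```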
Content Analysis in Systems Engineering Acquisition Activities
2016-04-30
...systems engineering toolkit. Having a common analysis tool that is easy to use would support the feedback of observed system performance trends from the...
Implementation of TAMSIM and EROW right-of-way acquisition decision - support tools.
DOT National Transportation Integrated Search
2011-04-01
An implementation project was performed to initiate use of TAMSIM and EROW tools in region offices and : the Right of Way (ROW) Division. The research team worked with Texas Department of Transportation : regional ROW staffs to apply both tools to a ...
Assessing and calibrating the ATR-FTIR approach as a carbonate rock characterization tool
NASA Astrophysics Data System (ADS)
Henry, Delano G.; Watson, Jonathan S.; John, Cédric M.
2017-01-01
ATR-FTIR (attenuated total reflectance Fourier transform infrared) spectroscopy can be used as a rapid and economical tool for qualitative identification of carbonates, calcium sulphates, oxides and silicates, as well as for quantitatively estimating the concentration of minerals. Over 200 powdered samples with known concentrations of two-, three-, four- and five-phase mixtures were made, and a suite of calibration curves was derived that can be used to quantify the minerals. The calibration curves in this study have an R2 that ranges from 0.93-0.99, a RMSE (root mean square error) of 1-5 wt.% and a maximum error of 3-10 wt.%. The calibration curves were used on 35 geological samples that had previously been studied using XRD (X-ray diffraction). The identification of the minerals using ATR-FTIR is comparable with XRD, and the quantitative results have a RMSD (root mean square deviation) of 14% and 12% for calcite and dolomite respectively when compared to XRD results. ATR-FTIR is a rapid technique (identification and quantification take < 5 min) that involves virtually no cost if the machine is available. It is a common tool in most analytical laboratories, but it also has the potential to be deployed on a rig for real-time acquisition of the mineralogy of cores and rock chips at the surface, as it requires no special sample preparation, offers rapid data collection, and allows easy analysis.
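A calibration curve of the kind described reduces to a least-squares line relating a band intensity to a known concentration; the sketch below (our illustration, with band integration assumed done upstream) also reports the R2 and RMSE quality metrics quoted in the abstract:

```python
import numpy as np

def build_calibration(band_areas, known_wt_pct):
    """Least-squares calibration line relating an IR band area to mineral
    wt.%, returning slope, intercept, R^2 and RMSE."""
    band_areas = np.asarray(band_areas, dtype=float)
    known_wt_pct = np.asarray(known_wt_pct, dtype=float)
    A = np.vstack([band_areas, np.ones_like(band_areas)]).T
    (slope, intercept), *_ = np.linalg.lstsq(A, known_wt_pct, rcond=None)
    pred = slope * band_areas + intercept
    ss_res = np.sum((known_wt_pct - pred) ** 2)
    ss_tot = np.sum((known_wt_pct - known_wt_pct.mean()) ** 2)
    return slope, intercept, 1.0 - ss_res / ss_tot, np.sqrt(ss_res / len(pred))

# Usage on a hypothetical calcite band-area series
slope, icpt, r2, rmse = build_calibration([0.1, 0.25, 0.5, 0.75, 1.0],
                                          [10, 25, 50, 75, 100])
print(f"wt.% = {slope:.1f} * area + {icpt:.1f}, R2 = {r2:.3f}, RMSE = {rmse:.2f}")
```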
Purdue Ionomics Information Management System: An Integrated Functional Genomics Platform
Baxter, Ivan; Ouzzani, Mourad; Orcun, Seza; Kennedy, Brad; Jandhyala, Shrinivas S.; Salt, David E.
2007-01-01
The advent of high-throughput phenotyping technologies has created a deluge of information that is difficult to deal with without the appropriate data management tools. These data management tools should integrate defined workflow controls for genomic-scale data acquisition and validation, data storage and retrieval, and data analysis, indexed around the genomic information of the organism of interest. To maximize the impact of these large datasets, it is critical that they are rapidly disseminated to the broader research community, allowing open access for data mining and discovery. We describe here a system that incorporates such functionalities developed around the Purdue University high-throughput ionomics phenotyping platform. The Purdue Ionomics Information Management System (PiiMS) provides integrated workflow control, data storage, and analysis to facilitate high-throughput data acquisition, along with integrated tools for data search, retrieval, and visualization for hypothesis development. PiiMS is deployed as a World Wide Web-enabled system, allowing for integration of distributed workflow processes and open access to raw data for analysis by numerous laboratories. PiiMS currently contains data on shoot concentrations of P, Ca, K, Mg, Cu, Fe, Zn, Mn, Co, Ni, B, Se, Mo, Na, As, and Cd in over 60,000 shoot tissue samples of Arabidopsis (Arabidopsis thaliana), including ethyl methanesulfonate, fast-neutron and defined T-DNA mutants, and natural accession and populations of recombinant inbred lines from over 800 separate experiments, representing over 1,000,000 fully quantitative elemental concentrations. PiiMS is accessible at www.purdue.edu/dp/ionomics. PMID:17189337
2014-11-18
...this research was to characterize the naturalistic decision-making process used in Naval Aviation acquisition to assess cost, schedule and... Naval Aviation acquisitions can be identified, which can support the future development of new processes and tools for training and decision making... As part of Department of Defense acquisition processes, HSI ensures that operator, maintainer and sustainer considerations are incorporated into...
Clos, Lawrence J; Jofre, M Fransisca; Ellinger, James J; Westler, William M; Markley, John L
2013-06-01
To facilitate the high-throughput acquisition of nuclear magnetic resonance (NMR) experimental data on large sets of samples, we have developed a simple and straightforward automated methodology that capitalizes on recent advances in Bruker BioSpin NMR spectrometer hardware and software. Given the daunting challenge for non-NMR experts to collect quality spectra, our goal was to increase user accessibility, provide customized functionality, and improve the consistency and reliability of resultant data. This methodology, NMRbot, is encoded in a set of scripts written in the Python programming language accessible within the Bruker BioSpin TopSpin™ software. NMRbot improves automated data acquisition and offers novel tools for use in optimizing experimental parameters on the fly. This automated procedure has been successfully implemented for investigations in metabolomics, small-molecule library profiling, and protein-ligand titrations on four Bruker BioSpin NMR spectrometers at the National Magnetic Resonance Facility at Madison. The investigators reported benefits from ease of setup, improved spectral quality, convenient customizations, and overall time savings.
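A schematic of such an acquisition loop is sketched below; the helper functions are hypothetical stand-ins for the TopSpin commands the real NMRbot scripts wrap, and here they only log so the sketch runs anywhere:

```python
# Hypothetical stand-ins for TopSpin commands; they only print, so this
# sketch is self-contained and runnable outside the spectrometer.
def load_sample(position):
    print(f"load sample at position {position}")

def acquire(params):
    print(f"acquire with {params}")

def snr_estimate():
    return 8.0  # pretend the pilot scan came back weak

def run_queue(positions, base_experiment):
    """Acquire each sample, boosting the scan count when pilot SNR is low."""
    for pos in positions:
        load_sample(pos)
        params = dict(base_experiment)
        acquire(params)                    # pilot scan
        if snr_estimate() < 10.0:          # optimize parameters on the fly
            params["ns"] = params.get("ns", 8) * 4
        acquire(params)                    # final experiment

run_queue([1, 2], {"pulprog": "zg30", "ns": 8})
```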
NASA Astrophysics Data System (ADS)
Sandford, S. A.; Chabot, N. L.; Dello Russo, N.; Leary, J. C.; Reynolds, E. L.; Weaver, H. A.; Wooden, D. H.
2017-07-01
CORSAIR (COmet Rendezvous, Sample Acquisition, Investigation, and Return) is a mission concept submitted in response to NASA's New Frontiers 4 call. CORSAIR's proposed mission is to return comet nucleus samples to Earth for detailed analysis.
Praveen, Bavishna B; Ashok, Praveen C; Mazilu, Michael; Riches, Andrew; Herrington, Simon; Dholakia, Kishan
2012-07-01
In the field of biomedical optics, Raman spectroscopy is a powerful tool for probing the chemical composition of biological samples. In particular, fiber Raman probes play a crucial role in in vivo and ex vivo tissue analysis. However, the high fluorescence background typically contributed by the autofluorescence from both a tissue sample and the fiber probe interferes strongly with the relatively weak Raman signal. Here we demonstrate the implementation of wavelength-modulated Raman spectroscopy (WMRS) to suppress the fluorescence background while analyzing tissues using fiber Raman probes. We have observed a significant signal-to-noise ratio enhancement in the Raman bands of bone tissue, which have a relatively high fluorescence background. Implementation of WMRS in fiber-probe-based bone tissue study yielded usable Raman spectra in a relatively short acquisition time (∼30 s), notably without any special sample preparation stage. Finally, we have validated its capability to suppress fluorescence on other tissue samples such as adipose tissue derived from four different species.
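The principle behind WMRS can be sketched as follows: fluorescence is essentially static under small excitation-wavelength shifts while Raman peaks track the excitation, so a principal-component decomposition of the mean-subtracted spectral stack isolates the modulated Raman part. The sketch below illustrates one published WMRS processing route, not necessarily this paper's exact pipeline:

```python
import numpy as np

def wmrs_component(spectra):
    """Isolate the wavelength-modulated (Raman) part of a spectral stack.

    spectra: (n_shifts, n_pixels) array acquired at slightly shifted
    excitation wavelengths. The static fluorescence background cancels in
    the mean-subtracted stack; the leading singular vector then carries
    the modulated Raman signal."""
    spectra = np.asarray(spectra, dtype=float)
    centered = spectra - spectra.mean(axis=0, keepdims=True)
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    return s[0] * vt[0]  # dominant modulated component
```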
DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing
NASA Technical Reports Server (NTRS)
Burnside, Jathan J.
2012-01-01
Wind tunnels have optimized their steady-state data systems for acquisition and analysis and have even implemented large dynamic-data acquisition systems; however, development of near real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by a separate component so that bottlenecks in one phase of the process do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.
Digital Curation of Marine Physical Samples at Ocean Networks Canada
NASA Astrophysics Data System (ADS)
Jenkyns, R.; Tomlin, M. C.; Timmerman, R.
2015-12-01
Ocean Networks Canada (ONC) has collected hundreds of geological, biological and fluid samples from the water column and seafloor during its maintenance expeditions. These samples have been collected by Remotely Operated Vehicles (ROVs), divers, networked and autonomously deployed instruments, and rosettes. Subsequent measurements are used for scientific experiments, calibration of in-situ and remote sensors, monitoring of Marine Protected Areas, and environment characterization. Tracking the life cycles of these samples from collection to dissemination of results with all the pertinent documents (e.g., protocols, imagery, reports), metadata (e.g., location, identifiers, purpose, method) and data (e.g., measurements, taxonomic classification) is a challenge. The initial collection of samples is normally documented in SeaScribe (an ROV dive logging tool within ONC's Oceans 2.0 software) for which ONC has defined semantics and syntax. Next, samples are often sent to individual scientists and institutions (e.g., Royal BC Museum) for processing and storage, making acquisition of results and life cycle metadata difficult. Finally, this information needs to be retrieved and collated such that multiple user scenarios can be addressed. ONC aims to improve and extend its digital infrastructure for physical samples to support this complex array of samples, workflows and applications. However, in order to promote effective data discovery and exchange, interoperability and community standards must be an integral part of the design. Thus, integrating recommendations and outcomes of initiatives like the EarthCube iSamples working groups are essential. Use cases, existing tools, schemas and identifiers are reviewed, while remaining gaps and challenges are identified. The current status, selected approaches and possible future directions to enhance ONC's digital infrastructure for each sample type are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
W. Davis; P. Roney; T. Carroll
The MDSplus data acquisition system has been used successfully since the 1999 startup of NSTX [National Spherical Torus Experiment] for control, data acquisition, and analysis for diagnostic subsystems. For each plasma "shot" on NSTX, about 75 MB of data are acquired and loaded into MDSplus hierarchical data structures in 2-3 minutes. Physicists adapted to the MDSplus software tools with no real difficulty. Some locally developed tools are described. The support from the developers at MIT [Massachusetts Institute of Technology] was timely and insightful. The use of MDSplus has resulted in significant cost savings for NSTX.
Rotorcraft Conceptual Design Environment
2009-10-01
...systems engineering design tool sets. The DaVinci Project vision is to develop software architecture and tools specifically for acquisition system... enable movement of that information to and from analyses. Finally, a recently developed rotorcraft system analysis tool is described.
Small Business and Defense Acquisitions: A Review of Policies and Current Practices
2011-01-01
...themselves as minority-owned, women-owned, veteran-owned, or small disadvantaged businesses. The resulting database gives sourcing managers a tool for...
Macellini, S.; Maranesi, M.; Bonini, L.; Simone, L.; Rozzi, S.; Ferrari, P. F.; Fogassi, L.
2012-01-01
Macaques can efficiently use several tools, but their capacity to discriminate the relevant physical features of a tool and the social factors contributing to their acquisition are still poorly explored. In a series of studies, we investigated macaques' ability to generalize the use of a stick as a tool to new objects having different physical features (study 1), or to new contexts, requiring them to adapt the previously learned motor strategy (study 2). We then assessed whether the observation of a skilled model might facilitate tool-use learning by naive observer monkeys (study 3). Results of study 1 and study 2 showed that monkeys trained to use a tool generalize this ability to tools of different shape and length, and learn to adapt their motor strategy to a new task. Study 3 demonstrated that observing a skilled model increases the observers' manipulations of a stick, thus facilitating the individual discovery of the relevant properties of this object as a tool. These findings support the view that in macaques, the motor system can be modified through tool use and that it has a limited capacity to adjust the learnt motor skills to a new context. Social factors, although important to facilitate the interaction with tools, are not crucial for tool-use learning. PMID:22106424
Overcoming Learning Time and Space Constraints through Technological Tool
ERIC Educational Resources Information Center
Zarei, Nafiseh; Hussin, Supyan; Rashid, Taufik
2015-01-01
Today the use of technological tools has become an evolution in language learning and language acquisition. Many instructors and lecturers believe that integrating Web-based learning tools into language courses allows pupils to become active learners during the learning process. This study investigates how the Learning Management Blog (LMB) overcomes…
NASA Technical Reports Server (NTRS)
Moroz, L. V.; Schmidt, M.; Schade, U.; Hiroi, T.; Ivanova, M. A.
2005-01-01
The meteorites Dho 225 and Dho 735 were recently found in Oman. Studies of their mineralogical and chemical composition suggest that these unusual meteorites are thermally metamorphosed CM2 chondrites [1,2,3]. Similar to Antarctic metamorphosed carbonaceous chondrites, Dho 225 and Dho 735 are enriched in heavy oxygen compared to normal CMs [1,2]. However, IR studies indicating dehydration of matrix phyllosilicates are needed to confirm that the two new meteorites from Oman are thermally metamorphosed [4]. Synchrotron-based IR microspectroscopy is a promising new technique that allows the acquisition of IR spectra from extremely small samples. Here we demonstrate that this non-destructive technique is a useful tool for studying the hydration states of carbonaceous chondrites in situ. In addition, we acquired reflectance spectra of bulk powders of Dho 225 and Dho 735 in the range of 0.3-50 microns.
A survey of parallel programming tools
NASA Technical Reports Server (NTRS)
Cheng, Doreen Y.
1991-01-01
This survey examines 39 parallel programming tools. Focus is placed on those tool capabilities needed for parallel scientific programming rather than for general computer science. The tools are classified with the current and future needs of the Numerical Aerodynamic Simulation (NAS) program in mind: existing and anticipated NAS supercomputers and workstations; operating systems; programming languages; and applications. They are divided into four categories: suggested acquisitions; tools already brought in; tools worth tracking; and tools eliminated from further consideration at this time.
Right-of-Way Acquisition and Utility Adjustment Process Duration Information Tool (RUDI) User Guide.
DOT National Transportation Integrated Search
2008-07-01
Ever since right of way acquisition has been an organized business activity in Texas, involved and affected parties on both sides of the R/W line have asked the questions "How long does it take to acquire right of way?" and "When will this new hi...
Federated Search Tools in Fusion Centers: Bridging Databases in the Information Sharing Environment
2012-09-01
considerable variation in how fusion centers plan for, gather requirements, select and acquire federated search tools to bridge disparate databases...centers, when considering integrating federated search tools; by evaluating the importance of the planning, requirements gathering, selection and...acquisition processes for integrating federated search tools; by acknowledging the challenges faced by some fusion centers during these integration processes
2011-03-01
...Griffon seat design assessments include questions of vibration... the suitability of alternative designs... configurations to assess design and acquisition decisions, and more
Analyzing the texture changes in the quantitative phase maps of adipocytes
NASA Astrophysics Data System (ADS)
Roitshtain, Darina; Sharabani-Yosef, Orna; Gefen, Amit; Shaked, Natan T.
2016-03-01
We present a new analysis tool for studying texture changes in the quantitative phase maps of live cells acquired by wide-field interferometry. The sensitivity of wide-field interferometry systems to small changes in refractive index enables visualizing cells and inner cell organelles without using fluorescent dyes or other cell-invasive approaches, which may affect the measurement and require external labeling. Our label-free texture-analysis tool is based directly on the optical path delay profile of the sample and does not necessitate decoupling refractive index and thickness in the cell quantitative phase profile; thus, relevant parameters can be calculated using a single-frame acquisition. Our experimental system includes a low-coherence wide-field interferometer, combined with a simultaneous fluorescence microscopy system for validation. We used this system and analysis tool for studying lipid droplet formation in adipocytes. The latter demonstration is relevant for various cellular functions, ranging from lipid metabolism and protein storage and degradation to viral replication. These processes are functionally linked to several physiological and pathological conditions, including obesity and metabolic diseases. Quantification of these biological phenomena based on the texture changes in the cell phase map has potential as a new cellular diagnosis tool.
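Texture descriptors on a phase map can be as simple as co-occurrence statistics computed on the quantized optical path delay values; the sketch below is an illustrative choice of features, not the paper's exact parameter set:

```python
import numpy as np

def phase_texture_features(opd, bins=32):
    """Simple co-occurrence texture descriptors computed directly on an
    optical path delay (OPD) map; single-frame, label-free features."""
    opd = np.asarray(opd, dtype=float)
    q = np.digitize(opd, np.linspace(opd.min(), opd.max(), bins))
    # Gray-level co-occurrence matrix over horizontal pixel neighbours
    glcm = np.zeros((bins + 1, bins + 1))
    np.add.at(glcm, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    contrast = ((i - j) ** 2 * glcm).sum()
    homogeneity = (glcm / (1.0 + np.abs(i - j))).sum()
    return {"contrast": contrast, "homogeneity": homogeneity}
```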
NMR methods for metabolomics of mammalian cell culture bioreactors.
Aranibar, Nelly; Reily, Michael D
2014-01-01
Metabolomics has become an important tool for measuring pools of small molecules in mammalian cell cultures expressing therapeutic proteins. NMR spectroscopy has played an important role, largely because it requires minimal sample preparation, does not require chromatographic separation, and is quantitative. The concentrations of large numbers of small molecules in the extracellular media or within the cells themselves can be measured directly on the culture supernatant and on the supernatant of the lysed cells, respectively, and correlated with endpoints such as titer, cell viability, or glycosylation patterns. The observed changes can be used to generate hypotheses by which these parameters can be optimized. This chapter focuses on the sample preparation, data acquisition, and analysis to get the most out of NMR metabolomics data from CHO cell cultures but could easily be extended to other in vitro culture systems.
NASA Astrophysics Data System (ADS)
Zheng, Yong; Chen, Yan
2013-10-01
The design of a dynamic acquisition system for real-time detection of transmission chain error is very important for improving the machining accuracy of machine tools. In this paper, a USB controller and an FPGA are used for the hardware platform design, LabVIEW is used to develop the user application, and NI-VISA is used to develop the USB drivers, ultimately achieving the design of a dynamic acquisition system for transmission error.
How the public uses social media wechat to obtain health information in china: a survey study.
Zhang, Xingting; Wen, Dong; Liang, Jun; Lei, Jianbo
2017-07-05
On average, 570 million users, 93% in China's first-tier cities, log on to WeChat every day. WeChat has become the most widely and frequently used social media platform in China, and has been profoundly integrated into the daily life of many Chinese people. A variety of health-related information may be found on WeChat. The objective of this study is to understand how the general public views the impact of this rapidly emerging social media platform on health information acquisition. A self-administered questionnaire was designed, distributed, collected, and analyzed utilizing the online survey tool Sojump. WeChat was adopted to randomly release the questionnaires using convenience sampling and collect the results after a certain amount of time. (1) A total of 1636 questionnaires (WeChat customers) were collected from 32 provinces. (2) The primary means by which respondents received health education was via the Internet (71.79%). Baidu and WeChat were the top 2 search tools utilized (90.71% and 28.30%, respectively). Only 12.41% of respondents were satisfied with their online health information searches. (3) Almost all had seen (98.35%) or read (97.68%) health information; however, only 14.43% believed that WeChat health information could improve health. Nearly one-third frequently received and read health information through WeChat. WeChat was selected (63.26%) as the most anticipated means of obtaining health information. (4) The major concerns regarding health information on WeChat included the following: excessively homogeneous information, the lack of a guarantee of professionalism, and the presence of advertisements. (5) Finally, the general public was most interested in individualized and interactive health information from their managing clinicians; they would benefit greatly from social media rather than Internet search tools. The current state of health information acquisition is worrisome, but the public has a high chance of accessing health information via WeChat. The growing popularity of interactive social platforms (e.g., WeChat) presents a variety of challenges and opportunities with respect to public health information acquisition.
NASA Technical Reports Server (NTRS)
Snead, C. J.; McCubbin, F. M.; Nakamura-Messenger, K.; Righter, K.
2018-01-01
The Astromaterials Acquisition and Curation office at NASA Johnson Space Center has established an Advanced Curation program that is tasked with developing procedures, technologies, and data sets necessary for the curation of future astromaterials collections as envisioned by NASA exploration goals. One particular objective of the Advanced Curation program is the development of new methods for the collection, storage, handling and characterization of small (less than 100 micrometer) particles. Astromaterials Curation currently maintains four small particle collections: cosmic dust that has been collected in Earth's stratosphere by ER2 and WB-57 aircraft, comet 81P/Wild 2 dust returned by NASA's Stardust spacecraft, interstellar dust that was returned by Stardust, and asteroid Itokawa particles that were returned by JAXA's Hayabusa spacecraft. NASA Curation is currently preparing for the anticipated return of two new astromaterials collections: asteroid Ryugu regolith to be collected by the Hayabusa2 spacecraft in 2021 (samples will be provided by JAXA as part of an international agreement), and asteroid Bennu regolith to be collected by the OSIRIS-REx spacecraft and returned in 2023. A substantial portion of these returned samples is expected to consist of small particle components, and mission requirements necessitate the development of new processing tools and methods in order to maximize the scientific yield from these valuable acquisitions. Here we describe initial progress towards the development of applicable sample handling methods for the successful curation of future small particle collections.
Mars sample return: Site selection and sample acquisition study
NASA Technical Reports Server (NTRS)
Nickle, N. (Editor)
1980-01-01
Various vehicle and mission options were investigated for the continued exploration of Mars; the cost of a minimum sample return mission was estimated; options and concepts were synthesized into program possibilities; and recommendations for the next Mars mission were made to the Planetary Program Office. Specific sites and all relevant spacecraft and ground-based data were studied in order to determine: (1) the adequacy of presently available data for identifying landing sites for a sample return mission that would assure the acquisition of material from the most important geologic provinces of Mars; (2) the degree of surface mobility required to assure sample acquisition at these sites; (3) techniques to be used in the selection and drilling of rock samples; and (4) the degree of mobility required at the two Viking sites to acquire these samples.
Grijalva, Carlos G.; Griffin, Marie R.; Edwards, Kathryn M.; Williams, John V.; Gil, Ana I.; Verastegui, Hector; Hartinger, Stella M.; Vidal, Jorge E.; Klugman, Keith P.; Lanata, Claudio F.
2014-01-01
Background. Animal models suggest that influenza infection favors nasopharyngeal acquisition of pneumococci. We assessed this relationship with influenza and other respiratory viruses in young children. Methods. A case-control study was nested within a prospective cohort study of acute respiratory illness (ARI) in Andean children <3 years of age (RESPIRA-PERU study). Weekly household visits were made to identify ARI and obtain nasal swabs for viral detection using real-time reverse-transcription polymerase chain reaction. Monthly nasopharyngeal (NP) samples were obtained to assess pneumococcal colonization. We determined whether specific respiratory viral ARI episodes occurring within the interval between NP samples increased the risk of NP acquisition of new pneumococcal serotypes. Results. A total of 729 children contributed 2128 episodes of observation, including 681 pneumococcal acquisition episodes (new serotype, not detected in prior sample), 1029 nonacquisition episodes (no colonization or persistent colonization with the same serotype as the prior sample), and 418 indeterminate episodes. The risk of pneumococcal acquisition increased following influenza-ARI (adjusted odds ratio [AOR], 2.19; 95% confidence interval [CI], 1.02–4.69) and parainfluenza-ARI (AOR, 1.86; 95% CI, 1.15–3.01), when compared with episodes without ARI. Other viral infections (respiratory syncytial virus, human metapneumovirus, human rhinovirus, and adenovirus) were not associated with acquisition. Conclusions. Influenza and parainfluenza ARIs appeared to facilitate pneumococcal acquisition among young children. As acquisition increases the risk of pneumococcal diseases, these observations are pivotal in our attempts to prevent pneumococcal disease. PMID:24621951
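For readers unfamiliar with the statistic, an adjusted odds ratio such as those quoted above is the exponentiated exposure coefficient from a logistic regression that includes the covariates. A synthetic-data sketch follows (variable names and coefficients are invented for illustration, not the study's data):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic illustration: exposure = influenza-ARI in the interval,
# outcome = acquisition of a new pneumococcal serotype, age as covariate.
rng = np.random.default_rng(1)
df = pd.DataFrame({"flu_ari": rng.integers(0, 2, 500),
                   "age_months": rng.uniform(1, 36, 500)})
logit_p = -0.8 + 0.78 * df["flu_ari"] + 0.01 * df["age_months"]
df["acquired"] = (rng.random(500) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[["flu_ari", "age_months"]])
model = sm.Logit(df["acquired"], X).fit(disp=0)
print("adjusted OR:", np.exp(model.params["flu_ari"]))
```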
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-30
... tool. The PBP analysis tool is a cash-flow model for evaluating alternative financing arrangements, and is... that reflects adequate consideration to the Government for the improved contractor cash flow...
Astromaterials Acquisition and Curation Office (KT) Overview
NASA Technical Reports Server (NTRS)
Allen, Carlton
2014-01-01
The Astromaterials Acquisition and Curation Office has the unique responsibility to curate NASA's extraterrestrial samples, from past and forthcoming missions, into the indefinite future. Currently, curation includes documentation, preservation, physical security, preparation, and distribution of samples from the Moon, asteroids, comets, the solar wind, and the planet Mars. Each of these sample sets has a unique history and comes from a unique environment. The curation laboratories and procedures developed over 40 years have proven both necessary and sufficient to serve the evolving needs of a worldwide research community. A new generation of sample return missions to destinations across the solar system is being planned and proposed. The curators are developing the tools and techniques to meet the challenges of these new samples. Extraterrestrial samples pose unique curation requirements. These samples were formed and exist under conditions strikingly different from those on the Earth's surface. Terrestrial contamination would destroy much of the scientific significance of extraterrestrial materials. To preserve the research value of these precious samples, contamination must be minimized, understood, and documented. In addition, the samples must be preserved, as far as possible, from physical and chemical alteration. The elaborate curation facilities at JSC were designed and constructed, and have been operated for many years, to keep sample contamination and alteration to a minimum. Currently, JSC curates seven collections of extraterrestrial samples: (a) lunar rocks and soils collected by the Apollo astronauts, (b) meteorites collected on dedicated expeditions to Antarctica, (c) cosmic dust collected by high-altitude NASA aircraft, (d) solar wind atoms collected by the Genesis spacecraft, (e) comet particles collected by the Stardust spacecraft, (f) interstellar dust particles collected by the Stardust spacecraft, and (g) asteroid soil particles collected by the Japan Aerospace Exploration Agency (JAXA) Hayabusa spacecraft. Each of these sample sets has a unique history and comes from a unique environment. We have developed specialized laboratories and practices over many years to preserve and protect the samples, not only for current research but for studies that may be carried out in the indefinite future.
Radial q-space sampling for DSI
Baete, Steven H.; Yutzy, Stephen; Boada, Fernando E.
2015-01-01
Purpose Diffusion Spectrum Imaging (DSI) has been shown to be an effective tool for non-invasively depicting the anatomical details of brain microstructure. Existing implementations of DSI sample the diffusion encoding space using a rectangular grid. Here we present a different implementation of DSI whereby a radially symmetric q-space sampling scheme for DSI (RDSI) is used to improve the angular resolution and accuracy of the reconstructed Orientation Distribution Functions (ODF). Methods Q-space is sampled by acquiring several q-space samples along a number of radial lines. Each of these radial lines in q-space is analytically connected to a value of the ODF at the same angular location by the Fourier slice theorem. Results Computer simulations and in vivo brain results demonstrate that RDSI correctly estimates the ODF when moderately high b-values (4000 s/mm2) and number of q-space samples (236) are used. Conclusion The nominal angular resolution of RDSI depends on the number of radial lines used in the sampling scheme, and only weakly on the maximum b-value. In addition, the radial analytical reconstruction reduces truncation artifacts which affect Cartesian reconstructions. Hence, a radial acquisition of q-space can be favorable for DSI. PMID:26363002
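The core of the radial reconstruction can be sketched in a few lines: by the Fourier slice theorem, the 1D Fourier transform of the samples along one radial q-space line yields the propagator projection for that direction, which is then radially weighted to give one ODF value. The sketch below is a simplified illustration that ignores regridding and normalization details:

```python
import numpy as np

def odf_from_radial_line(signal_line, r_weight=2):
    """One ODF value from the q-space samples along a single radial line.

    signal_line: symmetric 1D samples along the line (through q = 0).
    The 1D FFT gives the propagator projection (Fourier slice theorem);
    the r^r_weight-weighted sum approximates the radially weighted ODF."""
    signal_line = np.asarray(signal_line, dtype=float)
    prop = np.abs(np.fft.fftshift(np.fft.fft(np.fft.ifftshift(signal_line))))
    n = prop.size
    r = np.abs(np.arange(n) - n // 2)
    return np.sum(prop * r ** r_weight)
```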
Ashtiani, Dariush; Venugopal, Hari; Belousoff, Matthew; Spicer, Bradley; Mak, Johnson; Neild, Adrian; de Marco, Alex
2018-04-06
Cryo-Electron Microscopy (cryo-EM) has become an invaluable tool for structural biology. Over the past decade, the advent of direct electron detectors and automated data acquisition has established cryo-EM as a central method in structural biology. However, challenges remain in the reliable and efficient preparation of samples in a manner which is compatible with high time resolution. The delivery of sample onto the grid is recognized as a critical step in the workflow as it is a source of variability and loss of material due to the blotting which is usually required. Here, we present a method for sample delivery and plunge freezing based on the use of Surface Acoustic Waves to deploy 6-8 µm droplets to the EM grid. This method minimises the sample dead volume and ensures vitrification within 52.6 ms from the moment the sample leaves the microfluidics chip. We demonstrate a working protocol to minimize the atomised volume and apply it to plunge freeze three different samples and provide proof that no damage occurs due to the interaction between the sample and the acoustic waves. Copyright © 2018 Elsevier Inc. All rights reserved.
On-line Monitoring for Cutting Tool Wear Condition Based on the Parameters
NASA Astrophysics Data System (ADS)
Han, Fenghua; Xie, Feng
2017-07-01
In machining, it is very important to monitor the working state of the cutting tool. Based on acceleration signals acquired at constant speed, time-domain and frequency-domain analysis of relevant indicators is used to monitor the tool wear condition online. The analysis results show that the method can effectively judge the tool wear condition during machining and has practical application value.
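By way of illustration, such indicators are commonly computed as below; this is a hedged sketch, not the paper's method, and the band limits, sampling rate, and synthetic signal are hypothetical.

```python
import numpy as np
from scipy.stats import kurtosis

# Typical time- and frequency-domain indicators for tool-wear monitoring from
# an acceleration signal. The 2-6 kHz band of interest is an assumption.

def wear_indicators(accel, fs, band=(2000.0, 6000.0)):
    rms = np.sqrt(np.mean(accel ** 2))              # time domain: RMS level
    kurt = kurtosis(accel)                          # time domain: impulsiveness
    spec = np.abs(np.fft.rfft(accel)) ** 2
    freqs = np.fft.rfftfreq(len(accel), 1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs < band[1])
    band_ratio = spec[in_band].sum() / spec.sum()   # frequency domain: band energy
    return rms, kurt, band_ratio

fs = 25000.0                                        # Hz, hypothetical sampling rate
t = np.arange(0, 1.0, 1.0 / fs)
accel = np.random.randn(t.size) + 0.5 * np.sin(2 * np.pi * 3000 * t)
print(wear_indicators(accel, fs))
```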
A Computational Tool to Detect and Avoid Redundancy in Selected Reaction Monitoring
Röst, Hannes; Malmström, Lars; Aebersold, Ruedi
2012-01-01
Selected reaction monitoring (SRM), also called multiple reaction monitoring, has become an invaluable tool for targeted quantitative proteomic analyses, but its application can be compromised by nonoptimal selection of transitions. In particular, complex backgrounds may cause ambiguities in SRM measurement results because peptides with interfering transitions similar to those of the target peptide may be present in the sample. Here, we developed a computer program, the SRMCollider, that calculates nonredundant theoretical SRM assays, also known as unique ion signatures (UIS), for a given proteomic background. We show theoretically that UIS of three transitions suffice to conclusively identify 90% of all yeast peptides and 85% of all human peptides. Using predicted retention times, the SRMCollider also simulates time-scheduled SRM acquisition, which reduces the number of interferences to consider and leads to fewer transitions necessary to construct an assay. By integrating experimental fragment ion intensities from large scale proteome synthesis efforts (SRMAtlas) with the information content-based UIS, we combine two orthogonal approaches to create high quality SRM assays ready to be deployed. We provide a user friendly, open source implementation of an algorithm to calculate UIS of any order that can be accessed online at http://www.srmcollider.org to find interfering transitions. Finally, our tool can also simulate the specificity of novel data-independent MS acquisition methods in Q1–Q3 space. This allows us to predict parameters for these methods that deliver a specificity comparable with that of SRM. Using SRM interference information in addition to other sources of information can increase the confidence in an SRM measurement. We expect that the consideration of information content will become a standard step in SRM assay design and analysis, facilitated by the SRMCollider. PMID:22535207
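The core redundancy test behind UIS can be sketched in a few lines. This is a toy illustration, not the SRMCollider implementation: the m/z values, tolerances, and background list are hypothetical, and real use would draw the background from a whole proteome.

```python
# A transition set is redundant if some background peptide falls inside the
# Q1 precursor window AND matches every Q3 fragment within tolerance; if no
# background peptide does, the set is a unique ion signature (UIS).

def is_uis(target_q1, target_q3s, background, q1_tol=0.7, q3_tol=0.7):
    """background: list of (q1, [q3 fragments]) for all other peptides."""
    for q1, q3s in background:
        if abs(q1 - target_q1) > q1_tol:
            continue                     # outside the precursor window
        # interfering only if every target transition is matched
        if all(any(abs(f - t) <= q3_tol for f in q3s) for t in target_q3s):
            return False
    return True

background = [(500.3, [628.4, 715.2, 844.5]), (500.9, [627.9, 905.1, 1015.6])]
print(is_uis(500.5, [628.1, 902.4], background))   # True -> non-redundant pair
```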
NASA Astrophysics Data System (ADS)
Tritscher, Torsten; Koched, Amine; Han, Hee-Siew; Filimundi, Eric; Johnson, Tim; Elzey, Sherrie; Avenido, Aaron; Kykal, Carsten; Bischof, Oliver F.
2015-05-01
Electrical mobility classification (EC) followed by Condensation Particle Counter (CPC) detection is the technique combined in Scanning Mobility Particle Sizers (SMPS) to retrieve nanoparticle size distributions in the range from 2.5 nm to 1 μm. The detectable size range of SMPS systems can be extended by the addition of an Optical Particle Sizer (OPS) that covers larger sizes from 300 nm to 10 μm. This optical sizing method reports an optical equivalent diameter, which is often different from the electrical mobility diameter measured by the standard SMPS technique. The Multi-Instrument Manager (MIM) software developed by TSI incorporates algorithms that facilitate merging SMPS data sets with data based on optical equivalent diameter to compile single, wide-range size distributions. Here we present MIM 2.0, the next generation of the data-merging tool, which offers many advanced features for data merging and post-processing. MIM 2.0 allows direct data acquisition with OPS and NanoScan SMPS instruments to retrieve real-time particle size distributions from 10 nm to 10 μm, which we show in a case study at a fireplace. The merged data can be adjusted using one of the merging options, which automatically determines an overall aerosol effective refractive index. As a result, an indirect and average characterization of aerosol optical and shape properties is possible. The merging tool allows several pre-settings, data averaging and adjustments, as well as the export of data sets and fitted graphs. MIM 2.0 also features several post-processing options for SMPS data, and differences can be visualized in a multi-peak sample over a narrow size range.
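The overlap-matching step such merging requires can be illustrated with a hedged sketch; this is not the MIM algorithm, and the synthetic distributions plus the reading of the scale factor as an effective refractive-index correction are assumptions.

```python
import numpy as np

# Scan a diameter scale factor applied to the optical diameters until the two
# size distributions agree in their overlap range. Data below are synthetic.

d_smps = np.logspace(1, 3, 60)                    # 10 nm - 1 um, mobility diam.
n_smps = np.exp(-(np.log(d_smps / 150)) ** 2)     # synthetic dN/dlogDp
d_ops = np.logspace(2.5, 4, 40)                   # 300 nm - 10 um, optical diam.
n_ops = np.exp(-(np.log(1.3 * d_ops / 150)) ** 2) # same aerosol, shifted sizing

def mismatch(scale):
    d_shift = scale * d_ops
    lo, hi = max(d_smps[0], d_shift[0]), min(d_smps[-1], d_shift[-1])
    grid = np.logspace(np.log10(lo), np.log10(hi), 20)   # overlap grid
    return np.sum((np.interp(grid, d_smps, n_smps)
                   - np.interp(grid, d_shift, n_ops)) ** 2)

scales = np.linspace(0.5, 2.0, 151)
best = scales[np.argmin([mismatch(s) for s in scales])]
print(f"best diameter scale factor: {best:.2f}")   # ~1/1.3 for this synthetic case
```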
ERIC Educational Resources Information Center
Rodriguez, Lisa Ann; Shepard, MaryFriend
2013-01-01
This study explored the perceptions of adult English language learners about audience response systems (clickers) as tools to facilitate communication. According to second language acquisition theory, learners' receptive capabilities in the early stages of second language acquisition surpass expressive capabilities, often rendering them silent in…
Widening the Knowledge Acquisition Bottleneck for Constraint-Based Tutors
ERIC Educational Resources Information Center
Suraweera, Pramuditha; Mitrovic, Antonija; Martin, Brent
2010-01-01
Intelligent Tutoring Systems (ITS) are effective tools for education. However, developing them is a labour-intensive and time-consuming process. A major share of the effort is devoted to acquiring the domain knowledge that underlies the system's intelligence. The goal of this research is to reduce this knowledge acquisition bottleneck and better…
NASA Technical Reports Server (NTRS)
Modesitt, Kenneth L.
1990-01-01
Since 1984, an effort has been underway at Rocketdyne, manufacturer of the Space Shuttle Main Engine (SSME), to automate much of the analysis procedure conducted after engine test firings. Previously published articles at national and international conferences have contained the context of and justification for this effort. Here, progress is reported in building the full system, including the extensions of integrating large databases with the system, known as Scotty. Inductive knowledge acquisition has proven itself to be a key factor in the success of Scotty. The combination of a powerful inductive expert system building tool (ExTran), a relational data base management system (Reliance), and software engineering principles and Computer-Assisted Software Engineering (CASE) tools makes for a practical, useful and state-of-the-art application of an expert system.
Clark, Andrea J.; Petty, Howard R.
2016-01-01
This protocol describes the methods and steps involved in performing biomarker ratio imaging microscopy (BRIM) using formalin fixed paraffin-embedded (FFPE) samples of human breast tissue. The technique is based on the acquisition of two fluorescence images of the same microscopic field using two biomarkers and immunohistochemical tools. The biomarkers are selected such that one biomarker correlates with breast cancer aggressiveness while the second biomarker anti-correlates with aggressiveness. When the former image is divided by the latter image, a computed ratio image is formed that reflects the aggressiveness of tumor cells while increasing contrast and eliminating path-length and other artifacts from the image. For example, the aggressiveness of epithelial cells may be assessed by computing ratio images of N-cadherin and E-cadherin images or CD44 and CD24 images, which specifically reflect the mesenchymal or stem cell nature of the constituent cells, respectively. This methodology is illustrated for tissue samples of ductal carcinoma in situ (DCIS) and invasive breast cancer. This tool should be useful in tissue studies of experimental cancer as well as the management of cancer patients. PMID:27857940
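The central computation of BRIM is a guarded pixel-wise division; a minimal sketch follows, in which the synthetic images and the epsilon guard are illustrative assumptions rather than the published protocol.

```python
import numpy as np

# Divide the image of a biomarker that correlates with aggressiveness by one
# that anti-correlates, pixel by pixel, to form the BRIM ratio image.

rng = np.random.default_rng(0)
n_cadherin = rng.uniform(0.0, 1.0, (256, 256))   # correlates with aggressiveness
e_cadherin = rng.uniform(0.1, 1.0, (256, 256))   # anti-correlates

eps = 1e-6                                        # avoid division by zero
ratio = n_cadherin / (e_cadherin + eps)

# Multiplicative factors common to both channels (illumination, section
# thickness/path length) cancel in the ratio, which is the source of the
# contrast enhancement described in the abstract.
print(ratio.mean(), ratio.max())
```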
NASA Technical Reports Server (NTRS)
Truszkowski, Walt; Paterra, Frank; Bailin, Sidney
1993-01-01
The old maxim goes: 'A picture is worth a thousand words'. The objective of the research reported in this paper is to demonstrate this idea as it relates to the knowledge acquisition process and the automated development of an expert system's rule base. A prototype tool, the Knowledge From Pictures (KFP) tool, has been developed which configures an expert system's rule base by an automated analysis of and reasoning about a 'picture', i.e., a graphical representation of some target system to be supported by the diagnostic capabilities of the expert system under development. This rule base, when refined, could then be used by the expert system for target system monitoring and fault analysis in an operational setting. Most people, when faced with the problem of understanding the behavior of a complicated system, resort to the use of some picture or graphical representation of the system as an aid in thinking about it. This depiction provides a means of helping the individual to visualize the behavior and dynamics of the system under study. An analysis of the picture, augmented with the individual's background information, allows the problem solver to codify knowledge about the system. This knowledge can, in turn, be used to develop computer programs to automatically monitor the system's performance. The approach taken in this research was to mimic this knowledge acquisition paradigm. A prototype tool was developed which provides the user with: (1) a mechanism for graphically representing sample system-configurations appropriate for the domain, and (2) a linguistic device for annotating the graphical representation with the behaviors and mutual influences of the components depicted in the graphic. The KFP tool, reasoning from the graphical depiction along with user-supplied annotations of component behaviors and inter-component influences, generates a rule base that could be used in automating the fault detection, isolation, and repair of the system.
A Sub-Sampling Approach for Data Acquisition in Gamma Ray Emission Tomography
NASA Astrophysics Data System (ADS)
Fysikopoulos, Eleftherios; Kopsinis, Yannis; Georgiou, Maria; Loudos, George
2016-06-01
State of the art data acquisition systems for small animal imaging gamma ray detectors often rely on free running Analog to Digital Converters (ADCs) and high density Field Programmable Gate Array (FPGA) devices for digital signal processing. In this work, a sub-sampling acquisition approach, which exploits a priori information regarding the shape of the obtained detector pulses, is proposed. Output pulse shape depends on the response of the scintillation crystal, the photodetector's properties, and the amplifier/shaper operation. Using these known characteristics of the detector pulses prior to digitization, one can model the voltage pulse derived from the shaper (a low-pass filter, last in the front-end electronics chain) in order to reduce the required sampling rate of the ADCs. Pulse shape estimation is then feasible by fitting only a small number of measurements. In particular, the proposed sub-sampling acquisition approach relies on a bi-exponential modeling of the pulse shape. We show that the properties of the pulse that are relevant for Single Photon Emission Computed Tomography (SPECT) event detection (i.e., position and energy) can be calculated by collecting just a small fraction of the number of samples usually collected in data acquisition systems used so far. Compared to the standard digitization process, the proposed sub-sampling approach allows the use of free running ADCs with the sampling rate reduced by a factor of 5. Two small detectors consisting of Cerium doped Gadolinium Aluminum Gallium Garnet (Gd3Al2Ga3O12 : Ce or GAGG:Ce) pixelated arrays (array elements: 2 × 2 × 5 mm3 and 1 × 1 × 10 mm3, respectively) coupled to a Position Sensitive Photomultiplier Tube (PSPMT) were used for experimental evaluation. The two detectors were used to obtain raw images and energy histograms under 140 keV and 661.7 keV irradiation, respectively. The sub-sampling acquisition technique (10 MHz sampling rate) was compared with a standard acquisition method (52 MHz sampling rate) in terms of energy resolution and image signal to noise ratio for both gamma ray energies. The Levenberg-Marquardt (LM) non-linear least-squares algorithm was used in post processing to fit the acquired data with the proposed model. The results showed that the analog pulses prior to digitization are estimated with high accuracy after fitting with the bi-exponential model.
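The fitting step can be sketched as follows, assuming a bi-exponential pulse model and scipy's Levenberg-Marquardt backend; the time constants, noise level, and exact parameterization below are illustrative, not the paper's detector values.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a bi-exponential pulse model to a sparse 10 MHz sampling of a detector
# pulse and recover the energy (pulse area) from the fitted parameters.

def pulse(t, A, tau_r, tau_f):
    return A * (np.exp(-t / tau_f) - np.exp(-t / tau_r))

t_sub = np.arange(0.0, 3.0, 0.1)                      # microseconds, 10 MHz
rng = np.random.default_rng(1)
v = pulse(t_sub, 1.0, 0.05, 0.5) + 0.005 * rng.standard_normal(t_sub.size)

popt, _ = curve_fit(pulse, t_sub, v, p0=[1.0, 0.03, 0.4], method="lm")
A, tau_r, tau_f = popt
energy = A * (tau_f - tau_r)          # analytic integral of the model pulse
print(f"A={A:.3f}, tau_r={tau_r:.3f} us, tau_f={tau_f:.3f} us, E~{energy:.3f}")
```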
Order of stimulus presentation influences children's acquisition in receptive identification tasks.
Petursdottir, Anna Ingeborg; Aguilar, Gabriella
2016-03-01
Receptive identification is usually taught in matching-to-sample format, which entails the presentation of an auditory sample stimulus and several visual comparison stimuli in each trial. Conflicting recommendations exist regarding the order of stimulus presentation in matching-to-sample trials. The purpose of this study was to compare acquisition in receptive identification tasks under 2 conditions: when the sample was presented before the comparisons (sample first) and when the comparisons were presented before the sample (comparison first). Participants included 4 typically developing kindergarten-age boys. Stimuli, which included birds and flags, were presented on a computer screen. Acquisition in the 2 conditions was compared in an adapted alternating-treatments design combined with a multiple baseline design across stimulus sets. All participants took fewer trials to meet the mastery criterion in the sample-first condition than in the comparison-first condition. © 2015 Society for the Experimental Analysis of Behavior.
An ontology-driven, diagnostic modeling system.
Haug, Peter J; Ferraro, Jeffrey P; Holmen, John; Wu, Xinzi; Mynam, Kumar; Ebert, Matthew; Dean, Nathan; Jones, Jason
2013-06-01
To present a system that uses knowledge stored in a medical ontology to automate the development of diagnostic decision support systems. To illustrate its function through an example focused on the development of a tool for diagnosing pneumonia. We developed a system that automates the creation of diagnostic decision-support applications. It relies on a medical ontology to direct the acquisition of clinical data from a clinical data warehouse and uses an automated analytic system to apply a sequence of machine learning algorithms that create applications for diagnostic screening. We refer to this system as the ontology-driven diagnostic modeling system (ODMS). We tested this system using samples of patient data collected in Salt Lake City emergency rooms and stored in Intermountain Healthcare's enterprise data warehouse. The system was used in the preliminary development steps of a tool to identify patients with pneumonia in the emergency department. This tool was compared with a manually created diagnostic tool derived from a curated dataset. The manually created tool is currently in clinical use. The automatically created tool had an area under the receiver operating characteristic curve of 0.920 (95% CI 0.916 to 0.924), compared with 0.944 (95% CI 0.942 to 0.947) for the manually created tool. Initial testing of the ODMS demonstrates promising accuracy for the highly automated results and illustrates the route to model improvement. The use of medical knowledge, embedded in ontologies, to direct the initial development of diagnostic computing systems appears feasible.
Hoffman, Melissa A; Fang, Bin; Haura, Eric B; Rix, Uwe; Koomen, John M
2018-01-05
Recent developments in instrumentation and bioinformatics have led to new quantitative mass spectrometry platforms including LC-MS/MS with data-independent acquisition (DIA) and targeted analysis using parallel reaction monitoring mass spectrometry (LC-PRM), which provide alternatives to well-established methods, such as LC-MS/MS with data-dependent acquisition (DDA) and targeted analysis using multiple reaction monitoring mass spectrometry (LC-MRM). These tools have been used to identify signaling perturbations in lung cancers and other malignancies, supporting the development of effective kinase inhibitors and, more recently, providing insights into therapeutic resistance mechanisms and drug repurposing opportunities. However, detection of kinases in biological matrices can be challenging; therefore, activity-based protein profiling enrichment of ATP-utilizing proteins was selected as a test case for exploring the limits of detection of low-abundance analytes in complex biological samples. To examine the impact of different MS acquisition platforms, quantification of kinase ATP uptake following kinase inhibitor treatment was analyzed by four different methods: LC-MS/MS with DDA and DIA, LC-MRM, and LC-PRM. For discovery data sets, DIA increased the number of identified kinases by 21% and reduced missingness when compared with DDA. In this context, MRM and PRM were most effective at identifying global kinome responses to inhibitor treatment, highlighting the value of a priori target identification and manual evaluation of quantitative proteomics data sets. We compare results for a selected set of desthiobiotinylated peptides from PRM, MRM, and DIA and identify considerations for selecting a quantification method and postprocessing steps that should be used for each data acquisition strategy.
Low cost light-sheet microscopy for whole brain imaging
NASA Astrophysics Data System (ADS)
Kumar, Manish; Nasenbeny, Jordan; Kozorovitskiy, Yevgenia
2018-02-01
Light-sheet microscopy has evolved as an indispensable tool in imaging biological samples. It can image 3D samples at fast speed, with high-resolution optical sectioning, and with reduced photobleaching effects. These properties make light-sheet microscopy ideal for imaging fluorophores in a variety of biological samples and organisms, e.g. zebrafish, drosophila, cleared mouse brains, etc. While most commercial turnkey light-sheet systems are expensive, the existing lower cost implementations, e.g. OpenSPIM, are focused on achieving high-resolution imaging of small samples or organisms like zebrafish. In this work, we substantially reduce the cost of a light-sheet microscope system while targeting much larger samples, i.e. cleared mouse brains, at single-cell resolution. The expensive components of a light-sheet system - excitation laser, water-immersion objectives, and translation stage - are replaced with an incoherent laser diode, dry objectives, and a custom-built Arduino-controlled translation stage. A low-cost CUBIC protocol is used to clear fixed mouse brain samples. The open-source platforms of μManager and Fiji support image acquisition, processing, and visualization. Our system can easily be extended to multi-color light-sheet microscopy.
Sediment Sampling in Estuarine Mudflats with an Aerial-Ground Robotic Team
Deusdado, Pedro; Guedes, Magno; Silva, André; Marques, Francisco; Pinto, Eduardo; Rodrigues, Paulo; Lourenço, André; Mendonça, Ricardo; Santana, Pedro; Corisco, José; Almeida, Susana Marta; Portugal, Luís; Caldeira, Raquel; Barata, José; Flores, Luis
2016-01-01
This paper presents a robotic team suited for bottom sediment sampling and retrieval in mudflats, targeting environmental monitoring tasks. The robotic team encompasses a four-wheel-steering ground vehicle, equipped with a drilling tool designed to be able to retain wet soil, and a multi-rotor aerial vehicle for dynamic aerial imagery acquisition. On-demand aerial imagery, properly fused on an aerial mosaic, is used by remote human operators for specifying the robotic mission and supervising its execution. This is crucial for the success of an environmental monitoring study, as often it depends on human expertise to ensure the statistical significance and accuracy of the sampling procedures. Although the literature is rich on environmental monitoring sampling procedures, in mudflats, there is a gap as regards including robotic elements. This paper closes this gap by also proposing a preliminary experimental protocol tailored to exploit the capabilities offered by the robotic system. Field trials in the south bank of the river Tagus’ estuary show the ability of the robotic system to successfully extract and transport bottom sediment samples for offline analysis. The results also show the efficiency of the extraction and the benefits when compared to (conventional) human-based sampling. PMID:27618060
Sequential time interleaved random equivalent sampling for repetitive signal.
Zhao, Yijiu; Liu, Jingjing
2016-12-01
Compressed sensing (CS) based sampling techniques exhibit many advantages over other existing approaches for sparse signal spectrum sensing; they are also incorporated into non-uniform sampling signal reconstruction to improve efficiency, as in random equivalent sampling (RES). However, in CS based RES, only one sample of each acquisition is considered in the signal reconstruction stage, which results in more acquisition runs and a longer sampling time. In this paper, a sampling sequence is taken in each RES acquisition run, and the corresponding block measurement matrix is constructed using a Whittaker-Shannon interpolation formula. All the block matrices are combined into an equivalent measurement matrix with respect to all sampling sequences. We implemented the proposed approach with a multi-core analog-to-digital converter (ADC), whose ADC cores are time interleaved. A prototype realization of this proposed CS based sequential random equivalent sampling method has been developed. It is able to capture an analog waveform at an equivalent sampling rate of 40 GHz while physically sampling at 1 GHz. Experiments indicate that, for a sparse signal, the proposed CS based sequential random equivalent sampling exhibits high efficiency.
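A hedged sketch of the block-matrix construction follows; sizes, rates, and offsets are illustrative, not the prototype's parameters. Each run's samples are expressed as sinc-weighted combinations of the signal on the fine equivalent-time grid, and the per-run blocks are stacked.

```python
import numpy as np

# Build the block measurement matrix for several RES acquisition runs using
# the Whittaker-Shannon interpolation formula: a physical sample at time t_k
# is a sinc-weighted sum of the signal on the fine grid (spacing T_eq).

N = 680                      # fine equivalent-time grid length
T_eq = 1.0 / 40e9            # 40 GHz equivalent sampling period
T_adc = 1.0 / 1e9            # 1 GHz physical ADC period
M = 16                       # samples taken per acquisition run

def block_matrix(offset):
    """Rows map the fine grid onto one run's M physical samples."""
    t = offset + np.arange(M) * T_adc        # this run's sample times
    n = np.arange(N) * T_eq                  # fine-grid times
    return np.sinc((t[:, None] - n[None, :]) / T_eq)

rng = np.random.default_rng(2)
offsets = rng.uniform(0, T_adc, size=4)      # random trigger-to-clock delays
Phi = np.vstack([block_matrix(o) for o in offsets])  # combined matrix
print(Phi.shape)                             # (4*M, N)
```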
Geospatial methods and data analysis for assessing distribution of grazing livestock
USDA-ARS?s Scientific Manuscript database
Free-ranging livestock research must begin with a well conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...
The Attitude Determination Scale for Value Acquisition: A Validity and Reliability Study
ERIC Educational Resources Information Center
Cetin, Saban
2017-01-01
This study aims to develop a measurement tool having measurement reliability with the aim of determining attitudes for values acquisition of secondary school students. The study was conducted on totally 325 high school senior students as 200 female and 125 male students in spring semester of 2014-2015 educational year. In the study, expert opinion…
Arduino-Based Data Acquisition into Excel, LabVIEW, and MATLAB
ERIC Educational Resources Information Center
Nichols, Daniel
2017-01-01
Data acquisition equipment for physics can be quite expensive. As an alternative, data can be acquired using a low-cost Arduino microcontroller. The Arduino has been used in physics labs where the data are acquired using the Arduino software. The Arduino software, however, does not contain a suite of tools for data fitting and analysis. The data…
Benkali, K; Marquet, P; Rérolle, JP; Le Meur, Y; Gastinel, LN
2008-01-01
Background: LC-MALDI-TOF/TOF analysis is a potent tool in biomarker discovery, characterized by its high sensitivity and high throughput capacity. However, methods based on MALDI-TOF/TOF for biomarker discovery still need optimization, in particular to reduce analysis time and to evaluate their reproducibility for peak intensity measurement. The aims of this methodological study were: (i) to optimize and critically evaluate each step of a urine biomarker discovery method based on Nano-LC coupled off-line to MALDI-TOF/TOF, taking full advantage of the dual decoupling between Nano-LC, MS and MS/MS to reduce the overall analysis time; (ii) to evaluate the quantitative performance and reproducibility of nano-LC-MALDI analysis in biomarker discovery; and (iii) to evaluate the robustness of biomarker selection. Results: A pool of urine samples spiked at increasing concentrations with a mixture of standard peptides was used as a specimen for biological samples with or without biomarkers. Extraction and nano-LC-MS variabilities were estimated by analyzing samples in triplicate and hexaplicate, respectively. The stability of chromatographic fractions immobilized with MALDI matrix on MALDI plates was evaluated by successive MS acquisitions after different storage times at different temperatures. Low coefficient of variation (CV%: 10-22%) and high correlation (R2 > 0.96) values were obtained for the quantification of the spiked peptides, allowing quantification of these peptides in the low femtomole range, correct group discrimination, and selection of "specific" markers using principal component analysis. Excellent peptide integrity and stable signal intensity were found when MALDI plates were stored for periods of up to 2 months at +4°C. This allowed storage of MALDI plates between LC separation and MS acquisition (first decoupling), and between MS and MS/MS acquisitions while the selection of inter-group discriminative ions is done (second decoupling). Finally, the recording of MS/MS spectra to obtain structural information was focused only on discriminative ions in order to minimize analysis time. Conclusion: Contrary to other classical approaches with direct online coupling of chromatographic separation and on-the-fly MS and/or MS/MS data acquisition for all detected analytes, our dual decoupling strategy allowed us to focus on the most discriminative analytes, giving us more time to acquire more replicates of the same urine samples, thus increasing detection sensitivity and mass precision. PMID:19014585
NASA Astrophysics Data System (ADS)
Canora, C. P.; Moral, A. G.; Rull, F.; Maurice, S.; Hutchinson, I.; Ramos, G.; López-Reyes, G.; Belenguer, T.; Canchal, R.; Prieto, J. A. R.; Rodriguez, P.; Santamaria, P.; Berrocal, A.; Colombo, M.; Gallago, P.; Seoane, L.; Quintana, C.; Ibarmia, S.; Zafra, J.; Saiz, J.; Santiago, A.; Marin, A.; Gordillo, C.; Escribano, D.; Sanz-Palominoa, M.
2017-09-01
The Raman Laser Spectrometer (RLS) is one of the Pasteur Payload instruments within the ESA's Aurora Exploration Programme ExoMars mission. Raman spectroscopy is based on the analysis of spectral fingerprints due to the inelastic scattering of light when interacting with matter. RLS is composed of three units, the SPU (Spectrometer Unit), iOH (Internal Optical Head), and ICEU (Instrument Control and Excitation Unit), plus the harnesses (EH and OH). The iOH focuses the excitation laser on the sample and collects the Raman emission, which is detected by the SPU (CCD); the analog video data are received, digitized, and transmitted to the processor module in the ICEU. The main sources of noise arise from the sample, the background, and the instrument (laser, CCD, focus, acquisition parameters, operation control). In the last case, the sources are mainly perturbations from the optics, dark signal, and readout noise. Flicker noise arising from laser emission fluctuations can also be considered instrument noise. In order to evaluate the SNR of a Raman instrument in a practical manner, it is useful to perform end-to-end measurements on given standard samples. These measurements have to be compared with radiometric simulations using Raman efficiency values from the literature and taking into account the different instrumental contributions to the SNR. The RLS EQM instrument performance results and functionalities have been demonstrated to be in accordance with the science expectations. The SNR performances obtained with the RLS EQM will be compared, experimentally and via analysis, with the Instrument Radiometric Model tool. The characterization process for SNR optimization is still ongoing. The operational parameters and RLS algorithms (fluorescence removal and acquisition parameter estimation) will be improved in future models (EQM-2) until FM Model delivery.
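For context, radiometric SNR budgets for CCD-based Raman spectrometers are commonly written with the standard shot-noise model below; this generic formula is an assumption for illustration, not the RLS Instrument Radiometric Model itself.

```latex
\mathrm{SNR} = \frac{S_R\,t}{\sqrt{S_R\,t + S_B\,t + D\,t + N_r^{2}}}
```

Here $S_R$ and $S_B$ are the Raman and background (fluorescence plus stray light) signal rates in electrons per second, $D$ is the dark-current rate, $t$ is the integration time, and $N_r$ is the RMS CCD read noise in electrons; all instrumental noise terms named in the abstract map onto the denominator.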
Real-time oil-saturation monitoring in rock cores with low-field NMR.
Mitchell, J; Howe, A M; Clarke, A
2015-07-01
Nuclear magnetic resonance (NMR) provides a powerful suite of tools for studying oil in reservoir core plugs at the laboratory scale. Low-field magnets are preferred for well-log calibration and to minimize magnetic-susceptibility-induced internal gradients in the porous medium. We demonstrate that careful data processing, combined with prior knowledge of the sample properties, enables real-time acquisition and interpretation of saturation state (relative amount of oil and water in the pores of a rock). Robust discrimination of oil and brine is achieved with diffusion weighting. We use this real-time analysis to monitor the forced displacement of oil from porous materials (sintered glass beads and sandstones) and to generate capillary desaturation curves. The real-time output enables in situ modification of the flood protocol and accurate control of the saturation state prior to the acquisition of standard NMR core analysis data, such as diffusion-relaxation correlations. Although applications to oil recovery and core analysis are demonstrated, the implementation highlights the general practicality of low-field NMR as an inline sensor for real-time industrial process control. Copyright © 2015 Elsevier Inc. All rights reserved.
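The diffusion-weighted discrimination admits a simple sketch: if the oil and water diffusivities are known a priori (the "prior knowledge of the sample properties" the abstract mentions), the bi-exponential decay is linear in the two amplitudes, so a fast least-squares solve yields the saturation. The values below are illustrative assumptions, not the paper's measurements.

```python
import numpy as np

# S(b) = A_oil * exp(-b * D_oil) + A_w * exp(-b * D_w); with D_oil and D_w
# fixed, solving for the amplitudes is a linear problem suited to real time.

D_oil, D_w = 1e-10, 2.3e-9           # m^2/s, assumed known diffusivities
b = np.linspace(0, 3e9, 12)          # s/m^2, diffusion weightings

rng = np.random.default_rng(3)
S = (0.35 * np.exp(-b * D_oil) + 0.65 * np.exp(-b * D_w)
     + 0.01 * rng.standard_normal(b.size))   # noisy synthetic signal

X = np.column_stack([np.exp(-b * D_oil), np.exp(-b * D_w)])
A, *_ = np.linalg.lstsq(X, S, rcond=None)    # [A_oil, A_w]
print(f"oil saturation ~ {A[0] / A.sum():.2f}")
```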
A novel 3D Cartesian random sampling strategy for Compressive Sensing Magnetic Resonance Imaging.
Valvano, Giuseppe; Martini, Nicola; Santarelli, Maria Filomena; Chiappino, Dante; Landini, Luigi
2015-01-01
In this work we propose a novel acquisition strategy for accelerated 3D Compressive Sensing Magnetic Resonance Imaging (CS-MRI). This strategy is based on a 3D Cartesian sampling with random switching of the frequency encoding direction with other K-space directions. Two 3D sampling strategies are presented. In the first strategy, the frequency encoding direction is randomly switched with one of the two phase encoding directions. In the second strategy, the frequency encoding direction is randomly chosen between all the directions of the K-Space. These strategies can lower the coherence of the acquisition, in order to produce reduced aliasing artifacts and to achieve a better image quality after Compressive Sensing (CS) reconstruction. Furthermore, the proposed strategies can reduce the typical smoothing of CS due to the limited sampling of high frequency locations. We demonstrated by means of simulations that the proposed acquisition strategies outperformed the standard Compressive Sensing acquisition. This results in a better quality of the reconstructed images and in a greater achievable acceleration.
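A minimal sketch of such a sampling mask follows; the grid size, line count, and restriction to axis-aligned readouts are illustrative assumptions in the spirit of the first strategy, not the paper's exact scheme.

```python
import numpy as np

# For each readout, draw a random line direction (the axis that plays the
# role of frequency encoding) and a random phase-encode position, so fully
# sampled lines pierce the 3D K-space along varying axes.

N = 64                                   # K-space grid: N x N x N
n_lines = 800                            # readout lines to acquire
mask = np.zeros((N, N, N), dtype=bool)

rng = np.random.default_rng(4)
for _ in range(n_lines):
    axis = rng.integers(0, 3)            # readout direction: x, y, or z
    i, j = rng.integers(0, N, size=2)    # random phase-encode position
    if axis == 0:
        mask[:, i, j] = True
    elif axis == 1:
        mask[i, :, j] = True
    else:
        mask[i, j, :] = True

print(f"sampled fraction: {mask.mean():.3f}")
```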
Improving the Acquisition and Management of Sample Curation Data
NASA Technical Reports Server (NTRS)
Todd, Nancy S.; Evans, Cindy A.; Labasse, Dan
2011-01-01
This paper discusses the current sample documentation processes used during and after a mission, examines the challenges and special considerations needed for designing effective sample curation data systems, and looks at the results of a simulated sample result mission and the lessons learned from this simulation. In addition, it introduces a new data architecture for an integrated sample Curation data system being implemented at the NASA Astromaterials Acquisition and Curation department and discusses how it improves on existing data management systems.
Frontiers of in situ electron microscopy
Zheng, Haimei; Zhu, Yimei; Meng, Shirley Ying
2015-01-01
In situ transmission electron microscopy (TEM) has become an increasingly important tool for materials characterization. It provides key information on the structural dynamics of a material during transformations and the correlation between structure and properties of materials. With the recent advances in instrumentation, including aberration corrected optics, sample environment control, the sample stage, and fast and sensitive data acquisition, in situ TEM characterization has become more and more powerful. In this article, a brief review of the current status and future opportunities of in situ TEM is included. It also provides an introduction to the six articles in this issue of MRS Bulletin that explore the frontiers of in situ electron microscopy, including liquid and gas environmental TEM, dynamic four-dimensional TEM, nanomechanics, ferroelectric domain switching studied by in situ TEM, and state-of-the-art atomic imaging of light elements (i.e., carbon atoms) and individual defects.
A user's guide to localization-based super-resolution fluorescence imaging.
Dempsey, Graham T
2013-01-01
Advances in far-field fluorescence microscopy over the past decade have led to the development of super-resolution imaging techniques that provide more than an order of magnitude improvement in spatial resolution compared to conventional light microscopy. One such approach, called Stochastic Optical Reconstruction Microscopy (STORM) uses the sequential, nanometer-scale localization of individual fluorophores to reconstruct a high-resolution image of a structure of interest. This is an attractive method for biological investigation at the nanoscale due to its relative simplicity, both conceptually and practically in the laboratory. Like most research tools, however, the devil is in the details. The aim of this chapter is to serve as a guide for applying STORM to the study of biological samples. This chapter will discuss considerations for choosing a photoswitchable fluorescent probe, preparing a sample, selecting hardware for data acquisition, and collecting and analyzing data for image reconstruction. Copyright © 2013 Elsevier Inc. All rights reserved.
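The localization step at the heart of STORM is a spot-by-spot Gaussian fit; the sketch below localizes one simulated fluorophore with scipy, with hypothetical pixel size, PSF width, and photon counts.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a 2D Gaussian to a single-fluorophore spot; the fitted center is the
# localization, with precision well below the optical PSF width.

def gauss2d(xy, A, x0, y0, s, b):
    x, y = xy
    return (A * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * s ** 2)) + b).ravel()

px = 1.0                                       # camera pixel (arbitrary units)
x, y = np.meshgrid(np.arange(11) * px, np.arange(11) * px)
rng = np.random.default_rng(5)
spot = gauss2d((x, y), 200, 5.3, 4.8, 1.4, 10).reshape(x.shape)
img = rng.poisson(spot).astype(float)          # shot-noise-limited image

p0 = [img.max(), 5.0, 5.0, 1.5, img.min()]
popt, _ = curve_fit(gauss2d, (x, y), img.ravel(), p0=p0)
print(f"localized at ({popt[1]:.3f}, {popt[2]:.3f})")
```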
Broadband polarized emission from P(NDI2OD-T2) polymer.
Ulrich, Steve; Sutch, Tabitha; Szulczewski, Greg; Schweizer, Matthias; Barbosa, Newton; Araujo, Paulo
2018-05-18
We investigate the P(NDI2OD-T2) photophysical properties via absorbance and fluorescence spectroscopy, in association with the experimental approach dubbed Stokes Spectroscopy, which provides valuable material information through the acquisition and analysis of the fluorescence polarization degree. By changing solvents and using different samples such as solutions, thick films, and thin films, it is possible to control the polarization degree spectrum associated with the fluorescence emitted by the polymer's isolated chains and aggregates. We show that the polarization degree could become a powerful tool to obtain information related to the samples' morphology, which is connected to their microscopic structure. Moreover, the polarization degree spectra suggest that depolarization effects linked to energy and charge transfer mechanisms are likely taking place. Our findings indicate that P(NDI2OD-T2) polymers are excellent candidates for the advancement of organic technologies that rely on the emission and detection of polarized light. © 2018 IOP Publishing Ltd.
Terahertz imaging applied to cancer diagnosis.
Brun, M-A; Formanek, F; Yasuda, A; Sekine, M; Ando, N; Eishii, Y
2010-08-21
We report on terahertz (THz) time-domain spectroscopy imaging of 10 μm thick histological sections. The sections are prepared according to standard pathological procedures and deposited on a quartz window for measurements in reflection geometry. Simultaneous acquisition of visible images enables registration of THz images and thus the use of digital pathology tools to investigate the links between the underlying cellular structure and specific THz information. An analytic model taking into account the polarization of the THz beam, its incidence angle, the beam shift between the reference and sample pulses as well as multiple reflections within the sample is employed to determine the frequency-dependent complex refractive index. Spectral images are produced through segmentation of the extracted refractive index data using clustering methods. Comparisons of visible and THz images demonstrate spectral differences not only between tumor and healthy tissues but also within tumors. Further visualization using principal component analysis suggests different mechanisms as to the origin of image contrast.
Wang, Yuhao; Li, Xin; Xu, Kai; Ren, Fengbo; Yu, Hao
2017-04-01
Compressive sensing is widely used in biomedical applications, and the sampling matrix plays a critical role on both quality and power consumption of signal acquisition. It projects a high-dimensional vector of data into a low-dimensional subspace by matrix-vector multiplication. An optimal sampling matrix can ensure accurate data reconstruction and/or high compression ratio. Most existing optimization methods can only produce real-valued embedding matrices that result in large energy consumption during data acquisition. In this paper, we propose an efficient method that finds an optimal Boolean sampling matrix in order to reduce the energy consumption. Compared to random Boolean embedding, our data-driven Boolean sampling matrix can improve the image recovery quality by 9 dB. Moreover, in terms of sampling hardware complexity, it reduces the energy consumption by 4.6× and the silicon area by 1.9× over the data-driven real-valued embedding.
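For flavor, the sketch below compresses a sparse signal with a random Boolean matrix (the baseline the paper improves upon, not its data-driven optimized matrix) and recovers it with orthogonal matching pursuit; sizes and sparsity are illustrative.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

# A Boolean sampling matrix is cheap in hardware: the projection y = Phi @ x
# needs only additions, no multipliers, which is the energy argument above.

n, m, k = 256, 96, 8                      # dimension, measurements, sparsity
rng = np.random.default_rng(6)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

Phi = (rng.random((m, n)) < 0.5).astype(float)   # random Boolean matrix
y = Phi @ x                                       # acquisition: sums only

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(Phi, y)
print(f"recovery error: {np.linalg.norm(omp.coef_ - x):.2e}")
```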
Digital Curation of Earth Science Samples Starts in the Field
NASA Astrophysics Data System (ADS)
Lehnert, K. A.; Hsu, L.; Song, L.; Carter, M. R.
2014-12-01
Collection of physical samples in the field is an essential part of research in the Earth Sciences. Samples provide a basis for progress across many disciplines, from the study of global climate change now and over the Earth's history, to present and past biogeochemical cycles, to magmatic processes and mantle dynamics. The types of samples, methods of collection, and scope and scale of sampling campaigns are highly diverse, ranging from large-scale programs to drill rock and sediment cores on land, in lakes, and in the ocean, to environmental observation networks with continuous sampling, to single investigator or small team expeditions to remote areas around the globe or trips to local outcrops. Cyberinfrastructure for sample-related fieldwork needs to cater to the different needs of these diverse sampling activities, aligning with specific workflows, regional constraints such as connectivity or climate, and processing of samples. In general, digital tools should assist with capture and management of metadata about the sampling process (location, time, method) and the sample itself (type, dimension, context, images, etc.), management of the physical objects (e.g., sample labels with QR codes), and the seamless transfer of sample metadata to data systems and software relevant to the post-sampling data acquisition, data processing, and sample curation. In order to optimize CI capabilities for samples, tools and workflows need to adopt community-based standards and best practices for sample metadata, classification, identification and registration. This presentation will provide an overview and updates of several ongoing efforts that are relevant to the development of standards for digital sample management: the ODM2 project that has generated an information model for spatially-discrete, feature-based earth observations resulting from in-situ sensors and environmental samples, aligned with OGC's Observation & Measurements model (Horsburgh et al, AGU FM 2014); implementation of the IGSN (International Geo Sample Number) as a globally unique sample identifier via a distributed system of allocating agents and a central registry; and the EarthCube Research Coordination Network iSamplES (Internet of Samples in the Earth Sciences) that aims to improve sharing and curation of samples through the use of CI.
Lonsdorf, Elizabeth V
2006-01-01
This paper explores the role of maternal influences on the acquisition of a tool-using task in wild chimpanzees (Pan troglodytes schweinfurthii) in order to build on and complement previous work done in captivity. Young chimpanzees show a long period of offspring dependency on mothers, and it is during this period that offspring learn several important skills, especially how and on what to forage. At Gombe National Park, one skill that is acquired during dependency is termite-fishing, a complex behavior that involves inserting a tool made from the surrounding vegetation into a termite mound and extracting the termites that attack and cling to the tool. All chimpanzees observed at Gombe have acquired the termite-fishing skill by the age of 5.5 years. Since the mother is the primary source of information throughout this time period, I investigated the influence of mothers' individual termite-fishing characteristics on their offspring's speed of acquisition and proficiency at the skill once acquired. Mother's time spent alone or with maternal family members, which is highly correlated with time spent termite-fishing, was positively correlated with offspring's acquisition of critical elements of the skill. I also investigated the specific types of social interactions that occur between mothers and offspring at the termite mound and found that mothers are highly tolerant of offspring, even when the behavior of the offspring may disrupt the termite-fishing attempt. However, no active facilitation by mothers of offspring's attempts was observed.
Effective use of metadata in the integration and analysis of multi-dimensional optical data
NASA Astrophysics Data System (ADS)
Pastorello, G. Z.; Gamon, J. A.
2012-12-01
Data discovery and integration relies on adequate metadata. However, creating and maintaining metadata is time consuming and often poorly addressed or avoided altogether, leading to problems in later data analysis and exchange. This is particularly true for research fields in which metadata standards do not yet exist or are under development, or within smaller research groups without enough resources. Vegetation monitoring using in-situ and remote optical sensing is an example of such a domain. In this area, data are inherently multi-dimensional, with spatial, temporal and spectral dimensions usually being well characterized. Other equally important aspects, however, might be inadequately translated into metadata. Examples include equipment specifications and calibrations, field/lab notes and field/lab protocols (e.g., sampling regimen, spectral calibration, atmospheric correction, sensor view angle, illumination angle), data processing choices (e.g., methods for gap filling, filtering and aggregation of data), quality assurance, and documentation of data sources, ownership and licensing. Each of these aspects can be important as metadata for search and discovery, but they can also be used as key data fields in their own right. If each of these aspects is also understood as an "extra dimension," it is possible to take advantage of them to simplify the data acquisition, integration, analysis, visualization and exchange cycle. Simple examples include selecting data sets of interest early in the integration process (e.g., only data collected according to a specific field sampling protocol) or applying appropriate data processing operations to different parts of a data set (e.g., adaptive processing for data collected under different sky conditions). More interesting scenarios involve guided navigation and visualization of data sets based on these extra dimensions, as well as partitioning data sets to highlight relevant subsets to be made available for exchange. The DAX (Data Acquisition to eXchange) Web-based tool uses a flexible metadata representation model and takes advantage of multi-dimensional data structures to translate metadata types into data dimensions, effectively reshaping data sets according to available metadata. With that, metadata is tightly integrated into the acquisition-to-exchange cycle, allowing for more focused exploration of data sets while also increasing the value of, and incentives for, keeping good metadata. The tool is being developed and tested with optical data collected in different settings, including laboratory, field, airborne, and satellite platforms.
Repeatable reference for positioning sensors and transducers in drill pipe
Hall, David R.; Fox, Joe; Pixton, David S.; Hall, Jr., H. Tracy
2005-05-03
A drill pipe having a box end having a tapered thread, an internal shoulder, and an external face for engagement with a drill pipe pin end having a tapered mating thread, an external shoulder, and an external face adapted for data acquisition or transmission. The relative dimensions of the box and pin ends are precisely controlled so that when the tool joint is made up, a repeatable reference plane is established for transmitting power and tuning downhole sensors, transducers, and means for sending and receiving data along the drill string. When the power or data acquisition and transmission means are located in the tool joint, the dimensions of the tool joint are further proportioned to compensate for the loss of cross-sectional area in order to maintain the joint's ability to sustain nominal makeup torque.
Lidierth, Malcolm
2005-02-15
This paper describes software that runs in the Spike2 for Windows environment and provides a versatile tool for generating stimuli during data acquisition from the 1401 family of interfaces (CED, UK). A graphical user interface (GUI) is used to provide dynamic control of stimulus timing. Both single stimuli and trains of stimuli can be generated. The pulse generation routines make use of programmable variables within the interface and allow these to be rapidly changed during an experiment. The routines therefore provide the ease-of-use associated with external, stand-alone pulse generators. Complex stimulus protocols can be loaded from an external text file and facilities are included to create these files through the GUI. The software consists of a Spike2 script that runs in the host PC, and accompanying routines, written in the 1401 sequencer control code, that run in the 1401 interface. Handshaking between the PC and the interface card is built into the routines and provides for full integration of sampling, analysis, and stimulus generation during an experiment. Control of the 1401 digital-to-analogue converters is also provided; this allows control of stimulus amplitude as well as timing and also provides a sample-hold feature that may be used to remove DC offsets and drift from recorded data.
Circular tomosynthesis for neuro perfusion imaging on an interventional C-arm
NASA Astrophysics Data System (ADS)
Claus, Bernhard E.; Langan, David A.; Al Assad, Omar; Wang, Xin
2015-03-01
There is a clinical need to improve cerebral perfusion assessment during the treatment of ischemic stroke in the interventional suite. The clinician is able to determine whether the arterial blockage was successfully opened but is unable to sufficiently assess blood flow through the parenchyma. C-arm spin acquisitions can image the cerebral blood volume (CBV) but are challenged to capture the temporal dynamics of the iodinated contrast bolus, which is required to derive, e.g., cerebral blood flow (CBF) and mean transit time (MTT). Here we propose to utilize a circular tomosynthesis acquisition on the C-arm to achieve the necessary temporal sampling of the volume at the cost of incomplete data. We address the incomplete data problem by using tools from compressed sensing and incorporate temporal interpolation to improve our temporal resolution. A CT neuro perfusion data set is utilized for generating a dynamic (4D) volumetric model from which simulated tomo projections are generated. The 4D model is also used as a ground truth reference for performance evaluation. The performance that may be achieved with the tomo acquisition and 4D reconstruction (under simulation conditions, i.e., without considering data fidelity limitations due to imaging physics and imaging chain) is evaluated. In the considered scenario, good agreement between the ground truth and the tomo reconstruction in the parenchyma was achieved.
Icy Soil Acquisition Device for the 2007 Phoenix Mars Lander
NASA Technical Reports Server (NTRS)
Chu, Philip; Wilson, Jack; Davis, Kiel; Shiraishi, Lori; Burke, Kevin
2008-01-01
The Icy Soil Acquisition Device is a first of its kind mechanism that is designed to acquire ice-bearing soil from the surface of the Martian polar region and transfer the samples to analytical instruments, playing a critical role in the potential discovery of existing water on Mars. The device incorporates a number of novel features that further the state of the art in spacecraft design for harsh environments, sample acquisition and handling, and high-speed low torque mechanism design.
NASA Astrophysics Data System (ADS)
Qiang, Wei
2011-12-01
We describe a sampling scheme for two-dimensional (2D) solid state NMR experiments, which can be readily applied to sensitivity-limited samples. The sampling scheme utilizes a continuous, non-uniform sampling profile for the indirect dimension, i.e. the acquisition number decreases as a function of the evolution time (t1) in the indirect dimension. For a beta amyloid (Aβ) fibril sample, we observed an overall 40-50% signal enhancement by measuring the cross peak volume, while the cross peak linewidths remained comparable to the linewidths obtained by regular sampling and processing strategies. Both the linear and Gaussian decay functions for the acquisition numbers result in a similar percentage of signal increment. In addition, we demonstrated that this sampling approach can be applied with different dipolar recoupling approaches such as radiofrequency assisted diffusion (RAD) and finite-pulse radio-frequency-driven recoupling (fpRFDR). This sampling scheme is especially suitable for sensitivity-limited samples which require long signal averaging for each t1 point, for instance biological membrane proteins where only a small fraction of the sample is isotopically labeled.
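The decaying acquisition-number profiles can be sketched directly; the totals, decay constants, and normalization note below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

# Continuous non-uniform sampling profile: the number of scans averaged at
# each indirect-dimension increment t1 decays linearly or as a Gaussian,
# concentrating measurement time where the signal is strong.

n_t1 = 64                            # indirect-dimension increments
ns_max, ns_min = 128, 16             # scans at first/last t1 point

t = np.arange(n_t1) / (n_t1 - 1)
ns_linear = np.round(ns_max + (ns_min - ns_max) * t).astype(int)
ns_gauss = np.round(ns_min + (ns_max - ns_min) * np.exp(-(t / 0.5) ** 2)).astype(int)

# One common normalization (an assumption here): scale each t1 slice by
# ns_max / ns(t1) before the Fourier transform so amplitudes stay consistent,
# since the summed FID amplitude grows linearly with the number of scans.
print(ns_linear[:5], ns_gauss[:5], ns_linear.sum(), ns_gauss.sum())
```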
TFTR CAMAC systems and components
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rauch, W.A.; Bergin, W.; Sichta, P.
1987-08-01
Princeton's Tokamak Fusion Test Reactor (TFTR) utilizes Computer Automated Measurement and Control (CAMAC) to provide instrumentation for real and quasi real time control, monitoring, and data acquisition systems. This paper describes the complement of CAMAC hardware systems and components that comprise the interface for tokamak control and measurement instrumentation, and communication with the central instrumentation control and data acquisition (CICADA) system. It also discusses CAMAC reliability and calibration, the types of modules used, a summary of data acquisition and control points, and various diagnostic maintenance tools used to support and troubleshoot typical CAMAC systems on TFTR.
Sparse-sampling with time-encoded (TICO) stimulated Raman scattering for fast image acquisition
NASA Astrophysics Data System (ADS)
Hakert, Hubertus; Eibl, Matthias; Karpf, Sebastian; Huber, Robert
2017-07-01
Modern biomedical imaging modalities aim to provide researchers with multimodal contrast for deeper insight into a specimen under investigation. A very promising technique is stimulated Raman scattering (SRS) microscopy, which can unveil the chemical composition of a sample with very high specificity. Although the signal intensities are enhanced manifold to achieve faster image acquisition compared to standard Raman microscopy, there is a trade-off between specificity and acquisition speed. Commonly used SRS concepts either probe only very few Raman transitions, as the tuning of the applied laser sources is complicated, or record whole spectra with a spectrometer-based setup. While the first approach is fast, it reduces the specificity; the spectrometer approach records whole spectra, including energy differences where no Raman information is present, which limits the acquisition speed. Therefore, we present a new approach based on the TICO-Raman concept, which we call sparse-sampling. The TICO-sparse-sampling setup is fully electronically controllable and allows probing of only the characteristic peaks of a Raman spectrum instead of always acquiring a whole spectrum. By reducing the spectral points to the relevant peaks, the acquisition time can be greatly reduced compared to a uniformly, equidistantly sampled Raman spectrum, while the specificity and the signal to noise ratio (SNR) are maintained. Furthermore, all laser sources are completely fiber based. The synchronized detection enables a full resolution of the Raman signal, whereas the analog and digital balancing allows shot noise limited detection. First imaging results with polystyrene (PS) and polymethylmethacrylate (PMMA) beads confirm the advantages of TICO sparse-sampling. We achieved a pixel dwell time as low as 35 μs for an image differentiating both species. The mechanical properties of the applied voice coil stage for scanning the sample currently limit even faster acquisition.
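The arithmetic behind the speed-up is simple; in the sketch below the PS and PMMA band positions are approximate literature values and the dwell time is taken from the abstract, while the uniform grid spacing is an assumption.

```python
import numpy as np

# Probe only the characteristic Raman peaks of the target species instead of
# a full equidistant spectrum, and compare the per-pixel spectral time.

full_grid = np.arange(500, 3200, 4)              # cm^-1, uniform spectrum
peaks = np.array([1001, 1603, 2904, 3054,        # approximate PS bands
                  812, 1450, 1730, 2950])        # approximate PMMA bands

dwell = 35e-6                                    # s per spectral point
t_full = full_grid.size * dwell
t_sparse = peaks.size * dwell
print(f"full: {t_full * 1e3:.1f} ms, sparse: {t_sparse * 1e3:.2f} ms, "
      f"speed-up x{t_full / t_sparse:.0f}")
```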
Emwas, Abdul-Hamid; Luchinat, Claudio; Turano, Paola; Tenori, Leonardo; Roy, Raja; Salek, Reza M; Ryan, Danielle; Merzaban, Jasmeen S; Kaddurah-Daouk, Rima; Zeri, Ana Carolina; Nagana Gowda, G A; Raftery, Daniel; Wang, Yulan; Brennan, Lorraine; Wishart, David S
The metabolic composition of human biofluids can provide important diagnostic and prognostic information. Among the biofluids most commonly analyzed in metabolomic studies, urine appears to be particularly useful. It is abundant, readily available, easily stored and can be collected by simple, noninvasive techniques. Moreover, given its chemical complexity, urine is particularly rich in potential disease biomarkers. This makes it an ideal biofluid for detecting or monitoring disease processes. Among the metabolomic tools available for urine analysis, NMR spectroscopy has proven to be particularly well-suited, because the technique is highly reproducible and requires minimal sample handling. As it permits the identification and quantification of a wide range of compounds, independent of their chemical properties, NMR spectroscopy has been frequently used to detect or discover disease fingerprints and biomarkers in urine. Although protocols for NMR data acquisition and processing have been standardized, no consensus on protocols for urine sample selection, collection, storage and preparation in NMR-based metabolomic studies have been developed. This lack of consensus may be leading to spurious biomarkers being reported and may account for a general lack of reproducibility between laboratories. Here, we review a large number of published studies on NMR-based urine metabolic profiling with the aim of identifying key variables that may affect the results of metabolomics studies. From this survey, we identify a number of issues that require either standardization or careful accounting in experimental design and provide some recommendations for urine collection, sample preparation and data acquisition.
ERIC Educational Resources Information Center
VanTol, Kathleen M.
2009-01-01
The purpose of this study was to design and establish the technical adequacy of curriculum-based measures (CBMs) of vocabulary acquisition for use with preschool children. This study sought to establish the technical adequacy of two tools that can be used for measuring benchmarks of vocabulary acquisition for both native English speakers and for…
Competency Assessment in Senior Emergency Medicine Residents for Core Ultrasound Skills.
Schmidt, Jessica N; Kendall, John; Smalley, Courtney
2015-11-01
Quality resident education in point-of-care ultrasound (POC US) is becoming increasingly important in emergency medicine (EM); however, the best methods to evaluate competency in graduating residents have not been established. We sought to design and implement a rigorous assessment of image acquisition and interpretation in POC US in a cohort of graduating residents at our institution. We evaluated nine senior residents on both image acquisition and image interpretation for five core US skills (focused assessment with sonography for trauma (FAST), aorta, echocardiogram (ECHO), pelvic, central line placement). Image acquisition was assessed using an observed clinical skills exam (OSCE) with a standardized patient model. Image interpretation was measured with a multiple-choice exam including normal and pathologic images. Residents performed well on image acquisition, with an average score of 85.7% for core skills and 74% when advanced skills (ovaries, advanced ECHO, advanced aorta) were included. Residents scored well but slightly lower on image interpretation, with an average score of 76%. Senior residents performed well on core POC US skills as evaluated with a rigorous assessment tool. This tool may be developed further for other EM programs to use for graduating resident evaluation.
ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research
Campagnola, Luke; Kratz, Megan B.; Manis, Paul B.
2014-01-01
The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org. PMID:24523692
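The modular device-and-protocol structure described above can be sketched in a few lines of Python. The class and method names below are hypothetical stand-ins for illustration only; they do not reproduce ACQ4's actual API.

```python
from dataclasses import dataclass, field

class Device:
    """One hardware module: a camera, patch amplifier, stage, shutter, etc."""
    def configure(self, **params):
        self.params = params
    def acquire(self):
        return {"device": type(self).__name__, **getattr(self, "params", {})}

class Camera(Device): pass
class PatchAmp(Device): pass

@dataclass
class Manager:
    """Central registry; a protocol is an ordered sequence of device steps."""
    devices: dict = field(default_factory=dict)

    def register(self, name, dev):
        self.devices[name] = dev          # plug-in style module registration

    def run_protocol(self, steps):
        return [self.devices[name].configure(**p) or self.devices[name].acquire()
                for name, p in steps]

m = Manager()
m.register("cam", Camera())
m.register("amp", PatchAmp())
print(m.run_protocol([("amp", {"mode": "VC"}), ("cam", {"exposure_ms": 10})]))
```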
Boot, Walter R; Sumner, Anna; Towne, Tyler J; Rodriguez, Paola; Anders Ericsson, K
2017-04-01
Video games are ideal platforms for the study of skill acquisition for a variety of reasons. However, our understanding of the development of skill and the cognitive representations that support skilled performance can be limited by a focus on game scores. We present an alternative approach to the study of skill acquisition in video games based on the tools of the Expert Performance Approach. Our investigation was motivated by a detailed analysis of the behaviors responsible for the superior performance of one of the highest scoring players of the video game Space Fortress (Towne, Boot, & Ericsson, ). This analysis revealed how certain behaviors contributed to his exceptional performance. In this study, we recruited a participant for a similar training regimen, but we collected concurrent and retrospective verbal protocol data throughout training. Protocol analysis revealed insights into strategies, errors, mental representations, and shifting game priorities. We argue that these insights into the developing representations that guided skilled performance could not easily have been derived without the tools of the Expert Performance Approach. We propose that the described approach could be applied to understand performance and skill acquisition in many different video games (and other short- to medium-term skill acquisition paradigms) and help reveal mechanisms of transfer from gameplay to other measures of laboratory and real-world performance. Copyright © 2016 Cognitive Science Society, Inc.
Microcomputer-Based Intelligent Tutoring Systems: An Assessment.
ERIC Educational Resources Information Center
Schaffer, John William
Computer-assisted instruction, while familiar to most teachers, has failed to become an effective self-motivating instructional tool. Developments in artificial intelligence, however, have provided new and better tools for exploring human knowledge acquisition and utilization. Expert system technology represents one of the most promising of these…
Update 76: Selected Recent Works in the Social Sciences.
ERIC Educational Resources Information Center
Pike, Mary L., Ed.; Lusignan, Louise, Ed.
This is a selected bibliography of current reference and acquisition tools in the social sciences. The tools include sourcebooks, dictionaries, indexes, conference proceedings, special bibliographies, directories, research reports, and journals. Most citations represent works published since 1970 and new editions of important earlier works.…
Lippok, Norman; Villiger, Martin; Jun, Chang-Su; Bouma, Brett E.
2015-01-01
Fiber-based polarization sensitive OFDI is more challenging than free-space implementations. Using multiple input states, fiber-based systems provide sample birefringence information with the benefit of a flexible sample arm, but come at the cost of increased system and acquisition complexity, and either reduce acquisition speed or require increased acquisition bandwidth. Here we show that with the calibration of a single polarization state, fiber-based configurations can approach the conceptual simplicity of traditional free-space configurations. We remotely control the polarization state of the light incident on the sample using the eigenpolarization states of a wave plate as a reference, and determine the Jones matrix of the output fiber. We demonstrate this method for polarization sensitive imaging of biological samples. PMID:25927775
Román, Jessica K; Walsh, Callee M; Oh, Junho; Dana, Catherine E; Hong, Sungmin; Jo, Kyoo D; Alleyne, Marianne; Miljkovic, Nenad; Cropek, Donald M
2018-03-01
Laser-ablation electrospray ionization (LAESI) imaging mass spectrometry (IMS) is an emerging bioanalytical tool for direct imaging and analysis of biological tissues. Performing ionization in an ambient environment, this technique requires little sample preparation and no additional matrix, and can be performed on natural, uneven surfaces. When combined with optical microscopy, the investigation of biological samples by LAESI allows for spatially resolved compositional analysis. We demonstrate here the applicability of LAESI-IMS for the chemical analysis of thin, desiccated biological samples, specifically Neotibicen pruinosus cicada wings. Positive-ion LAESI-IMS accurate ion-map data were acquired from several wing cells and superimposed onto optical images, allowing for compositional comparisons across areas of the wing. Various putative chemical identifications were made, indicating the presence of hydrocarbons, lipids/esters, amines/amides, and sulfonated/phosphorylated compounds. With this spatial resolution capability, surprising chemical distribution patterns were observed across the cicada wing, which may assist in correlating trends in surface properties with chemical distribution. Observed ions were either (1) equally dispersed across the wing, (2) more concentrated closer to the body of the insect (proximal end), or (3) more concentrated toward the tip of the wing (distal end). These findings demonstrate LAESI-IMS as a tool for the acquisition of spatially resolved chemical information from fragile, dried insect wings. The technique has important implications for the study of functional biomaterials, where understanding the correlation between chemical composition, physical structure, and biological function is critical. Graphical abstract: positive-ion laser-ablation electrospray ionization mass spectrometry coupled with optical imaging provides a powerful tool for the spatially resolved chemical analysis of cicada wings.
NASA Astrophysics Data System (ADS)
Clausing, Eric; Kraetzer, Christian; Dittmann, Jana; Vielhauer, Claus
2012-10-01
An important part of criminalistic forensics is the analysis of toolmarks. Such toolmarks often consist of many single striations, scratches, and dents that can allow conclusions regarding the sequence of events or the tools used. To obtain qualified results from an automated analysis and contactless acquisition of such toolmarks, a detailed digital representation of the marks, their orientation, and their placement relative to each other is required. For marks from firearms and tools, the desired result of an analysis is a conclusion as to whether or not a mark was generated by a tool under suspicion. For toolmark analysis on locking cylinders, the aim is not the identification of the used tool but rather the identification of the opening method. The challenge of such an identification is that a one-to-one comparison of two images is not sufficient: although two marked objects may look completely different with regard to the specific location and shape of the found marks, they can still represent samples of the identical opening method. This paper provides a first approach for modelling toolmarks on lock pins and takes into consideration the different requirements necessary to generate a detailed and interpretable digital representation of these traces. These requirements are 'detail', i.e. adequate features that allow a suitable representation and interpretation of single marks; 'meta detail', i.e. adequate representation of the context and connection between all marks; and 'distinctiveness', i.e. the possibility to reliably distinguish different sample types by the corresponding model. The model is evaluated with a set of 15 physical samples (resulting in 675 digital scans) of lock pins from cylinders opened with different opening methods, scanned contactlessly with a confocal laser microscope. The presented results suggest a high suitability for the intended purpose of opening-method determination.
Radial q-space sampling for DSI.
Baete, Steven H; Yutzy, Stephen; Boada, Fernando E
2016-09-01
Diffusion spectrum imaging (DSI) has been shown to be an effective tool for noninvasively depicting the anatomical details of brain microstructure. Existing implementations of DSI sample the diffusion encoding space using a rectangular grid. Here we present a different implementation of DSI whereby a radially symmetric q-space sampling scheme is used to improve the angular resolution and accuracy of the reconstructed orientation distribution functions. Q-space is sampled by acquiring several q-space samples along a number of radial lines. Each of these radial lines in q-space is analytically connected to a value of the orientation distribution function at the same angular location by the Fourier slice theorem. Computer simulations and in vivo brain results demonstrate that radial diffusion spectrum imaging correctly estimates the orientation distribution functions when moderately high b-values (4000 s/mm²) and numbers of q-space samples (236) are used. The nominal angular resolution of radial diffusion spectrum imaging depends on the number of radial lines used in the sampling scheme, and only weakly on the maximum b-value. In addition, the radial analytical reconstruction reduces truncation artifacts that affect Cartesian reconstructions. Hence, a radial acquisition of q-space can be favorable for DSI. Magn Reson Med 76:769-780, 2016. © 2015 Wiley Periodicals, Inc.
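The Fourier-slice relationship that underpins the radial scheme can be sketched as follows: the q-space samples along one radial line are the 1D Fourier transform of the diffusion propagator projected onto that direction, so each line yields one ODF value directly. The 59 lines of 4 samples each below echo the 236-sample total mentioned above, but the toy anisotropic decay model and the r² weighting are illustrative assumptions, not the validated acquisition.

```python
import numpy as np

def odf_value(signal_along_line):
    """One ODF sample from the q-space signal measured along one radial line."""
    # Symmetrize (the diffusion signal is real and even in q), then 1D FFT.
    s = np.concatenate([signal_along_line[::-1], signal_along_line[1:]])
    propagator = np.abs(np.fft.fftshift(np.fft.fft(np.fft.ifftshift(s))))
    r = np.abs(np.arange(len(propagator)) - len(propagator) // 2)
    return float(np.sum(propagator * r**2))   # ODF ~ integral of P(r*u) r^2 dr

n_lines, n_q = 59, 4
q = np.linspace(0.0, 1.0, n_q + 1)
directions = np.linspace(0.0, np.pi, n_lines, endpoint=False)

# Toy anisotropic Gaussian decay: fastest signal decay along the fiber axis
# (placed at 90 degrees), hence the broadest propagator and largest ODF there.
odf = [odf_value(np.exp(-(2.0 + 6.0 * np.sin(a) ** 2) * q**2)) for a in directions]
print(f"ODF peak at {np.degrees(directions[int(np.argmax(odf))]):.0f} deg")  # ~90
```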
A seamless acquisition digital storage oscilloscope with three-dimensional waveform display
NASA Astrophysics Data System (ADS)
Yang, Kuojun; Tian, Shulin; Zeng, Hao; Qiu, Lei; Guo, Lianping
2014-04-01
In a traditional digital storage oscilloscope (DSO), sampled data need to be processed after each acquisition. During data processing, acquisition is stopped and the oscilloscope is blind to the input signal; this duration is called dead time. With the rapid development of modern electronic systems, the effect of infrequent events becomes significant. To capture these occasional events in a shorter time, the dead time in a traditional DSO, which causes loss of the measured signal, needs to be reduced or even eliminated. In this paper, a seamless acquisition oscilloscope without dead time is proposed. In this oscilloscope, a three-dimensional waveform mapping (TWM) technique, which converts sampled data to a displayed waveform, is proposed. With this technique, not only is the processing speed improved, but the probability information of the waveform is also displayed with different brightness, presenting a three-dimensional waveform to the user. To reduce processing time further, we propose parallel TWM, which processes several sampled points simultaneously, and a pipelining technique based on dual-port random access memory, which processes one sampled point per clock period. Furthermore, two DDR3 (Double-Data-Rate Three Synchronous Dynamic Random Access Memory) modules store sampled data alternately, so acquisition can continue during data processing. The dead time of the DSO is thereby eliminated. In addition, a double-pulse test method is adopted to test the waveform capture rate (WCR) of the oscilloscope, and a combined-pulse test method is employed to evaluate the oscilloscope's capture ability comprehensively. The experimental results show that the WCR of the designed oscilloscope is 6,250,000 wfms/s (waveforms per second), the highest value among all existing oscilloscopes. The test results also show that there is no dead time in our oscilloscope, thus realizing seamless acquisition.
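The TWM idea of turning many acquired traces into a brightness-coded display can be sketched as a 2D histogram over (time, amplitude) cells whose counts act as the third dimension. Trace count, record length, and amplitude levels below are illustrative, not the oscilloscope's actual parameters.

```python
import numpy as np

n_traces, n_samples, n_levels = 10_000, 512, 256

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, n_samples)
traces = np.sin(t) + 0.05 * rng.standard_normal((n_traces, n_samples))

# Quantize amplitudes to display rows, then count hits per (row, column) cell.
rows = np.clip(((traces + 2.0) / 4.0 * (n_levels - 1)).astype(int), 0, n_levels - 1)
cols = np.broadcast_to(np.arange(n_samples), rows.shape)
hist = np.zeros((n_levels, n_samples), dtype=np.int64)
np.add.at(hist, (rows.ravel(), cols.ravel()), 1)

brightness = hist / hist.max()   # frequent trajectories appear brighter
print(brightness.shape, float(brightness.max()))
```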
Development of induction current acquisition device based on ARM
NASA Astrophysics Data System (ADS)
Ji, Yanju; Liu, Xiyang; Huang, Wanyu; Yao, Jiang; Yuan, Guiyang; Hui, Luan; Guan, Shanshan
2018-03-01
We designed an induction-current acquisition device based on ARM in order to achieve high-resolution, high-sampling-rate acquisition of the induction current in a wire loop. Considering the signal's fast attenuation and small amplitude, we use multi-path fusion for noise suppression. In this paper, the design is described in three aspects: analog circuit and device selection, independent power-supply structure, and suppression of high-frequency electromagnetic interference. DMA with ping-pong buffering, as a data transmission technique, solves the real-time storage problem for massive data. The performance parameters of the ARM acquisition device were tested, and a comparison test between the ARM device and a cRIO acquisition device was performed at different time constants. The results show that the device has a 120 dB dynamic range, 47 kHz bandwidth, 96 kHz sampling rate, and 5 μV minimum resolution, with an average error of no more than 4%, which demonstrates the high accuracy and stability of the device.
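A plain-Python sketch of the ping-pong buffering idea follows: acquisition fills one buffer while the previously filled buffer is drained to storage, so no samples are lost between records. The buffer length and callback structure are illustrative; the real device does this with DMA hardware.

```python
import numpy as np

BUF_LEN = 96_000                     # one second at the stated 96 kHz rate

buffers = [np.empty(BUF_LEN), np.empty(BUF_LEN)]
active = 0                           # index of the buffer being filled

def on_buffer_full(store):
    """Called when the active buffer fills: swap, then drain the full one."""
    global active
    full = buffers[active]
    active ^= 1                      # acquisition continues into the other buffer
    store(full.copy())               # storage runs while acquisition proceeds

stored = []
on_buffer_full(stored.append)
on_buffer_full(stored.append)
print(len(stored), "buffers stored; acquisition never paused")
```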
NASA Technical Reports Server (NTRS)
Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K.; Zeigler, R. A.; Hanna, R. D.; Ketcham, R. A.
2016-01-01
New technologies make possible the advancement of documentation and visualization practices that can enhance conservation and curation protocols for NASA's Astromaterials Collections. With increasing demands for accessibility to updated comprehensive data, and with new sample return missions on the horizon, it is of primary importance to develop new standards for contemporary documentation and visualization methodologies. Our interdisciplinary team has expertise in the fields of heritage conservation practices, professional photography, photogrammetry, imaging science, application engineering, data curation, geoscience, and astromaterials curation. Our objective is to create virtual 3D reconstructions of Apollo Lunar and Antarctic Meteorite samples that are a fusion of two state-of-the-art data sets: the interior view of the sample from Micro-XCT data and the exterior view of the sample from high-resolution precision photography data. Together these data provide researchers an information-rich visualization of both compositional and textural information prior to any physical sub-sampling. Since January 2013 we have developed a process that resulted in the successful creation of the first image-based 3D reconstruction of an Apollo Lunar Sample correlated to a 3D reconstruction of the same sample's Micro-XCT data, illustrating that this technique is both operationally possible and functionally beneficial. In May 2016 we began a 3-year research period during which we aim to produce Virtual Astromaterials Samples for 60 high-priority Apollo Lunar and Antarctic Meteorite samples and serve them on NASA's Astromaterials Acquisition and Curation website. Our research demonstrates that research-grade Virtual Astromaterials Samples are beneficial in preserving for posterity a precise 3D reconstruction of each sample prior to sub-sampling. They greatly improve documentation practices, provide unique and novel visualization of a sample's interior and exterior features, offer scientists a preliminary research tool for targeted sub-sample requests, and serve as a visually engaging interactive tool for bringing astromaterials science to the public.
Single Cell Proteomics in Biomedicine: High-dimensional Data Acquisition, Visualization and Analysis
Su, Yapeng; Shi, Qihui; Wei, Wei
2017-01-01
New insights into cellular heterogeneity over the last decade have driven the development of a variety of single-cell omics tools at a lightning pace. The resulting high-dimensional single-cell data generated by these tools require new theoretical approaches and analytical algorithms for effective visualization and interpretation. In this review, we briefly survey the state-of-the-art single-cell proteomic tools, with a particular focus on data acquisition and quantification, followed by an elaboration of a number of statistical and computational approaches developed to date for dissecting high-dimensional single-cell data. The underlying assumptions, unique features, and limitations of the analytical methods, along with the designated biological questions they seek to answer, are discussed. Particular attention is given to information-theoretical approaches that are anchored in a set of first principles of physics and can yield detailed (and often surprising) predictions. PMID:28128880
WebQuests as Language-Learning Tools
ERIC Educational Resources Information Center
Aydin, Selami
2016-01-01
This study presents a review of the literature that examines WebQuests as tools for second-language acquisition and foreign language-learning processes to guide teachers in their teaching activities and researchers in further research on the issue. The study first introduces the theoretical background behind WebQuest use in the mentioned…
Tools for Scientific Thinking: Microcomputer-Based Laboratories for the Naive Science Learner.
ERIC Educational Resources Information Center
Thornton, Ronald K.
A promising new development in science education is the use of microcomputer-based laboratory tools that allow for student-directed data acquisition, display, and analysis. Microcomputer-based laboratories (MBL) make use of inexpensive microcomputer-connected probes to measure such physical quantities as temperature, position, and various…
Mobile Adaptive Communication Support for Vocabulary Acquisition
ERIC Educational Resources Information Center
Epp, Carrie Demmans
2014-01-01
This work explores the use of an adaptive mobile tool for language learning. A school-based deployment study showed that the tool supported learning. A second study is being conducted in informal learning environments. Current work focuses on building models that increase our understanding of the relationship between application usage and learning.
VETA x ray data acquisition and control system
NASA Technical Reports Server (NTRS)
Brissenden, Roger J. V.; Jones, Mark T.; Ljungberg, Malin; Nguyen, Dan T.; Roll, John B., Jr.
1992-01-01
We describe the X-ray Data Acquisition and Control System (XDACS) used together with the X-ray Detection System (XDS) to characterize the X-ray image during testing of the AXAF P1/H1 mirror pair at the MSFC X-ray Calibration Facility. A variety of X-ray data were acquired, analyzed and archived during the testing including: mirror alignment, encircled energy, effective area, point spread function, system housekeeping and proportional counter window uniformity data. The system architecture is presented with emphasis placed on key features that include a layered UNIX tool approach, dedicated subsystem controllers, real-time X-window displays, flexibility in combining tools, network connectivity and system extensibility. The VETA test data archive is also described.
Monitoring tools of COMPASS experiment at CERN
NASA Astrophysics Data System (ADS)
Bodlak, M.; Frolov, V.; Huber, S.; Jary, V.; Konorov, I.; Levit, D.; Novy, J.; Salac, R.; Tomsa, J.; Virius, M.
2015-12-01
This paper briefly introduces the data acquisition system of the COMPASS experiment, focusing on the part responsible for monitoring the nodes of the newly developed data acquisition system. COMPASS is a high-energy particle physics experiment with a fixed target, located at the SPS at the CERN laboratory in Geneva, Switzerland. The hardware of the data acquisition system has been upgraded with FPGA cards that are responsible for data multiplexing and event building. The software counterpart of the system includes several processes deployed in a heterogeneous network environment. Two of these processes, Message Logger and Message Browser, take care of monitoring: they handle messages generated by nodes in the system. While Message Logger collects and saves messages to a database, Message Browser serves as a graphical interface over the database containing these messages. For better performance, certain database optimizations have been applied. Lastly, results of performance tests are presented.
Flow Cytometry Data Preparation Guidelines for Improved Automated Phenotypic Analysis.
Jimenez-Carretero, Daniel; Ligos, José M; Martínez-López, María; Sancho, David; Montoya, María C
2018-05-15
Advances in flow cytometry (FCM) increasingly demand adoption of computational analysis tools to tackle the ever-growing data dimensionality. In this study, we tested different data input modes to evaluate how cytometry acquisition configuration and data compensation procedures affect the performance of unsupervised phenotyping tools. An analysis workflow was set up and tested for the detection of changes in reference bead subsets and in a rare subpopulation of murine lymph node CD103+ dendritic cells acquired by conventional or spectral cytometry. Raw spectral data or pseudospectral data acquired with the full set of available detectors by conventional cytometry consistently outperformed datasets acquired and compensated according to FCM standards. Our results thus challenge the paradigm of one-fluorochrome/one-parameter acquisition in FCM for unsupervised cluster-based analysis. Instead, we propose to configure instrument acquisition to use all available fluorescence detectors and to avoid integration and compensation procedures, thereby using raw spectral or pseudospectral data for improved automated phenotypic analysis. Copyright © 2018 by The American Association of Immunologists, Inc.
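The input-mode comparison described above can be sketched as follows: the same synthetic events are clustered once on raw multi-detector signals and once after standard spillover compensation. The spillover matrix, populations, and noise level are synthetic assumptions for illustration, not the study's data.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
labels = rng.integers(0, 3, 3000)                 # three toy subpopulations
expression = np.eye(3)[labels] * 100.0            # per-fluorochrome signal

spillover = np.array([[1.00, 0.30, 0.10],
                      [0.20, 1.00, 0.25],
                      [0.05, 0.15, 1.00]])
raw = expression @ spillover.T + rng.standard_normal((3000, 3)) * 8.0
compensated = raw @ np.linalg.inv(spillover.T)    # one-fluorochrome/one-parameter mode

for name, data in [("raw detector data", raw), ("compensated data", compensated)]:
    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(data)
    print(name, "cluster sizes:", np.bincount(km.labels_))
```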
Samanipour, Saer; Reid, Malcolm J; Bæk, Kine; Thomas, Kevin V
2018-04-17
Nontarget analysis is considered one of the most comprehensive tools for the identification of unknown compounds in complex samples analyzed via liquid chromatography coupled to high-resolution mass spectrometry (LC-HRMS). Due to the complexity of the data generated via LC-HRMS, the data-dependent acquisition mode, which produces MS2 spectra for a limited number of precursor ions, has been one of the most common approaches used during nontarget screening. Data-independent acquisition, by contrast, produces highly complex spectra that require proper deconvolution and library-search algorithms. We have developed a deconvolution algorithm and a universal library search algorithm (ULSA) for the analysis of complex spectra generated via data-independent acquisition. These algorithms were validated and tested using both semisynthetic and real environmental data. For validation with semisynthetic data, a total of 6000 randomly selected spectra from MassBank were introduced across the total ion chromatograms of 15 sludge extracts at three levels of background complexity. The deconvolution algorithm successfully extracted more than 60% of the added ions in the analytical signal for 95% of the processed spectra (i.e., 3 complexity levels multiplied by 6000 spectra). The ULSA ranked the correct spectra among the top three candidates in more than 95% of cases. We further tested the algorithms with 5 wastewater effluent extracts for 59 artificial unknown analytes (i.e., their presence or absence was confirmed via target analysis). The algorithms produced no false identifications while correctly identifying approximately 70% of the total queries. The implications, capabilities, and limitations of both algorithms are further discussed.
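A minimal sketch of the library-search step is shown below: a deconvoluted spectrum is binned to a fixed m/z grid and library entries are ranked by cosine similarity. This illustrates the ranking idea only; it is not the published ULSA scoring function, and the two library entries are illustrative.

```python
import numpy as np

def binned(peaks, mz_min=50.0, mz_max=500.0, width=0.01):
    """Bin (m/z, intensity) peak lists onto a fixed m/z grid."""
    vec = np.zeros(int((mz_max - mz_min) / width))
    for mz, inten in peaks:
        idx = int((mz - mz_min) / width)
        if 0 <= idx < len(vec):
            vec[idx] += inten
    return vec

def rank_library(query_peaks, library):
    """Rank library spectra by cosine similarity to the query spectrum."""
    q = binned(query_peaks)
    q /= np.linalg.norm(q) or 1.0
    scores = {}
    for name, peaks in library.items():
        v = binned(peaks)
        n = np.linalg.norm(v)
        scores[name] = float(q @ (v / n)) if n else 0.0
    return sorted(scores.items(), key=lambda kv: -kv[1])

library = {"caffeine": [(138.066, 1.0), (110.071, 0.4)],
           "paraxanthine": [(124.050, 1.0), (96.060, 0.3)]}
query = [(138.067, 0.9), (110.072, 0.5)]     # toy deconvoluted spectrum
print(rank_library(query, library)[0][0])    # -> "caffeine"
```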
Laser-induced photo emission detection: data acquisition based on light intensity counting
NASA Astrophysics Data System (ADS)
Yulianto, N.; Yudasari, N.; Putri, K. Y.
2017-04-01
Laser-induced breakdown detection (LIBD) is a quantification technique for colloids. There are two modes of detection in LIBD: optical and acoustic. LIBD is based on the detection of the plasma emission produced by the interaction between a particle and the laser beam. In this research, the change in light intensity during plasma formation was detected by a photodiode sensor. A photo-emission data acquisition system was built to collect these events and transform them into digital counts. The real-time system used a National Instruments DAQ 6009 data acquisition device and LabVIEW software. The system was tested on distilled water and tap water samples. The results showed 99.8% accuracy for the counting technique in comparison to acoustic detection at a sample rate of 10 Hz; the acquisition system can therefore be applied as an alternative to the existing LIBD acquisition system.
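The counting principle can be sketched in a few lines: each laser pulse either produces a photodiode spike above threshold (a breakdown event) or not, and the breakdown probability is the fraction of pulses with an event. Pulse count, noise level, flash amplitude, and threshold below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pulses = 1000                      # at the stated 10 Hz this is 100 s of data

baseline = 0.02 * rng.standard_normal(n_pulses)       # photodiode noise floor
flash = (rng.random(n_pulses) < 0.15) * 1.0           # true breakdown events
signal = baseline + flash                             # per-pulse peak readings

threshold = 0.5                      # set well above the baseline noise
events = signal > threshold
print(f"breakdown probability: {events.mean():.3f}")  # ~0.15, tracks colloid load
```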
Gerona, Roy R; Schwartz, Jackie M; Pan, Janet; Friesen, Matthew M; Lin, Thomas; Woodruff, Tracey J
2018-03-01
The use and advantages of high-resolution mass spectrometry (MS) as a discovery tool for environmental chemical monitoring have been demonstrated for environmental samples but not for biological samples. We developed a method using liquid chromatography-quadrupole time-of-flight MS (LC-QTOF/MS) for discovery of previously unmeasured environmental chemicals in human serum. Using non-targeted data acquisition (full scan MS analysis), we were able to screen for environmental organic acids (EOAs) in 20 serum samples from second-trimester pregnant women. We define EOAs as environmental organic compounds with at least one dissociable proton which are utilized in commerce. EOAs include environmental phenols, phthalate metabolites, perfluorinated compounds, phenolic metabolites of polybrominated diphenyl ethers and polychlorinated biphenyls, and acidic pesticides and/or predicted acidic pesticide metabolites. Our validated method used solid phase extraction, reversed-phase chromatography in a C18 column with gradient elution, electrospray ionization in negative polarity, and automated tandem MS (MS/MS) data acquisition to maximize true positive rates. We identified "suspect EOAs" using Agilent MassHunter Qualitative Analysis software to match chemical formulas generated from each sample run with molecular formulas in our unique database of 693 EOAs assembled from multiple environmental literature sources. We found potential matches for 282 (41%) of the EOAs in our database. Sixty-five of these suspect EOAs were detected in at least 75% of the samples; only 19 of these compounds are currently biomonitored in the National Health and Nutrition Examination Survey. We confirmed two of three suspect EOAs by LC-QTOF/MS using a targeted method developed through LC-MS/MS, reporting the first confirmation of benzophenone-1 and bisphenol S in pregnant women's sera. Our suspect screening workflow provides an approach to comprehensively scan environmental chemical exposures in humans. This can provide a better source of exposure information to help improve exposure and risk evaluation of industrial chemicals.
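A sketch of the suspect-screening match step follows: measured accurate masses are compared against deprotonated monoisotopic masses from a suspect database within a ppm tolerance. The three database entries and the tolerance are illustrative stand-ins; the study's actual database held 693 EOAs.

```python
PROTON = 1.007276          # mass of a proton, Da

suspects = {               # neutral monoisotopic masses (Da), illustrative
    "bisphenol S": 250.0300,
    "benzophenone-1": 214.0630,
    "methylparaben": 152.0473,
}

def match(measured_mz, tol_ppm=10.0):
    """Return suspects whose [M-H]- ion matches the measured m/z within tol_ppm."""
    hits = []
    for name, neutral in suspects.items():
        expected = neutral - PROTON            # negative-mode [M-H]- ion
        ppm = (measured_mz - expected) / expected * 1e6
        if abs(ppm) <= tol_ppm:
            hits.append((name, round(ppm, 1)))
    return hits

print(match(249.0224))     # within tolerance of the bisphenol S [M-H]- ion
```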
Williams-Hatala, Erin Marie; Hatala, Kevin G; Gordon, McKenzie; Key, Alastair; Kasper, Margaret; Kivell, Tracy L
2018-06-01
It is widely agreed that biomechanical stresses imposed by stone tool behaviors influenced the evolution of the human hand. Though archaeological evidence suggests that early hominins participated in a variety of tool behaviors, it is unlikely that all behaviors equally influenced modern human hand anatomy. It is more probable that a behavior's likelihood of exerting a selective pressure was a weighted function of the magnitude of stresses associated with that behavior, the benefits received from it, and the amount of time spent performing it. Based on this premise, we focused on the first part of that equation and evaluated magnitudes of stresses associated with stone tool behaviors thought to have been commonly practiced by early hominins, to determine which placed the greatest loads on the digits. Manual pressure data were gathered from 39 human subjects using a Novel Pliance® manual pressure system while they participated in multiple Plio-Pleistocene tool behaviors: nut-cracking, marrow acquisition with a hammerstone, flake production with a hammerstone, and handaxe and flake use. Manual pressure distributions varied significantly according to behavior, though there was a tendency for regions of the hand subject to the lowest pressures (e.g., proximal phalanges) to be affected less by behavior type. Hammerstone use during marrow acquisition and flake production consistently placed the greatest loads on the digits collectively, on each digit, and on each phalanx. Our results suggest that, based solely on the magnitudes of stresses, hammerstone use during marrow acquisition and flake production are the most likely of the assessed behaviors to have influenced the anatomical and functional evolution of the human hand. Copyright © 2018 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Kim, Minjung; Kim, Soo-Jin; Stoel-Gammon, Carol
2017-01-01
This study investigates the phonological acquisition of Korean consonants using conversational speech samples collected from sixty monolingual typically developing Korean children aged two, three, and four years. Phonemic acquisition was examined for syllable-initial and syllable-final consonants. Results showed that Korean children acquired stops…
Robust Methods for Sensing and Reconstructing Sparse Signals
ERIC Educational Resources Information Center
Carrillo, Rafael E.
2012-01-01
Compressed sensing (CS) is an emerging signal acquisition framework that goes against the traditional Nyquist sampling paradigm. CS demonstrates that a sparse, or compressible, signal can be acquired using a low rate acquisition process. Since noise is always present in practical data acquisition systems, sensing and reconstruction methods are…
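A compact sketch of the CS recovery claim follows, using iterative soft thresholding (ISTA) to recover a sparse signal from far fewer random measurements than its length. The sensing matrix, problem sizes, and regularization weight are illustrative; the dissertation itself concerns robust variants beyond this plain version.

```python
import numpy as np

rng = np.random.default_rng(7)
n, m, k = 256, 80, 5                           # signal length, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = A @ x                                      # low-rate acquisition: m << n

step = 1.0 / np.linalg.norm(A, 2) ** 2         # safe gradient step size
lam = 0.01                                     # l1 regularization weight
z = np.zeros(n)
for _ in range(500):                           # ISTA iterations
    g = z + step * A.T @ (y - A @ z)           # gradient step on 0.5*||y - Az||^2
    z = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)   # soft threshold

print(f"relative recovery error: {np.linalg.norm(z - x) / np.linalg.norm(x):.3f}")
```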
Roy, Shumita; Park, Norman W; Roy, Eric A; Almeida, Quincy J
2015-01-01
Previous research suggests that different aspects of tool knowledge are mediated by different memory systems. It is believed that tool attributes (e.g., function, color) are represented as declarative memory while skill learning is supported by procedural memory. It has been proposed that other aspects (e.g., skilled tool use) may rely on an interaction of both declarative and procedural memory. However, the specific form of procedural memory underlying skilled tool use and the nature of interaction between declarative and procedural memory systems remain unclear. In the current study, individuals with Parkinson's disease (PD) and healthy controls were trained over 2 sessions, 3 weeks apart, to use a set of novel complex tools. They were also tested on their ability to recall tool attributes as well as their ability to demonstrate grasp and use of the tools to command. Results showed that, compared to controls, participants with PD showed intact motor skill acquisition and tool use to command within sessions, but failed to retain performance across sessions. In contrast, people with PD showed equivalent recall of tool attributes and tool grasping relative to controls, both within and across sessions. Current findings demonstrate that the frontal-striatal network, compromised in PD, mediates long-term retention of motor skills. Intact initial skill learning raises the possibility of compensation from declarative memory for frontal-striatal dysfunction. Lastly, skilled tool use appears to rely on both memory systems which may reflect a cooperative interaction between the two systems. Current findings regarding memory representations of tool knowledge and skill learning may have important implications for delivery of rehabilitation programs for individuals with PD. Copyright © 2014 Elsevier Ltd. All rights reserved.
Hand portable thin-layer chromatography system
Haas, Jeffrey S.; Kelly, Fredrick R.; Bushman, John F.; Wiefel, Michael H.; Jensen, Wayne A.
2000-01-01
A hand portable, field-deployable thin-layer chromatography (TLC) unit and a hand portable, battery-operated unit for development, illumination, and data acquisition of the TLC plates contain many miniaturized features that permit a large number of samples to be processed efficiently. The TLC unit includes a solvent tank, a holder for TLC plates, and a variety of tool chambers for storing TLC plates, solvent, and pipettes. After processing in the TLC unit, a TLC plate is positioned in a collapsible illumination box, where the box and a CCD camera are optically aligned for optimal pixel resolution of the CCD images of the TLC plate. The TLC system includes an improved development chamber for chemical development of TLC plates that prevents solvent overflow.
Illumination box and camera system
Haas, Jeffrey S.; Kelly, Fredrick R.; Bushman, John F.; Wiefel, Michael H.; Jensen, Wayne A.; Klunder, Gregory L.
2002-01-01
A hand portable, field-deployable thin-layer chromatography (TLC) unit and a hand portable, battery-operated unit for development, illumination, and data acquisition of the TLC plates contain many miniaturized features that permit a large number of samples to be processed efficiently. The TLC unit includes a solvent tank, a holder for TLC plates, and a variety of tool chambers for storing TLC plates, solvent, and pipettes. After processing in the TLC unit, a TLC plate is positioned in a collapsible illumination box, where the box and a CCD camera are optically aligned for optimal pixel resolution of the CCD images of the TLC plate. The TLC system includes an improved development chamber for chemical development of TLC plates that prevents solvent overflow.
Simultaneous fast scanning XRF, dark field, phase-, and absorption contrast tomography
NASA Astrophysics Data System (ADS)
Medjoubi, Kadda; Bonissent, Alain; Leclercq, Nicolas; Langlois, Florent; Mercère, Pascal; Somogyi, Andrea
2013-09-01
Scanning hard X-ray nanoprobe imaging provides a unique tool for probing specimens with high sensitivity and large penetration depth. Moreover, the combination of complementary techniques such as X-ray fluorescence, absorption, phase contrast, and dark-field imaging gives complete quantitative information on a sample's structure, composition, and chemistry. The multi-technique "FLYSCAN" data acquisition scheme developed at Synchrotron SOLEIL permits fast continuous scanning imaging and thus makes scanning tomography techniques feasible within a time frame well adapted to typical user experiments. Here we present recent results of simultaneous fast scanning multi-technique tomography performed at SOLEIL. This fast scanning scheme will be implemented at the Nanoscopium beamline for large-field-of-view 2D and 3D multimodal imaging.
The Acquisition and Transfer of Botanical Classification by Elementary Science Methods Students.
ERIC Educational Resources Information Center
Knapp, Clifford Edward
Investigated were two questions related to the acquisition and transfer of botanical classification skill by elementary science methods students. Data were collected from a sample of 89 students enrolled in methods courses. Sixty-two students served as the experimental sample, and 27 served as the control for the transfer portion of the research.…
NASA Technical Reports Server (NTRS)
Jandura, Louise
2010-01-01
The Sample Acquisition/Sample Processing and Handling subsystem for the Mars Science Laboratory is a highly-mechanized, Rover-based sampling system that acquires powdered rock and regolith samples from the Martian surface, sorts the samples into fine particles through sieving, and delivers small portions of the powder into two science instruments inside the Rover. SA/SPaH utilizes 17 actuated degrees-of-freedom to perform the functions needed to produce 5 sample pathways in support of the scientific investigation on Mars. Both hardware redundancy and functional redundancy are employed in configuring this sampling system so some functionality is retained even with the loss of a degree-of-freedom. Intentional dynamic environments are created to move sample while vibration isolators attenuate this environment at the sensitive instruments located near the dynamic sources. In addition to the typical flight hardware qualification test program, two additional types of testing are essential for this kind of sampling system: characterization of the intentionally-created dynamic environment and testing of the sample acquisition and processing hardware functions using Mars analog materials in a low pressure environment. The overall subsystem design and configuration are discussed along with some of the challenges, tradeoffs, and lessons learned in the areas of fault tolerance, intentional dynamic environments, and special testing.
Migrants and Mobile Technology Use: Gaps in the Support Provided by Current Tools
ERIC Educational Resources Information Center
Epp, Carrie Demmans
2017-01-01
Our current understanding of how migrants use mobile tools to support their communication and language learning is inadequate. This study, therefore, explores the learner-initiated use of technologies to support their comprehension, production, and acquisition of English following migration to Canada. Information about migrant use of technologies…
Simulation Software's Effect on College Students Spreadsheet Project Scores
ERIC Educational Resources Information Center
Atkinson, J. Kirk; Thrasher, Evelyn H.; Coleman, Phillip D.
2011-01-01
The purpose of this study is to explore the potential impact of support materials on student spreadsheet skill acquisition. Specifically, this study examines the use of an online spreadsheet simulation tool versus a printed book across two independent student groups. This study hypothesizes that the online spreadsheet simulation tool will have a…
Furukawa, Makoto; Takagai, Yoshitaka
2016-10-04
Online solid-phase extraction (SPE) coupled with inductively coupled plasma mass spectrometry (ICPMS) is a useful tool for automatic sequential analysis. However, it cannot simultaneously quantify the analytical targets and their recovery percentages (R%) in one-shot samples. We propose a system that acquires both data in a single sample injection. The main flow line of the online solid-phase extraction is divided into main and split flows; the split flow line (i.e., a bypass line), which circumvents the SPE column, branches from the main flow line. Under program-controlled switching of the automatic valve, the ICPMS sequentially measures the targets in a sample before and after column preconcentration and determines the target concentrations and the R% on the SPE column. This paper describes the system development and two demonstrations of its analytical significance: determination of ultratrace amounts of radioactive strontium (90Sr) using a commercial Sr-trap resin, and assessment of multielement adsorbability on the SPE column. The system is applicable to other flow analyses and detectors in online solid-phase extraction.
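The split-flow arithmetic reduces to one ratio: the bypass measurement gives the target signal before the column, the main line gives it after preconcentration, and recovery follows directly, as in the sketch below. The signal values and preconcentration factor are illustrative.

```python
def recovery_percent(signal_after, signal_before, preconc_factor):
    """R% from one injection: after-column signal corrected for enrichment."""
    return 100.0 * signal_after / (signal_before * preconc_factor)

# e.g. 50-fold preconcentration, bypass reads 1.2e3 cps, column eluate 5.7e4 cps
print(f"R% = {recovery_percent(5.7e4, 1.2e3, 50):.1f}")   # -> R% = 95.0
```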
Accelerated Optical Projection Tomography Applied to In Vivo Imaging of Zebrafish
Correia, Teresa; Yin, Jun; Ramel, Marie-Christine; Andrews, Natalie; Katan, Matilda; Bugeon, Laurence; Dallman, Margaret J.; McGinty, James; Frankel, Paul; French, Paul M. W.; Arridge, Simon
2015-01-01
Optical projection tomography (OPT) provides a non-invasive 3-D imaging modality that can be applied to longitudinal studies of live disease models, including in zebrafish. Current limitations include the requirement of a minimum number of angular projections for reconstruction of reasonable OPT images using filtered back projection (FBP), which is typically several hundred, leading to acquisition times of several minutes. It is highly desirable to decrease the number of required angular projections to decrease both the total acquisition time and the light dose to the sample. This is particularly important to enable longitudinal studies, which involve measurements of the same fish at different time points. In this work, we demonstrate that the use of an iterative algorithm to reconstruct sparsely sampled OPT data sets can provide useful 3-D images with 50 or fewer projections, thereby significantly decreasing the minimum acquisition time and light dose while maintaining image quality. A transgenic zebrafish embryo with fluorescent labelling of the vasculature was imaged to acquire densely sampled (800 projections) and under-sampled data sets of transmitted and fluorescence projection images. The under-sampled OPT data sets were reconstructed using an iterative total variation-based image reconstruction algorithm and compared against FBP reconstructions of the densely sampled data sets. To illustrate the potential for quantitative analysis following rapid OPT data acquisition, a Hessian-based method was applied to automatically segment the reconstructed images to select the vasculature network. Results showed that 3-D images of the zebrafish embryo and its vasculature of sufficient visual quality for quantitative analysis can be reconstructed using the iterative algorithm from only 32 projections—achieving up to 28 times improvement in imaging speed and leading to total acquisition times of a few seconds. PMID:26308086
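The benefit of iterative reconstruction at 32 projections can be sketched with scikit-image, using SART as a stand-in for the paper's total-variation algorithm; the phantom, angle count, and iteration count below are illustrative.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, iradon_sart, rescale

# Sparse-view reconstruction: with only 32 projections, filtered back
# projection shows streak artifacts while an iterative method degrades
# more gracefully.
image = rescale(shepp_logan_phantom(), 0.5)          # smaller grid, faster demo
theta = np.linspace(0.0, 180.0, 32, endpoint=False)  # 32 sparse angles
sinogram = radon(image, theta=theta)

fbp = iradon(sinogram, theta=theta)
sart = iradon_sart(sinogram, theta=theta)
sart = iradon_sart(sinogram, theta=theta, image=sart)  # second SART pass

for name, rec in [("FBP", fbp), ("SART", sart)]:
    err = np.sqrt(np.mean((rec - image) ** 2))
    print(f"{name}: RMSE = {err:.4f}")
```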
ERIC Educational Resources Information Center
Musawi, Ali Al; Ambusaidi, Abdullah; Al-Balushi, Sulaiman; Al-Sinani, Mohamed; Al-Balushi, Kholoud
2017-01-01
This paper aims to measure the effectiveness of the 3D laboratory (3DL) on Omani students' acquisition of practical abilities and skills. It examines the effectiveness of the 3DL in science education and scientific-thinking acquisition as part of a national project funded by The Research Council. Four research tools in a Pre-Post Test Control Group Design,…
2012-02-03
…node to the analysis of eigenmodes (connected trees/networks) of disruption sequences. The identification of disruption eigenmodes is particularly… investment portfolio approach enables the identification of optimal SoS network topologies and provides a tool for acquisition professionals to… a program based on its ability to provide a new capability for a given cost, and not on its ability to meet specific performance requirements (Spacy…
2011-07-01
TECHNOLOGIES INTO DEFENSE ACQUISITION UNIVERSITY LEARNING ASSETS. Nada Dabbagh, Kevin Clark, Susan Dass, Salim Al Waaili, Sally Byrd, Susan… demographic data, four Likert-scale questions that targeted respondents' familiarity with ALT, and one Likert-scale question addressing the… use of technology in learning with underserved populations. (E-mail address: kclark6@gmu.edu) Ms. Susan Dass has over 20 years' experience in…
ERIC Educational Resources Information Center
Hamilton, Robert
2014-01-01
In this study, the prototype of a new type of bilingual picture book was field-tested with two sets of mother-son subject pairs. This picture book was designed as a possible tool for providing children with comprehensible input during their critical period for second language acquisition. Context is provided by visual cues and both Japanese and…
ERIC Educational Resources Information Center
Tynan, Mark; McCarney, Eoin
2014-01-01
University College Dublin became the first library in the Republic of Ireland to trial patron-driven acquisition (PDA) as a collection development tool in 2013. A total of 42% of UCD Library's book budget was allocated to the project, which included both electronic and print books. This article describes the twelve month project from the tender…
Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R
2009-01-01
We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. The software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license). g-PRIME supports many data acquisition interfaces, from the PC sound card to expensive, high-throughput, calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold-and-window based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rates and intervals. The software can also correlate detected events with another simultaneously acquired source (event-triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant, publication-quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
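The threshold-and-window detection plus event-triggered averaging described above can be sketched as follows; the sampling rate, spike shape, threshold, and window lengths are illustrative, not g-PRIME's defaults.

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 10_000                                   # samples per second
x = 0.1 * rng.standard_normal(fs * 5)         # 5 s of noisy trace
spike_times = np.arange(2_000, len(x) - 2_000, 7_000)
for t0 in spike_times:                        # inject stereotyped spikes
    x[t0:t0 + 20] += np.hanning(20)

def detect(trace, thresh=0.5, refractory=50):
    """Indices where the trace first crosses threshold, with a dead time."""
    above = np.flatnonzero(trace > thresh)
    events, last = [], -refractory
    for i in above:
        if i - last >= refractory:
            events.append(i)
            last = i
    return np.array(events)

def triggered_average(trace, events, pre=50, post=100):
    """Mean waveform of this trace (or another channel) around each event."""
    snips = [trace[i - pre:i + post] for i in events
             if i - pre >= 0 and i + post <= len(trace)]
    return np.mean(snips, axis=0)

ev = detect(x)
print(len(ev), "events; ETA peak =", round(float(triggered_average(x, ev).max()), 2))
```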
Optronic System Imaging Simulator (OSIS): imager simulation tool of the ECOMOS project
NASA Astrophysics Data System (ADS)
Wegner, D.; Repasi, E.
2018-04-01
ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the Visible or thermal Infrared (IR). The project involves close co-operation of defense and security industry and public research institutes from France, Germany, Italy, The Netherlands and Sweden. ECOMOS uses two approaches to calculate Target Acquisition (TA) ranges, the analytical TRM4 model and the image-based Triangle Orientation Discrimination model (TOD). In this paper the IR imager simulation tool, Optronic System Imaging Simulator (OSIS), is presented. It produces virtual camera imagery required by the TOD approach. Pristine imagery is degraded by various effects caused by atmospheric attenuation, optics, detector footprint, sampling, fixed pattern noise, temporal noise and digital signal processing. Resulting images might be presented to observers or could be further processed for automatic image quality calculations. For convenience OSIS incorporates camera descriptions and intermediate results provided by TRM4. For input OSIS uses pristine imagery tied with meta information about scene content, its physical dimensions, and gray level interpretation. These images represent planar targets placed at specified distances to the imager. Furthermore, OSIS is extended by a plugin functionality that enables integration of advanced digital signal processing techniques in ECOMOS such as compression, local contrast enhancement, digital turbulence mitigation, to name but a few. By means of this image-based approach image degradations and image enhancements can be investigated, which goes beyond the scope of the analytical TRM4 model.
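The degradation chain OSIS applies to pristine imagery can be caricatured in a few lines: an MTF-style blur, detector-footprint averaging and sampling, then temporal and fixed-pattern noise. The Gaussian MTF, pixel pitch, and noise levels below are illustrative stand-ins for the calibrated models the tool uses.

```python
import numpy as np

rng = np.random.default_rng(6)

def degrade(pristine, blur_sigma=1.5, pitch=4, noise=0.01, fpn=0.005):
    # Frequency-domain Gaussian blur standing in for the optics MTF.
    f = np.fft.fft2(pristine)
    fy, fx = np.meshgrid(np.fft.fftfreq(pristine.shape[0]),
                         np.fft.fftfreq(pristine.shape[1]), indexing="ij")
    mtf = np.exp(-2 * (np.pi * blur_sigma) ** 2 * (fx**2 + fy**2))
    blurred = np.real(np.fft.ifft2(f * mtf))
    # Detector footprint: average pitch x pitch blocks, i.e. sample the image.
    h, w = (s // pitch * pitch for s in blurred.shape)
    det = blurred[:h, :w].reshape(h // pitch, pitch, w // pitch, pitch).mean((1, 3))
    # Temporal noise per frame plus a static column fixed-pattern component.
    det += noise * rng.standard_normal(det.shape)
    det += fpn * rng.standard_normal(det.shape[1])
    return det

scene = rng.random((256, 256))
print(degrade(scene).shape)    # -> (64, 64)
```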
Investigating Astromaterials Curation Applications for Dexterous Robotic Arms
NASA Technical Reports Server (NTRS)
Snead, C. J.; Jang, J. H.; Cowden, T. R.; McCubbin, F. M.
2018-01-01
The Astromaterials Acquisition and Curation office at NASA Johnson Space Center is currently investigating tools and methods that will enable the curation of future astromaterials collections. Size and temperature constraints for astromaterials to be collected by current and proposed future missions will require the development of new robotic sample- and tool-handling capabilities. NASA Curation has investigated the application of robot arms in the past, and robotic 3-axis micromanipulators are currently in use for small-particle curation in the Stardust and Cosmic Dust laboratories. While 3-axis micromanipulators have been extremely successful for activities involving the transfer of isolated particles in the 5-20 micron range (e.g., from a microscope slide to an epoxy bullet tip or a beryllium SEM disk), their limited ranges of motion and lack of yaw, pitch, and roll degrees of freedom restrict their utility in other applications. For instance, curators removing particles from cosmic dust collectors by hand often employ scooping and rotating motions to free trapped particles from the silicone oil coatings. Similar scooping and rotating motions are employed when isolating a specific particle of interest from an aliquot of crushed meteorite. While cosmic dust curators have been remarkably successful with these kinds of particle manipulations using handheld tools, operator fatigue limits the number of particles that can be removed during a given extraction session. The challenges of small-particle curation will be exacerbated by mission requirements that samples be processed in N2 sample cabinets (i.e., gloveboxes). We have been investigating the use of compact robot arms to facilitate sample handling within gloveboxes. Six-axis robot arms potentially have applications beyond small-particle manipulation. For instance, future sample return missions may involve biologically sensitive astromaterials that can be easily compromised by physical interaction with a curator; other potential future returned samples may require cryogenic curation. Robot arms may be combined with high-resolution cameras within a sample cabinet and controlled remotely by a curator. Sophisticated robot arm and hand combination systems can be programmed to mimic the movements of a curator wearing a data glove; successful implementation of such a system may ultimately allow a curator to virtually operate in a nitrogen, cryogenic, or biologically sensitive environment with dexterity comparable to that of a curator physically handling samples in a glove box.
Ning, Jia; Sun, Yongliang; Xie, Sheng; Zhang, Bida; Huang, Feng; Koken, Peter; Smink, Jouke; Yuan, Chun; Chen, Huijun
2018-05-01
To propose a simultaneous acquisition for improved hepatic pharmacokinetics quantification accuracy (SAHA) method for liver dynamic contrast-enhanced MRI. The proposed SAHA simultaneously acquires high temporal-resolution 2D images for vascular input function extraction using Cartesian sampling and large-coverage, high spatial-resolution 3D liver dynamic contrast-enhanced images using golden-angle stack-of-stars acquisition, in an interleaved manner. Simulations were conducted to investigate the accuracy of SAHA in pharmacokinetic analysis. A healthy volunteer and three patients with cirrhosis or hepatocellular carcinoma (HCC) were included in the study to investigate the feasibility of SAHA in vivo. Simulation studies showed that SAHA provides results closer to the true values and lower root mean square errors of estimated pharmacokinetic parameters in all of the tested scenarios. The in vivo scans provided fair image quality for both the 2D images for arterial and portal venous input functions and the 3D whole-liver images. The in vivo fitting results showed that the perfusion parameters of healthy liver were significantly different from those of cirrhotic liver and HCC. The proposed SAHA provides improved accuracy in pharmacokinetic modeling and is feasible in human liver dynamic contrast-enhanced MRI, suggesting that SAHA is a potential tool for liver dynamic contrast-enhanced MRI. Magn Reson Med 79:2629-2641, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
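The golden-angle stack-of-stars ordering mentioned here is a standard radial scheme: an azimuthal increment of about 111.246° keeps k-space coverage near-uniform for any contiguous run of spokes, which is what allows the interleaved 2D Cartesian input-function frames to be inserted without leaving gaps. A small illustrative sketch (not the authors' implementation):

```python
import numpy as np

GOLDEN_ANGLE = np.deg2rad(111.246)  # 180 deg * (sqrt(5) - 1) / 2

def golden_angle_spokes(n_spokes: int) -> np.ndarray:
    """Azimuthal angle of each radial spoke under golden-angle ordering.
    Any contiguous subset of spokes covers k-space near-uniformly, so
    frames can be reconstructed from arbitrary temporal windows."""
    return np.mod(np.arange(n_spokes) * GOLDEN_ANGLE, np.pi)

print(np.rad2deg(golden_angle_spokes(5)))  # ~0.0, 111.2, 42.5, 153.7, 85.0
```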
Knowledge acquisition for temporal abstraction.
Stein, A; Musen, M A; Shahar, Y
1996-01-01
Temporal abstraction is the task of detecting relevant patterns in data over time. The knowledge-based temporal-abstraction method uses knowledge about a clinical domain's contexts, external events, and parameters to create meaningful interval-based abstractions from raw time-stamped clinical data. In this paper, we describe the acquisition and maintenance of domain-specific temporal-abstraction knowledge. Using the PROTEGE-II framework, we have designed a graphical tool for acquiring temporal knowledge directly from expert physicians, maintaining the knowledge in a sharable form, and converting the knowledge into a suitable format for use by an appropriate problem-solving method. In initial tests, the tool offered significant gains in our ability to rapidly acquire temporal knowledge and to use that knowledge to perform automated temporal reasoning.
Ontology-Based Information Extraction for Business Intelligence
NASA Astrophysics Data System (ADS)
Saggion, Horacio; Funk, Adam; Maynard, Diana; Bontcheva, Kalina
Business Intelligence (BI) requires the acquisition and aggregation of key pieces of knowledge from multiple sources in order to provide valuable information to customers or feed statistical BI models and tools. The massive amount of information available to business analysts makes information extraction and other natural language processing tools key enablers for the acquisition and use of that semantic information. We describe the application of ontology-based extraction and merging in the context of a practical e-business application for the EU MUSING Project where the goal is to gather international company intelligence and country/region information. The results of our experiments so far are very promising and we are now in the process of building a complete end-to-end solution.
Pseudotargeted MS Method for the Sensitive Analysis of Protein Phosphorylation in Protein Complexes.
Lyu, Jiawen; Wang, Yan; Mao, Jiawei; Yao, Yating; Wang, Shujuan; Zheng, Yong; Ye, Mingliang
2018-05-15
In this study, we present an enrichment-free approach for the sensitive analysis of protein phosphorylation in minute amounts of sample, such as purified protein complexes. The method takes advantage of the high sensitivity of parallel reaction monitoring (PRM). Specifically, low-confidence phosphopeptides identified in the data-dependent acquisition (DDA) data set were used to build a pseudotargeted list for PRM analysis, allowing additional phosphopeptides to be identified with high confidence. The development of this targeted approach is straightforward, as the same sample and the same LC system are used for the discovery and targeted analysis phases. No sample fractionation or enrichment is required for the discovery phase, which allows this method to analyze minute amounts of sample. We applied this pseudotargeted MS method to quantitatively examine phosphopeptides in affinity-purified endogenous Shc1 protein complexes at four temporal stages of EGF signaling and identified 82 phospho-sites. To our knowledge, this is the highest number of phospho-sites identified from these protein complexes. This pseudotargeted MS method is highly sensitive in the identification of low-abundance phosphopeptides and could be a powerful tool to study the phosphorylation-regulated assembly of protein complexes.
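The workflow lends itself to a simple script: take the low-confidence phosphopeptide identifications from the DDA search and emit them as a PRM inclusion list. A sketch assuming hypothetical column names (sequence, mz, charge, rt, qvalue) and illustrative confidence thresholds, not the authors' actual cutoffs:

```python
import pandas as pd

def build_prm_target_list(dda_ids: pd.DataFrame, low=0.01, high=0.05, rt_window=2.0):
    """Collect phosphopeptides identified with *low* confidence in DDA
    (e.g. q-value between 1% and 5%) into a PRM inclusion list so they
    can be re-interrogated with high confidence in a second run."""
    low_conf = dda_ids[(dda_ids["qvalue"] > low) & (dda_ids["qvalue"] <= high)]
    targets = low_conf[["sequence", "mz", "charge", "rt"]].drop_duplicates()
    targets["rt_start"] = targets["rt"] - rt_window / 2   # scheduling window (min)
    targets["rt_end"] = targets["rt"] + rt_window / 2
    return targets.sort_values("rt")
```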
Bielmann, V; Gillan, J; Perkins, N R; Skidmore, A L; Godden, S; Leslie, K E
2010-08-01
Acquisition of high quality colostrum is an important factor influencing neonatal calf health. Many methods have been used to assess the Ig concentration of colostrum; however, improved, validated evaluation tools are needed. The aims of this study were to evaluate both optical and digital Brix refractometer instruments for the measurement of the Ig concentration of colostrum, as compared with the gold standard radial immunodiffusion assay, and to determine the correlation between Ig measurements taken from fresh and frozen colostrum samples for both Brix refractometer instruments. This research was completed using 288 colostrum samples from 3 different farms. It was concluded that the optical and digital Brix refractometers were highly correlated for both fresh and frozen samples (r=0.98 and r=0.97, respectively). Correlations between each refractometer instrument (for fresh and frozen samples) and the gold standard radial immunodiffusion assay were very similar, with correlation coefficients between 0.71 and 0.74. Both instruments exhibited excellent test characteristics, indicating an appropriate cut-off point of a 22% Brix score for the identification of good quality colostrum. Copyright (c) 2010 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
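The reported statistics are straightforward to reproduce for a new instrument. A sketch assuming paired arrays of Brix readings and radial immunodiffusion IgG values; the 50 g/L IgG quality threshold is the usual convention paired with a 22% Brix cut-off, but it is an assumption here, not stated in this abstract:

```python
import numpy as np
from scipy.stats import pearsonr

def evaluate_brix_cutoff(brix, rid_igg, brix_cut=22.0, igg_cut=50.0):
    """Correlate refractometer readings with the RID gold standard and compute
    sensitivity/specificity of the Brix cut-off for flagging good colostrum."""
    brix, rid_igg = np.asarray(brix), np.asarray(rid_igg)
    r, _ = pearsonr(brix, rid_igg)
    good, called_good = rid_igg >= igg_cut, brix >= brix_cut
    sensitivity = (called_good & good).sum() / good.sum()
    specificity = (~called_good & ~good).sum() / (~good).sum()
    return r, sensitivity, specificity
```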
Shariff, U; Kullar, N; Haray, P N; Dorudi, S; Balasubramanian, S P
2015-05-01
Conventional teaching in surgical training programmes is constrained by time and cost, and has room for improvement. This study aimed to determine the effectiveness of a multimedia educational tool developed for an index colorectal surgical procedure (anterior resection) in the teaching and assessment of cognitive skills, and to evaluate its acceptability amongst general surgical trainees. Multimedia educational tools for open and laparoscopic anterior resection were developed by filming multiple operations, which were edited into procedural steps and substeps and then integrated onto interactive navigational platforms using Adobe® Flash® Professional CS5 10.1. A randomized controlled trial was conducted on general surgical trainees to evaluate the effectiveness of online multimedia in comparison with conventional 'study day' teaching for the acquisition of cognitive skills. All trainees were assessed before and after the study period. Trainees in the multimedia group evaluated the tools by completing a survey. Fifty-nine trainees were randomized, but 27% dropped out, leaving 43 trainees (multimedia group, n = 25; study day group, n = 18) available for analysis. Post-test scores improved significantly in both groups (P < 0.01). The change in scores (mean ± SD) in the multimedia group was not significantly different from that in the study day group (6.02 ± 5.12 and 5.31 ± 3.42, respectively; P = 0.61). Twenty-five trainees completed the evaluation survey: 67% reported an improvement in their decision making and 88% in factual and anatomical knowledge; 96% agreed that the multimedia tool was a useful additional educational resource. Multimedia tools are effective for the acquisition of cognitive skills in colorectal surgery and are well accepted as an educational resource. Colorectal Disease © 2014 The Association of Coloproctology of Great Britain and Ireland.
NASA Astrophysics Data System (ADS)
Cryar, Adam; Groves, Kate; Quaglia, Milena
2017-06-01
Hydrogen-deuterium exchange mass spectrometry (HDX-MS) is an important tool for measuring and monitoring protein structure. A bottom-up approach to HDX-MS provides peptide-level deuterium uptake values and a more refined localization of deuterium incorporation compared with global HDX-MS measurements. The degree of localization provided by HDX-MS is proportional to the number of peptides that can be identified and monitored across an exchange experiment. Ion mobility spectrometry (IMS) has been shown to improve MS-based peptide analysis of biological samples through increased separation capacity. The integration of IMS within HDX-MS workflows has been commercialized, but its adoption has not yet been widespread. The potential benefits of IMS, therefore, have not yet been fully explored. We herein describe a comprehensive evaluation of traveling wave ion mobility integrated within an online HDX-MS system and present the first reported example of UDMSE acquisition for HDX analysis. Instrument settings required for optimal peptide identifications are described, and the effects of detector saturation due to peak compression are discussed. A model system is utilized to confirm the comparability of HDX-IM-MS and HDX-MS uptake values prior to an evaluation of the benefits of IMS at increasing sample complexity. Interestingly, MS and IM-MS acquisitions were found to identify distinct populations of peptides that were unique to the respective methods, a property that can be utilized to increase the spatial resolution of HDX-MS experiments by >60%.
Abushareeda, Wadha; Lyris, Emmanouil; Kraiem, Suhail; Wahaibi, Aisha Al; Alyazidi, Sameera; Dbes, Najib; Lommen, Arjen; Nielen, Michel; Horvatovich, Peter L; Alsayrafi, Mohammed; Georgakopoulos, Costas
2017-09-15
This paper presents the development and validation of a high-resolution full scan (FS) electron impact ionization (EI) gas chromatography coupled to quadrupole time-of-flight mass spectrometry (GC/QTOF) platform for screening anabolic androgenic steroids (AAS) in human urine samples. The World Anti-Doping Agency (WADA) lists AAS as prohibited doping agents in sports, and our method has been developed to comply with WADA's qualitative specifications for the detection of substances prohibited in sports anti-doping, mainly AAS. The method also comprises the quantitative analysis of the endogenous steroidal parameters of WADA's Athlete Biological Passport (ABP). The preparation of urine samples includes enzymatic hydrolysis for cleavage of the Phase II glucuronide conjugates, generic liquid-liquid extraction, and trimethylsilyl (TMS) derivatization steps. Tandem mass spectrometry (MS/MS) acquisition was applied to a few selected ions to enhance the specificity and sensitivity of the GC/QTOF signal for a few compounds. The full scan high-resolution acquisition of the analytical signal for known and unknown TMS derivatives of AAS provides the anti-doping system with a new analytical tool for the detection of designer drugs and novel metabolites, which prolongs AAS detection after reprocessing of electronic data files. The current method is complementary to the liquid chromatography coupled to mass spectrometry (LC/MS) methodology widely used to detect prohibited molecules in sport that cannot be efficiently ionized with an atmospheric pressure ionization interface. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Fitzgerald, Ryan; Karanassios, Vassili
2017-05-01
There are many applications requiring chemical analysis in the field and analytical results in (near) real-time, for example when accidental spills occur. In other cases, collecting samples in the field followed by analysis in a lab increases costs and introduces time delays. In such cases, "bringing part of the lab to the sample" would be ideal. Toward this ideal (and to further reduce size and weight), we developed a relatively inexpensive, battery-operated, wireless data acquisition hardware system around an Arduino Nano micro-controller and a 16-bit ADC (Analog-to-Digital Converter) with a maximum sampling rate of 860 samples/s. The hardware communicates the acquired data using low-power Bluetooth. Software for data acquisition and data display was written in Python. Potential ways of making the hardware-software approach described here a part of the Internet-of-Things (IoT) are presented.
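As a concrete illustration of the host side, a minimal pyserial reader for such a link; the port name, baud rate, ADC scaling, and one-reading-per-line framing are all assumptions, since the abstract does not specify the protocol:

```python
import serial  # pyserial; the Bluetooth link typically enumerates as a serial port

PORT, BAUD = "/dev/rfcomm0", 115200     # hypothetical; e.g. "COM5" on Windows
VREF, FULL_SCALE = 4.096, 32767         # illustrative 16-bit ADC scaling

with serial.Serial(PORT, BAUD, timeout=1) as link:
    for _ in range(860):                # roughly one second of data at the max rate
        line = link.readline().decode("ascii", errors="ignore").strip()
        if line:
            volts = int(line) * VREF / FULL_SCALE  # assumed framing: one count per line
            print(f"{volts:.5f} V")
```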
Acquisition of German Pluralization Rules in Monolingual and Multilingual Children
ERIC Educational Resources Information Center
Zaretsky, Eugen; Lange, Benjamin P.; Euler, Harald A.; Neumann, Katrin
2013-01-01
Existing studies on plural acquisition in German have relied on small samples and thus hardly deliver generalizable and differentiated results. Here, overgeneralizations of certain plural allomorphs and other tendencies in the acquisition of German plural markers are described on the basis of test data from 7,394 3- to 5-year-old monolingual…
Williams, Brad J; Ciavarini, Steve J; Devlin, Curt; Cohn, Steven M; Xie, Rong; Vissers, Johannes P C; Martin, LeRoy B; Caswell, Allen; Langridge, James I; Geromanos, Scott J
2016-08-01
In proteomics studies, it is generally accepted that depth of coverage and dynamic range are limited in data-directed acquisitions. The serial nature of the method limits both sensitivity and the number of precursor ions that can be sampled. To that end, a number of data-independent acquisition (DIA) strategies have been introduced; these methods are, for the most part, immune to the sampling issue, though some have other limitations with respect to sensitivity. The major limitation of DIA approaches is interference, i.e., MS/MS spectra are highly chimeric and often incapable of being identified using conventional database search engines. Utilizing each available dimension of separation prior to ion detection, we present a new multi-mode acquisition (MMA) strategy multiplexing both narrowband and wideband DIA acquisitions in a single analytical workflow. The iterative nature of the MMA workflow limits the adverse effects of interference with minimal loss in sensitivity. Qualitative identification can be performed by selected ion chromatograms or conventional database search strategies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
The Macro Dynamics of Weapon System Acquisition: Shaping Early Decisions to Get Better Outcomes
2012-05-17
Briefing fragments (recovered slide content):
• defects and rework
• design tools and processes
• lack of feedback to key design and SE processes
• lack of quantified risk and uncertainty at key…
• tools for rapid exploration of the physical design space, coupling operability, interoperability, and physical feasibility analyses – a game changer…
• interoperability • training
• quantified margins and uncertainties at each critical decision point
• M&S, RDT&E: a continuum of tools underpinned with…
Nurses' and midwives' acquisition of competency in spiritual care: a focus on education.
Attard, Josephine; Baldacchino, Donia R; Camilleri, Liberato
2014-12-01
The debate that spirituality is 'caught' in practice rather than 'taught' implies that spiritual awareness comes about through clinical experience and exposure, requiring no formal education and integration within the curricula. This is challenged as it seems that providing students with a 'taught' component equips students with tools to identify and strengthen resources in 'catching' the concept. This study forms part of a modified Delphi study, which aims to identify the predictive effect of pre- and post-registration 'taught' study units in spiritual care competency of qualified nurses/midwives. A purposive sample of 111 nurses and 101 midwives were eligible to participate in the study. Quantitative data were collected by the Spiritual Care Competency Scale (SCCS) (Van Leeuwen et al., 2008) [response rate: nurses (89%; n=99) and midwives (74%; n=75)]. Overall nurses/midwives who had undertaken the study units on spiritual care scored higher in the competency of spiritual care. Although insignificant, nurses scored higher in the overall competency in spiritual care than the midwives. 'Taught' study units on spiritual care at pre- or post-registration nursing/midwifery education may contribute towards the acquisition of competency in spiritual care. Copyright © 2014 Elsevier Ltd. All rights reserved.
The Trigger and Data Acquisition System for the 8 tower subsystem of the KM3NeT detector
NASA Astrophysics Data System (ADS)
Manzali, M.; Chiarusi, T.; Favaro, M.; Giacomini, F.; Margiotta, A.; Pellegrino, C.
2016-07-01
KM3NeT is a deep-sea research infrastructure being constructed in the Mediterranean Sea. It will host a large Cherenkov neutrino telescope that will collect photons emitted along the path of the charged particles produced in neutrino interactions in the vicinity of the detector. The philosophy of the DAQ system of the detector foresees that all data are sent to shore after a proper sampling of the photomultiplier signals. No off-shore hardware trigger is implemented, and a software selection of the data is performed with an on-line Trigger and Data Acquisition System (TriDAS) to reduce the large throughput due to the environmental light background. A first version of the TriDAS was developed to operate a prototype detection unit deployed in March 2013 in the abyssal site of Capo Passero (Sicily, Italy), about 3500 m deep. A revised and improved version has been developed to meet the requirements of the final detector, using new tools and modern design solutions. First installation and scalability tests have been performed at the Bologna Common Infrastructure, and results comparable to expectations have been observed.
NASA Technical Reports Server (NTRS)
Aaron, Kim
1991-01-01
The Sample Acquisition, Analysis, and Preservation Project is summarized in outline and graphic form. The objective of the project is to develop component and system level technology to enable the unmanned collection, analysis and preservation of physical, chemical and mineralogical data from the surface of planetary bodies. Technology needs and challenges are identified and specific objectives are described.
The Mars Science Laboratory Organic Check Material
NASA Technical Reports Server (NTRS)
Conrad, Pamela G.; Eigenbrode, J. E.; Mogensen, C. T.; VonderHeydt, M. O.; Glavin, D. P.; Mahaffy, P. M.; Johnson, J. A.
2011-01-01
The Organic Check Material (OCM) has been developed for use on the Mars Science Laboratory mission to serve as a sample standard for verification of organic cleanliness and characterization of potential sample alteration as a function of the sample acquisition and portioning process on the Curiosity rover. OCM samples will be acquired using the same procedures for drilling, portioning and delivery as are used to study martian samples with The Sample Analysis at Mars (SAM) instrument suite during MSL surface operations. Because the SAM suite is highly sensitive to organic molecules, the mission can better verify the cleanliness of Curiosity's sample acquisition hardware if a known material can be processed through SAM and compared with the results obtained from martian samples.
Lesot, Philippe; Kazimierczuk, Krzysztof; Trébosc, Julien; Amoureux, Jean-Paul; Lafon, Olivier
2015-11-01
Unique information about the atom-level structure and dynamics of solids and mesophases can be obtained by the use of multidimensional nuclear magnetic resonance (NMR) experiments. Nevertheless, the acquisition of these experiments often requires long acquisition times. We review here alternative sampling methods, which have been proposed to circumvent this issue in the case of solids and mesophases. Compared to the spectra of solutions, those of solids and mesophases present some specificities because they usually display lower signal-to-noise ratios, non-Lorentzian line shapes, lower spectral resolutions and wider spectral widths. We highlight herein the advantages and limitations of these alternative sampling methods. A first route to accelerate the acquisition time of multidimensional NMR spectra consists in the use of sparse sampling schemes, such as truncated, radial or random sampling ones. These sparsely sampled datasets are generally processed by reconstruction methods differing from the Discrete Fourier Transform (DFT). A host of non-DFT methods have been applied for solids and mesophases, including the G-matrix Fourier transform, the linear least-square procedures, the covariance transform, the maximum entropy and the compressed sensing. A second class of alternative sampling consists in departing from the Jeener paradigm for multidimensional NMR experiments. These non-Jeener methods include Hadamard spectroscopy as well as spatial or orientational encoding of the evolution frequencies. The increasing number of high field NMR magnets and the development of techniques to enhance NMR sensitivity will contribute to widen the use of these alternative sampling methods for the study of solids and mesophases in the coming years. Copyright © 2015 John Wiley & Sons, Ltd.
An Empiric HIV Risk Scoring Tool to Predict HIV-1 Acquisition in African Women.
Balkus, Jennifer E; Brown, Elizabeth; Palanee, Thesla; Nair, Gonasagrie; Gafoor, Zakir; Zhang, Jingyang; Richardson, Barbra A; Chirenje, Zvavahera M; Marrazzo, Jeanne M; Baeten, Jared M
2016-07-01
To develop and validate an HIV risk assessment tool to predict HIV acquisition among African women. Data were analyzed from 3 randomized trials of biomedical HIV prevention interventions among African women (VOICE, HPTN 035, and FEM-PrEP). We implemented standard methods for the development of clinical prediction rules to generate a risk-scoring tool to predict HIV acquisition over the course of 1 year. Performance of the score was assessed through internal and external validations. The final risk score resulting from multivariable modeling included age, married/living with a partner, partner provides financial or material support, partner has other partners, alcohol use, detection of a curable sexually transmitted infection, and herpes simplex virus 2 serostatus. Point values for each factor ranged from 0 to 2, with a maximum possible total score of 11. Scores ≥5 were associated with HIV incidence >5 per 100 person-years and identified 91% of incident HIV infections from among only 64% of women. The area under the curve (AUC) for predictive ability of the score was 0.71 (95% confidence interval [CI]: 0.68 to 0.74), indicating good predictive ability. Risk score performance was generally similar with internal cross-validation (AUC = 0.69; 95% CI: 0.66 to 0.73) and external validation in HPTN 035 (AUC = 0.70; 95% CI: 0.65 to 0.75) and FEM-PrEP (AUC = 0.58; 95% CI: 0.51 to 0.65). A discrete set of characteristics that can be easily assessed in clinical and research settings was predictive of HIV acquisition over 1 year. The use of a validated risk score could improve efficiency of recruitment into HIV prevention research and inform scale-up of HIV prevention strategies in women at highest risk.
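The scoring rule described (integer points per factor, maximum score 11, threshold at 5) is trivially mechanized. The point values below are placeholders chosen only to sum to 11 and match the abstract's structure; the actual weights come from the published multivariable model:

```python
# Hypothetical point assignments (0-2 per factor, max 11); not the published values.
POINTS = {
    "young_age": 2,
    "not_married_or_cohabiting": 1,
    "partner_provides_no_support": 1,
    "partner_has_other_partners": 2,
    "alcohol_use": 2,
    "curable_sti_detected": 2,
    "hsv2_seropositive": 1,
}

def hiv_risk_score(profile: dict) -> tuple[int, bool]:
    """Sum point values for present risk factors; scores >= 5 flag high risk
    (HIV incidence > 5 per 100 person-years in the derivation cohort)."""
    score = sum(pts for factor, pts in POINTS.items() if profile.get(factor))
    return score, score >= 5
```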
WebProtégé: A Collaborative Ontology Editor and Knowledge Acquisition Tool for the Web
Tudorache, Tania; Nyulas, Csongor; Noy, Natalya F.; Musen, Mark A.
2012-01-01
In this paper, we present WebProtégé—a lightweight ontology editor and knowledge acquisition tool for the Web. With the wide adoption of Web 2.0 platforms and the gradual adoption of ontologies and Semantic Web technologies in the real world, we need ontology-development tools that are better suited for the novel ways of interacting, constructing and consuming knowledge. Users today take Web-based content creation and online collaboration for granted. WebProtégé integrates these features as part of the ontology development process itself. We tried to lower the entry barrier to ontology development by providing a tool that is accessible from any Web browser, has extensive support for collaboration, and a highly customizable and pluggable user interface that can be adapted to any level of user expertise. The declarative user interface enabled us to create custom knowledge-acquisition forms tailored for domain experts. We built WebProtégé using the existing Protégé infrastructure, which supports collaboration on the back end side, and the Google Web Toolkit for the front end. The generic and extensible infrastructure allowed us to easily deploy WebProtégé in production settings for several projects. We present the main features of WebProtégé and its architecture and describe briefly some of its uses for real-world projects. WebProtégé is free and open source. An online demo is available at http://webprotege.stanford.edu. PMID:23807872
Gilbert, Jack A; Meyer, Folker; Jansson, Janet; Gordon, Jeff; Pace, Norman; Tiedje, James; Ley, Ruth; Fierer, Noah; Field, Dawn; Kyrpides, Nikos; Glöckner, Frank-Oliver; Klenk, Hans-Peter; Wommack, K Eric; Glass, Elizabeth; Docherty, Kathryn; Gallery, Rachel; Stevens, Rick; Knight, Rob
2010-12-25
This report details the outcome of the first meeting of the Earth Microbiome Project (EMP) to discuss sample selection and acquisition. The meeting, held at the Argonne National Laboratory on Wednesday, October 6th, 2010, focused on discussion of how to prioritize environmental samples for sequencing and metagenomic analysis as part of the global effort of the EMP to systematically determine the functional and phylogenetic diversity of microbial communities across the world.
Interactive specification acquisition via scenarios: A proposal
NASA Technical Reports Server (NTRS)
Hall, Robert J.
1992-01-01
Some reactive systems are most naturally specified by giving large collections of behavior scenarios. These collections not only specify the behavior of the system, but also provide good test suites for validating the implemented system. Due to the complexity of the systems and the number of scenarios, however, it appears that automated assistance is necessary to make this software development process workable. Interactive Specification Acquisition Tool (ISAT) is a proposed interactive system for supporting the acquisition and maintenance of a formal system specification from scenarios, as well as automatic synthesis of control code and automated test generation. This paper discusses the background, motivation, proposed functions, and implementation status of ISAT.
The Center/TRACON Automation System (CTAS): A video presentation
NASA Technical Reports Server (NTRS)
Green, Steven M.; Freeman, Jeannine
1992-01-01
NASA Ames, working with the FAA, has developed a highly effective set of automation tools for aiding the air traffic controller in traffic management within the terminal area. To effectively demonstrate these tools, the video AAV-1372, entitled 'Center/TRACON Automation System,' was produced. The script to the video is provided along with instructions for its acquisition.
The 2009 DOD Cost Research Workshop: Acquisition Reform
2010-02-01
Recovered proceedings fragments: DASA-CE–2: ACEIT Enhancement, Help-Desk/Training, Consulting. DASA-CE–3: Command, Control, Communications, Computers, Intelligence, Surveillance, and… …Management Information System (OSMIS) online interactive relational database. DASA-CE–2, Title: ACEIT Enhancement, Help-Desk/Training, Consulting. Summary: …support and training for the Automated Cost Estimating Integrated Tools (ACEIT) software suite. ACEIT is the Army standard suite of analytical tools for…
NASA Technical Reports Server (NTRS)
Peters, P. N.; Hester, H. B.; Bertsch, W.; Mayfield, H.; Zatko, D.
1983-01-01
An investigation involving sampling the rapidly changing environment of the Shuttle cargo bay is considered. Four time-integrated samples and one rapid acquisition sample were collected to determine the types and quantities of contaminants present during ascent and descent of the Shuttle. The sampling times for the various bottles were controlled by valves operated by the Data Acquisition and Control System (DACS) of the IECM. Many of the observed species were found to be common solvents used in cleaning surfaces. When the actual volume sampled is taken into account, the relative mass of organics sampled during descent is about 20 percent less than during ascent.
Advanced Curation of Current and Future Extraterrestrial Samples
NASA Technical Reports Server (NTRS)
Allen, Carlton C.
2013-01-01
Curation of extraterrestrial samples is the critical interface between sample return missions and the international research community. Curation includes documentation, preservation, preparation, and distribution of samples. The current collections of extraterrestrial samples include:
• Lunar rocks and soils collected by the Apollo astronauts
• Meteorites, including samples of asteroids, the Moon, and Mars
• "Cosmic dust" (asteroid and comet particles) collected by high-altitude aircraft
• Solar wind atoms collected by the Genesis spacecraft
• Comet particles collected by the Stardust spacecraft
• Interstellar dust collected by the Stardust spacecraft
• Asteroid particles collected by the Hayabusa spacecraft
These samples were formed in environments strikingly different from that on Earth. Terrestrial contamination can destroy much of the scientific significance of many extraterrestrial materials. In order to preserve the research value of these precious samples, contamination must be minimized, understood, and documented. In addition, the samples must be preserved - as far as possible - from physical and chemical alteration. In 2011 NASA selected the OSIRIS-REx mission, designed to return samples from the primitive asteroid 1999 RQ36 (Bennu). JAXA will sample C-class asteroid 1999 JU3 with the Hayabusa-2 mission. ESA is considering the near-Earth asteroid sample return mission Marco Polo-R. The Decadal Survey listed the first lander in a Mars sample return campaign as its highest priority flagship-class mission, with sample return from the South Pole-Aitken basin and the surface of a comet among additional top priorities. The latest NASA budget proposal includes a mission to capture a 5-10 m asteroid and return it to the vicinity of the Moon as a target for future sampling. Samples, tools, containers, and contamination witness materials from any of these missions carry unique requirements for acquisition and curation. Some of these requirements represent significant advances over methods currently used. New analytical and screening techniques will increase the value of current sample collections. Improved web-based tools will make information on all samples more accessible to researchers and the public. Advanced curation of current and future extraterrestrial samples includes:
• Contamination control - inorganic / organic
• Temperature of preservation - subfreezing / cryogenic
• Non-destructive preliminary examination - X-ray tomography / XRF mapping / Raman mapping
• Microscopic samples - handling / sectioning / transport
• Special samples - unopened lunar cores
• Informatics - online catalogs / community-based characterization
Acquisition Challenge: The Importance of Incompressibility in Comparing Learning Curve Models
2015-10-01
parameters for all four learning models used in the study. The learning rate factor, b, is the slope of the linear regression line, which in this case is…incorporated within the DoD acquisition environment. This study tested three alternative learning models (the Stanford-B model, DeJong's learning formula…appropriate tools to calculate accurate and reliable predictions. However, conventional learning curve methodology has been in practice since the pre
Analyzing the Effects of the Weapon Systems Acquisition Reform Act
2014-06-01
otherwise, an ICD is developed. The ICD is the first key document that JCIDS contributes to the acquisition system. This document feeds into the MSA…WSARA is the initiator of bottom-line change, if not the catalyst for changes that occur. The bottom line in the corporate world is profit. For a…combat engineering, force sustainment, petroleum and water, sets, kits, outfits and tools, test measurement and diagnostic equipment, and
Analyzing the Effects of the Weapon Systems Acquisition Reform Act
2014-05-28
an ICD is developed. The ICD is the first key document that JCIDS contributes to the acquisition system. This document feeds into the MSA and the...make an assumption that the WSARA is the initiator of bottom-line change, if not the catalyst for changes that occur. The bottom line in the...force sustainment, petroleum and water, sets, kits, outfits and tools, test measurement and diagnostic equipment, and contingency basing infrastructure
Research on remote sensing image pixel attribute data acquisition method in AutoCAD
NASA Astrophysics Data System (ADS)
Liu, Xiaoyang; Sun, Guangtong; Liu, Jun; Liu, Hui
2013-07-01
Remote sensing images have been widely used in AutoCAD, but AutoCAD lacks remote sensing image processing functions. In this paper, ObjectARX was used as the secondary development tool, combined with the Image Engine SDK, to realize remote sensing image pixel attribute data acquisition in AutoCAD, which provides critical technical support for remote sensing image processing algorithms in the AutoCAD environment.
2013-05-02
Statistical Relational Learning (SRL) as an Enabling Technology for Data Acquisition and Data Fusion in Video. …particular, it is important to reason about which portions of video require expensive analysis and storage. This project aims to make these…inferences using new and existing tools from Statistical Relational Learning (SRL). SRL is a recently emerging technology that enables the effective
How to buy and sell a group practice.
Groth, C D
1988-01-01
This article reviews the world of mergers, acquisitions and divestitures, providing guidelines for the group practice administrator who is in the position of considering a merger or sale. The importance of strategic planning is discussed, and a set of working tools for buying and selling a medical practice is provided, along with suggestions for ways for groups to compete with industrial health/clinic programs in the area of long-term growth/acquisition programs.
Lock-in thermography using a cellphone attachment infrared camera
NASA Astrophysics Data System (ADS)
Razani, Marjan; Parkhimchyk, Artur; Tabatabaei, Nima
2018-03-01
Lock-in thermography (LIT) is a thermal-wave-based, non-destructive testing technique which has been widely utilized in research settings for characterization and evaluation of biological and industrial materials. However, despite promising research outcomes, the widespread adoption of LIT in industry, and its commercialization, is hindered by the high cost of the infrared cameras used in LIT setups. In this paper, we report on the feasibility of using inexpensive cellphone attachment infrared cameras for performing LIT. While the cost of such cameras is over two orders of magnitude less than their research-grade counterparts, our experimental results on a block sample with subsurface defects and a tooth with early dental caries suggest that acceptable performance can be achieved through careful instrumentation and implementation of proper data acquisition and image processing steps. We anticipate this study will pave the way for the development of low-cost thermography systems and their commercialization as inexpensive tools for non-destructive testing of industrial samples as well as affordable clinical devices for diagnostic imaging of biological tissues.
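Lock-in thermography itself reduces to correlating the recorded IR frame stack with sine/cosine references at the modulation frequency, from which amplitude and phase images fall out directly. A generic sketch of this demodulation step (not tied to any particular camera API; the phase sign convention varies between implementations):

```python
import numpy as np

def lockin_demodulate(frames, f_mod, fps):
    """Correlate a thermal frame stack of shape (n_frames, H, W) with sin/cos
    references at the modulation frequency f_mod (Hz), given the frame rate
    fps, to recover per-pixel amplitude and phase images."""
    n = frames.shape[0]
    t = np.arange(n) / fps
    ref_s = np.sin(2 * np.pi * f_mod * t)
    ref_c = np.cos(2 * np.pi * f_mod * t)
    s = np.tensordot(ref_s, frames, axes=(0, 0)) * 2 / n   # in-phase image
    c = np.tensordot(ref_c, frames, axes=(0, 0)) * 2 / n   # quadrature image
    return np.hypot(s, c), np.arctan2(s, c)                # amplitude, phase
```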
Hydropower Biological Evaluation Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
This software is a set of analytical tools to evaluate the physical and biological performance of existing, refurbished, or newly installed conventional hydro-turbines nationwide where fish passage is a regulatory concern. The current version is based on information collected by the Sensor Fish; future versions will include other technologies. The tool set includes data acquisition, data processing, and biological response tools, with applications to various turbine designs and other passage alternatives. The associated database is centralized and can be accessed remotely. We have demonstrated its use for various applications, including both turbines and spillways.
A Review and Annotated Bibliography of Armor Gunnery Training Device Effectiveness Literature
1993-11-01
training effectiveness (skill acquisition, skill retention, performance prediction, transfer of training) and (b) research limitations (sample size…standalone, tank-appended, subcaliber, and laser) and four areas of training effectiveness (skill acquisition, skill retention, performance prediction, and…standalone, tank-appended, subcaliber, laser) and areas of training effectiveness (skill acquisition, skill retention, performance prediction, transfer of
Scanning SQUID microscope with an in-situ magnetization/demagnetization field for geological samples
NASA Astrophysics Data System (ADS)
Du, Junwei; Liu, Xiaohong; Qin, Huafeng; Wei, Zhao; Kong, Xiangyang; Liu, Qingsong; Song, Tao
2018-04-01
Magnetic properties of rocks are crucial for paleo-, rock- and environmental-magnetism and for magnetic material sciences. Conventional rock magnetometers measure bulk properties of samples, whereas a scanning microscope can map the distribution of remanent magnetization. In this study, a new scanning microscope based on a low-temperature DC superconducting quantum interference device (SQUID), equipped with an in-situ magnetization/demagnetization device, was developed. To combine an instrument as sensitive as a SQUID with high magnetizing/demagnetizing fields, the pick-up coil, the magnetization/demagnetization coils and the measurement mode of the system were optimized. The new microscope has a field sensitivity of 250 pT/√Hz at a coil-to-sample spacing of ∼350 μm, and high magnetization (0-1 T) / demagnetization (0-300 mT, 400 Hz) capabilities. With this microscope, isothermal remanent magnetization (IRM) acquisition curves and the corresponding alternating field (AF) demagnetization curves can be obtained for each point without transferring samples between different procedures, which could otherwise cause position deviation, wasted time, and other interference. The newly designed SQUID microscope can thus be used to investigate the rock magnetic properties of samples at a micro-area scale, and has great potential to be an efficient tool in paleomagnetism, rock magnetism, and magnetic material studies.
Evaluation of the quality of the teaching-learning process in undergraduate courses in Nursing.
González-Chordá, Víctor Manuel; Maciá-Soler, María Loreto
2015-01-01
To identify aspects for improvement of the quality of the teaching-learning process through the analysis of tools that evaluated the acquisition of skills by undergraduate students of Nursing. A prospective longitudinal study conducted in a population of 60 second-year Nursing students, based on registration data from which quality indicators that evaluate the acquisition of skills were obtained, with descriptive and inferential analysis. Nine items and nine learning activities included in the assessment tools did not reach the established quality indicators (p<0.05). There are statistically significant differences depending on the hospital and clinical practice unit (p<0.05). The analysis of the evaluation tools used in the subject "Nursing Care in Welfare Processes" of the analyzed university undergraduate course enabled the detection of areas for improvement in the teaching-learning process. The challenge of education in nursing is to reach the best clinical research and educational results, in order to improve the quality of education and health care.
Su, Yapeng; Shi, Qihui; Wei, Wei
2017-02-01
New insights into cellular heterogeneity over the last decade have spurred the development of a variety of single-cell omics tools at a lightning pace. The high-dimensional single-cell data generated by these tools require new theoretical approaches and analytical algorithms for effective visualization and interpretation. In this review, we briefly survey the state-of-the-art single-cell proteomic tools, with a particular focus on data acquisition and quantification, followed by an elaboration of a number of statistical and computational approaches developed to date for dissecting the high-dimensional single-cell data. The underlying assumptions, unique features, and limitations of these analytical methods, along with the designated biological questions they seek to answer, will be discussed. Particular attention will be given to those information-theoretical approaches that are anchored in a set of first principles of physics and can yield detailed (and often surprising) predictions. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
MEA-Tools: an open source toolbox for the analysis of multi-electrode data with MATLAB.
Egert, U; Knott, Th; Schwarz, C; Nawrot, M; Brandt, A; Rotter, S; Diesmann, M
2002-05-30
Recent advances in electrophysiological techniques have created new tools for the acquisition and storage of neuronal activity recorded simultaneously with numerous electrodes. These techniques support the analysis of the function as well as the structure of individual electrogenic cells in the context of the surrounding neuronal or cardiac network. Commercially available tools for the analysis of such data, however, cannot be easily adapted to newly emerging requirements for data analysis and visualization, and cross-compatibility between them is limited. In this report we introduce a free open source toolbox called microelectrode array tools (MEA-Tools) for the analysis of multi-electrode data, based on the common data analysis environment MATLAB (version 5.3-6.1, The MathWorks, Natick, MA). The toolbox itself is platform independent. The file interface currently supports files recorded with MCRack (Multi Channel Systems, Reutlingen, Germany) under Microsoft Windows 95, 98, NT, and 2000, but can be adapted to other data acquisition systems. Functions are controlled via command line input and graphical user interfaces, and support common requirements for the analysis of local field potentials, extracellular spike activity, and continuous recordings, in addition to supplementary data acquired by additional instruments, e.g. intracellular amplifiers. Data may be processed as continuous recordings or as time windows triggered to some event.
Seekatz, Anna; Bassis, Christine M; Lolans, Karen; Yelin, Rachel D; Moore, Nicholas M; Okamoto, Koh; Rhee, Yoona; Bell, Pamela; Dangana, Thelma; Sidimirova, Galina; Weinstein, Robert A; Fogg, Louis; Lin, Michael Y; Young, Vincent B; Hayden, Mary K
2017-01-01
Background: Colonization with KPC-producing Klebsiella pneumoniae (KPC-Kp) precedes infection and represents a potential target for intervention. To identify microbial signatures associated with KPC-Kp acquisition, we conducted a prospective, longitudinal study of the fecal microbiota in LTACH patients at risk of acquiring KPC-Kp. Methods: We collected admission and weekly rectal swab samples from patients admitted to one LTACH from May 2015 to May 2016. Patients were screened for KPC-Kp by PCR at each sampling time. KPC acquisition was confirmed by culture of KPC-Kp. To assess changes in the microbiota related to acquisition, we sequenced the 16S rRNA gene (V4 region) from collected rectal swabs. Diversity, intra-individual changes, and the relative abundance of the operational taxonomic unit (OTU) that contains KPC-Kp were compared in patients who were KPC-Kp negative upon admission and who had at least one additional swab sample collected. Results: 318 patients (1,247 samples) were eligible for analysis; a mean of 3.7 samples were collected per patient. Sixty-two patients (19.5%) acquired KPC-Kp (cases) and 256 patients remained negative for all carbapenem-resistant Enterobacteriaceae throughout their stay (controls). Median length of stay before KPC-Kp detection was 14.5 days. At the time of KPC-Kp acquisition, levels of an Enterobacteriaceae OTU increased significantly compared with pre-acquisition samples and with samples from control patients (Wilcoxon test, P < 0.0001). Similarly, we observed a decrease in total diversity of the fecal microbiota at the time of acquisition in cases (P < 0.01). Compared with controls, cases exhibited decreased intra-individual fecal microbiota similarity immediately prior to acquisition of KPC-Kp (P < 0.01). Comparison of microbial features at the time of admission using random forest revealed a higher abundance of Enterococcus and Escherichia OTUs in controls vs cases. Conclusion: We observed intra-individual changes in the fecal microbiota of case patients prior to acquisition of KPC-Kp. Compared with patients who did not acquire KPC-Kp, cases exhibited significant changes in microbiota diversity and increased abundance of potential KPC-Kp at acquisition. Our results suggest that shifts in the microbiota may precede colonization by KPC-Kp. Disclosures: N. M. Moore, Cepheid: Research Contractor, Funded and provided reagents for associated research projects; R. A. Weinstein, OpGen: Receipt of donated laboratory services for project, Research support; Clorox: Receipt of contributed product, Conducting studies in healthcare facilities that are receiving contributed product; Molnlycke: Receipt of contributed product, Conducting studies in healthcare facilities that are receiving contributed product; Sage Products: Receipt of contributed product, Conducting studies in healthcare facilities that are receiving contributed product; M. Y. Lin, Sage, Inc.: receipt of contributed product, Conducting studies in healthcare facilities that are receiving contributed product; OpGen, Inc.: receipt of in-kind laboratory services, Conducting studies in healthcare facilities that are receiving contributed product; M. K.
Hayden, OpGen, Inc.: Receipt of donated laboratory services for project, Research support; Clorox: Receipt of contributed product, Conducting studies in healthcare facilities that are receiving contributed product; Molnlycke: Receipt of contributed product, Conducting studies in healthcare facilities that are receiving contributed product; Sage Products: Receipt of contributed product, Conducting studies in healthcare facilities that are receiving contributed product.
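The diversity comparison reported above can be sketched with standard tools: compute a per-sample Shannon index from the OTU count table and compare case and control groups with a Wilcoxon rank-sum test. This is illustrative only; the study's actual pipeline is not given in the abstract:

```python
import numpy as np
from scipy.stats import ranksums

def shannon(counts):
    """Shannon diversity of one sample from its OTU count vector."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return -(p * np.log(p)).sum()

def compare_diversity(otu_cases, otu_controls):
    """otu_cases / otu_controls: lists of per-sample OTU count vectors
    (hypothetical inputs). Returns the rank-sum statistic and p-value."""
    d_cases = [shannon(c) for c in otu_cases]
    d_controls = [shannon(c) for c in otu_controls]
    return ranksums(d_cases, d_controls)
```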
Reducing acquisition times in multidimensional NMR with a time-optimized Fourier encoding algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Zhiyong; Department of Electronic Science, Fujian Provincial Key Laboratory of Plasma and Magnetic Resonance, Xiamen University, Xiamen, Fujian 361005; Smith, Pieter E. S.
Speeding up the acquisition of multidimensional nuclear magnetic resonance (NMR) spectra is an important topic in contemporary NMR, with central roles in high-throughput investigations and analyses of marginally stable samples. A variety of fast NMR techniques have been developed, including methods based on non-uniform sampling and Hadamard encoding, that overcome the long sampling times inherent to schemes based on fast-Fourier-transform (FFT) methods. Here, we explore the potential of an alternative fast acquisition method that leverages a priori knowledge to tailor polychromatic pulses and customized time delays for an efficient Fourier encoding of the indirect domain of an NMR experiment. By porting the encoding of the indirect domain to the excitation process, this strategy avoids potential artifacts associated with non-uniform sampling schemes and uses a minimum number of scans equal to the number of resonances present in the indirect dimension. An added convenience is afforded by the fact that a usual 2D FFT can be used to process the generated data. Acquisitions of 2D heteronuclear correlation NMR spectra on quinine and on the anti-inflammatory drug isobutyl propionic phenolic acid illustrate the new method's performance. This method can be readily automated to deal with complex samples such as those occurring in metabolomics, in in-cell as well as in in vivo NMR applications, where speed and temporal stability are often primary concerns.
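The core idea, that a priori knowledge of the indirect-dimension frequencies lets the scan count shrink to the number of resonances, can be illustrated with a toy linear-algebra model. The frequencies and delays below are invented, and the real method encodes via tailored polychromatic pulses rather than an explicit matrix solve:

```python
import numpy as np

freqs = np.array([120.0, 450.0, 980.0])   # indirect-dimension resonances (Hz), assumed known
amps = np.array([1.0, 0.5, 0.2])          # "true" amplitudes to recover
n = len(freqs)                            # minimum number of scans = number of resonances

t1 = np.arange(n) / (2 * freqs.max())     # customized evolution delays (illustrative)
E = np.exp(2j * np.pi * np.outer(t1, freqs))  # encoding matrix, one row per scan

signal = E @ amps                         # what the n scans would measure
recovered = np.linalg.solve(E, signal)    # exact recovery from only n scans
print(np.allclose(recovered.real, amps))  # True
```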
Pistell, P J; Falls, W A
2008-09-09
Pavlovian conditioning is a useful tool for elucidating the neural mechanisms involved in learning and memory, especially with regard to stimuli associated with aversive events. The amygdala has repeatedly been implicated as playing a significant role in the acquisition and expression of fear. If the amygdala is critical for the acquisition of fear, then it should contribute to this process regardless of the parameters used to induce or evaluate conditioned fear. A series of experiments using reversible inactivation techniques evaluated the role of the amygdala in the acquisition of conditioned fear when training was conducted over several days in rats. Fear-potentiated startle was used to evaluate the acquisition of conditioned fear. Pretraining infusions into the amygdala of an N-methyl-D-aspartate (NMDA) or non-NMDA receptor antagonist alone interfered with the acquisition of fear early in training, but not later. Pretraining infusions of a cocktail consisting of both an NMDA and a non-NMDA antagonist interfered with the acquisition of conditioned fear across all days of training. Taken together, these results suggest the amygdala may be critical for the acquisition of conditioned fear regardless of the parameters utilized.
Enhanced methodology of focus control and monitoring on scanner tool
NASA Astrophysics Data System (ADS)
Chen, Yen-Jen; Kim, Young Ki; Hao, Xueli; Gomez, Juan-Manuel; Tian, Ye; Kamalizadeh, Ferhad; Hanson, Justin K.
2017-03-01
As technology nodes shrink from 14 nm to 7 nm, the reliability of tool monitoring techniques in advanced semiconductor fabs becomes more critical to achieving high yield and quality. Tool health monitoring methods involve periodic sampling of moderately processed test wafers to check for particles, defects, and tool stability. For lithography TWINSCAN scanner tools, the requirements for overlay stability and focus control are very strict. Current scanner tool health monitoring methods include running BaseLiner periodically to ensure proper tool stability. The focus measurement on YIELDSTAR, by real-time or library-based reconstruction of critical dimensions (CD) and side wall angle (SWA), has been demonstrated as an accurate metrology input to the control loop. The high accuracy and repeatability of the YIELDSTAR focus measurement provide a common reference for scanner setup and user process. To further improve metrology and matching performance, Diffraction Based Focus (DBF) metrology, enabling accurate, fast, and non-destructive focus acquisition, has been successfully utilized for focus monitoring/control of TWINSCAN NXT immersion scanners. The optimal DBF target was determined to have minimized dose crosstalk, dynamic precision, set-get residual, and lens aberration sensitivity. By exploiting this new measurement target design, 80% improvement in tool-to-tool matching, >16% improvement in run-to-run mean focus stability, and >32% improvement in focus uniformity have been demonstrated compared to the previous BaseLiner methodology. Matching <2.4 nm across multiple NXT immersion scanners has been achieved with the new methodology of setting a baseline reference. This baseline technique has been evaluated to give consistent performance with either the conventional BaseLiner low numerical aperture mode (NA = 1.20) or the advanced-illumination high-NA mode (NA = 1.35). This enhanced methodology of focus control and monitoring across multiple illumination conditions opens an avenue to significantly reduce Focus-Exposure Matrix (FEM) wafer exposures for new product/layer best focus (BF) setup.
Optimisation of wavelength modulated Raman spectroscopy: towards high throughput cell screening.
Praveen, Bavishna B; Mazilu, Michael; Marchington, Robert F; Herrington, C Simon; Riches, Andrew; Dholakia, Kishan
2013-01-01
In the field of biomedicine, Raman spectroscopy is a powerful technique for discriminating between normal and cancerous cells. However, the strong background signal from the sample and the instrumentation reduces the efficiency of this discrimination. Wavelength Modulated Raman Spectroscopy (WMRS) can suppress this background in the Raman spectra. In this study we demonstrate a systematic approach to optimizing the various parameters of WMRS to reduce the acquisition time for potential applications such as higher-throughput cell screening. The signal-to-noise ratio (SNR) of the Raman bands depends on the modulation amplitude, time constant and total acquisition time. It was observed that the sampling rate does not influence the SNR of the Raman bands if three or more wavelengths are sampled. With these optimised WMRS parameters, we increased the throughput in the binary classification of normal human urothelial cells and bladder cancer cells by reducing the total acquisition time to 6 s, which is significantly lower than the acquisition times previously required for discrimination between similar cell types.
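The principle exploited by WMRS is that Raman bands track the excitation wavelength while the fluorescence background does not; sampling at three or more wavelengths and projecting onto a sinusoidal reference across the modulation cycle cancels the static background. A toy demodulation sketch with an assumed data layout, not the authors' code:

```python
import numpy as np

def wmrs_demodulate(spectra, phases):
    """Lock-in style projection across the wavelength-modulation dimension.
    spectra: (n_steps, n_pixels) array, one spectrum per modulation step;
    phases: modulation phase of each step, spanning full cycles so that the
    static (unmodulated) fluorescence background sums to zero."""
    spectra = np.asarray(spectra, dtype=float)
    ref = np.sin(np.asarray(phases))[:, None]
    return (spectra * ref).sum(axis=0) * 2 / len(phases)   # derivative-like Raman signal
```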
E-learning for occupational physicians' CME: a study case.
Mazzoleni, M Cristina; Rognoni, Carla; Finozzi, Enrico; Gri, Tommaso; Pagani, Marco; Imbriani, Marcello
2011-01-01
The present study reports the results of the evaluation of an e-learning CME course in the field of Occupational Medicine. In particular, the following aspects were investigated: whether and how the course contents met the users' educational needs; the effectiveness of the course in terms of knowledge improvement; and users' behaviour. Attendance data and the results of a sample of 1,128 attendees were analyzed via ad hoc tools developed for direct inspection of the Moodle CMS database. The results document the effectiveness of the e-learning course, both in meeting the educational needs of physicians and in improving knowledge and problem-solving skill acquisition. Users' behaviour revealed a certain tendency toward passing the tests rather than pursuing the best possible result. Interaction with the tutor was low.
Mars Science Laboratory CHIMRA/IC/DRT Flight Software for Sample Acquisition and Processing
NASA Technical Reports Server (NTRS)
Kim, Won S.; Leger, Chris; Carsten, Joseph; Helmick, Daniel; Kuhn, Stephen; Redick, Richard; Trujillo, Diana
2013-01-01
The design methodologies of using sequence diagrams, multi-process functional flow diagrams, and hierarchical state machines were successfully applied in designing three MSL (Mars Science Laboratory) flight software modules responsible for handling actuator motions of the CHIMRA (Collection and Handling for In Situ Martian Rock Analysis), IC (Inlet Covers), and DRT (Dust Removal Tool) mechanisms. The methodologies were essential for specifying complex interactions with other modules, supporting concurrent foreground and background motions, and handling various fault protections. Studying task scenarios with multi-process functional flow diagrams yielded great insight into overall design perspectives. Since the three modules require three different levels of background motion support, the methodologies presented in this paper provide an excellent comparison. All three modules are fully operational in flight.
High-throughput microcoil NMR of compound libraries using zero-dispersion segmented flow analysis.
Kautz, Roger A; Goetzinger, Wolfgang K; Karger, Barry L
2005-01-01
An automated system for loading samples into a microcoil NMR probe has been developed using segmented flow analysis. This approach doubled the throughput of the published direct injection and flow injection methods, improved sample utilization 3-fold, and was applicable to high-field NMR facilities with long transfer lines between the sample handler and the NMR magnet. Sample volumes of 2 microL (10-30 mM, approximately 10 microg) were drawn from a 96-well microtiter plate by a sample handler, then pumped to a 0.5-microL microcoil NMR probe as a queue of closely spaced "plugs" separated by an immiscible fluorocarbon fluid. Individual sample plugs were detected by their NMR signal and automatically positioned for stopped-flow data acquisition. The sample in the NMR coil could be changed within 35 s by advancing the queue. The fluorocarbon liquid wetted the wall of the Teflon transfer line, preventing the DMSO samples from contacting the capillary wall and thus reducing sample losses to below 5% after passage through the 3-m transfer line. With a wash plug of solvent between samples, sample-to-sample carryover was <1%. Significantly, the samples did not disperse into the carrier liquid during loading or during acquisitions of several days for trace analysis. For automated high-throughput analysis using a 16-second acquisition time, spectra were recorded at a rate of 1.5 min/sample and total deuterated solvent consumption was <0.5 mL (1 US dollar) per 96-well plate.
ERIC Educational Resources Information Center
Asteris, Mark M., Jr.
2012-01-01
This study was designed to investigate the differences in Motivational Interviewing (MI) skill acquisition and retention among probation officers. This study had a randomized, experimental, pretest-posttest control group design using the MITI 3.1.1 and the VASE-R to measure MI skill acquisition and retention. A random sample (n = 24) of probation…
NASA Astrophysics Data System (ADS)
Pesaresi, Damiano; Sleeman, Reinoud
2010-05-01
Many medium to big size seismic data centers around the world face the same question: which software should be used to acquire seismic data in real time? A home-made or a commercial one? Both choices have pros and cons. The in-house development of software usually requires an increased investment in human resources rather than a financial investment. However, the advantage of fully meeting your own needs can be put at risk when the software engineer quits the job! Commercial software offers the advantage of being maintained, but it may require both a considerable financial investment and training. The main seismic data acquisition software suites available nowadays are the public domain SeisComP and EarthWorm packages and the commercial package Antelope. Nanometrics, Guralp and RefTek also provide seismic data acquisition software, but it is mainly intended for single station/network acquisition. Antelope is a software package for real-time acquisition and processing of seismic network data, with its roots in the academic seismological community. The software is developed by Boulder Real Time Technology (BRTT) and commercialized by Kinemetrics. It is used by IRIS affiliates for off-line data processing and is the main acquisition tool for the USArray program and for data centers in Europe such as the ORFEUS Data Center, OGS (Italy), ZAMG (Austria), ARSO (Slovenia) and GFU (Czech Republic). SeisComP was originally developed for the GEOFON global network to provide a system for data acquisition, data exchange (SeedLink protocol) and automatic processing. It has evolved into a widely distributed, networked seismographic system for data acquisition and real-time data exchange over the Internet, and is supported by ORFEUS as the standard seismic data acquisition tool in Europe. SeisComP3 is the next generation of the software and was developed for the German Indonesian Tsunami Early Warning System (GITEWS). SeisComP is licensed by GFZ (free of charge) and maintained by a private company (GEMPA). EarthWorm was originally developed by the United States Geological Survey (USGS) to exchange data with Canadian seismologists. It is now used by several institutions around the world and is maintained and developed by a commercial software house, ISTI.
Using Predictive Analytics to Detect Major Problems in Department of Defense Acquisition Programs
2012-03-01
research is focused on three questions. First, can we predict the contractor provided estimate at complete (EAC)? Second, can we use those predictions to...develop an algorithm to determine if a problem will occur in an acquisition program or sub-program? Lastly, can we provide the probability of a problem...more than doubling the probability of a problem occurrence compared to current tools in the cost community. Though program managers can use this
2013-04-01
Teresa Wu, Arizona State University Eugene Rex Jalao, Arizona State University and University of the Philippines Christopher Auger, Lars Baldus, Brian...of Technology The RITE Approach to Agile Acquisition Timothy Boyce, Iva Sherman, and Nicholas Roussel Space and Naval Warfare Systems Center Pacific...Demonstration Office: Ad Hoc Problem Solving as a Mechanism for Adaptive Change Kathryn Aten and John T. Dillard Naval Postgraduate School A Comparative
Medeiros, Pâmella de; Capistrano, Renata; Zequinão, Marcela Almeida; Silva, Siomara Aparecida da; Beltrame, Thais Silva; Cardoso, Fernando Luiz
2017-01-01
To analyze the literature on the effectiveness of exergames in physical education classes and in the acquisition and development of motor skills and abilities. The analyses were carried out by two independent evaluators, limited to English and Portuguese, in four databases: Web of Science, Science Direct, Scopus and PubMed, without restrictions on year. The keywords used were: "Exergames and motor learning and motor skill" and "Exergames and motor skill and physical education". The inclusion criteria were: articles that evaluated the effectiveness of exergames in physical education classes regarding the acquisition and development of motor skills. The following were excluded: books, theses and dissertations; repetitions; articles published in proceedings and conference summaries; and studies with sick children and/or use of the tool for rehabilitation purposes. Ninety-six publications were found, and 8 studies were selected for the final review. The quality of the articles was evaluated using the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) scale and the Physiotherapy Evidence Database (PEDro) scale. Evidence was found of recurring positive effects of exergames on both the acquisition and the development of motor skills. Exergames, when used consciously - so that they do not completely replace sports and other recreational activities - are good strategies for parents and physical education teachers to motivate children and adolescents to practice physical exercise.
ERIC Educational Resources Information Center
Burdick, Barry
1999-01-01
Discusses the use of leasing programs as an equipment-acquisition tool to help schools conserve valuable capital. Discusses why leasing is a viable option and describes the different types of leasing plans. (GR)
ERIC Educational Resources Information Center
Weaver, Dave
Science interfacing packages (also known as microcomputer-based laboratories or probeware) generally consist of a set of programs on disks, a user's manual, and hardware which includes one or more sensory devices. Together with a microcomputer they combine to make a powerful data acquisition and analysis tool. Packages are available for accurately…
ERIC Educational Resources Information Center
Bell, Justine C.
2014-01-01
To test the claim that digital learning tools enhance the acquisition of visual literacy in this generation of biology students, a learning intervention was carried out with 33 students enrolled in an introductory college biology course. This study compared learning outcomes following two types of learning tools: a traditional drawing activity, or…
ERIC Educational Resources Information Center
Hsu, T. Ella; And Others
This study was designed to investigate the effects of the presence or absence of metacognitive skill tools available in hyperspace environments on field independent and field dependent learners. Learners were engaged in problem solving in an information-rich hyperspace based on a lesson on the attack on Pearl Harbor. Forty undergraduates were…
Design and simulation of EVA tools for first servicing mission of HST
NASA Technical Reports Server (NTRS)
Naik, Dipak; Dehoff, P. H.
1994-01-01
The Hubble Space Telescope (HST) was launched into near-earth orbit by the Space Shuttle Discovery on April 24, 1990. The payload of two cameras, two spectrographs, and a high-speed photometer is supplemented by three fine-guidance sensors that can be used for astronomy as well as for star tracking. A widely reported spherical aberration in the primary mirror caused HST to produce images of much lower quality than intended. A Space Shuttle repair mission in January 1994 installed small corrective mirrors that restored the full intended optical capability of the HST. The First Servicing Mission (FSM) involved considerable Extra Vehicular Activity (EVA). Special EVA tools for the FSM were designed and developed for this specific purpose. In an earlier report, the details of the Data Acquisition System developed to test the performance of the various EVA tools in ambient as well as simulated space environments were presented. The general schematic of the test setup is reproduced in this report for continuity. Although the data acquisition system was used extensively to test a number of fasteners, only the results of one test each carried out on the various fasteners and the Power Ratchet Tool are included in this report.
Knowledge-acquisition tools for medical knowledge-based systems.
Lanzola, G; Quaglini, S; Stefanelli, M
1995-03-01
Knowledge-based systems (KBS) have been proposed to solve a large variety of medical problems. A strategic issue for KBS development and maintenance is the effort required from both knowledge engineers and domain experts. The proposed solution is building efficient knowledge acquisition (KA) tools. This paper presents a set of KA tools we are developing within a European project called GAMES II. They have been designed after the formulation of an epistemological model of medical reasoning. The main goal is that of developing a computational framework which allows knowledge engineers and domain experts to interact cooperatively in developing a medical KBS. To this aim, a set of reusable software components is highly recommended. Their design was facilitated by the development of a methodology for KBS construction. It views this process as comprising two activities: the tailoring of the epistemological model to the specific medical task to be executed, and the subsequent translation of this model into a computational architecture so that the connections between computational structures and their knowledge-level counterparts are maintained. The KA tools we developed are illustrated using examples from the behavior of a KBS we are building for the management of children with acute myeloid leukemia.
ScanImage: flexible software for operating laser scanning microscopes.
Pologruto, Thomas A; Sabatini, Bernardo L; Svoboda, Karel
2003-05-17
Laser scanning microscopy is a powerful tool for analyzing the structure and function of biological specimens. Although numerous commercial laser scanning microscopes exist, some of the more interesting and challenging applications demand custom design. A major impediment to custom design is the difficulty of building custom data acquisition hardware and writing the complex software required to run the laser scanning microscope. We describe a simple, software-based approach to operating a laser scanning microscope without the need for custom data acquisition hardware. Data acquisition and control of laser scanning are achieved through standard data acquisition boards. The entire burden of signal integration and image processing is placed on the CPU of the computer. We quantitate the effectiveness of our data acquisition and signal conditioning algorithm under a variety of conditions. We implement our approach in an open source software package (ScanImage) and describe its functionality. We present ScanImage, software to run a flexible laser scanning microscope that allows easy custom design.
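Since the abstract places the entire burden of signal integration on the CPU, the core operation reduces to binning a continuously digitized detector stream into image pixels. The sketch below is an illustrative stand-in for that software-side integration step, not ScanImage code; the pixel count and samples-per-dwell are made-up parameters.

```python
import numpy as np

def bin_line(samples, n_pixels):
    """Integrate a digitized detector (e.g. PMT) stream for one scan line
    into pixels by averaging the samples within each pixel dwell.
    Assumes len(samples) is an integer multiple of n_pixels; all of the
    signal integration happens in software on the CPU."""
    return samples.reshape(n_pixels, -1).mean(axis=1)

# Example: 512 pixels with 8 ADC samples per pixel dwell
stream = np.random.poisson(5.0, 512 * 8).astype(float)
line = bin_line(stream, 512)
print(line.shape)   # (512,)
```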
Reiné, J; Zangari, T; Owugha, JT; Pennington, SH; Gritzfeld, JF; Wright, AD; Collins, AM; van Selm, S; de Jonge, MI; Gordon, SB; Weiser, JN; Ferreira, DM
2016-01-01
The ability of pneumococcal conjugate vaccine (PCV) to decrease transmission by blocking the acquisition of colonization has been attributed to herd immunity. We describe the role of mucosal IgG to capsular polysaccharide (CPS) in mediating protection from carriage, translating our findings from a murine model to humans. We used a flow-cytometric assay to quantify antibody-mediated agglutination, demonstrating that hyperimmune sera generated against an unencapsulated mutant were poorly agglutinating. Passive immunization with this antiserum was ineffective in blocking acquisition of colonization compared to agglutinating antisera raised against the encapsulated parent strain. In the human challenge model, samples were collected from PCV- and control-vaccinated adults. In PCV-vaccinated subjects, IgG levels to CPS were increased in serum and nasal wash (NW). IgG to the CPS of the inoculated strain dropped in NW samples after inoculation, suggesting its sequestration by colonizing pneumococci. In post-vaccination NW samples, pneumococci were heavily agglutinated compared to pre-vaccination samples in subjects protected against carriage. Our results indicate that pneumococcal agglutination mediated by CPS-specific antibodies is a key mechanism of protection against acquisition of carriage. Capsule may be the only vaccine target that can elicit strong agglutinating antibody responses, leading to protection against carriage acquisition and generation of herd immunity. PMID:27579859
Design of Multishell Sampling Schemes with Uniform Coverage in Diffusion MRI
Caruyer, Emmanuel; Lenglet, Christophe; Sapiro, Guillermo; Deriche, Rachid
2017-01-01
Purpose: In diffusion MRI, a technique known as diffusion spectrum imaging reconstructs the propagator with a discrete Fourier transform, from a Cartesian sampling of the diffusion signal. Alternatively, it is possible to directly reconstruct the orientation distribution function in q-ball imaging, providing so-called high angular resolution diffusion imaging. In between these two techniques, acquisitions on several spheres in q-space offer an interesting trade-off between the angular resolution and the radial information gathered in diffusion MRI. A careful design is central to the success of multishell acquisition and reconstruction techniques.
Methods: The design of multishell acquisition is, however, still an open and active field of research. In this work, we provide a general method to design multishell acquisitions with uniform angular coverage. This method is based on a generalization of electrostatic repulsion to the multishell case.
Results: We evaluate the impact of our method using simulations, on the angular resolution in one- and two-fiber-bundle configurations. Compared to the more commonly used radial sampling, we show that our method improves the angular resolution, as well as fiber crossing discrimination.
Discussion: We propose a novel method to design sampling schemes with optimal angular coverage and show its positive impact on angular resolution in diffusion MRI. PMID:23625329
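For intuition, a single-shell sketch of the electrostatic repulsion criterion follows. This is not the authors' algorithm: the paper's contribution is the generalization of this cost across shells (per-shell plus cross-shell repulsion terms), which this toy omits, and the sketch also ignores the antipodal symmetry of diffusion sampling. All step sizes and iteration counts are illustrative.

```python
import numpy as np

def repulsion_design(n_dirs, iters=2000, step=1e-2, seed=0):
    """Place n_dirs unit vectors on the sphere by minimizing a Coulomb-like
    repulsion energy sum(1/||x_i - x_j||) with projected gradient descent."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=(n_dirs, 3))
    x /= np.linalg.norm(x, axis=1, keepdims=True)
    for _ in range(iters):
        d = x[:, None, :] - x[None, :, :]               # pairwise differences
        r = np.linalg.norm(d, axis=-1)
        np.fill_diagonal(r, np.inf)                     # no self-interaction
        force = (d / r[..., None] ** 3).sum(axis=1)     # -gradient of sum 1/r
        x += step * force
        x /= np.linalg.norm(x, axis=1, keepdims=True)   # project back to sphere
    return x

dirs = repulsion_design(30)   # 30 well-spread gradient directions
```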
Nakamoto, Masahiko; Nakada, Kazuhisa; Sato, Yoshinobu; Konishi, Kozo; Hashizume, Makoto; Tamura, Shinichi
2008-02-01
This paper describes a three-dimensional ultrasound (3-D US) system that aims to achieve augmented reality (AR) visualization during laparoscopic surgery, especially for the liver. To acquire 3-D US data of the liver, the tip of a laparoscopic ultrasound probe is tracked inside the abdominal cavity using a magnetic tracker. The accuracy of magnetic trackers, however, is greatly affected by magnetic field distortion that results from the close proximity of metal objects and electronic equipment, which is usually unavoidable in the operating room. In this paper, we describe a calibration method for intraoperative magnetic distortion that can be applied to laparoscopic 3-D US data acquisition; we evaluate the accuracy and feasibility of the method by in vitro and in vivo experiments. Although calibration data can be acquired freehand using a magneto-optic hybrid tracker, there are two problems associated with this method--error caused by the time delay between measurements of the optical and magnetic trackers, and instability of the calibration accuracy that results from the uniformity and density of the calibration data. A temporal calibration procedure is developed to estimate the time delay, which is then integrated into the calibration, and a distortion model is formulated by zeroth-degree to fourth-degree polynomial fitting to the calibration data. In the in vivo experiment using a pig, the positional error caused by magnetic distortion was reduced from 44.1 to 2.9 mm. The standard deviation of corrected target positions was less than 1.0 mm. Freehand acquisition of calibration data was performed smoothly using a magneto-optic hybrid sampling tool through a trocar under guidance by real-time 3-D monitoring of the tool trajectory; data acquisition time was less than 2 min. The present study suggests that our proposed method could correct for magnetic field distortion inside the patient's abdomen during a laparoscopic procedure within a clinically permissible period of time, as well as enabling an accurate 3-D US reconstruction to be obtained that can be superimposed onto live endoscopic images.
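The distortion-correction step (zeroth- to fourth-degree polynomial fitting to magneto-optic calibration data) can be illustrated with ordinary least squares. The sketch below is a simplified stand-in: synthetic ground-truth points with a fake smooth distortion, and one polynomial model per output coordinate. The paper's temporal calibration of the tracker time delay is not modeled here.

```python
import numpy as np
from itertools import combinations_with_replacement

def poly_features(p, degree):
    """Monomial features x^a * y^b * z^c with a+b+c <= degree, points p (N,3)."""
    cols = [np.ones(len(p))]
    for d in range(1, degree + 1):
        for idx in combinations_with_replacement(range(3), d):
            cols.append(np.prod(p[:, list(idx)], axis=1))
    return np.column_stack(cols)

def fit_distortion(magnetic, optical, degree=4):
    """Least-squares polynomial map from distorted magnetic-tracker readings
    to optically tracked reference positions (one model per coordinate)."""
    A = poly_features(magnetic, degree)
    coef, *_ = np.linalg.lstsq(A, optical, rcond=None)
    return lambda q: poly_features(q, degree) @ coef

# Synthetic calibration: a smooth polynomial distortion of the readings
rng = np.random.default_rng(1)
magnetic = rng.uniform(-100, 100, size=(500, 3))            # readings (mm)
true = magnetic - (0.002 * magnetic**2 - 0.05 * magnetic)   # ground truth (mm)
correct = fit_distortion(magnetic, true)
print(np.abs(correct(magnetic) - true).max())               # near-zero residual
```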
Raman-in-SEM, a multimodal and multiscale analytical tool: performance for materials and expertise.
Wille, Guillaume; Bourrat, Xavier; Maubec, Nicolas; Lahfid, Abdeltif
2014-12-01
The availability of Raman spectroscopy in a powerful analytical scanning electron microscope (SEM) allows morphological, elemental, chemical, physical and electronic analysis without moving the sample between instruments. This paper documents the metrological performance of the SEMSCA commercial Raman interface operated in a low vacuum SEM. It provides multiscale and multimodal analyses such as Raman/EDS, Raman/cathodoluminescence or Raman/STEM (STEM: scanning transmission electron microscopy), as well as Raman spectroscopy on nanomaterials. Since Raman spectroscopy in a SEM can be influenced by several SEM-related phenomena, this paper first presents a comparison of this new tool with a conventional micro-Raman spectrometer. Then, some possible artefacts are documented, which are due to the impact of electron-beam-induced contamination or the cathodoluminescence contribution to the Raman spectra, especially with geological samples. These effects are easily overcome by changing or adapting the Raman spectrometer and the SEM settings and methodology. The adverse effect of cathodoluminescence is removed by using a SEM beam shutter during Raman acquisition. In contrast, this interface provides the ability to record the cathodoluminescence (CL) spectrum of a phase. In a second part, this study highlights the interest and efficiency of the coupling in characterizing micrometric phases at the same point. This multimodal approach is illustrated with various issues encountered in geosciences. Copyright © 2014 Elsevier Ltd. All rights reserved.
Saletti, Dominique
2017-01-01
Rapid progress in ultra-high-speed imaging has allowed material properties to be studied at high strain rates by applying full-field measurements and inverse identification methods. Nevertheless, the sensitivity of these techniques still requires a better understanding, since various extrinsic factors present during an actual experiment make it difficult to separate the different sources of error that can significantly affect the quality of the identified results. This study presents a methodology using simulated experiments to investigate the accuracy of the so-called spalling technique (used to study tensile properties of concrete subjected to high strain rates) by numerically simulating the entire identification process. The experimental technique uses the virtual fields method and the grid method. The methodology consists of reproducing the recording process of an ultra-high-speed camera by generating sequences of synthetically deformed images of a sample surface, which are then analysed using the standard tools. The investigation of the uncertainty of the identified parameters, such as Young's modulus along with the stress–strain constitutive response, is addressed by introducing the most significant user-dependent parameters (i.e. acquisition speed, camera dynamic range, grid sampling, blurring), showing that the technique can be an effective tool for error investigation. This article is part of the themed issue ‘Experimental testing and modelling of brittle materials at high strain rates’. PMID:27956505
Quantitative myocardial perfusion from static cardiac and dynamic arterial CT
NASA Astrophysics Data System (ADS)
Bindschadler, Michael; Branch, Kelley R.; Alessio, Adam M.
2018-05-01
Quantitative myocardial blood flow (MBF) estimation by dynamic contrast enhanced cardiac computed tomography (CT) requires multi-frame acquisition of contrast transit through the blood pool and myocardium to inform the arterial input and tissue response functions. Both the input and the tissue response functions for the entire myocardium are sampled with each acquisition. However, the long breath holds and frequent sampling can result in significant motion artifacts and relatively high radiation dose. To address these limitations, we propose and evaluate a new static cardiac and dynamic arterial (SCDA) quantitative MBF approach where (1) the input function is well sampled using either prediction from pre-scan timing bolus data or measurement from dynamic thin-slice ‘bolus tracking’ acquisitions, and (2) the whole-heart tissue response data is limited to one contrast enhanced CT acquisition. A perfusion model uses the dynamic arterial input function to generate a family of possible myocardial contrast enhancement curves corresponding to a range of MBF values. Combined with the timing of the single whole-heart acquisition, these curves generate a lookup table relating myocardial contrast enhancement to quantitative MBF. We tested the SCDA approach in 28 patients who underwent a full dynamic CT protocol under both rest and vasodilator stress conditions. Using the measured input function plus a single (enhanced CT only) or a double (enhanced plus contrast-free baseline CT) myocardial acquisition yielded MBF estimates with root mean square (RMS) errors of 1.2 ml/min/g and 0.35 ml/min/g, and radiation dose reductions of 90% and 83%, respectively. The prediction of the input function based on timing bolus data and the static acquisition had an RMS error of 26.0% relative to the measured input function, which led to MBF estimation errors more than threefold higher than when using the measured input function. SCDA presents a new, simplified approach for quantitative perfusion imaging with an acquisition strategy offering substantial radiation dose and computational complexity savings over dynamic CT.
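The lookup-table idea (a family of myocardial enhancement curves, one per candidate MBF value, evaluated at the time of the single whole-heart acquisition) can be sketched with a deliberately simplified one-compartment model. All parameters below (input bolus shape, washout rate, timings, the measured enhancement) are illustrative assumptions, not the paper's perfusion model.

```python
import numpy as np

def tissue_curve(aif, dt, mbf, k_out=0.005):
    """One-compartment tissue response dC/dt = (mbf/60)*aif - k_out*C,
    integrated with forward Euler; mbf in ml/min/g, aif in HU, dt in s."""
    flow = mbf / 60.0                       # convert to per-second units
    c = np.zeros_like(aif)
    for i in range(1, len(aif)):
        c[i] = c[i - 1] + dt * (flow * aif[i - 1] - k_out * c[i - 1])
    return c

dt = 0.5                                    # s
t = np.arange(0, 60, dt)
aif = 400 * np.exp(-((t - 15) / 5.0) ** 2)  # illustrative arterial input (HU)

mbf_grid = np.linspace(0.2, 5.0, 200)       # candidate MBF values (ml/min/g)
i_acq = int(30.0 / dt)                      # time of the single whole-heart scan
lut = np.array([tissue_curve(aif, dt, m)[i_acq] for m in mbf_grid])

measured = 95.0                             # measured myocardial enhancement (HU)
mbf_est = np.interp(measured, lut, mbf_grid)
print(f"estimated MBF: {mbf_est:.2f} ml/min/g")
```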
Low rank magnetic resonance fingerprinting.
Mazor, Gal; Weizman, Lior; Tal, Assaf; Eldar, Yonina C
2016-08-01
Magnetic Resonance Fingerprinting (MRF) is a relatively new approach that provides quantitative MRI using randomized acquisition. Extraction of physical quantitative tissue values is performed off-line, based on acquisition with varying parameters and a dictionary generated according to the Bloch equations. MRF uses hundreds of radio frequency (RF) excitation pulses for acquisition, and therefore a high under-sampling ratio in the sampling domain (k-space) is required. This under-sampling causes spatial artifacts that hamper the ability to accurately estimate the quantitative tissue values. In this work, we introduce a new approach for quantitative MRI using MRF, called Low Rank MRF. We exploit the low rank property of the temporal domain, on top of the well-known sparsity of the MRF signal in the generated dictionary domain. We present an iterative scheme that consists of a gradient step followed by a low rank projection using the singular value decomposition. Experiments on real MRI data demonstrate superior results compared to a conventional implementation of compressed sensing for MRF at a 15% sampling ratio.
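The iteration described, a gradient step on data consistency followed by a rank-r projection via the singular value decomposition, has a compact form. The following sketch uses entrywise random masking as a stand-in for the actual MRF k-space under-sampling operator and omits the dictionary-matching stage, so it illustrates only the low-rank reconstruction skeleton.

```python
import numpy as np

def low_rank_mrf(y, A, At, shape, rank, n_iter=200, mu=1.0):
    """Iterate: data-consistency gradient step, then projection onto the
    set of rank-`rank` matrices by truncated SVD. X has one row per voxel
    and one column per time point (RF excitation)."""
    X = np.zeros(shape)
    for _ in range(n_iter):
        X = X - mu * At(A(X) - y)                 # gradient step on ||A(X)-y||^2
        U, s, Vh = np.linalg.svd(X, full_matrices=False)
        s[rank:] = 0                              # keep the largest singular values
        X = (U * s) @ Vh                          # low-rank projection
    return X

# Toy example: random masking stands in for k-space under-sampling
rng = np.random.default_rng(0)
Xtrue = rng.normal(size=(64, 8)) @ rng.normal(size=(8, 100))  # rank-8 "image series"
mask = rng.random(Xtrue.shape) < 0.3                           # 30% sampling ratio
A = lambda X: X * mask
At = lambda R: R * mask
Xhat = low_rank_mrf(A(Xtrue), A, At, Xtrue.shape, rank=8)
print(np.linalg.norm(Xhat - Xtrue) / np.linalg.norm(Xtrue))   # should be small
```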
Fu, Riqiang; Hernández-Maldonado, Arturo J
2018-05-24
Direct polarization with a small flip-angle pulse is the simplest method commonly used to quantify various compositions in many materials applications. This method sacrifices the sensitivity per scan in exchange for rapid repetition of data acquisition for signal accumulation. In addition, the resulting spectrum often encounters artifacts from background signals from probe components and/or from acoustic ringing, leading to a distorted baseline, especially in low-γ nuclei and wideline NMR. In this work, a multi-acquisition scheme is proposed to boost the sensitivity per scan and at the same time effectively suppress these artifacts. Here, an adiabatic inversion pulse is first applied in order to bring the magnetization from the +z to the -z axis, and then a small flip-angle pulse excitation is used before the data acquisition. Right after the first acquisition, the adiabatic inversion pulse is applied again to flip the magnetization back to the +z axis. The second data acquisition takes place after another small flip-angle pulse excitation. The difference between the two consecutive acquisitions cancels out any artifacts, while the wanted signals are accumulated. This acquisition process can be repeated many times before going into the next scan. Therefore, by acquiring the signals multiple times in a single scan, the sensitivity is improved. A mixture sample of flufenamic acid and 3,5-difluorobenzoic acid and a titanium silicate sample have been used to demonstrate the advantages of this newly proposed method. Copyright © 2018 Elsevier Inc. All rights reserved.
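A toy simulation makes the cancellation mechanism concrete: the wanted signal changes sign between the two acquisitions of each pair (magnetization at -z, then flipped back to +z), while probe background and acoustic ringing do not, so the pairwise difference suppresses the artifact while accumulating signal. Everything below (decay constants, frequencies, noise level, sign convention) is illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 512
t = np.arange(n)
signal = np.exp(-t / 120.0) * np.cos(2 * np.pi * 0.05 * t)  # wanted FID
ringing = 0.5 * np.exp(-t / 40.0)                           # probe background /
                                                            # acoustic ringing (same
                                                            # sign every acquisition)
acc = np.zeros(n)
for _ in range(8):                       # 8 acquisition pairs within one scan
    fid_a = +signal + ringing + 0.05 * rng.normal(size=n)   # magnetization at -z
    fid_b = -signal + ringing + 0.05 * rng.normal(size=n)   # flipped back to +z
    acc += fid_a - fid_b                 # ringing cancels, signal accumulates

# acc now holds ~16*signal plus averaged noise; the acquisition-locked
# artifact is suppressed by the pairwise subtraction.
```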
A digital acquisition and elaboration system for nuclear fast pulse detection
NASA Astrophysics Data System (ADS)
Esposito, B.; Riva, M.; Marocco, D.; Kaschuck, Y.
2007-03-01
A new digital acquisition and elaboration system has been developed and assembled at ENEA-Frascati for the direct sampling of fast pulses from nuclear detectors such as scintillators and diamond detectors. The system is capable of performing the digital sampling of the pulses (200 MSamples/s, 14-bit) and the simultaneous (compressed) data transfer for further storage and software elaboration. The FPGA-based design is oriented to real-time applications and has been developed to allow acquisition with no loss of pulses and data storage over long time intervals (tens of seconds at MHz pulse count rates) without the need for large on-board memory. A dedicated pulse analysis software package, written in LabVIEW™, performs the treatment of the acquired pulses, including pulse recognition, pile-up rejection, baseline removal, pulse-shape particle separation and pulse height spectrum analysis. The acquisition and pre-elaboration programs have been fully integrated with the analysis software.
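The offline pulse-treatment steps named here (baseline removal, pulse recognition, and pile-up rejection) can be prototyped compactly. The sketch below is not the ENEA LabVIEW code; it uses a median baseline, threshold crossing, and a minimum-separation veto as simple stand-ins for those stages, with made-up pulse shapes.

```python
import numpy as np

def process_trace(trace, threshold, min_sep):
    """Baseline removal, pulse recognition and a simple pile-up veto.
    Pulses are above-threshold runs after median-baseline subtraction;
    any pair of pulses closer than min_sep samples is rejected as pile-up."""
    x = trace - np.median(trace)                  # baseline removal
    above = np.concatenate(([False], x > threshold, [False]))
    edges = np.flatnonzero(np.diff(above.astype(int)))
    starts, ends = edges[::2], edges[1::2]
    peaks = np.array([s + np.argmax(x[s:e]) for s, e in zip(starts, ends)],
                     dtype=int)
    keep = np.ones(len(peaks), dtype=bool)
    close = np.diff(peaks) < min_sep              # pile-up: too-close neighbours
    keep[:-1] &= ~close
    keep[1:] &= ~close
    return peaks[keep], x[peaks[keep]]            # positions and pulse heights

# Synthetic trace: baseline of 10, three pulses, the last two piled up
trace = np.full(2000, 10.0)
for pos in (300, 1200, 1230):
    trace[pos:pos + 20] += 50.0 * np.exp(-np.arange(20) / 5.0)
peaks, heights = process_trace(trace, threshold=5.0, min_sep=100)
print(peaks)   # only the isolated pulse at 300 survives the pile-up veto
```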
Prakash, Amol; Peterman, Scott; Ahmad, Shadab; Sarracino, David; Frewen, Barbara; Vogelsang, Maryann; Byram, Gregory; Krastins, Bryan; Vadali, Gouri; Lopez, Mary
2014-12-05
Data-dependent acquisition (DDA) and data-independent acquisition (DIA) strategies have both resulted in improved understanding of proteomics samples. Both strategies have advantages and disadvantages that are well documented in the literature: DDA is typically applied for deep discovery, while DIA may be used to create sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows the objective assessment of DIA performance with respect to interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.
myBrain: a novel EEG embedded system for epilepsy monitoring.
Pinho, Francisco; Cerqueira, João; Correia, José; Sousa, Nuno; Dias, Nuno
2017-10-01
The World Health Organisation has pointed out that successful health care delivery requires effective medical devices as tools for prevention, diagnosis, treatment and rehabilitation. Several studies have concluded that longer monitoring periods and outpatient settings might increase diagnostic accuracy and the success rate of treatment selection. The long-term monitoring of epileptic patients through electroencephalography (EEG) has been considered a powerful tool to improve the diagnosis, disease classification, and treatment of patients with such a condition. This work presents the development of a wireless and wearable EEG acquisition platform suitable for both long-term and short-term monitoring in inpatient and outpatient settings. The developed platform features 32 passive dry electrodes, analogue-to-digital signal conversion with 24-bit resolution and a variable sampling frequency from 250 Hz to 1000 Hz per channel, embedded in a stand-alone module. A computer-on-module embedded system runs a Linux® operating system that rules the interface between two software frameworks, which interact to satisfy the real-time constraints of signal acquisition as well as parallel recording, processing and wireless data transmission. A textile structure was developed to accommodate all components. Platform performance was evaluated in terms of hardware, software and signal quality. The electrodes were characterised through electrochemical impedance spectroscopy, and the operating system performance was evaluated while running an epileptic discrimination algorithm. Signal quality was thoroughly assessed in two different approaches: playback of EEG reference signals and benchmarking with a clinical-grade EEG system in alpha-wave replacement and steady-state visual evoked potential paradigms. The proposed platform seems to efficiently monitor epileptic patients in both inpatient and outpatient settings and paves the way to new ambulatory clinical regimens as well as non-clinical EEG applications.
Vale, Gillian L.; Davis, Sarah J.; Lambeth, Susan P.; Schapiro, Steven J.; Whiten, Andrew
2017-01-01
Cumulative culture underpins humanity's enormous success as a species. Claims that other animals are incapable of cultural ratcheting are prevalent, but are founded on just a handful of empirical studies. Whether cumulative culture is unique to humans thus remains a controversial and understudied question that has far-reaching implications for our understanding of the evolution of this phenomenon. We investigated whether chimpanzees, one of humans' two closest living primate relatives, are capable of a degree of cultural ratcheting by exposing captive populations to a novel juice extraction task. We found that groups (N = 3) seeded with a model trained to perform a tool modification that built upon simpler, unmodified tool use developed the seeded tool method that allowed greater juice returns than achieved by groups not exposed to a trained model (non-seeded controls; N = 3). One non-seeded group also discovered the behavioral sequence, either by coupling asocial and social learning or by repeated invention. This behavioral sequence was found to be beyond what an additional control sample of chimpanzees (N = 1 group) could discover for themselves without a competent model and lacking experience with simpler, unmodified tool behaviors. Five chimpanzees tested individually with no social information, but with experience of simple unmodified tool use, invented part, but not all, of the behavioral sequence. Our findings indicate that (i) social learning facilitated the propagation of the model-demonstrated tool modification technique, (ii) experience with simple tool behaviors may facilitate individual discovery of more complex tool manipulations, and (iii) a subset of individuals were capable of learning relatively complex behaviors either by learning asocially and socially or by repeated invention over time. That chimpanzees learn increasingly complex behaviors through social and asocial learning suggests that humans' extraordinary ability to do so was built on such prior foundations. PMID:29333058
NASA Technical Reports Server (NTRS)
Cramer, Christopher J.; Wright, James D.; Simmons, Scott A.; Bobbitt, Lynn E.; DeMoss, Joshua A.
2015-01-01
The paper will present a brief background of the previous data acquisition system at the National Transonic Facility (NTF) and the reasoning and goals behind the upgrade to the current Test SLATE (Test Software Laboratory and Automated Testing Environments) data acquisition system. The components, performance characteristics, and layout of the Test SLATE system within the NTF control room will be discussed. The development, testing, and integration of Test SLATE within NTF operations will be detailed. The operational capabilities of the system will be outlined including: test setup, instrumentation calibration, automatic test sequencer setup, data recording, communication between data and facility control systems, real time display monitoring, and data reduction. The current operational status of the Test SLATE system and its performance during recent NTF testing will be highlighted including high-speed, frame-by-frame data acquisition with conditional sampling post-processing applied. The paper concludes with current development work on the system including the capability for real-time conditional sampling during data acquisition and further efficiency enhancements to the wind tunnel testing process.
High frequency signal acquisition and control system based on DSP+FPGA
NASA Astrophysics Data System (ADS)
Liu, Xiao-qi; Zhang, Da-zhi; Yin, Ya-dong
2017-10-01
This paper introduces the design and implementation of a high frequency signal acquisition and control system based on DSP + FPGA. The system supports internal/external clock and internal/external trigger sampling. It has a maximum sampling rate of 400 MSPS and a 1.4 GHz input bandwidth for the ADC. Data can be collected continuously or periodically and are stored in DDR2 memory. The system also supports real-time acquisition: the collected data, after digital frequency conversion and cascaded integrator-comb (CIC) filtering, are sent to the CPCI bus through the high-speed DSP and can be assigned to the fiber board for subsequent processing. The system integrates signal acquisition and pre-processing functions, uses mixed high-speed A/D, high-speed DSP and FPGA technology, and has a wide range of uses in data acquisition and recording. In signal processing applications, the system can be seamlessly connected to a dedicated processor board. The system offers multiple operating modes and good scalability, satisfying the different requirements of different signals in different projects.
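A software model of the cascaded integrator-comb (CIC) stage mentioned above is easy to write down: N integrators at the input rate, decimation by R, then N comb (difference) stages at the output rate. The sketch below assumes N = 3 stages and unit differential delay by default; an FPGA implementation would use fixed-width wrap-around arithmetic rather than NumPy integers, and the rates here are illustrative.

```python
import numpy as np

def cic_decimate(x, R, N=3, M=1):
    """N-stage CIC decimator with rate change R and differential delay M:
    N integrators at the input rate, decimate by R, then N comb stages
    at the output rate. Overall DC gain is (R*M)**N; normalize downstream."""
    y = np.asarray(x, dtype=np.int64)
    for _ in range(N):                 # integrator stages
        y = np.cumsum(y)
    y = y[R - 1::R]                    # decimation by R
    for _ in range(N):                 # comb (difference) stages
        d = np.zeros_like(y)
        d[M:] = y[:-M]
        y = y - d
    return y

# Example: decimate an integer sample stream by 16
x = (1000 * np.sin(2 * np.pi * 0.001 * np.arange(4096))).astype(np.int64)
print(cic_decimate(x, R=16).shape)     # (256,)
```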
Optimized Design and Analysis of Sparse-Sampling fMRI Experiments
Perrachione, Tyler K.; Ghosh, Satrajit S.
2013-01-01
Sparse-sampling is an important methodological advance in functional magnetic resonance imaging (fMRI), in which silent delays are introduced between MR volume acquisitions, allowing for the presentation of auditory stimuli without contamination by acoustic scanner noise and for overt vocal responses without motion-induced artifacts in the functional time series. As such, the sparse-sampling technique has become a mainstay of principled fMRI research into the cognitive and systems neuroscience of speech, language, hearing, and music. Despite being in use for over a decade, there has been little systematic investigation of the acquisition parameters, experimental design considerations, and statistical analysis approaches that bear on the results and interpretation of sparse-sampling fMRI experiments. In this report, we examined how design and analysis choices related to the duration of repetition time (TR) delay (an acquisition parameter), stimulation rate (an experimental design parameter), and model basis function (an analysis parameter) act independently and interactively to affect the neural activation profiles observed in fMRI. First, we conducted a series of computational simulations to explore the parameter space of sparse design and analysis with respect to these variables; second, we validated the results of these simulations in a series of sparse-sampling fMRI experiments. Overall, these experiments suggest the employment of three methodological approaches that can, in many situations, substantially improve the detection of neurophysiological response in sparse fMRI: (1) Sparse analyses should utilize a physiologically informed model that incorporates hemodynamic response convolution to reduce model error. (2) The design of sparse fMRI experiments should maintain a high rate of stimulus presentation to maximize effect size. (3) TR delays of short to intermediate length can be used between acquisitions of sparse-sampled functional image volumes to increase the number of samples and improve statistical power. PMID:23616742
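Recommendation (1) above amounts to building the design regressor by convolving the stimulus train with a hemodynamic response function and then sampling the convolved time course at the sparse acquisition times. A minimal sketch follows; the double-gamma HRF parameters and all timing values are common illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.stats import gamma

def double_gamma_hrf(t):
    """Double-gamma hemodynamic response (SPM-like parameters, illustrative)."""
    return gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0

dt = 0.1
hrf = double_gamma_hrf(np.arange(0, 32, dt))

total = 300.0                                            # run length (s)
stim = np.zeros(int(total / dt))
stim[(np.arange(0, total, 4.0) / dt).astype(int)] = 1.0  # stimulus every 4 s

bold = np.convolve(stim, hrf)[:len(stim)] * dt           # modeled BOLD time course

TR = 10.0                                                # acquisition + silent delay (s)
sample_idx = (np.arange(0, total, TR) / dt).astype(int)
regressor = bold[sample_idx]                             # HRF-convolved sparse regressor
print(regressor.shape)                                   # one value per sparse volume
```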
Due Diligence Processes for Public Acquisition of Mining-Impacted Landscapes
NASA Astrophysics Data System (ADS)
Martin, E.; Monohan, C.; Keeble-Toll, A. K.
2016-12-01
The acquisition of public land is critical for achieving conservation and habitat goals in rural regions projected to experience continuously high rates of population growth. To ensure that public funds are utilized responsibly in the purchase of conservation easements, appropriate due diligence processes must be established that limit landowner liability post-acquisition. Traditional methods of characterizing contamination in regions where legacy mining activities were prevalent may not utilize current scientific knowledge and understanding of contaminant fate, transport and bioavailability, and are therefore prone to Type II error. Agency-prescribed assessment methods utilized under CERCLA in many cases fail to detect contamination that presents liability issues, because they do not require the water quality sampling that would reveal the offsite transport potential of contaminants posing human health risks, including mercury. Historical analysis can be used to inform judgmental sampling to identify hotspots and contaminants of concern. Land acquisition projects at two historic mine sites in Nevada County, California, the Champion Mine Complex and the Black Swan Preserve, have established the necessity of re-thinking due diligence processes for mining-impacted landscapes. These pilot projects demonstrate that pre-acquisition assessment in the Gold Country must include judgmental sampling and evaluation of contaminant transport. Best practices using current scientific knowledge must be codified by agencies, consultants, and NGOs in order to ensure responsible use of public funds and to safeguard public health.
Ovejero, M C; Pérez Vega-Leal, A; Gallardo, M I; Espino, J M; Selva, A; Cortés-Giraldo, M A; Arráns, R
2017-02-01
The aim of this work is to present a new data acquisition, control, and analysis software system written in LabVIEW. This system has been designed to obtain the dosimetry of a silicon strip detector in polyethylene. It allows the full automation of the experiments and data analysis required for the dosimetric characterization of silicon detectors. It becomes a useful tool that can be applied in the daily routine check of a beam accelerator.
NASA Technical Reports Server (NTRS)
Cady, E. C.
1977-01-01
A design analysis, based on experimental data, is developed to predict the effects of transient flow and pressure surges (caused either by valve or pump operation, or by boiling of liquids in warm lines) on the retention performance of screen acquisition systems. A survey of screen liquid acquisition system applications was performed to determine appropriate system environments and classification. A screen model was developed which assumed that the screen device was a uniformly distributed composite orthotropic structure, and which accounted for liquid inflow/outflow, gas ingestion quality, screen stress, and liquid spill. A series of 177 tests using 13 specimens (5 screen meshes, 4 screen device construction/backup methods, and 2 orientations) with three test fluids (isopropyl alcohol, Freon 114, and LH2) provided data which verified important features of the screen model and resulted in a design tool which can accurately predict the transient startup performance of acquisition devices.
ERIC Educational Resources Information Center
Nikonova, Elina I.; Sharonov, Ivan A.; Sorokoumova, Svetlana N.; Suvorova, Olga V.; Sorokoumova, Elena A.
2016-01-01
The relevance of the study is conditioned by the changes in the content of socio-humanitarian education, aimed at the acquisition of knowledge, the development of tolerance, civic and moral education. The purpose of the paper is to identify the modern functions of a textbook on social sciences and humanities as an informational management tool of…
ERIC Educational Resources Information Center
Adhitama, Egy; Fauzi, Ahmad
2018-01-01
In this study, a pendulum experimental tool with a light-based timer has been developed to measure the period of a simple pendulum. The obtained data was automatically recorded in an Excel spreadsheet. The intensity of monochromatic light, sensed by a 3DU5C phototransistor, dynamically changes as the pendulum swings. The changed intensity varies…
Molecular brain imaging in the multimodality era
Price, Julie C
2012-01-01
Multimodality molecular brain imaging encompasses in vivo visualization, evaluation, and measurement of cellular/molecular processes. Instrumentation and software developments over the past 30 years have fueled advancements in multimodality imaging platforms that enable acquisition of multiple complementary imaging outcomes by either combined sequential or simultaneous acquisition. This article provides a general overview of multimodality neuroimaging in the context of positron emission tomography as a molecular imaging tool and magnetic resonance imaging as a structural and functional imaging tool. Several image examples are provided and general challenges are discussed to exemplify complementary features of the modalities, as well as important strengths and weaknesses of combined assessments. Alzheimer's disease is highlighted, as this clinical area has been strongly impacted by multimodality neuroimaging findings that have improved understanding of the natural history of disease progression, early disease detection, and informed therapy evaluation. PMID:22434068
Computer-assisted knowledge acquisition for hypermedia systems
NASA Technical Reports Server (NTRS)
Steuck, Kurt
1990-01-01
The usage of procedural and declarative knowledge to set up the structure or 'web' of a hypermedia environment is described. An automated knowledge acquisition tool was developed that helps a knowledge engineer elicit and represent an expert's knowledge involved in performing procedural tasks. The tool represents both procedural and prerequisite, declarative knowledge that supports each activity performed by the expert. This knowledge is output and subsequently read by a hypertext scripting language to generate the link between blank, but labeled cards. Each step of the expert's activity and each piece of supporting declarative knowledge is set up as an empty node. An instructional developer can then enter detailed instructional material concerning each step and declarative knowledge into these empty nodes. Other research is also described that facilitates the translation of knowledge from one form into a form more readily useable by computerized systems.
X-ray system simulation software tools for radiology and radiography education.
Kengyelics, Stephen M; Treadgold, Laura A; Davies, Andrew G
2018-02-01
To develop x-ray simulation software tools to support delivery of radiological science education for a range of learning environments and audiences including individual study, lectures, and tutorials. Two software tools were developed; one simulated x-ray production for a simple two dimensional radiographic system geometry comprising an x-ray source, beam filter, test object and detector. The other simulated the acquisition and display of two dimensional radiographic images of complex three dimensional objects using a ray casting algorithm through three dimensional mesh objects. Both tools were intended to be simple to use, produce results accurate enough to be useful for educational purposes, and have an acceptable simulation time on modest computer hardware. The radiographic factors and acquisition geometry could be altered in both tools via their graphical user interfaces. A comparison of radiographic contrast measurements of the simulators to a real system was performed. The contrast output of the simulators had excellent agreement with measured results. The software simulators were deployed to 120 computers on campus. The software tools developed are easy-to-use, clearly demonstrate important x-ray physics and imaging principles, are accessible within a standard University setting and could be used to enhance the teaching of x-ray physics to undergraduate students. Current approaches to teaching x-ray physics in radiological science lack immediacy when linking theory with practice. This method of delivery allows students to engage with the subject in an experiential learning environment. Copyright © 2017. Published by Elsevier Ltd.
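At the level of the simple two-dimensional source/filter/object/detector geometry described, the simulated detector signal follows Beer-Lambert attenuation through a stack of materials, which is also what drives the radiographic contrast the authors validated against a real system. The sketch below uses made-up monoenergetic attenuation coefficients and thicknesses purely for illustration; it is not the authors' simulator.

```python
import numpy as np

def transmitted(i0, mu_t_pairs):
    """Beer-Lambert transmission through a stack of (mu [1/cm], t [cm])
    layers: I = I0 * exp(-sum(mu_i * t_i)). Monoenergetic and
    scatter-free, which is the level of approximation useful for teaching."""
    return i0 * np.exp(-sum(mu * t for mu, t in mu_t_pairs))

i0 = 1.0
mu_al, mu_water, mu_bone = 1.5, 0.2, 0.5        # illustrative values
background = transmitted(i0, [(mu_al, 0.25), (mu_water, 10.0)])
through_bone = transmitted(i0, [(mu_al, 0.25), (mu_water, 9.0), (mu_bone, 1.0)])

contrast = (background - through_bone) / background
print(f"radiographic contrast: {contrast:.3f}")
```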
NASA Astrophysics Data System (ADS)
Seaquist, J. W.; Li Johansson, Emma; Nicholas, Kimberly A.
2014-11-01
Global land acquisitions, often dubbed 'land grabbing', are increasingly becoming drivers of land change. We use the tools of network science to describe the connectivity of the global acquisition system. We find that 126 countries participate in this form of global land trade. Importers are concentrated in the Global North, the emerging economies of Asia, and the Middle East, while exporters are confined to the Global South and Eastern Europe. A small handful of countries account for the majority of land acquisitions (particularly China, the UK, and the US), the cumulative distribution of which is best described by a power law. We also find that countries with many land trading partners play a disproportionately central role in providing connectivity across the network, with the shortest trading path between any two countries traversing either China, the US, or the UK over a third of the time. The land acquisition network contains very few trading cliques and is therefore characterized by a low degree of preferential trading or regionalization. We also show that countries with many export partners trade land with countries with few import partners, and vice versa, meaning that less developed countries have a large array of export partnerships with developed countries, but very few import partnerships (a disassortative relationship). Finally, we find that the structure of the network is potentially prone to propagating crises (e.g., if importing countries become dependent on crops exported from their land trading partners). This network analysis approach can be used to quantitatively analyze and understand telecoupled systems as well as to anticipate and diagnose the potential effects of telecoupling.
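The measures reported here (import/export degree, the brokerage role of central countries, disassortative degree mixing) map directly onto standard network-science routines. A sketch with networkx follows; the edge list is entirely hypothetical and only illustrates the computations, not the study's data.

```python
import networkx as nx

# Hypothetical directed trade edges: exporter -> importer
edges = [("Ethiopia", "China"), ("Ethiopia", "UK"), ("Cambodia", "China"),
         ("Ukraine", "US"), ("Sudan", "UK"), ("Laos", "China"),
         ("Madagascar", "US"), ("Mozambique", "UK")]
G = nx.DiGraph(edges)

in_deg = dict(G.in_degree())                     # import partnerships per country
betweenness = nx.betweenness_centrality(G)       # brokerage/connectivity role
assort = nx.degree_assortativity_coefficient(G)  # negative => disassortative mixing

print(sorted(in_deg.items(), key=lambda kv: -kv[1])[:3])  # top importers
print(f"degree assortativity: {assort:.2f}")
```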
Experiences of building a medical data acquisition system based on two-level modeling.
Li, Bei; Li, Jianbin; Lan, Xiaoyun; An, Ying; Gao, Wuqiang; Jiang, Yuqiao
2018-04-01
Compared to traditional software development strategies, the two-level modeling approach is more flexible and applicable for building an information system in the medical domain. However, two-level modeling standards such as openEHR appear complex to medical professionals. This study aims to investigate, implement, and improve the two-level modeling approach, and discusses the experience of building a unified data acquisition system for four affiliated university hospitals based on this approach. After the investigation, we simplified the approach of archetype modeling and developed a medical data acquisition system where medical experts can define the metadata for their own specialties by using a visual, easy-to-use tool. The medical data acquisition system for multiple centers, clinical specialties, and diseases has been developed, and integrates the functions of metadata modeling, form design, and data acquisition. To date, 93,353 data items and 6,017 categories for 285 specific diseases have been created by medical experts, and over 25,000 patients' information has been collected. OpenEHR is an advanced two-level modeling method for medical data, but its idea of separating domain knowledge from technical concerns is not easy to realize. Moreover, it is difficult to reach agreement on archetype definitions. Therefore, we adopted simpler metadata modeling and employed What-You-See-Is-What-You-Get (WYSIWYG) tools to further improve the usability of the system. Compared with archetype definition, our approach lowers the difficulty. Nevertheless, to build such a system, every participant should have some knowledge of both the medicine and information technology domains, as such interdisciplinary talents are necessary. Copyright © 2018 Elsevier B.V. All rights reserved.
Three-dimensional reciprocal space x-ray coherent scattering tomography of two-dimensional object.
Zhu, Zheyuan; Pang, Shuo
2018-04-01
X-ray coherent scattering tomography is a powerful tool for discriminating biological tissues and bio-compatible materials. Conventional x-ray scattering tomography frameworks can only resolve an isotropic scattering profile, under the assumption that the material is amorphous or in powder form; this assumption fails for many biological samples, whose structure is orientation-dependent. Previous tomography schemes based on x-ray coherent scattering either failed to preserve the scattering pattern from samples with preferred orientations or required an elaborate data acquisition scheme, which could limit their application in practical settings. Here, we demonstrate a simple imaging modality to preserve the anisotropic scattering signal in the three-dimensional reciprocal (momentum transfer) space of a two-dimensional sample layer. By incorporating detector movement along the direction of the x-ray beam, combined with a tomographic data acquisition scheme, we match the five dimensions of the measurements with the five dimensions (three in the momentum transfer domain and two in the spatial domain) of the object. We employed a collimated pencil beam from a table-top copper-anode x-ray tube, along with a panel detector, to investigate the feasibility of our method. We have demonstrated x-ray coherent scattering tomographic imaging at a spatial resolution of ~2 mm and a momentum transfer resolution of 0.01 Å⁻¹ for the rotation-invariant scattering direction. For any arbitrary, non-rotation-invariant direction, the same spatial and momentum transfer resolution can be achieved based on the spatial information from the rotation-invariant direction. The reconstructed scattering profile of each pixel from the experiment is consistent with the x-ray diffraction profile of each material. The three-dimensional scattering pattern recovered from the measurement reveals the partially ordered molecular structure of the Teflon wrap in our sample. We extend the applicability of conventional x-ray coherent scattering tomography to the reconstruction of two-dimensional samples with anisotropic scattering profiles by introducing an additional degree of freedom on the detector. The presented method has the potential to achieve low-cost, high-specificity material discrimination based on x-ray coherent scattering. © 2018 American Association of Physicists in Medicine.
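Moving the detector along the beam changes the scattering angle, and hence the momentum transfer, seen by a fixed detector pixel; that is the extra degree of freedom the method exploits. A minimal sketch of the pixel-to-q mapping follows, assuming a Cu K-alpha wavelength and simple flat-detector geometry (the distances are made-up values for illustration).

```python
import numpy as np

def momentum_transfer(r_pixel, z_det, wavelength):
    """q = (4*pi/lambda) * sin(theta/2) for a pixel at radial distance
    r_pixel (mm) from the beam axis and detector distance z_det (mm);
    wavelength in angstrom, q returned in inverse angstrom."""
    theta = np.arctan2(r_pixel, z_det)          # scattering angle
    return (4 * np.pi / wavelength) * np.sin(theta / 2)

lam = 1.5406                                     # angstrom, Cu K-alpha
for z in (100.0, 200.0, 400.0):                  # detector positions along the beam
    print(z, momentum_transfer(20.0, z, lam))    # same pixel samples different q
```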
NASA Astrophysics Data System (ADS)
Keleshis, C.; Ioannou, S.; Vrekoussis, M.; Levin, Z.; Lange, M. A.
2014-08-01
Continuous advances in unmanned aerial vehicles (UAV) and the increased complexity of their applications raise the demand for improved data acquisition systems (DAQ). These improvements may comprise low power consumption, low volume and weight, robustness, modularity and the capability to interface with various sensors and peripherals while maintaining high sampling rates and processing speeds. Such a system has been designed and developed and is currently integrated on the Autonomous Flying Platforms for Atmospheric and Earth Surface Observations (APAESO/NEA-YΠOΔOMH/NEKΠ/0308/09); however, it can be easily adapted to any UAV or any other mobile vehicle. The system consists of a single-board computer with a dual-core processor, rugged surface-mount memory and storage devices, analog and digital input-output ports and many other peripherals that enhance its connectivity with various sensors, imagers and on-board devices. The system is powered by a high-efficiency power supply board. Additional boards such as frame-grabbers, differential global positioning system (DGPS) satellite receivers, and general packet radio service (3G-4G-GPRS) modems for communication redundancy have been interfaced to the core system and are used whenever there is a mission need. The onboard DAQ system can be preprogrammed for automatic data acquisition, or it can be remotely operated during the flight from the ground control station (GCS) using a graphical user interface (GUI) which has been developed and will also be presented in this paper. The unique design of the GUI and the DAQ system enables the synchronized acquisition of a variety of scientific and UAV flight data in a single core location. The new DAQ system and the GUI have been successfully utilized in several scientific UAV missions. In conclusion, the novel DAQ system provides the UAV and remote-sensing communities with a new tool capable of reliably acquiring, processing, storing and transmitting data from any sensor integrated on an UAV.
Circulation of core collection monographs in an academic medical library.
Schmidt, C M; Eckerman, N L
2001-04-01
Academic medical librarians responsible for monograph acquisition face a challenging task. From the plethora of medical monographs published each year, academic medical librarians must select those most useful to their patrons. Unfortunately, none of the selection tools available to medical librarians are specifically intended to assist academic librarians with medical monograph selection. The few short core collection lists that are available are intended for use in the small hospital or internal medicine department library. As these are the only selection tools available, however, many academic medical librarians spend considerable time reviewing these collection lists and place heavy emphasis on the acquisition of listed books. The study reported here was initiated to determine whether the circulation of listed books in an academic library justified the emphasis placed on the acquisition of these books. Circulation statistics for "listed" and "nonlisted" books in the hematology (WH) section of Indiana University School of Medicine's Ruth Lilly Medical Library were studied. The average circulation figures for listed books were nearly two times as high as the corresponding figures for the WH books in general. These data support the policies of those academic medical libraries that place a high priority on collection of listed books.
Supplemental knowledge acquisition through external product interface for CLIPS
NASA Technical Reports Server (NTRS)
Saito, Tim; Ebaud, Stephen; Loftin, Bowen R.
1990-01-01
Traditionally, the acquisition of knowledge for expert systems consisted of the interview process with the domain or subject matter expert (SME), observation of domain environment, and information gathering and research which constituted a direct form of knowledge acquisition (KA). The knowledge engineer would be responsible for accumulating pertinent information and/or knowledge from the SME(s) for input into the appropriate expert system development tool. The direct KA process may (or may not) have included forms of data or documentation to incorporate from the SME's surroundings. The differentiation between direct KA and supplemental KA (indirect) would be the difference in the use of data. In acquiring supplemental knowledge, the knowledge engineer would access other types of evidence (manuals, documents, data files, spreadsheets, etc.) that would support the reasoning or premises of the SME. When an expert makes a decision in a particular task, one tool that may have been used to justify a recommendation, would have been a spreadsheet total or column figure. Locating specific decision points from that data within the SME's framework would constitute supplemental KA. Data used for a specific purpose in one system or environment would be used as supplemental knowledge for another, specifically a CLIPS project.
Microprogrammable Integrated Data Acquisition System-Fatigue Life Data Application
1976-03-01
Lt. James W. Sturges successfully applied the Midas general system [Sturges, 1975] to the fatigue life data monitoring problem and proved its... life data problem. The Midas FLD system computer program generates the required signals in the proper sequence for effectively sampling the 8-channel... "Microprogrammable Integrated Data Acquisition System - Fatigue Life Data Application" (Midas FLD) is a microprocessor-based data acquisition system. It incorporates a Pro-Log
Low-dose x-ray tomography through a deep convolutional neural network
Yang, Xiaogang; De Andrade, Vincent; Scullin, William; ...
2018-02-07
Synchrotron-based X-ray tomography offers the potential of rapid large-scale reconstructions of the interiors of materials and biological tissue at fine resolution. However, for radiation-sensitive samples, there remain fundamental trade-offs between damaging samples during longer acquisition times and reducing signals with shorter acquisition times. We present a deep convolutional neural network (CNN) method that increases the acquired X-ray tomographic signal by at least a factor of 10 during low-dose fast acquisition by improving the quality of recorded projections. Short-exposure-time projections enhanced with the CNN show signal-to-noise ratios similar to those of long-exposure-time projections, and much lower noise and more structural information than low-dose fast acquisition without the CNN. We optimized this approach using simulated samples and further validated it on experimental nano-computed tomography data of radiation-sensitive mouse brains acquired with transmission X-ray microscopy. We demonstrate that automated algorithms can reliably trace brain structures in datasets collected with the low-dose CNN approach. As a result, this method can be applied to other tomographic or scanning-based X-ray imaging techniques and has great potential for studying faster dynamics in specimens.
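As a loose illustration of the projection-enhancement idea (not the authors' architecture, training data, or hyperparameters, all of which are assumed here), a small convolutional network can be trained to map simulated short-exposure projections to long-exposure targets:

    import torch
    import torch.nn as nn

    model = nn.Sequential(                      # toy denoiser, far smaller than a real one
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 1, 3, padding=1),
    )
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    clean = torch.rand(8, 1, 64, 64)              # stand-in long-exposure projections
    noisy = clean + 0.3 * torch.randn_like(clean) # simulated low-dose counterparts

    for step in range(200):                       # fit the noisy -> clean mapping
        opt.zero_grad()
        loss = loss_fn(model(noisy), clean)
        loss.backward()
        opt.step()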
Applying Online Monitoring for Nuclear Power Plant Instrumentation and Control
NASA Astrophysics Data System (ADS)
Hashemian, H. M.
2010-10-01
This paper presents a practical review of the state-of-the-art means for applying OLM data acquisition in nuclear power plant instrumentation and control, qualifying or validating the OLM data, and then analyzing it for static and dynamic performance monitoring applications. Whereas data acquisition for static or steady-state OLM applications can require sampling rates of anywhere from one sample every 1 to 10 seconds down to one sample per minute, dynamic data acquisition requires higher sampling frequencies (e.g., 100 to 1000 Hz) using a dedicated data acquisition system capable of providing isolation, anti-aliasing and removal of extraneous noise, and analog-to-digital (A/D) conversion. Qualifying the data for use with OLM algorithms can involve removing data 'dead' spots (for static data) and calculating, examining, and trending amplitude probability density, variance, skewness, and kurtosis. For static OLM applications with redundant signals, trending and averaging qualification techniques are used, and for single or non-redundant signals physical and empirical modeling are used. Dynamic OLM analysis is performed in the frequency domain and/or time domain, and is based on the assumption that sensors' or transmitters' dynamic characteristics are linear and that the input noise signal (i.e., the process fluctuations) has proper spectral characteristics.
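A minimal sketch of the static-data qualification step described above, assuming a plain NumPy/SciPy environment (the dead-spot threshold is invented):

    import numpy as np
    from scipy.stats import skew, kurtosis

    def longest_flat_run(x):
        # length of the longest run of consecutive unchanged samples
        best = run = 0
        for unchanged in (np.diff(x) == 0):
            run = run + 1 if unchanged else 0
            best = max(best, run)
        return best

    def qualify(x, dead_len=50):
        return {"var": float(np.var(x)),
                "skew": float(skew(x)),
                "kurtosis": float(kurtosis(x)),
                "dead_spot": longest_flat_run(x) >= dead_len}

    sig = np.random.randn(10_000)   # stand-in static OLM signal
    print(qualify(sig))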
NASA Technical Reports Server (NTRS)
Hazra, Rajeeb; Viles, Charles L.; Park, Stephen K.; Reichenbach, Stephen E.; Sieracki, Michael E.
1992-01-01
Consideration is given to a model-based method for estimating the spatial frequency response of a digital-imaging system (e.g., a CCD camera) that is modeled as a linear, shift-invariant image acquisition subsystem that is cascaded with a linear, shift-variant sampling subsystem. The method characterizes the 2D frequency response of the image acquisition subsystem to beyond the Nyquist frequency by accounting explicitly for insufficient sampling and the sample-scene phase. Results for simulated systems and a real CCD-based epifluorescence microscopy system are presented to demonstrate the accuracy of the method.
Sample acquisition and instrument deployment
NASA Technical Reports Server (NTRS)
Boyd, Robert C.
1995-01-01
Progress is reported in developing the Sample Acquisition and Instrument Deployment (SAID) system, a robotic system for deploying science instruments and acquiring samples for analysis. The system is a conventional four-degree-of-freedom manipulator 2 meters in length. A baseline design has been achieved through analysis and trade studies. The design considers environmental operating conditions on the surface of Mars, as well as volume constraints on proposed Mars landers. Control issues have also been studied, and simulations of joint and tip movements have been performed. The system has been fabricated and tested in environmental chambers, and has undergone soil testing and robotic control testing.
Multiphoton minimal inertia scanning for fast acquisition of neural activity signals
NASA Astrophysics Data System (ADS)
Schuck, Renaud; Go, Mary Ann; Garasto, Stefania; Reynolds, Stephanie; Dragotti, Pier Luigi; Schultz, Simon R.
2018-04-01
Objective. Multi-photon laser scanning microscopy provides a powerful tool for monitoring the spatiotemporal dynamics of neural circuit activity. It is, however, intrinsically a point scanning technique. Standard raster scanning enables imaging at subcellular resolution; however, acquisition rates are limited by the size of the field of view to be scanned. Scanning strategies such as travelling salesman scanning (TSS) have recently been developed to maximize the cellular sampling rate by scanning only select regions in the field of view corresponding to locations of interest such as somata. However, such strategies are not optimized for the mechanical properties of galvanometric scanners. We thus aimed to develop a new scanning algorithm which produces minimal-inertia trajectories, and to compare its performance with existing scanning algorithms. Approach. We describe here the adaptive spiral scanning (SSA) algorithm, which fits a set of near-circular trajectories to the cellular distribution to avoid inertial drifts of galvanometer position. We compare its performance to raster scanning and TSS in terms of cellular sampling frequency and signal-to-noise ratio (SNR). Main Results. Using surrogate neuron spatial position data, we show that SSA acquisition rates are an order of magnitude higher than those for raster scanning and generally exceed those achieved by TSS for neural densities comparable with those found in the cortex. We show that this result also holds true for in vitro hippocampal mouse brain slices bath-loaded with the synthetic calcium dye Cal-520 AM. The ability of TSS to ‘park’ the laser on each neuron along the scanning trajectory, however, enables higher SNR than SSA when all targets are precisely scanned. Raster scanning has the highest SNR but at a substantial cost in the number of cells scanned. To understand the impact of sampling rate and SNR on functional calcium imaging, we used the Cramér-Rao Bound on evoked calcium traces recorded simultaneously with electrophysiology traces to calculate a lower-bound estimate of spike timing precision. Significance. The results show that TSS and SSA achieve comparable accuracy in spike time estimates compared to raster scanning, despite lower SNR. SSA is an easily implementable way for standard multi-photon laser scanning systems to gain temporal precision in the detection of action potentials while scanning hundreds of active cells.
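A loose sketch of the near-circular fitting idea (assumed details, not the authors' SSA implementation): sort somata by angle about the field-of-view centre and interpolate a smooth radius profile, yielding low-inertia scanner drive waveforms:

    import numpy as np

    rng = np.random.default_rng(2)
    cells = rng.uniform(-1, 1, size=(50, 2))          # surrogate soma positions
    ang = np.arctan2(cells[:, 1], cells[:, 0])
    rad = np.hypot(cells[:, 0], cells[:, 1])
    order = np.argsort(ang)

    theta = np.linspace(-np.pi, np.pi, 2000)
    r = np.interp(theta, ang[order], rad[order])      # smooth-ish radius profile
    x, y = r * np.cos(theta), r * np.sin(theta)       # near-circular galvo waveforms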
Workshop on data acquisition and trigger system simulations for high energy physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1992-12-31
This report discusses the following topics: DAQSIM: A Data Acquisition System Simulation Tool; Front End and DCC Simulations for the SDC Straw Tube System; Simulation of Non-Blocking Data Acquisition Architectures; Simulation Studies of the SDC Data Collection Chip; Correlation Studies of the Data Collection Circuit & The Design of a Queue for this Circuit; Fast Data Compression & Transmission from a Silicon Strip Wafer; Simulation of SCI Protocols in Modsim; Visual Design with vVHDL; Stochastic Simulation of Asynchronous Buffers; SDC Trigger Simulations; Trigger Rates, DAQ & Online Processing at the SSC; Planned Enhancements to MODSIM II & SIMOBJECT -- an Overview; DAGAR -- A Synthesis System; Proposed Silicon Compiler for Physics Applications; Timed LOTOS in a PROLOG Environment: an Algebraic Language for Simulation; Modeling and Simulation of an Event Builder for High Energy Physics Data Acquisition Systems; A Verilog Simulation for the CDF DAQ; Simulation to Design with Verilog; The DZero Data Acquisition System: Model and Measurements; DZero Trigger Level 1.5 Modeling; Strategies Optimizing Data Load in the DZero Triggers; Simulation of the DZero Level 2 Data Acquisition System; and A Fast Method for Calculating DZero Level 1 Jet Trigger Properties and Physics Input to DAQ Studies.
UAV-Based Automatic Tree Growth Measurement for Biomass Estimation
NASA Astrophysics Data System (ADS)
Karpina, M.; Jarząbek-Rychard, M.; Tymków, P.; Borkowski, A.
2016-06-01
Manual in-situ measurements of geometric tree parameters for biomass volume estimation are time-consuming and economically non-effective. Photogrammetric techniques can be deployed in order to automate the measurement procedure. The purpose of the presented work is automatic tree growth estimation based on Unmanned Aerial Vehicle (UAV) imagery. The experiment was conducted in an agricultural test field with Scots pine canopies. The data was collected using a Leica Aibotix X6V2 platform equipped with a Nikon D800 camera. Reference geometric parameters of selected sample plants were measured manually each week, and the in-situ measurements were synchronized with the UAV data acquisition in order to investigate optimal flight conditions and parameter settings for image acquisition. The collected images were processed with a state-of-the-art tool, resulting in the generation of dense 3D point clouds. An algorithm was developed to estimate geometric tree parameters from the 3D points. Stem positions and tree tops are identified automatically in a cross section, followed by the calculation of tree heights. The automatically derived height values are compared to the reference measurements performed manually. The comparison allows for the evaluation of the automatic growth estimation process. The accuracy achieved using UAV photogrammetry for tree height estimation is about 5 cm.
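The height step described above reduces to simple point-cloud arithmetic; a minimal sketch, assuming the ground can be taken as a low z-percentile of the cloud around each stem:

    import numpy as np

    def tree_height(points_z):
        ground = np.percentile(points_z, 5)   # assumes mostly ground plus one canopy
        return points_z.max() - ground

    cloud_z = np.concatenate([np.random.normal(0.0, 0.02, 5000),   # ground returns
                              np.random.normal(1.8, 0.10, 800)])   # canopy points
    print(f"estimated height: {tree_height(cloud_z):.2f} m")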
Imaging brain tumour microstructure.
Nilsson, Markus; Englund, Elisabet; Szczepankiewicz, Filip; van Westen, Danielle; Sundgren, Pia C
2018-05-08
Imaging is an indispensable tool for brain tumour diagnosis, surgical planning, and follow-up. Definite diagnosis, however, often demands histopathological analysis of microscopic features of tissue samples, which have to be obtained by invasive means. A non-invasive alternative may be to probe corresponding microscopic tissue characteristics by MRI, so-called 'microstructure imaging'. The promise of microstructure imaging is one of 'virtual biopsy', with the goal of offsetting the need for invasive procedures in favour of imaging that can guide pre-surgical planning and can be repeated longitudinally to monitor and predict treatment response. The exploration of such methods is motivated by the striking link between parameters from MRI and tumour histology, for example the correlation between the apparent diffusion coefficient and cellularity. Recent microstructure imaging techniques probe even more subtle and specific features, providing parameters associated with cell shape, size, permeability, and volume distributions. However, the range of scenarios in which these techniques provide reliable imaging biomarkers that can be used to test medical hypotheses or support clinical decisions is as yet unknown. Accurate microstructure imaging may moreover require acquisitions that go beyond conventional data acquisition strategies. This review covers a wide range of candidate microstructure imaging methods based on diffusion MRI and relaxometry, and explores advantages, challenges, and potential pitfalls in brain tumour microstructure imaging.
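The ADC-cellularity link mentioned above rests on the standard monoexponential relation S = S0 · exp(-b · ADC); a worked example with invented signal values:

    import numpy as np

    def adc(s_low, s_high, b_low=0.0, b_high=1000.0):   # b-values in s/mm^2
        return np.log(s_low / s_high) / (b_high - b_low)

    print(adc(1.0, 0.45))   # ~8e-4 mm^2/s, a plausible tumour-tissue value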
Gaining perspective on the water-energy nexus at the community scale.
Perrone, Debra; Murphy, Jennifer; Hornberger, George M
2011-05-15
Water and energy resources are interrelated, but their influence on each other is rarely considered. To quantify the water and energy portfolios associated with a community's water-energy nexus (WEN) and the influence of geographic location on resources, we present the WEN tool. The WEN tool quantifies a community's transport (consumed for or lost before delivery) and nexus (energy for water and water for energy) resources so communities can assess their resource flows. In addition, to provide insight into the full range of impacts of water and energy resource acquisition and to frame the influence of geography on resources, we coin the term 'urban resource islands'. The concept of urban resource islands provides a framework for considering the implications of geography for a community's water and energy resource acquisition and use. The WEN tool and the concept of resource islands can prompt communities to consider their hidden resources and to integrate such concepts into their sustainability trade-off analyses and policy decisions. In this paper, we use Tucson, Arizona, United States as a case study.
Evaluation of the quality of the teaching-learning process in undergraduate courses in Nursing 1
González-Chordá, Víctor Manuel; Maciá-Soler, María Loreto
2015-01-01
Objective: to identify aspects of improvement of the quality of the teaching-learning process through the analysis of tools that evaluated the acquisition of skills by undergraduate students of Nursing. Method: prospective longitudinal study conducted in a population of 60 second-year Nursing students based on registration data, from which quality indicators that evaluate the acquisition of skills were obtained, with descriptive and inferential analysis. Results: nine items were identified and nine learning activities included in the assessment tools that did not reach the established quality indicators (p<0.05). There are statistically significant differences depending on the hospital and clinical practices unit (p<0.05). Conclusion: the analysis of the evaluation tools used in the subject "Nursing Care in Welfare Processes" of the analyzed university undergraduate course enabled the detection of the areas for improvement in the teaching-learning process. The challenge of education in nursing is to reach the best clinical research and educational results, in order to provide improvements to the quality of education and health care. PMID:26444173
Data acquisition channel apparatus
NASA Astrophysics Data System (ADS)
Higgins, C. H.; Skipper, J. D.
1985-10-01
Discussed is a hybrid integrated circuit data acquisition channel apparatus employing an operational amplifier fed by a low-current differential bipolar transistor preamplifier having separate feedback-gain and signal-gain determining elements and providing an amplified signal output to sample-and-hold and analog-to-digital converter circuits. The disclosed apparatus operates with low energy and small space requirements and is capable of operating without the sample-and-hold circuit where the nature of the applied input signal permits.
Microcomputer data acquisition and control.
East, T D
1986-01-01
In medicine and biology there are many tasks that involve routine, well-defined procedures. These tasks are ideal candidates for computerized data acquisition and control. As the performance of microcomputers rapidly increases and cost continues to go down, the temptation to automate the laboratory becomes great. To the novice computer user the choices of hardware and software are overwhelming, and sadly most computer salespersons are not at all familiar with real-time applications. If you want to bill your patients you have hundreds of packaged systems to choose from; however, if you want to do real-time data acquisition the choices are very limited and confusing. The purpose of this chapter is to provide the novice computer user with the basics needed to set up a real-time data acquisition system with common microcomputers. This chapter covers the following issues necessary to establish a real-time data acquisition and control system: Analysis of the research problem: definition of the problem; description of data and sampling requirements; cost/benefit analysis. Choice of microcomputer hardware and software: choice of microprocessor and bus structure; choice of operating system; choice of layered software. Digital data acquisition: parallel data transmission; serial data transmission; hardware and software available. Analog data acquisition: description of amplitude and frequency characteristics of the input signals; sampling theorem; specification of the analog-to-digital converter; hardware and software available; interface to the microcomputer. Microcomputer control: analog output; digital output; closed-loop control. Microcomputer data acquisition and control in the 21st century: what is in the future? High-speed digital medical equipment networks; medical decision making and artificial intelligence.
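The sampling theorem mentioned in the outline above is easy to demonstrate: a 60 Hz signal sampled at 80 Hz (below the required 120 Hz) aliases to 20 Hz. A short sketch:

    import numpy as np

    fs, f_sig = 80.0, 60.0
    t = np.arange(0, 1, 1 / fs)
    x = np.sin(2 * np.pi * f_sig * t)
    spec = np.abs(np.fft.rfft(x))
    f = np.fft.rfftfreq(len(x), 1 / fs)
    print(f"peak at {f[spec.argmax()]:.1f} Hz")   # ~20 Hz, not 60 Hz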
Digital Documentation: Using Computers to Create Multimedia Reports.
ERIC Educational Resources Information Center
Speitel, Tom; And Others
1996-01-01
Describes methods for creating integrated multimedia documents using recent advances in print, audio, and video digitization that bring added usefulness to computers as data acquisition, processing, and presentation tools. Discusses advantages of digital documentation. (JRH)
The cost of biomedical equipment repair and maintenance: results of a survey.
Cohen, T
1982-01-01
The survey presented in this paper shows that for 19 large hospitals the average ratio of equipment repair costs to acquisition cost was 7.4%. In addition, this survey shows that costs such as rent for building space, utilities, and test equipment are not included in many clinical engineering department budgets. This is one reason for the divergent cost data reported by the various hospitals. These costs should be considered particularly for comparisons between in-house service costs and other sources of service. It seems that, of the indicators observed in this survey, equipment acquisition cost provides the best indicator for equipment maintenance costs. All hospital finance officers should have acquisition value information, because this information is used in calculating capital equipment depreciation. This information should also be available to clinical engineers. In addition, procedures need to be set up so that the total annual repair and maintenance costs can be easily obtained from hospital finance departments. Providing the clinical engineer with this type of data will allow further analysis of repair cost and will aid in long-term planning for the hospital. The ratio of equipment repair cost to acquisition value may be useful as a tool to predict future costs of a given hospital's medical equipment maintenance. This tool may also be useful as a measurement of the effectiveness of a change in a hospital's approach to biomedical equipment maintenance. Further work must be done to standardize equipment maintenance cost reporting so that more detailed comparisons can be made.
Single molecule force spectroscopy at high data acquisition: A Bayesian nonparametric analysis
NASA Astrophysics Data System (ADS)
Sgouralis, Ioannis; Whitmore, Miles; Lapidus, Lisa; Comstock, Matthew J.; Pressé, Steve
2018-03-01
Bayesian nonparametrics (BNPs) are poised to have a deep impact in the analysis of single molecule data as they provide posterior probabilities over entire models consistent with the supplied data, not just model parameters of one preferred model. Thus they provide an elegant and rigorous solution to the difficult problem encountered when selecting an appropriate candidate model. Nevertheless, BNPs' flexibility to learn models and their associated parameters from experimental data is a double-edged sword. Most importantly, BNPs are prone to increasing the complexity of the estimated models due to artifactual features present in time traces. Thus, because of experimental challenges unique to single molecule methods, naive application of available BNP tools is not possible. Here we consider traces with time correlations and, as a specific example, we deal with force spectroscopy traces collected at high acquisition rates. While high acquisition rates are required in order to capture dwells in short-lived molecular states, in this setup, a slow response of the optical trap instrumentation (i.e., trapped beads, ambient fluid, and tethering handles) distorts the molecular signals introducing time correlations into the data that may be misinterpreted as true states by naive BNPs. Our adaptation of BNP tools explicitly takes into consideration these response dynamics, in addition to drift and noise, and makes unsupervised time series analysis of correlated single molecule force spectroscopy measurements possible, even at acquisition rates similar to or below the trap's response times.
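A toy illustration (assumed parameters, not the authors' instrument model) of how a slow trap response imprints time correlations on a fast molecular signal, which a naive state model could mistake for extra states:

    import numpy as np

    rng = np.random.default_rng(0)
    fs, tau = 100_000.0, 1e-4                     # 100 kHz sampling, 0.1 ms trap response
    states = rng.choice([0.0, 1.0], size=50_000)  # idealized fast two-state signal
    t = np.arange(0, 10 * tau, 1 / fs)
    kernel = np.exp(-t / tau)
    kernel /= kernel.sum()                        # exponential impulse response
    measured = (np.convolve(states, kernel, mode="same")
                + 0.05 * rng.standard_normal(states.size))

    lag1 = np.corrcoef(measured[:-1], measured[1:])[0, 1]
    print(f"lag-1 autocorrelation: {lag1:.2f}")   # well above zero due to the trap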
Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H
2015-12-01
An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.
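At its simplest, the 'absolute' quantification arithmetic behind the MRM-with-SIS-peptide approach is a peak-area ratio scaled by the known spiked-in SIS concentration; the numbers below are invented for illustration:

    def concentration(light_area, heavy_area, sis_conc_fmol_ul):
        # endogenous (light) to SIS (heavy) ratio times the spiked concentration
        return (light_area / heavy_area) * sis_conc_fmol_ul

    print(concentration(2.4e6, 1.2e6, 50.0))   # 100.0 fmol/uL endogenous peptide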
NASA Astrophysics Data System (ADS)
Foltynowicz, Aleksandra; Picqué, Nathalie; Ye, Jun
2018-05-01
Frequency combs are becoming enabling tools for many applications in science and technology, beyond their original purpose of frequency metrology of simple atoms. The precisely evenly spaced narrow lines of a laser frequency comb inspire intriguing approaches to molecular spectroscopy, designed and implemented by a growing community of scientists. Frequency-comb spectroscopy advances the frontiers of molecular physics across the entire electromagnetic spectrum. Used as frequency rulers, frequency combs enable absolute frequency measurements and precise line shape studies of molecular transitions, for, e.g., tests of fundamental physics and improved determination of fundamental constants. As light sources interrogating the molecular samples, they dramatically improve the resolution, precision, sensitivity and acquisition time of broad spectral-bandwidth spectroscopy and open up new opportunities and applications at the leading edge of molecular spectroscopy and sensing.
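The comb-as-ruler idea can be stated in one line: every comb tooth sits at f_n = f_ceo + n · f_rep, so optical frequencies are pinned down by two radio-frequency measurements. A worked example with assumed parameters:

    f_rep, f_ceo = 250e6, 20e6          # example repetition and offset frequencies (Hz)
    n = 1_200_000                       # comb line index
    f_n = f_ceo + n * f_rep
    print(f"line {n}: {f_n / 1e12:.6f} THz")   # 300.000020 THz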
From Sequences to Insights in Microbial Ecology
Knight, R.
2010-01-01
Rapid declines in the cost of sequencing have made large volumes of DNA sequence data available to individual investigators. Now, data analysis is the rate-limiting step: providing a user with sequences alone typically leads to bewilderment, frustration, and skepticism about the technology. In this talk, I focus on how to extract insights from 16S rRNA data, including key lab steps (barcoding and normalization) and on which tools are available to perform routine but essential processing steps such as denoising, chimera detection, taxonomy assignment, and diversity analyses (including detection of biological clusters and gradients in the samples). Providing users with advice on these points and with a standard pipeline they can exploit (but modify if circumstances require) can greatly accelerate the rate of understanding, publication, and acquisition of funding for further studies.
An open tool for input function estimation and quantification of dynamic PET FDG brain scans.
Bertrán, Martín; Martínez, Natalia; Carbajal, Guillermo; Fernández, Alicia; Gómez, Álvaro
2016-08-01
Positron emission tomography (PET) analysis of clinical studies is mostly restricted to qualitative evaluation. Quantitative analysis of PET studies is highly desirable to be able to compute an objective measurement of the process of interest in order to evaluate treatment response and/or compare patient data. But implementation of quantitative analysis generally requires the determination of the input function: the arterial blood or plasma activity which indicates how much tracer is available for uptake in the brain. The purpose of our work was to share with the community an open software tool that can assist in the estimation of this input function, and the derivation of a quantitative map from the dynamic PET study. Arterial blood sampling during the PET study is the gold standard method to get the input function, but is uncomfortable and risky for the patient so it is rarely used in routine studies. To overcome the lack of a direct input function, different alternatives have been devised and are available in the literature. These alternatives derive the input function from the PET image itself (image-derived input function) or from data gathered from previous similar studies (population-based input function). In this article, we present ongoing work that includes the development of a software tool that integrates several methods with novel strategies for the segmentation of blood pools and parameter estimation. The tool is available as an extension to the 3D Slicer software. Tests on phantoms were conducted in order to validate the implemented methods. We evaluated the segmentation algorithms over a range of acquisition conditions and vasculature size. Input function estimation algorithms were evaluated against ground truth of the phantoms, as well as on their impact over the final quantification map. End-to-end use of the tool yields quantification maps with [Formula: see text] relative error in the estimated influx versus ground truth on phantoms. The main contribution of this article is the development of an open-source, free to use tool that encapsulates several well-known methods for the estimation of the input function and the quantification of dynamic PET FDG studies. Some alternative strategies are also proposed and implemented in the tool for the segmentation of blood pools and parameter estimation. The tool was tested on phantoms with encouraging results that suggest that even bloodless estimators could provide a viable alternative to blood sampling for quantification using graphical analysis. The open tool is a promising opportunity for collaboration among investigators and further validation on real studies.
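As a sketch of the graphical analysis the article invokes (Patlak analysis is assumed here as the concrete instance), the influx constant Ki is the late-time slope of normalized tissue activity against normalized integrated input; the input function below is synthetic:

    import numpy as np

    t = np.linspace(0, 60, 121)                  # minutes
    dt = t[1] - t[0]
    cp = 100 * np.exp(-t / 20) + 5               # stand-in plasma input function
    Ki_true, V0 = 0.05, 0.4
    ct = Ki_true * np.cumsum(cp) * dt + V0 * cp  # synthetic tissue curve

    x = np.cumsum(cp) * dt / cp                  # Patlak abscissa
    y = ct / cp                                  # Patlak ordinate
    slope = np.polyfit(x[60:], y[60:], 1)[0]     # fit the late, linear portion
    print(f"estimated Ki = {slope:.3f} /min")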
ACQUISITION OF REPRESENTATIVE GROUND WATER QUALITY SAMPLES FOR METALS
R.S. Kerr Environmental Research Laboratory (RSKERL) personnel have evaluated sampling procedures for the collection of representative, accurate, and reproducible ground water quality samples for metals for the past four years. Intensive sampling research at three different field...
Pollution Prevention in Air Force System Acquisition Programs
1994-09-01
Environmental metrics excerpted from the report include: ... Audits -- Number of Facility Audits; 12. Chemical Spill Prevention Measures -- Number of Measures; 13. Unresolved Notices of Violation -- Number of Open Notices; ... 14. Air Force Environmental Audit Findings -- Number of Open Findings; 15. Awareness/Information Tools -- Number of New Tools; 16. Environmental Training... The program is building thirty-six aircraft (FY94 budget) to be delivered in 1995-1996. The latest audited cost data that can be used in the negotiations ends in
Survey of Human Systems Integration (HSI) Tools for USCG Acquisitions
2009-04-01
an IMPRINT HPM. IMPRINT uses task network modeling to represent human performance. As the name implies, task networks use a flowchart-type format... tools; and built-in tutoring support for beginners. A perceptual/motor layer extending ACT-R's theory of cognition to perception and action is also... chisystems.com. B.8 Information and Functional Flow Analysis: In information flow analysis, a flowchart of the information and decisions
Meade, Michelle A; Forchheimer, Martin B; Krause, James S; Charlifue, Susan
2011-03-01
To examine the associations of job acquisition and job retention with secondary conditions, hospitalizations, and nursing home stays for adults with spinal cord injury (SCI). Retrospective analysis of longitudinal data from a multicenter study. Community setting. Two samples of adults participating in the SCI Model Systems; the first sample consisted of persons who reported being unemployed at follow-up (n=9,501); the second sample consisted of those who reported working at follow-up (n=5,150). Not applicable. Job acquisition (change from not working at one anniversary of injury to working at the following data collection) and job retention (maintenance of work between two assessment periods). Discrete-time hazard modeling was used to assess how secondary conditions affect job acquisition. After controlling for the effects of demographic and injury characteristics, hospitalizations within the last 12 months were associated with a decreased chance of having obtained employment. Hierarchic logistic regression analyses were used to examine job retention. Hospitalizations and the presence of pressure ulcers (PUs) were associated with lower odds of job retention once demographic and injury characteristics were controlled. Secondary conditions from the previous assessment period were not significantly related to either job acquisition or job retention after the variance from demographic and injury characteristics and current secondary conditions was controlled. Hospitalization, as well as a limited number of secondary conditions, was associated with reduced odds of both job acquisition and job retention among adults with SCI. Interventions that can prevent secondary conditions and reduce the need for hospitalizations may be beneficial in improving employment for this population.
Demir, E; Babur, O; Dogrusoz, U; Gursoy, A; Nisanci, G; Cetin-Atalay, R; Ozturk, M
2002-07-01
Availability of the sequences of entire genomes shifts scientific curiosity towards the large-scale identification of genome function, as in genome studies. In the near future, data produced about cellular processes at the molecular level will accumulate at an accelerating rate as a result of proteomics studies. In this regard, it is essential to develop tools for storing, integrating, accessing, and analyzing this data effectively. We define an ontology for a comprehensive representation of cellular events. The ontology presented here enables integration of fragmented or incomplete pathway information and supports manipulation and incorporation of the stored data, as well as multiple levels of abstraction. Based on this ontology, we present the architecture of an integrated environment named Patika (Pathway Analysis Tool for Integration and Knowledge Acquisition). Patika is composed of a server-side, scalable, object-oriented database and client-side editors that provide an integrated, multi-user environment for visualizing and manipulating networks of cellular events. This tool features automated pathway layout, functional computation support, advanced querying and a user-friendly graphical interface. We expect that Patika will be a valuable tool for rapid knowledge acquisition, interpretation of large-scale microarray data, disease gene identification, and drug development. A prototype of Patika is available upon request from the authors.
Bimodal Imaging at ICON Using Neutrons and X-rays
NASA Astrophysics Data System (ADS)
Kaestner, A. P.; Hovind, J.; Boillat, P.; Muehlebach, C.; Carminati, C.; Zarebanadkouki, M.; Lehmann, E. H.
For experiments with low contrast between the relevant features, it can be beneficial to add a second modality to reduce ambiguity. At the Paul Scherrer Institut's two neutron imaging facilities, NEUTRA (thermal neutrons) and ICON (cold neutrons), we have installed X-ray beamlines for on-site bimodal imaging with neutrons and X-rays. This allows us to leave the sample untouched in the sample environment throughout an experiment and to reduce the waiting times between acquisitions with each modality. The applications and energy ranges of the X-ray installations differ between the two facilities: NEUTRA is intended for larger samples (60-320 kV), while ICON targets small samples and simultaneous acquisition (40-150 kV). Here, we report on the more recent installation at ICON. The X-ray beamline uses a cone-beam source and is arranged across the neutron beamline. The beamline is designed to allow up to ten times magnification. This matches the voxel size that can be achieved with the micro-setup for neutrons. The oblique arrangement of the X-ray beamline further makes real-time acquisition possible, since both modalities have a free view of the sample at any time. Reconstruction of cone-beam data requires more knowledge about the beam geometry and sample position. Therefore, the beamline is equipped with laser-based distance sensors, and a calibration procedure has been developed to increase the accuracy of the reconstruction. The purpose of using multimodal acquisition is to fuse the data in a way that enhances the output of the experiment. We demonstrate the current system performance and provide a basic analysis with experiment data.
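The cone-beam geometry behind the 'up to ten times magnification' figure follows from similar triangles: magnification M = SDD/SOD (source-to-detector over source-to-object distance), so the effective voxel size is the detector pixel pitch divided by M. A worked example with assumed distances:

    def effective_voxel(pixel_pitch_um, sdd_mm, sod_mm):
        m = sdd_mm / sod_mm           # cone-beam magnification
        return pixel_pitch_um / m

    print(effective_voxel(75.0, 500.0, 50.0))   # 7.5 um at 10x magnification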
NASA Astrophysics Data System (ADS)
Ćwikła, G.; Gwiazda, A.; Banaś, W.; Monica, Z.; Foit, K.
2017-08-01
The article presents a study of selected methods of complex-system description that can support the Manufacturing Information Acquisition System (MIAS) methodology, which describes how to design a data acquisition system allowing real-time data on the functioning of a production system, necessary for the management of a company, to be collected and processed. MIAS can enable conversion into a Cyber-Physical Production System. MIAS gathers and pre-processes data on the state of the production system, including e.g. the realisation of production orders and the state of machines, materials and human resources. A systematised approach and model-based development are proposed for improving the quality of the design of complex, MIAS methodology-based systems supporting data acquisition in various types of companies. Graphical specification can be the baseline for any model-based development in the specified areas. The possibility of applying SysML (a UML-based language) and BPMN, which represent different approaches to modelling the requirements, architecture and implementation of the data acquisition system, as tools supporting the description of the required features of MIAS was considered.
The application of a computer data acquisition system to a new high temperature tribometer
NASA Technical Reports Server (NTRS)
Bonham, Charles D.; Dellacorte, Christopher
1991-01-01
Two data acquisition computer programs that were developed for a high-temperature friction and wear test apparatus, a tribometer, are described. The raw data produced by the tribometer and the methods used to sample those data are explained. In addition, the instrumentation and the computer hardware and software are presented. Also shown is how computer data acquisition was applied to increase convenience and productivity on a high-temperature tribometer.
The application of a computer data acquisition system for a new high temperature tribometer
NASA Technical Reports Server (NTRS)
Bonham, Charles D.; Dellacorte, Christopher
1990-01-01
Two data acquisition computer programs that were developed for a high-temperature friction and wear test apparatus, a tribometer, are described. The raw data produced by the tribometer and the methods used to sample those data are explained. In addition, the instrumentation and the computer hardware and software are presented. Also shown is how computer data acquisition was applied to increase convenience and productivity on a high-temperature tribometer.
Astromaterials Curation Online Resources for Principal Investigators
NASA Technical Reports Server (NTRS)
Todd, Nancy S.; Zeigler, Ryan A.; Mueller, Lina
2017-01-01
The Astromaterials Acquisition and Curation office at NASA Johnson Space Center curates all of NASA's extraterrestrial samples, the most extensive set of astromaterials samples available to the research community worldwide. The office allocates 1,500 individual samples to researchers and students each year and has served the planetary research community for 45+ years. The Astromaterials Curation office provides access to its sample data repository and digital resources to support the research needs of sample investigators and to aid in the selection and request of samples for scientific study. These resources can be found on the Astromaterials Acquisition and Curation website at https://curator.jsc.nasa.gov. To better serve our users, we have engaged in several activities to enhance the data available for astromaterials samples, to improve the accessibility and performance of the website, and to address user feedback. We have also put plans in place for continuing improvements to our existing data products.
Practical continuous-variable quantum key distribution without finite sampling bandwidth effects.
Li, Huasheng; Wang, Chao; Huang, Peng; Huang, Duan; Wang, Tao; Zeng, Guihua
2016-09-05
In a practical continuous-variable quantum key distribution system, the finite sampling bandwidth of the analog-to-digital converter employed at the receiver's side may lead to inaccurate pulse-peak sampling, which in turn produces errors in parameter estimation. Consequently, system performance decreases and security loopholes are exposed to eavesdroppers. In this paper, we propose a novel data acquisition scheme which consists of two parts, i.e., a dynamic delay adjusting module and a statistical power feedback-control algorithm. The proposed scheme may dramatically improve the data acquisition precision of pulse-peak sampling and remove the finite sampling bandwidth effects. Moreover, the optimal peak sampling position of a pulse signal can be dynamically calibrated by monitoring the change of the statistical power of the sampled data in the proposed scheme. This helps to resist some practical attacks, such as the well-known local oscillator calibration attack.
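A minimal sketch of the statistical-power feedback idea, with an invented Gaussian pulse standing in for the detector output: scan the sampling delay and keep the one that maximizes the mean squared sampled amplitude:

    import numpy as np

    def pulse(t):                     # stand-in pulse shape, peak at 50 ns
        return np.exp(-((t - 50e-9) / 10e-9) ** 2)

    delays = np.linspace(0, 100e-9, 201)
    power = [np.mean(pulse(d) ** 2) for d in delays]
    best = delays[int(np.argmax(power))]
    print(f"calibrated sampling delay: {best * 1e9:.1f} ns")   # ~50 ns, the peak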
NASA Astrophysics Data System (ADS)
Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.
2014-04-01
Myocardial blood flow (MBF) can be estimated from dynamic contrast enhanced (DCE) cardiac CT acquisitions, leading to quantitative assessment of regional perfusion. The need for low radiation dose and the lack of consensus on MBF estimation methods motivates this study to refine the selection of acquisition protocols and models for CT-derived MBF. DCE cardiac CT acquisitions were simulated for a range of flow states (MBF = 0.5, 1, 2, 3 ml (min g)^-1, cardiac output = 3, 5, 8 L min^-1). Patient kinetics were generated by a mathematical model of iodine exchange incorporating numerous physiological features including heterogeneous microvascular flow, permeability and capillary contrast gradients. CT acquisitions were simulated for multiple realizations of realistic x-ray flux levels. CT acquisitions that reduce radiation exposure were implemented by varying both temporal sampling (1, 2, and 3 s sampling intervals) and tube currents (140, 70, and 25 mAs). For all acquisitions, we compared three quantitative MBF estimation methods (two-compartment model, an axially-distributed model, and the adiabatic approximation to the tissue homogeneous model) and a qualitative slope-based method. In total, over 11 000 time-attenuation curves were used to evaluate MBF estimation in multiple patient and imaging scenarios. After iodine-based beam hardening correction, the slope method consistently underestimated flow by on average 47.5%, and the quantitative models provided estimates with less than 6.5% average bias and increasing variance with increasing dose reductions. The three quantitative models performed equally well, offering estimates with essentially identical root mean squared error (RMSE) for matched acquisitions. MBF estimates using the qualitative slope method were inferior in terms of bias and RMSE compared to the quantitative methods. MBF estimate error was equal at matched dose reductions for all quantitative methods and the range of techniques evaluated. This suggests that no quantitative estimation method holds a particular advantage, and that dose reduction via tube current reduction offers no advantage over temporal sampling reduction. These data are important for optimizing implementation of cardiac dynamic CT in clinical practice and in prospective CT MBF trials.
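As a sketch of model-based MBF estimation (a simplified one-tissue compartment model with an invented input function, not the paper's exact models), the tissue curve is K1 · exp(-k2 t) convolved with the arterial input:

    import numpy as np
    from scipy.optimize import curve_fit

    t = np.linspace(0, 30, 61)                 # seconds, 0.5 s sampling
    dt = t[1] - t[0]
    aif = (t / 4.0) * np.exp(-t / 4.0)         # stand-in arterial input function

    def model(t, K1, k2):                      # one-tissue compartment response
        return K1 * np.convolve(aif, np.exp(-k2 * t), mode="full")[:len(t)] * dt

    rng = np.random.default_rng(0)
    noisy = model(t, 1.2, 0.3) + 0.01 * rng.standard_normal(len(t))
    (K1, k2), _ = curve_fit(model, t, noisy, p0=[1.0, 0.1])
    print(f"K1 = {K1:.2f}, k2 = {k2:.2f}")     # recovers ~1.2 and ~0.3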
One GigaSample Per Second Data Acquisition using Available Gate Array Technology
NASA Technical Reports Server (NTRS)
Wagner, K.W.
1999-01-01
A new National Aeronautics and Space Administration instrument placed demanding requirements upon its altimeter digitizer system: eight-bit data would be generated at a rate of one billion samples per second. NASA had never before attempted to capture such high-speed data in the radiation, low-power, no-convective-cooling, limited-board-area environment of space. This presentation describes how the gate array technology available at the time of the design was used to implement this one-gigasample-per-second data acquisition system.
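A common way to capture a 1 GS/s stream in slower logic (assumed here for illustration; the presentation does not detail NASA's design) is to demultiplex it into parallel lanes that a gate array can clock:

    import numpy as np

    samples = np.arange(64, dtype=np.uint8)   # stand-in 1 GS/s byte stream
    lanes = samples.reshape(-1, 8).T          # lane k holds samples k, k+8, k+16, ...
    print(lanes.shape)                        # (8, 8): eight 125 MS/s streams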
Code of Federal Regulations, 2014 CFR
2014-10-01
... ENVIRONMENT, ENERGY AND WATER EFFICIENCY, RENEWABLE ENERGY TECHNOLOGIES, OCCUPATIONAL SAFETY, AND DRUG-FREE... the acquisition and use of designated recycled content, and Energy Star®, Electronic Product Environmental Assessment Tool (EPEAT)-registered, energy-efficient, bio-based, and environmentally preferable...
Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R
2009-01-01
We report on the real-time creation of an application for hands-on neurophysiology in an advanced undergraduate teaching laboratory. Enabled by the rapid software development tools included in the Matlab technical computing environment (The Mathworks, Natick, MA), a team, consisting of a neurophysiology educator and a biophysicist trained as an electrical engineer, interfaced to a course of approximately 15 students from engineering and biology backgrounds. The result is the powerful freeware data acquisition and analysis environment, "g-PRIME." The software was developed from week to week in response to curriculum demands, and student feedback. The program evolved from a simple software oscilloscope, enabling RC circuit analysis, to a suite of tools supporting analysis of neuronal excitability and synaptic transmission analysis in invertebrate model systems. The program has subsequently expanded in application to university courses, research, and high school projects in the US and abroad as free courseware.
Portfolio as a tool to evaluate clinical competences of traumatology in medical students
Santonja-Medina, Fernando; García-Sanz, M Paz; Martínez-Martínez, Francisco; Bó, David; García-Estañ, Joaquín
2016-01-01
This article investigates whether a reflexive portfolio is instrumental in determining the level of acquisition of clinical competences in traumatology, a subject in the 5th year of the degree of medicine. A total of 131 students used the portfolio during their clinical rotation of traumatology. The students’ portfolios were blind evaluated by four professors who annotated the existence (yes/no) of 23 learning outcomes. The reliability of the portfolio was moderate, according to the kappa index (0.48), but the evaluation scores between evaluators were very similar. Considering the mean percentage, 59.8% of the students obtained all the competences established and only 13 of the 23 learning outcomes (56.5%) were fulfilled by >50% of the students. Our study suggests that the portfolio may be an important tool to quantitatively analyze the acquisition of traumatology competences of medical students, thus allowing the implementation of methods to improve its teaching. PMID:26929675
ORAC: 21st Century Observing at UKIRT
NASA Astrophysics Data System (ADS)
Bridger, A.; Wright, G. S.; Tan, M.; Pickup, D. A.; Economou, F.; Currie, M. J.; Adamson, A. J.; Rees, N. P.; Purves, M. H.
The Observatory Reduction and Acquisition Control system replaces all of the existing software which interacts with the observers at UKIRT. The aim is to improve observing efficiency with a set of integrated tools that take the user from pre-observing preparation, through the acquisition of observations, to reduction using a data-driven pipeline. ORAC is designed to be flexible and extensible, and is intended for use with all future UKIRT instruments, as well as existing telescope hardware and 'legacy' instruments. It is also designed to allow integration with phase-1 and queue-scheduled observing tools in anticipation of possible future requirements. A brief overview of the project and its relationship to other systems is given. ORAC also re-uses much code from other systems, and we discuss issues relating to the trade-off between reuse and the generation of new software specific to our requirements.
Fortin, Guillaume; Lecomte, Tania; Corbière, Marc
2017-06-01
When employment difficulties occur in people with severe mental illness (SMI), they may be partly linked to issues not specific to SMI, such as personality traits or problems. Despite the fact that personality has a marked influence on almost every aspect of work behavior, it has scarcely been investigated in the context of employment for people with SMI. We aimed to evaluate whether personality was more predictive than clinical variables of different competitive work outcomes, namely acquisition of competitive employment, delay to acquisition, and job tenure. A sample of 82 people with SMI enrolled in supported employment programs (SEP) was recruited and asked to complete various questionnaires and interviews. Statistical analyses included logistic regressions and survival analyses (Cox regressions). Prior employment, personality problems and negative symptoms were significantly related to the acquisition of competitive employment and to the delay to acquisition, whereas the conscientiousness personality trait was predictive of job tenure. Our results point out the relevance of personality traits and problems as predictors of work outcomes in people with SMI registered in SEP. Future studies should recruit larger samples and also investigate these links with other factors related to work outcomes.
Near-real-time mosaics from high-resolution side-scan sonar
Danforth, William W.; O'Brien, Thomas F.; Schwab, W.C.
1991-01-01
High-resolution side-scan sonar has proven to be a very effective tool for studying and understanding the surficial geology of the seafloor. Since the mid-1970s, the US Geological Survey has used high-resolution side-scan sonar systems for mapping various areas of the continental shelf. However, two problems typically encountered are the short range and high sampling rate of high-resolution side-scan sonar systems, and the acquisition and real-time processing of the enormous volume of sonar data generated by high-resolution systems. These problems were addressed and overcome in August 1989, when the USGS conducted a side-scan sonar and bottom sampling survey of a 1000-sq-km section of the continental shelf in the Gulf of the Farallones located offshore of San Francisco. The primary goal of this survey was to map an area of critical interest for studying continental shelf sediment dynamics. This survey provided an opportunity to test an image processing scheme that enabled production of a side-scan sonar hard-copy mosaic during the cruise in near real-time.
AE Monitoring of Diamond Turned Rapidly Solidified Aluminium 443
NASA Astrophysics Data System (ADS)
Onwuka, G.; Abou-El-Hossein, K.; Mkoko, Z.
2017-05-01
The fast replacement of conventional aluminium with rapidly solidified aluminium alloys has become a noticeable trend in manufacturing industries involved in the production of optics and optical molding inserts. This is a result of the improved performance and durability of rapidly solidified aluminium alloys compared to conventional aluminium. The melt spinning process is vital for manufacturing rapidly solidified aluminium alloys like RSA 905, RSA 6061 and RSA 443, which are common in industry today. RSA 443 is a newly developed alloy with few research findings and huge research potential. There is no available literature focused on monitoring the machining of RSA 443 alloys. In this research, an acoustic emission (AE) sensing technique was applied to monitor the single-point diamond turning of RSA 443 on an ultrahigh-precision lathe. The machining process was carried out after careful selection of feed, speed and depths of cut. The monitoring process was achieved with a high-sampling-rate data acquisition system using different tools, while concurrent measurements of surface roughness and tool wear were initiated after covering a total feed distance of 13 km. An increasing trend in raw AE spikes and peak-to-peak signal was observed with increasing surface roughness and tool wear values. Hence, acoustic emission sensing proves to be an effective monitoring method for the machining of RSA 443 alloy.
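A minimal sketch of the trend extraction described above: block-wise peak-to-peak amplitude of the AE signal, which in the study rose together with surface roughness and tool wear. The signal below is synthetic:

    import numpy as np

    def peak_to_peak_trend(signal, block=4096):
        n = len(signal) // block
        blocks = signal[:n * block].reshape(n, block)
        return blocks.max(axis=1) - blocks.min(axis=1)

    ae = np.random.randn(100_000) * np.linspace(1.0, 2.0, 100_000)  # growing amplitude
    print(peak_to_peak_trend(ae)[:5])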
Advanced Self-Calibrating, Self-Repairing Data Acquisition System
NASA Technical Reports Server (NTRS)
Medelius, Pedro J. (Inventor); Eckhoff, Anthony J. (Inventor); Angel, Lucena R. (Inventor); Perotti, Jose M. (Inventor)
2002-01-01
An improved self-calibrating and self-repairing Data Acquisition System (DAS) for use in inaccessible areas, such as onboard spacecraft, capable of autonomously performing required system health checks and failure detection. When required, self-repair is implemented utilizing a "spare parts/tool box" system. The available number of spare components primarily depends upon each component's predicted reliability, which may be determined using Mean Time Between Failures (MTBF) analysis. Failing or degrading components are electronically removed and disabled to reduce power consumption before being electronically replaced with spare components.
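A worked example of the MTBF-based spare sizing named above (an illustrative Poisson model, not the patented method): expected failures over the mission set the spare count for a target confidence:

    from math import exp, factorial

    def spares_needed(mission_hours, mtbf_hours, confidence=0.99):
        lam = mission_hours / mtbf_hours        # expected failures (Poisson mean)
        k, cum = 0, exp(-lam)
        while cum < confidence:                 # accumulate the Poisson CDF
            k += 1
            cum += exp(-lam) * lam ** k / factorial(k)
        return k

    print(spares_needed(26_280, 50_000))        # 3-year mission, one channel -> 3 spares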
Optimal sampling with prior information of the image geometry in microfluidic MRI.
Han, S H; Cho, H; Paulsen, J L
2015-03-01
Recent advances in MRI acquisition for microscopic flows enable unprecedented sensitivity and speed in a portable NMR/MRI microfluidic analysis platform. However, the application of MRI to microfluidics usually suffers from prolonged acquisition times owing to the combination of the high resolution and wide field of view necessary to resolve details within microfluidic channels. When prior knowledge of the image geometry is available as a binarized image, as it is for microfluidic MRI, it is possible to reduce sampling requirements by incorporating this information into the reconstruction algorithm. The current approach to designing partial weighted random sampling schemes is to bias sampling toward the high-signal-energy portions of the binarized image geometry after Fourier transformation (i.e., in its k-space representation). Although this sampling prescription is frequently effective, it can be far from optimal in certain limiting cases, such as for a 1D channel, and more generally yields inefficient sampling schemes at low degrees of sub-sampling. This work explores the tradeoff between signal acquisition and incoherent sampling on image reconstruction quality given prior knowledge of the image geometry for weighted random sampling schemes, finding that the optimal distribution is determined not by maximizing the acquired signal but by interpreting its marginal change with respect to the sub-sampling rate. We develop a corresponding sampling design methodology that deterministically yields a near-optimal sampling distribution for image reconstructions incorporating knowledge of the image geometry. The technique robustly identifies optimal weighted random sampling schemes and provides improved reconstruction fidelity for multiple 1D and 2D images, when compared to prior techniques for sampling optimization given knowledge of the image geometry. Copyright © 2015 Elsevier Inc. All rights reserved.
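For readers who want to experiment with the baseline scheme this paper improves upon, the following is a minimal numpy sketch of energy-weighted random k-space sampling from a binarized geometry. The blend parameter alpha and the uniform floor are assumptions made for illustration; the paper's actual design method, which is driven by the marginal change of acquired signal with sub-sampling rate, is not reproduced here.

```python
import numpy as np

def weighted_random_mask(geometry, n_samples, alpha=0.5, rng=None):
    """Draw a k-space sampling mask biased toward the signal energy
    of a binarized image geometry (illustrative sketch only)."""
    rng = np.random.default_rng(rng)
    k = np.fft.fftshift(np.fft.fft2(geometry.astype(float)))
    energy = np.abs(k) ** 2
    # Blend the energy-weighted density with a uniform floor so that
    # low-energy regions keep some incoherent coverage.
    density = alpha * energy / energy.sum() + (1 - alpha) / energy.size
    p = density.ravel() / density.ravel().sum()
    idx = rng.choice(energy.size, size=n_samples, replace=False, p=p)
    mask = np.zeros(energy.size, dtype=bool)
    mask[idx] = True
    return mask.reshape(energy.shape)

# Example: a 1D-channel-like phantom, sampled at 20%
phantom = np.zeros((64, 64)); phantom[30:34, :] = 1
mask = weighted_random_mask(phantom, n_samples=int(0.2 * 64 * 64))
```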
Petty, Julia
2013-01-01
Learning technology is increasingly being implemented in programmes of blended learning within nurse education. With a growing emphasis on self-directed study, particularly in post-basic education, there is a need for learners to be guided in their learning away from practice and limited classroom time. Technology-enabled (TE) tools which engage learners actively can play a part in this. The effectiveness and value of interactive TE learning strategies within healthcare is the focus of this paper. To identify literature that explores the effectiveness of interactive TE tools on knowledge acquisition and learner satisfaction within healthcare, with a view to evaluating their use in post-basic nurse education. A literature review was performed focusing on papers exploring the comparative value and perceived benefit of TE tools compared to traditional modes of learning within healthcare. The databases identified as most suitable due to their relevance to healthcare were accessed through EBSCOhost. Primary, Boolean, and advanced searches on key terms were undertaken. Inclusion and exclusion criteria were applied, resulting in a final selection of 11 studies for critique. Analysis of the literature found that knowledge acquisition in most cases was enhanced, and measured learner satisfaction was generally positive, for interactive, self-regulated TE tools. However, TE education may not suit all learners, and this is critiqued in the light of the identified limitations. Interactive self-regulation and/or testing can be a valuable learning strategy that can be incorporated into self-directed programmes of study for post-registration learners. Whilst acknowledging the learning styles not suited to such tools, the concurrent use of self-directed TE tools with learning strategies necessitating a more social presence can work together to support enhancement of the knowledge required to deliver rationale for nursing practice. Copyright © 2012 Elsevier Ltd. All rights reserved.
French, Beverley; Thomas, Lois H; Baker, Paula; Burton, Christopher R; Pennington, Lindsay; Roddam, Hazel
2009-05-19
Given the current emphasis on networks as vehicles for innovation and change in health service delivery, the ability to conceptualize and measure organisational enablers for the social construction of knowledge merits attention. This study aimed to develop a composite tool to measure the organisational context for evidence-based practice (EBP) in healthcare. A structured search of the major healthcare and management databases for measurement tools from four domains: research utilisation (RU), research activity (RA), knowledge management (KM), and organisational learning (OL). Included studies were reports of the development or use of measurement tools that included organisational factors. Tools were appraised for face and content validity, plus development and testing methods. Measurement tool items were extracted, merged across the four domains, and categorised within a constructed framework describing the absorptive and receptive capacities of organisations. Thirty measurement tools were identified and appraised. Eighteen tools from the four domains were selected for item extraction and analysis. The constructed framework consists of seven categories relating to three core organisational attributes of vision, leadership, and a learning culture, and four stages of knowledge need, acquisition of new knowledge, knowledge sharing, and knowledge use. Measurement tools from RA or RU domains had more items relating to the categories of leadership, and acquisition of new knowledge; while tools from KM or learning organisation domains had more items relating to vision, learning culture, knowledge need, and knowledge sharing. There was equal emphasis on knowledge use in the different domains. If the translation of evidence into knowledge is viewed as socially mediated, tools to measure the organisational context of EBP in healthcare could be enhanced by consideration of related concepts from the organisational and management sciences. Comparison of measurement tools across domains suggests that there is scope within EBP for supplementing the current emphasis on human and technical resources to support information uptake and use by individuals. Consideration of measurement tools from the fields of KM and OL shows more content related to social mechanisms to facilitate knowledge recognition, translation, and transfer between individuals and groups.
NASA Astrophysics Data System (ADS)
Kodama, Kazuto
2015-02-01
This study proposes a new method for measuring transient magnetization of natural samples induced by a pulsed field with a duration of 11 ms using a pulse magnetizer. An experimental system was constructed, consisting of a pair of differential sensing coils connected to a high-speed digital oscilloscope for data acquisition. The data were transferred to a computer to obtain an initial magnetization curve and a descending branch of a hysteresis loop in a rapidly changing positive field. This system was tested with synthetic samples (permalloy ribbon, aluminum plate, and nickel powder) as well as two volcanic rock samples. Results from the synthetic samples showed considerable differences from those measured by a quasi-static method using a vibrating sample magnetometer (VSM). These differences were principally due to time-dependent magnetic properties or to electromagnetic effects, such as magnetic viscosity, eddy current loss, or magnetic relaxation. Results from the natural samples showed that the transient magnetization-field curves were largely comparable to the corresponding portions of the hysteresis loops. However, the relative magnetization (scaled to the saturation magnetization) at the end of a pulse was greater than that measured by a VSM. This discrepancy, together with the occurrence of rapid exponential decay after a pulse, indicates magnetic relaxations that could be interpreted in terms of domain wall displacement. These results suggest that with further development, the proposed technique can become a useful tool for characterizing magnetic particles contained in a variety of natural materials.
Sullivan, Julie M.; Prasanna, Pataje G. S.; Grace, Marcy B.; Wathen, Lynne; Wallace, Rodney L.; Koerner, John F.; Coleman, C. Norman
2013-01-01
Following a mass-casualty nuclear disaster, effective medical triage has the potential to save tens of thousands of lives. In order to best use the available scarce resources, there is an urgent need for biodosimetry tools to determine an individual’s radiation dose. Initial triage for radiation exposure will include location during the incident, symptoms, and physical examination. Stepwise triage will include point of care assessment of less than or greater than 2 Gy, followed by secondary assessment, possibly with high throughput screening, to further define an individual’s dose. Given the multisystem nature of radiation injury, it is unlikely that any single biodosimetry assay can be used as a stand-alone tool to meet the surge in capacity with the timeliness and accuracy needed. As part of the national preparedness and planning for a nuclear or radiological incident, we reviewed the primary literature to determine the capabilities and limitations of a number of biodosimetry assays currently available or under development for use in the initial and secondary triage of patients. Understanding the requirements from a response standpoint and the capability and logistics for the various assays will help inform future biodosimetry technology development and acquisition. Factors considered include: type of sample required, dose detection limit, time interval when the assay is feasible biologically, time for sample preparation and analysis, ease of use, logistical requirements, potential throughput, point-of-care capability, and the ability to support patient diagnosis and treatment within a therapeutically relevant time point. PMID:24162058
Avnet, Hagai; Mazaaki, Eyal; Shen, Ori; Cohen, Sarah; Yagel, Simcha
2016-01-01
We aimed to evaluate the use of spatiotemporal image correlation (STIC) as a tool for training nonexpert examiners to perform screening examinations of the fetal heart by acquiring and examining STIC volumes according to a standardized questionnaire based on the 5 transverse planes of the fetal heart. We conducted a prospective study at 2 tertiary care centers. Two sonographers without formal training in fetal echocardiography received theoretical instruction on the 5 fetal echocardiographic transverse planes, as well as STIC technology. Only women with conditions allowing 4-dimensional STIC volume acquisitions (grayscale and Doppler) were included in the study. Acquired volumes were evaluated offline according to a standardized protocol that required the trainee to mark 30 specified structures on 5 required axial planes. Volumes were then reviewed by an expert examiner for quality of acquisition and correct identification of specified structures. Ninety-six of 112 pregnant women examined entered the study. Patients had singleton pregnancies between 20 and 32 weeks' gestation. After an initial learning curve of 20 examinations, trainees succeeded in identifying 97% to 98% of structures, with a highly significant degree of agreement with the expert's analysis (P < .001). A median of 2 STIC volumes for each examination was necessary for maximal structure identification. Acquisition quality scores were high (8.6-8.7 of a maximal score of 10) and were found to correlate with identification rates (P = .017). After an initial learning curve and under expert guidance, STIC is an excellent tool for trainees to master extended screening examinations of the fetal heart.
A multimedia perioperative record keeper for clinical research.
Perrino, A C; Luther, M A; Phillips, D B; Levin, F L
1996-05-01
To develop a multimedia perioperative record keeper that provides: 1. synchronous, real-time acquisition of multimedia data, 2. on-line access to the patient's chart data, and 3. advanced data analysis capabilities through integrated multimedia database and analysis applications. To minimize cost and development time, the system design utilized industry-standard hardware components and graphical software development tools. The system was configured to use a Pentium PC complemented with a variety of hardware interfaces to external data sources. These sources included physiologic monitors with data in digital, analog, video, and audio as well as paper-based formats. The development process was guided by trials in over 80 clinical cases and by critiques from numerous users. As a result of this process, a suite of custom software applications was created to meet the design goals. The Perioperative Data Acquisition application manages data collection from a variety of physiological monitors. The Charter application provides for rapid creation of an electronic medical record from the patient's paper-based chart and investigator's notes. The Multimedia Medical Database application provides a relational database for the organization and management of multimedia data. The Triscreen application provides an integrated data analysis environment with simultaneous, full-motion data display. With recent technological advances in PC power, data acquisition hardware, and software development tools, the clinical researcher now has the ability to collect and examine a more complete perioperative record. It is hoped that the description of the MPR and its development process will assist and encourage others to advance these tools for perioperative research.
Use of a priori statistics to minimize acquisition time for RFI immune spread spectrum systems
NASA Technical Reports Server (NTRS)
Holmes, J. K.; Woo, K. T.
1978-01-01
The optimum acquisition sweep strategy was determined for a PN code despreader when the a priori probability density function is not uniform. A pseudo-noise spread spectrum system was considered which could be utilized in the DSN to combat radio frequency interference. In a sample case where the a priori probability density function was Gaussian, the acquisition time was reduced by about 41% compared to a uniform sweep approach.
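A toy numerical illustration of why a non-uniform (here Gaussian) a priori density rewards a probability-ordered sweep over a uniform one. The cell count, the prior's width, and the neglect of dwell time and detection probability are all simplifying assumptions, so the printed reduction only loosely parallels the 41% figure reported above.

```python
import numpy as np
from scipy.stats import norm

# Discretize a Gaussian a priori density over N code-phase cells and
# compare the expected search length of two sweep orders.
N = 1000
cells = np.arange(N)
prior = norm.pdf(cells, loc=N / 2, scale=N / 10)
prior /= prior.sum()

def expected_cells(order):
    # Expected number of cells examined before reaching the true phase.
    ranks = np.empty(N); ranks[order] = np.arange(1, N + 1)
    return (prior * ranks).sum()

uniform_sweep = cells                     # left-to-right sweep
greedy_sweep = np.argsort(prior)[::-1]    # most-probable-first sweep
print(expected_cells(uniform_sweep), expected_cells(greedy_sweep))
```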
LC-MS/MS imaging with thermal film-based laser microdissection.
Oya, Michiko; Suzuki, Hiromi; Anas, Andrea Roxanne J; Oishi, Koichi; Ono, Kenji; Yamaguchi, Shun; Eguchi, Megumi; Sawada, Makoto
2018-01-01
Mass spectrometry (MS) imaging is a useful tool for direct and simultaneous visualization of specific molecules. Liquid chromatography-tandem mass spectrometry (LC-MS/MS) is used to evaluate the abundance of molecules in tissues using sample homogenates. To date, however, LC-MS/MS has not been utilized as an imaging tool because spatial information is lost during sample preparation. Here we report a new approach for LC-MS/MS imaging using a thermal film-based laser microdissection (LMD) technique. To isolate tissue spots, our LMD system uses an 808-nm near infrared laser, the diameter of which can be freely changed from 2.7 to 500 μm; for imaging purposes in this study, the diameter was fixed at 40 μm, allowing acquisition of LC-MS/MS images at 40-μm resolution. The isolated spots are arranged on a thermal film at 4.5-mm intervals, corresponding to the well spacing on a 384-well plate. Each tissue spot is handled on the film in such a manner as to maintain its spatial information, allowing it to be extracted separately in its individual well. Using analytical LC-MS/MS in combination with the spatial information of each sample, we can reconstruct LC-MS/MS images. With this imaging technique, we successfully obtained the distributions of pilocarpine, glutamate, γ-aminobutyric acid, acetylcholine, and choline in a cross-section of mouse hippocampus. The protocol we established in this study is applicable to revealing the neurochemistry of the pilocarpine model of epilepsy. Our system has a wide range of uses in fields such as biology, pharmacology, pathology, and neuroscience. Graphical abstract: schematic of LMD-LC-MS/MS imaging.
Machado, V S; Oikonomou, G; Bicalho, M L S; Knauer, W A; Gilbert, R; Bicalho, R C
2012-10-12
The objective of this study was the use of metagenomic pyrosequencing of the 16S rRNA gene for the investigation of postpartum dairy cows' uterine bacterial diversity. The effect of subcutaneous supplementation of a trace mineral supplement containing Zn, Mn, Se, and Cu (Multimin North America, Inc., Fort Collins, CO) at 230 days of gestation and 260 days of gestation on dairy cows' uterine microbiota was also evaluated. Uterine lavage samples were collected at 35 DIM and were visually scored for the presence of purulent or mucopurulent secretion. The same samples were also used for the acquisition of bacterial DNA. The 16S rRNA genes were individually amplified from each sample. Pyrosequencing of the samples was carried out at the Cornell University Life Sciences Core Laboratories Center using Roche 454 GS-FLX System Titanium Chemistry. The Ribosomal Database Project online tools were used for the analysis of the obtained sequence library. Bacteroides spp., Ureaplasma spp., Fusobacterium spp., Peptostreptococcus spp., Sneathia spp., Prevotella spp. and Arcanobacterium spp. prevalence was significantly (P<0.05) higher in samples derived from cows that had a higher uterine lavage sample score. Bacteroides spp., Ureaplasma spp., Fusobacterium spp., and Arcanobacterium spp. prevalence was significantly (P<0.05) higher in samples derived from cows that were not pregnant by 200 DIM. Anaerococcus spp., Peptostreptococcus spp., Parabacteroides spp., and Propionibacterium spp. prevalence was significantly (P<0.05) lower in samples derived from cows that were trace mineral supplemented. Copyright © 2012 Elsevier B.V. All rights reserved.
Concrete thawing studied by single-point ramped imaging.
Prado, P J; Balcom, B J; Beyea, S D; Armstrong, R L; Bremner, T W
1997-12-01
A series of two-dimensional images of proton distribution in a hardened concrete sample has been obtained during the thawing process (from -50 degrees C up to 11 degrees C). The SPRITE sequence is optimal for this study given the characteristically short relaxation times of water in this porous medium (T2* < 200 μs and T1 < 3.6 ms). The relaxation parameters of the sample were determined in order to optimize the time efficiency of the sequence, permitting a 4-scan 64 x 64 acquisition in under 3 min. The image acquisition is fast on the time scale of the temperature evolution of the specimen. The frozen water distribution is quantified through a position-based study of the image contrast. A multiple-point acquisition method is presented and the signal sensitivity improvement is discussed.
NASA Astrophysics Data System (ADS)
Simini, F.; Santos, D.; Francescoli, L.
2016-04-01
We measure tibiofemoral contact-point migration to offer clinicians a tool for evaluating anterior cruciate ligament (ACL) reconstruction. The tool comprises a C-arm with fluoroscopy, an image acquisition and processing system, interactive software, and report generation for the clinical record. The procedure samples 30 images from the videofluoroscopy describing a 2-second hanging-to-full-extension movement of the knee articulation. A geometrical routine implemented in the original equipment (CINARTRO) helps capture the tibial plateau and femoral condyle profile through interaction with the user. The tightness or looseness of the knee is expressed by the migration, given as the movement of the femur along the tibial plateau as a percentage. We automatically create clinical reports in the standard Clinical Document Architecture (CDA) format. A special phantom was developed to correct the "pin cushion effect" in X-ray images. Five patients with ruptured ACLs were measured, giving meaningful results for clinical follow-up. Tibiofemoral contact-point migration was measured as 60% of the tibial plateau, with a standard deviation of 6% for healthy knees, 4% when injured, and 1% after reconstruction.
Single-scan 2D NMR: An Emerging Tool in Analytical Spectroscopy
Giraudeau, Patrick; Frydman, Lucio
2016-01-01
Two-dimensional Nuclear Magnetic Resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan manner. A decade ago a so-called "ultrafast" (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or hetero-nuclear correlations in a single scan. During the intervening years the performance of this sub-second 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool with an expanding scope of applications. The present review summarizes the principles and main developments that have contributed to the success of this approach, and focuses on applications recently demonstrated in various areas of analytical chemistry, from the real-time monitoring of chemical and biochemical processes to extensions in hyphenated techniques and in quantitative applications. PMID:25014342
Defense Acquisition University (DAU) Program Managers Tool Kit
2008-03-01
• strive for optimal solutions, seek better ways to manage, and provide lessons learned to those who follow; • be candid about program status, including ... practices, lessons learned, and risks to avoid; • validated practices with consistent, verifiable information; • an active knowledge base to help with ...
2016-08-01
recently incorporated a new emotional intelligence tool with the goal of enhancing the "Emotional Quotient" (EQ) of our workforce. This tool is the EQi 2.0® (Emotional Quotient Inventory, Version 2.0). The concept of emotional intelligence goes back to 1983 ... when Daniel Goleman, a science writer for the New York Times, published his book Emotional Intelligence: Why It Can Matter More Than IQ in 1995
High-throughput hyperpolarized 13C metabolic investigations using a multi-channel acquisition system
NASA Astrophysics Data System (ADS)
Lee, Jaehyuk; Ramirez, Marc S.; Walker, Christopher M.; Chen, Yunyun; Yi, Stacey; Sandulache, Vlad C.; Lai, Stephen Y.; Bankson, James A.
2015-11-01
Magnetic resonance imaging and spectroscopy of hyperpolarized (HP) compounds such as [1-13C]-pyruvate have shown tremendous potential for offering new insight into disease and response to therapy. New applications of this technology in clinical research and care will require extensive validation in cells and animal models, a process that may be limited by the high cost and modest throughput associated with dynamic nuclear polarization. The relatively wide spectral separation between [1-13C]-pyruvate and its chemical endpoints in vivo is conducive to simultaneous multi-sample measurements, even in the presence of a suboptimal global shim. Multi-channel acquisitions could conserve costs and accelerate experiments by allowing acquisition from multiple independent samples following a single dissolution. Unfortunately, many existing preclinical MRI systems are equipped with only a single channel for broadband acquisitions. In this work, we examine the feasibility of this concept using a broadband multi-channel digital receiver extension and detector arrays that allow concurrent measurement of dynamic spectroscopic data from ex vivo enzyme phantoms, in vitro anaplastic thyroid carcinoma cells, and in vivo in tumor-bearing mice. Throughput and the cost of consumables were improved by up to a factor of four. These preliminary results demonstrate the potential for efficient multi-sample studies employing hyperpolarized agents.
Quantitative Analysis of Electron Beam Damage in Organic Thin Films
2017-01-01
In transmission electron microscopy (TEM) the interaction of an electron beam with polymers such as P3HT:PCBM photovoltaic nanocomposites results in electron beam damage, which is the most important factor limiting acquisition of structural or chemical data at high spatial resolution. Beam effects can vary depending on parameters such as electron dose rate, temperature during imaging, and the presence of water and oxygen in the sample. Furthermore, beam damage will occur at different length scales. To assess beam damage at the angstrom scale, we followed the intensity of P3HT and PCBM diffraction rings as a function of accumulated electron dose by acquiring dose series and varying the electron dose rate, sample preparation, and the temperature during acquisition. From this, we calculated a critical dose for diffraction experiments. In imaging mode, thin film deformation was assessed using the normalized cross-correlation coefficient, while mass loss was determined via changes in average intensity and standard deviation, also varying electron dose rate, sample preparation, and temperature during acquisition. The understanding of beam damage and the determination of critical electron doses provides a framework for future experiments to maximize the information content during the acquisition of images and diffraction patterns with (cryogenic) transmission electron microscopy. PMID:28553431
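The critical dose is conventionally extracted by fitting an exponential fading model to the diffraction-ring intensity as a function of accumulated dose. A minimal sketch of such a fit follows; the fading model I(D) = I0·exp(-D/Dc) is the common convention, and the data points are entirely hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

# Critical-dose estimation from a diffraction-ring fade series
# (sketch; assumes single-exponential fading, Dc in e-/A^2).
def fading(dose, i0, dc):
    return i0 * np.exp(-dose / dc)

dose = np.array([0, 50, 100, 200, 400, 800.0])             # accumulated e-/A^2
intensity = np.array([1.0, 0.81, 0.67, 0.45, 0.20, 0.04])  # hypothetical
(i0, dc), _ = curve_fit(fading, dose, intensity, p0=(1.0, 200.0))
print(f"critical dose ~ {dc:.0f} e-/A^2")
```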
The road to successful ITS software acquisition : executive summary
DOT National Transportation Integrated Search
1999-04-01
The Long Term Pavement Performance (LTPP) program was established to support a broad range of pavement performance analyses leading to improved engineering tools to design, construct, and manage pavements. Since 1989, LTPP has collected data on the p...
NASA Technical Reports Server (NTRS)
Modesitt, Kenneth L.
1990-01-01
A prediction was made that the terms expert systems and knowledge acquisition would begin to disappear over the next several years. This is not because they are falling into disuse; it is rather that practitioners are realizing that they are valuable adjuncts to software engineering, in terms of problem domains addressed, user acceptance, and in development methodologies. A specific problem was discussed, that of constructing an automated test analysis system for the Space Shuttle Main Engine. In this domain, knowledge acquisition was part of requirements systems analysis, and was performed with the aid of a powerful inductive ESBT in conjunction with a computer aided software engineering (CASE) tool. The original prediction is not a very risky one -- it has already been accomplished.
Synchronization software for automation in anesthesia.
Bressan, Nadja; Castro, Ana; Brás, Susana; Oliveira, Hélder P; Ribeiro, Lénio; Ferreira, David A; Antunes, Luís; Amorim, Pedro; Nunes, Catarina S
2007-01-01
This work presents the development of software for data acquisition and control (ASYS) in a clinical setup. Similar to an industrial Supervisory Control And Data Acquisition (SCADA) system, the software assembles Target Controlled Infusion (TCI) monitoring and supervisory control data in real time from devices in a surgical room. The software is not a full controller, since TCI systems require continuous interaction from the anesthesiologist. Based on pharmacokinetic models, the effect-site and plasma concentrations can be related to the drug dose infused, and vice versa. The software determines the infusion rates of the drug, which are issued as commands to the infusion pumps. This software provides the anesthesiologist with a trustworthy tool for managing a safe and balanced anesthesia, and it also incorporates the acquisition and display of the patient's brain signals.
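As a rough illustration of the TCI principle described above, here is a one-compartment sketch; clinical systems such as ASYS rely on richer three-compartment pharmacokinetic models, and the volume, rate constant, and controller gain below are hypothetical values chosen only to make the simulation run.

```python
# One-compartment TCI sketch (illustrative; not the ASYS algorithm).
V, k_e, dt = 12.0, 0.002, 1.0      # volume (L), elimination (1/s), step (s)
c, c_target, log = 0.0, 3.0, []    # plasma concentration (mg/L)

for _ in range(600):               # ten minutes, one command per second
    # Proportional controller: high rate early on (loading behaviour),
    # settling toward the steady-state rate u = V * k_e * c_target.
    u = max(0.0, V * (k_e * c + 0.05 * (c_target - c)))   # mg/s
    c += dt * (u / V - k_e * c)    # Euler step of dc/dt = u/V - k_e*c
    log.append((u, c))
print(f"final concentration: {log[-1][1]:.2f} mg/L")
```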
Cyclostationarity approach for monitoring chatter and tool wear in high speed milling
NASA Astrophysics Data System (ADS)
Lamraoui, M.; Thomas, M.; El Badaoui, M.
2014-02-01
Detection of chatter and tool wear is crucial in the machining process, and their monitoring is a key issue for (1) ensuring better surface quality, (2) increasing productivity, and (3) protecting both the machine and the workpiece. This paper presents an investigation of chatter and tool wear using cyclostationary methods to process vibration signals acquired during high-speed milling. Experimental cutting tests were performed on a slot milling operation in aluminum alloy. The experimental setup was designed for the acquisition of accelerometer signals and encoder information. The encoder signal is used to re-sample the accelerometer signals in the angular domain, using a specific algorithm developed in the LASPI laboratory. Cyclostationary analysis of the accelerometer signals was applied to monitor chatter and tool wear in high-speed milling. Cyclostationarity appears in the average properties (first order) of the signals and in their energetic properties (second order), and it generates spectral lines at cyclic frequencies in the spectral correlation. Angular power and angular kurtosis are used to analyze the chatter phenomenon. The formation of chatter is characterized by unstable, chaotic motion of the tool and strong anomalous fluctuations of the cutting forces. Results show that stable machining generates only very few second-order cyclostationary components, while chatter is strongly correlated with second-order cyclostationary components. When machining in the unstable region, chatter results in flat angular kurtosis and flat angular power, like a pseudo-random (white) signal with a flat spectrum. Results reveal that the spectral correlation and the Wigner-Ville spectrum (or integrated Wigner-Ville) derived from second-order cyclostationarity are efficient indicators for the early diagnosis of faults in high-speed machining, such as chatter, tool wear, and bearing faults, compared to traditional stationary methods. The Wigner-Ville representation of the residual signal shows that the energy corresponding to the tooth passing decreases when the chatter phenomenon occurs. The effect of tool wear and the number of broken teeth on the excitation of structural resonances also appears in the Wigner-Ville representation.
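A compact sketch of the angular-domain preprocessing this approach depends on: re-sample the accelerometer signal at uniform shaft-angle increments from the encoder, then compute kurtosis per angular position across revolutions. This is a generic re-sampling illustration, not the LASPI algorithm itself.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.stats import kurtosis

def angular_resample(accel, encoder_angle, samples_per_rev=512):
    """Re-sample a time-domain signal at uniform shaft-angle increments
    using the (unwrapped) encoder angle (illustrative sketch)."""
    angle = np.unwrap(encoder_angle)
    uniform = np.arange(angle[0], angle[-1], 2 * np.pi / samples_per_rev)
    return interp1d(angle, accel, kind="linear")(uniform)

def angular_kurtosis(x, samples_per_rev=512):
    """Kurtosis at each angular position, computed across revolutions;
    chatter tends to flatten it, per the findings above."""
    n_rev = len(x) // samples_per_rev
    revs = x[: n_rev * samples_per_rev].reshape(n_rev, samples_per_rev)
    return kurtosis(revs, axis=0)

# Demo on synthetic data: 10 revolutions of a 3x-per-rev component
theta = np.linspace(0, 20 * np.pi, 8000)
sig = np.sin(3 * theta) + 0.1 * np.random.randn(theta.size)
ak = angular_kurtosis(angular_resample(sig, theta))
```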
Fluorescent Cell Barcoding for Multiplex Flow Cytometry
Krutzik, Peter O.; Clutter, Matthew R.; Trejo, Angelica; Nolan, Garry P.
2011-01-01
Fluorescent Cell Barcoding (FCB) enables high throughput, i.e. high content flow cytometry by multiplexing samples prior to staining and acquisition on the cytometer. Individual cell samples are barcoded, or labeled, with unique signatures of fluorescent dyes so that they can be mixed together, stained, and analyzed as a single sample. By mixing samples prior to staining, antibody consumption is typically reduced 10 to 100-fold. In addition, data robustness is increased through the combination of control and treated samples, which minimizes pipetting error, staining variation, and the need for normalization. Finally, speed of acquisition is enhanced, enabling large profiling experiments to be run with standard cytometer hardware. In this unit, we outline the steps necessary to apply the FCB method to cell lines as well as primary peripheral blood samples. Important technical considerations such as choice of barcoding dyes, concentrations, labeling buffers, compensation, and software analysis are discussed. PMID:21207359
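The multiplexing capacity follows directly from the combinatorics of dye intensity levels: with two barcoding dyes at m and n distinguishable levels, m x n samples can share one tube. A tiny sketch follows; the dye names and level counts are hypothetical, not the protocol's prescribed values.

```python
from itertools import product

# Barcoding capacity: unique signatures grow as the product of
# intensity levels per dye (hypothetical dyes and levels).
dyes = {"Pacific Blue": [0, 1, 2, 3], "DyLight 800": [0, 1, 2, 3, 4, 5]}
signatures = list(product(*dyes.values()))
print(len(signatures), "samples can be multiplexed")  # 4 * 6 = 24
```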
Factors Influencing Consonant Acquisition in Brazilian Portuguese-Speaking Children
ERIC Educational Resources Information Center
Ceron, Marizete Ilha; Gubiani, Marileda Barichello; de Oliveira, Camila Rosa; Keske-Soares, Márcia
2017-01-01
Purpose: We sought to provide valid and reliable data on the acquisition of consonant sounds in speakers of Brazilian Portuguese. Method: The sample comprised 733 typically developing monolingual speakers of Brazilian Portuguese (ages 3;0-8;11 [years;months]). The presence of surface speech error patterns, the revised percentage consonants…
Rapid Profile: A Second Language Screening Procedure.
ERIC Educational Resources Information Center
Mackey, Alison; And Others
1991-01-01
Rapid Profile, developed by Manfred Pienemann of National Languages Institute of Australia/Language Acquisition Research Centre, is a computer-based procedure for screening speech samples collected from language learners to assess their level of language development as compared to standard patterns in the acquisition of the target language. Rapid…
ERIC Educational Resources Information Center
Coyle-Rogers, Patricia G.; Rogers, George E.
A study determined whether there are any differences in the adaptive competency acquisition between technology education teachers who have completed a school district add-on alternative certification process and technology education teachers who completed a traditional baccalaureate degree certification program. Non-probability sampling was used…
IRTs of the ABCs: Children's Letter Name Acquisition
ERIC Educational Resources Information Center
Phillips, Beth M.; Piasta, Shayne B.; Anthony, Jason L.; Lonigan, Christopher J.; Francis, David J.
2012-01-01
We examined the developmental sequence of letter name knowledge acquisition by children from 2 to 5 years of age. Data from 2 samples representing diverse regions, ethnicity, and socioeconomic backgrounds (ns=1074 and 500) were analyzed using item response theory (IRT) and differential item functioning techniques. Results from factor analyses…
Usage-Based Language: Investigating the Latent Structures That Underpin Acquisition
ERIC Educational Resources Information Center
Ellis, Nick C.; O'Donnell, Matthew Brook; Romer, Ute
2013-01-01
Each of us as language learners had different language experiences, yet somehow we have converged upon broadly the same language system. From diverse, often noisy samples, we have attained similar linguistic competence. How so? What mechanisms channel language acquisition? Could our linguistic commonalities possibly have converged from our shared…
Characteristics of Marijuana Acquisition among a National Sample of Adolescent Users
ERIC Educational Resources Information Center
King, Keith A.; Merianos, Ashley L.; Vidourek, Rebecca A.
2016-01-01
Background: Because marijuana is becoming more accessible and perceived norms of use are becoming increasingly more favorable, research is needed to understand characteristics of marijuana acquisition among adolescents. Purpose: The study purpose was to examine whether sources and locations where adolescent users obtain and use marijuana differed…
Lobas, Anna A; Solovyeva, Elizaveta M; Sidorenko, Alena S; Gorshkov, Vladimir; Kjeldsen, Frank; Bubis, Julia A; Ivanov, Mark V; Ilina, Irina Y; Moshkovskii, Sergei A; Chumakov, Peter M; Gorshkov, Mikhail V
2018-01-01
An acquisition of increased sensitivity of cancer cells to viruses is a common outcome of malignant progression that justifies the development of oncolytic viruses as anticancer therapeutics. Studying molecular changes that underlie the sensitivity to viruses would help to identify cases where oncolytic virus therapy would be most effective. We quantified changes in protein abundances in two glioblastoma multiforme (GBM) cell lines that differ in the ability to induce resistance to vesicular stomatitis virus (VSV) infection in response to type I interferon (IFN) treatment. In IFN-treated samples we observed an up-regulation of protein products of some IFN-regulated genes (IRGs). In total, the proteome analysis revealed up to 20% more proteins encoded by IRGs in the glioblastoma cell line that develops resistance to VSV infection after pre-treatment with IFN. In both cell lines protein-protein interaction and signaling pathway analyses revealed a significant stimulation of processes related to type I IFN signaling and defense responses to viruses. However, we observed a deficiency in STAT2 protein in the VSV-sensitive cell line that suggests a de-regulation of JAK/STAT/IRF9 signaling. The study has shown that the up-regulation of IRG proteins induced by IFNα treatment of GBM cells can be detected at the proteome level. Similar analyses could be applied for revealing functional alterations within the antiviral mechanisms in glioblastoma samples that are accompanied by acquisition of sensitivity to oncolytic viruses. The approach can be useful for discovering biomarkers that predict a potential sensitivity of individual glioblastoma tumors to oncolytic virus therapy. PMID:29416731
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wagar, M; Bhagwat, M; O’Farrell, D
2015-06-15
Purpose: There are unique obstacles to implementing the MatriXX ion-chamber array as a QA tool in brachytherapy, given that the device is designed for use in the MV energy range. One of the challenges we investigate is the effect of acquisition rates on dose measurement accuracy for HDR treatment plans. Methods: A treatment plan was optimized in the Oncentra Brachy TPS to deliver a planar dose to a 5×5 cm region at 10 mm depth. The applicator was affixed to the surface of the MatriXX array. The plan was delivered multiple times using a Nucletron HDR afterloader with a 2.9 Ci Ir-192 source. For each measurement the sampling rate of the MatriXX movie mode was varied (30 ms and 500 ms). This experiment was repeated with identical parameters, following a source exchange, with an 11.2 Ci Ir-192 source. Finally, a single snap measurement was acquired. Analysis was performed to evaluate the fidelity of the dose delivery for each iteration of the experiment. Evaluation was based on the comparison between the measured and TPS-predicted dose. Results: Higher sampling rates induce a greater discrepancy between the predicted and measured dose. Delivering the plan using a lower-activity source also produced greater discrepancy in the measurement, due to the increased delivery time. The single snap measurement showed little difference from the 500 ms integral dose measurement. Conclusion: The advantage of using movie mode for HDR treatment delivery QA is the ability to track the source in real time in addition to measuring dose. Our analysis indicates that 500 ms is an optimal frame rate.
Increasingly automated procedure acquisition in dynamic systems
NASA Technical Reports Server (NTRS)
Mathe, Nathalie; Kedar, Smadar
1992-01-01
Procedures are widely used by operators for controlling complex dynamic systems. Currently, most development of such procedures is done manually, consuming a large amount of paper, time, and manpower. While automated knowledge acquisition is an active field of research, not much attention has been paid to the problem of computer-assisted acquisition and refinement of complex procedures for dynamic systems. The Procedure Acquisition for Reactive Control Assistant (PARC) is designed to assist users in more systematically and automatically encoding and refining complex procedures. PARC is able to elicit knowledge interactively from the user during operation of the dynamic system. We categorize procedure refinement into two stages: diagnosis (diagnose the failure and choose a repair) and repair (plan and perform the repair). The basic approach taken in PARC is to assist the user in all steps of this process by providing increased levels of assistance with layered tools. We illustrate the operation of PARC in refining procedures for the control of a robot arm.
An advanced artificial intelligence tool for menu design.
Khan, Abdus Salam; Hoffmann, Achim
2003-01-01
Computer-assisted menu design remains a difficult task. Usually, the knowledge that aids in menu design by a computer is hard-coded, so a computerised menu planner cannot handle the menu design problem for an unanticipated client. To address this problem we developed a menu design tool, MIKAS (menu construction using incremental knowledge acquisition system), an artificial intelligence system that allows the incremental development of a knowledge base for menu design. We allow an incremental knowledge acquisition process in which the expert is only required to provide hints to the system in the context of actual problem instances during menu design, using menus stored in a so-called case base. Our system incorporates Case-Based Reasoning (CBR), an Artificial Intelligence (AI) technique developed to mimic human problem-solving behaviour. Ripple Down Rules (RDR) are a proven technique for acquiring classification knowledge directly from experts while they are using the system, and they complement CBR in a very fruitful way. This combination allows the incremental improvement of the menu design system while it is already in routine use. We believe MIKAS allows better dietary practice by leveraging a dietitian's skills and expertise. As such, MIKAS has the potential to be helpful for any institution where dietary advice is practised.
Taylor, Philip D; Brzustowski, John M; Matkovich, Carolyn; Peckford, Michael L; Wilson, Dave
2010-10-26
Radar has been used for decades to study movement of insects, birds and bats. In spite of this, there are few readily available software tools for the acquisition, storage and processing of such data. Program radR was developed to solve this problem. Program radR is an open source software tool for the acquisition, storage and analysis of data from marine radars operating in surveillance mode. radR takes time series data with a two-dimensional spatial component as input from some source (typically a radar digitizing card) and extracts and retains information of biological relevance (i.e. moving targets). Low-level data processing is implemented in "C" code, but user-defined functions written in the "R" statistical programming language can be called at pre-defined steps in the calculations. Output data formats are designed to allow for future inclusion of additional data items without requiring change to C code. Two brands of radar digitizing card are currently supported as data sources. We also provide an overview of the basic considerations of setting up and running a biological radar study. Program radR provides a convenient, open source platform for the acquisition and analysis of radar data of biological targets.
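As an illustration of the kind of biological-target extraction radR performs, a toy per-pixel background-subtraction sketch follows. It assumes the first scans are target-free and is not radR's actual algorithm, which runs low-level processing in C with user-defined R hooks.

```python
import numpy as np

def extract_targets(frames, n_bg=20, k=4.0):
    """Toy analogue of radar target extraction: learn a per-pixel
    background from the first n_bg scans (assumed target-free), then
    flag pixels whose echo strength exceeds mean + k*std."""
    bg = frames[:n_bg]
    mu, sigma = bg.mean(axis=0), bg.std(axis=0) + 1e-9
    hits = []
    for i, f in enumerate(frames[n_bg:], start=n_bg):
        mask = f > mu + k * sigma
        ys, xs = np.nonzero(mask)          # candidate moving targets
        hits.append((i, list(zip(xs, ys))))
    return hits

# Example: 100 synthetic 64x64 scans with noise only
scans = np.random.rayleigh(1.0, size=(100, 64, 64))
targets = extract_targets(scans)
```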
Muir, Dylan R; Kampa, Björn M
2014-01-01
Two-photon calcium imaging of neuronal responses is an increasingly accessible technology for probing population responses in cortex at single cell resolution, and with reasonable and improving temporal resolution. However, analysis of two-photon data is usually performed using ad-hoc solutions. To date, no publicly available software exists for straightforward analysis of stimulus-triggered two-photon imaging experiments. In addition, the increasing data rates of two-photon acquisition systems imply increasing cost of computing hardware required for in-memory analysis. Here we present a Matlab toolbox, FocusStack, for simple and efficient analysis of two-photon calcium imaging stacks on consumer-level hardware, with minimal memory footprint. We also present a Matlab toolbox, StimServer, for generation and sequencing of visual stimuli, designed to be triggered over a network link from a two-photon acquisition system. FocusStack is compatible out of the box with several existing two-photon acquisition systems, and is simple to adapt to arbitrary binary file formats. Analysis tools such as stack alignment for movement correction, automated cell detection and peri-stimulus time histograms are already provided, and further tools can be easily incorporated. Both packages are available as publicly-accessible source-code repositories.
Li, Xueming; Zheng, Shawn; Agard, David A.; Cheng, Yifan
2015-01-01
Newly developed direct electron detection cameras have a high image output frame rate that enables recording dose-fractionated image stacks of frozen hydrated biological samples by electron cryomicroscopy (cryoEM). Such novel image acquisition schemes provide opportunities to analyze cryoEM data in ways that were previously impossible. The file size of a dose-fractionated image stack is 20 to 60 times larger than that of a single image. Thus, efficient data acquisition and on-the-fly analysis of a large number of dose-fractionated image stacks become a serious challenge to any cryoEM data acquisition system. We have developed a computer-assisted system, named UCSFImage4, for semi-automated cryoEM image acquisition that implements an asynchronous data acquisition scheme. This facilitates efficient acquisition, on-the-fly motion correction, and CTF analysis of dose-fractionated image stacks with a total time of ~60 seconds/exposure. Here we report the technical details and configuration of this system. PMID:26370395
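The asynchronous scheme can be pictured as a producer-consumer pipeline in which acquisition never waits for analysis. A minimal sketch follows; the file names and the worker body are placeholders, not UCSFImage4's implementation.

```python
import queue
import threading

# Acquisition thread keeps exposing while a worker processes stacks,
# so on-the-fly analysis never blocks the microscope.
stack_queue = queue.Queue(maxsize=8)

def acquire(n_exposures):
    for i in range(n_exposures):
        stack = f"stack_{i:04d}.mrc"   # placeholder for a real exposure
        stack_queue.put(stack)
    stack_queue.put(None)              # sentinel: acquisition finished

def process():
    while (stack := stack_queue.get()) is not None:
        # motion correction and CTF estimation would run here
        print("processed", stack)

threading.Thread(target=acquire, args=(5,)).start()
process()
```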
Automated liver sampling using a gradient dual-echo Dixon-based technique.
Bashir, Mustafa R; Dale, Brian M; Merkle, Elmar M; Boll, Daniel T
2012-05-01
Magnetic resonance spectroscopy of the liver requires input from a physicist or physician at the time of acquisition to ensure proper voxel selection, while in multiecho chemical shift imaging, numerous regions of interest must be manually selected to ensure analysis of a representative portion of the liver parenchyma. A fully automated technique could improve workflow by selecting representative portions of the liver prior to human analysis. Complete volumes from three-dimensional gradient dual-echo acquisitions with two-point Dixon reconstruction, acquired at 1.5 and 3 T, were analyzed in 100 subjects using an automated liver sampling algorithm based on ratio pairs calculated voxel-by-voxel from signal intensity image data as fat-only/water-only and log(in-phase/opposed-phase). Using different gridding variations of the algorithm, the average correctly sampled liver volume ranged from 527 to 733 mL. The average percentage of the sample located within the liver ranged from 95.4 to 97.1%, whereas the average incorrectly selected volume was 16.5-35.4 mL (2.9-4.6%). Average run time was 19.7-79.0 s. The algorithm consistently selected large samples of the hepatic parenchyma with small amounts of erroneous extrahepatic sampling, and run times were feasible for execution on an MRI system console during exam acquisition. Copyright © 2011 Wiley Periodicals, Inc.
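A voxel-wise sketch of the two ratio pairs named above, assuming the four Dixon-derived volumes are already available as arrays; the liver-cluster selection step that follows in the actual algorithm is omitted here.

```python
import numpy as np

def dixon_ratio_pairs(fat, water, in_phase, opposed_phase, eps=1e-6):
    """Voxel-wise ratio pairs for automated liver sampling, as described
    above: fat-only/water-only and log(IP/OP) (illustrative sketch)."""
    r1 = fat / (water + eps)
    r2 = np.log((in_phase + eps) / (opposed_phase + eps))
    return r1, r2

# Voxels whose ratio pair falls inside a liver-typical cluster would be
# retained as the automated sample; the clustering step is not shown.
```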
Full information acquisition in scanning probe microscopy and spectroscopy
Jesse, Stephen; Belianinov, Alex; Kalinin, Sergei V.; Somnath, Suhas
2017-04-04
Apparatus and methods are described for scanning probe microscopy and spectroscopy based on acquisition of full probe response. The full probe response contains valuable information about the probe-sample interaction that is lost in traditional scanning probe microscopy and spectroscopy methods. The full probe response is analyzed post data acquisition using fast Fourier transform and adaptive filtering, as well as multivariate analysis. The full response data is further compressed to retain only statistically significant components before being permanently stored.
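Retaining "only statistically significant components" suggests a truncated multivariate decomposition. A hedged PCA/SVD sketch follows; the variance-retention threshold is an assumption, not a value from the patent.

```python
import numpy as np

def compress_probe_response(responses, var_kept=0.99):
    """Compress full probe responses (rows = probe records, columns =
    time points) by keeping only the leading principal components that
    explain var_kept of the variance (illustrative sketch)."""
    x = responses - responses.mean(axis=0)
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    explained = np.cumsum(s**2) / np.sum(s**2)
    keep = int(np.searchsorted(explained, var_kept)) + 1
    scores = u[:, :keep] * s[:keep]      # compact representation to store
    return scores, vt[:keep]             # components allow reconstruction
```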
Automated Thermal Sample Acquisition with Applications
NASA Astrophysics Data System (ADS)
Kooshesh, K. A.; Lineberger, D. H.
2012-03-01
We created an Arduino®-based robot to detect samples subjected to an experiment, perform measurements once each sample is located, and store the results for further analysis. We then relate the robot’s performance to an experiment on thermal inertia.
Statistical Analysis Tools for Learning in Engineering Laboratories.
ERIC Educational Resources Information Center
Maher, Carolyn A.
1990-01-01
Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…
LCACCESS: PROMOTING THE USE OF LIFE CYCLE ASSESSMENT
Evaluating environmental impacts holistically from raw material acquisition, through manufacture, to use and disposal using a life cycle perspective is gradually being viewed by environmental managers and decision-makers as an important element in the tools that are used to achie...
EPICS-based control and data acquisition for the APS slope profiler (Conference Presentation)
NASA Astrophysics Data System (ADS)
Sullivan, Joseph; Assoufid, Lahsen; Qian, Jun; Jemian, Peter R.; Mooney, Tim; Rivers, Mark L.; Goetze, Kurt; Sluiter, Ronald L.; Lang, Keenan
2016-09-01
The motion control, data acquisition and analysis system for APS Slope Measuring Profiler was implemented using the Experimental Physics and Industrial Control System (EPICS). EPICS was designed as a framework with software tools and applications that provide a software infrastructure used in building distributed control systems to operate devices such as particle accelerators, large experiments and major telescopes. EPICS was chosen to implement the APS Slope Measuring Profiler because it is also applicable to single purpose systems. The control and data handling capability available in the EPICS framework provides the basic functionality needed for high precision X-ray mirror measurement. Those built in capabilities include hardware integration of high-performance motion control systems (3-axis gantry and tip-tilt stages), mirror measurement devices (autocollimator, laser spot camera) and temperature sensors. Scanning the mirror and taking measurements was accomplished with an EPICS feature (the sscan record) which synchronizes motor positioning with measurement triggers and data storage. Various mirror scanning modes were automatically configured using EPICS built-in scripting. EPICS tools also provide low-level image processing (areaDetector). Operation screens were created using EPICS-aware GUI screen development tools.
Low frequency noise elimination technique for 24-bit Σ-Δ data acquisition systems.
Qu, Shao-Bo; Robert, Olivier; Lognonné, Philippe; Zhou, Ze-Bing; Yang, Shan-Qing
2015-03-01
Low-frequency 1/f noise is one of the key limiting factors of high-precision measurement instruments. In this paper, digital correlated double sampling is implemented to reduce the offset and low-frequency 1/f noise of a data acquisition system with a 24-bit sigma-delta (Σ-Δ) analog-to-digital converter (ADC). The input voltage is modulated by cross-coupled switches, which are synchronized to the sampling clock, and converted into a digital signal by the ADC. By using a proper switching frequency, the unwanted parasitic signal frequencies generated by the switches are avoided. The noise elimination is performed according to the principle of digital correlated double sampling, which is equivalent to a time-shifted subtraction of the sampled voltage. The low-frequency 1/f noise spectral density of the data acquisition system is thereby made flat down to the lower limit of the measurement frequency, which is about 0.0001 Hz in this paper. The noise spectral density is reduced by more than 60 dB at 0.0001 Hz, with a residual noise floor of (9 ± 2) nV/Hz(1/2), limited by the intrinsic white noise floor of the ADC above its corner frequency.
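The "time-shifted subtraction" can be demonstrated in a few lines: pair each chopped "+" sample with the adjacent "-" sample so that offset and slow drift cancel while the modulated signal survives. The drift model and the numbers below are illustrative only, not the paper's measured figures.

```python
import numpy as np

def digital_cds(adc_samples, chop_state):
    """Digital correlated double sampling (sketch): the input was chopped
    by cross-coupled switches synchronized to the ADC clock, so halving
    the difference of adjacent '+' and '-' samples cancels offset and
    slow 1/f drift while recovering the input voltage."""
    plus = adc_samples[chop_state == 1]
    minus = adc_samples[chop_state == -1]
    n = min(len(plus), len(minus))
    return 0.5 * (plus[:n] - minus[:n])

# Example: 1 V input chopped at half the sampling clock, with drift
t = np.arange(1000)
chop = np.where(t % 2 == 0, 1, -1)
drift = 1e-3 * t                                  # slow 1/f-like drift
samples = chop * 1.0 + drift + 0.05 + 1e-4 * np.random.randn(t.size)
print(digital_cds(samples, chop).mean())          # ~1.0 V recovered
```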
NASA Astrophysics Data System (ADS)
Liu, Yuanyuan; Peng, Yankun; Zhang, Leilei; Dhakal, Sagar; Wang, Caiping
2014-05-01
Pork is one of the most widely consumed meats in the world. With improving living standards, stakeholders, including consumers and regulatory bodies, pay increasing attention to the comprehensive quality of fresh pork. Various analytical, laboratory-based technologies exist to determine the quality attributes of pork; however, none meets the industry's need for rapid, non-destructive measurement. The current study used an optical instrument as a rapid and non-destructive tool to classify 24-h-aged pork longissimus dorsi samples into three classes of meat (PSE, Normal, and DFD) on the basis of color L* and pH24. A total of 66 samples were used in the experiment. A Vis/NIR spectral acquisition system (300-1100 nm) was developed in the laboratory to acquire the spectral signal of the pork samples. A median smoothing filter (M-filter) and multiplicative scatter correction (MSC) were used to remove spectral noise and signal drift. A support vector machine (SVM) prediction model was developed to classify the samples based on their comprehensive quality. The results showed that the classification model is highly correlated with the actual quality parameters, with a classification accuracy above 85%. Being simple and easy to use, with promising results, the system could be used in the meat processing industry for real-time, non-destructive, and rapid detection of pork quality in the future.
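A hedged outline of the processing chain named above (M-filter, then MSC, then SVM), using scikit-learn; the window length, kernel, and the choice of the mean spectrum as the MSC reference are assumptions, not the paper's settings.

```python
import numpy as np
from scipy.signal import medfilt
from sklearn.svm import SVC

def msc(spectra):
    """Multiplicative scatter correction against the mean spectrum."""
    ref = spectra.mean(axis=0)
    out = np.empty_like(spectra)
    for i, s in enumerate(spectra):
        b, a = np.polyfit(ref, s, 1)   # fit s ~ a + b * ref
        out[i] = (s - a) / b           # remove additive/multiplicative drift
    return out

def classify(X, y):
    """X: spectra (n_samples x n_wavelengths); y: PSE/Normal/DFD labels."""
    X = np.apply_along_axis(medfilt, 1, X, 5)   # median smoothing filter
    X = msc(X)                                  # scatter correction
    return SVC(kernel="rbf").fit(X, y)          # SVM classifier
```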
Xu, Zheng; Wang, Sheng; Li, Yeqing; Zhu, Feiyun; Huang, Junzhou
2018-02-08
Much of the recent history of parallel Magnetic Resonance Imaging (pMRI) has been devoted to finding ways to reduce acquisition time. The joint total variation (JTV) regularized model has been demonstrated to be a powerful tool for increasing sampling speed in pMRI, but the major bottleneck is the inefficiency of the optimization method. Whereas all present state-of-the-art optimizations for the JTV model reach only a sublinear convergence rate, in this paper we squeeze out additional performance by proposing a linearly convergent optimization method for the JTV model. The proposed method is based on the Iteratively Reweighted Least Squares algorithm. Due to the complexity of the tangled JTV objective, we design a novel preconditioner to further accelerate the proposed method. Extensive experiments demonstrate the superior performance of the proposed algorithm for pMRI in terms of both accuracy and efficiency compared with state-of-the-art methods.
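To make the IRLS idea concrete, here is a minimal sketch for a 1-D total-variation-regularised least-squares problem. It illustrates the reweighting trick the paper builds on, not the authors' preconditioned joint-TV algorithm for pMRI.

```python
# Solve min_x ||A x - y||^2 + lam * ||D x||_1 by iteratively reweighted least
# squares: replace |u| with u^2 / |u_prev| and solve a weighted linear system.
import numpy as np

def irls_tv(A, y, lam=0.1, n_iter=50, eps=1e-6):
    m, n = A.shape
    D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]       # finite-difference operator
    x = np.linalg.lstsq(A, y, rcond=None)[0]       # least-squares start
    for _ in range(n_iter):
        w = 1.0 / np.sqrt((D @ x) ** 2 + eps)      # reweighting of |Dx|
        # weighted normal equations: (A^T A + lam D^T W D) x = A^T y
        H = A.T @ A + lam * D.T @ (w[:, None] * D)
        x = np.linalg.solve(H, A.T @ y)
    return x
```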
Performance Evaluation of Pressure Transducers for Water Impacts
NASA Technical Reports Server (NTRS)
Vassilakos, Gregory J.; Stegall, David E.; Treadway, Sean
2012-01-01
The Orion Multi-Purpose Crew Vehicle is being designed for water landings. In order to benchmark the ability of engineering tools to predict water landing loads, test programs are underway for scale-model and full-scale water impacts. These test programs are predicated on the reliable measurement of impact pressure histories. Tests have been performed with a variety of pressure transducers from various manufacturers, including both piezoelectric and piezoresistive devices. Effects such as thermal shock, pinching of the transducer head, and flushness of the transducer mounting have been studied. Data acquisition issues such as sampling rate and anti-aliasing filtering have also been studied. The responses of the pressure transducers have been compared side-by-side on an impulse test rig and on a 20-inch diameter hemisphere dropped into a pool of water. The results have identified a range of viable configurations for pressure measurement, depending on the objectives of the test program.
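As a small illustration of the anti-aliasing issue the abstract mentions, the sketch below low-pass filters a synthetic impact-pressure pulse before decimating it; the rates and filter order are illustrative, not values from the Orion test programs.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100_000.0                       # acquisition sample rate (Hz), assumed
t = np.arange(int(0.02 * fs)) / fs
p = np.exp(-t / 2e-3) * np.sin(2 * np.pi * 3e3 * t)   # synthetic impact pulse

# Low-pass well below the post-decimation Nyquist (fs/10 / 2) before keeping
# every 10th sample, so the decimated record is free of aliased content.
b, a = butter(4, 0.8 * (fs / 10) / (fs / 2))
p_filt = filtfilt(b, a, p)
p_dec = p_filt[::10]
```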
Image reconstructions from super-sampled data sets with resolution modeling in PET imaging.
Li, Yusheng; Matej, Samuel; Metzler, Scott D
2014-12-01
Spatial resolution in positron emission tomography (PET) is still a limiting factor in many imaging applications. To improve the spatial resolution for an existing scanner with fixed crystal sizes, mechanical movements such as scanner wobbling and object shifting have been considered for PET systems. Multiple acquisitions from different positions can provide complementary information and increased spatial sampling. The objective of this paper is to explore an efficient and useful reconstruction framework to reconstruct super-resolution images from super-sampled low-resolution data sets. The authors introduce a super-sampling data acquisition model based on the physical processes, with tomographic, downsampling, and shifting matrices as its building blocks. Based on this model, the authors extend the MLEM and Landweber algorithms to reconstruct images from super-sampled data sets. The authors also derive a backprojection-filtration-like (BPF-like) method for the super-sampling reconstruction. Furthermore, they explore variant methods for super-sampling reconstruction: separate super-sampling resolution-modeling reconstruction, and reconstruction without downsampling, to further improve image quality at the cost of more computation. The authors use simulated reconstructions of a resolution phantom to evaluate the three types of algorithms with different super-samplings at different count levels. Contrast recovery coefficient (CRC) versus background variability, as an image-quality metric, is calculated at each iteration for all reconstructions. The authors observe that all three algorithms can significantly and consistently achieve increased CRCs at fixed background variability and reduce background artifacts with super-sampled data sets at the same count levels. For the same super-sampled data sets, the MLEM method achieves better image quality than the Landweber method, which in turn achieves better image quality than the BPF-like method. The authors also demonstrate that reconstructions from super-sampled data sets using a fine system matrix yield improved image quality compared to reconstructions using a coarse system matrix. Super-sampling reconstructions at different count levels showed that more spatial-resolution improvement can be obtained with higher counts at larger iteration numbers. The authors developed a super-sampling reconstruction framework that can reconstruct super-resolution images using the super-sampled data sets simultaneously with known acquisition motion. Super-sampling PET acquisition using the proposed algorithms provides an effective and economical way to improve image quality for PET imaging, which has important implications for preclinical and clinical region-of-interest PET imaging applications.
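For readers unfamiliar with the update at the core of such methods, here is a minimal MLEM sketch built from the paper's stated building blocks: a tomographic system matrix G, a downsampling matrix S, and per-acquisition shift matrices B_k. The toy shapes and the way the blocks are composed are our assumptions, not the authors' code.

```python
import numpy as np

def mlem_supersampled(G, S, shifts, data, n_iter=50):
    """G: (m, n) system matrix on the fine grid; S: (m_low, m) downsampler;
    shifts: list of (n, n) shift matrices B_k; data: list of low-res sinograms."""
    n = G.shape[1]
    x = np.ones(n)
    # each acquisition k is modelled as y_k = S @ G @ B_k @ x
    A = [S @ G @ B for B in shifts]
    sens = sum(Ak.sum(axis=0) for Ak in A)        # sensitivity image
    for _ in range(n_iter):
        back = np.zeros(n)
        for Ak, yk in zip(A, data):
            proj = Ak @ x
            back += Ak.T @ (yk / np.maximum(proj, 1e-12))
        x *= back / np.maximum(sens, 1e-12)       # multiplicative MLEM update
    return x
```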
High-resolution seismic reflection surveying with a land streamer
NASA Astrophysics Data System (ADS)
Cengiz Tapırdamaz, Mustafa; Cankurtaranlar, Ali; Ergintav, Semih; Kurt, Levent
2013-04-01
In this study, a newly designed seismic reflection data acquisition array (land streamer) is used to image the shallow subsurface. Our acquisition system consists of 24 geophones screwed onto iron plates at 2 m spacing, connected by fire hose and towed along the ground surface. The custom 4.5 kg iron plates provide satisfactory coupling. The land-streamer system enables rapid and cost-effective acquisition of seismic reflection data thanks to its operational convenience. First tests were performed using various seismic sources such as a mini-vibro truck, a buffalo gun, and a hammer. The final fieldwork was performed on a previously studied landslide area. Data acquisition was carried out on a line that had previously been measured by a seismic survey using 5 m geophone and shot spacing; this line was chosen in order to re-image known reflection patterns obtained in the earlier field study. Taking penetration depth into consideration, a six-cartridge buffalo gun was selected as the seismic source to achieve high vertical resolution. Each shot point was drilled to 50 cm for the gun shots to obtain a high-resolution source signature. In order to avoid surface waves, the offset between the source and the first channel was chosen as 50 m, and the shot spacing was 2 m. These acquisition parameters provided 12-fold coverage at each CDP point, as the calculation below shows, and a spatial sampling interval of 1 m at the surface. The processing steps included standard stages such as gain recovery, editing, frequency filtering, CDP sorting, NMO correction, static correction, and stacking. Furthermore, surface-consistent residual static corrections were applied recursively to improve image quality, and a 2D F-K filter was applied to suppress air and surface waves in the relatively deep part of the seismic section. The results show that this newly designed, high-resolution land seismic data acquisition equipment (land streamer) can successfully image the subsurface, and that the results are compatible with those obtained in the previous study. The tool is extremely practical and very effective for imaging the shallow subsurface. As a next step, an integrated GPS receiver will be added to the recorder to obtain shot and receiver station positions during data acquisition, mechanical parts will be added to further improve the stability and durability of the land streamer, and a nonlinear geophone layout will be introduced after testing is complete. We plan to use this land streamer not only in landslide areas but also in archaeological sites and engineering applications such as the detection of buried pipelines and faults. The equipment will make it possible to perform such studies in both urban and rural areas.
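A quick check of the quoted fold, using the standard relation for an end-on spread:

```python
# fold = (n_channels * receiver_spacing) / (2 * shot_spacing), values from the text
n_channels, dx_rcv, dx_shot = 24, 2.0, 2.0
fold = n_channels * dx_rcv / (2 * dx_shot)
print(fold)   # -> 12.0, matching the stated 12-fold CDP coverage
```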
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tenney, J.L.
SARS is a data acquisition system designed to gather and process radar data from aircraft flights. A database of flight trajectories has been developed for Albuquerque, NM, and Amarillo, TX. The data is used for safety analysis and risk assessment reports. To support this database effort, Sandia developed a collection of hardware and software tools to collect and post process the aircraft radar data. This document describes the data reduction tools which comprise the SARS, and maintenance procedures for the hardware and software system.
C3I Analysis Tools for Development Planning. Volume 1
1985-09-27
otherwise as in any manner licensing the holder or any other person or conveying any rights or permission to manufacture, use, or sell any patented... PERSONAL AUTHOR(S): ».A. Vail, G.H. Weissman, J.G. Wohl. TYPE OF REPORT: Final. 13b. TIME COVERED: FROM... SUPPLEMENTARY NOTATION: Refer to ESD-TR-86... support for decisions about the relative value of acquisition programs. 5. The software tool developed on a personal computer demonstrates that
2001-12-01
of lightweight aluminum alloy, steel, and titanium. The skin of the aircraft is fashioned primarily from fiberglass and Kevlar. The landing gear... endurance has made it one of the most flexible tools available to MAGTF Commanders. However, by compensating for the performance deficiencies of the CH-... without jeopardizing both. Using cost-effectiveness analysis as a tool to ascertain how that balance might be struck will be the focus of the
Using predictive uncertainty analysis to optimise tracer test design and data acquisition
NASA Astrophysics Data System (ADS)
Wallis, Ilka; Moore, Catherine; Post, Vincent; Wolf, Leif; Martens, Evelien; Prommer, Henning
2014-07-01
Tracer injection tests are regularly used tools to identify and characterise flow and transport mechanisms in aquifers. Examples of practical applications are manifold and include, among others, managed aquifer recharge schemes, aquifer thermal energy storage systems and, increasingly important, the disposal of produced water from oil and shale gas wells. The hydrogeological and geochemical data collected during injection tests are often employed to assess the potential impacts of injection on receptors such as drinking water wells, and regularly serve as a basis for the development of conceptual and numerical models that underpin the prediction of potential impacts. As all field tracer injection tests impose substantial logistical and financial efforts, it is crucial to develop a solid a priori understanding of the value of the various monitoring data in order to select monitoring strategies that provide the greatest return on investment. In this study, we demonstrate the ability of linear predictive uncertainty analysis (i.e. “data worth analysis”) to quantify the usefulness of different tracer types (bromide, temperature, methane and chloride as examples) and head measurements in the context of a field-scale aquifer injection trial of coal seam gas (CSG) co-produced water. Data worth was evaluated in terms of tracer type, tracer test design (e.g., injection rate, duration of test and the applied measurement frequency) and monitoring disposition, so as to increase the reliability of injection impact assessments. This was followed by an uncertainty-targeted Pareto analysis, which allowed the interdependencies of cost and predictive reliability for alternative monitoring campaigns to be compared directly. For the evaluated injection test, the data worth analysis assessed bromide as superior to head data and all other tracers at early sampling times. With time, however, chloride became a more suitable tracer for constraining simulations of physical transport processes, followed by methane. Temperature data were assessed as the least informative of the solute tracers. Nevertheless, taking the costs of data acquisition into account, temperature data used in conjunction with other tracers proved a valuable and cost-effective marker due to temperature's low cost-to-worth ratio. In contrast, the high cost of acquiring methane data compared to its muted worth highlighted methane's unfavourable return on investment. Optimal monitoring bore positions, as well as optimal numbers of bores for the investigated injection site, were also established. The proposed tracer test optimisation applies widely used groundwater flow and transport models in conjunction with publicly available tools for predictive uncertainty analysis, providing modellers and practitioners with a powerful yet efficient and cost-effective approach that is generally applicable and easily transferable from the present study to many applications beyond the injection of treated CSG produced water.
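A minimal numerical sketch of the linear data-worth idea described above, using toy matrices rather than the study's groundwater model: the worth of a candidate observation is the reduction in the predictive variance of a scalar prediction when that observation's sensitivity row is added to the Jacobian.

```python
import numpy as np

def predictive_variance(J, C_p, C_e, s):
    """J: (n_obs, n_par) Jacobian; C_p: prior parameter covariance;
    C_e: observation-error covariance; s: prediction sensitivity vector."""
    post = np.linalg.inv(np.linalg.inv(C_p) + J.T @ np.linalg.inv(C_e) @ J)
    return s @ post @ s                      # variance of the scalar prediction

rng = np.random.default_rng(1)
J_base = rng.normal(size=(10, 4))            # existing monitoring network
j_new = rng.normal(size=(1, 4))              # candidate extra measurement
C_p, s = np.eye(4), np.array([1.0, 0.5, 0.0, -0.2])

v0 = predictive_variance(J_base, C_p, np.eye(10), s)
v1 = predictive_variance(np.vstack([J_base, j_new]), C_p, np.eye(11), s)
print("data worth of the new observation:", v0 - v1)
```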
LabVIEW: a software system for data acquisition, data analysis, and instrument control.
Kalkman, C J
1995-01-01
Computer-based data acquisition systems play an important role in clinical monitoring and in the development of new monitoring tools. LabVIEW (National Instruments, Austin, TX) is a data acquisition and programming environment that allows flexible acquisition and processing of analog and digital data. The main feature that distinguishes LabVIEW from other data acquisition programs is its highly modular graphical programming language, "G," and a large library of mathematical and statistical functions. The advantage of graphical programming is that the code is flexible, reusable, and self-documenting. Subroutines can be saved in a library and reused without modification in other programs. This dramatically reduces development time and enables researchers to develop or modify their own programs. LabVIEW uses a large amount of processing power and computer memory, thus requiring a powerful computer. A large-screen monitor is desirable when developing larger applications. LabVIEW is excellently suited for testing new monitoring paradigms, analysis algorithms, or user interfaces. The typical LabVIEW user is the researcher who wants to develop a new monitoring technique, a set of new (derived) variables by integrating signals from several existing patient monitors, closed-loop control of a physiological variable, or a physiological simulator.
Shewokis, Patricia A; Shariff, Faiz U; Liu, Yichuan; Ayaz, Hasan; Castellanos, Andres; Lind, D Scott
2017-02-01
Using functional near infrared spectroscopy, a noninvasive optical brain imaging tool that monitors changes in hemodynamics within the prefrontal cortex (PFC), we assessed performance and cognitive effort during the acquisition, retention and transfer of multiple simulated laparoscopic tasks by novice learners within a contextual interference paradigm. Third-year medical students (n = 10) were randomized to either a blocked or a random practice schedule. Across 3 days, students performed 108 acquisition trials of 3 laparoscopic tasks on the LapSim® simulator, followed by delayed retention and transfer tests. Performance metrics (global score, total time) and hemodynamic responses (total hemoglobin (μM)) were assessed during skill acquisition, retention and transfer. All acquisition tasks resulted in significant practice schedule × trial block interactions for the left medial anterior PFC. During retention and transfer, the random group performed the skills in less time and showed a lower total hemoglobin change in the right dorsolateral PFC than the blocked group. Compared with blocked practice, random practice resulted in enhanced learning through better performance and less cognitive load in the retention and transfer of simulated laparoscopic tasks.
Data Acquisition with GPUs: The DAQ for the Muon g-2 Experiment at Fermilab
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gohn, W.
Graphical Processing Units (GPUs) have recently become a valuable computing tool for the acquisition of data at high rates and at relatively low cost. The devices work by parallelizing the code into thousands of threads, each executing a simple process, such as identifying pulses from a waveform digitizer. The CUDA programming library can be used to write code that effectively parallelizes such tasks on Nvidia GPUs, providing a significant performance upgrade over CPU-based acquisition systems. The muon g-2 experiment at Fermilab relies heavily on GPUs to process its data. The data acquisition system for this experiment must be able to create deadtime-free records from 700 μs muon spills at a raw data rate of 18 GB per second. Data will be collected using 1296 channels of μTCA-based 800 MSPS, 12-bit waveform digitizers and processed in a layered array of networked commodity processors with 24 GPUs working in parallel to perform a fast recording of the muon decays during the spill. The described data acquisition system is currently being constructed, and will be fully operational before the start of the experiment in 2017.
Correcting sample drift using Fourier harmonics.
Bárcena-González, G; Guerrero-Lebrero, M P; Guerrero, E; Reyes, D F; Braza, V; Yañez, A; Nuñez-Moraleda, B; González, D; Galindo, P L
2018-07-01
During image acquisition of crystalline materials by high-resolution scanning transmission electron microscopy, sample drift can introduce distortions and shears that hinder quantitative analysis and characterization. In order to measure and correct this effect, several authors have proposed methodologies that make use of series of images. In this work, we introduce a methodology to determine the drift angle via Fourier analysis of a single image, based on measuring the angles of the second Fourier harmonics in different quadrants. Two approaches, both independent of the acquisition angle of the image, are evaluated. In addition, our results demonstrate that determination of the drift angle is more accurate when using measurements from non-consecutive quadrants if the acquisition angle is an odd multiple of 45°.
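A hedged sketch of the single-image idea: locate a strong harmonic peak of the lattice in two quadrants of the FFT and compare their angles; the mismatch relative to the ideal lattice geometry carries the drift information. The naive peak-picking and the quadrant bookkeeping below are our simplifications, not the authors' algorithm.

```python
import numpy as np

def harmonic_angle(img, quadrant):
    """Angle (degrees) of the strongest FFT peak in the given quadrant,
    where quadrant is e.g. (+1, +1) or (+1, -1) in (row, col) sign convention."""
    F = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    cy, cx = np.array(F.shape) // 2
    F[cy - 2:cy + 3, cx - 2:cx + 3] = 0          # suppress the DC region
    qy = slice(cy, None) if quadrant[0] > 0 else slice(None, cy)
    qx = slice(cx, None) if quadrant[1] > 0 else slice(None, cx)
    sub = np.zeros_like(F)
    sub[qy, qx] = F[qy, qx]
    py, px = np.unravel_index(np.argmax(sub), sub.shape)
    return np.degrees(np.arctan2(py - cy, px - cx))

# drift estimate from the angle mismatch between two quadrants; its relation
# to the physical drift angle depends on the scan geometry:
# delta = harmonic_angle(img, (+1, +1)) - harmonic_angle(img, (+1, -1))
```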
A synchrotron radiation microtomography system for the analysis of trabecular bone samples.
Salomé, M; Peyrin, F; Cloetens, P; Odet, C; Laval-Jeantet, A M; Baruchel, J; Spanne, P
1999-10-01
X-ray computed microtomography is particularly well suited to studying trabecular bone architecture, which requires three-dimensional (3-D) images with high spatial resolution. For this purpose, we describe a three-dimensional computed microtomography (microCT) system using synchrotron radiation, developed at the ESRF. Since synchrotron radiation provides a monochromatic, high-photon-flux x-ray beam, it enables imaging with high spatial resolution and a high signal-to-noise ratio. The principle of the system is based on truly three-dimensional parallel tomographic acquisition. It uses a two-dimensional (2-D) CCD-based detector to record 2-D radiographs of the beam transmitted through the sample under different angles of view. The 3-D tomographic reconstruction, performed by an exact 3-D filtered backprojection algorithm, yields 3-D images with cubic voxels. The spatial resolution of the detector was measured experimentally. For the application to bone investigation, the voxel size was set to 6.65 μm, and the experimental spatial resolution was found to be 11 μm. The reconstructed linear attenuation coefficient was calibrated using hydroxyapatite phantoms. Image processing tools are being developed to extract structural parameters quantifying trabecular bone architecture from the 3-D microCT images. First results on human trabecular bone samples are presented.
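As an illustration of the filtered backprojection principle the system relies on, the sketch below reconstructs a single 2-D slice from parallel projections with scikit-image; the actual instrument performs an exact 3-D filtered backprojection over full 2-D radiographs, which this toy example does not reproduce.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

slice_2d = shepp_logan_phantom()
angles = np.linspace(0.0, 180.0, 400, endpoint=False)   # angles of view
sinogram = radon(slice_2d, theta=angles)                # parallel projections
recon = iradon(sinogram, theta=angles, filter_name="ramp")  # filtered backprojection
```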
Genetic diversity of Aspergillus fumigatus in indoor hospital environments.
Araujo, Ricardo; Amorim, António; Gusmão, Leonor
2010-09-01
Environmental isolates of Aspergillus fumigatus are less studied than those recovered from clinical sources. In the present study, the genetic diversity among such environmental isolates was assessed, as well as their dispersion ability and the acquisition of new strains in 19 medical units of the same hospital. A. fumigatus isolates were genotyped using a single multiplex PCR-based reaction with eight microsatellite markers and an insertion/deletion polymorphism. A total of 130 unique genotypes were found among a total of 250 A. fumigatus isolates. Genotypic diversity ranged from 0.86 to 1 in samples from hospital rooms, and there was no correlation between these samples and the presence of high-efficiency particulate air filters or any other air filtration system. Four of the six most prevalent A. fumigatus strains were recovered from water samples. The occurrence of microvariation was common among environmental isolates, which affected each of the microsatellite markers. The assessment of the genetic diversity of A. fumigatus is a useful tool for illustrating the presence or absence of specific clonal populations in a clinical setting. A. fumigatus populations were highly dynamic indoors, and new populations were found in just a few months. Due to the high indoor dispersion capability of A. fumigatus, more attention should be given to strains with increased pathogenic potential or reduced susceptibility to anti-fungal drugs.
Enhancing forensic science with spectroscopic imaging
NASA Astrophysics Data System (ADS)
Ricci, Camilla; Kazarian, Sergei G.
2006-09-01
This presentation outlines the research we are developing in the area of Fourier Transform Infrared (FTIR) spectroscopic imaging, with a focus on materials of forensic interest. FTIR spectroscopic imaging has recently emerged as a powerful tool for the characterisation of heterogeneous materials. FTIR imaging relies on the ability of the military-developed infrared array detector to simultaneously measure spectra from thousands of different locations in a sample. A recently developed application of FTIR imaging using an ATR (Attenuated Total Reflection) mode has demonstrated the ability of this method to achieve spatial resolution beyond the diffraction limit of infrared light in air. Chemical visualisation with enhanced spatial resolution in micro-ATR mode broadens the range of materials studied with FTIR imaging, with applications to pharmaceutical formulations or biological samples. Macro-ATR imaging has also been developed for chemical imaging analysis of large-surface-area samples and has been applied to analyse the surface of human skin (e.g. a finger), counterfeit tablets, textile materials (clothing), etc. This approach has demonstrated the ability of the imaging method to detect trace materials attached to the surface of the skin. It may also prove to be a valuable tool for detecting traces of explosives left or trapped on the surfaces of different materials. This FTIR imaging method is substantially superior to many other imaging methods due to the inherent chemical specificity of infrared spectroscopy and the fast acquisition times of the technique. Our preliminary data demonstrate that this methodology will provide the means for a non-destructive detection method that could relate evidence to its source. This will be important in a wider crime prevention programme. In summary, the intrinsic chemical specificity and enhanced visualising capability of FTIR spectroscopic imaging open a window of opportunities for counter-terrorism and crime-fighting, with applications ranging from the analysis of trace evidence (e.g. in soil), tablets, drugs, fibres, tapes and explosives, to biological samples, the detection of gunshot residues and the imaging of fingerprints.
Motor Acquisition Rate in Brazilian Infants
ERIC Educational Resources Information Center
Lopes, Virlaine Bardella; de Lima, Carolina Daniel; Tudella, Eloisa
2009-01-01
This study used the Alberta Infant Motor Scale (AIMS) with the aim of characterizing motor acquisition rate in 70 healthy 0-6-month-old Brazilian infants, as well as comparing both emergence (initial age) and establishment (final age) of each skill between the study sample and the AIMS normative data. New motor skills were continuously acquired…
The Role of Chinese Face in the Perpetration of Dating Partner Violence
ERIC Educational Resources Information Center
Chan, Ko Ling
2012-01-01
This study explored the associations between the perpetration of partner violence and two types of face orientation--protective and acquisitive--in Chinese societies. Data from a convenience sample of 3,388 university students from Hong Kong, Shanghai, and Beijing were analyzed. The participants completed the Protective and Acquisitive Face…
NASA Astrophysics Data System (ADS)
Cros, Maria; Joemai, Raoul M. S.; Geleijns, Jacob; Molina, Diego; Salvadó, Marçal
2017-08-01
This study aims to develop and test software for assessing and reporting doses for standard patients undergoing computed tomography (CT) examinations in a 320 detector-row cone-beam scanner. The software, called SimDoseCT, is based on the Monte Carlo (MC) simulation code, which was developed to calculate organ doses and effective doses in ICRP anthropomorphic adult reference computational phantoms for acquisitions with the Aquilion ONE CT scanner (Toshiba). MC simulation was validated by comparing CTDI measurements within standard CT dose phantoms with results from simulation under the same conditions. SimDoseCT consists of a graphical user interface connected to a MySQL database, which contains the look-up-tables that were generated with MC simulations for volumetric acquisitions at different scan positions along the phantom using any tube voltage, bow tie filter, focal spot and nine different beam widths. Two different methods were developed to estimate organ doses and effective doses from acquisitions using other available beam widths in the scanner. A correction factor was used to estimate doses in helical acquisitions. Hence, the user can select any available protocol in the Aquilion ONE scanner for a standard adult male or female and obtain the dose results through the software interface. Agreement within 9% between CTDI measurements and simulations allowed the validation of the MC program. Additionally, the algorithm for dose reporting in SimDoseCT was validated by comparing dose results from this tool with those obtained from MC simulations for three volumetric acquisitions (head, thorax and abdomen). The comparison was repeated using eight different collimations and also for another collimation in a helical abdomen examination. The results showed differences of 0.1 mSv or less for absolute dose in most organs and also in the effective dose calculation. The software provides a suitable tool for dose assessment in standard adult patients undergoing CT examinations in a 320 detector-row cone-beam scanner.
A Real-Time Image Acquisition And Processing System For A RISC-Based Microcomputer
NASA Astrophysics Data System (ADS)
Luckman, Adrian J.; Allinson, Nigel M.
1989-03-01
A low-cost image acquisition and processing system has been developed for the Acorn Archimedes microcomputer. Using a Reduced Instruction Set Computer (RISC) architecture, the ARM (Acorn Risc Machine) processor provides instruction speeds suitable for image processing applications. The associated improvement in data transfer rate has allowed real-time video image acquisition without the need for frame-store memory external to the microcomputer. The system comprises real-time video digitising hardware that interfaces directly to the Archimedes memory, and software that provides an integrated image acquisition and processing environment. The hardware can digitise a video signal at up to 640 samples per video line, with programmable parameters such as sampling rate and gain. Software support includes a work environment for image capture and processing with pixel, neighbourhood and global operators. A friendly user interface is provided with the help of the Archimedes Operating System WIMP (Windows, Icons, Mouse and Pointer) Manager. Windows provide a convenient way of handling images on the screen, and program control is directed mostly by pop-up menus.
48 CFR 32.104 - Providing contract financing.
Code of Federal Regulations, 2010 CFR
2010-10-01
... performance, considering the availability of private financing and the probable impact on working capital of... Providing contract financing. (a) Prudent contract financing can be a useful working tool in Government acquisition by expediting the performance of essential contracts. Contracting officers must consider the...
Geo-referenced digital data acquisition and processing system using LiDAR technology.
DOT National Transportation Integrated Search
2006-02-01
LiDAR technology, introduced in the late 1990s, has received wide acceptance in airborne surveying as a leading tool for obtaining high-quality surface data at decimeter-level vertical accuracy in an unprecedentedly short turnaround time. State-of-...
MMX-I: A data-processing software for multi-modal X-ray imaging and tomography
NASA Astrophysics Data System (ADS)
Bergamaschi, A.; Medjoubi, K.; Messaoudi, C.; Marco, S.; Somogyi, A.
2017-06-01
Scanning hard X-ray imaging allows simultaneous acquisition of multimodal information, including X-ray fluorescence, absorption, phase and dark-field contrasts, providing structural and chemical details of the samples. Combining these scanning techniques with the infrastructure developed for fast data acquisition at Synchrotron Soleil makes it possible to perform multimodal imaging and tomography during routine user experiments at the Nanoscopium beamline. A main challenge of such imaging techniques is the online processing and analysis of the very large (several hundred gigabyte) multimodal data sets they generate. This is especially important for the wide user community foreseen at the user-oriented Nanoscopium beamline (e.g. from the fields of biology, life sciences, geology and geobiology), which has no experience in such data handling. MMX-I is a new multi-platform open-source freeware for the processing and reconstruction of scanning multi-technique X-ray imaging and tomographic datasets. The MMX-I project aims to offer both expert users and beginners the possibility of processing and analysing raw data, either on-site or off-site. We have therefore developed a multi-platform (Mac, Windows and Linux 64-bit) data processing tool that is easy to install, comprehensive, intuitive, extendable and user-friendly. MMX-I is now routinely used by the Nanoscopium user community and has demonstrated its performance in treating big data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etschmann, B.; Ryan, C; Brugger, J
2010-01-01
Synchrotron X-ray fluorescence (SXRF) and X-ray absorption spectroscopy (XAS) have become standard tools to measure element concentration, distribution at micrometer to nanometer scale, and speciation (e.g., nature of host phase; oxidation state) in inhomogeneous geomaterials. The new Maia X-ray detector system provides a quantum leap for the method in terms of data acquisition rate. It is now possible to rapidly collect fully quantitative maps of the distribution of major and trace elements at micrometer spatial resolution over areas as large as 1 × 5 cm². Fast data acquisition rates also open the way to X-ray absorption near-edge structure (XANES) imaging, in which spectroscopic information is available at each pixel in the map. These capabilities are critical for studying inhomogeneous Earth materials. Using a 96-element prototype Maia detector, we imaged thin sections of an oxidized pisolitic regolith (2 × 4.5 mm² at 2.5 × 2.5 μm² pixel size) and a metamorphosed, sedimentary exhalative Mn-Fe ore (3.3 × 4 mm² at 1.25 × 5 μm²). In both cases, As K-edge XANES imaging reveals localized occurrence of reduced As in parts of these oxidized samples, which would have been difficult to recognize using traditional approaches.
Computer system for scanning tunneling microscope automation
NASA Astrophysics Data System (ADS)
Aguilar, M.; García, A.; Pascual, P. J.; Presa, J.; Santisteban, A.
1987-03-01
A computerized system for the automation of a scanning tunneling microscope is presented. It is based on an IBM personal computer (PC), either an XT or an AT, which performs the control, data acquisition and storage operations, displays the STM "images" in real time, and provides image processing tools for the restoration and analysis of data. It supports different data acquisition and control cards and image display cards. The software has been designed in a modular way to allow the replacement of these cards and other equipment improvements, as well as the inclusion of user routines for data analysis.
Using the FORTH Language to Develop an ICU Data Acquisition System
Goldberg, Arthur; SooHoo, Spencer L.; Koerner, Spencer K.; Chang, Robert S. Y.
1980-01-01
This paper describes a powerful programming tool that should be considered as an alternative to the more conventional programming languages now in use for developing medical computer systems. Forth provides instantaneous response to user commands, rapid program execution and tremendous programming versatility. An operating system and a language in one carefully designed unit, Forth is well suited for developing data acquisition systems and for interfacing computers to other instruments. We present some of the general features of Forth and describe its use in implementing a data collection system for a Respiratory Intensive Care Unit (RICU).
Arcilla, Maris S; van Hattem, Jarne M; Haverkate, Manon R; Bootsma, Martin C J; van Genderen, Perry J J; Goorhuis, Abraham; Grobusch, Martin P; Lashof, Astrid M Oude; Molhoek, Nicky; Schultsz, Constance; Stobberingh, Ellen E; Verbrugh, Henri A; de Jong, Menno D; Melles, Damian C; Penders, John
2017-01-01
International travel contributes to the dissemination of antimicrobial resistance. We investigated the acquisition of extended-spectrum β-lactamase-producing Enterobacteriaceae (ESBL-E) during international travel, with a focus on predictive factors for acquisition, duration of colonisation, and probability of onward transmission. Within the prospective, multicentre COMBAT study, 2001 Dutch travellers and 215 non-travelling household members were enrolled. Faecal samples and questionnaires on demographics, illnesses, and behaviour were collected before travel and immediately and 1, 3, 6, and 12 months after return. Samples were screened for the presence of ESBL-E. In post-travel samples, ESBL genes were sequenced, and PCR with specific primers for plasmid-encoded β-lactamase enzymes TEM, SHV, and CTX-M groups 1, 2, 8, 9, and 25 was used to confirm the presence of ESBL genes in follow-up samples. Multivariable regression analyses and mathematical modelling were used to identify predictors for acquisition and sustained carriage, and to determine household transmission rates. This study is registered with ClinicalTrials.gov, number NCT01676974. 633 (34·3%) of 1847 travellers who were ESBL negative before travel and had available samples after return had acquired ESBL-E during international travel (95% CI 32·1-36·5), with the highest number of acquisitions among those who travelled to southern Asia: 136 of 181 (75·1%, 95% CI 68·4-80·9). Important predictors for acquisition of ESBL-E were antibiotic use during travel (adjusted odds ratio 2·69, 95% CI 1·79-4·05), traveller's diarrhoea that persisted after return (2·31, 1·42-3·76), and pre-existing chronic bowel disease (2·10, 1·13-3·90). The median duration of colonisation after travel was 30 days (95% CI 29-33). 65 (11·3%) of 577 remained colonised at 12 months. CTX-M enzyme group 9 ESBLs were associated with a significantly increased risk of sustained carriage (median duration 75 days, 95% CI 48-102, p=0·0001). Onward transmission was found in 13 (7·7%) of 168 household members. The probability of transmitting ESBL-E to another household member was 12% (95% CI 5-18). Acquisition and spread of ESBL-E during and after international travel were substantial and worrisome. Travellers to areas with a high risk of ESBL-E acquisition should be viewed as potential carriers of ESBL-E for up to 12 months after return. Netherlands Organisation for Health Research and Development (ZonMw).
Determining biological tissue optical properties via integrating sphere spatial measurements
Baba, Justin S [Knoxville, TN; Letzen, Brian S [Coral Springs, FL
2011-01-11
An optical sample is mounted on a spatial-acquisition apparatus that is placed in or on an enclosure. An incident beam irradiates a surface of the sample, and the specular reflection is allowed to escape from the enclosure through an opening. The spatial-acquisition apparatus is provided with a light-occluding slider that moves in front of the sample to block portions of the diffuse scattering from the sample. As the light-occluding slider moves across the front of the sample, diffuse light scattered into the area behind the light-occluding slider is absorbed by its back-side surface. By measuring a baseline diffuse reflectance without the light-occluding slider and subtracting from it the diffuse reflectance measured with the light-occluding slider in place, the diffuse reflectance for the area blocked by the slider can be calculated.
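The arithmetic at the heart of the claim reduces to a subtraction, sketched below with illustrative numbers (not values from the patent):

```python
import numpy as np

baseline = 1.00                                   # total diffuse signal, no slider
with_slider = np.array([0.93, 0.88, 0.90, 0.95])  # signal vs. slider position
occluded_contribution = baseline - with_slider    # reflectance from each blocked area
print(occluded_contribution)                      # -> [0.07 0.12 0.10 0.05]
```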
Using the Xbox Kinect sensor for positional data acquisition
NASA Astrophysics Data System (ADS)
Ballester, Jorge; Pheatt, Chuck
2013-01-01
The Kinect sensor was introduced in November 2010 by Microsoft for the Xbox 360 video game system. It is designed to be positioned above or below a video display to track player body and hand movements in three dimensions (3D). The sensor contains a red, green, and blue (RGB) camera, a depth sensor, an infrared (IR) light source, a three-axis accelerometer, and a multi-array microphone, as well as hardware required to transmit sensor information to an external receiver. In this article, we evaluate the capabilities of the Kinect sensor as a 3D data-acquisition platform for use in physics experiments. Data obtained for a simple pendulum, a spherical pendulum, projectile motion, and a bouncing basketball are presented. Overall, the Kinect sensor is found to be a useful data-acquisition tool for motion studies in the physics laboratory.
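As a sketch of the kind of analysis such positional data supports, the function below estimates a pendulum's period from a tracked horizontal position series. The 30 fps rate is typical of the Kinect's cameras; the acquisition itself (e.g. via a driver such as libfreenect) is not shown, and the data here are synthetic.

```python
import numpy as np

def pendulum_period(t, x):
    """t: sample times (s); x: horizontal bob position (m). Returns the period
    estimated from the dominant frequency of the motion."""
    x = x - x.mean()
    freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])
    spectrum = np.abs(np.fft.rfft(x))
    f0 = freqs[1:][np.argmax(spectrum[1:])]       # skip the DC bin
    return 1.0 / f0

t = np.arange(0, 20, 1 / 30)                      # 30 fps tracking, 20 s record
x = 0.2 * np.sin(2 * np.pi * 0.7 * t) + 0.005 * np.random.randn(len(t))
print(pendulum_period(t, x))                      # ~1.43 s for the 0.7 Hz swing
```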
On Shaft Data Acquisition System (OSDAS)
NASA Technical Reports Server (NTRS)
Pedings, Marc; DeHart, Shawn; Formby, Jason; Naumann, Charles
2012-01-01
On Shaft Data Acquisition System (OSDAS) is a rugged, compact, multiple-channel data acquisition computer system designed to record data from instrumentation while operating under extreme rotational centrifugal or gravitational acceleration forces. The system, which was developed for the Heritage Fuel Air Turbine Test (HFATT) program, addresses the problem of recording multiple channels of high-sample-rate data on almost any rotating test article by mounting the entire acquisition computer onboard with the turbine test article. With the limited availability of slip ring wires for power and communication, OSDAS uses its own resources to provide independent power and amplification for each instrument. Since OSDAS uses standard PC technology as well as shared code interfaces with the next-generation real-time health monitoring system (SPARTAA, Scalable Parallel Architecture for Real Time Analysis and Acquisition), the system could be expanded beyond its current capabilities, for example to provide advanced health monitoring for the test article. High-conductor-count slip rings are expensive to purchase and maintain, yet provide only a limited number of conductors for routing instrumentation off the article to a stationary data acquisition system. In addition to accommodating only a small number of instruments, slip rings are prone to wear quickly and introduce noise and other undesirable characteristics into the signal data. This led to the development of a system capable of recording high-density instrumentation at high sample rates on the test article itself, all while under extreme rotational stress. OSDAS is a fully functional PC-based system with 48 24-bit, high-sample-rate, phase-synchronized input channels and an onboard capacity of over half a terabyte of solid-state storage. This recording system takes a novel approach to the problem of recording multiple channels of instrumentation, integrated with the test article itself, packaged in a compact, rugged form factor, consuming limited power, all while rotating at high turbine speeds.
Accelerated x-ray scatter projection imaging using multiple continuously moving pencil beams
NASA Astrophysics Data System (ADS)
Dydula, Christopher; Belev, George; Johns, Paul C.
2017-03-01
Coherent x-ray scatter varies with angle and photon energy in a manner dependent on the chemical composition of the scattering material, even for amorphous materials. Therefore, images generated from scattered photons can have much higher contrast than conventional projection radiographs. We are developing a scatter projection imaging prototype at the BioMedical Imaging and Therapy (BMIT) facility of the Canadian Light Source (CLS) synchrotron in Saskatoon, Canada. The best images are obtained using step-and-shoot scanning with a single pencil beam and an area detector that captures the scatter pattern sequentially for each primary beam location on the sample. Primary x-ray transmission is recorded simultaneously using photodiodes. The technological challenge is to acquire the scatter data in a reasonable time. Using multiple pencil beams producing partially overlapping scatter patterns reduces acquisition time but increases complexity, due to the need for a disentangling algorithm to extract the data. Continuous sample motion, rather than step-and-shoot, also reduces acquisition time at the expense of introducing motion blur. With a five-beam (33.2 keV, 3.5 mm² beam area) continuous-sample-motion configuration, a rectangular array of 12 × 100 pixels with 1 mm sampling width has been acquired in 0.4 minutes (3000 pixels per minute), 38 times the speed of single-beam step-and-shoot. A system model has been developed to calculate detected scatter patterns given the material composition of the object to be imaged. Our prototype development, image acquisition of a plastic phantom, and modelling are described.
Can we trust the calculation of texture indices of CT images? A phantom study.
Caramella, Caroline; Allorant, Adrien; Orlhac, Fanny; Bidault, Francois; Asselain, Bernard; Ammari, Samy; Jaranowski, Patricia; Moussier, Aurelie; Balleyguier, Corinne; Lassau, Nathalie; Pitre-Champagnat, Stephanie
2018-04-01
Texture analysis is an emerging tool in the field of medical imaging analysis. However, many issues have been raised about its use in assessing patient images, and it is crucial to harmonize and standardize this new imaging measurement tool. This study was designed to evaluate the reliability of texture indices of CT images on a phantom, including a reproducibility study, to assess the discriminatory capacity of indices potentially relevant to CT medical images, and to determine their redundancy. For the reproducibility and discriminatory analyses, eight identical CT acquisitions were performed on a phantom including one homogeneous insert and two similar heterogeneous inserts. Texture indices were selected for their high reproducibility and their capability to discriminate different textures. For the redundancy analysis, 39 acquisitions of the same phantom were performed using varying acquisition parameters, and a correlation matrix was used to explore the pairwise relationships. LIFEx software was used to explore 34 different parameters, including first-order and texture indices. Only eight of the 34 indices exhibited high reproducibility and discriminated textures from each other. Skewness and kurtosis from the histogram were independent of the other six indices but correlated with each other; the other six indices were intercorrelated to varying degrees (entropy, dissimilarity, and contrast of the co-occurrence matrix; contrast of the neighborhood gray-level difference matrix; SZE and ZLNU of the gray-level size-zone matrix). Care should be taken when using texture analysis as a tool to characterize CT images, because changes in quantitation may be due primarily to internal variability rather than to real physio-pathological effects. Some textural indices appear to be sufficiently reliable and capable of discriminating similar textures on CT images.
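As a rough illustration of how three of the retained co-occurrence indices can be computed, the sketch below uses scikit-image on a toy ROI. LIFEx's binning, normalisation and index definitions may differ, so this only shows the general recipe.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

img = (np.random.default_rng(0).random((64, 64)) * 32).astype(np.uint8)  # toy ROI

# symmetric, normalised grey-level co-occurrence matrix at distance 1, angle 0
glcm = graycomatrix(img, distances=[1], angles=[0], levels=32,
                    symmetric=True, normed=True)

contrast = graycoprops(glcm, "contrast")[0, 0]
dissimilarity = graycoprops(glcm, "dissimilarity")[0, 0]
p = glcm[:, :, 0, 0]
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))   # not provided by graycoprops
print(contrast, dissimilarity, entropy)
```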
A new scalable modular data acquisition system for SPECT (PET)
NASA Astrophysics Data System (ADS)
Stenstrom, P.; Rillbert, A.; Bergquist, M.; Habte, F.; Bohm, C.; Larsson, S. A.
1998-06-01
Describes a modular decentralized data acquisition system that continuously samples shaped PMT pulses from a SPECT detector. The pulse waveform data are used by signal processors to accurately reconstruct amplitude and time for each scintillation event. Data acquisition for a PMT channel is triggered in two alternative ways, either when its own signal exceeds a selected digital threshold, or when it receives a trigger pulse from one of its neighboring PMTs. The triggered region is restricted to seven, thirteen or nineteen neighboring PMT channels. Each acquisition module supports three PMT channels and connects to all other modules and a reconstruction computer via Firewire to cover the 72 channels in the Stockholm University/Karolinska Hospital cylindrical SPECT camera.
Model-based quantification of image quality
NASA Technical Reports Server (NTRS)
Hazra, Rajeeb; Miller, Keith W.; Park, Stephen K.
1989-01-01
In 1982, Park and Schowengerdt published an end-to-end analysis of a digital imaging system quantifying three principal degradation components: (1) image blur, caused by the acquisition system; (2) aliasing, caused by insufficient sampling; and (3) reconstruction blur, caused by imperfect interpolative reconstruction. This analysis, which measures degradation as the square of the radiometric error, includes the sample-scene phase as an explicit random parameter and characterizes the image degradation caused by imperfect acquisition and reconstruction together with the effects of undersampling and random sample-scene phase. In a recent paper, Mitchell and Netravali displayed the visual effects of the above-mentioned degradations and presented a subjective analysis of their relative importance in determining image quality. The primary aim of this research is to use the analysis of Park and Schowengerdt to correlate their mathematical criteria for measuring image degradation with subjective visual criteria. Insight gained from this research can be exploited in the end-to-end design of optical systems, so that system parameters (transfer functions of the acquisition and display systems) can be designed relative to each other to obtain the best possible results using quantitative measurements.
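For orientation, the decomposition has the following schematic structure (our notation: scene power spectrum Φ, acquisition transfer function H_a, reconstruction filter H_r, sampling frequency ν_s; the published analysis additionally averages over the random sample-scene phase, and its exact expressions differ in detail):

```latex
% Schematic split of the mean-squared radiometric error into the three
% degradation components named above (sketch, not the paper's exact form):
\varepsilon^2 \;=\;
\underbrace{\int \Phi(\nu)\,\lvert 1 - H_a(\nu)\rvert^2 \, d\nu}_{\text{acquisition blur}}
\;+\;
\underbrace{\int \lvert H_r(\nu)\rvert^2 \sum_{n \neq 0} \Phi(\nu - n\nu_s)\,
  \lvert H_a(\nu - n\nu_s)\rvert^2 \, d\nu}_{\text{aliasing}}
\;+\;
\underbrace{\int \Phi(\nu)\,\lvert H_a(\nu)\rvert^2\,
  \lvert 1 - H_r(\nu)\rvert^2 \, d\nu}_{\text{reconstruction blur}}
```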
Ultrasonic acoustic levitation for fast frame rate X-ray protein crystallography at room temperature
NASA Astrophysics Data System (ADS)
Tsujino, Soichiro; Tomizaki, Takashi
2016-05-01
Increasing the data acquisition rate of X-ray diffraction images for macromolecular crystals at room temperature at synchrotrons has the potential to significantly accelerate both structural analysis of biomolecules and structure-based drug developments. Using lysozyme model crystals, we demonstrated the rapid acquisition of X-ray diffraction datasets by combining a high frame rate pixel array detector with ultrasonic acoustic levitation of protein crystals in liquid droplets. The rapid spinning of the crystal within a levitating droplet ensured an efficient sampling of the reciprocal space. The datasets were processed with a program suite developed for serial femtosecond crystallography (SFX). The structure, which was solved by molecular replacement, was found to be identical to the structure obtained by the conventional oscillation method for up to a 1.8-Å resolution limit. In particular, the absence of protein crystal damage resulting from the acoustic levitation was carefully established. These results represent a key step towards a fully automated sample handling and measurement pipeline, which has promising prospects for a high acquisition rate and high sample efficiency for room temperature X-ray crystallography.
Imaging system design and image interpolation based on CMOS image sensor
NASA Astrophysics Data System (ADS)
Li, Yu-feng; Liang, Fei; Guo, Rui
2009-11-01
An image acquisition system is introduced, which consists of a color CMOS image sensor (OV9620), SRAM (CY62148), a CPLD (EPM7128AE) and a DSP (TMS320VC5509A). The CPLD implements the logic and timing control for the system, the SRAM stores the image data, and the DSP controls the image acquisition system through the SCCB (OmniVision Serial Camera Control Bus). The timing sequence of the CMOS image sensor OV9620 is analyzed, and the imaging part and the high-speed image data memory unit are designed. The hardware and software design of the image acquisition and processing system is given. CMOS digital cameras use color filter arrays to sample different spectral components, such as red, green, and blue. At the location of each pixel only one color sample is taken, and the other colors must be interpolated from neighboring samples. We use an edge-oriented adaptive interpolation algorithm for the edge pixels and a bilinear interpolation algorithm for the non-edge pixels to improve the visual quality of the interpolated images. This method achieves high processing speed, reduces computational complexity, and effectively preserves image edges.
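The paragraph above describes the hybrid interpolation strategy; the sketch below shows what the per-pixel decision could look like for the green channel of an RGGB Bayer mosaic. The pattern layout and the gradient threshold are our assumptions, not details taken from the paper.

```python
import numpy as np

def interp_green(raw, y, x, thresh=10):
    """Estimate green at a red/blue site of an RGGB Bayer mosaic: interpolate
    along a detected edge, fall back to bilinear averaging in flat regions."""
    h_grad = abs(int(raw[y, x - 1]) - int(raw[y, x + 1]))
    v_grad = abs(int(raw[y - 1, x]) - int(raw[y + 1, x]))
    if h_grad + thresh < v_grad:                  # horizontal edge: average along it
        return (int(raw[y, x - 1]) + int(raw[y, x + 1])) // 2
    if v_grad + thresh < h_grad:                  # vertical edge: average along it
        return (int(raw[y - 1, x]) + int(raw[y + 1, x])) // 2
    return (int(raw[y, x - 1]) + int(raw[y, x + 1]) +
            int(raw[y - 1, x]) + int(raw[y + 1, x])) // 4   # flat: bilinear
```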
Li, Bo; Li, Hao; Dong, Li; Huang, Guofu
2017-11-01
In this study, we sought to investigate the feasibility of fast carotid artery MR angiography (MRA) by combining three-dimensional time-of-flight (3D TOF) with a compressed sensing method (CS-3D TOF). A pseudo-sequential phase encoding order was developed for CS-3D TOF to generate a hyper-intense vessel signal and suppress background tissues in an under-sampled 3D k-space. Seven healthy volunteers and one patient with carotid artery stenosis were recruited for this study. Five sequential CS-3D TOF scans were performed at 1, 2, 3, 4 and 5-fold acceleration factors for carotid artery MRA. Blood signal-to-tissue ratio (BTR) values for the fully sampled and under-sampled acquisitions were calculated and compared in the seven healthy subjects, and blood area (BA) was measured and compared between the fully sampled acquisition and each under-sampled one. There were no significant differences in BTR between the fully sampled dataset and any under-sampled dataset (P>0.05 for all comparisons). The carotid vessel BAs measured from the images of CS-3D TOF sequences at 2, 3, 4 and 5-fold acceleration were all highly correlated with those of the fully sampled acquisition. The contrast between blood vessels and background tissues of the images at 2 to 5-fold acceleration is comparable to that of the fully sampled images, and the images at 2× to 5× exhibit lumen definition comparable to the corresponding images at 1×. By combining the pseudo-sequential phase encoding order, CS reconstruction, and the 3D TOF sequence, this technique provides excellent visualization of the carotid vessels and calcifications in a short scan time. It has the potential to be integrated into current multiple-blood-contrast imaging protocols.
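As an illustration of the undersampling step such an acquisition relies on, the sketch below builds a variable-density phase-encode mask with a fully sampled centre; the paper's pseudo-sequential ordering itself is not reproduced here, and all parameters are illustrative.

```python
import numpy as np

def vd_mask(n_pe=256, accel=4, fully_sampled_frac=0.08, seed=0):
    """Boolean mask over phase-encode lines: True = line acquired. The k-space
    centre is kept fully sampled; the rest is drawn with Gaussian density."""
    rng = np.random.default_rng(seed)
    idx = np.arange(n_pe) - n_pe // 2
    centre = np.abs(idx) < fully_sampled_frac * n_pe / 2
    density = np.exp(-(idx / (n_pe / 4)) ** 2)    # favour low spatial frequencies
    n_rand = n_pe // accel - centre.sum()         # budget outside the centre
    p = density * ~centre
    p /= p.sum()
    picks = rng.choice(n_pe, size=n_rand, replace=False, p=p)
    mask = centre.copy()
    mask[picks] = True
    return mask

print(vd_mask().mean())   # ≈ 1/accel of the lines are sampled (0.25 here)
```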
2011 Ground Robotics Capabilities Conference and Exhibition
2011-03-24
and reconnaissance, urban warfare, first responder, surveillance/hostage situations and other critical missions. All have hard anodized bodies... Body Bomb Tool Kit OBJECTIVE: Develop a set of tools that can be changed and operated remotely that address the specific threat of an explosive... Innovation Acquisition Opportunities for Future Scientists & Engineers Requirements Technology & Innovation 5 ATLAS, Cheetah & ARM (DARPA) Conformal
Wacker, Michael A.
2010-01-01
Borehole geophysical logs were obtained from selected exploratory coreholes in the vicinity of the Florida Power and Light Company Turkey Point Power Plant. The geophysical logging tools used and the logging sequences performed during this project are summarized herein, including borehole logging methods, descriptions of the properties measured, the types of data obtained, and calibration information.
Endoscopic full-thickness resection: Current status
Schmidt, Arthur; Meier, Benjamin; Caca, Karel
2015-01-01
Conventional endoscopic resection techniques such as endoscopic mucosal resection or endoscopic submucosal dissection are powerful tools for treatment of gastrointestinal neoplasms. However, those techniques are restricted to superficial layers of the gastrointestinal wall. Endoscopic full-thickness resection (EFTR) is an evolving technique, which is just about to enter clinical routine. It is not only a powerful tool for diagnostic tissue acquisition but also has the potential to spare surgical therapy in selected patients. This review will give an overview about current EFTR techniques and devices. PMID:26309354
Development of integrated control system for smart factory in the injection molding process
NASA Astrophysics Data System (ADS)
Chung, M. J.; Kim, C. Y.
2018-03-01
In this study, we propose an integrated control system for the automation of the injection molding process required for the construction of a smart factory. The injection molding process consists of heating, tool close, injection, cooling, tool open, and take-out. A take-out robot controller, an image processing module, and a process data acquisition interface module were developed and assembled into the integrated control system. By adopting the integrated control system, the injection molding process can be simplified and the cost of constructing a smart factory can be kept low.
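As a rough illustration of how such a controller might sequence the six steps the abstract lists, the Python sketch below runs one molding cycle and invokes a data-acquisition hook after each step. The step durations, the hook signature, and the logging are assumptions for demonstration, not details from the paper.

```python
import time

# Step names follow the abstract; everything else is illustrative.
CYCLE = ["heating", "tool close", "injection", "cooling", "tool open", "take-out"]

def run_cycle(durations, acquire):
    """Run one molding cycle, calling the data-acquisition hook per step."""
    for step in CYCLE:
        t0 = time.monotonic()
        time.sleep(durations[step])            # stand-in for real actuation
        acquire(step, time.monotonic() - t0)   # process data acquisition module

def log_sample(step, elapsed):
    """Minimal hook: print each step's measured duration."""
    print(f"{step:>10}: {elapsed:.2f} s")

if __name__ == "__main__":
    run_cycle({s: 0.01 for s in CYCLE}, log_sample)
```

Separating the cycle sequencing from the acquisition hook mirrors the modular structure the abstract describes, where the robot controller, image processing, and data acquisition modules plug into one integrated controller.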
Tools for the IDL widget set within the X-windows environment
NASA Technical Reports Server (NTRS)
Turgeon, B.; Aston, A.
1992-01-01
New tools using the IDL widget set are presented. In particular, XSlideManager, a utility that allows the easy creation and updating of slide presentations, is explained in detail, and examples of its application are shown. In addition to XSlideManager, other mini-utilities are discussed. These pieces of software follow the philosophy of the X-Windows distribution system and are made available to anyone on the Internet. Acquisition procedures through anonymous ftp are clearly explained.
Agent-based Training: Facilitating Knowledge and Skill Acquisition in a Modern Space Operations Team
2002-04-01
… face, and being careful not to add to existing problems such as limited display space. This required us to work closely with members of the SBIRS operational community and to use research tools such as cognitive task analysis methods.
The Funding of Community Colleges: A Typology of State Funding Formulas
ERIC Educational Resources Information Center
Mullin, Christopher M.; Honeyman, David S.
2007-01-01
Community college funding formulas are tools utilized to substantiate the acquisition of funds and delineate the cost of education. This study develops a typology of community college funding formulas placing 48 states in three categories and five subcategories. (Contains 5 tables.)
48 CFR 1545.309 - Providing Government production and research property under special restrictions.
Code of Federal Regulations, 2010 CFR
2010-10-01
Section 1545.309, Federal Acquisition Regulations System, ENVIRONMENTAL PROTECTION AGENCY, CONTRACT MANAGEMENT, GOVERNMENT PROPERTY: Providing Government production and research property under special restrictions. Excerpt: "... improvements necessary for installing special tooling, special test equipment, or plant equipment, shall not be ..."
DECUS Proceedings; Fall 1971, Papers and Presentations.
ERIC Educational Resources Information Center
1971
Papers and presentations at the 1971 symposium of the Digital Equipment Computer Users Society (DECUS) are presented. The papers deal with medical and physiological applications, computer graphics, simulation education, small computer executive systems, management information tools, data acquisition systems, and high level languages. Although many…
A Single-Display Groupware Collaborative Language Laboratory
ERIC Educational Resources Information Center
Calderón, Juan Felipe; Nussbaum, Miguel; Carmach, Ignacio; Díaz, Juan Jaime; Villalta, Marco
2016-01-01
Language learning tools have evolved to take into consideration new teaching models of collaboration and communication. While second language acquisition tasks have been taken online, the traditional language laboratory has remained unchanged. By continuing to follow its original configuration based on individual work, the language laboratory…
A Different Kind of Web-Based Knowledge Management: The DTRA Acquisition ToolBook
2008-12-01
Acronyms: BAA: Broad Agency Announcement; IACRO: Inter-Agency Cost Reimbursement Order; MIPR: Military Interdepartmental Purchase Request; PWS: Performance Work Statement; SOO: Statement of Objectives; SOW: Scope of Work.
Conceptual Acquisition and Change through Social Interaction.
ERIC Educational Resources Information Center
Kobayashi, Yoshikazu
1994-01-01
Examines the role of social interaction as a facilitator of learning in general and conceptual change in particular. Three conditions are proposed as necessary for social interaction to facilitate knowledge construction--horizontal information, comparable domain knowledge, and availability of cognitive tools. Suggests that these conditions assure…