Sample records for acquisition processing analysis

  1. Contractor relationships and inter-organizational strategies in NASA's R and D acquisition process

    NASA Technical Reports Server (NTRS)

    Guiltinan, J.

    1976-01-01

    Interorganizational analysis of NASA's acquisition process for research and development systems is discussed. The importance of understanding the contractor environment, constraints, and motives in selecting an acquisition strategy is demonstrated. By articulating clear project goals, by utilizing information about the contractor and his needs at each stage in the acquisition process, and by thorough analysis of the inter-organizational relationship, improved selection of acquisition strategies and business practices is possible.

  2. Automated data acquisition technology development: Automated modeling and control development

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.

    1995-01-01

    This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rack-mounted PCs. This research was initiated because the Metal Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and data analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this thesis, WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage versus current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can be used to develop a RAIL function to control welding startup and shutdown without torch crashing.
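
    The abstract does not specify how the voltage/current/arc-length relationship is extracted; one common approach is a least-squares fit of logged welding parameters. The sketch below assumes a simple linear model in current and arc length, with synthetic data standing in for WMS logs.

      import numpy as np

      # Hypothetical sketch: fit arc voltage as a function of weld current and
      # arc length from logged parameter data. The linear model form and the
      # synthetic numbers are assumptions, not the WMS analysis itself.
      rng = np.random.default_rng(0)
      current = rng.uniform(100.0, 200.0, 500)            # weld current, A
      arc_len = rng.uniform(3.0, 8.0, 500)                # arc length, mm
      voltage = 10.0 + 0.02 * current + 1.5 * arc_len \
                + 0.1 * rng.standard_normal(500)          # arc voltage, V

      # Least-squares fit of V = a + b*I + c*L
      A = np.column_stack([np.ones_like(current), current, arc_len])
      coeffs, *_ = np.linalg.lstsq(A, voltage, rcond=None)
      print(coeffs)    # recovered [a, b, c], close to [10.0, 0.02, 1.5]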

  3. 32 CFR 989.1 - Purpose.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Acquisition Programs and Major Automated Information System Acquisition Programs. 1 To comply with NEPA and... ANALYSIS PROCESS (EIAP) § 989.1 Purpose. (a) This part implements the Air Force Environmental Impact Analysis Process (EIAP) and provides procedures for environmental impact analysis both within the United...

  4. DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Burnside, Jathan J.

    2012-01-01

    Wind tunnels have optimized their steady-state data systems for acquisition and analysis and have even implemented large dynamic-data acquisition systems; however, the development of near real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by a separate component so that bottlenecks in one phase do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.
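
    As a rough consistency check (not stated in the abstract), the quoted aggregate rate can be related to an implied per-channel sample rate. The packing of 24-bit samples into 3 bytes and the assumption that all 672 channels stream continuously are illustrative assumptions.

      # Back-of-envelope check of the quoted DDS-Suite aggregate data rate.
      # Assumptions (not from the abstract): all channels stream continuously
      # and each 24-bit sample is packed into 3 bytes.
      channels = 672
      bytes_per_sample = 3            # 24-bit ADC sample, tightly packed
      aggregate_rate = 275e6          # bytes per second (275 MB/s)

      per_channel_rate = aggregate_rate / (channels * bytes_per_sample)
      print(f"Implied per-channel rate: {per_channel_rate / 1e3:.0f} kS/s")
      # ~136 kS/s per channel under these assumptions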

  5. Graphical user interface for image acquisition and processing

    DOEpatents

    Goldberg, Kenneth A.

    2002-01-01

    An event-driven, GUI-based image acquisition interface for the IDL programming environment is described. It is designed for CCD camera control and image acquisition directly into the IDL environment, where image manipulation and data analysis can be performed, and it includes a toolbox of real-time analysis applications. Running the image acquisition hardware directly from IDL removes the necessity of first saving images in one program and then importing the data into IDL for analysis in a second step. Bringing the data directly into IDL creates an opportunity for the implementation of IDL image processing and display functions in real time. The program allows control over the available charge-coupled device (CCD) detector parameters, data acquisition, file saving and loading, and image manipulation and processing, all from within IDL. The program is built using IDL's widget libraries to control the on-screen display and user interface.

  6. Combined Acquisition/Processing For Data Reduction

    NASA Astrophysics Data System (ADS)

    Kruger, Robert A.

    1982-01-01

    Digital image processing systems necessarily consist of three components: acquisition, storage/retrieval, and processing. The acquisition component requires the greatest data handling rates. By coupling the acquisition with some online hardwired processing, data rates and capacities for short-term storage can be reduced. Furthermore, long-term storage requirements can be reduced further by appropriate processing and editing of image data contained in short-term memory. The net result could be reduced performance requirements for mass storage, processing, and communication systems. Reduced amounts of data should also speed later data analysis and diagnostic decision making.

  7. An overview of data acquisition, signal coding and data analysis techniques for MST radars

    NASA Technical Reports Server (NTRS)

    Rastogi, P. K.

    1986-01-01

    An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high-power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, and they are characterized on the basis of their disparate time scales. Signal coding, a brief description of frequently used codes, and their limitations are then discussed. Finally, several aspects of statistical data processing, such as signal statistics, power spectrum and autocovariance analysis, and outlier removal techniques, are discussed.

  8. United States Air Force Computer-Aided Acquisition and Logistics Support (CALS). Logistics Support Analysis Current Environment. Volume 2

    DOT National Transportation Integrated Search

    1988-10-01

    An analysis of the current environment within the Acquisition stage of the Weapon System Life Cycle pertaining to the Logistics Support Analysis (LSA) process, the Logistics Support Analysis Record (LSAR), and other Logistics Support data was underta...

  9. United States Air Force Computer-Aided Acquisition and Logistics Support (CALS). Logistics Support Analysis Current Environment. Volume 1

    DOT National Transportation Integrated Search

    1988-10-01

    An analysis of the current environment within the Acquisition stage of the Weapon System Life Cycle pertaining to the Logistics Support Analysis (LSA) process, the Logistics Support Analysis Record (LSAR), and other Logistics Support data was underta...

  10. Ecological Fallacy in Reading Acquisition Research: Masking Constructive Processes of the Learner.

    ERIC Educational Resources Information Center

    Berninger, Virginia W.; Abbott, Robert D.

    A study examined whether conclusions about constructive processes in reading based on analysis of group data were consistent with those based on an analysis of individual data. Subjects, selected from a larger sample of 45 first grade students who had participated in a longitudinal study on acquisition of linguistic procedures for printed words,…

  11. Development of a software tool to support chemical and biological terrorism intelligence analysis

    NASA Astrophysics Data System (ADS)

    Hunt, Allen R.; Foreman, William

    1997-01-01

    AKELA has developed a software tool which uses a systems analytic approach to model the critical processes which support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these model components.

  12. Online Analysis Enhances Use of NASA Earth Science Data

    NASA Technical Reports Server (NTRS)

    Acker, James G.; Leptoukh, Gregory

    2007-01-01

    Giovanni, the Goddard Earth Sciences Data and Information Services Center (GES DISC) Interactive Online Visualization and Analysis Infrastructure, has provided researchers with advanced capabilities to perform data exploration and analysis with observational data from NASA Earth observation satellites. In the past 5-10 years, examining geophysical events and processes with remote-sensing data required a multistep process of data discovery, data acquisition, data management, and ultimately data analysis. Giovanni accelerates this process by enabling basic visualization and analysis directly on the World Wide Web. In the last two years, Giovanni has added new data acquisition functions and expanded analysis options to increase its usefulness to the Earth science research community.

  13. FASEA: A FPGA Acquisition System and Software Event Analysis for liquid scintillation counting

    NASA Astrophysics Data System (ADS)

    Steele, T.; Mo, L.; Bignell, L.; Smith, M.; Alexiev, D.

    2009-10-01

    The FASEA (FPGA based Acquisition and Software Event Analysis) system has been developed to replace the MAC3 for coincidence pulse processing. The system uses a National Instruments Virtex 5 FPGA card (PXI-7842R) for data acquisition and purpose-developed software for data analysis. Initial comparisons to the MAC3 unit, based on measurements of 89Sr and 3H, are included, confirming that the system is able to accurately emulate the behaviour of the MAC3 unit.
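
    For readers unfamiliar with coincidence pulse processing, the sketch below illustrates the basic idea of identifying pulses on two photomultiplier channels that arrive within a common resolving window. It is an offline, simplified illustration with synthetic timestamps, not the FASEA or MAC3 logic.

      import numpy as np

      def coincidences(t_a, t_b, window=50e-9):
          """Count pulses on channel A that have a partner on channel B
          within the resolving window (seconds). t_a and t_b are sorted
          1-D arrays of pulse arrival times in seconds."""
          idx = np.searchsorted(t_b, t_a)
          idx = np.clip(idx, 1, len(t_b) - 1)
          nearest = np.minimum(np.abs(t_b[idx] - t_a),
                               np.abs(t_b[idx - 1] - t_a))
          return int(np.sum(nearest <= window))

      # Synthetic example: 80% of channel-A pulses also appear on channel B
      rng = np.random.default_rng(0)
      t_a = np.sort(rng.uniform(0.0, 1.0, 1000))
      keep = rng.random(t_a.size) < 0.8
      t_b = np.sort(t_a[keep] + rng.normal(0.0, 10e-9, keep.sum()))
      print(coincidences(t_a, t_b))   # roughly 800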

  14. Bottleneck Analysis on the DoD Pre-Milestone B Acquisition Processes

    DTIC Science & Technology

    2013-04-01

    Acquisition Processes, Danielle Worger and Teresa Wu, Arizona State University; Eugene Rex Jalao, Arizona State University and University of the Philippines; Christopher...Air Force Institute of Technology; The RITE Approach to Agile Acquisition, Timothy Boyce, Iva Sherman, and Nicholas Roussel, Space and Naval Warfare

  15. Hybrid data acquisition and processing strategies with increased throughput and selectivity: pSMART analysis for global qualitative and quantitative analysis.

    PubMed

    Prakash, Amol; Peterman, Scott; Ahmad, Shadab; Sarracino, David; Frewen, Barbara; Vogelsang, Maryann; Byram, Gregory; Krastins, Bryan; Vadali, Gouri; Lopez, Mary

    2014-12-05

    Data-dependent acquisition (DDA) and data-independent acquisition strategies (DIA) have both resulted in improved understanding of proteomics samples. Both strategies have advantages and disadvantages that are well-published, where DDA is typically applied for deep discovery and DIA may be used to create sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows the objective assessment of DIA performance with respect to interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.

  16. An Operationally Responsive Space Architecture for 2025

    DTIC Science & Technology

    2008-06-22

    Organizational Relationships, Asset Loss Mitigation, Availability, Flexibility, and Streamlined Acquisition Processes. These pillars allowed the solutions, material and non-material, to be organized for...were considered. Analysis was further supported by a performance versus cost process which provided a final test of solution feasibility. Relative cost...

  17. Enabling Design for Affordability: An Epoch-Era Analysis Approach

    DTIC Science & Technology

    2013-04-01

    Analysis on the DoD Pre-Milestone B Acquisition Processes, Danielle Worger and Teresa Wu, Arizona State University; Eugene Rex Jalao, Arizona State...Management Best Practices, Brandon Keller and J. Robert Wirthlin, Air Force Institute of Technology; The RITE Approach to Agile Acquisition, Timothy Boyce...Change, Kathryn Aten and John T. Dillard, Naval Postgraduate School; A Comparative Assessment of the Navy's Future Naval Capabilities (FNC) Process

  18. A system design of data acquisition and processing for side-scatter lidar

    NASA Astrophysics Data System (ADS)

    Zhang, ZhanYe; Xie, ChenBo; Wang, ZhenZhu; Kuang, ZhiQiang; Deng, Qian; Tao, ZongMing; Liu, Dong; Wang, Yingjian

    2018-03-01

    A system for collecting data from a side-scatter lidar based on a charge-coupled device (CCD) is designed and implemented. The data acquisition system is built on the Microsoft .NET framework, and C# is used to call the CCD's dynamic link library (DLL) to realize real-time data acquisition and processing. The software stores data as txt files for post-acquisition data analysis. The system can operate the CCD device in all-day, automatic, continuous, and high-frequency data acquisition and processing conditions, capturing 24-hour records of the atmospheric scattered light intensity and retrieving the spatial and temporal properties of aerosol particles. The experimental results show that the system is convenient for observing aerosol optical characteristics near the surface.

  19. Fully Burdened Cost of Fuel Using Input-Output Analysis

    DTIC Science & Technology

    2011-12-01

    ...wide extension of the Bulk Fuels Distribution Model could be used to replace the current seven-step Fully Burdened Cost of Fuel process with a single step, allowing for less complex and... ABBREVIATIONS: AEM, Atlantic, Europe, and the Mediterranean; AOAs, Analysis of Alternatives; DAG, Defense Acquisition Guidebook; DAU, Defense Acquisition University

  20. The Requirement for Acquisition and Logistics Integration: An Examination of Reliability Management Within the Marine Corps Acquisition Process

    DTIC Science & Technology

    2002-12-01

    HMMWV family of vehicles, LVS family of vehicles, and the M198 Howitzer). The analysis is limited to an assessment of reliability management issues...AND LOGISTICS INTEGRATION: AN EXAMINATION OF RELIABILITY MANAGEMENT WITHIN THE MARINE CORPS ACQUISITION PROCESS, by Marvin L. Norcross, Jr...

  1. Department of Defense Cost Analysis Symposium (26th) on Cost Analysis in an Uncertain Defense Environment Held in Washington, DC on 9-11 September 1992

    DTIC Science & Technology

    1992-09-09

    ASHER Office of the Assistant Secretary of Defense (Program, Analysis & Evaluation) MR. JAMES C. PILGER Office of the Assistant Secretary of the Army...CHANGES TO THE MAJOR WEAPONS SYSTEM ACQUISITION PROCESS The major weapon system acquisition processes forged during the Cold War may not be practical...No one can estimate the extent of cost growth with a high degree of accuracy. However, review of 30-40 years of Cold War history does allow the

  2. The use of artificial intelligence techniques to improve the multiple payload integration process

    NASA Technical Reports Server (NTRS)

    Cutts, Dannie E.; Widgren, Brian K.

    1992-01-01

    A maximum return of science and products with a minimum expenditure of time and resources is a major goal of mission payload integration. A critical component then, in successful mission payload integration is the acquisition and analysis of experiment requirements from the principal investigator and payload element developer teams. One effort to use artificial intelligence techniques to improve the acquisition and analysis of experiment requirements within the payload integration process is described.

  3. Quantitative assessment of the impact of biomedical image acquisition on the results obtained from image analysis and processing.

    PubMed

    Koprowski, Robert

    2014-07-04

    Dedicated, automatic algorithms for image analysis and processing are becoming more and more common in medical diagnosis. When creating dedicated algorithms, many factors must be taken into consideration. They are associated with selecting the appropriate algorithm parameters and taking into account the impact of data acquisition on the results obtained. An important feature of algorithms is the possibility of their use in other medical units by other operators. This problem, namely operator's (acquisition) impact on the results obtained from image analysis and processing, has been shown on a few examples. The analysed images were obtained from a variety of medical devices such as thermal imaging, tomography devices and those working in visible light. The objects of imaging were cellular elements, the anterior segment and fundus of the eye, postural defects and others. In total, almost 200'000 images coming from 8 different medical units were analysed. All image analysis algorithms were implemented in C and Matlab. For various algorithms and methods of medical imaging, the impact of image acquisition on the results obtained is different. There are different levels of algorithm sensitivity to changes in the parameters, for example: (1) for microscope settings and the brightness assessment of cellular elements there is a difference of 8%; (2) for the thyroid ultrasound images there is a difference in marking the thyroid lobe area which results in a brightness assessment difference of 2%. The method of image acquisition in image analysis and processing also affects: (3) the accuracy of determining the temperature in the characteristic areas on the patient's back for the thermal method - error of 31%; (4) the accuracy of finding characteristic points in photogrammetric images when evaluating postural defects - error of 11%; (5) the accuracy of performing ablative and non-ablative treatments in cosmetology - error of 18% for the nose, 10% for the cheeks, and 7% for the forehead. Similarly, when: (7) measuring the anterior eye chamber - there is an error of 20%; (8) measuring the tooth enamel thickness - error of 15%; (9) evaluating the mechanical properties of the cornea during pressure measurement - error of 47%. The paper presents vital, selected issues occurring when assessing the accuracy of designed automatic algorithms for image analysis and processing in bioengineering. The impact of acquisition of images on the problems arising in their analysis has been shown on selected examples. It has also been indicated to which elements of image analysis and processing special attention should be paid in their design.

  4. A review of breast tomosynthesis. Part II. Image reconstruction, processing and analysis, and advanced applications

    PubMed Central

    Sechopoulos, Ioannis

    2013-01-01

    Many important post-acquisition aspects of breast tomosynthesis imaging can impact its clinical performance. Chief among them is the reconstruction algorithm that generates the representation of the three-dimensional breast volume from the acquired projections. But even after reconstruction, additional processes, such as artifact reduction algorithms, computer aided detection and diagnosis, among others, can also impact the performance of breast tomosynthesis in the clinical realm. In this two part paper, a review of breast tomosynthesis research is performed, with an emphasis on its medical physics aspects. In the companion paper, the first part of this review, the research performed relevant to the image acquisition process is examined. This second part will review the research on the post-acquisition aspects, including reconstruction, image processing, and analysis, as well as the advanced applications being investigated for breast tomosynthesis. PMID:23298127

  5. Improving DoD Energy Efficiency: Combining MMOWGLI Social-Media Brainstorming with Lexical Link Analysis (LLA) to Strengthen the Defense Acquisition Process

    DTIC Science & Technology

    2013-04-01

    from MIT and co-founded Quantum Intelligence, Inc. She has been a principal investigator (PI) for six DoD Small Business Innovation Research (SBIR... consciously and consistently.  Following a series of deliberate experiments, long-term procedural improvements to the formal milestone acquisition process

  6. An Exploratory Analysis of the U.S. System of Major Defense Acquisition Utilizing the CLIOS Process

    DTIC Science & Technology

    2009-09-01

    SPENDING COUNTRIES; 1. The United States' Defense Acquisition System; 2. ...Improvement Initiatives; Table 4. The Top 15 Military Spender Countries in 2008...other top military spending countries. It will end with a review of the major defense acquisition literature. This literature review will focus on

  7. Design and application of pulse information acquisition and analysis system with dynamic recognition in traditional Chinese medicine.

    PubMed

    Zhang, Jian; Niu, Xin; Yang, Xue-zhi; Zhu, Qing-wen; Li, Hai-yan; Wang, Xuan; Zhang, Zhi-guo; Sha, Hong

    2014-09-01

    To design a pulse-information acquisition and analysis system with dynamic recognition, covering the parameters of pulse position, pulse number, pulse shape, and pulse force, and to research the digitalization and visualization of common cardiovascular mechanisms of a single pulse. Flexible sensors were used to capture the radial artery pressure pulse wave, and high-frequency B-mode ultrasound scanning was used to synchronously obtain information on radial extension and axial movement in the form of dynamic images; the gathered information was then analyzed and processed together with the ECG. Finally, the pulse-information acquisition and analysis system, with the features of visualization and dynamic recognition, was established and applied to ten healthy adults. The new system overcomes the limitations of the one-dimensional pulse-information acquisition and processing methods commonly used in current pulse-diagnosis research in traditional Chinese medicine, and initiates a new approach to pulse diagnosis featuring dynamic recognition, two-dimensional information acquisition, multiplex signal combination, and deep data mining. The newly developed system can translate pulse signals into digital, visual, and measurable motion information of the vessel.

  8. Development of the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2010-01-01

    A general overview of the development of a data acquisition and processing system is presented for a pulsed, 2-micron coherent Doppler Lidar system located in NASA Langley Research Center in Hampton, Virginia, USA. It is a comprehensive system that performs high-speed data acquisition, analysis, and data display both in real time and offline. The first flight missions are scheduled for the summer of 2010 as part of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The system as well as the control software is reviewed and its requirements and unique features are discussed.

  9. Analysis of the times involved in processing and communication in a lower limb simulation system controlled by SEMG

    NASA Astrophysics Data System (ADS)

    Profumieri, A.; Bonell, C.; Catalfamo, P.; Cherniz, A.

    2016-04-01

    Virtual reality has been proposed for different applications, including the evaluation of new control strategies and training protocols for upper limb prostheses and for the study of new rehabilitation programs. In this study, a lower limb simulation environment commanded by surface electromyography signals is evaluated. The time delays generated by the acquisition and processing stages for the signals that would command the knee joint, were measured and different acquisition windows were analysed. The subjective perception of the quality of simulation was also evaluated when extra delays were added to the process. The results showed that the acquisition window is responsible for the longest delay. Also, the basic implemented processes allowed for the acquisition of three signal channels for commanding the simulation. Finally, the communication between different applications is arguably efficient, although it depends on the amount of data to be sent.

  10. Sensor, signal, and image informatics - state of the art and current topics.

    PubMed

    Lehmann, T M; Aach, T; Witte, H

    2006-01-01

    The number of articles published annually in the fields of biomedical signal and image acquisition and processing is increasing. Based on selected examples, this survey aims at comprehensively demonstrating the recent trends and developments. Four articles are selected for biomedical data acquisition, covering topics such as dose saving in CT, C-arm X-ray imaging systems for volume imaging, and the replacement of dose-intensive CT-based diagnostics with harmonic ultrasound imaging. Regarding biomedical signal analysis (BSA), the four selected articles discuss the equivalence of different time-frequency approaches for signal analysis, an application to cochlear implants, where time-frequency analysis is applied for controlling the replacement system, recent trends for fusion of different modalities, and the role of BSA as part of brain-machine interfaces. To cover the broad spectrum of publications in the field of biomedical image processing, six papers are highlighted. Important topics are content-based image retrieval in medical applications, automatic classification of tongue photographs from traditional Chinese medicine, brain perfusion analysis in single photon emission computed tomography (SPECT), model-based visualization of vascular trees, and virtual surgery, where enhanced visualization and haptic feedback techniques are combined with a sphere-filled model of the organ. The selected papers emphasize the five fields forming the chain of biomedical data processing: (1) data acquisition, (2) data reconstruction and pre-processing, (3) data handling, (4) data analysis, and (5) data visualization. Fields 1 and 2 form sensor informatics, while fields 2 to 5 form signal or image informatics with respect to the nature of the data considered. Biomedical data acquisition and pre-processing, as well as data handling, analysis, and visualization, aim at providing reliable tools for decision support that improve the quality of health care. Comprehensive evaluation of the processing methods and their reliable integration in routine applications are future challenges in the field of sensor, signal, and image informatics.

  11. Review of the all Source Analysis System as a Part of the Audit of the Effectiveness of the Defense Acquisition Board Review Process-FY 1993.

    DTIC Science & Technology

    1993-04-20

    Office of the Inspector General: Review of the All Source Analysis System as a Part of the Audit of the Effectiveness of the Defense Acquisition Board Review Process--FY 1993 (Report No. 93-087). We are providing...appreciate the courtesies extended to the audit staff. If you have questions on this report, please contact Program Director Russell A. Rau at (703) 693

  12. Data acquisition and processing system for the HT-6M tokamak fusion experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shu, Y.T.; Liu, G.C.; Pang, J.Q.

    1987-08-01

    This paper describes a high-speed data acquisition and processing system which has been successfully operated on the HT-6M tokamak fusion experimental device. The system collects, archives and analyzes up to 512 kilobytes of data from each shot of the experiment. A shot lasts 50-150 milliseconds and occurs every 5-10 minutes. The system consists of two PDP-11/24 computer systems. One PDP-11/24 is used for real-time data taking and on-line data analysis. It is based upon five CAMAC crates organized into a parallel branch. Another PDP-11/24 is used for off-line data processing. Both the data acquisition software RSX-DAS and the data processing software RSX-DAP have modular, multi-tasking and concurrent processing features.

  13. Plant phenomics: an overview of image acquisition technologies and image data analysis algorithms

    PubMed Central

    Perez-Sanz, Fernando; Navarro, Pedro J

    2017-01-01

    Abstract The study of phenomes or phenomics has been a central part of biology. The field of automatic phenotype acquisition technologies based on images has seen an important advance in the last years. As with other high-throughput technologies, it addresses a common set of problems, including data acquisition and analysis. In this review, we give an overview of the main systems developed to acquire images. We give an in-depth analysis of image processing with its major issues and the algorithms that are being used or emerging as useful to obtain data out of images in an automatic fashion. PMID:29048559

  14. A Macro-Level Analysis of SRL Processes and Their Relations to the Acquisition of a Sophisticated Mental Model of a Complex System

    ERIC Educational Resources Information Center

    Greene, Jeffrey Alan; Azevedo, Roger

    2009-01-01

    In this study, we used think-aloud verbal protocols to examine how various macro-level processes of self-regulated learning (SRL; e.g., planning, monitoring, strategy use, handling of task difficulty and demands) were associated with the acquisition of a sophisticated mental model of a complex biological system. Numerous studies examine how…

  15. Electro-Optic Data Acquisition and Processing.

    DTIC Science & Technology

    Methods for the analysis of electro-optic relaxation data are discussed. Emphasis is on numerical methods using high speed computers. A data acquisition system using a minicomputer for data manipulation is described. Relationship of the results obtained here to other possible uses is given. (Author)

  16. Data Acquisition and Processing System for Airborne Wind Profiling with a Pulsed, 2-Micron, Coherent-Detection, Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, J. Y.; Koch, G. J.; Kavaya, M. J.

    2010-01-01

    A data acquisition and signal processing system is being developed for a 2-micron airborne wind profiling coherent Doppler lidar system. This lidar, called the Doppler Aerosol Wind Lidar (DAWN), is based on a Ho:Tm:LuLiF laser transmitter and a 15-cm diameter telescope. It is being packaged for flights onboard the NASA DC-8, with the first flights in the summer of 2010 in support of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The data acquisition and processing system is housed in a compact PCI chassis and consists of four components: a digitizer, a digital signal processing (DSP) module, a video controller, and a serial port controller. The data acquisition and processing software (DAPS) is also being developed to control the system, including real-time data analysis and display. The system detects an external 10 Hz trigger pulse, initiates data acquisition and processing, and displays selected wind profile parameters such as Doppler shift, power distribution, and wind directions and velocities. Doppler shift created by aircraft motion is measured by an inertial navigation/GPS sensor and fed to the signal processing system for real-time removal of aircraft effects from wind measurements. A general overview of the system and the DAPS as well as the coherent Doppler lidar system is presented in this paper.
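
    The aircraft-motion correction described above rests on the standard coherent-lidar relation f_d = 2 v_los / lambda. The sketch below illustrates that correction; the beam-pointing vector, INS velocity, and exact transmitter wavelength are assumed inputs for illustration, not values taken from the paper.

      import numpy as np

      WAVELENGTH = 2.05e-6   # m, nominal 2-micron transmitter (assumed value)

      def remove_aircraft_doppler(f_measured, v_aircraft, beam_unit):
          """Subtract the Doppler shift induced by aircraft motion.

          f_measured : measured Doppler shift of the atmospheric return, Hz
          v_aircraft : aircraft velocity vector from the INS/GPS sensor, m/s
          beam_unit  : unit vector along the lidar line of sight
          """
          v_los = np.dot(v_aircraft, beam_unit)    # platform velocity along the beam
          f_platform = 2.0 * v_los / WAVELENGTH    # coherent-lidar Doppler relation
          return f_measured - f_platform

      # Example: 200 m/s ground speed, beam 30 degrees off nadir in the flight plane
      v = np.array([200.0, 0.0, 0.0])
      beam = np.array([np.sin(np.radians(30.0)), 0.0, -np.cos(np.radians(30.0))])
      print(remove_aircraft_doppler(1.2e8, v, beam))   # corrected shift, Hz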

  17. Data Streaming for Metabolomics: Accelerating Data Processing and Analysis from Days to Minutes

    PubMed Central

    2016-01-01

    The speed and throughput of analytical platforms has been a driving force in recent years in the “omics” technologies and while great strides have been accomplished in both chromatography and mass spectrometry, data analysis times have not benefited at the same pace. Even though personal computers have become more powerful, data transfer times still represent a bottleneck in data processing because of the increasingly complex data files and studies with a greater number of samples. To meet the demand of analyzing hundreds to thousands of samples within a given experiment, we have developed a data streaming platform, XCMS Stream, which capitalizes on the acquisition time to compress and stream recently acquired data files to data processing servers, mimicking just-in-time production strategies from the manufacturing industry. The utility of this XCMS Online-based technology is demonstrated here in the analysis of T cell metabolism and other large-scale metabolomic studies. A large scale example on a 1000 sample data set demonstrated a 10 000-fold time savings, reducing data analysis time from days to minutes. Further, XCMS Stream has the capability to increase the efficiency of downstream biochemical dependent data acquisition (BDDA) analysis by initiating data conversion and data processing on subsets of data acquired, expanding its application beyond data transfer to smart preliminary data decision-making prior to full acquisition. PMID:27983788
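
    A minimal sketch of the just-in-time idea described above (compress and ship each completed raw file to a processing server while acquisition continues) is given below. The directory layout, file extension, 60-second completion heuristic, and upload step are all hypothetical; this is not the XCMS Stream implementation.

      import gzip
      import shutil
      import time
      from pathlib import Path

      ACQ_DIR = Path("acquired_data")    # where the instrument writes finished runs
      shipped = set()

      def compress_and_ship(path: Path) -> None:
          """Compress a completed data file and hand it off for processing."""
          gz_path = path.with_suffix(path.suffix + ".gz")
          with open(path, "rb") as src, gzip.open(gz_path, "wb") as dst:
              shutil.copyfileobj(src, dst)
          # upload(gz_path)  # placeholder for the transfer to the processing server
          print(f"shipped {gz_path.name}")

      while True:                        # runs alongside the acquisition
          for f in ACQ_DIR.glob("*.raw"):
              if f.name not in shipped and time.time() - f.stat().st_mtime > 60:
                  compress_and_ship(f)   # untouched for 60 s => assumed complete
                  shipped.add(f.name)
          time.sleep(10)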

  18. Data streaming for metabolomics: Accelerating data processing and analysis from days to minutes

    DOE PAGES

    Montenegro-Burke, J. Rafael; Aisporna, Aries E.; Benton, H. Paul; ...

    2016-12-16

    The speed and throughput of analytical platforms has been a driving force in recent years in the “omics” technologies and while great strides have been accomplished in both chromatography and mass spectrometry, data analysis times have not benefited at the same pace. Even though personal computers have become more powerful, data transfer times still represent a bottleneck in data processing because of the increasingly complex data files and studies with a greater number of samples. To meet the demand of analyzing hundreds to thousands of samples within a given experiment, we have developed a data streaming platform, XCMS Stream, which capitalizes on the acquisition time to compress and stream recently acquired data files to data processing servers, mimicking just-in-time production strategies from the manufacturing industry. The utility of this XCMS Online-based technology is demonstrated here in the analysis of T cell metabolism and other large-scale metabolomic studies. A large scale example on a 1000 sample data set demonstrated a 10 000-fold time savings, reducing data analysis time from days to minutes. Here, XCMS Stream has the capability to increase the efficiency of downstream biochemical dependent data acquisition (BDDA) analysis by initiating data conversion and data processing on subsets of data acquired, expanding its application beyond data transfer to smart preliminary data decision-making prior to full acquisition.

  19. Data Streaming for Metabolomics: Accelerating Data Processing and Analysis from Days to Minutes.

    PubMed

    Montenegro-Burke, J Rafael; Aisporna, Aries E; Benton, H Paul; Rinehart, Duane; Fang, Mingliang; Huan, Tao; Warth, Benedikt; Forsberg, Erica; Abe, Brian T; Ivanisevic, Julijana; Wolan, Dennis W; Teyton, Luc; Lairson, Luke; Siuzdak, Gary

    2017-01-17

    The speed and throughput of analytical platforms has been a driving force in recent years in the "omics" technologies and while great strides have been accomplished in both chromatography and mass spectrometry, data analysis times have not benefited at the same pace. Even though personal computers have become more powerful, data transfer times still represent a bottleneck in data processing because of the increasingly complex data files and studies with a greater number of samples. To meet the demand of analyzing hundreds to thousands of samples within a given experiment, we have developed a data streaming platform, XCMS Stream, which capitalizes on the acquisition time to compress and stream recently acquired data files to data processing servers, mimicking just-in-time production strategies from the manufacturing industry. The utility of this XCMS Online-based technology is demonstrated here in the analysis of T cell metabolism and other large-scale metabolomic studies. A large scale example on a 1000 sample data set demonstrated a 10 000-fold time savings, reducing data analysis time from days to minutes. Further, XCMS Stream has the capability to increase the efficiency of downstream biochemical dependent data acquisition (BDDA) analysis by initiating data conversion and data processing on subsets of data acquired, expanding its application beyond data transfer to smart preliminary data decision-making prior to full acquisition.

  20. Arms Transfers to Venezuela: A Comparative and Critical Analysis of the Acquisition Process (1980-1996).

    DTIC Science & Technology

    1999-03-01

    Budget (Oficina Central de Presupuesto [OCEPRE]), which is the presidential agency with overall responsibility to formulate the national budget...Budget (Oficina Central de Presupuesto OCEPRE), and they receive a special treatment in the Venezuelan Budgetary process. The OCEPRE is the...the Central Office of Budget (Oficina Central de Presupuesto , OCEPRE). This occurs when funds for weapons acquisitions come from the ordinary budget

  1. 3D acquisition and modeling for flint artefacts analysis

    NASA Astrophysics Data System (ADS)

    Loriot, B.; Fougerolle, Y.; Sestier, C.; Seulin, R.

    2007-07-01

    In this paper, we are interested in accurate acquisition and modeling of flint artefacts. Archaeologists need accurate geometry measurements to refine their understanding of the flint artefact manufacturing process. Current techniques require several operations. First, a copy of a flint artefact is reproduced. The copy is then sliced, and a picture is taken of each slice. Eventually, geometric information is manually determined from the pictures. Such a technique is very time consuming, and the processing applied to the original, as well as to the reproduced object, induces several measurement errors (prototyping approximations, slicing, image acquisition, and measurement). By using 3D scanners, we significantly reduce the number of operations related to data acquisition and completely suppress the prototyping step to obtain an accurate 3D model. The 3D models are segmented into sliced parts that are then analyzed. Each slice is then automatically fitted with a mathematical representation. Such a representation offers several interesting properties: geometric features can be characterized (e.g., shape, curvature, sharp edges), and the shape of the original piece of stone can be extrapolated. The contributions of this paper are an acquisition technique using 3D scanners that strongly reduces human intervention, acquisition time, and measurement errors, and the representation of flint artefacts as mathematical 2D sections that enable accurate analysis.

  2. Probabilistic Round Trip Contamination Analysis of a Mars Sample Acquisition and Handling Process Using Markovian Decompositions

    NASA Technical Reports Server (NTRS)

    Hudson, Nicolas; Lin, Ying; Barengoltz, Jack

    2010-01-01

    A method for evaluating the probability of a Viable Earth Microorganism (VEM) contaminating a sample during the sample acquisition and handling (SAH) process of a potential future Mars Sample Return mission is developed. A scenario where multiple core samples would be acquired using a rotary percussive coring tool, deployed from an arm on a MER-class rover, is analyzed. The analysis is conducted in a structured way by decomposing the sample acquisition and handling process into a series of discrete time steps and breaking the physical system into a set of relevant components. At each discrete time step, two key functions are defined: the probability of a VEM being released from each component, and the transport matrix, which represents the probability of VEM transport from one component to another. By defining the expected number of VEMs on each component at the start of the sampling process, these decompositions allow the expected number of VEMs on each component at each sampling step to be represented as a Markov chain. This formalism provides a rigorous mathematical framework in which to analyze the probability of a VEM entering the sample chain, as well as making the analysis tractable by breaking the process down into small analyzable steps.
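
    The bookkeeping behind such a Markov-chain decomposition can be illustrated with a small numerical sketch. The component labels, release probabilities, and transport matrix below are invented placeholders, not values from the mission analysis.

      import numpy as np

      # Hypothetical components: 0 = coring bit, 1 = sample tube, 2 = rover arm
      n = np.array([50.0, 5.0, 200.0])       # expected VEMs on each component at step 0

      # p_release[i]: probability a VEM leaves component i during one time step
      p_release = np.array([0.02, 0.01, 0.005])

      # T[i, j]: probability a VEM released from component i lands on component j
      T = np.array([[0.0, 0.6, 0.4],
                    [0.1, 0.0, 0.9],
                    [0.3, 0.7, 0.0]])

      for step in range(5):                  # five discrete sampling steps
          released = n * p_release           # expected VEMs leaving each component
          n = n - released + released @ T    # redistribute via the transport matrix

      print(n)                               # expected VEM loading after five steps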

  3. Conducting financial due diligence of medical practices.

    PubMed

    Louiselle, P

    1995-12-01

    Many healthcare organizations are acquiring medical practices in an effort to build more integrated systems of healthcare products and services. This acquisition activity must be approached cautiously to ensure that medical practices being acquired do not have deficiencies that would jeopardize integration efforts. Conducting a thorough due diligence analysis of medical practices before finalizing the transaction can limit the acquiring organizations' legal and financial exposure and is a necessary component to the acquisition process. The author discusses the components of a successful financial due diligence analysis and addresses some of the risk factors in a practice acquisition.

  4. Plant phenomics: an overview of image acquisition technologies and image data analysis algorithms.

    PubMed

    Perez-Sanz, Fernando; Navarro, Pedro J; Egea-Cortines, Marcos

    2017-11-01

    The study of phenomes or phenomics has been a central part of biology. The field of automatic phenotype acquisition technologies based on images has seen an important advance in the last years. As with other high-throughput technologies, it addresses a common set of problems, including data acquisition and analysis. In this review, we give an overview of the main systems developed to acquire images. We give an in-depth analysis of image processing with its major issues and the algorithms that are being used or emerging as useful to obtain data out of images in an automatic fashion.

  5. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal

    DTIC Science & Technology

    2013-04-01

    Maryland Bottleneck Analysis on the DoD Pre-Milestone B Acquisition Processes, Danielle Worger and Teresa Wu, Arizona State University; Eugene Rex ...Creative Program Management Best Practices, Brandon Keller and J. Robert Wirthlin, Air Force Institute of Technology; The RITE Approach to Agile ...Mechanism for Adaptive Change, Kathryn Aten and John T. Dillard, Naval Postgraduate School; A Comparative Assessment of the Navy's Future Naval

  6. General-purpose interface bus for multiuser, multitasking computer system

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Roth, Don J.; Stang, David B.

    1990-01-01

    The architecture of a multiuser, multitasking, virtual-memory computer system intended for use by a medium-sized research group is described. There are three central processing units (CPU) in the configuration, each with 16 MB memory, and two 474 MB hard disks attached. CPU 1 is designed for data analysis and contains an array processor for fast Fourier transformations. In addition, CPU 1 shares display images viewed with the image processor. CPU 2 is designed for image analysis and display. CPU 3 is designed for data acquisition and contains 8 GPIB channels and an analog-to-digital conversion input/output interface with 16 channels. Up to 9 users can access the third CPU simultaneously for data acquisition. Focus is placed on the optimization of hardware interfaces and software, facilitating instrument control, data acquisition, and processing.

  7. LabVIEW: a software system for data acquisition, data analysis, and instrument control.

    PubMed

    Kalkman, C J

    1995-01-01

    Computer-based data acquisition systems play an important role in clinical monitoring and in the development of new monitoring tools. LabVIEW (National Instruments, Austin, TX) is a data acquisition and programming environment that allows flexible acquisition and processing of analog and digital data. The main feature that distinguishes LabVIEW from other data acquisition programs is its highly modular graphical programming language, "G," and a large library of mathematical and statistical functions. The advantage of graphical programming is that the code is flexible, reusable, and self-documenting. Subroutines can be saved in a library and reused without modification in other programs. This dramatically reduces development time and enables researchers to develop or modify their own programs. LabVIEW uses a large amount of processing power and computer memory, thus requiring a powerful computer. A large-screen monitor is desirable when developing larger applications. LabVIEW is excellently suited for testing new monitoring paradigms, analysis algorithms, or user interfaces. The typical LabVIEW user is the researcher who wants to develop a new monitoring technique, a set of new (derived) variables by integrating signals from several existing patient monitors, closed-loop control of a physiological variable, or a physiological simulator.

  8. A sophisticated, multi-channel data acquisition and processing system for high frequency noise research

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Bridges, James

    1992-01-01

    A sophisticated, multi-channel computerized data acquisition and processing system was developed at the NASA LeRC for use in noise experiments. This technology, which is available for transfer to industry, provides a convenient, cost-effective alternative to analog tape recording for high frequency acoustic measurements. This system provides 32-channel acquisition of microphone signals with an analysis bandwidth up to 100 kHz per channel. Cost was minimized through the use of off-the-shelf components. Requirements to allow for future expansion were met by choosing equipment which adheres to established industry standards for hardware and software. Data processing capabilities include narrow band and 1/3 octave spectral analysis, compensation for microphone frequency response/directivity, and correction of acoustic data to standard day conditions. The system was used successfully in a major wind tunnel test program at NASA LeRC to acquire and analyze jet noise data in support of the High Speed Civil Transport (HSCT) program.
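
    The 1/3 octave spectral analysis mentioned above is commonly implemented by summing narrowband spectral power into bands around standard centre frequencies. The sketch below shows that reduction starting from an already-computed power spectral density; the band-edge convention (a factor of 2^(1/6) around each centre) and the example PSD are assumptions, not the system's documented algorithm.

      import numpy as np

      def third_octave_levels(freqs, psd, f_center):
          """Sum a narrowband PSD (units^2/Hz) into 1/3-octave band powers.

          freqs, psd : narrowband frequency axis (Hz) and power spectral density
          f_center   : array of 1/3-octave band centre frequencies (Hz)
          """
          lo = f_center / 2.0 ** (1.0 / 6.0)    # lower band edges
          hi = f_center * 2.0 ** (1.0 / 6.0)    # upper band edges
          df = freqs[1] - freqs[0]
          return np.array([psd[(freqs >= l) & (freqs < h)].sum() * df
                           for l, h in zip(lo, hi)])

      # Example: flat (white) PSD over a 100 kHz analysis bandwidth
      freqs = np.arange(0.0, 100_000.0, 10.0)
      psd = np.full_like(freqs, 1e-6)
      centers = 1000.0 * 2.0 ** (np.arange(-10, 20) / 3.0)   # ~99 Hz to ~81 kHz
      print(third_octave_levels(freqs, psd, centers))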

  9. Estimating Manpower, Personnel, and Training Requirements Early in the Weapon System Acquisition Process: An Application of the HARDMAN Methodology to the Army’s Division Support Weapon System

    DTIC Science & Technology

    1984-02-01

    identifies the supply of personnel and training resources that can be expected at critical dates in the conceptual weapon system's acquisition schedule...impact analysis matches demand to supply and identifies shortfalls in skills, new skill requirements, and high resource drivers. The tradeoff analysis...system. Step 5 - Conduct Impact Analysis. The Impact Analysis determines the Army's supply of those personnel and training resources required by the

  10. Economics of human performance and systems total ownership cost.

    PubMed

    Onkham, Wilawan; Karwowski, Waldemar; Ahram, Tareq Z

    2012-01-01

    Financial costs of investing in people, associated with training, acquisition, recruiting, and resolving human errors, have a significant impact on increased total ownership costs. These costs can also contribute to exaggerated budgets and delayed schedules. The study of economic assessment of human performance in the system acquisition process enhances the visibility of hidden cost drivers, which supports informed program management decisions. This paper presents a literature review of human total ownership cost (HTOC) and cost impacts on overall system performance. Economic value assessment models such as cost-benefit analysis, risk-cost tradeoff analysis, expected value of utility function analysis (EV), the growth readiness matrix, the multi-attribute utility technique, and multi-regression models were introduced to reflect the HTOC and human performance-technology tradeoffs in terms of dollar value. A human total ownership regression model is introduced to address measurement of the influencing human performance cost components. Results from this study will increase understanding of relevant cost drivers in the system acquisition process over the long term.
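
    One of the assessment models listed above, expected value of a utility function, reduces to weighting each outcome's utility by its probability. The alternatives, probabilities, and utilities in the sketch below are invented purely for illustration.

      # Hypothetical expected-value-of-utility comparison of two alternatives.
      # The outcome probabilities and utilities below are invented placeholders.
      alternatives = {
          "added simulator training": [(0.7, 0.9), (0.3, 0.4)],  # (probability, utility)
          "status quo":               [(0.5, 0.6), (0.5, 0.3)],
      }

      def expected_utility(outcomes):
          """EV = sum of probability-weighted utilities."""
          return sum(p * u for p, u in outcomes)

      for name, outcomes in alternatives.items():
          print(f"{name}: EV = {expected_utility(outcomes):.2f}")
      # added simulator training: EV = 0.75; status quo: EV = 0.45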

  11. Conducting a Competitive Prototype Acquisition Program: An Account of the Joint Light Tactical Vehicle (JLTV) Technology Development Phase

    DTIC Science & Technology

    2013-03-01

    B. REQUIREMENTS ANALYSIS PROCESS; 1. Requirements Management and... Analysis Plan; 2. Knowledge Point Reviews; 3. ...are Identified; 5. RMAP/CDD Process Analysis and Results; IV. TD PHASE BEGINS

  12. Systematic reviews, systematic error and the acquisition of clinical knowledge

    PubMed Central

    2010-01-01

    Background Since its inception, evidence-based medicine, and its application through systematic reviews, has been widely accepted. However, it has also been strongly criticised and resisted by some academic groups and clinicians. One of the main criticisms of evidence-based medicine is that it appears to claim to have unique access to absolute scientific truth and thus devalues and replaces other types of knowledge sources. Discussion The various types of clinical knowledge sources are categorised on the basis of Kant's categories of knowledge acquisition, as being either 'analytic' or 'synthetic'. It is shown that these categories do not act in opposition but rather depend upon each other. The unity of analysis and synthesis in knowledge acquisition is demonstrated during the process of systematic reviewing of clinical trials. Systematic reviews constitute comprehensive synthesis of clinical knowledge but depend upon plausible, analytical hypothesis development for the trials reviewed. The dangers of systematic error regarding the internal validity of acquired knowledge are highlighted on the basis of empirical evidence. It has been shown that the systematic review process reduces systematic error, thus ensuring high internal validity. It is argued that this process does not exclude other types of knowledge sources. Instead, amongst these other types it functions as an integrated element during the acquisition of clinical knowledge. Conclusions The acquisition of clinical knowledge is based on interaction between analysis and synthesis. Systematic reviews provide the highest form of synthetic knowledge acquisition in terms of achieving internal validity of results. In that capacity it informs the analytic knowledge of the clinician but does not replace it. PMID:20537172

  13. Real-Time Data Display

    NASA Technical Reports Server (NTRS)

    Pedings, Marc

    2007-01-01

    RT-Display is a MATLAB-based data acquisition environment designed to use a variety of commercial off-the-shelf (COTS) hardware to digitize analog signals to a standard data format usable by other post-acquisition data analysis tools. This software presents the acquired data in real time using a variety of signal-processing algorithms. The acquired data is stored in a standard Operator Interactive Signal Processing Software (OISPS) data-formatted file. RT-Display is primarily configured to use the Agilent VXI (or equivalent) data acquisition boards used in such systems as MIDDAS (Multi-channel Integrated Dynamic Data Acquisition System). The software is generalized and deployable in almost any testing environment, without limitations or proprietary configuration for a specific test program or project. With the Agilent hardware configured and in place, users can start the program and, in one step, immediately begin digitizing multiple channels of data. Once the acquisition is completed, data is converted into a common binary format that also can be translated to specific formats used by external analysis software, such as OISPS and PC-Signal (product of AI Signal Research Inc.). RT-Display at the time of this reporting was certified on Agilent hardware capable of acquisition up to 196,608 samples per second. Data signals are presented to the user on-screen simultaneously for 16 channels. Each channel can be viewed individually, with a maximum capability of 160 signal channels (depending on hardware configuration). Current signal presentations include: time data, fast Fourier transforms (FFT), and power spectral density plots (PSD). Additional processing algorithms can be easily incorporated into this environment.
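
    The on-screen products named above (time data, FFTs, and PSD plots) correspond to standard spectral estimates. A minimal sketch for a single channel is shown below; the sample rate is taken from the certified figure quoted in the abstract, while the Welch segment length and the test signal are assumptions.

      import numpy as np
      from scipy import signal

      FS = 196_608   # samples per second, the certified acquisition rate quoted above

      def spectra(x):
          """Return (freqs, FFT magnitude) and (freqs, Welch PSD) for one channel."""
          n = len(x)
          fft_freqs = np.fft.rfftfreq(n, d=1.0 / FS)
          fft_mag = np.abs(np.fft.rfft(x)) / n
          psd_freqs, psd = signal.welch(x, fs=FS, nperseg=4096)
          return (fft_freqs, fft_mag), (psd_freqs, psd)

      # Example: a 10 kHz tone in noise standing in for one acquired channel
      t = np.arange(FS) / FS                       # one second of data
      x = np.sin(2 * np.pi * 10_000 * t) + 0.1 * np.random.randn(FS)
      (_, mag), (f_psd, psd) = spectra(x)
      print(f_psd[np.argmax(psd)])                 # peak near 10 kHz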

  14. Processing and Analysis of Multichannel Extracellular Neuronal Signals: State-of-the-Art and Challenges

    PubMed Central

    Mahmud, Mufti; Vassanelli, Stefano

    2016-01-01

    In recent years multichannel neuronal signal acquisition systems have allowed scientists to focus on research questions which were otherwise impossible. They act as a powerful means to study brain (dys)functions in in-vivo and in-vitro animal models. Typically, each session of electrophysiological experiments with multichannel data acquisition systems generates a large amount of raw data. For example, a 128-channel signal acquisition system with 16-bit A/D conversion and a 20 kHz sampling rate will generate approximately 17 GB of data per hour (uncompressed). This poses an important and challenging problem of inferring conclusions from the large amounts of acquired data. Thus, automated signal processing and analysis tools are becoming a key component in neuroscience research, facilitating extraction of relevant information from neuronal recordings in a reasonable time. The purpose of this review is to introduce the reader to the current state-of-the-art of open-source packages for (semi)automated processing and analysis of multichannel extracellular neuronal signals (i.e., neuronal spikes, local field potentials, electroencephalogram, etc.), and the existing Neuroinformatics infrastructure for tool and data sharing. The review is concluded by pinpointing some major challenges that are being faced, which include the development of novel benchmarking techniques, cloud-based distributed processing and analysis tools, as well as defining novel means to share and standardize data. PMID:27313507
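
    The roughly 17 GB per hour figure quoted above follows directly from the channel count, bit depth, and sampling rate, as the short check below shows.

      # Reproduce the ~17 GB/hour estimate quoted above.
      channels = 128
      bytes_per_sample = 2          # 16-bit A/D conversion
      fs = 20_000                   # samples per second per channel

      bytes_per_hour = channels * bytes_per_sample * fs * 3600
      print(bytes_per_hour / 2**30)   # ~17.2 GiB per hour, uncompressed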

  15. Simulate different environments TDLAS On the analysis of the test signal strength

    NASA Astrophysics Data System (ADS)

    Li, Xin; Zhou, Tao; Jia, Xiaodong

    2014-12-01

A TDLAS system exploits the wavelength-tuning characteristics of a laser diode to scan across a gas absorption line and measure the absorption spectrum, from which the temperature, pressure, flow rate and concentration of the gas in the probed volume can be determined. In this laboratory study, TDLAS gas detection was used to experimentally simulate the water vapor and smoke produced by engine combustion. An optical lens system received the probe beam for signal acquisition, and the interference affecting the measured signal was analyzed. Water vapor and smoke were simulated as two different interference environments in the sample cell. In both experiments, the effect of the environmental interference on the acquired optical absorption signal was examined, the variation of the signal amplitude was analyzed, and the related signal data were recorded. The results provide reference experimental data for signal acquisition under the on-site conditions of the engine combustion process.
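
    For orientation only: direct-absorption TDLAS measurements are conventionally interpreted through the Beer-Lambert law, which the abstract does not spell out. The sketch below illustrates that generic relation; the function name and all numerical values are invented for the example and are not taken from this work.

      import numpy as np

      # Beer-Lambert relation underlying direct-absorption TDLAS:
      # I_t = I_0 * exp(-S * phi(nu) * N * L)
      def transmitted_intensity(I0, line_strength, line_shape, number_density, path_length):
          """Transmitted intensity after absorption along the optical path."""
          absorbance = line_strength * line_shape * number_density * path_length
          return I0 * np.exp(-absorbance)

      # Hypothetical numbers purely for illustration
      I0 = 1.0        # incident intensity (arbitrary units)
      S = 1e-20       # line strength (cm^-1 / (molecule cm^-2))
      phi = 2.0       # normalized line-shape value at line center (cm)
      N = 5e17        # absorber number density (molecules/cm^3)
      L = 10.0        # optical path length (cm)
      print(transmitted_intensity(I0, S, phi, N, L))   # ~0.90 of the incident intensity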

  16. DA+ data acquisition and analysis software at the Swiss Light Source macromolecular crystallography beamlines.

    PubMed

    Wojdyla, Justyna Aleksandra; Kaminski, Jakub W; Panepucci, Ezequiel; Ebner, Simon; Wang, Xiaoqiang; Gabadinho, Jose; Wang, Meitian

    2018-01-01

    Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods.

  17. A high speed data acquisition and analysis system for transonic velocity, density, and total temperature fluctuations

    NASA Technical Reports Server (NTRS)

    Clukey, Steven J.

    1988-01-01

The high speed Dynamic Data Acquisition System (DDAS) is described which provides the capability for the simultaneous measurement of velocity, density, and total temperature fluctuations. The system of hardware and software is described in the context of the wind tunnel environment. The DDAS replaces both a recording mechanism and a separate data processing system. The data acquisition and data reduction process has been combined within DDAS. DDAS receives input from hot wires and anemometers, amplifies and filters the signals with computer controlled modules, and converts the analog signals to digital with real-time simultaneous digitization followed by digital recording on disk or tape. Automatic acquisition (either from a computer link to an existing wind tunnel acquisition system, or from data acquisition facilities within DDAS) collects necessary calibration and environment data. The generation of hot wire sensitivities is done in DDAS, as is the application of sensitivities to the hot wire data to generate turbulence quantities. The presentation of the raw and processed data, in terms of root mean square values of velocity, density and temperature, and the processing of the spectral data is accomplished on demand in near-real-time with DDAS. A comprehensive description of the interface to the DDAS and of the internal mechanisms will be presented. A summary of operations relevant to the use of the DDAS will be provided.

  18. Information Acquisition, Analysis and Integration

    DTIC Science & Technology

    2016-08-03

Subject terms from the report documentation page: sensing and processing theory, applications, signal processing, image and video processing, machine learning, technology transfer. Reported accomplishments include elegant new approaches to long-standing problems such as image and video deblurring, and publications including G. Polatkan, G. Sapiro, D. Blei, D. B. Dunson, and L. Carin, "Deep learning with hierarchical convolutional factor analysis," IEEE.

  19. 50 CFR 37.53 - Submission of data and information.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... processing. (c) Processed geophysical information shall be submitted with extraneous signals and interference... of data gathering or utilization, i.e., acquisition, processing, reprocessing, analysis, and... survey conducted under the permittee's permit, including digital navigational data, if obtained, and...

  20. 50 CFR 37.53 - Submission of data and information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... processing. (c) Processed geophysical information shall be submitted with extraneous signals and interference... of data gathering or utilization, i.e., acquisition, processing, reprocessing, analysis, and... survey conducted under the permittee's permit, including digital navigational data, if obtained, and...

  1. 50 CFR 37.53 - Submission of data and information.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... processing. (c) Processed geophysical information shall be submitted with extraneous signals and interference... of data gathering or utilization, i.e., acquisition, processing, reprocessing, analysis, and... survey conducted under the permittee's permit, including digital navigational data, if obtained, and...

  2. 50 CFR 37.53 - Submission of data and information.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... processing. (c) Processed geophysical information shall be submitted with extraneous signals and interference... of data gathering or utilization, i.e., acquisition, processing, reprocessing, analysis, and... survey conducted under the permittee's permit, including digital navigational data, if obtained, and...

  3. 50 CFR 37.53 - Submission of data and information.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... processing. (c) Processed geophysical information shall be submitted with extraneous signals and interference... of data gathering or utilization, i.e., acquisition, processing, reprocessing, analysis, and... survey conducted under the permittee's permit, including digital navigational data, if obtained, and...

  4. TARGET's role in knowledge acquisition, engineering, validation, and documentation

    NASA Technical Reports Server (NTRS)

    Levi, Keith R.

    1994-01-01

We investigate the use of the TARGET task analysis tool in the development of rule-based expert systems. We found TARGET to be very helpful in the knowledge acquisition process. It enabled us to perform knowledge acquisition with one knowledge engineer rather than two. In addition, it improved communication between the domain expert and knowledge engineer. We also found it to be useful for both the rule development and refinement phases of the knowledge engineering process. Using the network in these phases required us to develop guidelines that enabled us to easily translate the network into production rules. A significant requirement for TARGET remaining useful throughout the knowledge engineering process was the need to carefully maintain consistency between the network and the rule representations. Maintaining consistency not only benefited the knowledge engineering process, but also had significant payoffs in the areas of validation of the expert system and documentation of the knowledge in the system.

  5. Analysis of the Federal Aviation Administration’s Host Computer Acquisition Process and Potential Application in Department of Defense Acquisitions

    DTIC Science & Technology

    1988-09-01

defense programs lost far more to inefficient procedures than to fraud and dishonesty (President's Commission, 1986c:15). Based on the Commission...recommendations from current studies, lessons learned from a successful program, and DOD expert opinions to develop an acquisition management strategy that...established for the alternative(s) selected in the preceding phase. 5. In the concept demonstration/validation phase the technical risk and economic

  6. Draft Environmental Impact Statement. MX Deployment Area Selection and Land Withdrawal/Acquisition DEIS. Volume IV. Part I. Environmental Consequences to the Study Regions and Operating Base Vicinities.

    DTIC Science & Technology

    1980-12-01

consequences such that the ecosystem will not recover at all, (7) are the consequences such that the impact may be large but the recovery process... [Cover-page and report-form residue: Operating Base Vicinities, Impact Analysis Process, Deployment Area Selection and Land Withdrawal/Acquisition DEIS, Department of the Air Force; Draft Environmental Impact Statement, MX Deployment Area Selection, Environmental Consequences, Draft, December 1980]

  7. Data acquisition, processing and firing aid software for multichannel EMP simulation

    NASA Astrophysics Data System (ADS)

    Eumurian, Gregoire; Arbaud, Bruno

    1986-08-01

    Electromagnetic compatibility testing yields a large quantity of data for systematic analysis. An automated data acquisition system has been developed. It is based on standard EMP instrumentation which allows a pre-established program to be followed whilst orientating the measurements according to the results obtained. The system is controlled by a computer running interactive programs (multitask windows, scrollable menus, mouse, etc.) which handle the measurement channels, files, displays and process data in addition to providing an aid to firing.

  8. Full-field wrist pulse signal acquisition and analysis by 3D Digital Image Correlation

    NASA Astrophysics Data System (ADS)

    Xue, Yuan; Su, Yong; Zhang, Chi; Xu, Xiaohai; Gao, Zeren; Wu, Shangquan; Zhang, Qingchuan; Wu, Xiaoping

    2017-11-01

Pulse diagnosis is an essential part of the four basic diagnostic methods (inspection, listening, inquiring and palpation) of traditional Chinese medicine, but it depends on long training and rich experience, so computerized pulse acquisition has been proposed and studied to ensure objectivity. To imitate the way doctors use three fingertips at different pressures to feel fluctuations in regions containing three acupoints, we established a five-dimensional pulse signal acquisition system that adopts a non-contact optical metrology method, 3D digital image correlation, to record the full-field displacements of skin fluctuations under different pressures. The system realizes real-time full-field vibration mode observation at 10 FPS, and the maximum sampling frequency is 472 Hz for detailed post-processing. After acquisition, the signals are analyzed according to amplitude, pressure, and pulse wave velocity. The proposed system provides a novel optical approach to digitizing pulse diagnosis and to massive pulse signal data acquisition for various types of patients.
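
    Of the analysis quantities mentioned above, pulse wave velocity can in principle be estimated from the time delay between displacement waveforms measured at two skin locations a known distance apart. The cross-correlation sketch below illustrates that idea only; it is not the authors' processing chain, and the sampling rate, spacing and signals are invented.

      import numpy as np

      fs = 472.0                  # assumed sampling rate (Hz)
      distance_m = 0.05           # assumed spacing between the two measurement regions (m)

      t = np.arange(0, 5.0, 1.0 / fs)
      proximal = np.sin(2 * np.pi * 1.2 * t)               # synthetic pulse waveform
      delay_s = 0.01
      distal = np.sin(2 * np.pi * 1.2 * (t - delay_s))     # same waveform, delayed

      # Delay estimate from the peak of the cross-correlation
      xcorr = np.correlate(distal - distal.mean(), proximal - proximal.mean(), mode="full")
      lag_samples = np.argmax(xcorr) - (len(proximal) - 1)
      estimated_delay = lag_samples / fs

      pulse_wave_velocity = distance_m / estimated_delay
      print(estimated_delay, pulse_wave_velocity)          # ~0.01 s delay, ~5 m/s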

  9. PLACE: an open-source python package for laboratory automation, control, and experimentation.

    PubMed

    Johnson, Jami L; Tom Wörden, Henrik; van Wijk, Kasper

    2015-02-01

    In modern laboratories, software can drive the full experimental process from data acquisition to storage, processing, and analysis. The automation of laboratory data acquisition is an important consideration for every laboratory. When implementing a laboratory automation scheme, important parameters include its reliability, time to implement, adaptability, and compatibility with software used at other stages of experimentation. In this article, we present an open-source, flexible, and extensible Python package for Laboratory Automation, Control, and Experimentation (PLACE). The package uses modular organization and clear design principles; therefore, it can be easily customized or expanded to meet the needs of diverse laboratories. We discuss the organization of PLACE, data-handling considerations, and then present an example using PLACE for laser-ultrasound experiments. Finally, we demonstrate the seamless transition to post-processing and analysis with Python through the development of an analysis module for data produced by PLACE automation. © 2014 Society for Laboratory Automation and Screening.

  10. DA+ data acquisition and analysis software at the Swiss Light Source macromolecular crystallography beamlines

    PubMed Central

    Wojdyla, Justyna Aleksandra; Kaminski, Jakub W.; Ebner, Simon; Wang, Xiaoqiang; Gabadinho, Jose; Wang, Meitian

    2018-01-01

    Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods. PMID:29271779

  11. Big Data Analysis of Manufacturing Processes

    NASA Astrophysics Data System (ADS)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-11-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.
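
    The abstract does not specify the anomaly-detection algorithms used. As a minimal sketch of the general idea behind a self-learning assistance system (learn a statistical model of normal operation, then flag large deviations), the following hypothetical Python example uses per-signal z-scores; it is not the system described above.

      import numpy as np

      class SimpleAnomalyDetector:
          """Learn per-signal mean/std from normal operation, flag large deviations."""

          def __init__(self, threshold_sigma=4.0):
              self.threshold_sigma = threshold_sigma
              self.mean = None
              self.std = None

          def fit(self, normal_data):
              # normal_data: (n_samples, n_signals) array recorded during fault-free operation
              self.mean = normal_data.mean(axis=0)
              self.std = normal_data.std(axis=0) + 1e-12
              return self

          def score(self, sample):
              # Largest absolute z-score across all monitored signals
              return np.max(np.abs((sample - self.mean) / self.std))

          def is_anomaly(self, sample):
              return self.score(sample) > self.threshold_sigma

      # Hypothetical usage with synthetic process data
      rng = np.random.default_rng(0)
      training = rng.normal(size=(10_000, 8))            # normal operation
      detector = SimpleAnomalyDetector().fit(training)
      print(detector.is_anomaly(rng.normal(size=8)))     # usually False for normal data
      print(detector.is_anomaly(np.full(8, 10.0)))       # clear deviation: True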

  12. Evaluation of the quality of the teaching-learning process in undergraduate courses in Nursing.

    PubMed

    González-Chordá, Víctor Manuel; Maciá-Soler, María Loreto

    2015-01-01

to identify aspects of improvement of the quality of the teaching-learning process through the analysis of tools that evaluated the acquisition of skills by undergraduate students of Nursing. Prospective longitudinal study conducted in a population of 60 second-year Nursing students based on registration data, from which quality indicators that evaluate the acquisition of skills were obtained, with descriptive and inferential analysis. Nine items were identified and nine learning activities included in the assessment tools that did not reach the established quality indicators (p<0.05). There are statistically significant differences depending on the hospital and clinical practices unit (p<0.05). The analysis of the evaluation tools used in the article "Nursing Care in Welfare Processes" of the analyzed university undergraduate course enabled the detection of the areas for improvement in the teaching-learning process. The challenge of education in nursing is to reach the best clinical research and educational results, in order to provide improvements to the quality of education and health care.

  13. Endoscopic ultrasound guided fine needle aspiration and useful ancillary methods

    PubMed Central

    Tadic, Mario; Stoos-Veic, Tajana; Kusec, Rajko

    2014-01-01

    The role of endoscopic ultrasound (EUS) in evaluating pancreatic pathology has been well documented from the beginning of its clinical use. High spatial resolution and the close proximity to the evaluated organs within the mediastinum and abdominal cavity allow detection of small focal lesions and precise tissue acquisition from suspected lesions within the reach of this method. Fine needle aspiration (FNA) is considered of additional value to EUS and is performed to obtain tissue diagnosis. Tissue acquisition from suspected lesions for cytological or histological analysis allows, not only the differentiation between malignant and non-malignant lesions, but, in most cases, also the accurate distinction between the various types of malignant lesions. It is well documented that the best results are achieved only if an adequate sample is obtained for further analysis, if the material is processed in an appropriate way, and if adequate ancillary methods are performed. This is a multi-step process and could be quite a challenge in some cases. In this article, we discuss the technical aspects of tissue acquisition by EUS-guided-FNA (EUS-FNA), as well as the role of an on-site cytopathologist, various means of specimen processing, and the selection of the appropriate ancillary method for providing an accurate tissue diagnosis and maximizing the yield of this method. The main goal of this review is to alert endosonographers, not only to the different possibilities of tissue acquisition, namely EUS-FNA, but also to bring to their attention the importance of proper sample processing in the evaluation of various lesions in the gastrointestinal tract and other accessible organs. All aspects of tissue acquisition (needles, suction, use of stylet, complications, etc.) have been well discussed lately. Adequate tissue samples enable comprehensive diagnoses, which answer the main clinical questions, thus enabling targeted therapy. PMID:25339816

  14. Development of image analysis software for quantification of viable cells in microchips.

    PubMed

    Georg, Maximilian; Fernández-Cabada, Tamara; Bourguignon, Natalia; Karp, Paola; Peñaherrera, Ana B; Helguera, Gustavo; Lerner, Betiana; Pérez, Maximiliano S; Mertelsmann, Roland

    2018-01-01

Over the past few years, image analysis has emerged as a powerful tool for analyzing various cell biology parameters in an unprecedented and highly specific manner. The amount of data that is generated requires automated methods for processing and analyzing all the resulting information. The software packages available so far are suitable for processing fluorescence and phase contrast images, but often do not give good results on transmission light microscopy images, owing to the intrinsic variability of the image acquisition technique itself (adjustment of brightness/contrast, for instance) and the variability between acquisitions introduced by operators and equipment. In this contribution we present an image processing software package, Python based image analysis for cell growth (PIACG), that efficiently calculates the total well area occupied by cells of fusiform and rounded morphology in response to different concentrations of fetal bovine serum in microfluidic chips, from transmission light microscopy images.

  15. The nature of verbal memory impairment in multiple sclerosis: a list-learning and meta-analytic study.

    PubMed

    Lafosse, Jose M; Mitchell, Sandra M; Corboy, John R; Filley, Christopher M

    2013-10-01

    The primary purpose of this study was to test the hypothesis that multiple sclerosis (MS) patients have impaired acquisition rather than a retrieval deficit. Verbal memory impairment in MS was examined in 53 relapsing-remitting MS patients and 31 healthy controls (HC), and in a meta-analysis of studies that examined memory functioning in MS with list-learning tasks. The MS group demonstrated significantly lower acquisition and delayed recall performance than the HC group, and the meta-analysis revealed that the largest effect sizes were obtained for acquisition measures relative to delayed recall and recognition. Our data argue against a retrieval deficit as the sole explanation for verbal memory impairment in MS, and make a consistent case for the position that deficient acquisition contributes to the memory dysfunction of MS patients. Deficient acquisition may result from demyelination in relevant white matter tracts that reduces encoding efficiency as a result of impaired speed of information processing.

  16. Hyperspectral data acquisition and analysis in imaging and real-time active MIR backscattering spectroscopy

    NASA Astrophysics Data System (ADS)

    Jarvis, Jan; Haertelt, Marko; Hugger, Stefan; Butschek, Lorenz; Fuchs, Frank; Ostendorf, Ralf; Wagner, Joachim; Beyerer, Juergen

    2017-04-01

    In this work we present data analysis algorithms for detection of hazardous substances in hyperspectral observations acquired using active mid-infrared (MIR) backscattering spectroscopy. We present a novel background extraction algorithm based on the adaptive target generation process proposed by Ren and Chang called the adaptive background generation process (ABGP) that generates a robust and physically meaningful set of background spectra for operation of the well-known adaptive matched subspace detection (AMSD) algorithm. It is shown that the resulting AMSD-ABGP detection algorithm competes well with other widely used detection algorithms. The method is demonstrated in measurement data obtained by two fundamentally different active MIR hyperspectral data acquisition devices. A hyperspectral image sensor applicable in static scenes takes a wavelength sequential approach to hyperspectral data acquisition, whereas a rapid wavelength-scanning single-element detector variant of the same principle uses spatial scanning to generate the hyperspectral observation. It is shown that the measurement timescale of the latter is sufficient for the application of the data analysis algorithms even in dynamic scenarios.
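
    For readers unfamiliar with AMSD: in its usual form it is a generalized likelihood ratio test comparing the energy of a pixel spectrum outside the background subspace with its energy outside the combined target-plus-background subspace. The sketch below implements that generic test statistic; it is not the AMSD-ABGP code of this work, and the subspace matrices and data are assumed inputs.

      import numpy as np

      def amsd_statistic(x, target_basis, background_basis):
          """Adaptive matched subspace detector statistic for one spectrum x.

          target_basis:     (n_bands, n_target) matrix spanning the target subspace
          background_basis: (n_bands, n_background) matrix spanning the background subspace
          """
          n = x.shape[0]

          def orth_projector(basis):
              # Projection onto the orthogonal complement of span(basis)
              q, _ = np.linalg.qr(basis)
              return np.eye(n) - q @ q.T

          p_b_perp = orth_projector(background_basis)
          p_tb_perp = orth_projector(np.hstack([target_basis, background_basis]))

          # Large values indicate that adding the target subspace explains the
          # spectrum much better than the background subspace alone.
          return (x @ p_b_perp @ x) / (x @ p_tb_perp @ x)

      # Hypothetical example with random subspaces
      rng = np.random.default_rng(1)
      bands = 64
      B = rng.normal(size=(bands, 5))       # background subspace (e.g., from a background extractor)
      T = rng.normal(size=(bands, 1))       # target signature subspace
      pixel = B @ rng.normal(size=5) + 0.5 * T[:, 0] + 0.01 * rng.normal(size=bands)
      print(amsd_statistic(pixel, T, B))    # large value: target component present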

  17. War-gaming application for future space systems acquisition: MATLAB implementation of war-gaming acquisition models and simulation results

    NASA Astrophysics Data System (ADS)

    Vienhage, Paul; Barcomb, Heather; Marshall, Karel; Black, William A.; Coons, Amanda; Tran, Hien T.; Nguyen, Tien M.; Guillen, Andy T.; Yoh, James; Kizer, Justin; Rogers, Blake A.

    2017-05-01

The paper describes the MATLAB (MathWorks) programs that were developed during the REU workshop to implement The Aerospace Corporation's Unified Game-based Acquisition Framework and Advanced Game-based Mathematical Framework (UGAF-AGMF) and its associated War-Gaming Engine (WGE) models. Each game can be played from the perspectives of the Department of Defense Acquisition Authority (DAA) or of an individual contractor (KTR). The programs also implement Aerospace's optimum "Program and Technical Baseline (PTB) and associated acquisition" strategy that combines low Total Ownership Cost (TOC) with innovative designs while still meeting warfighter needs. The paper also describes the Bayesian Acquisition War-Gaming approach using Monte Carlo simulations, a numerical analysis technique to account for uncertainty in decision making, which simulates the PTB development and acquisition processes, and details the procedure of the implementation and the interactions between the games.

  18. An Analysis of Japanese University Students' Oral Performance in English Using Processability Theory

    ERIC Educational Resources Information Center

    Sakai, Hideki

    2008-01-01

    This paper presents a brief summary of processability theory as proposed by [Pienemann, M., 1998a. "Language Processing and Second Language Development: Processability Theory." John Benjamins, Amsterdam; Pienemann, M., 1998b. "Developmental dynamics in L1 and L2 acquisition: processability theory and generative entrenchment." "Bilingualism:…

  19. High speed television camera system processes photographic film data for digital computer analysis

    NASA Technical Reports Server (NTRS)

    Habbal, N. A.

    1970-01-01

    Data acquisition system translates and processes graphical information recorded on high speed photographic film. It automatically scans the film and stores the information with a minimal use of the computer memory.

  20. Further Evidence on the Effect of Acquisition Policy and Process on Cost Growth

    DTIC Science & Technology

    2016-04-30

bust periods. A complete summary also would need to take into account parallel analyses for the boom periods and the comparisons of cost growth in... Thirteenth Annual Acquisition Research Symposium, Wednesday Sessions, Volume I. Further Evidence on the Effect of Acquisition Policy and Process on Cost Growth... Goeller, Defense Acquisition Analyst, Institute for Defense Analyses; Stanley Horowitz, Assistant Director, Cost Analysis and Research Division

  1. Perspectives on Academic Staff Involvement in the Acquisition and Implementation of Educational Technologies

    ERIC Educational Resources Information Center

    Habib, Laurence; Johannesen, Monica

    2014-01-01

    This article presents the results of a study using both quantitative and qualitative data to uncover the extent and nature of the involvement of academic staff in the processes of acquisition and implementation of educational technologies. Actor-network theory (ANT) is used to inform the design of the study and the analysis of the data. Three main…

  2. Dynamic Information and Library Processing.

    ERIC Educational Resources Information Center

    Salton, Gerard

    This book provides an introduction to automated information services: collection, analysis, classification, storage, retrieval, transmission, and dissemination. An introductory chapter is followed by an overview of mechanized processes for acquisitions, cataloging, and circulation. Automatic indexing and abstracting methods are covered, followed…

  3. Early acquisition of gender agreement in the Spanish noun phrase: starting small.

    PubMed

    Mariscal, Sonia

    2009-01-01

    Nativist and constructivist accounts differ in their characterization of children's knowledge of grammatical categories. In this paper we present research on the process of acquisition of a particular grammatical system, gender agreement in the Spanish noun phrase, in children under three years of age. The design of the longitudinal study employed presents some variations in relation to classical studies. The aim was to obtain a large corpus of NP data which would allow different types of analysis of the children's productions to be carried out. Intra-individual variability in early NP types was analyzed and measured, and an elicitation task for adjectives was used. Results show that the acquisition of NP and gender agreement is a complex process which advances as the children gradually integrate different pieces of evidence: phonological, distributional and functional. The reduction of variability as the grammatical process advances is a key feature for its explanation.

  4. Research and design of portable photoelectric rotary table data-acquisition and analysis system

    NASA Astrophysics Data System (ADS)

    Yang, Dawei; Yang, Xiufang; Han, Junfeng; Yan, Xiaoxu

    2015-02-01

The photoelectric rotary table, as the main test and tracking measurement platform, is widely used in shooting ranges and in the aerospace field. To meet the demands of laboratory and field applications of photoelectric test instruments and equipment in range photoelectric tracking measurement systems, a portable data acquisition and analysis system for photoelectric rotary tables was researched and designed. The system hardware design, based on a Xilinx Virtex-4 series FPGA device and its peripheral modules, is introduced, along with the host-computer software developed on the VC++ 6.0 programming platform with the MFC class libraries. The data acquisition and analysis system integrates data acquisition, display and storage, commissioning control, analysis, laboratory waveform playback, transmission and fault diagnosis into an organic whole, and has the advantages of small volume, embeddability, high speed, portability and simple operation. With a photoelectric tracking turntable as the experimental object, the system hardware and software were tested; the experimental results show that the system can acquire, analyze and process data from photoelectric tracking equipment and control turntable debugging well, and that the measurement results are accurate and reliable, with good maintainability and extensibility. This research and design is of great significance for advancing the debugging, diagnosis, condition monitoring and fault analysis of photoelectric tracking measurement equipment, as well as the standardization and normalization of interfaces and the improvement of equipment maintainability, and it has certain innovative and practical value.

  5. Analysis of the possibility of SysML and BPMN application in formal data acquisition system description

    NASA Astrophysics Data System (ADS)

    Ćwikła, G.; Gwiazda, A.; Banaś, W.; Monica, Z.; Foit, K.

    2017-08-01

The article presents a study of the possible application of selected methods of complex description that can support the Manufacturing Information Acquisition System (MIAS) methodology, which describes how to design a data acquisition system for collecting and processing real-time data on the functioning of a production system, as needed for company management. MIAS can allow conversion into a Cyber-Physical Production System. MIAS gathers and pre-processes data on the state of the production system, including e.g. the realisation of production orders and the state of machines, materials and human resources. A systematised approach and model-based development are proposed to improve the quality of the design of complex, MIAS-methodology-based systems supporting data acquisition in various types of companies. Graphical specification can be the baseline for any model-based development in specified areas. The possibility of applying SysML and BPMN, both being UML-based languages representing different approaches to modelling the requirements, architecture and implementation of the data acquisition system, as tools supporting the description of the required features of MIAS, was considered.

  6. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  7. Automatic Data Processing Equipment (ADPE) acquisition plan for the medical sciences

    NASA Technical Reports Server (NTRS)

    1979-01-01

    An effective mechanism for meeting the SLSD/MSD data handling/processing requirements for Shuttle is discussed. The ability to meet these requirements depends upon the availability of a general purpose high speed digital computer system. This system is expected to implement those data base management and processing functions required across all SLSD/MSD programs during training, laboratory operations/analysis, simulations, mission operations, and post mission analysis/reporting.

  8. Stakeholder Collaboration in Air Force Acquisition: Adaptive Design Using System Representations

    DTIC Science & Technology

    2003-06-01

[List-of-figures residue: Figure 6.10, Notional efficacy of SR and analysis for different emphasis areas; Figure 7.1, Adaptive functions during...] ...closer collaboration spanning requirements activities in the user community and acquisition activities. Drafts in 2003 of new versions of DoD... come to grips with the necessary changes in their activities and processes to effectively implement these objectives. As this research effort

  9. Data processing has major impact on the outcome of quantitative label-free LC-MS analysis.

    PubMed

    Chawade, Aakash; Sandin, Marianne; Teleman, Johan; Malmström, Johan; Levander, Fredrik

    2015-02-06

    High-throughput multiplexed protein quantification using mass spectrometry is steadily increasing in popularity, with the two major techniques being data-dependent acquisition (DDA) and targeted acquisition using selected reaction monitoring (SRM). However, both techniques involve extensive data processing, which can be performed by a multitude of different software solutions. Analysis of quantitative LC-MS/MS data is mainly performed in three major steps: processing of raw data, normalization, and statistical analysis. To evaluate the impact of data processing steps, we developed two new benchmark data sets, one each for DDA and SRM, with samples consisting of a long-range dilution series of synthetic peptides spiked in a total cell protein digest. The generated data were processed by eight different software workflows and three postprocessing steps. The results show that the choice of the raw data processing software and the postprocessing steps play an important role in the final outcome. Also, the linear dynamic range of the DDA data could be extended by an order of magnitude through feature alignment and a charge state merging algorithm proposed here. Furthermore, the benchmark data sets are made publicly available for further benchmarking and software developments.
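
    Of the three processing steps named above (raw data processing, normalization, statistical analysis), normalization is the easiest to illustrate compactly. The sketch below applies a simple median-based normalization to a log-intensity matrix; it is a generic illustration, not one of the eight benchmarked workflows.

      import numpy as np

      def median_normalize(log_intensities):
          """Align samples by shifting each sample's median log2 intensity to a common value.

          log_intensities: (n_features, n_samples) array of log2 peptide intensities,
          with np.nan for missing values.
          """
          sample_medians = np.nanmedian(log_intensities, axis=0)
          return log_intensities - sample_medians + np.nanmean(sample_medians)

      # Hypothetical data: 1000 peptides in 6 runs with different loading amounts
      rng = np.random.default_rng(7)
      true = rng.normal(20, 2, size=(1000, 1))
      offsets = np.array([0.0, 0.3, -0.2, 0.5, -0.4, 0.1])        # per-run bias
      data = true + offsets + rng.normal(0, 0.1, size=(1000, 6))
      normalized = median_normalize(data)
      print(np.nanmedian(data, axis=0).round(2))                  # differing medians
      print(np.nanmedian(normalized, axis=0).round(2))            # nearly equal after normalization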

  10. Macro Level Simulation Model Of Space Shuttle Processing

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  11. Planning applications in image analysis

    NASA Technical Reports Server (NTRS)

    Boddy, Mark; White, Jim; Goldman, Robert; Short, Nick, Jr.

    1994-01-01

    We describe two interim results from an ongoing effort to automate the acquisition, analysis, archiving, and distribution of satellite earth science data. Both results are applications of Artificial Intelligence planning research to the automatic generation of processing steps for image analysis tasks. First, we have constructed a linear conditional planner (CPed), used to generate conditional processing plans. Second, we have extended an existing hierarchical planning system to make use of durations, resources, and deadlines, thus supporting the automatic generation of processing steps in time and resource-constrained environments.

  12. A Comparative Analysis of Industrial Escherichia coli K–12 and B Strains in High-Glucose Batch Cultivations on Process-, Transcriptome- and Proteome Level

    PubMed Central

    Marisch, Karoline; Bayer, Karl; Scharl, Theresa; Mairhofer, Juergen; Krempl, Peter M.; Hummel, Karin; Razzazi-Fazeli, Ebrahim; Striedner, Gerald

    2013-01-01

    Escherichia coli K–12 and B strains are among the most frequently used bacterial hosts for production of recombinant proteins on an industrial scale. To improve existing processes and to accelerate bioprocess development, we performed a detailed host analysis. We investigated the different behaviors of the E. coli production strains BL21, RV308, and HMS174 in response to high-glucose concentrations. Tightly controlled cultivations were conducted under defined environmental conditions for the in-depth analysis of physiological behavior. In addition to acquisition of standard process parameters, we also used DNA microarray analysis and differential gel electrophoresis (EttanTM DIGE). Batch cultivations showed different yields of the distinct strains for cell dry mass and growth rate, which were highest for BL21. In addition, production of acetate, triggered by excess glucose supply, was much higher for the K–12 strains compared to the B strain. Analysis of transcriptome data showed significant alteration in 347 of 3882 genes common among all three hosts. These differentially expressed genes included, for example, those involved in transport, iron acquisition, and motility. The investigation of proteome patterns additionally revealed a high number of differentially expressed proteins among the investigated hosts. The subsequently selected 38 spots included proteins involved in transport and motility. The results of this comprehensive analysis delivered a full genomic picture of the three investigated strains. Differentially expressed groups for targeted host modification were identified like glucose transport or iron acquisition, enabling potential optimization of strains to improve yield and process quality. Dissimilar growth profiles of the strains confirm different genotypes. Furthermore, distinct transcriptome patterns support differential regulation at the genome level. The identified proteins showed high agreement with the transcriptome data and suggest similar regulation within a host at both levels for the identified groups. Such host attributes need to be considered in future process design and operation. PMID:23950949

  13. A comparative analysis of industrial Escherichia coli K-12 and B strains in high-glucose batch cultivations on process-, transcriptome- and proteome level.

    PubMed

    Marisch, Karoline; Bayer, Karl; Scharl, Theresa; Mairhofer, Juergen; Krempl, Peter M; Hummel, Karin; Razzazi-Fazeli, Ebrahim; Striedner, Gerald

    2013-01-01

    Escherichia coli K-12 and B strains are among the most frequently used bacterial hosts for production of recombinant proteins on an industrial scale. To improve existing processes and to accelerate bioprocess development, we performed a detailed host analysis. We investigated the different behaviors of the E. coli production strains BL21, RV308, and HMS174 in response to high-glucose concentrations. Tightly controlled cultivations were conducted under defined environmental conditions for the in-depth analysis of physiological behavior. In addition to acquisition of standard process parameters, we also used DNA microarray analysis and differential gel electrophoresis (Ettan(TM) DIGE). Batch cultivations showed different yields of the distinct strains for cell dry mass and growth rate, which were highest for BL21. In addition, production of acetate, triggered by excess glucose supply, was much higher for the K-12 strains compared to the B strain. Analysis of transcriptome data showed significant alteration in 347 of 3882 genes common among all three hosts. These differentially expressed genes included, for example, those involved in transport, iron acquisition, and motility. The investigation of proteome patterns additionally revealed a high number of differentially expressed proteins among the investigated hosts. The subsequently selected 38 spots included proteins involved in transport and motility. The results of this comprehensive analysis delivered a full genomic picture of the three investigated strains. Differentially expressed groups for targeted host modification were identified like glucose transport or iron acquisition, enabling potential optimization of strains to improve yield and process quality. Dissimilar growth profiles of the strains confirm different genotypes. Furthermore, distinct transcriptome patterns support differential regulation at the genome level. The identified proteins showed high agreement with the transcriptome data and suggest similar regulation within a host at both levels for the identified groups. Such host attributes need to be considered in future process design and operation.

  14. A multiple process solution to the logical problem of language acquisition*

    PubMed Central

    MACWHINNEY, BRIAN

    2006-01-01

    Many researchers believe that there is a logical problem at the center of language acquisition theory. According to this analysis, the input to the learner is too inconsistent and incomplete to determine the acquisition of grammar. Moreover, when corrective feedback is provided, children tend to ignore it. As a result, language learning must rely on additional constraints from universal grammar. To solve this logical problem, theorists have proposed a series of constraints and parameterizations on the form of universal grammar. Plausible alternatives to these constraints include: conservatism, item-based learning, indirect negative evidence, competition, cue construction, and monitoring. Careful analysis of child language corpora has cast doubt on claims regarding the absence of positive exemplars. Using demonstrably available positive data, simple learning procedures can be formulated for each of the syntactic structures that have traditionally motivated invocation of the logical problem. Within the perspective of emergentist theory (MacWhinney, 2001), the operation of a set of mutually supportive processes is viewed as providing multiple buffering for developmental outcomes. However, the fact that some syntactic structures are more difficult to learn than others can be used to highlight areas of intense grammatical competition and processing load. PMID:15658750

  15. Recommendations for improved and coherent acquisition and processing of backscatter data from seafloor-mapping sonars

    NASA Astrophysics Data System (ADS)

    Lamarche, Geoffroy; Lurton, Xavier

    2018-06-01

    Multibeam echosounders are becoming widespread for the purposes of seafloor bathymetry mapping, but the acquisition and the use of seafloor backscatter measurements, acquired simultaneously with the bathymetric data, are still insufficiently understood, controlled and standardized. This presents an obstacle to well-accepted, standardized analysis and application by end users. The Marine Geological and Biological Habitat Mapping group (Geohab.org) has long recognized the need for better coherence and common agreement on acquisition, processing and interpretation of seafloor backscatter data, and established the Backscatter Working Group (BSWG) in May 2013. This paper presents an overview of this initiative, the mandate, structure and program of the working group, and a synopsis of the BSWG Guidelines and Recommendations to date. The paper includes (1) an overview of the current status in sensors and techniques available in seafloor backscatter data from multibeam sonars; (2) the presentation of the BSWG structure and results; (3) recommendations to operators, end-users, sonar manufacturers, and software developers using sonar backscatter for seafloor-mapping applications, for best practice methods and approaches for data acquisition and processing; and (4) a discussion on the development needs for future systems and data processing. We propose for the first time a nomenclature of backscatter processing levels that affords a means to accurately and efficiently describe the data processing status, and to facilitate comparisons of final products from various origins.

  16. The X-33 range Operations Control Center

    NASA Technical Reports Server (NTRS)

    Shy, Karla S.; Norman, Cynthia L.

    1998-01-01

    This paper describes the capabilities and features of the X-33 Range Operations Center at NASA Dryden Flight Research Center. All the unprocessed data will be collected and transmitted over fiber optic lines to the Lockheed Operations Control Center for real-time flight monitoring of the X-33 vehicle. By using the existing capabilities of the Western Aeronautical Test Range, the Range Operations Center will provide the ability to monitor all down-range tracking sites for the Extended Test Range systems. In addition to radar tracking and aircraft telemetry data, the Telemetry and Radar Acquisition and Processing System is being enhanced to acquire vehicle command data, differential Global Positioning System corrections and telemetry receiver signal level status. The Telemetry and Radar Acquisition Processing System provides the flexibility to satisfy all X-33 data processing requirements quickly and efficiently. Additionally, the Telemetry and Radar Acquisition Processing System will run a real-time link margin analysis program. The results of this model will be compared in real-time with actual flight data. The hardware and software concepts presented in this paper describe a method of merging all types of data into a common database for real-time display in the Range Operations Center in support of the X-33 program. All types of data will be processed for real-time analysis and display of the range system status to ensure public safety.

  17. Evaluation of the quality of the teaching-learning process in undergraduate courses in Nursing 1

    PubMed Central

    González-Chordá, Víctor Manuel; Maciá-Soler, María Loreto

    2015-01-01

Abstract Objective: to identify aspects of improvement of the quality of the teaching-learning process through the analysis of tools that evaluated the acquisition of skills by undergraduate students of Nursing. Method: prospective longitudinal study conducted in a population of 60 second-year Nursing students based on registration data, from which quality indicators that evaluate the acquisition of skills were obtained, with descriptive and inferential analysis. Results: nine items were identified and nine learning activities included in the assessment tools that did not reach the established quality indicators (p<0.05). There are statistically significant differences depending on the hospital and clinical practices unit (p<0.05). Conclusion: the analysis of the evaluation tools used in the article "Nursing Care in Welfare Processes" of the analyzed university undergraduate course enabled the detection of the areas for improvement in the teaching-learning process. The challenge of education in nursing is to reach the best clinical research and educational results, in order to provide improvements to the quality of education and health care. PMID:26444173

  18. U.S. EPA'S RESEARCH ON LIFE-CYCLE ANALYSIS

    EPA Science Inventory

Life-cycle analysis (LCA) consists of looking at a product, process or activity from its inception through its completion. For consumer products, this includes the stages of raw material acquisition, manufacturing and fabrication, distribution, consumer use/reuse and final disposa...

  19. Optimization of oncological {sup 18}F-FDG PET/CT imaging based on a multiparameter analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menezes, Vinicius O., E-mail: vinicius@radtec.com.br; Machado, Marcos A. D.; Queiroz, Cleiton C.

    2016-02-15

Purpose: This paper describes a method to achieve consistent clinical image quality in {sup 18}F-FDG scans accounting for patient habitus, dose regimen, image acquisition, and processing techniques. Methods: Oncological PET/CT scan data for 58 subjects were evaluated retrospectively to derive analytical curves that predict image quality. Patient noise equivalent count rate and coefficient of variation (CV) were used as metrics in their analysis. Optimized acquisition protocols were identified and prospectively applied to 179 subjects. Results: The adoption of different schemes for three body mass ranges (<60 kg, 60–90 kg, >90 kg) allows improved image quality with both point spread function and ordered-subsets expectation maximization-3D reconstruction methods. The application of this methodology showed that CV improved significantly (p < 0.0001) in clinical practice. Conclusions: Consistent oncological PET/CT image quality on a high-performance scanner was achieved from an analysis of the relations existing between dose regimen, patient habitus, acquisition, and processing techniques. The proposed methodology may be used by PET/CT centers to develop protocols to standardize PET/CT imaging procedures and achieve better patient management and cost-effective operations.
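
    One of the two metrics used above, the coefficient of variation, is simply the ratio of the standard deviation to the mean of voxel values in a reference region of interest. The sketch below shows that calculation in generic form, with a synthetic ROI standing in for real PET data.

      import numpy as np

      def coefficient_of_variation(roi_values):
          """CV (%) of voxel values inside a region of interest: 100 * std / mean."""
          roi_values = np.asarray(roi_values, dtype=float)
          return 100.0 * roi_values.std(ddof=1) / roi_values.mean()

      # Synthetic ROI: uniform uptake with statistical noise (purely illustrative)
      rng = np.random.default_rng(42)
      roi = rng.normal(loc=5.0, scale=0.6, size=500)   # e.g., SUV-like values
      print(round(coefficient_of_variation(roi), 1))   # ~12% for this noise level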

  20. Effects of automation of information-processing functions on teamwork.

    PubMed

    Wright, Melanie C; Kaber, David B

    2005-01-01

    We investigated the effects of automation as applied to different stages of information processing on team performance in a complex decision-making task. Forty teams of 2 individuals performed a simulated Theater Defense Task. Four automation conditions were simulated with computer assistance applied to realistic combinations of information acquisition, information analysis, and decision selection functions across two levels of task difficulty. Multiple measures of team effectiveness and team coordination were used. Results indicated different forms of automation have different effects on teamwork. Compared with a baseline condition, an increase in automation of information acquisition led to an increase in the ratio of information transferred to information requested; an increase in automation of information analysis resulted in higher team coordination ratings; and automation of decision selection led to better team effectiveness under low levels of task difficulty but at the cost of higher workload. The results support the use of early and intermediate forms of automation related to acquisition and analysis of information in the design of team tasks. Decision-making automation may provide benefits in more limited contexts. Applications of this research include the design and evaluation of automation in team environments.

  1. Electrophoresis gel image processing and analysis using the KODAK 1D software.

    PubMed

    Pizzonia, J

    2001-06-01

    The present article reports on the performance of the KODAK 1D Image Analysis Software for the acquisition of information from electrophoresis experiments and highlights the utility of several mathematical functions for subsequent image processing, analysis, and presentation. Digital images of Coomassie-stained polyacrylamide protein gels containing molecular weight standards and ethidium bromide stained agarose gels containing DNA mass standards are acquired using the KODAK Electrophoresis Documentation and Analysis System 290 (EDAS 290). The KODAK 1D software is used to optimize lane and band identification using features such as isomolecular weight lines. Mathematical functions for mass standard representation are presented, and two methods for estimation of unknown band mass are compared. Given the progressive transition of electrophoresis data acquisition and daily reporting in peer-reviewed journals to digital formats ranging from 8-bit systems such as EDAS 290 to more expensive 16-bit systems, the utility of algorithms such as Gaussian modeling, which can correct geometric aberrations such as clipping due to signal saturation common at lower bit depth levels, is discussed. Finally, image-processing tools that can facilitate image preparation for presentation are demonstrated.
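
    The Gaussian-modeling correction mentioned above (recovering a band profile whose peak is clipped by signal saturation) can be illustrated by fitting a Gaussian to only the unsaturated pixels of a lane profile. The sketch below is a generic illustration of that idea, not the KODAK 1D algorithm.

      import numpy as np
      from scipy.optimize import curve_fit

      def gaussian(x, amplitude, center, sigma):
          return amplitude * np.exp(-0.5 * ((x - center) / sigma) ** 2)

      # Synthetic band profile clipped at the 8-bit ceiling (purely illustrative)
      x = np.arange(100, dtype=float)
      true_profile = gaussian(x, amplitude=400.0, center=50.0, sigma=8.0)
      observed = np.minimum(true_profile, 255.0)            # saturation clips the peak

      # Fit only the unsaturated pixels to recover the underlying Gaussian
      mask = observed < 255.0
      popt, _ = curve_fit(gaussian, x[mask], observed[mask], p0=[255.0, 50.0, 10.0])
      fitted_area = popt[0] * popt[2] * np.sqrt(2.0 * np.pi)  # area under recovered Gaussian
      clipped_area = observed.sum()                           # area of the clipped profile
      print(popt.round(2), round(fitted_area, 1), round(clipped_area, 1))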

  2. Enhanced Data-Acquisition System

    NASA Technical Reports Server (NTRS)

    Mustain, Roy W.

    1990-01-01

    Time-consuming, costly digitization of analog signals on magnetic tape eliminated. Proposed data-acquisition system provides nearly immediate access to data in incoming signals by digitizing and recording them both on magnetic tape and on optical disk. Tape and/or disk later played back to reconstruct signals in analog or digital form for analysis. Of interest in industrial and scientific applications in which necessary to digitize, store, and/or process large quantities of experimental data.

  3. An Analysis of U.S. Army Health Hazard Assessments During the Acquisition of Military Materiel

    DTIC Science & Technology

    2010-06-03

protective equipment (PPE) (Milz, Conrad, & Soule, 2003). Engineering controls can eliminate hazards through system design, substitution of hazardous... (Milz, Conrad, & Soule, 2003). Engineering control measures can serve to minimize hazards where they cannot be eliminated, with preference for... during the materiel acquisitions process, and (c) will evaluate a sample of the database for accuracy by comparing the data entries to original reports

  4. A tailored 200 parameter VME based data acquisition system for IBA at the Lund Ion Beam Analysis Facility - Hardware and software

    NASA Astrophysics Data System (ADS)

    Elfman, Mikael; Ros, Linus; Kristiansson, Per; Nilsson, E. J. Charlotta; Pallon, Jan

    2016-03-01

    With the recent advances towards modern Ion Beam Analysis (IBA), going from one- or few-parameter detector systems to multi-parameter systems, it has been necessary to expand and replace the more than twenty years old CAMAC based system. A new VME multi-parameter (presently up to 200 channels) data acquisition and control system has been developed and implemented at the Lund Ion Beam Analysis Facility (LIBAF). The system is based on the VX-511 Single Board Computer (SBC), acting as master with arbiter functionality and consists of standard VME modules like Analog to Digital Converters (ADC's), Charge to Digital Converters (QDC's), Time to Digital Converters (TDC's), scaler's, IO-cards, high voltage and waveform units. The modules have been specially selected to support all of the present detector systems in the laboratory, with the option of future expansion. Typically, the detector systems consist of silicon strip detectors, silicon drift detectors and scintillator detectors, for detection of charged particles, X-rays and γ-rays. The data flow of the raw data buffers out from the VME bus to the final storage place on a 16 terabyte network attached storage disc (NAS-disc) is described. The acquisition process, remotely controlled over one of the SBCs ethernet channels, is also discussed. The user interface is written in the Kmax software package, and is used to control the acquisition process as well as for advanced online and offline data analysis through a user-friendly graphical user interface (GUI). In this work the system implementation, layout and performance are presented. The user interface and possibilities for advanced offline analysis are also discussed and illustrated.

  5. On-line 3-dimensional confocal imaging in vivo.

    PubMed

    Li, J; Jester, J V; Cavanagh, H D; Black, T D; Petroll, W M

    2000-09-01

    In vivo confocal microscopy through focusing (CMTF) can provide a 3-D stack of high-resolution corneal images and allows objective measurements of corneal sublayer thickness and backscattering. However, current systems require time-consuming off-line image processing and analysis on multiple software platforms. Furthermore, there is a trade off between the CMTF speed and measurement precision. The purpose of this study was to develop a novel on-line system for in vivo corneal imaging and analysis that overcomes these limitations. A tandem scanning confocal microscope (TSCM) was used for corneal imaging. The TSCM video camera was interfaced directly to a PC image acquisition board to implement real-time digitization. Software was developed to allow in vivo 2-D imaging, CMTF image acquisition, interactive 3-D reconstruction, and analysis of CMTF data to be performed on line in a single user-friendly environment. A procedure was also incorporated to separate the odd/even video fields, thereby doubling the CMTF sampling rate and theoretically improving the precision of CMTF thickness measurements by a factor of two. In vivo corneal examinations of a normal human and a photorefractive keratectomy patient are presented to demonstrate the capabilities of the new system. Improvements in the convenience, speed, and functionality of in vivo CMTF image acquisition, display, and analysis are demonstrated. This is the first full-featured software package designed for in vivo TSCM imaging of the cornea, which performs both 2-D and 3-D image acquisition, display, and processing as well as CMTF analysis. The use of a PC platform and incorporation of easy to use, on line, and interactive features should help to improve the clinical utility of this technology.
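
    The odd/even field separation described above effectively turns each interlaced video frame into two depth samples, doubling the CMTF sampling rate. A minimal sketch of that deinterlacing step on a hypothetical stack of frames is shown below; it is not the authors' implementation.

      import numpy as np

      def split_fields(frames):
          """Split interlaced frames into odd and even fields to double the sampling rate.

          frames: (n_frames, height, width) array of interlaced video frames.
          Returns a (2 * n_frames, height // 2, width) array ordered field-by-field in time.
          """
          odd = frames[:, 0::2, :]     # odd scan lines (field 1)
          even = frames[:, 1::2, :]    # even scan lines (field 2)
          n, h, w = odd.shape
          fields = np.empty((2 * n, h, w), dtype=frames.dtype)
          fields[0::2] = odd           # interleave: field 1, field 2, field 1, ...
          fields[1::2] = even
          return fields

      # Hypothetical CMTF stack: 30 interlaced frames digitized at 480x640
      frames = np.zeros((30, 480, 640), dtype=np.uint8)
      fields = split_fields(frames)
      print(frames.shape, "->", fields.shape)   # (30, 480, 640) -> (60, 240, 640)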

  6. The Dynamics of Community Health Care Consolidation: Acquisition of Physician Practices

    PubMed Central

    Christianson, Jon B; Carlin, Caroline S; Warrick, Louise H

    2014-01-01

    Context: Health care delivery systems are becoming increasingly consolidated in urban areas of the United States. While this consolidation could increase efficiency and improve quality, it also could raise the cost of health care for payers. This article traces the consolidation trajectory in a single community, focusing on factors influencing recent acquisitions of physician practices by integrated delivery systems. Methods: We used key informant interviews, supplemented by document analysis. Findings: The acquisition of physician practices is a process that will be difficult to reverse in the current health care environment. Provider revenue uncertainty is a key factor driving consolidation, with public and private attempts to control health care costs contributing to that uncertainty. As these efforts will likely continue, and possibly intensify, community health care systems now are less consolidated than they will be in the future. Acquisitions of multispecialty and primary care practices by integrated delivery systems follow a common process, with relatively predictable issues relating to purchase agreements, employment contracts, and compensation. Acquisitions of single-specialty practices are less common, with motivations for acquisitions likely to vary by specialty type, group size, and market structure. Total cost of care contracting could be an important catalyst for practice acquisitions in the future. Conclusions: In the past, market and regulatory forces aimed at controlling costs have both encouraged and rewarded the consolidation of providers, with important new developments likely to create momentum for further consolidation, including acquisitions of physician practices. PMID:25199899

  7. The dynamics of community health care consolidation: acquisition of physician practices.

    PubMed

    Christianson, Jon B; Carlin, Caroline S; Warrick, Louise H

    2014-09-01

    Health care delivery systems are becoming increasingly consolidated in urban areas of the United States. While this consolidation could increase efficiency and improve quality, it also could raise the cost of health care for payers. This article traces the consolidation trajectory in a single community, focusing on factors influencing recent acquisitions of physician practices by integrated delivery systems. We used key informant interviews, supplemented by document analysis. The acquisition of physician practices is a process that will be difficult to reverse in the current health care environment. Provider revenue uncertainty is a key factor driving consolidation, with public and private attempts to control health care costs contributing to that uncertainty. As these efforts will likely continue, and possibly intensify, community health care systems now are less consolidated than they will be in the future. Acquisitions of multispecialty and primary care practices by integrated delivery systems follow a common process, with relatively predictable issues relating to purchase agreements, employment contracts, and compensation. Acquisitions of single-specialty practices are less common, with motivations for acquisitions likely to vary by specialty type, group size, and market structure. Total cost of care contracting could be an important catalyst for practice acquisitions in the future. In the past, market and regulatory forces aimed at controlling costs have both encouraged and rewarded the consolidation of providers, with important new developments likely to create momentum for further consolidation, including acquisitions of physician practices. © 2014 Milbank Memorial Fund.

  8. A marker-free system for the analysis of movement disabilities.

    PubMed

    Legrand, L; Marzani, F; Dusserre, L

    1998-01-01

    A major step toward improving the treatment of disabled persons may be achieved by using motion analysis equipment. We are developing such a system. It allows the analysis of planar human motion (e.g. gait) without tracking markers. The system is composed of a single fixed camera that acquires an image sequence of a person in motion. The processing is then divided into two steps: first, a large number of pixels belonging to the boundaries of the human body are extracted at each acquisition time; second, a two-dimensional model of the human body, based on tapered superquadrics, is successively matched to the sets of pixels previously extracted, using a dedicated fuzzy clustering process. Moreover, an optical flow procedure predicts the model location at each acquisition time from its location at the previous time. Finally, we present some results of this process applied to a leg in motion.
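
    The tapered-superquadric model and the fuzzy clustering step are not reproduced here. The sketch below, assuming OpenCV (cv2) and NumPy are available, only illustrates the two generic ingredients named in the abstract: extracting boundary-pixel candidates and predicting their next-frame positions with dense optical flow; the edge detector and flow parameters are illustrative choices, not the authors'.

        import cv2
        import numpy as np

        def boundary_pixels(gray):
            """Return (row, col) coordinates of body-boundary candidates via edge detection."""
            edges = cv2.Canny(gray, 50, 150)
            return np.column_stack(np.nonzero(edges))

        def predict_next_positions(prev_gray, next_gray, points):
            """Shift boundary points by the dense optical flow between two frames."""
            flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            rows, cols = points[:, 0], points[:, 1]
            dx = flow[rows, cols, 0]   # horizontal displacement at each point
            dy = flow[rows, cols, 1]   # vertical displacement at each point
            return np.column_stack((rows + dy, cols + dx))

        # toy frames (in practice: consecutive grayscale video frames of the subject)
        prev_frame = np.random.randint(0, 256, (240, 320), dtype=np.uint8)
        next_frame = np.roll(prev_frame, 2, axis=1)          # fake small horizontal motion
        pts = boundary_pixels(prev_frame)
        predicted = predict_next_positions(prev_frame, next_frame, pts)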

  9. Open source cardiology electronic health record development for DIGICARDIAC implementation

    NASA Astrophysics Data System (ADS)

    Dugarte, Nelson; Medina, Rubén; Huiracocha, Lourdes; Rojas, Rubén

    2015-12-01

    This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under Health Level-7 (HL7) international standards. The novelty of the system is the integration of high-resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools and telecardiology tools. The acquisition tools manage and control the DIGICARDIAC electrocardiograph functions. The processing tools support HRECG signal analysis, searching for patterns indicative of cardiovascular pathologies. The incorporation of telecardiology tools allows the system to communicate with other health care centers, decreasing the time needed to access patient information. The CEHR system was developed entirely with open source software. Preliminary results of process validation showed the system's efficiency.

  10. Clinical evaluation of reducing acquisition time on single-photon emission computed tomography image quality using proprietary resolution recovery software.

    PubMed

    Aldridge, Matthew D; Waddington, Wendy W; Dickson, John C; Prakash, Vineet; Ell, Peter J; Bomanji, Jamshed B

    2013-11-01

    A three-dimensional model-based resolution recovery (RR) reconstruction algorithm that compensates for collimator-detector response, resulting in an improvement in reconstructed spatial resolution and signal-to-noise ratio of single-photon emission computed tomography (SPECT) images, was tested. The software is said to retain image quality even with reduced acquisition time. Clinically, any improvement in patient throughput without loss of quality is to be welcomed. Furthermore, future restrictions in radiotracer supplies may add value to this type of data analysis. The aims of this study were to assess improvement in image quality using the software and to evaluate the potential of performing reduced-time acquisitions for bone and parathyroid SPECT applications. Data acquisition was performed using the local standard SPECT/CT protocols for 99mTc-hydroxymethylene diphosphonate bone and 99mTc-methoxyisobutylisonitrile parathyroid SPECT imaging. The principal modification applied was the acquisition of an eight-frame gated data set acquired using an ECG simulator with a fixed signal as the trigger. This had the effect of partitioning the data such that the effect of reduced acquisition times could be assessed without conferring additional scanning time on the patient. The set of summed data sets was then independently reconstructed using the RR software to permit a blinded assessment of the effect of acquired counts upon reconstructed image quality, as adjudged by three experienced observers. Data sets reconstructed with the RR software were compared with the local standard processing protocols: filtered back-projection and ordered-subset expectation-maximization. Thirty SPECT studies were assessed (20 bone and 10 parathyroid). The images reconstructed with the RR algorithm showed improved image quality for both full-time and half-time acquisitions over the current local processing protocols (P<0.05). The RR algorithm improved image quality compared with local processing protocols and has been introduced into routine clinical use. SPECT acquisitions are now acquired at half of the time previously required. The method of binning the data can be applied to any other camera system to evaluate the reduction in acquisition time for similar processes. The potential for dose reduction is also inherent with this approach.
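
    As a minimal sketch of the partitioning trick described above (assuming eight equal gate bins with additive counts; this is not the vendor's software), summing all eight gated frames reproduces the full-time acquisition while summing every other bin emulates a half-time scan.

        import numpy as np

        # eight gated SPECT projection sets: (gate, angle, rows, cols), toy Poisson counts
        gates = np.random.poisson(lam=5.0, size=(8, 60, 128, 128))

        full_time = gates.sum(axis=0)        # all 8 bins -> equivalent of the full acquisition
        half_time = gates[::2].sum(axis=0)   # every other bin -> half the acquired counts

        print(full_time.sum(), half_time.sum())   # roughly a 2:1 ratio of total counts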

  11. Engineering the Business of Defense Acquisition: An Analysis of Program Office Processes

    DTIC Science & Technology

    2015-04-30

    ability to stay focused on the decision outcome rather than procrastinate and wait for a time-dependent resolution. Another critical aspect of...instituted an effective issue resolution process. Conflict, left unmanaged, tended to result in further procrastination and less effective outcomes

  12. A Research on Second Language Acquisition and College English Teaching

    ERIC Educational Resources Information Center

    Li, Changyu

    2009-01-01

    It was in the 1970s that American linguist S.D. Krashen created the theory of "language acquisition". The theories on second language acquisition were proposed based on the study on the second language acquisition process and its rules. Here, the second language acquisition process refers to the process in which a learner with the…

  13. Interactive effects of age-of-acquisition and repetition priming in the lexical decision task. A multiple-loci account.

    PubMed

    Spataro, Pietro; Longobardi, Emiddia; Saraulli, Daniele; Rossi-Arnaud, Clelia

    2013-01-01

    The analysis of the interaction between repetition priming and age of acquisition may be used to shed further light on the question of which stages of elaboration are affected by this psycholinguistic variable. In the present study we applied this method in the context of two versions of a lexical decision task that differed in the type of non-words employed at test. When the non-words were illegal and unpronounceable, repetition priming was primarily based on the analysis of orthographic information, while phonological processes were additionally recruited only when using legal pronounceable non-words. The results showed a significant interaction between repetition priming and age of acquisition in both conditions, with priming being greater for late- than for early-acquired words. These findings support a multiple-loci account, indicating that age of acquisition influences implicit memory by facilitating the retrieval of both the orthographic and the phonological representations of studied words.

  14. A knowledge acquisition process to analyse operational problems in solid waste management facilities.

    PubMed

    Dokas, Ioannis M; Panagiotakopoulos, Demetrios C

    2006-08-01

    The available expertise on managing and operating solid waste management (SWM) facilities varies among countries and among types of facilities. Few experts are willing to record their experience, while few researchers systematically investigate the chains of events that could trigger operational failures in a facility; expertise acquisition and dissemination in SWM is neither popular nor easy, despite the great need for it. This paper presents a knowledge acquisition process aimed at capturing, codifying and expanding reliable expertise and propagating it to non-experts. The knowledge engineer (KE), the person performing the acquisition, must identify the events (or causes) that could trigger a failure, determine whether a specific event could trigger more than one failure, and establish how the various events are related among themselves and how they are linked to specific operational problems. The proposed process, which utilizes logic diagrams (fault trees) widely used in system safety and reliability analyses, was used for the analysis of 24 common landfill operational problems. The acquired knowledge led to the development of a web-based expert system (Landfill Operation Management Advisor, http://loma.civil.duth.gr), which estimates the likelihood of operational problems, provides advice and suggests solutions.
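
    The LOMA knowledge base itself is not reproduced here. The sketch below only illustrates how a small fault tree of the kind described above can be encoded and evaluated in Python; the basic events and gates refer to a hypothetical landfill problem and are not taken from the paper.

        # Minimal fault-tree evaluation: AND/OR gates over basic events (hypothetical example).
        def gate_and(*inputs):
            return all(inputs)

        def gate_or(*inputs):
            return any(inputs)

        def leachate_overflow(events):
            """Top event 'leachate pond overflow' as an OR of two causal branches."""
            pump_branch = gate_and(events["pump_failure"], events["no_backup_pump"])
            rain_branch = gate_and(events["heavy_rainfall"], events["undersized_pond"])
            return gate_or(pump_branch, rain_branch)

        observed = {
            "pump_failure": True,
            "no_backup_pump": False,
            "heavy_rainfall": True,
            "undersized_pond": True,
        }
        print(leachate_overflow(observed))   # True: the rainfall branch triggers the top event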

  15. Synchronized and noise-robust audio recordings during realtime magnetic resonance imaging scans.

    PubMed

    Bresch, Erik; Nielsen, Jon; Nayak, Krishna; Narayanan, Shrikanth

    2006-10-01

    This letter describes a data acquisition setup for recording and processing running speech from a person in a magnetic resonance imaging (MRI) scanner. The main focus is on ensuring synchronicity between image and audio acquisition, and on obtaining a good signal-to-noise ratio to facilitate further speech analysis and modeling. A field-programmable gate array based hardware design for synchronizing the scanner image acquisition to other external data such as audio is described. The audio setup itself features two fiber-optical microphones and a noise-canceling filter. Two noise cancellation methods are described, including a novel approach using a pulse-sequence-specific model of the gradient noise of the MRI scanner. The setup is useful for scientific speech production studies. Sample results of speech and singing data acquired and processed using the proposed method are given.

  16. Synchronized and noise-robust audio recordings during realtime magnetic resonance imaging scans (L)

    PubMed Central

    Bresch, Erik; Nielsen, Jon; Nayak, Krishna; Narayanan, Shrikanth

    2007-01-01

    This letter describes a data acquisition setup for recording and processing running speech from a person in a magnetic resonance imaging (MRI) scanner. The main focus is on ensuring synchronicity between image and audio acquisition, and on obtaining a good signal-to-noise ratio to facilitate further speech analysis and modeling. A field-programmable gate array based hardware design for synchronizing the scanner image acquisition to other external data such as audio is described. The audio setup itself features two fiber-optical microphones and a noise-canceling filter. Two noise cancellation methods are described, including a novel approach using a pulse-sequence-specific model of the gradient noise of the MRI scanner. The setup is useful for scientific speech production studies. Sample results of speech and singing data acquired and processed using the proposed method are given. PMID:17069275
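
    The authors' pulse-sequence-specific gradient-noise model is not reproduced in this record. The sketch below shows a generic least-mean-squares (LMS) adaptive noise canceller of the kind commonly used for this problem, assuming a separate noise-reference channel is available; all signals and filter settings are synthetic.

        import numpy as np

        def lms_cancel(primary, noise_ref, n_taps=8, mu=0.01):
            """Subtract the component of `primary` correlated with `noise_ref` (LMS filter)."""
            w = np.zeros(n_taps)
            cleaned = np.zeros_like(primary)
            for n in range(n_taps - 1, len(primary)):
                x = noise_ref[n - n_taps + 1:n + 1][::-1]   # current + past reference samples
                y = w @ x                                   # estimate of the noise in this sample
                e = primary[n] - y                          # error = cleaned speech estimate
                w += 2 * mu * e * x                         # LMS weight update
                cleaned[n] = e
            return cleaned

        # toy data: a low-frequency "speech" tone plus scanner noise filtered through a causal FIR
        rng = np.random.default_rng(0)
        noise_ref = rng.standard_normal(16000)                        # reference channel
        coupled = np.convolve(noise_ref, [0.5, 0.3, 0.1])[:16000]     # noise as heard at the speech mic
        speech = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 16000))
        primary = speech + coupled

        print(np.std(primary - speech), np.std(lms_cancel(primary, noise_ref) - speech))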

  17. The Effectiveness of Processing Instruction and Production-Based Instruction on L2 Grammar Acquisition: A Meta-Analysis

    ERIC Educational Resources Information Center

    Shintani, Natsuko

    2015-01-01

    This article reports a meta-analysis of 42 experiments in 33 published studies involving processing instruction (PI) and production-based instruction (PB) used in the PI studies. The comparative effectiveness of PI and PB showed that although PI was more effective than PB for developing receptive knowledge, PB was just as effective as PI for…

  18. Computer system for scanning tunneling microscope automation

    NASA Astrophysics Data System (ADS)

    Aguilar, M.; García, A.; Pascual, P. J.; Presa, J.; Santisteban, A.

    1987-03-01

    A computerized system for the automation of a scanning tunneling microscope is presented. It is based on an IBM personal computer (PC), either an XT or an AT, which performs the control, data acquisition and storage operations, displays the STM "images" in real time, and provides image processing tools for the restoration and analysis of data. It supports different data acquisition and control cards and image display cards. The software has been designed in a modular way to allow the replacement of these cards and other equipment improvements, as well as the inclusion of user routines for data analysis.

  19. Topics in Chemical Instrumentation.

    ERIC Educational Resources Information Center

    Settle, Frank A. Jr., Ed.

    1989-01-01

    Using Fourier transformation methods in nuclear resonance has made possible increased sensitivity in chemical analysis. This article describes data acquisition, data processing, and the frequency spectrum as they relate to this technique. (CW)
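
    As a hedged illustration of the Fourier-transform step the article discusses (time-domain acquisition followed by transformation to a frequency spectrum), the NumPy sketch below transforms a synthetic, decaying time-domain signal into its spectrum; the sampling rate and resonance frequency are arbitrary example values.

        import numpy as np

        fs = 1000.0                                              # sampling rate (Hz), arbitrary
        t = np.arange(0, 1.0, 1 / fs)                            # 1 s of time-domain data
        fid = np.exp(-t / 0.2) * np.cos(2 * np.pi * 120 * t)     # decaying 120 Hz "resonance"

        spectrum = np.fft.rfft(fid)                              # frequency-domain representation
        freqs = np.fft.rfftfreq(len(fid), d=1 / fs)

        peak = freqs[np.argmax(np.abs(spectrum))]
        print(f"strongest component near {peak:.1f} Hz")         # ~120 Hz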

  20. Toward a Practical Model of Cognitive/Information Task Analysis and Schema Acquisition for Complex Problem-Solving Situations.

    ERIC Educational Resources Information Center

    Braune, Rolf; Foshay, Wellesley R.

    1983-01-01

    The proposed three-step strategy for research on human information processing--concept hierarchy analysis, analysis of example sets to teach relations among concepts, and analysis of problem sets to build a progressively larger schema for the problem space--may lead to practical procedures for instructional design and task analysis. Sixty-four…

  1. Due Diligence Processes for Public Acquisition of Mining-Impacted Landscapes

    NASA Astrophysics Data System (ADS)

    Martin, E.; Monohan, C.; Keeble-Toll, A. K.

    2016-12-01

    The acquisition of public land is critical for achieving conservation and habitat goals in rural regions projected to experience continuously high rates of population growth. To ensure that public funds are utilized responsibly in the purchase of conservation easements, appropriate due diligence processes must be established that limit landowner liability post-acquisition. Traditional methods of characterizing contamination in regions where legacy mining activities were prevalent may not utilize current scientific knowledge and understanding of contaminant fate, transport and bioavailability, and are therefore likely to produce type II errors. Agency-prescribed assessment methods utilized under CERCLA in many cases fail to detect contamination that presents liability issues, by not requiring the water quality sampling that would reveal the offsite transport potential of contaminants posing human health risks, including mercury. Historical analysis can be used to inform judgmental sampling to identify hotspots and contaminants of concern. Land acquisition projects at two historic mine sites in Nevada County, California, the Champion Mine Complex and the Black Swan Preserve, have established the necessity of re-thinking due diligence processes for mining-impacted landscapes. These pilot projects demonstrate that pre-acquisition assessment in the Gold Country must include judgmental sampling and evaluation of contaminant transport. Best practices based on current scientific knowledge must be codified by agencies, consultants, and NGOs in order to ensure responsible use of public funds and to safeguard public health.

  2. Performance Confirmation Data Acquisition System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.W. Markman

    2000-10-27

    The purpose of this analysis is to identify and analyze concepts for the acquisition of data in support of the Performance Confirmation (PC) program at the potential subsurface nuclear waste repository at Yucca Mountain. The scope and primary objectives of this analysis are to: (1) Review the criteria for design as presented in the Performance Confirmation Data Acquisition/Monitoring System Description Document, by way of the Input Transmittal, Performance Confirmation Input Criteria (CRWMS M&O 1999c). (2) Identify and describe existing and potential new trends in data acquisition system software and hardware that would support the PC plan. The data acquisition software and hardware will support the field instruments and equipment that will be installed for the observation and perimeter drift borehole monitoring, and in-situ monitoring within the emplacement drifts. The exhaust air monitoring requirements will be supported by a data communication network interface with the ventilation monitoring system database. (3) Identify the concepts and features that a data acquisition system should have in order to support the PC process and its activities. (4) Based on PC monitoring needs and available technologies, further develop concepts of a potential data acquisition system network in support of the PC program and the Site Recommendation and License Application.

  3. A robust close-range photogrammetric target extraction algorithm for size and type variant targets

    NASA Astrophysics Data System (ADS)

    Nyarko, Kofi; Thomas, Clayton; Torres, Gilbert

    2016-05-01

    The Photo-G program conducted by Naval Air Systems Command at the Atlantic Test Range in Patuxent River, Maryland, uses photogrammetric analysis of large amounts of real-world imagery to characterize the motion of objects in a 3-D scene. Current approaches involve several independent processes including target acquisition, target identification, 2-D tracking of image features, and 3-D kinematic state estimation. Each process has its own inherent complications and corresponding degrees of both human intervention and computational complexity. One approach being explored for automated target acquisition relies on exploiting the pixel intensity distributions of photogrammetric targets, which tend to be patterns with bimodal intensity distributions. The bimodal distribution partitioning algorithm utilizes this distribution to automatically deconstruct a video frame into regions of interest (ROI) that are merged and expanded to target boundaries, from which ROI centroids are extracted to mark target acquisition points. This process has proved to be scale, position and orientation invariant, as well as fairly insensitive to global uniform intensity disparities.
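
    The Photo-G partitioning algorithm itself is not given in this record. The sketch below merely illustrates the general idea of exploiting a bimodal intensity distribution, using an Otsu-style threshold and connected-component centroids (via SciPy) as stand-ins for the ROI merging, expansion and centroid-extraction steps.

        import numpy as np
        from scipy import ndimage

        def otsu_threshold(img):
            """Pick the threshold that best separates a bimodal intensity histogram."""
            hist, edges = np.histogram(img, bins=256)
            centers = (edges[:-1] + edges[1:]) / 2
            w0 = np.cumsum(hist)
            w1 = w0[-1] - w0
            m0 = np.cumsum(hist * centers)
            m1 = m0[-1] - m0
            valid = (w0 > 0) & (w1 > 0)
            between = np.zeros_like(centers)
            between[valid] = w0[valid] * w1[valid] * (m0[valid] / w0[valid] - m1[valid] / w1[valid]) ** 2
            return centers[np.argmax(between)]           # maximize between-class variance

        def target_centroids(img):
            """Binarize at the bimodal split and return centroids of the bright regions."""
            mask = img > otsu_threshold(img)
            labels, n = ndimage.label(mask)
            return ndimage.center_of_mass(mask, labels, range(1, n + 1))

        # toy frame with two bright square "targets" on a dark background
        frame = np.zeros((200, 200)) + 10
        frame[40:60, 40:60] = 200
        frame[120:150, 100:130] = 200
        print(target_centroids(frame))   # approx. [(49.5, 49.5), (134.5, 114.5)]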

  4. Advances in Mid-Infrared Spectroscopy for Chemical Analysis

    NASA Astrophysics Data System (ADS)

    Haas, Julian; Mizaikoff, Boris

    2016-06-01

    Infrared spectroscopy in the 3-20 μm spectral window has evolved from a routine laboratory technique into a state-of-the-art spectroscopy and sensing tool by benefitting from recent progress in increasingly sophisticated spectra acquisition techniques and advanced materials for generating, guiding, and detecting mid-infrared (MIR) radiation. Today, MIR spectroscopy provides molecular information with trace to ultratrace sensitivity, fast data acquisition rates, and high spectral resolution catering to demanding applications in bioanalytics, for example, and to improved routine analysis. In addition to advances in miniaturized device technology without sacrificing analytical performance, selected innovative applications for MIR spectroscopy ranging from process analysis to biotechnology and medical diagnostics are highlighted in this review.

  5. Data Visualization and Animation Lab (DVAL) overview

    NASA Technical Reports Server (NTRS)

    Stacy, Kathy; Vonofenheim, Bill

    1994-01-01

    The general capabilities of the Langley Research Center Data Visualization and Animation Laboratory are described. These capabilities include digital image processing, 3-D interactive computer graphics, data visualization and analysis, video-rate acquisition and processing of video images, photo-realistic modeling and animation, video report generation, and color hardcopies. A specialized video image processing system is also discussed.

  6. Construct mine environment monitoring system based on wireless mesh network

    NASA Astrophysics Data System (ADS)

    Chen, Xin; Ge, Gengyu; Liu, Yinmei; Cheng, Aimin; Wu, Jun; Fu, Jun

    2018-04-01

    The system uses a wireless mesh network as the transmission medium, aiming to establish an effective and reliable underground environment monitoring system. It combines wireless network technology and embedded technology to monitor data collected inside the mine and send it to the processing center for analysis and environmental assessment. The system can be divided into two parts, the main control network module and the data acquisition terminal, which communicate with each other over an SPI bus. The data acquisition and control terminal provides a multi-channel acquisition and control interface comprising an analog signal acquisition module, a digital signal acquisition module, and a digital signal output module. The main control network module runs the Linux operating system, to which the SPI driver, the USB network card driver and the AODV routing protocol have been ported. As a result, data collection and reporting inside the mine are realized.

  7. 48 CFR 8.705-3 - Allocation process.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Allocation process. 8.705-3 Section 8.705-3 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Acquisition From Nonprofit Agencies Employing People Who...

  8. Vectorized data acquisition and fast triple-correlation integrals for Fluorescence Triple Correlation Spectroscopy

    NASA Astrophysics Data System (ADS)

    Ridgeway, William K.; Millar, David P.; Williamson, James R.

    2013-04-01

    Fluorescence Correlation Spectroscopy (FCS) is widely used to quantify reaction rates and concentrations of molecules in vitro and in vivo. We recently reported Fluorescence Triple Correlation Spectroscopy (F3CS), which correlates three signals together instead of two. F3CS can analyze the stoichiometries of complex mixtures and detect irreversible processes by identifying time-reversal asymmetries. Here we report the computational developments that were required for the realization of F3CS and present the results as the Triple Correlation Toolbox suite of programs. Triple Correlation Toolbox is a complete data analysis pipeline capable of acquiring, correlating and fitting large data sets. Each segment of the pipeline handles error estimates for accurate error-weighted global fitting. Data acquisition was accelerated with a combination of off-the-shelf counter-timer chips and vectorized operations on 128-bit registers. This allows desktop computers with inexpensive data acquisition cards to acquire hours of multiple-channel data with sub-microsecond time resolution. Off-line correlation integrals were implemented as a two delay time multiple-tau scheme that scales efficiently with multiple processors and provides an unprecedented view of linked dynamics. Global fitting routines are provided to fit FCS and F3CS data to models containing up to ten species. Triple Correlation Toolbox is a complete package that enables F3CS to be performed on existing microscopes. Catalogue identifier: AEOP_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOP_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 50189. No. of bytes in distributed program, including test data, etc.: 6135283. Distribution format: tar.gz. Programming language: C/Assembly. Computer: Any with GCC and library support. Operating system: Linux and OS X (data acq. for Linux only due to library availability), not tested on Windows. RAM: ≥512 MB. Classification: 16.4. External routines: NIDAQmx (National Instruments), Gnu Scientific Library, GTK+, PLplot (optional). Nature of problem: Fluorescence Triple Correlation Spectroscopy required three things: data acquisition at faster speeds than were possible without expensive custom hardware, triple-correlation routines that could process 1/2 TB data sets rapidly, and fitting routines capable of handling several to a hundred fit parameters and 14,000+ data points, each with error estimates. Solution method: A novel data acquisition concept mixed signal processing with off-the-shelf hardware and data-parallel processing using 128-bit registers found in desktop CPUs. Correlation algorithms used fractal data structures and multithreading to reduce data analysis times. Global fitting was implemented with robust minimization routines and provides feedback that allows the user to critically inspect initial guesses and fits. Restrictions: Data acquisition only requires a National Instruments data acquisition card (it was tested on Linux using card PCIe-6251) and a simple home-built circuit. Unusual features: Hand-coded x86-64 assembly for data acquisition loops (platform-independent C code also provided). Additional comments: A complete collection of tools to perform Fluorescence Triple Correlation Spectroscopy, from data acquisition, to two-tau correlation of large data sets, to model fitting. Running time: 1-5 h of data analysis per hour of data collected. Varies depending on data-acquisition length, time resolution, data density and number of cores used for correlation integrals.
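
    Triple Correlation Toolbox is written in C and assembly and is not reproduced here. As a rough, unoptimized illustration of the quantity it computes, the NumPy sketch below estimates a normalized triple correlation G3(tau1, tau2) = <I(t) I(t+tau1) I(t+tau2)> / <I>^3 for a single synthetic intensity trace.

        import numpy as np

        def g3(intensity, tau1, tau2):
            """Naive estimator of the normalized triple correlation at lags (tau1, tau2)."""
            n = len(intensity) - max(tau1, tau2)
            i0 = intensity[:n]
            i1 = intensity[tau1:tau1 + n]
            i2 = intensity[tau2:tau2 + n]
            return np.mean(i0 * i1 * i2) / np.mean(intensity) ** 3

        # toy photon-count trace: Poisson noise around a slowly fluctuating rate
        rng = np.random.default_rng(1)
        rate = 5.0 + np.cumsum(rng.standard_normal(100_000)) * 0.001
        trace = rng.poisson(np.clip(rate, 0.1, None))

        print(g3(trace, tau1=1, tau2=2), g3(trace, tau1=10, tau2=200))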

  9. Dependency of image quality on acquisition protocol and image processing in chest tomosynthesis-a visual grading study based on clinical data.

    PubMed

    Jadidi, Masoud; Båth, Magnus; Nyrén, Sven

    2018-04-09

    To compare the quality of images obtained with two protocols with different acquisition times, and the influence of image post-processing, in a chest digital tomosynthesis (DTS) system. 20 patients with suspected lung cancer were imaged with chest X-ray equipment with a tomosynthesis option. Two examination protocols with different acquisition times (6.3 and 12 s) were performed on each patient. Each protocol was presented with two different image post-processing options (standard DTS processing and a more advanced processing optimised for chest radiography). Thus, 4 series from each patient, altogether 80 series, were presented anonymously and in a random order. Five observers rated the quality of the reconstructed section images according to predefined quality criteria in three different classes. Visual grading characteristics (VGC) analysis was used to analyse the data, and the area under the VGC curve (AUC_VGC) was used as the figure-of-merit. The 12 s protocol and the standard DTS processing were used as references in the analyses. The protocol with 6.3 s acquisition time had a statistically significant advantage over the vendor-recommended protocol with 12 s acquisition time for the classes of criteria Demarcation (AUC_VGC = 0.56, p = 0.009) and Disturbance (AUC_VGC = 0.58, p < 0.001). A similar AUC_VGC value was also found for the class Structure (definition of bone structures in the spine) (0.56), but it could not be statistically separated from 0.5 (p = 0.21). For the image processing, the VGC analysis showed a small but statistically significant advantage for the standard DTS processing over the more advanced processing for the classes of criteria Demarcation (AUC_VGC = 0.45, p = 0.017) and Disturbance (AUC_VGC = 0.43, p = 0.005). A similar AUC_VGC value was also found for the class Structure (0.46), but it could not be statistically separated from 0.5 (p = 0.31). The study indicates that the protocol with 6.3 s acquisition time yields slightly better image quality than the vendor-recommended protocol with 12 s acquisition time for several anatomical structures. Furthermore, the standard gradation processing (the vendor-recommended post-processing for DTS) yields to some extent an advantage over the gradation/multiobjective frequency/flexible noise control processing in terms of image quality for all classes of criteria. Advances in knowledge: The study shows that the image quality may be strongly affected by the selection of DTS protocol and that the vendor-recommended protocol may not always be the optimal choice.

  10. Standards for data acquisition and software-based analysis of in vivo electroencephalography recordings from animals. A TASK1-WG5 report of the AES/ILAE Translational Task Force of the ILAE.

    PubMed

    Moyer, Jason T; Gnatkovsky, Vadym; Ono, Tomonori; Otáhal, Jakub; Wagenaar, Joost; Stacey, William C; Noebels, Jeffrey; Ikeda, Akio; Staley, Kevin; de Curtis, Marco; Litt, Brian; Galanopoulou, Aristea S

    2017-11-01

    Electroencephalography (EEG)-the direct recording of the electrical activity of populations of neurons-is a tremendously important tool for diagnosing, treating, and researching epilepsy. Although standard procedures for recording and analyzing human EEG exist and are broadly accepted, there are no such standards for research in animal models of seizures and epilepsy-recording montages, acquisition systems, and processing algorithms may differ substantially among investigators and laboratories. The lack of standard procedures for acquiring and analyzing EEG from animal models of epilepsy hinders the interpretation of experimental results and reduces the ability of the scientific community to efficiently translate new experimental findings into clinical practice. Accordingly, the intention of this report is twofold: (1) to review current techniques for the collection and software-based analysis of neural field recordings in animal models of epilepsy, and (2) to offer pertinent standards and reporting guidelines for this research. Specifically, we review current techniques for signal acquisition, signal conditioning, signal processing, data storage, and data sharing, and include applicable recommendations to standardize collection and reporting. We close with a discussion of challenges and future opportunities, and include a supplemental report of currently available acquisition systems and analysis tools. This work represents a collaboration on behalf of the American Epilepsy Society/International League Against Epilepsy (AES/ILAE) Translational Task Force (TASK1-Workgroup 5), and is part of a larger effort to harmonize video-EEG interpretation and analysis methods across studies using in vivo and in vitro seizure and epilepsy models. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.

  11. Exploring Mechanisms Underlying Impaired Brain Function in Gulf War Illness through Advanced Network Analysis

    DTIC Science & Technology

    2017-10-01

    networks of the brain responsible for visual processing, mood regulation, motor coordination, sensory processing, and language command, but increased connectivity in... For each subject, the rsFMRI voxel time-series were temporally shifted to account for differences in slice acquisition times...

  12. Note: Quasi-real-time analysis of dynamic near field scattering data using a graphics processing unit

    NASA Astrophysics Data System (ADS)

    Cerchiari, G.; Croccolo, F.; Cardinaux, F.; Scheffold, F.

    2012-10-01

    We present an implementation of the analysis of dynamic near field scattering (NFS) data using a graphics processing unit. We introduce an optimized data management scheme thereby limiting the number of operations required. Overall, we reduce the processing time from hours to minutes, for typical experimental conditions. Previously the limiting step in such experiments, the processing time is now comparable to the data acquisition time. Our approach is applicable to various dynamic NFS methods, including shadowgraph, Schlieren and differential dynamic microscopy.
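
    The authors' GPU implementation is not shown in the note. The sketch below computes the core quantity of this type of near field scattering analysis, the image structure function |FFT(I(t+tau) - I(t))|^2 averaged over time, in NumPy; a GPU array library such as CuPy could stand in for NumPy for the FFT step, which is what the note accelerates. The image stack is synthetic.

        import numpy as np

        def structure_function(frames, tau):
            """Average |FFT of frame differences|^2 at a fixed lag tau (NFS/DDM-style)."""
            diffs = frames[tau:] - frames[:-tau]                 # I(t+tau) - I(t) for all t
            power = np.abs(np.fft.fft2(diffs, axes=(1, 2))) ** 2  # 2-D spatial FFT per difference
            return power.mean(axis=0)                            # time average -> D(q, tau)

        # toy image stack: 200 frames of 64x64 "speckle" with slow drift
        rng = np.random.default_rng(2)
        frames = rng.standard_normal((200, 64, 64)).cumsum(axis=0) * 0.01
        d_q = structure_function(frames, tau=5)
        print(d_q.shape)   # (64, 64): one power value per spatial frequency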

  13. An Analysis of the Performance-Based Service Acquisition (PBSA) and Its Applicability to Hellenic Navy Service Acquisition Activities

    DTIC Science & Technology

    2012-12-01

    potential offeror in response to its request must not be disclosed if doing so would reveal the potential offeror’s confidential business strategy...also the Navy Academy and the Navy hospital have and operate kitchen facilities. Ships have their own kitchen facilities too and dedicated personnel...abandoned. Now the HN dedicates personnel, has kitchen facilities to provide the food services and doing the whole process. The HN could outsource

  14. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal Phase III

    DTIC Science & Technology

    2015-04-30

    It is a supervised learning method but best for Big Data with low dimensions. It is an approximate inference good for Big Data and Hadoop...Each process produces large amounts of information (Big Data). There is a critical need for automation, validation, and discovery to help acquisition...can inform managers where areas might have higher program risk and how resource and big data management might affect the desired return on investment

  15. Cleft Audit Protocol for Speech (CAPS-A): A Comprehensive Training Package for Speech Analysis

    ERIC Educational Resources Information Center

    Sell, D.; John, A.; Harding-Bell, A.; Sweeney, T.; Hegarty, F.; Freeman, J.

    2009-01-01

    Background: The previous literature has largely focused on speech analysis systems and ignored process issues, such as the nature of adequate speech samples, data acquisition, recording and playback. Although there has been recognition of the need for training on tools used in speech analysis associated with cleft palate, little attention has been…

  16. Organizational Analysis of the United States Army Evaluation Center

    DTIC Science & Technology

    2014-12-01

    analysis of qualitative or quantitative data obtained from design reviews, hardware inspections, M&S, hardware and software testing, metrics review... Research Development Test & Evaluation (RDT&E) appropriation account. The Defense Acquisition Portal ACQuipedia website describes RDT&E as "one of the... research, design, development, test and evaluation, production, installation, operation, and maintenance; data collection; processing and analysis

  17. 23 CFR 710.305 - Environmental analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 23 Highways 1 2012-04-01 2012-04-01 false Environmental analysis. 710.305 Section 710.305 Highways... (NEPA) process, as described in FHWA's NEPA regulations in 23 CFR part 771, normally must be conducted... acquisition, provided conditions prescribed in 23 U.S.C. 108(c) and 23 CFR 710.501, are satisfied. ...

  18. A Top Level Analysis of Training Management Functions.

    ERIC Educational Resources Information Center

    Ackerson, Jack

    1995-01-01

    Discusses how to conduct a top-level analysis of training management functions to identify problems within a training system resulting from rapid growth, the acquisition of new departments, or mergers. The data gathering process and analyses are explained, training management functions and activities are described, and root causes and solutions…

  19. Development and Flight Testing of an Adaptable Vehicle Health-Monitoring Architecture

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Coffey, Neil C.; Gonzalez, Guillermo A.; Woodman, Keith L.; Weathered, Brenton W.; Rollins, Courtney H.; Taylor, B. Douglas; Brett, Rube R.

    2003-01-01

    Development and testing of an adaptable wireless health-monitoring architecture for a vehicle fleet is presented. It has three operational levels: one or more remote data acquisition units located throughout the vehicle; a command and control unit located within the vehicle; and a terminal collection unit to collect analysis results from all vehicles. Each level is capable of performing autonomous analysis with a trained adaptable expert system. The remote data acquisition unit has an eight-channel programmable digital interface that allows the user discretion in choosing the type of sensors, the number of sensors, the sensor sampling rate, and the sampling duration for each sensor. The architecture provides a framework for tributary analysis. All measurements at the lowest operational level are reduced to provide analysis results necessary to gauge changes from established baselines. These are then collected at the next level to identify any global trends or common features from the prior level. This process is repeated until the results are reduced at the highest operational level. In the framework, only analysis results are forwarded to the next level, to reduce telemetry congestion. The system's remote data acquisition hardware and non-analysis software have been flight tested on the NASA Langley B757's main landing gear.

  20. Practical considerations for obtaining high quality quantitative computed tomography data of the skeletal system.

    PubMed

    Troy, Karen L; Edwards, W Brent

    2018-05-01

    Quantitative CT (QCT) analysis involves the calculation of specific parameters such as bone volume and density from CT image data, and can be a powerful tool for understanding bone quality and quantity. However, without careful attention to detail during all steps of the acquisition and analysis process, data can be of poor to unusable quality. Good-quality QCT for research requires meticulous attention to detail and standardization of all aspects of data collection and analysis to a degree that is uncommon in a clinical setting. Here, we review the literature to summarize practical and technical considerations for obtaining high quality QCT data, and provide examples of how each recommendation affects calculated variables. We also provide an overview of the QCT analysis technique to illustrate additional opportunities to improve data reproducibility and reliability. Key recommendations include: standardizing the scanner and data acquisition settings, minimizing image artifacts, selecting an appropriate reconstruction algorithm, and maximizing repeatability and objectivity during QCT analysis. The goal of the recommendations is to reduce potential sources of error throughout the analysis, from scan acquisition to the interpretation of results. Copyright © 2018 Elsevier Inc. All rights reserved.
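
    As a minimal sketch of one standardization step touched on above, mapping scanner CT numbers to bone-equivalent density through a calibration phantom, the code below fits the usual linear relationship from phantom inserts of known density. The numerical values are illustrative only and are not taken from the cited review.

        import numpy as np

        # calibration phantom inserts: known equivalent densities (mg/cm^3) and their
        # mean measured CT numbers (HU) in this scan -- illustrative values only
        known_density = np.array([0.0, 50.0, 100.0, 200.0])
        measured_hu = np.array([2.0, 45.0, 90.0, 178.0])

        slope, intercept = np.polyfit(measured_hu, known_density, 1)   # density = a*HU + b

        def hu_to_bmd(hu):
            """Map a CT number (or array of them) to bone-equivalent density."""
            return slope * np.asarray(hu) + intercept

        print(hu_to_bmd([150.0, 300.0]))   # e.g. trabecular-like and cortical-like values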

  1. A Parallel Distributed-Memory Particle Method Enables Acquisition-Rate Segmentation of Large Fluorescence Microscopy Images

    PubMed Central

    Afshar, Yaser; Sbalzarini, Ivo F.

    2016-01-01

    Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rate. This creates a bottleneck in computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers collectively solve the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10^10 pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments. PMID:27046144

  2. A Parallel Distributed-Memory Particle Method Enables Acquisition-Rate Segmentation of Large Fluorescence Microscopy Images.

    PubMed

    Afshar, Yaser; Sbalzarini, Ivo F

    2016-01-01

    Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rate. This creates a bottleneck in computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers collectively solve the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10^10 pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments.
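
    The published algorithm builds on Discrete Region Competition with communication across sub-image boundaries; that machinery is not reproduced here. The mpi4py sketch below shows only the first ingredient, splitting a large image into strips and distributing them across processes, with a placeholder threshold standing in for the real segmentation.

        # Run with e.g.:  mpiexec -n 4 python this_script.py
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        if rank == 0:
            image = np.random.random((4096, 4096))          # stand-in for a large microscopy image
            blocks = np.array_split(image, size, axis=0)    # one horizontal strip per process
        else:
            blocks = None

        block = comm.scatter(blocks, root=0)                # each rank receives its sub-image

        # placeholder "segmentation": threshold the local strip (the real method is
        # Discrete Region Competition with communication across strip boundaries)
        local_mask = block > 0.5
        local_count = local_mask.sum()

        total = comm.reduce(local_count, op=MPI.SUM, root=0)
        if rank == 0:
            print(f"foreground pixels found across {size} ranks: {total}")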

  3. Data Acquisition and Analyses of Magnetotelluric Sounding in Lujiang-Zongyang Ore Concentrated Area

    NASA Astrophysics Data System (ADS)

    Tang, J.; Xiao, X.; Zhou, C.; Lu, Q.

    2010-12-01

    It is challenging to perform MT data acquisition and processing in the Lujiang-Zongyang ore-concentrated area, where severe and complicated noise is mixed with the useful data. Dense population, well-developed water systems, transport networks, communication and power grids, and several mines under exploitation are the main sources of the noise. However, conducting MT sounding in this area is not only helpful for studying the geological structure and tectonics of the zone, but also provides valuable experience in data analysis and processing for real field work with heavy interference. We carried out this work along 5 survey lines with 500 sounding stations in total. In order to verify the consistency of the 6 V5-2000 data acquisition systems employed in our study, a consistency experiment was conducted in a test area with weak interference. Curves of apparent resistivity and phase obtained from these 6 instruments are plotted in Fig. 1, which shows acceptable consistency except at a few high-noise frequencies. To determine the optimal interval for data acquisition in this noise-heavy survey area, a comparison experiment was implemented at a single sounding station to compare data quality over different intervals. We found that 20 hours or more were required for each acquisition. The evaluation was based on the degree of coherence and the signal-to-noise ratio. By analysing the MT data in both the time and frequency domains, the noise was categorized into several patterns according to the characteristics of the various noise sources, and corresponding filters were then adopted. After flying-spot removal, cubic-spline smoothing and spatial filtering of all the sounding curves, apparent resistivity profiles were obtained. Further studies, including 2D and 3D inversion analysis, are in progress. Fig. 1. Consistency experiment of the instruments: (a) and (b) are apparent resistivity curves for the yx and xy directions; (c) and (d) are phase curves for the yx and xy directions; J1-J6 label the 6 instruments, respectively.

  4. Pain related inflammation analysis using infrared images

    NASA Astrophysics Data System (ADS)

    Bhowmik, Mrinal Kanti; Bardhan, Shawli; Das, Kakali; Bhattacharjee, Debotosh; Nath, Satyabrata

    2016-05-01

    Medical Infrared Thermography (MIT) offers a potential non-invasive, non-contact and radiation-free imaging modality for the assessment of abnormal, painful inflammation in the human body. The assessment of inflammation mainly depends on the emission of heat from the skin surface. Arthritis is a disease of joint damage that generates inflammation in one or more anatomical joints of the body. Osteoarthritis (OA) is the most frequently occurring form of arthritis, and rheumatoid arthritis (RA) is the most threatening form. In this study, an inflammatory analysis has been performed on infrared images of patients suffering from RA and OA. For the analysis, a dataset of 30 bilateral knee thermograms has been captured from patients with RA and OA by following a thermogram acquisition standard. The thermograms are pre-processed, and areas of interest are extracted for further processing. The investigation of the spread of inflammation is performed along with the statistical analysis of the pre-processed thermograms. The objectives of the study include: i) generation of a novel thermogram acquisition standard for inflammatory pain disease; ii) analysis of the spread of the inflammation related to RA and OA using K-means clustering; and iii) first- and second-order statistical analysis of the pre-processed thermograms. The conclusion is that, in most cases, RA-related inflammation affects both knees, whereas inflammation related to OA is present in a single knee. Also, due to the spread of inflammation in OA, contralateral asymmetries are detected through the statistical analysis.
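
    The clinical thermograms and the acquisition standard are of course not available here. The sketch below only illustrates the K-means step named in objective (ii), clustering the temperature values of a synthetic knee thermogram into background, normal and hot regions; scikit-learn and a cluster count of three are assumptions of the example.

        import numpy as np
        from sklearn.cluster import KMeans

        # synthetic thermogram (degrees C): cool background, warm skin, a hot inflamed patch
        rng = np.random.default_rng(3)
        thermogram = 24.0 + rng.normal(0, 0.2, (120, 160))
        thermogram[30:90, 40:120] = 32.0 + rng.normal(0, 0.3, (60, 80))   # knee surface
        thermogram[55:70, 70:95] = 35.5 + rng.normal(0, 0.3, (15, 25))    # inflamed region

        pixels = thermogram.reshape(-1, 1)                  # cluster on temperature alone
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pixels)
        label_img = labels.reshape(thermogram.shape)

        hot_cluster = np.argmax([thermogram[label_img == k].mean() for k in range(3)])
        print("inflamed-area fraction:", (label_img == hot_cluster).mean())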

  5. Improved Holistic Analysis of Rayleigh Waves for Single- and Multi-Offset Data: Joint Inversion of Rayleigh-Wave Particle Motion and Vertical- and Radial-Component Velocity Spectra

    NASA Astrophysics Data System (ADS)

    Dal Moro, Giancarlo; Moustafa, Sayed S. R.; Al-Arifi, Nassir S.

    2018-01-01

    Rayleigh waves often propagate according to complex mode excitation so that the proper identification and separation of specific modes can be quite difficult or, in some cases, just impossible. Furthermore, the analysis of a single component (i.e., an inversion procedure based on just one objective function) necessarily prevents solving the problems related to the non-uniqueness of the solution. To overcome these issues and define a holistic analysis of Rayleigh waves, we implemented a procedure to acquire data that are useful to define and efficiently invert the three objective functions defined from the three following "objects": the velocity spectra of the vertical- and radial-components and the Rayleigh-wave particle motion (RPM) frequency-offset data. Two possible implementations are presented. In the first case we consider classical multi-offset (and multi-component) data, while in a second possible approach we exploit the data recorded by a single three-component geophone at a fixed offset from the source. Given the simple field procedures, the method could be particularly useful for the unambiguous geotechnical exploration of large areas, where more complex acquisition procedures, based on the joint acquisition of Rayleigh and Love waves, would not be economically viable. After illustrating the different kinds of data acquisition and the data processing, the results of the proposed methodology are illustrated in a case study. Finally, a series of theoretical and practical aspects are discussed to clarify some issues involved in the overall procedure (data acquisition and processing).

  6. Multibeam Sonar Backscatter Data Acquisition and Processing: Guidelines and Recommendations from the GEOHAB Backscatter Working Group

    NASA Astrophysics Data System (ADS)

    Heffron, E.; Lurton, X.; Lamarche, G.; Brown, C.; Lucieer, V.; Rice, G.; Schimel, A.; Weber, T.

    2015-12-01

    Backscatter data acquired with multibeam sonars are now commonly used for the remote geological interpretation of the seabed. The systems' hardware, software, and processing methods and tools have grown in number and improved over the years, yet many issues linger: there are no standard procedures for acquisition, calibration is poor or absent, understanding and documentation of processing methods are limited, etc. A workshop organized at the GeoHab (a community of geoscientists and biologists around the topic of marine habitat mapping) annual meeting in 2013 was dedicated to seafloor backscatter data from multibeam sonars and concluded that there was an overwhelming need for better coherence and agreement on the topics of acquisition, processing and interpretation of data. The GeoHab Backscatter Working Group (BSWG) was subsequently created with the purpose of documenting and synthesizing the state-of-the-art in sensors and techniques available today and proposing methods for best practice in the acquisition and processing of backscatter data. Two years later, the resulting document "Backscatter measurements by seafloor-mapping sonars: Guidelines and Recommendations" was completed1. The document provides: an introduction to backscatter measurements by seafloor-mapping sonars; a background on the physical principles of sonar backscatter; a discussion of users' needs from a wide spectrum of community end-users; a review of backscatter measurement; an analysis of best practices in data acquisition; a review of data processing principles with details on present software implementation; and finally a synthesis and key recommendations. This presentation reviews the BSWG mandate, structure, and the development of this document. It details the various chapter contents, its recommendations to sonar manufacturers, operators, data processing software developers and end-users, and its implications for the marine geology community. 1: Downloadable at https://www.niwa.co.nz/coasts-and-oceans/research-projects/backscatter-measurement-guidelines

  7. Simulation Based Acquisition for NASA's Office of Exploration Systems

    NASA Technical Reports Server (NTRS)

    Hale, Joe

    2004-01-01

    In January 2004, President George W. Bush unveiled his vision for NASA to advance U.S. scientific, security, and economic interests through a robust space exploration program. This vision includes the goal to extend human presence across the solar system, starting with a human return to the Moon no later than 2020, in preparation for human exploration of Mars and other destinations. In response to this vision, NASA has created the Office of Exploration Systems (OExS) to develop the innovative technologies, knowledge, and infrastructures to explore and support decisions about human exploration destinations, including the development of a new Crew Exploration Vehicle (CEV). Within the OExS organization, NASA is implementing Simulation Based Acquisition (SBA), a robust Modeling & Simulation (M&S) environment integrated across all acquisition phases and programs/teams, to make the realization of the President's vision more certain. Executed properly, SBA will foster better informed, timelier, and more defensible decisions throughout the acquisition life cycle. By doing so, SBA will improve the quality of NASA systems and speed their development, at less cost and risk than would otherwise be the case. SBA is a comprehensive, Enterprise-wide endeavor that necessitates an evolved culture, a revised spiral acquisition process, and an infrastructure of advanced Information Technology (IT) capabilities. SBA encompasses all project phases (from requirements analysis and concept formulation through design, manufacture, training, and operations), professional disciplines, and activities that can benefit from employing SBA capabilities. SBA capabilities include: developing and assessing system concepts and designs; planning manufacturing, assembly, transport, and launch; training crews, maintainers, launch personnel, and controllers; planning and monitoring missions; responding to emergencies by evaluating effects and exploring solutions; and communicating across the OExS enterprise, within the Government, and with the general public. The SBA process features empowered collaborative teams (including industry partners) to integrate requirements, acquisition, training, operations, and sustainment. The SBA process also utilizes an increased reliance on and investment in M&S to reduce design risk. SBA originated as a joint Industry and Department of Defense (DoD) initiative to define and integrate an acquisition process that employs robust, collaborative use of M&S technology across acquisition phases and programs. The SBA process was successfully implemented in the Air Force's Joint Strike Fighter (JSF) Program.

  8. 48 CFR 908.7116 - Electronic data processing tape.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Electronic data processing... Electronic data processing tape. (a) Acquisitions of electronic data processing tape by DOE offices shall be in accordance with FPMR 41 CFR 101-26.508. (b) Acquisitions of electronic data processing tape by...

  9. 48 CFR 908.7116 - Electronic data processing tape.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Electronic data processing... Electronic data processing tape. (a) Acquisitions of electronic data processing tape by DOE offices shall be in accordance with FPMR 41 CFR 101-26.508. (b) Acquisitions of electronic data processing tape by...

  10. 48 CFR 908.7116 - Electronic data processing tape.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Electronic data processing... Electronic data processing tape. (a) Acquisitions of electronic data processing tape by DOE offices shall be in accordance with FPMR 41 CFR 101-26.508. (b) Acquisitions of electronic data processing tape by...

  11. 48 CFR 908.7116 - Electronic data processing tape.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Electronic data processing... Electronic data processing tape. (a) Acquisitions of electronic data processing tape by DOE offices shall be in accordance with FPMR 41 CFR 101-26.508. (b) Acquisitions of electronic data processing tape by...

  12. 48 CFR 908.7116 - Electronic data processing tape.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Electronic data processing... Electronic data processing tape. (a) Acquisitions of electronic data processing tape by DOE offices shall be in accordance with FPMR 41 CFR 101-26.508. (b) Acquisitions of electronic data processing tape by...

  13. An Introduction to Data Analysis in Asteroseismology

    NASA Astrophysics Data System (ADS)

    Campante, Tiago L.

    A practical guide is presented to some of the main data analysis concepts and techniques employed contemporarily in the asteroseismic study of stars exhibiting solar-like oscillations. The subjects of digital signal processing and spectral analysis are introduced first. These concern the acquisition of continuous physical signals to be subsequently digitally analyzed. A number of specific concepts and techniques relevant to asteroseismology are then presented as we follow the typical workflow of the data analysis process, namely, the extraction of global asteroseismic parameters and individual mode parameters (also known as peak-bagging) from the oscillation spectrum.
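
    As a concrete illustration of the first stage of that workflow, the Python sketch below turns a synthetic, evenly sampled light curve into a power spectrum and reads off the frequency of maximum power. The cadence, noise level and oscillation frequency are invented for the example, and real gapped photometric series would usually call for a Lomb-Scargle periodogram rather than a plain FFT.

        import numpy as np

        # Synthetic, evenly sampled "light curve": one solar-like oscillation
        # plus white noise (assumed values, not data from the chapter).
        cadence = 60.0                          # seconds between samples
        n = 4 * 24 * 60                         # four days of one-minute cadence
        t = np.arange(n) * cadence
        nu_osc = 3.0e-3                         # Hz, oscillation frequency
        flux = 1e-4 * np.sin(2 * np.pi * nu_osc * t) + 5e-4 * np.random.randn(n)

        # Power spectrum via FFT (adequate for gap-free, evenly sampled data).
        flux -= flux.mean()
        freq = np.fft.rfftfreq(n, d=cadence)    # Hz
        power = np.abs(np.fft.rfft(flux)) ** 2 / n

        # Crude global parameter: frequency of maximum power, ignoring the
        # lowest bins where slow trends and noise dominate.
        mask = freq > 1e-4
        print("peak frequency ~ %.2e Hz" % freq[mask][np.argmax(power[mask])])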

  14. Engineering the Business of Defense Acquisition: An Analysis of Program Office Processes

    DTIC Science & Technology

    2015-05-01

    Information Technology and Business Process Redesign | MIT Sloan Management Review. Retrieved from http://sloanreview.mit.edu...links systems management to process execution Three Phases/ Multi-Year Effort (This Phase) Literature review Model development—Formal and...

  15. Development and Flight Testing of an Autonomous Landing Gear Health-Monitoring System

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Coffey, Neil C.; Gonzalez, Guillermo A.; Taylor, B. Douglas; Brett, Rube R.; Woodman, Keith L.; Weathered, Brenton W.; Rollins, Courtney H.

    2003-01-01

    Development and testing of an adaptable vehicle health-monitoring architecture is presented. The architecture is being developed for a fleet of vehicles and has three operational levels: one or more remote data acquisition units located throughout the vehicle; a command and control unit located within the vehicle; and a terminal collection unit to collect analysis results from all vehicles. Each level is capable of performing autonomous analysis with a trained expert system. Communication between all levels is done with wireless radio frequency interfaces. The remote data acquisition unit has an eight-channel programmable digital interface that gives the user discretion in choosing the type of sensors, the number of sensors, and the sampling rate and sampling duration for each sensor. The architecture provides a framework for tributary analysis: all measurements at the lowest operational level are reduced to analysis results that gauge changes from established baselines; these results are then collected at the next level to identify any global trends or common features from the prior level; and the process is repeated until the results are reduced at the highest operational level. Only analysis results are forwarded to the next level, which reduces telemetry congestion. The system's remote data acquisition hardware and non-analysis software have been flight tested on the NASA Langley B757's main landing gear. The flight tests were performed to validate the wireless radio frequency communication capabilities of the system, the hardware design, command and control, software operation, and data acquisition, storage and retrieval.
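
    The reduce-and-forward idea behind the tributary analysis can be sketched compactly. In the Python fragment below, every feature name, threshold and gain is a hypothetical placeholder; it only illustrates how one level can summarize its raw measurements locally and pass nothing but the summaries upward.

        import numpy as np

        def summarize(raw, baseline):
            """Reduce one raw sensor record to a few analysis results
            (hypothetical features: RMS level and its deviation from a
            stored baseline)."""
            rms = float(np.sqrt(np.mean(np.square(raw))))
            return {"rms": rms, "delta": rms - baseline}

        # Level 1: remote data acquisition units reduce their own measurements.
        unit_results = [summarize(np.random.randn(1024) * gain, baseline=1.0)
                        for gain in (1.0, 1.1, 3.0)]   # third channel drifts

        # Level 2: the vehicle-level unit sees only the summaries, never the
        # raw samples, and forwards an even smaller report upward.
        flagged = [r for r in unit_results if abs(r["delta"]) > 0.5]
        print({"n_channels": len(unit_results), "n_flagged": len(flagged)})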

  16. Improving GPR Surveys Productivity by Array Technology and Fully Automated Processing

    NASA Astrophysics Data System (ADS)

    Morello, Marco; Ercoli, Emanuele; Mazzucchelli, Paolo; Cottino, Edoardo

    2016-04-01

    The realization of network infrastructures with lower environmental impact and the tendency to use digging technologies that are less invasive in terms of time and space of road occupation and restoration play a key role in the development of communication networks. However, pre-existing buried utilities must be detected and located in the subsurface to exploit the high productivity of modern digging apparatus. According to SUE quality level B+, both position and depth of subsurface utilities must be accurately estimated, demanding 3D GPR surveys. In fact, the advantages of 3D GPR acquisitions (obtained either by multiple 2D recordings or by an antenna array) versus 2D acquisitions are well-known. Nonetheless, the amount of data acquired in such 3D surveys does not usually allow processing and interpretation to be completed directly in the field and in real time, thus limiting the overall efficiency of the GPR acquisition. As an example, the "low impact mini-trench" technique (addressed in the ITU - International Telecommunication Union - L.83 recommendation) requires that non-destructive mapping of buried services enhances its productivity to match the improvements of new digging equipment. Nowadays, multi-antenna and multi-pass GPR acquisitions demand new processing techniques that can obtain high quality subsurface images while taking full advantage of 3D data: the development of a fully automated, real-time 3D GPR processing system plays a key role in overall optical network deployment profitability. Furthermore, currently available computing power suggests the feasibility of processing schemes that incorporate better focusing algorithms. A novel processing scheme, whose goal is the automated processing and detection of buried targets and which can be applied in real time to 3D GPR array systems, has been developed and fruitfully tested with two different GPR arrays (16 antennas, 900 MHz central frequency, and 34 antennas, 600 MHz central frequency). The proposed processing scheme takes advantage of 3D data multiplicity by continuous real-time data focusing. Pre-stack reflection angle gathers G(x, θ; v) are computed at nv different velocities (by means of Kirchhoff depth-migration kernels, which can naturally cope with any acquisition pattern and handle irregular sampling issues). It must be noted that the analysis of pre-stack reflection angle gathers plays a key role in automated detection: targets are identified and the best local propagation velocities are recovered through a correlation estimate computed for all the nv reflection angle gathers. Indeed, the data redundancy of 3D GPR acquisitions greatly improves the reliability of the proposed automatic detection. The goal of real-time automated processing has been pursued without the need for specific high-performance processing hardware (a simple laptop is required). Moreover, the automatization of the entire surveying process makes it possible to obtain high quality, repeatable results without the need for skilled interpreters. The proposed acquisition procedure has been extensively tested: more than 100 km of acquired data prove the feasibility of the proposed approach.
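
    The velocity-analysis idea can be illustrated with a much-simplified focusing scan. The toy Python sketch below replaces the paper's Kirchhoff angle-gather correlation with a basic hyperbolic-moveout scan over trial velocities on a synthetic single-diffractor section; the geometry, velocities and noise level are all made up.

        import numpy as np

        # Toy common-offset GPR section: one point diffractor under a line of
        # traces (all geometry and velocity values are invented).
        c_true = 0.1                 # m/ns, subsurface velocity
        dt, dx = 0.1, 0.05           # ns sample interval, m trace spacing
        nt, nx = 600, 81
        x = (np.arange(nx) - nx // 2) * dx
        t0, x0 = 20.0, 0.0           # ns two-way time and position of diffractor
        section = np.zeros((nx, nt))
        t_hyp = np.sqrt(t0 ** 2 + (2 * (x - x0) / c_true) ** 2)
        for i, ti in enumerate(t_hyp):
            section[i, int(round(ti / dt))] = 1.0
        section += 0.05 * np.random.randn(nx, nt)

        def focus_score(vel):
            """Stack amplitudes along the hyperbola predicted by a trial
            velocity; the correct velocity flattens the event and maximises
            the stack."""
            t_trial = np.sqrt(t0 ** 2 + (2 * (x - x0) / vel) ** 2)
            idx = np.clip(np.round(t_trial / dt).astype(int), 0, nt - 1)
            return section[np.arange(nx), idx].sum()

        velocities = np.linspace(0.06, 0.16, 51)      # m/ns candidates
        best = velocities[np.argmax([focus_score(v) for v in velocities])]
        print("recovered velocity ~ %.3f m/ns (true %.3f)" % (best, c_true))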

  17. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5... selection process for procurements not to exceed the simplified acquisition threshold. References to FAR 36...

  18. Conversation and Silence: Transfer of Learning through the Arts

    ERIC Educational Resources Information Center

    Catterall, James S.

    2005-01-01

    This article explores transfer of learning in the arts to non-arts learning. The analysis is presented in the context of theories of knowledge acquisition more generally. Behavioral and neuro-function processes are discussed.

  19. A real-time spectrum acquisition system design based on quantum dots-quantum well detector

    NASA Astrophysics Data System (ADS)

    Zhang, S. H.; Guo, F. M.

    2016-01-01

    In this paper, we studied the structural characteristics of a quantum dots-quantum well photodetector with a response wavelength range from 400 nm to 1000 nm. It has the characteristics of high sensitivity, low dark current and high conductance gain. According to the properties of the quantum dots-quantum well photodetectors, we designed a new type of capacitive transimpedance amplifier (CTIA) readout circuit structure with the advantages of adjustable gain, wide bandwidth and high driving ability. We implemented the chip packaging between the CTIA-CDS readout circuit and the quantum dot detector and tested the readout response characteristics. According to the timing signal requirements of our readout circuit, we designed a real-time spectral data acquisition system based on an FPGA and an ARM processor. The parallel processing mode of these programmable devices gives the system high sensitivity and a high transmission rate. In addition, we implemented blind pixel compensation and smoothing filter processing of the real-time spectrum data in C++. Through fluorescence spectrum measurements of carbon quantum dots, with the signal acquisition system and the computer software performing the collection, processing and analysis of the spectrum signal, we verified the excellent characteristics of the detector. The system meets the design requirements of a quantum dot spectrum acquisition system, with short integration time, real-time operation and portability.
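
    The two data-conditioning steps mentioned at the end, blind pixel compensation and smoothing, are simple enough to sketch. The Python example below uses a synthetic spectrum and hypothetical blind-pixel indices (the system described above implements these steps in C++ on its own hardware) to show one plausible form of each.

        import numpy as np

        def compensate_blind_pixels(spectrum, blind_idx):
            """Replace known dead/blind detector pixels by linear interpolation
            from their valid neighbours."""
            spectrum = spectrum.astype(float).copy()
            good = np.setdiff1d(np.arange(spectrum.size), blind_idx)
            spectrum[blind_idx] = np.interp(blind_idx, good, spectrum[good])
            return spectrum

        def smooth(spectrum, width=5):
            """Simple moving-average smoothing filter."""
            kernel = np.ones(width) / width
            return np.convolve(spectrum, kernel, mode="same")

        # Synthetic fluorescence-like spectrum with a few blind pixels (assumed).
        pixels = np.arange(512)
        raw = np.exp(-0.5 * ((pixels - 300) / 25.0) ** 2) + 0.02 * np.random.randn(512)
        blind = np.array([100, 101, 305])
        raw[blind] = 0.0

        clean = smooth(compensate_blind_pixels(raw, blind), width=7)
        print("peak pixel:", int(np.argmax(clean)))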

  20. Development of a data acquisition system using a RISC/UNIX™ workstation

    NASA Astrophysics Data System (ADS)

    Takeuchi, Y.; Tanimori, T.; Yasu, Y.

    1993-05-01

    We have developed a compact data acquisition system on RISC/UNIX workstations. A SUN™ SPARCstation™ IPC was used, in which an extension bus "SBus™" was linked to a VMEbus. The transfer rate achieved was better than 7 Mbyte/s between the VMEbus and the SUN. A device driver for CAMAC was developed in order to realize an interruptive feature in UNIX. In addition, list processing has been incorporated in order to keep the high priority of the data handling process in UNIX. The successful development of both the device driver and the list processing has made it possible to realize a good real-time feature on the RISC/UNIX system. Based on this architecture, a portable and versatile data taking system has been developed, which consists of a graphical user interface, I/O handler, user analysis process, process manager and a CAMAC device driver.

  1. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    PubMed Central

    Zhang, Yeqing; Wang, Meiling; Li, Yafeng

    2018-01-01

    For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7–5.6% per millisecond, with most satellites acquired successfully. PMID:29495301

  2. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time.

    PubMed

    Zhang, Yeqing; Wang, Meiling; Li, Yafeng

    2018-02-24

    For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90-94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7-5.6% per millisecond, with most satellites acquired successfully.
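
    The detection metric described above, the ratio of the highest to the second-highest correlation peak, and the cost saving from resampling can be illustrated with a toy example. The Python sketch below stands in a random ±1 chip sequence for a real GNSS spreading code, uses one sample per chip and no carrier-frequency search, so it is only a schematic of the idea rather than the paper's algorithm.

        import numpy as np

        np.random.seed(0)

        # Random +/-1 chip sequence standing in for a GNSS spreading code,
        # one sample per chip (a simplification of a real front end).
        code = np.sign(np.random.randn(2046))
        true_shift = 416
        received = np.roll(code, true_shift) + 0.8 * np.random.randn(code.size)

        def acquire(sig, ref):
            """Circular correlation via FFT; return the best code phase and
            the ratio of the highest to the second-highest correlation peak
            (the detection metric described above)."""
            corr = np.abs(np.fft.ifft(np.fft.fft(sig) * np.conj(np.fft.fft(ref))))
            order = np.argsort(corr)
            return int(order[-1]), corr[order[-1]] / corr[order[-2]]

        # Resampling strategy in miniature: keeping every other sample roughly
        # halves the FFT cost; the recovered phase is then in units of two
        # original samples (208 here) and the peak ratio drops slightly.
        print("full rate :", acquire(received, code))
        print("half rate :", acquire(received[::2], code[::2]))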

  3. On-line Monitoring for Cutting Tool Wear Condition Based on the Parameters

    NASA Astrophysics Data System (ADS)

    Han, Fenghua; Xie, Feng

    2017-07-01

    In the machining process it is very important to monitor the working state of the cutting tool. On the basis of acceleration signals acquired at constant spindle speed, time-domain and frequency-domain analysis of the relevant indicators enables on-line monitoring of the tool wear condition. The analysis results show that the method can effectively judge the tool wear condition during machining and has practical application value.
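
    A minimal version of such time- and frequency-domain indicators is sketched below in Python. The sampling rate, the synthetic "sharp" and "worn" vibration records, and the choice of RMS plus dominant spectral frequency as indicators are assumptions made for illustration only.

        import numpy as np

        fs = 10_000                        # Hz, accelerometer sampling rate (assumed)
        t = np.arange(0, 1.0, 1 / fs)

        def indicators(accel):
            """Time-domain RMS and dominant spectral frequency of one record."""
            rms = np.sqrt(np.mean(accel ** 2))
            spec = np.abs(np.fft.rfft(accel - accel.mean()))
            freq = np.fft.rfftfreq(accel.size, d=1 / fs)
            return rms, freq[np.argmax(spec)]

        # Synthetic records: the "worn tool" record carries extra broadband
        # energy and a stronger high-frequency component (purely illustrative).
        sharp = 0.5 * np.sin(2 * np.pi * 500 * t) + 0.1 * np.random.randn(t.size)
        worn = (0.5 * np.sin(2 * np.pi * 500 * t)
                + 0.8 * np.sin(2 * np.pi * 2500 * t)
                + 0.3 * np.random.randn(t.size))

        for name, sig in (("sharp", sharp), ("worn", worn)):
            rms, fpeak = indicators(sig)
            print(f"{name:5s}: RMS={rms:.2f}, dominant frequency={fpeak:.0f} Hz")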

  4. Welding process modelling and control

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system supporting Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for the use of data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not the intention for this system to be used for welding process control.

  5. Multiplex lexical networks reveal patterns in early word acquisition in children

    NASA Astrophysics Data System (ADS)

    Stella, Massimo; Beckage, Nicole M.; Brede, Markus

    2017-04-01

    Network models of language have provided a way of linking cognitive processes to language structure. However, current approaches focus only on one linguistic relationship at a time, missing the complex multi-relational nature of language. In this work, we overcome this limitation by modelling the mental lexicon of English-speaking toddlers as a multiplex lexical network, i.e. a multi-layered network where N = 529 words/nodes are connected according to four relationships: (i) free association, (ii) feature sharing, (iii) co-occurrence, and (iv) phonological similarity. We investigate the topology of the resulting multiplex and then proceed to evaluate single layers and the full multiplex structure on their ability to predict empirically observed age of acquisition data of English-speaking toddlers. We find that the multiplex topology is an important proxy of the cognitive processes of acquisition, capable of capturing emergent lexicon structure. In fact, we show that the multiplex structure is fundamentally more powerful than individual layers in predicting the ordering with which words are acquired. Furthermore, multiplex analysis allows for a quantification of distinct phases of lexical acquisition in early learners: while initially all the multiplex layers contribute to word learning, after about month 23 free associations take the lead in driving word acquisition.
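
    The multiplex construction itself takes only a few lines. The Python example below builds a toy four-layer lexicon with networkx and ranks words by their degree summed over layers, a deliberately crude stand-in for the predictors evaluated in the paper; every word and edge in it is invented.

        import networkx as nx

        # Tiny, made-up toy lexicon: the words and edges below are illustrative
        # placeholders, not the N = 529-word data set used in the paper.
        words = ["dog", "cat", "ball", "milk", "cup"]
        layers = {
            "free_association": [("dog", "cat"), ("dog", "ball"), ("milk", "cup")],
            "feature_sharing":  [("dog", "cat"), ("cup", "ball")],
            "co_occurrence":    [("milk", "cup"), ("dog", "ball"), ("cat", "milk")],
            "phonological":     [("cat", "cup"), ("ball", "milk")],
        }

        # Represent the multiplex as one graph per layer over the same node set.
        multiplex = {}
        for name, edges in layers.items():
            g = nx.Graph()
            g.add_nodes_from(words)
            g.add_edges_from(edges)
            multiplex[name] = g

        # A simple multiplex statistic: each word's degree summed over layers.
        # Ranking words by such aggregate connectivity is one (crude) way to
        # build an acquisition-order predictor of the kind evaluated above.
        multidegree = {w: sum(g.degree(w) for g in multiplex.values()) for w in words}
        for word, k in sorted(multidegree.items(), key=lambda kv: -kv[1]):
            print(f"{word:5s} multiplex degree {k}")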

  6. Analysis of EA-18G Growler Engine Maintenance at Naval Air Station Whidbey Island, WA

    DTIC Science & Technology

    2013-05-30

    The aviation maintenance community considers the NAMP its bible. “The NAMP applies to all organizations operating or supporting Navy and Marine Corps...the present scenario and this will increase to 114, which is the total number of aircraft at the completion of the acquisition process in year 2018...increasing by 12 per year until 2018 where eight will complete the acquisition). The cost of the F414-GE-400 engine is about $3.7 million

  7. Analysis of EA-18G Growler Engine Maintenance at Naval Air Station Whidbey Island, WA

    DTIC Science & Technology

    2013-06-01

    maintenance. A. NAVAL AVIATION MAINTENANCE PROGRAM The aviation maintenance community considers the NAMP its bible. “The NAMP applies to all...will increase to 114, which is the total number of aircraft at the completion of the acquisition process in year 2018 (increasing by 12 per year...until 2018 where eight will complete the acquisition). The cost of the F414-GE-400 engine is about 3.7 million dollars. B. CALCULATING THE ENGINE Aₒ

  8. Further Evidence on the Effect of Acquisition Policy and Process on Cost Growth of Major Defense Acquisition Programs

    DTIC Science & Technology

    2016-06-01

    several changes that may prove to be consequential. B. Funding Climates The amount appropriated to DoD each year for procurement over the period FY...important to realize that the turns in the funds appropriated for procurement are a lagging indicator of a change in budget climate. Funding and...each of the events identified to signal major and sustained changes in the defense funding climate, which in fact they did. Analysis of surrounding

  9. An Analysis of the United States Special Operations Command’s Acquisition Process to Determine Its Compliance with Acquisition Reform Initiatives of the Past Decade

    DTIC Science & Technology

    1996-12-01

    This includes an exemption from publishing the opportunity in the Commerce Business Daily (CBD) and elimination of the requirement to hold the...of assigned programs. In discharging this responsibility, the 990 coordinates his efforts with other ASN(RDMA) offices, b. TEFlO..M-VAL OPERATIONS...Communications, Computers and Information Systems CA Civil Affairs CAIV Cost as an Independent Variable CBD Commerce Business Daily CBPL Capabilities

  10. Blueprint for Acquisition Reform, Version 3.0

    DTIC Science & Technology

    2008-07-01

    represents a substantial and immediate step forward in establishing the Coast Guard as a model mid-sized federal agency for acquisition processes...Blueprint for Acquisition Reform in the U. S. Coast Guard “The Coast Guard must become the model for mid-sized Federal agency acquisition in process...acquisition (DoD 5000 model >CG Major Systems Acquisition Manual) • Deepwater Program Executive Officer (PEO): System of Systems performance-based

  11. An Acquisition Guide for Executives

    EPA Pesticide Factsheets

    This guide covers the following subjects: What is Acquisition?, Purpose and Primary Functions of the Agency’s Acquisition System, Key Organizations in Acquisitions, Legal Framework, Key Players in Acquisitions, the Acquisition Process, and Acquisition Thresholds.

  12. Unmanned Maritime Systems Incremental Acquisition Approach

    DTIC Science & Technology

    2016-12-01

    We find that current UMS acquisitions are utilizing previous acquisition reforms, but could benefit from additional contractor peer competition and peer review. Additional cost and schedule benefits could result from contractor competition during build processes in each incremental process.

  13. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  14. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  15. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  16. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  17. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  18. Regional environmental analysis and management: New techniques for current problems

    NASA Technical Reports Server (NTRS)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  19. Second Language Acquisition: Possible Insights from Studies on How Birds Acquire Song.

    ERIC Educational Resources Information Center

    Neapolitan, Denise M.; And Others

    1988-01-01

    Reviews research that demonstrates parallels between general linguistic and cognitive processes in human language acquisition and avian acquisition of song and discusses how such research may provide new insights into the processes of second-language acquisition. (Author/CB)

  20. Environmental Impact Analysis Process. Deployment Area Selection and Land Withdrawal/Acquisition DEIS. Chapter V. Appendices.

    DTIC Science & Technology

    1980-12-01

    Analysis of the White Pine Power Project. Bureau of Business and Economic Research, University of Nevada, Reno. Basile, J. V., and T. N. Lonner, 1979...Suspected of Pesticide Poisoning. Avian Diseases 18:487-489. Resource Area, Nye County, Nevada. Bureau of Land Management, Battle Mountain District. Rhoads, W

  1. L'analyse contrastive: histoire et situation actuelle (Contrastive Analysis: History and Current Situation).

    ERIC Educational Resources Information Center

    Py, Bernard

    1984-01-01

    It is suggested that it is not between two languages that transfers and interference occur, but within the learner. The learner mediates and constructs this relationship according to acquisition operations, processes, strategies, and stages that contrastive analysis, despite its utility, can neither account for nor predict. (MSE)

  2. Systems integration of marketable subsystems: A collection of progress reports

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Monthly progress reports are given in the areas of marketable subsystems integration; development, design, and building of site data acquisition subsystems and data processing systems; operation of the solar test facility and a systems analysis.

  3. Research and Development in Very Long Baseline Interferometry (VLBI)

    NASA Technical Reports Server (NTRS)

    Himwich, William E.

    2004-01-01

    Contents include the following: 1.Observation coordination. 2. Data acquisition system control software. 3. Station support. 4. Correlation, data processing, and analysis. 5. Data distribution and archiving. 6. Technique improvement and research. 7. Computer support.

  4. Triaxial Probe Magnetic Data Analysis

    NASA Technical Reports Server (NTRS)

    Shultz, Kimberly; Whittlesey, Albert; Narvaez, Pablo

    2007-01-01

    The Triaxial Magnetic Moment Analysis software uses measured magnetic field test data to compute dipole and quadrupole moment information from a hardware element. It is used to support JPL projects needing magnetic control and an understanding of the spacecraft-generated magnetic fields. Evaluation of the magnetic moment of an object consists of three steps: acquisition, conditioning, and analysis. This version of existing software was extensively rewritten for easier data acquisition, data analysis, and report presentation, including immediate feedback to the test operator during data acquisition. While prior JPL computer codes provided the same data content, this program has a better graphic display including original data overlaid with reconstructed results to show goodness of fit accuracy and better appearance of the report graphic page. Data are acquired using three magnetometers and two rotations of the device under test. A clean acquisition user interface presents required numeric data and graphic summaries, and the analysis module yields the best fit (least squares) for the magnetic dipole and/or quadrupole moment of a device. The acquisition module allows the user to record multiple data sets, selecting the best data to analyze, and is repeated three times for each of the z-axial and y-axial rotations. In this update, the y-axial rotation starting position has been changed to an option, allowing either the x- or z-axis to point towards the magnetometer. The code has been rewritten to use three simultaneous axes of magnetic data (three probes), now using two "rotations" of the device under test rather than the previous three rotations, thus reducing handling activities on the device under test. The present version of the software gathers data in one-degree increments, which permits much better accuracy of the fitted data than the coarser data acquisition of the prior software. The data-conditioning module provides a clean data set for the analysis module. For multiple measurements at a given degree, the first measurement is used. For omitted measurements, the missing field is estimated by linear interpolation between the two nearest measurements. The analysis module was rewritten for the dual rotation, triaxial probe measurement process and now has better moment estimation accuracy, based on the finer one degree of data acquisition resolution. The magnetic moments thus computed are used as an input to summarize the total spacecraft field.
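
    Two of the conditioning and fitting steps described above, filling omitted angles by linear interpolation and least-squares fitting of the rotation data, can be sketched as follows. The Python example uses a synthetic single-axis rotation scan and treats the dipole and quadrupole contributions simply as first and second harmonics of the rotation, which is a simplification of the software's actual moment model.

        import numpy as np

        # Synthetic rotation scan: field at one magnetometer axis while the
        # device under test rotates in one-degree increments. A dipole shows
        # up as a first-harmonic variation, a quadrupole adds a second
        # harmonic (simplified stand-in; scale and phases are invented).
        deg = np.arange(360)
        theta = np.deg2rad(deg)
        field = (12.0 * np.cos(theta + 0.4) + 3.0 * np.cos(2 * theta)
                 + 0.5 * np.random.randn(360))              # nT, made-up scale

        # Data conditioning: pretend a few angles were not recorded and fill
        # them by linear interpolation between the nearest measured angles.
        missing = [45, 46, 200]
        measured = np.setdiff1d(deg, missing)
        field[missing] = np.interp(missing, measured, field[measured])

        # Least-squares fit of the first and second rotation harmonics.
        design = np.column_stack([
            np.ones_like(theta),
            np.cos(theta), np.sin(theta),         # dipole-like (first harmonic)
            np.cos(2 * theta), np.sin(2 * theta)  # quadrupole-like (second harmonic)
        ])
        coef, *_ = np.linalg.lstsq(design, field, rcond=None)
        print(f"first-harmonic amplitude  ~ {np.hypot(coef[1], coef[2]):.1f} nT")
        print(f"second-harmonic amplitude ~ {np.hypot(coef[3], coef[4]):.1f} nT")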

  5. Understanding information synthesis in oral surgery for the design of systems for clinical information technology.

    PubMed

    Suebnukarn, Siriwan; Chanakarn, Piyawadee; Phisutphatthana, Sirada; Pongpatarat, Kanchala; Wongwaithongdee, Udom; Oupadissakoon, Chanekrid

    2015-12-01

    An understanding of the processes of clinical decision-making is essential for the development of health information technology. In this study we have analysed the acquisition of information during decision-making in oral surgery, and analysed cognitive tasks using a "think-aloud" protocol. We studied the techniques of processing information that were used by novices and experts as they completed 4 oral surgical cases modelled from data obtained from electronic hospital records. We studied 2 phases of an oral surgeon's preoperative practice including the "diagnosis and planning of treatment" and "preparing for a procedure". A framework analysis approach was used to analyse the qualitative data, and a descriptive statistical analysis was made of the quantitative data. The results showed that novice surgeons used hypotheticodeductive reasoning, whereas experts recognised patterns to diagnose and manage patients. Novices provided less detail when they prepared for a procedure. Concepts regarding "signs", "importance", "decisions", and "process" occurred most often during acquisition of information by both novices and experts. Based on these results, we formulated recommendations for the design of clinical information technology that would help to improve the acquisition of clinical information required by oral surgeons at all levels of expertise in their clinical decision-making. Copyright © 2015 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  6. Airborne Wind Profiling With the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2012-01-01

    A pulsed 2-micron coherent Doppler lidar system from NASA Langley Research Center in Virginia flew on NASA's DC-8 aircraft during the NASA Genesis and Rapid Intensification Processes (GRIP) campaign in the summer of 2010. The participation was part of the Doppler Aerosol Wind Lidar (DAWN) Air project. Selected results of airborne wind profiling are presented and compared with dropsonde data for verification purposes. Panoramic presentations of different wind parameters over a nominal observation time span are also presented for selected GRIP data sets. The real-time data acquisition and analysis software that was employed during the GRIP campaign is introduced with its unique features.

  7. Comparative muscle study fatigue with sEMG signals during the isotonic and isometric tasks for diagnostics purposes.

    PubMed

    Sarmiento, Jhon F; Benevides, Alessandro B; Moreira, Marcelo H; Elias, Arlindo; Bastos, Teodiano F; Silva, Ian V; Pelegrina, Claudinei C

    2011-01-01

    The study of fatigue is an important tool for diagnostics in the disease, sports, ergonomics and robotics areas. This work deals with the analysis of the most important sEMG muscle fatigue indicators, using signal processing in isometric and isotonic tasks, with the purpose of standardizing a fatigue protocol for selecting the data acquisition and processing for diagnostic purposes. As a result, the slopes of the RMS, ARV and MNF indicators successfully described the expected fatigue behaviour, whereas the MDF and AIF indicators failed to describe fatigue. Similarly, the use of a constant load for sEMG data acquisition was the best strategy in both tasks.
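
    The indicators named above have compact definitions, sketched below in Python for a single epoch. The sampling rate and the synthetic "fresh" and "tired" records are assumptions; in a real protocol the indicators would be computed per epoch and their slopes tracked over the duration of the task.

        import numpy as np

        fs = 1000                                  # Hz, sEMG sampling rate (assumed)
        t = np.arange(fs) / fs
        rng = np.random.default_rng(1)

        def fatigue_indicators(emg):
            """RMS and ARV in the time domain, mean (MNF) and median (MDF)
            frequency of the power spectrum in the frequency domain."""
            emg = emg - emg.mean()
            rms = np.sqrt(np.mean(emg ** 2))
            arv = np.mean(np.abs(emg))
            power = np.abs(np.fft.rfft(emg)) ** 2
            freq = np.fft.rfftfreq(emg.size, d=1 / fs)
            mnf = np.sum(freq * power) / np.sum(power)
            mdf = freq[np.searchsorted(np.cumsum(power) / np.sum(power), 0.5)]
            return rms, arv, mnf, mdf

        def synth(freqs, gain):
            """Synthetic epoch: a few sinusoids plus noise (illustrative only)."""
            sig = sum(np.sin(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi))
                      for f in freqs)
            return gain * sig + 0.1 * rng.standard_normal(fs)

        # Fatigue typically raises the amplitude indicators and shifts the
        # spectrum downwards, which is the trend the slopes are meant to track.
        for name, epoch in (("fresh", synth([80, 120, 160], 1.0)),
                            ("tired", synth([40, 60, 80], 1.5))):
            rms, arv, mnf, mdf = fatigue_indicators(epoch)
            print(f"{name}: RMS={rms:.2f} ARV={arv:.2f} "
                  f"MNF={mnf:.0f} Hz MDF={mdf:.0f} Hz")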

  8. The Defense Systems Acquisition and Review Council

    DTIC Science & Technology

    1976-09-15

    THE DEFENSE SYSTEMS ACQUISITION AND REVIEW COUNCIL: A Study of Areas of Consideration Affecting the Functions and Process of Defense Major Systems Acquisition...Studies DSARC -- Functions and Process, OSDCAIG, Army Systems...The Defense Systems Acquisition Review Council (DSARC) was created to assume

  9. A multimedia perioperative record keeper for clinical research.

    PubMed

    Perrino, A C; Luther, M A; Phillips, D B; Levin, F L

    1996-05-01

    To develop a multimedia perioperative recordkeeper that provides: 1. synchronous, real-time acquisition of multimedia data, 2. on-line access to the patient's chart data, and 3. advanced data analysis capabilities through integrated, multimedia database and analysis applications. To minimize cost and development time, the system design utilized industry standard hardware components and graphical software development tools. The system was configured to use a Pentium PC complemented with a variety of hardware interfaces to external data sources. These sources included physiologic monitors with data in digital, analog, video, and audio as well as paper-based formats. The development process was guided by trials in over 80 clinical cases and by the critiques of numerous users. As a result of this process, a suite of custom software applications was created to meet the design goals. The Perioperative Data Acquisition application manages data collection from a variety of physiological monitors. The Charter application provides for rapid creation of an electronic medical record from the patient's paper-based chart and investigator's notes. The Multimedia Medical Database application provides a relational database for the organization and management of multimedia data. The Triscreen application provides an integrated data analysis environment with simultaneous, full-motion data display. With recent technological advances in PC power, data acquisition hardware, and software development tools, the clinical researcher now has the ability to collect and examine a more complete perioperative record. It is hoped that the description of the MPR and its development process will assist and encourage others to advance these tools for perioperative research.

  10. An innovative experimental sequence on electromagnetic induction and eddy currents based on video analysis and cheap data acquisition

    NASA Astrophysics Data System (ADS)

    Bonanno, A.; Bozzo, G.; Sapia, P.

    2017-11-01

    In this work, we present a coherent sequence of experiments on electromagnetic (EM) induction and eddy currents, appropriate for university undergraduate students, based on a magnet falling through a drilled aluminum disk. The sequence, leveraging the didactical interplay between the EM and mechanical aspects of the experiments, allows us to exploit the students’ awareness of mechanics to elicit their comprehension of EM phenomena. The proposed experiments feature two kinds of measurements: (i) kinematic measurements (performed by means of high-speed video analysis) give information on the system’s kinematics and, via appropriate numerical data processing, allow us to get dynamic information, in particular on energy dissipation; (ii) induced electromotive force (EMF) measurements (by using a homemade multi-coil sensor connected to a cheap data acquisition system) allow us to quantitatively determine the inductive effects of the moving magnet on its neighborhood. The comparison between experimental results and the predictions from an appropriate theoretical model (of the dissipative coupling between the moving magnet and the conducting disk) offers many educational hints on relevant topics related to EM induction, such as Maxwell’s displacement current, magnetic field flux variation, and the conceptual link between induced EMF and induced currents. Moreover, the didactical activity gives students the opportunity to be trained in video analysis, data acquisition and numerical data processing.
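
    The kinematic half of the measurement, numerical differentiation of video-tracked positions to recover velocity, acceleration and dissipated power, can be sketched as follows; the frame rate, magnet mass and braking constant in this Python fragment are invented, and the exponential velocity model merely stands in for tracked data.

        import numpy as np

        # Position-time samples as they might come from frame-by-frame video
        # tracking of the falling magnet (synthetic: free fall with a
        # linear-in-velocity braking term standing in for eddy-current drag).
        fps = 240.0                          # high-speed camera frame rate (assumed)
        t = np.arange(0, 0.5, 1 / fps)
        g, k = 9.81, 4.0                     # m/s^2, 1/s (made-up braking constant)
        v = (g / k) * (1 - np.exp(-k * t))   # model velocity
        z = np.cumsum(v) / fps               # crude numerical integration -> position

        # Numerical differentiation of the tracked positions gives velocity and
        # acceleration; the power dissipated by the braking force follows.
        m = 0.02                             # kg, magnet mass (assumed)
        v_est = np.gradient(z, 1 / fps)
        a_est = np.gradient(v_est, 1 / fps)
        p_dissipated = m * (g - a_est) * v_est        # F_brake * v

        print(f"final velocity ~ {v_est[-1]:.2f} m/s "
              f"(model terminal velocity {g / k:.2f} m/s)")
        print(f"peak dissipated power ~ {p_dissipated.max() * 1000:.0f} mW")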

  11. Real-time acquisition and display of flow contrast using speckle variance optical coherence tomography in a graphics processing unit.

    PubMed

    Xu, Jing; Wong, Kevin; Jian, Yifan; Sarunic, Marinko V

    2014-02-01

    In this report, we describe a graphics processing unit (GPU)-accelerated processing platform for real-time acquisition and display of flow contrast images with Fourier domain optical coherence tomography (FDOCT) in mouse and human eyes in vivo. Motion contrast from blood flow is processed using the speckle variance OCT (svOCT) technique, which relies on the acquisition of multiple B-scan frames at the same location and tracking the change of the speckle pattern. Real-time mouse and human retinal imaging using two different custom-built OCT systems with processing and display performed on GPU are presented with an in-depth analysis of performance metrics. The display output included structural OCT data, en face projections of the intensity data, and the svOCT en face projections of retinal microvasculature; these results compare projections with and without speckle variance in the different retinal layers to reveal significant contrast improvements. As a demonstration, videos of real-time svOCT for in vivo human and mouse retinal imaging are included in our results. The capability of performing real-time svOCT imaging of the retinal vasculature may be a useful tool in a clinical environment for monitoring disease-related pathological changes in the microcirculation such as diabetic retinopathy.
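
    The core speckle-variance computation is essentially a per-pixel variance across repeated B-scans. The Python sketch below applies it to a synthetic frame stack (all sizes and the "vessel" region are invented) to show why decorrelating flow stands out against static tissue; the GPU implementation described above performs the same arithmetic on streamed frames.

        import numpy as np

        def speckle_variance(bscans):
            """Per-pixel variance across N B-scans acquired at one location:
            static tissue gives low variance, moving blood gives high variance."""
            return np.var(bscans, axis=0)

        # Synthetic stack of N = 8 B-scans (depth x lateral, made-up sizes):
        # a frozen speckle background plus a small "vessel" patch whose
        # speckle decorrelates from frame to frame.
        rng = np.random.default_rng(0)
        n_frames, nz, nx = 8, 128, 256
        static = np.abs(rng.standard_normal((nz, nx)))
        stack = np.repeat(static[None, :, :], n_frames, axis=0)
        stack += 0.02 * rng.standard_normal(stack.shape)       # detection noise
        stack[:, 60:70, 100:120] = np.abs(
            rng.standard_normal((n_frames, 10, 20)))           # decorrelating "flow"

        sv = speckle_variance(stack)
        print("mean variance in vessel :", sv[60:70, 100:120].mean())
        print("mean variance elsewhere :", sv[:40, :40].mean())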

  12. Research Status and Development Trend of Remote Sensing in China Using Bibliometric Analysis

    NASA Astrophysics Data System (ADS)

    Zeng, Y.; Zhang, J.; Niu, R.

    2015-06-01

    Remote sensing was introduced into China in the 1970s and then began to flourish. At present, China has developed into a major remote sensing country, and remote sensing plays an increasingly important role in many fields of national economic construction and social development. Based on the China Academic Journals Full-text Database and the China Citation Database published by China National Knowledge Infrastructure, this paper analyzed the academic characteristics of 963 highly cited papers published by 16 professional and academic journals in the field of surveying and mapping from January 2010 to December 2014 in China, including hot topics, authors, research institutions, and foundations. At the same time, it studied a total of 51,149 keywords published by these 16 journals during the same period. Through keyword selection, keyword normalization, keyword consistency checking and keyword merging, followed by analysis of the high-frequency keywords, the progress and prospects of China's remote sensing technology in data acquisition, data processing and applications during the past five years were further explored and revealed. The results show that highly cited paper analysis and word frequency analysis are complementary for analyzing the progress of a subject; in the data acquisition phase, the research focus is new civilian remote sensing satellite systems and UAV remote sensing systems; the research focus in data processing and analysis is multi-source information extraction and classification, laser point cloud data processing, object-oriented high-resolution image analysis, and SAR and hyperspectral image processing; the development trend of remote sensing data processing is toward quantitative, intelligent, automated and real-time processing, while the breadth and depth of remote sensing applications gradually increase; and parallel computing, cloud computing, and geographic conditions monitoring and census are new research focuses deserving attention.
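
    The word-frequency side of such an analysis reduces to normalising keywords and counting them. The Python fragment below illustrates this on a handful of made-up keyword lists and a hypothetical normalisation mapping; it is not the study's actual processing pipeline.

        from collections import Counter

        # A few made-up keyword lists standing in for the 51,149 keywords
        # harvested from the 16 journals (illustrative only).
        papers = [
            ["高分辨率遥感影像", "object-oriented classification", "change detection"],
            ["UAV remote sensing", "point cloud", "object-oriented classification"],
            ["SAR", "change detection", "object-oriented classification"],
        ]

        # Keyword normalisation/consistency step: map synonymous or non-English
        # variants onto a canonical form before counting (mapping is hypothetical).
        canonical = {"高分辨率遥感影像": "high-resolution imagery"}
        normalised = [canonical.get(k, k).lower() for kws in papers for k in kws]

        # High-frequency keyword analysis: simple frequency ranking.
        for keyword, count in Counter(normalised).most_common(5):
            print(f"{count:2d}  {keyword}")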

  13. How Do Turkish Middle School Science Coursebooks Present the Science Process Skills?

    ERIC Educational Resources Information Center

    Aslan, Oktay

    2015-01-01

    An important objective in science education is the acquisition of science process skills (SPS) by the students. Therefore, science coursebooks, among the main resources of elementary science curricula, are to convey accurate SPS. This study is a qualitative study based on the content analysis of the science coursebooks used at middle schools. In…

  14. 48 CFR 15.101-1 - Tradeoff process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Tradeoff process. 15.101-1 Section 15.101-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 15.101-1...

  15. Acquisition by Processing Theory: A Theory of Everything?

    ERIC Educational Resources Information Center

    Carroll, Susanne E.

    2004-01-01

    Truscott and Sharwood Smith (henceforth T&SS) propose a novel theory of language acquisition, "Acquisition by Processing Theory" (APT), designed to account for both first and second language acquisition, monolingual and bilingual speech perception and parsing, and speech production. This is a tall order. Like any theoretically ambitious…

  16. 29. Perimeter acquisition radar building room #318, data processing system ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    29. Perimeter acquisition radar building room #318, data processing system area; data processor maintenance and operations center, showing data processing consoles - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND

  17. Development of Data Acquisition Set-up for Steady-state Experiments

    NASA Astrophysics Data System (ADS)

    Srivastava, Amit K.; Gupta, Arnab D.; Sunil, S.; Khan, Ziauddin

    2017-04-01

    For short-duration experiments, the digitized data are generally transferred for processing and storage after the experiment, whereas in a steady-state experiment the data are acquired, processed, displayed and stored continuously in a pipelined manner. This requires acquiring the data with special techniques for storage, while viewing the data on the go to display the current trends of the various physical parameters. A small data acquisition set-up has been developed for continuously acquiring signals from various physical parameters at different sampling rates for long-duration experiments. It includes the hardware set-up for signal digitization, a Field Programmable Gate Array (FPGA) based timing system for clock synchronization and event/trigger distribution, time slicing of data streams into chunks so that data can be viewed during acquisition, and channel profile display through down-sampling. In order to store a data stream of indefinite or long duration, the stream is divided into data slices/chunks of user-defined time duration. Data chunks avoid the problem of server data being inaccessible until the channel data file is closed at the end of the long-duration experiment. A graphical user interface has been developed in the LabVIEW application development environment for configuring the data acquisition hardware and for storing data chunks on the local machine as well as at a remote data server through Python for further data access. The data plotting and analysis utilities have been developed with Python software, which provides tools for further data processing. This paper describes the development and implementation of data acquisition for steady-state experiments.
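
    The time-slicing idea, closing a new file for every chunk so that earlier chunks stay readable while acquisition continues, can be sketched in a few lines. The Python fragment below uses random numbers in place of digitiser reads and invented chunk lengths and file names; the system described above does this in LabVIEW and Python against real hardware.

        import time
        import numpy as np

        def acquire_block(n_samples=1000, block_period=0.05):
            """Stand-in for reading one block from the digitiser."""
            time.sleep(block_period)               # emulate the hardware block rate
            return np.random.randn(n_samples)

        def run(chunk_seconds=2.0, total_seconds=6.0, decimate=10):
            """Acquire continuously, close a new file every chunk_seconds so
            finished chunks stay readable during the experiment, and keep a
            down-sampled trace for on-the-go display."""
            start = chunk_start = time.time()
            chunk, chunk_id = [], 0
            while time.time() - start < total_seconds:
                block = acquire_block()
                chunk.append(block)
                preview = block[::decimate]         # channel profile for live display
                if time.time() - chunk_start >= chunk_seconds:
                    np.save(f"chunk_{chunk_id:04d}.npy", np.concatenate(chunk))
                    print(f"closed chunk {chunk_id} ({len(chunk)} blocks); "
                          f"preview holds {preview.size} points")
                    chunk, chunk_id, chunk_start = [], chunk_id + 1, time.time()
            if chunk:                               # flush the final partial chunk
                np.save(f"chunk_{chunk_id:04d}.npy", np.concatenate(chunk))

        if __name__ == "__main__":
            run()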

  18. High speed CMOS acquisition system based on FPGA embedded image processing for electro-optical measurements

    NASA Astrophysics Data System (ADS)

    Rosu-Hamzescu, Mihnea; Polonschii, Cristina; Oprea, Sergiu; Popescu, Dragos; David, Sorin; Bratu, Dumitru; Gheorghiu, Eugen

    2018-06-01

    Electro-optical measurements, i.e., optical waveguides and plasmonic based electrochemical impedance spectroscopy (P-EIS), are based on the sensitive dependence of refractive index of electro-optical sensors on surface charge density, modulated by an AC electrical field applied to the sensor surface. Recently, P-EIS has emerged as a new analytical tool that can resolve local impedance with high, optical spatial resolution, without using microelectrodes. This study describes a high speed image acquisition and processing system for electro-optical measurements, based on a high speed complementary metal-oxide semiconductor (CMOS) sensor and a field-programmable gate array (FPGA) board. The FPGA is used to configure CMOS parameters, as well as to receive and locally process the acquired images by performing Fourier analysis for each pixel, deriving the real and imaginary parts of the Fourier coefficients for the AC field frequencies. An AC field generator, for single or multi-sine signals, is synchronized with the high speed acquisition system for phase measurements. The system was successfully used for real-time angle-resolved electro-plasmonic measurements from 30 Hz up to 10 kHz, providing results consistent to ones obtained by a conventional electrical impedance approach. The system was able to detect amplitude variations with a relative variation of ±1%, even for rather low sampling rates per period (i.e., 8 samples per period). The PC (personal computer) acquisition and control software allows synchronized acquisition for multiple FPGA boards, making it also suitable for simultaneous angle-resolved P-EIS imaging.
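
    The per-pixel Fourier analysis at the modulation frequency amounts to a lock-in computation over the frame stack. The Python sketch below reproduces that arithmetic offline on synthetic frames (frame rate, modulation frequency and optical response maps are all invented); the FPGA performs the equivalent accumulation on streamed pixels.

        import numpy as np

        rng = np.random.default_rng(3)
        fps = 1000.0                # frames per second (assumed camera rate)
        f_ac = 50.0                 # Hz AC modulation applied to the sensor (assumed)
        n_frames, ny, nx = 200, 32, 32     # an integer number of modulation periods

        # Synthetic frames: each pixel oscillates at f_ac with its own (made-up)
        # amplitude and phase, plus detection noise.
        t = np.arange(n_frames) / fps
        phase_map = rng.uniform(0, 2 * np.pi, (ny, nx))
        amp_map = 1.0 + 0.5 * rng.random((ny, nx))
        frames = (amp_map[None] * np.cos(2 * np.pi * f_ac * t[:, None, None]
                                         + phase_map[None])
                  + 0.05 * rng.standard_normal((n_frames, ny, nx)))

        # Reference waveforms at the AC frequency (what the generator provides).
        cos_ref = np.cos(2 * np.pi * f_ac * t)
        sin_ref = np.sin(2 * np.pi * f_ac * t)

        # Accumulate per pixel: real and imaginary Fourier coefficients at f_ac.
        real = np.tensordot(cos_ref, frames, axes=1) * 2 / n_frames
        imag = -np.tensordot(sin_ref, frames, axes=1) * 2 / n_frames

        amplitude = np.hypot(real, imag)
        phase = np.arctan2(imag, real)
        print("mean amplitude recovery error:",
              float(np.abs(amplitude - amp_map).mean()))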

  19. Cue acquisition: A feature of Malawian midwives decision making process to support normality during the first stage of labour.

    PubMed

    Chodzaza, Elizabeth; Haycock-Stuart, Elaine; Holloway, Aisha; Mander, Rosemary

    2018-03-01

    to explore Malawian midwives' decision making when caring for women during the first stage of labour in the hospital setting. This focused ethnographic study examined the decision making process of 9 nurse-midwives with varying years of clinical experience in the real-world setting of an urban and a semi-urban hospital from October 2013 to May 2014. This was done using 27 participant observations and 27 post-observation in-depth interviews over a period of six months. Qualitative data analysis software, NVivo 10, was used to assist with data management for the analysis. All data were analysed using the principle of theme and category formation. Analysis revealed a six-stage process of decision making that includes a baseline for labour, deciding to admit a woman to the labour ward, ascertaining the normal physiological progress of labour, supporting the normal physiological progress of labour, embracing uncertainty (the midwives' construction of unusual labour as normal), dealing with uncertainty, and deciding to intervene in unusual labour. This six-stage process of decision making is conceptualised as the 'role of cue acquisition', illustrating the ways in which midwives utilise their assessment of labouring women to reason and make decisions on how to care for them in labour. Cue acquisition involved the midwives piecing together segments of information they obtained from the women to formulate an understanding of the woman's birthing progress and to inform their decision making. This understanding of cue acquisition by midwives is significant for supporting safe care in the labour setting. When there was uncertainty in a woman's progress of labour, midwives used deductive reasoning, for example by cross-checking and analysing the information obtained during the span of labour. Supporting normal labour physiological processes was identified as an underlying principle that shaped the midwives' clinical judgement and decision making when they cared for women in labour. The significance of this study is in the new understanding and insight into the process of midwifery decision making. Whilst the approach to decision making by the midwives requires further testing and refinement in order to explore implications for practice, the findings here provide new conceptual and practical clarity about midwifery decision making. The work addresses the identified lack of knowledge of how midwives make decisions when working clinically in the real-world setting, and therefore contributes to this body of knowledge regarding midwives' decision making. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Cognitive inhibition of number/length interference in a Piaget-like task: evidence by combining ERP and MEG.

    PubMed

    Joliot, Marc; Leroux, Gaëlle; Dubal, Stéphanie; Tzourio-Mazoyer, Nathalie; Houdé, Olivier; Mazoyer, Bernard; Petit, Laurent

    2009-08-01

    We combined event-related potential (ERP) and magnetoencephalography (MEG) acquisition and analysis to investigate the electrophysiological markers of the inhibitory processes involved in number/length interference in a Piaget-like numerical task. Eleven healthy subjects performed four gradually interfering conditions in which the heuristic "length equals number" had to be inhibited. Low resolution tomography reconstruction was performed on the combined grand-averaged electromagnetic data at the early (N1, P1) and late (P2, N2, P3(early) and P3(late)) latencies. Every condition was analyzed at both the scalp and regional brain levels. The inhibitory processes were visible on the late components of the electromagnetic brain activity. A right P2-related frontal orbital activation reflected the change of strategy in the inhibitory processes. N2-related SMA/cingulate activation revealed the first occurrence of processing of the stimuli to be inhibited. Both P3 components revealed the working memory processes operating in a medial temporal complex and the mental imagery processes subtended by the precuneus. Simultaneous ERP and MEG signal acquisition and analysis made it possible to describe the spatiotemporal patterns of the neural networks involved in the inhibition of the "length equals number" interference. Combining ERP and MEG ensured a sensitivity that could previously be reached only through invasive intracortical recordings.

  1. Machine Learning and Computer Vision System for Phenotype Data Acquisition and Analysis in Plants.

    PubMed

    Navarro, Pedro J; Pérez, Fernando; Weiss, Julia; Egea-Cortines, Marcos

    2016-05-05

    Phenomics is a technology-driven approach with a promising future for obtaining unbiased data on biological systems. Image acquisition is relatively simple; however, data handling and analysis are not as developed as the sampling capacities. We present a system based on machine learning (ML) algorithms and computer vision intended to solve automatic phenotype data analysis in plant material. We developed a growth chamber able to accommodate species of various sizes. Night image acquisition requires near-infrared lighting. For the ML process, we tested three different algorithms: k-nearest neighbour (kNN), Naive Bayes Classifier (NBC), and Support Vector Machine (SVM). Each ML algorithm was executed with different kernel functions and trained with raw data and two types of data normalisation. Different metrics were computed to determine the optimal configuration of the machine learning algorithms. We obtained a performance of 99.31% with kNN for RGB images and 99.34% with SVM for NIR. Our results show that ML techniques can speed up phenomic data analysis. Furthermore, both RGB and NIR images can be segmented successfully but may require different ML algorithms for segmentation.
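
    The classifier comparison can be mimicked with standard tooling. The scikit-learn sketch below evaluates kNN, Naive Bayes and an SVM with and without feature scaling on a made-up feature matrix standing in for the image-derived features; it reproduces the shape of the comparison, not the authors' data or results.

        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.naive_bayes import GaussianNB
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # Made-up feature vectors standing in for per-pixel colour/texture
        # features extracted from RGB or NIR growth-chamber images.
        X, y = make_classification(n_samples=600, n_features=10,
                                   n_informative=6, random_state=0)

        classifiers = {
            "kNN": KNeighborsClassifier(n_neighbors=5),
            "NBC": GaussianNB(),
            "SVM": SVC(kernel="rbf"),
        }

        # Each classifier is evaluated with and without feature normalisation,
        # mirroring the raw vs normalised training-data comparison.
        for name, clf in classifiers.items():
            for label, model in (("raw", clf),
                                 ("scaled", make_pipeline(StandardScaler(), clf))):
                score = cross_val_score(model, X, y, cv=5).mean()
                print(f"{name} ({label}): {score:.3f}")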

  2. 48 CFR 801.602-78 - Processing solicitations and contract documents for legal or technical review-Veterans Health...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Processing solicitations..., Central Office (except Office of Construction and Facilities Management), the National Acquisition Center, and the Denver Acquisition and Logistics Center. 801.602-78 Section 801.602-78 Federal Acquisition...

  3. The acquisition process of musical tonal schema: implications from connectionist modeling.

    PubMed

    Matsunaga, Rie; Hartono, Pitoyo; Abe, Jun-Ichi

    2015-01-01

    Using connectionist modeling, we address fundamental questions concerning the acquisition process of musical tonal schema of listeners. Compared to models of previous studies, our connectionist model (Learning Network for Tonal Schema, LeNTS) was better equipped to fulfill three basic requirements. Specifically, LeNTS was equipped with a learning mechanism, bound by culture-general properties, and trained by sufficient melody materials. When exposed to Western music, LeNTS acquired musical 'scale' sensitivity early and 'harmony' sensitivity later. The order of acquisition of scale and harmony sensitivities shown by LeNTS was consistent with the culture-specific acquisition order shown by musically westernized children. The implications of these results for the acquisition process of a tonal schema of listeners are as follows: (a) the acquisition process may entail small and incremental changes, rather than large and stage-like changes, in corresponding neural circuits; (b) the speed of schema acquisition may mainly depend on musical experiences rather than maturation; and (c) the learning principles of schema acquisition may be culturally invariant while the acquired tonal schemas are varied with exposed culture-specific music.

  4. The acquisition process of musical tonal schema: implications from connectionist modeling

    PubMed Central

    Matsunaga, Rie; Hartono, Pitoyo; Abe, Jun-ichi

    2015-01-01

    Using connectionist modeling, we address fundamental questions concerning the acquisition process of musical tonal schema of listeners. Compared to models of previous studies, our connectionist model (Learning Network for Tonal Schema, LeNTS) was better equipped to fulfill three basic requirements. Specifically, LeNTS was equipped with a learning mechanism, bound by culture-general properties, and trained by sufficient melody materials. When exposed to Western music, LeNTS acquired musical ‘scale’ sensitivity early and ‘harmony’ sensitivity later. The order of acquisition of scale and harmony sensitivities shown by LeNTS was consistent with the culture-specific acquisition order shown by musically westernized children. The implications of these results for the acquisition process of a tonal schema of listeners are as follows: (a) the acquisition process may entail small and incremental changes, rather than large and stage-like changes, in corresponding neural circuits; (b) the speed of schema acquisition may mainly depend on musical experiences rather than maturation; and (c) the learning principles of schema acquisition may be culturally invariant while the acquired tonal schemas are varied with exposed culture-specific music. PMID:26441725

  5. Reengineering the Acquisition/Procurement Process: A Methodology for Requirements Collection

    NASA Technical Reports Server (NTRS)

    Taylor, Randall; Vanek, Thomas

    2011-01-01

    This paper captures the systematic approach taken by JPL's Acquisition Reengineering Project team, the methodology used, challenges faced, and lessons learned. It provides pragmatic "how-to" techniques and tools for collecting requirements and for identifying areas of improvement in an acquisition/procurement process or other core process of interest.

  6. 48 CFR 15.202 - Advisory multi-step process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Advisory multi-step... Information 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204... participate in the acquisition. This process should not be used for multi-step acquisitions where it would...

  7. Utilization of high-frequency Rayleigh waves in near-surface geophysics

    USGS Publications Warehouse

    Xia, J.; Miller, R.D.; Park, C.B.; Ivanov, J.; Tian, G.; Chen, C.

    2004-01-01

    Shear-wave velocities can be derived by inverting the dispersive phase velocity of surface waves. The multichannel analysis of surface waves (MASW) is one technique for inverting high-frequency Rayleigh waves. The process includes acquisition of high-frequency, broad-band Rayleigh waves; efficient and accurate algorithms designed to extract Rayleigh-wave dispersion curves; and stable and efficient inversion algorithms to obtain near-surface S-wave velocity profiles. MASW estimates S-wave velocity from multichannel vertical-component data and consists of data acquisition, dispersion-curve picking, and inversion.
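
    A minimal sketch of the phase-shift transform commonly used for MASW dispersion imaging is given below; the synthetic gather, geometry, and picking step are illustrative assumptions, not the authors' processing code:

      # Minimal sketch (assumed, not the authors' code) of the phase-shift transform
      # used in MASW: build a phase-velocity spectrum from a multichannel record and
      # pick the dispersion curve as the spectral maximum at each frequency.
      import numpy as np

      dt, nt = 0.001, 2000                      # 1 ms sampling, 2 s records
      offsets = 10.0 + 2.0 * np.arange(24)      # 24 geophones, 2 m spacing
      t = np.arange(nt) * dt
      freqs = np.arange(5, 51, 1.0)             # 5-50 Hz
      c_true = 400.0 - 4.0 * freqs              # prescribed dispersion curve (m/s)

      # Synthetic gather: each frequency component travels at its phase velocity.
      gather = np.zeros((len(offsets), nt))
      for f, c in zip(freqs, c_true):
          for i, x in enumerate(offsets):
              gather[i] += np.cos(2 * np.pi * f * (t - x / c))

      # Phase-shift transform: for each trial velocity, undo the offset-dependent
      # phase and stack the normalised spectra; the stack peaks at the true velocity.
      spec = np.fft.rfft(gather, axis=1)
      f_axis = np.fft.rfftfreq(nt, dt)
      spec_n = spec / (np.abs(spec) + 1e-12)
      c_trial = np.arange(100.0, 600.0, 5.0)

      picked = []
      for f in freqs:
          k = np.argmin(np.abs(f_axis - f))
          # E(c) = | sum_x exp(+i 2 pi f x / c) * S_n(x, f) |
          steer = np.exp(2j * np.pi * f * offsets[:, None] / c_trial[None, :])
          energy = np.abs((steer * spec_n[:, k, None]).sum(axis=0))
          picked.append(c_trial[np.argmax(energy)])

      for f, c_t, c_p in zip(freqs[::10], c_true[::10], np.array(picked)[::10]):
          print(f"{f:4.0f} Hz   true {c_t:5.1f} m/s   picked {c_p:5.1f} m/s")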

  8. Wireless photoplethysmographic device for heart rate variability signal acquisition and analysis.

    PubMed

    Reyes, Ivan; Nazeran, Homer; Franco, Mario; Haltiwanger, Emily

    2012-01-01

    The photoplethysmographic (PPG) signal has the potential to aid in the acquisition and analysis of the heart rate variability (HRV) signal: a non-invasive quantitative marker of the autonomic nervous system that could be used to assess cardiac health and other physiologic conditions. A low-power wireless PPG device was custom-developed to monitor, acquire, and analyze the arterial pulse in the finger. The system consisted of an optical sensor to detect the arterial pulse as variations in reflected light intensity, signal conditioning circuitry to process the reflected light signal, a microcontroller to control PPG signal acquisition, digitization, and wireless transmission, and a receiver to collect the transmitted digital data and convert them back to their analog representations. A personal computer was used to further process the captured PPG signals and display them. A MATLAB program was then developed to capture the PPG data, detect the RR peaks, perform spectral analysis of the PPG data, and extract the HRV signal. A user-friendly graphical user interface (GUI) was developed in LabVIEW to display the PPG data and their spectra. The performance of each module (sensing unit, signal conditioning, wireless transmission/reception units, and graphical user interface) was assessed individually, and the device was then tested as a whole. PPG data were subsequently obtained from five healthy individuals to test the utility of the wireless system. The device was able to reliably acquire the PPG signals from the volunteers. To validate the accuracy of the MATLAB code, RR peak information from each subject was fed into the Kubios software as a text file. Kubios generated a report sheet with the time-domain and frequency-domain parameters of the acquired data, and these were compared against those calculated by MATLAB. The preliminary results demonstrate that the prototype wireless device can be used to perform HRV signal acquisition and analysis.
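
    The following sketch (an assumed illustration, not the authors' MATLAB/LabVIEW implementation) shows the core HRV steps described above: pulse-peak detection, inter-beat intervals, time-domain statistics, and Welch spectral analysis of the resampled tachogram; the synthetic PPG-like signal and all parameters are placeholders:

      # Minimal sketch (assumed): peak detection and frequency-domain HRV analysis
      # of a PPG-like pulse signal using SciPy.
      import numpy as np
      from scipy.signal import find_peaks, welch
      from scipy.interpolate import interp1d

      fs = 100.0                                   # sampling rate (Hz), assumed
      t = np.arange(0, 300, 1 / fs)                # 5 minutes of signal
      # Synthetic pulse train with heart rate slowly modulated around 60 bpm.
      hr = 60 + 3 * np.sin(2 * np.pi * 0.1 * t)    # modulation at 0.1 Hz (LF band)
      phase = 2 * np.pi * np.cumsum(hr / 60) / fs
      ppg = np.sin(phase) ** 21                    # sharp, pulse-like peaks

      # 1) Beat detection.
      peaks, _ = find_peaks(ppg, height=0.5, distance=int(0.4 * fs))
      beat_times = peaks / fs
      ibi = np.diff(beat_times)                    # inter-beat intervals (s)

      # 2) Time-domain HRV measures.
      sdnn = np.std(ibi) * 1000
      rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2)) * 1000
      print(f"mean HR {60 / ibi.mean():.1f} bpm, SDNN {sdnn:.1f} ms, RMSSD {rmssd:.1f} ms")

      # 3) Frequency domain: resample the tachogram evenly, then Welch PSD.
      fs_i = 4.0
      t_even = np.arange(beat_times[1], beat_times[-1], 1 / fs_i)
      ibi_even = interp1d(beat_times[1:], ibi)(t_even)
      f, pxx = welch(ibi_even - ibi_even.mean(), fs=fs_i, nperseg=256)
      df = f[1] - f[0]
      lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df
      hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df
      print(f"LF power {lf:.2e}, HF power {hf:.2e}, LF/HF {lf / hf:.2f}")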

  9. Evaluation of optimized b-value sampling schemas for diffusion kurtosis imaging with an application to stroke patient data

    PubMed Central

    Yan, Xu; Zhou, Minxiong; Ying, Lingfang; Yin, Dazhi; Fan, Mingxia; Yang, Guang; Zhou, Yongdi; Song, Fan; Xu, Dongrong

    2013-01-01

    Diffusion kurtosis imaging (DKI) is a new magnetic resonance imaging (MRI) method that provides non-Gaussian information not available in conventional diffusion tensor imaging (DTI). DKI requires data acquisition at multiple b-values for parameter estimation, a process that is usually time-consuming; fewer b-values are therefore preferable to expedite acquisition. In this study, we carefully evaluated various acquisition schemas using different numbers and combinations of b-values. Acquisition schemas that sampled b-values distributed toward the two ends of the range proved optimal. Compared to conventional schemas using equally spaced b-values (ESB), optimized schemas require fewer b-values to minimize fitting errors in parameter estimation and may thus significantly reduce scanning time. From the ranked list of optimized schemas resulting from the evaluation, we recommend the 3b schema based on its estimation accuracy and time efficiency; it needs data from only three b-values, at 0, around 800, and around 2600 s/mm2. Analyses using voxel-based analysis (VBA) and region-of-interest (ROI) analysis with human DKI datasets support the use of the optimized 3b (0, 1000, 2500 s/mm2) DKI schema in practical clinical applications. PMID:23735303
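
    Because the DKI signal model is log-quadratic in b, the three unknowns (S0, D, K) can be solved exactly from a three-b-value acquisition; the sketch below illustrates this for the clinically recommended b-values (0, 1000, 2500 s/mm2) with invented tissue parameters, and is not the authors' estimation code:

      # Minimal sketch (assumed): estimate diffusivity D and kurtosis K from a 3b
      # scheme by solving the log-linear DKI model
      #   ln S(b) = ln S0 - b*D + (b^2 * D^2 * K) / 6
      import numpy as np

      b = np.array([0.0, 1000.0, 2500.0])          # s/mm^2, the recommended 3b schema
      D_true, K_true, S0_true = 1.0e-3, 0.9, 1000.0
      S = S0_true * np.exp(-b * D_true + (b ** 2) * (D_true ** 2) * K_true / 6)
      S_noisy = S + np.random.default_rng(1).normal(0, 2.0, size=3)   # add noise

      # ln S = c0 + c1*b + c2*b^2  ->  D = -c1,  K = 6*c2 / D^2
      A = np.column_stack([np.ones(3), b, b ** 2])
      c0, c1, c2 = np.linalg.solve(A, np.log(S_noisy))
      D_est = -c1
      K_est = 6 * c2 / D_est ** 2
      print(f"D = {D_est:.4e} mm^2/s (true {D_true:.1e}), K = {K_est:.3f} (true {K_true})")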

  10. A Conceptual Methodology for Assessing Acquisition Requirements Robustness against Technology Uncertainties

    NASA Astrophysics Data System (ADS)

    Chou, Shuo-Ju

    2011-12-01

    In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems that provide critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenarios involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise" or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies, and the noise variables are the tactics and scenarios associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and in program development budget and schedule requirements. Unfortunately, the Technology Readiness Assessment (TRA) process currently used by acquisition program managers and decision-makers to measure technology uncertainty at critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe the uncertainties of program technology elements provides only a qualitative, non-descriptive estimate of those uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed. This process should provide acquisition program managers and decision-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties and their subsequent impacts on capability, budget, and schedule requirements led to the conclusion that an analysis process coupling a probabilistic technique such as Monte Carlo simulation with quantitative, parametric models of technology performance impact and of technology development time and cost would allow the probabilities of meeting specific requirement constraints to be established. These probability-of-requirements-success metrics can then be used as a quantitative and probabilistic measure of program requirements robustness against technology uncertainties. Combined with a Multi-Objective Genetic Algorithm optimization process and a computer-based Decision Support System, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition, as well as in the formulation of program development and risk management strategies.
To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques into a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties. To demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of the method was performed on a notional program for acquiring a carrier-based Suppression of Enemy Air Defenses (SEAD) capability using Unmanned Combat Aircraft Systems (UCAS) and their enabling technologies. The results of this implementation provided valuable insights into the benefits and inner workings of the methodology, as well as its limitations, which should be addressed in the future to narrow the gap between the current state and the desired state.
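
    The Monte Carlo idea described above can be illustrated with a short, entirely hypothetical sketch: sample technology performance, cost, and schedule outcomes from assumed uncertainty distributions and report the probability of meeting each requirement constraint; all distributions and thresholds below are invented for illustration:

      # Hypothetical sketch of the Monte Carlo idea: sample technology performance,
      # development cost and schedule from assumed uncertainty distributions and
      # report the probability that each requirement constraint (and all of them
      # jointly) is met. All numbers are illustrative only.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      # Assumed uncertainty models for an immature technology element.
      performance = rng.normal(loc=0.95, scale=0.08, size=n)         # fraction of required capability
      cost = rng.lognormal(mean=np.log(120.0), sigma=0.20, size=n)   # $M development cost
      schedule = rng.triangular(left=30, mode=40, right=60, size=n)  # months to maturity

      # Requirement constraints (also illustrative).
      meets_perf = performance >= 1.0
      meets_cost = cost <= 150.0
      meets_sched = schedule <= 48.0

      for name, ok in [("performance", meets_perf), ("cost", meets_cost),
                       ("schedule", meets_sched),
                       ("all requirements", meets_perf & meets_cost & meets_sched)]:
          print(f"P(meet {name}) = {ok.mean():.3f}")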

  11. The Mars Science Laboratory Organic Check Material

    NASA Technical Reports Server (NTRS)

    Conrad, Pamela G.; Eigenbrode, J. E.; Mogensen, C. T.; VonderHeydt, M. O.; Glavin, D. P.; Mahaffy, P. M.; Johnson, J. A.

    2011-01-01

    The Organic Check Material (OCM) has been developed for use on the Mars Science Laboratory mission to serve as a sample standard for verification of organic cleanliness and characterization of potential sample alteration as a function of the sample acquisition and portioning process on the Curiosity rover. OCM samples will be acquired using the same procedures for drilling, portioning, and delivery as are used to study martian samples with the Sample Analysis at Mars (SAM) instrument suite during MSL surface operations. Because the SAM suite is highly sensitive to organic molecules, the mission can better verify the cleanliness of Curiosity's sample acquisition hardware if a known material can be processed through SAM and compared with the results obtained from martian samples.

  12. A noninterference blade vibration measurement system for gas turbine engines

    NASA Astrophysics Data System (ADS)

    Watkins, William B.; Chi, Ray M.

    1987-06-01

    A noninterfering blade vibration measurement system has been demonstrated in tests of a gas turbine first-stage fan. Conceptual design of the system, including its theory, the design of case-mounted probes, and the data acquisition and signal processing hardware, was done in a previous effort. The current effort involved instrumentation of an engine fan stage with strain gages; data acquisition using shaft-mounted reference and case-mounted optical probes; recording of data on a wideband tape recorder; and post-test processing using off-line analysis in a facility computer and a minicomputer-based readout system designed for near-real-time readout. Results are presented in terms of true blade vibration frequencies, time- and frequency-dependent vibration amplitudes, and comparison of the optical noninterference results with strain gage readings.

  13. When Performance Is the Product: Problems in the Analysis of Online Distance Education

    ERIC Educational Resources Information Center

    Hamilton, David; Dahlgren, Ethel; Hult, Agneta; Roos, Bertil; Soderstrom, Tor

    2004-01-01

    This article examines two ideologies that have been prominent in recent, if not current, education thinking. The first is that means can be separated from ends (or processes from products); the second is that learning is merely a process of knowledge acquisition. Attention to these ideologies arises from two projects in the overlapping fields of…

  14. Multiyear Subcontractor Selection Criteria Analysis.

    DTIC Science & Technology

    1983-09-01

    advancement are program instability, higher costs, and increased lead-times. Compounding the instability created by advancing technology are changes in...drive smaller firms out of business (17:46). Technology is advancing at an ever increasing pace, demanding higher performance and larger amounts of engi...Process Adding to the external factors mentioned above, the weapon systems acquisition process tends to retard productivity advancements by its very

  15. 48 CFR 215.404 - Proposal analysis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Proposal analysis. 215.404 Section 215.404 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT... Proposal analysis. ...

  16. 48 CFR 215.404 - Proposal analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Proposal analysis. 215.404 Section 215.404 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT... Proposal analysis. ...

  17. Quantitative nanoscopy: Tackling sampling limitations in (S)TEM imaging of polymers and composites.

    PubMed

    Gnanasekaran, Karthikeyan; Snel, Roderick; de With, Gijsbertus; Friedrich, Heiner

    2016-01-01

    Sampling limitations in electron microscopy raise the question of whether the analysis of a bulk material is representative, especially when analyzing hierarchical morphologies that extend over multiple length scales. We tackled this problem by automatically acquiring a large series of partially overlapping (S)TEM images at sufficient resolution, which are subsequently stitched together to generate a large-area map, using an in-house developed acquisition toolbox (TU/e Acquisition ToolBox) and stitching module (TU/e Stitcher). In addition, we show that quantitative image analysis of the large-scale maps provides representative information that can be related to the synthesis and processing conditions of hierarchical materials, which moves electron microscopy towards becoming a bulk characterization tool. We demonstrate the power of such an analysis by examining two different multi-phase materials that are structured over multiple length scales. Copyright © 2015 Elsevier B.V. All rights reserved.
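
    A building block of such large-area mapping is estimating the offset between partially overlapping tiles; the sketch below (assumed, not the TU/e toolbox) does this with plain FFT-based phase correlation on a synthetic scene:

      # Minimal sketch (assumed): estimate the translation between two partially
      # overlapping image tiles with phase correlation, the basic building block
      # of large-area stitching.
      import numpy as np

      def phase_correlation_shift(a, b):
          """Return the (row, col) offset of tile b's origin relative to tile a's."""
          Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
          cross = Fa * np.conj(Fb)
          corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
          peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
          dims = np.array(a.shape, dtype=float)
          # Wrap shifts larger than half the tile size to negative offsets.
          peak[peak > dims / 2] -= dims[peak > dims / 2]
          return peak

      # Synthetic test: a random "specimen" and a second tile offset by (12, -7).
      rng = np.random.default_rng(3)
      scene = rng.random((512, 512))
      tile_a = scene[100:356, 100:356]
      tile_b = scene[112:368, 93:349]
      print(phase_correlation_shift(tile_a, tile_b))   # expected approx. [12., -7.]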

  18. radR: an open-source platform for acquiring and analysing data on biological targets observed by surveillance radar.

    PubMed

    Taylor, Philip D; Brzustowski, John M; Matkovich, Carolyn; Peckford, Michael L; Wilson, Dave

    2010-10-26

    Radar has been used for decades to study movement of insects, birds and bats. In spite of this, there are few readily available software tools for the acquisition, storage and processing of such data. Program radR was developed to solve this problem. Program radR is an open source software tool for the acquisition, storage and analysis of data from marine radars operating in surveillance mode. radR takes time series data with a two-dimensional spatial component as input from some source (typically a radar digitizing card) and extracts and retains information of biological relevance (i.e. moving targets). Low-level data processing is implemented in "C" code, but user-defined functions written in the "R" statistical programming language can be called at pre-defined steps in the calculations. Output data formats are designed to allow for future inclusion of additional data items without requiring change to C code. Two brands of radar digitizing card are currently supported as data sources. We also provide an overview of the basic considerations of setting up and running a biological radar study. Program radR provides a convenient, open source platform for the acquisition and analysis of radar data of biological targets.
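
    The generic moving-target extraction idea can be sketched as follows (this is not radR's C/R implementation): a per-pixel background is estimated over a stack of radar-like frames, and pixels rising well above it are kept as candidate blips; all data here are synthetic:

      # Minimal sketch (assumed, not radR itself): robust background estimation and
      # thresholding over a stack of radar-like frames to retain moving targets.
      import numpy as np

      rng = np.random.default_rng(7)
      frames = rng.gamma(shape=2.0, scale=1.0, size=(50, 128, 128))   # clutter-like background
      for k in range(50):                       # a point target moving one pixel per frame
          frames[k, 30 + k, 30 + k] += 25.0

      background = np.median(frames, axis=0)    # robust per-pixel clutter level
      # Robust noise scale (median absolute deviation), insensitive to rare target hits.
      noise = 1.4826 * np.median(np.abs(frames - background))
      threshold = background + 8.0 * noise

      for k in (0, 10, 20):
          hits = np.argwhere(frames[k] > threshold)
          print(f"frame {k}: {len(hits)} candidate blip(s) {hits.tolist()[:3]} "
                f"(true target at [{30 + k}, {30 + k}])")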

  19. Signal processing and general purpose data acquisition system for on-line tomographic measurements

    NASA Astrophysics Data System (ADS)

    Murari, A.; Martin, P.; Hemming, O.; Manduchi, G.; Marrelli, L.; Taliercio, C.; Hoffmann, A.

    1997-01-01

    New analog signal conditioning electronics and data acquisition systems have been developed for the soft x-ray and bolometric tomography diagnostics in the reversed field pinch experiment (RFX). For the soft x-ray detectors the analog signal processing includes a fully differential current-to-voltage conversion with up to a 200 kHz bandwidth. For the bolometers, a 50 kHz carrier-frequency amplifier allows a maximum bandwidth of 10 kHz. In both cases the analog signals are digitized with a 1 MHz sampling rate close to the diagnostic and are transmitted via a transparent asynchronous xmitter/receiver interface (TAXI) link to purpose-built Versa Module Europa (VME) modules which perform data acquisition. A software library has been developed for data preprocessing and tomographic reconstruction. It is written in C and is self-contained, i.e., no additional mathematical library is required. The package is therefore platform-independent: in particular, it can perform online analysis in a real-time application, such as continuous display and feedback, and is portable to long-duration fusion or other physics experiments. Due to the modular organization of the library, new preprocessing and analysis modules can easily be integrated into the environment. This software is implemented in RFX on three different platforms: OpenVMS, Digital Unix, and a VME 68040 CPU.

  20. radR: an open-source platform for acquiring and analysing data on biological targets observed by surveillance radar

    PubMed Central

    2010-01-01

    Background Radar has been used for decades to study movement of insects, birds and bats. In spite of this, there are few readily available software tools for the acquisition, storage and processing of such data. Program radR was developed to solve this problem. Results Program radR is an open source software tool for the acquisition, storage and analysis of data from marine radars operating in surveillance mode. radR takes time series data with a two-dimensional spatial component as input from some source (typically a radar digitizing card) and extracts and retains information of biological relevance (i.e. moving targets). Low-level data processing is implemented in "C" code, but user-defined functions written in the "R" statistical programming language can be called at pre-defined steps in the calculations. Output data formats are designed to allow for future inclusion of additional data items without requiring change to C code. Two brands of radar digitizing card are currently supported as data sources. We also provide an overview of the basic considerations of setting up and running a biological radar study. Conclusions Program radR provides a convenient, open source platform for the acquisition and analysis of radar data of biological targets. PMID:20977735

  1. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  2. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  3. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  4. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  5. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  6. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACT...

  7. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  8. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  9. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  10. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  11. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  12. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  13. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACT...

  14. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  15. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  16. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  17. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  18. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  19. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  20. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  1. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal... not to exceed the simplified acquisition threshold. The short selection process described in FAR 36.602-5 is authorized for use for contracts not expected to exceed the simplified acquisition threshold...

  2. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACT...

  3. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  4. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal... to exceed the simplified acquisition threshold. The HCA may include either or both procedures in FAR...

  5. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5... for contracts not to exceed the simplified acquisition threshold. (a) In contracts not expected to exceed the simplified acquisition threshold, either or both of the short selection processes set out at...

  6. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  7. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  8. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  9. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  10. Autonomous Metabolomics for Rapid Metabolite Identification in Global Profiling

    DOE PAGES

    Benton, H. Paul; Ivanisevic, Julijana; Mahieu, Nathaniel G.; ...

    2014-12-12

    An autonomous metabolomic workflow combining mass spectrometry analysis with tandem mass spectrometry data acquisition was designed to allow for simultaneous data processing and metabolite characterization. Although previously tandem mass spectrometry data have been generated on the fly, the experiments described herein combine this technology with the bioinformatic resources of XCMS and METLIN. As a result of this unique integration, we can analyze large profiling datasets and simultaneously obtain structural identifications. Furthermore, validation of the workflow on bacterial samples allowed the profiling of on the order of a thousand metabolite features with simultaneous tandem mass spectra data acquisition. The tandem mass spectrometry data acquisition enabled automatic search and matching against the METLIN tandem mass spectrometry database, shortening the current workflow from days to hours. Overall, the autonomous approach to untargeted metabolomics provides an efficient means of metabolomic profiling, and will ultimately allow the more rapid integration of comparative analyses, metabolite identification, and data analysis at a systems biology level.
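
    The search-and-match step can be illustrated with a hypothetical sketch that scores an acquired tandem mass spectrum against library entries using a simple cosine similarity on binned fragment intensities; this is not the XCMS/METLIN implementation, and the spectra are invented:

      # Hypothetical sketch of spectral library matching via cosine similarity on
      # binned fragment intensities (illustration only, not XCMS/METLIN code).
      import numpy as np

      def binned(spectrum, mz_max=500.0, bin_width=0.5):
          """Convert a list of (m/z, intensity) peaks into a fixed-length unit vector."""
          vec = np.zeros(int(mz_max / bin_width) + 1)
          for mz, inten in spectrum:
              vec[int(round(mz / bin_width))] += inten
          return vec / (np.linalg.norm(vec) or 1.0)

      def cosine_score(query, reference):
          return float(np.dot(binned(query), binned(reference)))

      query = [(84.1, 30), (156.1, 100), (301.2, 45)]          # invented MS/MS peaks
      library = {
          "metabolite A": [(84.1, 28), (156.1, 100), (301.2, 50)],
          "metabolite B": [(91.0, 60), (119.0, 100), (193.1, 20)],
      }
      for name, ref in library.items():
          print(f"{name}: cosine score = {cosine_score(query, ref):.3f}")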

  11. Theory and applications of structured light single pixel imaging

    NASA Astrophysics Data System (ADS)

    Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.

    2018-02-01

    Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, they share unifying features that make a general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically motivated reconstruction algorithms; however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework for single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, and provide a foundation for developing more advanced methods of image acquisition and reconstruction. The proposed frame-theoretic framework for single-pixel imaging results in improved noise robustness and a decrease in acquisition time, and can take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single-element detector can be developed to realize the full potential associated with single-pixel imaging.
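
    A minimal sketch of the underlying measurement model may help: the detector records inner products of the scene with structured patterns, and with a complete orthogonal (Hadamard) pattern set the reconstruction reduces to applying the dual frame, here simply the pseudoinverse of the pattern matrix; the scene and noise level below are assumptions:

      # Minimal sketch (assumed) of the single-pixel measurement model: each
      # measurement is the detector reading y_i = <pattern_i, image>; with a full
      # orthogonal (Hadamard) pattern set, recovery uses the dual frame, which for
      # this set equals the scaled transpose (the least-squares solution).
      import numpy as np

      def hadamard(n):
          H = np.array([[1.0]])
          while H.shape[0] < n:
              H = np.kron(H, np.array([[1, 1], [1, -1]], dtype=float))
          return H

      side = 8
      N = side * side
      x = np.zeros((side, side))
      x[2:6, 3:5] = 1.0                       # a small bright rectangle as the "scene"
      x = x.ravel()

      Phi = hadamard(N)                       # one illumination pattern per row
      y = Phi @ x                             # simulated single-pixel detector readings
      y += np.random.default_rng(0).normal(0, 0.01, size=N)   # detector noise

      # Reconstruction with the canonical dual frame; for this orthogonal set the
      # pseudoinverse equals Phi.T / N, so recovery is a single matrix multiply.
      x_hat = np.linalg.pinv(Phi) @ y
      print("max reconstruction error:", np.abs(x_hat - x).max())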

  12. Automated benthic counting of living and non-living components in Ngedarrak Reef, Palau via subsurface underwater video.

    PubMed

    Marcos, Ma Shiela Angeli; David, Laura; Peñaflor, Eileen; Ticzon, Victor; Soriano, Maricor

    2008-10-01

    We introduce an automated benthic counting system for rapid reef assessment that applies computer vision to subsurface underwater reef video. Video acquisition was executed by lowering a submersible bullet-type camera from a motor boat while moving across the reef area. A GPS and echo sounder were linked to the video recorder to record bathymetry and location points. Analysis of living and non-living components was implemented through image color and texture feature extraction from the reef video frames and classification via Linear Discriminant Analysis. Compared to common rapid reef assessment protocols, our system can perform fine-scale data acquisition and processing in one day. Reef video was acquired in Ngedarrak Reef, Koror, Republic of Palau. Overall classification success ranges from 60% to 77% for depths of 1 to 3 m. The development of an automated rapid reef classification system is most promising for reef studies that need fast and frequent acquisition of percent-cover data for living and non-living components.
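
    The classification stage can be sketched as follows (assumed, not the authors' system): per-block colour/texture feature vectors labelled living or non-living are classified with Linear Discriminant Analysis using scikit-learn, with synthetic features standing in for those extracted from video frames:

      # Minimal sketch (assumed): LDA classification of colour/texture feature
      # vectors into living vs. non-living benthic cover.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(5)
      # Synthetic stand-ins for (mean hue, mean saturation, texture energy, contrast).
      living = rng.normal([0.35, 0.60, 0.80, 0.40], 0.10, size=(300, 4))
      nonliving = rng.normal([0.10, 0.30, 0.30, 0.70], 0.12, size=(300, 4))
      X = np.vstack([living, nonliving])
      y = np.r_[np.ones(300), np.zeros(300)]

      lda = LinearDiscriminantAnalysis()
      scores = cross_val_score(lda, X, y, cv=5)
      print(f"5-fold accuracy: {scores.mean():.2%} +/- {scores.std():.2%}")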

  13. Development of automation software for neutron activation analysis process in Malaysian nuclear agency

    NASA Astrophysics Data System (ADS)

    Yussup, N.; Rahman, N. A. A.; Ibrahim, M. M.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.

    2017-01-01

    The Neutron Activation Analysis (NAA) process has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, especially from sample registration to sample analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient. Hence, software was developed to support automation of the system and to provide an effective method that replaces redundant manual data entries and yields faster sample analysis and calculation. This paper describes the design and development of automation software for the NAA process, which consists of three sub-programs: sample registration; hardware control and data acquisition; and sample analysis. The data flow and connections between the sub-programs are explained. The software is developed using the National Instruments LabVIEW development package.

  14. Computer image analysis in obtaining characteristics of images: greenhouse tomatoes in the process of generating learning sets of artificial neural networks

    NASA Astrophysics Data System (ADS)

    Zaborowicz, M.; Przybył, J.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.; Lewicki, A.; Przybył, K.

    2014-04-01

    The aim of the project was to develop software that extracts the characteristics of a greenhouse tomato from its image. Data gathered during image analysis and processing were used to build learning sets for artificial neural networks. The program processes pictures in JPEG format, acquires statistical information about each picture, and exports it to an external file. The software is intended to batch-analyze the collected research material, with the obtained information saved as a CSV file. The program analyzes 33 independent parameters to describe the tested image. The application is dedicated to the processing and image analysis of greenhouse tomatoes, but it can also be used to analyze other fruits and vegetables of spherical shape.
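
    A hypothetical sketch of the batch feature-extraction and CSV export step is shown below; real use would load JPEG images (for example with Pillow), whereas here synthetic arrays stand in so the example is self-contained:

      # Hypothetical sketch (not the authors' software): compute simple per-image
      # colour statistics and append them to a CSV file.
      import csv
      import numpy as np

      def image_features(rgb):
          """Simple per-channel statistics of an HxWx3 uint8 image."""
          feats = {}
          for i, ch in enumerate("RGB"):
              plane = rgb[..., i].astype(float)
              feats[f"mean_{ch}"] = plane.mean()
              feats[f"std_{ch}"] = plane.std()
              feats[f"min_{ch}"] = plane.min()
              feats[f"max_{ch}"] = plane.max()
          return feats

      rng = np.random.default_rng(11)
      images = {f"tomato_{k:03d}.jpg": rng.integers(0, 256, (480, 640, 3), dtype=np.uint8)
                for k in range(3)}            # stand-ins for loaded JPEG frames

      with open("tomato_features.csv", "w", newline="") as fh:
          writer = None
          for name, rgb in images.items():
              row = {"file": name, **image_features(rgb)}
              if writer is None:
                  writer = csv.DictWriter(fh, fieldnames=row.keys())
                  writer.writeheader()
              writer.writerow(row)
      print("wrote tomato_features.csv")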

  15. Real-time oil-saturation monitoring in rock cores with low-field NMR.

    PubMed

    Mitchell, J; Howe, A M; Clarke, A

    2015-07-01

    Nuclear magnetic resonance (NMR) provides a powerful suite of tools for studying oil in reservoir core plugs at the laboratory scale. Low-field magnets are preferred for well-log calibration and to minimize magnetic-susceptibility-induced internal gradients in the porous medium. We demonstrate that careful data processing, combined with prior knowledge of the sample properties, enables real-time acquisition and interpretation of saturation state (relative amount of oil and water in the pores of a rock). Robust discrimination of oil and brine is achieved with diffusion weighting. We use this real-time analysis to monitor the forced displacement of oil from porous materials (sintered glass beads and sandstones) and to generate capillary desaturation curves. The real-time output enables in situ modification of the flood protocol and accurate control of the saturation state prior to the acquisition of standard NMR core analysis data, such as diffusion-relaxation correlations. Although applications to oil recovery and core analysis are demonstrated, the implementation highlights the general practicality of low-field NMR as an inline sensor for real-time industrial process control. Copyright © 2015 Elsevier Inc. All rights reserved.
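
    The diffusion-weighted discrimination of oil and brine can be illustrated with a hypothetical two-component fit: with known water and oil diffusion coefficients, the amplitudes of a bi-exponential decay give the saturation directly; the values below are invented, and this is not the authors' real-time processing chain:

      # Hypothetical sketch: linear least-squares fit of a two-component
      # diffusion-weighted decay to estimate oil saturation.
      import numpy as np

      D_water, D_oil = 2.3e-9, 0.1e-9           # m^2/s, assumed known
      b = np.linspace(0, 2.5e9, 12)             # diffusion weighting values (s/m^2)

      # Synthetic acquisition for a true oil saturation of 0.35.
      S_oil_true = 0.35
      signal = (1 - S_oil_true) * np.exp(-b * D_water) + S_oil_true * np.exp(-b * D_oil)
      signal += np.random.default_rng(2).normal(0, 0.005, size=b.size)

      # Linear model: signal = A_w * exp(-b D_w) + A_o * exp(-b D_o)
      M = np.column_stack([np.exp(-b * D_water), np.exp(-b * D_oil)])
      (A_w, A_o), *_ = np.linalg.lstsq(M, signal, rcond=None)
      print(f"estimated oil saturation: {A_o / (A_w + A_o):.3f} (true {S_oil_true})")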

  16. System integration of marketable subsystems. [for residential solar heating and cooling

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Progress is reported in the following areas: systems integration of marketable subsystems; development, design, and building of site data acquisition subsystems; development and operation of the central data processing system; operation of the MSFC Solar Test Facility; and systems analysis.

  17. The key to using a learning or skill acquisition plan.

    PubMed

    Nicholls, Delwyn; Sweet, Linda; Westerway, Sue Campbell; Gibbins, Annie

    2014-11-01

    A learning plan is a tool to guide the development of the knowledge, skills and professional attitudes required for practice. A learning plan is an ideal tool for both supervisors and mentors to guide the process of teaching and learning a medical ultrasound examination. A good learning plan will state the learning goal, identify the learning activities and resources needed to achieve this goal, and highlight the outcome measures which, when achieved, indicate the goal has been accomplished. A skill acquisition plan provides a framework for task acquisition and skill stratification, and is an extension of the application of the student learning plan. One unique feature of a skill acquisition plan is that it requires the tutor to first undertake a task analysis. The task steps are then progressively learnt in sequence, an approach termed scaffolding. The skills to develop and use a learning or skill acquisition plan must themselves be learnt, and they are an integral component of the ultrasound tutor's skill set. This paper provides an outline of how to use and apply a learning and skill acquisition plan, and reviews how these tools can be personalised to each student and skill-teaching environment.

  18. Selecting Senior Acquisition Officials: Assessing the Current Processes and Practices for Recruiting, Confirming, and Retaining Senior Officials in the Acquisition Workforce

    DTIC Science & Technology

    2016-04-21

    Selecting Senior Acquisition Officials Assessing the Current Processes and Practices for Recruiting, Confirming, and Retaining Senior Officials...Task Group 2 Terms of Reference (TOR)  Selection of Senior Officials in the Acquisition Workforce – Consider ethics rules, congressional committee... Senior Acquisition positions – Re-validate the conflicts of interest and risk mitigation rules “[T]he committee directs the Chair of the Defense Business

  19. Considerations for Using Agile in DoD Acquisition

    DTIC Science & Technology

    2010-04-01

    successfully used in manufacturing throughout the world for decades, such as "just-in-time," Lean, Kanban, and work-flow-based planning. Another new...of this analysis is provided in Table 2. 29 Kanban / lean style of Agile might be the most relevant for this phase. ...family of approaches, including Kanban [14], Rational Unified Process (RUP), Personal Software Process (PSP), Team Software Process (TSP), and Cleanroom

  20. Department of Defense Costing References Web. Phase 1. Establishing the Foundation.

    DTIC Science & Technology

    1997-03-01

    a functional economic analysis under one set of constraints and having to repeat the entire process for the MAISRC. Recommendations for automated...MAISRC's acquisition oversight process. The cost and cycle time for each iteration can be on the order of $300,000 and 6 months, respectively...Institute resources were expected to become available at the conclusion of another BPR project. The contents list for the first Business Process

  1. Air Traffic Control: Immature Software Acquisition Processes Increase FAA System Acquisition Risks

    DOT National Transportation Integrated Search

    1997-03-01

    The General Accounting Office (GAO) at the request of Congress reviewed (1) : the maturity of Federal Aviation Administration's (FAA's) Air Traffic Control : (ATC) modernization software acquisition processes, and (2) the steps/actions : FAA has unde...

  2. United States Navy Contracting Officer Warranting Process

    DTIC Science & Technology

    2011-03-01

    by 30% or more of the respondents: Contract Law, Cost Analysis, Market Research, Contract Source Selection, Simplified Acquisition Procedures, and...that the majority of AOs found the following course at least somewhat important: Contract Law, Cost Analysis, Market Research, Contract Source...the budget and appropriation cycle 4. Ethics and conduct standards 5. Basic contract laws and regulations 6. Socio-economic requirements in

  3. Severe storms and local weather research

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Developments in the use of space related techniques to understand storms and local weather are summarized. The observation of lightning, storm development, cloud development, mesoscale phenomena, and ageostrophic circulation are discussed. Data acquisition, analysis, and the development of improved sensor and computer systems capability are described. Signal processing and analysis and application of Doppler lidar data are discussed. Progress in numerous experiments is summarized.

  4. Near ground level sensing for spatial analysis of vegetation

    NASA Technical Reports Server (NTRS)

    Sauer, Tom; Rasure, John; Gage, Charlie

    1991-01-01

    Measured changes in vegetation indicate the dynamics of ecological processes and can identify the impacts from disturbances. Traditional methods of vegetation analysis tend to be slow because they are labor intensive; as a result, these methods are often confined to small local area measurements. Scientists need new algorithms and instruments that will allow them to efficiently study environmental dynamics across a range of different spatial scales. A new methodology that addresses this problem is presented. This methodology includes the acquisition, processing, and presentation of near ground level image data and its corresponding spatial characteristics. The systematic approach taken encompasses a feature extraction process, a supervised and unsupervised classification process, and a region labeling process yielding spatial information.

  5. Heterogeneous Optimization Framework: Reproducible Preprocessing of Multi-Spectral Clinical MRI for Neuro-Oncology Imaging Research.

    PubMed

    Milchenko, Mikhail; Snyder, Abraham Z; LaMontagne, Pamela; Shimony, Joshua S; Benzinger, Tammie L; Fouke, Sarah Jost; Marcus, Daniel S

    2016-07-01

    Neuroimaging research often relies on clinically acquired magnetic resonance imaging (MRI) datasets that can originate from multiple institutions. Such datasets are characterized by high heterogeneity of modalities and variability of sequence parameters. This heterogeneity complicates the automation of image processing tasks such as spatial co-registration and physiological or functional image analysis. Given this heterogeneity, conventional processing workflows developed for research purposes are not optimal for clinical data. In this work, we describe an approach called Heterogeneous Optimization Framework (HOF) for developing image analysis pipelines that can handle the high degree of clinical data non-uniformity. HOF provides a set of guidelines for configuration, algorithm development, deployment, interpretation of results and quality control for such pipelines. At each step, we illustrate the HOF approach using the implementation of an automated pipeline for Multimodal Glioma Analysis (MGA) as an example. The MGA pipeline computes tissue diffusion characteristics of diffusion tensor imaging (DTI) acquisitions, hemodynamic characteristics using a perfusion model of susceptibility contrast (DSC) MRI, and spatial cross-modal co-registration of available anatomical, physiological and derived patient images. Developing MGA within HOF enabled the processing of neuro-oncology MR imaging studies to be fully automated. MGA has been successfully used to analyze over 160 clinical tumor studies to date within several research projects. Introduction of the MGA pipeline improved image processing throughput and, most importantly, effectively produced co-registered datasets that were suitable for advanced analysis despite high heterogeneity in acquisition protocols.

  6. Is Children's Acquisition of the Passive a Staged Process? Evidence from Six- and Nine-Year-Olds' Production of Passives

    ERIC Educational Resources Information Center

    Messenger, Katherine; Branigan, Holly P.; McLean, Janet F.

    2012-01-01

    We report a syntactic priming experiment that examined whether children's acquisition of the passive is a staged process, with acquisition of constituent structure preceding acquisition of thematic role mappings. Six-year-olds and nine-year-olds described transitive actions after hearing active and passive prime descriptions involving the same or…

  7. Development and Flight Testing of an Adaptive Vehicle Health-Monitoring Architecture

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Coffey, Neil C.; Gonzalez, Guillermo A.; Taylor, B. Douglas; Brett, Rube R.; Woodman, Keith L.; Weathered, Brenton W.; Rollins, Courtney H.

    2002-01-01

    Ongoing development and testing of an adaptable vehicle health-monitoring architecture is presented. The architecture is being developed for a fleet of vehicles. It has three operational levels: one or more remote data acquisition units located throughout the vehicle; a command and control unit located within the vehicle; and a terminal collection unit to collect analysis results from all vehicles. Each level is capable of performing autonomous analysis with a trained expert system. The expert system is parameterized, which makes it adaptable to be trained to both a user's subjective reasoning and existing quantitative analytic tools. Communication between all levels is done with wireless radio frequency interfaces. The remote data acquisition unit has an eight-channel programmable digital interface that gives the user discretion in choosing the type of sensors, the number of sensors, the sensor sampling rate, and the sampling duration for each sensor. The architecture provides the framework for a tributary analysis. All measurements at the lowest operational level are reduced to provide analysis results necessary to gauge changes from established baselines. These are then collected at the next level to identify any global trends or common features from the prior level. This process is repeated until the results are reduced at the highest operational level. In this framework, only analysis results are forwarded to the next level, to reduce telemetry congestion. The system's remote data acquisition hardware and non-analysis software have been flight tested on the main landing gear of NASA Langley's B757. The flight tests were performed to validate the following: the wireless radio frequency communication capabilities of the system; the hardware design; command and control; software operation; and data acquisition, storage, and retrieval.

  8. 48 CFR 242.1203 - Processing agreements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Processing agreements. 242.1203 Section 242.1203 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT CONTRACT ADMINISTRATION AND AUDIT SERVICES Novation and Change-of...

  9. 25 CFR 700.115 - Preliminary acquisition notice.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Acquisition and Disposal of Habitations and/or Improvements § 700.115 Preliminary acquisition notice. As soon as feasible in the acquisition process, the Commission shall issue a preliminary acquisition notice.../her habitations and/or improvements. (b) Explain that such preliminary acquisition notice is not a...

  10. System safety management lessons learned from the US Army acquisition process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piatt, J.A.

    1989-05-01

    The Assistant Secretary of the Army for Research, Development and Acquisition directed the Army Safety Center to provide an audit of the causes of accidents and safety of use restrictions on recently fielded systems by tracking residual hazards back through the acquisition process. The objective was to develop "lessons learned" that could be applied to the acquisition process to minimize mishaps in fielded systems. System safety management lessons learned are defined as Army practices or policies, derived from past successes and failures, that are expected to be effective in eliminating or reducing specific systemic causes of residual hazards. They are broadly applicable and supportive of the Army structure and acquisition objectives. Pacific Northwest Laboratory (PNL) was given the task of conducting an independent, objective appraisal of the Army's system safety program in the context of the Army materiel acquisition process by focusing on four fielded systems which are products of that process. These systems included the Apache helicopter, the Bradley Fighting Vehicle (BFV), the Tube Launched, Optically Tracked, Wire Guided (TOW) Missile and the High Mobility Multipurpose Wheeled Vehicle (HMMWV). The objective of this study was to develop system safety management lessons learned associated with the acquisition process. The first step was to identify residual hazards associated with the selected systems. Since it was impossible to track all residual hazards through the acquisition process, certain well-known, high visibility hazards were selected for detailed tracking. These residual hazards illustrate a variety of systemic problems. Systemic or process causes were identified for each residual hazard and analyzed to determine why they exist. System safety management lessons learned were developed to address related systemic causal factors. 29 refs., 5 figs.

  11. Professional Learning Networks Designed for Teacher Learning

    ERIC Educational Resources Information Center

    Trust, Torrey

    2012-01-01

    In the information age, students must learn to navigate and evaluate an expanding network of information. Highly effective teachers model this process of information analysis and knowledge acquisition by continually learning through collaboration, professional development, and studying pedagogical techniques and best practices. Many teachers have…

  12. Excel2Genie: A Microsoft Excel application to improve the flexibility of the Genie-2000 Spectroscopic software.

    PubMed

    Forgács, Attila; Balkay, László; Trón, Lajos; Raics, Péter

    2014-12-01

    Excel2Genie, a simple and user-friendly Microsoft Excel interface, has been developed for the Genie-2000 Spectroscopic Software of Canberra Industries. This Excel application can directly control a Canberra Multichannel Analyzer (MCA), process the acquired data and visualize them. Combining Genie-2000 with Excel2Genie results in remarkably increased flexibility: repetitive data acquisitions can be carried out, even with changing parameters, together with more sophisticated analysis. The software package comprises three worksheets that display the parameters and results of data acquisition, data analysis and mathematical operations carried out on the measured gamma spectra, and that at the same time allow control of these processes. Excel2Genie is freely available to assist gamma-spectrum measurements and data evaluation by interested Canberra users. With access to the Visual Basic for Applications (VBA) source code of this application, users can modify the interface according to their needs. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Biosynthesis of a broad-spectrum nicotianamine-like metallophore in Staphylococcus aureus.

    PubMed

    Ghssein, Ghassan; Brutesco, Catherine; Ouerdane, Laurent; Fojcik, Clémentine; Izaute, Amélie; Wang, Shuanglong; Hajjar, Christine; Lobinski, Ryszard; Lemaire, David; Richaud, Pierre; Voulhoux, Romé; Espaillat, Akbar; Cava, Felipe; Pignol, David; Borezée-Durant, Elise; Arnoux, Pascal

    2016-05-27

    Metal acquisition is a vital microbial process in metal-scarce environments, such as inside a host. Using metabolomic exploration, targeted mutagenesis, and biochemical analysis, we discovered an operon in Staphylococcus aureus that encodes the different functions required for the biosynthesis and trafficking of a broad-spectrum metallophore related to plant nicotianamine (here called staphylopine). The biosynthesis of staphylopine reveals the association of three enzyme activities: a histidine racemase, an enzyme distantly related to nicotianamine synthase, and a staphylopine dehydrogenase belonging to the DUF2338 family. Staphylopine is involved in nickel, cobalt, zinc, copper, and iron acquisition, depending on the growth conditions. This biosynthetic pathway is conserved across other pathogens, thus underscoring the importance of this metal acquisition strategy in infection. Copyright © 2016, American Association for the Advancement of Science.

  14. Political Efficacy and Expected Political Participation among Lower and Upper Secondary Students. A Comparative Analysis with Data from the IEA Civic Education Study

    ERIC Educational Resources Information Center

    Schulz, Wolfram

    2005-01-01

    The process of political socialisation of adolescents includes more than the acquisition of knowledge about society, citizenship and the political system. In a democracy, citizens are expected to participate actively in the political process. Active participation, however, requires citizens to believe in their own ability to influence the course…

  15. Development of advanced image analysis techniques for the in situ characterization of multiphase dispersions occurring in bioreactors.

    PubMed

    Galindo, Enrique; Larralde-Corona, C Patricia; Brito, Teresa; Córdova-Aguilar, Ma Soledad; Taboada, Blanca; Vega-Alvarado, Leticia; Corkidi, Gabriel

    2005-03-30

    Fermentation bioprocesses typically involve two liquid phases (i.e. water and organic compounds) and one gas phase (air), together with suspended solids (i.e. biomass), which are the components to be dispersed. Characterization of multiphase dispersions is required because dispersion determines mass transfer efficiency and bioreactor homogeneity; it is also needed for the appropriate design of contacting equipment and helps in establishing optimum operational conditions. This work describes the development of image-analysis-based techniques, with advantages in terms of data acquisition and processing, for the characterization of oil drop and bubble diameters in complex simulated fermentation broths. The system consists of fully digital acquisition of in situ images obtained from inside a mixing tank using a CCD camera synchronized with a stroboscopic light source, which are processed with versatile commercial software. To improve the automation of particle recognition and counting, the Hough transform (HT) was used, so bubbles and oil drops were detected automatically and the processing time was reduced by 55% without losing accuracy with respect to a fully manual analysis. The system has been used for the detailed characterization of a number of operational conditions, including oil content, biomass morphology, presence of surfactants (such as proteins) and viscosity of the aqueous phase.
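
    The automated detection of roughly circular bubbles and oil drops described above is the classic use case for a circular Hough transform; a hedged sketch using OpenCV's HoughCircles on a single grayscale frame is shown below. The file name and all parameter values are placeholders that would need tuning to the actual imaging conditions.

        import cv2
        import numpy as np

        frame = cv2.imread("dispersion_frame.png", cv2.IMREAD_GRAYSCALE)  # placeholder file name
        blurred = cv2.medianBlur(frame, 5)  # suppress speckle before the Hough transform

        # Circular Hough transform: returns (x, y, radius) for each detected circle.
        circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=15,
                                   param1=100, param2=30, minRadius=3, maxRadius=60)

        if circles is not None:
            radii_px = circles[0, :, 2]
            print(f"{len(radii_px)} bubbles/drops detected, "
                  f"mean diameter {2 * np.mean(radii_px):.1f} px")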

  16. Toward a Model of Human Information Processing for Decision-Making and Skill Acquisition in Laparoscopic Colorectal Surgery.

    PubMed

    White, Eoin J; McMahon, Muireann; Walsh, Michael T; Coffey, J Calvin; O Sullivan, Leonard

    To create a human information-processing model for laparoscopic surgery based on already established literature and primary research to enhance laparoscopic surgical education in this context. We reviewed the literature for information-processing models most relevant to laparoscopic surgery. Our review highlighted the necessity for a model that accounts for dynamic environments, perception, allocation of attention resources between the actions of both hands of an operator, and skill acquisition and retention. The results of the literature review were augmented through intraoperative observations of 7 colorectal surgical procedures, supported by laparoscopic video analysis of 12 colorectal procedures. The Wickens human information-processing model was selected as the most relevant theoretical model to which we make adaptions for this specific application. We expanded the perception subsystem of the model to involve all aspects of perception during laparoscopic surgery. We extended the decision-making system to include dynamic decision-making to account for case/patient-specific and surgeon-specific deviations. The response subsystem now includes dual-task performance and nontechnical skills, such as intraoperative communication. The memory subsystem is expanded to include skill acquisition and retention. Surgical decision-making during laparoscopic surgery is the result of a highly complex series of processes influenced not only by the operator's knowledge, but also patient anatomy and interaction with the surgical team. Newer developments in simulation-based education must focus on the theoretically supported elements and events that underpin skill acquisition and affect the cognitive abilities of novice surgeons. The proposed human information-processing model builds on established literature regarding information processing, accounting for a dynamic environment of laparoscopic surgery. This revised model may be used as a foundation for a model describing robotic surgery. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  17. Aquarius's Instrument Science Data System (ISDS) Automated to Acquire, Process, Trend Data and Produce Radiometric System Assessment Reports

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Aquarius Radiometer, a subsystem of the Aquarius Instrument, required a data acquisition ground system to support calibration and radiometer performance assessment. To support calibration and compose performance assessments, we developed an automated system which uploads raw data to an FTP server and saves raw and processed data to a database. This paper details the overall functionality of the Aquarius Instrument Science Data System (ISDS) and the individual electrical ground support equipment (EGSE) which produced data files that were infused into the ISDS. Real-time EGSEs include an ICDS Simulator, a Calibration GSE, a LabVIEW-controlled power supply, and a chamber data acquisition system. The ICDS Simulator serves as the test conductor's primary workstation, collecting radiometer housekeeping (HK) and science data and passing commands and HK telemetry collection requests to the radiometer. The Calibration GSE (Radiometer Active Test Source) provides a choice of multiple source targets for external calibration of the radiometer. The Power Supply GSE, controlled by LabVIEW, provides real-time voltage and current monitoring of the radiometer. Finally, the chamber data acquisition system produces data reflecting chamber vacuum pressure, thermistor temperatures, AVG and watts. Each GSE system produces text-based data files every two to six minutes and automatically copies the data files to the central archiver PC. The archiver PC stores the data files, schedules automated uploads of these files to an external FTP server, and accepts requests to copy all data files to the ISDS for offline data processing and analysis. The Aquarius Radiometer ISDS contains PHP and MATLAB programs to parse, process and save all data to a MySQL database. Analysis tools (MATLAB programs) in the ISDS are capable of displaying radiometer science, telemetry and auxiliary data in near real time, as well as performing data analysis and producing automated performance assessment reports of the Aquarius Radiometer.
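
    As a rough illustration of the parse-process-save step (the actual ISDS uses PHP and MATLAB with a MySQL database), the sketch below ingests periodic text-based GSE files of the kind described above into a database table. The file layout, column names and use of SQLite as a stand-in for MySQL are assumptions made for the sketch.

        import csv
        import sqlite3
        from pathlib import Path

        def ingest_gse_files(data_dir, db_path="isds_sketch.db"):
            """Parse text-based GSE data files and save each row to a database table."""
            con = sqlite3.connect(db_path)
            con.execute("""CREATE TABLE IF NOT EXISTS housekeeping
                           (source TEXT, timestamp TEXT, channel TEXT, value REAL)""")
            for path in Path(data_dir).glob("*.txt"):
                with open(path, newline="") as fh:
                    for row in csv.DictReader(fh):      # assumed header: timestamp,channel,value
                        con.execute("INSERT INTO housekeeping VALUES (?, ?, ?, ?)",
                                    (path.stem, row["timestamp"], row["channel"],
                                     float(row["value"])))
            con.commit()
            con.close()

        # ingest_gse_files("archive_pc_uploads/")   # hypothetical directory of GSE text files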

  18. Monitoring and Acquisition Real-time System (MARS)

    NASA Technical Reports Server (NTRS)

    Holland, Corbin

    2013-01-01

    MARS is a graphical user interface (GUI) written in MATLAB and Java that allows the user to configure and control the Scalable Parallel Architecture for Real-Time Acquisition and Analysis (SPARTAA) data acquisition system. SPARTAA not only acquires data, but also allows complex algorithms to be applied to the acquired data in real time. The MARS client allows the user to set up and configure all settings regarding the data channels attached to the system, and gives complete control over starting and stopping data acquisition. It provides a unique "Test" programming environment, allowing the user to create tests consisting of a series of alarms, each of which contains any number of data channels. Each alarm is configured with a particular algorithm that determines the type of processing applied to each data channel and is tested against a defined threshold. Tests can be uploaded to SPARTAA, thereby teaching it how to process the data. The uniqueness of MARS is in its capability to adapt easily to many test configurations. MARS sends and receives messages via TCP/IP, which allows for quick integration into almost any test environment. The use of MATLAB and Java as the programming languages allows developers to integrate the software across multiple operating platforms.
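
    The test/alarm structure described above suggests a simple message format; the sketch below (in Python rather than the MATLAB/Java used by MARS) shows one plausible way a client could encode a test as JSON and push it to an acquisition server over TCP/IP. The host, port, field names and framing are invented for illustration and are not the actual MARS protocol.

        import json
        import socket

        test_config = {                       # hypothetical "Test" definition
            "name": "strain_check",
            "alarms": [
                {"channels": ["ch01", "ch02"], "algorithm": "rms", "threshold": 4.2},
                {"channels": ["ch07"], "algorithm": "peak", "threshold": 9.0},
            ],
        }

        def upload_test(config, host="127.0.0.1", port=5555):
            """Send a JSON-encoded test definition to the acquisition server."""
            payload = json.dumps(config).encode("utf-8")
            with socket.create_connection((host, port)) as sock:
                sock.sendall(len(payload).to_bytes(4, "big") + payload)  # length-prefixed frame

        # upload_test(test_config)   # requires a server listening on the given host/port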

  19. Continuous country-wide rainfall observation using a large network of commercial microwave links: Challenges, solutions and applications

    NASA Astrophysics Data System (ADS)

    Chwala, Christian; Boose, Yvonne; Smiatek, Gerhard; Kunstmann, Harald

    2017-04-01

    Commercial microwave link (CML) networks have proven to be a valuable source of rainfall information over recent years. However, until now, analysis of CML data has been limited to snapshots of data for historic periods because of limited data access. With the real-time availability of CML data in Germany (Chwala et al. 2016) this situation has improved significantly. We are continuously acquiring and processing data from 3000 CMLs in Germany in near real time with one-minute temporal resolution. Currently the data acquisition system is being extended to 10000 CMLs so that the whole of Germany is covered and a continuous country-wide rainfall product can be provided. In this contribution we elaborate on the challenges and solutions regarding data acquisition, data management and robust processing. We present the details of the data acquisition system that we run operationally on the network of the CML operator Ericsson Germany to solve the problem of limited data availability. Furthermore, we explain the implementation of our database and its web frontend for easy data access, and present our data processing algorithms. Finally, we showcase an application of our data in hydrological modeling and its potential use to improve radar QPE. Bibliography: Chwala, C., Keis, F., and Kunstmann, H.: Real-time data acquisition of commercial microwave link networks for hydrometeorological applications, Atmos. Meas. Tech., 9, 991-999, doi:10.5194/amt-9-991-2016, 2016

  20. 48 CFR 15.404 - Proposal analysis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Proposal analysis. 15.404 Section 15.404 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 15.404 Proposal analysis. ...

  1. 48 CFR 15.404 - Proposal analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Proposal analysis. 15.404 Section 15.404 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 15.404 Proposal analysis. ...

  2. The influence of biological and technical factors on quantitative analysis of amyloid PET: Points to consider and recommendations for controlling variability in longitudinal data.

    PubMed

    Schmidt, Mark E; Chiao, Ping; Klein, Gregory; Matthews, Dawn; Thurfjell, Lennart; Cole, Patricia E; Margolin, Richard; Landau, Susan; Foster, Norman L; Mason, N Scott; De Santi, Susan; Suhy, Joyce; Koeppe, Robert A; Jagust, William

    2015-09-01

    In vivo imaging of amyloid burden with positron emission tomography (PET) provides a means for studying the pathophysiology of Alzheimer's and related diseases. Measurement of subtle changes in amyloid burden requires quantitative analysis of image data. Reliable quantitative analysis of amyloid PET scans acquired at multiple sites and over time requires rigorous standardization of acquisition protocols, subject management, tracer administration, image quality control, and image processing and analysis methods. We review critical points in the acquisition and analysis of amyloid PET, identify ways in which technical factors can contribute to measurement variability, and suggest methods for mitigating these sources of noise. Improved quantitative accuracy could reduce the sample size necessary to detect intervention effects when amyloid PET is used as a treatment end point and allow more reliable interpretation of change in amyloid burden and its relationship to clinical course. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  3. 48 CFR 50.103-5 - Processing cases.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Processing cases. 50.103-5 Section 50.103-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACT MANAGEMENT... knowledge of individuals when documentary evidence is lacking, and audits if considered necessary to...

  4. Data Acquisition and Mass Storage

    NASA Astrophysics Data System (ADS)

    Vande Vyvre, P.

    2004-08-01

    The experiments performed at supercolliders will constitute a new challenge in several disciplines of High Energy Physics and Information Technology. This will definitely be the case for data acquisition and mass storage. The microelectronics, communication, and computing industries are maintaining an exponential increase of the performance of their products. The market of commodity products remains the largest and the most competitive market of technology products. This constitutes a strong incentive to use these commodity products extensively as components to build the data acquisition and computing infrastructures of the future generation of experiments. The present generation of experiments in Europe and in the US already constitutes an important step in this direction. The experience acquired in the design and the construction of the present experiments has to be complemented by a large R&D effort executed with good awareness of industry developments. The future experiments will also be expected to follow major trends of our present world: deliver physics results faster and become more and more visible and accessible. The present evolution of the technologies and the burgeoning of GRID projects indicate that these trends will be made possible. This paper includes a brief overview of the technologies currently used for the different tasks of the experimental data chain: data acquisition, selection, storage, processing, and analysis. The major trends of the computing and networking technologies are then indicated with particular attention paid to their influence on the future experiments. Finally, the vision of future data acquisition and processing systems and their promise for future supercolliders is presented.

  5. Image sequence analysis workstation for multipoint motion analysis

    NASA Astrophysics Data System (ADS)

    Mostafavi, Hassan

    1990-08-01

    This paper describes an application-specific engineering workstation designed and developed to analyze the motion of objects from video sequences. The system combines the software and hardware environment of a modern graphics-oriented workstation with digital image acquisition, processing and display techniques. In addition to automation and increased throughput of data reduction tasks, the objective of the system is to provide less invasive methods of measurement by offering the ability to track objects that are more complex than reflective markers. Grey-level image processing and spatial/temporal adaptation of the processing parameters are used for location and tracking of more complex features of objects under uncontrolled lighting and background conditions. The applications of such an automated and noninvasive measurement tool include analysis of the trajectory and attitude of rigid bodies such as human limbs, robots, aircraft in flight, etc. The system's key features are: 1) acquisition and storage of image sequences by digitizing and storing real-time video; 2) computer-controlled movie loop playback, freeze-frame display, and digital image enhancement; 3) multiple leading-edge tracking in addition to object centroids at up to 60 fields per second from either live input video or a stored image sequence; 4) model-based estimation and tracking of the six degrees of freedom of a rigid body; 5) field-of-view and spatial calibration; 6) image sequence and measurement database management; and 7) offline analysis software for trajectory plotting and statistical analysis.

  6. Laser communication experiment. Volume 1: Design study report: Spacecraft transceiver. Part 2: Appendices

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The application of a carbon dioxide laser for optical communication with ATS satellites is discussed. The following elements of the laser communication equipment are reported: (1) operational ground equipment, (2) data acquisition plan, and (3) data processing, reduction, and analysis plan.

  7. Teaching Psychology Students Computer Applications.

    ERIC Educational Resources Information Center

    Atnip, Gilbert W.

    This paper describes an undergraduate-level course designed to teach the applications of computers that are most relevant in the social sciences, especially psychology. After an introduction to the basic concepts and terminology of computing, separate units were devoted to word processing, data analysis, data acquisition, artificial intelligence,…

  8. Color enhancement of landsat agricultural imagery: JPL LACIE image processing support task

    NASA Technical Reports Server (NTRS)

    Madura, D. P.; Soha, J. M.; Green, W. B.; Wherry, D. B.; Lewis, S. D.

    1978-01-01

    Color enhancement techniques were applied to LACIE LANDSAT segments to determine whether such enhancement can assist analysts in crop identification. The procedure involved increasing the color range by removing correlation between components. First, a principal component transformation was performed, followed by contrast enhancement to equalize component variances, followed by an inverse transformation to restore familiar color relationships. Filtering was applied to lower-order components to reduce color speckle in the enhanced products. The use of single-acquisition and multiple-acquisition statistics to control the enhancement was compared, and the effects of normalization were investigated. Evaluation is left to LACIE personnel.
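
    The enhancement procedure described above (principal-component transformation, variance equalization, inverse transformation) is essentially a decorrelation stretch; a minimal NumPy sketch under that reading follows. The target standard deviation is an arbitrary illustrative choice.

        import numpy as np

        def decorrelation_stretch(img, target_sigma=50.0):
            """img: H x W x C array. Rotate to principal components, equalize variances, rotate back."""
            h, w, c = img.shape
            x = img.reshape(-1, c).astype(float)
            mean = x.mean(axis=0)
            cov = np.cov((x - mean).T)
            eigvals, eigvecs = np.linalg.eigh(cov)        # principal-component transformation
            pcs = (x - mean) @ eigvecs
            pcs *= target_sigma / np.sqrt(np.maximum(eigvals, 1e-12))   # equalize component variances
            stretched = pcs @ eigvecs.T + mean            # inverse transformation restores color relationships
            return stretched.reshape(h, w, c)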

  9. Integrating Data Sources for Process Sustainability ...

    EPA Pesticide Factsheets

    To perform a chemical process sustainability assessment requires significant data about chemicals, process design specifications, and operating conditions. The required information includes the identity of the chemicals used, the quantities of the chemicals within the context of the sustainability assessment, physical properties of these chemicals, equipment inventory, as well as health, environmental, and safety properties of the chemicals. Much of this data is currently available to the process engineer, either from the process design in chemical process simulation software or online through chemical property and environmental, health, and safety databases. Examples of these databases include the U.S. Environmental Protection Agency’s (USEPA’s) Aggregated Computational Toxicology Resource (ACToR), National Institute for Occupational Safety and Health’s (NIOSH’s) Hazardous Substance Database (HSDB), and National Institute of Standards and Technology’s (NIST’s) Chemistry Webbook. This presentation will provide methods and procedures for extracting chemical identity and flow information from process design tools (such as chemical process simulators) and chemical property information from the online databases. The presentation will also demonstrate acquisition and compilation of the data for use in the EPA’s GREENSCOPE process sustainability analysis tool. This presentation discusses acquisition of data for use in rapid LCI development.

  10. The Department of Defense Acquisition Workforce: Background, Analysis, and Questions for Congress

    DTIC Science & Technology

    2016-07-29

    The Department of Defense Acquisition Workforce: Background, Analysis, and Questions for Congress. Moshe Schwartz, Specialist in Defense Acquisition; Kathryn A. Francis, Analyst in Government Organization and Management; Charles V. O’Connor, U.S. Department of Defense Fellow. July 29, 2016. Congressional Research Service, 7-5700, www.crs.gov, R44578.

  11. An Analysis of Turkey’s Defense Systems Acquisition Policy

    DTIC Science & Technology

    2009-03-01

    Naval Postgraduate School, Monterey, California, MBA Professional Report (2009): An Analysis of Turkey’s Defense Systems Acquisition Policy. Abstract (truncated): The purpose of this MBA thesis is to analyze Turkey’s defense systems acquisition policy and

  12. DataForge: Modular platform for data storage and analysis

    NASA Astrophysics Data System (ADS)

    Nozik, Alexander

    2018-04-01

    DataForge is a framework for automated data acquisition, storage and analysis that builds on modern applied-programming practice. The aim of DataForge is to automate standard tasks such as parallel data processing, logging, output sorting and distributed computing. The framework also makes extensive use of declarative programming principles via a metadata concept, which allows a degree of meta-programming and improves the reproducibility of results.

  13. 76 FR 14568 - Federal Acquisition Regulation; Use of Commercial Services Item Authority

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-16

    ...-AL44 Federal Acquisition Regulation; Use of Commercial Services Item Authority AGENCIES: Department of... interim rule amending the Federal Acquisition Regulation (FAR) to implement section 868 of the Duncan... interim rule. II. Discussion/Analysis The analysis of public comments by the Defense Acquisition...

  14. Computer-Aided Process and Tools for Mobile Software Acquisition

    DTIC Science & Technology

    2013-04-01

    Computer-Aided Process and Tools for Mobile Software Acquisition. Christopher Bonine, Man-Tak Shing, and Thomas W. Otani, Naval Postgraduate School. Published April 1, 2013; approved for public release. Bonine is a lieutenant in the United States Navy, currently assigned to the Navy Cyber Defense

  15. New Tools and Methods for Assessing Risk-Management Strategies

    DTIC Science & Technology

    2004-03-01

    The study used Expected Value and Multi-Attribute Utility Theories to evaluate the risks and benefits of various acquisition alternatives, and allowed researchers to monitor the process subjects used to arrive at their decisions, which revealed distinct risk-management strategies. Subject terms: risk management, acquisition process, expected value theory, multi-attribute utility theory.

  16. Three Big Ideas for Reforming Acquisition: Evidence-Based Propositions for Transformation

    DTIC Science & Technology

    2015-04-30

    This paper offers three specific ideas for improving key aspects of defense acquisition: reforming the process for managing capabilities, addressing technology insertion, and... Policy and process changes need to be made for any significant change to be seen. The paper offers reform ideas in three specific areas: achieving the

  17. Impairment of probabilistic reward-based learning in schizophrenia.

    PubMed

    Weiler, Julia A; Bellebaum, Christian; Brüne, Martin; Juckel, Georg; Daum, Irene

    2009-09-01

    Recent models assume that some symptoms of schizophrenia originate from defective reward processing mechanisms. Understanding the precise nature of reward-based learning impairments might thus make an important contribution to the understanding of schizophrenia and the development of treatment strategies. The present study investigated several features of probabilistic reward-based stimulus association learning, namely the acquisition of initial contingencies, reversal learning, generalization abilities, and the effects of reward magnitude. Compared to healthy controls, individuals with schizophrenia exhibited attenuated overall performance during acquisition, whereas learning rates across blocks were similar to the rates of controls. On the group level, persons with schizophrenia were, however, unable to learn the reversal of the initial reward contingencies. Exploratory analysis of only the subgroup of individuals with schizophrenia who showed significant learning during acquisition yielded deficits in reversal learning with low reward magnitudes only. There was further evidence of a mild generalization impairment of the persons with schizophrenia in an acquired equivalence task. In summary, although there was evidence of intact basic processing of reward magnitudes, individuals with schizophrenia were impaired at using this feedback for the adaptive guidance of behavior.

  18. Potentials for the use of tool-integrated in-line data acquisition systems in press shops

    NASA Astrophysics Data System (ADS)

    Maier, S.; Schmerbeck, T.; Liebig, A.; Kautz, T.; Volk, W.

    2017-09-01

    Robust in-line data acquisition systems are required for the realization of process monitoring and control systems in press shops. A promising approach is the integration of sensors in the following press tools, where they can be easily integrated and maintained and where the necessary robustness for the harsh press environment is achieved. Such concepts have already been investigated for measuring geometrical accuracy as well as the material flow of inner part areas, and they enable the quality of each produced part to be monitored. An important success factor is practical approaches to using this new process information in press shops. This work presents various applications of these measuring concepts based on real car-body components of the BMW Group. For example, the procedure of retroactive error analysis is explained for a side frame. It is also shown how this data acquisition can be used for the optimization of drawing tools in tool shops. With the skid-line, there is a continuous value that can be monitored from planning to serial production.
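
    Monitoring a continuous per-part value such as the skid-line naturally maps onto a simple control-chart check; the sketch below flags parts whose measured value drifts outside limits derived from a reference batch. The data and limits are illustrative only and are not taken from the paper.

        import statistics

        def control_limits(reference_values, k=3.0):
            """Mean +/- k standard deviations from a reference production batch."""
            mu = statistics.mean(reference_values)
            sigma = statistics.stdev(reference_values)
            return mu - k * sigma, mu + k * sigma

        def check_part(value, limits):
            low, high = limits
            return low <= value <= high

        reference = [41.8, 42.1, 42.0, 41.9, 42.2, 42.0]   # skid-line positions (mm), illustrative
        limits = control_limits(reference)
        print(check_part(42.05, limits), check_part(44.0, limits))   # True False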

  19. Interferometric analysis of polishing surface with a petal tool

    NASA Astrophysics Data System (ADS)

    Salas-Sánchez, Alfonso; Leal-Cabrera, Irce; Percino Zacarias, Elizabeth; Granados-Agustín, Fermín S.

    2011-09-01

    In this work, we describe phase-shifting interferometric monitoring of a polishing process performed with a petal tool on a spherical surface in order to obtain a parabolic surface. In the process, we used a commercial polishing machine; the purpose of this work is to gain control over the polishing time. For this analysis we used a ZYGO Fizeau interferometer for optical shop testing and the Durango software from Diffraction International for data acquisition, simulation and evaluation of the optical surfaces. We started the polishing process with a spherical surface 15.46 cm in diameter, with a radius of curvature of 59.9 cm and f/# 1.9.

  20. Effects of Orientation and Anisometry of Magnetic Resonance Imaging Acquisitions on Diffusion Tensor Imaging and Structural Connectomes.

    PubMed

    Tudela, Raúl; Muñoz-Moreno, Emma; López-Gil, Xavier; Soria, Guadalupe

    2017-01-01

    Diffusion-weighted imaging (DWI) quantifies water molecule diffusion within tissues and is becoming an increasingly used technique. However, it is very challenging as correct quantification depends on many different factors, ranging from acquisition parameters to a long pipeline of image processing. In this work, we investigated the influence of voxel geometry on diffusion analysis, comparing different acquisition orientations as well as isometric and anisometric voxels. Diffusion-weighted images of one rat brain were acquired with four different voxel geometries (one isometric and three anisometric in different directions) and three different encoding orientations (coronal, axial and sagittal). Diffusion tensor scalar measurements, tractography and the brain structural connectome were analyzed for each of the 12 acquisitions. The acquisition direction with respect to the main magnetic field orientation affected the diffusion results. When the acquisition slice-encoding direction was not aligned with the main magnetic field, there were more artifacts and a lower signal-to-noise ratio that led to less anisotropic tensors (lower fractional anisotropy values), producing poorer-quality results. The use of anisometric voxels generated statistically significant differences in the values of diffusion metrics in specific regions. It also elicited differences in tract reconstruction and in different graph metric values describing the brain networks. Our results highlight the importance of taking into account the geometric aspects of acquisitions, especially when comparing diffusion data acquired using different geometries.

  1. Colony image acquisition and genetic segmentation algorithm and colony analyses

    NASA Astrophysics Data System (ADS)

    Wang, W. X.

    2012-01-01

    Colony analysis is used in a large number of fields such as food, dairy, beverages, hygiene, environmental monitoring, water, toxicology and sterility testing. In order to reduce labor and increase analysis accuracy, many researchers and developers have worked on image analysis systems. The main problems in these systems are image acquisition, image segmentation and image analysis. In this paper, to acquire colony images of good quality, an illumination box was constructed in which the distances between the lights and the dish, the camera lens and the lights, and the camera lens and the dish are adjusted optimally. Image segmentation is based on a genetic approach that allows the segmentation problem to be treated as a global optimization. After image pre-processing and image segmentation, the colony analyses are performed. The colony image analysis consists of (1) basic colony parameter measurements; (2) colony size analysis; (3) colony shape analysis; and (4) colony surface measurements. All of the above visual colony parameters can be selected and combined to form new engineering parameters. The colony analysis can be applied in different applications.
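
    As a toy illustration of treating segmentation as a global optimization with a genetic approach, the sketch below evolves a single gray-level threshold that maximizes Otsu-style between-class variance. The paper's actual chromosome encoding and fitness function are not given in the abstract, so this is only a stand-in.

        import random
        import numpy as np

        def between_class_variance(image, t):
            """Otsu-style fitness for a candidate threshold t on an 8-bit grayscale image."""
            fg, bg = image[image > t], image[image <= t]
            if fg.size == 0 or bg.size == 0:
                return 0.0
            w1, w2 = fg.size / image.size, bg.size / image.size
            return w1 * w2 * (fg.mean() - bg.mean()) ** 2

        def ga_threshold(image, pop_size=20, generations=40, mutation=8):
            pop = [random.randint(1, 254) for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=lambda t: between_class_variance(image, t), reverse=True)
                parents = pop[: pop_size // 2]                      # selection: keep the fittest half
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    child = (a + b) // 2                            # crossover: average of two parents
                    child += random.randint(-mutation, mutation)    # mutation: small perturbation
                    children.append(min(254, max(1, child)))
                pop = parents + children
            return max(pop, key=lambda t: between_class_variance(image, t))

        # image = np.asarray(...)   # 8-bit grayscale colony image (placeholder)
        # threshold = ga_threshold(image)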

  2. Implicit and Explicit Cognitive Processes in Incidental Vocabulary Acquisition

    ERIC Educational Resources Information Center

    Ender, Andrea

    2016-01-01

    Studies on vocabulary acquisition in second language learning have revealed that a large amount of vocabulary is learned without an overt intention, in other words, incidentally. This article investigates the relevance of different lexical processing strategies for vocabulary acquisition when reading a text for comprehension among 24 advanced…

  3. A framework for analysis of large database of old art paintings

    NASA Astrophysics Data System (ADS)

    Da Rugna, Jérôme; Chareyron, Gaël; Pillay, Ruven; Joly, Morwena

    2011-03-01

    For many years, many museums and national institutions have been organizing the high-definition digitization of their collections and, in consequence, generate massive data for each object. In this paper we focus only on art painting collections. Nevertheless, we face a very large database with heterogeneous data: the image collection includes old and recent scans of photographic negatives, digital photos, multi- and hyperspectral acquisitions, X-ray acquisitions, and also front, back and lateral photos. Moreover, art paintings suffer from many kinds of degradation: cracks, softening, artifacts, human damage and corruption over time. Considering this, it appears necessary to develop specific approaches and methods dedicated to digital art painting analysis. Consequently, this paper presents a complete framework to evaluate, compare and benchmark image processing algorithms.

  4. Resource Analysis of Cognitive Process Flow Used to Achieve Autonomy

    DTIC Science & Technology

    2016-03-01

    to be used as a decision-making aid to guide system designers and program managers not necessarily familiar with cognitive processing, or resource...implementing end-to-end cognitive processing flows multiplies and the impact of these design decisions on efficiency and effectiveness increases [1]. The...end-to-end cognitive systems and alternative computing technologies, then system design and acquisition personnel could make systematic analyses and

  5. MNE Scan: Software for real-time processing of electrophysiological data.

    PubMed

    Esch, Lorenz; Sun, Limin; Klüber, Viktor; Lew, Seok; Baumgarten, Daniel; Grant, P Ellen; Okada, Yoshio; Haueisen, Jens; Hämäläinen, Matti S; Dinh, Christoph

    2018-06-01

    Magnetoencephalography (MEG) and Electroencephalography (EEG) are noninvasive techniques to study the electrophysiological activity of the human brain. Thus, they are well suited for real-time monitoring and analysis of neuronal activity. Real-time MEG/EEG data processing allows adjustment of the stimuli to the subject's responses for optimizing the acquired information, especially by providing dynamically changing displays to enable neurofeedback. We introduce MNE Scan, an acquisition and real-time analysis software based on the multipurpose software library MNE-CPP. MNE Scan allows the development and application of acquisition and novel real-time processing methods in both research and clinical studies. The MNE Scan development follows a strict software engineering process to enable the approvals required for clinical software. We tested the performance of MNE Scan in several device-independent use cases, including a clinical epilepsy study, real-time source estimation, and a Brain Computer Interface (BCI) application. Compared to existing tools, we propose modular software that considers the clinical software requirements expected by certification authorities. At the same time the software is extendable and freely accessible. We conclude that MNE Scan is the first step in creating device-independent open-source software to facilitate the transition from basic neuroscience research to both applied sciences and clinical applications. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Memory Forensics: Review of Acquisition and Analysis Techniques

    DTIC Science & Technology

    2013-11-01

    Processes running on modern multitasking operating systems operate on an abstraction of RAM, called virtual memory [7]. In these systems...information such as user names, email addresses and passwords [7]. Analysts also use tools such as WinHex to identify headers or other suspicious data within

  7. An Experimental Analysis of Memory Processing

    ERIC Educational Resources Information Center

    Wright, Anthony A.

    2007-01-01

    Rhesus monkeys were trained and tested in visual and auditory list-memory tasks with sequences of four travel pictures or four natural/environmental sounds followed by single test items. Acquisitions of the visual list-memory task are presented. Visual recency (last item) memory diminished with retention delay, and primacy (first item) memory…

  8. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal

    DTIC Science & Technology

    2013-09-30

    during the Empire Challenge 2008 and 2009 (EC08/09) field experiments and for numerous other field experiments of new technologies during Trident Warrior...Empirical Methods in Natural Language Processing and Very Large Corpora (EMNLP/VLC-2000) (pp. 63–70). Retrieved from http://nlp.stanford.edu/manning

  9. An Integrated Library System from Existing Microcomputer Programs.

    ERIC Educational Resources Information Center

    Kuntz, Lynda S.

    1988-01-01

    Demonstrates how three commercial microcomputer software packages--PC-Talk III, Wordstar, and dBase III--were combined to produce an integrated library system at the U.S. Army Concepts Analysis Agency library. The retrospective conversion process is discussed, and the four modules of the system are described: acquisitions/cataloging; online…

  10. International Instrumentation Symposium, 39th, Albuquerque, NM, May 2-6, 1993, Proceedings

    NASA Astrophysics Data System (ADS)

    Various papers on instrumentation are presented. The general topics addressed include: data acquisition and processing, wind tunnels, pressure measurements, thermal measurements, force measurements, aerospace, metrology, flow measurements, real-time systems, measurement uncertainty, data analysis and calibration, computer applications, special tests, reentry vehicle systems, and human engineering.

  11. San Joaquin, California, High-Speed Rail Grade Crossing Data Acquisition Characteristics, Methodology, and Risk Assessment

    DOT National Transportation Integrated Search

    2006-11-01

    This report discusses data acquisition and analysis for grade crossing risk analysis at the proposed San Joaquin High-Speed Rail Corridor in San Joaquin, California, and documents the data acquisition and analysis methodologies used to collect and an...

  12. 48 CFR 1511.011-76 - Legal analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Legal analysis. 1511.011-76 Section 1511.011-76 Federal Acquisition Regulations System ENVIRONMENTAL PROTECTION AGENCY ACQUISITION PLANNING DESCRIBING AGENCY NEEDS 1511.011-76 Legal analysis. Contracting Officers shall insert the clause...

  13. Hyperspectral image analysis for rapid and accurate discrimination of bacterial infections: A benchmark study.

    PubMed

    Arrigoni, Simone; Turra, Giovanni; Signoroni, Alberto

    2017-09-01

    With the rapid diffusion of Full Laboratory Automation systems, Clinical Microbiology is currently experiencing a new digital revolution. The ability to capture and process large amounts of visual data from microbiological specimen processing enables the definition of completely new objectives. These include the direct identification of pathogens growing on culturing plates, with expected improvements in rapid definition of the right treatment for patients affected by bacterial infections. In this framework, the synergies between light spectroscopy and image analysis offered by hyperspectral imaging are of prominent interest. This leads us to assess the feasibility of a reliable and rapid discrimination of pathogens through the classification of their spectral signatures extracted from hyperspectral image acquisitions of bacteria colonies growing on blood agar plates. We designed and implemented the whole data acquisition and processing pipeline and performed a comprehensive comparison among 40 combinations of different data preprocessing and classification techniques. High discrimination performance has been achieved, thanks also to improved colony segmentation and spectral signature extraction. Experimental results reveal the high accuracy and suitability of the proposed approach, driving the selection of the most suitable and scalable classification pipelines and motivating clinical validation. Copyright © 2017 Elsevier Ltd. All rights reserved.
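
    The comparison of 40 preprocessing/classification combinations can be organized as a grid of pipelines; the sketch below cross-validates a few plausible combinations on spectral signatures X (one row per colony) with species labels y. The specific steps and any resulting scores are illustrative and are not those of the paper.

        from itertools import product
        from sklearn.pipeline import Pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.svm import SVC
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import cross_val_score

        def compare_pipelines(X, y):
            preprocessors = {
                "scale": StandardScaler(),
                "scale+pca": Pipeline([("scale", StandardScaler()), ("pca", PCA(n_components=10))]),
            }
            classifiers = {"svm": SVC(kernel="rbf"), "knn": KNeighborsClassifier(n_neighbors=5)}
            results = {}
            for (p_name, prep), (c_name, clf) in product(preprocessors.items(), classifiers.items()):
                pipe = Pipeline([("prep", prep), ("clf", clf)])
                results[(p_name, c_name)] = cross_val_score(pipe, X, y, cv=5).mean()
            return results   # e.g. {("scale", "svm"): 0.93, ...}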

  14. Current Status of the LOFAR EoR Key Science Project

    NASA Astrophysics Data System (ADS)

    Koopmans, L. V. E.; LOFAR EoR KSP Team

    2018-05-01

    A short status update on the LOFAR Epoch of Reionization (EoR) Key Science Project (KSP) is given, regarding data acquisition, data processing and analysis, and current power-spectrum limits on the redshifted 21-cm signal of neutral hydrogen at redshifts z = 8 - 10. With caution, we present a preliminary astrophysical analysis of ~60 hr of processed LOFAR data and their resulting power spectrum, showing that potentially interesting limits on X-ray heating during the Cosmic Dawn can already be obtained. This is by no means the final analysis of this subset of data, but it illustrates the future potential once the nearly 3000 hr of data in hand on two EoR windows have been processed.

  15. Collection, processing and dissemination of data for the national solar demonstration program

    NASA Technical Reports Server (NTRS)

    Day, R. E.; Murphy, L. J.; Smok, J. T.

    1978-01-01

    A national solar data system developed for the DOE by IBM provides for automatic gathering, conversion, transfer, and analysis of demonstration site data. NASA requirements for this system include providing solar site hardware, engineering, data collection, and analysis. The specific tasks include: (1) solar energy system design/integration; (2) developing a site data acquisition subsystem; (3) developing a central data processing system; (4) operating the test facility at Marshall Space Flight Center; (5) collecting and analyzing data. The systematic analysis and evaluation of the data from the National Solar Data System is reflected in a monthly performance report and a solar energy system performance evaluation report.

  16. High angular resolution diffusion imaging with stimulated echoes: compensation and correction in experiment design and analysis.

    PubMed

    Lundell, Henrik; Alexander, Daniel C; Dyrby, Tim B

    2014-08-01

    Stimulated echo acquisition mode (STEAM) diffusion MRI can be advantageous over pulsed-gradient spin-echo (PGSE) for diffusion times that are long compared with T2 . It therefore has potential for biomedical diffusion imaging applications at 7T and above where T2 is short. However, gradient pulses other than the diffusion gradients in the STEAM sequence contribute much greater diffusion weighting than in PGSE and lead to a disrupted experimental design. Here, we introduce a simple compensation to the STEAM acquisition that avoids the orientational bias and disrupted experiment design that these gradient pulses can otherwise produce. The compensation is simple to implement by adjusting the gradient vectors in the diffusion pulses of the STEAM sequence, so that the net effective gradient vector including contributions from diffusion and other gradient pulses is as the experiment intends. High angular resolution diffusion imaging (HARDI) data were acquired with and without the proposed compensation. The data were processed to derive standard diffusion tensor imaging (DTI) maps, which highlight the need for the compensation. Ignoring the other gradient pulses, a bias in DTI parameters from STEAM acquisition is found, due both to confounds in the analysis and the experiment design. Retrospectively correcting the analysis with a calculation of the full B matrix can partly correct for these confounds, but an acquisition that is compensated as proposed is needed to remove the effect entirely. © 2014 The Authors. NMR in Biomedicine published by John Wiley & Sons, Ltd.
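
    At the level described in the abstract, the compensation amounts to subtracting the contribution of the other gradient pulses from the intended diffusion-encoding vector so that the net effective vector matches the experiment design. A schematic sketch follows; the variable names and the simple vector-addition model are assumptions, not the full B-matrix calculation.

        import numpy as np

        def compensated_diffusion_gradient(g_intended, g_other):
            """Adjust the diffusion gradient so the net effective gradient equals the design value.

            g_intended : (3,) vector the HARDI experiment design calls for
            g_other    : (3,) effective contribution of the non-diffusion gradient pulses
            """
            return np.asarray(g_intended) - np.asarray(g_other)

        g_design = np.array([0.0, 0.0, 30.0])      # mT/m, illustrative design vector
        g_spoiler = np.array([1.5, -0.8, 2.0])     # illustrative contribution of the other pulses
        # Gradient to program so that (diffusion + other) equals the design vector:
        print(compensated_diffusion_gradient(g_design, g_spoiler))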

  17. Image processing tools dedicated to quantification in 3D fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Dieterlen, A.; De Meyer, A.; Colicchio, B.; Le Calvez, S.; Haeberlé, O.; Jacquey, S.

    2006-05-01

    3-D optical fluorescence microscopy has now become an efficient tool for volume investigation of living biological samples. Developments in instrumentation have made it possible to beat the conventional Abbe limit. In any case, the recorded image can be described by the convolution equation between the original object and the Point Spread Function (PSF) of the acquisition system. Due to the finite resolution of the instrument, the original object is recorded with distortions and blurring, and contaminated by noise, so relevant biological information cannot be extracted directly from raw data stacks. If the goal is 3-D quantitative analysis, system characterization is mandatory in order to assess the optimal performance of the instrument and to ensure reproducibility of the data acquisition. The PSF represents the properties of the image acquisition system; we have proposed the use of statistical tools and Zernike moments to describe a 3-D PSF and to quantify its variation. This first step toward standardization helps define an acquisition protocol that optimizes exploitation of the microscope depending on the biological sample under study. Before extracting geometrical information and/or quantifying intensities, data restoration is mandatory. Reduction of out-of-focus light is carried out computationally by a deconvolution process. But other phenomena occur during acquisition, such as fluorescence photodegradation ("bleaching"), which alter the information needed for restoration. We have therefore developed a protocol to pre-process the data before applying deconvolution algorithms. A large number of deconvolution methods have been described and are now available in commercial packages. One major difficulty in using this software is that the user must supply the "best" regularization parameters. We have pointed out that automating the choice of the regularization level greatly improves the reliability of the measurements while also making the software easier to use. Furthermore, to increase the quality and repeatability of quantitative measurements, pre-filtering the images improves the stability of the deconvolution process; in the same way, pre-filtering the PSF stabilizes the deconvolution. We have shown that Zernike polynomials can be used to reconstruct an experimental PSF, preserving the system characteristics while removing the noise contained in the PSF.
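
    The restoration step, inverting the convolution between object and PSF, can be prototyped with a standard iterative algorithm such as Richardson-Lucy; a minimal sketch using scikit-image follows. This is a generic stand-in, not the regularized algorithms or the automated parameter selection discussed above.

        import numpy as np
        from skimage import restoration

        def deconvolve_stack(image_stack, psf, iterations=30):
            """image_stack, psf: 3-D float arrays; returns a Richardson-Lucy estimate of the object."""
            image_stack = image_stack / image_stack.max()      # normalize intensities for the iteration
            return restoration.richardson_lucy(image_stack, psf, iterations)

        # restored = deconvolve_stack(raw_stack, measured_psf)   # placeholders for acquired data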

  18. Characterization of Decision Making Behaviors Associated with Human Systems Integration (HSI) Design Tradeoffs: Subject Matter Expert Interviews

    DTIC Science & Technology

    2014-11-18

    this research was to characterize the naturalistic decision-making process used in Naval Aviation acquisition to assess cost, schedule and...Naval Aviation acquisitions can be identified, which can support the future development of new processes and tools for training and decision making...part of Department of Defense acquisition processes, HSI ensures that operator, maintainer and sustainer considerations are incorporated into

  19. Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions

    NASA Astrophysics Data System (ADS)

    Xie, Jigang; Song, Wenyun

    The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example of the risk analysis of mergers and acquisitions by Chinese listed firms is provided to demonstrate the feasibility and practicality of the method.
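
    For reference, the underlying maximum-entropy problem has the standard textbook form (a generic formulation, not reproduced from the paper): choose the distribution that maximizes entropy subject to the moment constraints supplied by the pre-acquisition information,

        \max_{p}\; H(p) = -\sum_{i} p_i \ln p_i
        \quad\text{subject to}\quad \sum_{i} p_i = 1, \qquad \sum_{i} p_i\, g_k(x_i) = \mu_k, \; k = 1,\dots,K,

    whose solution is the exponential-family distribution

        p_i = \frac{1}{Z(\lambda)} \exp\!\Big(-\sum_{k=1}^{K} \lambda_k\, g_k(x_i)\Big), \qquad
        Z(\lambda) = \sum_{i} \exp\!\Big(-\sum_{k=1}^{K} \lambda_k\, g_k(x_i)\Big),

    with the Lagrange multipliers \lambda_k fixed by the constraint equations.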

  20. NiftyPET: a High-throughput Software Platform for High Quantitative Accuracy and Precision PET Imaging and Analysis.

    PubMed

    Markiewicz, Pawel J; Ehrhardt, Matthias J; Erlandsson, Kjell; Noonan, Philip J; Barnes, Anna; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Ourselin, Sebastien

    2018-01-01

    We present a standalone, scalable and high-throughput software platform for PET image reconstruction and analysis. We focus on high fidelity modelling of the acquisition processes to provide high accuracy and precision quantitative imaging, especially for large axial field of view scanners. All the core routines are implemented using parallel computing available from within the Python package NiftyPET, enabling easy access, manipulation and visualisation of data at any processing stage. The pipeline of the platform starts from MR and raw PET input data and is divided into the following processing stages: (1) list-mode data processing; (2) accurate attenuation coefficient map generation; (3) detector normalisation; (4) exact forward and back projection between sinogram and image space; (5) estimation of reduced-variance random events; (6) high accuracy fully 3D estimation of scatter events; (7) voxel-based partial volume correction; (8) region- and voxel-level image analysis. We demonstrate the advantages of this platform using an amyloid brain scan where all the processing is executed from a single and uniform computational environment in Python. The high accuracy acquisition modelling is achieved through span-1 (no axial compression) ray tracing for true, random and scatter events. Furthermore, the platform offers uncertainty estimation of any image derived statistic to facilitate robust tracking of subtle physiological changes in longitudinal studies. The platform also supports the development of new reconstruction and analysis algorithms through restricting the axial field of view to any set of rings covering a region of interest and thus performing fully 3D reconstruction and corrections using real data significantly faster. All the software is available as open source with the accompanying wiki-page and test data.

  1. Development of MATLAB software to control data acquisition from a multichannel systems multi-electrode array.

    PubMed

    Messier, Erik

    2016-08-01

    A Multichannel Systems (MCS) microelectrode array data acquisition (DAQ) unit is used to collect multichannel electrograms (EGM) from a Langendorff-perfused rabbit heart system to study sudden cardiac death (SCD). MCS provides software through which data being processed by the DAQ unit can be displayed and saved, but this software's combined utility with MATLAB is not very effective. MCS's software stores recorded EGM data in a MathCad (MCD) format, which is then converted to a text file format. These text files are very large, and it is therefore very time consuming to import the EGM data into MATLAB for real-time analysis. Therefore, customized MATLAB software was developed to control the acquisition of data from the MCS DAQ unit and provide specific laboratory accommodations for this study of SCD. The developed DAQ unit control software is able to accurately provide real-time display of EGM signals, record and save EGM signals in MATLAB in a desired format, and produce real-time analysis of the EGM signals, all through an intuitive GUI.

  2. Good practice for conducting and reporting MEG research

    PubMed Central

    Gross, Joachim; Baillet, Sylvain; Barnes, Gareth R.; Henson, Richard N.; Hillebrand, Arjan; Jensen, Ole; Jerbi, Karim; Litvak, Vladimir; Maess, Burkhard; Oostenveld, Robert; Parkkonen, Lauri; Taylor, Jason R.; van Wassenhove, Virginie; Wibral, Michael; Schoffelen, Jan-Mathijs

    2013-01-01

    Magnetoencephalographic (MEG) recordings are a rich source of information about the neural dynamics underlying cognitive processes in the brain, with excellent temporal and good spatial resolution. In recent years there have been considerable advances in MEG hardware developments and methods. Sophisticated analysis techniques are now routinely applied and continuously improved, leading to fascinating insights into the intricate dynamics of neural processes. However, the rapidly increasing level of complexity of the different steps in a MEG study make it difficult for novices, and sometimes even for experts, to stay aware of possible limitations and caveats. Furthermore, the complexity of MEG data acquisition and data analysis requires special attention when describing MEG studies in publications, in order to facilitate interpretation and reproduction of the results. This manuscript aims at making recommendations for a number of important data acquisition and data analysis steps and suggests details that should be specified in manuscripts reporting MEG studies. These recommendations will hopefully serve as guidelines that help to strengthen the position of the MEG research community within the field of neuroscience, and may foster discussion in order to further enhance the quality and impact of MEG research. PMID:23046981

  3. Proteomic Analysis of Embryogenesis and the Acquisition of Seed Dormancy in Norway Maple (Acer platanoides L.)

    PubMed Central

    Staszak, Aleksandra Maria; Pawłowski, Tomasz Andrzej

    2014-01-01

    The proteome of zygotic embryos of Acer platanoides L. was analyzed via high-resolution 2D-SDS-PAGE and MS/MS in order to: (1) identify significant physiological processes associated with embryo development; and (2) identify changes in the proteome of the embryo associated with the acquisition of seed dormancy. Seventeen spots were identified as associated with morphogenesis at 10 to 13 weeks after flowering (WAF). Thirty-three spots were associated with maturation of the embryo at 14 to 22 WAF. The greatest changes in protein abundance occurred at 22 WAF, when seeds become fully mature. Overall, the stage of morphogenesis was characterized by changes in the abundance of proteins (tubulins and actin) associated with the growth and development of the embryo. Enzymes related to energy supply were especially elevated, most likely due to the energy demand associated with rapid growth and cell division. The stage of maturation is crucial to the establishment of seed dormancy and is associated with a higher abundance of proteins involved in genetic information processing, energy and carbon metabolism and cellular and antioxidant processes. Results indicated that a glycine-rich RNA-binding protein and proteasome proteins may be directly involved in dormancy acquisition control, and future studies are warranted to verify this association. PMID:24941250

  4. Proteomic analysis of embryogenesis and the acquisition of seed dormancy in Norway maple (Acer platanoides L.).

    PubMed

    Staszak, Aleksandra Maria; Pawłowski, Tomasz Andrzej

    2014-06-17

    The proteome of zygotic embryos of Acer platanoides L. was analyzed via high-resolution 2D-SDS-PAGE and MS/MS in order to: (1) identify significant physiological processes associated with embryo development; and (2) identify changes in the proteome of the embryo associated with the acquisition of seed dormancy. Seventeen spots were identified as associated with morphogenesis at 10 to 13 weeks after flowering (WAF). Thirty-three spots were associated with maturation of the embryo at 14 to 22 WAF. The greatest changes in protein abundance occurred at 22 WAF, when seeds become fully mature. Overall, the stage of morphogenesis was characterized by changes in the abundance of proteins (tubulins and actin) associated with the growth and development of the embryo. Enzymes related to energy supply were especially elevated, most likely due to the energy demand associated with rapid growth and cell division. The stage of maturation is crucial to the establishment of seed dormancy and is associated with a higher abundance of proteins involved in genetic information processing, energy and carbon metabolism and cellular and antioxidant processes. Results indicated that a glycine-rich RNA-binding protein and proteasome proteins may be directly involved in dormancy acquisition control, and future studies are warranted to verify this association.

  5. A Distributed Data Acquisition System for the Sensor Network of the TAWARA_RTM Project

    NASA Astrophysics Data System (ADS)

    Fontana, Cristiano Lino; Donati, Massimiliano; Cester, Davide; Fanucci, Luca; Iovene, Alessandro; Swiderski, Lukasz; Moretto, Sandra; Moszynski, Marek; Olejnik, Anna; Ruiu, Alessio; Stevanato, Luca; Batsch, Tadeusz; Tintori, Carlo; Lunardon, Marcello

    This paper describes a distributed Data Acquisition System (DAQ) developed for the TAWARA_RTM project (TAp WAter RAdioactivity Real Time Monitor). The aim is to detect the presence of radioactive contaminants in drinking water in order to prevent deliberate or accidental threats. Employing a set of detectors, it is possible to detect alpha, beta and gamma radiation from emitters dissolved in water. The Sensor Network (SN) consists of several heterogeneous nodes controlled by a centralized server. The SN cyber-security is guaranteed in order to protect it from external intrusions and malicious acts. The nodes were installed at different locations along the water treatment process in the waterworks plant supplying the aqueduct of Warsaw, Poland. Embedded computers control the simpler nodes and are directly connected to the SN. Local PCs (LPCs) control the more complex nodes, which consist of signal digitizers acquiring data from several detectors. The DAQ on each LPC is split into several processes communicating through sockets on a local sub-network. Each process is dedicated to a very simple task (e.g. data acquisition, data analysis, hydraulics management) in order to have a flexible and fault-tolerant system. The main SN and the local DAQ networks are separated by data routers to ensure cyber-security.

  6. Flexible software platform for fast-scan cyclic voltammetry data acquisition and analysis.

    PubMed

    Bucher, Elizabeth S; Brooks, Kenneth; Verber, Matthew D; Keithley, Richard B; Owesson-White, Catarina; Carroll, Susan; Takmakov, Pavel; McKinney, Collin J; Wightman, R Mark

    2013-11-05

    Over the last several decades, fast-scan cyclic voltammetry (FSCV) has proved to be a valuable analytical tool for the real-time measurement of neurotransmitter dynamics in vitro and in vivo. Indeed, FSCV has found application in a wide variety of disciplines including electrochemistry, neurobiology, and behavioral psychology. The maturation of FSCV as an in vivo technique led users to pose increasingly complex questions that require a more sophisticated experimental design. To accommodate recent and future advances in FSCV application, our lab has developed High Definition Cyclic Voltammetry (HDCV). HDCV is an electrochemical software suite that includes data acquisition and analysis programs. The data collection program delivers greater experimental flexibility and better user feedback through live displays. It supports experiments involving multiple electrodes with customized waveforms. It is compatible with transistor-transistor logic-based systems that are used for monitoring animal behavior, and it enables simultaneous recording of electrochemical and electrophysiological data. HDCV analysis streamlines data processing with superior filtering options, seamlessly manages behavioral events, and integrates chemometric processing. Furthermore, analysis is capable of handling single files collected over extended periods of time, allowing the user to consider biological events on both subsecond and multiminute time scales. Here we describe and demonstrate the utility of HDCV for in vivo experiments.

  7. Enhancement of the Acquisition Process for a Combat System-A Case Study to Model the Workflow Processes for an Air Defense System Acquisition

    DTIC Science & Technology

    2009-12-01

    Acronyms from the report documentation include BPMN (Business Process Modeling Notation), SoA (Service-oriented Architecture), UML (Unified Modeling Language), and CSP...system developers. Supporting technologies include Business Process Modeling Notation (BPMN), Unified Modeling Language (UML), model-driven architecture

  8. 48 CFR 215.404-2 - Information to support proposal analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Contract Pricing 215.404-2 Information to support proposal analysis. See PGI 215.404-2 for guidance on... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Information to support proposal analysis. 215.404-2 Section 215.404-2 Federal Acquisition Regulations System DEFENSE ACQUISITION...

  9. 48 CFR 215.404-2 - Information to support proposal analysis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Contract Pricing 215.404-2 Information to support proposal analysis. See PGI 215.404-2 for guidance on... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Information to support proposal analysis. 215.404-2 Section 215.404-2 Federal Acquisition Regulations System DEFENSE ACQUISITION...

  10. 48 CFR 215.404-2 - Information to support proposal analysis.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Contract Pricing 215.404-2 Information to support proposal analysis. See PGI 215.404-2 for guidance on... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Information to support proposal analysis. 215.404-2 Section 215.404-2 Federal Acquisition Regulations System DEFENSE ACQUISITION...

  11. Non-destructive forensic latent fingerprint acquisition with chromatic white light sensors

    NASA Astrophysics Data System (ADS)

    Leich, Marcus; Kiltz, Stefan; Dittmann, Jana; Vielhauer, Claus

    2011-02-01

    Non-destructive latent fingerprint acquisition is an emerging field of research, which, unlike traditional methods, makes latent fingerprints available for additional verification or further analysis like tests for substance abuse or age estimation. In this paper a series of tests is performed to investigate the overall suitability of a high resolution off-the-shelf chromatic white light sensor for the contact-less and non-destructive latent fingerprint acquisition. Our paper focuses on scanning previously determined regions with exemplary acquisition parameter settings. 3D height field and reflection data of five different latent fingerprints on six different types of surfaces (HDD platter, brushed metal, painted car body (metallic and non-metallic finish), blued metal, veneered plywood) are experimentally studied. Pre-processing is performed by removing low-frequency gradients. The quality of the results is assessed subjectively; no automated feature extraction is performed. Additionally, the degradation of the fingerprint during the acquisition period is observed. While the quality of the acquired data is highly dependent on surface structure, the sensor is capable of detecting the fingerprint on all sample surfaces. On blued metal the residual material is detected; however, the ridge line structure dissolves within minutes after fingerprint placement.
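    As a rough illustration of the pre-processing step mentioned above, the sketch below removes a low-frequency gradient from a height-field scan by subtracting a heavily smoothed copy of it (a simple Gaussian high-pass). The filter width and the synthetic test surface are assumptions, not parameters from the paper.

    ```python
    # Gaussian high-pass sketch for flattening a scanned height field; the
    # sigma value and the synthetic data are illustrative assumptions.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def remove_low_frequency_gradient(height_field, sigma=50):
        """Subtract the large-scale surface trend, keeping fine ridge detail."""
        background = gaussian_filter(height_field, sigma=sigma)  # low-frequency estimate
        return height_field - background                         # high-pass residual

    # Synthetic example: a tilted plane plus fine ridge-like texture.
    y, x = np.mgrid[0:512, 0:512]
    scan = 0.01 * x + 0.02 * y + 0.5 * np.sin(x / 3.0)
    flattened = remove_low_frequency_gradient(scan)
    ```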

  12. Evolution Of The Operational Energy Strategy And Its Consideration In The Defense Acquisition Process

    DTIC Science & Technology

    2016-09-01

    Evolution of the Operational Energy Strategy and Its Consideration in the Defense Acquisition Process, by Richard J. Kendig, Ashley D. Seaton, and Robert J. Rodgers. The project...looked at the DOD Operational Energy Strategy evolution and how it applies to new and modified weapon systems, considering the three-legged table of the

  13. Dynamic Data Management Based on Archival Process Integration at the Centre for Environmental Data Archival

    NASA Astrophysics Data System (ADS)

    Conway, Esther; Waterfall, Alison; Pepler, Sam; Newey, Charles

    2015-04-01

    In this paper we describe a business process modelling approach to the integration of existing archival activities. We provide a high level overview of existing practice and discuss how procedures can be extended and supported through the description of preservation state, the aim being to facilitate the dynamic, controlled management of scientific data through its lifecycle. The main types of archival processes considered are: • Management processes that govern the operation of an archive. These management processes include archival governance (preservation state management, selection of archival candidates and strategic management). • Operational processes that constitute the core activities of the archive which maintain the value of research assets. These operational processes are the acquisition, ingestion, deletion, generation of metadata and preservation activities. • Supporting processes, which include planning, risk analysis and monitoring of the community/preservation environment. We then proceed by describing the feasibility testing of extended risk management and planning procedures which integrate current practices. This was done through the CEDA Archival Format Audit, which inspected the British Atmospheric Data Centre and National Earth Observation Data Centre archival holdings. These holdings are extensive, comprising around 2 PB of data and 137 million individual files, which were analysed and characterised in terms of format-based risk. We are then able to present an overview of the risk burden faced by a large scale archive attempting to maintain the usability of heterogeneous environmental data sets. We conclude by presenting a dynamic data management information model that is capable of describing the preservation state of archival holdings throughout the data lifecycle. We provide discussion of the following core model entities and their relationships: • Aspirational entities, which include Data Entity definitions and their associated Preservation Objectives. • Risk entities, which act as drivers for change within the data lifecycle. These include Acquisitional Risks, Technical Risks, Strategic Risks and External Risks. • Plan entities, which detail the actions to bring about change within an archive. These include Acquisition Plans, Preservation Plans and Monitoring Plans. • Result entities, which describe the successful outcomes of the executed plans. These include Acquisitions, Mitigations and Accepted Risks.
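    A schematic rendering of the entity types listed above, written as Python dataclasses purely to make the relationships concrete; the class and field names are our own reading of the abstract, not CEDA's implementation.

    ```python
    # Schematic data model (our reading of the abstract, not CEDA's code).
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class PreservationObjective:
        description: str

    @dataclass
    class DataEntity:                      # aspirational entity
        name: str
        objectives: List[PreservationObjective] = field(default_factory=list)

    @dataclass
    class Risk:                            # acquisitional, technical, strategic or external
        category: str
        description: str

    @dataclass
    class Plan:                            # acquisition, preservation or monitoring plan
        kind: str
        actions: List[str] = field(default_factory=list)
        mitigates: List[Risk] = field(default_factory=list)

    @dataclass
    class Result:                          # acquisition, mitigation or accepted risk
        kind: str
        executed_plan: Plan
    ```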

  14. PDT - PARTICLE DISPLACEMENT TRACKING SOFTWARE

    NASA Technical Reports Server (NTRS)

    Wernet, M. P.

    1994-01-01

    Particle Imaging Velocimetry (PIV) is a quantitative velocity measurement technique for measuring instantaneous planar cross sections of a flow field. The technique offers very high precision (1%) directionally resolved velocity vector estimates, but its use has been limited by high equipment costs and complexity of operation. Particle Displacement Tracking (PDT) is an all-electronic PIV data acquisition and reduction procedure which is simple, fast, and easily implemented. The procedure uses a low power, continuous wave laser and a Charged Coupled Device (CCD) camera to electronically record the particle images. A frame grabber board in a PC is used for data acquisition and reduction processing. PDT eliminates the need for photographic processing, system costs are moderately low, and reduced data are available within seconds of acquisition. The technique results in velocity estimate accuracies on the order of 5%. The software is fully menu-driven from the acquisition to the reduction and analysis of the data. Options are available to acquire a single image or 5- or 25-field series of images separated in time by multiples of 1/60 second. The user may process each image, specifying its boundaries to remove unwanted glare from the periphery and adjusting its background level to clearly resolve the particle images. Data reduction routines determine the particle image centroids and create time history files. PDT then identifies the velocity vectors which describe the particle movement in the flow field. Graphical data analysis routines are included which allow the user to graph the time history files and display the velocity vector maps, interpolated velocity vector grids, iso-velocity vector contours, and flow streamlines. The PDT data processing software is written in FORTRAN 77 and the data acquisition routine is written in C-Language for 80386-based IBM PC compatibles running MS-DOS v3.0 or higher. Machine requirements include 4 MB RAM (3 MB Extended), a single or multiple frequency RGB monitor (EGA or better), a math co-processor, and a pointing device. The printers supported by the graphical analysis routines are the HP Laserjet+, Series II, and Series III with at least 1.5 MB memory. The data acquisition routines require the EPIX 4-MEG video board and optional 12.5MHz oscillator, and associated EPIX software. Data can be acquired from any CCD or RS-170 compatible video camera with pixel resolution of 600hX400v or better. PDT is distributed on one 5.25 inch 360K MS-DOS format diskette. Due to the use of required proprietary software, executable code is not provided on the distribution media. Compiling the source code requires the Microsoft C v5.1 compiler, Microsoft QuickC v2.0, the Microsoft Mouse Library, EPIX Image Processing Libraries, the Microway NDP-Fortran-386 v2.1 compiler, and the Media Cybernetics HALO Professional Graphics Kernal System. Due to the complexities of the machine requirements, COSMIC strongly recommends the purchase and review of the documentation prior to the purchase of the program. The source code, and sample input and output files are provided in PKZIP format; the PKUNZIP utility is included. PDT was developed in 1990. All trade names used are the property of their respective corporate owners.
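    The sketch below re-creates, generically in Python, the two core PDT steps described above: find particle-image centroids in a frame, then pair nearest centroids between successive frames to obtain displacement and velocity vectors. It is not the original FORTRAN 77/C code; the threshold, frame interval, and pixel scale are placeholder values.

    ```python
    # Generic particle-displacement-tracking sketch; thresholds, the 1/60 s
    # frame interval, and the pixel scale are illustrative assumptions.
    import numpy as np
    from scipy import ndimage

    def particle_centroids(frame, threshold=128):
        """Return (row, col) centroids of bright particle images in one frame."""
        mask = frame > threshold
        labels, n = ndimage.label(mask)
        return np.array(ndimage.center_of_mass(frame, labels, range(1, n + 1)))

    def velocities(frame_a, frame_b, dt=1 / 60.0, scale=1.0):
        """Nearest-neighbour pairing of centroids between two frames -> velocity vectors."""
        ca, cb = particle_centroids(frame_a), particle_centroids(frame_b)
        vecs = []
        for p in ca:                                        # assumes both frames contain particles
            j = np.argmin(np.linalg.norm(cb - p, axis=1))   # closest match in frame_b
            vecs.append((cb[j] - p) * scale / dt)           # pixels (or metres) per second
        return np.array(vecs)
    ```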

  15. An analytical approach to customer requirement information processing

    NASA Astrophysics Data System (ADS)

    Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong

    2013-11-01

    'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.

  16. An Analysis of the Army Service Acquisition Review Requirements and the Perceived Effectiveness on Intended Improvements

    DTIC Science & Technology

    2016-06-01

    design will help assess each individual’s perceptions on the five primary research questions. D. PILOT TESTING After creating the survey, it’s...distributed to individuals that have submitted requirements packages through the ASSP process. The survey field test was designed to determine the...Will be designated for each of the service portfolio groups and collaborates to define common processes across DOD Component Level Lead (CLL

  17. Combining MMOWGLI Social Media Brainstorming with Lexical Link Analysis (LLA) to Strengthen the DoD Acquisition Process

    DTIC Science & Technology

    2013-09-30

    founded Quantum Intelligence, Inc. She was principal investigator (PI) for six contracts awarded by the DoD Small Business Innovation Research (SBIR... Quantum Intelligence, Inc. CLA is a computer-based learning agent, or agent collaboration, capable of ingesting and processing data sources. We have...opportunities all need to be addressed consciously and consistently.  Following a series of deliberate experiments, long-term procedural improvements to the

  18. Automatic differential analysis of NMR experiments in complex samples.

    PubMed

    Margueritte, Laure; Markov, Petar; Chiron, Lionel; Starck, Jean-Philippe; Vonthron-Sénécheau, Catherine; Bourjot, Mélanie; Delsuc, Marc-André

    2018-06-01

    Liquid state nuclear magnetic resonance (NMR) is a powerful tool for the analysis of complex mixtures of unknown molecules. This capacity has been used in many analytical approaches: metabolomics, identification of active compounds in natural extracts, and characterization of species, and such studies require the acquisition of many diverse NMR measurements on series of samples. Although acquisition can easily be performed automatically, the number of NMR experiments involved in these studies increases very rapidly, and this data avalanche requires resorting to automatic processing and analysis. We present here a program that allows the autonomous, unsupervised processing of a large corpus of 1D, 2D, and diffusion-ordered spectroscopy experiments from a series of samples acquired in different conditions. The program provides all the signal processing steps, as well as peak-picking and bucketing of 1D and 2D spectra; the program and its components are fully available. In an experiment mimicking the search for a bioactive species in a natural extract, we use it for the automatic detection of small amounts of artemisinin added to a series of plant extracts and for the generation of the spectral fingerprint of this molecule. This program, called Plasmodesma, is a novel tool that should be useful to decipher complex mixtures, particularly in the discovery of biologically active natural products from plant extracts, but it can also be applied in drug discovery or metabolomics studies. Copyright © 2017 John Wiley & Sons, Ltd.
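    As an illustration of one processing step named above, bucketing of a 1D spectrum, the following generic sketch integrates a spectrum into fixed-width ppm buckets; it is not taken from the Plasmodesma source code, and the bucket width is an assumption.

    ```python
    # Generic 1D NMR bucketing sketch (fixed-width ppm integration).
    import numpy as np

    def bucket_1d(ppm_axis, intensities, bucket_width=0.04):
        """Integrate a 1D spectrum into fixed-width ppm buckets (a common NMR fingerprint)."""
        lo, hi = ppm_axis.min(), ppm_axis.max()
        edges = np.arange(lo, hi + bucket_width, bucket_width)
        idx = np.digitize(ppm_axis, edges)
        buckets = np.array([intensities[idx == i].sum() for i in range(1, len(edges))])
        centers = edges[:-1] + bucket_width / 2
        return centers, buckets

    # Usage on a synthetic spectrum spanning 0-10 ppm:
    ppm = np.linspace(0, 10, 32768)
    spec = np.exp(-((ppm - 3.5) ** 2) / 1e-4) + 0.01 * np.random.rand(ppm.size)
    centers, buckets = bucket_1d(ppm, spec)
    ```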

  19. The Impact of Maintenance Free Operating Period Approach to Acquisition Approaches, System Sustainment, and Costs

    DTIC Science & Technology

    2013-01-07

    of MFOP principles on processes, procedures, and costs in acquisition planning. It investigates MFOP and reviews the results of a 2005 submarine pilot...approach be a game changer? This paper evaluates the potential impact of MFOP principles on processes, procedures, and costs in acquisition planning. It...The scope of the research was to

  20. Automating Acquisitions: The Planning Process.

    ERIC Educational Resources Information Center

    Bryant, Bonita

    1984-01-01

    Account of process followed at large academic library in preparing for automation of acquisition and fund accounting functions highlights planning criteria, local goals, planning process elements (selecting participants, assigning tasks, devising timetable, providing foundations, evaluating systems, determining costs, formulating recommendations).…

  1. IDC Re-Engineering Phase 2 System Requirements Document Version 1.4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, James M.; Burns, John F.; Satpathi, Meara Allena

    This System Requirements Document (SRD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The IDC applies, on a routine basis, automatic processing methods and interactive analysis to raw International Monitoring System (IMS) data in order to produce, archive, and distribute standard IDC products on behalf of all States Parties. The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data, but includes requirements for the dissemination of radionuclide data and products.

  2. IDC Re-Engineering Phase 2 System Requirements Document V1.3.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, James M.; Burns, John F.; Satpathi, Meara Allena

    2015-12-01

    This System Requirements Document (SRD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The IDC applies, on a routine basis, automatic processing methods and interactive analysis to raw International Monitoring System (IMS) data in order to produce, archive, and distribute standard IDC products on behalf of all States Parties. The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data but includes requirements for the dissemination of radionuclide data and products.

  3. 48 CFR 15.404-2 - Data to support proposal analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Data to support proposal analysis. 15.404-2 Section 15.404-2 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 15.404-2 Data to support proposal analysis. (a) Field pricing...

  4. 48 CFR 801.602-78 - Processing solicitations and contract documents for legal or technical review-Veterans Health...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., Central Office (except Office of Construction and Facilities Management), the National Acquisition Center, and the Denver Acquisition and Logistics Center. 801.602-78 Section 801.602-78 Federal Acquisition... Acquisition Center, and the Denver Acquisition and Logistics Center. (a) If legal or technical review is...

  5. 48 CFR 801.602-78 - Processing solicitations and contract documents for legal or technical review-Veterans Health...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., Central Office (except Office of Construction and Facilities Management), the National Acquisition Center, and the Denver Acquisition and Logistics Center. 801.602-78 Section 801.602-78 Federal Acquisition... Acquisition Center, and the Denver Acquisition and Logistics Center. (a) If legal or technical review is...

  6. 48 CFR 801.602-78 - Processing solicitations and contract documents for legal or technical review-Veterans Health...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., Central Office (except Office of Construction and Facilities Management), the National Acquisition Center, and the Denver Acquisition and Logistics Center. 801.602-78 Section 801.602-78 Federal Acquisition... Acquisition Center, and the Denver Acquisition and Logistics Center. (a) If legal or technical review is...

  7. 48 CFR 852.246-72 - Frozen processed foods.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Frozen processed foods. 852.246-72 Section 852.246-72 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... Frozen processed foods. As prescribed in 846.302-72, insert the following clause: Frozen Processed Foods...

  8. 48 CFR 852.246-72 - Frozen processed foods.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Frozen processed foods. 852.246-72 Section 852.246-72 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... Frozen processed foods. As prescribed in 846.302-72, insert the following clause: Frozen Processed Foods...

  9. 48 CFR 852.246-72 - Frozen processed foods.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Frozen processed foods. 852.246-72 Section 852.246-72 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... Frozen processed foods. As prescribed in 846.302-72, insert the following clause: Frozen Processed Foods...

  10. 48 CFR 852.246-72 - Frozen processed foods.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Frozen processed foods. 852.246-72 Section 852.246-72 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... Frozen processed foods. As prescribed in 846.302-72, insert the following clause: Frozen Processed Foods...

  11. 48 CFR 852.246-72 - Frozen processed foods.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Frozen processed foods. 852.246-72 Section 852.246-72 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... Frozen processed foods. As prescribed in 846.302-72, insert the following clause: Frozen Processed Foods...

  12. Analytical Ultrasonics in Materials Research and Testing

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1986-01-01

    Research results in analytical ultrasonics for characterizing structural materials from metals and ceramics to composites are presented. General topics covered by the conference included: status and advances in analytical ultrasonics for characterizing material microstructures and mechanical properties; status and prospects for ultrasonic measurements of microdamage, degradation, and underlying morphological factors; status and problems in precision measurements of frequency-dependent velocity and attenuation for materials analysis; procedures and requirements for automated, digital signal acquisition, processing, analysis, and interpretation; incentives for analytical ultrasonics in materials research and materials processing, testing, and inspection; and examples of progress in ultrasonics for interrelating microstructure, mechanical properties, and dynamic response.

  13. Degree of food processing of household acquisition patterns in a Brazilian urban area is related to food buying preferences and perceived food environment.

    PubMed

    Vedovato, G M; Trude, A C B; Kharmats, A Y; Martins, P A

    2015-04-01

    This cross-sectional study examined the association between local food environment and consumers' acquisition of ultra-processed food. Households were randomly selected from 36 census tracts in Santos City, Brazil. Mothers, of varying economic status, who had children ages 10 or younger (n = 538) were interviewed concerning: their household food acquisition of 31 groups of food and beverages, perceptions of the local food environment, food source destinations, means of transportation used, and socioeconomic status. Food acquisition patterns were classified based on the degree of industrial food processing. Logistic regression models were fitted to assess the association between consumer behaviors and acquisition patterns. The large variety of fresh produce available in supermarkets was significantly related to lower odds of ultra-processed food purchases. After adjusting for sociodemographic characteristics, higher odds for minimally-processed food acquisition were associated with: frequent use of specialized markets to purchase fruits and vegetables (OR 1.89, 95% CI 1.01-2.34), the habit of walking to buy food (OR 1.58, 95% CI 1.08-2.30), and perceived availability of fresh produce in participants' neighborhood (OR 1.58, 95% CI 1.08-2.30). Acquisition of ultra-processed food was positively associated with the use of taxis as principal means of transportation to food sources (OR 2.35, 95% CI 1.08-5.13), and negatively associated with perceived availability of a variety of fruits and vegetables in the neighborhood (OR 0.57, 95% CI 0.37-0.88). The results suggest that interventions aiming to promote acquisition of less processed food in settings similar to Santos may be most effective if they focus on increasing the number of specialized fresh food markets in local neighborhood areas, improve residents' awareness of these markets' availability, and provide appropriate transportation. Copyright © 2014 Elsevier Ltd. All rights reserved.
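    A hedged sketch of the kind of model reported above: a logistic regression of an ultra-processed-acquisition indicator on consumer-behaviour covariates, with coefficients exponentiated to odds ratios and 95% confidence intervals. The variable names and the synthetic data are hypothetical and do not reproduce the study's dataset.

    ```python
    # Illustrative logistic regression with odds ratios; data and names are synthetic.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "ultra_processed": rng.integers(0, 2, n),       # outcome: high acquisition (1) or not (0)
        "uses_taxi": rng.integers(0, 2, n),             # principal transport to food sources
        "perceived_fv_variety": rng.integers(0, 2, n),  # perceives variety of fruit/vegetables nearby
    })

    X = sm.add_constant(df[["uses_taxi", "perceived_fv_variety"]])
    model = sm.Logit(df["ultra_processed"], X).fit(disp=False)
    odds_ratios = np.exp(model.params)                  # exponentiated coefficients = ORs
    conf_int = np.exp(model.conf_int())                 # 95% CIs on the OR scale
    print(pd.concat([odds_ratios, conf_int], axis=1))
    ```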

  14. Optical method for measuring the surface area of a threaded fastener

    Treesearch

    Douglas Rammer; Samuel Zelinka

    2010-01-01

    This article highlights major aspects of a new optical technique to determine the surface area of a threaded fastener; the theoretical framework has been reported elsewhere. Specifically, this article describes general surface area expressions used in the analysis, details of image acquisition system, and major image processing steps contained within the measurement...

  15. The Study on Virtual Medical Instrument based on LabVIEW.

    PubMed

    Chengwei, Li; Limei, Zhang; Xiaoming, Hu

    2005-01-01

    With the increasing performance of computers, virtual instrument technology has advanced greatly over the years, and virtual medical instrument technology has become available. This paper presents the virtual medical instrument concept and then, as an example, describes a signal acquisition, processing and analysis system implemented in LabVIEW.

  16. Bilingualism, Cultural Transmutation, and Fields of Coexistence: California's Spanish Language Legacy

    ERIC Educational Resources Information Center

    Garcia, Sara

    2006-01-01

    This is an historical analysis of English Only programs in California and their impact on bilingualism as a natural acquisition process. Factors that propagate bilingualism such as a continual flow of Spanish speaking immigrants, and social, economic and ethnic isolation, are delineated for theorizing about key aspects of multilingualism, the…

  17. Analysis of Commercial Pricing Factors: A Framework for Commercial Item Pricing

    DTIC Science & Technology

    2002-03-01

    Table of contents excerpt: A. Introduction; B. Pricing Theories (1. Market Theory; 2. Transactional Cost Economics; 3. Game or Bargaining Theory); C. Current Price...from Congressional intent to have the Government acquisition process rely on market forces to determine fair and reasonable prices. (Ref. 22, p. 2

  18. Cooperative knowledge evolution: a construction-integration approach to knowledge discovery in medicine.

    PubMed

    Schmalhofer, F J; Tschaitschian, B

    1998-11-01

    In this paper, we perform a cognitive analysis of knowledge discovery processes. As a result of this analysis, the construction-integration theory is proposed as a general framework for developing cooperative knowledge evolution systems. We thus suggest that for the acquisition of new domain knowledge in medicine, one should first construct pluralistic views on a given topic which may contain inconsistencies as well as redundancies. Only thereafter does this knowledge become consolidated into a situation-specific circumscription and the early inconsistencies become eliminated. As a proof for the viability of such knowledge acquisition processes in medicine, we present the IDEAS system, which can be used for the intelligent documentation of adverse events in clinical studies. This system provides a better documentation of the side-effects of medical drugs. Thereby, knowledge evolution occurs by achieving consistent explanations in increasingly larger contexts (i.e., more cases and more pharmaceutical substrates). Finally, it is shown how prototypes, model-based approaches and cooperative knowledge evolution systems can be distinguished as different classes of knowledge-based systems.

  19. Combining a Deconvolution and a Universal Library Search Algorithm for the Nontarget Analysis of Data-Independent Acquisition Mode Liquid Chromatography-High-Resolution Mass Spectrometry Results.

    PubMed

    Samanipour, Saer; Reid, Malcolm J; Bæk, Kine; Thomas, Kevin V

    2018-04-17

    Nontarget analysis is considered one of the most comprehensive tools for the identification of unknown compounds in a complex sample analyzed via liquid chromatography coupled to high-resolution mass spectrometry (LC-HRMS). Due to the complexity of the data generated via LC-HRMS, the data-dependent acquisition mode, which produces the MS2 spectra of a limited number of the precursor ions, has been one of the most common approaches used during nontarget screening. However, data-independent acquisition mode produces highly complex spectra that require proper deconvolution and library search algorithms. We have developed a deconvolution algorithm and a universal library search algorithm (ULSA) for the analysis of complex spectra generated via data-independent acquisition. These algorithms were validated and tested using both semisynthetic and real environmental data. A total of 6000 randomly selected spectra from MassBank were introduced across the total ion chromatograms of 15 sludge extracts at three levels of background complexity for the validation of the algorithms via semisynthetic data. The deconvolution algorithm successfully extracted more than 60% of the added ions in the analytical signal for 95% of processed spectra (i.e., 3 complexity levels multiplied by 6000 spectra). The ULSA ranked the correct spectra among the top three for more than 95% of cases. We further tested the algorithms with 5 wastewater effluent extracts for 59 artificial unknown analytes (i.e., their presence or absence was confirmed via target analysis). These algorithms did not produce any cases of false identifications while correctly identifying ∼70% of the total inquiries. The implications, capabilities, and the limitations of both algorithms are further discussed.
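    To make the library-search idea concrete, here is a minimal, generic cosine-similarity matcher over binned spectra. It only illustrates the type of comparison a spectral library search performs and is not the published ULSA or deconvolution algorithm; the bin limits and width are arbitrary.

    ```python
    # Generic spectral library search via cosine similarity on binned spectra.
    import numpy as np

    def bin_spectrum(mz, intensity, mz_min=50.0, mz_max=1000.0, bin_width=0.01):
        """Convert a peak list into a fixed-length intensity vector."""
        n_bins = int((mz_max - mz_min) / bin_width)
        vec = np.zeros(n_bins)
        idx = ((np.asarray(mz) - mz_min) / bin_width).astype(int)
        keep = (idx >= 0) & (idx < n_bins)
        np.add.at(vec, idx[keep], np.asarray(intensity)[keep])
        return vec

    def cosine_score(query, reference):
        q, r = bin_spectrum(*query), bin_spectrum(*reference)
        denom = np.linalg.norm(q) * np.linalg.norm(r)
        return float(q @ r / denom) if denom else 0.0

    def rank_library(query, library):
        """Return library entry names sorted by descending cosine score."""
        return sorted(library, key=lambda name: cosine_score(query, library[name]), reverse=True)
    ```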

  20. Comparison of Data Acquisition Strategies on Quadrupole Ion Trap Instrumentation for Shotgun Proteomics

    PubMed Central

    Canterbury, Jesse D.; Merrihew, Gennifer E.; Goodlett, David R.; MacCoss, Michael J.; Shaffer, Scott A.

    2015-01-01

    A common strategy in mass spectrometry analyses of complex protein mixtures is to digest the proteins to peptides, separate the peptides by microcapillary liquid chromatography and collect tandem mass spectra (MS/MS) on the eluting, complex peptide mixtures, a process commonly termed “shotgun proteomics”. For years, the most common way of data collection was via data-dependent acquisition (DDA), a process driven by an automated instrument control routine that directs MS/MS acquisition from the highest abundant signals to the lowest, a process often leaving lower abundant signals unanalyzed and therefore unidentified in the experiment. Advances in both instrumentation duty cycle and sensitivity allow DDA to probe to lower peptide abundance and therefore enable mapping proteomes to a more significant depth. An alternative to acquiring data by DDA is by data-independent acquisition (DIA), in which a specified range in m/z is fragmented without regard to prioritization of a precursor ion or its relative abundance in the mass spectrum. As a consequence, DIA acquisition potentially offers more comprehensive analysis of peptides than DDA and in principle can yield tandem mass spectra of all ionized molecules following their conversion to the gas-phase. In this work, we evaluate both DDA and DIA on three different linear ion trap instruments: an LTQ, an LTQ modified in-house with an electrodynamic ion funnel, and an LTQ-Velos. These instruments were chosen as they are representative of both older (LTQ) and newer (LTQ-Velos) ion trap designs i.e., linear ion trap and dual ion traps, respectively, and allow direct comparison of peptide identification using both DDA and DIA analysis. Further, as the LTQ-Velos has an improved “S-lens” ion guide in the high-pressure region to improve ion flux, we found it logical to determine if the former LTQ model could be leveraged by improving sensitivity by modifying with an electrodynamic ion guide of significantly different design to the S-lens. We find that the ion funnel enabled LTQ identifies more proteins in the insoluble fraction of a yeast lysate than the other two instruments in DIA mode, while the faster scanning LTQ-Velos performs better in DDA mode. We explore reasons for these results, including differences in scan speed, source ion optics, and linear ion trap design. PMID:25261218

  1. A neuromathematical model of human information processing and its application to science content acquisition

    NASA Astrophysics Data System (ADS)

    Anderson, O. Roger

    The rate of information processing during science learning and the efficiency of the learner in mobilizing relevant information in long-term memory as an aid in transmitting newly acquired information to stable storage in long-term memory are fundamental aspects of science content acquisition. These cognitive processes, moreover, may be substantially related in tempo and quality of organization to the efficiency of higher thought processes such as divergent thinking and problem-solving ability that characterize scientific thought. As a contribution to our quantitative understanding of these fundamental information processes, a mathematical model of information acquisition is presented and empirically evaluated in comparison to evidence obtained from experimental studies of science content acquisition. Computer-based models are used to simulate variations in learning parameters and to generate the theoretical predictions to be empirically tested. The initial tests of the predictive accuracy of the model show close agreement between predicted and actual mean recall scores in short-term learning tasks. Implications of the model for human information acquisition and possible future research are discussed in the context of the unique theoretical framework of the model.

  2. A novel intermediate in processing of murine leukemia virus envelope glycoproteins. Proteolytic cleavage in the late Golgi region.

    PubMed

    Bedgood, R M; Stallcup, M R

    1992-04-05

    The intracellular processing of the murine leukemia virus envelope glycoprotein precursor Pr85 to the mature products gp70 and p15e was analyzed in the mouse T-lymphoma cell line W7MG1. Kinetic (pulse-chase) analysis of synthesis and processing, coupled with endoglycosidase (endo H) and neuraminidase digestions revealed the existence of a novel high molecular weight processing intermediate, gp95, containing endo H-resistant terminally glycosylated oligosaccharide chains. In contrast to previously published conclusions, our data indicate that proteolytic cleavage of the envelope precursor occurs after the acquisition of endo H-resistant chains and terminal glycosylation and thus after the mannosidase II step. In the same W7MG1 cell line, the type and order of murine leukemia virus envelope protein processing events was identical to that for the mouse mammary tumor virus envelope protein. Interestingly, complete mouse mammary tumor virus envelope protein processing requires the addition of glucocorticoid hormone, whereas murine leukemia virus envelope protein processing occurs constitutively in these W7MG1 cells. We propose that all retroviral envelope proteins share a common processing pathway in which proteolytic processing is a late event that follows acquisition of endo H resistance and terminal glycosylation.

  3. Workload-Matched Adaptive Automation Support of Air Traffic Controller Information Processing Stages

    NASA Technical Reports Server (NTRS)

    Kaber, David B.; Prinzel, Lawrence J., III; Wright, Melanie C.; Clamann, Michael P.

    2002-01-01

    Adaptive automation (AA) has been explored as a solution to the problems associated with human-automation interaction in supervisory control environments. However, research has focused on the performance effects of dynamic control allocations of early stage sensory and information acquisition functions. The present research compares the effects of AA to the entire range of information processing stages of human operators, such as air traffic controllers. The results provide evidence that the effectiveness of AA is dependent on the stage of task performance (human-machine system information processing) that is flexibly automated. The results suggest that humans are better able to adapt to AA when applied to lower-level sensory and psychomotor functions, such as information acquisition and action implementation, as compared to AA applied to cognitive (analysis and decision-making) tasks. The results also provide support for the use of AA, as compared to completely manual control. These results are discussed in terms of implications for AA design for aviation.

  4. 48 CFR 15.101-2 - Lowest price technically acceptable source selection process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Lowest price technically acceptable source selection process. 15.101-2 Section 15.101-2 Federal Acquisition Regulations System FEDERAL... Processes and Techniques 15.101-2 Lowest price technically acceptable source selection process. (a) The...

  5. Quantitative analysis of geomorphic processes using satellite image data at different scales

    NASA Technical Reports Server (NTRS)

    Williams, R. S., Jr.

    1985-01-01

    When aerial and satellite photographs and images are used in the quantitative analysis of geomorphic processes, either through direct observation of active processes or by analysis of landforms resulting from inferred active or dormant processes, a number of limitations in the use of such data must be considered. Active geomorphic processes work at different scales and rates. Therefore, the capability of imaging an active or dormant process depends primarily on the scale of the process and the spatial-resolution characteristic of the imaging system. Scale is an important factor in recording continuous and discontinuous active geomorphic processes, because what is not recorded will not be considered or even suspected in the analysis of orbital images. If the geomorphic process, or the landform change caused by the process, is smaller than 200 m in the x or y dimension, then it will not be recorded. Although the scale factor is critical in the recording of discontinuous active geomorphic processes, the repeat interval of orbital-image acquisition of a planetary surface also is a consideration in order to capture a recurring short-lived geomorphic process or to record changes caused by either a continuous or a discontinuous geomorphic process.

  6. Acquisition and Tracking Behavior of Phase-Locked Loops

    NASA Technical Reports Server (NTRS)

    Viterbi, A. J.

    1958-01-01

    Phase-locked or APC loops have found increasing applications in recent years as tracking filters, synchronizing devices, and narrowband FM discriminators. Considerable work has been performed to determine the noise-squelching properties of the loop when it is operating in or near phase lock and is functioning as a linear coherent detector. However, insufficient consideration has been devoted to the non-linear behavior of the loop when it is out of lock and in the process of pulling in. Experimental evidence has indicated that there is a strong tendency for phase-locked loops to achieve lock under most circumstances. However, the analysis which has appeared in the literature is limited to the acquisition of a constant frequency reference signal with only one phase-locked loop filter configuration. This work presents an investigation of the frequency acquisition properties of phase-locked loops for a variety of reference-signal behaviors and loop configurations.
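    A toy numerical illustration of frequency acquisition (pull-in), not Viterbi's analysis: a second-order loop with a proportional-plus-integral filter tracking a carrier that starts with a 50 Hz offset. The gains, sampling step, and offset are arbitrary choices.

    ```python
    # Toy second-order PLL pull-in simulation; all parameters are illustrative.
    import numpy as np

    def pll_acquisition(f_offset=50.0, fs=10_000.0, duration=0.5, kp=200.0, ki=20_000.0):
        n = int(duration * fs)
        dt = 1.0 / fs
        ref_phase, vco_phase, integrator = 0.0, 0.0, 0.0
        freq_error = np.empty(n)
        for i in range(n):
            ref_phase += 2 * np.pi * f_offset * dt          # incoming carrier (baseband offset)
            err = np.sin(ref_phase - vco_phase)             # phase detector output
            integrator += ki * err * dt                     # loop-filter integral path
            control = kp * err + integrator                 # proportional + integral (rad/s)
            vco_phase += control * dt                       # VCO advances by the control frequency
            freq_error[i] = f_offset - control / (2 * np.pi)
        return freq_error                                   # tends toward 0 once the loop locks

    if __name__ == "__main__":
        err = pll_acquisition()
        print("residual frequency error (Hz):", err[-1])
    ```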

  7. Environmental Characterization for Target Acquisition. Report 2. Analysis of Thermal and Visible Imagery

    DTIC Science & Technology

    1993-11-01

    Remainder of record is optical-character-recognition residue from the report's table of contents and appendices (image metrics, analysis procedures, and Appendix A image processing software source code listings).

  8. Acoustic Signal Characteristics Measured with the LAMBDA III During CHURCH STROKE III

    DTIC Science & Technology

    1980-09-15

    analysis. Dr. William M. Carey and Dr. Richard Doolittle participated in various stages of acquisition, processing and analysis of the information...reported herein. Drs. Carey, Doolittle and Mr. Gereben are the authors of this report. (U) This report: Acoustic Signal Characteristics Measured with... Tortugas Terrace and the East Yucatan Channel, the Catoche Tongue and the Eastern region of the Gulf of Mexico. (U) The exercise was conducted by the Long

  9. An Analysis of Causes of Contract Price Change for Competitive Procurements of Replenishment Spare Parts.

    DTIC Science & Technology

    1984-09-01

    Research Study Number Three. In 1982, Mr. Edward Brost, then a graduate student at AFIT, completed a thesis which further analyzed the effects of sole...task, Brost formulated three research questions. They are as follows: 1. Is there a reduction in replenishment spare parts prices when competition is...analysis, Brost made three significant conclusions. They are: 1. The introduction of competition into the replenishment spare parts acquisition process

  10. Analysis of Rapid Acquisition Processes to Fulfill Future Urgent Needs

    DTIC Science & Technology

    2015-12-01

    Pickar, Ph.D. Brad Naegle, Academic Associate, Graduate School of Business and Public Policy. ANALYSIS OF...simply go back to business as usual. D. METHODOLOGY As a lens through which to review these regulations and policies, this research shows how the MRAP...the ineffective gear issued or procuring a perceived safer and more reliable piece of equipment. Some individuals even created their own weapon

  11. The Making of a Government LSI - From Warfare Capability to Operational System

    DTIC Science & Technology

    2015-04-30

    continues to evolve and implement Lead System Integrator (LSI) acquisition strategies, they have started to define numerous program initiatives that...employ more integrated engineering and management processes and techniques. These initiatives are developing varying acquisition approaches that define (1...government LSI transformation. Navy Systems Commands have begun adding a higher level of integration into their acquisition process with the

  12. Signal existence verification (SEV) for GPS low received power signal detection using the time-frequency approach.

    PubMed

    Jan, Shau-Shiun; Sun, Chih-Cheng

    2010-01-01

    The detection of low-received-power global positioning system (GPS) signals in the signal acquisition process is an important issue for GPS applications. Reducing the miss-detection of low-received-power signals is crucial, especially in urban or indoor environments. This paper proposes a signal existence verification (SEV) process to detect and subsequently verify low-received-power GPS signals. The SEV process is based on the time-frequency representation of the GPS signal, and it can capture the characteristics of the GPS signal in the time-frequency plane to enhance GPS signal acquisition performance. Several simulations and experiments are conducted to show the effectiveness of the proposed method for low-received-power signal detection. The contribution of this work is that the SEV process is an additional scheme that assists the GPS signal acquisition process in low-received-power signal detection, without changing the original signal acquisition or tracking algorithms.
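    The sketch below gives a generic time-frequency presence check in the spirit of the idea described above: form a spectrogram, average it over time, and declare a detection when any frequency bin exceeds a multiple of the median noise level. It is an illustration only, not the published SEV algorithm; the threshold rule and signal parameters are assumptions.

    ```python
    # Generic time-frequency detection sketch (not the published SEV method).
    import numpy as np
    from scipy.signal import spectrogram

    def tf_detect(x, fs, k=3.0):
        """Flag a detection if any time-averaged frequency bin exceeds k x the median level."""
        f, t, Sxx = spectrogram(x, fs=fs, nperseg=256)
        avg = Sxx.mean(axis=1)               # average power per frequency bin
        noise_floor = np.median(avg)         # robust noise-level estimate
        return bool(np.any(avg > k * noise_floor)), f, avg

    # Weak 1 kHz tone well below the noise amplitude:
    fs = 8192
    t = np.arange(fs) / fs
    x = 0.3 * np.sin(2 * np.pi * 1000 * t) + np.random.randn(t.size)
    detected, freqs, avg_power = tf_detect(x, fs)
    ```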

  13. [Real time 3D echocardiography]

    NASA Technical Reports Server (NTRS)

    Bauer, F.; Shiota, T.; Thomas, J. D.

    2001-01-01

    Three-dimensional representation of the heart is a long-standing goal. Usually, 3D reconstruction of the cardiac mass is made by successive acquisition of 2D sections, the spatial localisation and orientation of which require complex guiding systems. More recently, the concept of volumetric acquisition has been introduced. A matricial emitter-receiver probe complex with parallel data processing provides instantaneous acquisition of a pyramidal 64 degrees x 64 degrees volume. The image is rendered in real time and is composed of 3 planes (including planes B and C) which can be displaced in all spatial directions at any time during acquisition. The flexibility of this system of acquisition allows volume and mass measurement with greater accuracy and reproducibility, limiting inter-observer variability. Free navigation of the planes of investigation allows reconstruction for qualitative and quantitative analysis of valvular heart disease and other pathologies. Although real time 3D echocardiography is ready for clinical use, some improvements are still necessary to enhance its user-friendliness. Real time 3D echocardiography could then become an essential tool for the understanding, diagnosis and management of patients.

  14. First Language Acquisition and Teaching

    ERIC Educational Resources Information Center

    Cruz-Ferreira, Madalena

    2011-01-01

    "First language acquisition" commonly means the acquisition of a single language in childhood, regardless of the number of languages in a child's natural environment. Language acquisition is variously viewed as predetermined, wondrous, a source of concern, and as developing through formal processes. "First language teaching" concerns schooling in…

  15. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal... contracts not to exceed the simplified acquisition threshold. Either of the procedures provided in FAR 36... simplified acquisition threshold. ...

  16. 48 CFR 3046.792 - Cost benefit analysis (USCG).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Cost benefit analysis (USCG). 3046.792 Section 3046.792 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND SECURITY, HOMELAND SECURITY ACQUISITION REGULATION (HSAR) CONTRACT MANAGEMENT QUALITY ASSURANCE Warranties 3046.792...

  17. Data acquisition instrument for EEG based on embedded system

    NASA Astrophysics Data System (ADS)

    Toresano, La Ode Husein Z.; Wijaya, Sastra Kusuma; Prawito, Sudarmaji, Arief; Syakura, Abdan; Badri, Cholid

    2017-02-01

    An electroencephalogram (EEG) system measures and records the electrical activity of the brain, and the recorded EEG signals can be used to analyze brain function. The purpose of this study was to design a portable multichannel EEG based on an embedded system and the ADS1299. The ADS1299 is an analog front-end used as an analog-to-digital converter (ADC) to digitize the analog signals of the brain's electrical activity, to filter the signals to reduce noise in the low-frequency band, and to communicate the data to the microcontroller. The system was tested by capturing brain signals in the 1-20 Hz range using the NETECH EEG simulator 330, and the developed system achieved a relatively high accuracy of more than 82.5%. The EEG instrument was successfully implemented to acquire brain signal activity, with a PC (Personal Computer) connection used to display the recorded data. The acquired data were processed in the OpenBCI GUI (Graphical User Interface) in real time for 8-channel signal acquisition, brain mapping and power spectral decomposition using the standard FFT (Fast Fourier Transform) algorithm.
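    As a sketch of the final processing step described above (power spectral decomposition of the 8-channel signals with the standard FFT), the snippet below computes a Welch power spectral density per channel on synthetic data. The 250 samples-per-second rate and the alpha-band example are assumptions, not values from the paper.

    ```python
    # Per-channel power spectral decomposition sketch for multichannel EEG.
    import numpy as np
    from scipy.signal import welch

    FS = 250                 # assumed output data rate in samples/s
    N_CHANNELS = 8

    # Synthetic 10-s recording: alpha-band (10 Hz) activity plus noise.
    t = np.arange(10 * FS) / FS
    eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(N_CHANNELS, t.size)

    freqs, psd = welch(eeg, fs=FS, nperseg=2 * FS, axis=-1)   # one PSD per channel
    alpha = (freqs >= 8) & (freqs <= 13)
    alpha_power = psd[:, alpha].sum(axis=1)                   # per-channel alpha-band power
    ```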

  18. AV-8B Remanufacture Program as Part of the Audit of the Defense Acquisition Board Review Process - FY 1994

    DTIC Science & Technology

    1994-06-03

    OFFICE OF THE INSPECTOR GENERAL AV-8B REMANUFACTURE PROGRAM AS PART OF THE AUDIT OF THE DEFENSE ACQUISITION BOARD...Part of the Audit of the Defense Acquisition Board Review Process - FY 1994. Report Downloaded From the Internet: 03/23/99...NAVY FOR RESEARCH DEVELOPMENT AND ACQUISITION SUBJECT: Audit Report on the AV-8B Remanufacture Program as Part of the Audit of the Defense

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tenney, J.L.

    SARS is a data acquisition system designed to gather and process radar data from aircraft flights. A database of flight trajectories has been developed for Albuquerque, NM, and Amarillo, TX. The data is used for safety analysis and risk assessment reports. To support this database effort, Sandia developed a collection of hardware and software tools to collect and post process the aircraft radar data. This document describes the data reduction tools which comprise the SARS, and maintenance procedures for the hardware and software system.

  20. Learning (Not) to Predict: Grammatical Gender Processing in Second Language Acquisition

    ERIC Educational Resources Information Center

    Hopp, Holger

    2016-01-01

    In two experiments, this article investigates the predictive processing of gender agreement in adult second language (L2) acquisition. We test (1) whether instruction on lexical gender can lead to target predictive agreement processing and (2) how variability in lexical gender representations moderates L2 gender agreement processing. In a…

  1. Assessment of Navy Contract Management Processes

    DTIC Science & Technology

    2016-02-22

    Assessment of Navy Contract Management Processes, 22 February 2016, Dr. Rene G. Rendon, Associate Professor, Graduate School of Business ...Know) for each survey item in each contract management process area. ...management process. Figure 1. U.S. Navy CMMM Maturity Levels.

  2. ATS-6 - Preliminary results from the 13/18-GHz COMSAT Propagation Experiment

    NASA Technical Reports Server (NTRS)

    Hyde, G.

    1975-01-01

    The 13/18-GHz COMSAT Propagation Experiment (CPE) is reviewed, the data acquisition and processing are discussed, and samples of preliminary results are presented. The need for measurements of both hydrometeor-induced attenuation statistics and diversity effectiveness is brought out. The facilitation of the experiment - CPE dual frequency and diversity site location, the CPE ground transmit terminals, the CPE transponder on Applications Technology Satellite-6 (ATS-6), and the CPE receive and data acquisition system - is briefly examined. The on-line preprocessing of the received signal is reviewed, followed by a discussion of the off-line processing of this database to remove signal fluctuations not due to hydrometeors. Finally, samples of the results of first-level analysis of the resultant data for the 18-GHz diversity site near Boston, Mass., and for the dual frequency 13/18-GHz site near Detroit, Mich., are presented and discussed.

  3. Acquisition of background and technical information and class trip planning

    NASA Technical Reports Server (NTRS)

    Mackinnon, R. M.; Wake, W. H.

    1981-01-01

    Instructors who are very familiar with a study area, as well as those who are not, find the field trip information acquisition and planning process speeded and made more effective by organizing it in stages. The stages follow a deductive progression: from the associated context region, to the study area, to the specific sample window sites, and from generalized background information on the study region to specific technical data on the environmental and human use systems to be interpreted at each site. On the class trip and in the follow-up laboratory, the learning/interpretive processes are at first deductive, in applying previously learned information and skills to analysis of the study site, then inductive, in reading and interpreting the landscape, imagery, and maps of the site, correlating them with information from other sample sites, and building valid generalizations about the larger study area, its context region, and other (similar and/or contrasting) regions.

  4. Acquisition of Scientific Literature in Developing Countries. 2: Malaysia.

    ERIC Educational Resources Information Center

    Taib, Rosna

    1989-01-01

    Describes the acquisition of scientific literature by academic libraries in Malaysia. The discussion covers the impact of government policies, library acquisition policies, the selection process, acquisition of special materials, the role of gifts and exchanges, and problems with customs clearance and censorship. Progress in cooperative…

  5. TH-E-17A-07: Improved Cine Four-Dimensional Computed Tomography (4D CT) Acquisition and Processing Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castillo, S; Castillo, R; Castillo, E

    2014-06-15

    Purpose: Artifacts arising from the 4D CT acquisition and post-processing methods add systematic uncertainty to the treatment planning process. We propose an alternate cine 4D CT acquisition and post-processing method to consistently reduce artifacts, and explore patient parameters indicative of image quality. Methods: In an IRB-approved protocol, 18 patients with primary thoracic malignancies received a standard cine 4D CT acquisition followed by an oversampling 4D CT that doubled the number of images acquired. A second cohort of 10 patients received the clinical 4D CT plus 3 oversampling scans for intra-fraction reproducibility. The clinical acquisitions were processed by the standard phase sorting method. The oversampling acquisitions were processed using Dijkstra's algorithm to optimize an artifact metric over the available image data. Image quality was evaluated with a one-way mixed ANOVA model using a correlation-based artifact metric calculated from the final 4D CT image sets. Spearman correlations and a linear mixed model tested the association between breathing parameters, patient characteristics, and image quality. Results: The oversampling 4D CT scans reduced artifact presence significantly, by 27% and 28% for the first and second cohorts, respectively. From cohort 2, the inter-replicate deviation for the oversampling method was within approximately 13% of the cross-scan average at the 0.05 significance level. Artifact presence for both clinical and oversampling methods was significantly correlated with breathing period (ρ=0.407, p-value<0.032 clinical; ρ=0.296, p-value<0.041 oversampling). Artifact presence in the oversampling method was significantly correlated with the amount of data acquired (ρ=-0.335, p-value<0.02), indicating decreased artifact presence with increased breathing cycles per scan location. Conclusion: The 4D CT oversampling acquisition with optimized sorting reduced artifact presence significantly and reproducibly compared to the phase-sorted clinical acquisition.
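
    As a rough illustration of the reported correlation test (not the study's data or code), the Spearman analysis between breathing period and an artifact metric can be set up as follows; all values below are synthetic placeholders.

```python
# Sketch of the correlation test relating artifact presence to breathing
# period; the arrays below are synthetic placeholders, not the study's data.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
breathing_period_s = rng.uniform(2.5, 6.0, size=18)          # one value per patient
artifact_metric = 0.1 * breathing_period_s + rng.normal(0, 0.05, size=18)

rho, p_value = spearmanr(breathing_period_s, artifact_metric)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.3f}")
```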

  6. Metabolic profiling of body fluids and multivariate data analysis.

    PubMed

    Trezzi, Jean-Pierre; Jäger, Christian; Galozzi, Sara; Barkovits, Katalin; Marcus, Katrin; Mollenhauer, Brit; Hiller, Karsten

    2017-01-01

    Metabolome analyses of body fluids are challenging due to pre-analytical variations, such as pre-processing delay and temperature, and the constant dynamic changes of biochemical processes within the samples. Therefore, proper sample handling, starting from the time of collection up to the analysis, is crucial to obtain high-quality samples and reproducible results. A metabolomics analysis is divided into 4 main steps: 1) sample collection, 2) metabolite extraction, 3) data acquisition and 4) data analysis. Here, we describe a protocol for gas chromatography coupled to mass spectrometry (GC-MS) based metabolic analysis of biological matrices, especially body fluids. This protocol can be applied to blood serum/plasma, saliva and cerebrospinal fluid (CSF) samples of humans and other vertebrates. It covers sample collection, sample pre-processing, metabolite extraction, GC-MS measurement and guidelines for the subsequent data analysis. Advantages of this protocol include:
    • Robust and reproducible metabolomics results, taking into account pre-analytical variations that may occur during the sampling process
    • Small sample volume required
    • Rapid and cost-effective processing of biological samples
    • Logistic regression based determination of biomarker signatures for in-depth data analysis
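
    The protocol's final step, logistic-regression-based determination of biomarker signatures, can be sketched as follows; the metabolite matrix, labels, and the choice of an L1 penalty are illustrative assumptions, not part of the published protocol.

```python
# Sketch of a logistic-regression step for deriving a biomarker signature
# from GC-MS metabolite intensities; all data below are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.lognormal(mean=0.0, sigma=1.0, size=(60, 20))   # 60 samples x 20 metabolites
y = rng.integers(0, 2, size=60)                          # case/control labels

model = make_pipeline(StandardScaler(),
                      LogisticRegression(penalty="l1", solver="liblinear"))
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"Cross-validated AUC: {scores.mean():.2f}")

# The nonzero coefficients of the fitted model form the candidate signature.
model.fit(X, y)
signature = np.flatnonzero(model.named_steps["logisticregression"].coef_[0])
print("Metabolites retained in the signature:", signature)
```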

  7. Progressive Recruitment of Mesenchymal Progenitors Reveals a Time-Dependent Process of Cell Fate Acquisition in Mouse and Human Nephrogenesis.

    PubMed

    Lindström, Nils O; De Sena Brandine, Guilherme; Tran, Tracy; Ransick, Andrew; Suh, Gio; Guo, Jinjin; Kim, Albert D; Parvez, Riana K; Ruffins, Seth W; Rutledge, Elisabeth A; Thornton, Matthew E; Grubbs, Brendan; McMahon, Jill A; Smith, Andrew D; McMahon, Andrew P

    2018-06-04

    Mammalian nephrons arise from a limited nephron progenitor pool through a reiterative inductive process extending over days (mouse) or weeks (human) of kidney development. Here, we present evidence that human nephron patterning reflects a time-dependent process of recruitment of mesenchymal progenitors into an epithelial nephron precursor. Progressive recruitment predicted from high-resolution image analysis and three-dimensional reconstruction of human nephrogenesis was confirmed through direct visualization and cell fate analysis of mouse kidney organ cultures. Single-cell RNA sequencing of the human nephrogenic niche provided molecular insights into these early patterning processes and predicted developmental trajectories adopted by nephron progenitor cells in forming segment-specific domains of the human nephron. The temporal-recruitment model for nephron polarity and patterning suggested by direct analysis of human kidney development provides a framework for integrating signaling pathways driving mammalian nephrogenesis. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. Preliminary System Analysis of In Situ Resource Utilization for Mars Human Exploration

    NASA Technical Reports Server (NTRS)

    Rapp, Donald; Andringa, Jason; Easter, Robert; Smith, Jeffrey H.; Wilson, Thomas; Clark, D. Larry; Payne, Kevin

    2005-01-01

    We carried out a system analysis of processes for utilization of Mars resources to support human exploration of Mars by production of propellants from indigenous resources. Seven ISRU processes were analyzed to determine mass, power, and propellant storage volume requirements. The major elements of each process include CO2 acquisition, chemical conversion, and storage of propellants. Based on a figure of merit (the ratio of the mass of propellants that must be brought from Earth in a non-ISRU mission to the mass of the ISRU system, tanks, and feedstocks that must be brought from Earth for an ISRU mission), the most attractive process (by far) is one where indigenous Mars water is accessible and is processed via Sabatier/electrolysis to methane and oxygen. These processes are technically relatively mature. Other processes with positive leverage involve reverse water gas shift and solid oxide electrolysis.

  9. Nuclear Magnetic Resonance Spectroscopy-Based Identification of Yeast.

    PubMed

    Himmelreich, Uwe; Sorrell, Tania C; Daniel, Heide-Marie

    2017-01-01

    Rapid and robust high-throughput identification of environmental, industrial, or clinical yeast isolates is important whenever relatively large numbers of samples need to be processed in a cost-efficient way. Nuclear magnetic resonance (NMR) spectroscopy generates complex data based on metabolite profiles, chemical composition and possibly medium consumption, which can be used not only for the assessment of metabolic pathways but also for accurate identification of yeast down to the subspecies level. Initial results on NMR-based yeast identification were comparable with conventional and DNA-based identification. Potential advantages of NMR spectroscopy in mycological laboratories include not only accurate identification but also the potential for automated sample delivery, automated analysis using computer-based methods, rapid turnaround time, high throughput, and low running costs. We describe here the sample preparation, data acquisition and analysis for NMR-based yeast identification. In addition, a roadmap for the development of classification strategies is given that will result in the acquisition of a database and analysis algorithms for yeast identification in different environments.

  10. Asynchronous data acquisition and on-the-fly analysis of dose fractionated cryoEM images by UCSFImage

    PubMed Central

    Li, Xueming; Zheng, Shawn; Agard, David A.; Cheng, Yifan

    2015-01-01

    Newly developed direct electron detection cameras have a high image output frame rate that enables recording dose fractionated image stacks of frozen hydrated biological samples by electron cryomicroscopy (cryoEM). Such novel image acquisition schemes provide opportunities to analyze cryoEM data in ways that were previously impossible. The file size of a dose fractionated image stack is 20 ~ 60 times larger than that of a single image. Thus, efficient data acquisition and on-the-fly analysis of a large number of dose-fractionated image stacks become a serious challenge to any cryoEM data acquisition system. We have developed a computer-assisted system, named UCSFImage4, for semi-automated cryo-EM image acquisition that implements an asynchronous data acquisition scheme. This facilitates efficient acquisition, on-the-fly motion correction, and CTF analysis of dose fractionated image stacks with a total time of ~60 seconds/exposure. Here we report the technical details and configuration of this system. PMID:26370395
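
    The record does not disclose UCSFImage4's internals beyond the description above, but the general asynchronous pattern it describes, overlapping acquisition with on-the-fly processing, can be sketched with a queue and a worker thread; the function and file names below are placeholders.

```python
# Generic producer/consumer sketch of asynchronous acquisition: exposures are
# queued as they finish so that motion correction and CTF analysis can run in
# parallel. Names are illustrative placeholders, not the UCSFImage4 API.
import queue
import threading
import time

stack_queue: "queue.Queue[str]" = queue.Queue()

def acquire(n_exposures: int) -> None:
    for i in range(n_exposures):
        time.sleep(0.1)                      # stand-in for camera exposure time
        stack_queue.put(f"stack_{i:04d}.mrc")
    stack_queue.put(None)                    # sentinel: acquisition finished

def process() -> None:
    while True:
        stack = stack_queue.get()
        if stack is None:
            break
        # stand-in for on-the-fly motion correction and CTF estimation
        print(f"processing {stack}")

worker = threading.Thread(target=process)
worker.start()
acquire(5)
worker.join()
```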

  11. "It's Her Body". When Students' Argumentation Shows Displacement of Content in a Science Classroom

    ERIC Educational Resources Information Center

    Orlander Arvola, Auli; Lundegard, Iann

    2012-01-01

    This paper approaches learning as a response instead of the acquisition of something previously expected. More specifically, it describes a process of argumentation on socioscientific issues in a classroom situation in school science amongst 15-year-old students in Sweden. The analysis of an argumentation on abortion in a science classroom…

  12. System integration of marketable subsystems

    NASA Technical Reports Server (NTRS)

    1978-01-01

    These monthly reports, covering the period February 1978 through June 1978, describe the progress made in the major areas of the program. The areas covered are: systems integration of marketable subsystems; development, design, and building of site data acquisition subsystems; development and operation of the central data processing system; operation of the MSFC Solar Test Facility; and systems analysis.

  13. Digital Avionics Information System (DAIS): Mid-1980's Maintenance Task Analysis. Final Report.

    ERIC Educational Resources Information Center

    Czuchry, Andrew J.; And Others

    The fundamental objective of the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study is to provide the Air Force with an enhanced in-house capability to incorporate LCC considerations during all stages of the system acquisition process. The purpose of this report is to describe the technical approach, results, and conclusions…

  14. Mass spectroscopic apparatus and method

    DOEpatents

    Bomse, David S.; Silver, Joel A.; Stanton, Alan C.

    1991-01-01

    The disclosure is directed to a method and apparatus for ionization modulated mass spectrometric analysis. Analog or digital data acquisition and processing can be used. Ions from a time variant source are detected and quantified. The quantified ion output is analyzed using a computer to provide a two-dimensional representation of at least one component present within an analyte.

  15. Use of aerial thermography in Canadian energy conservation programs

    NASA Technical Reports Server (NTRS)

    Cihlar, J.; Brown, R. J.; Lawrence, G.; Barry, J. N.; James, R. B.

    1977-01-01

    Recent developments in the use of aerial thermography in energy conservation programs within Canada were summarized. Following a brief review of studies conducted during the last three years, methodologies of data acquisition, processing, analysis and interpretation were discussed. Examples of results from an industry-oriented project were presented and recommendations for future basic work were outlined.

  16. Exploitation of realistic computational anthropomorphic phantoms for the optimization of nuclear imaging acquisition and processing protocols.

    PubMed

    Loudos, George K; Papadimitroulas, Panagiotis G; Kagadis, George C

    2014-01-01

    Monte Carlo (MC) simulations play a crucial role in nuclear medical imaging since they can provide the ground truth for clinical acquisitions by integrating and quantifying all physical parameters that affect image quality. Over the last decade a number of realistic computational anthropomorphic models have been developed to serve imaging, as well as other biomedical engineering applications. The combination of MC techniques with realistic computational phantoms can provide a powerful tool for pre- and post-processing in imaging, data analysis and dosimetry. This work aims to create a global database of simulated Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) exams; the methodology, as well as the first elements, are presented. Simulations are performed using the well-validated GATE open-source toolkit, standard anthropomorphic phantoms and activity distributions of various radiopharmaceuticals derived from the literature. The resulting images, projections and sinograms of each study are provided in the database and can be further exploited to evaluate processing and reconstruction algorithms. Patient studies with different characteristics are included in the database, and different computational phantoms were tested for the same acquisitions. These include the XCAT, Zubal and the Virtual Family, some of which are used for the first time in nuclear imaging. The created database will be freely available, and our current work is towards its extension by simulating additional clinical pathologies.

  17. Auditory Processing Disorder and Foreign Language Acquisition

    ERIC Educational Resources Information Center

    Veselovska, Ganna

    2015-01-01

    This article aims at exploring various strategies for coping with the auditory processing disorder in the light of foreign language acquisition. The techniques relevant to dealing with the auditory processing disorder can be attributed to environmental and compensatory approaches. The environmental one involves actions directed at creating a…

  18. Software Acquisition: Evolution, Total Quality Management, and Applications to the Army Tactical Missile System

    DTIC Science & Technology

    1992-06-01

    presents the concept of software Total Quality Management (TQM) which focuses on the entire process of software acquisition, as a partial solution to...software TQM can be applied to software acquisition. Software Development, Software Acquisition, Total Quality management (TQM), Army Tactical Missile

  19. 32 CFR 700.326 - The Assistant Secretary of the Navy (Research, Development and Acquisition).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., development and acquisition, except for military requirements and operational test and evaluation; (b) Direct management of acquisition programs; (c) All aspects of the acquisition process within the Department of the..., procurement, competition, contracts and business management, logistics, product integrity, and education and...

  20. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5... for contracts not to exceed the simplified acquisition threshold. At each occurrence, CO approval...-engineer contracts not expected to exceed the simplified acquisition threshold. ...

  1. Image processing and 3D visualization in the interpretation of patterned injury of the skin

    NASA Astrophysics Data System (ADS)

    Oliver, William R.; Altschuler, Bruce R.

    1995-09-01

    The use of image processing is becoming increasingly important in the evaluation of violent crime. While much work has been done in the use of these techniques for forensic purposes outside of forensic pathology, its use in the pathologic examination of wounding has been limited. We are investigating the use of image processing in the analysis of patterned injuries and tissue damage. Our interests are currently concentrated on 1) the use of image processing techniques to aid the investigator in observing and evaluating patterned injuries in photographs, 2) measurement of the 3D shape characteristics of surface lesions, and 3) correlation of patterned injuries with deep tissue injury as a problem in 3D visualization. We are beginning investigations in data-acquisition problems for performing 3D scene reconstructions from the pathology perspective of correlating tissue injury to scene features and trace evidence localization. Our primary tool for correlation of surface injuries with deep tissue injuries has been the comparison of processed surface injury photographs with 3D reconstructions from antemortem CT and MRI data. We have developed a prototype robot for the acquisition of 3D wound and scene data.

  2. An Analysis of Alternatives for Reducing Outpatient Military Health Care Costs for Active Duty Members and their Families: Implementing a Recommended Savings Strategy Using Defense Acquisition Principles

    DTIC Science & Technology

    2007-12-01

    the degree of MASTER OF BUSINESS ADMINISTRATION from the NAVAL POSTGRADUATE SCHOOL December 2007 Authors...Graduate School of Business and Public Policy iv THIS PAGE INTENTIONALLY LEFT BLANK v AN ANALYSIS OF ALTERNATIVES FOR REDUCING OUTPATIENT...through this arduous process. We would also like to give a special thanks to Amelia and Catalina. Without your support and understanding, this

  3. Characterizing Impacts of Land Grabbing on Terrestrial Vegetation and Ecohydrologic change in Mozambique through Multiple-sensor Remote Sensing and Models

    NASA Astrophysics Data System (ADS)

    Flores, A. N.; Lakshmi, V.; Al-Barakat, R.; Maksimowicz, M.

    2017-12-01

    Land grabbing, the acquisition of large areas of land by external entities, results from interactions of complex global economic, social, and political processes. These transactions are controversial because they can result in large-scale disruptions to historical land uses, including increased intensity of agricultural practices and significant conversions in land cover. These large-scale disruptions have the potential to impact surface water and energy balance because vegetation controls the partitioning of incoming energy into latent and sensible heat fluxes and precipitation into runoff and infiltration. Because large-scale land acquisitions can impact local ecosystem services, it is important to document changes in terrestrial vegetation associated with these acquisitions to support the assessment of associated impacts on regional surface water and energy balance, the spatiotemporal scales of those changes, and interactions and feedbacks with other processes, particularly in the atmosphere. We use remote sensing data from multiple satellite platforms to diagnose and characterize changes in terrestrial vegetation and ecohydrology in Mozambique during periods that bracket significant land acquisitions. The Advanced Very High Resolution Radiometer (AVHRR) sensor provides long-term continuous data that can document historical seasonal cycles of vegetation greenness. These data are augmented with analyses from Landsat multispectral data, which provide significantly higher spatial resolution. Here we quantify the spatiotemporal changes in vegetation that are associated with periods of significant land acquisitions in Mozambique. This analysis complements a suite of land-atmosphere modeling experiments designed to deduce potential changes in land surface water and energy budgets associated with these acquisitions. This work advances understanding of the telecouplings between global economic and political forcings and regional hydrology and climate.
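
    The record does not name the greenness index used; as one common choice, an index such as NDVI can be computed per scene and tracked through time, as in the following sketch with synthetic reflectance bands.

```python
# Sketch of a common greenness index (NDVI) that could be tracked through time
# from AVHRR or Landsat bands; the reflectance arrays here are synthetic.
import numpy as np

rng = np.random.default_rng(2)
red = rng.uniform(0.05, 0.2, size=(100, 100))    # red reflectance
nir = rng.uniform(0.2, 0.5, size=(100, 100))     # near-infrared reflectance

ndvi = (nir - red) / (nir + red)
print(f"Scene-mean NDVI: {ndvi.mean():.2f}")
```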

  4. 48 CFR 1401.7001-4 - Acquisition performance measurement systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...

  5. 48 CFR 1401.7001-4 - Acquisition performance measurement systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...

  6. 48 CFR 1401.7001-4 - Acquisition performance measurement systems.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...

  7. 48 CFR 1401.7001-4 - Acquisition performance measurement systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...

  8. Research on control law accelerator of digital signal process chip TMS320F28035 for real-time data acquisition and processing

    NASA Astrophysics Data System (ADS)

    Zhao, Shuangle; Zhang, Xueyi; Sun, Shengli; Wang, Xudong

    2017-08-01

    The TI C2000 series digital signal processor (DSP) chips have been widely used in electrical engineering, measurement and control, communications and other professional fields, and the DSP TMS320F28035 is one of the most representative of this kind. A DSP program typically needs both data acquisition and data processing; if ordinary C or assembly language programming is used, the program runs sequentially, the analogue-to-digital (AD) converter cannot acquire data in real time, and a lot of data are often missed. The control law accelerator (CLA) coprocessor can run in parallel with the main central processing unit (CPU), operates at the same frequency as the main CPU, and supports floating-point operations. Therefore, the CLA coprocessor is used in the program: the CLA core is responsible for data processing, while the main CPU is responsible for the AD conversion. The advantage of this method is that it reduces data-processing time and achieves real-time data acquisition.

  9. SHARP's systems engineering challenge: rectifying integrated product team requirements with performance issues in an evolutionary spiral development acquisition

    NASA Astrophysics Data System (ADS)

    Kuehl, C. Stephen

    2003-08-01

    Completing its final development and early deployment on the Navy's multi-role aircraft, the F/A-18 E/F Super Hornet, the SHAred Reconnaissance Pod (SHARP) provides the war fighter with the latest digital tactical reconnaissance (TAC Recce) Electro-Optical/Infrared (EO/IR) sensor system. The SHARP program is an evolutionary acquisition that used a spiral development process across a prototype development phase tightly coupled into overlapping Engineering and Manufacturing Development (EMD) and Low Rate Initial Production (LRIP) phases. Under a tight budget environment with a highly compressed schedule, SHARP challenged traditional acquisition strategies and systems engineering (SE) processes. Adopting tailored state-of-the-art systems engineering process models allowed the SHARP program to overcome the technical knowledge transition challenges imposed by a compressed program schedule. The program's original goal was the deployment of digital TAC Recce mission capabilities to the fleet customer by summer of 2003. Hardware and software integration technical challenges resulted from requirements definition and analysis activities performed across a government-industry led Integrated Product Team (IPT) involving Navy engineering and test sites, Boeing, and RTSC-EPS (with its subcontracted hardware and government furnished equipment vendors). Requirements development from a bottom-up approach was adopted using an electronic requirements capture environment to clarify and establish the SHARP EMD product baseline specifications as relevant technical data became available. Applying Earned-Value Management (EVM) against an Integrated Master Schedule (IMS) resulted in efficiently managing SE task assignments and product deliveries in a dynamically evolving customer requirements environment. Application of Six Sigma improvement methodologies resulted in the uncovering of root causes of errors in wiring interconnectivity drawings, pod manufacturing processes, and avionics requirements specifications. Utilizing the draft NAVAIR SE guideline handbook and the ANSI/EIA-632 standard: Processes for Engineering a System, a systems engineering tailored process approach was adopted for the accelerated SHARP EMD program. Tailoring SE processes in this accelerated product delivery environment provided unique opportunities to be technically creative in the establishment of a product performance baseline. This paper provides an historical overview of the systems engineering activities spanning the prototype phase through the EMD SHARP program phase, the performance requirement capture activities and refinement process challenges, and what SE process improvements can be applied to future SHARP-like programs adopting a compressed, evolutionary spiral development acquisition paradigm.

  10. Software manual for operating particle displacement tracking data acquisition and reduction system

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    1991-01-01

    The software manual is presented. The necessary steps required to record, analyze, and reduce Particle Image Velocimetry (PIV) data using the Particle Displacement Tracking (PDT) technique are described. The new PDT system is an all-electronic technique employing a CCD video camera and a large memory buffer frame-grabber board to record low velocity (less than or equal to 20 cm/s) flows. Using a simple encoding scheme, a time sequence of single-exposure images is time-coded into a single image and then processed to track particle displacements and determine 2-D velocity vectors. All the PDT data acquisition, analysis, and data reduction software is written to run on an 80386 PC.

  11. Incident reporting: Its role in aviation safety and the acquisition of human error data

    NASA Technical Reports Server (NTRS)

    Reynard, W. D.

    1983-01-01

    The rationale for aviation incident reporting systems is presented and contrasted to some of the shortcomings of accident investigation procedures. The history of the United States' Aviation Safety Reporting System (ASRS) is outlined and the program's character explained. The planning elements that resulted in the ASRS program's voluntary, confidential, and non-punitive design are discussed. Immunity from enforcement action and from misuse of the volunteered data is explained and evaluated. Report generation techniques and the ASRS data analysis process are described; in addition, examples of the ASRS program's output and accomplishments are detailed. Finally, the value of incident reporting for the acquisition of safety information, particularly human error data, is explored.

  12. Contributions of the Study of Japanese as a Second language to our General Understanding of Second Language Acquisition and the Definition of Second Language Acquisition Research.

    ERIC Educational Resources Information Center

    Wakabayashi, Shigenori

    2003-01-01

    Reviews three books on the acquisition of Japanese as a second language: "Second Language Acquisition Process in the Classroom" by A.S. Ohta;"The Acquisition of Grammar by Learners of Japanese" (English translation of title), by H. Noda, K. Sakoda, K. Shibuya, and N. Kobayashi; and "The Acquisition of Japanese as a Second Language," B. K. Kanno,…

  13. Introduction to Defense Acquisition Management

    DTIC Science & Technology

    1989-03-01

    ...the system after ...formatted and its usefulness in the weapon inventory... Most new systems follow the same predictable life cycle and fit the model ...natives for system concept development ...Statement (MNS) setting forth requirements needed to meet the... An acquisition strategy is developed to guide... BUSINESS, FINANCIAL AND TECHNICAL ASPECTS OF SYSTEMS ACQUISITION: Management of the systems acquisition process. The acquisition planning phase of the

  14. P2P-Based Data System for the EAST Experiment

    NASA Astrophysics Data System (ADS)

    Shu, Yantai; Zhang, Liang; Zhao, Weifeng; Chen, Haiming; Luo, Jiarong

    2006-06-01

    A peer-to-peer (P2P)-based EAST Data System is being designed to provide data acquisition and analysis support for the EAST superconducting tokamak. Instead of transferring data to the servers, all collected data are stored in the data acquisition subsystems locally and the PC clients can access the raw data directly using the P2P architecture. Both online and offline systems are based on Napster-like P2P architecture. This allows the peer (PC) to act both as a client and as a server. A simulation-based method and a steady-state operational analysis technique are used for performance evaluation. These analyses show that the P2P technique can significantly reduce the completion time of raw data display and real-time processing on the online system, and raise the workload capacity and reduce the delay on the offline system.

  15. Silhouette-based approach of 3D image reconstruction for automated image acquisition using robotic arm

    NASA Astrophysics Data System (ADS)

    Azhar, N.; Saad, W. H. M.; Manap, N. A.; Saad, N. M.; Syafeeza, A. R.

    2017-06-01

    This study presents an approach to 3D image reconstruction using an autonomous robotic arm for the image acquisition process. A low-cost automated imaging platform is created using a pair of G15 servo motors connected in series to an Arduino UNO as the main microcontroller. Two sets of sequential images were obtained using different projection angles of the camera. The silhouette-based approach is used in this study for 3D reconstruction from the sequential images captured from several different angles of the object. In addition, an analysis of the effect of different numbers of sequential images on the accuracy of the 3D model reconstruction was carried out with a fixed projection angle of the camera. The factors affecting the 3D reconstruction are discussed and the overall result of the analysis is presented for the prototype imaging platform.
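
    A minimal sketch of the silhouette-based idea (a voxel survives only if it projects inside the silhouette in every view) is given below; it assumes simple orthographic projections about a vertical axis rather than the calibrated geometry of the servo-driven platform.

```python
# Minimal voxel-carving sketch of silhouette-based reconstruction: a voxel is
# kept only if it projects inside the silhouette in every view. Orthographic
# projection about the vertical axis is assumed for simplicity.
import numpy as np

def carve(silhouettes, angles_deg, grid_size=64, half_extent=1.0):
    """Keep voxels whose projection lies inside every silhouette (visual hull)."""
    lin = np.linspace(-half_extent, half_extent, grid_size)
    X, Y, Z = np.meshgrid(lin, lin, lin, indexing="ij")
    keep = np.ones(X.shape, dtype=bool)
    h, w = silhouettes[0].shape
    for sil, ang in zip(silhouettes, np.radians(angles_deg)):
        # Rotate the grid about the vertical (z) axis, then project onto (x, z).
        xr = X * np.cos(ang) - Y * np.sin(ang)
        u = np.clip((xr + half_extent) / (2 * half_extent) * (w - 1), 0, w - 1).astype(int)
        v = np.clip((Z + half_extent) / (2 * half_extent) * (h - 1), 0, h - 1).astype(int)
        keep &= sil[v, u] > 0
    return keep

# Synthetic check: a centered disc seen from all angles carves a cylinder-like hull.
yy, xx = np.ogrid[:64, :64]
disc = ((xx - 32) ** 2 + (yy - 32) ** 2 < 20 ** 2).astype(float)
hull = carve([disc] * 8, angles_deg=np.linspace(0, 315, 8))
print("occupied voxels:", int(hull.sum()))
```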

  16. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... reliability of its estimating and accounting systems. [63 FR 55040, Oct. 14, 1998, as amended at 71 FR 69494... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Proposal analysis techniques. 215.404-1 Section 215.404-1 Federal Acquisition Regulations System DEFENSE ACQUISITION...

  17. Technology Readiness Assessment (TRA) Deskbook

    DTIC Science & Technology

    2003-09-01

    be certified as being compliant with the FMEA by the Under Secretary of Defense (Comptroller) (USD(C)). B.3 A COMMENT ON THE TRA PROCESS: The Interim... 1.4 Acquisition Process Overview... 2.2.3 Processing the TRA Results... 2.3 Component Acquisition Executive (CAE

  18. Development of data processing interpretation and analysis system for the remote sensing of trace atmospheric gas species

    NASA Technical Reports Server (NTRS)

    Casas, J. C.; Koziana, J. V.; Saylor, M. S.; Kindle, E. C.

    1982-01-01

    Problems associated with the development of the measurement of air pollution from satellites (MAPS) experiment program are addressed. The primary thrust of this research was the utilization of the MAPS experiment data in three application areas: low altitude aircraft flights (one to six km); mid altitude aircraft flights (eight to 12 km); and orbiting space platforms. Extensive research work in four major areas of data management was the framework for implementation of the MAPS experiment technique. These areas are: (1) data acquisition; (2) data processing, analysis and interpretation algorithms; (3) data display techniques; and (4) information production.

  19. Mineralogy and Elemental Composition of Wind Drift Soil at Rocknest, Gale Crater

    NASA Technical Reports Server (NTRS)

    Blake, D. F.; Bish, D. L.; Morris, R. V.; Downs, R. T.; Trieman, A. H.; Morrison, S. M.; Chipera, S. J.; Ming, D. W.; Yen, A. S.; Vaniman, D. T.; hide

    2013-01-01

    The Mars Science Laboratory rover Curiosity has been exploring Mars since August 5, 2012, conducting engineering and first-time activities with its mobility system, arm, sample acquisition and processing system (SA/SPaH-CHIMRA) and science instruments. Curiosity spent 54 sols at a location named "Rocknest," collecting and processing five scoops of loose, unconsolidated materials ("soil") acquired from an aeolian bedform (Fig. 1). The Chemistry and Mineralogy (CheMin) instrument analyzed portions of scoops 3, 4, and 5, to obtain the first quantitative mineralogical analysis of Mars soil, and to provide context for Sample Analysis at Mars (SAM) measurements of volatiles, isotopes and possible organic materials.

  20. A low-cost PC-based telemetry data-reduction system

    NASA Astrophysics Data System (ADS)

    Simms, D. A.; Butterfield, C. P.

    1990-04-01

    The Solar Energy Research Institute's (SERI) Wind Research Branch is using Pulse Code Modulation (PCM) telemetry data-acquisition systems to study horizontal-axis wind turbines. PCM telemetry systems are used in test installations that require accurate multiple-channel measurements taken from a variety of different locations. SERI has found them ideal for use in tests requiring concurrent acquisition from many channels, and has developed a low-cost PC-based data-reduction system to facilitate quick, in-the-field multiple-channel data analysis. Called the PC-PCM System, it consists of two basic components. First, AT-compatible hardware boards are used for decoding and combining PCM data streams. Up to four hardware boards can be installed in a single PC, which provides the capability to combine data from four PCM streams directly to PC disk or memory. Each stream can have up to 62 data channels. Second, a software package written for the DOS operating system was developed to simplify data-acquisition control and management. The software provides a quick, easy-to-use interface between the PC and PCM data streams. Called the Quick-Look Data Management Program, it is a comprehensive menu-driven package used to organize, acquire, process, and display information from incoming PCM data streams. This paper describes both hardware and software aspects of the SERI PC-PCM system, concentrating on features that make it useful in an experiment test environment to quickly examine and verify incoming data. Also discussed are problems and techniques associated with PC-based telemetry data acquisition, processing, and real-time display.

  1. Practice acquisition: a due diligence checklist. HFMA Principles and Practices Board.

    PubMed

    1995-12-01

    As healthcare executives act to form integrated healthcare systems that encompass entities such as physician-hospital organizations and medical group practices, they often discover that practical guidance on acquiring physician practices is scarce. To address the need for authoritative guidance on practice acquisition, HFMA's Principles and Practices Board has developed a detailed analysis of physician practices acquisition issues, Issues Analysis 95-1: Acquisition of Physician Practices. This analysis includes a detailed due diligence checklist developed to assist both healthcare financial managers involved in acquiring physician practices and physician owners interested in selling their practices.

  2. A Psychometric Study of Reading Processes in L2 Acquisition: Deploying Deep Processing to Push Learners' Discourse Towards Syntactic Processing-Based Constructions

    ERIC Educational Resources Information Center

    Manuel, Carlos J.

    2009-01-01

    This study assesses reading processes and/or strategies needed to deploy deep processing that could push learners towards syntactic-based constructions in L2 classrooms. Research has found L2 acquisition to present varying degrees of success and/or fossilization (Bley-Vroman 1989, Birdsong 1992 and Sharwood Smith 1994). For example, learners have…

  3. Development of data processing, interpretation and analysis system for the remote sensing of trace atmospheric gas species

    NASA Technical Reports Server (NTRS)

    Casas, Joseph C.; Saylor, Mary S.; Kindle, Earl C.

    1987-01-01

    The major emphasis is on the advancement of remote sensing technology. In particular, the gas filter correlation radiometer (GFCR) technique was applied to the measurement of trace gas species, such as carbon monoxide (CO), from airborne and Earth orbiting platforms. Through a series of low altitude aircraft flights, high altitude aircraft flights, and orbiting space platform flights, data were collected and analyzed, culminating in the first global map of carbon monoxide concentration in the middle troposphere and stratosphere. The four major areas of this remote sensing program, known as the Measurement of Air Pollution from Satellites (MAPS) experiment, are: (1) data acquisition, (2) data processing, analysis, and interpretation algorithms, (3) data display techniques, and (4) information processing.

  4. Parameter estimation for terrain modeling from gradient data. [navigation system for Martian rover

    NASA Technical Reports Server (NTRS)

    Dangelo, K. R.

    1974-01-01

    A method is developed for modeling terrain surfaces for use on an unmanned Martian roving vehicle. The modeling procedure employs a two-step process that uses gradient as well as height data in order to improve the accuracy of the model's gradient. Least-squares approximation is used to stochastically determine the parameters that describe the modeled surface. A complete error analysis of the modeling procedure is included, which determines the effect of instrumental measurement errors on the model's accuracy. Computer simulation is used as a means of testing the entire modeling process, which includes the acquisition of data points, the two-step modeling process, and the error analysis. Finally, to illustrate the procedure, a numerical example is included.
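
    The combined use of height and gradient observations in a least-squares fit can be sketched as follows; the quadratic surface model and the equal weighting of the two data types are assumptions made for illustration, not details taken from the report.

```python
# Sketch of fitting a quadratic terrain patch by least squares using both
# height samples and gradient (slope) samples stacked into one linear system.
import numpy as np

rng = np.random.default_rng(3)
xy = rng.uniform(-1, 1, size=(30, 2))                 # sample locations
x, y = xy[:, 0], xy[:, 1]
true = np.array([0.2, 0.5, -0.3, 0.8, 0.1, -0.4])     # a, b, c, d, e, f

# Model: z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2
def design_height(x, y):
    return np.column_stack([np.ones_like(x), x, y, x**2, x*y, y**2])

def design_dzdx(x, y):   # dz/dx = b + 2*d*x + e*y
    return np.column_stack([np.zeros_like(x), np.ones_like(x), np.zeros_like(x), 2*x, y, np.zeros_like(x)])

def design_dzdy(x, y):   # dz/dy = c + e*x + 2*f*y
    return np.column_stack([np.zeros_like(x), np.zeros_like(x), np.ones_like(x), np.zeros_like(x), x, 2*y])

# Stack height and gradient observations into one system A p = b.
A = np.vstack([design_height(x, y), design_dzdx(x, y), design_dzdy(x, y)])
b = A @ true + rng.normal(0, 0.01, size=A.shape[0])   # noisy synthetic measurements

params, *_ = np.linalg.lstsq(A, b, rcond=None)
print("recovered parameters:", np.round(params, 2))
```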

  5. Image analysis of multiple moving wood pieces in real time

    NASA Astrophysics Data System (ADS)

    Wang, Weixing

    2006-02-01

    This paper presents algorithms for image processing and image analysis of wood piece materials. The algorithms were designed for automatic detection of wood pieces on a moving conveyor belt or a truck. When the wood objects are moving, the hard task is to trace the contours of the objects in an optimal way. To make the algorithms work efficiently in the plant, a flexible online system was designed and developed, which mainly consists of image acquisition, image processing, object delineation and analysis. A number of newly developed algorithms can delineate wood objects with high accuracy and high speed, and in the wood piece analysis part, each wood piece can be characterized by a number of visual parameters which can also be used for constructing experimental models directly in the system.

  6. Collaborative Research and Development (CR&D) III Task Order 0090: Image Processing Framework: From Acquisition and Analysis to Archival Storage

    DTIC Science & Technology

    2013-05-01

    contract or a PhD dissertation typically are a "proof-of-concept" code base that can only read a single set of inputs and are not designed ...AFRL-RX-WP-TR-2013-0210 COLLABORATIVE RESEARCH AND DEVELOPMENT (CR&D) III Task Order 0090: Image Processing Framework: From... public release; distribution unlimited. See additional restrictions described on inside pages. STINFO COPY AIR FORCE RESEARCH LABORATORY

  7. The influence of polarization on millimeter wave propagation through rain. [radio signals

    NASA Technical Reports Server (NTRS)

    Bostian, C. W.; Stutzman, W. L.; Wiley, P. H.; Marshall, R. E.

    1973-01-01

    The measurement and analysis of the depolarization and attenuation that occur when millimeter wave radio signals propagate through rain are described. Progress was made in three major areas: the processing of recorded 1972 data, acquisition and processing of a large amount of 1973 data, and the development of a new theoretical model to predict rain cross polarization and attenuation. Each of these topics is described in detail along with radio frequency system design for cross polarization measurements.

  8. A real-time spectroscopic sensor for monitoring laser welding processes.

    PubMed

    Sibillano, Teresa; Ancona, Antonio; Berardi, Vincenzo; Lugarà, Pietro Mario

    2009-01-01

    In this paper we report on the development of a sensor for real time monitoring of laser welding processes based on spectroscopic techniques. The system is based on the acquisition of the optical spectra emitted from the laser generated plasma plume and their use to implement an on-line algorithm for both the calculation of the plasma electron temperature and the analysis of the correlations between selected spectral lines. The sensor has been patented and it is currently available on the market.
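
    The record does not give the on-line temperature formula; a common two-line Boltzmann-ratio estimate of the plasma electron temperature, assuming local thermodynamic equilibrium and known transition constants, is sketched below with placeholder line data.

```python
# Sketch of a two-line Boltzmann-ratio estimate of plasma electron temperature,
# a common way to turn relative line intensities into T_e under the assumption
# of local thermodynamic equilibrium. The atomic constants and intensities
# below are placeholders, not values from the welding sensor.
import numpy as np

K_B_EV = 8.617e-5        # Boltzmann constant in eV/K

def electron_temperature(I1, I2, line1, line2):
    """T_e from the intensity ratio of two emission lines of the same species.

    Each line is (wavelength_nm, g_k, A_ki_per_s, E_k_eV) for the upper level.
    """
    lam1, g1, A1, E1 = line1
    lam2, g2, A2, E2 = line2
    ratio = (I1 * g2 * A2 * lam1) / (I2 * g1 * A1 * lam2)
    return (E2 - E1) / (K_B_EV * np.log(ratio))

# Placeholder lines and measured relative intensities.
line_a = (538.3, 13, 5.6e7, 6.61)
line_b = (526.9, 9, 1.3e6, 3.21)
print(f"T_e ~ {electron_temperature(1.0, 4.0, line_a, line_b):.0f} K")
```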

  9. Minimalism and Beyond: Second Language Acquisition for the Twenty-First Century.

    ERIC Educational Resources Information Center

    Balcom, Patricia A.

    2001-01-01

    Provides a general overview of two books--"The Second Time Around: Minimalism and Second Language Acquisition" and "Second Language Syntax: A Generative Introduction"--and shows how they respond to key issues in second language acquisition, including the process of second language acquisition, access to universal grammar, the role of…

  10. Defense Acquisition Workforce: The Air Force Needs to Evaluate Changes in Funding for Civilians Engaged in Space Acquisition

    DTIC Science & Technology

    2013-07-01

    1) revitalize the acquisition workforce; (2) improve the requirements generation process; (3) instill budget and financial DOD Acquisition...with our four recommended actions (see app . I). In concurring with our recommendations, DOD stated that the Air Force will evaluate the pilot program

  11. ARCHITECT: The architecture-based technology evaluation and capability tradeoff method

    NASA Astrophysics Data System (ADS)

    Griendling, Kelly A.

    The use of architectures for the design, development, and documentation of system-of-systems engineering has become a common practice in recent years. This practice became mandatory in the defense industry in 2004 when the Department of Defense Architecture Framework (DoDAF) Promulgation Memo mandated that all Department of Defense (DoD) architectures must be DoDAF compliant. Despite this mandate, there has been significant confusion and a lack of consistency in the creation and the use of the architecture products. Products are typically created as static documents used for communication and documentation purposes that are difficult to change and do not support engineering design activities and acquisition decision making. At the same time, acquisition guidance has been recently reformed to move from the bottom-up approach of the Requirements Generation System (RGS) to the top-down approach mandated by the Joint Capabilities Integration and Development System (JCIDS), which requires the use of DoDAF to support acquisition. Defense agencies have had difficulty adjusting to this new policy, and are struggling to determine how to meet new acquisition requirements. This research has developed the Architecture-based Technology Evaluation and Capability Tradeoff (ARCHITECT) Methodology to respond to these challenges and address concerns raised about the defense acquisition process, particularly the time required to implement parts of the process, the need to evaluate solutions across capability and mission areas, and the need to use a rigorous, traceable, repeatable method that utilizes modeling and simulation to better substantiate early-phase acquisition decisions. The objective is to create a capability-based systems engineering methodology for the early phases of design and acquisition (specifically Pre-Milestone A activities) which improves agility in defense acquisition by (1) streamlining the development of key elements of JCIDS and DoDAF, (2) moving the creation of DoDAF products forward in the defense acquisition process, and (3) using DoDAF products for more than documentation by integrating them into the problem definition and analysis of alternatives phases and applying executable architecting. This research proposes and demonstrates the plausibility of a prescriptive methodology for developing executable DoDAF products which will explicitly support decision-making in the early phases of JCIDS. A set of criteria by which CBAs should be judged is proposed, and the methodology is developed with these criteria in mind. The methodology integrates existing tools and techniques for systems engineering and system of systems engineering with several new modeling and simulation tools and techniques developed as part of this research to fill gaps noted in prior CBAs. A suppression of enemy air defenses (SEAD) mission is used to demonstrate the application of ARCHITECT and to show the plausibility of the approach. For the SEAD study, metrics are derived and a gap analysis is performed. The study then identifies and quantitatively compares system and operational architecture alternatives for performing SEAD. A series of down-selections is performed to identify promising architectures, and these promising solutions are subject to further analysis where the impacts of force structure and network structure are examined. While the numerical results of the SEAD study are notional and could not be applied to an actual SEAD CBA, the example served to highlight many of the salient features of the methodology. The SEAD study presented enabled pre-Milestone A tradeoffs to be performed quantitatively across a large number of architectural alternatives in a traceable and repeatable manner. The alternatives considered included variations on operations, systems, organizational responsibilities (through the assignment of systems to tasks), network (or collaboration) structure, interoperability level, and force structure. All of the information used in the study is preserved in the environment, which is dynamic and allows for on-the-fly analysis. The assumptions used were consistent, which was assured through the use of a single file documenting all inputs, which was shared across all models. Furthermore, a model was made of the ARCHITECT methodology itself, and was used to demonstrate that even if the steps took twice as long to perform as they did in the case of the SEAD example, the methodology still provides the ability to conduct CBA analyses in less time than prior CBAs to date. Overall, it is shown that the ARCHITECT methodology results in an improvement over current CBAs in the criteria developed here.

  12. Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.

    PubMed

    Pearl, Lisa S; Sprouse, Jon

    2015-06-01

    Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.

  13. [Model oriented assessment of literacy performance in children with cochlear implants].

    PubMed

    Fiori, A; Reichmuth, K; Matulat, P; Schmidt, C M; Dinnesen, A G

    2006-07-01

    Although most hearing-impaired children lag behind normally hearing children in literacy acquisition, this aspect has hardly been addressed in the evaluation of language acquisition after cochlear implantation. The present study investigated written language abilities in 8 school-age children with cochlear implants. Neurolinguistic dual-route-models of written language processing indicate that literacy acquisition leads to the establishment of two distinct reading and writing strategies: a lexical one for the quick processing of known words and a sublexical one for decoding unfamiliar words or nonwords letter by letter. 8 school-aged children were investigated, a very heterogeneous group concerning age of onset of hearing impairment, educational placement, and competences in sign language. However, this range is typical of the group of CI-children. The aim was to investigate if children with cochlear implants are able to establish both strategies or if they need to find a differential and individual access to written language. Performance within the Salzburger Lese-Rechtschreib-Test was evaluated. Individual performance of each subject was analysed. Performance varied substantially ranging from only rudimentary spoken and written language abilities in two children to age-equivalent performance in three of them. Severe qualitative differences in written language processing were shown in the remaining three subjects. Suggestions for remediation were made and a re-test was carried out after 12 months. Their individual profiles of performance are described in detail. The present study stresses the importance of a thorough investigation of written language performance in the evaluation of language acquisition after cochlear implantation. The results draw a very heterogeneous picture of performance. Model-oriented testing and analysis of performance prove to be sensible in at least a subpopulation of children with cochlear implants. Based on a better understanding of their acquired word-processing strategies, remediation programs meeting the needs of each individual child can be derived.

  14. Data processing and error analysis for the CE-1 Lunar microwave radiometer

    NASA Astrophysics Data System (ADS)

    Feng, Jian-Qing; Su, Yan; Liu, Jian-Jun; Zou, Yong-Liao; Li, Chun-Lai

    2013-03-01

    The microwave radiometer (MRM) onboard the Chang'E-1 (CE-1) lunar orbiter is a 4-frequency microwave radiometer, and it is mainly used to obtain the brightness temperature (TB) of the lunar surface, from which the thickness, temperature, dielectric constant and other related properties of the lunar regolith can be derived. The working mode of the CE-1 MRM, the ground calibration (including the official calibration coefficients), as well as the acquisition and processing of the raw data are introduced. Our data analysis shows that TB increases with increasing frequency, decreases towards the lunar poles and is significantly affected by solar illumination. Our analysis also reveals that the main uncertainty in TB comes from ground calibration.
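
    The official calibration coefficients are not reproduced in the record; a generic two-point (hot/cold target) linear calibration, the usual way radiometer counts are converted to brightness temperature, might look like the following sketch with placeholder values.

```python
# Generic two-point (hot/cold load) linear calibration sketch for converting
# radiometer counts to brightness temperature. The target temperatures, counts,
# and derived gain/offset here are placeholders, not the CE-1 MRM coefficients.
import numpy as np

T_cold, T_hot = 95.0, 320.0            # calibration target temperatures (K)
C_cold, C_hot = 12000.0, 30500.0       # corresponding instrument counts

gain = (T_hot - T_cold) / (C_hot - C_cold)
offset = T_cold - gain * C_cold

counts = np.array([18000.0, 22500.0, 27000.0])   # lunar-surface measurements
TB = gain * counts + offset
print("brightness temperatures (K):", np.round(TB, 1))
```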

  15. Quantitative analysis of phosphoinositide 3-kinase (PI3K) signaling using live-cell total internal reflection fluorescence (TIRF) microscopy.

    PubMed

    Johnson, Heath E; Haugh, Jason M

    2013-12-02

    This unit focuses on the use of total internal reflection fluorescence (TIRF) microscopy and image analysis methods to study the dynamics of signal transduction mediated by class I phosphoinositide 3-kinases (PI3Ks) in mammalian cells. The first four protocols cover live-cell imaging experiments, image acquisition parameters, and basic image processing and segmentation. These methods are generally applicable to live-cell TIRF experiments. The remaining protocols outline more advanced image analysis methods, which were developed in our laboratory for the purpose of characterizing the spatiotemporal dynamics of PI3K signaling. These methods may be extended to analyze other cellular processes monitored using fluorescent biosensors. Copyright © 2013 John Wiley & Sons, Inc.
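
    The basic image processing and segmentation step described in the unit can be illustrated, under assumptions, by smoothing, background subtraction, and Otsu thresholding of a single frame; the snippet is a generic sketch, not the unit's exact protocol.

```python
# Sketch of a basic segmentation step for a TIRF frame: smooth, subtract a
# coarse background, and threshold to obtain a cell-footprint mask. This
# illustrates the kind of processing described, not the unit's exact protocol.
import numpy as np
from scipy import ndimage as ndi
from skimage import filters

rng = np.random.default_rng(4)
frame = rng.normal(100, 5, size=(256, 256))            # synthetic TIRF frame
frame[64:192, 64:192] += 60                            # bright cell footprint

smoothed = ndi.gaussian_filter(frame, sigma=2)
background = ndi.gaussian_filter(frame, sigma=50)      # coarse background model
corrected = smoothed - background

mask = corrected > filters.threshold_otsu(corrected)
labels, n_objects = ndi.label(mask)
print(f"segmented objects: {n_objects}")
```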

  16. The Influence of Working Memory and Phonological Processing on English Language Learner Children's Bilingual Reading and Language Acquisition

    ERIC Educational Resources Information Center

    Swanson, H. Lee; Orosco, Michael J.; Lussier, Cathy M.; Gerber, Michael M.; Guzman-Orth, Danielle A.

    2011-01-01

    In this study, we explored whether the contribution of working memory (WM) to children's (N = 471) 2nd language (L2) reading and language acquisition was best accounted for by processing efficiency at a phonological level and/or by executive processes independent of phonological processing. Elementary school children (Grades 1, 2, & 3) whose…

  17. A seamless acquisition digital storage oscilloscope with three-dimensional waveform display

    NASA Astrophysics Data System (ADS)

    Yang, Kuojun; Tian, Shulin; Zeng, Hao; Qiu, Lei; Guo, Lianping

    2014-04-01

    In traditional digital storage oscilloscope (DSO), sampled data need to be processed after each acquisition. During data processing, the acquisition is stopped and oscilloscope is blind to the input signal. Thus, this duration is called dead time. With the rapid development of modern electronic systems, the effect of infrequent events becomes significant. To capture these occasional events in shorter time, dead time in traditional DSO that causes the loss of measured signal needs to be reduced or even eliminated. In this paper, a seamless acquisition oscilloscope without dead time is proposed. In this oscilloscope, three-dimensional waveform mapping (TWM) technique, which converts sampled data to displayed waveform, is proposed. With this technique, not only the process speed is improved, but also the probability information of waveform is displayed with different brightness. Thus, a three-dimensional waveform is shown to the user. To reduce processing time further, parallel TWM which processes several sampled points simultaneously, and dual-port random access memory based pipelining technique which can process one sampling point in one clock period are proposed. Furthermore, two DDR3 (Double-Data-Rate Three Synchronous Dynamic Random Access Memory) are used for storing sampled data alternately, thus the acquisition can continue during data processing. Therefore, the dead time of DSO is eliminated. In addition, a double-pulse test method is adopted to test the waveform capturing rate (WCR) of the oscilloscope and a combined pulse test method is employed to evaluate the oscilloscope's capture ability comprehensively. The experiment results show that the WCR of the designed oscilloscope is 6 250 000 wfms/s (waveforms per second), the highest value in all existing oscilloscopes. The testing results also prove that there is no dead time in our oscilloscope, thus realizing the seamless acquisition.
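
    A software analogue of the three-dimensional waveform mapping idea, accumulating many acquisitions into a (time, amplitude) histogram whose counts stand in for display brightness, is sketched below with a synthetic signal; it illustrates the concept only, not the hardware implementation described.

```python
# Software analogue of three-dimensional waveform mapping: accumulate many
# acquisitions into a (time, amplitude) histogram so that the count in each
# cell plays the role of display brightness (the third dimension).
import numpy as np

n_waveforms, n_samples, n_levels = 2000, 500, 256
t = np.arange(n_samples)
rng = np.random.default_rng(5)

persistence = np.zeros((n_levels, n_samples), dtype=np.int64)
for _ in range(n_waveforms):
    # Synthetic noisy sine acquisition quantized to 8-bit display rows.
    wave = 100 * np.sin(2 * np.pi * t / 125) + rng.normal(0, 8, n_samples)
    rows = np.clip(np.round(wave + 128), 0, n_levels - 1).astype(int)
    persistence[rows, t] += 1

# Brighter cells correspond to sample values that occur in more acquisitions.
print("most frequently hit cell count:", persistence.max())
```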

  18. A seamless acquisition digital storage oscilloscope with three-dimensional waveform display

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Kuojun, E-mail: kuojunyang@gmail.com; Guo, Lianping; School of Electrical and Electronic Engineering, Nanyang Technological University

    In a traditional digital storage oscilloscope (DSO), sampled data need to be processed after each acquisition. During data processing, acquisition is stopped and the oscilloscope is blind to the input signal; this interval is therefore called dead time. With the rapid development of modern electronic systems, the effect of infrequent events has become significant. To capture these occasional events in a shorter time, the dead time of a traditional DSO, which causes loss of the measured signal, needs to be reduced or even eliminated. In this paper, a seamless acquisition oscilloscope without dead time is proposed. In this oscilloscope, a three-dimensional waveform mapping (TWM) technique, which converts sampled data into the displayed waveform, is proposed. With this technique, not only is the processing speed improved, but the probability information of the waveform is also displayed with different brightness levels, so a three-dimensional waveform is shown to the user. To further reduce processing time, a parallel TWM technique, which processes several sampled points simultaneously, and a dual-port random-access-memory-based pipelining technique, which can process one sampled point per clock period, are proposed. Furthermore, two DDR3 (Double-Data-Rate Three Synchronous Dynamic Random Access Memory) devices are used to store sampled data alternately, so acquisition can continue during data processing. The dead time of the DSO is thereby eliminated. In addition, a double-pulse test method is adopted to test the waveform capturing rate (WCR) of the oscilloscope, and a combined-pulse test method is employed to evaluate the oscilloscope's capture ability comprehensively. The experimental results show that the WCR of the designed oscilloscope is 6,250,000 wfms/s (waveforms per second), the highest value among all existing oscilloscopes. The testing results also confirm that there is no dead time in our oscilloscope, thus realizing seamless acquisition.
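
    As a rough illustration of the intensity-graded display idea behind TWM (not the authors' FPGA/DDR3 implementation), the following Python sketch accumulates many captured waveforms into a two-dimensional hit-count map; the count per (voltage level, sample) cell plays the role of the brightness (third) dimension, so frequently traversed signal paths render bright while rare glitches remain visible but dim. All names and parameters are illustrative.

```python
import numpy as np

def waveform_map(waveforms, v_min=-1.0, v_max=1.0, n_levels=256):
    """Accumulate waveforms into a (voltage-level x sample) hit-count map.

    waveforms : 2D array, shape (n_waveforms, n_samples).
    The returned integer map can be rendered as brightness, so frequently
    traversed signal paths appear bright and rare glitches appear dim.
    """
    n_wfm, n_samples = waveforms.shape
    # Quantize each sample to a display row (voltage level).
    levels = np.clip(((waveforms - v_min) / (v_max - v_min) * (n_levels - 1)).astype(int),
                     0, n_levels - 1)
    cols = np.broadcast_to(np.arange(n_samples), levels.shape)
    hit_map = np.zeros((n_levels, n_samples), dtype=np.int64)
    # Every sampled point increments one cell; counts carry the probability info.
    np.add.at(hit_map, (levels, cols), 1)
    return hit_map

# Example: 10,000 noisy sine-wave acquisitions with an occasional runt waveform.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 500)
wfms = 0.8 * np.sin(t) + 0.05 * rng.standard_normal((10000, t.size))
wfms[::500] *= 0.3                     # infrequent anomaly, shows up dimly
print(waveform_map(wfms).max())
```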

  19. The logical syntax of number words: theory, acquisition and processing.

    PubMed

    Musolino, Julien

    2009-04-01

    Recent work on the acquisition of number words has emphasized the importance of integrating linguistic and developmental perspectives [Musolino, J. (2004). The semantics and acquisition of number words: Integrating linguistic and developmental perspectives. Cognition, 93, 1-41; Papafragou, A., Musolino, J. (2003). Scalar implicatures: Experiments at the semantics-pragmatics interface. Cognition, 86, 253-282; Hurewitz, F., Papafragou, A., Gleitman, L., Gelman, R. (2006). Asymmetries in the acquisition of numbers and quantifiers. Language Learning and Development, 2, 76-97; Huang, Y. T., Snedeker, J., Spelke, L. (submitted for publication). What exactly do numbers mean?]. Specifically, these studies have shown that data from experimental investigations of child language can be used to illuminate core theoretical issues in the semantic and pragmatic analysis of number terms. In this article, I extend this approach to the logico-syntactic properties of number words, focusing on the way numerals interact with each other (e.g. Three boys are holding two balloons) as well as with other quantified expressions (e.g. Three boys are holding each balloon). On the basis of their intuitions, linguists have claimed that such sentences give rise to at least four different interpretations, reflecting the complexity of the linguistic structure and syntactic operations involved. Using psycholinguistic experimentation with preschoolers (n=32) and adult speakers of English (n=32), I show that (a) for adults, the intuitions of linguists can be verified experimentally, (b) by the age of 5, children have knowledge of the core aspects of the logical syntax of number words, (c) in spite of this knowledge, children nevertheless differ from adults in systematic ways, (d) the differences observed between children and adults can be accounted for on the basis of an independently motivated, linguistically-based processing model [Geurts, B. (2003). Quantifying kids. Language Acquisition, 11(4), 197-218]. In doing so, this work ties together research on the acquisition of the number vocabulary with a growing body of work on the development of quantification and sentence processing abilities in young children [Geurts, 2003; Lidz, J., Musolino, J. (2002). Children's command of quantification. Cognition, 84, 113-154; Musolino, J., Lidz, J. (2003). The scope of isomorphism: Turning adults into children. Language Acquisition, 11(4), 277-291; Trueswell, J., Sekerina, I., Hilland, N., Logrip, M. (1999). The kindergarten-path effect: Studying on-line sentence processing in young children. Cognition, 73, 89-134; Noveck, I. (2001). When children are more logical than adults: Experimental investigations of scalar implicature. Cognition, 78, 165-188; Noveck, I., Guelminger, R., Georgieff, N., & Labruyere, N. (2007). What autism can tell us about every ... not sentences. Journal of Semantics, 24(1), 73-90]. On a more general level, this work confirms the importance of integrating formal and developmental perspectives [Musolino, 2004], this time by highlighting the explanatory power of linguistically-based models of language acquisition and by showing that the complex structure postulated by linguists has important implications for developmental accounts of the number vocabulary.

  20. Development of a data independent acquisition mass spectrometry workflow to enable glycopeptide analysis without predefined glycan compositional knowledge.

    PubMed

    Lin, Chi-Hung; Krisp, Christoph; Packer, Nicolle H; Molloy, Mark P

    2018-02-10

    Glycoproteomics investigates glycan moieties in a site-specific manner to reveal the functional roles of protein glycosylation. Identification of glycopeptides from data-dependent acquisition (DDA) relies on high-quality MS/MS spectra of glycopeptide precursors and often requires manual validation to ensure confident assignments. In this study, we investigated pseudo-MRM (MRM-HR) and data-independent acquisition (DIA) as alternative acquisition strategies for glycopeptide analysis. These approaches acquire data over the full MS/MS scan range, allowing data re-analysis post-acquisition without re-acquiring the data. The advantage of MRM-HR over DDA for N-glycopeptide detection was demonstrated by targeted analysis of bovine fetuin, where all three N-glycosylation sites were detected, which was not the case with DDA. To overcome the duty-cycle limitation of MRM-HR acquisition for the analysis of complex samples such as plasma, we trialed DIA. This allowed development of a targeted DIA method to identify N-glycopeptides without pre-defined knowledge of the glycan composition, thus providing the potential to identify N-glycopeptides with unexpected structures. The workflow was demonstrated by detection of 59 N-glycosylation sites from 41 glycoproteins in a HILIC-enriched human plasma tryptic digest. Twenty-one glycoforms of IgG1 glycopeptides were identified, including two truncated structures that are rarely reported. We developed a data-independent mass spectrometry workflow to identify specific glycopeptides from complex biological mixtures. The novelty is that this approach does not require the glycan composition to be pre-defined, thereby allowing glycopeptides carrying unexpected glycans to be identified. This is demonstrated through the analysis of immunoglobulins in human plasma, where we detected two IgG1 glycoforms that are rarely observed. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. 48 CFR 750.7110 - Processing cases.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Processing cases. 750.7110 Section 750.7110 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT CONTRACT MANAGEMENT EXTRAORDINARY CONTRACTUAL ACTIONS Extraordinary Contractual Actions To Protect Foreign Policy...

  2. 48 CFR 3009.570-2 - Policy.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ACQUISITION REGULATION (HSAR) ACQUISITION PLANNING CONTRACTOR QUALIFICATIONS Organizational and Consultant... took appropriate steps to prevent any organizational conflict of interest in the selection process; or... process over which the entity exercised no control. (c) CONSTRUCTION—Nothing in this section 3009.570...

  3. A Five-Year Plan for Meeting the Automatic Data Processing and Telecommunications Needs of the Federal Government. Volume 2: Major Information Technology Systems Acquisition Plans of Federal Executive Agencies, 1984-1989.

    ERIC Educational Resources Information Center

    Department of Commerce, Washington, DC.

    This volume, the second of two, presents and analyzes the information technology acquisition plans of the Federal Government by agency and component. A brief description covers the outlays planned for major information technology acquisitions of general purpose data processing and telecommunications systems, facilities, and related services for 6…

  4. Multilingualism and fMRI: Longitudinal Study of Second Language Acquisition

    PubMed Central

    Andrews, Edna; Frigau, Luca; Voyvodic-Casabo, Clara; Voyvodic, James; Wright, John

    2013-01-01

    BOLD fMRI is often used for the study of human language. However, there have been very few attempts to conduct longitudinal fMRI studies of language acquisition that measure auditory comprehension and reading. The following paper is the first in a series concerning a unique longitudinal study devoted to the analysis of bi- and multilingual subjects who (1) are already proficient in at least two languages or (2) are acquiring Russian as a second/third language. The focus of the current analysis is to present data from the auditory sections of a set of three scans acquired from April 2011 through April 2012 on a five-person subject pool who were learning Russian during the study. All subjects were scanned using the same protocol for auditory comprehension on the same General Electric LX 3T Signa scanner in Duke University Hospital. Using a multivariate analysis of covariance (MANCOVA) for statistical analysis, proficiency measurements are shown to correlate significantly with scan results in the Russian conditions over time. The importance of both the left and right hemispheres in language processing is discussed. Special attention is devoted to the importance of contextualizing imaging data with corresponding behavioral and empirical testing data using a multivariate analysis of variance. This is the only study to date that includes: (1) longitudinal fMRI data with subject-based proficiency and behavioral data acquired in the same time frame; and (2) statistical modeling that demonstrates the importance of covariate language proficiency data for understanding imaging results of language acquisition. PMID:24961428

  5. Multilingualism and fMRI: Longitudinal Study of Second Language Acquisition.

    PubMed

    Andrews, Edna; Frigau, Luca; Voyvodic-Casabo, Clara; Voyvodic, James; Wright, John

    2013-05-28

    BOLD fMRI is often used for the study of human language. However, there have been very few attempts to conduct longitudinal fMRI studies of language acquisition that measure auditory comprehension and reading. The following paper is the first in a series concerning a unique longitudinal study devoted to the analysis of bi- and multilingual subjects who (1) are already proficient in at least two languages or (2) are acquiring Russian as a second/third language. The focus of the current analysis is to present data from the auditory sections of a set of three scans acquired from April 2011 through April 2012 on a five-person subject pool who were learning Russian during the study. All subjects were scanned using the same protocol for auditory comprehension on the same General Electric LX 3T Signa scanner in Duke University Hospital. Using a multivariate analysis of covariance (MANCOVA) for statistical analysis, proficiency measurements are shown to correlate significantly with scan results in the Russian conditions over time. The importance of both the left and right hemispheres in language processing is discussed. Special attention is devoted to the importance of contextualizing imaging data with corresponding behavioral and empirical testing data using a multivariate analysis of variance. This is the only study to date that includes: (1) longitudinal fMRI data with subject-based proficiency and behavioral data acquired in the same time frame; and (2) statistical modeling that demonstrates the importance of covariate language proficiency data for understanding imaging results of language acquisition.
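
    To make the covariate-based modeling concrete, the sketch below shows how proficiency scores can enter a multivariate analysis of covariance using the statsmodels MANOVA interface in Python. The variable names and synthetic data are hypothetical and this is not the authors' analysis pipeline; it only illustrates the kind of model in which proficiency acts as a covariate for several imaging-derived measures.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Hypothetical per-scan data: two dependent measures (mean BOLD signal in a
# left- and a right-hemisphere ROI), a proficiency score and a session index.
rng = np.random.default_rng(1)
n = 60
proficiency = rng.uniform(20, 95, n)
session = np.tile([1, 2, 3], n // 3)
df = pd.DataFrame({
    "bold_left":  0.02 * proficiency + 0.10 * session + rng.normal(0, 0.5, n),
    "bold_right": 0.01 * proficiency + 0.05 * session + rng.normal(0, 0.5, n),
    "proficiency": proficiency,
    "session": session,
})

# Multivariate model with proficiency entered as a covariate (MANCOVA-style).
fit = MANOVA.from_formula("bold_left + bold_right ~ proficiency + session", data=df)
print(fit.mv_test())   # Wilks' lambda, Pillai's trace, etc. for each term
```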

  6. Development and evaluation of an intelligent traceability system for frozen tilapia fillet processing.

    PubMed

    Xiao, Xinqing; Fu, Zetian; Qi, Lin; Mira, Trebar; Zhang, Xiaoshuan

    2015-10-01

    The main export varieties in China are brand-name, high-quality bred aquatic products. Among them, tilapia has become the most important and fastest-growing species, since extensive consumer markets in North America and Europe have evolved as a result of commodity prices, year-round availability and the quality of fresh and frozen products. As the largest tilapia farming country, China devotes over one-third of its tilapia production to further processing to meet foreign market demand. Taking tilapia fillet processing as its case, this paper describes the development and evaluation of ITS-TF: an intelligent traceability system integrated with statistical process control (SPC) and fault tree analysis (FTA). Observations, a literature review and expert questionnaires were used for system requirements and knowledge acquisition; scenario simulation was applied to evaluate and validate ITS-TF performance. The results show that the traceability requirement has evolved from a firefighting model to a proactive model for enhancing process management capacity for food safety, and that ITS-TF functions as an intelligent system providing early warnings and process management through the integrated SPC and FTA. A valuable suggestion for further system optimization and performance improvement is that automatic data acquisition and communication technology should be integrated into ITS-TF. © 2014 Society of Chemical Industry.
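
    As an illustration of the SPC component of such a system (a simplified sketch, not the ITS-TF implementation, which also integrates FTA), the following Python snippet derives individuals-chart control limits from an in-control reference period and flags new measurements that breach them, the basic mechanism behind early-warning triggers. The variable names and thresholds are assumptions.

```python
import numpy as np

def spc_limits(reference, sigma_mult=3.0):
    """Individuals-chart style control limits from in-control reference data
    (e.g. freezer temperatures logged at one step of fillet processing)."""
    center = reference.mean()
    spread = reference.std(ddof=1)
    return center - sigma_mult * spread, center, center + sigma_mult * spread

def out_of_control(values, limits):
    """Flag measurements outside the control limits (early-warning trigger)."""
    lcl, _, ucl = limits
    values = np.asarray(values, dtype=float)
    return (values < lcl) | (values > ucl)

# Example: a stable baseline, then a batch containing one drifting reading.
rng = np.random.default_rng(2)
baseline = rng.normal(-18.0, 0.4, 200)                 # degrees Celsius
limits = spc_limits(baseline)
print(out_of_control([-18.1, -17.9, -15.6], limits))   # only the last is flagged
```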

  7. Possible Overlapping Time Frames of Acquisition and Consolidation Phases in Object Memory Processes: A Pharmacological Approach

    ERIC Educational Resources Information Center

    Akkerman, Sven; Blokland, Arjan; Prickaerts, Jos

    2016-01-01

    In previous studies, we have shown that acetylcholinesterase inhibitors and phosphodiesterase inhibitors (PDE-Is) are able to improve object memory by enhancing acquisition processes. On the other hand, only PDE-Is improve consolidation processes. Here we show that the cholinesterase inhibitor donepezil also improves memory performance when…

  8. The Acquisition of Integrated Science Process Skills in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Saat, Rohaida Mohd

    2004-01-01

    Web-based learning is becoming prevalent in science learning. Some use specially designed programs, while others use materials available on the Internet. This qualitative case study examined the process of acquisition of integrated science process skills, particularly the skill of controlling variables, in a web-based learning environment among…

  9. Word Order Processing in the Bilingual Brain

    ERIC Educational Resources Information Center

    Saur, Dorothee; Baumgaertner, Annette; Moehring, Anja; Buchel, Christian; Bonnesen, Matthias; Rose, Michael; Musso, Mariachristina; Meisel, Jurgen M.

    2009-01-01

    One of the issues debated in the field of bilingualism is the question of a "critical period" for second language acquisition. Recent studies suggest an influence of age of onset of acquisition (AOA) particularly on syntactic processing; however, the processing of word order in a sentence context has not yet been examined specifically. We used…

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P E; Harris, D; Myers, S

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization and in full 3D finite difference modeling as well as statistical characterization of geological heterogeneity. Such capabilities coupled with a rapid field analysis methodology based on matched field processing are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project benefits the U.S. military and intelligence community in support of LLNL's national-security mission. FY03 was the final year of this project. In the 2.5 years this project has been active, numerous and varied developments and milestones have been accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking and based on a field calibration to characterize geological heterogeneity was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A 3-seismic-array vehicle tracking testbed was installed on-site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications.
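
    The matched field processing step can be summarized with a conventional Bartlett processor: array data are correlated against modeled replica fields for a grid of candidate source locations, and the location whose replica best matches the data is selected. The Python sketch below is a minimal, generic illustration of that idea, not the project's field-calibrated implementation; the replica fields here are synthetic stand-ins for modeled wavefields.

```python
import numpy as np

def bartlett_mfp(snapshot, replicas):
    """Normalized Bartlett matched-field power for each candidate location.

    snapshot : complex array, shape (n_sensors,), one narrowband array snapshot.
    replicas : complex array, shape (n_locations, n_sensors), the modeled field
               at the array for each candidate source location.
    Returns power values in [0, 1]; the maximum marks the estimated location.
    """
    d = snapshot / np.linalg.norm(snapshot)
    w = replicas / np.linalg.norm(replicas, axis=1, keepdims=True)
    return np.abs(w.conj() @ d) ** 2

# Toy example: the processor should pick the replica that generated the data.
rng = np.random.default_rng(3)
reps = rng.standard_normal((50, 8)) + 1j * rng.standard_normal((50, 8))
data = reps[17] + 0.1 * (rng.standard_normal(8) + 1j * rng.standard_normal(8))
print(int(np.argmax(bartlett_mfp(data, reps))))   # expected: 17
```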

  11. Acquisition Management for Systems-of-Systems: Exploratory Model Development and Experimentation

    DTIC Science & Technology

    2009-04-22

    outputs of the Requirements Development and Logical Analysis processes into alternative design solutions and selects a final design solution. Decision...Analysis Provides the basis for evaluating and selecting alternatives when decisions need to be made. Implementation Yields the lowest-level system... [Figure: dependency matrices for (a) an example SoS and (b) the model structure for the example SoS]

  12. DoD Acquisition Workforce Education: An SBA Education Case Study

    ERIC Educational Resources Information Center

    Davenport, Richard W.

    2009-01-01

    A Department of Defense (DoD) M&S education task force is in the process of studying the Modeling and Simulation (M&S) education of the acquisition workforce. Historically, DoD acquisition workforce education is not referred to as education, but rather what the Defense Acquisition University (DAU) refers to as "practitioner training, career…

  13. Language acquisition is model-based rather than model-free.

    PubMed

    Wang, Felix Hao; Mintz, Toben H

    2016-01-01

    Christiansen & Chater (C&C) propose that learning language is learning to process language. However, we believe that the general-purpose prediction mechanism they propose is insufficient to account for many phenomena in language acquisition. We argue from theoretical considerations and empirical evidence that many acquisition tasks are model-based, and that different acquisition tasks require different, specialized models.

  14. Missile Defense: European Phased Adaptive Approach Acquisitions Face Synchronization, Transparency, and Accountability Challenges

    DTIC Science & Technology

    2010-12-21

    House of Representatives. Subject: Missile Defense: European Phased Adaptive Approach Acquisitions Face Synchronization, Transparency, and Accountability... However, we found that DOD has not fully implemented a management process that synchronizes EPAA acquisition activities and ensures transparency and...

  15. Spectral Dynamics Inc., ships hybrid, 316-channel data acquisition system to Sandia Labs.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Douglas

    2003-09-01

    Spectral Dynamics announced the shipment of a 316-channel data acquisition system. The system was custom designed for the Light Initiated High Explosive (LIHE) facility at Sandia Labs in Albuquerque, New Mexico by the Spectral Dynamics Advanced Research Products Group. This Spectral Dynamics data acquisition system was tailored to meet the unique LIHE environmental and testing requirements utilizing Spectral Dynamics commercial off-the-shelf (COTS) Jaguar and VIDAS products supplemented by SD Alliance partner's (COTS) products. 'This system is just the beginning of our cutting-edge merged technology solutions,' stated Mark Remelman, Manager for the Spectral Dynamics Advanced Research Products Group. 'This Hybrid system has 316 channels of data acquisition capability, comprised of 102.4 kHz direct-to-disk acquisition and 2.5 MHz, 200 MHz & 500 MHz RAM-based capabilities. In addition it incorporates the advanced bridge conditioning and dynamic configuration capabilities offered by Spectral Dynamics' new Smart Interface Panel System (SIPS{trademark}).' After acceptance testing, Tony King, the Instrumentation Engineer facilitating the project for the Sandia LIHE group, commented: 'The LIHE staff was very impressed with the design, construction, attention to detail and overall performance of the instrumentation system.' This system combines VIDAS, a leading-edge fourth-generation SD-VXI hardware and field-proven software system from SD's Advanced Research Products Group, with SD's Jaguar, a multiple Acquisition Control Peripheral (ACP) system that allows expansion to hundreds of channels without sacrificing signal processing performance. Jaguar incorporates dedicated throughput disks for each ACP, providing time streaming to disk at up to the maximum sample rate. Spectral Dynamics, Inc. is a leading worldwide supplier of systems and software for advanced computer-automated data acquisition, vibration testing, structural dynamics, explosive shock, high-speed transient capture, acoustic analysis, monitoring, measurement, control and backup. Spectral Dynamics products are used for research, design verification, product testing and process improvement by manufacturers of all types of electrical, electronic and mechanical products, as well as by universities and government-funded agencies. The Advanced Research Products Group is the newest addition to the Spectral Dynamics family. Their newest VXI data acquisition hardware pushes the envelope on capabilities and embodies the same rock-solid design methodologies, which have always differentiated Spectral Dynamics from its competition.

  16. Coordinating Council. Seventh Meeting: Acquisitions

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The theme for this NASA Scientific and Technical Information Program Coordinating Council meeting was Acquisitions. In addition to the NASA and NASA Center for AeroSpace Information (CASI) presentations, the report contains fairly lengthy visuals about acquisitions at the Defense Technical Information Center. CASI's acquisitions program and CASI's proactive acquisitions activity were described. There was a presentation on the document evaluation process at CASI. A talk about open literature scope and coverage at the American Institute of Aeronautics and Astronautics was also given. An overview of the STI Program's Acquisitions Experts Committee was given next. Finally, acquisitions initiatives of the NASA STI program were presented.

  17. Strategic and Nonstrategic Information Acquisition.

    ERIC Educational Resources Information Center

    Berger, Charles R.

    2002-01-01

    Uses dual-process theories and research concerned with automaticity and the role conceptual short-term memory plays in visual information processing to illustrate both the ubiquity of nonstrategic information acquisition during interpersonal communication and its potential consequences on judgments and behavior. Discusses theoretical and…

  18. Sustaining Equipment and the Rapid Acquisition Process: The Forgotten Phase

    DTIC Science & Technology

    2012-02-24

    Operation of the Defense Acquisition System," December 8, 2008. 7 Rasch, Robert A., Jr., Lessons Learned from Rapid Acquisition: Better, Faster, Cheaper...Life Cycle Management Responsibilities," Defense AR Journal, 17.2 (April 2010): 183. 37 Robert A. Rasch, Lessons Learned from Rapid Acquisition: Better...Accountability Office (GAO) Report, Subject: Rapid Acquisition of Mine Resistant Protected Vehicles, July 15, 2008, 4. 39 Ibid. 40 Ibid. 41 Robert A. Rasch

  19. Robust object tracking techniques for vision-based 3D motion analysis applications

    NASA Astrophysics Data System (ADS)

    Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.

    2016-04-01

    Automated and accurate spatial motion capture of an object is necessary for a wide variety of applications in industry and science, virtual reality and film, medicine and sports. For most applications, the reliability and accuracy of the data obtained, as well as convenience for the user, are the main characteristics defining the quality of a motion capture system. Among the existing systems for 3D data acquisition, based on different physical principles (accelerometry, magnetometry, time-of-flight, vision-based), optical motion capture systems have a set of advantages such as high acquisition speed, potential for high accuracy, and automation based on advanced image processing algorithms. For vision-based motion capture, accurate and robust detection and tracking of object features through the video sequence are the key elements, along with the level of automation of the capturing process. To provide high accuracy of the obtained spatial data, the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurement and supports high-speed image acquisition in synchronized mode. It includes from two to four technical vision cameras for capturing video sequences of object motion. The original camera calibration and external orientation procedures provide the basis for high accuracy of 3D measurements. A set of algorithms, both for detecting, identifying and tracking similar targets and for marker-less object motion capture, has been developed and tested. The results of the algorithms' evaluation show high robustness and high reliability for various motion analysis tasks in technical and biomechanical applications.

  20. Plans for the development of EOS SAR systems using the Alaska SAR facility. [Earth Observing System (EOS)]

    NASA Technical Reports Server (NTRS)

    Carsey, F. D.; Weeks, W.

    1988-01-01

    The Alaska SAR Facility (ASF) program for the acquisition and processing of data from the ESA ERS-1, the NASDA ERS-1, and Radarsat, and for carrying out a program of science investigations using the data, is introduced. Agreements for data acquisition and analysis are in place except for the agreement between NASA and Radarsat, which is in negotiation. The ASF baseline system, consisting of the Receiving Ground System, the SAR Processor System and the Archive and Operations System, passed its critical design review and is fully in the implementation phase. Augmentations to the baseline system, for geophysical processing and for processing of J-ERS-1 optical data, are in the design and implementation phase. The ASF provides a very effective vehicle with which to prepare for the Earth Observing System (EOS), in that it will aid the development of systems and technologies for handling the data volumes produced by the systems of the next decades, and it will also supply some of the data types that will be produced by EOS.

  1. Real Time Data Acquisition and Online Signal Processing for Magnetoencephalography

    NASA Astrophysics Data System (ADS)

    Rongen, H.; Hadamschek, V.; Schiek, M.

    2006-06-01

    To establish improved therapies for patients suffering from severe neurological and psychiatric diseases, a demand-controlled, desynchronizing brain pacemaker has been developed with techniques from statistical physics and nonlinear dynamics. To optimize this novel therapeutic approach, brain activity is investigated with a magnetoencephalography (MEG) system prior to surgery. For this purpose, a real-time data acquisition system for a 148-channel MEG was developed, together with online signal processing for artifact rejection, filtering, cross-trial phase-resetting analysis and three-dimensional (3-D) reconstruction of the cerebral current sources. The developed PCI bus hardware is based on an FPGA and DSP design, exploiting the benefits of both architectures. The reconstruction and visualization of the 3-D volume data are done by the PC that hosts the real-time DAQ and pre-processing board. The framework of the MEG-online system is introduced and the architecture of the real-time DAQ board and online reconstruction is described. In addition, we show first results with the MEG-online system for the investigation of dynamic brain activities in relation to external visual stimulation, based on test data sets.

  2. A wireless data acquisition system for acoustic emission testing

    NASA Astrophysics Data System (ADS)

    Zimmerman, A. T.; Lynch, J. P.

    2013-01-01

    As structural health monitoring (SHM) systems have seen increased demand due to lower costs and greater capabilities, wireless technologies have emerged that enable the dense distribution of transducers and the distributed processing of sensor data. In parallel, ultrasonic techniques such as acoustic emission (AE) testing have become increasingly popular in the non-destructive evaluation of materials and structures. These techniques, which involve the analysis of frequency content between 1 kHz and 1 MHz, have proven effective in detecting the onset of cracking and other early-stage failure in active structures such as airplanes in flight. However, these techniques typically involve the use of expensive and bulky monitoring equipment capable of accurately sensing AE signals at sampling rates greater than 1 million samples per second. In this paper, a wireless data acquisition system is presented that is capable of collecting, storing, and processing AE data at rates of up to 20 MHz. Processed results can then be wirelessly transmitted in real-time, creating a system that enables the use of ultrasonic techniques in large-scale SHM systems.
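
    A minimal sketch of the kind of on-node processing such a wireless AE system performs is shown below: rather than streaming the raw multi-megahertz waveform, the node detects threshold crossings and transmits compact per-hit features (arrival time, duration, peak amplitude, energy, counts). The threshold, hysteresis and dead-time values are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def extract_ae_hits(signal, fs, threshold, dead_time=1e-4):
    """Threshold-based acoustic-emission hit detection and feature extraction.

    signal    : 1D waveform sampled at fs (Hz).
    threshold : absolute amplitude that starts a hit; a hit ends once the
                signal stays below half the threshold (simple hysteresis).
    dead_time : seconds ignored after each hit to avoid re-triggering.
    Returns a list of per-hit feature dicts, the compact summary a wireless
    node could transmit instead of the raw multi-megahertz stream.
    """
    hits, i, n = [], 0, len(signal)
    dead = int(dead_time * fs)
    while i < n:
        if abs(signal[i]) >= threshold:
            j = i
            while j < n and abs(signal[j]) >= 0.5 * threshold:
                j += 1
            seg = np.asarray(signal[i:j], dtype=float)
            hits.append({
                "time_s": i / fs,
                "duration_s": (j - i) / fs,
                "peak_amplitude": float(np.max(np.abs(seg))),
                "energy": float(np.sum(seg ** 2) / fs),
                "counts": int(np.sum(np.abs(seg) >= threshold)),
            })
            i = j + dead
        else:
            i += 1
    return hits

# Example: one decaying 150 kHz burst buried in noise, "sampled" at 20 MHz.
fs = 20e6
t = np.arange(0, 2e-3, 1 / fs)
rng = np.random.default_rng(4)
sig = 0.01 * rng.standard_normal(t.size)
burst = slice(10000, 12000)
sig[burst] += 0.5 * np.exp(-np.linspace(0, 6, 2000)) * np.sin(2 * np.pi * 150e3 * t[burst])
print(extract_ae_hits(sig, fs, threshold=0.1))
```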

  3. Data processing and analysis for 2D imaging GEM detector system

    NASA Astrophysics Data System (ADS)

    Czarski, T.; Chernyshova, M.; Pozniak, K. T.; Kasprowicz, G.; Byszuk, A.; Juszczyk, B.; Kolasinski, P.; Linczuk, M.; Wojenski, A.; Zabolotny, W.; Zienkiewicz, P.

    2014-11-01

    The Triple Gas Electron Multiplier (T-GEM) is presented as soft X-ray (SXR) energy and position sensitive detector for high-resolution X-ray diagnostics of magnetic confinement fusion plasmas [1]. Multi-channel measurement system and essential data processing for X-ray energy and position recognition is consider. Several modes of data acquisition are introduced depending on processing division for hardware and software components. Typical measuring issues aredeliberated for enhancement of data quality. Fundamental output characteristics are presented for one and two dimensional detector structure. Representative results for reference X-ray source and tokamak plasma are demonstrated.

  4. High density event-related potential data acquisition in cognitive neuroscience.

    PubMed

    Slotnick, Scott D

    2010-04-16

    Functional magnetic resonance imaging (fMRI) is currently the standard method of evaluating brain function in the field of Cognitive Neuroscience, in part because fMRI data acquisition and analysis techniques are readily available. Because fMRI has excellent spatial resolution but poor temporal resolution, this method can only be used to identify the spatial location of brain activity associated with a given cognitive process (and reveals virtually nothing about the time course of brain activity). By contrast, event-related potential (ERP) recording, a method that is used much less frequently than fMRI, has excellent temporal resolution and thus can track rapid temporal modulations in neural activity. Unfortunately, ERPs are underutilized in Cognitive Neuroscience because data acquisition techniques are not readily available and low-density ERP recording has poor spatial resolution. In an effort to foster the increased use of ERPs in Cognitive Neuroscience, the present article details key techniques involved in high-density ERP data acquisition. Critically, high-density ERPs offer the promise of excellent temporal resolution and good spatial resolution (or excellent spatial resolution if coupled with fMRI), which is necessary to capture the spatial-temporal dynamics of human brain function.
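
    The core of ERP analysis after acquisition is epoching the continuous recording around stimulus events, baseline-correcting, and averaging across trials so that time-locked activity emerges from the background EEG. The following Python sketch illustrates that step for a high-density montage; array shapes and parameters are illustrative, not taken from the article.

```python
import numpy as np

def erp_average(eeg, events, fs, tmin=-0.1, tmax=0.6):
    """Epoch a continuous multi-channel recording around events and average.

    eeg    : array, shape (n_channels, n_samples), continuous EEG.
    events : iterable of stimulus-onset sample indices.
    fs     : sampling rate in Hz; tmin/tmax define the epoch window in seconds.
    Returns (times, erp) with erp of shape (n_channels, n_epoch_samples).
    The pre-stimulus mean is subtracted per channel and epoch (baseline correction).
    """
    pre, post = int(-tmin * fs), int(tmax * fs)
    epochs = []
    for ev in events:
        if ev - pre < 0 or ev + post > eeg.shape[1]:
            continue                                  # skip events near the edges
        epoch = eeg[:, ev - pre:ev + post].astype(float)
        epoch -= epoch[:, :pre].mean(axis=1, keepdims=True)
        epochs.append(epoch)
    times = np.arange(-pre, post) / fs
    return times, np.mean(epochs, axis=0)

# Example: 64 channels, 10 s at 500 Hz, 20 stimulus events.
rng = np.random.default_rng(5)
data = rng.standard_normal((64, 5000))
onsets = rng.integers(200, 4600, 20)
times, erp = erp_average(data, onsets, fs=500)
print(erp.shape)     # (64, 350)
```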

  5. Applying Online Monitoring for Nuclear Power Plant Instrumentation and Control

    NASA Astrophysics Data System (ADS)

    Hashemian, H. M.

    2010-10-01

    This paper presents a practical review of the state-of-the-art means for applying OLM data acquisition in nuclear power plant instrumentation and control, qualifying or validating the OLM data, and then analyzing it for static and dynamic performance monitoring applications. Whereas data acquisition for static or steady-state OLM applications can require sample rates of anywhere from one sample every 1 to 10 seconds to one sample per minute, dynamic data acquisition requires higher sampling frequencies (e.g., 100 to 1000 Hz) using a dedicated data acquisition system capable of providing isolation, anti-aliasing, removal of extraneous noise, and analog-to-digital (A/D) conversion. Qualifying the data for use with OLM algorithms can involve removing data 'dead' spots (for static data) and calculating, examining, and trending amplitude probability density, variance, skewness, and kurtosis. For static OLM applications with redundant signals, trending and averaging qualification techniques are used, and for single or non-redundant signals physical and empirical modeling are used. Dynamic OLM analysis is performed in the frequency domain and/or time domain, and is based on the assumption that sensors' or transmitters' dynamic characteristics are linear and that the input noise signal (i.e., the process fluctuations) has proper spectral characteristics.
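
    A minimal sketch of the static-data qualification step described above is given below: it flags flat-lined ('dead') segments and returns the descriptive statistics (variance, skewness, kurtosis) that would be trended over time. The tolerance and function names are assumptions for illustration, not values from the paper.

```python
import numpy as np
from scipy import stats

def qualify_segment(x, flat_tol=1e-9):
    """Basic qualification metrics for an OLM-style static signal segment.

    Flags 'dead' (flat-lined) data and otherwise returns descriptive
    statistics (variance, skewness, excess kurtosis) that can be trended
    over time. The flatness tolerance is an illustrative assumption.
    """
    x = np.asarray(x, dtype=float)
    if np.ptp(x) < flat_tol:                  # no variation at all -> dead spot
        return {"dead": True}
    return {
        "dead": False,
        "mean": float(x.mean()),
        "variance": float(x.var(ddof=1)),
        "skewness": float(stats.skew(x)),
        "kurtosis": float(stats.kurtosis(x)),   # 0 for a Gaussian signal
    }

# Example: a healthy noisy segment versus a flat-lined one.
rng = np.random.default_rng(6)
print(qualify_segment(rng.normal(0.0, 1.0, 1000)))
print(qualify_segment(np.full(1000, 3.2)))
```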

  6. Application of a single area array detector for acquisition, tracking and point-ahead in space optical communications

    NASA Technical Reports Server (NTRS)

    Clark, D. L.; Cosgrove, M.; Vanvranken, R.; Park, H.; Fitzmaurice, M.

    1989-01-01

    The functions of acquisition, tracking, and point-ahead in space optical communications are being combined into a single system utilizing an area array detector. An analysis of the feasibility of the concept is presented. The key parameters are: optical power less than 1 pW at 0.86 micrometers, acquisition in less than 30 seconds over an acquisition field of view (FOV) of 1 mrad, tracking with 0.5 microrad rms noise at a 1000 Hz update rate, and point-ahead transfer function precision of 0.25 microrad over a region of 150 microrad. Currently available array detectors were examined. The most demanding specifications are low output noise, high detection efficiency, a large number of pixels, and frame rates over 1 kHz. A proof-of-concept (POC) demonstration system is currently being built utilizing the Kodak HS-40 detector (a 128 x 128 photodiode array with a 64-channel CCD readout architecture which can be operated at frame rates as high as 40,000/sec). The POC system implements a windowing scheme and special-purpose digital signal processing electronics for matched-filter acquisition and tracking algorithms.

  7. A Constraint-Based Approach to Acquisition of Word-Final Consonant Clusters in Turkish Children

    ERIC Educational Resources Information Center

    Gokgoz-Kurt, Burcu

    2017-01-01

    The current study provides a constraint-based analysis of L1 word-final consonant cluster acquisition in Turkish child language, based on the data originally presented by Topbas and Kopkalli-Yavuz (2008). The present analysis was done using [?]+obstruent consonant cluster acquisition. A comparison of Gradual Learning Algorithm (GLA) under…

  8. Untangling Locality and Orientation Constraints in the L2 Acquisition of Anaphoric Binding: A Feature-Based Approach

    ERIC Educational Resources Information Center

    Dominguez, Laura; Hicks, Glyn; Song, Hee-Jeong

    2012-01-01

    This study offers a Minimalist analysis of the L2 acquisition of binding properties whereby cross-linguistic differences arise from the interaction of anaphoric feature specifications and operations of the computational system (Reuland 2001, 2011; Hicks 2009). This analysis attributes difficulties in the L2 acquisition of locality and orientation…

  9. Interfacing LabVIEW With Instrumentation for Electronic Failure Analysis and Beyond

    NASA Technical Reports Server (NTRS)

    Buchanan, Randy K.; Bryan, Coleman; Ludwig, Larry

    1996-01-01

    The Laboratory Virtual Instrumentation Engineering Workstation (LabVIEW) software is designed so that equipment and processes related to control systems can be operationally linked and controlled through a computer. Various processes within the failure analysis laboratories of NASA's Kennedy Space Center (KSC) demonstrate the need for modernization and, in some cases, automation, using LabVIEW. An examination of procedures and practices within the Failure Analysis Laboratory resulted in the conclusion that some device was necessary to bring potential users of LabVIEW to an operational level in minimum time. This paper outlines the process involved in creating a tutorial application to enable personnel to apply LabVIEW to their specific projects. Suggestions for furthering the extent to which LabVIEW is used are provided in the areas of data acquisition and process control.

  10. Action Sport Cameras as an Instrument to Perform a 3D Underwater Motion Analysis.

    PubMed

    Bernardina, Gustavo R D; Cerveri, Pietro; Barros, Ricardo M L; Marins, João C B; Silvatti, Amanda P

    2016-01-01

    Action sport cameras (ASC) are currently adopted mainly for entertainment purposes, but their continual technical improvements, accompanied by decreasing costs, are opening them up for quantitative three-dimensional (3D) motion analysis in sport gesture studies and athletic performance evaluation. Extending this technology to sport analysis, however, still requires a methodological step forward to make ASC a metric system, encompassing ad hoc camera setup, image processing, feature tracking, calibration and 3D reconstruction. Unlike traditional laboratory analysis, these requirements become an issue when coping with both indoor and outdoor motion acquisitions of athletes. In swimming analysis, for example, the camera setup and the calibration protocol are particularly demanding, since land and underwater cameras are mandatory. In particular, the underwater camera calibration can be an issue affecting the reconstruction accuracy. In this paper, the aim is to evaluate the feasibility of ASC for 3D underwater analysis by focusing on camera setup and data acquisition protocols. Two GoPro Hero3+ Black cameras (frequency: 60 Hz; image resolutions: 1280×720/1920×1080 pixels) were located underwater in a swimming pool, surveying a working volume of about 6 m³. A two-step custom calibration procedure, consisting of the acquisition of one static triad and one moving wand, carrying nine and one spherical passive markers, respectively, was implemented. After assessing camera parameters, a rigid bar, carrying two markers at a known distance, was acquired in several positions within the working volume. The average error in the reconstructed inter-marker distances was less than 2.5 mm (1280×720) and 1.5 mm (1920×1080). The results of this study demonstrate that the calibration of underwater ASC is feasible, enabling quantitative kinematic measurements with accuracy comparable to traditional motion capture systems.

  11. Action Sport Cameras as an Instrument to Perform a 3D Underwater Motion Analysis

    PubMed Central

    Cerveri, Pietro; Barros, Ricardo M. L.; Marins, João C. B.; Silvatti, Amanda P.

    2016-01-01

    Action sport cameras (ASC) are currently adopted mainly for entertainment purposes, but their continual technical improvements, accompanied by decreasing costs, are opening them up for quantitative three-dimensional (3D) motion analysis in sport gesture studies and athletic performance evaluation. Extending this technology to sport analysis, however, still requires a methodological step forward to make ASC a metric system, encompassing ad hoc camera setup, image processing, feature tracking, calibration and 3D reconstruction. Unlike traditional laboratory analysis, these requirements become an issue when coping with both indoor and outdoor motion acquisitions of athletes. In swimming analysis, for example, the camera setup and the calibration protocol are particularly demanding, since land and underwater cameras are mandatory. In particular, the underwater camera calibration can be an issue affecting the reconstruction accuracy. In this paper, the aim is to evaluate the feasibility of ASC for 3D underwater analysis by focusing on camera setup and data acquisition protocols. Two GoPro Hero3+ Black cameras (frequency: 60 Hz; image resolutions: 1280×720/1920×1080 pixels) were located underwater in a swimming pool, surveying a working volume of about 6 m³. A two-step custom calibration procedure, consisting of the acquisition of one static triad and one moving wand, carrying nine and one spherical passive markers, respectively, was implemented. After assessing camera parameters, a rigid bar, carrying two markers at a known distance, was acquired in several positions within the working volume. The average error in the reconstructed inter-marker distances was less than 2.5 mm (1280×720) and 1.5 mm (1920×1080). The results of this study demonstrate that the calibration of underwater ASC is feasible, enabling quantitative kinematic measurements with accuracy comparable to traditional motion capture systems. PMID:27513846
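
    The accuracy check in this kind of study reduces to triangulating the two bar markers from the calibrated camera pair and comparing the reconstructed inter-marker distance with the known bar length. The Python sketch below illustrates that computation using OpenCV's triangulation routine with synthetic projection matrices; it is an illustrative reconstruction-error check under assumed camera geometry, not the authors' calibration pipeline.

```python
import numpy as np
import cv2

def reconstruct_points(P1, P2, pts1, pts2):
    """Triangulate 3D marker positions from two calibrated cameras.

    P1, P2 : 3x4 projection matrices from the calibration step
             (camera matrix times [R|t]); pts1, pts2 : 2xN pixel coordinates.
    Returns an Nx3 array of 3D points.
    """
    X_h = cv2.triangulatePoints(P1, P2, pts1.astype(float), pts2.astype(float))
    return (X_h[:3] / X_h[3]).T

def bar_length_error(points_a, points_b, nominal):
    """Deviation of reconstructed inter-marker distances from the known bar length."""
    return np.linalg.norm(points_a - points_b, axis=1) - nominal

def project(P, X):
    """Project Nx3 world points with a 3x4 projection matrix to 2xN image points."""
    x = P @ np.hstack([X, np.ones((len(X), 1))]).T
    return x[:2] / x[2]

# Toy noise-free example with synthetic projection matrices (illustrative only).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-100.0], [0.0], [0.0]])])
bar = np.array([[50.0, 20.0, 800.0], [150.0, 20.0, 800.0]])      # two markers, 100 mm apart
rec = reconstruct_points(P1, P2, project(P1, bar), project(P2, bar))
print(bar_length_error(rec[:1], rec[1:], nominal=100.0))         # ~0 without noise
```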

  12. UK Defence Acquisition Process for NEC: Transaction Governance within an Integrated Project Team

    DTIC Science & Technology

    2009-04-22

    3-tier framework for a study of the acquisition of an Advance Military Vehicle (AMV), we explore the shaping of the buyer-supplier relationship in the context of the UK defence acquisition process...of the buyer, the MoD, and how this impacts its suppliers in the defence industrial base. An historical review of defence industrial relations is...

  13. Signal acquisition and scale calibration for beam power density distribution of electron beam welding

    NASA Astrophysics Data System (ADS)

    Peng, Yong; Li, Hongqiang; Shen, Chunlong; Guo, Shun; Zhou, Qi; Wang, Kehong

    2017-06-01

    The power density distribution of electron beam welding (EBW) is a key factor reflecting beam quality. A beam quality test system was designed to measure the actual beam power density distribution of high-voltage EBW. After analysis of the characteristics and the phase relationship between the deflection control signal and the acquisition signal, a post-trigger mode was proposed for signal acquisition, with the control signal and the sampling clock sharing the same external clock source. The power density distribution of the beam cross-section was reconstructed from the one-dimensional signal, which was processed by median filtering, two-pass signal segmentation and spatial scale calibration. The diameter of the beam cross-section was defined by an amplitude method and an integral method, respectively. The measured diameter under the integral definition is larger than that under the amplitude definition, whereas for the ideal distribution the former is smaller than the latter. The measured distribution is asymmetric and less concentrated than a Gaussian distribution.
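
    Since the abstract does not spell out the exact definitions, the sketch below contrasts two common ways of defining a beam diameter from a one-dimensional power-density profile: an amplitude method (width above a fixed fraction of the peak; 1/e² is assumed here) and an integral method (a second-moment, D4-sigma-style width). For a Gaussian profile the two agree; for asymmetric or flat-topped measured profiles they generally differ, which is the behavior noted above.

```python
import numpy as np

def width_amplitude(x, profile, fraction=np.exp(-2)):
    """Amplitude-based width: extent over which the profile exceeds a fixed
    fraction of its peak (1/e^2 assumed here)."""
    above = x[profile >= fraction * profile.max()]
    return above.max() - above.min()

def width_integral(x, profile):
    """Integral-based width: four standard deviations of the normalized
    power distribution (a D4-sigma style second-moment definition)."""
    p = profile / profile.sum()
    mean = np.sum(x * p)
    var = np.sum((x - mean) ** 2 * p)
    return 4.0 * np.sqrt(var)

# Example: for a Gaussian profile both definitions give ~4*sigma;
# for asymmetric or flat-topped measured profiles they diverge.
x = np.linspace(-5.0, 5.0, 2001)
gaussian = np.exp(-x ** 2 / 2.0)          # sigma = 1
print(width_amplitude(x, gaussian), width_integral(x, gaussian))
```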

  14. Target-locking acquisition with real-time confocal (TARC) microscopy.

    PubMed

    Lu, Peter J; Sims, Peter A; Oki, Hidekazu; Macarthur, James B; Weitz, David A

    2007-07-09

    We present a real-time target-locking confocal microscope that follows an object moving along an arbitrary path, even as it simultaneously changes its shape, size and orientation. This Target-locking Acquisition with Real-time Confocal (TARC) microscopy system integrates fast image processing and rapid image acquisition using a Nipkow spinning-disk confocal microscope. The system acquires a 3D stack of images, performs a full structural analysis to locate a feature of interest, moves the sample in response, and then collects the next 3D image stack. In this way, data collection is dynamically adjusted to keep a moving object centered in the field of view. We demonstrate the system's capabilities by target-locking freely diffusing clusters of attractive colloidal particles, and actively transported quantum dots (QDs) endocytosed into live cells free to move in three dimensions, for several hours. During this time, both the colloidal clusters and live cells move distances several times the length of the imaging volume.
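
    One iteration of a target-locking loop of this general kind can be sketched as: segment the bright feature in the freshly acquired 3D stack, locate its centre of mass, and compute the stage offset needed to re-centre it before the next stack is collected. The Python snippet below is an illustrative sketch of that feedback step, not the TARC system's code; the threshold and volume size are assumptions.

```python
import numpy as np
from scipy import ndimage

def locking_offset(stack, threshold):
    """One iteration of a target-locking feedback step (illustrative only).

    stack : 3D image volume (z, y, x) from the latest acquisition.
    Segments voxels above threshold, finds their intensity-weighted centre of
    mass and returns the offset (in voxels) needed to re-centre the feature,
    which a real system would convert into a stage move before the next stack.
    """
    mask = stack > threshold
    if not mask.any():
        return np.zeros(3)                     # nothing to lock onto
    com = np.array(ndimage.center_of_mass(np.where(mask, stack, 0.0)))
    centre = (np.array(stack.shape) - 1) / 2.0
    return centre - com

# Example: a bright blob sitting off-centre in a 64^3 volume.
rng = np.random.default_rng(7)
vol = rng.normal(0.0, 0.05, (64, 64, 64))
zz, yy, xx = np.ogrid[:64, :64, :64]
vol += np.exp(-((zz - 40) ** 2 + (yy - 30) ** 2 + (xx - 25) ** 2) / 30.0)
print(locking_offset(vol, threshold=0.5))      # roughly [-8.5, 1.5, 6.5]
```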

  15. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation we developed LFQbench, an R package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  16. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
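
    The benchmark metrics boil down to precision (replicate coefficient of variation) and accuracy (deviation of the observed between-sample ratio from the ratio fixed by the hybrid-proteome design, summarized per species). The sketch below computes such metrics in Python on synthetic data; it mirrors the kind of summary LFQbench reports but is not the LFQbench R package, and its column and species names are hypothetical.

```python
import numpy as np
import pandas as pd

def benchmark_metrics(intensities, species, cols_a, cols_b, expected_log2):
    """Per-species precision/accuracy summary for a hybrid-proteome benchmark.

    intensities  : DataFrame of protein intensities (rows = proteins).
    species      : Series giving each protein's source organism.
    expected_log2: dict of the designed log2(A/B) spike-in ratio per species.
    Precision = median replicate CV in sample A; accuracy = median absolute
    deviation of the observed log2(A/B) ratio from the expected ratio.
    """
    a, b = intensities[cols_a], intensities[cols_b]
    cv_a = a.std(axis=1) / a.mean(axis=1)
    log2_ratio = np.log2(a.mean(axis=1) / b.mean(axis=1))
    summary = {}
    for sp, expected in expected_log2.items():
        sel = species == sp
        summary[sp] = {
            "median_cv_A": float(cv_a[sel].median()),
            "median_abs_log2_error": float((log2_ratio[sel] - expected).abs().median()),
        }
    return summary

# Tiny synthetic example: human proteins at 1:1 and yeast spiked 2:1 (A:B).
rng = np.random.default_rng(8)
n = 200
species = pd.Series(["human"] * 150 + ["yeast"] * 50)
base = rng.lognormal(10, 1, n)
ratio = np.where(species == "yeast", 2.0, 1.0)
noise = lambda: rng.lognormal(0, 0.05, n)
df = pd.DataFrame({"A1": base * ratio * noise(), "A2": base * ratio * noise(),
                   "B1": base * noise(), "B2": base * noise()})
print(benchmark_metrics(df, species, ["A1", "A2"], ["B1", "B2"],
                        {"human": 0.0, "yeast": 1.0}))
```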

  17. 48 CFR 434.003 - Responsibilities.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) individually or as a group will participate in this decision making process. (b) The Chief Information Officer (CIO) is the Major Information Technology Systems Executive. For acquisitions of information technology... information technology system acquisition, designating an acquisition to be a major information technology...

  18. 48 CFR 434.003 - Responsibilities.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) individually or as a group will participate in this decision making process. (b) The Chief Information Officer (CIO) is the Major Information Technology Systems Executive. For acquisitions of information technology... information technology system acquisition, designating an acquisition to be a major information technology...

  19. 48 CFR 434.003 - Responsibilities.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...) individually or as a group will participate in this decision making process. (b) The Chief Information Officer (CIO) is the Major Information Technology Systems Executive. For acquisitions of information technology... information technology system acquisition, designating an acquisition to be a major information technology...

  20. 48 CFR 434.003 - Responsibilities.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...) individually or as a group will participate in this decision making process. (b) The Chief Information Officer (CIO) is the Major Information Technology Systems Executive. For acquisitions of information technology... information technology system acquisition, designating an acquisition to be a major information technology...

  1. 48 CFR 15.102 - Oral presentations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Oral presentations. 15.102 Section 15.102 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 15.102 Oral...

  2. Health Hazard Assessment and Toxicity Clearances in the Army Acquisition Process

    NASA Technical Reports Server (NTRS)

    Macko, Joseph A., Jr.

    2000-01-01

    The United States Army Materiel Command's Army Acquisition Pollution Prevention Support Office (AAPPSO) is responsible for creating and managing the U.S. Army-wide Acquisition Pollution Prevention Program. It has established Integrated Process Teams (IPTs) within each of the Major Subordinate Commands of the Army Materiel Command. AAPPSO provides centralized integration, coordination, and oversight of the Army Acquisition Pollution Prevention Program (AAPPP), and the IPTs provide the decentralized execution of the AAPPSO program. AAPPSO issues policy and guidance, provides resources and prioritizes P2 efforts. It is the policy of the AAPPP to require United States Army Surgeon General approval of all materials or substances that will be used as alternatives to existing hazardous materials, toxic materials and substances, and ozone-depleting substances. The Army has a formal process established to address this effort. Army Regulation 40-10 requires a Health Hazard Assessment (HHA) during the acquisition milestones of a new Army system. Army Regulation 40-5 addresses the Toxicity Clearance (TC) process used to evaluate new chemicals and materials prior to acceptance as alternatives. The U.S. Army Center for Health Promotion and Preventive Medicine is the Army's matrixed medical health organization that performs the HHA and TC missions.

  3. Data Acquisition with GPUs: The DAQ for the Muon g-2 Experiment at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gohn, W.

    Graphical Processing Units (GPUs) have recently become a valuable computing tool for the acquisition of data at high rates and at relatively low cost. The devices work by parallelizing the code into thousands of threads, each executing a simple process, such as identifying pulses from a waveform digitizer. The CUDA programming library can be used to effectively write code to parallelize such tasks on Nvidia GPUs, providing a significant upgrade in performance over CPU-based acquisition systems. The muon g-2 experiment at Fermilab relies heavily on GPUs to process its data. The data acquisition system for this experiment must have the ability to create deadtime-free records from 700 μs muon spills at a raw data rate of 18 GB per second. Data will be collected using 1296 channels of μTCA-based 800 MSPS, 12-bit waveform digitizers and processed in a layered array of networked commodity processors with 24 GPUs working in parallel to perform a fast recording of the muon decays during the spill. The described data acquisition system is currently being constructed, and will be fully operational before the start of the experiment in 2017.
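
    The per-sample work the GPUs perform, identifying pulses in digitized waveforms, is embarrassingly parallel. The sketch below shows the same threshold-crossing logic in vectorized NumPy as a stand-in for the CUDA kernels; channel counts, ADC pedestal values and the threshold are illustrative assumptions, not the experiment's actual configuration.

```python
import numpy as np

def find_pulses(waveforms, threshold):
    """Vectorized pulse identification over a batch of digitizer waveforms.

    waveforms : int array, shape (n_channels, n_samples), e.g. 12-bit ADC codes.
    A pulse start is recorded wherever a sample drops below threshold while the
    previous sample was above it (negative-going pulses below a pedestal).
    Returns (channel_indices, sample_indices) of the detected pulse starts.
    """
    below = waveforms < threshold
    starts = below[:, 1:] & ~below[:, :-1]          # falling-edge crossings
    ch, idx = np.nonzero(starts)
    return ch, idx + 1

# Example: 16 channels x 10,000 samples of pedestal noise with two injected pulses.
rng = np.random.default_rng(9)
wf = rng.normal(1750, 3, (16, 10000)).astype(np.int16)
wf[4, 5000:5008] -= 400                             # pulse on channel 4
wf[11, 200:206] -= 350                              # pulse on channel 11
print(find_pulses(wf, threshold=1600))              # -> channels [4, 11], samples [5000, 200]
```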

  4. MEA-Tools: an open source toolbox for the analysis of multi-electrode data with MATLAB.

    PubMed

    Egert, U; Knott, Th; Schwarz, C; Nawrot, M; Brandt, A; Rotter, S; Diesmann, M

    2002-05-30

    Recent advances in electrophysiological techniques have created new tools for the acquisition and storage of neuronal activity recorded simultaneously with numerous electrodes. These techniques support the analysis of the function as well as the structure of individual electrogenic cells in the context of the surrounding neuronal or cardiac network. Commercially available tools for the analysis of such data, however, cannot be easily adapted to newly emerging requirements for data analysis and visualization, and cross-compatibility between them is limited. In this report we introduce a free open-source toolbox called microelectrode array tools (MEA-Tools) for the analysis of multi-electrode data based on the common data analysis environment MATLAB (version 5.3-6.1, The Mathworks, Natick, MA). The toolbox itself is platform independent. The file interface currently supports files recorded with MCRack (Multi Channel Systems, Reutlingen, Germany) under Microsoft Windows 95, 98, NT, and 2000, but can be adapted to other data acquisition systems. Functions are controlled via command-line input and graphical user interfaces, and support common requirements for the analysis of local field potentials, extracellular spike activity, and continuous recordings, in addition to supplementary data acquired by additional instruments, e.g. intracellular amplifiers. Data may be processed as continuous recordings or time windows triggered to some event.

  5. 48 CFR 803.806 - Processing suspected violations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Processing suspected violations. 803.806 Section 803.806 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL IMPROPER BUSINESS PRACTICES AND PERSONAL CONFLICTS OF INTEREST Limitation on the Payment of Funds...

  6. 48 CFR 842.1203 - Processing agreements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Processing agreements. 842.1203 Section 842.1203 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS CONTRACT... submit all supporting agreements and documentation to the OGC for review as to legal sufficiency. ...

  7. 48 CFR 30.604 - Processing changes to disclosed or established cost accounting practices.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... disclosed or established cost accounting practices. 30.604 Section 30.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL CONTRACTING REQUIREMENTS COST ACCOUNTING STANDARDS ADMINISTRATION CAS Administration 30.604 Processing changes to disclosed or established cost accounting practices...

  8. 48 CFR 30.604 - Processing changes to disclosed or established cost accounting practices.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... disclosed or established cost accounting practices. 30.604 Section 30.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL CONTRACTING REQUIREMENTS COST ACCOUNTING STANDARDS ADMINISTRATION CAS Administration 30.604 Processing changes to disclosed or established cost accounting practices...

  9. 48 CFR 915.404 - Proposal analysis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Proposal analysis. 915.404 Section 915.404 Federal Acquisition Regulations System DEPARTMENT OF ENERGY CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 915.404 Proposal analysis. ...

  10. 48 CFR 1215.404 - Proposal analysis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Proposal analysis. 1215.404 Section 1215.404 Federal Acquisition Regulations System DEPARTMENT OF TRANSPORTATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 1215.404 Proposal analysis. ...

  11. 48 CFR 2815.404 - Proposal analysis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Proposal analysis. 2815.404 Section 2815.404 Federal Acquisition Regulations System DEPARTMENT OF JUSTICE CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 2815.404 Proposal analysis. ...

  12. 48 CFR 1215.404 - Proposal analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Proposal analysis. 1215.404 Section 1215.404 Federal Acquisition Regulations System DEPARTMENT OF TRANSPORTATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 1215.404 Proposal analysis. ...

  13. 48 CFR 2815.404 - Proposal analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Proposal analysis. 2815.404 Section 2815.404 Federal Acquisition Regulations System DEPARTMENT OF JUSTICE Contracting Methods and Contract Types CONTRACTING BY NEGOTIATION Contract Pricing 2815.404 Proposal analysis. ...

  14. 48 CFR 915.404 - Proposal analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Proposal analysis. 915.404 Section 915.404 Federal Acquisition Regulations System DEPARTMENT OF ENERGY CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 915.404 Proposal analysis. ...

  15. Heuristic-based information acquisition and decision making among pilots.

    PubMed

    Wiggins, Mark W; Bollwerk, Sandra

    2006-01-01

    This research was designed to examine the impact of heuristic-based approaches to the acquisition of task-related information on the selection of an optimal alternative during simulated in-flight decision making. The work integrated features of naturalistic and normative decision making and strategies of information acquisition within a computer-based decision support framework. The study comprised two phases, the first of which involved familiarizing pilots with three different heuristic-based strategies of information acquisition: frequency, elimination by aspects, and majority of confirming decisions. The second phase enabled participants to choose one of the three strategies of information acquisition to resolve a fourth (choice) scenario. The results indicated that task-oriented experience, rather than the information acquisition strategies, predicted the selection of the optimal alternative. It was also evident that of the three strategies available, the elimination by aspects information acquisition strategy was preferred by most participants. It was concluded that task-oriented experience, rather than the process of information acquisition, predicted task accuracy during the decision-making task. It was also concluded that pilots have a preference for one particular approach to information acquisition. Applications of this research include the development of decision support systems that adapt to the information-processing capabilities and preferences of users.
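    Of the three strategies, elimination by aspects is the most directly algorithmic: alternatives are discarded attribute by attribute, in order of importance, until (ideally) one survives. The Python sketch below is a simplified, deterministic rendering of that idea; the diversion-airport attributes, cutoffs, and option names are invented for illustration and do not come from the study.

    ```python
    def eliminate_by_aspects(alternatives, aspects):
        """Simplified, deterministic elimination by aspects.

        alternatives: dict mapping option name -> dict of attribute values.
        aspects: list of (attribute, minimum_acceptable) pairs, ordered from
        most to least important.
        """
        remaining = dict(alternatives)
        for attribute, cutoff in aspects:
            survivors = {name: attrs for name, attrs in remaining.items()
                         if attrs.get(attribute, float("-inf")) >= cutoff}
            if survivors:          # never eliminate every remaining option
                remaining = survivors
            if len(remaining) == 1:
                break
        return list(remaining)

    # Hypothetical diversion-airport choice for an in-flight decision.
    options = {
        "Airport A": {"ceiling_ft": 3000, "runway_ft": 6500, "fuel_margin_min": 25},
        "Airport B": {"ceiling_ft": 1200, "runway_ft": 9000, "fuel_margin_min": 40},
        "Airport C": {"ceiling_ft": 2800, "runway_ft": 5200, "fuel_margin_min": 55},
    }
    priorities = [("ceiling_ft", 2000), ("runway_ft", 6000), ("fuel_margin_min", 20)]
    print(eliminate_by_aspects(options, priorities))  # -> ['Airport A']
    ```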

  16. Acquisition of Derivational Lexical Rules: A Case Study of the Acquisition of French Agent Noun Forms by L2 Learners

    ERIC Educational Resources Information Center

    Redouane, Rabia

    2007-01-01

    This study investigates L2 learners' use of French derivational processes and their strategies as they form agent nouns. It also attempts to find out which of the acquisitional principles (conventionality, semantic transparency, formal simplicity, and productivity) advanced by Clark (1993, 2003) for various L1s' acquisition of word formation…

  17. An automatic frequency control loop using overlapping DFTs (Discrete Fourier Transforms)

    NASA Technical Reports Server (NTRS)

    Aguirre, S.

    1988-01-01

    An automatic frequency control (AFC) loop is introduced and analyzed in detail. The new scheme is a generalization of the well-known Cross Product AFC loop that uses running overlapping discrete Fourier transforms (DFTs) to create a discriminator curve. Linear analysis is included and supported with computer simulations. The algorithm is tested in a low carrier-to-noise ratio (CNR) dynamic environment, and the probability of loss of lock is estimated via computer simulations. The algorithm discussed is a suboptimum tracking scheme with a larger frequency-error variance than an optimum strategy, but offers simplicity of implementation and a very low operating threshold CNR. This technique can be applied during the carrier acquisition and re-acquisition process in the Advanced Receiver.
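    The discriminator behind a cross-product AFC built on overlapping DFTs can be sketched compactly: a residual frequency error rotates the carrier phase between two time-shifted DFT windows, and the angle of their conjugate product recovers that rotation. The Python sketch below is a toy, noiseless, complex-baseband version driving a first-order loop update; the block length, overlap, loop gain, and sample rate are arbitrary choices and not the Advanced Receiver's parameters, and the pull-in range is limited to roughly fs/(2*(block-overlap)).

    ```python
    import numpy as np

    def afc_discriminator(x, f_hat, fs, block, overlap):
        """Frequency-error estimate from two overlapping single-bin DFTs."""
        shift = block - overlap
        n = np.arange(shift + block)
        bb = x[:shift + block] * np.exp(-2j * np.pi * f_hat * n / fs)  # NCO mix-down
        X1 = bb[:block].sum()                # bin-0 DFT of the first window
        X2 = bb[shift:shift + block].sum()   # bin-0 DFT of the overlapping window
        # A residual error df rotates the phase by 2*pi*df*shift/fs between windows.
        return np.angle(X2 * np.conj(X1)) * fs / (2 * np.pi * shift)

    # Toy usage: pull a 1.1 kHz frequency estimate onto a 1.2 kHz complex tone.
    fs, f_true, f_hat = 48000.0, 1200.0, 1100.0
    block, overlap, gain = 256, 128, 0.5
    n = np.arange(20 * block)
    x = np.exp(2j * np.pi * f_true * n / fs)             # noiseless complex tone
    step = block - overlap
    for k in range(0, len(x) - (block + step), step):
        f_hat += gain * afc_discriminator(x[k:], f_hat, fs, block, overlap)
    print(round(f_hat, 1))                                # converges to 1200.0
    ```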

  18. SpaceOps 1992: Proceedings of the Second International Symposium on Ground Data Systems for Space Mission Operations

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Second International Symposium featured 135 oral presentations in these 12 categories: Future Missions and Operations; System-Level Architectures; Mission-Specific Systems; Mission and Science Planning and Sequencing; Mission Control; Operations Automation and Emerging Technologies; Data Acquisition; Navigation; Operations Support Services; Engineering Data Analysis of Space Vehicle and Ground Systems; Telemetry Processing, Mission Data Management, and Data Archiving; and Operations Management. Topics focused on improvements in the productivity, effectiveness, efficiency, and quality of mission operations, ground systems, and data acquisition. Also emphasized were accomplishments in management of human factors; use of information systems to improve data retrieval, reporting, and archiving; design and implementation of logistics support for mission operations; and the use of telescience and teleoperations.

  19. Stress Modulates Instrumental Learning Performances in Horses (Equus caballus) in Interaction with Temperament

    PubMed Central

    Valenchon, Mathilde; Lévy, Frédéric; Prunier, Armelle; Moussu, Chantal; Calandreau, Ludovic; Lansade, Léa

    2013-01-01

    The present study investigates how the temperament of the animal affects the influence of acute stress on the acquisition and reacquisition processes of a learning task. After temperament was assessed, horses were subjected to a stressor before or after the acquisition session of an instrumental task. Eight days later, horses were subjected to a reacquisition session without any stressor. Stress before acquisition tended to enhance the number of successes at the beginning of the acquisition session. Eight days later, during the reacquisition session, contrary to non-stressed animals, horses stressed after acquisition, and, to a lesser extent, horses stressed before acquisition, did not improve their performance between acquisition and reacquisition sessions. Temperament influenced learning performances in stressed horses only. Particularly, locomotor activity improved performances whereas fearfulness impaired them under stressful conditions. Results suggest that direct exposure to a stressor tended to increase acquisition performances, whereas a state of stress induced by the memory of a stressor, because it has been previously associated with the learning context, impaired reacquisition performances. The negative effect of a state of stress on reacquisition performances appeared to be stronger when exposure to the stressor occurred after rather than before the acquisition session. Temperament had an impact on both acquisition and reacquisition processes, but under stressful conditions only. These results suggest that stress is necessary to reveal the influence of temperament on cognitive performances. PMID:23626801

  20. Integration of a Portfolio-based Approach to Evaluate Aerospace R and D Problem Formulation Into a Parametric Synthesis Tool

    NASA Astrophysics Data System (ADS)

    Oza, Amit R.

    The focus of this study is to improve R&D effectiveness in aerospace and defense planning during the early stages of the product development lifecycle. Emphasis is on: correct formulation of a decision problem, with special attention to the data relationships between the individual design problem and the system capability required to size the aircraft; understanding of the acquisition strategy's objective and subjective data requirements needed to arrive at a balanced analysis and/or "correct" mix of technology projects; understanding of the outputs that can be created from the technology analysis; and methods the researcher can use to effectively support decisions at the acquisition and conceptual design levels through a research and development portfolio strategy. The primary objectives of this study are to: (1) determine what strategy should be used to initialize conceptual design parametric sizing processes during requirements analysis for the materiel solution analysis stage of the product development lifecycle, when utilizing data already constructed in the later phase, while working with a generic database-management-system synthesis tool integration architecture for aircraft design, and (2) assess how these new data relationships can contribute to innovative decision-making when solving acquisition hardware/technology portfolio problems. As such, an automated, composable problem formulation system is developed to consider data interactions for the system architecture that manages acquisition pre-design concept refinement portfolio management and conceptual design parametric sizing requirements. The research includes a way to:
    • Formalize the data storage and implement the data relationship structure with a system architecture automated through a database management system.
    • Allow for composable modeling, in terms of level of hardware abstraction, for the product model, mission model, and operational constraint model data blocks in the pre-design stages.
    • Allow the product model, mission model, and operational constraint model to be cross-referenced with a generic aircraft synthesis capability to identify disciplinary analysis methods and processes.
    • Allow for matching, comparison, and balancing of the aircraft hardware portfolio against the associated developmental and technology risk metrics.
    • Allow for visualization of the technology portfolio decision space.
    The problem formulation architecture is finally implemented and verified for a generic hypersonic vehicle research demonstrator, where a portfolio of technology hardware is measured for developmental and technology risks, prioritized by the researcher's risk constraints, and the data generated is delivered to a novel aircraft synthesis tool to confirm vehicle feasibility.
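    As a cartoon of the portfolio matching and balancing step described above, the sketch below filters a hypothetical hardware portfolio against researcher-imposed risk ceilings and ranks the survivors with a simple benefit-minus-risk score; the item names, normalized risk scales, weights, and scoring rule are all assumptions and not the dissertation's method.

    ```python
    from dataclasses import dataclass

    @dataclass
    class TechnologyItem:
        name: str
        performance_benefit: float  # normalized 0-1 contribution to sizing goals
        development_risk: float     # normalized 0-1, higher = riskier
        technology_risk: float      # normalized 0-1, higher = riskier

    def prioritize(portfolio, max_dev_risk, max_tech_risk, w_benefit=1.0, w_risk=0.5):
        """Drop items that violate the risk constraints, then rank the rest."""
        feasible = [t for t in portfolio
                    if t.development_risk <= max_dev_risk
                    and t.technology_risk <= max_tech_risk]
        score = lambda t: (w_benefit * t.performance_benefit
                           - w_risk * (t.development_risk + t.technology_risk) / 2)
        return sorted(feasible, key=score, reverse=True)

    # Hypothetical hardware options for a hypersonic research demonstrator.
    portfolio = [
        TechnologyItem("scramjet flowpath A", 0.9, 0.8, 0.7),
        TechnologyItem("ceramic-matrix leading edge", 0.6, 0.4, 0.5),
        TechnologyItem("conventional TPS baseline", 0.3, 0.2, 0.1),
    ]
    for item in prioritize(portfolio, max_dev_risk=0.6, max_tech_risk=0.6):
        print(item.name)
    ```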
