Sample records for collection processing analysis

  1. Parallel log structured file system collective buffering to achieve a compact representation of scientific and/or dimensional data

    DOEpatents

    Grider, Gary A.; Poole, Stephen W.

    2015-09-01

    Collective buffering and data pattern solutions are provided for storage, retrieval, and/or analysis of data in a collective parallel processing environment. For example, a method can be provided for data storage in a collective parallel processing environment. The method comprises receiving data to be written for a plurality of collective processes within a collective parallel processing environment, extracting a data pattern for the data to be written for the plurality of collective processes, generating a representation describing the data pattern, and saving the data and the representation.
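
    The core idea can be illustrated with a toy sketch (not the patented method itself): if the offsets written by the collective processes follow a regular stride, the full offset list can be replaced by a compact descriptor. All names below are illustrative assumptions.

```python
# Hypothetical sketch of the pattern-extraction idea: if every process
# writes fixed-size blocks at a constant stride, the offset list can be
# replaced by a compact (start, stride, count, block_size) descriptor.
from dataclasses import dataclass

@dataclass
class StridePattern:
    start: int       # offset of the first block
    stride: int      # distance between consecutive block offsets
    count: int       # number of blocks
    block_size: int  # bytes per block

def extract_pattern(offsets, block_size):
    """Return a StridePattern if the offsets are regularly strided, else None."""
    if len(offsets) < 2:
        return None
    stride = offsets[1] - offsets[0]
    if all(b - a == stride for a, b in zip(offsets, offsets[1:])):
        return StridePattern(offsets[0], stride, len(offsets), block_size)
    return None  # irregular pattern: fall back to storing raw offsets

# Example: four processes each writing 1 MiB blocks interleaved by rank.
offsets = [0, 4 << 20, 8 << 20, 12 << 20]
print(extract_pattern(offsets, 1 << 20))
```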

  2. Post-analysis report on Chesapeake Bay data processing. [spectral analysis and recognition computer signature extension]

    NASA Technical Reports Server (NTRS)

    Thomson, F.

    1972-01-01

    The additional processing performed on data collected over the Rhode River Test Site and Forestry Site in November 1970 is reported. The techniques and procedures used to obtain the processed results are described. Thermal data collected over three approximately parallel lines of the site were contoured, and the results color coded, to delineate important scene constituents and to identify trees attacked by pine bark beetles. Contouring work and histogram preparation are reviewed, and the important conclusions from the spectral analysis and recognition computer (SPARC) signature extension work are summarized. The SPARC setup and processing records are presented, and recommendations are made for future data collection over the site.

  3. Development of a Reference Image Collection Library for Histopathology Image Processing, Analysis and Decision Support Systems Research.

    PubMed

    Kostopoulos, Spiros; Ravazoula, Panagiota; Asvestas, Pantelis; Kalatzis, Ioannis; Xenogiannopoulos, George; Cavouras, Dionisis; Glotsos, Dimitris

    2017-06-01

    Histopathology image processing, analysis and computer-aided diagnosis have been shown as effective assisting tools towards reliable and intra-/inter-observer invariant decisions in traditional pathology. Especially for cancer patients, decisions need to be as accurate as possible in order to increase the probability of optimal treatment planning. In this study, we propose a new image collection library (HICL-Histology Image Collection Library) comprising 3831 histological images of three different diseases, for fostering research in histopathology image processing, analysis and computer-aided diagnosis. Raw data comprised 93, 116 and 55 cases of brain, breast and laryngeal cancer, respectively, collected from the archives of the University Hospital of Patras, Greece. The 3831 images were generated from the most representative regions of the pathology, specified by an experienced histopathologist. The HICL Image Collection is free for access under an academic license at http://medisp.bme.teiath.gr/hicl/. Potential exploitations of the proposed library may span a broad spectrum, such as in image processing to improve visualization, in segmentation for nuclei detection, in decision support systems for second opinion consultations, in statistical analysis for investigation of potential correlations between clinical annotations and imaging findings and, generally, in fostering research on histopathology image processing and analysis. To the best of our knowledge, the HICL constitutes the first attempt towards creation of a reference image collection library in the field of traditional histopathology, publicly and freely available to the scientific community.

  4. Global Positioning System data collection, processing, and analysis conducted by the U.S. Geological Survey Earthquake Hazards Program

    USGS Publications Warehouse

    Murray, Jessica R.; Svarc, Jerry L.

    2017-01-01

    The U.S. Geological Survey Earthquake Science Center collects and processes Global Positioning System (GPS) data throughout the western United States to measure crustal deformation related to earthquakes and tectonic processes as part of a long‐term program of research and monitoring. Here, we outline data collection procedures and present the GPS dataset built through repeated temporary deployments since 1992. This dataset consists of observations at ∼1950 locations. In addition, this article details our data processing and analysis procedures, which consist of the following. We process the raw data collected through temporary deployments, in addition to data from continuously operating western U.S. GPS stations operated by multiple agencies, using the GIPSY software package to obtain position time series. Subsequently, we align the positions to a common reference frame, determine the optimal parameters for a temporally correlated noise model, and apply this noise model when carrying out time‐series analysis to derive deformation measures, including constant interseismic velocities, coseismic offsets, and transient postseismic motion.
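
    As a hedged illustration of the time-series step described above, the sketch below fits a single synthetic position component with a constant interseismic velocity plus one coseismic offset by least squares. It omits the seasonal terms, postseismic transients, GIPSY processing, and the correlated-noise model the authors actually use; all numbers are invented.

```python
import numpy as np

# Minimal sketch: fit one GPS position component as a constant
# interseismic velocity plus a coseismic step at a known earthquake
# time t_eq (simplified; real analyses add seasonal terms, postseismic
# decay, and a temporally correlated noise model).
rng = np.random.default_rng(0)
t = np.linspace(1992.0, 2017.0, 300)                 # decimal years (synthetic)
t_eq = 2004.5
truth = 2.0 + 5.0 * (t - t[0]) + 12.0 * (t >= t_eq)  # position in mm
y = truth + rng.normal(0.0, 1.5, t.size)             # add white noise

# Design matrix: intercept, velocity, Heaviside step at t_eq.
G = np.column_stack([np.ones_like(t), t - t[0], (t >= t_eq).astype(float)])
m, *_ = np.linalg.lstsq(G, y, rcond=None)
print(f"velocity = {m[1]:.2f} mm/yr, coseismic offset = {m[2]:.2f} mm")
```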

  5. 75 FR 40839 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-14

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2010-N-0357] Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis and Critical Control Point Procedures for the Safe and Sanitary Processing and Importing of Juice AGENCY: Food...

  6. Metabolic profiling of body fluids and multivariate data analysis.

    PubMed

    Trezzi, Jean-Pierre; Jäger, Christian; Galozzi, Sara; Barkovits, Katalin; Marcus, Katrin; Mollenhauer, Brit; Hiller, Karsten

    2017-01-01

    Metabolome analyses of body fluids are challenging due to pre-analytical variations, such as pre-processing delay and temperature, and the constant dynamic changes of biochemical processes within the samples. Therefore, proper sample handling, from the time of collection up to the analysis, is crucial to obtain high-quality samples and reproducible results. A metabolomics analysis is divided into 4 main steps: 1) Sample collection, 2) Metabolite extraction, 3) Data acquisition and 4) Data analysis. Here, we describe a protocol for gas chromatography coupled to mass spectrometry (GC-MS) based metabolic analysis of biological matrices, especially body fluids. This protocol can be applied to blood serum/plasma, saliva and cerebrospinal fluid (CSF) samples of humans and other vertebrates. It covers sample collection, sample pre-processing, metabolite extraction, GC-MS measurement and guidelines for the subsequent data analysis. Advantages of this protocol include: robust and reproducible metabolomics results that take into account pre-analytical variations occurring during the sampling process; a small required sample volume; rapid and cost-effective processing of biological samples; and logistic-regression-based determination of biomarker signatures for in-depth data analysis.

  7. Closing Intelligence Gaps: Synchronizing the Collection Management Process

    DTIC Science & Technology

    information flow. The US military divides the world into six distinct geographic areas with corresponding commanders managing risk and weighing...analyzed information, creating a mismatch between supply and demand. The result is a burden on all facets of the intelligence process. However, if the target...system, or problem requiring analysis is not collected, intelligence fails. Executing collection management under the traditional tasking process

  8. Optimized Enhanced Bioremediation Through 4D Geophysical Monitoring and Autonomous Data Collection, Processing and Analysis

    DTIC Science & Technology

    2014-09-01

    (ER-200717) Optimized Enhanced Bioremediation Through 4D Geophysical Monitoring and Autonomous Data Collection, Processing and Analysis

  9. 77 FR 71473 - Agency Information Collection Activities: Requests for Comments; Clearance of Renewed Approval of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-30

    ... validating and maintaining the effectiveness of air carrier training program curriculum content. DATES... training. AQP is continuously validated through the collection and analysis of trainee performance. Data collection and analysis processes ensure that the certificate holder provides performance information on its...

  10. Collection, processing and dissemination of data for the national solar demonstration program

    NASA Technical Reports Server (NTRS)

    Day, R. E.; Murphy, L. J.; Smok, J. T.

    1978-01-01

    A national solar data system developed for the DOE by IBM provides for automatic gathering, conversion, transfer, and analysis of demonstration site data. NASA requirements for this system include providing solar site hardware, engineering, data collection, and analysis. The specific tasks include: (1) solar energy system design/integration; (2) developing a site data acquisition subsystem; (3) developing a central data processing system; (4) operating the test facility at Marshall Space Flight Center; (5) collecting and analyzing data. The systematic analysis and evaluation of the data from the National Solar Data System is reflected in a monthly performance report and a solar energy system performance evaluation report.

  11. Video-processing-based system for automated pedestrian data collection and analysis when crossing the street

    NASA Astrophysics Data System (ADS)

    Mansouri, Nabila; Watelain, Eric; Ben Jemaa, Yousra; Motamed, Cina

    2018-03-01

    Computer-vision techniques for pedestrian detection and tracking have progressed considerably and become widely used in several applications. However, a quick glance at the literature shows a minimal use of these techniques in pedestrian behavior and safety analysis, which might be due to the technical complexities involved in processing pedestrian videos. To extract pedestrian trajectories from a video automatically, all road users must be detected and tracked during sequences, which is a challenging task, especially in a congested open-outdoor urban space. A multipedestrian tracker based on an interframe-detection-association process was proposed and evaluated. The tracker results are used to implement an automatic tool, based on video processing, for collecting data on pedestrians crossing the street. The variations in the instantaneous speed allowed the detection of the street-crossing phases (approach, waiting, and crossing). These were addressed for the first time in pedestrian road-safety analysis to illustrate the causal relationship between pedestrian behaviors in the different phases. A comparison with a manual data collection method, by computing the root mean square error and the Pearson correlation coefficient, confirmed that the proposed procedures have significant potential to automate the data collection process.
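
    The validation step (comparing automated against manual measurements via RMSE and the Pearson coefficient) can be sketched as follows; the speed values are invented placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

# Illustrative validation of automated against manual measurements
# (synthetic numbers; the study's own data are not reproduced here).
manual    = np.array([1.21, 0.95, 1.40, 1.10, 1.33, 0.88])  # m/s
automated = np.array([1.18, 0.99, 1.35, 1.14, 1.30, 0.91])  # m/s

rmse = np.sqrt(np.mean((automated - manual) ** 2))
r, p = pearsonr(manual, automated)
print(f"RMSE = {rmse:.3f} m/s, Pearson r = {r:.3f} (p = {p:.3g})")
```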

  12. Forensic Analysis of Compromised Computers

    NASA Technical Reports Server (NTRS)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
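
    The original tool is a PERL script; the following is a hypothetical Python re-implementation of the described behavior (walk a directory tree to a given depth, skip directories themselves, and write spreadsheet-ready rows of ownership and timestamp data), offered only as a sketch.

```python
import csv, os, sys, time

# Illustrative Python analog of the described Perl script, not NASA's code.
def dump_tree(root, out_path, max_depth):
    with open(out_path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["path", "owner_uid", "size", "ctime", "atime"])
        base = root.rstrip(os.sep).count(os.sep)
        for dirpath, dirnames, filenames in os.walk(root):
            if dirpath.count(os.sep) - base >= max_depth:
                dirnames[:] = []      # stop descending past the depth limit
                continue
            for name in filenames:    # directories themselves are skipped
                p = os.path.join(dirpath, name)
                st = os.lstat(p)
                w.writerow([p, st.st_uid, st.st_size,
                            time.ctime(st.st_ctime), time.ctime(st.st_atime)])

if __name__ == "__main__":
    dump_tree(sys.argv[1], sys.argv[2], int(sys.argv[3]))
```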

  13. Rapid analysis of protein backbone resonance assignments using cryogenic probes, a distributed Linux-based computing architecture, and an integrated set of spectral analysis tools.

    PubMed

    Monleón, Daniel; Colson, Kimberly; Moseley, Hunter N B; Anklin, Clemens; Oswald, Robert; Szyperski, Thomas; Montelione, Gaetano T

    2002-01-01

    Rapid data collection, spectral referencing, processing by time domain deconvolution, peak picking and editing, and assignment of NMR spectra are necessary components of any efficient integrated system for protein NMR structure analysis. We have developed a set of software tools designated AutoProc, AutoPeak, and AutoAssign, which function together with the data processing and peak-picking programs NMRPipe and Sparky, to provide an integrated software system for rapid analysis of protein backbone resonance assignments. In this paper we demonstrate that these tools, together with high-sensitivity triple resonance NMR cryoprobes for data collection and a Linux-based computer cluster architecture, can be combined to provide nearly complete backbone resonance assignments and secondary structures (based on chemical shift data) for a 59-residue protein in less than 30 hours of data collection and processing time. In this optimum case of a small protein providing excellent spectra, extensive backbone resonance assignments could also be obtained using less than 6 hours of data collection and processing time. These results demonstrate the feasibility of high throughput triple resonance NMR for determining resonance assignments and secondary structures of small proteins, and the potential for applying NMR in large scale structural proteomics projects.

  14. 30 CFR 551.11 - Submission, inspection, and selection of geological data and information collected under a permit...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become available for in-house... geochemical) data and information describing each operation of analysis, processing, and interpretation; (2...

  15. 30 CFR 551.11 - Submission, inspection, and selection of geological data and information collected under a permit...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become available for in-house... geochemical) data and information describing each operation of analysis, processing, and interpretation; (2...

  16. 30 CFR 551.11 - Submission, inspection, and selection of geological data and information collected under a permit...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become available for in-house... geochemical) data and information describing each operation of analysis, processing, and interpretation; (2...

  17. When high achievers and low achievers work in the same group: the roles of group heterogeneity and processes in project-based learning.

    PubMed

    Cheng, Rebecca Wing-yi; Lam, Shui-fong; Chan, Joanne Chung-yan

    2008-06-01

    There has been an ongoing debate about the inconsistent effects of heterogeneous ability grouping on students in small group work such as project-based learning. The present research investigated the roles of group heterogeneity and processes in project-based learning. At the student level, we examined the interaction effect between students' within-group achievement and group processes on their self- and collective efficacy. At the group level, we examined how group heterogeneity was associated with the average self- and collective efficacy reported by the groups. The participants were 1,921 Hong Kong secondary students in 367 project-based learning groups. Student achievement was determined by school examination marks. Group processes, self-efficacy and collective efficacy were measured by a student-report questionnaire. Hierarchical linear modelling was used to analyse the nested data. When individual students in each group were taken as the unit of analysis, results indicated an interaction effect of group processes and students' within-group achievement on the discrepancy between collective- and self-efficacy. When compared with low achievers, high achievers reported lower collective efficacy than self-efficacy when group processes were of low quality. However, both low and high achievers reported higher collective efficacy than self-efficacy when group processes were of high quality. With 367 groups taken as the unit of analysis, the results showed that group heterogeneity, group gender composition and group size were not related to the discrepancy between collective- and self-efficacy reported by the students. Group heterogeneity was not a determinant factor in students' learning efficacy. Instead, the quality of group processes played a pivotal role because both high and low achievers were able to benefit when group processes were of high quality.

  18. Working Up a Good Sweat – The Challenges of Standardising Sweat Collection for Metabolomics Analysis

    PubMed Central

    Hussain, Joy N; Mantri, Nitin; Cohen, Marc M

    2017-01-01

    Introduction: Human sweat is a complex biofluid of interest to diverse scientific fields. Metabolomics analysis of sweat promises to improve screening, diagnosis and self-monitoring of numerous conditions through new applications and greater personalisation of medical interventions. Before these applications can be fully developed, existing methods for the collection, handling, processing and storage of human sweat need to be revised. This review presents a cross-disciplinary overview of the origins, composition, physical characteristics and functional roles of human sweat, and explores the factors involved in standardising sweat collection for metabolomics analysis. Methods: A literature review of human sweat analysis over the past 10 years (2006–2016) was performed to identify studies with metabolomics or similarly applicable ‘omics’ analysis. These studies were reviewed with attention to sweat induction and sampling techniques, timing of sweat collection, sweat storage conditions, laboratory derivation, processing and analytical platforms. Results: Comparative analysis of 20 studies revealed numerous factors that can significantly impact the validity, reliability and reproducibility of sweat analysis including: anatomical site of sweat sampling, skin integrity and preparation; temperature and humidity at the sweat collection sites; timing and nature of sweat collection; metabolic quenching; transport and storage; qualitative and quantitative measurements of the skin microbiota at sweat collection sites; and individual variables such as diet, emotional state, metabolic conditions, pharmaceutical, recreational drug and supplement use. Conclusion: Further development of standard operating protocols for human sweat collection can open the way for sweat metabolomics to significantly add to our understanding of human physiology in health and disease. PMID:28798503

  19. Vocational Education Operations Analysis Process.

    ERIC Educational Resources Information Center

    California State Dept. of Education, Sacramento. Vocational Education Services.

    This manual on the vocational education operations analysis process is designed to provide vocational administrators/coordinators with an internal device to collect, analyze, and display vocational education performance data. The first section describes the system and includes the following: analysis worksheet, data sources, utilization, system…

  20. Design of A Cyclone Separator Using Approximation Method

    NASA Astrophysics Data System (ADS)

    Sin, Bong-Su; Choi, Ji-Won; Lee, Kwon-Hee

    2017-12-01

    A separator is a device installed in industrial applications to separate mixed objects. The separator of interest in this research is a cyclone type, which is used to separate a steam-brine mixture in a geothermal plant. The most important performance measure of the cyclone separator is the collection efficiency, which in this study is predicted by performing CFD (Computational Fluid Dynamics) analysis. This research defines six shape design variables to maximize the collection efficiency, which is therefore set up as the objective function in the optimization process. Since the CFD analysis requires substantial calculation time, it is impractical to obtain the optimal solution by coupling it directly with a gradient-based optimization algorithm. Thus, two approximation methods are introduced to obtain an optimum design. In this process, an L18 orthogonal array is adopted as the DOE method, and the kriging interpolation method is adopted to generate the metamodel for the collection efficiency. Based on the 18 analysis results, the relative importance of each variable to the collection efficiency is obtained through ANOVA (analysis of variance). The final design is suggested considering the results obtained from the two optimization methods. The fluid flow analysis of the cyclone separator is conducted using the commercial CFD software ANSYS-CFX.
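
    A minimal sketch of the metamodel step, assuming scikit-learn's Gaussian process regressor as the kriging surrogate: fit the surrogate to a small set of design-of-experiments runs, then search it cheaply instead of rerunning CFD. The response function and all numbers are synthetic stand-ins.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Fit a kriging-style surrogate of collection efficiency over the shape
# variables from a small design of experiments (18 runs x 6 variables,
# mirroring an L18 array), then search the surrogate instead of the CFD.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(18, 6))          # normalized design variables
y = 0.9 - ((X - 0.6) ** 2).sum(axis=1) * 0.2     # stand-in for CFD efficiency

gp = GaussianProcessRegressor(ConstantKernel() * RBF(length_scale=0.5),
                              normalize_y=True).fit(X, y)

cand = rng.uniform(0.0, 1.0, size=(5000, 6))     # random search on surrogate
pred = gp.predict(cand)
best = cand[np.argmax(pred)]
print("predicted best efficiency:", pred.max().round(4), "at", best.round(3))
```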

  1. Improving data collection processes for routine evaluation of treatment cost-effectiveness.

    PubMed

    Monto, Sari; Penttilä, Riku; Kärri, Timo; Puolakka, Kari; Valpas, Antti; Talonpoika, Anna-Maria

    2016-04-01

    The healthcare system in Finland has begun routine collection of health-related quality of life (HRQoL) information for patients in hospitals to support more systematic cost-effectiveness analysis (CEA). This article describes the systematic collection of HRQoL survey data, and addresses challenges in the implementation of patient surveys and acquisition of cost data in the case hospital. Challenges include problems with incomplete data and undefined management processes. In order to support CEA of hospital treatments, improvements are sought from the process management literature and in the observation of healthcare professionals. The article has been written from an information system and process management perspective, concluding that process ownership, automation of data collection and better staff training are keys to generating more reliable data.

  2. Empirical analysis and modeling of manual turnpike tollbooths in China

    NASA Astrophysics Data System (ADS)

    Zhang, Hao

    2017-03-01

    To address the low level of service satisfaction at tollbooths on many turnpikes in China, we conduct an empirical study and use a queueing model to investigate performance measures. In this paper, we collect archived data from six tollbooths of a turnpike in China and conduct an empirical analysis of the vehicles' time-dependent arrival process and the collectors' time-dependent service times. It shows that the vehicle arrival process follows a non-homogeneous Poisson process while the collector service time follows a log-normal distribution. Further, we model the toll-collection process at tollbooths as a MAP/PH/1/FCFS queue for mathematical tractability and present some numerical examples.
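
    A simplified simulation of these findings, replacing the paper's MAP/PH/1/FCFS model with its empirical ingredients: non-homogeneous Poisson arrivals generated by thinning and log-normal service times, pushed through the Lindley recursion for a single FCFS server. The rate function and all parameters are assumptions.

```python
import numpy as np

# Single-booth FCFS sketch: non-homogeneous Poisson arrivals (thinning)
# and log-normal service times, as the empirical analysis found.
rng = np.random.default_rng(42)

def rate(t):                       # assumed arrival rate, veh/min (synthetic)
    return 4.0 + 3.0 * np.sin(2 * np.pi * t / 1440.0)  # daily cycle

def nhpp_arrivals(T, lam, lam_max):
    """Thinning: candidate events at rate lam_max, accepted w.p. lam(t)/lam_max."""
    t, out = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > T:
            return np.array(out)
        if rng.uniform() < lam(t) / lam_max:
            out.append(t)

arrivals = nhpp_arrivals(1440.0, rate, 7.0)              # one day, in minutes
service = rng.lognormal(mean=np.log(0.12), sigma=0.4, size=arrivals.size)

# Lindley recursion for FCFS waiting times at a single booth.
wait = np.zeros(arrivals.size)
for i in range(1, arrivals.size):
    wait[i] = max(0.0, wait[i-1] + service[i-1] - (arrivals[i] - arrivals[i-1]))
print(f"mean wait = {wait.mean():.3f} min over {arrivals.size} vehicles")
```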

  3. A Job Announcement Analysis of Educational Technology Professional Positions: Knowledge, Skills, and Abilities

    ERIC Educational Resources Information Center

    Kang, YoungJu; Ritzhaupt, Albert D.

    2015-01-01

    The purpose of this research was to identify the competencies of an educational technologist via a job announcement analysis. Four hundred job announcements were collected from a variety of online job databases over a 5-month period. Following a systematic process of collection, documentation, and analysis, we derived over 150 knowledge, skill,…

  4. A State Space Modeling Approach to Mediation Analysis

    ERIC Educational Resources Information Center

    Gu, Fei; Preacher, Kristopher J.; Ferrer, Emilio

    2014-01-01

    Mediation is a causal process that evolves over time. Thus, a study of mediation requires data collected throughout the process. However, most applications of mediation analysis use cross-sectional rather than longitudinal data. Another implicit assumption commonly made in longitudinal designs for mediation analysis is that the same mediation…

  5. Preanalytical errors in medical laboratories: a review of the available methodologies of data collection and analysis.

    PubMed

    West, Jamie; Atherton, Jennifer; Costelloe, Seán J; Pourmahram, Ghazaleh; Stretton, Adam; Cornes, Michael

    2017-01-01

    Preanalytical errors have previously been shown to contribute a significant proportion of errors in laboratory processes and contribute to a number of patient safety risks. Accreditation against ISO 15189:2012 requires that laboratory Quality Management Systems consider the impact of preanalytical processes in areas such as the identification and control of non-conformances, continual improvement, internal audit and quality indicators. Previous studies have shown that there is a wide variation in the definition, repertoire and collection methods for preanalytical quality indicators. The International Federation of Clinical Chemistry Working Group on Laboratory Errors and Patient Safety has defined a number of quality indicators for the preanalytical stage, and the adoption of harmonized definitions will support interlaboratory comparisons and continual improvement. There are a variety of data collection methods, including audit, manual recording processes, incident reporting mechanisms and laboratory information systems. Quality management processes such as benchmarking, statistical process control, Pareto analysis and failure mode and effect analysis can be used to review data and should be incorporated into clinical governance mechanisms. In this paper, The Association for Clinical Biochemistry and Laboratory Medicine PreAnalytical Specialist Interest Group review the various data collection methods available. Our recommendation is the use of the laboratory information management systems as a recording mechanism for preanalytical errors as this provides the easiest and most standardized mechanism of data capture.

  6. STS payload data collection and accommodations analysis study. Volume 2: Payload data collection

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A format developed for Space Transportation System payload data collection and a process for collecting the data are described along with payload volumes and a data deck to be used as input for the Marshall Interactive Planning System. Summary matrices of the data generated are included.

  7. Understanding the Perception of Very Small Software Companies towards the Adoption of Process Standards

    NASA Astrophysics Data System (ADS)

    Basri, Shuib; O'Connor, Rory V.

    This paper is concerned with understanding the issues that affect the adoption of software process standards by Very Small Entities (VSEs), their needs from process standards, and their willingness to engage with the new ISO/IEC 29110 standard in particular. To achieve this goal, a series of industry data collection studies was undertaken with a collection of VSEs, using a twin-track approach of qualitative data collection (interviews and focus groups) and quantitative data collection (questionnaire). Data analysis was completed separately, and the final results were merged using the coding mechanisms of grounded theory. This paper serves as a roadmap both for researchers wishing to understand the issues of process standards adoption by very small companies and for the software process standards community.

  8. 78 FR 77478 - Agency Information Collection Activities: Proposed Collection; Comment Request, Federal Emergency...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-23

    ..., Federal Emergency Management Agency Individual Assistance Customer Satisfaction Surveys AGENCY: Federal... concerning the collection of Individual Assistance customer satisfaction survey responses and information for..., Customer Satisfaction Analysis Section of the National Processing Service Center Division, Recovery...

  9. Guidelines for collecting and processing samples of stream bed sediment for analysis of trace elements and organic contaminants for the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Shelton, Larry R.; Capel, Paul D.

    1994-01-01

    A major component of the U.S. Geological Survey's National Water-Quality Assessment program is to assess the occurrence and distribution of trace elements and organic contaminants in streams. The first phase of the strategy for the assessment is to analyze samples of bed sediments from depositional zones. Fine-grained particles deposited in these zones are natural accumulators of trace elements and hydrophobic organic compounds. For the information to be comparable among studies in many different parts of the Nation, strategies for selecting stream sites and depositional zones are critical. Fine-grained surficial sediments are obtained from several depositional zones within a stream reach and composited to yield a sample representing average conditions. Sample collection and processing must be done consistently and by procedures specifically designed to separate the fine material into fractions that yield uncontaminated samples for trace-level analytes in the laboratory. Special coring samplers and other instruments made of Teflon are used for collection. Samples are processed through a 2.0-millimeter stainless-steel mesh sieve for organic contaminant analysis and a 63-micrometer nylon-cloth sieve for trace-element analysis. Quality assurance is maintained by strict collection and processing procedures, duplicate samplings, and a rigid cleaning procedure.

  10. The Use of a Checklist and Qualitative Notebooks for an Interactive Process of Teaching and Learning Qualitative Research

    ERIC Educational Resources Information Center

    Frels, Rebecca K.; Sharma, Bipin; Onwuegbuzie, Anthony J.; Leech, Nancy L.; Stark, Marcella D.

    2011-01-01

    From the perspective of doctoral students and instructors, we explain a developmental, interactive process based upon the Checklist for Qualitative Data Collection, Data Analysis, and Data Interpretation (Onwuegbuzie, 2010) for students' writing assignments regarding: (a) the application of conceptual knowledge for collecting, analyzing, and…

  11. Utilizing the Theoretical Framework of Collective Identity to Understand Processes in Youth Programs

    ERIC Educational Resources Information Center

    Futch, Valerie A.

    2016-01-01

    This article explores collective identity as a useful theoretical framework for understanding social and developmental processes that occur in youth programs. Through narrative analysis of past participant interviews (n = 21) from an after-school theater program, known as "The SOURCE", it was found that participants very clearly describe…

  12. Automated process for solvent separation of organic/inorganic substance

    DOEpatents

    Schweighardt, F.K.

    1986-07-29

    There is described an automated process for the solvent separation of organic/inorganic substances that operates continuously and unattended and eliminates potential errors resulting from subjectivity and the aging of the sample during analysis. In the process, metered amounts of one or more solvents are passed sequentially through a filter containing the sample under the direction of a microprocessor control apparatus. The mixture in the filter is agitated by ultrasonic cavitation for a timed period and the filtrate is collected. The filtrate of each solvent extraction is collected individually and the residue on the filter element is collected to complete the extraction process. 4 figs.

  13. Automated process for solvent separation of organic/inorganic substance

    DOEpatents

    Schweighardt, Frank K.

    1986-01-01

    There is described an automated process for the solvent separation of organic/inorganic substances that operates continuously and unattended and eliminates potential errors resulting from subjectivity and the aging of the sample during analysis. In the process, metered amounts of one or more solvents are passed sequentially through a filter containing the sample under the direction of a microprocessor control apparatus. The mixture in the filter is agitated by ultrasonic cavitation for a timed period and the filtrate is collected. The filtrate of each solvent extraction is collected individually and the residue on the filter element is collected to complete the extraction process.

  14. Micro-based fact collection tool user's manual

    NASA Technical Reports Server (NTRS)

    Mayer, Richard

    1988-01-01

    A procedure designed for use by an analyst to assist in the collection and organization of data gathered during the interview processes associated with system analysis and modeling tasks is described. The basic concept behind the development of this tool is that during the interview process an analyst is presented with assertions of facts by the domain expert. The analyst also makes observations of the domain. These facts need to be collected and preserved in such a way as to allow them to serve as the basis for a number of decision making processes throughout the system development process. This tool can be thought of as a computerization of the analyst's notebook.

  15. The Geomorphic Road Analysis and Inventory Package (GRAIP) Volume 1: Data Collection Method

    Treesearch

    Thomas A. Black; Richard M. Cissel; Charles H. Luce

    2012-01-01

    An important first step in managing forest roads for improved water quality and aquatic habitat is the performance of an inventory. The Geomorphic Roads Analysis and Inventory Package (GRAIP) was developed as a tool for making a comprehensive inventory and analysis of the effects of forest roads on watersheds. This manual describes the data collection and process of a...

  16. WISE: Automated support for software project management and measurement. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Sudhakar

    1995-01-01

    One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status, and automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication between users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.

  17. Unnecessary roughness? Testing the hypothesis that predators destined for molecular gut-content analysis must be hand-collected to avoid cross-contamination

    USDA-ARS?s Scientific Manuscript database

    Molecular gut-content analysis enables direct detection of arthropod predation with minimal disruption of on-going ecosystem processes. Mass-collection methods, such as sweep-netting, vacuum sampling, and foliage beating, could lead to regurgitation or even rupturing of predators along with uneaten ...

  18. Efficient Research Design: Using Value-of-Information Analysis to Estimate the Optimal Mix of Top-down and Bottom-up Costing Approaches in an Economic Evaluation alongside a Clinical Trial.

    PubMed

    Wilson, Edward C F; Mugford, Miranda; Barton, Garry; Shepstone, Lee

    2016-04-01

    In designing economic evaluations alongside clinical trials, analysts are frequently faced with alternative methods of collecting the same data, the extremes being top-down ("gross costing") and bottom-up ("micro-costing") approaches. A priori, bottom-up approaches may be considered superior to top-down approaches but are also more expensive to collect and analyze. In this article, we use value-of-information analysis to estimate the efficient mix of observations on each method in a proposed clinical trial. By assigning a prior bivariate distribution to the 2 data collection processes, the predicted posterior (i.e., preposterior) mean and variance of the superior process can be calculated from proposed samples using either process. This is then used to calculate the preposterior mean and variance of incremental net benefit and hence the expected net gain of sampling. We apply this method to a previously collected data set to estimate the value of conducting a further trial and to identify the optimal mix of observations on drug costs at 2 levels: by individual item (process A) and by drug class (process B). We find that substituting a number of observations on process A for process B leads to a modest £35,000 increase in expected net gain of sampling. Drivers of the results are the correlation between the 2 processes and their relative cost. This method has potential use following a pilot study to inform efficient data collection approaches for a subsequent full-scale trial. It provides a formal quantitative approach to inform trialists whether it is efficient to collect resource use data on all patients in a trial or on a subset of patients only or to collect limited data on most and detailed data on a subset. © The Author(s) 2016.
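
    The logic of expected net gain of sampling can be sketched with a toy normal-normal Monte Carlo: draw the incremental net benefit from a prior, predict the sample mean a study of size n would produce, and value the resulting adopt/reject decision net of sampling cost. All figures are invented, and the paper's bivariate two-process structure is collapsed to a single process for brevity.

```python
import numpy as np

# Toy preposterior sketch (normal-normal conjugate): how much would n more
# observations on incremental net benefit (INB) be worth, net of sampling
# cost? All numbers are assumptions, not the paper's values.
rng = np.random.default_rng(7)
mu0, sd0 = 500.0, 2000.0        # prior mean / sd of INB (GBP)
sigma = 8000.0                  # per-patient sampling sd of INB
cost_per_obs = 40.0

def engs(n, draws=200_000):
    theta = rng.normal(mu0, sd0, draws)                 # prior draws
    xbar = rng.normal(theta, sigma / np.sqrt(n))        # predicted sample mean
    w = sd0**2 / (sd0**2 + sigma**2 / n)                # posterior weight on data
    post_mean = w * xbar + (1 - w) * mu0
    value_now = max(mu0, 0.0)                           # adopt iff E[INB] > 0
    value_after = np.maximum(post_mean, 0.0).mean()     # preposterior value
    return (value_after - value_now) - cost_per_obs * n # EVSI minus sampling cost

for n in (10, 50, 200, 1000):
    print(f"n = {n:4d}: expected net gain of sampling = {engs(n):.1f}")
```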

  19. A dynamic analysis of the radiation excitation from the activation of a current collecting system in space

    NASA Technical Reports Server (NTRS)

    Wang, J.; Hastings, D. E.

    1991-01-01

    Current collecting systems moving in the ionosphere will induce electromagnetic wave radiation. The commonly used static analysis is incapable of studying the situation when such systems undergo transient processes. A dynamic analysis has been developed, and the radiation excitation processes are studied. This dynamic analysis is applied to study the temporal wave radiation from the activation of current collecting systems in space. The global scale electrodynamic interactions between a space-station-like structure and the ionospheric plasma are studied. The temporal evolution and spatial propagation of the electric wave field after the activation are described. The wave excitations by tethered systems are also studied. The dependencies of the temporal Alfven wave and lower hybrid wave radiation on the activation time and the space system structure are discussed. It is shown that the characteristics of wave radiation are determined by the matching of two sets of characteristic frequencies, and a rapid change in the current collection can give rise to substantial transient radiation interference. The limitations of the static and linear analysis are examined, and the condition under which the static assumption is valid is obtained.

  20. Focused analyte spray emission apparatus and process for mass spectrometric analysis

    DOEpatents

    Roach, Patrick J [Kennewick, WA; Laskin, Julia [Richland, WA; Laskin, Alexander [Richland, WA

    2012-01-17

    An apparatus and process are disclosed that deliver an analyte deposited on a substrate to a mass spectrometer that provides for trace analysis of complex organic analytes. Analytes are probed using a small droplet of solvent that is formed at the junction between two capillaries. A supply capillary maintains the droplet of solvent on the substrate; a collection capillary collects analyte desorbed from the surface and emits analyte ions as a focused spray to the inlet of a mass spectrometer for analysis. The invention enables efficient separation of desorption and ionization events, providing enhanced control over transport and ionization of the analyte.

  1. An Analysis and Allocation System for Library Collections Budgets: The Comprehensive Allocation Process (CAP)

    ERIC Educational Resources Information Center

    Lyons, Lucy Eleonore; Blosser, John

    2012-01-01

    The "Comprehensive Allocation Process" (CAP) is a reproducible decision-making structure for the allocation of new collections funds, for the reallocation of funds within stagnant budgets, and for budget cuts in the face of reduced funding levels. This system was designed to overcome common shortcomings of current methods. Its philosophical…

  2. A Model of Small Group Facilitator Competencies

    ERIC Educational Resources Information Center

    Kolb, Judith A.; Jin, Sungmi; Song, Ji Hoon

    2008-01-01

    This study used small group theory, quantitative and qualitative data collected from experienced practicing facilitators at three points of time, and a building block process of collection, analysis, further collection, and consolidation to develop a model of small group facilitator competencies. The proposed model has five components:…

  3. The development of participatory health research among incarcerated women in a Canadian prison

    PubMed Central

    Murphy, K.; Hanson, D.; Hemingway, C.; Ramsden, V.; Buxton, J.; Granger-Brown, A.; Condello, L-L.; Buchanan, M.; Espinoza-Magana, N.; Edworthy, G.; Hislop, T. G.

    2009-01-01

    This paper describes the development of a unique prison participatory research project, in which incarcerated women formed a research team, the research activities and the lessons learned. The participatory action research project was conducted in the main short sentence minimum/medium security women's prison located in a Western Canadian province. An ethnographic multi-method approach was used for data collection and analysis. Quantitative data was collected by surveys and analysed using descriptive statistics. Qualitative data was collected from orientation package entries, audio recordings, and written archives of research team discussions, forums and debriefings, and presentations. These data and ethnographic observations were transcribed and analysed using iterative and interpretative qualitative methods and NVivo 7 software. Up to 15 women worked each day as prison research team members; a total of 190 women participated at some time in the project between November 2005 and August 2007. Incarcerated women peer researchers developed the research processes including opportunities for them to develop leadership and technical skills. Through these processes, including data collection and analysis, nine health goals emerged. Lessons learned from the research processes were confirmed by the common themes that emerged from thematic analysis of the research activity data. Incarceration provides a unique opportunity for engagement of women as expert partners alongside academic researchers and primary care workers in participatory research processes to improve their health. PMID:25759141

  4. Interim results of quality-control sampling of surface water for the Upper Colorado River National Water-Quality Assessment Study Unit, water years 1995-96

    USGS Publications Warehouse

    Spahr, N.E.; Boulger, R.W.

    1997-01-01

    Quality-control samples provide part of the information needed to estimate the bias and variability that result from sample collection, processing, and analysis. Quality-control samples of surface water collected for the Upper Colorado River National Water-Quality Assessment study unit for water years 1995–96 are presented and analyzed in this report. The types of quality-control samples collected include pre-processing split replicates, concurrent replicates, sequential replicates, post-processing split replicates, and field blanks. Analysis of the pre-processing split replicates, concurrent replicates, sequential replicates, and post-processing split replicates is based on differences between analytical results of the environmental samples and analytical results of the quality-control samples. Results of these comparisons indicate that variability introduced by sample collection, processing, and handling is low and will not affect interpretation of the environmental data. The differences for most water-quality constituents are on the order of plus or minus 1 or 2 lowest rounding units. A lowest rounding unit is equivalent to the magnitude of the least significant figure reported for analytical results. The use of lowest rounding units avoids some of the difficulty in comparing differences between pairs of samples when concentrations span orders of magnitude and provides a measure of the practical significance of the effect of variability. Analysis of field-blank quality-control samples indicates that with the exception of chloride and silica, no systematic contamination of samples is apparent. Chloride contamination probably was the result of incomplete rinsing of the dilute cleaning solution from the outlet ports of the decaport sample splitter. Silica contamination seems to have been introduced by the blank water. Sampling and processing procedures for water year 1997 have been modified as a result of these analyses.

  5. Collective Digital Storytelling: An Activity-Theoretical Analysis of Second Language Learning and Teaching

    ERIC Educational Resources Information Center

    Kalyaniwala-Thapliyal, Carmenne

    2016-01-01

    This paper describes the collective activity of a group of four students who created a digital story as a teaching resource that was to be used for teaching English as a foreign language. It uncovers and analyzes the actual processes underlining the activity as it unfolds from one stage to another. Four processes, viz., sociocognitive…

  6. Determinants of job stress in chemical process industry: A factor analysis approach.

    PubMed

    Menon, Balagopal G; Praveensal, C J; Madhu, G

    2015-01-01

    Job stress is one of the active research domains in industrial safety research. Job stress can result in accidents and health-related issues for workers in chemical process industries. Hence, it is important to measure the level of job stress in workers so as to mitigate it and avoid safety-related problems in the industries. The objective of this study is to determine the job stress factors in the chemical process industry in Kerala state, India. This study also aims to propose a comprehensive model and an instrument framework for measuring job stress levels in the chemical process industries in Kerala, India. The data were collected through a questionnaire survey conducted in chemical process industries in Kerala. Data from 1,197 completed surveys were subjected to principal component and confirmatory factor analysis to develop the job stress factor structure. The factor analysis revealed 8 factors that influence job stress in process industries. It was also found that job stress in employees is most influenced by role ambiguity and least by work environment. The study developed an instrument framework for measuring job stress utilizing exploratory factor analysis and structural equation modeling.
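
    A hedged sketch of the exploratory step, using scikit-learn's FactorAnalysis with varimax rotation on synthetic Likert-style responses standing in for the actual survey data:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Extract latent job-stress factors from survey items. Synthetic data
# stand in for the study's 1,197 questionnaires; 8 factors mirror the
# factor structure reported above.
rng = np.random.default_rng(3)
n, items = 1197, 24
latent = rng.normal(size=(n, 8))                  # 8 underlying factors
loadings = rng.normal(scale=0.8, size=(8, items))
X = latent @ loadings + rng.normal(scale=0.5, size=(n, items))

fa = FactorAnalysis(n_components=8, rotation="varimax")
scores = fa.fit_transform(StandardScaler().fit_transform(X))
print("loadings matrix shape:", fa.components_.shape)  # (8 factors, 24 items)
```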

  7. AN ANALYSIS OF THE BEHAVIORAL PROCESSES INVOLVED IN SELF-INSTRUCTION WITH TEACHING MACHINES.

    ERIC Educational Resources Information Center

    HOLLAND, JAMES G.; SKINNER, B.F.

    THIS COLLECTION OF PAPERS CONSTITUTES THE FINAL REPORT OF A PROJECT DEVOTED TO AN ANALYSIS OF THE BEHAVIORAL PROCESSES UNDERLYING PROGRAMED INSTRUCTION. THE PAPERS ARE GROUPED UNDER THREE HEADINGS--(1) "PROGRAMING RESEARCH," (2) "BASIC SKILLS--RATIONALE AND PROCEDURE," AND (3) "BASIC SKILLS--SPECIFIC SKILLS." THE…

  8. Airborne Hyperspectral Imaging of Seagrass and Coral Reef

    NASA Astrophysics Data System (ADS)

    Merrill, J.; Pan, Z.; Mewes, T.; Herwitz, S.

    2013-12-01

    This talk presents the process of project preparation, airborne data collection, data pre-processing and comparative analysis for a series of airborne hyperspectral projects focused on the mapping of seagrass and coral reef communities in the Florida Keys. As part of a series of large collaborative projects funded by the NASA ROSES program and the Florida Fish and Wildlife Conservation Commission and administered by the NASA UAV Collaborative, a series of airborne hyperspectral datasets were collected over six sites in the Florida Keys in May 2012, October 2012 and May 2013 by Galileo Group, Inc. using a manned Cessna 172 and NASA's SIERRA Unmanned Aerial Vehicle. Precise solar and tidal data were used to calculate airborne collection parameters and develop flight plans designed to optimize data quality. Two independent Visible and Near-Infrared (VNIR) hyperspectral imaging systems covering 400-1000 nm were used to collect imagery over six Areas of Interest (AOIs). Multiple collections were performed over all sites within strict solar windows in the mornings and afternoons. Independently developed pre-processing algorithms were employed to radiometrically correct, synchronize and georectify individual flight lines, which were then combined into color-balanced mosaics for each Area of Interest. The use of two different hyperspectral sensors, as well as environmental variations between collections, allows for the comparative analysis of data quality and the iterative refinement of flight planning and collection parameters.

  9. Conducting On-orbit Gene Expression Analysis on ISS: WetLab-2

    NASA Technical Reports Server (NTRS)

    Parra, Macarena; Almeida, Eduardo; Boone, Travis; Jung, Jimmy; Lera, Matthew P.; Ricco, Antonio; Souza, Kenneth; Wu, Diana; Richey, C. Scott

    2013-01-01

    WetLab-2 will enable expanded genomic research on orbit by developing tools that support in situ sample collection, processing, and analysis on ISS. This capability will reduce the time-to-results for investigators and define new pathways for discovery on the ISS National Lab. The primary objective is to develop a research platform on ISS that will facilitate real-time quantitative gene expression analysis of biological samples collected on orbit. WetLab-2 will be capable of processing multiple sample types ranging from microbial cultures to animal tissues dissected on orbit. WetLab-2 will significantly expand the analytical capabilities onboard ISS and enhance science return from ISS.

  10. High throughput DNA damage quantification of human tissue with home-based collection device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costes, Sylvain V.; Tang, Jonathan; Yannone, Steven M.

    Kits, methods and systems for providing a service to provide a subject with information regarding the state of a subject's DNA damage. Collection, processing and analysis of samples are also described.

  11. Collecting, archiving and processing DNA from wildlife samples using FTA® databasing paper

    PubMed Central

    Smith, LM; Burgoyne, LA

    2004-01-01

    Background: Methods involving the analysis of nucleic acids have become widespread in the fields of traditional biology and ecology, however the storage and transport of samples collected in the field to the laboratory in such a manner to allow purification of intact nucleic acids can prove problematical. Results: FTA® databasing paper is widely used in human forensic analysis for the storage of biological samples and for purification of nucleic acids. The possible uses of FTA® databasing paper in the purification of DNA from samples of wildlife origin were examined, with particular reference to problems expected due to the nature of samples of wildlife origin. The processing of blood and tissue samples, the possibility of excess DNA in blood samples due to nucleated erythrocytes, and the analysis of degraded samples were all examined, as was the question of long term storage of blood samples on FTA® paper. Examples of the end use of the purified DNA are given for all protocols and the rationale behind the processing procedures is also explained to allow the end user to adjust the protocols as required. Conclusions: FTA® paper is eminently suitable for collection of, and purification of nucleic acids from, biological samples from a wide range of wildlife species. This technology makes the collection and storage of such samples much simpler. PMID:15072582

  12. Field guide for collecting and processing stream-water samples for the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Shelton, Larry R.

    1994-01-01

    The U.S. Geological Survey's National Water-Quality Assessment program includes extensive data-collection efforts to assess the quality of the Nation's streams. These studies require analyses of stream samples for major ions, nutrients, sediments, and organic contaminants. For the information to be comparable among studies in different parts of the Nation, consistent procedures specifically designed to produce uncontaminated samples for trace analysis in the laboratory are critical. This field guide describes the standard procedures for collecting and processing samples for major ions, nutrients, organic contaminants, sediment, and field analyses of conductivity, pH, alkalinity, and dissolved oxygen. Samples are collected and processed using modified and newly designed equipment made of Teflon to avoid contamination, including nonmetallic samplers (D-77 and DH-81) and a Teflon sample splitter. Field solid-phase extraction procedures developed to process samples for organic constituent analyses produce an extracted sample with stabilized compounds for more accurate results. Improvements to standard operational procedures include the use of processing chambers and capsule filtering systems. A modified collecting and processing procedure for organic carbon is designed to avoid contamination from equipment cleaned with methanol. Quality assurance is maintained by strict collecting and processing procedures, replicate sampling, equipment blank samples, and a rigid cleaning procedure using detergent, hydrochloric acid, and methanol.

  13. How do I provide leukapheresis products? Blood center experience and evidence for process improvement.

    PubMed

    Ginzburg, Yelena; Kessler, Debra; Narici, Manlio; Caltabiano, Melinda; Rebosa, Mark; Strauss, Donna; Shaz, Beth

    2013-10-01

    The past few decades have seen a resurgence of interest in leukapheresis products to improve the survival of infected patients with neutropenia. These products have a short shelf life and require donor stimulation with dexamethasone before collection. Additionally, a system with good communications and logistical support is essential. A recent survey of blood centers in North America revealed that the majority of centers collecting leukapheresis products use steroid-stimulated donors. The survey results suggested that an analysis of the process and potential process improvement would be of interest to the transfusion medicine community. Data from 2008 to 2011 regarding donor selection, donor dexamethasone stimulation, leukapheresis collection, and correlations between potentially pertinent variables for process improvement were analyzed. Results from an analysis of cost are also included. We evaluate 432 leukapheresis donations and demonstrate correlations between 1) pre- and poststimulation white blood cell (WBC) count (p<0.0001), 2) interval (donor stimulation to collection) and poststimulation WBC count (p<0.0001), and 3) poststimulation WBC count and leukapheresis product granulocyte yield (p<0.0001). Significant improvement in granulocyte quality and yield can be accomplished in dexamethasone-stimulated donors, by selecting eligible donors with relatively high normal prestimulation WBC counts and/or previously good responses to dexamethasone, increasing the duration between dexamethasone stimulation and granulocyte collection, and maintaining optimal hematocrit (5%-10%) in granulocyte collections. Because the majority of surveyed blood centers collecting stimulated granulocytes use steroids alone, modifications presented here may prove useful. Further assessment of correlation between granulocyte yield and clinical outcome will await results of additional studies. © 2012 American Association of Blood Banks.

  14. Large-Scale Variability of Inpatient Tacrolimus Therapeutic Drug Monitoring at an Academic Transplant Center: a Retrospective Study.

    PubMed

    Strohbehn, Garth W; Pan, Warren W; Petrilli, Christopher M; Heidemann, Lauren; Larson, Sophia; Aaronson, Keith D; Johnson, Matt; Ellies, Tammy; Heung, Michael

    2018-04-30

    Inpatient tacrolimus therapeutic drug monitoring (TDM) lacks standardized guidelines. In this study, the authors analyzed variability in the pre-analytical phase of the inpatient tacrolimus TDM process at their institution. Patients receiving tacrolimus (twice-daily formulation) and tacrolimus laboratory analysis were included in the study. Times of tacrolimus administration and laboratory study collection were extracted, and time distribution plots for each step in the inpatient TDM process were generated. Trough levels were drawn appropriately in 25.9% of the cases. Timing between doses was consistent, with 91.9% of the following dose administrations occurring 12 ± 2 hours after the previous dose. Only 38.1% of the drug administrations occurred within one hour of laboratory study collection. Tacrolimus-related patient safety events were reported at a rate of 1.9 events per month, while incorrect timing of TDM sample collection occurred approximately 200 times per month. Root cause analysis identified a TDM process marked by a lack of communication and coordination between drug administration and TDM sample collection. Extrapolating these findings nationwide, the authors estimate $22 million in laboratory costs wasted annually. Based on this large single-center study, the authors concluded that the inpatient TDM process is prone to timing errors, is financially wasteful, and, at its worst, is harmful to patients, because clinical decisions may be made on the basis of unreliable data. Further work is needed on systems solutions to better align the laboratory study collection and drug administration processes.
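
    A minimal sketch of the timing audit described above, assuming an extract of paired dose and lab-draw timestamps in a pandas DataFrame; the column names, timestamps, and the one-hour criterion are illustrative assumptions, not the study's definitions.

        import pandas as pd

        # Hypothetical extract of medication administration and lab-draw times
        df = pd.DataFrame({
            "dose_time": pd.to_datetime(["2018-01-01 08:05", "2018-01-01 20:10"]),
            "draw_time": pd.to_datetime(["2018-01-01 07:30", "2018-01-01 17:00"]),
        })

        # Hours by which the draw preceded the next dose
        lead_h = (df["dose_time"] - df["draw_time"]).dt.total_seconds() / 3600.0

        # Count a trough as appropriately timed if drawn within 1 hour of the dose
        appropriate = (lead_h >= 0) & (lead_h <= 1.0)
        print(f"{appropriate.mean():.1%} of troughs drawn within 1 hour of the dose")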

  15. Clinical and technical considerations in the analysis of gingival crevicular fluid.

    PubMed

    Wassall, Rebecca R; Preshaw, Philip M

    2016-02-01

    Despite the technical challenges involved when collecting, processing and analyzing gingival crevicular fluid samples, research using gingival crevicular fluid has, and will continue to play, a fundamental role in expanding our understanding of periodontal pathogenesis and healing outcomes following treatment. A review of the literature, however, clearly demonstrates that there is considerable variation in the methods used for collection, processing and analysis of gingival crevicular fluid samples by different research groups around the world. Inconsistent or inadequate reporting impairs interpretation of results, prevents accurate comparison of data between studies and potentially limits the conclusions that can be made from a larger body of evidence. The precise methods used for collection and analysis of gingival crevicular fluid (including calibration studies required before definitive clinical studies) should be reported in detail, either in the methods section of published papers or as an online supplementary file, so that other researchers may reproduce the methodology. Only with clear and transparent reporting will the full impact of future gingival crevicular fluid research be realized. This paper discusses the complexities of gingival crevicular fluid collection and analysis and provides guidance to researchers working in this field. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Developing a Framework of Facilitator Competencies: Lessons from the Field

    ERIC Educational Resources Information Center

    Kolb, Judith A.; Jin, Sungmi; Song, Ji Hoon

    2008-01-01

    People in organizations are increasingly called upon to serve as small group facilitators or to assist in this role. This article uses data collected from practicing facilitators at three points in time and a building-block process of collection, analysis, further collection, and consolidation to develop and refine a list of competencies. A…

  17. Surface Water Quality-Assurance Plan for the North Florida Program Office of the U.S. Geological Survey

    USGS Publications Warehouse

    Franklin, Marvin A.

    2000-01-01

    The U.S. Geological Survey, Water Resources Division, has a policy that requires each District office to prepare a Surface Water Quality-Assurance Plan. The plan for each District describes the policies and procedures that ensure high quality in the collection, processing, analysis, computer storage, and publication of surface-water data. The North Florida Program Office Surface Water Quality-Assurance Plan documents the standards, policies, and procedures used by the North Florida Program Office for activities related to the collection, processing, storage, analysis, and publication of surface-water data.

  18. Design principles for data- and change-oriented organisational analysis in workplace health promotion.

    PubMed

    Inauen, A; Jenny, G J; Bauer, G F

    2012-06-01

    This article focuses on organizational analysis in workplace health promotion (WHP) projects. It shows how this analysis can be designed such that it provides rational data relevant to the further context-specific and goal-oriented planning of WHP and equally supports individual and organizational change processes implied by WHP. Design principles for organizational analysis were developed on the basis of a narrative review of the guiding principles of WHP interventions and organizational change as well as the scientific principles of data collection. Further, the practical experience of WHP consultants who routinely conduct organizational analysis was considered. This resulted in a framework with data-oriented and change-oriented design principles, addressing the following elements of organizational analysis in WHP: planning the overall procedure, data content, data-collection methods and information processing. Overall, the data-oriented design principles aim to produce valid, reliable and representative data, whereas the change-oriented design principles aim to promote motivation, coherence and a capacity for self-analysis. We expect that the simultaneous consideration of data- and change-oriented design principles for organizational analysis will strongly support the WHP process. We finally illustrate the applicability of the design principles to health promotion within a WHP case study.

  19. 3-D Imaging of Mars’ Polar Ice Caps Using Orbital Radar Data

    PubMed Central

    Foss, Frederick J.; Putzig, Nathaniel E.; Campbell, Bruce A.; Phillips, Roger J.

    2018-01-01

    Since its arrival in early 2006, various instruments aboard NASA’s Mars Reconnaissance Orbiter (MRO) have been collecting a variety of scientific and engineering data from orbit around Mars. Among these is the SHAllow RADar (SHARAD) instrument, supplied by Agenzia Spaziale Italiana (ASI) and designed for subsurface sounding in the 15–25 MHz frequency band. As of this writing, MRO has completed over 46,000 nearly polar orbits of Mars, 30% of which have included active SHARAD data collection. By 2009, a sufficient density of SHARAD coverage had been obtained over the polar regions to support 3-D processing and analysis of the data. Using tools and techniques commonly employed in terrestrial seismic data processing, we have processed subsets of the resulting collection of SHARAD observations covering the north and south polar regions as SHARAD 3-D volumes, imaging the interiors of the north and south polar ice caps known, respectively, as Planum Boreum and Planum Australe. After overcoming a series of challenges revealed during the 3-D processing and analysis, a completed Planum Boreum 3-D volume is currently being used for scientific research. Lessons learned in the northern work fed forward into our 3-D processing and analysis of the Planum Australe 3-D volume, currently under way. We discuss our experiences with these projects and present results and scientific insights stemming from these efforts. PMID:29400351
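
    The 3-D volumes are assembled from thousands of individual 2-D orbital profiles. As a toy illustration of one conceptual step, binning traces by map position and averaging overlapping coverage, a NumPy sketch follows; it is not the SHARAD/seismic processing chain, and all shapes and values are hypothetical.

        import numpy as np

        n_traces, n_samples = 1000, 512
        rng = np.random.default_rng(0)
        x = rng.uniform(0, 100, n_traces)                 # map coordinate, km (hypothetical)
        y = rng.uniform(0, 100, n_traces)
        traces = rng.normal(size=(n_traces, n_samples))   # placeholder radargram traces

        nx = ny = 50
        ix = np.clip((x / 100 * nx).astype(int), 0, nx - 1)
        iy = np.clip((y / 100 * ny).astype(int), 0, ny - 1)

        volume = np.zeros((nx, ny, n_samples))
        counts = np.zeros((nx, ny, 1))
        np.add.at(volume, (ix, iy), traces)   # accumulate traces per map bin
        np.add.at(counts, (ix, iy), 1.0)
        volume /= np.maximum(counts, 1.0)     # average overlapping coverage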

  20. [Applying healthcare failure mode and effect analysis to improve the surgical specimen transportation process and rejection rate].

    PubMed

    Hu, Pao-Hsueh; Hu, Hsiao-Chen; Huang, Hui-Ju; Chao, Hui-Lin; Lei, Ei-Fang

    2014-04-01

    Because surgical pathology specimens are crucial to the diagnosis and treatment of disease, it is critical that they be collected and transported safely and securely. Following recent near-miss events in our department, we used healthcare failure mode and effect analysis (HFMEA) to identify 14 potential perils in the specimen collection and transportation process, and improvement and prevention strategies were developed accordingly to improve quality of care. Interventions included revising the standard operating procedures for surgical pathology specimen collection and transportation, creating educational videos and posters, revising the methods of specimen verification, and building an online, real-time management system for specimen tracking and rejection. Implementation of the new surgical specimen transportation process effectively eliminated the 14 identified potential perils, and the specimen rejection rate fell from 0.86% to 0.03%. This project improved the specimen transportation process, enhanced interdisciplinary cooperation, and supported a patient-centered healthcare system. The online information system significantly facilitates specimen tracking, hospital cost reduction, and patient safety improvement. The success in our department is currently being replicated across all departments in our hospital that transport specimens, and our experience and strategy may be applied to inter-hospital specimen transportation in the future.

  1. Reverse engineering the physical chemistry of making Egyptian faience through compositional analysis of the cementation process

    NASA Astrophysics Data System (ADS)

    Pourattar, Parisa

    The cementation process of making Egyptian faience, reported by Hans Wulff from a workshop in Qom, Iran, has not been easy to replicate, and various views have been set forth to explain the transport of materials from the glazing powder to the surfaces of the crushed quartz beads. Replications of the process fired to 950 °C and under-fired to 850 °C were characterized by electron beam microprobe analysis (EPMA), petrographic thin-section analysis, and scanning electron microscopy with energy-dispersive x-ray analysis (SEM-EDS). Chemical variations were modeled using thermal data, phase diagrams, and copper vaporization experiments. These replications were compared to 52 examples from various collections, including 20th-century ethnographic collections of beads, glazing powder, and plant ash; 12th-century CE beads and glazing powder from Fustat (Old Cairo), Egypt; an earlier New Kingdom example from Abydos, Egypt; and an ash example from the Smithsonian Institution National Museum of Natural History.

  2. Rapid Naming and Phonological Processing as Predictors of Reading and Spelling

    ERIC Educational Resources Information Center

    Christo, Catherine; Davis, Jack

    2008-01-01

    This study examined the relationships between the cognitive processes of rapid naming and phonological processing and various literacy skills. Variables measured and used in this analysis were phonological processing, rapid naming, reading comprehension, isolated and nonsense word reading, and spelling. Data were collected from 65 second-to-fifth…

  3. Sampling procedure for lake or stream surface water chemistry

    Treesearch

    Robert Musselman

    2012-01-01

    Surface waters collected in the field for chemical analyses are easily contaminated. This research note presents a step-by-step detailed description of how to avoid sample contamination when field collecting, processing, and transporting surface water samples for laboratory analysis.

  4. 75 FR 40847 - Agency Information Collection Activities: Proposed Collection; Comment Request, 1660-0036...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-14

    ... Emergency Management Agency Individual Assistance Customer Satisfaction Surveys AGENCY: Federal Emergency..., timeliness and satisfaction with initial, continuing and final delivery of disaster-related assistance. DATES..., Customer Satisfaction Analysis Section, Texas National Processing Service Center, Recovery Directorate...

  5. Continental shelf processes affecting the oceanography of the South Atlantic Bight: Progress report, June 1, 1987 to May 31, 1988. [FLEX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atkinson, L.P.

    This study of continental shelf processes affecting the oceanography of the South Atlantic Bight (SAB) is part of the interdisciplinary DOE-sponsored South Atlantic Bight Program. Our part of the program involves hydrographic and nutrient characteristics of the region. Current research efforts in the SAB Program are being focused on the inner shelf region where effects of bottom friction, local wind forcing, river and estuarine discharge, and tides, which are all small scale processes, are important. Our major accomplishment during the past year was the completion of the FLEX (Fall Experiment) field study. Since most of our data collection is computerized, preliminary hydrographic data analysis was done on board ship during the cruise and preliminary results are available. These results will be presented in this report. We are just beginning our standard data processing and data analysis procedures. We continued the processing and analysis of SPREX data collected during April 1985. Work has also continued on the older GABEX I and II data sets. 8 refs., 19 figs., 2 tabs.

  6. Determination of Nutrient Intakes by a Modified Visual Estimation Method and Computerized Nutritional Analysis for Dietary Assessments

    DTIC Science & Technology

    1987-09-01

    The modified visual estimation method provides a useful average for population studies, does not delay data processing, and is relatively inexpensive. Using MVEM and observing recipe preparation procedures improve the… The report contains an extensive review of the procedures and problems in the design, collection, analysis, processing, and interpretation of dietary survey data for individuals.

  7. Joint Intelligence Operations Center (JIOC) Baseline Business Process Model & Capabilities Evaluation Methodology

    DTIC Science & Technology

    2012-03-01

    [Extraction residue of an acronym list and process-model diagram. Recoverable acronym expansions: OPLAN (Operations Plan), OPORD (Operations Order), OPSIT (Operational Situation), OSINT (Open Source Intelligence); also Targeting Review Board. The diagram depicted multi-intelligence collection management: evaluating FLTREPs and MISREPs, assigning assets and feeding back asset shortfalls, and managing theater HUMINT, OSINT, law-enforcement, embassy, platform, and agency information while sorting collection requests.]

  8. An Analysis of Secondary Teachers' Reasoning with Participatory Sensing Data

    ERIC Educational Resources Information Center

    Gould, Robert; Bargagliotti, Anna; Johnson, Terri

    2017-01-01

    Participatory sensing is a data collection method in which communities of people collect and share data to investigate large-scale processes. These data have many features often associated with the big data paradigm: they are rich and multivariate, include non-numeric data, and are collected as determined by an algorithm rather than by traditional…

  9. Identification and Mitigation of Generated Solid By-Products during Advanced Electrode Materials Processing

    DOE PAGES

    Tsai, Candace S. J.; Dysart, Arthur D.; Beltz, Jay H.; ...

    2015-12-30

    A scalable, solid-state elevated-temperature process was developed to produce high-capacity carbonaceous electrode materials for energy storage devices via decomposition of a starch-based precursor in an inert atmosphere. The fabricated carbon-based architectures are useful as an excellent electrode material for lithium-ion, sodium-ion, and lithium-sulfur batteries. This article focuses on the study and analysis of the nanometer-sized byproducts formed during the lab-scale production of carbonaceous electrode materials in the process design phase. The complete material production process was studied operation by operation, namely during heating, holding the reaction at elevated temperature, and cooling. The unknown downstream particles in the process exhaust were collected and characterized via aerosol and liquid suspensions, and they were quantified using direct-reading instruments for number and mass concentrations. The airborne emissions were collected on polycarbonate filters and TEM grids using the Tsai diffusion sampler (TDS) for characterization and further analysis. Released byproduct aerosols collected in a deionized (DI) water trap were analyzed using a Nanosight real-time nanoparticle characterization system, and the aerosols emitted post water suspension were collected and characterized. Individual particles in the nanometer size range were found in exhaust aerosols; however, crystal-structured aggregates formed on the sampling substrate after long-term sampling of the emitted exhaust. After characterizing the released aerosol byproducts, methods were also identified to mitigate possible human and environmental exposures upon the industrial implementation of such a process.

  10. Old River Control Complex Sedimentation Investigation

    DTIC Science & Technology

    2015-06-01

    The investigation, conducted for the District, New Orleans, combined field data collection and laboratory analysis, geomorphic assessments, and … efforts to describe the shoaling processes and sediment transport in the two-river system. The geomorphic assessment utilized…

  11. Research of flaw image collecting and processing technology based on multi-baseline stereo imaging

    NASA Astrophysics Data System (ADS)

    Yao, Yong; Zhao, Jiguang; Pang, Xiaoyan

    2008-03-01

    Aiming at practical demands in gun bore flaw image collection, such as accurate optical design, complex algorithms, and precise technique, this paper presents the design framework of a 3-D image collecting and processing system based on multi-baseline stereo imaging. The system mainly comprises a computer, an electrical control box, a stepping motor, and a CCD camera, and it performs image collection, stereo matching, 3-D information reconstruction, and post-processing. Theoretical analysis and experimental results show that images collected by this system are precise and that the multi-baseline approach efficiently resolves the matching ambiguity produced by uniform or repeated surface textures. At the same time, the system offers faster measurement speed and higher measurement precision.
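
    For readers unfamiliar with the stereo matching step named above, a minimal two-view block-matching sketch using OpenCV follows; a multi-baseline system would fuse several such pairwise disparity maps. The synthetic image pair is hypothetical, and this is not the authors' implementation.

        import numpy as np
        import cv2

        # Synthetic stand-ins for a rectified stereo pair (hypothetical data)
        rng = np.random.default_rng(0)
        left = (rng.random((240, 320)) * 255).astype(np.uint8)
        right = np.roll(left, -4, axis=1)   # uniform 4-pixel disparity

        # Block matching; disparity is returned as fixed-point (x16)
        matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
        disparity = matcher.compute(left, right).astype(np.float32) / 16.0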

  12. Assessment of environmental impacts part one. Intervention analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hipel, Keith William; Lettenmaier, Dennis P.; McLeod, A. Ian

    The use of intervention analysis as a statistical method of gauging the effects of environmental changes is discussed. The Box-Jenkins model serves as the basis for the intervention analysis methodology. Environmental studies of the Aswan Dam, the South Saskatchewan River, and a forest fire near the Pipers Hole River, Canada, are included as case studies in which intervention analysis was employed. Methods of data collection for intervention analysis are found to have a significant impact on model reliability; effective data collection processes for the Box-Jenkins model are provided. (15 graphs, 27 references, 2 tables)
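
    In the Box-Jenkins framing, an intervention is typically modeled as an exogenous step regressor in an ARIMA model. A hedged sketch with statsmodels follows; the series, change date, and model order are illustrative assumptions, not those of the case studies.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        # Hypothetical monthly flow series with a known change point
        flow = pd.Series(np.random.default_rng(1).normal(100, 10, 120),
                         index=pd.date_range("1950-01", periods=120, freq="MS"))
        step = (flow.index >= "1955-01-01").astype(float)   # intervention indicator

        model = SARIMAX(flow, exog=step, order=(1, 0, 0))
        result = model.fit(disp=False)
        print(result.params)   # the exog coefficient estimates the level shift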

  13. A Data-Processing System for Quantitative Analysis in Speech Production. CLCS Occasional Paper No. 17.

    ERIC Educational Resources Information Center

    Chasaide, Ailbhe Ni; Davis, Eugene

    The data processing system used at Trinity College's Centre for Language and Communication Studies (Ireland) enables computer-automated collection and analysis of phonetic data and has many advantages for research on speech production. The system allows accurate handling of large quantities of data, eliminates many of the limitations of manual…

  14. Quantitative Analysis of Plutonium Content in Particles Collected from a Certified Reference Material by Total Nuclear Reaction Energy (Q Value) Spectroscopy

    NASA Astrophysics Data System (ADS)

    Croce, M. P.; Hoover, A. S.; Rabin, M. W.; Bond, E. M.; Wolfsberg, L. E.; Schmidt, D. R.; Ullom, J. N.

    2016-08-01

    Microcalorimeters with embedded radioisotopes are an emerging category of sensor with advantages over existing methods for isotopic analysis of trace-level nuclear materials. For each nuclear decay, the energy of all decay products captured by the absorber (alpha particles, gamma rays, X-rays, electrons, daughter nuclei, etc.) is measured in one pulse. For alpha-decaying isotopes, this gives a measurement of the total nuclear reaction energy (Q value) and the spectra consist of well-separated, narrow peaks. We have demonstrated a simple mechanical alloying process to create an absorber structure consisting of a gold matrix with small inclusions of a radioactive sample. This absorber structure provides an optimized energy thermalization environment, resulting in high-resolution spectra with minimal tailing. We have applied this process to the analysis of particles collected from the surface of a plutonium metal certified reference material (CRM-126A from New Brunswick Laboratory) and demonstrated isotopic analysis by microcalorimeter Q value spectroscopy. Energy resolution from the Gaussian component of a Bortels function fit was 1.3 keV FWHM at 5244 keV. The collected particles were integrated directly into the detector absorber without any chemical processing. The ^{238}Pu/^{239}Pu and ^{240}Pu/^{239}Pu mass ratios were measured and the results confirmed against the certificate of analysis for the reference material. We also demonstrated inter-element analysis capability by measuring the ^{241}Am/^{239}Pu mass ratio.
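
    Converting fitted peak areas to a mass ratio involves weighting each isotope's activity by its half-life and atomic mass. The sketch below illustrates that arithmetic under the assumption of equal counting efficiency and live time; the peak areas are hypothetical, while the half-lives are standard literature values.

        # Mass ratio from Q-value peak areas: m ~ A * T_half * M (same efficiency)
        counts_238, counts_239 = 5.0e4, 2.0e4    # hypothetical fitted peak areas
        t_half_238, t_half_239 = 87.7, 24110.0   # half-lives, years
        m_238, m_239 = 238.0, 239.0              # atomic masses, u

        mass_ratio = (counts_238 / counts_239) * (t_half_238 * m_238) / (t_half_239 * m_239)
        print(f"238Pu/239Pu mass ratio ~ {mass_ratio:.4f}")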

  15. Valuing the Accreditation Process

    ERIC Educational Resources Information Center

    Bahr, Maria

    2018-01-01

    The value of the National Association for Developmental Education (NADE) accreditation process is far-reaching. Not only do students and programs benefit from the process, but also the entire institution. Through data collection of student performance, analysis, and resulting action plans, faculty and administrators can work cohesively towards…

  16. Velocity Mapping Toolbox (VMT): a processing and visualization suite for moving-vessel ADCP measurements

    USGS Publications Warehouse

    Parsons, D.R.; Jackson, P.R.; Czuba, J.A.; Engel, F.L.; Rhoads, B.L.; Oberg, K.A.; Best, J.L.; Mueller, D.S.; Johnson, K.K.; Riley, J.D.

    2013-01-01

    The use of acoustic Doppler current profilers (ADCP) for discharge measurements and three-dimensional flow mapping has increased rapidly in recent years and has been primarily driven by advances in acoustic technology and signal processing. Recent research has developed a variety of methods for processing data obtained from a range of ADCP deployments and this paper builds on this progress by describing new software for processing and visualizing ADCP data collected along transects in rivers or other bodies of water. The new utility, the Velocity Mapping Toolbox (VMT), allows rapid processing (vector rotation, projection, averaging and smoothing), visualization (planform and cross-section vector and contouring), and analysis of a range of ADCP-derived datasets. The paper documents the data processing routines in the toolbox and presents a set of diverse examples that demonstrate its capabilities. The toolbox is applicable to the analysis of ADCP data collected in a wide range of aquatic environments and is made available as open-source code along with this publication.
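
    As a toy version of the "vector rotation" step mentioned above, the sketch below rotates depth-averaged east/north velocities into streamwise/transverse components aligned with the mean flow; it illustrates the concept only and is not VMT code. Velocity values are hypothetical.

        import numpy as np

        east = np.array([0.8, 0.9, 1.1])    # m/s, hypothetical ensemble means
        north = np.array([0.3, 0.2, 0.4])

        theta = np.arctan2(north.mean(), east.mean())    # mean flow direction
        streamwise = east * np.cos(theta) + north * np.sin(theta)
        transverse = -east * np.sin(theta) + north * np.cos(theta)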

  17. Using Python Packages in 6D (Py)Ferret: EOF Analysis, OPeNDAP Sequence Data

    NASA Astrophysics Data System (ADS)

    Smith, K. M.; Manke, A.; Hankin, S. C.

    2012-12-01

    PyFerret was designed to provide the easy methods of access, analysis, and display of data found in Ferret, under the simple yet powerful Python scripting/programming language. This has enabled PyFerret to take advantage of a large and expanding collection of third-party scientific Python modules. Furthermore, ensemble and forecast axes have been added to Ferret and PyFerret for creating and working with collections of related data in Ferret's delayed-evaluation and minimal-data-access mode of operation. These axes simplify processing and visualization of such collections of related data. As one example, an empirical orthogonal function (EOF) analysis Python module was developed, taking advantage of the linear algebra module and other standard functionality in NumPy for efficient numerical array processing. This EOF analysis module is used in a Ferret function to provide an ensemble of the levels of data explained by each EOF and Time Amplitude Function (TAF) product. Another example makes use of the PyDAP Python module to provide OPeNDAP sequence data for use in Ferret, with the minimal data access characteristic of Ferret.
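
    The core EOF decomposition can be sketched with a singular value decomposition of the de-meaned time-by-space data matrix, the kind of NumPy linear algebra the module described above builds on; the data array below is hypothetical, and this is not the PyFerret module itself.

        import numpy as np

        data = np.random.default_rng(2).normal(size=(240, 500))   # time x space
        anomaly = data - data.mean(axis=0)                        # remove time mean

        u, s, vt = np.linalg.svd(anomaly, full_matrices=False)
        eofs = vt                        # spatial patterns (EOFs)
        tafs = u * s                     # time amplitude functions (TAFs)
        explained = s**2 / np.sum(s**2)  # fraction of variance per EOF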

  18. An innovative and shared methodology for event reconstruction using images in forensic science.

    PubMed

    Milliet, Quentin; Jendly, Manon; Delémont, Olivier

    2015-09-01

    This study presents an innovative methodology for forensic science image analysis for event reconstruction. The methodology is based on experiences from real cases. It provides real added value to technical guidelines such as standard operating procedures (SOPs) and enriches the community of practices at stake in this field. This bottom-up solution outlines the many facets of analysis and the complexity of the decision-making process. Additionally, the methodology provides a backbone for articulating more detailed and technical procedures and SOPs. It emerged from a grounded theory approach; data from individual and collective interviews with eight Swiss and nine European forensic image analysis experts were collected and interpreted in a continuous, circular and reflexive manner. Throughout the process of conducting interviews and panel discussions, similarities and discrepancies were discussed in detail to provide a comprehensive picture of practices and points of view and to ultimately formalise shared know-how. Our contribution sheds light on the complexity of the choices, actions and interactions along the path of data collection and analysis, enhancing both the researchers' and participants' reflexivity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  19. Utilization of a Multi-Disciplinary Approach to Building Effective Command Centers: Process and Products

    DTIC Science & Technology

    2005-06-01

    The approach draws on cognitive task analysis, organizational information dissemination and interaction, systems engineering, collaboration and communications processes, decision-making processes, and data collection and organization. By blending these diverse disciplines, command centers can be designed to support decision-making, cognitive analysis, information technology, and the human factors engineering aspects of Command and Control (C2). This model can then be used as a baseline when dealing with work in the areas of business processes, workflow engineering, information management…

  20. Data Validation Package June 2016 Groundwater and Surface Water Sampling at the Old and New Rifle, Colorado, Processing Sites September 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Richard; Lemke, Peter

    Sampling Period: June 14–17 and July 7, 2016. Water samples were collected from 36 locations at the New Rifle and Old Rifle, Colorado, Disposal/Processing Sites. Planned monitoring locations are shown in Attachment 1, Sampling and Analysis Work Order. Duplicate samples were collected from New Rifle locations 0216 and 0855, and Old Rifle location 0655. One equipment blank was collected after decontamination of non-dedicated equipment used to collect one surface water sample. See Attachment 2, Trip Report, for additional details. Sampling and analyses were conducted as specified in the Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites (LMS/PRO/S04351, continually updated, http://energy.gov/lm/downloads/sampling-and-analysis-plan-us-department-energy-office-legacy-management-sites). New Rifle Site: Samples were collected at the New Rifle site from 16 monitoring wells and 7 surface locations in compliance with the December 2008 Groundwater Compliance Action Plan [GCAP] for the New Rifle, Colorado, Processing Site (LMS/RFN/S01920). Monitoring well 0216 could not be sampled in June because it was surrounded by standing water due to the high river stage from spring runoff; it was later sampled in July. Monitoring well 0635 and surface location 0322 could not be sampled because access through the elk fence along Interstate 70 has not been completed at this time. Old Rifle Site: Samples were collected at the Old Rifle site from eight monitoring wells and five surface locations in compliance with the December 2001 Ground Water Compliance Action Plan for the Old Rifle, Colorado, UMTRA Project Site (GJ0-2000-177-TAR).

  1. Reusable Rocket Engine Operability Modeling and Analysis

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Komar, D. R.

    1998-01-01

    This paper describes the methodology, model, input data, and analysis results of a reusable launch vehicle engine operability study conducted with the goal of supporting design from an operations perspective. Paralleling performance analyses in schedule and method, this requires the use of metrics in a validated operations model useful for design, sensitivity, and trade studies. Operations analysis in this view is one of several design functions. An operations concept was developed for a given engine concept, and the predicted operations and maintenance processes were incorporated into simulation models. Historical operations data at a level of detail suitable to the model objectives were collected, analyzed, and formatted for use with the models; the simulations were run; and results were collected and presented. The input data included scheduled and unscheduled timeline and resource information collected into a Space Transportation System (STS) Space Shuttle Main Engine (SSME) historical launch operations database. Results underscore the importance not only of reliable hardware but also of improvements to operations and corrective maintenance processes.

  2. The artificial water cycle: emergy analysis of waste water treatment.

    PubMed

    Bastianoni, Simone; Fugaro, Laura; Principi, Ilaria; Rosini, Marco

    2003-04-01

    The artificial water cycle can be divided into the phases of water capture from the environment, potabilisation, distribution, waste water collection, waste water treatment and discharge back into the environment. The terminal phase of this cycle, from waste water collection to discharge into the environment, was assessed by emergy analysis. Emergy is the quantity of solar energy needed directly or indirectly to provide a product or energy flow in a given process. The emergy flow attributed to a process is therefore an index of the past and present environmental cost to support it. Six municipalities on the western side of the province of Bologna were analysed. Waste water collection is managed by the municipal councils and treatment is carried out in plants managed by a service company. Waste water collection was analysed by compiling a mass balance of the sewer system serving the six municipalities, including construction materials and sand for laying the pipelines. Emergy analysis of the water treatment plants was also carried out. The results show that the great quantity of emergy required to treat a gram of water is largely due to input of non renewable fossil fuels. As found in our previous analysis of the first part of the cycle, treatment is likewise characterised by high expenditure of non renewable resources, indicating a correlation with energy flows.
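
    Operationally, an emergy evaluation multiplies each input flow by its transformity (solar emjoules per unit of the flow) and sums the results. A schematic tally follows; the flows and transformities are placeholders for illustration, not values from the study.

        # Schematic emergy tally: emergy = sum(flow * transformity)
        inputs = {
            # name: (flow in J/yr, transformity in seJ/J) -- placeholder values
            "electricity": (4.0e12, 1.6e5),
            "fuels":       (2.5e12, 6.6e4),
            "concrete":    (9.0e11, 7.0e4),
        }

        total_emergy = sum(flow * transformity for flow, transformity in inputs.values())
        print(f"Total emergy: {total_emergy:.2e} seJ/yr")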

  3. The use of the Podotrack in forensic podiatry for collection and analysis of bare footprints using the Reel method of measurement.

    PubMed

    Burrow, J Gordon

    2016-05-01

    This small-scale study examined the role that bare footprint collection and measurement processes have on the Reel method of measurement in forensic podiatry and its use in the Criminal Justice System. Previous research indicated that the Reel method was a valid and reliable measurement system for bare footprint analysis, but various collection systems have been used to collect footprint data, and both manual and digital measurement processes are utilized in forensic podiatry and other disciplines. This study contributes to the debate about collecting bare footprints and the techniques employed to quantify the various Reel measurements, and it considered whether there was asymmetry between the feet and footprints of the same person. Within an inductive, quantitative paradigm, the Podotrack was used to collect dynamic footprints, which were then measured using Adobe Photoshop techniques to calculate the Reel linear variables. Statistical analyses using paired-sample t tests were conducted to test hypotheses and compare data sets. The standard error of the mean (SEM) showed variation between feet, and the findings provide support for the Reel study and measurement method. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
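
    The paired-sample t tests mentioned above compare left and right measurements from the same person; a minimal sketch with scipy follows, using hypothetical measurement values rather than study data.

        from scipy.stats import ttest_rel

        # Hypothetical heel-to-toe lengths (cm) for the same four subjects
        left_heel_to_toe = [24.1, 25.3, 23.8, 26.0]
        right_heel_to_toe = [24.4, 25.1, 24.2, 26.3]

        t, p = ttest_rel(left_heel_to_toe, right_heel_to_toe)
        print(f"t = {t:.2f}, p = {p:.3f}")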

  4. Applications of active microwave imagery

    NASA Technical Reports Server (NTRS)

    Weber, F. P.; Childs, L. F.; Gilbert, R.; Harlan, J. C.; Hoffer, R. M.; Miller, J. M.; Parsons, J.; Polcyn, F.; Schardt, B. B.; Smith, J. L.

    1978-01-01

    The following topics were discussed in reference to active microwave applications: (1) Use of imaging radar to improve the data collection/analysis process; (2) Data collection tasks for radar that other systems will not perform; (3) Data reduction concepts; and (4) System and vehicle parameters: aircraft and spacecraft.

  5. SUPERFUND REMOTE SENSING SUPPORT

    EPA Science Inventory

    This task provides remote sensing technical support to the Superfund program. Support includes the collection, processing, and analysis of remote sensing data to characterize hazardous waste disposal sites and their history. Image analysis reports, aerial photographs, and assoc...

  6. Ohio's Abandoned Mine Lands Reclamation Program: a Study of Data Collection and Evaluation Techniques

    NASA Technical Reports Server (NTRS)

    Sperry, S. L.

    1982-01-01

    The planning process for a statewide reclamation plan of Ohio abandoned minelands in response to the Federal Surface Mining Control and Reclamation Act of 1977 included: (1) the development of a screening and ranking methodology; (2) the establishment of a statewide review of major watersheds affected by mining; (3) the development of an immediate action process; and (4) a prototypical study of a priority watershed demonstrating the data collection, analysis, display and evaluation to be used for the remaining state watersheds. Historical methods for satisfying map information analysis and evaluation, as well as current methodologies being used were discussed. Various computer mapping and analysis programs were examined for their usability in evaluating the priority reclamation sites. Hand methods were chosen over automated procedures; intuitive evaluation was the primary reason.

  7. Instrument to collect fogwater for chemical analysis

    NASA Astrophysics Data System (ADS)

    Jacob, Daniel J.; Waldman, Jed M.; Haghi, Mehrdad; Hoffmann, Michael R.; Flagan, Richard C.

    1985-06-01

    An instrument is presented which collects large samples of ambient fogwater by impaction of droplets on a screen. The collection efficiency of the instrument is determined as a function of droplet size, and it is shown that fog droplets in the range 3-100-μm diameter are efficiently collected. No significant evaporation or condensation occurs at any stage of the collection process. Field testing indicates that samples collected are representative of the ambient fogwater. The instrument may easily be automated, and is suitable for use in routine air quality monitoring programs.
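
    Impaction efficiency on a screen strand is governed by the droplet Stokes number, which helps explain why droplets of a few micrometres sit near the capture threshold while larger droplets are collected efficiently. The back-of-envelope sketch below uses a hypothetical air speed and strand diameter, not the instrument's specifications.

        # Stokes number Stk = rho_p * d^2 * U / (18 * mu * D); impaction is
        # efficient roughly for Stk >~ 1.
        rho_p = 1000.0    # droplet density, kg/m^3
        mu = 1.8e-5       # air viscosity, Pa*s
        U = 9.0           # air speed through screen, m/s (hypothetical)
        D = 4.0e-4        # screen strand diameter, m (hypothetical)

        for d in (3e-6, 10e-6, 100e-6):   # droplet diameters, m
            stk = rho_p * d**2 * U / (18.0 * mu * D)
            print(f"d = {d*1e6:5.0f} um -> Stokes number {stk:8.2f}")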

  8. The influence of collective neutrino oscillations on a supernova r process

    NASA Astrophysics Data System (ADS)

    Duan, Huaiyu; Friedland, Alexander; McLaughlin, Gail C.; Surman, Rebecca

    2011-03-01

    Recently, it has been demonstrated that neutrinos in a supernova oscillate collectively. This process occurs much deeper than the conventional matter-induced Mikheyev-Smirnov-Wolfenstein effect and hence may have an impact on nucleosynthesis. In this paper we explore the effects of collective neutrino oscillations on the r-process, using representative late-time neutrino spectra and outflow models. We find that accurate modeling of the collective oscillations is essential for this analysis. As an illustration, the often-used 'single-angle' approximation makes grossly inaccurate predictions for the yields in our setup. With the proper multiangle treatment, the effect of the oscillations is found to be less dramatic, but still significant. Since the oscillation patterns are sensitive to the details of the emitted fluxes and the sign of the neutrino mass hierarchy, so are the r-process yields. The magnitude of the effect also depends sensitively on the astrophysical conditions—in particular on the interplay between the time when nuclei begin to exist in significant numbers and the time when the collective oscillation begins. A more definitive understanding of the astrophysical conditions, and accurate modeling of the collective oscillations for those conditions, is necessary.

  9. Investigating performance variability of processing, exploitation, and dissemination using a socio-technical systems analysis approach

    NASA Astrophysics Data System (ADS)

    Danczyk, Jennifer; Wollocko, Arthur; Farry, Michael; Voshell, Martin

    2016-05-01

    Data collection processes supporting Intelligence, Surveillance, and Reconnaissance (ISR) missions have recently undergone a technological transition accomplished by investment in sensor platforms. Various agencies have made these investments to increase the resolution, duration, and quality of data collection and to provide more relevant and recent data to warfighters. However, while sensor improvements have increased the volume of high-resolution data, they often fail to improve situational awareness and actionable intelligence for the warfighter, because the enterprise lacks efficient Processing, Exploitation, and Dissemination and filtering methods for mission-relevant information needs. The volume of collected ISR data often overwhelms manual and automated processes in modern analysis enterprises, resulting in underexploited data and insufficient or absent answers to information requests. The outcome is a significant breakdown in the analytical workflow. To cope with this data overload, many intelligence organizations have sought to re-organize their general staffing requirements and workflows to enhance team communication and coordination, with hopes of exploiting as much high-value data as possible and understanding the value of actionable intelligence well before its relevance has passed. Through this effort we have taken a scholarly approach to this problem by studying the evolution of Processing, Exploitation, and Dissemination, with a specific focus on the Army's most recent evolutions, using the Functional Resonance Analysis Method. This method investigates socio-technical processes by analyzing their intended functions and aspects to determine performance variabilities. Gaps are identified, and recommendations about force structure and future R&D priorities to increase the throughput of the intelligence enterprise are discussed.

  10. How groups cope with collective responsibility for ecological problems: Symbolic coping and collective emotions.

    PubMed

    Caillaud, Sabine; Bonnot, Virginie; Ratiu, Eugenia; Krauth-Gruber, Silvia

    2016-06-01

    This study explores the way groups cope with collective responsibility for ecological problems. The social representations approach was adopted, and the collective symbolic coping model was used as a frame of analysis, integrating collective emotions to enhance the understanding of coping processes. The original feature of this study is that the analysis is at group level. Seven focus groups were conducted with French students. An original use of focus groups was proposed: Discussions were structured to induce feelings of collective responsibility and enable observation of how groups cope with such feelings at various levels (social knowledge; social identities; group dynamics). Two analyses were conducted: Qualitative analysis of participants' use of various kinds of knowledge, social categories and the group dynamics, and lexicometric analysis to reveal how emotions varied during the different discussion phases. Results showed that groups' emotional states moved from negative to positive: They used specific social categories and resorted to shared stereotypes to cope with collective responsibility and maintain the integrity of their worldview. Only then did debate become possible again; it was anchored in the nature-culture dichotomy such that groups switched from group-based to system-based emotions. © 2015 The British Psychological Society.

  11. Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.
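
    PINDAP itself is Fortran 90, but the idea of regenerating geometry from a handful of design parameters can be illustrated briefly. The sketch below uses a hypothetical single-ramp parameterization as an illustration of parametric geometry generation; it is not the actual PINDAP code.

        import numpy as np

        def ramp_coordinates(length=1.0, angle_deg=10.0, n=50):
            """Return x, y points of a single 2-D compression ramp."""
            x = np.linspace(0.0, length, n)
            y = x * np.tan(np.radians(angle_deg))
            return x, y

        # Change a design parameter and the surface regenerates consistently
        x, y = ramp_coordinates(length=2.0, angle_deg=12.0)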

  12. Impact of Positive Emotions Enhancement on Physiological Processes and Psychological Functioning in Military Pilots

    DTIC Science & Technology

    2009-10-01

    The study was performed during 6 to 8 weeks. The experimental procedure consisted of collecting (i) psychological data (resilience, well-being, anxiety) and (ii) 12h-night urines to assess… For cardiovascular regulation, spectral analysis of heart rate variability (HRV) is usually proposed as a method to assess vagal tone [7,2,8].

  13. 77 FR 50692 - Request for Information on Quality Measurement Enabled by Health IT-Extension Date for Responses

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... what types of quality measures should a combination of natural language processing and structured data... collection, analysis, processing, and its ability to facilitate information exchange among and across care...

  14. Dynamic Information and Library Processing.

    ERIC Educational Resources Information Center

    Salton, Gerard

    This book provides an introduction to automated information services: collection, analysis, classification, storage, retrieval, transmission, and dissemination. An introductory chapter is followed by an overview of mechanized processes for acquisitions, cataloging, and circulation. Automatic indexing and abstracting methods are covered, followed…

  15. Process improvement methods increase the efficiency, accuracy, and utility of a neurocritical care research repository.

    PubMed

    O'Connor, Sydney; Ayres, Alison; Cortellini, Lynelle; Rosand, Jonathan; Rosenthal, Eric; Kimberly, W Taylor

    2012-08-01

    Reliable and efficient data repositories are essential for the advancement of research in Neurocritical care. Various factors, such as the large volume of patients treated within the neuro ICU, their differing length and complexity of hospital stay, and the substantial amount of desired information can complicate the process of data collection. We adapted the tools of process improvement to the data collection and database design of a research repository for a Neuroscience intensive care unit. By the Shewhart-Deming method, we implemented an iterative approach to improve the process of data collection for each element. After an initial design phase, we re-evaluated all data fields that were challenging or time-consuming to collect. We then applied root-cause analysis to optimize the accuracy and ease of collection, and to determine the most efficient manner of collecting the maximal amount of data. During a 6-month period, we iteratively analyzed the process of data collection for various data elements. For example, the pre-admission medications were found to contain numerous inaccuracies after comparison with a gold standard (sensitivity 71% and specificity 94%). Also, our first method of tracking patient admissions and discharges contained higher than expected errors (sensitivity 94% and specificity 93%). In addition to increasing accuracy, we focused on improving efficiency. Through repeated incremental improvements, we reduced the number of subject records that required daily monitoring from 40 to 6 per day, and decreased daily effort from 4.5 to 1.5 h/day. By applying process improvement methods to the design of a Neuroscience ICU data repository, we achieved a threefold improvement in efficiency and increased accuracy. Although individual barriers to data collection will vary from institution to institution, a focus on process improvement is critical to overcoming these barriers.
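
    The accuracy audits described above reduce to comparing a repository field against a gold-standard chart review. A minimal sketch of the sensitivity/specificity computation follows, with hypothetical labels rather than the study's records.

        # 1 = medication truly present (gold) / repository says present (extracted)
        gold =      [1, 1, 1, 0, 0, 1, 0, 0, 1, 0]
        extracted = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

        tp = sum(g == 1 and e == 1 for g, e in zip(gold, extracted))
        fn = sum(g == 1 and e == 0 for g, e in zip(gold, extracted))
        tn = sum(g == 0 and e == 0 for g, e in zip(gold, extracted))
        fp = sum(g == 0 and e == 1 for g, e in zip(gold, extracted))

        print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")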

  16. Efficient collective influence maximization in cascading processes with first-order transitions

    PubMed Central

    Pei, Sen; Teng, Xian; Shaman, Jeffrey; Morone, Flaviano; Makse, Hernán A.

    2017-01-01

    In many social and biological networks, the collective dynamics of the entire system can be shaped by a small set of influential units through a global cascading process, manifested by an abrupt first-order transition in dynamical behaviors. Despite its importance in applications, efficient identification of multiple influential spreaders in cascading processes still remains a challenging task for large-scale networks. Here we address this issue by exploring the collective influence in general threshold models of cascading process. Our analysis reveals that the importance of spreaders is fixed by the subcritical paths along which cascades propagate: the number of subcritical paths attached to each spreader determines its contribution to global cascades. The concept of subcritical path allows us to introduce a scalable algorithm for massively large-scale networks. Results in both synthetic random graphs and real networks show that the proposed method can achieve larger collective influence given the same number of seeds compared with other scalable heuristic approaches. PMID:28349988
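
    The threshold models analyzed above can be simulated directly. The sketch below runs a simple linear-threshold cascade from a heuristic (degree-based) seed set; it illustrates the class of dynamics only and does not reimplement the paper's subcritical-path algorithm.

        import networkx as nx

        def cascade(g, seeds, theta=0.5):
            """Activate nodes whose active-neighbor fraction reaches theta."""
            active = set(seeds)
            changed = True
            while changed:
                changed = False
                for node in g:
                    if node in active:
                        continue
                    frac = sum(nb in active for nb in g[node]) / max(g.degree(node), 1)
                    if frac >= theta:
                        active.add(node)
                        changed = True
            return active

        g = nx.erdos_renyi_graph(500, 0.02, seed=3)          # synthetic random graph
        seeds = sorted(g, key=g.degree, reverse=True)[:10]   # heuristic seed choice
        print(f"cascade size: {len(cascade(g, seeds))}")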

  17. Phase separation like dynamics during Myxococcus xanthus fruiting body formation

    NASA Astrophysics Data System (ADS)

    Liu, Guannan; Thutupalli, Shashi; Wigbers, Manon; Shaevitz, Joshua

    2015-03-01

    Collective motion exists in many living organisms as an advantageous strategy that helps the entire group with predation, foraging, and survival. However, the principles of self-organization underlying such collective motions remain unclear. During the various developmental stages of the soil-dwelling bacterium Myxococcus xanthus, different types of collective motion are observed. In particular, when starved, M. xanthus cells eventually aggregate to form 3-dimensional structures (fruiting bodies), inside which cells sporulate in response to the stress. We study the fruiting body formation process as an out-of-equilibrium phase separation process. As local cell density increases, the aggregation dynamics of M. xanthus cells switch from a spatio-temporally random process, resembling nucleation and growth, to an emergent pattern formation process similar to spinodal decomposition. By employing high-resolution microscopy and a video analysis system, we are able to track the motion of single cells within motile collective groups, while separately tuning local cell density, cell velocity, and reversal frequency, probing the multi-dimensional phase space of M. xanthus development.

  18. A new method for placental/cord blood processing in the collection bag. I. Analysis of factors involved in red blood cell removal.

    PubMed

    Bertolini, F; Battaglia, M; Zibera, C; Baroni, G; Soro, V; Perotti, C; Salvaneschi, L; Robustelli della Cuna, G

    1996-10-01

    We describe a new procedure for large-scale CB processing in the collection bag, thus minimizing the risk of CB contamination. A solution of 6% hydroxyethyl starch (HES) was added directly to the bag containing the CB. After RBC sedimentation at 4 degrees C, the WBC-rich supernatant was collected in a satellite bag and centrifuged. After supernatant removal, the cell pellet was resuspended, and the percent recovery of total WBC, CD34+ progenitor cells, CFU-GM, and cobblestone area-forming cells (CAFC) was evaluated. Results obtained with three different types of CB collection bags (300, 600 and 1000 ml) were analyzed and compared with those of an open system in 50 ml tubes. CB processing procedures in 300 and 1000 ml bags were associated with better WBC, CFU, CD34+ cell, and CAFC recovery (83-93%). This novel CB processing procedure appears to be easy, effective, and particularly suitable for large-scale banking under GMP conditions.

  19. Efficient collective influence maximization in cascading processes with first-order transitions

    NASA Astrophysics Data System (ADS)

    Pei, Sen; Teng, Xian; Shaman, Jeffrey; Morone, Flaviano; Makse, Hernán A.

    2017-03-01

    In many social and biological networks, the collective dynamics of the entire system can be shaped by a small set of influential units through a global cascading process, manifested by an abrupt first-order transition in dynamical behaviors. Despite its importance in applications, efficient identification of multiple influential spreaders in cascading processes still remains a challenging task for large-scale networks. Here we address this issue by exploring the collective influence in general threshold models of cascading process. Our analysis reveals that the importance of spreaders is fixed by the subcritical paths along which cascades propagate: the number of subcritical paths attached to each spreader determines its contribution to global cascades. The concept of subcritical path allows us to introduce a scalable algorithm for massively large-scale networks. Results in both synthetic random graphs and real networks show that the proposed method can achieve larger collective influence given the same number of seeds compared with other scalable heuristic approaches.

  20. A software tool to analyze clinical workflows from direct observations.

    PubMed

    Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander

    2015-01-01

    Observational data of clinical processes need to be managed in a convenient way, so that process information is reliable, valid, and viable for further analysis. However, existing tools for recording observations fail to support systematic data collection for specific workflow recordings. We present a software tool that was developed to facilitate the analysis of clinical process observations. The tool was successfully used in the OntoHealth project to build, store, and analyze observations of routine diabetes consultations.

  1. Sampling and sample processing in pesticide residue analysis.

    PubMed

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.
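
    One standard result from sampling theory, often attributed to Gy, is that the relative variance of the fundamental sampling error scales with the cube of the top particle size over the sample mass, which is why ~100 mg test portions demand careful comminution. A hedged sketch of that scaling follows; all factor values are placeholders, not recommendations from the symposium.

        # Gy's fundamental sampling error (simplified): rel_var = f*g*c*l*d^3 / M
        f, g, c, l = 0.5, 0.25, 1.0e3, 1.0   # shape, granulometric, composition, liberation
        d_cm = 0.2                            # top particle size, cm (placeholder)

        for mass_g in (1000.0, 10.0, 0.1):    # lot-scale sample vs 100 mg test portion
            rel_var = f * g * c * l * d_cm**3 / mass_g
            print(f"M = {mass_g:7.1f} g -> relative std dev {rel_var**0.5:.3f}")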

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tenney, J.L.

    SARS is a data acquisition system designed to gather and process radar data from aircraft flights. A database of flight trajectories has been developed for Albuquerque, NM, and Amarillo, TX. The data is used for safety analysis and risk assessment reports. To support this database effort, Sandia developed a collection of hardware and software tools to collect and post process the aircraft radar data. This document describes the data reduction tools which comprise the SARS, and maintenance procedures for the hardware and software system.

  3. Understanding Classrooms through Social Network Analysis: A Primer for Social Network Analysis in Education Research

    PubMed Central

    Wiggins, Benjamin L.; Goodreau, Steven M.

    2014-01-01

    Social interactions between students are a major and underexplored part of undergraduate education. Understanding how learning relationships form in undergraduate classrooms, as well as the impacts these relationships have on learning outcomes, can inform educators in unique ways and improve educational reform. Social network analysis (SNA) provides the necessary tool kit for investigating questions involving relational data. We introduce basic concepts in SNA, along with methods for data collection, data processing, and data analysis, using a previously collected example study on an undergraduate biology classroom as a tutorial. We conduct descriptive analyses of the structure of the network of costudying relationships. We explore generative processes that create observed study networks between students and also test for an association between network position and success on exams. We also cover practical issues, such as the unique aspects of human subjects review for network studies. Our aims are to convince readers that using SNA in classroom environments allows rich and informative analyses to take place and to provide some initial tools for doing so, in the process inspiring future educational studies incorporating relational data. PMID:26086650
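
    As a starting point for the kind of analysis described above, the sketch below builds a small directed "named as study partner" network with networkx, computes in-degree, and tests its association with exam scores; all edges and scores are hypothetical, and this is not the tutorial's own code.

        import networkx as nx
        from scipy.stats import spearmanr

        # Hypothetical "A named B as a study partner" edges
        g = nx.DiGraph([("ana", "ben"), ("ben", "ana"), ("cam", "ana"), ("dee", "ben")])
        in_deg = dict(g.in_degree())   # times each student was named by others

        exam = {"ana": 88, "ben": 91, "cam": 76, "dee": 83}   # hypothetical scores
        students = sorted(exam)
        rho, p = spearmanr([in_deg.get(s, 0) for s in students],
                           [exam[s] for s in students])
        print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")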

  4. Comparison of Niskin vs. in situ approaches for analysis of gene expression in deep Mediterranean Sea water samples

    NASA Astrophysics Data System (ADS)

    Edgcomb, V. P.; Taylor, C.; Pachiadaki, M. G.; Honjo, S.; Engstrom, I.; Yakimov, M.

    2016-07-01

    Obtaining an accurate picture of microbial processes occurring in situ is essential for our understanding of marine biogeochemical cycles of global importance. Water samples are typically collected at depth and returned to the sea surface for processing and downstream experiments. Metatranscriptome analysis is one powerful approach for investigating the metabolic activities of microorganisms in their habitat, and it can be informative for determining responses of microbiota to disturbances such as the Deepwater Horizon oil spill. For studies of microbial processes occurring in the deep sea, however, sample handling, pressure, and other changes during sample recovery can subject microorganisms to physiological changes that alter the expression profile of labile messenger RNA. Here we report a comparison of gene expression profiles for whole microbial communities in a bathypelagic water column sample collected in the Eastern Mediterranean Sea using Niskin bottle sample collection and a new water column sampler for studies of marine microbial ecology, the Microbial Sampler - In Situ Incubation Device (MS-SID). For some taxa, gene expression profiles from samples collected and preserved in situ were significantly different from those of samples subjected to the potentially more stressful Niskin sampling and preservation on deck. Some categories of transcribed genes also appear to be affected by sample handling more than others. This suggests that for future studies of marine microbial ecology, particularly those targeting deep sea samples, an in situ sample collection and preservation approach should be considered.

  5. The dynamics of meaningful social interactions and the emergence of collective knowledge

    PubMed Central

    Dankulov, Marija Mitrović; Melnik, Roderick; Tadić, Bosiljka

    2015-01-01

    Collective knowledge as a social value may arise in cooperation among actors whose individual expertise is limited. The process of knowledge creation requires meaningful, logically coordinated interactions, which represents a challenging problem for physics and social dynamics modeling. By combining a two-scale dynamics model with empirical data analysis from the well-known Questions & Answers system Mathematics, we show that this process occurs as a collective phenomenon in an enlarged network (of actors and their artifacts) where the cognitive recognition interactions are properly encoded. The emergent behavior is quantified by the information divergence and innovation advancing of knowledge over time and by the signatures of self-organization and knowledge-sharing communities. These measures elucidate the impact of each cognitive element and of the individual actor's expertise in the collective dynamics. The results are relevant to stochastic processes involving smart components and to collaborative social endeavors, for instance, crowdsourcing scientific knowledge production with online games. PMID:26174482
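
    One simple reading of the "information divergence ... over time" measure is a Kullback-Leibler divergence between successive activity distributions. The sketch below takes that reading under stated assumptions: the topic categories and counts are hypothetical, and the paper's exact estimator may differ.

```python
# Sketch: KL divergence between activity distributions in two consecutive
# time windows, as one plausible "information divergence over time".
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D_KL(p || q) in nats, for two count or probability vectors."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Q&A activity counts per (hypothetical) topic in two consecutive windows.
window_t0 = [40, 25, 20, 15]
window_t1 = [30, 30, 10, 30]
print(f"D_KL(t1 || t0) = {kl_divergence(window_t1, window_t0):.3f} nats")
```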

  6. The dynamics of meaningful social interactions and the emergence of collective knowledge

    NASA Astrophysics Data System (ADS)

    Dankulov, Marija Mitrović; Melnik, Roderick; Tadić, Bosiljka

    2015-07-01

    Collective knowledge as a social value may arise in cooperation among actors whose individual expertise is limited. The process of knowledge creation requires meaningful, logically coordinated interactions, which represents a challenging problem for physics and social dynamics modeling. By combining a two-scale dynamics model with empirical data analysis from the well-known Questions & Answers system Mathematics, we show that this process occurs as a collective phenomenon in an enlarged network (of actors and their artifacts) where the cognitive recognition interactions are properly encoded. The emergent behavior is quantified by the information divergence and innovation advancing of knowledge over time and by the signatures of self-organization and knowledge-sharing communities. These measures elucidate the impact of each cognitive element and of the individual actor's expertise in the collective dynamics. The results are relevant to stochastic processes involving smart components and to collaborative social endeavors, for instance, crowdsourcing scientific knowledge production with online games.

  7. The dynamics of meaningful social interactions and the emergence of collective knowledge.

    PubMed

    Dankulov, Marija Mitrović; Melnik, Roderick; Tadić, Bosiljka

    2015-07-15

    Collective knowledge as a social value may arise in cooperation among actors whose individual expertise is limited. The process of knowledge creation requires meaningful, logically coordinated interactions, which represents a challenging problem for physics and social dynamics modeling. By combining a two-scale dynamics model with empirical data analysis from the well-known Questions & Answers system Mathematics, we show that this process occurs as a collective phenomenon in an enlarged network (of actors and their artifacts) where the cognitive recognition interactions are properly encoded. The emergent behavior is quantified by the information divergence and innovation advancing of knowledge over time and by the signatures of self-organization and knowledge-sharing communities. These measures elucidate the impact of each cognitive element and of the individual actor's expertise in the collective dynamics. The results are relevant to stochastic processes involving smart components and to collaborative social endeavors, for instance, crowdsourcing scientific knowledge production with online games.

  8. MOSAIC: Software for creating mosaics from collections of images

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Gezari, D. Y.

    1992-01-01

    We have developed a powerful, versatile image processing and analysis software package called MOSAIC, designed specifically for the manipulation of digital astronomical image data obtained with (but not limited to) two-dimensional array detectors. The software package is implemented using the Interactive Data Language (IDL), and incorporates new methods for processing, calibration, analysis, and visualization of astronomical image data, stressing effective methods for the creation of mosaic images from collections of individual exposures, while at the same time preserving the photometric integrity of the original data. Since IDL is available on many computers, the MOSAIC software runs on most UNIX and VAX workstations with the X-Windows or Sun View graphics interface.

  9. 78 FR 69689 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2013-N-1427... Critical Control Point Procedures for the Safe and Sanitary Processing and Importing of Juice AGENCY: Food and Drug Administration, HHS. ACTION: Notice. SUMMARY: The Food and Drug Administration (FDA) is...

  10. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS FOREST SERVICE... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases...

  11. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS NATIONAL PARK... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases...

  12. Analysis of the decision-making process of nurse managers: a collective reflection.

    PubMed

    Eduardo, Elizabete Araujo; Peres, Aida Maris; de Almeida, Maria de Lourdes; Roglio, Karina de Dea; Bernardino, Elizabeth

    2015-01-01

    The aim was to analyze the decision-making model adopted by nurses from the perspective of several decision-making process theories. The study took a qualitative approach based on action research: semi-structured questionnaires and seminars were conducted from April to June 2012 in order to understand the nature of the decisions and the decision-making process of nine nurses in manager positions at a public hospital in southern Brazil. Data were subjected to content analysis and classified into two categories: the current situation of decision-making, which showed a lack of systematization, and construction and collective decision-making, which emphasizes the need to develop a decision-making model. The decision-making model used by the nurses is limited because it does not consider two important factors: the limits of human rationality, and the external and internal organizational environments that influence and determine right decisions.

  13. Multi-criteria decision analysis for waste management in Saharawi refugee camps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garfi, M.; Tondelli, S.; Bonoli, A.

    2009-10-15

    The aim of this paper is to compare different waste management solutions in Saharawi refugee camps (Algeria) and to test the feasibility of a decision-making method developed to be applied in particular conditions in which environmental and social aspects must be considered. It is based on multi-criteria analysis, and in particular on the analytic hierarchy process (AHP), a mathematical technique for multi-criteria decision making (Saaty, T.L., 1980. The Analytic Hierarchy Process. McGraw-Hill, New York, USA; Saaty, T.L., 1990. How to Make a Decision: The Analytic Hierarchy Process. European Journal of Operational Research; Saaty, T.L., 1994. Decision Making for Leaders: The Analytic Hierarchy Process in a Complex World. RWS Publications, Pittsburgh, PA), and on a participatory approach focusing on the local community's concerns. The research compares four different waste collection and management alternatives: waste collection by using three tipper trucks, with disposal and burning in an open area; waste collection by using seven dumpers, with disposal in a landfill; waste collection by using seven dumpers and three tipper trucks, with disposal in a landfill; and waste collection by using three tipper trucks, with disposal in a landfill. The results show that the second and third solutions provide better scenarios for waste management. Furthermore, the discussion of the results points out the multidisciplinarity of the approach and the equilibrium between social, environmental and technical impacts. This is a very important aspect in a humanitarian and environmental project, confirming the appropriateness of the chosen method.
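
    The AHP step cited above reduces pairwise judgments to criterion weights via the principal eigenvector of the comparison matrix, followed by a consistency check. Below is a minimal sketch of that standard computation; the comparison matrix is hypothetical, not the paper's actual judgments.

```python
# Minimal AHP sketch: priority weights from a (hypothetical) pairwise
# comparison matrix via the principal eigenvector, plus Saaty's
# consistency ratio.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                # normalized priority vector

n = A.shape[0]
lambda_max = eigvals[k].real
ci = (lambda_max - n) / (n - 1)         # consistency index
cr = ci / 0.58                          # random index RI = 0.58 for n = 3
print("weights:", np.round(weights, 3), "| CR:", round(cr, 3))  # CR < 0.1 is acceptable
```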

  14. China’s Energy Security and the South China Sea

    DTIC Science & Technology

    2002-05-01

    discussion focuses on review of Chinese economic and energy policies. Subsequent analysis details Chinese behavior within the broader context of... ...supported by the deductive reasoning process through the collection and analysis of relevant research in the field. For an appropriate assessment of the...

  15. Analysis of ECs and related compounds in plasma: artifactual isomerization and ex vivo enzymatic generation of 2-MGs.

    PubMed

    Pastor, Antoni; Farré, Magí; Fitó, Montserrat; Fernandez-Aranda, Fernando; de la Torre, Rafael

    2014-05-01

    The analysis of peripheral endocannabinoids (ECs) provides a good biomarker of the EC system. In clinical studies, measured concentrations strongly depend on the sample collection and processing-time conditions in clinical and laboratory settings. The analysis of 2-monoacylglycerols (MGs) (i.e., 2-arachidonoylglycerol or 2-oleoylglycerol) is a particularly challenging issue because of their ex vivo formation and the chemical isomerization that occur after blood sample collection. We provide evidence that their ex vivo formation can be minimized by adding Orlistat, a lipase inhibitor, to plasma. Taking into consideration the low cost of Orlistat, we recommend its addition to plasma collection tubes while maintaining the sample cold chain until storage. We have validated a method for the determination of the EC profile of a range of MGs and N-acylethanolamides in plasma that preserves the original isomer ratio of the MGs. Nevertheless, the chemical isomerization of 2-MGs can only be avoided by immediate processing and analysis of samples, owing to their instability during storage. We believe that this new methodology can aid in the harmonization of the measurement of ECs and related compounds in clinical samples.

  16. The dynamics of information-driven coordination phenomena: A transfer entropy analysis

    PubMed Central

    Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro

    2016-01-01

    Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social subunits. In the absence of clear exogenous driving, social collective phenomena can be represented as endogenously driven structural transitions of the information transfer network. This study provides results that can help define models and predictive algorithms for the analysis of societal events based on open source data. PMID:27051875
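
    To make the central quantity concrete: transfer entropy measures how much the recent past of one series reduces uncertainty about the next step of another. The sketch below is a plug-in estimator for discrete (already symbolized) series on toy data; the paper's symbolic transfer entropy on microblogging streams involves a more careful symbolization step, so treat this only as an illustration.

```python
# Sketch: plug-in transfer entropy TE(Y -> X) for discrete series.
# TE(Y->X) = sum p(x1, x0, y0) * log2[ p(x1 | x0, y0) / p(x1 | x0) ].
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))         # (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))          # (x_{t+1}, x_t)
    singles_x = Counter(x[:-1])                     # x_t
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]
        p_cond_marg = pairs_xx[(x1, x0)] / singles_x[x0]
        te += p_joint * log2(p_cond_full / p_cond_marg)
    return te

# Toy binary series in which y "leads" x by one step, so TE(y->x) > 0.
x = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1]
y = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0]
print(f"TE(y->x) = {transfer_entropy(x, y):.3f} bits")
```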

  17. The dynamics of information-driven coordination phenomena: A transfer entropy analysis.

    PubMed

    Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro

    2016-04-01

    Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social subunits. In the absence of clear exogenous driving, social collective phenomena can be represented as endogenously driven structural transitions of the information transfer network. This study provides results that can help define models and predictive algorithms for the analysis of societal events based on open source data.

  18. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  19. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  20. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  1. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  2. An Analysis of the Marine Corps Selection Process: Does Increased Competition Lead to Increased Quality

    DTIC Science & Technology

    2018-03-01


  3. Kuipers performs Water Sample Analysis

    NASA Image and Video Library

    2012-05-15

    ISS031-E-084619 (15 May 2012) --- After collecting samples from the Water Recovery System (WRS), European Space Agency astronaut Andre Kuipers, Expedition 31 flight engineer, processes the samples for chemical and microbial analysis in the Unity node of the International Space Station.

  4. Field guide for collecting samples for analysis of volatile organic compounds in stream water for the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Shelton, Larry R.

    1997-01-01

    For many years, stream samples for analysis of volatile organic compounds have been collected without specific guidelines or a sampler designed to avoid analyte loss. In 1996, the U.S. Geological Survey's National Water-Quality Assessment Program began aggressively monitoring urban stream-water for volatile organic compounds. To assure representative samples and consistency in collection procedures, a specific sampler was designed to collect samples for analysis of volatile organic compounds in stream water. This sampler, and the collection procedures, were tested in the laboratory and in the field for compound loss, contamination, sample reproducibility, and functional capabilities. This report describes that sampler and its use, and outlines field procedures specifically designed to provide contaminant-free, reproducible volatile organic compound data from stream-water samples. These guidelines and the equipment described represent a significant change in U.S. Geological Survey instructions for collecting and processing stream-water samples for analysis of volatile organic compounds. They are intended to produce data that are both defensible and interpretable, particularly for concentrations below the microgram-per-liter level. The guidelines also contain detailed recommendations for quality-control samples.

  5. On the value of information for Industry 4.0

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr

    2018-03-01

    Industry 4.0, or the fourth industrial revolution, blurs the boundaries between the physical and the digital. It is underpinned by vast amounts of data collected by sensors that monitor the processes and components of smart factories, which continuously communicate amongst one another and with network hubs via the internet of things. Yet the collection of those vast amounts of data, which are inherently imperfect and burdened with uncertainties and noise, entails costs including hardware and software, data storage, processing, interpretation and integration into the decision-making process, to name just a few of the main expenditures. This paper discusses a framework for rationalizing the adoption of (big) data collection for Industry 4.0. Pre-posterior Bayesian decision analysis is used to that end, and industrial process evolution with time is conceptualized as a stochastic, observable and controllable dynamical system. The chief underlying motivation is to use the collected data in such a way as to derive the most benefit from them, successfully trading off the management of risks pertinent to failure of the monitored processes and/or their components against the cost of data collection, processing and interpretation. This enables the formulation of optimization problems for data collection, e.g. for selecting the monitoring system type, topology and/or time of deployment. An illustrative example utilizing monitoring of the operation of an assembly line and optimizing the topology of a monitoring system is provided to illustrate the theoretical concepts.
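
    The core of pre-posterior analysis is to compare the expected loss of deciding on the prior alone with the expected loss after (hypothetically) observing a sensor; the difference is the expected value of the information, to be weighed against the cost of collecting it. A minimal numeric sketch, with all priors, likelihoods and losses invented for illustration:

```python
# Sketch: expected value of information (EVI) for a monitoring decision.
# All numbers below are hypothetical, chosen only to make the idea concrete.
prior_fail = 0.10                      # prior P(process is degrading)
loss_fail, cost_stop = 100.0, 20.0     # loss if we keep running a failing
                                       # process; cost of intervening

def best_expected_loss(p_fail):
    """Act optimally: keep running vs. intervene."""
    return min(p_fail * loss_fail, cost_stop)

# Without data: decide on the prior alone.
loss_prior = best_expected_loss(prior_fail)

# With a sensor: hit rate P(alarm|fail)=0.9, false alarm P(alarm|ok)=0.05.
p_alarm = 0.9 * prior_fail + 0.05 * (1 - prior_fail)
post_fail_alarm = 0.9 * prior_fail / p_alarm
post_fail_quiet = 0.1 * prior_fail / (1 - p_alarm)
loss_posterior = (p_alarm * best_expected_loss(post_fail_alarm)
                  + (1 - p_alarm) * best_expected_loss(post_fail_quiet))

print(f"EVI = {loss_prior - loss_posterior:.2f}")  # compare to monitoring cost
```

    Deploying the sensor is rational only when this expected value exceeds the cost of data collection, processing and interpretation, which is exactly the trade-off the paper formalizes.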

  6. Neural networks for data mining electronic text collections

    NASA Astrophysics Data System (ADS)

    Walker, Nicholas; Truman, Gregory

    1997-04-01

    The use of neural networks in information retrieval and text analysis has primarily suffered from the issues of adequate document representation, the ability to scale to very large collections, dynamism in the face of new information, and the practical difficulty of basing designs on supervised training sets. Perhaps the most important approach to begin solving these problems is the use of 'intermediate entities', which reduce the dimensionality of document representations and the size of document collections to manageable levels, coupled with the use of unsupervised neural network paradigms. This paper describes the issues; a fully configured neural-network-based text analysis system, dataHARVEST, aimed at data mining text collections, which begins this process; the remaining difficulties; and potential ways forward.
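
    One conventional reading of the 'intermediate entities' idea is a low-rank projection of the term-document space before an unsupervised pass. The sketch below uses scikit-learn as a stand-in (the paper's own system is neural and predates these tools; the documents are toy data):

```python
# Sketch: shrink document representations to low-dimensional "entities",
# then group them without supervision. scikit-learn stands in for the
# paper's neural components; the corpus is hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.cluster import KMeans

docs = [
    "neural networks for text analysis",
    "unsupervised neural network training",
    "mining large text collections",
    "scaling retrieval to large collections",
]
X = TfidfVectorizer().fit_transform(docs)           # high-dimensional term space
Z = TruncatedSVD(n_components=2).fit_transform(X)   # low-dimensional "entities"
labels = KMeans(n_clusters=2, n_init=10).fit_predict(Z)  # unsupervised grouping
print(labels)
```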

  7. Moving from Complaints to Action: Oppositional Consciousness and Collective Action in a Political Community

    ERIC Educational Resources Information Center

    Kwon, Soo Ah

    2008-01-01

    This article analyzes the process of youth political activism and development by drawing on ethnographic research on Asian and Pacific Islander youth activists. Young people revealed that collective action begins with a critical analysis of their lived experiences with inequalities. Their actions also involved oppositional consciousness that was…

  8. 78 FR 47701 - Agency Information Collection Activities; Proposed Collection; Comment Request; Procedures for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-06

    ... Fishery Products--21 CFR Part 123 (OMB Control Number 0910-0354)-- Extension FDA regulations in part 123 (21 CFR part 123) mandate the application of hazard analysis and critical control point (HACCP) principles to the processing of seafood. HACCP is a preventive system of hazard control designed to help...

  9. 75 FR 18211 - Agency Information Collection Activities; Proposed Collection; Comment Request; Procedures for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-09

    ... Importing of Fish and Fishery Products--21 CFR Part 123 (OMB Control Number 0910-0354)-- Extension FDA regulations in part 123 (21 CFR part 123) mandate the application of hazard analysis and critical control point (HACCP) principles to the processing of seafood. HACCP is a preventive system of hazard control...

  10. Alternatives to current flow cytometry data analysis for clinical and research studies.

    PubMed

    Gondhalekar, Carmen; Rajwa, Bartek; Patsekin, Valery; Ragheb, Kathy; Sturgis, Jennifer; Robinson, J Paul

    2018-02-01

    Flow cytometry has well-established methods for data analysis based on traditional data collection techniques. These techniques typically involved manual insertion of tube samples into an instrument that, historically, could only measure 1-3 colors. The field has since evolved to incorporate new technologies for faster and highly automated sample preparation and data collection. For example, the use of microwell plates on benchtop instruments is now standard on virtually every new instrument, so users can easily accumulate multiple data sets quickly. Further, because the user must carefully define the layout of the plate, this information is already available when considering the analytical process, expanding the opportunities for automated analysis. Advances in multi-parametric data collection, as demonstrated by the development of hyperspectral flow cytometry, 20-40 color polychromatic flow cytometry, and mass cytometry (CyTOF), are game-changing. As data and assay complexity increase, so too does the complexity of data analysis. Complex data analysis is already a challenge to traditional flow cytometry software. New methods for reviewing large and complex data sets can provide rapid insight into processes difficult to define without more advanced analytical tools. In settings such as clinical labs where rapid and accurate data analysis is a priority, rapid, efficient and intuitive software is needed. This paper outlines opportunities for analysis of complex data sets using examples of multiplexed bead-based assays, drug screens and cell cycle analysis, any of which could become integrated into the clinical environment.

  11. Collection and analysis of high-resolution elevation data for the Lincoln Lidar Project, Lincoln, Nebraska, 2004

    USGS Publications Warehouse

    Meyer, P.D.; Greenlee, Susan K.; Gesch, Dean B.; Hubl, Erik J.; Axmann, Ryan N.

    2005-01-01

    The Lincoln Lidar Project was a partnership between the U.S. Geological Survey National Center for Earth Resources Observation and Science (EROS), Lancaster County, and the city of Lincoln, Nebraska. This project demonstrated successful planning, collection, analysis, and integration of high-resolution elevation information using light detection and ranging (lidar) data. This report describes the partnership developed to collect local lidar data and transform the data into information usable at local to national levels. This report specifically describes project planning, quality assurance, processing, transforming raw lidar points into usable data layers, and visualizing and disseminating the raw and final products.

  12. Soil Gas Sample Handling: Evaluation of Water Removal and Sample Ganging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fritz, Brad G.; Abrecht, David G.; Hayes, James C.

    2016-10-31

    Soil gas sampling is currently conducted in support of Nuclear Test Ban Treaty verification. Soil gas samples are collected and analyzed for isotopes of interest. Issues that can impact the sampling and analysis of these samples include excess moisture and sample processing time. Here we discuss three potential improvements to the current sampling protocol: a desiccant for water removal, use of a molecular sieve to remove CO2 from the sample during collection, and a ganging manifold to allow composite analysis of multiple samples.

  13. The visual information system

    Treesearch

    Merlyn J. Paulson

    1979-01-01

    This paper outlines a project level process (V.I.S.) which utilizes very accurate and flexible computer algorithms in combination with contemporary site analysis and design techniques for visual evaluation, design and management. The process provides logical direction and connecting bridges through problem identification, information collection and verification, visual...

  14. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  15. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  16. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  17. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  18. Development and evaluation of a web-based software for crash data collection, processing and analysis.

    PubMed

    Montella, Alfonso; Chiaradonna, Salvatore; Criscuolo, Giorgio; De Martino, Salvatore

    2017-02-05

    The first step in the development of an effective safety management system is to create reliable crash databases, since the quality of decision making in road safety depends on the quality of the data on which decisions are based. Improving crash data is a worldwide priority, as highlighted in the Global Plan for the Decade of Action for Road Safety adopted by the United Nations, which recognizes that the overall goal of the plan will be attained by improving the quality of data collection at the national, regional and global levels. Crash databases provide the basic information for effective highway safety efforts at any level of government, but a lack of uniformity among countries and among the different jurisdictions within the same country is observed. Several existing databases show significant drawbacks which hinder their effective use for safety analysis and improvement. Furthermore, modern technologies offer great potential for significant improvements to existing methods and procedures for crash data collection, processing and analysis. To address these issues, in this paper we present the development and evaluation of a web-based, platform-independent software for crash data collection, processing and analysis. The software is designed for mobile and desktop electronic devices and enables a guided and automated drafting of the crash report, assisting police officers both on site and in the office. The software development was based on a detailed critical review of existing Australasian, EU, and U.S. crash databases and software, as well as on continuous consultation with the stakeholders. The evaluation was carried out by comparing the completeness, timeliness, and accuracy of crash data before and after the use of the software in the city of Vico Equense, in the south of Italy, showing significant advantages. The amount of collected information increased from 82 variables to 268 variables, i.e., a 227% increase. The time saving was more than one hour per crash, i.e., a 36% reduction. The on-site data collection did not produce a time saving; however, this is a temporary weakness that should disappear once officers become more acquainted with the software. The phase of evaluation, processing and analysis carried out in the office was dramatically shortened, i.e., a 69% reduction. Another benefit was the standardization, which allowed fast and consistent data analysis and evaluation. Even if all these benefits are remarkable, the most valuable benefit of the new procedure was the reduction of police officers' mistakes during the manual operations of survey and data evaluation. Because of these benefits, the satisfaction questionnaires administered to the police officers after the testing phase showed very good acceptance of the procedure.

  19. OzFlux data: network integration from collection to curation

    NASA Astrophysics Data System (ADS)

    Isaac, Peter; Cleverly, James; McHugh, Ian; van Gorsel, Eva; Ewenz, Cacilia; Beringer, Jason

    2017-06-01

    Measurement of the exchange of energy and mass between the surface and the atmospheric boundary-layer by the eddy covariance technique has undergone great change in the last 2 decades. Early studies of these exchanges were confined to brief field campaigns in carefully controlled conditions followed by months of data analysis. Current practice is to run tower-based eddy covariance systems continuously over several years due to the need for continuous monitoring as part of a global effort to develop local-, regional-, continental- and global-scale budgets of carbon, water and energy. Efficient methods of processing the increased quantities of data are needed to maximise the time available for analysis and interpretation. Standardised methods are needed to remove differences in data processing as possible contributors to observed spatial variability. Furthermore, public availability of these data sets assists with undertaking global research efforts. The OzFlux data path has been developed (i) to provide a standard set of quality control and post-processing tools across the network, thereby facilitating inter-site integration and spatial comparisons; (ii) to increase the time available to researchers for analysis and interpretation by reducing the time spent collecting and processing data; (iii) to propagate both data and metadata to the final product; and (iv) to facilitate the use of the OzFlux data by adopting a standard file format and making the data available from web-based portals. Discovery of the OzFlux data set is facilitated through incorporation in FLUXNET data syntheses and the publication of collection metadata via the RIF-CS format. This paper serves two purposes. The first is to describe the data sets, along with their quality control and post-processing, for the other papers of this Special Issue. The second is to provide an example of one solution to the data collection and curation challenges that are encountered by similar flux tower networks worldwide.
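
    For readers outside the flux community, the post-processing step being standardized is simple in outline: quality-control the high-frequency series, then take the covariance of vertical wind and scalar fluctuations. A toy sketch on synthetic data with an invented spike threshold (the OzFlux toolchain itself is far more extensive):

```python
# Toy sketch of one eddy covariance post-processing step: despike the
# 10 Hz series, then compute the covariance flux w'c'. Series, units,
# and the z = 6 threshold are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.3, 18000)             # vertical wind (m/s), 30 min at 10 Hz
c = 400 + 5 * w + rng.normal(0, 1, 18000)   # CO2 density, correlated with w

def despike(x, z=6.0):
    """Clip values more than z standard deviations from the mean."""
    m, s = x.mean(), x.std()
    return np.clip(x, m - z * s, m + z * s)

w, c = despike(w), despike(c)
flux = np.mean((w - w.mean()) * (c - c.mean()))   # covariance w'c'
print(f"F_c = {flux:.2f} (scalar units * m/s)")
```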

  20. Politics and Educational Reform

    ERIC Educational Resources Information Center

    Merritt, Richard L.; Coombs, Fred S.

    1977-01-01

    Provides a brief analysis of educational policymaking processes in terms of how they underscore the political basis of educational decisions. Examines four approaches to explaining collective decisions--rational decisionmaking theory, pluralist theory, systems analysis, and cybernetic theory--to explore how each might relate in quite different…

  1. Data Validation Package - April and July 2015 Groundwater and Surface Water Sampling at the Gunnison, Colorado, Processing Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linard, Joshua; Campbell, Sam

    This event included annual sampling of groundwater and surface water locations at the Gunnison, Colorado, Processing Site. Sampling and analyses were conducted as specified in the Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites. Samples were collected from 28 monitoring wells, three domestic wells, and six surface locations in April at the processing site as specified in the 2010 Ground Water Compliance Action Plan for the Gunnison, Colorado, Processing Site. Domestic wells 0476 and 0477 were sampled in July because the homes were unoccupied in April and the wells were not in use. Duplicate samples were collected from locations 0113, 0248, and 0477. One equipment blank was collected during this sampling event. Water levels were measured at all monitoring wells that were sampled. No issues were identified during the data validation process that require additional action or follow-up.

  2. Four applications of a software data collection and analysis methodology

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Selby, Richard W., Jr.

    1985-01-01

    The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.

  3. Using Language Sample Analysis to Assess Spoken Language Production in Adolescents

    ERIC Educational Resources Information Center

    Miller, Jon F.; Andriacchi, Karen; Nockerts, Ann

    2016-01-01

    Purpose: This tutorial discusses the importance of language sample analysis and how Systematic Analysis of Language Transcripts (SALT) software can be used to simplify the process and effectively assess the spoken language production of adolescents. Method: Over the past 30 years, thousands of language samples have been collected from typical…

  4. Computer image analysis in obtaining characteristics of images: greenhouse tomatoes in the process of generating learning sets of artificial neural networks

    NASA Astrophysics Data System (ADS)

    Zaborowicz, M.; Przybył, J.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.; Lewicki, A.; Przybył, K.

    2014-04-01

    The aim of the project was to develop software that extracts the characteristics of a greenhouse tomato from its image. Data gathered during image analysis and processing were used to build learning sets for artificial neural networks. The program processes pictures in JPEG format, acquires statistical information from each picture, and exports it to an external file. The software is intended to batch-analyze the collected research material, with the obtained information saved as a CSV file. The program computes 33 independent parameters to describe each tested image. The application is dedicated to the processing and image analysis of greenhouse tomatoes, but it can also be used to analyze other fruits and vegetables of a spherical shape.
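
    The batch pattern described (analyze a folder of JPEGs, export one feature row per image to CSV) can be sketched briefly; Pillow/NumPy stand in for the original software here, and the folder name, feature list, and ripeness cue are hypothetical, not the paper's 33 parameters.

```python
# Sketch of a batch feature-extraction pass in the spirit of the paper.
import csv
import glob
import numpy as np
from PIL import Image

rows = []
for path in sorted(glob.glob("tomatoes/*.jpeg")):
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    rows.append({
        "file": path,
        "mean_r": round(r.mean(), 2), "mean_g": round(g.mean(), 2),
        "mean_b": round(b.mean(), 2), "std_r": round(r.std(), 2),
        # crude ripeness cue (hypothetical feature, not from the paper)
        "red_green_ratio": round(r.mean() / max(g.mean(), 1e-9), 3),
    })

if rows:  # export the learning-set features as CSV, as the paper describes
    with open("features.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0]))
        writer.writeheader()
        writer.writerows(rows)
```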

  5. An Analysis of Metal Finishing Technology and its Status in Industrial Teacher-Education.

    ERIC Educational Resources Information Center

    Singletary, Thomas Alexander

    The purposes of this study were to analyze and describe metal finishing processes and to ascertain the extent to which these processes are taught in industrial teacher education programs. Handbooks and industrial literature were reviewed, and a survey of 165 teacher education departments was made to collect the data. Finishing processes were…

  6. The importance of hypoxia and extra physiologic oxygen shock/stress for collection and processing of stem and progenitor cells to understand true physiology/pathology of these cells ex vivo.

    PubMed

    Broxmeyer, Hal E; O'Leary, Heather A; Huang, Xinxin; Mantel, Charlie

    2015-07-01

    Hematopoietic stem cells (HSCs) and progenitor cells (HPCs) reside in a hypoxic (lowered oxygen tension) environment in vivo. We review the literature on growth of HSCs and HPCs under hypoxic and normoxic (ambient air) conditions, with a focus on our recent work demonstrating the detrimental effects of collecting and processing cells in ambient air through a phenomenon termed extra physiologic oxygen shock/stress (EPHOSS), and we describe means to counteract EPHOSS for enhanced collection of HSCs. Collection and processing of bone marrow and cord blood cells in ambient air cause rapid differentiation and loss of HSCs, with increases in HPCs. This apparently irreversible EPHOSS phenomenon results from increased mitochondrial reactive oxygen species, mediated by a p53-cyclophilin D-mitochondrial permeability transition pore axis, and involves hypoxia-inducible factor-1α and micro-RNA 210. EPHOSS can be mitigated by collecting and processing cells in lowered (3%) oxygen, or in ambient air in the presence of cyclosporine A, which affects the mitochondrial permeability transition pore, resulting in increased HSC collections. Our recent findings may be advantageous for HSC collection for hematopoietic cell transplantation, and likely for enhanced collection of other stem cell types. EPHOSS should be considered when ex vivo cell analysis is utilized for personalized medicine, as the metabolism of cells and their response to targeted drug treatment ex vivo may not mimic what occurs in vivo.

  7. Scope of ACE in Australia. Volume 1: Implications for Improved Data Collection and Reporting [and] Volume 2: Analysis of Existing Information in National Education and Training Data Collection.

    ERIC Educational Resources Information Center

    Borthwick, J.; Knight, B.; Bender, A.; Loveder, P.

    These two volumes provide information on the scope of adult and community education (ACE) in Australia and implications for improved data collection and reporting. Volume 1 begins with a glossary. Chapter 1 addresses project objectives and processes and methodology. Chapter 2 analyzes the scope and diversity of ACE in terms of what is currently…

  8. Textural Analysis and Substrate Classification in the Nearshore Region of Lake Superior Using High-Resolution Multibeam Bathymetry

    NASA Astrophysics Data System (ADS)

    Dennison, Andrew G.

    Classification of the seafloor substrate can be done with a variety of methods, including visual (dives, drop cameras), mechanical (cores, grab samples), and acoustic (statistical analysis of echosounder returns) approaches. Acoustic methods offer a more powerful and efficient means of collecting useful information about the bottom type: an acoustic survey can sample larger areas, and combining the collected data with visual and mechanical survey methods provides greater confidence in the classification of a mapped region. During a multibeam sonar survey, both bathymetric and backscatter data are collected. It is well documented that the statistical characteristics of a sonar backscatter mosaic depend on bottom type. While classifying the bottom type on the basis of backscatter alone can accurately predict and map bottom type, i.e., distinguish a muddy area from a rocky area, it lacks the ability to resolve and capture fine textural details, an important factor in many habitat mapping studies. Statistical processing of high-resolution multibeam data can capture the pertinent details about the bottom type that are rich in textural information, and further multivariate statistical processing can then isolate characteristic features and provide the basis for an accurate classification scheme. The development of a new classification method, based on the analysis of textural features in conjunction with ground-truth sampling, is described here. The processing and classification results for two geologically distinct nearshore regions of Lake Superior, off the Lester River, MN, and the Amnicon River, WI, are presented, using the Minnesota Supercomputing Institute's Mesabi computing cluster for initial processing. The processed data are then calibrated using ground-truth samples to conduct an accuracy assessment of the surveyed areas. From analysis of the high-resolution bathymetry data collected at both survey sites, it was possible to calculate a series of measures that describe textural information about the lake floor. Further processing suggests that the calculated features also capture a significant amount of statistical information about the lake-floor terrain. Two sources of error, an anomalous heave and a refraction error, significantly degraded the quality of the processed data and the resulting validation results. Ground-truth samples used to validate the classification methods at both survey sites yielded accuracy values ranging from 5 to 30 percent at the Amnicon River and between 60 and 70 percent at the Lester River. The final results suggest that this new processing methodology adequately captures textural information about the lake floor and provides an acceptable classification in the absence of significant data-quality issues.
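
    Texture-based substrate classification of this kind commonly starts from grey-level co-occurrence matrix (GLCM) statistics computed over windows of the gridded bathymetry. A minimal sketch on a synthetic quantized patch, not the thesis's actual feature set:

```python
# Sketch: GLCM texture measures of the kind used to characterize terrain.
# The 32x32 "depth patch" is synthetic; a real pipeline would quantize and
# window the gridded bathymetry first.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(1)
patch = rng.integers(0, 64, (32, 32)).astype(np.uint8)  # quantized depths

glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                    levels=64, symmetric=True, normed=True)
for prop in ("contrast", "homogeneity", "energy", "correlation"):
    print(prop, float(graycoprops(glcm, prop).mean()))
```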

  9. Assessment strategies for municipal selective waste collection schemes.

    PubMed

    Ferreira, Fátima; Avelino, Catarina; Bentes, Isabel; Matos, Cristina; Teixeira, Carlos Afonso

    2017-01-01

    An important strategy to promote strong sustainable growth relies on efficient municipal waste management, and phasing out waste landfilling through waste prevention and recycling emerges as a major target. For this purpose, effective collection schemes are required, in particular those regarding selective waste collection, pursuing more efficient and higher-quality recycling of reusable materials. This paper addresses the assessment and benchmarking of selective collection schemes, relevant to guide future operational improvements. In particular, the assessment is based on the monitoring and statistical analysis of a core set of performance indicators that highlights collection trends, complemented with a performance index computed as a weighted linear combination of these indicators. This combined analysis offers a potential tool to support decision makers involved in the process of selecting the collection scheme with the best overall performance. The presented approach was applied to a case study conducted in Oporto Municipality, with data gathered from two distinct selective collection schemes.
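
    A weighted linear combination of normalized indicators is straightforward to compute; the sketch below shows one plausible construction, where the indicator names, directions, values and weights are all hypothetical rather than the paper's core set.

```python
# Sketch: min-max normalize a core set of indicators per scheme, then
# combine them with weights into a single performance index.
indicators = {            # raw values per scheme: (scheme_A, scheme_B)
    "kg_per_capita":  (45.0, 60.0),   # higher is better
    "cost_per_tonne": (80.0, 95.0),   # lower is better
    "purity_percent": (88.0, 82.0),   # higher is better
}
weights = {"kg_per_capita": 0.4, "cost_per_tonne": 0.3, "purity_percent": 0.3}
lower_is_better = {"cost_per_tonne"}

def performance_index(scheme: int) -> float:
    total = 0.0
    for name, values in indicators.items():
        lo, hi = min(values), max(values)
        norm = (values[scheme] - lo) / (hi - lo) if hi > lo else 1.0
        if name in lower_is_better:
            norm = 1.0 - norm          # flip so that 1.0 is always "good"
        total += weights[name] * norm
    return total

print("scheme A:", performance_index(0), "| scheme B:", performance_index(1))
```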

  10. Advanced instrumentation for the collection, retrieval, and processing of urban stormwater data

    USGS Publications Warehouse

    Robinson, Jerald B.; Bales, Jerad D.; Young, Wendi S.; ,

    1995-01-01

    The U.S. Geological Survey, in cooperation with the City of Charlotte and Mecklenburg County, North Carolina, has developed a data-collection network that uses advanced instrumentation to automatically collect, retrieve, and process urban stormwater data. Precipitation measurement and water-quality networks provide data for (1) planned watershed simulation models, (2) early warning of possible flooding, (3) computation of material export, and (4) characterization of water quality in relation to basin conditions. Advantages of advanced instrumentation include remote access to real-time data, reduced demands on and more efficient use of limited human resources, and direct importation of data into a geographical information system for display and graphic analysis.

  11. Web-based data collection: detailed methods of a questionnaire and data gathering tool

    PubMed Central

    Cooper, Charles J; Cooper, Sharon P; del Junco, Deborah J; Shipp, Eva M; Whitworth, Ryan; Cooper, Sara R

    2006-01-01

    There have been dramatic advances in the development of web-based data collection instruments. This paper outlines a systematic web-based approach, implemented through locally developed code, to facilitate this process, and describes the results of using this approach after two years of data collection. We provide a detailed example of a web-based method that we developed for a study in Starr County, Texas, assessing high school students' work and health status. This web-based application includes the data instrument design, data entry and management, and the data tables needed to store the results, in a way that attempts to maximize the advantages of this data collection method. The software also efficiently produces a coding manual, web-based statistical summary and crosstab reports, as well as input templates for use by statistical packages. Overall, web-based data entry using a dynamic approach proved to be a very efficient and effective data collection system. This data collection method expedited data processing and analysis and eliminated the need for cumbersome and expensive transfer and tracking of forms, data entry, and verification. The code has been made available for non-profit use only to the public health research community as a free download [1]. PMID:16390556

  12. Coastal bathymetry data collected in 2011 from the Chandeleur Islands, Louisiana

    USGS Publications Warehouse

    DeWitt, Nancy T.; Pfeiffer, William R.; Bernier, Julie C.; Buster, Noreen A.; Miselis, Jennifer L.; Flocks, James G.; Reynolds, Billy J.; Wiese, Dana S.; Kelso, Kyle W.

    2014-01-01

    This report serves as an archive of processed interferometric swath and single-beam bathymetry data. Geographic Information System data products include a 50-meter cell-size interpolated bathymetry grid surface, trackline maps, and point data files. Additional files include error analysis maps, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata.

  13. Phenomenography and Grounded Theory as Research Methods in Computing Education Research Field

    ERIC Educational Resources Information Center

    Kinnunen, Paivi; Simon, Beth

    2012-01-01

    This paper discusses two qualitative research methods, phenomenography and grounded theory. We introduce both methods' data collection and analysis processes and the type or results you may get at the end by using examples from computing education research. We highlight some of the similarities and differences between the aim, data collection and…

  14. Factors Influencing Error Recovery in Collections Databases: A Museum Case Study

    ERIC Educational Resources Information Center

    Marty, Paul F.

    2005-01-01

    This article offers an analysis of the process of error recovery as observed in the development and use of collections databases in a university museum. It presents results from a longitudinal case study of the development of collaborative systems and practices designed to reduce the number of errors found in the museum's databases as museum…

  15. Moving beyond Mentoring: A Collective Case Study Examining the Perceived Characteristics of Positive Transformational Figures

    ERIC Educational Resources Information Center

    Bean, Brent W.; Kroth, Michael

    2013-01-01

    The purpose of this collective-case study was to explore the characteristics of transformational figures. This study revealed that interpersonal encounters were seen as a catalyst that assisted study participants through the process of transformation. Ten themes emerged from the cross-case analysis: Imposed and Intentional Influence; Metaphors of…

  16. Business Intelligence Applied to the ALMA Software Integration Process

    NASA Astrophysics Data System (ADS)

    Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.

    2012-09-01

    Software quality assurance and planning for an astronomy project is a complex task, especially if it is a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project, there is much valuable information about the process itself that you may be able to collect. One way you can receive this input is via an issue tracking system that gathers the problem reports relating to software bugs captured during testing of the software, during integration of the different components, or, even worse, problems that occur during production. Usually little time is spent on analyzing these reports, but with some multidimensional processing you can extract valuable information from them that may help with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe the extraction, transformation and load process and how the data were processed. The main goal is to assess a software process and gain insights from this information.

  17. The Molecular Signatures Database (MSigDB) hallmark gene set collection.

    PubMed

    Liberzon, Arthur; Birger, Chet; Thorvaldsdóttir, Helga; Ghandi, Mahmoud; Mesirov, Jill P; Tamayo, Pablo

    2015-12-23

    The Molecular Signatures Database (MSigDB) is one of the most widely used and comprehensive databases of gene sets for performing gene set enrichment analysis. Since its creation, MSigDB has grown beyond its roots in metabolic disease and cancer to include >10,000 gene sets. These better represent a wider range of biological processes and diseases, but the utility of the database is reduced by increased redundancy across, and heterogeneity within, gene sets. To address this challenge, here we use a combination of automated approaches and expert curation to develop a collection of "hallmark" gene sets as part of MSigDB. Each hallmark in this collection consists of a "refined" gene set, derived from multiple "founder" sets, that conveys a specific biological state or process and displays coherent expression. The hallmarks effectively summarize most of the relevant information of the original founder sets and, by reducing both variation and redundancy, provide more refined and concise inputs for gene set enrichment analysis.
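
    A common downstream use of such a collection is testing whether a user's gene list is over-represented in one gene set. Full GSEA uses a ranked-list statistic, but the hypergeometric tail below is the simplest instance of the idea; all counts here are hypothetical.

```python
# Sketch: over-representation of a gene list in one hallmark-style set,
# via the hypergeometric upper tail. All counts are hypothetical.
from scipy.stats import hypergeom

N = 20000   # genes in the background universe
K = 200     # genes in the hallmark set
n = 500     # genes in the user's differentially expressed list
k = 18      # overlap between the list and the set

# P(overlap >= k) under random draws without replacement.
p_value = hypergeom.sf(k - 1, N, K, n)
print(f"expected overlap = {n * K / N:.1f}, observed = {k}, p = {p_value:.3g}")
```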

  18. A Web-Based Course Assessment Tool with Direct Mapping to Student Outcomes

    ERIC Educational Resources Information Center

    Ibrahim, Walid; Atif, Yacine; Shuaib, Khaled; Sampson, Demetrios

    2015-01-01

    The assessment of curriculum outcomes is an essential element for continuous academic improvement. However, the collection, aggregation and analysis of assessment data are notoriously complex and time-consuming processes. At the same time, only few developments of supporting electronic processes and tools for continuous academic program assessment…

  19. Assessing sustainability using data from the Forest Inventory and Analysis Program of the United States Forest Service

    Treesearch

    Ronald E. McRoberts; William H. McWilliams; Gregory A. Reams; Thomas L. Schmidt; Jennifer C. Jenkins; Katherine P. O' Neill; Patrick D. Miles; Gary J. Brand

    2004-01-01

    Forest sustainability has emerged as a crucial component of all current issues related to forest management. The seven Montreal Process Criteria are well accepted as categories of processes for evaluating forest management with respect to sustainability, and data collected.

  20. What can one sample tell us? Stable isotopes can assess complex processes in national assessments of lakes, rivers and streams.

    EPA Science Inventory

    Stable isotopes can be very useful in large-scale monitoring programs because samples for isotopic analysis are easy to collect, and isotopes integrate information about complex processes such as evaporation from water isotopes and denitrification from nitrogen isotopes. Traditi...

  1. The AgroEcoSystem-Watershed (AgES-W) model: overview and application to experimental watersheds

    USDA-ARS?s Scientific Manuscript database

    Progress in the understanding of physical, chemical, and biological processes influencing water quality, coupled with advances in the collection and analysis of hydrologic data, provide opportunities for significant innovations in the manner and level with which watershed-scale processes may be quan...

  2. Conceptual Level of Understanding about Sound Concept: Sample of Fifth Grade Students

    ERIC Educational Resources Information Center

    Bostan Sarioglan, Ayberk

    2016-01-01

    In this study, students' conceptual change processes related to the sound concept were examined. The study group comprised 325 fifth-grade middle school students. Three multiple-choice questions were used as the data collection tool. In the data analysis process, "scientific response", "scientifically unacceptable response"…

  3. Organizational Analysis of the United States Army Evaluation Center

    DTIC Science & Technology

    2014-12-01

    analysis of qualitative or quantitative data obtained from design reviews, hardware inspections, M&S, hardware and software testing, metrics review... Research Development Test & Evaluation (RDT&E) appropriation account. The Defense Acquisition Portal ACQuipedia website describes RDT&E as “one of the... research, design, development, test and evaluation, production, installation, operation, and maintenance; data collection; processing and analysis

  4. Teaching Earth Signals Analysis Using the Java-DSP Earth Systems Edition: Modern and Past Climate Change

    ERIC Educational Resources Information Center

    Ramamurthy, Karthikeyan Natesan; Hinnov, Linda A.; Spanias, Andreas S.

    2014-01-01

    Modern data collection in the Earth Sciences has propelled the need for understanding signal processing and time-series analysis techniques. However, there is an educational disconnect in the lack of instruction of time-series analysis techniques in many Earth Science academic departments. Furthermore, there are no platform-independent freeware…

  5. Status of the calibration and alignment framework at the Belle II experiment

    NASA Astrophysics Data System (ADS)

    Dossett, D.; Sevior, M.; Ritter, M.; Kuhr, T.; Bilka, T.; Yaschenko, S.; Belle II Software Group

    2017-10-01

    The Belle II detector at the SuperKEKB e+e- collider plans to take first collision data in 2018. The monetary and CPU time costs associated with storing and processing the data mean that it is crucial for the detector components at Belle II to be calibrated quickly and accurately. A fast and accurate calibration system would allow the high level trigger to increase the efficiency of event selection, and can give users analysis-quality reconstruction promptly. A flexible framework to automate the fast production of calibration constants is being developed in the Belle II Analysis Software Framework (basf2). Detector experts only need to create two components from C++ base classes in order to use the automation system. The first collects data from Belle II event data files and outputs much smaller files to pass to the second component, which runs the main calibration algorithm to produce calibration constants ready for upload into the conditions database. A Python framework coordinates the input files, order of processing, and submission of jobs. Splitting the operation into collection and algorithm processing stages allows the framework to optionally parallelize the collection stage on a batch system.
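
    The collector/algorithm split described above can be illustrated with a generic sketch (this is not basf2's actual API; the pedestal-style example and class names are invented):

    ```python
    # Stage 1 reduces bulky event data to small summaries; stage 2 merges the
    # summaries and derives calibration constants. A framework can fan stage 1
    # out over a batch system and run the cheap stage 2 once afterwards.
    class Collector:
        """Scan event data, emit a much smaller summary."""
        def collect(self, events):
            total, n = {}, {}
            for channel, value in events:   # e.g. per-channel pedestal sums
                total[channel] = total.get(channel, 0.0) + value
                n[channel] = n.get(channel, 0) + 1
            return total, n

    class Algorithm:
        """Run on merged summaries, produce calibration constants."""
        def calibrate(self, summaries):
            total, n = {}, {}
            for t, c in summaries:          # merge summaries from parallel jobs
                for ch in t:
                    total[ch] = total.get(ch, 0.0) + t[ch]
                    n[ch] = n.get(ch, 0) + c[ch]
            return {ch: total[ch] / n[ch] for ch in total}

    jobs = [[("ch0", 1.1), ("ch1", 2.0)], [("ch0", 0.9), ("ch1", 2.2)]]
    collector, algorithm = Collector(), Algorithm()
    print(algorithm.calibrate([collector.collect(j) for j in jobs]))
    # {'ch0': 1.0, 'ch1': 2.1} -- constants ready for a conditions database
    ```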

  6. Statistical analysis of CCSN/SS7 traffic data from working CCS subnetworks

    NASA Astrophysics Data System (ADS)

    Duffy, Diane E.; McIntosh, Allen A.; Rosenstein, Mark; Willinger, Walter

    1994-04-01

    In this paper, we report on an ongoing statistical analysis of actual CCSN traffic data. The data consist of approximately 170 million signaling messages collected from a variety of different working CCS subnetworks. The key findings from our analysis concern: (1) the characteristics of both the telephone call arrival process and the signaling message arrival process; (2) the tail behavior of the call holding time distribution; and (3) the observed performance of the CCSN with respect to a variety of performance and reliability measurements.
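
    One standard way to probe the tail behavior of a holding-time distribution, as studied above, is the Hill estimator; the synthetic data below are illustrative, not the CCSN measurements:

    ```python
    # Hill estimate of the tail index from the k largest order statistics.
    import numpy as np

    def hill_tail_index(x, k):
        """Estimate the tail index alpha from the k largest values of x."""
        xs = np.sort(x)[::-1]                  # descending order statistics
        logs = np.log(xs[:k]) - np.log(xs[k])  # log-exceedances over x_(k+1)
        return 1.0 / logs.mean()

    rng = np.random.default_rng(0)
    holding = rng.pareto(1.5, size=100_000) + 1.0  # Pareto, true alpha = 1.5
    print(f"alpha_hat ~ {hill_tail_index(holding, k=2000):.2f}")  # close to 1.5
    ```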

  7. Data Validation Package, April and June 2016 Groundwater and Surface Water Sampling at the Gunnison, Colorado, Processing Site, October 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linard, Joshua; Campbell, Sam

    This event included annual sampling of groundwater and surface water locations at the Gunnison, Colorado, Processing Site. Sampling and analyses were conducted as specified in Sampling and Analysis Plan for US Department of Energy Office of Legacy Management Sites (LMS/PRO/S04351, continually updated, http://energy.gov/lm/downloads/sampling-and-analysis-plan-us-department-energy-office-legacy-management-sites). Samples were collected from 28 monitoring wells, three domestic wells, and six surface locations in April at the processing site as specified in the draft 2010 Ground Water Compliance Action Plan for the Gunnison, Colorado, Processing Site. Planned monitoring locations are shown in Attachment 1, Sampling and Analysis Work Order. Domestic wells 0476 and 0477 were sampled in June because the homes were unoccupied in April, and the wells were not in use. Duplicate samples were collected from locations 0126, 0477, and 0780. One equipment blank was collected during this sampling event. Water levels were measured at all monitoring wells that were sampled. See Attachment 2, Trip Reports, for additional details. The analytical data and associated qualifiers can be viewed in environmental database reports and are also available for viewing with dynamic mapping via the GEMS (Geospatial Environmental Mapping System) website at http://gems.lm.doe.gov/#. No issues were identified during the data validation process that require additional action or follow-up. An assessment of anomalous data is included in Attachment 3. Interpretation and presentation of results, including an assessment of the natural flushing compliance strategy, will be reported in the upcoming 2016 Verification Monitoring Report.

  8. Removing external DNA contamination from arthropod predators destined for molecular gut-content analysis

    USDA-ARS?s Scientific Manuscript database

    Molecular gut-content analysis enables detection of arthropod predation with minimal disruption of ecosystem processes. Field and laboratory experiments have demonstrated that mass-collection methods, such as sweep-netting, vacuum sampling, and foliage beating, can lead to contamination of fed pred...

  9. Removing external DNA contamination from arthropod predators destined for molecular gut-content analysis

    USDA-ARS?s Scientific Manuscript database

    Molecular gut-content analysis enables detection of arthropod predation with minimal disruption of ecosystem processes. Field and laboratory experiments have demonstrated that mass-collection methods, such as sweep-netting, vacuum sampling, and foliage beating, can lead to contamination of fed pred...

  10. Catalysis: A Potential Alternative to Kraft Pulping

    Treesearch

    Alan W. Rudie; Peter W. Hart

    2014-01-01

    A thorough analysis of the kraft pulping process makes it obvious why it has dominated for over a century as an industrial process with no replacement in sight. It uses low cost raw materials, collects and regenerates over 90% of the chemicals needed in the process, is indifferent to wood raw material and good at preserving the cellulose portion of the wood which is...

  11. Catalysis: A Potential Alternative to Kraft Pulping

    Treesearch

    Alan W. Rudie; Peter W. Hart

    2014-01-01

    A thorough analysis of the kraft pulping process makes it obvious why it has dominated for over a century as an industrial process with no replacement in sight. It uses low-cost raw materials; collects and regenerates over 90% of the chemicals needed in the process; and is indifferent to wood raw material and good at preserving the cellulose portion of the wood, the...

  12. Status of the NASA Robotic Mission Conjunction Assessment Effort

    NASA Technical Reports Server (NTRS)

    Newman, Lauri Kraft

    2007-01-01

    This viewgraph presentation discusses NASA's processes and tools used to mitigate threats to NASA's robotic assets. The topics include: 1) Background; 2) Goddard Stakeholders and Mission Support; 3) ESC and TDRS Mission Descriptions; 4) TDRS Conjunction Assessment Process; 5) ESMO Conjunction Assessment Process; 6) Recent Operations Experiences; 7) Statistics Collected for ESC Regime; and 8) Current and Future Analysis Items.

  13. [Psychosocial analysis of the health-disease process].

    PubMed

    Sawaia, B B

    1994-04-01

    This article is a reflection on the transdisciplinary paradigms of the health-illness process, noting the symbolic mediation between the reactions of the biological organism and socio-environmental factors, including the pathogenic ones. The symbolic-affective mediation is analyzed from the perspective of Social Representation theory, allowing one to understand the frames of reference of individual and collective actions in the health-illness process.

  14. Application of a High-Fidelity Icing Analysis Method to a Model-Scale Rotor in Forward Flight

    NASA Technical Reports Server (NTRS)

    Narducci, Robert; Orr, Stanley; Kreeger, Richard E.

    2012-01-01

    An icing analysis process involving the loose coupling of OVERFLOW-RCAS for rotor performance prediction and LEWICE3D for thermal analysis and ice accretion is applied to a model-scale rotor for validation. The process offers high-fidelity rotor analysis for non-iced and iced rotor performance evaluation that accounts for the interaction of nonlinear aerodynamics with blade elastic deformations. Ice accumulation prediction also involves loosely coupled data exchanges between OVERFLOW and LEWICE3D to produce accurate ice shapes. Validation of the process uses data collected in the 1993 icing test involving Sikorsky's Powered Force Model. Non-iced and iced rotor performance predictions are compared to experimental measurements, as are predicted ice shapes.

  15. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    PubMed

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding and systematic appraisal, and identify areas of improvement of a business process. Unified modelling language (UML) is a general-purpose modelling technique commonly used for this task. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier for efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage (especially from the pilot phase), parallel processing of data, and correctly positioned process controls should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  16. Teachers' work ability: a study of relationships between collective efficacy and self-efficacy beliefs.

    PubMed

    Guidetti, Gloria; Viotti, Sara; Bruno, Andreina; Converso, Daniela

    2018-01-01

    Work ability constitutes one of the most studied well-being indicators related to work. Past research highlighted the relationship with work-related resources and demands, and personal resources. However, no studies highlight the role of collective and self-efficacy beliefs in sustaining work ability. The purpose of this study was to examine whether and by which mechanism work ability is linked with individual and collective efficacies in a sample of primary and middle school teachers. Using a dataset consisting of 415 primary and middle school Italian teachers, the analysis tested for the mediating role of self-efficacy between collective efficacy and work ability. Mediational analysis highlights that teachers' self-efficacy totally mediates the relationship between collective efficacy and perceived work ability. Results of this study enhance the theoretical knowledge and empirical evidence regarding the link between teachers' collective efficacy and self-efficacy, giving further emphasis to the concept of collective efficacy in school contexts. Moreover, the results contribute to the study of well-being in the teaching profession, highlighting a process that sustains and promotes levels of work ability through both collective and personal resources.
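
    A minimal sketch of the mediation logic reported above, run on simulated data rather than the study's dataset: the indirect effect of collective efficacy on work ability through self-efficacy is estimated as the product of path coefficients:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 415  # matches the reported sample size
    collective = rng.normal(size=n)
    self_eff = 0.6 * collective + rng.normal(size=n)    # path a
    work_ability = 0.5 * self_eff + rng.normal(size=n)  # path b (full mediation)

    # Path a: collective efficacy -> self-efficacy
    a = sm.OLS(self_eff, sm.add_constant(collective)).fit().params[1]
    # Paths b and c': regress work ability on both predictors
    m = sm.OLS(work_ability,
               sm.add_constant(np.column_stack([collective, self_eff]))).fit()
    b, direct = m.params[2], m.params[1]
    print(f"indirect effect a*b = {a * b:.2f}, direct effect c' = {direct:.2f}")
    # Under total mediation, the direct effect c' should be near zero.
    ```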

  17. Methodological tools for the collection and analysis of participant observation data using grounded theory.

    PubMed

    Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi

    2014-11-01

    To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data but PO gives a distinctive insight, revealing what people are really doing, instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study was carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. Discussion of using the grounded theory method and PO as a data collection tool. The following methodological tools are discussed: an observational protocol, jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging. It was time-consuming and it required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis to building theory. Using PO as a data collection method in qualitative nursing research provides insights that cannot be obtained with other data collection methods, yet it is not commonly discussed in nursing research. This paper can therefore serve as a useful tool for those who intend to use PO and grounded theory in their nursing research.

  18. Implementation and evaluation of ILLIAC 4 algorithms for multispectral image processing

    NASA Technical Reports Server (NTRS)

    Swain, P. H.

    1974-01-01

    Data concerning a multidisciplinary and multi-organizational effort to implement multispectral data analysis algorithms on a revolutionary computer, the ILLIAC 4, are reported. The effectiveness and efficiency of implementing the digital multispectral data analysis techniques for producing useful land use classifications from satellite-collected data were demonstrated.

  19. Variation and Commonality in Phenomenographic Research Methods

    ERIC Educational Resources Information Center

    Akerlind, Gerlese S.

    2012-01-01

    This paper focuses on the data analysis stage of phenomenographic research, elucidating what is involved in terms of both commonality and variation in accepted practice. The analysis stage of phenomenographic research is often not well understood. This paper helps to clarify the process, initially by collecting together in one location the more…

  20. Improving energy audit process and report outcomes through planning initiatives

    NASA Astrophysics Data System (ADS)

    Sprau Coulter, Tabitha L.

    Energy audits and energy models are an important aspect of the retrofit design process, as they provide project teams with an opportunity to evaluate a facility's current building systems and energy performance. The information collected during an energy audit is typically used to develop an energy model and an energy audit report, both of which assist in making decisions about the design and implementation of energy conservation measures in a facility. The current lack of energy auditing standards results in a high degree of variability in energy audit outcomes depending on the individual performing the audit. The research presented is based on the conviction that performing an energy audit and producing a value-adding energy model for retrofit buildings can benefit from a revised approach. The research was divided into four phases, with the initial three phases consisting of: 1) a process mapping activity, aimed at reducing variability in the energy auditing and energy modeling process; 2) a survey analysis, to examine the misalignment between how industry members use the top energy modeling tools and their intended use as defined by software representatives; and 3) a sensitivity analysis of the effect key energy modeling inputs have on energy modeling analysis results. The initial three phases helped define the need for an improved energy audit approach that better aligns data collection with facility owners' needs and priorities. They also assisted in the development of a multi-criteria decision support tool that incorporates a House of Quality approach to guide a pre-audit planning activity. The fourth and final research phase explored the impacts and evaluation methods of a pre-audit planning activity using two comparative energy audits as case studies. In each case, an energy audit professional was asked to complete an audit using traditional methods along with an audit that involved first participating in a pre-audit planning activity that aligned the owner's priorities with the data collection. A comparative analysis was then used to evaluate the effects of the pre-audit planning activity in developing a more strategic method for collecting data and presenting findings in an energy audit report to a facility owner. The case studies demonstrated that pre-audit planning has the potential to improve the efficiency of an energy audit process through reductions in transition time waste. The cases also demonstrated the value of audit report designs that are perceived by owners to be project specific rather than generic. The research demonstrated the ability to influence and alter an auditor's behavior through participation in a pre-audit planning activity. It also shows the potential benefits of using the House of Quality as a method of aligning data collection with owners' goals and priorities to develop reports that have increased value.

  1. The Topological Weighted Centroid (TWC): A topological approach to the time-space structure of epidemic and pseudo-epidemic processes

    NASA Astrophysics Data System (ADS)

    Buscema, Massimo; Massini, Giulia; Sacco, Pier Luigi

    2018-02-01

    This paper offers the first systematic presentation of the topological approach to the analysis of epidemic and pseudo-epidemic spatial processes. We introduce the basic concepts and proofs, and test the approach on a diverse collection of case studies of historically documented epidemic and pseudo-epidemic processes. The approach is found to consistently provide reliable estimates of the structural features of epidemic processes, and to provide useful analytical insights and interpretations of fragmentary pseudo-epidemic processes. Although this analysis has to be regarded as preliminary, we find that the approach's basic tenets are strongly corroborated by this first test and warrant future research in this vein.

  2. Analysis of Management Control Techniques for the Data Processing Department at the Navy Finance Center, Cleveland, Ohio.

    DTIC Science & Technology

    1983-03-01

    System are: Order processing coordinators, Order processing management, Credit and collections, Accounts receivable, Support management, Admin management... or sales secretary, then by order processing (OP). Phone-in orders go directly to OP. The information is next transcribed onto an order entry... ORDER PROCESSING: The central systems validate the order items and codes by processing them against the customer file, the product or parts file, and

  3. An analysis of learning process based on scientific approach in physical chemistry experiment

    NASA Astrophysics Data System (ADS)

    Arlianty, Widinda Normalia; Febriana, Beta Wulan; Diniaty, Artina

    2017-03-01

    This study aimed to analyze the quality of the learning process based on a scientific approach in the physical chemistry experiment course for Chemistry Education students at the Islamic University of Indonesia. The research was descriptive qualitative. The samples of this research were 2nd semester students, class of 2015. Data on the scientific learning process were collected using observation sheets and documentation across seven experiment titles. The results showed that the achievement of the scientific learning process on observing, questioning, experimenting, and associating data was 73.98%, 81.79%, 80.74%, and 76.94% respectively, categorized as medium. The communicating aspect reached a high category, at an 86.11% level of achievement.

  4. The Web Measurement Environment (WebME): A Tool for Combining and Modeling Distributed Data

    NASA Technical Reports Server (NTRS)

    Tesoriero, Roseanne; Zelkowitz, Marvin

    1997-01-01

    Many organizations have incorporated data collection into their software processes for the purpose of process improvement. However, in order to improve, interpreting the data is just as important as the collection of data. With the increased presence of the Internet and the ubiquity of the World Wide Web, the potential for software processes being distributed among several physically separated locations has also grown. Because project data may be stored in multiple locations and in differing formats, obtaining and interpreting data from this type of environment becomes even more complicated. The Web Measurement Environment (WebME), a Web-based data visualization tool, is being developed to facilitate the understanding of collected data in a distributed environment. The WebME system will permit the analysis of development data in distributed, heterogeneous environments. This paper provides an overview of the system and its capabilities.

  5. Lean production tools and decision latitude enable conditions for innovative learning in organizations: a multilevel analysis.

    PubMed

    Fagerlind Ståhl, Anna-Carin; Gustavsson, Maria; Karlsson, Nadine; Johansson, Gun; Ekberg, Kerstin

    2015-03-01

    The effect of lean production on conditions for learning is debated. This study aimed to investigate how tools inspired by lean production (standardization, resource reduction, visual monitoring, housekeeping, value flow analysis) were associated with an innovative learning climate and with collective dispersion of ideas in organizations, and whether decision latitude contributed to these associations. A questionnaire was sent out to employees in public, private, production and service organizations (n = 4442). Multilevel linear regression analyses were used. Use of lean tools and decision latitude were positively associated with an innovative learning climate and collective dispersion of ideas. A low degree of decision latitude was a modifier in the association to collective dispersion of ideas. Lean tools can enable shared understanding and collective spreading of ideas, needed for the development of work processes, especially when decision latitude is low. Value flow analysis played a pivotal role in the associations.

  6. Quantifying the sensitivity of feedstock properties and process conditions on hydrochar yield, carbon content, and energy content.

    PubMed

    Li, Liang; Wang, Yiying; Xu, Jiting; Flora, Joseph R V; Hoque, Shamia; Berge, Nicole D

    2018-08-01

    Hydrothermal carbonization (HTC) is a wet, low temperature thermal conversion process that continues to gain attention for the generation of hydrochar. The importance of specific process conditions and feedstock properties on hydrochar characteristics is not well understood. To evaluate this, linear and non-linear models were developed to describe hydrochar characteristics based on data collected from HTC-related literature. A Sobol analysis was subsequently conducted to identify parameters that most influence hydrochar characteristics. Results from this analysis indicate that for each investigated hydrochar property, the model fit and predictive capability associated with the random forest models is superior to both the linear and regression tree models. Based on results from the Sobol analysis, the feedstock properties and process conditions most influential on hydrochar yield, carbon content, and energy content were identified. In addition, a variational process parameter sensitivity analysis was conducted to determine how feedstock property importance changes with process conditions.
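
    A sketch of that workflow under stated assumptions (toy data and invented variable bounds, not the paper's fitted models): fit a random forest surrogate to HTC observations, then run a Sobol sensitivity analysis on its predictions with SALib:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    rng = np.random.default_rng(2)
    # toy design matrix: temperature (C), residence time (h), feedstock %C
    X = rng.uniform([180, 0.5, 40], [280, 24, 70], size=(500, 3))
    y = 0.4 * X[:, 0] + 2.0 * X[:, 2] + rng.normal(0, 5, 500)  # toy yield

    surrogate = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)

    problem = {
        "num_vars": 3,
        "names": ["temperature_C", "time_h", "feedstock_carbon_pct"],
        "bounds": [[180, 280], [0.5, 24], [40, 70]],
    }
    samples = saltelli.sample(problem, 1024)      # Saltelli sampling scheme
    Si = sobol.analyze(problem, surrogate.predict(samples))
    print(dict(zip(problem["names"], Si["ST"].round(2))))  # total-order indices
    ```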

  7. Multi-level multi-criteria analysis of alternative fuels for waste collection vehicles in the United States.

    PubMed

    Maimoun, Mousa; Madani, Kaveh; Reinhart, Debra

    2016-04-15

    Historically, the U.S. waste collection fleet was dominated by diesel-fueled waste collection vehicles (WCVs); the growing need for sustainable waste collection has urged decision makers to incorporate economically efficient alternative fuels, while mitigating environmental impacts. The pros and cons of alternative fuels complicate the decision-making process, calling for a comprehensive study that assesses the multiple factors involved. Multi-criteria decision analysis (MCDA) methods allow decision makers to select the best alternatives with respect to selection criteria. In this study, two MCDA methods, Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) and Simple Additive Weighting (SAW), were used to rank fuel alternatives for the U.S. waste collection industry with respect to a multi-level environmental and financial decision matrix. The environmental criteria consisted of life-cycle emissions, tail-pipe emissions, water footprint (WFP), and power density, while the financial criteria comprised vehicle cost, fuel price, fuel price stability, and fueling station availability. The overall analysis showed that conventional diesel is still the best option, followed by hydraulic-hybrid WCVs, landfill gas (LFG) sourced natural gas, fossil natural gas, and biodiesel. The elimination of the WFP and power density criteria from the environmental criteria ranked biodiesel 100 (BD100) as an environmentally better alternative compared to other fossil fuels (diesel and natural gas). This result showed that considering the WFP and power density as environmental criteria can make a difference in the decision process. The elimination of the fueling station and fuel price stability criteria from the decision matrix ranked fossil natural gas second after LFG-sourced natural gas. This scenario was found to represent the status quo of the waste collection industry. A sensitivity analysis for the status quo scenario showed the overall ranking of diesel and fossil natural gas to be more sensitive to changing fuel prices as compared to other alternatives.
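
    A compact TOPSIS sketch in the spirit of the ranking described above; the alternatives, criterion values, and weights are invented for illustration:

    ```python
    import numpy as np

    def topsis(matrix, weights, benefit):
        """Rank alternatives (rows) on criteria (columns); benefit[j] is True
        if criterion j is better when larger."""
        m = matrix / np.linalg.norm(matrix, axis=0)  # vector normalization
        v = m * weights
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_pos = np.linalg.norm(v - ideal, axis=1)
        d_neg = np.linalg.norm(v - worst, axis=1)
        return d_neg / (d_pos + d_neg)               # closeness to the ideal

    fuels = ["diesel", "natural_gas", "biodiesel"]
    #                 cost($/gal)  emissions  station_availability
    data = np.array([[3.0,         10.0,      0.9],
                     [2.2,          8.0,      0.4],
                     [3.5,          6.0,      0.3]])
    scores = topsis(data, weights=np.array([0.4, 0.4, 0.2]),
                    benefit=np.array([False, False, True]))
    for fuel, score in sorted(zip(fuels, scores), key=lambda t: -t[1]):
        print(f"{fuel}: {score:.3f}")
    ```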

  8. How to do (or not to do)… gender analysis in health systems research.

    PubMed

    Morgan, Rosemary; George, Asha; Ssali, Sarah; Hawkins, Kate; Molyneux, Sassy; Theobald, Sally

    2016-10-01

    Gender (the socially constructed roles, behaviours, activities and attributes that a given society considers appropriate for males, females and other genders) affects how people live, work and relate to each other at all levels, including in relation to the health system. Health systems research (HSR) aims to inform more strategic, effective and equitable health systems interventions, programs and policies; and the inclusion of gender analysis into HSR is a core part of that endeavour. We outline what gender analysis is and how it can be incorporated into HSR content, process and outcomes. Starting with HSR content, i.e. the substantive focus of HSR, we recommend exploring whether and how gender power relations affect females and males in health systems through the use of sex-disaggregated data, gender frameworks and questions. Sex disaggregation flags female-male differences or similarities that warrant further analysis; further analysis is guided by gender frameworks and questions to understand how gender power relations are constituted and negotiated in health systems. Critical aspects of understanding gender power relations include examining who has what (access to resources); who does what (the division of labour and everyday practices); how values are defined (social norms); and who decides (rules and decision-making). Secondly, we examine gender in the HSR process by reflecting on how the research process itself is imbued with power relations. We focus on data collection and analysis by reviewing who participates as respondents; when data is collected and where; who is present; who collects data; and who analyses data. Thirdly, we consider gender and HSR outcomes by considering who is empowered and disempowered as a result of HSR, including the extent to which HSR outcomes progressively transform gender power relations in health systems, or at least do not further exacerbate them.

  9. Collective efficacy in Denver, Colorado: Strengthening neighborhoods and health through community gardens.

    PubMed

    Teig, Ellen; Amulya, Joy; Bardwell, Lisa; Buchenau, Michael; Marshall, Julie A; Litt, Jill S

    2009-12-01

    Community gardens are viewed as a potentially useful environmental change strategy to promote active and healthy lifestyles but the scientific evidence base for gardens is limited. As a step towards understanding whether gardens are a viable health promotion strategy for local communities, we set out to examine the social processes that might explain the connection between gardens, garden participation and health. We analyzed data from semi-structured interviews with community gardeners in Denver. The analysis examined social processes described by community gardeners and how those social processes were cultivated by or supportive of activities in community gardens. After presenting results describing these social processes and the activities supporting them, we discuss the potential for the place-based social processes found in community gardens to support collective efficacy, a powerful mechanism for enhancing the role of gardens in promoting health.

  10. Fractional Brownian motion run with a multi-scaling clock mimics diffusion of spherical colloids in microstructural fluids.

    PubMed

    Park, Moongyu; Cushman, John Howard; O'Malley, Dan

    2014-09-30

    The collective molecular reorientations within a nematic liquid crystal fluid bathing a spherical colloid cause the colloid to diffuse anomalously on a short time scale (i.e., as a non-Brownian particle). The deformations and fluctuations of long-range orientational order in the liquid crystal profoundly influence the transient diffusive regimes. Here we show that an anisotropic fractional Brownian process run with a nonlinear multiscaling clock effectively mimics this collective and transient phenomenon. This novel process has memory, Gaussian increments, and a multiscale mean square displacement that can be chosen independently from the fractal dimension of a particle trajectory. The process is capable of modeling multiscale sub-, super-, or classical diffusion. The finite-size Lyapunov exponents for this multiscaling process are defined for future analysis of related mixing processes.
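
    A simplified sketch of the construction (one-dimensional rather than anisotropic, with a power-law clock t -> t**gamma standing in for the general multiscaling clock): generate exact fractional Brownian motion via a Cholesky factor of its covariance, then evaluate it at clock times:

    ```python
    import numpy as np

    def fbm(times, hurst, rng):
        """Exact fBm sample at given times via Cholesky of the covariance."""
        t = np.asarray(times)
        cov = 0.5 * (t[:, None] ** (2 * hurst) + t[None, :] ** (2 * hurst)
                     - np.abs(t[:, None] - t[None, :]) ** (2 * hurst))
        return np.linalg.cholesky(cov) @ rng.standard_normal(len(t))

    rng = np.random.default_rng(3)
    t = np.linspace(0.01, 1.0, 200)
    gamma = 0.5                # the nonlinear "clock" exponent
    path = fbm(t ** gamma, hurst=0.7, rng=rng)  # fBm run with clock t**gamma
    # The clock sets the MSD exponent (2*H*gamma here) independently of the
    # Hurst exponent that fixes the path's fractal dimension, which is the
    # point of the construction.
    ```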

  11. Evolutionary Capability Delivery of Coast Guard Manpower System

    DTIC Science & Technology

    2014-06-01

    Office IID iterative incremental development model IT information technology MA major accomplishment MRA manpower requirements analysis MRD manpower... CG will need to ensure that development is low risk. The CG uses Manpower Requirements Analysis (MRAs) to collect the necessary manpower data to... of users. The CG uses two business processes to manage human capital: Manpower Requirements Analysis (MRA) and Manpower Requirements

  12. Measuring the software process and product: Lessons learned in the SEL

    NASA Technical Reports Server (NTRS)

    Basili, V. R.

    1985-01-01

    The software development process and product can and should be measured. The software measurement process at the Software Engineering Laboratory (SEL) has taught a major lesson: develop a goal-driven paradigm (also characterized as a goal/question/metric paradigm) for data collection. Project analysis under this paradigm leads to a design for evaluating and improving the methodology of software development and maintenance.

  13. Final report of coordination and cooperation with the European Union on embankment failure analysis

    USDA-ARS?s Scientific Manuscript database

    There has been an emphasis in the European Union (EU) community on the investigation of extreme flood processes and the uncertainties related to these processes. Over a 3-year period, the EU and the U.S. dam safety community (1) coordinated their efforts and collected information needed to integrate...

  14. Advances in distributed watershed modeling: a review and application of the AgroEcoSystem-Watershed (AgES-W) model

    USDA-ARS?s Scientific Manuscript database

    Progress in the understanding of physical, chemical, and biological processes influencing water quality, coupled with advancements in the collection and analysis of hydrologic data, provide opportunities for significant innovations in the manner and level with which watershed-scale processes may be ...

  15. Challenges and progress in distributed watershed modeling: applications of the AgroEcoSystem-Watershed (AgES-W) model

    USDA-ARS?s Scientific Manuscript database

    Progress in the understanding of physical, chemical, and biological processes influencing water quality, coupled with advances in the collection and analysis of hydrologic data, provide opportunities for significant innovations in the manner and level with which watershed-scale processes may be quan...

  16. Getting More out of Your Interview Data: Toward a Framework for Debriefing the Transcriber of Interviews

    ERIC Educational Resources Information Center

    Weinbaum, Rebecca K.; Onwuegbuzie, Anthony J.

    2016-01-01

    In most qualitative research studies involving the creation of interview transcriptions, researchers seldom demonstrate much reflexivity about the transcription process, rarely making mention of transcription processes as part of their reporting of data collection and analysis procedures beyond a simple statement that audio- or videotaped data…

  17. Determining e-Portfolio Elements in Learning Process Using Fuzzy Delphi Analysis

    ERIC Educational Resources Information Center

    Mohamad, Syamsul Nor Azlan; Embi, Mohamad Amin; Nordin, Norazah

    2015-01-01

    The present article introduces the Fuzzy Delphi method results obtained in the study on determining e-Portfolio elements in the learning process for the art and design context. This method is based on qualified experts who assure the validity of the collected information. In particular, the confirmation of elements is based on experts' opinion and…

  18. Information Processing and Risk Perception: An Adaptation of the Heuristic-Systematic Model.

    ERIC Educational Resources Information Center

    Trumbo, Craig W.

    2002-01-01

    Describes heuristic-systematic information-processing model and risk perception--the two major conceptual areas of the analysis. Discusses the proposed model, describing the context of the data collections (public health communication involving cancer epidemiology) and providing the results of a set of three replications using the proposed model.…

  19. Detection of Subsurface Defects in Levees in Correlation to Weather Conditions Utilizing Ground Penetrating Radar

    NASA Astrophysics Data System (ADS)

    Martinez, I. A.; Eisenmann, D.

    2012-12-01

    Ground Penetrating Radar (GPR) has been used for many years in successful subsurface detection of conductive and non-conductive objects in all types of material, including different soils and concrete. Typical defect detection is based on subjective examination of processed scans, using data collection and analysis software to acquire and analyze the data, often requiring developed expertise or an awareness of how GPR works while collecting data. Processing programs, such as GSSI's RADAN analysis software, are then used to validate the collected information. Iowa State University's Center for Nondestructive Evaluation (CNDE) has built a test site, resembling a typical levee used near rivers, which contains known subsurface targets of varying size, depth, and conductivity. Scientists at CNDE have developed software with enhanced capabilities to decipher a hyperbola's magnitude and amplitude for GPR signal processing. With this enhanced capability, the signal processing and defect detection capabilities of GPR have the potential to be greatly improved. This study will examine the effects of test parameters, antenna frequency (400 MHz), data manipulation methods (which include data filters and restricting the range of depth the chosen antenna's signal can reach), and real-world conditions at this test site (such as varying weather conditions), with the goal of improving GPR test sensitivity for differing soil conditions.

  20. A statistical metadata model for clinical trials' data management.

    PubMed

    Vardaki, Maria; Papageorgiou, Haralambos; Pentaris, Fragkiskos

    2009-08-01

    We introduce a statistical, process-oriented metadata model to describe the process of medical research data collection, management, results analysis and dissemination. Our approach explicitly provides a structure for pieces of information used in Clinical Study Data Management Systems, enabling a more active role for any associated metadata. Using the object-oriented paradigm, we describe the classes of our model that participate during the design of a clinical trial and the subsequent collection and management of the relevant data. The advantage of our approach is that we focus on presenting the structural inter-relation of these classes when used during dataset manipulation, by proposing certain transformations that model the simultaneous processing of both data and metadata. Our solution reduces the possibility of human errors and allows for the tracking of all changes made during the dataset lifecycle. The explicit modeling of processing steps improves data quality and assists in the problem of handling data collected in different clinical trials. The case study illustrates the applicability of the proposed framework, demonstrating conceptually the simultaneous handling of datasets collected during two randomized clinical studies. Finally, we provide the main considerations for implementing the proposed framework into a modern Metadata-enabled Information System.

  1. Analysis of the enablers of capacities to produce primary health care-based reforms in Latin America: a multiple case study

    PubMed Central

    Báscolo, Ernesto Pablo; Yavich, Natalia; Denis, Jean-Louis

    2016-01-01

    Background: Primary health care (PHC)-based reforms have had different results in Latin America. Little attention has been paid to the enablers of collective action capacities required to produce a comprehensive PHC approach. Objective: To analyse the enablers of collective action capacities to transform health systems towards a comprehensive PHC approach in Latin American PHC-based reforms. Methods: We conducted a longitudinal, retrospective case study of three municipal PHC-based reforms in Bolivia and Argentina. We used multiple data sources and methodologies: document review; interviews with policymakers, managers and practitioners; and household and services surveys. We used temporal bracketing to analyse how the dynamic of interaction between the institutional reform process and the collective action characteristics enabled or hindered the collective action capacities required to produce the envisioned changes. Results: The institutional structuring dynamics and collective action capacities were different in each case. In Cochabamba, there was an ‘interrupted’ structuring process that achieved the establishment of a primary level with a selective PHC approach. In Vicente López, there was a ‘path-dependency’ structuring process that permitted the consolidation of a ‘primary care’ approach, but with limited influence in hospitals. In Rosario, there was a ‘dialectic’ structuring process that favoured the development of the capacities needed to consolidate a comprehensive PHC approach that permeates the entire system. Conclusion: The institutional change processes achieved the development of a primary health care level with different degrees of consolidation and system-wide influence, given how the characteristics of each collective action enabled or hindered the ‘structuring’ processes. PMID:27209640

  2. DART system analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.

    2005-08-01

    The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
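
    The trade-off among once-through time, backward-iteration probability, and rework fraction can be illustrated with a toy geometric-loop model (the numbers are made up, not DART's data):

    ```python
    # Expected time of a step with once-through time t, iteration probability p,
    # and rework fraction r (each repeat costs r*t). A geometric loop repeats
    # p/(1-p) times on average.
    def expected_step_time(t_once, p_iterate, rework_fraction):
        expected_repeats = p_iterate / (1.0 - p_iterate)
        return t_once * (1.0 + rework_fraction * expected_repeats)

    print(f"baseline:           {expected_step_time(10.0, 0.4, 0.8):.1f} days")
    print(f"halve once-through: {expected_step_time(5.0, 0.4, 0.8):.1f} days")
    print(f"halve iteration p:  {expected_step_time(10.0, 0.2, 0.8):.1f} days")
    print(f"halve rework cost:  {expected_step_time(10.0, 0.4, 0.4):.1f} days")
    ```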

  3. A high-throughput core sampling device for the evaluation of maize stalk composition

    PubMed Central

    2012-01-01

    Background: A major challenge in the identification and development of superior feedstocks for the production of second generation biofuels is the rapid assessment of biomass composition in a large number of samples. Currently, highly accurate and precise robotic analysis systems are available for the evaluation of biomass composition, on a large number of samples, with a variety of pretreatments. However, the lack of an inexpensive and high-throughput process for large scale sampling of biomass resources is still an important limiting factor. Our goal was to develop a simple mechanical maize stalk core sampling device that can be utilized to collect uniform samples of a dimension compatible with robotic processing and analysis, while allowing the collection of hundreds to thousands of samples per day. Results: We have developed a core sampling device (CSD) to collect maize stalk samples compatible with robotic processing and analysis. The CSD facilitates the collection of thousands of uniform tissue cores consistent with high-throughput analysis required for breeding, genetics, and production studies. With a single CSD operated by one person with minimal training, more than 1,000 biomass samples were obtained in an eight-hour period. One of the main advantages of using cores is the high level of homogeneity of the samples obtained and the minimal opportunity for sample contamination. In addition, the samples obtained with the CSD can be placed directly into a bath of ice, dry ice, or liquid nitrogen maintaining the composition of the biomass sample for relatively long periods of time. Conclusions: The CSD has been demonstrated to successfully produce homogeneous stalk core samples in a repeatable manner with a throughput substantially superior to the currently available sampling methods. Given the variety of maize developmental stages and the diversity of stalk diameter evaluated, it is expected that the CSD will have utility for other bioenergy crops as well. PMID:22548834

  4. Processing and population genetic analysis of multigenic datasets with ProSeq3 software.

    PubMed

    Filatov, Dmitry A

    2009-12-01

    The current tendency in molecular population genetics is to use increasing numbers of genes in the analysis. Here I describe a program for handling and population genetic analysis of DNA polymorphism data collected from multiple genes. The program includes a sequence/alignment editor and an internal relational database that simplify the preparation and manipulation of multigenic DNA polymorphism datasets. The most commonly used DNA polymorphism analyses are implemented in ProSeq3, facilitating population genetic analysis of large multigenic datasets. Extensive input/output options make ProSeq3 a convenient hub for sequence data processing and analysis. The program is available free of charge from http://dps.plants.ox.ac.uk/sequencing/proseq.htm.

  5. The Reliability Mandate: Optimizing the Use of Highly Reliable Parts, Materials, and Processes (PM&P) to Maximize System Component Reliability in the Life Cycle

    DTIC Science & Technology

    2002-06-01

    projects are converted into bricks and mortar, as Figure 5 illustrates. Making major changes in LCC after projects are turned over to production is... matter experts (SMEs) in the parts, materials, and processes functional area. Data gathering and analysis were conducted through structured interviews... The analysis synthesized feedback and searched for collective issues from the various SMEs on managing PM&P Program requirements, the

  6. The Separation of Blood Components Using Standing Surface Acoustic Waves (SSAWs) Microfluidic Devices: Analysis and Simulation.

    PubMed

    Soliman, Ahmed M; Eldosoky, Mohamed A; Taha, Taha E

    2017-03-29

    The separation of blood components (WBCs, RBCs, and platelets) is important for medical applications. Recently, standing surface acoustic wave (SSAW) microfluidic devices have been used for the separation of particles. In this paper, the design analysis of SSAW microfluidics is presented. Also, the analysis of the SSAW force with the Rayleigh angle effect and its attenuation in a liquid-loaded substrate, the viscous drag force, the hydrodynamic force, and the diffusion force are explained and analyzed. The analyses are provided for selecting the piezoelectric material, the width of the main microchannel, the working area of the SAW, the wavelength, the minimum input power required for the separation process, and the widths of the outlet collecting microchannels. The design analysis of SSAW microfluidics is provided for determining the minimum input power required for the separation process with the appropriate displacement contrast of the particles. The analyses are applied to simulate the separation of blood components. The piezoelectric material, width of the main microchannel, working area of the SAW, wavelength, and minimum input power required for the separation process are selected as LiNbO₃, 120 μm, 1.08 mm², 300 μm, and 371 mW. The results are compared to other published results. The results of these simulations achieve minimum power consumption, a less complicated setup, and high collecting efficiency. All simulation programs are built in MATLAB.

  7. The Separation of Blood Components Using Standing Surface Acoustic Waves (SSAWs) Microfluidic Devices: Analysis and Simulation

    PubMed Central

    Soliman, Ahmed M.; Eldosoky, Mohamed A.; Taha, Taha E.

    2017-01-01

    The separation of blood components (WBCs, RBCs, and platelets) is important for medical applications. Recently, standing surface acoustic wave (SSAW) microfluidic devices have been used for the separation of particles. In this paper, the design analysis of SSAW microfluidics is presented. Also, the analysis of the SSAW force with the Rayleigh angle effect and its attenuation in a liquid-loaded substrate, the viscous drag force, the hydrodynamic force, and the diffusion force are explained and analyzed. The analyses are provided for selecting the piezoelectric material, the width of the main microchannel, the working area of the SAW, the wavelength, the minimum input power required for the separation process, and the widths of the outlet collecting microchannels. The design analysis of SSAW microfluidics is provided for determining the minimum input power required for the separation process with the appropriate displacement contrast of the particles. The analyses are applied to simulate the separation of blood components. The piezoelectric material, width of the main microchannel, working area of the SAW, wavelength, and minimum input power required for the separation process are selected as LiNbO3, 120 μm, 1.08 mm2, 300 μm, and 371 mW. The results are compared to other published results. The results of these simulations achieve minimum power consumption, a less complicated setup, and high collecting efficiency. All simulation programs are built in MATLAB. PMID:28952506

  8. Body Mapping as a Youth Sexual Health Intervention and Data Collection Tool.

    PubMed

    Lys, Candice; Gesink, Dionne; Strike, Carol; Larkin, June

    2018-06-01

    In this article, we describe and evaluate body mapping as (a) an arts-based activity within Fostering Open eXpression Among Youth (FOXY), an educational intervention targeting Northwest Territories (NWT) youth, and (b) a research data collection tool. Data included individual interviews with 41 female participants (aged 13-17 years) who attended FOXY body mapping workshops in six communities in 2013, field notes taken by the researcher during the workshops and interviews, and written reflections from seven FOXY facilitators on the body mapping process (from 2013 to 2016). Thematic analysis explored the utility of body mapping using a developmental evaluation methodology. The results show body mapping is an intervention tool that supports and encourages participant self-reflection, introspection, personal connectedness, and processing difficult emotions. Body mapping is also a data collection catalyst that enables trust and youth voice in research, reduces verbal communication barriers, and facilitates the collection of rich data regarding personal experiences.

  9. A fuzzy cost-benefit function to select economical products for processing in a closed-loop supply chain

    NASA Astrophysics Data System (ADS)

    Pochampally, Kishore K.; Gupta, Surendra M.; Cullinane, Thomas P.

    2004-02-01

    The cost-benefit analysis of data associated with re-processing of used products often involves the uncertainty feature of cash-flow modeling. The data is not objective because of uncertainties in supply, quality and disassembly times of used products. Hence, decision-makers must rely on "fuzzy" data for analysis. The same parties that are involved in the forward supply chain often carry out the collection and re-processing of used products. It is therefore important that the cost-benefit analysis takes the data of both new products and used products into account. In this paper, a fuzzy cost-benefit function is proposed that is used to perform a multi-criteria economic analysis to select the most economical products to process in a closed-loop supply chain. Application of the function is detailed through an illustrative example.
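
    A minimal sketch of fuzzy cost-benefit arithmetic in this spirit, using triangular fuzzy numbers and a centroid defuzzification; the figures are invented, and the paper's actual function may differ:

    ```python
    from dataclasses import dataclass

    @dataclass
    class TFN:
        """Triangular fuzzy number (low, mode, high)."""
        a: float
        b: float
        c: float
        def __add__(self, other):
            return TFN(self.a + other.a, self.b + other.b, self.c + other.c)
        def __sub__(self, other):  # fuzzy subtraction flips the bounds
            return TFN(self.a - other.c, self.b - other.b, self.c - other.a)
        def centroid(self):
            return (self.a + self.b + self.c) / 3.0

    # Uncertain per-unit benefit and re-processing cost of a used product:
    benefit = TFN(8.0, 10.0, 13.0)
    cost = TFN(4.0, 6.0, 9.0)
    net = benefit - cost
    print(f"fuzzy net benefit: ({net.a}, {net.b}, {net.c}), "
          f"defuzzified: {net.centroid():.2f}")   # (-1, 4, 9), 4.00
    ```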

  10. Communal Participation in Payment for Environmental Services (PES): Unpacking the Collective Decision to Enroll

    NASA Astrophysics Data System (ADS)

    Murtinho, Felipe; Hayes, Tanya

    2017-06-01

    Payment for Environmental Service programs are increasingly applied in communal settings where resource users collectively join the program and agree to limit their shared use of a common-property resource. Who decides to join PES and the degree to which community members agree with the collective decision is critical for the success of said programs. Yet, we have limited understanding of the factors that influence communal participation and the collective decision process. This paper examines communal participation in a national payment for conservation program in Ecuador. We use quantitative and qualitative analysis to (i) identify the attributes of the communities that participate (or not), and factors that facilitate participation (n = 67), and (ii) assess household preference and alignment with the collective decision to participate (n = 212). Household participation preferences indicate varying degrees of consensus with the collective decision to participate, with those using the resource less likely to support participation. At the communal level, however, our results indicate that over time, those communities that depend more heavily on their resource systems may ultimately choose to participate. Our findings suggest that communal governance structures and outside organizations may be instrumental in gaining participation in resource-dependent communities and building consensus. Findings also point to the need for further research on communal decision-processes to ensure that the collective decision is based on an informed and democratic process.

  11. A fluid collection system for dermal wounds in clinical investigations

    PubMed Central

    Klopfer, Michael; Li, G.-P.; Widgerow, Alan; Bachman, Mark

    2016-01-01

    In this work, we demonstrate the use of a thin, self-adherent, and clinically durable patch device that can collect fluid from a wound site for analysis. This device is manufactured from laminated silicone layers using a novel all-silicone double-molding process. In vitro studies for flow and delivery were followed by a clinical demonstration of exudate collection efficiency from a clinically presented partial-thickness burn. The demonstrated utility of this device lends itself to use as a research implement for clinically sampling wound exudate for analysis. This device can serve as a platform for future integration of wearable technology into wound monitoring and care. The demonstrated fabrication method can be used for devices requiring thin membrane construction. PMID:27051470

  12. Environmental Sampling & Analytical Methods (ESAM) Program - Home

    EPA Pesticide Factsheets

    ESAM is a comprehensive program to facilitate a coordinated response to a chemical, radiochemical, biotoxin or pathogen contamination incident focusing on sample collection, processing, and analysis to provide quality results to the field.

  13. Measuring the efficiency of a healthcare waste management system in Serbia with data envelopment analysis.

    PubMed

    Ratkovic, Branislava; Andrejic, Milan; Vidovic, Milorad

    2012-06-01

    In 2007, the Serbian Ministry of Health initiated specific activities towards establishing a workable model based on the existing administrative framework, which corresponds to the needs of healthcare waste management throughout Serbia. The objective of this research was to identify the reforms carried out and their outcomes by estimating the efficiencies of a sample of 35 healthcare facilities engaged in the process of collection and treatment of healthcare waste, using data envelopment analysis. Twenty-one (60%) of the 35 healthcare facilities analysed were found to be technically inefficient, with an average level of inefficiency of 13%. This fact indicates deficiencies in the process of collection and treatment of healthcare waste and the information obtained and presented in this paper could be used for further improvement and development of healthcare waste management in Serbia.

  14. Health Education Specialist Practice Analysis 2015 (HESPA 2015): Process and Outcomes

    ERIC Educational Resources Information Center

    McKenzie, James F.; Dennis, Dixie; Auld, M. Elaine; Lysoby, Linda; Doyle, Eva; Muenzen, Patricia M.; Caro, Carla M.; Kusorgbor-Narh, Cynthia S.

    2016-01-01

    The Health Education Specialist Practice Analysis 2015 (HESPA 2015) was conducted to update and validate the Areas of Responsibilities, Competencies, and Sub-competencies for Entry- and Advanced-Level Health Education Specialists. Two data collection instruments were developed--one was focused on Sub-competencies and the other on knowledge items…

  15. Network analysis reveals multiscale controls on streamwater chemistry

    Treesearch

    Kevin J. McGuire; Christian E. Torgersen; Gene E. Likens; Donald C. Buso; Winsor H. Lowe; Scott W. Bailey

    2014-01-01

    By coupling synoptic data from a basin-wide assessment of streamwater chemistry with network-based geostatistical analysis, we show that spatial processes differentially affect biogeochemical condition and pattern across a headwater stream network. We analyzed a high-resolution dataset consisting of 664 water samples collected every 100 m throughout 32 tributaries in...

  16. Determination of Sulfate by Conductometric Titration: An Undergraduate Laboratory Experiment

    ERIC Educational Resources Information Center

    Garcia, Jennifer; Schultz, Linda D.

    2016-01-01

    The classic technique for sulfate analysis in an undergraduate quantitative analysis lab involves precipitation as the barium salt with barium chloride, collection of the precipitate by gravity filtration using ashless filter paper, and removal of the filter paper by charring over a Bunsen burner. The entire process is time-consuming, hazardous,…

  17. 9 CFR 381.94 - Contamination with Microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...

  18. 9 CFR 381.94 - Contamination with Microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...

  19. 9 CFR 381.94 - Contamination with Microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...

  20. 9 CFR 381.94 - Contamination with Microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...

  1. 9 CFR 381.94 - Contamination with Microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) Analysis of samples. Laboratories may use any quantitative method for analysis of E. coli that is approved... FSIS in its nationwide microbiological baseline data collection programs and surveys. (Copies of... Surveys used in determining the prevalence of Salmonella on raw products are available in the FSIS Docket...

  2. An Examination of Canadian Information Professionals' Involvement in the Provision of Business Information Synthesis and Analysis Services

    ERIC Educational Resources Information Center

    Patterson, Liane; Martzoukou, Konstantina

    2012-01-01

    The present study investigated the processes information professionals, working in a business environment, follow to meet business clients' information needs and particularly their involvement in information synthesis and analysis practices. A combination of qualitative and quantitative data was collected via a survey of 98 information…

  3. A Model of Practice in Special Education: Dynamic Ecological Analysis

    ERIC Educational Resources Information Center

    Hannant, Barbara; Lim, Eng Leong; McAllum, Ruth

    2010-01-01

    Dynamic Ecological Analysis (DEA) is a model of practice which increases a team's efficacy by enabling the development of more effective interventions through collaboration and collective reflection. This process has proved to be useful in: a) clarifying thinking and problem-solving, b) transferring knowledge and thinking to significant parties,…

  4. Remote access and automation of SPring-8 MX beamlines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ueno, Go, E-mail: ueno@spring8.or.jp; Hikima, Takaaki; Yamashita, Keitaro

    At the SPring-8 MX beamlines, a remote access system has been developed and began user operation in 2010. The system is based on the automated data collection and data management architecture used for the established SPring-8 mail-in data collection scheme. Currently, further improvements to remote access and automation, covering data processing and analysis, are being developed.

  5. Oceanographic and Fisheries Data Collection and Telemetry From Commercial Fishing Vessels

    DTIC Science & Technology

    1998-09-30

    TRANSITIONS: Mike Curran, of the Naval Oceanographic Office, has offered to help coordinate the... for the oceanographic and fisheries communities. Ann Bucklin, Ocean Process Analysis Laboratory.

  6. Influence of the Pulse Duration in the Anthropomorphic Test Device (ATD) Lower-Leg Loading Mechanics

    DTIC Science & Technology

    2015-08-01


  7. Standards for the Analysis and Processing of Surface-Water Data and Information Using Electronic Methods

    USGS Publications Warehouse

    Sauer, Vernon B.

    2002-01-01

    Surface-water computation methods and procedures are described in this report to provide standards from which a completely automated electronic processing system can be developed. To the greatest extent possible, the traditional U. S. Geological Survey (USGS) methodology and standards for streamflow data collection and analysis have been incorporated into these standards. Although USGS methodology and standards are the basis for this report, the report is applicable to other organizations doing similar work. The proposed electronic processing system allows field measurement data, including data stored on automatic field recording devices and data recorded by the field hydrographer (a person who collects streamflow and other surface-water data) in electronic field notebooks, to be input easily and automatically. A user of the electronic processing system easily can monitor the incoming data and verify and edit the data, if necessary. Input of the computational procedures, rating curves, shift requirements, and other special methods are interactive processes between the user and the electronic processing system, with much of this processing being automatic. Special computation procedures are provided for complex stations such as velocity-index, slope, control structures, and unsteady-flow models, such as the Branch-Network Dynamic Flow Model (BRANCH). Navigation paths are designed to lead the user through the computational steps for each type of gaging station (stage-only, stage-discharge, velocity-index, slope, rate-of-change in stage, reservoir, tide, structure, and hydraulic model stations). The proposed electronic processing system emphasizes the use of interactive graphics to provide good visual tools for unit values editing, rating curve and shift analysis, hydrograph comparisons, data-estimation procedures, data review, and other needs. Documentation, review, finalization, and publication of records are provided for with the electronic processing system, as well as archiving, quality assurance, and quality control.
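
    One computational element these standards cover is the stage-discharge rating curve. A hedged sketch of fitting the conventional power-law form Q = C(h - e)^b to field measurements; the stage and discharge values below are invented for illustration, not USGS data.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def rating(h, C, e, b):
        """Power-law rating: discharge as a function of gage height h."""
        return C * np.clip(h - e, 1e-6, None) ** b

    stage = np.array([1.2, 1.5, 2.0, 2.8, 3.5])      # hypothetical gage heights (ft)
    discharge = np.array([30, 55, 120, 290, 520.0])  # hypothetical discharges (cfs)

    params, _ = curve_fit(rating, stage, discharge, p0=[50.0, 0.5, 2.0])
    print(dict(zip(["C", "e", "b"], np.round(params, 2))))
    ```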

  8. [Health community agent: subject of the buccal health practice in Alagoinhas, Bahia state].

    PubMed

    Rodrigues, Ana Aurea Alécio de Oliveira; Santos, Adriano Maia Dos; Assis, Marluce Maria Araújo

    2010-05-01

    This study examines the micropolitics of the work carried out by the Buccal Health Team (ESB) within the Family Health Program (PSF) of Alagoinhas, Bahia State; its central theoretical focus is the specific and singular forms that daily work practice takes through the use of hard, light-hard and light technologies. The methodological trajectory is based on the historical-social current, within a dialectic approach of a qualitative nature. Data collection techniques were semi-structured interviews, observation of the work process, and documentary analysis. Data analysis was guided by hermeneutic-dialectics, allowing comparison across the different levels of analysis and articulating theory with empirical evidence. The results reveal that the Family Health Teams are multidisciplinary but have not yet developed interdisciplinary work, so that skills are merely juxtaposed. Each unit plans its work process according to the singularities of the social subjects, implementing different approaches to welcoming, informing, attending and referring. An effort to change the work process can be perceived from the perspective of the amplified clinic, with the community health agent standing out as a social/collective subject.

  9. The Mechanics of Single Cell and Collective Migration of Tumor Cells

    PubMed Central

    Lintz, Marianne; Muñoz, Adam; Reinhart-King, Cynthia A.

    2017-01-01

    Metastasis is a dynamic process in which cancer cells navigate the tumor microenvironment, largely guided by external chemical and mechanical cues. Our current understanding of metastatic cell migration has relied primarily on studies of single cell migration, most of which have been performed using two-dimensional (2D) cell culture techniques and, more recently, using three-dimensional (3D) scaffolds. However, the current paradigm focused on single cell movements is shifting toward the idea that collective migration is likely one of the primary modes of migration during metastasis of many solid tumors. Not surprisingly, the mechanics of collective migration differ significantly from single cell movements. As such, techniques must be developed that enable in-depth analysis of collective migration, and those for examining single cell migration should be adopted and modified to study collective migration to allow for accurate comparison of the two. In this review, we will describe engineering approaches for studying metastatic migration, both single cell and collective, and how these approaches have yielded significant insight into the mechanics governing each process. PMID:27814431

  10. Novel Diffusion-Weighted MRI for High-Grade Prostate Cancer Detection

    DTIC Science & Technology

    2016-10-01

    in image resolution and scale. This process is critical for evaluating new imaging modalities. Our initial findings illustrate the potential of the... eligible for analysis as determined by adequate pathologic processing and MR images deemed to be of adequate quality by the study team. The... histology samples have been requested from the UIC biorepository for digitization. All MR images have been collected and prepared for image processing.

  11. Application of Ensemble Detection and Analysis to Modeling Uncertainty in Non Stationary Process

    NASA Technical Reports Server (NTRS)

    Racette, Paul

    2010-01-01

    Characterization of nonstationary and nonlinear processes is a challenge in many engineering and scientific disciplines. Climate change modeling and projection, retrieving information from Doppler measurements of hydrometeors, and modeling calibration architectures and algorithms in microwave radiometers are example applications that can benefit from improvements in the modeling and analysis of nonstationary processes. Analyses of measured signals have traditionally been limited to a single measurement series. Ensemble Detection is a technique whereby mixing calibrated noise produces an ensemble measurement set. The collection of ensemble data sets enables new methods for analyzing random signals and offers powerful new approaches to studying and analyzing nonstationary processes. Derived information contained in the dynamic stochastic moments of a process will enable many novel applications.
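
    A toy sketch of why an ensemble helps with nonstationary signals: many realizations allow time-resolved moments where a single series yields only time-averaged ones. The synthetic process below is an assumption for illustration, not the radiometer application.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 500)
    # 200-member ensemble of a process whose variance grows with time
    ensemble = rng.normal(0.0, 1.0 + 2.0 * t, size=(200, t.size))

    mean_t = ensemble.mean(axis=0)  # first moment as a function of time
    var_t = ensemble.var(axis=0)    # second central moment as a function of time
    # A single realization could only estimate these moments by time-averaging,
    # which blurs exactly the nonstationarity of interest.
    ```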

  12. The influence of institutions and organizations on urban waste collection systems: an analysis of waste collection system in Accra, Ghana (1985-2000).

    PubMed

    Fobil, Julius N; Armah, Nathaniel A; Hogarh, Jonathan N; Carboo, Derick

    2008-01-01

    The urban waste collection system is a pivotal component of all waste management schemes around the world. Therefore, the efficient performance and the success of these schemes in urban pollution control rest on the ability of the collection systems to fully adapt to the prevailing cultural and social contexts within which they operate. Conceptually, since institutions are the rules guiding public service provision and routine social interactions, waste collection systems embedded in those institutions can only realize their potential if they evolve continuously to reflect the social and technical matrices underlying the cultures, organizations, institutions and social conditions they are designed to address. This paper is a product of an analysis of waste collection performance in Ghana under two different institutional and/or organizational regimes: from an initial, entirely public-sector dependence to the current mix of public-private sector participation, drawing on actual planning data from 1985 to 2000. The analysis found that the overall performance of waste collection services in Ghana increased under the coupled system, with efficiency (in terms of total waste clearance and coverage of service provision) increasing rapidly with increased private-sector controls and levels of involvement; for example, for solid waste, collection rate and disposal improved from 51% in 1998 to about 91% in the year 2000. However, such an increase in performance could not be sustained beyond 10 years of public-private partnerships. This analysis argues that the sustainability of improved waste collection efficiency is a function of the franchise and lease arrangements between the private-sector group on the one hand and the public-sector group (local authorities) on the other. The analysis therefore concludes that if such franchise and lease arrangements are not conceived through an initially transparent process, they could undermine the overall long-term sustainability of private-sector initiatives in collection service delivery, as in the case of the Accra example.

  13. An Analysis of Occupational Titles and Competencies Needed in Agricultural Food Products Processing Plants.

    ERIC Educational Resources Information Center

    Smeltz, LeRoy C.

    To identify, rate, and cluster groups of competencies and occupational titles at entry and advance levels for occupations in five food products commodity areas, data were collected by interviews with personnel managers in 25 Pennsylvania food processing plants. Some findings were: (1) There were meaningful competency factor and occupational title…

  14. Temporal mapping and analysis

    NASA Technical Reports Server (NTRS)

    O'Hara, Charles G. (Inventor); Shrestha, Bijay (Inventor); Vijayaraj, Veeraraghavan (Inventor); Mali, Preeti (Inventor)

    2011-01-01

    A compositing process for selecting spatial data collected over a period of time, creating temporal data cubes from the spatial data, and processing and/or analyzing the data using temporal mapping algebra functions. In some embodiments, processing includes creating a masked cube from the data cubes and computing a composite from the masked cube using temporal mapping algebra.
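
    A minimal sketch of the masked-cube-and-composite idea in the claim, with a per-pixel temporal maximum standing in for a temporal mapping algebra function; the axes, shapes, and choice of reduction are assumptions for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    cube = rng.random((8, 4, 4))           # 8 acquisition dates over a 4x4 scene
    usable = rng.random((8, 4, 4)) > 0.2   # quality flag: True where observation is valid

    masked = np.ma.MaskedArray(cube, mask=~usable)   # the "masked cube"
    composite = masked.max(axis=0).filled(np.nan)    # temporal max; NaN if never observed
    ```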

  15. A Grounded Theory of Western-Trained Asian Group Leaders Leading Groups in Asia

    ERIC Educational Resources Information Center

    Taephant, Nattasuda; Rubel, Deborah; Champe, Julia

    2015-01-01

    This grounded theory research explored the experiences of Western-trained Asian group leaders leading groups in Asia. A total of 6 participants from Japan, Taiwan, and Thailand were interviewed 3 times over 9 months. The recursive process of data collection and analysis yielded substantive theory describing the participants' process of reconciling…

  16. Analysis of complex decisionmaking processes. [with application to jet engine development

    NASA Technical Reports Server (NTRS)

    Hill, J. D.; Ollila, R. G.

    1978-01-01

    The analysis of corporate decisionmaking processes related to major system developments is unusually difficult because of the number of decisionmakers involved in the process and the long development cycle. A method for analyzing such decision processes is developed and illustrated through its application to the analysis of the commercial jet engine development process. The method uses interaction matrices as the key tool for structuring the problem, recording data, and analyzing the data to establish the rank order of the major factors affecting development decisions. In the example, the use of interaction matrices permitted analysts to collect and analyze approximately 50 factors that influenced decisions during the four phases of the development cycle, and to determine the key influencers of decisions at each development phase. The results of this study indicate that the cost of new technology installed on an aircraft is the prime concern of the engine manufacturer.
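
    A toy sketch of the interaction-matrix bookkeeping described above: entry (i, j) records whether factor i bears on decision criterion j, and row sums give a crude rank order. All names and entries below are invented, not the paper's roughly 50 actual factors.

    ```python
    import numpy as np

    factors = ["fuel cost", "installation cost", "noise", "maintainability"]
    criteria = ["airline economics", "certification", "community acceptance"]
    # M[i, j] = 1 when factor i influences criterion j (entries invented)
    M = np.array([[1, 0, 0],
                  [1, 1, 0],
                  [0, 1, 1],
                  [1, 0, 1]])

    scores = M.sum(axis=1)  # crude influence score per factor
    for name, score in sorted(zip(factors, scores), key=lambda p: -p[1]):
        print(f"{name}: {score}")
    ```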

  17. HYDICE postflight data processing

    NASA Astrophysics Data System (ADS)

    Aldrich, William S.; Kappus, Mary E.; Resmini, Ronald G.; Mitchell, Peter A.

    1996-06-01

    The hyperspectral digital imagery collection experiment (HYDICE) sensor records instrument counts for scene data, in-flight spectral and radiometric calibration sequences, and dark current levels onto an AMPEX DCRsi data tape. Following flight, the HYDICE ground data processing subsystem (GDPS) transforms selected scene data from digital numbers (DN) to calibrated radiance levels at the sensor aperture. This processing includes: dark current correction, spectral and radiometric calibration, conversion to radiance, and replacement of bad detector elements. A description of the algorithms for post-flight data processing is presented. A brief analysis of the original radiometric calibration procedure is given, along with a description of the development of the modified procedure currently used. Example data collected during the 1995 flight season, shown both uncorrected and processed, demonstrate the removal of apparent sensor artifacts (e.g., non-uniformities in detector response over the array) resulting from this transformation.
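
    A hedged sketch of the processing chain just described: dark-current subtraction, per-detector radiometric gain, and bad-element replacement. Array names, shapes, and values are assumptions for illustration, not the HYDICE GDPS implementation.

    ```python
    import numpy as np

    def counts_to_radiance(scene_dn, dark_dn, gain, bad_cols):
        """scene_dn: (lines, detectors) raw counts; dark_dn: mean dark frame,
        shape (detectors,); gain: radiance per count, shape (detectors,)."""
        rad = (scene_dn - dark_dn) * gain            # DN -> at-aperture radiance
        for c in bad_cols:                           # replace flagged detector columns
            lo, hi = max(c - 1, 0), min(c + 1, rad.shape[1] - 1)
            rad[:, c] = 0.5 * (rad[:, lo] + rad[:, hi])
        return rad

    scene = np.random.poisson(200, (64, 320)).astype(float)   # synthetic frame
    rad = counts_to_radiance(scene, dark_dn=np.full(320, 40.0),
                             gain=np.full(320, 0.05), bad_cols=[17, 203])
    ```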

  18. Measurement-based reliability/performability models

    NASA Technical Reports Server (NTRS)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
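
    A small sketch of the modeling question raised above: test whether measured holding times in a state are exponential; if the exponential fit is rejected, a Markov model is inadequate and a semi-Markov process is needed. The data below are synthetic, not the IBM 3081 measurements.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    holding = rng.weibull(0.6, 500) * 10.0  # synthetic non-exponential sojourn times

    # Kolmogorov-Smirnov test against an exponential with the sample's mean
    stat, p = stats.kstest(holding, "expon", args=(0, holding.mean()))
    print(f"KS p-value = {p:.3g}")  # small p rejects the exponential (Markov) fit
    ```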

  19. Visual analysis of trash bin processing on garbage trucks in low resolution video

    NASA Astrophysics Data System (ADS)

    Sidla, Oliver; Loibner, Gernot

    2015-03-01

    We present a system for trash can detection and counting from a camera mounted on a garbage collection truck. A working prototype has been successfully implemented and tested with several hours of real-world video. The detection pipeline consists of HOG detectors for two trash can sizes, plus mean-shift tracking and low-level image processing for the analysis of the garbage disposal process. Considering the harsh environment and unfavorable imaging conditions, the process already works well enough that very useful measurements can be extracted from the video data. The false positive/false negative rate of the full processing pipeline is about 5-6% in fully automatic operation. Video data of a full day (about 8 hrs) can be processed in about 30 minutes on a standard PC.
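
    For illustration of the HOG detect-then-track pipeline shape only: the sketch below uses OpenCV's stock HOG + linear-SVM pedestrian detector as a stand-in, whereas the paper trained custom HOG detectors for two trash-can sizes; the video filename is hypothetical.

    ```python
    import cv2

    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    cap = cv2.VideoCapture("garbage_truck.mp4")  # hypothetical input video
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # sliding-window HOG detection; rects are candidate boxes for a tracker
        rects, weights = hog.detectMultiScale(frame, winStride=(8, 8))
        for (x, y, w, h) in rects:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cap.release()
    ```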

  20. Storing and managing information artifacts collected by information analysts using a computing device

    DOEpatents

    Pike, William A; Riensche, Roderick M; Best, Daniel M; Roberts, Ian E; Whyatt, Marie V; Hart, Michelle L; Carr, Norman J; Thomas, James J

    2012-09-18

    Systems and computer-implemented processes for storage and management of information artifacts collected by information analysts using a computing device. The processes and systems can capture a sequence of interactive operation elements that are performed by the information analyst, who is collecting an information artifact from at least one of the plurality of software applications. The information artifact can then be stored together with the interactive operation elements as a snippet on a memory device, which is operably connected to the processor. The snippet comprises a view from an analysis application, data contained in the view, and the sequence of interactive operation elements stored as a provenance representation comprising operation element class, timestamp, and data object attributes for each interactive operation element in the sequence.
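
    A minimal sketch of the snippet structure the claim describes: a captured view, the data it contains, and a provenance trail of interactive operation elements with class, timestamp, and data-object attributes. All type names are hypothetical, not from the patent.

    ```python
    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import Any, List

    @dataclass
    class OperationElement:
        element_class: str        # e.g. "copy", "highlight", "query"
        timestamp: datetime
        data_objects: List[str]   # attributes of the objects acted upon

    @dataclass
    class Snippet:
        view: bytes               # rendered view from the analysis application
        data: Any                 # data contained in the view
        provenance: List[OperationElement] = field(default_factory=list)
    ```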

  1. Teachers’ work ability: a study of relationships between collective efficacy and self-efficacy beliefs

    PubMed Central

    Guidetti, Gloria; Viotti, Sara; Bruno, Andreina; Converso, Daniela

    2018-01-01

    Introduction: Work ability constitutes one of the most studied well-being indicators related to work. Past research has highlighted its relationship with work-related resources and demands, and with personal resources. However, no studies have highlighted the role of collective and self-efficacy beliefs in sustaining work ability. Purpose: The purpose of this study was to examine whether, and by which mechanism, work ability is linked with individual and collective efficacy in a sample of primary and middle school teachers. Materials and methods: Using a dataset of 415 primary and middle school Italian teachers, the analysis tested for the mediating role of self-efficacy between collective efficacy and work ability. Results: Mediation analysis shows that teachers’ self-efficacy fully mediates the relationship between collective efficacy and perceived work ability. Conclusion: Results of this study enhance the theoretical knowledge and empirical evidence regarding the link between teachers’ collective efficacy and self-efficacy, giving further emphasis to the concept of collective efficacy in school contexts. Moreover, the results contribute to the study of well-being in the teaching profession, highlighting a process that sustains and promotes levels of work ability through both collective and personal resources. PMID:29861646
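
    A sketch of the simplest regression-based mediation check (Baron & Kenny style) on synthetic data; the study's actual instrument scores and modeling choices are not reproduced here.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 415
    collective = rng.normal(size=n)                    # collective efficacy (X)
    self_eff = 0.6 * collective + rng.normal(size=n)   # self-efficacy (mediator M)
    work_ability = 0.5 * self_eff + rng.normal(size=n) # work ability (Y), no direct X path

    direct = sm.OLS(work_ability, sm.add_constant(collective)).fit()
    full = sm.OLS(work_ability,
                  sm.add_constant(np.column_stack([collective, self_eff]))).fit()
    # Full mediation: X's coefficient shrinks toward zero once M is included.
    print(direct.params[1], full.params[1], full.params[2])
    ```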

  2. Collective and decentralized management model in public hospitals: perspective of the nursing team.

    PubMed

    Bernardes, Andrea; Cecilio, Luiz Carlos de Oliveira; Evora, Yolanda Dora Martinez; Gabriel, Carmen Silvia; Carvalho, Mariana Bernardes de

    2011-01-01

    This research presents the implementation of a collective and decentralized management model in the functional units of a public hospital in the city of Ribeirão Preto, state of São Paulo, from the viewpoint of the nursing staff and health technical assistants. This historical and organizational case study used the qualitative thematic content analysis proposed by Bardin. The institution began decentralizing its administrative structure in 1999 through collective management, which permitted several internal improvements with positive repercussions for the care delivered to users. The top-down implementation of the process seems to have jeopardized workers' adherence, although collective management has intensified communication and the sharing of power and decision-making. The study shows that, despite advances in the quality of care, much work remains to be done to concretize this innovative management proposal.

  3. A Preliminary Analysis of Precipitation Properties and Processes during NASA GPM IFloodS

    NASA Technical Reports Server (NTRS)

    Carey, Lawrence; Gatlin, Patrick; Petersen, Walt; Wingo, Matt; Lang, Timothy; Wolff, Dave

    2014-01-01

    The Iowa Flood Studies (IFloodS) is a NASA Global Precipitation Measurement (GPM) ground measurement campaign, which took place in eastern Iowa from May 1 to June 15, 2013. The goals of the field campaign were to collect detailed measurements of surface precipitation using ground instruments and advanced weather radars while simultaneously collecting data from satellites passing overhead. Data collected by the radars and other ground instruments, such as disdrometers and rain gauges, will be used to characterize precipitation properties throughout the vertical column, including the precipitation type (e.g., rain, graupel, hail, aggregates, ice crystals), precipitation amounts (e.g., rain rate), and the size and shape of raindrops. The impact of physical processes, such as aggregation, melting, breakup and coalescence on the measured liquid and ice precipitation properties will be investigated. These ground observations will ultimately be used to improve rainfall estimates from satellites and in particular the algorithms that interpret raw data for the upcoming GPM mission's Core Observatory satellite, which launches in 2014. The various precipitation data collected will eventually be used as input to flood forecasting models in an effort to improve capabilities and test the utility and limitations of satellite precipitation data for flood forecasting. In this preliminary study, the focus will be on analysis of NASA NPOL (S-band, polarimetric) radar (e.g., radar reflectivity, differential reflectivity, differential phase, correlation coefficient) and NASA 2D Video Disdrometers (2DVDs) measurements. Quality control and processing of the radar and disdrometer data sets will be outlined. In analyzing preliminary cases, particular emphasis will be placed on 1) documenting the evolution of the rain drop size distribution (DSD) as a function of column melting processes and 2) assessing the impact of range on ground-based polarimetric radar estimates of DSD properties.

  4. Measuring the efficiency of zakat collection process using data envelopment analysis

    NASA Astrophysics Data System (ADS)

    Hamzah, Ahmad Aizuddin; Krishnan, Anath Rau

    2016-10-01

    Each zakat institution in the nation needs to measure and understand, in a timely manner, its efficiency in collecting zakat, for the sake of continuous improvement. Pusat Zakat Sabah, Malaysia, which began operations in early 2007, is no exception. However, measuring collection efficiency is not an easy task, as it usually involves the consideration of multiple inputs and/or outputs. This paper sequentially employed three data envelopment analysis models, namely the Charnes-Cooper-Rhodes (CCR) primal model, the CCR dual model, and the slack-based model, to quantitatively evaluate the efficiency of zakat collection in Sabah from 2007 to 2015 by treating each year as a decision-making unit. The three models were developed based on two inputs (i.e. number of zakat branches and number of staff) and one output (i.e. total collection). The causes of not achieving efficiency, and suggestions on how efficiency in each year could have been improved, are disclosed.
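
    A hedged sketch of the input-oriented CCR model in multiplier form for this record's setup — two inputs (branches, staff) and one output (total collection) — solved with linear programming; the data values below are invented for illustration, not the Sabah figures.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # One row per year (decision-making unit): inputs = [branches, staff]
    X = np.array([[3, 25], [4, 30], [5, 38], [6, 45]], dtype=float)
    y = np.array([1.2, 1.9, 2.4, 3.5])  # output: total collection (arbitrary units)

    def ccr_efficiency(o, X, y):
        """CCR multiplier form for DMU o:
        max u*y_o  s.t.  v.x_o = 1,  u*y_j - v.x_j <= 0 for all j,  u, v >= 0.
        Decision vector: [u, v1, v2]."""
        n, m = X.shape
        c = np.r_[-y[o], np.zeros(m)]              # linprog minimizes, so negate
        A_ub, b_ub = np.c_[y, -X], np.zeros(n)     # u*y_j - v.x_j <= 0
        A_eq = np.r_[0.0, X[o]].reshape(1, -1)     # normalize weighted input to 1
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (m + 1))
        return -res.fun                            # efficiency score in (0, 1]

    print([round(ccr_efficiency(o, X, y), 3) for o in range(len(y))])
    ```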

  5. Collecting verbal autopsies: improving and streamlining data collection processes using electronic tablets.

    PubMed

    Flaxman, Abraham D; Stewart, Andrea; Joseph, Jonathan C; Alam, Nurul; Alam, Sayed Saidul; Chowdhury, Hafizur; Mooney, Meghan D; Rampatige, Rasika; Remolador, Hazel; Sanvictores, Diozele; Serina, Peter T; Streatfield, Peter Kim; Tallo, Veronica; Murray, Christopher J L; Hernandez, Bernardo; Lopez, Alan D; Riley, Ian Douglas

    2018-02-01

    There is increasing interest in using verbal autopsy to produce nationally representative population-level estimates of causes of death. However, the burden of processing a large quantity of surveys collected with paper and pencil has been a barrier to scaling up verbal autopsy surveillance. Direct electronic data capture has been used in other large-scale surveys and can be used in verbal autopsy as well, to reduce the time and cost of going from collected data to actionable information. We collected verbal autopsy interviews using paper and pencil and using electronic tablets at two sites, and measured the cost and time required to process the surveys for analysis. From these cost and time data, we extrapolated the costs associated with conducting large-scale surveillance with verbal autopsy. We found that the median time between data collection and data entry for surveys collected on paper and pencil was approximately 3 months. For surveys collected on electronic tablets, this was less than 2 days. For small-scale surveys, we found that the upfront costs of purchasing electronic tablets were the primary cost and resulted in a higher total cost. For large-scale surveys, the costs associated with data entry exceeded the cost of the tablets, so electronic data capture provides both a quicker and cheaper method of data collection. As countries increase verbal autopsy surveillance, it is important to consider the best way to design sustainable systems for data collection. Electronic data capture has the potential to greatly reduce the time and costs associated with data collection. For the long-term, large-scale surveillance required by national vital statistical systems, electronic data capture reduces costs and allows data to be available sooner.
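
    The cost finding amounts to a break-even calculation: tablets carry an upfront cost but eliminate per-survey data entry. A sketch with invented numbers; the paper reports its own cost data, not these values.

    ```python
    # All figures are hypothetical, for illustrating the crossover logic only.
    tablet_upfront = 3000.0        # purchasing tablets for a site
    paper_cost_per_survey = 0.50   # printing and storage per paper survey
    entry_cost_per_survey = 2.00   # keying data from paper forms
    tablet_cost_per_survey = 0.10  # sync, power, upkeep per electronic survey

    saving_per_survey = (paper_cost_per_survey + entry_cost_per_survey
                         - tablet_cost_per_survey)
    break_even = tablet_upfront / saving_per_survey
    print(f"tablets become cheaper beyond ~{break_even:.0f} surveys")
    ```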

  6. Activity of the CNES/CLS Analysis Center for the IDS contribution to ITRF2014

    NASA Astrophysics Data System (ADS)

    Soudarin, Laurent; Capdeville, Hugues; Lemoine, Jean-Michel

    2016-12-01

    Within the frame of the International DORIS Service, the CNES/CLS Analysis Center, previously known as LCA and renamed GRG, contributes to geodetic and geophysical research through DORIS data analysis. The main work carried out in the past two years concerns the processing of the measurements collected by the DORIS-equipped satellites over 22 years, in order to provide a homogeneous series of station coordinates and Earth pole parameters for the IDS contribution to the ITRF2014. First, we made several upgrades to the processing and modeling. Some are corrective actions for issues raised during or shortly after the production of our contribution to the ITRF2008 (ground station frequency offsets, attitude laws and macromodels). Recent models were assessed with the aim of updating our analysis configuration. Among others, we adopted the time-variable gravity (TVG) model EIGEN-6S2 and applied tropospheric gradients. Then we processed almost all the DORIS data collected between January 1993 and December 2014. The series of weekly SINEX solutions derived from this processing is labeled grgwd40. This new series performs better than the series produced for ITRF2008. In particular, the results discussed in this paper show a decrease of 2% in the DORIS orbit residuals as well as a strong reduction of the annual terms of the TRF scale and Tz translation, which can be explained by the application of the TVG model.

  7. FTOOLS: A FITS Data Processing and Analysis Software Package

    NASA Astrophysics Data System (ADS)

    Blackburn, J. K.

    FTOOLS, a highly modular collection of over 110 utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Science Archive Research Center) at NASA's Goddard Space Flight Center. Each utility performs a single simple task such as presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a boolean expression. Individual utilities can easily be chained together in scripts to achieve more complex operations such as the generation and displaying of spectra or light curves. The collection of utilities provides both generic processing and analysis utilities and utilities specific to high energy astrophysics data sets used for the ASCA, ROSAT, GRO, and XTE missions. A core set of FTOOLS providing support for generic FITS data processing, FITS image analysis and timing analysis can easily be split out of the full software package for users not needing the high energy astrophysics mission utilities. The FTOOLS software package is designed to be both compatible with IRAF and completely stand alone in a UNIX or VMS environment. The user interface is controlled by standard IRAF parameter files. The package is self documenting through the IRAF help facility and a stand alone help task. Software is written in ANSI C and Fortran to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.

  8. Understanding the distributed cognitive processes of intensive care patient discharge.

    PubMed

    Lin, Frances; Chaboyer, Wendy; Wallis, Marianne

    2014-03-01

    To better understand and identify vulnerabilities and risks in the ICU patient discharge process, which provides evidence for service improvement. Previous studies have identified that 'after hours' discharge and 'premature' discharge from ICU are associated with increased mortality. However, some of these studies have largely been retrospective reviews of various administrative databases, while others have focused on specific aspects of the process, which may miss crucial components of the discharge process. This is an ethnographic exploratory study. Distributed cognition and activity theory were used as theoretical frameworks. Ethnographic data collection techniques including informal interviews, direct observations and collecting existing documents were used. A total of 56 one-to-one interviews were conducted with 46 participants; 28 discharges were observed; and numerous documents were collected during a five-month period. A triangulated technique was used in both data collection and data analysis to ensure the research rigour. Under the guidance of activity theory and distributed cognition theoretical frameworks, five themes emerged: hierarchical power and authority, competing priorities, ineffective communication, failing to enact the organisational processes and working collaboratively to optimise the discharge process. Issues with teamwork, cognitive processes and team members' interaction with cognitive artefacts influenced the discharge process. Strategies to improve shared situational awareness are needed to improve teamwork, patient flow and resource efficiency. Tools need to be evaluated regularly to ensure their continuous usefulness. Health care professionals need to be aware of the impact of their competing priorities and ensure discharges occur in a timely manner. Activity theory and distributed cognition are useful theoretical frameworks to support healthcare organisational research. © 2013 John Wiley & Sons Ltd.

  9. ThunderSTORM: a comprehensive ImageJ plug-in for PALM and STORM data analysis and super-resolution imaging

    PubMed Central

    Ovesný, Martin; Křížek, Pavel; Borkovec, Josef; Švindrych, Zdeněk; Hagen, Guy M.

    2014-01-01

    Summary: ThunderSTORM is an open-source, interactive and modular plug-in for ImageJ designed for automated processing, analysis and visualization of data acquired by single-molecule localization microscopy methods such as photo-activated localization microscopy and stochastic optical reconstruction microscopy. ThunderSTORM offers an extensive collection of processing and post-processing methods so that users can easily adapt the process of analysis to their data. ThunderSTORM also offers a set of tools for the creation of simulated data and quantitative performance evaluation of localization algorithms using Monte Carlo simulations. Availability and implementation: ThunderSTORM and the online documentation are freely accessible at https://code.google.com/p/thunder-storm/. Contact: guy.hagen@lf1.cuni.cz. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24771516

  10. Text recognition and correction for automated data collection by mobile devices

    NASA Astrophysics Data System (ADS)

    Ozarslan, Suleyman; Eren, P. Erhan

    2014-03-01

    Participatory sensing is an approach which allows mobile devices such as mobile phones to be used for data collection, analysis and sharing processes by individuals. Data collection is the first and most important part of a participatory sensing system, but it is time consuming for the participants. In this paper, we discuss automatic data collection approaches for reducing the time required for collection, and increasing the amount of collected data. In this context, we explore automated text recognition on images of store receipts which are captured by mobile phone cameras, and the correction of the recognized text. Accordingly, our first goal is to evaluate the performance of the Optical Character Recognition (OCR) method with respect to data collection from store receipt images. Images captured by mobile phones exhibit some typical problems, and common image processing methods cannot handle some of them. Consequently, the second goal is to address these types of problems through our proposed Knowledge Based Correction (KBC) method used in support of the OCR, and also to evaluate the KBC method with respect to the improvement on the accurate recognition rate. Results of the experiments show that the KBC method improves the accurate data recognition rate noticeably.
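
    A hedged sketch of the two stages described above: OCR on a receipt image followed by a simple lexicon-based correction pass. The lexicon and the closest-match rule are toy stand-ins for the paper's Knowledge Based Correction method, and the image path is hypothetical.

    ```python
    import difflib
    import pytesseract  # wraps the Tesseract OCR engine, which must be installed
    from PIL import Image

    # Hypothetical knowledge base of expected receipt vocabulary
    LEXICON = ["TOTAL", "MILK", "BREAD", "COFFEE", "SUGAR"]

    def recognize_and_correct(path: str) -> list[str]:
        """OCR a receipt image, then snap each token to its closest lexicon entry."""
        raw = pytesseract.image_to_string(Image.open(path))
        corrected = []
        for token in raw.upper().split():
            match = difflib.get_close_matches(token, LEXICON, n=1, cutoff=0.6)
            corrected.append(match[0] if match else token)
        return corrected

    print(recognize_and_correct("receipt_photo.jpg"))
    ```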

  11. Microwave Extraction of Volatiles for Mars Science and ISRU

    NASA Technical Reports Server (NTRS)

    Ethridge, Edwin C.; Kaulker, William F.

    2012-01-01

    The greatest advantage of microwave heating for volatiles extraction is that excavation can be greatly reduced. Surface support operations would be simple, consisting of rovers with drilling capability for insertion of microwaves down bore holes to heat at desired depths. The rovers would also support scientific instruments for volatiles analysis and for volatiles collection and storage. The process has the potential for much lower mass and a less complex system than other in-situ processes. Microwave energy penetrates the surface, heating within and causing sublimation of water or decomposition of volatile-containing minerals. On Mars the volatiles should migrate to the surface to be captured with a cold trap. The water extraction and transport process, coupled with atmospheric CO2 collection, could readily lead to a propellant production process: 2H2O + CO2 yields CH4 + 2O2.

  12. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods that support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected, large-scale, high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, and satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of available data and not be constrained to work within artificial disciplinary boundaries. Future challenges will involve the further integration and analysis of this data across the social sciences to facilitate impacts across the societal domain, including timely analysis to more accurately predict and forecast future climate and environmental states.

  13. Satellite temperature monitoring and prediction system

    NASA Technical Reports Server (NTRS)

    Barnett, U. R.; Martsolf, J. D.; Crosby, F. L.

    1980-01-01

    The paper describes the Florida Satellite Freeze Forecast System (SFFS) in its current state. All data collection options have been demonstrated, and data collected over a three-year period have been stored for future analysis. Presently, specific minimum temperature forecasts are issued routinely from November through March. The procedures for issuing these forecasts are discussed. The automated data acquisition and processing system is described, and the physical and statistical models employed are examined.

  14. Measurement of chromium VI and chromium III in stainless steel welding fumes with electron spectroscopy for chemical analysis and neutron activation analysis.

    PubMed

    Lautner, G M; Carver, J C; Konzen, R B

    1978-08-01

    Electron Spectroscopy for Chemical Analysis (ESCA) was explored as a means of studying the oxidation state of chromium in SMAC (coated electrode) stainless steel welding fume collected on Nucleopore filters in the laboratory. The chromium VI and III fractions (as a percent of the total chromium) obtained from ESCA analysis were applied to results from Neutron Activation Analysis (NAA) to yield an average of 69 micrograms of chromium VI per sample. Diphenylcarbazide/atomic absorption (DPC/AA) results are reported for samples submitted to an industrial laboratory. Possible chemical species and the solubility of chromium VI in stainless steel fumes are discussed in light of the analogy between the SMAC process and the manufacturing process for chromates.

  15. REVIEW ARTICLE: Current trends and future requirements for the mass spectrometric investigation of microbial, mammalian and plant metabolomes

    NASA Astrophysics Data System (ADS)

    Dunn, Warwick B.

    2008-03-01

    The functional levels of biological cells or organisms can be separated into the genome, transcriptome, proteome and metabolome. Of these the metabolome offers specific advantages to the investigation of the phenotype of biological systems. The investigation of the metabolome (metabolomics) has only recently appeared as a mainstream scientific discipline and is currently developing rapidly for the study of microbial, plant and mammalian metabolomes. The metabolome pipeline or workflow encompasses the processes of sample collection and preparation, collection of analytical data, raw data pre-processing, data analysis and data storage. Of these processes the collection of analytical data will be discussed in this review with specific interest shown in the application of mass spectrometry in the metabolomics pipeline. The current developments in mass spectrometry platforms (GC-MS, LC-MS, DIMS and imaging MS) and applications of specific interest will be highlighted. The current limitations of these platforms and applications will be discussed with areas requiring further development also highlighted. These include the detectable coverage of the metabolome, the identification of metabolites and the process of converting raw data to biological knowledge.

  16. North Carolina, 2011 forest inventory and analysis factsheet

    Treesearch

    Mark J. Brown; Barry D. New

    2013-01-01

    Forest Inventory and Analysis (FIA) factsheets are produced periodically to keep the public updated on the extent and condition of forest lands in each State. Estimates in the factsheets are based upon data collected from thousands of sample plots distributed across the landscape in a systematic manner. In North Carolina, this process is a collaborative effort between...

  17. Lost in a Giant Database: The Potentials and Pitfalls of Secondary Analysis for Deaf Education

    ERIC Educational Resources Information Center

    Kluwin, T. N.; Morris, C. S.

    2006-01-01

    Secondary research or archival research is the analysis of data collected by another person or agency. It offers several advantages, including reduced cost, a less time-consuming research process, and access to larger populations and thus greater generalizability. At the same time, it offers several limitations, including the fact that the…

  18. Wilderness on the internet: identifying wilderness information domains

    Treesearch

    Chuck Burgess

    2000-01-01

    Data collected from an online needs assessment revealed that Web site visitors with an interest in wilderness seek several different types of information. In order to gain further insight into the process of Web use for wilderness information, a follow-up analysis was conducted. This analysis was exploratory in nature, with the goal of identifying information domains...

  19. The Problem of Sample Contamination in a Fluvial Geochemistry Research Experience for Undergraduates.

    ERIC Educational Resources Information Center

    Andersen, Charles B.

    2001-01-01

    Introduces the analysis of a river as an excellent way to teach geochemical techniques because of the relative ease of sample collection and speed of sample analysis. Focuses on the potential sources of sample contamination during sampling, filtering, and bottle cleaning processes, and reviews methods to reduce and detect contamination. Includes…

  20. The Research Portfolio: Educating Teacher Researchers in Data Analysis

    ERIC Educational Resources Information Center

    Bates, Alisa J.; Bryant, Jill D.

    2013-01-01

    This paper describes research on a course assignment, the research portfolio, designed for a two-course teacher research experience in a Masters of Arts in Teaching program. The focus of the assignment is the process of data collection and analysis that is critical to the success of teacher research. We sought a way to help our teacher candidates…

  1. Identifying Engineering Students' English Sentence Reading Comprehension Errors: Applying a Data Mining Technique

    ERIC Educational Resources Information Center

    Tsai, Yea-Ru; Ouyang, Chen-Sen; Chang, Yukon

    2016-01-01

    The purpose of this study is to propose a diagnostic approach to identify engineering students' English reading comprehension errors. Student data were collected during the process of reading texts of English for science and technology on a web-based cumulative sentence analysis system. For the analysis, the association-rule, data mining technique…

  2. Learner Perception of Personal Spaces of Information (PSIs): A Mental Model Analysis

    ERIC Educational Resources Information Center

    Hardof-Jaffe, Sharon; Aladjem, Ruthi

    2018-01-01

    A personal space of information (PSI) refers to the collection of digital information items created, saved and organized, on digital devices. PSIs play a central and significant role in learning processes. This study explores the mental models and perceptions of PSIs by learners, using drawing analysis. Sixty-three graduate students were asked to…

  3. Instrument Would Detect and Collect Biological Aerosols

    NASA Technical Reports Server (NTRS)

    Savoy, Steve; Mayo, Mike

    2006-01-01

    A proposed compact, portable instrument would sample micron-sized airborne particles, would discriminate between biological ones (e.g., bacteria) and nonbiological ones (e.g., dust particles), and would collect the detected biological particles for further analysis. The instrument is intended to satisfy a growing need for means of rapid, inexpensive collection of bioaerosols in a variety of indoor and outdoor settings. Purposes that could be served by such collection include detecting airborne pathogens inside buildings and their ventilation systems, measuring concentrations of airborne biological contaminants around municipal waste-processing facilities, monitoring airborne effluents from suspected biowarfare facilities, and warning of the presence of airborne biowarfare agents.

  4. Knowledge Reasoning with Semantic Data for Real-Time Data Processing in Smart Factory

    PubMed Central

    Wang, Shiyong; Li, Di; Liu, Chengliang

    2018-01-01

    The application of high-bandwidth networks and cloud computing in manufacturing systems will be accompanied by massive amounts of data. Industrial data analysis plays an important role in condition monitoring, performance optimization, and the flexibility and transparency of the manufacturing system. However, the currently existing architectures are mainly designed for offline data analysis and are not suitable for real-time data processing. In this paper, we first define the smart factory as a cloud-assisted and self-organized manufacturing system in which physical entities such as machines, conveyors, and products organize production through intelligent negotiation, and the cloud supervises this self-organized process for fault detection and troubleshooting based on data analysis. Then, we propose a scheme to integrate knowledge reasoning and semantic data, in which the reasoning engine processes the ontology model with real-time semantic data coming from the production process. Based on these ideas, we build a benchmarking system for a smart candy-packing application that supports direct consumer customization and flexible hybrid production, with data collected and processed in real time for fault diagnosis and statistical analysis. PMID:29415444
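
    A minimal sketch of the scheme's flavor — live sensor readings asserted as semantic triples and a fault condition detected by querying them — using rdflib; the namespace, classes, and threshold are hypothetical, and the paper's actual reasoning engine and ontology are not reproduced.

    ```python
    from rdflib import Graph, Literal, Namespace, RDF

    SF = Namespace("http://example.org/smartfactory#")  # hypothetical namespace
    g = Graph()
    g.add((SF.packer1, RDF.type, SF.PackingMachine))
    g.add((SF.packer1, SF.hasTemperature, Literal(92.5)))  # live sensor reading

    overheating = g.query("""
        PREFIX sf: <http://example.org/smartfactory#>
        SELECT ?m WHERE {
            ?m a sf:PackingMachine ;
               sf:hasTemperature ?t .
            FILTER (?t > 80.0)
        }""")
    for row in overheating:
        print(f"fault: {row.m} is overheating")
    ```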

  5. The GRB All-sky Spectrometer Experiment II: Data Collection and Analysis

    NASA Astrophysics Data System (ADS)

    Voigt, Elana; Martinot, Zachary; Banks, Zachary; Pober, Jonathan; Morales, Miguel F.

    2015-01-01

    The GRB All-sky Spectrometer Experiment (GASE) is a wide-field interferometric radio telescope designed to look for Gamma Ray Bursts in the 30 to 50 MHz range. It is built and operated as a wholly undergraduate experiment at the University of Washington. This poster focuses on data analysis and the relation of data analysis to the commissioning process of our 8-element GASE array.

  6. A framework for analysis of large database of old art paintings

    NASA Astrophysics Data System (ADS)

    Da Rugna, Jérôme; Chareyron, Gaël; Pillay, Ruven; Joly, Morwena

    2011-03-01

    For many years, many museums and countries have organized high-definition digitization of their collections, generating massive data for each object. In this paper, we focus only on art painting collections. Nevertheless, we face a very large database of heterogeneous data: the image collection includes very old and recent scans of photographic negatives, digital photos, multi- and hyperspectral acquisitions, X-ray acquisitions, and front, back and lateral photos. Moreover, art paintings suffer from many forms of degradation: cracking, softening, artifacts, human damage, and corruption over time. Considering this, it appears necessary to develop specific approaches and methods dedicated to digital art painting analysis. Consequently, this paper presents a complete framework for evaluating, comparing and benchmarking image processing algorithms.

  7. Systems integration of marketable subsystems: A collection of progress reports

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Monthly progress reports are given in the areas of marketable subsystems integration; development, design, and building of site data acquisition subsystems and data processing systems; operation of the solar test facility; and systems analysis.

  8. Database architecture and query structures for probe data processing.

    DOT National Transportation Integrated Search

    2012-03-01

    This report summarizes findings and implementations of probe vehicle data collection based on Bluetooth MAC address matching technology. Probe vehicle travel time data are studied in the following field deployment case studies: analysis of traffic ch...

  9. BIOMONITORING OF EXPOSURE IN FARMWORKER STUDIES

    EPA Science Inventory

    Though biomonitoring has been used in many occupational and environmental health and exposure studies, we are only beginning to understand the complexities and uncertainties involved with the biomonitoring process -- from study design, to sample collection, to chemical analysis -...

  10. Sensitivity Analysis Reveals Critical Factors that Affect Wetland Methane Emissions using Soil Biogeochemistry Model

    NASA Astrophysics Data System (ADS)

    Alonso-Contes, C.; Gerber, S.; Bliznyuk, N.; Duerr, I.

    2017-12-01

    Wetlands contribute approximately 20 to 40% of global methane emissions. We built a methane model for tropical and subtropical forests that allows inundated conditions, following the approaches used in more complex global biogeochemical emission models (LPJWhyMe and CLM4Me). The model was designed to replace internal model formulations with field-collected and remotely sensed data for two essential drivers, plant productivity and hydrology, which allows us to focus directly on the central processes of methane production, consumption, and transport. One of our long-term goals is to make the model available to scientists interested in including methane modeling at their study locations. Sensitivity analysis results help focus field data collection efforts. Here, we present results from a pilot global sensitivity analysis of the model, carried out to determine which parameters and processes contribute most to the uncertainty of modeled methane emissions. Results show that parameters related to water table behavior, carbon input (in the form of plant productivity), and rooting depth affect simulated methane emissions the most. Current efforts include repeating the sensitivity analysis on methane emissions from an updated model that incorporates a soil heat flux routine, to determine the extent to which soil temperature parameters affect CH4 emissions. We are also conducting field data collection during summer 2017 for comparison among 3 different landscapes located in the Ordway-Swisher Biological Station in Melrose, FL, gathering soil moisture and CH4 emission data from 4 different wetland types. Having data from 4 wetland types allows calibration of the model to diverse soil, water, and vegetation characteristics.
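
    The abstract does not name the sensitivity method used; as one common way to carry out such a variance-based global sensitivity analysis, the sketch below uses the SALib package, with invented parameter names, bounds, and a stand-in model:

        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        # Hypothetical parameter space (names and bounds are placeholders)
        problem = {
            "num_vars": 3,
            "names": ["water_table_depth", "carbon_input", "rooting_depth"],
            "bounds": [[-0.5, 0.5], [0.0, 10.0], [0.1, 2.0]],
        }

        X = saltelli.sample(problem, 1024)       # sample the parameter space

        def methane_model(x):                    # stand-in for the real model
            wt, c, r = x
            return c * np.exp(wt) / (1.0 + r)

        Y = np.array([methane_model(x) for x in X])
        Si = sobol.analyze(problem, Y)           # first-order and total indices
        print(dict(zip(problem["names"], Si["S1"])))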

  11. Atlas of TOMS ozone data collected during the Genesis of Atlantic Lows Experiment (GALE), 1986

    NASA Technical Reports Server (NTRS)

    Larko, David E.; Uccellini, Louis W.; Krueger, Arlin J.

    1986-01-01

    Data from the TOMS (Total Ozone Mapping Spectrometer) instrument aboard the Nimbus-7 satellite were collected daily in real time during the GALE (Genesis of Atlantic Lows Experiment) from January 15 through March 15, 1986. The TOMS ozone data values were processed into GEMPAK format and transferred from the Goddard Space Flight Center to GALE operations in Raleigh-Durham, NC, in as little as three hours for use, in part, in directing aircraft research flights recording in situ measurements of ozone and water vapor in areas of interest. Once in GEMPAK format, the ozone values were processed into gridded form using the Barnes objective analysis scheme, and contour plots of the ozone were created. This atlas provides objectively analyzed contour plots of the ozone for each of the sixty days of GALE as well as four-panel presentations of the ozone analysis grouped by GALE Intensive Observing Periods (IOPs).
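
    A minimal single-pass sketch of the Barnes objective analysis named above, which grids scattered observations using Gaussian distance weights; a full Barnes analysis typically adds successive correction passes, and kappa (the smoothing length scale) is a free parameter here:

        import numpy as np

        def barnes_pass(obs_xy, obs_val, grid_xy, kappa):
            """Gaussian-weighted interpolation of scattered obs onto grid points."""
            out = np.empty(len(grid_xy))
            for i, g in enumerate(grid_xy):
                r2 = np.sum((obs_xy - g) ** 2, axis=1)   # squared distances
                w = np.exp(-r2 / kappa)                  # Barnes weights
                out[i] = np.sum(w * obs_val) / np.sum(w)
            return out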

  12. A DMAIC approach for process capability improvement in an engine crankshaft manufacturing process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa

    2014-05-01

    The define-measure-analyze-improve-control (DMAIC) approach is a five-phase methodology for reducing variation and improving the capability of manufacturing processes. The present work elaborates the DMAIC approach as applied to reducing process variation in the stub-end-hole boring operation in the manufacture of an engine crankshaft. This statistical process control study starts with selection of the critical-to-quality (CTQ) characteristic in the define phase. The next phase constitutes the collection of dimensional measurement data for the identified CTQ characteristic. This is followed by the analyze and improve phases, where quality control tools such as the Ishikawa diagram, physical mechanism analysis, failure modes and effects analysis, and analysis of variance are applied. Finally, process monitoring charts are deployed at the workplace for regular monitoring and control of the CTQ characteristic concerned. By adopting the DMAIC approach, the standard deviation was reduced from 0.003 to 0.002, the process potential capability index (C_p) improved from 1.29 to 2.02, and the process performance capability index (C_pk) improved from 0.32 to 1.45.
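
    A short sketch of how the two capability indices quoted above are conventionally computed; the specification limits are hypothetical, since the abstract does not give them:

        import numpy as np

        def capability(data, lsl, usl):
            """Process capability from measurements and spec limits (LSL, USL)."""
            mu, sigma = np.mean(data), np.std(data, ddof=1)
            cp = (usl - lsl) / (6 * sigma)               # potential: C_p
            cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # performance: C_pk
            return cp, cpk

    Note that C_p depends only on the spread, so for a fixed tolerance, reducing sigma from 0.003 to 0.002 raises C_p by a factor of 1.5, while C_pk additionally rewards the improved centering of the process.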

  13. Reduction and analysis of data collected during the electromagnetic tornado experiment

    NASA Technical Reports Server (NTRS)

    Davisson, L. D.

    1976-01-01

    Techniques for data processing and analysis are described to support tornado detection by analysis of radio frequency interference in various frequency bands, and sea state determination from short pulse radar measurements. Activities include: strip chart recording of tornado data; the development and implementation of computer programs for digitization and analysis of the data; data reduction techniques for short pulse radar data; and the simulation of radar returns from the sea surface by computer models.

  14. Collective Health Nursing: the construction of critical thinking about the reality of health.

    PubMed

    Chaves, Maria Marta Nolasco; Larocca, Liliana Müller; Peres, Aida Maris

    2011-12-01

    This article presents an analysis of teaching-learning processes and research in Collective Health Nursing in view of the consolidation of the Brazilian National Health System (Sistema Único de Saúde, SUS), with the objective of recognizing the health reality of the population as a strategy for bringing the fields of nursing practice and training closer together and for reversing undesirable health situations. The authors reflect on the work of Collective Health Nursing, understanding it as a mediator for promoting teaching, learning, and knowledge development in this field. They argue that these processes, founded on critical thinking, permit reflection on the contradictions between current public policy and the actions promoted by the sector, and thereby contribute to overcoming the current model of health care, historically founded on curative actions directed at individuals, in favor of a model that recognizes health needs and intervenes in the social determination of the health-disease process.

  15. Software Formal Inspections Standard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.

  16. Aspect level sentiment analysis using machine learning

    NASA Astrophysics Data System (ADS)

    Shubham, D.; Mithil, P.; Shobharani, Meesala; Sumathy, S.

    2017-11-01

    In the modern world, the development of the web and smartphones has increased the use of online shopping. Overall feedback about a product can be generated with the help of sentiment analysis using text processing. Opinion mining, or sentiment analysis, is used to collect and categorize product reviews. The proposed system uses aspect-level detection, in which features are extracted from the datasets. The system performs pre-processing operations such as tokenization, part-of-speech tagging, and lemmatization on the data to find meaningful information, which is used to detect the polarity level and assign a rating to the product. The proposed model focuses on aspects to produce accurate results while filtering out spam reviews.
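
    A minimal sketch of the pre-processing steps named above (tokenization, part-of-speech tagging, and lemmatization), using NLTK as one common toolkit (the paper's actual tooling is not specified); the review text is invented, and the punkt, averaged_perceptron_tagger, and wordnet NLTK data packages must be downloaded first:

        import nltk
        from nltk.stem import WordNetLemmatizer

        review = "The battery lasts long but the screen scratches easily."
        tokens = nltk.word_tokenize(review)          # tokenization
        tagged = nltk.pos_tag(tokens)                # part-of-speech tagging
        lemmatizer = WordNetLemmatizer()
        lemmas = [lemmatizer.lemmatize(t.lower()) for t in tokens]  # lemmatization
        print(tagged)
        print(lemmas)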

  17. [The pedagogical practice of nursing teachers and their knowledge].

    PubMed

    Madeira, Maria Zélia de Araújo; Lima, Maria da Glória Soares Barbosa

    2007-01-01

    This article investigates the teaching knowledge that underpins the pedagogical practice of nursing professors, seeking to understand the meaning of this social practice with regard to the process of becoming a professional teacher. This qualitative study, with methodological emphasis on oral history, used semi-structured interviews as data collection instruments and analyzed the data through content analysis. Among the results obtained, the analysis showed that teaching knowledge and pedagogical practice contribute positively to the consolidation of the process of becoming a professional teacher within the faculty of the nursing course at UFPI.

  18. Characterization of plastic blends made from mixed plastics waste of different sources.

    PubMed

    Turku, Irina; Kärki, Timo; Rinne, Kimmo; Puurtinen, Ari

    2017-02-01

    This paper studies the recyclability of construction and household plastic waste collected from local landfills. Samples were processed from mixed plastic waste by injection moulding. In addition, blends of pure plastics, polypropylene and polyethylene, were processed as a reference set. Reference samples with known plastic ratios were used as the calibration set for quantitative analysis of the plastic fractions in the recycled blends. The samples were tested for tensile properties; scanning electron microscopy with energy-dispersive X-ray spectroscopy was used for elemental analysis of the blend surfaces, and Fourier transform infrared (FTIR) analysis was used for quantification of the plastics contents.

  19. Functional brain segmentation using inter-subject correlation in fMRI.

    PubMed

    Kauppi, Jukka-Pekka; Pajula, Juha; Niemi, Jari; Hari, Riitta; Tohka, Jussi

    2017-05-01

    The human brain continuously processes massive amounts of rich sensory information. To better understand such highly complex brain processes, modern neuroimaging studies are increasingly utilizing experimental setups that better mimic daily-life situations. A new exploratory data-analysis approach, functional segmentation inter-subject correlation analysis (FuSeISC), was proposed to facilitate the analysis of functional magnetic resonance imaging (fMRI) data sets collected in these experiments. The method provides a new type of functional segmentation of brain areas, characterizing not only areas that display similar processing across subjects but also areas in which processing across subjects is highly variable. FuSeISC was tested using fMRI data sets collected during traditional block-design stimuli (37 subjects) as well as naturalistic auditory narratives (19 subjects). The method identified spatially local and/or bilaterally symmetric clusters in several cortical areas, many of which are known to process the types of stimuli used in the experiments. The method is not only useful for spatial exploration of large fMRI data sets obtained using naturalistic stimuli, but also has other potential applications, such as the generation of functional brain atlases including both lower- and higher-order processing areas. Finally, as a part of FuSeISC, a criterion-based sparsification of the shared nearest-neighbor graph was proposed for detecting clusters in noisy data. In tests with synthetic data, this technique was superior to well-known clustering methods such as Ward's method, affinity propagation, and K-means++. Hum Brain Mapp 38:2643-2665, 2017. © 2017 Wiley Periodicals, Inc.

  20. The Intersection of Identity Development and Peer Relationship Processes in Adolescence and Young Adulthood: Contributions of the Special Issue

    ERIC Educational Resources Information Center

    Galliher, Renee V.; Kerpelman, Jennifer L.

    2012-01-01

    This analysis of the papers in the special section on the intersection of identity development and peer relationship processes calls attention to the conceptual contribution this collection of papers makes to the literature on identity development. Together these ten papers build on strong theoretical foundations in identity development, which posit…

  1. Community Strategic Visioning as a Method to Define and Address Poverty: An Analysis from Select Rural Montana Communities

    ERIC Educational Resources Information Center

    Lachapelle, Paul; Austin, Eric; Clark, Daniel

    2010-01-01

    Community strategic visioning is a citizen-based planning process in which diverse sectors of a community collectively determine a future state and coordinate a plan of action. Twenty-one communities in rural Montana participated in a multi-phase poverty reduction program that culminated in a community strategic vision process. Research on this…

  2. Application of mass spectrometry to process control for polymer material in autoclave curing

    NASA Technical Reports Server (NTRS)

    Smith, A. C.

    1983-01-01

    Mass spectrometer analysis of gas samples collected during a cure cycle of polymer materials can be used as a process control technique. This technique is particularly helpful in studying the various types of solvents and resin systems used in the preparation of polymer materials and characterizing the chemical composition of different resin systems and their mechanism of polymerization.

  3. The Process of Making Meaning: The Interplay between Teachers' Knowledge of Mathematical Proofs and Their Classroom Practices

    ERIC Educational Resources Information Center

    Paddack, Megan

    2009-01-01

    The purpose of this study was to investigate and describe how middle school mathematics teachers "make meaning" of proofs and the process of proving in the context of their classroom practices. A framework of "making meaning," created by the researcher, guided the data collection and analysis phases of the study. This framework…

  4. A Grounded Theory of Text Revision Processes Used by Young Adolescents Who Are Deaf

    ERIC Educational Resources Information Center

    Yuknis, Christina

    2014-01-01

    This study examined the revising processes used by 8 middle school students who are deaf or hard-of-hearing as they composed essays for their English classes. Using grounded theory, interviews with students and teachers in one middle school, observations of the students engaging in essay creation, and writing samples were collected for analysis.…

  5. Shaking the Trees: The Psychology of Collecting in U.S. Newspaper Coverage of the College Admissions Process

    ERIC Educational Resources Information Center

    Bishop, Ronald

    2009-01-01

    A frame analysis was conducted to explore themes in recent coverage by print journalists of the college application process, with special attention paid to the use by reporters of "keywords, stock phrases, stereotyped images, sources of information, and sentences that provide reinforcing clusters of facts or judgments" (Entman, p. 52) about this…

  6. Radar data processing and analysis

    NASA Technical Reports Server (NTRS)

    Ausherman, D.; Larson, R.; Liskow, C.

    1976-01-01

    Digitized four-channel radar images corresponding to particular areas from the Phoenix and Huntington test sites were generated in conjunction with prior experiments performed to collect X- and L-band synthetic aperture radar imagery of these two areas. The methods for generating this imagery are documented. A secondary objective was the investigation of digital processing techniques for extraction of information from the multiband radar image data. Following the digitization, the remaining resources permitted a preliminary machine analysis to be performed on portions of the radar image data. The results, although necessarily limited, are reported.

  7. DNA data in criminal procedure in the European fundamental rights context.

    PubMed

    Soleto, Helena

    2014-01-01

    Despite being one of the most useful and reliable identification tools, DNA profiling in criminal procedure balances on the border between the limitation and the violation of fundamental rights, beginning with the collection of the sample, its analysis, and its use, and ending with its processing. Throughout this complex process, violation of human or fundamental rights is possible, including the right to physical and moral integrity, the right not to be subjected to degrading treatment, the right not to incriminate oneself, the right to family privacy together with that of not incriminating descendants or relatives in general, the right to personal development, and the right to informational self-determination. This article presents an analysis of all the above-mentioned DNA-handling phases in criminal process in light of possible violations of fundamental rights, while discarding some of them on the basis of European human rights protection standards. As the case-law of the European Court of Human Rights shows, legislation on DNA collection and DNA-related data processing, and its implementation, does not always respect all human rights and should be carefully considered before its adoption and during its application.

  8. Software Process Assessment (SPA)

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Sheppard, Sylvia B.; Butler, Scott A.

    1994-01-01

    NASA's environment mirrors the changes taking place in the nation at large, i.e. workers are being asked to do more work with fewer resources. For software developers at NASA's Goddard Space Flight Center (GSFC), the effects of this change are that we must continue to produce quality code that is maintainable and reusable, but we must learn to produce it more efficiently and less expensively. To accomplish this goal, the Data Systems Technology Division (DSTD) at GSFC is trying a variety of both proven and state-of-the-art techniques for software development (e.g., object-oriented design, prototyping, designing for reuse, etc.). In order to evaluate the effectiveness of these techniques, the Software Process Assessment (SPA) program was initiated. SPA was begun under the assumption that the effects of different software development processes, techniques, and tools, on the resulting product must be evaluated in an objective manner in order to assess any benefits that may have accrued. SPA involves the collection and analysis of software product and process data. These data include metrics such as effort, code changes, size, complexity, and code readability. This paper describes the SPA data collection and analysis methodology and presents examples of benefits realized thus far by DSTD's software developers and managers.

  9. Real-Time Monitoring of Psychotherapeutic Processes: Concept and Compliance

    PubMed Central

    Schiepek, Günter; Aichhorn, Wolfgang; Gruber, Martin; Strunk, Guido; Bachler, Egon; Aas, Benjamin

    2016-01-01

    Objective: The feasibility of a high-frequency real-time monitoring approach to psychotherapy is outlined and tested for patients' compliance, to evaluate its integration into everyday practice. Criteria concern ecological momentary assessment, the assessment of therapy-related cognitions and emotions, equidistant time sampling, real-time nonlinear time series analysis, continuous participative process control by client and therapist, and the application of idiographic (person-specific) surveys. Methods: The process-outcome monitoring is technically realized by an internet-based device for data collection and data analysis, the Synergetic Navigation System. Its feasibility is documented by a compliance study of 151 clients treated in an inpatient and a day-treatment clinic. Results: We found high compliance rates (mean: 78.3%, median: 89.4%) amongst the respondents, independent of the severity of symptoms or the degree of impairment. Compared to other diagnoses, the compliance rate was lower in the group diagnosed with personality disorders. Conclusion: The results support the feasibility of high-frequency monitoring in routine psychotherapy settings. Daily collection of psychological surveys allows for the assessment of highly resolved, equidistant time series data, which gives insight into the nonlinear qualities of therapeutic change processes (e.g., pattern transitions, critical instabilities). PMID:27199837

  10. Development of processes for the production of solar grade silicon from halides and alkali metals, phase 1 and phase 2

    NASA Technical Reports Server (NTRS)

    Dickson, C. R.; Gould, R. K.; Felder, W.

    1981-01-01

    High temperature reactions of silicon halides with alkali metals for the production of solar grade silicon are described. Product separation and collection processes were evaluated, heat release parameters were measured for scaling purposes, the effects of reactants and/or products on materials of reactor construction were determined, and a preliminary engineering and economic analysis of a scaled-up process was made. The feasibility of the basic process to make and collect silicon was demonstrated, and the jet impaction/separation process was demonstrated to be a purification process. The rate at which gas phase species form silicon particle precursors, the time required for silane decomposition to produce particles, and the competing rate of growth of silicon seed particles injected into a decomposing silane environment were determined. The extent of silane decomposition as a function of residence time, temperature, and pressure was measured by infrared absorption spectroscopy. A simplistic model is presented to explain the growth of silicon in a decomposing silane environment.

  11. A prioritization and analysis strategy for environmental surveillance results.

    PubMed

    Shyr, L J; Herrera, H; Haaker, R

    1997-11-01

    DOE facilities are required to conduct environmental surveillance to verify that facility operations remain within the approved risk envelope and have not caused undue risk to the public and the environment. Given a reduced budget, a strategy for analyzing environmental surveillance data was developed to set priorities for sampling needs. The radiological and metal data collected at Sandia National Laboratories, New Mexico, were used to demonstrate the analysis strategy. Sampling locations were prioritized for further investigation and for routine sampling needs. The process of data management, analysis, prioritization, and presentation has been automated with a custom-designed computer tool. Data collected over years can be analyzed and summarized in a short table format for prioritization and decision making.

  12. Development and Flight Testing of an Adaptable Vehicle Health-Monitoring Architecture

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Coffey, Neil C.; Gonzalez, Guillermo A.; Woodman, Keith L.; Weathered, Brenton W.; Rollins, Courtney H.; Taylor, B. Douglas; Brett, Rube R.

    2003-01-01

    Development and testing of an adaptable wireless health-monitoring architecture for a vehicle fleet is presented. It has three operational levels: one or more remote data acquisition units located throughout the vehicle; a command and control unit located within the vehicle; and a terminal collection unit that collects analysis results from all vehicles. Each level is capable of performing autonomous analysis with a trained adaptable expert system. The remote data acquisition unit has an eight-channel programmable digital interface that gives the user discretion in choosing the type of sensors, the number of sensors, the sensor sampling rate, and the sampling duration for each sensor. The architecture provides a framework for tributary analysis. All measurements at the lowest operational level are reduced to analysis results that gauge changes from established baselines. These are then collected at the next level to identify any global trends or common features from the prior level. This process is repeated until the results are reduced at the highest operational level. In this framework, only analysis results are forwarded to the next level, reducing telemetry congestion. The system's remote data acquisition hardware and non-analysis software have been flight tested on the main landing gear of the NASA Langley B757.

  13. Development of a software tool to support chemical and biological terrorism intelligence analysis

    NASA Astrophysics Data System (ADS)

    Hunt, Allen R.; Foreman, William

    1997-01-01

    AKELA has developed a software tool which uses a systems analytic approach to model the critical processes that support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to more detailed information. This paper explains each of these four model components.

  14. The textual characteristics of traditional and Open Access scientific journals are similar.

    PubMed

    Verspoor, Karin; Cohen, K Bretonnel; Hunter, Lawrence

    2009-06-15

    Recent years have seen an increased amount of natural language processing (NLP) work on full text biomedical journal publications. Much of this work is done with Open Access journal articles. Such work assumes that Open Access articles are representative of biomedical publications in general and that methods developed for analysis of Open Access full text publications will generalize to the biomedical literature as a whole. If this assumption is wrong, the cost to the community will be large, including not just wasted resources, but also flawed science. This paper examines that assumption. We collected two sets of documents, one consisting only of Open Access publications and the other consisting only of traditional journal publications. We examined them for differences in surface linguistic structures that have obvious consequences for the ease or difficulty of natural language processing and for differences in semantic content as reflected in lexical items. Regarding surface linguistic structures, we examined the incidence of conjunctions, negation, passives, and pronominal anaphora, and found that the two collections did not differ. We also examined the distribution of sentence lengths and found that both collections were characterized by the same mode. Regarding lexical items, we found that the Kullback-Leibler divergence between the two collections was low, and was lower than the divergence between either collection and a reference corpus. Where small differences did exist, log likelihood analysis showed that they were primarily in the area of formatting and in specific named entities. We did not find structural or semantic differences between the Open Access and traditional journal collections.
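
    A sketch of the Kullback-Leibler comparison described above, computed between smoothed unigram distributions of two document collections; the tokenization and the two tiny "collections" are deliberately simplistic placeholders:

        import math
        from collections import Counter

        def unigram_dist(docs, vocab, alpha=1.0):
            """Laplace-smoothed unigram distribution over a fixed vocabulary."""
            counts = Counter(w for d in docs for w in d.split())
            total = sum(counts[w] + alpha for w in vocab)
            return {w: (counts[w] + alpha) / total for w in vocab}

        def kl_divergence(p, q):
            return sum(p[w] * math.log(p[w] / q[w]) for w in p)

        open_access = ["the gene was expressed in mice", "protein binding assays"]
        traditional = ["the gene was sequenced", "binding of the protein was measured"]
        vocab = {w for d in open_access + traditional for w in d.split()}
        print(kl_divergence(unigram_dist(open_access, vocab),
                            unigram_dist(traditional, vocab)))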

  15. The textual characteristics of traditional and Open Access scientific journals are similar

    PubMed Central

    Verspoor, Karin; Cohen, K Bretonnel; Hunter, Lawrence

    2009-01-01

    Background Recent years have seen an increased amount of natural language processing (NLP) work on full text biomedical journal publications. Much of this work is done with Open Access journal articles. Such work assumes that Open Access articles are representative of biomedical publications in general and that methods developed for analysis of Open Access full text publications will generalize to the biomedical literature as a whole. If this assumption is wrong, the cost to the community will be large, including not just wasted resources, but also flawed science. This paper examines that assumption. Results We collected two sets of documents, one consisting only of Open Access publications and the other consisting only of traditional journal publications. We examined them for differences in surface linguistic structures that have obvious consequences for the ease or difficulty of natural language processing and for differences in semantic content as reflected in lexical items. Regarding surface linguistic structures, we examined the incidence of conjunctions, negation, passives, and pronominal anaphora, and found that the two collections did not differ. We also examined the distribution of sentence lengths and found that both collections were characterized by the same mode. Regarding lexical items, we found that the Kullback-Leibler divergence between the two collections was low, and was lower than the divergence between either collection and a reference corpus. Where small differences did exist, log likelihood analysis showed that they were primarily in the area of formatting and in specific named entities. Conclusion We did not find structural or semantic differences between the Open Access and traditional journal collections. PMID:19527520

  16. 14CO2 analysis of soil gas: Evaluation of sample size limits and sampling devices

    NASA Astrophysics Data System (ADS)

    Wotte, Anja; Wischhöfer, Philipp; Wacker, Lukas; Rethemeyer, Janet

    2017-12-01

    Radiocarbon (14C) analysis of CO2 respired from soils or sediments is a valuable tool to identify different carbon sources. The collection and processing of the CO2, however, is challenging and prone to contamination. We thus continuously improve our handling procedures and present a refined method for the collection of even small amounts of CO2 in molecular sieve cartridges (MSCs) for accelerator mass spectrometry 14C analysis. Using a modified vacuum rig and an improved desorption procedure, we were able to increase the CO2 recovery from the MSC (95%) as well as the sample throughput compared to our previous study. By processing a series of different sample sizes, we show that our MSCs can be used for CO2 samples as small as 50 μg C. The contamination by exogenous carbon determined in these laboratory tests was less than 2.0 μg C from fossil and less than 3.0 μg C from modern sources. Additionally, we tested two sampling devices for the collection of CO2 samples released from soils or sediments, including a respiration chamber and a depth sampler, which are connected to the MSC. We obtained a very promising, low process blank for the entire CO2 sampling and purification procedure of ∼0.004 F14C (equal to 44,000 yrs BP) and ∼0.003 F14C (equal to 47,000 yrs BP). In contrast to previous studies, we observed no isotopic fractionation towards lighter δ13C values during passive sampling with the depth samplers.
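
    The F14C-to-age equivalences quoted above follow from the conventional radiocarbon age relation (Stuiver-Polach convention, using the Libby mean life of 8033 yr):

        t_{\mathrm{BP}} = -\frac{5568}{\ln 2}\,\ln\bigl(F^{14}\mathrm{C}\bigr) \approx -8033\,\ln\bigl(F^{14}\mathrm{C}\bigr)

    so process blanks of about 0.004 and 0.003 F14C correspond to -8033 ln(0.004) ≈ 44,400 yr BP and -8033 ln(0.003) ≈ 46,700 yr BP, matching the rounded values reported.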

  17. Leadership in nursing: analysis of the process of choosing the heads.

    PubMed

    de Moura, Gisela Maria Schebella Souto; de Magalhaes, Ana Maria Müller; Dall'agnol, Clarice Maria; Juchem, Beatriz Cavalcanti; Marona, Daniela dos Santos

    2010-01-01

    The process of choosing heads can be strategic to achieving desired results in nursing care. This exploratory and descriptive study analyzes the process of choosing ward heads in the nursing area of a teaching hospital in Porto Alegre. Data were collected from registered nurses, technicians, and nursing auxiliaries through semi-structured interviews and a free word-association technique. Three theme categories emerged from content analysis: the process of choosing heads, the managerial competences of the head-to-be, and team articulation. Leadership was the word most frequently associated with the process of choosing heads. The consultative process for choosing the leader also contributes to the success of the manager, as it makes the team members feel co-responsible for the results achieved and legitimizes the head-to-be in their group.

  18. Understanding Process in Group-Based Intervention Delivery: Social Network Analysis and Intra-entity Variability Methods as Windows into the "Black Box".

    PubMed

    Molloy Elreda, Lauren; Coatsworth, J Douglas; Gest, Scott D; Ram, Nilam; Bamberger, Katharine

    2016-11-01

    Although the majority of evidence-based programs are designed for group delivery, group process and its role in participant outcomes have received little empirical attention. Data were collected from 20 groups of participants (94 early adolescents, 120 parents) enrolled in an efficacy trial of a mindfulness-based adaptation of the Strengthening Families Program (MSFP). Following each weekly session, participants reported on their relations to group members. Social network analysis and methods sensitive to intraindividual variability were integrated to examine weekly covariation between group process and participant progress, and to predict post-intervention outcomes from levels and changes in group process. Results demonstrate hypothesized links between network indices of group process and intervention outcomes and highlight the value of this unique analytic approach to studying intervention group process.

  19. Development of an Adaptive Filter to Estimate the Percentage of Body Fat Based on Anthropometric Measures

    NASA Astrophysics Data System (ADS)

    do Lago, Naydson Emmerson S. P.; Kardec Barros, Allan; Sousa, Nilviane Pires S.; Junior, Carlos Magno S.; Oliveira, Guilherme; Guimares Polisel, Camila; Eder Carvalho Santana, Ewaldo

    2018-01-01

    This study aims to develop an adaptive filter algorithm to determine the percentage of body fat in adolescents based on anthropometric indicators. Measurements such as body mass, height, and waist circumference were collected for the analysis. The development of this filter was based on the Wiener filter, which is used to produce an estimate of a random process by minimizing the mean square error between the estimated random process and the desired process. The LMS algorithm was also studied for the development of the filter, as it is attractive for its simplicity and ease of computation. Excellent results were obtained with the developed filter; these results were analyzed and compared with the collected data.
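
    The abstract names the LMS algorithm without giving its form; a generic LMS adaptive-filter sketch (the tap count and step size are illustrative) looks like this:

        import numpy as np

        def lms(x, d, n_taps=4, mu=0.01):
            """Least-mean-squares filter: adapt weights w so that w @ u tracks d."""
            w = np.zeros(n_taps)
            y = np.zeros(len(x))
            e = np.zeros(len(x))
            for n in range(n_taps - 1, len(x)):
                u = x[n - n_taps + 1:n + 1][::-1]   # most recent samples first
                y[n] = w @ u                        # filter output
                e[n] = d[n] - y[n]                  # estimation error
                w = w + mu * e[n] * u               # LMS weight update
            return y, e, w

    In the body-fat setting described above, x would hold the anthropometric inputs and d the reference body-fat percentages; this mapping is an assumption, not the authors' stated configuration.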

  20. Design of standard operating procedures for production processes (case study of a home industry in Bedugul, Baturiti, Tabanan, Bali)

    NASA Astrophysics Data System (ADS)

    Kasiani; Suhantono, Djoko; Mirah Kencanawati, AAA

    2018-01-01

    Candikuning, part of the district of Baturiti, is a tourism village better known by the name Bedugul. No less interesting is the variety of chips produced by two partner groups as souvenirs for visitors, such as spinach, bean, and tempeh chips. The purpose of this research was to design a Standard Operating Procedure (SOP) for the production processes of this home industry in Bedugul, Baturiti, Tabanan, Bali. Data were collected through observation, documentation, and interviews, and analyzed using the Miles and Huberman approach. The result of this research is a draft SOP for the chip production processes (menu). In conclusion, the SOP for the production processes, presented as a flowchart with descriptions, was designed for the home industry in Bedugul, Baturiti, Tabanan, Bali.

  1. Minimal Network Topologies for Signal Processing during Collective Cell Chemotaxis.

    PubMed

    Yue, Haicen; Camley, Brian A; Rappel, Wouter-Jan

    2018-06-19

    Cell-cell communication plays an important role in collective cell migration. However, it remains unclear how cells in a group cooperatively process external signals to determine the group's direction of motion. Although the topology of signaling pathways is vitally important in single-cell chemotaxis, the signaling topology for collective chemotaxis has not been systematically studied. Here, we combine mathematical analysis and simulations to find minimal network topologies for multicellular signal processing in collective chemotaxis. We focus on border cell cluster chemotaxis in the Drosophila egg chamber, in which responses to several experimental perturbations of the signaling network are known. Our minimal signaling network includes only four elements: a chemoattractant, the protein Rac (indicating cell activation), cell protrusion, and a hypothesized global factor responsible for cell-cell interaction. Experimental data on cell protrusion statistics allow us to systematically narrow the number of possible topologies from more than 40,000,000 to only six minimal topologies with six interactions between the four elements. This analysis does not require a specific functional form of the interactions, and only qualitative features are needed; it is thus robust to many modeling choices. Simulations of a stochastic biochemical model of border cell chemotaxis show that the qualitative selection procedure accurately determines which topologies are consistent with the experiment. We fit our model for all six proposed topologies; each produces results that are consistent with all experimentally available data. Finally, we suggest experiments to further discriminate possible pathway topologies. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anheier, Norman C.; Cannon, Bret D.; Martinez, Alonzo

    The International Atomic Energy Agency’s (IAEA’s) long-term research and development plan calls for more cost-effective and efficient safeguard methods to detect and deter misuse of gaseous centrifuge enrichment plants (GCEPs). The IAEA’s current safeguards approaches at GCEPs are based on a combination of routine and random inspections that include environmental sampling and destructive assay (DA) sample collection from UF6 in-process material and selected cylinders. Samples are then shipped offsite for subsequent laboratory analysis. In this paper, a new DA sample collection and onsite analysis approach that could help to meet challenges in transportation and chain of custody for UF6 DA samples is introduced. This approach uses a handheld sampler concept and a Laser Ablation, Laser Absorbance Spectrometry (LAARS) analysis instrument, both currently under development at the Pacific Northwest National Laboratory. A LAARS analysis instrument could be temporarily or permanently deployed in the IAEA control room of the facility, in the IAEA data acquisition cabinet, for example. The handheld PNNL DA sampler design collects and stabilizes a much smaller DA sample mass compared to current sampling methods. The significantly lower uranium mass reduces the sample radioactivity and the stabilization approach diminishes the risk of uranium and hydrogen fluoride release. These attributes enable safe sample handling needed during onsite LAARS assay and may help ease shipping challenges for samples to be processed at the IAEA’s offsite laboratory. The LAARS and DA sampler implementation concepts will be described and preliminary technical viability results presented.

  3. Time-course human urine proteomics in space-flight simulation experiments.

    PubMed

    Binder, Hans; Wirth, Henry; Arakelyan, Arsen; Lembcke, Kathrin; Tiys, Evgeny S; Ivanisenko, Vladimir A; Kolchanov, Nikolay A; Kononikhin, Alexey; Popov, Igor; Nikolaev, Evgeny N; Pastushkova, Lyudmila; Larina, Irina M

    2014-01-01

    Long-term space-flight simulation experiments have enabled the discovery of different aspects of human metabolism, such as the complexity of NaCl salt balance. Detailed proteomics data were collected during the Mars105 isolation experiment, enabling a deeper insight into the molecular processes involved. We studied the abundance of about two thousand proteins extracted from urine samples of six volunteers, collected weekly during a 105-day isolation experiment under controlled dietary conditions including a progressive reduction of salt consumption. Machine learning using self-organizing maps (SOMs) in combination with different analysis tools was applied to describe the time trajectories of protein abundance in urine. The method enables a personalized and intuitive view of the physiological state of the volunteers. The abundance of more than half of the proteins measured clearly changes in the course of the experiment. The trajectories split roughly into three time ranges: an early one (weeks 1-6), an intermediate one (weeks 7-11), and a late one (weeks 12-15). Regulatory modes associated with distinct biological processes were identified from previous knowledge by applying enrichment and pathway flow analysis. Early protein activation modes can be related to immune response and inflammatory processes, activation at intermediate times to developmental and proliferative processes, and late activations to stress and responses to chemicals. The protein abundance profiles support previous results about alternative mechanisms of salt storage in an osmotically inactive form. We hypothesize that a reduced NaCl consumption of about 6 g/day would presumably reduce or even prevent the activation of the inflammatory processes observed in the early time range of isolation. SOM machine learning in combination with class-discovery and functional-annotation analysis methods enables the straightforward analysis of complex proteomics data sets generated by means of mass spectrometry.
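
    A sketch of SOM-based grouping of protein time trajectories, using the MiniSom package as one freely available implementation (not necessarily the authors' tool); the array shapes are illustrative, roughly 2000 proteins by 15 weekly samples:

        import numpy as np
        from minisom import MiniSom

        data = np.random.rand(2000, 15)       # placeholder abundance profiles
        som = MiniSom(10, 10, 15, sigma=1.5, learning_rate=0.5)
        som.random_weights_init(data)
        som.train_random(data, 5000)          # unsupervised training

        # Proteins mapping to the same unit share a time-course pattern,
        # i.e. a candidate "regulatory mode".
        modes = [som.winner(profile) for profile in data]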

  4. An evaluation of the ERTS data collection system as a potential operational tool. [automatic hydrologic data collection and processing system for geological surveys]

    NASA Technical Reports Server (NTRS)

    Paulson, R. W.

    1974-01-01

    The Earth Resources Technology Satellite Data Collection System has been shown to be, from the user's vantage point, a reliable and simple system for collecting data from U.S. Geological Survey operational field instrumentation. It is technically feasible to expand the ERTS system into an operational polar-orbiting data collection system to gather data from the Geological Survey's Hydrologic Data Network. This could permit more efficient internal management of the Network, and could enable the Geological Survey to make data available to cooperating agencies in near-real time. The Geological Survey is conducting an analysis of the costs and benefits of satellite data-relay systems.

  5. Analysis of Gas Dissociation Solar Thermal Power System

    DTIC Science & Technology

    1975-01-01

    (Abstract partially garbled in the source scan.) Recoverable fragments indicate that methods of utilizing the collected heat for chemical processing are discussed, based on reversible chemical reactions in a closed-cycle gaseous working fluid, with a molten/solid salt heat-energy reservoir in the solar energy collection field and a chemical reaction energy of 23 kcal per mole.

  6. Collection, processing and error analysis of Terrestrial Laser Scanning data from fluvial gravel surfaces

    NASA Astrophysics Data System (ADS)

    Hodge, R.; Brasington, J.; Richards, K.

    2009-04-01

    The ability to collect 3D elevation data at mm-resolution from in-situ natural surfaces, such as fluvial and coastal sediments, rock surfaces, soils and dunes, is beneficial for a range of geomorphological and geological research. From these data the properties of the surface can be measured, and Digital Terrain Models (DTM) can be constructed. Terrestrial Laser Scanning (TLS) can collect quickly such 3D data with mm-precision and mm-spacing. This paper presents a methodology for the collection and processing of such TLS data, and considers how the errors in this TLS data can be quantified. TLS has been used to collect elevation data from fluvial gravel surfaces. Data were collected from areas of approximately 1 m2, with median grain sizes ranging from 18 to 63 mm. Errors are inherent in such data as a result of the precision of the TLS, and the interaction of factors including laser footprint, surface topography, surface reflectivity and scanning geometry. The methodology for the collection and processing of TLS data from complex surfaces like these fluvial sediments aims to minimise the occurrence of, and remove, such errors. The methodology incorporates taking scans from multiple scanner locations, averaging repeat scans, and applying a series of filters to remove erroneous points. Analysis of 2.5D DTMs interpolated from the processed data has identified geomorphic properties of the gravel surfaces, including the distribution of surface elevations, preferential grain orientation and grain imbrication. However, validation of the data and interpolated DTMs is limited by the availability of techniques capable of collecting independent elevation data of comparable quality. Instead, two alternative approaches to data validation are presented. The first consists of careful internal validation to optimise filter parameter values during data processing combined with a series of laboratory experiments. In the experiments, TLS data were collected from a sphere and planes with different reflectivities to measure the accuracy and precision of TLS data of these geometrically simple objects. Whilst this first approach allows the maximum precision of TLS data from complex surfaces to be estimated, it cannot quantify the distribution of errors within the TLS data and across the interpolated DTMs. The second approach enables this by simulating the collection of TLS data from complex surfaces of a known geometry. This simulated scanning has been verified through systematic comparison with laboratory TLS data. Two types of surface geometry have been investigated: simulated regular arrays of uniform spheres used to analyse the effect of sphere size; and irregular beds of spheres with the same grain size distribution as the fluvial gravels, which provide a comparable complex geometry to the field sediment surfaces. A series of simulated scans of these surfaces has enabled the magnitude and spatial distribution of errors in the interpolated DTMs to be quantified, as well as demonstrating the utility of the different processing stages in removing errors from TLS data. As well as demonstrating the application of simulated scanning as a technique to quantify errors, these results can be used to estimate errors in comparable TLS data.
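
    The abstract does not list the specific filters applied; as a generic example of one common TLS cleaning step, the sketch below drops points whose mean distance to their k nearest neighbours is anomalously large:

        import numpy as np
        from scipy.spatial import cKDTree

        def drop_isolated_points(points, k=8, n_sigma=2.0):
            """points: (N, 3) array of x, y, z laser returns."""
            tree = cKDTree(points)
            d, _ = tree.query(points, k=k + 1)   # neighbour 0 is the point itself
            mean_d = d[:, 1:].mean(axis=1)
            keep = mean_d < mean_d.mean() + n_sigma * mean_d.std()
            return points[keep]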

  7. Indications of comprehensiveness in the pedagogical relationship: a design to be constructed in nursing education.

    PubMed

    Lima, Margarete Maria de; Reibnitz, Kenya Schmidt; Kloh, Daiana; Martini, Jussara Gue; Backes, Vania Marli Schubert

    2017-11-27

    To analyze how indications of comprehensiveness translate into the teaching-learning process in a nursing undergraduate course. Qualitative case study carried out with professors of a nursing undergraduate course. Data collection occurred through documentary analysis, non-participant observation, and individual interviews. Data analysis was guided by an analytical matrix following the steps of the operative proposal. Eight professors participated in the study. Some indications of comprehensiveness, such as dialogue, listening, mutual respect, bonding, and welcoming, are present in the daily practice of some professors, who apply them in the pedagogical relationship. The results point to the comprehensiveness of teaching-learning in single- and double-loop models, in which professor and student assume an open posture toward new possibilities in the teaching-learning process. Comprehensiveness, when recognized as a pedagogical principle, allows a break with professor-centered teaching and an advance toward collective learning, enabling professor and student to create their own design anchored in a reflective process about their practices and the reality found in the health services.

  8. Two Strategies for Qualitative Content Analysis: An Intramethod Approach to Triangulation.

    PubMed

    Renz, Susan M; Carrington, Jane M; Badger, Terry A

    2018-04-01

    The overarching aim of qualitative research is to gain an understanding of certain social phenomena. Qualitative research involves the studied use and collection of empirical materials, all to describe moments and meanings in individuals' lives. Data derived from these various materials require a form of content analysis, focusing on written or spoken language as communication, to provide context and understanding of the message. Qualitative research often involves the collection of data through extensive interviews, note taking, and tape recording, methods that are time- and labor-intensive. With advances in computerized text analysis software, combining methods to analyze qualitative data can make large data sets more manageable and enhance the trustworthiness of the results. This article describes a novel process of combining two methods of qualitative data analysis, or intramethod triangulation, as a means of providing a deeper analysis of text.

  9. First comparative analysis concerning the plasma platelet contamination during MNC collection.

    PubMed

    Pfeiffer, Hella; Achenbach, Susanne; Strobel, Julian; Zimmermann, Robert; Eckstein, Reinhold; Strasser, Erwin F

    2017-08-01

    Monocytes can be cultured into dendritic cells with addition of autologous plasma, which is highly prone to platelet contamination due to the apheresis process. Since platelets affect the maturation process of monocytes into dendritic cells and might even lead to a diminished harvest of dendritic cells, it is very important to reduce the platelet contamination. A new collection device (Spectra Optia) was analyzed, compared to two established devices (COM.TEC, Cobe Spectra) and evaluated regarding the potential generation of source plasma. Concurrent plasma collected during leukapheresis was analyzed for residual cell contamination in a prospective study with the new Spectra Optia apheresis device (n=24) and was compared with COM.TEC and Cobe Spectra data (retrospective analysis, n=72). Donor pre-donation counts of platelets were analyzed for their predictive value of contaminating PLTs in plasma harvests. The newest apheresis device showed the lowest residual platelet count of the collected concurrent plasma (median 3.50×10 9 /l) independent of pre-donation counts. The other two devices and sets had a higher platelet contamination. The contamination of the plasma with leukocytes was very low (only 2.0% were higher than 0.5×10 9 /l). This study showed a significant reduction of platelet contamination of the concurrent plasma collected with the new Spectra Optia device. This plasma product with low residual platelets and leukocytes might also be used as plasma for fractionation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tyrrell, Evan; Denny, Angelita

    Fifty-two groundwater samples and one surface water sample were collected at the Monument Valley, Arizona, Processing Site to monitor groundwater contaminants for evaluating the effectiveness of the proposed compliance strategy as specified in the 1999 Final Site Observational Work Plan for the UMTRA Project Site at Monument Valley, Arizona. Sampling and analyses were conducted as specified in the Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites (LMS/PRO/S04351, continually updated, http://energy.gov/lm/downloads/sampling-and-analysis-plan-us-department- energy-office-legacy-management-sites). Samples were collected for metals, anions, nitrate + nitrite as N, and ammonia as N analyses at all locations.

  11. Three Tier Unified Process Model for Requirement Negotiations and Stakeholder Collaborations

    NASA Astrophysics Data System (ADS)

    Niazi, Muhammad Ashraf Khan; Abbas, Muhammad; Shahzad, Muhammad

    2012-11-01

    This research paper carries out a pragmatic qualitative analysis of various models and approaches to requirements negotiation (a sub-process of the requirements management plan, which is an output of scope management's collect requirements process) and studies stakeholder collaboration methodologies (i.e., from within the communication management knowledge area). The experiential analysis encompasses two tiers: the first tier applies a weighted scoring model, while the second tier develops SWOT matrices on the basis of the weighted scoring results in order to select an appropriate requirements negotiation model. Finally, the results are illustrated with statistical pie charts. On the basis of these results for the prevalent models and approaches, a unified approach for requirements negotiation and stakeholder collaboration is proposed, in which the collaboration methodologies are embedded into the selected requirements negotiation model as internal parameters of the proposed process, alongside external parameters such as MBTI and opportunity analysis.
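
    A toy sketch of the first-tier weighted scoring model; the criteria, weights, ratings, and candidate model names below are invented for illustration:

        # Each candidate negotiation model is rated per criterion;
        # the weighted sum ranks the candidates.
        criteria_weights = {"flexibility": 0.40, "tool_support": 0.25,
                            "stakeholder_fit": 0.35}
        ratings = {
            "model_A": {"flexibility": 4, "tool_support": 5, "stakeholder_fit": 3},
            "model_B": {"flexibility": 5, "tool_support": 3, "stakeholder_fit": 4},
        }
        scores = {m: sum(criteria_weights[c] * r[c] for c in criteria_weights)
                  for m, r in ratings.items()}
        print(max(scores, key=scores.get), scores)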

  12. Environmental sampling and analysis in support of NTI-3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGuire, R.R.; Harrar, J.E.; Haas, J.S.

    1991-04-06

    The third National Trial Inspection took place at the Monsanto chemical plant in Luling, Louisiana. In order to test the effectiveness of environmental sampling (soil, water, and air) in determining the nature of the chemical process in a given production plant, and to examine the distance from a process building at which samples can effectively be taken, we needed to select materials that constituted components of process streams. Three materials were selected: (1) isopropyl amine for air monitoring, (2) 4-nitrophenol, one of the precursors in the acetaminophen process, and (3) an intermediate in the production of glyphosate for ROUNDUP that is known simply as glyphosate intermediate. LLNL did not participate in the air sampling or the analysis for isopropyl amine. This paper discusses the steps in this experiment, including sample collection, sample workup, and sample analysis, followed by the results, discussion, and conclusions. 3 figs., 6 tabs.

  13. The project office of the Gaia Data Processing and Analysis Consortium

    NASA Astrophysics Data System (ADS)

    Mercier, E.; Els, S.; Gracia, G.; O'Mullane, W.; Lock, T.; Comoretto, G.

    2010-07-01

    Gaia is Europe's future astrometry satellite, currently under development. The data collected by Gaia will be processed and analyzed by the Data Processing and Analysis Consortium (DPAC). DPAC consists of over 400 scientists in more than 22 countries, who are currently developing the required data reduction, analysis, and handling algorithms and routines. DPAC is organized into Coordination Units (CUs) and Data Processing Centres (DPCs). Each of these entities is individually responsible for developing software for processing the different data. In 2008, the DPAC Project Office (PO) was set up with the task of managing the day-to-day activities of the consortium, including implementation, development, and operations. This paper describes the tasks DPAC faces, the role of the DPAC PO in the Gaia framework, and how it supports the DPAC entities in their effort to fulfill the Gaia promise.

  14. Sense-making for intelligence analysis on social media data

    NASA Astrophysics Data System (ADS)

    Pritzkau, Albert

    2016-05-01

    Social networks, and online social networks as a subset in particular, enable the analysis of social relationships represented by interaction, collaboration, or other sorts of influence between people. Any set of people and their internal social relationships can be modelled as a general social graph. These relationships are formed by exchanging emails, making phone calls, or carrying out a range of other activities that build up the network. This paper presents an overview of current approaches to utilizing social media as a ubiquitous sensor network in the context of national and global security. Exploitation of social media is usually an interdisciplinary endeavour in which the relevant technologies and methods are identified and linked in order ultimately to demonstrate selected applications. Effective and efficient intelligence is usually accomplished in a combined human and computer effort. Indeed, the intelligence process heavily depends on combining a human's flexibility, creativity, and cognitive ability with the bandwidth and processing power of today's computers. To improve the usability and accuracy of intelligence analysis we will have to rely on data-processing tools at the level of natural language. In particular, the collection and transformation of unstructured data into actionable, structured data requires scalable computational algorithms ranging from Artificial Intelligence, via Machine Learning, to Natural Language Processing (NLP). To support intelligence analysis on social media data, social media analytics is concerned with developing and evaluating computational tools and frameworks to collect, monitor, analyze, summarize, and visualize social media data. Analytics methods are employed to extract significant patterns that might not be obvious. As a result, different data representations rendering distinct aspects of content and interactions serve as a means to adapt the focus of the intelligence analysis to specific information requests.
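    As a toy illustration of the social-graph modelling described above, the sketch below (assuming the networkx library and invented interaction records) builds a graph from pairwise contacts and ranks actors by degree centrality.

    # Minimal sketch: model interactions (emails, calls, mentions) as a
    # social graph and rank actors by degree centrality.
    # The interaction records are invented for illustration.
    import networkx as nx

    interactions = [
        ("alice", "bob"),
        ("alice", "carol"),
        ("bob", "carol"),
        ("carol", "dave"),
        ("dave", "alice"),
    ]

    G = nx.Graph()
    for sender, receiver in interactions:
        # Accumulate an edge weight for repeated contact between two actors.
        if G.has_edge(sender, receiver):
            G[sender][receiver]["weight"] += 1
        else:
            G.add_edge(sender, receiver, weight=1)

    # Degree centrality highlights the most connected actors in the network.
    for actor, score in sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]):
        print(f"{actor}: {score:.2f}")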

  15. Marital power process of Korean men married to foreign women: a qualitative study.

    PubMed

    Kim, Miyoung; Park, Gyeong Sook; Windsor, Carol

    2013-03-01

    This study explored how Korean men married to migrant women construct meaning around married life. Data were collected through in-depth interviews with 10 men who had been married to migrant women for ≥ 2 years. Data collection and analysis were performed concurrently using a grounded theory approach. The core category generated was the process of sustaining a family unit. The men came to understand the importance of a distribution of power within the family in sustaining the family unit. Constituting this process were four stages: recognizing an imbalance of power, relinquishing power, empowering, and fine-tuning the balance of power. This study provides important insight into the dynamics of marital power from men's point of view by demonstrating a link between the way people adjust to married life and the process by which married couples adjust through the distribution and redistribution of power. © 2012 Wiley Publishing Asia Pty Ltd.

  16. A review of earth observation using mobile personal communication devices

    NASA Astrophysics Data System (ADS)

    Ferster, Colin J.; Coops, Nicholas C.

    2013-02-01

    Earth observation using mobile personal communication devices (MPCDs) is a recent advance with considerable promise for acquiring important and timely measurements. Globally, over 5 billion people have access to mobile phones, with an increasing proportion having access to smartphones with capabilities such as a camera, microphone, global positioning system (GPS), data storage, and networked data transfer. Scientists can view these devices as embedded sensors with the potential to take measurements of the Earth's surface and processes. To advance the state of Earth observation using MPCDs, scientists need to consider terms and concepts from a broad range of disciplines including citizen science, image analysis, and computer vision. In this paper, as a result of our literature review, we identify a number of considerations for Earth observation using MPCDs such as methods of field collection, collecting measurements over broad areas, errors and biases, data processing, and accessibility of data. Developing effective frameworks for mobile data collection with public participation and strategies for minimizing bias, in combination with advancements in image processing techniques, will offer opportunities to collect Earth sensing data across a range of scales and perspectives, complementing airborne and spaceborne remote sensing measurements.

  17. Rapid DNA analysis for automated processing and interpretation of low DNA content samples.

    PubMed

    Turingan, Rosemary S; Vasantgadkar, Sameer; Palombo, Luke; Hogan, Catherine; Jiang, Hua; Tan, Eugene; Selden, Richard F

    2016-01-01

    Casework samples with low DNA content subjected to short tandem repeat (STR) analysis include those resulting from the transfer of epithelial cells from the skin to an object (e.g., cells on a water bottle or the brim of a cap), blood spatter stains, and small bone and tissue fragments. Low DNA content (LDC) samples are important in a wide range of settings, including disaster response teams to assist in victim identification and family reunification, military operations to identify friend or foe, criminal forensics to identify suspects and exonerate the innocent, and medical examiner and coroner offices to identify missing persons. Processing LDC samples requires experienced laboratory personnel, isolated workstations, and sophisticated equipment, requires transport time, and involves complex procedures. We present a rapid DNA analysis system designed specifically to generate STR profiles from LDC samples in field-forward settings by non-technical operators. By performing STR analysis in the field, close to the site of collection, rapid DNA analysis has the potential to increase throughput and to provide actionable information in real time. A Low DNA Content BioChipSet (LDC BCS) was developed and manufactured by injection molding. It was designed to function in the fully integrated Accelerated Nuclear DNA Equipment (ANDE) instrument previously designed for analysis of buccal swabs and other high DNA content samples (Investigative Genet. 4(1):1-15, 2013). The LDC BCS performs efficient DNA purification followed by microfluidic ultrafiltration of the purified DNA, maximizing the quantity of DNA available for subsequent amplification and electrophoretic separation and detection of amplified fragments. The system demonstrates accuracy, precision, resolution, signal strength, and peak height ratios appropriate for casework analysis. The LDC rapid DNA analysis system is effective for the generation of STR profiles from a wide range of sample types. The technology broadens the range of sample types that can be processed and minimizes the time between sample collection, sample processing and analysis, and generation of actionable intelligence. The fully integrated Expert System is capable of interpreting a wide range of sample types and input DNA quantities, allowing samples to be processed and interpreted without a technical operator.

  18. A Systematic Mapping on the Learning Analytics Field and Its Analysis in the Massive Open Online Courses Context

    ERIC Educational Resources Information Center

    Moissa, Barbara; Gasparini, Isabela; Kemczinski, Avanilde

    2015-01-01

    Learning Analytics (LA) is a field that aims to optimize learning through the study of dynamical processes occurring in the students' context. It covers the measurement, collection, analysis and reporting of data about students and their contexts. This study aims at surveying existing research on LA to identify approaches, topics, and needs for…

  19. Making the Most of PIAAC: Preliminary Investigation of Adults' Numeracy Practices through Secondary Analysis of the PIAAC Dataset

    ERIC Educational Resources Information Center

    Coben, Diana; Miller-Reilly, Barbara; Satherley, Paul; Earle, David

    2016-01-01

    The Programme for the International Assessment of Adult Competencies (PIAAC) assesses key information processing skills and collects information on how often people undertake a range of activities at work and in everyday life. We are exploring what secondary analysis of online anonymised PIAAC data can tell us about adults' numeracy practices. In…

  20. Understanding Task-in-Process through the Lens of Laughter: Activity Designs, Instructional Materials, Learner Orientations, and Interpersonal Relationships

    ERIC Educational Resources Information Center

    Hasegawa, Atsushi

    2018-01-01

    Using the framework of conversation analysis, this study investigated the interactional workings of laughter in task-based interactions. The analysis was drawn from 160 cases of pair work interactions, collected in 2nd-semester Japanese-as-a-foreign-language classrooms. The pair work activities examined in this study are mostly grammar-focused,…

  1. A Novel Method for the In-Depth Multimodal Analysis of Student Learning Trajectories in Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Liu, Ran; Stamper, John; Davenport, Jodi

    2018-01-01

    Temporal analyses are critical to understanding learning processes, yet understudied in education research. Data from different sources are often collected at different grain sizes, which are difficult to integrate. Making sense of data at many levels of analysis, including the most detailed levels, is highly time-consuming. In this paper, we…

  2. 78 FR 48912 - Agency Information Collection Activities: Submission to OMB for Reinstatement, With, of a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-12

    ... for analysis in the NCUA Low-Income Designation (LID) Tool. The LID Tool is a geocoding software... the member address data are obtained through the examination process and the results of the LID Tool... may send an electronic member address data file for analysis in the LID Tool. If a credit union does...

  3. 78 FR 59377 - Agency Information Collection Activities: Submission to OMB for Reinstatement, With Change, of a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-26

    ... analysis in the NCUA Low-Income Designation (LID) Tool. The LID Tool is a geocoding software program which... data are obtained through the examination process and the results of the LID Tool indicate the credit... electronic member address data file for analysis in the LID Tool. If a credit union does not qualify for a...

  4. Can Item Analysis of MCQs Accomplish the Need of a Proper Assessment Strategy for Curriculum Improvement in Medical Education?

    ERIC Educational Resources Information Center

    Pawade, Yogesh R.; Diwase, Dipti S.

    2016-01-01

    Item analysis of Multiple Choice Questions (MCQs) is the process of collecting, summarizing and utilizing information from students' responses to evaluate the quality of test items. Difficulty Index (p-value), Discrimination Index (DI) and Distractor Efficiency (DE) are the parameters which help to evaluate the quality of MCQs used in an…
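    The item statistics named above follow standard classical-test-theory formulas; the sketch below computes the difficulty index and discrimination index for one hypothetical item, with response data invented for illustration.

    # Minimal sketch: classical item analysis for one multiple-choice item.
    # scores: total test scores; correct: 1 if the student answered this
    # item correctly. Data are invented for illustration.

    scores  = [55, 72, 48, 90, 66, 81, 39, 77, 60, 85, 44, 70]
    correct = [ 0,  1,  0,  1,  1,  1,  0,  1,  1,  1,  0,  1]

    # Difficulty index (p-value): proportion of all students answering correctly.
    p_value = sum(correct) / len(correct)

    # Discrimination index: difference in proportion correct between the
    # top and bottom scoring groups (27% of students each is conventional).
    ranked = sorted(range(len(scores)), key=lambda i: scores[i])
    k = max(1, round(0.27 * len(scores)))
    lower, upper = ranked[:k], ranked[-k:]
    di = sum(correct[i] for i in upper) / k - sum(correct[i] for i in lower) / k

    print(f"Difficulty index p = {p_value:.2f}, discrimination index DI = {di:.2f}")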

  5. Optimisation of logistics processes of energy grass collection

    NASA Astrophysics Data System (ADS)

    Bányai, Tamás.

    2010-05-01

    The collection of energy grass is a logistics-intensive process [1]. The optimal design and control of transportation and collection subprocesses is a critical point of the supply chain. To avoid ill-founded decisions made purely on the basis of experience and intuition, the scientifically sound way is to optimise and analyse collection processes using mathematical models and methods. Within the frame of this work, the author focuses on the optimisation possibilities of the collection processes, especially from the point of view of transportation and related warehousing operations. The optimisation methods developed in the literature [2] take into account harvesting processes, county-specific yields, transportation distances, erosion constraints, machinery specifications, and other key variables, but they do not consider multiple collection points or multi-level collection. The possible uses of energy grass are very wide (energetic use, biogas and bio-alcohol production, paper and textile industry, industrial fibre material, foddering purposes, biological soil protection [3], etc.), so not only a single-level but also a multi-level collection system with several collection and production facilities has to be taken into consideration. The input parameters of the optimisation problem are the following: total amount of energy grass to be harvested in each region; specific facility costs of collection, warehousing, and production units; specific costs of transportation resources; pre-scheduling of the harvesting process; specific transportation and warehousing costs; and pre-scheduling of the processing of energy grass at each facility (exclusive warehousing). The model makes the following assumptions: (1) cooperative relations exist among processing and production facilities; (2) capacity constraints are not ignored; (3) the cost function of transportation is non-linear; (4) the drivers' conditions are ignored. The objective function of the optimisation is the maximisation of profit, i.e., the maximisation of the difference between revenue and cost; it trades off the income of the assigned transportation demands against the logistics costs. The constraints are the following: (1) the free capacity of the assigned transportation resource is no less than the requested capacity of the transportation demand; (2) the calculated arrival time of the transportation resource at the harvesting place is not later than the requested arrival time; (3) the calculated arrival time of the transportation demand at the processing and production facility is not later than the requested arrival time; (4) each transportation demand is assigned to exactly one transportation resource. The decision variables of the optimisation problem are the set of scheduling variables and the assignment of resources to transportation demands. The evaluation parameters of the optimised system are the following: total costs of the collection process; utilisation of transportation resources and warehouses; and efficiency of production and/or processing facilities. The multidimensional heuristic optimisation method is based on a genetic algorithm, while the routing sequence is optimised with an ant colony algorithm. The optimal routes are calculated with the aid of the ant colony algorithm as a subroutine of the global optimisation method, and the optimal assignment is given by the genetic algorithm. An important part of the mathematical method is the sensitivity analysis of the objective function, which shows the influence of the different input parameters. Acknowledgements: This research was implemented within the frame of the project entitled "Development and operation of the Technology and Knowledge Transfer Centre of the University of Miskolc", with support from the European Union and co-funding of the European Social Fund. References [1] P. R. Daniel: The Economics of Harvesting and Transporting Corn Stover for Conversion to Fuel Ethanol: A Case Study for Minnesota. University of Minnesota, Department of Applied Economics, 2006. http://ideas.repec.org/p/ags/umaesp/14213.html [2] T. G. Douglas, J. Brendan, D. Erin & V.-D. Becca: Energy and Chemicals from Native Grasses: Production, Transportation and Processing Technologies Considered in the Northern Great Plains. University of Minnesota, Department of Applied Economics, 2006. http://ideas.repec.org/p/ags/umaesp/13838.html [3] Homepage of energy grass: www.energiafu.hu
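    A heavily simplified sketch of the assignment layer is given below: a genetic algorithm searches over demand-to-resource assignments to maximise profit, with a toy non-linear cost function standing in for the ant colony routing subroutine. All numbers and the cost model are invented for illustration.

    # Minimal sketch of the two-level idea: a genetic algorithm assigns
    # transportation demands to resources; a (greatly simplified)
    # routing-cost function stands in for the ant colony subroutine.
    import random

    random.seed(42)
    N_DEMANDS, N_RESOURCES = 8, 3
    revenue = [random.uniform(80, 120) for _ in range(N_DEMANDS)]

    def routing_cost(resource, demands):
        # Placeholder for the ant colony routing subroutine: a non-linear
        # cost that grows super-linearly with the number of stops.
        return 10.0 * len(demands) ** 1.3 + 5.0 * resource

    def profit(assignment):
        # assignment[i] = resource serving demand i.
        total = sum(revenue)
        for r in range(N_RESOURCES):
            served = [i for i, a in enumerate(assignment) if a == r]
            total -= routing_cost(r, served)
        return total

    def evolve(pop_size=30, generations=100, mutation_rate=0.1):
        pop = [[random.randrange(N_RESOURCES) for _ in range(N_DEMANDS)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=profit, reverse=True)
            survivors = pop[: pop_size // 2]
            children = []
            while len(survivors) + len(children) < pop_size:
                a, b = random.sample(survivors, 2)
                cut = random.randrange(1, N_DEMANDS)      # one-point crossover
                child = a[:cut] + b[cut:]
                if random.random() < mutation_rate:       # point mutation
                    child[random.randrange(N_DEMANDS)] = random.randrange(N_RESOURCES)
                children.append(child)
            pop = survivors + children
        return max(pop, key=profit)

    best = evolve()
    print("best assignment:", best, "profit:", round(profit(best), 1))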

  6. Selected Physical, Chemical, and Biological Data for 30 Urbanizing Streams in the North Carolina Piedmont Ecoregion, 2002-2003

    USGS Publications Warehouse

    Giddings, E.M.; Moorman, Michelle; Cuffney, Thomas F.; McMahon, Gerard; Harned, Douglas A.

    2007-01-01

    This report provides summarized physical, chemical, and biological data collected during a study of the effects of urbanization on stream ecosystems as part of the U.S. Geological Survey's National Water-Quality Assessment study. The purpose of this study was to examine differences in biological, chemical, and physical characteristics of streams across a gradient of urban intensity. Thirty sites were selected along an urbanization gradient that represents conditions in the North Carolina Piedmont ecoregion, including the cities of Raleigh, Durham, Cary, Greensboro, Winston-Salem, High Point, Asheboro, and Oxford. Data collected included streamflow variability, stream temperature, instream chemistry, instream aquatic habitat, and collections of the algal, macroinvertebrate, and fish communities. In addition, ancillary data describing land use, socioeconomic conditions, and urban infrastructure were compiled for each basin using a geographic information system analysis. All data were processed and summarized for analytical use and are presented in downloadable data tables, along with the methods of data collection and processing.

  7. Body Mapping as a Youth Sexual Health Intervention and Data Collection Tool

    PubMed Central

    Lys, Candice; Gesink, Dionne; Strike, Carol; Larkin, June

    2018-01-01

    In this article, we describe and evaluate body mapping as (a) an arts-based activity within Fostering Open eXpression Among Youth (FOXY), an educational intervention targeting Northwest Territories (NWT) youth, and (b) a research data collection tool. Data included individual interviews with 41 female participants (aged 13–17 years) who attended FOXY body mapping workshops in six communities in 2013, field notes taken by the researcher during the workshops and interviews, and written reflections from seven FOXY facilitators on the body mapping process (from 2013 to 2016). Thematic analysis explored the utility of body mapping using a developmental evaluation methodology. The results show body mapping is an intervention tool that supports and encourages participant self-reflection, introspection, personal connectedness, and processing difficult emotions. Body mapping is also a data collection catalyst that enables trust and youth voice in research, reduces verbal communication barriers, and facilitates the collection of rich data regarding personal experiences. PMID:29303048

  8. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    PubMed

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
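    In miniature, the feature-based representation works as sketched below: each synthetic series is reduced to a small feature vector and matched to its nearest neighbour in feature space. The three features are illustrative stand-ins for the thousands of operations analysed in the paper.

    # Minimal sketch: represent time series by a small feature vector and
    # organize them by similarity in feature space.
    import numpy as np

    rng = np.random.default_rng(0)
    series = {
        "noise":  rng.normal(size=500),
        "trend":  np.linspace(0, 5, 500) + rng.normal(scale=0.1, size=500),
        "season": np.sin(np.linspace(0, 20 * np.pi, 500)),
        "ar1":    np.array([0.0] * 500),
    }
    for t in range(1, 500):  # simple AR(1) process
        series["ar1"][t] = 0.9 * series["ar1"][t - 1] + rng.normal()

    def features(x):
        """Three illustrative features: spread, lag-1 autocorrelation, skewness."""
        mu, sd = x.mean(), x.std()
        z = (x - mu) / sd
        ac1 = float(np.corrcoef(z[:-1], z[1:])[0, 1])
        skew = float(np.mean(z ** 3))
        return np.array([sd, ac1, skew])

    names = list(series)
    F = np.array([features(series[n]) for n in names])
    F = (F - F.mean(axis=0)) / F.std(axis=0)          # normalize each feature

    for i, n in enumerate(names):
        d = np.linalg.norm(F - F[i], axis=1)
        d[i] = np.inf                                  # exclude self-match
        print(f"{n}: nearest neighbour = {names[int(d.argmin())]}")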

  9. Highly comparative time-series analysis: the empirical structure of time series and their methods

    PubMed Central

    Fulcher, Ben D.; Little, Max A.; Jones, Nick S.

    2013-01-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344

  10. Method for automatically evaluating a transition from a batch manufacturing technique to a lean manufacturing technique

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2003-09-30

    A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
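    A tiny sketch of the first method's flow (compile per-step data, then roll them up into process metrics) follows; the step data and the particular metrics chosen are invented for illustration and are not taken from the patent.

    # Minimal sketch: compile per-step data and roll them up into process
    # metrics for a batch-vs-lean comparison. Step data and metric
    # definitions are invented for illustration.

    process_steps = [
        # name, cycle time (min), queue time (min), defect rate
        {"name": "stamping", "cycle": 2.0, "queue": 45.0, "defects": 0.02},
        {"name": "welding",  "cycle": 5.0, "queue": 30.0, "defects": 0.04},
        {"name": "paint",    "cycle": 3.0, "queue": 60.0, "defects": 0.01},
    ]

    def process_metrics(steps):
        total_cycle = sum(s["cycle"] for s in steps)
        total_lead = sum(s["cycle"] + s["queue"] for s in steps)
        # Value-added ratio: time spent transforming product vs. total lead time.
        value_added_ratio = total_cycle / total_lead
        # Rolled throughput yield: probability a unit passes every step.
        rty = 1.0
        for s in steps:
            rty *= 1.0 - s["defects"]
        return {"lead_time_min": total_lead,
                "value_added_ratio": round(value_added_ratio, 3),
                "rolled_throughput_yield": round(rty, 3)}

    print(process_metrics(process_steps))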

  11. Performance Analysis of GYRO: A Tool Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worley, P.; Roth, P.; Candy, J.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  12. Heavy physical work under time pressure: the garbage collection service--a case study.

    PubMed

    Camada, Ilza Mitsuko de Oliveira; Pataro, Silvana Maria Santos; Fernandes, Rita de Cássia Pereira

    2012-01-01

    The increased generation of garbage has become a problem in large cities, with greater demand for collection services. The collector is subjected to a high workload. This study describes the work in a garbage collection service, highlighting the demands of time that result in physical and psychosocial loads on collectors. Ergonomic Work Analysis (EWA) - a method focused on the study of work in real situations - was used. Initially, technical visits, global observations, and unstructured interviews with different subjects of a garbage collection company were conducted. In the following step, systematic observations of workers from two garbage collection teams were accompanied by interviews conducted during task execution, inquiring about the actions taken; interviews about the actions conducted after the tasks were completed; and photographic records and audiovisual recordings. Contradictions between the prescribed work and the activities (actual work) were identified, as well as the variability present in this process and the strategies adopted by these workers to regulate the workload. It was concluded that insufficient means and the organizational structure of management give rise to a situation in which the collection process is maintained at the expense of over-taxing these workers, both physically and psychosocially.

  13. Molecular diversity management strategies for building and enhancement of diverse and focused lead discovery compound screening collections.

    PubMed

    Schuffenhauer, A; Popov, M; Schopfer, U; Acklin, P; Stanek, J; Jacoby, E

    2004-12-01

    This publication describes processes for the selection of chemical compounds for the building of a high-throughput screening (HTS) collection for drug discovery, using the currently implemented process in the Discovery Technologies Unit of the Novartis Institute for Biomedical Research, Basel, Switzerland, as reference. More generally, the currently existing compound acquisition models and practices are discussed. Our informatics-, chemistry-, and biology-driven compound selection consists of two steps: 1) The individual compounds are filtered and grouped into three priority classes on the basis of their individual structural properties. Substructure filters are used to eliminate or penalize compounds with unwanted structural properties. The similarity of the structures to reference ligands of the main proven druggable target families is computed, and drug-similar compounds are prioritized for the following diversity analysis. 2) The compounds are compared to the archive compounds and a diversity analysis is performed. This is done separately for the prioritized, regular, and penalized compounds with an increasingly stringent dissimilarity criterion. The process includes collecting vendor catalogues and monitoring the availability of samples, together with the selection and purchase decision points. The development of a corporate vendor catalogue database is described. In addition to the selection methods on a per-single-molecule basis, selection criteria for scaffold and combinatorial chemistry projects in collaboration with compound vendors are discussed.
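    The two-step selection (substructure filtering, then similarity-based prioritization) can be sketched with the RDKit toolkit; the SMARTS filters, reference ligand, catalogue, and similarity threshold below are placeholders chosen for illustration, not the actual rules used at Novartis.

    # Minimal sketch of the two-step selection: (1) flag compounds carrying
    # unwanted substructures, (2) prioritize compounds similar to a known
    # reference ligand. Patterns and thresholds are placeholders.
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem

    unwanted = [Chem.MolFromSmarts(s) for s in ["[N+](=O)[O-]", "C(=O)Cl"]]  # e.g. nitro, acid chloride
    reference = Chem.MolFromSmiles("CC(=O)Nc1ccc(O)cc1")  # acetaminophen as a toy reference ligand
    ref_fp = AllChem.GetMorganFingerprintAsBitVect(reference, 2, nBits=2048)

    catalogue = ["CC(=O)Nc1ccc(OC)cc1", "O=[N+]([O-])c1ccc(O)cc1", "c1ccccc1CCN"]

    for smiles in catalogue:
        mol = Chem.MolFromSmiles(smiles)
        if any(mol.HasSubstructMatch(patt) for patt in unwanted):
            print(f"{smiles}: penalized (unwanted substructure)")
            continue
        fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
        sim = DataStructs.TanimotoSimilarity(ref_fp, fp)
        tier = "prioritized" if sim >= 0.4 else "regular"
        print(f"{smiles}: {tier} (Tanimoto to reference = {sim:.2f})")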

  14. Proximate composition of poultry processing wastewater particulate matter from broiler slaughter plants.

    PubMed

    Kiepper, B H; Merka, W C; Fletcher, D L

    2008-08-01

    An experiment was conducted to compare the proximate composition of particulate matter recovered from poultry processing wastewater (PPW) generated by broiler slaughter plants. Poultry processing wastewater is the cumulative wastewater stream generated during the processing of poultry following primary and secondary physical screening (typically to 500 μm) that removes gross offal. Composite samples of PPW from 3 broiler slaughter plants (southeast United States) were collected over 8 consecutive weeks. All 3 broiler slaughter plants process young chickens with an average live weight of 2.0 kg. At each plant, a single 72-L composite sample was collected using an automatic sampler programmed to collect 1 L of wastewater every 20 min for 24 h during one normal processing day each week. Each composite sample was thoroughly mixed, and 60 L was passed through a series of sieves (2.0 mm, 1.0 mm, 500 μm, and 53 μm). The amount of particulate solids collected on the 2.0 mm, 1.0 mm, and 500 μm sieves was insignificant. The solids recovered from the 53-μm sieve were subjected to proximate analysis to determine percent moisture, fat, protein, ash, and fiber. The average percentages of fat, protein, ash, and fiber for all samples on a dry-weight basis were 55.3, 27.1, 6.1, and 4.1, respectively. Fat made up over half of the dry-weight matter recovered, representing PPW particulate matter between 500 and 53 μm. Despite the variation in number of birds processed daily, further processing operations, and number and type of wastewater screens utilized, there were no significant differences in percentage of fat and fiber between the slaughter plants. There were significant differences in percent protein and ash between the slaughter plants.

  15. Analysis of Leaf Area Index and Fraction of PAR Absorbed by Vegetation Products from the Terra MODIS Sensor: 2000-2005

    NASA Technical Reports Server (NTRS)

    Yang, Wenze; Huang, Dong; Tan, Bin; Stroeve, Julienne C.; Shabanov, Nikolay V.; Knyazikhin, Yuri; Nemani, Ramakrishna R.; Myneni, Ranga B.

    2006-01-01

    The analysis of two years of Collection 3 and five years of Collection 4 Terra Moderate Resolution Imaging Spectroradiometer (MODIS) Leaf Area Index (LAI) and Fraction of Photosynthetically Active Radiation (FPAR) data sets is presented in this article with the goal of understanding product quality with respect to version (Collection 3 versus 4), algorithm (main versus backup), snow (snow-free versus snow on the ground), and cloud (cloud-free versus cloudy) conditions. Retrievals from the main radiative transfer algorithm increased from 55% in Collection 3 to 67% in Collection 4 due to algorithm refinements and improved inputs. Anomalously high LAI/FPAR values observed in Collection 3 product in some vegetation types were corrected in Collection 4. The problem of reflectance saturation and too few main algorithm retrievals in broadleaf forests persisted in Collection 4. The spurious seasonality in needleleaf LAI/FPAR fields was traced to fewer reliable input data and retrievals during the boreal winter period. About 97% of the snow covered pixels were processed by the backup Normalized Difference Vegetation Index-based algorithm. Similarly, a majority of retrievals under cloudy conditions were obtained from the backup algorithm. For these reasons, the users are advised to consult the quality flags accompanying the LAI and FPAR product.

  16. The Defense Threat Reduction Agency's Technical Nuclear Forensics Research and Development Program

    NASA Astrophysics Data System (ADS)

    Franks, J.

    2015-12-01

    The Defense Threat Reduction Agency (DTRA) Technical Nuclear Forensics (TNF) Research and Development (R&D) Program's overarching goal is to design, develop, demonstrate, and transition advanced technologies and methodologies that improve the interagency operational capability to provide forensics conclusions after the detonation of a nuclear device. This goal is attained through the execution of three focus areas covering the span of the TNF process to enable strategic decision-making (attribution): Nuclear Forensic Materials Exploitation - development of targeted technologies, methodologies, and tools enabling the timely collection, analysis, and interpretation of detonation materials; Prompt Nuclear Effects Exploitation - improvement of ground-based capabilities to collect prompt nuclear device outputs and effects data for rapid, complementary, and corroborative information; and Nuclear Forensics Device Characterization - development of a validated and verified capability to reverse model a nuclear device with high confidence from observables (e.g., prompt diagnostics, sample analysis, etc.) seen after an attack. This presentation will outline DTRA's TNF R&D strategy and current investments, with efforts focusing on: (1) introducing new technical data collection capabilities (e.g., ground-based prompt diagnostics sensor systems; innovative debris collection and analysis); (2) developing new TNF process paradigms and concepts of operations to decrease timelines and uncertainties, and increase results confidence; (3) enhanced validation and verification (V&V) of capabilities through technology evaluations and demonstrations; and (4) updated weapon output predictions to account for the modern threat environment. A key challenge to expanding these efforts to a global capability is the need for increased post-detonation TNF international cooperation, collaboration, and peer reviews.

  17. Process mining techniques: an application to time management

    NASA Astrophysics Data System (ADS)

    Khowaja, Ali Raza

    2018-04-01

    In any environment, people have to make sure that all of their work is completed within a given time and to the required quality. To realize the potential of process mining, one needs to understand the underlying processes in detail. Personal information and communication have always been prominent topics on the internet; in everyday life, information and communication tools relate to daily schedules, location analysis, environmental analysis and, more generally, social media applications. These systems make data available for data analysis through the event logs they generate, but also for process analysis that combines environmental and location information. Process mining can exploit these real-life processes with the help of event logs already available in datasets, whether user-censored or user-labeled. Such techniques can be used to redesign a user's workflow and to understand the underlying processes in more detail. To increase the quality of the processes that make up our daily lives, one should examine each process closely and, after analyzing it, make changes to obtain better results. In this study, we applied process mining techniques to a dataset of seven different subjects collected in Korea. The paper comments on the efficiency of the processes in the event logs with respect to time management.
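    A core building block of most process-mining analyses is the directly-follows graph mined from an event log; the sketch below derives one from a small invented log of daily activities. The paper's own toolchain is not specified, so this is a generic illustration of the technique.

    # Minimal sketch: mine a directly-follows graph from an event log.
    # Each trace is one day's sequence of activities; the log is invented.
    from collections import Counter
    from itertools import pairwise  # Python 3.10+

    event_log = [
        ["wake", "commute", "work", "lunch", "work", "gym", "sleep"],
        ["wake", "commute", "work", "lunch", "meeting", "work", "sleep"],
        ["wake", "gym", "commute", "work", "lunch", "work", "sleep"],
    ]

    # Count how often activity a is directly followed by activity b.
    dfg = Counter()
    for trace in event_log:
        dfg.update(pairwise(trace))

    for (a, b), count in dfg.most_common():
        print(f"{a} -> {b}: {count}")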

  18. 3D analysis of semiconductor devices: A combination of 3D imaging and 3D elemental analysis

    NASA Astrophysics Data System (ADS)

    Fu, Bianzhu; Gribelyuk, Michael A.

    2018-04-01

    3D analysis of semiconductor devices using a combination of scanning transmission electron microscopy (STEM) Z-contrast tomography and energy dispersive spectroscopy (EDS) elemental tomography is presented. 3D STEM Z-contrast tomography is useful in revealing the depth information of the sample. However, it suffers from contrast problems between materials with similar atomic numbers. Examples of EDS elemental tomography are presented using an automated EDS tomography system with batch data processing, which greatly reduces the data collection and processing time. 3D EDS elemental tomography reveals more in-depth information about the defect origin in semiconductor failure analysis. The influence of detector shadowing and X-rays absorption on the EDS tomography's result is also discussed.

  19. A collective case study of nursing students with learning disabilities.

    PubMed

    Kolanko, Kathrine M

    2003-01-01

    This collective case study described the meaning of being a nursing student with a learning disability and examined how baccalaureate nursing students with learning disabilities experienced various aspects of the nursing program. It also examined how their disabilities and previous educational and personal experiences influenced the meaning that they gave to their educational experiences. Seven nursing students were interviewed, completed a demographic data form, and submitted various artifacts (test scores, evaluation reports, and curriculum-based material) for document analysis. The researcher used Stake's model for collective case study research and analysis (1). Data analysis revealed five themes: 1) struggle, 2) learning how to learn with LD, 3) issues concerning time, 4) social support, and 5) personal stories. Theme clusters and individual variations were identified for each theme. Document analysis revealed that participants had average to above average intellectual functioning with an ability-achievement discrepancy among standardized test scores. Participants noted that direct instruction, structure, consistency, clear directions, organization, and a positive instructor attitude assisted learning. Anxiety, social isolation from peers, and limited time to process and complete work were problems faced by the participants.

  20. Evaluation of Saltzman and phenoldisulfonic acid methods for determining NO/sub x/ in engine exhaust gases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, R.H.; Calabro, D.S.

    1969-11-01

    The two methods normally used for the analysis of NO/sub x/ are the Saltzman and the phenoldisulfonic acid techniques. This paper describes an evaluation of these wet chemical methods to determine their practical application to engine exhaust gas analysis. Parameters considered for the Saltzman method included bubbler collection efficiency, NO to NO/sub 2/ conversion efficiency, the masking effect of other contaminants usually present in exhaust gases, and the time-temperature effect of these contaminants on stored developed solutions. Collection efficiency and the effects of contaminants were also considered for the phenoldisulfonic acid method. Test results indicated satisfactory collection and conversion efficiencies for the Saltzman method, but contaminants seriously affected the measurement accuracy, particularly if the developed solution was stored for a number of hours at room temperature before analysis. Storage at 32/sup 0/F minimized this effect. The standard procedure for the phenoldisulfonic acid method gave good results, but the process was found to be too time consuming for routine analysis and measured only total NO/sub x/. 3 references, 9 tables.

  1. Analysis of Long Bone and Vertebral Failure Patterns.

    DTIC Science & Technology

    1982-09-30

    processes further supported the findings of the scanning electron microscopy studies. In the impacted animals, the cartilage surface was eroded... cartilage matrix. In the six years post-impaction group, the articular cartilage had converted to fibrocartilage instead of normal hyaline cartilage. The... columns of four rhesus monkeys have been collected and are being processed for study with light microscopy and scanning electron microscopy. The baboon

  2. A Homegrown Design for Data Warehousing: A District Customizes Its Own Process for Generating Detailed Information about Students in Real Time

    ERIC Educational Resources Information Center

    Thompson, Terry J.; Gould, Karen J.

    2005-01-01

    In recent years the Metropolitan School District of Wayne Township in Indianapolis has been awash in data. In attempts to improve levels of student achievement, the authors collected all manner of statistical details about students and schools and attempted to perform data analysis as part of the school improvement process. The authors were never…

  3. Sensemaking: a driving force behind the integration of professional practices.

    PubMed

    Sylvain, Chantal; Lamothe, Lise

    2012-01-01

    There has been considerable effort in recent years to link and integrate professional services more closely for patients with comorbidities. However, difficulties persist, especially at the clinical level. This study aims to shed light on these difficulties by examining the process of sensemaking in professionals directly involved in this integration. The authors conducted an eight-year longitudinal case study of an organization specializing in mental health and substance abuse. Different data collection methods were used, including 34 interviews conducted between 2003 and 2009, observations and document analysis. The authors performed a qualitative analysis of the data using a processual perspective. This paper provides empirical insights about the nature of the sensemaking process in which professionals collectively participate and the effects of this process on the evolution of integrated services. It suggests that the development of integrated practices results from an evolutional and collective process of constructing meanings that is rooted in the work activities of the professionals involved. By drawing attention to the capacity of professionals to shape the projects they are implementing, this study questions the capacity of managers to actually manage such a process. In order to obtain the expected benefits of integration projects, such emergent dynamics must first be recognized and then supported. Only then can thought be given to mastering them. The fact that this is a single case study is not a limitation per se, although it does raise the issue of the transferability of results. Replicating the study in other contexts would verify the applicability of the authors' conclusions. This study offers a fresh perspective on the difficulties generally encountered at the clinical level when trying to integrate services. It makes a significant contribution to work on the dynamics of sensemaking in organizational life.

  4. Mobile Monitoring Data Processing & Analysis Strategies

    EPA Science Inventory

    The development of portable, high-time-resolution instruments for measuring the concentrations of a variety of air pollutants has made it possible to collect data while in motion. This strategy, known as mobile monitoring, involves mounting air sensors on a variety of different pla...

  5. Image processing, analysis, and management tools for gusset plate connections in steel truss bridges.

    DOT National Transportation Integrated Search

    2016-10-01

    This report details the research undertaken and software tools that were developed that enable digital : images of gusset plates to be converted into orthophotos, establish physical dimensions, collect : geometric information from them, and conduct s...

  6. Preparing Laboratory and Real-World EEG Data for Large-Scale Analysis: A Containerized Approach

    PubMed Central

    Bigdely-Shamlo, Nima; Makeig, Scott; Robbins, Kay A.

    2016-01-01

    Large-scale analysis of EEG and other physiological measures promises new insights into brain processes and more accurate and robust brain–computer interface models. However, the absence of standardized vocabularies for annotating events in a machine understandable manner, the welter of collection-specific data organizations, the difficulty in moving data across processing platforms, and the unavailability of agreed-upon standards for preprocessing have prevented large-scale analyses of EEG. Here we describe a “containerized” approach and freely available tools we have developed to facilitate the process of annotating, packaging, and preprocessing EEG data collections to enable data sharing, archiving, large-scale machine learning/data mining and (meta-)analysis. The EEG Study Schema (ESS) comprises three data “Levels,” each with its own XML-document schema and file/folder convention, plus a standardized (PREP) pipeline to move raw (Data Level 1) data to a basic preprocessed state (Data Level 2) suitable for application of a large class of EEG analysis methods. Researchers can ship a study as a single unit and operate on its data using a standardized interface. ESS does not require a central database and provides all the metadata necessary to execute a wide variety of EEG processing pipelines. The primary focus of ESS is automated in-depth analysis and meta-analysis of EEG studies. However, ESS can also encapsulate meta-information for other modalities, such as eye tracking, that are increasingly used in both laboratory and real-world neuroimaging. ESS schema and tools are freely available at www.eegstudy.org and a central catalog of over 850 GB of existing data in ESS format is available at studycatalog.org. These tools and resources are part of a larger effort to enable data sharing at sufficient scale for researchers to engage in truly large-scale EEG analysis and data mining (BigEEG.org). PMID:27014048
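    A minimal sketch of consuming such a containerized study is given below; note that the file name and XML element names are hypothetical stand-ins invented for illustration, not the actual ESS schema vocabulary.

    # Minimal sketch: read a study-level XML description from a packaged
    # study folder and enumerate its recordings. File and element names
    # are hypothetical, not the actual ESS schema.
    import xml.etree.ElementTree as ET
    from pathlib import Path

    def list_recordings(study_dir):
        """Yield (session id, recording file) pairs from a study description."""
        tree = ET.parse(Path(study_dir) / "study_description.xml")  # hypothetical file name
        for session in tree.getroot().iter("session"):              # hypothetical element
            sid = session.get("id")
            for rec in session.iter("recording"):
                yield sid, rec.get("file")

    # Usage (assuming such a folder exists):
    # for sid, f in list_recordings("my_packaged_study"):
    #     print(sid, f)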

  7. Methodological challenges collecting parent phone-call healthcare utilization data.

    PubMed

    Moreau, Paula; Crawford, Sybil; Sullivan-Bolyai, Susan

    2016-02-01

    Recommendations by the National Institute of Nursing Research and other groups have strongly encouraged nurses to pay greater attention to cost-effectiveness analysis when conducting research. Given the increasing prominence of translational science and comparative effectiveness research, cost-effectiveness analysis has become a basic tool in determining intervention value in research. Tracking phone-call communication (number of calls and context) with cross-checks between parents and healthcare providers is an example of this type of healthcare utilization data collection. This article identifies some methodological challenges that have emerged in the process of collecting this type of data in a randomized controlled trial: Parent education Through Simulation-Diabetes (PETS-D). We also describe ways in which those challenges have been addressed with comparison data results, and make recommendations for future research. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Massive violent death and contested national mourning in post-authoritarian Chile and Argentina: a sociocultural application of the dual process model.

    PubMed

    Robben, Antonius C G M

    2014-01-01

    This article uses the dual process model (DPM) in an analysis of the national mourning of the tens of thousands of disappeared in Chile and Argentina by adapting the model from the individual to the collective level, where society as a whole is bereaved. Perpetrators are also involved in the national mourning process as members of a bereaved society. This article aims to (a) demonstrate the DPM's significance for the analysis of national mourning in post-conflict societies and (b) explain oscillations between loss orientation and restoration orientation in coping with massive losses that seem contradictory from a grief work perspective.

  9. Qualitative Analysis for Maintenance Process Assessment

    NASA Technical Reports Server (NTRS)

    Brand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor

    1996-01-01

    In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.

  10. Characterization Data Package for Containerized Sludge Samples Collected from Engineered Container SCS-CON-210

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fountain, Matthew S.; Fiskum, Sandra K.; Baldwin, David L.

    This data package contains the K Basin sludge characterization results obtained by Pacific Northwest National Laboratory during processing and analysis of four sludge core samples collected from Engineered Container SCS-CON-210 in 2010 as requested by CH2M Hill Plateau Remediation Company. Sample processing requirements, analytes of interest, detection limits, and quality control sample requirements are defined in KBC-33786, Rev. 2. The core processing scope included reconstitution of a sludge core sample distributed among four to six 4-L polypropylene bottles into a single container. The reconstituted core sample was then mixed and subsampled to support a variety of characterization activities. Additional core sludge subsamples were combined to prepare a container composite. The container composite was fractionated by wet sieving through a 2,000 micron mesh and a 500-micron mesh sieve. Each sieve fraction was sampled to support a suite of analyses. The core composite analysis scope included density determination, radioisotope analysis, and metals analysis, including the Waste Isolation Pilot Plant Hazardous Waste Facility Permit metals (with the exception of mercury). The container composite analysis included most of the core composite analysis scope plus particle size distribution, particle density, rheology, and crystalline phase identification. A summary of the received samples, core sample reconstitution and subsampling activities, container composite preparation and subsampling activities, physical properties, and analytical results are presented. Supporting data and documentation are provided in the appendices. There were no cases of sample or data loss and all of the available samples and data are reported as required by the Quality Assurance Project Plan/Sampling and Analysis Plan.

  11. Analytic Steering: Inserting Context into the Information Dialog

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bohn, Shawn J.; Calapristi, Augustin J.; Brown, Shyretha D.

    2011-10-23

    An analyst's intrinsic domain knowledge is a primary asset in almost any analysis task. Unstructured text analysis systems that apply unsupervised content analysis approaches can be more effective if they can leverage this domain knowledge in a manner that augments the information discovery process without obfuscating new or unexpected content. Current unsupervised approaches rely upon the prowess of the analyst to submit the right queries or observe generalized document and term relationships from ranked or visual results. We propose a new approach which allows the user to control or steer the analytic view within the unsupervised space. This process is controlled through the data characterization process via user-supplied context in the form of a collection of key terms. We show that steering with an appropriate choice of key terms can provide better relevance to the analytic domain and still enable the analyst to uncover unexpected relationships; this paper discusses cases where various analytic steering approaches can provide enhanced analysis results and cases where analytic steering can have a negative impact on the analysis process.

  12. First LHCb measurement with data from the LHC Run 2

    NASA Astrophysics Data System (ADS)

    Anderlini, L.; Amerio, S.

    2017-01-01

    LHCb has recently introduced a novel real-time detector alignment and calibration strategy for the Run 2. Data collected at the start of each LHC fill are processed in few minutes and used to update the alignment. On the other hand, the calibration constants will be evaluated for each run of data taking. An increase in the CPU and disk capacity of the event filter farm, combined with improvements to the reconstruction software, allow for efficient, exclusive selections already in the first stage of the High Level Trigger (HLT1), while the second stage, HLT2, performs complete, offline-quality, event reconstruction. In Run 2, LHCb will collect the largest data sample of charm mesons ever recorded. Novel data processing and analysis techniques are required to maximise the physics potential of this data sample with the available computing resources, taking into account data preservation constraints. In this write-up, we describe the full analysis chain used to obtain important results analysing the data collected in proton-proton collisions in 2015, such as the J/ψ and open charm production cross-sections, and consider the further steps required to obtain real-time results after the LHCb upgrade.

  13. Hydrologic Process-oriented Optimization of Electrical Resistivity Tomography

    NASA Astrophysics Data System (ADS)

    Hinnell, A.; Bechtold, M.; Ferre, T. A.; van der Kruk, J.

    2010-12-01

    Electrical resistivity tomography (ERT) is commonly used in hydrologic investigations. Advances in joint and coupled hydrogeophysical inversion have enhanced the quantitative use of ERT to construct and condition hydrologic models (i.e. identify hydrologic structure and estimate hydrologic parameters). However the selection of which electrical resistivity data to collect and use is often determined by a combination of data requirements for geophysical analysis, intuition on the part of the hydrogeophysicist and logistical constraints of the laboratory or field site. One of the advantages of coupled hydrogeophysical inversion is the direct link between the hydrologic model and the individual geophysical data used to condition the model. That is, there is no requirement to collect geophysical data suitable for independent geophysical inversion. The geophysical measurements collected can be optimized for estimation of hydrologic model parameters rather than to develop a geophysical model. Using a synthetic model of drip irrigation we evaluate the value of individual resistivity measurements to describe the soil hydraulic properties and then use this information to build a data set optimized for characterizing hydrologic processes. We then compare the information content in the optimized data set with the information content in a data set optimized using a Jacobian sensitivity analysis.
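    The value of individual measurements for parameter estimation can be scored with a Jacobian-based sensitivity criterion, sketched below on a generic stand-in forward model; this is an illustration of the general idea, not the study's coupled hydrologic-ERT model.

    # Minimal sketch: rank candidate measurements by sensitivity to model
    # parameters via a finite-difference Jacobian, then keep the most
    # informative ones. The forward model is a generic stand-in.
    import numpy as np

    def forward_model(params, design):
        """Toy forward model: predicted response for each candidate
        measurement geometry in `design`."""
        k, theta = params
        return k * np.exp(-theta * design) + 1.0 / (1.0 + design)

    params0 = np.array([100.0, 0.5])
    design = np.linspace(0.1, 5.0, 40)        # 40 candidate measurements

    # Finite-difference Jacobian: d(prediction_i) / d(param_j).
    eps = 1e-6
    y0 = forward_model(params0, design)
    J = np.empty((design.size, params0.size))
    for j in range(params0.size):
        p = params0.copy()
        p[j] += eps
        J[:, j] = (forward_model(p, design) - y0) / eps

    # Score each candidate by the norm of its Jacobian row (its combined
    # sensitivity to all parameters) and keep the top ten.
    scores = np.linalg.norm(J, axis=1)
    best = np.argsort(scores)[::-1][:10]
    print("most informative candidate indices:", sorted(best.tolist()))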

  14. Human Milk Fatty Acid Composition: Comparison of Novel Dried Milk Spot Versus Standard Liquid Extraction Methods.

    PubMed

    Rudolph, Michael C; Young, Bridget E; Jackson, Kristina Harris; Krebs, Nancy F; Harris, William S; MacLean, Paul S

    2016-12-01

    Accurate assessment of the long chain polyunsaturated fatty acid (LC-PUFA) content of human milk (HM) provides a powerful means to evaluate the FA nutrient status of breastfed infants. The conventional standard for FA composition analysis of HM is liquid extraction, trans-methylation, and analyte detection resolved by gas chromatography. This standard approach requires fresh or frozen samples, storage in deep freeze, organic solvents, and specialized equipment for processing and analysis. Further, HM collection is often impractical for many studies in the free-living environment, particularly for studies in developing countries. In the present study, we compare a novel and more practical approach to sample collection and processing that involves spotting and drying ~50 μL of HM on a specialized paper, stored and transported at ambient temperature until analysis. Deming regression indicated the two methods aligned very well for all LC-PUFA and the abundant HM FA. Additionally, strong correlations (r > 0.85) were observed for DHA, ARA, EPA, linoleic (LA), and alpha-linolenic acids (ALA), which are of particular interest to the health of the developing infant. Taken together, our data suggest this more practical and inexpensive method of collection, storage, and transport of HM samples could dramatically facilitate studies of HM, as well as understanding of how its lipid composition influences human health and development.
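    Deming regression, unlike ordinary least squares, allows for measurement error in both methods being compared; below is a sketch of the standard closed-form estimator (with the error-variance ratio λ taken as 1) applied to invented paired data, not the study's measurements.

    # Minimal sketch: Deming regression comparing two measurement methods,
    # assuming equal error variances (lam = 1). Paired data are invented.
    import numpy as np

    def deming(x, y, lam=1.0):
        """Closed-form Deming regression slope and intercept."""
        xm, ym = x.mean(), y.mean()
        sxx = np.sum((x - xm) ** 2)
        syy = np.sum((y - ym) ** 2)
        sxy = np.sum((x - xm) * (y - ym))
        slope = (syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2
                 + 4 * lam * sxy ** 2)) / (2 * sxy)
        return slope, ym - slope * xm

    rng = np.random.default_rng(1)
    truth = rng.uniform(0.1, 2.0, 30)                   # e.g. true DHA, % of total FA
    liquid = truth + rng.normal(scale=0.05, size=30)    # standard extraction method
    dried = 0.97 * truth + 0.01 + rng.normal(scale=0.05, size=30)  # dried-spot method

    slope, intercept = deming(liquid, dried)
    r = np.corrcoef(liquid, dried)[0, 1]
    print(f"slope = {slope:.3f}, intercept = {intercept:.3f}, r = {r:.3f}")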

  15. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorensek, M.; Hamm, L.; Garcia, H.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  16. Flat-plate solar-array project. Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The engineering design, fabrication, assembly, operation, economic analysis, and process support R&D for an Experimental Process System Development Unit (EPSDU) are reported. About 95% of the purchased equipment has been received and will be reshipped to the West Coast location. The Data Collection System is completed. In the area of melting/consolidation, the system using silicon powder transfer demonstrated melting and shotting on a pseudocontinuous basis. It is proposed to continue the very promising fluid-bed work.

  17. Occupational styrene exposure for twelve product categories in the reinforced-plastics industry.

    PubMed

    Lemasters, G K; Carson, A; Samuels, S J

    1985-08-01

    Approximately 1500 occupational styrene exposure values from 28 reinforced-plastic manufacturers were collected retrospectively from companies and state and federal agencies. This report describes the major types of manufacturing processes within the reinforced-plastics industry and reports on the availability, collection and analysis of historical exposure information. Average exposure to styrene in most open-mold companies (24-82 ppm) was generally 2-3 times the exposure in press-mold companies (11-26 ppm). Manufacturers of smaller boats had mean styrene exposures of 82 ppm as compared to 37 ppm for yacht companies. There was considerable overlap in styrene exposure among job titles classified as directly exposed within open- and press-mold processing.

  18. Windshield splatter analysis with the Galaxy metagenomic pipeline

    PubMed Central

    Kosakovsky Pond, Sergei; Wadhawan, Samir; Chiaromonte, Francesca; Ananda, Guruprasad; Chung, Wen-Yu; Taylor, James; Nekrutenko, Anton

    2009-01-01

    How many species inhabit our immediate surroundings? A straightforward collection technique suitable for answering this question is known to anyone who has ever driven a car at highway speeds. The windshield of a moving vehicle is subjected to numerous insect strikes and can be used as a collection device for representative sampling. Unfortunately, the analysis of biological material collected in that manner, as with most metagenomic studies, proves to be rather demanding due to the large number of required tools and the considerable computational infrastructure. In this study, we use organic matter collected by a moving vehicle to design and test a comprehensive pipeline for phylogenetic profiling of metagenomic samples that includes all steps from processing and quality control of data generated by next-generation sequencing technologies to statistical analyses and data visualization. To the best of our knowledge, this is also the first publication that features a live online supplement providing access to the exact analyses and workflows used in the article. PMID:19819906

  19. A Case Study of Measuring Process Risk for Early Insights into Software Safety

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.

    2011-01-01

    In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that for 49-70% of the 154 hazardous conditions, software could cause the condition or was involved in its prevention. We also found that 12-17% of the 2013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.

  20. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands.

    PubMed

    Hagemann, Vera; Kluge, Annette

    2017-01-01

    Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already placed on individual team members. After introducing an idealized teamwork process model that complex problem solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study within complex situations are presented. The study focused on emergent states, such as collective orientation, cohesion, and trust, that influence action processes such as coordination and dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, simulating dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios were developed, each taking a maximum of 15 min, and all teams worked on both scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show no relationship between trust and either action processes or team performance; likewise, none was found for cohesion. Only the collective orientation of team members positively influences team performance in complex environments, mediated by action processes such as coordination within the team. The results are discussed in relation to previous empirical findings and to learning processes within the team, with a focus on feedback strategies.

  1. Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands

    PubMed Central

    Hagemann, Vera; Kluge, Annette

    2017-01-01

    Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already placed on individual team members. After introducing an idealized teamwork process model that complex problem solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study within complex situations are presented. The study focused on emergent states, such as collective orientation, cohesion, and trust, that influence action processes such as coordination and dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, simulating dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios were developed, each taking a maximum of 15 min, and all teams worked on both scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show no relationship between trust and either action processes or team performance; likewise, none was found for cohesion. Only the collective orientation of team members positively influences team performance in complex environments, mediated by action processes such as coordination within the team. The results are discussed in relation to previous empirical findings and to learning processes within the team, with a focus on feedback strategies. PMID:29033886

  2. Compact full-motion video hyperspectral cameras: development, image processing, and applications

    NASA Astrophysics Data System (ADS)

    Kanaev, A. V.

    2015-10-01

    The emergence of spectral pixel-level color filters has enabled the development of hyperspectral Full Motion Video (FMV) sensors operating in visible (EO) and infrared (IR) wavelengths. This new class of hyperspectral cameras opens broad possibilities for military and industrial use. Indeed, such cameras are able to classify materials as well as detect and track spectral signatures continuously in real time, while simultaneously providing an operator the benefit of enhanced-discrimination color video. Supporting these extensive capabilities requires significant computational processing of the collected spectral data. In general, two processing streams are envisioned for mosaic array cameras. The first is spectral computation, which provides essential spectral content analysis, e.g. detection or classification. The second is presentation of the video to an operator, offering the best display of the content for the task at hand, e.g. spatial resolution enhancement or color coding of the spectral analysis. These processing streams can be executed in parallel or can utilize each other's results. Spectral analysis algorithms have been developed extensively; however, demosaicking of more than three equally-sampled spectral bands has scarcely been explored. We present a unique approach to demosaicking based on multi-band super-resolution and show the trade-off between spatial resolution and spectral content. Using imagery collected with the developed 9-band SWIR camera, we demonstrate several of its concepts of operation, including detection and tracking. We also compare the demosaicking results to the results of multi-frame super-resolution as well as to combined multi-frame and multi-band processing.

  3. Visualizing human communication in business process simulations

    NASA Astrophysics Data System (ADS)

    Groehn, Matti; Jalkanen, Janne; Haho, Paeivi; Nieminen, Marko; Smeds, Riitta

    1999-03-01

    In this paper, a description of business process simulation is given. A crucial part of simulating business processes is the analysis of social contacts between the participants. We introduce a tool to collect log data and show how this log data can be effectively analyzed using two different kinds of methods: discussion flow charts and self-organizing maps. Discussion flow charts revealed the communication patterns, and self-organizing maps proved a very effective way of clustering the participants into development groups.
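
    To make the clustering step concrete, here is a minimal self-organizing map written directly in NumPy; the participant feature vectors are random stand-ins, and the example illustrates the technique in general rather than the authors' tool:

    ```python
    import numpy as np

    def train_som(data, grid=(4, 4), iters=2000, lr0=0.5, sigma0=1.5, seed=0):
        """Minimal self-organizing map: cluster participants by their
        communication-log feature vectors onto a small 2-D grid."""
        rng = np.random.default_rng(seed)
        n_units = grid[0] * grid[1]
        weights = rng.random((n_units, data.shape[1]))
        coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])])
        for t in range(iters):
            x = data[rng.integers(len(data))]
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
            frac = t / iters
            lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.1
            dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-dist2 / (2 * sigma ** 2))               # neighborhood function
            weights += lr * h[:, None] * (x - weights)
        return weights

    # Hypothetical features per participant: messages sent, received, topics touched, ...
    participants = np.random.rand(12, 5)
    weights = train_som(participants)
    clusters = [int(np.argmin(((weights - p) ** 2).sum(axis=1))) for p in participants]
    print(clusters)  # participants mapped to SOM units -> candidate development groups
    ```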

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew R. Kumjian; Giangrande, Scott E.; Mishra, Subashree

    Polarimetric radar observations are increasingly used to understand cloud microphysical processes, which is critical for improving their representation in cloud and climate models. In particular, there has been recent focus on improving representations of ice collection processes (e.g., aggregation, riming), as these influence precipitation rate, heating profiles, and ultimately cloud life cycles. However, distinguishing these processes using conventional polarimetric radar observations is difficult, as they produce similar fingerprints. This necessitates improved analysis techniques and integration of complementary data sources. The Midlatitude Continental Convective Clouds Experiment (MC3E) provided such an opportunity.

  5. Application of data mining techniques and data analysis methods to measure cancer morbidity and mortality data in a regional cancer registry: The case of the island of Crete, Greece.

    PubMed

    Varlamis, Iraklis; Apostolakis, Ioannis; Sifaki-Pistolla, Dimitra; Dey, Nilanjan; Georgoulias, Vassilios; Lionis, Christos

    2017-07-01

    Micro- or macro-level mapping of cancer statistics is a challenging task that requires long-term planning, prospective studies and continuous monitoring of all cancer cases. The objective of the current study is to present how cancer registry data could be processed using data mining techniques in order to improve the statistical analysis outcomes. Data were collected from the Cancer Registry of Crete in Greece (counties of Rethymno and Lasithi) for the period 1998-2004. Data collection was performed on paper forms and manually transcribed to a single data file, thus introducing errors and noise (e.g. missing and erroneous values, duplicate entries, etc.). Data were pre-processed and prepared for analysis using data mining tools and algorithms. Feature selection was applied to evaluate the contribution of each collected feature to predicting patients' survival. Several classifiers were trained and evaluated for their ability to predict survival of patients. Finally, statistical analysis of cancer morbidity and mortality rates in the two regions was performed in order to validate the initial findings. Several critical points in the process of data collection, preprocessing and analysis of cancer data were derived from the results, and a road-map for future population data studies was developed. In addition, increased morbidity rates were observed in the counties of Crete (Age Standardized Morbidity/Incidence Rates, ASIR = 396.45 ± 2.89 and 274.77 ± 2.48 for men and women, respectively) compared to European and world averages (ASIR = 281.6 and 207.3 for men and women in Europe and 203.8 and 165.1 at world level). Significant variation in cancer types between sexes and age groups was also observed (the ratio between deaths and reported cases for young patients, less than 34 years old, is 0.055, whereas the respective ratio for patients over 75 years old is 0.366). This study introduced a methodology for preprocessing and analyzing cancer data using a combination of data mining techniques that could be a useful tool for other researchers and for further enhancement of cancer registries. Copyright © 2017 Elsevier B.V. All rights reserved.
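
    The feature-selection and classifier-evaluation step described above can be sketched with scikit-learn. The data below are synthetic stand-ins for the preprocessed registry records, and the particular selector and classifiers are illustrative choices, not necessarily those used by the authors:

    ```python
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_selection import SelectKBest, mutual_info_classif
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.pipeline import make_pipeline

    # Stand-in for cleaned registry records (features) and survival labels
    X, y = make_classification(n_samples=400, n_features=20, n_informative=6,
                               random_state=0)

    classifiers = {
        "logistic": LogisticRegression(max_iter=1000),
        "random_forest": RandomForestClassifier(random_state=0),
        "naive_bayes": GaussianNB(),
    }
    for name, clf in classifiers.items():
        # Feature selection lives inside the pipeline so no test information leaks
        pipe = make_pipeline(SelectKBest(mutual_info_classif, k=8), clf)
        scores = cross_val_score(pipe, X, y, cv=5)
        print(f"{name}: accuracy {scores.mean():.3f} +/- {scores.std():.3f}")
    ```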

  6. Flexibility and utility of pre-processing methods in converting STXM setups for ptychography - Final Paper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fromm, Catherine

    2015-08-20

    Ptychography is an advanced diffraction-based imaging technique that can achieve resolution of 5 nm and below. It is done by scanning a sample through a beam of focused x-rays using discrete yet overlapping scan steps. Scattering data are collected on a CCD camera, and the phase of the scattered light is reconstructed with sophisticated iterative algorithms. Because the experimental setup is similar, ptychography setups can be created by retrofitting existing STXM beam lines with new hardware. The other challenge comes in the reconstruction of the collected scattering images: scattering data must be adjusted and packaged with experimental parameters to calibrate the reconstruction software. The necessary pre-processing of data prior to reconstruction is unique to each beamline setup, and even to the optical alignments used on a particular day. Pre-processing software must therefore be developed to be flexible and efficient in order to allow experimenters appropriate control and freedom in the analysis of their hard-won data. This paper describes the implementation of pre-processing software which successfully connects the data collection steps to the reconstruction steps, letting the user accomplish accurate and reliable ptychography.

  7. Attenuated total reflectance-FT-IR spectroscopy for gunshot residue analysis: potential for ammunition determination.

    PubMed

    Bueno, Justin; Sikirzhytski, Vitali; Lednev, Igor K

    2013-08-06

    The ability to link a suspect to a particular shooting incident is a principal task for many forensic investigators. Here, we attempt to achieve this goal by analysis of gunshot residue (GSR) through the use of attenuated total reflectance (ATR) Fourier transform infrared spectroscopy (FT-IR) combined with statistical analysis. The firearm discharge process is analogous to a complex chemical process; therefore, the products of this process (GSR) vary based upon numerous factors, including the specific combination of firearm and ammunition discharged. Differentiation of FT-IR data collected from GSR particles originating from three different firearm-ammunition combinations (0.38 in., 0.40 in., and 9 mm calibers) was achieved using projection to latent structures discriminant analysis (PLS-DA). The technique was cross-validated (leave-one-out), both internally and externally. External validation was achieved via assignment (caliber identification) of unknown FT-IR spectra from unknown GSR particles. The results demonstrate great potential for ATR-FT-IR spectroscopic analysis of GSR for forensic purposes.
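
    PLS-DA can be run as a PLS regression against one-hot class targets, with the predicted class taken as the largest response. A minimal sketch with synthetic spectra standing in for the FT-IR data (the number of components and all values are illustrative, not the study's settings):

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import LeaveOneOut

    # Stand-in for preprocessed ATR-FT-IR spectra of GSR particles, three calibers
    rng = np.random.default_rng(0)
    n_per_class, n_wavenumbers = 20, 300
    X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_wavenumbers))
                   for c in range(3)])
    y = np.repeat([0, 1, 2], n_per_class)   # 0.38 in., 0.40 in., 9 mm
    Y = np.eye(3)[y]                         # one-hot targets make PLS a classifier

    # Internal leave-one-out cross-validation
    correct = 0
    for train, test in LeaveOneOut().split(X):
        pls = PLSRegression(n_components=5).fit(X[train], Y[train])
        pred = np.argmax(pls.predict(X[test]), axis=1)
        correct += int(pred[0] == y[test][0])
    print(f"leave-one-out accuracy: {correct / len(X):.2f}")
    ```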

  8. Crime scene units: a look to the future

    NASA Astrophysics Data System (ADS)

    Baldwin, Hayden B.

    1999-02-01

    The scientific examination of physical evidence is well recognized as a critical element in conducting successful criminal investigations and prosecutions. The forensic science field is an ever-changing discipline. With the arrival of DNA, new processing techniques for latent prints, portable lasers, and electrostatic dust print lifters, the training of evidence technicians has become more important than ever. These scientific and technological breakthroughs have increased the possibility of collecting and analyzing physical evidence in ways that were never possible before. The problem arises with the collection of physical evidence from the crime scene, not with the analysis of the evidence. The need for specialized units for the processing of all crime scenes is imperative. These specialized units, called crime scene units, should be trained and equipped to handle all forms of crime scenes and would have the capability to professionally evaluate and collect pertinent physical evidence from them.

  9. Integrated Approach to Reduce Perinatal Adverse Events: Standardized Processes, Interdisciplinary Teamwork Training, and Performance Feedback.

    PubMed

    Riley, William; Begun, James W; Meredith, Les; Miller, Kristi K; Connolly, Kathy; Price, Rebecca; Muri, Janet H; McCullough, Mac; Davis, Stanley

    2016-12-01

    To improve safety practices and reduce adverse events in perinatal units of acute care hospitals. Primary data collected from perinatal units of 14 hospitals participating in the intervention between 2008 and 2012. Baseline secondary data collected from the same hospitals between 2006 and 2007. A prospective study involving 342,754 deliveries was conducted using a quality improvement collaborative that supported three primary interventions. Primary measures include adoption of three standardized care processes and four measures of outcomes. Chart audits were conducted to measure the implementation of standardized care processes. Outcome measures were collected and validated by the National Perinatal Information Center. The hospital perinatal units increased use of all three care processes, raising consolidated overall use from 38 to 81 percent between 2008 and 2012. The harms measured by the Adverse Outcome Index decreased 14 percent, and a run chart analysis revealed two special causes associated with the interventions. This study demonstrates the ability of hospital perinatal staff to implement efforts to reduce perinatal harm using a quality improvement collaborative. Findings help inform the relationship between the use of standardized care processes, teamwork training, and improved perinatal outcomes, and suggest that a multiplicity of integrated strategies, rather than a single intervention, may be essential to achieve high reliability. © Health Research and Educational Trust.

  10. Summary Report For The Analysis Of The Sludge Batch 7b (Macrobatch 9) DWPF Pour Stream Glass Sample For Canister S04023

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, F. C.

    2013-11-18

    In order to comply with the Defense Waste Processing Facility (DWPF) Waste Form Compliance Plan for Sludge Batch 7b, Savannah River National Laboratory (SRNL) personnel characterized the DWPF pour stream (PS) glass sample collected while filling canister S04023. This report summarizes the results of the compositional analysis for reportable oxides and radionuclides and the normalized Product Consistency Test (PCT) results. The PCT responses indicate that the DWPF produced glass that is significantly more durable than the Environmental Assessment glass.

  11. Maintenance Decision Support System: Pilot Study and Cost-Benefit Analysis (Phase 2.5)

    DOT National Transportation Integrated Search

    2014-07-01

    This project focused on several tasks: development of in-vehicle hardware that permits implementation of an MDSS, development of software to collect and process road and weather data, a cost-benefit study, and pilot-scale implementation. Two Automati...

  12. Structuring an Internal Evaluation Process.

    ERIC Educational Resources Information Center

    Gordon, Sheila C.; Heinemann, Harry N.

    1980-01-01

    The design of an internal program evaluation system requires (1) formulation of program, operational, and institutional objectives; (2) establishment of evaluation criteria; (3) choice of data collection and evaluation techniques; (4) analysis of results; and (5) integration of the system into the mainstream of operations. (SK)

  13. Mobile Monitoring Data Processing and Analysis Strategies

    EPA Science Inventory

    The development of portable, high-time resolution instruments for measuring the concentrations of a variety of air pollutants has made it possible to collect data while in motion. This strategy, known as mobile monitoring, involves mounting air sensors on variety of different pla...

  14. Maintenance Decision Support System : Pilot Study and Cost-Benefit Analysis (Phase 2)

    DOT National Transportation Integrated Search

    2014-07-01

    This project focused on several tasks: development of in-vehicle hardware that permits implementation of an MDSS, development of software to collect and process road and weather data, a cost-benefit study, and pilot-scale implementation. Two Automati...

  15. VOLATILE POLAR METABOLITES IN EXHALED BREATH CONDENSATE (EBC): COLLECTION AND ANALYSIS

    EPA Science Inventory

    Environmental exposures, individual activities, and disease states can perturb normal metabolic processes and be expressed as a change in the patterns of polar volatile organic compounds (PVOCs) present in biological fluids. We explore the measurement of volatile endogenous bioma...

  16. Study of teeth phosphorescence detection technique

    NASA Astrophysics Data System (ADS)

    Cai, De-Fang; Wang, Shui-ping; Yang, Zhen-jiang; An, Yuying; Huang, Li-Zi; Liang, Yan

    1995-05-01

    On the basis of research into and analysis of the optical properties of teeth, this paper introduces techniques to transform teeth phosphorescence excited by ultraviolet light into electric signals, along with the subsequent steps of data collection, analysis and processing. Also presented are methods to diagnose pulp vitality, decayed teeth and, especially, infant caries and pre-caries conditions. By measuring a tooth's temperature, other stomatic illnesses can be diagnosed.

  17. Processing and Probability Analysis of Pulsed Terahertz NDE of Corrosion under Shuttle Tile Data

    NASA Technical Reports Server (NTRS)

    Anastasi, Robert F.; Madaras, Eric I.; Seebo, Jeffrey P.; Ely, Thomas M.

    2009-01-01

    This paper examines data processing and probability analysis of pulsed terahertz NDE scans of corrosion defects under a Shuttle tile. Pulsed terahertz data collected from an aluminum plate with fabricated corrosion defects and covered with a Shuttle tile is presented. The corrosion defects imaged were fabricated by electrochemically etching areas of various diameter and depth in the plate. In this work, the aluminum plate echo signal is located in the terahertz time-of-flight data and a threshold is applied to produce a binary image of sample features. Feature location and area are examined and identified as corrosion through comparison with the known defect layout. The results are tabulated with hit, miss, or false call information for a probability of detection analysis that is used to identify an optimal processing threshold.
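
    The threshold-and-tabulate step maps naturally onto a few lines of NumPy/SciPy. The amplitude map and defect layout below are random stand-ins for the time-of-flight data, and the match tolerance and threshold sweep are illustrative values, not the study's:

    ```python
    import numpy as np
    from scipy import ndimage

    def detect_features(amplitude_map, threshold):
        """Threshold the plate-echo amplitude map and return the centroid and
        pixel area of each connected feature."""
        binary = amplitude_map < threshold      # corrosion weakens the plate echo
        labels, n = ndimage.label(binary)
        centroids = ndimage.center_of_mass(binary, labels, range(1, n + 1))
        areas = ndimage.sum(binary, labels, range(1, n + 1))
        return list(zip(centroids, areas))

    def score_against_layout(features, defects, tol=5.0):
        """Tabulate hits and misses for known defects; extra features are false calls."""
        hits = sum(any(np.hypot(cy - dy, cx - dx) < tol for (cy, cx), _ in features)
                   for dy, dx in defects)
        false_calls = max(len(features) - hits, 0)
        return hits, len(defects) - hits, false_calls

    amplitude = np.random.rand(64, 64)          # stand-in for time-of-flight data
    known_defects = [(16, 16), (40, 48)]        # fabricated defect centers
    for thr in np.linspace(0.02, 0.2, 10):      # sweep thresholds for the POD analysis
        feats = detect_features(amplitude, thr)
        print(thr, score_against_layout(feats, known_defects))
    ```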

  18. The evolution of process-based hydrologic models: historical challenges and the collective quest for physical realism

    NASA Astrophysics Data System (ADS)

    Clark, Martyn P.; Bierkens, Marc F. P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R. N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.

    2017-07-01

    The diversity in hydrologic models has historically led to great controversy on the correct approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this paper, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.

  19. The evolution of process-based hydrologic models: historical challenges and the collective quest for physical realism

    NASA Astrophysics Data System (ADS)

    Clark, M. P.; Nijssen, B.; Wood, A.; Mizukami, N.; Newman, A. J.

    2017-12-01

    The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this paper, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.

  20. Data processing and analysis with the autoPROC toolbox.

    PubMed

    Vonrhein, Clemens; Flensburg, Claus; Keller, Peter; Sharff, Andrew; Smart, Oliver; Paciorek, Wlodek; Womack, Thomas; Bricogne, Gérard

    2011-04-01

    A typical diffraction experiment will generate many images and data sets from different crystals in a very short time. This creates a challenge for the high-throughput operation of modern synchrotron beamlines as well as for the subsequent data processing. Novice users in particular may feel overwhelmed by the tables, plots and numbers that the different data-processing programs and software packages present to them. Here, some of the more common problems that a user has to deal with when processing a set of images that will finally make up a processed data set are shown, concentrating on difficulties that may often show up during the first steps along the path of turning the experiment (i.e. data collection) into a model (i.e. interpreted electron density). Difficulties such as unexpected crystal forms, issues in crystal handling and suboptimal choices of data-collection strategies can often be dealt with, or at least diagnosed, by analysing specific data characteristics during processing. In the end, one wants to distinguish problems over which one has no immediate control once the experiment is finished from problems that can be remedied a posteriori. A new software package, autoPROC, is also presented that combines third-party processing programs with new tools and an automated workflow script that is intended to provide users with both guidance and insight into the offline processing of data affected by the difficulties mentioned above, with particular emphasis on the automated treatment of multi-sweep data sets collected on multi-axis goniostats.

  1. Understanding public drug procurement in India: a comparative qualitative study of five Indian states

    PubMed Central

    Singh, Prabal Vikram; Tatambhotla, Anand; Kalvakuntla, Rohini; Chokshi, Maulik

    2013-01-01

    Objective: To perform an initial qualitative comparison of the different procurement models in India to frame questions for future research in this area; to capture the finer differences between the state models through 53 process and price parameters to determine their functional efficiencies. Design: Qualitative analysis is performed for the study. Five states, Tamil Nadu, Kerala, Odisha, Punjab and Maharashtra, were chosen to ensure heterogeneity in a number of factors such as procurement type (centralised, decentralised or mixed); autonomy of the procurement organisation; state of public health infrastructure; geography; and availability of data through the Right to Information Act (RTI). Data on procurement processes were collected through key informant analysis by way of semistructured interviews with leadership teams of procuring organisations. These process data were validated through interviews with field staff (stakeholders of district hospitals, taluk hospitals, community health centres and primary health centres) in each state. A total of 30 actors were interviewed across the five states. The data collected are analysed against 52 process and price parameters to determine the functional efficiency of the model. Results: The analysis indicated that autonomous procurement organisations were more efficient in relation to payments to suppliers, had relatively lower drug procurement prices and managed their inventory more scientifically. Conclusions: The authors highlight critical success factors that significantly influence the outcome of any procurement model. In a way, this study raises more questions and seeks the need for further research in this arena to aid policy makers. PMID:23388196

  2. Understanding public drug procurement in India: a comparative qualitative study of five Indian states.

    PubMed

    Singh, Prabal Vikram; Tatambhotla, Anand; Kalvakuntla, Rohini; Chokshi, Maulik

    2013-01-01

    To perform an initial qualitative comparison of the different procurement models in India to frame questions for future research in this area; to capture the finer differences between the state models through 53 process and price parameters to determine their functional efficiencies. Qualitative analysis is performed for the study. Five states: Tamil Nadu, Kerala, Odisha, Punjab and Maharashtra were chosen to ensure heterogeneity in a number of factors such as procurement type (centralised, decentralised or mixed); autonomy of the procurement organisation; state of public health infrastructure; geography and availability of data through Right to Information Act (RTI). Data on procurement processes were collected through key informant analysis by way of semistructured interviews with leadership teams of procuring organisations. These process data were validated through interviews with field staff (stakeholders of district hospitals, taluk hospitals, community health centres and primary health centres) in each state. A total of 30 actors were interviewed in all five states. The data collected are analysed against 52 process and price parameters to determine the functional efficiency of the model. The analysis indicated that autonomous procurement organisations were more efficient in relation to payments to suppliers, had relatively lower drug procurement prices and managed their inventory more scientifically. The authors highlight critical success factors that significantly influence the outcome of any procurement model. In a way, this study raises more questions and seeks the need for further research in this arena to aid policy makers.

  3. Advanced image collection, information extraction, and change detection in support of NN-20 broad area search and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petrie, G.M.; Perry, E.M.; Kirkham, R.R.

    1997-09-01

    This report describes the work performed at the Pacific Northwest National Laboratory (PNNL) for the U.S. Department of Energy's Office of Nonproliferation and National Security, Office of Research and Development (NN-20). The work supports the NN-20 Broad Area Search and Analysis, a program initiated by NN-20 to improve the detection and classification of undeclared weapons facilities. Ongoing PNNL research activities are described in three main components: image collection, information processing, and change analysis. The Multispectral Airborne Imaging System, which was developed to collect georeferenced imagery in the visible through infrared regions of the spectrum and flown on a light aircraft platform, will supply current land use conditions. The image information extraction software (dynamic clustering and end-member extraction) uses imagery, like the multispectral data collected by the PNNL multispectral system, to efficiently generate landcover information. The advanced change detection uses a priori (benchmark) information, current landcover conditions, and user-supplied rules to rank suspect areas by probable risk of undeclared facilities or proliferation activities. These components, both separately and combined, provide important tools for improving the detection of undeclared facilities.

  4. Appropriate Handling, Processing and Analysis of Blood Samples Is Essential to Avoid Oxidation of Vitamin C to Dehydroascorbic Acid

    PubMed Central

    Pullar, Juliet M.; Carr, Anitra C.

    2018-01-01

    Vitamin C (ascorbate) is the major water-soluble antioxidant in plasma and its oxidation to dehydroascorbic acid (DHA) has been proposed as a marker of oxidative stress in vivo. However, controversy exists in the literature around the amount of DHA detected in blood samples collected from various patient cohorts. In this study, we report on DHA concentrations in a selection of different clinical cohorts (diabetes, pneumonia, cancer, and critically ill). All clinical samples were collected into EDTA anticoagulant tubes and processed at 4 °C prior to storage at −80 °C for subsequent analysis by HPLC with electrochemical detection. We also investigated the effects of different handling and processing conditions on short-term and long-term ascorbate and DHA stability in vitro and in whole blood and plasma samples. These conditions included metal chelation, anticoagulants (EDTA and heparin), and processing temperatures (ice, 4 °C and room temperature). Analysis of our clinical cohorts indicated very low to negligible DHA concentrations. Samples exhibiting haemolysis contained significantly higher concentrations of DHA. Metal chelation inhibited oxidation of vitamin C in vitro, confirming the involvement of contaminating metal ions. Although EDTA is an effective metal chelator, complexes with transition metal ions are still redox active, thus its use as an anticoagulant can facilitate metal ion-dependent oxidation of vitamin C in whole blood and plasma. Handling and processing blood samples on ice (or at 4 °C) delayed oxidation of vitamin C by a number of hours. A review of the literature regarding DHA concentrations in clinical cohorts highlighted the fact that studies using colourimetric or fluorometric assays reported significantly higher concentrations of DHA compared to those using HPLC with electrochemical detection. In conclusion, careful handling and processing of samples, combined with appropriate analysis, is crucial for accurate determination of ascorbate and DHA in clinical samples. PMID:29439480

  5. On-line capacity-building program on "analysis of data" for medical educators in the South Asia region: a qualitative exploration of our experience.

    PubMed

    Dongre, A R; Chacko, T V; Banu, S; Bhandary, S; Sahasrabudhe, R A; Philip, S; Deshmukh, P R

    2010-11-01

    In medical education, using the World Wide Web is a new approach for building the capacity of faculty. However, there is little information available on medical education researchers' needs and their collective learning outcomes in such on-line environments. Hence, the present study attempted: 1) to identify the needs for capacity-building of fellows in a faculty development program on the topic of data analysis; and 2) to describe, analyze and understand the collective learning outcomes of the fellows during this need-based on-line session. The present research is based on quantitative (on-line survey for needs assessment) and qualitative (contents of e-mails exchanged in listserv discussion) data generated during the October 2009 Mentoring and Learning (M-L) Web discussion on the topic of data analysis. The data sources were shared e-mail responses during the process of planning and executing the M-L Web discussion. Content analysis was undertaken, and the categories of discussion were presented as a simple non-hierarchical typology representing the collective learning of the project fellows. We identified the types of learning needs on the topic 'Analysis of Data' to be addressed for faculty development in the field of education research. This need-based M-L Web discussion could then facilitate collective learning on such topics as basic concepts in statistics, tests of significance, Likert scale analysis, bivariate correlation, simple regression analysis and content analysis of qualitative data. Steps like identifying the learning needs for an on-line M-L Web discussion, addressing the immediate needs of learners and creating a flexible, reflective learning environment on the M-L Web facilitated the collective learning of the fellows on the topic of data analysis. Our outcomes can be useful in the design of on-line pedagogical strategies for supporting research in medical education.

  6. Next Generation Offline Approaches to Trace Gas-Phase Organic Compound Speciation: Sample Collection and Analysis

    NASA Astrophysics Data System (ADS)

    Sheu, R.; Marcotte, A.; Khare, P.; Ditto, J.; Charan, S.; Gentner, D. R.

    2017-12-01

    Intermediate-volatility and semi-volatile organic compounds (I/SVOCs) are major precursors to secondary organic aerosol, and contribute to tropospheric ozone formation. Their wide volatility range, chemical complexity, behavior in analytical systems, and trace concentrations present numerous hurdles to characterization. We present an integrated sampling-to-analysis system for the collection and offline analysis of trace gas-phase organic compounds with the goal of preserving and recovering analytes throughout sample collection, transport, storage, and thermal desorption for accurate analysis. Custom multi-bed adsorbent tubes are used to collect samples for offline analysis by advanced analytical detectors. The analytical instrumentation comprises an automated thermal desorption system that introduces analytes from the adsorbent tubes into a gas chromatograph, which is coupled with an electron ionization mass spectrometer (GC-EIMS) and other detectors. In order to optimize the collection and recovery for a wide range of analyte volatility and functionalization, we evaluated a variety of commercially-available materials, including Res-Sil beads, quartz wool, glass beads, Tenax TA, and silica gel. Key properties for optimization include inertness, versatile chemical capture, minimal affinity for water, and minimal artifacts or degradation byproducts; these properties were assessed with a diverse mix of traditionally-measured and functionalized analytes. Along with a focus on material selection, we provide recommendations spanning the entire sampling-and-analysis process to improve the accuracy of future comprehensive I/SVOC measurements, including oxygenated and other functionalized I/SVOCs. We demonstrate the performance of our system by providing results on speciated VOCs-SVOCs from indoor, outdoor, and chamber studies that establish the utility of our protocols and pave the way for precise laboratory characterization via a mix of detection methods.

  7. Novel infrastructure for sepsis biomarker research in critically ill neonates and children.

    PubMed

    Juskewitch, Justin E; Enders, Felicity T; Abraham, Roshini S; Huskins, W Charles

    2013-02-01

    Sepsis biomarker research requires an infrastructure to identify septic patients efficiently and to collect and store specimens properly. We developed a novel infrastructure to study biomarkers of sepsis in children. Patients in pediatric and neonatal intensive care units were enrolled prospectively; enrollment information was stored in a secure, remotely accessible database. Researchers were notified of electronic medical record (EMR) orders for blood cultures (a surrogate for a diagnostic evaluation of suspected sepsis) by a page triggered by the order. Staff confirmed patient enrollment and remotely submitted an EMR order for collection of study specimens simultaneous with the blood culture. Specimens were processed and stored by a mobile clinical research unit. Over 2 years, 2029 patients were admitted; 138 were enrolled. Staff received pages for 95% of blood cultures collected from enrolled patients. The median time between the blood culture order and collection was 34 minutes (range 9-241). Study specimens were collected simultaneously with 41 blood cultures. The median times between specimen collection and storage for flow cytometry and cytokine analysis were 33 minutes (range 0-82) and 52 minutes (range 28-98), respectively. This novel infrastructure facilitated prompt, proper collection and storage of specimens for sepsis biomarker analysis. © 2013 Wiley Periodicals, Inc.

  8. An epigenetically distinct breast cancer cell subpopulation promotes collective invasion

    PubMed Central

    Westcott, Jill M.; Prechtl, Amanda M.; Maine, Erin A.; Dang, Tuyen T.; Esparza, Matthew A.; Sun, Han; Zhou, Yunyun; Xie, Yang; Pearson, Gray W.

    2015-01-01

    Tumor cells can engage in a process called collective invasion, in which cohesive groups of cells invade through interstitial tissue. Here, we identified an epigenetically distinct subpopulation of breast tumor cells that have an enhanced capacity to collectively invade. Analysis of spheroid invasion in an organotypic culture system revealed that these “trailblazer” cells are capable of initiating collective invasion and promote non-trailblazer cell invasion, indicating a commensal relationship among subpopulations within heterogeneous tumors. Canonical mesenchymal markers were not sufficient to distinguish trailblazer cells from non-trailblazer cells, suggesting that defining the molecular underpinnings of the trailblazer phenotype could reveal collective invasion-specific mechanisms. Functional analysis determined that DOCK10, ITGA11, DAB2, PDGFRA, VASN, PPAP2B, and LPAR1 are highly expressed in trailblazer cells and required to initiate collective invasion, with DOCK10 essential for metastasis. In patients with triple-negative breast cancer, expression of these 7 genes correlated with poor outcome. Together, our results indicate that spontaneous conversion of the epigenetic state in a subpopulation of cells can promote a transition from in situ to invasive growth through induction of a cooperative form of collective invasion, and they suggest that therapeutic inhibition of trailblazer cell invasion may help prevent metastasis. PMID:25844900

  9. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) were analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records, and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous-liquid pipeline incidents in terms of accident dynamics and consequences.
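
    A first-pass automated screening of incident narratives of the kind described can be sketched with pandas. The records, keyword lists, and column names below are hypothetical stand-ins; in a real workflow the flagged records would go on to peer review, as in the study:

    ```python
    import pandas as pd

    # Hypothetical excerpt of PHMSA-style incident records
    incidents = pd.DataFrame({
        "id": [1, 2, 3, 4],
        "narrative": [
            "Pipeline ruptured after river flooding undermined the crossing.",
            "Corrosion leak found during routine inspection.",
            "Lightning strike ignited vapors at the pump station.",
            "Excavator struck the line during construction work.",
        ],
    })

    # Keyword dictionary mapping natural-hazard categories to trigger terms
    natural_hazards = {
        "flood": ["flood", "floodwater", "heavy rain"],
        "earthquake": ["earthquake", "seismic"],
        "landslide": ["landslide", "slope failure"],
        "lightning": ["lightning"],
    }

    def tag_natech(text):
        """Return the list of candidate natural-hazard triggers found in a narrative."""
        text = text.lower()
        return [hz for hz, kws in natural_hazards.items()
                if any(kw in text for kw in kws)] or None

    incidents["natech_candidate"] = incidents["narrative"].map(tag_natech)
    print(incidents[incidents["natech_candidate"].notna()])  # records for peer review
    ```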

  10. Training in metabolomics research. II. Processing and statistical analysis of metabolomics data, metabolite identification, pathway analysis, applications of metabolomics and its future

    PubMed Central

    Barnes, Stephen; Benton, H. Paul; Casazza, Krista; Cooper, Sara; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H.; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K.; Renfrow, Matthew B.; Tiwari, Hemant K.

    2017-01-01

    Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites, and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. This second part of a comprehensive description of the methods of metabolomics focuses on data analysis, emerging methods in metabolomics and the future of this discipline. PMID:28239968

  11. Challenges in combining different data sets during analysis when using grounded theory.

    PubMed

    Rintala, Tuula-Maria; Paavilainen, Eija; Astedt-Kurki, Päivi

    2014-05-01

    To describe the challenges in combining two data sets during grounded theory analysis. The use of grounded theory in nursing research is common. It is a suitable method for studying human action and interaction, and it is recommended that many alternative sources of data are collected to create as rich a dataset as possible. Data were drawn from interviews with people with diabetes (n=19) and their family members (n=19), and the two data sets were combined during the analysis. When using grounded theory, there are numerous challenges in collecting and managing data, especially for the novice researcher. One challenge is to combine different data sets during the analysis. There are many methodological textbooks about grounded theory, but little has been written about combining different data sets. Discussion is needed on the management of data and the challenges of grounded theory. This article provides a means for combining different data sets in the grounded theory analysis process.

  12. Shifting from Stewardship to Analytics of Massive Science Data

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Doyle, R.; Law, E.; Hughes, S.; Huang, T.; Mahabal, A.

    2015-12-01

    Currently, the analysis of large data collections is executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Data collection, archiving and analysis for future remote sensing missions, be they earth science satellites, planetary robotic missions, or massive radio observatories, may not scale, as more capable instruments stress existing architectural approaches and systems with more continuous data streams, data from multiple observational platforms, and measurements and models from different agencies. A new paradigm is needed in order to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural choices, data processing, management, analysis, etc. are interrelated and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery across massive data collections. Future observational systems, including satellite and airborne experiments, and research in climate modeling will significantly increase the size of the data, requiring new methodological approaches to data analytics in which users can more effectively interact with the data and apply automated mechanisms for data reduction and fusion across these massive data repositories. This presentation will discuss architecture, use cases, and approaches for developing a big data analytics strategy across multiple science disciplines.

  13. Understanding Classrooms through Social Network Analysis: A Primer for Social Network Analysis in Education Research.

    PubMed

    Grunspan, Daniel Z; Wiggins, Benjamin L; Goodreau, Steven M

    2014-01-01

    Social interactions between students are a major and underexplored part of undergraduate education. Understanding how learning relationships form in undergraduate classrooms, as well as the impacts these relationships have on learning outcomes, can inform educators in unique ways and improve educational reform. Social network analysis (SNA) provides the necessary tool kit for investigating questions involving relational data. We introduce basic concepts in SNA, along with methods for data collection, data processing, and data analysis, using a previously collected example study on an undergraduate biology classroom as a tutorial. We conduct descriptive analyses of the structure of the network of costudying relationships. We explore generative processes that create observed study networks between students and also test for an association between network position and success on exams. We also cover practical issues, such as the unique aspects of human subjects review for network studies. Our aims are to convince readers that using SNA in classroom environments allows rich and informative analyses to take place and to provide some initial tools for doing so, in the process inspiring future educational studies incorporating relational data. © 2014 D. Z. Grunspan et al. CBE—Life Sciences Education © 2014 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
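
    For a flavor of what such an analysis looks like in practice, here is a minimal sketch using NetworkX; the edge list and exam scores are fabricated toy data, and the simple rank correlation is a stand-in for the permutation-based tests a full study would use:

    ```python
    import networkx as nx
    from scipy import stats

    # Hypothetical costudying network: an edge means two students studied together
    edges = [("A", "B"), ("B", "C"), ("C", "D"), ("B", "D"), ("D", "E"), ("E", "F")]
    G = nx.Graph(edges)

    # Descriptive structure of the study network
    print("density:", nx.density(G))
    print("degree centrality:", nx.degree_centrality(G))

    # Test for an association between network position and exam performance
    exam_scores = {"A": 71, "B": 88, "C": 80, "D": 90, "E": 76, "F": 65}
    centrality = nx.degree_centrality(G)
    nodes = list(G.nodes)
    rho, p = stats.spearmanr([centrality[n] for n in nodes],
                             [exam_scores[n] for n in nodes])
    print(f"Spearman rho={rho:.2f}, p={p:.3f}")
    ```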

  14. Processes regulating the initiation and postejaculatory resumption of copulatory behavior in male hamsters.

    PubMed

    Floody, Owen R

    2014-06-01

    Studies using factor analysis have helped describe the organization of copulatory behavior in male rodents. However, the focus of these studies on a few traditional measures may have limited their results. To test this possibility, 74 sexually-experienced male hamsters were observed as they copulated with stimulus females. The measures collected exceeded the conventional ones in number, variety and independence. The factor analysis of these data revealed a structure with seven factors collectively accounting for 80% of the variance. Most resembled the factors in previous reports, reinforcing the contributions that the processes suggested by these factors make to the organization of male behavior. But several other factors were more novel, possibly reflecting the use of measures that were novel or revised for greater independence. The most interesting of these were two factors focusing on early steps in the progression leading to ejaculation. Importantly, both incorporated measures from each of the three copulatory series that were observed. Past work suggests that independent processes control the times required to initiate copulation and later resume it after an ejaculation. In contrast, these results suggest the existence of two processes, each of which contributes to both the initiation and reinitiation of copulation. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Process, cost, and clinical quality: the initial oral contraceptive visit.

    PubMed

    McMullen, Michael J; Woolford, Samuel W; Moore, Charles L; Berger, Barry M

    2013-01-01

    To demonstrate how the analysis of clinical process, cost, and outcomes can identify healthcare improvements that reduce cost without sacrificing quality, using the example of the initial visit associated with oral contraceptive pill use. Cross-sectional study using data collected by HealthMETRICS from 106 sites in 24 states between 1996 and 2009. The unintended pregnancy (UIP) rate, the effectiveness of patient education, and the unit visit cost were calculated, and the staff type providing education and the placement of education were recorded. Two-way analysis of variance models were created and tested for significance to identify differences between groups. Sites using nonclinical staff to provide education outside the exam were associated with lower cost, higher education scores, and a UIP rate no different from that of sites using clinical staff. Sites providing patient education during the physical examination were associated with higher cost, lower education scores, and a UIP rate no lower than that of sites providing education outside of the exam. By analyzing process, cost, and quality, lower-cost processes that did not reduce clinical quality were identified. This methodology is applicable to other clinical services for identifying low-cost processes that do not lower clinical quality. By using nonclinical staff educators to provide education outside of the physical examination, sites could save an average of 32% of the total cost of the visit.
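
    The sketch below illustrates a two-way analysis of variance of visit cost by staff type and education placement using statsmodels; the column names and numbers are invented for illustration and do not come from the HealthMETRICS data.

        import pandas as pd
        import statsmodels.formula.api as smf
        from statsmodels.stats.anova import anova_lm

        # Invented per-site visit costs crossed by the two factors of interest.
        df = pd.DataFrame({
            "cost": [40.0, 38.0, 55.0, 52.0, 41.0, 58.0, 44.0, 50.0],
            "staff_type": ["nonclinical", "nonclinical", "clinical", "clinical",
                           "nonclinical", "clinical", "nonclinical", "clinical"],
            "edu_placement": ["outside", "during", "outside", "during",
                              "outside", "during", "during", "outside"],
        })

        # Two-way ANOVA with interaction, mirroring the comparison of groups.
        model = smf.ols("cost ~ C(staff_type) * C(edu_placement)", data=df).fit()
        print(anova_lm(model, typ=2))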

  16. Book Review: Book review

    NASA Astrophysics Data System (ADS)

    Tweed, Fiona S.

    2017-08-01

    This special edition of Zeitschrift für Geomorphologie (ZfG) is based on presentations given at a conference entitled 'Hydrological Extreme Events in Historic and Prehistoric Times' which took place in Bonn in June 2014. The volume consists of an editorial introduction and nine research papers reflecting a range of approaches to understanding past events, including modelling, analysis of historical data and studies that focus on a consistent approach to collection and analysis of data from different areas. The HEX project, which generated the conference in Bonn, adopted a multidisciplinary approach and this is reflected in the collection of papers, which emphasise the importance of combining a range of approaches and analyses as tools for decoding both landscapes and processes.

  17. Toward the Darwinian transition: Switching between distributed and speciated states in a simple model of early life.

    PubMed

    Arnoldt, Hinrich; Strogatz, Steven H; Timme, Marc

    2015-01-01

    It has been hypothesized that in the era just before the last universal common ancestor emerged, life on earth was fundamentally collective. Ancient life forms shared their genetic material freely through massive horizontal gene transfer (HGT). At a certain point, however, life made a transition to the modern era of individuality and vertical descent. Here we present a minimal model for stochastic processes potentially contributing to this hypothesized "Darwinian transition." The model suggests that HGT-dominated dynamics may have been intermittently interrupted by selection-driven processes during which genotypes became fitter and decreased their inclination toward HGT. Stochastic switching in the population dynamics with three-point (hypernetwork) interactions may have destabilized the HGT-dominated collective state and essentially contributed to the emergence of vertical descent and the first well-defined species in early evolution. A systematic nonlinear analysis of the stochastic model dynamics covering key features of evolutionary processes (such as selection, mutation, drift and HGT) supports this view. Our findings thus suggest a viable direction out of early collective evolution, potentially enabling the start of individuality and vertical Darwinian evolution.

  18. Does recyclable separation reduce the cost of municipal waste management in Japan?

    PubMed

    Chifari, Rosaria; Lo Piano, Samuele; Matsumoto, Shigeru; Tasaki, Tomohiro

    2017-02-01

    Municipal solid waste (MSW) management is a system involving multiple sub-systems that typically require demanding inputs, materials and resources to properly process generated waste throughput. For this reason, MSW management is generally one of the most expensive services provided by municipalities. In this paper, we analyze the Japanese MSW management system and estimate the cost elasticity with respect to the waste volumes at three treatment stages: collection, processing, and disposal. Although we observe economies of scale at all three stages, the collection cost is less elastic than the disposal cost. We also examine whether source separation at home affects the cost of MSW management. The empirical results show that the separate collection of the recyclable fraction leads to reduced processing costs at intermediate treatment facilities, but does not change the overall waste management cost. Our analysis also reveals that the cost of waste management systems decreases when the service is provided by private companies through a public tender. The cost decreases even more when the service is performed under the coordination of adjacent municipalities. Copyright © 2017 Elsevier Ltd. All rights reserved.
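
    Cost elasticity of this kind is conventionally estimated as the slope of a log-log regression of cost on volume; the sketch below shows that computation with statsmodels on illustrative numbers (an elasticity below 1 indicates economies of scale).

        import numpy as np
        import statsmodels.api as sm

        # Illustrative municipal data: annual waste volume vs. collection cost.
        volume = np.array([1e3, 5e3, 1e4, 5e4, 1e5])     # tonnes
        cost = np.array([2e5, 7e5, 1.2e6, 4e6, 7e6])     # currency units

        # Elasticity = slope of log(cost) on log(volume).
        fit = sm.OLS(np.log(cost), sm.add_constant(np.log(volume))).fit()
        elasticity = fit.params[1]
        print("elasticity:", round(float(elasticity), 2))   # < 1: economies of scale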

  19. Vortex information display system program description manual. [data acquisition from laser Doppler velocimeters and real time operation

    NASA Technical Reports Server (NTRS)

    Conway, R.; Matuck, G. N.; Roe, J. M.; Taylor, J.; Turner, A.

    1975-01-01

    A vortex information display system is described which provides flexible control through system-user interaction for collecting wing-tip-trailing vortex data, processing this data in real time, displaying the processed data, storing raw data on magnetic tape, and post processing raw data. The data is received from two asynchronous laser Doppler velocimeters (LDV's) and includes position, velocity, and intensity information. The raw data is written onto magnetic tape for permanent storage and is also processed in real time to locate vortices and plot their positions as a function of time. The interactive capability enables the user to make real time adjustments in processing data and provides a better definition of vortex behavior. Displaying the vortex information in real time produces a feedback capability to the LDV system operator allowing adjustments to be made in the collection of raw data. Both raw data and processing can be continually upgraded during flyby testing to improve vortex behavior studies. The post-analysis capability permits the analyst to perform in-depth studies of test data and to modify vortex behavior models to improve transport predictions.

  20. Microarray Data Processing Techniques for Genome-Scale Network Inference from Large Public Repositories.

    PubMed

    Chockalingam, Sriram; Aluru, Maneesha; Aluru, Srinivas

    2016-09-19

    Pre-processing of microarray data is a well-studied problem. Furthermore, all popular platforms come with their own recommended best practices for differential analysis of genes. However, for genome-scale network inference using microarray data collected from large public repositories, these methods filter out a considerable number of genes. This is primarily due to the effects of aggregating a diverse array of experiments with different technical and biological scenarios. Here we introduce a pre-processing pipeline suitable for inferring genome-scale gene networks from large microarray datasets. We show that partitioning of the available microarray datasets according to biological relevance into tissue- and process-specific categories significantly extends the limits of downstream network construction. We demonstrate the effectiveness of our pre-processing pipeline by inferring genome-scale networks for the model plant Arabidopsis thaliana using two different construction methods and a collection of 11,760 Affymetrix ATH1 microarray chips. Our pre-processing pipeline and the datasets used in this paper are made available at http://alurulab.cc.gatech.edu/microarray-pp.
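
    The partitioning idea can be pictured as grouping chips by biological annotation and then filtering genes within each partition; the sketch below is a minimal pandas rendering with synthetic data and a simple variance filter, not the authors' pipeline.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(1)
        chips = [f"chip{i}" for i in range(12)]
        # Synthetic expression matrix: 100 genes x 12 chips.
        expr = pd.DataFrame(rng.normal(size=(100, 12)),
                            index=[f"g{i}" for i in range(100)], columns=chips)
        # Hypothetical tissue annotation for each chip.
        tissue = pd.Series(["root"] * 6 + ["leaf"] * 6, index=chips)

        # Partition chips by annotation, then filter genes within each partition.
        partitions = {}
        for label, cols in tissue.groupby(tissue).groups.items():
            sub = expr[list(cols)]
            keep = sub.var(axis=1) > 0.5      # simple per-partition variance filter
            partitions[label] = sub[keep]
        print({k: v.shape for k, v in partitions.items()})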

  1. 15 CFR 971.202 - Statement of technological experience and capabilities.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... GENERAL REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL... results to commercial mining. The more test data offered with the application the less analysis will be... step in the mining process, including nodule collection, retrieval, transfer to ship, environmental...

  2. Single molecule image formation, reconstruction and processing: introduction.

    PubMed

    Ashok, Amit; Piestun, Rafael; Stallinga, Sjoerd

    2016-07-01

    The ability to image at the single molecule scale has revolutionized research in molecular biology. This feature issue presents a collection of articles that provides new insights into the fundamental limits of single molecule imaging and reports novel techniques for image formation and analysis.

  3. Reflections on Language Learning.

    ERIC Educational Resources Information Center

    Barbara, Leila, Ed.; Scott, Mike, Ed.

    The collection of papers, dedicated to Maria Antonieta Alba Celani, a celebrated English professor in Brazil, consists of writings by colleagues on four themes: developments stemming from Dr. Celani's Brazilian national project for the teaching of English for special purposes; language teacher training; language processing; and analysis of…

  4. Development of a Relay Performance Web Tool for the Mars Network

    NASA Technical Reports Server (NTRS)

    Allard, Daniel A.; Edwards, Charles D.

    2009-01-01

    Modern Mars surface missions rely upon orbiting spacecraft to relay communications to and from Earth systems. An important component of this multi-mission relay process is the collection of relay performance statistics supporting strategic trend analysis and tactical anomaly identification and tracking.

  5. A Hands-on Activity for Teaching the Poisson Distribution Using the Stock Market

    ERIC Educational Resources Information Center

    Dunlap, Mickey; Studstill, Sharyn

    2014-01-01

    The number of increases a particular stock makes over a fixed period follows a Poisson distribution. This article discusses using this easily-found data as an opportunity to let students become involved in the data collection and analysis process.
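
    A classroom version of the activity might fit the Poisson rate to observed counts of price increases and compare observed with expected frequencies; the sketch below does this with scipy on invented counts.

        import numpy as np
        from scipy import stats

        # Invented weekly counts of price increases for one stock.
        counts = np.array([3, 5, 4, 2, 6, 3, 4, 5, 2, 4])
        lam = counts.mean()                   # maximum-likelihood Poisson rate

        k = np.arange(0, counts.max() + 1)
        expected = stats.poisson.pmf(k, lam) * len(counts)
        observed = np.bincount(counts, minlength=k.size)
        for kk, o, e in zip(k, observed, expected):
            print(kk, o, round(float(e), 2))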

  6. 15 CFR 971.202 - Statement of technological experience and capabilities.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... GENERAL REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL... results to commercial mining. The more test data offered with the application the less analysis will be... step in the mining process, including nodule collection, retrieval, transfer to ship, environmental...

  7. 15 CFR 971.202 - Statement of technological experience and capabilities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... GENERAL REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL... results to commercial mining. The more test data offered with the application the less analysis will be... step in the mining process, including nodule collection, retrieval, transfer to ship, environmental...

  8. Comprehensive characterisation of sewage sludge for thermochemical conversion processes - Based on Singapore survey.

    PubMed

    Chan, Wei Ping; Wang, Jing-Yuan

    2016-08-01

    Recently, sludge has attracted great interest as a potential feedstock for thermochemical conversion processes. However, the compositions and thermal degradation behaviours of sludge are highly complex and distinctive compared with traditional feedstocks, leading to a need for fundamental research on sludge. Comprehensive characterisation of sludge specifically for thermochemical conversion was carried out for all existing Water Reclamation Plants in Singapore. In total, 14 sludge samples were collected based on type, plant, and batch categorisation. Existing characterisation methods for physical and chemical properties were analysed and reviewed using the collected samples. Qualitative similarities and quantitative variations among the different sludge samples were identified and discussed. Oxidation of inorganic matter in sludge during ash-forming analysis was found to cause significant deviations in the proximate and ultimate analyses. Therefore, alternative parameters and comparison bases, including Fixed Residues (FR), Inorganic Matters (IM), and Total Inorganics (TI), were proposed for a better understanding of the thermochemical characteristics of sludge. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Introduction of Transplant Registry Unified Management Program 2 (TRUMP2): scripts for TRUMP data analyses, part I (variables other than HLA-related data).

    PubMed

    Atsuta, Yoshiko

    2016-01-01

    Collection and analysis of information on the diseases and post-transplant courses of allogeneic hematopoietic stem cell transplant recipients have played important roles in improving therapeutic outcomes in hematopoietic stem cell transplantation. Efficient, high-quality data collection systems are essential. The introduction of the Second-Generation Transplant Registry Unified Management Program (TRUMP2) is intended to improve data quality and enable more efficient data management. The TRUMP2 system will also expand the possible uses of the data, as it is capable of building a more complex relational database. Constructing an accessible system for adequate data utilization by researchers would promote greater research activity. Study approval and management processes and authorship guidelines also need to be organized within this context. Quality control of the processes for data manipulation and analysis will also affect study outcomes. Shared scripts have been introduced to define variables according to standard definitions, for quality control and to improve the efficiency of registry studies using TRUMP data.

  10. Open-Source Programming for Automated Generation of Graphene Raman Spectral Maps

    NASA Astrophysics Data System (ADS)

    Vendola, P.; Blades, M.; Pierre, W.; Jedlicka, S.; Rotkin, S. V.

    Raman microscopy is a useful tool for studying the structural characteristics of graphene deposited onto substrates. However, extracting useful information from the Raman spectra requires data processing and 2D map generation. An existing home-built confocal Raman microscope was optimized for graphene samples and programmed to automatically generate Raman spectral maps across a specified area. In particular, an open source data collection scheme was generated to allow the efficient collection and analysis of the Raman spectral data for future use. NSF ECCS-1509786.

  11. Implementation of a Research Information Management System in a Pediatric Hospital.

    PubMed

    Kissling, Alison D; Ballinger, Kimberly D

    2018-01-01

    Faculty publications have been collected in universities, health, and medical institutions for many years, and Cincinnati Children's is no exception. Since 1949, a yearly list of faculty publications was manually compiled using multiple data sources and disseminated by the Edward L. Pratt Research Library. Products to centralize faculty publication collection and analysis with bibliometric tools are growing in popularity. This article will review the collaborative decision to choose a Research Information Management System and the implementation process including successes, challenges, and future opportunities.

  12. Identification of uncommon objects in containers

    DOEpatents

    Bremer, Peer-Timo; Kim, Hyojin; Thiagarajan, Jayaraman J.

    2017-09-12

    A system for identifying in an image an object that is commonly found in a collection of images and for identifying a portion of an image that represents an object based on a consensus analysis of segmentations of the image. The system collects images of containers that contain objects for generating a collection of common objects within the containers. To process the images, the system generates a segmentation of each image. The image analysis system may also generate multiple segmentations for each image by introducing variations in the selection of voxels to be merged into a segment. The system then generates clusters of the segments based on similarity among the segments. Each cluster represents a common object found in the containers. Once the clustering is complete, the system may be used to identify common objects in images of new containers based on similarity between segments of images and the clusters.
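
    Loosely following the patent's description, the sketch below generates several segmentations per image, describes each segment by a feature vector, and clusters the segments so that each cluster stands for a candidate common object; the feature extraction here is an invented stand-in, not the patented method.

        import numpy as np
        from sklearn.cluster import KMeans

        def segment(image, seed):
            # Stand-in for a real segmentation pass with varied merge choices:
            # returns one feature vector (size, intensity stats, ...) per segment.
            rng = np.random.default_rng(seed)
            return rng.normal(size=(int(rng.integers(3, 6)), 4))

        rng = np.random.default_rng(2)
        images = [rng.normal(size=(16, 16)) for _ in range(5)]

        # Pool segments from several segmentations of several container images.
        features = np.vstack([segment(im, s) for s, im in enumerate(images)])

        # Cluster segments by similarity; each cluster is a candidate common object.
        labels = KMeans(n_clusters=3, n_init=10).fit_predict(features)
        print(np.bincount(labels))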

  13. A Synopsis of Technical Issues of Concern for Monitoring Trace Elements in Highway and Urban Runoff

    USGS Publications Warehouse

    Breault, Robert F.; Granato, Gregory E.

    2000-01-01

    Trace elements, which are regulated for aquatic life protection, are a primary concern in highway- and urban-runoff studies because stormwater runoff may transport these constituents from the land surface to receiving waters. Many of these trace elements are essential for biological activity and become detrimental only when geologic or anthropogenic sources exceed concentrations beyond ranges typical of the natural environment. The Federal Highway Administration and State Transportation Agencies are concerned about the potential effects of highway runoff on the watershed scale and for the management and protection of watersheds. Transportation agencies need information that is documented as valid, current, and scientifically defensible to support planning and management decisions. There are many technical issues of concern for monitoring trace elements; therefore, trace-element data commonly are considered suspect, and the responsibility to provide data-quality information to support the validity of reported results rests with the data-collection agency. Paved surfaces are fundamentally different physically, hydraulically, and chemically from the natural surfaces typical of most freshwater systems that have been the focus of many trace-element-monitoring studies. Existing scientific conceptions of the behavior of trace elements in the environment are based largely upon research on natural systems, rather than on systems typical of pavement runoff. Additionally, the logistics of stormwater sampling are difficult because of the great uncertainty in the occurrence and magnitude of storm events. Therefore, trace-element monitoring programs may be enhanced if monitoring and sampling programs are automated. Automation would standardize the process and provide a continuous record of the variations in flow and water-quality characteristics. Great care is required to collect and process samples in a manner that will minimize potential contamination or attenuation of trace elements and other sources of bias and variability in the sampling process. Trace elements have both natural and anthropogenic sources that may affect the sampling process, including the sample-collection and handling materials used in many trace-element monitoring studies. Trace elements also react with these materials within the timescales typical for collection, processing, and analysis of runoff samples. To study the characteristics and potential effects of trace elements in highway and urban runoff, investigators typically sample one or more operationally defined matrixes, including: whole water, dissolved (filtered water), suspended sediment, bottom sediment, biological tissue, and contaminant sources. The sampling and analysis of each of these sample matrixes can provide specific information about the occurrence and distribution of trace elements in runoff and receiving waters. There are, however, technical concerns specific to each matrix that must be understood and addressed through use of proper collection and processing protocols. Valid protocols are designed to minimize inherent problems and to maximize the accuracy, precision, comparability, and representativeness of data collected. Documentation, including information about monitoring protocols, quality-assurance and quality-control efforts, and ancillary data, also is necessary to establish data quality. This documentation is especially important for evaluation of historical trace-element monitoring data, because trace-element monitoring protocols and analysis methods have been constantly changing over the past 30 years.

  14. A retrospective analysis of the change in anti-malarial treatment policy: Peru.

    PubMed

    Williams, Holly Ann; Vincent-Mark, Arlene; Herrera, Yenni; Chang, O Jaime

    2009-04-28

    National malaria control programmes must deal with the complex process of changing national malaria treatment guidelines, often without guidance on the process of change. Selecting a replacement drug is only one issue in this process, and there is a paucity of literature describing successful malaria treatment policy changes to help guide control programmes through it. The objective was to understand the wider context in which national malaria treatment guidelines were formulated in a specific country (Peru). Using qualitative methods (individual and focus group interviews, stakeholder analysis, and a review of documents), a retrospective analysis of the process of change in Peru's anti-malarial treatment policy from the early 1990s to 2003 was completed. The decision to change Peru's policies resulted from increasing levels of anti-malarial drug resistance, as well as complaints from providers that the drugs were no longer working. The change occurred at a time when Peru was changing national governments, which created extreme challenges in moving the change process forward. Peru successfully employed a number of key strategies to ensure that policy change would occur. These included a) having the process directed by a group who shared a common interest in malaria and who had long-established social and professional networks among themselves, b) engaging in collaborative teamwork among nationals and between nationals and international collaborators, c) respect for and inclusion of district-level staff in all phases of the process, d) reliance on high levels of technical and scientific knowledge, e) use of standardized protocols to collect data, and f) transparency. Although not perfectly or fully implemented by 2003, the change in malaria treatment policy in Peru occurred very quickly compared with other countries. Peru identified a problem, collected the data necessary to justify the change, used political will in its favor, approved the policy, and moved to improve malaria control in the country. As such, it offers an excellent example for other countries as they contemplate or embark on policy changes.

  15. Hyper-resolution monitoring of urban flooding with social media and crowdsourcing data

    NASA Astrophysics Data System (ADS)

    Wang, Ruo-Qian; Mao, Huina; Wang, Yuan; Rae, Chris; Shaw, Wesley

    2018-02-01

    Hyper-resolution datasets for urban flooding are rare. This gap prevents detailed flood risk analysis, urban flood control, and the validation of hyper-resolution numerical models. We employed social media and crowdsourcing data to address the issue. Natural Language Processing and Computer Vision techniques were applied to data collected from Twitter and MyCoast (a crowdsourcing app). We found that these big-data-based flood monitoring approaches can complement existing means of flood data collection. The extracted information was validated against precipitation data and road closure reports to examine data quality. The two data collection approaches are compared, the two data mining methods are discussed, and a series of suggestions is given to improve the data collection strategy.
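
    As a minimal illustration of the text-mining step, the sketch below applies a keyword filter to candidate tweets before any heavier NLP or computer vision processing; the keyword list and records are invented.

        # Invented keyword list; a deployed system would use a trained classifier.
        flood_terms = {"flood", "flooded", "flooding", "submerged", "waterlogged"}

        tweets = [
            {"id": 1, "text": "Street totally flooded near 5th Ave"},
            {"id": 2, "text": "Lovely sunny day downtown"},
        ]

        def looks_like_flood_report(text):
            tokens = {t.strip(".,!?").lower() for t in text.split()}
            return bool(tokens & flood_terms)

        reports = [t for t in tweets if looks_like_flood_report(t["text"])]
        print(reports)   # candidate flood reports for validation against rain data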

  16. Analysis of a document/reporting system

    NASA Technical Reports Server (NTRS)

    Narrow, B.

    1971-01-01

    An in-depth analysis of the information system within the Data Processing Branch is presented. Quantitative measures are used to evaluate the efficiency and effectiveness of the information system. This is believed to be the first documented study to utilize quantitative measures for full-scale system analysis. The quantitative measures and the techniques for collecting and quantifying the basic data, as described, are applicable to any information system. This report is therefore of interest to anyone concerned with the management, design, analysis, or evaluation of information systems.

  17. Family perspectives on organ and tissue donation for transplantation: a principlist analysis.

    PubMed

    Dos Santos, Marcelo José; Feito, Lydia

    2017-01-01

    The family interview context is permeated by numerous ethical issues, which may generate conflicts and impact the organ donation process. This study aims to analyze the family interview process with a focus on principlist bioethics. This exploratory, descriptive study uses a qualitative approach. The accounts were collected using the following prompt: "Talk about the family interview for the donation of organs and tissues for transplantation, from the preparation for the interview to the decision of the family to donate or not." For the treatment of the qualitative data, we chose the method of content analysis and categorical thematic analysis. The study involved 18 nurses who worked in three municipal organ procurement organizations in São Paulo, Brazil, and who conducted family interviews for organ donation. Ethical considerations: The data were collected after approval of the study by the Research Ethics Committee of the School of Nursing of the University of São Paulo. The results were classified into four categories and three subcategories; the categories are the principles adopted by principlist bioethics. The principles of autonomy, beneficence, non-maleficence, and justice permeate the family interview and reveal their importance in the organ and tissue donation process for transplantation. The analysis of family interviews for the donation of organs and tissues for transplantation with a focus on principlist bioethics indicates that the process involves many ethical considerations. The elucidation of these aspects contributes to the discussion, training, and improvement of professionals, whether nurses or not, who work in organ procurement organizations, and can improve the curriculum of existing training programs for transplant coordinators who pursue ethics in donation and transplantation as their foundation.

  18. Varied Human Tolerance to the Combined Conditions of Low Contrast and Diminished Luminance: A Quasi-Meta Analysis

    DTIC Science & Technology

    2017-08-30

    as being three-fold: 1) a measurement of the integrity of both the central and peripheral visual processing centers; 2) an indicator of detail...a visual assessment task integral to the Army's Class 1 Flight Physical (Ginsburg, 1981 and 1984; Bachman & Behar, 1986). During a Class 1 flight...systems. Meta-analysis has been defined as the statistical analysis of a collection of analytical results for the purpose of integrating the findings

  19. Responding mindfully to distressing psychosis: A grounded theory analysis.

    PubMed

    Abba, Nicola; Chadwick, Paul; Stevenson, Chris

    2008-01-01

    This study investigates the psychological process involved when people with current distressing psychosis learned to respond mindfully to unpleasant psychotic sensations (voices, thoughts, and images). Sixteen participants were interviewed on completion of a mindfulness group program. Grounded theory methodology was used to generate a theory of the core psychological process, using a systematically applied set of methods linking analysis with data collection. The resulting theory describes the experience of relating differently to psychosis through a three-stage process: centering in awareness of psychosis; allowing voices, thoughts, and images to come and go without reaction or struggle; and reclaiming power through acceptance of psychosis and the self. The conceptual and clinical applications of the theory and its limits are discussed.

  20. Development of data processing, interpretation and analysis system for the remote sensing of trace atmospheric gas species

    NASA Technical Reports Server (NTRS)

    Casas, Joseph C.; Saylor, Mary S.; Kindle, Earl C.

    1987-01-01

    The major emphasis is on the advancement of remote sensing technology. In particular, the gas filter correlation radiometer (GFCR) technique was applied to the measurement of trace gas species, such as carbon monoxide (CO), from airborne and Earth orbiting platforms. Through a series of low altitude aircraft flights, high altitude aircraft flights, and orbiting space platform flights, data were collected and analyzed, culminating in the first global map of carbon monoxide concentration in the middle troposphere and stratosphere. The four major areas of this remote sensing program, known as the Measurement of Air Pollution from Satellites (MAPS) experiment, are: (1) data acquisition, (2) data processing, analysis, and interpretation algorithms, (3) data display techniques, and (4) information processing.

  1. Ex-vivo imaging of excised tissue using vital dyes and confocal microscopy

    PubMed Central

    Johnson, Simon; Rabinovitch, Peter

    2012-01-01

    Vital dyes routinely used for staining cultured cells can also be used to stain and image live tissue slices ex-vivo. Staining tissue with vital dyes allows researchers to collect structural and functional data simultaneously and can be used for qualitative or quantitative fluorescent image collection. The protocols presented here are useful for structural and functional analysis of viable properties of cells in intact tissue slices, allowing for the collection of data in a structurally relevant environment. With these protocols, vital dyes can be applied as a research tool to disease processes and properties of tissue not amenable to cell culture based studies. PMID:22752953

  2. Automated apparatus for solvent separation of a coal liquefaction product stream

    DOEpatents

    Schweighardt, Frank K.

    1985-01-01

    An automated apparatus for the solvent separation of a coal liquefaction product stream that operates continuously and unattended and eliminates potential errors resulting from subjectivity and the aging of the sample during analysis. In use of the apparatus, metered amounts of one or more solvents are passed sequentially through a filter containing the sample under the direction of a microprocessor control means. The mixture in the filter is agitated by means of ultrasonic cavitation for a timed period and the filtrate is collected. The filtrate of each solvent extraction is collected individually and the residue on the filter element is collected to complete the extraction process.

  3. Study on Stationarity of Random Load Spectrum Based on the Special Road

    NASA Astrophysics Data System (ADS)

    Yan, Huawen; Zhang, Weigong; Wang, Dong

    2017-09-01

    Among methods for assessing the quality of special roads, one uses a wheel force sensor; the essence of this method is to collect the load spectrum of the vehicle in order to reflect the quality of the road. According to the definition of a stochastic process, it is easy to see that the load spectrum is a stochastic process. However, the analysis methods and ranges of application of different random processes are very different, especially in engineering practice, and this directly affects the design and development of the experiment. Therefore, determining the type of a random process has important practical significance. Based on an analysis of the digital characteristics of the road load spectrum, this paper determines that the road load spectrum in this experiment belongs to a stationary stochastic process, paving the way for follow-up modeling and feature extraction for the special road.
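
    One common way to check the stationarity of such a series is an augmented Dickey-Fuller test; the sketch below runs it with statsmodels on a synthetic stand-in for the wheel-force signal (the paper's own criterion, based on digital characteristics, may differ).

        import numpy as np
        from statsmodels.tsa.stattools import adfuller

        rng = np.random.default_rng(3)
        load_spectrum = rng.normal(0.0, 1.0, 2048)   # stand-in for wheel-force data

        stat, pvalue, *rest = adfuller(load_spectrum)
        print(f"ADF statistic = {stat:.2f}, p = {pvalue:.3f}")
        # A small p-value rejects a unit root, consistent with stationarity.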

  4. Analysis of the critical thinking process of junior high school students in solving geometric problems by utilizing the v-a-k learning styles model

    NASA Astrophysics Data System (ADS)

    Hananto, R. B.; Kusmayadi, T. A.; Riyadi

    2018-05-01

    The research aims to identify the critical thinking processes of students in solving geometry problems. The geometry problem selected in this study concerned flat-sided solids (the cube). The critical thinking process was examined across visual, auditory, and kinesthetic learning styles. This was a descriptive-analysis study using a qualitative method. The subjects were three students selected by purposive sampling, one each with a visual, auditory, and kinesthetic learning style. Data collection was done through tests, interviews, and observation. The results showed that the students' critical thinking processes in the identify and define steps were similar across learning styles. Differences in critical thinking were seen in the enumerate, analyze, list, and self-correct steps. It was also found that the critical thinking process of the student with a kinesthetic learning style was better than those of the students with visual and auditory learning styles.

  5. Knowledge exchange processes in organizations and policy arenas: a narrative systematic review of the literature.

    PubMed

    Contandriopoulos, Damien; Lemire, Marc; Denis, Jean-Louis; Tremblay, Emile

    2010-12-01

    This article presents the main results from a large-scale analytical systematic review on knowledge exchange interventions at the organizational and policymaking levels. The review integrated two broad traditions, one roughly focused on the use of social science research results and the other focused on policymaking and lobbying processes. Data collection was done using systematic snowball sampling. First, we used prospective snowballing to identify all documents citing any of a set of thirty-three seminal papers. This process identified 4,102 documents, 102 of which were retained for in-depth analysis. The bibliographies of these 102 documents were merged and used to identify retrospectively all articles cited five times or more and all books cited seven times or more. All together, 205 documents were analyzed. To develop an integrated model, the data were synthesized using an analytical approach. This article developed integrated conceptualizations of the forms of collective knowledge exchange systems, the nature of the knowledge exchanged, and the definition of collective-level use. This literature synthesis is organized around three dimensions of context: level of polarization (politics), cost-sharing equilibrium (economics), and institutionalized structures of communication (social structuring). The model developed here suggests that research is unlikely to provide context-independent evidence for the intrinsic efficacy of knowledge exchange strategies. To design a knowledge exchange intervention to maximize knowledge use, a detailed analysis of the context could use the kind of framework developed here. © 2010 Milbank Memorial Fund. Published by Wiley Periodicals Inc.

  6. Knowledge Exchange Processes in Organizations and Policy Arenas: A Narrative Systematic Review of the Literature

    PubMed Central

    Contandriopoulos, Damien; Lemire, Marc; Denis, Jean-Louis; Tremblay, Émile

    2010-01-01

    Context: This article presents the main results from a large-scale analytical systematic review on knowledge exchange interventions at the organizational and policymaking levels. The review integrated two broad traditions, one roughly focused on the use of social science research results and the other focused on policymaking and lobbying processes. Methods: Data collection was done using systematic snowball sampling. First, we used prospective snowballing to identify all documents citing any of a set of thirty-three seminal papers. This process identified 4,102 documents, 102 of which were retained for in-depth analysis. The bibliographies of these 102 documents were merged and used to identify retrospectively all articles cited five times or more and all books cited seven times or more. All together, 205 documents were analyzed. To develop an integrated model, the data were synthesized using an analytical approach. Findings: This article developed integrated conceptualizations of the forms of collective knowledge exchange systems, the nature of the knowledge exchanged, and the definition of collective-level use. This literature synthesis is organized around three dimensions of context: level of polarization (politics), cost-sharing equilibrium (economics), and institutionalized structures of communication (social structuring). Conclusions: The model developed here suggests that research is unlikely to provide context-independent evidence for the intrinsic efficacy of knowledge exchange strategies. To design a knowledge exchange intervention to maximize knowledge use, a detailed analysis of the context could use the kind of framework developed here. PMID:21166865

  7. FTOOLS: A FITS Data Processing and Analysis Software Package

    NASA Astrophysics Data System (ADS)

    Blackburn, J. Kent; Greene, Emily A.; Pence, William

    1993-05-01

    FTOOLS, a highly modular collection of utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Research Archive Center) at NASA's Goddard Space Flight Center. Each utility performs a single simple task, such as presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column, or selecting subsets of rows based on a boolean expression. Individual utilities can easily be chained together in scripts to achieve more complex operations, such as the generation and display of spectra or light curves. The collection provides both generic processing and analysis utilities and utilities specific to high energy astrophysics data sets. The FTOOLS software package is designed to be both compatible with IRAF and completely stand-alone in a UNIX or VMS environment. The user interface is controlled by standard IRAF parameter files. The package is self-documenting through the IRAF help facility and a stand-alone help task. Software is written in ANSI C and FORTRAN to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.
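
    FTOOLS itself is compiled C and FORTRAN driven by IRAF parameter files, but the flavor of its row- and column-extraction utilities can be approximated in Python with astropy; the file and column names below are hypothetical.

        from astropy.io import fits

        # Hypothetical event file and column names.
        with fits.open("events.fits") as hdul:
            table = hdul[1].data                      # first binary-table extension
            energies = table["ENERGY"]                # column extraction
            selected = table[table["ENERGY"] > 2.0]   # row selection by expression
            print(len(selected), "rows pass the filter")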

  8. Inside the Green House "Black Box": Opportunities for High-Quality Clinical Decision Making.

    PubMed

    Bowers, Barbara; Roberts, Tonya; Nolet, Kimberly; Ryther, Brenda

    2016-02-01

    To develop a conceptual model that explained common and divergent care processes in Green House (GH) nursing homes with high and low hospital transfer rates. Eighty-four face-to-face, semistructured interviews were conducted with direct care, professional, and administrative staff with knowledge of care processes in six GH organizations in six states. The qualitative grounded theory method was used for data collection and analysis. Data were analyzed using open, axial, and selective coding. Data collection and analysis occurred iteratively. Elements of the GH model created significant opportunities to identify, communicate, and respond to early changes in resident condition. Staff in GH homes with lower hospital transfer rates employed care processes that maximized these opportunities. Staff in GH homes with higher transfer rates failed to maximize, or actively undermined, these opportunities. Variations in how the GH model was implemented across GH homes suggest possible explanations for inconsistencies found in past research on the care outcomes, including hospital transfer rates, in culture change models. The findings further suggest that the details of culture change implementation are important considerations in model replication and policies that create incentives for care improvements. © Health Research and Educational Trust.

  9. Workshop on the Analysis of Interplanetary Dust Particles

    NASA Technical Reports Server (NTRS)

    Zolensky, Michael E. (Editor)

    1994-01-01

    Great progress has been made in the analysis of interplanetary dust particles (IDP's) over the past few years. This workshop provided a forum for the discussion of the following topics: observation and modeling of dust in the solar system, mineralogy and petrography of IDP's, processing of IDP's in the solar system and terrestrial atmosphere, comparison of IDP's to meteorites and micrometeorites, composition of IDP's, classification, and collection of IDP's.

  10. FOCUS: A Model of Sensemaking

    DTIC Science & Technology

    2007-05-01

    of the current project was to unpack and develop the concept of sensemaking, principally by developing and testing a cognitive model of the processes...themselves. In Year 2, new Cognitive Task Analysis data collection methods were developed and used to further test the model. Cognitive Task Analysis is a...2004) to examine the phenomenon of "sensemaking," a concept initially formulated by Weick (1995), but not developed from a cognitive perspective

  11. Incorporation of Precipitation Data Into FIA Analyses: A Case Study of Factors Influencing Susceptibility to Oak Decline in Southern Missouri, U.S.A.

    Treesearch

    W. Keith Moser; Greg Liknes; Mark Hansen; Kevin Nimerfro

    2005-01-01

    The Forest Inventory and Analysis program at the North Central Research Station focuses on understanding the forested ecosystems in the North Central and Northern Great Plains States through analyzing the results of annual inventories. The program also researches techniques for data collection and analysis. The FIA process measures the above-ground vegetation and the...

  12. Prevalence, types, and geographical distribution of Listeria monocytogenes from a survey of retail Queso Fresco and associated cheese processing plants and dairy farms in Sonora, Mexico.

    PubMed

    Moreno-Enriquez, R I; Garcia-Galaz, A; Acedo-Felix, E; Gonzalez-Rios, I H; Call, J E; Luchansky, J B; Diaz-Cinco, M E

    2007-11-01

    In the first part of this study, samples were collected from farms, cheese processing plants (CPPs), and retail markets located in various geographical areas of Sonora, Mexico, over a 12-month period during the summer of 2004 and winter of 2005. Four (all Queso Fresco [QF] from retail markets) of 349 total samples tested positive for Listeria monocytogenes (Lm). Of these four positive samples, three were collected in the northern region and one in the southern region of Sonora. Additionally, two were collected during the winter months, and two were collected during the summer months. For the second part of the study, a total of 39 samples from a farm, a CPP, and retail markets were collected and processed according to a combination of the Norma Oficial Mexicana NOM-143-SSA1-1995.10 method (NOM) and the U.S. Food and Drug Administration (FDA) Bacteriological Analytical Manual method, and 27 samples from these same locations were collected and processed according to the U.S. Department of Agriculture Food Safety and Inspection Service method (USDA-FSIS). The NOM-FDA method recovered the pathogen from 6 (15%) of 39 samples (one cheese and five product contact surfaces), while the USDA-FSIS method recovered the pathogen from 5 (18.5%) of 27 samples (all product contact surfaces). In addition, the 40 isolates recovered from the 15 total samples that tested positive for Lm grouped into five distinct pulsotypes that were ca. 60% related, as determined by pulsed-field gel electrophoresis analysis. The results of this study confirmed a 3.4% prevalence of Lm in QF collected from retail markets located in Sonora and no appreciable difference in the effectiveness of either the NOM-FDA or USDA-FSIS method to recover the pathogen from cheese or environmental samples.

  13. Impact of collection container material and holding times on sample integrity for mercury and methylmercury in water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riscassi, Ami L; Miller, Carrie L; Brooks, Scott C

    Mercury (Hg) and methylmercury (MeHg) concentrations in streamwater can vary on short timescales (hourly or less) during storm flow and on a diel cycle; the frequency and timing of sampling required to accurately characterize these dynamics may be difficult to accomplish manually. Automated sampling can assist in sample collection; however, its use has been limited for Hg and MeHg analysis due to concerns about the stability of trace concentrations during extended storage times. We examined the viability of using automated samplers with disposable low-density polyethylene (LDPE) sample bags to collect industrially contaminated streamwater for unfiltered and filtered Hg and MeHg analysis. Specifically, we investigated the effect of holding times ranging from hours to days on streamwater collected during baseflow and storm flow. Unfiltered and filtered Hg and MeHg concentrations decreased with increases in the time prior to sample processing; holding times of 24 hours or less resulted in concentration changes (mean 11 ± 7% different) similar to the variability in duplicates collected manually during analogous field conditions (mean 7 ± 10% different). Comparisons of samples collected with manual and automated techniques throughout a year, for a wide range of stream conditions, were also found to be similar to the differences observed between duplicate grab samples. These results demonstrate that automated sampling into LDPE bags with holding times of 24 hours or less can be effectively used to collect streamwater for Hg and MeHg analysis, and they encourage the testing of these materials and methods for implementation in other aqueous systems where high-frequency sampling is warranted.

  14. Exhaled Breath Condensate Collection in the Mechanically Ventilated Patient

    PubMed Central

    Carter, Stewart R; Davis, Christopher S; Kovacs, Elizabeth J

    2012-01-01

    Collection of exhaled breath condensate (EBC) is a non-invasive means of sampling the airway-lining fluid of the lungs. EBC contains numerous measurable mediators, whose analysis could change the management of patients with certain pulmonary diseases. While initially popularized in investigations involving spontaneously breathing patients, an increasing number of studies have been performed using EBC in association with mechanical ventilation. Collection of EBC in mechanically ventilated patients follows basic principles of condensation but is influenced by multiple factors. Effective collection requires selection of a collection device, adequate minute ventilation, low cooling temperatures, and sampling times of greater than ten minutes. Condensate can be contaminated by saliva, which needs to be filtered. Dilution of samples occurs secondary to distilled water in vapors and to humidification in the ventilator circuit, so dilution factors may need to be employed when investigating non-volatile biomarkers. Storage and analysis should occur promptly at −70 °C to −80 °C to prevent rapid degradation of samples. The purpose of this review is to examine and describe methodologies and problems of EBC collection in mechanically ventilated patients. A straightforward and safe framework has been established to investigate disease processes in this population, yet technical aspects of EBC collection still limit the clinical practicality of this technology. These include a lack of standardization of procedure and of biomarker analysis, and the absence of normal reference ranges for mediators in healthy individuals. Once these procedural aspects have been addressed, EBC could serve as a non-invasive alternative to invasive evaluation of the lungs in mechanically ventilated patients. PMID:22398157

  15. Biomolecular signatures of diabetic wound healing by structural mass spectrometry

    PubMed Central

    Hines, Kelly M.; Ashfaq, Samir; Davidson, Jeffrey M.; Opalenik, Susan R.; Wikswo, John P.; McLean, John A.

    2013-01-01

    Wound fluid is a complex biological sample containing byproducts associated with the wound repair process. Contemporary techniques, such as immunoblotting and enzyme immunoassays, require extensive sample manipulation and do not permit the simultaneous analysis of multiple classes of biomolecular species. Structural mass spectrometry, implemented as ion mobility-mass spectrometry (IM-MS), comprises two sequential, gas-phase dispersion techniques well suited for the study of complex biological samples due to its ability to separate and simultaneously analyze multiple classes of biomolecules. As a model of diabetic wound healing, polyvinyl alcohol (PVA) sponges were inserted subcutaneously into non-diabetic (control) and streptozotocin-induced diabetic rats to elicit a granulation tissue response and to collect acute wound fluid. Sponges were harvested at days 2 or 5 to capture different stages of the early wound healing process. Utilizing IM-MS, statistical analysis, and targeted ultra-performance liquid chromatography (UPLC) analysis, biomolecular signatures of diabetic wound healing have been identified. The protein S100-A8 was highly enriched in the wound fluids collected from day 2 diabetic rats. Lysophosphatidylcholine (20:4) and cholic acid also contributed significantly to the differences between diabetic and control groups. This report provides a generalized workflow for wound fluid analysis demonstrated with a diabetic rat model. PMID:23452326

  16. Flexible software platform for fast-scan cyclic voltammetry data acquisition and analysis.

    PubMed

    Bucher, Elizabeth S; Brooks, Kenneth; Verber, Matthew D; Keithley, Richard B; Owesson-White, Catarina; Carroll, Susan; Takmakov, Pavel; McKinney, Collin J; Wightman, R Mark

    2013-11-05

    Over the last several decades, fast-scan cyclic voltammetry (FSCV) has proved to be a valuable analytical tool for the real-time measurement of neurotransmitter dynamics in vitro and in vivo. Indeed, FSCV has found application in a wide variety of disciplines including electrochemistry, neurobiology, and behavioral psychology. The maturation of FSCV as an in vivo technique led users to pose increasingly complex questions that require a more sophisticated experimental design. To accommodate recent and future advances in FSCV application, our lab has developed High Definition Cyclic Voltammetry (HDCV). HDCV is an electrochemical software suite that includes data acquisition and analysis programs. The data collection program delivers greater experimental flexibility and better user feedback through live displays. It supports experiments involving multiple electrodes with customized waveforms. It is compatible with transistor-transistor logic-based systems that are used for monitoring animal behavior, and it enables simultaneous recording of electrochemical and electrophysiological data. HDCV analysis streamlines data processing with superior filtering options, seamlessly manages behavioral events, and integrates chemometric processing. Furthermore, analysis is capable of handling single files collected over extended periods of time, allowing the user to consider biological events on both subsecond and multiminute time scales. Here we describe and demonstrate the utility of HDCV for in vivo experiments.
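
    One representative processing step in software of this kind is background subtraction of the voltammetric data; the sketch below shows the idea in numpy on synthetic data and is not HDCV's implementation.

        import numpy as np

        rng = np.random.default_rng(4)
        # Synthetic recording: 600 scans x 1000 potential samples.
        cv = rng.normal(0.0, 0.01, size=(600, 1000))
        cv[300:360, 400:500] += 0.2          # pretend analyte transient

        background = cv[:10].mean(axis=0)    # average of pre-event scans
        subtracted = cv - background         # current vs. scan index and potential
        print(subtracted[330, 450] > 0.1)    # transient survives subtraction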

  17. Analysis of variability of concentrations of valproic acid (VPA) and its selected metabolites in the blood serum of patients treated with VPA and patients hospitalized because of VPA poisoning.

    PubMed

    Wilimowska, J; Kłys, M; Jawień, W

    2014-01-01

    To compare the metabolic profile of valproic acid (VPA) in the studied groups of cases through an analysis of variability of concentrations of VPA with its selected metabolites (2-ene-VPA, 4-ene-VPA, 3-keto-VPA). Blood serum samples collected from 27 patients treated with VPA drugs in the Psychiatry Unit and in the Neurology and Cerebral Strokes Unit at the Ludwik Rydygier Provincial Specialist Hospital in Krakow, and blood serum samples collected from 26 patients hospitalized because of suspected acute VPA poisoning at the Toxicology Department, Chair of Toxicology and Environmental Diseases, Jagiellonian University Medical College in Krakow. The analysis of concentrations of VPA and its selected metabolites has shown that the metabolic profile of VPA determined in cases of acute poisoning is different from cases of VPA therapy. One of VPA's metabolic pathways - the process of desaturation - is unchanged in acute poisoning and prevails over the process of β-oxidation. The ingestion of toxic VPA doses results in an increased formation of 4-ene-VPA, proportional to an increase in VPA concentration. Acute VPA poisoning involves the saturation of VPA's metabolic transformations at the stage of β-oxidation. The process of oxidation of 2-ene-VPA to 3-keto-VPA is slowed down after the ingestion of toxic doses.

  18. Application of Six Sigma methodology to a diagnostic imaging process.

    PubMed

    Taner, Mehmet Tolga; Sezen, Bulent; Atwat, Kamal M

    2012-01-01

    This paper aims to apply the Six Sigma methodology to improve workflow by eliminating the causes of failure in the medical imaging department of a private Turkish hospital. Implementation of the define, measure, analyse, improve and control (DMAIC) improvement cycle, workflow charts, fishbone diagrams, and Pareto charts was employed, together with rigorous data collection in the department. The identification of the root causes of repeat sessions and delays was followed by failure mode and effect analysis, hazard analysis, and decision tree analysis. The most frequent causes of failure were malfunction of the RIS/PACS system and improper positioning of patients. Following extensive training of professionals, the sigma level was increased from 3.5 to 4.2. The data were collected over only four months. Six Sigma's data measurement and process improvement methodology is an impetus for healthcare organisations to rethink their workflow and reduce malpractice. It involves measuring, recording, and reporting data on a regular basis, which enables the administration to monitor workflow continuously. The improvements in the workflow under study, made by determining the failures and potential risks associated with radiologic care, will have a positive impact on society in terms of patient safety. Having eliminated repeat examinations, the risk of exposure to additional radiation was also minimised.
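
    The abstract does not give the hospital's calculation, but a sigma level is conventionally derived from the defect rate (DPMO) with the customary 1.5-sigma shift, as in the sketch below; the defect and opportunity counts are invented.

        from scipy.stats import norm

        def sigma_level(defects, opportunities):
            # DPMO-based sigma level with the customary 1.5-sigma shift.
            dpmo = defects / opportunities * 1_000_000
            return norm.ppf(1 - dpmo / 1_000_000) + 1.5

        print(round(sigma_level(defects=120, opportunities=10_000), 2))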

  19. Processing Satellite Images on Tertiary Storage: A Study of the Impact of Tile Size on Performance

    NASA Technical Reports Server (NTRS)

    Yu, JieBing; DeWitt, David J.

    1996-01-01

    Before raw data from a satellite can be used by an Earth scientist, it must first undergo a number of processing steps including basic processing, cleansing, and geo-registration. Processing actually expands the volume of data collected by a factor of 2 or 3, and the original data is never deleted; thus, processing and storage requirements can exceed 2 terabytes/day. Once processed data is ready for analysis, a series of algorithms (typically developed by the Earth scientists) is applied to a large number of images in a data set. The focus of this paper is how best to handle such images stored on tape under the following assumptions: (1) all images of interest to a scientist are stored on a single tape, (2) images are accessed and processed in the order in which they are stored on tape, and (3) the analysis requires access to only a portion of each image rather than the entire image.
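
    The trade-off under study can be seen in a back-of-the-envelope model: smaller tiles read fewer excess bytes around a region of interest but incur more per-tile overhead on sequential media; the cost parameters below are invented for illustration.

        import math

        def bytes_read(roi_w, roi_h, tile, bytes_per_px, per_tile_overhead):
            # Tiles intersecting the region of interest must be read in full,
            # plus a fixed per-tile positioning/decoding cost.
            n = math.ceil(roi_w / tile) * math.ceil(roi_h / tile)
            return n * (tile * tile * bytes_per_px + per_tile_overhead)

        for tile in (128, 512, 2048):
            print(tile, bytes_read(1000, 600, tile, 2, 32_768))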

  20. Trade-off analysis of modes of data handling for earth resources (ERS), volume 1

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Data handling requirements are reviewed for earth observation missions along with likely technology advances. Parametric techniques for synthesizing potential systems are developed. Major tasks include: (1) review of the sensors under development and extensions of or improvements in these sensors; (2) development of mission models for missions spanning land, ocean, and atmosphere observations; (3) summary of data handling requirements including the frequency of coverage, timeliness of dissemination, and geographic relationships between points of collection and points of dissemination; (4) review of data routing to establish ways of getting data from the collection point to the user; (5) on-board data processing; (6) communications link; and (7) ground data processing. A detailed synthesis of three specific missions is included.

  1. A sampling procedure to guide the collection of narrow-band, high-resolution spatially and spectrally representative reflectance data. [satellite imagery of earth resources]

    NASA Technical Reports Server (NTRS)

    Brand, R. R.; Barker, J. L.

    1983-01-01

    A multistage sampling procedure using image processing, geographical information systems, and analytical photogrammetry is presented which can be used to guide the collection of representative, high-resolution spectra and discrete reflectance targets for future satellite sensors. The procedure is general and can be adapted to characterize areas as small as minor watersheds and as large as multistate regions. Beginning with a user-determined study area, successive reductions in size and spectral variation are performed using image analysis techniques on data from the Multispectral Scanner, orbital and simulated Thematic Mapper, low altitude photography synchronized with the simulator, and associated digital data. An integrated image-based geographical information system supports processing requirements.

  2. Exploring the Opportunities of a Balloon-Satellite in Bangladesh for Weather Data Collection and Vegetative Analysis

    NASA Astrophysics Data System (ADS)

    Shafique, Md. Ishraque Bin; Razzaq Halim, M. A.; Rabbi, Fazle; Khalilur Rhaman, Md.

    2016-07-01

    For a third world country like Bangladesh, satellite and space research is not feasible due to lack of funding. Therefore, in order to imitate the principles of such a satellite, a balloon satellite can easily and inexpensively be set up. Balloon satellites are miniature satellites, which are cheap and easy to construct. This paper discusses a BalloonSat developed using a Raspberry Pi, IMU module, UV sensor, GPS module, camera and XBee module. An interactive GUI was designed to display all the data collected after processing. To estimate the nitrogen concentration of a plant, a leaf color chart is used. This paper attempts to digitize this process, applying it to photos taken by the BalloonSat.
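
    A minimal sketch of the leaf color chart (LCC) digitization step, assuming the chart maps a normalized greenness index to a few discrete levels; the thresholds and the NumPy pipeline are illustrative, not the authors' implementation:

        import numpy as np
        from PIL import Image

        # Hypothetical greenness cut-offs for LCC levels 1 (pale) to 4 (dark).
        LCC_THRESHOLDS = (0.34, 0.38, 0.42)

        def lcc_level(rgb: np.ndarray) -> int:
            """Estimate an LCC level from an RGB leaf photo via the mean
            normalized green chromaticity g = G / (R + G + B)."""
            rgb = rgb.astype(float)
            r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
            greenness = (g / (r + g + b + 1e-9)).mean()
            return 1 + sum(greenness > t for t in LCC_THRESHOLDS)

        # On a BalloonSat photo: lcc_level(np.asarray(Image.open("leaf.jpg")))
        leaf = np.zeros((8, 8, 3), np.uint8)
        leaf[..., 0], leaf[..., 1], leaf[..., 2] = 60, 120, 50  # greenish patch
        print(lcc_level(leaf))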

  3. Application of automation and information systems to forensic genetic specimen processing.

    PubMed

    Leclair, Benoît; Scholl, Tom

    2005-03-01

    During the last 10 years, the introduction of PCR-based DNA typing technologies in forensic applications has been highly successful. This technology has become pervasive throughout forensic laboratories and it continues to grow in prevalence. For many criminal cases, it provides the most probative evidence. Criminal genotype data banking and victim identification initiatives that follow mass-fatality incidents have benefited the most from the introduction of automation for sample processing and data analysis. Attributes of offender specimens including large numbers, high quality and identical collection and processing are ideal for the application of laboratory automation. The magnitude of kinship analysis required by mass-fatality incidents necessitates the application of computing solutions to automate the task. More recently, the development activities of many forensic laboratories are focused on leveraging experience from these two applications to casework sample processing. The trend toward increased prevalence of forensic genetic analysis will continue to drive additional innovations in high-throughput laboratory automation and information systems.

  4. Beer fermentation: monitoring of process parameters by FT-NIR and multivariate data analysis.

    PubMed

    Grassi, Silvia; Amigo, José Manuel; Lyndgaard, Christian Bøge; Foschino, Roberto; Casiraghi, Ernestina

    2014-07-15

    This work investigates the capability of Fourier-transform near infrared (FT-NIR) spectroscopy to monitor and assess process parameters in beer fermentation at different operative conditions. For this purpose, the fermentation of wort with two different yeast strains and at different temperatures was monitored for nine days by FT-NIR. To correlate the collected spectra with °Brix, pH and biomass, different multivariate data methodologies were applied. Principal component analysis (PCA), partial least squares (PLS) and locally weighted regression (LWR) were used to assess the relationship between FT-NIR spectra and the abovementioned process parameters that define the beer fermentation. The accuracy and robustness of the obtained results clearly show the suitability of FT-NIR spectroscopy, combined with multivariate data analysis, as a quality control tool in the beer fermentation process. FT-NIR spectroscopy, when combined with LWR, proves to be a suitable quantitative method for implementation in beer production. Copyright © 2014 Elsevier Ltd. All rights reserved.
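
    The spectra-to-parameter step can be illustrated with scikit-learn's PLS implementation; the synthetic spectra and °Brix values below are stand-ins (the paper's LWR variant is not shown):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)

        # Synthetic stand-in: 120 spectra of 500 variables whose intensity
        # tracks a latent "degrees Brix" value plus measurement noise.
        brix = rng.uniform(4, 14, size=120)
        X = np.outer(brix, rng.normal(size=500)) + rng.normal(scale=5.0, size=(120, 500))

        X_tr, X_te, y_tr, y_te = train_test_split(X, brix, random_state=0)
        pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
        print(f"R^2 on held-out spectra: {pls.score(X_te, y_te):.2f}")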

  5. Development of a fully automated software system for rapid analysis/processing of the falling weight deflectometer data.

    DOT National Transportation Integrated Search

    2009-02-01

    The Office of Special Investigations at Iowa Department of Transportation (DOT) collects FWD data on a regular basis to evaluate pavement structural conditions. The primary objective of this study was to develop a fully-automated software system for ra...

  6. An Over Time Analysis of Relationship Multiplexity and Innovation.

    ERIC Educational Resources Information Center

    Bach, Betsy Wackernagel

    A study investigated the relationship between shared communication links and the process of organizational innovation in a privately owned medical clinic providing comprehensive medical care to children and adults. Participants were the physicians and physician assistants employed at the clinic. Data were collected through questionnaires before…

  7. PTAL Database and Website: Developing a Novel Information System for the Scientific Exploitation of the Planetary Terrestrial Analogues Library

    NASA Astrophysics Data System (ADS)

    Veneranda, M.; Negro, J. I.; Medina, J.; Rull, F.; Lantz, C.; Poulet, F.; Cousin, A.; Dypvik, H.; Hellevang, H.; Werner, S. C.

    2018-04-01

    The PTAL website will store multispectral analyses of samples collected from several terrestrial analogue sites and aims to become a cornerstone tool for the scientific community interested in deepening knowledge of Mars' geological processes.

  8. Issues in NASA Program and Project Management: A Collection of Papers on Aerospace Management Issues (Supplement 11)

    NASA Technical Reports Server (NTRS)

    Hoffman, Edward J. (Editor); Lawbaugh, William M. (Editor)

    1996-01-01

    Papers address the following topics: NASA's project management development process; Better decisions through structural analysis; NASA's commercial technology management system; Today's management techniques and tools; Program control in NASA - needs and opportunities; and Resources for NASA managers.

  9. Some Computer-Based Developments in Sociology.

    ERIC Educational Resources Information Center

    Heise, David R.; Simmons, Roberta G.

    1985-01-01

    Discusses several ways in which computers are being used in sociology and how they continue to change this discipline. Areas considered include data collection, data analysis, simulations of social processes based on mathematical models, and problem areas (including standardization concerns, training, and the financing of computing facilities).…

  10. Teachers' Perception of Social Justice in Mathematics Classrooms

    ERIC Educational Resources Information Center

    Panthi, Ram Krishna; Luitel, Bal Chandra; Belbase, Shashidhar

    2017-01-01

    The purpose of this study was to explore mathematics teachers' perception of social justice in mathematics classrooms. We applied interpretive qualitative method for data collection, analysis, and interpretation through iterative process. We administered in-depth semi-structured interviews to capture the perceptions of three mathematics teachers…

  11. Teachers' Perception of Social Justice in Mathematics Classrooms

    ERIC Educational Resources Information Center

    Panthi, Ram Krishna; Luitel, Bal Chandra; Belbase, Shashidhar

    2018-01-01

    The purpose of this study was to explore mathematics teachers' perception of social justice in mathematics classrooms. We applied interpretive qualitative method for data collection, analysis, and interpretation through iterative process. We administered in-depth semi-structured interviews to capture the perceptions of three mathematics teachers…

  12. Nonstationary signal analysis in episodic memory retrieval

    NASA Astrophysics Data System (ADS)

    Ku, Y. G.; Kawasumi, Masashi; Saito, Masao

    2004-04-01

    The problem of blind source separation from a mixture that exhibits nonstationarity arises in signal processing, speech processing, spectral analysis and so on. This study analyzed EEG signals during episodic memory retrieval using independent component analysis (ICA) and time-varying autoregressive (TVAR) modeling, and proposes a method that combines the two. The signal from the brain not only exhibits nonstationary behavior, but also contains artifacts. EEG data were collected from the scalp over the frontal lobe (F3) during the episodic memory retrieval task, and the method was applied to these data for analysis. The artifact (eye movement) is removed by ICA, and a single burst (around 6 Hz) is obtained by TVAR, suggesting that the burst is related to brain activity during episodic memory retrieval.
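
    ICA-based artifact removal is conventionally run on multichannel recordings, so the single-channel F3 setting of this paper differs in detail; below is a minimal sketch of the generic idea on synthetic mixtures, using scikit-learn's FastICA and a kurtosis heuristic to pick the eye-movement component (both are assumptions, not the authors' method):

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(1)
        fs = 256
        t = np.arange(0, 4, 1 / fs)

        # Synthetic sources: a ~6 Hz burst, background noise, an eye blink.
        theta = np.sin(2 * np.pi * 6 * t) * (t > 2)
        noise = rng.normal(size=t.size)
        blink = 5 * np.exp(-((t - 1.0) ** 2) / 0.01)
        X = np.c_[theta, noise, blink] @ rng.normal(size=(3, 3)).T  # "scalp" mix

        ica = FastICA(n_components=3, random_state=0)
        S = ica.fit_transform(X)

        # Zero the spikiest (highest-kurtosis) component, then reconstruct.
        kurt = ((S - S.mean(0)) ** 4).mean(0) / S.var(0) ** 2
        S[:, np.argmax(kurt)] = 0
        clean = ica.inverse_transform(S)
        print(clean.shape)  # (1024, 3)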

  13. Qualitative data analysis: conceptual and practical considerations.

    PubMed

    Liamputtong, Pranee

    2009-08-01

    Qualitative inquiry requires that collected data is organised in a meaningful way, and this is referred to as data analysis. Through analytic processes, researchers turn what can be voluminous data into understandable and insightful analysis. This paper sets out the different approaches that qualitative researchers can use to make sense of their data including thematic analysis, narrative analysis, discourse analysis and semiotic analysis and discusses the ways that qualitative researchers can analyse their data. I first discuss salient issues in performing qualitative data analysis, and then proceed to provide some suggestions on different methods of data analysis in qualitative research. Finally, I provide some discussion on the use of computer-assisted data analysis.

  14. Framework for assessing the capacity of a health ministry to conduct health policy processes--a case study from Tajikistan.

    PubMed

    Mirzoev, Tolib N; Green, Andrew; Van Kalliecharan, Ricky

    2015-01-01

    An adequate capacity of ministries of health (MOH) to develop and implement policies is essential. However, no frameworks were found for assessing MOH capacity to conduct health policy processes within developing countries. This paper presents a conceptual framework for assessing MOH capacity to conduct policy processes based on a study from Tajikistan, a former Soviet republic where independence highlighted capacity challenges. The data collection for this qualitative study included in-depth interviews, document reviews and observations of policy events. The framework approach was used for analysis. The conceptual framework was informed by existing literature, guided the data collection and analysis, and was subsequently refined following insights from the study. The Tajik MOH capacity, while gradually improving, remains weak. There is poor recognition of wider contextual influences, ineffective leadership and governance as reflected in centralised decision-making, limited use of evidence, inadequate actors' participation and ineffective use of resources to conduct policy processes. However, the question is whether this is a reflection of lack of MOH ability or evidence of a constraining environment, or both. The conceptual framework identifies five determinants of robust policy processes, each with specific capacity needs: policy context, MOH leadership and governance, involvement of policy actors, the role of evidence and effective resource use for policy processes. Three underlying considerations are important for applying the capacity to policy processes: the need for clear focus, recognition of capacity levels and elements, and both ability and enabling environment. The proposed framework can be used in assessing and strengthening the capacity of different policy actors. Copyright © 2013 John Wiley & Sons, Ltd.

  15. Using Earth Observations to Understand and Predict Infectious Diseases

    NASA Technical Reports Server (NTRS)

    Soebiyanto, Radina P.; Kiang, Richard

    2015-01-01

    This presentation discusses the processes from data collection and processing to analysis involved in unraveling patterns between disease outbreaks and the surrounding environment and meteorological conditions. We used these patterns to estimate when and where disease outbreaks will occur. As a case study, we will present our work on assessing the relationship between meteorological conditions and influenza in Central America. Our work represents the discovery, prescriptive and predictive aspects of data analytics.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doug Blankenship

    PDFs of seismic reflection profiles 101, 110 and 111 local to the West Flank FORGE site. The 45 line-kilometers of seismic reflection data are processed data collected in 2001 using vibroseis trucks. The initial analysis and interpretation of these data were performed by Unruh et al. (2001). Optim processed these data by inverting the P-wave first arrivals to create a 2-D velocity structure. Kirchhoff images were then created for each line using the velocity tomograms (Unruh et al., 2001).

  17. Qualitative Methods in Mental Health Services Research

    PubMed Central

    Palinkas, Lawrence A.

    2014-01-01

    Qualitative and mixed methods play a prominent role in mental health services research. However, the standards for their use are not always evident, especially for those not trained in such methods. This paper reviews the rationale and common approaches to using qualitative and mixed methods in mental health services and implementation research based on a review of the papers included in this special series along with representative examples from the literature. Qualitative methods are used to provide a “thick description” or depth of understanding to complement breadth of understanding afforded by quantitative methods, elicit the perspective of those being studied, explore issues that have not been well studied, develop conceptual theories or test hypotheses, or evaluate the process of a phenomenon or intervention. Qualitative methods adhere to many of the same principles of scientific rigor as quantitative methods, but often differ with respect to study design, data collection and data analysis strategies. For instance, participants for qualitative studies are usually sampled purposefully rather than at random and the design usually reflects an iterative process alternating between data collection and analysis. The most common techniques for data collection are individual semi-structured interviews, focus groups, document reviews, and participant observation. Strategies for analysis are usually inductive, based on principles of grounded theory or phenomenology. Qualitative methods are also used in combination with quantitative methods in mixed method designs for convergence, complementarity, expansion, development, and sampling. Rigorously applied qualitative methods offer great potential in contributing to the scientific foundation of mental health services research. PMID:25350675

  18. Qualitative and mixed methods in mental health services and implementation research.

    PubMed

    Palinkas, Lawrence A

    2014-01-01

    Qualitative and mixed methods play a prominent role in mental health services research. However, the standards for their use are not always evident, especially for those not trained in such methods. This article reviews the rationale and common approaches to using qualitative and mixed methods in mental health services and implementation research based on a review of the articles included in this special series along with representative examples from the literature. Qualitative methods are used to provide a "thick description" or depth of understanding to complement breadth of understanding afforded by quantitative methods, elicit the perspective of those being studied, explore issues that have not been well studied, develop conceptual theories or test hypotheses, or evaluate the process of a phenomenon or intervention. Qualitative methods adhere to many of the same principles of scientific rigor as quantitative methods but often differ with respect to study design, data collection, and data analysis strategies. For instance, participants for qualitative studies are usually sampled purposefully rather than at random and the design usually reflects an iterative process alternating between data collection and analysis. The most common techniques for data collection are individual semistructured interviews, focus groups, document reviews, and participant observation. Strategies for analysis are usually inductive, based on principles of grounded theory or phenomenology. Qualitative methods are also used in combination with quantitative methods in mixed-method designs for convergence, complementarity, expansion, development, and sampling. Rigorously applied qualitative methods offer great potential in contributing to the scientific foundation of mental health services research.

  19. The evolution of process-based hydrologic models: historical challenges and the collective quest for physical realism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, Martyn P.; Bierkens, Marc F. P.; Samaniego, Luis

    The diversity in hydrologic models has historically led to great controversy on the correct approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. Here, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We also illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.

  20. The evolution of process-based hydrologic models: historical challenges and the collective quest for physical realism

    DOE PAGES

    Clark, Martyn P.; Bierkens, Marc F. P.; Samaniego, Luis; ...

    2017-07-11

    The diversity in hydrologic models has historically led to great controversy on the correct approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. Here, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We also illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.

  1. Incremental Transductive Learning Approaches to Schistosomiasis Vector Classification

    NASA Astrophysics Data System (ADS)

    Fusco, Terence; Bi, Yaxin; Wang, Haiying; Browne, Fiona

    2016-08-01

    Collecting epidemic disease data for analysis is a labour-intensive, time-consuming and expensive process, resulting in sparse sample data from which prediction models must be developed. To address this sparse data issue, we present novel Incremental Transductive methods that circumvent the data collection process by applying previously acquired data to provide consistent, confidence-based labelling alternatives to field survey research. We investigated various reasoning approaches for semi-supervised machine learning, including Bayesian models, for labelling data. The results show that using the proposed methods, we can label instances of data with a class of vector density at a high level of confidence. By applying the Liberal and Strict Training Approaches, we provide a labelling and classification alternative to standalone algorithms. The methods in this paper are components in the process of reducing the proliferation of the Schistosomiasis disease and its effects.
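
    The confidence-based incremental labelling can be read as a self-training loop: fit on the labelled pool, absorb only unlabelled instances predicted above a confidence threshold, and repeat. The classifier, threshold and toy data below are assumptions rather than the authors' exact setup (a stricter threshold plays the role of the Strict approach, a lower one the Liberal approach):

        import numpy as np
        from sklearn.naive_bayes import GaussianNB

        def incremental_transductive(X_lab, y_lab, X_unlab, threshold=0.95):
            """Self-training: repeatedly fit, then move unlabelled points
            whose top class probability exceeds `threshold` into the
            labelled pool with their predicted labels."""
            clf = GaussianNB()
            pool = X_unlab.copy()
            while len(pool):
                clf.fit(X_lab, y_lab)
                proba = clf.predict_proba(pool)
                confident = proba.max(axis=1) >= threshold
                if not confident.any():
                    break
                X_lab = np.vstack([X_lab, pool[confident]])
                y_lab = np.r_[y_lab, clf.classes_[proba[confident].argmax(axis=1)]]
                pool = pool[~confident]
            return clf, X_lab, y_lab

        # Toy example: two vector-density classes, mostly unlabelled.
        rng = np.random.default_rng(0)
        lo, hi = rng.normal(0, 1, (10, 2)), rng.normal(3, 1, (10, 2))
        X_lab = np.vstack([lo[:3], hi[:3]]); y_lab = np.r_[[0] * 3, [1] * 3]
        clf, X_all, y_all = incremental_transductive(X_lab, y_lab, np.vstack([lo[3:], hi[3:]]))
        print(len(y_all), "labelled instances after self-training")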

  2. A conceptual framework and classification of capability areas for business process maturity

    NASA Astrophysics Data System (ADS)

    Van Looy, Amy; De Backer, Manu; Poels, Geert

    2014-03-01

    The article elaborates on business process maturity, which indicates how well an organisation can perform based on its business processes, i.e. on its way of working. This topic is of paramount importance for managers who try to excel in today's competitive world. Hence, business process maturity is an emerging research field. However, no consensus exists on the capability areas (or skills) needed to excel. Moreover, their theoretical foundation and synergies with other fields are frequently neglected. To overcome this gap, our study presents a conceptual framework with six main capability areas and 17 sub areas. It draws on theories regarding the traditional business process lifecycle, which are supplemented by recognised organisation management theories. The comprehensiveness of this framework is validated by mapping 69 business process maturity models (BPMMs) to the identified capability areas, based on content analysis. Nonetheless, as no consensus exists among the collected BPMMs either, a classification of different maturity types is proposed, based on cluster analysis and discriminant analysis. Consequently, the findings contribute to the grounding of business process literature. Possible future avenues are evaluating existing BPMMs, directing new BPMMs or investigating which combinations of capability areas (i.e. maturity types) contribute more to performance than others.

  3. Combining microwave resonance technology to multivariate data analysis as a novel PAT tool to improve process understanding in fluid bed granulation.

    PubMed

    Lourenço, Vera; Herdling, Thorsten; Reich, Gabriele; Menezes, José C; Lochmann, Dirk

    2011-08-01

    A set of 192 fluid bed granulation batches at industrial scale were in-line monitored using microwave resonance technology (MRT) to determine moisture, temperature and density of the granules. Multivariate data analysis techniques such as multiway partial least squares (PLS), multiway principal component analysis (PCA) and multivariate batch control charts were applied to the collected batch data sets. The combination of all these techniques, along with off-line particle size measurements, led to significantly increased process understanding. A seasonality effect could be put into evidence that impacted further processing through its influence on the final granule size. Moreover, it was demonstrated by means of PLS that a relation between the particle size and the MRT measurements can be quantitatively defined, highlighting a potential ability of the MRT sensor to predict information about the final granule size. This study has contributed to improving a fluid bed granulation process, and the process knowledge obtained shows that the product quality can be built in process design, following Quality by Design (QbD) and Process Analytical Technology (PAT) principles. Copyright © 2011. Published by Elsevier B.V.
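
    The batch-monitoring idea behind the multivariate control charts can be sketched as a Hotelling T-squared chart on PCA scores; the batch dimensions and the F-distribution control limit below are textbook conventions, not details from the paper:

        import numpy as np
        from scipy.stats import f as f_dist
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(2)
        X = rng.normal(size=(40, 150))  # 40 unfolded reference batches

        pca = PCA(n_components=3).fit(X)
        n, a = X.shape[0], pca.n_components_

        # Hotelling T^2 per batch and a 99% F-based control limit.
        t2 = ((pca.transform(X) ** 2) / pca.explained_variance_).sum(axis=1)
        limit = a * (n - 1) * (n + 1) / (n * (n - a)) * f_dist.ppf(0.99, a, n - a)

        new_batch = X[:1] * 4  # hypothetical deviating batch
        t2_new = ((pca.transform(new_batch) ** 2) / pca.explained_variance_).sum()
        print(f"reference max T2 = {t2.max():.1f}, new batch T2 = {t2_new:.1f}, limit = {limit:.1f}")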

  4. Scripts for TRUMP data analyses. Part II (HLA-related data): statistical analyses specific for hematopoietic stem cell transplantation.

    PubMed

    Kanda, Junya

    2016-01-01

    The Transplant Registry Unified Management Program (TRUMP) made it possible for members of the Japan Society for Hematopoietic Cell Transplantation (JSHCT) to analyze large sets of national registry data on autologous and allogeneic hematopoietic stem cell transplantation. However, as the processes used to collect transplantation information are complex and have differed over time, the background of these processes should be understood when using TRUMP data. Previously, information on the HLA locus of patients and donors had been collected using a questionnaire-based free-description method, resulting in some input errors. To correct minor but significant errors and provide accurate HLA matching data, the use of a Stata or EZR/R script offered by the JSHCT is strongly recommended when analyzing HLA data in the TRUMP dataset. The HLA mismatch direction, mismatch counting method, and different impacts of HLA mismatches by stem cell source are other important factors in the analysis of HLA data. Additionally, researchers should understand the statistical analyses specific for hematopoietic stem cell transplantation, such as competing risk, landmark analysis, and time-dependent analysis, to correctly analyze transplant data. The data center of the JSHCT can be contacted if statistical assistance is required.
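
    Of the transplant-specific techniques listed, landmark analysis is the most compact to illustrate: keep only patients still at risk at the landmark time, reset the clock, and compare groups defined at that time point. A sketch with the lifelines library and hypothetical columns (the JSHCT's actual scripts are written for Stata and EZR/R):

        import pandas as pd
        from lifelines import KaplanMeierFitter

        # Hypothetical registry extract: follow-up months, death indicator,
        # and whether GVHD had occurred by the 3-month landmark.
        df = pd.DataFrame({
            "months": [3, 8, 14, 20, 26, 30, 35, 40],
            "dead": [1, 1, 0, 1, 0, 1, 0, 0],
            "gvhd_by_3mo": [0, 1, 0, 1, 1, 0, 1, 0],
        })

        landmark = 3
        at_risk = df[df["months"] > landmark].copy()  # exclude earlier events
        at_risk["months"] -= landmark                 # reset the time origin

        kmf = KaplanMeierFitter()
        for flag, grp in at_risk.groupby("gvhd_by_3mo"):
            kmf.fit(grp["months"], grp["dead"], label=f"GVHD by 3 mo = {flag}")
            print(flag, kmf.median_survival_time_)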

  5. Analysis of Students' Geometric Thinking and Process-Oriented Guided Inquiry Learning Model

    NASA Astrophysics Data System (ADS)

    Hardianti, D.; Priatna, N.; Priatna, B. A.

    2017-09-01

    This research aims to analyse students' geometric thinking ability and theoretically examine the process-oriented guided inquiry learning (POGIL) model. This study uses a qualitative approach with a descriptive method, as the research was done without any treatment on subjects and data were collected naturally. The study was conducted in one of the State Junior High Schools in Bandung. The population was second grade students and the sample was 32 students. Data on students' geometric thinking ability were collected through a geometric thinking test, with questions constructed according to the characteristics of geometric thinking in van Hiele's theory. Based on the results of the analysis and discussion, students' geometric thinking ability is still low, so it needs to be improved. Therefore, an effort is needed to overcome the problems related to students' geometric thinking ability. One such effort is learning that facilitates students in constructing their own geometry concepts, especially concepts of quadrilaterals, so that their geometric thinking ability can be maximally enhanced. Based on a study of the theory, one of the learning models that can enhance students' geometric thinking ability is the POGIL model.

  6. Establishing a Secure Data Center with Remote Access: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonder, J.; Burton, E.; Murakami, E.

    2012-04-01

    Access to existing travel data is critical for many analysis efforts that lack the time or resources to support detailed data collection. High-resolution data sets provide particular value, but also present a challenge for preserving the anonymity of the original survey participants. To address this dilemma of providing data access while preserving privacy, the National Renewable Energy Laboratory and the U.S. Department of Transportation have launched the Transportation Secure Data Center (TSDC). TSDC data sets include those from regional travel surveys and studies that increasingly use global positioning system devices. Data provided by different collecting agencies varies with respect to formatting, elements included and level of processing conducted in support of the original purpose. The TSDC relies on a number of geospatial and other analysis tools to ensure data quality and to generate useful information outputs. TSDC users can access the processed data in two different ways. The first is by downloading summary results and second-by-second vehicle speed profiles (with latitude/longitude information removed) from a publicly-accessible website. The second method involves applying for a remote connection account to a controlled-access environment where spatial analysis can be conducted, but raw data cannot be removed.
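
    The first access route, summary results with coordinates stripped, can be sketched with pandas; the column names and trip data are hypothetical:

        import pandas as pd

        # Hypothetical second-by-second GPS trace for one trip.
        raw = pd.DataFrame({
            "trip_id": [1] * 5,
            "t": range(5),
            "lat": [39.74] * 5,
            "lon": [-105.18] * 5,
            "speed_mps": [0.0, 3.1, 7.8, 12.4, 13.0],
        })

        # Publishable product: drop coordinates, keep the speed profile
        # and per-trip summaries, mirroring the downloadable outputs.
        profile = raw.drop(columns=["lat", "lon"])
        summary = profile.groupby("trip_id")["speed_mps"].agg(["mean", "max"])
        print(summary)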

  7. Collaboration: a SWOT analysis of the process of conducting a review of nursing workforce policies in five European countries.

    PubMed

    Uhrenfeldt, Lisbeth; Lakanmaa, Riitta-Liisa; Flinkman, Mervi; Basto, Marta Lima; Attree, Moira

    2014-05-01

    This paper critically reviews the literature on international collaboration and analyses the collaborative process involved in producing a nursing workforce policy analysis. Collaboration is increasingly promoted as a means of solving shared problems and achieving common goals; however, collaboration creates its own opportunities and challenges. Evidence about the collaboration process, its outcomes and critical success factors is lacking. The methods comprised a literature review and content analysis of data collected from six participants (from five European countries), members of the European Academy of Nursing Science Scholar Collaborative Workforce Workgroup, using a SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis template. Two major factors affecting scholarly collaboration were identified. The first, facilitators, incorporated personal attributes and enabling contexts/mechanisms, including individual commitment, responsibility and teamwork, and facilitative supportive structures and processes. The second, barriers, incorporated unmet needs for funding, time and communication, and impeding contexts/mechanisms, including workload and insufficient support/mentorship. The literature review identified a low level of evidence on collaboration processes, outcomes, opportunities and challenges. The SWOT analysis identified critical success factors, planning strategies and resources of effective international collaboration. Collaboration is an important concept for management. Evidence-based knowledge of the critical success factors facilitating and impeding collaboration could help managers make collaboration more effective. © 2012 John Wiley & Sons Ltd.

  8. Study protocol of a mixed-methods evaluation of a cluster randomized trial to improve the safety of NSAID and antiplatelet prescribing: data-driven quality improvement in primary care.

    PubMed

    Grant, Aileen; Dreischulte, Tobias; Treweek, Shaun; Guthrie, Bruce

    2012-08-28

    Trials of complex interventions are criticized for being 'black box', so the UK Medical Research Council recommends carrying out a process evaluation to explain the trial findings. We believe it is good practice to pre-specify and publish process evaluation protocols to set standards and minimize bias. Unlike protocols for trials, little guidance or standards exist for the reporting of process evaluations. This paper presents the mixed-method process evaluation protocol of a cluster randomized trial, drawing on a framework designed by the authors. This mixed-method evaluation is based on four research questions and maps data collection to a logic model of how the data-driven quality improvement in primary care (DQIP) intervention is expected to work. Data collection will be predominately by qualitative case studies in eight to ten of the trial practices, focus groups with patients affected by the intervention and quantitative analysis of routine practice data, trial outcome and questionnaire data and data from the DQIP intervention. We believe that pre-specifying the intentions of a process evaluation can help to minimize bias arising from potentially misleading post-hoc analysis. We recognize it is also important to retain flexibility to examine the unexpected and the unintended. From that perspective, a mixed-methods evaluation allows the combination of exploratory and flexible qualitative work, and more pre-specified quantitative analysis, with each method contributing to the design, implementation and interpretation of the other. As well as strengthening the study, the authors hope to stimulate discussion among their academic colleagues about publishing protocols for evaluations of randomized trials of complex interventions. Trial registration: ClinicalTrials.gov NCT01425502.

  9. Biotransformation of aesculin by human gut bacteria and identification of its metabolites in rat urine.

    PubMed

    Ding, Wei-Jun; Deng, Yun; Feng, Hao; Liu, Wei-Wei; Hu, Rong; Li, Xiang; Gu, Zhe-Ming; Dong, Xiao-Ping

    2009-03-28

    To observe the biotransformation of the Chinese compound aesculin by human gut bacteria, and to identify its metabolites in rat urine. Representative human gut bacteria were collected from 20 healthy volunteers, and then utilized in vitro to biotransform aesculin under anaerobic conditions. At 0, 2, 4, 8, 12, 16, 24, 48 and 72 h post-incubation, 10 mL of culture medium was collected. Metabolites of aesculin were extracted three times from rat urine with methanol and analyzed by HPLC. For in vivo metabolite analysis, aesculetin (100 mg/kg) was administered to rats via stomach gavage, rat urine was collected from 6 to 48 h post-administration, and metabolite analysis was performed by LC/ESI-MS and MS/MS in the positive and negative modes. Human gut bacteria could completely convert aesculin into aesculetin in vitro. The biotransformation occurred from 8 to 24 h post-incubation, with the highest activity seen from 8 to 12 h. The in vitro process was much slower than the in vivo process. In contrast to the in vitro model, six aesculetin metabolites were identified in rat urine, including 6-hydroxy-7-gluco-coumarin (M1), 6-hydroxy-7-sulf-coumarin (M2), 6,7-di-gluco-coumarin (M3), 6-glc-7-gluco-coumarin (M4), 6-O-methyl-7-gluco-coumarin (M5) and 6-O-methyl-7-sulf-coumarin (M6), of which M2 and M6 were novel. Aesculin can be converted into aesculetin by human gut bacteria and is further modified by the host in vivo. The diverse metabolites of aesculin may explain its pleiotropic pharmaceutical effects.

  10. Crafting glass vessels: current research on the ancient glass collections in the Freer Gallery of Art, Washington, D.C.

    NASA Astrophysics Data System (ADS)

    Nagel, Alexander; McCarthy, Blythe; Bowe, Stacy

    Our knowledge of glass production in ancient Egypt has been well augmented by the publication of recently excavated materials and glass workshops, but also by more recent materials analysis, and experiments of modern glass-makers attempting to reconstruct the production process of thin-walled core-formed glass vessels. From the mounting of a prefabricated core to the final glass product, our understanding of this profession has much improved. The small but well-preserved glass collection of the Freer Gallery of Art in Washington, D.C. is a valuable resource for examining and studying the technology and production of ancient Egyptian core-formed glass vessels. Charles Lang Freer (1854-1919) acquired most of the material from Giovanni Dattari in Cairo in 1909. Previously, the glass had received only limited discussion, suggesting that most of these vessels were produced in the 18th Dynasty in the 15th and 14th centuries BCE, while others date from the Hellenistic period and later. In an ongoing project we conducted computed radiography in conjunction with qualitative x-ray fluorescence analysis on a selected group of vessels to understand further aspects of the ancient production process. This paper will provide an overview of our recent research and present our data-gathering process and preliminary results. How can the examinations of core-formed glass vessels in the Freer Gallery contribute to our understanding of ancient glass production and technology? By focusing on new ways of looking at old assumptions using the Freer Gallery glass collections, we hope to increase understanding of the challenges of the production process of core-vessel technology as represented by these vessels.

  11. Can the collection of expired long-lasting insecticidal nets reduce their coverage and use? Sociocultural aspects related to LLIN life cycle management and use in four districts in Madagascar.

    PubMed

    Ramanantsoa, Ambinina; Wilson-Barthes, Marta; Rahenintsoa, Rindra; Hoibak, Sarah; Ranaivoharimina, Harilala; Rahelimalala, Martha Delphine; Rakotomanga, Avotiana; Finlay, Alyssa; Muela Ribera, Joan; Peeters Grietens, Koen

    2017-10-10

    There is growing awareness of the likely impact increased numbers of LLINs will have on the environment, if not disposed of or recycled appropriately. As part of a World Health Organization (WHO) and United Nations Environment Programme (UNEP) pilot study to assess environmentally-sound and cost-effective LLIN recycling strategies, the USAID-Deliver Project collected 22,559 used bed nets in Madagascar. A social science study was conducted to provide data on socio-cultural factors related to collection and replacement of LLINs, including impact on primary and other net uses. Ethnographic exploratory research was carried out following the pilot USAID-Deliver net collection and recycling campaign in Betioky, Tsihombe, Fenerive Est and Ambanja districts of Madagascar, triangulating participant observation, interviewing and group discussions. Sampling was theoretical and data analysis was a continuous and iterative process concurrent to data collection. Final analysis was conducted using NVivo10. The following themes emerged as contributing to the success of collecting expired LLINs in the community for recycling purposes: (i) net adequacy and preference: characteristic differences between collected and newly distributed nets lead to communities' reticence to relinquish old nets before confirming new nets were appropriate for intended use. Where newly distributed nets failed to meet local requirements, this was expected to increase alternative uses and decrease household turn over. (ii) Net collection strategies: the net collection campaign brought net use out of the private sphere and into the public arena. Net owners reported feeling ashamed when presenting damaged nets in public for collection, leading to reduced net relinquishment. (iii) Net lifecycle: communities perceived nets as being individually owned and economic value was attributed both to good-condition nets for sleeping and to worn nets for alternative/secondary purposes. Collecting nets at the stage of waste rather than at their prescribed end of life was locally acceptable. The collection of LLINs for recycling/disposal can lead to lower coverage under certain conditions. Collecting used LLINs may be appropriate under the following conditions: (i) nets are collected at the stage of waste; (ii) new nets are in line with community preferences; and (iii) collection strategies have been agreed upon within the community prior to replacement activities. Any collection/recycling of old LLINs should be based on in-depth understanding of the local context and include participatory processes to prevent reduced coverage.

  12. Test Data Monitor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosas, Joseph

    The National Security Campus (NSC) collects a large amount of test data used to accept high-value and high-rigor product. The data has historically been used to support root cause analysis when anomalies are detected in down-stream processes. The opportunity to use the data for predictive failure analysis, however, had never been exploited. The primary goal of the Test Data Monitor (TDM) software is to provide automated capabilities to analyze data in near-real-time and report trends that foreshadow actual product failures. To date, the aerospace industry as a whole is challenged at utilizing collected data to the degree that modern technology allows. As a result of the innovation behind TDM, Honeywell is able to monitor millions of data points through a multitude of SPC algorithms continuously and autonomously, so that personnel resources can more efficiently and accurately direct their attention to suspect processes or features. TDM's capabilities have been recognized by our U.S. Department of Energy National Nuclear Security Administration (NNSA) sponsor for potential use at other sites within the NNSA. This activity supports multiple initiatives, including expectations of the NNSA and broader corporate goals that center around data-based quality controls on production.
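
    TDM's actual SPC algorithms are not spelled out here; as a sketch of the kind of rule set involved, two classic control-chart checks over a measurement stream (the centre line, sigma and the drifting data are hypothetical):

        import numpy as np

        def spc_flags(x, mu, sigma):
            """Flag two classic control-chart violations: (1) any point
            beyond 3-sigma, and (2) eight consecutive points on the same
            side of the centre line, a common drift signature."""
            x = np.asarray(x, dtype=float)
            beyond_3s = np.abs(x - mu) > 3 * sigma
            side = np.sign(x - mu)
            run8 = np.array([i >= 7 and abs(side[i - 7:i + 1].sum()) == 8
                             for i in range(len(x))])
            return beyond_3s, run8

        rng = np.random.default_rng(3)
        x = np.r_[rng.normal(0, 1, 30), rng.normal(1.5, 1, 12)]  # late drift
        pts, runs = spc_flags(x, mu=0.0, sigma=1.0)
        print("3-sigma hits:", pts.sum(), "| run-of-8 hits:", runs.sum())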

  13. Chemical composition of size-segregated aerosol collected all year-round at Concordia Station (Dome C, Antarctica). Transport processes and climatic implications.

    NASA Astrophysics Data System (ADS)

    Udisti, Roberto; Becagli, Silvia; Frosini, Daniele; Galli, Gaia; Ghedini, Costanza; Rugi, Francesco; Severi, Mirko; Traversi, Rita

    2010-05-01

    Ice-core stratigraphies of chemical components of atmospheric gases and aerosols trapped in the snow layers by scavenging processes are a powerful tool in understanding past climatic and environmental changes. The deep ice core drilled at Dome C in the framework of the EPICA project allowed reconstructing the last 8 glacial-interglacial cycles and highlighted the complex relationships between climatic forcings and environmental feedback processes. In interpreting ice core records as a function of past climatic variations, some difficulties arise from uncertainties in considering selected chemical species as reliable markers of climatic and environmental processes and in attributing the different load and composition of aerosols over Antarctica to changes in source intensity (such as aridity, wind strength, emersion of continental platform by sea-level lowering, etc.) and/or to variations in atmospheric processes (such as meridional and zonal atmospheric circulation, polar vortex intensity, scavenging efficiency, transport pathways, etc.). Besides, two new aspects are currently under discussion: the possible use of Na as a sea-ice cover marker (via frost flower formation on the sea-ice surface during pack-ice formation) and the identification of continental source areas for mineral dust reaching internal regions of Antarctica during glacial and interglacial periods. In order to better address such controversial issues, since 2005 a continuous, high temporal resolution size-segregated aerosol and surface snow sampling has been performed at Dome C (central East Antarctic Plateau, 75° 06' S, 123° 23' E), in the framework of the "Station Concordia" Project (an Italian PNRA-French IPEV joint program). The chemical analysis of size-segregated aerosol and daily superficial snow samples, collected all year-round for more than 4 years, can contribute to clarifying some of the above-mentioned topics. In particular: the possible seasonal pattern of sea spray aerosol could be related to sea-ice formation timing and/or to changes in zonal wind intensity and atmospheric pathway; the mineralogical analysis of insoluble dust particles can allow the identification of continental sources, by comparison with soils collected in the potential source areas (PSAs); finally, the seasonal pattern of biogenic markers (such as methanesulphonic acid and non-sea-salt sulphate) can be linked to sea surface temperature, sea-ice cover and southern-hemisphere circulation modes (e.g., SOI, AAO or SAM and ACW). As regards depositional and post-depositional processes, the analysis of chemical markers in aerosol, superficial snow and hoar crystals, sampled contemporaneously, will allow understanding of the key factors (e.g., snow acidity, solar irradiation) affecting the preservation of components reversibly fixed in the snow layers (such as, for instance, methanesulphonic acid, nitrate and chloride). A summary of the major results from the chemical analysis of aerosol and snow collected at Dome C is presented here.

  14. ClearedLeavesDB: an online database of cleared plant leaf images

    PubMed Central

    2014-01-01

    Background: Leaf vein networks are critical to both the structure and function of leaves. A growing body of recent work has linked leaf vein network structure to the physiology, ecology and evolution of land plants. In the process, multiple institutions and individual researchers have assembled collections of cleared leaf specimens in which vascular bundles (veins) are rendered visible. In an effort to facilitate analysis and digitally preserve these specimens, high-resolution images are usually created, either of entire leaves or of magnified leaf subsections. In a few cases, collections of digital images of cleared leaves are available for use online. However, these collections do not share a common platform nor is there a means to digitally archive cleared leaf images held by individual researchers (in addition to those held by institutions). Hence, there is a growing need for a digital archive that enables online viewing, sharing and disseminating of cleared leaf image collections held by both institutions and individual researchers. Description: The Cleared Leaf Image Database (ClearedLeavesDB) is an online web-based resource for a community of researchers to contribute, access and share cleared leaf images. ClearedLeavesDB leverages resources of large-scale, curated collections while enabling the aggregation of small-scale collections within the same online platform. ClearedLeavesDB is built on Drupal, an open source content management platform. It allows plant biologists to store leaf images online with corresponding meta-data, share image collections with a user community and discuss images and collections via a common forum. We provide tools to upload processed images and results to the database via a web services client application that can be downloaded from the database. Conclusions: We developed ClearedLeavesDB, a database focusing on cleared leaf images that combines interactions between users and data via an intuitive web interface. The web interface allows storage of large collections and integrates with leaf image analysis applications via an open application programming interface (API). The open API allows uploading of processed images and other trait data to the database, further enabling distribution and documentation of analyzed data within the community. The initial database is seeded with nearly 19,000 cleared leaf images representing over 40 GB of image data. Extensible storage and growth of the database is ensured by using the data storage resources of the iPlant Discovery Environment. ClearedLeavesDB can be accessed at http://clearedleavesdb.org. PMID:24678985

  15. ClearedLeavesDB: an online database of cleared plant leaf images.

    PubMed

    Das, Abhiram; Bucksch, Alexander; Price, Charles A; Weitz, Joshua S

    2014-03-28

    Leaf vein networks are critical to both the structure and function of leaves. A growing body of recent work has linked leaf vein network structure to the physiology, ecology and evolution of land plants. In the process, multiple institutions and individual researchers have assembled collections of cleared leaf specimens in which vascular bundles (veins) are rendered visible. In an effort to facilitate analysis and digitally preserve these specimens, high-resolution images are usually created, either of entire leaves or of magnified leaf subsections. In a few cases, collections of digital images of cleared leaves are available for use online. However, these collections do not share a common platform nor is there a means to digitally archive cleared leaf images held by individual researchers (in addition to those held by institutions). Hence, there is a growing need for a digital archive that enables online viewing, sharing and disseminating of cleared leaf image collections held by both institutions and individual researchers. The Cleared Leaf Image Database (ClearedLeavesDB) is an online web-based resource for a community of researchers to contribute, access and share cleared leaf images. ClearedLeavesDB leverages resources of large-scale, curated collections while enabling the aggregation of small-scale collections within the same online platform. ClearedLeavesDB is built on Drupal, an open source content management platform. It allows plant biologists to store leaf images online with corresponding meta-data, share image collections with a user community and discuss images and collections via a common forum. We provide tools to upload processed images and results to the database via a web services client application that can be downloaded from the database. We developed ClearedLeavesDB, a database focusing on cleared leaf images that combines interactions between users and data via an intuitive web interface. The web interface allows storage of large collections and integrates with leaf image analysis applications via an open application programming interface (API). The open API allows uploading of processed images and other trait data to the database, further enabling distribution and documentation of analyzed data within the community. The initial database is seeded with nearly 19,000 cleared leaf images representing over 40 GB of image data. Extensible storage and growth of the database is ensured by using the data storage resources of the iPlant Discovery Environment. ClearedLeavesDB can be accessed at http://clearedleavesdb.org.
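
    The abstract mentions a web-services client for pushing processed images and trait data; the endpoint and fields below are purely hypothetical, sketching only the general shape of such an HTTP upload:

        import requests

        API_URL = "http://clearedleavesdb.org/api/upload"  # hypothetical path

        with open("cleared_leaf_processed.png", "rb") as img:
            resp = requests.post(
                API_URL,
                files={"image": img},
                data={"species": "Quercus alba", "vein_density": 4.2},
                timeout=30,
            )
        resp.raise_for_status()  # surface HTTP errors
        print(resp.status_code)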

  16. Healthy or Unhealthy Lifestyle: A Thematic Analysis of Iranian Male Adolescents' Perspectives.

    PubMed

    Zareiyan, Armin

    2017-01-01

    Identifying what adolescents perceive as their lifestyle, and exploring the factors persuading their decisions to engage in or avoid healthy or unhealthy lifestyle behaviors, could improve the ability of healthcare professionals to develop innovative preventive strategies and modify negative health behaviors in adolescents. However, the literature on adolescent health-related issues is largely reported by adults, with a rarity of information from adolescents themselves. A qualitative study using the thematic analysis approach was conducted. Data were collected through semi-structured, digitally recorded interviews with 32 male adolescents. Interviews were transcribed verbatim, and after data collection the thematic analysis process was conducted in six phases. Approximately 800 initial codes were extracted from the interview texts and reevaluated to yield 48 main themes. The final thematic map comprised 5 overarching themes and 12 subthemes, showing that interviewees emphasized an unhealthy lifestyle. The components of an unhealthy lifestyle seem important to them because they consider that they could lead a healthy lifestyle through the elimination of negative behaviors.

  17. Effects on Physiology and Performance of Wearing the Aviator NBC ensemble While Flying the UH-60 Helicopter Flight Simulator in a Controlled Heat Environment.

    DTIC Science & Technology

    1992-09-01

    and collecting and processing data. They were at the front line in interacting with the subjects and maintaining morale. They did an excellent job. They...second for 16 parameter channels, and the data were processed to produce a single root mean square (RMS) error value for each channel appropriate to...represented in the final analysis. Physiological data The physiological data on the VAX were processed by sampling them at 5-minute intervals throughout the

  18. Mineralogy and Elemental Composition of Wind Drift Soil at Rocknest, Gale Crater

    NASA Technical Reports Server (NTRS)

    Blake, D. F.; Bish, D. L.; Morris, R. V.; Downs, R. T.; Trieman, A. H.; Morrison, S. M.; Chipera, S. J.; Ming, D. W.; Yen, A. S.; Vaniman, D. T.

    2013-01-01

    The Mars Science Laboratory rover Curiosity has been exploring Mars since August 5, 2012, conducting engineering and first-time activities with its mobility system, arm, sample acquisition and processing system (SA/SPaH-CHIMRA) and science instruments. Curiosity spent 54 sols at a location named "Rocknest," collecting and processing five scoops of loose, unconsolidated materials ("soil") acquired from an aeolian bedform (Fig. 1). The Chemistry and Mineralogy (CheMin) instrument analyzed portions of scoops 3, 4, and 5, to obtain the first quantitative mineralogical analysis of Mars soil, and to provide context for Sample Analysis at Mars (SAM) measurements of volatiles, isotopes and possible organic materials.

  19. Analysis of translucent and opaque photocathodes.

    PubMed

    Sizelove, J R; Love III, J A

    1966-09-01

    Through an analysis of the photodetection process, the response of photodetectors to wide-band, noncoherent light is determined and guidelines for its improvement are derived. In this paper, the phenomenon of multiple reflections within the emitter of a reflecting-translucent and a reflecting-opaque photocathode is analyzed. Geometrical and optical configurations and solid state parameters are evaluated in terms of their effect on the photodetection process. The quantum yield, the percent of incident light absorbed, and the collection efficiency are determined as functions of the thickness of the emitting layer. These results are then employed to suggest areas of improvement in the use of state-of-the-art photocathodes.
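
    The thickness dependence the paper derives can be illustrated with a deliberately simplified absorption model: Beer-Lambert attenuation plus a single back-surface reflection (the paper treats the full multiple-reflection series; all constants here are hypothetical):

        import numpy as np

        def absorbed_fraction(d_nm, alpha=0.02, r_front=0.2, r_back=0.9):
            """Fraction of incident light absorbed in an emitter of
            thickness d under Beer-Lambert attenuation, counting one
            specular reflection from the back surface (two passes)."""
            t1 = np.exp(-alpha * d_nm)        # single-pass transmission
            entering = 1.0 - r_front          # light entering the layer
            first = entering * (1 - t1)       # absorbed on the way in
            second = entering * t1 * r_back * (1 - t1)  # reflected pass
            return first + second

        for d in (25, 50, 100, 200):          # hypothetical thicknesses, nm
            print(d, round(float(absorbed_fraction(d)), 3))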

  20. Quantifying biodiversity using digital cameras and automated image analysis.

    NASA Astrophysics Data System (ADS)

    Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

    2009-04-01

    Monitoring the effects on biodiversity of extensive grazing in complex semi-natural habitats is labour intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time-consuming aspect - the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and information that is being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600 m. Rainfall is high, and in most areas the soil consists of deep peat (1 m to 3 m), populated by a mix of heather, mosses and sedges. The cameras have been continuously in operation over a 6-month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial intelligence based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets in the collected data. By converting digital image data into statistical composite data it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions to simplifying the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and enabling automatic deletion of images generated by erroneous triggering (e.g. cloud movements). This is the first step to a hierarchical image processing framework, where situation subclasses such as birds or climatic conditions can be fed into more appropriate automated or semi-automated data mining software.
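
    The categorization step can be sketched with cheap global image features and an off-the-shelf classifier; the features, classifier and synthetic "frames" below are stand-ins, not the unsupervised and neural pipelines the study tested:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        def global_features(img):
            """Per-image features: channel means and stds plus a crude
            vertical-gradient proxy for in-frame structure."""
            means = img.mean(axis=(0, 1))
            stds = img.std(axis=(0, 1))
            grad = np.abs(np.diff(img.mean(axis=2), axis=0)).mean()
            return np.r_[means, stds, grad]

        # Synthetic stand-ins for labelled frames (0 = empty, 1 = animal).
        rng = np.random.default_rng(4)
        imgs = rng.uniform(size=(60, 32, 32, 3))
        imgs[30:, 10:20, 10:20, :] *= 0.2  # crude dark "animal" blobs
        X = np.array([global_features(im) for im in imgs])
        y = np.r_[np.zeros(30), np.ones(30)]

        clf = RandomForestClassifier(random_state=0)
        print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())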

  1. Data processing and analysis with the autoPROC toolbox

    PubMed Central

    Vonrhein, Clemens; Flensburg, Claus; Keller, Peter; Sharff, Andrew; Smart, Oliver; Paciorek, Wlodek; Womack, Thomas; Bricogne, Gérard

    2011-01-01

    A typical diffraction experiment will generate many images and data sets from different crystals in a very short time. This creates a challenge for the high-throughput operation of modern synchrotron beamlines as well as for the subsequent data processing. Novice users in particular may feel overwhelmed by the tables, plots and numbers that the different data-processing programs and software packages present to them. Here, some of the more common problems that a user has to deal with when processing a set of images that will finally make up a processed data set are shown, concentrating on difficulties that may often show up during the first steps along the path of turning the experiment (i.e. data collection) into a model (i.e. interpreted electron density). Difficulties such as unexpected crystal forms, issues in crystal handling and suboptimal choices of data-collection strategies can often be dealt with, or at least diagnosed, by analysing specific data characteristics during processing. In the end, one wants to distinguish problems over which one has no immediate control once the experiment is finished from problems that can be remedied a posteriori. A new software package, autoPROC, is also presented that combines third-party processing programs with new tools and an automated workflow script that is intended to provide users with both guidance and insight into the offline processing of data affected by the difficulties mentioned above, with particular emphasis on the automated treatment of multi-sweep data sets collected on multi-axis goniostats. PMID:21460447

  2. Cost and implementation analysis of a personal digital assistant system for laboratory data collection.

    PubMed

    Blaya, J A; Gomez, W; Rodriguez, P; Fraser, H

    2008-08-01

    Setting: One hundred and twenty-six public health centers and laboratories in Lima, Peru, without internet access. We have previously shown that a personal digital assistant (PDA) based system reduces data collection delays and errors for tuberculosis (TB) laboratory results when compared to a paper system. Objective: To assess the data collection efficiency of each system and the resources required to develop, implement and transfer the PDA-based system to a resource-poor setting. Design: A time-motion study of data collectors using the PDA-based and paper systems, and a cost analysis of developing, implementing and transferring the PDA-based system to a local organization and of their redeployment of the system. Results: Work hours spent collecting and processing results decreased by 60% (P < 0.001). Users perceived this decrease to be 70% and encountered no technical problems they could not fix. The total cost and time to develop and implement the intervention were US$26,092 and 22 weeks. The cost to extend the system to cover nine more districts was US$1,125, and to implement the collection of patient weights was US$4,107. Conclusions: A PDA-based system drastically reduced the effort required to collect TB laboratory results from remote locations. With the framework described, open-source software and local development, organizations in resource-poor settings could reap the benefits of this technology.

  3. Analysis of the mixing processes in the subtropical Advancetown Lake, Australia

    NASA Astrophysics Data System (ADS)

    Bertone, Edoardo; Stewart, Rodney A.; Zhang, Hong; O'Halloran, Kelvin

    2015-03-01

    This paper presents an extensive investigation of the mixing processes occurring in the subtropical monomictic Advancetown Lake, which is the main water body supplying the Gold Coast City in Australia. Meteorological, chemical and physical data were collected from weather stations, laboratory analysis of grab samples and an in-situ Vertical Profiling System (VPS), for the period 2008-2012. This comprehensive, high frequency dataset was utilised to develop a one-dimensional model of the vertical transport and mixing processes occurring along the water column. Multivariate analysis revealed that air temperature and rain forecasts enabled a reliable prediction of the strength of the lake stratification. Vertical diffusion is the main process driving vertical mixing, particularly during winter circulation. However, a high reservoir volume and warm winters can limit the degree of winter mixing, causing only partial circulation to occur, as was the case in 2013. This research study provides a comprehensive approach for understanding and predicting mixing processes for similar lakes, whenever high-frequency data are available from VPS or other autonomous water monitoring systems.
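
    As a rough illustration of the one-dimensional vertical transport modelling mentioned above (not the authors' model), the sketch below integrates a simple vertical diffusion equation on a stratified water column; the grid spacing, diffusivity and initial thermocline profile are illustrative values only.

    ```python
    # Explicit finite-difference integration of dT/dt = d/dz (Kz dT/dz) on a water column.
    import numpy as np

    nz, dz, dt = 50, 1.0, 600.0      # 50 m column, 1 m cells, 10-minute time steps
    Kz = 1e-4                        # vertical diffusivity (m^2/s), illustrative
    z = np.arange(nz)
    T = 20.0 - 8.0 / (1.0 + np.exp(-(z - 25.0) / 2.0))  # warm surface, cool bottom

    assert Kz * dt / dz**2 < 0.5     # stability criterion for the explicit scheme

    for _ in range(int(30 * 86400 / dt)):            # integrate for 30 days
        flux = -Kz * np.diff(T) / dz                 # Fick's law at cell interfaces
        T[1:-1] += dt / dz * (flux[:-1] - flux[1:])  # flux divergence; end cells held fixed
    print(T.round(2))                # the thermocline diffuses and weakens over time
    ```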

  4. Operation of a sampling train for the analysis of environmental species in coal gasification gas-phase process streams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pochan, M.J.; Massey, M.J.

    1979-02-01

    This report discusses the results of actual raw product gas sampling efforts and includes: rationale for raw product gas sampling efforts; design and operation of the CMU gas sampling train; development and analysis of a sampling train data base; and conclusions and future application of results. The results of sampling activities at the CO2-Acceptor and Hygas pilot plants proved that: the CMU gas sampling train is a valid instrument for characterization of environmental parameters in coal gasification gas-phase process streams; depending on the particular process configuration, the CMU gas sampling train can reduce gasifier effluent characterization activity to a single location in the raw product gas line; and in contrast to the slower operation of the EPA SASS Train, CMU's gas sampling train can collect representative effluent data at a rapid rate (approx. 2 points per hour) consistent with the rate of change of process variables, and thus function as a tool for process engineering-oriented analysis of environmental characteristics.

  5. Identification of the isomers using principal component analysis (PCA) method

    NASA Astrophysics Data System (ADS)

    Kepceoǧlu, Abdullah; Gündoǧdu, Yasemin; Ledingham, Kenneth William David; Kilic, Hamdi Sukur

    2016-03-01

    In this work, we have carried out a detailed statistical analysis of experimental mass spectra from xylene isomers. Principal Component Analysis (PCA) was used to identify isomers that cannot be distinguished using conventional statistical methods for the interpretation of their mass spectra. Experiments were carried out using a linear TOF-MS coupled to a femtosecond laser system as an energy source for the ionisation processes. The collected data were analysed and interpreted using PCA as a multivariate analysis of these spectra. This demonstrates the strength of the method in providing insight for distinguishing isomers that cannot be identified through conventional mass analysis of the dissociative ionisation processes in these molecules. The PCA results as a function of laser pulse energy and background pressure in the spectrometer are presented in this work.
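
    As an illustration of the approach (with synthetic spectra standing in for the xylene data), the sketch below projects a set of mass spectra onto their first two principal components; two simulated isomer classes that differ only in the intensity of one fragment peak separate along PC1.

    ```python
    # PCA score projection of synthetic mass spectra from two simulated isomers.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    n_channels = 200                              # m/z bins per spectrum (illustrative)
    base = rng.random(n_channels)                 # shared fragment pattern

    spectra, labels = [], []
    for isomer, boost in (("o-xylene", 0.0), ("p-xylene", 0.1)):
        for _ in range(30):
            s = base + rng.normal(0, 0.02, n_channels)
            s[50] += 0.5 + boost                  # one diagnostic fragment intensity
            spectra.append(s)
            labels.append(isomer)

    scores = PCA(n_components=2).fit_transform(np.array(spectra))
    for isomer in ("o-xylene", "p-xylene"):
        mask = np.array(labels) == isomer
        print(isomer, round(scores[mask, 0].mean(), 3))  # classes separate along PC1
    ```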

  6. Assessment of Automatically Exported Clinical Data from a Hospital Information System for Clinical Research in Multiple Myeloma.

    PubMed

    Torres, Viviana; Cerda, Mauricio; Knaup, Petra; Löpprich, Martin

    2016-01-01

    An important part of the electronic information available in a Hospital Information System (HIS) has the potential to be automatically exported to Electronic Data Capture (EDC) platforms to improve clinical research. This automation has the advantage of reducing manual data transcription, a time-consuming and error-prone process. However, quantitative evaluations of the process of exporting data from a HIS to an EDC system have not been reported extensively, in particular in comparison with manual transcription. In this work, an assessment of the quality of an automatic export process, focused on laboratory data from a HIS, is presented. Quality of the laboratory data was assessed for two types of processes: (1) a manual process of data transcription, and (2) an automatic process of data transference, implemented as an Extract, Transform and Load (ETL) process. A comparison was then carried out between the manual and automatic data collection methods. The criteria used to measure data quality were correctness and completeness. The manual process had a general error rate of 2.6% to 7.1%, achieving the lowest error rate when data fields without a clear definition were removed from the analysis (p < 10^-3). For the automatic process, the general error rate was 1.9% to 12.1%, with the lowest error rate obtained when excluding information that was missing in the HIS but transcribed to the EDC from other physical sources. The automatic ETL process can be used to collect laboratory data for clinical research if data in the HIS, as well as physical documentation not included in the HIS, are identified beforehand and follow a standardized data collection protocol.
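
    A minimal sketch of what such an ETL step and the two quality criteria might look like follows; the field names, record structure and reference data are hypothetical, not the schema used in the study.

    ```python
    # Illustrative Extract-Transform-Load step with correctness/completeness checks.
    from dataclasses import dataclass

    @dataclass
    class LabResult:
        patient_id: str
        test: str
        value: float | None

    def extract(his_rows: list) -> list:
        """Extract: keep only laboratory rows from the HIS export."""
        return [r for r in his_rows if r.get("category") == "laboratory"]

    def transform(rows: list) -> list:
        """Transform: parse values, flagging unparseable ones instead of dropping them."""
        out = []
        for r in rows:
            try:
                value = float(r["value"])
            except (KeyError, ValueError, TypeError):
                value = None
            out.append(LabResult(r["patient_id"], r["test"], value))
        return out

    def quality(loaded: list, reference: dict) -> tuple:
        """Correctness: loaded values matching the reference chart.
        Completeness: fraction of reference entries present at all."""
        present = [r for r in loaded if (r.patient_id, r.test) in reference]
        correct = sum(1 for r in present if r.value == reference[(r.patient_id, r.test)])
        return correct / max(len(present), 1), len(present) / max(len(reference), 1)

    his = [{"category": "laboratory", "patient_id": "p1", "test": "Hb", "value": "13.2"},
           {"category": "radiology", "patient_id": "p1", "test": "CXR", "value": "n/a"}]
    ref = {("p1", "Hb"): 13.2}
    print(quality(transform(extract(his)), ref))   # -> (1.0, 1.0)
    ```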

  7. From an Intuitive Practitioner to a Professional Novice Leader

    ERIC Educational Resources Information Center

    Mevorach, Miriam; Miron, Mordechai

    2015-01-01

    This study examined the professional and personal transition undergone by eight experienced early childhood (EC) teachers after completing their graduate studies. The data were collected through interviews and online communication. Three main categories arose in the qualitative content analysis of the text: (1) personal process of change; (2)…

  8. Analysis of Environmental Data and Landscape Characterization on Multiple WetlandTypes Using Water Level Loggers and GIS Techniques in Tampa, FL

    EPA Science Inventory

    To better characterize the relationships between both adjacent hydrology/ precipitation and nutrient processing with groundwater level fluctuations, continuous water level data are being collected across three dominant wetland types, each with varied landscape characteristics. Th...

  9. Psychoactive Substance Use and School Performance among Adolescents in Public Secondary Schools in Uganda

    ERIC Educational Resources Information Center

    Rukundo, Aloysius; Kibanja, Grace; Steffens, Karl

    2014-01-01

    Introduction: Psychoactive substance use among adolescents influences behavioral and cognitive processes and is associated with adolescents' performance in school. We therefore sought to investigate association of PASU with adolescents' school performance. Methods: We employed quantitative methods of data collection and analysis. To test the…

  10. Learning through the Ages: An Epistemological Journey

    ERIC Educational Resources Information Center

    Reid, J. Courtney

    2009-01-01

    This paper explores how three nineteenth-century women writers guided my thinking about education, oppression and spirituality during different decades of my twentieth-century life. In order to re-collect my epistemological journey, a process that requires analysis and reflection, the paper combines the critical lens of feminist theory with the…

  11. The Location of Sources of Human Computer Processed Cerebral Potentials for the Automated Assessment of Visual Field Impairment

    PubMed Central

    Leisman, Gerald; Ashkenazi, Maureen

    1979-01-01

    Objective psychophysical techniques for investigating visual fields are described. The paper concerns methods for the collection and analysis of evoked potentials using a small laboratory computer and provides efficient methods for obtaining information about the conduction pathways of the visual system.
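
    The core technique behind such evoked-potential work is stimulus-locked averaging: the response repeats at each stimulus while uncorrelated background EEG averages toward zero, so the signal-to-noise ratio grows as the square root of the number of trials. A minimal sketch with synthetic data follows; the latency, amplitude and noise level are illustrative, not taken from the paper.

    ```python
    # Stimulus-locked averaging recovers a small evoked response buried in EEG noise.
    import numpy as np

    rng = np.random.default_rng(2)
    fs, n_trials, n_samples = 1000, 200, 400          # 1 kHz, 200 stimuli, 400 ms epochs
    t = np.arange(n_samples) / fs
    evoked = 5e-6 * np.exp(-((t - 0.1) / 0.02) ** 2)  # deflection near 100 ms latency

    trials = evoked + rng.normal(0, 20e-6, (n_trials, n_samples))  # noise >> signal
    average = trials.mean(axis=0)                     # SNR improves by sqrt(n_trials)
    print(f"peak at {t[np.argmax(average)] * 1000:.0f} ms")  # expected near 100 ms
    ```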

  12. Let's Not Forget: Learning Analytics Are about Learning

    ERIC Educational Resources Information Center

    Gaševic, Dragan; Dawson, Shane; Siemens, George

    2015-01-01

    The analysis of data collected from the interaction of users with educational and information technology has attracted much attention as a promising approach for advancing our understanding of the learning process. This promise motivated the emergence of the new research field, learning analytics, and its closely related discipline, educational…

  13. U.S. Geological Survey quality-assurance plan for surface-water activities in Kansas, 2015

    USGS Publications Warehouse

    Painter, Colin C.; Loving, Brian L.

    2015-01-01

    This Surface Water Quality-Assurance Plan documents the standards, policies, and procedures used by the Kansas Water Science Center (KSWSC) of the U.S. Geological Survey (USGS) for activities related to the collection, processing, storage, analysis, and publication of surface-water data.

  14. List mode multichannel analyzer

    DOEpatents

    Archer, Daniel E [Livermore, CA; Luke, S John [Pleasanton, CA; Mauger, G Joseph [Livermore, CA; Riot, Vincent J [Berkeley, CA; Knapp, David A [Livermore, CA

    2007-08-07

    A digital list-mode multichannel analyzer (MCA) built around a programmable FPGA device provides onboard data analysis and on-the-fly modification of system detection/operating parameters. It is capable of collecting and processing data in very small time bins (<1 millisecond) when used in histogramming mode, or of recording individual events when operated as a list-mode MCA.
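
    The distinction the patent draws can be illustrated simply: in list mode every event is stored as a (timestamp, channel) pair, so spectra over arbitrary time windows and bin widths can be reconstructed after the fact. The sketch below uses a synthetic event stream and does not reflect the patented FPGA design.

    ```python
    # List-mode storage: keep raw events, re-bin into spectra for any time window later.
    import numpy as np

    rng = np.random.default_rng(3)
    n_events = 100_000
    timestamps = np.sort(rng.uniform(0.0, 10.0, n_events))   # event times (seconds)
    channels = rng.integers(0, 1024, n_events)               # ADC channel per event

    def histogram_window(t0: float, t1: float, n_channels: int = 1024) -> np.ndarray:
        """Re-bin the stored event list into a spectrum for one time window;
        windows well below a millisecond are possible, down to the timestamp clock."""
        sel = (timestamps >= t0) & (timestamps < t1)
        return np.bincount(channels[sel], minlength=n_channels)

    spectrum = histogram_window(2.0, 2.0005)      # a 0.5 ms slice of the run
    print(spectrum.sum(), "events in window")
    ```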

  15. The 2010 Broad Prize

    ERIC Educational Resources Information Center

    Education Digest: Essential Readings Condensed for Quick Review, 2011

    2011-01-01

    A new data analysis, based on data collected as part of The Broad Prize process, provides insights into which large urban school districts in the United States are doing the best job of educating traditionally disadvantaged groups: African-American, Hispanic, and low-income students. Since 2002, The Eli and Edythe Broad Foundation has awarded The…

  16. The Relationship between Elementary Principals' Visionary Leadership and Students' Reading Performance

    ERIC Educational Resources Information Center

    Mora-Whitehurst, Rina

    2013-01-01

    This article focuses on elementary principals as instructional leaders, as well as public school initiatives and educational accountability in the United States. It presents the methodology, instrumentation, measures of academic achievement in Florida, data collection, and processing procedures. Finally, it presents data analysis, results of the…

  17. 78 FR 72626 - Notice of Request for Renewal of a Currently Approved Information Collection (Pathogen Reduction...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-03

    ... reduction and Hazard Analysis and Critical Control Point (HACCP) Systems requirements because OMB approval... February 28, 2014. FSIS has established requirements applicable to meat and poultry establishments designed.... coli by slaughter establishments to verify the adequacy of the establishment's process controls for the...

  18. 1992 NACUBO Endowment Study.

    ERIC Educational Resources Information Center

    National Association of College and University Business Officers, Washington, DC.

    This report presents the results of a 1992 study of the performance and management of college and university endowments. Part I offers succinct information on the data collection process, provides definitions, and explains the formula used in the analysis. Part II presents the report's exhibits in two sections. The first section describes: (1)…

  19. Students' Experiences in Interdisciplinary Problembased Learning: A Discourse Analysis of Group Interaction

    ERIC Educational Resources Information Center

    Imafuku, Rintaro; Kataoka, Ryuta; Mayahara, Mitsuori; Suzuki, Hisayoshi; Saiki, Takuya

    2014-01-01

    Interdisciplinary problem-based learning (PBL) aims to provide students with opportunities to develop the necessary skills to work with different health professionals in a collaborative manner. This discourse study examined the processes of collective knowledge construction in Japanese students in the tutorials. Analyses of video-recorded data…

  20. 75 FR 15710 - Submission for OMB Review; Comment Request; Process Evaluation of the NIH's Roadmap...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-30

    ... the data collection plans and instruments, contact: Sue Hamann, PhD, Science Evaluation Officer, Office of Science Policy Officer and Analysis, National Institute of Dental and Craniofacial Research... days of the date of this publication. Dated: March 24, 2010. Sue Hamann, Science Evaluation Officer...
