Science.gov

Sample records for activation analysis procedure

  1. Validation of a new image analysis procedure for quantifying filamentous bacteria in activated sludge.

    PubMed

    Liwarska-Bizukojc, Ewa; Bizukojc, Marcin; Andrzejczak, Olga

    2014-01-01

    Quantification of filamentous bacteria in activated sludge systems can be performed by manual counting under a microscope or by applying automated image analysis procedures, which have developed significantly over the last two decades. In this work a new method based on automated image analysis techniques was elaborated and presented. It consisted of three stages: (a) Neisser staining, (b) acquisition of microscopic images, and (c) digital image processing and analysis. The procedure is novel in two respects. First, it delivers data about aggregates and filaments simultaneously in a single calculation routine, which is seldom found in procedures described in the literature so far. More importantly, the macroprogram that performs image processing and calculates the morphological parameters was written in the same software used for image acquisition, whereas previously published procedures required two different types of software, one for image acquisition and another for image processing and analysis. Application of this new procedure to the quantification of filamentous bacteria in full-scale as well as laboratory activated sludge systems showed that it is simple, fast, and delivers reliable results.

  2. NASA trend analysis procedures

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This publication is primarily intended for use by NASA personnel engaged in managing or implementing trend analysis programs. 'Trend analysis' refers to the observation of current activity in the context of the past in order to infer the expected level of future activity. NASA trend analysis was divided into 5 categories: problem, performance, supportability, programmatic, and reliability. Problem trend analysis uncovers multiple occurrences of historical hardware or software problems or failures in order to focus future corrective action. Performance trend analysis observes changing levels of real-time or historical flight vehicle performance parameters such as temperatures, pressures, and flow rates as compared to specification or 'safe' limits. Supportability trend analysis assesses the adequacy of the spaceflight logistics system; example indicators are repair-turn-around time and parts stockage levels. Programmatic trend analysis uses quantitative indicators to evaluate the 'health' of NASA programs of all types. Finally, reliability trend analysis attempts to evaluate the growth of system reliability based on a decreasing rate of occurrence of hardware problems over time. Procedures for conducting all five types of trend analysis are provided in this publication, prepared through the joint efforts of the NASA Trend Analysis Working Group.

  3. A neutron activation analysis procedure for the determination of uranium, thorium and potassium in geologic samples

    USGS Publications Warehouse

    Aruscavage, P. J.; Millard, H.T.

    1972-01-01

    A neutron activation analysis procedure was developed for the determination of uranium, thorium and potassium in basic and ultrabasic rocks. The three elements are determined in the same 0.5-g sample following a 30-min irradiation in a thermal neutron flux of 2×10¹² n·cm⁻²·sec⁻¹. Following radiochemical separation, the nuclides 239U (T½ = 23.5 min), 233Th (T½ = 22.2 min) and 42K (T½ = 12.36 h) are measured by β-counting. A computer program is used to resolve the decay curves, which are complex owing to contamination and the growth of daughter activities. The method was used to determine uranium, thorium and potassium in the U.S. Geological Survey standard rocks DTS-1, PCC-1 and BCR-1. For 0.5-g samples the limits of detection for uranium, thorium and potassium are 0.7, 1.0 and 10 ppb, respectively. © 1972 Akadémiai Kiadó.
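
    The decay-curve resolution mentioned above can be illustrated with the generic radiochemical formulation (a sketch of the usual approach, not necessarily the exact model in the authors' computer program): the measured activity is fitted by least squares as a sum of exponential components for the nuclides of interest and contaminants, plus Bateman in-growth terms for daughter activities,

      A(t) = \sum_i A_{0,i}\, e^{-\lambda_i t}
             + \sum_j A_{0,j}\, \frac{\lambda_{d,j}}{\lambda_{d,j} - \lambda_{p,j}}
               \left( e^{-\lambda_{p,j} t} - e^{-\lambda_{d,j} t} \right),
      \qquad \lambda = \frac{\ln 2}{T_{1/2}},

    where the A_{0,i} are the component activities at the end of irradiation, the second sum describes a daughter (d) growing in from a parent (p), and the fitted half-lives identify the contributions of 239U, 233Th and 42K.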

  4. Start-up procedures and analysis of heavy metals inhibition on methanogenic activity in EGSB reactor.

    PubMed

    Colussi, I; Cortesi, A; Della Vedova, L; Gallo, V; Robles, F K Cano

    2009-12-01

    The effectiveness of operating an industrial UASB reactor, treating wastewater from the beer industry, with flows containing heavy metals was evaluated. A pilot-scale UASB reactor, already used to simulate the industrial reactor, was at first employed unsuccessfully; an easy start-up was obtained by rearranging it as an EGSB reactor, and considerations about this modification are reported. The effects of Cu(II), Ni(II) and Cr(III) ions on anaerobic activity were analyzed through measurements of the methane production rate and COD removal. The biomass employed was the sludge of the industrial UASB reactor, while a solution of ethanol and sodium acetate with a COD of 3000 mg/L and a heavy metal concentration of 50 mg/L was continuously fed. Experimental results showed higher biomass sensitivity to copper and much lower sensitivity to nickel and chromium. Moreover, copper inhibition was demonstrated to be less significant if a metal-free feed was provided to the system before copper addition.

  5. Deriving directions through procedural task analysis.

    PubMed

    Yuen, H K; D'Amico, M

    1998-01-01

    Task analysis is one of the essential components of activity analysis. Procedural task analysis involves breaking down an activity into a sequence of steps. Directions are the sequence of steps resulting from the task analysis (i.e., the product of the task analysis). Directions become a guide for caregivers or trainers to use in teaching clients a specific skill. However, occupational therapy students often have difficulty writing directions that are clear enough for caregivers or trainers to carry out, and books on activity analysis only provide examples of directions without giving guidelines on how to perform the writing process. The purposes of this paper are to describe the process of procedural task analysis and to provide guidelines for writing the steps of directions.

  6. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
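
    As a minimal illustration of the statistical process control idea described above (a generic Shewhart X-bar/R chart computed from subgroup data, not the specific KSC implementation), control limits can be derived from subgroup means and ranges; all names and data below are hypothetical:

      # Generic X-bar / R control-chart limits (Shewhart), assuming subgroups of size 5.
      # A2, D3, D4 are the standard tabled constants for subgroup size n = 5.
      import numpy as np

      def xbar_r_limits(subgroups, A2=0.577, D3=0.0, D4=2.114):
          """subgroups: 2-D array with one row per subgroup of repeated measurements."""
          x = np.asarray(subgroups, dtype=float)
          xbar = x.mean(axis=1)                 # subgroup means
          r = x.max(axis=1) - x.min(axis=1)     # subgroup ranges
          xbarbar, rbar = xbar.mean(), r.mean()
          return {"xbar_limits": (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar),
                  "r_limits": (D3 * rbar, rbar, D4 * rbar)}

      rng = np.random.default_rng(0)
      data = rng.normal(100.0, 2.0, size=(25, 5))   # 25 hypothetical subgroups of 5 checks
      print(xbar_r_limits(data))

    Points falling outside these limits, or systematic runs within them, would flag the kind of potential problem areas and improvement candidates the paper targets.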

  7. Using instrumental neutron activation analysis for geochemical analyses of terrestrial impact structures: current analytical procedures at the university of vienna geochemistry activation analysis laboratory.

    PubMed

    Mader, Dieter; Koeberl, Christian

    2009-12-01

    The Instrumental Neutron Activation Analysis Gamma Spectroscopy Laboratory at the Department of Lithospheric Research, University of Vienna, was upgraded in 2006. This paper describes the sample preparation, the new instrumentation, and the data evaluation for hundreds of rock samples from two terrestrial impact structures. Measurement and data evaluation are performed using Genie 2000 together with custom-made batch software for the analysis sequences used.

  8. Student Activity Funds: Procedures & Controls.

    ERIC Educational Resources Information Center

    Cuzzetto, Charles E.

    Student activity funds may create educational opportunities for students, but they frequently create problems for business administrators. The first part of this work reviews the types of organizational issues and transactions an organized student group is likely to encounter, including establishing a constitution, participant roles,…

  9. Student Activity Funds: Procedures and Controls.

    ERIC Educational Resources Information Center

    Cuzzetto, Charles E.

    2000-01-01

    An effective internal-control system can help school business administrators meet the challenges of accounting for student activity funds. Such a system should include appropriate policies and procedures, identification of key control points, self-assessments, audit trails, and internal and external audits. (MLH)

  10. Procedures for numerical analysis of circadian rhythms

    PubMed Central

    REFINETTI, ROBERTO; CORNÉLISSEN, GERMAINE; HALBERG, FRANZ

    2010-01-01

    This article reviews various procedures used in the analysis of circadian rhythms at the populational, organismal, cellular and molecular levels. The procedures range from visual inspection of time plots and actograms to several mathematical methods of time series analysis. Computational steps are described in some detail, and additional bibliographic resources and computer programs are listed. PMID:23710111

  11. HUMAN RELIABILITY ANALYSIS FOR COMPUTERIZED PROCEDURES

    SciTech Connect

    Ronald L. Boring; David I. Gertman; Katya Le Blanc

    2011-09-01

    This paper provides a characterization of human reliability analysis (HRA) issues for computerized procedures in nuclear power plant control rooms. It is beyond the scope of this paper to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper provides a review of HRA as applied to traditional paper-based procedures, followed by a discussion of what specific factors should additionally be considered in HRAs for computerized procedures. Performance shaping factors and failure modes unique to computerized procedures are highlighted. Since there is no definitive guide to HRA for paper-based procedures, this paper also serves to clarify the existing guidance on paper-based procedures before delving into the unique aspects of computerized procedures.

  12. Ultradian rhythmicity of tyrosine aminotransferase activity in Euglena gracilis: Analysis by cosine and non-sinusoidal fitting procedures

    NASA Astrophysics Data System (ADS)

    Neuhaus-Steinmetz, Ulrich; Balzer, Ivonne; Hardeland, Rüdiger

    1990-03-01

    Although the geophysical periodicity of the earth's rotation corresponds to a biological cyclicity of ca. 24 h, cellular temporal organization comprises a multifrequency time structure in which ultradian rhythms may be regarded as subelements of the circadian oscillator. In Euglena gracilis kept under conditions in which various cellular functions oscillate with a circadian period, tyrosine aminotransferase activity exhibited a predominantly ultradian cycle, whereas its circadian frequency was only weakly expressed. Ultradian period lengths were in the range of 4-5 h, as demonstrated by least squares fitting of cosines and of a non-sinusoidal regression function.
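
    The least-squares cosine fitting referred to above reduces to linear regression once a trial period is fixed; the sketch below scans trial periods for the best fit (hypothetical, evenly sampled activity data, not the authors' measurements or code):

      # Single-component cosinor: y(t) = M + A*cos(2*pi*t/P + phi), fitted by linear
      # least squares for each trial period P; the best P minimizes the residual sum of squares.
      import numpy as np

      def cosinor_fit(t, y, period):
          X = np.column_stack([np.ones_like(t),
                               np.cos(2 * np.pi * t / period),
                               np.sin(2 * np.pi * t / period)])
          coef, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
          mesor, b, c = coef
          return mesor, np.hypot(b, c), np.arctan2(-c, b), float(rss[0])

      t = np.arange(0.0, 72.0, 1.0)   # 3 days of hourly samples (hypothetical)
      y = 10 + 3 * np.cos(2 * np.pi * t / 4.5 + 1.0) \
          + np.random.default_rng(1).normal(0, 0.5, t.size)
      periods = np.arange(3.0, 8.0, 0.1)
      best = min(periods, key=lambda p: cosinor_fit(t, y, p)[3])
      print("best-fitting period (h):", round(best, 2))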

  13. A procedural analysis of correspondence training techniques

    PubMed Central

    Paniagua, Freddy A.

    1990-01-01

    A variety of names have been given to procedures used in correspondence training, some more descriptive than others. In this article I argue that a terminology more accurately describing actual procedures, rather than the conceptual function that those procedures are assumed to serve, would benefit the area of correspondence training. I identify two documented procedures during the reinforcement of verbalization phase and five procedures during the reinforcement of correspondence phase and suggest that those procedures can be classified, or grouped into nonoverlapping categories, by specifying the critical dimensions of those procedures belonging to a single category. I suggest that the names of such nonoverlapping categories should clearly specify the dimensions on which the classification is based in order to facilitate experimental comparison of procedures, and to be able to recognize when a new procedure (as opposed to a variant of one already in existence) is developed. Future research involving comparative analysis across and within procedures is discussed within the framework of the proposed classification. PMID:22478059

  14. Integrated sampling procedure for metabolome analysis.

    PubMed

    Schaub, Jochen; Schiesling, Carola; Reuss, Matthias; Dauner, Michael

    2006-01-01

    Metabolome analysis, the analysis of large sets of intracellular metabolites, has become an important systems analysis method in biotechnological and pharmaceutical research. In metabolic engineering, the integration of metabolome data with fluxome and proteome data into large-scale mathematical models promises to foster rational strategies for strain and cell line improvement. However, the development of reproducible sampling procedures for quantitative analysis of intracellular metabolite concentrations represents a major challenge, requiring (i) fast transfer of the sample, (ii) efficient quenching of metabolism, (iii) quantitative metabolite extraction, and (iv) optimum sample conditioning for subsequent quantitative analysis. In addressing these requirements, we propose an integrated sampling procedure. Simultaneous quenching and quantitative extraction of intracellular metabolites were realized by short-time exposure of cells to temperatures ≤95 °C, at which intracellular metabolites are released quantitatively. Based on these findings, we combined principles of heat transfer with knowledge of physiology, for example turnover rates of energy metabolites, to develop an optimized sampling procedure based on a coiled single-tube heat exchanger. As a result, this sampling procedure enables reliable and reproducible measurements through (i) the integration of three unit operations into one unit operation, (ii) the avoidance of any alteration of the sample by chemical reagents during quenching and extraction, and (iii) automation. A sampling frequency of 5 s⁻¹ and an overall individual sample processing time faster than 30 s allow responses of intracellular metabolite concentrations to extracellular stimuli to be observed on a subsecond time scale. Recovery and reliability of the unit operations were analyzed, and the impact of sample conditioning on subsequent IC-MS analysis of metabolites was examined as well. The integrated sampling procedure was validated

  15. An analysis of aircrew procedural compliance

    NASA Technical Reports Server (NTRS)

    Schofield, J. E.; Giffin, W. C.

    1981-01-01

    This research examines the relationships between aircrew compliance with procedures and operator errors. The data for this analysis were generated by reexamination of a 1976 experiment in full mission simulation conducted by Dr. H. P. Ruffell Smith (1979) for the NASA-Ames Research Center. The character of individual operators, the chemistry of crew composition, and complex aspects of the operational environment affected procedural compliance by crew members. Associations between enumerated operator errors and several objective indicators of crew coordination were investigated. The correspondence among high operator error counts and infrequent compliance with specific crew coordination requirements was most notable when copilots were accountable for control of flight parameters.

  16. Confidence Interval Procedures for Reliability Growth Analysis

    DTIC Science & Technology

    1977-06-01

    AMSAA Technical Report No. 197, Confidence Interval Procedures for Reliability Growth Analysis, Larry H. Crow, June 1977; approved for public release. Confidence intervals for M(T) and confidence interval procedures for the parameters of the Weibull process model are presented (see [1], [2], [4]) in the application of the model to reliability growth.
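
    The Weibull (Crow/AMSAA) process model referenced in this report treats cumulative failures as a nonhomogeneous Poisson process with a power-law mean function; as a sketch of the standard formulation (generic notation, not a reconstruction of the report's own symbols),

      E[N(t)] = \lambda t^{\beta}, \qquad
      \rho(t) = \lambda \beta t^{\beta - 1}, \qquad
      M(T) = \frac{1}{\lambda \beta T^{\beta - 1}},

    with maximum-likelihood estimates for a time-truncated test with n failures at times t_1, ..., t_n \le T given by \hat{\beta} = n / \sum_i \ln(T / t_i) and \hat{\lambda} = n / T^{\hat{\beta}}; the report develops confidence interval procedures for quantities such as M(T), the instantaneous mean time between failures.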

  17. 77 FR 31615 - Improving Mail Management Policies, Procedures, and Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-29

    ... ADMINISTRATION Improving Mail Management Policies, Procedures, and Activities AGENCY: Office of Governmentwide... Services Administration (GSA) has issued Federal Management Regulation (FMR) Bulletin G-03 which provides guidance to Executive Branch agencies for improving mail management policies, procedures, and...

  18. New Procedure for Extension Analysis in Exploratory Factor Analysis.

    ERIC Educational Resources Information Center

    Gorsuch, Richard L.

    1997-01-01

    In exploratory common factor analysis, extension analysis refers to computing the relationship of the common factors to variables that were not included in the factor analysis. A new extension procedure is presented that gives correlations without using estimated factor scores. Advantages of the new method are illustrated. (SLD)

  19. 32 CFR 989.37 - Procedures for analysis abroad.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Procedures for analysis abroad. 989.37 Section... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.37 Procedures for analysis abroad. Procedures for analysis of environmental actions abroad are contained in 32 CFR part 187. That directive...

  20. 32 CFR 989.37 - Procedures for analysis abroad.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.37 Procedures for analysis abroad. Procedures for analysis of environmental actions abroad are contained in 32 CFR Part 187. That directive provides... 32 National Defense 6 2010-07-01 2010-07-01 false Procedures for analysis abroad. 989.37...

  1. Operational Control Procedures for the Activated Sludge Process, Part III-A: Calculation Procedures.

    ERIC Educational Resources Information Center

    West, Alfred W.

    This is the second in a series of documents developed by the National Training and Operational Technology Center describing operational control procedures for the activated sludge process used in wastewater treatment. This document deals exclusively with the calculation procedures, including simplified mixing formulas, aeration tank…

  2. Procedural-support music therapy in the healthcare setting: a cost-effectiveness analysis.

    PubMed

    DeLoach Walworth, Darcy

    2005-08-01

    This comparative analysis examined the cost-effectiveness of music therapy as a procedural support in the pediatric healthcare setting. Many healthcare organizations are actively attempting to reduce the amount of sedation for pediatric patients undergoing various procedures. Patients receiving music therapy-assisted computerized tomography scans (n = 57), echocardiograms (n = 92), and other procedures (n = 17) were included in the analysis. Results of music therapy-assisted procedures indicate successful elimination of patient sedation, reduction in procedural times, and a decrease in the number of staff members present for procedures. Implications for nurses and music therapists in the healthcare setting are discussed.

  3. 40 CFR 246.202-6 - Recommended procedures: Cost analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Recommended procedures: Cost analysis... § 246.202-6 Recommended procedures: Cost analysis. After potential markets have been identified (but... residual solid waste have been established, an analysis should be conducted which compares the costs of...

  4. 40 CFR 246.202-6 - Recommended procedures: Cost analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Recommended procedures: Cost analysis... § 246.202-6 Recommended procedures: Cost analysis. After potential markets have been identified (but... residual solid waste have been established, an analysis should be conducted which compares the costs of...

  5. 40 CFR 246.200-8 - Recommended procedures: Cost analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Recommended procedures: Cost analysis... § 246.200-8 Recommended procedures: Cost analysis. After potential markets have been located (but prior... paper and residual solid waste have been established, an analysis should be conducted which compares...

  6. 40 CFR 246.201-7 - Recommended procedures: Cost analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Recommended procedures: Cost analysis... § 246.201-7 Recommended procedures: Cost analysis. After potential markets have been located (but prior... residual solid waste have been established, an analysis should be conducted which compares the costs of...

  7. 40 CFR 246.201-7 - Recommended procedures: Cost analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Recommended procedures: Cost analysis... § 246.201-7 Recommended procedures: Cost analysis. After potential markets have been located (but prior... residual solid waste have been established, an analysis should be conducted which compares the costs of...

  8. 40 CFR 246.200-8 - Recommended procedures: Cost analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Recommended procedures: Cost analysis... § 246.200-8 Recommended procedures: Cost analysis. After potential markets have been located (but prior... paper and residual solid waste have been established, an analysis should be conducted which compares...

  9. 40 CFR 246.201-7 - Recommended procedures: Cost analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Recommended procedures: Cost analysis... § 246.201-7 Recommended procedures: Cost analysis. After potential markets have been located (but prior... residual solid waste have been established, an analysis should be conducted which compares the costs of...

  10. 40 CFR 246.200-8 - Recommended procedures: Cost analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Recommended procedures: Cost analysis... § 246.200-8 Recommended procedures: Cost analysis. After potential markets have been located (but prior... paper and residual solid waste have been established, an analysis should be conducted which compares...

  11. 40 CFR 246.202-6 - Recommended procedures: Cost analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Recommended procedures: Cost analysis... § 246.202-6 Recommended procedures: Cost analysis. After potential markets have been identified (but... residual solid waste have been established, an analysis should be conducted which compares the costs of...

  12. Summative Mass Analysis of Algal Biomass - Integration of Analytical Procedures: Laboratory Analytical Procedure (LAP)

    SciTech Connect

    Laurens, L. M. L.

    2013-12-01

    This procedure guides the integration of laboratory analytical procedures to measure algal biomass constituents in an unambiguous manner and ultimately achieve mass balance closure for algal biomass samples. Many of these methods build on years of research in algal biomass analysis.

  13. Keystroke Analysis: Reflections on Procedures and Measures

    ERIC Educational Resources Information Center

    Baaijen, Veerle M.; Galbraith, David; de Glopper, Kees

    2012-01-01

    Although keystroke logging promises to provide a valuable tool for writing research, it can often be difficult to relate logs to underlying processes. This article describes the procedures and measures that the authors developed to analyze a sample of 80 keystroke logs, with a view to achieving a better alignment between keystroke-logging measures…

  14. Differential item functioning analysis by applying multiple comparison procedures.

    PubMed

    Eusebi, Paolo; Kreiner, Svend

    2015-01-01

    Analysis within a Rasch measurement framework aims at the development of valid and objective test scores. One requirement of both validity and objectivity is that items do not show evidence of differential item functioning (DIF). A number of procedures exist for the assessment of DIF, including those based on analysis of contingency tables by Mantel-Haenszel tests and partial gamma coefficients. The aim of this paper is to illustrate Multiple Comparison Procedures (MCP) for analysis of DIF relative to a variable defining a very large number of groups with an unclear ordering with respect to the DIF effect. We propose a single-step procedure controlling the false discovery rate for DIF detection. The procedure applies to both dichotomous and polytomous items. In addition to providing evidence against a hypothesis of no DIF, the procedure also provides information on the subsets of groups that are homogeneous with respect to the DIF effect. A stepwise MCP procedure for this purpose is also introduced.
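
    The single-step false-discovery-rate control described above can be illustrated with the Benjamini-Hochberg adjustment available in statsmodels (a generic sketch of FDR-controlled screening of per-group DIF p-values; the p-values and group counts below are hypothetical, and the authors' partial-gamma test statistics are not reproduced):

      # Benjamini-Hochberg FDR control across one DIF test p-value per group.
      import numpy as np
      from statsmodels.stats.multitest import multipletests

      rng = np.random.default_rng(42)
      # 45 groups with no DIF (uniform p-values) and 5 groups with strong DIF
      pvals = np.concatenate([rng.uniform(0, 1, 45), rng.uniform(0, 0.002, 5)])

      reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
      print("groups flagged for DIF:", np.flatnonzero(reject))
      print("smallest adjusted p-values:", np.round(np.sort(p_adj)[:5], 4))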

  15. Manual of Alternative Procedures: Activities of Daily Living.

    ERIC Educational Resources Information Center

    McCormack, James E.; And Others

    Intended for teachers and others providing services for moderately and severely physically and/or mentally handicapped children and young adults, the manual presents strategies, procedures, and task analyses for training in daily living skills. Section I provides an overview of tactics for teaching activities of daily living (ADL) skills,…

  16. 34 CFR 602.30 - Activities covered by recognition procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false Activities covered by recognition procedures. 602.30 Section 602.30 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION THE SECRETARY'S RECOGNITION OF ACCREDITING AGENCIES...

  17. An Improved Qualitative Analysis Procedure for Aluminum Subgroup Cations.

    ERIC Educational Resources Information Center

    Kistner, C. R.; Robinson, Patricia J.

    1983-01-01

    Describes a procedure for the qualitative analysis of aluminum subgroup cations designed to avoid failure to obtain lead or barium chromate precipitates or failure to report aluminum hydroxide when present (due to staining). Provides a flow chart and step-by-step explanation for the new procedure, indicating significantly improved student results.…

  18. Analysis of Field Activity Perspectives of Centralized Non-Appropriated Fund Accounting, Banking, and Payroll Procedures within the Department of the Navy.

    DTIC Science & Technology

    1982-06-01

    (vice purely historical cost data) conducted in 1976 by an Office of Management and Budget/Department of Defense study group. These absolute dollar ... results for the whole class, or target group. [119] 2. Process measures. A process measure relates to an activity carried on by the organization ... Groups of numbers are pre-printed on the card; the employee data must be punched in the top group of numbers.

  19. Building America Performance Analysis Procedures: Revision 1

    SciTech Connect

    2004-06-01

    To measure progress toward multi-year research goals, cost and performance trade-offs are evaluated through a series of controlled field and laboratory experiments supported by energy analysis techniques using test data to calibrate simulation models.

  20. Computer Based Procedures for Field Workers - FY16 Research Activities

    SciTech Connect

    Oxstrand, Johanna; Bly, Aaron

    2016-09-01

    The Computer-Based Procedure (CBP) research effort is part of the Light-Water Reactor Sustainability (LWRS) Program, which provides the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. One of the primary missions of the LWRS program is to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. One area that could yield tremendous savings in increased efficiency and safety is improving procedure use. A CBP provides the opportunity to incorporate context-driven job aids, such as drawings, photos, and just-in-time training. The presentation of information in CBPs can be much more flexible and tailored to the task, the actual plant condition, and the operation mode. The dynamic presentation of the procedure guides the user down the path of relevant steps, minimizing the time the field worker spends evaluating plant conditions and making decisions about the applicability of each step. It also minimizes the risk of conducting steps out of order or of incorrectly assessing the applicability of steps. This report provides a summary of the main research activities conducted in the Computer-Based Procedures for Field Workers effort since 2012, with the main focus on the activities conducted in fiscal year 2016. The activities discussed are the Nuclear Electronic Work Packages – Enterprise Requirements initiative, the development of design guidance for CBPs (which compiles all insights gained through the years of CBP research), the facilitation of vendor studies at the Idaho National Laboratory (INL) Advanced Test Reactor (ATR), a pilot study on how to enhance the plant design modification work process, the collection of feedback from a field evaluation study at Plant Vogtle, and path forward to

  1. Procedure for analysis of nickel-cadmium cell materials

    NASA Technical Reports Server (NTRS)

    Halpert, G.; Ogunyankin, O.; Jones, C.

    1973-01-01

    Quality control procedures include analyses of the electrolyte, active materials, and separators for nickel-cadmium cell materials. Tests range from the visual/mechanical inspection of cells to gas sampling, electrolyte extraction, electrochemical tests, and physical measurements.

  2. Terrain-analysis procedures for modeling radar backscatter

    NASA Technical Reports Server (NTRS)

    Schaber, G. G.; Berlin, G. L.; Pike, R. J.

    1980-01-01

    Procedures developed to obtain both raw measured and surface roughness statistics for radar backscatter modeling are described. A comprehensive and highly flexible software package for terrain analysis is introduced.

  3. Umbilical Hernia Repair: Analysis After 934 Procedures.

    PubMed

    Porrero, José L; Cano-Valderrama, Oscar; Marcos, Alberto; Bonachia, Oscar; Ramos, Beatriz; Alcaide, Benito; Villar, Sol; Sánchez-Cabezudo, Carlos; Quirós, Esther; Alonso, María T; Castillo, María J

    2015-09-01

    There is a lack of consensus about the surgical management of umbilical hernias. The aim of this study is to analyze the medium-term results of 934 umbilical hernia repairs. In this study, 934 patients with an umbilical hernia underwent surgery between 2004 and 2010, 599 (64.1%) of which were evaluated at least one year after the surgery. Complications, recurrence, and the reoperation rate were analyzed. Complications were observed in 5.7 per cent of the patients. With a mean follow-up time of 35.5 months, recurrence and reoperation rates were 3.8 per cent and 4.7 per cent, respectively. A higher percentage of female patients (60.9 % vs 29 %, P = 0.001) and a longer follow-up time (47.4 vs 35 months, P = 0.037) were observed in patients who developed a recurrence. No significant differences were observed between complications and the reoperation rate in patients who underwent Ventralex(®) preperitoneal mesh reinforcement and suture repair; however, a trend toward a higher recurrence rate was observed in patients with suture repair (6.5 % vs 3.2 %, P = 0.082). Suture repair had lower recurrence and reoperation rates in patients with umbilical hernias less than 1 cm. Suture repair is an appropriate procedure for small umbilical hernias; however, for larger umbilical hernias, mesh reinforcement should be considered.

  4. Mokken Scale Analysis Using Hierarchical Clustering Procedures

    ERIC Educational Resources Information Center

    van Abswoude, Alexandra A. H.; Vermunt, Jeroen K.; Hemker, Bas T.; van der Ark, L. Andries

    2004-01-01

    Mokken scale analysis (MSA) can be used to assess and build unidimensional scales from an item pool that is sensitive to multiple dimensions. These scales satisfy a set of scaling conditions, one of which follows from the model of monotone homogeneity. An important drawback of the MSA program is that the sequential item selection and scale…

  5. Operating procedures: Fusion Experiments Analysis Facility

    SciTech Connect

    Lerche, R.A.; Carey, R.W.

    1984-03-20

    The Fusion Experiments Analysis Facility (FEAF) is a computer facility based on a DEC VAX 11/780 computer. It became operational in late 1982. At that time two manuals were written to aid users and staff in their interactions with the facility. This manual is designed as a reference to assist the FEAF staff in carrying out their responsibilities. It is meant to supplement equipment and software manuals supplied by the vendors. Also this manual provides the FEAF staff with a set of consistent, written guidelines for the daily operation of the facility.

  6. Effects of Sensory and Procedural Information on Coping with Stressful Medical Procedures and Pain: A Meta-Analysis.

    ERIC Educational Resources Information Center

    Suls, Jerry; Wan, Choi K.

    1989-01-01

    A meta-analysis of studies on preparation for medical procedures and pain examined the relative effects of sensory, procedural, and combined sensory-procedural preoperational information on coping outcomes. Results indicated that, in contrast to sensory information, procedural information provided no significant benefits over control group…

  7. Observation procedures characterizing occupational physical activities: critical review.

    PubMed

    Denis, D; Lortie, M; Rossignol, M

    2000-01-01

    The first objective of this paper is to compare the observation procedures proposed to characterize physical work. The second objective is to examine the following 3 methodological issues: reliability, observer training, and internal validity. Seventy-two papers were reviewed, 38 of which proposed a new or modified observation grid. The observation variables identified were broken down into 7 categories as follows: posture, exertion, load handled, work environment, use of feet, use of hands, and activities or tasks performed. The review revealed the variability of existing procedures. The examination of methodological issues showed that observation data can be reliable and can present an adequate internal validity. However, little information about the conditions necessary to achieve good reliability was available.

  8. Nonparametric inference procedures for multistate life table analysis.

    PubMed

    Dow, M M

    1985-01-01

    Recent generalizations of the classical single-state life table procedures to the multistate case provide the means to analyze simultaneously the mobility and mortality experience of one or more cohorts. This paper examines fairly general nonparametric combinatorial matrix procedures, known as quadratic assignment, as a technique for analyzing various transitional patterns commonly generated by cohorts over the life cycle. To some degree, the output from a multistate life table analysis suggests inference procedures. In his discussion of multistate life table construction features, the author focuses on the matrix formulation of the problem. He then presents several examples of the proposed nonparametric procedures. Data for the mobility and life-expectancy-at-birth matrices come from the 458-member Cayo Santiago rhesus monkey colony. The author's matrix combinatorial approach to hypothesis testing may prove to be a useful inferential strategy in several multidimensional demographic areas.
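
    The quadratic assignment idea can be sketched as a permutation test that compares two square matrices while preserving their row/column structure; the matrices below are hypothetical stand-ins, not the Cayo Santiago data:

      # Quadratic assignment procedure (QAP): correlate the off-diagonal entries of two
      # n x n matrices, then permute the rows and columns of one matrix together to
      # build a permutation null distribution for that correlation.
      import numpy as np

      def qap_test(A, B, n_perm=2000, seed=0):
          A, B = np.asarray(A, float), np.asarray(B, float)
          n = A.shape[0]
          off = ~np.eye(n, dtype=bool)                 # ignore the diagonal
          obs = np.corrcoef(A[off], B[off])[0, 1]
          rng = np.random.default_rng(seed)
          hits = 0
          for _ in range(n_perm):
              p = rng.permutation(n)
              Bp = B[np.ix_(p, p)]                     # simultaneous row/column permutation
              if abs(np.corrcoef(A[off], Bp[off])[0, 1]) >= abs(obs):
                  hits += 1
          return obs, (hits + 1) / (n_perm + 1)        # two-sided permutation p-value

      rng = np.random.default_rng(1)
      A = rng.random((10, 10))
      B = 0.5 * A + 0.5 * rng.random((10, 10))         # related to A by construction
      print(qap_test(A, B))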

  9. A two-step procedure of fractal analysis

    NASA Astrophysics Data System (ADS)

    Dedovich, T. G.; Tokarev, M. V.

    2016-03-01

    A two-step procedure for the analysis of different-type fractals is proposed for the PaC and SePaC methods. An advantage of the two-step procedures of the PaC and SePaC methods over the basic and modified PaC and SePaC methods is shown. Results of comparative analysis of the unified data set using different approaches (the BC method and two-step procedures of the PaC and SePaC methods) are given. It is shown that the two-step procedure of the SePaC method is most efficient in reconstructing the overall data set.
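
    The BC (box-counting) method used as a baseline here estimates a fractal dimension from how the number of occupied boxes scales with box size; a self-contained sketch on a hypothetical point set (not the unified data set analyzed by the authors):

      # Box-counting dimension: count occupied boxes N(eps) at several box sizes eps
      # and fit log N(eps) against log(1/eps); the slope estimates the dimension.
      import numpy as np

      def box_counting_dimension(points, sizes):
          pts = np.asarray(points, float)
          pts = (pts - pts.min(axis=0)) / (np.ptp(pts, axis=0) + 1e-12)  # scale to [0, 1]^d
          counts = []
          for eps in sizes:
              boxes = np.floor(pts / eps).astype(int)
              counts.append(len({tuple(b) for b in boxes}))
          slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
          return slope

      t = np.random.default_rng(0).random(5000)
      line = np.column_stack([t, 0.3 * t + 0.1])       # points on a line: dimension near 1
      print(box_counting_dimension(line, sizes=[0.2, 0.1, 0.05, 0.025, 0.0125]))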

  10. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    SciTech Connect

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  11. Activities identification for activity-based cost/management applications of the diagnostics outpatient procedures.

    PubMed

    Alrashdan, Abdalla; Momani, Amer; Ababneh, Tamador

    2012-01-01

    One of the most challenging problems facing healthcare providers is determining the actual cost of their procedures, which is important for internal accounting and price justification to insurers. The objective of this paper is to find suitable categories for identifying diagnostic outpatient medical procedures and translating them from a functional orientation to a process orientation. A hierarchical task tree is developed based on a classification schema of procedural activities, with each procedure viewed as a process consisting of a number of activities. This provides a powerful foundation for activity-based cost/management implementation and supplies enough information to discover the value-added and non-value-added activities that assist in process improvement and may eventually lead to cost reduction. Work measurement techniques are used to identify the standard time of each activity at the lowest level of the task tree. A real case study at a private hospital is presented to demonstrate the proposed methodology.

  12. Accident Sequence Evaluation Program: Human reliability analysis procedure

    SciTech Connect

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis With emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the ''ASEP HRA Procedure,'' is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs.

  13. A procedure to estimate proximate analysis of mixed organic wastes.

    PubMed

    Zaher, U; Buffiere, P; Steyer, J P; Chen, S

    2009-04-01

    In waste materials, proximate analysis measuring the total concentration of carbohydrate, protein, and lipid contents from solid wastes is challenging, as a result of the heterogeneous and solid nature of wastes. This paper presents a new procedure that was developed to estimate such complex chemical composition of the waste using conventional practical measurements, such as chemical oxygen demand (COD) and total organic carbon. The procedure is based on mass balance of macronutrient elements (carbon, hydrogen, nitrogen, oxygen, and phosphorus [CHNOP]) (i.e., elemental continuity), in addition to the balance of COD and charge intensity that are applied in mathematical modeling of biological processes. Knowing the composition of such a complex substrate is crucial to study solid waste anaerobic degradation. The procedure was formulated to generate the detailed input required for the International Water Association (London, United Kingdom) Anaerobic Digestion Model number 1 (IWA-ADM1). The complex particulate composition estimated by the procedure was validated with several types of food wastes and animal manures. To make proximate analysis feasible for validation, the wastes were classified into 19 types to allow accurate extraction and proximate analysis. The estimated carbohydrates, proteins, lipids, and inerts concentrations were highly correlated to the proximate analysis; correlation coefficients were 0.94, 0.88, 0.99, and 0.96, respectively. For most of the wastes, carbohydrate was the highest fraction and was estimated accurately by the procedure over an extended range with high linearity. For wastes that are rich in protein and fiber, the procedure was even more consistent compared with the proximate analysis. The new procedure can be used for waste characterization in solid waste treatment design and optimization.
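
    The balance-based estimation described above can be sketched as a small linear system: with assumed elemental and COD coefficients for "average" carbohydrate, protein, and lipid, the measured totals are inverted for the three fractions. The coefficients and measurements below are illustrative assumptions, not the values used in the paper or in ADM1:

      # Estimate carbohydrate/protein/lipid fractions (g/kg dry waste) from measured
      # totals using assumed composition coefficients (illustrative values only).
      import numpy as np

      # Rows: organic carbon (g C/g), organic nitrogen (g N/g), COD (g O2/g)
      # Columns: carbohydrate, protein, lipid
      M = np.array([[0.44, 0.53, 0.76],
                    [0.00, 0.16, 0.00],
                    [1.19, 1.42, 2.90]])

      measured = np.array([520.0, 35.0, 1600.0])   # hypothetical TOC, org-N, COD per kg dry

      fractions = np.linalg.solve(M, measured)
      for name, x in zip(["carbohydrate", "protein", "lipid"], fractions):
          print(f"{name}: {x:.0f} g/kg")

    A full implementation would carry the hydrogen, oxygen, and phosphorus balances and the charge balance as additional rows, as the procedure described in the paper does.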

  14. Procedural Fidelity: An Analysis of Measurement and Reporting Practices

    ERIC Educational Resources Information Center

    Ledford, Jennifer R.; Wolery, Mark

    2013-01-01

    A systematic analysis was conducted of measurement and reporting practices related to procedural fidelity in single-case research for the past 30 years. Previous reviews of fidelity primarily reported whether fidelity data were collected by authors; these reviews reported that collection was variable, but low across journals and over time. Results…

  15. Research Methods and Data Analysis Procedures Used by Educational Researchers

    ERIC Educational Resources Information Center

    Hsu, Tse-chi

    2005-01-01

    To assess the status and the trends of subject matters investigated and research methods/designs and data analysis procedures employed by educational researchers, this study surveyed articles published by the "American Educational Research Journal (AERJ)," "Journal of Experimental Education (JEE)" and "Journal of Educational Research (JER)" from…

  16. Performance and approval procedures for active personal dosemeters.

    PubMed

    Ginjaume, M

    2011-03-01

    Active personal dosemeters (APDs) are well accepted as useful and reliable instruments for individual dosimetry measurements. The increasing concern about the behaviour of APDs in pulsed fields is illustrated through a review of the results of the most representative studies on APD performance over the last 5 y. The deficiencies of APDs in pulsed fields are discussed, together with proposals to overcome them. Although there are no legal constraints or technical limitations on recognising APDs for legal dosimetry in facilities with continuous radiation fields, APDs continue to be used mainly as operational dosemeters. The approval procedures applicable to APDs, especially the approach undertaken by Germany, are presented. Finally, some trends in the development and use of APDs are summarised.

  17. PROC LCA: A SAS Procedure for Latent Class Analysis.

    PubMed

    Lanza, Stephanie T; Collins, Linda M; Lemmon, David R; Schafer, Joseph L

    2007-01-01

    Latent class analysis (LCA) is a statistical method used to identify a set of discrete, mutually exclusive latent classes of individuals based on their responses to a set of observed categorical variables. In multiple-group LCA, both the measurement part and structural part of the model can vary across groups, and measurement invariance across groups can be empirically tested. LCA with covariates extends the model to include predictors of class membership. In this article, we introduce PROC LCA, a new SAS procedure for conducting LCA, multiple-group LCA, and LCA with covariates. The procedure is demonstrated using data on alcohol use behavior in a national sample of high school seniors.
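
    The measurement model that PROC LCA estimates can be written compactly; as a sketch of the standard latent class formulation (generic notation, consistent with but not copied from the article),

      P(Y = y) = \sum_{c=1}^{C} \gamma_c \prod_{j=1}^{J} \prod_{k=1}^{K_j}
                 \rho_{j,k|c}^{\, I(y_j = k)},

    where \gamma_c are the class-membership probabilities, \rho_{j,k|c} the item-response probabilities, and I(\cdot) the indicator function; LCA with covariates replaces the \gamma_c with a multinomial logistic function of the covariates, and multiple-group LCA allows both sets of parameters to vary by group.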

  18. CD-SEM precision: improved procedure and analysis

    NASA Astrophysics Data System (ADS)

    Menaker, Mina

    1999-06-01

    Accurate precision assessment becomes increasingly important as we proceed along the SIA roadmap into more advanced processes and smaller critical dimensions. Accurate precision is necessary in order to determine the P/T ratio, which is used to decide whether a specific CD-SEM is valid for controlling a specific process. The customer's needs, as presented by the SEMATECH Advanced Metrology Advisory Group, are for a detailed precision report in the form of a full repeatability and reproducibility (RR) analysis. The 3-sigma single-tool RR of an in-line SEM is determined in the same operational modes as used in production, and should include the effects of time and process variations on SEM performance. We present an RR procedure with a modular approach that enables the user to extend the evaluation according to her/his needs. It includes direct assessment of repeatability, reproducibility, and stability. It also allows for a study of wafer non-homogeneity, induced process variation, and the effect of the measured feature type on precision. The procedure is based on the standard ISO RR procedure and includes a modification for correct compensation for bias, the so-called measurement trend. A close examination of the repeatability and reproducibility variations provides insight into the possible sources of those variations, such as the S/N ratio, the SEM autofocus mechanism, automation, etc. For example, poor wafer alignment might not affect the repeatability but severely reduce reproducibility. The analysis is therefore a key to better understanding and improving CD-SEM performance on production layers. The procedure is fully implemented on an automated CD-SEM, providing on-line precision assessment. RR < 1 nm has been demonstrated on well-defined resist and etched structures. Examples of the automatic analysis results using the new procedure are presented.
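
    The P/T ratio mentioned above can be illustrated with the usual gauge-study definitions (one common convention; the paper's exact definitions may differ):

      \sigma_{RR}^2 = \sigma_{\text{repeatability}}^2 + \sigma_{\text{reproducibility}}^2, \qquad
      \text{precision} = 3\,\sigma_{RR}, \qquad
      P/T = \frac{6\,\sigma_{RR}}{\text{USL} - \text{LSL}},

    so, for example, a 1 nm 3-sigma single-tool RR measured against a hypothetical ±5 nm CD tolerance would give P/T = 2 nm / 10 nm = 0.2.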

  19. Environmental Quality Information Analysis Center (EQIAC) operating procedures handbook

    SciTech Connect

    Walsh, T.E.; Das, S.

    1992-08-01

    The Operating Procedures Handbook of the Environmental Quality Information Analysis Center (EQIAC) is intended to be kept current as EQIAC develops and evolves. Its purpose is to provide a comprehensive guide to the mission, infrastructure, functions, and operational procedures of EQIAC. The handbook is a training tool for new personnel and a reference manual for existing personnel. It will be distributed throughout EQIAC and maintained in binders containing current dated editions of the individual sections, and will be revised at least annually to reflect the current structure and operational procedures of EQIAC. EQIAC provides information on environmental issues such as compliance, restoration, and environmental monitoring to the Air Force and DOD contractors.

  20. The use of analysis of variance procedures in biological studies

    USGS Publications Warehouse

    Williams, B.K.

    1987-01-01

    The analysis of variance (ANOVA) is widely used in biological studies, yet there remains considerable confusion among researchers about the interpretation of hypotheses being tested. Ambiguities arise when statistical designs are unbalanced, and in particular when not all combinations of design factors are represented in the data. This paper clarifies the relationship among hypothesis testing, statistical modelling and computing procedures in ANOVA for unbalanced data. A simple two-factor fixed effects design is used to illustrate three common parametrizations for ANOVA models, and some associations among these parametrizations are developed. Biologically meaningful hypotheses for main effects and interactions are given in terms of each parametrization, and procedures for testing the hypotheses are described. The standard statistical computing procedures in ANOVA are given along with their corresponding hypotheses. Throughout the development unbalanced designs are assumed and attention is given to problems that arise with missing cells.
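
    For the unbalanced two-factor design discussed above, the link between hypotheses and computing procedures can be seen by requesting different sums-of-squares types; a minimal sketch using statsmodels on hypothetical unbalanced data (not tied to any particular biological study):

      # Two-factor fixed-effects ANOVA on unbalanced data: Type I (sequential) and
      # Type III (marginal, with sum-to-zero coding) sums of squares test different
      # hypotheses when cell counts are unequal.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      from statsmodels.formula.api import ols

      rng = np.random.default_rng(0)
      rows = []
      for a in ["A1", "A2"]:
          for b in ["B1", "B2", "B3"]:
              n = int(rng.integers(3, 9))          # unequal cell sizes
              mu = (a == "A2") * 1.0 + {"B1": 0.0, "B2": 0.5, "B3": 1.5}[b]
              rows += [{"a": a, "b": b, "y": mu + rng.normal(0, 1)} for _ in range(n)]
      df = pd.DataFrame(rows)

      model = ols("y ~ C(a, Sum) * C(b, Sum)", data=df).fit()
      print(sm.stats.anova_lm(model, typ=1))       # sequential, order-dependent tests
      print(sm.stats.anova_lm(model, typ=3))       # marginal tests of main effects and interaction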

  1. Jet Engine hot parts IR Analysis Procedure (J-EIRP)

    NASA Technical Reports Server (NTRS)

    Baumeister, Joseph F.

    1993-01-01

    A thermal radiation analysis method called Jet Engine IR Analysis Procedure (J-EIRP) was developed to evaluate jet engine cavity hot parts source radiation. The objectives behind J-EIRP were to achieve the greatest accuracy in model representation and solution, while minimizing computer resources and computational time. The computer programs that comprise J-EIRP were selected on the basis of their performance, accuracy, and flexibility to solve both simple and complex problems. These programs were intended for use on a personal computer, but include the ability to solve large problems on a mainframe or supercomputer. J-EIRP also provides the user with a tool for developing thermal design experience and engineering judgment through analysis experimentation, while using minimal computer resources. A sample jet engine cavity analysis demonstrates the procedure and capabilities within J-EIRP, and is compared to a simplified method for approximating cavity radiation. The goal is to introduce the terminology and solution process used in J-EIRP and to provide insight into the radiation heat transfer principles used in this procedure.
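
    The cavity hot-parts source radiation that J-EIRP evaluates is commonly modeled with gray, diffuse enclosure (radiosity) relations; as a generic sketch of that kind of formulation (not necessarily the exact equations inside J-EIRP),

      J_i = \varepsilon_i \sigma T_i^4 + (1 - \varepsilon_i) \sum_j F_{ij} J_j, \qquad
      q_i = \frac{\varepsilon_i}{1 - \varepsilon_i} \left( \sigma T_i^4 - J_i \right),

    where J_i is the radiosity of cavity surface i, \varepsilon_i its emissivity, F_{ij} the view factor from surface i to surface j, and q_i the net radiative flux; solving the linear system for the J_i gives the radiation leaving the cavity toward an observer.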

  2. Hair decontamination procedure prior to multi-class pesticide analysis.

    PubMed

    Duca, Radu-Corneliu; Hardy, Emilie; Salquèbre, Guillaume; Appenzeller, Brice M R

    2014-06-01

    Although increasing interest is being shown in hair analysis for the biomonitoring of human exposure to pesticides, some limitations still have to be addressed for optimum use of this matrix in that specific context. One main issue concerns the need to differentiate chemicals biologically incorporated into hair from those externally deposited on the hair surface from contaminated air or dust. The present study focuses on the development of a washing procedure for the decontamination of hair before analysis of pesticides from different chemical classes. For this purpose, three different procedures of artificial contamination (with silica, cellulose, and aqueous solution) were used to simulate pesticide deposition on the hair surface. Several washing solvents (four organic: acetone, dichloromethane, methanol, acetonitrile; and four aqueous: water, phosphate buffer, shampoo, sodium dodecylsulfate) were evaluated for their capacity to remove artificially deposited pesticides from the hair surface. The most effective washing solvents were sodium dodecylsulfate and methanol among the aqueous and organic solvents, respectively. Moreover, after a first washing with sodium dodecylsulfate or methanol, the majority of externally deposited pesticides was removed and a steady state was reached, since significantly lower amounts were removed by additional second and third washings. Finally, the effectiveness of a decontamination procedure comprising washing with sodium dodecylsulfate and methanol was successfully demonstrated. In parallel, it was determined that the final procedure did not affect chemicals that had been biologically incorporated, as hair strands naturally containing pesticides were used. Such a procedure appears to remove in a single step the fraction of chemicals located on the hair surface and does not require repeated washing steps.

  3. Hyperspectral data analysis procedures with reduced sensitivity to noise

    NASA Technical Reports Server (NTRS)

    Landgrebe, David A.

    1993-01-01

    Multispectral sensor systems have steadily improved over the years in their ability to deliver increased spectral detail. With the advent of hyperspectral sensors, including imaging spectrometers, this technology is taking a large leap forward, providing the possibility of delivering much more detailed information. However, this direction of development has drawn even more attention to the matter of noise and other deleterious effects in the data, because reducing the fundamental limitations of spectral detail on information collection raises the limitations presented by noise to even greater importance. Much current effort in remote sensing research is thus being devoted to adjusting the data to mitigate the effects of noise and other deleterious effects. A parallel approach to the problem is to look for analysis approaches and procedures that have reduced sensitivity to such effects. We discuss some of the fundamental principles that define the analysis algorithm characteristics providing such reduced sensitivity. One such analysis procedure is described, including an example analysis of a data set illustrating this effect.

  4. Calcium Activities During Different Ion Exchange Separation Procedures

    NASA Astrophysics Data System (ADS)

    Zhang, Z.; Zhu, H.; Liu, Y.; Liu, F.; Zhang, C.; Sun, W.

    2014-12-01

    Calcium is a major element and participates in many geological processes. Investigations of the stable calcium isotopic compositions of natural geological samples provide a powerful tool for understanding those geological processes from the perspective of isotope geochemistry. With the development of modern instruments and chemical separation techniques, calcium isotopic compositions could be determined even more precisely if the column chemistry introduced no deviation. Calcium is usually separated from matrix elements using cation resin columns, and the related chemical separation techniques seem to be robust. However, more detailed work still needs to be done on matrix effects and on calcium isotopic fractionation during column chemistry and elution. If calcium is run on TIMS instruments, the interference effect is lower and easier to control, so the requirements on the chemistry are relatively less critical, but calcium fractionation on the filaments can be difficult to monitor. If calcium is run on MC-ICP-MS instruments, the interference effect can be large and is difficult to recognize and subtract, so the requirements on the chemistry are much more critical in order to obtain a true result for the sample, but the instrumental fractionation is easier to monitor. Here we investigate calcium activities on several kinds of cation resins under different column/acid conditions. We seek a good balance between recovery and interference effects in the column chemistry and intend to establish a better chemical separation procedure that satisfies the instrument requirements for calcium. In addition, calcium isotopic fractionation on the column will also be discussed based on our previous and ongoing results.

  5. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    NASA Technical Reports Server (NTRS)

    Motley, Albert E., III

    2000-01-01

    One of the key elements of successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  6. Development of a simplified procedure for cyclic structural analysis

    NASA Technical Reports Server (NTRS)

    Kaufman, A.

    1984-01-01

    Development of a simplified inelastic analysis computer program (ANSYMP) was extended for predicting the stress-strain history at the critical location of a thermomechanically cycled structure from an elastic solution. The program uses an iterative and incremental procedure to estimate the plastic strains from the material stress-strain properties and a plasticity hardening model. Creep effects can be calculated on the basis of stress relaxation at constant strain, creep at constant stress, or a combination of stress relaxation and creep accumulation. The simplified method was exercised on a number of problems involving uniaxial and multiaxial loading, isothermal and nonisothermal conditions, dwell times at various points in the cycles, different materials, and kinematic hardening. Good agreement was found between these analytical results and nonlinear finite-element solutions for these problems. The simplified analysis program used less than 1 percent of the CPU time required for a nonlinear finite-element analysis.
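
    The class of calculation described here, estimating plastic strain increments from an elastic solution and a hardening rule, can be illustrated with a minimal one-dimensional sketch. The Python fragment below applies an elastic predictor followed by a return-mapping correction with linear kinematic hardening; it is not the ANSYMP algorithm, and the material constants and strain history are invented for the example.

      # Minimal 1-D elastic-plastic sketch with linear kinematic hardening
      # (illustrative only; E, H, sigma_y and the strain history are hypothetical).
      def cyclic_stress(strain_history, E=200e3, H=10e3, sigma_y=250.0):
          """Return the stress history (MPa) for a prescribed total-strain history."""
          stress, back, eps_p = [], 0.0, 0.0          # back stress and plastic strain
          for eps in strain_history:
              trial = E * (eps - eps_p)               # elastic predictor
              f = abs(trial - back) - sigma_y         # yield function
              if f > 0.0:                             # plastic corrector (return mapping)
                  d_gamma = f / (E + H)
                  sign = 1.0 if trial - back > 0 else -1.0
                  eps_p += d_gamma * sign
                  back += H * d_gamma * sign
                  trial = E * (eps - eps_p)
              stress.append(trial)
          return stress

      # Example: one strain-controlled cycle between +0.4% and -0.4% total strain
      ramp_up = [0.004 * i / 20 for i in range(21)]
      ramp_down = [0.004 - 0.008 * i / 40 for i in range(1, 41)]
      print(round(cyclic_stress(ramp_up + ramp_down)[-1], 1))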

  7. Optimal thermographic procedures for moisture analysis in building materials

    NASA Astrophysics Data System (ADS)

    Rosina, Elisabetta; Ludwig, Nicola

    1999-09-01

    The presence of moisture in building materials causes damage second only to structural damage. NDT techniques are successfully applied to map moisture distribution, to localize the source of water, and to determine microclimatic conditions. IR thermography has the advantage of being non-destructive while allowing large surfaces to be investigated. The measurements can be repeated over time to monitor the phenomenon of rising water. Nevertheless, the investigation of moisture in walls is one of the less reliable applications of IR thermography in cultural heritage preservation. Damp areas can be colder than dry ones, because of surface evaporation, or warmer, because of the higher thermal inertia of the water content relative to the building materials. The apparent discrepancies between the two results are due to the different microclimatic conditions of the scanning. The aim of the paper is to describe optimal procedures for obtaining reliable maps of moisture in building materials under different environmental and microclimatic conditions. Another goal is the description of the related energetic phenomena, which cause the temperature discontinuities detected by thermography. Active and passive procedures are presented and compared. Case studies show examples of the application of these procedures.

  8. User's operating procedures. Volume 2: Scout project financial analysis program

    NASA Technical Reports Server (NTRS)

    Harris, C. G.; Haris, D. K.

    1985-01-01

    A review is presented of the user's operating procedures for the Scout Project Automatic Data System, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single-entry, multiple cross-reference data management and information retrieval system for the automation of Project Office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via the user-friendly menu drivers.

  9. Whole-House Energy Analysis Procedures for Existing Homes: Preprint

    SciTech Connect

    Hendron, R.

    2006-08-01

    This paper describes a proposed set of guidelines for analyzing the energy savings achieved by a package of retrofits or an extensive rehabilitation of an existing home. It also describes certain field test and audit methods that can help establish accurate building system performance characteristics that are needed for a meaningful simulation of whole-house energy use. Several sets of default efficiency values have been developed for older appliances that cannot be easily tested and for which published specifications are not readily available. These proposed analysis procedures are documented more comprehensively in NREL Technical Report TP-550-38238.

  10. Reduction procedures for accurate analysis of MSX surveillance experiment data

    NASA Technical Reports Server (NTRS)

    Gaposchkin, E. Mike; Lane, Mark T.; Abbot, Rick I.

    1994-01-01

    Technical challenges of the Midcourse Space Experiment (MSX) science instruments require careful characterization and calibration of these sensors for analysis of surveillance experiment data. Procedures for reduction of Resident Space Object (RSO) detections will be presented which include refinement and calibration of the metric and radiometric (and photometric) data and calculation of a precise MSX ephemeris. Examples will be given which support the reduction, and these are taken from ground-test data similar in characteristics to the MSX sensors and from the IRAS satellite RSO detections. Examples to demonstrate the calculation of a precise ephemeris will be provided from satellites in similar orbits which are equipped with S-band transponders.

  11. Accuracy of remotely sensed data: Sampling and analysis procedures

    NASA Technical Reports Server (NTRS)

    Congalton, R. G.; Oderwald, R. G.; Mead, R. A.

    1982-01-01

    A review and update of the discrete multivariate analysis techniques used for accuracy assessment is given, along with a listing of the computer program written to implement these techniques. New work on evaluating accuracy assessment using Monte Carlo simulation with different sampling schemes is presented, together with the resulting error matrices from the mapping effort of the San Juan National Forest. A method for estimating the sample size requirements for implementing the accuracy assessment procedures is given, as is a proposed method for determining the reliability of change detection between two maps of the same area produced at different times.
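
    The discrete multivariate techniques referred to here center on the classification error matrix. As a hedged illustration (not the program listed in the report), the short Python sketch below computes overall accuracy and the kappa coefficient of agreement from an error matrix whose values are invented for the example.

      # Illustrative sketch: overall accuracy and kappa from a classification error matrix.
      import numpy as np

      def accuracy_and_kappa(error_matrix):
          m = np.asarray(error_matrix, dtype=float)
          n = m.sum()
          observed = np.trace(m) / n                                   # overall accuracy
          expected = (m.sum(axis=0) * m.sum(axis=1)).sum() / n**2      # chance agreement
          return observed, (observed - expected) / (1.0 - expected)    # accuracy, kappa

      matrix = [[65,  4,  2],      # rows = mapped class, columns = reference class
                [ 6, 81,  5],
                [ 0,  7, 90]]
      print(accuracy_and_kappa(matrix))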

  12. Guide to IDAP, Version 2: an interactive decision analysis procedure

    SciTech Connect

    Jusko, M.J.; Whitfield, R.G.

    1980-11-01

    This document is intended to serve as both a programmer's and a user's guide to the current version of the IDAP and to prompt interested individuals to make suggestions for the future development of IDAP. The majority of the sections pertain to the main IDA program rather than to the IDAIN procedure. A brief discussion is presented of the theory of decision analysis, and the aspects of decision analysis that are relevant to the IDAP are discussed. A complete list and description of the commands used in the IDAP program is provided, including three complete examples; this section may be considered a user's guide to the IDAP. The programmer's guide to the IDAP discusses the various technical aspects of the programs and may be skipped by users not involved with programming the IDAP. A list of the error messages generated by the IDAP is presented. As the program is developed, error handling and messages will improve.

  13. A MEMBRANE FILTER PROCEDURE FOR ASSAYING CYTOTOXIC ACTIVITY IN HETEROTROPHIC BACTERIA ISOLATED FROM DRINKING WATER

    EPA Science Inventory

    Cytotoxic activity assays of Gram-negative, heterotrophic bacteria are often laborious and time consuming. The objective of this study was to develop in situ procedures for testing potential cytotoxic activities of heterotrophic bacteria isolated from drinking water systems. Wate...

  14. Analysis of stereoelectronic properties, mechanism of action and pharmacophore of synthetic indolo[2,1-b]quinazoline-6,12-dione derivatives in relation to antileishmanial activity using quantum chemical, cyclic voltammetry and 3-D-QSAR CATALYST procedures.

    PubMed

    Bhattacharjee, Apurba K; Skanchy, David J; Jennings, Barton; Hudson, Thomas H; Brendle, James J; Werbovetz, Karl A

    2002-06-01

    Several indolo[2,1-b]quinazoline-6,12-dione (tryptanthrin) derivatives exhibited remarkable activity at concentrations below 100 ng/mL when tested against in vitro Leishmania donovani amastigotes. The in vitro toxicity studies indicate that the compounds are fairly well tolerated in both macrophage and neuronal lines. An analysis based on qualitative and quantitative structure-activity relationship studies between in vitro antileishmanial activity and the molecular electronic structure of 27 analogues of indolo[2,1-b]quinazoline-6,12-dione is presented here, using a combination of semi-empirical AM1 quantum chemical, cyclic voltammetry, and pharmacophore generation (CATALYST) methods. A modest to good correlation is observed between activity and a few calculated molecular properties such as molecular density, octanol-water partition coefficient, molecular orbital energies, and redox potentials. Electron transfer seems to be a plausible path in the mechanism of action of the compounds. A pharmacophore generated by using the 3-D QSAR of CATALYST produced a fairly accurate predictive model of antileishmanial activity of the tryptanthrins. The validity of the pharmacophore model extends to a structurally different class of compounds, which could open new frontiers for study. The carbonyl group of the five- and six-membered rings in the indolo[2,1-b]quinazoline-6,12-dione skeleton and the electron transfer ability to the carbonyl atom appear to be crucial for activity.

  15. Synfuel program analysis. Volume 1: Procedures-capabilities

    NASA Astrophysics Data System (ADS)

    Muddiman, J. B.; Whelan, J. W.

    1980-07-01

    The analytic procedures and capabilities developed by Resource Applications (RA) for examining the economic viability, public costs, and national benefits of alternative synthetic fuel projects are described. This volume is intended for Department of Energy (DOE) and Synthetic Fuel Corporation (SFC) program management personnel and includes a general description of the costing, venture, and portfolio models with enough detail for the reader to be able to specify cases and interpret outputs. It contains an explicit description (with examples) of the types of results which can be obtained when the models are applied to the analysis of individual projects; the analysis of input uncertainty, i.e., risk; and the analysis of portfolios of such projects, including varying technology mixes and buildup schedules. The objective is to obtain, on the one hand, comparative measures of private investment requirements and expected returns (under differing public policies) as they affect the private decision to proceed, and, on the other, public costs and national benefits as they affect public decisions to participate (in what form, in what areas, and to what extent).
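
    To make the flavor of a venture-level input-uncertainty (risk) analysis concrete, the sketch below runs a simple Monte Carlo net-present-value calculation for one hypothetical project. It is not the RA costing, venture, or portfolio model; every cost, margin, output, and discount-rate figure is an invented placeholder.

      # Hedged Monte Carlo NPV sketch for a single hypothetical synfuel venture.
      import random

      def npv(cash_flows, rate):
          return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

      def simulate_project(n_trials=10000, rate=0.10, seed=1):
          random.seed(seed)
          results = []
          for _ in range(n_trials):
              capital = -random.uniform(0.9e9, 1.3e9)        # construction cost, $ (assumed)
              margin = random.uniform(40.0, 90.0)            # $ per barrel-equivalent (assumed)
              annual = margin * 20000 * 365                  # 20,000 bbl/day nominal output (assumed)
              results.append(npv([capital] + [annual] * 20, rate))
          results.sort()
          return results[len(results) // 2], results[len(results) // 10]   # median, 10th percentile

      median_npv, downside_npv = simulate_project()
      print(round(median_npv / 1e9, 2), round(downside_npv / 1e9, 2))      # $ billions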

  16. Neutron activation analysis system

    DOEpatents

    Taylor, M.C.; Rhodes, J.R.

    1973-12-25

    A neutron activation analysis system for monitoring a generally fluid media, such as slurries, solutions, and fluidized powders, including two separate conduit loops for circulating fluid samples within the range of radiation sources and detectors is described. Associated with the first loop is a neutron source that emits a high flux of slow and thermal neutrons. The second loop employs a fast neutron source, the flux from which is substantially free of thermal neutrons. Adjacent to both loops are gamma counters for spectrographic determination of the fluid constituents. Other gamma sources and detectors are arranged across a portion of each loop for determining the fluid density. (Official Gazette)

  17. Bias in Student Survey Findings from Active Parental Consent Procedures

    ERIC Educational Resources Information Center

    Shaw, Thérèse; Cross, Donna; Thomas, Laura T.; Zubrick, Stephen R.

    2015-01-01

    Increasingly, researchers are required to obtain active (explicit) parental consent prior to surveying children and adolescents in schools. This study assessed the potential bias present in a sample of actively consented students, and in the estimates of associations between variables obtained from this sample. Students (n = 3496) from 36…

  18. Procedures manual for the ORNL Radiological Survey Activities (RASA) Program

    SciTech Connect

    Myrick, T.E.; Berven, B.A.; Cottrell, W.D.; Goldsmith, W.A.; Haywood, F.F.

    1987-04-01

    The portion of the radiological survey program performed by ORNL is the subject of this Procedures Manual. The RASA group of the Health and Safety Research Division (HASRD) at ORNL is responsible for the planning, conducting, and reporting of the results of radiological surveys at specified sites and associated vicinity properties. The results of these surveys are used by DOE in determining the need for and extent of remedial actions. Upon completion of the necessary remedial actions, the ORNL-RASA group or other OOS contractor may be called upon to verify the effectiveness of the remedial action. Information from these postremedial action surveys is included as part of the data base used by DOE in certifying a site for unrestricted use.

  19. Patent Network Analysis and Quadratic Assignment Procedures to Identify the Convergence of Robot Technologies.

    PubMed

    Lee, Woo Jin; Lee, Won Kyung; Sohn, So Young

    2016-01-01

    Because of the remarkable developments in robotics in recent years, technological convergence has been active in this area. We focused on finding patterns of convergence within robot technology using network analysis of patents in both the USPTO and KIPO. To identify the variables that affect convergence, we used quadratic assignment procedures (QAP). From our analysis, we observed the patent network ecology related to convergence and found technologies that have great potential to converge with other robotics technologies. The results of our study are expected to contribute to setting up convergence-based R&D policies for robotics, which can lead to new innovation.

  20. Patent Network Analysis and Quadratic Assignment Procedures to Identify the Convergence of Robot Technologies

    PubMed Central

    Lee, Woo Jin; Lee, Won Kyung

    2016-01-01

    Because of the remarkable developments in robotics in recent years, technological convergence has been active in this area. We focused on finding patterns of convergence within robot technology using network analysis of patents in both the USPTO and KIPO. To identify the variables that affect convergence, we used quadratic assignment procedures (QAP). From our analysis, we observed the patent network ecology related to convergence and found technologies that have great potential to converge with other robotics technologies. The results of our study are expected to contribute to setting up convergence-based R&D policies for robotics, which can lead to new innovation. PMID:27764196
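
    The quadratic assignment procedure used here is essentially a permutation test on whole matrices. As a minimal, hedged sketch (not the authors' implementation), the following Python fragment tests the correlation between two square relationship matrices by repeatedly permuting the rows and columns of one of them; the matrices are small random placeholders.

      # Minimal QAP sketch: permutation test of the correlation between two square matrices.
      import numpy as np

      def qap_correlation_test(A, B, n_perm=1000, seed=0):
          rng = np.random.default_rng(seed)
          mask = ~np.eye(A.shape[0], dtype=bool)                    # ignore the diagonal
          def corr(X, Y):
              return np.corrcoef(X[mask], Y[mask])[0, 1]
          observed = corr(A, B)
          exceed = 0
          for _ in range(n_perm):
              p = rng.permutation(A.shape[0])
              if abs(corr(A[np.ix_(p, p)], B)) >= abs(observed):
                  exceed += 1
          return observed, exceed / n_perm                          # correlation, permutation p-value

      rng = np.random.default_rng(1)
      A = rng.integers(0, 5, size=(12, 12)).astype(float); A = (A + A.T) / 2
      B = A + rng.normal(scale=1.0, size=A.shape); B = (B + B.T) / 2
      print(qap_correlation_test(A, B))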

  1. Computing the surveillance error grid analysis: procedure and examples.

    PubMed

    Kovatchev, Boris P; Wakeman, Christian A; Breton, Marc D; Kost, Gerald J; Louie, Richard F; Tran, Nam K; Klonoff, David C

    2014-07-01

    The surveillance error grid (SEG) analysis is a tool for analysis and visualization of blood glucose monitoring (BGM) errors, based on the opinions of 206 diabetes clinicians who rated 4 distinct treatment scenarios. Resulting from this large-scale inquiry is a matrix of 337 561 risk ratings, 1 for each pair of (reference, BGM) readings ranging from 20 to 580 mg/dl. The computation of the SEG is therefore complex and in need of automation. The SEG software introduced in this article automates the task of assigning a degree of risk to each data point for a set of measured and reference blood glucose values so that the data can be distributed into 8 risk zones. The software's 2 main purposes are to (1) distribute a set of BGM data into 8 risk zones ranging from none to extreme and (2) present the data in a color-coded display to promote visualization. Besides aggregating the data into 8 zones corresponding to levels of risk, the SEG computes the number and percentage of data pairs in each zone and the number/percentage of data pairs above/below the diagonal line in each zone, which are associated with BGM errors creating risks for hypo- or hyperglycemia, respectively. To illustrate the action of the SEG software, we first present computer-simulated data stratified along error levels defined by ISO 15197:2013, which allows the SEG to be linked to this established standard. Further illustration of the SEG procedure is given with a series of previously published data, which reflect the performance of BGM devices and test strips under various environmental conditions. We conclude that the SEG software is a useful addition to the SEG analysis presented in this journal, developed to assess the magnitude of clinical risk from analytically inaccurate data in a variety of high-impact situations such as intensive care and disaster settings.
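
    The zone bookkeeping the article describes can be sketched briefly. The fragment below assumes that a risk score has already been assigned to each (reference, measured) pair; the scores and zone boundaries are placeholders, since the real SEG relies on the clinician-derived rating matrix rather than anything shown here.

      # Hypothetical bookkeeping sketch: bin (reference, measured, risk) triples into 8 zones
      # and count pairs above/below the identity line.
      from bisect import bisect_right

      ZONE_EDGES = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]   # assumed boundaries on a 0-4 risk scale
      ZONE_NAMES = ["zone_%d" % i for i in range(8)]     # zone_0 = none ... zone_7 = extreme

      def tally_zones(pairs_with_risk):
          """pairs_with_risk: iterable of (reference, measured, risk_score) tuples."""
          counts = {name: {"n": 0, "above": 0, "below": 0} for name in ZONE_NAMES}
          for ref, meas, risk in pairs_with_risk:
              zone = ZONE_NAMES[bisect_right(ZONE_EDGES, risk)]
              counts[zone]["n"] += 1
              if meas > ref:
                  counts[zone]["above"] += 1             # overestimate: hyperglycemia-side risk
              elif meas < ref:
                  counts[zone]["below"] += 1             # underestimate: hypoglycemia-side risk
          total = sum(c["n"] for c in counts.values()) or 1
          return {z: dict(c, percent=100.0 * c["n"] / total) for z, c in counts.items()}

      print(tally_zones([(100, 104, 0.1), (60, 130, 2.7), (250, 180, 1.6)]))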

  2. Phosphorus Determination by Derivative Activation Analysis: A Multifaceted Radiochemical Application.

    ERIC Educational Resources Information Center

    Kleppinger, E. W.; And Others

    1984-01-01

    Although determination of phosphorus is important in biology, physiology, and environmental science, traditional gravimetric and colorimetric methods are cumbersome and lack the requisite sensitivity. Therefore, a derivative activation analysis method is suggested. Background information, procedures, and results are provided. (JN)

  3. Forensic analysis of Salvia divinorum using multivariate statistical procedures. Part I: discrimination from related Salvia species.

    PubMed

    Willard, Melissa A Bodnar; McGuffin, Victoria L; Smith, Ruth Waddell

    2012-01-01

    Salvia divinorum is a hallucinogenic herb that is internationally regulated. In this study, salvinorin A, the active compound in S. divinorum, was extracted from S. divinorum plant leaves using a 5-min extraction with dichloromethane. Four additional Salvia species (Salvia officinalis, Salvia guaranitica, Salvia splendens, and Salvia nemorosa) were extracted using this procedure, and all extracts were analyzed by gas chromatography-mass spectrometry. Differentiation of S. divinorum from other Salvia species was successful based on visual assessment of the resulting chromatograms. To provide a more objective comparison, the total ion chromatograms (TICs) were subjected to principal components analysis (PCA). Prior to PCA, the TICs were subjected to a series of data pretreatment procedures to minimize non-chemical sources of variance in the data set. Successful discrimination of S. divinorum from the other four Salvia species was possible based on visual assessment of the PCA scores plot. To provide a numerical assessment of the discrimination, a series of statistical procedures such as Euclidean distance measurement, hierarchical cluster analysis, Student's t tests, Wilcoxon rank-sum tests, and Pearson product moment correlation were also applied to the PCA scores. The statistical procedures were then compared to determine the advantages and disadvantages for forensic applications.
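
    As a rough illustration of the chemometric chain described (pretreatment of total ion chromatograms followed by PCA and distance measurements in the scores space), the Python sketch below uses scikit-learn on a random placeholder data matrix; it is not the authors' workflow or data, and the species labels are only examples.

      # Illustrative chemometric sketch: normalize and mean-center TICs, project with PCA,
      # and measure the Euclidean distance between class centroids in the scores space.
      import numpy as np
      from sklearn.decomposition import PCA

      def pca_scores(tics, n_components=2):
          X = np.asarray(tics, dtype=float)
          X = X / X.sum(axis=1, keepdims=True)        # total-area normalization
          X = X - X.mean(axis=0)                      # mean-centering
          return PCA(n_components=n_components).fit_transform(X)

      rng = np.random.default_rng(0)
      tics = np.abs(rng.normal(size=(10, 500)))       # 10 extracts x 500 scan points (synthetic)
      labels = ["divinorum"] * 5 + ["officinalis"] * 5
      scores = pca_scores(tics)
      centroids = {s: scores[[l == s for l in labels]].mean(axis=0) for s in set(labels)}
      d = np.linalg.norm(centroids["divinorum"] - centroids["officinalis"])
      print(scores.shape, round(float(d), 3))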

  4. Active Reading Procedures for Moderating the Effects of Poor Highlighting

    ERIC Educational Resources Information Center

    Gier, Vicki S.; Herring, Daniel; Hudnell, Jason; Montoya, Jodi; Kreiner, David S.

    2010-01-01

    We investigated two active reading techniques intended to eliminate the negative effect on reading comprehension of preexisting, inappropriate highlighting. College students read passages in three highlighting conditions: no highlighting, appropriate highlighting, and inappropriate highlighting. In Experiment 1, 30 students read the passages while…

  5. Preamplification Procedure for the Analysis of Ancient DNA Samples

    PubMed Central

    Del Gaudio, Stefania; Cirillo, Alessandra; Di Bernardo, Giovanni; Galderisi, Umberto; Thanassoulas, Theodoros; Pitsios, Theodoros; Cipollaro, Marilena

    2013-01-01

    In ancient DNA studies, the low amount of endogenous DNA represents a limiting factor that often hampers the achievement of results. In this study we extracted the DNA from nine human skeletal remains of different ages found in the Byzantine cemetery of Abdera Halkidiki and in the medieval cemetery of St. Spiridion in Rhodes (Greece). Real-time quantitative polymerase chain reaction (qPCR) was used to detect in the extracts the presence of PCR inhibitors and to estimate the DNA content. Whereas mitochondrial DNA was detected in all samples, amplification of nuclear targets, such as amelogenin and the M470V polymorphism of the transmembrane conductance regulator gene, yielded positive results in one case only. In an effort to improve amplification success, we applied, for the first time in ancient DNA, a preamplification strategy based on the TaqMan PreAmp Master Mix. A comparison between results obtained from nonpreamplified and preamplified samples is reported. Our data, even if preliminary, show that the TaqMan PreAmp procedure may improve the sensitivity of qPCR analysis. PMID:24187523

  6. Preamplification procedure for the analysis of ancient DNA samples.

    PubMed

    Del Gaudio, Stefania; Cirillo, Alessandra; Di Bernardo, Giovanni; Galderisi, Umberto; Thanassoulas, Theodoros; Pitsios, Theodoros; Cipollaro, Marilena

    2013-01-01

    In ancient DNA studies, the low amount of endogenous DNA represents a limiting factor that often hampers the achievement of results. In this study we extracted the DNA from nine human skeletal remains of different ages found in the Byzantine cemetery of Abdera Halkidiki and in the medieval cemetery of St. Spiridion in Rhodes (Greece). Real-time quantitative polymerase chain reaction (qPCR) was used to detect in the extracts the presence of PCR inhibitors and to estimate the DNA content. Whereas mitochondrial DNA was detected in all samples, amplification of nuclear targets, such as amelogenin and the M470V polymorphism of the transmembrane conductance regulator gene, yielded positive results in one case only. In an effort to improve amplification success, we applied, for the first time in ancient DNA, a preamplification strategy based on the TaqMan PreAmp Master Mix. A comparison between results obtained from nonpreamplified and preamplified samples is reported. Our data, even if preliminary, show that the TaqMan PreAmp procedure may improve the sensitivity of qPCR analysis.

  7. Inverse procedure for high-latitude ionospheric electrodynamics: Analysis of satellite-borne magnetometer data

    NASA Astrophysics Data System (ADS)

    Matsuo, Tomoko; Knipp, Delores J.; Richmond, Arthur D.; Kilcommons, Liam; Anderson, Brian J.

    2015-06-01

    This paper presents an analysis of data from the magnetometers on board the Defense Meteorological Satellite Program (DMSP) F-15, F-16, F-17, and F-18 satellites and the Iridium satellite constellation, using an inverse procedure for high-latitude ionospheric electrodynamics, during the period of 29-30 May 2010. The Iridium magnetometer data are made available through the Active Magnetosphere and Planetary Electrodynamics Response Experiment (AMPERE) program. The method presented here is built upon the assimilative mapping of ionospheric electrodynamics procedure but with a more complete treatment of the prior model uncertainty to facilitate an optimal inference of complete polar maps of electrodynamic variables from irregularly distributed observational data. The procedure can provide an objective measure of uncertainty associated with the analysis. The cross-validation analysis, in which the DMSP data are used as independent validation data sets, suggests that the procedure yields the spatial prediction of DMSP perturbation magnetic fields from AMPERE data alone with a median discrepancy of 30-50 nT. Discrepancies larger than 100 nT are seen in about 20% of total samples, whose location and magnitude are generally consistent with the previously identified discrepancy between DMSP and AMPERE data sets. Resulting field-aligned current (FAC) patterns exhibit more distinct spatial patterns without spurious high-frequency oscillatory features in comparison to the FAC products provided by AMPERE. Maps of the toroidal magnetic potential and FAC estimated from both AMPERE and DMSP data under four distinctive interplanetary magnetic field (IMF) conditions during a magnetic cloud event demonstrate the IMF control of high-latitude electrodynamics and the opportunity for future scientific investigation.

  8. An efficient numerical procedure for thermohydrodynamic analysis of cavitating bearings

    NASA Technical Reports Server (NTRS)

    Vijayaraghavan, D.

    1995-01-01

    An efficient and accurate numerical procedure to determine the thermo-hydrodynamic performance of cavitating bearings is described. This procedure is based on the earlier development of Elrod for lubricating films, in which the properties across the film thickness are determined at Lobatto points and their distributions are expressed by collocated polynomials. The cavitated regions and their boundaries are rigorously treated. Thermal boundary conditions at the surfaces, including heat dissipation through the metal to the ambient, are incorporated. Numerical examples are presented comparing the predictions using this procedure with earlier theoretical predictions and experimental data. With a few points across the film thickness and across the journal and the bearing in the radial direction, the temperature profile is very well predicted.
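
    The key discretization idea, evaluating cross-film properties at Lobatto points and representing their variation with a collocated polynomial, can be sketched briefly. The Python fragment below is an illustration, not the author's code: it computes Gauss-Lobatto-Legendre points and interpolates a hypothetical cross-film temperature distribution through them.

      # Illustration: Gauss-Lobatto points and a collocated polynomial through
      # hypothetical cross-film temperature values.
      import numpy as np
      from numpy.polynomial import legendre

      def lobatto_points(n):
          """n Gauss-Lobatto points on [-1, 1]: the endpoints plus the roots of P'_{n-1}."""
          inner = legendre.Legendre.basis(n - 1).deriv().roots()
          return np.concatenate(([-1.0], np.sort(inner.real), [1.0]))

      def collocated_poly(nodes, values):
          return np.poly1d(np.polyfit(nodes, values, deg=len(nodes) - 1))

      nodes = lobatto_points(5)
      temps = 80.0 + 15.0 * (1.0 - nodes**2) + 3.0 * nodes   # hypothetical temperatures (deg C)
      profile = collocated_poly(nodes, temps)
      print(np.round(nodes, 4))
      print("mid-film temperature:", round(float(profile(0.0)), 2))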

  9. Radiographic Parameter Analysis on Modified Sauvé-Kapandji Procedure

    PubMed Central

    Ota, Norikazu; Nakamura, Toshiyasu; Iwamoto, Takuji; Sato, Kazuki; Toyama, Yoshiaki

    2013-01-01

    Purpose The Sauvé-Kapandji (S-K) procedure is now an established treatment option for symptomatic distal radioulnar joint (DRUJ) dysfunction. However, for patients with poor bone quality (frequently as a result of advanced-stage rheumatoid arthritis [RA]), the conventional S-K procedure is difficult to perform without reducing the radioulnar diameter of the wrist, which may result in a loss of grip strength and pain over the proximal ulnar stump. The purpose of this study was to review the radiographic outcomes of patients who underwent a modified S-K procedure that involves rotating the resected ulnar segment 90 degrees and using it to bridge the gap between the sigmoid notch and the ulnar head. Methods The modified S-K procedure was performed in 29 wrists of 23 patients. Twenty-one patients had severe RA, while two had malunited radius fractures. The mean follow-up period was 43 months (range, 23 to 95). The radiographic evaluation included a measurement of the radioulnar width, the pseudarthrosis gap between the proximal and distal ulnar stump, the radioulnar distance, and the ulnar translation of the carpus. Results The radioulnar width of the wrist, pseudarthrosis gap, and radioulnar distance were well maintained throughout the period. A postoperative loss in the radioulnar width of the wrists appeared to correlate with a postoperative additional ulnar translocation of the carpus. Conclusion Narrowing of the radioulnar width of the wrist is a potential cause of progressive ulnar translocation of the carpus. The modified technique for the S-K procedure maintains the distal ulna in the proper position and provides sufficient ulnar support for the carpus. It is a useful reconstruction procedure in patients with severe RA with poor bone quality. PMID:24436785

  10. Radiographic parameter analysis on modified sauvé-kapandji procedure.

    PubMed

    Ota, Norikazu; Nakamura, Toshiyasu; Iwamoto, Takuji; Sato, Kazuki; Toyama, Yoshiaki

    2013-02-01

    Purpose The Sauvé-Kapandji (S-K) procedure is now an established treatment option for symptomatic distal radioulnar joint (DRUJ) dysfunction. However, for patients with poor bone quality (frequently as a result of advanced-stage rheumatoid arthritis [RA]), the conventional S-K procedure is difficult to perform without reducing the radioulnar diameter of the wrist, which may result in a loss of grip strength and pain over the proximal ulnar stump. The purpose of this study was to review the radiographic outcomes of patients who underwent a modified S-K procedure that involves rotating the resected ulnar segment 90 degrees and using it to bridge the gap between the sigmoid notch and the ulnar head. Methods The modified S-K procedure was performed in 29 wrists of 23 patients. Twenty-one patients had severe RA, while two had malunited radius fractures. The mean follow-up period was 43 months (range, 23 to 95). The radiographic evaluation included a measurement of the radioulnar width, the pseudarthrosis gap between the proximal and distal ulnar stump, the radioulnar distance, and the ulnar translation of the carpus. Results The radioulnar width of the wrist, pseudarthrosis gap, and radioulnar distance were well maintained throughout the period. A postoperative loss in the radioulnar width of the wrists appeared to correlate with a postoperative additional ulnar translocation of the carpus. Conclusion Narrowing of the radioulnar width of the wrist is a potential cause of progressive ulnar translocation of the carpus. The modified technique for the S-K procedure maintains the distal ulna in the proper position and provides sufficient ulnar support for the carpus. It is a useful reconstruction procedure in patients with severe RA with poor bone quality.

  11. An Analysis of Error-Correction Procedures during Discrimination Training.

    ERIC Educational Resources Information Center

    Rodgers, Teresa A.; Iwata, Brian A.

    1991-01-01

    Seven adults with severe to profound mental retardation participated in match-to-sample discrimination training under three conditions. Results indicated that error-correction procedures improve performance through negative reinforcement; that error correction may serve multiple functions; and that, for some subjects, trial repetition enhances…

  12. The Risky Situation: A Procedure for Assessing the Father-Child Activation Relationship

    ERIC Educational Resources Information Center

    Paquette, Daniel; Bigras, Marc

    2010-01-01

    Initial validation data are presented for the Risky Situation (RS), a 20-minute observational procedure designed to assess the father-child activation relationship with children aged 12-18 months. The coding grid, which is simple and easy to use, allows parent-child dyads to be classified into three categories and provides an activation score. By…

  13. Evaluating Active Parental Consent Procedures for School Programming: Addressing the Sensitive Topic of Suicide Prevention

    ERIC Educational Resources Information Center

    Totura, Christine M. Wienke; Kutash, Krista; Labouliere, Christa D.; Karver, Marc S.

    2017-01-01

    Background: Suicide is the second leading cause of death for adolescents. Whereas school-based prevention programs are effective, obtaining active consent for youth participation in public health programming concerning sensitive topics is challenging. We explored several active consent procedures for improving participation rates. Methods: Five…

  14. Acquisition of Procedures: The Effects of Example Elaborations and Active Learning Exercises

    ERIC Educational Resources Information Center

    Catrambone, Richard; Yuasa, Mashiho

    2006-01-01

    This study explored the effects of active learning and types of elaboration on procedure acquisition (writing database queries). Training materials emphasized elaborations of conditions for executing actions versus elaborations of the connection between conditions and actions. In the "active" conditions, participants performed structured exercises…

  15. A Quantitative Review of Functional Analysis Procedures in Public School Settings

    ERIC Educational Resources Information Center

    Solnick, Mark D.; Ardoin, Scott P.

    2010-01-01

    Functional behavioral assessments can consist of indirect, descriptive and experimental procedures, such as a functional analysis. Although the research contains numerous examples demonstrating the effectiveness of functional analysis procedures, experimental conditions are often difficult to implement in classroom settings and analog conditions…

  16. Risk analysis procedure for post-wildfire natural hazards in British Columbia

    NASA Astrophysics Data System (ADS)

    Jordan, Peter

    2010-05-01

    Following a severe wildfire season in 2003, and several subsequent damaging debris flow and flood events, the British Columbia Forest Service developed a procedure for analysing risks to public safety and infrastructure from such events. At the same time, the Forest Service undertook a research program to determine the extent of post-wildfire hazards, and examine the hydrologic and geomorphic processes contributing to the hazards. The risk analysis procedure follows the Canadian Standards Association decision-making framework for risk management (which in turn is based on international standards). This has several steps: identification of risk, risk analysis and estimation, evaluation of risk tolerability, developing control or mitigation strategies, and acting on these strategies. The Forest Service procedure deals only with the first two steps. The results are passed on to authorities such as the Provincial Emergency Program and local government, who are responsible for evaluating risks, warning residents, and applying mitigation strategies if appropriate. The objective of the procedure is to identify and analyse risks to public safety and infrastructure. The procedure is loosely based on the BAER (burned area emergency response) program in the USA, with some important differences. Our procedure focuses on identifying risks and warning affected parties, not on mitigation activities such as broadcast erosion control measures. Partly this is due to limited staff and financial resources. Also, our procedure is not multi-agency, but is limited to wildfires on provincial forest land; in British Columbia about 95% of forest land is in the publicly-owned provincial forest. Each fire season, wildfires are screened by size and proximity to values at risk such as populated areas. For selected fires, when the fire is largely contained, the procedure begins with an aerial reconnaissance of the fire, and photography with a hand-held camera, which can be used to make a

  17. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... using an analysis method, such as a failure mode and effects analysis or fault tree analysis. (2... Analysis and Operational Procedures I Appendix I to Part 417 Aeronautics and Space COMMERCIAL SPACE..., App. I Appendix I to Part 417—Methodologies for Toxic Release Hazard Analysis and...

  18. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... using an analysis method, such as a failure mode and effects analysis or fault tree analysis. (2... Analysis and Operational Procedures I Appendix I to Part 417 Aeronautics and Space COMMERCIAL SPACE..., App. I Appendix I to Part 417—Methodologies for Toxic Release Hazard Analysis and...

  19. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... using an analysis method, such as a failure mode and effects analysis or fault tree analysis. (2... Analysis and Operational Procedures I Appendix I to Part 417 Aeronautics and Space COMMERCIAL SPACE..., App. I Appendix I to Part 417—Methodologies for Toxic Release Hazard Analysis and...

  20. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... using an analysis method, such as a failure mode and effects analysis or fault tree analysis. (2... Analysis and Operational Procedures I Appendix I to Part 417 Aeronautics and Space COMMERCIAL SPACE..., App. I Appendix I to Part 417—Methodologies for Toxic Release Hazard Analysis and...

  1. NASTRAN/FLEXSTAB procedure for static aeroelastic analysis

    NASA Technical Reports Server (NTRS)

    Schuster, L. S.

    1984-01-01

    Presented is a procedure for using the FLEXSTAB External Structural Influence Coefficients (ESIC) computer program to produce the structural data necessary for the FLEXSTAB Stability Derivatives and Static Stability (SD&SS) program. The SD&SS program computes trim state, stability derivatives, and pressure and deflection data for a flexible airplane having a plane of symmetry. The procedure used a NASTRAN finite-element structural model as the source of structural data in the form of flexibility matrices. Selection of a set of degrees of freedom, definition of structural nodes and panels, reordering and reformatting of the flexibility matrix, and redistribution of existing point mass data are among the topics discussed. Also discussed are boundary conditions and the NASTRAN substructuring technique.

  2. Computer-based procedure for field activities: Results from three evaluations at nuclear power plants

    SciTech Connect

    Oxstrand, Johanna; Bly, Aaron; LeBlanc, Katya

    2014-09-01

    Nearly all activities that involve human interaction with the systems of a nuclear power plant are guided by procedures. The paper-based procedures (PBPs) currently used by industry have a demonstrated history of ensuring safety; however, improving procedure use could yield tremendous savings in increased efficiency and safety. One potential way to improve procedure-based activities is through the use of computer-based procedures (CBPs). Computer-based procedures provide the opportunity to incorporate context-driven job aids, such as drawings, photos, and just-in-time training, into the CBP system. One obvious advantage of this capability is reducing the time spent tracking down the applicable documentation. Additionally, human performance tools can be integrated in the CBP system in such a way that they help the worker focus on the task rather than the tools. Some tools can be completely incorporated into the CBP system, such as pre-job briefs, placekeeping, correct component verification, and peer checks. Other tools can be partly integrated in a fashion that reduces the time and labor required, such as concurrent and independent verification. Another benefit of CBPs compared to PBPs is dynamic procedure presentation. PBPs are static documents, which limits the degree to which the information presented can be tailored to the task and conditions when the procedure is executed. The CBP system could be configured to display only the relevant steps based on operating mode, plant status, and the task at hand. A dynamic presentation of the procedure (also known as context-sensitive procedures) will guide the user down the path of relevant steps based on the current conditions. This feature will reduce the user's workload and inherently reduce the risk of incorrectly marking a step as not applicable and the risk of incorrectly performing a step that should be marked as not applicable. As part of the Department of Energy's (DOE) Light Water Reactors Sustainability Program
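
    The idea of context-sensitive step presentation can be sketched with a small, purely hypothetical data structure: each step carries an applicability predicate over the plant context, and only applicable steps are rendered. The field names and conditions below are invented and do not come from any CBP product described in the paper.

      # Hypothetical sketch of context-sensitive step filtering.
      from dataclasses import dataclass, field
      from typing import Callable, Dict, List

      @dataclass
      class Step:
          number: str
          text: str
          applies: Callable[[Dict], bool] = field(default=lambda ctx: True)

      def render(steps: List[Step], context: Dict) -> List[str]:
          return ["%s  %s" % (s.number, s.text) for s in steps if s.applies(context)]

      procedure = [
          Step("1", "Verify prerequisites are complete."),
          Step("2", "Align the system for Mode 3 operation.", lambda c: c.get("mode") == 3),
          Step("3", "Record pump discharge pressure."),
          Step("4", "Perform independent verification of the valve lineup.",
               lambda c: c.get("safety_related", False)),
      ]

      print(render(procedure, {"mode": 3, "safety_related": False}))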

  3. Procedure for conducting a human-reliability analysis for nuclear power plants. Final report

    SciTech Connect

    Bell, B.J.; Swain, A.D.

    1983-05-01

    This document describes in detail a procedure to be followed in conducting a human reliability analysis as part of a probabilistic risk assessment when such an analysis is performed according to the methods described in NUREG/CR-1278, Handbook for Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications. An overview of the procedure describing the major elements of a human reliability analysis is presented along with a detailed description of each element and an example of an actual analysis. An appendix consists of some sample human reliability analysis problems for further study.

  4. Accuracy of the Parallel Analysis Procedure with Polychoric Correlations

    ERIC Educational Resources Information Center

    Cho, Sun-Joo; Li, Feiming; Bandalos, Deborah

    2009-01-01

    The purpose of this study was to investigate the application of the parallel analysis (PA) method for choosing the number of factors in component analysis for situations in which data are dichotomous or ordinal. Although polychoric correlations are sometimes used as input for component analyses, the random data matrices generated for use in PA…
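
    For readers unfamiliar with the PA method itself, the sketch below shows the core comparison on ordinary Pearson correlations: retain components whose observed eigenvalues exceed a chosen percentile of eigenvalues obtained from random data of the same dimensions. It is only an illustration; the study's focus, polychoric-correlation input, requires specialized estimation that is not reproduced here, and the simulated data are placeholders.

      # Minimal sketch of Horn's parallel analysis on Pearson correlations.
      import numpy as np

      def parallel_analysis(data, n_sets=200, percentile=95, seed=0):
          rng = np.random.default_rng(seed)
          n, p = data.shape
          obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
          rand = np.empty((n_sets, p))
          for i in range(n_sets):
              r = rng.normal(size=(n, p))
              rand[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
          threshold = np.percentile(rand, percentile, axis=0)
          return int(np.sum(obs > threshold))

      rng = np.random.default_rng(1)
      factor = rng.normal(size=(300, 1))                       # one common factor (synthetic)
      data = np.hstack([factor + rng.normal(scale=0.8, size=(300, 1)) for _ in range(6)])
      print("components retained:", parallel_analysis(data))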

  5. RECOMMENDED OPERATING PROCEDURE NO. 45: ANALYSIS OF NITROUS OXIDE FROM COMBUSTION SOURCES

    EPA Science Inventory

    The recommended operating procedure (ROP) has been prepared for use in research activities conducted by EPA's Air and Energy Engineering Research Laboratory (AEERL). The procedure applies to the measurement of nitrous oxide (N2O) in dry gas samples extracted from gas streams where...

  6. Pyroshock data analysis-The GCPP validation procedure

    NASA Astrophysics Data System (ADS)

    Piersol, Allan G.

    2002-05-01

    This procedure for validating pyroshock data was developed by Powers and the author, using techniques originated by Gaberson and Chalmers for validating naval shock data at lower frequencies and subsequently modified. It requires that the acceleration time history be single- and double-integrated to obtain velocity and displacement time histories, which are then examined. Valid data look like a low-pass filtered acceleration time history which is not integrated. The maximum resulting displacement should approximate the independently measured displacement (often near zero) or that previously computed analytically for the structure at the measured location. In addition, the positive and negative SRS are computed and compared. Valid data usually show similar spectral content.
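
    The integration step of the check can be sketched with a few lines of Python. The synthetic signal below (a decaying oscillation plus a small constant offset that mimics a zero-shift error) is invented for the example; the sample rate is an assumed value, not one from the paper.

      # Hedged sketch: single- and double-integrate an acceleration record and inspect
      # the velocity and displacement for drift.
      import numpy as np

      def integrate_history(accel, dt):
          velocity = np.cumsum(accel) * dt              # simple rectangular integration
          displacement = np.cumsum(velocity) * dt
          return velocity, displacement

      dt = 1e-5                                         # 100 kHz sample rate (assumed)
      t = np.arange(0.0, 0.05, dt)
      accel = 500.0 * np.exp(-200.0 * t) * np.sin(2 * np.pi * 2000.0 * t) + 0.5   # m/s^2
      vel, disp = integrate_history(accel, dt)
      print("final velocity (m/s):", round(float(vel[-1]), 4))
      print("final displacement (m):", round(float(disp[-1]), 6))   # drift flags suspect data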

  7. Bayesian Procedures for Prediction Analysis of Implication Hypotheses in 2 X 2 Contingency Tables.

    ERIC Educational Resources Information Center

    Lecoutre, Bruno; Charron, Camilo

    2000-01-01

    Illustrates procedures for prediction analysis in 2 X 2 contingency tables through the analyses of solutions of six types of problems associated with the acquisition of fractions. Reviews and extends confidence interval procedures previously proposed for an index of predictive efficiency of implication hypotheses. Compares frequentist coverage…

  8. Analysis of Helicopter Noise Data Using International Helicopter Noise Certification Procedures,

    DTIC Science & Technology

    1986-03-01

    establishes noise levels using the basic testing, reduction and analysis procedures specified by the International Civil Aviation Organization (ICAO)...for helicopter noise certification supplemented with some procedural refinements contained in ICAO Working Group II recommendations for incorporation...Noise levels are plotted versus the logarithm of maximum gross takeoff weight and are shown relative to the ICAO noise level limits. Data from the ICAO

  9. Responding to Self-Harm: A Documentary Analysis of Agency Policy and Procedure

    ERIC Educational Resources Information Center

    Paul, Sally; Hill, Malcolm

    2013-01-01

    This paper reports on the findings of a documentary analysis of policies and procedures relating to self-harm from a range of organisations working with young people in the UK. It identifies the extent to which policies and/or procedures relating to self-harm are available for service providers and offers a wider understanding of the concepts of…

  10. 75 FR 48553 - Supplement to Commission Procedures During Periods of Emergency Operations Requiring Activation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-11

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission 18 CFR Part 376 Supplement to Commission Procedures During Periods of Emergency Operations Requiring Activation of Continuity of Operations Plan Issued August 5, 2010....

  11. Object relations theory and activity theory: a proposed link by way of the procedural sequence model.

    PubMed

    Ryle, A

    1991-12-01

    An account of object relations theory (ORT), represented in terms of the procedural sequence model (PSM), is compared to the ideas of Vygotsky and activity theory (AT). The two models are seen to be compatible and complementary and their combination offers a satisfactory account of human psychology, appropriate for the understanding and integration of psychotherapy.

  12. Operational Control Procedures for the Activated Sludge Process, Part I - Observations, Part II - Control Tests.

    ERIC Educational Resources Information Center

    West, Alfred W.

    This is the first in a series of documents developed by the National Training and Operational Technology Center describing operational control procedures for the activated sludge process used in wastewater treatment. Part I of this document deals with physical observations which should be performed during each routine control test. Part II…

  13. CTEPP STANDARD OPERATING PROCEDURE FOR TRANSLATING VIDEOTAPES OF CHILD ACTIVITIES (SOP-4.13)

    EPA Science Inventory

    The EPA will conduct a two-day video translation workshop to demonstrate to coders the procedures for translating the activity patterns of preschool children on videotape. The coders will be required to pass reliability tests to successfully complete the training requirements of ...

  14. 12 CFR 225.27 - Procedures for determining scope of nonbanking activities.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 3 2014-01-01 2014-01-01 false Procedures for determining scope of nonbanking activities. 225.27 Section 225.27 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM (CONTINUED) BANK HOLDING COMPANIES AND CHANGE IN BANK CONTROL (REGULATION...

  15. 12 CFR 225.27 - Procedures for determining scope of nonbanking activities.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 3 2011-01-01 2011-01-01 false Procedures for determining scope of nonbanking activities. 225.27 Section 225.27 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM BANK HOLDING COMPANIES AND CHANGE IN BANK CONTROL (REGULATION...

  16. 12 CFR 225.27 - Procedures for determining scope of nonbanking activities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Procedures for determining scope of nonbanking activities. 225.27 Section 225.27 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM BANK HOLDING COMPANIES AND CHANGE IN BANK CONTROL (REGULATION...

  17. 12 CFR 225.27 - Procedures for determining scope of nonbanking activities.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 3 2012-01-01 2012-01-01 false Procedures for determining scope of nonbanking activities. 225.27 Section 225.27 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM BANK HOLDING COMPANIES AND CHANGE IN BANK CONTROL (REGULATION...

  18. 12 CFR 225.27 - Procedures for determining scope of nonbanking activities.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 3 2013-01-01 2013-01-01 false Procedures for determining scope of nonbanking activities. 225.27 Section 225.27 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM (CONTINUED) BANK HOLDING COMPANIES AND CHANGE IN BANK CONTROL (REGULATION...

  19. Rhythms of Dialogue and Referential Activity: Implicit Process across Procedural and Verbal Realms

    ERIC Educational Resources Information Center

    Ritter, Michael S.

    2009-01-01

    This work examines the relationship between implicit procedural and implicit verbal processes as they occur in natural adult conversation. Theoretical insights and empirical findings are rooted in a move towards integration of Bucci's "Referential Activity" (RA) and "Multiple Code" perspectives and Beebe and Jaffe's…

  20. Small Schools Mathematics Curriculum, 9-12: Scope Objectives, Activities, Resources, Monitoring Procedures.

    ERIC Educational Resources Information Center

    Nelson, JoAnne, Ed.; And Others

    The grade 9-12 mathematics curriculum learning objectives, activities, monitoring procedures and resources for small schools were developed during 1978-79 through the cooperative efforts of 10 Snohomish and Island County school districts, Educational Service District 189 and the Washington State Office of Public Instruction. The objectives were…

  1. 15 CFR 400.37 - Procedure for notification of proposed production activity.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 2 2014-01-01 2014-01-01 false Procedure for notification of proposed production activity. 400.37 Section 400.37 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued) FOREIGN-TRADE ZONES BOARD, DEPARTMENT OF COMMERCE REGULATIONS OF THE...

  2. Digital image processing and analysis for activated sludge wastewater treatment.

    PubMed

    Khan, Muhammad Burhan; Lee, Xue Yong; Nisar, Humaira; Ng, Choon Aun; Yeap, Kim Ho; Malik, Aamir Saeed

    2015-01-01

    The activated sludge system is generally used in wastewater treatment plants for processing domestic influent. Conventionally, activated sludge wastewater treatment is monitored by measuring physico-chemical parameters such as total suspended solids (TSSol), sludge volume index (SVI), and chemical oxygen demand (COD). These tests are conducted in the laboratory and take many hours to give a final measurement. Digital image processing and analysis offers a better alternative, not only to monitor and characterize the current state of activated sludge but also to predict its future state. The characterization by image processing and analysis is done by correlating the time evolution of parameters extracted by image analysis of flocs and filaments with the physico-chemical parameters. This chapter briefly reviews activated sludge wastewater treatment and the procedures of image acquisition, preprocessing, segmentation, and analysis in the specific context of activated sludge wastewater treatment. In the latter part, additional procedures such as z-stacking and image stitching, not previously used in the context of activated sludge, are introduced for wastewater image preprocessing. Different preprocessing and segmentation techniques are proposed, along with a survey of imaging procedures reported in the literature. Finally, the image-analysis-based morphological parameters and their correlation with the monitoring and prediction of activated sludge are discussed. Hence it is observed that image analysis can play a very useful role in the monitoring of activated sludge wastewater treatment plants.
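
    As a hedged illustration of the generic processing chain discussed (threshold, segment, measure floc morphology), the Python sketch below uses scikit-image on a synthetic grayscale image. It is not the authors' macroprogram; the Otsu threshold and the small-object size filter are arbitrary choices for the example.

      # Hedged sketch: threshold, segment, and measure simple floc morphology.
      import numpy as np
      from skimage import filters, measure, morphology

      def floc_parameters(gray):
          binary = gray > filters.threshold_otsu(gray)              # global Otsu threshold
          binary = morphology.remove_small_objects(binary, min_size=20)
          labels = measure.label(binary)
          areas = [p.area for p in measure.regionprops(labels)]
          return {"object_count": len(areas),
                  "mean_area_px": float(np.mean(areas)) if areas else 0.0,
                  "area_fraction": float(binary.mean())}

      rng = np.random.default_rng(0)
      synthetic = rng.normal(0.2, 0.05, size=(256, 256))
      synthetic[60:120, 60:140] += 0.6                              # one bright "floc"
      print(floc_parameters(synthetic))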

  3. Formaldehyde monitoring program: development of sampling and analysis procedures

    SciTech Connect

    Matthews, T. G.; Hawthorne, A. R.

    1980-01-01

    This report outlines the scope and goals of the formaldehyde analysis program being carried out in the Health and Safety Research Division of the Oak Ridge National Laboratory under contract to the US Consumer Product Safety Commission. The sampling and analysis techniques under consideration are outlined, with reference to a time frame for developmental work and field application. The complexity of the different techniques is addressed in instances where technical staff would be required for accurate operation of the instrumentation.

  4. Terrain-analysis procedures for modeling radar backscatter

    USGS Publications Warehouse

    Schaber, Gerald G.; Pike, Richard J.; Berlin, Graydon Lennis

    1978-01-01

    The collection and analysis of detailed information on the surface of natural terrain are important aspects of radar-backscatter modeling. Radar is especially sensitive to surface-relief changes at the millimeter-to-decimeter scale for conventional K-band (~1-cm wavelength) to L-band (~25-cm wavelength) radar systems. Surface roughness statistics that characterize these changes in detail have been generated from sets of field measurements by a comprehensive set of seven programmed calculations for radar-backscatter modeling. The seven programs are 1) formatting of data in a readable form for the subsequent topographic analysis programs; 2) relief analysis; 3) power spectral analysis; 4) power spectrum plots; 5) slope angle between slope reversals; 6) slope angle against slope interval plots; and 7) base-length slope angle and curvature. This complete Fortran IV software package, 'Terrain Analysis', is here presented for the first time. It was originally developed a decade ago for investigations of lunar morphology and surface trafficability for the Apollo Lunar Roving Vehicle.
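
    Two of the listed calculations, base-length slope angles and the power spectrum of a profile, can be illustrated compactly. The Python sketch below is not the original Fortran IV package; the profile, sample spacing, and base length are synthetic placeholders.

      # Illustrative sketch: slope angles at a chosen base length and a simple power
      # spectrum for a topographic profile.
      import numpy as np

      def slope_angles_deg(heights, spacing, base_length):
          step = max(1, int(round(base_length / spacing)))
          dz = heights[step:] - heights[:-step]
          return np.degrees(np.arctan2(dz, step * spacing))

      def power_spectrum(heights, spacing):
          idx = np.arange(len(heights))
          detrended = heights - np.polyval(np.polyfit(idx, heights, 1), idx)
          return np.fft.rfftfreq(len(heights), d=spacing), np.abs(np.fft.rfft(detrended)) ** 2

      x = np.arange(0.0, 50.0, 0.05)                     # 5-cm sample spacing (assumed), metres
      profile = 0.02 * np.sin(2 * np.pi * x / 2.0) + 0.005 * np.random.default_rng(0).normal(size=x.size)
      angles = slope_angles_deg(profile, spacing=0.05, base_length=0.25)
      freqs, spec = power_spectrum(profile, spacing=0.05)
      print("rms slope at 25-cm base length (deg):", round(float(np.sqrt(np.mean(angles**2))), 2))
      print("dominant wavelength (m):", round(float(1.0 / freqs[1 + int(np.argmax(spec[1:]))]), 2))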

  5. Effects of certain analysis procedures on solar global velocity signals

    SciTech Connect

    Gilman, P.A.; Glatzmaier, G.A.

    1980-10-15

    We examine the data reduction procedures used by Howard and colleagues to deduce global solar velocities from the original Mount Wilson Doppler-magnetograph record. We demonstrate that removing daily rotation ''ears'' and zero-offset signals will greatly attenuate east-west global velocities of longitudinal wavenumber m <= 5. In addition we show that, because global velocity patterns are expected on theoretical grounds to have variable phase speeds in longitude, the construction of synoptic maps can severely attenuate high wavenumbers. The combination of these two effects can easily reduce an original periodic east-west flow velocity of peak amplitude 100 m s^-1 to 10 m s^-1 or less for any wavenumber. We demonstrate further that a velocity spectrum, obtained from a nonlinear spherical convection model for a case with a differential rotation similar in amplitude and profile to the Sun's, is attenuated to rms residual velocities close to or within the upper limits obtained by Howard and LaBonte. However, somewhat more power than they find is retained in variations of the daily rotation rate.

  6. Analysis of generalized Schwarz alternating procedure for domain decomposition

    SciTech Connect

    Engquist, B.; Zhao, Hongkai

    1996-12-31

    The Schwarz alternating method (SAM) is the theoretical basis for domain decomposition, which is itself a powerful tool both for parallel computation and for computing in complicated domains. The convergence rate of the classical SAM is very sensitive to the size of the overlap between subdomains, which is not desirable for most applications. We propose a generalized SAM procedure which is an extension of the modified SAM proposed by P.-L. Lions. Instead of using only Dirichlet data at the artificial boundary between subdomains, we take a convex combination of u and ∂u/∂n, i.e., ∂u/∂n + Λu, where Λ is some "positive" operator. Convergence of the modified SAM without overlapping in a quite general setting has been proven by P.-L. Lions using delicate energy estimates. Important questions remain for the generalized SAM. (1) What is the most essential mechanism for convergence without overlapping? (2) Given the partial differential equation, what is the best choice for the positive operator Λ? (3) In the overlapping case, is the generalized SAM superior to the classical SAM? (4) What is the convergence rate and what does it depend on? (5) Numerically, can we obtain an easy-to-implement operator Λ such that the convergence is independent of the mesh size? To analyze the convergence of the generalized SAM we focus, for simplicity, on the Poisson equation for two typical geometries in the two-subdomain case.
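
    To make the alternating iteration concrete, the sketch below implements the classical overlapping SAM with Dirichlet transmission data for the one-dimensional Poisson problem -u'' = 1 on [0, 1] with homogeneous boundary conditions; it does not implement the Robin-type operator Λ discussed above, and the grid size, overlap, and iteration count are arbitrary choices for the illustration.

      # Classical overlapping Schwarz iteration for -u'' = 1 on [0, 1], u(0) = u(1) = 0.
      import numpy as np

      def solve_dirichlet(f_vals, h, left, right):
          """Solve -u'' = f on a uniform grid given the two endpoint values."""
          n = len(f_vals)                                   # number of interior unknowns
          A = (np.diag(np.full(n, 2.0)) - np.diag(np.ones(n - 1), 1)
               - np.diag(np.ones(n - 1), -1)) / h**2
          b = f_vals.copy()
          b[0] += left / h**2
          b[-1] += right / h**2
          return np.linalg.solve(A, b)

      def alternating_schwarz(n=101, overlap=20, iters=20):
          x = np.linspace(0.0, 1.0, n); h = x[1] - x[0]
          f = np.ones(n)
          u = np.zeros(n)                                   # current global iterate
          i1 = n // 2 + overlap // 2                        # right edge of subdomain 1
          i2 = n // 2 - overlap // 2                        # left edge of subdomain 2
          for _ in range(iters):
              u[1:i1] = solve_dirichlet(f[1:i1], h, left=0.0, right=u[i1])
              u[i2 + 1:n - 1] = solve_dirichlet(f[i2 + 1:n - 1], h, left=u[i2], right=0.0)
          exact = 0.5 * x * (1.0 - x)
          return float(np.max(np.abs(u - exact)))

      print("max error after 20 Schwarz sweeps:", alternating_schwarz())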

  7. Procedures for analysis of debris relative to Space Shuttle systems

    NASA Technical Reports Server (NTRS)

    Kim, Hae Soo; Cummings, Virginia J.

    1993-01-01

    Debris samples collected from various Space Shuttle systems have been submitted to the Microchemical Analysis Branch. This investigation was initiated to develop optimal techniques for the analysis of debris. Optical microscopy provides information about the morphology and size of crystallites, particle sizes, amorphous phases, glass phases, and poorly crystallized materials. Scanning electron microscopy with energy dispersive spectrometry is utilized for information on surface morphology and qualitative elemental content of debris. Analytical electron microscopy with wavelength dispersive spectrometry provides information on the quantitative elemental content of debris.

  8. Procedure for recording the simultaneous activity of single neurons distributed across cortical areas during sensory discrimination

    PubMed Central

    Hernández, Adrián; Nácher, Verónica; Luna, Rogelio; Alvarez, Manuel; Zainos, Antonio; Cordero, Silvia; Camarillo, Liliana; Vázquez, Yuriria; Lemus, Luis; Romo, Ranulfo

    2008-01-01

    We report a procedure for recording the simultaneous activity of single neurons distributed across five cortical areas in behaving monkeys. The procedure consists of a commercially available microdrive adapted to a commercially available neural data collection system. The critical advantage of this procedure is that, in each cortical area, a configuration of seven microelectrodes spaced 250–500 μm can be inserted transdurally and each can be moved independently in the z axis. For each microelectrode, the data collection system can record the activity of up to five neurons together with the local field potential (LFP). With this procedure, we normally monitor the simultaneous activity of 70–100 neurons while trained monkeys discriminate the difference in frequency between two vibrotactile stimuli. Approximately 20–60 of these neurons have response properties previously reported in this task. The neuronal recordings show good signal-to-noise ratio, are remarkably stable along a 1-day session, and allow testing several protocols. Microelectrodes are removed from the brain after a 1-day recording session, but are reinserted again the next day by using the same or different x-y microelectrode array configurations. The fact that microelectrodes can be moved in the z axis during the recording session and that the x-y configuration can be changed from day to day maximizes the probability of studying simultaneous interactions, both local and across distant cortical areas, between neurons associated with the different components of this task. PMID:18946031

  9. Radiological survey activities: uranium mill tailings remedial action project procedures manual

    SciTech Connect

    Little, C.A.; Berven, B.A.; Carter, T.E.; Espegren, M.L.; O'Donnell, F.R.; Ramos, S.J.; Retolaza, C.D.; Rood, A.S.; Santos, F.A.; Witt, D.A.

    1986-07-01

    The US Department of Energy (DOE) was assigned the responsibility for conducting remedial action at 24 sites, which are located in one eastern and nine western states. The DOE's responsibilities are being met through its Uranium Mill Tailings Remedial Action Project Office (UMTRA-PO) in Albuquerque, New Mexico. The purpose of this Procedures Manual is to provide a standardized set of procedures that document in an auditable manner the activities performed by the Radiological Survey Activities (RASA) group in the Dosimetry and Biophysical Transport Section (DABTS) of the Health and Safety Research Division (HASRD) at the Oak Ridge National Laboratory (ORNL), in its role as the Inclusion Survey Contractor (ISC). Members of the RASA group assigned to the UMTRA Project are headquartered in the ORNL/RASA office in Grand Junction, Colorado, and report to the ORNL/RASA Project Manager. The Procedures Manual ensures that the organizational, administrative, and technical activities of the RASA/UMTRA group conform properly to those of the ISC as described in the Vicinity Properties Management and Implementation Manual and the Summary Protocol. This manual also ensures that the techniques and procedures used by the RASA/UMTRA group and contractor personnel meet the requirements of applicable governmental, scientific, and industrial standards.

  10. PROC LCA: A SAS Procedure for Latent Class Analysis

    ERIC Educational Resources Information Center

    Lanza, Stephanie T.; Collins, Linda M.; Lemmon, David R.; Schafer, Joseph L.

    2007-01-01

    Latent class analysis (LCA) is a statistical method used to identify a set of discrete, mutually exclusive latent classes of individuals based on their responses to a set of observed categorical variables. In multiple-group LCA, both the measurement part and structural part of the model can vary across groups, and measurement invariance across…

  11. Conducting On-Farm Animal Research: Procedures & Economic Analysis.

    ERIC Educational Resources Information Center

    Amir, Pervaiz; Knipscheer, Hendrik C.

    This book is intended to give animal scientists elementary tools to perform on-farm livestock analysis and to provide crop-oriented farming systems researchers with methods for conducting animal research. Chapter 1 describes farming systems research as a systems approach to on-farm animal research. Chapter 2 outlines some important…

  12. HYDUR Hydropower Analysis Using Streamflow Duration Procedures. Users Manual

    DTIC Science & Technology

    1982-09-01

    Table-of-contents fragments: dependable capacity and annual firm energy; interruptible capacity and energy; potential energy losses; calculation of power benefits; FERC regions for capacity and energy benefits; cost estimate form. The program uses a flow-duration relationship to determine energy; a wide range of options based on the flow-duration concept are available for analysis. The options include

  13. DoD Cost Analysis Guidance and Procedures

    DTIC Science & Technology

    1992-12-01

    Table and text fragments: Cost Analysis Improvement Group (CAIG) timetable; Defense Acquisition Program life-cycle cost ... relationship to other systems. 1.1.3 System Configuration: this section identifies the equipment (hardware and software) work breakdown structure (WBS) for ... furnished commercial off-the-shelf (COTS) software should be addressed in the discussion. Where Government-furnished equipment or ... is

  14. Blind Source Separation in CTBTO Expert Technical Analysis Procedures

    NASA Astrophysics Data System (ADS)

    Rozhkov, M.; Kitov, I.

    2014-12-01

    Blind Source Separation (BSS) is a widely used technique in many branches of data processing, but not so far in CTBT-related applications. BSS methods are attractive because they use minimal a priori information about the signals they are dealing with. Homomorphic deconvolution and cepstral smoothing, not widely used in CTBT, are probably the only methods that can be attributed to this technique. However, Expert Technical Analysis (ETA) in CTBTO may face problems which cannot be resolved with only certified CTBTO applications and may demand specific techniques not presently used in practice. The case which has to be considered within the ETA framework is the unambiguous separation of signals with close arrival times. There are two scenarios of interest: (1) separation of two almost co-located explosions conducted within fractions of a second, and (2) extraction of explosion signals merged with wave trains from a strong earthquake. The importance of resolving the problem related to case 1 is obvious, since it is connected with correct explosion yield estimation. Case 2 is a well-known scenario for conducting clandestine nuclear tests. These cases can hardly be resolved with the conventional methods implemented at the International Data Centre, especially if the signals have close slowness and azimuth. The approach we use here is to apply one of the blind source separation methods, Independent Component Analysis, which assumes non-Gaussianity of the processes underlying the signal mixture. We have tested the technique with synthetic data and Monte Carlo modelling, and with data from three DPRK tests and mining explosions conducted in Central Russia. The data were recorded by the International Monitoring System of CTBTO and by the small-aperture seismic array Mikhnevo (MHVAR) operated by the Institute of Dynamics of Geospheres, Russian Academy of Sciences. The approach demonstrated good ability to separate sources conducted practically simultaneously and/or having close
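
    The sketch below illustrates the ICA step on synthetic data, not IMS or MHVAR records: two overlapping transient sources mixed onto three channels are separated with scikit-learn's FastICA; the waveforms and the mixing matrix are assumptions.

    # Separating two overlapping transient signals mixed on several channels with
    # Independent Component Analysis (synthetic illustration of the BSS idea above).
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 10.0, 2000)

    # Two overlapping "arrivals": a short explosion-like pulse and a longer wave train
    s1 = np.exp(-((t - 4.0) / 0.15)**2) * np.sin(2 * np.pi * 8.0 * t)
    s2 = np.exp(-((t - 4.3) / 0.80)**2) * np.sin(2 * np.pi * 2.0 * t)
    S = np.c_[s1, s2]

    A = np.array([[1.0, 0.6],     # assumed mixing: each channel records both sources
                  [0.4, 1.0],
                  [0.8, 0.9]])
    X = S @ A.T + 0.02 * rng.standard_normal((t.size, 3))

    ica = FastICA(n_components=2, random_state=0)
    S_est = ica.fit_transform(X)   # columns are the estimated independent components

    # Correlate estimates with the true sources (sign/order of ICA output is arbitrary)
    for j in range(2):
        c = [abs(np.corrcoef(S_est[:, j], S[:, k])[0, 1]) for k in range(2)]
        print(f"component {j}: best match source {int(np.argmax(c))}, |corr| = {max(c):.2f}")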

  15. Radiological health risks to astronauts from space activities and medical procedures

    NASA Technical Reports Server (NTRS)

    Peterson, Leif E.; Nachtwey, D. Stuart

    1990-01-01

    Radiation protection standards for space activities differ substantially from those applied to terrestrial working situations. The levels of radiation and subsequent hazards to which space workers are exposed are quite unlike anything found on Earth. The new more highly refined system of risk management involves assessing the risk to each space worker from all sources of radiation (occupational and non-occupational) at the organ level. The risk coefficients were applied to previous space and medical exposures (diagnostic x ray and nuclear medicine procedures) in order to estimate the radiation-induced lifetime cancer incidence and mortality risk. At present, the risk from medical procedures when compared to space activities is 14 times higher for cancer incidence and 13 times higher for cancer mortality; however, this will change as the per capita dose during Space Station Freedom and interplanetary missions increases and more is known about the risks from exposure to high-LET radiation.

  16. Radiological health risks to astronauts from space activities and medical procedures

    SciTech Connect

    Paterson, L.E.; Nachtwey, D.S.

    1990-08-01

    Radiation protection standards for space activities differ substantially from those applied to terrestrial working situations. The levels of radiation and subsequent hazards to which space workers are exposed are quite unlike anything found on Earth. The new more highly refined system of risk management involves assessing the risk to each space worker from all sources of radiation (occupational and non-occupational) at the organ level. The risk coefficients were applied to previous space and medical exposures (diagnostic x ray and nuclear medicine procedures) in order to estimate the radiation-induced lifetime cancer incidence and mortality risk. At present, the risk from medical procedures when compared to space activities is 14 times higher for cancer incidence and 13 times higher for cancer mortality; however, this will change as the per capita dose during Space Station Freedom and interplanetary missions increases and more is known about the risks from exposure to high-LET radiation.

  17. Documentation for a Structural Optimization Procedure Developed Using the Engineering Analysis Language (EAL)

    NASA Technical Reports Server (NTRS)

    Martin, Carl J., Jr.

    1996-01-01

    This report describes a structural optimization procedure developed for use with the Engineering Analysis Language (EAL) finite element analysis system. The procedure is written primarily in the EAL command language. Three external processors, which are written in FORTRAN, generate equivalent stiffnesses and evaluate stress and local buckling constraints for the sections. Several built-up structural sections were coded into the design procedures. These structural sections were selected for use in aircraft design, but are suitable for other applications. Sensitivity calculations use the semi-analytic method, and an extensive effort has been made to increase the execution speed and reduce the storage requirements. There is also an approximate sensitivity update method included which can significantly reduce computational time. The optimization is performed by an implementation of the MINOS V5.4 linear programming routine in a sequential linear programming procedure.

  18. CONSIDERATIONS FOR THE TREATMENT OF COMPUTERIZED PROCEDURES IN HUMAN RELIABILITY ANALYSIS

    SciTech Connect

    Ronald L. Boring; David I. Gertman

    2012-07-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  19. HUMAN RELIABILITY ANALYSIS FOR COMPUTERIZED PROCEDURES, PART TWO: APPLICABILITY OF CURRENT METHODS

    SciTech Connect

    Ronald L. Boring; David I. Gertman

    2012-10-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no U.S. nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  20. An analysis of tolerance levels in IMRT quality assurance procedures

    SciTech Connect

    Basran, Parminder S.; Woo, Milton K.

    2008-06-15

    Increased use of intensity modulated radiation therapy (IMRT) has resulted in increased efforts in patient quality assurance (QA). Software and detector systems intended to streamline the IMRT quality assurance process often report metrics, such as percent discrepancies between measured and computed doses, which can be compared to benchmark or threshold values. The purpose of this work is to examine the relationships between two different types of IMRT QA processes in order to define, or refine, appropriate tolerance values. For 115 IMRT plans delivered in a 3 month period, we examine the discrepancies between (a) the treatment planning system (TPS) and results from a commercial independent monitor unit (MU) calculation program; (b) the TPS and results from a commercial diode-array measurement system; and (c) the independent MU calculation and the diode-array measurements. Statistical tests were performed to assess significance in the IMRT QA results for different disease sites and machine models. There is no evidence that the average total dose discrepancy in the monitor unit calculation depends on the disease site. Second, the discrepancies in the two IMRT QA methods are independent: there is no evidence that a better (or worse) monitor unit validation result is related to a better (or worse) diode-array measurement result. Third, there is marginal benefit in repeating the independent MU calculation with a more suitable dose point if the initial IMRT QA failed a certain tolerance. Based on these findings, the authors arrive at acceptable tolerances based on disease site and IMRT QA method. Specifically, monitor unit validations are expected to have a total dose discrepancy of 3% overall, and 5% per beam, independent of disease site. Diode array measurements are expected to have a total absolute dose discrepancy of 3% overall, and 3% per beam, independent of disease site. The percent of pixels exceeding a 3% and 3 mm threshold in a gamma analysis should be
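
    A sketch of how such tolerances could be applied in routine QA is given below with made-up plan data; the 5% gamma-fail action level is an assumption, since the stated gamma criterion is truncated in the record.

    # Applying tolerance checks of the kind discussed above: total and per-beam dose
    # discrepancies for the independent MU calculation and the diode-array measurement,
    # plus a gamma pass-rate criterion. All numbers are hypothetical.
    def pct_diff(measured, reference):
        """Signed percent discrepancy of a measured dose against the TPS reference."""
        return 100.0 * (measured - reference) / reference

    tps_beams  = [52.1, 48.7, 61.0, 55.4]      # TPS per-beam doses (cGy), assumed
    mu_calc    = [53.0, 47.5, 62.8, 54.9]      # independent MU calculation, assumed
    diode_meas = [51.2, 49.6, 60.1, 56.3]      # diode-array measurement, assumed
    gamma_fail = 3.8                           # percent of pixels failing 3%/3 mm gamma

    checks = [("MU calc, total", pct_diff(sum(mu_calc), sum(tps_beams)), 3.0),
              ("diode, total",   pct_diff(sum(diode_meas), sum(tps_beams)), 3.0)]
    checks += [(f"MU calc, beam {i+1}", pct_diff(m, t), 5.0)
               for i, (m, t) in enumerate(zip(mu_calc, tps_beams))]
    checks += [(f"diode, beam {i+1}", pct_diff(d, t), 3.0)
               for i, (d, t) in enumerate(zip(diode_meas, tps_beams))]
    checks.append(("gamma fail rate (3%/3 mm)", gamma_fail, 5.0))   # assumed action level

    for name, value, tol in checks:
        status = "PASS" if abs(value) <= tol else "REVIEW"
        print(f"{name:28s} {value:6.2f}%  (tolerance {tol:.0f}%)  {status}")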

  1. Multiple Group Testing Procedures for Analysis of High-Dimensional Genomic Data

    PubMed Central

    Ko, Hyoseok; Kim, Kipoong

    2016-01-01

    In genetic association studies with high-dimensional genomic data, multiple group testing procedures are often required in order to identify disease/trait-related genes or genetic regions, where multiple genetic sites or variants are located within the same gene or genetic region. However, statistical testing procedures based on an individual test suffer from multiple testing issues such as the control of family-wise error rate and dependent tests. Moreover, detecting only a few genes associated with a phenotype outcome among tens of thousands of genes is of main interest in genetic association studies. For this reason, regularization procedures, where a phenotype outcome regresses on all genomic markers and then regression coefficients are estimated based on a penalized likelihood, have been considered as a good alternative approach to analysis of high-dimensional genomic data. But selection performance of regularization procedures has rarely been compared with that of statistical group testing procedures. In this article, we performed extensive simulation studies where commonly used group testing procedures such as principal component analysis, Hotelling's T2 test, and permutation test are compared with group lasso (least absolute shrinkage and selection operator) in terms of true positive selection. Also, we applied all methods considered in the simulation studies to identify genes associated with ovarian cancer from over 20,000 genetic sites generated from the Illumina Infinium HumanMethylation27K Beadchip. We found a large discrepancy between the genes selected by the multiple group testing procedures and those selected by group lasso. PMID:28154510
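
    The sketch below illustrates one of the group testing procedures compared in the study, a permutation test for a gene-level association, on simulated data; the group statistic (sum of squared marginal correlations) and the sample sizes are assumptions, and group lasso itself is not reproduced here.

    # Permutation test for association between a phenotype and a group of markers
    # (e.g., methylation sites within one gene), on simulated data.
    import numpy as np

    rng = np.random.default_rng(1)
    n, p = 200, 8                                   # samples, markers in the gene
    X = rng.standard_normal((n, p))
    y = 0.4 * X[:, 0] - 0.3 * X[:, 2] + rng.standard_normal(n)   # two truly associated sites

    def group_stat(X, y):
        Xc = (X - X.mean(0)) / X.std(0)
        yc = (y - y.mean()) / y.std()
        r = Xc.T @ yc / len(y)                      # marginal correlations
        return np.sum(r ** 2)

    obs = group_stat(X, y)
    null = np.array([group_stat(X, rng.permutation(y)) for _ in range(2000)])
    p_value = (1 + np.sum(null >= obs)) / (len(null) + 1)
    print(f"group statistic = {obs:.3f}, permutation p-value = {p_value:.4f}")
    # Family-wise error control across many genes (e.g., Bonferroni) would then be
    # applied to the per-gene p-values.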

  2. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... public. A launch operator must include the toxic release hazard analysis results in the ground safety... release scenario: (i) Chemical name; (ii) Physical state; (iii) Basis of results (provide model name if... Analysis and Operational Procedures I Appendix I to Part 417 Aeronautics and Space COMMERCIAL...

  3. Substructure procedure for including tile flexibility in stress analysis of shuttle thermal protection system

    NASA Technical Reports Server (NTRS)

    Giles, G. L.

    1980-01-01

    A substructure procedure to include the flexibility of the tile in the stress analysis of the shuttle thermal protection system (TPS) is described. In this procedure, the TPS is divided into substructures of (1) the tile which is modeled by linear finite elements and (2) the SIP which is modeled as a nonlinear continuum. This procedure was applied for loading cases of uniform pressure, uniform moment, and an aerodynamic shock on various tile thicknesses. The ratios of through-the-thickness stresses in the SIP which were calculated using a flexible tile compared to using a rigid tile were found to be less than 1.05 for the cases considered.

  4. Existing Resources, Standards, and Procedures for Precise Monitoring and Analysis of Structural Deformations. Volume 2. Appendices

    DTIC Science & Technology

    1992-09-01

    Report documentation page fragments: technical report, May - September 1992; funding number DAAL03-91-...; performing organization address: Fredericton, N.B., E3B 5A3, Canada; sponsoring/monitoring agency: U.S. Army Topographic ...

  5. 75 FR 70664 - Guidelines Establishing Test Procedures for the Analysis of Pollutants Under the Clean Water Act...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-18

    ... AGENCY 40 CFR Parts 136, 260, 423, 430, and 435 Guidelines Establishing Test Procedures for the Analysis of Pollutants Under the Clean Water Act; Analysis and Sampling Procedures; Extension of Comment... community and laboratories in their selection of analytical methods (test procedures) for use in Clean...

  6. Activity based costing of diagnostic procedures at a nuclear medicine center of a tertiary care hospital

    PubMed Central

    Hada, Mahesh Singh; Chakravarty, Abhijit; Mukherjee, Partha

    2014-01-01

    Context: Escalating health care expenses pose a new challenge to the health care environment, which must become more cost-effective. There is an urgent need for more accurate data on the costs of health care procedures. Demographic changes, a changing morbidity profile, and the rising impact of noncommunicable diseases are emphasizing the role of nuclear medicine (NM) in the future health care environment. However, the impact of emerging disease load and stagnant resource availability needs to be balanced by a strategic drive towards optimal utilization of available healthcare resources. Aim: The aim was to ascertain the cost of diagnostic procedures conducted at the NM Department of a tertiary health care facility by employing the activity-based costing (ABC) method. Materials and Methods: A descriptive cross-sectional study was carried out over a period of 1 year. ABC methodology was utilized for ascertaining the unit cost of different diagnostic procedures, and such costs were compared with prevalent market rates for estimating the cost effectiveness of the department being studied. Results: The cost per procedure varied from Rs. 869 (USD 14.48) for a thyroid scan to Rs. 11230 (USD 187.16) for a meta-iodo-benzyl-guanidine (MIBG) scan, the most cost-effective investigations being the stress thallium, technetium-99m myocardial perfusion imaging (MPI) and MIBG scans. The costs obtained from this study were observed to be competitive when compared to prevalent market rates. Conclusion: ABC methodology provides precise costing inputs and should be used for all future costing studies in NM Departments. PMID:25400363
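
    The arithmetic behind an ABC unit cost is sketched below with entirely hypothetical activity cost pools and procedure volumes; the actual cost drivers and allocations used in the study are not reproduced.

    # Activity-based costing sketch: resource costs are traced to activities via cost
    # drivers, and the unit cost is total activity cost divided by annual procedure volume.
    annual_procedures = 1200                      # e.g., thyroid scans per year (assumed)

    activity_costs = {                            # assumed driver-based allocations (Rs./year)
        "radiopharmaceutical and consumables": 480_000,
        "equipment depreciation and maintenance": 260_000,
        "staff time (technologist, physician, nursing)": 220_000,
        "facility overhead allocated to the procedure": 90_000,
    }

    unit_cost = sum(activity_costs.values()) / annual_procedures
    print(f"Cost per procedure: Rs. {unit_cost:,.0f}")
    for activity, cost in activity_costs.items():
        print(f"  {activity:45s} Rs. {cost / annual_procedures:7.0f} per procedure")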

  7. An analytical derivative procedure for the calculation of vibrational Raman optical activity spectra

    NASA Astrophysics Data System (ADS)

    Liégeois, Vincent; Ruud, Kenneth; Champagne, Benoît

    2007-11-01

    We present an analytical time-dependent Hartree-Fock algorithm for the calculation of the derivatives of the electric dipole-magnetic dipole polarizability with respect to atomic Cartesian coordinates. Combined with analogous procedures to determine the derivatives of the electric dipole-electric dipole and electric dipole-electric quadrupole polarizabilities, it enables a fully analytical evaluation of the three frequency-dependent vibrational Raman optical activity (VROA) invariants within the harmonic approximation. The procedure employs traditional non-London atomic orbitals, and the gauge-origin dependence of the VROA intensities has, therefore, been assessed for the commonly used aug-cc-pVDZ and rDPS:3-21G basis sets.

  8. Roughness Analysis on Composite Materials (Microfilled, Nanofilled and Silorane) After Different Finishing and Polishing Procedures

    PubMed Central

    Pettini, Francesco; Corsalini, Massimo; Savino, Maria Grazia; Stefanachi, Gianluca; Venere, Daniela Di; Pappalettere, Carmine; Monno, Giuseppe; Boccaccio, Antonio

    2015-01-01

    The finishing and polishing of composite materials affect the restoration lifespan. The market offers a variety of finishing and polishing procedures, and the choice among them is conditioned by different factors such as the resulting surface roughness. In the present study, 156 samples were prepared from three composite materials (microfilled, nanofilled, and silorane) and treated with different finishing and polishing procedures. Profilometric analyses were carried out on the samples' surfaces, and the measured roughness values were submitted to statistical analysis. A complete factorial plan was drawn up, and two-way analysis of variance (ANOVA) was carried out to investigate whether the following factors affect the roughness values: (i) material; (ii) polishing/finishing procedure. A Tukey post-hoc test was also conducted to evaluate any statistically significant differences between the material/procedure combinations. The results show that the tested materials do not affect the resulting surface quality, but roughness values depend on the finishing/polishing procedure adopted. The procedures that involve (a) finishing with medium Sof-Lex discs and (b) finishing with two tungsten carbide multi-blade milling cutters (Q series and UF series) allow the lowest roughness values to be obtained. PMID:26734113
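
    The statistical design can be sketched as follows with simulated roughness data (not the study's measurements), using statsmodels for the two-way ANOVA and Tukey's HSD; the assumed cell means and standard deviation are illustrative only.

    # Two-way ANOVA (material x finishing/polishing procedure) and Tukey HSD on
    # simulated Ra values for a 3 x 4 factorial design with 13 samples per cell.
    import numpy as np
    import pandas as pd
    from statsmodels.formula.api import ols
    from statsmodels.stats.anova import anova_lm
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    rng = np.random.default_rng(0)
    materials = ["microfilled", "nanofilled", "silorane"]
    procedures = ["SofLex_medium", "carbide_Q_UF", "diamond_bur", "one_step_polisher"]
    proc_effect = {"SofLex_medium": 0.15, "carbide_Q_UF": 0.18,
                   "diamond_bur": 0.45, "one_step_polisher": 0.30}   # assumed Ra means (um)

    rows = []
    for mat in materials:
        for proc in procedures:
            for _ in range(13):                                       # 13 samples per cell
                rows.append({"material": mat, "procedure": proc,
                             "Ra": proc_effect[proc] + 0.05 * rng.standard_normal()})
    data = pd.DataFrame(rows)

    model = ols("Ra ~ C(material) * C(procedure)", data=data).fit()
    print(anova_lm(model, typ=2))                 # tests for material, procedure, interaction

    tukey = pairwise_tukeyhsd(data["Ra"], data["material"] + "/" + data["procedure"])
    print(tukey.summary())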

  9. Weather analysis and interpretation procedures developed for the US/Canada wheat and barley exploratory experiment

    NASA Technical Reports Server (NTRS)

    Trenchard, M. H. (Principal Investigator)

    1980-01-01

    Procedures and techniques for providing analyses of meteorological conditions at segments during the growing season were developed for the U.S./Canada Wheat and Barley Exploratory Experiment. The main product and analysis tool is the segment-level climagraph, which depicts meteorological variables over time for the current year compared with climatological normals. The variable values for the segment are estimates derived through objective analysis of values obtained at first-order stations in the region. The procedures and products documented represent a baseline for future Foreign Commodity Production Forecasting experiments.

  10. Effectiveness of internet-based affect induction procedures: A systematic review and meta-analysis.

    PubMed

    Ferrer, Rebecca A; Grenen, Emily G; Taber, Jennifer M

    2015-12-01

    Procedures used to induce affect in a laboratory are effective and well-validated. Given recent methodological and technological advances in Internet research, it is important to determine whether affect can be effectively induced using Internet methodology. We conducted a meta-analysis and systematic review of prior research that has used Internet-based affect induction procedures, and examined potential moderators of the effectiveness of affect induction procedures. Twenty-six studies were included in final analyses, with 89 independent effect sizes. Affect induction procedures effectively induced general positive affect, general negative affect, fear, disgust, anger, sadness, and guilt, but did not significantly induce happiness. Contamination of other nontarget affect did not appear to be a major concern. Video inductions resulted in greater effect sizes. Overall, results indicate that affect can be effectively induced in Internet studies, suggesting an important venue for the acceleration of affective science.

  11. A facile reflux procedure to increase active surface sites form highly active and durable supported palladium@platinum bimetallic nanodendrites

    NASA Astrophysics Data System (ADS)

    Wang, Qin; Li, Yingjun; Liu, Baocang; Xu, Guangran; Zhang, Geng; Zhao, Qi; Zhang, Jun

    2015-11-01

    A series of well-dispersed bimetallic Pd@Pt nanodendrites uniformly supported on XC-72 carbon black are fabricated by using different capping agents. These capping agents are essential for the branched morphology control. However, the surfactant adsorbed on the nanodendrites surface blocks the access of reactant molecules to the active surface sites, and the catalytic activities of these bimetallic nanodendrites are significantly restricted. Herein, a facile reflux procedure to effectively remove the capping agent molecules without significantly affecting their sizes is reported for activating supported nanocatalysts. More significantly, the structure and morphology of the nanodendrites can also be retained, enhancing the numbers of active surface sites, catalytic activity and stability toward methanol and ethanol electro-oxidation reactions. The as-obtained hot water reflux-treated Pd@Pt/C catalyst manifests superior catalytic activity and stability both in terms of surface and mass specific activities, as compared to the untreated catalysts and the commercial Pt/C and Pd/C catalysts. We anticipate that this effective and facile removal method has more general applicability to highly active nanocatalysts prepared with various surfactants, and should lead to improvements in environmental protection and energy production.

  12. Alternative complement pathway activation during invasive coronary procedures in acute myocardial infarction and stable angina pectoris.

    PubMed

    Horváth, Zsófia; Csuka, Dorottya; Vargova, Katarina; Kovács, Andrea; Leé, Sarolta; Varga, Lilian; Préda, István; Tóth Zsámboki, Emese; Prohászka, Zoltán; Kiss, Róbert Gábor

    2016-12-01

    The effect of invasive percutaneous coronary procedures on complement activation has not been elucidated. We enrolled stable angina patients with elective percutaneous coronary intervention (SA-PCI, n=24), diagnostic coronary angiography (CA, n=52) and 23 patients with ST segment elevation myocardial infarction and primary PCI (STEMI-PCI). Complement activation products (C1rC1sC1inh, C3bBbP and SC5b-9) were measured on admission, 6 and 24h after coronary procedures. The alternative pathway product, C3bBbP significantly and reversibly increased 6h after elective PCI (baseline: 7.81AU/ml, 6h: 16.09AU/ml, 24h: 4.27AU/ml, p<0.01, n=23) and diagnostic angiography (baseline: 6.13AU/ml, 6h: 12.08AU/ml, 24h: 5.4AU/ml, p<0.01, n=52). Six hour C3bBbP values correlated with post-procedural CK, creatinine level and the applied contrast material volume (r=0.41, r=0.4, r=0.3, p<0.05, respectively). In STEMI-PCI, baseline C3bBbP level was higher, compared to SA-PCI or CA patients (11.33AU/ml vs. 7.81AU/ml or 6.13AU/ml, p<0.001). Similarly, the terminal complex (SC5b-9) level was already elevated at baseline compared to SA-PCI group (3.49AU/ml vs. 1.87AU/ml, p=0.011). Complement pathway products did not increase further after primary PCI. Elective coronary procedures induced transient alternative complement pathway activation, influenced by the applied contrast volume. In STEMI, the alternative complement pathway is promptly activated during the atherothrombotic event and PCI itself had no further detectable effect.

  13. Procedures for minimizing the effects of high solar activity on satellite tracking and ephemeris generation

    NASA Technical Reports Server (NTRS)

    Bredvik, Gordon D.

    1990-01-01

    We are currently experiencing a period of high solar radiation combined with wide short-term fluctuations in the radiation. The short-term fluctuations, especially when combined with highly energetic solar flares, can adversely affect the mission of U.S. Space Command's Space Surveillance Center (SSC) which catalogs and tracks the satellites in orbit around the Earth. Rapidly increasing levels of solar electromagnetic and/or particle radiation (solar wind) causes atmospheric warming, which, in turn, causes the upper-most portions of the atmosphere to expand outward, into the regime of low altitude satellites. The increased drag on satellites from this expansion can cause large, unmodeled, in-track displacements, thus undermining the SSC's ability to track and predict satellite position. On 13 March 1989, high solar radiation levels, combined with a high-energy solar flare, caused an exceptional amount of short-term atmospheric warming. The SSC temporarily lost track of over 1300 low altitude satellites--nearly half of the low altitude satellite population. Observational data on satellites that became lost during the days following the 13 March 'solar event' was analyzed and compared with the satellites' last element set prior to the event (referred to as a geomagnetic storm because of the large increase in magnetic flux in the upper atmosphere). The analysis led to a set of procedures for reducing the impact of future geomagnetic storms. These procedures adjust selected software limit parameters in the differential correction of element sets and in the observation association process and must be manually initiated at the onset of a geomagnetic storm. Sensor tasking procedures must be adjusted to ensure that a minimum of four observations per day are received for low altitude satellites. These procedures have been implemented and, thus far, appear to be successful in minimizing the effect of subsequent geomagnetic storms on satellite tracking and ephemeris

  14. An Analysis of Public Art on University Campuses: Policies, Procedures, and Best Practices

    ERIC Educational Resources Information Center

    Grenier, Michael Robert

    2009-01-01

    This study investigated the policies, procedures, and practices of public art programs on the campuses of research institutions with very high activity as defined by the Carnegie Classification. From this particular type of institution, 55 of the 96 public art administrators provided their opinions, attitudes, and behaviors as part of the "Public…

  15. A Procedure for the Computerized Analysis of Cleft Palate Speech Transcription

    ERIC Educational Resources Information Center

    Fitzsimons, David A.; Jones, David L.; Barton, Belinda; North, Kathryn N.

    2012-01-01

    The phonetic symbols used by speech-language pathologists to transcribe speech contain underlying hexadecimal values used by computers to correctly display and process transcription data. This study aimed to develop a procedure to utilise these values as the basis for subsequent computerized analysis of cleft palate speech. A computer keyboard…
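
    The underlying idea can be sketched in a few lines: each phonetic symbol maps to a code value (shown here as Unicode code points in hexadecimal) that downstream analysis can tally; the transcription string is a made-up example, not study data.

    # Each phonetic symbol's underlying code value, shown as a Unicode code point in
    # hexadecimal; tallies over these values can feed later computerized analysis.
    from collections import Counter

    transcription = "pæləʔ"          # hypothetical narrow-transcription fragment

    for symbol in transcription:
        print(f"{symbol}\tU+{ord(symbol):04X}")

    print(Counter(f"U+{ord(s):04X}" for s in transcription))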

  16. ISAS: The Instructional Systems Analysis and Selection Procedures. Part I: Models.

    ERIC Educational Resources Information Center

    Epstein, Kenneth I.; Matlick, Richard K.

    Litton Industries has been investigating methods for analyzing training problems and designing appropriate systems of individualized instruction to address those problems. This work can be referred to as the development of the Instructional Systems Analysis and Selection (ISAS) procedures. ISAS is a collection of questions, comparison matrices,…

  17. Analysis of helicopter noise data using international helicopter noise certification procedures

    NASA Astrophysics Data System (ADS)

    Newman, J. S.; Rickley, E. J.; Levanduski, D. A.; Woolridge, S. B.

    1986-03-01

    The results of a Federal Aviation Administration (FAA) noise measurement flight test program involving seven helicopters are documented. Noise levels were established using the basic testing, reduction and analysis techniques specified by the International Civil Aviation Organization (ICAO) for helicopter noise certification, supplemented with some procedural refinements contained in ICAO Working Group II recommendations for incorporation into the standard.

  18. Classical Item Analysis Using Latent Variable Modeling: A Note on a Direct Evaluation Procedure

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2011-01-01

    A directly applicable latent variable modeling procedure for classical item analysis is outlined. The method allows one to point and interval estimate item difficulty, item correlations, and item-total correlations for composites consisting of categorical items. The approach is readily employed in empirical research and as a by-product permits…
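
    For orientation, the classical item-analysis quantities mentioned above can be computed directly from a binary response matrix as sketched below on simulated data; the latent-variable point and interval estimation described in the article is not reproduced here.

    # Classical item difficulty (proportion correct) and corrected item-total
    # correlations from a simulated 0/1 item-response matrix.
    import numpy as np

    rng = np.random.default_rng(2)
    n_persons, n_items = 500, 6
    ability = rng.standard_normal(n_persons)
    difficulty = np.linspace(-1.0, 1.0, n_items)
    prob = 1.0 / (1.0 + np.exp(-(ability[:, None] - difficulty[None, :])))
    responses = (rng.random((n_persons, n_items)) < prob).astype(int)

    p_values = responses.mean(axis=0)                 # classical item difficulty
    total = responses.sum(axis=1)
    item_total = [np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
                  for j in range(n_items)]            # corrected item-total correlation

    for j in range(n_items):
        print(f"item {j+1}: difficulty = {p_values[j]:.2f}, corrected item-total r = {item_total[j]:.2f}")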

  19. Alternative Methods for Calculating Intercoder Reliability in Content Analysis: Kappa, Weighted Kappa and Agreement Charts Procedures.

    ERIC Educational Resources Information Center

    Kang, Namjun

    If content analysis is to satisfy the requirement of objectivity, measures and procedures must be reliable. Reliability is usually measured by the proportion of agreement of all categories identically coded by different coders. For such data to be empirically meaningful, a high degree of inter-coder reliability must be demonstrated. Researchers in…
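
    A sketch of the two kappa-based indices, computed with scikit-learn for two hypothetical coders, is given below; the agreement-chart procedure is graphical and is not reproduced.

    # Percent agreement, Cohen's kappa, and weighted kappa for two hypothetical coders
    # assigning ordinal categories (1-3) to 15 content units.
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    coder_a = np.array([1, 2, 2, 3, 1, 2, 3, 3, 1, 2, 2, 1, 3, 2, 1])
    coder_b = np.array([1, 2, 3, 3, 1, 2, 3, 2, 1, 2, 2, 1, 3, 3, 1])

    agreement = np.mean(coder_a == coder_b)
    kappa = cohen_kappa_score(coder_a, coder_b)
    w_kappa = cohen_kappa_score(coder_a, coder_b, weights="quadratic")

    print(f"percent agreement: {agreement:.2f}")
    print(f"Cohen's kappa:     {kappa:.2f}")
    print(f"weighted kappa:    {w_kappa:.2f}")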

  20. Procedural and Conceptual Difficulties with Slope: An Analysis of Students' Mistakes on Routine Tasks

    ERIC Educational Resources Information Center

    Cho, Peter; Nagle, Courtney

    2017-01-01

    This study extends past research on students' understanding of slope by analyzing college students' mistakes on routine tasks involving slope. We conduct both quantitative and qualitative analysis of students' mistakes on common slope tasks to extract information regarding procedural proficiencies and conceptual underpinnings required in order for…

  1. Thermography used for analysis and comparison of different cataract surgery procedures based on phacoemulsification.

    PubMed

    Corvi, Andrea; Innocenti, Bernardo; Mencucci, Rita

    2006-04-01

    Thermography has been employed to analyze and compare three cataract surgery procedures performed in vivo with phacoemulsification, namely, the Sovereign phacoemulsification system with a traditional technique, the Sovereign WhiteStar phacoemulsification system with a traditional technique and the Sovereign WhiteStar phacoemulsification system with a bimanual technique. During the entire surgical procedure, the temperature of the ocular surface was monitored. The temperature values in the area where the phaco probe was inserted in the eye were measured, and the quantities of heat transmitted to the eye in the different procedures were assessed through suitable indices. In this study the highest temperature measured for each procedure during the surgical operation was 44.9 degrees C for the Sovereign phacoemulsification system with a traditional technique, 41 degrees C for the Sovereign WhiteStar phacoemulsification system with a traditional technique and 39.5 degrees C for the Sovereign WhiteStar phacoemulsification system with a bimanual technique, which is also the surgical procedure having the lowest thermal impact on the eye, i.e., the one in which the temperature peaks are lowest in amplitude and the least amount of heat is transmitted to the eye. Thermography, used in this study as a temperature monitoring instrument, has allowed analysis to be effected through a useful and advantageous methodology, totally non-invasive as regards both surgeon and patient, and has been applied in vivo without requiring any change in the surgical procedure.

  2. The Impact of Active Consent Procedures on Nonresponse and Nonresponse Error in Youth Survey Data: Evidence from a New Experiment

    ERIC Educational Resources Information Center

    Courser, Matthew W.; Shamblen, Stephen R.; Lavrakas, Paul J.; Collins, David; Ditterline, Paul

    2009-01-01

    This article reports results from a student survey fielded using an experimental design with 14 Kentucky school districts. Seven of the 14 districts were randomly assigned to implement the survey with active consent procedures; the other seven districts implemented the survey with passive consent procedures. We used our experimental design to…

  3. A novel procedure of quantitation of virus based on microflow cytometry analysis.

    PubMed

    Vazquez, Diego; López-Vázquez, Carmen; Cutrín, Juan Manuel; Dopazo, Carlos P

    2016-03-01

    The accurate and fast titration of viruses is a critical step in research laboratories and biotechnology industries. Different approaches are commonly applied which either are time consuming (like the plaque and endpoint dilution assays) or do not ensure quantification of only infective particles (like quantitative real-time PCR). In the last decade, a methodology based on the analysis of infected cells by flow cytometry and fluorescence-activated cell sorting (FACS) has been reported as a fast and reliable test for the titration of some viruses. However, this technology needs expensive equipment and expert technicians to operate it. Recently, the "lab on a chip" integrated devices have brought about the miniaturization of this equipment, turning this technology into an affordable and easy-to-use alternative to traditional flow cytometry. In the present study, we have designed a microflow cytometry (μFC) procedure for the quantitation of viruses, using the infectious pancreatic necrosis virus (IPNV) as a model. The optimization of conditions and validation of the method are reported here.

  4. Interdisciplinary analysis procedures in the modeling and control of large space-based structures

    NASA Technical Reports Server (NTRS)

    Cooper, Paul A.; Stockwell, Alan E.; Kim, Zeen C.

    1987-01-01

    The paper describes a computer software system called the Integrated Multidisciplinary Analysis Tool, IMAT, that has been developed at NASA Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven interactive executive program, IMAT links a relational database to commercial structural and controls analysis codes. The paper describes the procedures followed to analyze a complex satellite structure and control system. The codes used to accomplish the analysis are described, and an example is provided of an application of IMAT to the analysis of a reference space station subject to a rectangular pulse loading at its docking port.

  5. Prompt-Gamma Activation Analysis.

    PubMed

    Lindstrom, Richard M

    1993-01-01

    A permanent, full-time instrument for prompt-gamma activation analysis is nearing completion as part of the Cold Neutron Research Facility (CNRF). The design of the analytical system has been optimized for high gamma detection efficiency and low background, particularly for hydrogen. Because of the purity of the neutron beam, shielding requirements are modest and the scatter-capture background is low. As a result of a compact sample-detector geometry, the sensitivity (counting rate per gram of analyte) is a factor of four better than the existing Maryland-NIST thermal-neutron instrument at this reactor. Hydrogen backgrounds of a few micrograms have already been achieved, which promises to be of value in numerous applications where quantitative nondestructive analysis of small quantities of hydrogen in materials is necessary.

  6. The LET Procedure for Prosthetic Myocontrol: Towards Multi-DOF Control Using Single-DOF Activations

    PubMed Central

    Nowak, Markus; Castellini, Claudio

    2016-01-01

    Simultaneous and proportional myocontrol of dexterous hand prostheses is to a large extent still an open problem. With the advent of commercially and clinically available multi-fingered hand prostheses there are now more independent degrees of freedom (DOFs) in prostheses than can be effectively controlled using surface electromyography (sEMG), the current standard human-machine interface for hand amputees. In particular, it is uncertain, whether several DOFs can be controlled simultaneously and proportionally by exclusively calibrating the intended activation of single DOFs. The problem is currently solved by training on all required combinations. However, as the number of available DOFs grows, this approach becomes overly long and poses a high cognitive burden on the subject. In this paper we present a novel approach to overcome this problem. Multi-DOF activations are artificially modelled from single-DOF ones using a simple linear combination of sEMG signals, which are then added to the training set. This procedure, which we named LET (Linearly Enhanced Training), provides an augmented data set to any machine-learning-based intent detection system. In two experiments involving intact subjects, one offline and one online, we trained a standard machine learning approach using the full data set containing single- and multi-DOF activations as well as using the LET-augmented data set in order to evaluate the performance of the LET procedure. The results indicate that the machine trained on the latter data set obtains worse results in the offline experiment compared to the full data set. However, the online implementation enables the user to perform multi-DOF tasks with almost the same precision as single-DOF tasks without the need of explicitly training multi-DOF activations. Moreover, the parameters involved in the system are statistically uniform across subjects. PMID:27606674
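
    A schematic sketch of the LET idea on synthetic data is given below (not the paper's sEMG recordings or feature pipeline): multi-DOF training examples are synthesized as linear combinations of single-DOF samples and appended to the training set before fitting a standard regressor; the feature model, combination rule, and regressor choice are assumptions.

    # LET-style augmentation sketch: synthesize multi-DOF examples by summing pairs of
    # single-DOF sEMG feature vectors and their activation labels, then fit a regressor.
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(3)
    n_channels, n_dofs, n_reps = 8, 3, 40

    W = rng.random((n_dofs, n_channels))                     # assumed per-DOF activation patterns
    X_single, y_single = [], []
    for d in range(n_dofs):
        level = rng.random(n_reps)                           # proportional activation levels
        X_single.append(np.outer(level, W[d]) + 0.05 * rng.standard_normal((n_reps, n_channels)))
        y = np.zeros((n_reps, n_dofs))
        y[:, d] = level
        y_single.append(y)
    X_single, y_single = np.vstack(X_single), np.vstack(y_single)

    # Augmentation: pair up single-DOF samples and add their features and labels
    i, j = rng.integers(0, len(X_single), (2, 200))
    X_train = np.vstack([X_single, X_single[i] + X_single[j]])
    y_train = np.vstack([y_single, y_single[i] + y_single[j]])

    model = Ridge(alpha=1.0).fit(X_train, y_train)
    test = X_single[0] + X_single[n_reps]                    # a combined DOF-0 + DOF-1 input
    print("predicted activations:", model.predict(test[None, :]).round(2))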

  7. The LET Procedure for Prosthetic Myocontrol: Towards Multi-DOF Control Using Single-DOF Activations.

    PubMed

    Nowak, Markus; Castellini, Claudio

    2016-01-01

    Simultaneous and proportional myocontrol of dexterous hand prostheses is to a large extent still an open problem. With the advent of commercially and clinically available multi-fingered hand prostheses there are now more independent degrees of freedom (DOFs) in prostheses than can be effectively controlled using surface electromyography (sEMG), the current standard human-machine interface for hand amputees. In particular, it is uncertain, whether several DOFs can be controlled simultaneously and proportionally by exclusively calibrating the intended activation of single DOFs. The problem is currently solved by training on all required combinations. However, as the number of available DOFs grows, this approach becomes overly long and poses a high cognitive burden on the subject. In this paper we present a novel approach to overcome this problem. Multi-DOF activations are artificially modelled from single-DOF ones using a simple linear combination of sEMG signals, which are then added to the training set. This procedure, which we named LET (Linearly Enhanced Training), provides an augmented data set to any machine-learning-based intent detection system. In two experiments involving intact subjects, one offline and one online, we trained a standard machine learning approach using the full data set containing single- and multi-DOF activations as well as using the LET-augmented data set in order to evaluate the performance of the LET procedure. The results indicate that the machine trained on the latter data set obtains worse results in the offline experiment compared to the full data set. However, the online implementation enables the user to perform multi-DOF tasks with almost the same precision as single-DOF tasks without the need of explicitly training multi-DOF activations. Moreover, the parameters involved in the system are statistically uniform across subjects.

  8. Monte Carlo Analysis of Airport Throughput and Traffic Delays Using Self Separation Procedures

    NASA Technical Reports Server (NTRS)

    Consiglio, Maria C.; Sturdy, James L.

    2006-01-01

    This paper presents the results of three simulation studies of throughput and delay times of arrival and departure operations performed at non-towered, non-radar airports using self-separation procedures. The studies were conducted as part of the validation process of the Small Aircraft Transportation Systems Higher Volume Operations (SATS HVO) concept and include an analysis of the predicted airport capacity with different traffic conditions and system constraints under increasing levels of demand. Results show that SATS HVO procedures can dramatically increase capacity at non-towered, non-radar airports and that the concept offers the potential for increasing capacity of the overall air transportation system.

  9. Data and analysis procedures for improved aerial applications mission performance. [agricultural aircraft wing geometry

    NASA Technical Reports Server (NTRS)

    Holmes, B. J.; Morris, D. K.; Razak, K.

    1979-01-01

    An analysis procedure is given and cases analyzed for the effects of wing geometry on lateral transport of a variety of agricultural particles released in the wake of an agricultural airplane. The cases analyzed simulate the release of particles from a fuselage centerline-mounted dry material spreader; however, the procedure applies to particles released anywhere along the wing span. Consideration is given to the effects of taper ratio, aspect ratio, wing loading, and deflected flaps. It is noted that significant lateral transport of large particles can be achieved using high-lift devices positioned to create a strong vortex near the location of particle release.

  10. 40 CFR 260.41 - Procedures for case-by-case regulation of hazardous waste recycling activities.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of hazardous waste recycling activities. 260.41 Section 260.41 Protection of Environment... Rulemaking Petitions § 260.41 Procedures for case-by-case regulation of hazardous waste recycling activities... hazardous waste recycling activities described in § 261.6(a)(2)(iii) under the provisions of § 261.6 (b)...

  11. 40 CFR 260.41 - Procedures for case-by-case regulation of hazardous waste recycling activities.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... of hazardous waste recycling activities. 260.41 Section 260.41 Protection of Environment... Rulemaking Petitions § 260.41 Procedures for case-by-case regulation of hazardous waste recycling activities... hazardous waste recycling activities described in § 261.6(a)(2)(iii) under the provisions of § 261.6 (b)...

  12. 40 CFR 260.41 - Procedures for case-by-case regulation of hazardous waste recycling activities.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of hazardous waste recycling activities. 260.41 Section 260.41 Protection of Environment... Rulemaking Petitions § 260.41 Procedures for case-by-case regulation of hazardous waste recycling activities... hazardous waste recycling activities described in § 261.6(a)(2)(iii) under the provisions of § 261.6 (b)...

  13. 40 CFR 260.41 - Procedures for case-by-case regulation of hazardous waste recycling activities.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of hazardous waste recycling activities. 260.41 Section 260.41 Protection of Environment... Rulemaking Petitions § 260.41 Procedures for case-by-case regulation of hazardous waste recycling activities... hazardous waste recycling activities described in § 261.6(a)(2)(iii) under the provisions of § 261.6 (b)...

  14. 40 CFR 260.41 - Procedures for case-by-case regulation of hazardous waste recycling activities.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of hazardous waste recycling activities. 260.41 Section 260.41 Protection of Environment... Rulemaking Petitions § 260.41 Procedures for case-by-case regulation of hazardous waste recycling activities... hazardous waste recycling activities described in § 261.6(a)(2)(iii) under the provisions of § 261.6 (b)...

  15. Leave-one-out procedure in the validation of elimination rate constant analysis.

    PubMed

    Grabowski, T; Jaroszewski, J J; Sasinowska-Motyl, M

    2012-12-01

    Many registration agencies and other organizations define how to calculate the elimination rate constant (kel) value. No validation procedures have been introduced to verify the correct selection of the concentration-time (C-T) points used for the kel calculation. The purpose of this paper is to discover whether kel analysis can be subjected to a condensed validation procedure and what acceptance criteria should be adopted for such a procedure. For the analysis, data collected during bioequivalence studies of 4 drugs were selected, including 2 highly lipophilic drugs (itraconazole, atorvastatin) and 2 weakly lipophilic drugs (trimetazidine, perindopril). Pharmacokinetic calculations were performed with the use of WinNonlin Professional v 5.3. Internal validation of the kel analysis using leave-one-out cross-validation was performed. The present analysis proves that the C-T selection process for the kel calculations cannot be automated. In each of the analysed data series there were C-T sequences that did not meet even one of the validation criteria. This paper proposes 3 validation criteria which need to be met in order to confirm the optimal selection of C-T data to calculate kel: Q2 ≥ 0.6, R2 ≥ 0.85, and Q2 - R2 < 0.3 (where Q2 is the squared cross-validated correlation coefficient and R2 the coefficient of determination). Application of the validation procedure for the kel analysis under discussion proves the accuracy of the calculations, even if repeated kel analysis is based on a different sequence of points in the elimination phase.
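
    The proposed criteria can be checked for a given selection of terminal points as sketched below with simulated concentration-time data: kel is estimated by log-linear regression, and a leave-one-out loop yields Q2 alongside R2; the sampling times and concentrations are assumptions.

    # kel by log-linear regression over chosen terminal C-T points, with leave-one-out
    # Q2 and ordinary R2 checked against the proposed acceptance criteria.
    import numpy as np

    t = np.array([4.0, 6.0, 8.0, 12.0, 24.0])            # terminal sampling times (h), assumed
    c = np.array([3.1, 2.3, 1.8, 1.05, 0.22])            # concentrations, assumed units
    y = np.log(c)

    def fit_kel(t, y):
        slope, intercept = np.polyfit(t, y, 1)
        return -slope, intercept

    kel, b0 = fit_kel(t, y)
    y_hat = b0 - kel * t
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - np.sum((y - y_hat) ** 2) / ss_tot

    press = 0.0                                          # leave-one-out prediction error sum
    for i in range(len(t)):
        mask = np.arange(len(t)) != i
        k_i, b_i = fit_kel(t[mask], y[mask])
        press += (y[i] - (b_i - k_i * t[i])) ** 2
    q2 = 1.0 - press / ss_tot

    print(f"kel = {kel:.4f} 1/h, R2 = {r2:.3f}, Q2 = {q2:.3f}")
    # Difference criterion applied as stated in the abstract (Q2 - R2 < 0.3)
    print("criteria met:", q2 >= 0.6 and r2 >= 0.85 and (q2 - r2) < 0.3)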

  16. Perioperative outcomes for pediatric neurosurgical procedures: analysis of the National Surgical Quality Improvement Program-Pediatrics.

    PubMed

    Kuo, Benjamin J; Vissoci, Joao Ricardo N; Egger, Joseph R; Smith, Emily R; Grant, Gerald A; Haglund, Michael M; Rice, Henry E

    2017-03-01

    OBJECTIVE Existing studies have shown a high overall rate of adverse events (AEs) following pediatric neurosurgical procedures. However, little is known regarding the morbidity of specific procedures or the association with risk factors to help guide quality improvement (QI) initiatives. The goal of this study was to describe the 30-day mortality and AE rates for pediatric neurosurgical procedures by using the American College of Surgeons (ACS) National Surgical Quality Improvement Program-Pediatrics (NSQIP-Peds) database platform. METHODS Data on 9996 pediatric neurosurgical patients were acquired from the 2012-2014 NSQIP-Peds participant user file. Neurosurgical cases were analyzed by the NSQIP-Peds targeted procedure categories, including craniotomy/craniectomy, defect repair, laminectomy, shunts, and implants. The primary outcome measure was 30-day mortality, with secondary outcomes including individual AEs, composite morbidity (all AEs excluding mortality and unplanned reoperation), surgical-site infection, and unplanned reoperation. Univariate analysis was performed between individual AEs and patient characteristics using Fisher's exact test. Associations between individual AEs and continuous variables (duration from admission to operation, work relative value unit, and operation time) were examined using the Student t-test. Patient characteristics and continuous variables associated with any AE by univariate analysis were used to develop category-specific multivariable models through backward stepwise logistic regression. RESULTS The authors analyzed 3383 craniotomy/craniectomy, 242 defect repair, 1811 laminectomy, and 4560 shunt and implant cases and found a composite overall morbidity of 30.2%, 38.8%, 10.2%, and 10.7%, respectively. Unplanned reoperation rates were highest for defect repair (29.8%). The mortality rate ranged from 0.1% to 1.2%. Preoperative ventilator dependence was a significant predictor of any AE for all procedure groups, whereas

  17. Test and analysis procedures for updating math models of Space Shuttle payloads

    NASA Technical Reports Server (NTRS)

    Craig, Roy R., Jr.

    1991-01-01

    Over the next decade or more, the Space Shuttle will continue to be the primary transportation system for delivering payloads to Earth orbit. Although a number of payloads have already been successfully carried by the Space Shuttle in the payload bay of the Orbiter vehicle, there continues to be a need for evaluation of the procedures used for verifying and updating the math models of the payloads. The verified payload math model is combined with an Orbiter math model for the coupled-loads analysis, which is required before any payload can fly. Several test procedures were employed for obtaining data for use in verifying payload math models and for carrying out the updating of the payload math models. Research was directed at the evaluation of test/update procedures for use in the verification of Space Shuttle payload math models. The following research tasks are summarized: (1) a study of free-interface test procedures; (2) a literature survey and evaluation of model update procedures; and (3) the design and construction of a laboratory payload simulator.

  18. Forensic analysis of Salvia divinorum using multivariate statistical procedures. Part II: association of adulterated samples to S. divinorum.

    PubMed

    Willard, Melissa A Bodnar; McGuffin, Victoria L; Smith, Ruth Waddell

    2012-01-01

    Salvia divinorum is a plant material that is of forensic interest due to the hallucinogenic nature of the active ingredient, salvinorin A. In this study, S. divinorum was extracted and spiked onto four different plant materials (S. divinorum, Salvia officinalis, Cannabis sativa, and Nicotiana tabacum) to simulate an adulterated sample that might be encountered in a forensic laboratory. The adulterated samples were extracted and analyzed by gas chromatography-mass spectrometry, and the resulting total ion chromatograms were subjected to a series of pretreatment procedures that were used to minimize non-chemical sources of variance in the data set. The data were then analyzed using principal components analysis (PCA) to investigate association of the adulterated extracts to unadulterated S. divinorum. While association was possible based on visual assessment of the PCA scores plot, additional procedures including Euclidean distance measurement, hierarchical cluster analysis, Student's t tests, Wilcoxon rank-sum tests, and Pearson product moment correlation were also applied to the PCA scores to provide a statistical evaluation of the association observed. The advantages and limitations of each statistical procedure in a forensic context were compared and are presented herein.
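
    As a rough illustration of the multivariate workflow described above (and not the authors' code), the sketch below projects pretreated total ion chromatograms onto principal components, measures the Euclidean distance of each sample's scores to the S. divinorum reference centroid, and clusters the scores hierarchically. The array tic, the labels list, and the choice of two components are assumptions made for the example.

        import numpy as np
        from sklearn.decomposition import PCA
        from scipy.spatial.distance import cdist
        from scipy.cluster.hierarchy import linkage, fcluster

        def associate_to_reference(tic, labels, reference_label="S. divinorum", n_pc=2):
            """tic: (n_samples, n_points) array of aligned, normalized chromatograms."""
            scores = PCA(n_components=n_pc).fit_transform(tic)
            ref = scores[[i for i, l in enumerate(labels) if l == reference_label]]
            # Euclidean distance of every sample's scores to the reference centroid
            dist_to_ref = cdist(scores, ref.mean(axis=0, keepdims=True)).ravel()
            # Hierarchical clustering of the PCA scores (Ward linkage, two clusters)
            clusters = fcluster(linkage(scores, method="ward"), t=2, criterion="maxclust")
            return scores, dist_to_ref, clusters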

  19. Surface extra-vehicular activity emergency scenario management: Tools, procedures, and geologically related implications

    NASA Astrophysics Data System (ADS)

    Zea, Luis; Diaz, Alejandro R.; Shepherd, Charles K.; Kumar, Ranganathan

    2010-07-01

    Extra-vehicular activities (EVAs) are an essential part of human space exploration, but involve inherently dangerous procedures which can put crew safety at risk during a space mission. To help mitigate this risk, astronauts' training programs spend substantial attention on preparing for surface EVA emergency scenarios. With the help of two Mars Desert Research Station (MDRS) crews (61 and 65), wearing simulated spacesuits, the most important of these emergency scenarios were examined at three different types of locations that geologically and environmentally resemble lunar and Martian landscapes. These three platforms were analyzed geologically as well as topographically (utilizing a laser range finder with slope estimation capabilities and a slope determination software). Emergency scenarios were separated into four main groups: (1) suit issues, (2) general physiological, (3) attacks and (4) others. Specific tools and procedures were developed to address each scenario. The tools and processes were tested in the field under Mars-analog conditions with the suited subjects for feasibility and speed of execution.

  20. Design and analysis of solder connections using accelerated approximate procedure with disturbed state concept

    NASA Astrophysics Data System (ADS)

    Whitenack, Russell

    The accelerated approximate procedure developed and used herein for analysis, design and parametric optimization in electronic packaging is based on the disturbed state concept (DSC) and the hierarchical single surface (HISS) constitutive models. Over the past several years the benefits of the DSC/HISS model, compared to those of available plasticity models, have been demonstrated and validated for a wide range of materials and solder connections. When the DSC/HISS model is implemented in a two-dimensional finite element code, it is well suited for failure analyses of lead/tin solder connections under cyclic thermal and mechanical loading that typically occur in electronic packages. Unfortunately, an analysis of a single solder connection, for approximately 4000 or more cycles, can require much effort and computer time, which may be too long to be of practical use. The accelerated approximate procedure significantly reduces the effort and the analysis time to approximately 10 to 15 minutes on a Pentium 4, 3.2 GHz personal computer. The main emphasis of this dissertation is the use of the unified DSC model with the finite element procedure to predict the behavior of chip-substrate solder connections. The DSC code is used to validate the performance of a number of packages (144 BPGA, 313 PBGA) tested in the laboratory under thermomechanical loading. Using the accelerated approximate procedure, the effect of the variable thickness solder connection in a plane stress idealization is compared with that of the constant-thickness assumption and with a three-dimensional analysis. It shows that the analysis with variable thickness (in plane stress idealization) yields improved results. The accelerated approximate procedure is then used to perform parametric design analyses of a solder connection by varying a number of important factors such as connection size, shape and misalignment. The effects of varying the DSC/HISS parameters on cycle life are also analyzed. The results of

  1. The effects of post-persulfate-digestion procedures on total phosphorus analysis in water.

    PubMed

    Zhou, Meifang; Struve, David M

    2004-11-01

    There are differences between the EPA Method 365 and the APHA-AWWA-WEF's Standard Method 4500 with respect to the post-digestion treatment procedures of the persulfate-digested water. The effects on total phosphorus analysis of different post-digestion treatment procedures, such as neutralization and reacidification, and shaking/settling, were investigated in this study using the total phosphorus measurements of water samples from the Everglades Round Robin (ERR) study and comparing the results with the ERR study. The effects of the insoluble particles or phosphorus adsorption/precipitation on/with Al and Fe hydroxides in different post-digestion treatment procedures adequately accounted for the differences between the most probable value and the higher or lower total phosphorus measurements reported in the ERR study. Based on the results of this investigation we recommend that a clearly defined set of digestion and post-digestion treatment procedures be adopted as the standard for total phosphorus analysis using the ascorbic acid method.

  2. An improved procedure for the purification of catalytically active alkane hydroxylase from Pseudomonas putida GPo1.

    PubMed

    Xie, Meng; Alonso, Hernan; Roujeinikova, Anna

    2011-10-01

    Bacterial alkane hydroxylases are of high interest for bioremediation applications as they allow some bacteria to grow in oil-contaminated environments. Furthermore, they have tremendous biotechnological potential as they catalyse the stereo- and regio-specific hydroxylation of chemically inert alkanes, which can then be used in the synthesis of pharmaceuticals and other high-cost chemicals. Despite their potential, progress on the detailed characterization of these systems has so far been slow, mainly due to the lack of a robust procedure to purify their membrane protein component, monooxygenase AlkB, in a stable and active form. This study reports a new method for isolating milligramme amounts of recombinant Pseudomonas putida GPo1 AlkB in a folded, catalytically active form to purity levels above 90%. AlkB solubilised and purified in the detergent lauryldimethylamine oxide was demonstrated to be active in catalysing the epoxidation reaction of 1-octene with an estimated Km value of 0.2 mM.
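
    The Km quoted above is the kind of parameter obtained by fitting the Michaelis-Menten equation v = Vmax*S/(Km + S) to initial-rate data. The short Python sketch below shows such a fit; the substrate concentrations and rates are invented placeholders, not measurements from the AlkB study.

        import numpy as np
        from scipy.optimize import curve_fit

        def michaelis_menten(s, vmax, km):
            return vmax * s / (km + s)

        s = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.6])   # 1-octene, mM (illustrative)
        v = np.array([0.9, 1.5, 2.2, 2.9, 3.4, 3.7])    # initial rate, arbitrary units

        (vmax_fit, km_fit), _ = curve_fit(michaelis_menten, s, v, p0=(4.0, 0.2))
        print(f"Vmax = {vmax_fit:.2f}, Km = {km_fit:.2f} mM")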

  3. A comparison of analysis procedures for correlated binary data in dedicated multi-rater imaging trials.

    PubMed

    Kunz, Michael

    2015-01-01

    In this paper, three analysis procedures for repeated correlated binary data with no a priori ordering of the measurements are described and subsequently investigated. Examples for correlated binary data could be the binary assessments of subjects obtained by several raters in the framework of a clinical trial. This topic is especially of relevance when success criteria have to be defined for dedicated imaging trials involving several raters conducted for regulatory purposes. First, an analytical result on the expectation of the 'Majority rater' is presented when only the marginal distributions of the single raters are given. The paper provides a simulation study where all three analysis procedures are compared for a particular setting. It turns out that in many cases, 'Average rater' is associated with a gain in power. Settings were identified where 'Majority significant' has favorable properties. 'Majority rater' is in many cases difficult to interpret.
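
    To make the 'Average rater' versus 'Majority rater' distinction concrete, the sketch below simulates correlated binary assessments from three raters by thresholding an exchangeable multivariate normal and then forms both summary readings per subject. The correlation, marginal success probability, and rater count are arbitrary assumptions, and this is not the simulation design used in the paper.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        n_subjects, n_raters, rho, p_success = 5000, 3, 0.6, 0.7

        # Exchangeable correlation between the raters' latent scores
        cov = np.full((n_raters, n_raters), rho)
        np.fill_diagonal(cov, 1.0)
        latent = rng.multivariate_normal(np.zeros(n_raters), cov, size=n_subjects)

        # Thresholding gives each rater the same marginal success probability p_success
        ratings = (latent < norm.ppf(p_success)).astype(int)

        average_rater = ratings.mean(axis=1)                    # continuous per-subject score
        majority_rater = (ratings.sum(axis=1) >= 2).astype(int) # value agreed by >= 2 raters
        print(average_rater.mean(), majority_rater.mean())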

  4. Detection of cow milk in donkey milk by chemometric procedures on triacylglycerol stereospecific analysis results.

    PubMed

    Cossignani, Lina; Blasi, Francesca; Bosi, Ancilla; D'Arco, Gilda; Maurelli, Silvia; Simonetti, Maria Stella; Damiani, Pietro

    2011-08-01

    Stereospecific analysis is an important tool for the characterization of lipid fraction of food matrices, and also of milk samples. The results of a chemical-enzymatic-chromatographic analytical method were elaborated by chemometric procedures such as linear discriminant analysis (LDA) and artificial neural network (ANN). According to the total composition and intrapositional fatty acid distribution in the triacylglycerol (TAG) backbone, the obtained results were able to characterize pure milk samples and milk mixtures with 1, 3, 5% cow milk added to donkey milk. The resulting score was very satisfactory. Totally correct classified samples were obtained when the TAG stereospecific results of all the considered milk mixtures (donkey-cow) were elaborated by LDA and ANN chemometric procedures.
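
    A minimal sketch of the LDA step, assuming a feature matrix X holding the total and intrapositional fatty acid compositions and a label vector y distinguishing pure donkey milk from the 1, 3 and 5% cow-milk mixtures; it stands in for, rather than reproduces, the authors' chemometric pipeline (their ANN step is omitted here).

        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        def classify_milk_mixtures(X, y):
            """Fit an LDA classifier and report cross-validated classification accuracy."""
            lda = LinearDiscriminantAnalysis()
            accuracy = cross_val_score(lda, X, y, cv=5).mean()
            return lda.fit(X, y), accuracy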

  5. Supplement to procedures, analysis, and comparison of groundwater velocity measurement methods for unconfined aquifers

    SciTech Connect

    Zinkl, R.J.; Kearl, P.M.

    1988-09-01

    This report is a supplement to Procedures, Analysis, and Comparison of Groundwater Velocity Measurement Methods for Unconfined Aquifers and provides computer program descriptions, type curves, and calculations for the analysis of field data in determining groundwater velocity in unconfined aquifers. The computer programs analyze bail or slug tests, pumping tests, Geoflo Meter data, and borehole dilution data. Appendix A is a description of the code, instructions for using the code, an example data file, and the calculated results to allow checking the code after installation on the user's computer. Calculations, development of formulas, and correction factors for the various programs are presented in Appendices B through F. Appendix G provides a procedure for calculating transmissivity and specific yield for pumping tests performed in unconfined aquifers.

  6. An efficient solution procedure for the thermoelastic analysis of truss space structures

    NASA Technical Reports Server (NTRS)

    Givoli, D.; Rand, O.

    1992-01-01

    A solution procedure is proposed for the thermal and thermoelastic analysis of truss space structures in periodic motion. In this method, the spatial domain is first discretized using a consistent finite element formulation. Then the resulting semi-discrete equations in time are solved analytically by using Fourier decomposition. Geometrical symmetry is taken advantage of completely. An algorithm is presented for the calculation of heat flux distribution. The method is demonstrated via a numerical example of a cylindrically shaped space structure.

  7. The environmental analysis of helicopter operations by Federal agencies: Current procedures and research needs

    NASA Technical Reports Server (NTRS)

    Smith, C. C.; Warner, D. B.; Dajani, J. S.

    1977-01-01

    The technical, economic, and environmental problems restricting commercial helicopter passenger operations are reviewed. The key considerations for effective assessment procedures are outlined and a preliminary model for the environmental analysis of helicopters is developed. It is recommended that this model, or some similar approach, be used as a common base for the development of comprehensive environmental assessment methods for each of the federal agencies concerned with helicopters. A description of the critical environmental research issues applicable to helicopters is also presented.

  8. Neutron Activated Samarium-153 Microparticles for Transarterial Radioembolization of Liver Tumour with Post-Procedure Imaging Capabilities

    PubMed Central

    Hashikin, Nurul Ab. Aziz; Yeong, Chai-Hong; Abdullah, Basri Johan Jeet; Ng, Kwan-Hoong; Chung, Lip-Yong; Dahalan, Rehir; Perkins, Alan Christopher

    2015-01-01

    Introduction Samarium-153 (153Sm) styrene divinylbenzene microparticles were developed as a surrogate for Yttrium-90 (90Y) microspheres in liver radioembolization therapy. Unlike the pure beta emitter 90Y, 153Sm possesses both therapeutic beta and diagnostic gamma radiations, making post-procedure imaging following therapy possible. Methods The microparticles were prepared using a commercially available cation exchange resin, Amberlite IR-120 H+ (620–830 μm), which was reduced to 20–40 μm via ball mill grinding and sieve separation. The microparticles were labelled with 152Sm via an ion exchange process with 152SmCl3, prior to neutron activation to produce radioactive 153Sm through the 152Sm(n,γ)153Sm reaction. A therapeutic activity of 3 GBq was selected based on the recommended activity used in 90Y-microsphere therapy. The samples were irradiated in a 1.494 × 10^12 n·cm^-2·s^-1 neutron flux for 6 h to achieve the nominal activity of 3.1 GBq·g^-1. Physicochemical characterisation of the microparticles, gamma spectrometry, and in vitro radiolabelling studies were carried out to study the performance and stability of the microparticles. Results Fourier Transform Infrared (FTIR) spectroscopy of the Amberlite IR-120 resins showed unaffected functional groups following size reduction of the beads. However, as shown by electron microscopy, the microparticles were irregular in shape. The radioactivity achieved after 6 h neutron activation was 3.104 ± 0.029 GBq. The specific activity per microparticle was 53.855 ± 0.503 Bq. Gamma spectrometry and elemental analysis showed no radioactive impurities in the samples. Radiolabelling efficiencies of 153Sm-Amberlite in distilled water and blood plasma over 48 h were excellent and higher than 95%. Conclusion The laboratory work revealed that the 153Sm-Amberlite microparticles demonstrated superior characteristics for potential use in hepatic radioembolization. PMID:26382059
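
    The nominal specific activity quoted above follows from the standard activation equation A = N·σ·φ·(1 − e^(−λt)). The back-of-the-envelope Python sketch below evaluates it; the capture cross-section, the 153Sm half-life, and the samarium loading per gram of microparticles are assumed nominal values chosen for illustration, not figures reported by the study.

        import numpy as np

        N_A       = 6.022e23          # Avogadro's number
        M_152SM   = 152.0             # molar mass of 152Sm, g/mol
        SIGMA     = 206e-24           # thermal (n,gamma) cross-section of 152Sm, cm^2 (assumed nominal)
        PHI       = 1.494e12          # neutron flux, n cm^-2 s^-1 (from the abstract)
        HALF_LIFE = 46.3 * 3600.0     # 153Sm half-life, s (assumed nominal)
        T_IRR     = 6 * 3600.0        # irradiation time, s

        def activity_per_gram(sm_mass_fraction=0.03):
            """Induced activity (Bq) per gram of microparticles for an assumed 152Sm loading."""
            n_atoms = sm_mass_fraction * N_A / M_152SM      # 152Sm atoms per gram of sample
            lam = np.log(2.0) / HALF_LIFE                   # decay constant of 153Sm
            return n_atoms * SIGMA * PHI * (1.0 - np.exp(-lam * T_IRR))

        print(f"{activity_per_gram() / 1e9:.2f} GBq per gram")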

  9. Consequences of Decontamination Procedures in Forensic Hair Analysis Using Metal-Assisted Secondary Ion Mass Spectrometry Analysis.

    PubMed

    Cuypers, Eva; Flinders, Bryn; Boone, Carolien M; Bosman, Ingrid J; Lusthof, Klaas J; Van Asten, Arian C; Tytgat, Jan; Heeren, Ron M A

    2016-03-15

    Today, hair testing is considered to be the standard method for the detection of chronic drug abuse. Nevertheless, the differentiation between systemic exposure and external contamination remains a major challenge in the forensic interpretation of hair analysis. Nowadays, it is still impossible to directly show the difference between external contamination and use-related incorporation. Although the effects of washing procedures on the distribution of (incorporated) drugs in hair remain unknown, these decontamination procedures prior to hair analysis are considered to be indispensable in order to exclude external contamination. However, insights into the effect of decontamination protocols on levels and distribution of drugs incorporated in hair are essential to draw the correct forensic conclusions from hair analysis; we studied the consequences of these procedures on the spatial distribution of cocaine in hair using imaging mass spectrometry. Additionally, using metal-assisted secondary ion mass spectrometry, we are the first to directly show the difference between cocaine-contaminated and user hair without any prior washing procedure.

  10. Proposal of a procedure for the analysis of atmospheric polycyclic aromatic hydrocarbons in mosses.

    PubMed

    Concha-Graña, Estefanía; Piñeiro-Iglesias, María; Muniategui-Lorenzo, Soledad; López-Mahía, Purificación; Prada-Rodríguez, Darío

    2015-03-01

    A useful analytical procedure for the analysis of 19 polycyclic aromatic hydrocarbons (PAHs) in moss samples using microwave assisted extraction and programmed temperature vaporization-gas chromatography-tandem mass spectrometry (PTV-GC-MS/MS) determination is proposed. The state of the art in PAH analysis in mosses was reviewed. All steps of the analysis were optimized with regard not only to the analytical parameters but also to the cost, the total analysis time and the labour involved. The method was validated for one moss species used as a moss monitor in ambient air, obtaining high recoveries (83-108%), low quantitation limits (lower than 2 ng g^-1), good intermediate precision (relative standard deviation lower than 10%) and uncertainties lower than 20%. Finally, the method was checked for other species, demonstrating its suitability for the analysis of different moss species. For this reason the proposed method can be helpful in air biomonitoring studies.

  11. Human Factors Process Task Analysis Liquid Oxygen Pump Acceptance Test Procedure for the Advanced Technology Development Center

    NASA Technical Reports Server (NTRS)

    Diorio, Kimberly A.

    2002-01-01

    A process task analysis effort was undertaken by Dynacs Inc. commencing in June 2002 under contract from NASA YA-D6. Funding was provided through NASA's Ames Research Center (ARC), Code M/HQ, and Industrial Engineering and Safety (IES). The John F. Kennedy Space Center (KSC) Engineering Development Contract (EDC) Task Order was 5SMA768. The scope of the effort was to conduct a Human Factors Process Failure Modes and Effects Analysis (HF PFMEA) of a hazardous activity and provide recommendations to eliminate or reduce the effects of errors caused by human factors. The Liquid Oxygen (LOX) Pump Acceptance Test Procedure (ATP) was selected for this analysis. The HF PFMEA table (see appendix A) provides an analysis of six major categories evaluated for this study. These categories include Personnel Certification, Test Procedure Format, Test Procedure Safety Controls, Test Article Data, Instrumentation, and Voice Communication. For each specific requirement listed in appendix A, the following topics were addressed: Requirement, Potential Human Error, Performance-Shaping Factors, Potential Effects of the Error, Barriers and Controls, Risk Priority Numbers, and Recommended Actions. This report summarizes findings and gives recommendations as determined by the data contained in appendix A. It also includes a discussion of technology barriers and challenges to performing task analyses, as well as lessons learned. The HF PFMEA table in appendix A recommends the use of accepted and required safety criteria in order to reduce the risk of human error. The items with the highest risk priority numbers should receive the greatest amount of consideration. Implementation of the recommendations will result in a safer operation for all personnel.
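
    The risk priority numbers referred to above are conventionally the product of severity, occurrence, and detection rankings. The small Python sketch below shows that bookkeeping; the failure modes and 1-10 rankings are invented placeholders, not rows from appendix A.

        failure_modes = [
            # (potential human error, severity, occurrence, detection) on 1-10 scales
            ("Valve lineup read from wrong revision of procedure", 8, 4, 6),
            ("Voice communication step acknowledged but not performed", 7, 3, 5),
            ("Instrumentation range mismatch not noticed", 6, 2, 7),
        ]

        # RPN = severity x occurrence x detection; highest-risk items surface first
        ranked = sorted(
            ((name, s * o * d) for name, s, o, d in failure_modes),
            key=lambda item: item[1],
            reverse=True,
        )
        for name, rpn in ranked:
            print(f"RPN {rpn:4d}  {name}")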

  12. Threshold level of NF-kB activation in small bowel ischemic preconditioning procedure.

    PubMed

    Ferencz, A; Rácz, B; Gasz, B; Kalmár-Nagy, K; Horváth, O P; Röth, E

    2006-01-01

    Ischemic preconditioning (IPC), which is obtained by exposure to brief periods of vascular occlusion, improves organ tolerance to prolonged ischemia. The aim of this study was to evaluate the threshold level of NF-kB activation in small intestine during an IPC procedure. Various intestinal IPC were performed on 20 Wistar rats in seven groups: group I (GI, nonpreconditioned); group II (GII, 1-minute ischemia and 1-minute reperfusion); group III (GIII, two cycles of 1-minute ischemia and 1-minute reperfusion); group IV (GIV, 2-minutes ischemia and 2-minutes reperfusion); group V (GV, two cycles of 2-minute ischemia and 2-minute reperfusion); group VI (GVI, 5-minute ischemia and 10-minute reperfusion); group VII (GVII, two cycles of 5-minute ischemia and 10-minute reperfusion). Bowel biopsies were collected after laparotomy (control) as well as at 30, 60, and 120 minutes following IPC. We determined the cytoplasmic and nuclear NF-kB by a chemiluminescence-based ELISA method. Our results showed low, constant NF-kB levels in GI. In the preconditioned groups (GII-GVII), NF-kB was significantly elevated at 30 minutes following IPC (P < .05 vs control). After 1 hour, NF-kB activity decreased to the control level. However, 2 hours after IPC both forms of NF-kB were elevated significantly again, which was independent of the number of IPC cycles (P < .05 vs control). Our experiments revealed that one cycle of 1-minute ischemia and 1-minute reperfusion is a critical threshold level for NF-kB activation during small bowel IPC. Longer and more IPC cycles did not result in further elevation of NF-kB activation.

  13. Ravitch versus Nuss procedure for pectus excavatum: systematic review and meta-analysis

    PubMed Central

    Kanagaratnam, Aran; Phan, Steven; Tchantchaleishvilli, Vakhtang

    2016-01-01

    Background Pectus excavatum is the most common congenital chest wall deformity. The two most common surgical techniques for its correction are the modified Ravitch technique and the minimally invasive Nuss technique. Despite both procedures being used widely, data comparing them are scarce. Methods We conducted a systematic review and meta-analysis of comparative studies to evaluate these procedures. A systematic search of the literature was performed from six electronic databases. Pooled meta-analysis was conducted using odds ratio (OR) and weighted mean difference (WMD). Results A total of 13 studies comprising 1,432 pediatric (79.3%) and adult (20.7%) patients were identified, including 912 patients undergoing the Nuss procedure compared to 520 patients undergoing the Ravitch procedure. There was no significant difference found between the Nuss group versus Ravitch group in pediatric patients with regard to overall complications (OR =1.16; 95% CI: 0.61–2.19; I2=56%; P=0.65), reoperations (6.1% vs. 6.4%; OR =1.00; 95% CI: 0.40–2.50; I2=0%; P=1.00), wound infections (OR =0.58; 95% CI: 0.23–1.46; I2=0%; P=0.25), hemothorax (1.6% vs. 1.3%; OR =0.74; 95% CI: 0.21–2.65; I2=12%; P=0.64), pneumothorax (3.4% vs. 1.5%; OR =1.11; 95% CI: 0.42–2.93; I2=0%; P=0.83) or pneumonia (OR =0.15; 95% CI: 0.02–1.48; I2=0%; P=0.10). Adult patients undergoing the Nuss procedure had a higher incidence of overall complications (OR =3.26; 95% CI: 1.01–10.46; I2=0%; P=0.05), though there were far fewer studies that reported data. Conclusions These results suggest no difference between the Nuss and Ravitch procedures for pediatric patients, while in adults the Ravitch procedure resulted in fewer complications. PMID:27747174
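
    The pooled odds ratios above come from standard meta-analytic pooling of study-level 2x2 tables. The sketch below shows fixed-effect inverse-variance pooling of log odds ratios with a 95% confidence interval; the study counts are invented, and the paper's heterogeneity statistics and weighted mean differences are not reproduced.

        import numpy as np

        # (events_nuss, n_nuss, events_ravitch, n_ravitch) per study; illustrative only
        studies = [(10, 120, 8, 70), (6, 90, 7, 60), (4, 80, 5, 55)]

        log_or, weights = [], []
        for a, n1, c, n2 in studies:
            b, d = n1 - a, n2 - c
            lor = np.log((a * d) / (b * c))
            var = 1 / a + 1 / b + 1 / c + 1 / d      # variance of the log odds ratio
            log_or.append(lor)
            weights.append(1 / var)

        pooled = np.average(log_or, weights=weights)
        se = np.sqrt(1 / np.sum(weights))
        print(f"pooled OR = {np.exp(pooled):.2f} "
              f"(95% CI {np.exp(pooled - 1.96*se):.2f}-{np.exp(pooled + 1.96*se):.2f})")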

  14. Random analysis of bearing capacity of square footing using the LAS procedure

    NASA Astrophysics Data System (ADS)

    Kawa, Marek; Puła, Wojciech; Suska, Michał

    2016-09-01

    In the present paper, a three-dimensional problem of the bearing capacity of a square footing on a random soil medium is analyzed. The random fields of the strength parameters c and φ are generated using the LAS procedure (Local Average Subdivision, Fenton and Vanmarcke 1990). The procedure is re-implemented by the authors in the Mathematica environment in order to combine it with a commercial program. Since the procedure is still being tested, the random field has been assumed to be one-dimensional: the strength properties of the soil are random in the vertical direction only. Individual realizations of the bearing capacity boundary-value problem, with strength parameters of the medium defined by the above procedure, are solved using the FLAC3D software. The analysis is performed for two qualitatively different cases, namely for purely cohesive and cohesive-frictional soils. For the latter case the friction angle and cohesion have been assumed to be independent random variables. For these two cases the random square footing bearing capacity results have been obtained for a range of fluctuation scales from 0.5 m to 10 m. Each time 1000 Monte Carlo realizations have been performed. The obtained results allow not only the mean and variance but also the probability density function to be estimated. An example of the application of this function to a reliability calculation is presented in the final part of the paper.
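
    A stripped-down illustration of the Monte Carlo side of the study is given below: random strength parameters are drawn, a bearing-capacity estimate is evaluated for each realization, and the mean, variance, and an empirical density are computed. The closed-form capacity expression and all distribution parameters are assumptions standing in for the LAS random-field generation and the FLAC3D boundary-value solutions.

        import numpy as np

        rng = np.random.default_rng(1)
        n_real = 1000

        # Lognormal cohesion (kPa) and normal friction angle (deg), treated as independent
        cohesion = rng.lognormal(mean=np.log(20.0), sigma=0.3, size=n_real)
        phi_deg = rng.normal(loc=25.0, scale=3.0, size=n_real)

        def bearing_capacity(c, phi_deg, B=1.0, gamma=18.0):
            """Classical bearing-capacity factors with square-footing shape factors
            (an illustrative surrogate for the numerical boundary-value solution)."""
            phi = np.radians(phi_deg)
            nq = np.exp(np.pi * np.tan(phi)) * np.tan(np.pi / 4 + phi / 2) ** 2
            nc = (nq - 1.0) / np.tan(phi)
            ngamma = 2.0 * (nq + 1.0) * np.tan(phi)
            return 1.3 * c * nc + 0.4 * gamma * B * ngamma   # surface footing, no surcharge

        q_ult = bearing_capacity(cohesion, phi_deg)
        print(q_ult.mean(), q_ult.var())
        pdf, edges = np.histogram(q_ult, bins=30, density=True)   # empirical density estimate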

  15. 34 CFR 79.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false What procedures apply to the selection of programs and activities under these regulations? 79.6 Section 79.6 Education Office of the Secretary, Department of Education INTERGOVERNMENTAL REVIEW OF DEPARTMENT OF EDUCATION PROGRAMS AND ACTIVITIES § 79.6 What...

  16. Hydrophobic cluster analysis: procedures to derive structural and functional information from 2-D-representation of protein sequences.

    PubMed

    Lemesle-Varloot, L; Henrissat, B; Gaboriaud, C; Bissery, V; Morgat, A; Mornon, J P

    1990-08-01

    Hydrophobic cluster analysis (HCA) [15] is a very efficient method to analyse and compare protein sequences. Despite its effectiveness, this method is not widely used because it relies in part on the experience and training of the user. In this article, detailed guidelines as to the use of HCA are presented and include discussions on: the definition of the hydrophobic clusters and their relationships with secondary and tertiary structures; the length of the clusters; the amino acid classification used for HCA; the HCA plot programs; and the working strategies. Various procedures for the analysis of a single sequence are presented: structural segmentation, structural domains and secondary structure evaluation. Like most sequence analysis methods, HCA is more efficient when several homologous sequences are compared. Procedures for the detection and alignment of distantly related proteins by HCA are described through several published examples along with 2 previously unreported cases: the beta-glucosidase from Ruminococcus albus is clearly related to the beta-glucosidases from Clostridium thermocellum and Hansenula anomala although they display a reverse organization of their constitutive domains; the alignment of the sequence of human GTPase activating protein with that of the Crk oncogene is presented. Finally, the pertinence of HCA in the identification of important residues for structure/function as well as in the preparation of homology modelling is discussed.

  17. Minimum-mass design of filamentary composite panels under combined loads: Design procedure based on a rigorous buckling analysis

    NASA Technical Reports Server (NTRS)

    Stroud, W. J.; Agranoff, N.; Anderson, M. S.

    1977-01-01

    A procedure is presented for designing uniaxially stiffened panels made of composite material and subjected to combined inplane loads. The procedure uses a rigorous buckling analysis and nonlinear mathematical programing techniques. Design studies carried out with the procedure consider hat-stiffened and corrugated panels made of graphite-epoxy material. Combined longitudinal compression and shear and combined longitudinal and transverse compression are the loadings used in the studies. The capability to tailor the buckling response of a panel is also explored. Finally, the adequacy of another, simpler, analysis-design procedure is examined.

  18. Development of SRC-I product analysis. Volume 3. Documentation of procedures

    SciTech Connect

    Schweighardt, F.K.; Kingsley, I.S.; Cooper, F.E.; Kamzelski, A.Z.; Parees, D.M.

    1983-09-01

    This section documents the BASIC computer program written to simulate Wilsonville's GC-simulated distillation (GCSD) results at APCI-CRSD Trexlertown. The GC conditions used at APCI for the Wilsonville GCSD analysis of coal-derived liquid samples were described in the SRC-I Quarterly Technical Report, April-June 1981. The approach used to simulate the Wilsonville GCSD results is also from an SRC-I Quarterly Technical Report and is reproduced in Appendix VII-A. The BASIC computer program is described in the attached Appendix VII-B. Analysis of gases produced during coal liquefaction generates key information needed to determine product yields for material balance and process control. Gas samples from the coal process development unit (CPDU) and tubing bombs are the primary samples analyzed. A Carle gas chromatographic system was used to analyze coal liquefaction gas samples. A BASIC computer program was written to convert the gas chromatographic peak area results into mole percent results. ICRC has employed several analytical workup procedures to determine the amount of distillate, oils, asphaltenes, preasphaltenes, and residue in SRC-I process streams. The ASE procedure was developed using Conoco's liquid column fractionation (LC/F) method as a model. In developing the ASE procedure, ICRC was able to eliminate distillation, and therefore quantify the oils fraction in one extraction step. ASE results were shown to be reproducible within ± 2 wt %, and to yield acceptable material balances. Finally, the ASE method proved to be the least affected by sample composition.
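
    The peak-area-to-mole-percent step mentioned above reduces to normalizing response-corrected areas. A minimal Python sketch of that conversion follows; the component list and relative molar response factors are placeholders, not the Carle GC calibration.

        peak_areas = {"H2": 1520.0, "CO": 310.0, "CH4": 880.0, "CO2": 450.0, "C2H6": 120.0}
        response_factors = {"H2": 1.00, "CO": 0.95, "CH4": 1.10, "CO2": 0.90, "C2H6": 1.20}

        # Relative moles = area corrected by a relative molar response factor, then normalized
        moles = {c: peak_areas[c] / response_factors[c] for c in peak_areas}
        total = sum(moles.values())
        mole_percent = {c: 100.0 * n / total for c, n in moles.items()}

        for component, pct in sorted(mole_percent.items(), key=lambda kv: -kv[1]):
            print(f"{component:5s} {pct:6.2f} mol %")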

  19. Instability analysis procedure for 3-level multi-bearing rotor-foundation systems

    NASA Technical Reports Server (NTRS)

    Zhou, S.; Rieger, N. F.

    1985-01-01

    A procedure for the instability analysis of three-level multispan rotor systems is described. This procedure is based on a distributed mass elastic representation of the rotor system in several eight-coefficient bearings. Each bearing is supported from an elastic foundation on damped, elastic pedestals. The foundation is represented as a general distributed mass elastic structure on discrete supports, which may have different stiffness and damping properties in the horizontal and vertical directions. This system model is suited to studies of instability threshold conditions for multirotor turbomachines on either massive or flexible foundations. The instability condition is found by obtaining the eigenvalues of the system determinant, which is obtained by the transfer matrix method from the three-level system model. The stability determinant is solved for the lowest rotational speed at which the system damping becomes zero in the complex eigenvalue, and for the whirl frequency corresponding to the natural frequency of the unstable mode. An efficient algorithm for achieving this is described. Application of this procedure to a rigid rotor in two damped-elastic bearings and flexible supports is described. A second example discusses a flexible rotor with four damped-elastic bearings. The third case compares the stability of a six-bearing 300 MW turbine generator unit, using two different bearing types. These applications validate the computer program and various aspects of the analysis.

  20. A Procedure for Modeling Structural Component/Attachment Failure Using Transient Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Jegley, Dawn C. (Technical Monitor)

    2007-01-01

    Structures often comprise smaller substructures that are connected to each other or attached to the ground by a set of finite connections. Under static loading one or more of these connections may exceed allowable limits and be deemed to fail. Of particular interest is the structural response when a connection is severed (failed) while the structure is under static load. A transient failure analysis procedure was developed by which it is possible to examine the dynamic effects that result from introducing a discrete failure while a structure is under static load. The failure is introduced by replacing a connection load history by a time-dependent load set that removes the connection load at the time of failure. The subsequent transient response is examined to determine the importance of the dynamic effects by comparing the structural response with the appropriate allowables. Additionally, this procedure utilizes a standard finite element transient analysis that is readily available in most commercial software, permitting the study of dynamic failures without the need to purchase software specifically for this purpose. The procedure is developed and explained, demonstrated on a simple cantilever box example, and finally demonstrated on a real-world example, the American Airlines Flight 587 (AA587) vertical tail plane (VTP).
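
    The key modelling device described above is the substitution of a connection's static load by a time-dependent history that removes it at the failure time. The sketch below builds such a history as a constant load followed by a short linear ramp to zero; the ramp length, time step, and load value are illustrative assumptions, not values from the AA587 analysis.

        import numpy as np

        def connection_load_history(static_load, t_fail, ramp=0.005, t_end=0.5, dt=0.001):
            """Return (time, load) arrays: constant load, then a linear drop to zero at t_fail."""
            t = np.arange(0.0, t_end + dt, dt)
            load = np.where(
                t < t_fail, static_load,
                np.clip(static_load * (1.0 - (t - t_fail) / ramp), 0.0, None),
            )
            return t, load

        t, p = connection_load_history(static_load=12.0e3, t_fail=0.1)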

  1. Analysis of boutique arrays: a universal method for the selection of the optimal data normalization procedure.

    PubMed

    Uszczyńska, Barbara; Zyprych-Walczak, Joanna; Handschuh, Luiza; Szabelska, Alicja; Kaźmierczak, Maciej; Woronowicz, Wiesława; Kozłowski, Piotr; Sikorski, Michał M; Komarnicki, Mieczysław; Siatkowski, Idzi; Figlerowicz, Marek

    2013-09-01

    DNA microarrays, which are among the most popular genomic tools, are widely applied in biology and medicine. Boutique arrays, which are small, spotted, dedicated microarrays, constitute an inexpensive alternative to whole-genome screening methods. The data extracted from each microarray-based experiment must be transformed and processed prior to further analysis to eliminate any technical bias. The normalization of the data is the most crucial step of microarray data pre-processing and this process must be carefully considered as it has a profound effect on the results of the analysis. Several normalization algorithms have been developed and implemented in data analysis software packages. However, most of these methods were designed for whole-genome analysis. In this study, we tested 13 normalization strategies (ten for double-channel data and three for single-channel data) available on R Bioconductor and compared their effectiveness in the normalization of four boutique array datasets. The results revealed that boutique arrays can be successfully normalized using standard methods, but not every method is suitable for each dataset. We also suggest a universal seven-step workflow that can be applied for the selection of the optimal normalization procedure for any boutique array dataset. The described workflow enables the evaluation of the investigated normalization methods based on the bias and variance values for the control probes, a differential expression analysis and a receiver operating characteristic curve analysis. The analysis of each component results in a separate ranking of the normalization methods. A combination of the ranks obtained from all the normalization procedures facilitates the selection of the most appropriate normalization method for the studied dataset and determines which methods can be used interchangeably.

  2. Preliminary design and analysis of procedures for the numerical generation of 3D block-structured grids

    NASA Astrophysics Data System (ADS)

    Boerstoel, J. W.

    1986-08-01

    Approaches to grid generation are analyzed. A grid-generation procedure for complex aircraft configurations could be based on a combination of three subprocesses: decomposition of the flow domain into 100 hexahedral blocks; trilinear transfinite interpolation to generate initial grid point distributions; and elliptic mesh-size tuning and smoothing. To get insight into this procedure, mathematical models of the subprocesses were worked out. The results of the analysis are technical concepts required or desirable in the grid-generation procedure.
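
    The transfinite-interpolation subprocess can be illustrated with its two-dimensional (Coons patch) form, in which interior grid points are blended from the four boundary curves; the trilinear three-dimensional version used for block-structured grids follows the same pattern with six block faces. The function below is a generic sketch, not the procedure analyzed in the report.

        import numpy as np

        def coons_grid(bottom, top, left, right):
            """bottom/top: (ni, 2) boundary curves; left/right: (nj, 2) curves sharing the corners."""
            ni, nj = bottom.shape[0], left.shape[0]
            xi = np.linspace(0.0, 1.0, ni)[:, None, None]     # (ni, 1, 1)
            eta = np.linspace(0.0, 1.0, nj)[None, :, None]    # (1, nj, 1)
            grid = ((1 - eta) * bottom[:, None, :] + eta * top[:, None, :]
                    + (1 - xi) * left[None, :, :] + xi * right[None, :, :]
                    - (1 - xi) * (1 - eta) * bottom[0] - xi * (1 - eta) * bottom[-1]
                    - (1 - xi) * eta * top[0] - xi * eta * top[-1])
            return grid                                       # shape (ni, nj, 2)

        # Example: unit square with a gently curved top boundary
        s = np.linspace(0.0, 1.0, 21)
        e = np.linspace(0.0, 1.0, 11)
        bottom = np.column_stack([s, np.zeros_like(s)])
        top = np.column_stack([s, 1.0 + 0.1 * np.sin(np.pi * s)])
        left = np.column_stack([np.zeros_like(e), e])
        right = np.column_stack([np.ones_like(e), e])
        grid = coons_grid(bottom, top, left, right)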

  3. Differential Scanning Calorimetry (DSC) for the Analysis of Activated Carbon

    DTIC Science & Technology

    1991-10-01

    Report AD-A245 899: Differential Scanning Calorimetry (DSC) for the Analysis of Activated Carbon (U), by S.H.C. and L.E. Cameron (DTIC). Only fragments of the abstract are recoverable: it is believed that Sutcliffe-Speakman is currently using coconut shell (instead of the New Zealand coal) as the carbon precursor in its impregnation procedures, and the carbon's microstructure facilitates the adsorption process whereby the undesirable materials are retained; for military deployment, the activated carbon is ...

  4. Processes and Procedures for Application of CFD to Nuclear Reactor Safety Analysis

    SciTech Connect

    Richard W. Johnson; Richard R. Schultz; Patrick J. Roache; Ismail B. Celik; William D. Pointer; Yassin A. Hassan

    2006-09-01

    Traditionally, nuclear reactor safety analysis has been performed using systems analysis codes such as RELAP5, which was developed at the INL. However, goals established by the Generation IV program, especially the desire to increase efficiency, have led to an increase in operating temperatures for the reactors. This increase pushes reactor materials to operate towards their upper temperature limits relative to structural integrity. Because there will be some finite variation of the power density in the reactor core, there will be a potential for local hot spots to occur in the reactor vessel. Hence, it has become apparent that detailed analysis will be required to ensure that local ‘hot spots’ do not exceed safety limits. It is generally accepted that computational fluid dynamics (CFD) codes are intrinsically capable of simulating fluid dynamics and heat transport locally because they are based on ‘first principles.’ Indeed, CFD analysis has reached a fairly mature level of development, including the commercial level. However, CFD experts are aware that even though commercial codes are capable of simulating local fluid and thermal physics, great care must be taken in their application to avoid errors caused by such things as inappropriate grid meshing, low-order discretization schemes, lack of iterative convergence and inaccurate time-stepping. Just as important is the choice of a turbulence model for turbulent flow simulation. Turbulence models model the effects of turbulent transport of mass, momentum and energy, but are not necessarily applicable for wide ranges of flow types. Therefore, there is a well-recognized need to establish practices and procedures for the proper application of CFD to simulate flow physics accurately and establish the level of uncertainty of such computations. The present document represents contributions of CFD experts on what the basic practices, procedures and guidelines should be to aid CFD analysts to obtain accurate

  5. Critical comparison of extraction procedures for the capillary electrophoretic analysis of opiates in hair.

    PubMed

    de Lima, Elizabete C; da Silva, Clóvis L; Gauchée, Magnólia L N; Tavares, Marina F M

    2003-01-01

    This work presents a comparative evaluation of extraction procedures for the capillary electrophoretic analysis of seven opiates (meperidine, morphine, naloxone, tramadol, fentanyl, sufentanyl, and alfentanyl) in human hair. Pieces of hair (50-150 mg) were subjected to acidic hydrolysis (0.25 mmol L^-1 HCl at 45 degrees C, overnight) followed by pH adjustment and either liquid-liquid extraction (LLE) in hexane, petroleum ether, dichloromethane, and ethyl acetate solvents, or solid-phase extraction (SPE) in octadecyl, cyanopropyl, and aminopropyl bonded silica and cation exchange polymeric phases. Excellent recoveries of approximately 70% (naloxone and fentanyl and its analogues), 88% (meperidine), and ca. 100% (morphine and tramadol) were obtained using SPE in a mixed-mode cation exchange reversed-phase cartridge (Oasis MCX LP, Waters Corp., Milford, MA, U.S.A.), making this type of procedure eligible for novel clinical and forensic methodologies for hair analysis. The utility of the proposed extraction technique was demonstrated by the analysis of hair extracts from patients using morphine as part of their pain management protocol.

  6. Computer implementation of analysis and optimization procedures for control-structure interaction problems

    NASA Technical Reports Server (NTRS)

    Belvin, W. Keith; Park, K. C.

    1990-01-01

    Implementation aspects of control-structure interaction analysis and optimization by the staggered use of single-discipline analysis modules are discussed. The single-discipline modules include structural analysis, controller synthesis and optimization. The software modularity is maintained by employing a partitioned control-structure interaction analysis procedure, thus avoiding the need for embedding the single-discipline modules into a monolithic program. A software testbed has been constructed as a stand-alone analysis and optimization program and tested for its versatility and software modularity by applying it to the dynamic analysis and preliminary design of a prototype Earth Pointing Satellite. Experience with the in-core testbed program so far demonstrates that the testbed is efficient, preserves software modularity, and enables the analyst to choose a different set of algorithms, control strategies and design parameters via user software interfaces. Thus, the present software architecture is recommended for adoption by control-structure interaction analysts as a preliminary analysis and design tool.

  7. The interval testing procedure: A general framework for inference in functional data analysis.

    PubMed

    Pini, Alessia; Vantini, Simone

    2016-09-01

    We introduce in this work the Interval Testing Procedure (ITP), a novel inferential technique for functional data. The procedure can be used to test different functional hypotheses, e.g., distributional equality between two or more functional populations, or equality of the mean function of a functional population to a reference. The ITP involves three steps: (i) the representation of data on a (possibly high-dimensional) functional basis; (ii) the test of each possible set of consecutive basis coefficients; (iii) the computation of the adjusted p-values associated with each basis component, by means of a new strategy proposed here. We define a new type of error control, the interval-wise control of the family-wise error rate, which is particularly suited to functional data. We show that the ITP provides this control. A simulation study comparing the ITP with other testing procedures is reported. The ITP is then applied to the analysis of hemodynamic features involved with cerebral aneurysm pathology. The ITP is implemented in the fdatest R package.

  8. A Procedure for the supercritical fluid extraction of coal samples, with subsequent analysis of extracted hydrocarbons

    USGS Publications Warehouse

    Kolak, Jonathan J.

    2006-01-01

    Introduction: This report provides a detailed, step-by-step procedure for conducting extractions with supercritical carbon dioxide (CO2) using the ISCO SFX220 supercritical fluid extraction system. Protocols for the subsequent separation and analysis of extracted hydrocarbons are also included in this report. These procedures were developed under the auspices of the project 'Assessment of Geologic Reservoirs for Carbon Dioxide Sequestration' (see http://pubs.usgs.gov/fs/fs026-03/fs026-03.pdf) to investigate possible environmental ramifications associated with CO2 storage (sequestration) in geologic reservoirs, such as deep (~1 km below land surface) coal beds. Supercritical CO2 has been used previously to extract contaminants from geologic matrices. Pressure-temperature conditions within deep coal beds may render CO2 supercritical. In this context, the ability of supercritical CO2 to extract contaminants from geologic materials may serve to mobilize noxious compounds from coal, possibly complicating storage efforts. There currently exists little information on the physicochemical interactions between supercritical CO2 and coal in this setting. The procedures described herein were developed to improve the understanding of these interactions and provide insight into the fate of CO2 and contaminants during simulated CO2 injections.

  9. Comparison of various digestion procedures in chemical analysis of spent hydrodesulfurization catalyst.

    PubMed

    Szymczycha-Madeja, Anna; Mulak, Władysława

    2009-05-30

    Four digestion procedures have been tested to verify their applicability to the determination of major and trace elements (Al, Ba, Cd, Co, Cr, Cu, Fe, Mn, Mo, Ni, Pb, Sr, Ti, V, Zn) in a spent catalyst by inductively coupled plasma optical emission spectrometry (ICP-OES). Two digestion procedures have been carried out in a closed microwave system using: (1) HCl + HNO3 + H2O2; (2) HNO3 + HF, whereas the remaining two in an open system using: (1) aqua regia + NH4F, HNO3, H2SO4; (2) HF + HClO4, H3BO3, HCl. Among these four procedures the microwave digestion system (1) gave the best recovery results. The quality of the analytical results has been evaluated by the analysis of the CTA-FFA-1 Fine Fly Ash Certified Reference Material. A good agreement between the measured and reference values was found for almost all elements. The precision was assessed from replicate analyses using microwave digestion system (1), and the relative standard deviation (R.S.D.) was found to be less than 5%.

  10. Review of data analysis procedures for the ATS-6 millimeter wave experiment

    NASA Technical Reports Server (NTRS)

    Meneghini, R.

    1975-01-01

    Predictions of satellite downlink attenuation through the use of ground based measurements form a substantial part of the ATS-6 millimeter wave experiment (MWE). At the downlink frequencies (20 and 30 GHz), the major causes of attenuation are the density and the size distribution of rain drops along the propagation path. Ground station data, which include radar and rain gauge records, measure quantities related to the meteorological parameters of interest and thereby provide a prediction of downlink attenuation with which the measured attenuation can be compared. The calibration and data analysis procedures used in the MWE are reviewed with the object of improving the accuracy of such ground based predictions.

  11. Automatic Method of Supernovae Classification by Modeling Human Procedure of Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Módolo, Marcelo; Rosa, Reinaldo; Guimaraes, Lamartine N. F.

    2016-07-01

    The classification of a recently discovered supernova must be done as quickly as possible in order to define what information will be captured and analyzed in the following days. This classification is not trivial and only a few expert astronomers are able to perform it. This paper proposes an automatic method that models the human procedure of classification. It uses Multilayer Perceptron Neural Networks to analyze the supernovae spectra. Experiments were performed using different pre-processing and multiple neural network configurations to identify the classic types of supernovae. Significant results were obtained indicating the viability of using this method in places that have no specialist or that require an automatic analysis.
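
    A minimal sketch of the classification stage, assuming pre-processed spectra (e.g., rebinned, normalized flux vectors) in X and supernova type labels in y; the scikit-learn pipeline, layer sizes, and cross-validation scheme are illustrative choices, not the configurations tested in the paper.

        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        def supernova_classifier(X, y):
            """Standardize spectral features and fit a multilayer perceptron classifier."""
            model = make_pipeline(
                StandardScaler(),
                MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0),
            )
            accuracy = cross_val_score(model, X, y, cv=5).mean()
            return model.fit(X, y), accuracy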

  12. SU-E-T-269: Differential Hazard Analysis For Conventional And New Linac Acceptance Testing Procedures

    SciTech Connect

    Harry, T; Yaddanapudi, S; Mutic, S; Cai, B; Goddu, S; Noel, C; Pawlicki, T

    2015-06-15

    Purpose: New techniques and materials have recently been developed to expedite the conventional Linac Acceptance Testing Procedure (ATP). The new ATP method uses the Electronic Portal Imaging Device (EPID) for data collection and is presented separately. This new procedure is meant to be more efficient than conventional methods. While not clinically implemented yet, a prospective risk assessment is warranted for any new technique. The purpose of this work is to investigate the risks and establish the pros and cons between the conventional approach and the new ATP method. Methods: ATP tests that were modified and performed with the EPID were analyzed. Five domain experts (Medical Physicists) comprised the core analysis team. Ranking scales were adopted from previous publications related to TG 100. The number of failure pathways for each ATP test procedure was compared, as was the number of risk priority numbers (RPNs) greater than 100. Results: There were fewer failure pathways with the new ATP compared to the conventional, 262 and 556, respectively. There were fewer RPNs > 100 in the new ATP compared to the conventional, 41 and 115, respectively. Failure pathways and RPNs > 100 for individual ATP tests were on average 2 and 3.5 times higher in the conventional ATP compared to the new, respectively. The pixel sensitivity map of the EPID was identified as a key hazard to the new ATP procedure, with an RPN of 288 for verifying beam parameters. Conclusion: The significant decrease in failure pathways and RPNs > 100 for the new ATP reduces the possibility of a catastrophic error occurring. The pixel sensitivity map determining the response and inherent characteristics of the EPID is crucial, as all data and hence results are dependent on that process. Grant from Varian Medical Systems Inc.

  13. A limited assessment of the ASEP human reliability analysis procedure using simulator examination results

    SciTech Connect

    Gore, B.R.; Dukelow, J.S. Jr.; Mitts, T.M.; Nicholson, W.L.

    1995-10-01

    This report presents a limited assessment of the conservatism of the Accident Sequence Evaluation Program (ASEP) human reliability analysis (HRA) procedure described in NUREG/CR-4772. In particular, the ASEP post-accident, post-diagnosis, nominal HRA procedure is assessed within the context of an individual's performance of critical tasks on the simulator portion of requalification examinations administered to nuclear power plant operators. An assessment of the degree to which operator performance during simulator examinations is an accurate reflection of operator performance during actual accident conditions was outside the scope of work for this project; therefore, no direct inference can be made from this report about such performance. The data for this study are derived from simulator examination reports from the NRC requalification examination cycle. A total of 4071 critical tasks were identified, of which 45 had been failed. The ASEP procedure was used to estimate human error probability (HEP) values for critical tasks, and the HEP results were compared with the failure rates observed in the examinations. The ASEP procedure was applied by PNL operator license examiners who supplemented the limited information in the examination reports with expert judgment based upon their extensive simulator examination experience. ASEP analyses were performed for a sample of 162 critical tasks selected randomly from the 4071, and the results were used to characterize the entire population. ASEP analyses were also performed for all of the 45 failed critical tasks. Two tests were performed to assess the bias of the ASEP HEPs compared with the data from the requalification examinations. The first compared the average of the ASEP HEP values with the fraction of the population that actually failed, and found a statistically significant factor-of-two bias on average.
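
    The bias check described above amounts to comparing the mean ASEP-estimated HEP with the observed failure fraction (45 failures in 4071 critical tasks). The sketch below frames that comparison as a binomial test; the HEP sample is invented, and this is not necessarily the statistical test used in the study.

        import numpy as np
        from scipy.stats import binomtest

        hep_estimates = np.array([0.02, 0.03, 0.01, 0.05, 0.02, 0.04])  # illustrative ASEP HEPs
        mean_hep = hep_estimates.mean()

        n_tasks, n_failed = 4071, 45
        observed_rate = n_failed / n_tasks

        # Is the observed failure count consistent with the mean ASEP-estimated HEP?
        result = binomtest(n_failed, n_tasks, p=mean_hep)
        print(f"mean HEP = {mean_hep:.4f}, observed rate = {observed_rate:.4f}, "
              f"p-value = {result.pvalue:.3g}")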

  14. Procedure for implementation of temperature-dependent mechanical property capability in the Engineering Analysis Language (EAL) system

    NASA Technical Reports Server (NTRS)

    Glass, David E.; Robinson, James C.

    1990-01-01

    A procedure is presented to allow the use of temperature dependent mechanical properties in the Engineering Analysis Language (EAL) System for solid structural elements. This is accomplished by including a modular runstream in the main EAL runstream. The procedure is applicable for models with multiple materials and with anisotropic properties, and can easily be incorporated into an existing EAL runstream. The procedure (which is applicable for EAL elastic solid elements) is described in detail, followed by a description of the validation of the routine. A listing of the EAL runstream used to validate the procedure is included in the Appendix.

  15. 31 CFR 1022.520 - Special information sharing procedures to deter money laundering and terrorist activity for money...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance:Treasury 3 2014-07-01 2014-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for money services businesses. 1022.520 Section 1022.520 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FINANCIAL...

  16. 31 CFR 1022.520 - Special information sharing procedures to deter money laundering and terrorist activity for money...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance:Treasury 3 2011-07-01 2011-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for money services businesses. 1022.520 Section 1022.520 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FINANCIAL...

  17. 31 CFR 1022.520 - Special information sharing procedures to deter money laundering and terrorist activity for money...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 3 2013-07-01 2013-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for money services businesses. 1022.520 Section 1022.520 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FINANCIAL...

  18. 31 CFR 1022.520 - Special information sharing procedures to deter money laundering and terrorist activity for money...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 31 Money and Finance:Treasury 3 2012-07-01 2012-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for money services businesses. 1022.520 Section 1022.520 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FINANCIAL...

  19. Small Schools Science Curriculum, K-3: Reading, Language Arts, Mathematics, Science, Social Studies. Scope, Objectives, Activities, Resources, Monitoring Procedures.

    ERIC Educational Resources Information Center

    Hartl, David, Ed.; And Others

    Learning objectives and suggested activities, monitoring procedures and resources for the Washington K-3 Small Schools Science Curriculum are based on the rationale that "young children need the opportunity to observe, classify, predict, test ideas again and again in a variety of contexts, ask questions, explain, discuss ideas, fail, and succeed.…

  20. 31 CFR 1029.520 - Special information sharing procedures to deter money laundering and terrorist activity for loan...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance:Treasury 3 2014-07-01 2014-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for loan or finance companies. 1029.520 Section 1029.520 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FINANCIAL...

  1. 31 CFR 1025.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance:Treasury 3 2014-07-01 2014-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for insurance companies. 1025.520 Section 1025.520 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FINANCIAL...

  2. 31 CFR 1028.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 3 2013-07-01 2013-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for operators of credit card systems. 1028.520 Section 1028.520 Money and Finance: Treasury Regulations Relating to Money and Finance...

  3. 31 CFR 1020.520 - Special information sharing procedures to deter money laundering and terrorist activity for banks.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 3 2013-07-01 2013-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for banks. 1020.520 Section 1020.520 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FINANCIAL CRIMES...

  4. 31 CFR 1024.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance:Treasury 3 2014-07-01 2014-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for mutual funds. 1024.520 Section 1024.520 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FINANCIAL...

  5. 31 CFR 1021.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 3 2013-07-01 2013-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for casinos and card clubs. 1021.520 Section 1021.520 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FINANCIAL...

  6. 31 CFR 1027.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance:Treasury 3 2014-07-01 2014-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for dealers in precious metals, precious stones, or jewels. 1027.520 Section 1027.520 Money and Finance: Treasury Regulations Relating to Money and...

  7. 31 CFR 1029.520 - Special information sharing procedures to deter money laundering and terrorist activity for loan...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 3 2013-07-01 2013-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for loan or finance companies. 1029.520 Section 1029.520 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FINANCIAL...

  8. 31 CFR 1024.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 3 2013-07-01 2013-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for mutual funds. 1024.520 Section 1024.520 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FINANCIAL...

  9. 31 CFR 1021.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance:Treasury 3 2014-07-01 2014-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for casinos and card clubs. 1021.520 Section 1021.520 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FINANCIAL...

  10. 31 CFR 1020.520 - Special information sharing procedures to deter money laundering and terrorist activity for banks.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance:Treasury 3 2014-07-01 2014-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for banks. 1020.520 Section 1020.520 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FINANCIAL CRIMES...

  11. 31 CFR 1027.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 3 2013-07-01 2013-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for dealers in precious metals, precious stones, or jewels. 1027.520 Section 1027.520 Money and Finance: Treasury Regulations Relating to Money and...

  12. 31 CFR 1028.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance:Treasury 3 2014-07-01 2014-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for operators of credit card systems. 1028.520 Section 1028.520 Money and Finance: Treasury Regulations Relating to Money and Finance...

  13. 31 CFR 1030.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance:Treasury 3 2014-07-01 2014-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for housing government sponsored enterprises. 1030.520 Section 1030.520 Money and Finance: Treasury Regulations Relating to Money and Finance...

  14. 31 CFR 1026.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 3 2013-07-01 2013-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for futures commission merchants and introducing brokers in commodities. 1026.520 Section 1026.520 Money and Finance: Treasury Regulations Relating...

  15. 31 CFR 1025.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 3 2013-07-01 2013-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for insurance companies. 1025.520 Section 1025.520 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FINANCIAL...

  16. 31 CFR 1026.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance:Treasury 3 2014-07-01 2014-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for futures commission merchants and introducing brokers in commodities. 1026.520 Section 1026.520 Money and Finance: Treasury Regulations Relating...

  17. 31 CFR 1023.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance:Treasury 3 2014-07-01 2014-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for brokers or dealers in securities. 1023.520 Section 1023.520 Money and Finance: Treasury Regulations Relating to Money and Finance...

  18. 31 CFR 1023.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 3 2013-07-01 2013-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for brokers or dealers in securities. 1023.520 Section 1023.520 Money and Finance: Treasury Regulations Relating to Money and Finance...

  19. 45 CFR 660.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 3 2011-10-01 2011-10-01 false What procedures apply to the selection of programs and activities under these regulations? 660.6 Section 660.6 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION INTERGOVERNMENTAL REVIEW OF THE NATIONAL...

  20. Small Schools Music Curriculum, K-3: Scope, Objectives, Activities, Resources, Monitoring Procedures. The Comprehensive Arts in Education Program.

    ERIC Educational Resources Information Center

    Ott, Mary Lou, Comp.

    By following the Washington Small Schools Curriculum format of listing learning objectives with recommended grade placement levels and suggested activities, monitoring procedures, and resources used in teaching, this music curriculum for grades K-3 encourages teacher involvement and decision making. Goals for the program focus on the student,…

  1. Teaching Core Content Embedded in a Functional Activity to Students with Moderate Intellectual Disability Using a Simultaneous Prompting Procedure

    ERIC Educational Resources Information Center

    Karl, Jennifer; Collins, Belva C.; Hager, Karen D.; Ault, Melinda Jones

    2013-01-01

    The purpose of this study was to investigate the effects of a simultaneous prompting procedure in teaching four secondary students with moderate intellectual disability to acquire and generalize core content embedded in a functional activity. Data gathered within the context of a multiple probe design revealed that all participants learned the…

  2. 31 CFR 1028.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance:Treasury 3 2011-07-01 2011-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for operators of credit card systems. 1028.520...) FINANCIAL CRIMES ENFORCEMENT NETWORK, DEPARTMENT OF THE TREASURY RULES FOR OPERATORS OF CREDIT CARD...

  3. 31 CFR 1028.520 - Special information sharing procedures to deter money laundering and terrorist activity for...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 31 Money and Finance:Treasury 3 2012-07-01 2012-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for operators of credit card systems. 1028.520...) FINANCIAL CRIMES ENFORCEMENT NETWORK, DEPARTMENT OF THE TREASURY RULES FOR OPERATORS OF CREDIT CARD...

  4. 40 CFR 29.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 1 2011-07-01 2011-07-01 false What procedures apply to the selection of programs and activities under these regulations? 29.6 Section 29.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL INTERGOVERNMENTAL REVIEW OF ENVIRONMENTAL PROTECTION AGENCY PROGRAMS...

  5. 40 CFR 29.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 1 2013-07-01 2013-07-01 false What procedures apply to the selection of programs and activities under these regulations? 29.6 Section 29.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL INTERGOVERNMENTAL REVIEW OF ENVIRONMENTAL PROTECTION AGENCY PROGRAMS...

  6. 40 CFR 29.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 1 2012-07-01 2012-07-01 false What procedures apply to the selection of programs and activities under these regulations? 29.6 Section 29.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL INTERGOVERNMENTAL REVIEW OF ENVIRONMENTAL PROTECTION AGENCY PROGRAMS...

  7. 40 CFR 29.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 1 2010-07-01 2010-07-01 false What procedures apply to the selection of programs and activities under these regulations? 29.6 Section 29.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL INTERGOVERNMENTAL REVIEW OF ENVIRONMENTAL PROTECTION AGENCY PROGRAMS...

  8. 40 CFR 29.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 1 2014-07-01 2014-07-01 false What procedures apply to the selection of programs and activities under these regulations? 29.6 Section 29.6 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL INTERGOVERNMENTAL REVIEW OF ENVIRONMENTAL PROTECTION AGENCY PROGRAMS...

  9. 45 CFR 660.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 3 2013-10-01 2013-10-01 false What procedures apply to the selection of programs and activities under these regulations? 660.6 Section 660.6 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION INTERGOVERNMENTAL REVIEW OF THE NATIONAL...

  10. 45 CFR 660.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 3 2014-10-01 2014-10-01 false What procedures apply to the selection of programs and activities under these regulations? 660.6 Section 660.6 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION INTERGOVERNMENTAL REVIEW OF THE NATIONAL...

  11. 45 CFR 660.6 - What procedures apply to the selection of programs and activities under these regulations?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 3 2012-10-01 2012-10-01 false What procedures apply to the selection of programs and activities under these regulations? 660.6 Section 660.6 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION INTERGOVERNMENTAL REVIEW OF THE NATIONAL...

  12. 31 CFR 1029.520 - Special information sharing procedures to deter money laundering and terrorist activity for loan...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 31 Money and Finance:Treasury 3 2012-07-01 2012-07-01 false Special information sharing procedures to deter money laundering and terrorist activity for loan or finance companies. 1029.520 Section 1029.520 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FINANCIAL...

  13. Total body nitrogen analysis. [neutron activation analysis

    NASA Technical Reports Server (NTRS)

    Palmer, H. E.

    1975-01-01

    Studies of two potential in vivo neutron activation methods for determining total and partial body nitrogen in animals and humans are described. A method using the CO-11 in the expired air as a measure of nitrogen content was found to be adequate for small animals such as rats, but inadequate for human measurements due to a slow excretion rate. Studies on the method of measuring the induced N-13 in the body show that with further development, this method should be adequate for measuring muscle mass changes occurring in animals or humans during space flight.

  14. 75 FR 18211 - Agency Information Collection Activities; Proposed Collection; Comment Request; Procedures for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-09

    ... Collection; Comment Request; Procedures for the Safe and Sanitary Processing and Importing of Fish and... of FDA's regulations requiring reporting and recordkeeping for processors and importers of fish and..., 2010. ADDRESSES: Submit electronic comments on the collection of information to...

  15. Ultrasonic dispersion of soils for routine particle size analysis: recommended procedures

    SciTech Connect

    Heller, P.R.; Hayden, R.E.; Gee, G.W.

    1984-11-01

    Ultrasonic techniques were found to be more effective than standard mechanical techniques (i.e., using a dispersing agent and mechanical mixing) to disperse soils for routine particle-size analysis. Soil samples were tested using an ultrasonic homogenizer at various power outputs. The samples varied widely in texture and mineralogy, and included sands, silts, clays, volcanic soils, and soils high in organic matter. A combination of chemical and ultrasonic dispersion techniques was used in all tests. Hydrometer techniques were used for particle-size analysis. For most materials tested, clay percentage values indicated that ultrasonic dispersion was more complete than mechanical dispersion. Soils high in volcanic ash or iron oxides showed 10 to 20 wt % more clay when using ultrasonic mixing rather than mechanical mixing. The recommended procedure requires ultrasonic dispersion of a 20- to 40-g sample for 15 min at 300 W with a 1.9-cm-diameter ultrasonic homogenizer. 12 references, 5 figures, 1 table.

  16. Skart: A skewness- and autoregression-adjusted batch-means procedure for simulation analysis

    NASA Astrophysics Data System (ADS)

    Tafazzoli Yazdi, Ali

    We discuss Skart, an automated batch-means procedure for constructing a skewness- and autoregression-adjusted confidence interval (CI) for the steady-state mean of a simulation output process in either discrete time (i.e., observation-based statistics) or continuous time (i.e., time-persistent statistics). Skart is a sequential procedure designed to deliver a CI that satisfies user-specified requirements concerning not only the CI's coverage probability but also the absolute or relative precision provided by its half-length. Skart exploits separate adjustments to the half-length of the classical batch-means CI so as to account for the effects on the distribution of the underlying Student's t-statistic that arise from skewness (nonnormality) and autocorrelation of the batch means. The skewness adjustment is based on a modified Cornish-Fisher expansion for the classical batch-means Student's t-ratio, and the autocorrelation adjustment is based on an autoregressive approximation to the batch-means process for sufficiently large batch sizes. Skart also delivers a point estimator for the steady-state mean that is approximately free of initialization bias. The duration of the associated warm-up period (i.e., the statistics clearing time) is based on iteratively applying von Neumann's randomness test to spaced batch means with progressively increasing batch sizes and interbatch spacer sizes. In an experimental performance evaluation involving a wide range of test processes, Skart compared favorably with other simulation analysis methods, namely its predecessors ASAP3, WASSP, and SBatch, as well as ABATCH, LBATCH, the Heidelberger-Welch procedure, and the Law-Carson procedure. Specifically, Skart exhibited competitive sampling efficiency and substantially closer conformance to the given CI coverage probabilities than the other procedures. Also presented is a nonsequential version of Skart, called N-Skart, in which the user supplies a single simulation-generated series of
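
    The adjustments above are applied on top of the classical batch-means confidence interval. As a point of reference only, the sketch below implements that plain, unadjusted batch-means CI in Python on a synthetic AR(1) output series; the batch count and the series are illustrative assumptions, and none of Skart's skewness, autoregression, or warm-up adjustments are included.

    import numpy as np
    from scipy import stats

    def batch_means_ci(y, n_batches=32, alpha=0.05):
        # Classical (unadjusted) batch-means CI for the steady-state mean of y.
        m = len(y) // n_batches                      # batch size; remainder discarded
        bmeans = np.asarray(y[:m * n_batches]).reshape(n_batches, m).mean(axis=1)
        grand = bmeans.mean()
        half = stats.t.ppf(1 - alpha / 2, n_batches - 1) * bmeans.std(ddof=1) / np.sqrt(n_batches)
        return grand, (grand - half, grand + half)

    # Autocorrelated AR(1) series standing in for simulation output.
    rng = np.random.default_rng(0)
    y = np.zeros(20000)
    for i in range(1, y.size):
        y[i] = 0.8 * y[i - 1] + rng.normal()
    print(batch_means_ci(y))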

  17. Item Unique Identification Capability Expansion: Established Process Analysis, Cost Benefit Analysis, and Optimal Marking Procedures

    DTIC Science & Technology

    2014-12-01

    ...involves the bonding of a material to the substrate surface using the heat generated by a neodymium-doped (Nd) YAG laser or equivalent. The materials used

  18. An Analysis of Discounting Procedures and Risk Analysis Techniques for Use in the Department of Defense.

    DTIC Science & Technology

    1979-12-01

    risky investments by an amount equal to the cost of risk bearing. Thus the true opportunity cost of the private investments foregone is the value placed...Discount Rate," Water Resources Research, Vol. 5: 947-957 (October 1969). 19. Hertz, David B. "Risk Analysis in Capital Investment," Harvard Business Review

  19. Back Analysis Procedure for Identification of Anisotropic Elastic Parameters of Overcored Rock Specimens

    NASA Astrophysics Data System (ADS)

    Espada, M.; Lamas, L.

    2017-03-01

    This paper presents a back analysis procedure for identification of the elastic parameters of transversely isotropic rock cores, containing an overcoring triaxial strain probe, from the strains measured during a biaxial test. A three-dimensional finite element model was developed to simulate the biaxial test on the overcored rock specimen and to compute the strains at the location of the strain gauges. Different optimisation algorithms were tested and the most suitable one was selected. The back analysis procedure was tested for identification of the five elastic parameters and the two orientation angles that characterise a transversely isotropic rock core. Although convergence was reached with the developed methodology and all of those parameters could be identified, sensitivity analyses demonstrated that the results obtained were not stable and therefore not reliable. By introducing constraints based on common practice and previous experience, a stable and robust methodology was achieved: the three elastic parameters, E1, E2 and ν2, are reliably identified using the value of G2 calculated with the Saint-Venant expression and a fixed value of ν1, while the orientation parameters are obtained from observation of the overcored rock. Analysis of the results shows that application of this methodology represents an enormous step forward when compared with the traditional use of isotropy. Besides, the methodology is general and can also be used with other types of overcoring equipment. The five elastic parameters and the two orientation angles obtained can then be used, together with the overcoring strains, to compute the complete in situ state of stress.
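
    A least-squares back analysis of this kind can be prototyped with a general-purpose optimiser. The sketch below is a minimal, hypothetical illustration: a linear map stands in for the 3-D finite element model that predicts gauge strains from trial parameters, and the three free parameters loosely play the role of E1, E2 and ν2; none of the values, gauge counts or constraints come from the paper.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(1)
    sensitivity = rng.normal(size=(9, 3))          # hypothetical: 9 strain gauges, 3 parameters

    def predicted_strains(params):
        # Stand-in for the FE simulation of the biaxial test (linear for simplicity).
        return sensitivity @ params

    true_params = np.array([40.0, 20.0, 0.25])     # illustrative "E1, E2, nu2"
    measured = predicted_strains(true_params) + rng.normal(scale=0.05, size=9)

    def misfit(params):
        r = predicted_strains(params) - measured   # residual between model and measurement
        return float(r @ r)

    result = minimize(misfit, x0=np.ones(3), method="Nelder-Mead")
    print(result.x)                                # should land close to true_params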

  20. A comparative analysis of British and Taiwanese students' conceptual and procedural knowledge of fraction addition

    NASA Astrophysics Data System (ADS)

    Li, Hui-Chuan

    2014-10-01

    This study examines students' procedural and conceptual achievement in fraction addition in England and Taiwan. A total of 1209 participants (561 British students and 648 Taiwanese students) at ages 12 and 13 were recruited from England and Taiwan to take part in the study. A quantitative design by means of a self-designed written test is adopted as central to the methodological considerations. The test has two major parts: the concept part and the skill part. The former is concerned with students' conceptual knowledge of fraction addition and the latter is interested in students' procedural competence when adding fractions. There were statistically significant differences both in concept and skill parts between the British and Taiwanese groups, with the latter having higher scores. The analysis of the students' responses to the skill section indicates that the superiority of Taiwanese students' procedural achievements over those of their British peers is because most of the former are able to apply algorithms to adding fractions far more successfully than the latter. Earlier, Hart [1] reported that around 30% of the British students in their study used an erroneous strategy (adding tops and bottoms, for example, 2/3 + 1/7 = 3/10) while adding fractions. This study also finds that nearly the same percentage of the British group continued to use this erroneous strategy when adding fractions as Hart found in 1981. The study also provides evidence to show that students' understanding of fractions is confused and incomplete, even among those who are able to perform the operations successfully. More research needs to be done to help students make sense of the operations and eventually attain computational competence with meaningful grounding in the domain of fractions.

  1. Development of a Quality Assurance Procedure for Dose Volume Histogram Analysis

    NASA Astrophysics Data System (ADS)

    Davenport, David A.

    The role of the dose-volume histogram (DVH) is rapidly expanding in radiation oncology treatment planning. DVHs are already relied upon to differentiate between two similar plans and evaluate organ-at-risk dosage. Their role will become even more important as progress continues towards implementing biologically based treatment planning systems. Therefore it is imperative that the accuracy of DVHs is evaluated and reappraised after any major software or hardware upgrades affecting a treatment planning system (TPS). The purpose of this work is to create and implement a comprehensive quality assurance procedure evaluating dose-volume histograms to ensure their accuracy while satisfying American College of Radiology guidelines. Virtual phantoms of known volumes were created in Pinnacle TPS and exposed to different beam arrangements. Variables including grid size and slice thickness were varied and their effects were analyzed. The resulting DVHs were evaluated by comparison to the commissioned percent depth dose values using a custom Excel spreadsheet. After determining the uncertainty of the DVH based on these variables, multiple second check calculations were performed using MIM Maestro and Matlab software packages. The uncertainties of the DVHs were shown to be less than ± 3%. The average uncertainty was shown to be less than ± 1%. The second check procedures resulted in mean percent differences less than 1%, which confirms the accuracy of DVH calculation in Pinnacle and the effectiveness of the quality assurance template. The importance of knowing the limits of accuracy of the DVHs, which are routinely used to assess the quality of clinical treatment plans, cannot be overestimated. The developed comprehensive QA procedure evaluating the accuracy of the DVH statistical analysis will become a part of our clinical arsenal for periodic tests of the treatment planning system. It will also be performed at the time of commissioning and after any major software
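
    For context, a cumulative DVH simply reports the fraction of a structure's volume receiving at least each dose level. The sketch below computes one for a toy spherical structure in a synthetic dose grid; the phantom geometry, dose field and bin width are invented for illustration and are unrelated to the Pinnacle, MIM Maestro or Matlab workflows described above.

    import numpy as np

    def cumulative_dvh(dose, mask, bin_width=0.1):
        # Fraction of the masked structure's volume receiving at least each dose level.
        d = dose[mask]
        edges = np.arange(0.0, d.max() + bin_width, bin_width)
        return edges, np.array([(d >= e).mean() for e in edges])

    # Toy phantom: 50^3 grid, spherical "organ", synthetic dose field in Gy.
    z, y, x = np.mgrid[0:50, 0:50, 0:50]
    organ = (x - 25) ** 2 + (y - 25) ** 2 + (z - 25) ** 2 <= 10 ** 2
    dose = 60.0 * np.exp(-((x - 25) ** 2 + (y - 25) ** 2) / 800.0)

    edges, vol = cumulative_dvh(dose, organ)
    print(f"D50 (minimum dose covering 50% of the volume): {edges[vol >= 0.5][-1]:.1f} Gy")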

  2. Kriging analysis of geochemical data obtained by sequential extraction procedure (BCR)

    NASA Astrophysics Data System (ADS)

    Fajkovic, Hana; Pitarević Svedružić, Lovorka; Prohić, Esad; Rončević, Sanda; Nemet, Ivan

    2015-04-01

    Field examination and laboratory analysis were performed to establish whether the nonsanitary landfill Bastijunski brig has a negative influence on Vransko Lake, situated only 1500 m away. Vransko Lake is Croatia's largest natural lake, and it is a part of the Nature Park and ornithological reserve, which indicates its high biodiversity. Therefore it is necessary to understand the environmental processes and the complex sediment/water interface. Lake sediments are considered to be a good "sinkhole" and are often the final recipients of anthropogenic and natural pollutants through adsorption onto the organic or clay fraction in sediments. Geochemical investigations were carried out on more than 50 lake sediment cores situated in different parts of the lake. Speciation of heavy metals by a modified BCR sequential extraction procedure, with the addition of the first step of the Tessier sequential extraction procedure and analysis of the residual fraction by aqua regia, was used to determine the amounts of selected elements (Al, Cd, Cr, Co, Cu, Fe, Mn, Ni, Pb, Zn) in different fractions. With such an approach it is possible to determine which elements will be extracted from sediment/soil under different environmental conditions; this is a valuable tool for interpreting the mobile, bioavailable fraction of the elements that presents a threat to biota in the event of contaminant concentration magnification. All sediment and soil samples were analyzed by inductively coupled plasma atomic emission spectrometry. More accurate interpretation of the data is an advantage of the BCR sequential extraction procedure, while the large amount of point-type data could be considered a drawback. Due to the large amount of data, graphical presentation is advisable, and interpolation is the first choice for point-type data, as it makes predictions for a defined area based on the measurements. Distribution maps of analysed elements were obtained by kriging as a geostatistical method and
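
    As a very small illustration of the interpolation step, the sketch below performs ordinary kriging at a single point with NumPy, using an assumed exponential variogram; the core coordinates, concentrations and variogram parameters are synthetic and have nothing to do with the Vransko Lake data set.

    import numpy as np

    def variogram(h, nugget=0.05, sill=1.0, a=300.0):
        # Assumed exponential variogram model; parameters are illustrative only.
        return nugget + sill * (1.0 - np.exp(-h / a))

    def ordinary_kriging(xy, z, target):
        # Solve the ordinary kriging system [gamma_ij 1; 1 0][w; mu] = [gamma_i0; 1].
        n = len(z)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = variogram(d)
        np.fill_diagonal(A[:n, :n], 0.0)           # gamma(0) = 0 by definition
        A[n, n] = 0.0
        b = np.ones(n + 1)
        b[:n] = variogram(np.linalg.norm(xy - target, axis=1))
        w = np.linalg.solve(A, b)[:n]
        return float(w @ z)

    # Synthetic core locations (m) and element concentrations (mg/kg).
    xy = np.array([[0, 0], [100, 50], [250, 80], [300, 300], [120, 260]], dtype=float)
    z = np.array([12.0, 15.5, 9.8, 7.2, 11.1])
    print(ordinary_kriging(xy, z, np.array([150.0, 150.0])))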

  3. Sensitivity analysis aimed at blood vessels detection using interstitial optical tomography during brain needle biopsy procedures.

    PubMed

    Pichette, Julien; Goyette, Andréanne; Picot, Fabien; Tremblay, Marie-Andrée; Soulez, Gilles; Wilson, Brian C; Leblond, Frédéric

    2015-11-01

    A brain needle biopsy procedure is performed for suspected brain lesions in order to sample tissue that is subsequently analysed using standard histopathology techniques. A common complication resulting from this procedure is brain hemorrhaging from blood vessels clipped off during tissue extraction. Interstitial optical tomography (iOT) has recently been introduced by our group as a means to assess the presence of blood vessels in the vicinity of the needle. The clinical need to improve safety requires the detection of blood vessels within 2 mm from the outer surface of the needle, since this distance is representative of the volume of tissue that is aspirated during tissue extraction. Here, a sensitivity analysis is presented to establish the intrinsic detection limits of iOT based on simulations and experiments using brain tissue phantoms. It is demonstrated that absorbers can be detected with diameters >300 μm located up to >2 mm from the biopsy needle core for bulk optical properties consistent with brain tissue.

  4. Sensitivity analysis aimed at blood vessels detection using interstitial optical tomography during brain needle biopsy procedures

    PubMed Central

    Pichette, Julien; Goyette, Andréanne; Picot, Fabien; Tremblay, Marie-Andrée; Soulez, Gilles; Wilson, Brian C.; Leblond, Frédéric

    2015-01-01

    A brain needle biopsy procedure is performed for suspected brain lesions in order to sample tissue that is subsequently analysed using standard histopathology techniques. A common complication resulting from this procedure is brain hemorrhaging from blood vessels clipped off during tissue extraction. Interstitial optical tomography (iOT) has recently been introduced by our group as a means to assess the presence of blood vessels in the vicinity of the needle. The clinical need to improve safety requires the detection of blood vessels within 2 mm from the outer surface of the needle, since this distance is representative of the volume of tissue that is aspirated during tissue extraction. Here, a sensitivity analysis is presented to establish the intrinsic detection limits of iOT based on simulations and experiments using brain tissue phantoms. It is demonstrated that absorbers can be detected with diameters >300 μm located up to >2 mm from the biopsy needle core for bulk optical properties consistent with brain tissue. PMID:26600990

  5. An open source software project for obstetrical procedure scheduling and occupancy analysis.

    PubMed

    Isken, Mark W; Ward, Timothy J; Littig, Steven J

    2011-03-01

    Increases in the rate of births via cesarean section and induced labor have led to challenging scheduling and capacity planning problems for hospital inpatient obstetrical units. We present occupancy and patient scheduling models to help address these challenges. These patient flow models can be used to explore the relationship between procedure scheduling practices and the resulting occupancy on inpatient obstetrical units such as labor and delivery and postpartum. The models capture numerous important characteristics of inpatient obstetrical patient flow such as time of day and day of week dependent arrivals and length of stay, multiple patient types and clinical interventions, and multiple patient care units with inter-unit patient transfers. We have used these models in several projects at different hospitals involving design of procedure scheduling templates and analysis of inpatient obstetrical capacity. In the development of these models, we made heavy use of open source software tools and have released the entire project as a free and open source model and software toolkit.
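
    To make the occupancy idea concrete, the short sketch below tallies hour-of-week census from a set of synthetic arrivals and lengths of stay; the arrival pattern, stay distribution and unit are invented and do not reproduce the authors' released toolkit.

    import numpy as np

    rng = np.random.default_rng(4)
    n_patients = 400
    arrival_hour = rng.integers(0, 168, size=n_patients)            # hour of week, 0..167
    los_hours = rng.gamma(shape=4.0, scale=12.0, size=n_patients)   # length of stay (h)

    # Hour-by-hour occupancy over a wrapped week: each patient occupies a bed from
    # arrival until discharge, and hours past the end of the week wrap around.
    occupancy = np.zeros(168)
    for a, l in zip(arrival_hour, los_hours):
        for h in range(int(np.ceil(l))):
            occupancy[(a + h) % 168] += 1

    peak = int(occupancy.argmax())
    print(f"peak census {occupancy.max():.0f} beds at day {peak // 24}, hour {peak % 24}")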

  6. Light Water Reactor Sustainability Program: Computer-Based Procedures for Field Activities: Results from Three Evaluations at Nuclear Power Plants

    SciTech Connect

    Oxstrand, Johanna; Le Blanc, Katya; Bly, Aaron

    2014-09-01

    The Computer-Based Procedure (CBP) research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which is a research and development (R&D) program sponsored by the Department of Energy (DOE) and performed in close collaboration with industry R&D programs that provides the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. One of the primary missions of the LWRS program is to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. One area that could yield tremendous savings in increased efficiency and safety is in improving procedure use. Nearly all activities in the nuclear power industry are guided by procedures, which today are printed and executed on paper. This paper-based procedure process has proven to ensure safety; however, there are improvements to be gained. Due to its inherent dynamic nature, a CBP provides the opportunity to incorporate context-driven job aids, such as drawings, photos, and just-in-time training. Compared to the static state of paper-based procedures (PBPs), the presentation of information in CBPs can be much more flexible and tailored to the task, actual plant condition, and operation mode. The dynamic presentation of the procedure will guide the user down the path of relevant steps, thus minimizing the time spent by the field worker evaluating plant conditions and making decisions related to the applicability of each step. This dynamic presentation of the procedure also minimizes the risk of conducting steps out of order and/or incorrectly assessing the applicability of steps.

  7. Procedures for experimental measurement and theoretical analysis of large plastic deformations

    NASA Technical Reports Server (NTRS)

    Morris, R. E.

    1974-01-01

    Theoretical equations are derived and analytical procedures are presented for the interpretation of experimental measurements of large plastic strains in the surface of a plate. Orthogonal gage lengths established on the metal surface are measured before and after deformation. The change in orthogonality after deformation is also measured. Equations yield the principal strains, deviatoric stresses in the absence of surface friction forces, true stresses if the stress normal to the surface is known, and the orientation angle between the deformed gage line and the principal stress-strain axes. Errors in the measurement of nominal strains greater than 3 percent are within engineering accuracy. Applications suggested for this strain measurement system include the large-strain-stress analysis of impact test models, burst tests of spherical or cylindrical pressure vessels, and to augment small-strain instrumentation tests where large strains are anticipated.

  8. A Procedure for 3-D Contact Stress Analysis of Spiral Bevel Gears

    NASA Technical Reports Server (NTRS)

    Kumar, A.; Bibel, G.

    1994-01-01

    Contact stress distribution of spiral bevel gears using nonlinear finite element static analysis is presented. Procedures have been developed to solve the nonlinear equations that identify the gear and pinion surface coordinates based on the kinematics of the cutting process and orientate the pinion and the gear in space to mesh with each other. Contact is simulated by connecting GAP elements along the intersection of a line from each pinion point (parallel to the normal at the contact point) with the gear surface. A three dimensional model with four gear teeth and three pinion teeth is used to determine the contact stresses at two different contact positions in a spiral bevel gearset. A summary of the elliptical contact stress distribution is given. This information will be helpful to helicopter and aircraft transmission designers who need to minimize weight of the transmission and maximize reliability.

  9. Experimental and theoretical analysis on the procedure for estimating geo-stresses by the Kaiser effect

    NASA Astrophysics Data System (ADS)

    Li, Yuan-Hui; Yang, Yu-Jiang; Liu, Jian-Po; Zhao, Xing-Dong

    2010-10-01

    Acoustic emission tests of the core specimens retrieved from boreholes at depths over 1000 m in Hongtoushan Copper Mine were carried out under uniaxial compressive loading, and the numerical test was also done by using the rock failure process analysis (RFPA2D) code, based on the procedure for estimating geo-stresses by the Kaiser effect under uniaxial compression. According to the statistical damage mechanics theory, the Kaiser effect mechanism was analyzed. Based on these analyses, it is indicated that the traditional method of estimating geo-stresses by the Kaiser effect is not appropriate, and the result is usually smaller than the real one. Furthermore, the greater confining compression in the rock mass may result in a larger difference between the Kaiser effect stresses acquired from uniaxial loading in the laboratory and the real in-situ stresses.

  10. Monitoring and analysis of gravel-packing procedures to explain well performance

    SciTech Connect

    McLeod, H.O. Jr. ); Minarovic, M.J. )

    1994-10-01

    Gravel-packed gas wells completed in the Gulf of Mexico since 1980 were reviewed to build a selective database for a completion-effectiveness study. Gas wells with clean, uniform sands were selected for analysis. Significant monitoring data identified were injectivity tests at different points during the completion and fluid loss rates (barrels per hour). Injectivity before gravel packing and productivity after gravel packing were classified according to sidewall-core permeabilities. Different gravel-pack preparation and execution techniques were reviewed. Fluid-loss-control pills were identified as the greatest source of damage restricting gravel-packed well productivity. Injectivity tests and sidewall-core permeabilities provide valuable information for monitoring well completion procedures.

  11. 75 FR 43974 - Agency Information Collection Activities; Proposed Collection; Comment Request; Procedures for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... Implementing the National Environmental Policy Act and Assessing the Environmental Effects Abroad of EPA...: Procedures for Implementing the National Environmental Policy Act and Assessing the Environmental Effects... certain EPA regulations is consolidated in 40 CFR part 9. Abstract: The National Environmental Policy...

  12. A procedure to find thermodynamic equilibrium constants for CO2 and CH4 adsorption on activated carbon.

    PubMed

    Trinh, T T; van Erp, T S; Bedeaux, D; Kjelstrup, S; Grande, C A

    2015-03-28

    Thermodynamic equilibrium for adsorption means that the chemical potential of gas and adsorbed phase are equal. A precise knowledge of the chemical potential is, however, often lacking, because the activity coefficient of the adsorbate is not known. Adsorption isotherms are therefore commonly fitted to ideal models such as the Langmuir, Sips or Henry models. We propose here a new procedure to find the activity coefficient and the equilibrium constant for adsorption which uses the thermodynamic factor. Instead of fitting the data to a model, we calculate the thermodynamic factor and use this to find first the activity coefficient. We show, using published molecular simulation data, how this procedure gives the thermodynamic equilibrium constant and enthalpies of adsorption for CO2(g) on graphite. We also use published experimental data to find similar thermodynamic properties of CO2(g) and of CH4(g) adsorbed on activated carbon. The procedure gives a higher accuracy in the determination of enthalpies of adsorption than ideal models do.
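
    As an illustration of the quantity this procedure builds on, the sketch below estimates the thermodynamic factor Γ = ∂ln p/∂ln θ numerically from a synthetic Langmuir isotherm and checks it against the analytical Langmuir result Γ = 1/(1 − θ); the isotherm constant is an assumed value, and this is not the authors' full workflow for extracting activity coefficients and equilibrium constants.

    import numpy as np

    K = 0.02                              # assumed Langmuir constant, 1/kPa
    p = np.linspace(1.0, 400.0, 80)       # gas pressure, kPa
    theta = K * p / (1.0 + K * p)         # fractional loading from the Langmuir isotherm

    # Thermodynamic factor Gamma = d ln p / d ln theta, by finite differences.
    gamma_numeric = np.gradient(np.log(p), np.log(theta))
    gamma_langmuir = 1.0 / (1.0 - theta)  # analytical value for a Langmuir isotherm

    print(f"max deviation from analytical: {np.max(np.abs(gamma_numeric - gamma_langmuir)):.3f}")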

  13. Biomagnetic activity and non linear analysis in obstetrics and gynecology in a Greek population.

    PubMed

    Anninos, P; Anastasiadis, P; Adamopoulos, A; Kotini, A

    2016-01-01

    This article reports the application of non-linear analysis to biomagnetic signals recorded from fetal growth restriction, fetal brain activity, ovarian lesions, breast lesions, umbilical arteries, uterine myomas, and uterine arteries in a Greek population. The results were correlated with clinical findings. The biomagnetic measurements and the application of non-linear analysis are promising procedures in Obstetrics and Gynecology.

  14. Neutron activation analysis of total diet food composites for iodine

    SciTech Connect

    Allegrini, M.; Boyer, K.W.; Tanner, J.T.

    1981-09-01

    The iodine content of Total Diet food composites was measured using neutron activation analysis. The interfering element chlorine was separated using a modified combustion and gas phase procedure. The average recovery was 94.8% (standard deviation 2.9) for the 10 matrices that were tested. In addition, iodine was measured in National Bureau of Standards Standard Reference Materials, which have no certified values for this element. Preliminary findings of iodine content of adult Total Diet market baskets collected during Fiscal Year 1980 in different regions of the United States ranged from 292 to 901 μg/day for a 2900 kcal intake.
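
    For orientation, the induced activity in any such irradiation follows the standard activation equation A = Nσφ(1 − exp(−λ t_irr)) exp(−λ t_d). The sketch below evaluates it for the 127I(n,γ)128I reaction with approximate nuclear data (cross-section of roughly 6.2 b and a 128I half-life of roughly 25 min, both treated here as assumptions); the sample mass, flux and timing are invented and are not taken from the study above.

    import numpy as np

    N_A = 6.022e23
    sigma = 6.2e-24                 # cm^2 (~6.2 barn, approximate I-127 thermal cross-section)
    lam = np.log(2) / (25.0 * 60)   # 1/s, from an assumed 25-min I-128 half-life

    def induced_activity(mass_i_g, flux, t_irr, t_decay):
        # A = N * sigma * phi * (1 - exp(-lambda*t_irr)) * exp(-lambda*t_decay), in Bq.
        n_atoms = mass_i_g / 126.9 * N_A
        return n_atoms * sigma * flux * (1 - np.exp(-lam * t_irr)) * np.exp(-lam * t_decay)

    # 500 micrograms of iodine, 5 min at 1e12 n cm^-2 s^-1, counted 2 min after irradiation.
    print(f"{induced_activity(500e-6, 1e12, 300, 120):.2e} Bq")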

  15. Maori heads (mokomokai): the usefulness of a complete forensic analysis procedure.

    PubMed

    Charlier, Philippe; Huynh-Charlier, Isabelle; Brun, Luc; Champagnat, Julie; Laquay, Laetitia; Hervé, Christian

    2014-09-01

    Based on an analysis of 19 mummified Maori heads (mokomokai) referred to our forensic laboratory for anthropological analysis prior to their official repatriation from France to New Zealand, and data from the anthropological and medical literature, we propose a complete forensic procedure for the analysis of such pieces. A list of 12 original morphological criteria was developed. Items included the sex, age at death, destruction of the skull base, the presence of argil deposits in the inner part of the skull, nostrils closed with exogenous material, sewing of eyelids and lips, pierced earlobes, ante-mortem and/or post-mortem tattoos, the presence of vegetal fibers within nasal cavities, and other pathological or anthropological anomalies. These criteria were tested for all 19 mokomokai repatriated to New Zealand by the French authorities. Further complementary analyses were limited to fiberscopic examination of the intracranial cavities because of the taboo on any sampling requested by the Maori authorities. In the context of global repatriation of human artifacts to native communities, this type of anthropological expertise is increasingly frequently requested of forensic anthropologists and other practitioners. We discuss the reasons for and against repatriating non-authentic artifacts to such communities and the role played by forensic anthropologists during the authentication process.

  16. Analysis of integrated fuel-efficient, low-noise procedures in terminal-area operations

    SciTech Connect

    McKinley, J.B.

    1981-01-01

    The specific aviation energy conservation issues, terminal area fuel conservation and airport noise level relationships, are investigated. The first objective of the study was to quantify the potential fuel savings and noise level reduction in the Los Angeles International (LAX) terminal area between 1980 and 1990 attributable to compliance with the noise requirements of FAR Part 36. These savings will be due to the retiring, retrofitting, and re-engining of older narrow-body aircraft (DC-8, B707, etc.) and the growth of wide-body aircraft operations (DC-10, B747, B767, etc.). The second objective was to determine what current noise abatement procedures could be relaxed without adversely impacting current (1980) noise levels while at the same time conserving additional fuel. To accomplish these objectives, two FAA computer models were used. The Integrated Noise Model (INM), Version 2.7, was used for noise analysis, and LINKMOD, a preliminary fuel burn model, for the fuel analysis. The results of this detailed analysis revealed that, due to the changing aircraft mix at LAX toward more wide-body aircraft and fewer narrow-body aircraft operations, the area within the 75 Ldn noise contour will decrease by 8.5 and 9.2 square miles in 1985 and 1990, respectively, relative to the 1980 baseline.

  17. A single extraction and HPLC procedure for simultaneous analysis of phytosterols, tocopherols and lutein in soybeans.

    PubMed

    Slavin, Margaret; Yu, Liangli Lucy

    2012-12-15

    A saponification/extraction procedure and high performance liquid chromatography (HPLC) analysis method were developed and validated for simultaneous analysis of phytosterols, tocopherols and lutein (a carotenoid) in soybeans. Separation was achieved on a phenyl column with a ternary, isocratic solvent system of acetonitrile, methanol and water (48:22.5:29.5, v/v/v). Evaporative light scattering detection (ELSD) was used to quantify β-sitosterol, stigmasterol, campesterol, and α-, δ- and γ-tocopherols, while lutein was quantified with visible light absorption at 450 nm. Peak identification was verified by retention times and spikes with external standards. Standard curves were constructed (R² > 0.99) to allow for sample quantification. Recovery of the saponification and extraction was demonstrated via analysis of spiked samples. Also, the accuracy of results for four soybeans using the described saponification and HPLC analytical method was validated against existing methods. This method offers a more efficient alternative to individual methods for quantifying lutein, tocopherols and sterols in soybeans.

  18. An extraction/concentration procedure for analysis of low-level explosives in soils.

    PubMed

    Felt, D R; Larson, S L; Escalon, L

    2008-06-30

    The methods traditionally used for explosives analysis in soil matrices have inherent data quality limitations for low-level samples. The traditional methods employ a soil-dilution extraction of the sample prior to analysis by high performance liquid chromatography with UV absorption detection. Another concern with the traditional analysis is that energetics contamination in environmental samples is often very heterogeneous in nature, usually requiring a large number of samples and multiple testing. The technique presented here addresses these data quality limitations by using a concentrative extraction procedure which produces a small volume of extract from a large soil sample. A concentration factor of 60-fold is achieved in this manner and energetics detection limits for soils are lowered by two orders of magnitude. The larger soil sample size also helps reduce the error associated with sample heterogeneity. The ability to detect explosive-based contaminants at levels of environmental interest enables a more accurate assessment of the transport pathways and treatment options for explosives contamination.

  19. Heavy Metal and Trace Metal Analysis in Soil by Sequential Extraction: A Review of Procedures

    PubMed Central

    Zimmerman, Amanda Jo; Weindorf, David C.

    2010-01-01

    Quantification of heavy and trace metal contamination in soil can be arduous, requiring the use of lengthy and intricate extraction procedures which may or may not give reliable results. Of the many procedures in publication, some are designed to operate within specific parameters while others are designed for broader application. Most procedures have been modified since their inception, which creates ambiguity as to which procedure is most acceptable in a given situation. For this study, the Tessier, Community Bureau of Reference (BCR), Short, Galán, and Geological Society of Canada (GCS) procedures were examined to clarify benefits and limitations of each. Modifications of the Tessier, BCR, and GCS procedures were also examined. The efficacy of these procedures is addressed by looking at the soils used in each procedure, the limitations, applications, and future of sequential extraction. PMID:20414344

  20. Effects of an Activity-Based Anorexia Procedure on Within-Session Changes in Nose-Poke Responding

    ERIC Educational Resources Information Center

    Aoyama, Kenjiro

    2012-01-01

    This study tested the effects of an activity-based anorexia (ABA) procedure on within-session changes in responding. In the ABA group (N = 8), rats were given a 60-min feeding session and allowed to run in a running wheel for the remainder of each day. During the daily 60-min feeding session, each nose-poke response was reinforced by a food…

  1. EEG Σ and slow-wave activity during NREM sleep correlate with overnight declarative and procedural memory consolidation.

    PubMed

    Holz, Johannes; Piosczyk, Hannah; Feige, Bernd; Spiegelhalder, Kai; Baglioni, Chiara; Riemann, Dieter; Nissen, Christoph

    2012-12-01

    Previous studies suggest that sleep-specific brain activity patterns such as sleep spindles and electroencephalographic slow-wave activity contribute to the consolidation of novel memories. The generation of both sleep spindles and slow-wave activity relies on synchronized oscillations in a thalamo-cortical network that might be implicated in synaptic strengthening (spindles) and downscaling (slow-wave activity) during sleep. This study further examined the association between electroencephalographic power during non-rapid eye movement sleep in the spindle (sigma, 12-16 Hz) and slow-wave frequency range (0.1-3.5 Hz) and overnight memory consolidation in 20 healthy subjects (10 men, 27.1 ± 4.6 years). We found that both electroencephalographic sigma power and slow-wave activity were positively correlated with the pre-post-sleep consolidation of declarative (word list) and procedural (mirror-tracing) memories. These results, although only correlative in nature, are consistent with the view that processes of synaptic strengthening (sleep spindles) and synaptic downscaling (slow-wave activity) might act in concert to promote synaptic plasticity and the consolidation of both declarative and procedural memories during sleep.
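
    The analysis pattern described here, band-limited spectral power correlated with an overnight memory change score, can be sketched in a few lines. The example below uses Welch's method for the sigma band on synthetic data; the sampling rate, signal construction and scores are invented, and the band edges simply follow the ranges quoted in the abstract.

    import numpy as np
    from scipy.signal import welch
    from scipy.stats import pearsonr

    def band_power(eeg, fs, lo, hi):
        # Integrated spectral power of one channel in [lo, hi] Hz, via Welch's method.
        f, pxx = welch(eeg, fs=fs, nperseg=4 * fs)
        band = (f >= lo) & (f <= hi)
        return np.trapz(pxx[band], f[band])

    fs = 200
    rng = np.random.default_rng(2)
    sigma_power, memory_gain = [], []
    for _ in range(20):                              # 20 synthetic "subjects"
        t = np.arange(0, 60, 1 / fs)                 # 60 s of NREM EEG each
        spindle_amp = rng.uniform(0.0, 2.0)
        eeg = rng.normal(size=t.size) + spindle_amp * np.sin(2 * np.pi * 13.5 * t)
        sigma_power.append(band_power(eeg, fs, 12.0, 16.0))
        memory_gain.append(5.0 * spindle_amp + rng.normal())   # score loosely tied to spindles

    r, p = pearsonr(sigma_power, memory_gain)
    print(f"r = {r:.2f}, p = {p:.3g}")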

  2. Neutron Activation Analysis of Water - A Review

    NASA Technical Reports Server (NTRS)

    Buchanan, John D.

    1971-01-01

    Recent developments in this field are emphasized. After a brief review of basic principles, topics discussed include sources of neutrons, pre-irradiation physical and chemical treatment of samples, neutron capture and gamma-ray analysis, and selected applications. Applications of neutron activation analysis of water have increased rapidly within the last few years and may be expected to increase in the future.

  3. Function Allocation in Complex Socio-Technical Systems: Procedure usage in nuclear power and the Context Analysis Method for Identifying Design Solutions (CAMIDS) Model

    NASA Astrophysics Data System (ADS)

    Schmitt, Kara Anne

    This research aims to prove that strict adherence to procedures and rigid compliance to process in the US nuclear industry may not prevent incidents or increase safety. According to the Institute of Nuclear Power Operations, the nuclear power industry has seen a recent rise in events, and this research claims that a contributing factor to this rise is organizational and cultural, and based on people's overreliance on procedures and policy. Understanding the proper balance of function allocation, automation and human decision-making is imperative to creating a nuclear power plant that is safe, efficient, and reliable. This research claims that new generations of operators are less engaged in thinking because they have been instructed to follow procedures to a fault. According to operators, they were once expected to know the plant and its interrelations, but organizationally more importance is now put on following procedure and policy. Literature reviews were performed, experts were questioned, and a model for context analysis was developed. The Context Analysis Method for Identifying Design Solutions (CAMIDS) Model was created, verified and validated through both peer review and application in real-world scenarios in active nuclear power plant simulators. These experiments, which studied the industry's propensity for following incorrect procedures and the cases in which this directly affects plant safety or security, supported the claim that strict adherence and rigid compliance to procedures may not increase safety. The findings of this research indicate that the younger generations of operators rely highly on procedures, and the organizational pressures of required compliance to procedures may lead to incidents within the plant because operators feel pressured into following rules and policy rather than performing the correct actions in a timely manner. The findings support computer-based procedures, efficient alarm systems, and skill-of-the-craft matrices. The solution to

  4. Obsessions and compulsions in the lab: A meta-analysis of procedures to induce symptoms of obsessive-compulsive disorder.

    PubMed

    De Putter, Laura M S; Van Yper, Lotte; Koster, Ernst H W

    2017-03-01

    Efficacious induction procedures of symptoms of obsessive-compulsive disorder (OCD) are necessary in order to test central tenets of theories on OCD. However, the efficacy of the current range of induction procedures remains unclear. Therefore, this meta-analysis set out to examine the efficacy of induction procedures in participants with and without OCD symptoms. Moreover, we explored whether the efficacy varied across different moderators (i.e., induction categories, symptom dimensions of OCD, modalities of presentation, and level of individual tailoring). In total we included 4900 participants across 90 studies. The analyses showed that there was no difference in studies using subclinical and clinical participants, confirming the utility of analogue samples. Induction procedures evoked more symptoms in (sub)clinical OCD than in healthy participants, which was most evident in the contamination symptom dimension of OCD. Analysis within (sub)clinical OCD showed a large effect size of induction procedures, especially for the threat and responsibility category and when stimuli were tailored to individuals. Analysis within healthy participants showed a medium effect size of induction procedures. The magnitude of the effect in healthy individuals was stronger for mental contamination, thought-action fusion and threat inductions.

  5. Supervised pattern recognition procedures for discrimination of whiskeys from gas chromatography/mass spectrometry congener analysis.

    PubMed

    González-Arjona, Domingo; López-Pérez, Germán; González-Gallero, Víctor; González, A Gustavo

    2006-03-22

    The volatile congener analysis of 52 commercialized whiskeys (24 samples of single malt Scotch whiskey, 18 samples of bourbon whiskey, and 10 samples of Irish whiskey) was carried out by gas chromatography/mass spectrometry after liquid-liquid extraction with dichloromethane. Pattern recognition procedures were applied for discrimination of different whiskey categories. Multivariate data analysis includes linear discriminant analysis (LDA), k nearest neighbors (KNN), soft independent modeling of class analogy (SIMCA), procrustes discriminant analysis (PDA), and artificial neural network techniques involving multilayer perceptrons (MLP) and probabilistic neural networks (PNN). Classification rules were validated by considering the number of false positives (FPs) and false negatives (FNs) of each class associated with the prediction set. Artificial neural networks led to the best results because of their intrinsic nonlinear features. Both techniques, MLP and PNN, gave zero FPs and zero FNs for all of the categories. KNN is a nonparametric method that also provides zero FPs and FNs for every class, but only when selecting K = 3 neighbors. PDA also produced good results (zero FPs and FNs always), but only when selecting nine principal components for class modeling. LDA showed poorer classification performance because it builds linear frontiers between classes, which does not apply in many real situations. LDA led to one FP for bourbons and one FN for scotches. The worst results were obtained with SIMCA, which gave a higher number of FPs (five for both scotches and bourbons) and FNs (six for scotches and two for bourbons). The possible cause of these findings is the strong influence of class inhomogeneities on the SIMCA performance. It is remarkable that, in every case, all of the methodologies lead to zero FPs and FNs for the Irish whiskeys.
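
    A generic version of the classification-and-validation step (not the authors' exact models or data) can be set up with scikit-learn. The sketch below trains KNN with K = 3 and LDA on synthetic stand-ins for the congener concentrations and counts per-class false positives and false negatives from the confusion matrix.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.metrics import confusion_matrix

    # Synthetic stand-in for the congener matrix: 3 classes x 100 samples x 10 "congeners".
    rng = np.random.default_rng(3)
    X = np.vstack([rng.normal(loc=c, size=(100, 10)) for c in (0.0, 1.5, 3.0)])
    y = np.repeat(["scotch", "bourbon", "irish"], 100)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

    labels = ["scotch", "bourbon", "irish"]
    for name, clf in [("KNN (K=3)", KNeighborsClassifier(n_neighbors=3)),
                      ("LDA", LinearDiscriminantAnalysis())]:
        cm = confusion_matrix(y_te, clf.fit(X_tr, y_tr).predict(X_te), labels=labels)
        fp = cm.sum(axis=0) - np.diag(cm)   # predicted as the class but truly another class
        fn = cm.sum(axis=1) - np.diag(cm)   # truly the class but predicted as another class
        print(name, dict(zip(labels, zip(fp, fn))))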

  6. Analysis of the cold-water restraint procedure in gastric ulceration and body temperature.

    PubMed

    Landeira-Fernandez, J

    2004-10-15

    Gastric mucosal injury induced by body restraint can be enhanced when combined with cold-water immersion. Based on this fact, the present study had two main purposes: (i) to examine the contribution of each of these two forms of stress on the development of gastric ulceration and regulation of body temperature and (ii) to investigate the importance of the animal's consciousness on gastric ulceration induced by the cold-water restraint. Independent groups of animals were exposed for 3 h to one of the following stressful treatments: body restraint plus cold-water (20 ± 1 °C) immersion, body restraint alone or cold-water immersion alone. Control animals were not exposed to any form of stress. Half of the animals submitted to each of the four treatments were anesthetized with thionembutal (35 mg/kg), whereas the other half was injected with saline. Results indicated that body restraint alone was not sufficient to induce gastric ulceration or changes in body temperature. On the other hand, cold-water exposure, either alone or in conjunction with body restraint, induced the same amount of stomach erosions and hypothermia. Therefore, it appears that body restraint does not play an important role on gastric ulceration induced by the cold-water restraint procedure. Present results also indicated that conscious and anesthetized animals immersed in cold water presented robust gastric ulceration and a marked drop in body temperature. However, conscious animals developed more severe gastric damage in comparison to anesthetized animals although both groups presented the same degree of hypothermia. These findings suggest that hypothermia resulting from cold-water exposure has a deleterious effect on gastric ulceration but the animal's conscious activity during the cold-water immersion increases the severity of gastric mucosal damage. It is concluded that cold-water restraint is a useful procedure for the study of the underlying mechanisms involved in stress

  7. An optimized procedure for exosome isolation and analysis using serum samples: Application to cancer biomarker discovery.

    PubMed

    Li, Mu; Rai, Alex J; DeCastro, G Joel; Zeringer, Emily; Barta, Timothy; Magdaleno, Susan; Setterquist, Robert; Vlassov, Alexander V

    2015-10-01

    Exosomes are RNA- and protein-containing nanovesicles secreted by all cell types and found in abundance in body fluids, including blood, urine and cerebrospinal fluid. These vesicles seem to be a perfect source of biomarkers, as their cargo largely reflects the content of parental cells, and exosomes originating from all organs can be obtained from the circulation through minimally invasive or non-invasive means. Here we describe an optimized procedure for exosome isolation and analysis using clinical samples, starting from quick and robust extraction of exosomes with Total exosome isolation reagent, followed by isolation of RNA and qRT-PCR. The effectiveness of this workflow is exemplified by analysis of the miRNA content of exosomes derived from serum samples obtained from patients with metastatic prostate cancer, treated prostate cancer patients who have undergone prostatectomy, and control patients without prostate cancer. Three promising exosomal microRNA biomarkers discriminating these groups were identified: hsa-miR375, hsa-miR21, and hsa-miR574.

  8. A simplified procedure for semi-targeted lipidomic analysis of oxidized phosphatidylcholines induced by UVA irradiation.

    PubMed

    Gruber, Florian; Bicker, Wolfgang; Oskolkova, Olga V; Tschachler, Erwin; Bochkov, Valery N

    2012-06-01

    Oxidized phospholipids (OxPLs) are increasingly recognized as signaling mediators that are not only markers of oxidative stress but are also "makers" of pathology relevant to disease pathogenesis. Understanding the biological role of individual molecular species of OxPLs requires the knowledge of their concentration kinetics in cells and tissues. In this work, we describe a straightforward "fingerprinting" procedure for analysis of a broad spectrum of molecular species generated by oxidation of the four most abundant species of polyunsaturated phosphatidylcholines (OxPCs). The approach is based on liquid-liquid extraction followed by reversed-phase HPLC coupled to electrospray ionization MS/MS. More than 500 peaks corresponding in retention properties to polar and oxidized PCs were detected within 8 min at 99 m/z precursor values using a single diagnostic product ion in extracts from human dermal fibroblasts. Two hundred seventeen of these peaks were fluence-dependently and statistically significantly increased upon exposure of cells to UVA irradiation, suggesting that these are genuine oxidized or oxidatively fragmented species. This method of semitargeted lipidomic analysis may serve as a simple first step for characterization of specific "signatures" of OxPCs produced by different types of oxidative stress in order to select the most informative peaks for identification of their molecular structure and biological role.

  9. Behavioral economic analysis of drug preference using multiple choice procedure data.

    PubMed

    Greenwald, Mark K

    2008-01-11

    The multiple choice procedure (MCP) has been used to evaluate preference for psychoactive drugs, relative to money amounts (price), in human subjects. The present re-analysis shows that MCP data are compatible with behavioral economic analysis of drug choices. Demand curves were constructed from studies with intravenous fentanyl, intramuscular hydromorphone and oral methadone in opioid-dependent individuals; oral d-amphetamine, oral MDMA alone and during fluoxetine treatment, and smoked marijuana alone or following naltrexone pretreatment in recreational drug users. For each participant and dose, the MCP crossover point was converted into unit price (UP) by dividing the money value ($) by the drug dose (mg/70 kg). At the crossover value, the dose ceases to function as a reinforcer, so "0" was entered for this and higher UPs to reflect lack of drug choice. At lower UPs, the dose functions as a reinforcer and "1" was entered to reflect drug choice. Data for UP vs. average percent choice were plotted in log-log space to generate demand functions. The rank order of opioid inelasticity (slope of the non-linear regression) was: fentanyl > hydromorphone (continuing heroin users) > methadone > hydromorphone (heroin abstainers). The rank order of psychostimulant inelasticity was d-amphetamine > MDMA > MDMA + fluoxetine. Smoked marijuana was more inelastic with high-dose naltrexone. These findings show that this method translates individuals' drug preferences into estimates of population demand, which has the potential to yield insights into pharmacotherapy efficacy, abuse liability assessment, and individual differences in susceptibility to drug abuse.
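
    A minimal sketch of the unit-price conversion and demand-curve step described above. The dose, crossover values, and unit-price grid are invented placeholders, and a simple linear fit in log-log space stands in for the non-linear regression used in the paper.

```python
# Sketch: convert MCP crossover values into unit prices and estimate demand
# elasticity as the slope of log(% choice) vs log(unit price).
# Dose, crossover money values, and the UP grid are illustrative only.
import numpy as np

dose_mg_per_70kg = 20.0
crossover_dollars = np.array([4.0, 8.0, 16.0, 2.0, 8.0])   # one per participant
crossover_up = crossover_dollars / dose_mg_per_70kg        # unit price at crossover

up_grid = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.6])
# choice = 1 below a participant's crossover UP (drug still reinforcing), else 0
choices = (up_grid[None, :] < crossover_up[:, None]).astype(float)
pct_choice = choices.mean(axis=0) * 100

# Fit a slope in log-log space, ignoring points where average choice is zero
mask = pct_choice > 0
slope, intercept = np.polyfit(np.log10(up_grid[mask]), np.log10(pct_choice[mask]), 1)
print("demand-curve slope (elasticity estimate):", round(slope, 3))
```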

  10. [Ocra Method: development of a new procedure for analysis of multiple tasks subject to infrequent rotation].

    PubMed

    Occhipinti, E; Colombini, Daniela; Occhipinti, M

    2008-01-01

    In the Ocra methods (Ocra index and Ocra Checklist), when computing the final indices (Ocra index or checklist score) for more than one repetitive task, a "traditional" procedure was already proposed, the results of which could be described as a "time-weighted average". This approach appears to be appropriate when considering rotations among tasks that are performed very frequently, for instance almost once every hour (or at shorter intervals). However, when rotation among repetitive tasks is less frequent (i.e. once every 1.5 hours or more), the "time-weighted average" approach could result in an underestimation of the exposure level (as it practically flattens peaks of high exposure). For those scenarios an alternative approach based on the "most stressful task as minimum" may be more realistic. This latter approach has already been included in the NIOSH approach for multiple sequential lifting tasks and, given the recent availability in the Ocra method of more detailed duration multipliers (practically a different DuM for each one-hour step of duration of the repetitive task), it is now possible to define a specific procedure to compute the complex Ocra Multitask Index (cOCRA) and the complex Checklist Score (cCHESCO) for the analysis of two or more repetitive tasks when rotations are infrequent (every 1.5 hours or more). The result of this approach will be at least equal to the index of the most stressful task considered for its individual daily duration and at most equal to the index of the most stressful task when it is (only theoretically) considered as lasting for the overall daily duration of all the examined repetitive tasks. The procedure is based on the following formula: Complex Ocra Multitask Index = ocra1(Dum1) + (Δocra1 × K), where 1, 2, 3, ..., N = repetitive tasks ordered by ocra index values (1 = highest; N = lowest) computed considering their respective real duration multipliers (DuMi); ocra1 = ocra index of
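
    A minimal sketch contrasting the two aggregation strategies discussed above: the "time-weighted average" (appropriate for frequent rotation) and the "most stressful task as minimum" lower bound (for infrequent rotation). The scores and durations are invented, and the sketch deliberately does not reproduce the full complex Ocra Multitask Index formula, whose terms are only partially given in the abstract.

```python
# Sketch: two ways to aggregate exposure scores over multiple repetitive tasks.
# Scores and durations are illustrative only; this is NOT the full cOCRA formula.
task_scores = {"task_A": 6.0, "task_B": 3.5, "task_C": 2.0}   # Ocra-like scores
task_hours = {"task_A": 2.0, "task_B": 3.0, "task_C": 3.0}

total_hours = sum(task_hours.values())

# "Time-weighted average": suitable for frequent rotation, but flattens peaks
twa = sum(task_scores[t] * task_hours[t] for t in task_scores) / total_hours

# "Most stressful task as minimum": lower bound used when rotation is infrequent
most_stressful = max(task_scores.values())

print(f"time-weighted average: {twa:.2f}")
print(f"most-stressful-task lower bound: {most_stressful:.2f}")
```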

  11. A Meta-Analysis of Bilateral Essure® Procedural Placement Success Rates on First Attempt

    PubMed Central

    Frietze, Gabriel; Rahman, Mahbubur; Rouhani, Mahta; Berenson, Abbey B.

    2015-01-01

    Background: The Essure® (Bayer HealthCare Pharmaceuticals, Leverkusen, Germany) female sterilization procedure entails using a hysteroscope to guide a microinsert into the Fallopian tube openings. Failed placement can lead to patient dissatisfaction, repeat procedures, unintended or ectopic pregnancy, perforation of internal organs, or the need for subsequent medical interventions. Additional interventions increase women's health risks as well as costs for patients and the health care industry. Reported successful placement rates range from 63% to 100%. To date, there have not been any systematic analyses of variables associated with placement rates. Objectives: The aims of this review were: (1) to estimate the average rate of successful bilateral Essure microinsert placement on first attempt; and (2) to identify variables associated with successful placement. Materials and Methods: A meta-analysis was conducted on 64 published studies and 19 variables. Following Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, all studies published between November 2001 and February 2015 that reported variables associated with successful bilateral Essure placement rates were reviewed; studies were identified from PubMed and Google Scholar and by using the "snowball" method. Results: The weighted average rate of successful bilateral microinsert placement on first attempt was 92% (0.92 [95% confidence interval: 0.904-0.931]). Variables associated with successful placements were: (1) newer device models; (2) higher body mass index; and (3) a higher percent of patients who received local anesthesia. Conclusions: The data gathered for this review indicate that the highest bilateral success rates may be obtained by utilizing the newest Essure device model with local anesthesia in heavier patients. More standardized data reporting in published Essure studies is recommended. (J GYNECOL SURG 31:308) PMID:26633935
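
    A minimal sketch of one way a weighted average placement rate with a 95% confidence interval can be computed. The abstract does not state the exact weighting scheme used, so sample-size weighting with a normal-approximation interval is an assumption, and the study counts below are invented.

```python
# Sketch: sample-size-weighted average of per-study bilateral placement rates
# with a normal-approximation 95% CI. Study counts are illustrative placeholders.
import numpy as np

successes = np.array([180, 95, 310, 57])     # bilateral placements on first attempt
totals    = np.array([200, 100, 340, 60])    # attempted procedures per study

p_pooled = successes.sum() / totals.sum()
se = np.sqrt(p_pooled * (1 - p_pooled) / totals.sum())
lo, hi = p_pooled - 1.96 * se, p_pooled + 1.96 * se
print(f"pooled rate = {p_pooled:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```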

  12. Why and how did Israel adopt activity-based hospital payment? The Procedure-Related Group incremental reform.

    PubMed

    Brammli-Greenberg, Shuli; Waitzberg, Ruth; Perman, Vadim; Gamzu, Ronni

    2016-10-01

    Historically, Israel paid its non-profit hospitals on a per-diem (PD) basis. Recently, like other OECD countries, Israel has moved to activity-based payments. While most countries have adopted a diagnosis-related group (DRG) payment system, Israel has chosen a Procedure-Related Group (PRG) system. This differs from the DRG system because it classifies patients by procedure rather than by diagnosis. In Israel, the PRG system was found to be more feasible given the lack of data and information needed for the DRG classification system. The Ministry of Health (MoH) chose a payment scheme that depends only on in-house creation of PRG codes and costing, thus avoiding dependence on hospital data. The PRG tariffs are priced by a joint Health and Finance Ministry commission and updated periodically. Moreover, PRGs are believed to achieve the same main efficiency objectives as DRGs: increasing the volume of activity, shortening unnecessary hospitalization days, and reducing the gaps between the costs and prices of activities. The PRG system is being adopted through an incremental reform that started in 2002 and was accelerated in 2010. The Israeli MoH involved the main players in the hospital market in the consolidation of this potentially controversial reform in order to avoid opposition. The reform was implemented incrementally in order to preserve the balance of resource allocation and overall expenditures of the system, thus remaining budget neutral. Yet, as long as gaps remain between the marginal costs and prices of procedures, PRGs will not attain all their objectives. Moreover, it is still crucial to refine PRG rates to reflect the severity of cases, in order to tackle incentives for the selection of patients within each procedure.

  13. Role of laser irradiation in direct pulp capping procedures: a systematic review and meta-analysis.

    PubMed

    Javed, Fawad; Kellesarian, Sergio Varela; Abduljabbar, Tariq; Gholamiazizi, Elham; Feng, Changyong; Aldosary, Khaled; Vohra, Fahim; Romanos, Georgios E

    2017-02-01

    A variety of materials are available to treat exposed dental pulp by direct pulp capping. The healing response of the pulp is crucial to form a dentin bridge and seal off the exposed pulp. Studies have used lasers to stimulate the exposed pulp to form tertiary dentin. The aim of the present systematic review and meta-analysis was to evaluate the evidence on the effects of laser irradiation as an adjunctive therapy to stimulate healing after pulp exposure. A systematic literature search was conducted up to April 2016. A structured search using the keywords "Direct pulp capping," "Lasers," "Calcium hydroxide pulp capping," and "Resin pulp capping" was performed. Initially, 34 potentially relevant articles were identified. After removal of duplicates and screening by title, abstract, and full text when necessary, nine studies were included. Studies were assessed for bias, and data were synthesized using a random-effects meta-analysis model. Six studies were clinical trials and three were preclinical animal trials; the follow-up period ranged from 2 weeks to 54 months. More than two thirds of the included studies showed that laser therapy used as an adjunct to direct pulp capping was more effective in maintaining pulp vitality than conventional therapy alone. Meta-analysis showed that the success rate in the laser treatment group was significantly higher than in the control group (log odds ratio = 1.737; 95% confidence interval, 1.304-2.171). Laser treatment of exposed pulps can improve the outcome of direct pulp capping procedures, although a number of confounding factors may have influenced the outcomes of the included studies.
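
    A minimal sketch of a random-effects pooling of log odds ratios, the general kind of model referred to above. The DerSimonian-Laird estimator is used here as one common choice (the abstract does not name the estimator), and the 2x2 study counts are invented placeholders, not data from the review.

```python
# Sketch: DerSimonian-Laird random-effects pooling of log odds ratios.
# The per-study 2x2 counts below are invented for illustration.
import numpy as np

# successes/failures in the laser and control arms of three hypothetical studies
laser_success = np.array([28, 40, 22]); laser_fail = np.array([4, 6, 5])
ctrl_success  = np.array([20, 30, 15]); ctrl_fail  = np.array([12, 15, 12])

log_or = np.log((laser_success * ctrl_fail) / (laser_fail * ctrl_success))
var = 1/laser_success + 1/laser_fail + 1/ctrl_success + 1/ctrl_fail

w = 1 / var
fixed = np.sum(w * log_or) / np.sum(w)
Q = np.sum(w * (log_or - fixed) ** 2)
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(log_or) - 1)) / C)        # between-study variance

w_star = 1 / (var + tau2)
pooled = np.sum(w_star * log_or) / np.sum(w_star)
se = np.sqrt(1 / np.sum(w_star))
print(f"pooled log OR = {pooled:.3f} "
      f"(95% CI {pooled - 1.96*se:.3f} to {pooled + 1.96*se:.3f})")
```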

  14. Immunoassay and GC-MS procedures for the analysis of drugs of abuse in meconium.

    PubMed

    ElSohly, M A; Stanford, D F; Murphy, T P; Lester, B M; Wright, L L; Smeriglio, V L; Verter, J; Bauer, C R; Shankaran, S; Bada, H S; Walls, H C

    1999-10-01

    The analysis of meconium specimens for metabolites of substances of abuse is a relatively accurate method for the detection of fetal exposure to drugs. Most of the methods reported in the literature before the early 1990s relied on radioimmunoassays. The purpose of this study was to develop and validate methods for meconium sample preparation for the screening and gas chromatography-mass spectrometry (GC-MS) confirmation of meconium extracts for cannabinoids, cocaine, opiates, amphetamines, and phencyclidine. EMIT and TDx immunoassays were evaluated as screening methods. The sample preparation method developed for screening included extraction and purification prior to analysis. Cutoff levels were administratively set at 20 ng/g for 11-nor-delta9-THC-9-COOH (THCCOOH) and phencyclidine and at 200 ng/g for benzoylecgonine, morphine, and amphetamines, although lower levels could be detected in meconium using the EMIT-ETS system. Ninety-five meconium specimens were subjected to the screening procedure with GC-MS confirmation of presumptive positives. In addition, 30 (40 for cocaine) meconium specimens were subjected to GC-MS analysis for all analytes regardless of the screening results to determine the false-negative rate, if any, of the immunoassay. Although there were no false negatives detected, the GC-MS confirmation rate for the immunoassay-positive specimens was generally low, ranging from 0% for amphetamines to 75% for opiates. The lowest rate of confirmed positives was found with the cannabinoids, suggesting that tetrahydrocannabinol (THC) metabolites other than free 11-nor-9-carboxy-delta9-THC may be major contributors to the immunoassay response in meconium.

  15. Multifractal detrended fluctuation analysis of optogenetic modulation of neural activity

    NASA Astrophysics Data System (ADS)

    Kumar, S.; Gu, L.; Ghosh, N.; Mohanty, S. K.

    2013-02-01

    Here, we introduce a computational procedure to examine whether optogenetically activated neuronal firing recordings can be characterized as multifractal series. Optogenetics is emerging as a valuable experimental tool and a promising approach for studying a variety of neurological disorders in animal models. The spiking patterns from the cortical region of the brain of optogenetically stimulated transgenic mice were analyzed using a sophisticated fluctuation analysis method known as multifractal detrended fluctuation analysis (MFDFA). We observed that the optogenetically stimulated neural firings are consistent with a multifractal process. Further, we used MFDFA to monitor the effect of chemically induced pain (formalin injection) and of the optogenetic stimulation used to relieve the pain. In this case, dramatic changes in the parameters characterizing a multifractal series were observed. Both the generalized Hurst exponent and the width of the singularity spectrum effectively differentiate the neural activities during the control and pain-induction phases. The quantitative nature of the analysis equips us with better measures to quantify pain, and it provided a measure of the effectiveness of the optogenetic stimulation in inhibiting pain. Spiking data from other deep regions of the brain, analyzed with MFDFA, also turned out to be multifractal in nature, with subtle differences in the parameters during pain induction by formalin injection and inhibition by optogenetic stimulation. Characterization of neuronal firing patterns using MFDFA will lead to a better understanding of the neuronal response to optogenetic activation and the overall circuitry involved in the process.
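
    A minimal sketch of the MFDFA computation of the generalized Hurst exponent h(q). The scales, q values, detrending order, and the placeholder series below are assumptions for illustration; the authors' preprocessing of spike recordings is not reproduced.

```python
# Sketch: minimal multifractal detrended fluctuation analysis (MFDFA) of a
# 1-D series, estimating the generalized Hurst exponent h(q).
import numpy as np

def mfdfa(x, scales, qs, order=1):
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())                 # integrated profile
    hq = []
    for q in qs:
        fq = []
        for s in scales:
            n_seg = len(profile) // s
            segs = profile[: n_seg * s].reshape(n_seg, s)
            t = np.arange(s)
            var = np.empty(n_seg)                     # segment-wise variance
            for v in range(n_seg):                    # around a local polynomial trend
                coeffs = np.polyfit(t, segs[v], order)
                var[v] = np.mean((segs[v] - np.polyval(coeffs, t)) ** 2)
            if q == 0:
                fq.append(np.exp(0.5 * np.mean(np.log(var))))
            else:
                fq.append(np.mean(var ** (q / 2.0)) ** (1.0 / q))
        # h(q) is the slope of log F_q(s) versus log s
        hq.append(np.polyfit(np.log(scales), np.log(fq), 1)[0])
    return np.array(hq)

rng = np.random.default_rng(1)
series = rng.normal(size=4096)                        # placeholder spike-rate series
scales = np.array([16, 32, 64, 128, 256, 512])
qs = np.array([-3, -1, 0, 1, 3], dtype=float)
print(dict(zip(qs, np.round(mfdfa(series, scales, qs), 3))))
```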

  16. Analysis of a guided-response procedure in visual discriminations by rats.

    PubMed Central

    Aronsohn, S; Pinto-Hamuy, T; Toledo, P; Asenjo, P

    1987-01-01

    A guided-response procedure was used to train a visual pattern discrimination by rats in a modified Sutherland box. The method consisted of guiding the animal to the correct choice by means of a retractable bridge that led to reinforcers, followed by gradually removing this prompt. This method was compared to a stimulus-fading procedure, in which the initial differences between discriminative stimuli were gradually faded until they differed only with respect to the critical dimension for discrimination, and to a trial-and-error procedure. Both gradual procedures resulted in fewer errors compared to the trial-and-error procedure. The higher efficiency of the fading procedures was attributed to less aversiveness derived from performance with few errors and to the use of step-by-step requirements relative to the criterion performance. PMID:3612020

  17. Comparison of metabolic and biomechanic responses to active vs. passive warm-up procedures before physical exercise.

    PubMed

    Brunner-Ziegler, Sophie; Strasser, Barbara; Haber, Paul

    2011-04-01

    Active warm-up before physical exercise is a widely accepted practice to enhance physical performance, whereas data on modalities to passively raise tissue temperature are rare. The study compared the effect of active vs. passive warm-up procedures before exercise on energy supply and muscle strength performance. Twenty young, male volunteers performed 3 spiroergometer test series: without prior warm-up, after an active warm-up procedure, and after a passive warm-up procedure. Oxygen uptake (VO2), heart rate (HR), pH value, and lactate were determined at 80% of individual VO2max values and during recovery. Compared with passive warm-up, no prior warm-up produced lower pH values at the fourth test minute (p < 0.004) and higher lactate values at the sixth and third minutes of recovery (p < 0.01 and p < 0.010, respectively). Compared with passive warm-up, active warm-up produced lower HR and higher VO2 values at the fourth and sixth test minutes (p < 0.033 and p < 0.011, respectively, and p < 0.015 and p < 0.022, respectively). The differentiation between active and passive warm-up was more pronounced than that between either warm-up condition and no warm-up. Conditions that may promote improved performance were more evident after active than after passive warm-up. Thus, athletes may reach the metabolic steady state faster after active warm-up.

  18. Determination of Cd and Zn by isotope dilution-thermal ionisation mass spectrometry using a sequential analysis procedure.

    PubMed

    Ayoub, Ahmed S; McGaw, Brian A; Midwood, Andrew J

    2002-05-16

    Isotope dilution-thermal ionisation mass spectrometry (ID-TIMS) was used to examine the certified Cd and Zn content of four Certified Reference Materials (CRMs): two soils (GBW07401 and GBW07405), one plant (CRM060) and one animal tissue (SRM1566a). The CRMs were chosen to be of contrasting origin and Cd:Zn content. Three digestion procedures were compared: (i) an open-tube aqua regia procedure, (ii) microwave digestion using Teflon bombs and (iii) hydrofluoric acid (HF) digestion using PTFE bombs. The Cd and Zn levels obtained using ID-TIMS all fell within the published certified range for the CRMs. This was the case regardless of the digestion procedure used, although HF digestion tended to yield marginally higher levels than the other procedures and in one instance, Cd in GBW07401, was significantly different (P<0.05) from the certified range. A filament loading procedure was developed to allow sequential analysis of Cd and Zn on the same single filament during thermal ionisation mass spectrometry analysis. The sequential analysis technique was evaluated to ensure that Zn did not fractionate during Cd analysis and that there was no inter-element interference. No marked difference in the precision and accuracy of the isotope ratio measurements was observed between sequential element analyses on the same filament and individual element analyses, for a range of standard solutions and for sample digests. The most efficient procedure in terms of cost and productivity for future work of this kind would be a combination of microwave digestion and sequential analysis of Cd and Zn on the same filament.
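
    A minimal sketch of the generic single isotope dilution calculation underlying ID methods of this kind. The isotope abundances and the measured blend ratio below are invented placeholders, not values from this study, and the formula is the textbook single-spike form rather than the authors' specific data-reduction scheme.

```python
# Sketch: generic single isotope dilution. A spike of known amount and isotopic
# composition is mixed with the sample; the measured isotope-amount ratio R of
# the blend then yields the analyte amount in the sample.
def isotope_dilution_amount(n_spike, a_ref_spike, a_ref_sample,
                            a_mon_spike, a_mon_sample, r_blend):
    """Amount of analyte in the sample (same units as n_spike).

    a_ref_*: abundance of the reference isotope (numerator of R)
    a_mon_*: abundance of the other monitored isotope (denominator of R)
    r_blend: measured reference/monitored isotope ratio of the blend
    """
    return n_spike * (a_ref_spike - r_blend * a_mon_spike) / (
        r_blend * a_mon_sample - a_ref_sample)

# e.g. an enriched spike measured against a natural-abundance sample
# (all numbers below are invented for illustration)
print(isotope_dilution_amount(n_spike=1.0e-7,
                              a_ref_spike=0.95, a_ref_sample=0.128,
                              a_mon_spike=0.02, a_mon_sample=0.241,
                              r_blend=2.5))
```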

  19. Modular approach to customise sample preparation procedures for viral metagenomics: a reproducible protocol for virome analysis

    PubMed Central

    Conceição-Neto, Nádia; Zeller, Mark; Lefrère, Hanne; De Bruyn, Pieter; Beller, Leen; Deboutte, Ward; Yinda, Claude Kwe; Lavigne, Rob; Maes, Piet; Ranst, Marc Van; Heylen, Elisabeth; Matthijnssens, Jelle

    2015-01-01

    A major limitation to better understanding the role of the human gut virome in health and disease is the lack of validated methods that allow high-throughput virome analysis. To overcome this, we evaluated the quantitative effect of homogenisation, centrifugation, filtration, chloroform treatment and random amplification on a mock virome (containing nine highly diverse viruses) and a bacterial mock community (containing four faecal bacterial species) using quantitative PCR and next-generation sequencing. This resulted in an optimised protocol that was able to recover all viruses present in the mock virome and strongly altered the ratio of viral versus bacterial and 16S rRNA genetic material in favour of viruses (from 43.2% to 96.7% viral reads and from 47.6% to 0.19% bacterial reads). Furthermore, our study indicated that most of the currently used virome protocols, which use small filter pores and/or stringent centrifugation conditions, may have largely overlooked large viruses present in viromes. We propose NetoVIR (Novel enrichment technique of VIRomes), which allows a fast, reproducible and high-throughput sample preparation for viral metagenomics studies, introducing minimal bias. This procedure is optimised mainly for faecal samples, but with appropriate concentration steps it can also be used for other sample types with lower initial viral loads. PMID:26559140

  20. A targeted change-detection procedure by combining change vector analysis and post-classification approach

    NASA Astrophysics Data System (ADS)

    Ye, Su; Chen, Dongmei; Yu, Jie

    2016-04-01

    In remote sensing, conventional supervised change-detection methods usually require effective training data for multiple change types. This paper introduces a more flexible and efficient procedure that seeks to identify only the changes that users are interested in, hereafter referred to as "targeted change detection". Based on a one-class classifier, the Support Vector Domain Description (SVDD), a novel algorithm named "Three-layer SVDD Fusion" (TLSF) is developed specially for targeted change detection. The proposed algorithm combines one-class classification generated from change vector maps as well as from the before- and after-change images in order to obtain a more reliable detection result. In addition, this paper introduces a detailed workflow for implementing this algorithm. This workflow has been applied to two case studies with different practical monitoring objectives: urban expansion and forest fire assessment. The experimental results of these two case studies show that the overall accuracy of the proposed algorithm is superior (Kappa statistics are 86.3% and 87.8% for Cases 1 and 2, respectively) compared to applying SVDD to change vector analysis and to post-classification comparison.
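
    A minimal sketch of the one-class classification idea behind targeted change detection, using scikit-learn's OneClassSVM as a stand-in for SVDD. This is not the TLSF algorithm itself; the change-vector features, training pixels, and scene data below are synthetic placeholders.

```python
# Sketch: one-class classification for "targeted" change detection. Pixels are
# described by change-vector features; training uses only examples of the
# change type of interest. All data below are synthetic placeholders.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# change-vector features for pixels known to show the targeted change
train_target = rng.normal(loc=[2.0, -1.5], scale=0.3, size=(200, 2))
# a scene containing targeted change, other changes, and no-change pixels
scene = np.vstack([
    rng.normal(loc=[2.0, -1.5], scale=0.3, size=(50, 2)),   # targeted change
    rng.normal(loc=[-1.0, 1.0], scale=0.3, size=(50, 2)),   # other change
    rng.normal(loc=[0.0, 0.0], scale=0.1, size=(200, 2)),   # no change
])

scaler = StandardScaler().fit(train_target)
clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05)
clf.fit(scaler.transform(train_target))

pred = clf.predict(scaler.transform(scene))   # +1 = targeted change, -1 = other
print("pixels flagged as targeted change:", int((pred == 1).sum()), "of", len(scene))
```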

  1. Identifying combined design and analysis procedures in two-stage trials with a binary end point.

    PubMed

    Bowden, Jack; Wason, James

    2012-12-20

    Two-stage trial designs provide the flexibility to stop early for efficacy or futility and are popular because they have a smaller sample size on average than a traditional trial has with the same type I and II error rates. This makes them financially attractive but also has the ethical benefit of reducing, in the long run, the number of patients who are given ineffective treatments. Designs that minimise the expected sample size are often referred to as 'optimal'. However, two-stage designs can impart a substantial bias into the parameter estimate at the end of the trial. In this paper, we argue that the expected performance of one's chosen estimation method should also be considered when deciding on a two-stage trial design. We review the properties of standard and bias-adjusted maximum likelihood estimators as well as mean and median unbiased estimators. We then identify optimal two-stage design and analysis procedures that balance projected sample size considerations with those of estimator performance. We make available software to implement this new methodology.
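
    A minimal Monte Carlo sketch of the bias that motivates the paper: the naive response-rate estimate at the end of a two-stage single-arm trial with a futility stop does not average to the true rate. The design parameters (stage sizes and stopping rule) are illustrative assumptions, not the designs examined in the paper.

```python
# Sketch: Monte Carlo illustration of the bias of the naive response-rate
# estimate in a two-stage single-arm trial with a futility stop.
import numpy as np

rng = np.random.default_rng(0)
p_true, n1, n2, r1 = 0.30, 15, 20, 4     # stop after stage 1 if responses <= r1

estimates = []
for _ in range(100_000):
    x1 = rng.binomial(n1, p_true)
    if x1 <= r1:                          # early stop for futility
        estimates.append(x1 / n1)
    else:                                 # continue to stage 2
        x2 = rng.binomial(n2, p_true)
        estimates.append((x1 + x2) / (n1 + n2))

print("mean of naive estimate:", round(float(np.mean(estimates)), 4),
      "vs true p =", p_true)
```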

  2. A simplified calculation procedure for mass isotopomer distribution analysis (MIDA) based on multiple linear regression.

    PubMed

    Fernández-Fernández, Mario; Rodríguez-González, Pablo; García Alonso, J Ignacio

    2016-10-01

    We have developed a novel, rapid and easy calculation procedure for Mass Isotopomer Distribution Analysis based on multiple linear regression which allows the simultaneous calculation of the precursor pool enrichment and the fraction of newly synthesized labelled proteins (fractional synthesis) using linear algebra. To test this approach, we used the peptide RGGGLK as a model tryptic peptide containing three subunits of glycine. We selected glycine labelled with two 13C atoms (13C2-glycine) as the labelled amino acid to demonstrate that spectral overlap is not a problem in the proposed methodology. The developed methodology was tested first in vitro by changing the precursor pool enrichment from 10 to 40% of 13C2-glycine. Secondly, a simulated in vivo synthesis of proteins was designed by combining the natural-abundance RGGGLK peptide and 10 or 20% 13C2-glycine at 1:1, 1:3 and 3:1 ratios. Precursor pool enrichments and fractional synthesis values were calculated with satisfactory precision and accuracy using a simple spreadsheet. This novel approach can provide a relatively rapid and easy means to measure protein turnover based on stable isotope tracers.
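
    A minimal sketch of the linear-algebra idea: an observed mass isotopomer distribution is fitted as a linear combination of candidate distributions by least squares. The basis distributions and observed values below are invented placeholders, and the parameterization is a simplification of the authors' regression model, not a reproduction of it.

```python
# Sketch: fit an observed mass isotopomer distribution as a linear combination
# of candidate distributions by least squares. All numbers are placeholders.
import numpy as np

# columns: candidate mass isotopomer distributions (M0..M4 fractions)
natural   = np.array([0.90, 0.08, 0.02, 0.00, 0.00])   # pre-existing peptide
new_label = np.array([0.30, 0.35, 0.25, 0.08, 0.02])   # newly synthesized peptide
A = np.column_stack([natural, new_label])

# observed distribution of the tryptic peptide (placeholder measurement)
observed = np.array([0.72, 0.16, 0.09, 0.02, 0.01])

coeffs, *_ = np.linalg.lstsq(A, observed, rcond=None)
fractions = coeffs / coeffs.sum()          # normalize to mixture fractions
print("estimated fractional synthesis:", round(float(fractions[1]), 3))
```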

  3. Calculation procedures for the analysis of integral experiments for fusion-reactor design

    SciTech Connect

    Santoro, R.T.; Barnes, J.M.; Alsmiller, R.G. Jr.; Oblow, E.M.

    1981-07-01

    The calculational models, nuclear data, and radiation transport codes that are used in the analysis of integral measurements of the transport of approximately 14 MeV neutrons through laminated slabs of materials typical of those found in fusion reactor shields are described. The two-dimensional discrete ordinates calculations used to optimize the experimental configuration for reducing the neutron and gamma-ray background levels and for obtaining an equivalent, reduced geometry of the calculational model (to reduce computer core storage and running times) are also presented. The equations and data used to determine the energy-angle relations of neutrons produced in the reactions of 250 keV deuterons in a titanium-tritide target are given. The procedures used to collapse the 171n-36γ VITAMIN C cross-section data library to a 53n-21γ broad-group library are described. Finally, a description of the computer code network used to obtain neutron and gamma-ray energy spectra for comparison with measured data is included.

  4. Application of a trigonometric finite difference procedure to numerical analysis of compressive and shear buckling of orthotropic panels

    NASA Technical Reports Server (NTRS)

    Stein, M.; Housner, J. D.

    1978-01-01

    A numerical analysis developed for the buckling of rectangular orthotropic layered panels under combined shear and compression is described. This analysis uses a central finite difference procedure based on trigonometric functions instead of using the conventional finite differences which are based on polynomial functions. Inasmuch as the buckle mode shape is usually trigonometric in nature, the analysis using trigonometric finite differences can be made to exhibit a much faster convergence rate than that using conventional differences. Also, the trigonometric finite difference procedure leads to difference equations having the same form as conventional finite differences; thereby allowing available conventional finite difference formulations to be converted readily to trigonometric form. For two-dimensional problems, the procedure introduces two numerical parameters into the analysis. Engineering approaches for the selection of these parameters are presented and the analysis procedure is demonstrated by application to several isotropic and orthotropic panel buckling problems. Among these problems is the shear buckling of stiffened isotropic and filamentary composite panels in which the stiffener is broken. Results indicate that a break may degrade the effect of the stiffener to the extent that the panel will not carry much more load than if the stiffener were absent.

  5. Cost analysis of pulmonary lobectomy procedure: comparison of stapler versus precision dissection and sealant

    PubMed Central

    Droghetti, Andrea; Marulli, Giuseppe; Vannucci, Jacopo; Giovanardi, Michele; Bottoli, Maria Caterina; Ragusa, Mark; Muriana, Giovanni

    2017-01-01

    Objective We aimed to evaluate the direct costs of pulmonary lobectomy hospitalization, comparing surgical techniques for the division of interlobar fissures: stapler (ST) versus electrocautery and hemostatic sealant patch (ES). Methods The cost comparison analysis was based on the clinical pathway and drawn up by collecting the information available from the Thoracic Surgery Division medical team at Mantova Hospital. Direct resource consumption was derived from a previous randomized controlled trial including 40 patients. Use and maintenance of technology, equipment and operating room; administrative plus general costs; and 30-day use of post-surgery hospital resources were considered. The analysis was conducted from the hospital perspective. Results On average, a patient submitted to pulmonary lobectomy costs €9,744.29. This sum varied from €9,027 (using ES) to €10,460 (using ST). The overall lower incidence (50% vs 95%, P=0.0001) and shorter duration of air leakage (1.7 days vs 4.5 days, P=0.0001) in the ES group significantly affected the mean hospital stay (11.0 days vs 14.3 days) and costs. Cost saving in the ES group was also driven by the lower incidence of complications. The main cost driver was staff employment (42%), followed by consumables (34%) and operating room costs (12%). Conclusion There is an overall saving of around €1,432.90 when using the ES patch for each pulmonary lobectomy. Among patients undergoing this surgical procedure, ES can significantly reduce air leakage incidence and duration, as well as shorten hospitalization. However, further multicenter research should be conducted considering different clinical and managerial settings.

  6. Destruction-free procedure for the isolation of bacteria from sputum samples for Raman spectroscopic analysis.

    PubMed

    Kloß, Sandra; Lorenz, Björn; Dees, Stefan; Labugger, Ines; Rösch, Petra; Popp, Jürgen

    2015-11-01

    Lower respiratory tract infections are the fourth leading cause of death worldwide. Here, a timely identification of the causing pathogens is crucial to the success of the treatment. Raman spectroscopy allows for quick identification of bacterial cells without the need for time-consuming cultivation steps, which are the current gold standard to detect pathogens. However, before Raman spectroscopy can be used to identify pathogens, they have to be isolated from the sample matrix, i.e., sputum in the case of lower respiratory tract infections. In this study, we report an isolation protocol for single bacterial cells from sputum samples for Raman spectroscopic identification. Prior to the isolation, a liquefaction step using the proteolytic enzyme mixture Pronase E is required in order to deal with the high viscosity of sputum. The extraction of the bacteria was subsequently performed via different filtration and centrifugation steps, whereby isolation ratios between 46 and 57% were achieved for sputa spiked with 6×10^7 to 6×10^4 CFU/mL of Staphylococcus aureus. The compatibility of such a liquefaction and isolation procedure with Raman spectroscopic classification was shown for five different model species, namely S. aureus, Staphylococcus epidermidis, Streptococcus pneumoniae, Klebsiella pneumoniae, and Pseudomonas aeruginosa. A classification of single-cell Raman spectra of these five species with an accuracy of 98.5% could be achieved on the basis of a principal component analysis (PCA) followed by a linear discriminant analysis (LDA). These classification results were validated with an independent test dataset, where 97.4% of all spectra were identified correctly. Graphical Abstract Development of an isolation protocol of bacterial cells out of sputum samples followed by Raman spectroscopic measurement and species identification using chemometrical models.
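
    A minimal sketch of the PCA-followed-by-LDA chemometric step named above. The "spectra" and class labels are synthetic placeholders standing in for preprocessed single-cell Raman data, and the number of principal components is an arbitrary choice for illustration.

```python
# Sketch: PCA followed by LDA for classifying single-cell Raman spectra,
# evaluated by cross-validation. Data are synthetic placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per_class, n_wavenumbers, n_classes = 60, 500, 5
X = np.vstack([rng.normal(loc=c * 0.1, size=(n_per_class, n_wavenumbers))
               for c in range(n_classes)])          # fake spectra per species
y = np.repeat(np.arange(n_classes), n_per_class)

model = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```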

  7. Design and analysis of thrust active magnetic bearing

    NASA Astrophysics Data System (ADS)

    Jang, Seok-Myeong; Lee, Un-Ho; Choi, Jang-Young; Hong, Jung-Pyo

    2008-04-01

    This paper deals with the design and analysis of a thrust active magnetic bearing (AMB). Using the analytical solutions for thrust, resistance, and inductance obtained from an equivalent magnetic circuit method, we determine initial design parameters such as the size of the magnetic circuit, the coil diameter, and the number of turns by investigating the variation of thrust with the design parameters. Then, using nonlinear finite element analysis, a detailed design considering saturation is performed in order to meet the required thrust under restricted conditions. Finally, the design results are shown to be in good agreement with experimental results, confirming the validity of the design procedure for the thrust AMB used in this paper. In particular, dynamic test results for the thrust AMB are also given to confirm the validity of the design.

  8. Detailed analysis of CAMS procedures for phase 3 using ground truth inventories

    NASA Technical Reports Server (NTRS)

    Carnes, J. G.

    1979-01-01

    The results of a study of Procedure 1 as used during LACIE Phase 3 are presented. The study was performed by comparing the Procedure 1 classification results with digitized ground-truth inventories. The proportion estimation accuracy, dot labeling accuracy, and clustering effectiveness are discussed.

  9. Decreasing Inappropriate Vocalizations Using Classwide Group Contingencies and Color Wheel Procedures: A Component Analysis

    ERIC Educational Resources Information Center

    Kirk, Emily R.; Becker, Jennifer A.; Skinner, Christopher H., Fearrington, Jamie Yarbr; McCane-Bowling, Sara J.; Amburn, Christie; Luna, Elisa; Greear, Corinne

    2010-01-01

    Teacher referrals for consultation resulted in two independent teams collecting evidence that allowed for a treatment component evaluation of color wheel (CW) procedures and/or interdependent group-oriented reward (IGOR) procedures on inappropriate vocalizations in one third- and one first-grade classroom. Both studies involved the application of…

  10. A Component Analysis of Toilet-Training Procedures Recommended for Young Children

    ERIC Educational Resources Information Center

    Greer, Brian D.; Neidert, Pamela L.; Dozier, Claudia L.

    2016-01-01

    We evaluated the combined and sequential effects of 3 toilet-training procedures recommended for use with young children: (a) underwear, (b) a dense sit schedule, and (c) differential reinforcement. A total of 20 children participated. Classroom teachers implemented a toilet-training package consisting of all 3 procedures with 6 children. Of the 6…

  11. A New Variable Weighting and Selection Procedure for K-Means Cluster Analysis

    ERIC Educational Resources Information Center

    Steinley, Douglas; Brusco, Michael J.

    2008-01-01

    A variance-to-range ratio variable weighting procedure is proposed. We show how this weighting method is theoretically grounded in the inherent variability found in data exhibiting cluster structure. In addition, a variable selection procedure is proposed to operate in conjunction with the variable weighting technique. The performances of these…
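
    A minimal sketch of the general idea of weighting variables by a variance-to-range ratio before k-means clustering. The exact weighting formula and the companion selection step in the article may differ; the data and the way the weights are applied here are illustrative assumptions only.

```python
# Sketch: weight each variable by a variance-to-range ratio before k-means.
# The precise weighting used in the article may differ; this is illustrative.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = np.column_stack([
    np.concatenate([rng.normal(0, 1, 100), rng.normal(6, 1, 100)]),  # clustered
    rng.uniform(0, 10, 200),                                         # noise
])

ranges = X.max(axis=0) - X.min(axis=0)
weights = X.var(axis=0) / ranges               # variance-to-range ratio per variable
Xw = X * weights                               # one simple way to apply the weights

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xw)
print("weights:", np.round(weights, 3), "cluster sizes:", np.bincount(labels))
```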

  12. Influence of the isolation procedure on coriander leaf volatiles with some correlation to the enzymatic activity.

    PubMed

    To Quynh, Cung Thi; Iijima, Yoko; Kubota, Kikue

    2010-01-27

    Coriander leaves (Coriandrum sativum L.) have become popular worldwide because of their pleasant and delicate aroma. With a hot-water extraction method, in which coriander leaves were cut before being suspended in boiling water for 2 min, the contents of the main volatile compounds such as alkanals and 2-alkenals from C10 to C14 decreased, while the levels of the corresponding alcohols increased in comparison to those obtained by solvent extraction. To investigate the reasons for this variation, enzyme activity was assayed. Using an aliphatic aldehyde as substrate and NADPH as coenzyme, strong activity of an aliphatic aldehyde reductase was found for the first time in this herb over the relatively wide pH range of 5.0-9.0, with maximum activity at pH 8.5. Additionally, the aliphatic aldehyde dehydrogenase, responsible for acid formation, was found to have relatively weak activity compared to that of the reductase.

  13. An integrated multi-scale risk analysis procedure for pluvial flooding

    NASA Astrophysics Data System (ADS)

    Tader, Andreas; Mergili, Martin; Jäger, Stefan; Glade, Thomas; Neuhold, Clemens; Stiefelmeyer, Heinz

    2016-04-01

    Mitigation of or adaptation to the negative impacts of natural processes on society requires a better understanding of the spatio-temporal distribution not only of the processes themselves, but also of the elements at risk. Information on their values, exposure and vulnerability towards the expected impact magnitudes/intensities of the relevant processes is needed. GIS-supported methods are particularly useful for integrated spatio-temporal analyses of natural processes and their potential consequences. Pluvial floods are of particular concern for many parts of Austria. The overall aim of the present study is to calculate the hazards emanating from pluvial floods, to determine the exposure of given elements at risk, to determine their vulnerability towards given pluvial flood hazards and to analyze potential consequences in terms of monetary losses. The whole approach builds on data available at a national scale. We introduce an integrated, multi-scale risk analysis procedure with regard to pluvial flooding. Focusing on the risk to buildings, we first exemplify this procedure with a well-documented event in the city of Graz (Austria), in order to highlight the associated potentials and limitations. Second, we attempt to predict the possible consequences of pluvial flooding triggered by rainfall events with recurrence intervals of 30, 100 and 300 years. (i) We compute spatially distributed inundation depths using the software FloodArea. Infiltration capacity and surface roughness are estimated from the land cover units given by the official cadastre. Various assumptions are tested with regard to the inflow to the urban sewer system. (ii) Based on the inundation depths and the official building register, we employ a set of rules and functions to deduce the exposure, vulnerability and risk for each building. A risk indicator for each building, expressed as the expected damage associated with a given event, is derived by combining the building value and
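
    A minimal sketch of step (ii): deriving a per-building risk indicator from simulated inundation depth, a depth-damage (vulnerability) curve, and the building value. The vulnerability curve and building records below are toy assumptions, not the official Austrian functions or registers used in the study.

```python
# Sketch: per-building expected damage from inundation depth, a toy
# depth-damage curve, and building value. All inputs are placeholders.
def vulnerability(depth_m: float) -> float:
    """Fraction of building value damaged as a function of water depth."""
    if depth_m <= 0.0:
        return 0.0
    return min(1.0, 0.25 * depth_m)      # toy linear depth-damage curve

buildings = [
    {"id": "B1", "value_eur": 350_000, "depth_m": 0.0},
    {"id": "B2", "value_eur": 420_000, "depth_m": 0.4},
    {"id": "B3", "value_eur": 510_000, "depth_m": 1.2},
]

for b in buildings:
    exposed = b["depth_m"] > 0.0
    damage = b["value_eur"] * vulnerability(b["depth_m"]) if exposed else 0.0
    print(f'{b["id"]}: exposed={exposed}, expected damage = {damage:,.0f} EUR')
```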

  14. Comparison of four digestion procedures not requiring perchloric acid for the trace-element analysis of plant material

    SciTech Connect

    Knight, M. J.

    1980-05-01

    Perchloric acid (HClO4) is often used to destroy organic material contained in plant tissue during sample preparation for trace-element analysis. However, since perchloric acid is an extremely strong oxidizing agent that can cause fire and explosion when in contact with combustible materials, its use is best avoided when proper safety equipment and training are unavailable. A comparison was made of four digestion procedures that do not require perchloric acid: wet digestion with nitric and sulfuric acids; wet digestion with nitric acid alone; a repeated wet digestion with nitric acid; and direct dry ashing. Each procedure was used to digest National Bureau of Standards orchard leaves (SRM 1571). To investigate the effect of possible filter-paper adsorption on the determination of trace elements, digested samples were either filtered or not filtered before analysis. Atomic absorption spectrophotometry was employed to determine concentrations of As, Be, Cd, Cr, Cu, Fe, Mn, Mo, Ni, Pb, Sr, and Zn in each digested sample. Recoveries of each element and the relative error of each determination for each digestion procedure were then calculated. A statistical analysis of these data indicates that the direct dry ashing procedure is best suited for multi-element analysis. Dry ashing is appropriate to recover As, Be, Cr, Cu, Fe, Mn, Mo, Pb, and Zn. The nitric-sulfuric acid, nitric acid, and repeated nitric acid digestion procedures were deemed poor for multi-element analysis; however, each proved useful for the recovery of certain individual elements, including Cd, Pb, and Zn. Sample filtration significantly (p ≤ 0.05) lowered the recovery of Cr, Mn, Pb, and Zn from the digested samples. Conversely, the recovery of As, Mo, and Sr was significantly (p ≤ 0.05) higher in samples filtered before analysis than in unfiltered samples.

  15. The Guided Reading Procedure: An Experimental Analysis of Its Effectiveness as a Technique for Improving Reading Comprehension Skills.

    ERIC Educational Resources Information Center

    Culver, Victor Irwin

    The primary purpose of this study was to experimentally evaluate the Guided Reading Procedure (GRP) as a teaching strategy designed to improve reading comprehension. The chief experimental strategy was compared with a current instructional strategy, the Directed Reading-Thinking Activity (DRTA) described by Stauffer (1969). The effects on reading…

  16. CV 990 interface test and procedure analysis of the monkey restraint, support equipment, and telemetry electronics proposed for Spacelab

    NASA Technical Reports Server (NTRS)

    Newsom, B. D.

    1978-01-01

    A biological system proposed to restrain a monkey in the Spacelab was tested under operational conditions using typical metabolic and telemetered cardiovascular instrumentation. Instrumentation, interfaced with other electronics, and data gathering during a very active operational mission were analyzed for adequacy of procedure and success of data handling by the onboard computer.

  17. RECOMMENDED OPERATING PROCEDURE NO. 2.3: SAMPLING AND ANALYSIS OF TOTAL HYDROCARBONS FROM SOURCES BY CONTINUOUS EMISSION MONITOR

    EPA Science Inventory

    The report is a recommended operating procedure (ROP) prepared for use in research activities conducted by EPA's Air and Energy Engineering Research Laboratory (AEERL). The described method is applicable to the continuous measurement of total hydrocarbons (THCs), also known as tot...

  18. Impairment of endocannabinoids activity in the dorsolateral striatum delays extinction of behavior in a procedural memory task in rats.

    PubMed

    Rueda-Orozco, Pavel E; Montes-Rodriguez, Corinne J; Soria-Gomez, Edgar; Méndez-Díaz, Mónica; Prospéro-García, Oscar

    2008-07-01

    The dorsolateral striatum (DLS) has been implicated in the learning of habits and procedural memories. Extinction of this kind of memory has been poorly studied. The DLS expresses high levels of the cannabinoid receptor type 1 (CB1), and it has lately been suggested that the activation of CB1 in this structure is indispensable for the development of long-term depression (LTD). We performed experiments in a T-maze and evaluated the effects of intrastriatal and intrahippocampal administration of the CB1 antagonist AM251 on extinction and on c-Fos expression. We also administered anandamide to evaluate whether an artificial increase of endocannabinoids facilitates extinction. Our results clearly indicate a dose-dependent blockade of extinction induced by AM251 injected into the striatum, but a facilitation of extinction when it was administered into the hippocampus. Anandamide did not induce any observable changes. The effects of AM251 were accompanied by an increase in c-Fos immunoreactivity in the DLS and a decrease in the hippocampal region, suggesting that the activation of CB1 in the striatum is necessary for the extinction of procedural memories. These findings could be important in some neurological conditions, such as obsessive-compulsive disorder, in which striatal activity seems to be abnormal.

  19. TiO2/activated carbon fibers photocatalyst: effects of coating procedures on the microstructure, adhesion property, and photocatalytic ability.

    PubMed

    Shi, Jian-Wen; Cui, Hao-Jie; Chen, Jian-Wei; Fu, Ming-Lai; Xu, Bin; Luo, Hong-Yuan; Ye, Zhi-Long

    2012-12-15

    In order to more easily separate the TiO2 photocatalyst from the treated wastewater, TiO2 film was immobilized on the surface of activated carbon fibers (ACFs) by employing two kinds of coating procedures: dip-coating and hydrothermal treatment. The effects of the coating procedures on the microstructure of the TiO2-coated ACFs (TiO2/ACFs), such as morphology, porous properties, crystal structure, and light absorption characteristics, were investigated in detail. The adhesion property between the TiO2 film and the ACFs was evaluated by ultrasonic vibration, and the photocatalytic activity of the TiO2/ACFs was tested by the photocatalytic decoloration of methylene blue solution. The results show that hydrothermal treatment presented many advantages for obtaining a high-performance TiO2/ACFs photocatalyst in comparison with dip-coating. Hydrothermal treatment could improve the binding between the TiO2 films and the ACFs, which endowed the as-obtained TiO2/ACFs photocatalyst with improved reusability, and TiO2/ACFs synthesized by hydrothermal treatment presented higher photocatalytic activity.

  20. USDA’s National Food and Nutrient Analysis Program: Analytical Quality Control Procedures for Food Composition Research

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Representative food samples collected under the United States Department of Agriculture's (USDA) National Food and Nutrient Analysis Program (NFNAP) are analyzed for composition of nutrients and other bioactive components. Standard procedures have been developed to describe how these primary food s...

  1. 75 FR 58023 - Guidelines Establishing Test Procedures for the Analysis of Pollutants Under the Clean Water Act...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-23

    ... certification pursuant to or permit application pursuant to .'' Section 501(a) of the Act authorizes the... 136 identify test procedures that must be used for the analysis of pollutants in all applications and...) permit application requirements. Although this method is directed toward the coal mining industry...

  2. Exploratory Bifactor Analysis of the WJ-III Cognitive in Adulthood via the Schmid-Leiman Procedure

    ERIC Educational Resources Information Center

    Dombrowski, Stefan C.

    2014-01-01

    The Woodcock-Johnson-III cognitive in the adult time period (age 20 to 90 plus) was analyzed using exploratory bifactor analysis via the Schmid-Leiman orthogonalization procedure. The results of this study suggested possible overfactoring, a different factor structure from that posited in the Technical Manual and a lack of invariance across both…

  3. 77 FR 38523 - Expedited Approval of Alternative Test Procedures for the Analysis of Contaminants Under the Safe...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-28

    ..., ``Results of the Inter-laboratory Method Validation Study using U.S. Environmental Protection Agency Method... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION... Under the Safe Drinking Water Act; Analysis and Sampling Procedures AGENCY: Environmental...

  4. Investigating the Structure of the WJ-III Cognitive in Early School Age through Two Exploratory Bifactor Analysis Procedures

    ERIC Educational Resources Information Center

    Dombrowski, Stefan C.

    2014-01-01

    Two exploratory bifactor methods (e.g., Schmid-Leiman [SL] and exploratory bifactor analysis [EBFA]) were used to investigate the structure of the Woodcock-Johnson III (WJ-III) Cognitive in early school age (age 6-8). The SL procedure is recognized by factor analysts as a preferred method for EBFA. Jennrich and Bentler recently developed an…

  5. ANALYSIS OF TRACE-LEVEL ORGANIC COMBUSTION PROCESS EMISSIONS USING NOVEL MULTIDIMENSIONAL GAS CHROMATOGRAPHY-MASS SPECTROMETRY PROCEDURES

    EPA Science Inventory

    The paper discusses the analysis of trace-level organic combustion process emissions using novel multidimensional gas chromatography-mass spectrometry (MDGC-MS) procedures. It outlines the application of the technique through the analyses of various incinerator effluent and produ...

  6. Minimally invasive procedure reduces adjacent segment degeneration and disease: New benefit-based global meta-analysis

    PubMed Central

    Li, Xiao-Chuan; Huang, Chun-Ming; Zhong, Cheng-Fan; Liang, Rong-Wei; Luo, Shao-Jian

    2017-01-01

    Objective Adjacent segment pathology (ASP) is a common complication presenting in patients with axial pain and dysfunction, requiring treatment or follow-up surgery. However, whether minimally invasive surgery (MIS), including MIS transforaminal/posterior lumbar interbody fusion (MIS-TLIF/PLIF), decreases the incidence rate of ASP remains unknown. The aim of this meta-analysis was to compare the incidence rate of ASP in patients undergoing MIS versus open procedures. Methods This systematic review was undertaken by following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Statement. We searched electronic databases, including PubMed, EMBASE, SinoMed, and the Cochrane Library, without language restrictions, to identify clinical trials comparing MIS to open procedures. The retrieved results were last updated on June 15, 2016. Results Overall, 9 trials comprising 770 patients were included; 4 of the included studies were of moderate quality and 5 of low quality. The pooled data analysis demonstrated low heterogeneity between the trials and a significantly lower ASP incidence rate in patients who underwent the MIS procedure compared with those who underwent the open procedure (p = 0.0001). Single-level lumbar interbody fusion was performed in 6 trials of 408 patients, and we found a lower ASP incidence rate in the MIS group compared with the open-surgery group (p = 0.002). Moreover, the pooled data analysis showed a significant reduction in the incidence rate of adjacent segment disease (ASDis) (p = 0.0003) and adjacent segment degeneration (ASDeg) (p = 0.0002), favoring the MIS procedure. Subgroup analyses showed no difference in follow-up durations between the procedures (p = 0.93). Conclusion Therefore, we conclude that MIS-TLIF/PLIF can reduce the incidence rate of ASDis and ASDeg compared with open surgery. Although the subgroup analysis did not indicate a difference in follow-up duration between the two

  7. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure

    PubMed Central

    Castellazzi, Giovanni; D’Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-01-01

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation. PMID:26225978

  8. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure.

    PubMed

    Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-07-28

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
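
    A minimal sketch of one intermediate step implied by the procedure described above: converting a point cloud into a voxel occupancy grid, from which filled voxels could be turned into solid (hexahedral) finite elements. The point cloud and voxel size below are placeholders, and this is not the authors' actual pipeline, which works on stacked point sections of laser-scanner data.

```python
# Sketch: turn a point cloud into a voxel occupancy grid. Each filled voxel
# would become a solid element in a voxel-based FE model. Data are placeholders.
import numpy as np

rng = np.random.default_rng(0)
points = rng.uniform(low=[0, 0, 0], high=[10.0, 6.0, 8.0], size=(5000, 3))  # metres
voxel_size = 0.5

origin = points.min(axis=0)
idx = np.floor((points - origin) / voxel_size).astype(int)   # voxel index per point
dims = idx.max(axis=0) + 1

occupancy = np.zeros(dims, dtype=bool)
occupancy[idx[:, 0], idx[:, 1], idx[:, 2]] = True            # mark filled voxels

print("grid shape:", occupancy.shape, "filled voxels:", int(occupancy.sum()))
```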

  9. Content Analysis in Systems Engineering Acquisition Activities

    DTIC Science & Technology

    2016-04-30

    Content analysis findings, fed back to the system designers, can be used to shape requirements definitions for system upgrade or modification contracts and new baseline contracts. The report also addresses content analysis training and skills, and systems engineering acquisition activities such as system requirements definition: ensuring the system requirements adequately reflect the stakeholder requirements and negotiating modifications to

  10. Accession Medical Standards Analysis and Research Activity

    DTIC Science & Technology

    2010-01-01

    Accession Medical Standards Analysis & Research Activity (AMSARA), Department of Epidemiology, Division of Preventive Medicine, Walter Reed Army Institute of Research. Topics include the epidemiology of injury from the Assessment of Recruit Strength and Motivation (ARMS) study and Program

  11. Physician-patient communication following invasive procedures: an analysis of post-angiogram consultations.

    PubMed

    Gordon, Howard S; Street, Richard L; Kelly, P Adam; Souchek, Julianne; Wray, Nelda P

    2005-09-01

    Although rarely studied, physician-patient interactions immediately following diagnostic tests are significant medical events because during these encounters the physician and patient often make decisions about major and sometimes invasive treatment. This investigation analyzed patterns of physician-patient communication following coronary angiography with particular attention to behaviors important to decision-making: physician information-giving, physician use of partnership-building, and active forms of patient participation (e.g., asking questions, being assertive, expressing concerns). We were particularly interested in effects related to the patient's race in light of documented evidence of racial disparities in cardiac care and outcomes. From audiotape recordings, 93 physician-patient interactions after coronary angiogram in a catheterization laboratory in a large US Veterans Affairs Medical Center were coded to measure the frequency of physicians' information-giving and partnership-building and the frequency of active patient participation. We also stratified these behaviors according to whether the behavior was prompted (e.g., physician information in response to a patient's question; a patient's opinion solicited by the doctor) or self-initiated. Several findings were noteworthy. First, these interactions were very brief and dominated by the physician. Second, although physician information-giving increased with more active patient participation, which in turn was correlated with physicians' use of partnership-building, proportionally little of the physicians' information (8%) and active patient participation (9%) was directly prompted by the other interactant. Finally, there was a tendency for physicians to self-initiate less information giving to black patients and for black patients to self-initiate less active participation than white patients. Although these differences were attenuated when other variables (e.g., the physician's training, disease

  12. Procedure for Tooth Contact Analysis of a Face Gear Meshing With a Spur Gear Using Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Bibel, George; Lewicki, David G. (Technical Monitor)

    2002-01-01

    A procedure was developed to perform tooth contact analysis of a face gear meshing with a spur pinion using finite element analysis. The face gear surface points from a previous analysis were used to create a connected tooth solid model without gaps or overlaps. The face gear surface points were used to create a five-tooth face gear Patran model (with rim) using Patran PCL commands. These commands were saved in a series of session files suitable for Patran input. A four-tooth spur gear that meshes with the face gear was designed and constructed with Patran PCL commands. These commands were also saved in session files suitable for Patran input. The orientation of the spur gear required for meshing with the face gear was determined. The required rotations and translations are described and built into the session file for the spur gear. The Abaqus commands for three-dimensional meshing were determined and verified for a simplified model containing one spur tooth and one face gear tooth. The boundary conditions, loads, and weak spring constraints were determined to make the simplified model work. The load steps and load increments to establish contact and obtain a realistic load were determined for the simplified two-tooth model. Contact patterns give some insight into the required mesh density. Building the two gears in two different local coordinate systems and rotating the local coordinate systems was verified as an easy way to roll the gearset through mesh. Due to limitations on swap space and disk space, and the time constraints of the summer period, the larger model was not completed.

  13. Safety and Performance Analysis of the Non-Radar Oceanic/Remote Airspace In-Trail Procedure

    NASA Technical Reports Server (NTRS)

    Carreno, Victor A.; Munoz, Cesar A.

    2007-01-01

    This document presents a safety and performance analysis of the nominal case for the In-Trail Procedure (ITP) in a non-radar oceanic/remote airspace. The analysis estimates the risk of collision between the aircraft performing the ITP and a reference aircraft. The risk of collision is only estimated for the ITP maneuver and it is based on nominal operating conditions. The analysis does not consider human error, communication error conditions, or the normal risk of flight present in current operations. The hazards associated with human error and communication errors are evaluated in an Operational Hazards Analysis presented elsewhere.

  14. Cross-linked enzyme aggregates (CLEAs) of selected lipases: a procedure for the proper calculation of their recovered activity

    PubMed Central

    2013-01-01

    In the last few years, synthesis of carrier-free immobilized biocatalysts by cross-linking of enzyme aggregates has appeared as a promising technique. Cross-linked enzyme aggregates (CLEAs) present several interesting advantages over carrier-bound immobilized enzymes, such as highly concentrated enzymatic activity, high stability of the produced superstructure, important production cost savings owing to the absence of a support, and the fact that no previous purification of the enzyme is needed. However, the published literature evidences that (a) much specific, non-systematic exploratory work is being done and (b) recovered-activity calculations for CLEAs still need to be optimized. In this context, this contribution presents results of an optimized procedure for the calculation of the activity retained by CLEAs, based on the comparison of their specific activity relative to their free enzyme counterparts. The protocol involves determination of the precipitable protein content in commercial enzyme preparations through precipitation with ammonium sulphate and a protein co-feeder. The identification of linear ranges of activity versus concentration/amount of protein in the test reaction is also required for proper specific activity determinations. By use of mass balances that involve the protein initially added to the synthesis medium, and the protein remaining in the supernatant and washing solutions (these last derived from activity measurements), the precipitable protein present in CLEAs is obtained, and their specific activity can be calculated. In the current contribution, the described protocol was applied to CLEAs of Thermomyces lanuginosa lipase, which showed a recovered specific activity of 11.1% relative to native lipase. The approach described is simple and can easily be extended to other CLEAs and also to carrier-bound immobilized enzymes for accurate determination of their retained activity. PMID:23663379
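
    The mass balance described above can be illustrated with a short calculation: the precipitable protein retained in the CLEA is taken as the protein offered minus the protein recovered in the supernatant and washings, and the recovered activity is the ratio of the CLEA specific activity to that of the free enzyme. The sketch below uses invented numbers, not the authors' data.

      # Illustrative mass-balance calculation for CLEA recovered activity.
      # All numbers are made up for the example; units: protein in mg, activity in U.

      def clea_recovered_activity(protein_added_mg,
                                  protein_supernatant_mg,
                                  protein_washings_mg,
                                  clea_activity_U,
                                  free_specific_activity_U_per_mg):
          # Precipitable protein assumed retained in the aggregates (mass balance).
          protein_in_clea_mg = (protein_added_mg
                                - protein_supernatant_mg
                                - protein_washings_mg)
          clea_specific_activity = clea_activity_U / protein_in_clea_mg
          recovered_pct = 100.0 * clea_specific_activity / free_specific_activity_U_per_mg
          return protein_in_clea_mg, clea_specific_activity, recovered_pct

      if __name__ == "__main__":
          protein, spec, pct = clea_recovered_activity(
              protein_added_mg=50.0, protein_supernatant_mg=8.0,
              protein_washings_mg=2.0, clea_activity_U=180.0,
              free_specific_activity_U_per_mg=40.0)
          print(f"protein in CLEA: {protein:.1f} mg, "
                f"specific activity: {spec:.1f} U/mg, recovered: {pct:.1f}%")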

  15. An exploration of diabetic foot screening procedures data by a multiple correspondence analysis

    PubMed Central

    Rovan, Jože

    2017-01-01

    Abstract Aims Gangrene and amputation are among the most feared complications of diabetes mellitus. Early detection of patients at high risk for foot ulceration can prevent foot complications. Regular foot screening (medical history, foot examination and classification into risk groups) was introduced at the out-patient diabetes clinic in Ljubljana in November 1996. We aimed to explore the relationships between the observed variables, check the appropriateness of the risk status classification and of the post-screening decisions. Methods The data of 11,594 patients, obtained over 18 years, were analysed by multiple correspondence analysis (MCA). Most of the observed variables were categorical. Results The majority of the screened population was free of foot complications. We demonstrated an increasing frequency and severity of foot problems with increasing age, as well as the association between the loss of protective sensation and the history of foot ulceration, foot deformity and callus formation, the history of foot ulcer or amputation and acute foot ulceration. A new finding was that the location of foot deformity points was closer to female than male gender, indicating the possible role of fashionable high-heel footwear. The appropriateness of therapeutic decisions was confirmed: the points representing absent foot pulses and referral to a vascular specialist were close together, as were the points representing foot deformity and special footwear prescription, or callus formation and referral to a pedicurist. Conclusions MCA was applied to the data on foot pathology in the population attending the out-patient diabetes clinic. The method proved to be a useful statistical tool for analysing the data of screening procedures. PMID:28289465

  16. Utilisation of Blood Components in Cardiac Surgery: A Single-Centre Retrospective Analysis with Regard to Diagnosis-Related Procedures

    PubMed Central

    Geissler, Raoul Georg; Rotering, Heinrich; Buddendick, Hubert; Franz, Dominik; Bunzemeier, Holger; Roeder, Norbert; Kwiecien, Robert; Sibrowski, Walter; Scheld, Hans H.; Martens, Sven; Schlenke, Peter

    2015-01-01

    Background More blood components are required in cardiac surgery than in most other medical disciplines. The overall blood demand may increase as a function of the total number of cardiothoracic and vascular surgical interventions and their level of complexity, and also in view of demographic ageing. Awareness has grown with respect to adverse events, such as transfusion-related immunomodulation by allogeneic blood supply, which can contribute to morbidity and mortality. Therefore, programmes of patient blood management (PBM) have been implemented to avoid unnecessary blood transfusions and to standardise the indication of blood transfusions more strictly, with the aim of improving patients' overall outcomes. Methods A comprehensive retrospective analysis of the utilisation of blood components in the Department of Cardiac Surgery at the University Hospital of Münster (UKM) was performed over a 4-year period. Based on a medical reporting system of all medical disciplines, which was established as part of a PBM initiative, all transfused patients in cardiac surgery and their blood components were identified in a diagnosis- and medical procedure-related system, which allows the precise allocation of blood consumption to interventional procedures in cardiac surgery, such as coronary or valve surgery. Results This retrospective single-centre study included all in-patients in cardiac surgery at the UKM from 2009 to 2012, corresponding to a total of 1,405-1,644 cases per year. A blood supply was provided for 55.6-61.9% of the cardiac surgery patients, whereas approximately 9% of all in-patients at the UKM required blood transfusions. Most of the blood units were applied during cardiac valve surgery and during coronary surgery. Further surgical activities with considerable use of blood components included thoracic surgery, aortic surgery, heart transplantations and the use of artificial hearts. Under the measures of PBM in 2012 a noticeable decrease in the number of

  17. Spline-based procedures for dose-finding studies with active control

    PubMed Central

    Helms, Hans-Joachim; Benda, Norbert; Zinserling, Jörg; Kneib, Thomas; Friede, Tim

    2015-01-01

    In a dose-finding study with an active control, several doses of a new drug are compared with an established drug (the so-called active control). One goal of such studies is to characterize the dose–response relationship and to find the smallest target dose concentration d*, which leads to the same efficacy as the active control. For this purpose, the intersection point of the mean dose–response function with the expected efficacy of the active control has to be estimated. The focus of this paper is a cubic spline-based method for deriving an estimator of the target dose without assuming a specific dose–response function. Furthermore, the construction of a spline-based bootstrap CI is described. Estimator and CI are compared with other flexible and parametric methods such as linear spline interpolation as well as maximum likelihood regression in simulation studies motivated by a real clinical trial. Also, design considerations for the cubic spline approach with focus on bias minimization are presented. Although the spline-based point estimator can be biased, designs can be chosen to minimize and reasonably limit the maximum absolute bias. Furthermore, the coverage probability of the cubic spline approach is satisfactory, especially for bias minimal designs. © 2014 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd. PMID:25319931

  18. 75 FR 25236 - Agency Information Collection Activities; Proposed Collection; Comment Request; Procedures for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-07

    ... estimate that you provide. 5. Offer alternative ways to improve the collection activity. 6. Make sure to submit your comments by the deadline identified under DATES. 7. To ensure proper receipt by EPA, be sure... Wastewater Treatment Construction Grants Program facilities, STAG actions subject to NEPA and new...

  19. Processes, Procedures, and Methods to Control Pollution Resulting from Silvicultural Activities.

    ERIC Educational Resources Information Center

    Environmental Protection Agency, Washington, DC. Office of Water Programs.

    This report presents brief documentation of silvicultural practices, both those now in use and those in stages of research and development. A majority of the text is concerned with the specific aspects of silvicultural activities which relate to nonpoint source pollution control methods. Analyzed are existing and near future pollution control…

  20. Improving Homework Compliance in Career Counseling with a Behavioral Activation Functional Assessment Procedure: A Pilot Study

    ERIC Educational Resources Information Center

    Baruch, David E.; Kanter, Jonathan W.; Bowe, William M.; Pfennig, Sherri L.

    2011-01-01

    Behavioral activation has emerged as a widely used treatment for depression in a number of health care settings due to its concrete, straightforward emphasis on out-of-session client homework, but it lacks explicit guidelines for identifying and overcoming barriers that interfere with homework completion. The purpose of this pilot study was to…

  1. Spline-based procedures for dose-finding studies with active control.

    PubMed

    Helms, Hans-Joachim; Benda, Norbert; Zinserling, Jörg; Kneib, Thomas; Friede, Tim

    2015-01-30

    In a dose-finding study with an active control, several doses of a new drug are compared with an established drug (the so-called active control). One goal of such studies is to characterize the dose-response relationship and to find the smallest target dose concentration d(*), which leads to the same efficacy as the active control. For this purpose, the intersection point of the mean dose-response function with the expected efficacy of the active control has to be estimated. The focus of this paper is a cubic spline-based method for deriving an estimator of the target dose without assuming a specific dose-response function. Furthermore, the construction of a spline-based bootstrap CI is described. Estimator and CI are compared with other flexible and parametric methods such as linear spline interpolation as well as maximum likelihood regression in simulation studies motivated by a real clinical trial. Also, design considerations for the cubic spline approach with focus on bias minimization are presented. Although the spline-based point estimator can be biased, designs can be chosen to minimize and reasonably limit the maximum absolute bias. Furthermore, the coverage probability of the cubic spline approach is satisfactory, especially for bias minimal designs.
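
    A minimal sketch of the target-dose idea, assuming illustrative dose levels, mean responses and control efficacy (none taken from the paper): fit a cubic spline through the estimated means and locate the smallest dose at which the fitted curve reaches the active-control level. The bootstrap confidence interval described in the paper is not reproduced here.

      # Sketch of the target-dose idea: fit a cubic spline through the estimated mean
      # responses at each dose and find the smallest dose whose fitted response equals
      # the active-control mean. Data and grid resolution are illustrative only.
      import numpy as np
      from scipy.interpolate import CubicSpline
      from scipy.optimize import brentq

      doses = np.array([0.0, 10.0, 25.0, 50.0, 100.0])       # placebo + 4 doses
      mean_response = np.array([0.2, 0.9, 1.6, 2.1, 2.4])    # estimated means
      control_mean = 1.8                                      # active-control efficacy

      spline = CubicSpline(doses, mean_response)

      # Scan the dose range for the first sign change of spline(d) - control_mean,
      # then refine the root with Brent's method.
      grid = np.linspace(doses[0], doses[-1], 1001)
      diff = spline(grid) - control_mean
      sign_change = np.where(np.diff(np.sign(diff)) != 0)[0]
      if sign_change.size:
          a, b = grid[sign_change[0]], grid[sign_change[0] + 1]
          d_star = brentq(lambda d: spline(d) - control_mean, a, b)
          print(f"estimated target dose d* ≈ {d_star:.1f}")
      else:
          print("no dose reaches the active-control efficacy in the studied range")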

  2. Easy Green: A Handbook of Earth-Smart Activities and Operating Procedures for Youth Programs.

    ERIC Educational Resources Information Center

    Westerman, Marty

    This book aims to help camp directors and programmers evaluate the environmental impact of camp practices, make informed environmental choices, and make environmental awareness a habit in all operations and activities. Section 1 discusses developing a personal environmental philosophy, and considering possibilities for camp environmental action in…

  3. CTEPP STANDARD OPERATING PROCEDURE FOR VIDEOTAPING CHILD ACTIVITIES (SOP-2.23)

    EPA Science Inventory

    This SOP describes the method for videotaping a preschool child at a home. The CTEPP main study will collect multimedia samples and questionnaire data at the homes of participants (adults and children) during 48-hr sampling periods. Videotaping the activities of 10% of these chi...

  4. Effect of different detoxification procedures on the residual pertussis toxin activities in vaccines.

    PubMed

    Yuen, Chun-Ting; Asokanathan, Catpagavalli; Cook, Sarah; Lin, Naomi; Xing, Dorothy

    2016-04-19

    Pertussis toxin (PTx) is a major virulence factor produced by Bordetella pertussis and its detoxified form is one of the major protective antigens in vaccines against whooping cough. Ideally, PTx in the vaccine should be completely detoxified while still preserving immunogenicity. However, this may not always be the case. Due to multilevel reaction mechanisms of chemical detoxification that act on different molecular sites and with different production processes, it is difficult to define a molecular characteristic of a pertussis toxoid. PTx has two functional distinctive domains: the ADP-ribosyltransferase enzymatic subunit S1 (A-protomer) and the host cell binding carbohydrate-binding subunits S2-5 (B-oligomer); and in this study, we investigated the effect of different detoxification processes on these two functional activities of the residual PTx in toxoids and vaccines currently marketed worldwide using a recently developed in vitro biochemical assay system. The patho-physiological activities in these samples were also estimated using the in vivo official histamine sensitisation tests. Different types of vaccines, detoxified by formaldehyde, glutaraldehyde or by both, have different residual functional and individual baseline activities. Of the vaccines tested, PT toxoid detoxified by formaldehyde had the lowest residual PTx ADP-ribosyltransferase activity. The carbohydrate binding results detected by anti-PTx polyclonal (pAb) and anti-PTx subunits monoclonal antibodies (mAb) showed specific binding profiles for toxoids and vaccines produced from different detoxification methods. In addition, we also demonstrated that using pAb or mAb S2/3 as detection antibodies would give a better differential difference between these vaccine lots than using mAbs S1 or S4. In summary, we showed for the first time that by measuring the activities of the two functional domains of PTx, we could characterise pertussis toxoids prepared from different chemical detoxification

  5. Fast neutron activation analysis by means of low voltage neutron generator

    NASA Astrophysics Data System (ADS)

    Medhat, M. E.

    A description of a D-T neutron generator (NG) is presented. This machine can be used for fast neutron activation analysis to determine selected elements, especially light elements, in different materials. The procedure for neutron flux determination and efficiency calculation is described. Examples of testing some Egyptian natural cosmetics are given.
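
    For orientation, the induced activity in an activation measurement of this kind is commonly estimated with the standard activation equation A = φσN(1 − e^(−λ·t_irr))·e^(−λ·t_decay); the sketch below evaluates it for made-up flux, cross-section and timing values rather than the generator parameters reported in the paper.

      # Standard activation equation, evaluated for illustrative numbers:
      #   A = phi * sigma * N * (1 - exp(-lambda * t_irr)) * exp(-lambda * t_decay)
      import math

      N_A = 6.02214076e23            # Avogadro's number, 1/mol

      def induced_activity(phi, sigma_barn, mass_g, molar_mass, abundance,
                           half_life_s, t_irr_s, t_decay_s):
          lam = math.log(2.0) / half_life_s
          n_atoms = mass_g / molar_mass * N_A * abundance
          sigma_cm2 = sigma_barn * 1e-24
          saturation = 1.0 - math.exp(-lam * t_irr_s)
          decay = math.exp(-lam * t_decay_s)
          return phi * sigma_cm2 * n_atoms * saturation * decay   # decays/s (Bq)

      if __name__ == "__main__":
          # Hypothetical example: 14 MeV flux of 1e8 n/cm^2/s on 1 g of a light element
          A = induced_activity(phi=1e8, sigma_barn=0.1, mass_g=1.0, molar_mass=28.0,
                               abundance=0.92, half_life_s=140.0,
                               t_irr_s=300.0, t_decay_s=60.0)
          print(f"induced activity ≈ {A:.3e} Bq")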

  6. New electrochemical procedure for obtaining surface enhanced Raman scattering active polythiophene films on platinum

    NASA Astrophysics Data System (ADS)

    Bazzaoui, E. A.; Aeiyach, S.; Aubard, J.; Felidj, N.; Lévi, G.; Sakmeche, N.; Lacaze, P. C.

    1998-06-01

    A new electrochemical procedure for obtaining Surface Enhanced Raman Scattering (SERS) spectra of silver-island/polybithiophene composite films is described. During the electropolymerization process, which uses a micellar aqueous solution of silver dodecylsulfate mixed with bithiophene and LiClO4, silver cations complex with the sulfur of bithiophene, penetrate the growing film and are reduced there, giving metallic silver particles embedded within the polybithiophene (PbT) film. Both doped and undoped PbT species display SERS spectra with enhancement factors varying between 40 and 200 with respect to the film prepared in sodium dodecylsulfate. Vibrational characterization of both doped and undoped species shows that polymer structural defects are more abundant in the oxidized species than in the reduced ones. This general method allows various polymeric films displaying the SERS effect to be synthesized and appears very promising for the structural study of these materials.

  7. Activity Analysis and Cost Analysis in Medical Schools.

    ERIC Educational Resources Information Center

    Koehler, John E.; Slighton, Robert L.

    There is no unique answer to the question of what an ongoing program costs in medical schools. The estimates of program costs generated by classical methods of cost accounting are unsatisfactory because such accounting cannot deal with the joint production or joint cost problem. Activity analysis models aim at calculating the impact of alternative…

  8. Surgical treatment of infective endocarditis in active intravenous drug users: a justified procedure?

    PubMed Central

    2014-01-01

    Background Infective endocarditis is a life-threatening complication of intravenous drug abuse, which continues to be a major burden with inadequately characterised long-term outcomes. We reviewed our institutional experience of surgical treatment of infective endocarditis in active intravenous drug abusers with the aim of identifying the determinants of long-term outcome in this distinct subgroup of infective endocarditis patients. Methods A total of 451 patients underwent surgery for infective endocarditis between January 1993 and July 2013 at the University Hospital of Heidelberg. Of these patients, 20 (7 female, mean age 35 ± 7.7 years) underwent surgery for infective endocarditis with a history of active intravenous drug abuse. Mean follow-up was 2504 ± 1842 days. Results Staphylococcus aureus was the most common pathogen detected in preoperative blood cultures. Two patients (10%) died before postoperative day 30. Survival at 1, 5 and 10 years was 90%, 85% and 85%, respectively. Freedom from reoperation was 100%. Higher NYHA functional class, higher EuroSCORE II, HIV infection, longer operating time, postoperative fever and higher requirement for red blood cell transfusion were associated with 90-day mortality. Conclusions In active intravenous drug abusers, surgical treatment for infective endocarditis should be performed as extensively as possible and be followed by an aggressive postoperative antibiotic therapy to avoid high mortality. Early surgical intervention is advisable in patients with precipitous cardiac deterioration and under conditions of staphylococcal endocarditis. However, larger studies are necessary to confirm our preliminary results. PMID:24661344

  9. Technical note: A procedure to estimate glucose requirements of an activated immune system in steers.

    PubMed

    Kvidera, S K; Horst, E A; Abuajamieh, M; Mayorga, E J; Sanz Fernandez, M V; Baumgard, L H

    2016-11-01

    Infection and inflammation impede efficient animal productivity. The activated immune system ostensibly requires large amounts of energy and nutrients otherwise destined for synthesis of agriculturally relevant products. Accurately determining the immune system's in vivo energy needs is difficult, but a better understanding may facilitate developing nutritional strategies to maximize productivity. The study objective was to estimate immune system glucose requirements following an i.v. lipopolysaccharide (LPS) challenge. Holstein steers (148 ± 9 kg; n = 15) were jugular catheterized bilaterally and assigned to 1 of 3 i.v.

  10. A modified release analysis procedure using advanced froth flotation mechanisms: Technical report, March 1, 1996-May 31, 1996

    SciTech Connect

    Honaker, R.Q.; Mohanty, M.K.

    1997-04-01

    Recent studies indicate that the optimum separation performances achieved by multiple-stage cleaning using various column flotation technologies and single-stage cleaning using a Packed-Flotation Column are superior to the performance achieved by the traditional release procedure, especially in terms of pyritic sulfur rejection. This superior performance is believed to be the result of the advanced flotation mechanisms provided by column flotation technologies. Thus, the objective of this study is to develop a suitable process utilizing the advanced froth flotation mechanisms to characterize the true flotation response of a coal sample. Work in this reporting period concentrated on developing a modified coal flotation characterization procedure, termed the Advanced Flotation Washability (AFW) technique. The new apparatus used for this procedure is essentially a batch-operated packed-column device equipped with a controlled wash water system. Several experiments were conducted using the AFW technique on a relatively high-sulfur, -100 mesh Illinois No. 5 run-of-mine coal sample collected from a local coal preparation plant. Similar coal characterization experiments were also conducted using the traditional release and tree analysis procedures. The best performance curve generated using the AFW technique was found to be superior to the optimum curve produced by the traditional procedures. For example, at a combustible recovery of 80%, a 19% improvement in the reduction of the pyritic sulfur content was achieved by the AFW method while the ash reduction was also enhanced by 4%. Several tests are ongoing to solidify the AFW procedure and verify the above finding by conducting ANOVA analyses to evaluate the repeatability of the AFW method and the statistical significance of the difference in the performance achieved with the traditional and modified coal characterization procedures.
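
    The performance curves mentioned above are typically built from incremental concentrate data; as a hedged illustration (the increments and feed assays below are invented, not the project's measurements), cumulative combustible recovery and pyritic sulfur rejection can be computed as follows.

      # Cumulative flotation-performance curve from incremental concentrates.
      # Numbers are invented for illustration.
      import numpy as np

      # mass yield (%), ash (%), pyritic sulfur (%) of each concentrate increment
      yield_pct = np.array([20.0, 15.0, 10.0, 10.0, 5.0])
      ash_pct   = np.array([5.0, 7.0, 9.0, 12.0, 20.0])
      pyr_s_pct = np.array([0.3, 0.5, 0.8, 1.2, 2.0])
      feed_ash, feed_pyr_s = 25.0, 2.5     # feed assays (%)

      cum_yield = np.cumsum(yield_pct)
      cum_ash   = np.cumsum(yield_pct * ash_pct) / cum_yield
      cum_pyr_s = np.cumsum(yield_pct * pyr_s_pct) / cum_yield

      # combustible recovery = yield * (100 - cumulative ash) / (100 - feed ash)
      comb_rec = cum_yield * (100.0 - cum_ash) / (100.0 - feed_ash)
      # pyritic sulfur rejection = 100 - yield * cumulative S / feed S
      pyr_s_rej = 100.0 - cum_yield * cum_pyr_s / feed_pyr_s

      for y, r, s in zip(cum_yield, comb_rec, pyr_s_rej):
          print(f"yield {y:5.1f}%  combustible recovery {r:5.1f}%  pyritic S rejection {s:5.1f}%")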

  11. Calculation procedures and HPLC method for analysis of the lipophilicity of acyclovir esters.

    PubMed

    Lesniewska, Monika A; Gdaniec, Zofia; Muszalska, Izabela

    2015-04-01

    Acyclovir (ACV) belongs to a class of drugs with low bioavailability. Selected ACV esters including acetyl (Ac-), isobutyryl (iBut-), pivaloyl (Piv-), ethoxycarbonyl (Etc-) and nicotinoyl (Nic-) were synthesized, and their lipophilicity was determined by the high-performance liquid chromatography (HPLC) RP method. Statistical analyses of the comparative values of log P and clog P were carried out using computational methods. It was proved that the AC log P algorithm can be useful for the analysis of these compounds and has a statistically justified application in the assessment of the quantitative structure-activity relationship. Moreover, the lipophilicity determined by the HPLC method appears as follows: ACV < Ac- < Nic- < Etc- < iBut- < Piv-.

  12. Mobile Energy Laboratory Procedures

    SciTech Connect

    Armstrong, P.R.; Batishko, C.R.; Dittmer, A.L.; Hadley, D.L.; Stoops, J.L.

    1993-09-01

    Pacific Northwest Laboratory (PNL) has been tasked to plan and implement a framework for measuring and analyzing the efficiency of on-site energy conversion, distribution, and end-use application on federal facilities as part of its overall technical support to the US Department of Energy (DOE) Federal Energy Management Program (FEMP). The Mobile Energy Laboratory (MEL) Procedures establish guidelines for specific activities performed by PNL staff. PNL provided sophisticated energy monitoring, auditing, and analysis equipment for on-site evaluation of energy use efficiency. Specially trained engineers and technicians were provided to conduct tests in a safe and efficient manner with the assistance of host facility staff and contractors. Reports were produced to describe test procedures, results, and suggested courses of action. These reports may be used to justify changes in operating procedures, maintenance efforts, system designs, or energy-using equipment. The MEL capabilities can subsequently be used to assess the results of energy conservation projects. These procedures recognize the need for centralized MEL administration, test procedure development, operator training, and technical oversight. This need is evidenced by increasing requests for MEL use and the economies available by having trained, full-time MEL operators and near-continuous MEL operation. DOE will assign new equipment and upgrade existing equipment as new capabilities are developed. The equipment and trained technicians will be made available to federal agencies that provide funding for the direct costs associated with MEL use.

  13. Genomic features of uncultured methylotrophs in activated-sludge microbiomes grown under different enrichment procedures

    PubMed Central

    Fujinawa, Kazuki; Asai, Yusuke; Miyahara, Morio; Kouzuma, Atsushi; Abe, Takashi; Watanabe, Kazuya

    2016-01-01

    Methylotrophs are organisms that are able to grow on C1 compounds as carbon and energy sources. They play important roles in the global carbon cycle and contribute largely to industrial wastewater treatment. To identify and characterize methylotrophs that are involved in methanol degradation in wastewater-treatment plants, methanol-fed activated-sludge (MAS) microbiomes were subjected to phylogenetic and metagenomic analyses, and genomic features of dominant methylotrophs in MAS were compared with those preferentially grown in laboratory enrichment cultures (LECs). These analyses consistently indicate that Hyphomicrobium plays important roles in MAS, while Methylophilus occurred predominantly in LECs. Comparative analyses of bin genomes reconstructed for the Hyphomicrobium and Methylophilus methylotrophs suggest that they have different C1-assimilation pathways. In addition, function-module analyses suggest that their cell-surface structures are different. Comparison of the MAS bin genome with genomes of closely related Hyphomicrobium isolates suggests that genes unnecessary in MAS (for instance, genes for anaerobic respiration) have been lost from the genome of the dominant methylotroph. We suggest that genomic features and coded functions in the MAS bin genome provide us with insights into how this methylotroph adapts to activated-sludge ecosystems. PMID:27221669

  14. Using CV-GLUE procedure in analysis of wetland model predictive uncertainty.

    PubMed

    Huang, Chun-Wei; Lin, Yu-Pin; Chiang, Li-Chi; Wang, Yung-Chieh

    2014-07-01

    This study develops a procedure that is related to Generalized Likelihood Uncertainty Estimation (GLUE), called the CV-GLUE procedure, for assessing the predictive uncertainty that is associated with different model structures with varying degrees of complexity. The proposed procedure comprises model calibration, validation, and predictive uncertainty estimation in terms of a characteristic coefficient of variation (characteristic CV). The procedure first performed two-stage Monte-Carlo simulations to ensure predictive accuracy by obtaining behavior parameter sets, and then the estimation of CV-values of the model outcomes, which represent the predictive uncertainties for a model structure of interest with its associated behavior parameter sets. Three commonly used wetland models (the first-order K-C model, the plug flow with dispersion model, and the Wetland Water Quality Model; WWQM) were compared based on data that were collected from a free water surface constructed wetland with paddy cultivation in Taipei, Taiwan. The results show that the first-order K-C model, which is simpler than the other two models, has greater predictive uncertainty. This finding shows that predictive uncertainty does not necessarily increase with the complexity of the model structure because in this case, the more simplistic representation (first-order K-C model) of reality results in a higher uncertainty in the prediction made by the model. The CV-GLUE procedure is suggested to be a useful tool not only for designing constructed wetlands but also for other aspects of environmental management.
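
    A toy sketch of the GLUE-style logic underlying the procedure, under assumed data and an arbitrary acceptance threshold: sample the uncertain parameter, keep the "behavioural" sets whose prediction error is acceptable, and summarise predictive uncertainty as a coefficient of variation of the behavioural outputs. The wetland model and numbers below are stand-ins, not those of the study.

      # Toy GLUE-style sketch: keep "behavioural" parameter sets whose error is below
      # a threshold, then summarise predictive uncertainty as a coefficient of
      # variation of the behavioural model outputs. Model and data are invented.
      import numpy as np

      rng = np.random.default_rng(0)

      def wetland_model(k, inflow_conc):
          """Stand-in first-order removal model: outflow = inflow * exp(-k)."""
          return inflow_conc * np.exp(-k)

      observed_outflow = 4.0
      inflow_conc = 10.0

      # Monte-Carlo sampling of the uncertain parameter
      k_samples = rng.uniform(0.1, 2.0, size=5000)
      predictions = wetland_model(k_samples, inflow_conc)
      errors = np.abs(predictions - observed_outflow)

      # Behavioural sets: error below an (arbitrary) acceptance threshold
      behavioural = predictions[errors < 0.5]

      cv = behavioural.std(ddof=1) / behavioural.mean()
      print(f"{behavioural.size} behavioural sets, characteristic CV ≈ {cv:.3f}")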

  15. A Benchtop Fractionation Procedure for Subcellular Analysis of the Plant Metabolome

    PubMed Central

    Fürtauer, Lisa; Weckwerth, Wolfram; Nägele, Thomas

    2016-01-01

    Although compartmentation is a key feature of eukaryotic cells, biological research is frequently limited by methods allowing for the comprehensive subcellular resolution of the metabolome. It has been widely accepted that such a resolution would be necessary in order to approximate cellular biochemistry and metabolic regulation, yet technical challenges still limit both the reproducible subcellular fractionation and the sample throughput being necessary for a statistically robust analysis. Here, we present a method and a detailed protocol which is based on the non-aqueous fractionation technique enabling the assignment of metabolites to their subcellular localization. The presented benchtop method aims at unraveling subcellular metabolome dynamics in a precise and statistically robust manner using a relatively small amount of tissue material. The method is based on the separation of cellular fractions via density gradients consisting of organic, non-aqueous solvents. By determining the relative distribution of compartment-specific marker enzymes together with metabolite profiles over the density gradient it is possible to estimate compartment-specific metabolite concentrations by correlation. To support this correlation analysis, a spreadsheet is provided executing a calculation algorithm to determine the distribution of metabolites over subcellular compartments. The calculation algorithm performs correlation of marker enzyme activity and metabolite abundance accounting for technical errors, reproducibility and the resulting error propagation. The method was developed, tested and validated in three natural accessions of Arabidopsis thaliana showing different ability to acclimate to low temperature. Particularly, amino acids were strongly shuffled between subcellular compartments in a cold-sensitive accession while a cold-tolerant accession was characterized by a stable subcellular metabolic homeostasis. Finally, we conclude that subcellular metabolome analysis is
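
    The correlation step can be viewed as expressing a metabolite's distribution over the gradient fractions as a non-negative mixture of the compartment marker-enzyme distributions. The sketch below does this with non-negative least squares on invented numbers; the error propagation performed by the authors' spreadsheet is omitted.

      # Sketch of assigning a metabolite to compartments from non-aqueous fractions:
      # solve  fraction_profile ≈ markers @ weights  with non-negative weights,
      # then normalise the weights to percentages. Numbers are illustrative.
      import numpy as np
      from scipy.optimize import nnls

      # Rows: gradient fractions; columns: marker enzymes for three compartments
      # (e.g. chloroplast, cytosol, vacuole), given as relative activity per fraction.
      markers = np.array([
          [0.60, 0.10, 0.05],
          [0.25, 0.30, 0.15],
          [0.10, 0.40, 0.30],
          [0.05, 0.20, 0.50],
      ])
      metabolite_profile = np.array([0.40, 0.27, 0.21, 0.12])  # relative abundance

      weights, residual = nnls(markers, metabolite_profile)
      shares = 100.0 * weights / weights.sum()
      for name, s in zip(["chloroplast", "cytosol", "vacuole"], shares):
          print(f"{name:12s} {s:5.1f}%")
      print(f"fit residual: {residual:.3f}")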

  16. Development and validation of a novel data analysis procedure for spherical nanoindentation

    NASA Astrophysics Data System (ADS)

    Pathak, Siddhartha

    This dissertation presents a novel approach for converting the raw load-displacement data measured in spherical nanoindentation into much more meaningful indentation stress-strain curves. This new method entails a novel definition of the indentation strain and a new procedure for establishing the effective zero point in the raw dataset---both with and without the use of the continuous stiffness measurement (CSM) data. The concepts presented here have been validated by simulations and experiments on isotropic metallic samples of aluminum and tungsten. It is demonstrated that these new indentation stress-strain curves accurately capture the loading and unloading elastic moduli, the indentation yield points, as well as the post-yield characteristics in the tested samples. Subsequently this approach has been applied on a wide range of material systems including metals, carbon nanotubes (CNTs), ceramics and bone. In metals, these data analysis techniques have been highly successful in explaining several of the surface preparation artifacts typically encountered during nanoindentation measurements. This approach has also been extended to anisotropic polycrystalline samples, where a judicious combination of Orientation Imaging Microscopy (OIM) and nanoindentation were used to estimate, for the first time, the changes in slip resistance in deformed grains of Fe-3%Si steel. Similar studies on dense CNT brushes, with ˜10 times higher density than CNT brushes produced by other methods, demonstrate the higher modulus (˜17-20 GPa) and orders of magnitude higher resistance to buckling in these brushes than vapor phase deposited CNT brushes or carbon walls, showing their promise for energy-absorbing coatings. Even for a complex hierarchical material system like bone, these techniques have elucidated trends in the elastic and yield behavior at the lamellar level in the femora (thigh bone) of different inbred mouse strains. Thus bone with a higher mineral-to-matrix ratio
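
    For context, indentation stress-strain analyses of this kind combine load, displacement and contact stiffness through Hertzian contact relations; the sketch below applies one common set of definitions (contact radius from the stiffness, stress = P/(pi a^2), strain proportional to h/a) to synthetic elastic data and is not the dissertation's exact zero-point algorithm.

      # Synthetic illustration of converting spherical-indentation data to
      # indentation stress-strain using one common set of definitions
      # (sigma = P / (pi a^2), epsilon = 4 h / (3 pi a), a from the contact stiffness).
      import numpy as np

      E_eff = 100e9            # effective modulus from the elastic segment, Pa (assumed)
      R = 10e-6                # indenter radius, m (assumed)
      h = np.linspace(1e-9, 200e-9, 50)            # corrected displacement, m
      P = 4.0 / 3.0 * E_eff * np.sqrt(R) * h**1.5  # Hertzian load
      S = 2.0 * E_eff * np.sqrt(R * h)             # elastic contact stiffness, N/m

      a = S / (2.0 * E_eff)                        # contact radius from stiffness
      stress = P / (np.pi * a**2)                  # indentation stress, Pa
      strain = 4.0 * h / (3.0 * np.pi * a)         # indentation strain (dimensionless)

      print(f"max indentation stress ≈ {stress.max()/1e9:.2f} GPa "
            f"at strain ≈ {strain.max():.3f}")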

  17. Somatic cell cloning in Buffalo (Bubalus bubalis): effects of interspecies cytoplasmic recipients and activation procedures.

    PubMed

    Kitiyanant, Y; Saikhun, J; Chaisalee, B; White, K L; Pavasuthipaisit, K

    2001-01-01

    Successful nuclear transfer (NT) of somatic cell nuclei from various mammalian species to enucleated bovine oocytes provides a universal cytoplast for NT in endangered or extinct species. Buffalo fetal fibroblasts were isolated from a day 40 fetus and were synchronized in presumptive G(0) by serum deprivation. Buffalo and bovine oocytes from abattoir ovaries were matured in vitro and enucleated at 22 h. In the first experiment, we compared the ability of buffalo and bovine oocyte cytoplasm to support in vitro development of NT embryos produced by buffalo fetal fibroblasts as donor nuclei. There were no significant differences (p > 0.05) between the NT embryos derived from buffalo and bovine oocytes, in fusion (74% versus 71%) and cleavage (77% versus 75%) rates, respectively. No significant differences were also observed in blastocyst development (39% versus 33%) and the mean cell numbers of day 7 cloned blastocysts (88.5 +/- 25.7 versus 51.7 +/- 5.4). In the second experiment, we evaluated the effects of activation with calcium ionophore A23187 on development of NT embryos after electrical fusion. A significantly higher (p < 0.05) percentage of blastocyst development was observed in the NT embryos activated by calcium ionophore and 6-DMAP when compared with 6-DMAP alone (33% versus 17%). The results indicate that the somatic nuclei from buffalo can be reprogrammed after transfer to enucleated bovine oocytes, resulting in the production of cloned buffalo blastocysts similar to those transferred into buffalo oocytes. Calcium ionophore used in conjunction with 6-DMAP effectively induces NT embryo development.

  18. Background Interference Procedure and discriminant function analysis in predicting clinically determined categories of learning disability.

    PubMed

    Mallinger, B I

    1977-06-01

    There is a need to determine the extent to which the "Background Interference Procedure" as an adjunct to the Bender-Gestalt can account for criterion variance beyond that level predicted by an optimal battery. Discriminant functions empirically classified subjects into clinical categories of learning disability. A reduced battery of intellective and visual-motor predictors generated two significant functions, accounting for 91% of the variance. The first dimension reflected over-all intellectual functioning, the second, psychomotor skills. Empirical classification accurately categorized 71% of all subjects across five criterion groups. The functions efficiently separated the criteria, but the six Background Interference Procedure predictor variables did not improve prediction. Implications include using the Background Interference Procedure for early screening of learning disabilities and employing discriminant functions for data reduction and construct validation of teachers' and judges' ratings.

  19. An analysis of the Home Office Statistics of Scientific Procedures on Living Animals, Great Britain 2004.

    PubMed

    Hudson, Michelle; Bhogal, Nirmala

    2006-02-01

    The 2004 Statistics of Scientific Procedures on Living Animals were released by the Home Office in December 2005. They indicate that, for the third year running, there has been a significant increase in the number of laboratory animal procedures undertaken in Great Britain, and that increasing numbers of animals are involved. The overall trends in the use of toxicological and non-toxicological procedures involving animals are described. Particular emphasis is placed on the production and use of genetically modified animals, the production of biological materials, and acute toxicity testing. The use of non-human primates and dogs is also discussed. The implications of these latest statistics are considered with reference to the implementation of the Three Rs and their consequences for animal welfare.

  20. Transport of Space Environment Electrons: A Simplified Rapid-Analysis Computational Procedure

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Anderson, Brooke M.; Cucinotta, Francis A.; Wilson, John W.; Katz, Robert; Chang, C. K.

    2002-01-01

    A computational procedure for describing transport of electrons in condensed media has been formulated for application to effects and exposures from spectral distributions typical of electrons trapped in planetary magnetic fields. The procedure is based on earlier parameterizations established from numerous electron beam experiments. New parameterizations have been derived that logically extend the domain of application to low molecular weight (high hydrogen content) materials and higher energies (approximately 50 MeV). The production and transport of high energy photons (bremsstrahlung) generated in the electron transport processes have also been modeled using tabulated values of photon production cross sections. A primary purpose for developing the procedure has been to provide a means for rapidly performing numerous repetitive calculations essential for electron radiation exposure assessments for complex space structures. Several favorable comparisons have been made with previous calculations for typical space environment spectra, which have indicated that accuracy has not been substantially compromised at the expense of computational speed.

  1. Evaluation of shoulder function in clavicular fracture patients after six surgical procedures based on a network meta-analysis.

    PubMed

    Huang, Shou-Guo; Chen, Bo; Lv, Dong; Zhang, Yong; Nie, Feng-Feng; Li, Wei; Lv, Yao; Zhao, Huan-Li; Liu, Hong-Mei

    2017-01-01

    Purpose Using a network meta-analysis approach, our study aims to develop a ranking of six surgical procedures, that is, Plate, titanium elastic nail (TEN), tension band wire (TBW), hook plate (HP), reconstruction plate (RP) and Knowles pin, by comparing the post-surgery Constant shoulder scores in patients with clavicular fracture (CF). Methods A comprehensive search of electronic scientific literature databases was performed to retrieve publications investigating surgical procedures in CF under stringent eligibility criteria, and clinical experimental studies of high quality and relevance to our area of interest were selected for network meta-analysis. Statistical analyses were conducted using Stata 12.0. Results A total of 19 studies that met our inclusion criteria were eventually enrolled in our network meta-analysis, representing 1164 patients who had undergone surgical procedures for CF (TEN group = 240; Plate group = 164; TBW group = 180; RP group = 168; HP group = 245; Knowles pin group = 167). The network meta-analysis results revealed that RP significantly improved the Constant shoulder score in patients with CF when compared with TEN, and the post-operative Constant shoulder scores in patients with CF after Plate, TBW, HP, Knowles pin and TEN were similar, with no statistically significant differences. The relative treatment ranking based on the predictive probabilities of Constant shoulder scores after surgery revealed that the surface under the cumulative ranking curve (SUCRA) value was highest for RP. Conclusion The current network meta-analysis suggests that RP may be the optimum surgical treatment among the six interventions for patients with CF, and that it can improve the shoulder score of patients with CF. Implications for Rehabilitation RP improves shoulder joint function after the surgical procedure. RP achieves stability with minimal complications after surgery. RP may be the optimum surgical treatment for
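
    The SUCRA values referred to above are computed from the rank probabilities of each treatment; a small sketch with an invented rank-probability matrix (not the study's estimates) is shown below.

      # SUCRA (surface under the cumulative ranking curve) from rank probabilities.
      # rank_probs[i, j] = probability that treatment i has rank j+1 (1 = best).
      # The matrix below is invented for illustration.
      import numpy as np

      treatments = ["RP", "Plate", "TBW", "HP", "Knowles pin", "TEN"]
      rank_probs = np.array([
          [0.45, 0.25, 0.15, 0.08, 0.05, 0.02],
          [0.20, 0.25, 0.20, 0.15, 0.12, 0.08],
          [0.12, 0.18, 0.22, 0.20, 0.16, 0.12],
          [0.10, 0.15, 0.18, 0.22, 0.20, 0.15],
          [0.08, 0.10, 0.15, 0.20, 0.24, 0.23],
          [0.05, 0.07, 0.10, 0.15, 0.23, 0.40],
      ])

      a = rank_probs.shape[1]                       # number of treatments
      cum = np.cumsum(rank_probs, axis=1)[:, :-1]   # cumulative probs, ranks 1..a-1
      sucra = cum.sum(axis=1) / (a - 1)

      for t, s in sorted(zip(treatments, sucra), key=lambda x: -x[1]):
          print(f"{t:12s} SUCRA = {s:.2f}")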

  2. Neutron activation analysis; A sensitive test for trace elements

    SciTech Connect

    Hossain, T.Z. . Ward Lab.)

    1992-01-01

    This paper discusses neutron activation analysis (NAA), an extremely sensitive technique for determining the elemental constituents of an unknown specimen. Currently, there are some twenty-five moderate-power TRIGA reactors scattered across the United States (fourteen of them at universities), and one of their principal uses is for NAA. NAA is procedurally simple. A small amount of the material to be tested (typically between one and one hundred milligrams) is irradiated for a period that varies from a few minutes to several hours in a neutron flux of around 10¹² neutrons per square centimeter per second. A tiny fraction of the nuclei present (about 10⁻⁸) is transmuted by nuclear reactions into radioactive forms. Subsequently, the nuclei decay, and the energy and intensity of the gamma rays that they emit can be measured in a gamma-ray spectrometer.

  3. Analysis of regression methods for solar activity forecasting

    NASA Technical Reports Server (NTRS)

    Lundquist, C. A.; Vaughan, W. W.

    1979-01-01

    The paper deals with the potential use of the most recent solar data to project trends in the next few years. Assuming that a mode of solar influence on weather can be identified, advantageous use of that knowledge presumably depends on estimating future solar activity. A frequently used technique for solar cycle predictions is a linear regression procedure along the lines formulated by McNish and Lincoln (1949). The paper presents a sensitivity analysis of the behavior of such regression methods relative to the following aspects: cycle minimum, time into cycle, composition of historical data base, and unnormalized vs. normalized solar cycle data. Comparative solar cycle forecasts for several past cycles are presented as to these aspects of the input data. Implications for the current cycle, No. 21, are also given.
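
    McNish-Lincoln-type schemes regress the remainder of the current cycle on the departure of the observations to date from the mean historical cycle; the sketch below conveys the flavour of such a regression on synthetic cycles and is not the implementation used in the paper.

      # Toy McNish-Lincoln-flavoured regression: predict a later point of the current
      # cycle from its current departure from the mean historical cycle.
      # Synthetic "cycles" stand in for historical sunspot data.
      import numpy as np

      rng = np.random.default_rng(1)

      # historical cycles: rows = cycles, columns = months into cycle
      n_cycles, n_months = 12, 132
      base = 80.0 * np.sin(np.pi * np.arange(n_months) / n_months)
      cycles = base + rng.normal(0.0, 15.0, size=(n_cycles, n_months))

      mean_cycle = cycles.mean(axis=0)
      t_now, t_pred = 24, 48                 # months into the cycle: now and target

      # regress departure at t_pred on departure at t_now across historical cycles
      x = cycles[:, t_now] - mean_cycle[t_now]
      y = cycles[:, t_pred] - mean_cycle[t_pred]
      slope = (x * y).sum() / (x * x).sum()  # least squares through the origin

      observed_now = 95.0                    # hypothetical current observation
      predicted = mean_cycle[t_pred] + slope * (observed_now - mean_cycle[t_now])
      print(f"predicted activity at month {t_pred}: {predicted:.1f}")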

  4. Neutron activation analysis of major, minor, and trace elements in marine sediments

    SciTech Connect

    Stone, S.F.; Zeisler, R.; Koster, B.J.

    1988-01-01

    Neutron activation analysis (NAA) techniques are well established in the multielement assay of geological materials. Similarly, applications of NAA to the analysis of marine sediments have been described. The different emphasis on elemental composition in studying and monitoring the health of the environment, however, presents a new challenge to the analyst. To investigate as many elements as possible, previous multielement procedures need to be reevaluated and modified. In this work, the authors have utilized the NAA steps of a recently developed sequential analysis procedure that obtained concentrations for 45 biological and pollutant elements in marine bivalves. This procedure, with modification, was applied to samples of marine sediments collected for the National Oceanic and Atmospheric Administration (NOAA) National Status and Trends (NS&T) specimen banking program.

  5. Investigation of Deterioration Behavior of Hysteretic Loops in Nonlinear Static Procedure Analysis of Concrete Structures with Shear Walls

    SciTech Connect

    Ghodrati Amiri, G.; Amidi, S.; Khorasani, M.

    2008-07-08

    In recent years, the seismic rehabilitation of structures has developed and viewpoints have shifted from providing sufficient strength to assessing the performance of structures (performance-based design) in order to achieve a safe design. Nonlinear Static Procedure (NSP) analysis, or pushover analysis, is a newer method chosen for its speed and simplicity of calculation; it is considered in the 'Seismic Rehabilitation Code for Existing Buildings' and in FEMA 356. The result of this analysis is a target displacement that is the basis of the performance-based rehabilitation procedure for the structure, so accurate determination of that displacement improves the usefulness of pushover analysis. At present, Nonlinear Dynamic Procedure (NDP) analysis is the only method that can apply seismic ground motions exactly, but because it is time-consuming, costly and more difficult than other methods, it is not used as widely as NSP. A coefficient used in NSP for determining the target displacement is C2 (the stiffness and strength degradation coefficient), which corrects for the errors introduced by neglecting stiffness and strength degradation in the hysteretic loops. In this study, three concrete frames with shear walls were analyzed using several acceleration records scaled according to FEMA 273 and FEMA 356. The structures were designed to the Iranian 2800 standard (ver. 3). Finally, after pushover analysis and comparison of the results with dynamic analysis, the calculated C2 was compared with the values given in the rehabilitation codes.
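
    For reference, the FEMA 356 coefficient method computes the target displacement as δt = C0·C1·C2·C3·Sa·(Te²/4π²)·g, with C2 accounting for stiffness and strength degradation of the hysteretic loops; the short sketch below evaluates this expression for illustrative coefficient values that are not taken from the study.

      # FEMA 356 coefficient-method target displacement:
      #   delta_t = C0 * C1 * C2 * C3 * Sa * (Te^2 / (4 * pi^2)) * g
      # Coefficient values below are illustrative, not taken from the study.
      import math

      def target_displacement(C0, C1, C2, C3, Sa_g, Te_s, g=9.81):
          return C0 * C1 * C2 * C3 * (Sa_g * g) * Te_s**2 / (4.0 * math.pi**2)

      if __name__ == "__main__":
          # C2 > 1.0 penalises pinched / degrading hysteretic loops
          for C2 in (1.0, 1.1, 1.2):
              d = target_displacement(C0=1.3, C1=1.0, C2=C2, C3=1.0,
                                      Sa_g=0.6, Te_s=0.8)
              print(f"C2 = {C2:.1f}  ->  target displacement ≈ {d*1000:.0f} mm")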

  6. Simple procedure for the synthesis of high specific activity tritiated (6S)-5-formyltetrahydrofolate

    SciTech Connect

    Moran, R.G.; Colman, P.D.

    1982-05-01

    The 5-position of tetrahydrofolate was found to be unusually reactive with low concentrations of formic acid in the presence of a water-soluble carbodiimide. The product of this reaction has neutral and acid ultraviolet spectra and chromatographic behavior consistent with its identity as 5-formyltetrahydrofolate (leucovorin). When enzymatically synthesized (6S)-tetrahydrofolate was used as starting material, the product supported the growth of folate-depleted L1210 cells at one-half the concentration required for authentic (6R,S)-leucovorin. This reaction has been used to produce high specific activity (44 Ci/mmol) [³H](6S)-5-formyltetrahydrofolate in high yield. Experiments with [¹⁴C]formic acid indicate that 1 mol of formate reacted per mol of tetrahydrofolate but that no reaction occurred with a variety of other folate compounds. (6S)-5-Formyltetrahydrofolate, labeled in the formyl group with ¹⁴C, has also been synthesized using this reaction. These easily produced, labeled folates should allow close examination of the transport and utilization of leucovorin and of the mechanism of reversal of methotrexate toxicity by reduced folate cofactors.

  7. To what extent are surgery and invasive procedures effective beyond a placebo response? A systematic review with meta-analysis of randomised, sham controlled trials

    PubMed Central

    Jonas, Wayne B; Crawford, Cindy; Colloca, Luana; Kaptchuk, Ted J; Moseley, Bruce; Miller, Franklin G; Kriston, Levente; Linde, Klaus; Meissner, Karin

    2015-01-01

    Objectives To assess the quantity and quality of randomised, sham-controlled studies of surgery and invasive procedures and estimate the treatment-specific and non-specific effects of those procedures. Design Systematic review and meta-analysis. Data sources We searched PubMed, EMBASE, CINAHL, CENTRAL (Cochrane Library), PILOTS, PsycInfo, DoD Biomedical Research, clinicaltrials.gov, NLM catalog and NIH Grantee Publications Database from their inception through January 2015. Study selection We included randomised controlled trials of surgery and invasive procedures that penetrated the skin or an orifice and had a parallel sham procedure for comparison. Data extraction and analysis Three authors independently extracted data and assessed risk of bias. Studies reporting continuous outcomes were pooled and the standardised mean difference (SMD) with 95% CIs was calculated using a random effects model for difference between true and sham groups. Results 55 studies (3574 patients) were identified meeting inclusion criteria; 39 provided sufficient data for inclusion in the main analysis (2902 patients). The overall SMD of the continuous primary outcome between treatment/sham-control groups was 0.34 (95% CI 0.20 to 0.49; p<0.00001; I2=67%). The SMD for surgery versus sham surgery was non-significant for pain-related conditions (n=15, SMD=0.13, p=0.08), marginally significant for studies on weight loss (n=10, SMD=0.52, p=0.05) and significant for gastroesophageal reflux disorder (GERD) studies (n=5, SMD=0.65, p<0.001) and for other conditions (n=8, SMD=0.44, p=0.004). Mean improvement in sham groups relative to active treatment was larger in pain-related conditions (78%) and obesity (71%) than in GERD (57%) and other conditions (57%), and was smaller in classical-surgery trials (21%) than in endoscopic trials (73%) and those using percutaneous procedures (64%). Conclusions The non-specific effects of surgery and other invasive procedures are generally large. Particularly in
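
    Pooled standardised mean differences of this kind are commonly obtained with a DerSimonian-Laird random-effects model; the compact sketch below applies that estimator to invented study-level SMDs and variances, not the review's data.

      # DerSimonian-Laird random-effects pooling of standardised mean differences.
      # Study-level SMDs and variances below are invented for illustration.
      import numpy as np

      smd = np.array([0.10, 0.45, 0.30, 0.65, 0.20])
      var = np.array([0.04, 0.06, 0.05, 0.09, 0.03])

      w_fixed = 1.0 / var
      q = np.sum(w_fixed * (smd - np.sum(w_fixed * smd) / np.sum(w_fixed))**2)
      df = len(smd) - 1
      c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
      tau2 = max(0.0, (q - df) / c)                  # between-study variance

      w_rand = 1.0 / (var + tau2)
      pooled = np.sum(w_rand * smd) / np.sum(w_rand)
      se = np.sqrt(1.0 / np.sum(w_rand))
      i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0

      print(f"pooled SMD = {pooled:.2f} (95% CI {pooled - 1.96*se:.2f} "
            f"to {pooled + 1.96*se:.2f}), I^2 = {i2:.0f}%")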

  8. Evaluation of solution procedures for material and/or geometrically nonlinear structural analysis by the direct stiffness method.

    NASA Technical Reports Server (NTRS)

    Stricklin, J. A.; Haisler, W. E.; Von Riesemann, W. A.

    1972-01-01

    This paper presents an assessment of the solution procedures available for the analysis of inelastic and/or large-deflection structural behavior. A literature survey is given which summarizes the contributions of other researchers to the analysis of structural problems exhibiting material nonlinearities and combined geometric-material nonlinearities. Attention is focused on evaluating the available computation and solution techniques. Each of the solution techniques is developed from a common equation of equilibrium in terms of pseudo forces. The solution procedures are applied to circular plates and shells of revolution in an attempt to compare and evaluate each with respect to computational accuracy, economy, and efficiency. Based on the numerical studies, observations and comments are made with regard to the accuracy and economy of each solution technique.

  9. Input strategy analysis for an air quality data modelling procedure at a local scale based on neural network.

    PubMed

    Ragosta, M; D'Emilio, M; Giorgio, G A

    2015-05-01

    In recent years, a significant part of the studies on air pollutants has been devoted to improving statistical techniques for forecasting the values of their concentrations in the atmosphere. Reliable predictions of pollutant trends are essential not only for setting up preventive measures able to avoid risks for human health but also for helping stakeholders to take decisions about traffic limitations. In this paper, we present an operating procedure, including both pollutant concentration measurements (CO, SO₂, NO₂, O₃, PM10) and meteorological parameters (hourly data of atmospheric pressure, relative humidity and wind speed), which improves on the simple use of a neural network for the prediction of pollutant concentration trends by integrating multivariate statistical analysis. In particular, we used principal component analysis in order to define an unconstrained mix of variables able to improve the performance of the model. The developed procedure is particularly suitable for characterizing the investigated phenomena at a local scale.
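
    A minimal sketch of the general idea, assuming synthetic data and an arbitrary network architecture (neither reflects the study's configuration): principal components of the pollutant and meteorological inputs feed a small neural-network regressor.

      # Sketch: principal-component preprocessing of pollutant + meteorological
      # inputs feeding a small neural network regressor (synthetic data only).
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(42)
      n = 2000
      X = rng.normal(size=(n, 8))           # CO, SO2, NO2, O3, PM10, pressure, RH, wind
      y = 1.5 * X[:, 4] + 0.8 * X[:, 2] - 0.5 * X[:, 7] + rng.normal(0.0, 0.3, n)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

      model = make_pipeline(StandardScaler(),
                            PCA(n_components=5),
                            MLPRegressor(hidden_layer_sizes=(16,),
                                         max_iter=2000, random_state=0))
      model.fit(X_tr, y_tr)
      print(f"R^2 on held-out data: {model.score(X_te, y_te):.2f}")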

  10. 76 FR 78015 - Revised Analysis and Mapping Procedures for Non-Accredited Levees

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-15

    ... terms of its feasibility, flexibility, and collaborative nature. DATES: Comments must be received by... revise the mapping procedures for non-accredited levees. This approach works within the confines of existing federal regulations, yet is more flexible, collaborative and feasible. FEMA is replacing...

  11. A Comparative Analysis of British and Taiwanese Students' Conceptual and Procedural Knowledge of Fraction Addition

    ERIC Educational Resources Information Center

    Li, Hui-Chuan

    2014-01-01

    This study examines students' procedural and conceptual achievement in fraction addition in England and Taiwan. A total of 1209 participants (561 British students and 648 Taiwanese students) at ages 12 and 13 were recruited from England and Taiwan to take part in the study. A quantitative design by means of a self-designed written test is adopted…

  12. Recession Vs Myotomy–Comparative Analysis of Two Surgical Procedures of Weakening Inferior Oblique Muscle Overaction

    PubMed Central

    Alajbegovic-Halimic, Jasmina; Zvizdic, Denisa; Sahbegovic-Holcner, Amra; Kulanic-Kuduzovic, Amira

    2015-01-01

    Introduction: Inferior oblique overaction (IOOA) can be primary or secondary, isolated or combined with other types of horizontal deviation, mostly esotropias. Surgical weakening of IOOA comprises several techniques, such as recession, myotomy, myectomy and anteroposition. Goals: We analyzed the effect of surgical weakening of the inferior oblique muscle, comparing two groups of patients with primary hypertropia. Material and methods: In a 5-year retrospective study, we observed 33 patients who underwent surgical weakening of inferior oblique muscle overaction by one of two methods: recession or myotomy. Results: Of the 33 patients, 57.6% were male and 42.4% female, with an average age of 10.6±7.5 years (range 4–36). Isolated primary hypertropia accounted for 33.3% of cases, and 66.7% were combined with esotropia. Recession was performed in 23 (69.9%) patients and myotomy in 10 (30.1%). A better effect and binocularity were achieved in 65.2% of patients in the recession group, which was statistically significant (p<0.05; χ2=5.705, p=0.021). Conclusion: Comparing the two surgical procedures for weakening inferior oblique muscle overaction, recession is the better procedure. PMID:26261384

  13. Diagnosing Behavior Disorders: An Analysis of State Definitions, Eligibility Criteria and Recommended Procedures.

    ERIC Educational Resources Information Center

    Swartz, Stanley L.; And Others

    Using information collected in a survey of all 50 states and the District of Columbia, the study analyzed state definitions of the "behavior disordered/emotionally disturbed" (BD/ED) category of handicapped children, program entrance and exit criteria, and procedures for referral, evaluation, and program placement. A general lack of…

  14. A Qualitative Analysis of the Determinants in the Choice of a French Journal Reviewing Procedures

    ERIC Educational Resources Information Center

    Morge, Ludovic

    2015-01-01

    Between 1993 and 2010, two French journals (Aster and Didaskalia) coming from different backgrounds but belonging to the same institution used to publish papers on research in science and technology education. The merging of these journals made it necessary for them to compare the different reviewing procedures used by each. This merging occurred…

  15. Finite element procedures for coupled linear analysis of heat transfer, fluid and solid mechanics

    NASA Technical Reports Server (NTRS)

    Sutjahjo, Edhi; Chamis, Christos C.

    1993-01-01

    Coupled finite element formulations for fluid mechanics, heat transfer, and solid mechanics are derived from the conservation laws for energy, mass, and momentum. To model the physics of interactions among the participating disciplines, the linearized equations are coupled by combining domain and boundary coupling procedures. An iterative numerical solution strategy is presented to solve the equations, with partitioned temporal discretization.
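
    The staggered solution of coupled linearized fields can be illustrated with a small partitioned (Gauss-Seidel) iteration between two subsystems; the matrices below are invented stand-ins for the coupled thermal and mechanical equations, not the formulation derived in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def spd(n):
        """Random symmetric positive definite matrix (a stand-in 'stiffness')."""
        a = rng.normal(size=(n, n))
        return a @ a.T + n * np.eye(n)

    n = 5
    K_T, K_u = spd(n), spd(n)                 # "thermal" and "mechanical" blocks
    C_Tu = 0.3 * rng.normal(size=(n, n))      # weak coupling blocks
    C_uT = 0.3 * rng.normal(size=(n, n))
    f_T, f_u = rng.normal(size=n), rng.normal(size=n)

    T = np.zeros(n)
    u = np.zeros(n)
    for it in range(100):
        # Solve each field in turn, holding the other field fixed.
        T_new = np.linalg.solve(K_T, f_T - C_Tu @ u)
        u_new = np.linalg.solve(K_u, f_u - C_uT @ T_new)
        change = max(np.max(np.abs(T_new - T)), np.max(np.abs(u_new - u)))
        T, u = T_new, u_new
        if change < 1e-10:
            break

    print("converged after", it + 1, "staggered iterations")
    ```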

  16. 7 CFR 201.51a - Special procedures for purity analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... kinds. After completing the blowing procedure, remove all weed and other crop seeds from the light portion and add these to the weed or other crop separation, as appropriate. The remainder of the light portion shall be considered inert matter. Remove all weed and other crop seeds and other inert...

  17. Correcting for Indirect Range Restriction in Meta-Analysis: Testing a New Meta-Analytic Procedure

    ERIC Educational Resources Information Center

    Le, Huy; Schmidt, Frank L.

    2006-01-01

    Using computer simulation, the authors assessed the accuracy of J. E. Hunter, F. L. Schmidt, and H. Le's (2006) procedure for correcting for indirect range restriction, the most common type of range restriction, in comparison with the conventional practice of applying the Thorndike Case II correction for direct range restriction. Hunter et…

  18. A qualitative analysis of the determinants in the choice of a French journal reviewing procedures

    NASA Astrophysics Data System (ADS)

    Morge, Ludovic

    2015-12-01

    Between 1993 and 2010, two French journals (Aster and Didaskalia), coming from different backgrounds but belonging to the same institution, published papers on research in science and technology education. The merging of these journals made it necessary to compare the different reviewing procedures used by each. The merger occurred at a time when research is becoming increasingly international, which partly determines some of the choices made about reviewing procedures. For a francophone international journal to survive, it needs to take this internationalization into account in a reasoned manner. The author of this article, as chief editor of RDST (Recherches en Didactique des Sciences et des Technologies), the journal resulting from the merger, analyses the social, cultural and pragmatic determinants that shaped the choices made in the reviewing procedures. This paper describes how this diversity of factors led to dropping the idea of a standard reviewing procedure that would be valid for all journals.

  19. The Generalized Johnson-Neyman Procedures: An Approach to Covariate Adjustment and Interaction Analysis.

    ERIC Educational Resources Information Center

    Forster, Fred

    Statistical methods are described for diagnosing and treating three important problems in covariate tests of significance: curvilinearity, covariable effectiveness, and treatment-covariable interaction. Six major assumptions, prerequisites for covariate procedure, are discussed in detail: (1) normal distribution, (2) homogeneity of variances, (3)…

  20. Response-restriction analysis: II. Alteration of activity preferences.

    PubMed Central

    Hanley, Gregory P; Iwata, Brian A; Roscoe, Eileen M; Thompson, Rachel H; Lindberg, Jana S

    2003-01-01

    We used response-restriction (RR) assessments to identify the preferences of 7 individuals with mental retardation for a variety of vocational and leisure activities. We subsequently increased their engagement in nonpreferred activities using several procedures: response restriction per se versus a Premack-type contingency (Study 1), supplemental reinforcement for engagement in target activities (Study 2), and noncontingent pairing of reinforcers with nonpreferred activities (Study 3). Results indicated that preferences are not immutable and can be altered through a variety of relatively benign interventions and that the results of RR assessments may be helpful in determining which types of procedures may be most effective on an individual basis. PMID:12723867

  1. 40 CFR 93.123 - Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Procedures for determining localized CO... Transit Laws § 93.123 Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis). (a) CO hot-spot analysis. (1) The demonstrations required by § 93.116 (“Localized CO, PM10,...

  2. A Refinement of Risk Analysis Procedures for Trichloroethylene Through the Use of Monte Carlo Method in Conjunction with Physiologically Based Pharmacokinetic Modeling

    DTIC Science & Technology

    1993-09-01

    This study refines risk analysis procedures for trichloroethylene (TCE) using a physiologically based pharmacokinetic (PBPK) model in conjunction...promulgate, and better present, more realistic standards.... Risk analysis, physiologically based pharmacokinetics, PBPK, trichloroethylene, Monte Carlo method.

  3. Application of response surface methodology to optimize solid-phase microextraction procedure for chromatographic determination of aroma-active monoterpenes in berries.

    PubMed

    Chmiel, Tomasz; Kupska, Magdalena; Wardencki, Waldemar; Namieśnik, Jacek

    2017-04-15

    Most scientific papers concern the qualitative or semi-quantitative analysis of aroma-active terpenes in liquid food matrices. Therefore, a procedure based on solid-phase microextraction and two-dimensional gas chromatography-time-of-flight mass spectrometry for the determination of monoterpenes in fresh berries was developed. The optimal extraction conditions using a divinylbenzene-carboxen-polydimethylsiloxane fiber were: temperature of 50°C, extraction time of 26 min, equilibrium time of 29 min. The developed procedure provides a high recovery (70.8-99.2%), good repeatability (CV<10.4%), high linearity (r>0.9915) and offers practical advantages over currently used methods: reliability of compound identification, simplicity of extraction and at least one order of magnitude lower detection limits (0.10-0.011μg/L). The method was successfully applied to determine monoterpenes in 27 berry samples of different varieties and 4 berry products. Tukey's test revealed that monoterpene content is a reliable indicator of fruit maturity and origin. This suggests that the method may be of interest to researchers and the food industry.
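
    The response-surface step can be sketched as a quadratic fit over two extraction factors followed by a bounded optimisation; the design points and responses below are invented, and only temperature and extraction time are modelled for brevity.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical design: extraction temperature (degC) and time (min),
    # with a measured total peak area response (arbitrary units).
    temp = np.array([40, 40, 60, 60, 35, 65, 50, 50, 50, 50], dtype=float)
    time = np.array([15, 35, 15, 35, 25, 25, 10, 40, 25, 25], dtype=float)
    resp = np.array([62, 75, 70, 74, 58, 71, 60, 76, 82, 81], dtype=float)

    # Full quadratic model: b0 + b1*T + b2*t + b3*T^2 + b4*t^2 + b5*T*t
    A = np.column_stack([np.ones_like(temp), temp, time, temp**2, time**2, temp * time])
    coef, *_ = np.linalg.lstsq(A, resp, rcond=None)

    def predicted(x):
        T, t = x
        return coef @ np.array([1.0, T, t, T**2, t**2, T * t])

    # Maximise the fitted surface inside the experimental region.
    best = minimize(lambda x: -predicted(x), x0=[50.0, 25.0],
                    bounds=[(35.0, 65.0), (10.0, 40.0)])
    print("optimum (temp, time):", np.round(best.x, 1),
          "predicted response:", round(float(predicted(best.x)), 1))
    ```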

  4. Cloning of the bronze locus in maize by a simple and generalizable procedure using the transposable controlling element Activator (Ac)

    PubMed Central

    Fedoroff, Nina V.; Furtek, Douglas B.; Nelson, Oliver E.

    1984-01-01

    The bronze (bz) locus of maize has been cloned by an indirect procedure utilizing the cloned transposable controlling element Activator (Ac). Restriction endonuclease fragments of maize DNA were cloned in bacteriophage λ and recombinant phage with homology to the center of the Ac element were isolated. The cloned fragments were analyzed to determine which contained sequences that were structurally identical to a previously isolated Ac element. Two such fragments were identified. Sequences flanking the Ac element were subcloned and used to probe genomic DNA from plants with well-defined mutations at the bz locus. By this means, it was established that one of the genomic clones contained a bz locus sequence. The subcloned probe fragment was then used to clone a nonmutant Bz allele of the locus. The method described here should prove useful in cloning other loci with Ac insertion mutations. PMID:16593478

  5. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases

    PubMed Central

    2012-01-01

    Background Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, several other saccharide materials can actually be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples; however, the chromatographic profiles are often extremely different and it is impossible to compare them and reliably identify the paint binder. Results This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works of art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. Conclusions The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from hundreds of centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database presents data also from those materials that only contain a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of

  6. Transcranial electrical stimulation during sleep enhances declarative (but not procedural) memory consolidation: Evidence from a meta-analysis.

    PubMed

    Barham, Michael P; Enticott, Peter G; Conduit, Russell; Lum, Jarrad A G

    2016-04-01

    This meta-analysis summarizes research examining whether transcranial electrical stimulation (transcranial direct current stimulation with oscillating and constant currents; transcranial alternating current stimulation), administered during sleep, can modulate declarative and procedural memory consolidation. Included in the meta-analysis were 13 experiments that represented data from 179 participants. Study findings were summarized using standardized mean difference (SMD) which is an effect size that summarizes differences in standard deviation units. Results showed electrical stimulation during sleep could enhance (SMD=0.447; p=.003) or disrupt (SMD=-0.476, p=.030) declarative memory consolidation. However, transcranial electric stimulation does not appear to be able to enhance (SMD=0.154, p=.279) or disrupt (SMD=0.076, p=.675) procedural memory consolidation. This meta-analysis provides strong evidence that TES is able to modulate some consolidation processes. Additional research is required to determine the mechanisms by which transcranial electrical stimulation is able to influence declarative memory consolidation. Finally, it is yet to be determined whether transcranial electrical stimulation can modulate procedural memory consolidation.

  7. Recent Data Analysis of Carbon Activation

    NASA Astrophysics Data System (ADS)

    Jiang, Hui Ming; Smith, Elizabeth; Padalino, Stephen; Baumgart, Leigh; Suny Geneseooltz, Katie; Colburn, Robyn; Fuschino, Julia

    2002-10-01

    A method for measuring tertiary neutrons produced in Inertial Confinement Fusion reactions has been developed using carbon activation. Ultra-pure samples of carbon, free from positron-emitting contaminants, must be used in the detection. Our primary goal has been to reduce the contamination level by refining purification and packaging procedures. This process involves baking the disks in a vacuum oven to 1000°C at 200 microns for a prescribed bake time without exposing the disks to nitrogen in the air, which is a major contaminant. Recent experiments were conducted to determine the optimal bake time for purification. Disks were baked for varying times, from one hour to five hours, and then exposed to high-neutron-yield (5 × 10^13) shots on OMEGA. Data collected were normalized to the same time interval and the same primary neutron yield, and no significant difference in the number of background counts was seen. Experimental results also indicated that disks that were exposed to air for short time intervals showed a significant increase in the number of contamination counts. This further supports our findings that the gaseous diffusion through graphite disks is very high. Experimental results of these findings will be presented. Research funded in part by the United States Department of Energy.

  8. Comparative Sensitivity Analysis of Muscle Activation Dynamics

    PubMed Central

    Rockenfeller, Robert; Günther, Michael; Schmitt, Syn; Götz, Thomas

    2015-01-01

    We mathematically compared two models of mammalian striated muscle activation dynamics proposed by Hatze and Zajac. Both models are representative of a broad variety of biomechanical models formulated as ordinary differential equations (ODEs). These models incorporate parameters that directly represent known physiological properties. Other parameters have been introduced to reproduce empirical observations. We used sensitivity analysis to investigate the influence of model parameters on the ODE solutions. In addition, we expanded an existing approach to treating initial conditions as parameters and to calculating second-order sensitivities. Furthermore, we used a global sensitivity analysis approach to include finite ranges of parameter values. Hence, a theoretician striving for model reduction could use the method for identifying particularly low sensitivities to detect superfluous parameters. An experimenter could use it for identifying particularly high sensitivities to improve parameter estimation. Hatze's nonlinear model incorporates some parameters to which activation dynamics is clearly more sensitive than to any parameter in Zajac's linear model. Unlike Zajac's model, however, Hatze's model can reproduce measured shifts in optimal muscle length with varied muscle activity. Accordingly, we extracted a specific parameter set for Hatze's model that combines best with a particular muscle force-length relation. PMID:26417379
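
    A toy illustration of the sensitivity computation itself, using a first-order activation model standing in for Zajac-type linear dynamics and a central finite-difference derivative with respect to the time constant; the model, parameter value and input pulse are placeholders, not the authors' formulations.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def u(t):
        """Square neural excitation pulse (illustrative input)."""
        return 1.0 if 0.1 <= t <= 0.4 else 0.0

    def solve(tau, t_eval):
        """Integrate da/dt = (u(t) - a) / tau, a first-order activation model."""
        sol = solve_ivp(lambda t, a: (u(t) - a) / tau, (0.0, 0.6), [0.0],
                        t_eval=t_eval, max_step=1e-3)
        return sol.y[0]

    t_eval = np.linspace(0.0, 0.6, 121)
    tau = 0.03  # activation time constant in seconds (illustrative value)

    # Central finite-difference sensitivity of the solution to tau.
    h = 1e-4
    a = solve(tau, t_eval)
    da_dtau = (solve(tau + h, t_eval) - solve(tau - h, t_eval)) / (2 * h)

    print("peak activation:", round(float(a.max()), 3))
    print("peak |da/dtau| (1/s):", round(float(np.abs(da_dtau).max()), 1))
    ```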

  9. Neutron activation analysis in archaeological chemistry

    SciTech Connect

    Harbottle, G.

    1987-01-01

    Neutron activation analysis has proven to be a convenient way of performing the chemical analysis of archaeologically-excavated artifacts and materials. It is fast and does not require tedious laboratory operations. It is multielement, sensitive, and can be made nondestructive. Neutron activation analysis in its instrumental form, i.e., involving no chemical separation, is ideally suited to automation and conveniently takes the first step in data flow patterns that are appropriate for many taxonomic and statistical operations. The future will doubtless see improvements in the practice of NAA in general, but in connection with archaeological science the greatest change will be the filling, interchange and widespread use of data banks based on compilations of analytical data. Since provenience-oriented data banks deal with materials (obsidian, ceramics, metals, semiprecious stones, building materials and sculptural media) that participated in trade networks, the analytical data is certain to be of interest to a rather broad group of archaeologists. It is to meet the needs of the whole archaeological community that archaeological chemistry must now turn.

  10. Analysis of low levels of rare earths by radiochemical neutron activation analysis

    USGS Publications Warehouse

    Wandless, G.A.; Morgan, J.W.

    1985-01-01

    A procedure for the radiochemical neutron-activation analysis for the rare earth elements (REE) involves the separation of the REE as a group by rapid ion-exchange methods and determination of yields by reactivation or by energy dispersive X-ray fluorescence (EDXRF) spectrometry. The U. S. Geological Survey (USGS) standard rocks, BCR-1 and AGV-1, were analyzed to determine the precision and accuracy of the method. We found that the precision was ±5-10% on the basis of replicate analysis and that, in general, the accuracy was within ±5% of accepted values for most REE. Data for USGS standard rocks BIR-1 (Icelandic basalt) and DNC-1 (North Carolina diabase) are also presented. © 1985 Akadémiai Kiadó.

  11. A MATERIAL COST-MINIMIZATION ANALYSIS FOR HERNIA REPAIRS AND MINOR PROCEDURES DURING A SURGICAL MISSION IN THE DOMINICAN REPUBLIC

    PubMed Central

    Cavallo, Jaime A.; Ousley, Jenny; Barrett, Christopher D.; Baalman, Sara; Ward, Kyle; Borchardt, Malgorzata; Thomas, J. Ross; Perotti, Gary; Frisella, Margaret M.; Matthews, Brent D.

    2013-01-01

    INTRODUCTION Expenditures on material supplies and medications constitute the greatest per capita costs for surgical missions. We hypothesized that supply acquisition at nonprofit organization (NPO) costs would lead to significant cost-savings compared to supply acquisition at US academic institution costs from the provider perspective for hernia repairs and minor procedures during a surgical mission in the Dominican Republic (DR). METHODS Items acquired for a surgical mission were uniquely QR-coded for accurate consumption accounting. Both NPO and US academic institution unit costs were associated with each item in an electronic inventory system. Medication doses were recorded and QR-codes for consumed items were scanned into a record for each sampled procedure. Mean material costs and cost savings ± SDs were calculated in US dollars for each procedure type. Cost-minimization analyses between the NPO and the US academic institution platforms for each procedure type ensued using a two-tailed Wilcoxon matched-pairs test with α=0.05. Item utilization analyses generated lists of most frequently used materials by procedure type. RESULTS The mean cost savings of supply acquisition at NPO costs for each procedure type were as follows: $482.86 ± $683.79 for unilateral inguinal hernia repair (IHR, n=13); $332.46 ± $184.09 for bilateral inguinal hernia repair (BIHR, n=3); $127.26 ± $13.18 for hydrocelectomy (HC, n=9); $232.92 ± $56.49 for femoral hernia repair (FHR, n=3); $120.90 ± $30.51 for umbilical hernia repair (UHR, n=8); $36.59 ± $17.76 for minor procedures (MP, n=26); and $120.66 ± $14.61 for pediatric inguinal hernia repair (PIHR, n=7). CONCLUSION Supply acquisition at NPO costs leads to significant cost-savings compared to supply acquisition at US academic institution costs from the provider perspective for IHR, HC, UHR, MP, and PIHR during a surgical mission to DR. Item utilization analysis can generate minimum-necessary material lists for each procedure
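
    The cost comparison described above can be reproduced in outline with a paired Wilcoxon test; the per-case costs below are invented placeholders, not mission data.

    ```python
    import numpy as np
    from scipy.stats import wilcoxon

    # Hypothetical per-case material costs (US dollars) for the same procedures
    # priced at NPO acquisition costs versus US academic institution costs.
    npo_cost      = np.array([118.0,  95.5, 130.2, 101.7, 122.4,  99.8, 110.3])
    academic_cost = np.array([240.1, 212.8, 260.0, 198.5, 251.9, 205.4, 231.0])

    savings = academic_cost - npo_cost

    # Two-tailed Wilcoxon matched-pairs (signed-rank) test, alpha = 0.05,
    # mirroring the comparison described in the record.
    stat, p = wilcoxon(npo_cost, academic_cost, alternative="two-sided")
    print(f"mean savings = ${savings.mean():.2f} +/- {savings.std(ddof=1):.2f} (SD)")
    print(f"Wilcoxon statistic = {stat:.1f}, p = {p:.4f}")
    ```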

  12. An integrated portfolio optimisation procedure based on data envelopment analysis, artificial bee colony algorithm and genetic programming

    NASA Astrophysics Data System (ADS)

    Hsu, Chih-Ming

    2014-12-01

    Portfolio optimisation is an important issue in the field of investment/financial decision-making and has received considerable attention from both researchers and practitioners. However, besides portfolio optimisation, a complete investment procedure should also include the selection of profitable investment targets and determine the optimal timing for buying/selling the investment targets. In this study, an integrated procedure using data envelopment analysis (DEA), artificial bee colony (ABC) and genetic programming (GP) is proposed to resolve a portfolio optimisation problem. The proposed procedure is evaluated through a case study on investing in stocks in the semiconductor sub-section of the Taiwan stock market for 4 years. The potential average 6-month return on investment of 9.31% from 1 November 2007 to 31 October 2011 indicates that the proposed procedure can be considered a feasible and effective tool for making outstanding investment plans, and thus making profits in the Taiwan stock market. Moreover, it is a strategy that can help investors to make profits even when the overall stock market suffers a loss.

  13. Analysis of KMnO4/NaOH battery header cleaning procedure

    SciTech Connect

    Douglas, S.C.; Bunker, B.C.; Proctor-Puissant, P.M.; Hallett, S.G.; Yelton, W.G.

    1988-12-01

    KMnO4/NaOH solutions are used to remove the oxides that form on the stainless steel conductor of battery headers during high-temperature glass sealing operations. The cleaning procedure has been evaluated to determine if these corrosive solutions damage TA-23 glass insulators to a degree that would make the headers unacceptable. Battery headers and solid pieces of TA-23 glass were tested, and a colorimetric method was developed to analyze reaction products in the solutions. The analyses showed that the solutions contained 1-10 µg SiO2 per mL, indicating minimal deterioration of the glass. The procedure is an acceptable method for cleaning battery headers containing TA-23 glass insulators. 11 refs., 4 figs., 2 tabs.

  14. A component analysis of toilet-training procedures recommended for young children.

    PubMed

    Greer, Brian D; Neidert, Pamela L; Dozier, Claudia L

    2016-03-01

    We evaluated the combined and sequential effects of 3 toilet-training procedures recommended for use with young children: (a) underwear, (b) a dense sit schedule, and (c) differential reinforcement. A total of 20 children participated. Classroom teachers implemented a toilet-training package consisting of all 3 procedures with 6 children. Of the 6 children, 2 showed clear and immediate improvements in toileting performance, and 3 showed delayed improvements. Teachers implemented components of the training package sequentially with 12 children. At least 2 of the 4 children who experienced the underwear component after baseline improved. Toileting performance did not improve for any of the 8 children who were initially exposed to either the dense sit schedule or differential reinforcement. When initial training components were ineffective, teachers implemented additional components sequentially until toileting performance improved or all components were implemented. Toileting performance often improved when underwear or differential reinforcement was later added.

  15. Robustness of two-step acid hydrolysis procedure for composition analysis of poplar.

    PubMed

    Bhagia, Samarthya; Nunez, Angelica; Wyman, Charles E; Kumar, Rajeev

    2016-09-01

    The NREL standard procedure for lignocellulosic biomass composition has two steps: primary hydrolysis in 72 wt% sulfuric acid at 30°C for 1 h followed by secondary hydrolysis of the slurry in 4 wt% acid at 121°C for 1 h. Although pointed out in the NREL procedure, the impact of particle size on composition has never been shown. In addition, the effects of primary hydrolysis time and separation of solids prior to secondary hydrolysis on composition have never been shown. Using poplar, it was found that particle sizes less than 0.250 mm significantly lowered the glucan content and increased the Klason lignin but did not affect xylan, acetate, or acid soluble lignin contents. Composition was unaffected for primary hydrolysis times between 30 and 90 min. Moreover, separating solids prior to secondary hydrolysis had a negligible effect on composition, suggesting that lignin and polysaccharides are completely separated in the primary hydrolysis stage.

  16. A modified release analysis procedure using advanced froth flotation mechanisms. Technical report, September 1--November 30, 1995

    SciTech Connect

    Honaker, R.Q.; Mohanty, M.K.

    1995-12-31

    The objective of this study is to reinvestigate the release analysis procedure, which is traditionally conducted using a laboratory Denver cell, and to develop a modified process that can be used for all froth flotation technologies. Recent studies have found that the separation performance achieved by multiple stage cleaning and, in some cases, single stage cleaning using column flotation is superior to the performance achieved by the traditional release procedure. These findings are a result of the advanced flotation mechanisms provided by column flotation, which will be incorporated into a modified release analysis procedure developed in this study. A fundamental model of an open column has been developed which incorporates the effects of system hydrodynamics, froth drop-back, selective and non-selective detachment, operating parameters, feed solids content, and feed component flotation kinetics. Simulation results obtained during this reporting period indicate that the ultimate separation that can be achieved by a column flotation process can only be obtained in a single cleaning stage if the detachment mechanism in the froth phase is highly selective, which does not appear to occur in practice based on experimental results. Two to three cleaning stages were found to be required to obtain the ultimate performance if non-selective detachment or kinetic limiting conditions are assumed. This simulated finding agrees well with the experimental results obtained from the multiple stage cleaning of an Illinois No. 5 seam coal using the Packed-Column. Simulated results also indicate that the separation performance achieved by column flotation improves with increasing feed solids content after carrying-capacity limiting conditions are realized. These findings will be utilized in the next reporting period to modify the traditional release analysis procedure.

  17. Laboratory guidelines and procedures for coal analysis: Volume 1, Assessing the cleanability of fine coal

    SciTech Connect

    Bosold, R.C.; Glessner, D.M.

    1988-05-01

    The conventional laboratory static bath float/sink method of measuring the theoretical limits of coal cleaning is unreliable for ultra-fine (minus 100M topsize) coal particles because of their long and erratic settling rates. Developing a reliable method to assess the theoretical cleanability of ultra-fine coal has been given impetus by the increased emphasis on reducing sulfur dioxide emissions from power plants, greater quantities of fines created by mechanized mining methods, and the development of advanced physical coal cleaning processes that grind coal to ultra-fine sizes in an effort to achieve high coal impurities liberation. EPRI, therefore, commissioned researchers at the Homer City Coal Laboratory in western Pennsylvania to develop and demonstrate a float/sink procedure for ultra-fine sizes. Based on test work performed on two ultra-fine size fractions (100M x 200M and 200M x 0), a detailed laboratory procedure using a centrifugal device was established. Results obtained using the guideline presented in this report are as accurate as those obtained using the static bath float/sink method, and for 200M x 0 material, more accurate. In addition, the centrifugal procedure is faster and less costly than the conventional static bath float/sink method. 12 refs., 32 figs., 1 tab.

  18. Analysis of DOE international environmental management activities

    SciTech Connect

    Ragaini, R.C.

    1995-09-01

    The Department of Energy's (DOE) Strategic Plan (April 1994) states that DOE's long-term vision includes world leadership in environmental restoration and waste management activities. The activities of the DOE Office of Environmental Management (EM) can play a key role in DOE's goals of maintaining U.S. global competitiveness and ensuring the continuation of a world class science and technology community. DOE's interest in attaining these goals stems partly from its participation in organizations like the Trade Policy Coordinating Committee (TPCC), with its National Environmental Export Promotion Strategy, which seeks to strengthen U.S. competitiveness and the building of public-private partnerships as part of U.S. industrial policy. The International Interactions Field Office task will build a communication network which will facilitate the efficient and effective communication between DOE Headquarters, Field Offices, and contractors. Under this network, Headquarters will provide the Field Offices with information on the Administration's policies and activities (such as the DOE Strategic Plan), interagency activities, as well as relevant information from other field offices. Lawrence Livermore National Laboratory (LLNL) will, in turn, provide Headquarters with information on various international activities which, when appropriate, will be included in reports to groups like the TPCC and the EM Focus Areas. This task provides for the collection, review, and analysis of information on the more significant international environmental restoration and waste management initiatives and activities which have been used or are being considered at LLNL. Information gathering will focus on efforts and accomplishments in meeting the challenges of providing timely and cost-effective cleanup of its environmentally damaged sites and facilities, especially through international technical exchanges and/or the implementation of foreign-developed technologies.

  19. Persistent neuropathic pain after inguinal herniorrhaphy depending on the procedure (open mesh v. laparoscopy): a propensity-matched analysis

    PubMed Central

    Niccolaï, Patrick; Ouchchane, Lemlih; Libier, Maurice; Beouche, Fayçale; Belon, Monique; Vedrinne, Jean-Marc; El Drayi, Bilal; Vallet, Laurent; Ruiz, Franck; Biermann, Céline; Duchêne, Pascal; Chirat, Claudine; Soule-Sonneville, Sylvie; Dualé, Christian; Dubray, Claude; Schoeffler, Pierre

    2015-01-01

    Background A greater incidence of persistent pain after inguinal herniorrhaphy is suspected with the open mesh procedure than with laparoscopy (transabdominal preperitoneal), but the involvement of neuropathy needs to be clarified. Methods We examined the cumulative incidence of neuropathic persistent pain, defined as self-report of pain at the surgical site with neuropathic aspects, within 6 months after surgery in 2 prospective subcohorts of a multicentre study. We compared open mesh with laparoscopy using different analyses, including a propensity-matched analysis with the propensity score built from a multivariable analysis using a generalized linear model. Results Considering the full patient sample (242 open mesh v. 126 laparoscopy), the raw odds ratio for neuropathic persistent pain after inguinal herniorrhaphy was 4.3. It reached 6.8 with the propensity-matched analysis conducted on pooled subgroups of 194 patients undergoing open mesh and 125 undergoing laparoscopy (95% confidence interval 1.5–30.4, p = 0.012). A risk factor analysis of these pooled subgroups revealed that history of peripheral neuropathy was an independent risk factor for persistent neuropathic pain, while older age was protective. Conclusion We found a greater risk of persistent pain with open mesh than with laparoscopy that may be explained by direct or indirect lesion of nerve terminations. Strategies to identify and preserve nerve terminations with the open mesh procedure are needed. PMID:25799247
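
    A compressed sketch of propensity-score construction and 1:1 matching of the kind described, assuming a logistic model over two illustrative covariates (age and history of peripheral neuropathy); the cohort is synthetic and the greedy nearest-neighbour matcher is one common choice, not necessarily the authors' algorithm.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    # Synthetic cohort: covariates and surgical approach (1 = open mesh,
    # 0 = laparoscopy); all values are invented for illustration only.
    n = 400
    age = rng.normal(55, 12, n)
    neuropathy = rng.binomial(1, 0.15, n)
    open_mesh = rng.binomial(1, 1 / (1 + np.exp(-(0.02 * (age - 55) + 0.8 * neuropathy))))

    X = np.column_stack([age, neuropathy])

    # Step 1: propensity score = P(open mesh | covariates) from a logistic model,
    # a stand-in for the generalized linear model mentioned in the record.
    ps = LogisticRegression().fit(X, open_mesh).predict_proba(X)[:, 1]

    # Step 2: greedy 1:1 nearest-neighbour matching on the propensity score.
    treated = np.where(open_mesh == 1)[0]
    controls = list(np.where(open_mesh == 0)[0])
    pairs = []
    for t in treated:
        if not controls:
            break
        j = min(controls, key=lambda c: abs(ps[t] - ps[c]))
        pairs.append((t, j))
        controls.remove(j)

    print("matched pairs:", len(pairs))
    print("mean |propensity difference| within pairs:",
          round(float(np.mean([abs(ps[a] - ps[b]) for a, b in pairs])), 4))
    ```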

  20. Care staff training in residential homes for managing behavioural and psychological symptoms of dementia based on differential reinforcement procedures of applied behaviour analysis: a process research.

    PubMed

    Noguchi, Dai; Kawano, Yoshiyuki; Yamanaka, Katsuo

    2013-06-01

    Previous studies of care staff training programmes for managing behavioural and psychological symptoms of dementia (BPSD) based on the antecedent-behaviour-consequence analysis of applied behaviour analysis have not included definite intervention strategies. This case study examined the effects of such a programme when combined with differential reinforcement procedures. We examined two female care home residents with dementia of Alzheimer's type. One resident (C) exhibited difficulty in sitting in her seat and made frequent visits to the restroom. The other resident (D) avoided contact with others and insisted on staying in her room. These residents were cared for by 10 care staff trainees. Using an original workbook, we trained the staff regarding the antecedent-behaviour-consequence analysis with differential reinforcement procedures. On the basis of their training, the staff implemented individual care plans for these residents. This study comprised a baseline phase and an intervention phase (IN) to assess the effectiveness of this approach as process research. One month after IN ended, data for the follow-up phase were collected. In both residents, the overall frequency of the target behaviour of BPSD decreased, whereas the overall rate of engaging in leisure activities as an alternative behaviour increased more during IN than during the baseline phase. In addition, the overall rate of staff actions to support residents' activities increased more during IN than during the baseline phase. However, the frequency of the target behaviour of BPSD gradually increased during IN and the follow-up phase in both residents. Simultaneously, the rate of engaging in leisure activities and the staff's treatment integrity gradually decreased for C. The training programme was effective in decreasing BPSD and increasing prosocial behaviours in these two cases. However, continuous support for the staff is essential for maintaining effects.

  1. Type I error and statistical power of the Mantel-Haenszel procedure for detecting DIF: a meta-analysis.

    PubMed

    Guilera, Georgina; Gómez-Benito, Juana; Hidalgo, Maria Dolores; Sánchez-Meca, Julio

    2013-12-01

    This article presents a meta-analysis of studies investigating the effectiveness of the Mantel-Haenszel (MH) procedure when used to detect differential item functioning (DIF). Studies were located electronically in the main databases, representing the codification of 3,774 different simulation conditions, 1,865 related to Type I error and 1,909 to statistical power. The homogeneity of effect-size distributions was assessed by the Q statistic. The extremely high heterogeneity in both error rates (I² = 94.70) and power (I² = 99.29), due to the fact that numerous studies test the procedure in extreme conditions, means that the main interest of the results lies in explaining the variability in detection rates. One-way analysis of variance was used to determine the effects of each variable on detection rates, showing that the MH test was more effective when purification procedures were used, when the data fitted the Rasch model, when test contamination was below 20%, and with sample sizes above 500. The results imply a series of recommendations for practitioners who wish to study DIF with the MH test. A limitation, one inherent to all meta-analyses, is that not all the possible moderator variables, or the levels of variables, have been explored. This serves to remind us of certain gaps in the scientific literature (i.e., regarding the direction of DIF or variances in ability distribution) and is an aspect that methodologists should consider in future simulation studies.
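
    For readers unfamiliar with the statistic under evaluation, a minimal computation of the Mantel-Haenszel common odds ratio and chi-square for one item across score strata; the 2x2 tables below are invented.

    ```python
    import numpy as np

    # Hypothetical 2x2 tables for one item, stratified by total test score.
    # Columns: reference-correct, reference-incorrect, focal-correct, focal-incorrect.
    strata = np.array([
        [30, 20, 22, 28],
        [45, 15, 35, 25],
        [60, 10, 50, 20],
        [70,  5, 62, 13],
    ], dtype=float)

    A, B, C, D = strata.T
    N = A + B + C + D

    # Mantel-Haenszel common odds ratio across strata.
    or_mh = np.sum(A * D / N) / np.sum(B * C / N)

    # MH chi-square with the usual continuity correction.
    exp_A = (A + B) * (A + C) / N
    var_A = (A + B) * (C + D) * (A + C) * (B + D) / (N ** 2 * (N - 1))
    chi2 = (abs(A.sum() - exp_A.sum()) - 0.5) ** 2 / var_A.sum()

    print(f"MH common odds ratio = {or_mh:.2f}")
    print(f"MH chi-square = {chi2:.2f} (reference value: 3.84 at alpha = 0.05, 1 df)")
    ```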

  2. Economics definitions, methods, models, and analysis procedures for Homeland Security applications.

    SciTech Connect

    Ehlen, Mark Andrew; Loose, Verne William; Vargas, Vanessa N.; Smith, Braeton J.; Warren, Drake E.; Downes, Paula Sue; Eidson, Eric D.; Mackey, Greg Edward

    2010-01-01

    This report gives an overview of the types of economic methodologies and models used by Sandia economists in their consequence analysis work for the National Infrastructure Simulation & Analysis Center and other DHS programs. It describes the three primary resolutions at which analysis is conducted (microeconomic, mesoeconomic, and macroeconomic), the tools used at these three levels (from data analysis to internally developed and publicly available tools), and how they are used individually and in concert with each other and other infrastructure tools.

  3. The procedures used to review safety analysis reports for packagings submitted to the US Department of Energy for certification

    SciTech Connect

    Popper, G.F.; Raske, D.T.; Turula, P.

    1988-01-01

    This paper presents an overview of the procedures used at the Argonne National Laboratory (ANL) to review Safety Analysis Reports for Packagings (SARPs) submitted to the US Department of Energy (DOE) for issuance of a Certificate of Compliance. Prior to certification and shipment of a packaging for the transport of radioactive materials, a SARP must be prepared describing the design, contents, analyses, testing, and safety features of the packaging. The SARP must be reviewed to ensure that the specific packaging meets all DOE orders and federal regulations for safe transport. The ANL SARP review group provides an independent review and evaluation function for the DOE to ensure that the packaging meets all the prescribed requirements. This review involves many disciplines and includes evaluating the general information, drawings, construction details, operating procedures, maintenance and test programs, and the quality assurance plan for compliance with requirements. 14 refs., 6 figs.

  4. The zodiacal light as observed with the Clementine startracker cameras - Calibration and image analysis procedures

    NASA Astrophysics Data System (ADS)

    Zook, H. A.; Cooper, B. L.; Potter, A. E.

    1997-03-01

    The zodiacal light, due to sunlight reflected off interplanetary dust grains, is a measure of their spatial density as a function of heliocentric distance and latitude. We have developed procedures to photometrically correct the brightness variations in the Clementine Corona and Zodiacal Light (CZL) photographs to true image brightnesses. The processing steps follow generally along the path described by Berry (1994); however, there are additional processing steps involved for the Clementine images. Careful corrections are necessary because of the dimness of the features that we wish to observe.

  5. A semi-automated image analysis procedure for in situ plankton imaging systems.

    PubMed

    Bi, Hongsheng; Guo, Zhenhua; Benfield, Mark C; Fan, Chunlei; Ford, Michael; Shahrestani, Suzan; Sieracki, Jeffery M

    2015-01-01

    Plankton imaging systems are capable of providing fine-scale observations that enhance our understanding of key physical and biological processes. However, processing the large volumes of data collected by imaging systems remains a major obstacle for their employment, and existing approaches are designed either for images acquired under laboratory controlled conditions or within clear waters. In the present study, we developed a semi-automated approach to analyze plankton taxa from images acquired by the ZOOplankton VISualization (ZOOVIS) system within turbid estuarine waters, in Chesapeake Bay. When compared to images under laboratory controlled conditions or clear waters, images from highly turbid waters are often of relatively low quality and more variable, due to the large amount of objects and nonlinear illumination within each image. We first customized a segmentation procedure to locate objects within each image and extracted them for classification. A maximally stable extremal regions algorithm was applied to segment large gelatinous zooplankton and an adaptive threshold approach was developed to segment small organisms, such as copepods. Unlike the existing approaches for images acquired from laboratory, controlled conditions or clear waters, the target objects are often the majority class, and the classification can be treated as a multi-class classification problem. We customized a two-level hierarchical classification procedure using support vector machines to classify the target objects (< 5%), and remove the non-target objects (> 95%). First, histograms of oriented gradients feature descriptors were constructed for the segmented objects. In the first step all non-target and target objects were classified into different groups: arrow-like, copepod-like, and gelatinous zooplankton. Each object was passed to a group-specific classifier to remove most non-target objects. After the object was classified, an expert or non-expert then manually removed the
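
    A compressed sketch of the segmentation and first-level classification steps described above, assuming scikit-image and scikit-learn; the adaptive-threshold parameters, patch size, class labels and toy images are placeholders rather than the ZOOVIS settings, and the second-level, group-specific classifiers are omitted for brevity.

    ```python
    import numpy as np
    from skimage.feature import hog
    from skimage.filters import threshold_local
    from skimage.measure import label, regionprops
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    def segment(frame):
        """Adaptive-threshold segmentation for small, dim objects in a noisy frame."""
        mask = frame > threshold_local(frame, block_size=35, offset=-0.05)
        return [r for r in regionprops(label(mask)) if r.area >= 20]

    def descriptor(patch):
        """Histogram-of-oriented-gradients descriptor for a fixed-size patch."""
        return hog(patch, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

    def toy_patch(kind):
        """64x64 synthetic patch standing in for an extracted object."""
        img = rng.normal(0.1, 0.02, (64, 64))
        if kind == "copepod-like":
            img[28:36, 20:44] += 0.6          # short bright blob
        elif kind == "gelatinous":
            img[8:56, 24:40] += 0.4           # long diffuse body
        return np.clip(img, 0, 1)

    kinds = ["copepod-like", "gelatinous"]
    X = np.array([descriptor(toy_patch(k)) for k in kinds for _ in range(30)])
    y = np.array([k for k in kinds for _ in range(30)])

    # First-level group classifier; in the full pipeline each group would get
    # its own second-level SVM to discard non-target objects.
    group_clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
    print("predicted group:", group_clf.predict([descriptor(toy_patch("gelatinous"))])[0])

    # Segmentation demo on a synthetic frame with one bright object.
    frame = rng.normal(0.1, 0.02, (256, 256))
    frame[100:120, 60:140] += 0.5
    print("regions found:", len(segment(frame)))
    ```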

  6. A Semi-Automated Image Analysis Procedure for In Situ Plankton Imaging Systems

    PubMed Central

    Bi, Hongsheng; Guo, Zhenhua; Benfield, Mark C.; Fan, Chunlei; Ford, Michael; Shahrestani, Suzan; Sieracki, Jeffery M.

    2015-01-01

    Plankton imaging systems are capable of providing fine-scale observations that enhance our understanding of key physical and biological processes. However, processing the large volumes of data collected by imaging systems remains a major obstacle for their employment, and existing approaches are designed either for images acquired under laboratory controlled conditions or within clear waters. In the present study, we developed a semi-automated approach to analyze plankton taxa from images acquired by the ZOOplankton VISualization (ZOOVIS) system within turbid estuarine waters, in Chesapeake Bay. When compared to images under laboratory controlled conditions or clear waters, images from highly turbid waters are often of relatively low quality and more variable, due to the large amount of objects and nonlinear illumination within each image. We first customized a segmentation procedure to locate objects within each image and extracted them for classification. A maximally stable extremal regions algorithm was applied to segment large gelatinous zooplankton and an adaptive threshold approach was developed to segment small organisms, such as copepods. Unlike the existing approaches for images acquired from laboratory, controlled conditions or clear waters, the target objects are often the majority class, and the classification can be treated as a multi-class classification problem. We customized a two-level hierarchical classification procedure using support vector machines to classify the target objects (< 5%), and remove the non-target objects (> 95%). First, histograms of oriented gradients feature descriptors were constructed for the segmented objects. In the first step all non-target and target objects were classified into different groups: arrow-like, copepod-like, and gelatinous zooplankton. Each object was passed to a group-specific classifier to remove most non-target objects. After the object was classified, an expert or non-expert then manually removed the

  7. Determination of gross alpha, 224Ra, 226Ra, and 228Ra activities in drinking water using a single sample preparation procedure.

    PubMed

    Parsa, Bahman; Obed, Reynaldo N; Nemeth, William K; Suozzo, Gail P

    2005-12-01

    The current federal and New Jersey State regulations have greatly increased the number of gross alpha and radium tests for public and private drinking water supplies. The determination of radium isotopes in water generally involves lengthy and complicated processes. In this study, a new approach is presented for the determination of gross alpha, 224Ra, 226Ra, and 228Ra activities in water samples. The method includes a single sample preparation procedure followed by alpha counting and gamma-ray spectroscopy. The sample preparation technique incorporates an EPA-approved co-precipitation methodology for gross alpha determination with a few alterations and improvements. Using 3-L aliquots of sample, spiked with 133Ba tracer, the alpha-emitting radionuclides are isolated by a BaSO4 and Fe(OH)3 co-precipitation scheme. First the gross alpha-particle activity of the sample is measured with a low-background gas-flow proportional counter, followed by radium isotopes assay by gamma-ray spectroscopy, using the same prepared sample. Gamma-ray determination of 133Ba tracer is used to assess the radium chemical recovery. The 224Ra, 226Ra, and 228Ra activities in the sample are measured through their gamma-ray-emitting decay products, 212Pb, 214Pb/214Bi, and 228Ac, respectively. In cases where 224Ra determination is required, the gamma-ray counting should be performed within 2-4 d from sample collection. To measure 226Ra activity in the sample, the gamma-ray spectroscopy can be repeated 21 d after sample preparation to ensure that 226Ra and its progeny have reached the equilibrium state. At this point, the 228Ac equilibration with parent 228Ra is already established. Analysis of aliquots of de-ionized water spiked with NIST-traceable 230Th, 224Ra, 226Ra, and 228Ra standards demonstrated the accuracy and precision of this method. Various performance evaluation samples were also assayed for gross alpha as well as radium isotope activity determination using this procedure and the
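
    The activity calculation implied by this record reduces to simple arithmetic: net peak counts corrected for counting efficiency, gamma emission probability and the 133Ba chemical recovery. Every number below is an invented placeholder, not laboratory calibration data.

    ```python
    # Sketch of a 226Ra activity estimate from its progeny 214Pb (351.9 keV line,
    # emission probability ~35.6%), assuming secular equilibrium after 21 days
    # of ingrowth. All inputs are hypothetical.
    count_time_s = 60_000.0

    # 133Ba tracer: activity added versus activity recovered in the precipitate,
    # giving the radium chemical recovery (yield).
    ba133_added_bq = 10.0
    ba133_found_bq = 8.2
    recovery = ba133_found_bq / ba133_added_bq

    net_peak_counts = 5400.0     # net counts in the 352 keV full-energy peak
    efficiency = 0.030           # detector full-energy peak efficiency (assumed)
    emission_prob = 0.356        # gamma emission probability of the 352 keV line

    activity_bq = net_peak_counts / (count_time_s * efficiency * emission_prob * recovery)
    sample_volume_l = 3.0
    print(f"226Ra = {activity_bq / sample_volume_l * 1000:.1f} mBq/L "
          f"(chemical recovery {recovery:.0%})")
    ```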

  8. Finite element analysis of donning procedure of a prosthetic transfemoral socket.

    PubMed

    Lacroix, Damien; Patiño, Juan Fernando Ramírez

    2011-12-01

    Lower limb amputation is a severe psychological and physical event for a patient. A prosthetic solution can be provided but should respond to patient-specific needs, accommodating the geometric and biomechanical specificities of each case. A new approach to calculate the stress-strain state at the interaction between the socket and the stump of five transfemoral amputees is presented. In this study the socket donning procedure is modeled using an explicit finite element method based on the patient-specific geometry obtained from CT and laser scan data. Over the stumps, the mean maximum pressure is 4 kPa (SD 1.7) and the mean maximum shear stresses are 1.4 kPa (SD 0.6) and 0.6 kPa (SD 0.3) in the longitudinal and circumferential directions, respectively. Locations of the maximum values correspond to the pressure zones of the sockets. The stress-strain states obtained in this study can be considered more reliable than others, since the normal and tangential stresses associated with the socket donning procedure are included.

  9. A sensitive flow-based procedure for spectrophotometric speciation analysis of inorganic bromine in waters.

    PubMed

    Rocha, Diogo L; Machado, Marcos C; Melchert, Wanessa R

    2014-11-01

    A flow-based system with solenoid micro-pumps and long path-length spectrophotometry for bromate and bromide determination in drinking water is proposed. The method is based on the formation of an unstable dye from the reaction between bromate, 2-(5-dibromo-2-pyridylazo)-5-(diethylamino)phenol (5-Br-PADAP) and thiocyanate ions. A multivariate optimization was carried out. A linear response was observed between 5.0 and 100 µg L(-1) BrO3(-) and the detection limit was estimated as 2.0 µg L(-1) (99.7% confidence level). The coefficient of variation (n=20) and sampling rate were estimated as 1.0% and 40 determinations per hour, respectively. Reagent consumption was estimated as 0.17 µg of 5-Br-PADAP and 230 μg of NaSCN per measurement, generating 6.0 mL of waste. Bromide determination was carried out after UV-assisted conversion with K2S2O8 using 300 µL of sample within the range 20-400 µg L(-1) Br(-). The generated bromate was then determined by the proposed flow system. The results for tap and commercial mineral water samples agreed with those obtained with the reference procedure at the 95% confidence level. The proposed procedure is therefore a sensitive, environmentally friendly and reliable alternative for inorganic bromine speciation.
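
    The figures of merit quoted above (linearity and detection limit) can be illustrated with a short calibration sketch; the absorbance readings and blank standard deviation are invented, and the 3·s(blank)/slope convention is assumed for the detection limit.

    ```python
    import numpy as np

    # Hypothetical bromate calibration within the stated 5-100 ug/L range;
    # absorbance values are invented for illustration.
    conc = np.array([5, 10, 25, 50, 75, 100], dtype=float)          # ug/L BrO3-
    absorb = np.array([0.012, 0.024, 0.059, 0.118, 0.176, 0.235])   # AU

    slope, intercept = np.polyfit(conc, absorb, 1)
    r = np.corrcoef(conc, absorb)[0, 1]

    # Detection limit from 3 * s(blank) / slope, with a hypothetical blank SD.
    s_blank = 0.0016
    lod = 3 * s_blank / slope

    print(f"slope = {slope:.5f} AU per ug/L, intercept = {intercept:.4f}, r = {r:.4f}")
    print(f"estimated detection limit = {lod:.1f} ug/L")
    ```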

  10. Physics faculty beliefs and values about the teaching and learning of problem solving. II. Procedures for measurement and analysis

    NASA Astrophysics Data System (ADS)

    Henderson, Charles; Yerushalmi, Edit; Kuo, Vince H.; Heller, Kenneth; Heller, Patricia

    2007-12-01

    To identify and describe the basis upon which instructors make curricular and pedagogical decisions, we have developed an artifact-based interview and an analysis technique based on multilayered concept maps. The policy capturing technique used in the interview asks instructors to make judgments about concrete instructional artifacts similar to those they likely encounter in their teaching environment. The analysis procedure alternatively employs both an a priori systems view analysis and an emergent categorization to construct a multilayered concept map, which is a hierarchically arranged set of concept maps where child maps include more details than parent maps. Although our goal was to develop a model of physics faculty beliefs about the teaching and learning of problem solving in the context of an introductory calculus-based physics course, the techniques described here are applicable to a variety of situations in which instructors make decisions that influence teaching and learning.

  11. Mercury mass measurement in fluorescent lamps via neutron activation analysis

    NASA Astrophysics Data System (ADS)

    Viererbl, L.; Vinš, M.; Lahodová, Z.; Fuksa, A.; Kučera, J.; Koleška, M.; Voljanskij, A.

    2015-11-01

    Mercury is an essential component of fluorescent lamps. Not all fluorescent lamps are recycled, resulting in contamination of the environment with toxic mercury, making measurement of the mercury mass used in fluorescent lamps important. Mercury mass measurement of lamps via instrumental neutron activation analysis (NAA) was tested under various conditions in the LVR-15 research reactor. Fluorescent lamps were irradiated in different positions in vertical irradiation channels and a horizontal channel in neutron fields with total fluence rates from 3×10⁸ cm⁻² s⁻¹ to 10¹⁴ cm⁻² s⁻¹. The 202Hg(n,γ)203Hg nuclear reaction was used for mercury mass evaluation. Activities of 203Hg and other induced radionuclides were measured via gamma spectrometry with an HPGe detector at various times after irradiation. Standards containing an Hg2Cl2 compound were used to determine mercury mass. Problems arise from the presence of elements with a large effective cross section in the luminescent material (europium, antimony and gadolinium) and glass (boron). The paper describes optimization of the NAA procedure in the LVR-15 research reactor with particular attention to the influence of neutron self-absorption in fluorescent lamps.
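
    A minimal comparator-style sketch of the mercury mass evaluation via the 279 keV line of 203Hg, assuming that sample and standard are counted with the same live time and geometry; all counts, masses and decay times below are invented placeholders, not LVR-15 data.

    ```python
    import math

    HG203_HALF_LIFE_D = 46.6          # half-life of 203Hg, days

    def decay_correction(t_days):
        """Correct a measured count back to the end of irradiation."""
        lam = math.log(2) / HG203_HALF_LIFE_D
        return math.exp(lam * t_days)

    # Comparator standard (Hg2Cl2) irradiated together with the lamp.
    std_hg_mass_mg = 2.00
    std_net_counts = 48_000.0
    std_decay_days = 5.0

    # Fluorescent lamp sample.
    lamp_net_counts = 36_500.0
    lamp_decay_days = 7.0

    # With equal live time and geometry, the ratio of decay-corrected net peak
    # counts equals the ratio of mercury masses.
    # (A neutron self-shielding correction, discussed in the record, is omitted here.)
    ratio = (lamp_net_counts * decay_correction(lamp_decay_days)) / \
            (std_net_counts * decay_correction(std_decay_days))
    print(f"estimated Hg in lamp: {ratio * std_hg_mass_mg:.2f} mg")
    ```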

  12. Using Functional Analysis Procedures To Monitor Medication Effects in an Outpatient and School Setting.

    ERIC Educational Resources Information Center

    Anderson, Mark T.; Vu, Chau; Derby, K. Mark; Goris, Mary; McLaughlin, T. F.

    2002-01-01

    Functional analysis methods were used to monitor the effects of medication prescribed to reduce the vocal and physical tics of a child with Tourette syndrome. Post-medication results demonstrated a reduced level of tics by the participant. Although preliminary, the findings suggest that functional analysis methods can be used to monitor the effects of medication in…

  13. Development and Analysis of Psychomotor Skills Metrics for Procedural Skills Decay.

    PubMed

    Parthiban, Chembian; Ray, Rebecca; Rutherford, Drew; Zinn, Mike; Pugh, Carla

    2016-01-01

    In this paper we develop and analyze the metrics associated with a force production task involving a stationary target, using an advanced VR setup and a Force Dimension Omega 6 haptic device. We study the effects of force magnitude and direction on several metrics, namely path length, movement smoothness, velocity and acceleration patterns, reaction time and overall error in achieving the target. Data were collected from 47 participants, all residents. Results show a positive correlation between the maximum force applied and both the deflection error and velocity, while higher-magnitude forces reduced path length and increased smoothness, demonstrating their stabilizing characteristics. This approach paves the way to assess and model procedural skills decay.
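
    The kinematic metrics named in this record can be computed from a sampled trajectory as sketched below; the minimum-jerk test trajectory and the particular smoothness and reaction-time definitions are common choices, not necessarily those used in the study.

    ```python
    import numpy as np

    # Hypothetical probe trajectory sampled at 1 kHz (metres).
    dt = 0.001
    T = 1.5                                   # movement duration, s
    t = np.arange(0.0, 2.0, dt)
    s = np.clip(t / T, 0.0, 1.0)
    x = 0.15 * (10 * s**3 - 15 * s**4 + 6 * s**5)   # minimum-jerk 15 cm reach
    y = 0.01 * np.sin(np.pi * s) ** 2               # small smooth lateral deviation
    traj = np.column_stack([x, y])

    path_length = np.sum(np.linalg.norm(np.diff(traj, axis=0), axis=1))
    vel = np.gradient(traj, dt, axis=0)
    speed = np.linalg.norm(vel, axis=1)
    acc = np.gradient(vel, dt, axis=0)
    jerk = np.gradient(acc, dt, axis=0)

    # Dimensionless squared jerk: a standard movement-smoothness index
    # (larger magnitude means a less smooth movement).
    dimensionless_jerk = np.sum(np.sum(jerk**2, axis=1)) * dt * T**5 / path_length**2

    # Reaction time: first sample at which speed exceeds 5% of its peak.
    reaction_time = t[np.argmax(speed > 0.05 * speed.max())]

    print(f"path length = {path_length * 100:.1f} cm, peak speed = {speed.max():.3f} m/s")
    print(f"dimensionless jerk = {dimensionless_jerk:.3g}, "
          f"reaction time = {reaction_time * 1000:.0f} ms")
    ```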

  14. Integrated Data Collection Analysis (IDCA) Program - Mixing Procedures and Materials Compatibility

    SciTech Connect

    Olinger, Becky D.; Sandstrom, Mary M.; Warner, Kirstin F.; Sorensen, Daniel N.; Remmers, Daniel L.; Moran, Jesse S.; Shelley, Timothy J.; Whinnery, LeRoy L.; Hsu, Peter C.; Whipple, Richard E.; Kashgarian, Michaele; Reynolds, John G.

    2011-01-14

    Three mixing procedures have been standardized for the IDCA proficiency test—solid-solid, solid-liquid, and liquid-liquid. Due to the variety of precursors used in formulating the materials for the test, these three mixing methods have been designed to address all combinations of materials. Hand mixing is recommended for quantities less than 10 grams and Jar Mill mixing is recommended for quantities over 10 grams. Consideration must also be given to the type of container used for the mixing due to the wide range of chemical reactivity of the precursors and mixtures. Eight web site sources from container and chemical manufacturers have been consulted. Compatible materials have been compiled as a resource for selecting containers made of materials stable to the mixtures. In addition, container materials used in practice by the participating laboratories are discussed. Consulting chemical compatibility tables is highly recommended for each operation by each individual engaged in testing the materials in this proficiency test.

  15. A Highly Sensitive Multicommuted Flow Analysis Procedure for Photometric Determination of Molybdenum in Plant Materials without a Solvent Extraction Step

    PubMed Central

    Santos, Felisberto G.

    2017-01-01

    A highly sensitive analytical procedure for photometric determination of molybdenum in plant materials was developed and validated. This procedure is based on the reaction of Mo(V) with thiocyanate ions (SCN−) in acidic medium to form a compound that can be monitored at 474 nm and was implemented employing a multicommuted flow analysis setup. Photometric detection was performed using an LED-based photometer coupled to a flow cell with a long optical path length (200 mm) to achieve high sensitivity, allowing Mo(V) determination at a level of μg L−1 without the use of an organic solvent extraction step. After optimization of operational conditions, samples of digested plant materials were analyzed employing the proposed procedure. The accuracy was assessed by comparing the obtained results with those of a reference method, with an agreement observed at 95% confidence level. In addition, a detection limit of 9.1 μg L−1, a linear response (r = 0.9969) over the concentration range of 50–500 μg L−1, generation of only 3.75 mL of waste per determination, and a sampling rate of 51 determinations per hour were achieved. PMID:28357152
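
    The figures of merit quoted above (linearity and detection limit) follow from an ordinary least-squares calibration. The sketch below shows one common way to compute them, using the 3.3·σ(blank)/slope convention for the detection limit; the convention and all numbers are assumptions for illustration, not the exact procedure or data of the authors.

    ```python
    import numpy as np

    # Hypothetical calibration: Mo standards (ug/L) vs. LED photometer signal.
    conc = np.array([50, 100, 200, 300, 400, 500], dtype=float)
    signal = np.array([0.021, 0.043, 0.086, 0.131, 0.172, 0.218])
    blank_signals = np.array([0.0009, 0.0012, 0.0007, 0.0011, 0.0010])

    slope, intercept = np.polyfit(conc, signal, 1)   # linear calibration line
    r = np.corrcoef(conc, signal)[0, 1]              # linearity check

    # Detection limit via the common 3.3*sigma_blank/slope criterion (assumed).
    lod = 3.3 * blank_signals.std(ddof=1) / slope

    def mo_concentration(absorbance):
        """Invert the calibration line to get Mo concentration in ug/L."""
        return (absorbance - intercept) / slope

    print(f"slope={slope:.5f}, r={r:.4f}, LOD={lod:.1f} ug/L")
    print(f"sample at A=0.095 -> {mo_concentration(0.095):.0f} ug/L")
    ```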

  16. Beef, chicken and lamb fatty acid analysis--a simplified direct bimethylation procedure using freeze-dried material.

    PubMed

    Lee, M R F; Tweed, J K S; Kim, E J; Scollan, N D

    2012-12-01

    When fractionation of meat lipids is not required, procedures such as saponification can be used to extract total fatty acids, reducing reliance on toxic organic compounds. However, saponification of muscle fatty acids is laborious, and requires extended heating times, and a second methylation step to convert the extracted fatty acids to fatty acid methyl esters prior to gas chromatography. Therefore the development of a more rapid direct methylation procedure would be of merit. The use of freeze-dried material for analysis is common and allows for greater homogenisation of the sample. The present study investigated the potential of using freeze-dried muscle samples and a direct bimethylation to analyse total fatty acids of meat (beef, chicken and lamb) in comparison with a saponification procedure followed by bimethylation. Both methods compared favourably for all major fatty acids measured. There was a minor difference in relation to the C18:1 trans 10 isomer with a greater (P<0.05) recovery with saponification. However, numerically the difference was small and likely as a result of approaching the limits of isomer identification by single column gas chromatography. Differences (P<0.001) between species were found for all fatty acids measured with no interaction effects. The described technique offers a simplified, quick and reliable alternative to saponification to analyse total fatty acids from muscle samples.

  17. A Highly Sensitive Multicommuted Flow Analysis Procedure for Photometric Determination of Molybdenum in Plant Materials without a Solvent Extraction Step.

    PubMed

    Santos, Felisberto G; Reis, Boaventura F

    2017-01-01

    A highly sensitive analytical procedure for photometric determination of molybdenum in plant materials was developed and validated. This procedure is based on the reaction of Mo(V) with thiocyanate ions (SCN(-)) in acidic medium to form a compound that can be monitored at 474 nm and was implemented employing a multicommuted flow analysis setup. Photometric detection was performed using an LED-based photometer coupled to a flow cell with a long optical path length (200 mm) to achieve high sensitivity, allowing Mo(V) determination at a level of μg L(-1) without the use of an organic solvent extraction step. After optimization of operational conditions, samples of digested plant materials were analyzed employing the proposed procedure. The accuracy was assessed by comparing the obtained results with those of a reference method, with an agreement observed at 95% confidence level. In addition, a detection limit of 9.1 μg L(-1), a linear response (r = 0.9969) over the concentration range of 50-500 μg L(-1), generation of only 3.75 mL of waste per determination, and a sampling rate of 51 determinations per hour were achieved.

  18. An experimental analysis of some procedures to teach priming and reinforcement skills to preschool teachers.

    PubMed

    Thomson, C L; Holmberg, M C; Baer, D M

    1978-01-01

    This Monograph reports the results of teaching preschool teachers to be successful at increasing desired behaviors of their children, thus becoming successful teachers. Five teacher-training techniques were examined experimentally under single-subject designs: written assignments, feedback from viewing graphs, on-the-spot feedback from a wireless radio (Bug-in-the-Ear), feedback from an observer, and self-counting. Those teaching procedures that included prompt and frequent information to the teacher about the behavior under study were the most effective techniques. Self-counting, in which the teacher tallied the number of times she emitted the behavior of either priming or reinforcing social or verbal behavior of a child (or children), and observer feedback, in which the observer reported to the teacher periodically during the hour the frequency of her behavior, were the most reliable teaching techniques. The other procedures, while less reliable than self-counting and observer feedback, were effective with some teachers. Maintenance of teacher behavior across settings was examined with a group of Head Start teachers, and maintenance of teacher behaviors across different child behaviors and different children was examined with three student teachers. The results indicated that teaching was more likely to maintain if it occurred in the teacher's home setting rather than at another site. In all cases, when generalization occurred across settings, time, or children, the frequency of the teacher's behavior was not as high as when the relevant behavior had been trained directly. Results supported the proposal that it is possible to define effective teacher behavior, not just characterize it, as it occurs in the classroom, and that effectiveness can be measured by defining and observing the child behaviors to which teacher behaviors are directed.

  19. Analysis of excimer laser radiant exposure effect toward corneal ablation volume at LASIK procedure

    NASA Astrophysics Data System (ADS)

    Adiati, Rima Fitria; Rini Rizki, Artha Bona; Kusumawardhani, Apriani; Setijono, Heru; Rahmadiansah, Andi

    2016-11-01

    LASIK (Laser-Assisted In Situ Keratomileusis) is a technique for correcting refractive disorders of the eye, such as myopia and astigmatism, using an excimer laser. The procedure uses photoablation to decompose corneal tissue. Although preferred for its efficiency, permanency, and accuracy, an inappropriate amount of radiant exposure often causes side effects such as under- or over-correction, irregular astigmatism, and damage to surrounding tissue. In this study, the effect of radiant exposure on corneal ablation volume has been modelled in several steps. The collected laser data are specifications with a 193 nm wavelength, a beam diameter of 0.065-0.65 cm, and a fluence of 160 mJ/cm². The medical data comprise the myopia-astigmatism value, cornea size, corneal ablation thickness, and flap data. The first modelling step is determining the laser diameter between 0.065 and 0.65 cm with a 0.45 cm increment. The laser energy, power, and intensity are determined from the beam area. The number of pulses and total energy are calculated before the radiant exposure of the laser is obtained. The next step is determining the parameters that influence the ablation volume. A regression method is used to create the equation, and the spot size is then substituted into the model. Validation uses statistical correlation against both experimental data and theory. With the model created, it is expected that potential complications can be prevented during LASIK procedures. The recommendations give users a clearer picture for determining the appropriate radiant exposure for the necessary corneal ablation volume.
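
    To make the quantities in this record concrete, the following sketch computes per-pulse energy from the stated fluence and beam diameter, and estimates the number of pulses and total energy needed for a given ablation volume under an assumed constant ablation depth per pulse. The depth-per-pulse value, the ablation volume, and the simplified volume model are illustrative assumptions only, not the paper's regression model.

    ```python
    import math

    FLUENCE_MJ_PER_CM2 = 160.0          # stated excimer laser fluence
    ABLATION_DEPTH_PER_PULSE_UM = 0.25  # assumed typical value, not from the paper

    def spot_area_cm2(beam_diameter_cm):
        return math.pi * (beam_diameter_cm / 2.0) ** 2

    def pulse_energy_mj(beam_diameter_cm):
        """Energy per pulse = fluence x illuminated area."""
        return FLUENCE_MJ_PER_CM2 * spot_area_cm2(beam_diameter_cm)

    def pulses_for_volume(ablation_volume_mm3, beam_diameter_cm):
        """Number of pulses if each pulse removes a disc of the spot area with an
        assumed constant depth (a deliberately simplified ablation model)."""
        area_mm2 = spot_area_cm2(beam_diameter_cm) * 100.0        # cm^2 -> mm^2
        volume_per_pulse_mm3 = area_mm2 * ABLATION_DEPTH_PER_PULSE_UM * 1e-3
        return math.ceil(ablation_volume_mm3 / volume_per_pulse_mm3)

    d = 0.45                       # one of the modelled beam diameters, in cm
    n = pulses_for_volume(0.8, d)  # hypothetical 0.8 mm^3 ablation volume
    print(f"pulse energy: {pulse_energy_mj(d):.2f} mJ, "
          f"pulses: {n}, total energy: {n * pulse_energy_mj(d):.0f} mJ")
    ```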

  20. Tests of an alternate mobile transporter and extravehicular activity assembly procedure for the Space Station Freedom truss

    NASA Technical Reports Server (NTRS)

    Heard, Walter L., Jr.; Watson, Judith J.; Lake, Mark S.; Bush, Harold G.; Jensen, J. Kermit; Wallsom, Richard E.; Phelps, James E.

    1992-01-01

    Results are presented from a ground test program of an alternate mobile transporter (MT) concept and extravehicular activity (EVA) assembly procedure for the Space Station Freedom (SSF) truss keel. A three-bay orthogonal tetrahedral truss beam consisting of 44 2-in-diameter struts and 16 nodes was assembled repeatedly in neutral buoyancy by pairs of pressure-suited test subjects working from astronaut positioning devices (APD's) on the MT. The truss bays were cubic with edges 15 ft long. All the truss joint hardware was found to be EVA compatible. The average unit assembly time for a single pair of experienced test subjects was 27.6 sec/strut, which is about half the time derived from other SSF truss assembly tests. A concept for integration of utility trays during truss assembly is introduced and demonstrated in the assembly tests. The concept, which requires minimal EVA handling of the trays, is shown to have little impact on overall assembly time. The results of these tests indicate that by using an MT equipped with APD's, rapid EVA assembly of a space station-size truss structure can be expected.

  1. Incidence of adverse events in paediatric procedural sedation in the emergency department: a systematic review and meta-analysis

    PubMed Central

    Bellolio, M Fernanda; Puls, Henrique A; Anderson, Jana L; Gilani, Waqas I; Murad, M Hassan; Barrionuevo, Patricia; Erwin, Patricia J; Wang, Zhen; Hess, Erik P

    2016-01-01

    Objective and design: We conducted a systematic review and meta-analysis to evaluate the incidence of adverse events in the emergency department (ED) during procedural sedation in the paediatric population. Randomised controlled trials and observational studies from the past 10 years were included. We adhered to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Setting: ED. Participants: Children. Interventions: Procedural sedation. Outcomes: Adverse events such as vomiting, agitation, hypoxia and apnoea. Meta-analysis was performed with a random-effects model and reported as incidence rates with 95% CIs. Results: A total of 1177 studies were retrieved for screening and 258 were selected for full-text review. 41 studies reporting on 13 883 procedural sedations in 13 876 children (≤18 years) were included. The most common adverse events (all reported per 1000 sedations) were: vomiting 55.5 (CI 45.2 to 65.8), agitation 17.9 (CI 12.2 to 23.7), hypoxia 14.8 (CI 10.2 to 19.3) and apnoea 7.1 (CI 3.2 to 11.0). The need to intervene with either bag valve mask, oral airway or positive pressure ventilation occurred in 5.0 per 1000 sedations (CI 2.3 to 7.6). The incidences of severe respiratory events were: 34 cases of laryngospasm among 8687 sedations (2.9 per 1000 sedations, CI 1.1 to 4.7; absolute rate 3.9 per 1000 sedations), 4 intubations among 9136 sedations and 0 cases of aspiration among 3326 sedations. 33 of the 34 cases of laryngospasm occurred in patients who received ketamine. Conclusions: Serious adverse respiratory events are very rare in paediatric procedural sedation in the ED. Emesis and agitation are the most frequent adverse events. Hypoxia, a late indicator of respiratory depression, occurs in 1.5% of sedations. Laryngospasm, though rare, happens most frequently with ketamine. The results of this study provide quantitative risk estimates to facilitate shared decision-making, risk communication, informed consent and
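
    The per-1000 figures above can be reproduced for a single pooled dataset with a simple rate-and-interval calculation. The sketch below uses a normal approximation for the confidence interval, whereas the review itself used a random-effects meta-analysis across studies, so this is an illustration of the arithmetic rather than the authors' method; the pooled event count is back-calculated from the reported rate, not taken from the per-study data.

    ```python
    import math

    def incidence_per_1000(events, sedations, z=1.96):
        """Point estimate and normal-approximation 95% CI, expressed per 1000."""
        p = events / sedations
        se = math.sqrt(p * (1 - p) / sedations)
        return (1000 * p, 1000 * (p - z * se), 1000 * (p + z * se))

    # Illustrative pooled counts: roughly 770 vomiting events among 13,883 sedations.
    rate, lo, hi = incidence_per_1000(770, 13883)
    print(f"vomiting: {rate:.1f} per 1000 (95% CI {lo:.1f} to {hi:.1f})")
    ```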

  2. Development of mixed time partition procedures for thermal analysis of structures

    NASA Technical Reports Server (NTRS)

    Liu, W. K.

    1982-01-01

    The computational methods used to predict and optimize the thermal-structural behavior of aerospace vehicle structures are reviewed. In general, two classes of algorithms, implicit and explicit, are used in transient thermal analysis of structures. Each of these two methods has its own merits. Due to the different time scales of the mechanical and thermal responses, the selection of a time integration method can be a difficult yet critical factor in the efficient solution of such problems. Therefore mixed time integration methods for transient thermal analysis of structures are being developed. This proposed methodology would be readily adaptable to existing computer programs for structural thermal analysis.
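
    A minimal illustration of the two algorithm classes discussed here, applied to a linear heat-conduction system dT/dt = A·T + b: explicit forward Euler is cheap per step but only conditionally stable, while implicit backward Euler solves a linear system each step and tolerates much larger time steps. This generic sketch is not the mixed-time-partition method developed in the paper.

    ```python
    import numpy as np

    def explicit_step(T, A, b, dt):
        """Forward Euler: no linear solve, but dt is limited by stability."""
        return T + dt * (A @ T + b)

    def implicit_step(T, A, b, dt):
        """Backward Euler: solve (I - dt*A) T_new = T + dt*b each step."""
        n = len(T)
        return np.linalg.solve(np.eye(n) - dt * A, T + dt * b)

    # Toy 1D conduction: three nodes, fixed-temperature boundary folded into b.
    A = np.array([[-2.0, 1.0, 0.0],
                  [1.0, -2.0, 1.0],
                  [0.0, 1.0, -2.0]])
    b = np.array([100.0, 0.0, 0.0])   # hot boundary on the left
    T_exp = T_imp = np.zeros(3)
    for _ in range(200):
        T_exp = explicit_step(T_exp, A, b, dt=0.05)   # small dt for stability
        T_imp = implicit_step(T_imp, A, b, dt=0.5)    # much larger dt still stable
    print(T_exp, T_imp)                               # both approach [75, 50, 25]
    ```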

  3. A multistate analysis of active life expectancy.

    PubMed

    Rogers, A; Rogers, R G; Branch, L G

    1989-01-01

    With today's lower mortality rates, longer expectations of life, and new medical technologies, the nation's health policy focus has shifted from emphasis on individual survival to emphasis on personal health and independent living. Using longitudinal data sets and new methodological techniques, researchers have begun to assess active life expectancies, estimating not only how long a subpopulation can expect to live beyond each age, but what fractions of the expected remaining lifetime will be lived as independent, dependent, or institutionalized. New ideas are addressed, applying recently developed multistate life table methods to Waves One and Two of the Massachusetts Health Care Panel Study. Expectations of active life are presented for those 65 and older who initially are in one of two functional states of well-being. Included are expectations of life, for those, for example, who were independent and remained so, or those who were dependent and became independent. Although public health officials are concerned about the number of elderly who cease being independent, preliminary analysis shows that a significant number of the dependent elderly regain their independence, a situation which needs to be addressed in health care planning.

  4. Neutron activation analysis of some building materials

    NASA Astrophysics Data System (ADS)

    Salagean, M. N.; Pantelica, A. I.; Georgescu, I. I.; Muntean, M. I.

    1999-01-01

    Concentrations of As, Au, Ba, Br, Ca, Ce, Co, Cr, Cs, Eu, Fe, Hf, K, La, Lu, Mo, Na, Nd, Rb, Sb, Sc, Sr, Ta, Tb, Th, U, Yb, W and Zn in seven Romanian building materials were determined by the Instrumental Neutron Activation Analysis (INAA) method using the VVR-S Reactor of NIPNE-Bucharest. Raw materials used in cement production (≈75% limestone and ≈25% clay), cement samples from three different factories, furnace slag, phosphogypsum, and a type of brick were analyzed. The brick was compacted from furnace slag, coal fly ash, phosphogypsum, lime and cement. The U, Th and K concentrations determined in the brick are in agreement with the natural radioactivity measurements of ²²⁶Ra, ²³²Th and ⁴⁰K. These specific activities were found to be about twice the accepted levels in the case of ²²⁶Ra and ²³²Th, and about 1.5 times higher for ⁴⁰K. Consequently, the investigated brick is considered a radioactive waste. The rather high content of Co, Cr, K, Th, and Zn in the brick is due especially to the slag and fly ash, its main components. The presence of U, Th and K in the slag is mainly correlated with the limestone and dolomite used as fluxes in metallurgy.

  5. Active polarimeter optical system laser hazard analysis.

    SciTech Connect

    Augustoni, Arnold L.

    2005-07-01

    A laser hazard analysis was performed for the SNL Active Polarimeter Optical System based on the ANSI Standard Z136.1-2000, American National Standard for Safe Use of Lasers, and the ANSI Standard Z136.6-2000, American National Standard for Safe Use of Lasers Outdoors. The Active Polarimeter Optical System (APOS) uses a pulsed, near-infrared, chromium-doped lithium strontium aluminum fluoride (Cr:LiSAF) crystal laser in conjunction with a holographic diffuser and lens to illuminate a scene of interest. The APOS is intended for outdoor operations. The system is mounted on a height-adjustable platform (6 feet to 40 feet) and sits atop a tripod that points the beam downward. The beam can be pointed from nadir to as much as 60 degrees off nadir, producing an illuminating spot geometry that varies from circular (at nadir) to elliptical (off nadir). The JP Innovations crystal Cr:LiSAF laser parameters are presented in section II. The illuminating laser spot size is variable and can be adjusted by changing the separation distance between the lens and the holographic diffuser. The system is adjusted while the platform is at the lowest level. The laser spot is adjusted to a particular size at a particular distance (elevation) from the laser by setting the separation distance (d_diffuser) to predetermined values. The downward pointing angle is also adjusted before the platform is raised to the selected operating elevation.
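
    To make the spot-geometry statement quantitative, the sketch below computes the major and minor axes of the elliptical illuminated spot as a function of platform height, beam divergence, and off-nadir pointing angle, using simple flat-ground geometry. The divergence value, height, and geometric model are assumptions for illustration, not parameters of the APOS or of the hazard analysis.

    ```python
    import math

    def spot_axes_m(height_m, off_nadir_deg, full_divergence_mrad):
        """Approximate ground-spot axes for a downward-pointing diverging beam.
        Flat ground, small-angle beam; the along-track axis is stretched by the
        projection factor 1/cos(off-nadir angle)."""
        alpha = math.radians(off_nadir_deg)
        slant_range = height_m / math.cos(alpha)
        beam_width = slant_range * full_divergence_mrad * 1e-3
        minor_axis = beam_width                    # across-track
        major_axis = beam_width / math.cos(alpha)  # along-track projection
        return major_axis, minor_axis

    # Hypothetical settings: 40 ft (~12.2 m) platform, 50 mrad effective
    # divergence after the diffuser, pointing from nadir to 60 degrees off nadir.
    for angle in (0, 30, 60):
        major, minor = spot_axes_m(12.2, angle, 50.0)
        print(f"{angle:2d} deg off-nadir: {major:.2f} m x {minor:.2f} m")
    ```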

  6. [Analysis of clinical Risk and adoption of shared procedures: experience of nephrology and dialysis unit of ASL BA].

    PubMed

    Mancini, Andrea; Angelini, Pernina; Bozzi, Michele; Cuzzola, Cristoforo; Giancaspro, Vincenzo; Laraia, Elvira; Nisi, Maria Teresa; Proscia, Anna Rita; Tarantino, Giuseppe; Vitale, Ottavia; Petrarulo, Francesco

    2015-01-01

    Currently, the English-language scientific literature lacks studies showing that medical assistance can be delivered without errors. For the past two years, the department of nephrology and urology of ASL BA has been establishing a process of clinical risk management. Starting with the reporting of a single error, a related database was subsequently developed in order to validate technical and organizational procedures for common use in daily clinical practice. With regard to error reporting, an incident reporting system was adopted: a structured collection of events significant for patient safety, with a specific reporting form to be filled out by health professionals. Reports were collected, coded and analysed, and measures were then adopted to reduce the recurrence of errors. This first phase consisted of writing the procedures in order to create structured diagnostic-therapeutic protocols. In 18 months of observation using the incident reporting form, 48 errors were reported: 52% were adverse events, 12.5% adverse reactions, 31.2% near misses and 2% sentinel events. In 35.4% of cases the error occurred in the administration or prescription of drug therapies, in 18.7% of cases it occurred at the organizational stage, in 12.5% it was a surgical error, in 18.7% of cases the error was due to incorrect asepsis, in 8.3% of cases it occurred during the medical examination and in 8.3% during dialysis. An analysis of the error database led to the selection of the most urgent procedures. It is our view that only adherence to procedures can ensure the achievement of high quality, with improved clinical outcomes, reduction of complications, elimination of inappropriate interventions and increased patient satisfaction.

  7. Nickel-catalyzed proton-deuterium exchange (HDX) procedures for glycosidic linkage analysis of complex carbohydrates

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The structural analysis of complex carbohydrates typically requires the assignment of three parameters: monosaccharide composition, the position of glycosidic linkages between monosaccharides, and the position and nature of non-carbohydrate substituents. The glycosidic linkage positions are often de...

  8. Complications of percutaneous vertebroplasty: An analysis of 1100 procedures performed in 616 patients.

    PubMed

    Saracen, Agnieszka; Kotwica, Zbigniew

    2016-06-01

    Percutaneous vertebroplasty (PVP) is a minimally invasive procedure widely used for the treatment of pain due to vertebral fractures of different origins (osteoporotic, traumatic, or neoplastic). PVP is minimally invasive, but complications are not rare; in most cases, however, they are not clinically significant. The most frequent is cement leakage, which can occur into veins, paravertebral soft tissue, the intervertebral disk, or the spinal canal, affecting the foraminal area or epidural space. We analyzed the results of treatment and complications of vertebroplasty performed with polymethylmethacrylate cement (PMMA) on 1100 vertebrae, with special regard to the severity of complications and their eventual clinical manifestation. One thousand one hundred PVP procedures, performed in 616 patients, were analyzed. There were 468 (76%) women and 148 (24%) men, aged 24 to 94 years (mean age 68 years). Of the 1100 procedures, 794 treated osteoporotic fractures, 137 treated fractures due to malignant disease, and 69 treated traumatic fractures. One hundred patients had painful vertebral hemangiomas. Seven hundred twenty-six (66%) lesions were in the thoracic and 374 (34%) in the lumbar area. Results of treatment were assessed using a 10 cm Visual Analogue Scale (VAS) 12 hours after surgery, at 7 days, at 30 days, and then every 6 months, up to 3 years. Before surgery all patients had significant pain, 7 to 10 on the VAS scale, mean 8.9 cm. Twelve hours after surgery 602 (97.7%) reported significant relief of pain, with a mean VAS of 2.3 cm. Local complications occurred in 50% of osteoporotic fractures, 34% of neoplastic fractures, 16% of traumatic fractures, and 2% of vertebral hemangiomas. The most common was PMMA leakage into surrounding tissues (20%), followed by paravertebral vein embolism (13%), intradiscal leakage (8%), and PMMA leakage into the spinal canal (0.8%). Results of treatment did not differ between patients with and without complications. From 104 patients who had chest X-ray or CT study performed after surgery

  9. Why do the numbers of laboratory animal procedures conducted continue to rise? An analysis of the Home Office Statistics of Scientific Procedures on Living Animals: Great Britain 2005.

    PubMed

    Hudson, Michelle

    2007-03-01

    The publication of the Statistics of Scientific Procedures on Living Animals: Great Britain 2005 once again provides evidence that the levels of animal experimentation in Great Britain are rising, the underlying reason for this being the continued and increasing reliance on genetically modified animals as model systems. There has been a gradual increase in fundamental research, as applied toxicological studies have declined. Of particular concern is the impact that the forthcoming REACH legislation will have and the apparent lack of urgency in facing up to this challenge. The major issues arising from the Statistics are discussed, including the increases in rabbit and primate procedures. The potential of newly validated and emerging techniques to counteract these worrying trends are also considered.

  10. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; comparison of a nitric acid in-bottle digestion procedure to other whole-water digestion procedures

    USGS Publications Warehouse

    Garbarino, John R.; Hoffman, Gerald L.

    1999-01-01

    A hydrochloric acid in-bottle digestion procedure is used to partially digest whole-water samples prior to determining recoverable elements by various analytical methods. The use of hydrochloric acid is problematic for some methods of analysis because of spectral interference. The in-bottle digestion procedure has been modified to eliminate such interference by using nitric acid instead of hydrochloric acid in the digestion. Implications of this modification are evaluated by comparing results for a series of synthetic whole-water samples. Results are also compared with those obtained by using the U.S. Environmental Protection Agency (1994) (USEPA) Method 200.2 total-recoverable digestion procedure. Percentage yields obtained using the nitric acid in-bottle digestion procedure are within 10 percent of the hydrochloric acid in-bottle yields for 25 of the 26 elements determined in two of the three synthetic whole-water samples tested. Differences in percentage yields for the third synthetic whole-water sample were greater than 10 percent for 16 of the 26 elements determined. The USEPA method was the most rigorous for solubilizing elements from particulate matter in all three synthetic whole-water samples. Nevertheless, the variability in the percentage yield obtained using the USEPA digestion procedure was generally greater than that of the in-bottle digestion procedure, presumably because of the difficulty in controlling the digestion conditions accurately.

  11. Determination of Np-237 by radiochemical neutron activation analysis combined with extraction chromatography.

    PubMed

    Kalmykov, St N; Aliev, R A; Sapozhnikov, D Yu; Sapozhnikov, Yu A; Afinogenov, A M

    2004-01-01

    A procedure for determination of 237Np, 238Pu, 239,240Pu and 241Pu in environmental samples is described. Neptunium-237 is determined using radiochemical neutron activation analysis with pre- and post-irradiation chemistry based on solvent extraction and extraction chromatography. 238Pu and 239,240Pu are determined using alpha spectrometry and 241Pu by liquid scintillation spectrometry. The vertical profiles of 237Np, 238Pu and 239,240Pu in bottom sediments from the Black Sea are presented.

  12. Inverse scattering transform analysis of rogue waves using local periodization procedure

    PubMed Central

    Randoux, Stéphane; Suret, Pierre; El, Gennady

    2016-01-01

    The nonlinear Schrödinger equation (NLSE) stands out as the dispersive nonlinear partial differential equation that plays a prominent role in the modeling and understanding of the wave phenomena relevant to many fields of nonlinear physics. The question of random input problems in the one-dimensional and integrable NLSE enters within the framework of integrable turbulence, and the specific question of the formation of rogue waves (RWs) has been recently extensively studied in this context. The determination of exact analytic solutions of the focusing 1D-NLSE prototyping RW events of statistical relevance is now considered as the problem of central importance. Here we address this question from the perspective of the inverse scattering transform (IST) method that relies on the integrable nature of the wave equation. We develop a conceptually new approach to the RW classification in which appropriate, locally coherent structures are specifically isolated from a globally incoherent wave train to be subsequently analyzed by implementing a numerical IST procedure relying on a spatial periodization of the object under consideration. Using this approach we extend the existing classifications of the prototypes of RWs from standard breathers and their collisions to more general nonlinear modes characterized by their nonlinear spectra. PMID:27385164

  13. Inverse scattering transform analysis of rogue waves using local periodization procedure.

    PubMed

    Randoux, Stéphane; Suret, Pierre; El, Gennady

    2016-07-07

    The nonlinear Schrödinger equation (NLSE) stands out as the dispersive nonlinear partial differential equation that plays a prominent role in the modeling and understanding of the wave phenomena relevant to many fields of nonlinear physics. The question of random input problems in the one-dimensional and integrable NLSE enters within the framework of integrable turbulence, and the specific question of the formation of rogue waves (RWs) has been recently extensively studied in this context. The determination of exact analytic solutions of the focusing 1D-NLSE prototyping RW events of statistical relevance is now considered as the problem of central importance. Here we address this question from the perspective of the inverse scattering transform (IST) method that relies on the integrable nature of the wave equation. We develop a conceptually new approach to the RW classification in which appropriate, locally coherent structures are specifically isolated from a globally incoherent wave train to be subsequently analyzed by implementing a numerical IST procedure relying on a spatial periodization of the object under consideration. Using this approach we extend the existing classifications of the prototypes of RWs from standard breathers and their collisions to more general nonlinear modes characterized by their nonlinear spectra.

  14. Acylation of Chiral Alcohols: A Simple Procedure for Chiral GC Analysis

    PubMed Central

    Oromí-Farrús, Mireia; Torres, Mercè; Canela, Ramon

    2012-01-01

    The use of iodine as a catalyst and either acetic or trifluoroacetic acid as a derivatizing reagent for determining the enantiomeric composition of acyclic and cyclic aliphatic chiral alcohols was investigated. Optimal conditions were selected according to the molar ratio of alcohol to acid, the reaction time, and the reaction temperature. Afterwards, chiral stability of chiral carbons was studied. Although no isomerization was observed when acetic acid was used, partial isomerization was detected with the trifluoroacetic acid. A series of chiral alcohols of a widely varying structural type were then derivatized with acetic acid using the optimal conditions. The resolution of the enantiomeric esters and the free chiral alcohols was measured using a capillary gas chromatograph equipped with a CP Chirasil-DEX CB column. The best resolutions were obtained with 2-pentyl acetates (α = 3.00) and 2-hexyl acetates (α = 1.95). This method provides a very simple and efficient experimental workup procedure for analyzing chiral alcohols by chiral-phase GC. PMID:22649749

  15. Inverse scattering transform analysis of rogue waves using local periodization procedure

    NASA Astrophysics Data System (ADS)

    Randoux, Stéphane; Suret, Pierre; El, Gennady

    2016-07-01

    The nonlinear Schrödinger equation (NLSE) stands out as the dispersive nonlinear partial differential equation that plays a prominent role in the modeling and understanding of the wave phenomena relevant to many fields of nonlinear physics. The question of random input problems in the one-dimensional and integrable NLSE enters within the framework of integrable turbulence, and the specific question of the formation of rogue waves (RWs) has been recently extensively studied in this context. The determination of exact analytic solutions of the focusing 1D-NLSE prototyping RW events of statistical relevance is now considered as the problem of central importance. Here we address this question from the perspective of the inverse scattering transform (IST) method that relies on the integrable nature of the wave equation. We develop a conceptually new approach to the RW classification in which appropriate, locally coherent structures are specifically isolated from a globally incoherent wave train to be subsequently analyzed by implementing a numerical IST procedure relying on a spatial periodization of the object under consideration. Using this approach we extend the existing classifications of the prototypes of RWs from standard breathers and their collisions to more general nonlinear modes characterized by their nonlinear spectra.

  16. Comparative evaluation of rRNA depletion procedures for the improved analysis of bacterial biofilm and mixed pathogen culture transcriptomes.

    PubMed

    Petrova, Olga E; Garcia-Alcalde, Fernando; Zampaloni, Claudia; Sauer, Karin

    2017-01-24

    Global transcriptomic analysis via RNA-seq is often hampered by the high abundance of ribosomal (r)RNA in bacterial cells. To remove rRNA and enrich coding sequences, subtractive hybridization procedures have become the approach of choice prior to RNA-seq, with their efficiency varying in a manner dependent on sample type and composition. Yet, despite an increasing number of RNA-seq studies, comparative evaluation of bacterial rRNA depletion methods has remained limited. Moreover, no such study has utilized RNA derived from bacterial biofilms, which have potentially higher rRNA:mRNA ratios and higher rRNA carryover during RNA-seq analysis. Presently, we evaluated the efficiency of three subtractive hybridization-based kits in depleting rRNA from samples derived from biofilm, as well as planktonic cells of the opportunistic human pathogen Pseudomonas aeruginosa. Our results indicated different rRNA removal efficiency for the three procedures, with the Ribo-Zero kit yielding the highest degree of rRNA depletion, which translated into enhanced enrichment of non-rRNA transcripts and increased depth of RNA-seq coverage. The results indicated that, in addition to improving RNA-seq sensitivity, efficient rRNA removal enhanced detection of low abundance transcripts via qPCR. Finally, we demonstrate that the Ribo-Zero kit also exhibited the highest efficiency when P. aeruginosa/Staphylococcus aureus co-culture RNA samples were tested.

  17. Comparative evaluation of rRNA depletion procedures for the improved analysis of bacterial biofilm and mixed pathogen culture transcriptomes

    PubMed Central

    Petrova, Olga E.; Garcia-Alcalde, Fernando; Zampaloni, Claudia; Sauer, Karin

    2017-01-01

    Global transcriptomic analysis via RNA-seq is often hampered by the high abundance of ribosomal (r)RNA in bacterial cells. To remove rRNA and enrich coding sequences, subtractive hybridization procedures have become the approach of choice prior to RNA-seq, with their efficiency varying in a manner dependent on sample type and composition. Yet, despite an increasing number of RNA-seq studies, comparative evaluation of bacterial rRNA depletion methods has remained limited. Moreover, no such study has utilized RNA derived from bacterial biofilms, which have potentially higher rRNA:mRNA ratios and higher rRNA carryover during RNA-seq analysis. Presently, we evaluated the efficiency of three subtractive hybridization-based kits in depleting rRNA from samples derived from biofilm, as well as planktonic cells of the opportunistic human pathogen Pseudomonas aeruginosa. Our results indicated different rRNA removal efficiency for the three procedures, with the Ribo-Zero kit yielding the highest degree of rRNA depletion, which translated into enhanced enrichment of non-rRNA transcripts and increased depth of RNA-seq coverage. The results indicated that, in addition to improving RNA-seq sensitivity, efficient rRNA removal enhanced detection of low abundance transcripts via qPCR. Finally, we demonstrate that the Ribo-Zero kit also exhibited the highest efficiency when P. aeruginosa/Staphylococcus aureus co-culture RNA samples were tested. PMID:28117413

  18. A Bayesian network meta-analysis of three different surgical procedures for the treatment of humeral shaft fractures

    PubMed Central

    Qiu, Hao; Wei, Zhihui; Liu, Yuting; Dong, Jing; Zhou, Xin; Yin, Liangjun; Zhang, Minhua; Lu, Minpeng

    2016-01-01

    Abstract Background: The optimal surgical procedure for humeral shaft fractures remains a matter of debate. We aimed to establish the optimum procedure by performing a Bayesian network meta-analysis. Methods: PubMed, EMBASE, the Cochrane Library, and Medline were searched for both randomized controlled trials and prospective studies of surgical treatment for humeral shaft fractures. The quality of the included studies was assessed according to the Cochrane Collaboration's “Risk of bias”. Results: Seventeen RCTs or prospective studies were included in the meta-analysis. The pooled results showed that the occurrence rate of radial nerve injury was lowest for minimally invasive plate osteosynthesis (MIPO; SUCRA probability, 95.1%), followed by open reduction and plate osteosynthesis (ORPO; SUCRA probability, 29.5%), and was highest for intramedullary nailing (IMN; SUCRA probability, 25.4%). The aggregated results of pairwise meta-analysis showed no significant difference in radial nerve injury rate when comparing ORPO versus IMN (OR, 1.92; 95% CI, 0.96 to 3.86), ORPO versus MIPO (OR, 3.38; 95% CI, 0.80 to 14.31), or IMN versus MIPO (OR, 3.19; 95% CI, 0.48 to 21.28). Regarding the nonunion, SUCRA probabilities were 90.5%, 40.2%, and 19.3% for MIPO, ORPO, and IMN, respectively. The aggregated results of a pairwise meta-analysis also showed no significant difference for ORPO versus IMN (OR, 0.83; 95% CI, 0.41 to 1.69), ORPO versus MIPO (OR, 2.42; 95% CI, 0.45 to 12.95), or IMN versus MIPO (OR, 2.49; 95% CI, 0.35 to 17.64). Conclusion: The current evidence indicates that MIPO is the optimum choice in the treatment of humeral shaft fractures and that ORPO is superior to IMN. PMID:28002327

  19. Development of sample handling procedures for foods under USDA's National Food and Nutrient Analysis Program

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The National Food and Nutrient Analysis Program (NFNAP) was implemented in 1997 to update and improve the quality of food composition data maintained in the United States Department of Agriculture (USDA) National Nutrient Database for Standard Reference. NFNAP was designed to sample and analyze fre...

  20. Acoustic emission analysis as a non-destructive test procedure for fiber compound structures

    NASA Technical Reports Server (NTRS)

    Block, J.

    1983-01-01

    The concept of acoustic emission analysis is explained in scientific terms. The detection of acoustic events, their localization, damage discrimination, and event summation curves are discussed. A block diagram of the concept of damage-free testing of fiber-reinforced synthetic materials is depicted. Prospects for application of the concept are assessed.

  1. A Simple Gauss-Newton Procedure for Covariance Structure Analysis with High-Level Computer Languages.

    ERIC Educational Resources Information Center

    Cudeck, Robert; And Others

    1993-01-01

    An implementation of the Gauss-Newton algorithm for the analysis of covariance structure that is specifically adapted for high-level computer languages is reviewed. This simple method for estimating structural equation models is useful for a variety of standard models, as is illustrated. (SLD)
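
    Since the record above only names the algorithm, a generic Gauss-Newton iteration for nonlinear least squares is sketched below. Covariance structure analysis applies the same update to a discrepancy function between model-implied and sample covariances, so this generic residual-based version is an illustration of the algorithm, not the authors' implementation.

    ```python
    import numpy as np

    def gauss_newton(residual_fn, jacobian_fn, theta0, tol=1e-8, max_iter=50):
        """Minimize 0.5*||r(theta)||^2 via the standard Gauss-Newton update
        theta <- theta - (J'J)^{-1} J'r."""
        theta = np.asarray(theta0, dtype=float)
        for _ in range(max_iter):
            r = residual_fn(theta)
            J = jacobian_fn(theta)
            step = np.linalg.solve(J.T @ J, J.T @ r)
            theta = theta - step
            if np.linalg.norm(step) < tol:
                break
        return theta

    # Toy example: fit y = a*exp(b*x) to noisy data.
    x = np.linspace(0, 1, 20)
    y = 2.0 * np.exp(1.5 * x) + 0.01 * np.random.default_rng(0).standard_normal(20)
    resid = lambda th: th[0] * np.exp(th[1] * x) - y
    jac = lambda th: np.column_stack([np.exp(th[1] * x),
                                      th[0] * x * np.exp(th[1] * x)])
    print(gauss_newton(resid, jac, [1.0, 1.0]))   # converges near (2.0, 1.5)
    ```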

  2. Advanced techniques and painless procedures for nonlinear contact analysis and forming simulation via implicit FEM

    NASA Astrophysics Data System (ADS)

    Zhuang, Shoubing

    2013-05-01

    Nonlinear contact analysis, including forming simulation via finite element methods, has crucial practical applications in many engineering fields. However, because of its high nonlinearity, nonlinear contact analysis remains an extremely challenging obstacle for many industrial applications. The implicit finite element scheme is generally more accurate than the explicit scheme, but convergence is a known challenge because of complex geometries, large relative motion, and rapid changes in contact state; diagnosing convergence problems in nonlinear contact can be a very painful process. Most complicated contact models have a great many contact surfaces, and it is difficult to define the contact pairs well using common contact definition methods, which either result in hundreds of contact pairs or are time-consuming. This paper presents advanced techniques for nonlinear contact analysis and forming simulation via the implicit finite element scheme and the penalty method. The calculation of the default automatic contact stiffness is addressed. Furthermore, the paper presents the idea of selection groups to help define contact pairs easily and efficiently for complicated contact analysis, and discusses the corresponding implementation and usage. Lastly, typical nonlinear contact models and forming models with nonlinear material models are shown to demonstrate the presented methods and technologies.
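
    As a minimal illustration of the penalty method mentioned above, the sketch below computes a normal contact force from penetration depth and a penalty stiffness derived from the contacting element's material and size, one common heuristic for an "automatic" contact stiffness. The scale factor and stiffness formula are generic assumptions for illustration, not the specific default discussed in the paper.

    ```python
    def auto_penalty_stiffness(youngs_modulus, element_face_area, element_volume,
                               scale_factor=0.1):
        """One common heuristic: k = s * E * A^2 / V for solid-element faces.
        The scale factor and formula are generic assumptions for illustration."""
        return scale_factor * youngs_modulus * element_face_area ** 2 / element_volume

    def penalty_contact_force(penetration, stiffness):
        """Penalty method: a force proportional to penetration resists overlap;
        no force when the gap is open (penetration <= 0)."""
        return stiffness * penetration if penetration > 0.0 else 0.0

    # Hypothetical steel element: E = 210e3 N/mm^2, 4 mm x 4 mm face, 64 mm^3 volume.
    k = auto_penalty_stiffness(210e3, 16.0, 64.0)   # result in N/mm
    for pen_mm in (-0.01, 0.0, 0.005, 0.02):
        force = penalty_contact_force(pen_mm, k)
        print(f"penetration {pen_mm:+.3f} mm -> force {force:.1f} N")
    ```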

  3. A Comparison of Missing-Data Procedures for Arima Time-Series Analysis

    ERIC Educational Resources Information Center

    Velicer, Wayne F.; Colby, Suzanne M.

    2005-01-01

    Missing data are a common practical problem for longitudinal designs. Time-series analysis is a longitudinal method that involves a large number of observations on a single unit. Four different missing-data methods (deletion, mean substitution, mean of adjacent observations, and maximum likelihood estimation) were evaluated. Computer-generated…
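
    Of the four methods compared, "mean of adjacent observations" is the most time-series-specific. A minimal sketch of that imputation (with mean substitution shown for contrast) is given below; the function names and the exact handling of runs of missing values are choices made here for illustration.

    ```python
    import numpy as np

    def impute_mean_substitution(series):
        """Replace every missing value with the overall mean of observed values."""
        x = np.array(series, dtype=float)
        x[np.isnan(x)] = np.nanmean(x)
        return x

    def impute_adjacent_mean(series):
        """Replace each missing value with the mean of the nearest observed
        neighbours before and after it (ends fall back to the single neighbour)."""
        x = np.array(series, dtype=float)
        obs_idx = np.flatnonzero(~np.isnan(x))
        for i in np.flatnonzero(np.isnan(x)):
            before = obs_idx[obs_idx < i]
            after = obs_idx[obs_idx > i]
            neighbours = [x[before[-1]]] if before.size else []
            neighbours += [x[after[0]]] if after.size else []
            x[i] = np.mean(neighbours)
        return x

    series = [4.0, np.nan, 5.0, 6.0, np.nan, np.nan, 9.0]
    print(impute_mean_substitution(series))
    print(impute_adjacent_mean(series))
    ```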

  4. Assessing Reading Skill with a Think-Aloud Procedure and Latent Semantic Analysis.

    ERIC Educational Resources Information Center

    Magliano, Joseph P.; Millis, Keith K.

    2003-01-01

    Two studies examined the viability of assessing reading strategies using a think-aloud protocol combined with latent semantic analysis (LSA). Findings demonstrated that the responses of less-skilled readers semantically overlapped more with focal sentences than with causal antecedent sentences, whereas skilled readers' responses overlapped with…

  5. Pairwise Comparison Procedures for One-Way Analysis of Variance Designs. Research Report.

    ERIC Educational Resources Information Center

    Zwick, Rebecca

    Research in the behavioral and health sciences frequently involves the application of one-factor analysis of variance models. The goal may be to compare several independent groups of subjects on a quantitative dependent variable or to compare measurements made on a single group of subjects on different occasions or under different conditions. In…

  6. 40 CFR 93.123 - Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... CO, PM10, and PM2.5 concentrations (hot-spot analysis). 93.123 Section 93.123 Protection of.... or the Federal Transit Laws § 93.123 Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis). (a) CO hot-spot analysis. (1) The demonstrations required by §...

  7. 40 CFR 93.123 - Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... CO, PM10, and PM2.5 concentrations (hot-spot analysis). 93.123 Section 93.123 Protection of.... or the Federal Transit Laws § 93.123 Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis). (a) CO hot-spot analysis. (1) The demonstrations required by §...

  8. 40 CFR 93.123 - Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... CO, PM10, and PM2.5 concentrations (hot-spot analysis). 93.123 Section 93.123 Protection of.... or the Federal Transit Laws § 93.123 Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis). (a) CO hot-spot analysis. (1) The demonstrations required by §...

  9. 40 CFR 93.123 - Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CO, PM10, and PM2.5 concentrations (hot-spot analysis). 93.123 Section 93.123 Protection of.... or the Federal Transit Laws § 93.123 Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis). (a) CO hot-spot analysis. (1) The demonstrations required by §...

  10. Intraoperative blood flow analysis of direct revascularization procedures in patients with moyamoya disease

    PubMed Central

    Lee, Marco; Guzman, Raphael; Bell-Stephens, Teresa; Steinberg, Gary K

    2011-01-01

    Moyamoya disease is characterized by the progressive stenosis and often occlusion of the terminal internal carotid arteries, which leads to ischemic and hemorrhagic injuries. The etiology is unknown and surgical revascularization remains the mainstay treatment. We analyzed various hemodynamic factors in 292 patients with moyamoya disease, representing 496 revascularization procedures, including vessel dimension and intraoperative blood flow, using a perivascular ultrasonic flowprobe. Mean middle cerebral artery (MCA) flow rate was 4.4±0.26 mL/min. After superficial temporal artery (STA)–MCA bypass surgery, flows at the microanastomosis were increased fivefold to a mean of 22.2±0.8 mL/min. The MCA flows were significantly lower in the pediatric (16.2±1.3 mL/min) compared with the adult (23.9±1.0 mL/min; P<0.0001) population. Increased local flow rates were associated with clinical improvement. Permanent postoperative complications were low (<5%), but very high postanastomosis MCA flow was associated with postoperative stroke (31.2±6.8 mL/min; P=0.045), hemorrhage (32.1±10.2 mL/min; P=0.045), and transient neurologic deficits (28.6±5.6 mL/min; P=0.047) compared with controls. Other flow and vessel dimension data are presented to elucidate the hemodynamic changes related to the vasculopathy and subsequent to surgical intervention. PMID:20588321

  11. Activity.

    ERIC Educational Resources Information Center

    Clearing: Nature and Learning in the Pacific Northwest, 1984

    1984-01-01

    Presents three activities: (1) investigating succession in a schoolground; (2) investigating oak galls; and (3) making sun prints (photographs made without camera or darkroom). Each activity includes a list of materials needed and procedures used. (JN)

  12. Development of a sequential injection-liquid microextraction procedure with GC-FID for analysis of short-chain fatty acids in palm oil mill effluent.

    PubMed

    Pruksatrakul, Thapanee; Phoopraintra, Pattamaporn; Wilairat, Prapin; Chaiyen, Pimchai; Chantiwas, Rattikan

    2017-04-01

    Short-chain fatty acids, such as acetic, propionic, butyric, iso-valeric and valeric acids, play an important role in methanogenesis activity for biogas production processes. Thus, simple and rapid procedures for monitoring the levels of short-chain fatty acids are required for sustaining biogas production. This work presents the development of a sequential injection-liquid microextraction (SI-LME) procedure with GC-FID analysis for determination of short-chain fatty acids. GC-FID was employed for detection of the short-chain fatty acids. Calibration curves were linear with good coefficients of determination (r(2)>0.999), using methacrylic acid as the internal standard. Limits of quantification (LOQ) were in the range of 0.03-0.19mM. The SI-LME procedure employed tert-butyl methyl ether (TBME) as the extracting solvent. Various SI-LME conditions were investigated and optimized to obtain the highest recovery of extraction. With these optimized conditions, an extraction recovery of the five key short-chain fatty acids of 67-90% was obtained, with less than 2% RSD (n=3). The final SI-LME procedure employed two fluidic zones of TBME with a single aqueous fluidic zone of sample sandwiched between the TBME zones, with 5 cycles of flow reversal at a flow rate of 5µL/s for the extraction process. Intra- and inter-day precision values were 0.5-4.0% RSD and 3.3-4.8% RSD, respectively. Accuracy, based on percentage of sample recovery, was in the range of 69-96, 102-107, and 82-101% (n=4) for acetic, propionic and butyric acids, respectively. The proposed method was applied for the measurement of short-chain fatty acids in palm oil mill effluents used in biogas production in a factory performing palm oil extraction process. The SI-LME method provides improved extraction performance with high precision, and is both simple and rapid with its economical extraction technique. The SI-LME procedure with GC-FID has strong potential for use as a quality control process for monitoring
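
    The internal-standard quantification step described above can be illustrated with a short calibration sketch: peak-area ratios of analyte to the methacrylic acid internal standard are regressed against standard concentrations and the fit is then inverted for samples. All numbers below are hypothetical, not data from the study.

    ```python
    import numpy as np

    # Hypothetical calibration for acetic acid: concentration (mM) vs. the ratio
    # of analyte peak area to the methacrylic acid internal-standard peak area.
    conc_mm = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
    area_ratio = np.array([0.11, 0.21, 0.42, 1.05, 2.08])

    slope, intercept = np.polyfit(conc_mm, area_ratio, 1)

    def concentration_from_ratio(sample_area, internal_std_area):
        """Convert a sample's analyte/IS peak-area ratio to concentration (mM)."""
        ratio = sample_area / internal_std_area
        return (ratio - intercept) / slope

    # Hypothetical sample chromatogram: analyte peak area 15400, IS peak area 23100.
    print(f"acetic acid: {concentration_from_ratio(15400, 23100):.2f} mM")
    ```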

  13. A Guide for Developing Standard Operating Job Procedures for the Activated Sludge - Aeration & Sedimentation Process Wastewater Treatment Facility. SOJP No. 5.

    ERIC Educational Resources Information Center

    Mason, George J.

    This guide for developing standard operating job procedures for wastewater treatment facilities is devoted to the activated sludge aeration and sedimentation process. This process is for conversion of nonsettleable and nonfloatable materials in wastewater to settleable, floculated biological groups and separation of the settleable solids from the…

  14. The Effectiveness of Embedded Teaching through the Most-to-Least Prompting Procedure in Concept Teaching to Children with Autism within Orff-Based Music Activities

    ERIC Educational Resources Information Center

    Eren, Bilgehan; Deniz, Jale; Duzkantar, Ayten

    2013-01-01

    The purpose of this study was to demonstrate the effectiveness of embedded teaching through the most-to-least prompting procedure in concept teaching to children with autism in Orff-based music activities. In this research, being one of the single subject research designs, multiple probe design was used. The generalization effect of the research…

  15. Development of a comprehensive analysis for rotorcraft. II - Aircraft model, solution procedure and applications

    NASA Technical Reports Server (NTRS)

    Johnson, W.

    1981-01-01

    The development of a comprehensive analytical model of rotorcraft aerodynamics and dynamics is described. Particular emphasis is given to describing the reasons behind the choices and decisions involved in constructing the model. The analysis is designed to calculate rotor performance, loads and noise; helicopter vibration and gust response; flight dynamics and handling qualities; and system aeroelastic stability. It is intended for use in the design, testing and evaluation of a wide class of rotors and rotorcraft and to be the basis for further development of rotary wing theories. The general characteristics of the geometric, structural, inertial and aerodynamic models used for the rotorcraft components are described, including the assumptions introduced by the chosen models and the resulting capabilities and limitations. Finally, some examples from recent applications of the analysis are given.

  16. Procedures for analysis of spatial relationships among ship survey data and sea surface temperature

    NASA Technical Reports Server (NTRS)

    Berry, J. K.; Sailor, J. K.

    1981-01-01

    The establishment of a limited spatial data base for the U.S. eastern seaboard vicinity is discussed along with the demonstration of computer assisted analysis techniques for investigating spatial patterns and relationships among ship survey data and remotely sensed sea surface temperature. Ship survey variables included concentrations of two zooplankton, two ichthyoplankton, and two fish species, in addition to physical data on bottom depth and on surface and bottom water temperatures. Continuous spatial distributions of these data were created by both weighted nearest neighbor and iterative smoothing interpolation techniques. Maps of surface water temperature were created by digitizing GOES satellite images. All mapped data were spatially registered by conversion of latitude and longitude coordinates to rotated Lambert conic conformal rectilinear coordinates and stored in a grid format of approximately one hundred square kilometers per cell. The analysis of these data includes the generation of statistical summaries and maps describing the joint occurrence among variables.
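
    A minimal sketch of the weighted-nearest-neighbor interpolation step described here: scattered ship-survey values are spread onto a regular grid using inverse-distance weights over the k nearest observations. The power parameter, k, grid spacing, and station values are illustrative choices, not those of the original study.

    ```python
    import numpy as np

    def idw_grid(xy_obs, values, grid_x, grid_y, k=4, power=2.0):
        """Inverse-distance-weighted interpolation of scattered observations onto
        a regular grid, using the k nearest observations per cell."""
        xy_obs = np.asarray(xy_obs, float)
        values = np.asarray(values, float)
        gx, gy = np.meshgrid(grid_x, grid_y)
        out = np.empty(gx.shape)
        for i in range(gx.shape[0]):
            for j in range(gx.shape[1]):
                d = np.hypot(xy_obs[:, 0] - gx[i, j], xy_obs[:, 1] - gy[i, j])
                nearest = np.argsort(d)[:k]
                if d[nearest[0]] < 1e-9:           # grid cell on top of a sample
                    out[i, j] = values[nearest[0]]
                else:
                    w = 1.0 / d[nearest] ** power
                    out[i, j] = np.sum(w * values[nearest]) / np.sum(w)
        return out

    # Hypothetical zooplankton concentrations at four stations (km coordinates).
    stations = [(0, 0), (10, 0), (0, 10), (10, 10)]
    counts = [12.0, 30.0, 18.0, 25.0]
    grid = idw_grid(stations, counts, np.arange(0, 11, 5), np.arange(0, 11, 5))
    print(grid)
    ```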

  17. Computer-Aided Diagnosis of Solid Breast Lesions Using an Ultrasonic Multi-Feature Analysis Procedure

    DTIC Science & Technology

    2011-01-01

    ... areas. We quantified morphometric features by geometric and fractal analysis of traced lesion boundaries. Although no single parameter can reliably ... These include acoustic descriptors ("echogenicity," "heterogeneity," "shadowing") and morphometric descriptors ("area," "aspect ratio," "border ...") ... quantitative descriptors; some morphometric features (such as border irregularity) also were particularly effective in lesion classification. Our ...

  18. Improved analysis of electron penetration and numerical procedures for space radiation shielding

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Denn, F. M.

    1977-01-01

    Electron penetration calculational techniques are reviewed with regard to their suitability for shield analysis in future space operations. Methods based on the transmission factors of Mar are discussed and a correction term for low-energy electrons, which results in slightly conservative shield estimates, is derived. This modified Mar's method provides estimates of the dose for electrons that penetrate through shields of arbitrary elemental material with an atomic number greater than four. A complete computer algorithm is included.

  19. Clarifying an ambiguous functional analysis with matched and mismatched extinction procedures.

    PubMed

    Kuhn, D E; DeLeon, I G; Fisher, W W; Wilke, A E

    1999-01-01

    Results of functional analysis were ambiguous in suggesting that self-injurious behavior (SIB) was maintained by escape, sensory reinforcement, or both. To help clarify these results, we compared escape extinction, sensory extinction, and the combined treatments. Sensory extinction proved to be a necessary and sufficient treatment, whereas escape extinction failed to decrease SIB. These analyses helped to clarify the function of SIB and to identify an effective and efficient treatment.

  20. Clarifying an ambiguous functional analysis with matched and mismatched extinction procedures.

    PubMed Central

    Kuhn, D E; DeLeon, I G; Fisher, W W; Wilke, A E

    1999-01-01

    Results of functional analysis were ambiguous in suggesting that self-injurious behavior (SIB) was maintained by escape, sensory reinforcement, or both. To help clarify these results, we compared escape extinction, sensory extinction, and the combined treatments. Sensory extinction proved to be a necessary and sufficient treatment, whereas escape extinction failed to decrease SIB. These analyses helped to clarify the function of SIB and to identify an effective and efficient treatment. PMID:10201106

  1. Development of calibration training and procedures using job-task analysis

    SciTech Connect

    Smith, R.A.

    1993-12-01

    Efforts to handle an increased workload with dwindling manpower in the Physical and Electrical Standards Laboratory (Standards Lab) at the Oak Ridge Y-12 Plant are described. Empowerment of workers via Total Quality Management (TQM) is the basis for these efforts. A survey and follow-up teamwork were the course of action. The job-task analysis received honors from peers at the Y-12 Plant.

  2. Eulogy for a neutron activation analysis facility

    SciTech Connect

    Lepel, E.A.

    2000-07-01

    A relatively inexpensive facility for neutron activation analysis (NAA) was developed in the early 1970s at Pacific Northwest National Laboratory (PNNL). With the availability of large ²⁵²Cf sources, a subcritical facility was designed that could contain up to 100 mg of ²⁵²Cf (T½ = 2.645 yr and a spontaneous fission yield of 2.34 × 10⁹ n/s·mg⁻¹). The ²⁵²Cf source was surrounded by a hexagonal array of ²³⁵U-enriched fuel rods, which provided a 10- to 20-fold multiplication of the neutrons emitted from the ²⁵²Cf source. This assembly was located near the bottom of a 1.52-m-diam x 6.10-m-deep water-filled pool. The Neutron Multiplier Facility (NMF) was operational from November 1977 to April 1998, a period of 20.4 yr. The NMF began operation with approximately 100 mg of ²⁵²Cf, and because of decay of the ²⁵²Cf, it had decreased to 0.34 mg at the time of shutdown. Decommissioning of the NMF began in April 1998 and was completed in October 1999.

  3. Improving financial performance by modeling and analysis of radiology procedure scheduling at a large community hospital.

    PubMed

    Lu, Lingbo; Li, Jingshan; Gisler, Paula

    2011-06-01

    Radiology tests, such as MRI, CT-scan, X-ray and ultrasound, are cost intensive and insurance pre-approvals are necessary to get reimbursement. In some cases, tests may be denied for payments by insurance companies due to lack of pre-approvals, inaccurate or missing necessary information. This can lead to substantial revenue losses for the hospital. In this paper, we present a simulation study of a centralized scheduling process for outpatient radiology tests at a large community hospital (Central Baptist Hospital in Lexington, Kentucky). Based on analysis of the central scheduling process, a simulation model of information flow in the process has been developed. Using such a model, the root causes of financial losses associated with errors and omissions in this process were identified and analyzed, and their impacts were quantified. In addition, "what-if" analysis was conducted to identify potential process improvement strategies in the form of recommendations to the hospital leadership. Such a model provides a quantitative tool for continuous improvement and process control in radiology outpatient test scheduling process to reduce financial losses associated with process error. This method of analysis is also applicable to other departments in the hospital.

  4. Hair analysis in order to evaluate drug abuse in driver's license regranting procedures.

    PubMed

    Tassoni, G; Mirtella, D; Zampi, M; Ferrante, L; Cippitelli, M; Cognigni, E; Froldi, R; Cingolani, M

    2014-11-01

    In Italy, driving under the influence of drugs results in the suspension of the offender's driver's license. To regain the license, the person must be drug free during an observation period. People whose license has been revoked or suspended can obtain, or re-obtain, their driver's license subject to the judgment of a medical commission. The exclusion of illicit drug use is determined by means of toxicological analysis, mainly on urine or hair matrices. We report the results of several years of experience of the forensic toxicology laboratory of the University of Macerata in the use of hair analysis for the assessment of past exposure to drugs in people suspected of driving under the influence of drugs. From 2004 to 2013, 8612 hair samples were analyzed for opiates, cocaine and delta-9-tetrahydrocannabinol (Δ(9)-THC) using a gas chromatography/mass spectrometry (GC/MS) method. We used cutoffs (SoHT or national guidelines) to classify samples as positive, regardless of the hair sample concentrations. A total of 1213 samples tested positive: 71.7% for cocaine and metabolites, 19.8% for morphine and metabolites, and 8.5% for Δ(9)-THC. We also studied the timeframe of the abuse, as well as the gender and age distribution of positive subjects. Moreover, we analyzed the possible deterrent effect of the hair analysis on driving under the influence of psychoactive substances.

  5. Rapid and Efficient Filtration-Based Procedure for Separation and Safe Analysis of CBRN Mixed Samples

    PubMed Central

    Bentahir, Mostafa; Laduron, Frederic; Irenge, Leonid; Ambroise, Jérôme; Gala, Jean-Luc

    2014-01-01

    Separating CBRN mixed samples that contain both chemical and biological warfare agents (CB mixed sample) in liquid and solid matrices remains a very challenging issue. Parameters were set up to assess the performance of a simple filtration-based method first optimized on separate C- and B-agents, and then assessed on a model of CB mixed sample. In this model, MS2 bacteriophage, Autographa californica nuclear polyhedrosis baculovirus (AcNPV), Bacillus atrophaeus and Bacillus subtilis spores were used as biological agent simulants whereas ethyl methylphosphonic acid (EMPA) and pinacolyl methylphosphonic acid (PMPA) were used as VX and soman (GD) nerve agent surrogates, respectively. Nanoseparation centrifugal devices with various pore size cut-offs (30 kD up to 0.45 µm) and three RNA extraction methods (Invisorb, EZ1 and Nuclisens) were compared. RNA (MS2) and DNA (AcNPV) quantification was carried out by means of specific and sensitive quantitative real-time PCRs (qPCR). Liquid chromatography coupled to time-of-flight mass spectrometry (LC/TOFMS) was used for quantifying EMPA and PMPA. Culture methods and qPCR demonstrated that membranes with a 30 kD cut-off retain more than 99.99% of the biological agents (MS2, AcNPV, Bacillus atrophaeus and Bacillus subtilis spores) tested separately. A rapid and reliable separation of CB mixed sample models (MS2/PEG-400 and MS2/EMPA/PMPA) contained in simple liquid or complex matrices such as sand and soil was also successfully achieved on a 30 kD filter with more than 99.99% retention of MS2 on the filter membrane, and up to 99% of PEG-400, EMPA and PMPA recovery in the filtrate. The whole separation process turnaround-time (TAT) was less than 10 minutes. The filtration method appears to be rapid, versatile and extremely efficient. The separation method developed in this work therefore constitutes a useful model for further evaluating and comparing additional separation alternative procedures for a safe handling and

  6. Rapid and efficient filtration-based procedure for separation and safe analysis of CBRN mixed samples.

    PubMed

    Bentahir, Mostafa; Laduron, Frederic; Irenge, Leonid; Ambroise, Jérôme; Gala, Jean-Luc

    2014-01-01

    Separating CBRN mixed samples that contain both chemical and biological warfare agents (CB mixed sample) in liquid and solid matrices remains a very challenging issue. Parameters were set up to assess the performance of a simple filtration-based method first optimized on separate C- and B-agents, and then assessed on a model of CB mixed sample. In this model, MS2 bacteriophage, Autographa californica nuclear polyhedrosis baculovirus (AcNPV), Bacillus atrophaeus and Bacillus subtilis spores were used as biological agent simulants whereas ethyl methylphosphonic acid (EMPA) and pinacolyl methylphosphonic acid (PMPA) were used as VX and soman (GD) nerve agent surrogates, respectively. Nanoseparation centrifugal devices with various pore size cut-offs (30 kD up to 0.45 µm) and three RNA extraction methods (Invisorb, EZ1 and Nuclisens) were compared. RNA (MS2) and DNA (AcNPV) quantification was carried out by means of specific and sensitive quantitative real-time PCRs (qPCR). Liquid chromatography coupled to time-of-flight mass spectrometry (LC/TOFMS) was used for quantifying EMPA and PMPA. Culture methods and qPCR demonstrated that membranes with a 30 kD cut-off retain more than 99.99% of the biological agents (MS2, AcNPV, Bacillus atrophaeus and Bacillus subtilis spores) tested separately. A rapid and reliable separation of CB mixed sample models (MS2/PEG-400 and MS2/EMPA/PMPA) contained in simple liquid or complex matrices such as sand and soil was also successfully achieved on a 30 kD filter with more than 99.99% retention of MS2 on the filter membrane, and up to 99% of PEG-400, EMPA and PMPA recovery in the filtrate. The whole separation process turnaround-time (TAT) was less than 10 minutes. The filtration method appears to be rapid, versatile and extremely efficient. The separation method developed in this work therefore constitutes a useful model for further evaluating and comparing additional separation alternative procedures for a safe handling and

  7. Epidural tramadol via intraoperatively placed catheter as a standalone analgesic after spinal fusion procedure: An analysis of efficacy and cost

    PubMed Central

    Ilangovan, Vijaysundar; Vivakaran, Thanga Tirupathi Rajan; Gunasekaran, D.; Devikala, D.

    2017-01-01

    Objective: This was a prospective analysis of epidural tramadol as a single analgesic agent delivered through an intraoperatively placed epidural catheter for postoperative pain relief after spinal fusion procedures in terms of efficacy and cost. Materials and Methods: Twenty patients who underwent spinal fusion procedures were included in the study. After completion of the procedure, an epidural catheter was placed at the highest level of exposed dura and brought out through a separate tract. Postoperatively, tramadol was infused into the epidural space via the catheter at a dose of 1 mg/kg diluted in 10 ml of saline. The dosage frequency was arbitrarily fixed at every 6 h during the first 2 days and thereafter reduced to every 8 h until day 5. Conventional intravenous analgesics were used only if additional analgesia was required as assessed by visual analog scale (VAS). Results: Patients’ VAS scores were assessed every 4 h from the day of surgery. Patients with a VAS score of 6 or more were given additional analgesia in the form of intravenous paracetamol. Of the twenty patients, eight patients needed additional analgesia during the first 24 h and none required additional analgesia after the first 24 h. The median VAS score was 7 within the first 24 h and progressively declined thereafter. Epidural tramadol was noted to be many times cheaper than conventional parenteral analgesics. Conclusion: Epidural tramadol infusion is safe and effective as a standalone analgesic after open spinal fusion surgery, especially after the 1st postoperative day. Intraoperative placement of the epidural catheter is a simple way of delivering tramadol to the epidural space. The cost of analgesia after spinal fusion surgery can be reduced significantly using epidural tramadol alone. PMID:28149082

  8. Evaluation of Procedures for the Collection, Processing, and Analysis of Biomolecules from Low-Biomass Surfaces▿†

    PubMed Central

    Kwan, K.; Cooper, M.; La Duc, M. T.; Vaishampayan, P.; Stam, C.; Benardini, J. N.; Scalzi, G.; Moissl-Eichinger, C.; Venkateswaran, K.

    2011-01-01

    To comprehensively assess microbial diversity and abundance via molecular-analysis-based methods, procedures for sample collection, processing, and analysis were evaluated in depth. A model microbial community (MMC) of known composition, representative of a typical low-biomass surface sample, was used to examine the effects of variables in sampling matrices, target cell density/molecule concentration, and cryogenic storage on the overall efficacy of the sampling regimen. The MMC used in this study comprised 11 distinct species of bacterial, archaeal, and fungal lineages associated with either spacecraft or clean-room surfaces. A known cellular density of MMC was deposited onto stainless steel coupons, and after drying, a variety of sampling devices were used to recover cells and biomolecules. The biomolecules and cells/spores recovered from each collection device were assessed by cultivable and microscopic enumeration, and quantitative and species-specific PCR assays. rRNA gene-based quantitative PCR analysis showed that cotton swabs were superior to nylon-flocked swabs for sampling of small surface areas, and for larger surfaces, biological sampling kits significantly outperformed polyester wipes. Species-specific PCR revealed differential recovery of certain species dependent upon the sampling device employed. The results of this study empower current and future molecular-analysis-based microbial sampling and processing methodologies. PMID:21398492

  9. Collected radiochemical and geochemical procedures

    SciTech Connect

    Kleinberg, J

    1990-05-01

    This revision of LA-1721, 4th Ed., Collected Radiochemical Procedures, reflects the activities of two groups in the Isotope and Nuclear Chemistry Division of the Los Alamos National Laboratory: INC-11, Nuclear and radiochemistry; and INC-7, Isotope Geochemistry. The procedures fall into five categories: I. Separation of Radionuclides from Uranium, Fission-Product Solutions, and Nuclear Debris; II. Separation of Products from Irradiated Targets; III. Preparation of Samples for Mass Spectrometric Analysis; IV. Dissolution Procedures; and V. Geochemical Procedures. With one exception, the first category of procedures is ordered by the positions of the elements in the Periodic Table, with separate parts on the Representative Elements (the A groups); the d-Transition Elements (the B groups and the Transition Triads); and the Lanthanides (Rare Earths) and Actinides (the 4f- and 5f-Transition Elements). The members of Group IIIB-- scandium, yttrium, and lanthanum--are included with the lanthanides, elements they resemble closely in chemistry and with which they occur in nature. The procedures dealing with the isolation of products from irradiated targets are arranged by target element.

  10. Meta-analysis of organ damage after conversion from off-pump coronary artery bypass procedures.

    PubMed

    Mukherjee, Dayal; Rao, Christopher; Ibrahim, Michael; Ahmed, Kamran; Ashrafian, Hutan; Protopapas, Aristotle; Darzi, Ara; Athanasiou, Thanos

    2011-08-01

    The relative efficacy of off-pump and on-pump coronary revascularization is uncertain. A complication of off-pump surgery which is rarely considered is intraoperative conversion to cardiopulmonary bypass. Consequently, meta-analysis was performed of studies comparing morbidity after converted and nonconverted off-pump coronary revascularization. There were significant increases in the likelihood of stroke, myocardial injury, bleeding, renal failure, wound infection, intraaortic balloon pump requirement, transfusion, and respiratory and gastrointestinal complications after conversion. The underlying mechanisms need to be urgently elucidated. Prevention and treatment protocols for conversion warrant serious consideration and the risk of conversion may need to be discussed when obtaining informed patient consent.

  11. Analysis, Verification, and Application of Equations and Procedures for Design of Exhaust-pipe Shrouds

    NASA Technical Reports Server (NTRS)

    Ellerbrock, Herman H.; Wcislo, Chester R.; Dexter, Howard E.

    1947-01-01

    Investigations were made to develop a simplified method for designing exhaust-pipe shrouds to provide desired or maximum cooling of exhaust installations. Analysis of heat exchange and pressure drop of an adequate exhaust-pipe shroud system requires equations for predicting design temperatures and pressure drop on cooling air side of system. Present experiments derive such equations for usual straight annular exhaust-pipe shroud systems for both parallel flow and counter flow. Equations and methods presented are believed to be applicable under certain conditions to the design of shrouds for tail pipes of jet engines.

  12. Evaluation of a procedure for the analysis of nonstationary vibroacoustic data

    NASA Technical Reports Server (NTRS)

    Himelblau, Harry; Piersol, Allan G.

    1989-01-01

    Numerical techniques for the spectral analysis of vibration data from space-vehicle launches are described and demonstrated. A nonstationary product model described by Bendat and Piersol (1986) and its locally stationary version (Silverman, 1957) are applied to Space Shuttle flight data, and the results are presented in extensive graphs. It is shown that the nonstationary model can analyze data from longer sampling periods and thus significantly reduce random error; this in turn leads to vibration spectra lower than those obtained with short-duration models.
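
    The nonstationary product model referred to above represents a record as x(t) = a(t)·u(t), a deterministic envelope a(t) multiplying a stationary random process u(t). A rough Python sketch of the idea on synthetic data (a simple short-time RMS envelope estimate, not the authors' procedure) is:

        import numpy as np
        from scipy.signal import welch

        fs, n = 1024.0, 2**16
        t = np.arange(n) / fs
        u = np.random.default_rng(0).standard_normal(n)      # stationary part
        a = 1.0 + 0.8 * np.sin(2 * np.pi * 0.05 * t)          # slowly varying envelope
        x = a * u                                             # nonstationary product-model record

        # Estimate the envelope from a short-time RMS, then divide it out so the
        # whole (long) record can be used for one stationary spectral estimate.
        win = int(fs)                                         # 1 s averaging window
        a_hat = np.sqrt(np.convolve(x**2, np.ones(win) / win, mode="same"))
        f, pxx = welch(x / np.maximum(a_hat, 1e-12), fs=fs, nperseg=4096)
        print("spectral estimate uses the full record:", len(f), "frequency bins")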

  13. A meta-analysis assessing the benefits of concomitant pleural tent procedure after upper lobectomy.

    PubMed

    Uzzaman, Mohammed M; Daniel Robb, J; Mhandu, Peter C E; Khan, Habib; Baig, Kamran; Chaubey, Sanjay; Whitaker, Donald C

    2014-01-01

    A meta-analysis comparing outcomes of upper lobectomies with or without pleural tenting was performed. Five trials comprising 396 patients were selected. There was significantly reduced duration of hospital stay, chest drain use, and air leak in the pleural tenting group compared with the group without the pleural tent. There was also a significant reduction in number of patients with prolonged air leak more than 7 days in pleural tenting group. No other difference was noted in other outcomes such as total drainage, operative time, or hospital costs. In patients at high-risk of air leak, we advocate concomitant use of the pleural tent after upper lobectomies.

  14. 22 CFR 217.61 - Procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... ACTIVITIES RECEIVING FEDERAL FINANCIAL ASSISTANCE Procedures § 217.61 Procedures. The procedural provisions applicable to title VI of the Civil Rights Act of 1964 apply to this part. These procedures are found...

  15. 45 CFR 84.61 - Procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... PROGRAMS OR ACTIVITIES RECEIVING FEDERAL FINANCIAL ASSISTANCE Procedures § 84.61 Procedures. The procedural provisions applicable to title VI of the Civil Rights Act of 1964 apply to this part. These procedures...

  16. An XML Representation for Crew Procedures

    NASA Technical Reports Server (NTRS)

    Simpson, Richard C.

    2005-01-01

    NASA ensures safe operation of complex systems through the use of formally-documented procedures, which encode the operational knowledge of the system as derived from system experts. Crew members use procedure documentation on the ground for training purposes and on-board space shuttle and space station to guide their activities. Investigators at JSC are developing a new representation for procedures that is content-based (as opposed to display-based). Instead of specifying how a procedure should look on the printed page, the content-based representation will identify the components of a procedure and (more importantly) how the components are related (e.g., how the activities within a procedure are sequenced; what resources need to be available for each activity). This approach will allow different sets of rules to be created for displaying procedures on a computer screen, on a hand-held personal digital assistant (PDA), verbally, or on a printed page, and will also allow intelligent reasoning processes to automatically interpret and use procedure definitions. During his NASA fellowship, Dr. Simpson examined how various industries represent procedures (also called business processes or workflows), in areas such as manufacturing, accounting, shipping, or customer service. A useful method for designing and evaluating workflow representation languages is by determining their ability to encode various workflow patterns, which depict abstract relationships between the components of a procedure removed from the context of a specific procedure or industry. Investigators have used this type of analysis to evaluate how well-suited existing workflow representation languages are for various industries based on the workflow patterns that commonly arise across industry-specific procedures. Based on this type of analysis, it is already clear that existing workflow representations capture discrete flow of control (i.e., when one activity should start and stop based on when other
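
    As a loose illustration of a content-based (rather than display-based) encoding, the sketch below builds a tiny procedure description with explicit sequencing and resource requirements; the element and attribute names are invented for illustration and are not the JSC representation described in the record:

        import xml.etree.ElementTree as ET

        proc = ET.Element("procedure", id="water-sample-demo")      # hypothetical schema
        step1 = ET.SubElement(proc, "activity", id="a1")
        ET.SubElement(step1, "resource", name="sample-bag")
        step2 = ET.SubElement(proc, "activity", id="a2")
        ET.SubElement(step2, "resource", name="spectrometer")
        # Relationships are encoded as content, not layout: a2 may only start after a1.
        ET.SubElement(proc, "sequence", predecessor="a1", successor="a2")

        print(ET.tostring(proc, encoding="unicode"))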

  17. Influence of the preparation procedure on the catalytic activity of gold supported on diamond nanoparticles for phenol peroxidation.

    PubMed

    Martin, Roberto; Navalon, Sergio; Delgado, Juan Jose; Calvino, Jose J; Alvaro, Mercedes; Garcia, Hermenegildo

    2011-08-16

    The catalytic activity of diamond-supported gold nanoparticle (Au/D) samples prepared by the deposition/precipitation method has been correlated as a function of the pH and the reduction treatment. It was found that the most active material is the one prepared at pH 5 followed by subsequent thermal treatment at 300 °C under hydrogen. TEM images show that Au/D prepared under optimal conditions contain very small gold nanoparticles with sizes below 2 nm that are proposed to be responsible for the catalytic activity. Tests of productivity using large phenol (50 g L(-1)) and H(2)O(2) excesses (100 g L(-1)) and reuse give a minimum TON of 458,759 moles of phenol degraded per gold atom. Analysis of the organic compounds extracted from the deactivated solid catalyst indicates that the poisons are mostly hydroxylated dicarboxylic acids arising from the degradative oxidation of the phenyl ring. By determining the efficiency for phenol degradation and the amount of O(2) evolved, two different reactions of H(2)O(2) decomposition (the Fenton reaction at acidic pH values and spurious O(2) evolution at basic pH values) are proposed for Au/D catalysis. The activation energy of the two processes is very similar (ranging between 30 and 35 kJ mol(-1)). By using dimethylsulfoxide as a radical scavenger and N-tert-butyl-α-phenylnitrone as a spin trap under aerated conditions, the EPR spectrum of the expected PBN-OCH(3) adduct was detected, supporting the generation of HO(.), characteristic of Fenton chemistry, in the process. Phenol degradation, on the other hand, exhibits the same activation energy as H(2)O(2) decomposition at pH 4 (due to the barrierless attack of HO(.) on phenol), but its activation energy increases gradually up to about 90 kJ mol(-1) at pH 7 and then undergoes a subsequent reduction as the pH increases, reaching another minimum at pH 8.5 (49 kJ mol(-1)).

  18. Charge-coupled device imaging spectroscopy of Mars. I - Instrumentation and data reduction/analysis procedures

    NASA Technical Reports Server (NTRS)

    Bell, James F., III; Lucey, Paul G.; Mccord, Thomas B.

    1992-01-01

    This paper describes the collection, reduction, and analysis of 0.4-1.0-micron Mars imaging spectroscopy data obtained during the 1988 and 1990 oppositions from Mauna Kea Observatory and provides a general outline for the acquisition and analysis of similar imaging spectroscopy data sets. The U.H. 2.24-m Wide Field Grism CCD Spectrograph was used to collect 13 3D image cubes covering 90 percent of the planet south of 50 deg N in the 0.4-0.8 micron region and covering 55 percent of the planet south of 50 deg N in the 0.5-1.0 micron region. Spectra extracted from these image cubes reveal the detailed character of the Martian near-UV to visible spectrum. Images at red wavelengths reveal the 'classical' albedo markings at 100-500 km spatial resolution while images at blue wavelengths show little surface feature contrast and are dominated by condensate clouds/hazes and polar ice.

  19. Evaluation of extraction procedures for 2-DE analysis of aphid proteins.

    PubMed

    Yiou, Pan; Shaoli, An; Kebin, Li; Tao, Wang; Kui, Fang; Hua, Zhang; Yu, Sun; Xun, Yang; Jinghui, Xi

    2013-02-01

    Protein sample preparation is a crucial step in a 2-DE proteomics approach. In order to establish a routine protocol for the application of proteomics analysis to aphids, this study focuses on the specific protein extraction problems in insect tissues and evaluates four methods to bypass them. The approaches of phenol extraction with methanol/ammonium acetate precipitation (PA), TCA/acetone precipitation, PEG precipitation, and no precipitation were evaluated for protein isolation and purification from apterous adult aphids, Sitobion avenae. For 2-DE, the PA protocol was optimal, resulting in good IEF and clear spots. The PA method yielded the greatest amount of protein and displayed the most protein spots in 2-DE gels, as compared with the TCA/acetone precipitation, PEG precipitation and no precipitation protocols. Analysis of protein yield, image quality and spot numbers demonstrates that the TCA/acetone precipitation protocol is a reproducible and reliable method for extracting proteins from aphids. The PEG precipitation approach is a newly developed protein extraction protocol for aphids, from which more unique protein spots can be detected, especially for the detection of acidic proteins. These protocols are expected to be applicable to other insects or could be of interest to laboratories involved in insect proteomics, even though the amounts and types of interfering compounds vary considerably among different insects.

  20. An Environmental Friendly Procedure for Photometric Determination of Hypochlorite in Tap Water Employing a Miniaturized Multicommuted Flow Analysis Setup

    PubMed Central

    Borges, Sivanildo S.; Reis, Boaventura F.

    2011-01-01

    A photometric procedure for the determination of ClO− in tap water employing a miniaturized multicommuted flow analysis setup and an LED-based photometer is described. The analytical procedure was implemented using leucocrystal violet (LCV; 4,4′,4′′-methylidynetris (N,N-dimethylaniline), C25H31N3) as a chromogenic reagent. Solenoid micropumps employed for propelling the solutions were assembled together with the photometer in order to compose a compact unit of small dimensions. After optimization of the control variables, the system was applied for the determination of ClO− in samples of tap water, and, for accuracy assessment, samples were also analyzed using an independent method. Applying the paired t-test between results obtained using both methods, no significant difference at the 95% confidence level was observed. Other useful features include low reagent consumption, 2.4 μg of LCV per determination, a linear response ranging from 0.02 up to 2.0 mg L−1 ClO−, a relative standard deviation of 1.0% (n = 11) for samples containing 0.2 mg L−1 ClO−, a detection limit of 6.0 μg L−1 ClO−, a sampling throughput of 84 determinations per hour, and a waste generation of 432 μL per determination. PMID:21747732
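
    The figures of merit quoted above (working range, detection limit) are the kind that fall out of a straight-line calibration. A small Python sketch with hypothetical calibration data, chosen only to be of the same order as the reported values, shows the usual 3-sigma detection-limit estimate:

        import numpy as np

        # Hypothetical calibration data: absorbance vs. ClO- concentration in mg/L,
        # spanning roughly the 0.02-2.0 mg/L working range quoted in the record.
        conc = np.array([0.0, 0.1, 0.5, 1.0, 1.5, 2.0])
        absorbance = np.array([0.002, 0.031, 0.152, 0.305, 0.452, 0.601])

        slope, intercept = np.polyfit(conc, absorbance, 1)
        s_blank = 0.0006                     # assumed standard deviation of blank readings
        lod = 3 * s_blank / slope            # common 3-sigma detection-limit estimate
        print(f"slope = {slope:.3f} AU per mg/L, LOD ~ {lod * 1000:.1f} ug/L")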

  1. Man-Machine Interaction Design and Analysis System (MIDAS): Memory Representation and Procedural Implications for Airborne Communication Modalities

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Pisanich, Gregory M.; Lebacqz, Victor (Technical Monitor)

    1996-01-01

    The Man-Machine Interaction Design and Analysis System (MIDAS) has been under development for the past ten years through a joint US Army and NASA cooperative agreement. MIDAS represents multiple human operators and selected perceptual, cognitive, and physical functions of those operators as they interact with simulated systems. MIDAS has been used as an integrated predictive framework for the investigation of human/machine systems, particularly in situations with high demands on the operators. Specific examples include: nuclear power plant crew simulation, military helicopter flight crew response, and police force emergency dispatch. In recent applications to airborne systems development, MIDAS has demonstrated an ability to predict flight crew decision-making and procedural behavior when interacting with automated flight management systems and Air Traffic Control. In this paper we describe two enhancements to MIDAS. The first involves the addition of working memory in the form of an articulatory buffer for verbal communication protocols and a visuo-spatial buffer for communications via digital datalink. The second enhancement is a representation of multiple operators working as a team. This enhanced model was used to predict the performance of human flight crews and their level of compliance with commercial aviation communication procedures. We show how the data produced by MIDAS compares with flight crew performance data from full mission simulations. Finally, we discuss the use of these features to study communications issues connected with aircraft-based separation assurance.

  2. A Standard Operating Procedure (SOP) for the preparation of intra- and extracellular proteins of Clostridium acetobutylicum for proteome analysis.

    PubMed

    Schwarz, Katrin; Fiedler, Tomas; Fischer, Ralf-Jörg; Bahl, Hubert

    2007-02-01

    We report on the development of a Standard Operating Procedure (SOP) for extraction and handling of intra- and extracellular protein fractions of Clostridium acetobutylicum ATCC 824 for reproducible high quality two-dimensional gel electrophoresis (2-DE) analyses. Standardized cells from a phosphate-limited chemostat were used to evaluate different protein preparation methods. For the preparation of the secretome, a dialysis/ultrafiltration procedure resulted in higher protein yields and proved to be more reliable compared to different precipitation methods using TCA, DOC-TCA, acetone, and PEG 6000. Sonication was found to be the most efficient method among different tested techniques of cell disruption for the analysis of the intracellular proteome. Furthermore, the effect of protease inhibitors and sample storage conditions were tested for both intra- and extracellular protein samples. Significant changes in the protein pattern were observed depending on the addition of protease inhibitors. 2-DE gels with a pH gradient from 4 to 7 prepared according to the developed SOP contained at least 736 intracellular and 324 extracellular protein spots.

  3. Nonlinear Pressurization and Modal Analysis Procedure for Dynamic Modeling of Inflatable Structures

    NASA Technical Reports Server (NTRS)

    Smalley, Kurt B.; Tinker, Michael L.; Saxon, Jeff (Technical Monitor)

    2002-01-01

    An introduction and set of guidelines for finite element dynamic modeling of nonrigidized inflatable structures are provided. A two-step approach is presented, involving 1) nonlinear static pressurization of the structure and updating of the stiffness matrix and 2) linear normal modes analysis using the updated stiffness. Advantages of this approach are that it provides physical realism in modeling of pressure stiffening, and it maintains the analytical convenience of a standard linear eigensolution once the stiffness has been modified. Demonstration of the approach is accomplished through the creation and test verification of an inflated cylinder model using a large commercial finite element code. Good frequency and mode shape comparisons are obtained with test data and previous modeling efforts, verifying the accuracy of the technique. Problems encountered in the application of the approach, as well as their solutions, are discussed in detail.
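
    The two-step idea, stiffen first and then solve an ordinary eigenproblem, can be shown on a toy lumped-parameter model in which the pressurization step is represented by adding a geometric (stress) stiffness to the elastic stiffness. This is only a conceptual sketch with made-up matrices, not the finite element procedure of the record:

        import numpy as np
        from scipy.linalg import eigh

        # Toy 3-DOF model: elastic stiffness K0, geometric stiffness Kg from the
        # (assumed) pressurization state, and mass matrix M.
        K0 = np.array([[ 2.0, -1.0,  0.0],
                       [-1.0,  2.0, -1.0],
                       [ 0.0, -1.0,  2.0]]) * 1.0e4
        Kg = np.array([[ 1.0, -0.5,  0.0],
                       [-0.5,  1.0, -0.5],
                       [ 0.0, -0.5,  1.0]]) * 2.0e3      # stress stiffening contribution
        M = np.eye(3) * 5.0

        for label, K in (("unpressurized", K0), ("pressurized", K0 + Kg)):
            w2, _ = eigh(K, M)                            # generalized eigenproblem K x = w^2 M x
            freqs = np.sqrt(w2) / (2 * np.pi)
            print(label, np.round(freqs, 2), "Hz")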

  4. Exploring extraction/dissolution procedures for analysis of starch chain-length distributions.

    PubMed

    Wu, Alex Chi; Li, EnPeng; Gilbert, Robert G

    2014-12-19

    The analysis of starch chain-length distributions (CLDs) is important for understanding starch biosynthesis-structure-property relations. The CLD is obtained by analyzing the number distribution of the linear glucan chains released by enzymatic debranching of the starch α-(1→6) glycosidic bonds, followed by characterization with techniques such as fluorophore-assisted carbohydrate electrophoresis (FACE) or size-exclusion chromatography (SEC). Current literature pretreatments for debranching prior to CLD determination involve varying protocols, which might yield artifactual results. This paper examines the two widely used starch dissolution treatments with dimethyl sulfoxide (DMSO) containing 0.5% (w/w) lithium bromide (DMSO-LiBr) at 80 °C and with aqueous alkaline (i.e. NaOH) solvents at 100 °C. Analyses by FACE with a very high range of degree of polymerization, and by SEC, of the CLD of barley starches with different structures show the following. (1) The NaOH treatment, even at a dilute concentration, causes significant degradation at higher degrees of polymerization, leading to quantitatively incorrect CLD results in longer amylopectin and in amylose chains. (2) Certain features in both amylopectin and amylose fractions of the CLD are reduced to bumps or are missing with NaOH treatment. (3) Amylose chains in the starch CLD are overestimated owing to incomplete amylopectin dissolution at dilute NaOH concentrations. These results indicate that starch dissolution with DMSO-LiBr is the method of choice for minimizing artifacts. An improved pretreatment protocol is presented for starch CLD analysis by FACE and SEC.

  5. Target preparation procedure and PIXE analysis of selenium in different foods consumed in the region of Algiers

    NASA Astrophysics Data System (ADS)

    Amokrane, A.; Benamar, M. E. A.

    2002-03-01

    A procedure has been elaborated to measure the trace element selenium (Se) in onions and potatoes consumed in the Algiers region. The important chemical pre-concentration step includes wet sample oxidation and the reduction of Se to the element using co-precipitation with tellurium as carrier element and radioactive 75Se to control the chemical process yield. Finally, the filtered probe material is dissolved, labeled with yttrium as an internal PIXE standard and deposited on a thin foil of polycarbonate by solvent drip evaporation. The resulting targets are investigated by the PIXE analysis with 3-MeV proton beams. The low concentration of Se in the samples is determined by measuring the characteristic X-rays.

  6. Stability analysis of amplitude death in delay-coupled high-dimensional map networks and their design procedure

    NASA Astrophysics Data System (ADS)

    Watanabe, Tomohiko; Sugitani, Yoshiki; Konishi, Keiji; Hara, Naoyuki

    2017-01-01

    The present paper studies amplitude death in high-dimensional maps coupled by time-delay connections. A linear stability analysis provides several sufficient conditions for an amplitude death state to be unstable, i.e., an odd number property and its extended properties. Furthermore, necessary conditions for stability are provided. These conditions, which reduce trial-and-error tasks for design, and the convex direction, which is a popular concept in the field of robust control, allow us to propose a design procedure for system parameters, such as coupling strength, connection delay, and input-output matrices, for a given network topology. These analytical results are confirmed numerically using delayed logistic maps, generalized Henon maps, and piecewise linear maps.
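
    A toy numerical experiment conveys what is being analyzed: two maps coupled through a delayed difference term either keep oscillating or settle onto a fixed point (amplitude death), depending on the coupling strength and connection delay. The sketch below uses the one-dimensional logistic map as a stand-in oscillator and illustrative parameter values; it is not the analysis of the paper:

        import numpy as np

        def late_amplitude(k, tau, r=3.8, steps=6000, keep=500, seed=1):
            """Iterate two logistic maps coupled through a delayed difference term,
            x(n+1) = f(x(n)) + k*[y(n-tau) - x(n)], and return the late-time
            peak-to-peak amplitude of x.  A value near zero indicates that the
            coupling has stabilized the fixed point (amplitude death); 'inf'
            flags a diverging trajectory for that parameter pair."""
            rng = np.random.default_rng(seed)
            f = lambda u: r * u * (1.0 - u)
            x = list(rng.uniform(0.2, 0.8, tau + 1))
            y = list(rng.uniform(0.2, 0.8, tau + 1))
            for n in range(tau, tau + steps):
                x.append(f(x[n]) + k * (y[n - tau] - x[n]))
                y.append(f(y[n]) + k * (x[n - tau] - y[n]))
                if abs(x[-1]) > 10 or abs(y[-1]) > 10:
                    return float("inf")
            tail = np.array(x[-keep:])
            return float(tail.max() - tail.min())

        # Crude scan over coupling strength and connection delay (illustrative values only).
        for k in (0.1, 0.3, 0.5):
            for tau in (1, 2, 3):
                print(f"k={k:.1f} tau={tau}  amplitude={late_amplitude(k, tau):.4f}")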

  7. A Discussion of Procedures and Equipment for the Comprehensive Test Ban Treaty On-Site Inspection Environmental Sampling and Analysis

    SciTech Connect

    Wogman, Ned A.; Milbrath, Brian D.; Payne, Rosara F.; Seifert, Carolyn E.; Friese, Judah I.; Miley, Harry S.; Bowyer, Ted W.; Hanlen, Richard C.; Onishi, Yasuo; Hayes, James C.; Wigmosta, Mark S.

    2011-02-01

    This paper is intended to serve as a scientific basis to start discussions of the available environmental sampling techniques and equipment that have been used in the past that could be considered for use within the context of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) on-site inspections (OSI). This work contains information on the techniques, equipment, costs, and some operational procedures associated with environmental sampling that have actually been used in the past by the United States for the detection of nuclear explosions. This paper also includes a discussion of issues, recommendations, and questions needing further study within the context of the sampling and analysis of aquatic materials, atmospheric gases, atmospheric particulates, vegetation, sediments and soils, fauna, and drill-back materials.

  8. Input Files and Procedures for Analysis of SMA Hybrid Composite Beams in MSC.Nastran and ABAQUS

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.; Patel, Hemant D.

    2005-01-01

    A thermoelastic constitutive model for shape memory alloys (SMAs) and SMA hybrid composites (SMAHCs) was recently implemented in the commercial codes MSC.Nastran and ABAQUS. The model is implemented and supported within the core of the commercial codes, so no user subroutines or external calculations are necessary. The model and resulting structural analysis has been previously demonstrated and experimentally verified for thermoelastic, vibration and acoustic, and structural shape control applications. The commercial implementations are described in related documents cited in the references, where various results are also shown that validate the commercial implementations relative to a research code. This paper is a companion to those documents in that it provides additional detail on the actual input files and solution procedures and serves as a repository for ASCII text versions of the input files necessary for duplication of the available results.

  9. Model free isoconversional procedure for evaluating the effective activation energy values of thermally stimulated processes in dinitroimidazoles

    NASA Astrophysics Data System (ADS)

    Ravi, P.

    2014-05-01

    The decomposition kinetics of 1,4-dinitroimidazole, 2,4-dinitroimidazole, and N-methyl-2,4-dinitroimidazole have been investigated using a thermogravimetry-differential thermal analysis technique under an N2 atmosphere at a flow rate of 100 cm3/min. The Flynn-Wall-Ozawa method and the Friedman method were used for the estimation of the effective activation energy values. These model free isoconversional kinetic methods showed variation in the calculated values due to the approximation of the temperature integral used in the derivations of the kinetic equations. The model compounds decompose by multi-step kinetics, as is evident from the nonlinear relationship of the effective activation energy values with the conversion rate. Three different reaction pathways, namely NO2 elimination, NO elimination, and HONO elimination, are expected to play a crucial role in the decomposition of nitroimidazoles. The model dinitroimidazoles exhibit different decomposition kinetics, and the NO2 elimination and NO elimination pathways compete with each other in the decomposition mechanism. The present study is helpful in understanding the decomposition kinetics and dynamics of substituted nitroimidazoles intended for fuel and explosive applications.
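
    The Friedman method mentioned above estimates the effective activation energy at a fixed conversion from the slope of ln(dα/dt) versus 1/T across several heating rates. A compact synthetic-data illustration (assumed first-order kinetics with an assumed activation energy and pre-exponential factor, not the experimental data of this record) is:

        import numpy as np

        R = 8.314        # J/(mol K)
        Ea_true = 120e3  # assumed activation energy, J/mol
        A = 1e12         # assumed pre-exponential factor, 1/s

        def simulate(beta, T0=400.0, T1=800.0, dT=0.01):
            """Integrate first-order decomposition dalpha/dT = (A/beta) exp(-Ea/RT)(1-alpha)."""
            T = np.arange(T0, T1, dT)
            alpha = np.zeros_like(T)
            for i in range(1, len(T)):
                dadT = (A / beta) * np.exp(-Ea_true / (R * T[i - 1])) * (1 - alpha[i - 1])
                alpha[i] = min(alpha[i - 1] + dadT * dT, 1.0 - 1e-12)
            dadt = beta * np.gradient(alpha, T)          # dalpha/dt = beta * dalpha/dT
            return T, alpha, dadt

        betas = [2 / 60, 5 / 60, 10 / 60, 20 / 60]       # heating rates of 2, 5, 10, 20 K/min in K/s
        runs = [simulate(b) for b in betas]

        for a_star in (0.2, 0.5, 0.8):
            invT, lnrate = [], []
            for T, alpha, dadt in runs:
                i = np.searchsorted(alpha, a_star)       # first point reaching this conversion
                invT.append(1.0 / T[i])
                lnrate.append(np.log(dadt[i]))
            slope, _ = np.polyfit(invT, lnrate, 1)       # slope = -Ea/R at fixed conversion
            print(f"alpha={a_star:.1f}  Ea ~ {-slope * R / 1000:.1f} kJ/mol")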

  10. Mammography screening: an incremental cost effectiveness analysis of two view versus one view procedures in London.

    PubMed Central

    Bryan, S; Brown, J; Warren, R

    1995-01-01

    STUDY OBJECTIVE--To compare the costs and effects of routine mammography screening by a single mediolateral-oblique view and two views (mediolateral-oblique plus craniocaudal) of each breast. DESIGN--A cost effectiveness analysis of a prospective non-randomised trial comparing one and two view mammography screening was carried out at St Margaret's Hospital, Epping. All women in the study had two view mammography. The mediolateral-oblique view was always the first image read by the radiologist. After reading the films for a clinic session, the same radiologist then went back and read both the mediolateral-oblique and craniocaudal views together. Each set of films was read by two radiologists. The main outcome measures were recall rates, number of cancers detected, screening and assessment costs, and cost effectiveness ratios. SUBJECTS--A total of 26,430 women who attended for breast screening using both one and two view mammography participated. A sample of 132 women attending for assessment provided data on the private costs incurred in attending for assessment. RESULTS--There was a reduction in the recall rate from 9.1% (2404 of 26,430) after one view screening to 6.7% (1760 of 26,430) after two view screening. The results also suggest that for every 10,000 women screened an additional five cancers would be detected earlier with two view screening. The additional health service screening cost associated with two view screening was estimated to be 3.63 pounds: the costs associated with one and two view screening policies were estimated to be 41.49 pounds and 32.99 pounds respectively. Private costs incurred were estimated to be 0.35 pounds per woman screened and 32.75 pounds per woman assessed. Two cost effectiveness ratios were calculated: an incremental health service cost per additional cancer detected of 4129 pounds and an incremental health service plus private cost per additional cancer detected of 2742 pounds. The sensitivity analysis suggested that the

  11. Radiation densitometry in tree-ring analysis: a review and procedure manual

    SciTech Connect

    Parker, M.L.; Taylor, F.G.; Doyle, T.W.; Foster, B.E.; Cooper, C.; West, D.C.

    1985-01-01

    An x-ray wood densitometry facility is being established by the Environmental Sciences Division, Oak Ridge National Laboratory (ORNL). The objective is to apply tree-ring data to determine whether or not there is a fertilizer effect on tree growth from increased atmospheric carbon dioxide since the beginning of the industrial era. Intra-ring width and density data, including ring mass, will be determined from tree-ring samples collected from sites located throughout the United States and Canada. This report is designed as a guide to assist ORNL scientists in building the x-ray densitometry system. The history and development of x-ray densitometry in tree-ring research is examined and x-ray densitometry is compared with other techniques. Relevant wood and tree characteristics are described, as are environmental and genetic factors affecting tree growth responses. Methods in x-ray densitometry are examined in detail and the techniques used at four operating laboratories are described. Some ways that dendrochronology has been applied in dating, wood quality, and environmental studies are presented, and a number of tree-ring studies in Canada are described. An annotated bibliography of radiation densitometry in tree-ring analysis and related subjects is included.

  12. Radiometer Calibrations: Saving Time by Automating the Gathering and Analysis Procedures

    NASA Technical Reports Server (NTRS)

    Sadino, Jeffrey L.

    2005-01-01

    Mr. Abtahi custom-designs radiometers for Mr. Hook's research group. Inherently, when the radiometers report the temperature of arbitrary surfaces, the results are affected by errors in accuracy. This problem can be reduced if the errors can be accounted for in a polynomial. This is achieved by pointing the radiometer at a constant-temperature surface. We have been using a Hartford Scientific WaterBath. The measurements from the radiometer are collected at many different temperatures and compared to the measurements made by a Hartford Chubb thermometer with a four-decimal point resolution. The data is analyzed and fit to a fifth-order polynomial. This formula is then uploaded into the radiometer software, enabling accurate data gathering. Traditionally, Mr. Abtahi has done this by hand, spending several hours of his time setting the temperature, waiting for stabilization, taking measurements, and then repeating for other temperatures. My program, written in the Python language, has enabled the data gathering and analysis process to be handed off to a less-senior member of the team. Simply by entering several initial settings, the program will simultaneously control all three instruments and organize the data suitable for computer analyses, thus giving the desired fifth-order polynomial. This will save time, allow for a more complete calibration data set, and allow for base calibrations to be developed. The program is expandable to simultaneously take any type of measurement from up to nine distinct instruments.
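
    The calibration step being automated here is essentially a least-squares polynomial fit of radiometer readings against the reference thermometer. A minimal Python sketch with hypothetical paired readings (not the instrument data of the record) is:

        import numpy as np

        # Hypothetical paired readings: reference bath temperature (deg C) from the
        # precision thermometer vs. the raw radiometer reading at each set point.
        t_ref = np.array([10.0, 15.0, 20.0, 25.0, 30.0, 35.0, 40.0, 45.0, 50.0])
        t_rad = t_ref + 0.05 * np.sin(t_ref / 7.0) - 0.02      # stand-in error pattern

        # Fifth-order correction polynomial mapping raw readings to reference values.
        x = t_rad - t_rad.mean()                               # center to keep the fit well conditioned
        coeffs = np.polyfit(x, t_ref, 5)
        corrected = np.polyval(coeffs, x)
        print("max residual after correction:", np.abs(corrected - t_ref).max())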

  13. Environmental-sanitary risk analysis procedure applied to artificial turf sports fields.

    PubMed

    Ruffino, Barbara; Fiore, Silvia; Zanetti, Maria Chiara

    2013-07-01

    Owing to the extensive use of artificial turfs worldwide, over the past 10 years there has been much discussion about the possible health and environmental problems originating from styrene-butadiene recycled rubber. In this paper, the authors performed a Tier 2 environmental-sanitary risk analysis on five artificial turf sports fields located in the city of Turin (Italy) with the aid of RISC4 software. Two receptors (adult player and child player) and three routes of exposure (direct contact with crumb rubber, contact with rainwater soaking the rubber mat, inhalation of dusts and gases from the artificial turf fields) were considered in the conceptual model. For all the fields and for all the routes, the cumulative carcinogenic risk proved to be lower than 10(-6) and the cumulative non-carcinogenic risk lower than 1. The outdoor inhalation of dusts and gases was the main route of exposure for both carcinogenic and non-carcinogenic substances. The results given by the inhalation pathway were compared with those of a risk assessment carried out on citizens breathing gases and dusts from traffic emissions every day in Turin. For both classes of substances and for both receptors, the inhalation of atmospheric dusts and gases from vehicular traffic gave risk values of one order of magnitude higher than those due to playing soccer on an artificial field.

  14. An Analysis and Procedure for Determining Space Environmental Sink Temperatures With Selected Computational Results

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    2001-01-01

    The purpose of this report was to analyze the heat-transfer problem posed by the determination of spacecraft temperatures and to incorporate the theoretically derived relationships in the computational code TSCALC. The basis for the code was a theoretical analysis of the thermal radiative equilibrium in space, particularly in the Solar System. Beginning with the solar luminosity, the code takes into account these key variables: (1) the spacecraft-to-Sun distance expressed in astronomical units (AU), where 1 AU represents the average Sun-to-Earth distance of 149.6 million km; (2) the angle (arc degrees) at which solar radiation is incident upon a spacecraft surface (ILUMANG); (3) the spacecraft surface temperature (a radiator or photovoltaic array) in kelvin, the surface absorptivity-to-emissivity ratio alpha/epsilon with respect to the solar radiation and (alpha/epsilon)₂ with respect to planetary radiation; and (4) the surface view factor to space F. Outputs from the code have been used to determine environmental temperatures in various Earth orbits. The code was also utilized as a subprogram in the design of power system radiators for deep-space probes.
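
    For a sunlit flat plate with a view factor F to space, a simplified radiative balance gives T = [(alpha/epsilon)·S·cos(theta) / (sigma·F)]^(1/4), with S the solar flux scaled by 1/AU². The sketch below implements this textbook balance, neglecting planetary albedo and infrared loads; it is not the TSCALC formulation itself:

        import math

        SIGMA = 5.670374419e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
        S0 = 1361.0                 # solar constant at 1 AU, W m^-2

        def equilibrium_temp(au, illum_angle_deg, alpha_over_eps, view_factor):
            """Simplified flat-plate balance: alpha*S*cos(theta) = eps*sigma*T^4*F."""
            s = S0 / au**2
            absorbed = alpha_over_eps * s * math.cos(math.radians(illum_angle_deg))
            return (max(absorbed, 0.0) / (SIGMA * view_factor)) ** 0.25

        # Example: sunlit surface at 1 AU, normal incidence, alpha/eps = 0.3, full view of space.
        print(f"{equilibrium_temp(1.0, 0.0, 0.3, 1.0):.1f} K")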

  15. A new bioseed for determination of wastewater biodegradability: analysis of the experimental procedure.

    PubMed

    Ballesteros Martín, M M; Esteban García, B; Ortega-Gómez, E; Sánchez Pérez, J A

    2014-01-01

    A new bioassay proposed in the patent P201300029 was applied to a pre-treated wastewater containing a mixture of commercial pesticides to simulate a recalcitrant industrial wastewater in order to determine its biodegradability. The test uses a standardized inoculum of the lyophilized bacterium Pseudomonas putida with the proper proportion of salts and minerals. The results highlight that biodegradation efficiency can be calculated using a gross parameter, chemical oxygen demand (COD), which facilitates routine biodegradability analysis of water. The same trend was observed throughout the assay with the dehydrated and fresh inocula, with only a 5% difference in biodegradation efficiency (Ef). The obtained results showed that the P. putida biodegradability assay can be used as a commercial test with a lyophilized inoculum in order to monitor the ready biodegradability of an organic pollutant or a WWTP influent. Moreover, a combination of the BOD5/COD ratio and the P. putida biodegradability test is an attractive alternative in order to evaluate the biodegradability enhancement in water pre-treated with advanced oxidation processes (AOPs).

  16. Major morbidity or mortality from office anesthetic procedures: a closed-claim analysis of 13 cases.

    PubMed

    Jastak, J T; Peskin, R M

    1991-01-01

    A closed-claim analysis of anesthetic-related deaths and permanent injuries in the dental office setting was conducted in cooperation with a leading insurer of oral and maxillofacial surgeons and dental anesthesiologists. A total of 13 cases occurring between 1974 and 1989 was included. In each case, all available records, reports, depositions, and proceedings were reviewed. The following were determined for each case: preoperative physical status of the patient, anesthetic technique used (classified as either general anesthesia or conscious sedation), probable cause of the morbid event, avoidability of the occurrence, and contributing factors important to the outcome. The majority of patients were classified as American Society of Anesthesiologists (ASA) status II or III. Most patients had preexisting conditions, such as gross obesity, cardiac disease, epilepsy, and chronic obstructive pulmonary disease, that can significantly affect anesthesia care. Hypoxia arising from airway obstruction and/or respiratory depression was the most common cause of untoward events, and most of the adverse events were determined to be avoidable. The disproportionate number of patients in this sample who were at the extremes of age and with ASA classifications below I suggests that anesthesia risk may be significantly increased in patients who fall outside the healthy, young adult category typically treated in the oral surgical/dental outpatient setting.

  17. Representativeness of laboratory sampling procedures for the analysis of trace metals in soil.

    PubMed

    Dubé, Jean-Sébastien; Boudreault, Jean-Philippe; Bost, Régis; Sona, Mirela; Duhaime, François; Éthier, Yannic

    2015-08-01

    This study was conducted to assess the representativeness of laboratory sampling protocols for purposes of trace metal analysis in soil. Five laboratory protocols were compared, including conventional grab sampling, to assess the influence of sectorial splitting, sieving, and grinding on measured trace metal concentrations and their variability. It was concluded that grinding was the most important factor in controlling the variability of trace metal concentrations. Grinding increased the reproducibility of sample mass reduction by rotary sectorial splitting by up to two orders of magnitude. Combined with rotary sectorial splitting, grinding increased the reproducibility of trace metal concentrations by almost three orders of magnitude compared to grab sampling. Moreover, results showed that if grinding is used as part of a mass reduction protocol by sectorial splitting, the effect of sieving on reproducibility became insignificant. Gy's sampling theory and practice was also used to analyze the aforementioned sampling protocols. While the theoretical relative variances calculated for each sampling protocol qualitatively agreed with the experimental variances, their quantitative agreement was very poor. It was assumed that the parameters used in the calculation of theoretical sampling variances may not correctly estimate the constitutional heterogeneity of soils or soil-like materials. Finally, the results have highlighted the pitfalls of grab sampling, namely, the fact that it does not exert control over incorrect sampling errors and that it is strongly affected by distribution heterogeneity.

  18. Atmospheric pressure microwave sample preparation procedure for the combined analysis of total phosphorus and kjeldahl nitrogen.

    PubMed

    Collins, L W; Chalk, S J; Kingston, H M

    1996-08-01

    An atmospheric pressure microwave digestion method has been developed for the combined analysis of total phosphorus and Kjeldahl nitrogen in complex matrices. In comparison to the digestion steps in EPA Methods 365.4 (total phosphorus) and 351.x (Kjeldahl nitrogen), this method requires less time, eliminates the need for a catalyst, and reduces the toxicity of the waste significantly. It employs a microwave-assisted digestion step, using refluxing borosilicate glass vessels at atmospheric pressure. Traditionally, these analyses have required a time-consuming sample preparation step and have generated toxic waste through the use of heavy metal catalysts. These advantages are gained by the combination of a high boiling point acid (sulfuric acid) and the application of focused microwave irradiation, which enhances the digestion process by direct energy coupling. NIST standard reference materials 1572 (citrus leaves), 1577a (bovine liver), and 1566 (oyster tissue) and tryptophan were analyzed to validate the method. Phosphorus concentrations were determined by the colorimetric ascorbic acid method outlined in EPA Method 365.3. Kjeldahl nitrogen concentrations were determined using EPA Method 351.1. The results of the analyses showed good precision and are in excellent agreement with the NIST published values for both elements.

  19. What makes extinction work: an analysis of procedural form and function.

    PubMed

    Iwata, B A; Pace, G M; Cowdery, G E; Miltenberger, R G

    1994-01-01

    We examined methods for determining how extinction should be applied to different functions of self-injurious behavior (SIB). Assessment data indicated that the head banging of 3 children with developmental disabilities was maintained by different reinforcement contingencies: One subject's SIB was positively reinforced by attention from adults, the 2nd subject's SIB was negatively reinforced by escape from educational tasks, and the 3rd subject's SIB appeared to be automatically reinforced or "self-stimulatory" in nature. Three functional variations of extinction--EXT (attention), EXT (escape), and EXT (sensory)--were evaluated, and each subject was exposed to at least two of these variations in reversal or multiple baseline designs. Reductions in SIB were observed only when implementation of "extinction" involved the discontinuation of reinforcement previously shown to be responsible for maintaining the behavior. These results highlight important differences among treatment techniques based on the same behavioral principle (extinction) when applied to topographically similar but functionally dissimilar responses, and further illustrate the practical implications of a functional analysis of behavior disorders for designing, selecting, and classifying therapeutic interventions.

  20. An alternative procedure for uranium analysis in drinking water using AQUALIX columns: application to varied French bottled waters.

    PubMed

    Bouvier-Capely, C; Bonthonneau, J P; Dadache, E; Rebière, F

    2014-01-01

    The general population is chronically exposed to uranium ((234)U, (235)U, and (238)U) and polonium ((210)Po) mainly through day-to-day food and beverage intake. The measurement of these naturally-occurring radionuclides in drinking water is important to assess their health impact. In this work the applicability of calix[6]arene-derivatives columns for uranium analysis in drinking water was investigated. A simple and effective method was proposed on a specific column called AQUALIX, for the separation and preconcentration of U from drinking water. This procedure is suitable for routine analysis and the analysis time is considerably shortened (around 4h) by combining the separation on AQUALIX with fast ICP-MS measurement. This new method was tested on different French bottled waters (still mineral water, sparkling mineral water, and spring water). Then, the case of simultaneous presence of uranium and polonium in water was considered due to interferences in alpha spectrometry measurement. A protocol was proposed using a first usual step of spontaneous deposition of polonium on silver disc in order to separate Po, followed by the uranium extraction on AQUALIX column before alpha spectrometry counting.