Sample records for performance standard steps

  1. Standardized Six-Step Approach to the Performance of the Focused Basic Obstetric Ultrasound Examination.

    PubMed

    Abuhamad, Alfred; Zhao, Yili; Abuhamad, Sharon; Sinkovskaya, Elena; Rao, Rashmi; Kanaan, Camille; Platt, Lawrence

    2016-01-01

    This study aims to validate the feasibility and accuracy of a new standardized six-step approach to the performance of the focused basic obstetric ultrasound examination, and compare the new approach to the regular approach performed in the scheduled obstetric ultrasound examination. A new standardized six-step approach to the performance of the focused basic obstetric ultrasound examination, to evaluate fetal presentation, fetal cardiac activity, presence of multiple pregnancy, placental localization, amniotic fluid volume evaluation, and biometric measurements, was prospectively performed on 100 pregnant women between 18(+0) and 27(+6) weeks of gestation and another 100 pregnant women between 28(+0) and 36(+6) weeks of gestation. The agreement of findings for each of the six steps of the standardized six-step approach was evaluated against the regular approach. In all ultrasound examinations performed, substantial to perfect agreement (Kappa value between 0.64 and 1.00) was observed between the new standardized six-step approach and the regular approach. The new standardized six-step approach to the focused basic obstetric ultrasound examination can be performed successfully and accurately between 18(+0) and 36(+6) weeks of gestation. This standardized approach can be of significant benefit to limited resource settings and in point of care obstetric ultrasound applications.

  2. Standardizing bimanual vaginal examination using cognitive task analysis.

    PubMed

    Plumptre, Isabella; Mulki, Omar; Granados, Alejandro; Gayle, Claudine; Ahmed, Shahla; Low-Beer, Naomi; Higham, Jenny; Bello, Fernando

    2017-10-01

    To create a standardized universal list of procedural steps for bimanual vaginal examination (BVE) for teaching, assessment, and simulator development. This observational study, conducted from June-July 2012 and July-December 2014, collected video data of 10 expert clinicians performing BVE in a nonclinical environment. Video data were analyzed to produce a cognitive task analysis (CTA) of the examination steps performed. The CTA was further refined through structured interviews to make it suitable for teaching or assessment. It was validated through its use as a procedural examination checklist to rate expert clinician performance. BVE was deconstructed into 88 detailed steps outlining the complete examination process. These initial 88 steps were reduced to 35 by focusing on the unseen internal examination, then further refined through interviews with five experts into 30 essential procedural steps, five of which are additional steps if pathology is suspected. Using the CTA as a procedural checklist, the mean number of steps performed and/or verbalized was 21.6 ± 3.12 (72% ± 10.4%; range, 15.9-27.9, 53%-93%). This approach identified 30 essential steps for performing BVE, producing a new technique and standardized tool for teaching, assessment, and simulator development. © 2017 International Federation of Gynecology and Obstetrics.

  3. Measurement of intrahepatic pressure during radiofrequency ablation in porcine liver.

    PubMed

    Kawamoto, Chiaki; Yamauchi, Atsushi; Baba, Yoko; Kaneko, Keiko; Yakabi, Koji

    2010-04-01

    To identify the most effective procedures to avoid increased intrahepatic pressure during radiofrequency ablation, we evaluated different ablation methods. Laparotomy was performed in 19 pigs. Intrahepatic pressure was monitored using an invasive blood pressure monitor. Radiofrequency ablation was performed as follows: single-step standard ablation; single-step at 30 W; single-step at 70 W; 4-step at 30 W; 8-step at 30 W; 8-step at 70 W; and cooled-tip. The array was fully deployed in single-step methods. In the multi-step methods, the array was gradually deployed in four or eight steps. With the cooled-tip, ablation was performed by increasing output by 10 W/min, starting at 40 W. Intrahepatic pressure was as follows: single-step standard ablation, 154.5 +/- 30.9 mmHg; single-step at 30 W, 34.2 +/- 20.0 mmHg; single-step at 70 W, 46.7 +/- 24.3 mmHg; 4-step at 30 W, 42.3 +/- 17.9 mmHg; 8-step at 30 W, 24.1 +/- 18.2 mmHg; 8-step at 70 W, 47.5 +/- 31.5 mmHg; and cooled-tip, 114.5 +/- 16.6 mmHg. The radiofrequency ablation-induced area was spherical with single-step standard ablation, 4-step at 30 W, and 8-step at 30 W. Conversely, the ablated area was irregular with single-step at 30 W, single-step at 70 W, and 8-step at 70 W. The ablation time was significantly shorter for the multi-step method than for the single-step method. Increased intrahepatic pressure could be controlled using multi-step methods. From the shapes of the ablation area, 30-W 8-step expansions appear to be most suitable for radiofrequency ablation.

  4. Iranian Special Library Standards.

    ERIC Educational Resources Information Center

    Harvey, John F.

    The purposes of these standards are: (1) to suggest the steps necessary for the establishment of special libraries; (2) for those recently established, to suggest the steps necessary to achieve satisfactory performance levels in all areas; and finally, (3) for well-established libraries, to suggest the steps necessary to achieve excellence. These…

  5. Do medical student stress, health, or quality of life foretell step 1 scores? A comparison of students in traditional and revised preclinical curricula.

    PubMed

    Tucker, Phebe; Jeon-Slaughter, Haekyung; Sener, Ugur; Arvidson, Megan; Khalafian, Andrey

    2015-01-01

    We explored the theory that measures of medical students' well-being and stress from different types of preclinical curricula are linked with performance on standardized assessment. We hypothesized that self-reported stress and quality of life among sophomore medical students having different types of preclinical curricula would vary in their relationships to USMLE Step 1 scores. Voluntary surveys in 2010 and 2011 compared self-reported stress, physical and mental health, and quality of life with Step 1 scores for beginning sophomore students in the final year of a traditional, discipline-based curriculum and the 1st year of a revised, systems-based curriculum with a changed grading system. Wilcoxon rank sum tests and Spearman rank correlations were used to analyze data, with significance set at p < .05. New curriculum students reported worse physical health, subjective feelings, leisure activities, social relationships and morale, and more depressive symptoms and life stress than traditional curriculum students. However, among curriculum-related stressors, few differences emerged; revised curriculum sophomores reported less stress working with real and standardized patients than traditional students. There were no class differences in respondents' Step 1 scores. Among emotional and physical health measures, only feelings of morale correlated negatively with Step 1 performance. Revised curriculum students' Step 1 scores correlated negatively with stress from difficulty of coursework. Although revised curriculum students reported worse quality of life, general stress, and health and less stress from patient interactions than traditional students, few measures were associated with performance differences on Step 1. Moreover, curriculum type did not appear to either hinder or help students' Step 1 performance. To identify and help students at risk for academic problems, future assessments of correlates of Step 1 performance should be repeated after the new curriculum is well established, relating them also to performance on other standardized assessments of communication skills, professionalism, and later clinical evaluations in clerkships or internships.

  6. Improving particle filters in rainfall-runoff models: application of the resample-move step and development of the ensemble Gaussian particle filter

    NASA Astrophysics Data System (ADS)

    Plaza Guingla, D. A.; Pauwels, V. R.; De Lannoy, G. J.; Matgen, P.; Giustarini, L.; De Keyser, R.

    2012-12-01

    The objective of this work is to analyze the improvement in the performance of the particle filter by including a resample-move step or by using a modified Gaussian particle filter. Specifically, the standard particle filter structure is altered by the inclusion of the Markov chain Monte Carlo move step. The second choice adopted in this study uses the moments of an ensemble Kalman filter analysis to define the importance density function within the Gaussian particle filter structure. Both variants of the standard particle filter are used in the assimilation of densely sampled discharge records into a conceptual rainfall-runoff model. In order to quantify the obtained improvement, discharge root mean square errors are compared for different particle filters, as well as for the ensemble Kalman filter. First, a synthetic experiment is carried out. The results indicate that the performance of the standard particle filter can be improved by the inclusion of the resample-move step, but its effectiveness is limited to situations with limited particle impoverishment. The results also show that the modified Gaussian particle filter outperforms the rest of the filters. Second, a real experiment is carried out in order to validate the findings from the synthetic experiment. The addition of the resample-move step does not show a considerable improvement due to performance limitations in the standard particle filter with real data. On the other hand, when an optimal importance density function is used in the Gaussian particle filter, the results show a considerably improved performance of the particle filter.
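
    The resample-move idea described above is generic enough to sketch outside the rainfall-runoff context. The following Python fragment is a minimal illustration, not the authors' implementation: it performs one analysis cycle of a bootstrap particle filter and then applies a few Metropolis "move" steps whose target combines the observation likelihood with a Gaussian fit of the forecast ensemble (a common approximation of the forecast prior). The state model, observation value, and noise levels are all assumed.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def log_norm(x, mu, sigma):
        # unnormalized Gaussian log-density (constants dropped)
        return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)

    def resample_move(forecast, y, sigma_obs=0.3, n_moves=3, step=0.2):
        """One analysis step of a bootstrap particle filter with an MCMC 'move'
        after resampling, to mitigate particle impoverishment."""
        n = forecast.size
        mu_f, sd_f = forecast.mean(), forecast.std(ddof=1)

        # importance weights from the observation likelihood
        logw = log_norm(y, forecast, sigma_obs)
        w = np.exp(logw - logw.max()); w /= w.sum()

        # systematic resampling
        pos = (rng.random() + np.arange(n)) / n
        idx = np.minimum(np.searchsorted(np.cumsum(w), pos), n - 1)
        particles = forecast[idx]

        # Metropolis moves targeting likelihood x Gaussian-approximated forecast prior
        def log_target(x):
            return log_norm(y, x, sigma_obs) + log_norm(x, mu_f, sd_f)
        for _ in range(n_moves):
            prop = particles + rng.normal(0.0, step, n)
            accept = np.log(rng.random(n)) < (log_target(prop) - log_target(particles))
            particles = np.where(accept, prop, particles)
        return particles

    # toy usage: forecast ensemble from an assumed AR(1) state model, one observation
    state = rng.normal(0.0, 1.0, 1000)
    forecast = 0.9 * state + rng.normal(0.0, 0.5, 1000)
    analysis = resample_move(forecast, y=1.2)
    print(round(float(analysis.mean()), 3))
    ```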

  7. A Seven-Step Process To Align Curriculum with Oregon State Content Standards.

    ERIC Educational Resources Information Center

    Golden, Nancy; Lane, Marilyn

    1998-01-01

    The University of Oregon (UO) and Captain Robert Gray Elementary School formed a partnership where UO students used the elementary school as a case study for curriculum research. This document gives an overview of the 7-step process the students used to align the school's curriculum with Oregon's content and performance standards. The text opens…

  8. New FASB standard addresses revenue recognition considerations.

    PubMed

    McKee, Thomas E

    2015-12-01

    Healthcare organizations are expected to apply the following steps in revenue recognition under the new standard issued in May 2014 by the Financial Accounting Standards Board: Identify the customer contract. Identify the performance obligations in the contract. Determine the transaction price. Allocate the transaction price to the performance obligations in the contract. Recognize revenue when--or in some circumstances, as--the entity satisfies the performance obligation.

  9. A Delphi Consensus of the Crucial Steps in Gastric Bypass and Sleeve Gastrectomy Procedures in the Netherlands.

    PubMed

    Kaijser, Mirjam A; van Ramshorst, Gabrielle H; Emous, Marloes; Veeger, Nic J G M; van Wagensveld, Bart A; Pierie, Jean-Pierre E N

    2018-04-09

    Bariatric procedures are technically complex and skill demanding. In order to standardize the procedures for research and training, a Delphi analysis was performed to reach consensus on the practice of the laparoscopic gastric bypass and sleeve gastrectomy in the Netherlands. After a pre-round identifying all possible steps from literature and expert opinion within our study group, questionnaires were sent to 68 registered Dutch bariatric surgeons, with 73 steps for bypass surgery and 51 steps for sleeve gastrectomy. Statistical analysis was performed to identify steps with and without consensus. This process was repeated to reach consensus on all necessary steps. Thirty-eight participants (56%) responded in the first round and 32 participants (47%) in the second round. After the first Delphi round, 19 steps for gastric bypass (26%) and 14 for sleeve gastrectomy (27%) gained full consensus. After the second round, an additional 10 and 12 sub-steps, respectively, were confirmed as key steps. Thirteen steps in the gastric bypass and seven in the gastric sleeve were deemed advisable. Our expert panel showed a high level of consensus, expressed in a Cronbach's alpha of 0.82 for the gastric bypass and 0.87 for the sleeve gastrectomy. The Delphi consensus defined 29 steps for gastric bypass and 26 for sleeve gastrectomy as being crucial for correct performance of these procedures to the standards of our expert panel. These results offer a clear framework for the technical execution of these procedures.
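
    For readers unfamiliar with the agreement statistic quoted above, Cronbach's alpha can be computed from a raters-by-items score matrix as k/(k-1) * (1 - sum of item variances / variance of the total score). The snippet below is a generic illustration with invented Likert responses, not the study's data.

    ```python
    import numpy as np

    def cronbach_alpha(ratings):
        """Cronbach's alpha for an (n_raters, n_items) array of scores:
        alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
        ratings = np.asarray(ratings, dtype=float)
        k = ratings.shape[1]
        item_var = ratings.var(axis=0, ddof=1).sum()
        total_var = ratings.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    # hypothetical Likert responses from 5 panel members on 4 questionnaire items
    print(round(cronbach_alpha([[5, 4, 5, 4],
                                [4, 4, 5, 5],
                                [3, 3, 4, 3],
                                [5, 5, 5, 4],
                                [4, 3, 4, 4]]), 2))
    ```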

  10. A mixed-methods research approach to the review of competency standards for orthotist/prosthetists in Australia.

    PubMed

    Ash, Susan; O'Connor, Jackie; Anderson, Sarah; Ridgewell, Emily; Clarke, Leigh

    2015-06-01

    The requirement for an allied health workforce is expanding as the global burden of disease increases internationally. To safely meet the demand for an expanded workforce of orthotist/prosthetists in Australia, competency-based standards, which are up-to-date and evidence-based, are required. The aims of this study were to determine the minimum level for entry into the orthotic/prosthetic profession; to develop entry-level competency standards for the profession; and to validate the developed entry-level competency standards within the profession nationally, using an evidence-based approach. A mixed-methods research design was applied, using a three-step sequential exploratory design, where step 1 involved collecting and analyzing qualitative data from two focus groups; step 2 involved exploratory instrument development and testing, developing the draft competency standards; and step 3 involved quantitative data collection and analysis - a Delphi survey. In stage 1 (steps 1 and 2), the two focus groups - an expert and a recent graduate group of Australian orthotist/prosthetists - were led by an experienced facilitator, to identify gaps in the current competency standards and then to outline a key purpose, and work roles and tasks for the profession. The resulting domains and activities of the first draft of the competency standards were synthesized using thematic analysis. In stage 2 (step 3), the draft competency standards were circulated to a purposive sample of the membership of the Australian Orthotic Prosthetic Association, using three rounds of a Delphi survey. A project reference group of orthotist/prosthetists reviewed the results of both stages. In stage 1, the expert (n = 10) and the new graduate (n = 8) groups separately identified work roles and tasks, which formed the initial draft of the competency standards. Further drafts were refined and performance criteria added by the project reference group, resulting in the final draft competency standards. In stage 2, the final draft competency standards were circulated to 56 members (n = 44 in the final round) of the Association, who agreed on the key purpose, 6 domains, 18 activities, and 68 performance criteria of the final competency standards. This study outlines a rigorous and evidence-based mixed-methods approach for developing and endorsing professional competency standards, which is representative of the views of the profession of orthotist/prosthetists.

  11. Development, validation and operating room-transfer of a six-step laparoscopic training program for the vesicourethral anastomosis.

    PubMed

    Klein, Jan; Teber, Dogu; Frede, Tom; Stock, Christian; Hruza, Marcel; Gözen, Ali; Seemann, Othmar; Schulze, Michael; Rassweiler, Jens

    2013-03-01

    Development and full validation of a laparoscopic training program for stepwise learning of a reproducible application of a standardized laparoscopic anastomosis technique, and its integration into the clinical course. The training of vesicourethral anastomosis (VUA) was divided into six simple standardized steps. To establish the objective criteria, four experienced surgeons performed the stepwise training protocol. Thirty-eight participants with no previous laparoscopic experience were assessed on their training performance. The times needed to complete each training step and the total training time were recorded. The integration into the clinical course was investigated. The training results and the corresponding steps during laparoscopic radical prostatectomy (LRP) were analyzed. Data analysis of corresponding operating room (OR) sections of 793 LRPs was performed. Based on these data, validity criteria were determined. In the laboratory section, a significant reduction of time for every step was seen in all participants: coordination, 62%; longitudinal incision, 52%; inverted U-shape incision, 43%; plexus, 47%; anastomosis catheter model, 38%; VUA, 38%. The laboratory section required a total time of 29 hours (minimum: 16 hours; maximum: 42 hours). All participants had shorter execution times in the laboratory than under real conditions. The best match was found within the VUA model. To perform an anastomosis under real conditions, 25% more time was needed. By using the training protocol, the performance of the VUA is comparable to that of a surgeon with experience of about 50 laparoscopic VUAs. Data analysis proved content, construct, and prognostic validity. The use of stepwise training approaches enables a surgeon to learn and reproduce complex reconstructive surgical tasks, e.g., the VUA, in a safe environment. The validity of the designed system was demonstrated at all levels, and the system should be used as a standard in clinical surgical training in laparoscopic reconstructive urology.

  12. 21 CFR 680.1 - Allergenic Products.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...) Mold manufacturers shall maintain written standard operating procedures, developed by a qualified... representative species of mold subject to the standard operating procedures. The tests shall be performed at each manufacturing step during and subsequent to harvest, as specified in the standard operating procedures. Before...

  13. 21 CFR 680.1 - Allergenic Products.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...) Mold manufacturers shall maintain written standard operating procedures, developed by a qualified... representative species of mold subject to the standard operating procedures. The tests shall be performed at each manufacturing step during and subsequent to harvest, as specified in the standard operating procedures. Before...

  14. 21 CFR 680.1 - Allergenic Products.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...) Mold manufacturers shall maintain written standard operating procedures, developed by a qualified... representative species of mold subject to the standard operating procedures. The tests shall be performed at each manufacturing step during and subsequent to harvest, as specified in the standard operating procedures. Before...

  15. 21 CFR 680.1 - Allergenic Products.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...) Mold manufacturers shall maintain written standard operating procedures, developed by a qualified... representative species of mold subject to the standard operating procedures. The tests shall be performed at each manufacturing step during and subsequent to harvest, as specified in the standard operating procedures. Before...

  16. A Step Towards Electric Propulsion Testing Standards: Pressure Measurements and Effective Pumping Speeds

    NASA Technical Reports Server (NTRS)

    Dankanich, John W.; Swiatek, Michael W.; Yim, John T.

    2012-01-01

    The electric propulsion community has been implored to establish and implement a set of universally applicable test standards during the research, development, and qualification of electric propulsion systems. Existing practices are fallible and result in testing variations that lead to suspicious results, large margins in application, or aversion to mission infusion. Performance measurements and life testing under appropriate conditions can be costly and lengthy. Measurement practices must be consistent, accurate, and repeatable. Additionally, the measurements must be universally transportable across facilities throughout development, qualification, spacecraft integration, and on-orbit performance. A preliminary step towards universally applicable testing standards is outlined for facility pressure measurements and effective pumping speed calculations. The standard has been applied to multiple facilities at the NASA Glenn Research Center. Test results and analyses of the universality of measurements are presented herein.
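
    The record does not reproduce the formulas of the proposed standard, but the steady-state relation commonly used for effective pumping speed is S_eff = Q / (P_ss - P_base), with Q the injected gas throughput, P_ss the steady-state chamber pressure, and P_base the no-flow base pressure. The sketch below illustrates that relation with purely hypothetical numbers.

    ```python
    def effective_pumping_speed(q_throughput_pa_l_s, p_steady_pa, p_base_pa):
        """Effective pumping speed S_eff = Q / (P_ss - P_base).

        q_throughput_pa_l_s : gas throughput into the chamber [Pa*L/s]
        p_steady_pa         : steady-state chamber pressure with flow on [Pa]
        p_base_pa           : base pressure with flow off [Pa]
        Returns S_eff in L/s.
        """
        return q_throughput_pa_l_s / (p_steady_pa - p_base_pa)

    # hypothetical numbers for illustration only
    print(effective_pumping_speed(q_throughput_pa_l_s=1.7,
                                  p_steady_pa=4.0e-3,
                                  p_base_pa=2.0e-5))   # ~427 L/s
    ```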

  17. Acoustic and Perceptual Measures of SATB Choir Performances on Two Types of Portable Choral Riser Units in Three Singer-Spacing Conditions

    ERIC Educational Resources Information Center

    Daugherty, James F.; Manternach, Jeremy N.; Brunkan, Melissa C.

    2013-01-01

    Under controlled conditions, we assessed acoustically (long-term average spectra) and perceptually (singer survey, listener survey) six performances of a soprano, alto, tenor, and bass (SATB) choir ("N" = 27) as it sang the same musical excerpt on two portable riser units (standard riser step height, taller riser step height) with…

  18. Six steps to an effective denials management program.

    PubMed

    Robertson, Brian; Doré, Alexander

    2005-09-01

    The following six steps can help you manage denials in your organization: Create standard definitions of denial types. Establish a denial hierarchy. Establish a centralized denial database. Develop key performance indicators. Build responsibility matrices. Measure, monitor, and take action.

  19. Interferometric step gauge for CMM verification

    NASA Astrophysics Data System (ADS)

    Hemming, B.; Esala, V.-P.; Laukkanen, P.; Rantanen, A.; Viitala, R.; Widmaier, T.; Kuosmanen, P.; Lassila, A.

    2018-07-01

    The verification of the measurement capability of coordinate measuring machines (CMMs) is usually performed using gauge blocks or step gauges as reference standards. Gauge blocks and step gauges are robust and easy to use, but have some limitations, such as finite lengths and uncertainty of thermal expansion. This paper describes the development, testing and uncertainty evaluation of an interferometric step gauge (ISG) for CMM verification. The idea of the ISG is to move a carriage bearing a gauge block along a rail and to measure the position with an interferometer. For a displacement of 1 m, the standard uncertainty of the position of the gauge block is 0.2 µm. Short-range periodic errors of a CMM can also be detected.

  20. Tests of stepping as indicators of mobility, balance, and fall risk in balance-impaired older adults.

    PubMed

    Cho, Be-long; Scarpace, Diane; Alexander, Neil B

    2004-07-01

    To determine the relationships between two tests of stepping ability (the maximal step length (MSL) and rapid step test (RST)) and standard tests of standing balance, gait, mobility, and functional impairment in a group of at-risk older adults. Cross-sectional study. University-based laboratory. One hundred sixty-seven mildly balance-impaired older adults recruited for a balance-training and fall-reduction program (mean age 78, range 65-90). Measures of stepping maximally (MSL, the ability to maximally step out and return to the initial position) and rapidly (RST, the time taken to step out and return in multiple directions as fast as possible); standard measures of balance, gait, and mobility including timed tandem stance (TS), tandem walk (TW, both timing and errors), timed unipedal stance (US), timed up and go (TUG), performance oriented mobility assessment (POMA), and 6-minute walk (SMW); measures of leg strength (peak knee and ankle torque and power at slow and fast speeds); self-report measures of frequent falls (>2 per 12 months), disability (Established Population for Epidemiologic Studies of the Elderly (EPESE) physical function), and confidence to avoid falls (Activity-specific Balance Confidence (ABC) Scale). Spearman and Pearson correlation, intraclass correlation coefficient, logistic regression, and linear regression were used for data analysis. MSL consistently predicted a number of self-report and performance measures at least as well as other standard balance measures. MSL correlations with EPESE physical function, ABC, TUG, and POMA scores; SMW; and peak maximum knee and ankle torque and power were at least as high as those correlations seen with TS, TW, or US. MSL score was associated with the risk of being a frequent faller. In addition, the six MSL directions were highly correlated (up to 0.96), and any one of the leg directions yielded similar relationships with functional measures and a history of falls. Relationships between RST and these measures were relatively modest. MSL is as good a predictor of mobility performance, frequent falls, self-reported function, and balance confidence as standard stance tests such as US. MSL simplified to one direction may be a useful clinical indicator of mobility, balance, and fall risk in older adults.

  1. Does the NBME Surgery Shelf exam constitute a "double jeopardy" of USMLE Step 1 performance?

    PubMed

    Ryan, Michael S; Colbert-Getz, Jorie M; Glenn, Salem N; Browning, Joel D; Anand, Rahul J

    2017-02-01

    Scores from the NBME Subject Examination in Surgery (Surgery Shelf) positively correlate with United States Medical Licensing Examination Step 1 (Step 1). Based on this relationship, the authors evaluated the predictive value of Step 1 on the Surgery Shelf. Step 1 standard scores were substituted for Surgery Shelf standard scores for 395 students in 2012-2014 at one medical school. Linear regression was used to determine how well Step 1 scores predicted Surgery Shelf scores. The percent match between original (with Shelf) and modified (with Step 1) clerkship grades was computed. Step 1 scores significantly predicted Surgery Shelf scores, R² = 0.42, P < 0.001. For every point increase in Step 1, the Surgery Shelf score increased by 0.30 points. Seventy-seven percent of original grades matched the modified grades. Replacing Surgery Shelf scores with Step 1 scores did not have an effect on the majority of final clerkship grades. This observation raises concern over the use of Surgery Shelf scores as a measure of knowledge obtained during the Surgery clerkship. Copyright © 2016 Elsevier Inc. All rights reserved.
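
    The regression analysis summarized above (Shelf scores regressed on Step 1, reporting R² and slope) can be reproduced in outline as follows; the data here are synthetic stand-ins generated with an assumed slope of 0.30, not the study's scores.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # synthetic stand-ins for the scores analyzed in the study (not the real data)
    step1 = rng.normal(230, 18, 395)
    shelf = 8 + 0.30 * step1 + rng.normal(0, 6, 395)   # assumed slope of ~0.30

    res = stats.linregress(step1, shelf)
    print(f"R^2 = {res.rvalue**2:.2f}, slope = {res.slope:.2f}, p = {res.pvalue:.3g}")

    # predicted Shelf score for a hypothetical Step 1 score of 240
    print(res.intercept + res.slope * 240)
    ```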

  2. ASRM standard embryo transfer protocol template: a committee opinion.

    PubMed

    Penzias, Alan; Bendikson, Kristin; Butts, Samantha; Coutifaris, Christos; Falcone, Tommaso; Fossum, Gregory; Gitlin, Susan; Gracia, Clarisa; Hansen, Karl; Mersereau, Jennifer; Odem, Randall; Rebar, Robert; Reindollar, Richard; Rosen, Mitchell; Sandlow, Jay; Vernon, Michael

    2017-04-01

    Standardization improves performance and safety. A template for standardizing the embryo transfer procedure is presented here with 12 basic steps supported by published scientific literature and a survey of common practice of SART programs; it can be used by ART practices to model their own standard protocol. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  3. Performance in physical examination on the USMLE Step 2 Clinical Skills examination.

    PubMed

    Peitzman, Steven J; Cuddy, Monica M

    2015-02-01

    To provide descriptive information about history-taking (HX) and physical examination (PE) performance for U.S. medical students as documented by standardized patients (SPs) during the Step 2 Clinical Skills (CS) component of the United States Medical Licensing Examination. The authors examined two hypotheses: (1) Students perform worse in PE compared with HX, and (2) for PE, students perform worse in the musculoskeletal system and neurology compared with other clinical domains. The sample included 121,767 student-SP encounters based on 29,442 examinees from U.S. medical schools who took Step 2 CS for the first time in 2011. The encounters comprised 107 clinical presentations, each categorized into one of five clinical domains: cardiovascular, gastrointestinal, musculoskeletal, neurological, and respiratory. The authors compared mean percent-correct scores for HX and PE via a one-tailed paired-samples t test and examined mean score differences by clinical domain using analysis of variance techniques. Average PE scores (59.6%) were significantly lower than average HX scores (78.1%). The range of scores for PE (51.4%-72.7%) was larger than for HX (74.4%-81.0%), and the standard deviation for PE scores (28.3) was twice as large as the HX standard deviation (14.7). PE performance was significantly weaker for musculoskeletal and neurological encounters compared with other encounters. U.S. medical students perform worse on PE than HX; PE performance was weakest in musculoskeletal and neurology clinical domains. Findings may reflect imbalances in U.S. medical education, but more research is needed to fully understand the relationships among PE instruction, assessment, and proficiency.

  4. Results of a prospective clinical study on the diagnostic performance of standard magnetic resonance imaging in comparison to a combination of 3T MRI and additional CT imaging in Kienböck's disease.

    PubMed

    Stahl, Stephane; Hentschel, Pascal; Ketelsen, Dominik; Grosse, Ulrich; Held, Manuel; Wahler, Theodora; Syha, Roland; Schaller, Hans-Eberhard; Nikolaou, Konstantin; Grözinger, Gerd

    2017-05-01

    This prospective clinical study examined standard wrist magnetic resonance imaging (MRI) examinations and the incremental value of computed tomography (CT) in the diagnosis of Kienböck's disease (KD) with regard to reliability and precision in the different diagnostic steps during diagnostic work-up. Sixty-four consecutive patients referred between January 2009 and January 2014 with positive initial suspicion of KD according to external standard wrist MRI were prospectively included (step one). Institutional review board approval was obtained. Clinical examination by two hand surgeons was followed by wrist radiographs (step two), ultrathin-section CT, and 3T contrast-enhanced MRI (step three). The final diagnosis was established in a consensus conference involving all examiners and all examination results available from step three. In 12/64 patients, the initial suspicion was discarded at step two, and in 34/64 patients the initial suspicion of KD was finally discarded at step three. The final positive predictive value of the external MRI was 47%. The most common differential diagnoses at step three were intraosseous cysts (n=15), lunate pseudarthrosis (n=13), and ulnar impaction syndrome (n=5). Correlating radiograph-based diagnoses (step two) with the final diagnosis (step three) showed that initial suspicion of stage I KD had the lowest sensitivity for correct diagnosis (2/11). Technical factors associated with a false positive external MRI KD diagnosis were not found. Standard wrist MRI should be complemented with thin-section CT, and interdisciplinary interpretation of images and clinical data, to increase diagnostic accuracy in patients with suspected KD. Copyright © 2017. Published by Elsevier B.V.

  5. Step-by-Step Heating of Dye Solution for Efficient Solar Energy Harvesting in Dye-Sensitized Solar Cells

    NASA Astrophysics Data System (ADS)

    Shah, Syed Afaq Ali; Sayyad, Muhammad Hassan; Abdulkarim, Salem; Qiao, Qiquan

    2018-05-01

    A step-by-step heat treatment was applied to ruthenium-based N719 dye solution for its potential application in dye-sensitized solar cells (DSSCs). The effects were analyzed and compared with standard untreated devices. A significant increase in short-circuit current density (Jsc) was observed by employing the step-by-step heating method for the dye solution in DSSCs. This increase in Jsc is attributed to the enhancement in dye adsorption by the surface of the semiconductor and the higher number of charge carriers generated. DSSCs fabricated with the heated dye solution achieved an overall power conversion efficiency of 8.41%, which is significantly higher than the efficiency of 7.31% achieved with DSSCs fabricated without heated dye. Electrochemical impedance spectroscopy and capacitance-voltage studies were performed to understand the better performance of the device fabricated with heated dye. Furthermore, transient photocurrent and transient photovoltage measurements were also performed to gain insight into interfacial charge carrier recombination.
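
    The efficiencies quoted above follow from the usual definition of power conversion efficiency, PCE = (Jsc x Voc x FF) / Pin. The snippet below applies that definition with hypothetical cell parameters; the one-sun input power of 100 mW/cm² is the customary test condition and is not stated explicitly in the record.

    ```python
    def power_conversion_efficiency(j_sc_ma_cm2, v_oc, fill_factor, p_in_mw_cm2=100.0):
        """PCE (%) = (J_sc [mA/cm^2] * V_oc [V] * FF) / P_in [mW/cm^2] * 100,
        which simplifies to the expression below when P_in = 100 mW/cm^2 (AM1.5G)."""
        return j_sc_ma_cm2 * v_oc * fill_factor / p_in_mw_cm2 * 100.0

    # hypothetical cell parameters giving roughly the efficiency reported above
    print(power_conversion_efficiency(j_sc_ma_cm2=16.5, v_oc=0.72, fill_factor=0.71))  # ~8.4
    ```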

  6. Evaluation of a newly developed media-supported 4-step approach for basic life support training

    PubMed Central

    2012-01-01

    Objective: The quality of external chest compressions (ECC) is of primary importance within basic life support (BLS). Recent guidelines delineate the so-called “4-step approach” for teaching practical skills within resuscitation training guided by a certified instructor. The objective of this study was to evaluate whether a “media-supported 4-step approach” for BLS training leads to equal practical performance compared to the standard 4-step approach. Materials and methods: After baseline testing, 220 laypersons were either trained using the widely accepted method for resuscitation training (4-step approach) or using a newly created “media-supported 4-step approach”, both of equal duration. In this approach, steps 1 and 2 were delivered via a standardised, self-produced podcast, which included all of the information regarding the BLS algorithm and resuscitation skills. Participants were tested on manikins in the same mock cardiac arrest single-rescuer scenario prior to intervention, after one week and after six months with respect to ECC performance, and were surveyed about the approach. Results: Participants (age 23 ± 11 years, 69% female) reached comparable practical ECC performance in both groups, with no statistical difference. Even after six months, there was no difference detected in the quality of the initial assessment algorithm or in the delay concerning initiation of CPR. Overall, at least 99% of the intervention group (n = 99; mean 1.5 ± 0.8; 6-point Likert scale: 1 = completely agree, 6 = completely disagree) agreed that the video provided an adequate introduction to BLS skills. Conclusions: The “media-supported 4-step approach” leads to practical ECC performance comparable to that of standard teaching, even with respect to retention of skills. Therefore, this approach could be useful in special educational settings where, for example, instructors’ resources are sparse or large-group sessions have to be prepared. PMID:22647148

  7. Community-Wide Zero Energy Ready Home Standard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herk, A.; Beggs, T.

    This report outlines the steps a developer can use when looking to create and implement higher performance standards such as the U.S. Department of Energy (DOE) Zero Energy Ready Home (ZERH) standards in a community. The report also describes the specific examples of how this process was followed by a developer, Forest City, in the Stapleton community in Denver, Colorado. IBACOS described the steps used to begin to bring the DOE ZERH standard to the Forest City Stapleton community based on 15 years of community-scale development work done by IBACOS. As a result of this prior IBACOS work, the team gained an understanding of the various components that a master developer needs to consider and created strategies for incorporating those components in the initial phases of development to achieve higher performance buildings in the community. An automated scoring system can be used to perform an internal audit that provides a detailed and consistent evaluation of how several homes under construction or builders' floor plans compare with the requirements of the DOE Zero Energy Ready Home program. This audit can be performed multiple times at specific milestones during construction to allow the builder to make changes as needed throughout construction for the project to meet Zero Energy Ready Home standards. This scoring system also can be used to analyze a builder's current construction practices and design.

  8. Gait parameter and event estimation using smartphones.

    PubMed

    Pepa, Lucia; Verdini, Federica; Spalazzi, Luca

    2017-09-01

    The use of smartphones can greatly help with gait parameter estimation during daily living, but their accuracy needs a deeper evaluation against a gold standard. The objective of the paper is a step-by-step assessment of smartphone performance in heel strike, step count, step period, and step length estimation. The influence of smartphone placement and orientation on estimation performance is evaluated as well. This work relies on a smartphone app developed to acquire, process, and store inertial sensor data and rotation matrices about device position. Smartphone alignment was evaluated by expressing the acceleration vector in three reference frames. Two smartphone placements were tested. Three methods for heel strike detection were considered. On the basis of estimated heel strikes, step count is performed, step period is obtained, and the inverted pendulum model is applied for step length estimation. Pearson correlation coefficient, absolute and relative errors, ANOVA, and Bland-Altman limits of agreement were used to compare smartphone estimation with stereophotogrammetry on eleven healthy subjects. High correlations were found between smartphone and stereophotogrammetric measures: up to 0.93 for step count, 0.99 for heel strike, 0.96 for step period, and 0.92 for step length. Error ranges are comparable to those in the literature. Smartphone placement did not affect the performance. The acceleration reference frame and the heel strike detection method had their greatest influence on step count. This study provides detailed information about the expected accuracy when a smartphone is used as a gait monitoring tool. The obtained results encourage real life applications. Copyright © 2017 Elsevier B.V. All rights reserved.
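
    The inverted-pendulum step-length model mentioned above is commonly written as L = 2*sqrt(2*l*h - h²), where l is the pendulum (leg) length and h the vertical excursion of the centre of mass between heel strikes. A minimal sketch, with assumed values rather than the paper's data:

    ```python
    import numpy as np

    def step_length_inverted_pendulum(h_com, leg_length):
        """Step length from the inverted-pendulum gait model,
        L = 2 * sqrt(2*l*h - h^2). The vertical excursion h is typically obtained
        by double-integrating the vertical acceleration between heel strikes."""
        h = np.asarray(h_com, dtype=float)
        return 2.0 * np.sqrt(2.0 * leg_length * h - h**2)

    # hypothetical values: 3 cm COM excursion, 0.9 m leg length -> ~0.46 m step
    print(step_length_inverted_pendulum(0.03, 0.9))
    ```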

  9. Hypertension and orthostatic hypotension in applicants for spaceflight training and spacecrews: a review of medical standards.

    PubMed

    Fuchs, H S

    1983-01-01

    The inauguration by NASA of the position of Payload Specialist for SHUTTLE-SPACELAB flights has broken the tradition of restrictive medical physical standards in several ways: by reducing physical requirements and extensive training; by permitting the selection of older individuals and women; and by selecting individuals who may fly only one or several missions and do not spend an entire career in space activities. Experience with Payload Specialists, to be gained during the forthcoming SPACELAB missions by observing man in spaceflight step by step on an incremental basis, will provide valuable data for modifying the medical standards for Payload Specialists, Space Station Technicians, and Space Support Personnel who perform routine work rather than peculiar tasks. Such revisions necessarily include a modification of traditional blood pressure standards. In this paper I review the history and evolution of these standards in aeronautics and astronautics.

  10. Hypertension and orthostatic hypotension in applicants for spaceflight training and spacecrews: A review of medical standards

    NASA Astrophysics Data System (ADS)

    Fuchs, Heinz S.

    The inauguration by NASA of the position of Payload Specialist for SHUTTLE-SPACELAB flights has broken the tradition of restrictive medical physical standards in several ways: by reducing physical requirements and extensive training; by permitting the selection of older individuals and women; and by selecting individuals who may fly only one or several missions and do not spend an entire career in space activities. Experience with Payload Specialists, to be gained during the forthcoming SPACELAB missions by observing man in spaceflight step by step on an incremental basis, will provide valuable data for modifying the medical standards for Payload Specialists, Space Station Technicians, and Space Support Personnel who perform routine work rather than peculiar tasks. Such revisions necessarily include a modification of traditional blood pressure standards. In this paper I review the history and evolution of these standards in aeronautics and astronautics.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herk, A.; Beggs, T.

    This report outlines the steps a developer can take when creating and implementing high performance standards such as the U.S. Department of Energy’s (DOE’s) Zero Energy Ready Home (ZERH) standards on a community-wide scale. The report also describes the specific examples of how this process is underway in the Stapleton community in Denver, Colorado, by the developer Forest City.

  12. Step-height standards based on the rapid formation of monolayer steps on the surface of layered crystals

    NASA Astrophysics Data System (ADS)

    Komonov, A. I.; Prinz, V. Ya.; Seleznev, V. A.; Kokh, K. A.; Shlegel, V. N.

    2017-07-01

    Metrology is essential for nanotechnology, especially for structures and devices with feature sizes going down to nm. Scanning probe microscopes (SPMs) permit measurement of nanometer- and subnanometer-scale objects. The accuracy of size measurements performed using SPMs is largely defined by the accuracy of the calibration standards used. In the present publication, we demonstrate that monolayer-step height standards (∼1 and ∼0.6 nm) can be easily prepared by cleaving Bi2Se3 and ZnWO4 layered single crystals. It was shown that the conducting surface of Bi2Se3 crystals offers a height standard appropriate for calibrating STMs and for testing conductive SPM probes. Our AFM study of the morphology of freshly cleaved (0001) Bi2Se3 surfaces proved that such surfaces remained atomically smooth for a period of at least half a year. The (010) surfaces of ZnWO4 crystals remained atomically smooth for one day, but two days later an additional nanorelief of amplitude ∼0.3 nm appeared on those surfaces. This relief, however, did not grow further in height, and it did not hamper the calibration. Simplicity and the possibility of rapid fabrication of the step-height standards, as well as their high stability, make these standards accessible to a large and steadily growing number of users involved in 3D printing activities.

  13. Interdisciplinary cognitive task analysis: a strategy to develop a comprehensive endoscopic retrograde cholangiopancreatography protocol for use in fellowship training.

    PubMed

    Canopy, Erin; Evans, Matt; Boehler, Margaret; Roberts, Nicole; Sanfey, Hilary; Mellinger, John

    2015-10-01

    Endoscopic retrograde cholangiopancreatography is a challenging procedure performed by surgeons and gastroenterologists. We employed cognitive task analysis to identify steps and decision points for this procedure. Standardized interviews were conducted with expert gastroenterologists (7) and surgeons (4) from 4 institutions. A procedural step and cognitive decision point protocol was created from audio-taped transcriptions and was refined by 5 additional surgeons. Conceptual elements, sequential actions, and decision points were iterated for 5 tasks: patient preparation, duodenal intubation, selective cannulation, imaging interpretation with related therapeutic intervention, and complication management. A total of 180 steps were identified. Gastroenterologists identified 34 steps not identified by surgeons, and surgeons identified 20 steps not identified by gastroenterologists. The findings suggest that for complex procedures performed by diverse practitioners, more experts may help delineate distinctive emphases differentiated by training background and type of practice. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Counting Steps in Activities of Daily Living in People With a Chronic Disease Using Nine Commercially Available Fitness Trackers: Cross-Sectional Validity Study

    PubMed Central

    Beekman, Emmylou; Theunissen, Kyra; Braun, Susy; Beurskens, Anna J

    2018-01-01

    Background: Measuring physical activity with commercially available activity trackers is gaining popularity. People with a chronic disease can especially benefit from knowledge about their physical activity pattern in everyday life since sufficient physical activity can contribute to wellbeing and quality of life. However, no validity data are available for this population during activities of daily living. Objective: The aim of this study was to investigate the validity of 9 commercially available activity trackers for measuring step count during activities of daily living in people with a chronic disease receiving physiotherapy. Methods: The selected activity trackers were Accupedo (Corusen LLC), Activ8 (Remedy Distribution Ltd), Digi-Walker CW-700 (Yamax), Fitbit Flex (Fitbit inc), Lumoback (Lumo Bodytech), Moves (ProtoGeo Oy), Fitbit One (Fitbit inc), UP24 (Jawbone), and Walking Style X (Omron Healthcare Europe BV). In total, 130 persons with chronic diseases performed standardized activity protocols based on activities of daily living that were recorded on video camera and analyzed for step count (gold standard). The validity of the trackers’ step count was assessed by correlation coefficients, t tests, scatterplots, and Bland-Altman plots. Results: The correlations between the number of steps counted by the activity trackers and the gold standard were low (range: –.02 to .33). For all activity trackers except for Fitbit One, a significant systematic difference with the gold standard was found for step count. Plots showed a wide range in scores for all activity trackers; Activ8 showed an average overestimation and the other 8 trackers showed underestimations. Conclusions: This study showed that the validity of 9 commercially available activity trackers is low for measuring steps while individuals with chronic diseases receiving physiotherapy engage in activities of daily living. PMID:29610110
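
    The validity statistics used in this study (correlation with the video gold standard plus Bland-Altman bias and limits of agreement) can be sketched generically as below; the step counts shown are invented for illustration.

    ```python
    import numpy as np

    def validity_stats(tracker_steps, video_steps):
        """Pearson correlation plus Bland-Altman bias and 95% limits of agreement
        between a tracker and the video-based gold standard."""
        tracker = np.asarray(tracker_steps, dtype=float)
        video = np.asarray(video_steps, dtype=float)
        r = np.corrcoef(tracker, video)[0, 1]
        diff = tracker - video
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return r, bias, (bias - half_width, bias + half_width)

    # hypothetical counts from a few participants (illustration only)
    tracker = [512, 430, 610, 220, 480]
    video   = [530, 455, 590, 300, 505]
    print(validity_stats(tracker, video))
    ```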

  15. Newmark local time stepping on high-performance computing architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rietmann, Max, E-mail: max.rietmann@erdw.ethz.ch; Institute of Geophysics, ETH Zurich; Grote, Marcus, E-mail: marcus.grote@unibas.ch

    In multi-scale complex media, finite element meshes often require areas of local refinement, creating small elements that can dramatically reduce the global time-step for wave-propagation problems due to the CFL condition. Local time stepping (LTS) algorithms allow an explicit time-stepping scheme to adapt the time-step to the element size, allowing near-optimal time-steps everywhere in the mesh. We develop an efficient multilevel LTS-Newmark scheme and implement it in a widely used continuous finite element seismic wave-propagation package. In particular, we extend the standard LTS formulation with adaptations to continuous finite element methods that can be implemented very efficiently with very strong element-size contrasts (more than 100x). Capable of running on large CPU and GPU clusters, we present both synthetic validation examples and large scale, realistic application examples to demonstrate the performance and applicability of the method and implementation on thousands of CPU cores and hundreds of GPUs.

  16. Use of Low-Fidelity Simulation Laboratory Training for Teaching Radiology Residents CT-Guided Procedures.

    PubMed

    Picard, Melissa; Nelson, Rachel; Roebel, John; Collins, Heather; Anderson, M Bret

    2016-11-01

    To determine the benefit of the addition of low-fidelity simulation-based training to the standard didactic-based training in teaching radiology residents common CT-guided procedures. This was a prospective study involving 24 radiology residents across all years in a university program. All residents underwent a standard didactic lecture followed by low-fidelity simulation-based training on three common CT-guided procedures: random liver biopsy, lung nodule biopsy, and drain placement. Baseline knowledge, confidence, and performance assessments were obtained after the didactic session and before the simulation training session. Approximately 2 months later, all residents participated in a simulation-based training session covering all three of these procedures. Knowledge, confidence, and performance data were obtained afterward. These assessments covered topics related to preprocedure workup, intraprocedure steps, and postprocedure management. Knowledge data were collected based on a 15-question assessment. Confidence data were obtained based on a 5-point Likert-like scale. Performance data were obtained based on successful completion of predefined critical steps. There was significant improvement in knowledge (P = .005), confidence (P < .008), and tested performance (P < .043) after the addition of simulation-based training to the standard didactic curriculum for all procedures. This study suggests that the addition of low-fidelity simulation-based training to a standard didactic-based curriculum is beneficial in improving resident knowledge, confidence, and tested performance of common CT-guided procedures. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  17. SU-E-T-135: Investigation of Commercial-Grade Flatbed Scanners and a Medical- Grade Scanner for Radiochromic EBT Film Dosimetry.

    PubMed

    Syh, J; Patel, B; Syh, J; Wu, H; Rosen, L; Durci, M; Katz, S; Sibata, C

    2012-06-01

    To evaluate the characteristics of commercial-grade flatbed scanners and medical-grade scanners for radiochromic EBT film dosimetry. Performance aspects of a Vidar Dosimetry Pro Advantage (Red), Epson 750 Pro, Microtek ArtixScan 1800f, and Microtek ScanMaker 8700 scanner for EBT2 Gafchromic film were evaluated in the categories of repeatability, maximum distinguishable optical density (OD) differentiation, OD variance, and dose curve characteristics. OD step film by Stouffer Industries containing 31 steps ranging from 0.05 to 3.62 OD was used. EBT films were irradiated with doses ranging from 20 to 600 cGy in 6 × 6 cm² field sizes and analyzed 24 hours later using RIT113 and Tomotherapy Film Analyzer software. Scans were performed in transmissive mode, landscape orientation, 16-bit image. The mean and standard deviation of the analog-to-digital (A/D) scanner values were measured by selecting a 3 × 3 mm² uniform area in the central region of each OD step from a total of 20 scans performed over several weeks. Repeatability was determined from the variance of OD step 0.38. Maximum distinguishable OD was defined as the last OD step whose range of A/D values does not overlap with that of its neighboring step. Repeatability uncertainty ranged from 0.1% for Vidar to 4% for Epson. Average standard deviation of OD steps ranged from 0.21% for Vidar to 6.4% for ArtixScan 1800f. Maximum distinguishable optical density ranged from 3.38 for Vidar to 1.32 for ScanMaker 8700. The A/D range of each OD step corresponds to a dose range. Dose ranges of OD steps varied from 1% for Vidar to 20% for ScanMaker 8700. The Vidar exhibited a dose curve that utilized a broader range of OD values than the other scanners. Vidar exhibited higher maximum distinguishable OD, smaller variance in repeatability, smaller A/D value deviation per OD step, and a shallower dose curve with respect to OD. © 2012 American Association of Physicists in Medicine.
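
    One common way to obtain the optical densities discussed above from 16-bit transmission-scan A/D values is OD = log10(I_blank / I_film), referencing an unexposed film or open-field region. The following sketch assumes that convention and uses hypothetical pixel values.

    ```python
    import numpy as np

    def net_optical_density(region_values, blank_values):
        """Net OD of a film region from 16-bit transmission-scan pixel values:
        OD = log10(I_blank / I_film), using an unexposed (blank) region as I_0.
        Returns the mean OD and its standard deviation over the region."""
        film = np.asarray(region_values, dtype=float)
        blank = float(np.mean(blank_values))
        od = np.log10(blank / film)
        return od.mean(), od.std(ddof=1)

    # hypothetical A/D values from a small uniform region and a blank region
    film_region = [21000, 20890, 21120, 20950]
    blank_region = [64200, 64150, 64300]
    print(net_optical_density(film_region, blank_region))  # OD around 0.49
    ```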

  18. Brain-computer interfacing under distraction: an evaluation study

    NASA Astrophysics Data System (ADS)

    Brandl, Stephanie; Frølich, Laura; Höhne, Johannes; Müller, Klaus-Robert; Samek, Wojciech

    2016-10-01

    Objective. While motor-imagery based brain-computer interfaces (BCIs) have been studied over many years by now, most of these studies have taken place in controlled lab settings. Bringing BCI technology into everyday life is still one of the main challenges in this field of research. Approach. This paper systematically investigates BCI performance under 6 types of distractions that mimic out-of-lab environments. Main results. We report results of 16 participants and show that the performance of the standard common spatial patterns (CSP) + regularized linear discriminant analysis classification pipeline drops significantly in this ‘simulated’ out-of-lab setting. We then investigate three methods for improving the performance: (1) artifact removal, (2) ensemble classification, and (3) a 2-step classification approach. While artifact removal does not enhance the BCI performance significantly, both ensemble classification and the 2-step classification combined with CSP significantly improve the performance compared to the standard procedure. Significance. Systematically analyzing out-of-lab scenarios is crucial when bringing BCI into everyday life. Algorithms must be adapted to overcome nonstationary environments in order to tackle real-world challenges.
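
    The "standard CSP + regularized LDA pipeline" referred to above can be sketched with generic numerical tools: CSP filters come from a generalized eigendecomposition of the two class covariance matrices, features are log-variances of the filtered epochs, and a shrinkage LDA does the classification. The toy data, dimensions, and parameters below are assumptions, not the study's setup.

    ```python
    import numpy as np
    from scipy.linalg import eigh
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def csp_filters(epochs_a, epochs_b, n_pairs=3):
        """CSP filters from two sets of epochs, each (n_trials, n_channels, n_samples)."""
        def mean_cov(epochs):
            return np.mean([np.cov(e) for e in epochs], axis=0)
        ca, cb = mean_cov(epochs_a), mean_cov(epochs_b)
        # generalized eigenproblem: ca w = lambda (ca + cb) w, eigenvalues ascending
        vals, vecs = eigh(ca, ca + cb)
        order = np.argsort(vals)
        picks = np.concatenate([order[:n_pairs], order[-n_pairs:]])
        return vecs[:, picks].T                      # (2*n_pairs, n_channels)

    def log_var_features(epochs, W):
        filtered = np.einsum('fc,ecs->efs', W, epochs)
        var = filtered.var(axis=2)
        return np.log(var / var.sum(axis=1, keepdims=True))

    # toy data: 40 trials per class, 16 channels, 250 samples (assumed dimensions)
    rng = np.random.default_rng(0)
    a = rng.normal(size=(40, 16, 250)); a[:, 0] *= 3.0   # class A: extra variance on ch 0
    b = rng.normal(size=(40, 16, 250)); b[:, 1] *= 3.0   # class B: extra variance on ch 1
    W = csp_filters(a, b)
    X = np.vstack([log_var_features(a, W), log_var_features(b, W)])
    y = np.array([0] * 40 + [1] * 40)
    clf = LinearDiscriminantAnalysis(solver='lsqr', shrinkage='auto').fit(X, y)
    print(clf.score(X, y))
    ```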

  19. Electronic Nose Testing Procedure for the Definition of Minimum Performance Requirements for Environmental Odor Monitoring

    PubMed Central

    Eusebio, Lidia; Capelli, Laura; Sironi, Selena

    2016-01-01

    Despite initial enthusiasm towards electronic noses and their possible application in different fields, and many promising results, several criticalities emerge from most published research studies, and the diffusion of electronic noses in real-life applications is still very limited. In general, a first step towards large-scale diffusion of an analysis method is standardization. The aim of this paper is to describe the experimental procedure adopted to evaluate electronic nose performance, with the final purpose of establishing minimum performance requirements, which is considered to be a first crucial step towards standardization in the specific case of electronic nose application for environmental odor monitoring at receptors. Based on the experimental results of the performance testing of a commercialized electronic nose type with respect to three criteria (i.e., response invariability to variable atmospheric conditions, instrumental detection limit, and odor classification accuracy), it was possible to hypothesize a logic that could be adopted for the definition of minimum performance requirements, according to the idea that these are technologically achievable. PMID:27657086

  20. Electronic Nose Testing Procedure for the Definition of Minimum Performance Requirements for Environmental Odor Monitoring.

    PubMed

    Eusebio, Lidia; Capelli, Laura; Sironi, Selena

    2016-09-21

    Despite initial enthusiasm towards electronic noses and their possible application in different fields, and many promising results, several criticalities emerge from most published research studies, and the diffusion of electronic noses in real-life applications is still very limited. In general, a first step towards large-scale diffusion of an analysis method is standardization. The aim of this paper is to describe the experimental procedure adopted to evaluate electronic nose performance, with the final purpose of establishing minimum performance requirements, which is considered to be a first crucial step towards standardization in the specific case of electronic nose application for environmental odor monitoring at receptors. Based on the experimental results of the performance testing of a commercialized electronic nose type with respect to three criteria (i.e., response invariability to variable atmospheric conditions, instrumental detection limit, and odor classification accuracy), it was possible to hypothesize a logic that could be adopted for the definition of minimum performance requirements, according to the idea that these are technologically achievable.

  1. Are Scores From NBME Subject Examinations Valid Measures of Knowledge Acquired During Clinical Clerkships?

    PubMed

    Ryan, Michael S; Bishop, Steven; Browning, Joel; Anand, Rahul J; Waterhouse, Elizabeth; Rigby, Fidelma; Al-Mateen, Cheryl S; Lee, Clifton; Bradner, Melissa; Colbert-Getz, Jorie M

    2017-06-01

    The National Board of Medical Examiners' Clinical Science Subject Examinations are a component used by most U.S. medical schools to determine clerkship grades. The purpose of this study was to examine the validity of this practice. This was a retrospective cohort study of medical students at the Virginia Commonwealth University School of Medicine who completed clerkships in 2012 through 2014. Linear regression was used to determine how well United States Medical Licensing Examination Step 1 scores predicted Subject Examination scores in seven clerkships. The authors then replaced each student's Subject Examination standard scores with his or her Step 1 standard score. Clerkship grades based on the Step 1 substitution were compared with actual grades with the Wilcoxon rank test. A total of 2,777 Subject Examination scores from 432 students were included in the analysis. Step 1 scores significantly predicted between 23% and 44% of the variance in Subject Examination scores, P < .001 for all clerkship regression equations. Mean differences between expected and actual Subject Examination scores were small (≤ 0.2 points). Step 1-substituted final clerkship grades matched the actual final clerkship grades for 73% of students. The results of this study suggest that performance on Step 1 can be used to identify and counsel students at risk for poor performance on the Subject Examinations. In addition, these findings call into question the validity of using scores from Subject Examinations as a high-stakes assessment of learning in individual clerkships.

  2. Task Analysis for Health Occupations. Cluster: Nursing. Occupation: Geriatric Aide. Education for Employment Task Lists.

    ERIC Educational Resources Information Center

    Lake County Area Vocational Center, Grayslake, IL.

    This task analysis for nursing education provides performance standards, steps to be followed, knowledge required, attitudes to be developed, safety procedures, and equipment and supplies needed for 13 tasks performed by geriatric aides in the duty area of performing diagnostic measures and for 30 tasks in the duty area of providing therapeutic…

  3. Managing for Results--Linking Performance Measures and Budgets.

    ERIC Educational Resources Information Center

    McGee, William L.; Fountain, James R., Jr.

    1995-01-01

    The Government Accounting Standards Board's notion of service efforts and accomplishments reporting is one step in a process of managing for results that includes strategic planning, the development and use of performance measures for managing ongoing programs, and the linking of outputs to budgetary appropriations. Reports a trial application in one school district.…

  4. [Determination of serum or plasma alpha-tocopherol by high performance liquid chromatography: optimization of operative models].

    PubMed

    Jezequel-Cuer, M; Le Moël, G; Mounié, J; Peynet, J; Le Bizec, C; Vernet, M H; Artur, Y; Laschi-Loquerie, A; Troupel, S

    1995-01-01

    A previous multicenter study set up by the Société française de biologie clinique emphasized the usefulness of a standardized procedure for the determination of alpha-tocopherol in serum or plasma by high performance liquid chromatography. In our study, we tested every step of the different published procedures: addition of an internal standard, lipoprotein denaturation, and vitamin extraction. Reproducibility was improved by using tocol as the internal standard rather than retinol or alpha-tocopherol acetates. Lipoprotein denaturation was more efficient with ethanol than with methanol, provided the ethanol/water ratio was ≥ 0.7. n-Hexane and n-heptane gave the same recovery of alpha-tocopherol. When the organic solvent/water ratio was ≥ 1, n-hexane efficiently extracted alpha-tocopherol in a one-step procedure from both normo- and hyperlipidemic sera. The performance of the selected procedure was: detection limit, 0.5 microM; linear range, 750 microM; within-run coefficient of variation, 2.03%; day-to-day coefficient of variation, 4.76%. Finally, this multicenter study allows us to propose an optimized procedure for the determination of alpha-tocopherol in serum or plasma.

  5. Some Assembly Required: Building a Better Accountability System for California. Education Sector Reports

    ERIC Educational Resources Information Center

    Carey, Kevin

    2012-01-01

    For years California has been a leader in public education. In 1999, the state implemented some of the strongest content standards in the country through its main accountability metric, the Academic Performance Index (API). The state has also signed onto the Common Core State Standards, taking important steps to ensure its students are college-…

  6. Steps to standardization and validation of hippocampal volumetry as a biomarker in clinical trials and diagnostic criteria for Alzheimer’s disease

    PubMed Central

    Jack, Clifford R; Barkhof, Frederik; Bernstein, Matt A; Cantillon, Marc; Cole, Patricia E; DeCarli, Charles; Dubois, Bruno; Duchesne, Simon; Fox, Nick C; Frisoni, Giovanni B; Hampel, Harald; Hill, Derek LG; Johnson, Keith; Mangin, Jean-François; Scheltens, Philip; Schwarz, Adam J; Sperling, Reisa; Suhy, Joyce; Thompson, Paul M; Weiner, Michael; Foster, Norman L

    2012-01-01

    Background The promise of Alzheimer’s disease (AD) biomarkers has led to their incorporation in new diagnostic criteria and in therapeutic trials; however, significant barriers exist to widespread use. Chief among these is the lack of internationally accepted standards for quantitative metrics. Hippocampal volumetry is the most widely studied quantitative magnetic resonance imaging (MRI) measure in AD and thus represents the most rational target for an initial effort at standardization. Methods and Results The authors of this position paper propose a path toward this goal. The steps include: 1) Establish and empower an oversight board to manage and assess the effort, 2) Adopt the standardized definition of anatomic hippocampal boundaries on MRI arising from the EADC-ADNI hippocampal harmonization effort as a Reference Standard, 3) Establish a scientifically appropriate, publicly available Reference Standard Dataset based on manual delineation of the hippocampus in an appropriate sample of subjects (ADNI), and 4) Define minimum technical and prognostic performance metrics for validation of new measurement techniques using the Reference Standard Dataset as a benchmark. Conclusions Although manual delineation of the hippocampus is the best available reference standard, practical application of hippocampal volumetry will require automated methods. Our intent is to establish a mechanism for credentialing automated software applications to achieve internationally recognized accuracy and prognostic performance standards that lead to the systematic evaluation and then widespread acceptance and use of hippocampal volumetry. The standardization and assay validation process outlined for hippocampal volumetry is envisioned as a template that could be applied to other imaging biomarkers. PMID:21784356

  7. The Relations between Teasing and Bullying and Middle School Standardized Exam Performance

    ERIC Educational Resources Information Center

    Lacey, Anna; Cornell, Dewey; Konold, Timothy

    2017-01-01

    This study examined the relations between the schoolwide prevalence of teasing and bullying (PTB) and schoolwide academic performance in a sample of 271 Virginia middle schools. In addition, the study examined the mediating effects of student engagement. A three-step sequence of path models investigated associations between schoolwide PTB and…

  8. From the lab to commercial reality with biobased adhesives for wood

    Treesearch

    Charles R. Frihart; Michael J. Birkeland

    2016-01-01

    Many technologies can be demonstrated in the laboratory to give products that meet performance standards, but there are many hurdles to overcome before these products are commercially viable. Demonstrating performance under simulated commercial process conditions is the first key step, to be accomplished through appropriate adhesive application, furnish...

  9. Building America Case Study: Zero Energy Ready Home and the Challenge of Hot Water on Demand, Denver, Colorado

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    "This report outlines the steps a developer can use when looking to create and implement higher performance standards such as the U.S. Department of Energy (DOE) Zero Energy Ready Home (ZERH) standards in a community. The report also describes the specific examples of how this process was followed by a developer, Forest City, in the Stapleton community in Denver, Colorado. IBACOS described the steps used to begin to bring the DOE ZERH standard to the Forest City Stapleton community based on 15 years of community-scale development work done by IBACOS. As a result of this prior IBACOS work, the teammore » gained an understanding of the various components that a master developer needs to consider and created strategies for incorporating those components in the initial phases of development to achieve higher performance buildings in the community. An automated scoring system can be used to perform an internal audit that provides a detailed and consistent evaluation of how several homes under construction or builders' floor plans compare with the requirements of the DOE Zero Energy Ready Home program. This audit can be performed multiple times at specific milestones during construction to allow the builder to make changes as needed throughout construction for the project to meet Zero Energy Ready Home standards. This scoring system also can be used to analyze a builder's current construction practices and design.« less

  10. A Two-Step Absorber Deposition Approach To Overcome Shunt Losses in Thin-Film Solar Cells: Using Tin Sulfide as a Proof-of-Concept Material System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steinmann, Vera; Chakraborty, Rupak; Rekemeyer, Paul H.

    2016-08-31

    As novel absorber materials are developed and screened for their photovoltaic (PV) properties, the challenge remains to reproducibly test promising candidates for high-performing PV devices. Many early-stage devices are prone to device shunting due to pinholes in the absorber layer, producing 'false-negative' results. Here, we demonstrate a device engineering solution toward a robust device architecture, using a two-step absorber deposition approach. We use tin sulfide (SnS) as a test absorber material. The SnS bulk is processed at high temperature (400 degrees C) to stimulate grain growth, followed by a much thinner, low-temperature (200 degrees C) absorber deposition. At the lower process temperature, the thin absorber overlayer contains significantly smaller, densely packed grains, which are likely to provide a continuous coating and fill pinholes in the underlying absorber bulk. We compare this two-step approach to the more standard approach of using a semi-insulating buffer layer directly on top of the annealed absorber bulk, and we demonstrate a more than 3.5× higher shunt resistance Rsh with a smaller standard error σRsh. Electron-beam-induced current (EBIC) measurements indicate a lower density of pinholes in the SnS absorber bulk when using the two-step absorber deposition approach. We correlate those findings to improvements in the device performance and device performance reproducibility.

  11. Developing a holistic accreditation system for medical universities of the Islamic Republic of Iran.

    PubMed

    Yousefy, A; Changiz, T; Yamani, N; Zahrai, R H; Ehsanpour, S

    2009-01-01

    This report describes the steps in the development of an accreditation system for medical universities in the Islamic Republic of Iran. The national accreditation project, supported by the government, was performed from 2001 to 2005. The project was carried out in 3 main phases, each phase including a number of tasks. After a review of the international literature on accreditation and through national consensus, a set of national institutional accreditation standards was developed, including 95 standards and 504 indicators in 10 areas. By complying with accepted national standards, Iranian medical universities will play an important role in promoting health system performance.

  12. A programmable positioning stepper-motor controller with a multibus/IEEE 796 compatible interface.

    PubMed

    Papoff, P; Ricci, D

    1984-02-01

    A programmable positioning stepper-motor controller, based on the Multibus/IEEE 796 standard interface, has been assembled by use of some intelligent and programmable integrated circuits. This controller, organized as a bus-slave unit, has been planned for local management of up to four stepper motors working simultaneously. The number of steps, the direction of rotation and the step-rate for the positioning of each motor are issued by the bus master microcomputer to the controller which handles all the required operations. Once each positioning has been performed, the controller informs the master by generating a proper bus-vectored interrupt. Displacements in up to 64,000 steps may be programmed with step-rates ranging from 0.1 to 6550 steps/sec. This device, for which only low-cost, high-performance components are required, can be successfully used in a wide range of applications and can be easily extended to control more than four stepper motors.
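
    The abstract specifies only the command envelope (up to four motors, up to 64,000 steps per move, step rates of 0.1-6550 steps/s); the register-level interface is not described. The sketch below is therefore a purely hypothetical host-side helper that validates and packs such a positioning command; the frame layout and the 0.1 steps/s rate resolution are assumptions, not the controller's actual protocol.

```python
from dataclasses import dataclass

MAX_STEPS = 64_000          # per the abstract
RATE_RANGE = (0.1, 6550.0)  # steps per second, per the abstract
NUM_MOTORS = 4

@dataclass
class PositioningCommand:
    motor: int        # 0..3
    steps: int        # number of steps to move
    clockwise: bool   # direction of rotation
    rate: float       # step rate in steps/s

    def validate(self) -> None:
        if not 0 <= self.motor < NUM_MOTORS:
            raise ValueError("motor index out of range")
        if not 0 < self.steps <= MAX_STEPS:
            raise ValueError("step count out of range")
        if not RATE_RANGE[0] <= self.rate <= RATE_RANGE[1]:
            raise ValueError("step rate out of range")

    def encode(self) -> bytes:
        """Pack into a hypothetical 6-byte command frame (not the real register layout)."""
        self.validate()
        rate_code = int(self.rate * 10)  # 0.1 steps/s resolution, an assumption
        return bytes([self.motor, int(self.clockwise)]) + \
            self.steps.to_bytes(2, "little") + rate_code.to_bytes(2, "little")

cmd = PositioningCommand(motor=1, steps=12_000, clockwise=True, rate=500.0)
print(cmd.encode().hex())
```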

  13. Clinical Assessment of Risk Management: an INtegrated Approach (CARMINA).

    PubMed

    Tricarico, Pierfrancesco; Tardivo, Stefano; Sotgiu, Giovanni; Moretti, Francesca; Poletti, Piera; Fiore, Alberto; Monturano, Massimo; Mura, Ida; Privitera, Gaetano; Brusaferro, Silvio

    2016-08-08

    Purpose - The European Union recommendations for patient safety call for shared clinical risk management (CRM) safety standards able to guide organizations in CRM implementation. The purpose of this paper is to develop a self-evaluation tool to measure healthcare organization performance on CRM and guide improvements over time. Design/methodology/approach - A multi-step approach was implemented, including: a systematic literature review; consensus meetings with an expert panel from eight Italian leader organizations to reach agreement on the first version; field testing to assess the instrument's feasibility and flexibility; and a Delphi strategy with a second expert panel for content validation and development of a balanced scoring system. Findings - The self-assessment tool, Clinical Assessment of Risk Management: an INtegrated Approach (CARMINA), includes seven areas (governance, communication, knowledge and skills, safe environment, care processes, adverse event management, learning from experience) and 52 standards. Each standard is evaluated according to four performance levels: minimum; monitoring; outcomes; and improvement actions. The result is a feasible, flexible, and valid instrument that can be used across different organizations. Practical implications - This tool allows practitioners to assess their CRM activities against minimum levels, monitor performance, benchmark with other institutions, and communicate results to different stakeholders. Originality/value - The multi-step approach allowed us to identify core minimum CRM levels in a field where no consensus has been reached. Most standards may be easily adopted in other countries.

  14. Improving HIV outcomes in resource-limited countries: the importance of quality indicators.

    PubMed

    Ahonkhai, Aima A; Bassett, Ingrid V; Ferris, Timothy G; Freedberg, Kenneth A

    2012-11-24

    Resource-limited countries increasingly depend on quality indicators to improve outcomes within HIV treatment programs, but indicators of program performance suitable for use at the local program level remain underdeveloped. Using the existing literature as a guide, we applied standard quality improvement (QI) concepts to the continuum of HIV care from HIV diagnosis, to enrollment and retention in care, and highlighted critical service delivery process steps to identify opportunities for performance indicator development. We then identified existing indicators to measure program performance, citing examples used by pivotal donor agencies, and assessed their feasibility for use in surveying local program performance. Clinical delivery steps without existing performance measures were identified as opportunities for measure development. Using National Quality Forum (NQF) criteria as a guide, we developed measurement concepts suitable for use at the local program level that address existing gaps in program performance assessment. This analysis of the HIV continuum of care identified seven critical process steps providing numerous opportunities for performance measurement. Analysis of care delivery process steps and the application of NQF criteria identified 24 new measure concepts that are potentially useful for improving operational performance in HIV care at the local level. An evidence-based set of program-level quality indicators is critical for the improvement of HIV care in resource-limited settings. These performance indicators should be utilized as treatment programs continue to grow.

  15. Study Behaviors and USMLE Step 1 Performance: Implications of a Student Self-Directed Parallel Curriculum.

    PubMed

    Burk-Rafel, Jesse; Santen, Sally A; Purkiss, Joel

    2017-11-01

    To determine medical students' study behaviors when preparing for the United States Medical Licensing Examination (USMLE) Step 1, and how these behaviors are associated with Step 1 scores when controlling for likely covariates. The authors distributed a study-behaviors survey in 2014 and 2015 at their institution to two cohorts of medical students who had recently taken Step 1. Demographic and academic data were linked to responses. Descriptive statistics, bivariate correlations, and multiple linear regression analyses were performed. Of 332 medical students, 274 (82.5%) participated. Most students (n = 211; 77.0%) began studying for Step 1 during their preclinical curriculum, increasing their intensity during a protected study period during which they averaged 11.0 hours studying per day (standard deviation [SD] 2.1) over a period of 35.3 days (SD 6.2). Students used numerous third-party resources, including reading an exam-specific 700-page review book on average 2.1 times (SD 0.8) and completing an average of 3,597 practice multiple-choice questions (SD 1,611). Initiating study prior to the designated study period, increased review book usage, and attempting more practice questions were all associated with higher Step 1 scores, even when controlling for Medical College Admission Test scores, preclinical exam performance, and self-identified score goal (adjusted R² = 0.56, P < .001). Medical students at one public institution engaged in a self-directed, "parallel" Step 1 curriculum using third-party study resources. Several study behaviors were associated with improved USMLE Step 1 performance, informing both institutional- and student-directed preparation for this high-stakes exam.

  16. Problem-based learning outcomes: the glass half-full.

    PubMed

    Distlehorst, Linda H; Dawson, Elizabeth; Robbs, Randall S; Barrows, Howard S

    2005-03-01

    To compare the characteristics and outcome data of students from a single institution with a two-track, problem based learning (PBL) and standard (STND) curriculum. PBL and STND students from nine graduating classes at Southern Illinois University School of Medicine were compared using common medical school performance outcomes (USMLE Step 1, USMLE Step 2, clerkship mean ratings, number of clerkship honors and remediation designations, and the senior clinical competency exam), as well as common admission and demographic variables. PBL students were older, and the cohort had a higher proportion of women. The two tracks had similar USMLE Step 1 and 2 mean scores and pass rates. Performance differences were significant for PBL students in two clerkships as well as in the clerkship subcategories of clinical performance, knowledge and clinical reasoning, and noncognitive behaviors. In addition, the proportion of PBL students earning honors was greater. The traditional undergraduate educational outcomes for the PBL and STND students are very positive. In several of the clerkship performance measures, the PBL students performed significantly better, and in no circumstance did they perform worse than the STND students.

  17. Counting Steps in Activities of Daily Living in People With a Chronic Disease Using Nine Commercially Available Fitness Trackers: Cross-Sectional Validity Study.

    PubMed

    Ummels, Darcy; Beekman, Emmylou; Theunissen, Kyra; Braun, Susy; Beurskens, Anna J

    2018-04-02

    Measuring physical activity with commercially available activity trackers is gaining popularity. People with a chronic disease can especially benefit from knowledge about their physical activity pattern in everyday life, since sufficient physical activity can contribute to wellbeing and quality of life. However, no validity data are available for this population during activities of daily living. The aim of this study was to investigate the validity of 9 commercially available activity trackers for measuring step count during activities of daily living in people with a chronic disease receiving physiotherapy. The selected activity trackers were Accupedo (Corusen LLC), Activ8 (Remedy Distribution Ltd), Digi-Walker CW-700 (Yamax), Fitbit Flex (Fitbit inc), Lumoback (Lumo Bodytech), Moves (ProtoGeo Oy), Fitbit One (Fitbit inc), UP24 (Jawbone), and Walking Style X (Omron Healthcare Europe BV). In total, 130 persons with chronic diseases performed standardized activity protocols based on activities of daily living that were recorded on video camera and analyzed for step count (gold standard). The validity of the trackers' step count was assessed by correlation coefficients, t tests, scatterplots, and Bland-Altman plots. The correlations between the number of steps counted by the activity trackers and the gold standard were low (range: -.02 to .33). For all activity trackers except Fitbit One, a significant systematic difference from the gold standard was found for step count. Plots showed a wide range in scores for all activity trackers; Activ8 showed an average overestimation, and the other 8 trackers showed underestimations. This study showed that the validity of the 9 commercially available activity trackers is low when measuring steps while individuals with chronic diseases receiving physiotherapy engage in activities of daily living. ©Darcy Ummels, Emmylou Beekman, Kyra Theunissen, Susy Braun, Anna J Beurskens. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 02.04.2018.
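
    The agreement statistics named above can be sketched briefly. The following is a minimal example, with made-up paired step counts, of the Pearson correlation and the Bland-Altman bias and limits of agreement between a tracker and the video-based gold standard; it does not reproduce the study's data or its exact statistical models.

```python
import numpy as np

# Hypothetical paired step counts: video-based gold standard vs. one tracker.
gold = np.array([310, 295, 402, 118, 250, 330, 275, 190], dtype=float)
tracker = np.array([280, 240, 395, 60, 210, 300, 260, 120], dtype=float)

# Pearson correlation between tracker and gold standard.
r = np.corrcoef(gold, tracker)[0, 1]

# Bland-Altman statistics: mean difference (bias) and 95% limits of agreement.
diff = tracker - gold
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"r = {r:.2f}, bias = {bias:.1f} steps, "
      f"limits of agreement = {bias - loa:.1f} to {bias + loa:.1f}")
```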

  18. Preparation of high purity plutonium oxide for radiochemistry instrument calibration standards and working standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, A.S.; Stalnaker, N.D.

    1997-04-01

    Due to the lack of suitable high-level National Institute of Standards and Technology (NIST) traceable plutonium solution standards from NIST or commercial vendors, the CST-8 Radiochemistry team at Los Alamos National Laboratory (LANL) has prepared instrument calibration standards and working standards from a well-characterized plutonium oxide. All the aliquoting steps were performed gravimetrically. When a ²⁴¹Am standardized solution obtained from a commercial vendor was compared to these calibration solutions, the results agreed to within 0.04% for the total alpha activity. The aliquots of the plutonium standard solutions and dilutions were sealed in glass ampules for long-term storage.

  19. Advanced real-time multi-display educational system (ARMES): An innovative real-time audiovisual mentoring tool for complex robotic surgery.

    PubMed

    Lee, Joong Ho; Tanaka, Eiji; Woo, Yanghee; Ali, Güner; Son, Taeil; Kim, Hyoung-Il; Hyung, Woo Jin

    2017-12-01

    Recent scientific and technologic advances have profoundly affected the training of surgeons worldwide. We describe a novel intraoperative real-time training module, the Advanced Robotic Multi-display Educational System (ARMES). We created a real-time training module, ARMES, which provides standardized step-by-step guidance for robotic distal subtotal gastrectomy with D2 lymphadenectomy. Short video clips of the 20 key steps in the standardized procedure for robotic gastrectomy were created and integrated with TilePro™ software for delivery on da Vinci Surgical Systems (Intuitive Surgical, Sunnyvale, CA). We successfully performed a robotic distal subtotal gastrectomy with D2 lymphadenectomy for a patient with gastric cancer employing this new teaching method, without any transfer errors or system failures. Using this technique, the total operative time was 197 min, blood loss was 50 mL, and there were no intra- or post-operative complications. Our innovative real-time mentoring module, ARMES, enables standardized, systematic guidance during surgical procedures. © 2017 Wiley Periodicals, Inc.

  20. Prevalence in running events and running performance of endurance runners following a vegetarian or vegan diet compared to non-vegetarian endurance runners: the NURMI Study.

    PubMed

    Wirnitzer, Katharina; Seyfart, Tom; Leitzmann, Claus; Keller, Markus; Wirnitzer, Gerold; Lechleitner, Christoph; Rüst, Christoph Alexander; Rosemann, Thomas; Knechtle, Beat

    2016-01-01

    Beneficial and detrimental effects of various vegetarian and vegan diets on health status are well known. Considering the growing numbers of vegetarians and vegans, the number of vegetarian and vegan runners is likely to rise, too. Therefore, the Nutrition and Running High Mileage (NURMI) Study was designed as a comparative study to investigate the prevalence of omnivores, vegetarians, and vegans in running events and to detect potential differences in running performance among these three subgroups. The NURMI Study will be conducted in three steps following a cross-sectional design. Step 1 will determine epidemiological aspects of endurance runners (any distance) using a short standardized questionnaire. Step 2 will investigate dietary habits and running history of eligible participants (capable of running at least a half-marathon) using an extended standardized questionnaire. Step 3 will collect data after a running event on finishing time and final ranking, as well as a post-race rating of perceived exertion, mood status, and nutrient and fluid intake during the race. Our study will provide a major contribution to overcoming the lack of data on the prevalence and running performance of vegetarian and vegan runners in endurance running events. We estimate the prevalence of vegetarians and vegans participating in a running event to be lower than the respective proportions of vegetarians and vegans in the general population. Furthermore, we will validate the subjects' self-assessment of their respective diets. This comparative study may identify possible effects of dietary behavior on running performance and may detect possible differences among the respective subgroups: omnivorous, vegetarian, and vegan runners. Trial registration: Current Controlled Trials, ISRCTN73074080.

  1. Laboratory scale Clean-In-Place (CIP) studies on the effectiveness of different caustic and acid wash steps on the removal of dairy biofilms.

    PubMed

    Bremer, Philip J; Fillery, Suzanne; McQuillan, A James

    2006-02-15

    A laboratory scale, bench top flow system was used to partially reproduce dairy plant conditions under which biofilms form and to quantify the effectiveness of caustic and acid wash steps in reducing the number of viable bacteria attached to stainless steel (SS) surfaces. Once bacteria attached to surfaces, a standard clean-in-place (CIP) regime (water rinse, 1% sodium hydroxide at 65 degrees C for 10 min, water rinse, 1.0% nitric acid at 65 degrees C for 10 min, water rinse) did not reproducibly ensure their removal. Standard CIP effectiveness was compared to alternative cleaning chemicals such as: caustic blends (Alkazolv 48, Ultrazolv 700, Concept C20, and Reflex B165); a caustic additive (Eliminator); acid blends (Nitroplus and Nitrobrite); and sanitizer (Perform). The addition of a caustic additive, Eliminator, enhanced biofilm removal compared to the standard CIP regime and further increases in cleaning efficiency occurred when nitric acid was substituted with Nitroplus. The combination of NaOH plus Eliminator and Nitroplus achieved a 3.8 log reduction in the number of cells recovered from the stainless steel surface. The incorporation of a sanitizer step into the CIP did not appear to enhance biofilm removal. This study has shown that the effectiveness of a "standard" CIP can possibly be enhanced through the testing and use of caustic and acid blends. There are many implications of these findings, including: the development of improved cleaning regimes and improved product quality, plant performance, and economic returns.

  2. Influence of learning styles on the practical performance after the four-step basic life support training approach – An observational cohort study

    PubMed Central

    Henke, Alexandra; Stieger, Lina; Beckers, Stefan; Biermann, Henning; Rossaint, Rolf; Sopka, Saša

    2017-01-01

    Background Learning and training basic life support (BLS)—especially external chest compressions (ECC) within the BLS-algorithm—are essential resuscitation training for laypersons as well as for health care professionals. The objective of this study was to evaluate the influence of learning styles on the performance of BLS and to identify whether all types of learners are sufficiently addressed by Peyton’s four-step approach for BLS training. Methods A study group of first-year medical students (n = 334) without previous medical knowledge was categorized according to learning styles using the German Lernstilinventar questionnaire based on Kolb’s Learning Styles Inventory. Students’ BLS performances were assessed before and after a four-step BLS training approach lasting 4 hours. Standardized BLS training was provided by an educational staff consisting of European Resuscitation Council-certified advanced life support providers and instructors. Pre- and post-intervention BLS performance was evaluated using a single-rescuer-scenario and standardized questionnaires (6-point-Likert-scales: 1 = completely agree, 6 = completely disagree). The recorded points of measurement were the time to start, depth, and frequency of ECC. Results The study population was categorized according to learning styles: diverging (5%, n = 16), assimilating (36%, n = 121), converging (41%, n = 138), and accommodating (18%, n = 59). Independent of learning styles, both male and female participants showed significant improvement in cardiopulmonary resuscitation (CPR) performance. Based on the Kolb learning styles, no significant differences between the four groups were observed in compression depth, frequency, time to start CPR, or the checklist-based assessment within the baseline assessment. A significant sex effect on the difference between pre- and post-interventional assessment points was observed for mean compression depth and mean compression frequency. Conclusions The findings of this work show that the four-step-approach for BLS training addresses all types of learners independent of their learning styles and does not lead to significant differences in the performance of CPR. PMID:28542636

  3. Influence of learning styles on the practical performance after the four-step basic life support training approach - An observational cohort study.

    PubMed

    Schröder, Hanna; Henke, Alexandra; Stieger, Lina; Beckers, Stefan; Biermann, Henning; Rossaint, Rolf; Sopka, Saša

    2017-01-01

    Learning and training basic life support (BLS)-especially external chest compressions (ECC) within the BLS-algorithm-are essential resuscitation training for laypersons as well as for health care professionals. The objective of this study was to evaluate the influence of learning styles on the performance of BLS and to identify whether all types of learners are sufficiently addressed by Peyton's four-step approach for BLS training. A study group of first-year medical students (n = 334) without previous medical knowledge was categorized according to learning styles using the German Lernstilinventar questionnaire based on Kolb's Learning Styles Inventory. Students' BLS performances were assessed before and after a four-step BLS training approach lasting 4 hours. Standardized BLS training was provided by an educational staff consisting of European Resuscitation Council-certified advanced life support providers and instructors. Pre- and post-intervention BLS performance was evaluated using a single-rescuer-scenario and standardized questionnaires (6-point-Likert-scales: 1 = completely agree, 6 = completely disagree). The recorded points of measurement were the time to start, depth, and frequency of ECC. The study population was categorized according to learning styles: diverging (5%, n = 16), assimilating (36%, n = 121), converging (41%, n = 138), and accommodating (18%, n = 59). Independent of learning styles, both male and female participants showed significant improvement in cardiopulmonary resuscitation (CPR) performance. Based on the Kolb learning styles, no significant differences between the four groups were observed in compression depth, frequency, time to start CPR, or the checklist-based assessment within the baseline assessment. A significant sex effect on the difference between pre- and post-interventional assessment points was observed for mean compression depth and mean compression frequency. The findings of this work show that the four-step-approach for BLS training addresses all types of learners independent of their learning styles and does not lead to significant differences in the performance of CPR.

  4. Absolute Frequency Measurements with a Set of Transportable HE-NE/CH4 Optical Frequency Standards and Prospects for Future Design and Applications

    NASA Astrophysics Data System (ADS)

    Gubin, M.; Kovalchuk, E.; Petrukhin, E.; Shelkovnikov, A.; Tyurikov, D.; Gamidov, R.; Erdogan, C.; Sahin, E.; Felder, R.; Gill, P.; Lea, S. N.; Kramer, G.; Lipphardt, B.

    2002-04-01

    The accumulated results of absolute frequency measurements (AFM) carried out in 1997-2000 with transportable double-mode He-Ne/CH4 optical frequency standards (λ = 3.39 μm) in a collaboration of several laboratories are presented. The performance of this secondary optical frequency standard is estimated at the level of 10⁻¹³ (in repeatability) and 1 × 10⁻¹⁴/s (in stability). The next steps towards He-Ne/CH4 standards with one order of magnitude better performance, including devices based on monolithic Zerodur resonators, are discussed. Important applications of transportable He-Ne/CH4 optical frequency standards have now appeared due to dramatic progress in the field of optical frequency measurements. Used to stabilize the repetition rate of a Ti:Sa fs laser, these compact secondary standards can transfer their performance into the whole optical range covered by a fs comb. Thus they can play the role of a narrow-spectrum interrogative oscillator for super-accurate optical or microwave frequency standards, substituting in some tasks for an H-maser or for oscillators based on cryogenic sapphire resonators.

  5. Tendency for interlaboratory precision in the GMO analysis method based on real-time PCR.

    PubMed

    Kodama, Takashi; Kurosawa, Yasunori; Kitta, Kazumi; Naito, Shigehiro

    2010-01-01

    The Horwitz curve estimates interlaboratory precision as a function of concentration only, and is frequently used as a method performance criterion in food analysis with chemical methods. Quantitative biochemical methods based on real-time PCR require an analogous criterion to progressively promote method validation. We analyzed the tendency of precision using a simplex real-time PCR technique in 53 collaborative studies of seven genetically modified (GM) crops. The reproducibility standard deviation (SR) and repeatability standard deviation (Sr) of the genetically modified organism (GMO) amount (%) were more or less independent of the GM crop (i.e., maize, soybean, cotton, oilseed rape, potato, sugar beet, and rice) and of the evaluation procedure steps. Some studies evaluated the whole procedure, consisting of DNA extraction and PCR quantitation, whereas others focused only on the PCR quantitation step by using DNA extraction solutions. Therefore, SR and Sr for the GMO amount (%) are functions only of concentration, similar to the Horwitz curve. We proposed SR = 0.1971C^0.8685 and Sr = 0.1478C^0.8424, where C is the GMO amount (%). We also proposed a method performance index for GMO quantitative methods that is analogous to the Horwitz Ratio.
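
    The proposed precision functions can be evaluated directly for any GMO amount. The sketch below computes the predicted SR and Sr and a HorRat-style performance index (observed SR divided by predicted SR); defining the index as that ratio follows the usual Horwitz Ratio convention and is an assumption here rather than a quotation from the paper.

```python
def predicted_sr_reproducibility(c):
    """Predicted reproducibility standard deviation S_R for GMO amount c (%)."""
    return 0.1971 * c ** 0.8685

def predicted_sr_repeatability(c):
    """Predicted repeatability standard deviation S_r for GMO amount c (%)."""
    return 0.1478 * c ** 0.8424

def performance_index(observed_sr, c):
    """HorRat-style index: observed S_R divided by the predicted S_R (assumed convention)."""
    return observed_sr / predicted_sr_reproducibility(c)

c = 1.0  # GMO amount in percent
print(predicted_sr_reproducibility(c))            # 0.1971
print(predicted_sr_repeatability(c))              # 0.1478
print(performance_index(observed_sr=0.25, c=c))   # about 1.27
```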

  6. Recovery of forward stepping in spinal cord injured patients does not transfer to untrained backward stepping.

    PubMed

    Grasso, Renato; Ivanenko, Yuri P; Zago, Myrka; Molinari, Marco; Scivoletto, Giorgio; Lacquaniti, Francesco

    2004-08-01

    Six spinal cord injured (SCI) patients were trained to step on a treadmill with body-weight support for 1.5-3 months. At the end of training, foot motion recovered the shape and the step-by-step reproducibility that characterize normal gait. They were then asked to step backward on the treadmill belt that moved in the opposite direction relative to standard forward training. In contrast to healthy subjects, who can immediately reverse the direction of walking by time-reversing the kinematic waveforms, patients were unable to step backward. Similarly patients were unable to perform another untrained locomotor task, namely stepping in place on the idle treadmill. Two patients who were trained to step backward for 2-3 weeks were able to develop control of foot motion appropriate for this task. The results show that locomotor improvement does not transfer to untrained tasks, thus supporting the idea of task-dependent plasticity in human locomotor networks.

  7. Cause or effect? The relationship between student perception of the medical school learning environment and academic performance on USMLE Step 1.

    PubMed

    Wayne, Sharon J; Fortner, Sally A; Kitzes, Judith A; Timm, Craig; Kalishman, Summers

    2013-05-01

    A school's learning environment is believed to influence academic performance, yet few studies have evaluated this association while controlling for prior academic ability, an important factor because students who do well in school tend to rate their school's environment more highly than students who are less academically strong. Our aim was to evaluate the effect of student perception of the learning environment on performance on a standardized licensing test while controlling for prior academic ability. We measured perception of the learning environment after the first year of medical school in 267 students from five consecutive classes and related that measure to performance on United States Medical Licensing Examination (USMLE) Step 1, taken approximately six months later. We controlled for prior academic performance by including Medical College Admission Test score and undergraduate grade point average in linear regression models. Three of the five learning environment subscales were statistically associated with Step 1 performance (p < 0.05): meaningful learning environment, emotional climate, and student-student interaction. A one-point increase in the rating of these subscales (scale of 1-4) was associated with increases of 6.8, 6.6, and 4.8 points on the Step 1 exam, respectively. Our findings provide some evidence for the widely held assumption that a positively perceived learning environment contributes to better academic performance.

  8. Comparison of two filtration-elution procedures to improve the standard methods ISO 10705-1 & 2 for bacteriophage detection in groundwater, surface water and finished water samples.

    PubMed

    Helmi, K; Jacob, P; Charni-Ben-Tabassi, N; Delabre, K; Arnal, C

    2011-09-01

    To select a reliable method for bacteriophage concentration prior to detection by culture from surface water, groundwater, and drinking water, in order to enhance the sensitivity of the standard methods ISO 10705-1 & 2. Artificially contaminated (groundwater and drinking water) and naturally contaminated (surface water) 1-litre samples were processed for bacteriophage detection. The spiked samples were inoculated with about 150 PFU of F-specific RNA bacteriophages and somatic coliphages using wastewater. Bacteriophage detection in the water samples was achieved using the standard method with and without a concentration step (electropositive Anodisc membrane or a pretreated electronegative microfiltration (MF) membrane). For artificially contaminated matrices (drinking and ground waters), recovery rates using the concentration step were above 70%, whilst analyses without a concentration step mainly led to false-negative results. In addition, the MF membrane performed better than the Anodisc membrane. Concentrating a large volume of water (up to one litre) on a filter membrane avoids the false-negative results obtained by direct analysis, as it allows detection of low numbers of bacteriophages in water samples. The addition of a concentration step before applying the standard method could be useful to enhance the reliability of bacteriophage monitoring in water samples as a bio-indicator of faecal pollution. © No claim to French Government works. Letters in Applied Microbiology © 2011 The Society for Applied Microbiology.

  9. Academy of nutrition and dietetics: revised 2014 standards of practice and standards of professional performance for registered dietitian nutritionists (competent, proficient, and expert) in sports nutrition and dietetics.

    PubMed

    Steinmuller, Patricia L; Kruskall, Laura J; Karpinski, Christine A; Manore, Melinda M; Macedonio, Michele A; Meyer, Nanna L

    2014-04-01

    Sports nutrition and dietetics addresses relationships of nutrition with physical activity, including weight management, exercise, and physical performance. Nutrition plays a key role in the prevention and treatment of obesity and chronic disease and for maintenance of health, and the ability to engage in physical activity, sports, and other aspects of physical performance. Thus, the Sports, Cardiovascular, and Wellness Nutrition Dietetic Practice Group, with guidance from the Academy of Nutrition and Dietetics Quality Management Committee, has developed the Revised 2014 Standards of Practice and Standards of Professional Performance as a resource for Registered Dietitian Nutritionists working in sports nutrition and dietetics to assess their current skill levels and to identify areas for further professional development in this emerging practice area. The revised document reflects advances in sports nutrition and dietetics practice since the original standards were published in 2009 and replaces those standards. The Standards of Practice represents the four steps in the Nutrition Care Process as applied to the care of patients/clients. The Standards of Professional Performance covers six standards of professional performance: quality in practice, competence and accountability, provision of services, application of research, communication and application of knowledge, and utilization and management of resources. Within each standard, specific indicators provide measurable action statements that illustrate how the standards can be applied to practice. The indicators describe three skill levels (competent, proficient, and expert) for Registered Dietitian Nutritionists working in sports nutrition and dietetics. The Standards of Practice and Standards of Professional Performance are complementary resources for Registered Dietitian Nutritionists in sports nutrition and dietetics practice. Copyright © 2014 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  10. Stepping Stones To Using "Caring for Our Children": National Health and Safety Performance Standards for Out-of-Home Child Care Programs. Protecting Children from Harm.

    ERIC Educational Resources Information Center

    Colorado Univ. Health Sciences Center, Denver.

    Developed in support of state licensing and regulatory agencies as well as state child care, health, and resource and referral agencies, and a variety of other public and private organizations, parents, and advocacy groups, this guide identifies those standards most needed for the prevention of injury, morbidity, and mortality in child care…

  11. Creating Royal Australian Navy Standard Operating Procedures using Flow Diagrams

    DTIC Science & Technology

    2015-08-01

    DST-Group-TR-3137 (UNCLASSIFIED). Indexed excerpts: "…steps to perform the activity. Object Management Group's (OMG) Business Process Model and Notation (BPMN) [10] is becoming the standard to use when…" Reference [10]: Object Management Group, Business Process Model and Notation (BPMN), version 2.0, 2011, Object Management Group.

  12. An Interlaboratory Evaluation of Drift Tube Ion Mobility–Mass Spectrometry Collision Cross Section Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stow, Sarah M.; Causon, Tim J.; Zheng, Xueyun

    Collision cross section (CCS) measurements resulting from ion mobility-mass spectrometry (IM-MS) experiments provide a promising orthogonal dimension of structural information in MS-based analytical separations. As with any molecular identifier, interlaboratory standardization must precede broad-range integration into analytical workflows. In this study, we present a reference drift tube ion mobility mass spectrometer (DTIM-MS) in which improvements in the measurement accuracy of experimental parameters influencing IM separations provide standardized drift tube, nitrogen CCS values (DTCCSN2) for over 120 unique ion species with the lowest measurement uncertainty to date. The reproducibility of these DTCCSN2 values is evaluated across three additional laboratories on a commercially available DTIM-MS instrument. The traditional stepped-field CCS method performs with a relative standard deviation (RSD) of 0.29% for all ion species across the three additional laboratories. The calibrated single-field CCS method, which is compatible with a wide range of chromatographic inlet systems, performs with an average absolute bias of 0.54% relative to the standardized stepped-field DTCCSN2 values on the reference system. The low RSD and biases observed in this interlaboratory study illustrate the potential of DTIM-MS for providing a molecular identifier for a broad range of discovery-based analyses.

  13. An adaptive software defined radio design based on a standard space telecommunication radio system API

    NASA Astrophysics Data System (ADS)

    Xiong, Wenhao; Tian, Xin; Chen, Genshe; Pham, Khanh; Blasch, Erik

    2017-05-01

    Software defined radio (SDR) has become a popular tool for the implementation and testing of communications performance. The advantages of the SDR approach include a re-configurable design, adaptive response to changing conditions, efficient development, and highly versatile implementation. In order to realize the benefits of SDR, the space telecommunication radio system (STRS) was proposed by NASA Glenn Research Center (GRC), along with a standard application program interface (API) structure. Each component of the system uses a well-defined API to communicate with other components. The benefit of a standard API is that it relaxes the platform limitations of each component, allowing additional options. For example, the waveform-generating process can be supported by a field programmable gate array (FPGA), a personal computer (PC), or an embedded system. As long as the API requirements are met, the selected waveform generator will work with the complete system. In this paper, we demonstrate the design and development of an adaptive SDR following the STRS and standard API protocol. We introduce, step by step, the SDR testbed system, including the controlling graphical user interface (GUI), database, GNU Radio hardware control, and universal software radio peripheral (USRP) transceiving front end. In addition, a performance evaluation is presented showing the effectiveness of the SDR approach for space telecommunication.
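
    The STRS idea of components talking only through a well-defined API can be sketched in a few lines. The interface below is illustrative, not the actual STRS API: any backend (PC, FPGA, or embedded) that implements the same methods can be dropped into the chain, which is the property a standard API is meant to provide.

```python
from abc import ABC, abstractmethod
import numpy as np

class WaveformComponent(ABC):
    """Illustrative stand-in for a standard API: any backend (PC, FPGA, embedded)
    that implements these methods can be swapped into the radio chain."""

    @abstractmethod
    def configure(self, sample_rate: float, center_freq: float) -> None: ...

    @abstractmethod
    def generate(self, num_samples: int) -> np.ndarray: ...

class PcToneWaveform(WaveformComponent):
    """Software backend producing a complex baseband tone."""
    def configure(self, sample_rate: float, center_freq: float) -> None:
        self.fs, self.fc = sample_rate, center_freq

    def generate(self, num_samples: int) -> np.ndarray:
        t = np.arange(num_samples) / self.fs
        return np.exp(2j * np.pi * self.fc * t)

# The controller only sees the interface, so the backend can change without code changes.
wf: WaveformComponent = PcToneWaveform()
wf.configure(sample_rate=1e6, center_freq=100e3)
samples = wf.generate(1024)
print(samples.shape, samples.dtype)
```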

  14. Association of MCAT scores obtained with standard vs extra administration time with medical school admission, medical student performance, and time to graduation.

    PubMed

    Searcy, Cynthia A; Dowd, Keith W; Hughes, Michael G; Baldwin, Sean; Pigg, Trey

    2015-06-09

    Individuals with documented disabilities may receive accommodations on the Medical College Admission Test (MCAT). Whether such accommodations are associated with MCAT scores, medical school admission, and medical school performance is unclear. To determine the comparability of MCAT scores obtained with standard vs extra administration time with respect to likelihood of acceptance to medical school and future medical student performance. Retrospective cohort study of applicants to US medical schools for the 2011-2013 entering classes who reported MCAT scores obtained with standard time (n = 133,962) vs extra time (n = 435), and of students who matriculated in US medical schools from 2000-2004 who reported MCAT scores obtained with standard time (n = 76,262) vs extra time (n = 449). Standard or extra administration time during MCAT. Primary outcome measures were acceptance rates at US medical schools and graduation rates within 4 or 5 years after matriculation. Secondary outcome measures were pass rates on the United States Medical Licensing Examination (USMLE) Step examinations and graduation rates within 6 to 8 years after matriculation. Acceptance rates were not significantly different for applicants who had MCAT scores obtained with standard vs extra time (44.5% [59,585/133,962] vs 43.9% [191/435]; difference, 0.6% [95% CI, -4.1 to 5.3]). Students who tested with extra time passed the Step examinations on first attempt at significantly lower rates (Step 1, 82.1% [344/419] vs 94.0% [70,188/74,668]; difference, 11.9% [95% CI, 9.6% to 14.2%]; Step 2 CK, 85.5% [349/408] vs 95.4% [70,476/73,866]; difference, 9.9% [95% CI, 7.8% to 11.9%]; Step 2 CS, 92.0% [288/313] vs 97.0% [60,039/61,882]; difference, 5.0% [95% CI, 3.1% to 6.9%]). They also graduated from medical school at significantly lower rates at different times (4 years, 67.2% [285/424] vs 86.1% [60,547/70,305]; difference, 18.9% [95% CI, 15.6% to 22.2%]; 5 years, 81.6% [346/424] vs 94.4% [66,369/70,305]; difference, 12.8% [95% CI, 10.6% to 15.0%]; 6 years, 85.4% [362/424] vs 95.8% [67,351/70,305]; difference, 10.4% [95% CI, 8.5% to 12.4%]; 7 years, 88.0% [373/424] vs 96.2% [67,639/70,305]; difference, 8.2% [95% CI, 6.4% to 10.1%]; 8 years, 88.4% [375/424] vs 96.5% [67,847/70,305]; difference, 8.1% [95% CI, 6.3% to 9.8%]). These differences remained after controlling for MCAT scores and undergraduate grade point averages. Among applicants to US medical schools, those with MCAT scores obtained with extra test administration time, compared with standard administration time, had no significant difference in rate of medical school admission but had lower rates of passing the USMLE Step examinations and of medical school graduation within 4 to 8 years after matriculation. These findings raise questions about the types of learning environments and support systems needed by students who test with extra time on the MCAT to enable them to succeed in medical school.
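
    The reported group differences are two-proportion comparisons. The sketch below recomputes the acceptance-rate difference with a normal-approximation 95% confidence interval; it closely, though not exactly, reproduces the reported 0.6% (95% CI, -4.1 to 5.3), since the authors' exact interval method is not stated and is assumed here.

```python
import math

def proportion_difference_ci(x1, n1, x2, n2, z=1.96):
    """Difference in proportions (p1 - p2) with a normal-approximation 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Acceptance counts from the abstract: standard time vs extra time.
diff, lo, hi = proportion_difference_ci(59585, 133962, 191, 435)
print(f"difference = {100*diff:.1f}% (95% CI, {100*lo:.1f}% to {100*hi:.1f}%)")
```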

  15. Standardized Methods to Generate Mock (Spiked) Clinical Specimens by Spiking Blood or Plasma with Cultured Pathogens

    PubMed Central

    Dong, Ming; Fisher, Carolyn; Añez, Germán; Rios, Maria; Nakhasi, Hira L.; Hobson, J. Peyton; Beanan, Maureen; Hockman, Donna; Grigorenko, Elena; Duncan, Robert

    2016-01-01

    Aims To demonstrate standardized methods for spiking pathogens into human matrices for evaluation and comparison among diagnostic platforms. Methods and Results This study presents detailed methods for spiking bacteria or protozoan parasites into whole blood and virus into plasma. Proper methods must start with a documented, reproducible pathogen source followed by steps that include standardized culture, preparation of cryopreserved aliquots, quantification of the aliquots by molecular methods, production of sufficient numbers of individual specimens and testing of the platform with multiple mock specimens. Results are presented following the described procedures that showed acceptable reproducibility comparing in-house real-time PCR assays to a commercially available multiplex molecular assay. Conclusions A step by step procedure has been described that can be followed by assay developers who are targeting low prevalence pathogens. Significance and Impact of Study The development of diagnostic platforms for detection of low prevalence pathogens such as biothreat or emerging agents is challenged by the lack of clinical specimens for performance evaluation. This deficit can be overcome using mock clinical specimens made by spiking cultured pathogens into human matrices. To facilitate evaluation and comparison among platforms, standardized methods must be followed in the preparation and application of spiked specimens. PMID:26835651

  16. Spatiotemporal and Kinematic Parameters Relating to Oriented Gait and Turn Performance in Patients with Chronic Stroke

    PubMed Central

    Bonnyaud, Céline; Pradon, Didier; Vuillerme, Nicolas; Bensmail, Djamel; Roche, Nicolas

    2015-01-01

    Background The timed up and go test (TUG) is a functional test which is increasingly used to evaluate patients with stroke. The outcome measured is usually global TUG performance-time. Assessment of spatiotemporal and kinematic parameters during the Oriented gait and Turn sub-tasks of the TUG would provide a better understanding of the mechanisms underlying patients’ performance and therefore may help to guide rehabilitation. The aim of this study was thus to determine the spatiotemporal and kinematic parameters which were most related to the walking and turning sub-tasks of TUG performance in stroke patients. Methods 29 stroke patients carried out the TUG test which was recorded using an optoelectronic system in two conditions: spontaneous and standardized condition (standardized foot position and instructed to turn towards the paretic side). They also underwent a clinical assessment. Stepwise regression was used to determine the parameters most related to Oriented gait and Turn sub-tasks. Relationships between explanatory parameters of Oriented gait and Turn performance and clinical scales were evaluated using Spearman correlations. Results Step length and cadence explained 82% to 95% of the variance for the walking sub-tasks in both conditions. Percentage single support phase and contralateral swing phase (depending on the condition) respectively explained 27% and 56% of the variance during the turning sub-task in the spontaneous and standardized conditions. Discussion and Conclusion Step length, cadence, percentage of paretic single support phase and non-paretic swing phase, as well as dynamic stability were the main parameters related to TUG performance and they should be targeted in rehabilitation. PMID:26091555

  17. Performance management of multiple access communication networks

    NASA Astrophysics Data System (ADS)

    Lee, Suk; Ray, Asok

    1993-12-01

    This paper focuses on the conceptual design, development, and implementation of a performance management tool for computer communication networks serving large-scale integrated systems. The objective is to improve network performance in handling various types of messages by on-line adjustment of protocol parameters. The techniques of perturbation analysis of Discrete Event Dynamic Systems (DEDS), stochastic approximation (SA), and learning automata have been used in formulating the performance management algorithm. The efficacy of the performance management tool has been demonstrated on a network testbed. The conceptual design presented in this paper offers a step toward bridging the gap between management standards and users' demands for efficient network operations, since most standards such as ISO (International Standards Organization) and IEEE address only the architecture, services, and interfaces for network management. The proposed concept of performance management can also be used as a general framework to assist in the design, operation, and management of various DEDS, such as computer-integrated manufacturing and battlefield C³ (Command, Control, and Communications).
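
    The abstract names stochastic approximation as one ingredient without giving the algorithm. The following generic Robbins-Monro sketch tunes a single protocol parameter from noisy gradient estimates (a stand-in for perturbation-analysis measurements on the running network); the cost function, gain sequence, and all numbers are illustrative assumptions, not the paper's algorithm.

```python
import random

def noisy_gradient(theta):
    """Stand-in for a perturbation-analysis estimate of d(cost)/d(theta)
    measured on the running network (here: quadratic cost with optimum at 3.0, plus noise)."""
    return 2.0 * (theta - 3.0) + random.gauss(0.0, 0.5)

def robbins_monro(theta0, iterations=200):
    theta = theta0
    for k in range(1, iterations + 1):
        gain = 1.0 / k                      # diminishing gain sequence a_k = 1/k
        theta -= gain * noisy_gradient(theta)
    return theta

random.seed(0)
print(robbins_monro(theta0=0.0))  # converges toward the (synthetic) optimum near 3.0
```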

  18. Evaluation of a continuous-rotation, high-speed scanning protocol for micro-computed tomography.

    PubMed

    Kerl, Hans Ulrich; Isaza, Cristina T; Boll, Hanne; Schambach, Sebastian J; Nolte, Ingo S; Groden, Christoph; Brockmann, Marc A

    2011-01-01

    Micro-computed tomography is used frequently in preclinical in vivo research. Limiting factors are radiation dose and long scan times. The purpose of the study was to compare a standard step-and-shoot to a continuous-rotation, high-speed scanning protocol. Micro-computed tomography of a lead grid phantom and a rat femur was performed using a step-and-shoot and a continuous-rotation protocol. Detail discriminability and image quality were assessed by 3 radiologists. The signal-to-noise ratio and the modulation transfer function were calculated, and volumetric analyses of the femur were performed. The radiation dose of the scan protocols was measured using thermoluminescence dosimeters. The 40-second continuous-rotation protocol allowed a detail discriminability comparable to the step-and-shoot protocol at significantly lower radiation doses. No marked differences in volumetric or qualitative analyses were observed. Continuous-rotation micro-computed tomography significantly reduces scanning time and radiation dose without relevantly reducing image quality compared with a normal step-and-shoot protocol.

  19. How Accurate Is Your Activity Tracker? A Comparative Study of Step Counts in Low-Intensity Physical Activities

    PubMed Central

    2017-01-01

    Background As commercially available activity trackers are being utilized in clinical trials, the research community remains uncertain about reliability of the trackers, particularly in studies that involve walking aids and low-intensity activities. While these trackers have been tested for reliability during walking and running activities, there has been limited research on validating them during low-intensity activities and walking with assistive tools. Objective The aim of this study was to (1) determine the accuracy of 3 Fitbit devices (ie, Zip, One, and Flex) at different wearing positions (ie, pants pocket, chest, and wrist) during walking at 3 different speeds, 2.5, 5, and 8 km/h, performed by healthy adults on a treadmill; (2) determine the accuracy of the mentioned trackers worn at different sites during activities of daily living; and (3) examine whether intensity of physical activity (PA) impacts the choice of optimal wearing site of the tracker. Methods We recruited 15 healthy young adults to perform 6 PAs while wearing 3 Fitbit devices (ie, Zip, One, and Flex) on their chest, pants pocket, and wrist. The activities include walking at 2.5, 5, and 8 km/h, pushing a shopping cart, walking with aid of a walker, and eating while sitting. We compared the number of steps counted by each tracker with gold standard numbers. We performed multiple statistical analyses to compute descriptive statistics (ie, ANOVA test), intraclass correlation coefficient (ICC), mean absolute error rate, and correlation by comparing the tracker-recorded data with that of the gold standard. Results All the 3 trackers demonstrated good-to-excellent (ICC>0.75) correlation with the gold standard step counts during treadmill experiments. The correlation was poor (ICC<0.60), and the error rate was significantly higher in walker experiment compared to other activities. There was no significant difference between the trackers and the gold standard in the shopping cart experiment. The wrist worn tracker, Flex, counted several steps when eating (P<.01). The chest tracker was identified as the most promising site to capture steps in more intense activities, while the wrist was the optimal wearing site in less intense activities. Conclusions This feasibility study focused on 6 PAs and demonstrated that Fitbit trackers were most accurate when walking on a treadmill and least accurate during walking with a walking aid and for low-intensity activities. This may suggest excluding participants with assistive devices from studies that focus on PA interventions using commercially available trackers. This study also indicates that the wearing site of the tracker is an important factor impacting the accuracy performance. A larger scale study with a more diverse population, various activity tracker vendors, and a larger activity set are warranted to generalize our results. PMID:28801304
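
    As a rough illustration of the agreement metrics named in the Methods (mean absolute error rate and correlation against a gold standard), the sketch below uses made-up step counts, not the study's data; the intraclass correlation coefficient reported in the study requires a variance-components model and is not reproduced here.

    ```python
    import numpy as np

    # Made-up step counts for six trials: manually counted gold standard vs. tracker output.
    gold = np.array([520, 498, 610, 305, 455, 580], dtype=float)
    tracker = np.array([515, 470, 605, 220, 440, 575], dtype=float)

    error_rate = np.abs(tracker - gold) / gold          # per-trial absolute error rate
    mean_error_rate = 100 * error_rate.mean()           # mean absolute error rate, in percent
    pearson_r = np.corrcoef(tracker, gold)[0, 1]        # simple linear agreement measure

    print(f"mean absolute error rate: {mean_error_rate:.1f}%")
    print(f"Pearson r vs. gold standard: {pearson_r:.3f}")
    ```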

  20. Algorithm-enabled partial-angular-scan configurations for dual-energy CT.

    PubMed

    Chen, Buxin; Zhang, Zheng; Xia, Dan; Sidky, Emil Y; Pan, Xiaochuan

    2018-05-01

    We seek to investigate an optimization-based one-step method for image reconstruction that explicitly compensates for nonlinear spectral response (i.e., the beam-hardening effect) in dual-energy CT, to investigate the feasibility of the one-step method for enabling two dual-energy partial-angular-scan configurations, referred to as the short- and half-scan configurations, on standard CT scanners without involving additional hardware, and to investigate the potential of the short- and half-scan configurations in reducing imaging dose and scan time in a single-kVp-switch full-scan configuration in which two full rotations are made for collection of dual-energy data. We use the one-step method to reconstruct images directly from dual-energy data through solving a nonconvex optimization program that specifies the images to be reconstructed in dual-energy CT. Dual-energy full-scan data are generated from numerical phantoms and collected from physical phantoms with the standard single-kVp-switch full-scan configuration, whereas dual-energy short- and half-scan data are extracted from the corresponding full-scan data. Besides visual inspection and profile-plot comparison, the reconstructed images are also analyzed in quantitative studies based upon tasks of linear-attenuation-coefficient and material-concentration estimation and of material differentiation. After performing a computer-simulation study to verify that the one-step method can accurately reconstruct basis and monochromatic images of numerical phantoms, we reconstruct basis and monochromatic images by using the one-step method from real data of physical phantoms collected with the full-, short-, and half-scan configurations. Subjective inspection based upon visualization and profile-plot comparison reveals that monochromatic images, which are used often in practical applications, reconstructed from the full-, short-, and half-scan data are largely visually comparable except for some differences in texture details. Moreover, quantitative studies based upon tasks of linear-attenuation-coefficient and material-concentration estimation and of material differentiation indicate that the short- and half-scan configurations yield results in close agreement with the ground-truth information and with those of the full-scan configuration. The one-step method considered can compensate effectively for the nonlinear spectral response in full- and partial-angular-scan dual-energy CT. It can be exploited for enabling partial-angular-scan configurations on standard CT scanners without involving additional hardware. Visual inspection and quantitative studies reveal that, with the one-step method, the partial-angular-scan configurations considered can perform at a level comparable to that of the full-scan configuration, thus suggesting the potential of the two partial-angular-scan configurations in reducing imaging dose and scan time in the standard single-kVp-switch full-scan CT in which two full rotations are performed. The work also yields insights into the investigation and design of other nonstandard scan configurations of potential practical significance in dual-energy CT. © 2018 American Association of Physicists in Medicine.
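
    For orientation, the nonlinear spectral response that a one-step method must model can be written, for a generic two-basis-material decomposition (a textbook form, not necessarily the exact parameterization used by the authors), as

    $$
    g_s[\ell] = -\ln \int \mathrm{d}E \; q_{s,\ell}(E)\, \exp\!\Big(-\sum_{k=1}^{2} \mu_k(E) \int_{\ell} b_k(\vec{r})\, \mathrm{d}r\Big),
    $$

    where $g_s[\ell]$ is the log-normalized measurement for ray $\ell$ at spectrum $s$ (low or high kVp), $q_{s,\ell}(E)$ is the normalized spectral response, $\mu_k(E)$ are the basis attenuation functions, and $b_k$ are the basis images that the optimization solves for directly; the exponential nested inside the logarithm is what makes the data model nonlinear in $b_k$.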

  1. Cross-platform evaluation of commercial real-time SYBR green RT-PCR kits for sensitive and rapid detection of European bat Lyssavirus type 1.

    PubMed

    Picard-Meyer, Evelyne; Peytavin de Garam, Carine; Schereffer, Jean Luc; Marchal, Clotilde; Robardet, Emmanuelle; Cliquet, Florence

    2015-01-01

    This study evaluates the performance of five two-step SYBR Green RT-qPCR kits and five one-step SYBR Green qRT-PCR kits using real-time PCR assays. Two real-time thermocyclers showing different throughput capacities were used. The analysed performance evaluation criteria included the generation of the standard curve, reaction efficiency, analytical sensitivity, intra- and interassay repeatability as well as the costs and the practicability of kits, and thermocycling times. We found that the optimised one-step PCR assays had a higher detection sensitivity than the optimised two-step assays regardless of the machine used, while no difference was detected in reaction efficiency, R² values, and intra- and interreproducibility between the two methods. The limit of detection at the 95% confidence level ranged from 15 to 981 copies/µL for the one-step kits and from 41 to 171 copies/µL for the two-step kits. Of the ten kits tested, the most efficient kit was the Quantitect SYBR Green qRT-PCR with a limit of detection at the 95% confidence level of 20 and 22 copies/µL on the thermocyclers Rotor gene Q MDx and MX3005P, respectively. The study demonstrated the pivotal influence of the thermocycler on PCR performance for the detection of rabies RNA, as well as that of the master mixes.
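
    Standard-curve slope and reaction efficiency, two of the criteria compared across kits, follow from routine arithmetic; the sketch below uses an illustrative dilution series, not the study's data.

    ```python
    import numpy as np

    # Illustrative ten-fold dilution series: input copy number (copies/µL) and measured Cq.
    copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2, 1e1])
    cq = np.array([15.1, 18.5, 21.8, 25.2, 28.6, 32.0])

    slope, intercept = np.polyfit(np.log10(copies), cq, 1)   # standard curve: Cq vs log10(copies)
    efficiency = 10 ** (-1.0 / slope) - 1.0                  # 100% efficiency corresponds to a slope of about -3.32
    r_squared = np.corrcoef(np.log10(copies), cq)[0, 1] ** 2

    print(f"slope = {slope:.2f}, efficiency = {100 * efficiency:.0f}%, R² = {r_squared:.3f}")
    ```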

  2. Cross-Platform Evaluation of Commercial Real-Time SYBR Green RT-PCR Kits for Sensitive and Rapid Detection of European Bat Lyssavirus Type 1

    PubMed Central

    Picard-Meyer, Evelyne; Peytavin de Garam, Carine; Schereffer, Jean Luc; Marchal, Clotilde; Robardet, Emmanuelle; Cliquet, Florence

    2015-01-01

    This study evaluates the performance of five two-step SYBR Green RT-qPCR kits and five one-step SYBR Green qRT-PCR kits using real-time PCR assays. Two real-time thermocyclers showing different throughput capacities were used. The analysed performance evaluation criteria included the generation of the standard curve, reaction efficiency, analytical sensitivity, intra- and interassay repeatability as well as the costs and the practicability of kits, and thermocycling times. We found that the optimised one-step PCR assays had a higher detection sensitivity than the optimised two-step assays regardless of the machine used, while no difference was detected in reaction efficiency, R² values, and intra- and interreproducibility between the two methods. The limit of detection at the 95% confidence level ranged from 15 to 981 copies/µL for the one-step kits and from 41 to 171 copies/µL for the two-step kits. Of the ten kits tested, the most efficient kit was the Quantitect SYBR Green qRT-PCR with a limit of detection at the 95% confidence level of 20 and 22 copies/µL on the thermocyclers Rotor gene Q MDx and MX3005P, respectively. The study demonstrated the pivotal influence of the thermocycler on PCR performance for the detection of rabies RNA, as well as that of the master mixes. PMID:25785274

  3. Observation of Stronger-than-Binary Correlations with Entangled Photonic Qutrits

    NASA Astrophysics Data System (ADS)

    Hu, Xiao-Min; Liu, Bi-Heng; Guo, Yu; Xiang, Guo-Yong; Huang, Yun-Feng; Li, Chuan-Feng; Guo, Guang-Can; Kleinmann, Matthias; Vértesi, Tamás; Cabello, Adán

    2018-05-01

    We present the first experimental confirmation of the quantum-mechanical prediction of stronger-than-binary correlations. These are correlations that cannot be explained under the assumption that the occurrence of a particular outcome of an (n ≥ 3)-outcome measurement is due to a two-step process in which, in the first step, some classical mechanism precludes n − 2 of the outcomes and, in the second step, a binary measurement generates the outcome. Our experiment uses pairs of photonic qutrits distributed between two laboratories, where randomly chosen three-outcome measurements are performed. We report a violation by 9.3 standard deviations of the optimal inequality for nonsignaling binary correlations.

  4. On the Performance Evaluation of 3D Reconstruction Techniques from a Sequence of Images

    NASA Astrophysics Data System (ADS)

    Eid, Ahmed; Farag, Aly

    2005-12-01

    The performance evaluation of 3D reconstruction techniques is not a simple problem to solve. This is not only due to the increased dimensionality of the problem but also due to the lack of standardized and widely accepted testing methodologies. This paper presents a unified framework for the performance evaluation of different 3D reconstruction techniques. This framework includes a general problem formalization, different measuring criteria, and a classification method as a first step in standardizing the evaluation process. Performance characterization of two standard 3D reconstruction techniques, stereo and space carving, is also presented. The evaluation is performed on the same data set using an image reprojection testing methodology to reduce the dimensionality of the evaluation domain. Also, different measuring strategies are presented and applied to the stereo and space carving techniques. These measuring strategies have shown consistent results in quantifying the performance of these techniques. Additional experiments are performed on the space carving technique to study the effect of the number of input images and the camera pose on its performance.
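
    The image-reprojection testing methodology mentioned here compares reconstructed geometry with the original images in 2D. A minimal sketch of that idea, using a synthetic pinhole camera and made-up points rather than the paper's data set:

    ```python
    import numpy as np

    def project(points_3d, K, R, t):
        """Pinhole projection of Nx3 world points with intrinsics K and pose (R, t)."""
        cam = points_3d @ R.T + t          # world -> camera coordinates
        pix = cam @ K.T                    # apply intrinsics
        return pix[:, :2] / pix[:, 2:3]    # perspective divide -> Nx2 pixel coordinates

    # Synthetic camera and reconstructed points (illustrative only).
    K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
    R, t = np.eye(3), np.array([0.0, 0.0, 5.0])
    reconstructed = np.array([[0.0, 0.0, 0.0], [0.5, -0.2, 0.1], [-0.3, 0.4, -0.2]])
    observed = project(reconstructed + 0.005 * np.random.randn(3, 3), K, R, t)  # "measured" pixels

    residuals = np.linalg.norm(project(reconstructed, K, R, t) - observed, axis=1)
    print(f"RMS reprojection error: {np.sqrt((residuals ** 2).mean()):.3f} px")
    ```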

  5. Sensitivity assessment of sea lice to chemotherapeutants: Current bioassays and best practices.

    PubMed

    Marín, S L; Mancilla, J; Hausdorf, M A; Bouchard, D; Tudor, M S; Kane, F

    2017-12-18

    Traditional bioassays are still necessary to test sensitivity of sea lice species to chemotherapeutants, but the methodology applied by the different scientists has varied over time with respect to that proposed in "Sea lice resistance to chemotherapeutants: A handbook in resistance management" (2006). These divergences motivated the organization of a workshop during the Sea Lice 2016 conference, "Standardization of traditional bioassay process by sharing best practices." The attendees agreed to update the handbook. The objective of this article is to provide a baseline analysis of the methodology for traditional bioassays and to identify procedures that need to be addressed to standardize the protocol. The methodology was divided into the following steps: bioassay design; material and equipment; sea lice collection, transportation and laboratory reception; preparation of dilution; parasite exposure; response evaluation; data analysis; and reporting. Information from the presentations of the workshop, and also from other studies, allowed for the identification of procedures within a given step that need to be standardized, as they were reported to be performed differently by the different working groups. Bioassay design and response evaluation were the targeted steps where more procedures need to be analysed and agreed upon. © 2017 John Wiley & Sons Ltd.

  6. Recursive regularization step for high-order lattice Boltzmann methods

    NASA Astrophysics Data System (ADS)

    Coreixas, Christophe; Wissocq, Gauthier; Puigt, Guillaume; Boussuge, Jean-François; Sagaut, Pierre

    2017-09-01

    A lattice Boltzmann method (LBM) with enhanced stability and accuracy is presented for various Hermite tensor-based lattice structures. The collision operator relies on a regularization step, which is here improved through a recursive computation of nonequilibrium Hermite polynomial coefficients. In addition to the reduced computational cost of this procedure with respect to the standard one, the recursive step considerably enhances the stability and accuracy of the numerical scheme by properly filtering out second- (and higher-) order nonhydrodynamic contributions in under-resolved conditions. This is first shown in the isothermal case where the simulation of the doubly periodic shear layer is performed with a Reynolds number ranging from 10⁴ to 10⁶, and where a thorough analysis of the case at Re = 3 × 10⁴ is conducted. In the latter, results obtained using both regularization steps are compared against the Bhatnagar-Gross-Krook LBM for standard (D2Q9) and high-order (D2V17 and D2V37) lattice structures, confirming the tremendous increase in the stability range of the proposed approach. Further comparisons on thermal and fully compressible flows, using the general extension of this procedure, are then conducted through the numerical simulation of Sod shock tubes with the D2V37 lattice. They confirm the stability increase induced by the recursive approach as compared with the standard one.
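
    For context, the standard (non-recursive) second-order regularization step for an isothermal lattice such as D2Q9 can be written as follows (a textbook form; the paper's recursive procedure generalizes it by reconstructing higher-order nonequilibrium Hermite coefficients from the lower-order ones):

    $$
    f_i^{\mathrm{reg}} = f_i^{\mathrm{eq}} + \frac{w_i}{2 c_s^4}\,\big(\boldsymbol{c}_i \boldsymbol{c}_i - c_s^2 \mathbf{I}\big) : \boldsymbol{\Pi}^{(1)},
    \qquad
    \boldsymbol{\Pi}^{(1)} = \sum_j \big(f_j - f_j^{\mathrm{eq}}\big)\, \boldsymbol{c}_j \boldsymbol{c}_j ,
    $$

    where $w_i$ are the lattice weights, $c_s$ the lattice sound speed, and $\boldsymbol{c}_i$ the discrete velocities; the collision then relaxes $f_i^{\mathrm{reg}}$ toward $f_i^{\mathrm{eq}}$ instead of acting on the raw populations.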

  7. Contribution of diagnostic tests for the etiological assessment of uveitis, data from the ULISSE study (Uveitis: Clinical and medicoeconomic evaluation of a standardized strategy of the etiological diagnosis).

    PubMed

    Grumet, Pierre; Kodjikian, Laurent; de Parisot, Audrey; Errera, Marie-Hélène; Sedira, Neila; Heron, Emmanuel; Pérard, Laurent; Cornut, Pierre-Loïc; Schneider, Christelle; Rivière, Sophie; Ollé, Priscille; Pugnet, Grégory; Cathébras, Pascal; Manoli, Pierre; Bodaghi, Bahram; Saadoun, David; Baillif, Stéphanie; Tieulie, Nathalie; Andre, Marc; Chiambaretta, Frédéric; Bonin, Nicolas; Bielefeld, Philip; Bron, Alain; Mouriaux, Frédéric; Bienvenu, Boris; Vicente, Stéphanie; Bin, Sylvie; Labetoulle, Marc; Broussolle, Christiane; Jamilloux, Yvan; Decullier, Evelyne; Sève, Pascal

    2018-04-01

    ULISSE is the only study that prospectively assessed the efficiency of a standardized strategy compared to an open strategy for the etiologic diagnosis of uveitis. Our aim was to evaluate the diagnostic yield of the tests prescribed in the ULISSE study to clarify their relevance. ULISSE is a non-inferiority, prospective, multicenter and cluster randomized study. The standardized strategy is a two-step strategy: in the first step, common standard tests were performed, and in the second step, tests were guided by the clinical and anatomic type of uveitis. We reported the relevance of the diagnostic tests used in the standardized strategy, as well as the profitability of the tests that were prescribed to more than twenty patients in each group. Based on diagnostic criteria, either an ophthalmologist or an internist established the profitability of a test by considering whether the test led to a diagnosis or not. Among the 676 patients included (standardized 303; open 373), a diagnosis was made for 152 (50.4%) in the standardized group and 203 (54.4%) in the open group. The most common entities were HLA-B27 associated uveitis (22%), spondyloarthritis (11%), sarcoidosis (18%), tuberculosis (10.7%) and herpes virus infections (8.5%). Among the first step's systematic tests, the tuberculin skin test was the most contributive investigation (17.1%), followed by chest X-ray (8.4%), C reactive protein and ESR (6.6% and 5.1%), complete blood count (2.2%) and VDRL (2.0%). The second step's most often contributive tests were: HLA B27 (56.3%), chest-CT (30.3%) and angiotensin converting enzyme (ACE) (16.5%). HLA B27 and ACE were significantly more contributive in the standardized group than in the open group. Immunological tests were never contributive. Among the free investigations, or among the investigations guided by clinical or paraclinical findings, the most often contributive tests were: Quantiferon® (24%), electrophoresis of serum protein (7.8%) and sacroiliac imagery (46.4%). Intracellular serologies (1.7%), serum calcium (2.1%) and hepatic tests (3.3%) were exceptionally contributive. Among the third intention tests, labial salivary gland biopsies were contributive in 17.9% of cases, but the profitability of other invasive investigations (anterior chamber tap, vitrectomy, bronchoscopy and lumbar puncture) or specialized imagery (18F-FDG PET, brain MRI) could not be determined since these tests were rarely performed. Only a few diagnostic tests are useful for the etiological assessment of uveitis. They are often cheap, simple, more often guided by the clinical findings, and lead to an etiological diagnosis in most patients. On the other hand, some tests are never or exceptionally contributive, such as immunological tests or intracellular serologies. Further studies are required to evaluate the profitability of third intention imagery and invasive investigations. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Step-Climbing Power Wheelchairs: A Literature Review

    PubMed Central

    Sundaram, S. Andrea; Wang, Hongwu; Ding, Dan

    2017-01-01

    Background: Power wheelchairs capable of overcoming environmental barriers, such as uneven terrain, curbs, or stairs, have been under development for more than a decade. Method: We conducted a systematic review of the scientific and engineering literature to identify these devices, and we provide brief descriptions of the mechanism and method of operation for each. We also present data comparing their capabilities in terms of step climbing and standard wheelchair functions. Results: We found that all the devices presented allow for traversal of obstacles that cannot be accomplished with traditional power wheelchairs, but the slow speeds and small wheel diameters of some designs make them only moderately effective in the basic area of efficient transport over level ground and the size and configuration of some others limit maneuverability in tight spaces. Conclusion: We propose that safety and performance test methods more comprehensive than the International Organization for Standards (ISO) testing protocols be developed for measuring the capabilities of advanced wheelchairs with step-climbing and other environment-negotiating features to allow comparison of their clinical effectiveness. PMID:29339886

  9. Standardization of vascular assessment of erectile dysfunction: standard operating procedures for duplex ultrasound.

    PubMed

    Sikka, Suresh C; Hellstrom, Wayne J G; Brock, Gerald; Morales, Antonio Martin

    2013-01-01

    In-office evaluation of erectile dysfunction by color duplex Doppler ultrasound (CDDU) may benefit the decision-making process in regard to choosing the most appropriate therapy. Unfortunately, there is no uniform standardization in performing CDDU, resulting in high variability in data expression and interpretation when comparing results among various centers, especially when conducting multicenter trials. Establishing standard operating procedures (SOPs) is a major step that will help minimize such variability. This SOP describes the CDDU procedure with a focus on establishing uniformity and normative parameters. Measure intra-arterial diameter, peak systolic velocity, end-diastolic velocity, and resistive index for each cavernosal artery. After an initial discussion with the patient about his history and the International Index of Erectile Function evaluation, describe the procedural steps to the patient. Perform the CDDU in a relaxed state, scanning the entire penis (in B-mode image) using a 7.5- to 12-MHz linear array ultrasound probe. An intracorporal injection of a single or combination of vasoactive agents (e.g., prostaglandin E1, phentolamine, and papaverine) is then administered and CDDU performed at various time points, preferably with audiovisual sexual stimulation (AVSS). Monitor penile erection response (tumescence and rigidity) near peak blood flow. Self-stimulation or AVSS with the patient left alone in the room, or redosing, may be considered to decrease anxiety and help achieve a maximally rigid erection. Considering the complexity and heterogeneity of CDDU evaluation, this communication will help in standardization and establish uniformity in such data interpretation. When indicated, invasive diagnostic testing involving (i) penile angiography and (ii) cavernosography/cavernosometry to establish veno-occlusive dysfunction may be recommended to facilitate further treatment options. © 2012 International Society for Sexual Medicine.

  10. Pretraining and posttraining assessment of residents' performance in the fourth accreditation council for graduate medical education competency: patient communication skills.

    PubMed

    Chandawarkar, Rajiv Y; Ruscher, Kimberly A; Krajewski, Aleksandra; Garg, Manish; Pfeiffer, Carol; Singh, Rekha; Longo, Walter E; Kozol, Robert A; Lesnikoski, Beth; Nadkarni, Prakash

    2011-08-01

    Structured communication curricula will improve surgical residents' ability to communicate effectively with patients. A prospective study approved by the institutional review board involved 44 University of Connecticut general surgery residents. Residents initially completed a written baseline survey to assess general communication skills awareness. In step 1 of the study, residents were randomized to 1 of 2 simulations using standardized patient instructors to mimic patients receiving a diagnosis of either breast or rectal cancer. The standardized patient instructors scored residents' communication skills using a case-specific content checklist and the Master Interview Rating Scale. In step 2 of the study, residents attended a 3-part interactive program that comprised (1) principles of patient communication; (2) experiences of a surgeon (role as physician, patient, and patient's spouse); and (3) role-playing (3-resident groups played patient, physician, and observer roles and rated their own performance). In step 3, residents were retested as in step 1, using a crossover case design. Scores were analyzed using the Wilcoxon signed rank test with a Bonferroni correction. Case-specific performance improved significantly, from a pretest content checklist median score of 8.5 (65%) to a posttest median of 11.0 (84%) (P = .005 by the Wilcoxon signed rank test for paired ordinal data) (n = 44). Median Master Interview Rating Scale scores changed from 58.0 before testing (P = .10) to 61.5 after testing (P = .94). Differences between overall rectal cancer scores and breast cancer scores were also not significant. Patient communication skills need to be taught as part of residency training. With limited training, case-specific skills (herein, involving patients with cancer) are likely to improve more than general communication skills.
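
    The paired, Bonferroni-corrected comparison described here is straightforward to reproduce in outline; the scores below are made up, not the study's data, and the number of comparisons is only an example.

    ```python
    from scipy.stats import wilcoxon

    # Made-up paired checklist scores, one value per resident, before and after training.
    pre = [8, 9, 7, 10, 8, 9, 6, 11, 8, 7, 9, 10]
    post = [11, 12, 9, 12, 10, 11, 9, 13, 10, 9, 12, 11]

    stat, p = wilcoxon(pre, post)      # paired, non-parametric test on ordinal scores
    n_comparisons = 2                  # e.g., content checklist and interview rating scale
    p_bonferroni = min(1.0, p * n_comparisons)

    print(f"Wilcoxon W = {stat}, raw p = {p:.4f}, Bonferroni-adjusted p = {p_bonferroni:.4f}")
    ```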

  11. Symplectic molecular dynamics simulations on specially designed parallel computers.

    PubMed

    Borstnik, Urban; Janezic, Dusanka

    2005-01-01

    We have developed a computer program for molecular dynamics (MD) simulation that implements the Split Integration Symplectic Method (SISM) and is designed to run on specialized parallel computers. The MD integration is performed by the SISM, which analytically treats high-frequency vibrational motion and thus enables the use of longer simulation time steps. The low-frequency motion is treated numerically on specially designed parallel computers, which decreases the computational time of each simulation time step. The combination of these approaches means that fewer steps are needed and each step takes less time, which together enable fast MD simulations. We study the computational performance of MD simulation of molecular systems on specialized computers and provide a comparison to standard personal computers. The combination of the SISM with two specialized parallel computers is an effective way to increase the speed of MD simulations up to 16-fold over a single PC processor.
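
    The splitting idea behind the SISM, propagating the stiff harmonic part analytically and applying the remaining slow force as kicks, can be sketched for a single degree of freedom as below; this is an illustration of the splitting only, not the actual SISM implementation or its parallel decomposition.

    ```python
    import math

    OMEGA = 50.0   # stiff harmonic frequency (fast motion, propagated analytically)
    EPS = 0.2      # strength of a slow anharmonic perturbation V = EPS * x**4 / 4

    def slow_force(x):
        return -EPS * x ** 3

    def split_step(x, v, dt):
        """One kick / analytic-rotation / kick step of the splitting integrator."""
        v += 0.5 * dt * slow_force(x)                        # half kick from the slow force
        c, s = math.cos(OMEGA * dt), math.sin(OMEGA * dt)    # exact harmonic propagation
        x, v = x * c + (v / OMEGA) * s, -x * OMEGA * s + v * c
        v += 0.5 * dt * slow_force(x)                        # second half kick
        return x, v

    def energy(x, v):
        return 0.5 * v ** 2 + 0.5 * OMEGA ** 2 * x ** 2 + 0.25 * EPS * x ** 4

    x, v = 1.0, 0.0
    e0 = energy(x, v)
    dt = 0.05                      # step set by the slow force, not by the fast period 2*pi/OMEGA ≈ 0.126
    for _ in range(10000):
        x, v = split_step(x, v, dt)
    print(f"relative energy drift after 10000 steps: {abs(energy(x, v) - e0) / e0:.2e}")
    ```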

  12. Transient effects of harsh luminous conditions on the visual performance of aviators in a civil aircraft cockpit.

    PubMed

    Yang, Biao; Lin, Yandan; Sun, Yaojie

    2013-03-01

    The aim of this work was to examine how harsh luminous conditions in a cockpit, such as lightning in a thunderstorm or direct sunlight immediately after an aircraft passes through clouds, may affect the visual performance of pilots, and how to improve it. Such lighting conditions can result in the temporary visual impairment of aviators, which may greatly increase the risk of accidents. Tests were carried out in a full-scale simulator cockpit in which two kinds of dynamic lighting scenes, namely pulse changed and step changed lighting, were used to represent harsh luminous conditions. Visual acuity (VA), reaction time (RT) and identification accuracy (IA) were recorded as dependent variables. Data analysis results indicate that standardized VA values decreased significantly in both pulsing and step conditions in comparison with the dark condition. Standardized RT values increased significantly in the step condition; on the contrary, less reaction time was observed in the pulsing condition. Such effects could be reduced by an ambient illumination provided by a fluorescent lamp in both conditions. The results are to be used as a principle for optimizing lighting design with a thunderstorm light. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  13. Capturing the imagination of nurse executives in tracking the quality of nursing care.

    PubMed

    Kurtzman, Ellen T; Jennings, Bonnie M

    2008-01-01

    Nurses represent the single largest healthcare profession in the United States. A growing evidence base demonstrates nursing's direct influence on inpatient safety and healthcare outcomes. Support for nursing's essential role in quality and patient safety and mounting interest in publicly reporting performance results have led to efforts to standardize nursing-sensitive performance measures. To this end, in 2004, the National Quality Forum endorsed a set of 15 nursing-sensitive consensus standards intended for use by the public in assessing inpatient nursing care. However, until recently, only anecdotal knowledge existed regarding the implementation of these consensus standards. As a step toward better understanding the interest in and adoption of nursing performance measures, the Robert Wood Johnson Foundation funded a study that concluded in March 2007. In this article, findings from the study are summarized as they apply to nursing leadership, along with implications for the future role of the nurse executive.

  14. Validation of Helicopter Gear Condition Indicators Using Seeded Fault Tests

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula; Brandon, E. Bruce

    2013-01-01

    A "seeded fault test" in support of a rotorcraft condition based maintenance program (CBM), is an experiment in which a component is tested with a known fault while health monitoring data is collected. These tests are performed at operating conditions comparable to operating conditions the component would be exposed to while installed on the aircraft. Performance of seeded fault tests is one method used to provide evidence that a Health Usage Monitoring System (HUMS) can replace current maintenance practices required for aircraft airworthiness. Actual in-service experience of the HUMS detecting a component fault is another validation method. This paper will discuss a hybrid validation approach that combines in service-data with seeded fault tests. For this approach, existing in-service HUMS flight data from a naturally occurring component fault will be used to define a component seeded fault test. An example, using spiral bevel gears as the targeted component, will be presented. Since the U.S. Army has begun to develop standards for using seeded fault tests for HUMS validation, the hybrid approach will be mapped to the steps defined within their Aeronautical Design Standard Handbook for CBM. This paper will step through their defined processes, and identify additional steps that may be required when using component test rig fault tests to demonstrate helicopter CI performance. The discussion within this paper will provide the reader with a better appreciation for the challenges faced when defining a seeded fault test for HUMS validation.

  15. Using cognitive task analysis to create a teaching protocol for bovine dystocia.

    PubMed

    Read, Emma K; Baillie, Sarah

    2013-01-01

    When learning skilled techniques and procedures, students face many challenges. Learning is easier when detailed instructions are available, but experts often find it difficult to articulate all of the steps involved in a task or relate to the learner as a novice. This problem is further compounded when the technique is internal and unsighted (e.g., obstetrical procedures). Using expert bovine practitioners and a life-size model cow and calf, the steps and decision making involved in performing correction of two different dystocia presentations (anterior leg back and breech) were deconstructed using cognitive task analysis (CTA). Video cameras were positioned to capture movement inside and outside the cow model while the experts were asked to first perform the technique as they would in a real situation and then perform the procedure again as if articulating the steps to a novice learner. The audio segments were transcribed and, together with the video components, analyzed to create a list of steps for each expert. Consensus was achieved between experts during individual interviews followed by a group discussion. A "gold standard" list or teaching protocol was created for each malpresentation. CTA was useful in defining the technical and cognitive steps required to both perform and teach the tasks effectively. Differences between experts highlight the need for consensus before teaching the skill. In addition, the study identified several different, yet effective, techniques and provided information that could allow experts to consider other approaches they might use when their own technique fails.

  16. 49 CFR 242.119 - Training.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...) necessary for learning transfer; and (iii) A statement of the standards by which proficiency is measured... and related steps the employee learning the job shall be able to perform; (ii) A statement of the... part. (i) If ownership of a railroad is being transferred from one company to another, the conductor(s...

  17. 49 CFR 242.119 - Training.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...) necessary for learning transfer; and (iii) A statement of the standards by which proficiency is measured... and related steps the employee learning the job shall be able to perform; (ii) A statement of the... part. (i) If ownership of a railroad is being transferred from one company to another, the conductor(s...

  18. 49 CFR 242.119 - Training.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) necessary for learning transfer; and (iii) A statement of the standards by which proficiency is measured... and related steps the employee learning the job shall be able to perform; (ii) A statement of the... part. (i) If ownership of a railroad is being transferred from one company to another, the conductor(s...

  19. Basic Emergency Medical Technician Skills Manual.

    ERIC Educational Resources Information Center

    Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.

    This manual was developed to help students preparing to become emergency medical technicians (EMTs) learn standardized basic skills in the field. The manual itemizes the steps and performance criteria of each required skill and uses an accompanying videotape series (not included) to enhance the educational experience. The five units of the manual,…

  20. Development of a Content-Valid Standardized Orthopedic Assessment Tool (SOAT)

    ERIC Educational Resources Information Center

    Lafave, Mark; Katz, Larry; Butterwick, Dale

    2008-01-01

    Content validation of an instrument that measures student performance in OSCE-type practical examinations is a critical step in a tool's overall validity and reliability [Hopkins (1998), "Educational and Psychological Measurement and Evaluation" (8th ed.). Toronto: Allyn & Bacon]. The purpose of the paper is to outline the process…

  1. Principal Evaluation: Standards, Rubrics, and Tools for Effective Performance

    ERIC Educational Resources Information Center

    Stronge, James H.; Xu, Xianxuan; Leeper, Lauri M.; Tonneson, Virginia C.

    2013-01-01

    Effective principals run effective schools--this much we know. Accurately measuring principal effectiveness, however, has long been an elusive goal for school administrators. In this indispensable book, author James H. Stronge details the steps and resources necessary for designing a comprehensive principal evaluation system that is based on sound…

  2. Place-Based Investigations and Authentic Inquiry

    ERIC Educational Resources Information Center

    Sarkar, Somnath; Frazier, Richard

    2008-01-01

    Although many science students perform hands-on activities as inquiry exercises, such activities sometimes remain disconnected in the student's mind and fail to nurture a deeper understanding of methods of science and the role these methods play in scientific inquiry. Students may be able to reiterate the steps of the standard "scientific…

  3. A new one-step procedure for pulmonary valve implantation of the melody valve: Simultaneous prestenting and valve implantation.

    PubMed

    Boudjemline, Younes

    2018-01-01

    To describe a new modification, the one-step procedure, that allows interventionists to pre-stent and implant a Melody valve simultaneously. Percutaneous pulmonary valve implantation (PPVI) is the standard of care for managing patients with a dysfunctional right ventricular outflow tract, and the approach is standardized. Patients undergoing PPVI using the one-step procedure were identified in our database. Procedural data and radiation exposure were compared to those in a matched group of patients who underwent PPVI using the conventional two-step procedure. Between January 2016 and January 2017, PPVI was performed in 27 patients (median age/range, 19.1/10-55 years) using the one-step procedure involving manual crimping of one to three bare metal stents over the Melody valve. The stent and Melody valve were delivered successfully using the Ensemble delivery system. No complications occurred. All patients had excellent hemodynamic results (median/range post-PPVI right ventricular to pulmonary artery gradient, 9/0-20 mmHg). Valve function was excellent. Median procedural and fluoroscopic times were 56 and 10.2 min, respectively, which significantly differed from those of the two-step procedure group. Similarly, the dose area product (DAP) and radiation time were statistically lower in the one-step group than in the two-step group (P < 0.001 for all variables). After a median follow-up of 8 months (range, 3-14.7), no patient underwent reintervention, and no device dysfunction was observed. The one-step procedure is a safe modification that allows interventionists to prestent and implant the Melody valve simultaneously. It significantly reduces procedural and fluoroscopic times and radiation exposure. © 2017 Wiley Periodicals, Inc.

  4. [Endoscopic Approach to the Quadrilateral Plate (EAQUAL): a New Endoscopic Approach for Plate Osteosynthesis of the Pelvic Ring and Acetabulum - a Cadaver Study].

    PubMed

    Trulson, Alexander; Küper, Markus Alexander; Trulson, Inga Maria; Minarski, Christian; Grünwald, Leonard; Hirt, Bernhard; Stöckle, Ulrich; Stuby, Fabian

    2018-06-14

    Dislocated pelvic fractures which require surgical repair are usually operated on via open surgery. Approach-related morbidity is reported with a frequency of up to 30%. The aim of this anatomical study was to prove the feasibility of endoscopic visualisation of the relevant anatomical structures in pelvic surgery and to perform completely endoscopic plate osteosynthesis of the acetabulum with available standard laparoscopic instruments. In four human cadavers, we established an endoscopic preparation of the complete pelvic ring, from the symphysis to the iliosacral joint, including the quadrilateral plate and the sciatic nerve, and performed endoscopic plate osteosynthesis along the iliopectineal line. The endoscopic preparation of the complete pelvic ring and the quadrilateral plate was demonstrated step-by-step, followed by completely endoscopic plate osteosynthesis along the pelvic brim. Endoscopic, radiographic, and schematic pictures are used to illustrate the technique. The completely endoscopic preparation of the pelvic brim and the quadrilateral plate is feasible with available standard laparoscopic instruments. Moreover, plate osteosynthesis could be performed endoscopically. Further research on reduction techniques is necessary when planning to implement this technique into a clinical scenario. Georg Thieme Verlag KG Stuttgart · New York.

  5. Relative dosimetrical verification in high dose rate brachytherapy using two-dimensional detector array IMatriXX

    PubMed Central

    Manikandan, A.; Biplab, Sarkar; David, Perianayagam A.; Holla, R.; Vivek, T. R.; Sujatha, N.

    2011-01-01

    For high dose rate (HDR) brachytherapy, independent treatment verification is needed to ensure that the treatment is performed as per prescription. This study demonstrates dosimetric quality assurance of the HDR brachytherapy using a commercially available two-dimensional ion chamber array called IMatriXX, which has a detector separation of 0.7619 cm. The reference isodose length, step size, and source dwell positional accuracy were verified. A total of 24 dwell positions, which were verified for positional accuracy gave a total error (systematic and random) of –0.45 mm, with a standard deviation of 1.01 mm and maximum error of 1.8 mm. Using a step size of 5 mm, reference isodose length (the length of 100% isodose line) was verified for single and multiple catheters of same and different source loadings. An error ≤1 mm was measured in 57% of tests analyzed. Step size verification for 2, 3, 4, and 5 cm was performed and 70% of the step size errors were below 1 mm, with maximum of 1.2 mm. The step size ≤1 cm could not be verified by the IMatriXX as it could not resolve the peaks in dose profile. PMID:21897562

  6. Methodology for the preliminary design of high performance schools in hot and humid climates

    NASA Astrophysics Data System (ADS)

    Im, Piljae

    A methodology to develop an easy-to-use toolkit for the preliminary design of high performance schools in hot and humid climates was presented. The toolkit proposed in this research will allow decision makers without simulation knowledge to easily and accurately evaluate energy efficient measures for K-5 schools, which would contribute to the accelerated dissemination of energy efficient design. For the development of the toolkit, first, a survey was performed to identify high performance measures available today and being implemented in new K-5 school buildings. Then an existing case-study school building in a hot and humid climate was selected and analyzed to understand the energy use pattern in a school building and to be used in developing a calibrated simulation. Based on the information from the previous step, an as-built and calibrated simulation was then developed. To accomplish this, five calibration steps were performed to match the simulation results with the measured energy use. The five steps include: (1) Using an actual 2006 weather file with measured solar radiation, (2) Modifying lighting & equipment schedule using ASHRAE's RP-1093 methods, (3) Using actual equipment performance curves (i.e., scroll chiller), (4) Using Winkelmann's method for the underground floor heat transfer, and (5) Modifying the HVAC and room setpoint temperature based on the measured field data. Next, the calibrated simulation of the case-study K-5 school was compared to an ASHRAE Standard 90.1-1999 code-compliant school. In the next step, the energy savings potentials from the application of several high performance measures to an equivalent ASHRAE Standard 90.1-1999 code-compliant school were analyzed. The high performance measures applied included the recommendations from the ASHRAE Advanced Energy Design Guides (AEDG) for K-12 and other high performance measures from the literature review as well as a daylighting strategy and solar PV and thermal systems. The results show that the net energy consumption of the final high performance school with the solar thermal and a solar PV system would be 1,162.1 MMBtu, which corresponds to an EUI of 14.9 kBtu/sqft-yr. The calculated final energy and cost savings over the code compliant school are 68.2% and 69.9%, respectively. As a final step of the research, specifications for a simplified easy-to-use toolkit were then developed, along with a prototype screenshot of the toolkit. The toolkit is expected to be used by non-technical decision-makers to select and evaluate high performance measures for a new school building in terms of energy and cost savings in a quick and easy way.
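
    The reported totals can be cross-checked against the definition of EUI (annual site energy divided by conditioned floor area); the floor area below is simply back-solved from the abstract's figures and is not stated in the abstract.

    ```python
    annual_energy_mmbtu = 1162.1   # reported net energy use of the final high performance design
    eui_kbtu_per_sqft = 14.9       # reported energy use intensity

    annual_energy_kbtu = annual_energy_mmbtu * 1000.0             # 1 MMBtu = 1,000 kBtu
    implied_floor_area = annual_energy_kbtu / eui_kbtu_per_sqft   # EUI = energy / area, solved for area
    print(f"implied conditioned floor area ≈ {implied_floor_area:,.0f} sqft")  # about 78,000 sqft
    ```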

  7. Improved Reproducibility for Perovskite Solar Cells with 1 cm2 Active Area by a Modified Two-Step Process.

    PubMed

    Shen, Heping; Wu, Yiliang; Peng, Jun; Duong, The; Fu, Xiao; Barugkin, Chog; White, Thomas P; Weber, Klaus; Catchpole, Kylie R

    2017-02-22

    With rapid progress in recent years, organohalide perovskite solar cells (PSC) are promising candidates for a new generation of highly efficient thin-film photovoltaic technologies, for which up-scaling is an essential step toward commercialization. In this work, we propose a modified two-step method to deposit the CH₃NH₃PbI₃ (MAPbI₃) perovskite film that improves the uniformity, photovoltaic performance, and repeatability of large-area perovskite solar cells. This method is based on the commonly used two-step method, with one additional process involving treating the perovskite film with concentrated methylammonium iodide (MAI) solution. This additional treatment is proved to be helpful for tailoring the residual PbI₂ level to an optimal range that is favorable for both optical absorption and inhibition of recombination. Scanning electron microscopy and photoluminescence image analysis further reveal that, compared to the standard two-step and one-step methods, this method is very robust for achieving uniform and pinhole-free large-area films. This is validated by the photovoltaic performance of the prototype devices with an active area of 1 cm², where we achieved the champion efficiency of ∼14.5% and an average efficiency of ∼13.5%, with excellent reproducibility.

  8. How Accurate Is Your Activity Tracker? A Comparative Study of Step Counts in Low-Intensity Physical Activities.

    PubMed

    Alinia, Parastoo; Cain, Chris; Fallahzadeh, Ramin; Shahrokni, Armin; Cook, Diane; Ghasemzadeh, Hassan

    2017-08-11

    As commercially available activity trackers are being utilized in clinical trials, the research community remains uncertain about reliability of the trackers, particularly in studies that involve walking aids and low-intensity activities. While these trackers have been tested for reliability during walking and running activities, there has been limited research on validating them during low-intensity activities and walking with assistive tools. The aim of this study was to (1) determine the accuracy of 3 Fitbit devices (ie, Zip, One, and Flex) at different wearing positions (ie, pants pocket, chest, and wrist) during walking at 3 different speeds, 2.5, 5, and 8 km/h, performed by healthy adults on a treadmill; (2) determine the accuracy of the mentioned trackers worn at different sites during activities of daily living; and (3) examine whether intensity of physical activity (PA) impacts the choice of optimal wearing site of the tracker. We recruited 15 healthy young adults to perform 6 PAs while wearing 3 Fitbit devices (ie, Zip, One, and Flex) on their chest, pants pocket, and wrist. The activities include walking at 2.5, 5, and 8 km/h, pushing a shopping cart, walking with aid of a walker, and eating while sitting. We compared the number of steps counted by each tracker with gold standard numbers. We performed multiple statistical analyses to compute descriptive statistics (ie, ANOVA test), intraclass correlation coefficient (ICC), mean absolute error rate, and correlation by comparing the tracker-recorded data with that of the gold standard. All the 3 trackers demonstrated good-to-excellent (ICC>0.75) correlation with the gold standard step counts during treadmill experiments. The correlation was poor (ICC<0.60), and the error rate was significantly higher in walker experiment compared to other activities. There was no significant difference between the trackers and the gold standard in the shopping cart experiment. The wrist worn tracker, Flex, counted several steps when eating (P<.01). The chest tracker was identified as the most promising site to capture steps in more intense activities, while the wrist was the optimal wearing site in less intense activities. This feasibility study focused on 6 PAs and demonstrated that Fitbit trackers were most accurate when walking on a treadmill and least accurate during walking with a walking aid and for low-intensity activities. This may suggest excluding participants with assistive devices from studies that focus on PA interventions using commercially available trackers. This study also indicates that the wearing site of the tracker is an important factor impacting the accuracy performance. A larger scale study with a more diverse population, various activity tracker vendors, and a larger activity set are warranted to generalize our results. ©Parastoo Alinia, Chris Cain, Ramin Fallahzadeh, Armin Shahrokni, Diane Cook, Hassan Ghasemzadeh. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 11.08.2017.

  9. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-02-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the underlying data structure for such CBPS. The objective of the research effort is to develop guidance on how to design both the user interface and the underlying schema. This paper will describe the result and insights gained from the research activities conducted to date.
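
    As a rough illustration of the step-level data architecture described above, the snippet below parses one hypothetical step element; the element and attribute names are invented for this sketch and are not the schema developed at Idaho National Laboratory.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical procedure-step markup; the tag and attribute names are illustrative only.
    STEP_XML = """
    <step id="4.2" type="decision">
      <instruction>Verify pump discharge pressure is within limits.</instruction>
      <reference doc="P&amp;ID-123" sheet="2"/>
      <input kind="yes-no" prompt="Pressure between 90 and 110 psig?"/>
    </step>
    """

    def render_step(xml_text):
        """Extract the fields a computer-based procedure system would need to display one step."""
        step = ET.fromstring(xml_text)
        return {
            "id": step.get("id"),
            "type": step.get("type"),          # the attribute that drives the step's functionality
            "instruction": step.findtext("instruction"),
            "references": [r.get("doc") for r in step.findall("reference")],
            "inputs": [(i.get("kind"), i.get("prompt")) for i in step.findall("input")],
        }

    print(render_step(STEP_XML))
    ```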

  10. Simulations of laser undulators

    NASA Astrophysics Data System (ADS)

    Milton, S. V.; Biedron, S. B.; Einstein, J. E.

    2016-09-01

    We perform a series of single-pass, one-dimensional free-electron laser simulations based on an electron beam from a standard linear accelerator coupled with a so-called laser undulator, a specialized device that is more compact than a standard undulator based on magnetic materials. The longitudinal field profiles of such laser undulators are intriguing, as one must, and can, tailor the profile to the needs of creating the virtual undulator. We present and discuss several results of recent simulations and our future steps.

  11. Application of single-step genomic evaluation for crossbred performance in pig.

    PubMed

    Xiang, T; Nielsen, B; Su, G; Legarra, A; Christensen, O F

    2016-03-01

    Crossbreeding is predominant and intensively used in commercial meat production systems, especially in poultry and swine. Genomic evaluation has been successfully applied for breeding within purebreds but also offers opportunities for selecting purebreds for crossbred performance by combining information from purebreds with information from crossbreds. However, it generally requires that all relevant animals are genotyped, which is costly and presently does not seem to be feasible in practice. Recently, a novel single-step BLUP method for genomic evaluation of both purebred and crossbred performance has been developed that can incorporate marker genotypes into a traditional animal model. This new method has not been validated in real data sets. In this study, we applied this single-step method to analyze data for the maternal trait of total number of piglets born in Danish Landrace, Yorkshire, and two-way crossbred pigs in different scenarios. The genetic correlation between purebred and crossbred performances was investigated first, and then the impact of (crossbred) genomic information on prediction reliability for crossbred performance was explored. The results confirm the existence of a moderate genetic correlation, and it was seen that the standard errors on the estimates were reduced when including genomic information. Models with marker information, especially crossbred genomic information, improved model-based reliabilities for crossbred performance of purebred boars and also improved the predictive ability for crossbred animals and, to some extent, reduced the bias of prediction. We conclude that the new single-step BLUP method is a good tool for the genetic evaluation of crossbred performance in purebred animals.
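
    For orientation, standard purebred single-step BLUP combines pedigree and genomic information through a relationship matrix $\mathbf{H}$ whose inverse takes the usual textbook form below; the crossbred extension evaluated in this study modifies this construction and is not reproduced here.

    $$
    \mathbf{H}^{-1} = \mathbf{A}^{-1} + \begin{bmatrix} \mathbf{0} & \mathbf{0} \\ \mathbf{0} & \mathbf{G}^{-1} - \mathbf{A}_{22}^{-1} \end{bmatrix},
    $$

    where $\mathbf{A}$ is the pedigree relationship matrix, $\mathbf{G}$ the genomic relationship matrix of the genotyped animals, and $\mathbf{A}_{22}$ the pedigree block corresponding to those animals.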

  12. Standardization, evaluation and early-phase method validation of an analytical scheme for batch-consistency N-glycosylation analysis of recombinant produced glycoproteins.

    PubMed

    Zietze, Stefan; Müller, Rainer H; Brecht, René

    2008-03-01

    In order to set up a batch-to-batch-consistency analytical scheme for N-glycosylation analysis, several sample preparation steps including enzyme digestions and fluorophore labelling and two HPLC-methods were established. The whole method scheme was standardized, evaluated and validated according to the requirements on analytical testing in early clinical drug development by usage of a recombinant produced reference glycoprotein (RGP). The standardization of the methods was performed by clearly defined standard operation procedures. During evaluation of the methods, the major interest was in the loss determination of oligosaccharides within the analytical scheme. Validation of the methods was performed with respect to specificity, linearity, repeatability, LOD and LOQ. Due to the fact that reference N-glycan standards were not available, a statistical approach was chosen to derive accuracy from the linearity data. After finishing the validation procedure, defined limits for method variability could be calculated and differences observed in consistency analysis could be separated into significant and incidental ones.

  13. Perceived difficulty of various steps of manual small incision cataract surgery among trainees in rural China.

    PubMed

    Huang, Wenyong; Ye, Ronghua; Huang, Shengsong; Wang, Decai; Wang, Lanhua; Liu, Bin; Friedman, David S; He, Mingguang; Liu, Yizhi; Congdon, Nathan G

    2013-01-01

    The perceived difficulty of steps of manual small incision cataract surgery among trainees in rural China was assessed. Cohort study. Fifty-two trainees at the end of a manual small incision cataract surgery training programme. Participants rated the difficulty of 14 surgical steps using a 5-point scale, 1 (very easy) to 5 (very difficult). Demographic and professional information was recorded for trainees. Mean ratings for surgical steps. Questionnaires were completed by 49 trainees (94.2%, median age 38 years, 8 [16.3%] women). Twenty six (53.1%) had performed ≤50 independent cataract surgeries prior to training. Trainees rated cortical aspiration (mean score ± standard deviation = 3.10 ± 1.14) the most difficult step, followed by wound construction (2.76 ± 1.08), nuclear prolapse into the anterior chamber (2.74 ± 1.23) and lens delivery (2.51 ± 1.08). Draping the surgical field (1.06 ± 0.242), anaesthetic block administration (1.14 ± 0.354) and thermal coagulation (1.18 ± 0.441) were rated easiest. In regression models, the score for cortical aspiration was significantly inversely associated with performing >50 independent manual small incision cataract surgery surgeries during training (P = 0.01), but not with age, gender, years of experience in an eye department or total number of cataract surgeries performed prior to training. Cortical aspiration, wound construction and nuclear prolapse pose the greatest challenge for trainees learning manual small incision cataract surgery, and should receive emphasis during training. Number of cases performed is the strongest predictor of perceived difficulty of key steps. © 2013 The Authors. Clinical and Experimental Ophthalmology © 2013 Royal Australian and New Zealand College of Ophthalmologists.

  14. Economics of recombinant antibody production processes at various scales: Industry-standard compared to continuous precipitation.

    PubMed

    Hammerschmidt, Nikolaus; Tscheliessnig, Anne; Sommer, Ralf; Helk, Bernhard; Jungbauer, Alois

    2014-06-01

    Standard industry processes for recombinant antibody production employ protein A affinity chromatography in combination with other chromatography steps and ultra-/diafiltration. This study compares a generic antibody production process with a recently developed purification process based on a series of selective precipitation steps. The new process makes two of the usual three chromatographic steps obsolete and can be performed in a continuous fashion. Cost of Goods (CoGs) analyses were done for: (i) a generic chromatography-based standard antibody purification; (ii) the continuous precipitation-based purification process coupled to a continuous perfusion production system; and (iii) a hybrid process, coupling the continuous purification process to an upstream batch process. The results of this economic analysis show that the precipitation-based process offers cost reductions at all stages of the life cycle of a therapeutic antibody (i.e., clinical phases I, II and III, as well as full commercial production). The savings in clinical phase production are largely attributed to the fact that expensive chromatographic resins are omitted. These economic analyses will help to determine the strategies that are best suited for small-scale production in parallel fashion, which is of importance for antibody production in non-privileged countries and for personalized medicine. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. The thermodynamic parameters of the step dissociation of L-phenylalanine in aqueous solution

    NASA Astrophysics Data System (ADS)

    Kochergina, L. A.; Emel'Yanov, A. V.; Krutova, O. N.; Gorboletova, G. G.

    2007-10-01

    The heats of interaction of L-phenylalanine with solutions of nitric acid and potassium and lithium hydroxides were determined calorimetrically at 288.15, 298.15, and 308.15 K and solution ionic strengths of 0.5, 0.75, and 1.0 in the presence of LiNO3 and KNO3. The standard thermodynamic characteristics (ΔrH°, ΔrG°, ΔrS°, and ΔrCp°) of acid-base interactions in aqueous solutions of L-phenylalanine were calculated. The influence of the concentration of background electrolytes and temperature on the heats of dissociation of L-phenylalanine was considered. A comparative analysis of the standard thermodynamic characteristics of step dissociation of L-phenylalanine and alanine was performed in terms of the modern concepts of the structure and physicochemical properties of these compounds and their solutions.
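
    The quantities listed above are connected by the standard thermodynamic relations sketched below (textbook identities, not results quoted from the paper):

```latex
% Standard relations between the reported quantities (K = stepwise dissociation constant):
\[
\Delta_r G^{\circ} = -RT\ln K,
\qquad
\Delta_r G^{\circ} = \Delta_r H^{\circ} - T\,\Delta_r S^{\circ},
\qquad
\Delta_r C_p^{\circ} = \left(\frac{\partial \Delta_r H^{\circ}}{\partial T}\right)_{p}
\]
```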

  16. Agribusiness Management. Competency Based Education Curriculum. Teacher's Guide.

    ERIC Educational Resources Information Center

    Long, Diana

    This publication is the Teacher's Guide for the competency based curriculum in agribusiness management for high school students in West Virginia. The purpose of the competency based education curriculum is to provide a set of West Virginia-validated agribusiness tasks, along with the steps needed to perform each task, the evaluation standards, and…

  17. Comprehensive School Reform with a Focus on Literacy

    ERIC Educational Resources Information Center

    Zyburt, Gina M.

    2010-01-01

    In recent years of Comprehensive School Reform (CSR), educators have begun to innovate and employ strategies that support teaching and learning by incorporating high standards and inspiring high performance. Unfortunately, student achievement is not increasing and the achievement gap continues to widen. The next step for schools is to…

  18. Value Added: Do New Teacher Evaluation Methods Make the Grade?

    ERIC Educational Resources Information Center

    Garrett, Kristi

    2011-01-01

    Measuring a teacher's effectiveness in quantifiable ways is a logical step in a society driven by the SMART goals (specific, measurable, attainable, relevant, and timely objectives) that pervade modern management. The idea of using student performance on standardized tests to judge a teacher's effectiveness picked up steam after the Obama…

  19. Analysis and design of a standardized control module for switching regulators

    NASA Astrophysics Data System (ADS)

    Lee, F. C.; Mahmoud, M. F.; Yu, Y.; Kolecki, J. C.

    1982-07-01

    Three basic switching regulators (buck, boost, and buck/boost) employing a multiloop standardized control module (SCM) were characterized by a common small-signal block diagram. Employing the unified model, regulator performance characteristics such as stability, audiosusceptibility, output impedance, and step-load transient response are analyzed, and key performance indices are expressed in simple analytical forms. More importantly, the performance characteristics of all three regulators are shown to share common properties due to the unique SCM control scheme, which nullifies the positive zero and provides adaptive compensation for the moving poles of the boost and buck/boost converters. This allows a simple unified design procedure to be devised for selecting the key SCM control parameters for an arbitrarily given power stage configuration and parameter values, such that all regulator performance specifications can be met and optimized concurrently in a single design attempt.
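
    For context, the positive (right-half-plane) zero that the SCM scheme is said to nullify appears in the control-to-output transfer function of the boost and buck/boost converters in continuous conduction; its textbook location for the boost stage is sketched below, where R is the load resistance, D the duty cycle and L the inductance (symbols taken from standard converter analysis, not defined in the abstract).

```latex
% Textbook location of the right-half-plane zero of the boost converter in CCM
% (R = load resistance, D = duty cycle, L = inductance; notation assumed):
\[
\omega_{\mathrm{RHP}} \;\approx\; \frac{R\,(1-D)^{2}}{L}
\]
```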

  20. Implicit integration methods for dislocation dynamics

    DOE PAGES

    Gardner, D. J.; Woodward, C. S.; Reynolds, D. R.; ...

    2015-01-20

    In dislocation dynamics simulations, strain hardening simulations require integrating stiff systems of ordinary differential equations in time with expensive force calculations, discontinuous topological events, and rapidly changing problem size. Current solvers in use often result in small time steps and long simulation times. Faster solvers may help dislocation dynamics simulations accumulate plastic strains at strain rates comparable to experimental observations. Here, this paper investigates the viability of high order implicit time integrators and robust nonlinear solvers to reduce simulation run times while maintaining the accuracy of the computed solution. In particular, implicit Runge-Kutta time integrators are explored as a way of providing greater accuracy over a larger time step than is typically done with the standard second-order trapezoidal method. In addition, both accelerated fixed point and Newton's method are investigated to provide fast and effective solves for the nonlinear systems that must be resolved within each time step. Results show that integrators of third order are the most effective, while accelerated fixed point and Newton's method both improve solver performance over the standard fixed point method used for the solution of the nonlinear systems.
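
    As a point of reference for the baseline mentioned above, a minimal Python sketch of one implicit trapezoidal step is given below, with the implicit equation resolved either by plain fixed-point iteration or by Newton's method with a finite-difference Jacobian. This only illustrates the standard method the paper compares against, not the dislocation dynamics solver itself; the test problem, step size and tolerances are invented for the example.

```python
import numpy as np

def trapezoidal_step(f, t, y, h, solver="newton", tol=1e-10, max_iter=50):
    """One implicit trapezoidal step: y_{n+1} = y_n + h/2*(f(t,y_n) + f(t+h,y_{n+1})).

    The nonlinear equation for y_{n+1} is resolved either by plain fixed-point
    iteration or by Newton's method with a finite-difference Jacobian."""
    fn = f(t, y)
    g = lambda z: y + 0.5 * h * (fn + f(t + h, z)) - z   # root of g gives y_{n+1}
    z = y + h * fn                                        # explicit Euler predictor

    for _ in range(max_iter):
        if solver == "fixed_point":
            z_new = y + 0.5 * h * (fn + f(t + h, z))
        else:                                             # Newton with FD Jacobian
            r = g(z)
            n = z.size
            J = np.empty((n, n))
            eps = 1e-8
            for j in range(n):
                dz = np.zeros(n); dz[j] = eps
                J[:, j] = (g(z + dz) - r) / eps
            z_new = z - np.linalg.solve(J, r)
        if np.linalg.norm(z_new - z) < tol:
            return z_new
        z = z_new
    return z

# Hypothetical stiff test problem: y' = -1000*(y - cos(t))
f = lambda t, y: -1000.0 * (y - np.cos(t))
y = np.array([1.0])
for k in range(10):
    y = trapezoidal_step(f, 0.01 * k, y, 0.01)
print(y)
```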

  1. Data encryption standard ASIC design and development report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Perry J.; Pierson, Lyndon George; Witzke, Edward L.

    2003-10-01

    This document describes the design, fabrication, and testing of the SNL Data Encryption Standard (DES) ASIC. This device was fabricated in Sandia's Microelectronics Development Laboratory using 0.6 µm CMOS technology. The SNL DES ASIC was modeled using VHDL, then simulated and synthesized using Synopsys, Inc. software, and finally IC layout was performed using Compass Design Automation's CAE tools. IC testing was performed by Sandia's Microelectronic Validation Department using an HP 82000 computer-aided test system. The device is a single integrated circuit, pipelined realization of DES encryption and decryption capable of throughputs greater than 6.5 Gb/s. Several enhancements accommodate ATM or IP network operation and performance scaling. This design is the latest step in the evolution of DES modules.

  2. Self-regulated learning and achievement by middle-school children.

    PubMed

    Sink, C A; Barnett, J E; Hixon, J E

    1991-12-01

    The relationship of self-regulated learning to the achievement test scores of 62 Grade 6 students was studied. Generally, the metacognitive and affective variables correlated significantly with teachers' grades and standardized test scores in mathematics, reading, and science. Planning and self-assessment significantly predicted the six measures of achievement. Step-wise multiple regression analyses using the metacognitive and affective variables largely indicate that students' and teachers' perceptions of scholastic ability and planning appear to be the most salient factors in predicting academic performance. The locus of control dimension had no utility in predicting classroom grades and performance on standardized measures of achievement. The implications of the findings for teaching and learning are discussed.

  3. [Lung function tests: the pneumologist and ambulatory care].

    PubMed

    Reis Ferreira, J M

    2004-01-01

    Lung function testing (LFT) has been standardized and greatly improved over the last three decades, but its relative complexity has driven the recent systematization and standardization of its use in the office and in primary care. In memory of Prof. António Couto and his outstanding role in the promotion of LFT in Portugal, this conference deals with the definition of office spirometry, its range of application, and the essential steps for performing it under acceptable conditions of quality and reproducibility. The role of the specialist in promoting this method, and in supporting those who perform spirometry, is regarded as an important requirement for the successful practical use of the technique in family practice and primary health care.

  4. Quality measurement and benchmarking of HPV vaccination services: a new approach.

    PubMed

    Maurici, Massimo; Paulon, Luca; Campolongo, Alessandra; Meleleo, Cristina; Carlino, Cristiana; Giordani, Alessandro; Perrelli, Fabrizio; Sgricia, Stefano; Ferrante, Maurizio; Franco, Elisabetta

    2014-01-01

    A new measurement process based upon a well-defined mathematical model was applied to evaluate the quality of human papillomavirus (HPV) vaccination centers in 3 of 12 Local Health Units (ASLs) within the Lazio Region of Italy. The quality aspects considered for evaluation were communicational efficiency, organizational efficiency and comfort. The overall maximum achievable value was 86.10%, while the HPV vaccination quality scores for ASL1, ASL2 and ASL3 were 73.07%, 71.08%, and 67.21%, respectively. With this new approach it is possible to represent the probabilistic reasoning of a stakeholder who evaluates the quality of a healthcare provider. All ASLs had margins for improvements and optimal quality results can be assessed in terms of better performance conditions, confirming the relationship between the resulting quality scores and HPV vaccination coverage. The measurement process was structured into three steps and involved four stakeholder categories: doctors, nurses, parents and vaccinated women. In Step 1, questionnaires were administered to collect different stakeholders' points of view (i.e., subjective data) that were elaborated to obtain the best and worst performance conditions when delivering a healthcare service. Step 2 of the process involved the gathering of performance data during the service delivery (i.e., objective data collection). Step 3 of the process involved the elaboration of all data: subjective data from step 1 are used to define a "standard" to test objective data from step 2. This entire process led to the creation of a set of scorecards. Benchmarking is presented as a result of the probabilistic meaning of the evaluated scores.

  5. Novel system for distant assessment of cataract surgical quality in rural China.

    PubMed

    Wang, Lanhua; Xu, Danping; Liu, Bin; Jin, Ling; Wang, Decai; He, Mingguang; Congdon, Nathan G; Huang, Wenyong

    2015-01-01

    This study aims to assess the quality of various steps of manual small incision cataract surgery, and predictors of quality, using video recordings. This was a retrospective study. Fifty-two trainees participated in a hands-on small incision cataract surgery training programme at rural Chinese hospitals. Trainees provided one video each, recorded by a tripod-mounted digital recorder, after completing a one-week theoretical course and hands-on training monitored by expert trainers. Videos were graded by two different experts, using a 4-point scale developed by the International Council of Ophthalmology for each of 12 surgical steps and six global factors. Grades ranged from 2 (worst) to 5 (best), with a score of 0 if the step was performed by trainers. The main outcome measure was the mean score for the performance of each cataract surgical step as rated by the graders. Videos and data were available for 49/52 trainees (94.2%, median age 38 years, 16.3% women and 77.5% completing > 50 training cases). The majority (53.1%, 26/49) had performed ≤ 50 cataract surgeries prior to training. Kappa was 0.57 to 0.98 for the steps (mean 0.85). The poorest-rated steps were draping the surgical field (mean ± standard deviation = 3.27 ± 0.78), hydro-dissection (3.88 ± 1.22) and wound closure (3.92 ± 1.03), and the top-rated steps were insertion of viscoelastic (4.96 ± 0.20) and anterior chamber entry (4.69 ± 0.74). In linear regression models, a higher total score was associated with younger age (P = 0.015) and with having performed >50 independent manual small incision cases (P = 0.039). More training should be given to preoperative draping, which is poorly performed and crucial in preventing infection. Surgical experience improves ratings. © 2015 Royal Australian and New Zealand College of Ophthalmologists.

  6. Promoting ADL independence in vulnerable, community-dwelling older adults: a pilot RCT comparing 3-Step Workout for Life versus resistance exercise

    PubMed Central

    Liu, Chiung-ju; Xu, Huiping; Keith, NiCole R; Clark, Daniel O

    2017-01-01

    Background Resistance exercise is effective to increase muscle strength for older adults; however, its effect on the outcome of activities of daily living is often limited. The purpose of this study was to examine whether 3-Step Workout for Life (which combines resistance exercise, functional exercise, and activities of daily living exercise) would be more beneficial than resistance exercise alone. Methods A single-blind randomized controlled trial was conducted. Fifty-two inactive, community-dwelling older adults (mean age =73 years) with muscle weakness and difficulty in activities of daily living were randomized to receive 3-Step Workout for Life or resistance exercise only. Participants in the 3-Step Workout for Life Group performed functional movements and selected activities of daily living at home in addition to resistance exercise. Participants in the Resistance Exercise Only Group performed resistance exercise only. Both groups were comparable in exercise intensity (moderate), duration (50–60 minutes each time for 10 weeks), and frequency (three times a week). Assessment of Motor and Process Skills, a standard performance test on activities of daily living, was administered at baseline, postintervention, and 6 months after intervention completion. Results At postintervention, the 3-Step Workout for Life Group showed improvement on the outcome measure (mean change from baseline =0.29, P=0.02), but the improvement was not greater than the Resistance Exercise Only Group (group mean difference =0.24, P=0.13). However, the Resistance Exercise Only Group showed a significant decline (mean change from baseline =−0.25, P=0.01) 6 months after the intervention completion. Meanwhile, the superior effect of 3-Step Workout for Life was observed (group mean difference =0.37, P<0.01). Conclusion Compared to resistance exercise alone, 3-Step Workout for Life improves the performance of activities of daily living and attenuates the disablement process in older adults. PMID:28769559

  7. STEP and STEPSPL: Computer programs for aerodynamic model structure determination and parameter estimation

    NASA Technical Reports Server (NTRS)

    Batterson, J. G.

    1986-01-01

    The successful parametric modeling of the aerodynamics of an airplane operating at high angles of attack or sideslip is performed in two phases. First, the aerodynamic model structure must be determined, and second, the associated aerodynamic parameters (stability and control derivatives) must be estimated for that model. The purpose of this paper is to document two versions of a stepwise regression computer program which were developed for the determination of airplane aerodynamic model structure and to provide two examples of their use on computer-generated data. References are provided for the application of the programs to real flight data. The two computer programs that are the subject of this report, STEP and STEPSPL, are written in FORTRAN IV (ANSI 1966) compatible with a CDC FTN4 compiler. Both programs are adaptations of a standard forward stepwise regression algorithm. The purpose of the adaptation is to facilitate the selection of an adequate mathematical model of the aerodynamic force and moment coefficients of an airplane from flight test data. The major difference between STEP and STEPSPL is in the basis for the model. The basis for the model in STEP is the standard polynomial Taylor's series expansion of the aerodynamic function about some steady-state trim condition. Program STEPSPL utilizes a set of spline basis functions.
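
    A minimal Python sketch of the forward stepwise selection idea that both programs adapt is shown below; it greedily adds the candidate regressor that most reduces the residual sum of squares. It is an illustration only, not a port of the FORTRAN code, and the synthetic data and the stopping rule (a fixed number of terms) are assumptions of the example.

```python
import numpy as np

def forward_stepwise(X, y, max_terms=None):
    """Greedy forward stepwise regression: at each step add the candidate
    regressor (column of X) that most reduces the residual sum of squares."""
    n, p = X.shape
    selected, remaining = [], list(range(p))
    max_terms = max_terms or p
    history = []
    while remaining and len(selected) < max_terms:
        best_j, best_rss = None, np.inf
        for j in remaining:
            cols = selected + [j]
            beta, rss, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            rss = rss[0] if rss.size else np.sum((y - X[:, cols] @ beta) ** 2)
            if rss < best_rss:
                best_j, best_rss = j, rss
        selected.append(best_j)
        remaining.remove(best_j)
        history.append((best_j, best_rss))
    return selected, history

# Hypothetical example: y depends on columns 0 and 2 only
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.1 * rng.normal(size=200)
print(forward_stepwise(X, y, max_terms=2)[0])   # expected: [0, 2]
```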

  8. Water cycle algorithm: A detailed standard code

    NASA Astrophysics Data System (ADS)

    Sadollah, Ali; Eskandar, Hadi; Lee, Ho Min; Yoo, Do Guen; Kim, Joong Hoon

    Inspired by the observation of the water cycle process and the movement of rivers and streams toward the sea, a population-based metaheuristic algorithm, the water cycle algorithm (WCA), has recently been proposed. Lately, an increasing number of WCA applications have appeared, and the WCA has been utilized in different optimization fields. This paper provides detailed open source code for the WCA, whose performance and efficiency have been demonstrated for solving optimization problems. The WCA has an interesting and simple concept, and this paper aims to use its source code to provide a step-by-step explanation of the process it follows.
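
    Since the paper's contribution is the published source code itself, a much-condensed Python sketch of the basic WCA loop is given below to convey the idea: streams flow toward rivers, rivers flow toward the sea, and an evaporation/raining step restores diversity. Population size, number of rivers, the evaporation threshold and the role-exchange-by-resorting shortcut are all simplifications assumed here; the published code is more detailed.

```python
import numpy as np

def wca(cost, dim, lb, ub, n_pop=50, n_sr=4, d_max=1e-5, max_iter=200, seed=0):
    """Minimal water cycle algorithm sketch: the best solution is the 'sea',
    the next n_sr-1 are 'rivers', the rest are 'streams' that flow toward
    their river (or the sea); evaporation + raining re-randomizes solutions."""
    rng = np.random.default_rng(seed)
    pop = lb + rng.random((n_pop, dim)) * (ub - lb)
    costs = np.array([cost(x) for x in pop])

    for it in range(max_iter):
        order = np.argsort(costs)
        pop, costs = pop[order], costs[order]
        sea, rivers, streams = 0, range(1, n_sr), range(n_sr, n_pop)

        # Assign each stream to the sea or a river, weighted by fitness
        flow_costs = costs[:n_sr]
        weights = flow_costs.max() - flow_costs + 1e-12
        weights /= weights.sum()
        assignment = rng.choice(n_sr, size=n_pop - n_sr, p=weights)

        # Streams flow toward their assigned river/sea; rivers flow to the sea
        for k, i in enumerate(streams):
            target = assignment[k]
            pop[i] += rng.random(dim) * 2.0 * (pop[target] - pop[i])
        for j in rivers:
            pop[j] += rng.random(dim) * 2.0 * (pop[sea] - pop[j])
        pop = np.clip(pop, lb, ub)
        costs = np.array([cost(x) for x in pop])

        # Role exchange via re-sorting (a shortcut for the stream/river swaps)
        order = np.argsort(costs)
        pop, costs = pop[order], costs[order]

        # Evaporation + raining: re-randomize rivers that get too close to the sea
        for j in range(1, n_sr):
            if np.linalg.norm(pop[sea] - pop[j]) < d_max:
                pop[j] = lb + rng.random(dim) * (ub - lb)
                costs[j] = cost(pop[j])
        d_max -= d_max / max_iter

    best = np.argmin(costs)
    return pop[best], costs[best]

# Example: minimize the sphere function in 5 dimensions
x_best, f_best = wca(lambda x: np.sum(x**2), dim=5, lb=-5.0, ub=5.0)
print(x_best, f_best)
```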

  9. Minimum Information about T Regulatory Cells: A Step toward Reproducibility and Standardization.

    PubMed

    Fuchs, Anke; Gliwiński, Mateusz; Grageda, Nathali; Spiering, Rachel; Abbas, Abul K; Appel, Silke; Bacchetta, Rosa; Battaglia, Manuela; Berglund, David; Blazar, Bruce; Bluestone, Jeffrey A; Bornhäuser, Martin; Ten Brinke, Anja; Brusko, Todd M; Cools, Nathalie; Cuturi, Maria Cristina; Geissler, Edward; Giannoukakis, Nick; Gołab, Karolina; Hafler, David A; van Ham, S Marieke; Hester, Joanna; Hippen, Keli; Di Ianni, Mauro; Ilic, Natasa; Isaacs, John; Issa, Fadi; Iwaszkiewicz-Grześ, Dorota; Jaeckel, Elmar; Joosten, Irma; Klatzmann, David; Koenen, Hans; van Kooten, Cees; Korsgren, Olle; Kretschmer, Karsten; Levings, Megan; Marek-Trzonkowska, Natalia Maria; Martinez-Llordella, Marc; Miljkovic, Djordje; Mills, Kingston H G; Miranda, Joana P; Piccirillo, Ciriaco A; Putnam, Amy L; Ritter, Thomas; Roncarolo, Maria Grazia; Sakaguchi, Shimon; Sánchez-Ramón, Silvia; Sawitzki, Birgit; Sofronic-Milosavljevic, Ljiljana; Sykes, Megan; Tang, Qizhi; Vives-Pi, Marta; Waldmann, Herman; Witkowski, Piotr; Wood, Kathryn J; Gregori, Silvia; Hilkens, Catharien M U; Lombardi, Giovanna; Lord, Phillip; Martinez-Caceres, Eva M; Trzonkowski, Piotr

    2017-01-01

    Cellular therapies with CD4+ T regulatory cells (Tregs) hold promise of efficacious treatment for a variety of autoimmune and allergic diseases as well as posttransplant complications. Nevertheless, current manufacturing of Tregs as a cellular medicinal product varies between laboratories, which in turn hampers precise comparison of results between studies. While the number of clinical trials testing Tregs is already substantial, it seems crucial to provide some standardized characteristics of Treg products in order to minimize this problem. We previously developed reporting guidelines called minimum information about tolerogenic antigen-presenting cells, which allow comparison between different preparations of tolerance-inducing antigen-presenting cells. Building on this experience, here we describe another set of minimum information, about Tregs (MITREG). It is important to note that MITREG does not dictate how investigators should generate or characterize Tregs, but it does require investigators to report their Treg data in a consistent and transparent manner. We hope this will therefore be a useful tool facilitating standardized reporting on the manufacturing of Tregs, either for research purposes or for clinical application. In this way, MITREG might also be an important step toward more standardized and reproducible testing of Treg preparations in clinical applications.

  10. Optical splitter design for telecommunication access networks with triple-play services

    NASA Astrophysics Data System (ADS)

    Agalliu, Rajdi; Burtscher, Catalina; Lucki, Michal; Seyringer, Dana

    2018-01-01

    In this paper, we present various designs of optical splitters for access networks such as GPON and XG-PON by ITU-T with triple-play services (i.e., data, voice and video). The presented designs represent a step forward, compared to the solutions recommended by the ITU, in terms of performance in transmission systems using WDM. The quality of performance is represented by the bit error rate and the Q-factor. Besides the standard splitter design, we propose a new length-optimized splitter design with a smaller waveguide core, providing some reduction of the non-uniformity of the power split between the output waveguides. The achieved splitting parameters are incorporated in the simulations of passive optical networks. For this purpose, the OptSim tool employing the Time Domain Split Step method was used.
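
    The two quality metrics quoted above are linked by the standard Gaussian-noise approximation shown below (a textbook relation, not a result of the paper):

```latex
% Standard Gaussian-noise relation between Q-factor and bit error rate:
\[
\mathrm{BER} \;=\; \tfrac{1}{2}\,\operatorname{erfc}\!\left(\frac{Q}{\sqrt{2}}\right)
\]
```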

  11. Evaluation of microtensile bond strength of self-etching adhesives on normal and caries-affected dentin.

    PubMed

    Shibata, Shizuma; Vieira, Luiz Clovis Cardoso; Baratieri, Luiz Narciso; Fu, Jiale; Hoshika, Shuhei; Matsuda, Yasuhiro; Sano, Hidehiko

    2016-01-01

    The purpose of this study was to evaluate the µTBS (microtensile bond strength) of currently available self-etching adhesives with an experimental self-etch adhesive in normal and caries-affected dentin, using a portable hardness measuring device, in order to standardize dentin Knoop hardness. Normal (ND) and caries-affected dentin (CAD) were obtained from twenty human molars with class II natural caries. The following adhesive systems were tested: Mega Bond (MB), a 2-step self-etching adhesive; MTB-200 (MTB), an experimental 1-step self-etching adhesive (1-SEA), and two commercially available one-step self-etching systems, G-Bond Plus (GB) and Adper Easy Bond (EB). MB-ND achieved the highest µTBS (p<0.05). The mean µTBS was statistically lower in CAD than in ND for all adhesives tested (p<0.05), and the 2-step self-etch adhesive achieved better overall performance than the 1-step self-etch adhesives.

  12. Task Analysis for Health Occupations. Cluster: Nursing. Occupation: Professional Nurse (Associate Degree). Education for Employment Task Lists.

    ERIC Educational Resources Information Center

    Lake County Area Vocational Center, Grayslake, IL.

    This document contains a task analysis for health occupations (professional nurse) in the nursing cluster. For each task listed, occupation, duty area, performance standard, steps, knowledge, attitudes, safety, equipment/supplies, source of analysis, and Illinois state goals for learning are listed. For the duty area of "providing therapeutic…

  13. Task Analysis for Health Occupations. Cluster: Nursing. Occupation: Home Health Aide. Education for Employment Task Lists.

    ERIC Educational Resources Information Center

    Lake County Area Vocational Center, Grayslake, IL.

    This document contains a task analysis for health occupations (home health aid) in the nursing cluster. For each task listed, occupation, duty area, performance standard, steps, knowledge, attitudes, safety, equipment/supplies, source of analysis, and Illinois state goals for learning are listed. For the duty area of "providing therapeutic…

  14. To amend the executive compensation provisions of the Emergency Economic Stabilization Act of 2008 to prohibit unreasonable and excessive compensation and compensation not based on performance standards.

    THOMAS, 111th Congress

    Rep. Grayson, Alan [D-FL-8]

    2009-03-23

    Latest action: Senate - 04/23/2009 - Read the second time. Placed on Senate Legislative Calendar under General Orders. Calendar No. 50. Bill status: Passed House.

  15. Stepping Stones: Five Ways to Increase Craftsmanship in the Art Room

    ERIC Educational Resources Information Center

    Balsley, Jessica

    2012-01-01

    Art educators consistently strive to coach and model good craftsmanship to their students. Sure, teachers can check to ensure students are understanding the art concepts, test them on the vocabulary or even assess students on their color mixing strategies. If these art standards are performed in a sloppy manner (i.e.: lacking craftsmanship),…

  16. The Role of Technology in Advancing Performance Standards in Science and Mathematics Learning.

    ERIC Educational Resources Information Center

    Quellmalz, Edys

    Technology permeates the lives of most Americans: voice mail, personal computers, and the ever-blinking VCR clock have become commonplace. In schools, it is creating educational opportunities at a dizzying pace and, within and beyond the classroom, it is providing unprecedented access to a universe of ideas and resources. As a next step, the…

  17. Exploring the Relationships Between USMLE Performance and Disciplinary Action in Practice: A Validity Study of Score Inferences From a Licensure Examination.

    PubMed

    Cuddy, Monica M; Young, Aaron; Gelman, Andrew; Swanson, David B; Johnson, David A; Dillon, Gerard F; Clauser, Brian E

    2017-12-01

    Physicians must pass the United States Medical Licensing Examination (USMLE) to obtain an unrestricted license to practice allopathic medicine in the United States. Little is known, however, about how well USMLE performance relates to physician behavior in practice, particularly conduct inconsistent with safe, effective patient care. The authors examined the extent to which USMLE scores relate to the odds of receiving a disciplinary action from a U.S. state medical board. Controlling for multiple factors, the authors used non-nested multilevel logistic regression analyses to estimate the relationships between scores and receiving an action. The sample included 164,725 physicians who graduated from U.S. MD-granting medical schools between 1994 and 2006. Physicians had a mean Step 1 score of 214 (standard deviation [SD] = 21) and a mean Step 2 Clinical Knowledge (CK) score of 213 (SD = 23). Of the physicians, 2,205 (1.3%) received at least one action. Physicians with higher Step 2 CK scores had lower odds of receiving an action. A 1-SD increase in Step 2 CK scores corresponded to a decrease in the chance of disciplinary action by roughly 25% (odds ratio = 0.75; 95% CI = 0.70-0.80). After accounting for Step 2 CK scores, Step 1 scores were unrelated to the odds of receiving an action. USMLE Step 2 CK scores provide useful information about the odds a physician will receive an official sanction for problematic practice behavior. These results provide validity evidence supporting current interpretation and use of Step 2 CK scores.
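
    The "decrease of roughly 25%" per standard deviation quoted above follows from the usual logistic-regression interpretation sketched below, where β is the coefficient on the standardized Step 2 CK score; the symbols are standard, not quantities reported in the abstract beyond the odds ratio itself.

```latex
% Odds ratio per one-standard-deviation increase in a standardized predictor
% (beta = logistic regression coefficient; notation assumed):
\[
\mathrm{OR}_{1\,\mathrm{SD}} = e^{\beta},
\qquad
\mathrm{OR} = 0.75 \;\Rightarrow\; \text{odds lower by } 1 - 0.75 = 25\%
\]
```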

  18. ECIRS (Endoscopic Combined Intrarenal Surgery) in the Galdakao-modified supine Valdivia position: a new life for percutaneous surgery?

    PubMed

    Cracco, Cecilia Maria; Scoffone, Cesare Marco

    2011-12-01

    Percutaneous nephrolithotomy (PNL) is still the gold-standard treatment for large and/or complex renal stones. Evolution of endoscopic instrumentation and innovations in surgical technique have improved its success rate and reduced perioperative morbidity. ECIRS (Endoscopic Combined IntraRenal Surgery) is a new way of performing PNL in a modified supine position, with combined antegrade and retrograde access to the renal cavities, exploiting the full array of endourologic equipment. ECIRS summarizes the main issues recently debated about PNL. The recent literature regarding supine PNL and ECIRS has been reviewed, namely regarding patient positioning, synergy between operators, procedures, instrumentation, accessories and diagnostic tools, step-by-step standardization along with versatility of the surgical sequence, minimization of radiation exposure, extension to particular and/or complex patients, and limitation of post-operative renal damage. Supine PNL and ECIRS are not superior to prone PNL in terms of urological results, but guarantee undeniable anesthesiological and management advantages for both patient and operators. In particular, ECIRS requires from the surgeon a permanent mental attitude toward synergy, standardized surgical steps, versatility and adherence to the ongoing clinical requirements. ECIRS can also be performed in particular cases, irrespective of age or body habitus. The use of flexible endoscopes during ECIRS contributes to minimizing radiation exposure, hemorrhagic risk and post-PNL renal damage. ECIRS may be considered an evolution of the PNL procedure. Its proposal has the merit of having triggered the critical analysis of the various PNL steps and of patient positioning, and of having transformed the old static PNL into an updated approach.

  19. Solar cell and module performance assessment based on indoor calibration methods

    NASA Astrophysics Data System (ADS)

    Bogus, K.

    A combined space/terrestrial solar cell test calibration method that requires five steps and can be performed indoors is described. The test conditions are designed to qualify the cell or module output data under standard illumination and temperature conditions. Measurements are made of the short-circuit current, the open-circuit voltage, the maximum power, the efficiency, and the spectral response. Standard sunlight must be replicated for both earth-surface and AM0 conditions; Xe lamps are normally used as the light source, with spectral measurements taken of the light. Cell and module spectral response are assayed by using monochromators and narrow band-pass monochromatic filters. Attention is required to define the performance characteristics of modules under partial shadowing. Error sources that may affect the measurements are discussed, as are previous cell performance testing and calibration methods and their effectiveness in comparison with the behavior of satellite solar power panels.
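
    For orientation, the quantities measured above combine into the usual cell/module figures of merit as shown below (standard photovoltaic definitions, with G the incident irradiance and A the device area; these symbols are assumed, not taken from the abstract).

```latex
% Standard photovoltaic figures of merit (G = incident irradiance, A = device area; assumed):
\[
\mathrm{FF} = \frac{P_{\max}}{V_{oc}\,I_{sc}},
\qquad
\eta = \frac{P_{\max}}{G\,A}
\]
```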

  20. Predictors for Long-Term Hip Survivorship Following Acetabular Fracture Surgery: Importance of Gap Compared with Step Displacement.

    PubMed

    Verbeek, Diederik O; van der List, Jelle P; Tissue, Camden M; Helfet, David L

    2018-06-06

    Historically, the greatest residual (gap or step) displacement is used to predict clinical outcome following acetabular fracture surgery. Gap and step displacement may, however, impact the outcome to different degrees. We assessed the individual relationship between gap or step displacement and hip survivorship and determined their independent association with conversion to total hip arthroplasty. Patients who had acetabular fracture fixation (from 1992 through 2014), follow-up of ≥2 years (or early conversion to total hip arthroplasty), and postoperative computed tomography (CT) scans were included. Of 227 patients, 55 (24.2%) had conversion to total hip arthroplasty at a mean follow-up (and standard deviation) of 8.7 ± 5.6 years. Residual gap and step displacement were measured using a standardized CT-based method, and assessors were blinded to the outcome. Kaplan-Meier survivorship curves for the hips were plotted and compared (log-rank test) using critical cutoff values for gap and step displacement. These values were identified using receiver operating characteristic curves. Multivariate analysis was performed to identify independent variables associated with conversion to total hip arthroplasty. Subgroup analysis was performed in younger patients (<50 years old). The critical CT cutoff value for total hip arthroplasty conversion was 5 mm for gap and 1 mm for step displacement. Hip survivorship at 10 years was 82.0% for patients with a gap of <5 mm compared with 56.5% for a gap of ≥5 mm (p < 0.001) and 80.0% for a step of <1.0 mm versus 65.5% for a step of ≥1.0 mm (p = 0.012). A gap of ≥5 mm (hazard ratio [HR], 2.3; p = 0.012) and an age of ≥50 years (HR, 4.2; p < 0.001) were independently associated with conversion to total hip arthroplasty in all patients. In the subgroup of younger patients, only a step of ≥1 mm (HR, 6.4; p = 0.017) was an independent factor for conversion to total hip arthroplasty. Residual gap and step displacement as measured on CT scans are both related to long-term hip survivorship, but step displacement (1 mm) is tolerated less than gap displacement (5 mm). Of the 2 types of displacement, only a large gap displacement (≥5 mm) was independently associated with conversion to total hip arthroplasty. In younger patients who had less articular impaction with smaller residual gaps, only step displacement (≥1 mm) appeared to be associated with this outcome. Prognostic Level III. See Instructions for Authors for a complete description of levels of evidence.
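
    A minimal Python sketch of the product-limit (Kaplan-Meier) estimate underlying the survivorship curves above is given below. The follow-up times and event indicators are purely hypothetical and are not the study data, and the published analysis additionally used log-rank tests and multivariate hazard-ratio models that are not reproduced here.

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times  : follow-up time for each hip (years)
    events : 1 if converted to total hip arthroplasty, 0 if censored."""
    times, events = np.asarray(times, float), np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv, s = [], 1.0
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)
        d = np.sum((times == t) & (events == 1))
        s *= 1.0 - d / at_risk
        surv.append((t, s))
    return surv

# Hypothetical illustration: two groups split at a 5 mm residual gap
gap_lt5_t = [2, 4, 6, 8, 10, 12]; gap_lt5_e = [0, 0, 1, 0, 0, 0]
gap_ge5_t = [1, 3, 5, 7, 9, 11];  gap_ge5_e = [1, 1, 0, 1, 0, 0]
print(kaplan_meier(gap_lt5_t, gap_lt5_e))
print(kaplan_meier(gap_ge5_t, gap_ge5_e))
```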

  1. Fast and scalable purification of a therapeutic full-length antibody based on process crystallization.

    PubMed

    Smejkal, Benjamin; Agrawal, Neeraj J; Helk, Bernhard; Schulz, Henk; Giffard, Marion; Mechelke, Matthias; Ortner, Franziska; Heckmeier, Philipp; Trout, Bernhardt L; Hekmat, Dariusch

    2013-09-01

    The potential of process crystallization for purification of a therapeutic monoclonal IgG1 antibody was studied. The purified antibody was crystallized in non-agitated micro-batch experiments for the first time. A direct crystallization from clarified CHO cell culture harvest was inhibited by high salt concentrations. The salt concentration of the harvest was reduced by a simple pretreatment step. The crystallization process from pretreated harvest was successfully transferred to stirred tanks and scaled-up from the mL-scale to the 1 L-scale for the first time. The crystallization yield after 24 h was 88-90%. A high purity of 98.5% was reached after a single recrystallization step. A 17-fold host cell protein reduction was achieved and DNA content was reduced below the detection limit. High biological activity of the therapeutic antibody was maintained during the crystallization, dissolving, and recrystallization steps. Crystallization was also performed with impure solutions from intermediate steps of a standard monoclonal antibody purification process. It was shown that process crystallization has a strong potential to replace Protein A chromatography. Fast dissolution of the crystals was possible. Furthermore, it was shown that crystallization can be used as a concentrating step and can replace several ultra-/diafiltration steps. Molecular modeling suggested that a negative electrostatic region with interspersed exposed hydrophobic residues on the Fv domain of this antibody is responsible for the high crystallization propensity. As a result, process crystallization, following the identification of highly crystallizable antibodies using molecular modeling tools, can be recognized as an efficient, scalable, fast, and inexpensive alternative to key steps of a standard purification process for therapeutic antibodies. Copyright © 2013 Wiley Periodicals, Inc.

  2. Boosting drug named entity recognition using an aggregate classifier.

    PubMed

    Korkontzelos, Ioannis; Piliouras, Dimitrios; Dowsey, Andrew W; Ananiadou, Sophia

    2015-10-01

    Drug named entity recognition (NER) is a critical step for complex biomedical NLP tasks such as the extraction of pharmacogenomic, pharmacodynamic and pharmacokinetic parameters. Large quantities of high-quality training data are almost always a prerequisite for employing supervised machine-learning techniques to achieve high classification performance. However, the human labour needed to produce and maintain such resources is a significant limitation. In this study, we improve the performance of drug NER without relying exclusively on manual annotations. We perform drug NER using either a small gold-standard corpus (120 abstracts) or no corpus at all. In our approach, we develop a voting system to combine a number of heterogeneous models, based on dictionary knowledge, gold-standard corpora and silver annotations, to enhance performance. To improve recall, we employed genetic programming to evolve 11 regular-expression patterns that capture common drug suffixes and used them as an extra means for recognition. Our approach uses a dictionary of drug names, i.e. DrugBank, a small manually annotated corpus, i.e. the pharmacokinetic corpus, and a part of the UKPMC database as raw biomedical text. Gold-standard and silver annotated data are used to train maximum entropy and multinomial logistic regression classifiers. Aggregating drug NER methods based on gold-standard annotations, dictionary knowledge and patterns improved the performance of models trained only on gold-standard annotations, achieving a maximum F-score of 95%. In addition, combining models trained on silver annotations, dictionary knowledge and patterns is shown to achieve performance comparable to that of models trained exclusively on gold-standard data. The main reason appears to be the morphological similarities shared among drug names. We conclude that gold-standard data are not a hard requirement for drug NER. Combining heterogeneous models built on dictionary knowledge can achieve classification performance similar or comparable to that of the best-performing model trained on gold-standard annotations. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
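
    A much-simplified Python sketch of the kind of voting step described above is shown below: token-level BIO labels from several heterogeneous models are merged by (optionally weighted) majority vote. The label scheme, example tokens and weights are assumptions of the sketch; the paper's aggregate classifier also incorporates dictionary lookups, evolved regular-expression patterns and trained maximum-entropy/logistic-regression models that are not reproduced here.

```python
from collections import Counter

def vote_ner_labels(predictions, weights=None):
    """Combine token-level BIO predictions from several NER models by
    (weighted) majority vote. `predictions` is a list of label sequences,
    one per model, all aligned to the same tokens."""
    weights = weights or [1.0] * len(predictions)
    n_tokens = len(predictions[0])
    voted = []
    for i in range(n_tokens):
        tally = Counter()
        for labels, w in zip(predictions, weights):
            tally[labels[i]] += w
        voted.append(tally.most_common(1)[0][0])
    return voted

# Hypothetical example: three models labelling the same five tokens
model_a = ["O", "B-DRUG", "I-DRUG", "O", "O"]
model_b = ["O", "B-DRUG", "O",      "O", "O"]
model_c = ["O", "B-DRUG", "I-DRUG", "O", "B-DRUG"]
print(vote_ner_labels([model_a, model_b, model_c]))
# -> ['O', 'B-DRUG', 'I-DRUG', 'O', 'O']
```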

  3. NASA and the challenge of ISDN: The role of satellites in an ISDN world

    NASA Technical Reports Server (NTRS)

    Byerly, Radford; Barnes, Frank; Codding, George; Hofgard, Jefferson

    1988-01-01

    To understand what role satellites may play in the Integrated Services Digital Network (ISDN), it is necessary to understand the concept of ISDN, including the key organizations involved, the current status of key standards recommendations, and domestic and international progress in the implementation of ISDN. Each of these areas is explained. The technical performance criteria for ISDN, current standards for satellites in ISDN, key players in the ISDN environment, and the steps that can be taken to encourage the application of satellites in ISDN are also summarized.

  4. Polishing performance of multiple-use silicone rubber-based polishing instruments with and without disinfection/sterilization.

    PubMed

    Heintze, Siegward Dietmar; Forjanic, Monika

    2008-10-01

    To evaluate the effect of multiple use of a three-step rubber-based polishing system on polishing performance, with and without a disinfection/sterilization protocol including prolonged (overnight) disinfection. The three-step polishing system Astropol was applied under a standardized contact pressure of 2 N to 320-grit pre-roughened flat composite specimens of Tetric EvoCeram for 10 seconds (F and P discs) and 30 seconds (HP disc), respectively. After each polishing step, surface gloss and roughness were measured with a glossmeter and an optical sensor (FRT MicroProf), respectively. Material loss of the composite specimens and polishing instruments was measured after each step with a high-precision digital scale. For all four variables (surface gloss, surface roughness, composite loss, loss of rubber material) the mean percentage of change compared to the reference was calculated. After only the first use, the instruments used without disinfection or sterilization showed a statistically significantly reduced polishing performance in all polishing steps compared to the reference (new polishing system) (t-test, P < 0.05). This loss of performance increased further with the second and third re-use. The third component (Astropol HP) was especially affected by the performance loss. By contrast, multiple use of the instruments subjected to prolonged disinfection did not result in reduced polishing performance. For the P disc, a statistically significant improvement of the polishing performance was observed throughout almost all multiple-use sessions (ANOVA, P < 0.05). The improved polishing performance was, however, accompanied by an increased loss of the silicone rubber material of the P and F polishing discs; the HP discs were not affected by this loss. Furthermore, particles of the rubber material also adhered to the composite. The polishing performance of the discs subjected only to the sterilization process was not statistically significantly different from that of the control group in terms of surface roughness, but the surface gloss was worse than that of the control group. No loss of rubber material or adherence to the composite was observed in this group.

  5. Science and the rules governing anti-doping violations.

    PubMed

    Bowers, Larry D

    2010-01-01

    The fight against the use of performance-enhancing drugs in sports has been under way for nearly 90 years. The formation of the World Anti-Doping Agency (WADA) in 1999 was a major event because an independent agency was entrusted with harmonization of the antidoping program. In addition to sports governing bodies, governments have endorsed WADA and its programs by signing a United Nations Educational, Scientific and Cultural Organization Convention on Doping. The first step in the harmonization process was the development of the World Anti-Doping Program. This program consisted of five documents - the Code, the International Standard for Testing, the International Standard for Laboratories, the Prohibited List, and the International Standard for Therapeutic Use Exemptions - which unified the approach of the international federations and national antidoping agencies in applying antidoping rules. For laboratory testing, the International Standard for Laboratories establishes the performance expectations for and competence of laboratories recognized by WADA, including accreditation under ISO/IEC 17025. The antidoping rules are adjudicated by arbitration through the internationally recognized Court of Arbitration for Sport.

  6. A real-time inverse quantised transform for multi-standard with dynamic resolution support

    NASA Astrophysics Data System (ADS)

    Sun, Chi-Chia; Lin, Chun-Ying; Zhang, Ce

    2016-06-01

    In this paper, a real-time configurable intellectual property (IP) core is presented for the image/video decoding process, compatible with the MPEG-4 Visual and H.264/AVC standards. The core performs the inverse quantised discrete cosine transform and the inverse quantised inverse integer transform using only shift and add operations. Meanwhile, COordinate Rotation DIgital Computer (CORDIC) iterations and compensation steps are adjustable in order to trade off video compression quality against the required data throughput. The implementations are embedded in the publicly available Xvid codec 1.2.2 for the MPEG-4 Visual standard and in the H.264/AVC reference software JM 16.1, where the experimental results show that the balance between computational complexity and video compression quality is retained. Finally, FPGA synthesised results show that the proposed IP core offers low hardware cost and also provides real-time performance for Full HD and 4K-2K video decoding.
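
    To illustrate the CORDIC iteration/compensation idea mentioned above, a floating-point Python sketch of CORDIC rotation is given below; the actual IP core works with fixed-point shift-and-add hardware, and its gain-compensation strategy is not described in the abstract, so the details here (iteration count, gain handling, test angle) are assumptions of the example.

```python
import math

def cordic_rotate(x, y, angle, n_iter=16):
    """Rotate (x, y) by `angle` radians using CORDIC shift-add iterations."""
    # Pre-computed elementary angles atan(2^-i) and the CORDIC gain compensation
    atans = [math.atan(2.0 ** -i) for i in range(n_iter)]
    K = 1.0
    for i in range(n_iter):
        K /= math.sqrt(1.0 + 2.0 ** (-2 * i))
    z = angle
    for i in range(n_iter):
        d = 1.0 if z >= 0 else -1.0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * atans[i]
    return K * x, K * y   # compensation step: undo the accumulated CORDIC gain

# Rotating (1, 0) by 30 degrees should give approximately (cos 30°, sin 30°)
print(cordic_rotate(1.0, 0.0, math.radians(30)))   # ~ (0.866, 0.5)
```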

  7. Electronic Procedures for Medical Operations

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Electronic procedures are replacing text-based documents for recording the steps in performing medical operations aboard the International Space Station. S&K Aerospace, LLC, has developed a content-based electronic system, built on the Extensible Markup Language (XML) standard, that separates text from formatting standards and tags the items contained in procedures so they can be recognized by other electronic systems. For example, to change a standard format, the electronic procedures are changed in a single batch process, and the entire body of procedures takes on the new format. Procedures can be quickly searched to determine which are affected by software and hardware changes. Similarly, procedures are easily shared with other electronic systems. The system also enables real-time data capture and automatic bookmarking of current procedure steps. In Phase II of the project, S&K Aerospace developed a Procedure Representation Language (PRL) and tools to support the creation and maintenance of electronic procedures for medical operations. The goal is to develop these tools in such a way that new advances can be inserted easily, leading to an eventual medical decision support system.
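
    The separation of content from presentation described above can be illustrated with a tiny, entirely hypothetical markup fragment and a few lines of Python; the tag names below are invented for the example and are not the actual PRL schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified procedure markup: content is tagged by role,
# not by presentation, so formatting can be applied in a separate batch step.
doc = """
<procedure id="MED-001" title="Blood pressure measurement">
  <step number="1">Attach the cuff <hardware ref="BP-CUFF"/> to the crew member.</step>
  <step number="2">Record the reading in <system ref="MED-LOG"/>.</step>
</procedure>
"""

root = ET.fromstring(doc)
for step in root.findall("step"):
    refs = [el.get("ref") for el in step]           # tagged hardware/software items
    print(step.get("number"), "".join(step.itertext()).strip(), refs)
```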

  8. Reducing Time and Increasing Sensitivity in Sample Preparation for Adherent Mammalian Cell Metabolomics

    PubMed Central

    Lorenz, Matthew A.; Burant, Charles F.; Kennedy, Robert T.

    2011-01-01

    A simple, fast, and reproducible sample preparation procedure was developed for relative quantification of metabolites in adherent mammalian cells using the clonal β-cell line INS-1 as a model sample. The method was developed by evaluating the effect of different sample preparation procedures on high performance liquid chromatography- mass spectrometry quantification of 27 metabolites involved in glycolysis and the tricarboxylic acid cycle on a directed basis as well as for all detectable chromatographic features on an undirected basis. We demonstrate that a rapid water rinse step prior to quenching of metabolism reduces components that suppress electrospray ionization thereby increasing signal for 26 of 27 targeted metabolites and increasing total number of detected features from 237 to 452 with no detectable change of metabolite content. A novel quenching technique is employed which involves addition of liquid nitrogen directly to the culture dish and allows for samples to be stored at −80 °C for at least 7 d before extraction. Separation of quenching and extraction steps provides the benefit of increased experimental convenience and sample stability while maintaining metabolite content similar to techniques that employ simultaneous quenching and extraction with cold organic solvent. The extraction solvent 9:1 methanol: chloroform was found to provide superior performance over acetonitrile, ethanol, and methanol with respect to metabolite recovery and extract stability. Maximal recovery was achieved using a single rapid (~1 min) extraction step. The utility of this rapid preparation method (~5 min) was demonstrated through precise metabolite measurements (11% average relative standard deviation without internal standards) associated with step changes in glucose concentration that evoke insulin secretion in the clonal β-cell line INS-1. PMID:21456517

  9. Evaluation of sequential extraction procedures for soluble and insoluble hexavalent chromium compounds in workplace air samples.

    PubMed

    Ashley, Kevin; Applegate, Gregory T; Marcy, A Dale; Drake, Pamela L; Pierce, Paul A; Carabin, Nathalie; Demange, Martine

    2009-02-01

    Because toxicities may differ for Cr(VI) compounds of varying solubility, some countries and organizations have promulgated different occupational exposure limits (OELs) for soluble and insoluble hexavalent chromium (Cr(VI)) compounds, and analytical methods are needed to determine these species in workplace air samples. To address this need, international standard methods ASTM D6832 and ISO 16740 have been published that describe sequential extraction techniques for soluble and insoluble Cr(VI) in samples collected from occupational settings. However, no published performance data were previously available for these Cr(VI) sequential extraction procedures. In this work, the sequential extraction methods outlined in the relevant international standards were investigated. The procedures tested involved the use of either deionized water or an ammonium sulfate/ammonium hydroxide buffer solution to target soluble Cr(VI) species. This was followed by extraction in a sodium carbonate/sodium hydroxide buffer solution to dissolve insoluble Cr(VI) compounds. Three-step sequential extraction with (1) water, (2) sulfate buffer and (3) carbonate buffer was also investigated. Sequential extractions were carried out on spiked samples of soluble, sparingly soluble and insoluble Cr(VI) compounds, and analyses were then generally carried out by using the diphenylcarbazide method. Similar experiments were performed on paint pigment samples and on airborne particulate filter samples collected from stainless steel welding. Potential interferences from soluble and insoluble Cr(III) compounds, as well as from Fe(II), were investigated. Interferences from Cr(III) species were generally absent, while the presence of Fe(II) resulted in low Cr(VI) recoveries. Two-step sequential extraction of spiked samples with (first) either water or sulfate buffer, and then carbonate buffer, yielded quantitative recoveries of soluble Cr(VI) and insoluble Cr(VI), respectively. Three-step sequential extraction gave excessively high recoveries of soluble Cr(VI), low recoveries of sparingly soluble Cr(VI), and quantitative recoveries of insoluble Cr(VI). Experiments on paint pigment samples using two-step extraction with water and carbonate buffer yielded varying percentages of relative fractions of soluble and insoluble Cr(VI). Sequential extractions of stainless steel welding fume air filter samples demonstrated the predominance of soluble Cr(VI) compounds in such samples. The performance data obtained in this work support the Cr(VI) sequential extraction procedures described in the international standards.

  10. Energetic cost of standard activities in Gurkha and British soldiers.

    PubMed

    Strickland, S S; Ulijaszek, S J

    1990-01-01

    Measurements of basal metabolic rate and of energy expenditure while lying, sitting, standing, and performing a step test at four levels of exercise were made on Gurkha soldiers stationed in Britain and on British controls matched by body weight and occupational background. There was no significant difference in basal metabolic rate (BMR), nor in the energy cost of lying, sitting and standing, between the two groups. Gurkhas showed significantly lower gross and net energy expenditure, and hence significantly greater net mechanical efficiency, at the lower levels of step exercise. The ratio of gross energy expenditure to BMR was lower in Gurkhas at the lowest rates of stepping compared with the British controls. These results suggest that the energy cost of some physical activities expressed as multiples of BMR may not be constant across populations.

  11. A two-step electrodialysis method for DNA purification from polluted metallic environmental samples.

    PubMed

    Rodríguez-Mejía, José Luis; Martínez-Anaya, Claudia; Folch-Mallol, Jorge Luis; Dantán-González, Edgar

    2008-08-01

    Extracting DNA from samples of polluted environments using standard methods often results in low yields of poor-quality material unsuited to subsequent manipulation and analysis by molecular biological techniques. Here, we report a novel two-step electrodialysis-based method for the extraction of DNA from environmental samples. This technique permits the rapid and efficient isolation of high-quality DNA based on its acidic nature, and without the requirement for phenol-chloroform-isoamyl alcohol cleanup and ethanol precipitation steps. Subsequent PCR, endonuclease restriction, and cloning reactions were successfully performed utilizing DNA obtained by electrodialysis, whereas some or all of these techniques failed using DNA extracted with two alternative methods. We also show that this technique is applicable to purifying DNA from a range of polluted and nonpolluted samples.

  12. Efficient Ensemble State-Parameters Estimation Techniques in Ocean Ecosystem Models: Application to the North Atlantic

    NASA Astrophysics Data System (ADS)

    El Gharamti, M.; Bethke, I.; Tjiputra, J.; Bertino, L.

    2016-02-01

    Given the recent strong international focus on developing new data assimilation systems for biological models, we present in this comparative study the application of newly developed state-parameter estimation tools to an ocean ecosystem model. It is well known that the available physical models are still too simple compared to the complexity of ocean biology. Furthermore, various biological parameters remain poorly known, and hence wrong specifications of such parameters can lead to large model errors. The standard joint state-parameter augmentation technique using the ensemble Kalman filter (stochastic EnKF) has been extensively tested in many geophysical applications. Some of these assimilation studies reported that jointly updating the state and the parameters might introduce significant inconsistency, especially for strongly nonlinear models. This is usually the case for ecosystem models, particularly during the period of the spring bloom. A better handling of the estimation problem is often achieved by separating the update of the state and the parameters using the so-called dual EnKF. The dual filter is computationally more expensive than the joint EnKF but is expected to perform more accurately. Using a similar separation strategy, we propose a new EnKF estimation algorithm in which we apply a one-step-ahead smoothing to the state. The new state-parameter estimation scheme is derived in a consistent Bayesian filtering framework and results in separate update steps for the state and the parameters. Unlike the classical filtering path, the new scheme starts with an update step, after which a model propagation step is performed. We test the performance of the new smoothing-based schemes against the standard EnKF in a one-dimensional configuration of the Norwegian Earth System Model (NorESM) in the North Atlantic. We use nutrient profile data (down to 2000 m depth) and surface partial CO2 measurements from the Mike weather station (66° N, 2° E) to estimate different biological parameters of phytoplankton and zooplankton. We analyze the performance of the filters in terms of complexity and accuracy of the state and parameter estimates.
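
    For readers unfamiliar with the baseline, a toy Python sketch of the standard joint (stochastic) EnKF for simultaneous state-parameter estimation is given below; the logistic test model, ensemble size and noise levels are invented for the illustration, and the paper's dual and one-step-ahead-smoothing variants are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

def enkf_update(ens, obs, obs_op, obs_err):
    """Stochastic EnKF analysis step with perturbed observations.
    ens: (n_ens, n_var) ensemble matrix; obs_op maps a member to observation space."""
    n_ens = ens.shape[0]
    Y = np.array([obs_op(m) for m in ens])              # predicted observations
    X_a = ens - ens.mean(axis=0)
    Y_a = Y - Y.mean(axis=0)
    P_xy = X_a.T @ Y_a / (n_ens - 1)
    P_yy = Y_a.T @ Y_a / (n_ens - 1) + obs_err**2 * np.eye(Y.shape[1])
    K = P_xy @ np.linalg.inv(P_yy)
    obs_pert = obs + obs_err * rng.normal(size=Y.shape)  # perturbed observations
    return ens + (obs_pert - Y) @ K.T

# Toy problem: jointly estimate state x and growth-rate parameter r of x' = r*x*(1-x)
n_ens, dt = 50, 0.1
ens = np.column_stack([rng.normal(0.2, 0.05, n_ens),    # state x
                       rng.normal(0.8, 0.3, n_ens)])    # parameter r
truth_x, truth_r = 0.2, 1.2
for step in range(100):
    truth_x += dt * truth_r * truth_x * (1 - truth_x)
    ens[:, 0] += dt * ens[:, 1] * ens[:, 0] * (1 - ens[:, 0])   # forecast step
    if step % 10 == 9:                                          # assimilate an x obs
        y = np.array([truth_x + 0.02 * rng.normal()])
        ens = enkf_update(ens, y, lambda m: m[:1], obs_err=0.02)
print("estimated r:", ens[:, 1].mean(), "truth:", truth_r)
```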

  13. Accuracy in planar cutting of bones: an ISO-based evaluation.

    PubMed

    Cartiaux, Olivier; Paul, Laurent; Docquier, Pierre-Louis; Francq, Bernard G; Raucent, Benoît; Dombre, Etienne; Banse, Xavier

    2009-03-01

    Computer- and robot-assisted technologies are capable of improving the accuracy of planar cutting in orthopaedic surgery. This study is a first step toward formulating and validating a new evaluation methodology for planar bone cutting, based on standards from the International Organization for Standardization. Our experimental test bed consisted of a purely geometrical model of the cutting process around a simulated bone. Cuts were performed at three levels of surgical assistance: unassisted, computer-assisted and robot-assisted. We measured three parameters of the standard ISO 1101:2004: flatness, parallelism and location of the cut plane. The location was the most relevant parameter for assessing cutting errors. The three levels of assistance were easily distinguished using the location parameter. Our ISO methodology employs the location parameter to obtain all information about translational and rotational cutting errors. Location may be used on any osseous structure to compare the performance of existing assistance technologies.
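
    A rough Python sketch of how the three ISO 1101 parameters mentioned above can be approximated from a cloud of measured points on the cut surface is given below. It uses a least-squares plane rather than the minimum-zone evaluation required by the standard, reports parallelism as an angle for simplicity, and works on invented data, so it should be read as an illustration of the idea only.

```python
import numpy as np

def plane_metrics(points, nominal_point, nominal_normal):
    """Approximate ISO 1101-style metrics for a measured cut plane.
    points: (n, 3) measured surface points; nominal_*: target plane definition."""
    pts = np.asarray(points, float)
    # Least-squares plane z = a*x + b*y + c
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    (a, b, c), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    normal = np.array([-a, -b, 1.0]); normal /= np.linalg.norm(normal)
    residuals = (pts - np.array([0.0, 0.0, c])) @ normal
    flatness = residuals.max() - residuals.min()          # spread about the fitted plane

    n0 = np.asarray(nominal_normal, float); n0 /= np.linalg.norm(n0)
    parallelism_angle = np.degrees(np.arccos(np.clip(abs(normal @ n0), -1.0, 1.0)))
    location = abs((pts.mean(axis=0) - np.asarray(nominal_point, float)) @ n0)
    return flatness, parallelism_angle, location

# Hypothetical measured points of a nearly horizontal cut; target plane is z = 0
rng = np.random.default_rng(0)
xy = rng.uniform(-10, 10, size=(200, 2))
z = 0.5 + 0.01 * xy[:, 0] + 0.02 * rng.normal(size=200)   # slightly tilted, noisy cut
print(plane_metrics(np.column_stack([xy, z]), [0, 0, 0], [0, 0, 1]))
```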

  14. Photomask applications of traceable atomic force microscope dimensional metrology at NIST

    NASA Astrophysics Data System (ADS)

    Dixson, Ronald; Orji, Ndubuisi G.; Potzick, James; Fu, Joseph; Allen, Richard A.; Cresswell, Michael; Smith, Stewart; Walton, Anthony J.; Tsiamis, Andreas

    2007-10-01

    The National Institute of Standards and Technology (NIST) has a multifaceted program in atomic force microscope (AFM) dimensional metrology. Three major instruments are being used for traceable measurements. The first is a custom in-house metrology AFM, called the calibrated AFM (C-AFM), the second is the first generation of commercially available critical dimension AFM (CD-AFM), and the third is a current generation CD-AFM at SEMATECH - for which NIST has established the calibration and uncertainties. All of these instruments have useful applications in photomask metrology. Linewidth reference metrology is an important application of CD-AFM. We have performed a preliminary comparison of linewidths measured by CD-AFM and by electrical resistance metrology on a binary mask. For the ten selected test structures with on-mask linewidths between 350 nm and 600 nm, most of the observed differences were less than 5 nm, and all of them were less than 10 nm. The offsets were often within the estimated uncertainties of the AFM measurements, without accounting for the effect of linewidth roughness or the uncertainties of electrical measurements. The most recent release of the NIST photomask standard - which is Standard Reference Material (SRM) 2059 - was also supported by CD-AFM reference measurements. We review the recent advances in AFM linewidth metrology that will reduce the uncertainty of AFM measurements on this and future generations of the NIST photomask standard. The NIST C-AFM has displacement metrology for all three axes traceable to the 633 nm wavelength of the iodine-stabilized He-Ne laser. One of the important applications of the C-AFM is step height metrology, which has some relevance to phase shift calibration. In the current generation of the system, the approximate level of relative standard uncertainty for step height measurements at the 100 nm scale is 0.1 %. We discuss the monitor history of a 290 nm step height, originally measured on the C-AFM with a 1.9 nm (k = 2) expanded uncertainty, and describe advances that bring the step height uncertainty of recent measurements to an estimated 0.6 nm (k = 2). Based on this work, we expect to be able to reduce the topographic component of phase uncertainty in alternating aperture phase shift masks (AAPSM) by a factor of three compared to current calibrations based on earlier generation step height references.
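    The k = 2 expanded uncertainties quoted above follow the usual convention U = k·u_c, where the combined standard uncertainty u_c is obtained by adding independent components in quadrature. A minimal sketch, with placeholder component values rather than the actual NIST uncertainty budget:

    ```python
    import math

    def expanded_uncertainty(components, k=2.0):
        """Combine independent standard uncertainties in quadrature and expand by k."""
        u_c = math.sqrt(sum(u * u for u in components))
        return k * u_c

    # hypothetical breakdown for a ~290 nm step height (values are placeholders)
    components_nm = [0.20, 0.15, 0.10]   # e.g. scale calibration, repeatability, tip effects
    print(f"U (k=2) = {expanded_uncertainty(components_nm):.2f} nm")
    ```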

  15. Determination of dissolved bromate in drinking water by ion chromatography and post column reaction: interlaboratory study.

    PubMed

    Cordeiro, Fernando; Robouch, Piotr; de la Calle, Maria Beatriz; Emteborg, Håkan; Charoud-Got, Jean; Schmitz, Franz

    2011-01-01

    A collaborative study, International Evaluation Measurement Programme-25a, was conducted in accordance with international protocols to determine the performance characteristics of an analytical method for the determination of dissolved bromate in drinking water. The method should fulfill the analytical requirements of Council Directive 98/83/EC (referred to in this work as the Drinking Water Directive; DWD). The new draft standard method under investigation is based on ion chromatography followed by post-column reaction and UV detection. The collaborating laboratories used the Draft International Organization for Standardization (ISO)/Draft International Standard (DIS) 11206 document. The existing standard method (ISO 15061:2001) is based on ion chromatography using suppressed conductivity detection, in which a preconcentration step may be required for the determination of bromate concentrations as low as 3 to 5 microg/L. The new method includes a dilution step that reduces the matrix effects, thus allowing the determination of bromate concentrations down to 0.5 microg/L. Furthermore, the method aims to minimize any potential interference of chlorite ions. The collaborative study investigated different types of drinking water, such as soft, hard, and mineral water. Other types of water, such as raw water (untreated), swimming pool water, a blank (named river water), and a bromate standard solution, were included as test samples. All test matrixes except the swimming pool water were spiked with high-purity potassium bromate to obtain bromate concentrations ranging from 1.67 to 10.0 microg/L. Swimming pool water was not spiked, as this water was incurred with bromate. Test samples were dispatched to 17 laboratories from nine different countries. Sixteen participants reported results. The repeatability RSD (RSD(r)) ranged from 1.2 to 4.1%, while the reproducibility RSD (RSD(R)) ranged from 2.3 to 5.9%. These precision characteristics compare favorably with those of ISO 15061:2001. A thorough comparison of the performance characteristics is presented in this report. All method performance characteristics obtained in the framework of this collaborative study indicate that the draft ISO/DIS 11206 standard method meets the requirements set down by the DWD. It can, therefore, be considered to be fit for its intended analytical purpose.
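    Repeatability and reproducibility RSDs of the kind reported above are conventionally derived from a per-laboratory one-way analysis of variance (ISO 5725-style). The sketch below assumes a balanced design and uses toy data; it is not the study's actual computation.

    ```python
    import numpy as np

    def precision_rsds(results):
        """results: 2-D array, one row per laboratory, n replicate results per row
        (balanced design). Returns (RSD_r, RSD_R) in percent."""
        results = np.asarray(results, dtype=float)
        p, n = results.shape
        lab_means = results.mean(axis=1)
        grand_mean = results.mean()
        ms_within = ((results - lab_means[:, None]) ** 2).sum() / (p * (n - 1))
        ms_between = n * ((lab_means - grand_mean) ** 2).sum() / (p - 1)
        s_r2 = ms_within                                 # repeatability variance
        s_L2 = max((ms_between - ms_within) / n, 0.0)    # between-laboratory component
        s_R2 = s_r2 + s_L2                               # reproducibility variance
        return (100 * np.sqrt(s_r2) / grand_mean,
                100 * np.sqrt(s_R2) / grand_mean)

    # toy example: 4 laboratories, duplicate determinations of a ~5 microg/L bromate sample
    print(precision_rsds([[5.1, 5.0], [4.9, 5.0], [5.2, 5.1], [4.8, 4.9]]))
    ```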

  16. [Application of robustness test for assessment of the measurement uncertainty at the end of development phase of a chromatographic method for quantification of water-soluble vitamins].

    PubMed

    Ihssane, B; Bouchafra, H; El Karbane, M; Azougagh, M; Saffaj, T

    2016-05-01

    We propose in this work an efficient way to evaluate the measurement uncertainty at the end of the development step of an analytical method, since this assessment provides an indication of the performance of the optimization process. The uncertainty is estimated through a robustness test applying a Plackett-Burman design, investigating six parameters that influence the simultaneous chromatographic assay of five water-soluble vitamins. The estimated effect of varying each parameter is translated into a standard uncertainty value at each concentration level. The relative uncertainty values obtained do not exceed the acceptance limit of 5%, showing that the development procedure was carried out properly. In addition, a statistical comparison of the standard uncertainties obtained after the development stage with those of the validation step indicates that the estimated uncertainties are equivalent. The results obtained clearly show the performance and capacity of the chromatographic method to assay the five vitamins simultaneously and its suitability for routine application. Copyright © 2015 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
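    A minimal sketch of the robustness-to-uncertainty idea: build the 8-run Plackett-Burman design from its standard cyclic generator, estimate each factor's effect as the difference of mean responses at the two levels, and convert the effects into a standard-uncertainty contribution. The conversion used here (each |effect| treated as a rectangular half-width, u = |effect|/sqrt(3), combined in quadrature) is one common choice and only an assumption about the paper's exact rule; the responses are invented.

    ```python
    import numpy as np

    # 8-run Plackett-Burman design: cyclic shifts of the standard 7-factor generator,
    # plus a final row of -1 (factor levels coded +1 / -1)
    gen = np.array([1, 1, 1, -1, 1, -1, -1])
    design = np.vstack([np.roll(gen, i) for i in range(7)] + [-np.ones(7, int)])

    def pb_effects(y):
        """Effect of each factor = mean response at +1 minus mean response at -1."""
        y = np.asarray(y, dtype=float)
        return np.array([y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
                         for j in range(design.shape[1])])

    def robustness_uncertainty(y):
        """Assumed rule: treat each |effect| as a rectangular half-width and combine in quadrature."""
        eff = pb_effects(y)
        return np.sqrt(np.sum((np.abs(eff) / np.sqrt(3)) ** 2))

    # toy responses (e.g. recovered concentration for one vitamin over the 8 runs)
    y = [99.8, 100.4, 99.6, 100.1, 100.3, 99.7, 100.0, 99.9]
    print(pb_effects(y), robustness_uncertainty(y))
    ```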

  17. The California Clinical Data Project: a case study in the adoption of clinical data standards for quality improvement.

    PubMed

    Sujansky, Walter; Chang, Sophia

    2006-01-01

    The California Clinical Data Project is a statewide initiative to remove barriers to the widespread and effective use of information technology to improve chronic disease care. The project is a case study in the development and widespread adoption of clinical data standards by varied and often competing stakeholders. As an initial step, the project defined precise data standards for the batch reporting of pharmacy claims data and laboratory results data. These uniform standards facilitate the flow of existing electronic clinical information into disease registries and electronic health record systems. Pharmacy and lab results data now are being exchanged electronically with this standard among the largest health plans, medical groups, and clinical laboratories participating in California's pay-for-performance programs. Lessons from this project may apply to the development and adoption of data standards for other states and locales and for the emerging national health information infrastructure.

  18. Design of a Sub-Picosecond Jitter with Adjustable-Range CMOS Delay-Locked Loop for High-Speed and Low-Power Applications

    PubMed Central

    Abdulrazzaq, Bilal I.; Ibrahim, Omar J.; Kawahito, Shoji; Sidek, Roslina M.; Shafie, Suhaidi; Yunus, Nurul Amziah Md.; Lee, Lini; Halin, Izhal Abdul

    2016-01-01

    A Delay-Locked Loop (DLL) with a modified charge pump circuit is proposed for generating high-resolution linear delay steps with sub-picosecond jitter performance and adjustable delay range. The small-signal model of the modified charge pump circuit is analyzed to bring forth the relationship between the DLL’s internal control voltage and output time delay. Circuit post-layout simulation shows that a 0.97 ps delay step within a 69 ps delay range with 0.26 ps Root-Mean Square (RMS) jitter performance is achievable using a standard 0.13 µm Complementary Metal-Oxide Semiconductor (CMOS) process. The post-layout simulation results show that the power consumption of the proposed DLL architecture’s circuit is 0.1 mW when the DLL is operated at 2 GHz. PMID:27690040

  19. User Instructions for the Policy Analysis Modeling System (PAMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNeil, Michael A.; Letschert, Virginie E.; Van Buskirk, Robert D.

    PAMS uses country-specific and product-specific data to calculate estimates of impacts of a Minimum Efficiency Performance Standard (MEPS) program. The analysis tool is self-contained in a Microsoft Excel spreadsheet, and requires no links to external data, or special code additions to run. The analysis can be customized to a particular program without additional user input, through the use of the pull-down menus located on the Summary page. In addition, the spreadsheet contains many areas into which user-generated input data can be entered for increased accuracy of projection. The following is a step-by-step guide for using and customizing the tool.

  20. Effect of uphill and downhill walking on walking performance in geriatric patients using a wheeled walker.

    PubMed

    Lindemann, Ulrich; Schwenk, Michael; Schmitt, Syn; Weyrich, Michael; Schlicht, Wolfgang; Becker, Clemens

    2017-08-01

    Wheeled walkers are recommended to improve walking performance in older persons and to encourage and assist participation in daily life. Nevertheless, using a wheeled walker can cause serious problems in the natural environment. This study aimed to compare uphill and downhill walking with level walking in geriatric patients using a wheeled walker. Furthermore, we investigated the effect of using a wheeled walker with respect to dual tasking during level walking. A total of 20 geriatric patients (median age 84.5 years) walked 10 m at their habitual pace along a level surface, uphill and downhill, with and without a standard wheeled walker. Gait speed, stride length and cadence were assessed by wearable sensors and the walk ratio was calculated. When using a wheeled walker during level walking, the walk ratio improved (0.58 m/[steps/min] versus 0.57 m/[steps/min], p = 0.023) but gait speed decreased (1.07 m/s versus 1.12 m/s, p = 0.020) compared to not using a wheeled walker. With respect to the walk ratio, uphill and downhill walking with a wheeled walker decreased walking performance compared to level walking (0.54 m/[steps/min] versus 0.58 m/[steps/min], p = 0.023 and 0.55 m/[steps/min] versus 0.58 m/[steps/min], p = 0.001, respectively). At the same time, gait speed decreased (0.079 m/s versus 1.07 m/s, p < 0.0001) or was unaffected. The use of a wheeled walker improved the quality of level walking, but the performance of uphill and downhill walking was worse than that of level walking when using a wheeled walker.

  1. Simultaneous Spectrophotometric Determination of Rifampicin, Isoniazid and Pyrazinamide in a Single Step

    PubMed Central

    Asadpour-Zeynali, Karim; Saeb, Elhameh

    2016-01-01

    Three antituberculosis medications, rifampicin, isoniazid and pyrazinamide, are investigated in this work. The ultraviolet (UV) spectra of these compounds overlap, so suitable chemometric methods are helpful for their simultaneous spectrophotometric determination. A generalized version of the net analyte signal standard addition method (GNASSAM) was used for the determination of the three antituberculosis medications as a model system. In the generalized net analyte signal standard addition method, only one standard solution is prepared for all analytes. This standard solution contains a mixture of all analytes of interest, and its addition to the sample increases the net analyte signal of each analyte in proportion to that analyte's concentration in the added standard solution. To determine the concentration of each analyte in synthetic mixtures, the UV spectra of the pure analytes and of each sample were recorded in the range of 210-550 nm. The standard addition procedure was performed for each sample, the UV spectrum was recorded after each addition, and the results were finally analyzed by the net analyte signal method. The concentrations obtained show acceptable performance of GNASSAM in these cases. PMID:28243267
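    For readers unfamiliar with the underlying idea, the sketch below shows the classical single-analyte standard addition calculation that GNASSAM generalizes: the signal is regressed on the added concentration and the sample concentration is read from the magnitude of the x-intercept. This is the textbook univariate version only, not the generalized net analyte signal formulation of the paper; the data are invented.

    ```python
    import numpy as np

    def standard_addition(added_conc, signal):
        """Classical standard addition: fit signal = a + b*added_conc and return
        the analyte concentration in the sample, |x-intercept| = a / b."""
        b, a = np.polyfit(added_conc, signal, 1)   # slope, intercept
        return a / b

    # toy data: absorbance of a sample spiked with 0, 2, 4 and 6 microg/mL of standard
    added = [0.0, 2.0, 4.0, 6.0]
    absorbance = [0.210, 0.362, 0.511, 0.665]
    print(f"estimated sample concentration: {standard_addition(added, absorbance):.2f} microg/mL")
    ```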

  2. Computerization of the standard corsi block-tapping task affects its underlying cognitive concepts: a pilot study.

    PubMed

    Claessen, Michiel H G; van der Ham, Ineke J M; van Zandvoort, Martine J E

    2015-01-01

    The tablet computer initiates an important step toward computerized administration of neuropsychological tests. Because of its lack of standardization, the Corsi Block-Tapping Task could benefit from advantages inherent to computerization. This task, which requires reproduction of a sequence of movements by tapping blocks as demonstrated by an examiner, is widely used as a measure of visuospatial attention and working memory. The aim was to validate a computerized version of the Corsi Task (e-Corsi) by comparing recall accuracy to that on the standard task. Forty university students (mean age = 22.9 years, SD = 2.7 years; 20 female) performed the standard Corsi Task and the e-Corsi on an iPad 3. Results showed higher accuracy in forward reproduction on the standard Corsi compared with the e-Corsi, whereas backward performance was comparable. These divergent performance patterns on the 2 versions (small-to-medium effect sizes) are explained as a result of motor priming and interference effects. This finding implies that computerization has serious consequences for the cognitive concepts that the Corsi Task is assumed to assess. Hence, whereas the e-Corsi was shown to be useful with respect to administration and registration, these findings also stress the need for reconsideration of the underlying theoretical concepts of this task.

  3. Lower-extremity biomechanics during forward and lateral stepping activities in older adults

    PubMed Central

    Wang, Man-Ying; Flanagan, Sean; Song, Joo-Eun; Greendale, Gail A.; Salem, George J.

    2012-01-01

    Objective: To characterize the lower-extremity biomechanics associated with stepping activities in older adults. Design: Repeated-measures comparison of kinematics and kinetics associated with forward step-up and lateral step-up activities. Background: Biomechanical analysis may be used to assess the effectiveness of various ‘in-home activities’ in targeting appropriate muscle groups and preserving functional strength and power in elders. Methods: Data were analyzed from 21 participants (mean 74.7 yr, standard deviation 4.4 yr) who performed the forward and lateral step-up activities while instrumented for biomechanical analysis. Motion analysis equipment, inverse dynamics equations, and repeated-measures ANOVAs were used to contrast the maximum joint angles, peak net joint moments, angular impulse, work, and power associated with the activities. Results: The lateral step-up resulted in greater maximum knee flexion (P < 0.001) and ankle dorsiflexion angles (P < 0.01). Peak joint moments were similar between exercises. The forward step-up generated greater peak hip power (P < 0.05) and total work (P < 0.001), whereas the lateral step-up generated greater impulse (P < 0.05), work (P < 0.01), and power (P < 0.05) at the knee and ankle. Conclusions: In older adults, the forward step-up places greater demand on the hip extensors, while the lateral step-up places greater demand on the knee extensors and ankle plantar flexors. PMID:12620784

  4. A New Type of Motor: Pneumatic Step Motor

    PubMed Central

    Stoianovici, Dan; Patriciu, Alexandru; Petrisor, Doru; Mazilu, Dumitru; Kavoussi, Louis

    2011-01-01

    This paper presents a new type of pneumatic motor, a pneumatic step motor (PneuStep). Directional rotary motion of discrete displacement is achieved by sequentially pressurizing the three ports of the motor. Pulsed pressure waves are generated by a remote pneumatic distributor. The motor assembly includes a motor, gearhead, and incremental position encoder in a compact, central bore construction. A special electronic driver is used to control the new motor with electric stepper indexers and standard motion control cards. The motor accepts open-loop step operation as well as closed-loop control with position feedback from the enclosed sensor. A special control feature is implemented to adapt classic control algorithms to the new motor, and is experimentally validated. The speed performance of the motor degrades with the length of the pneumatic hoses between the distributor and motor. Experimental results are presented to reveal this behavior and set the expectation level. Nevertheless, the stepper achieves easily controllable precise motion unlike other pneumatic motors. The motor was designed to be compatible with magnetic resonance medical imaging equipment, for actuating an image-guided intervention robot, for medical applications. For this reason, the motors were entirely made of nonmagnetic and dielectric materials such as plastics, ceramics, and rubbers. Encoding was performed with fiber optics, so that the motors are electricity free, exclusively using pressure and light. PneuStep is readily applicable to other pneumatic or hydraulic precision-motion applications. PMID:21528106

  5. Study of the SCC Behavior of 7075 Aluminum Alloy After One-Step Aging at 163 °C

    NASA Astrophysics Data System (ADS)

    Silva, G.; Rivolta, B.; Gerosa, R.; Derudi, U.

    2013-01-01

    For many years, 7075 aluminum alloys have been widely used, especially in applications requiring high mechanical performance. It is well known that the alloy in the T6 condition is characterized by the highest ultimate and yield strengths but, at the same time, by poor stress corrosion cracking (SCC) resistance. For this reason, in aeronautic applications, new heat treatments have been introduced to produce T7X conditions, which are characterized by lower mechanical strength but very good SCC behavior when compared with the T6 condition. The aim of this study is to investigate the tensile properties and the SCC behavior of 7075 thick plates subjected to single-step aging with varying aging times. The tests were carried out according to the relevant standards, and the data obtained from the SCC tests were analyzed quantitatively using image analysis software. The results show that, when compared with the T7X conditions, the single-step aging performed in the laboratory can produce acceptable tensile and SCC properties.

  6. Step-feed biofiltration: a low cost alternative configuration for off-gas treatment.

    PubMed

    Estrada, José M; Quijano, Guillermo; Lebrero, Raquel; Muñoz, Raúl

    2013-09-01

    Clogging due to biomass accumulation and the loss of structural stability of the packing media are common operational drawbacks of standard gas biofiltration inherent to the traditional biofilter design, which result in prohibitive pressure drop buildups and media channeling. In this work, an innovative step-feed biofilter configuration, with the air emission supplied in either two or three locations along the biofilter height, was tested and compared with a standard biofilter using toluene as a model pollutant and two packing materials: compost and perlite. When using compost, the step-feed biofilter supported similar elimination capacities (EC ≈ 80 g m⁻³ h⁻¹) and CO2 production rates (200 g m⁻³ h⁻¹) to those achieved in the standard biofilter. However, while the pressure drop in the step-feed system remained below 300 Pa per m of bed for 61 days, the standard biofilter reached this value in only 14 days and 4000 Pa per m of bed by day 30, consuming 75% more compression energy throughout the entire operational period. Operation with perlite supported lower ECs compared to compost in both the step-feed and standard biofilters (≈ 30 g m⁻³ h⁻¹), probably due to the high indigenous microbial diversity present in this organic packing material. The step-feed biofilter exhibited 65% lower compression energy requirements than the standard biofilter during operation with perlite, while supporting similar ECs. In brief, step-feed biofiltration constitutes a promising operational strategy capable of drastically reducing the operating costs of biofiltration due to a reduced energy consumption and an increased packing material lifespan. Copyright © 2013 Elsevier Ltd. All rights reserved.
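    The elimination capacity figures above follow the usual biofiltration definition EC = Q(C_in - C_out)/V_bed; a small sketch with purely illustrative numbers:

    ```python
    def elimination_capacity(q_m3_per_h, c_in_g_per_m3, c_out_g_per_m3, bed_volume_m3):
        """EC = Q * (C_in - C_out) / V_bed, in grams per cubic metre of bed per hour."""
        return q_m3_per_h * (c_in_g_per_m3 - c_out_g_per_m3) / bed_volume_m3

    # illustrative values only: 4 m3/h of air carrying 0.9 g/m3 toluene, 0.1 g/m3 at the
    # outlet, treated in a 0.04 m3 packed bed
    print(elimination_capacity(4.0, 0.9, 0.1, 0.04), "g m-3 h-1")
    ```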

  7. Isolation of oxidative degradation products of atorvastatin with supercritical fluid chromatography.

    PubMed

    Klobčar, Slavko; Prosen, Helena

    2015-12-01

    The isolation of four oxidative degradation products of atorvastatin using preparative high-performance liquid chromatography with at least two chromatographic steps is known from the literature. In this paper it is shown that the same four impurities can be isolated from similarly prepared mixtures in only one step using supercritical fluid chromatography. The separation methods were developed and optimized. The preparation of the mixtures was altered so as to enhance the concentration of the desired impurities. Appropriate solvents were used to collect the separated impurities in order to prevent degradation. The structures of the isolated impurities were confirmed and their purity determined. Preparative supercritical fluid chromatography has proven to be superior to preparative HPLC with respect to the purity of the standards achieved, while requiring fewer chromatographic and isolation steps. Copyright © 2015 John Wiley & Sons, Ltd.

  8. From Sequences to Insights in Microbial Ecology

    PubMed Central

    Knight, R.

    2010-01-01

    Rapid declines in the cost of sequencing have made large volumes of DNA sequence data available to individual investigators. Now, data analysis is the rate-limiting step: providing a user with sequences alone typically leads to bewilderment, frustration, and skepticism about the technology. In this talk, I focus on how to extract insights from 16S rRNA data, including key lab steps (barcoding and normalization) and on which tools are available to perform routine but essential processing steps such as denoising, chimera detection, taxonomy assignment, and diversity analyses (including detection of biological clusters and gradients in the samples). Providing users with advice on these points and with a standard pipeline they can exploit (but modify if circumstances require) can greatly accelerate the rate of understanding, publication, and acquisition of funding for further studies.

  9. Significant improvement of the quality of bystander first aid using an expert system with a mobile multimedia device.

    PubMed

    Ertl, Lorenz; Christ, Frank

    2007-08-01

    Better quality bystander first aid could improve outcome rates for emergency victims significantly. In this case-control study, we hypothesised that expert knowledge presented step-by-step to untrained helpers using a personal digital assistant (PDA) would improve the quality of bystanders' basic life support. We confronted 101 lay helpers with two standard emergency situations: (1) an unconscious trauma victim with severe bleeding; (2) cardiopulmonary resuscitation (CPR). Performance was assessed using an Objective Structured Clinical Examination (OSCE). One group was supported by a PDA providing visual and audio instructions, whereas the control group acted only with their current knowledge. The expert system was programmed in HTML code and displayed in the PDA's Internet browser. The maximum score obtainable was 24 points, corresponding to optimal treatment. The control group without the PDA reached 14.8 ± 3.5 (mean ± standard deviation), whereas the PDA-supported group scored significantly higher (21.9 ± 2.7, p < 0.01). The difference in performance was measurable in all criteria tested and particularly notable in the items placing in the recovery position, airway management and quality of CPR. The PDA-based expert system significantly increased the performance of untrained helpers supplying emergency care. Since Internet-compatible mobile devices have become widely available, a significant quality improvement in bystander first aid seems possible.

  10. Minimally invasive mitral valve surgery through right mini-thoracotomy: recommendations for good exposure, stable cardiopulmonary bypass, and secure myocardial protection.

    PubMed

    Ito, Toshiaki

    2015-07-01

    An apparent advantage of minimally invasive mitral surgery through a right mini-thoracotomy is the cosmetic appearance. Possible advantages of this procedure are a shorter ventilation time, a shorter hospital stay, and less blood transfusion. With regard to hard endpoints, such as operative mortality, freedom from reoperation, or cardiac death, this method is reportedly equivalent, but not superior, to the standard median sternotomy technique. However, perfusion-related complications (e.g., stroke, vascular damage, and limb ischemia) tend to occur more frequently with the minimally invasive technique than with the standard technique. In addition, valve repair through a small thoracotomy is technically demanding. Therefore, screening out patients who are not appropriate candidates for minimally invasive surgery is the first step. Vascular disease and inadequate anatomy can be evaluated with contrast-enhanced computed tomography. Peripheral cannulation should be performed carefully under transesophageal echocardiography guidance. Detailed preoperative planning of the valve repair process is desirable because every step is time-consuming in minimally invasive surgery. Three-dimensional echocardiography is a powerful tool for this purpose. For satisfactory exposure and detailed observation of the valve, a special left atrial retractor and a high-definition endoscope are useful. Valve repair can be performed in minimally invasive surgery as long as cardiopulmonary bypass is stable and bloodless exposure of the valve is obtained.

  11. Assessing performance of flaw characterization methods through uncertainty propagation

    NASA Astrophysics Data System (ADS)

    Miorelli, R.; Le Bourdais, F.; Artusi, X.

    2018-04-01

    In this work, we assess inversion performance in terms of crack characterization and localization based on synthetic signals associated with ultrasonic and eddy current physics. More precisely, two different standard iterative inversion algorithms are used to minimize the discrepancy between measurements (i.e., the tested data) and simulations. Furthermore, in order to speed up the computation and avoid the computational burden often associated with iterative inversion algorithms, we replace the standard forward solver with a suitable metamodel fitted on a database built offline. In a second step, we assess the inversion performance by adding uncertainties to a subset of the database parameters and then, through the metamodel, propagating these uncertainties within the inversion procedure. The fast propagation of uncertainties enables efficient evaluation of the impact of the lack of knowledge of some parameters used to describe the inspection scenario, a situation commonly encountered in the industrial NDE context.
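    The sketch below illustrates the two ingredients on a toy one-dimensional problem: a cheap surrogate (here a quadratic polynomial fitted to an offline database) stands in for the forward solver inside an iterative least-squares inversion, and uncertainty on a nuisance parameter is then propagated by re-running the fast inversion many times. The toy forward model, the polynomial surrogate, and the use of scipy.optimize.least_squares are illustrative assumptions, not the actual NDE simulators or metamodel of the study.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(1)

    # toy "expensive" forward model: signal depends on crack depth d and a nuisance lift-off h
    def forward(d, h):
        return d ** 1.5 * np.exp(-h) + 0.1 * d

    # offline database and quadratic polynomial surrogate in (d, h)
    D, H = np.meshgrid(np.linspace(0.1, 2.0, 30), np.linspace(0.0, 0.5, 10))
    S = forward(D, H)
    basis = np.column_stack([np.ones(D.size), D.ravel(), H.ravel(),
                             D.ravel() ** 2, D.ravel() * H.ravel(), H.ravel() ** 2])
    coeffs = np.linalg.lstsq(basis, S.ravel(), rcond=None)[0]

    def surrogate(d, h):
        return coeffs @ np.array([1.0, d, h, d ** 2, d * h, h ** 2])

    # iterative inversion of the crack depth using the fast surrogate
    def invert(measurement, h_assumed):
        res = least_squares(lambda d: np.atleast_1d(surrogate(d[0], h_assumed) - measurement),
                            x0=[0.5], bounds=(0.1, 2.0))
        return res.x[0]

    truth = forward(1.2, 0.2)                        # synthetic "measured" signal
    # propagate uncertainty on the nuisance lift-off h through the fast inversion
    h_samples = rng.normal(0.2, 0.05, 500)
    depths = np.array([invert(truth, h) for h in h_samples])
    print(f"depth estimate: {depths.mean():.3f} +/- {depths.std():.3f}")
    ```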

  12. Assessing performance of an Electronic Health Record (EHR) using Cognitive Task Analysis.

    PubMed

    Saitwal, Himali; Feng, Xuan; Walji, Muhammad; Patel, Vimla; Zhang, Jiajie

    2010-07-01

    Many Electronic Health Record (EHR) systems fail to provide user-friendly interfaces due to the lack of systematic consideration of human-centered computing issues. Such interfaces can be improved to provide easy-to-use, easy-to-learn, and error-resistant EHR systems to the users. The objective was to evaluate the usability of an EHR system and suggest areas of improvement in the user interface. The user interface of the AHLTA (Armed Forces Health Longitudinal Technology Application) was analyzed using the Cognitive Task Analysis (CTA) method called GOMS (Goals, Operators, Methods, and Selection rules) and an associated technique called KLM (Keystroke Level Model). The GOMS method was used to evaluate the AHLTA user interface by classifying each step of a given task as a mental (internal) or physical (external) operator. This analysis was performed by two analysts independently, and the inter-rater reliability was computed to verify the reliability of the GOMS method. Further evaluation was performed using KLM to estimate the execution time required to perform the given task through application of its standard set of operators. The results are based on the analysis of 14 prototypical tasks performed by AHLTA users. They show that on average a user needs to go through 106 steps to complete a task. To perform all 14 tasks, a user would spend about 22 min (independent of system response time) on data entry, of which 11 min are spent on the more effortful mental operators. The inter-rater reliability for all 14 tasks was 0.8 (kappa), indicating good reliability of the method. This paper empirically identifies the following findings related to the performance of AHLTA: (1) a large number of average total steps to complete common tasks, (2) high average execution time and (3) a large percentage of mental operators. The user interface can be improved by reducing (a) the total number of steps and (b) the percentage of mental effort required for the tasks. 2010 Elsevier Ireland Ltd. All rights reserved.
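    The KLM arithmetic behind such execution-time estimates is simply the sum of standard per-operator times over the operator sequence of a task. The sketch below uses the commonly cited Card, Moran and Newell operator values; the example sequence is invented and is not one of the 14 AHLTA tasks.

    ```python
    # Keystroke-Level Model operator times (seconds), after Card, Moran & Newell
    KLM_TIMES = {
        "K": 0.28,   # keystroke (average typist)
        "P": 1.10,   # point at a target with the mouse
        "B": 0.10,   # mouse button press or release
        "H": 0.40,   # home hands between keyboard and mouse
        "M": 1.35,   # mental preparation
    }

    def klm_time(operators):
        """Estimated execution time for a task written as a sequence of KLM operators."""
        return sum(KLM_TIMES[op] for op in operators)

    # invented example: prepare mentally, point to a field, click, home to keyboard,
    # then type a 6-character entry
    task = ["M", "P", "B", "B", "H"] + ["K"] * 6
    print(f"{klm_time(task):.2f} s")
    ```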

  13. Modification of Peyton's four-step approach for small group teaching - a descriptive study.

    PubMed

    Nikendei, Christoph; Huber, Julia; Stiepak, Jan; Huhn, Daniel; Lauter, Jan; Herzog, Wolfgang; Jünger, Jana; Krautter, Markus

    2014-04-02

    Skills-lab training as a methodological teaching approach is nowadays part of the training programs of almost all medical faculties. Specific ingredients have been shown to contribute to a successful learning experience in skills-labs. Although it is undoubted that the instructional approach used to introduce novel clinical technical skills to learners has a decisive impact on subsequent skills performance, as yet, little is known about differential effects of varying instructional methods. An instructional approach that is becoming increasingly prevalent in medical education is "Peyton's Four-Step Approach". As Peyton's Four-Step Approach was designed for a 1:1 teacher : student ratio, the aim of the present study was to develop and evaluate a modified Peyton's Approach for small group teaching. The modified Peyton's Approach was applied in three skills-lab training sessions on IV catheter insertion, each with three first- or second-year medical students (n = 9), delivered by three different skills-lab teachers. The presented descriptive study investigated the practicability and subjective impressions of skills-lab trainees and tutors. Skills-lab sessions were evaluated by trainees' self-assessment, expert ratings, and qualitative analysis of semi-standardized interviews conducted with trainees and tutors. The model was well accepted by trainees, and was rated as easy to realize, resulting in a good flow of teaching and success in attracting trainees' attention when observed by expert raters. Qualitative semi-standardized interviews performed with all of the trainees and tutors revealed that trainees valued repeated observation, instruction of trainees and the opportunity for independent performance, while tutors stressed that trainees were highly concentrated throughout the training and that they perceived repeated observation to be a valuable preparation for their own performance. The modified Peyton's Approach to instruct small groups of students in skills-lab training sessions has proved to be practicable, well accepted by trainees, and easy for tutors to realize. Further research should address the realization of the model in larger skills-lab training groups.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendell, Mark J.; Fisk, William J.

    Background - The goal of this project, with a focus on commercial buildings in California, was to develop a new framework for evidence-based minimum ventilation rate (MVR) standards that protect occupants in buildings while also considering energy use and cost. This was motivated by research findings suggesting that current prescriptive MVRs in commercial buildings do not provide occupants with fully safe and satisfactory indoor environments. Methods - The project began with a broad review in several areas: the diverse strategies now used for standards or guidelines for MVRs or for environmental contaminant exposures, current knowledge about adverse human effects associated with VRs, and current knowledge about contaminants in commercial buildings, including their presence, their adverse human effects, and their relationships with VRs. Based on a synthesis of the reviewed information, new principles and approaches are proposed for setting evidence-based VR standards for commercial buildings, considering a range of human effects including health, performance, and acceptability of air. Results - A review and evaluation is first presented of current approaches to setting prescriptive building ventilation standards and setting acceptable limits for human contaminant exposures in outdoor air and occupational settings. Recent research on approaches to setting acceptable levels of environmental exposures in evidence-based MVR standards is also described. From a synthesis and critique of these materials, a set of principles for setting MVRs is presented, along with an example approach based on these principles. The approach combines two sequential strategies. In a first step, an acceptable threshold is set for each adverse outcome that has a demonstrated relationship to VRs, as an increase from a (low) outcome level at a high reference ventilation rate (RVR, the VR needed to attain the best achievable levels of the adverse outcome); MVRs required to meet each specific outcome threshold are estimated; and the highest of these MVRs, which would then meet all outcome thresholds, is selected as the target MVR. In a second step, implemented only if the target MVR from step 1 is judged impractically high, costs and benefits are estimated and this information is used in a risk management process. Four human outcomes with substantial quantitative evidence of relationships to VRs are identified for initial consideration in setting MVR standards. These are: building-related symptoms (sometimes called sick building syndrome symptoms), poor perceived indoor air quality, and diminished work performance, all with data relating them directly to VRs; and cancer and non-cancer chronic outcomes, related indirectly to VRs through specific VR-influenced indoor contaminants. In an application of step 1 for offices using a set of example outcome thresholds, a target MVR of 9 L/s (19 cfm) per person was needed. Because this target MVR was close to MVRs in current standards, use of a cost/benefit process seemed unnecessary. Selection of more stringent thresholds for one or more human outcomes, however, could raise the target MVR to 14 L/s (30 cfm) per person or higher, triggering the step 2 risk management process. Consideration of outdoor air pollutant effects would add further complexity to the framework.
For balancing the objective and subjective factors involved in setting MVRs in a cost-benefit process, it is suggested that a diverse group of stakeholders make the determination after assembling as much quantitative data as possible.
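    The step 1 selection rule described above amounts to taking, over all outcomes with a demonstrated VR relationship, the largest ventilation rate needed to keep that outcome within its chosen threshold, and triggering the step 2 cost-benefit process only if that target is judged impractical. A minimal sketch with invented numbers:

    ```python
    # hypothetical outcome-specific minimum ventilation rates (L/s per person) needed to keep
    # each adverse outcome within its chosen threshold relative to the reference ventilation
    # rate; the numbers are purely illustrative
    required_mvr = {
        "building-related symptoms": 8.0,
        "perceived indoor air quality": 9.0,
        "work performance": 7.5,
        "chronic health (contaminant-based)": 6.0,
    }

    target_mvr = max(required_mvr.values())          # step 1: meet every outcome threshold
    print(f"target MVR = {target_mvr} L/s per person")

    PRACTICAL_LIMIT = 12.0                           # assumed feasibility cutoff
    if target_mvr > PRACTICAL_LIMIT:                 # step 2 only if impractically high
        print("trigger the cost-benefit / risk-management step")
    ```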

  15. Robotic-assisted laparoscopic radical nephrectomy using the Da Vinci Si system: how to improve surgeon autonomy. Our step-by-step technique.

    PubMed

    Davila, Hugo H; Storey, Raul E; Rose, Marc C

    2016-09-01

    Herein, we describe several steps to improve surgeon autonomy during a Left Robotic-Assisted Laparoscopic Radical Nephrectomy (RALRN), using the Da Vinci Si system. Our kidney cancer program is based on 2 community hospitals. We use the Da Vinci Si system. Access is obtained with the following trocars: Two 8 mm robotic, one 8 mm robotic, bariatric length (arm 3), 15 mm for the assistant and 12 mm for the camera. We use curved monopolar scissors in robotic arm 1, Bipolar Maryland in arm 2, Prograsp Forceps in arm 3, and we alternate throughout the surgery with EndoWrist clip appliers and the vessel sealer. Here, we described three steps and the use of 3 robotic instruments to improve surgeon autonomy. Step 1: the lower pole of the kidney was dissected and this was retracted upwards and laterally. This maneuver was performed using the 3rd robotic arm with the Prograsp Forceps. Step 2: the monopolar scissors was replaced (robotic arm 1) with the robotic EndoWrist clip applier, 10 mm Hem-o-Lok. The renal artery and vein were controlled and transected by the main surgeon. Step 3: the superior, posterolateral dissection and all bleeders were carefully coagulated by the surgeon with the EndoWrist one vessel sealer. We have now performed 15 RALRN following these steps. Our results were: blood loss 300 cc, console time 140 min, operating room time 200 min, anesthesia time 180 min, hospital stay 2.5 days, 1 incisional hernia, pathology: (13) RCC clear cell, (1) chromophobe and (1) papillary type 1. Tumor Stage: (5) T1b, (8) T2a, (2) T2b. We provide a concise, step-by-step technique for radical nephrectomy (RN) using the Da Vinci Si robotic system that may provide more autonomy to the surgeon, while maintaining surgical outcome equivalent to standard laparoscopic RN.

  16. Optimal Signal Processing of Frequency-Stepped CW Radar Data

    NASA Technical Reports Server (NTRS)

    Ybarra, Gary A.; Wu, Shawkang M.; Bilbro, Griff L.; Ardalan, Sasan H.; Hearn, Chase P.; Neece, Robert T.

    1995-01-01

    An optimal signal processing algorithm is derived for estimating the time delay and amplitude of each scatterer reflection using a frequency-stepped CW system. The channel is assumed to be composed of abrupt changes in the reflection coefficient profile. The optimization technique is intended to maximize the target range resolution achievable from any set of frequency-stepped CW radar measurements made in such an environment. The algorithm is composed of an iterative two-step procedure. First, the amplitudes of the echoes are optimized by solving an overdetermined least squares set of equations. Then, a nonlinear objective function is scanned in an organized fashion to find its global minimum. The result is a set of echo strengths and time delay estimates. Although this paper addresses the specific problem of resolving the time delay between the first two echoes, the derivation is general in the number of echoes. Performance of the optimization approach is illustrated using measured data obtained from an HP-X510 network analyzer. It is demonstrated that the optimization approach offers a significant resolution enhancement over the standard processing approach that employs an IFFT. Degradation in the performance of the algorithm due to suboptimal model order selection and the effects of additive white Gaussian noise are addressed.
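    A condensed sketch of the two-step procedure for a two-echo channel: for each candidate pair of time delays, the complex echo amplitudes follow from an overdetermined linear least-squares fit to the frequency-stepped measurements, and the delay pair is chosen by an organized scan of the residual over a grid. The signal model H(f_k) = sum_i a_i exp(-j2*pi*f_k*tau_i), the simple grid scan in place of the paper's specific search, and all numbers are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    freqs = np.linspace(2e9, 3e9, 64)                  # stepped frequencies (Hz)

    def model_matrix(delays):
        return np.exp(-2j * np.pi * np.outer(freqs, delays))

    # synthetic two-echo measurement with additive noise
    true_delays, true_amps = np.array([3.0e-9, 3.6e-9]), np.array([1.0, 0.6])
    h = model_matrix(true_delays) @ true_amps + 0.05 * (rng.standard_normal(64)
                                                        + 1j * rng.standard_normal(64))

    def residual(delays):
        A = model_matrix(delays)
        amps, *_ = np.linalg.lstsq(A, h, rcond=None)   # step 1: optimal amplitudes
        return np.linalg.norm(h - A @ amps), amps

    # step 2: organized scan of the nonlinear objective over a grid of delay pairs
    grid = np.linspace(2.5e-9, 4.5e-9, 81)
    best = min(((residual([t1, t2])[0], t1, t2)
                for i, t1 in enumerate(grid) for t2 in grid[i + 1:]), key=lambda r: r[0])
    print("estimated delays (ns):", best[1] * 1e9, best[2] * 1e9)
    ```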

  17. Optimal Signal Processing of Frequency-Stepped CW Radar Data

    NASA Technical Reports Server (NTRS)

    Ybarra, Gary A.; Wu, Shawkang M.; Bilbro, Griff L.; Ardalan, Sasan H.; Hearn, Chase P.; Neece, Robert T.

    1995-01-01

    An optimal signal processing algorithm is derived for estimating the time delay and amplitude of each scatterer reflection using a frequency-stepped CW system. The channel is assumed to be composed of abrupt changes in the reflection coefficient profile. The optimization technique is intended to maximize the target range resolution achievable from any set of frequency-stepped CW radar measurements made in such an environment. The algorithm is composed of an iterative two-step procedure. First, the amplitudes of the echoes are optimized by solving an overdetermined least squares set of equations. Then, a nonlinear objective function is scanned in an organized fashion to find its global minimum. The result is a set of echo strengths and time delay estimates. Although this paper addresses the specific problem of resolving the time delay between the two echoes, the derivation is general in the number of echoes. Performance of the optimization approach is illustrated using measured data obtained from an HP-8510 network analyzer. It is demonstrated that the optimization approach offers a significant resolution enhancement over the standard processing approach that employs an IFFT. Degradation in the performance of the algorithm due to suboptimal model order selection and the effects of additive white Gaussian noise are addressed.

  18. Report: EPA Data Standards Plan Completed But Additional Steps Are Needed

    EPA Pesticide Factsheets

    Report #12-P-0519, June 5, 2012. The actions taken by EPA were either incomplete or lacked steps to help management determine the overall effectiveness of the Agency’s implementation of data standards.

  19. De-biasing the dynamic mode decomposition for applied Koopman spectral analysis of noisy datasets

    NASA Astrophysics Data System (ADS)

    Hemati, Maziar S.; Rowley, Clarence W.; Deem, Eric A.; Cattafesta, Louis N.

    2017-08-01

    The dynamic mode decomposition (DMD), a widely used method for performing data-driven Koopman spectral analysis, has gained increased popularity for extracting dynamically meaningful spatiotemporal descriptions of fluid flows from snapshot measurements. Oftentimes, DMD descriptions can be used for predictive purposes as well, which enables informed decision-making based on DMD model forecasts. Despite its widespread use and utility, DMD can fail to yield accurate dynamical descriptions when the measured snapshot data are imprecise due to, e.g., sensor noise. Here, we express DMD as a two-stage algorithm in order to isolate a source of systematic error. We show that DMD's first stage, a subspace projection step, systematically introduces bias errors by processing snapshots asymmetrically. To remove this systematic error, we propose utilizing an augmented snapshot matrix in the subspace projection step, as in problems of total least squares, in order to account for the error present in all snapshots. The resulting unbiased and noise-aware total DMD (TDMD) formulation reduces to standard DMD in the absence of snapshot errors, while the two-stage perspective generalizes the de-biasing framework to other related methods as well. TDMD's performance is demonstrated in numerical and experimental fluids examples. In particular, in the analysis of time-resolved particle image velocimetry data for a separated flow, TDMD outperforms standard DMD by providing dynamical interpretations that are consistent with alternative analysis techniques. Further, TDMD extracts modes that reveal detailed spatial structures missed by standard DMD.
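    A compact sketch contrasting the two projection choices: standard DMD projects onto the POD modes of the first snapshot matrix only, whereas the de-biased (total-least-squares) variant first projects both snapshot matrices onto the leading right-singular subspace of the augmented matrix formed by stacking them. The toy data, truncation rank, and implementation details are illustrative choices, not the authors' code.

    ```python
    import numpy as np

    def dmd_eigs(X, Y, r):
        """Standard (projected) DMD eigenvalues with truncation rank r."""
        U, s, Vh = np.linalg.svd(X, full_matrices=False)
        U, s, V = U[:, :r], s[:r], Vh[:r].conj().T
        Atilde = U.conj().T @ Y @ V / s
        return np.linalg.eigvals(Atilde)

    def tdmd_eigs(X, Y, r):
        """Total-least-squares DMD: project X and Y onto the leading right-singular
        subspace of the augmented snapshot matrix before the DMD step."""
        Z = np.vstack([X, Y])
        _, _, Vh = np.linalg.svd(Z, full_matrices=False)
        P = Vh[:r].conj().T @ Vh[:r]          # projector onto the leading subspace
        return dmd_eigs(X @ P, Y @ P, r)

    # toy example: noisy snapshots of a 2-mode rotation (eigenvalues exp(+/- 0.2j))
    rng = np.random.default_rng(3)
    theta = 0.2
    A = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
    snaps, x = [], np.array([1.0, 0.0])
    for _ in range(60):
        x = A @ x
        snaps.append(x)
    data = np.array(snaps).T + 0.01 * rng.standard_normal((2, 60))
    X, Y = data[:, :-1], data[:, 1:]
    print("DMD :", np.sort(dmd_eigs(X, Y, 2)))
    print("TDMD:", np.sort(tdmd_eigs(X, Y, 2)))
    ```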

  20. Proficiency Standards and Cut-Scores for Language Proficiency Tests.

    ERIC Educational Resources Information Center

    Moy, Raymond H.

    1984-01-01

    Discusses the problems associated with "grading on a curve," the approach often used for standard setting on language proficiency tests. Proposes four main steps for setting a non-arbitrary cut-score. These steps not only establish a proficiency standard checked against external criteria, but also check to see that the test covers the…

  1. Developing Knowledgeable Teachers: A Framework for Standards-Based Teacher Education Supported by Institutional Collaboration. The STEP Reports.

    ERIC Educational Resources Information Center

    Garvin, Patty, Ed.

    This collection of papers describes the process of creating a standards-based teacher education program through strong collaboration among arts and science, education, and P-12 faculty members and administrators. The Standards-based Teacher Education Project (STEP) was designed to help teacher education programs ensure that their graduates know…

  2. [Study on control and management for industrial volatile organic compounds (VOCs) in China].

    PubMed

    Wang, Hai-Lin; Zhang, Guo-Ning; Nei, Lei; Wang, Yu-Fei; Hao, Zheng-Ping

    2011-12-01

    Volatile organic compounds (VOCs) emitted from industrial sources account for a large share of total anthropogenic VOCs. In this paper, VOCs emission characteristics, control technologies and management are discussed. VOCs from industrial emissions are characterized by high intensity, wide coverage and uneven distribution, concentrated in the Beijing-Tianjin Joint Belt, the Shandong Peninsula, the Yangtze River Delta and the Pearl River Delta. Current technologies for VOCs treatment include adsorption, catalytic combustion, biodegradation and others, which are applied in petrochemical production, oil vapor recovery, shipbuilding, printing, pharmaceuticals, feather manufacturing and other industries. The scarcity of related regulations and standards, together with ineffective supervision, makes VOCs management difficult. It is therefore suggested that VOCs treatment first be implemented in key areas and industries and then extended step by step. By establishing a control system for actual emission reduction amounts and more detailed VOCs emission standards and regulations, applying practical technologies together with demonstration projects, and setting up a VOCs emission registration and classification-based charging system, VOCs could be reduced effectively.

  3. Current surgical management of mitral regurgitation.

    PubMed

    Calvinho, Paulo; Antunes, Manuel

    2008-04-01

    From Walton Lillehei, who performed the first successful open mitral valve surgery in 1956, to the advent of robotic surgery in the 21st century, only 50 years have passed. The introduction of the first heart valve prosthesis, in 1960, was the next major step forward. Correction of mitral disease by valvuloplasty results in better survival and ventricular performance than mitral valve replacement; however, the European Heart Survey demonstrated that only 40% of valves are repaired. The standard procedures (Carpentier's techniques and Alfieri's edge-to-edge suture) are the surgical basis for the new technical approaches. Minimally invasive surgery led to the development of video-assisted and robotic surgery, and interventional cardiology is already taking the first steps toward endovascular procedures, using the classical concepts in highly differentiated approaches. Correction of mitral regurgitation is a complex and still-growing field, and classic surgery remains under debate as the new era arises.

  4. Cross-linked polyvinyl alcohol films as alkaline battery separators

    NASA Technical Reports Server (NTRS)

    Sheibley, D. W.; Manzo, M. A.; Gonzalez-Sanabria, O. D.

    1983-01-01

    Cross-linking methods have been investigated to determine their effect on the performance of polyvinyl alcohol (PVA) films as alkaline battery separators. The following types of cross-linked PVA films are discussed: (1) PVA-dialdehyde blends post-treated with an acid or acid periodate solution (two-step method) and (2) PVA-dialdehyde blends cross-linked during film formation (drying) by using a reagent with both aldehyde and acid functionality (one-step method). Laboratory samples of each cross-linked type of film were prepared and evaluated in standard separator screening tests. Then pilot-plant batches of films were prepared and compared to measure differences due to the cross-linking method. The pilot-plant materials were then tested in nickel oxide-zinc cells to compare the two methods with respect to performance characteristics and cycle life. Cell test results are compared with those from tests with Celgard.

  5. Cross-linked polyvinyl alcohol films as alkaline battery separators

    NASA Technical Reports Server (NTRS)

    Sheibley, D. W.; Manzo, M. A.; Gonzalez-Sanabria, O. D.

    1982-01-01

    Cross-linking methods were investigated to determine their effect on the performance of polyvinyl alcohol (PVA) films as alkaline battery separators. The following types of cross-linked PVA films are discussed: (1) PVA-dialdehyde blends post-treated with an acid or acid periodate solution (two-step method) and (2) PVA-dialdehyde blends cross-linked during film formation (drying) by using a reagent with both aldehyde and acid functionality (one-step method). Laboratory samples of each cross-linked type of film were prepared and evaluated in standard separator screening tests. The pilot-plant batches of films were prepared and compared to measure differences due to the cross-linking method. The pilot-plant materials were then tested in nickel oxide - zinc cells to compare the two methods with respect to performance characteristics and cycle life. Cell test results are compared with those from tests with Celgard.

  6. Segmentation and determination of joint space width in foot radiographs

    NASA Astrophysics Data System (ADS)

    Schenk, O.; de Muinck Keizer, D. M.; Bernelot Moens, H. J.; Slump, C. H.

    2016-03-01

    Joint damage in rheumatoid arthritis is frequently assessed using radiographs of hands and feet. Evaluation includes measurements of the joint space width (JSW) and detection of erosions. Current visual scoring methods are time-consuming and subject to inter- and intra-observer variability. Automated measurement methods avoid these limitations and have been fairly successful in hand radiographs. This contribution addresses foot radiographs. Starting from an earlier proposed automated segmentation method, we have developed a novel model-based image analysis algorithm for JSW measurements. This method uses active appearance and active shape models to identify individual bones. The model comprises ten submodels, each representing a specific bone of the foot (metatarsals 1-5, proximal phalanges 1-5). We performed segmentation experiments using 24 foot radiographs, randomly selected from a large database from the rheumatology department of a local hospital: 10 for training and 14 for testing. Segmentation was considered successful if the joint locations were correctly determined. Segmentation was successful in only 14% of cases. To improve the results, a step-by-step analysis will be performed. We performed JSW measurements on 14 randomly selected radiographs. JSW was successfully measured in 75% of cases, with a mean and standard deviation of 2.30 ± 0.36 mm. This is a first step towards automated determination of the progression of RA and of therapy response in feet using radiographs.

  7. Metabolomics evaluation of early-storage red blood cell rejuvenation at 4°C and 37°C.

    PubMed

    Gehrke, Sarah; Srinivasan, Amudan J; Culp-Hill, Rachel; Reisz, Julie A; Ansari, Andrea; Gray, Alan; Landrigan, Matthew; Welsby, Ian; D'Alessandro, Angelo

    2018-04-24

    Refrigerated red blood cell (RBC) storage results in the progressive accumulation of biochemical and morphological alterations collectively referred to as the storage lesion. Storage-induced metabolic alterations can be partly reversed by rejuvenation practices. However, rejuvenation requires an incubation step of RBCs for 1 hour at 37°C, limiting the practicality of providing "on-demand" rejuvenated RBCs. We tested the hypothesis that the addition of rejuvenation solution early in storage as an adjunct additive solution would prevent, within a time window consistent with the average age of units transfused to sickle cell recipients at Duke (15 days), many of the adverse biochemical changes that can be reversed via standard rejuvenation, while obviating the incubation step. Metabolomics analyses were performed on cells and supernatants from AS-1 RBC units (n = 4) stored for 15 days. Units were split into pediatric bag aliquots and stored at 4°C. These were untreated controls, washed with or without rejuvenation, performed under either standard (37°C) or cold (4°C) conditions. All three treatments removed most metabolic storage by-products from RBC supernatants. However, only standard and cold rejuvenation provided significant metabolic benefits, as judged by the reactivation of glycolysis and regeneration of adenosine triphosphate and 2,3-diphosphoglycerate. Improvements in energy metabolism also translated into an increased capacity to restore the total glutathione pool and regenerate oxidized vitamin C in its reduced (ascorbate) form. Cold and standard rejuvenation of 15-day-old RBCs primes the energy and redox metabolism of stored RBCs, while providing a logistic advantage for routine blood bank processing workflows. © 2018 AABB.

  8. The effects of acute experimental hip muscle pain on dynamic single-limb balance performance in healthy middle-aged adults.

    PubMed

    Hatton, Anna L; Hug, François; Chen, Sarah H; Reid, Christine; Sorensen, Nicole A; Tucker, Kylie

    2016-10-01

    Middle-aged adults with painful hip conditions show balance impairments that are consistent with an increased risk of falls. Pathological changes at the hip, accompanied by pain, may accelerate pre-existing age-related balance deficits present in midlife. To consider the influence of pain alone, we investigated the effects of acute experimental hip muscle pain on dynamic single-limb balance in middle-aged adults. Thirty-four healthy adults aged 40-60 years formed two groups (Group-1: n = 16; Group-2: n = 18). Participants performed four tasks: Reactive Sideways Stepping (ReactSide); Star Excursion Balance Test (SEBT); Step Test; Single-Limb Squat; before and after an injection of hypertonic saline into the right gluteus medius muscle (Group-1) or ∼5 min rest (Group-2). Balance measures included the range and standard deviation of centre of pressure (CoP) movement in mediolateral and anterior-posterior directions, and CoP total path velocity (ReactSide, Squat); reach distance (SEBT); and number of completed steps (Step Test). Data were assessed using three-way analysis of variance. Motor outcomes were altered during the second repetition of tasks irrespective of exposure to experimental hip muscle pain or rest, with reduced SEBT anterior reach (-1.2 ± 4.1 cm, P = 0.027); greater step number during the Step Test (1.5 ± 1.7 steps, P < 0.001); and slower CoP velocity during the Single-Limb Squat (-4.9 ± 9.4 mm/s, P = 0.024). Factors other than the presence of pain may play a greater role in balance impairments in middle-aged adults with hip pathologies. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Easy-to-learn cardiopulmonary resuscitation training programme: a randomised controlled trial on laypeople’s resuscitation performance

    PubMed Central

    Ko, Rachel Jia Min; Lim, Swee Han; Wu, Vivien Xi; Leong, Tak Yam; Liaw, Sok Ying

    2018-01-01

    INTRODUCTION Simplifying the learning of cardiopulmonary resuscitation (CPR) is advocated to improve skill acquisition and retention. A simplified CPR training programme focusing on continuous chest compression, with a simple landmark tracing technique, was introduced to laypeople. The study aimed to examine the effectiveness of the simplified CPR training in improving lay rescuers’ CPR performance as compared to standard CPR. METHODS A total of 85 laypeople (aged 21–60 years) were recruited and randomly assigned to undertake either a two-hour simplified or standard CPR training session. They were tested two months after the training on a simulated cardiac arrest scenario. Participants’ performance on the sequence of CPR steps was observed and evaluated using a validated CPR algorithm checklist. The quality of chest compression and ventilation was assessed from the recording manikins. RESULTS The simplified CPR group performed significantly better on the CPR algorithm when compared to the standard CPR group (p < 0.01). No significant difference was found between the groups in time taken to initiate CPR. However, a significantly higher number of compressions and proportion of adequate compressions was demonstrated by the simplified group than the standard group (p < 0.01). Hands-off time was significantly shorter in the simplified CPR group than in the standard CPR group (p < 0.001). CONCLUSION Simplifying the learning of CPR by focusing on continuous chest compressions, with simple hand placement for chest compression, could lead to better acquisition and retention of CPR algorithms, and better quality of chest compressions than standard CPR. PMID:29167910

  10. Easy-to-learn cardiopulmonary resuscitation training programme: a randomised controlled trial on laypeople's resuscitation performance.

    PubMed

    Ko, Rachel Jia Min; Lim, Swee Han; Wu, Vivien Xi; Leong, Tak Yam; Liaw, Sok Ying

    2018-04-01

    Simplifying the learning of cardiopulmonary resuscitation (CPR) is advocated to improve skill acquisition and retention. A simplified CPR training programme focusing on continuous chest compression, with a simple landmark tracing technique, was introduced to laypeople. The study aimed to examine the effectiveness of the simplified CPR training in improving lay rescuers' CPR performance as compared to standard CPR. A total of 85 laypeople (aged 21-60 years) were recruited and randomly assigned to undertake either a two-hour simplified or standard CPR training session. They were tested two months after the training on a simulated cardiac arrest scenario. Participants' performance on the sequence of CPR steps was observed and evaluated using a validated CPR algorithm checklist. The quality of chest compression and ventilation was assessed from the recording manikins. The simplified CPR group performed significantly better on the CPR algorithm when compared to the standard CPR group (p < 0.01). No significant difference was found between the groups in time taken to initiate CPR. However, a significantly higher number of compressions and proportion of adequate compressions was demonstrated by the simplified group than the standard group (p < 0.01). Hands-off time was significantly shorter in the simplified CPR group than in the standard CPR group (p < 0.001). Simplifying the learning of CPR by focusing on continuous chest compressions, with simple hand placement for chest compression, could lead to better acquisition and retention of CPR algorithms, and better quality of chest compressions than standard CPR. Copyright: © Singapore Medical Association.

  11. A novel partial volume effects correction technique integrating deconvolution associated with denoising within an iterative PET image reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merlin, Thibaut, E-mail: thibaut.merlin@telecom-bretagne.eu; Visvikis, Dimitris; Fernandez, Philippe

    2015-02-15

    Purpose: Partial volume effect (PVE) plays an important role in both qualitative and quantitative PET image accuracy, especially for small structures. A previously proposed voxelwise PVE correction method applied on PET reconstructed images involves the use of Lucy–Richardson deconvolution incorporating wavelet-based denoising to limit the associated propagation of noise. The aim of this study is to incorporate the deconvolution, coupled with the denoising step, directly inside the iterative reconstruction process to further improve PVE correction. Methods: The list-mode ordered subset expectation maximization (OSEM) algorithm has been modified accordingly with the application of the Lucy–Richardson deconvolution algorithm to the current estimation of the image, at each reconstruction iteration. Acquisitions of the NEMA NU2-2001 IQ phantom were performed on a GE DRX PET/CT system to study the impact of incorporating the deconvolution inside the reconstruction [with and without the point spread function (PSF) model] in comparison to its application postreconstruction and to standard iterative reconstruction incorporating the PSF model. The impact of the denoising step was also evaluated. Images were semiquantitatively assessed by studying the trade-off between the intensity recovery and the noise level in the background estimated as relative standard deviation. Qualitative assessments of the developed methods were additionally performed on clinical cases. Results: Incorporating the deconvolution without denoising within the reconstruction achieved superior intensity recovery in comparison to both standard OSEM reconstruction integrating a PSF model and application of the deconvolution algorithm in a postreconstruction process. The addition of the denoising step made it possible to limit the SNR degradation while preserving the intensity recovery. Conclusions: This study demonstrates the feasibility of incorporating the Lucy–Richardson deconvolution associated with a wavelet-based denoising in the reconstruction process to better correct for PVE. Future work includes further evaluations of the proposed method on clinical datasets and the use of improved PSF models.
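
    The record above does not spell out the deconvolution update it embeds in the OSEM loop, but the Lucy–Richardson algorithm it names has a standard multiplicative form. The sketch below applies a few such iterations to a 2-D image estimate under an assumed Gaussian point spread function; the PSF width, toy image and iteration count are illustrative assumptions, not the authors' implementation (which also couples a wavelet-based denoising step).

    ```python
    # Minimal sketch of Lucy-Richardson deconvolution as it could be applied to a
    # current image estimate; the Gaussian PSF, toy image and iteration count are
    # assumptions for illustration only (the paper also adds wavelet denoising).
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def lucy_richardson(observed, psf_sigma=2.0, n_iter=10, eps=1e-8):
        # A Gaussian PSF is symmetric, so convolving with the "flipped" PSF in the
        # correction term is the same Gaussian filtering operation.
        estimate = np.full_like(observed, observed.mean(), dtype=float)
        for _ in range(n_iter):
            blurred = gaussian_filter(estimate, psf_sigma)      # H x_k
            ratio = observed / np.maximum(blurred, eps)         # y / (H x_k)
            estimate *= gaussian_filter(ratio, psf_sigma)       # multiplicative update
        return estimate

    # Toy usage: recover a blurred square
    truth = np.zeros((64, 64)); truth[28:36, 28:36] = 1.0
    observed = gaussian_filter(truth, 2.0)
    recovered = lucy_richardson(observed)
    ```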

  12. Diagnostic Laparoscopy for Trauma: How Not to Miss Injuries.

    PubMed

    Koto, Modise Z; Matsevych, Oleh Y; Aldous, Colleen

    2018-05-01

    Diagnostic laparoscopy (DL) is a well-accepted approach for penetrating abdominal trauma (PAT). However, the steps of the procedure and the systematic laparoscopic examination are not clearly defined in the literature. The aim of this study was to clarify the definition of DL in trauma surgery by auditing DL performed for PAT at our institution, and to describe the strategies on how to avoid missed injuries. The data of patients managed with laparoscopy for PAT from January 2012 to December 2015 were retrospectively analyzed. The details of the operative technique and strategies on how to avoid missed injuries were discussed. Out of 250 patients managed with laparoscopy for PAT, 113 (45%) underwent DL. Stab wounds were sustained by 94 (83%) patients. Penetration of the peritoneal cavity or retroperitoneum was documented in 67 (59%) of patients. Organ evisceration was present in 21 (19%) patients. Multiple injuries were present in 22% of cases. The chest was the most common associated injury. Two (1.8%) iatrogenic injuries were recorded. The conversion rate was 1.7% (2/115). The mean length of hospital stay was 4 days. There were no missed injuries. In the therapeutic laparoscopy (TL) group, DL was performed as the initial part of the procedure and identified all injuries. There were no missed injuries in the TL group. The predetermined sequential steps of DL and the standard systematic examination of intraabdominal organs were described. DL is a feasible and safe procedure. It accurately identifies intraabdominal injuries. The selective use of preoperative imaging, adherence to the predetermined steps of the procedure and the standard systematic laparoscopic examination will minimize the rate of missed injuries.

  13. Application of UV-Vis spectrophotometric process for the assessment of indoloacridines as free radical scavenger.

    PubMed

    Sridharan, Makuteswaran; Prasad, K J Rajendra; Madhumitha, G; Al-Dhabi, Naif Abdullah; Arasu, Mariadhas Valan

    2016-09-01

    A conventional approach was used to synthesize the indole-fused acridines 4a-e. To obtain the target molecule 4, the reaction was performed in two steps. In step 1, carbazolone 1 was reacted with benzophenone 2 to give dihydroindoloacridine 3. In step 2, compound 3 was treated with 5% palladium/carbon in the presence of diphenyl ether for 5 h to give the dark brown product 4, which was purified by column chromatography. The synthesized compounds 3 and 4 were characterized by melting point, FTIR, (1)H NMR, and mass spectra, and their purity was further checked by CHN analysis. Compounds 3 and 4 were screened for antimicrobial activity against bacteria such as Bacillus subtilis (B. subtilis), Staphylococcus aureus (S. aureus), Klebsiella pneumoniae (K. pneumoniae) and Salmonella typhi (S. typhi), and fungi such as Aspergillus niger (A. niger) and Aspergillus fumigatus (A. fumigatus). The results show that the target molecules have reasonable activity against K. pneumoniae and A. niger. The compounds were also screened for free radical scavenging activity against 2,2-diphenyl-1-picrylhydrazyl (DPPH), measured by UV-Visible spectroscopy and compared with the standard BHT (butylated hydroxytoluene). Compounds 4a and 4e showed a higher percentage of inhibition than the standard. The results suggest that further research on indoloacridines could lead to effective drugs. Copyright © 2016 Elsevier B.V. All rights reserved.
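
    The record does not state the scavenging formula, but DPPH assays conventionally report percentage inhibition from the absorbance of a control and a sample measured by UV-Vis; the snippet below sketches that calculation with made-up absorbance values (the numbers are not from the paper).

    ```python
    # Conventional DPPH scavenging calculation:
    #   % inhibition = (A_control - A_sample) / A_control * 100
    # Absorbance values below are invented for illustration, not taken from the paper.
    def percent_inhibition(a_control: float, a_sample: float) -> float:
        return (a_control - a_sample) / a_control * 100.0

    a_control = 0.820                                     # DPPH solution alone
    samples = {"4a": 0.195, "4e": 0.210, "BHT": 0.260}    # hypothetical readings
    for name, absorbance in samples.items():
        print(f"{name}: {percent_inhibition(a_control, absorbance):.1f}% inhibition")
    ```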

  14. Utilization of International Association of Diabetes and Pregnancy Study Groups criteria vs. a two-step approach to screening for gestational diabetes mellitus in Chinese women with twin pregnancies.

    PubMed

    Liu, X; Chen, Y; Zhou, Q; Shi, H; Cheng, W W

    2015-03-01

    To evaluate prevalence and pregnancy outcomes using the International Association of Diabetes and Pregnancy Study Groups (IADPSG) criteria and screening protocol vs. a standard two-step screening approach for gestational diabetes mellitus in Chinese twin pregnancies. A retrospective cohort study for pregnancies during 2007-2013 was performed in a tertiary hospital in Shanghai, China. Data were abstracted from the medical records of twin pregnancies delivered at the hospital. During the period 2007-2011, this hospital used a two-step approach with a 50 g screening with a cut-off value of ≥ 7.8 mmol/l followed by a 100 g diagnostic oral glucose tolerance test (OGTT) utilizing Carpenter-Coustan criteria. In 2012-2013, the hospital switched to the IADPSG protocol of universal 75 g OGTT. Among 1461 twin pregnancies, 643 were screened utilizing IADPSG criteria and 818 using the two-step protocol. Gestational diabetes mellitus was diagnosed more frequently in the IADPSG group than in the two-step group [20.4% and 7.0%, respectively; adjusted odds ratio (aOR) = 3.22; 95% confidence interval (CI) = 2.30-4.52]. During the IADPSG period, the incidence of pre-eclampsia was 38% lower in non-gestational diabetes mellitus affected pregnancies compared with the two-step period (aOR = 0.62; 95% CI = 0.44-0.87). We observed no significant differences in most perinatal outcomes between the two groups. Compared with a standard two-step approach to screening and diagnosis, the IADPSG screening method resulted in a three-fold increase in the incidence of gestational diabetes mellitus in twin pregnancies, with a 38% lower risk of pre-eclampsia but no significant difference in most perinatal outcomes in non-gestational diabetes mellitus affected pregnancies. © 2014 The Authors. Diabetic Medicine © 2014 Diabetes UK.
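
    The odds ratios quoted above are adjusted estimates from the study itself. Purely as an illustration of the arithmetic, a crude (unadjusted) odds ratio with a Wald confidence interval can be reconstructed from the reported group sizes and gestational diabetes proportions; the counts below are rounded from the percentages, so the result only approximates the adjusted value of 3.22.

    ```python
    # Crude odds ratio for a GDM diagnosis, IADPSG period vs. two-step period.
    # Counts are rounded from the reported proportions (643 x 20.4% ~ 131 cases,
    # 818 x 7.0% ~ 57 cases); no adjustment for covariates is attempted here.
    import math

    a, b = 131, 643 - 131      # GDM cases / non-cases, IADPSG period
    c, d = 57, 818 - 57        # GDM cases / non-cases, two-step period

    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
    ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
    print(f"crude OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
    ```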

  15. Two-step glutamate dehydrogenase antigen real-time polymerase chain reaction assay for detection of toxigenic Clostridium difficile.

    PubMed

    Goldenberg, S D; Cliff, P R; Smith, S; Milner, M; French, G L

    2010-01-01

    Current diagnosis of Clostridium difficile infection (CDI) relies upon detection of toxins A/B in stool by enzyme immunoassay [EIA(A/B)]. This strategy is unsatisfactory because it has a low sensitivity resulting in significant false negatives. We investigated the performance of a two-step algorithm for diagnosis of CDI using detection of glutamate dehydrogenase (GDH). GDH-positive samples were tested for C. difficile toxin B gene (tcdB) by polymerase chain reaction (PCR). The performance of the two-step protocol was compared with toxin detection by the Meridian Premier EIA kit in 500 consecutive stool samples from patients with suspected CDI. The reference standard among samples that were positive by either EIA(A/B) or GDH testing was culture cytotoxin neutralisation (culture/CTN). Thirty-six (7%) of 500 samples were identified as true positives by culture/CTN. EIA(A/B) identified 14 of the positive specimens with 22 false negatives and two false positives. The two-step protocol identified 34 of the positive samples with two false positives and two false negatives. EIA(A/B) had a sensitivity of 39%, specificity of 99%, positive predictive value of 88% and negative predictive value of 95%. The two-step algorithm performed better, with corresponding values of 94%, 99%, 94% and 99% respectively. Screening for GDH before confirmation of positives by PCR is cheaper than screening all specimens by PCR and is an effective method for routine use. Current EIA(A/B) tests for CDI are of inadequate sensitivity and should be replaced; however, this may result in apparent changes in CDI rates that would need to be explained in national surveillance statistics. Copyright 2009 The Hospital Infection Society. Published by Elsevier Ltd. All rights reserved.
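
    The accuracy figures quoted above follow from the stated counts (500 stools, 36 culture/CTN-positive, with 14 true positives, 2 false positives and 22 false negatives for EIA(A/B), versus 34, 2 and 2 for the two-step protocol); the sketch below reproduces them, to rounding, from the standard 2 x 2 definitions.

    ```python
    # Reproduce the reported test characteristics from the stated counts
    # (500 stools in total, 36 true positives by culture/CTN).
    def test_characteristics(tp, fp, fn, total=500):
        tn = total - tp - fp - fn
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "PPV": tp / (tp + fp),
            "NPV": tn / (tn + fn),
        }

    counts = {"EIA(A/B)": (14, 2, 22), "two-step GDH + tcdB PCR": (34, 2, 2)}
    for name, (tp, fp, fn) in counts.items():
        stats = test_characteristics(tp, fp, fn)
        print(name, {k: f"{v:.1%}" for k, v in stats.items()})
    ```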

  16. Quality of haemophilia care in The Netherlands: new standards for optimal care.

    PubMed

    Leebeek, Frank W G; Fischer, Kathelijn

    2014-04-01

    In the Netherlands, the first formal haemophilia comprehensive care centre was established in 1964, and Dutch haemophilia doctors have been organised since 1972. Although several steps were taken to centralise haemophilia care and maintain quality of care, treatment was still delivered in many hospitals, and formal criteria for haemophilia treatment centres as well as a national haemophilia registry were lacking. In collaboration with patients and other stakeholders, Dutch haemophilia doctors have undertaken a formal process to draft new quality standards for the haemophilia treatment centres. First, a project group including doctors, nurses, patients and the institute for harmonisation of quality standards undertook a literature study on quality standards and performed explorative visits to several haemophilia treatment centres in the Netherlands. Afterwards, draft standards were defined and validated in two treatment centres. Next, the draft standards were evaluated by haemophilia doctors, patients, health insurance representatives and regulators. Finally, the final version of the standards of care was approved by the Central Body of Experts on quality standards in clinical care and the Dutch Ministry of Health. A team of expert auditors has been trained and, together with an independent auditor, will perform audits in haemophilia centres applying for formal certification. Concomitantly, a national registry for haemophilia and allied disorders is being set up. It is expected that these processes will lead to further concentration and improved quality of haemophilia care in the Netherlands.

  17. Scanning tunneling microscope study of GaAs(001) surfaces grown by migration enhanced epitaxy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, J.; Gallagher, M.C.; Willis, R.F.

    We report an investigation of the morphology of p-type GaAs(001) surfaces using scanning tunneling microscopy (STM). The substrates were prepared using two methods: migration enhanced epitaxy (MEE) and standard molecular-beam epitaxy (MBE). The STM measurements were performed ex situ using As decapping. Analysis indicates that the overall step density of the MEE samples decreases as the growth temperature is increased. Nominally flat samples grown at 300°C exhibited step densities of 10.5 steps/1000 Å along [110], dropping to 2.5 steps at 580°C. MEE samples exhibited a lower step density than MBE samples. However, as-grown surfaces exhibited a larger distribution of step heights. Annealing the samples reduced the step height distribution, exposing fewer atomic layers. Samples grown by MEE at 580°C and annealed for 2 min displayed the lowest step density and the narrowest step height distribution. All samples displayed an anisotropic step density. We found a ratio of A-type to B-type steps of between 2 and 3, which directly reflects the difference in the incorporation energy at steps. The aspect ratio increased slightly with growth temperature. We found a similar aspect ratio on samples grown by MBE. This indicates that anisotropic growth during MEE, like MBE, is dominated by incorporation kinetics. MEE samples grown at 580°C and capped immediately following growth exhibited a number of "holes" in the surface. The holes could be eliminated by annealing the surface prior to quenching. 20 refs., 3 figs., 1 tab.

  18. Implementation of the AMEDD (Army Medical Department) Standards of Nursing Practice: An Evaluation.

    DTIC Science & Technology

    1987-01-29

    Self-Care Deficit (Specify level: Feeding, Bathing/Hygiene, Dressing/Grooming, Toileting K Self-Concept, Alteration In: Body Image, Self-Esteem, Role...Performance, Personal Identity K Self-Concept, Disturbance in G Self-Dressing-Grooming Deficit (Specify Level) G Self-Esteem Disturbance G Self-Feeding...were conceptualized as working documents providing the foundation for the profession's self-monitoring (M. Phaneuf, M. Wandelt, 1974). The next step

  19. Standardization of Performance Tests: A Proposal for Further Steps.

    DTIC Science & Technology

    1986-07-01

    obviously demand substantial attention can sometimes be time shared perfectly. Wickens describes cases in which skilled pianists can time share sight-reading... effects of divided attention on information processing in tracking. Journal of Experimental Psychology, 1, 1-13. Wickens, C.D. (1984). Processing resources... attention he regards focused-divided attention tasks (e.g. dichotic listening, dual task situations) as theoretically useful. From his point of view good

  20. 10 CFR Appendix A to Subpart U of... - Sampling Plan for Enforcement Testing of Electric Motors

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... based on a 20 percent tolerance in the total power loss at full-load and fixed output power. Given the... performance of the n1 units in the first sample as follows: ER83AD04.005 where Xi is the measured full-load efficiency of unit i. Step 3. Compute the sample standard deviation (S1) of the measured full-load efficiency...

  1. 10 CFR Appendix A to Subpart U of... - Sampling Plan for Enforcement Testing of Electric Motors

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... based on a 20 percent tolerance in the total power loss at full-load and fixed output power. Given the... performance of the n1 units in the first sample as follows: ER83AD04.005 where Xi is the measured full-load efficiency of unit i. Step 3. Compute the sample standard deviation (S1) of the measured full-load efficiency...
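
    Both excerpts above reference the same computation: Step 2 takes the mean of the measured full-load efficiencies of the first sample of n1 units, and Step 3 takes their sample standard deviation. The equation images (e.g. ER83AD04.005) are not reproduced in these records, so the sketch below uses the standard n-1 formula with made-up efficiency values.

    ```python
    # Steps 2-3 of the sampling plan: sample mean and sample standard deviation
    # (n - 1 denominator) of the measured full-load efficiencies X_i of the first
    # sample of n1 units. The efficiency values are illustrative only.
    import math

    x = [91.2, 90.8, 91.5, 90.9, 91.1]     # measured full-load efficiencies, percent
    n1 = len(x)

    x_bar = sum(x) / n1                                             # Step 2
    s1 = math.sqrt(sum((xi - x_bar) ** 2 for xi in x) / (n1 - 1))   # Step 3

    print(f"mean efficiency = {x_bar:.3f}%, S1 = {s1:.3f}%")
    ```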

  2. Distinct motor impairments of dopamine D1 and D2 receptor knockout mice revealed by three types of motor behavior

    PubMed Central

    Nakamura, Toru; Sato, Asako; Kitsukawa, Takashi; Momiyama, Toshihiko; Yamamori, Tetsuo; Sasaoka, Toshikuni

    2014-01-01

    Knockout (KO) mice for either of the major dopamine receptors, D1R and D2R, show significant motor impairments. However, there are some discrepant reports, which may be due to differences in genetic background and experimental procedures. In addition, only a few studies have directly compared the motor performance of D1R and D2R KO mice. In this paper, we examined the behavioral differences among N10 congenic D1R KO, D2R KO, and wild-type (WT) mice. First, we examined spontaneous motor activity in the home cage environment for 5 consecutive days. Second, we examined motor performance using the rota-rod task, a standard motor task in rodents. Third, we examined motor ability with the Step-Wheel task, in which mice were trained to run in a motor-driven turning wheel, adjusting their steps on foothold pegs to drink water. The results showed clear differences among the mice of the three genotypes in three different types of behavior. In monitoring spontaneous motor activities, D1R and D2R KO mice showed higher and lower 24 h activities, respectively, than WT mice. In the rota-rod task at a low speed, D1R KO mice showed poor performance but later improved, whereas D2R KO mice showed good performance in the early days without further improvement. When first subjected to a high-speed task, the D2R KO mice showed poorer rota-rod performance at a low speed than the D1R KO mice. In the Step-Wheel task, across daily sessions, D2R KO mice increased the duration for which they ran sufficiently close to the spout to drink water, and decreased both the time spent touching the floor after missing the peg steps and the number of times the wheel was stopped; this performance was much better than that of the D1R KO mice. These incongruent results between the two tasks for D1R and D2R KO mice may be due to differences in motivation for the rota-rod and Step-Wheel tasks, which are aversion- and reward-driven, respectively. The Step-Wheel system may become a useful tool for assessing the motor ability of WT and mutant mice. PMID:25076876

  3. Distinct motor impairments of dopamine D1 and D2 receptor knockout mice revealed by three types of motor behavior.

    PubMed

    Nakamura, Toru; Sato, Asako; Kitsukawa, Takashi; Momiyama, Toshihiko; Yamamori, Tetsuo; Sasaoka, Toshikuni

    2014-01-01

    Knockout (KO) mice for either of the major dopamine receptors, D1R and D2R, show significant motor impairments. However, there are some discrepant reports, which may be due to differences in genetic background and experimental procedures. In addition, only a few studies have directly compared the motor performance of D1R and D2R KO mice. In this paper, we examined the behavioral differences among N10 congenic D1R KO, D2R KO, and wild-type (WT) mice. First, we examined spontaneous motor activity in the home cage environment for 5 consecutive days. Second, we examined motor performance using the rota-rod task, a standard motor task in rodents. Third, we examined motor ability with the Step-Wheel task, in which mice were trained to run in a motor-driven turning wheel, adjusting their steps on foothold pegs to drink water. The results showed clear differences among the mice of the three genotypes in three different types of behavior. In monitoring spontaneous motor activities, D1R and D2R KO mice showed higher and lower 24 h activities, respectively, than WT mice. In the rota-rod task at a low speed, D1R KO mice showed poor performance but later improved, whereas D2R KO mice showed good performance in the early days without further improvement. When first subjected to a high-speed task, the D2R KO mice showed poorer rota-rod performance at a low speed than the D1R KO mice. In the Step-Wheel task, across daily sessions, D2R KO mice increased the duration for which they ran sufficiently close to the spout to drink water, and decreased both the time spent touching the floor after missing the peg steps and the number of times the wheel was stopped; this performance was much better than that of the D1R KO mice. These incongruent results between the two tasks for D1R and D2R KO mice may be due to differences in motivation for the rota-rod and Step-Wheel tasks, which are aversion- and reward-driven, respectively. The Step-Wheel system may become a useful tool for assessing the motor ability of WT and mutant mice.

  4. Obtaining accreditation by the pharmacy compounding accreditation board, part 2: developing essential standard operating procedures.

    PubMed

    Cabaleiro, Joe

    2007-01-01

    A key component of qualifying for accreditation with the Pharmacy Compounding Accreditation Board is having a set of comprehensive standard operating procedures that are being used by the pharmacy staff. The three criteria the Pharmacy Compounding Accreditation Board looks for in standard operating procedures are: (1) written standard operating procedures; (2) standard operating procedures that reflect what the organization actually does; and (3) whether the written standard operating procedures are implemented. Following specified steps in the preparation of standard operating procedures will result in procedures that meet Pharmacy Compounding Accreditation Board requirements, thereby placing pharmacies one step closer to qualifying for accreditation.

  5. Similarities and differences among half-marathon runners according to their performance level

    PubMed Central

    Morante, Juan Carlos; Gómez-Molina, Josué; García-López, Juan

    2018-01-01

    This study aimed to identify the similarities and differences among half-marathon runners in relation to their performance level. Forty-eight male runners were classified into 4 groups according to their performance level in a half-marathon (min): Group 1 (n = 11, < 70 min), Group 2 (n = 13, < 80 min), Group 3 (n = 13, < 90 min), Group 4 (n = 11, < 105 min). In two separate sessions, training-related, anthropometric, physiological, foot strike pattern and spatio-temporal variables were recorded. Significant differences (p<0.05) between groups (ES = 0.55–3.16) and correlations with performance were obtained (r = 0.34–0.92) in training-related (experience and running distance per week), anthropometric (mass, body mass index and sum of 6 skinfolds), physiological (VO2max, RCT and running economy), foot strike pattern and spatio-temporal variables (contact time, step rate and length). At standardized submaximal speeds (11, 13 and 15 km·h⁻¹), no significant differences between groups were observed in step rate and length, neither in contact time when foot strike pattern was taken into account. In conclusion, apart from training-related, anthropometric and physiological variables, foot strike pattern and step length were the only biomechanical variables sensitive to half-marathon performance, which are essential to achieve high running speeds. However, when foot strike pattern and running speeds were controlled (submaximal test), the spatio-temporal variables were similar. This indicates that foot strike pattern and running speed are responsible for spatio-temporal differences among runners of different performance level. PMID:29364940

  6. Anti-Propionibacterium acnes assay-guided purification of brazilin and preparation of brazilin rich extract from Caesalpinia sappan heartwood.

    PubMed

    Nirmal, Nilesh Prakash; Panichayupakaranant, Pharkphoom

    2014-09-01

    Caesalpinia sappan L. (Leguminosae or Fabaceae) heartwood has been used as a coloring agent, with antibacterial activity in food, beverages, cosmetics, and garments. To purify brazilin from C. sappan heartwood and use it as a standard marker for the preparation and standardization of an active constituent-rich extract. Crude ethanol extracts of C. sappan heartwood (CSE) were fractionated to isolate brazilin by an anti-P. acnes assay-guided isolation. Quantitative analysis was performed by HPLC. Minimum inhibitory concentrations (MICs) and minimum bactericidal concentrations (MBCs) were determined by the broth microdilution method. Brazilin isolated from CSE possessed antibacterial activity against P. acnes with MIC and MBC values of 15.6 and 31.2 µg/mL, respectively. Brazilin was, therefore, used as a standard marker for standardization and preparation of a brazilin-rich extract (BRE). BRE was prepared from CSE by a simple one-step purification using a macroporous resin column eluted with 35% v/v ethanol. This method increased the brazilin content in the BRE up to 39.9% w/w. The antibacterial activity of the standardized BRE against acne-involved bacteria was higher than that of the CSE but lower than that of brazilin. However, for industrial applications, a large-scale one-step preparation of BRE has more advantages than the use of pure brazilin in terms of convenience and a low-cost production process. Therefore, BRE is considered a potential coloring agent with antibacterial activity for pharmaceutical, cosmetic, and nutraceutical applications.

  7. Dynamic pressure sensitivity determination with Mach number method

    NASA Astrophysics Data System (ADS)

    Sarraf, Christophe; Damion, Jean-Pierre

    2018-05-01

    Measurements of pressure in fast transient conditions are often performed even if the dynamic characteristics of the transducer are not traceable to international standards. Moreover, the question of a primary standard in dynamic pressure is still open, especially for gaseous applications. The question is to improve dynamic standards in order to respond to expressed industrial needs. In this paper, the method proposed in the EMRP IND09 ‘Dynamic’ project, which can be called the ‘ideal shock tube method’, is compared with the ‘collective standard method’ currently used in the Laboratoire de Métrologie Dynamique (LNE/ENSAM). The input is a step of pressure generated by a shock tube. The transducer is a piezoelectric pressure sensor. With the ‘ideal shock tube method’, the sensitivity of a pressure sensor is first determined dynamically. This method requires a shock tube equipped with piezoelectric shock wave detectors. The measurement of the Mach number in the tube allows an evaluation of the incident pressure amplitude of a step using a theoretical 1D model of the shock tube. Heat transfer, other real effects and the effects of shock tube imperfections are not taken into account. The amplitude of the pressure step is then used to determine the sensitivity in dynamic conditions. The second method uses a frequency bandwidth comparison to determine pressure at frequencies from quasi-static conditions, traceable to static pressure standards, to higher frequencies (up to 10 kHz). The measurand is also a step of pressure generated by a supposed ideal shock tube or a fast-opening device. The results are provided as a transfer function with an uncertainty budget assigned to a frequency range, also deliverable frequency by frequency. The largest uncertainty in the bandwidth of comparison is used to trace the final pressure step level measured in dynamic conditions, given that this pressure is not measurable in a steady state on a shock tube. A reference sensor thereby calibrated can be used in a comparison measurement process. At high frequencies the most important component of the uncertainty in this method is due to complex real shock tube effects that are not yet modelled, or are not expected to be modelled, in this kind of direct method. After a brief review of both methods and a brief review of the determination of the transfer function of pressure transducers, and the budget of associated uncertainty for the dynamic calibration of a pressure transducer in gas, this paper presents a comparison of the results obtained with the ‘ideal shock tube’ and the ‘collective standard’ methods.
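
    The 'ideal shock tube method' described above derives the incident pressure-step amplitude from the measured shock Mach number through an ideal 1D gas-dynamic model; one standard form of that relation is the normal-shock static-pressure ratio sketched below. The ratio of specific heats, initial pressure, Mach number and sensor output are illustrative values, and, as the record itself notes, real shock tubes deviate from this ideal model.

    ```python
    # Ideal normal-shock relation: p2/p1 = 1 + 2*gamma/(gamma + 1) * (Ms**2 - 1).
    # The pressure step seen by a wall sensor behind the incident shock is
    # dp = p2 - p1, and a dynamic sensitivity follows as sensor output / dp.
    # All numerical values below are illustrative assumptions.
    gamma = 1.4          # ratio of specific heats for air
    p1 = 100e3           # initial driven-section pressure, Pa
    Ms = 1.35            # shock Mach number from time-of-flight between detectors

    p2 = p1 * (1.0 + 2.0 * gamma / (gamma + 1.0) * (Ms**2 - 1.0))
    dp = p2 - p1         # amplitude of the incident pressure step, Pa

    charge_output = 95.0e-12                              # sensor output, C (assumed)
    sensitivity_pC_per_kPa = (charge_output * 1e12) / (dp / 1e3)
    print(f"dp = {dp / 1e3:.1f} kPa, sensitivity = {sensitivity_pC_per_kPa:.2f} pC/kPa")
    ```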

  8. Building an open-source robotic stereotaxic instrument.

    PubMed

    Coffey, Kevin R; Barker, David J; Ma, Sisi; West, Mark O

    2013-10-29

    This protocol includes the designs and software necessary to upgrade an existing stereotaxic instrument to a robotic (CNC) stereotaxic instrument for around $1,000 (excluding a drill), using industry standard stepper motors and CNC controlling software. Each axis has variable speed control and may be operated simultaneously or independently. The robot's flexibility and open coding system (g-code) make it capable of performing custom tasks that are not supported by commercial systems. Its applications include, but are not limited to, drilling holes, sharp-edge craniotomies, skull thinning, and lowering electrodes or cannulae. In order to expedite the writing of g-code for simple surgeries, we have developed custom scripts that allow individuals to design a surgery with no knowledge of programming. However, for users to get the most out of the motorized stereotax, it would be beneficial to be knowledgeable in mathematical programming and g-code (simple programming for CNC machining). The recommended drill speed is greater than 40,000 rpm. The stepper motor resolution is 1.8°/step, geared to 0.346°/step. A standard stereotax then has a resolution of 2.88 μm/step. The maximum recommended cutting speed is 500 μm/sec. The maximum recommended jogging speed is 3,500 μm/sec. The maximum recommended drill bit size is HP 2.
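
    From the figures given above (2.88 μm of travel per motor step and a 500 μm/s recommended cutting limit), a single-axis move can be converted into whole motor steps and a G1 feed command. The helper below is a hypothetical illustration of that arithmetic; it is not one of the published custom scripts, and the axis letter and feed-rate convention are assumptions.

    ```python
    # Hypothetical helper: convert a desired axis travel (micrometres) into whole
    # motor steps at the stated resolution and emit a G1 move capped at the
    # recommended cutting speed. Not part of the published designs or scripts.
    RES_UM_PER_STEP = 2.88        # stated resolution, micrometres per motor step
    MAX_CUT_UM_PER_S = 500.0      # recommended maximum cutting speed, um/s

    def axis_move(travel_um: float, speed_um_per_s: float = MAX_CUT_UM_PER_S) -> str:
        steps = round(travel_um / RES_UM_PER_STEP)     # whole motor steps
        actual_um = steps * RES_UM_PER_STEP            # travel after rounding
        speed = min(speed_um_per_s, MAX_CUT_UM_PER_S)
        feed_mm_per_min = speed * 60.0 / 1000.0        # G-code feed assumed in mm/min
        return (f"; {steps} steps, {actual_um:.2f} um of travel\n"
                f"G1 X{actual_um / 1000.0:.4f} F{feed_mm_per_min:.1f}")

    print(axis_move(4000.0))   # e.g. lower an electrode 4 mm at the cutting limit
    ```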

  9. Standard work for room entry: Linking lean, hand hygiene, and patient-centeredness.

    PubMed

    O'Reilly, Kristin; Ruokis, Samantha; Russell, Kristin; Teves, Tim; DiLibero, Justin; Yassa, David; Berry, Hannah; Howell, Michael D

    2016-03-01

    Healthcare-associated infections are costly and fatal. Substantial front-line, administrative, regulatory, and research efforts have focused on improving hand hygiene. While broad agreement exists that hand hygiene is the most important single approach to infection prevention, compliance with hand hygiene is typically only about 40%(1). Our aim was to develop a standard process for room entry in the intensive care unit that improved compliance with hand hygiene and allowed for maximum efficiency. We recognized that hand hygiene is a single step in a substantially more complicated process of room entry. We applied Lean engineering techniques to develop a standard process that included both physical steps and also standard communication elements from provider to patients and families and created a physical environment to support this. We observed meaningful improvement in the performance of the new standard as well as time savings for clinical providers with each room entry. We also observed an increase in room entries that included verbal communication and an explanation of what the clinician was entering the room to do. The design and implementation of a standardized room entry process and the creation of an environment that supports that new process has resulted in measurable positive outcomes on the medical intensive care unit, including quality, patient experience, efficiency, and staff satisfaction. Designing a process, rather than viewing tasks that need to happen in close proximity in time (either serially or in parallel) as unrelated, simplifies work for staff and results in higher compliance to individual tasks. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. CT and MR Protocol Standardization Across a Large Health System: Providing a Consistent Radiologist, Patient, and Referring Provider Experience.

    PubMed

    Sachs, Peter B; Hunt, Kelly; Mansoubi, Fabien; Borgstede, James

    2017-02-01

    Building and maintaining a comprehensive yet simple set of standardized protocols for cross-sectional imaging can be a daunting task. A single department may have difficulty preventing "protocol creep," which almost inevitably occurs when an organized "playbook" of protocols does not exist and individual radiologists and technologists alter protocols at will and on a case-by-case basis. When multiple departments or groups function in a large health system, the lack of uniformity of protocols can increase exponentially. In 2012, the University of Colorado Hospital formed a large health system (UCHealth) and became a 5-hospital provider network. CT and MR imaging studies are conducted at multiple locations by different radiology groups. To facilitate consistency in ordering, acquisition, and appearance of a given study, regardless of location, we minimized the number of protocols across all scanners and sites of practice with a clinical indication-driven protocol selection and standardization process. Here we review the steps utilized to perform this process improvement task and ensure its stability over time. Actions included creation of a standardized protocol template, which allowed for changes in electronic storage and management of protocols, designing a change request form, and formation of a governance structure. We utilized rapid improvement events (1 day for CT, 2 days for MR) and reduced 248 CT protocols to 97 standardized protocols and 168 MR protocols to 66. Additional steps are underway to further standardize output and reporting of imaging interpretation. This will result in an improved, consistent radiologist, patient, and provider experience across the system.

  11. The fourth radiation transfer model intercomparison (RAMI-IV): Proficiency testing of canopy reflectance models with ISO-13528

    NASA Astrophysics Data System (ADS)

    Widlowski, J.-L.; Pinty, B.; Lopatka, M.; Atzberger, C.; Buzica, D.; Chelle, M.; Disney, M.; Gastellu-Etchegorry, J.-P.; Gerboles, M.; Gobron, N.; Grau, E.; Huang, H.; Kallel, A.; Kobayashi, H.; Lewis, P. E.; Qin, W.; Schlerf, M.; Stuckens, J.; Xie, D.

    2013-07-01

    The radiation transfer model intercomparison (RAMI) activity aims at assessing the reliability of physics-based radiative transfer (RT) models under controlled experimental conditions. RAMI focuses on computer simulation models that mimic the interactions of radiation with plant canopies. These models are increasingly used in the development of satellite retrieval algorithms for terrestrial essential climate variables (ECVs). Rather than applying ad hoc performance metrics, RAMI-IV makes use of existing ISO standards to enhance the rigor of its protocols evaluating the quality of RT models. ISO-13528 was developed "to determine the performance of individual laboratories for specific tests or measurements." More specifically, it aims to guarantee that measurement results fall within specified tolerance criteria from a known reference. Of particular interest to RAMI is that ISO-13528 provides guidelines for comparisons where the true value of the target quantity is unknown. In those cases, "truth" must be replaced by a reliable "conventional reference value" to enable absolute performance tests. This contribution will show, for the first time, how the ISO-13528 standard developed by the chemical and physical measurement communities can be applied to proficiency testing of computer simulation models. Step by step, the pre-screening of data, the identification of reference solutions, and the choice of proficiency statistics will be discussed and illustrated with simulation results from the RAMI-IV "abstract canopy" scenarios. Detailed performance statistics of the participating RT models will be provided and the role of the accuracy of the reference solutions as well as the choice of the tolerance criteria will be highlighted.

  12. Performance testing and module monitoring at the EC Necessary steps to develop cost-effective PV modules

    NASA Astrophysics Data System (ADS)

    Krebs, K.

    Testing programs carried out by the European Communities to establish testing techniques and standards for verifying the reliability and integrity of solar cells intended for the marketplace are described. The efforts are being expended to assure quality control and certification for photovoltaic (PV) products manufactured in any of the member nations. The failure rate for PV modules was lowered to 0.5 pct/year by 1981, and single cell failures are projected to be lowered to 0.00001/yr, connectors to 0.001/yr, and batteries to 0.01/yr. Day/night thermal cycling causes the most dominant type of failures, i.e., cracked cells and interconnect defects. Tests have been standardized for inspection, verification, performance, mechanical loading, hail impact, damp heat, high temperature long exposure, hot-spot heating, thermal cycling, and humidity-freezing tolerance.

  13. Multi-profile Bayesian alignment model for LC-MS data analysis with integration of internal standards

    PubMed Central

    Tsai, Tsung-Heng; Tadesse, Mahlet G.; Di Poto, Cristina; Pannell, Lewis K.; Mechref, Yehia; Wang, Yue; Ressom, Habtom W.

    2013-01-01

    Motivation: Liquid chromatography-mass spectrometry (LC-MS) has been widely used for profiling expression levels of biomolecules in various ‘-omic’ studies including proteomics, metabolomics and glycomics. Appropriate LC-MS data preprocessing steps are needed to detect true differences between biological groups. Retention time (RT) alignment, which is required to ensure that ion intensity measurements among multiple LC-MS runs are comparable, is one of the most important yet challenging preprocessing steps. Current alignment approaches estimate RT variability using either single chromatograms or detected peaks, but do not simultaneously take into account the complementary information embedded in the entire LC-MS data. Results: We propose a Bayesian alignment model for LC-MS data analysis. The alignment model provides estimates of the RT variability along with uncertainty measures. The model enables integration of multiple sources of information including internal standards and clustered chromatograms in a mathematically rigorous framework. We apply the model to LC-MS metabolomic, proteomic and glycomic data. The performance of the model is evaluated based on ground-truth data, by measuring correlation of variation, RT difference across runs and peak-matching performance. We demonstrate that Bayesian alignment model improves significantly the RT alignment performance through appropriate integration of relevant information. Availability and implementation: MATLAB code, raw and preprocessed LC-MS data are available at http://omics.georgetown.edu/alignLCMS.html Contact: hwr@georgetown.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24013927

  14. An intraorganizational model for developing and spreading quality improvement innovations.

    PubMed

    Kellogg, Katherine C.; Gainer, Lindsay A.; Allen, Adrienne S.; O'Sullivan, Tatum; Singer, Sara J.

    Recent policy reforms encourage quality improvement (QI) innovations in primary care, but practitioners lack clear guidance regarding spread inside organizations. We designed this study to identify how large organizations can facilitate intraorganizational spread of QI innovations. We conducted ethnographic observation and interviews in a large, multispecialty, community-based medical group that implemented three QI innovations across 10 primary care sites using a new method for intraorganizational process development and spread. We compared quantitative outcomes achieved through the group's traditional versus new method, created a process model describing the steps in the new method, and identified barriers and facilitators at each step. The medical group achieved substantial improvement using its new method of intraorganizational process development and spread of QI innovations: standard work for rooming and depression screening, vaccine error rates and order compliance, and Pap smear error rates. Our model details nine critical steps for successful intraorganizational process development (set priorities, assess the current state, develop the new process, and measure and refine) and spread (develop support, disseminate information, facilitate peer-to-peer training, reinforce, and learn and adapt). Our results highlight the importance of utilizing preexisting organizational structures such as established communication channels, standardized roles, common workflows, formal authority, and performance measurement and feedback systems when developing and spreading QI processes inside an organization. In particular, we detail how formal process advocate positions in each site for each role can facilitate the spread of new processes. Successful intraorganizational spread is possible and sustainable. Developing and spreading new QI processes across sites inside an organization requires creating a shared understanding of the necessary process steps, considering the barriers that may arise at each step, and leveraging preexisting organizational structures to facilitate intraorganizational process development and spread.

  15. An intraorganizational model for developing and spreading quality improvement innovations

    PubMed Central

    Kellogg, Katherine C.; Gainer, Lindsay A.; Allen, Adrienne S.; O'Sullivan, Tatum; Singer, Sara J.

    2017-01-01

    Background: Recent policy reforms encourage quality improvement (QI) innovations in primary care, but practitioners lack clear guidance regarding spread inside organizations. Purpose: We designed this study to identify how large organizations can facilitate intraorganizational spread of QI innovations. Methodology/Approach: We conducted ethnographic observation and interviews in a large, multispecialty, community-based medical group that implemented three QI innovations across 10 primary care sites using a new method for intraorganizational process development and spread. We compared quantitative outcomes achieved through the group’s traditional versus new method, created a process model describing the steps in the new method, and identified barriers and facilitators at each step. Findings: The medical group achieved substantial improvement using its new method of intraorganizational process development and spread of QI innovations: standard work for rooming and depression screening, vaccine error rates and order compliance, and Pap smear error rates. Our model details nine critical steps for successful intraorganizational process development (set priorities, assess the current state, develop the new process, and measure and refine) and spread (develop support, disseminate information, facilitate peer-to-peer training, reinforce, and learn and adapt). Our results highlight the importance of utilizing preexisting organizational structures such as established communication channels, standardized roles, common workflows, formal authority, and performance measurement and feedback systems when developing and spreading QI processes inside an organization. In particular, we detail how formal process advocate positions in each site for each role can facilitate the spread of new processes. Practice Implications: Successful intraorganizational spread is possible and sustainable. Developing and spreading new QI processes across sites inside an organization requires creating a shared understanding of the necessary process steps, considering the barriers that may arise at each step, and leveraging preexisting organizational structures to facilitate intraorganizational process development and spread. PMID:27428788

  16. Assistive devices alter gait patterns in Parkinson disease: advantages of the four-wheeled walker.

    PubMed

    Kegelmeyer, Deb A; Parthasarathy, Sowmya; Kostyk, Sandra K; White, Susan E; Kloos, Anne D

    2013-05-01

    Gait abnormalities are a hallmark of Parkinson's disease (PD) and contribute to fall risk. Therapy and exercise are often encouraged to increase mobility and decrease falls. As disease symptoms progress, assistive devices are often prescribed. There are no guidelines for choosing appropriate ambulatory devices. This unique study systematically examined the impact of a broad range of assistive devices on gait measures during walking in both a straight path and around obstacles in individuals with PD. Quantitative gait measures, including velocity, stride length, percent swing and double support time, and coefficients of variation were assessed in 27 individuals with PD with or without one of six different devices including canes, standard and wheeled walkers (two, four or U-Step). Data were collected using the GAITRite and on a figure-of-eight course. All devices, with the exception of four-wheeled and U-Step walkers significantly decreased gait velocity. The four-wheeled walker resulted in less variability in gait measures and had less impact on spontaneous unassisted gait patterns. The U-Step walker exhibited the highest variability across all parameters followed by the two-wheeled and standard walkers. Higher variability has been correlated with increased falls. Though subjects performed better on a figure-of-eight course using either the four-wheeled or the U-Step walker, the four-wheeled walker resulted in the most consistent improvement in overall gait variables. Laser light use on a U-Step walker did not improve gait measures or safety in figure-of-eight compared to other devices. Of the devices tested, the four-wheeled-walker offered the most consistent advantages for improving mobility and safety. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. Removal of toxic metal ions from landfill leachate by complementary sorption and transport across polymer inclusion membranes.

    PubMed

    Zawierucha, Iwona; Kozlowski, Cezary; Malina, Grzegorz

    2013-10-01

    In this study, the performance of a lab-scale two-step treatment system was evaluated for the removal of toxic metal ions from landfill leachate. The technology of polymer inclusion membranes (PIMs) was the first step, while the second step of the treatment system was based on sorption on an impregnated resin. The PIMs were synthesized from cellulose triacetate as a support, a macrocyclic compound (an alkyl derivative of resorcinarene) as an ionic carrier, and o-nitrophenyl pentyl ether as a plasticizer. The transport experiments through the PIM were carried out in a permeation cell, in which the membrane film was tightly clamped between two cell compartments. The sorption tests were carried out using a column filled with a resin impregnated with the resorcinarene derivative. The obtained results show good performance with respect to the removal of heavy metals from landfill leachate, with overall removal efficiencies of 99%, 88% and 55% for Pb(II), Cd(II) and Zn(II) ions, respectively. Moreover, the metal ion contents in the leachate sample after the treatment system were below the permissible limits for wastewater according to the Polish standards. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Patient-reported outcome measures versus inertial performance-based outcome measures: A prospective study in patients undergoing primary total knee arthroplasty.

    PubMed

    Bolink, S A A N; Grimm, B; Heyligers, I C

    2015-12-01

    Outcome assessment of total knee arthroplasty (TKA) by subjective patient reported outcome measures (PROMs) may not fully capture the functional (dis-)abilities of relevance. Objective performance-based outcome measures could provide distinct information. An ambulant inertial measurement unit (IMU) allows kinematic assessment of physical performance and could potentially be used for routine follow-up. To investigate the responsiveness of IMU measures in patients following TKA and compare outcomes with conventional PROMs. Patients with end-stage knee OA (n = 20, m/f = 7/13; age = 67.4 (SD 7.7) years) were measured preoperatively and one year postoperatively. IMU measures were derived during gait, sit-stand transfers and block step-up transfers. PROMs were assessed by using the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) and Knee Society Score (KSS). Responsiveness was calculated by the effect size; correlations were calculated with Spearman's rho correlation coefficient. One year after TKA, patients performed significantly better at gait, sit-to-stand transfers and block step-up transfers. Measures of time and kinematic IMU measures demonstrated significant improvements postoperatively for each performance-based test. The largest improvement was found in block step-up transfers (effect size = 0.56-1.20). WOMAC function score and KSS function score demonstrated moderate correlations (Spearman's rho = 0.45-0.74) with some of the physical performance-based measures pre- and postoperatively. To characterize the changes in physical function after TKA, PROMs could be supplemented by performance-based measures, assessing function during different activities and allowing kinematic characterization with an ambulant IMU. Copyright © 2015 Elsevier B.V. All rights reserved.
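
    The record does not define its effect-size formula. One common choice for pre/post designs is the standardized response mean (mean change divided by the standard deviation of the change), and Spearman's rho is the stated correlation measure; the sketch below illustrates both on made-up pre/postoperative scores that stand in for the IMU and PROM data.

    ```python
    # Illustration of the two statistics mentioned above: a responsiveness effect
    # size (here the standardized response mean, one common convention) and
    # Spearman's rho between a PROM and a performance-based measure. All data are
    # synthetic stand-ins, not the study's measurements.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    pre = rng.normal(10.0, 2.0, 20)                 # e.g. preoperative step-up time, s
    post = pre - rng.normal(2.0, 1.0, 20)           # one year after TKA
    womac = 0.5 * post + rng.normal(0.0, 1.0, 20)   # hypothetical PROM scores

    change = post - pre
    srm = change.mean() / change.std(ddof=1)        # standardized response mean
    rho, p_value = spearmanr(womac, post)           # Spearman correlation

    print(f"SRM = {srm:.2f}, Spearman rho = {rho:.2f} (p = {p_value:.3f})")
    ```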

  19. One Small Step for a Man: Estimation of Gender, Age and Height from Recordings of One Step by a Single Inertial Sensor

    PubMed Central

    Riaz, Qaiser; Vögele, Anna; Krüger, Björn; Weber, Andreas

    2015-01-01

    A number of previous works have shown that information about a subject is encoded in sparse kinematic information, such as the one revealed by so-called point light walkers. With the work at hand, we extend these results to classifications of soft biometrics from inertial sensor recordings at a single body location from a single step. We recorded accelerations and angular velocities of 26 subjects using inertial measurement units (IMUs) attached at four locations (chest, lower back, right wrist and left ankle) when performing standardized gait tasks. The collected data were segmented into individual walking steps. We trained random forest classifiers in order to estimate soft biometrics (gender, age and height). We applied two different validation methods to the process, 10-fold cross-validation and subject-wise cross-validation. For all three classification tasks, we achieve high accuracy values for all four sensor locations. From these results, we can conclude that the data of a single walking step (6D: accelerations and angular velocities) allow for a robust estimation of the gender, height and age of a person. PMID:26703601
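
    The evaluation described (random forest classifiers on per-step features, scored with both 10-fold and subject-wise cross-validation) maps directly onto scikit-learn; the sketch below uses a synthetic feature matrix, labels and subject IDs in place of the authors' data and feature set.

    ```python
    # Sketch of the described evaluation: a random forest estimating one soft
    # biometric (here a binary gender label) from per-step features, scored with
    # 10-fold and subject-wise cross-validation. Features, labels and subject IDs
    # are synthetic placeholders, not the study's data.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import GroupKFold, KFold, cross_val_score

    rng = np.random.default_rng(1)
    n_subjects, steps_per_subject, n_features = 26, 20, 24
    subjects = np.repeat(np.arange(n_subjects), steps_per_subject)
    X = rng.normal(size=(len(subjects), n_features))   # per-step 6D-signal features
    y = (subjects % 2 == 0).astype(int)                # stand-in gender labels

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    acc_10fold = cross_val_score(clf, X, y, cv=KFold(n_splits=10, shuffle=True,
                                                     random_state=0))
    acc_subjectwise = cross_val_score(clf, X, y, cv=GroupKFold(n_splits=5),
                                      groups=subjects)
    print(f"10-fold: {acc_10fold.mean():.2f}, subject-wise: {acc_subjectwise.mean():.2f}")
    ```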

  20. Adaptive macro finite elements for the numerical solution of monodomain equations in cardiac electrophysiology.

    PubMed

    Heidenreich, Elvio A; Ferrero, José M; Doblaré, Manuel; Rodríguez, José F

    2010-07-01

    Many problems in biology and engineering are governed by anisotropic reaction-diffusion equations with a very rapidly varying reaction term. This usually implies the use of very fine meshes and small time steps in order to accurately capture the propagating wave while avoiding the appearance of spurious oscillations in the wave front. This work develops a family of macro finite elements amenable for solving anisotropic reaction-diffusion equations with stiff reactive terms. The developed elements are incorporated on a semi-implicit algorithm based on operator splitting that includes adaptive time stepping for handling the stiff reactive term. A linear system is solved on each time step to update the transmembrane potential, whereas the remaining ordinary differential equations are solved uncoupled. The method allows solving the linear system on a coarser mesh thanks to the static condensation of the internal degrees of freedom (DOF) of the macroelements while maintaining the accuracy of the finer mesh. The method and algorithm have been implemented in parallel. The accuracy of the method has been tested on two- and three-dimensional examples demonstrating excellent behavior when compared to standard linear elements. The better performance and scalability of different macro finite elements against standard finite elements have been demonstrated in the simulation of a human heart and a heterogeneous two-dimensional problem with reentrant activity. Results have shown a reduction of up to four times in computational cost for the macro finite elements with respect to equivalent (same number of DOF) standard linear finite elements as well as good scalability properties.
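
    The semi-implicit scheme described (uncoupled pointwise updates of the stiff reaction term, followed by a linear solve for the diffusion part at each time step) can be illustrated in one dimension. The sketch below uses standard linear finite differences, a fixed time step and a simple cubic reaction term rather than the paper's macro finite elements, adaptive stepping or cardiac ionic model.

    ```python
    # 1-D illustration of operator splitting for u_t = D*u_xx + f(u): explicit
    # pointwise reaction update, then an implicit (backward Euler) diffusion solve.
    # Finite differences and a cubic f(u) are used for simplicity; this is not the
    # paper's macro-finite-element or monodomain ionic formulation.
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n, D, dx, dt, n_steps = 200, 1e-3, 0.05, 0.05, 400
    u = np.zeros(n)
    u[:10] = 1.0                                      # initial stimulus at one end

    f = lambda v: 8.0 * v * (v - 0.1) * (1.0 - v)     # simple bistable reaction term

    # (I - dt*D*L) for backward-Euler diffusion with no-flux boundaries
    main = np.full(n, -2.0); main[0] = main[-1] = -1.0
    L = sp.diags([np.ones(n - 1), main, np.ones(n - 1)], [-1, 0, 1]) / dx**2
    solve = spla.factorized((sp.eye(n) - dt * D * L).tocsc())   # factor once, reuse

    for _ in range(n_steps):
        u = u + dt * f(u)      # reaction step: uncoupled ODE update at each node
        u = solve(u)           # diffusion step: one sparse linear solve

    print(f"wavefront position ~ {np.argmax(u < 0.5) * dx:.2f} length units")
    ```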

  1. Stairs and Doors Recognition as Natural Landmarks Based on Clouds of 3D Edge-Points from RGB-D Sensors for Mobile Robot Localization.

    PubMed

    Souto, Leonardo A V; Castro, André; Gonçalves, Luiz Marcos Garcia; Nascimento, Tiago P

    2017-08-08

    Natural landmarks are the main features in the next step of the research in localization of mobile robot platforms. The identification and recognition of these landmarks are crucial to better localize a robot. To help solve this problem, this work proposes an approach for the identification and recognition of natural marks included in the environment using images from RGB-D (Red, Green, Blue, Depth) sensors. In the identification step, a structural analysis of the natural landmarks that are present in the environment is performed. The extraction of edge points of these landmarks is done using the 3D point cloud obtained from the RGB-D sensor. These edge points are smoothed through the Sl0 algorithm, which minimizes the standard deviation of the normals at each point. Then, the second step of the proposed algorithm begins, which is the proper recognition of the natural landmarks. This recognition step is done as a real-time algorithm that extracts the points referring to the filtered edges and determines which structure they belong to in the current scenario: stairs or doors. Finally, the geometrical characteristics that are intrinsic to the doors and stairs are identified. The approach proposed here has been validated with real robot experiments. The performed tests verify the efficacy of our proposed approach.
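
    The identification step described above extracts edge points from the 3D point cloud and then smooths them by minimizing the standard deviation of the point normals. As a generic illustration of edge-point extraction (not of the Sl0 smoothing itself), the sketch below runs a local PCA over the k nearest neighbours of each point and flags points whose neighbourhood is far from planar; the k value, threshold and toy two-plane cloud are assumptions.

    ```python
    # Generic sketch of extracting candidate edge points from a point cloud: for
    # each point, PCA over its k nearest neighbours gives a "surface variation"
    # score lambda_min / sum(lambda), near zero on flat patches and larger where
    # two surfaces meet. This only illustrates the idea; it is not the Sl0
    # normal-smoothing procedure used in the paper.
    import numpy as np
    from scipy.spatial import cKDTree

    def edge_points(cloud, k=20, thresh=0.02):
        tree = cKDTree(cloud)
        _, idx = tree.query(cloud, k=k)                # k nearest neighbours per point
        variation = np.empty(len(cloud))
        for i, nb in enumerate(idx):
            centred = cloud[nb] - cloud[nb].mean(axis=0)
            eigvals = np.linalg.eigvalsh(centred.T @ centred)   # ascending order
            variation[i] = eigvals[0] / eigvals.sum()
        return cloud[variation > thresh]               # candidate edge points

    # Toy usage: two perpendicular planes meeting along an edge (like a step riser)
    rng = np.random.default_rng(0)
    plane_a = np.c_[rng.uniform(0, 1, 500), rng.uniform(0, 1, 500), np.zeros(500)]
    plane_b = np.c_[rng.uniform(0, 1, 500), np.zeros(500), rng.uniform(0, 1, 500)]
    print(len(edge_points(np.vstack([plane_a, plane_b]))), "candidate edge points")
    ```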

  2. Stairs and Doors Recognition as Natural Landmarks Based on Clouds of 3D Edge-Points from RGB-D Sensors for Mobile Robot Localization†

    PubMed Central

    Castro, André; Nascimento, Tiago P.

    2017-01-01

    Natural landmarks are the main features in the next step of the research in localization of mobile robot platforms. The identification and recognition of these landmarks are crucial to better localize a robot. To help solve this problem, this work proposes an approach for the identification and recognition of natural marks included in the environment using images from RGB-D (Red, Green, Blue, Depth) sensors. In the identification step, a structural analysis of the natural landmarks that are present in the environment is performed. The extraction of edge points of these landmarks is done using the 3D point cloud obtained from the RGB-D sensor. These edge points are smoothed through the Sl0 algorithm, which minimizes the standard deviation of the normals at each point. Then, the second step of the proposed algorithm begins, which is the proper recognition of the natural landmarks. This recognition step is done as a real-time algorithm that extracts the points referring to the filtered edges and determines which structure they belong to in the current scenario: stairs or doors. Finally, the geometrical characteristics that are intrinsic to the doors and stairs are identified. The approach proposed here has been validated with real robot experiments. The performed tests verify the efficacy of our proposed approach. PMID:28786925

  3. Spatiotemporal Parameters are not Substantially Influenced by Load Carriage or Inclination During Treadmill and Overground Walking

    PubMed Central

    Seay, Joseph F.; Gregorczyk, Karen N.; Hasselquist, Leif

    2016-01-01

    Influences of load carriage and inclination on spatiotemporal parameters were examined during treadmill and overground walking. Ten soldiers walked on a treadmill and overground with three load conditions (0 kg, 20 kg, 40 kg) during level, uphill (6% grade) and downhill (-6% grade) inclinations at a self-selected speed, which was constant across conditions. Mean values and standard deviations for double support percentage, stride length, and step rate were compared across conditions. Double support percentage increased with load and with the inclination change from uphill to level walking, the increase being 0.4% of stance greater for the 20 kg condition than for 0 kg. As inclination changed from uphill to downhill, the step rate increased more overground (4.3 ± 3.5 steps/min) than during treadmill walking (1.7 ± 2.3 steps/min). For the 40 kg condition, the standard deviations were larger than for the 0 kg condition for both step rate and double support percentage. There was no change between modes for step rate standard deviation. For overground compared to treadmill walking, the standard deviation for stride length increased and that for double support percentage decreased. Changes in load of up to 40 kg, inclination of 6% grade away from level (i.e., uphill or downhill) and mode (treadmill and overground) produced small, yet statistically significant, changes in spatiotemporal parameters. Variability, as assessed by standard deviation, was not systematically lower during treadmill walking compared to overground walking. Due to the small magnitude of the changes, treadmill walking appears to replicate the spatiotemporal parameters of overground walking. PMID:28149338

  4. Embryo transfer techniques: an American Society for Reproductive Medicine survey of current Society for Assisted Reproductive Technology practices.

    PubMed

    Toth, Thomas L; Lee, Malinda S; Bendikson, Kristin A; Reindollar, Richard H

    2017-04-01

    To better understand practice patterns and opportunities for standardization of ET. Cross-sectional survey. Not applicable. Not applicable. An anonymous 82-question survey was emailed to the medical directors of 286 Society for Assisted Reproductive Technology member IVF practices. A follow-up survey composed of three questions specific to ET technique was emailed to the same medical directors. Descriptive statistics of the results were compiled. The survey assessed policies, protocols, restrictions, and specifics pertinent to the technique of ET. There were 117 (41%) responses; 32% practice in academic settings and 68% in private practice. Responders were experienced clinicians, half of whom had performed <10 procedures during training. Ninety-eight percent of practices allowed all practitioners to perform ET; half did not follow a standardized ET technique. Multiple steps in the ET process were identified as "highly conserved;" others demonstrated discordance. ET technique is divided among [1] trial transfer followed immediately with ET (40%); [2] afterload transfer (30%); and [3] direct transfer without prior trial or afterload (27%). Embryos are discharged in the upper (66%) and middle thirds (29%) of the endometrial cavity and not closer than 1-1.5 cm from fundus (87%). Details of each step were reported and allowed the development of a "common" practice ET procedure. ET training and practices vary widely. Improved training and standardization based on outcomes data and best practices are warranted. A common practice procedure is suggested for validation by a systematic literature review. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  5. Collision Avoidance Functional Requirements for Step 1. Revision 6

    NASA Technical Reports Server (NTRS)

    2006-01-01

    This Functional Requirements Document (FRD) describes the flow of requirements from the high level operational objectives down to the functional requirements specific to cooperative collision avoidance for high altitude, long endurance unmanned aircraft systems. These are further decomposed into performance and safety guidelines that are backed up by analysis or references to various documents or research findings. The FRD should be considered when establishing future policies, procedures, and standards pertaining to cooperative collision avoidance.

  6. The AFLOW Standard for High-throughput Materials Science Calculations

    DTIC Science & Technology

    2015-01-01

    [The indexed DTIC excerpt is fragmentary, mixing author-affiliation fragments with body text. The recoverable text concerns the electronic self-consistency settings of the AFLOW standard: two minimization algorithms, DBS and residual minimization-direct inversion in the iterative subspace (RMM-DIIS), are compared, with DBS noted as the slower but more stable option; RMM-DIIS steps are repeated as needed to fulfill the dEelec condition, and later determinations of system forces are performed by a similar sequence, but with only a single step.]

  7. Comparison of methods for measurement of organic compounds at ultra-trace level: analytical criteria and application to analysis of amino acids in extraterrestrial samples.

    PubMed

    Vandenabeele-Trambouze, O; Claeys-Bruno, M; Dobrijevic, M; Rodier, C; Borruat, G; Commeyras, A; Garrelly, L

    2005-02-01

    The need for criteria to compare different analytical methods for measuring extraterrestrial organic matter at ultra-trace levels in relatively small and unique samples (e.g., fragments of meteorites, micrometeorites, planetary samples) is discussed. We emphasize the need to standardize the description of future analyses, and take the first step toward a proposed international laboratory network for performance testing.

  8. Texture-based segmentation of temperate-zone woodland in panchromatic IKONOS imagery

    NASA Astrophysics Data System (ADS)

    Gagnon, Langis; Bugnet, Pierre; Cavayas, Francois

    2003-08-01

    We have performed a study to identify optimal texture parameters for woodland segmentation in a highly non-homogeneous urban area from a temperate-zone panchromatic IKONOS image. Texture images are produced with sum- and difference-histograms, which depend on two parameters: window size f and displacement step p. The four texture features yielding the best discrimination between classes are the mean, contrast, correlation and standard deviation. The f-p combinations 17-1, 17-2, 35-1 and 35-2 give the best performance, with an average classification rate of 90%.
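
    The sum- and difference-histogram features mentioned above can be computed, for a single window, roughly as follows; the window size, displacement step, and horizontal displacement direction are illustrative choices in the spirit of the 17-1 combination reported as one of the best performers.

```python
# Sketch of Unser-style sum- and difference-histogram texture features for one
# window of a panchromatic image. Window size f, displacement step p, and the
# horizontal displacement direction are illustrative choices.
import numpy as np

def sd_histogram_features(window, p=1, levels=256):
    a = window[:, :-p].astype(int)          # pixel
    b = window[:, p:].astype(int)           # pixel displaced by p (horizontal)
    s, d = a + b, a - b
    hs = np.bincount(s.ravel(), minlength=2 * levels - 1) / s.size
    hd = np.bincount(d.ravel() + levels - 1, minlength=2 * levels - 1) / d.size
    i_s = np.arange(2 * levels - 1)                 # sum-histogram bins
    i_d = np.arange(2 * levels - 1) - (levels - 1)  # difference-histogram bins
    mean = 0.5 * (hs * i_s).sum()
    contrast = (hd * i_d**2).sum()
    var_sum = (hs * (i_s - 2 * mean) ** 2).sum()
    std = np.sqrt(0.5 * (var_sum + contrast))
    correlation = 0.5 * (var_sum - contrast)
    return dict(mean=mean, contrast=contrast, std=std, correlation=correlation)

# Example: one 17x17 window with p = 1 (the 17-1 combination).
rng = np.random.default_rng(1)
win = rng.integers(0, 256, size=(17, 17))
print(sd_histogram_features(win, p=1))
```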

  9. 7 CFR 65.230 - Production step.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    [CFR index fragment: 7 CFR 65.230, under Agricultural Marketing Service standards (..., peanuts, and ginseng), General Provisions, Definitions. The section defines the term "Production step"; the definition itself ("Production step means, in...") is truncated in the indexed excerpt.]

  10. Effectiveness of en masse versus two-step retraction: a systematic review and meta-analysis.

    PubMed

    Rizk, Mumen Z; Mohammed, Hisham; Ismael, Omar; Bearn, David R

    2018-01-05

    This review aims to compare the effectiveness of en masse and two-step retraction methods during orthodontic space closure regarding anchorage preservation and anterior segment retraction and to assess their effect on the duration of treatment and root resorption. An electronic search for potentially eligible randomized controlled trials and prospective controlled trials was performed in five electronic databases up to July 2017. The process of study selection, data extraction, and quality assessment was performed by two reviewers independently. A narrative review is presented in addition to a quantitative synthesis of the pooled results where possible. The Cochrane risk of bias tool and the Newcastle-Ottawa Scale were used for the methodological quality assessment of the included studies. Eight studies were included in the qualitative synthesis in this review. Four studies were included in the quantitative synthesis. En masse/miniscrew combination showed a statistically significant standard mean difference regarding anchorage preservation - 2.55 mm (95% CI - 2.99 to - 2.11) and the amount of upper incisor retraction - 0.38 mm (95% CI - 0.70 to - 0.06) when compared to a two-step/conventional anchorage combination. Qualitative synthesis suggested that en masse retraction requires less time than two-step retraction with no difference in the amount of root resorption. Both en masse and two-step retraction methods are effective during the space closure phase. The en masse/miniscrew combination is superior to the two-step/conventional anchorage combination with regard to anchorage preservation and amount of retraction. Limited evidence suggests that anchorage reinforcement with a headgear produces similar results with both retraction methods. Limited evidence also suggests that en masse retraction may require less time and that no significant differences exist in the amount of root resorption between the two methods.

  11. 75 FR 16913 - Transmission Relay Loadability Reliability Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-02

    [The indexed excerpt is fragmentary, mixing the final rule's table of contents with body text. The recoverable headings concern approval of Reliability Standard PRC-023-1, its applicability, and the treatment of generator step-up and auxiliary transformers, including their omission from the Reliability Standard and the role of generator step-up transformer relays as back-up protection for lines (with series elements such as transformers) emanating from remote buses; the remaining text ("All transmission owners shall...") is truncated.]

  12. The road to JCAHO disease-specific care certification: a step-by-step process log.

    PubMed

    Morrison, Kathy

    2005-01-01

    In 2002, the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) implemented Disease-Specific Care (DSC) certification. This is a voluntary program in which organizations have their disease management program evaluated by this regulatory agency. Some of the DSC categories are stroke, heart failure, acute MI, diabetes, and pneumonia. The criteria for any disease management program certification are: compliance with consensus-based national standards, effective use of established clinical practice guidelines to manage and optimize care, and an organized approach to performance measurement and improvement activities. Successful accomplishment of DSC certification defines organizations as Centers of Excellence in management of that particular disease. This article will review general guidelines for DSC certification with an emphasis on Primary Stroke Center certification.

  13. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Consideration in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step by step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  14. One-Step Immunochromatography Assay Kit for Detecting Antibodies to Canine Parvovirus

    PubMed Central

    Oh, Jin-Sik; Ha, Gun-Woo; Cho, Young-Shik; Kim, Min-Jae; An, Dong-Jun; Hwang, Kyu-Kye; Lim, Yoon-Kyu; Park, Bong-Kyun; Kang, BoKyu; Song, Dae-Sub

    2006-01-01

    This study was performed to determine the feasibility of using whole serum to detect antibodies to canine parvovirus (CPV) under nonlaboratory conditions and to evaluate the performance characteristics of an immunochromatography assay kit. Precise detection of levels of antibody against CPV in puppies can be used to determine a vaccination schedule, because maternal antibodies frequently result in the failure of protective vaccination, and can also be used to determine the antibody levels of infected puppies. Several methods for the titration of CPV antibodies have been reported, including the hemagglutination inhibition (HI) assay, which is considered the “gold standard.” These methods, however, require intricate and time-consuming procedures. In this study, a total of 386 serum specimens were tested. Compared to the HI assay, the rapid assay had a 97.1% sensitivity and a 76.6% specificity (with a cutoff HI titer of 1:80). This single-step assay could be performed rapidly and easily without special equipment. The kit provides a reliable method for detection of anti-CPV antibody where laboratory support and personnel are limited. PMID:16603622
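
    For readers unfamiliar with the figures of merit, the sketch below shows how sensitivity and specificity are computed against the HI reference; the 2x2 cell counts are hypothetical values chosen only to land near the reported 97.1% and 76.6%, since the abstract does not give the actual table.

```python
# Hedged arithmetic sketch of the diagnostic performance figures reported
# above, computed from a 2x2 table against the HI "gold standard". The counts
# below are illustrative only (chosen to reproduce roughly 97.1% / 76.6%);
# the study's actual cell counts are not given in the abstract.
def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)      # HI-positives correctly detected by the kit
    specificity = tn / (tn + fp)      # HI-negatives correctly called negative
    return sensitivity, specificity

tp, fn, tn, fp = 167, 5, 164, 50      # hypothetical split of 386 sera
se, sp = sens_spec(tp, fn, tn, fp)
print(f"sensitivity = {se:.1%}, specificity = {sp:.1%}")
```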

  15. One-step immunochromatography assay kit for detecting antibodies to canine parvovirus.

    PubMed

    Oh, Jin-Sik; Ha, Gun-Woo; Cho, Young-Shik; Kim, Min-Jae; An, Dong-Jun; Hwang, Kyu-Kye; Lim, Yoon-Kyu; Park, Bong-Kyun; Kang, BoKyu; Song, Dae-Sub

    2006-04-01

    This study was performed to determine the feasibility of using whole serum to detect antibodies to canine parvovirus (CPV) under nonlaboratory conditions and to evaluate the performance characteristics of an immunochromatography assay kit. Precise detection of levels of antibody against CPV in puppies can be used to determine a vaccination schedule, because maternal antibodies frequently result in the failure of protective vaccination, and can also be used to determine the antibody levels of infected puppies. Several methods for the titration of CPV antibodies have been reported, including the hemagglutination inhibition (HI) assay, which is considered the "gold standard." These methods, however, require intricate and time-consuming procedures. In this study, a total of 386 serum specimens were tested. Compared to the HI assay, the rapid assay had a 97.1% sensitivity and a 76.6% specificity (with a cutoff HI titer of 1:80). This single-step assay could be performed rapidly and easily without special equipment. The kit provides a reliable method for detection of anti-CPV antibody where laboratory support and personnel are limited.

  16. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products, to support new software development process. Procedures for characterizing problem domains in general and mapping to a tailored set of life cycle processes and products is presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and product of step 4; and (6) Select the design methodology which produces the information products selected in step 5.

  17. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products, to support new software development process. Procedures for characterizing problem domains in general and mapping to a tailored set of life cycle processes and products is presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) select the information products which match or support the accepted processes and product of step 4; and (6) Select the design methodology which produces the information products selected in step 5.

  18. Checkout and Standard Use Procedures for the Mark III Space Suit Assembly

    NASA Technical Reports Server (NTRS)

    Valish, Dana J.

    2012-01-01

    The operational pressure range is the range over which the suit can be nominally operated for manned testing. The top end of the nominal operational pressure range is equivalent to 1/2 the proof pressure. Structural pressure is 1.5 times the specified test pressure for any given test. Proof pressure is the maximum unmanned pressure to which the suit was tested by the vendor prior to delivery. The maximum allowable working pressure (MAWP) is 90% of the proof pressure. The pressure system's RVs are set to keep components below their MAWPs. If the suit is pressurized over its MAWP, the suit will be taken out of service and an in-depth inspection/review of the suit will be performed before the suit is put back in service. The procedures outlined in this document should be followed as written. However, the suit test engineer (STE) may make redline changes real-time, provided those changes are recorded in the anomaly section of the test data sheet. If technicians supporting suit build-up, check-out, and/or test execution believe that a procedure can be improved, they should notify their lead. If procedures are incorrect to the point of potentially causing hardware damage or affecting safety, bring the problem to the technician lead's and/or STE's attention and stop work until a solution (temporary or permanent) is authorized. Certain steps in the procedure are marked with a DV, for Designated Verifier. The Designated Verifier for this procedure is an Advanced Space Suit Technology Development Laboratory technician, not directly involved in performing the procedural steps, who will verify that the step was performed as stated. The steps to be verified by the DV were selected based on one or more of the following criteria: the step was deemed significant in ensuring the safe performance of the test, the data recorded in the step is of specific interest in monitoring the suit system operation, or the step has a strong influence on the successful completion of test objectives. Prior to all manned test activities, Advanced Suit Test Data Sheet (TDS) Parts A-E shall be completed to verify that the system and team are ready for test. Advanced Suit TDS Parts F-G shall be completed at the end of the suited activity. Appendix B identifies the appropriate Mark III suit emergency event procedures.
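
    A short worked example of the pressure relationships defined above, using a hypothetical proof pressure rather than an actual Mark III rating:

```python
# Worked example of the pressure relationships described above, using a
# hypothetical proof pressure of 8.0 psid (not an actual Mark III rating).
proof = 8.0                                  # psid, vendor unmanned proof test
mawp = 0.90 * proof                          # maximum allowable working pressure
nominal_top = 0.5 * proof                    # top of nominal operational range
structural = 1.5 * nominal_top               # 1.5x the test pressure of a given test
print(f"MAWP = {mawp:.1f} psid, nominal top = {nominal_top:.1f} psid, "
      f"structural (for a test at {nominal_top:.1f} psid) = {structural:.1f} psid")
```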

  19. Compliance With WHO/UNICEF BFHI Standards in Croatia After Implementation of the BFHI.

    PubMed

    Zakarija-Grković, Irena; Boban, Marija; Janković, Sunčana; Ćuže, Anamarija; Burmaz, Tea

    2018-02-01

    The primary goal of the Baby-Friendly Hospital Initiative (BFHI) is to create conditions in maternity facilities that enable women to initiate and sustain the practice of breastfeeding exclusively. Research aim: This study aimed to determine hospital practices and breastfeeding rates before and after BFHI implementation and assess compliance with UNICEF/World Health Organization (WHO) standards for seven of the BFHI's Ten Steps to Successful Breastfeeding ( Ten Steps). Mothers of healthy, term infants ( N = 1,115) were recruited from the postnatal ward of the University Hospital of Split, Croatia, between February 2008 and July 2011 and followed for 12 months in a repeated-measures, prospective, longitudinal, three-group, nonequivalent, cohort study. Breastfeeding rates, hospital practices-including seven of the Ten Steps-and maternal sociodemographic data were collected. Parts of all seven Ten Steps that were assessed improved significantly post-BFHI. Step 3 ("antenatal education") showed the least improvement, whereas Step 7 ("rooming-in"; 2.6% pre-BFHI vs. 98.5% post-BFHI) and Step 9 ("no pacifiers/teats"; 21.8% pre-BFHI vs. 99.4% post-BFHI) showed the greatest improvement. Six months after Baby-Friendly designation, only Steps 7 and 9 were in full compliance with UNICEF/WHO standards. In-hospital, exclusive-breastfeeding rates rose markedly ( p < .001), but no change occurred in breastfeeding rates at 3, 6, or 12 months. Full implementation of the BFHI was associated with significant improvement in hospital practices and in-hospital, exclusive-breastfeeding rates, but it did not affect breastfeeding rates postdischarge, emphasizing the vital role of community support. Baby-Friendly Hospital Initiative standards declined rapidly post-hospital designation, indicating the need for regular monitoring and reassessment as well as ongoing, effective training for hospital staff.

  20. Development of one-step hollow fiber supported liquid phase sampling technique for occupational workplace air analysis using high performance liquid chromatography with ultra-violet detector.

    PubMed

    Yan, Cheing-Tong; Chien, Hai-Ying

    2012-07-13

    In this study, a simple and novel one-step hollow-fiber supported liquid-phase sampling (HF-LPS) technique was developed for enriched sampling of gaseous toxic species prior to chemical analysis for workplace air monitoring. A lab-made apparatus designed with a gaseous sample generator and a microdialysis sampling cavity (for HF-LPS) was utilized and evaluated to simulate gaseous contaminant air for occupational workplace analysis. Gaseous phenol was selected as the model toxic species. A polyethersulfone hollow fiber dialysis module filled with ethylene glycol in the shell-side was applied as the absorption solvent to collect phenol from a gas flow through the tube-side, based on the concentration distribution of phenol between the absorption solvent and the gas flow. After sampling, 20 μL of the extractant was analyzed by high performance liquid chromatography with ultraviolet detection (HPLC-UV). Factors that influence the generation of gaseous standards and the HF-LPS were studied thoroughly. Results indicated that at 25 °C the phenol (2000 μg/mL) standard solution injected at 15-μL/min can be vaporized into sampling cavity under nitrogen flow at 780 mL/min, to generate gaseous phenol with concentration approximate to twice the permissible exposure limit. Sampling at 37.3 mL/min for 30 min can meet the requirement of the workplace air monitoring. The phenol in air ranged between 0.7 and 10 cm³/m³ (shows excellent linearity) with recovery between 98.1 and 104.1%. The proposed method was identified as a one-step sampling for workplace monitoring with advantages of convenience, rapidity, sensitivity, and usage of less-toxic solvent. Copyright © 2012 Elsevier B.V. All rights reserved.
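
    The claim that the generated standard is roughly twice the permissible exposure limit can be checked with a back-of-the-envelope calculation from the stated solution concentration, injection rate, and carrier flow; the PEL value used below (OSHA, 5 ppm, about 19 mg/m³ for phenol) is an assumption for illustration and is not taken from the abstract.

```python
# Back-of-the-envelope check of the gaseous standard generation described
# above: 2000 ug/mL phenol solution injected at 15 uL/min and vaporised into
# a 780 mL/min nitrogen flow. The PEL value used (assumed ~19 mg/m3 for
# phenol) is an illustration, not taken from the abstract.
c_solution = 2000.0        # ug phenol per mL of standard solution
q_inject = 15e-3           # mL of solution per minute (15 uL/min)
q_gas = 780.0              # mL of carrier gas per minute

mass_rate = c_solution * q_inject          # ug phenol delivered per minute
c_gas = mass_rate / q_gas                  # ug per mL of gas
c_gas_mg_m3 = c_gas * 1e6 / 1000           # 1 m3 = 1e6 mL; 1000 ug = 1 mg

pel_mg_m3 = 19.0                           # assumed PEL for phenol
print(f"generated ~{c_gas_mg_m3:.0f} mg/m3, i.e. {c_gas_mg_m3 / pel_mg_m3:.1f}x PEL")
```

    The result, roughly 38 mg/m³, is about twice the assumed PEL, consistent with the abstract's statement.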

  1. Dynamic lighting system for the learning environment: performance of elementary students.

    PubMed

    Choi, Kyungah; Suk, Hyeon-Jeong

    2016-05-16

    This study aims to investigate the effects of lighting color temperatures on elementary students' performance, and thereby propose a dynamic lighting system for a smart learning environment. Three empirical studies were conducted: First, physiological responses were measured as a potential mediator of performance. Second, cognitive and behavioral responses were observed during academic and recess activities. Lastly, the experiment was carried out in a real-life setting with prolonged exposure. With a comprehensive analysis of the three studies, three lighting presets-3500 K, 5000 K, and 6500 K-are suggested for easy, standard, and intensive activity, respectively. The study is expected to act as a good stepping stone for developing dynamic lighting systems to support students' performance in learning environments.

  2. A two-step ultra-high-performance liquid chromatography-quadrupole/time of flight mass spectrometry with mass defect filtering method for rapid identification of analogues from known components of different chemical structure types in Fructus Gardeniae-Fructus Forsythiae herb pair extract and in rat's blood.

    PubMed

    Zhou, Wei; Shan, Jinjun; Meng, Minxin

    2018-08-17

    Fructus Gardeniae-Fructus Forsythiae herb pair is an herbal formula used extensively to treat inflammation and fever, but few systematic identification studies of the bioactive components have been reported. Herein, the unknown analogues in the first-step screening were rapidly identified from representative compounds in different structure types (geniposide as iridoid type, crocetin as crocetin type, jasminoside B as monocyclic monoterpene type, oleanolic acid as saponin type, 3-caffeoylquinic acid as organic acid type, forsythoside A as phenylethanoid type, phillyrin as lignan type and quercetin 3-rutinoside as flavonoid type) by UPLC-Q-Tof/MS combined with mass defect filtering (MDF), and further confirmed with reference standards and published literatures. Similarly, in the second step, other unknown components were rapidly discovered from the compounds identified in the first step by MDF. Using the two-step screening method, a total of 58 components were characterized in Fructus Gardeniae-Fructus Forsythiae (FG-FF) decoction. In rat's blood, 36 compounds in extract and 16 metabolites were unambiguously or tentatively identified. Besides, we found the principal metabolites were glucuronide conjugates, with the glucuronide conjugates of caffeic acid, quercetin and kaempferol confirmed as caffeic acid 3-glucuronide, quercetin 3-glucuronide and kaempferol 3-glucuronide by reference standards, respectively. Additionally, most of them bound more strongly to human serum albumin than their respective prototypes, predicted by Molecular Docking and Simulation, indicating that they had lower blood clearance in vivo and possibly more contribution to pharmacological effects. This study developed a novel two-step screening method in addressing how to comprehensively screen components in herbal medicine by UPLC-Q-Tof/MS with MDF. Copyright © 2018 Elsevier B.V. All rights reserved.
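
    The mass defect filtering step can be pictured with the small sketch below: candidate ions are kept only if their mass defect falls within a narrow window around that of a template compound. The template mass (geniposide, roughly 388.137 Da) and the window widths are illustrative assumptions, not the parameters used in the paper.

```python
# Hedged sketch of the mass defect filtering (MDF) idea used in the two-step
# screening: keep candidate ions whose mass defect lies within a narrow window
# around a template compound's defect. The template mass and the
# +/-25 mDa / +/-100 Da windows are illustrative settings.
def mass_defect(m):
    return m - round(m)      # decimal part relative to the nearest integer (nominal) mass

def mdf_filter(candidates, template_mass, defect_win=0.025, mass_win=100.0):
    t_defect = mass_defect(template_mass)
    return [m for m in candidates
            if abs(m - template_mass) <= mass_win
            and abs(mass_defect(m) - t_defect) <= defect_win]

peaks = [388.1372, 404.1321, 550.1901, 390.0021, 388.4505, 330.1315]
print(mdf_filter(peaks, template_mass=388.137))   # retains analogue-like masses
```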

  3. A fractional-N frequency divider for multi-standard wireless transceiver fabricated in 0.18 μm CMOS process

    NASA Astrophysics Data System (ADS)

    Wang, Jiafeng; Fan, Xiangning; Shi, Xiaoyang; Wang, Zhigong

    2017-12-01

    With the rapid evolution of wireless communication technology, integrating various communication modes in a mobile terminal has become the popular trend. Because of this, multi-standard wireless technology is one of the hot spots in current research. This paper presents a wideband fractional-N frequency divider of the multi-standard wireless transceiver for many applications. High-speed divider-by-2 with traditional source-coupled-logic is designed for very wide band usage. Phase switching technique and a chain of divider-by-2/3 are applied to the programmable frequency divider with 0.5 step. The phase noise of the whole frequency synthesizer will be decreased by the narrower step of programmable frequency divider. Δ-Σ modulator is achieved by an improved MASH 1-1-1 structure. This structure has excellent performance in many ways, such as noise, spur and input dynamic range. Fabricated in TSMC 0.18μm CMOS process, the fractional-N frequency divider occupies a chip area of 1130 × 510 μm2 and it can correctly divide within the frequency range of 0.8-9 GHz. With 1.8 V supply voltage, its division ratio ranges from 62.5 to 254 and the total current consumption is 29 mA.
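
    The role of the MASH 1-1-1 modulator can be illustrated with a behavioural sketch (not the chip's circuit): the modulator dithers the instantaneous integer division ratio so that its long-run average equals the programmed fractional value. The accumulator width and the example ratio are assumptions for illustration.

```python
# Behavioural sketch of a MASH 1-1-1 delta-sigma modulator dithering the
# instantaneous division ratio so that its long-run average equals
# N_int + K / 2**BITS. Accumulator width and example values are illustrative.
BITS = 16
MOD = 1 << BITS

def mash111(frac_word, n_steps):
    s1 = s2 = s3 = 0
    c2_d = c3_d = c3_dd = 0          # delayed carries for noise shaping
    out = []
    for _ in range(n_steps):
        c1, s1 = divmod(s1 + frac_word, MOD)   # three cascaded accumulators
        c2, s2 = divmod(s2 + s1, MOD)
        c3, s3 = divmod(s3 + s2, MOD)
        # carries combined through first- and second-order differences
        y = c1 + (c2 - c2_d) + (c3 - 2 * c3_d + c3_dd)
        c2_d, c3_dd, c3_d = c2, c3_d, c3
        out.append(y)
    return out

n_int, frac = 125, 0.5               # target ratio 125.5 (0.5-step divider)
seq = mash111(int(frac * MOD), 10000)
ratios = [n_int + y for y in seq]
print("mean division ratio:", sum(ratios) / len(ratios))
```

    Averaged over many reference cycles, the printed mean settles very close to 125.5, which is how a 0.5 division-ratio step is realized with integer-only divide moduli.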

  4. Hospital cost accounting: implementing the system successfully.

    PubMed

    Burik, D; Duvall, T J

    1985-05-01

    To successfully implement a cost accounting system, certain key steps should be undertaken. These steps include developing and installing software; developing cost center budgets and inter-cost center allocations; developing service item standard costs; generating cost center level and patient level standard cost reports and reconciling these costs to actual costs; generating product line profitability reports and reconciling these reports to the financial statements; and providing ad hoc reporting capabilities. By following these steps, potential problems in the implementation process can be anticipated and avoided.

  5. Aerodynamic performance of conventional and advanced design labyrinth seals with solid-smooth abradable, and honeycomb lands. [gas turbine engines

    NASA Technical Reports Server (NTRS)

    Stocker, H. L.; Cox, D. M.; Holle, G. F.

    1977-01-01

    Labyrinth air seal static and dynamic performance was evaluated using solid, abradable, and honeycomb lands with standard and advanced seal designs. The effects on leakage of land surface roughness, abradable land porosity, rub grooves in abradable lands, and honeycomb land cell size and depth were studied using a standard labyrinth seal. The effects of rotation on the optimum seal knife pitch were also investigated. Selected geometric and aerodynamic parameters for an advanced seal design were evaluated to derive an optimized performance configuration. The rotational energy requirements were also measured to determine the inherent friction and pumping energy absorbed by the various seal knife and land configurations tested in order to properly assess the net seal system performance level. Results indicate that: (1) seal leakage can be significantly affected with honeycomb or abradable lands; (2) rotational energy absorption does not vary significantly with the use of a solid-smooth, an abradable, or a honeycomb land; and (3) optimization of an advanced lab seal design produced a configuration that had leakage 25% below a conventional stepped seal.

  6. Cholera Rapid Test with Enrichment Step Has Diagnostic Performance Equivalent to Culture

    PubMed Central

    Ontweka, Lameck N.; Deng, Lul O.; Rauzier, Jean; Debes, Amanda K.; Tadesse, Fisseha; Parker, Lucy A.; Wamala, Joseph F.; Bior, Bior K.; Lasuba, Michael; But, Abiem Bona; Grandesso, Francesco; Jamet, Christine; Cohuet, Sandra; Ciglenecki, Iza; Serafini, Micaela; Sack, David A.; Quilici, Marie-Laure; Azman, Andrew S.; Luquero, Francisco J.

    2016-01-01

    Cholera rapid diagnostic tests (RDT) could play a central role in outbreak detection and surveillance in low-resource settings, but their modest performance has hindered their broad adoption. The addition of an enrichment step may improve test specificity. We describe the results of a prospective diagnostic evaluation of the Crystal VC RDT (Span Diagnostics, India) with enrichment step and of culture, each compared to polymerase chain reaction (PCR), during a cholera outbreak in South Sudan. RDTs were performed on alkaline peptone water inoculated with stool and incubated for 4–6 hours at ambient temperature. Cholera culture was performed from wet filter paper inoculated with stool. Molecular detection of Vibrio cholerae O1 by PCR was done from dry Whatman 903 filter papers inoculated with stool, and from wet filter paper supernatant. In August and September 2015, 101 consecutive suspected cholera cases were enrolled, of which 36 were confirmed by PCR. The enriched RDT had 86.1% (95% CI: 70.5–95.3) sensitivity and 100% (95% CI: 94.4–100) specificity compared to PCR as the reference standard. The sensitivity of culture versus PCR was 83.3% (95% CI: 67.2–93.6) for culture performed on site and 72.2% (95% CI: 54.8–85.8) at the international reference laboratory, where samples were tested after an average delay of two months after sample collection, and specificity was 98.5% (95% CI: 91.7–100) and 100% (95% CI: 94.5–100), respectively. The RDT with enrichment showed performance comparable to that of culture and could be a sustainable alternative to culture confirmation where laboratory capacity is limited. PMID:27992488

  7. Specifying and Pilot Testing Quality Measures for the American Society of Addiction Medicine's Standards of Care.

    PubMed

    Harris, Alex H S; Weisner, Constance M; Chalk, Mady; Capoccia, Victor; Chen, Cheng; Thomas, Cindy Parks

    2016-01-01

    In 2013, the American Society of Addiction Medicine (ASAM) approved its Standards of Care for the Addiction Specialist Physician. Subsequently, an ASAM Performance Measures Panel identified and prioritized the standards to be operationalized into performance measures. The goal of this study is to describe the process of operationalizing 3 of these standards into quality measures, and to present the initial measure specifications and results of pilot testing these measures in a large health care system. By presenting the process rather than just the end results, we hope to shed light on the measure development process to educate, and also to stimulate debate about the decisions that were made. Each measure was decomposed into major concepts. Then each concept was operationalized using commonly available administrative data sources. Alternative specifications examined and sensitivity analyses were conducted to inform decisions that balanced accuracy, clinical nuance, and simplicity. Using data from the US Veterans Health Administration (VHA), overall performance and variation in performance across 119 VHA facilities were calculated. Three measures were operationalized and pilot tested: pharmacotherapy for alcohol use disorder, pharmacotherapy for opioid use disorder, and timely follow-up after medically managed withdrawal (aka detoxification). Each measure was calculable with available data, and showed ample room for improvement (no ceiling effects) and wide facility-level variability. Next steps include conducting feasibility and pilot testing in other health care systems and other contexts such as standalone addiction treatment programs, and also to study the specification and predictive validity of these measures.

  8. Brief International Cognitive Assessment for MS (BICAMS): international standards for validation.

    PubMed

    Benedict, Ralph H B; Amato, Maria Pia; Boringa, Jan; Brochet, Bruno; Foley, Fred; Fredrikson, Stan; Hamalainen, Paivi; Hartung, Hans; Krupp, Lauren; Penner, Iris; Reder, Anthony T; Langdon, Dawn

    2012-07-16

    An international expert consensus committee recently recommended a brief battery of tests for cognitive evaluation in multiple sclerosis. The Brief International Cognitive Assessment for MS (BICAMS) battery includes tests of mental processing speed and memory. Recognizing that resources for validation will vary internationally, the committee identified validation priorities, to facilitate international acceptance of BICAMS. Practical matters pertaining to implementation across different languages and countries were discussed. Five steps to achieve optimal psychometric validation were proposed. In Step 1, test stimuli should be standardized for the target culture or language under consideration. In Step 2, examiner instructions must be standardized and translated, including all information from manuals necessary for administration and interpretation. In Step 3, samples of at least 65 healthy persons should be studied for normalization, matched to patients on demographics such as age, gender and education. The objective of Step 4 is test-retest reliability, which can be investigated in a small sample of MS and/or healthy volunteers over 1-3 weeks. Finally, in Step 5, criterion validity should be established by comparing MS and healthy controls. At this time, preliminary studies are underway in a number of countries as we move forward with this international assessment tool for cognition in MS.

  9. An overview on STEP-NC compliant controller development

    NASA Astrophysics Data System (ADS)

    Othman, M. A.; Minhat, M.; Jamaludin, Z.

    2017-10-01

    The capabilities of conventional Computer Numerical Control (CNC) machine tools, as termination organisers, to fabricate high-quality parts promptly, economically and precisely are undeniable. To date, most CNCs follow the programming standard ISO 6983, also called G & M code. However, in a fluctuating shop-floor environment, the flexibility and interoperability of current CNC systems, and their ability to react dynamically and adaptively, are believed to be limited. Blocks of this outdated programming language do not explicitly relate to each other, so there is no control of arbitrary locations beyond block-by-block motion. To address this limitation, a new standard known as STEP-NC was developed in the late 1990s and is formalized as ISO 14649. It adds intelligence to the CNC in terms of interoperability, flexibility, adaptability and openness. This paper presents an overview of the research work that has been done in developing a STEP-NC controller standard and of the capabilities of STEP-NC to meet modern manufacturing demands. The review found that most existing STEP-NC controller prototypes are based on the type 1 and type 2 implementation levels; little effort has yet been devoted to developing type 3 and type 4 STEP-NC compliant controllers.

  10. Quantitation of sugar content in pyrolysis liquids after acid hydrolysis using high-performance liquid chromatography without neutralization.

    PubMed

    Johnston, Patrick A; Brown, Robert C

    2014-08-13

    A rapid method for the quantitation of total sugars in pyrolysis liquids using high-performance liquid chromatography (HPLC) was developed. The method avoids the tedious and time-consuming sample preparation required by current analytical methods. It is possible to directly analyze hydrolyzed pyrolysis liquids, bypassing the neutralization step usually required in determination of total sugars. A comparison with traditional methods was used to determine the validity of the results. The calibration curve coefficient of determination on all standard compounds was >0.999 using a refractive index detector. The relative standard deviation for the new method was 1.13%. The spiked sugar recoveries on the pyrolysis liquid samples were between 104 and 105%. The research demonstrates that it is possible to obtain excellent accuracy and efficiency using HPLC to quantitate glucose after acid hydrolysis of polymeric and oligomeric sugars found in fast pyrolysis bio-oils without neutralization.
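
    The relative standard deviation and spike recovery quoted above follow from simple formulas; the replicate and spike values below are made up purely to illustrate the arithmetic and are not the study's data.

```python
# Small sketch of the figures of merit quoted above: relative standard
# deviation of replicate measurements and spiked-sugar recovery. The replicate
# and spike values are made-up illustrations, not the study's data.
import statistics as st

replicates = [30.1, 30.4, 29.8, 30.5, 30.2]        # g/L glucose, repeat injections
rsd = 100 * st.stdev(replicates) / st.mean(replicates)

unspiked, spiked, added = 30.2, 40.6, 10.0         # g/L
recovery = 100 * (spiked - unspiked) / added

print(f"RSD = {rsd:.2f}%, spike recovery = {recovery:.1f}%")
```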

  11. Bootstrap Signal-to-Noise Confidence Intervals: An Objective Method for Subject Exclusion and Quality Control in ERP Studies

    PubMed Central

    Parks, Nathan A.; Gannon, Matthew A.; Long, Stephanie M.; Young, Madeleine E.

    2016-01-01

    Analysis of event-related potential (ERP) data includes several steps to ensure that ERPs meet an appropriate level of signal quality. One such step, subject exclusion, rejects subject data if ERP waveforms fail to meet an appropriate level of signal quality. Subject exclusion is an important quality control step in the ERP analysis pipeline as it ensures that statistical inference is based only upon those subjects exhibiting clear evoked brain responses. This critical quality control step is most often performed simply through visual inspection of subject-level ERPs by investigators. Such an approach is qualitative, subjective, and susceptible to investigator bias, as there are no standards as to what constitutes an ERP of sufficient signal quality. Here, we describe a standardized and objective method for quantifying waveform quality in individual subjects and establishing criteria for subject exclusion. The approach uses bootstrap resampling of ERP waveforms (from a pool of all available trials) to compute a signal-to-noise ratio confidence interval (SNR-CI) for individual subject waveforms. The lower bound of this SNR-CI (SNRLB) yields an effective and objective measure of signal quality as it ensures that ERP waveforms statistically exceed a desired signal-to-noise criterion. SNRLB provides a quantifiable metric of individual subject ERP quality and eliminates the need for subjective evaluation of waveform quality by the investigator. We detail the SNR-CI methodology, establish the efficacy of employing this approach with Monte Carlo simulations, and demonstrate its utility in practice when applied to ERP datasets. PMID:26903849
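
    A hedged sketch of the bootstrap procedure is given below. It assumes a particular SNR definition (RMS of the averaged waveform in a post-stimulus window divided by the standard deviation of its pre-stimulus baseline), a percentile confidence interval, and an exclusion criterion of SNR-LB < 1; the paper's exact choices may differ.

```python
# Hedged sketch of a bootstrap SNR confidence interval for one subject's ERP.
# Assumptions (not necessarily the paper's exact choices): SNR is the RMS of
# the averaged waveform in a post-stimulus window divided by the standard
# deviation of its pre-stimulus baseline, and the lower bound is the 2.5th
# percentile of the bootstrap distribution.
import numpy as np

def bootstrap_snr_lb(trials, baseline, signal, n_boot=2000, alpha=0.05, rng=None):
    """trials: (n_trials, n_samples) single-trial epochs for one subject."""
    rng = rng or np.random.default_rng()
    n = trials.shape[0]
    snrs = np.empty(n_boot)
    for b in range(n_boot):
        erp = trials[rng.integers(0, n, n)].mean(axis=0)   # resample trials
        noise = erp[baseline].std()
        snrs[b] = np.sqrt((erp[signal] ** 2).mean()) / noise
    return np.percentile(snrs, 100 * alpha / 2)            # lower CI bound

# Toy data: 60 trials, 100 Hz sampling, 200 ms baseline + evoked bump after stimulus.
rng = np.random.default_rng(2)
t = np.arange(-0.2, 0.8, 0.01)
evoked = 2.0 * np.exp(-((t - 0.3) / 0.05) ** 2)
trials = evoked + rng.normal(0, 5.0, (60, t.size))
lb = bootstrap_snr_lb(trials, baseline=t < 0, signal=(t > 0.2) & (t < 0.4), rng=rng)
print("exclude subject" if lb < 1.0 else "keep subject", f"(SNR lower bound = {lb:.2f})")
```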

  12. Do placebo based validation standards mimic real batch products behaviour? Case studies.

    PubMed

    Bouabidi, A; Talbi, M; Bouklouze, A; El Karbane, M; Bourichi, H; El Guezzar, M; Ziemons, E; Hubert, Ph; Rozet, E

    2011-06-01

    Analytical methods validation is a mandatory step to evaluate the ability of developed methods to provide accurate results for their routine application. Validation usually involves validation standards or quality control samples that are prepared in placebo or reconstituted matrix made of a mixture of all the ingredients composing the drug product except the active substance or the analyte under investigation. However, one of the main concerns that can be made with this approach is that it may lack an important source of variability that come from the manufacturing process. The question that remains at the end of the validation step is about the transferability of the quantitative performance from validation standards to real authentic drug product samples. In this work, this topic is investigated through three case studies. Three analytical methods were validated using the commonly spiked placebo validation standards at several concentration levels as well as using samples coming from authentic batch samples (tablets and syrups). The results showed that, depending on the type of response function used as calibration curve, there were various degrees of differences in the results accuracy obtained with the two types of samples. Nonetheless the use of spiked placebo validation standards was showed to mimic relatively well the quantitative behaviour of the analytical methods with authentic batch samples. Adding these authentic batch samples into the validation design may help the analyst to select and confirm the most fit for purpose calibration curve and thus increase the accuracy and reliability of the results generated by the method in routine application. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Interviewing Neuroscientists for an Undergraduate Honors Project

    PubMed Central

    Montiel, Catalina; Meitzen, John

    2017-01-01

    Honors projects that supplement standard coursework are a widely used practice in undergraduate curricula. These projects can take many forms, ranging from laboratory research projects to performing service learning to literature analyses. Here we discuss an honors project focused on interviewing neuroscientists to learn about individual scientific practice and career paths, and synthesizing the resulting information into a personal reflection essay. We detail step-by-step instructions for performing this type of project, including how to develop interview questions, a sample project timeline, deliverables, learning objectives and outcomes, and address potential pitfalls. We provide sample interview questions, an interview solicitation email, and in the supplemental materials an example student reflection essay, assessment rubrics, and the transcription of a student-conducted interview of Drs. John Godwin and Santosh Mishra of North Carolina State University. This type of project is a promising method to enable student-researcher communication, and potentially useful to a broad spectrum of both honors and non-honors neuroscience coursework. PMID:29371847

  14. Interviewing Neuroscientists for an Undergraduate Honors Project.

    PubMed

    Montiel, Catalina; Meitzen, John

    2017-01-01

    Honors projects that supplement standard coursework are a widely used practice in undergraduate curricula. These projects can take many forms, ranging from laboratory research projects to performing service learning to literature analyses. Here we discuss an honors project focused on interviewing neuroscientists to learn about individual scientific practice and career paths, and synthesizing the resulting information into a personal reflection essay. We detail step-by-step instructions for performing this type of project, including how to develop interview questions, a sample project timeline, deliverables, learning objectives and outcomes, and address potential pitfalls. We provide sample interview questions, an interview solicitation email, and in the supplemental materials an example student reflection essay, assessment rubrics, and the transcription of a student-conducted interview of Drs. John Godwin and Santosh Mishra of North Carolina State University. This type of project is a promising method to enable student-researcher communication, and potentially useful to a broad spectrum of both honors and non-honors neuroscience coursework.

  15. DOD Civilian Personnel. Intelligence Personnel System Incorporates Safeguards, but Opportunities Exist for Improvement

    DTIC Science & Technology

    2009-12-01

    [The indexed DTIC excerpt is fragmentary, mixing report documentation form fields with body text. The recoverable text lists merit system principles (maintaining high standards of integrity, conduct, and concern for the public interest; using the Federal work force efficiently and effectively), identifies the U.S. Government Accountability Office, 441 G Street, as the performing organization, and notes that in its implementation of DCIPS, DOD has taken some positive steps to incorporate 10 internal safeguards; the remainder of the excerpt is truncated.]

  16. Past, Present, and Future of Minimally Invasive Abdominal Surgery

    PubMed Central

    Antoniou, George A.; Antoniou, Athanasios I.; Granderath, Frank-Alexander

    2015-01-01

    Laparoscopic surgery has generated a revolution in operative medicine during the past few decades. Although strongly criticized during its early years, minimization of surgical trauma and the benefits of minimization to the patient have been brought to our attention through the efforts and vision of a few pioneers in the recent history of medicine. The German gynecologist Kurt Semm (1927–2003) transformed the use of laparoscopy for diagnostic purposes into a modern therapeutic surgical concept, having performed the first laparoscopic appendectomy, inspiring Erich Mühe and many other surgeons around the world to perform a wide spectrum of procedures by minimally invasive means. Laparoscopic cholecystectomy soon became the gold standard, and various laparoscopic procedures are now preferred over open approaches, in the light of emerging evidence that demonstrates less operative stress, reduced pain, and shorter convalescence. Natural orifice transluminal endoscopic surgery (NOTES) and single-incision laparoscopic surgery (SILS) may be considered further steps toward minimization of surgical trauma, although these methods have not yet been standardized. Laparoscopic surgery with the use of a robotic platform constitutes a promising field of investigation. New technologies are to be considered under the prism of the history of surgery; they seem to be a step toward further minimization of surgical trauma, but not definite therapeutic modalities. Patient safety and medical ethics must be the cornerstone of future investigation and implementation of new techniques. PMID:26508823

  17. Steps towards the international regulatory acceptance of non-animal methodology in safety assessment.

    PubMed

    Sewell, Fiona; Doe, John; Gellatly, Nichola; Ragan, Ian; Burden, Natalie

    2017-10-01

    The current animal-based paradigm for safety assessment must change. In September 2016, the UK National Centre for Replacement, Refinement and Reduction of Animals in Research (NC3Rs) brought together scientists from regulatory authorities, academia and industry to review progress in bringing new methodology into regulatory use, and to identify ways to expedite progress. Progress has been slow. Science is advancing to make this possible but changes are necessary. The new paradigm should allow new methodology to be adopted once it is developed rather than being based on a fixed set of studies. Regulatory authorities can help by developing Performance-Based Standards. The most pressing need is in repeat dose toxicology, although setting standards will be more complex than in areas such as sensitization. Performance standards should be aimed directly at human safety, not at reproducing the results of animal studies. Regulatory authorities can also aid progress towards the acceptance of non-animal based methodology by promoting "safe-haven" trials where traditional and new methodology data can be submitted in parallel to build up experience in the new methods. Industry can play its part in the acceptance of new methodology, by contributing to the setting of performance standards and by actively contributing to "safe-haven" trials. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  18. Reliability and Concurrent Validity of the Narrow Path Walking Test in Persons With Multiple Sclerosis.

    PubMed

    Rosenblum, Uri; Melzer, Itshak

    2017-01-01

    About 90% of people with multiple sclerosis (PwMS) have gait instability and 50% fall. Reliable and clinically feasible methods of gait instability assessment are needed. The study investigated the reliability and validity of the Narrow Path Walking Test (NPWT) under single-task (ST) and dual-task (DT) conditions for PwMS. Thirty PwMS performed the NPWT on 2 different occasions, a week apart. Number of Steps, Trial Time, Trial Velocity, Step Length, Number of Step Errors, Number of Cognitive Task Errors, and Number of Balance Losses were measured. Intraclass correlation coefficients (ICC2,1) were calculated from the average values of NPWT parameters. Absolute reliability was quantified from standard error of measurement (SEM) and smallest real difference (SRD). Concurrent validity of NPWT with Functional Reach Test, Four Square Step Test (FSST), 12-item Multiple Sclerosis Walking Scale (MSWS-12), and 2 Minute Walking Test (2MWT) was determined using partial correlations. Intraclass correlation coefficients (ICCs) for most NPWT parameters during ST and DT ranged from 0.46-0.94 and 0.55-0.95, respectively. The highest relative reliability was found for Number of Step Errors (ICC = 0.94 and 0.93, for ST and DT, respectively) and Trial Velocity (ICC = 0.83 and 0.86, for ST and DT, respectively). Absolute reliability was high for Number of Step Errors in ST (SEM % = 19.53%) and DT (SEM % = 18.14%) and low for Trial Velocity in ST (SEM % = 6.88%) and DT (SEM % = 7.29%). Significant correlations for Number of Step Errors and Trial Velocity were found with FSST, MSWS-12, and 2MWT. In persons with PwMS performing the NPWT, Number of Step Errors and Trial Velocity were highly reliable parameters. Based on correlations with other measures of gait instability, Number of Step Errors was the most valid parameter of dynamic balance under the conditions of our test.Video Abstract available for more insights from the authors (see Supplemental Digital Content 1, available at: http://links.lww.com/JNPT/A159).
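
    The absolute-reliability quantities reported above follow from the usual formulas SEM = SD * sqrt(1 - ICC) and SRD = 1.96 * sqrt(2) * SEM. The mean and SD in the sketch below are illustrative values chosen only to be roughly consistent with the reported ICC and SEM% for Number of Step Errors, not the study's raw data.

```python
# Hedged sketch of the absolute-reliability quantities reported above, using
# the usual formulas SEM = SD * sqrt(1 - ICC) and SRD = 1.96 * sqrt(2) * SEM.
# The sample mean and SD below are illustrative, not the study's raw data.
import math

def absolute_reliability(sd, icc, mean):
    sem = sd * math.sqrt(1.0 - icc)          # standard error of measurement
    srd = 1.96 * math.sqrt(2.0) * sem        # smallest real difference
    return sem, srd, 100.0 * sem / mean      # SEM%

sem, srd, sem_pct = absolute_reliability(sd=4.0, icc=0.94, mean=5.0)
print(f"SEM = {sem:.2f} step errors, SRD = {srd:.2f}, SEM% = {sem_pct:.1f}%")
```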

  19. Lessons Learned for Collaborative Clinical Content Development

    PubMed Central

    Collins, S.A.; Bavuso, K.; Zuccotti, G.; Rocha, R.A.

    2013-01-01

    Background Site-specific content configuration of vendor-based Electronic Health Records (EHRs) is a vital step in the development of standardized and interoperable content that can be used for clinical decision-support, reporting, care coordination, and information exchange. The multi-site, multi-stakeholder Acute Care Documentation (ACD) project at Partners Healthcare Systems (PHS) aimed to develop highly structured clinical content with adequate breadth and depth to meet the needs of all types of acute care clinicians at two academic medical centers. The Knowledge Management (KM) team at PHS led the informatics and knowledge management effort for the project. Objectives We aimed to evaluate the role, governance, and project management processes and resources for the KM team’s effort as part of the standardized clinical content creation. Methods We employed the Center for Disease Control’s six step Program Evaluation Framework to guide our evaluation steps. We administered a forty-four question, open-ended, semi-structured voluntary survey to gather focused, credible evidence from members of the KM team. Qualitative open-coding was performed to identify themes for lessons learned and concluding recommendations. Results Six surveys were completed. Qualitative data analysis informed five lessons learned and thirty specific recommendations associated with the lessons learned. The five lessons learned are: 1) Assess and meet knowledge needs and set expectations at the start of the project; 2) Define an accountable decision-making process; 3) Increase team meeting moderation skills; 4) Ensure adequate resources and competency training with online asynchronous collaboration tools; 5) Develop focused, goal-oriented teams and supportive, consultative service based teams. Conclusions Knowledge management requirements for the development of standardized clinical content within a vendor-based EHR among multi-stakeholder teams and sites include: 1) assessing and meeting informatics knowledge needs, 2) setting expectations and standardizing the process for decision-making, and 3) ensuring the availability of adequate resources and competency training. PMID:23874366

  20. Documentation of Gender Identity in an Adolescent and Young Adult Clinic.

    PubMed

    Vance, Stanley R; Mesheriakova, Veronika V

    2017-03-01

    To determine if changing electronic health record (EHR) note templates can increase documentation of gender identity in an adolescent and young adult clinic. A two-step gender question was added to EHR note templates for physicals in February 2016. A retrospective chart review was performed 3 months before and after this addition. The primary measure was whether answers to the two-step question were documented. Gender identity/birth-assigned sex discordance, age, and use of the appropriate note template post-template change were also measured. One hundred twenty-five pretemplate change and 106 post-template change physicals were reviewed with an inter-rater reliability of 97%. Documentation of answers to the two-step gender identity question increased from 11% to 84% (p < .001). This study suggests that incorporating a standardized question into EHR note templates is effective at improving the documentation of gender identity in youth presenting for annual physicals. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  1. Thermal neutron calibration channel at LNMRI/IRD.

    PubMed

    Astuto, A; Salgado, A P; Leite, S P; Patrão, K C S; Fonseca, E S; Pereira, W W; Lopes, R T

    2014-10-01

    The Brazilian Metrology Laboratory of Ionizing Radiations (LNMRI) standard thermal neutron flux facility was designed to provide uniform neutron fluence for calibration of small neutron detectors and individual dosemeters. This fluence is obtained by neutron moderation from four (241)Am-Be sources, each with 596 GBq, in a facility built with blocks of graphite/paraffin compound and high-purity carbon graphite. This study was carried out in two steps. In the first step, simulations using the MCNPX code on different geometric arrangements of moderator materials and neutron sources were performed. The quality of the resulting neutron fluence in terms of spectrum, cadmium ratio and gamma-neutron ratio was evaluated. In the second step, the system was assembled based on the results obtained on the simulations, and new measurements are being made. These measurements will validate the system, and other intercomparisons will ensure traceability to the International System of Units. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  2. Detection and Characterization of Viral Species/Subspecies Using Isothermal Recombinase Polymerase Amplification (RPA) Assays.

    PubMed

    Glais, Laurent; Jacquot, Emmanuel

    2015-01-01

    Numerous molecular-based detection protocols include an amplification step for the targeted nucleic acids. This step is important to reach the expected sensitivity of pathogen detection in diagnostic procedures. Amplification of nucleic acid sequences is generally performed, in the presence of appropriate primers, using thermocyclers. However, the time required to amplify molecular targets and the cost of thermocycler machines can impair the use of these methods in routine diagnostics. The recombinase polymerase amplification (RPA) technique allows rapid (short-term incubation of sample and primers in an enzymatic mixture) and simple (isothermal) amplification of molecular targets. The RPA protocol requires only basic molecular steps such as extraction procedures and agarose gel electrophoresis. Thus, RPA can be considered an interesting alternative to standard molecular-based diagnostic tools. In this paper, the complete procedure for setting up an RPA assay, applied to the detection of an RNA virus (Potato virus Y, Potyvirus) and a DNA virus (Wheat dwarf virus, Mastrevirus), is described. The proposed procedure allows the development of species- or subspecies-specific detection assays.

  3. Quality Measures in Pre-Analytical Phase of Tissue Processing: Understanding Its Value in Histopathology.

    PubMed

    Rao, Shalinee; Masilamani, Suresh; Sundaram, Sandhya; Duvuru, Prathiba; Swaminathan, Rajendiran

    2016-01-01

    Quality monitoring in a histopathology unit is categorized into three phases, pre-analytical, analytical and post-analytical, to cover the various steps of the entire test cycle. A review of the literature on quality evaluation studies pertaining to histopathology revealed that earlier reports focused mainly on analytical aspects, with limited studies assessing the pre-analytical phase. The pre-analytical phase encompasses several processing steps and handling of the specimen/sample by multiple individuals, leaving ample scope for errors. Because of its critical nature and the limited studies assessing it in the past, the pre-analytical phase deserves more attention. This study was undertaken to analyse and assess the quality parameters of the pre-analytical phase in a histopathology laboratory. This was a retrospective study of pre-analytical parameters in the histopathology laboratory of a tertiary care centre, covering 18,626 tissue specimens received over 34 months. Registers and records were checked for efficiency and errors in the pre-analytical quality variables: specimen identification, specimens in appropriate fixatives, lost specimens, daily internal quality control performance on staining, performance in an inter-laboratory quality assessment program (external quality assurance program, EQAS), and evaluation of internal non-conformities (NC) for other errors. The study revealed incorrect specimen labelling in 0.04%, 0.01% and 0.01% of specimens in 2007, 2008 and 2009, respectively. About 0.04%, 0.07% and 0.18% of specimens were not sent in fixatives in 2007, 2008 and 2009, respectively. There was no incidence of lost specimens. A total of 113 non-conformities were identified, of which 92.9% belonged to the pre-analytical phase. The predominant NC (any deviation from the normal standard which may generate an error and compromise quality standards) identified was wrong labelling of slides. Performance in EQAS for the pre-analytical phase was satisfactory in 6 of 9 cycles. The low incidence of errors in the pre-analytical phase implies that a satisfactory level of quality standards was being practised, with still scope for improvement.

  4. Preparation of a novel sorptive stir bar based on vinylpyrrolidone-ethylene glycol dimethacrylate monolithic polymer for the simultaneous extraction of diazepam and nordazepam from human plasma.

    PubMed

    Torabizadeh, Mahsa; Talebpour, Zahra; Adib, Nuoshin; Aboul-Enein, Hassan Y

    2016-04-01

    A new monolithic coating based on vinylpyrrolidone-ethylene glycol dimethacrylate polymer was introduced for stir bar sorptive extraction. The polymerization step was performed using different contents of monomer, cross-linker and porogenic solvent, and the best formulation was selected. The quality of the prepared vinylpyrrolidone-ethylene glycol dimethacrylate stir bars was satisfactory, demonstrating good repeatability within batch (relative standard deviation < 3.5%) and acceptable reproducibility between batches (relative standard deviation < 6.0%). The prepared stir bar was utilized in combination with ultrasound-assisted liquid desorption, followed by high-performance liquid chromatography with ultraviolet detection for the simultaneous determination of diazepam and nordazepam in human plasma samples. To optimize the extraction step, a three-level, four-factor, three-block Box-Behnken design was applied. Under the optimum conditions, the analytical performance of the proposed method displayed excellent linear dynamic ranges for diazepam (36-1200 ng/mL) and nordazepam (25-1200 ng/mL), with correlation coefficients of 0.9986 and 0.9968 and detection limits of 12 and 10 ng/mL, respectively. The intra- and interday recovery ranged from 93 to 106%, and the relative standard deviations were less than 6%. Finally, the proposed method was successfully applied to the analysis of diazepam and nordazepam at their therapeutic levels in human plasma. The novelty of this study is the improved polarity of the stir bar coating and its application for the simultaneous extraction of diazepam and its active metabolite, nordazepam, in human plasma samples. The method was more rapid than previously reported stir bar sorptive extraction techniques based on monolithic coatings, and exhibited lower detection limits in comparison with similar methods for the determination of diazepam and nordazepam in biological fluids. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Standard Operating Procedure for the Grinding and Extraction of Lead in Paint using Nitric Acid and a Rotor/Stator System Powered by a High Speed Motor

    EPA Science Inventory

    This Standard Operating Procedure (SOP) describes a new, rapid, and relatively inexpensive one step procedure which grinds the paint samples removed from the substrate and simultaneously quantitatively extracts the Pb from the paint in only one step in preparation for quantitativ...

  6. Leading the Common Core State Standards: From Common Sense to Common Practice

    ERIC Educational Resources Information Center

    Dunkle, Cheryl A.

    2012-01-01

    Many educators agree that we already know how to foster student success, so what is keeping common sense from becoming common practice? The author provides step-by-step guidance for overcoming the barriers to adopting the Common Core State Standards (CCSS) and achieving equity and excellence for all students. As an experienced teacher and…

  7. Living Traditions--A Teacher's Guide: Teaching Local History Using State and National Learning Standards.

    ERIC Educational Resources Information Center

    Skelding, Mark; Kemple, Martin; Kiefer, Joseph

    This guide is designed to take teachers through a step-by-step process for developing an integrated, standards-based curriculum that focuses on the stories, history, folkways, and agrarian traditions of the local community. Such a place-based curriculum helps students to become culturally literate, makes learning relevant and engaging, draws on…

  8. Two-step estimation in ratio-of-mediator-probability weighted causal mediation analysis.

    PubMed

    Bein, Edward; Deutsch, Jonah; Hong, Guanglei; Porter, Kristin E; Qin, Xu; Yang, Cheng

    2018-04-15

    This study investigates appropriate estimation of estimator variability in the context of causal mediation analysis that employs propensity score-based weighting. Such an analysis decomposes the total effect of a treatment on the outcome into an indirect effect transmitted through a focal mediator and a direct effect bypassing the mediator. Ratio-of-mediator-probability weighting estimates these causal effects by adjusting for the confounding impact of a large number of pretreatment covariates through propensity score-based weighting. In step 1, a propensity score model is estimated. In step 2, the causal effects of interest are estimated using weights derived from the prior step's regression coefficient estimates. Statistical inferences obtained from this 2-step estimation procedure are potentially problematic if the estimated standard errors of the causal effect estimates do not reflect the sampling uncertainty in the estimation of the weights. This study extends to ratio-of-mediator-probability weighting analysis a solution to the 2-step estimation problem by stacking the score functions from both steps. We derive the asymptotic variance-covariance matrix for the indirect effect and direct effect 2-step estimators, provide simulation results, and illustrate with an application study. Our simulation results indicate that the sampling uncertainty in the estimated weights should not be ignored. The standard error estimation using the stacking procedure offers a viable alternative to bootstrap standard error estimation. We discuss broad implications of this approach for causal analysis involving propensity score-based weighting. Copyright © 2018 John Wiley & Sons, Ltd.
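
    The stacking idea can be illustrated on a simpler two-step estimator than the ratio-of-mediator-probability-weighted one: an inverse-probability-weighted mean whose weights come from a first-step logistic propensity model. The sketch below (Python/NumPy; the toy data, the models, and all variable names are ours, not the study's) stacks the per-observation scores of both steps and forms the usual sandwich variance, so that uncertainty in the estimated weights propagates into the step-2 standard error.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 2000
    x = rng.normal(size=n)
    p = 1.0 / (1.0 + np.exp(-(0.3 + 0.8 * x)))        # true propensity
    z = rng.binomial(1, p)                             # treatment indicator
    y = 1.0 + 0.5 * x + z + rng.normal(size=n)         # outcome

    def scores(theta):
        """Stacked per-observation scores: logistic step (a0, a1) and IPW-mean step (mu)."""
        a0, a1, mu = theta
        e = 1.0 / (1.0 + np.exp(-(a0 + a1 * x)))       # fitted propensity
        s1 = z - e                                      # logistic score w.r.t. a0
        s2 = (z - e) * x                                # logistic score w.r.t. a1
        s3 = (z / e) * (y - mu)                         # weighted-mean estimating equation
        return np.column_stack([s1, s2, s3])

    def jacobian(theta, h=1e-6):
        """Finite-difference derivative of the mean stacked score."""
        J = np.zeros((3, 3))
        for j in range(3):
            tp = theta.copy(); tp[j] += h
            tm = theta.copy(); tm[j] -= h
            J[:, j] = (scores(tp).mean(axis=0) - scores(tm).mean(axis=0)) / (2 * h)
        return J

    def solve(theta, iters=50):
        """Newton iterations on the stacked moment conditions."""
        for _ in range(iters):
            g = scores(theta).mean(axis=0)
            theta = theta - np.linalg.solve(jacobian(theta), g)
        return theta

    theta_hat = solve(np.array([0.0, 0.0, y.mean()]))
    G = scores(theta_hat)
    A = jacobian(theta_hat)                            # average score derivative
    B = (G.T @ G) / n                                  # score outer product
    V = np.linalg.solve(A, B) @ np.linalg.inv(A).T / n # sandwich covariance of theta_hat
    print("IPW mean:", theta_hat[2], "stacked SE:", np.sqrt(V[2, 2]))
    ```

    Replacing the third score with the ratio-of-mediator-probability-weighted estimating equations would give the kind of stacked standard errors the authors derive; ignoring the first two rows reproduces the naive variance that treats the weights as known.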

  9. Single-step reinitialization and extending algorithms for level-set based multi-phase flow simulations

    NASA Astrophysics Data System (ADS)

    Fu, Lin; Hu, Xiangyu Y.; Adams, Nikolaus A.

    2017-12-01

    We propose efficient single-step formulations for reinitialization and extending algorithms, which are critical components of level-set based interface-tracking methods. The level-set field is reinitialized with a single-step (non-iterative) "forward tracing" algorithm. A minimum set of cells is defined that describes the interface, and reinitialization employs only data from these cells. Fluid states are extrapolated or extended across the interface by a single-step "backward tracing" algorithm. Both algorithms, which are motivated by analogy to ray-tracing, avoid multiple block-boundary data exchanges that are inevitable for iterative reinitialization and extending approaches within a parallel-computing environment. The single-step algorithms are combined with a multi-resolution conservative sharp-interface method and validated by a wide range of benchmark test cases. We demonstrate that the proposed reinitialization method achieves second-order accuracy in conserving the volume of each phase. The interface location is invariant to reapplication of the single-step reinitialization. Generally, we observe smaller absolute errors than for standard iterative reinitialization on the same grid. The computational efficiency is higher than for the standard and typical high-order iterative reinitialization methods. We observe a 2- to 6-times efficiency improvement over the standard method for serial execution. The proposed single-step extending algorithm, which is commonly employed for assigning data to ghost cells with ghost-fluid or conservative interface interaction methods, shows about a 10-times efficiency improvement over the standard method while maintaining the same accuracy. Despite their simplicity, the proposed algorithms offer an efficient and robust alternative to iterative reinitialization and extending methods for level-set based multi-phase simulations.
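
    For reference, the iterative baseline that the single-step scheme replaces restores the signed-distance property by evolving the level-set field in pseudo-time until the Eikonal condition |∇φ| = 1 holds near the interface. The classical Sussman-type formulation (our notation, not the authors' forward-tracing algorithm) reads:

    ```latex
    \frac{\partial \phi}{\partial \tau} \;=\; \operatorname{sign}(\phi_{0})\,\bigl(1 - \lvert \nabla \phi \rvert\bigr),
    \qquad \phi(\mathbf{x},\,\tau = 0) \;=\; \phi_{0}(\mathbf{x}).
    ```

    Each pseudo-time step of this PDE requires a block-boundary data exchange in a parallel setting, which is exactly the cost the single-step forward-tracing algorithm avoids.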

  10. STEP: What Is It and Should It Be Used for KSC's ISE/CEE Project in the Near Future?

    NASA Technical Reports Server (NTRS)

    Bareiss, Catherine C.

    2000-01-01

    The ability to exchange information between different engineering software (i.e, CAD, CAE, CAM) is necessary to aid in collaborative engineering. There are a number of different ways to accomplish this goal. One popular method is to transfer data via different file formats. However this method can lose data and becomes complex as more file formats are added. Another method is to use a standard protocol. STEP is one such standard. This paper gives an overview of STEP, provides a list of where to access more information, and develops guidelines to aid the reader in deciding if STEP is appropriate for his/her use.
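
    For readers who have not seen one, a STEP exchange file follows the ISO 10303-21 ("Part 21") clear-text encoding: a HEADER section naming the schema (application protocol) and a DATA section of numbered entity instances. The skeleton below is illustrative only; the schema name, timestamp, and the single entity are placeholders and are not taken from any KSC/ISE project file.

    ```
    ISO-10303-21;
    HEADER;
    FILE_DESCRIPTION(('illustrative geometry exchange'),'2;1');
    FILE_NAME('example.stp','2000-01-01T00:00:00',('author'),('organization'),
              'preprocessor','originating_system','');
    FILE_SCHEMA(('CONFIG_CONTROL_DESIGN'));  /* e.g. AP203; AP214/AP242 files name other schemas */
    ENDSEC;
    DATA;
    #1=CARTESIAN_POINT('origin',(0.0,0.0,0.0));
    ENDSEC;
    END-ISO-10303-21;
    ```

    Because every compliant tool reads and writes this same encoding against a published schema, no pairwise file-format translators are needed, which is the advantage over the per-format exchange approach described above.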

  11. Separation of Be and Al for AMS using single-step column chromatography

    NASA Astrophysics Data System (ADS)

    Binnie, Steven A.; Dunai, Tibor J.; Voronina, Elena; Goral, Tomasz; Heinze, Stefan; Dewald, Alfred

    2015-10-01

    With the aim of simplifying AMS target preparation procedures for TCN measurements we tested a new extraction chromatography approach which couples an anion exchange resin (WBEC) to a chelating resin (Beryllium resin) to separate Be and Al from dissolved quartz samples. Results show that WBEC-Beryllium resin stacks can be used to provide high purity Be and Al separations using a combination of hydrochloric/oxalic and nitric acid elutions. 10Be and 26Al concentrations from quartz samples prepared using more standard procedures are compared with results from replicate samples prepared using the coupled WBEC-Beryllium resin approach and show good agreement. The new column procedure is performed in a single step, reducing sample preparation times relative to more traditional methods of TCN target production.

  12. Kruskal-Wallis-based computationally efficient feature selection for face recognition.

    PubMed

    Ali Khan, Sajid; Hussain, Ayyaz; Basit, Abdul; Akram, Sheeraz

    2014-01-01

    Face recognition, and face recognition applications, have attained much greater importance in today's technological world. Most existing work uses frontal face images to classify a face image; however, these techniques fail when applied to real-world face images. The proposed technique effectively extracts the prominent facial features. Because many of the features are redundant and do not contribute to representing the face, a computationally efficient algorithm is used to eliminate them and select the more discriminative face features. The selected features are then passed to the classification step, in which different classifiers are combined in an ensemble to enhance the recognition accuracy rate, as a single classifier is unable to achieve high accuracy. Experiments are performed on standard face database images and the results are compared with existing techniques.
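
    The selection step is described only at a high level; a minimal version of Kruskal-Wallis-based feature ranking can be sketched as below (Python with SciPy/scikit-learn; the synthetic data, the choice of k, and the single SVM stand in for the paper's face features and classifier ensemble).

    ```python
    import numpy as np
    from scipy.stats import kruskal
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    # Synthetic stand-in for extracted face features: 300 samples, 200 features, 5 classes.
    X = rng.normal(size=(300, 200))
    y = rng.integers(0, 5, size=300)
    X[:, :10] += y[:, None] * 0.8           # make the first 10 features class-dependent

    # Rank each feature by its Kruskal-Wallis H statistic across the classes.
    h_stats = np.array([
        kruskal(*[X[y == c, j] for c in np.unique(y)]).statistic
        for j in range(X.shape[1])
    ])
    top_k = np.argsort(h_stats)[::-1][:20]  # keep the 20 most discriminative features

    X_tr, X_te, y_tr, y_te = train_test_split(X[:, top_k], y, random_state=0)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)
    print("accuracy on selected features:", clf.score(X_te, y_te))
    ```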

  13. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data.

    PubMed

    Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R

    2018-01-01

    Electroenchephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and recording lengths. HAPPE software is freely available under the terms of the GNU General Public License at https://github.com/lcnhappe/happe.
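
    HAPPE itself is a MATLAB pipeline built on EEGLAB (see the repository linked above). Purely as an illustration of the same class of steps, filtering, automated artifact rejection, and re-referencing, a rough MNE-Python equivalent might look like the sketch below; the file names and parameter choices are placeholders, not HAPPE's defaults.

    ```python
    import mne

    # Placeholder input file; HAPPE targets developmental / high-artifact recordings.
    raw = mne.io.read_raw_fif("infant_eeg_raw.fif", preload=True)

    # Band-pass filter, then run ICA to isolate and remove artifact components.
    raw.filter(l_freq=1.0, h_freq=40.0)
    ica = mne.preprocessing.ICA(n_components=20, random_state=0)
    ica.fit(raw)
    eog_idx, _ = ica.find_bads_eog(raw)      # requires EOG (or frontal proxy) channels
    ica.exclude = eog_idx
    clean = ica.apply(raw.copy())

    # Re-reference to the average of all EEG channels and save the processed data.
    clean.set_eeg_reference("average")
    clean.save("infant_eeg_clean_raw.fif", overwrite=True)
    ```

    HAPPE adds wavelet-based artifact removal, channel interpolation, and the standardized data-quality report described above; the sketch only mirrors its overall structure.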

  15. Fully Disposable Manufacturing Concepts for Clinical and Commercial Manufacturing and Ballroom Concepts.

    PubMed

    Boedeker, Berthold; Goldstein, Adam; Mahajan, Ekta

    2017-11-04

    The availability and use of pre-sterilized disposables has greatly changed the methods used in biopharmaceuticals development and production, particularly from mammalian cell culture. Nowadays, almost all process steps from cell expansion, fermentation, cell removal, and purification to formulation and storage of drug substances can be carried out in disposables, although there are still limitations with single-use technologies, particularly in the areas of pretesting and quality control of disposables, bag and connections standardization and qualification, extractables and leachables (E/L) validation, and dependency on individual vendors. The current status of single-use technologies is summarized for all process unit operations using a standard mAb process as an example. In addition, current pros and cons of using disposables are addressed in a comparative way, including quality control and E/L validation.The continuing progress in developing single-use technologies has an important impact on manufacturing facilities, resulting in much faster, less expensive and simpler plant design, start-up, and operation, because cell culture process steps are no longer performed in hard-piped unit operations. This leads to simpler operations in a lab-like environment. Overall it enriches the current landscape of available facilities from standard hard-piped to hard-piped/disposables hybrid to completely single-use-based production plants using the current segregation and containment concept. At the top, disposables in combination with completely and functionally closed systems facilitate a new, revolutionary design of ballroom facilities without or with much less segregation, which enables us to perform good manufacturing practice manufacturing of different products simultaneously in unclassified but controlled areas.Finally, single-use processing in lab-like shell facilities is a big enabler of transferring and establishing production in emergent countries, and this is described in more detail in 7. Graphical Abstract.

  16. Twenty-Four-Hour Blood Pressure Monitoring to Predict and Assess Impact of Renal Denervation: The DENERHTN Study (Renal Denervation for Hypertension).

    PubMed

    Gosse, Philippe; Cremer, Antoine; Pereira, Helena; Bobrie, Guillaume; Chatellier, Gilles; Chamontin, Bernard; Courand, Pierre-Yves; Delsart, Pascal; Denolle, Thierry; Dourmap, Caroline; Ferrari, Emile; Girerd, Xavier; Michel Halimi, Jean; Herpin, Daniel; Lantelme, Pierre; Monge, Matthieu; Mounier-Vehier, Claire; Mourad, Jean-Jacques; Ormezzano, Olivier; Ribstein, Jean; Rossignol, Patrick; Sapoval, Marc; Vaïsse, Bernard; Zannad, Faiez; Azizi, Michel

    2017-03-01

    The DENERHTN trial (Renal Denervation for Hypertension) confirmed the blood pressure (BP) lowering efficacy of renal denervation added to a standardized stepped-care antihypertensive treatment for resistant hypertension at 6 months. We report here the effect of denervation on 24-hour BP and its variability and look for parameters that predicted the BP response. Patients with resistant hypertension were randomly assigned to denervation plus stepped-care treatment or treatment alone (control). Average and standard deviation of 24-hour, daytime, and nighttime BP and the smoothness index were calculated on recordings performed at randomization and 6 months. Responders were defined as a 6-month 24-hour systolic BP reduction ≥20 mm Hg. Analyses were performed on the per-protocol population. The significantly greater BP reduction in the denervation group was associated with a higher smoothness index ( P =0.02). Variability of 24-hour, daytime, and nighttime BP did not change significantly from baseline to 6 months in both groups. The number of responders was greater in the denervation (20/44, 44.5%) than in the control group (11/53, 20.8%; P =0.01). In the discriminant analysis, baseline average nighttime systolic BP and standard deviation were significant predictors of the systolic BP response in the denervation group only, allowing adequate responder classification of 70% of the patients. Our results show that denervation lowers ambulatory BP homogeneously over 24 hours in patients with resistant hypertension and suggest that nighttime systolic BP and variability are predictors of the BP response to denervation. URL: https://www.clinicaltrials.gov. Unique identifier: NCT01570777. © 2017 American Heart Association, Inc.
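
    The smoothness index is conventionally computed as the ratio of the mean hourly treatment-induced blood pressure fall to the standard deviation of those hourly falls; under that assumption, a quick sketch with invented hourly values is given below (the numbers are illustrative only, not trial data).

    ```python
    import numpy as np

    # Hourly systolic BP (mmHg) at randomization and at 6 months -- illustrative values only.
    baseline = np.array([150, 148, 152, 149, 151, 147, 153, 150, 149, 152, 151, 148,
                         146, 150, 152, 149, 151, 153, 150, 148, 147, 149, 151, 150])
    followup = baseline - np.array([18, 20, 15, 22, 19, 17, 21, 16, 20, 18, 19, 17,
                                    23, 18, 16, 20, 19, 15, 21, 22, 18, 17, 19, 20])

    delta = baseline - followup                  # hourly BP reductions over 24 hours
    smoothness_index = delta.mean() / delta.std(ddof=1)
    print(f"mean 24-h reduction: {delta.mean():.1f} mmHg, smoothness index: {smoothness_index:.2f}")
    ```

    A larger index means the BP reduction is both sizeable and evenly distributed over the 24 hours, which is the sense in which denervation lowered ambulatory BP "homogeneously" in this trial.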

  17. Whole-body Magnetic Resonance Imaging in Inflammatory Arthritis: Systematic Literature Review and First Steps Toward Standardization and an OMERACT Scoring System.

    PubMed

    Østergaard, Mikkel; Eshed, Iris; Althoff, Christian E; Poggenborg, Rene P; Diekhoff, Torsten; Krabbe, Simon; Weckbach, Sabine; Lambert, Robert G W; Pedersen, Susanne J; Maksymowych, Walter P; Peterfy, Charles G; Freeston, Jane; Bird, Paul; Conaghan, Philip G; Hermann, Kay-Geert A

    2017-11-01

    Whole-body magnetic resonance imaging (WB-MRI) is a relatively new technique that can enable assessment of the overall inflammatory status of people with arthritis, but standards for image acquisition, definitions of key pathologies, and a quantification system are required. Our aim was to perform a systematic literature review (SLR) and to develop consensus definitions of key pathologies, anatomical locations for assessment, a set of MRI sequences and imaging planes for the different body regions, and a preliminary scoring system for WB-MRI in inflammatory arthritis. An SLR was initially performed, searching for WB-MRI studies in arthritis, osteoarthritis, spondyloarthritis, or enthesitis. These results were presented to a meeting of the MRI in Arthritis Working Group together with an MR image review. Following this, preliminary standards for WB-MRI in inflammatory arthritides were developed with further iteration at the Working Group meetings at the Outcome Measures in Rheumatology (OMERACT) 2016. The SLR identified 10 relevant original articles (7 cross-sectional and 3 longitudinal, mostly focusing on synovitis and/or enthesitis in spondyloarthritis, 4 with reproducibility data). The Working Group decided on inflammation in peripheral joints and entheses as primary focus areas, and then developed consensus MRI definitions for these pathologies, selected anatomical locations for assessment, agreed on a core set of MRI sequences and imaging planes for the different regions, and proposed a preliminary scoring system. It was decided to test and further develop the system by iterative multireader exercises. These first steps in developing an OMERACT WB-MRI scoring system for use in inflammatory arthritides offer a framework for further testing and refinement.

  18. Subdural Fluid Collection and Hydrocephalus After Foramen Magnum Decompression for Chiari Malformation Type I: Management Algorithm of a Rare Complication.

    PubMed

    Rossini, Zefferino; Milani, Davide; Costa, Francesco; Castellani, Carlotta; Lasio, Giovanni; Fornari, Maurizio

    2017-10-01

    Chiari malformation type I is a hindbrain abnormality characterized by descent of the cerebellar tonsils beneath the foramen magnum, frequently associated with symptoms or brainstem compression, impaired cerebrospinal fluid circulation, and syringomyelia. Foramen magnum decompression represents the most common way of treatment. Rarely, subdural fluid collection and hydrocephalus represent postoperative adverse events. The treatment of this complication is still debated, and physicians are sometimes uncertain when to perform diversion surgery and when to perform more conservative management. We report an unusual occurrence of subdural fluid collection and hydrocephalus that developed in a 23-year-old patient after foramen magnum decompression for Chiari malformation type I. Following a management protocol, based on a step-by-step approach, from conservative therapy to diversion surgery, the patient was managed with urgent external ventricular drainage, and then with conservative management and wound revision. Because of the rarity of this adverse event, previous case reports differ about the form of treatment. In future cases, finding clinical and radiologic features to identify risk factors that are useful in predicting if the patient will benefit from conservative management or will need to undergo diversion surgery is only possible if a uniform form of treatment is used. Therefore, we believe that a management algorithm based on a step-by-step approach will reduce the use of invasive therapies and help to create a standard of care. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Resuscitation training in small-group setting – gender matters

    PubMed Central

    2013-01-01

    Background Within cardiopulmonary resuscitation external chest compressions (ECC) are of outstanding importance. Frequent training in Basic Life Support (BLS) may improve the performance, but the perfect method or environment is still a matter of research. The objective of this study was to evaluate whether practical performance and retention of skills in resuscitation training may be influenced by the gender composition in learning groups. Methods Participants were allocated to three groups for standardized BLS-training: Female group (F): only female participants; Male group (M): only male participants; Standard group (S): male and female participants. All groups were trained with the standardized 4-step-approach method. Assessment of participants’ performance was done before training (t1), after one week (t2) and eight months later (t3) on a manikin in the same cardiac arrest single-rescuer-scenario. Participants were 251 Laypersons (mean age 21; SD 4; range 18–42 years; females 63%) without previous medical knowledge. Endpoints: compression rate 90-110/min; mean compression depth 38–51 mm. Standardized questionnaires were used for the evaluation of attitude and learning environment. Results After one week group F performed significantly better with respect to the achievement of the correct mean compression depth (F: 63% vs. S: 43%; p = 0.02). Moreover, groups F and S were the only groups which were able to improve their performance concerning the mean compression rate (t1: 35%; t3: 52%; p = 0.04). Female participants felt more comfortable in the female–only environment. Conclusions Resuscitation training in gender-segregated groups has an effect on individual performance with superior ECC skills in the female-only learning groups. Female participants could improve their skills by a more suitable learning environment, while male participants in the standard group felt less distracted by their peers than male participants in the male-only group. PMID:23590998

  20. Interactive data based on Apriori - AHP - C4.5 results assessment method

    NASA Astrophysics Data System (ADS)

    Zhao, Quan; Zhang, Li

    2017-05-01

    The analytic hierarchy process (AHP) method of weight calculation introduces the subjective judgment of experts at its assumed steps, so the supposedly objective result carries some uncertainty: the weights of the classroom-interaction data attributes end up differing little in proportion, and whole-class achievement scores therefore tend to converge. To address this, the concept of Apriori-AHP is introduced. C4.5 is first used to calculate the weight of each attribute column, and the Apriori-AHP algorithm then recomputes the attribute weights so that the importance of each attribute is considered across the performance-indicator table as a whole. Weighting the indicator table by the achievement of high-performing students makes class performance trends fluctuate rather than converge, giving teachers more realistic "standard" results for reference.
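
    The AHP weighting step referred to here is conventionally computed from a pairwise comparison matrix: the priority weights are the normalized principal eigenvector, and a consistency ratio checks the coherence of the expert judgments. A generic sketch follows (NumPy; the 4x4 comparison matrix is invented for illustration and is not the paper's classroom-interaction data).

    ```python
    import numpy as np

    # Pairwise comparison matrix for four hypothetical indicators (Saaty 1-9 scale).
    A = np.array([[1.0, 3.0, 5.0, 7.0],
                  [1/3, 1.0, 3.0, 5.0],
                  [1/5, 1/3, 1.0, 3.0],
                  [1/7, 1/5, 1/3, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                  # principal eigenvalue
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                     # AHP priority weights

    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)         # consistency index
    ri = 0.90                                    # Saaty's random index for n = 4
    cr = ci / ri                                 # consistency ratio (should be < 0.10)
    print("weights:", np.round(weights, 3), " CR:", round(cr, 3))
    ```

    In the hybrid scheme described above, data-driven scores (C4.5 gain, Apriori support) would adjust this purely judgment-based weight vector; the sketch covers only the standard AHP part.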

  1. Fractional Step Like Schemes for Free Surface Problems with Thermal Coupling Using the Lagrangian PFEM

    NASA Astrophysics Data System (ADS)

    Aubry, R.; Oñate, E.; Idelsohn, S. R.

    2006-09-01

    The method presented in Aubry et al. (Comput Struc 83:1459-1475, 2005) for the solution of an incompressible viscous fluid flow with heat transfer using a fully Lagrangian description of motion is extended to three dimensions (3D) with particular emphasis on mass conservation. A modified fractional step (FS) scheme based on the pressure Schur complement (Turek 1999), related to the class of algebraic splittings of Quarteroni et al. (Comput Methods Appl Mech Eng 188:505-526, 2000), is used, and a new advantage of splitting the equations, compared with the classical FS, is highlighted for free surface problems. The temperature is semi-coupled with the displacement, which is the main variable in a Lagrangian description. Comparisons for various mesh Reynolds numbers are performed with the classical FS, an algebraic splitting and a monolithic solution, in order to illustrate the behaviour of the Uzawa operator and the mass conservation. As the classical fractional step is equivalent to one iteration of the Uzawa algorithm performed with a standard Laplacian as a preconditioner, it will behave well only in a mesh Reynolds number range where the preconditioner is efficient. Numerical results are provided to assess the superiority of the modified algebraic splitting over the classical FS.
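
    For context, the classical FS baseline referred to above is the Chorin-type projection: an intermediate velocity is advanced without the pressure, a pressure Poisson problem enforces incompressibility, and the velocity is then corrected. In semi-discrete form (our notation, constant density ρ and viscosity ν):

    ```latex
    \frac{\mathbf{u}^{*}-\mathbf{u}^{n}}{\Delta t}
       \;=\; -(\mathbf{u}^{n}\cdot\nabla)\mathbf{u}^{n} + \nu\,\nabla^{2}\mathbf{u}^{*},
    \qquad
    \nabla^{2} p^{\,n+1} \;=\; \frac{\rho}{\Delta t}\,\nabla\cdot\mathbf{u}^{*},
    \qquad
    \mathbf{u}^{n+1} \;=\; \mathbf{u}^{*} - \frac{\Delta t}{\rho}\,\nabla p^{\,n+1}.
    ```

    The modified scheme discussed in the abstract instead splits the fully discrete system through the pressure Schur complement, which is what gives the reported advantage in mass conservation for free surface problems.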

  2. Performance and thermal behavior of wood plastic composite produced by nonmetals of pulverized waste printed circuit boards.

    PubMed

    Guo, Jie; Tang, Yinen; Xu, Zhenming

    2010-07-15

    A new kind of wood plastic composite (WPC) was produced by compounding nonmetals from waste printed circuit boards (PCBs), recycled high-density polyethylene (HDPE), wood flour and other additives. The blended granules were then extruded to profile WPC products by a conical counter-rotating twin-screw extruder. The results showed that the addition of nonmetals in WPC improved the flexural strength and tensile strength and reduced screw withdrawal strength. When the added content of nonmetals was 40%, the flexural strength of WPC was 23.4 MPa, tensile strength was 9.6 MPa, impact strength was 3.03 J/m(2) and screw withdrawal strength was 1755 N. Dimensional stability and fourier transform infrared spectroscopy (FTIR) of WPC panels were also investigated. Furthermore, thermogravimetric analysis showed that thermal degradation of WPC mainly included two steps. The first step was the decomposition of wood flour and nonmetals from 260 to 380 degrees C, and the second step was the decomposition of HDPE from 440 to 500 degrees C. The performance and thermal behavior of WPC produced by nonmetals from PCBs achieves the standard of WPC. It offers a novel method to treat nonmetals from PCBs. 2010 Elsevier B.V. All rights reserved.

  3. Standardisation of costs: the Dutch Manual for Costing in economic evaluations.

    PubMed

    Oostenbrink, Jan B; Koopmanschap, Marc A; Rutten, Frans F H

    2002-01-01

    The lack of a uniform costing methodology is often considered a weakness of economic evaluations that hinders the interpretation and comparison of studies. Standardisation is therefore an important topic within the methodology of economic evaluations and in national guidelines that formulate the formal requirements for studies to be considered when deciding on the reimbursement of new medical therapies. Recently, the Dutch Manual for Costing: Methods and Standard Costs for Economic Evaluations in Health Care (further referred to as "the manual") has been published, in addition to the Dutch guidelines for pharmacoeconomic research. The objectives of this article are to describe the main content of the manual and to discuss some key issues of the manual in relation to the standardisation of costs. The manual introduces a six-step procedure for costing. These steps concern: the scope of the study; the choice of cost categories; the identification of units; the measurement of resource use; the monetary valuation of units; and the calculation of unit costs. Each step consists of a number of choices and these together define the approach taken. In addition to a description of the costing process, five key issues regarding the standardisation of costs are distinguished. These are the use of basic principles, methods for measurement and valuation, standard costs (average prices of healthcare services), standard values (values that can be used within unit cost calculations), and the reporting of outcomes. The use of the basic principles, standard values and minimal requirements for reporting outcomes, as defined in the manual, is obligatory in studies that support submissions to acquire reimbursement for new pharmaceuticals. Whether to use standard costs, and the choice of a particular method to measure or value costs, is left mainly to the investigator, depending on the specific study setting. In conclusion, several instruments are available to increase standardisation in costing methodology among studies. These instruments have to be used in such a way that a balance is found between standardisation and the specific setting in which a study is performed. The way in which the Dutch manual tries to reach this balance can serve as an illustration for other countries.

  4. Comparison of Stepped Care Delivery Against a Single, Empirically Validated Cognitive-Behavioral Therapy Program for Youth With Anxiety: A Randomized Clinical Trial.

    PubMed

    Rapee, Ronald M; Lyneham, Heidi J; Wuthrich, Viviana; Chatterton, Mary Lou; Hudson, Jennifer L; Kangas, Maria; Mihalopoulos, Cathrine

    2017-10-01

    Stepped care is embraced as an ideal model of service delivery but is minimally evaluated. The aim of this study was to evaluate the efficacy of cognitive-behavioral therapy (CBT) for child anxiety delivered via a stepped-care framework compared against a single, empirically validated program. A total of 281 youth with anxiety disorders (6-17 years of age) were randomly allocated to receive either empirically validated treatment or stepped care involving the following: (1) low intensity; (2) standard CBT; and (3) individually tailored treatment. Therapist qualifications increased at each step. Interventions did not differ significantly on any outcome measures. Total therapist time per child was significantly shorter to deliver stepped care (774 minutes) compared with best practice (897 minutes). Within stepped care, the first 2 steps returned the strongest treatment gains. Stepped care and a single empirically validated program for youth with anxiety produced similar efficacy, but stepped care required slightly less therapist time. Restricting stepped care to only steps 1 and 2 would have led to considerable time saving with modest loss in efficacy. Clinical trial registration information-A Randomised Controlled Trial of Standard Care Versus Stepped Care for Children and Adolescents With Anxiety Disorders; http://anzctr.org.au/; ACTRN12612000351819. Copyright © 2017 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  5. Improving the performance of extreme learning machine for hyperspectral image classification

    NASA Astrophysics Data System (ADS)

    Li, Jiaojiao; Du, Qian; Li, Wei; Li, Yunsong

    2015-05-01

    Extreme learning machine (ELM) and kernel ELM (KELM) can offer performance comparable to the standard powerful classifier, the support vector machine (SVM), but at much lower computational cost thanks to an extremely simple training step. However, their performance may be sensitive to several parameters, such as the number of hidden neurons. An empirical linear relationship between the number of training samples and the number of hidden neurons is proposed. Such a relationship can be easily estimated with two small training sets and extended to large training sets so as to greatly reduce computational cost. Other parameters, such as the steepness parameter in the sigmoidal activation function and the regularization parameter in the KELM, are also investigated. The experimental results show that classification performance is sensitive to these parameters; fortunately, simple selections will result in suboptimal performance.
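
    As a reminder of why the training step is so cheap, an ELM fixes a random hidden layer and solves only a linear least-squares (or ridge) problem for the output weights; the number of hidden neurons L is the sensitive parameter discussed above. A minimal sketch follows (our own toy data, not a hyperspectral benchmark; the ridge term mimics the KELM-style regularization parameter).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Toy multi-class data standing in for hyperspectral pixels (n samples, d bands, c classes).
    n, d, c, L = 1000, 50, 4, 200
    X = rng.normal(size=(n, d))
    y = rng.integers(0, c, size=n)
    T = np.eye(c)[y]                              # one-hot targets

    W = rng.normal(size=(d, L))                   # random input weights (never trained)
    b = rng.normal(size=L)                        # random biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))        # sigmoidal hidden-layer output

    lam = 1e-2                                    # regularization parameter
    beta = np.linalg.solve(H.T @ H + lam * np.eye(L), H.T @ T)   # closed-form output weights

    pred = np.argmax(H @ beta, axis=1)
    print("training accuracy:", (pred == y).mean())
    ```

    The whole "training" is the single linear solve for beta, so the cost scales mainly with L, which is why choosing L from the sample size matters.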

  6. Burnout is Associated With Emotional Intelligence but not Traditional Job Performance Measurements in Surgical Residents.

    PubMed

    Cofer, Kevin D; Hollis, Robert H; Goss, Lauren; Morris, Melanie S; Porterfield, John R; Chu, Daniel I

    2018-02-23

    To evaluate whether burnout was associated with emotional intelligence and job performance in surgical residents. General surgery residents at a single institution were surveyed using the Maslach Burnout Inventory (MBI) and trait EI questionnaire (TEIQ-SF). Burnout was defined as scoring in 2 of the 3 following domains; Emotional Exhaustion (high), Depersonalization (high), and Personal Accomplishment (low). Job performance was evaluated using faculty evaluations of clinical competency-based surgical milestones and standardized test scores including the American Board of Surgery In-Training Exam (ABSITE) and the United States Medical Licensing Examination (USMLE) Step 3. USMLE Step 1 and USMLE Step 2, which were taken prior to residency training, were included to examine possible associations of burnout with USMLE examinations. Statistical comparison was made using Pearson correlation and simple linear regression adjusting for PGY level. This study was conducted at the University of Alabama at Birmingham (UAB) general surgery residency program. All current and incoming general surgery residents at UAB were invited to participate in this study. Forty residents participated in the survey (response rate 77%). Ten residents, evenly distributed from incoming residents to PGY-4, had burnout (25%). Mean global EI was lower in residents with burnout versus those without burnout (3.71 vs 3.9, p = 0.02). Of the 4 facets of EI, mean self-control values were lower in residents with burnout versus those without burnout (3.3 vs 4.06, p < 0.01). Each component of burnout was associated with global EI, with the strongest correlation being with personal accomplishment (r = 0.64; p < 0.01). Residents with burnout did not have significantly different mean scores for USMLE Step 1 (229 vs 237, p = 0.12), Step 2 (248 vs 251, p = 0.56), Step 3 (223 vs 222, p = 0.97), or ABSITE percentile (44.6 vs 58, p = 0.33) compared to residents without burnout. Personal accomplishment was associated with ABSITE percentile scores (r = 0.35; p = 0.049). None of the 16 surgical milestone scores were significantly associated with burnout. Burnout is present in surgery residents and associated with emotional intelligence. There was no association of burnout with USMLE scores, ABSITE percentile, or surgical milestones. Traditional methods of assessing resident performance may not be capturing burnout and strategies to reduce burnout should consider targeting emotional intelligence. Published by Elsevier Inc.

  7. A Guide for Developing Standard Operating Job Procedures for the Screening & Grinding Process Wastewater Treatment Facility. SOJP No. 1.

    ERIC Educational Resources Information Center

    Deal, Gerald A.; Montgomery, James A.

    This guide describes standard operating job procedures for the screening and grinding process of wastewater treatment facilities. The objective of this process is the removal of coarse materials from the raw waste stream for the protection of subsequent equipment and processes. The guide gives step-by-step instructions for safety inspection,…

  8. A Guide for Developing Standard Operating Job Procedures for the Sludge Thickening Process Wastewater Treatment Facility. SOJP No. 9.

    ERIC Educational Resources Information Center

    Schwing, Carl M.

    This guide describes standard operating job procedures for the screening and grinding process of wastewater treatment facilities. The objective of this process is the removal of coarse materials from the raw waste stream for the protection of subsequent equipment and processes. The guide gives step-by-step instructions for safety inspection,…

  9. A Guide for Developing Standard Operating Job Procedures for the Digestion Process Wastewater Treatment Facility. SOJP No. 10.

    ERIC Educational Resources Information Center

    Schwing, Carl M.

    This guide describes standard operating job procedures for the digestion process of wastewater treatment facilities. This process is for reducing the volume of sludge to be treated in subsequent units and to reduce the volatile content of sludge. The guide gives step-by-step instructions for pre-startup, startup, continuous operating, shutdown,…

  10. A Guide for Developing Standard Operating Job Procedures for the Tertiary Chemical Treatment - Lime Precipitation Process Wastewater Treatment Facility. SOJP No. 6.

    ERIC Educational Resources Information Center

    Petrasek, Al, Jr.

    This guide describes the standard operating job procedures for the tertiary chemical treatment - lime precipitation process of wastewater treatment plants. Step-by-step instructions are given for pre-start up, start-up, continuous operation, and shut-down procedures. In addition, some theoretical material is presented along with some relevant…

  11. A Guide for Developing Standard Operating Job Procedures for the Grit Removal Process Wastewater Treatment Facility. SOJP No. 2.

    ERIC Educational Resources Information Center

    Deal, Gerald A.; Montgomery, James A.

    This guide describes standard operating job procedures for the grit removal process of wastewater treatment plants. Step-by-step instructions are given for pre-start up inspection, start-up, continuous operation, and shut-down procedures. A description of the equipment used in the process is given. Some theoretical material is presented. (BB)

  12. A Guide for Developing Standard Operating Job Procedures for the Tertiary Multimedia Filtration Process Wastewater Treatment Facility. SOJP No. 7.

    ERIC Educational Resources Information Center

    Petrasek, Al, Jr.

    This guide describes the standard operating job procedures for the tertiary multimedia filtration process of wastewater treatment plants. The major objective of the filtration process is the removal of suspended solids from the reclaimed wastewater. The guide gives step-by-step instructions for pre-start up, start-up, continuous operation, and…

  13. Implementing standardized performance indicators to improve hypertension control at both the population and healthcare organization levels

    PubMed Central

    Campbell, Norm; Ordunez, Pedro; Jafe, Marc G.; Orias, Marcelo; DiPete, Donald J.; Patel, Pragna; Khan, Nadia; Onuma, Oyere; Lackland, Daniel T.

    2017-01-01

    The ability to reliably evaluate the impact of interventions and changes in hypertension prevalence and control is critical if the burden of hypertension-related disease is to be reduced. Previously, a World Hypertension League Expert Committee made recommendations to standardize the reporting of population blood pressure surveys. We have added to those recommendations and also provide modified recommendations from a Pan American Health Organization expert meeting for “performance indicators” to be used to evaluate clinical practices. Core indicators for population surveys are recommended to include: (1) mean systolic blood pressure and (2) mean diastolic blood pressure, and the prevalences of: (3) hypertension, (4) awareness of hypertension, (5) drug-treated hypertension, and (6) drug-treated and controlled hypertension. Core indicators for clinical registries are recommended to include: (1) the prevalence of diagnosed hypertension and (2) the ratio of diagnosed hypertension to that expected by population surveys, and the prevalences of: (3) controlled hypertension, (4) lack of blood pressure measurement within a year in people diagnosed with hypertension, and (5) missed visits by people with hypertension. Definitions and additional indicators are provided. Widespread adoption of standardized population and clinical hypertension performance indicators could represent a major step forward in the effort to control hypertension. PMID:28191704

  14. Three-step approach for prediction of limit cycle pressure oscillations in combustion chambers of gas turbines

    NASA Astrophysics Data System (ADS)

    Iurashev, Dmytro; Campa, Giovanni; Anisimov, Vyacheslav V.; Cosatto, Ezio

    2017-11-01

    Currently, gas turbine manufacturers frequently face the problem of strong acoustic combustion driven oscillations inside combustion chambers. These combustion instabilities can cause extensive wear and sometimes even catastrophic damages to combustion hardware. This requires prevention of combustion instabilities, which, in turn, requires reliable and fast predictive tools. This work presents a three-step method to find stability margins within which gas turbines can be operated without going into self-excited pressure oscillations. As a first step, a set of unsteady Reynolds-averaged Navier-Stokes simulations with the Flame Speed Closure (FSC) model implemented in the OpenFOAM® environment are performed to obtain the flame describing function of the combustor set-up. The standard FSC model is extended in this work to take into account the combined effect of strain and heat losses on the flame. As a second step, a linear three-time-lag-distributed model for a perfectly premixed swirl-stabilized flame is extended to the nonlinear regime. The factors causing changes in the model parameters when applying high-amplitude velocity perturbations are analysed. As a third step, time-domain simulations employing a low-order network model implemented in Simulink® are performed. In this work, the proposed method is applied to a laboratory test rig. The proposed method permits not only the unsteady frequencies of acoustic oscillations to be computed, but the amplitudes of such oscillations as well. Knowing the amplitudes of unstable pressure oscillations, it is possible to determine how these oscillations are harmful to the combustor equipment. The proposed method has a low cost because it does not require any license for computational fluid dynamics software.
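
    The flame models referred to in step two build on the classical n-τ (time-lag) description, in which the heat-release fluctuation follows the burner velocity fluctuation with a gain n and a delay τ; a "three-time-lag-distributed" model generalizes the single term to several delayed contributions. In standard notation (not the authors' extended nonlinear form):

    ```latex
    \frac{\dot{Q}'(t)}{\bar{\dot{Q}}} \;=\; n\,\frac{u'(t-\tau)}{\bar{u}}
    \qquad\Longleftrightarrow\qquad
    F(\omega) \;=\; \frac{\hat{\dot{Q}}/\bar{\dot{Q}}}{\hat{u}/\bar{u}} \;=\; n\,e^{-\mathrm{i}\omega\tau}.
    ```

    A distributed, multi-lag version replaces the single term by a sum over several (n_k, τ_k) pairs, and making the gains amplitude-dependent is what turns the describing function nonlinear so that limit-cycle amplitudes, not just frequencies, can be predicted.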

  15. Evaluating maturation and genetic modification of human dendritic cells in a new polyolefin cell culture bag system.

    PubMed

    Macke, Lars; Garritsen, Henk S P; Meyring, Wilhelm; Hannig, Horst; Pägelow, Ute; Wörmann, Bernhard; Piechaczek, Christoph; Geffers, Robert; Rohde, Manfred; Lindenmaier, Werner; Dittmar, Kurt E J

    2010-04-01

    Dendritic cells (DCs) are applied worldwide in several clinical studies of immune therapy of malignancies, autoimmune diseases, and transplantations. Most legislative bodies are demanding high standards for cultivation and transduction of cells. Closed-cell cultivating systems like cell culture bags would simplify and greatly improve the ability to reach these cultivation standards. We investigated if a new polyolefin cell culture bag enables maturation and adenoviral modification of human DCs in a closed system and compare the results with standard polystyrene flasks. Mononuclear cells were isolated from HLA-A*0201-positive blood donors by leukapheresis. A commercially available separation system (CliniMACS, Miltenyi Biotec) was used to isolate monocytes by positive selection using CD14-specific immunomagnetic beads. The essentially homogenous starting cell population was cultivated in the presence of granulocyte-macrophage-colony-stimulating factor and interleukin-4 in a closed-bag system in parallel to the standard flask cultivation system. Genetic modification was performed on Day 4. After induction of maturation on Day 5, mature DCs could be harvested and cryopreserved on Day 7. During the cultivation period comparative quality control was performed using flow cytometry, gene expression profiling, and functional assays. Both flasks and bags generated mature genetically modified DCs in similar yields. Surface membrane markers, expression profiles, and functional testing results were comparable. The use of a closed-bag system facilitated clinical applicability of genetically modified DCs. The polyolefin bag-based culture system yields DCs qualitatively and quantitatively comparable to the standard flask preparation. All steps including cryopreservation can be performed in a closed system facilitating standardized, safe, and reproducible preparation of therapeutic cells.

  16. Developing hospital accreditation standards in Uganda.

    PubMed

    Galukande, Moses; Katamba, Achilles; Nakasujja, Noeline; Baingana, Rhona; Bateganya, Moses; Hagopian, Amy; Tavrow, Paula; Barnhart, Scott; Luboga, Sam

    2016-07-01

    Whereas accreditation is widely used as a tool to improve quality of healthcare in the developed world, it is a concept not well adapted in most developing countries for a host of reasons, including insufficient incentives, insufficient training and a shortage of human and material resources. The purpose of this paper is to describe refining use and outcomes of a self-assessment hospital accreditation tool developed for a resource-limited context. We invited 60 stakeholders to review a set of standards (from which a self-assessment tool was developed), and subsequently refined them to include 485 standards in 7 domains. We then invited 60 hospitals to test them. A study team traveled to each of the 40 hospitals that agreed to participate providing training and debrief the self-assessment. The study was completed in 8 weeks. Hospital self-assessments revealed hospitals were remarkably open to frank rating of their performance and willing to rank all 485 measures. Good performance was measured in outreach programs, availability of some types of equipment and running water, 24-h staff calls systems, clinical guidelines and waste segregation. Poor performance was measured in care for the vulnerable, staff living quarters, physician performance reviews, patient satisfaction surveys and sterilizing equipment. We have demonstrated the feasibility of a self-assessment approach to hospital standards in low-income country setting. This low-cost approach may be used as a good precursor to establishing a national accreditation body, as indicated by the Ministry's efforts to take the next steps. Copyright © 2015 John Wiley & Sons, Ltd. Copyright © 2015 John Wiley & Sons, Ltd.

  17. Enriching step-based product information models to support product life-cycle activities

    NASA Astrophysics Data System (ADS)

    Sarigecili, Mehmet Ilteris

    The representation and management of product information across its life-cycle requires standardized data exchange protocols. The Standard for Exchange of Product Model Data (STEP) is such a standard and has been used widely by industry. Even though STEP-based product models are well defined and syntactically correct, populating product data according to these models is not easy because the models are large and loosely organized. Data exchange specifications (DEXs) and templates provide the re-organized information models required for the data exchange of specific activities in various businesses. DEXs show that it is possible to organize STEP-based product models to support different engineering activities at various stages of the product life-cycle. In this study, STEP-based models are enriched and organized to support two engineering activities: materials information declaration and tolerance analysis. Due to new environmental regulations, the substance and materials information in products has to be screened closely by manufacturing industries. This requires a fast, unambiguous and complete product information exchange between the members of a supply chain. The tolerance analysis activity, on the other hand, is used to verify the functional requirements of an assembly considering the worst-case (i.e., maximum and minimum) conditions for the part/assembly dimensions. Another issue with STEP-based product models is that the semantics of product data are represented only implicitly. Hence, it is difficult to interpret the semantics of the data across different product life-cycle phases and application domains. OntoSTEP, developed at NIST, provides semantically enriched product models in OWL. In this thesis, we present how to interpret GD&T specifications in STEP for tolerance analysis by utilizing OntoSTEP.

  18. Global linear-irreversible principle for optimization in finite-time thermodynamics

    NASA Astrophysics Data System (ADS)

    Johal, Ramandeep S.

    2018-03-01

    There is intense effort into understanding the universal properties of finite-time models of thermal machines at optimal performance, such as efficiency at maximum power, coefficient of performance at maximum cooling power, and other such criteria. In this letter, a global principle consistent with linear irreversible thermodynamics is proposed for the whole cycle, without considering details of irreversibilities in the individual steps of the cycle. This helps to express the total duration of the cycle as τ ∝ Q̄²/Δ_tot S, where Q̄ models the effective heat transferred through the machine during the cycle, and Δ_tot S is the total entropy generated. By taking Q̄ in the form of simple algebraic means (such as arithmetic and geometric means) over the heats exchanged by the reservoirs, the present approach is able to predict various standard expressions for figures of merit at optimal performance, as well as the bounds respected by them. It simplifies the optimization procedure to a one-parameter optimization, and provides a fresh perspective on the issue of universality at optimal performance, for small difference in reservoir temperatures. As an illustration, we compare the performance of a partially optimized four-step endoreversible cycle with the present approach.
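
    Written out, the duration ansatz quoted above and the best-known benchmark such approaches reproduce, the Curzon-Ahlborn efficiency at maximum power for reservoirs at temperatures T_h > T_c, are:

    ```latex
    \tau \;\propto\; \frac{\bar{Q}^{2}}{\Delta_{\mathrm{tot}} S},
    \qquad
    \eta_{\mathrm{CA}} \;=\; 1-\sqrt{\tfrac{T_{c}}{T_{h}}}
    \;=\; \frac{\eta_{C}}{2}+\frac{\eta_{C}^{2}}{8}+\mathcal{O}\!\left(\eta_{C}^{3}\right),
    \qquad \eta_{C}=1-\frac{T_{c}}{T_{h}},
    ```

    the series form being the universality for small reservoir-temperature differences mentioned in the abstract.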

  19. The IEEE GRSS Standardized Remote Sensing Data Website: A Step Towards "Science 2.0" in Remote Sensing

    NASA Astrophysics Data System (ADS)

    Dell'Acqua, Fabio; Iannelli, Gianni Cristian; Kerekes, John; Lisini, Gianni; Moser, Gabriele; Ricardi, Niccolo; Pierce, Leland

    2016-08-01

    The issue of homogeneity in the performance assessment of proposed information-extraction algorithms is widely perceived in the Earth Observation (EO) domain as well. Different authors propose different datasets to test their algorithms, and it is frequently difficult for the reader to assess which algorithm is better for a specific application, given the wide variability in test sets that makes a direct comparison of, for example, accuracy values less meaningful than one would desire. With our work, we make a modest contribution to easing the problem by making it possible to automatically distribute a limited set of "standard" open datasets, together with some ground-truth information, and to automatically assess the processing results provided by users.

  20. Simulant Basis for the Standard High Solids Vessel Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Reid A.; Fiskum, Sandra K.; Suffield, Sarah R.

    The Waste Treatment and Immobilization Plant (WTP) is working to develop a Standard High Solids Vessel Design (SHSVD) process vessel. To support testing of this new design, WTP engineering staff requested that a Newtonian simulant and a non-Newtonian simulant be developed that would represent the Most Adverse Design Conditions (in development) with respect to mixing performance as specified by WTP. The majority of the simulant requirements are specified in 24590-PTF-RPT-PE-16-001, Rev. 0. The first step in this process is to develop the basis for these simulants. This document describes the basis for the properties of these two simulant types. The simulant recipes that meet this basis will be provided in a subsequent document.

  1. Stencils and problem partitionings: Their influence on the performance of multiple processor systems

    NASA Technical Reports Server (NTRS)

    Reed, D. A.; Adams, L. M.; Patrick, M. L.

    1986-01-01

    Given a discretization stencil, partitioning the problem domain is an important first step for the efficient solution of partial differential equations on multiple processor systems. Partitions are derived that minimize interprocessor communication when the number of processors is known a priori and each domain partition is assigned to a different processor. This partitioning technique uses the stencil structure to select appropriate partition shapes. For square problem domains, it is shown that non-standard partitions (e.g., hexagons) are frequently preferable to the standard square partitions for a variety of commonly used stencils. This investigation is concluded with a formalization of the relationship between partition shape, stencil structure, and architecture, allowing selection of optimal partitions for a variety of parallel systems.
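    As a rough intuition for why non-square partitions can reduce interprocessor communication, the sketch below (a simplified proxy, not the paper's analysis) compares the boundary length of a square and a regular hexagon of equal area, since boundary length is a common proxy for per-partition communication volume.

```python
import math

def square_perimeter(area: float) -> float:
    # A square of area A has side sqrt(A) and perimeter 4*sqrt(A).
    return 4.0 * math.sqrt(area)

def hexagon_perimeter(area: float) -> float:
    # A regular hexagon of area A = (3*sqrt(3)/2) * s^2 has side s and perimeter 6*s.
    s = math.sqrt(2.0 * area / (3.0 * math.sqrt(3.0)))
    return 6.0 * s

area = 1.0  # one "unit" of problem domain per processor
print(f"square boundary : {square_perimeter(area):.4f}")
print(f"hexagon boundary: {hexagon_perimeter(area):.4f}")  # roughly 7% shorter boundary
```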

  2. Automated cellular sample preparation using a Centrifuge-on-a-Chip.

    PubMed

    Mach, Albert J; Kim, Jae Hyun; Arshi, Armin; Hur, Soojung Claire; Di Carlo, Dino

    2011-09-07

    The standard centrifuge is a laboratory instrument widely used by biologists and medical technicians for preparing cell samples. Efforts to automate the operations of concentration, cell separation, and solution exchange that a centrifuge performs in a simpler and smaller platform have had limited success. Here, we present a microfluidic chip that replicates the functions of a centrifuge without moving parts or external forces. The device operates using a purely fluid dynamic phenomenon in which cells selectively enter and are maintained in microscale vortices. Continuous and sequential operation allows enrichment of cancer cells from spiked blood samples at the mL/min scale, followed by fluorescent labeling of intra- and extracellular antigens on the cells without the need for manual pipetting and washing steps. A versatile centrifuge-analogue may open opportunities in automated, low-cost and high-throughput sample preparation as an alternative to the standard benchtop centrifuge in standardized clinical diagnostics or resource-poor settings.

  3. Nurse practitioners: leadership behaviors and organizational climate.

    PubMed

    Jones, L C; Guberski, T D; Soeken, K L

    1990-01-01

    The purpose of this article is to examine the relationships of individual nurse practitioners' perceptions of the leadership climate in their organizations and self-reported formal and informal leadership behaviors. The nine climate dimensions (Structure, Responsibility, Reward, Perceived Support of Risk Taking, Warmth, Support, Standard Setting, Conflict, and Identity) identified by Litwin and Stringer in 1968 were used to predict five leadership dimensions (Meeting Organizational Needs, Managing Resources, Leadership Competence, Task Accomplishment, and Communications). Demographic variables of age, educational level, and percent of time spent performing administrative functions were forced as a first step in each multiple regression analysis and used to explain a significant amount of variance in all but one analysis. All leadership dimensions were predicted by at least one organizational climate dimension: (1) Meeting Organizational Needs by Risk and Reward; (2) Managing Resources by Risk and Structure; (3) Leadership Competence by Risk and Standards; (4) Task Accomplishment by Structure, Risk, and Standards; and (5) Communication by Rewards.
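    A minimal sketch of the forced-entry (hierarchical) regression design described above is shown below, using statsmodels on synthetic data; the variable names, effect sizes and sample are invented and do not reproduce the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 120

# Synthetic stand-ins for the study's variables.
df = pd.DataFrame({
    "age": rng.normal(45, 8, n),
    "education": rng.integers(1, 4, n),      # e.g., 1 = bachelor's, 2 = master's, 3 = doctorate
    "pct_admin": rng.uniform(0, 60, n),      # % time on administrative functions
    "risk": rng.normal(0, 1, n),             # climate dimension: perceived support of risk taking
})
df["leadership"] = 0.3 * df["risk"] + 0.01 * df["pct_admin"] + rng.normal(0, 1, n)

# Step 1: demographic variables only (the "forced" block).
X1 = sm.add_constant(df[["age", "education", "pct_admin"]])
step1 = sm.OLS(df["leadership"], X1).fit()

# Step 2: add an organizational climate dimension.
X2 = sm.add_constant(df[["age", "education", "pct_admin", "risk"]])
step2 = sm.OLS(df["leadership"], X2).fit()

print(f"R^2 step 1 (demographics): {step1.rsquared:.3f}")
print(f"R^2 step 2 (+ climate):    {step2.rsquared:.3f}")
print(f"Delta R^2:                 {step2.rsquared - step1.rsquared:.3f}")
```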

  4. Simultaneous multielement atomic absorption spectrometry with graphite furnace atomization

    NASA Astrophysics Data System (ADS)

    Harnly, James M.; Miller-Ihli, Nancy J.; O'Haver, Thomas C.

    The extended analytical range capability of a simultaneous multielement atomic absorption continuum source spectrometer (SIMAAC) was tested for furnace atomization with respect to the signal measurement mode (peak height and area), the atomization mode (from the wall or from a platform), and the temperature program mode (stepped or ramped atomization). These parameters were evaluated with respect to the shapes of the analytical curves, the detection limits, carry-over contamination and accuracy. Peak area measurements gave more linear calibration curves. Methods that slow the atomization-step heating rate, i.e., the use of a ramped temperature program or a platform, produced similar calibration curves and longer linear ranges than atomization with a stepped temperature program. Peak height detection limits were best using stepped atomization from the wall. Peak area detection limits for all atomization modes were similar. Carry-over contamination was worse for peak area than peak height, worse for ramped atomization than stepped atomization, and worse for atomization from a platform than from the wall. Accurate determinations (100 ± 12%) for Ca, Cu, Fe, Mn, and Zn in National Bureau of Standards' Standard Reference Materials Bovine Liver 1577 and Rice Flour 1568 were obtained using peak area measurements with ramped atomization from the wall and stepped atomization from a platform. Only stepped atomization from a platform gave accurate recoveries for K. Accurate recoveries, 100 ± 10%, with precisions ranging from 1 to 36% (standard deviation), were obtained for the determination of Al, Co, Cr, Fe, Mn, Mo, Ni, Pb, V and Zn in Acidified Waters (NBS SRM 1643 and 1643a) using stepped atomization from a platform.

  5. The effects of multi-disciplinary psycho-social care on socio-economic problems in cancer patients: a cluster-randomized trial.

    PubMed

    Singer, Susanne; Roick, Julia; Meixensberger, Jürgen; Schiefke, Franziska; Briest, Susanne; Dietz, Andreas; Papsdorf, Kirsten; Mössner, Joachim; Berg, Thomas; Stolzenburg, Jens-Uwe; Niederwieser, Dietger; Keller, Annette; Kersting, Anette; Danker, Helge

    2018-06-01

    We examined whether multi-disciplinary stepped psycho-social care decreases financial problems and improves return-to-work in cancer patients. In a university hospital, wards were randomly allocated to either stepped or standard care. Stepped care comprised screening for financial problems, consultation between doctor and patient, and the provision of social services. Outcomes were financial problems at the time of discharge and return-to-work in patients < 65 years old half a year after baseline. The analysis employed mixed-effect multivariate regression modeling. Thirteen wards were randomized and 1012 patients participated (n = 570 in stepped care and n = 442 in standard care). Those who reported financial problems at baseline were less likely to have financial problems at discharge when they had received stepped care (odds ratio (OR) 0.2, 95% confidence interval (CI) 0.1, 0.7; p = 0.01). There was no evidence for an effect of stepped care on financial problems in patients without such problems at baseline (OR 1.1, CI 0.5, 2.6; p = 0.82). There were 399 patients < 65 years old who were not retired at baseline. In this group, there was no evidence for an effect of stepped care on being employed half a year after baseline (OR 0.7, CI 0.3, 2.0; p = 0.52) (trial registration: NCT01859429). In conclusion, financial problems can be avoided more effectively with multi-disciplinary stepped psycho-social care than with standard care in patients who have such problems.

  6. Single-Run Single-Mask Inductively-Coupled-Plasma Reactive-Ion-Etching Process for Fabricating Suspended High-Aspect-Ratio Microstructures

    NASA Astrophysics Data System (ADS)

    Yang, Yao-Joe; Kuo, Wen-Cheng; Fan, Kuang-Chao

    2006-01-01

    In this work, we present a single-run single-mask (SRM) process for fabricating suspended high-aspect-ratio structures on standard silicon wafers using an inductively coupled plasma-reactive ion etching (ICP-RIE) etcher. This process eliminates extra fabrication steps which are required for structure release after trench etching. Released microstructures with 120 μm thickness are obtained by this process. The corresponding maximum aspect ratio of the trench is 28. The SRM process is an extended version of the standard process proposed by BOSCH GmbH (BOSCH process). The first step of the SRM process is a standard BOSCH process for trench etching, then a polymer layer is deposited on trench sidewalls as a protective layer for the subsequent structure-releasing step. The structure is released by dry isotropic etching after the polymer layer on the trench floor is removed. All the steps can be integrated into a single-run ICP process. Also, only one mask is required. Therefore, the process complexity and fabrication cost can be effectively reduced. Discussions on each SRM step and considerations for avoiding undesired etching of the silicon structures during the release process are also presented.

  7. Intra-operative multi-site stimulation: Expanding methodology for cortical brain mapping of language functions

    PubMed Central

    Korn, Akiva; Kirschner, Adi; Perry, Daniella; Hendler, Talma; Ram, Zvi

    2017-01-01

    Direct cortical stimulation (DCS) is considered the gold-standard for functional cortical mapping during awake surgery for brain tumor resection. DCS is performed by stimulating one local cortical area at a time. We present a feasibility study using an intra-operative technique aimed at improving our ability to map brain functions which rely on activity in distributed cortical regions. Following standard DCS, Multi-Site Stimulation (MSS) was performed in 15 patients by applying simultaneous cortical stimulations at multiple locations. Language functioning was chosen as a case-cognitive domain due to its relatively well-known cortical organization. MSS, performed at sites that did not produce disruption when applied in a single stimulation point, revealed additional language dysfunction in 73% of the patients. Functional regions identified by this technique were presumed to be significant to language circuitry and were spared during surgery. No new neurological deficits were observed in any of the patients following surgery. Though the neuro-electrical effects of MSS need further investigation, this feasibility study may provide a first step towards sophistication of intra-operative cortical mapping. PMID:28700619

  8. A selective and sensitive method for quantitation of lysergic acid diethylamide (LSD) in whole blood by gas chromatography-ion trap tandem mass spectrometry.

    PubMed

    Libong, Danielle; Bouchonnet, Stéphane; Ricordel, Ivan

    2003-01-01

    A gas chromatography-ion trap tandem mass spectrometry (GC-ion trap MS-MS) method for detection and quantitation of LSD in whole blood is presented. The sample preparation process, including a solid-phase extraction step with Bond Elut cartridges, was performed with 2 mL of whole blood. Eight microliters of the purified extract was injected with a cold on-column injection method. Positive chemical ionization was performed using acetonitrile as reagent gas; LSD was detected in the MS-MS mode. The chromatograms obtained from blood extracts showed the great selectivity of the method. GC-MS quantitation was performed using lysergic acid methylpropylamide as the internal standard. The response of the MS was linear for concentrations ranging from 0.02 ng/mL (detection threshold) to 10.0 ng/mL. Several parameters such as the choice of the capillary column, the choice of the internal standard and that of the ionization mode (positive CI vs. EI) were rationalized. Decomposition pathways under both ionization modes were studied. Within-day and between-day stability were evaluated.

  9. Analysis of Workplace Health Education Performed by Occupational Health Managers in Korea.

    PubMed

    Kim, Yeon-Ha; Jung, Moon-Hee

    2016-09-01

    To evaluate workplace health education as practiced by occupational health managers based on standardized job tasks, and to suggest priority tasks and areas to be trained. The study was conducted between November 10, 2013 and April 30, 2014. The tool used in this study was a set of standardized job tasks for workplace health education by occupational health managers, developed through a series of methodological steps. It was evaluated by 233 worksite occupational health managers. Data were analyzed using SPSS 21.0. Predictor variables of workplace health education performance were the "analysis and planning" factor, type of enterprise, and form of management. Healthcare professionals and occupational health managers responsible for nonmanufacturing industries rated the "analysis and planning" factor as highly important but showed a low performance level on it. "Analysis and planning" is therefore a priority training area for healthcare professionals and occupational health managers in nonmanufacturing industries. It is necessary to develop a training curriculum for occupational health managers that includes improving the analysis of worksites and the planning of health education programs. Copyright © 2016. Published by Elsevier B.V.

  10. Intra-operative multi-site stimulation: Expanding methodology for cortical brain mapping of language functions.

    PubMed

    Gonen, Tal; Gazit, Tomer; Korn, Akiva; Kirschner, Adi; Perry, Daniella; Hendler, Talma; Ram, Zvi

    2017-01-01

    Direct cortical stimulation (DCS) is considered the gold-standard for functional cortical mapping during awake surgery for brain tumor resection. DCS is performed by stimulating one local cortical area at a time. We present a feasibility study using an intra-operative technique aimed at improving our ability to map brain functions which rely on activity in distributed cortical regions. Following standard DCS, Multi-Site Stimulation (MSS) was performed in 15 patients by applying simultaneous cortical stimulations at multiple locations. Language functioning was chosen as a case-cognitive domain due to its relatively well-known cortical organization. MSS, performed at sites that did not produce disruption when applied in a single stimulation point, revealed additional language dysfunction in 73% of the patients. Functional regions identified by this technique were presumed to be significant to language circuitry and were spared during surgery. No new neurological deficits were observed in any of the patients following surgery. Though the neuro-electrical effects of MSS need further investigation, this feasibility study may provide a first step towards sophistication of intra-operative cortical mapping.

  11. Empirical evaluation of humpback whale telomere length estimates; quality control and factors causing variability in the singleplex and multiplex qPCR methods.

    PubMed

    Olsen, Morten Tange; Bérubé, Martine; Robbins, Jooke; Palsbøll, Per J

    2012-09-06

    Telomeres, the protective cap of chromosomes, have emerged as powerful markers of biological age and life history in model and non-model species. The qPCR method for telomere length estimation is one of the most common methods for telomere length estimation, but has received recent critique for being too error-prone and yielding unreliable results. This critique coincides with an increasing awareness of the potentials and limitations of the qPCR technique in general and the proposal of a general set of guidelines (MIQE) for standardization of experimental, analytical, and reporting steps of qPCR. In order to evaluate the utility of the qPCR method for telomere length estimation in non-model species, we carried out four different qPCR assays directed at humpback whale telomeres, and subsequently performed a rigorous quality control to evaluate the performance of each assay. Performance differed substantially among assays and only one assay was found useful for telomere length estimation in humpback whales. The most notable factors causing these inter-assay differences were primer design and choice of using singleplex or multiplex assays. Inferred amplification efficiencies differed by up to 40% depending on assay and quantification method, however this variation only affected telomere length estimates in the worst performing assays. Our results suggest that seemingly well performing qPCR assays may contain biases that will only be detected by extensive quality control. Moreover, we show that the qPCR method for telomere length estimation can be highly precise and accurate, and thus suitable for telomere measurement in non-model species, if effort is devoted to optimization at all experimental and analytical steps. We conclude by highlighting a set of quality controls which may serve for further standardization of the qPCR method for telomere length estimation, and discuss some of the factors that may cause variation in qPCR experiments.
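    To illustrate why amplification efficiency matters for qPCR-based telomere estimates, the sketch below computes an efficiency-corrected relative telomere/single-copy-gene (T/S) ratio in the Pfaffl style; this is a generic calculation on invented Ct values, not the assay pipeline used in the paper.

```python
def relative_ts_ratio(ct_tel, ct_scg, ct_tel_ref, ct_scg_ref,
                      eff_tel=2.0, eff_scg=2.0):
    """Efficiency-corrected telomere/single-copy-gene (T/S) ratio of a sample
    relative to a reference (calibrator) sample, Pfaffl-style."""
    telomere_term = eff_tel ** (ct_tel_ref - ct_tel)   # fold change, telomere assay
    scg_term = eff_scg ** (ct_scg_ref - ct_scg)        # fold change, single-copy gene
    return telomere_term / scg_term

# Invented Ct values for one whale sample and a calibrator sample.
ts_ideal = relative_ts_ratio(18.2, 22.5, 19.0, 22.4, eff_tel=2.00, eff_scg=2.00)
ts_low_eff = relative_ts_ratio(18.2, 22.5, 19.0, 22.4, eff_tel=1.75, eff_scg=2.00)

print(f"T/S assuming 100% efficiency: {ts_ideal:.2f}")
print(f"T/S with telomere assay at ~75% efficiency: {ts_low_eff:.2f}")
```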

  12. Empirical evaluation of humpback whale telomere length estimates; quality control and factors causing variability in the singleplex and multiplex qPCR methods

    PubMed Central

    2012-01-01

    Background Telomeres, the protective cap of chromosomes, have emerged as powerful markers of biological age and life history in model and non-model species. The qPCR method for telomere length estimation is one of the most common methods for telomere length estimation, but has received recent critique for being too error-prone and yielding unreliable results. This critique coincides with an increasing awareness of the potentials and limitations of the qPCR technique in general and the proposal of a general set of guidelines (MIQE) for standardization of experimental, analytical, and reporting steps of qPCR. In order to evaluate the utility of the qPCR method for telomere length estimation in non-model species, we carried out four different qPCR assays directed at humpback whale telomeres, and subsequently performed a rigorous quality control to evaluate the performance of each assay. Results Performance differed substantially among assays and only one assay was found useful for telomere length estimation in humpback whales. The most notable factors causing these inter-assay differences were primer design and choice of using singleplex or multiplex assays. Inferred amplification efficiencies differed by up to 40% depending on assay and quantification method, however this variation only affected telomere length estimates in the worst performing assays. Conclusion Our results suggest that seemingly well performing qPCR assays may contain biases that will only be detected by extensive quality control. Moreover, we show that the qPCR method for telomere length estimation can be highly precise and accurate, and thus suitable for telomere measurement in non-model species, if effort is devoted to optimization at all experimental and analytical steps. We conclude by highlighting a set of quality controls which may serve for further standardization of the qPCR method for telomere length estimation, and discuss some of the factors that may cause variation in qPCR experiments. PMID:22954451

  13. Standardized versus Individualized Acupuncture for Chronic Low Back Pain: A Randomized Controlled Trial

    PubMed Central

    Pach, Daniel; Yang-Strobel, Xiaoli; Lüdtke, Rainer; Icke, Katja; Brinkhaus, Benno; Witt, Claudia M.

    2013-01-01

    We aimed to compare the effectiveness of standardized and individualized acupuncture treatment in patients with chronic low back pain. A single-center randomized controlled single-blind trial was performed in a general medical practice in Germany run by a Chinese-born medical doctor trained in western and Chinese medicine. One hundred and fifty outpatients with chronic low back pain were randomly allocated to two groups (78 standardized and 72 individualized acupuncture). Patients received either standardized acupuncture or individualized acupuncture; treatment encompassed between 10 and 15 sessions based on individual symptoms, with two sessions per week. The main outcome measure was the area under the curve (AUC) summarizing eight weeks of daily rated pain severity measured with a visual analogue scale (0 mm = no pain, 100 mm = worst imaginable pain). No significant differences between groups were observed for the AUC (individualized acupuncture mean: 1768.7 (95% CI, 1460.4; 2077.1); standardized acupuncture 1482.9 (1177.2; 1788.7); group difference, 285.8 (−33.9; 605.5); P = 0.080). In this single-center trial, individualized acupuncture was not superior to standardized acupuncture for patients suffering from chronic low back pain. As a next step, a multicenter noninferiority study should be performed to investigate whether standardized acupuncture treatment for chronic low back pain might be applicable in a broader usual care setting. This trial is registered with ClinicalTrials.gov NCT00758017. PMID:24288556
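    A minimal sketch of the primary outcome calculation, the area under the curve of daily pain ratings, is shown below using the trapezoidal rule on invented visual analogue scale data; it is illustrative only.

```python
import numpy as np

# Invented daily VAS pain ratings (0-100 mm) for one patient over 8 weeks (56 days).
rng = np.random.default_rng(1)
days = np.arange(56)
vas = np.clip(60 - 0.4 * days + rng.normal(0, 5, days.size), 0, 100)

# AUC of daily pain severity: higher values mean a larger cumulative pain burden.
auc = np.trapz(vas, days)
print(f"AUC over 8 weeks: {auc:.1f} mm*days")
```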

  14. Postural adjustment errors during lateral step initiation in older and younger adults

    PubMed Central

    Sparto, Patrick J.; Fuhrman, Susan I.; Redfern, Mark S.; Perera, Subashan; Jennings, J. Richard; Furman, Joseph M.

    2016-01-01

    The purpose was to examine age differences and varying levels of step response inhibition on the performance of a voluntary lateral step initiation task. Seventy older adults (70 – 94 y) and twenty younger adults (21 – 58 y) performed visually-cued step initiation conditions based on direction and spatial location of arrows, ranging from a simple choice reaction time task to a perceptual inhibition task that included incongruous cues about which direction to step (e.g. a left pointing arrow appearing on the right side of a monitor). Evidence of postural adjustment errors and step latencies were recorded from vertical ground reaction forces exerted by the stepping leg. Compared with younger adults, older adults demonstrated greater variability in step behavior, generated more postural adjustment errors during conditions requiring inhibition, and had greater step initiation latencies that increased more than younger adults as the inhibition requirements of the condition became greater. Step task performance was related to clinical balance test performance more than executive function task performance. PMID:25595953

  15. Postural adjustment errors during lateral step initiation in older and younger adults

    PubMed Central

    Sparto, Patrick J.; Fuhrman, Susan I.; Redfern, Mark S.; Perera, Subashan; Jennings, J. Richard; Furman, Joseph M.

    2014-01-01

    The purpose was to examine age differences and varying levels of step response inhibition on the performance of a voluntary lateral step initiation task. Seventy older adults (70 – 94 y) and twenty younger adults (21 – 58 y) performed visually-cued step initiation conditions based on direction and spatial location of arrows, ranging from a simple choice reaction time task to a perceptual inhibition task that included incongruous cues about which direction to step (e.g. a left pointing arrow appearing on the right side of a monitor). Evidence of postural adjustment errors and step latencies were recorded from vertical ground reaction forces exerted by the stepping leg. Compared with younger adults, older adults demonstrated greater variability in step behavior, generated more postural adjustment errors during conditions requiring inhibition, and had greater step initiation latencies that increased more than younger adults as the inhibition requirements of the condition became greater. Step task performance was related to clinical balance test performance more than executive function task performance. PMID:25183162

  16. A Guide for Developing Standard Operating Job Procedures for the Pump Station Process Wastewater Treatment Facility. SOJP No. 3.

    ERIC Educational Resources Information Center

    Perley, Gordon F.

    This is a guide for standard operating job procedures for the pump station process of wastewater treatment plants. Step-by-step instructions are given for pre-start up inspection, start-up procedures, continuous routine operation procedures, and shut-down procedures. A general description of the equipment used in the process is given. Two…

  17. Optimizing ROOT’s Performance Using C++ Modules

    NASA Astrophysics Data System (ADS)

    Vassilev, Vassil

    2017-10-01

    ROOT comes with a C++-compliant interpreter, cling. Cling needs to understand the content of the libraries in order to interact with them. Exposing the full shared-library descriptors to the interpreter at runtime translates into an increased memory footprint. ROOT's exploratory programming concepts allow implicit and explicit runtime shared-library loading, which requires the interpreter to load the library descriptor. Re-parsing of the descriptors' content has a noticeable effect on runtime performance. The present state-of-the-art lazy-parsing technique brings runtime performance to reasonable levels but proves to be fragile and can introduce correctness issues. An elegant solution is to load information from the descriptor lazily and in a non-recursive way. The LLVM community advances its C++ Modules technology, providing an I/O-efficient, on-disk representation capable of reducing build times and peak memory usage. The feature is standardized as a C++ technical specification. C++ Modules are a flexible concept, which can be employed to match CMS and other experiments' requirements for ROOT: to optimize both runtime memory usage and performance. Cling technically "inherits" the feature; however, tweaking it to ROOT's scale and beyond is a complex endeavor. The paper discusses the status of C++ Modules in the context of ROOT, supported by a few preliminary performance results. It shows a step-by-step migration plan and describes potential challenges which could appear.

  18. Robust automated mass spectra interpretation and chemical formula calculation using mixed integer linear programming.

    PubMed

    Baran, Richard; Northen, Trent R

    2013-10-15

    Untargeted metabolite profiling using liquid chromatography and mass spectrometry coupled via electrospray ionization is a powerful tool for the discovery of novel natural products, metabolic capabilities, and biomarkers. However, the elucidation of the identities of uncharacterized metabolites from spectral features remains challenging. A critical step in the metabolite identification workflow is the assignment of redundant spectral features (adducts, fragments, multimers) and calculation of the underlying chemical formula. Inspection of the data by experts using computational tools solving partial problems (e.g., chemical formula calculation for individual ions) can be performed to disambiguate alternative solutions and provide reliable results. However, manual curation is tedious and not readily scalable or standardized. Here we describe RAMSI, a procedure for robust automated mass spectra interpretation and chemical formula calculation using mixed integer linear programming optimization. Chemical rules among related ions are expressed as linear constraints, and both the spectra interpretation and the chemical formula calculation are performed in a single optimization step. This approach is unbiased in that it does not require predefined sets of neutral losses, and positive- and negative-polarity spectra can be combined in a single optimization. The procedure was evaluated with 30 experimental mass spectra and was found to effectively identify the protonated or deprotonated molecule ([M + H](+) or [M - H](-)) while being robust to the presence of background ions. RAMSI provides a much-needed standardized tool for interpreting ions for subsequent identification in untargeted metabolomics workflows.
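    To make the idea of encoding chemical rules as linear constraints concrete, the sketch below formulates a toy ion-assignment problem as a mixed integer linear program with PuLP: each observed m/z must be explained by exactly one adduct hypothesis, and all chosen hypotheses must be consistent with a single neutral mass. The m/z values, adduct table, tolerance and big-M constant are invented; this is not the RAMSI implementation.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, LpStatus, value

# Observed m/z values (invented) and candidate adduct hypotheses: name -> (multimer n, mass offset).
mz = [181.071, 203.053, 361.135]
adducts = {"M+H": (1, 1.007), "M+Na": (1, 22.989), "2M+H": (2, 1.007)}
names = list(adducts)
TOL, BIG_M = 0.01, 1000.0

prob = LpProblem("ion_assignment", LpMinimize)
M = LpVariable("neutral_mass", lowBound=0)  # shared neutral mass of the metabolite
x = {(i, j): LpVariable(f"x_{i}_{j}", cat=LpBinary)
     for i in range(len(mz)) for j in range(len(names))}

# Rule 1: every observed ion must be explained by exactly one adduct hypothesis.
for i in range(len(mz)):
    prob += lpSum(x[i, j] for j in range(len(names))) == 1

# Rule 2: if ion i is assigned hypothesis j, then |mz_i - (n*M + offset)| <= TOL.
# The implication is linearized with a big-M term that relaxes the bound when x[i, j] = 0.
for i, m_obs in enumerate(mz):
    for j, name in enumerate(names):
        n, offset = adducts[name]
        prob += m_obs - (n * M + offset) <= TOL + BIG_M * (1 - x[i, j])
        prob += (n * M + offset) - m_obs <= TOL + BIG_M * (1 - x[i, j])

prob += M  # dummy objective: any feasible, consistent assignment is acceptable
prob.solve()

print(LpStatus[prob.status], "neutral mass ~", round(value(M), 3))
print({mz[i]: names[j] for (i, j), var in x.items() if var.value() == 1})
```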

  19. Assessing the effectiveness of combining evaluation methods for the early identification of students with inadequate knowledge during a clerkship.

    PubMed

    Hemmer, Paul A.; Grau, Thomas; Pangaro, Louis N.

    2001-10-01

    This study examined the predictive validity of in-clerkship evaluation methods for identifying medical students who have insufficient knowledge. Study subjects were 124 third-year medical students at the Uniformed Services University. Insufficient knowledge was defined by: (1) a clerkship 'pre-test' score one standard deviation below the mean or lower; (2) any teacher verbally rating a student's general knowledge as 'marginal' or less; or (3) failure to pass Step One of the United States Medical Licensing Examination (USMLE). We determined sensitivity and specificity using a standard score of 90%. Using USMLE Step One pass-fail performance did not improve sensitivity. Combining a 'pre-test' and instructors' formal evaluation session comments improves the early identification of students with insufficient knowledge, allowing for formative feedback and remediation during the clerkship.
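    For reference, the sensitivity and specificity figures discussed above reduce to simple ratios over a 2x2 table; the sketch below shows the calculation on invented counts.

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Invented counts: students flagged by pre-test or teacher comments,
# cross-tabulated against the insufficient-knowledge criterion.
sens, spec = sensitivity_specificity(tp=14, fn=4, tn=98, fp=8)
print(f"sensitivity: {sens:.2f}, specificity: {spec:.2f}")
```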

  20. [Internal audit in medical laboratory: what means of control for an effective audit process?].

    PubMed

    Garcia-Hejl, Carine; Chianéa, Denis; Dedome, Emmanuel; Sanmartin, Nancy; Bugier, Sarah; Linard, Cyril; Foissaud, Vincent; Vest, Philippe

    2013-01-01

    To prepare for the French Accreditation Committee (COFRAC) visit for the initial certification of our medical laboratory, our management evaluated its quality management system (QMS) and all of its technical activities. This evaluation was performed by means of an internal audit. The audit was outsourced; the auditors had expertise in auditing and a thorough knowledge of the applicable laboratory standards, and were independent. Several nonconformities were identified at that time, including a lack of control of several steps of the internal audit process. Hence, the necessary corrective actions were taken in order to meet the requirements of the standards, in particular the formalization of all stages, from the audit program to the implementation, review and follow-up of the corrective actions taken, as well as the implementation of the resources needed to carry out audits within a pre-established time frame. To ensure optimal control of each step, the main concepts of risk management were applied: the process approach, root-cause analysis, and failure modes, effects and criticality analysis (FMECA). After a critical analysis of our practices, this methodology allowed us to define our "internal audit" process, then to formalize it and follow it up, supported by a complete documentation system.

  1. Fabrication of Long Period Gratings by Periodically Removing the Coating of Cladding-Etched Single Mode Optical Fiber Towards Optical Fiber Sensor Development.

    PubMed

    Ascorbe, Joaquin; Corres, Jesus M; Del Villar, Ignacio; Matias, Ignacio R

    2018-06-07

    Here, we present a novel method to fabricate long period gratings using standard single mode optical fibers (SMF). These optical devices were fabricated in a three-step process, which consisted of etching the SMF, coating it with a thin film and, as the final step, periodically removing sections of the coating by laser ablation. Tin dioxide was chosen as the material for this study, and it was sputtered using a pulsed DC sputtering system. Theoretical simulations were performed in order to select the appropriate parameters for the experiments. The responses of two different devices to different external refractive indices were studied, and the maximum sensitivity obtained was 6430 nm/RIU for external refractive indices ranging from 1.37 to 1.39.

  2. The future is in the numbers: the power of predictive analysis in the biomedical educational environment.

    PubMed

    Gullo, Charles A

    2016-01-01

    Biomedical programs have a potential treasure trove of data they can mine to assist admissions committees in identifying students who are likely to do well, and to help educational committees identify students who are likely to do poorly on standardized national exams and who may need remediation. In this article, we provide a step-by-step approach that schools can utilize to generate data that are useful when predicting the future performance of current students in any given program. We discuss the use of linear regression analysis as the means of generating those data and highlight some of the limitations. Finally, we lament that these institution-specific data sets are not being fully utilized at the national level, where they could greatly assist programs at large.
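    A minimal sketch of the kind of linear-regression workflow described above is given below using scikit-learn on synthetic cohort data; the predictor names, coefficients and score scale are hypothetical, not taken from the article.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 200  # past students with known outcomes

# Hypothetical admissions and early-program predictors.
gpa = rng.normal(3.4, 0.3, n)
admission_test = rng.normal(505, 8, n)       # e.g., an MCAT-like admission score
year1_exam_avg = rng.normal(78, 7, n)
X = np.column_stack([gpa, admission_test, year1_exam_avg])

# Hypothetical standardized exam outcome driven by those predictors plus noise.
exam_score = 60 + 20 * (gpa - 3.4) + 1.2 * (admission_test - 505) \
             + 1.5 * (year1_exam_avg - 78) + rng.normal(0, 8, n)

model = LinearRegression().fit(X, exam_score)
print("R^2 on historical cohort:", round(model.score(X, exam_score), 2))

# Predict for a current student; a low prediction can flag a candidate for remediation.
current = np.array([[3.1, 500.0, 70.0]])
print("predicted exam score:", round(float(model.predict(current)[0]), 1))
```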

  3. Thermal neutral format based on the step technology

    NASA Technical Reports Server (NTRS)

    Almazan, P. Planas; Legal, J. L.

    1995-01-01

    The exchange of models is one of the most serious problems currently encountered in the practice of spacecraft thermal analysis. Essentially, the problem originates in the diversity of computing environments that are used across different sites, and the consequent proliferation of native tool formats. Furthermore, increasing pressure to reduce the development life-cycle time has generated growing interest in so-called spacecraft concurrent engineering. In this context, the realization of the interdependencies between different disciplines and the proper communication between them become critical issues. The use of a neutral format represents a step forward in addressing these problems. Such a means of communication is adopted by consensus. A neutral format is not directly tied to any specific tool and it is kept under stringent change control. Currently, most of the groups promoting exchange formats are contributing their experience to STEP, the Standard for Exchange of Product Model Data, which is being developed under the auspices of the International Standards Organization (ISO 10303). This paper presents the different efforts made in Europe to provide the spacecraft thermal analysis community with a Thermal Neutral Format (TNF) based on STEP. Following an introduction with some background information, the paper presents the characteristics of the STEP standard. Later, the first efforts to produce a STEP Spacecraft Thermal Application Protocol are described. Finally, the paper presents the currently harmonized European activities that follow up and extend earlier work in the area.

  4. Comparison of 10 single and stepped methods to identify frail older persons in primary care: diagnostic and prognostic accuracy.

    PubMed

    Sutorius, Fleur L; Hoogendijk, Emiel O; Prins, Bernard A H; van Hout, Hein P J

    2016-08-03

    Many instruments have been developed to identify frail older adults in primary care. A direct comparison of the accuracy and prevalence of identification methods is rare, and most studies ignore the stepped selection typically employed in routine care practice. It is also unclear whether the various methods select persons with different characteristics. We aimed to estimate the accuracy of 10 single and stepped methods to identify frailty in older adults and to predict adverse health outcomes. In addition, the methods were compared on the prevalence of frail persons identified and on the characteristics of the persons identified. The Groningen Frailty Indicator (GFI), the PRISMA-7, polypharmacy, the clinical judgment of the general practitioner (GP), the self-rated health of the older adult, the Edmonton Frail Scale (EFS), the Identification Seniors At Risk Primary Care (ISAR PC), the Frailty Index (FI), the InterRAI screener and gait speed were compared against three measures: two reference standards (the clinical judgment of a multidisciplinary expert panel and Fried's frailty criteria) and 6-year mortality or long-term care admission. Data were used from the Dutch Identification of Frail Elderly Study, consisting of 102 people aged 65 and over from a primary care practice in Amsterdam. Frail older adults were oversampled. The accuracy of each instrument and of several stepped strategies was estimated by calculating the area under the ROC curve. Prevalence rates of frailty ranged from 14.8 to 52.9%. The accuracy for recommended cut-off values ranged from poor (AUC = 0.556, ISAR-PC) to good (AUC = 0.865, gait speed). The PRISMA-7 performed best against the two reference standards; the GP's clinical judgment best predicted adverse outcomes. Stepped strategies resulted in lower prevalence rates and lower accuracy. Persons selected by the different instruments varied greatly in age, IADL dependency, receipt of home care and mood. We found large differences between methods to identify frail persons in prevalence, accuracy and the characteristics of the persons they select. A necessary next step is to find out which frail persons can benefit from intervention before case-finding programs are implemented. Further evidence is needed to guide this emerging clinical field.
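    As a small illustration of how such instruments can be compared, the sketch below computes ROC AUCs for two invented screening scores against a binary reference standard using scikit-learn; the data and effect sizes are invented and do not reflect the study's results.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 102
frail = rng.integers(0, 2, n)                      # reference standard (e.g., expert panel)

# Invented instrument scores: one informative, one nearly uninformative.
gait_time = 5 + 3 * frail + rng.normal(0, 1.5, n)  # slower gait (higher time) when frail
weak_score = rng.normal(3, 1, n) + 0.2 * frail     # only a weak frailty signal

print("AUC gait-speed-like score:", round(roc_auc_score(frail, gait_time), 3))
print("AUC weak screening score: ", round(roc_auc_score(frail, weak_score), 3))
```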

  5. Peak Detection Method Evaluation for Ion Mobility Spectrometry by Using Machine Learning Approaches

    PubMed Central

    Hauschild, Anne-Christin; Kopczynski, Dominik; D’Addario, Marianna; Baumbach, Jörg Ingo; Rahmann, Sven; Baumbach, Jan

    2013-01-01

    Ion mobility spectrometry with pre-separation by multi-capillary columns (MCC/IMS) has become an established inexpensive, non-invasive bioanalytics technology for detecting volatile organic compounds (VOCs) with various metabolomics applications in medical research. To pave the way for this technology towards daily usage in medical practice, different steps still have to be taken. With respect to modern biomarker research, one of the most important tasks is the automatic classification of patient-specific data sets into different groups, healthy or not, for instance. Although sophisticated machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert (manual) and compare the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods and systematically study their classification performance based on the four peak detectors’ results. Second, we investigate the classification variance and robustness regarding perturbation and overfitting. Our main finding is that the power of the classification accuracy is almost equally good for all methods, the manually created gold standard as well as the four automatic peak finding methods. In addition, we note that all tools, manual and automatic, are similarly robust against perturbations. However, the classification performance is more robust against overfitting when using the PME as peak calling preprocessor. In summary, we conclude that all methods, though small differences exist, are largely reliable and enable a wide spectrum of real-world biomedical applications. PMID:24957992
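    Of the four approaches evaluated, local maxima search is the simplest to sketch; the code below applies SciPy's find_peaks to a synthetic one-dimensional trace with height and prominence thresholds. It is a generic illustration, not the IPHEx, VisualNow or PME implementations.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic one-dimensional intensity trace with two peaks plus noise,
# standing in for a slice of an MCC/IMS measurement.
rng = np.random.default_rng(3)
t = np.linspace(0, 10, 1000)
signal = (2.0 * np.exp(-(t - 3.0) ** 2 / 0.05)
          + 1.2 * np.exp(-(t - 7.0) ** 2 / 0.08)
          + rng.normal(0, 0.05, t.size))

# Local maxima search: require a minimum height and prominence to suppress noise.
peaks, props = find_peaks(signal, height=0.5, prominence=0.3)
for idx, height in zip(peaks, props["peak_heights"]):
    print(f"peak at t={t[idx]:.2f}, height={height:.2f}")
```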

  6. Peak detection method evaluation for ion mobility spectrometry by using machine learning approaches.

    PubMed

    Hauschild, Anne-Christin; Kopczynski, Dominik; D'Addario, Marianna; Baumbach, Jörg Ingo; Rahmann, Sven; Baumbach, Jan

    2013-04-16

    Ion mobility spectrometry with pre-separation by multi-capillary columns (MCC/IMS) has become an established inexpensive, non-invasive bioanalytics technology for detecting volatile organic compounds (VOCs) with various metabolomics applications in medical research. To pave the way for this technology towards daily usage in medical practice, different steps still have to be taken. With respect to modern biomarker research, one of the most important tasks is the automatic classification of patient-specific data sets into different groups, healthy or not, for instance. Although sophisticated machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert (manual) and compare the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods and systematically study their classification performance based on the four peak detectors' results. Second, we investigate the classification variance and robustness regarding perturbation and overfitting. Our main finding is that the power of the classification accuracy is almost equally good for all methods, the manually created gold standard as well as the four automatic peak finding methods. In addition, we note that all tools, manual and automatic, are similarly robust against perturbations. However, the classification performance is more robust against overfitting when using the PME as peak calling preprocessor. In summary, we conclude that all methods, though small differences exist, are largely reliable and enable a wide spectrum of real-world biomedical applications.

  7. On the Preparation and Testing of Fuel Cell Catalysts Using the Thin Film Rotating Disk Electrode Method.

    PubMed

    Inaba, Masanori; Quinson, Jonathan; Bucher, Jan Rudolf; Arenz, Matthias

    2018-03-16

    We present a step-by-step tutorial to prepare proton exchange membrane fuel cell (PEMFC) catalysts, consisting of Pt nanoparticles (NPs) supported on a high surface area carbon, and to test their performance in thin film rotating disk electrode (TF-RDE) measurements. The TF-RDE methodology is widely used for catalyst screening; nevertheless, the measured performance sometimes considerably differs among research groups. These uncertainties impede the advancement of new catalyst materials and, consequently, several authors discussed possible best practice methods and the importance of benchmarking. The visual tutorial highlights possible pitfalls in the TF-RDE testing of Pt/C catalysts. A synthesis and testing protocol to assess standard Pt/C catalysts is introduced that can be used together with polycrystalline Pt disks as benchmark catalysts. In particular, this study highlights how the properties of the catalyst film on the glassy carbon (GC) electrode influence the measured performance in TF-RDE testing. To obtain thin, homogeneous catalyst films, not only the catalyst preparation, but also the ink deposition and drying procedures are essential. It is demonstrated that an adjustment of the ink's pH might be necessary, and how simple control measurements can be used to check film quality. Once reproducible TF-RDE measurements are obtained, determining the Pt loading on the catalyst support (expressed as Pt wt%) and the electrochemical surface area is necessary to normalize the determined reaction rates to either surface area or Pt mass. For the surface area determination, so-called CO stripping, or the determination of the hydrogen underpotential deposition (Hupd) charge, are standard. For the determination of the Pt loading, a straightforward and cheap procedure using digestion in aqua regia with subsequent conversion of Pt(IV) to Pt(II) and UV-vis measurements is introduced.
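    The normalization step described above is simple arithmetic once the CO-stripping charge and the Pt loading are known; the sketch below uses the commonly cited 420 µC per cm² of Pt for a CO monolayer (an assumption taken from general electrochemistry practice, not a value quoted from this protocol) and invented measurement values.

```python
# Normalization of a kinetic current from TF-RDE testing to Pt mass and to the
# electrochemically active surface area (ECSA). The 420 uC/cm^2_Pt charge for a
# CO-stripping monolayer is a commonly used literature value, assumed here.
Q_CO = 1.05e-3          # C, integrated CO-stripping charge (invented)
CO_CHARGE = 420e-6      # C per cm^2 of Pt surface (assumed conversion factor)
pt_loading = 10e-6      # g of Pt on the glassy carbon tip (invented)
i_kinetic = 2.4e-3      # A, kinetic ORR current at the chosen potential (invented)

ecsa_cm2 = Q_CO / CO_CHARGE                       # cm^2 of Pt surface
ecsa_m2_per_g = ecsa_cm2 * 1e-4 / pt_loading      # m^2 per g of Pt

mass_activity = i_kinetic / pt_loading            # A per g of Pt
specific_activity = i_kinetic / ecsa_cm2          # A per cm^2 of Pt

print(f"ECSA: {ecsa_m2_per_g:.1f} m2/g_Pt")
print(f"mass activity: {mass_activity:.1f} A/g_Pt")
print(f"specific activity: {specific_activity * 1e6:.0f} uA/cm2_Pt")
```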

  8. On the Preparation and Testing of Fuel Cell Catalysts Using the Thin Film Rotating Disk Electrode Method

    PubMed Central

    Inaba, Masanori; Quinson, Jonathan; Bucher, Jan Rudolf; Arenz, Matthias

    2018-01-01

    We present a step-by-step tutorial to prepare proton exchange membrane fuel cell (PEMFC) catalysts, consisting of Pt nanoparticles (NPs) supported on a high surface area carbon, and to test their performance in thin film rotating disk electrode (TF-RDE) measurements. The TF-RDE methodology is widely used for catalyst screening; nevertheless, the measured performance sometimes considerably differs among research groups. These uncertainties impede the advancement of new catalyst materials and, consequently, several authors discussed possible best practice methods and the importance of benchmarking. The visual tutorial highlights possible pitfalls in the TF-RDE testing of Pt/C catalysts. A synthesis and testing protocol to assess standard Pt/C catalysts is introduced that can be used together with polycrystalline Pt disks as benchmark catalysts. In particular, this study highlights how the properties of the catalyst film on the glassy carbon (GC) electrode influence the measured performance in TF-RDE testing. To obtain thin, homogeneous catalyst films, not only the catalyst preparation, but also the ink deposition and drying procedures are essential. It is demonstrated that an adjustment of the ink's pH might be necessary, and how simple control measurements can be used to check film quality. Once reproducible TF-RDE measurements are obtained, determining the Pt loading on the catalyst support (expressed as Pt wt%) and the electrochemical surface area is necessary to normalize the determined reaction rates to either surface area or Pt mass. For the surface area determination, so-called CO stripping, or the determination of the hydrogen underpotential deposition (Hupd) charge, are standard. For the determination of the Pt loading, a straightforward and cheap procedure using digestion in aqua regia with subsequent conversion of Pt(IV) to Pt(II) and UV-vis measurements is introduced. PMID:29608166

  9. Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conover, David R.; Crawford, Aladsair J.; Fuller, Jason C.

    The Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage Systems (PNNL-22010) was first issued in November 2012 as a first step toward providing a foundational basis for developing an initial standard for the uniform measurement and expression of energy storage system (ESS) performance. Based on experiences with the application and use of that document, and to include additional ESS applications and associated duty cycles, test procedures and performance metrics, a first revision of the November 2012 Protocol was issued in June 2014 (PNNL 22010 Rev. 1). As an update of the 2014 revision 1 to the Protocol, this document (the March 2016 revision 2 to the Protocol) is intended to supersede the June 2014 revision 1 to the Protocol and provide a more user-friendly yet more robust and comprehensive basis for measuring and expressing ESS performance.

  10. Predictors of a Top Performer During Emergency Medicine Residency.

    PubMed

    Bhat, Rahul; Takenaka, Katrin; Levine, Brian; Goyal, Nikhil; Garg, Manish; Visconti, Annette; Oyama, Leslie; Castillo, Edward; Broder, Joshua; Omron, Rodney; Hayden, Stephen

    2015-10-01

    Emergency Medicine (EM) residency program directors and faculty spend significant time and effort creating a residency rank list. To date, however, there have been few studies to assist program directors in determining which pre-residency variables best predict performance during EM residency. To evaluate which pre-residency variables best correlated with an applicant's performance during residency. This was a retrospective multicenter sample of all residents in the three most recent graduating classes from nine participating EM residency programs. The outcome measure of top residency performance was defined as placement in the top third of a resident's graduating class based on performance on the final semi-annual evaluation. A total of 277 residents from nine institutions were evaluated. Eight of the predictors analyzed had a significant correlation with the outcome of resident performance. Applicants' grade during home and away EM rotations, designation as Alpha Omega Alpha (AOA), U.S. Medical Licensing Examination (USMLE) Step 1 score, interview scores, "global rating" and "competitiveness" on nonprogram leadership standardized letter of recommendation (SLOR), and having five or more publications or presentations showed a significant association with residency performance. We identified several predictors of top performers in EM residency: an honors grade for an EM rotation, USMLE Step 1 score, AOA designation, interview score, high SLOR rankings from nonprogram leadership, and completion of five or more presentations and publications. EM program directors may consider utilizing these variables during the match process to choose applicants who have the highest chance of top performance during residency. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Traceable calibration and demonstration of a portable dynamic force transfer standard

    NASA Astrophysics Data System (ADS)

    Vlajic, Nicholas; Chijioke, Ako

    2017-08-01

    In general, the dynamic sensitivity of a force transducer depends upon the mechanical system in which it is used. This dependence serves as motivation to develop a dynamic force transfer standard, which can be used to calibrate an application transducer in situ. In this work, we SI-traceably calibrate a hand-held force transducer, namely an impact hammer, by using a mass suspended from a thin line which is cut to produce a known dynamic force in the form of a step function. We show that this instrument is a promising candidate as a transfer standard, since its dynamic response has small variance between different users. This calibrated transfer standard is then used to calibrate a secondary force transducer in an example application setting. The combined standard uncertainty (k  =  2) in the calibration of the transfer standard was determined to be 2.1% or less, up to a bandwidth of 5 kHz. The combined standard uncertainty (k  =  2) in the performed transfer calibration was less than 4%, up to 3 kHz. An advantage of the transfer calibration framework presented here, is that the transfer standard can be used to transfer SI-traceable calibrations without the use of any SI-traceable voltage metrology instrumentation.
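    The principle behind the step calibration can be sketched in a few lines: cutting the line removes a known static load m·g, so the amplitude of the resulting step in the transducer output yields a sensitivity estimate. The simulation below uses invented numbers and ignores the dynamics and the uncertainty analysis treated in the paper.

```python
import numpy as np

# Simulated transducer voltage record: the suspended mass is released (line cut)
# halfway through, producing a known force step of m*g.
m, g = 2.000, 9.80665            # kg, m/s^2 -> known step amplitude m*g in newtons
true_sensitivity = 0.50          # V/N, the "unknown" value we try to recover

rng = np.random.default_rng(5)
n = 10000
signal = np.zeros(n)
signal[: n // 2] = true_sensitivity * m * g      # loaded
signal[n // 2:] = 0.0                            # after the line is cut
signal += rng.normal(0, 0.002, n)                # measurement noise

# Estimate the step amplitude from pre- and post-event means (away from the edge).
v_before = signal[: n // 2 - 100].mean()
v_after = signal[n // 2 + 100:].mean()
estimated_sensitivity = (v_before - v_after) / (m * g)
print(f"estimated sensitivity: {estimated_sensitivity:.4f} V/N")
```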

  12. Global phenomena from local rules: Peer-to-peer networks and crystal steps

    NASA Astrophysics Data System (ADS)

    Finkbiner, Amy

    Even simple, deterministic rules can generate interesting behavior in dynamical systems. This dissertation examines some real-world systems for which fairly simple, locally defined rules yield useful or interesting properties in the system as a whole. In particular, we study routing in peer-to-peer networks and the motion of crystal steps. Peers can vary by three orders of magnitude in their capacities to process network traffic. This heterogeneity inspires our use of "proportionate load balancing," where each peer provides resources in proportion to its individual capacity. We provide an implementation that employs small, local adjustments to bring the entire network into a global balance. Analytically and through simulations, we demonstrate the effectiveness of proportionate load balancing on two routing methods for de Bruijn graphs, introducing a new "reversed" routing method which performs better than standard forward routing in some cases. The prevalence of peer-to-peer applications prompts companies to locate the hosts participating in these networks. We explore the use of supervised machine learning to identify peer-to-peer hosts without using application-specific information. We introduce a model for "triples," which exploits information about nearly contemporaneous flows to give a statistical picture of a host's activities. We find that triples, together with measurements of inbound vs. outbound traffic, can capture most of the behavior of peer-to-peer hosts. An understanding of crystal surface evolution is important for the development of modern nanoscale electronic devices. The most commonly studied surface features are steps, which form at low temperatures when the crystal is cut close to a plane of symmetry. Step bunching, when steps arrange into widely separated clusters of tightly packed steps, is one important step phenomenon. We analyze a discrete model for crystal steps, in which the motion of each step depends on the two steps on either side of it. We find a time-dependence term in the motion that does not appear in continuum models, and we determine an explicit dependence on step number.
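    For context, the sketch below implements plain forward routing on a de Bruijn graph, where each hop shifts in the next symbol of the destination label; it illustrates the routing setting only and is not the dissertation's reversed routing or proportionate load-balancing scheme.

```python
def forward_route(src: str, dst: str) -> list[str]:
    """Forward routing on a de Bruijn graph whose nodes are fixed-length strings:
    each hop drops the leading symbol and appends the next symbol of dst.
    (Real implementations shorten the route when a suffix of src already
    matches a prefix of dst; this sketch always walks the full label length.)"""
    assert len(src) == len(dst)
    path, node = [src], src
    for symbol in dst:
        node = node[1:] + symbol   # one de Bruijn edge
        path.append(node)
    return path

# Example on the binary de Bruijn graph with 3-symbol node labels (8 nodes).
print(forward_route("000", "111"))   # ['000', '001', '011', '111']
```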

  13. Streamlining and Standardizing Due Diligence to Ensure Quality of PV Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, Sarah

    Those investing in PV power plants would like to have confidence that the plants will provide the anticipated return on investment. While due diligence is capably performed by independent engineers today, as PV systems mature, there will be benefit in standardization and streamlining of this process. The IECRE has defined technical information that is needed as a basis for each transaction step, such as approving a design to begin construction, documenting readiness to operate, quantifying performance after a year of operation, and assessing the health of the plant in preparation for sale of the plant. The technical requirements have been defined by IEC Technical Committee 82 and have been designed to be both effective and efficient in completing the assessments. This workshop will describe these new tools that are now available to the community and will include a panel/audience discussion about how and when they can be most effectively used.

  14. WAIS-III index score profiles in the Canadian standardization sample.

    PubMed

    Lange, Rael T

    2007-01-01

    Representative index score profiles were examined in the Canadian standardization sample of the Wechsler Adult Intelligence Scale-Third Edition (WAIS-III). The identification of profile patterns was based on the methodology proposed by Lange, Iverson, Senior, and Chelune (2002) that aims to maximize the influence of profile shape and minimize the influence of profile magnitude on the cluster solution. A two-step cluster analysis procedure was used (i.e., hierarchical and k-means analyses). Cluster analysis of the four index scores (i.e., Verbal Comprehension [VCI], Perceptual Organization [POI], Working Memory [WMI], Processing Speed [PSI]) identified six profiles in this sample. Profiles were differentiated by pattern of performance and were primarily characterized as (a) high VCI/POI, low WMI/PSI, (b) low VCI/POI, high WMI/PSI, (c) high PSI, (d) low PSI, (e) high VCI/WMI, low POI/PSI, and (f) low VCI, high POI. These profiles are potentially useful for determining whether a patient's WAIS-III performance is unusual in a normal population.
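    One common way to let profile shape rather than magnitude drive a cluster solution is to center each respondent's profile on its own mean before clustering; the sketch below applies that idea with a hierarchical step followed by k-means on invented index scores. It follows the general two-step strategy described above, not the exact Lange et al. procedure.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans

rng = np.random.default_rng(11)
n = 300
# Invented WAIS-III index scores (VCI, POI, WMI, PSI), mean 100, SD 15.
profiles = rng.normal(100, 15, size=(n, 4))

# Emphasize shape over magnitude: subtract each person's own profile mean.
shape = profiles - profiles.mean(axis=1, keepdims=True)

# Step 1: hierarchical clustering (Ward) to suggest an initial grouping.
tree = linkage(shape, method="ward")
initial = fcluster(tree, t=6, criterion="maxclust")

# Step 2: k-means refinement, seeded with the hierarchical cluster centroids.
seeds = np.vstack([shape[initial == k].mean(axis=0) for k in range(1, 7)])
kmeans = KMeans(n_clusters=6, init=seeds, n_init=1, random_state=0).fit(shape)
print(np.bincount(kmeans.labels_))   # cluster sizes
```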

  15. [Technical points of laparoscopic splenic hilar lymph node dissection--The original intention of CLASS-04 research design].

    PubMed

    Huang, Changming; Lin, Mi

    2018-02-25

    According to Japanese gastric cancer treatment guidelines, the standard operation for locally advanced upper third gastric cancer is the total gastrectomy with D2 lymphadenectomy, which includes the dissection of the splenic hilar lymph nodes. With the development of minimally invasive ideas and surgical techniques, laparoscopic spleen-preserving splenic hilar lymph node dissection is gradually being accepted. The procedure is technically demanding and should be carried out by surgeons with extensive experience in open surgery and advanced laparoscopic skills. Thorough familiarity with the anatomy of the splenic hilum is the basis for choosing a reasonable surgical approach and a standardized operating procedure. A favorable left-sided approach is used to perform the laparoscopic spleen-preserving splenic hilar lymph node dissection in the Department of Gastric Surgery, Fujian Medical University Union Hospital. This means that the membrane of the pancreas is separated at the superior border of the pancreatic tail in order to reach the posterior pancreatic space, revealing the end of the splenic vessels' trunk. The short gastric vessels are severed at their roots. This enables complete removal of the splenic hilar lymph nodes and stomach. At the same time, based on the rich clinical practice of laparoscopic gastric cancer surgery, we have summarized an effective operating procedure called Huang's three-step maneuver. The first step is the dissection of the lymph nodes in the inferior pole region of the spleen. The second step is the dissection of the lymph nodes in the region of the splenic artery trunk. The third step is the dissection of the lymph nodes in the superior pole region of the spleen. This maneuver simplifies the procedure, reduces its difficulty, improves its efficiency, and ensures its safety. To further explore the safety of laparoscopic spleen-preserving splenic hilar lymph node dissection for locally advanced upper third gastric cancer, in 2016, we launched a multicenter phase II trial of the safety and feasibility of laparoscopic spleen-preserving No.10 lymph node dissection for locally advanced upper third gastric cancer (CLASS-04). Through this multicenter prospective study, we aim to provide a scientific theoretical basis and clinical experience for the promotion and application of the operation, and also to standardize and popularize the laparoscopic spleen-preserving splenic hilar lymph node dissection to promote its development. At present, the enrollment of the study has been completed, and the preliminary results suggest that laparoscopic spleen-preserving No.10 lymph node dissection for locally advanced upper third gastric cancer is safe and feasible. We believe that with the improvement of the standardized operative training system, the progress of laparoscopic technology and the promotion of Huang's three-step maneuver, laparoscopic spleen-preserving splenic hilar lymph node dissection will become one of the standard treatments for locally advanced upper third gastric cancer.

  16. Impact of HIPAA’s Minimum Necessary Standard on Genomic Data Sharing

    PubMed Central

    Evans, Barbara J.; Jarvik, Gail P.

    2017-01-01

    Purpose: This article provides a brief introduction to the HIPAA Privacy Rule’s minimum necessary standard, which applies to sharing of genomic data, particularly clinical data, following 2013 Privacy Rule revisions. Methods: This research used the Thomson Reuters Westlaw™ database and law library resources in its legal analysis of the HIPAA privacy tiers and the impact of the minimum necessary standard on genomic data-sharing. We considered relevant example cases of genomic data-sharing needs. Results: In a climate of stepped-up HIPAA enforcement, this standard is of concern to laboratories that generate, use, and share genomic information. How data-sharing activities are characterized (whether for research, public health, or clinical interpretation and medical practice support) affects how the minimum necessary standard applies and its overall impact on data access and use. Conclusion: There is no clear regulatory guidance on how to apply HIPAA’s minimum necessary standard when considering the sharing of information in the data-rich environment of genomic testing. Laboratories that perform genomic testing should engage with policy-makers to foster sound, well-informed policies and appropriate characterization of data-sharing activities to minimize adverse impacts on day-to-day workflows. PMID:28914268

  17. Impact of HIPAA's minimum necessary standard on genomic data sharing.

    PubMed

    Evans, Barbara J; Jarvik, Gail P

    2018-04-01

    This article provides a brief introduction to the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy Rule's minimum necessary standard, which applies to sharing of genomic data, particularly clinical data, following 2013 Privacy Rule revisions. This research used the Thomson Reuters Westlaw database and law library resources in its legal analysis of the HIPAA privacy tiers and the impact of the minimum necessary standard on genomic data sharing. We considered relevant example cases of genomic data-sharing needs. In a climate of stepped-up HIPAA enforcement, this standard is of concern to laboratories that generate, use, and share genomic information. How data-sharing activities are characterized (whether for research, public health, or clinical interpretation and medical practice support) affects how the minimum necessary standard applies and its overall impact on data access and use. There is no clear regulatory guidance on how to apply HIPAA's minimum necessary standard when considering the sharing of information in the data-rich environment of genomic testing. Laboratories that perform genomic testing should engage with policy makers to foster sound, well-informed policies and appropriate characterization of data-sharing activities to minimize adverse impacts on day-to-day workflows.

  18. Does the use of automated fetal biometry improve clinical work flow efficiency?

    PubMed

    Espinoza, Jimmy; Good, Sara; Russell, Evie; Lee, Wesley

    2013-05-01

    This study was designed to compare the work flow efficiency of manual measurements of 5 fetal parameters with a novel technique that automatically measures these parameters from 2-dimensional sonograms. This prospective study included 200 singleton pregnancies between 15 and 40 weeks' gestation. Patients were randomly allocated to either manual (n = 100) or automatic (n = 100) fetal biometry. The automatic measurement was performed using a commercially available software application. A digital video recorder captured all on-screen activity associated with the sonographic examination. The examination time and number of steps required to obtain fetal measurements were compared between manual and automatic methods. The mean time required to obtain the biometric measurements was significantly shorter using the automated technique than the manual approach (P < .001 for all comparisons). Similarly, the mean number of steps required to perform these measurements was significantly fewer with automatic measurements compared to the manual technique (P < .001). In summary, automated biometry reduced the examination time required for standard fetal measurements. This approach may improve work flow efficiency in busy obstetric sonography practices.

  19. A practical laboratory study simulating the percutaneous lumbar transforaminal epidural injection: training model in fresh cadaveric sheep spine.

    PubMed

    Suslu, Husnu

    2012-01-01

    Laboratory training models are essential for developing and refining treatment skills before the clinical application of surgical and invasive procedures. A simple simulation model is needed for young trainees to learn how to handle instruments and to perform safe lumbar transforaminal epidural injections. Our aim is to present a model of a fresh cadaveric sheep lumbar spine that simulates the lumbar transforaminal epidural injection. The material consists of a 2-year-old fresh cadaveric sheep spine. A 4-step approach was designed for lumbar transforaminal epidural injection under C-arm fluoroscopy. For the lumbar transforaminal epidural injection, the fluoroscope was adjusted to get a proper oblique view while the material was stabilized in a prone position. The procedure then begins under C-arm fluoroscopic guidance. The model closely simulates the steps of standard lumbar transforaminal epidural injections in the human spine. The cadaveric sheep spine represents a good method for training, as it simulates fluoroscopic lumbar transforaminal epidural steroid injection procedures performed in the human spine.

  20. Simultaneous determination of azathioprine and 6-mercaptopurine by high-performance liquid chromatography.

    PubMed

    Van Os, E C; McKinney, J A; Zins, B J; Mays, D C; Schriver, Z H; Sandborn, W J; Lipsky, J J

    1996-04-26

    A specific, sensitive, single-step solid-phase extraction and reversed-phase high-performance liquid chromatographic method for the simultaneous determination of plasma 6-mercaptopurine and azathioprine concentrations is reported. Following solid-phase extraction, analytes are separated on a C18 column with mobile phase consisting of 0.8% acetonitrile in 1 mM triethylamine, pH 3.2, run on a gradient system. Quantitation limits were 5 ng/ml and 2 ng/ml for azathioprine and 6-mercaptopurine, respectively. Peak heights correlated linearly to known extracted standards for 6-mercaptopurine and azathioprine (r = 0.999) over a range of 2-200 ng/ml. No chromatographic interferences were detected.

  1. High-performance liquid chromatography purification of homogenous-length RNA produced by trans cleavage with a hammerhead ribozyme.

    PubMed Central

    Shields, T P; Mollova, E; Ste Marie, L; Hansen, M R; Pardi, A

    1999-01-01

    An improved method is presented for the preparation of milligram quantities of homogenous-length RNAs suitable for nuclear magnetic resonance or X-ray crystallographic structural studies. Heterogeneous-length RNA transcripts are processed with a hammerhead ribozyme to yield homogenous-length products that are then readily purified by anion exchange high-performance liquid chromatography. This procedure eliminates the need for denaturing polyacrylamide gel electrophoresis, which is the most laborious step in the standard procedure for large-scale production of RNA by in vitro transcription. The hammerhead processing of the heterogeneous-length RNA transcripts also substantially improves the overall yield and purity of the desired RNA product. PMID:10496226

  2. Simple and sensitive method for quantification of fluorescent enzymatic mature and senescent crosslinks of collagen in bone hydrolysate using single-column high performance liquid chromatography.

    PubMed

    Viguet-Carrin, S; Gineyts, E; Bertholon, C; Delmas, P D

    2009-01-01

    A rapid high-performance liquid chromatographic method, including an internal standard, was developed for the measurement of mature and senescent crosslink concentrations in non-demineralized bone hydrolysates. To avoid demineralization, which is a tedious step, we developed a method based on a solid-phase extraction procedure to clean up the samples. This resulted in sensitive and accurate measurements, with detection limits as low as 0.2 pmol for the pyridinium crosslinks and 0.02 pmol for pentosidine. The inter- and intra-assay coefficients of variation were as low as 5% and 2%, respectively, for all crosslinks.

  3. One Small Step for the Gram Stain, One Giant Leap for Clinical Microbiology

    PubMed Central

    2016-01-01

    The Gram stain is one of the most commonly performed tests in the clinical microbiology laboratory, yet it is poorly controlled and lacks standardization. It was once the best rapid test in microbiology, but it is no longer trusted by many clinicians. The publication by Samuel et al. (J. Clin. Microbiol. 54:1442–1447, 2016, http://dx.doi.org/10.1128/JCM.03066-15) is a start for those who want to evaluate and improve Gram stain performance. In an age of emerging rapid molecular results, is the Gram stain still relevant? How should clinical microbiologists respond to the call to reduce Gram stain error rates? PMID:27008876

  4. Old and New Techniques as a Safe Hybrid Approach for Carotid Tandem Lesions.

    PubMed

    Barillà, David; Massara, Mafalda; Alberti, Antonino; Volpe, Alberto; Cutrupi, Andrea; Versace, Paolo; Volpe, Pietro

    2016-04-01

    Carotid revascularization is performed to prevent stroke. Carotid tandem lesions represent a challenge for treatment, and a hybrid approach may prove effective. A high-risk 65-year-old woman presented with a "tandem lesion" of the left common and internal carotid artery. She was deemed unfit for "simple" standard carotid endarterectomy (CEA). A "single-step" safe hybrid procedure was scheduled for the patient. A "Cormier" carotid vein graft bypass with retrograde stenting was performed under local anesthesia. The "safe hybrid procedure" for tandem lesions of the common and internal carotid artery is effective and suitable in high-risk patients in high-volume centers. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. The fast multipole method and point dipole moment polarizable force fields.

    PubMed

    Coles, Jonathan P; Masella, Michel

    2015-01-14

    We present an implementation of the fast multipole method for computing Coulombic electrostatic and polarization forces from polarizable force-fields based on induced point dipole moments. We demonstrate the expected O(N) scaling of that approach by performing single energy point calculations on hexamer protein subunits of the mature HIV-1 capsid. We also show the long time energy conservation in molecular dynamics at the nanosecond scale by performing simulations of a protein complex embedded in a coarse-grained solvent using a standard integrator and a multiple time step integrator. Our tests show the applicability of fast multipole method combined with state-of-the-art chemical models in molecular dynamical systems.

  6. Association of physical performance measures with bone mineral density in postmenopausal women.

    PubMed

    Lindsey, Carleen; Brownbill, Rhonda A; Bohannon, Richard A; Ilich, Jasminka Z

    2005-06-01

    To investigate the association between physical performance measures and bone mineral density (BMD) in older women. Cross-sectional analysis. University research laboratory. Healthy postmenopausal women (N=116; mean age +/- standard deviation, 68.3+/-6.8y) in self-reported good health who were not taking medications known to affect bone, including hormone replacement therapy. Not applicable. Anthropometrics and BMD of the hip, spine, whole body, and forearm. Physical performance measures included normal and brisk 8-m gait speed, normal step length (NSL), brisk step length (BSL), timed 1-leg stance (OLS), timed sit-to-stand (STS), and grip strength. NSL, BSL, normal gait speed, brisk gait speed, OLS, and grip strength correlated significantly with several skeletal sites (r range, .19-.38; P<.05). In multiple regression models containing body mass index, hours of total activity, total calcium intake, and age of menarche, NSL, BSL, normal and brisk gait speeds, OLS, and grip strength were all significantly associated with BMD of various skeletal sites (adjusted R2 range, .11-.24; P<.05). Analysis of covariance showed that subjects with longer step lengths and faster normal and brisk gait speeds had higher BMD at the whole body, hip, and spine (brisk speed only). Those with a longer OLS had greater femoral neck BMD, and those with a stronger grip strength had greater BMD in the whole body and forearm (P<.05). STS was not related to any skeletal site. Normal and brisk gait speed, NSL, BSL, OLS, and grip strength are all associated with BMD at the whole body, hip, spine, and forearm. Physical performance evaluation may help with osteoporosis prevention and treatment programs for postmenopausal women when bone density scores have not been obtained or are unavailable.

  7. Magnetic ionic liquids as non-conventional extraction solvents for the determination of polycyclic aromatic hydrocarbons.

    PubMed

    Trujillo-Rodríguez, María J; Nacham, Omprakash; Clark, Kevin D; Pino, Verónica; Anderson, Jared L; Ayala, Juan H; Afonso, Ana M

    2016-08-31

    This work describes the applicability of magnetic ionic liquids (MILs) in the analytical determination of a group of heavy polycyclic aromatic hydrocarbons. Three different MILs, namely, benzyltrioctylammonium bromotrichloroferrate (III) (MIL A), methoxybenzyltrioctylammonium bromotrichloroferrate (III) (MIL B), and 1,12-di(3-benzylbenzimidazolium) dodecane bis[(trifluoromethyl)sulfonyl)]imide bromotrichloroferrate (III) (MIL C), were designed to exhibit hydrophobic properties, and their performance was examined in a microextraction method for hydrophobic analytes. The magnet-assisted approach with these MILs was performed in combination with high performance liquid chromatography and fluorescence detection. The study of the extraction performance showed that MIL A was the most suitable solvent for the extraction of polycyclic aromatic hydrocarbons; under optimum conditions, the fast extraction step required ∼20 μL of MIL A for 10 mL of aqueous sample, 24 mmol L(-1) NaOH, high ionic strength content of NaCl (25% (w/v)), 500 μL of acetone as dispersive solvent, and 5 min of vortexing. The desorption step required the aid of an external magnetic field with a strong NdFeB magnet (the separation takes a few seconds), two back-extraction steps for polycyclic aromatic hydrocarbons retained in the MIL droplet with n-hexane, evaporation and reconstitution with acetonitrile. The overall method presented limits of detection down to 5 ng L(-1), relative recoveries ranging from 91.5 to 119%, and inter-day reproducibility values (expressed as relative standard deviation) lower than 16.4% for a spiked level of 0.4 μg L(-1) (n = 9). The method was also applied for the analysis of real samples, including tap water, wastewater, and tea infusion. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Evaluation of cleaning and disinfection performance of automatic washer disinfectors machines in programs presenting different cycle times and temperatures.

    PubMed

    Bergo, Maria do Carmo Noronha Cominato

    2006-01-01

    Thermal washer-disinfectors represent a technology that brought about great advantages, such as the establishment of protocols and standard operating procedures and a reduction in occupational risks of a biological and environmental nature. The efficacy of the cleaning and disinfection obtained by automatic washer-disinfector machines running programs with different times and temperatures, as determined by the different official agencies, was validated according to recommendations from ISO Standard 15883-1/1999 and HTM2030 (NHS Estates, 1997) for determining the Minimum Lethality and DAL, both theoretically and through measurements with thermocouples. To determine the cleaning efficacy, the Soil Test, Biotrace Pro-tect and the Protein Test Kit were used. The CFU count of viable microorganisms was verified before and after the thermal disinfection. This article shows that the results are in compliance with the ISO and HTM standards. The validation steps confirmed the high efficacy level of the medical washer-disinfectors. This protocol enabled an evidence-based evaluation of the procedure, aiming to provide the Supply Center's multi-professional personnel with information and the possibility of developing further research.

  9. Step-by-step strategy in the management of residual hepatolithiasis using post-operative cholangioscopy

    PubMed Central

    Wen, Xu-dong; Wang, Tao; Huang, Zhu; Zhang, Hong-jian; Zhang, Bing-yin; Tang, Li-jun; Liu, Wei-hui

    2017-01-01

    Hepatolithiasis is the presence of calculi within the intrahepatic bile duct specifically located proximal to the confluence of the left and right hepatic ducts. The ultimate goal of hepatolithiasis treatment is the complete removal of the stone, the correction of the associated strictures and the prevention of recurrent cholangitis. Although hepatectomy could effectively achieve the above goals, it can be restricted by the risk of insufficient residual liver volume, and has a 15.6% rate of residual hepatolithiasis. With improvements in minimally invasive surgery, post-operative cholangioscopy (POC) provides an additional option for hepatolithiasis treatment with a higher clearance rate and fewer severe complications. POC is very safe and can be performed repeatedly until full patient benefit is achieved. During POC three main steps are accomplished: first, the analysis of the residual hepatolithiasis distribution, indirectly by imaging methods or directly by endoscopic observation; second, the establishment of the surgical pathway to relieve the strictures; and third, the removal of the stone by a combination of different techniques such as simple basket extraction, mechanical fragmentation, electrohydraulic lithotripsy or laser lithotripsy, among others. In summary, a step-by-step strategy of POC should be put forward to standardize the procedures, especially when dealing with complicated residual hepatolithiasis. This review briefly summarizes the classification, management and complications of hepatolithiasis during the POC process. PMID:29147136

  10. Plantarflexion moment is a contributor to step length after-effect following walking on a split-belt treadmill in individuals with stroke and healthy individuals.

    PubMed

    Lauzière, Séléna; Miéville, Carole; Betschart, Martina; Duclos, Cyril; Aissaoui, Rachid; Nadeau, Sylvie

    2014-10-01

    To assess plantarflexion moment and hip joint moment after-effects following walking on a split-belt treadmill in healthy individuals and individuals post-stroke. Cross-sectional study. Ten healthy individuals (mean age 57.6 years (standard deviation; SD 17.2)) and twenty individuals post-stroke (mean age 49.3 years (SD 13.2)). Participants walked on an instrumented split-belt treadmill under 3 gait periods: i) baseline (tied-belt); ii) adaptation (split-belt); and iii) post-adaptation (tied-belt). Participants post-stroke performed the protocol with the paretic and nonparetic leg on the faster belt when belts were split. Kinematic data were recorded with the Optotrak system and ground reaction forces were collected via the instrumented split-belt treadmill. In both groups, the fast plantarflexion moment was reduced and the slow plantarflexion moment was increased from mid-stance to toe-off in the post-adaptation period. Significant relationships were found between the plantarflexion moment and contralateral step length. Split-belt treadmills could be useful for restoring step length symmetry in individuals post-stroke who present with a longer paretic step length because the use of this type of intervention increases paretic plantarflexion moments. This intervention might be less recommended for individuals post-stroke with a shorter paretic step length because it reduces the paretic plantarflexion moment.

  11. Centrifugal step emulsification applied for absolute quantification of nucleic acids by digital droplet RPA.

    PubMed

    Schuler, Friedrich; Schwemmer, Frank; Trotter, Martin; Wadle, Simon; Zengerle, Roland; von Stetten, Felix; Paust, Nils

    2015-07-07

    Aqueous microdroplets provide miniaturized reaction compartments for numerous chemical, biochemical or pharmaceutical applications. We introduce centrifugal step emulsification for the fast and easy production of monodisperse droplets. Homogeneous droplets with pre-selectable diameters in a range from 120 μm to 170 μm were generated with coefficients of variation of 2-4% and zero run-in time or dead volume. The droplet diameter depends on the nozzle geometry (depth, width, and step size) and interfacial tensions only. Droplet size is demonstrated to be independent of the dispersed phase flow rate between 0.01 and 1 μl s(-1), proving the robustness of the centrifugal approach. Centrifugal step emulsification can easily be combined with existing centrifugal microfluidic unit operations, is compatible with scalable manufacturing technologies such as thermoforming or injection moulding and enables fast emulsification (>500 droplets per second per nozzle) with minimal handling effort (2-3 pipetting steps). The centrifugal microfluidic droplet generation was used to perform the first digital droplet recombinase polymerase amplification (ddRPA). It was used for absolute quantification of Listeria monocytogenes DNA concentration standards with a total analysis time below 30 min. Compared to digital droplet polymerase chain reaction (ddPCR), with processing times of about 2 hours, the overall processing time of digital analysis was reduced by more than a factor of 4.
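
    The record does not spell out the statistics behind the absolute quantification, but digital droplet assays are conventionally quantified from the fraction of positive droplets via a Poisson correction; a minimal sketch under that assumption follows (the droplet volume and counts are invented).

        # Poisson-based absolute quantification for a digital droplet assay.
        # The fraction of positive droplets p gives the mean number of target
        # copies per droplet as lambda = -ln(1 - p); dividing by the droplet
        # volume yields the concentration. Counts and droplet volume below
        # are illustrative assumptions only.
        import math

        def copies_per_microliter(n_positive: int, n_total: int, droplet_volume_nl: float) -> float:
            p = n_positive / n_total
            lam = -math.log(1.0 - p)           # mean copies per droplet
            volume_ul = droplet_volume_nl * 1e-3
            return lam / volume_ul

        # Example: 300 positive droplets out of 1000, ~2.1 nL per droplet
        # (roughly the volume of a 160 um diameter droplet).
        print(round(copies_per_microliter(300, 1000, 2.1), 1))   # -> ~170 copies/uL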

  12. Brain computer interfaces, a review.

    PubMed

    Nicolas-Alonso, Luis Fernando; Gomez-Gil, Jaime

    2012-01-01

    A brain-computer interface (BCI) is a hardware and software communications system that permits cerebral activity alone to control computers or external devices. The immediate goal of BCI research is to provide communications capabilities to severely disabled people who are totally paralyzed or 'locked in' by neuromuscular disorders, such as amyotrophic lateral sclerosis, brain stem stroke, or spinal cord injury. Here, we review the state of the art of BCIs, looking at the different steps that form a standard BCI: signal acquisition, preprocessing or signal enhancement, feature extraction, classification and the control interface. We discuss their advantages, drawbacks, and latest advances, and we survey the numerous technologies reported in the scientific literature to design each step of a BCI. First, the review examines the neuroimaging modalities used in the signal acquisition step, each of which monitors a different functional brain activity such as electrical, magnetic or metabolic activity. Second, the review discusses different electrophysiological control signals that determine user intentions, which can be detected in brain activity. Third, the review includes some techniques used in the signal enhancement step to deal with the artifacts in the control signals and improve the performance. Fourth, the review studies some mathematical algorithms used in the feature extraction and classification steps, which translate the information in the control signals into commands that operate a computer or other device. Finally, the review provides an overview of various BCI applications that control a range of devices.
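
    As a concrete, hedged illustration of the feature extraction and classification steps listed in this review, the sketch below builds a toy pipeline from band-power features and linear discriminant analysis; the synthetic EEG, frequency band and classifier choice are assumptions, not a recommendation from the review.

        # Toy BCI pipeline: band-power features (Welch PSD) + linear
        # discriminant analysis. Synthetic EEG and all parameters are
        # illustrative assumptions, not a specific BCI system.
        import numpy as np
        from scipy.signal import welch
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        fs = 250                                            # sampling rate (Hz)
        rng = np.random.default_rng(1)
        trials = rng.standard_normal((120, 8, 2 * fs))      # 120 trials, 8 channels, 2 s
        labels = rng.integers(0, 2, size=120)               # two hypothetical classes

        def band_power(trial, lo=8.0, hi=30.0):
            """Mean PSD in [lo, hi] Hz for each channel of one trial."""
            f, pxx = welch(trial, fs=fs, nperseg=fs, axis=-1)
            band = (f >= lo) & (f <= hi)
            return pxx[:, band].mean(axis=-1)

        features = np.array([band_power(t) for t in trials])   # shape (120, 8)
        clf = LinearDiscriminantAnalysis().fit(features[:100], labels[:100])
        print("held-out accuracy:", clf.score(features[100:], labels[100:]))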

  13. Anosognosia, neglect, extinction and lesion site predict impairment of daily living after right-hemispheric stroke.

    PubMed

    Vossel, Simone; Weiss, Peter H; Eschenbeck, Philipp; Fink, Gereon R

    2013-01-01

    Right-hemispheric stroke can give rise to manifold neuropsychological deficits, in particular, impairments of spatial perception which are often accompanied by reduced self-awareness of these deficits (anosognosia). To date, the specific contribution of these deficits to a patient's difficulties in daily life activities remains to be elucidated. In 55 patients with right-hemispheric stroke we investigated the predictive value of different neglect-related symptoms, visual extinction and anosognosia for the performance of standardized activities of daily living (ADL). The additional impact of lesion location was examined using voxel-based lesion-symptom mapping. Step-wise linear regression revealed that anosognosia for visuospatial deficits was the most important predictor for performance in standardized ADL. In addition, motor-intentional and perceptual-attentional neglect, extinction and cancellation task performance significantly predicted ADL performance. Lesions comprising the right frontal and cingulate cortex and adjacent white matter explained additional variance in the performance of standardized ADL, in that damage to these areas was related to lower performance than predicted by the regression model only. Our data show a decisive role of anosognosia for visuospatial deficits for impaired ADL and therefore outcome/disability after stroke. The findings further demonstrate that the severity of neglect and extinction also predicts ADL performance. Our results thus strongly suggest that right-hemispheric stroke patients should not only be routinely assessed for neglect and extinction but also for anosognosia to initiate appropriate rehabilitative treatment. The observation that right frontal lesions explain additional variance in ADL most likely reflects that dysfunction of the supervisory system also significantly impacts upon rehabilitation. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Surgical anatomy of the supracarinal esophagus based on a minimally invasive approach: vascular and nervous anatomy and technical steps to resection and lymphadenectomy.

    PubMed

    Cuesta, Miguel A; van der Wielen, Nicole; Weijs, Teus J; Bleys, Ronald L A W; Gisbertz, Suzanne S; van Duijvendijk, Peter; van Hillegersberg, Richard; Ruurda, Jelle P; van Berge Henegouwen, Mark I; Straatman, Jennifer; Osugi, Harushi; van der Peet, Donald L

    2017-04-01

    During esophageal dissection and lymphadenectomy of the upper mediastinum by thoracoscopy in prone position, we observed a complex anatomy in which we had to resect the esophagus, dissect vessels and nerves, and take down some of these in order to perform a complete lymphadenectomy. In order to improve the quality of the dissection and standardization of the procedure, we describe the surgical anatomy and steps involved in this procedure. We retrospectively evaluated twenty consecutive and unedited videos of thoracoscopic esophageal resections. We recorded the vascular anatomy of the supracarinal esophagus, lymph node stations and the steps taken in this procedure. The resulting concept was validated in a prospective study including five patients. Seventy percent of patients in the retrospective study had one right bronchial artery (RBA) and two left bronchial arteries (LBA). The RBA was divided at both sides of the esophagus in 18 patients, with preservation of one LBA or at least one esophageal branch in all cases. Both recurrent laryngeal nerves were identified in 18 patients. All patients in the prospective study had one RBA and two LBA, and in four patients the RBA was divided at both sides of the esophagus and preserved one of the LBA. Lymphadenectomy was performed of stations 4R, 4L, 2R and 2L, with a median of 11 resected lymph nodes. Both recurrent laryngeal nerves were identified in four patients. In three patients, only the left recurrent nerve could be identified. Two patients showed palsy of the left recurrent laryngeal nerve, and one showed neuropraxia of the left vocal cord. Knowledge of the surgical anatomy of the upper mediastinum and its anatomical variations is important for standardization of an adequate esophageal resection and paratracheal lymphadenectomy with preservation of any vascularization of the trachea, bronchi and the recurrent laryngeal nerves.

  15. The A3 Problem Solving Report: A 10-Step Scientific Method to Execute Performance Improvements in an Academic Research Vivarium

    PubMed Central

    Bassuk, James A.; Washington, Ida M.

    2013-01-01

    The purpose of this study was to illustrate the application of A3 Problem Solving Reports of the Toyota Production System to our research vivarium through the methodology of Continuous Performance Improvement, a lean approach to healthcare management at Seattle Children's (Hospital, Research Institute, Foundation). The Report format is described within the perspective of a 10-step scientific method designed to realize measurable improvements of Issues identified by the Report's Author, Sponsor and Coach. The 10-step method (Issue, Background, Current Condition, Goal, Root Cause, Target Condition, Countermeasures, Implementation Plan, Test, and Follow-up) was shown to align with Shewhart's Plan-Do-Check-Act process improvement cycle in a manner that allowed for quantitative analysis of the Countermeasure's outcomes and of Testing results. During fiscal year 2012, 9 A3 Problem Solving Reports were completed in the vivarium under the teaching and coaching system implemented by the Research Institute. Two of the 9 reports are described herein. Report #1 addressed the issue of the vivarium's veterinarian not being able to provide input into sick animal cases during the work day, while report #7 tackled the lack of a standard in keeping track of weekend/holiday animal health inspections. In each Report, a measurable Goal that established the basis for improvement recognition was present. A Five Whys analysis identified the Root Cause for Report #1 as historical work patterns that existed before the veterinarian was hired on and that modern electronic communication tools had not been implemented. The same analysis identified the Root Cause for Report #7 as the vivarium had never standardized the process for weekend/holiday checks. Successful outcomes for both Reports were obtained and validated by robust audit plans. The collective data indicate that vivarium staff acquired a disciplined way of reporting on, as well as solving, problems in a manner consistent with high level A3 Thinking. PMID:24204681

  16. The a3 problem solving report: a 10-step scientific method to execute performance improvements in an academic research vivarium.

    PubMed

    Bassuk, James A; Washington, Ida M

    2013-01-01

    The purpose of this study was to illustrate the application of A3 Problem Solving Reports of the Toyota Production System to our research vivarium through the methodology of Continuous Performance Improvement, a lean approach to healthcare management at Seattle Children's (Hospital, Research Institute, Foundation). The Report format is described within the perspective of a 10-step scientific method designed to realize measurable improvements of Issues identified by the Report's Author, Sponsor and Coach. The 10-step method (Issue, Background, Current Condition, Goal, Root Cause, Target Condition, Countermeasures, Implementation Plan, Test, and Follow-up) was shown to align with Shewhart's Plan-Do-Check-Act process improvement cycle in a manner that allowed for quantitative analysis of the Countermeasure's outcomes and of Testing results. During fiscal year 2012, 9 A3 Problem Solving Reports were completed in the vivarium under the teaching and coaching system implemented by the Research Institute. Two of the 9 reports are described herein. Report #1 addressed the issue of the vivarium's veterinarian not being able to provide input into sick animal cases during the work day, while report #7 tackled the lack of a standard in keeping track of weekend/holiday animal health inspections. In each Report, a measurable Goal that established the basis for improvement recognition was present. A Five Whys analysis identified the Root Cause for Report #1 as historical work patterns that existed before the veterinarian was hired on and that modern electronic communication tools had not been implemented. The same analysis identified the Root Cause for Report #7 as the vivarium had never standardized the process for weekend/holiday checks. Successful outcomes for both Reports were obtained and validated by robust audit plans. The collective data indicate that vivarium staff acquired a disciplined way of reporting on, as well as solving, problems in a manner consistent with high level A3 Thinking.

  17. Short communication: measures of weight distribution and frequency of steps as indicators of restless behavior.

    PubMed

    Chapinal, N; de Passillé, A M; Rushen, J; Tucker, C B

    2011-02-01

    Restless behavior, as measured by the steps taken or weight shifting between legs, may be a useful tool to assess the comfort of dairy cattle. These behaviors increase when cows stand on uncomfortable surfaces or are lame. The objective of this study was to compare 2 measures of restless behavior, stepping behavior and changes in weight distribution, on 2 standing surfaces: concrete and rubber. Twelve cows stood on a weighing platform with 1 scale/hoof for 1h. The platform was covered with either concrete or rubber, presented in a crossover design. Restlessness, as measured by both the frequency of steps and weight shifting (measured as the standard deviation of weight applied over time to the legs), increased over 1h of forced standing on either concrete or rubber. A positive relationship was found between the frequency of steps and the standard deviation of weight over 1h for both treatments and pairs of legs (r ≥ 0.66). No differences existed in the standard deviation of weight applied to the front (27.6 ± 1.6 kg) or rear legs (33.5 ± 1.4 kg) or the frequency of steps (10.2 ± 1.6 and 20.8 ± 3.2 steps/10 min for the front and rear pair, respectively) between rubber and concrete. Measures of restlessness are promising tools for assessing specific types of discomfort, such as those associated with lameness, but additional tools are needed to assess comfort of non-concrete standing surfaces. Copyright © 2011 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  18. A New Quaternion-Based Kalman Filter for Real-Time Attitude Estimation Using the Two-Step Geometrically-Intuitive Correction Algorithm.

    PubMed

    Feng, Kaiqiang; Li, Jie; Zhang, Xiaoming; Shen, Chong; Bi, Yu; Zheng, Tao; Liu, Jun

    2017-09-19

    In order to reduce the computational complexity, and improve the pitch/roll estimation accuracy of the low-cost attitude heading reference system (AHRS) under conditions of magnetic-distortion, a novel linear Kalman filter, suitable for nonlinear attitude estimation, is proposed in this paper. The new algorithm is the combination of two-step geometrically-intuitive correction (TGIC) and the Kalman filter. In the proposed algorithm, the sequential two-step geometrically-intuitive correction scheme is used to make the current estimation of pitch/roll immune to magnetic distortion. Meanwhile, the TGIC produces a computed quaternion input for the Kalman filter, which avoids the linearization error of measurement equations and reduces the computational complexity. Several experiments have been carried out to validate the performance of the filter design. The results demonstrate that the mean time consumption and the root mean square error (RMSE) of pitch/roll estimation under magnetic disturbances are reduced by 45.9% and 33.8%, respectively, when compared with a standard filter. In addition, the proposed filter is applicable for attitude estimation under various dynamic conditions.
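
    The TGIC correction itself is specific to this paper, but the linear Kalman predict/update cycle it feeds a computed quaternion into is standard; a minimal generic sketch of that cycle is shown below, with placeholder state, measurement and noise matrices rather than the paper's filter design.

        # Standard linear Kalman filter predict/update step. The models and
        # noise levels are generic placeholders, not the paper's design.
        import numpy as np

        def kalman_step(x, P, z, F, H, Q, R):
            """One predict + update step of a linear Kalman filter."""
            # Predict
            x_pred = F @ x
            P_pred = F @ P @ F.T + Q
            # Update
            S = H @ P_pred @ H.T + R                    # innovation covariance
            K = P_pred @ H.T @ np.linalg.inv(S)         # Kalman gain
            x_new = x_pred + K @ (z - H @ x_pred)
            P_new = (np.eye(len(x)) - K @ H) @ P_pred
            return x_new, P_new

        # Example: a 4-element state (e.g., a quaternion treated as a linear
        # state), identity dynamics and direct measurement of the state.
        n = 4
        x, P = np.zeros(n), np.eye(n)
        F, H = np.eye(n), np.eye(n)
        Q, R = 1e-4 * np.eye(n), 1e-2 * np.eye(n)
        z = np.array([1.0, 0.0, 0.0, 0.0])              # "measured" quaternion
        x, P = kalman_step(x, P, z, F, H, Q, R)
        print(x)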

  19. A New Quaternion-Based Kalman Filter for Real-Time Attitude Estimation Using the Two-Step Geometrically-Intuitive Correction Algorithm

    PubMed Central

    Feng, Kaiqiang; Li, Jie; Zhang, Xiaoming; Shen, Chong; Bi, Yu; Zheng, Tao; Liu, Jun

    2017-01-01

    In order to reduce the computational complexity, and improve the pitch/roll estimation accuracy of the low-cost attitude heading reference system (AHRS) under conditions of magnetic-distortion, a novel linear Kalman filter, suitable for nonlinear attitude estimation, is proposed in this paper. The new algorithm is the combination of two-step geometrically-intuitive correction (TGIC) and the Kalman filter. In the proposed algorithm, the sequential two-step geometrically-intuitive correction scheme is used to make the current estimation of pitch/roll immune to magnetic distortion. Meanwhile, the TGIC produces a computed quaternion input for the Kalman filter, which avoids the linearization error of measurement equations and reduces the computational complexity. Several experiments have been carried out to validate the performance of the filter design. The results demonstrate that the mean time consumption and the root mean square error (RMSE) of pitch/roll estimation under magnetic disturbances are reduced by 45.9% and 33.8%, respectively, when compared with a standard filter. In addition, the proposed filter is applicable for attitude estimation under various dynamic conditions. PMID:28925979

  20. Media Fill Test for validation of autologous leukocytes separation and labelling by (99m)Tc-HmPAO.

    PubMed

    Urbano, Nicoletta; Modoni, Sergio; Schillaci, Orazio

    2013-01-01

    Manufacturing of sterile products must be carried out in a manner that minimizes the risk of microbiological contamination. White blood cells (WBC) labelled with (99m)Tc-exametazime ((99m)Tc-hexamethylpropyleneamine oxime; (99m)Tc-HMPAO) have been successfully applied in the field of infection/inflammation scintigraphy for many years. In our radiopharmacy lab, separation and labelling of autologous leukocytes with (99m)Tc-HMPAO were performed in an unclassified laminar flow cabinet placed in a controlled area, whereas the (99m)Tc-HMPAO radiolabelling procedure was carried out in a hot cell with manipulator gloves. This study was conducted to validate this process using a Media Fill simulation test. The study was performed using sterile Tryptic Soy Broth (TSB) in place of the active product, reproducing as closely as possible the routine aseptic production process with all the critical steps, as described in our internal standard operating procedures (SOPs). The final vials containing the media of each processed step were then incubated for 14 days and examined for evidence of microbial growth. No evidence of turbidity was observed in any of the steps assayed by the Media Fill. In the separation and labelling of autologous leukocytes with (99m)Tc-HmPAO, the Media Fill test represents a reliable tool for validating the aseptic process. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. In vitro and in vivo testing of a novel recessed-step catheter for reflux-free convection-enhanced drug delivery to the brain.

    PubMed

    Gill, T; Barua, N U; Woolley, M; Bienemann, A S; Johnson, D E; S O'Sullivan; Murray, G; Fennelly, C; Lewis, O; Irving, C; Wyatt, M J; Moore, P; Gill, S S

    2013-09-30

    The optimisation of convection-enhanced drug delivery (CED) to the brain is fundamentally reliant on minimising drug reflux. The aim of this study was to evaluate the performance of a novel reflux-resistant CED catheter incorporating a recessed step and to compare its performance to previously described stepped catheters. The in vitro performance of the recessed-step catheter was compared to a conventional "one-step" catheter with a single transition in outer diameter (OD) at the catheter tip, and a "two-step" design comprising two distal transitions in OD. The volumes of distribution and reflux were compared by performing infusions of Trypan blue into agarose gels. The in vivo performance of the recessed-step catheter was then analysed in a large animal model by performing infusions of 0.2% Gadolinium-DTPA in Large White/Landrace pigs. The recessed-step catheter demonstrated significantly higher volumes of distribution than the one-step and two-step catheters (p=0.0001, one-way ANOVA). No reflux was detected until more than 100 ul had been delivered via the recessed-step catheter, whilst reflux was detected after infusion of only 25 ul via the 2 non-recessed catheters. The recessed-step design also showed superior reflux resistance to a conventional one-step catheter in vivo. Reflux-free infusions were achieved in the thalamus, putamen and white matter at a maximum infusion rate of 5 ul/min using the recessed-step design. The novel recessed-step catheter described in this study shows significant potential for the achievement of predictable high-volume, high-flow-rate infusions whilst minimising the risk of reflux. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Sinusoidal Siemens star spatial frequency response measurement errors due to misidentified target centers

    DOE PAGES

    Birch, Gabriel Carisle; Griffin, John Clark

    2015-07-23

    Numerous methods are available to measure the spatial frequency response (SFR) of an optical system. A recent change to the ISO 12233 photography resolution standard includes a sinusoidal Siemens star test target. We take the sinusoidal Siemens star proposed by the ISO 12233 standard, measure system SFR, and perform an analysis of errors induced by incorrectly identifying the center of a test target. We show a closed-form solution for the radial profile intensity measurement given an incorrectly determined center and describe how this error reduces the measured SFR of the system. As a result, using the closed-form solution, we propose a two-step process by which test target centers are corrected and the measured SFR is restored to the nominal, correctly centered values.
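
    A minimal sketch of the underlying measurement is given below: sample a sinusoidal Siemens star along a circle about an assumed center and estimate the modulation by least squares, so that a deliberately offset center visibly lowers the result; the star layout, radius and offset are assumptions, not the ISO 12233 procedure or the authors' closed-form analysis.

        # Measure the modulation of a sinusoidal Siemens star along a circle
        # of radius r about an assumed center (cx, cy). An error in (cx, cy)
        # blurs the sampled sinusoid and lowers the fitted modulation, which
        # is the effect the record analyzes. Parameters are illustrative.
        import numpy as np

        cycles = 36                                   # spokes (cycles per revolution)
        size = 512
        yy, xx = np.mgrid[0:size, 0:size]
        theta_img = np.arctan2(yy - size / 2, xx - size / 2)
        star = 0.5 + 0.5 * np.cos(cycles * theta_img)     # synthetic sinusoidal star

        def modulation(image, cx, cy, r, n_samples=720):
            """Fit a*cos + b*sin + c around a circle; return modulation depth."""
            theta = np.linspace(0, 2 * np.pi, n_samples, endpoint=False)
            ix = np.clip(np.round(cx + r * np.cos(theta)).astype(int), 0, size - 1)
            iy = np.clip(np.round(cy + r * np.sin(theta)).astype(int), 0, size - 1)
            samples = image[iy, ix]
            design = np.column_stack([np.cos(cycles * theta),
                                      np.sin(cycles * theta),
                                      np.ones_like(theta)])
            a, b, c = np.linalg.lstsq(design, samples, rcond=None)[0]
            return np.hypot(a, b) / c

        print("true center:   ", modulation(star, 256, 256, r=150))
        print("center off 3px:", modulation(star, 259, 256, r=150))   # lower modulation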

  3. The First Israeli Hydro-Electric Pumped Storage Power Plant Gilboa PSPP

    NASA Astrophysics Data System (ADS)

    Maruzewski, P., Dr.; Sautereau, T.; Sapir, Y.; Barak, H.; Hénard, F.; Blaix, J.-C.

    2016-11-01

    The Israeli Public Utilities Authority, PUA, decided to increase the instantaneous power available on the grid by adding Pumped Storage Power Plants, PSPP, to the existing generation capacity. PSP Investments Ltd. is a private investor that decided to develop the Gilboa PSPP. Its capacity is 300 MWe. The project's performance has to comply with the PUA regulation for PSPPs and with all relevant Israeli laws and IECo standards. This paper presents an overview of the Gilboa PSPP through short summaries of the units' components, from the design step to the manufacturing processes.

  4. Standardized Testing Capabilities at U.S. Navy Transducer Repair Facilities

    DTIC Science & Technology

    1972-08-01

    The following acoustical and electrical tests can be performed at the TRFs by the use of an AN/FQM-10(V) Sonar Test Set, a. Low level impedance...It fits into the subsystem as shown in Fig 2-2. A switch located at the preamplifier is used to select either high or low power operation. The...may also be selected. 2.6.2 Signal Amplification. A low noise differential preamplifier with selectable gains from -20 dB to +60 dB in 10 dB steps is

  5. The Crank Nicolson Time Integrator for EMPHASIS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGregor, Duncan Alisdair Odum; Love, Edward; Kramer, Richard Michael Jack

    2018-03-01

    We investigate the use of implicit time integrators for finite element time domain approximations of Maxwell's equations in vacuum. We discretize Maxwell's equations in time using Crank-Nicolson and in 3D space using compatible finite elements. We solve the system by taking a single step of Newton's method and inverting the Eddy-Current Schur complement, allowing for the use of standard preconditioning techniques. This approach also generalizes to more complex material models that can include the Unsplit PML. We present verification results and demonstrate performance at CFL numbers up to 1000.
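
    For orientation, the sketch below shows a single Crank-Nicolson step on a much simpler problem (the 1D heat equation) rather than the Maxwell system of this record; the grid, time step and boundary treatment are arbitrary choices for illustration.

        # One Crank-Nicolson time step for the 1D heat equation u_t = u_xx:
        # (I - dt/2 * A) u^{n+1} = (I + dt/2 * A) u^n, with A the standard
        # second-difference Laplacian. Grid size and time step are arbitrary.
        import numpy as np

        nx, dx, dt = 51, 1.0 / 50, 0.01
        x = np.linspace(0, 1, nx)
        u = np.sin(np.pi * x)                         # initial condition

        # Discrete Laplacian with homogeneous Dirichlet boundaries.
        A = (np.diag(-2.0 * np.ones(nx)) +
             np.diag(np.ones(nx - 1), 1) +
             np.diag(np.ones(nx - 1), -1)) / dx**2
        A[0, :] = 0.0
        A[-1, :] = 0.0                                # keep boundary values fixed

        I = np.eye(nx)
        lhs = I - 0.5 * dt * A
        rhs = (I + 0.5 * dt * A) @ u
        u_next = np.linalg.solve(lhs, rhs)            # one Crank-Nicolson step
        print(u_next.max())                           # amplitude decays, as expected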

  6. Alternating Direction Implicit (ADI) schemes for a PDE-based image osmosis model

    NASA Astrophysics Data System (ADS)

    Calatroni, L.; Estatico, C.; Garibaldi, N.; Parisotto, S.

    2017-10-01

    We consider Alternating Direction Implicit (ADI) splitting schemes to efficiently compute the numerical solution of the PDE osmosis model considered by Weickert et al. in [10] for several imaging applications. The discretised scheme is shown to preserve analogous properties to the continuous model. The dimensional splitting strategy reduces numerically to the solution of simple tridiagonal systems, for which standard matrix factorisation techniques can be used to improve upon the performance of classical implicit methods, even for large time steps. Applications to the shadow removal problem are presented.
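
    The tridiagonal systems produced by each directional half-step can be solved in O(n) by forward elimination and back substitution (the Thomas algorithm); the sketch below is a generic solver of that kind and is not tied to the osmosis discretisation of the record.

        # Generic Thomas algorithm: O(n) forward-elimination / back-substitution
        # solver for tridiagonal systems such as those arising from ADI half-steps.
        import numpy as np

        def thomas(lower, diag, upper, rhs):
            """Solve a tridiagonal system; lower[0] and upper[-1] are unused."""
            n = len(diag)
            c, d = np.empty(n), np.empty(n)
            c[0] = upper[0] / diag[0]
            d[0] = rhs[0] / diag[0]
            for i in range(1, n):
                denom = diag[i] - lower[i] * c[i - 1]
                c[i] = upper[i] / denom if i < n - 1 else 0.0
                d[i] = (rhs[i] - lower[i] * d[i - 1]) / denom
            x = np.empty(n)
            x[-1] = d[-1]
            for i in range(n - 2, -1, -1):
                x[i] = d[i] - c[i] * x[i + 1]
            return x

        # Quick check against a dense solve.
        n = 6
        lower = np.full(n, -1.0)
        diag = np.full(n, 4.0)
        upper = np.full(n, -1.0)
        rhs = np.arange(1.0, n + 1)
        A = np.diag(diag) + np.diag(upper[:-1], 1) + np.diag(lower[1:], -1)
        print(np.allclose(thomas(lower, diag, upper, rhs), np.linalg.solve(A, rhs)))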

  7. An Operator-Integration-Factor Splitting (OIFS) method for Incompressible Flows in Moving Domains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patel, Saumil S.; Fischer, Paul F.; Min, Misun

    In this paper, we present a characteristic-based numerical procedure for simulating incompressible flows in domains with moving boundaries. Our approach utilizes an operator-integration-factor splitting technique to help produce an efficient and stable numerical scheme. Using the spectral element method and an arbitrary Lagrangian-Eulerian formulation, we investigate flows where the convective acceleration effects are non-negligible. Several examples, ranging from laminar to turbulent flows, are considered. Comparisons with a standard, semi-implicit time-stepping procedure illustrate the improved performance of the scheme.

  8. The development of an imaging informatics-based multi-institutional platform to support sports performance and injury prevention in track and field

    NASA Astrophysics Data System (ADS)

    Liu, Joseph; Wang, Ximing; Verma, Sneha; McNitt-Gray, Jill; Liu, Brent

    2018-03-01

    The main goal of sports science and performance enhancement is to collect video and image data, process them, and quantify the results, giving insight to help athletes improve technique. For long jump in track and field, the processed output of video with force vector overlays and force calculations allow coaches to view specific stages of the hop, step, and jump, and identify how each stage can be improved to increase jump distance. Outputs also provide insight into how athletes can better maneuver to prevent injury. Currently, each data collection site collects and stores data with their own methods. There is no standard for data collection, formats, or storage. Video files and quantified results are stored in different formats, structures, and locations such as Dropbox and hard drives. Using imaging informatics-based principles we can develop a platform for multiple institutions that promotes the standardization of sports performance data. In addition, the system will provide user authentication and privacy as in clinical trials, with specific user access rights. Long jump data collected from different field sites will be standardized into specified formats before database storage. Quantified results from image-processing algorithms are stored similar to CAD algorithm results. The system will streamline the current sports performance data workflow and provide a user interface for athletes and coaches to view results of individual collections and also longitudinally across different collections. This streamlined platform and interface is a tool for coaches and athletes to easily access and review data to improve sports performance and prevent injury.

  9. Optimization of automated large-scale production of [(18)F]fluoroethylcholine for PET prostate cancer imaging.

    PubMed

    Pascali, Giancarlo; D'Antonio, Luca; Bovone, Paola; Gerundini, Paolo; August, Thorsten

    2009-07-01

    PET tumor imaging is gaining importance in current clinical practice. FDG-PET is the most utilized approach but is confounded by inflammation and is of limited use in prostate cancer detection. Recently, (11)C-choline analogues have been employed successfully in this field of imaging, leading to a growing interest in the utilization of (18)F-labeled analogues: [(18)F]fluoroethylcholine (FEC) has been demonstrated to be promising, especially in prostate cancer imaging. In this work we report an automatic radiosynthesis of this tracer with high yields, short synthesis time and ease of performance, potentially utilizable in routine production sites. We used a Modular Lab system to automatically perform the two-step/one-pot synthesis. In the first step, we labeled ethyleneglycolditosylate, obtaining [(18)F]fluoroethyltosylate; in the second step, we performed the coupling of the latter intermediate with neat dimethylethanolamine. The final mixture was purified by means of solid phase extraction; in particular, the product was trapped on a cation-exchange resin and eluted with isotonic saline. The optimized procedure resulted in a non-decay-corrected yield of 36% and produced a range of 30-45 GBq of product already in injectable form. The product was analyzed for quality control and was found to be pure and sterile; in addition, residual solvents were under the required threshold. In this work, we present an automatic FEC radiosynthesis that has been optimized for routine production. These findings should foster interest in a wider utilization of this radiomolecule for imaging of prostate cancer with PET, a field for which no gold-standard tracer has yet been validated.

  10. Simultaneous determination of PPCPs, EDCs, and artificial sweeteners in environmental water samples using a single-step SPE coupled with HPLC-MS/MS and isotope dilution.

    PubMed

    Tran, Ngoc Han; Hu, Jiangyong; Ong, Say Leong

    2013-09-15

    A high-throughput method for the simultaneous determination of 24 pharmaceuticals and personal care products (PPCPs), endocrine disrupting chemicals (EDCs) and artificial sweeteners (ASs) was developed. The method was based on a single-step solid phase extraction (SPE) coupled with high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) and isotope dilution. In this study, a single-step SPE procedure was optimized for simultaneous extraction of all target analytes. Good recoveries (≥ 70%) were observed for all target analytes when extraction was performed using Chromabond(®) HR-X (500 mg, 6 mL) cartridges under acidic conditions (pH 2). HPLC-MS/MS parameters were optimized for the simultaneous analysis of 24 PPCPs, EDCs and ASs in a single injection. Quantification was performed by using 13 isotopically labeled internal standards (ILIS), which efficiently corrects for analyte losses during the SPE procedure, matrix effects during HPLC-MS/MS, and fluctuations in MS/MS signal intensity due to the instrument. The method quantification limit (MQL) for most of the target analytes was below 10 ng/L in all water samples. The method was successfully applied for the simultaneous determination of PPCPs, EDCs and ASs in raw wastewater, surface water and groundwater samples collected in a local catchment area in Singapore. In conclusion, the developed method provided a valuable tool for investigating the occurrence, behavior, transport, and fate of PPCPs, EDCs and ASs in the aquatic environment. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Improving stability of prediction models based on correlated omics data by using network approaches.

    PubMed

    Tissier, Renaud; Houwing-Duistermaat, Jeanine; Rodríguez-Girondo, Mar

    2018-01-01

    Building prediction models based on complex omics datasets such as transcriptomics, proteomics, and metabolomics remains a challenge in bioinformatics and biostatistics. Regularized regression techniques are typically used to deal with the high dimensionality of these datasets. However, due to the presence of correlation in the datasets, it is difficult to select the best model, and application of these methods yields unstable results. We propose a novel strategy for model selection where the obtained models also perform well in terms of overall predictability. Several three-step approaches are considered, where the steps are 1) network construction, 2) clustering to empirically derive modules or pathways, and 3) building a prediction model incorporating the information on the modules. For the first step, we use weighted correlation networks and Gaussian graphical modelling. Identification of groups of features is performed by hierarchical clustering. The grouping information is included in the prediction model by using group-based variable selection or group-specific penalization. We compare the performance of our new approaches with standard regularized regression via simulations. Based on these results, we provide recommendations for selecting a strategy for building a prediction model given the specific goal of the analysis and the sizes of the datasets. Finally, we illustrate the advantages of our approach by application of the methodology to two problems, namely prediction of body mass index in the DIetary, Lifestyle, and Genetic determinants of Obesity and Metabolic syndrome study (DILGOM) and prediction of response of each breast cancer cell line to treatment with specific drugs using a breast cancer cell lines pharmacogenomics dataset.
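
    A simplified sketch of the three-step strategy is given below, using module means and ridge regression as a stand-in for the group-based variable selection or group-specific penalization the authors describe; the synthetic data and all parameter choices are assumptions.

        # Simplified three-step strategy: (1) correlation-based network,
        # (2) hierarchical clustering into modules, (3) prediction model on
        # module-level summaries. Module means + ridge regression stand in
        # for the group-based penalization of the record; data are synthetic.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform
        from sklearn.linear_model import Ridge

        rng = np.random.default_rng(2)
        X = rng.standard_normal((150, 60))            # 150 samples, 60 omics features
        y = X[:, :5].sum(axis=1) + rng.standard_normal(150)

        # Step 1: network from absolute pairwise correlations.
        corr = np.abs(np.corrcoef(X, rowvar=False))

        # Step 2: modules from hierarchical clustering of the dissimilarity 1 - |corr|.
        dist = 1.0 - corr
        np.fill_diagonal(dist, 0.0)
        modules = fcluster(linkage(squareform(dist, checks=False), method="average"),
                           t=8, criterion="maxclust")

        # Step 3: summarize each module by its mean and fit a penalized model.
        Z = np.column_stack([X[:, modules == m].mean(axis=1)
                             for m in np.unique(modules)])
        model = Ridge(alpha=1.0).fit(Z[:100], y[:100])
        print("held-out R^2:", round(model.score(Z[100:], y[100:]), 2))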

  12. When procedures discourage insight: epistemological consequences of prompting novice physics students to construct force diagrams

    NASA Astrophysics Data System (ADS)

    Kuo, Eric; Hallinen, Nicole R.; Conlin, Luke D.

    2017-05-01

    One aim of school science instruction is to help students become adaptive problem solvers. Though successful at structuring novice problem solving, step-by-step problem-solving frameworks may also constrain students' thinking. This study utilises a paradigm established by Heckler [(2010). Some consequences of prompting novice physics students to construct force diagrams. International Journal of Science Education, 32(14), 1829-1851] to test how cuing the first step in a standard framework affects undergraduate students' approaches and evaluation of solutions in physics problem solving. Specifically, prompting the construction of a standard diagram before problem solving increases the use of standard procedures, decreasing the use of a conceptual shortcut. Providing a diagram prompt also lowers students' ratings of informal approaches to similar problems. These results suggest that reminding students to follow typical problem-solving frameworks limits their views of what counts as good problem solving.

  13. Variables influencing wearable sensor outcome estimates in individuals with stroke and incomplete spinal cord injury: a pilot investigation validating two research grade sensors.

    PubMed

    Jayaraman, Chandrasekaran; Mummidisetty, Chaithanya Krishna; Mannix-Slobig, Alannah; McGee Koch, Lori; Jayaraman, Arun

    2018-03-13

    Monitoring physical activity and leveraging wearable sensor technologies to facilitate active living in individuals with neurological impairment has been shown to yield benefits in terms of health and quality of living. In this context, accurate measurement of physical activity estimates from these sensors is vital. However, wearable sensor manufacturers generally only provide standard proprietary algorithms based on healthy individuals to estimate physical activity metrics, which may lead to inaccurate estimates in populations with neurological impairments such as stroke and incomplete spinal cord injury (iSCI). The main objective of this cross-sectional investigation was to evaluate the validity of physical activity estimates provided by standard proprietary algorithms for individuals with stroke and iSCI. Two research grade wearable sensors used in clinical settings were chosen and the outcome metrics estimated using standard proprietary algorithms were validated against designated gold standard measures (Cosmed K4B2 for energy expenditure and metabolic equivalent and manual tallying for step counts). The influence of sensor location, sensor type and activity characteristics was also studied. 28 participants (Healthy (n = 10); incomplete SCI (n = 8); stroke (n = 10)) performed a spectrum of activities in a laboratory setting using two wearable sensors (ActiGraph and Metria-IH1) at different body locations. Manufacturer-provided standard proprietary algorithms estimated the step count, energy expenditure (EE) and metabolic equivalent (MET). These estimates were compared with the estimates from gold standard measures. For verifying validity, a series of Kruskal-Wallis ANOVA tests (Games-Howell multiple comparison for post-hoc analyses) were conducted to compare the mean rank and absolute agreement of outcome metrics estimated by each of the devices in comparison with the designated gold standard measurements. The sensor type, sensor location, activity characteristics and the population-specific condition influence the validity of estimation of physical activity metrics using standard proprietary algorithms. Implementing population-specific customized algorithms accounting for the influences of sensor location, type and activity characteristics for estimating physical activity metrics in individuals with stroke and iSCI could be beneficial.
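
    As a hedged illustration of the kind of agreement analysis described, the sketch below compares simulated device step-count estimates against a manually tallied gold standard with a Kruskal-Wallis test. The device/location labels and error distributions are hypothetical, and the Games-Howell post-hoc step is only noted because it is not available in SciPy.

```python
# Sketch of the validity comparison described above: device estimates vs. a gold
# standard, compared with a Kruskal-Wallis test. Data and condition names are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
manual = rng.normal(500, 30, size=28)                    # gold standard step counts

# Simulated device estimates under three hypothetical sensor/location conditions.
est = {
    "ActiGraph-waist": manual * rng.normal(0.97, 0.04, 28),
    "ActiGraph-wrist": manual * rng.normal(0.85, 0.08, 28),
    "Metria-arm":      manual * rng.normal(1.02, 0.06, 28),
}

# Percent error relative to the gold standard for each condition.
errors = {k: 100 * (v - manual) / manual for k, v in est.items()}

h, p = stats.kruskal(*errors.values())                   # non-parametric one-way test
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.4g}")
for k, e in errors.items():
    print(f"{k:16s} mean error = {e.mean():+.1f}%")
# Games-Howell pairwise comparisons would follow in a full analysis.
```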

  14. 77 FR 67340 - National Fire Codes: Request for Comments on NFPA's Codes and Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-09

    ... the process. The Code Revision Process contains four basic steps that are followed for developing new documents as well as revising existing documents. Step 1: Public Input Stage, which results in the First Draft Report (formerly ROP); Step 2: Comment Stage, which results in the Second Draft Report (formerly...

  15. User error with Diskus and Turbuhaler by asthma patients and pharmacists in Jordan and Australia.

    PubMed

    Basheti, Iman A; Qunaibi, Eyad; Bosnic-Anticevich, Sinthia Z; Armour, Carol L; Khater, Samar; Omar, Muthana; Reddel, Helen K

    2011-12-01

    Use of inhalers requires accurate completion of multiple steps to ensure effective medication delivery. To evaluate the most problematic steps in the use of Diskus and Turbuhaler for pharmacists and patients in Jordan and Australia. With standardized inhaler-technique checklists, we asked community pharmacists to demonstrate the use of Diskus and Turbuhaler. We asked patients with asthma to demonstrate the inhaler (Diskus or Turbuhaler) they were currently using. Forty-two community pharmacists in Jordan, and 31 in Australia, participated. In Jordan, 51 asthma patients demonstrated use of Diskus, and 40 demonstrated use of Turbuhaler. In Australia, 53 asthma patients demonstrated use of Diskus, and 42 demonstrated use of Turbuhaler. The pharmacists in Australia had received inhaler-technique education more recently than those in Jordan (P = .03). With Diskus, few pharmacists in either country demonstrated correct technique for step 3 (exhale to residual volume) or step 4 (exhale away from the device), although there were somewhat fewer errors in Australia than Jordan (16% vs 0% in step 3, P = .007, and 20% vs 0% in step 4, P = .003 via chi-square test). With Turbuhaler there were significant differences between the pharmacists from Australia and Jordan, mainly in step 2 (hold the device upright while loading, 45% vs 2% correct, P < .001). Few of the patients had received inhaler-technique education in the previous year. The patients made errors similar to those of the pharmacists in individual steps with Diskus and Turbuhaler. The essential steps with Diskus were performed correctly more often by the Jordanian patients, and with Turbuhaler by the Australian patients. Despite differences in Jordan's and Australia's health systems, pharmacists from both Australia and Jordan had difficulty with the same Diskus and Turbuhaler steps. In both countries, the errors made by the asthma patients were similar to those made by the pharmacists.

  16. Validity and reliability of the activPAL3 for measuring posture and stepping in adults and young people.

    PubMed

    Sellers, Ceri; Dall, Philippa; Grant, Margaret; Stansfield, Ben

    2016-01-01

    Characterisation of free-living physical activity requires the use of validated and reliable monitors. This study reports an evaluation of the validity and reliability of the activPAL3 monitor for the detection of posture and stepping in both adults and young people. Twenty adults (median 27.6 y; IQR 22.6 y) and 8 young people (12.0 y; IQR 4.1 y) performed standardised activities and activities of daily living (ADL) incorporating sedentary, upright and stepping activity. Agreement, specificity and positive predictive value were calculated between activPAL3 outcomes and the gold-standard of video observation. Inter-device reliability was calculated between 4 monitors. Sedentary and upright times for standardised activities were within ±5% of video observation as was step count (excluding jogging) for both adults and young people. Jogging step detection accuracy reduced with increasing cadence >150 steps min(-1). For ADLs, sensitivity to stepping was very low for adults (40.4%) but higher for young people (76.1%). Inter-device reliability was either good (ICC(1,1)>0.75) or excellent (ICC(1,1)>0.90) for all outcomes. An excellent level of detection of standardised postures was demonstrated by the activPAL3. Postures such as seat-perching, kneeling and crouching were misclassified when compared to video observation. The activPAL3 appeared to accurately detect 'purposeful' stepping during ADL, but detection of smaller stepping movements was poor. Small variations in outcomes between monitors indicated that differences in monitor placement or hardware may affect outcomes. In general, the detection of posture and purposeful stepping with the activPAL3 was excellent indicating that it is a suitable monitor for characterising free-living posture and purposeful stepping activity in healthy adults and young people. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Standardization of a two-step real-time polymerase chain reaction based method for species-specific detection of medically important Aspergillus species.

    PubMed

    Das, P; Pandey, P; Harishankar, A; Chandy, M; Bhattacharya, S; Chakrabarti, A

    2017-01-01

    Standardization of Aspergillus polymerase chain reaction (PCR) poses two technical challenges: (a) standardization of DNA extraction and (b) optimization of PCR against various medically important Aspergillus species. Many cases of aspergillosis go undiagnosed because of the relative insensitivity of conventional diagnostic methods such as microscopy, culture or antigen detection. The present study is an attempt to standardize a real-time PCR assay for rapid, sensitive and specific detection of Aspergillus DNA in EDTA whole blood. Three nucleic acid extraction protocols were compared and a two-step real-time PCR assay was developed and validated following the recommendations of the European Aspergillus PCR Initiative in our setup. In the first PCR step (pan-Aspergillus PCR), the target was the 28S rDNA gene, whereas in the second step (species-specific PCR) the targets were the beta-tubulin gene (for Aspergillus fumigatus, Aspergillus flavus and Aspergillus terreus) and the calmodulin gene (for Aspergillus niger). Species-specific identification of four medically important Aspergillus species, namely, A. fumigatus, A. flavus, A. niger and A. terreus was achieved by this PCR. Specificity of the PCR was tested against 34 different DNA sources, including bacteria, viruses, yeasts, other Aspergillus species, other fungal species and human DNA, with no false-positive reactions. The analytical sensitivity of the PCR was found to be 10(2) CFU/mL. The present protocol of two-step real-time PCR assays for genus- and species-specific identification of commonly isolated species in whole blood for the diagnosis of invasive Aspergillus infections offers a rapid, sensitive and specific assay option and requires clinical validation at multiple centers.

  18. An Automatic and Robust Algorithm of Reestablishment of Digital Dental Occlusion

    PubMed Central

    Chang, Yu-Bing; Xia, James J.; Gateno, Jaime; Xiong, Zixiang; Zhou, Xiaobo; Wong, Stephen T. C.

    2017-01-01

    In the field of craniomaxillofacial (CMF) surgery, surgical planning can be performed on composite 3-D models that are generated by merging a computerized tomography scan with digital dental models. Digital dental models can be generated by scanning the surfaces of plaster dental models or dental impressions with a high-resolution laser scanner. During the planning process, one of the essential steps is to reestablish the dental occlusion. Unfortunately, this task is time-consuming and often inaccurate. This paper presents a new approach to automatically and efficiently reestablish dental occlusion. It includes two steps. The first step is to initially position the models based on dental curves and a point matching technique. The second step is to reposition the models to the final desired occlusion based on iterative surface-based minimum distance mapping with collision constraints. With linearization of the rotation matrix, the alignment is formulated as a quadratic programming problem. The simulation was completed on 12 sets of digital dental models. Two sets of dental models were partially edentulous, and another two sets had first premolar extractions for orthodontic treatment. Two validation methods were applied to the articulated models. The results show that using our method, the dental models can be successfully articulated with a small degree of deviations from the occlusion achieved with the gold-standard method. PMID:20529735
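
    The second step hinges on linearizing the rotation so the alignment becomes a quadratic program. A minimal sketch of that linearization, with the collision constraints omitted so that one iteration reduces to ordinary least squares over matched point sets P (moving) and Q (fixed), is given below; it is not the authors' implementation.

```python
# One linearized rigid-alignment step (small-angle rotation) solved as least squares.
# Sketch of the idea only; the paper's full formulation adds collision constraints
# and solves a quadratic program. P (moving) and Q (fixed) are matched point sets.
import numpy as np

def skew(v):
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def linearized_rigid_step(P, Q):
    """Return R, t minimizing sum ||(I + skew(w)) p + t - q||^2 over (w, t)."""
    n = P.shape[0]
    A = np.zeros((3 * n, 6))
    b = (Q - P).reshape(-1)
    for i, p in enumerate(P):
        A[3*i:3*i+3, :3] = -skew(p)   # d(residual)/dw, since w x p = -skew(p) w
        A[3*i:3*i+3, 3:] = np.eye(3)  # d(residual)/dt
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    w, t = x[:3], x[3:]
    U, _, Vt = np.linalg.svd(np.eye(3) + skew(w))   # project back onto SO(3)
    return U @ Vt, t

# Example: recover a small known rotation/translation from noisy correspondences.
rng = np.random.default_rng(0)
P = rng.normal(size=(200, 3))
w_true = np.array([0.02, -0.01, 0.03]); t_true = np.array([0.5, -0.2, 0.1])
U, _, Vt = np.linalg.svd(np.eye(3) + skew(w_true)); R_true = U @ Vt
Q = P @ R_true.T + t_true + 0.001 * rng.normal(size=P.shape)
R_est, t_est = linearized_rigid_step(P, Q)
print("rotation error:", round(np.linalg.norm(R_est - R_true), 5),
      "| translation error:", round(np.linalg.norm(t_est - t_true), 5))
```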

  19. Step Up-Not On-The Step 2 Clinical Skills Exam: Directors of Clinical Skills Courses (DOCS) Oppose Ending Step 2 CS.

    PubMed

    Ecker, David J; Milan, Felise B; Cassese, Todd; Farnan, Jeanne M; Madigosky, Wendy S; Massie, F Stanford; Mendez, Paul; Obadia, Sharon; Ovitsh, Robin K; Silvestri, Ronald; Uchida, Toshiko; Daniel, Michelle

    2018-05-01

    Recently, a student-initiated movement to end the United States Medical Licensing Examination Step 2 Clinical Skills and the Comprehensive Osteopathic Medical Licensing Examination Level 2-Performance Evaluation has gained momentum. These are the only national licensing examinations designed to assess clinical skills competence in the stepwise process through which physicians gain licensure and certification. Therefore, the movement to end these examinations and the ensuing debate merit careful consideration. The authors, elected representatives of the Directors of Clinical Skills Courses, an organization comprising clinical skills educators in the United States and beyond, believe abolishing the national clinical skills examinations would have a major negative impact on the clinical skills training of medical students, and that forfeiting a national clinical skills competency standard has the potential to diminish the quality of care provided to patients. In this Perspective, the authors offer important additional background information, outline key concerns regarding the consequences of ending these national clinical skills examinations, and provide recommendations for moving forward: reducing the costs for students, exploring alternatives, increasing the value and transparency of the current examinations, recognizing and enhancing the strengths of the current examinations, and engaging in a national dialogue about the issue.

  20. Immersed boundary-simplified lattice Boltzmann method for incompressible viscous flows

    NASA Astrophysics Data System (ADS)

    Chen, Z.; Shu, C.; Tan, D.

    2018-05-01

    An immersed boundary-simplified lattice Boltzmann method is developed in this paper for simulations of two-dimensional incompressible viscous flows with immersed objects. Assisted by the fractional step technique, the problem is resolved in a predictor-corrector scheme. The predictor step solves the flow field without considering immersed objects, and the corrector step imposes the effect of immersed boundaries on the velocity field. Different from the previous immersed boundary-lattice Boltzmann method which adopts the standard lattice Boltzmann method (LBM) as the flow solver in the predictor step, a recently developed simplified lattice Boltzmann method (SLBM) is applied in the present method to evaluate intermediate flow variables. Compared to the standard LBM, SLBM requires less virtual memory, facilitates the implementation of physical boundary conditions, and shows better numerical stability. The boundary condition-enforced immersed boundary method, which accurately ensures no-slip boundary conditions, is implemented as the boundary solver in the corrector step. Four typical numerical examples are presented to demonstrate the stability, the flexibility, and the accuracy of the present method.
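
    A hedged sketch of the corrector idea follows: the velocity correction at the Lagrangian markers is obtained by solving a small linear system so that interpolating the corrected field back to the markers reproduces the wall velocity, in the spirit of the boundary-condition-enforced method. The grid, marker set, smoothed delta kernel and the uniform "predicted" field are all illustrative stand-ins for the SLBM predictor.

```python
# Sketch of a boundary-condition-enforced velocity correction (corrector step).
# The "predicted" field is just a uniform stream; sizes and kernel are illustrative.
import numpy as np

def delta_1d(r, h):
    """Smoothed (cosine) delta kernel with support 2h."""
    r = np.abs(r) / h
    return np.where(r < 2.0, 0.25 / h * (1.0 + np.cos(0.5 * np.pi * r)), 0.0)

# Eulerian grid and predicted velocity (as if produced by the SLBM predictor).
nx = ny = 64
h = 1.0 / nx
xg = (np.arange(nx) + 0.5) * h
yg = (np.arange(ny) + 0.5) * h
u_star = np.zeros((ny, nx, 2)); u_star[..., 0] = 1.0

# Lagrangian markers on a stationary circle (desired wall velocity = 0).
theta = np.linspace(0.0, 2.0 * np.pi, 48, endpoint=False)
Xb = np.column_stack([0.5 + 0.15 * np.cos(theta), 0.5 + 0.15 * np.sin(theta)])
ds = 2.0 * np.pi * 0.15 / len(theta)
U_wall = np.zeros_like(Xb)

# Kernel weights W[k] on the grid for each marker k.
W = np.stack([delta_1d(xg[None, :] - xb, h) * delta_1d(yg[:, None] - yb, h)
              for xb, yb in Xb])                          # (nb, ny, nx)

U_interp = np.einsum('kij,ijc->kc', W, u_star) * h * h    # velocity at the markers
A = np.einsum('kij,lij->kl', W, W) * h * h * ds           # interp(spread) operator
dU = np.linalg.solve(A, U_wall - U_interp)                # marker velocity corrections
u = u_star + np.einsum('kij,kc->ijc', W, dU) * ds         # corrected grid velocity

# Check: interpolating the corrected field at the markers recovers the wall velocity.
print(np.abs(np.einsum('kij,ijc->kc', W, u) * h * h - U_wall).max())
```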

  1. [Company Wide Quality Control (total quality): methodological principles and intervention techniques for step-by-step improvement].

    PubMed

    Corbara, F; Di Cristofaro, E

    1996-01-01

    The concept of Quality is particularly up to date and not a new one for the Journal. The need for better Quality is a must also in medical care. Quality doesn't mean additional costs and an excessive burden for the co-workers. On the contrary, initial costs can be compensated for through a more rational utilisation of the resources. The consequent better service for the patient results in an improved working environment, with high profits. Fundamental requirements for reaching concrete results are: 1) the convinced involvement of all levels (division, service, laboratory) in the idea, so that the different groups act in synergy towards common goals; 2) the knowledge of appropriate methods. The Authors examine this last point with a deep analysis of the techniques involved in Company Wide Quality Control (C.W.Q.C.) or Total Quality. The improvement process has to be continuous and proceed in small steps, each step consisting of 4 phases represented by the PDCA cycle, or Deming wheel, where: P = PLAN, which means plan before acting; D = DO, perform what has been planned; C = CHECK, verify the results; A = ACT, standardize if the results are positive, repeat the process if negative. Each process of improvement implies a prior precise definition of a project, i.e. a problem whose solution has been planned. The project must always presume: a specific subject--a goal--one or more people to reach it--a limited time to work it out. The most effective way to improve Quality is performing projects. Step-by-step improvement is synonymous with the performance of many projects. A brilliant way to produce many projects remains their "industrialization", which can be reached by means of 6 basic criteria: 1) full involvement of the Direction; 2) potential co-working in the projects of all employees; 3) employment of simple instruments; 4) respect of a few procedural formalities; 5) rewarding of personnel; 6) continuous promotion of the concepts of quality and ongoing improvement. The Authors describe, for each of the previous criteria, the approach methods and best operative techniques according to C.W.Q.C.

  2. ISLLC/ELCC Standards Implementation: Do Educational Administration Faculty Practice What They Preach?

    ERIC Educational Resources Information Center

    Machado, Crystal

    2012-01-01

    Both the 1996 Interstate School Leadership Licensure Consortium (ISLLC) standards and the 2002 Educational Leadership Constituent Council (ELCC) standards, adopted by preparation programs nationwide have a strong emphasis on democratic ideals. By aligning their programs with these standards education administration faculty have taken a step in the…

  3. 40 CFR 1039.107 - What evaporative emission standards and requirements apply?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 1048 that apply to spark-ignition engines, as follows: (a) Follow the steps in 40 CFR 1048.245 to show...-IGNITION ENGINES Emission Standards and Related Requirements § 1039.107 What evaporative emission standards and requirements apply? There are no evaporative emission standards for diesel-fueled engines, or...

  4. Reliability of a standardized test in Swedish for evaluation of reading performance in healthy eyes. Interchart and test-retest analyses.

    PubMed

    Thaung, Jörgen; Olseke, Kjell; Ahl, Johan; Sjöstrand, Johan

    2014-09-01

    The purpose of our study was to establish a practical and quick test for assessing reading performance and to statistically analyse interchart and test-retest reliability of a new standardized Swedish reading chart system consisting of three charts constructed according to the principles available in the literature. Twenty-four subjects with healthy eyes, mean age 65 ± 10 years, were tested binocularly and the reading performance evaluated as reading acuity, critical print size and maximum reading speed. The test charts all consist of 12 short text sentences with a print size ranging from 0.9 to -0.2 logMAR in approximate steps of 0.1 logMAR. Two testing sessions, in two different groups (C1 and C2), were under strict control of luminance and lighting environment. Reading performance tests with chart T1, T2 and T3 were used for evaluation of interchart reliability and test data from a second session 1 month or more apart for the test-retest analysis. The testing of reading performance in adult observers with short sentences of continuous text was quick and practical. The agreement between the tests obtained with the three different test charts was high both within the same test session and at retest. This new Swedish variant of a standardized reading system based on short sentences and logarithmic progression of print size provides reliable measurements of reading performance and preliminary norms in an age group around 65 years. The reading test with three independent reading charts can be useful for clinical studies of reading ability before and after treatment. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  5. Semi-automated Anatomical Labeling and Inter-subject Warping of High-Density Intracranial Recording Electrodes in Electrocorticography.

    PubMed

    Hamilton, Liberty S; Chang, David L; Lee, Morgan B; Chang, Edward F

    2017-01-01

    In this article, we introduce img_pipe, our open source python package for preprocessing of imaging data for use in intracranial electrocorticography (ECoG) and intracranial stereo-EEG analyses. The process of electrode localization, labeling, and warping for use in ECoG currently varies widely across laboratories, and it is usually performed with custom, lab-specific code. This python package aims to provide a standardized interface for these procedures, as well as code to plot and display results on 3D cortical surface meshes. It gives the user an easy interface to create anatomically labeled electrodes that can also be warped to an atlas brain, starting with only a preoperative T1 MRI scan and a postoperative CT scan. We describe the full capabilities of our imaging pipeline and present a step-by-step protocol for users.

  6. Multispot single-molecule FRET: High-throughput analysis of freely diffusing molecules

    PubMed Central

    Panzeri, Francesco

    2017-01-01

    We describe an 8-spot confocal setup for high-throughput smFRET assays and illustrate its performance with two characteristic experiments. First, measurements on a series of freely diffusing doubly-labeled dsDNA samples allow us to demonstrate that data acquired in multiple spots in parallel can be properly corrected and result in measured sample characteristics consistent with those obtained with a standard single-spot setup. We then take advantage of the higher throughput provided by parallel acquisition to address an outstanding question about the kinetics of the initial steps of bacterial RNA transcription. Our real-time kinetic analysis of promoter escape by bacterial RNA polymerase confirms results obtained by a more indirect route, shedding additional light on the initial steps of transcription. Finally, we discuss the advantages of our multispot setup, while pointing out potential limitations of the current single-laser excitation design, as well as analysis challenges and their solutions. PMID:28419142

  7. Optimization of Advanced ACTPol Transition Edge Sensor Bolometer Operation Using R(T,I) Transition Measurements

    NASA Astrophysics Data System (ADS)

    Salatino, Maria

    2017-06-01

    In the current submm and mm cosmology experiments the focal planes are populated by kilopixel transition edge sensors (TESes). Varying incoming power load requires frequent rebiasing of the TESes through standard current-voltage (IV) acquisition. The time required to perform IVs on such large arrays and the resulting transient heating of the bath reduces the sky observation time. We explore a bias step method that significantly reduces the time required for the rebiasing process. This exploits the detectors' responses to the injection of a small square wave signal on top of the dc bias current and knowledge of the shape of the detector transition R(T,I). This method has been tested on two detector arrays of the Atacama Cosmology Telescope (ACT). In this paper, we focus on the first step of the method, the estimate of the TES %Rn.

  8. Strategies for the one-step immobilization-purification of enzymes as industrial biocatalysts.

    PubMed

    Barbosa, Oveimar; Ortiz, Claudia; Berenguer-Murcia, Ángel; Torres, Rodrigo; Rodrigues, Rafael C; Fernandez-Lafuente, Roberto

    2015-01-01

    In this review, we detail the efforts made to couple the purification and the immobilization of industrial enzymes in a single step. The use of antibodies and the development of specific domains with affinity for particular supports will be reviewed. Moreover, we will discuss the use of domains that increase the affinity for standard matrices (ionic exchangers, silicates). We will show how the control of the immobilization conditions may convert some unspecific supports into largely specific ones. The development of tailor-made heterofunctional supports as a tool to immobilize-stabilize-purify some proteins will be discussed in depth, using a low concentration of adsorbent groups and a dense layer of groups able to give an intense multipoint covalent attachment. The final coupling of mutagenesis and tailor-made supports will be the last part of the review. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. CIDOC-CRM extensions for conservation processes: A methodological approach

    NASA Astrophysics Data System (ADS)

    Vassilakaki, Evgenia; Zervos, Spiros; Giannakopoulos, Georgios

    2015-02-01

    This paper aims to report the steps taken to create the CIDOC Conceptual Reference Model (CIDOC-CRM) extensions and the relationships established to accommodate the depiction of conservation processes. In particular, the specific steps undertaken for developing and applying the CIDOC-CRM extensions for defining the conservation interventions performed on the cultural artifacts of the National Archaeological Museum of Athens, Greece, are presented in detail. A report on the preliminary design of the DOC-CULTURE project (Development of an integrated information environment for assessment and documentation of conservation interventions to cultural works/objects with nondestructive testing techniques [NDTs], www.ndt-lab.gr/docculture), co-financed by the European Union NSRF THALES program, can be found in Kyriaki-Manessi, Zervos & Giannakopoulos (1), whereas the NDT&E methods and their output data, handled through the CIDOC-CRM extensions of the DOC-CULTURE project to standardize the documentation of conservation, were further reported in Kouis et al. (2).

  10. Frequency optimization in the eddy current test for high purity niobium

    NASA Astrophysics Data System (ADS)

    Joung, Mijoung; Jung, Yoochul; Kim, Hyungjin

    2017-01-01

    The eddy current test (ECT) is frequently used as a non-destructive method to check for defects in high-purity niobium (RRR 300, Residual Resistivity Ratio) in a superconducting radio frequency (SRF) cavity. Determining an optimal frequency corresponding to the specific material properties and probe specification is a very important step. ECT experiments were performed to determine the optimal frequency using a standard high-purity Nb sample with artificial defects. The target depth was chosen by considering the treatment steps that the niobium receives as the SRF cavity material. The results were analysed in terms of the selectivity of the response to defects of different sizes. According to the results, the optimal frequency was determined to be 200 kHz, and a few features of the ECT for high-purity Nb were observed.

  11. Semi-automated Anatomical Labeling and Inter-subject Warping of High-Density Intracranial Recording Electrodes in Electrocorticography

    PubMed Central

    Hamilton, Liberty S.; Chang, David L.; Lee, Morgan B.; Chang, Edward F.

    2017-01-01

    In this article, we introduce img_pipe, our open source python package for preprocessing of imaging data for use in intracranial electrocorticography (ECoG) and intracranial stereo-EEG analyses. The process of electrode localization, labeling, and warping for use in ECoG currently varies widely across laboratories, and it is usually performed with custom, lab-specific code. This python package aims to provide a standardized interface for these procedures, as well as code to plot and display results on 3D cortical surface meshes. It gives the user an easy interface to create anatomically labeled electrodes that can also be warped to an atlas brain, starting with only a preoperative T1 MRI scan and a postoperative CT scan. We describe the full capabilities of our imaging pipeline and present a step-by-step protocol for users. PMID:29163118

  12. The future is in the numbers: the power of predictive analysis in the biomedical educational environment

    PubMed Central

    Gullo, Charles A.

    2016-01-01

    Biomedical programs have a potential treasure trove of data they can mine to assist admissions committees in identifying students who are likely to do well, and to help educational committees identify students who are likely to do poorly on standardized national exams and who may need remediation. In this article, we provide a step-by-step approach that schools can utilize to generate data that are useful when predicting the future performance of current students in any given program. We discuss the use of linear regression analysis as the means of generating those data and highlight some of the limitations. Finally, we lament that these institution-specific data sets are not being combined and fully utilized at the national level, where they could greatly assist programs at large. PMID:27374246
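
    A minimal sketch of this workflow, with entirely hypothetical column names and simulated data: fit a linear regression on a historical cohort, check how well it predicts held-out students, and apply it to current students to flag those predicted to score low.

```python
# Sketch of the predictive-analysis workflow described above: fit a linear regression
# on a historical cohort, then predict exam scores for current students.
# All column names, thresholds and data are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 300
hist = pd.DataFrame({
    "mcat": rng.normal(508, 6, n),
    "gpa": rng.normal(3.6, 0.25, n),
    "year1_avg": rng.normal(82, 7, n),
})
# Simulated board-exam score loosely driven by the predictors.
hist["board_score"] = (2.0 * (hist["mcat"] - 500) + 15 * (hist["gpa"] - 3.0)
                       + 0.8 * hist["year1_avg"] + rng.normal(0, 8, n) + 120)

X_train, X_test, y_train, y_test = train_test_split(
    hist[["mcat", "gpa", "year1_avg"]], hist["board_score"], random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("R^2 on held-out students:", round(r2_score(y_test, model.predict(X_test)), 2))

# Predict for current students; a program might flag those below some threshold.
current = pd.DataFrame({"mcat": [502, 515], "gpa": [3.2, 3.9], "year1_avg": [70, 90]})
current["predicted_score"] = model.predict(current)
print(current)
```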

  13. Re-evaluation of an Optimized Second Order Backward Difference (BDF2OPT) Scheme for Unsteady Flow Applications

    NASA Technical Reports Server (NTRS)

    Vatsa, Veer N.; Carpenter, Mark H.; Lockard, David P.

    2009-01-01

    Recent experience in the application of an optimized, second-order, backward-difference (BDF2OPT) temporal scheme is reported. The primary focus of the work is on obtaining accurate solutions of the unsteady Reynolds-averaged Navier-Stokes equations over long periods of time for aerodynamic problems of interest. The baseline flow solver under consideration uses a particular BDF2OPT temporal scheme with a dual-time-stepping algorithm for advancing the flow solutions in time. Numerical difficulties are encountered with this scheme when the flow code is run for a large number of time steps, a behavior not seen with the standard second-order, backward-difference, temporal scheme. Based on a stability analysis, slight modifications to the BDF2OPT scheme are suggested. The performance and accuracy of this modified scheme is assessed by comparing the computational results with other numerical schemes and experimental data.
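
    For reference, the sketch below applies a plain (unoptimized) BDF2 step to a linear test ODE; it illustrates the baseline backward-difference update that the BDF2OPT variant modifies and is not the flow-solver code.

```python
# Generic sketch of a standard second-order backward-difference (BDF2) step for
# du/dt = f(u), applied to the linear test problem f(u) = lam * u.
import numpy as np

def bdf2_linear(lam, u0, dt, nsteps):
    """BDF2: (3 u^{n+1} - 4 u^n + u^{n-1}) / (2 dt) = lam * u^{n+1}."""
    u = np.empty(nsteps + 1)
    u[0] = u0
    u[1] = u[0] / (1.0 - dt * lam)          # start-up: one backward Euler step
    for n in range(1, nsteps):
        u[n + 1] = (4.0 * u[n] - u[n - 1]) / (3.0 - 2.0 * dt * lam)
    return u

lam, dt, T = -2.0, 0.05, 2.0
u = bdf2_linear(lam, 1.0, dt, int(T / dt))
print("BDF2 error at T:", abs(u[-1] - np.exp(lam * T)))
```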

  14. Numerical solution of turbulent flow past a backward facing step using a nonlinear K-epsilon model

    NASA Technical Reports Server (NTRS)

    Speziale, C. G.; Ngo, Tuan

    1987-01-01

    The problem of turbulent flow past a backward facing step is important in many technological applications and has been used as a standard test case to evaluate the performance of turbulence models in the prediction of separated flows. It is well known that the commonly used kappa-epsilon (and K-l) models of turbulence yield inaccurate predictions for the reattachment points in this problem. By an analysis of the mean vorticity transport equation, it will be argued that the intrinsically inaccurate prediction of normal Reynolds stress differences by the kappa-epsilon and K-l models is a major contributor to this problem. Computations using a new nonlinear kappa-epsilon model (which alleviates this deficiency) are made with the TEACH program. Comparisons are made between the improved results predicted by this nonlinear kappa-epsilon model and those obtained from the linear kappa-epsilon model as well as from second-order closure models.
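
    For context (not taken from the paper), the linear Boussinesq relation underlying standard kappa-epsilon closures is

    $$ \overline{u_i' u_j'} \;=\; \tfrac{2}{3}\,k\,\delta_{ij} \;-\; 2\,\nu_t\,S_{ij}, \qquad \nu_t = C_\mu \frac{k^2}{\varepsilon}, \qquad S_{ij} = \tfrac{1}{2}\!\left( \frac{\partial \bar{u}_i}{\partial x_j} + \frac{\partial \bar{u}_j}{\partial x_i} \right). $$

    In a simple shear flow, where the only nonzero mean gradient is \( \partial \bar{u}_1 / \partial x_2 \), the diagonal components of \( S_{ij} \) vanish, so the predicted normal stresses all equal \( \tfrac{2}{3}k \) and the normal-stress differences are identically zero; nonlinear kappa-epsilon models add quadratic products of the strain and rotation tensors precisely to recover these differences.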

  15. A toxin-free enzyme-linked immunosorbent assay for the analysis of aflatoxins based on a VHH surrogate standard.

    PubMed

    Wang, Yanru; Li, Peiwu; Zhang, Qi; Hu, Xiaofeng; Zhang, Wen

    2016-09-01

    A toxin-free enzyme-linked immunosorbent assay (ELISA) for aflatoxins was developed using an anti-idiotype nanobody VHH 2-5 as a surrogate standard. Anti-idiotype nanobody VHH 2-5 was generated by immunizing an alpaca with anti-aflatoxin monoclonal antibody 1C11. This assay was used to detect aflatoxins in agro-products after a simple extraction with 75% methanol/H2O. Aflatoxin concentration was calculated by a two-step approach: the concentration of VHH 2-5 was first obtained by a four-parameter logistic regression from the detected absorbance value at 450 nm, and then converted to aflatoxin concentration by a linear equation. The assay exhibits a limit of detection (LOD) of 0.015 ng mL(-1), which is better than or comparable with conventional immunoassays. The performance of our VHH surrogate-based ELISA was further validated with a high-performance liquid chromatography (HPLC) method for total aflatoxins determination in 20 naturally contaminated peanut samples, displaying a good correlation (R(2) = 0.988). In conclusion, the proposed assay represents a first example applying an anti-idiotype VHH antibody as a standard surrogate in ELISA. With the advantages of high stability and ease of production, the VHH antibody-based standard surrogate can be extended in the future to immunoassays for other highly toxic compounds.
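
    A minimal sketch of the two-step quantification described above, using simulated calibration data; the linear-conversion slope and intercept are placeholders for the experimentally derived equation, not values from the paper.

```python
# Sketch of the two-step quantification: fit a four-parameter logistic (4PL) curve to
# the VHH 2-5 surrogate standards, invert it for unknowns, then apply a linear
# conversion to aflatoxin concentration. Calibration data and slope are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """4PL response at concentration x (a = max, d = min, c = midpoint, b = slope)."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Simulated calibration: absorbance at 450 nm vs. VHH 2-5 concentration (ng/mL).
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
a450 = four_pl(conc, 1.8, 1.1, 0.5, 0.05) + np.random.default_rng(0).normal(0, 0.02, conc.size)
popt, _ = curve_fit(four_pl, conc, a450, p0=[1.8, 1.0, 0.5, 0.05])

def invert_4pl(y, a, b, c, d):
    """Solve the fitted 4PL for concentration given an absorbance reading."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

vhh = invert_4pl(0.9, *popt)          # step 1: absorbance -> surrogate concentration
slope, intercept = 0.8, 0.0           # placeholder linear conversion (hypothetical)
aflatoxin = slope * vhh + intercept   # step 2: surrogate -> aflatoxin concentration
print(f"VHH 2-5: {vhh:.3f} ng/mL -> aflatoxin: {aflatoxin:.3f} ng/mL")
```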

  16. Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conover, David R.; Crawford, Aladsair J.; Viswanathan, Vilayanur V.

    2014-06-01

    The Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage Systems (PNNL-22010) was first issued in November 2012 as a first step toward providing a foundational basis for developing an initial standard for the uniform measurement and expression of energy storage system (ESS) performance. Its subsequent use in the field and review by the protocol working group and most importantly the users’ subgroup and the thermal subgroup has led to the fundamental modifications reflected in this update of the 2012 Protocol. As an update of the 2012 Protocol, this document (the June 2014 Protocol) is intended to supersede its predecessor and be used as the basis for measuring and expressing ESS performance. The foreword provides general and specific details about what additions, revisions, and enhancements have been made to the 2012 Protocol and the rationale for them in arriving at the June 2014 Protocol.

  17. The predictive validity of the MCAT for medical school performance and medical board licensing examinations: a meta-analysis of the published research.

    PubMed

    Donnon, Tyrone; Paolucci, Elizabeth Oddone; Violato, Claudio

    2007-01-01

    To conduct a meta-analysis of published studies to determine the predictive validity of the MCAT on medical school performance and medical board licensing examinations. The authors included all peer-reviewed published studies reporting empirical data on the relationship between MCAT scores and medical school performance or medical board licensing exam measures. Moderator variables, participant characteristics, and medical school performance/medical board licensing exam measures were extracted and reviewed separately by three reviewers using a standardized protocol. Medical school performance measures from 11 studies and medical board licensing examinations from 18 studies, for a total of 23 studies, were selected. A random-effects model meta-analysis of weighted effects sizes (r) resulted in (1) a predictive validity coefficient for the MCAT in the preclinical years of r = 0.39 (95% confidence interval [CI], 0.21-0.54) and on the USMLE Step 1 of r = 0.60 (95% CI, 0.50-0.67); and (2) the biological sciences subtest as the best predictor of medical school performance in the preclinical years (r = 0.32; 95% CI, 0.21-0.42) and on the USMLE Step 1 (r = 0.48; 95% CI, 0.41-0.54). The predictive validity of the MCAT ranges from small to medium for both medical school performance and medical board licensing exam measures. The medical profession is challenged to develop screening and selection criteria with improved validity that can supplement the MCAT as an important criterion for admission to medical schools.
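
    For illustration, a minimal random-effects pooling of correlation coefficients via the Fisher z transform and the DerSimonian-Laird estimator, the generic machinery behind this kind of meta-analysis; the (r, n) pairs below are invented and are not the studies analyzed here.

```python
# Random-effects meta-analysis of correlations (Fisher z + DerSimonian-Laird tau^2).
# The study correlations and sample sizes are illustrative, not the reviewed studies.
import numpy as np

def random_effects_r(r, n):
    z = np.arctanh(r)                  # Fisher z transform of each correlation
    v = 1.0 / (np.asarray(n) - 3.0)    # within-study variance of z
    w = 1.0 / v
    z_fixed = np.sum(w * z) / np.sum(w)
    q = np.sum(w * (z - z_fixed) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(r) - 1)) / c)   # between-study variance (DL estimator)
    w_star = 1.0 / (v + tau2)
    z_re = np.sum(w_star * z) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    ci = np.tanh([z_re - 1.96 * se, z_re + 1.96 * se])
    return np.tanh(z_re), ci

r = np.array([0.45, 0.55, 0.62, 0.38, 0.58])   # illustrative study correlations
n = np.array([120, 300, 210, 95, 150])         # illustrative sample sizes
pooled, ci = random_effects_r(r, n)
print(f"pooled r = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```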

  18. CR-Calculus and adaptive array theory applied to MIMO random vibration control tests

    NASA Astrophysics Data System (ADS)

    Musella, U.; Manzato, S.; Peeters, B.; Guillaume, P.

    2016-09-01

    Performing Multiple-Input Multiple-Output (MIMO) tests to reproduce the vibration environment in a user-defined number of control points of a unit under test is necessary in applications where a realistic environment replication has to be achieved. MIMO tests require vibration control strategies to calculate the required drive signal vector that gives an acceptable replication of the target. This target is a (complex) vector with magnitude and phase information at the control points for MIMO Sine Control tests, while in MIMO Random Control tests, in the most general case, the target is a complete spectral density matrix. The idea behind this work is to tailor a MIMO random vibration control approach that can be generalized to other MIMO tests, e.g. MIMO Sine and MIMO Time Waveform Replication. In this work the approach is to use gradient-based procedures over the complex space, applying the so-called CR-Calculus and adaptive array theory. With this approach it is possible to better control the process performance, allowing a step-by-step update of the Jacobian matrix. The theoretical bases behind the work are followed by an application of the developed method to a two-exciter two-axis system and by performance comparisons with standard methods.
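
    A hedged sketch of the underlying gradient update for a single frequency line: minimizing ||t - G d||^2 over the complex drive vector d, whose CR-calculus (Wirtinger) gradient with respect to the conjugate of d is -G^H (t - G d). The FRF matrix G and target t are simulated, and a full MIMO random controller would update an entire spectral density matrix rather than a single target vector.

```python
# Gradient-based drive update over the complex space for one frequency line:
# minimize ||t - G d||^2, Wirtinger gradient w.r.t. conj(d) is -G^H (t - G d).
# G (frequency response matrix) and t (target) are simulated placeholders.
import numpy as np

rng = np.random.default_rng(3)
G = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))   # 2x2 FRF at this line
t = rng.normal(size=2) + 1j * rng.normal(size=2)             # target response

d = np.zeros(2, dtype=complex)                               # drive signal vector
mu = 0.3 / np.linalg.norm(G, 2) ** 2                         # step size for stability
for _ in range(200):
    e = t - G @ d                      # control error at this frequency line
    d = d + mu * G.conj().T @ e        # gradient step using the Wirtinger gradient
print("residual error:", np.linalg.norm(t - G @ d))
```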

  19. Anti-impulse-noise Edge Detection via Anisotropic Morphological Directional Derivatives.

    PubMed

    Shui, Peng-Lang; Wang, Fu-Ping

    2017-07-13

    Traditional differential-based edge detection suffers from abrupt degradation in performance when images are corrupted by impulse noises. The morphological operators such as the median filters and weighted median filters possess the intrinsic ability to counteract impulse noise. In this paper, by combining the biwindow configuration with weighted median filters, anisotropic morphological directional derivatives (AMDD) robust to impulse noise are proposed to measure the local grayscale variation around a pixel. For ideal step edges, the AMDD spatial response and directional representation are derived. The characteristics and edge resolution of two kinds of typical biwindows are analyzed thoroughly. In terms of the AMDD spatial response and directional representation of ideal step edges, the spatial matched filter is used to extract the edge strength map (ESM) from the AMDDs of an image. The spatial and directional matched filters are used to extract the edge direction map (EDM). Embedding the extracted ESM and EDM into the standard route of the differential-based edge detection, an anti-impulse-noise AMDD-based edge detector is constructed. It is compared with the existing state-of-the-art detectors on a recognized image dataset for edge detection evaluation. The results show that it attains competitive performance in noise-free and Gaussian noise cases and the best performance in impulse noise cases.
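
    A simplified illustration of the biwindow idea (horizontal direction only, not the full AMDD construction): the difference between the medians of two opposing half-windows acts as a directional derivative that stays reliable under impulse noise.

```python
# Simplified biwindow directional "derivative": difference of the medians of two
# opposing half-windows, which is robust to impulse (salt-and-pepper) noise.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)
img = np.zeros((64, 64)); img[:, 32:] = 1.0           # ideal vertical step edge
noise = rng.random(img.shape)
img[noise < 0.05] = 0.0; img[noise > 0.95] = 1.0      # add impulse noise

# Two opposing half-windows (left and right of the centre pixel) as footprints.
left = np.zeros((5, 5), bool); left[:, :2] = True
right = np.zeros((5, 5), bool); right[:, 3:] = True

deriv = median_filter(img, footprint=right) - median_filter(img, footprint=left)
print("mean response on the edge column:", round(deriv[:, 32].mean(), 2),
      "| mean |response| off the edge:", round(np.abs(deriv[:, 10]).mean(), 2))
```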

  20. Students with Disabilities & Educational Standards: Recommendations for Policy & Practice.

    ERIC Educational Resources Information Center

    Ysseldyke, James E.; And Others

    1994-01-01

    By setting academic standards, America takes its first critical step toward providing a plan that will create an excellent educational system for the 21st century. It is important for those working on standards and those educating students with disabilities to work together as standards are being developed. Four kinds of standards need to be…

  1. State Standard-Setting Processes in Brief. State Academic Standards: Standard-Setting Processes

    ERIC Educational Resources Information Center

    Thomsen, Jennifer

    2014-01-01

    Concerns about academic standards, whether created by states from scratch or adopted by states under the Common Core State Standards (CCSS) banner, have drawn widespread media attention and are at the top of many state policymakers' priority lists. Recently, a number of legislatures have required additional steps, such as waiting periods for…

  2. Accuracy of a smartphone to test laryngoscope's light and an audit to our laryngoscopes using an ISO standard.

    PubMed

    Machado, Diogo Alcino de Abreu Ribeiro Carvalho; Esteves, Dina da Assunção Azevedo; Branca, Pedro Manuel Araújo de Sousa

    The laryngoscope is a key tool in anesthetic practice. Direct laryngoscopy is a critical moment, and an inadequate laryngoscope light can lead to catastrophic consequences. In our experience the laryngoscope's light is assessed in a subjective manner, and we believe a more precise evaluation should be used. Our objective was to assess the accuracy of a smartphone compared to a lux meter. Secondly, we audited our Operating Room laryngoscopes. We designed a pragmatic study, using as the primary outcome the accuracy of a smartphone compared to the lux meter. Further, we audited with both the lux meter and the smartphone all laryngoscopes and blades ready for use in our Operating Rooms, using the International Standard from the International Organization for Standardization. For the primary outcome we found no significant difference between devices. Our audit showed that only 2 of 48 laryngoscopes complied with the ISO norm. When comparing the measurements between the lux meter and the smartphone we found no significant difference. Ideally, every laryngoscope should perform as required. We believe all laryngoscopes should undergo a practical but reliable and objective test prior to use. Our results suggest the smartphone was accurate enough to be used as a lux meter to test the laryngoscope's light. Audit results showing that only 4% comply with the ISO standard are consistent with other studies. The tested smartphone has enough accuracy to perform light measurements in laryngoscopes. We believe this is a step toward an objective routine check of the laryngoscope's light. Copyright © 2016. Published by Elsevier Editora Ltda.

  3. Robotic Single-Site Sacrocolpopexy Using Barbed Suture Anchoring and Peritoneal Tunneling Technique: Tips and Tricks.

    PubMed

    Guan, Xiaoming; Ma, Yingchun; Gisseman, Jordan; Kleithermes, Christopher; Liu, Juan

    2017-01-01

    To demonstrate the tips and tricks of a simpler technique for single-site sacrocolpopexy using barbed suture anchoring and retroperitoneal tunneling to make the procedure more efficient and reproducible. Step-by-step description of a surgical tutorial using a narrated video (Canadian Task Force classification III). Academic tertiary care hospital. Patient with Stage III uterine prolapse. Sacrocolpopexy is increasingly utilized since the US Food and Drug Administration warning about complications of vaginal mesh surgery. It is the gold standard for repair of apical prolapse. However, there is great variation in sacrocolpopexy procedure techniques, and they have not been standardized. Traditional single-site laparoscopic sacrocolpopexy is very challenging because the procedure time is long and suturing is difficult. The advantages of suturing with wristed needle drivers in robotic single-site surgery simplify this complex procedure. Furthermore, using the barbed suture anchoring and peritoneal tunneling technique potentially decreases the surgeon's learning curve and makes the procedure reproducible. In this video, we demonstrate a supracervical hysterectomy with a stepwise explanation of the correct technique for performing a robotic single-incision sacrocolpopexy. The use of the barbed suture and peritoneal tunneling technique with wristed needle drivers in robotic single-site sacrocolpopexy offers the possibility of an effective, safe, reproducible, and cosmetic surgical option. Copyright © 2016 AAGL. Published by Elsevier Inc. All rights reserved.

  4. Novel hybrid linear stochastic with non-linear extreme learning machine methods for forecasting monthly rainfall in a tropical climate.

    PubMed

    Zeynoddin, Mohammad; Bonakdari, Hossein; Azari, Arash; Ebtehaj, Isa; Gharabaghi, Bahram; Riahi Madavar, Hossein

    2018-09-15

    A novel hybrid approach is presented that can more accurately predict monthly rainfall in a tropical climate by integrating a linear stochastic model with a powerful non-linear extreme learning machine method. This new hybrid method was then evaluated by considering four general scenarios. In the first scenario, the modeling process is initiated without preprocessing the input data, as a base case, while in the other three scenarios one-step and two-step preprocessing procedures are utilized to make the model predictions more precise. The mentioned scenarios are based on a combination of stationarization techniques (i.e., differencing, seasonal and non-seasonal standardization and spectral analysis) and normality transforms (i.e., Box-Cox, John and Draper, Yeo and Johnson, Johnson, Box-Cox-Mod, log, log standard, and Manly). In scenario 2, which is a one-step scenario, the stationarization methods are employed as preprocessing approaches. In scenarios 3 and 4, different combinations of normality transforms and stationarization methods are considered as preprocessing techniques. In total, 61 sub-scenarios are evaluated, resulting in 11013 models (10785 linear models, 4 nonlinear models, and 224 hybrid models). The uncertainty of the linear, nonlinear and hybrid models is examined by a Monte Carlo technique. The best preprocessing technique is the utilization of the Johnson normality transform and seasonal standardization, respectively (R(2) = 0.99; RMSE = 0.6; MAE = 0.38; RMSRE = 0.1; MARE = 0.06; UI = 0.03; UII = 0.05). The results of the uncertainty analysis indicated the good performance of the proposed technique (d-factor = 0.27; 95PPU = 83.57). Moreover, the results of the proposed methodology in this study were compared with an evolutionary hybrid of an adaptive neuro-fuzzy inference system with a firefly algorithm (ANFIS-FFA), demonstrating that the new hybrid method outperformed the ANFIS-FFA method. Copyright © 2018 Elsevier Ltd. All rights reserved.
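
    As a hedged sketch of one such two-step preprocessing chain (a Box-Cox normality transform followed by seasonal standardization) applied to a simulated monthly rainfall series:

```python
# Two-step preprocessing of a monthly series: Box-Cox normality transform, then
# seasonal standardization. The rainfall series is simulated; the study compares
# many such stationarization/normality-transform combinations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
months = np.arange(240)                                   # 20 years of monthly data
seasonal = 120 + 80 * np.sin(2 * np.pi * months / 12)
rain = np.maximum(rng.gamma(shape=2.0, scale=seasonal / 2.0), 0.1)  # rainfall (mm)

# Step 1: Box-Cox normality transform (requires strictly positive data).
rain_bc, lam = stats.boxcox(rain)

# Step 2: seasonal standardization -- subtract each calendar month's mean and
# divide by its standard deviation.
by_month = rain_bc.reshape(-1, 12)
rain_std = ((by_month - by_month.mean(axis=0)) / by_month.std(axis=0)).reshape(-1)

print(f"Box-Cox lambda = {lam:.2f}, "
      f"standardized mean = {rain_std.mean():.2f}, std = {rain_std.std():.2f}")
```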

  5. PETOOL: MATLAB-based one-way and two-way split-step parabolic equation tool for radiowave propagation over variable terrain

    NASA Astrophysics Data System (ADS)

    Ozgun, Ozlem; Apaydin, Gökhan; Kuzuoglu, Mustafa; Sevgi, Levent

    2011-12-01

    A MATLAB-based one-way and two-way split-step parabolic equation software tool (PETOOL) has been developed with a user-friendly graphical user interface (GUI) for the analysis and visualization of radio-wave propagation over variable terrain and through homogeneous and inhomogeneous atmosphere. The tool has a unique feature over existing one-way parabolic equation (PE)-based codes, because it utilizes the two-way split-step parabolic equation (SSPE) approach with wide-angle propagator, which is a recursive forward-backward algorithm to incorporate both forward and backward waves into the solution in the presence of variable terrain. First, the formulation of the classical one-way SSPE and the relatively-novel two-way SSPE is presented, with particular emphasis on their capabilities and the limitations. Next, the structure and the GUI capabilities of the PETOOL software tool are discussed in detail. The calibration of PETOOL is performed and demonstrated via analytical comparisons and/or representative canonical tests performed against the Geometric Optic (GO) + Uniform Theory of Diffraction (UTD). The tool can be used for research and/or educational purposes to investigate the effects of a variety of user-defined terrain and range-dependent refractivity profiles in electromagnetic wave propagation. Program summaryProgram title: PETOOL (Parabolic Equation Toolbox) Catalogue identifier: AEJS_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEJS_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 143 349 No. of bytes in distributed program, including test data, etc.: 23 280 251 Distribution format: tar.gz Programming language: MATLAB (MathWorks Inc.) 2010a. Partial Differential Toolbox and Curve Fitting Toolbox required Computer: PC Operating system: Windows XP and Vista Classification: 10 Nature of problem: Simulation of radio-wave propagation over variable terrain on the Earth's surface, and through homogeneous and inhomogeneous atmosphere. Solution method: The program implements one-way and two-way Split-Step Parabolic Equation (SSPE) algorithm, with wide-angle propagator. The SSPE is, in general, an initial-value problem starting from a reference range (typically from an antenna), and marching out in range by obtaining the field along the vertical direction at each range step, through the use of step-by-step Fourier transformations. The two-way algorithm incorporates the backward-propagating waves into the standard one-way SSPE by utilizing an iterative forward-backward scheme for modeling multipath effects over a staircase-approximated terrain. Unusual features: This is the first software package implementing a recursive forward-backward SSPE algorithm to account for the multipath effects during radio-wave propagation, and enabling the user to easily analyze and visualize the results of the two-way propagation with GUI capabilities. Running time: Problem dependent. Typically, it is about 1.5 ms (for conducting ground) and 4 ms (for lossy ground) per range step for a vertical field profile of vector length 1500, on Intel Core 2 Duo 1.6 GHz with 2 GB RAM under Windows Vista.
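
    A minimal sketch of the one-way split-step marching structure (narrow-angle propagator, flat perfectly conducting ground modeled via an odd-symmetric field extension, homogeneous atmosphere). It is written in Python for illustration, is not PETOOL code, and omits terrain handling, the wide-angle propagator and the two-way iteration.

```python
# One-way split-step parabolic equation sketch: at each range step, apply the
# free-space (diffraction) propagator in the spectral domain and a refractivity
# phase screen in the spatial domain. All parameters are illustrative.
import numpy as np

f = 3e9                            # frequency (Hz)
k = 2 * np.pi * f / 3e8            # free-space wavenumber
nz, zmax = 2048, 1024.0            # vertical grid points and domain height (m)
dz = zmax / nz
dx = 50.0                          # range step (m)
z = (np.arange(nz) + 0.5) * dz

# Gaussian antenna aperture at 30 m height; odd symmetry below models a PEC ground.
u = np.exp(-((z - 30.0) / 10.0) ** 2).astype(complex)
n_refr = np.ones(nz)               # homogeneous atmosphere (refractive index = 1)

kz = 2 * np.pi * np.fft.fftfreq(2 * nz, d=dz)             # spectrum of extended field
diffraction = np.exp(-1j * kz ** 2 * dx / (2 * k))        # narrow-angle propagator
refraction = np.exp(1j * k * (n_refr - 1.0) * dx)         # environment phase screen

for step in range(200):                                   # march 10 km in range
    u_odd = np.concatenate([-u[::-1], u])                 # enforce u = 0 at the ground
    u = np.fft.ifft(diffraction * np.fft.fft(u_odd))[nz:] # diffraction half of the step
    u = refraction * u                                    # refraction half of the step

print("field magnitude at 100 m height, 10 km range:", round(abs(u[int(100 / dz)]), 4))
```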

  6. Physical Activity Is Positively Associated with Episodic Memory in Aging.

    PubMed

    Hayes, Scott M; Alosco, Michael L; Hayes, Jasmeet P; Cadden, Margaret; Peterson, Kristina M; Allsup, Kelly; Forman, Daniel E; Sperling, Reisa A; Verfaellie, Mieke

    2015-11-01

    Aging is associated with performance reductions in executive function and episodic memory, although there is substantial individual variability in cognition among older adults. One factor that may be positively associated with cognition in aging is physical activity. To date, few studies have objectively assessed physical activity in young and older adults, and examined whether physical activity is differentially associated with cognition in aging. Young (n=29, age 18-31 years) and older adults (n=31, ages 55-82 years) completed standardized neuropsychological testing to assess executive function and episodic memory capacities. An experimental face-name relational memory task was administered to augment assessment of episodic memory. Physical activity (total step count and step rate) was objectively assessed using an accelerometer, and hierarchical regressions were used to evaluate relationships between cognition and physical activity. Older adults performed more poorly on tasks of executive function and episodic memory. Physical activity was positively associated with a composite measure of visual episodic memory and face-name memory accuracy in older adults. Physical activity associations with cognition were independent of sedentary behavior, which was negatively correlated with memory performance. Physical activity was not associated with cognitive performance in younger adults. Physical activity is positively associated with episodic memory performance in aging. The relationship appears to be strongest for face-name relational memory and visual episodic memory, likely attributable to the fact that these tasks make strong demands on the hippocampus. The results suggest that physical activity relates to cognition in older, but not younger adults.

  7. 5 CFR 531.504 - Level of performance required for quality step increase.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... step increase. 531.504 Section 531.504 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PAY UNDER THE GENERAL SCHEDULE Quality Step Increases § 531.504 Level of performance required for quality step increase. A quality step increase shall not be required but may be granted only...

  8. Developing Statistical Models to Assess Transplant Outcomes Using National Registries: The Process in the United States.

    PubMed

    Snyder, Jon J; Salkowski, Nicholas; Kim, S Joseph; Zaun, David; Xiong, Hui; Israni, Ajay K; Kasiske, Bertram L

    2016-02-01

    Created by the US National Organ Transplant Act in 1984, the Scientific Registry of Transplant Recipients (SRTR) is obligated to publicly report data on transplant program and organ procurement organization performance in the United States. These reports include risk-adjusted assessments of graft and patient survival, and programs performing worse or better than expected are identified. The SRTR currently maintains 43 risk adjustment models for assessing posttransplant patient and graft survival and, in collaboration with the SRTR Technical Advisory Committee, has developed and implemented a new systematic process for model evaluation and revision. Patient cohorts for the risk adjustment models are identified, and single-organ and multiorgan transplants are defined, then each risk adjustment model is developed following a prespecified set of steps. Model performance is assessed, the model is refit to a more recent cohort before each evaluation cycle, and then it is applied to the evaluation cohort. The field of solid organ transplantation is unique in the breadth of the standardized data that are collected. These data allow for quality assessment across all transplant providers in the United States. A standardized process of risk model development using data from national registries may enhance the field.

  9. Lanthanide-IMAC enrichment of carbohydrates and polyols.

    PubMed

    Schemeth, Dieter; Rainer, Matthias; Messner, Christoph B; Rode, Bernd M; Bonn, Günther K

    2014-03-01

    In this study a new type of immobilized metal ion affinity chromatography resin for the enrichment of carbohydrates and polyols was synthesized by a radical polymerization reaction of vinyl phosphonic acid and 1,4-butanediol dimethacrylate using azo-bis-isobutyronitrile as the radical initiator. Interaction between the chelated trivalent lanthanide ions and negatively charged hydroxyl groups of carbohydrates and polyols was observed at high pH values. The new method was evaluated with single standard solutions, mixtures of standards, honey and a more complex extract of Cynara scolymus. The washing step was accomplished with excess volumes of acetonitrile. Elution of enriched carbohydrates was successfully performed with deionized water. The subsequent analysis was carried out with matrix-free laser desorption/ionization-time of flight mass spectrometry involving a TiO2-coated steel target, especially suitable for the measurement of low-molecular-weight substances. Quantitative analysis of the sugar alcohol xylitol as well as the determination of the maximal loading capacity was performed by gas chromatography in conjunction with mass spectrometric detection after chemical derivatization. In a parallel approach, quantum mechanical geometry optimizations were performed in order to compare the coordination behavior of various trivalent lanthanide ions. Copyright © 2013 John Wiley & Sons, Ltd.

  10. The use of cognitive task analysis to reveal the instructional limitations of experts in the teaching of procedural skills.

    PubMed

    Sullivan, Maura E; Yates, Kenneth A; Inaba, Kenji; Lam, Lydia; Clark, Richard E

    2014-05-01

    Because of the automated nature of knowledge, experts tend to omit information when describing a task. A potential solution is cognitive task analysis (CTA). The authors investigated the percentage of knowledge experts omitted when teaching a cricothyrotomy to determine the percentage of additional knowledge gained during a CTA interview. Three experts were videotaped teaching a cricothyrotomy in 2010 at the University of Southern California. After transcription, they participated in CTA interviews for the same procedure. Three additional surgeons were recruited to perform a CTA for the procedure, and a "gold standard" task list was created. Transcriptions from the teaching sessions were compared with the task list to identify omitted steps (both "what" and "how" to do). Transcripts from the CTA interviews were compared against the task list to determine the percentage of knowledge articulated by each expert during the initial "free recall" (unprompted) phase of the CTA interview versus the amount of knowledge gained by using CTA elicitation techniques (prompted). Experts omitted an average of 71% (10/14) of clinical knowledge steps, 51% (14/27) of action steps, and 73% (3.6/5) of decision steps. For action steps, experts described "how to do it" only 13% (3.6/27) of the time. The average number of steps that were described increased from 44% (20/46) when unprompted to 66% (31/46) when prompted. This study supports previous research that experts unintentionally omit knowledge when describing a procedure. CTA is a useful method to extract automated knowledge and augment expert knowledge recall during teaching.

  11. Finite-sample corrected generalized estimating equation of population average treatment effects in stepped wedge cluster randomized trials.

    PubMed

    Scott, JoAnna M; deCamp, Allan; Juraska, Michal; Fay, Michael P; Gilbert, Peter B

    2017-04-01

Stepped wedge designs are increasingly commonplace and advantageous for cluster randomized trials when it is both unethical to assign placebo and logistically difficult to allocate an intervention simultaneously to many clusters. We study marginal mean models fit with generalized estimating equations for assessing treatment effectiveness in stepped wedge cluster randomized trials. This approach has advantages over the more commonly used mixed models in that (1) the population-average parameters have an important interpretation for public health applications, and (2) it avoids untestable assumptions about latent variable distributions and parametric assumptions about error distributions, thereby providing more robust evidence on treatment effects. However, cluster randomized trials typically have a small number of clusters, rendering the standard generalized estimating equation sandwich variance estimator biased and highly variable and hence yielding incorrect inferences. We study the usual asymptotic generalized estimating equation inferences (i.e., using sandwich variance estimators and asymptotic normality) and four small-sample corrections to generalized estimating equations for stepped wedge cluster randomized trials and for parallel cluster randomized trials as a comparison. We show by simulation that the small-sample corrections provide improvement, with one correction appearing to provide at least nominal coverage even with only 10 clusters per group. These results demonstrate the viability of the marginal mean approach for both stepped wedge and parallel cluster randomized trials. We also study the comparative performance of the corrected methods for stepped wedge and parallel designs, and describe how the methods can accommodate interval censoring of individual failure times and incorporate semiparametric efficient estimators.
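    The sketch below illustrates the marginal-mean GEE approach for a binary outcome in a stepped wedge trial, comparing the usual sandwich standard errors to a small-sample bias-reduced variant available in statsmodels. This is illustrative only: the column names (`outcome`, `treated`, `period`, `cluster`) are assumptions, and the paper evaluates several corrections that are not necessarily the one shown here.

    ```python
    # Illustrative sketch: population-average treatment effect via GEE with an
    # exchangeable working correlation; compare asymptotic sandwich SEs with a
    # small-sample bias-reduced correction. Data frame columns are hypothetical.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    def fit_stepped_wedge_gee(df: pd.DataFrame):
        model = smf.gee(
            "outcome ~ treated + C(period)",   # treatment effect adjusted for period
            groups="cluster",
            data=df,
            family=sm.families.Binomial(),
            cov_struct=sm.cov_struct.Exchangeable(),
        )
        fit = model.fit()
        usual_se = fit.standard_errors(cov_type="robust")            # asymptotic sandwich
        corrected_se = fit.standard_errors(cov_type="bias_reduced")  # small-sample correction
        return fit.params, usual_se, corrected_se
    ```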

  12. The synchronisation of lower limb responses with a variable metronome: the effect of biomechanical constraints on timing.

    PubMed

    Chen, Hui-Ya; Wing, Alan M; Pratt, David

    2006-04-01

    Stepping in time with a metronome has been reported to improve pathological gait. Although there have been many studies of finger tapping synchronisation tasks with a metronome, the specific details of the influences of metronome timing on walking remain unknown. As a preliminary to studying pathological control of gait timing, we designed an experiment with four synchronisation tasks, unilateral heel tapping in sitting, bilateral heel tapping in sitting, bilateral heel tapping in standing, and stepping on the spot, in order to examine the influence of biomechanical constraints on metronome timing. These four conditions allow study of the effects of bilateral co-ordination and maintenance of balance on timing. Eight neurologically normal participants made heel tapping and stepping responses in synchrony with a metronome producing 500 ms interpulse intervals. In each trial comprising 40 intervals, one interval, selected at random between intervals 15 and 30, was lengthened or shortened, which resulted in a shift in phase of all subsequent metronome pulses. Performance measures were the speed of compensation for the phase shift, in terms of the temporal difference between the response and the metronome pulse, i.e. asynchrony, and the standard deviation of the asynchronies and interresponse intervals of steady state synchronisation. The speed of compensation decreased with increase in the demands of maintaining balance. The standard deviation varied across conditions but was not related to the compensation speed. The implications of these findings for metronome assisted gait are discussed in terms of a first-order linear correction account of synchronisation.
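    The first-order linear correction account mentioned at the end of the abstract can be sketched with a short simulation: each asynchrony is partially corrected on the next response, and a metronome phase shift enters as a one-off offset that then decays. All parameter values below are illustrative choices, not fitted data from the study.

    ```python
    # Minimal simulation of a first-order linear phase-correction model:
    # A[n] = (1 - alpha) * A[n-1] + motor noise, plus a one-off metronome
    # phase shift. Parameters are illustrative, not estimates from the paper.
    import numpy as np

    def simulate_sync(alpha=0.4, n_intervals=40, shift_at=20, shift_ms=60.0,
                      motor_sd=10.0, seed=0):
        rng = np.random.default_rng(seed)
        asynchrony = np.zeros(n_intervals)
        for n in range(1, n_intervals):
            a = (1.0 - alpha) * asynchrony[n - 1] + rng.normal(0.0, motor_sd)
            if n == shift_at:
                a += shift_ms          # lengthened interval shifts all later pulses
            asynchrony[n] = a
        return asynchrony

    # A larger correction gain alpha gives faster compensation for the phase
    # shift, mirroring the slower compensation seen under higher balance demands.
    print(simulate_sync(alpha=0.7)[18:26].round(1))
    ```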

  13. Numerical study of the direct pressure effect of acoustic waves in planar premixed flames

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmidt, H.; Jimenez, C.

Recently, the unsteady response of 1-D premixed flames to acoustic pressure waves, for frequencies below and above the inverse of the flame transit time, was investigated experimentally using OH chemiluminescence (Wangher, 2008). The measured frequency dependence of the response was compared with the prediction of an analytical model proposed by Clavin et al. (1990), derived from the standard flame model (one-step Arrhenius kinetics), and with a similar model proposed by McIntosh (1991). Discrepancies between the experimental results and the models led to the conclusion that the standard model does not adequately describe the unsteady response of real flames and that more realistic chemical models must be investigated. Here we follow exactly this suggestion and perform numerical studies of the response of lean methane flames using different reaction mechanisms. We find that the global flame response obtained with both detailed chemistry (GRI 3.0) and a reduced multi-step model by Peters (1996) lies slightly above the predictions of the analytical model but is close to the experimental results. We additionally used an irreversible one-step Arrhenius reaction model and show the effect of the pressure dependence of the global reaction rate on the flame response. Our results suggest, first, that the current models must be extended to capture the amplitude and phase results of the detailed mechanisms, and second, that the correlation between heat release and the measured OH* chemiluminescence should be studied in more depth. (author)
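    For readers unfamiliar with the "standard flame model", the direct pressure effect can be made explicit with a generic one-step irreversible Arrhenius rate. The symbols below follow textbook convention (global reaction order n, fuel mass fraction Y_F, activation energy E_a) and are not values taken from the paper.

    ```latex
    % Generic one-step irreversible Arrhenius rate; textbook symbols, not the
    % paper's parameters.
    \begin{equation*}
    \dot{\omega} \;=\; A\,\rho^{\,n}\,Y_{\mathrm{F}}\,
          \exp\!\left(-\frac{E_a}{R\,T}\right),
    \qquad
    \rho = \frac{p}{R_s T}
    \;\;\Rightarrow\;\;
    \dot{\omega} \propto p^{\,n} \ \text{at fixed } T,\ Y_{\mathrm{F}}.
    \end{equation*}
    ```

    In this form an acoustic pressure fluctuation modulates the reaction rate directly through the density (hence pressure) dependence of the global rate, which is the "direct pressure effect" the one-step model is used to probe.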

  14. SOI-silicon as structural layer for NEMS applications

    NASA Astrophysics Data System (ADS)

    Villarroya, Maria; Figueras, Eduard; Perez-Murano, Francesc; Campabadal, Francesca; Esteve, Jaume; Barniol, Nuria

    2003-04-01

The objective of this paper is to present the compatibility between a standard CMOS-on-bulk-silicon process and the fabrication of nanoelectromechanical systems using Silicon On Insulator (SOI) wafers as the substrate. This compatibility is required as a first step toward fabricating a very sensitive mass sensor based on a resonant cantilever with nanometer dimensions, using the crystalline silicon SOI layer as the structural layer. The cantilever is driven electrostatically to its resonance frequency by an electrode placed parallel to the cantilever, and a capacitive readout is performed. To achieve very high resolution, very small cantilever dimensions (nanometer range) are needed; for this reason, the control and excitation circuitry has to be integrated on the same substrate as the cantilever. Prior to the development of this sensor, it is necessary to develop a substrate that can first host a standard CMOS circuit and afterwards support fabrication of the nano-resonator. Starting from an SOI wafer and using very simple processes, the SOI silicon layer is removed except in the areas where the nanostructures will be fabricated, yielding a silicon substrate with SOI-structured islands. The CMOS circuitry is integrated on the bulk silicon region, while the remaining SOI region is used for the nanoresonator. The silicon oxide of this SOI region serves as insulator and as the sacrificial layer that is etched to release the cantilever from the substrate. To ensure coverage of the different CMOS layers over the island step, it is essential to avoid very sharp steps.

  15. Laboratory Performance Evaluation Report of SEL 421 Phasor Measurement Unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Zhenyu; faris, Anthony J.; Martin, Kenneth E.

    2007-12-01

PNNL and BPA have been in close collaboration on laboratory performance evaluation of phasor measurement units for over ten years. A series of evaluation tests are designed to confirm accuracy and determine measurement performance under a variety of conditions that may be encountered in actual use. Ultimately the testing conducted should provide parameters that can be used to adjust all measurements to a standardized basis. These tests are performed with a standard relay test set using recorded files of precisely generated test signals. The test set provides test signals at a level and in a format suitable for input to a PMU that accurately reproduces the signals in both signal amplitude and timing. Test set outputs are checked to confirm the accuracy of the output signal. The recorded signals include both current and voltage waveforms and a digital timing track used to relate the PMU measured value with the test signal. Test signals include steady-state waveforms to test amplitude, phase, and frequency accuracy, modulated signals to determine measurement and rejection bands, and step tests to determine timing and response accuracy. Additional tests are included as necessary to fully describe the PMU operation. Testing is done with a BPA phasor data concentrator (PDC) which provides communication support and monitors data input for dropouts and data errors.
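    The steady-state part of such a comparison relies on a reference phasor computed from the test waveform. The sketch below generates a cycle-synchronous test signal and estimates its phasor with a single-bin DFT; the sample rate, frequency, amplitude, and phase are arbitrary choices, not the BPA/PNNL test parameters.

    ```python
    # Illustrative only: a steady-state test waveform and a reference phasor
    # estimate via single-bin DFT over an integer number of cycles.
    import numpy as np

    FS = 4800.0          # samples per second (80 samples per 60 Hz cycle)
    F0 = 60.0            # nominal frequency, Hz

    def phasor_estimate(x: np.ndarray, fs: float = FS, f0: float = F0) -> complex:
        """Single-bin DFT over an integer number of cycles -> RMS phasor."""
        n = np.arange(x.size)
        dft = np.sum(x * np.exp(-2j * np.pi * f0 * n / fs)) * 2.0 / x.size
        return dft / np.sqrt(2.0)   # peak -> RMS convention

    t = np.arange(0, 0.2, 1.0 / FS)                    # twelve 60 Hz cycles (leakage-free)
    signal = 70.7 * np.cos(2 * np.pi * F0 * t + np.deg2rad(30.0))
    ph = phasor_estimate(signal)
    print(abs(ph), np.degrees(np.angle(ph)))           # ~50 V RMS at ~+30 degrees
    ```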

  16. Performance evaluation of digital phase-locked loops for advanced deep space transponders

    NASA Technical Reports Server (NTRS)

    Nguyen, T. M.; Hinedi, S. M.; Yeh, H.-G.; Kyriacou, C.

    1994-01-01

The performances of digital phase-locked loops (DPLLs) for the advanced deep-space transponders (ADTs) are investigated. The DPLLs considered in this article are derived from the analog phase-locked loop currently employed by the NASA standard deep space transponder, using S-domain to Z-domain mapping techniques. Three mappings are used to develop digital approximations of the standard deep space analog phase-locked loop: the bilinear transformation (BT), the impulse invariant transformation (IIT), and the step invariant transformation (SIT). The performance in terms of closed-loop phase and magnitude responses, carrier tracking jitter, and response of the loop to a phase offset (the difference between the incoming phase and the reference phase) is evaluated for each digital approximation. Theoretical results for the carrier tracking jitter in the command-on and command-off cases are then validated by computer simulation. Both theoretical and simulation results show that at high sampling frequency, the DPLLs obtained from all three transformations have the same tracking jitter; at low sampling frequency, however, the digital approximation using the BT outperforms the others. The minimum sampling frequency for adequate tracking performance is determined for each digital approximation of the analog loop. In addition, computer simulation shows that the DPLL developed with the BT responds to a phase offset faster than those based on the IIT and SIT.
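    The three S-to-Z mappings named above are standard discretization methods and can be compared on a generic loop filter with off-the-shelf tools. The sketch below uses SciPy's `cont2discrete`, applied to an illustrative second-order PLL filter (not the NASA transponder loop); the step-invariant mapping is taken here to correspond to the zero-order-hold (`'zoh'`) method.

    ```python
    # Sketch: discretize a generic analog PLL loop filter with three mappings
    # and compare their frequency responses. Filter coefficients are placeholders.
    import numpy as np
    from scipy.signal import cont2discrete, freqz

    # Analog loop filter H(s) = (tau2*s + 1) / (tau1*s); tau1, tau2 illustrative.
    tau1, tau2 = 1e-3, 2e-4
    num, den = [tau2, 1.0], [tau1, 0.0]

    fs = 100e3                      # sampling rate, Hz
    for method in ("bilinear", "impulse", "zoh"):
        num_d, den_d, _ = cont2discrete((num, den), dt=1.0 / fs, method=method)
        w, h = freqz(np.squeeze(num_d), den_d, worN=512, fs=fs)
        print(method, "gain at 1 kHz ~", np.interp(1e3, w, np.abs(h)).round(3))
    ```

    At sampling rates far above the loop bandwidth the three discretizations behave almost identically, which is consistent with the observation above that the tracking jitter converges at high sampling frequency.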

  17. Setting Foundations for Developing Disaster Response Metrics.

    PubMed

    Abir, Mahshid; Bell, Sue Anne; Puppala, Neha; Awad, Osama; Moore, Melinda

    2017-08-01

    There are few reported efforts to define universal disaster response performance measures. Careful examination of responses to past disasters can inform the development of such measures. As a first step toward this goal, we conducted a literature review to identify key factors in responses to 3 recent events with significant loss of human life and economic impact: the 2003 Bam, Iran, earthquake; the 2004 Indian Ocean tsunami; and the 2010 Haiti earthquake. Using the PubMed (National Library of Medicine, Bethesda, MD) database, we identified 710 articles and retained 124 after applying inclusion and exclusion criteria. Seventy-two articles pertained to the Haiti earthquake, 38 to the Indian Ocean tsunami, and 14 to the Bam earthquake. On the basis of this review, we developed an organizational framework for disaster response performance measurement with 5 key disaster response categories: (1) personnel, (2) supplies and equipment, (3) transportation, (4) timeliness and efficiency, and (5) interagency cooperation. Under each of these, and again informed by the literature, we identified subcategories and specific items that could be developed into standardized performance measures. The validity and comprehensiveness of these measures can be tested by applying them to other recent and future disaster responses, after which standardized performance measures can be developed through a consensus process. (Disaster Med Public Health Preparedness. 2017;11:505-509).

  18. Correcting for the influence of sampling conditions on biomarkers of exposure to phenols and phthalates: a 2-step standardization method based on regression residuals.

    PubMed

    Mortamais, Marion; Chevrier, Cécile; Philippat, Claire; Petit, Claire; Calafat, Antonia M; Ye, Xiaoyun; Silva, Manori J; Brambilla, Christian; Eijkemans, Marinus J C; Charles, Marie-Aline; Cordier, Sylvaine; Slama, Rémy

    2012-04-26

Environmental epidemiology and biomonitoring studies typically rely on biological samples to assay the concentration of non-persistent exposure biomarkers. Between-participant variations in the sampling conditions of these biological samples constitute a potential source of exposure misclassification, and few studies have attempted to correct biomarker levels for this error. We aimed to assess the influence of sampling conditions on concentrations of urinary biomarkers of select phenols and phthalates, two widely produced families of chemicals, and to standardize biomarker concentrations for sampling conditions. Urine samples were collected between 2002 and 2006 among 287 pregnant women from the Eden and Pélagie cohorts, in which phthalate and phenol metabolite levels were assayed. We applied a 2-step standardization method based on regression residuals. First, the influence of sampling conditions (including sampling hour and duration of storage before freezing) and of creatinine levels on biomarker concentrations was characterized using adjusted linear regression models. In the second step, the model estimates were used to remove the variability in biomarker concentrations due to sampling conditions and to standardize concentrations as if all samples had been collected under the same conditions (e.g., the same hour of urine collection). Sampling hour was associated with the concentrations of several exposure biomarkers. After standardization for sampling conditions, median concentrations differed by −38% (for 2,5-dichlorophenol) to +80% (for a metabolite of diisodecyl phthalate). At the individual level, however, standardized biomarker levels were strongly correlated (correlation coefficients above 0.80) with unstandardized measures. Sampling conditions, such as sampling hour, should be systematically collected in biomarker-based studies, in particular when the biomarker half-life is short. The 2-step standardization method based on regression residuals that we propose to limit the impact of heterogeneity in sampling conditions could be further tested in studies describing biomarker levels or their influence on health.
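    The two-step residual-based standardization can be sketched in a few lines of statsmodels code: fit a regression of the (log-transformed) biomarker on sampling conditions, then keep the residual part and add back the prediction at one common set of reference conditions. Column names (`log_conc`, `sampling_hour`, `storage_days`, `log_creatinine`) and the reference values are illustrative assumptions, not the cohorts' actual variables.

    ```python
    # Minimal sketch of a two-step, residual-based standardization.
    import pandas as pd
    import statsmodels.formula.api as smf

    def standardize(df: pd.DataFrame, ref_hour: float = 9.0,
                    ref_storage: float = 1.0) -> pd.Series:
        # Step 1: model the influence of sampling conditions (and creatinine)
        # on the log-transformed biomarker concentration.
        fit = smf.ols("log_conc ~ sampling_hour + storage_days + log_creatinine",
                      data=df).fit()

        # Step 2: keep the residual (subject-specific) part and add back the
        # prediction for a single, common set of sampling conditions.
        ref = df.copy()
        ref["sampling_hour"] = ref_hour
        ref["storage_days"] = ref_storage
        ref["log_creatinine"] = df["log_creatinine"].mean()
        return fit.resid + fit.predict(ref)
    ```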

  19. Independent influence of gait speed and step length on stability and fall risk.

    PubMed

    Espy, D D; Yang, F; Bhatt, T; Pai, Y-C

    2010-07-01

With aging, individuals' gaits become slower and their steps shorter; both are thought to improve stability against balance threats. Recent studies have shown that shorter step lengths, which bring the center of mass (COM) closer to the leading foot, improve stability against slip-related falls. However, a slower gait, hence a lower COM velocity, does the opposite. Because step length and speed are inherently coupled in spontaneous gait, the extent to which the benefit of shorter steps can offset the slower speed is unknown. The purpose of this study was to investigate, through decoupling, the independent effects of gait speed and step length on gait stability and the likelihood of slip-induced falls. Fifty-seven young adults walked at one of three target gait patterns, two of equal speed and two of equal step length; in a later trial, they encountered an unannounced slip. The results supported our hypotheses that faster gait and shorter steps each reduce fall risk when a slip is encountered. This appeared to be attributable to the maintenance of stability from slip initiation to liftoff of the recovery foot during the slip. Successful decoupling of gait speed from step length reveals for the first time that, although slow gait in itself leads to instability and falls (a one-standard-deviation decrease in gait speed increases the odds of falling 4-fold), this effect is offset by the related decrease in step length (the same one-standard-deviation decrease in step length lowers the odds of falling 6-fold). Copyright © 2010 Elsevier B.V. All rights reserved.
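    One way to read the reported effect sizes is through a logistic model for falling with standardized predictors; the back-calculation below is illustrative under that assumption and is not the authors' fitted model.

    ```latex
    % Illustrative back-calculation, assuming a logistic model with standardized
    % gait speed (z_v) and step length (z_l) as predictors.
    \begin{align*}
    \log\frac{P(\text{fall})}{1-P(\text{fall})}
        &= \beta_0 + \beta_v z_v + \beta_\ell z_\ell \\
    e^{-\beta_v} = 4
        &\;\Rightarrow\; \beta_v = -\ln 4 \approx -1.39
        && \text{(one-SD slower gait: odds of falling } \times 4\text{)} \\
    e^{-\beta_\ell} = \tfrac{1}{6}
        &\;\Rightarrow\; \beta_\ell = \ln 6 \approx 1.79
        && \text{(one-SD shorter steps: odds of falling } \div 6\text{)}
    \end{align*}
    ```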

  20. Verification and large scale clinical evaluation of a national standard protocol for Salmonella spp./Shigella spp. screening using real-time PCR combined with guided culture.

    PubMed

    Tang, Xi-Jun; Yang, Ze; Chen, Xin-Bin; Tian, Wen-Fang; Tu, Cheng-Ning; Wang, Hai-Bo

    2018-02-01

Salmonella spp. and Shigella spp. are often associated with food poisoning and fecal-oral transmission of acute gastroenteritis, which requires strict monitoring, especially among people who handle food and water. In 2014, the National Health and Family Planning Commission of the P. R. China issued a national standard protocol (recommendatory) for the screening of Salmonella spp./Shigella spp. However, its performance had not been fully studied, and whether it was suitable for use in our laboratory was still unknown. In the current study, the new protocol was first verified by various experiments and then its clinical performance was evaluated in about 20,000 stool samples over a three-year period. Verification results showed that the new protocol was highly specific and reproducible. The sensitivity (defined as the lower limit of detection) of the new protocol at the PCR step was 10^3 CFU/mL for Salmonella spp. and 10^1 CFU/mL for Shigella spp., while that at the guided culture step was 10^4 CFU/mL and 10^3 CFU/mL, respectively. The large-scale clinical evaluation indicated that the new protocol could increase the positivity rate twofold and significantly decrease the workload and median turnaround time. In conclusion, the protocol was verified and evaluated and proved to be a valuable platform for the rapid, specific, sensitive, and high-throughput screening of Salmonella spp./Shigella spp. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Evaluation of prototype air/fluid separator for Space Station Freedom Health Maintenance Facility

    NASA Technical Reports Server (NTRS)

    Billica, Roger; Smith, Maureen; Murphy, Linda; Kizzee, Victor D.

    1991-01-01

    A prototype air/fluid separator suction apparatus proposed as a possible design for use with the Health Maintenance Facility aboard Space Station Freedom (SSF) was evaluated. A KC-135 parabolic flight test was performed for this purpose. The flights followed the standard 40 parabola profile with 20 to 25 seconds of near-zero gravity in each parabola. A protocol was prepared to evaluate the prototype device in several regulator modes (or suction force), using three fluids of varying viscosity, and using either continuous or intermittent suction. It was felt that a matrixed approach would best approximate the range of utilization anticipated for medical suction on SSF. The protocols were performed in one-gravity in a lab setting to familiarize the team with procedures and techniques. Identical steps were performed aboard the KC-135 during parabolic flight.

  2. One Small Step for the Gram Stain, One Giant Leap for Clinical Microbiology.

    PubMed

    Thomson, Richard B

    2016-06-01

    The Gram stain is one of the most commonly performed tests in the clinical microbiology laboratory, yet it is poorly controlled and lacks standardization. It was once the best rapid test in microbiology, but it is no longer trusted by many clinicians. The publication by Samuel et al. (J. Clin. Microbiol. 54:1442-1447, 2016, http://dx.doi.org/10.1128/JCM.03066-15) is a start for those who want to evaluate and improve Gram stain performance. In an age of emerging rapid molecular results, is the Gram stain still relevant? How should clinical microbiologists respond to the call to reduce Gram stain error rates? Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  3. [Health protection for rural workers: the need to standardize techniques for quantifying dermal exposure to pesticides].

    PubMed

    Selmi, Giuliana da Fontoura Rodrigues; Trapé, Angelo Zanaga

    2014-05-01

    Quantification of dermal exposure to pesticides in rural workers, used in risk assessment, can be performed with different techniques such as patches or whole body evaluation. However, the wide variety of methods can jeopardize the process by producing disparate results, depending on the principles in sample collection. A critical review was thus performed on the main techniques for quantifying dermal exposure, calling attention to this issue and the need to establish a single methodology for quantification of dermal exposure in rural workers. Such harmonization of different techniques should help achieve safer and healthier working conditions. Techniques that can provide reliable exposure data are an essential first step towards avoiding harm to workers' health.

  4. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  5. Evidence-Based Consensus Recommendations for Colposcopy Practice for Cervical Cancer Prevention in the United States.

    PubMed

    Wentzensen, Nicolas; Massad, L Stewart; Mayeaux, Edward J; Khan, Michelle J; Waxman, Alan G; Einstein, Mark H; Conageski, Christine; Schiffman, Mark H; Gold, Michael A; Apgar, Barbara S; Chelmow, David; Choma, Kim K; Darragh, Teresa M; Gage, Julia C; Garcia, Francisco A R; Guido, Richard S; Jeronimo, Jose A; Liu, Angela; Mathews, Cara A; Mitchell, Martha M; Moscicki, Anna-Barbara; Novetsky, Akiva P; Papasozomenos, Theognosia; Perkins, Rebecca B; Silver, Michelle I; Smith, Katie M; Stier, Elizabeth A; Tedeschi, Candice A; Werner, Claudia L; Huh, Warner K

    2017-10-01

    The American Society for Colposcopy and Cervical Pathology (ASCCP) Colposcopy Standards recommendations address the role of colposcopy and directed biopsy for cervical cancer prevention in the United States (US). The recommendations were developed by an expert working group appointed by ASCCP's Board of Directors. An extensive literature review was conducted and supplemented by a systematic review and meta-analysis of unpublished data. In addition, a survey of practicing colposcopists was conducted to assess current colposcopy practice in the US. Recommendations were approved by the working group members, and the final revisions were made based on comments received from the public. The recommendations cover terminology, risk-based colposcopy, colposcopy procedures, and colposcopy adjuncts. The ASCCP Colposcopy Standards recommendations are an important step toward raising the standard of colposcopy services delivered to women in the US. Because cervical cancer screening programs are currently undergoing important changes that may affect colposcopy performance, updates to some of the current recommendations may be necessary in the future.

  6. Health level 7 development framework for medication administration.

    PubMed

    Kim, Hwa Sun; Cho, Hune

    2009-01-01

We propose the creation of a standard data model for medication administration activities through the development of a clinical document architecture using the Health Level 7 Development Framework process, based on an object-oriented analysis and the development method of Health Level 7 Version 3. Medication administration is the most common activity performed by clinical professionals in healthcare settings, and a standardized information model and structured hospital information system are necessary to achieve evidence-based clinical activities. A virtual scenario is used to demonstrate the proposed method of administering medication. We used the Health Level 7 Development Framework and other tools to create the clinical document architecture, which allowed us to illustrate each step of the Health Level 7 Development Framework in the administration of medication. We generated an information model of the medication administration process as one clinical activity. This model can serve as a fundamental conceptual reference for healthcare professionals and nursing practitioners seeking to understand international-standard methodology when modeling healthcare information systems.

  7. Whole body MRI, including diffusion-weighted imaging in follow-up of patients with testicular cancer.

    PubMed

    Mosavi, Firas; Laurell, Anna; Ahlström, Håkan

    2015-11-01

Whole body (WB) magnetic resonance imaging (MRI), including diffusion-weighted imaging (DWI), has become increasingly utilized in cancer imaging, yet the clinical utility of these techniques in the follow-up of testicular cancer patients has not been evaluated. The purpose of this study was to evaluate the feasibility of WB MRI with the continuous table movement (CTM) technique, including multi-step DWI, in the follow-up of patients with testicular cancer. WB MRI including DWI was performed in the follow-up of 71 consecutive patients (median age, 37 years; range, 19-84) with histologically confirmed testicular cancer. The WB MRI protocol included axial T1-Dixon and T2-BLADE sequences using the CTM technique. Furthermore, multi-step DWI was performed using b-values of 50 and 1000 s/mm². One criterion for feasibility was patient tolerance and satisfactory image quality; another was the accuracy in detection of any pathological mass compared with the standard of reference. Signal intensity on DWI was used for the evaluation of residual mass activity. Clinical, laboratory, and imaging follow-up served as the standard of reference for the evaluation of WB MRI. WB MRI was tolerated by nearly all patients (69/71 patients, 97%) and the image quality was satisfactory. Metal artifacts deteriorated the image quality in six patients but did not influence the overall results. No case of clinical relapse was observed during the follow-up time. There was good agreement between conventional WB MRI and the standard of reference in all patients. Three patients showed residual masses, and the DWI signal was not restricted in these patients. Furthermore, DWI showed abnormally high signal intensity in a normal-sized retroperitoneal lymph node, indicating metastasis; subsequent 18F-FDG PET/CT verified the finding. WB MRI with the CTM technique including multi-step DWI is feasible in the follow-up of patients with testicular cancer. DWI may contribute important added value to conventional MRI sequences regarding the activity of residual masses.

  8. Impact of temporal resolution of inputs on hydrological model performance: An analysis based on 2400 flood events

    NASA Astrophysics Data System (ADS)

    Ficchì, Andrea; Perrin, Charles; Andréassian, Vazken

    2016-07-01

    Hydro-climatic data at short time steps are considered essential to model the rainfall-runoff relationship, especially for short-duration hydrological events, typically flash floods. Also, using fine time step information may be beneficial when using or analysing model outputs at larger aggregated time scales. However, the actual gain in prediction efficiency using short time-step data is not well understood or quantified. In this paper, we investigate the extent to which the performance of hydrological modelling is improved by short time-step data, using a large set of 240 French catchments, for which 2400 flood events were selected. Six-minute rain gauge data were available and the GR4 rainfall-runoff model was run with precipitation inputs at eight different time steps ranging from 6 min to 1 day. Then model outputs were aggregated at seven different reference time scales ranging from sub-hourly to daily for a comparative evaluation of simulations at different target time steps. Three classes of model performance behaviour were found for the 240 test catchments: (i) significant improvement of performance with shorter time steps; (ii) performance insensitivity to the modelling time step; (iii) performance degradation as the time step becomes shorter. The differences between these groups were analysed based on a number of catchment and event characteristics. A statistical test highlighted the most influential explanatory variables for model performance evolution at different time steps, including flow auto-correlation, flood and storm duration, flood hydrograph peakedness, rainfall-runoff lag time and precipitation temporal variability.
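    The aggregate-and-evaluate idea described above (run at a fine time step, compare at a coarser reference scale) can be sketched with pandas resampling and the Nash-Sutcliffe efficiency. The GR4 model itself is not implemented here; `sim_6min` and `obs_6min` are assumed to be timestamp-indexed pandas Series of simulated and observed flows.

    ```python
    # Sketch: aggregate 6-minute simulated and observed flows to a coarser
    # target scale and score them with the Nash-Sutcliffe efficiency (NSE).
    import numpy as np
    import pandas as pd

    def nse(sim: pd.Series, obs: pd.Series) -> float:
        """Nash-Sutcliffe efficiency: 1 is perfect, 0 is no better than the obs mean."""
        return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def score_at_scale(sim_6min: pd.Series, obs_6min: pd.Series, rule: str) -> float:
        """Aggregate 6-minute series to e.g. rule='1h' or '1D' and compute NSE."""
        sim_agg = sim_6min.resample(rule).mean()
        obs_agg = obs_6min.resample(rule).mean()
        return nse(sim_agg, obs_agg)
    ```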

  9. 40 CFR 63.11502 - What definitions apply to this subpart?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... begins with the initiation of steps as described in a written standard operating procedures (SOP) or... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Chemical Manufacturing Area Sources...

  10. The Next Generation Science Standards: The Features and Challenges

    ERIC Educational Resources Information Center

    Pruitt, Stephen L.

    2014-01-01

    Beginning in January of 2010, the Carnegie Corporation of New York funded a two-step process to develop a new set of state developed science standards intended to prepare students for college and career readiness in science. These new internationally benchmarked science standards, the Next Generation Science Standards (NGSS) were completed in…

  11. Laser capture microdissection of embryonic cells and preparation of RNA for microarray assays.

    PubMed

    Redmond, Latasha C; Pang, Christopher J; Dumur, Catherine; Haar, Jack L; Lloyd, Joyce A

    2014-01-01

    In order to compare the global gene expression profiles of different embryonic cell types, it is first necessary to isolate the specific cells of interest. The purpose of this chapter is to provide a step-by-step protocol to perform laser capture microdissection (LCM) on embryo samples and obtain sufficient amounts of high-quality RNA for microarray hybridizations. Using the LCM/microarray strategy on mouse embryo samples has some challenges, because the cells of interest are available in limited quantities. The first step in the protocol is to obtain embryonic tissue, and immediately cryoprotect and freeze it in a cryomold containing Optimal Cutting Temperature freezing media (Sakura Finetek), using a dry ice-isopentane bath. The tissue is then cryosectioned, and the microscope slides are processed to fix, stain, and dehydrate the cells. LCM is employed to isolate specific cell types from the slides, identified under the microscope by virtue of their morphology. Detailed protocols are provided for using the currently available ArcturusXT LCM instrument and CapSure(®) LCM Caps, to which the selected cells adhere upon laser capture. To maintain RNA integrity, upon removing a slide from the final processing step, or attaching the first cells on the LCM cap, LCM is completed within 20 min. The cells are then immediately recovered from the LCM cap using a denaturing solution that stabilizes RNA integrity. RNA is prepared using standard methods, modified for working with small samples. To ensure the validity of the microarray data, the quality of the RNA is assessed using the Agilent bioanalyzer. Only RNA that is of sufficient integrity and quantity is used to perform microarray assays. This chapter provides guidance regarding troubleshooting and optimization to obtain high-quality RNA from cells of limited availability, obtained from embryo samples by LCM.

  12. Balance and postural skills in normal-weight and overweight prepubertal boys.

    PubMed

    Deforche, Benedicte I; Hills, Andrew P; Worringham, Charles J; Davies, Peter S W; Murphy, Alexia J; Bouckaert, Jacques J; De Bourdeaudhuij, Ilse M

    2009-01-01

    This study investigated differences in balance and postural skills in normal-weight versus overweight prepubertal boys. Fifty-seven 8-10-year-old boys were categorized overweight (N = 25) or normal-weight (N = 32) according to the International Obesity Task Force cut-off points for overweight in children. The Balance Master, a computerized pressure plate system, was used to objectively measure six balance skills: sit-to-stand, walk, step up/over, tandem walk (walking on a line), unilateral stance and limits of stability. In addition, three standardized field tests were employed: standing on one leg on a balance beam, walking heel-to-toe along the beam and the multiple sit-to-stand test. Overweight boys showed poorer performances on several items assessed on the Balance Master. Overweight boys had slower weight transfer (p < 0.05), lower rising index (p < 0.05) and greater sway velocity (p < 0.001) in the sit-to-stand test, greater step width while walking (p < 0.05) and lower speed when walking on a line (p < 0.01) compared with normal-weight counterparts. Performance on the step up/over test, the unilateral stance and the limits of stability were comparable between both groups. On the balance beam, overweight boys could not hold their balance on one leg as long (p < 0.001) and had fewer correct steps in the heel-to-toe test (p < 0.001) than normal-weight boys. Finally, overweight boys were slower in standing up and sitting down five times in the multiple sit-to-stand task (p < 0.01). This study demonstrates that when categorised by body mass index (BMI) level, overweight prepubertal boys displayed lower capacity on several static and dynamic balance and postural skills.

  13. Acute experimental hip muscle pain alters single-leg squat balance in healthy young adults.

    PubMed

    Hatton, Anna L; Crossley, Kay M; Hug, François; Bouma, James; Ha, Bonnie; Spaulding, Kara L; Tucker, Kylie

    2015-05-01

    Clinical musculoskeletal pain commonly accompanies hip pathology and can impact balance performance. Due to the cross-sectional designs of previous studies, and the multifactorial nature of musculoskeletal pain conditions, it is difficult to determine whether pain is a driver of balance impairments in this population. This study explored the effects of experimentally induced hip muscle pain on static and dynamic balance. Twelve healthy adults (4 women, mean[SD]: 27.1[3] years) performed three balance tasks on each leg, separately: single-leg standing (eyes closed), single-leg squat (eyes open), forward step (eyes open); before and after hypertonic saline injection (1ml, 5% NaCl) into the right gluteus medius. Range, standard deviation (SD), and velocity of the centre of pressure (CoP) in medio-lateral (ML) and anterior-posterior (AP) directions were considered. During the single-leg squat task, experimental hip pain was associated with significantly reduced ML range (-4[13]%, P=0.028), AP range (-14[21]%, P=0.005), APSD (-15[28]%, P=0.009), and AP velocity (-6[13]%, P=0.032), relative to the control condition, in both legs. No effect of pain was observed during single-leg standing and forward stepping. Significant between-leg differences in ML velocity were observed during the forward stepping task (P=0.034). Pain is a potentially modifiable patient-reported outcome in individuals with hip problems. This study demonstrates that acute hip muscle pain alone, without interference of musculoskeletal pathology, does not lead to the same impairments in balance as exhibited in clinical populations with hip pathologies. This is the first step in understanding how and why balance is altered in painful hip pathologies. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Laser Capture Microdissection of Embryonic Cells and Preparation of RNA for Microarray Assays

    PubMed Central

    Redmond, Latasha C.; Pang, Christopher J.; Dumur, Catherine; Haar, Jack L.; Lloyd, Joyce A.

    2014-01-01

    In order to compare the global gene expression profiles of different embryonic cell types, it is first necessary to isolate the specific cells of interest. The purpose of this chapter is to provide a step-by-step protocol to perform laser capture microdissection (LCM) on embryo samples and obtain sufficient amounts of high-quality RNA for microarray hybridizations. Using the LCM/microarray strategy on mouse embryo samples has some challenges, because the cells of interest are available in limited quantities. The first step in the protocol is to obtain embryonic tissue, and immediately cryoprotect and freeze it in a cryomold containing Optimal Cutting Temperature freezing media (Sakura Finetek), using a dry ice–isopentane bath. The tissue is then cryosectioned, and the microscope slides are processed to fix, stain, and dehydrate the cells. LCM is employed to isolate specific cell types from the slides, identified under the microscope by virtue of their morphology. Detailed protocols are provided for using the currently available ArcturusXT LCM instrument and CapSure® LCM Caps, to which the selected cells adhere upon laser capture. To maintain RNA integrity, upon removing a slide from the final processing step, or attaching the first cells on the LCM cap, LCM is completed within 20 min. The cells are then immediately recovered from the LCM cap using a denaturing solution that stabilizes RNA integrity. RNA is prepared using standard methods, modified for working with small samples. To ensure the validity of the microarray data, the quality of the RNA is assessed using the Agilent bioanalyzer. Only RNA that is of sufficient integrity and quantity is used to perform microarray assays. This chapter provides guidance regarding troubleshooting and optimization to obtain high-quality RNA from cells of limited availability, obtained from embryo samples by LCM. PMID:24318813

  15. Deliberate Practice as a Theoretical Framework for Interprofessional Experiential Education.

    PubMed

    Wang, Joyce M; Zorek, Joseph A

    2016-01-01

    The theory of deliberate practice has been applied to many skill-based performance activities. The primary aim of this project was to integrate synergistic principles from deliberate practice and consensus-derived competencies for interprofessional education into a framework upon which educational models to advance interprofessional experiential education (IEE) might be built. CINAHL, ERIC, and MEDLINE databases were searched using the keywords "deliberate practice" and "interprofessional education," both individually and in combination. Relevant articles were selected from the catalog based on support for the premise of the project. Defining characteristics of deliberate practice were distilled with particular emphasis on their application to the Interprofessional Education Collaborative's (IPEC) core competencies. Recommendations for IEE development were identified through the synthesis of deliberate practice principles and IPEC competencies. There is a high degree of synergy between deliberate practice principles and IPEC competencies. Our synthesis of the literature yielded a cyclical four-step process to advance IEE: (1) implement an IEE plan guided by the student's strengths/weaknesses and in consideration of the collaborative practice skills they wish to develop, (2) engage in IPE experiences that will challenge targeted skills according to the IEE plan, (3) embed frequent opportunities for student reflection and preceptor/team feedback within IEE plan, and (4) revise the IEE plan and the IPE experience based on insights gained during step 3. The cyclical four-step process synthesized through this literature review may be used to guide the development of new IEE models. The purposeful development of IEE models grounded in a theory that has already been operationalized in other skill-based performance areas is an important step to address expanding accreditation standards throughout the health professions mandating interprofessional education for pre-licensure health professional students.

  16. GWAS with longitudinal phenotypes: performance of approximate procedures

    PubMed Central

    Sikorska, Karolina; Montazeri, Nahid Mostafavi; Uitterlinden, André; Rivadeneira, Fernando; Eilers, Paul HC; Lesaffre, Emmanuel

    2015-01-01

    Analysis of genome-wide association studies with longitudinal data using standard procedures, such as linear mixed model (LMM) fitting, leads to discouragingly long computation times. There is a need to speed up the computations significantly. In our previous work (Sikorska et al: Fast linear mixed model computations for genome-wide association studies with longitudinal data. Stat Med 2012; 32.1: 165–180), we proposed the conditional two-step (CTS) approach as a fast method providing an approximation to the P-value for the longitudinal single-nucleotide polymorphism (SNP) effect. In the first step a reduced conditional LMM is fit, omitting all the SNP terms. In the second step, the estimated random slopes are regressed on SNPs. The CTS has been applied to the bone mineral density data from the Rotterdam Study and proved to work very well even in unbalanced situations. In another article (Sikorska et al: GWAS on your notebook: fast semi-parallel linear and logistic regression for genome-wide association studies. BMC Bioinformatics 2013; 14: 166), we suggested semi-parallel computations, greatly speeding up fitting many linear regressions. Combining CTS with fast linear regression reduces the computation time from several weeks to a few minutes on a single computer. Here, we explore further the properties of the CTS both analytically and by simulations. We investigate the performance of our proposal in comparison with a related but different approach, the two-step procedure. It is analytically shown that for the balanced case, under mild assumptions, the P-value provided by the CTS is the same as from the LMM. For unbalanced data and in realistic situations, simulations show that the CTS method does not inflate the type I error rate and implies only a minimal loss of power. PMID:25712081
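    The conditional two-step idea summarized above can be sketched schematically with statsmodels: one reduced mixed model without SNP terms, then one fast linear regression of the estimated random slopes per SNP. This is a sketch of the general approach, not the authors' exact implementation; the columns (`y`, `time`, `id`) and the genotype matrix (indexed by subject id) are hypothetical.

    ```python
    # Schematic conditional two-step (CTS): step 1 fits one reduced LMM; step 2
    # regresses the estimated random slopes on each SNP separately.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    def conditional_two_step(df: pd.DataFrame, genotypes: pd.DataFrame) -> pd.Series:
        # Step 1: reduced linear mixed model with random intercepts and slopes,
        # omitting all SNP terms.
        lmm = smf.mixedlm("y ~ time", df, groups=df["id"], re_formula="~time").fit()
        slopes = pd.Series({g: re["time"] for g, re in lmm.random_effects.items()},
                           name="slope")

        # Step 2: simple linear regression of the slope estimates on each SNP
        # (fast, and easy to parallelize or vectorize across SNPs).
        pvals = {}
        for snp in genotypes.columns:
            X = sm.add_constant(genotypes.loc[slopes.index, snp])
            pvals[snp] = sm.OLS(slopes, X).fit().pvalues[snp]
        return pd.Series(pvals)
    ```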

  17. Time series analysis as input for clinical predictive modeling: Modeling cardiac arrest in a pediatric ICU

    PubMed Central

    2011-01-01

    Background Thousands of children experience cardiac arrest events every year in pediatric intensive care units. Most of these children die. Cardiac arrest prediction tools are used as part of medical emergency team evaluations to identify patients in standard hospital beds that are at high risk for cardiac arrest. There are no models to predict cardiac arrest in pediatric intensive care units though, where the risk of an arrest is 10 times higher than for standard hospital beds. Current tools are based on a multivariable approach that does not characterize deterioration, which often precedes cardiac arrests. Characterizing deterioration requires a time series approach. The purpose of this study is to propose a method that will allow for time series data to be used in clinical prediction models. Successful implementation of these methods has the potential to bring arrest prediction to the pediatric intensive care environment, possibly allowing for interventions that can save lives and prevent disabilities. Methods We reviewed prediction models from nonclinical domains that employ time series data, and identified the steps that are necessary for building predictive models using time series clinical data. We illustrate the method by applying it to the specific case of building a predictive model for cardiac arrest in a pediatric intensive care unit. Results Time course analysis studies from genomic analysis provided a modeling template that was compatible with the steps required to develop a model from clinical time series data. The steps include: 1) selecting candidate variables; 2) specifying measurement parameters; 3) defining data format; 4) defining time window duration and resolution; 5) calculating latent variables for candidate variables not directly measured; 6) calculating time series features as latent variables; 7) creating data subsets to measure model performance effects attributable to various classes of candidate variables; 8) reducing the number of candidate features; 9) training models for various data subsets; and 10) measuring model performance characteristics in unseen data to estimate their external validity. Conclusions We have proposed a ten step process that results in data sets that contain time series features and are suitable for predictive modeling by a number of methods. We illustrated the process through an example of cardiac arrest prediction in a pediatric intensive care setting. PMID:22023778
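    Steps 4-6 of the process above (time window, latent variables, time series features) can be illustrated with a small feature-extraction sketch: summarizing a vital-sign series over a fixed window with simple trend and variability features that a downstream classifier could use. The column name (`heart_rate`), the timestamp index, and the 12-hour window are assumptions, not the study's actual specification.

    ```python
    # Sketch: time series features from the most recent window of a
    # timestamp-indexed vital-sign DataFrame.
    import numpy as np
    import pandas as pd

    def window_features(vitals: pd.DataFrame, window: str = "12h") -> pd.Series:
        cutoff = vitals.index.max() - pd.Timedelta(window)
        recent = vitals.loc[vitals.index >= cutoff, "heart_rate"].dropna()
        hours = (recent.index - recent.index[0]).total_seconds() / 3600.0
        slope = np.polyfit(hours, recent.to_numpy(), 1)[0] if len(recent) > 1 else 0.0
        return pd.Series({
            "hr_mean": recent.mean(),     # central tendency
            "hr_sd": recent.std(),        # variability
            "hr_slope": slope,            # linear trend (possible deterioration signal)
            "hr_last": recent.iloc[-1],   # most recent value
        })
    ```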

  18. Time series analysis as input for clinical predictive modeling: modeling cardiac arrest in a pediatric ICU.

    PubMed

    Kennedy, Curtis E; Turley, James P

    2011-10-24

    Thousands of children experience cardiac arrest events every year in pediatric intensive care units. Most of these children die. Cardiac arrest prediction tools are used as part of medical emergency team evaluations to identify patients in standard hospital beds that are at high risk for cardiac arrest. There are no models to predict cardiac arrest in pediatric intensive care units though, where the risk of an arrest is 10 times higher than for standard hospital beds. Current tools are based on a multivariable approach that does not characterize deterioration, which often precedes cardiac arrests. Characterizing deterioration requires a time series approach. The purpose of this study is to propose a method that will allow for time series data to be used in clinical prediction models. Successful implementation of these methods has the potential to bring arrest prediction to the pediatric intensive care environment, possibly allowing for interventions that can save lives and prevent disabilities. We reviewed prediction models from nonclinical domains that employ time series data, and identified the steps that are necessary for building predictive models using time series clinical data. We illustrate the method by applying it to the specific case of building a predictive model for cardiac arrest in a pediatric intensive care unit. Time course analysis studies from genomic analysis provided a modeling template that was compatible with the steps required to develop a model from clinical time series data. The steps include: 1) selecting candidate variables; 2) specifying measurement parameters; 3) defining data format; 4) defining time window duration and resolution; 5) calculating latent variables for candidate variables not directly measured; 6) calculating time series features as latent variables; 7) creating data subsets to measure model performance effects attributable to various classes of candidate variables; 8) reducing the number of candidate features; 9) training models for various data subsets; and 10) measuring model performance characteristics in unseen data to estimate their external validity. We have proposed a ten step process that results in data sets that contain time series features and are suitable for predictive modeling by a number of methods. We illustrated the process through an example of cardiac arrest prediction in a pediatric intensive care setting.

  19. Defining the cognitive enhancing properties of video games: Steps Towards Standardization and Translation.

    PubMed

    Goodwin, Shikha Jain; Dziobek, Derek

    2016-09-01

Ever since video games were available to the general public, they have intrigued brain researchers for many reasons. There is an enormous amount of diversity in the video game research, ranging from types of video games used, the amount of time spent playing video games, the definition of video gamer versus non-gamer to the results obtained after playing video games. In this paper, our goal is to provide a critical discussion of these issues, along with some steps towards generalization using the discussion of an article published by Clemenson and Stark (2005) as the starting point. The authors used a distinction between 2D versus 3D video games to compare their effects on learning and memory in humans. The primary hypothesis of the authors is that the exploration of virtual environments while playing video games is a human correlate of environment enrichment. The authors found that video gamers performed better than non-gamers, and that if non-gamers are trained to play video games, 3D games provide better environment enrichment than 2D video games, as indicated by better memory scores. The end goal of standardization in video games is to be able to translate the field so that the results can be used for greater good.

  20. Practical Application of Linear Growth Measurements in Clinical Research in Low- and Middle-Income Countries

    PubMed Central

    Wit, Jan M.; Himes, John H.; van Buuren, Stef; Denno, Donna M.; Suchdev, Parminder S.

    2017-01-01

    Background/Aims Childhood stunting is a prevalent problem in low- and middle-income countries and is associated with long-term adverse neurodevelopment and health outcomes. In this review, we define indicators of growth, discuss key challenges in their analysis and application, and offer suggestions for indicator selection in clinical research contexts. Methods Critical review of the literature. Results Linear growth is commonly expressed as length-for-age or height-for-age z-score (HAZ) in comparison to normative growth standards. Conditional HAZ corrects for regression to the mean where growth changes relate to previous status. In longitudinal studies, growth can be expressed as ΔHAZ at 2 time points. Multilevel modeling is preferable when more measurements per individual child are available over time. Height velocity z-score reference standards are available for children under the age of 2 years. Adjusting for covariates or confounders (e.g., birth weight, gestational age, sex, parental height, maternal education, socioeconomic status) is recommended in growth analyses. Conclusion The most suitable indicator(s) for linear growth can be selected based on the number of available measurements per child and the child's age. By following a step-by-step algorithm, growth analyses can be precisely and accurately performed to allow for improved comparability within and between studies. PMID:28196362
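    The indicators described above translate directly into code: HAZ against a normative reference, the change in HAZ between two visits, and a "conditional HAZ" computed as the residual from regressing follow-up HAZ on baseline HAZ so that it is uncorrelated with baseline status. The sketch below uses a simplified median/SD reference lookup as an assumption; real analyses would use the WHO growth standards (which are LMS-based).

    ```python
    # Sketch of HAZ, delta-HAZ, and conditional HAZ; reference values and
    # column names are simplified assumptions.
    import pandas as pd
    import statsmodels.api as sm

    def haz(height_cm, ref_median_cm, ref_sd_cm):
        """Height-for-age z-score against an external reference."""
        return (height_cm - ref_median_cm) / ref_sd_cm

    def conditional_haz(haz_baseline: pd.Series, haz_followup: pd.Series) -> pd.Series:
        """Growth beyond what baseline HAZ predicts (residual of follow-up on baseline)."""
        X = sm.add_constant(haz_baseline)
        return sm.OLS(haz_followup, X).fit().resid

    # Delta HAZ for two time points is simply: haz_followup - haz_baseline.
    ```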

  1. Defining the cognitive enhancing properties of video games: Steps Towards Standardization and Translation

    PubMed Central

    Goodwin, Shikha Jain; Dziobek, Derek

    2016-01-01

Ever since video games were available to the general public, they have intrigued brain researchers for many reasons. There is an enormous amount of diversity in the video game research, ranging from types of video games used, the amount of time spent playing video games, the definition of video gamer versus non-gamer to the results obtained after playing video games. In this paper, our goal is to provide a critical discussion of these issues, along with some steps towards generalization using the discussion of an article published by Clemenson and Stark (2005) as the starting point. The authors used a distinction between 2D versus 3D video games to compare their effects on learning and memory in humans. The primary hypothesis of the authors is that the exploration of virtual environments while playing video games is a human correlate of environment enrichment. The authors found that video gamers performed better than non-gamers, and that if non-gamers are trained to play video games, 3D games provide better environment enrichment than 2D video games, as indicated by better memory scores. The end goal of standardization in video games is to be able to translate the field so that the results can be used for greater good. PMID:27747256

  2. Modelling and performance of Nb SIS mixers in the 1.3 mm and 0.8 mm bands

    NASA Technical Reports Server (NTRS)

    Karpov, A.; Carter, M.; Lazareff, B.; Billon-Pierron, D.; Gundlach, K. H.

    1992-01-01

    We describe the modeling and subsequent improvements of SIS waveguide mixers for the 200-270 and 330-370 GHz bands (Blundell, Carter, and Gundlach 1988, Carter et al 1991). These mixers are constructed for use in receivers on IRAM radiotelescopes on Pico Veleta (Spain, Sierra Nevada) and Plateau de Bure (French Alps), and must meet specific requirements. The standard reduced height waveguide structure with suspended stripline is first analyzed and a model is validated through comparison with scale model and working scale measurements. In the first step, the intrinsic limitations of the standard mixer structure are identified, and the parameters are optimized bearing in mind the radioastronomical applications. In the second step, inductive tuning of the junctions is introduced and optimized for minimum noise and maximum bandwidth. In the 1.3 mm band, a DSB receiver temperature of less than 110 K (minimum 80 K) is measured from 180 through 260 GHz. In the 0.8 mm band, a DSB receiver temperature of less than 250 K (minimum 175 K) is obtained between 325 and 355 GHz. All these results are obtained with room-temperature optics and a 4 GHz IF chain having a 500 MHz bandwidth and a noise temperature of 14 K.

  3. Brain Computer Interfaces, a Review

    PubMed Central

    Nicolas-Alonso, Luis Fernando; Gomez-Gil, Jaime

    2012-01-01

    A brain-computer interface (BCI) is a hardware and software communications system that permits cerebral activity alone to control computers or external devices. The immediate goal of BCI research is to provide communications capabilities to severely disabled people who are totally paralyzed or ‘locked in’ by neurological neuromuscular disorders, such as amyotrophic lateral sclerosis, brain stem stroke, or spinal cord injury. Here, we review the state-of-the-art of BCIs, looking at the different steps that form a standard BCI: signal acquisition, preprocessing or signal enhancement, feature extraction, classification and the control interface. We discuss their advantages, drawbacks, and latest advances, and we survey the numerous technologies reported in the scientific literature to design each step of a BCI. First, the review examines the neuroimaging modalities used in the signal acquisition step, each of which monitors a different functional brain activity such as electrical, magnetic or metabolic activity. Second, the review discusses different electrophysiological control signals that determine user intentions, which can be detected in brain activity. Third, the review includes some techniques used in the signal enhancement step to deal with the artifacts in the control signals and improve the performance. Fourth, the review studies some mathematic algorithms used in the feature extraction and classification steps which translate the information in the control signals into commands that operate a computer or other device. Finally, the review provides an overview of various BCI applications that control a range of devices. PMID:22438708
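
    The pipeline described above (signal acquisition, preprocessing, feature extraction, classification, control interface) can be illustrated with a minimal, hypothetical sketch. The synthetic "EEG" signal, the 8-30 Hz band choice, and the LDA classifier below are illustrative assumptions, not the specific methods surveyed in the review.

    ```python
    # Minimal sketch of a generic BCI processing chain on synthetic "EEG" data.
    # All signal parameters and the classifier choice are illustrative assumptions.
    import numpy as np
    from scipy.signal import butter, lfilter
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    fs = 250                            # sampling rate in Hz (assumed)
    n_trials, n_samples = 120, fs * 2   # 2-second trials

    # --- Signal acquisition (here: synthetic two-class data) -------------------
    t = np.arange(n_samples) / fs
    labels = rng.integers(0, 2, n_trials)          # 0 = "rest", 1 = "movement imagery"
    trials = rng.normal(0.0, 1.0, (n_trials, n_samples))
    trials[labels == 1] += 0.8 * np.sin(2 * np.pi * 12 * t)   # extra 12 Hz rhythm for class 1

    # --- Preprocessing / signal enhancement: band-pass filter ------------------
    b, a = butter(4, [8 / (fs / 2), 30 / (fs / 2)], btype="band")
    filtered = lfilter(b, a, trials, axis=1)

    # --- Feature extraction: log band power per trial --------------------------
    features = np.log(np.var(filtered, axis=1, keepdims=True))

    # --- Classification ---------------------------------------------------------
    train, test = slice(0, 80), slice(80, None)
    clf = LinearDiscriminantAnalysis().fit(features[train], labels[train])
    accuracy = clf.score(features[test], labels[test])

    # --- Control interface: map the decoded class to a device command ----------
    command = {0: "idle", 1: "move_cursor"}[int(clf.predict(features[-1:])[0])]
    print(f"test accuracy: {accuracy:.2f}, last command: {command}")
    ```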

  4. Science Teacher Leaders: Exploring Practices and Potential

    NASA Astrophysics Data System (ADS)

    Stinson, John Kevin

    It has become standard practice for teachers to step into the role of "teacher leaders" and perform a variety of curriculum, instruction and assessment tasks for schools and school districts. The literature regarding these Ohio K-12 teacher leaders, who may perform these tasks in addition to or in lieu of regular teaching assignments, rarely includes a disciplinary focus. In this exploratory, descriptive study the results of a web-based survey containing both closed and open-ended items were used in an inquiry into teacher leaders working with the discipline of science. Data from Ohio teachers responding to the survey were used first to create a standard profile for science teacher leaders. Descriptive statistics and correlations were then performed on quantitative survey data to explore science teacher leader tasks and factors that influence task performance. Analysis of data included descriptions of sense of purpose for their role held by these science teacher leaders. Results indicate that science teacher leaders appear to embrace their role as advocates for science and have great potential for implementing science education reform as well as other science-related school initiatives. Aligning performance, administrative oversight, impact on student achievement and teacher training concerning tasks science teacher leaders are expected to perform would enhance this potential. However, science teacher leaders face challenges to realizing that potential due to ambiguity of their leadership role, the breadth of tasks they tend to perform and lack of alignment between task and outcomes.

  5. Redo Laparoscopic Gastric Bypass: One-Step or Two-Step Procedure?

    PubMed

    Theunissen, Caroline M J; Guelinckx, Nele; Maring, John K; Langenhoff, Barbara S

    2016-11-01

    The adjustable gastric band (AGB) is a bariatric procedure that used to be widely performed. However, AGB failure (band-related complications or unsatisfactory weight loss) frequently occurs and results in revision surgery (redo operations), often a conversion to a laparoscopic Roux-en-Y gastric bypass (LRYGB). This can be performed as a one-step or two-step (separate band removal) procedure. Data were collected from patients operated on from 2012 to 2014 in a single bariatric centre. We compared 107 redo LRYGB after AGB failure with 1020 primary LRYGB. An analysis was performed of the one-step vs. two-step redo procedures. All redo procedures were performed by experienced bariatric surgeons. No difference in major complication rate was seen between redo and primary LRYGB (2.8 vs. 2.3 %, p = 0.73), and overall complication severity for redos was low (mainly Clavien-Dindo 1 or 2). Weight loss results were comparable for primary and redo procedures. The one-step and two-step redos were comparable regarding complication rates and readmissions. The operating time for the one-step redo LRYGB was 136 min vs. 107.5 min for the two-step (median, p < 0.001), excluding the operating time of separate AGB removal (mean 61 min, range 36-110). Removal of a failed AGB and LRYGB in a one-step procedure is safe when performed by experienced bariatric surgeons. However, when erosion or perforation of the AGB occurs, we advise caution and would perform the redo LRYGB as a two-step procedure. Weight loss at 1 year after redo LRYGB is comparable to that after primary LRYGB procedures.

  6. Validation of a One-Step Method for Extracting Fatty Acids from Salmon, Chicken and Beef Samples.

    PubMed

    Zhang, Zhichao; Richardson, Christine E; Hennebelle, Marie; Taha, Ameer Y

    2017-10-01

    Fatty acid extraction methods are time-consuming and expensive because they involve multiple steps and copious amounts of extraction solvents. In an effort to streamline the fatty acid extraction process, this study compared the standard Folch lipid extraction method to a one-step method involving a column that selectively elutes the lipid phase. The methods were tested on raw beef, salmon, and chicken. Compared to the standard Folch method, the one-step extraction process generally yielded statistically insignificant differences in chicken and salmon fatty acid concentrations, percent composition and weight percent. Initial testing showed that beef stearic, oleic and total fatty acid concentrations were significantly lower by 9-11% with the one-step method as compared to the Folch method, but retesting on a different batch of samples showed a significant 4-8% increase in several omega-3 and omega-6 fatty acid concentrations with the one-step method relative to the Folch. Overall, the findings reflect the utility of a one-step extraction method for routine and rapid monitoring of fatty acids in chicken and salmon. Inconsistencies in beef concentrations, although minor (within 11%), may be due to matrix effects. A one-step fatty acid extraction method has broad applications for rapidly and routinely monitoring fatty acids in the food supply and formulating controlled dietary interventions. © 2017 Institute of Food Technologists®.

  7. Procedures and Standards for Residential Ventilation System Commissioning: An Annotated Bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stratton, J. Chris; Wray, Craig P.

    2013-04-01

    Beginning with the 2008 version of Title 24, new homes in California must comply with ANSI/ASHRAE Standard 62.2-2007 requirements for residential ventilation. Where installed, the limited data available indicate that mechanical ventilation systems do not always perform optimally or even as many codes and forecasts predict. Commissioning such systems when they are installed or during subsequent building retrofits is a step towards eliminating deficiencies and optimizing the tradeoff between energy use and acceptable IAQ. Work funded by the California Energy Commission about a decade ago at Berkeley Lab documented procedures for residential commissioning, but did not focus on ventilation systems. Since then, standards and approaches for commissioning ventilation systems have been an active area of work in Europe. This report describes our efforts to collect new literature on commissioning procedures and to identify information that can be used to support the future development of residential-ventilation-specific procedures and standards. We recommend that a standardized commissioning process and a commissioning guide for practitioners be developed, along with a combined energy and IAQ benefit assessment standard and tool, and a diagnostic guide for estimating continuous pollutant emission rates of concern in residences (including a database that lists emission test data for commercially-available labeled products).

  8. Diversity of assessing circulating tumor cells (CTCs) emphasizes need for standardization: a CTC Guide to design and report trials.

    PubMed

    Bünger, S; Zimmermann, M; Habermann, J K

    2015-09-01

    Hematogenous spreading of tumor cells from primary tumors is a crucial step in the cascade to metastasis, the latter being the most limiting factor for patients' survival prognosis. Therefore, circulating tumor cells (CTCs) have become a field of intensive research. However, the process of isolation and identification of CTCs lacks standardization. This article presents an overview of 71 CTC studies reported in PUBMED since 2000 and focusing on colorectal cancer. These studies are evaluated regarding standardization of CTC isolation and identification, marker proteins used, study population and blood sample quality management, clinical performance, and quality measures. Overall, standardization of CTC assessment seems insufficient. Thus, comparability of CTC studies is hampered and results should be interpreted carefully. We here propose a standardized CTC guideline (CTC Guide) to prospectively design and report studies/trials in a harmonized form. Despite the current interstudy heterogeneity, the data indicate that CTC detection is of clinical relevance and CTCs should be considered as a surrogate prognostic marker. Many studies indicate the high potential for CTCs as prognostic markers, e.g., in colorectal cancer treatment. However, standardized, large-scale multicenter validation studies are still needed to pave the way for clinical implementation of CTC detection that could ameliorate individualized medicine regimes.

  9. Evaluation of advanced laparoscopic skills tasks for validity evidence.

    PubMed

    Nepomnayshy, Dmitry; Whitledge, James; Birkett, Richard; Delmonico, Theodore; Ruthazer, Robin; Sillin, Lelan; Seymour, Neal E

    2015-02-01

    Since Fundamentals of Laparoscopic Surgery (FLS) represents a minimum proficiency standard for laparoscopic surgery, more advanced proficiency standards are required to address the needs of current surgical training. We wanted to evaluate the acceptance and discriminative ability of a novel set of skills building on the FLS model that could represent a more advanced proficiency standard: advanced laparoscopic surgery (ALS). Qualitative and quantitative analyses were employed. Quantitative analysis involved comparison of expert (PGY 5+), intermediate (PGY 3-4) and novice (PGY 1-2) surgeons on FLS and ALS tasks. Composite scores included time and errors; standard FLS errors were added to task time to create the composite score. Qualitative analysis involved thematic review of open-ended questions provided to experts participating in the study. Out of 48 participants, there were 15 (31 %) attendings, 3 (6 %) fellows and 30 (63 %) residents. By specialty, 54 % were general/MIS/bariatric/colorectal (GMBC) and 46 % were other (urology and gynecology). For the entire cohort, performance on FLS and ALS tasks did not differ by experience level. However, looking at the GMBC subgroup, experts performed better than novices (p = 0.012) and intermediates performed better than novices (p = 0.057) on ALS tasks. There was no difference for the same group in FLS performance. Also, the GMBC subgroup performed significantly better on FLS (p = 0.0035) and ALS (p = 0.0027) than the other subgroup. Thematic analysis revealed that the majority of experts felt that ALS was more realistic, challenging and clinically relevant for specific situations compared to FLS. For GMBC surgeons, we were able to show evidence of validity for a series of advanced laparoscopic tasks and their relationship to surgeon skill level. This study may represent the first step in the development of an advanced laparoscopic skills curriculum. Given the high degree of specialization in surgery, different advanced skills curricula will need to be developed for each specialty.

  10. The economic case for digital interventions for eating disorders among United States college students.

    PubMed

    Kass, Andrea E; Balantekin, Katherine N; Fitzsimmons-Craft, Ellen E; Jacobi, Corinna; Wilfley, Denise E; Taylor, C Barr

    2017-03-01

    Eating disorders (EDs) are serious health problems affecting college students. This article aimed to estimate the costs, in United States (US) dollars, of a stepped care model for online prevention and treatment among US college students to inform meaningful decisions regarding resource allocation and adoption of efficient care delivery models for EDs on college campuses. Using a payer perspective, we estimated the costs of (1) delivering an online guided self-help (GSH) intervention to individuals with EDs, including the costs of "stepping up" the proportion expected to "fail"; (2) delivering an online preventive intervention compared to a "wait and treat" approach to individuals at ED risk; and (3) applying the stepped care model across a population of 1,000 students, compared to standard care. Combining results for online GSH and preventive interventions, we estimated a stepped care model would cost less and result in fewer individuals needing in-person psychotherapy (after receiving less-intensive intervention) compared to standard care, assuming everyone in need received intervention. A stepped care model was estimated to achieve modest cost savings compared to standard care, but these estimates need to be tested with sensitivity analyses. Model assumptions highlight the complexities of cost calculations to inform resource allocation, and considerations for a disseminable delivery model are presented. Efforts are needed to systematically measure the costs and benefits of a stepped care model for EDs on college campuses, improve the precision and efficacy of ED interventions, and apply these calculations to non-US care systems with different cost structures. © 2017 Wiley Periodicals, Inc.
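
    The stepped-care cost logic sketched in this abstract can be made concrete with a toy calculation. Every number below (per-student intervention costs, step-up rate, population size) is a hypothetical placeholder chosen only to illustrate the comparison, not a figure from the study.

    ```python
    # Toy stepped-care cost comparison; all dollar figures and rates are
    # hypothetical placeholders, not values from the study.
    N = 1000                 # students in need of intervention (assumed)
    cost_online_gsh = 150    # per-student cost of online guided self-help (assumed)
    cost_in_person = 1200    # per-student cost of in-person psychotherapy (assumed)
    p_step_up = 0.30         # proportion expected to "fail" online GSH (assumed)

    # Stepped care: everyone gets the online intervention first; only those who
    # do not respond are "stepped up" to in-person care.
    stepped_cost = N * cost_online_gsh + N * p_step_up * cost_in_person
    stepped_in_person_cases = N * p_step_up

    # Standard care: everyone in need receives in-person psychotherapy directly.
    standard_cost = N * cost_in_person

    print(f"stepped care:  ${stepped_cost:,.0f} ({stepped_in_person_cases:.0f} in-person courses)")
    print(f"standard care: ${standard_cost:,.0f} ({N} in-person courses)")
    print(f"estimated savings: ${standard_cost - stepped_cost:,.0f}")
    ```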

  11. Robot-Applied Resistance Augments the Effects of Body Weight-Supported Treadmill Training on Stepping and Synaptic Plasticity in a Rodent Model of Spinal Cord Injury.

    PubMed

    Hinahon, Erika; Estrada, Christina; Tong, Lin; Won, Deborah S; de Leon, Ray D

    2017-08-01

    The application of resistive forces has been used during body weight-supported treadmill training (BWSTT) to improve walking function after spinal cord injury (SCI). Whether this form of training actually augments the effects of BWSTT is not yet known. To determine if robotic-applied resistance augments the effects of BWSTT using a controlled experimental design in a rodent model of SCI. Spinally contused rats were treadmill trained using robotic resistance against horizontal (n = 9) or vertical (n = 8) hind limb movements. Hind limb stepping was tested before and after 6 weeks of training. Two control groups, one receiving standard training (ie, without resistance; n = 9) and one untrained (n = 8), were also tested. At the terminal experiment, the spinal cords were prepared for immunohistochemical analysis of synaptophysin. Six weeks of training with horizontal resistance increased step length, whereas training with vertical resistance enhanced step height and movement velocity. None of these changes occurred in the group that received standard (ie, no resistance) training or in the untrained group. Only standard training increased the number of step cycles and shortened cycle period toward normal values. Synaptophysin expression in the ventral horn was highest in rats trained with horizontal resistance and in untrained rats and was positively correlated with step length. Adding robotic-applied resistance to BWSTT produced gains in locomotor function over BWSTT alone. The impact of resistive forces on spinal connections may depend on the nature of the resistive forces and the synaptic milieu that is present after SCI.

  12. Physical and cognitive task analysis in interventional radiology.

    PubMed

    Johnson, S; Healey, A; Evans, J; Murphy, M; Crawshaw, M; Gould, D

    2006-01-01

    To identify, describe and detail the cognitive thought processes, decision-making, and physical actions involved in the preparation and successful performance of core interventional radiology procedures. Five commonly performed core interventional radiology procedures were selected for cognitive task analysis. Several examples of each procedure being performed by consultant interventional radiologists were videoed. The videos of those procedures, and the steps required for successful outcome, were analysed by a psychologist and an interventional radiologist. Once a skeleton algorithm of the procedures was defined, further refinement was achieved using individual interview techniques with consultant interventional radiologists. Additionally, a critique of each iteration of the established algorithm was sought from non-participating independent consultant interventional radiologists. Detailed task descriptions and decision protocols were developed for five interventional radiology procedures (arterial puncture, nephrostomy, venous access, biopsy using both ultrasound and computed tomography, and percutaneous transhepatic cholangiogram). Identical tasks performed within these procedures were identified and standardized within the protocols. Complex procedures were broken down and their constituent processes identified. This might be suitable for use as a training protocol to provide a universally acceptable safe practice at the most fundamental level. It is envisaged that data collected in this way can be used as an educational resource for trainees and could provide the basis for a training curriculum in interventional radiology. It will direct trainees towards safe practice of the highest standard. It will also provide performance objectives for a simulator model.

  13. Gyroscope and Micromirror Design Using Vertical-Axis CMOS-MEMS Actuation and Sensing

    DTIC Science & Technology

    2002-01-01

    Interference pattern around the upper anchor (each fringe occurs at 310 nm vertical displacement...described above require extra lithography step(s) other than standard CMOS lithography steps and/or deposition of structural and sacrificial materials...Instruments' digital mirror device (DMD) [43]. The aluminum thin-film technology with vertical parallel-plate actuation has difficulty in achieving

  14. Stepped and Standard Care for Childhood Trauma: A Pilot Randomized Clinical Trial

    ERIC Educational Resources Information Center

    Salloum, Alison; Small, Brent J.; Robst, John; Scheeringa, Michael S.; Cohen, Judith A.; Storch, Eric A.

    2017-01-01

    Objective: This study explored the feasibility of stepped care trauma-focused cognitive behavioral therapy (SC-TF-CBT) relative to TF-CBT with children (aged 8--12). Method: Children (N = 33) with post-traumatic stress symptoms (PTSS) were randomly assigned (2:1) to SC-TF-CBT or TF-CBT. SC-TF-CBT consisted of Step 1, parent-led therapist-assisted…

  15. Concurrent Engineering through Product Data Standards

    DTIC Science & Technology

    1991-05-01

    standards, represents the power of a new industrial revolution. The role of the NIST National PDES Testbed, which provides technical leadership and a testing-based foundation for the development of STEP, is described.

  16. The ELPA library: scalable parallel eigenvalue solutions for electronic structure theory and computational science.

    PubMed

    Marek, A; Blum, V; Johanni, R; Havu, V; Lang, B; Auckenthaler, T; Heinecke, A; Bungartz, H-J; Lederer, H

    2014-05-28

    Obtaining the eigenvalues and eigenvectors of large matrices is a key problem in electronic structure theory and many other areas of computational science. The computational effort formally scales as O(N³) with the size of the investigated problem, N (e.g. the electron count in electronic structure theory), and thus often defines the system size limit that practical calculations cannot overcome. In many cases, more than just a small fraction of the possible eigenvalue/eigenvector pairs is needed, so that iterative solution strategies that focus only on a few eigenvalues become ineffective. Likewise, it is not always desirable or practical to circumvent the eigenvalue solution entirely. We here review some current developments regarding dense eigenvalue solvers and then focus on the Eigenvalue soLvers for Petascale Applications (ELPA) library, which facilitates the efficient algebraic solution of symmetric and Hermitian eigenvalue problems for dense matrices that have real-valued and complex-valued matrix entries, respectively, on parallel computer platforms. ELPA addresses standard as well as generalized eigenvalue problems, relying on the well-documented matrix layout of the Scalable Linear Algebra PACKage (ScaLAPACK) library but replacing all actual parallel solution steps with subroutines of its own. For these steps, ELPA significantly outperforms the corresponding ScaLAPACK routines and proprietary libraries that implement the ScaLAPACK interface (e.g. Intel's MKL). The most time-critical step is the reduction of the matrix to tridiagonal form and the corresponding backtransformation of the eigenvectors. ELPA offers both a one-step tridiagonalization (successive Householder transformations) and a two-step transformation that is more efficient especially towards larger matrices and larger numbers of CPU cores. ELPA is based on the MPI standard, with an early hybrid MPI-OpenMP implementation available as well. Scalability beyond 10,000 CPU cores for problem sizes arising in the field of electronic structure theory is demonstrated for current high-performance computer architectures such as Cray or Intel/Infiniband. For a matrix of dimension 260,000, scalability up to 295,000 CPU cores has been shown on BlueGene/P.
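
    For readers unfamiliar with the problem class ELPA targets, the snippet below solves small standard and generalized symmetric eigenproblems with SciPy's serial LAPACK bindings; it only illustrates the mathematical task, not ELPA's distributed ScaLAPACK-style interface or its two-step tridiagonalization.

    ```python
    # Small-scale illustration of the standard and generalized symmetric
    # eigenproblems that ELPA solves at scale; this uses SciPy's serial LAPACK
    # routines, not ELPA's distributed interface.
    import numpy as np
    from scipy.linalg import eigh

    rng = np.random.default_rng(42)
    n = 200

    # Random symmetric matrix A (standard problem: A x = lambda x).
    A = rng.standard_normal((n, n))
    A = 0.5 * (A + A.T)
    eigenvalues, eigenvectors = eigh(A)

    # Symmetric positive-definite overlap matrix B
    # (generalized problem: A x = lambda B x, as in non-orthogonal basis sets).
    B = rng.standard_normal((n, n))
    B = B @ B.T + n * np.eye(n)
    gen_eigenvalues, gen_eigenvectors = eigh(A, B)

    # Residual check for the first generalized eigenpair.
    x, lam = gen_eigenvectors[:, 0], gen_eigenvalues[0]
    residual = np.linalg.norm(A @ x - lam * (B @ x))
    print(f"lowest standard eigenvalue:    {eigenvalues[0]:.4f}")
    print(f"lowest generalized eigenvalue: {gen_eigenvalues[0]:.4f}")
    print(f"generalized residual norm:     {residual:.2e}")
    ```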

  17. Influence of External Beam Radiotherapy on the Properties of Polymethyl Methacrylate-Versus Silicone-Induced Membranes in a Bilateral Segmental Bone Defect in Rats.

    PubMed

    Sagardoy, Thomas; Ehret, Camille; Bareille, Reine; Benoit, Jérôme; Amedee, Joëlle; De Mones, Erwan

    2018-05-01

    Standard care for malignant tumors arising next to a bone structure is surgical removal with safety margins, followed by external beam radiotherapy (EBRT). Complete tumor removal can result in large bone defects. A two-step bone reconstruction technique using the induced membrane (IM) technique has proven its efficacy to bridge gap nonunion. During the first step, a spacer is placed in the bone gap. The spacer then is removed and the IM around it is filled with autologous cancellous bone graft. However, the feasibility of this technique with the addition of adjuvant EBRT between the two reconstruction steps has not yet been studied. Polymethyl methacrylate (PMMA) used to be the standard spacer material for the first step. Silicone spacers could replace them owing to their good behavior when submitted to EBRT and their easier removal from the surgical site during the second step. The aim of this study was to evaluate the influence of EBRT on the histological and biochemical properties of IM induced using PMMA or silicone as spacer. The analyses were performed on PMMA- or silicone-IM with and without EBRT in a 6-mm bilateral femoral defect in 32 rats. Thickness and vessel content were measured in both groups. Bone morphogenetic protein 2 (BMP2) and vascular endothelial growth factor (VEGF) content in lysates of the crushed membranes were measured by enzyme immunoassay. Finally, alkaline phosphatase activity was analyzed in human bone marrow stromal cell cultures in contact with the same lysates. EBRT did not change the histological structure of the cellular internal layer or the fibrous outer layer. The nature of the spacer only influenced IM thickness, PMMA-IM with external radiotherapy being significantly thicker. EBRT decreased the vascular density of IM but was less effective on VEGF/BMP2 production. In vitro, IM could have an osteoinductive potential on human bone marrow stem cells. EBRT did not modify the histological properties of IMs but decreased their vascular density. VEGF and BMP2 production within IMs was not affected by EBRT. Silicone spacers are able to induce membranes with similar histological characteristics to PMMA-IM.

  18. A numerical projection technique for large-scale eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Gamillscheg, Ralf; Haase, Gundolf; von der Linden, Wolfgang

    2011-10-01

    We present a new numerical technique to solve large-scale eigenvalue problems. It is based on the projection technique, used in strongly correlated quantum many-body systems, where first an effective approximate model of smaller complexity is constructed by projecting out high energy degrees of freedom and in turn solving the resulting model by some standard eigenvalue solver. Here we introduce a generalization of this idea, where both steps are performed numerically and which in contrast to the standard projection technique converges in principle to the exact eigenvalues. This approach is not just applicable to eigenvalue problems encountered in many-body systems but also in other areas of research that result in large-scale eigenvalue problems for matrices which have, roughly speaking, mostly a pronounced dominant diagonal part. We will present detailed studies of the approach guided by two many-body models.
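
    As a generic illustration of the projection idea described above (not the authors' algorithm), the sketch below applies a Rayleigh-Ritz projection: it builds an orthonormal basis for a small subspace, solves the projected eigenproblem, and compares the result with the exact lowest eigenvalue of a diagonally dominant test matrix.

    ```python
    # Generic Rayleigh-Ritz projection sketch for a matrix with a dominant
    # diagonal; this illustrates the projection idea, not the authors' method.
    import numpy as np

    rng = np.random.default_rng(1)
    n, k = 1000, 40                      # full problem size and subspace size

    # Diagonally dominant symmetric test matrix (dominant diagonal + weak coupling).
    A = np.diag(np.sort(rng.uniform(0.0, 10.0, n)))
    A += 1e-2 * rng.standard_normal((n, n))
    A = 0.5 * (A + A.T)

    # Subspace spanned by the unit vectors of the k smallest diagonal entries
    # (the "low-energy" degrees of freedom); orthonormal by construction.
    V = np.zeros((n, k))
    V[np.arange(k), np.arange(k)] = 1.0

    # Project, solve the small k x k problem, and lift back (Rayleigh-Ritz).
    A_small = V.T @ A @ V
    theta, s = np.linalg.eigh(A_small)
    ritz_value = theta[0]
    ritz_vector = V @ s[:, 0]

    exact = np.linalg.eigvalsh(A)[0]
    print(f"lowest Ritz value:  {ritz_value:.6f}")
    print(f"exact lowest value: {exact:.6f}")
    print(f"residual norm: {np.linalg.norm(A @ ritz_vector - ritz_value * ritz_vector):.2e}")
    ```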

  19. Mariner Mars 1971 project. Volume 3: Mission operations system implementation and standard mission flight operations

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Mariner Mars 1971 mission which was another step in the continuing program of planetary exploration in search of evidence of exobiological activity, information on the origin and evolution of the solar system, and basic science data related to the study of planetary physics, geology, planetology, and cosmology is reported. The mission plan was designed for two spacecraft, each performing a separate but complementary mission. However, a single mission plan was actually used for Mariner 9 because of failure of the launch vehicle for the first spacecraft. The implementation is described, of the Mission Operations System, including organization, training, and data processing development and operations, and Mariner 9 spacecraft cruise and orbital operations through completion of the standard mission from launch to solar occultation in April 1972 are discussed.

  20. Detailed Surface Analysis Of Incremental Centrifugal Barrel Polishing (CBP) Of Single-Crystal Niobium Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palczewski, Ari D.; Tian, Hui; Trofimova, Olga

    2011-07-01

    We performed Centrifugal Barrel Polishing (CBP) on single-crystal niobium samples/coupons housed in a stainless steel sample holder, following the polishing recipe developed at Fermilab (FNAL) in 2011 (C. A. Cooper, 2011). Post CBP, the sample coupons were analyzed for surface roughness, crystal composition and structure, and particle contamination. Following the initial analysis, each coupon was high-pressure rinsed (HPR) and analyzed for the effectiveness of contamination removal. We were able to obtain a mirror-like surface finish after the final stage of tumbling, although some defects and embedded particles remained. In addition, standard HPR appears to have little effect on removing embedded particles that remain after each tumbling step, although final polishing media removal was partially affected by standard/extended HPR.

  1. Comparative Effectiveness of Two Walking Interventions on Participation, Step Counts, and Health.

    PubMed

    Smith-McLallen, Aaron; Heller, Debbie; Vernisi, Kristin; Gulick, Diana; Cruz, Samantha; Snyder, Richard L

    2017-03-01

    To (1) compare the effects of two worksite-based walking interventions on employee participation rates; (2) compare average daily step counts between conditions; and (3) examine the effects of increases in average daily step counts on biometric and psychologic outcomes. We conducted a cluster-randomized trial in which six employer groups were randomly selected and randomly assigned to condition. Four manufacturing worksites and two office-based worksites served as the setting. A total of 474 employees from six employer groups were included. A standard walking program was compared to an enhanced program that included incentives, feedback, competitive challenges, and monthly wellness workshops. Walking was measured by self-reported daily step counts. Survey measures and biometric screenings were administered at baseline and 3, 6, and 9 months after baseline. Analysis used linear mixed models with repeated measures. Over 9 months, participants in the enhanced condition averaged 726 more steps per day than those in the standard condition (p < .001). A 1000-step increase in average daily steps was associated with significant weight loss for both men (-3.8 lbs) and women (-2.1 lbs) and with reductions in body mass index (-0.41 for men, -0.31 for women). Higher step counts were also associated with improvements in mood, having more energy, and higher ratings of overall health. An enhanced walking program significantly increases participation rates and daily step counts, which were associated with weight loss and reductions in body mass index.

  2. 29 CFR 1952.371 - Developmental schedule.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) APPROVED STATE PLANS FOR ENFORCEMENT OF STATE STANDARDS Virginia § 1952.371 Developmental schedule. The Virginia plan is developmental. Following is a schedule of major developmental steps: (a) Standards identical to the Federal standards will be completely adopted by January 1, 1978. (b) A plan for...

  3. 29 CFR 1952.371 - Developmental schedule.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) APPROVED STATE PLANS FOR ENFORCEMENT OF STATE STANDARDS Virginia § 1952.371 Developmental schedule. The Virginia plan is developmental. Following is a schedule of major developmental steps: (a) Standards identical to the Federal standards will be completely adopted by January 1, 1978. (b) A plan for...

  4. 29 CFR 1952.371 - Developmental schedule.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) APPROVED STATE PLANS FOR ENFORCEMENT OF STATE STANDARDS Virginia § 1952.371 Developmental schedule. The Virginia plan is developmental. Following is a schedule of major developmental steps: (a) Standards identical to the Federal standards will be completely adopted by January 1, 1978. (b) A plan for...

  5. 29 CFR 1952.371 - Developmental schedule.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) APPROVED STATE PLANS FOR ENFORCEMENT OF STATE STANDARDS Virginia § 1952.371 Developmental schedule. The Virginia plan is developmental. Following is a schedule of major developmental steps: (a) Standards identical to the Federal standards will be completely adopted by January 1, 1978. (b) A plan for...

  6. The role of peer-assisted learning in building evaluative judgement: opportunities in clinical medical education.

    PubMed

    Tai, Joanna Hong-Meng; Canny, Benedict J; Haines, Terry P; Molloy, Elizabeth K

    2016-08-01

    This study explored the contribution of peer-assisted learning (PAL) in the development of evaluative judgement capacity; the ability to understand work quality and apply those standards to appraising performance. The study employed a mixed methods approach, collecting self-reported survey data, observations of, and reflective interviews with, the medical students observed. Participants were in their first year of clinical placements. Data were thematically analysed. Students indicated that PAL contributed to both the comprehension of notions of quality, and the practice of making comparisons between a given performance and the standards. Emergent themes included peer story-telling, direct observation of performance, and peer-based feedback, all of which helped students to define 'work quality'. By participating in PAL, students were required to make comparisons, therefore using the standards of practice and gaining a deeper understanding of them. The data revealed tensions in that peers were seen as less threatening than supervisors with the advantage of increasing learners' appetites for thoughtful 'intellectual risk taking'. Despite this reported advantage of peer engagement, learners still expressed a preference for feedback from senior teachers as more trusted sources of clinical knowledge. While this study suggests that PAL already contributes to the development of evaluative judgement, further steps could be taken to formalise PAL in clinical placements to improve learners' capacity to make accurate judgements on the performance of self and others. Further experimental studies are necessary to confirm the best methods of using PAL to develop evaluative judgement. This may include both students and educators as instigators of PAL in the workplace.

  7. Protein Quantification by Derivatization-Free High-Performance Liquid Chromatography of Aromatic Amino Acids

    PubMed Central

    Hesse, Almut

    2016-01-01

    Amino acid analysis is considered to be the gold standard for quantitative peptide and protein analysis. Here, we would like to propose a simple HPLC/UV method based on a reversed-phase separation of the aromatic amino acids tyrosine (Tyr), phenylalanine (Phe), and optionally tryptophan (Trp) without any derivatization. The hydrolysis of the proteins and peptides was performed by an accelerated microwave technique, which needs only 30 minutes. Two internal standard compounds, homotyrosine (HTyr) and 4-fluorophenylalanine (FPhe) were used for calibration. The limit of detection (LOD) was estimated to be 0.05 µM (~10 µg/L) for tyrosine and phenylalanine at 215 nm. The LOD for a protein determination was calculated to be below 16 mg/L (~300 ng BSA absolute). Aromatic amino acid analysis (AAAA) offers excellent accuracy and a precision of about 5% relative standard deviation, including the hydrolysis step. The method was validated with certified reference materials (CRM) of amino acids and of a pure protein (bovine serum albumin, BSA). AAAA can be used for the quantification of aromatic amino acids, isolated peptides or proteins, complex peptide or protein samples, such as serum or milk powder, and peptides or proteins immobilized on solid supports. PMID:27559481

  8. Immediate Effects of Clock-Turn Strategy on the Pattern and Performance of Narrow Turning in Persons With Parkinson Disease.

    PubMed

    Yang, Wen-Chieh; Hsu, Wei-Li; Wu, Ruey-Meei; Lin, Kwan-Hwa

    2016-10-01

    Turning difficulty is common in people with Parkinson disease (PD). The clock-turn strategy is a cognitive movement strategy to improve turning performance in people with PD, although its effects are unverified. Therefore, this study aimed to investigate the effects of the clock-turn strategy on the pattern of turning steps, turning performance, and freezing of gait during narrow turning, and how these effects were influenced by concurrent performance of a cognitive task (dual task). Twenty-five people with PD were randomly assigned to the clock-turn or usual-turn group. Participants performed the Timed Up and Go test with and without a concurrent cognitive task during the medication OFF period. The clock-turn group performed the Timed Up and Go test using the clock-turn strategy, whereas participants in the usual-turn group performed in their usual manner. Measurements were taken during the 180° turn of the Timed Up and Go test. The pattern of turning steps was evaluated by step time variability and step time asymmetry. Turning performance was evaluated by turning time and number of turning steps. The number and duration of freezing of gait episodes were determined by video review. The clock-turn group had lower step time variability and step time asymmetry than the usual-turn group. Furthermore, the clock-turn group turned faster with fewer freezing of gait episodes than the usual-turn group. The dual task increased step time variability and step time asymmetry in both groups but did not affect turning performance or freezing severity. The clock-turn strategy reduces turning time and freezing of gait during turning, probably by lowering step time variability and asymmetry. The dual task compromises the effects of the clock-turn strategy, suggesting a competition for attentional resources. Video Abstract available for more insights from the authors (see Supplemental Digital Content 1, http://links.lww.com/JNPT/A141).

  9. [Pre-analytical quality in fluid samples cytopathology: Results of a survey from the French Society of Clinical Cytology].

    PubMed

    Courtade-Saïdi, Monique; Fleury Feith, Jocelyne

    2015-10-01

    The pre-analytical step includes sample collection, preparation, transportation, and storage in the pathology unit where the diagnosis is performed. The pathologist ensures that pre-analytical conditions are in line with expectations. The lack of standardization for handling cytological samples makes this pre-analytical step difficult to harmonize. Moreover, this step depends on the nature of the sample: fresh liquid or fixed material, air-dried smears, liquid-based cytology. The aim of the study was to review the practices of French pathology structures regarding the pre-analytical phase for cytological fluids such as broncho-alveolar lavage fluid (BALF), serous fluids, and urine. A survey based on the pre-analytical chapter of ISO 15189 was sent to 191 French pathology structures (105 public and 86 private). Fifty-six laboratories replied to the survey. Ninety-five percent have a computerized management system and 70% a manual on sample handling. The general items on patient and sample identification were correctly completed in most cases, with short transport times and prescription of additional tests. By contrast, practices varied concerning the clinical information requested, the type of tubes used for collecting fluids, the volumes required, and the actions taken in case of non-conformity. For the specific items concerning BALF, serous fluids, and urine, this survey showed great heterogeneity in sample collection, fixation, and clinical information. This survey demonstrates that the pre-analytical quality for BALF, serous fluids, and urine is not optimal and that corrections to practice are recommended, with standardization of numerous steps, in order to increase the reproducibility of additional tests such as immunocytochemistry, cytogenetics, and molecular biology. Some recommendations have been written. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  10. Retinopathy online challenge: automatic detection of microaneurysms in digital color fundus photographs.

    PubMed

    Niemeijer, Meindert; van Ginneken, Bram; Cree, Michael J; Mizutani, Atsushi; Quellec, Gwénolé; Sanchez, Clara I; Zhang, Bob; Hornero, Roberto; Lamard, Mathieu; Muramatsu, Chisako; Wu, Xiangqian; Cazuguel, Guy; You, Jane; Mayo, Agustín; Li, Qin; Hatanaka, Yuji; Cochener, Béatrice; Roux, Christian; Karray, Fakhri; Garcia, María; Fujita, Hiroshi; Abramoff, Michael D

    2010-01-01

    The detection of microaneurysms in digital color fundus photographs is a critical first step in automated screening for diabetic retinopathy (DR), a common complication of diabetes. Numerous methods for this detection have been published in the past, but none of them were compared with each other on the same data. In this work we present the results of the first international microaneurysm detection competition, organized in the context of the Retinopathy Online Challenge (ROC), a multiyear online competition for various aspects of DR detection. For this competition, we compare the results of five different methods, produced by five different teams of researchers on the same set of data. The evaluation was performed in a uniform manner using an algorithm presented in this work. The set of data used for the competition consisted of 50 training images with an available reference standard and 50 test images for which the reference standard was withheld by the organizers (M. Niemeijer, B. van Ginneken, and M. D. Abràmoff). The results obtained on the test data were submitted through a website, after which standardized evaluation software was used to determine the performance of each of the methods. A human expert detected microaneurysms in the test set to allow comparison with the performance of the automatic methods. The overall results show that microaneurysm detection is a challenging task for both the automatic methods and the human expert. There is room for improvement, as the best performing system does not reach the performance of the human expert. The data associated with the ROC microaneurysm detection competition will remain publicly available and the website will continue accepting submissions.

  11. An implementation of the look-ahead Lanczos algorithm for non-Hermitian matrices, part 1

    NASA Technical Reports Server (NTRS)

    Freund, Roland W.; Gutknecht, Martin H.; Nachtigal, Noel M.

    1990-01-01

    The nonsymmetric Lanczos method can be used to compute eigenvalues of large sparse non-Hermitian matrices or to solve large sparse non-Hermitian linear systems. However, the original Lanczos algorithm is susceptible to possible breakdowns and potential instabilities. We present an implementation of a look-ahead version of the Lanczos algorithm which overcomes these problems by skipping over those steps in which a breakdown or near-breakdown would occur in the standard process. The proposed algorithm can handle look-ahead steps of any length and is not restricted to steps of length 2, as earlier implementations are. Also, our implementation has the feature that it requires roughly the same number of inner products as the standard Lanczos process without look-ahead.
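
    To make the breakdown issue concrete, the sketch below implements the standard two-sided (biorthogonal) Lanczos process without look-ahead, stopping when the inner product that look-ahead variants are designed to step over becomes (near) zero. It is a generic textbook formulation, not the authors' implementation.

    ```python
    # Standard two-sided (biorthogonal) Lanczos without look-ahead; a near-zero
    # inner product w_hat . v_hat triggers the (near-)breakdown that look-ahead
    # variants skip over. Generic textbook sketch, not the authors' algorithm.
    import numpy as np

    def two_sided_lanczos(A, m, tol=1e-12, seed=0):
        n = A.shape[0]
        rng = np.random.default_rng(seed)
        v = rng.standard_normal(n)
        w = rng.standard_normal(n)
        w /= w @ v                      # enforce (w, v) = 1
        v_old, w_old = np.zeros(n), np.zeros(n)
        alpha, beta, delta = [], [0.0], [0.0]
        for j in range(m):
            alpha.append(w @ (A @ v))
            v_hat = A @ v - alpha[j] * v - beta[j] * v_old
            w_hat = A.T @ w - alpha[j] * w - delta[j] * w_old
            prod = w_hat @ v_hat
            if abs(prod) < tol:         # (near-)breakdown: look-ahead would step over this
                break
            delta.append(np.sqrt(abs(prod)))
            beta.append(prod / delta[-1])
            v_old, w_old = v, w
            v, w = v_hat / delta[-1], w_hat / beta[-1]
        k = len(alpha)
        # Tridiagonal matrix whose eigenvalues (Ritz values) approximate those of A.
        T = np.diag(alpha) + np.diag(beta[1:k], 1) + np.diag(delta[1:k], -1)
        return T

    A = np.diag(np.arange(1.0, 101.0)) + 0.01 * np.random.default_rng(1).standard_normal((100, 100))
    T = two_sided_lanczos(A, m=40)
    print("largest Ritz value:      ", np.linalg.eigvals(T).real.max())
    print("largest exact eigenvalue:", np.linalg.eigvals(A).real.max())
    ```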

  12. Physical examination skills training: Faculty staff vs. patient instructor feedback—A controlled trial

    PubMed Central

    Diefenbacher, Katja; Schultz, Jobst-Hendrik; Maatouk, Imad; Herrmann-Werner, Anne; Koehl-Hackert, Nadja; Herzog, Wolfgang; Nikendei, Christoph

    2017-01-01

    Background Standardized patients are widely used in training of medical students, both in teaching and assessment. They also frequently lead complete training sessions delivering physical examination skills without the aid of faculty teaching staff–acting as “patient instructors” (PIs). An important part of this training is their ability to provide detailed structured feedback to students which has a strong impact on their learning success. Yet, to date no study has assessed the quality of physical examination related feedback by PIs. Therefore, we conducted a randomized controlled study comparing feedback of PIs and faculty staff following a physical examination assessed by students and video assessors. Methods 14 PIs and 14 different faculty staff physicians both delivered feedback to 40 medical students that had performed a physical examination on the respective PI while the physicians observed the performance. The physical examination was rated by two independent video assessors to provide an objective performance standard (gold standard). Feedback of PI and physicians was content analyzed by two different independent video assessors based on a provided checklist and compared to the performance standard. Feedback of PIs and physicians was also rated by medical students and video assessors using a questionnaire consisting of 12 items. Results There was no statistical significant difference concerning overall matching of physician or PI feedback with gold standard ratings by video assessment (p = .219). There was also no statistical difference when focusing only on items that were classified as major key steps (p = .802), mistakes or parts that were left out during physical examination (p = .219) or mistakes in communication items (p = .517). The feedback of physicians was significantly better rated than PI feedback both by students (p = .043) as well as by video assessors (p = .034). Conclusions In summary, our study demonstrates that trained PIs are able to provide feedback of equal quantitative value to that of faculty staff physicians with regard to a physical examination performed on them. However, both the students and the video raters judged the quality of the feedback given by the physicians to be significantly better than that of the PIs. PMID:28692703

  13. Physical examination skills training: Faculty staff vs. patient instructor feedback-A controlled trial.

    PubMed

    Krautter, Markus; Diefenbacher, Katja; Schultz, Jobst-Hendrik; Maatouk, Imad; Herrmann-Werner, Anne; Koehl-Hackert, Nadja; Herzog, Wolfgang; Nikendei, Christoph

    2017-01-01

    Standardized patients are widely used in training of medical students, both in teaching and assessment. They also frequently lead complete training sessions delivering physical examination skills without the aid of faculty teaching staff-acting as "patient instructors" (PIs). An important part of this training is their ability to provide detailed structured feedback to students which has a strong impact on their learning success. Yet, to date no study has assessed the quality of physical examination related feedback by PIs. Therefore, we conducted a randomized controlled study comparing feedback of PIs and faculty staff following a physical examination assessed by students and video assessors. 14 PIs and 14 different faculty staff physicians both delivered feedback to 40 medical students that had performed a physical examination on the respective PI while the physicians observed the performance. The physical examination was rated by two independent video assessors to provide an objective performance standard (gold standard). Feedback of PI and physicians was content analyzed by two different independent video assessors based on a provided checklist and compared to the performance standard. Feedback of PIs and physicians was also rated by medical students and video assessors using a questionnaire consisting of 12 items. There was no statistical significant difference concerning overall matching of physician or PI feedback with gold standard ratings by video assessment (p = .219). There was also no statistical difference when focusing only on items that were classified as major key steps (p = .802), mistakes or parts that were left out during physical examination (p = .219) or mistakes in communication items (p = .517). The feedback of physicians was significantly better rated than PI feedback both by students (p = .043) as well as by video assessors (p = .034). In summary, our study demonstrates that trained PIs are able to provide feedback of equal quantitative value to that of faculty staff physicians with regard to a physical examination performed on them. However, both the students and the video raters judged the quality of the feedback given by the physicians to be significantly better than that of the PIs.

  14. 10 Steps to Building an Architecture for Space Surveillance Projects

    NASA Astrophysics Data System (ADS)

    Gyorko, E.; Barnhart, E.; Gans, H.

    Space surveillance is an increasingly complex task, requiring the coordination of a multitude of organizations and systems, while dealing with competing capabilities, proprietary processes, differing standards, and compliance issues. In order to fully understand space surveillance operations, analysts and engineers need to analyze and break down their operations and systems using what are essentially enterprise architecture processes and techniques. These techniques can be daunting to the first-time architect. This paper provides a summary of simplified steps to analyze a space surveillance system at the enterprise level in order to determine capabilities, services, and systems. These steps form the core of an initial Model-Based Architecting process. For new systems, a well defined, or well architected, space surveillance enterprise leads to an easier transition from model-based architecture to model-based design and provides a greater likelihood that requirements are fulfilled the first time. Both new and existing systems benefit from being easier to manage, and can be sustained more easily using portfolio management techniques, based around capabilities documented in the model repository. The resulting enterprise model helps an architect avoid 1) costly, faulty portfolio decisions; 2) wasteful technology refresh efforts; 3) upgrade and transition nightmares; and 4) non-compliance with DoDAF directives. The Model-Based Architecting steps are based on a process that Harris Corporation has developed from practical experience architecting space surveillance systems and ground systems. Examples are drawn from current work on documenting space situational awareness enterprises. The process is centered on DoDAF 2 and its corresponding meta-model so that terminology is standardized and communicable across any disciplines that know DoDAF architecting, including acquisition, engineering and sustainment disciplines. Each step provides a guideline for the type of data to collect, and also the appropriate views to generate. The steps include 1) determining the context of the enterprise, including active elements and high level capabilities or goals; 2) determining the desired effects of the capabilities and mapping capabilities against the project plan; 3) determining operational performers and their inter-relationships; 4) building information and data dictionaries; 5) defining resources associated with capabilities; 6) determining the operational behavior necessary to achieve each capability; 7) analyzing existing or planned implementations to determine systems, services and software; 8) cross-referencing system behavior to operational behavior; 9) documenting system threads and functional implementations; and 10) creating any required textual documentation from the model.

  15. Molecular dynamics based enhanced sampling of collective variables with very large time steps.

    PubMed

    Chen, Pei-Yang; Tuckerman, Mark E

    2018-01-14

    Enhanced sampling techniques that target a set of collective variables and that use molecular dynamics as the driving engine have seen widespread application in the computational molecular sciences as a means to explore the free-energy landscapes of complex systems. The use of molecular dynamics as the fundamental driver of the sampling requires the introduction of a time step whose magnitude is limited by the fastest motions in a system. While standard multiple time-stepping methods allow larger time steps to be employed for the slower and computationally more expensive forces, the maximum achievable increase in time step is limited by resonance phenomena, which inextricably couple fast and slow motions. Recently, we introduced deterministic and stochastic resonance-free multiple time step algorithms for molecular dynamics that solve this resonance problem and allow ten- to twenty-fold gains in the large time step compared to standard multiple time step algorithms [P. Minary et al., Phys. Rev. Lett. 93, 150201 (2004); B. Leimkuhler et al., Mol. Phys. 111, 3579-3594 (2013)]. These methods are based on the imposition of isokinetic constraints that couple the physical system to Nosé-Hoover chains or Nosé-Hoover Langevin schemes. In this paper, we show how to adapt these methods for collective variable-based enhanced sampling techniques, specifically adiabatic free-energy dynamics/temperature-accelerated molecular dynamics, unified free-energy dynamics, and by extension, metadynamics, thus allowing simulations employing these methods to employ similarly very large time steps. The combination of resonance-free multiple time step integrators with free-energy-based enhanced sampling significantly improves the efficiency of conformational exploration.
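
    The multiple time-stepping idea referenced above can be illustrated with a plain two-level r-RESPA splitting for a single particle subject to a stiff ("fast") and a soft ("slow") harmonic force. This is the standard reversible RESPA scheme, not the resonance-free isokinetic integrators the paper adapts, and all parameters are illustrative.

    ```python
    # Minimal two-level r-RESPA (reversible reference system propagator) sketch:
    # the slow force is applied with the outer time step, the fast force with a
    # smaller inner step. Standard RESPA only -- not the resonance-free
    # isokinetic/Nose-Hoover schemes described in the paper. Units are arbitrary.
    import numpy as np

    k_fast, k_slow, mass = 400.0, 1.0, 1.0       # stiff and soft spring constants (assumed)
    fast_force = lambda x: -k_fast * x
    slow_force = lambda x: -k_slow * x

    def respa_step(x, v, dt_outer, n_inner):
        """One outer step: half-kick slow, n_inner velocity-Verlet fast steps, half-kick slow."""
        dt_inner = dt_outer / n_inner
        v += 0.5 * dt_outer * slow_force(x) / mass          # slow half-kick
        for _ in range(n_inner):                            # fast inner loop (velocity Verlet)
            v += 0.5 * dt_inner * fast_force(x) / mass
            x += dt_inner * v
            v += 0.5 * dt_inner * fast_force(x) / mass
        v += 0.5 * dt_outer * slow_force(x) / mass          # slow half-kick
        return x, v

    x, v = 1.0, 0.0
    dt_outer, n_inner, n_steps = 0.05, 10, 2000
    energies = []
    for _ in range(n_steps):
        x, v = respa_step(x, v, dt_outer, n_inner)
        energies.append(0.5 * mass * v**2 + 0.5 * (k_fast + k_slow) * x**2)

    drift = (max(energies) - min(energies)) / energies[0]
    print(f"relative energy fluctuation over {n_steps} outer steps: {drift:.3e}")
    ```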

  16. Molecular dynamics based enhanced sampling of collective variables with very large time steps

    NASA Astrophysics Data System (ADS)

    Chen, Pei-Yang; Tuckerman, Mark E.

    2018-01-01

    Enhanced sampling techniques that target a set of collective variables and that use molecular dynamics as the driving engine have seen widespread application in the computational molecular sciences as a means to explore the free-energy landscapes of complex systems. The use of molecular dynamics as the fundamental driver of the sampling requires the introduction of a time step whose magnitude is limited by the fastest motions in a system. While standard multiple time-stepping methods allow larger time steps to be employed for the slower and computationally more expensive forces, the maximum achievable increase in time step is limited by resonance phenomena, which inextricably couple fast and slow motions. Recently, we introduced deterministic and stochastic resonance-free multiple time step algorithms for molecular dynamics that solve this resonance problem and allow ten- to twenty-fold gains in the large time step compared to standard multiple time step algorithms [P. Minary et al., Phys. Rev. Lett. 93, 150201 (2004); B. Leimkuhler et al., Mol. Phys. 111, 3579-3594 (2013)]. These methods are based on the imposition of isokinetic constraints that couple the physical system to Nosé-Hoover chains or Nosé-Hoover Langevin schemes. In this paper, we show how to adapt these methods for collective variable-based enhanced sampling techniques, specifically adiabatic free-energy dynamics/temperature-accelerated molecular dynamics, unified free-energy dynamics, and by extension, metadynamics, thus allowing simulations employing these methods to employ similarly very large time steps. The combination of resonance-free multiple time step integrators with free-energy-based enhanced sampling significantly improves the efficiency of conformational exploration.

  17. Quantitative gel electrophoresis: new records in precision by elaborated staining and detection protocols.

    PubMed

    Deng, Xi; Schröder, Simone; Redweik, Sabine; Wätzig, Hermann

    2011-06-01

    Gel electrophoresis (GE) is a very common analytical technique for proteome research and protein analysis. Despite being developed decades ago, there is still a considerable need to improve its precision. Using the fluorescence of Colloidal Coomassie Blue-stained proteins in near-infrared (NIR), the major error source caused by the unpredictable background staining is strongly reduced. This result was generalized for various types of detectors. Since GE is a multi-step procedure, standardization of every single step is required. After detailed analysis of all steps, the staining and destaining were identified as the major source of the remaining variation. By employing standardized protocols, pooled percent relative standard deviations of 1.2-3.1% for band intensities were achieved for one-dimensional separations in repetitive experiments. The analysis of variance suggests that the same batch of staining solution should be used for gels of one experimental series to minimize day-to-day variation and to obtain high precision. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Combination of magnetic dispersive micro solid-phase extraction and supramolecular solvent-based microextraction followed by high-performance liquid chromatography for determination of trace amounts of cholesterol-lowering drugs in complicated matrices.

    PubMed

    Arghavani-Beydokhti, Somayeh; Rajabi, Maryam; Asghari, Alireza

    2017-07-01

    A novel, efficient, rapid, simple, sensitive, selective, and environmentally friendly method termed magnetic dispersive micro solid-phase extraction combined with supramolecular solvent-based microextraction (Mdμ-SPE-SSME) followed by high-performance liquid chromatography (HPLC) with UV detection is introduced for the simultaneous microextraction of cholesterol-lowering drugs in complicated matrices. In the first microextraction procedure, using layered double hydroxide (LDH)-coated Fe3O4 magnetic nanoparticles, an efficient sample cleanup is simply and rapidly provided without the need for time-consuming centrifugation and elution steps. In the first step, desorption of the target analytes is easily performed through dissolution of the LDH-coated magnetic nanoparticles containing the target analytes in an acidic solution. In the next step, an emulsification microextraction method based on a supramolecular solvent is used for excellent preconcentration, ultimately resulting in an appropriate determination of the target analytes in real samples. Under the optimal experimental conditions, the Mdμ-SPE-SSME-HPLC-UV detection procedure provides good linearity in the ranges of 1.0-1500 ng mL⁻¹, 1.5-2000 ng mL⁻¹, and 2.0-2000 ng mL⁻¹ with coefficients of determination of at least 0.995, low limits of detection (0.3, 0.5, and 0.5 ng mL⁻¹), and good extraction repeatabilities (relative standard deviations below 7.8%, n = 5) in deionized water for rosuvastatin, atorvastatin, and gemfibrozil, respectively. Finally, the proposed method is successfully applied for the determination of the target analytes in complicated matrices. Graphical abstract: Mdμ-SPE-SSME procedure.

  19. Anterior clinoidectomy using an extradural and intradural 2-step hybrid technique.

    PubMed

    Tayebi Meybodi, Ali; Lawton, Michael T; Yousef, Sonia; Guo, Xiaoming; González Sánchez, Jose Juan; Tabani, Halima; García, Sergio; Burkhardt, Jan-Karl; Benet, Arnau

    2018-02-23

    Anterior clinoidectomy is a difficult yet essential technique in skull base surgery. Two main techniques (extradural and intradural) with multiple modifications have been proposed to increase efficiency and avoid complications. In this study, the authors sought to develop a hybrid technique based on localization of the optic strut (OS) to combine the advantages and avoid the disadvantages of both techniques. Ten cadaveric specimens were prepared for surgical simulation. After a standard pterional craniotomy, the anterior clinoid process (ACP) was resected in 2 steps. The segment anterior to the OS was resected extradurally, while the segment posterior to the OS was resected intradurally. The proposed technique was performed in 6 clinical cases to evaluate its safety and efficiency. Anterior clinoidectomy was successfully performed in all cadaveric specimens and all 6 patients by using the proposed technique. The extradural phase enabled early decompression of the optic nerve while avoiding the adjacent internal carotid artery. The OS was drilled intradurally under direct visualization of the adjacent neurovascular structures. The described landmarks were easily identifiable and applicable in the surgically treated patients. No operative complication was encountered. A proposed 2-step hybrid technique combines the advantages of the extradural and intradural techniques while avoiding their disadvantages. This technique allows reduced intradural drilling and subarachnoid bone dust deposition. Moreover, the most critical part of the clinoidectomy-that is, drilling of the OS and removal of the body of the ACP-is left for the intradural phase, when critical neurovascular structures can be directly viewed.

  20. How do physicians become medical experts? A test of three competing theories: distinct domains, independent influence and encapsulation models.

    PubMed

    Violato, Claudio; Gao, Hong; O'Brien, Mary Claire; Grier, David; Shen, E

    2018-05-01

    The distinction between basic sciences and clinical knowledge, which has led to a theoretical debate on how medical expertise is developed, has implications for medical school and lifelong medical education. This longitudinal, population-based observational study was conducted to test the fit of three theories of the development of medical expertise (knowledge encapsulation, independent influence, and distinct domains) employing structural equation modelling. Data were collected from 548 physicians (292 men, 53.3%; 256 women, 46.7%; mean age = 24.2 years on admission) who had graduated from medical school 2009-2014. They included (1) Admissions data of undergraduate grade point average and Medical College Admission Test sub-test scores, (2) Course performance data from years 1, 2, and 3 of medical school, and (3) Performance on the NBME exams (i.e., Step 1, Step 2 CK, and Step 3). Statistical fit indices (goodness-of-fit index, GFI; standardized root mean squared residual, SRMR; root mean square error of approximation, RMSEA) and comparative fit [Formula: see text] of the three theories of cognitive development of medical expertise were used to assess model fit. There is support for the knowledge encapsulation three-factor model of clinical competency (GFI = 0.973, SRMR = 0.043, RMSEA = 0.063), which had superior fit indices to both the independent influence and distinct domains theories ([Formula: see text] vs [Formula: see text] [[Formula: see text
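    Since the abstract leans on fit indices whose formulas are not shown, here is a minimal sketch of how the RMSEA can be computed from a model chi-square; the chi-square values and degrees of freedom below are hypothetical, with only the sample size (548) taken from the abstract.

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root mean square error of approximation from the model chi-square."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Hypothetical values for two competing SEM specifications fit to the same sample.
n = 548  # sample size reported in the abstract
print(f"model A RMSEA = {rmsea(chi2=95.2, df=24, n=n):.3f}")
print(f"model B RMSEA = {rmsea(chi2=160.7, df=26, n=n):.3f}")
```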

  1. Blue emitting undecaplatinum clusters

    NASA Astrophysics Data System (ADS)

    Chakraborty, Indranath; Bhuin, Radha Gobinda; Bhat, Shridevi; Pradeep, T.

    2014-07-01

    A blue luminescent 11-atom platinum cluster showing step-like optical features and the absence of plasmon absorption was synthesized. The cluster was purified using high performance liquid chromatography (HPLC). Electrospray ionization (ESI) and matrix assisted laser desorption ionization (MALDI) mass spectrometry (MS) suggest a composition, Pt11(BBS)8, which was confirmed by a range of other experimental tools. The cluster is highly stable and compatible with many organic solvents. Electronic supplementary information (ESI) available: Details of experimental procedures, instrumentation, chromatogram of the crude cluster; SEM/EDAX, DLS, PXRD, TEM, FT-IR, and XPS of the isolated Pt11 cluster; UV/Vis, MALDI MS and SEM/EDAX of isolated 2 and 3; and 195Pt NMR of the K2PtCl6 standard. See DOI: 10.1039/c4nr02778g

  2. Irradiation campaign in the EOLE critical facility of fiber optic Bragg gratings dedicated to the online temperature measurement in zero power research reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellier, Frederic; Cheymol, Guy; Destouches, Christophe

    2015-07-01

    Temperature control during operation of zero-power research reactors contributes to the overall control of experimental conditions and is of major importance, especially when measuring small multiplication-factor variations. Within the framework of the refurbishment of the MASURCA facility, the development of a new temperature measurement system based on optical fiber Bragg grating (FBG) technology is under consideration. In a first step, a series of FBGs is irradiated in the EOLE critical facility with the aim of selecting the most appropriate one. Online temperature measurements are performed during a set of irradiations that should reach a fast neutron fluence of some 10¹⁴ n·cm⁻². The results obtained, especially the Bragg wavelength shifts during the irradiation campaign, are discussed in this paper and compared with data from standard PT100 temperature sensors to highlight possible radiation effects on sensor performance. Work to be conducted during the second step of the project, aiming at a feasibility demonstration using a MASURCA assembly, is also presented. (authors)
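    For context on how FBG readings become temperatures, the sketch below converts a Bragg wavelength shift into a temperature change under a linear, radiation-free response; the ~10 pm/°C sensitivity and the wavelengths are typical textbook assumptions, not calibration values from this campaign.

```python
# Minimal sketch: convert a measured Bragg wavelength shift into a temperature change,
# assuming a linear (radiation-free) response. The sensitivity below is a typical
# silica-fiber value near 1550 nm, not a calibration from the paper.
THERMAL_SENSITIVITY_PM_PER_C = 10.0   # assumed ~10 pm/degC for a standard FBG at 1550 nm

def delta_temperature(lambda_ref_nm: float, lambda_meas_nm: float) -> float:
    """Temperature change inferred from the Bragg wavelength shift (nm -> degC)."""
    shift_pm = (lambda_meas_nm - lambda_ref_nm) * 1000.0
    return shift_pm / THERMAL_SENSITIVITY_PM_PER_C

# Hypothetical readings before and during irradiation.
print(f"dT = {delta_temperature(1550.000, 1550.042):.1f} degC")
```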

  3. The Finite-Surface Method for incompressible flow: a step beyond staggered grid

    NASA Astrophysics Data System (ADS)

    Hokpunna, Arpiruk; Misaka, Takashi; Obayashi, Shigeru

    2017-11-01

    We present a newly developed higher-order finite-surface method for the incompressible Navier-Stokes equations (NSE). This method defines the velocities as surface-averaged values on the surfaces of the pressure cells. Consequently, mass conservation on the pressure cells becomes an exact equation; the only things left to approximate are the momentum equation and the pressure at the new time step. Under certain conditions, the exact mass conservation enables an explicit n-th-order accurate NSE solver to be used with a pressure treatment that is two or four orders less accurate without losing the apparent convergence rate. This feature is not possible with finite volume or finite difference methods. We use Fourier analysis with a model spectrum to determine the condition and find that its range covers standard boundary-layer flows. The formal convergence and the performance of the proposed scheme are compared with a sixth-order finite volume method. Finally, the accuracy and performance of the method are evaluated in turbulent channel flows. This work is partially funded by a research collaboration with IFS, Tohoku University, and the ASEAN+3 funding scheme from CMUIC, Chiang Mai University.
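    A minimal sketch of the bookkeeping behind the exactness claim: with surface-averaged (face) velocities on a uniform staggered grid, each pressure-cell divergence is a sum of face fluxes, so interior fluxes cancel and the total divergence reduces to the boundary flux. The grid size and random velocity fields are illustrative assumptions.

```python
import numpy as np

# Per-cell divergence built from face-averaged velocities on a uniform grid; summing over
# all cells telescopes to the boundary flux, i.e. discrete mass conservation is exact.
nx, ny, h = 8, 8, 1.0
rng = np.random.default_rng(0)
u = rng.normal(size=(nx + 1, ny))   # x-face averaged velocities
v = rng.normal(size=(nx, ny + 1))   # y-face averaged velocities

div = (u[1:, :] - u[:-1, :]) / h + (v[:, 1:] - v[:, :-1]) / h   # per-cell divergence

interior_sum = div.sum() * h * h
boundary_flux = (u[-1, :].sum() - u[0, :].sum() + v[:, -1].sum() - v[:, 0].sum()) * h
print(np.isclose(interior_sum, boundary_flux))   # True: fluxes cancel on interior faces
```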

  4. Use of nitrogen to remove solvent from through oven transfer adsorption desorption interface during analysis of polycyclic aromatic hydrocarbons by large volume injection in gas chromatography.

    PubMed

    Áragón, Alvaro; Toledano, Rosa M; Cortés, José M; Vázquez, Ana M; Villén, Jesús

    2014-04-25

    The through oven transfer adsorption desorption (TOTAD) interface allows large volume injection (LVI) in gas chromatography and the on-line coupling of liquid chromatography and gas chromatography (LC-GC), enabling the LC step to be carried out in normal as well as in reversed phase. However, large amounts of helium, which is both expensive and scarce, are necessary for solvent elimination. We describe how slight modification of the interface and the operating mode allows nitrogen to be used during the solvent elimination steps. In order to evaluate the performance of the new system, volumes ranging from 20 to 100μL of methanolic solutions of four polycyclic aromatic hydrocarbons (PAHs) were sampled. No significant differences were found in the repeatability and sensitivity of the analyses of standard PAH solutions when using nitrogen or helium. The performance using the proposed modification was similar and equally satisfactory when using nitrogen or helium for solvent elimination in the TOTAD interface. In conclusion, the use of nitrogen will make analyses less expensive. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Optimal Filter Estimation for Lucas-Kanade Optical Flow

    PubMed Central

    Sharmin, Nusrat; Brad, Remus

    2012-01-01

    Optical flow algorithms offer a way to estimate motion from a sequence of images. The computation of optical flow plays a key role in several computer vision applications, including motion detection and segmentation, frame interpolation, three-dimensional scene reconstruction, robot navigation, and video compression. In gradient-based optical flow implementations, the pre-filtering step plays a vital role, not only for accurate computation of optical flow but also for the improvement of performance. Generally, in optical flow computation, filtering is applied first to the original input images, and the images are then resized. In this paper, we propose an image filtering approach as a pre-processing step for the pyramidal Lucas-Kanade optical flow algorithm. Based on a study of different filtering methods applied to the iteratively refined Lucas-Kanade algorithm, we concluded on the best filtering practice. As the Gaussian smoothing filter was selected, an empirical approach for estimating the Gaussian variance was introduced. Tested on the Middlebury image sequences, a correlation between the image intensity values and the standard deviation of the Gaussian function was established. Finally, we found that our selection method offers better performance for the Lucas-Kanade optical flow algorithm.
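    As a concrete example of Gaussian pre-filtering ahead of pyramidal Lucas-Kanade, here is a short sketch using OpenCV; the file names, sigma value, and tracker parameters are assumptions rather than the paper's estimated settings.

```python
import cv2
import numpy as np

# Minimal sketch (assumed file names): Gaussian pre-filtering before pyramidal Lucas-Kanade.
prev = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

sigma = 1.5                                   # the paper estimates sigma from image intensity
prev_f = cv2.GaussianBlur(prev, (0, 0), sigma)
curr_f = cv2.GaussianBlur(curr, (0, 0), sigma)

# Track corner features with the pyramidal (coarse-to-fine) Lucas-Kanade tracker.
p0 = cv2.goodFeaturesToTrack(prev_f, maxCorners=200, qualityLevel=0.01, minDistance=7)
p1, status, err = cv2.calcOpticalFlowPyrLK(
    prev_f, curr_f, p0, None, winSize=(21, 21), maxLevel=3)

flow = (p1 - p0)[status.flatten() == 1]
print(f"tracked {len(flow)} points, mean displacement {np.linalg.norm(flow, axis=2).mean():.2f} px")
```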

  6. World-Class Ambitions, Weak Standards: An Excerpt from "The State of State Science Standards 2012"

    ERIC Educational Resources Information Center

    American Educator, 2012

    2012-01-01

    A solid science education program begins by clearly establishing what well-educated youngsters need to learn about this multifaceted domain of human knowledge. The first crucial step is setting clear academic standards for the schools--standards that not only articulate the critical science content students need to learn, but that also properly…

  7. Toward practical 3D radiography of pipeline girth welds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wassink, Casper, E-mail: casper.wassink@applusrtd.com; Hol, Martijn, E-mail: martijn.hol@applusrtd.com; Flikweert, Arjan, E-mail: martijn.hol@applusrtd.com

    2015-03-31

    Digital radiography has made its way into in-the-field girth weld testing. With recent generations of detectors and x-ray tubes it is possible to reach the image quality desired in standards as well as the speed of inspection desired to be competitive with film radiography and automated ultrasonic testing. This paper will show the application of these technologies in the RTD Rayscan system. The method for achieving an image quality that complies with or even exceeds prevailing industrial standards will be presented, as well as the application on pipeline girth welds with CRA layers. A next step in development will be to also achieve a measurement of weld flaw height to allow for performing an Engineering Critical Assessment on the weld. This will allow for similar acceptance limits as currently used with Automated Ultrasonic Testing of pipeline girth welds. Although a sufficient sizing accuracy was already demonstrated and qualified in the TomoCAR system, testing in some applications is restricted to time limits. The paper will present some experiments that were performed to achieve flaw height approximation within these time limits.

  8. A Framework for Simulation of Aircraft Flyover Noise Through a Non-Standard Atmosphere

    NASA Technical Reports Server (NTRS)

    Arntzen, Michael; Rizzi, Stephen A.; Visser, Hendrikus G.; Simons, Dick G.

    2012-01-01

    This paper describes a new framework for the simulation of aircraft flyover noise through a non-standard atmosphere. Central to the framework is a ray-tracing algorithm which defines multiple curved propagation paths, if the atmosphere allows, between the moving source and listener. Because each path has a different emission angle, synthesis of the sound at the source must be performed independently for each path. The time delay, spreading loss and absorption (ground and atmosphere) are integrated along each path, and applied to each synthesized aircraft noise source to simulate a flyover. A final step assigns each resulting signal to its corresponding receiver angle for the simulation of a flyover in a virtual reality environment. Spectrograms of the results from a straight path and a curved path modeling assumption are shown. When the aircraft is at close range, the straight path results are valid. Differences appear especially when the source is relatively far away at shallow elevation angles. These differences, however, are not significant in common sound metrics. While the framework used in this work performs off-line processing, it is conducive to real-time implementation.
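    To make the per-path bookkeeping concrete, the sketch below computes the time delay, spherical spreading loss, and a simple absorption term for a straight propagation path; the uniform sound speed and the flat absorption coefficient are placeholder assumptions, not the framework's ray-traced, atmosphere-dependent values.

```python
import math

# Straight-path propagation quantities analogous to what the framework integrates along each ray.
SPEED_OF_SOUND = 340.0        # m/s, assumed uniform here (the paper allows non-standard profiles)
ALPHA_DB_PER_M = 0.005        # assumed broadband atmospheric absorption, dB/m

def propagate(distance_m: float, ref_distance_m: float = 1.0):
    delay_s = distance_m / SPEED_OF_SOUND
    spreading_db = 20.0 * math.log10(distance_m / ref_distance_m)   # spherical spreading
    absorption_db = ALPHA_DB_PER_M * distance_m
    return delay_s, spreading_db + absorption_db

for r in (200.0, 1000.0, 5000.0):
    delay, loss = propagate(r)
    print(f"r = {r:6.0f} m: delay = {delay:5.2f} s, attenuation = {loss:5.1f} dB")
```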

  9. Protecting your eyes in the laser operating room.

    PubMed

    Sallavanti, R A

    1995-01-01

    1. Laser protective eyewear is nearly as important to the OR nurse as the surgical mask in an operating room where laser surgery is performed. 2. Most hospitals require OR personnel to wear protective eyewear during laser procedures in voluntary compliance with American National Standards Institute (ANSI) Z136.3 for the safe use of lasers in health care facilities. 3. The basic steps to protecting your eyes are as follows: Select the appropriate eyewear (plastic or glass); make sure the eyewear fits properly; wear the protective lenses during laser testing and operation; and heed your laser safety officer.

  10. Development and Evaluation of a Training Program for Organ Procurement Coordinators Using Standardized Patient Methodology.

    PubMed

    Odabasi, Orhan; Elcin, Melih; Uzun Basusta, Bilge; Gulkaya Anik, Esin; Aki, Tuncay F; Bozoklar, Ata

    2015-12-01

    The low rate of consent by next of kin of donor-eligible patients is a major limiting factor in organ transplant. Educating health care professionals about their role may lead to measurable improvements in the process. Our aim was to describe the developmental steps of a communication skills training program for health care professionals using standardized patients and to evaluate the results. We developed a rubric and 5 cases for standardized family interviews. The 20 participants interviewed standardized families at the beginning and at the end of the training course, with interviews followed by debriefing sessions. Participants also provided feedback before and after the course. The performance of each participant was assessed by his or her peers using the rubric. We calculated the generalizability coefficient to measure the reliability of the rubric and used the Wilcoxon signed rank test to compare achievement among participants. Statistical analyses were performed with SPSS software (SPSS: An IBM Company, version 17.0, IBM Corporation, Armonk, NY, USA). All participants received higher scores in their second interview, including novice participants who expressed great discomfort during their first interview. The participants rated the scenarios and the standardized patients as very representative of real-life situations, with feedback forms showing that the interviews, the video recording sessions, and the debriefing sessions contributed to their learning. Our program was designed to meet the current expectations and implications in the field of donor consent from next of kin. Results showed that our training program developed using standardized patient methodology was effective in obtaining the communication skills needed for family interviews during the consent process. The rubric developed during the study was a valid and reliable assessment tool that could be used in further educational activities. The participants showed significant improvements in communication skills.
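    As an illustration of the pre/post comparison described above, the sketch below runs a Wilcoxon signed-rank test on paired rubric scores; the scores themselves are hypothetical.

```python
from scipy.stats import wilcoxon

# Hypothetical rubric scores (out of 30 steps) for the same participants before and after
# the standardized-patient course; the Wilcoxon signed-rank test compares paired interviews.
pre  = [14, 16, 12, 18, 15, 17, 13, 19, 16, 14]
post = [22, 25, 20, 26, 23, 24, 21, 27, 25, 22]

stat, p = wilcoxon(pre, post)
print(f"Wilcoxon W = {stat:.1f}, p = {p:.4f}")
```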

  11. Measuring individual work performance: identifying and selecting indicators.

    PubMed

    Koopmans, Linda; Bernaards, Claire M; Hildebrandt, Vincent H; de Vet, Henrica C W; van der Beek, Allard J

    2014-01-01

    Theoretically, individual work performance (IWP) can be divided into four dimensions: task performance, contextual performance, adaptive performance, and counterproductive work behavior. However, there is no consensus on the indicators used to measure these dimensions. This study was designed to (1) identify indicators for each dimension, (2) select the most relevant indicators, and (3) determine the relative weight of each dimension in ratings of work performance. IWP indicators were identified from multiple research disciplines, via literature, existing questionnaires, and expert interviews. Subsequently, experts selected the most relevant indicators per dimension and scored the relative weight of each dimension in ratings of IWP. In total, 128 unique indicators were identified. Twenty-three of these indicators were selected by experts as most relevant for measuring IWP. Task performance determined 36% of the work performance rating, while the other three dimensions respectively determined 22%, 20% and 21% of the rating. Notable consensus was found on relevant indicators of IWP, reducing the number from 128 to 23 relevant indicators. This provides an important step towards the development of a standardized, generic and short measurement instrument for assessing IWP.
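    To show how the reported weights could be used in practice, here is a minimal sketch that combines the four dimension scores into a single weighted rating; the 0-10 dimension ratings and the reverse-scoring convention for counterproductive work behavior are assumptions.

```python
# Relative weights taken from the abstract (36/22/20/21%); dimension ratings are hypothetical.
WEIGHTS = {
    "task_performance": 0.36,
    "contextual_performance": 0.22,
    "adaptive_performance": 0.20,
    "counterproductive_behavior": 0.21,   # typically reverse-scored before weighting
}

ratings = {
    "task_performance": 8.0,
    "contextual_performance": 7.0,
    "adaptive_performance": 6.5,
    "counterproductive_behavior": 9.0,    # already reverse-scored here (higher = less CWB)
}

overall = sum(WEIGHTS[d] * ratings[d] for d in WEIGHTS)
print(f"weighted IWP score = {overall:.2f} / 10")
```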

  12. 40 CFR 420.126 - Pretreatment standards for new sources (PSNS).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) EFFLUENT GUIDELINES AND STANDARDS IRON AND STEEL MANUFACTURING POINT SOURCE CATEGORY Hot Coating... rinse step. (2) [Reserved] (b) Galvanizing and other coatings—(1) Wire products and fasteners. Subpart L...

  13. Associations Between United States Medical Licensing Examination (USMLE) and Internal Medicine In-Training Examination (IM-ITE) Scores

    PubMed Central

    Zeger, Scott L.; Kolars, Joseph C.

    2008-01-01

    Background Little is known about the associations of previous standardized examination scores with scores on subsequent standardized examinations used to assess medical knowledge in internal medicine residencies. Objective To examine associations of previous standardized test scores with subsequent standardized test scores. Design Retrospective cohort study. Participants One hundred ninety-five internal medicine residents. Methods Bivariate associations of United States Medical Licensing Examination (USMLE) Steps and Internal Medicine In-Training Examination (IM-ITE) scores were determined. Random effects analysis adjusting for repeated administrations of the IM-ITE and other variables known or hypothesized to affect IM-ITE score allowed for discrimination of associations of individual USMLE Step scores with IM-ITE scores. Results In bivariate associations, USMLE scores explained 17% to 27% of the variance in IM-ITE scores, and previous IM-ITE scores explained 66% of the variance in subsequent IM-ITE scores. Regression coefficients (95% CI) for adjusted associations of each USMLE Step with IM-ITE scores were USMLE-1 0.19 (0.12, 0.27), USMLE-2 0.23 (0.17, 0.30), and USMLE-3 0.19 (0.09, 0.29). Conclusions No single USMLE Step is more strongly associated with IM-ITE scores than the others. Because previous IM-ITE scores are strongly associated with subsequent IM-ITE scores, appropriate modeling, such as random effects methods, should be used to account for previous IM-ITE administrations in studies for which IM-ITE score is an outcome. PMID:18612735

  14. Associations between United States Medical Licensing Examination (USMLE) and Internal Medicine In-Training Examination (IM-ITE) scores.

    PubMed

    McDonald, Furman S; Zeger, Scott L; Kolars, Joseph C

    2008-07-01

    Little is known about the associations of previous standardized examination scores with scores on subsequent standardized examinations used to assess medical knowledge in internal medicine residencies. To examine associations of previous standardized test scores with subsequent standardized test scores. Retrospective cohort study. One hundred ninety-five internal medicine residents. Bivariate associations of United States Medical Licensing Examination (USMLE) Steps and Internal Medicine In-Training Examination (IM-ITE) scores were determined. Random effects analysis adjusting for repeated administrations of the IM-ITE and other variables known or hypothesized to affect IM-ITE score allowed for discrimination of associations of individual USMLE Step scores with IM-ITE scores. In bivariate associations, USMLE scores explained 17% to 27% of the variance in IM-ITE scores, and previous IM-ITE scores explained 66% of the variance in subsequent IM-ITE scores. Regression coefficients (95% CI) for adjusted associations of each USMLE Step with IM-ITE scores were USMLE-1 0.19 (0.12, 0.27), USMLE-2 0.23 (0.17, 0.30), and USMLE-3 0.19 (0.09, 0.29). No single USMLE Step is more strongly associated with IM-ITE scores than the others. Because previous IM-ITE scores are strongly associated with subsequent IM-ITE scores, appropriate modeling, such as random effects methods, should be used to account for previous IM-ITE administrations in studies for which IM-ITE score is an outcome.
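    As a sketch of the random-effects modelling the authors recommend, the example below fits a mixed model with repeated IM-ITE administrations nested within resident and USMLE scores as fixed effects, using statsmodels; the data are simulated and the variable names are assumptions, so only the modelling pattern mirrors the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data: 195 residents, 3 yearly IM-ITE administrations each (illustrative only).
rng = np.random.default_rng(1)
n_res, n_years = 195, 3
resident = np.repeat(np.arange(n_res), n_years)
usmle1 = np.repeat(rng.normal(220, 15, n_res), n_years)
usmle2 = np.repeat(rng.normal(230, 15, n_res), n_years)
year = np.tile(np.arange(n_years), n_res)
im_ite = (30 + 0.2 * usmle1 + 0.25 * usmle2 + 5 * year
          + np.repeat(rng.normal(0, 5, n_res), n_years)      # resident-level random effect
          + rng.normal(0, 5, n_res * n_years))               # administration-level noise

df = pd.DataFrame(dict(resident=resident, year=year, usmle1=usmle1, usmle2=usmle2, im_ite=im_ite))
model = smf.mixedlm("im_ite ~ usmle1 + usmle2 + year", df, groups=df["resident"]).fit()
print(model.summary())
```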

  15. Colorimetric carbon dioxide detector to determine accidental tracheal feeding tube placement.

    PubMed

    Howes, Daniel W; Shelley, Eric S; Pickett, William

    2005-04-01

    To determine the accuracy of colorimetric CO2 detection compared to the reference standard two-step radiological confirmation of feeding tube position. A prospective study was conducted with patients presenting to a 21-bed medical-surgical intensive care unit. An adapter was developed using an endotracheal tube adapter to connect a colorimetric CO2 detector to a feeding tube in an airtight manner. In part I of the study a feeding tube connected to the colorimeter was inserted into the endotracheal tubes of ten ventilated patients to test the device's ability to detect tracheal placement. In part II patients undergoing feeding tube insertion had tube position confirmed with the colorimeter as well as the reference standard two-step x-ray. In phase I the colorimeter correctly identified tracheal placement in all ten patients. In phase II 93/100 procedures ultimately were eligible; the colorimeter had a sensitivity of 0.88 (95% confidence interval: 0.65-1.00) and specificity of 0.99 (0.97-1.00). The device missed one of the eight tracheal placements. Agreement between the colorimeter and two-step x-ray interpretations was excellent (Kappa 0.86; standard error 0.10). We describe a novel, convenient method to confirm esophageal feeding tube placement. The device is easily assembled and inexpensive, but should not be reused. Colorimetric determination of tracheal feeding tube placement with this device has excellent agreement with the reference standard two-step radiological technique.
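    To make the agreement statistics concrete, the sketch below recovers sensitivity, specificity, and Cohen's kappa from a 2x2 table; the exact cell counts are assumptions consistent with, but not taken from, the abstract (one missed tracheal placement out of eight, and an illustrative split of the remaining 85 procedures).

```python
# Hypothetical 2x2 counts: colorimeter call vs. two-step x-ray reference.
tp, fn = 7, 1      # tracheal placements detected / missed
fp, tn = 1, 84     # esophageal placements flagged as tracheal / correctly cleared

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

# Cohen's kappa for agreement between device and reference standard.
n = tp + fp + fn + tn
po = (tp + tn) / n
pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
kappa = (po - pe) / (1 - pe)
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, kappa={kappa:.2f}")
```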

  16. Alaska Mathematics Standards

    ERIC Educational Resources Information Center

    Alaska Department of Education & Early Development, 2012

    2012-01-01

    High academic standards are an important first step in ensuring that all Alaska's students have the tools they need for success. These standards reflect the collaborative work of Alaskan educators and national experts from the nonprofit National Center for the Improvement of Educational Assessment. Further, they are informed by public comments.…

  17. Performance of a visuomotor walking task in an augmented reality training setting.

    PubMed

    Haarman, Juliet A M; Choi, Julia T; Buurke, Jaap H; Rietman, Johan S; Reenalda, Jasper

    2017-12-01

    Visual cues can be used to train walking patterns. Here, we studied the performance and learning capacities of healthy subjects executing a high-precision visuomotor walking task, in an augmented reality training set-up. A beamer was used to project visual stepping targets on the walking surface of an instrumented treadmill. Two speeds were used to manipulate task difficulty. All participants (n = 20) had to change their step length to hit visual stepping targets with a specific part of their foot, while walking on a treadmill over seven consecutive training blocks, each block composed of 100 stepping targets. Distance between stepping targets was varied between short, medium and long steps. Training blocks could either be composed of random stepping targets (no fixed sequence was present in the distance between the stepping targets) or sequenced stepping targets (repeating fixed sequence was present). Random training blocks were used to measure non-specific learning and sequenced training blocks were used to measure sequence-specific learning. Primary outcome measures were performance (% of correct hits), and learning effects (increase in performance over the training blocks: both sequence-specific and non-specific). Secondary outcome measures were the performance and stepping-error in relation to the step length (distance between stepping target). Subjects were able to score 76% and 54% at first try for lower speed (2.3 km/h) and higher speed (3.3 km/h) trials, respectively. Performance scores did not increase over the course of the trials, nor did the subjects show the ability to learn a sequenced walking task. Subjects were better able to hit targets while increasing their step length, compared to shortening it. In conclusion, augmented reality training by use of the current set-up was intuitive for the user. Suboptimal feedback presentation might have limited the learning effects of the subjects. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. United States Medical Licensing Examination and American Board of Pediatrics Certification Examination Results: Does the Residency Program Contribute to Trainee Achievement.

    PubMed

    Welch, Thomas R; Olson, Brad G; Nelsen, Elizabeth; Beck Dallaghan, Gary L; Kennedy, Gloria A; Botash, Ann

    2017-09-01

    To determine whether training site or prior examinee performance on the US Medical Licensing Examination (USMLE) step 1 and step 2 might predict pass rates on the American Board of Pediatrics (ABP) certifying examination. Data from graduates of pediatric residency programs completing the ABP certifying examination between 2009 and 2013 were obtained. For each, results of the initial ABP certifying examination were obtained, as well as results on National Board of Medical Examiners (NBME) step 1 and step 2 examinations. Hierarchical linear modeling was used to nest first-time ABP results within training programs to isolate program contribution to ABP results while controlling for USMLE step 1 and step 2 scores. Stepwise linear regression was then used to determine which of these examinations was a better predictor of ABP results. A total of 1110 graduates of 15 programs had complete testing results and were subject to analysis. Mean ABP scores for these programs ranged from 186.13 to 214.32. The hierarchical linear model suggested that the interaction of step 1 and 2 scores predicted ABP performance (F[1,1007.70] = 6.44, P = .011). By conducting a multilevel model by training program, both USMLE step examinations predicted first-time ABP results (b = .002, t = 2.54, P = .011). Linear regression analyses indicated that step 2 results were a better predictor of ABP performance than step 1 or a combination of the two USMLE scores. Performance on the USMLE examinations, especially step 2, predicts performance on the ABP certifying examination. The contribution of training site to ABP performance was statistically significant, though contributed modestly to the effect compared with prior USMLE scores. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Effect of step width manipulation on tibial stress during running.

    PubMed

    Meardon, Stacey A; Derrick, Timothy R

    2014-08-22

    Narrow step width has been linked to variables associated with tibial stress fracture. The purpose of this study was to evaluate the effect of step width on bone stresses using a standardized model of the tibia. 15 runners ran at their preferred 5k running velocity in three running conditions: preferred step width (PSW) and PSW ± 5% of leg length. 10 successful trials of force and 3-D motion data were collected. A combination of inverse dynamics, musculoskeletal modeling and beam theory was used to estimate stresses applied to the tibia using subject-specific anthropometrics and motion data. The tibia was modeled as a hollow ellipse. Multivariate analysis revealed that tibial stresses at the distal 1/3 of the tibia differed with step width manipulation (p=0.002). Compression on the posterior and medial aspects of the tibia was inversely related to step width, such that as step width increased, compression on the surface of the tibia decreased (linear trend p=0.036 and 0.003). Similarly, tension on the anterior surface of the tibia decreased as step width increased (linear trend p=0.029). Widening step width linearly reduced shear stress at all 4 sites (p<0.001 for all). The data from this study suggest that stresses experienced by the tibia during running were influenced by step width when using a standardized model of the tibia. Wider step widths were generally associated with reduced loading of the tibia and may benefit runners at risk of or experiencing stress injury at the tibia, especially if they present with a crossover running style. Copyright © 2014 Elsevier Ltd. All rights reserved.
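    As a worked example of the hollow-ellipse beam model, the sketch below combines axial and bending stress at the anterior and posterior surfaces of an elliptical cross-section; the semi-axes, axial force, and bending moment are illustrative assumptions, not the subject-specific values used in the study.

```python
import math

# Hollow-ellipse cross-section: outer and inner (medullary) semi-axes in metres (assumed).
a_o, b_o = 0.014, 0.011     # outer semi-axes, anterior-posterior and medial-lateral
a_i, b_i = 0.009, 0.006     # inner semi-axes

area = math.pi * (a_o * b_o - a_i * b_i)                    # cross-sectional area
i_ml = math.pi / 4 * (b_o * a_o**3 - b_i * a_i**3)          # second moment about the ML axis

axial_force = -2500.0       # N, compressive (assumed)
bending_moment = 45.0       # N*m about the ML axis (assumed)

def surface_stress(y: float) -> float:
    """Normal stress (Pa) at distance y from the neutral axis along the AP direction."""
    return axial_force / area + bending_moment * y / i_ml

print(f"anterior surface:  {surface_stress(+a_o) / 1e6:6.1f} MPa")
print(f"posterior surface: {surface_stress(-a_o) / 1e6:6.1f} MPa")
```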

  20. Transfer effects of step training on stepping performance in untrained directions in older adults: A randomized controlled trial.

    PubMed

    Okubo, Yoshiro; Menant, Jasmine; Udyavar, Manasa; Brodie, Matthew A; Barry, Benjamin K; Lord, Stephen R; L Sturnieks, Daina

    2017-05-01

    Although step training improves the ability of quick stepping, some home-based step training systems train limited stepping directions and may cause harm by reducing stepping performance in untrained directions. This study examines the possible transfer effects of step training on stepping performance in untrained directions in older people. Fifty four older adults were randomized into: forward step training (FT); lateral plus forward step training (FLT); or no training (NT) groups. FT and FLT participants undertook a 15-min training session involving 200 step repetitions. Prior to and post training, choice stepping reaction time and stepping kinematics in untrained, diagonal and lateral directions were assessed. Significant interactions of group and time (pre/post-assessment) were evident for the first step after training indicating negative (delayed response time) and positive (faster peak stepping speed) transfer effects in the diagonal direction in the FT group. However, when the second to the fifth steps after training were included in the analysis, there were no significant interactions of group and time for measures in the diagonal stepping direction. Step training only in the forward direction improved stepping speed but may acutely slow response times in the untrained diagonal direction. However, this acute effect appears to dissipate after a few repeated step trials. Step training in both forward and lateral directions appears to induce no negative transfer effects in diagonal stepping. These findings suggest home-based step training systems present low risk of harm through negative transfer effects in untrained stepping directions. ANZCTR 369066. Copyright © 2017 Elsevier B.V. All rights reserved.
