Sample records for hitachi computers

  1. Automation of Hitachi U–2000 spectrophotometer with RS–232C–based computer

    PubMed Central

    Kumar, K. Senthil; Lakshmi, B. S.; Pennathur, Gautam

    1998-01-01

    The interfacing of a commonly used spectrophotometer, the Hitachi U2000, through its RS–232C port to an IBM-compatible computer is described. The hardware for data acquisition was designed by suitably modifying readily available materials, and the software was written in the C programming language. The various steps involved in these procedures are elucidated in detail. The efficacy of the procedure was tested experimentally by running the visible spectrum of a cyanine dye. The spectrum was plotted on a printer connected to the computer. The spectrum was also plotted by transforming the abscissa to the wavenumber scale; this was carried out using another module written in C. The efficiency of the whole set-up was calculated using standard procedures. PMID:18924834
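    The abscissa transformation mentioned above (wavelength to wavenumber, originally a C module) can be sketched as follows. This is a minimal illustration, not the authors' code, using the standard relation wavenumber (cm⁻¹) = 10⁷ / wavelength (nm):

    ```python
    def to_wavenumber_cm(wavelength_nm):
        """Convert a wavelength in nanometres to a wavenumber in cm^-1."""
        return 1.0e7 / wavelength_nm

    def transform_spectrum(spectrum):
        """Transform (wavelength_nm, absorbance) pairs into
        (wavenumber_cm^-1, absorbance) pairs, as for re-plotting a spectrum."""
        return [(to_wavenumber_cm(wl), a) for wl, a in spectrum]
    ```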

  2. Immediate behavioural responses to earthquakes in Christchurch, New Zealand, and Hitachi, Japan.

    PubMed

    Lindell, Michael K; Prater, Carla S; Wu, Hao Che; Huang, Shih-Kai; Johnston, David M; Becker, Julia S; Shiroshita, Hideyuki

    2016-01-01

    This study examines people's immediate responses to earthquakes in Christchurch, New Zealand, and Hitachi, Japan. Data collected from 257 respondents in Christchurch and 332 respondents in Hitachi revealed notable similarities between the two cities in people's emotional reactions, risk perceptions, and immediate protective actions during the events. Respondents' physical, household, and social contexts were quite similar, but Hitachi residents reported somewhat higher levels of emotional reaction and risk perception than did Christchurch residents. Contrary to the recommendations of emergency officials, the most frequent response of residents in both cities was to freeze. Christchurch residents were more likely than Hitachi residents to drop to the ground and take cover, whereas Hitachi residents were more likely than Christchurch residents to immediately evacuate the building in which they were situated. There were relatively small correlations between immediate behavioural responses and demographic characteristics, earthquake experience, and physical, social, or household context. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.

  3. 76 FR 72674 - Approval for Expansion of Manufacturing Authority, Foreign-Trade Subzone 29F, Hitachi Automotive...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-25

    ... DEPARTMENT OF COMMERCE Foreign-Trade Zones Board [Order No. 1798] Approval for Expansion of Manufacturing Authority, Foreign-Trade Subzone 29F, Hitachi Automotive Systems Americas, Inc., (Automotive... requested an expansion of the scope of manufacturing authority on behalf of Hitachi Automotive Systems...

  4. 76 FR 19746 - Approval for Subzone Expansion and Expansion of Manufacturing Authority; Foreign-Trade Subzone...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-08

    ... and Expansion of Manufacturing Authority; Foreign-Trade Subzone 29F; Hitachi Automotive Systems Americas, Inc. (Automotive Components); Harrodsburg, KY Pursuant to its authority under the Foreign-Trade... on behalf of Hitachi Automotive Systems Americas, Inc. (Hitachi), operator of Subzone 29F at the...

  5. 77 FR 13367 - General Electric-Hitachi Global Laser Enrichment, LLC, Proposed Laser-Based Uranium Enrichment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-06

    ... NUCLEAR REGULATORY COMMISSION [NRC-2009-0157] General Electric-Hitachi Global Laser Enrichment... Impact Statement (EIS) for the proposed General Electric- Hitachi Global Laser Enrichment, LLC (GLE... issue a license to GLE, pursuant to Title 10 of the Code of Federal Regulations (10 CFR) parts 30, 40...

  6. 76 FR 14437 - Economic Simplified Boiling Water Reactor Standard Design: GE Hitachi Nuclear Energy; Issuance of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-16

    ... NUCLEAR REGULATORY COMMISSION [NRC-2011-0055] Economic Simplified Boiling Water Reactor Standard Design: GE Hitachi Nuclear Energy; Issuance of Final Design Approval The U.S. Nuclear Regulatory Commission has issued a final design approval (FDA) to GE Hitachi Nuclear Energy (GEH) for the economic...

  7. 76 FR 4948 - GE Hitachi Nuclear Energy; Notice of Receipt and Availability of an Application for Renewal of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-27

    ... NUCLEAR REGULATORY COMMISSION [NRC-2011-0020] GE Hitachi Nuclear Energy; Notice of Receipt and... December 7, 2010, GE Hitachi Nuclear Energy (GEH) filed with the U.S. Nuclear Regulatory Commission (NRC..., Certifications, and Approvals for Nuclear Power Plants,'' an application for a design certification (DC) renewal...

  8. 76 FR 9612 - GE Hitachi Nuclear Energy; Acceptance for Docketing of an Application for Renewal of the U.S...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-18

    ... NUCLEAR REGULATORY COMMISSION [Docket No. 52-045; NRC-2011-0020] GE Hitachi Nuclear Energy... Certification On December 7, 2010, GE Hitachi Nuclear Energy (GEH) submitted an application to the U.S. Nuclear... and Approvals for Nuclear Power Plants.'' A notice of receipt and availability of this application was...

  9. Influence of photoisomers in bilirubin determinations on Kodak Ektachem and Hitachi analysers in neonatal specimens study of the contribution of structural and configurational isomers.

    PubMed

    Gulian, J M; Dalmasso, C; Millet, V; Unal, D; Charrel, M

    1995-08-01

    We compared data obtained with the Kodak Ektachem and Hitachi 717 Analysers and HPLC from 83 neonates under phototherapy. Total bilirubin values determined with the Kodak and Hitachi are in good agreement, but we observed a large discrepancy in the results for conjugated (Kodak) and direct (Hitachi) bilirubin. HPLC revealed that all the samples contained configurational isomers, while only 7.7% and 30.8% contained conjugated bilirubin and structural isomers, respectively. We developed a device for the specific and quantitative production of configurational or structural isomers, by irradiation with blue or green light. In vitro, total bilirubin values are coherent for the routine analysers in the presence of configurational or structural isomers. With configurational isomers, unconjugated bilirubin (Kodak) is lower than total bilirubin (Kodak), and conjugated bilirubin (Kodak) is always equal to zero, so the apparatus gives a false positive response for delta bilirubin. In contrast, the direct bilirubin (Hitachi) is constant. Furthermore, in the presence of structural isomers, unconjugated bilirubin (Kodak) is unexpectedly higher than total bilirubin (Kodak), conjugated bilirubin (Kodak) is proportional to the quantity of these isomers, and direct bilirubin (Hitachi) is constant. The contribution of photoisomers in bilirubin measurements is discussed.

  10. CORBASec Used to Secure Distributed Aerospace Propulsion Simulations

    NASA Technical Reports Server (NTRS)

    Blaser, Tammy M.

    2003-01-01

    The NASA Glenn Research Center and its industry partners are developing a Common Object Request Broker (CORBA) Security (CORBASec) test bed to secure their distributed aerospace propulsion simulations. Glenn has been working with its aerospace propulsion industry partners to deploy the Numerical Propulsion System Simulation (NPSS) object-based technology. NPSS is a program focused on reducing the cost and time in developing aerospace propulsion engines. It was developed by Glenn and is being managed by the NASA Ames Research Center as the lead center reporting directly to NASA Headquarters' Aerospace Technology Enterprise. Glenn is an active domain member of the Object Management Group: an open membership, not-for-profit consortium that produces and manages computer industry specifications (i.e., CORBA) for interoperable enterprise applications. When NPSS is deployed, it will assemble a distributed aerospace propulsion simulation scenario from proprietary analytical CORBA servers and execute them with security afforded by the CORBASec implementation. The NPSS CORBASec test bed was initially developed with the TPBroker Security Service product (Hitachi Computer Products (America), Inc., Waltham, MA) using the Object Request Broker (ORB), which is based on the TPBroker Basic Object Adaptor, and using NPSS software across different firewall products. The test bed has been migrated to the Portable Object Adaptor architecture using the Hitachi Security Service product based on the VisiBroker 4.x ORB (Borland, Scotts Valley, CA) and on the Orbix 2000 ORB (Dublin, Ireland, with U.S. headquarters in Waltham, MA). Glenn, GE Aircraft Engines, and Pratt & Whitney Aircraft are the initial industry partners contributing to the NPSS CORBASec test bed. The test bed uses Security SecurID (RSA Security Inc., Bedford, MA) two-factor token-based authentication together with Hitachi Security Service digital-certificate-based authentication to validate the various NPSS users. 
The test bed is expected to demonstrate NPSS CORBASec-specific policy functionality, confirm adequate performance, and validate the required Internet configuration in a distributed collaborative aerospace propulsion environment.

  11. Evaluation of Shipbuilding CAD/CAM/CIM Systems - Phase II (Requirements for Future Systems)

    DTIC Science & Technology

    1997-02-01

    INNOVATION MARINE INDUSTRY STANDARDS WELDING INDUSTRIAL ENGINEERING EDUCATION AND TRAINING THE NATIONAL SHIPBUILDING RESEARCH PROGRAM February 1997 NSRP 0479...an analysis of CAD/CAM/CIM in shipyards, ship-design software firms, and allied industries in Europe, Japan and the U.S. The purpose of the analysis...possible: Black and Veatch Hitachi Ariake Works Industrial Technology Institute Intergraph Corporation Kockums Computer Systems Mitsubishi Heavy Industries

  12. Massively parallel quantum computer simulator

    NASA Astrophysics Data System (ADS)

    De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.

    2007-01-01

    We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, a Cray X1E, an SGI Altix 3700 and clusters of PCs running Windows XP. We study the performance of the software by simulating quantum computers containing up to 36 qubits, using up to 4096 processors and up to 1 TB of memory. Our results demonstrate that the simulator exhibits nearly ideal scaling as a function of the number of processors and suggest that the simulation software described in this paper may also serve as a benchmark for testing high-end parallel computers.
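    The state-vector simulation described above can be illustrated with a minimal sketch (this is not the authors' software): an n-qubit quantum computer is stored as 2^n complex amplitudes, and a single-qubit gate such as the Hadamard is applied by updating amplitude pairs that differ only in the target bit.

    ```python
    import math

    def apply_hadamard(state, target):
        """Apply a Hadamard gate to `target` qubit of a state vector of
        length 2**n_qubits. Pairs of amplitudes whose indices differ only
        in bit `target` are mixed as (a+b)/sqrt(2), (a-b)/sqrt(2)."""
        h = 1.0 / math.sqrt(2.0)
        step = 1 << target
        new = list(state)
        for i in range(len(state)):
            if i & step == 0:
                a, b = state[i], state[i | step]
                new[i] = h * (a + b)
                new[i | step] = h * (a - b)
        return new

    # Start 2 qubits in |00> and put qubit 0 into an equal superposition.
    state = apply_hadamard([1.0, 0.0, 0.0, 0.0], target=0)
    ```

    The memory cost of this representation (16 bytes per complex amplitude) is what limits such simulators: 36 qubits already require about 1 TB, consistent with the figures quoted above.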

  13. ONRASIA Scientific Information Bulletin, Volume 16, Number 1

    DTIC Science & Technology

    1991-03-01

    be expressed naturally in an and hence the programs produced by pline. They range from computing the algebraic language such as Fortran, these efforts...years devel- gram an iterative scheme to solve the function satisfies oping vectorizing compilers for Hitachi. problem. This is quite natural to do in...for it ential equations to be expressed in a on the plate, with 0,=1 at the outside to compile into efficient vectorizable natural mathematical syntax

  14. 75 FR 36447 - Notice of Availability of Draft Environmental Impact Statement and Public Meetings for the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-25

    ... Statement and Public Meetings for the General Electric-Hitachi Global Laser Enrichment, LLC Proposed Laser... the proposed General Electric-Hitachi (GEH) Global Laser Enrichment (GLE) Uranium Enrichment Facility... to locate the facility on the existing General Electric Company (GE) site near Wilmington, North...

  15. 75 FR 9451 - Notice of Receipt and Availability of Environmental Report Supplement 2 for the Proposed GE...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-02

    ... Availability of Environmental Report Supplement 2 for the Proposed GE-Hitachi Global Laser Enrichment Laser- Based Uranium Enrichment Facility On January 13, 2009, GE-Hitachi Global Laser Enrichment, LLC (GLE) was..., operation, and decommissioning of a laser-based uranium enrichment facility. The proposed facility would be...

  16. 77 FR 14523 - Western Digital Corporation; Analysis of Agreement Containing Consent Order to Aid Public Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-12

    ... of Viviti Technologies Ltd., formerly known as Hitachi Global Storage Technologies Ltd. (``HGST''), a... negotiate the purchase price of desktop HDDs at a global level. The desktop HDD market is highly... Digital'') proposed acquisition of Viviti Technologies Ltd., formerly known as Hitachi Global Storage...

  17. 75 FR 1819 - GE-Hitachi Global Laser Enrichment LLC; (GLE Commercial Facility); Notice of Receipt of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-13

    ... Energy Americas LLC (GEH), which is a wholly owned subsidiary of GE-Hitachi Nuclear Energy Holdings LLC..., 2009, the NRC published notice of its intent to prepare an Environmental Impact Statement (EIS) on the... on the issuance of a license is completed. See Notice of Intent to Prepare an Environmental Impact...

  18. High-speed prediction of crystal structures for organic molecules

    NASA Astrophysics Data System (ADS)

    Obata, Shigeaki; Goto, Hitoshi

    2015-02-01

    We developed a master-worker type parallel algorithm for allocating crystal structure optimization tasks to distributed compute nodes, in order to improve the performance of crystal structure prediction simulations. Performance experiments were carried out on the TUT-ADSIM supercomputer system (HITACHI HA8000-tc/HT210). The experimental results show that our parallel algorithm achieved speed-ups of 214 and 179 times using 256 processor cores on crystal structure optimizations in predictions of crystal structures for 3-aza-bicyclo(3.3.1)nonane-2,4-dione and 2-diazo-3,5-cyclohexadiene-1-one, respectively. We expect that this parallel algorithm can reduce the computational cost of crystal structure predictions in general.
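    The master-worker pattern described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: `optimize_structure` is a hypothetical stand-in for one crystal-structure optimization, and a local thread pool stands in for the distributed compute nodes.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def optimize_structure(structure_id):
        """Hypothetical stand-in for one independent crystal-structure
        optimization; returns (id, placeholder 'energy')."""
        return structure_id, structure_id ** 2

    def master(task_ids, n_workers=4):
        """Master: farm independent optimization tasks out to a pool of
        workers and collect the results keyed by structure id."""
        with ThreadPoolExecutor(max_workers=n_workers) as pool:
            return dict(pool.map(optimize_structure, task_ids))

    results = master(range(5))
    ```

    Because the candidate-structure optimizations are independent, this task-farming scheme scales nearly linearly until load imbalance or the master's dispatch overhead dominates.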

  19. Bioenergetic Approaches and Inflammation of MPTP Toxicity

    DTIC Science & Technology

    2006-09-01

    with 8 strokes of pestle A followed by 8 strokes of pestle B. The brain homogenate was centrifuged at 1250 g for 3 min; pellet was discarded and...legends to figures. Fluorescence of safranin O was measured with an F4500 fluorimeter (Hitachi, Japan ) equipped with a magnetic stirring assembly and...and 531 nm emission wavelengths with F4500 fluorimeter (Hitachi, Japan ) equipped with a magnetic stirring assembly and a thermostated cuvette holder

  20. Behavioral Response in the Immediate Aftermath of Shaking: Earthquakes in Christchurch and Wellington, New Zealand, and Hitachi, Japan

    PubMed Central

    Jon, Ihnji; Lindell, Michael K.; Prater, Carla S.; Huang, Shih-Kai; Wu, Hao-Che; Johnston, David M.; Becker, Julia S.; Shiroshita, Hideyuki; Doyle, Emma E.H.; Potter, Sally H.; McClure, John; Lambie, Emily

    2016-01-01

    This study examines people’s response actions in the first 30 min after shaking stopped following earthquakes in Christchurch and Wellington, New Zealand, and Hitachi, Japan. Data collected from 257 respondents in Christchurch, 332 respondents in Hitachi, and 204 respondents in Wellington revealed notable similarities in some response actions immediately after the shaking stopped. In all four events, people were most likely to contact family members and seek additional information about the situation. However, there were notable differences among events in the frequency of resuming previous activities. Actions taken in the first 30 min were weakly related to: demographic variables, earthquake experience, contextual variables, and actions taken during the shaking, but were significantly related to perceived shaking intensity, risk perception and affective responses to the shaking, and damage/infrastructure disruption. These results have important implications for future research and practice because they identify promising avenues for emergency managers to communicate seismic risks and appropriate responses to risk area populations. PMID:27854306

  1. Behavioral Response in the Immediate Aftermath of Shaking: Earthquakes in Christchurch and Wellington, New Zealand, and Hitachi, Japan.

    PubMed

    Jon, Ihnji; Lindell, Michael K; Prater, Carla S; Huang, Shih-Kai; Wu, Hao-Che; Johnston, David M; Becker, Julia S; Shiroshita, Hideyuki; Doyle, Emma E H; Potter, Sally H; McClure, John; Lambie, Emily

    2016-11-15

    This study examines people's response actions in the first 30 min after shaking stopped following earthquakes in Christchurch and Wellington, New Zealand, and Hitachi, Japan. Data collected from 257 respondents in Christchurch, 332 respondents in Hitachi, and 204 respondents in Wellington revealed notable similarities in some response actions immediately after the shaking stopped. In all four events, people were most likely to contact family members and seek additional information about the situation. However, there were notable differences among events in the frequency of resuming previous activities. Actions taken in the first 30 min were weakly related to: demographic variables, earthquake experience, contextual variables, and actions taken during the shaking, but were significantly related to perceived shaking intensity, risk perception and affective responses to the shaking, and damage/infrastructure disruption. These results have important implications for future research and practice because they identify promising avenues for emergency managers to communicate seismic risks and appropriate responses to risk area populations.

  2. Comparison of avian biochemical test results with Abaxis VetScan and Hitachi 911 analyzers.

    PubMed

    Greenacre, Cheryl B; Flatland, Bente; Souza, Marcy J; Fry, Michael M

    2008-12-01

    To compare results of clinical biochemical analysis using an Abaxis VetScan bench-top analyzer with reagents specifically marketed for avian use and a Hitachi 911 analyzer, plasma (both methods) and whole blood (VetScan method) samples from 20 clinically healthy Hispaniolan Amazon parrots (Amazona ventralis) were analyzed. Correlation between methods was very high (r = 0.9-1.0) for aspartate aminotransferase (AST), calcium, glucose, and uric acid; high (r = 0.7-0.89) for creatine kinase (CK), phosphorus, potassium, and total protein; moderate (r = 0.5-0.69) for globulin; and low (r = 0.3-0.49) for albumin and sodium. VetScan analyzer results for globulin, sodium, and uric acid had a constant negative bias (values below those from the Hitachi method). Based on difference plot analysis, results for AST, calcium, CK, and glucose are comparable. Because 16 of 20 values fell below the lower detection limit of the VetScan analyzer, bile acid data were excluded from analysis. By using a relatively small sample size (0.1 ml whole blood or plasma), the VetScan analyzer offers rapid in-house results, compact size, and ease of operation. For 4 of the most clinically relevant biochemical analytes used in avian medicine (AST, calcium, CK, glucose), it offers reliable values. For an additional 4 analytes (phosphorus, potassium, total protein, uric acid), establishing analyzer-specific reference intervals is recommended. Neither the VetScan nor the Hitachi method is recommended to assess albumin and globulin concentrations.
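    The correlation and constant-bias comparisons reported above can be illustrated with a minimal sketch (not the study's actual analysis): Pearson's r between paired analyzer results, and the mean difference (test minus reference) that appears as a constant offset in a difference plot.

    ```python
    import statistics

    def pearson_r(xs, ys):
        """Pearson correlation coefficient between paired measurements."""
        mx, my = statistics.fmean(xs), statistics.fmean(ys)
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = (sum((x - mx) ** 2 for x in xs)
               * sum((y - my) ** 2 for y in ys)) ** 0.5
        return num / den

    def mean_bias(test, reference):
        """Mean difference (test - reference): the constant bias that a
        difference (Bland-Altman style) plot would show as an offset."""
        return statistics.fmean(t - r for t, r in zip(test, reference))
    ```

    A high r with a nonzero mean bias is exactly the pattern reported above for globulin, sodium, and uric acid: the methods track each other but one reads systematically low.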

  3. Comparison of chemistry analytes between 2 portable, commercially available analyzers and a conventional laboratory analyzer in reptiles.

    PubMed

    McCain, Stephanie L; Flatland, Bente; Schumacher, Juergen P; Clarke Iii, Elsburgh O; Fry, Michael M

    2010-12-01

    Advantages of handheld and small bench-top biochemical analyzers include requirements for smaller sample volume and practicality for use in the field or in practices, but little has been published on the performance of these instruments compared with standard reference methods in analysis of reptilian blood. The aim of this study was to compare reptilian blood biochemical values obtained using the Abaxis VetScan Classic bench-top analyzer and a Heska i-STAT handheld analyzer with values obtained using a Roche Hitachi 911 chemical analyzer. Reptiles, including 14 bearded dragons (Pogona vitticeps), 4 blue-tongued skinks (Tiliqua gigas), 8 Burmese star tortoises (Geochelone platynota), 10 Indian star tortoises (Geochelone elegans), 5 red-tailed boas (Boa constrictor), and 5 Northern pine snakes (Pituophis melanoleucus melanoleucus), were manually restrained, and a single blood sample was obtained and divided for analysis. Results for concentrations of albumin, bile acids, calcium, glucose, phosphates, potassium, sodium, total protein, and uric acid and activities of aspartate aminotransferase and creatine kinase obtained from the VetScan Classic and Hitachi 911 were compared. Results for concentrations of chloride, glucose, potassium, and sodium obtained from the i-STAT and Hitachi 911 were compared. Compared with results from the Hitachi 911, those from the VetScan Classic and i-STAT had variable correlations, and constant or proportional bias was found for many analytes. Bile acid data could not be evaluated because results for 44 of 45 samples fell below the lower linearity limit of the VetScan Classic. Although the 2 portable instruments might provide measurements with clinical utility, there were significant differences compared with the reference analyzer, and development of analyzer-specific reference intervals is recommended. ©2010 American Society for Veterinary Clinical Pathology.

  4. Transition of the Course Programs in the 40 Years History of Hitachi Institute of Technology

    NASA Astrophysics Data System (ADS)

    Miura, Osamu; Katsura, Koyo; Takahashi, Masahiko

    In 2010, the Hitachi Institute of Technology reached its 40th anniversary. In the beginning, the institute took a product-out-oriented viewpoint and carried out extensive technical education from basic to advanced technology. After the 1990s, changes in the business environment brought about by globalization transformed the engineer-education needs of the business sections. As a result, these changing needs were reflected in the institute's course programs. Nowadays, in addition to the conventional course programs, engineer-education programs for business competency and human skills are also a focus.

  5. Comparison of Two Different Ultrasound Devices Using Strain Elastography Technology in the Diagnosis of Breast Lesions Related to the Histologic Results.

    PubMed

    Farrokh, André; Schaefer, Fritz; Degenhardt, Friedrich; Maass, Nicolai

    2018-05-01

    This study was conducted to provide evidence that elastograms of two different devices and different manufacturers using the same technical approach provide the same diagnoses. A total of 110 breast lesions were prospectively analysed by two experts in ultrasound, using the strain elastography function from two different manufacturers (Hitachi HI-RTE, Hitachi Medical Systems, Wiesbaden, Germany; and Siemens eSie Touch, Siemens Medical Systems, Erlangen, Germany). Results were compared with the histopathologic results. Applying the Bowker test of symmetry, no statistically significant difference between the two elastography functions of these two devices was found (p = 0.120). The Cohen's kappa of k = 0.591 showed moderate strength of agreement between the two elastograms. The two examiners yielded moderate strength of agreement analysing the elastograms (Hitachi HI-RTE, k = 0.478; Siemens eSie Touch, k = 0.441). In conclusion, evidence is provided that elastograms of the same lesion generated by two different ultrasound devices equipped with a strain elastography function do not significantly differ. Copyright © 2018 World Federation for Ultrasound in Medicine and Biology. Published by Elsevier Inc. All rights reserved.

  6. Self-Sustaining Thorium Boiling Water Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenspan, Ehud; Gorman, Phillip M.; Bogetic, Sandra

    The primary objectives of this project are to: Perform a pre-conceptual design of a core for an alternative to the Hitachi proposed fuel-self- sustaining RBWR-AC, to be referred to as a RBWR-Th. The use of thorium fuel is expected to assure a negative void coefficient of reactivity (versus positive of the RBWR-AC) and improve reactor safety; Perform a pre-conceptual design of an alternative core to the Hitachi proposed LWR TRU transmuting RBWR-TB2, to be referred to as the RBWR-TR. In addition to improved safety, use of thorium for the fertile fuel is expected to improve the TRU transmutation effectiveness; Compare the RBWR-Th and RBWR-TR performance against that of the Hitachi RBWR core designs and sodium cooled fast reactor counterparts - the ARR and ABR; and, Perform a viability assessment of the thorium-based RBWR design concepts to be identified along with their associated fuel cycle, a technology gap analysis, and a technology development roadmap. A description of the work performed and of the results obtained is provided in this Overview Report and, in more detail, in the Attachments. The major findings of the study are summarized.

  7. Using the Hitachi SEM to engage learners and promote next generation science standards inquiry

    NASA Astrophysics Data System (ADS)

    Menshew, D. E.

    2014-09-01

    In this study participants will learn how the Hitachi TM3000 scanning electron microscope (SEM) played a central role in one school's movement towards Next Generation Science Standards (NGSS) and promoted exceptional student engagement. The device was used to create high quality images that were used by students in a variety of lab activities, including a simulated crime scene investigation focusing on developing evidence-based arguments as well as a real-world conservation biology study. It provided opportunities for small group and independent investigations in support of NGSS, and peer-to-peer mentoring. Furthermore, documented uses of the device were included in secondary students' college and scholarship applications, all of which were successful.

  8. Hot Chips and Hot Interconnects for High End Computing Systems

    NASA Technical Reports Server (NTRS)

    Saini, Subhash

    2005-01-01

    I will discuss several processors: 1. The Cray proprietary processor used in the Cray X1; 2. The IBM Power 3 and Power 4 used in an IBM SP 3 and IBM SP 4 systems; 3. The Intel Itanium and Xeon, used in the SGI Altix systems and clusters respectively; 4. IBM System-on-a-Chip used in IBM BlueGene/L; 5. HP Alpha EV68 processor used in DOE ASCI Q cluster; 6. SPARC64 V processor, which is used in the Fujitsu PRIMEPOWER HPC2500; 7. An NEC proprietary processor, which is used in NEC SX-6/7; 8. Power 4+ processor, which is used in Hitachi SR11000; 9. NEC proprietary processor, which is used in Earth Simulator. The IBM POWER5 and Red Storm Computing Systems will also be discussed. The architectures of these processors will first be presented, followed by interconnection networks and a description of high-end computer systems based on these processors and networks. The performance of various hardware/programming model combinations will then be compared, based on latest NAS Parallel Benchmark results (MPI, OpenMP/HPF and hybrid (MPI + OpenMP). The tutorial will conclude with a discussion of general trends in the field of high performance computing, (quantum computing, DNA computing, cellular engineering, and neural networks).

  9. Automated technical validation--a real time expert system for decision support.

    PubMed

    de Graeve, J S; Cambus, J P; Gruson, A; Valdiguié, P M

    1996-04-15

    Dealing daily with various machines and various control specimens produces a lot of data that cannot be processed manually. To support decision-making, we wrote specific software covering traditional QC, patient data (mean of normals, delta check), and criteria related to the analytical equipment (flags and alarms). Four machines (3 Ektachem 700 and 1 Hitachi 911) analysing 25 common chemical tests are controlled. Three different control specimens are run every day on the various pieces of equipment, plus one more once a week (regional survey). The data are collected on a 486 microcomputer connected to the central computer. For every parameter the standard deviation is compared with the published acceptable limits and Westgard's rules are computed. The mean of normals is continuously monitored. The final decision triggers either an alarm sound and a print-out of the cause of rejection or, if no alarms occur, a daily print-out of recorded data, with or without Levey-Jennings graphs.
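    The QC rule checking described above can be sketched as follows. This is a minimal illustration of two of Westgard's multirule QC checks (1-3s: one control beyond ±3 SD; 2-2s: two consecutive controls beyond ±2 SD on the same side), not the laboratory's actual software:

    ```python
    def westgard_flags(values, mean, sd):
        """Flag a run of control values against the 1-3s and 2-2s rules,
        given the established control mean and standard deviation."""
        z = [(v - mean) / sd for v in values]
        flags = []
        if any(abs(s) > 3 for s in z):          # 1-3s: reject
            flags.append("1-3s")
        for a, b in zip(z, z[1:]):              # 2-2s: two consecutive,
            if (a > 2 and b > 2) or (a < -2 and b < -2):  # same side
                flags.append("2-2s")
                break
        return flags
    ```

    In a system like the one described, a non-empty flag list would trigger the alarm sound and the print-out of the cause of rejection.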

  10. 77 FR 14838 - General Electric-Hitachi Global Laser Enrichment LLC, Commercial Laser-Based Uranium Enrichment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-13

    ... safety, chemical process safety, fire safety, emergency management, environmental protection... the transportation of SNM of low strategic significance, human factors engineering, and electrical...

  11. Electrochemical direct immobilization of DNA sequences for label-free herpes virus detection

    NASA Astrophysics Data System (ADS)

    Tam, Phuong Dinh; Trung, Tran; Tuan, Mai Anh; Chien, Nguyen Duc

    2009-09-01

    DNA sequences/bio-macromolecules of herpes virus (5'-AT CAC CGA CCC GGA GAG GGA C-3') were directly immobilized in a polypyrrole matrix by the cyclic voltammetry method, and grafted onto arrays of interdigitated platinum microelectrodes. The surface morphology of the obtained PPy/DNA composite films was investigated with a Hitachi S-4800 FESEM. Fourier transform infrared spectroscopy (FTIR) was used to characterize the PPy/DNA film and to study the specific interactions that may exist between the DNA biomacromolecules and the PPy chains. Attempts to use these PPy/DNA composite films for label-free herpes virus detection revealed a response time of 60 s in solutions containing DNA concentrations as low as 2 nM, and a shelf life of six months when immersed in double-distilled water and kept refrigerated.

  12. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, D.; Brunett, A.; Passerini, S.

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence-specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  13. Software tools for developing an acoustics multimedia CD-ROM

    NASA Astrophysics Data System (ADS)

    Bigelow, Todd W.; Wheeler, Paul A.

    2003-10-01

    A multimedia CD-ROM was developed to accompany the textbook Science of Sound, by Tom Rossing. This paper discusses the multimedia elements included in the CD-ROM and the various software packages used to create them. PowerPoint presentations with an audio-track background were converted to web pages using Impatica. Animations of acoustic examples and quizzes were developed using Flash by Macromedia. Vegas Video and Sound Forge by Sonic Foundry were used for editing video and audio clips, while Cleaner by Discreet was used to compress the clips for use over the internet. Math tutorials were presented as whiteboard presentations, using Hitachi's StarBoard to create the graphics and TechSmith's Camtasia Studio to record the presentations. The CD-ROM is in a web-page format created with Macromedia's Dreamweaver. All of these elements are integrated into a single course supplement that can be viewed on any computer with a web browser.

  14. Effect of hemoglobin- and Perflubron-based oxygen carriers on common clinical laboratory tests.

    PubMed

    Ma, Z; Monk, T G; Goodnough, L T; McClellan, A; Gawryl, M; Clark, T; Moreira, P; Keipert, P E; Scott, M G

    1997-09-01

    Polymerized hemoglobin solutions (Hb-based oxygen carriers; HBOCs) and a second-generation perfluorocarbon (PFC) emulsion (Perflubron) are in clinical trials as temporary oxygen carriers ("blood substitutes"). Plasma and serum samples from patients receiving HBOCs look markedly red, whereas those from patients receiving PFC appear to be lipemic. Because hemolysis and lipemia are well-known interferents in many assays, we examined the effects of these substances on clinical chemistry, immunoassay, therapeutic drug, and coagulation tests. HBOC concentrations up to 50 g/L caused essentially no interference for Na, K, Cl, urea, total CO2, P, uric acid, Mg, creatinine, and glucose values determined by the Hitachi 747 or Vitros 750 analyzers (or both) or for immunoassays of lidocaine, N-acetylprocainamide, procainamide, digoxin, phenytoin, quinidine, or theophylline performed on the Abbott AxSym or TDx. Gentamicin and vancomycin assays on the AxSym exhibited significant positive and negative interference, respectively. Immunoassays for TSH on the Abbott IMx and for troponin I on the Dade Stratus were unaffected by HBOC at this concentration. Tests for total protein, albumin, LDH, AST, ALT, GGT, amylase, lipase, and cholesterol were significantly affected to various extents at different HBOC concentrations on the Hitachi 747 and Vitros 750. The CK-MB assay on the Stratus exhibited a negative interference at 5 g/L HBOC. HBOC interference in coagulation tests was method-dependent: fibrometer-based methods on the BBL Fibro System were free from interference, but optical-based methods on the MLA 1000C exhibited interference at 20 g/L HBOC. A 1:20 dilution of the PFC-based oxygen carrier (600 g/L) caused no interference on any of these chemistry or immunoassay tests except for amylase and ammonia on the Vitros 750 and plasma iron on the Hitachi 747.

  15. Evaluation of NGAL TestTM on Cobas 6000.

    PubMed

    Hansen, Young B L; Damgaard, Anette; Poulsen, Jørgen H

    2014-01-01

    Neutrophil Gelatinase-Associated Lipocalin (NGAL) is a promising biomarker for acute kidney injury (AKI). Our objectives were to evaluate the NGAL Test™ from BioPorto for both urine NGAL and plasma NGAL on the Cobas 6000 c501 (Roche Diagnostics, Rotkreuz, Switzerland) against matched measurements run on the Hitachi 917; the method's linearity on the Cobas 6000 in urine, EDTA, and lithium-heparin (Li-Hep) plasma; the influence of using EDTA or Li-Hep tubes; and, finally, the impact of freezing and thawing on the sample. Forty matched samples of Li-Hep and EDTA plasma and 40 urine samples were analyzed for method, anticoagulant, and freeze-thaw comparisons. Linearity was assessed using high-NGAL samples diluted in urine, EDTA, and Li-Hep plasma. Commercial internal controls were used for the imprecision study. The Cobas 6000 measured identically to the Hitachi 917, except in EDTA plasma (median difference = 17.50 μg/L, p < 0.0001). The freeze-thaw process reduced NGAL (EDTA: mean difference = 15.13 μg/L, p = 0.0014; Li-Hep: median difference = 6.5 μg/L, p = 0.0129). NGAL results were higher in Li-Hep plasma than in EDTA plasma (non-thawed: median difference = 14.5 μg/L, p < 0.0001; thawed: median difference = 21.5 μg/L, p = 0.0003). Linearity agreement was observed in all three specimen types. Imprecision (CV%) was below 3%. The NGAL Test™ can be applied on the Cobas 6000 with acceptable performance, although the Cobas 6000 measured higher than the Hitachi 917 in EDTA plasma. The freeze-thaw process reduced NGAL, though the effect was clinically insignificant. NGAL results were higher in Li-Hep tubes than in EDTA tubes; thus, for blood samples we recommend the use of EDTA tubes for NGAL measurements.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenji Akagi; Masayuki Ishiwata; Kenji Araki

    In nuclear power plant construction, a vast variety of parts, products, and jigs, numbering more than one million, is handled over the course of construction. Furthermore, strict traceability to the history of material, manufacturing, and installation is required for all products from the start to the finish of construction, which demands a large workforce and high costs in every project. In addition, improved operational efficiency is essential for effective construction that reduces the initial investment. As one solution, RFID (Radio Frequency Identification) technology, one of the fundamental technologies for realizing a ubiquitous society, is currently expanding its functionality and general versatility at an accelerating pace in mass-production industry. Hitachi believes RFID technology can be one of the key solutions for these issues in non-mass-production industry as well. Under this situation, Hitachi initiated the development of a next-generation plant concept (ubiquitous plant construction technology) that utilizes information and RFID technologies. In this paper, our plans for applying RFID technology to nuclear power plant construction are described. (authors)
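    As a rough illustration of the kind of traceability the paper calls for, the sketch below keys a part's material, manufacturing, and installation history to its RFID tag ID. The schema, field names, and sample values are hypothetical, not Hitachi's design:

```python
from dataclasses import dataclass, field

@dataclass
class TraceRecord:
    """Hypothetical traceability record looked up by RFID tag ID:
    material, manufacturing, and installation history of one part."""
    tag_id: str
    part_name: str
    material_cert: str
    history: list = field(default_factory=list)  # (stage, detail) events

    def log(self, stage: str, detail: str):
        self.history.append((stage, detail))

# A tag read anywhere on site resolves to the part's full history.
registry = {}
rec = TraceRecord("TAG-000001", "pipe spool", "SUS304 heat #A123")
rec.log("manufactured", "factory X, QA passed")
rec.log("installed", "unit 1 reactor building")
registry[rec.tag_id] = rec
print(registry["TAG-000001"].history[-1][0])  # most recent stage
```

    The point of the design is that the tag carries only the ID; the audit trail lives in the registry, so a handheld reader can verify provenance without touching the part's paperwork.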

  17. 75 FR 21680 - GE-Hitachi Global Laser Enrichment LLC;

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-26

    ... Global Laser Enrichment LLC; Establishment of Atomic Safety and Licensing Board Pursuant to delegation by... is hereby given that an Atomic Safety and Licensing Board (Board) is being established to preside... comprised of the following administrative judges: Paul S. Ryerson, Chair, Atomic Safety and Licensing Board...

  18. 77 FR 20009 - Howard Hughes Medical Institute, et al.; Notice of Consolidated Decision on Applications for Duty...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-03

    ... decision consolidated pursuant to Section 6(c) of the Educational, Scientific, and Cultural Materials... 07470. Instrument: Electron Microscope. Manufacturer: Hitachi High Technologies America, Inc., Japan... educational uses requiring an electron microscope. We know of no electron microscope, or any other instrument...

  19. FBIS report. Science and technology: Japan, November 6, 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-11-06

    Some articles are: R&D on Microfactory Technologies; MHI Develops Low Cost, Low Noise Mid-size Helicopters; Kumamoto University to Apply for Approval to Conduct Clinical Experiment for Gene Therapy; MITI To Support Private Sector to Develop Cipher Technology; and Hitachi Electronics Develops Digital Broadcasting Camera System.

  20. CD-ROM and Libraries.

    ERIC Educational Resources Information Center

    Murphy, Brower

    1985-01-01

    The Compact Disc-Read Only Memory (CD-ROM) data format is explained and illustrated, noting current and potential applications. The "5-inch" compact laserdisc is described and photographs of an IBM PC/Hitachi CD-ROM system adopted by Library Corporation to support its MARC database--BiblioFile--are presented. Screen displays for…

  1. HTA educational outreach program and change the equation participation

    NASA Astrophysics Data System (ADS)

    Gordon, Robert

    2013-05-01

    In this presentation, Hitachi High Technologies America (HTA) introduces its Educational Outreach Program and explains its involvement with Change The Equation (CTEq), a nonprofit, nonpartisan, CEO-led initiative that is mobilizing the business community to improve the quality of science, technology, engineering, and mathematics (STEM) learning in the United States.

  2. OEMs unveil new ideas for shovels and excavators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiscor, S.

    2006-08-15

    From upgrades to new loading arrangements, vendors are looking at new ways to optimize the production process. The paper describes P&H Mining Equipment's new C series electric shovels equipped with the Centurion system, Hitachi's super-sized excavators for the Canadian oil sands, and shovels engineered by Bucyrus and Siemens. 3 figs., 1 photo.

  3. 75 FR 61227 - Advisory Committee on Reactor Safeguards Meeting of the ACRS Subcommittee on Future Plant Designs...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-04

    ... Westinghouse Electric Company, General Electric--Hitachi Nuclear Energy (GEH), and their contractors, pursuant... NUCLEAR REGULATORY COMMISSION Advisory Committee on Reactor Safeguards Meeting of the ACRS Subcommittee on Future Plant Designs; Revision to the September 24, 2010, ACRS Meeting Federal Register Notice...

  4. 78 FR 70532 - Notification of Proposed Production Activity, Hitachi Automotive Systems Americas, Inc., Subzone...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-26

    ... products (lithium-ion hybrid battery pack assemblies, electrical power steering modules, and electronic...-HK would be able to choose the duty rates during customs entry procedures that apply to lithium-ion..., alternators, distributors, other static converters, inverter modules, rotors/stators, batteries, ignition...

  5. A remote monitoring system for patients with implantable ventricular assist devices with a personal handy phone system.

    PubMed

    Okamoto, E; Shimanaka, M; Suzuki, S; Baba, K; Mitamura, Y

    1999-01-01

    The usefulness of a remote monitoring system that uses a personal handy phone for patients with an implanted artificial heart was investigated. The type of handy phone used in this study was the personal handy phone system (PHS), a system developed in Japan that uses the NTT (Nippon Telephone and Telegraph, Inc.) telephone network service. The PHS has several advantages: high-speed data transmission, low power output, little electromagnetic interference with medical devices, and easy locating of patients. In our system, patients carry a mobile computer (Toshiba Libretto 50, Kawasaki, Japan) for data transmission control between an implanted controller and a host computer (NEC PC-9821V16) in the hospital. Information on the motor rotational angle (8 bits) and motor current (8 bits) of the implanted motor-driven heart is fed into the mobile computer from the implanted controller (Hitachi H8/532, Yokohama, Japan) according to 32-bit command codes from the host computer. Motor current and motor rotational angle data from inside the body are framed together with a control code (frame number and parity) for data error checking and correction at the receiving site, and the data are sent from the mobile computer through the PHS connection to the host computer. The host computer calculates pump outflow and arterial pressure from the motor rotational angle and motor current values and displays the data as real-time waveforms. The results of this study showed that accurate data on motor rotational angle and current could be transmitted from the subjects to the host computer at a data transmission rate of 9600 bps while they were walking or driving a car. This system is useful for remote monitoring of patients with an implanted artificial heart.
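    The framing scheme described above (8-bit angle and current values wrapped with a frame number and parity so the receiver can detect transmission errors) can be sketched as follows. The exact byte layout is an assumption for illustration, not the published protocol:

```python
def make_frame(frame_no: int, angle: int, current: int) -> bytes:
    """Pack one telemetry sample: frame number, 8-bit motor rotational
    angle, 8-bit motor current, and an XOR parity byte (assumed layout)."""
    for v in (frame_no, angle, current):
        if not 0 <= v <= 0xFF:
            raise ValueError("values must fit in 8 bits")
    parity = frame_no ^ angle ^ current
    return bytes([frame_no, angle, current, parity])

def check_frame(frame: bytes):
    """Return (frame_no, angle, current) if parity is consistent, else None."""
    if len(frame) != 4:
        return None
    frame_no, angle, current, parity = frame
    if frame_no ^ angle ^ current != parity:
        return None  # corrupted in transit; receiver can request a resend
    return frame_no, angle, current
```

    The frame number lets the host spot dropped samples, and the parity byte catches single-byte corruption over the PHS link; a real implementation would likely use a stronger checksum or CRC.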

  6. 78 FR 52504 - Oregon Health and Science University, et al.; Notice of Consolidated Decision on Applications for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-23

    ... is a decision consolidated pursuant to Section 6(c) of the Educational, Scientific, and Cultural...: 13-004. Applicant: Georgia Institute of Technology, Atlanta, GA 30332. Instrument: Electron Microscope. Manufacturer: Hitachi High-Technologies Corp., Japan. Intended Use: See notice at 78 FR 13860-61...

  7. 76 FR 28214 - University of Wyoming, et al.; Notice of Consolidated Decision on Applications for Duty-Free...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-16

    ... consolidated pursuant to Section 6(c) of the Educational, Scientific, and Cultural Materials Importation Act of... 82072. Instrument: Electron Microscope. Manufacturer: Hitachi High- Technologies Corporation, Japan...-Technologies Corporation, Japan. Intended Use: See notice at 76 FR 20952, April 14, 2011. Docket Number: 11-024...

  8. 76 FR 37852 - Advisory Committee on Reactor Safeguards; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-28

    ... Thursday, October 21, 2010 (74 FR 65038-65039). Wednesday, July 13, 2011, Conference Room T2-B1, 11545....: Safety Evaluation Report Associated with NEDC-33173, Supplement 2, Parts 1, 2, and 3, ``Analysis of Gamma... Electric Hitachi (GEH) regarding the safety evaluation report associated with NEDC-33173, Supplement 2...

  9. 76 FR 53980 - Request for a License To Import Radioactive Waste

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-30

    ... NUCLEAR REGULATORY COMMISSION Request for a License To Import Radioactive Waste Pursuant to 10 CFR... Hitachi Nuclear Energy, LLC. Radioactive waste consisting of used sealed Cobalt-60 sources; up to 210 Cobalt-60 sources; recycling, forensic testing or storage; China; August 1, 2011, August 5, 2011, IW030... and radioactive...

  10. 76 FR 48884 - Notice Pursuant to the National Cooperative Research and Production Act of 1993-ODVA, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-09

    ... Motion, Inc., Boulder, CO; Hitachi Cable Manchester, Inc., Manchester, NH; and Global Engineering..., Applied Robotics, Inc., Glenville, NY; WIT, St.-Laurent-du-Var, FRANCE; Caron Engineering, Inc., Wells... Act on May 2, 2011 (76 FR 24523). Patricia A. Brink, Director of Civil Enforcement, Antitrust Division...

  11. Oxidation of ZrB2 and ZrB2-SiC Ceramics With Tungsten Additions (Preprint)

    DTIC Science & Technology

    2009-02-01

    evaporate. The dried powder cake was ground using an alumina mortar and pestle to form granules that were passed through a 50 mesh sieve. Cylindrical...SEM; S-570, Hitachi, Japan) equipped with energy dispersive spectroscopy (EDS; E2V Scientific Instruments, UK) for simultaneous chemical analysis

  12. FIELD EVALUATION OF LOW-EMISSION COAL BURNER TECHNOLOGY ON UTILITY BOILERS VOLUME II. SECOND GENERATION LOW-NOX BURNERS

    EPA Science Inventory

    The report describes tests to evaluate the performance characteristics of three Second Generation Low-NOx burner designs: the Dual Register burner (DRB), the Babcock-Hitachi NOx Reducing (HNR) burner, and the XCL burner. The three represent a progression in development based on t...

  13. 76 FR 3540 - U.S. Advanced Boiling Water Reactor Aircraft Impact Design Certification Amendment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-20

    .... Plain Language IX. Voluntary Consensus Standards X. Finding of No Significant Environmental Impact... information (or its equivalent) which was originally developed by GE Nuclear Energy (GE) and approved by the... from GE Hitachi Nuclear Energy (GEH) to use the GE- developed U.S. ABWR proprietary information for...

  14. 76 FR 78096 - U.S. Advanced Boiling Water Reactor Aircraft Impact Design Certification Amendment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-16

    ... Environmental Impact: Availability IX. Paperwork Reduction Act Statement X. Regulatory Analysis XI. Regulatory.... ABWR; one commenter, GE Hitachi Nuclear Energy (GEH), was against the proposed amendment to the U.S... information from both the DCD developed by GE Nuclear Energy (GE) and the DCD developed by the STPNOC. The...

  15. Electron Microscopist | Center for Cancer Research

    Cancer.gov

    PROGRAM DESCRIPTION The Cancer Research Technology Program (CRTP) develops and implements emerging technology, cancer biology expertise and research capabilities to accomplish NCI research objectives. The CRTP is an outward-facing, multi-disciplinary hub purposed to enable the external cancer research community and provides dedicated support to NCI's intramural Center for Cancer Research (CCR). The dedicated units provide electron microscopy, protein characterization, protein expression, optical microscopy and genetics. These research efforts are an integral part of CCR at the Frederick National Laboratory for Cancer Research (FNLCR). CRTP scientists also work collaboratively with intramural NCI investigators to provide research technologies and expertise. KEY ROLES/RESPONSIBILITIES - THIS POSITION IS CONTINGENT UPON FUNDING APPROVAL The Electron Microscopist will: Operate ultramicrotomes (Leica) and other instrumentation related to the preparation of embedded samples for EM (TEM and SEM) Operate TEM microscopes (specifically Hitachi, FEI T20, and FEI T12) as well as SEM microscopes (Hitachi); tasks will include loading samples, screening, and performing data collection for a variety of samples, from cells to proteins Manage maintenance for the TEM and SEM microscopes Provide technical advice to investigators on sample preparation and data collection

  16. 75 FR 7040 - Investigations Regarding Certifications of Eligibility To Apply for Worker Adjustment Assistance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-16

    ... Hitachi Automotive Products (USA) (State), Harrodsburg, KY, 01/04/10, 12/31/09. 73204 The Tie King... 73207 O'Neal Steel (State), Greensboro, NC, 01/05/10, 12/31/09. 73208 Nomura Asset Management USA, Inc. (Wkrs), New York, NY, 01/05/10, 12/12/09. 73209 CL Automotive LLC (Wkrs)... Highland Park, MI...

  17. Symposium N: Materials and Devices for Thermal-to-Electric Energy Conversion

    DTIC Science & Technology

    2010-08-24

    X-ray diffraction, transmission electron microscopy, scanning electron microscopy, and dynamic light scattering. Thermal conductivity measurements...SEM), X-ray diffraction (XRD) measurements as well as Raman spectroscopy. The results from these techniques indicate a clear modification...was examined by using a scanning electron microscope (SEM; Hitachi S-4500 model) attached with energy dispersive X-ray spectroscopy. The electrical

  18. JPRS Report, Science & Technology, Europe, Economic Competitiveness

    DTIC Science & Technology

    1991-04-25

    business), and entertainment electronics companies such as Sony, Pioneer, JVC, Hitachi and Matsushita ("National Panasonic") in particular are...of outsiders from Southeast Asia, who are making life miserable for the already too numerous big producers, such as BASF, Sony, TDK, Maxell, FDM...Motor, Mitsubishi Motor, and Yamaha Motor (motorcycles), alongside its parent company Yamaha Corporation, which is also in the musical instrument

  19. The Shock and Vibration Digest. Volume 12, Number 4.

    DTIC Science & Technology

    1980-04-01

    self-excited oscillations... a great deal of experience has been gained in applying these techniques to practical situations... INVITED... Outlet Flow Field of Axial Flow Fans. Key Words: Pumps, Compressors, Self-excited vibrations, Surges. H. Fujita, Mechanical Engrg. Res. Lab., Hitachi, Ltd., Tsuchiura, 300 Japan. NOISE-CON 79, Machinery Noise... Investigations concerned with the stability of stationary states and the possibility of self

  20. Microwave Fiber-Optics Delay Line.

    DTIC Science & Technology

    1980-01-01

    frequency response. We observed the anticipated modulation resonance and its dependence on the dc bias level. However, we did not have a scanning Fabry-Perot...could not be determined accurately because a scanning Fabry-Perot was not available. However, from the various experimental observations and the rise time...Hitachi HLP-2400U BH laser and a Rockwell heterojunction photodiode... Demodulated rf power versus detector dc photocurrent

  1. Nanosizing a Metal-Organic Framework Enzyme Carrier for Accelerating Nerve Agent Hydrolysis

    DTIC Science & Technology

    2016-10-05

    Previously, biodegradable liposome nanocarriers have been shown to be effective at providing functionally significant amounts of highly purified enzymes in...AlexaFluor-647 dye was purchased from Life Technologies (Thermo Fisher Scientific). Methyl 6-(pinacolboryl)-2-naphthoate was synthesized using a published...Hitachi) and PXRD (SmartLab, Rigaku). Labeling OPAA with Fluorescent Dye: AlexaFluor-647-labeled OPAA (OPAA647) was prepared by reacting OPAA (0.5

  2. Using a university characterization facility to educate the public about microscopes: light microscopes to SEM

    NASA Astrophysics Data System (ADS)

    Healy, Nancy; Henderson, Walter

    2015-10-01

    The National Nanotechnology Infrastructure Network (NNIN) is an integrated partnership of 14 universities across the US funded by NSF to support nanoscale researchers. The NNIN education office is located at the Institute of Electronics and Nanotechnology at the Georgia Institute of Technology. At Georgia Tech we offer programs that integrate the facility and its resources to educate the public about nanotechnology. One event that has proved highly successful involves using microscopes in our characterization suite to educate a diverse audience about a variety of imaging instruments. As part of the annual Atlanta Science Festival (ATLSF), we provided an event entitled "What's all the Buzz about Nanotechnology?", which was open to the public and advertised through a variety of methods by the ATLSF. During the event, we provided hands-on demos, cleanroom tours, and activities with three of our microscopes in our recently opened Imaging and Characterization Facility: 1. Keyence VHX-600 digital microscope; 2. Hitachi SU823 FE-SEM; and 3. Hitachi TM-3000. During the two-hour event we had approximately 150 visitors, including many families with school-aged children. Visitors were invited to bring a sample for scanning with the TM-3000. This paper will discuss how to run such an event, lessons learned, and visitor survey results.

  3. Evaluation of the Hitachi 717 analyser.

    PubMed

    Biosca, C; Antoja, F; Sierra, C; Douezi, H; Macià, M; Alsina, M J; Galimany, R

    1989-01-01

    The selective multitest Boehringer Mannheim Hitachi 717 analyser was evaluated according to the guidelines of the Comisión de Instrumentación de la Sociedad Española de Química Clínica and the European Committee for Clinical Laboratory Standards. The evaluation was performed in two steps: examination of the analytical units and evaluation in routine operation. The evaluation of the analytical units included a photometric study: the inaccuracy is acceptable at 340 and 405 nm; the imprecision ranges from 0.12 to 0.95% at 340 nm and from 0.30 to 0.73% at 405 nm; the linearity shows some dispersion at low absorbance for NADH at 340 nm; the drift is negligible; the imprecision of the pipette delivery system increases when the sample pipette operates with 3 μl; the reagent pipette imprecision is acceptable; and the temperature control system is good. Under routine working conditions, seven determinations were studied: glucose, creatinine, iron, total protein, AST, ALP and calcium. The within-run imprecision (CV) ranged from 0.6% for total protein and AST to 6.9% for iron. The between-run imprecision ranged from 2.4% for glucose to 9.7% for iron. Some contamination was found in the carry-over study. The relative inaccuracy is good for all the constituents assayed.
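    The imprecision figures quoted above are coefficients of variation, CV% = 100 · SD / mean of replicate results. A minimal sketch, using hypothetical replicate glucose values rather than the study's raw data:

```python
from statistics import mean, stdev

def cv_percent(values):
    """Coefficient of variation (%): imprecision of replicate measurements."""
    return 100 * stdev(values) / mean(values)

# Hypothetical within-run replicate glucose results (mmol/L):
replicates = [5.50, 5.47, 5.53, 5.49, 5.51, 5.48, 5.52, 5.50]
print(round(cv_percent(replicates), 2))  # tight replicates -> CV well under 1%
```

    Within-run CV uses replicates from a single run, as here; between-run CV applies the same formula to one result per run collected over many days, which is why it is the larger of the two figures for every analyte in the study.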

  4. Development of the Phase-up Technology of the Radio Telescopes: 6.7 GHz Methanol Maser Observations with Phased Hitachi 32 m and Takahagi 32 m Radio Telescopes

    NASA Astrophysics Data System (ADS)

    Takefuji, K.; Sugiyama, K.; Yonekura, Y.; Saito, T.; Fujisawa, K.; Kondo, T.

    2017-11-01

    For the sake of high-sensitivity 6.7 GHz methanol maser observations, we developed a new technology for coherently combining the signals from the Hitachi 32 m radio telescope and the Takahagi 32 m radio telescope of the Japanese VLBI Network (JVN), which are separated by about 260 m. After the two telescopes were phased as a single telescope of twofold larger collecting area, the mean signal-to-noise ratio (S/N) of the 6.7 GHz methanol masers observed by the phased telescopes improved to 1.254-fold that of a single dish, as verified through a very long baseline interferometry (VLBI) experiment on the 50 km baseline to the Kashima 34 m telescope and the 1000 km baseline to the Yamaguchi 32 m telescope. Furthermore, we compared the S/Ns of the 6.7 GHz maser spectra for two methods: a VLBI method and the newly developed digital position switching, a technology similar to that used in noise-canceling headphones. Finally, we confirmed that the mean S/N of the digital position switching method (ON-OFF) was 1.597-fold higher than that of the VLBI method.
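    The gain from phasing two dishes can be illustrated numerically: coherently averaging two streams that share a signal but carry independent receiver noise leaves the signal amplitude unchanged while halving the noise power, for an ideal S/N improvement of √2 ≈ 1.414 (the measured 1.254 reflects real-world phasing losses). A toy simulation, not the authors' pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
t = np.arange(n)
signal = np.sin(2 * np.pi * 0.01 * t)      # common maser signal (toy model)

# Two telescopes see the same signal with independent receiver noise.
x1 = signal + rng.normal(0.0, 5.0, n)
x2 = signal + rng.normal(0.0, 5.0, n)

def amplitude_snr(x):
    """Estimate signal amplitude by correlating against the known template,
    then divide by the residual noise level."""
    amp = 2.0 * np.mean(x * signal)        # sine template has mean power 1/2
    noise_sd = np.std(x - amp * signal)
    return amp / noise_sd

phased = (x1 + x2) / 2.0                   # coherent ("phased-up") sum
ratio = amplitude_snr(phased) / amplitude_snr(x1)
print(round(ratio, 2))                     # close to sqrt(2) for ideal phasing
```

    Incoherent averaging of the two power spectra would gain only a factor of 2^(1/4) in amplitude S/N, which is why the phase-up (keeping the signals aligned before summing) is worth the engineering effort.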

  5. JPRS Report, Near East & South Asia, Israel

    DTIC Science & Technology

    1991-09-11

    practice, however, a good number of ceremonies are held at the Western Wall... "but I never heard him; I have enough music here at the Western Wall."... Company, Daihatsu, Suzuki, Sony, Hitachi, and others... Israel on the volume of their purchases from Israeli industry. Golan: "We are certain that, under... Kibbutz Yad Mordekhai, an entertainment center on the Dead... that arrived in the corresponding period last year. However, in recent months, the branch

  6. Quantum Effect Physics, Electronics and Applications: Proceedings of the International Workshop Held in Luxor, Egypt on January 6-10, 1992

    DTIC Science & Technology

    1992-12-15

    Giza Engineering Systems, Fujitsu, Hitachi, Matsushita, Mitsubishi, NEC, NTT, Sanyo, Sony, and Toshiba. K. Ismail, T. Ikoma, H. I. Smith, Organizing and...etch and the development of low etch rate surfaces were used for the fabrication of pyramid-shaped ridges with the QWs forming buried layers inside the pyramids...Depending on the etch-depth, the wire

  7. Dynamic Testing of Signal Transduction Deregulation During Breast Cancer Initiation

    DTIC Science & Technology

    2011-07-01

    1 at a chamber pressure of ~3 × 10⁻⁶ Torr for the electron beam evaporated films. A Hitachi FB2100 Focused Ion Beam milling machine with a gallium... immobilization. These include physical absorption, layer-by-layer (LBL) assembly, and covalent attachment, and eventually chose the covalent attachment... testing real-time signaling in live breast cancer cells, it is important to evaluate the nanosensors to monitor fluorescent compounds in single

  8. Electro-Optics Millimeter/Microwave Technology in Japan. Report of DoD Technology Team.

    DTIC Science & Technology

    1985-05-01

    Fiber Technology. Hitachi is developing Ge-Se chalcogenide glass infrared optical fibers. Material development and evaluation has been carried out...chalcogenide glass fibers. The analysis indicates that the addition of Sb to Ge-Se chalcogenide glass should yield fibers with a very small absorption...representative of other commercial cables. Fiber is drawn using Vapor Axial Deposition (VAD) with pre-form glass ingots. Multiple fibers are combined

  9. Performance evaluation of new automated hepatitis B viral markers in the clinical laboratory: two quantitative hepatitis B surface antigen assays and an HBV core-related antigen assay.

    PubMed

    Park, Yongjung; Hong, Duck Jin; Shin, Saeam; Cho, Yonggeun; Kim, Hyon-Suk

    2012-05-01

    We evaluated quantitative hepatitis B surface antigen (qHBsAg) assays and a hepatitis B virus (HBV) core-related antigen (HBcrAg) assay. A total of 529 serum samples from patients with hepatitis B were tested. HBsAg levels were determined by using the Elecsys (Roche Diagnostics, Indianapolis, IN) and Architect (Abbott Laboratories, Abbott Park, IL) qHBsAg assays. HBcrAg was measured by using Lumipulse HBcrAg assay (Fujirebio, Tokyo, Japan). Serum aminotransferases and HBV DNA were respectively quantified by using the Hitachi 7600 analyzer (Hitachi High-Technologies, Tokyo, Japan) and the Cobas AmpliPrep/Cobas TaqMan test (Roche). Precision of the qHBsAg and HBcrAg assays was assessed, and linearity of the qHBsAg assays was verified. All assays showed good precision performance with coefficients of variation between 4.5% and 5.3% except for some levels. Both qHBsAg assays showed linearity from 0.1 to 12,000.0 IU/mL and correlated well (r = 0.9934). HBsAg levels correlated with HBV DNA (r = 0.3373) and with HBcrAg (r = 0.5164), and HBcrAg also correlated with HBV DNA (r = 0.5198; P < .0001). This observation could provide impetus for further research to elucidate the clinical usefulness of the qHBsAg and HBcrAg assays.

  10. [MRI/TRUS fusion-guided prostate biopsy : Value in the context of focal therapy].

    PubMed

    Franz, T; von Hardenberg, J; Blana, A; Cash, H; Baumunk, D; Salomon, G; Hadaschik, B; Henkel, T; Herrmann, J; Kahmann, F; Köhrmann, K-U; Köllermann, J; Kruck, S; Liehr, U-B; Machtens, S; Peters, I; Radtke, J P; Roosen, A; Schlemmer, H-P; Sentker, L; Wendler, J J; Witzsch, U; Stolzenburg, J-U; Schostak, M; Ganzer, R

    2017-02-01

    Several systems for MRI/TRUS fusion-guided biopsy of the prostate are commercially available. Many studies have shown superiority of fusion systems for tumor detection and diagnostic quality compared to random biopsy. The benefit of fusion systems in focal therapy of prostate cancer (PC) is less clear. Critical considerations of fusion systems for planning and monitoring of focal therapy of PC were investigated. A systematic literature review of available fusion systems for the period 2013-5/2016 was performed. A checklist of technical details, suitability for special anatomic situations and suitability for focal therapy was established by the German working group for focal therapy (Arbeitskreis fokale und Mikrotherapie). Eight fusion systems were considered (Artemis™, BioJet, BiopSee®, iSR´obot™ Mona Lisa, Hitachi HI-RVS, UroNav and Urostation®). Differences were found for biopsy mode (transrectal, perineal, both), fusion mode (elastic or rigid), navigation (image-based, electromagnetic sensor-based or mechanical sensor-based) and space requirements. Several consensus groups recommend fusion systems for focal therapy. Useful features are "needle tracking" and compatibility between fusion system and treatment device (available for Artemis™, BiopSee® and Urostation® with Focal One®; BiopSee®, Hitachi HI-RVS with NanoKnife®; BioJet, BiopSee® with cryoablation, brachytherapy). There are a few studies for treatment planning. However, studies on treatment monitoring after focal therapy are missing.

  11. Evaluation of the Radiometer whole blood glucose measuring system, EML 105.

    PubMed

    Harff, G A; Janssen, W C; Rooijakkers, M L

    1997-03-01

    The performance of a new glucose electrode system from Radiometer was tested using two EML 105 analyzers (Radiometer Medical A/S, Copenhagen, Denmark). Results were very precise (both analyzers reported CV = 1.0% at a glucose concentration of 13.4 mmol/l). Method comparison was performed according to the NCCLS EP9-T guideline. Patients' glucose results from both analyzers were lower than the results obtained with a Hitachi 911 (Boehringer Mannheim, Mannheim, Germany). There was no clinically relevant haematocrit dependency.

  12. Study on excitation and fluorescence spectrums of Japanese citruses to construct machine vision systems for acquiring fluorescent images

    NASA Astrophysics Data System (ADS)

    Momin, Md. Abdul; Kondo, Naoshi; Kuramoto, Makoto; Ogawa, Yuichi; Shigi, Tomoo

    2011-06-01

    Research was conducted to acquire knowledge of the ultraviolet and visible spectra from 300-800 nm of some common varieties of Japanese citrus, to investigate the best wavelengths for fluorescence excitation and the resulting fluorescence wavelengths, and to provide a scientific background for the best-quality fluorescent imaging technique for detecting surface defects of citrus. A Hitachi U-4000 PC-based, microprocessor-controlled spectrophotometer was used to measure the absorption spectra, and a Hitachi F-4500 spectrophotometer was used for the fluorescence and excitation spectra. We analyzed the spectra, and the selected varieties of citrus were categorized into four groups of known fluorescence level, namely strong, medium, weak and no fluorescence. The level of fluorescence of each variety was also examined using a machine vision system. We found that LEDs or UV lamps at around 340-380 nm are appropriate lighting devices for acquiring the best-quality fluorescent image of the citrus varieties to examine their fluorescence intensity. An image acquisition device was therefore constructed with three different lighting panels: UV LEDs with peak at 365 nm, blacklight blue (BLB) lamps with peak at 350 nm, and UV-B lamps with peak at 306 nm. The resulting fluorescent images confirmed that the findings from the measured spectra held in practice and can be used for practical applications such as detecting rotten, injured or damaged parts of a wide variety of citrus.

  13. Application of small-diameter FBG sensors for detection of damages in composites

    NASA Astrophysics Data System (ADS)

    Okabe, Yoji; Mizutani, Tadahito; Yashiro, Shigeki; Takeda, Nobuo

    2001-08-01

    Small-diameter fiber Bragg grating (FBG) sensors have been developed by Hitachi Cable Ltd. and the authors. Since the outside diameter of the polyimide coating is 52 micrometers, embedding the sensors into carbon fiber reinforced plastic (CFRP) composite prepregs of 125 micrometers in thickness does not deteriorate the mechanical properties of the composite laminates. In this research, the small-diameter FBG sensor was applied to the detection of transverse cracks in CFRP composites. The FBG sensor was embedded in the 0-degree ply of a CFRP cross-ply laminate.

  14. Creation of three-dimensional craniofacial standards from CBCT images

    NASA Astrophysics Data System (ADS)

    Subramanyan, Krishna; Palomo, Martin; Hans, Mark

    2006-03-01

    Low-dose three-dimensional Cone Beam Computed Tomography (CBCT) is becoming increasingly popular in the clinical practice of dental medicine. Two-dimensional Bolton Standards of dentofacial development are routinely used to identify deviations from normal craniofacial anatomy. With the advent of CBCT three-dimensional imaging, we propose a set of methods to extend these 2D Bolton Standards to anatomically correct, surface-based 3D standards that allow analysis of morphometric changes seen in the craniofacial complex. To create 3D surface standards, we implemented a series of steps: 1) converting bi-plane 2D tracings into sets of splines; 2) converting the 2D spline curves from bi-plane projection into 3D space curves; 3) creating labeled templates of facial and skeletal shapes; and 4) creating 3D average surface Bolton standards. We used datasets from patients scanned with a Hitachi MercuRay CBCT scanner, which provides high-resolution, isotropic CT volume images, together with digitized Bolton Standards of lateral and frontal male, female and average tracings from age 3 to 18 years, and converted them into facial and skeletal 3D space curves. This new 3D standard will help in assessing shape variations due to aging in the young population and provide a reference for correcting facial anomalies in dental medicine.
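    Step 2 of the pipeline (lifting bi-plane 2D tracings into 3D space curves) can be sketched by pairing a lateral (x, z) tracing with a frontal (y, z) tracing of the same structure. The toy version below simply matches sample heights on the shared vertical axis; it only illustrates the idea and is not the authors' registration method:

```python
def biplane_to_3d(lateral, frontal):
    """Combine a lateral (x, z) tracing and a frontal (y, z) tracing of the
    same structure into a 3D space curve by matching points on the shared
    z (vertical) axis. Assumes both tracings are sampled at the same heights.
    Illustrative only; the actual method works on registered spline curves."""
    frontal_by_z = {z: y for y, z in frontal}
    points = []
    for x, z in lateral:
        if z in frontal_by_z:
            points.append((x, frontal_by_z[z], z))
    return points

# Tiny synthetic tracings sampled at the same three heights
lateral = [(1.0, 0.0), (1.2, 1.0), (1.5, 2.0)]
frontal = [(0.5, 0.0), (0.6, 1.0), (0.8, 2.0)]
print(biplane_to_3d(lateral, frontal))
```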

  15. Design and Analysis of Thorium-fueled Reduced Moderation Boiling Water Reactors

    NASA Astrophysics Data System (ADS)

    Gorman, Phillip Michael

    The Resource-renewable Boiling Water Reactors (RBWRs) are a set of light water reactors (LWRs) proposed by Hitachi which use a triangular lattice and high void fraction to incinerate fuel with an epithermal spectrum, which is highly atypical of LWRs. The RBWRs operate on a closed fuel cycle, which is impossible with a typical thermal spectrum reactor, in order to accomplish missions normally reserved for sodium fast reactors (SFRs): either fuel self-sufficiency or waste incineration. The RBWRs also axially segregate the fuel into alternating fissile "seed" regions and fertile "blanket" regions in order to enhance breeding and the leakage probability upon coolant voiding. This dissertation focuses on thorium design variants of the RBWR: the self-sufficient RBWR-SS and the RBWR-TR, which consumes reprocessed transuranic (TRU) waste from PWR used nuclear fuel. These designs were based on the Hitachi-designed RBWR-AC and RBWR-TB2, respectively, which use depleted uranium (DU) as the primary fertile fuel. The DU-fueled RBWRs use a pair of axially segregated seed sections in order to achieve a negative void coefficient; however, several concerns were raised with this multi-seed approach, including difficulty in controlling the reactor and unacceptably high axial power peaking. Since thorium-uranium fuel tends to have much more negative void feedback than uranium-plutonium fuel, the thorium RBWRs were designed to use a single elongated seed to avoid these issues. A series of parametric studies was performed to find the design space for the thorium RBWRs and to optimize the designs while meeting the required safety constraints. The RBWR-SS was optimized to maximize the discharge burnup, while the RBWR-TR was optimized to maximize the TRU transmutation rate. These parametric studies were performed on an assembly-level model using the MocDown simulator, which calculates an equilibrium fuel composition with a specified reprocessing scheme.
A full core model was then created for each design, using the Serpent/PARCS 3-D core simulator, and the full core performance was assessed. The RBWR-SS benefited from a harder spectrum than the RBWR-TR; a hard spectrum promotes breeding and increases the discharge burnup, but reduces the TRU transmutation rate. This led the RBWR-SS to have a very tight lattice, for which the thermal hydraulic correlations carry considerable experimental uncertainty. Two RBWR-SS designs were therefore created under different thermal hydraulic assumptions: the RBWR-SSH used the same assumptions as Hitachi used for the RBWR-AC, while the RBWR-SSM used more conservative correlations recommended by collaborators at MIT. However, the void feedback of the pure thorium-fed system was too strongly negative, even with a single elongated seed. Therefore, instead of pure thorium, the self-sustaining designs were fed with a mix of 30% to 50% DU and the rest thorium in order to keep the void feedback as close to zero as possible. This was not necessary for the RBWR-TR, as the external TRU feed fulfilled a similar role. Unfortunately, it was found that the RBWR-SSM could not sustain a critical cycle without either significantly reducing the power or supplying an external feed of fissile material. While the RBWR-SSH and the RBWR-TR could reach burnups and transmutation rates similar to those of their DU-fueled counterparts as designed by Hitachi, the thorium designs were unable to simultaneously have negative void feedback and sufficient shutdown margin to shut down the core. The multi-seed approach of the Hitachi designs allowed their reactors to have much lower magnitudes of Doppler feedback than the single-seed designs, which helps them achieve sufficient shutdown margin. It is expected that thorium-fueled RBWRs designed with multiple seeds would permit adequate shutdown margin, although care would be needed to avoid the same issues as the DU-fueled RBWRs.
Alternatively, it may be possible to increase the amount of boron in the control blades by changing the assembly and core design. The uncertainties in the multiplication factor due to nuclear data and void fraction uncertainty were assessed for the RBWR-SSH and the RBWR-TR, as well as for the RBWR-TB2. In addition, the uncertainty in the change between reactor states (such as the reactivity insertion on flooding the core) due to nuclear data uncertainties was quantified. The thorium RBWRs have much larger uncertainties than their DU-fueled counterparts as designed by Hitachi, because the fission cross section of 233U has very large uncertainty in the epithermal energy range. The uncertainty in the multiplication factor at reference conditions was about 1350 pcm for the RBWR-SSH and about 900 pcm for the RBWR-TR. The uncertainty in the void coefficient of reactivity for both reactors is between 8 and 10 pcm/% void, which is on the same order of magnitude as the full core value. Finally, since sharp linear heat rate spikes were observed in the RBWR-TB2 simulation, the RBWR-TB2 unit cell was simulated using a much finer mesh than is possible with deterministic codes. It was found that thermal neutrons reflecting back from the reflectors and the blankets caused extreme spikes in the power density near the axial boundaries of the seeds, which were artificially smoothed out when coarser meshes were used. It is anticipated that these spikes would cause melting in both seeds of the RBWR-TB2 unless design changes, such as reducing the enrichment near the axial boundaries of the seeds, are made.
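    The reactivity quantities quoted in pcm above follow from the standard conversion ρ = (k − 1)/k × 10⁵. A small sketch of that conversion and of a void coefficient estimate, with hypothetical multiplication factors (not values from the dissertation):

```python
def reactivity_pcm(k):
    """Reactivity rho = (k - 1)/k, expressed in pcm (1 pcm = 1e-5)."""
    return (k - 1.0) / k * 1e5

def void_coefficient_pcm_per_pct(k_nominal, k_voided, delta_void_pct):
    """Void coefficient: change in reactivity per percent change in void."""
    d_rho = reactivity_pcm(k_voided) - reactivity_pcm(k_nominal)
    return d_rho / delta_void_pct

# Hypothetical: k drops from 1.0000 to 0.9990 when void fraction rises by 10%
print(round(void_coefficient_pcm_per_pct(1.0000, 0.9990, 10.0), 1))
```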

  16. Reference System of DNA and Protein Sequences on CD-ROM

    NASA Astrophysics Data System (ADS)

    Nasu, Hisanori; Ito, Toshiaki

    DNASIS-DBREF31 is a database of DNA and protein sequences on optical Compact Disc (CD) ROM, developed and commercialized by Hitachi Software Engineering Co., Ltd. Both nucleic acid base sequences and protein amino acid sequences can be retrieved from a single CD-ROM. Existing databases are offered in the form of on-line services, floppy disks, or magnetic tape, all of which have problems of one kind or another, such as usability or storage capacity. DNASIS-DBREF31 newly adopts a CD-ROM as the database medium to realize mass storage and personal use of the database.

  17. Energy Conservation Potential of Surface Modification Technologies

    DTIC Science & Technology

    1985-09-01

    vigorously pursued by industry. In effect, two companies, Energy Conversion Devices, Inc. and Chronar, both amorphous photovoltaic cell producers... [OCR-garbled excerpt; a machine-tool listing is partially recoverable: Makino Mach. Tool Co. Model MC 40, Kearney & Trecker Milwaukee-Matic 180, Hitachi Seiki U.S.A. Inc. HA-400 SEIKIMATIC, Ex-Cell...]

  18. Lotus LADM Based Self-Decontaminating Surfaces

    DTIC Science & Technology

    2007-05-01

    nylon composite was examined with a scanning electron microscope (SEM), Hitachi S-3200N, operated at 10 kV and magnifications from 50x to 2000x. [OCR-garbled equation (1), relating the equilibrium contact angle to the dispersive, polar and hydrogen-bonding surface tension components of the liquid and solid.] To make the nylon film surface hydrophobic, we need to attach a low surface tension material to... [OCR-garbled passage with views of the roughness pattern; for this rough surface, the roughness factor r and solid fraction ΦS appear to be defined in terms of pillar radius R, height h and spacing d as r = 2πRh/(R+d)² + 1 (4) and ΦS = πR²/(R+d)² (5).]

  19. Establishing pediatric reference intervals for 13 biochemical analytes derived from normal subjects in a pediatric endocrinology clinic in Korea.

    PubMed

    Cho, Sun-Mi; Lee, Sang-Guk; Kim, Ho Seong; Kim, Jeong-Ho

    2014-12-01

    Defining pediatric reference intervals is one of the most difficult tasks for laboratory physicians. The continuously changing physiology of growing children makes their laboratory values moving targets. In addition, ethnic and behavioral differences may also cause variations. The aim of this study was to establish age- and sex-specific partitioned reference intervals for 13 serum biochemical analytes in Korean children. A total of 2474 patients, girls aged 2-14 years and boys aged 2-16 years, who underwent a short-stature workup but were diagnosed as normal at the Pediatric Endocrinology Clinic of Severance Hospital (Seoul, Korea) between September 2010 and June 2012, were included in this study. The levels of serum calcium, inorganic phosphorus, blood urea nitrogen, creatinine, uric acid, glucose, total cholesterol, total protein, albumin, alkaline phosphatase, aspartate aminotransferase, alanine aminotransferase, and total bilirubin were measured using a Hitachi 7600 analyzer (Hitachi High-Technologies Corporation, Tokyo, Japan). Reference intervals were partitioned according to sex or age subgroups using the Harris and Boyd method. Most analytes except calcium and albumin required partitioning either by sex or age. Age-specific partitioned reference intervals for alkaline phosphatase, creatinine, and total bilirubin were established for both males and females after partitioning by sex. Additional age-specific partitioning of aspartate aminotransferase in females and of total protein and uric acid in males was also required. Inorganic phosphorus, total cholesterol, alanine aminotransferase, blood urea nitrogen, and glucose were partitioned only by sex. This study provides updated age- and sex-specific pediatric reference intervals for 13 basic serum chemistry analytes from a sufficient number of healthy children using a modern analytical chemistry platform. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
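    The Harris and Boyd partitioning method mentioned above compares subgroup means with a z-statistic against a sample-size-dependent threshold. A sketch assuming the commonly quoted critical value z* = 3·√(n̄/120) (with n̄ the mean subgroup size) and entirely hypothetical subgroup statistics:

```python
import math

def harris_boyd_partition(mean1, sd1, n1, mean2, sd2, n2):
    """Return (z, z_critical, should_partition) per the Harris-Boyd criterion.
    Assumes the commonly quoted threshold z* = 3 * sqrt(n_avg / 120)."""
    z = abs(mean1 - mean2) / math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    n_avg = (n1 + n2) / 2.0
    z_crit = 3.0 * math.sqrt(n_avg / 120.0)
    return z, z_crit, z > z_crit

# Hypothetical alkaline phosphatase (U/l) summaries for two age subgroups
z, z_crit, split = harris_boyd_partition(250.0, 60.0, 400, 180.0, 55.0, 350)
print(split)  # whether separate reference intervals are warranted
```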

  20. Isolation and identification of female DNA on postcoital penile swabs.

    PubMed

    Cina, S J; Collins, K A; Pettenati, M J; Fitts, M

    2000-06-01

    After sexual assault, cells originating from the assailant may be recovered from the victim. Through polymerase chain reaction (PCR)-based technology, positive scientific identification of the assailant may be made from these cells. Described here is a prospective study of a method for positively identifying cells from a female sex partner obtained from postcoital swabs of the penis of the male sex partner. Swabs were taken from the penis of a man at 1- to 24-hour intervals after coitus. DNA was isolated from each swab through standard organic extraction methods. The presence of female DNA was detected using the gender-specific amelogenin marker. Extracted DNA was amplified at eight different genetic loci using the Promega PowerPlex kit (Promega) and Amplitaq Gold (Perkin Elmer). Amplified samples were electrophoresed on precast sequencing gels (Hitachi) and analyzed using Hitachi's FMBIO 2 fluorescent scanner and software. Each sample obtained from a penile swab or condom was compared to male and female buccal controls. Female DNA was isolated from all postcoital penile swabs, as determined by exclusive amplification of the X-chromosome-specific 212-base-pair amelogenin marker. In all cases, scientific identification of the female DNA from the swabs was achieved by coamplification of eight STR loci (PowerPlex) and comparison to female and male control profiles. Cells shed from a female victim during sexual intercourse can thus be retrieved from the penis of a male offender within a 1- to 24-hour postcoital interval, and DNA extracted from these cells can be used to scientifically identify the female sexual participant through PCR-based technology. It is suggested that penile swabs be taken from alleged perpetrators of sexual assaults to associate them with a female victim.

  1. High-resolution ultrasonography in assessing temporomandibular joint disc position.

    PubMed

    Talmaceanu, Daniel; Lenghel, Lavinia Manuela; Bolog, Nicolae; Popa Stanila, Roxana; Buduru, Smaranda; Leucuta, Daniel Corneliu; Rotar, Horatiu; Baciut, Mihaela; Baciut, Grigore

    2018-02-04

    The purpose of this study was to determine the diagnostic value of high-resolution ultrasonography (US) in temporomandibular joint (TMJ) disc displacements. A total of 74 patients (148 TMJs) with signs and symptoms of TMJ disorders, according to the Research Diagnostic Criteria for Temporomandibular Disorders, were included in this study. All patients received US and magnetic resonance imaging (MRI) of both TMJs 1 to 5 days after the clinical examination. MRI examinations were performed using 1.5 T MRI equipment (Siemens Avanto, Siemens, Erlangen). Ultrasonographic examination was performed on a Hitachi EUB 8500 (Hitachi Medical Corp., Tokyo, Japan) scanner with a 6.5-13 MHz linear transducer. MRI depicted 68 (45.95%) normal joints, 47 (31.76%) with disc displacement with reduction, 33 (22.3%) with disc displacement without reduction and 34 (22.97%) with degenerative changes. US detected 78 (52.7%) normal joints, 37 (25%) with disc displacement with reduction, 33 (22.3%) with disc displacement without reduction and 21 (14.19%) with degenerative changes. Compared to MRI, US showed a sensitivity of 93.1%, specificity of 87.88%, accuracy of 90.32%, a positive predictive value of 87.1% and a negative predictive value of 93.55% for the overall diagnosis of disc displacement. The Youden index was 0.81. Based on our results, high-resolution ultrasonography showed high sensitivity, specificity and accuracy in the diagnosis of TMJ disc displacement. It could be a valuable imaging technique for assessing TMJ disc position, although its diagnostic value depends strictly on the examiner's skills and on the equipment used.
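    The figures reported above are standard 2×2-table statistics computed against the MRI reference. A sketch of the calculations, using illustrative counts (not taken from the paper):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, accuracy, PPV, NPV and Youden index
    from a 2x2 table against a reference standard."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fp + fn + tn)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    youden = sens + spec - 1.0
    return {"sens": sens, "spec": spec, "acc": acc,
            "ppv": ppv, "npv": npv, "youden": youden}

# Illustrative counts for US vs the MRI reference (not the study's raw data)
m = diagnostic_metrics(tp=54, fp=8, fn=4, tn=58)
print({k: round(v, 3) for k, v in m.items()})
```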

  2. Mediaprocessors in medical imaging for high performance and flexibility

    NASA Astrophysics Data System (ADS)

    Managuli, Ravi; Kim, Yongmin

    2002-05-01

    New high-performance programmable processors, called mediaprocessors, have been emerging since the early 1990s for various digital media applications, such as digital TV, set-top boxes, desktop video conferencing, and digital camcorders. Modern mediaprocessors, e.g., TI's TMS320C64x and Hitachi/Equator Technologies' MAP-CA, can offer high performance by exploiting both instruction-level and data-level parallelism. During this decade, with continued performance improvement and cost reduction, we believe that mediaprocessors will become a preferred choice in designing imaging and video systems due to their flexibility in incorporating new algorithms and applications via programming, and their faster time-to-market. In this paper, we evaluate the suitability of these mediaprocessors for medical imaging. We review the core routines of several medical imaging modalities, such as ultrasound and DR, present how these routines can be mapped to mediaprocessors, and report the resulting performance. We also analyze the architecture of several leading mediaprocessors. By carefully mapping key imaging routines, such as 2D convolution, unsharp masking, and 2D FFT, to the mediaprocessor, we have been able to achieve performance comparable (if not superior) to that of traditional hardwired approaches. Thus, we believe that future medical imaging systems will benefit greatly from these advanced mediaprocessors, offering significantly increased flexibility and adaptability, reduced time-to-market, and an improved cost/performance ratio compared to existing systems while meeting high computing requirements.
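    One of the routines named above, unsharp masking, is simply original + amount × (original − blurred). A naive pure-Python sketch (a mediaprocessor implementation would of course vectorize these loops):

```python
def convolve2d(img, kernel):
    """Naive 2D convolution with zero padding, for small odd-sized kernels."""
    kh, kw = len(kernel), len(kernel[0])
    ph, pw = kh // 2, kw // 2
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for j in range(kh):
                for i in range(kw):
                    yy, xx = y + j - ph, x + i - pw
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx] * kernel[j][i]
            out[y][x] = acc
    return out

def unsharp_mask(img, amount=1.0):
    """Unsharp masking: sharpened = img + amount * (img - blurred)."""
    box = [[1 / 9.0] * 3 for _ in range(3)]  # 3x3 box blur
    blurred = convolve2d(img, box)
    return [[img[y][x] + amount * (img[y][x] - blurred[y][x])
             for x in range(len(img[0]))] for y in range(len(img))]

# A flat image with one bright pixel: sharpening boosts the local contrast
img = [[10.0] * 5 for _ in range(5)]
img[2][2] = 100.0
print(round(unsharp_mask(img)[2][2], 1))
```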

  3. Single exposure EUV patterning of BEOL metal layers on the IMEC iN7 platform

    NASA Astrophysics Data System (ADS)

    Blanco Carballo, V. M.; Bekaert, J.; Mao, M.; Kutrzeba Kotowska, B.; Larivière, S.; Ciofi, I.; Baert, R.; Kim, R. H.; Gallagher, E.; Hendrickx, E.; Tan, L. E.; Gillijns, W.; Trivkovic, D.; Leray, P.; Halder, S.; Gallagher, M.; Lazzarino, F.; Paolillo, S.; Wan, D.; Mallik, A.; Sherazi, Y.; McIntyre, G.; Dusa, M.; Rusu, P.; Hollink, T.; Fliervoet, T.; Wittebrood, F.

    2017-03-01

    This paper summarizes findings on the iN7 platform (foundry N5 equivalent) for single-exposure EUV (SE EUV) patterning of the M1 and M2 BEOL layers. Logic structures within these layers were measured after litho and after etch, and variability was characterized with both conventional CD-SEM measurements and the Hitachi contouring method. After analyzing the patterning of these layers, the impact of variability on potential interconnect reliability was studied using Monte Carlo and process emulation simulations, to determine whether current litho/etch performance would meet the success criteria for the given platform design rules.

  4. Sequence search on a supercomputer.

    PubMed

    Gotoh, O; Tagashira, Y

    1986-01-10

    A set of programs was developed for searching nucleic acid and protein sequence databases for sequences similar to a given sequence. The programs, written in FORTRAN 77, were optimized for vector processing on a Hitachi S810-20 supercomputer. A search of a 500-residue protein sequence against the entire PIR database Ver. 1.0 (1) (0.5M residues) is carried out in a CPU time of 45 sec. About 4 min is required for an exhaustive search of a 1500-base nucleotide sequence against all mammalian sequences (1.2M bases) in GenBank Ver. 29.0. The CPU time is reduced to about a quarter with a faster version.
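    The FORTRAN programs themselves are not shown. As a toy stand-in for this kind of similarity search, a sliding-window identity count over two sequences (illustrative only, and far simpler than the optimized vector code described):

```python
def best_local_match(query, target, window=8):
    """Slide a window of the query along the target and count identities.
    Returns (score, query_pos, target_pos) of the best-scoring alignment.
    A toy stand-in for the vectorized similarity search described above."""
    best = (0, 0, 0)
    for q in range(len(query) - window + 1):
        chunk = query[q:q + window]
        for t in range(len(target) - window + 1):
            score = sum(a == b for a, b in zip(chunk, target[t:t + window]))
            if score > best[0]:
                best = (score, q, t)
    return best

# The 8-base window starting at query position 0 matches target position 2
score, qpos, tpos = best_local_match("ACGTACGTAA", "TTACGTACGTCC")
print(score, qpos, tpos)
```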

  5. Collaborative Manufacturing Management in Networked Supply Chains

    NASA Astrophysics Data System (ADS)

    Pouly, Michel; Naciri, Souleiman; Berthold, Sébastien

    ERP systems provide information management and analysis to industrial companies and support their planning activities. They are currently based mostly on theoretical values (averages) of parameters rather than on actual shop floor data, which disturbs the planning algorithms. On the other hand, sharing data between manufacturers, suppliers and customers is becoming very important to ensure reactivity to market variability. This paper proposes software solutions to address these requirements, and methods to automatically capture the necessary shop floor information. In order to share data produced by different legacy systems along the collaborative networked supply chain, we propose to use the Generic Product Model developed by Hitachi to extract, translate and store heterogeneous ERP data.
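    The extract-and-translate step can be sketched as a field mapping from each legacy ERP's schema onto one shared product schema. All system and field names below are hypothetical; the actual Generic Product Model schema is not reproduced here:

```python
# Map legacy ERP field names onto a shared product schema.
# Field and system names are hypothetical illustrations; Hitachi's
# Generic Product Model defines its own schema, not reproduced here.
FIELD_MAPS = {
    "erp_a": {"item_no": "product_id", "qty_on_hand": "stock", "desc": "name"},
    "erp_b": {"sku": "product_id", "inventory": "stock", "label": "name"},
}

def to_generic(record, source):
    """Translate one legacy record into the shared schema."""
    mapping = FIELD_MAPS[source]
    return {generic: record[legacy] for legacy, generic in mapping.items()}

a = to_generic({"item_no": "X-100", "qty_on_hand": 42, "desc": "valve"}, "erp_a")
b = to_generic({"sku": "X-100", "inventory": 17, "label": "valve"}, "erp_b")
print(a["product_id"] == b["product_id"])  # same product seen by two ERPs
```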

  6. An Educational Program of Engineering Ethics and Its Dissemination Activity

    NASA Astrophysics Data System (ADS)

    Muramatsu, Ryujiro; Nagashima, Shigeo

    Education on ethics for corporate employees, especially for engineers, is becoming increasingly important for most companies in Japan, because scandals caused by ethical problems have exposed many companies to operational disadvantages. At Hitachi, Ltd., we have worked on education in engineering ethics for two years. In this paper, we describe the activities of committees on engineering ethics, an e-learning training course available on our intranet e-learning system, and a short-term in-house training course operated regularly at our training institute. We also describe dissemination activities directed at employees in each division and some subsidiaries.

  8. Decontamination, decommissioning, and vendor advertorial issue, 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agnihotri, Newal

    The focus of the July-August issue is on decontamination, decommissioning, and vendor advertorials. Articles and reports in this issue include: D and D technical paper summaries; The role of nuclear power in turbulent times, by Tom Christopher, AREVA NP, Inc.; Enthusiastic about new technologies, by Jack Fuller, GE Hitachi Nuclear Energy; It's important to be good citizens, by Steve Rus, Black and Veatch Corporation; Creating jobs in the U.S., by Guy E. Chardon, ALSTOM Power; and An environment and a community champion, by Tyler Lamberts, Entergy Nuclear Operations, Inc. The Industry Innovations article is titled Best of the best TIP achievement 2008, by Edward Conaway, STP Nuclear Operating Company.

  9. Age-related plasma chemistry findings in the buff-crested bustard (Eupodotis ruficrista gindiana).

    PubMed

    Bailey, T A; Wernery, U; Howlett, J; Naldo, J; Samour, J H

    1998-12-01

    Blood samples were obtained from adult (> 1.5 years) and juvenile (2-8 weeks, 9-16 weeks and 17-24 weeks) captive buff-crested bustards (Eupodotis ruficrista gindiana) to study age-related changes. A total of twelve different tests were conducted using a Hitachi 90011 wet chemistry analyzer. A comparison of the values obtained was made between adult and juvenile buff-crested bustards and, from the literature, with other bustard species. Significant differences between adult and juvenile buff-crested bustards were found for glucose, uric acid, total protein, alkaline phosphatase, aspartate aminotransferase and calcium. The results obtained from this study provide blood chemistry values for this species and demonstrate age-related differences between adult and juvenile birds.

  10. International trade and waste and fuel management issue, 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agnihotri, Newal

    The focus of the January-February issue is on international trade and waste and fuel management. Major articles/reports in this issue include: A global solution for clients, by Yves Linz, AREVA NP; A safer, secure and economical plant, by Andy White, GE Hitachi Nuclear; Robust global prospects, by Ken Petrunik, Atomic Energy of Canada Limited; Development of NPPs in China, by Chen Changbing and Li Huiqiang, Huazhong University of Science and Technology; Yucca Mountain update; and A class of its own, by Tyler Lamberts, Entergy Nuclear. The Industry Innovation articles in this issue are: Fuel assembly inspection program, by Jim Lemons, Tennessee Valley Authority; and Improved in-core fuel shuffle for reduced refueling duration, by James Tusar, Exelon Nuclear.

  11. Prime Contract Awards Alphabetically by Contractor, by State or Country, and Place, FY 88. Part 9. (Giusti & Renshaw Construction-Hitachi America, Ltd.)

    DTIC Science & Technology

    1988-01-01

  12. A Mobile Nanoscience and Electron Microscopy Outreach Program

    NASA Astrophysics Data System (ADS)

    Coffey, Tonya; Kelley, Kyle

    2013-03-01

    We have established a mobile nanoscience laboratory outreach program in Western NC that puts scanning electron microscopy (SEM) directly in the hands of K-12 students and the general public. There has been a recent push to develop new active learning materials to educate students at all levels about nanoscience and nanotechnology. Previous projects, such as Bugscope, nanoManipulator, or SPM Live! allowed remote access to advanced microscopies. However, placing SEM directly in schools has not often been possible because the cost and steep learning curve of these technologies were prohibitive, making this project quite novel. We have developed new learning modules for a microscopy outreach experience with a tabletop SEM (Hitachi TM3000). We present here an overview of our outreach and results of the assessment of our program to date.

  13. Manufacture and Quality Control of Insert Coil with Real ITER TF Conductor

    DOE PAGES

    Ozeki, H.; Isono, T.; Uno, Y.; ...

    2016-03-02

    JAEA successfully completed the manufacture of the toroidal field (TF) insert coil (TFIC) for a performance test of the ITER TF conductor in the final design, in cooperation with Hitachi, Ltd. The TFIC is a single-layer, 8.875-turn solenoid coil with a 1.44-m diameter. It will be tested with a 68-kA current in a 13-T external magnetic field. The TFIC was manufactured in the following order: winding of the TF conductor, lead bending, fabrication of the electrical termination, heat treatment, turn insulation, installation of the coil into the support mandrel structure, vacuum pressure impregnation (VPI), structure assembly, and instrumentation. The manufacturing process and quality control status for the TFIC are reported.

  14. Heat insulating device for low temperature liquefied gas storage tank

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okamoto, T.; Nishimoto, T.; Sawada, K.

    1978-05-02

    Hitachi Shipbuilding and Engineering Co., Ltd.'s insulation method for spherical LNG containers solves various problems associated with insulating a sphere's three-dimensional curved surface: equalizing the thickness of the insulation, insulating the junctions between insulation blocks, and preventing seawater or LNG from penetrating the insulation barrier in the event of a rupture in the tank and ship's hull. The design incorporates a number of blocks or plates of rigid foam-insulating material bonded to the outer wall; seats for receiving pressing jigs for the bonding operation are secured to the outer wall in the joints between the insulating blocks. The joints are filled with soft synthetic foam (embedding the seats), a moistureproof layer covers the insulating blocks and joints, and a waterproof material covers the moistureproof layer.

  15. Secondary barrier construction for low temperature liquefied gas storage tank carrying vessels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okamoto, T.; Nishimoto, T.; Sawada, K.

    1978-12-05

    A new LNG-cargo-tank secondary barrier developed by Japan's Hitachi Shipbuilding and Engineering Co., Ltd., offers ease of fabrication, simple construction, improved efficiency of installation, and protection against seawater ingress as well as LNG leakage. The secondary barrier, intended for use below spherical LNG tanks, consists of unit heat-insulating block plates adhesively secured to the bottom plate of the ship's hold, heat-insulating filling members stuffed into the joints between the block plates, and a protective layer formed on the entire surface of the block plates and the filling members. The unit block plates are heat-insulating members of the required thickness, preformed into a square or trapezoidal shape, particularly in the form of rigid-foam synthetic-resin plates.

  16. FBIS report. Science and technology: Japan, December 10, 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-12-10

    Contents (partial): Japan: Fabrication of Diamond Single Crystal Thin Film by Ion Beam Deposition; Japan: Hitachi Metal Develops New Semi-Solid Metal Processing Technology; Japan: NTT Develops Fuel Cell System That Uses Both City Gas, LPG; Japan: Daihatsu Motor Completes Prototype EV; Japan: NIRIM Announces Success With Synthetic Bone Development; Japan: Sandoz Pharmaceuticals Plans Clinical Trials of Gene Therapy to Cerebral Tumor in Japan; Japan: MITI To Provide Aid for Residential Solar Power Generation Systems; Japan: MELCO To Provide Satellite Solar Cell Panel for SSL, USA; Japan: Japan Atomic Energy Research Institute Leads Nuclear Research; Japan: Kobe Steel's Superconducting Magnet Ready to Go Fast; Japan: MPT To Begin Validation Test for Electric Money Implementation; and Japan: Defense Agency to Send ASDF's Pilots to Russia for Training.

  17. Methodological specifics of the study of micro HPP based on internal combustion engines with air cooling and cogeneration

    NASA Astrophysics Data System (ADS)

    Shchinnikov, P. A.; Tomilov, V. G.; Sinelnikov, D. S.

    2017-01-01

    The article considers some aspects of the research methodology for micro heat power plants based on air-cooled internal combustion engines with cogeneration, built on energy balance equations and the laws of heat transfer. The research was conducted for such a setup based on a Hitachi internal combustion engine with 2.4 kW capacity. Cogeneration proved effective in the form of a useful heat flow from the air cooling the cylinder head, with further heating by recovering the heat of the flue gases in an additional plate heat exchanger. Depending on the duration of use, cogeneration can reduce fuel costs by a factor of 3-10 compared with heat guns.
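
    The energy-balance reasoning the methodology rests on can be sketched in a few lines; the flow rates, temperatures, and specific heats below are illustrative assumptions, not values from the study.

```python
# Hypothetical energy-balance sketch for cogeneration heat recovery.
# All numeric inputs are illustrative assumptions, not study data.

def useful_heat_flow(m_dot_kg_s, cp_j_kg_k, t_out_c, t_in_c):
    """Q = m_dot * cp * (T_out - T_in), the basic balance the study builds on."""
    return m_dot_kg_s * cp_j_kg_k * (t_out_c - t_in_c)

# Air heated while cooling the cylinder head (assumed flow and temperatures)
q_air = useful_heat_flow(0.05, 1005.0, 60.0, 20.0)      # W
# Additional recovery from flue gases in a plate heat exchanger (assumed)
q_flue = useful_heat_flow(0.004, 1100.0, 400.0, 120.0)  # W

q_total = q_air + q_flue
print(f"recovered heat: {q_total:.0f} W")
```

    With these assumed numbers the recovered heat is on the same order as the 2.4 kW engine capacity, which is the point of cogeneration in such a setup.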

  18. PREFACE: 7th International Conference on Applications of Physics in Financial Analysis

    NASA Astrophysics Data System (ADS)

    Takayasu, M.; Watanabe, T.; Ikeda, Y.; Takayasu, H.

    2010-04-01

    This volume contains contributed papers from the 7th international conference on 'Applications of Physics in Financial Analysis' (APFA), held in Tokyo on 1-5 March 2009. The conference was organized jointly by Tokyo Institute of Technology and Hitotsubashi University, with support from the Research Institute of Economy, Trade, and Industry (RIETI), the Physical Society of Japan, the Japanese Economic Association, the Information Processing Society of Japan, the Japanese Society for Artificial Intelligence, and the Japan Association for Evolutionary Economics. The first APFA conference (APFA1) was held in 1999 in Dublin, followed by APFA2 in Liege in 2000, APFA3 in London in 2001, APFA4 in Warsaw in 2003, APFA5 in Torino in 2006, and APFA6 in Lisbon in 2007. The 7th APFA conference, the first in the series held outside Europe, was attended by 223 researchers in physics and economics from 23 countries worldwide. In keeping with past APFA conferences, we paid special attention to issues in financial markets, which turned out to be very timely: the conference was held in March 2009, in the middle of the global financial crisis that started in the US and spread quickly to every corner of the world. The topic of the conference was 'New Approaches to the Analysis of Large-Scale Business and Economic Data'. The rapid development of information and communication technology has enabled financial and non-financial firms to keep detailed records of their business activities in the form of, for example, tick-by-tick data in financial markets, point-of-sale (POS) data on individual households' purchasing activity, and interfirm network data describing relationships among firms in terms of supplier/customer transactions and ownership. This growth in the scope and amount of business data available to researchers has led to a far-reaching expansion in research possibilities.
    Researchers not only in the social sciences but also in physics, mathematics, and information science have recently become interested in such datasets, conducting empirical investigations of various aspects of economic activity. Specifically, they have searched for regularities and 'laws' akin to those in natural science, producing fascinating results, as shown in the papers contained in this volume. Each paper submitted for publication in this volume has gone through the refereeing process and has been revised on the basis of comments and discussion at the conference as well as comments from the anonymous referees. In all, 19 papers were accepted for publication. The editors are very grateful to the colleagues involved in the refereeing process for their rapid and careful reviewing of the papers. We thank Takayuki Mizuno, Koji Sakai, Hiwon Yoon and Hiroki Matsui for their support for the conference. We appreciate the administrative assistance provided by Yayoi Hatano of Hitotsubashi University, and Masahiko Ozaki, Masato Yamada and Tomoko Kase of RIETI. We are most grateful to the authors for their contributions, as well as to the participants, all of whom made this conference stimulating and enjoyable.
    Misako Takayasu, Tokyo Institute of Technology, Japan
    Tsutomu Watanabe, Hitotsubashi University and RIETI, Japan
    Yuichi Ikeda, Hitachi Research Laboratory, Hitachi Ltd, Japan
    Hideki Takayasu, Sony Computer Science Laboratories, Inc, Japan

  19. OSCAR API for Real-Time Low-Power Multicores and Its Performance on Multicores and SMP Servers

    NASA Astrophysics Data System (ADS)

    Kimura, Keiji; Mase, Masayoshi; Mikami, Hiroki; Miyamoto, Takamichi; Shirako, Jun; Kasahara, Hironori

    OSCAR (Optimally Scheduled Advanced Multiprocessor) API has been designed for real-time embedded low-power multicores to generate parallel programs for various multicores from different vendors by using the OSCAR parallelizing compiler. The OSCAR API has been developed by Waseda University in collaboration with Fujitsu Laboratory, Hitachi, NEC, Panasonic, Renesas Technology, and Toshiba in a METI/NEDO project entitled "Multicore Technology for Realtime Consumer Electronics." By using the OSCAR API as an interface between the OSCAR compiler and backend compilers, the OSCAR compiler enables hierarchical multigrain parallel processing with memory optimization under capacity restriction for cache memory, local memory, distributed shared memory, and on-chip/off-chip shared memory; data transfer using a DMA controller; and power reduction control using DVFS (Dynamic Voltage and Frequency Scaling), clock gating, and power gating for various embedded multicores. In addition, a parallelized program automatically generated by the OSCAR compiler with the OSCAR API can be compiled by ordinary OpenMP compilers, since the OSCAR API is designed as a subset of OpenMP. This paper describes the OSCAR API and its compatibility with the OSCAR compiler by showing code examples. Performance evaluations of the OSCAR compiler and the OSCAR API were carried out using an IBM Power5+ workstation, an IBM Power6 high-end SMP server, and RP2, a newly developed consumer-electronics multicore chip by Renesas, Hitachi and Waseda. The scalability evaluation shows that, on average, the OSCAR compiler with the OSCAR API achieves a 5.8-fold speedup over sequential execution on the eight-core Power5+ workstation and a 2.9-fold speedup on the four-core RP2. In addition, the OSCAR compiler accelerates an IBM XL Fortran compiler by up to 3.3 times on the Power6 SMP server. With low-power optimization on RP2, the OSCAR compiler with the OSCAR API achieves a maximum power reduction of 84% in the real-time execution mode.
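
    The reported scalability numbers imply a roughly constant parallel efficiency across the two platforms, which the following sketch makes explicit (efficiency = speedup / core count; the speedup figures are the ones quoted in the abstract above).

```python
# Parallel efficiency implied by the reported OSCAR results.
# efficiency = speedup / number of cores

def parallel_efficiency(speedup, cores):
    return speedup / cores

power5_eff = parallel_efficiency(5.8, 8)  # Power5+ workstation, 8 cores
rp2_eff = parallel_efficiency(2.9, 4)     # RP2 multicore chip, 4 cores

print(f"Power5+: {power5_eff:.2%}, RP2: {rp2_eff:.2%}")
```

    Both platforms come out at about 72.5% efficiency, i.e. the compiler sustains comparable per-core utilization on a high-end workstation and an embedded chip.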

  20. Optimization and validation of CEDIA drugs of abuse immunoassay tests in serum on Hitachi 912.

    PubMed

    Kirschbaum, Katrin M; Musshoff, Frank; Schmithausen, Ricarda; Stockhausen, Sarah; Madea, Burkhard

    2011-10-10

    Due to the sensitive limits of detection of chromatographic methods and the low limit values for drug screening under the impaired-driving provision § 24a StVG (German Road Traffic Act), preliminary immunoassay (IA) tests should also be able to detect low concentrations of legal and illegal drugs in serum in forensic cases. False negatives must be avoided, and the rate of false positives should be kept low for reasons of cost and time. An optimization of IA cutoff values and a validation of the assay are required for each laboratory. In a retrospective study, results for serum samples containing amphetamine, methylenedioxy derivatives, cannabinoids, benzodiazepines, cocaine (metabolites), methadone and opiates obtained with CEDIA drugs of abuse reagents on a Hitachi 912 autoanalyzer were compared with quantitative results of chromatographic methods (gas or liquid chromatography coupled with mass spectrometry, GC/MS or LC/MS). First, sensitivity, specificity, positive and negative predictive values, and overall misclassification rates were evaluated by contingency tables and compared with ROC analyses and Youden indices. Second, ideal cutoffs were calculated statistically using the Youden index, with sensitivity and specificity as the decisive criteria and a focus on high sensitivity (low rates of false negatives). Immunoassay and confirmatory results were available for 3014 blood samples. Sensitivity was 90% or more for nearly all analytes: amphetamines (IA cutoff 9.5 ng/ml), methylenedioxy derivatives (IA cutoff 5.5 ng/ml), cannabinoids (IA cutoff 14.5 ng/ml), and benzodiazepines (IA cutoff >0 ng/ml). The opiate test showed a sensitivity of 86% at an IA cutoff of >0 ng/ml. Specificity ranged between 33% (methadone, IA cutoff 10 ng/ml) and 90% (cocaine, IA cutoff 20 ng/ml). Cutoff values lower than those recommended by ROC analyses were chosen for most tests to decrease the rate of false negatives.
    The analyses enabled the definition of cutoff values with good sensitivity. Small rates of false positives can be accepted in forensic cases. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
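
    The cutoff-selection statistics described above can be sketched as follows; the contingency-table counts are hypothetical, not data from the study.

```python
# Sketch of the cutoff-evaluation statistics: sensitivity, specificity,
# and the Youden index J = sensitivity + specificity - 1, computed from
# a 2x2 contingency table of immunoassay vs. confirmatory (GC/MS or
# LC/MS) results at one candidate cutoff. Counts are hypothetical.

def screening_stats(tp, fn, fp, tn):
    sens = tp / (tp + fn)        # true-positive rate (avoid false negatives)
    spec = tn / (tn + fp)        # true-negative rate
    youden = sens + spec - 1.0   # maximized when choosing the ideal cutoff
    return sens, spec, youden

sens, spec, j = screening_stats(tp=90, fn=10, fp=30, tn=70)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} Youden J={j:.2f}")
```

    In practice the study computed J over a range of candidate cutoffs and then deliberately shifted below the J-maximizing value to favour sensitivity over specificity.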

  1. Development of critical dimension measurement scanning electron microscope for ULSI (S-8000 series)

    NASA Astrophysics Data System (ADS)

    Ezumi, Makoto; Otaka, Tadashi; Mori, Hiroyoshi; Todokoro, Hideo; Ose, Yoichi

    1996-05-01

    The semiconductor industry is moving from half-micron to quarter-micron design rules. To support this evolution, Hitachi has developed a new critical dimension measurement scanning electron microscope (CD-SEM), the model S-8800 series, for quality control of quarter-micron process lines. The new CD-SEM provides detailed examination of process conditions with 5 nm resolution and 5 nm repeatability (3 sigma) at an accelerating voltage of 800 V using secondary electron imaging. In addition, a newly developed load-lock system achieves a high sample throughput of 20 wafers/hour (5 point measurements per wafer) under continuous operation. For ease of use, the system incorporates a graphical user interface (GUI), an automated pattern recognition system that helps locate measurement points, both manual and semi-automated operation, and user-programmable operating parameters.

  2. Fabrication and evaluation of 100 Ah cylindrical lithium ion battery for electric vehicle applications

    NASA Astrophysics Data System (ADS)

    Hyung, Yoo-Eup; Moon, Seong-In; Yum, Duk-Hyeng; Yun, Seong-Kyu

    100 Ah-class lithium ion cells with a C/LiCoO2 cell system for electric vehicles (EVs) were developed. EV-size lithium ion batteries have been developed by Sony, KERI/STC, SAFT, VARTA, Sanyo and Matsushita; GS Battery and Hitachi have also developed stationary-type large-scale (70-80 Ah) lithium ion batteries. Lithium ion battery modules for EVs were demonstrated by Sony/Nissan and KERI/STC in 1996. The developed EV cells achieve up to 115 Wh/kg of specific energy and 286 W/kg of specific power at 80% depth of discharge (DOD). We estimate driving distances per charge of 248 km in DST-120 mode and 242 km in ECE-15 mode. Finally, we performed safety/abuse tests on the developed lithium ion cells.

  3. [Comparison of image distortion between three magnetic resonance imaging systems of different magnetic field strengths for use in stereotactic irradiation of brain].

    PubMed

    Takemura, Akihiro; Sasamoto, Kouhei; Nakamura, Kaori; Kuroda, Tatsunori; Shoji, Saori; Matsuura, Yukihiro; Matsushita, Tatsuhiko

    2013-06-01

    In this study, we evaluated the image distortion of three magnetic resonance imaging (MRI) systems, with magnetic field strengths of 0.4 T, 1.5 T and 3 T, for use in stereotactic irradiation of the brain. A quality assurance phantom for MRI image distortion in radiosurgery was used for the measurements. Images were obtained from a 0.4-T MRI (APERTO Eterna, HITACHI), a 1.5-T MRI (Signa HDxt, GE Healthcare) and a 3-T MRI (Signa HDx 3.0 T, GE Healthcare) system. Imaging sequences for the 0.4-T and 3-T MRI were based on the 1.5-T MRI sequence used for stereotactic irradiation in the clinical setting. The same phantom was scanned using a computed tomography (CT) system (Aquilion L/B, Toshiba) as the standard. The mean errors in the Z direction were the least satisfactory of all directions in all results; the mean error in the Z direction for the 1.5-T MRI at -110 mm in the axial plane was the largest, at 4.0 mm. The maximum errors for the 0.4-T and 3-T MRI were 1.7 mm and 2.8 mm, respectively. The errors in the plane were neither uniform nor linear, suggesting that simple distortion correction using outside markers is unlikely to be effective. The 0.4-T MRI showed the lowest image distortion of the three. However, other factors, such as image noise, contrast and study duration, need to be evaluated in MRI systems when applying frameless stereotactic irradiation.
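
    The per-axis distortion metric used in such phantom studies can be sketched as below; the marker coordinates are hypothetical and chosen only to illustrate a dominant Z-direction error against a CT reference.

```python
# Sketch of a per-axis distortion metric: mean absolute error between
# phantom-marker positions measured on MRI and on the CT reference.
# All coordinates (in mm) are hypothetical illustration values.

def mean_axis_errors(mri_points, ct_points):
    n = len(mri_points)
    sums = [0.0, 0.0, 0.0]
    for (mx, my, mz), (cx, cy, cz) in zip(mri_points, ct_points):
        sums[0] += abs(mx - cx)
        sums[1] += abs(my - cy)
        sums[2] += abs(mz - cz)
    return [s / n for s in sums]

ct = [(0.0, 0.0, -110.0), (10.0, 0.0, -110.0)]        # reference positions
mri = [(0.5, 0.3, -106.5), (10.3, -0.3, -106.9)]       # distorted positions
ex, ey, ez = mean_axis_errors(mri, ct)
print(f"mean |error| X={ex:.1f} Y={ey:.1f} Z={ez:.1f} mm")
```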

  4. [Metallurgic drugs in ancient Japan].

    PubMed

    Sugiyama, S

    2001-01-01

    Advancements in metallurgic and pharmaceutical chemistry in ancient Japan were made by people like Mangan-Shonin, who combined elements from Shinto, Buddhism, and Taoism to take advantage of technologies brought by Chinese and Korean immigrants. The Shonin himself, though it may be considered a wild speculation, could well be such an immigrant. Along with the immigrants, the Shonin established government-subsidized temples (Jingu-ji, Jogaku-ji) throughout the country under sponsorship by the Imperial Court for the purpose of raising funds through private donations. Research and educational activities conducted in these temples ultimately resulted in a well-established body of chemical engineers who could excavate chemical substances as well as alter their natures. According to a list of regional products (Sasaki, 1972) up to the 14th century, these chemical substances and their derivative products included iron from the Hitachi region, cast metal from Shimotsuke, swords from Sagami, face powder (lead carbonate) from Ise, mercury, and gold.

  5. Study of photoluminescence properties of CaAl{sub 2}O{sub 4}: Eu{sup 2+} prepared by combustion synthesis method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hingwe, V. S., E-mail: vishwas.hingwe@yahoo.in; Omanwar, S. K.; Bajaj, N. S.

    2016-05-06

    Eu{sup 2+}-doped alkaline earth aluminates (strontium aluminate, calcium aluminate, and barium aluminate) were prepared using a modified combustion synthesis method at 600°C with urea as fuel. The crystal structure was determined by XRD, and the samples were confirmed by FTIR. The effect of the host material on the photoluminescence (PL) and phosphorescence properties was studied using a Hitachi F-7000 spectrofluorimeter equipped with a 450 W xenon lamp, in the range 200-650 nm. The emission spectra of Eu{sup 2+} range from 450 to 500 nm, in the blue-to-aqua region, and arise from the 4f{sup 7}-4f{sup 6}5d{sup 1} transition. The observed emission in CaAl{sub 2}O{sub 4} is at 440 nm.

  6. Outage management and health physics issue, 2009

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agnihotri, Newal

    2009-05-15

    The focus of the May-June issue is on outage management and health physics. Major articles include the following: Planning and scheduling to minimize refueling outage, by Pat McKenna, AmerenUE; Prioritizing safety, quality and schedule, by Tom Sharkey, Dominion; Benchmarking to high standards, by Margie Jepson, Energy Nuclear; Benchmarking against U.S. standards, by Magnox North, United Kingdom; Enabling suppliers for new build activity, by Marcus Harrington, GE Hitachi Nuclear Energy; Identifying, cultivating and qualifying suppliers, by Thomas E. Silva, AREVA NP; Creating new U.S. jobs, by Francois Martineau, Areva NP. Industry innovation articles include: MSL Acoustic source load reduction, by Amir Shahkarami, Exelon Nuclear; Dual Methodology NDE of CRDM nozzles, by Michael Stark, Dominion Nuclear; and Electronic circuit board testing, by James Amundsen, FirstEnergy Nuclear Operating Company. The plant profile article is titled The future is now, by Julia Milstead, Progress Energy Service Company, LLC.

  7. Imaging and elemental mapping of biological specimens with a dual-EDS dedicated scanning transmission electron microscope

    PubMed Central

    Wu, J.S.; Kim, A. M.; Bleher, R.; Myers, B.D.; Marvin, R. G.; Inada, H.; Nakamura, K.; Zhang, X.F.; Roth, E.; Li, S.Y.; Woodruff, T. K.; O'Halloran, T. V.; Dravid, Vinayak P.

    2013-01-01

    A dedicated analytical scanning transmission electron microscope (STEM) with dual energy dispersive spectroscopy (EDS) detectors has been designed for complementary high performance imaging as well as high sensitivity elemental analysis and mapping of biological structures. The performance of this new design, based on a Hitachi HD-2300A model, was evaluated using a variety of biological specimens. With three imaging detectors, both the surface and internal structure of cells can be examined simultaneously. The whole-cell elemental mapping, especially of heavier metal species that have low cross-section for electron energy loss spectroscopy (EELS), can be faithfully obtained. Optimization of STEM imaging conditions is applied to thick sections as well as thin sections of biological cells under low-dose conditions at room- and cryogenic temperatures. Such multimodal capabilities applied to soft/biological structures usher a new era for analytical studies in biological systems. PMID:23500508

  8. Plant maintenance and plant life extension issue, 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agnihotri, Newal

    The focus of the March-April issue is on plant maintenance and plant life extension. Major articles include the following: Exciting time to be at the U.S. NRC, by Dale Klein, Nuclear Regulatory Commission; Extraordinary steps to ensure a minimal environmental impact, by George Vanderheyden, UniStar Nuclear Energy, LLC.; Focused on consistent reduction of outages, by Kevin Walsh, GE Hitachi Nuclear Energy; On the path towards operational excellence, by Ricardo Perez, Westinghouse Electric Company; Ability to be refuelled on-line, by Ian Trotman, CANDU Services, Atomic Energy of Canada, Ltd.; ASCA Application for maintenance of SG secondary side, by Patrick Wagner, Wolf Creek Nuclear Operating Corporation, Phillip Battaglia and David Selfridge, Westinghouse Electric Company; and, An integral part of the landscape and lives, by Tyler Lamberts, Entergy Nuclear Operations, Inc. The Industry Innovation article is titled Steam generator bowl drain repairs, by John Makar and Richard Gimple, Wolf Creek Nuclear Operating Corporation.

  9. Plant maintenance and advanced reactors, 2005

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agnihotri, Newal

    2005-09-15

    The focus of the September-October issue is on plant maintenance and advanced reactors. Major articles/reports in this issue include: First U.S. EPRs in 2015, by Ray Ganthner, Framatome ANP; Pursuing several opportunities, by William E. (Ed) Cummins, Westinghouse Electric Company; Vigorous plans to develop advanced reactors, by Yuliang Sun, Tsinghua University, China; Multiple designs, small and large, by Kumiaki Moriya, Hitachi Ltd., Japan; Sealed and embedded for safety and security, by Handa Norihiko, Toshiba Corporation, Japan; Scheduled online in 2010, by Johan Slabber, PMBR (Pty) Ltd., South Africa; Multi-application reactors, by Nikolay G. Kodochigov, OKBM, Russia; Six projects under budget and on schedule, by David F. Togerson, AECL, Canada; Creating a positive image, by Scott Peterson, Nuclear Energy Institute (NEI); Advanced plans for nuclear power's renaissance, by John Cleveland, International Atomic Energy Agency, Austria; and, Plant profile: last five outages in less than 20 days, by Beth Rapczynski, Exelon Nuclear.

  10. High Serum Phospholipid Dihomo-γ-Linoleic Acid Concentration and Low Δ5-Desaturase Activity Are Associated with Increased Risk of Type 2 Diabetes among Japanese Adults in the Hitachi Health Study.

    PubMed

    Akter, Shamima; Kurotani, Kayo; Sato, Masao; Hayashi, Takuya; Kuwahara, Keisuke; Matsushita, Yumi; Nakagawa, Tohru; Konishi, Maki; Honda, Toru; Yamamoto, Shuichiro; Hayashi, Takeshi; Noda, Mitsuhiko; Mizoue, Tetsuya

    2017-08-01

    Background: The association between the circulating fatty acid (FA) composition and type 2 diabetes (T2D) has been reported in Western populations, but evidence is scarce among Asian populations, including the Japanese, who consume large amounts of fish. Objective: The objective of the present study was to prospectively examine the association between circulating concentrations of individual FAs and T2D incidence among Japanese adults. Methods: We conducted a nested case-control study in a cohort of 4754 employees, aged 34-69 y, who attended a comprehensive health checkup in 2008-2009 and donated blood samples for the Hitachi Health Study. During 5 y of follow-up, diabetes was identified on the basis of plasma glucose, glycated hemoglobin, and self-report. Two controls matched to each case by sex, age, and date of checkup were randomly chosen by using density sampling, resulting in 336 cases and 678 controls with FA measurements. GC was used to measure the FA composition in serum phospholipids. Cox proportional hazards regression was used to estimate the HRs and 95% CIs after adjusting for potential confounders. We examined the association of T2D risk with 25 different individual FAs and combinations of FAs. Results: T2D risk was positively associated with serum dihomo-γ-linoleic acid concentration (highest vs. lowest quartile: HR 1.49; 95% CI: 1.04, 2.11; P-trend = 0.02) and inversely associated with Δ5-desaturase activity (highest vs. lowest quartile: HR 0.72; 95% CI: 0.52, 0.99; P-trend = 0.02), independent of body mass index (BMI). There were also inverse associations between T2D risk and serum total n-6 (ω-6) polyunsaturated fatty acids (PUFAs), linoleic acid, and cis-vaccenic acid, but these were attenuated and became nonsignificant after adjustment for BMI. Serum n-3 (ω-3) PUFAs and saturated fatty acids (SFAs) were not associated with T2D risk.
    Conclusions: T2D risk was associated with circulating concentrations of the n-6 PUFA dihomo-γ-linoleic acid and Δ5-desaturase activity, but not with n-3 PUFA or SFA concentrations, in Japanese adults. © 2017 American Society for Nutrition.

  11. Quantification of dsDNA using the Hitachi F-7000 Fluorescence Spectrophotometer and PicoGreen dye.

    PubMed

    Moreno, Luis A; Cox, Kendra L

    2010-11-05

    Quantification of DNA, especially in small concentrations, is an important task with a wide range of biological applications including standard molecular biology assays such as synthesis and purification of DNA, diagnostic applications such as quantification of DNA amplification products, and detection of DNA molecules in drug preparations. During this video we will demonstrate the capability of the Hitachi F-7000 Fluorescence Spectrophotometer equipped with a Micro Plate Reader accessory to perform dsDNA quantification using Molecular Probes Quant-it PicoGreen dye reagent kit. The F-7000 Fluorescence Spectrophotometer offers high sensitivity and high speed measurements. It is a highly flexible system capable of measuring fluorescence, luminescence, and phosphorescence. Several measuring modes are available, including wavelength scan, time scan, photometry and 3-D scan measurement. The spectrophotometer has sensitivity in the range of 50 picomoles of fluorescein when using a 300 μL sample volume in the microplate, and is capable of measuring scan speeds of 60,000 nm/minute. It also has a wide dynamic range of up to 5 orders of magnitude which allows for the use of calibration curves over a wide range of concentrations. The optical system uses all reflective optics for maximum energy and sensitivity. The standard wavelength range is 200 to 750 nm, and can be extended to 900 nm when using one of the optional near infrared photomultipliers. The system allows optional temperature control for the plate reader from 5 to 60 degrees Celsius using an optional external temperature controlled liquid circulator. The microplate reader allows for the use of 96 well microplates, and the measuring speed for 96 wells is less than 60 seconds when using the kinetics mode. Software controls for the F-7000 and Microplate Reader are also highly flexible. Samples may be set in either column or row formats, and any combination of wells may be chosen for sample measurements. 
    This allows for optimal utilization of the microplate. Additionally, the software allows importing microplate sample configurations created in Excel and saved in comma-separated values (CSV) format. Microplate measuring configurations can be saved and recalled by the software for convenience and increased productivity. Data results can be output to a standard report, to Excel, or to an optional Report Generator Program.
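
    The calibration-curve workflow described above amounts to a linear fit of fluorescence against standard concentration, inverted to estimate an unknown; the standards and fluorescence readings below are hypothetical.

```python
# Sketch of a dsDNA calibration curve: least-squares linear fit of
# fluorescence vs. standard concentration, then inversion to quantify
# an unknown sample. Standards and readings are hypothetical values.

def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx  # slope, intercept

# Hypothetical PicoGreen standards (ng/mL) and fluorescence readings (a.u.)
conc = [0.0, 25.0, 50.0, 100.0]
fluor = [5.0, 130.0, 255.0, 505.0]
m, b = linear_fit(conc, fluor)

unknown_reading = 380.0
estimated = (unknown_reading - b) / m   # invert the fit for the unknown
print(f"estimated concentration: {estimated:.1f} ng/mL")
```

    The instrument's wide dynamic range is what makes a single linear curve usable over several orders of magnitude of concentration.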

  12. Quantification of dsDNA using the Hitachi F-7000 Fluorescence Spectrophotometer and PicoGreen Dye

    PubMed Central

    Moreno, Luis A.; Cox, Kendra L.

    2010-01-01

    Quantification of DNA, especially in small concentrations, is an important task with a wide range of biological applications including standard molecular biology assays such as synthesis and purification of DNA, diagnostic applications such as quantification of DNA amplification products, and detection of DNA molecules in drug preparations. During this video we will demonstrate the capability of the Hitachi F-7000 Fluorescence Spectrophotometer equipped with a Micro Plate Reader accessory to perform dsDNA quantification using Molecular Probes Quant-it PicoGreen dye reagent kit. The F-7000 Fluorescence Spectrophotometer offers high sensitivity and high speed measurements. It is a highly flexible system capable of measuring fluorescence, luminescence, and phosphorescence. Several measuring modes are available, including wavelength scan, time scan, photometry and 3-D scan measurement. The spectrophotometer has sensitivity in the range of 50 picomoles of fluorescein when using a 300 μL sample volume in the microplate, and is capable of measuring scan speeds of 60,000 nm/minute. It also has a wide dynamic range of up to 5 orders of magnitude which allows for the use of calibration curves over a wide range of concentrations. The optical system uses all reflective optics for maximum energy and sensitivity. The standard wavelength range is 200 to 750 nm, and can be extended to 900 nm when using one of the optional near infrared photomultipliers. The system allows optional temperature control for the plate reader from 5 to 60 degrees Celsius using an optional external temperature controlled liquid circulator. The microplate reader allows for the use of 96 well microplates, and the measuring speed for 96 wells is less than 60 seconds when using the kinetics mode. Software controls for the F-7000 and Microplate Reader are also highly flexible. Samples may be set in either column or row formats, and any combination of wells may be chosen for sample measurements. 
    This allows for optimal utilization of the microplate. Additionally, the software allows importing microplate sample configurations created in Excel and saved in comma-separated values (CSV) format. Microplate measuring configurations can be saved and recalled by the software for convenience and increased productivity. Data results can be output to a standard report, to Excel, or to an optional Report Generator Program. PMID:21189464

  13. Electrochemical evaluation of the corrosion resistance of cup-yoke-type dental magnetic attachments.

    PubMed

    Takada, Yukyo; Takahashi, Masatoshi; Kikuchi, Akira; Tenkumo, Taichi

    2014-01-01

    The corrosion resistance of different magnetic assemblies (Magfit DX800, Aichi Steel; Gigauss D800, GC; Hyper Slim 4013 and Hicorex Slim 4013, Hitachi Metals) was electrochemically evaluated using anodic polarization curves obtained in 0.9% NaCl solution at 37°C. The stainless steels (444, XM27, 447J1, and 316L) composing the magnetic assemblies were also examined as controls. All of the magnetic assemblies broke down at 0.6-1.1 V; however, their breakdown potentials were all still significantly higher (p<0.05) than that of 316L. The distribution of elements in the laser welding zone between the yoke and the shield ring was analyzed using EPMA; except with Magfit DX800, the Cr content of the shield-ring weld was greater than that of 316L. These magnetic assemblies are expected to have good corrosion resistance in the oral cavity, as their breakdown potentials are sufficiently higher than that of 316L, which is commonly used as a surgical implant material.

  14. Analytical interference of drugs in clinical chemistry: I--Study of twenty drugs on seven different instruments.

    PubMed

    Letellier, G; Desjarlais, F

    1985-12-01

    We have investigated the effect of 20 drugs on the accuracy of results obtained from seven instruments now widely used in clinical biochemistry laboratories: Abbott VP, aca II, Cobas Bio, Ektachem 400, Hitachi 705, KDA and SMAC. Eleven to 18 constituents were analysed on each instrument. Our results lead us to the following conclusions: (1) only rarely does drug interference with a method lead to a clinically significant change in a measured value; (2) the magnitude of the change may relate linearly or non-linearly to the drug concentration but is usually independent of the target analyte concentration; (3) interference with a chemical reaction on one instrument does not always mean that the same reaction will be altered in the same way on other instruments; (4) no interferences were found for drugs with therapeutic levels in the low micromolar range; (5) in most cases the interference could not be predicted from the chemical nature of the drug.

  15. Urine phenobarbital drug screening: potential use for compliance assessment in neonates.

    PubMed

    Guillet, Ronnie; Kwon, Jennifer M; Chen, Sixaio; McDermott, Michael P

    2012-02-01

    This study was done to determine whether urine phenobarbital measurements provide a reliable indicator of the presence of the drug in neonates. Urine was collected from neonates treated with phenobarbital for clinical indications, within 4 to 6 hours of clinically indicated collection of serum phenobarbital levels. Urine samples were also collected from control neonates not treated with phenobarbital. One aliquot was assayed fresh; another was frozen at -30°C and assayed 1 to 3 months later. Phenobarbital was assayed using the ONLINE TDM assay on a Roche/Hitachi automated clinical chemistry analyzer. Serum and urine concentrations were compared, as were fresh and frozen urine measurements. Serum phenobarbital ranged from 5.6 to 52.7 μg/mL. Matched urine samples were 56.6 ± 12.5% of the serum level. Frozen samples were 98.3 ± 8.0% of the fresh samples. Urine phenobarbital concentrations, either fresh or frozen, can be used in neonates as a noninvasive estimate of drug levels.
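
    The reported urine-to-serum ratio suggests a simple back-calculation. A minimal sketch, assuming the mean ratio of 56.6% from the abstract and a hypothetical urine measurement:

```python
# Back-of-envelope estimate of serum phenobarbital from a urine level,
# using the mean urine/serum ratio reported in the abstract (56.6%).
# The urine value below is hypothetical; the wide spread (+/- 12.5%)
# means this is a rough screen, not a substitute for a serum level.

URINE_TO_SERUM_RATIO = 0.566

def estimate_serum_from_urine(urine_ug_ml):
    return urine_ug_ml / URINE_TO_SERUM_RATIO

serum_est = estimate_serum_from_urine(17.0)  # hypothetical urine level
print(f"estimated serum level: {serum_est:.1f} ug/mL")
```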

  16. Evaluation of the Technicon Axon analyser.

    PubMed

    Martínez, C; Márquez, M; Cortés, M; Mercé, J; Rodriguez, J; González, F

    1990-01-01

    An evaluation of the Technicon Axon analyser was carried out following the guidelines of the 'Sociedad Española de Química Clínica' and the European Committee for Clinical Laboratory Standards. A photometric study revealed acceptable results at both 340 nm and 404 nm. Inaccuracy and imprecision were lower at 404 nm than at 340 nm, although poor dispersion was found at both wavelengths, even at low absorbances. Drift was negligible; the imprecision of the sample pipette delivery system was greater for small sample volumes; the reagent pipette delivery system imprecision was acceptable; and the sample diluting system showed good precision and accuracy. Twelve analytes were studied to evaluate the analyser under routine working conditions. Satisfactory results were obtained for within-run imprecision, while coefficients of variation for between-run imprecision were much greater than expected. Neither specimen-related nor specimen-independent contamination was found in the carry-over study. For all analytes assayed, acceptable relative inaccuracy was observed when comparing patient sample results with those obtained on a Hitachi 737 analyser.

  17. The variation of the strength of neck extensor muscles and semispinalis capitis muscle size with head and neck position.

    PubMed

    Rezasoltani, A; Nasiri, R; Faizei, A M; Zaafari, G; Mirshahvelayati, A S; Bakhshidarabad, L

    2013-04-01

    The semispinalis capitis muscle (SECM) is a massive, long cervico-thoracic muscle that functions as a main head and neck extensor. The aim of this study was to determine the effect of head and neck position on the strength of the neck extensor muscles and the size of the SECM in healthy subjects. Thirty healthy female students voluntarily participated in this study. An ultrasonography apparatus (Hitachi EUB 525) and a tension-meter system were used to scan the right SECM at the level of the third cervical vertebra and to measure the strength of the neck extensor muscles in three head and neck positions. The neck extensor muscles were stronger in the neutral position than in flexion or extension, while the SECM was larger in extension than in neutral or flexion. The force generation capacity of the main neck extensor muscle was lower in the flexed and extended head and neck positions than in the neutral position. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Imaging and elemental mapping of biological specimens with a dual-EDS dedicated scanning transmission electron microscope.

    PubMed

    Wu, J S; Kim, A M; Bleher, R; Myers, B D; Marvin, R G; Inada, H; Nakamura, K; Zhang, X F; Roth, E; Li, S Y; Woodruff, T K; O'Halloran, T V; Dravid, Vinayak P

    2013-05-01

    A dedicated analytical scanning transmission electron microscope (STEM) with dual energy dispersive spectroscopy (EDS) detectors has been designed for complementary high-performance imaging as well as high-sensitivity elemental analysis and mapping of biological structures. The performance of this new design, based on a Hitachi HD-2300A model, was evaluated using a variety of biological specimens. With three imaging detectors, both the surface and internal structure of cells can be examined simultaneously. Whole-cell elemental mapping, especially of heavier metal species that have a low cross-section for electron energy loss spectroscopy (EELS), can be faithfully obtained. Optimization of STEM imaging conditions is applied to thick as well as thin sections of biological cells under low-dose conditions at room and cryogenic temperatures. Such multimodal capabilities applied to soft/biological structures usher in a new era for analytical studies of biological systems. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. Necroptosis may be a novel mechanism for cardiomyocyte death in acute myocarditis.

    PubMed

    Zhou, Fei; Jiang, Xuejun; Teng, Lin; Yang, Jun; Ding, Jiawang; He, Chao

    2018-05-01

    In this study, we investigated the roles of RIP1/RIP3 mediated cardiomyocyte necroptosis in CVB3-induced acute myocarditis. Serum concentrations of creatinine kinase (CK), CK-MB, and cardiac troponin I were detected using a Hitachi Automatic Biochemical Analyzer in a mouse model of acute VMC. Histological changes in cardiac tissue were observed by light microscope and expression levels of RIP1/RIP3 in the cardiac tissue were detected via Western blot and immunohistochemistry. The data showed that RIP1/RIP3 was highly expressed in cardiomyocytes in the acute VMC mouse model and that the necroptosis pathway specific blocker, Nec-1, dramatically reduced the myocardial damage by downregulating the expression of RIP1/RIP3. These findings provide evidence that necroptosis plays a significant role in cardiomyocyte death and it is a major pathway for cell death in acute VMC. Blocking the necroptosis pathway may serve as a new therapeutic option for the treatment of acute viral myocarditis.

  20. Development of the water-lubricated thrust bearing of the hydraulic turbine generator

    NASA Astrophysics Data System (ADS)

    Inoue, K.; Deguchi, K.; Okude, K.; Fujimoto, R.

    2012-11-01

    In a hydropower plant, large quantities of turbine oil are used as machine control pressure oil and lubricating oil. If oil leaks out of the plant, it flows into a river, and such an oil spill has an adverse effect on the natural environment because the oil does not degrade easily. Therefore the KANSAI and Hitachi Mitsubishi Hydro developed a water-lubricated thrust bearing for vertical-type hydraulic turbine generators. The water-lubricated bearing has the advantage of avoiding the risk of river pollution because it needs no oil. To advance the development of the water-lubricated thrust bearing, we studied the following items. The first was the examination of trial products of the water lubricating liquid. The second was the study of a bearing structure that can satisfy bearing performance requirements such as temperature characteristics. The third was mock-up testing for actual application in the future. As a result, it was found that the water-lubricated thrust bearing is technically applicable to actual equipment.

  1. Plant maintenance and plant life extension issue, 2009

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agnihotri, Newal

    The focus of the March-April issue is on plant maintenance and plant life extension. Major articles include the following: Application of modeling and simulation to nuclear power plants, by Berry Gibson, IBM, and Rolf Gibbels, Dassault Systems; Steam generators with tight manufacturing procedures, by Ei Kadokami, Mitsubishi Heavy Industries; SG design based on operational experience and R and D, by Jun Tang, Babcock and Wilcox Canada; Confident to deliver reliable performance, by Bruce Bevilacqua, Westinghouse Nuclear; An evolutionary plant design, by Martin Parece, AREVA NP, Inc.; and, Designed for optimum production, by Danny Roderick, GE Hitachi Nuclear Energy. Industry Innovation articles include: Controlling alloy 600 degradation, by John Wilson, Exelon Nuclear Corporation; Condensate polishing innovation, by Lewis Crone, Dominion Millstone Power Station; Reducing deposits in steam generators, by the Electric Power Research Institute; and, Minimizing Radiological effluent releases, by the Electric Power Research Institute. The plant profile article is titled 2008 - a year of 'firsts' for AmerenUE's Callaway plant, by Rick Eastman, AmerenUE.

  2. Trace Assessment for BWR ATWS Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, L. Y.; Diamond, D.; Cuadra, Arantxa; Raitses, Gilad; Aronson, Arnold

    2010-04-22

    A TRACE/PARCS input model has been developed in order to be able to analyze anticipated transients without scram (ATWS) in a boiling water reactor. The model is based on one developed previously for the Browns Ferry reactor for doing loss-of-coolant accident analysis. This model was updated by adding the control systems needed for ATWS and a core model using PARCS. The control systems were based on models previously developed for the TRAC-B code. The PARCS model is based on information (e.g., exposure and moderator density (void) history distributions) obtained from General Electric Hitachi and cross sections for GE14 fuel obtained from an independent source. The model is able to calculate an ATWS, initiated by the closure of main steam isolation valves, with recirculation pump trip, water level control, injection of borated water from the standby liquid control system and actuation of the automatic depressurization system. The model is not considered complete and recommendations are made on how it should be improved.

  3. Measures of Urinary Protein and Albumin in the Prediction of Progression of IgA Nephropathy

    PubMed Central

    Zhao, Yan-feng; Liu, Li-jun; Shi, Su-fang; Lv, Ji-cheng; Zhang, Hong

    2016-01-01

    Background and objectives Proteinuria is an independent predictor for IgA nephropathy (IgAN) progression. Urine albumin-to-creatinine ratio (ACR), protein-to-creatinine ratio, and 24-hour urine protein excretion (UPE) are widely used for proteinuria evaluation in clinical practice. Here, we evaluated the association of these measurements with clinical and histologic findings of IgAN and explored which was the best predictor of IgAN prognosis. Design, setting, participants, & measurements Patients with IgAN were followed up for ≥12 months, were diagnosed between 2003 and 2012, and had urine samples available (438 patients). Spot urine ACR, protein-to-creatinine ratio, and 24-hour UPE at the time of renal biopsy were measured on a Hitachi Automatic Biochemical Analyzer 7180 (Hitachi, Yokohama, Japan). Results In our patients, ACR, protein-to-creatinine ratio, and 24-hour UPE were highly correlated (correlation coefficients: 0.71–0.87). They showed good relationships with acknowledged markers reflecting IgAN severity, including eGFR, hypertension, and the biopsy parameter (Oxford severity of tubular atrophy/interstitial fibrosis parameter). However, only ACR presented with positive association with the Oxford segmental glomerulosclerosis/adhesion parameter and extracapillary proliferation lesions. The follow-up time was 37.0 (22.0–58.0) months, with the last follow-up on April 18, 2014. In total, 124 patients reached the composite end point (30% eGFR decline, ESRD, or death). In univariate survival analysis, ACR consistently had better performance than protein-to-creatinine ratio and 24-hour UPE as represented by higher area under the curve using time–dependent survival analysis. When adjusted for well known risk factors for IgAN progression, ACR was most significantly associated with the composite end point (hazard ratio, 1.56 per 1-SD change of standard normalized square root–transformed ACR; 95% confidence interval, 1.29 to 1.89; P<0.001). 
Compared with protein-to-creatinine ratio and 24-hour UPE, addition of ACR to traditional risk factors resulted in more improvement in the predictive ability of IgAN progression (c statistic: ACR=0.70; protein-to-creatinine ratio=0.68; 24-hour UPE=0.69; Akaike information criterion: ACR=1217.85; protein-to-creatinine ratio=1229.28; 24-hour UPE=1234.96; P<0.001). Conclusions In IgAN, ACR, protein-to-creatinine ratio, and 24-hour UPE had comparable association with severe clinical and histologic findings. Compared with protein-to-creatinine ratio and 24-hour UPE, ACR showed slightly better performance in predicting IgAN progression. PMID:27026518

  4. TU-FG-BRB-08: Challenges, Limitations and Future Outlook Towards Clinical Translation of Proton Acoustic Range Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yousefi, S; Ahmad, M; Xiang, L

    Purpose: To report our investigations of proton acoustic imaging, including computer simulations and preliminary experimental studies at clinical facilities. The ultimate achievable accuracy, sensitivity and clinical translation challenges are discussed. Methods: The acoustic pulse due to the pressure rise was estimated using a finite element model. Since the ionoacoustic pulse is highly dependent on the proton pulse width and energy, multiple pulse widths were studied. Based on the signal spectrum received at the piezoelectric ultrasound transducer, with consideration of random thermal noise, the maximum spatial resolution of the proton-acoustic imaging modality was calculated. The simulation studies defined the design specifications of the system to detect the proton acoustic signal from Hitachi and Mevion clinical machines. A 500 kHz hydrophone with 100 dB amplification was set up in a water tank placed in front of the proton nozzle. A 40 MHz data acquisition system was synchronized by a trigger signal provided by the machine. Results: Given 30-800 mGy dose per pulse at the Bragg peak, the minimum number of protons detectable by the proton acoustic technique was on the order of 10×10^6 per pulse. Broader pulse widths produce signals with lower acoustic frequencies, with 10 µs pulses producing signals with frequencies below 100 kHz. As the proton beam pulse width increases, a higher dose rate is required to measure the acoustic signal. Conclusion: We have established the minimal detection limit for proton-acoustic range validation for a variety of pulse parameters. Our study indicated that practical proton-acoustic range verification can be feasible with a pulse shorter than 10 µs, 5×10^6 protons/pulse, a 50 nA beam current and a highly sensitive ultrasonic transducer. 
The translational challenges for current clinical machines include proper magnetic shielding of the measurement equipment, providing a clean trigger signal from the proton machine, providing a shorter proton beam pulse and a higher dose per pulse.

  5. PREFACE Preface

    NASA Astrophysics Data System (ADS)

    Takahashi, Migaku; Saito, Hitoshi; Yoshimura, Satoru; Takanashi, Koki; Sahashi, Masashi; Tsunoda, Masakiyo

    2011-01-01

    The 2nd International Symposium on Advanced Magnetic Materials and Applications 2010 (ISAMMA 2010) was held in Sendai, Japan, from 12 to 16 July 2010. ISAMMA is the first consolidated symposium of three independent symposia held in the Asian region: ISPMM (International Symposium on Physics of Magnetic Materials) of Japan, first held in 1987 in Sendai and subsequently held five times, in Beijing (1992), Seoul (1995), Sendai (1998), Taipei (2001), and Singapore (2005); ISAMT (International Symposium of Advanced Magnetic Technology) of Taiwan; and SOMMA (International Symposium on Magnetic Materials and Applications) of Korea, the latter two of which were started in 1999 and were held five times up to 2005. ISAMMA was established as a new international symposium to be held every 3 years in Asia. The concept of this unified symposium was mainly developed by Prof. M. Takahashi, Conference Chair of ISAMMA 2010. The first symposium, ISAMMA 2007, was held on Jeju Island, Korea, from 28 May to 1 June 2007. The main purpose and scope of the ISAMMA conferences is to provide an opportunity for scientists and engineers from all over the world to meet in Asia to discuss recent advances in the study of magnetic materials and their physics, and spin-related phenomena and materials. The categories of ISAMMA 2010 were: Fundamental Properties of Magnetic Materials; Hard/Soft Magnetic Materials and Applications; Spintronics Materials and Devices; Structured Materials; Multi Functional Magnetic Materials; Spin Dynamics and Micromagnetics; Magnetic Storage; Materials for Applications (Sensors, High Frequency, Power, and Bio/Medical devices); Magnetic Imaging and Characterization. The scientific program commenced on Tuesday 13 July 2010 with opening remarks by the Symposium Chairman, and plenary talks were presented by T Rasing, P Fischer, H Yoda and S Sugimoto.
    The conference was attended by 511 participants from 23 countries, with about 40 percent of participants attending from overseas. The program involved 4 plenary talks (45 minutes each), 37 invited talks (30 minutes), 85 contributed talks (15 minutes), and 352 posters.

    Organizing Committee of ISAMMA 2010: M Takahashi (Tohoku Univ., Japan), Chairman; K Takanashi (Tohoku Univ., Japan), Chair of the Program Committee; H Saito (Akita Univ., Japan), Chair of the Publication Committee; M Sahashi (Tohoku Univ., Japan), Chair of the Treasury Committee; M Tsunoda (Tohoku Univ., Japan), General Secretary; H Akinaga (AIST, Japan); H Fukunaga (Nagasaki Univ., Japan); K Hono (NIMS, Japan); S Ishio (Akita Univ., Japan); S Iwata (Nagoya Univ., Japan); K Nakagawa (Nihon Univ., Japan); S Nakagawa (Tokyo Inst. of Tech., Japan); T Ono (Kyoto Univ., Japan); Y Suzuki (Osaka Univ., Japan); M Tanaka (Ehime Univ., Japan); T Tanaka (Univ. of Tokyo, Japan).

    Program Committee of ISAMMA 2010: K Takanashi (Tohoku Univ., Japan), Chair; M Mizuguchi (Tohoku Univ., Japan), Vice-chair; M Doi (Tohoku Univ., Japan); A Fujita (Tohoku Univ., Japan); K Ishiyama (Tohoku Univ., Japan); T Kato (Nagoya Univ., Japan); T Kawagoe (Osaka Pref. Univ. of Edu., Japan); O Kitakami (Tohoku Univ., Japan); Y Kitamoto (Tokyo Inst. of Tech., Japan); F Matsukura (Tohoku Univ., Japan); C Mitsumata (Hitachi Metals, Japan); S Mizukami (Tohoku Univ., Japan); H Naganuma (Tohoku Univ., Japan); S Nakagawa (Tokyo Inst. of Tech., Japan); K Nakamura (Tohoku Univ., Japan); K Ono (KEK, Japan); T Ono (Kyoto Univ., Japan); F Sato (Tohoku Univ., Japan); M Shirai (Tohoku Univ., Japan); S Sugimoto (Tohoku Univ., Japan); M Yamaguchi (Tohoku Univ., Japan).

    Publication Committee of ISAMMA 2010: H Saito (Akita Univ., Japan), Chair; S Yoshimura (Akita Univ., Japan), Vice-chair; Y Ando (Tohoku Univ., Japan); J Ariake (AIT, Japan); H Asano (Nagoya Univ., Japan); M Futamoto (Chuo Univ., Japan); J Hayakawa (Hitachi, ARL, Japan); T Honda (Kyushu Inst. of Tech., Japan); M Igarashi (Hitachi, CRL, Japan); H Ito (Kansai Univ., Japan); H Iwasaki (Toshiba, Japan); H Kato (Yamagata Univ., Japan); M Konoto (AIST, Japan); H Kubota (AIST, Japan); S Mitani (NIMS, Japan); H Muraoka (Tohoku, Japan); M Nakano (Nagasaki Univ., Japan); R Nakatani (Osaka Univ., Japan); K O'Grady (Univ. of York, UK); A Sakuma (Tohoku Univ., Japan); T Sato (Keio Univ., Japan); T Sato (Shinshu Univ., Japan); K Tajima (Akita Univ., Japan); M Takeda (JAEA, Japan); Y Takemura (Yokohama Nat'l Univ., Japan); M Tanaka (Univ. of Tokyo, Japan); A Tsukamoto (Nihon Univ., Japan); S Yabukami (Tohoku Gakuin Univ., Japan).

    Treasury Committee of ISAMMA 2010: M Sahashi (Tohoku Univ., Japan), Chair; K Ishiyama (Tohoku Univ., Japan); K Nakagawa (Nihon Univ., Japan); T Ogawa (Tohoku Univ., Japan); S Saito (Tohoku Univ., Japan); T Tanaka (Ehime Univ., Japan); N Tezuka (Tohoku Univ., Japan).

    Executive Committee of ISAMMA 2010: M Takahashi (Tohoku Univ., Japan), Chair; K Takanashi (Tohoku Univ., Japan), Vice-chair; K Miyake (Tohoku Univ., Japan); T Ogawa (Tohoku Univ., Japan); S Okamoto (Tohoku Univ., Japan); M Oogane (Tohoku Univ., Japan); S Saito (Tohoku Univ., Japan); Y Sakuraba (Tohoku Univ., Japan); T Shima (Tohoku Gakuin Univ., Japan); N Tezuka (Tohoku Univ., Japan); M Tsunoda (Tohoku Univ., Japan).

    We are grateful to all the participants for their valuable contributions and active discussions.
We gratefully acknowledge the financial support from 17 Japanese companies (ASAKA RIKEN CO., LTD, Fujikin Incorporated, Furukawa Electric Co., Ltd, Hitachi Metals, Ltd, IZUMI-TEC CO., LTD, Miwa Electric Industrial CO., LTD, MIWA MFG CO., LTD, NEOARK Corporation, Optima Corporation, PRESTO CO., LTD, SHOWA DENKO K.K., TAIYO YUDEN CO., LTD, TDK Corporation, TEIJIN LIMITED, Ube Material Industries, Ltd, ULVAC, Inc, and V TEX Corporation) and 7 foundations (SENDAI TOURISM & CONVENTION BUREAU, The Iwatani Naoji Foundation, Tohoku University Electro-Related Departments Global COE Program 'Center of Education and Research for Information Electronics Systems', The Murata Science Foundation, Research Foundation for Materials Science, Nippon Sheet Glass Foundation for Materials Science and Engineering, and Aoba Foundation for The Promotion of Engineering).

  6. The role of total laboratory automation in a consolidated laboratory network.

    PubMed

    Seaberg, R S; Stallone, R O; Statland, B E

    2000-05-01

    In an effort to reduce overall laboratory costs and improve overall laboratory efficiencies at all of its network hospitals, the North Shore-Long Island Health System recently established a Consolidated Laboratory Network with a Core Laboratory at its center. We established and implemented a centralized Core Laboratory designed around the Roche/Hitachi CLAS Total Laboratory Automation system to perform the general and esoteric laboratory testing throughout the system in a timely and cost-effective fashion. All remaining STAT testing will be performed within the Rapid Response Laboratories (RRLs) at each of the system's hospitals. Results for this laboratory consolidation and implementation effort demonstrated a decrease in labor costs and improved turnaround time (TAT) at the core laboratory. Anticipated system savings are approximately $2.7 million. TATs averaged 1.3 h within the Core Laboratory and less than 30 min in the RRLs. When properly implemented, automation systems can reduce overall laboratory expenses, enhance patient services, and address the overall concerns facing the laboratory today: job satisfaction, decreased length of stay, and safety. The financial savings realized are primarily a result of labor reductions.

  7. Atomic imaging using secondary electrons in a scanning transmission electron microscope: experimental observations and possible mechanisms.

    PubMed

    Inada, H; Su, D; Egerton, R F; Konno, M; Wu, L; Ciston, J; Wall, J; Zhu, Y

    2011-06-01

    We report detailed investigation of high-resolution imaging using secondary electrons (SE) with a sub-nanometer probe in an aberration-corrected transmission electron microscope, Hitachi HD2700C. This instrument also allows us to acquire the corresponding annular dark-field (ADF) images both simultaneously and separately. We demonstrate that atomic SE imaging is achievable for a wide range of elements, from uranium to carbon. Using the ADF images as a reference, we studied the SE image intensity and contrast as functions of applied bias, atomic number, crystal tilt, and thickness to shed light on the origin of the unexpected ultrahigh resolution in SE imaging. We have also demonstrated that the SE signal is sensitive to the terminating species at a crystal surface. A possible mechanism for atomic-scale SE imaging is proposed. The ability to image both the surface and bulk of a sample at atomic-scale is unprecedented, and can have important applications in the field of electron microscopy and materials characterization. Copyright © 2010 Elsevier B.V. All rights reserved.

  8. Diagnostic performance of the EMIT-tox benzodiazepine immunoassay, FPIA serum benzodiazepine immunoassay, and radioreceptor assay in suspected acute poisoning.

    PubMed

    Verstraete, A G; Belpaire, F M; Leroux-Roels, G G

    1998-01-01

    We evaluated the diagnostic performance of the EMIT-tox serum benzodiazepine assay adapted to a Hitachi 717 analyzer (EMIT), the Abbott ADx serum benzodiazepine fluorescence polarization immunoassay (FPIA), and a radioreceptor assay (RRA) in 113 patients with suspected acute poisoning. The reference method was high-performance liquid chromatography with ultraviolet detection after solid-phase extraction. For the discrimination between negative and positive samples, the areas under the receiver-operating characteristic (ROC) curves were 0.976, 0.991, and 0.991 for EMIT (cutoff, 50-ng/mL diazepam), FPIA (cutoff, 12-ng/mL nordiazepam), and RRA (cutoff, 50-ng/mL diazepam), respectively. For the discrimination between non-toxic and toxic concentrations, the areas under the ROC curves were 0.896, 0.893, and 0.933, respectively. EMIT (with the cutoff lowered to 50 ng/mL), FPIA, and RRA can be reliably used to screen for the presence of benzodiazepines in serum, but in many cases they cannot discriminate between toxic and nontoxic concentrations.

  9. Secondary barrier construction for vessels carrying spherical low temperature liquefied gas storage tanks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okamoto, T.; Nishimoto, T.; Sawada, K.

    1978-05-16

    To simplify and thus reduce the cost of the secondary barrier for spherical LNG storage tanks onboard ocean-transport vessels, Japan's Hitachi Shipbuilding and Engineering Co., Ltd., has developed a new secondary-containment system that allows easy installation directly on the cargo hold's bottom plate beneath the spherical tank. The new system comprises at least two layers of rigid-foam synthetic resin sprayed on the hold plates and covered by a layer of glass mesh and adhesive. Alternatively, the layers of synthetic resin, glass mesh, and adhesive are applied to plywood attached to the hold plates by joists, thus forming an air space between the secondary barrier and the hold plates. Where the hold plates have a multisurface construction, (1) laminated rigid urethane foam blocks are butted end-to-end and are bonded to each other and to the plywood sheets at the corners between adjacent hold plates, (2) the spray-formed layers are applied between the blocks, and (3) the entire assembly is covered by a protective layer of glass mesh and adhesive.

  10. Prime Contract Awards Alphabetically by Contractor, by State or Country, and Place, FY 88. Part 10. (Hitachi Instruments, Inc.-International Cordage, Inc.)

    DTIC Science & Technology

    1988-01-01


  11. Development and Validation of a Kit to Measure Drink Antioxidant Capacity Using a Novel Colorimeter.

    PubMed

    Priftis, Alexandros; Stagos, Dimitrios; Tzioumakis, Nikolaos; Konstantinopoulos, Konstantinos; Patouna, Anastasia; Papadopoulos, Georgios E; Tsatsakis, Aristides; Kouretas, Dimitrios

    2016-08-30

    Measuring the antioxidant capacity of foods is essential as a means of quality control, ensuring that the final product reaching the consumer is of a high standard. Despite the existing assays with which antioxidant activity is estimated, new, faster and lower-cost methods are always sought. Therefore, we have developed a novel colorimeter and combined it with a slightly modified DPPH assay, thus creating a kit that can assess the antioxidant capacity of liquids (e.g., different types of coffee, beer, wine, juices) in a fast and low-cost manner. The accuracy of the colorimeter was ensured by comparing it to a fully validated Hitachi U-1900 spectrophotometer, and a coefficient was calculated to eliminate the observed differences. In addition, new, user-friendly software was developed to render the procedure as easy as possible while allowing central monitoring of the obtained results. Overall, a novel kit was developed with which the antioxidant activity of liquids can be measured, firstly to ensure their quality and secondly to assess the amount of antioxidants consumed with the respective food.
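    The DPPH assay underlying the kit is conventionally evaluated with the standard radical-scavenging formula, sketched below. The absorbance values are hypothetical, and the details of the paper's "slightly modified" protocol are not reproduced here.

```python
# Sketch of the standard DPPH radical-scavenging calculation: antioxidant
# capacity is expressed as the percentage drop in DPPH absorbance
# (typically read near 517 nm). Absorbance values are hypothetical.

def dpph_scavenging_percent(abs_control: float, abs_sample: float) -> float:
    """% inhibition = (A_control - A_sample) / A_control * 100."""
    return (abs_control - abs_sample) / abs_control * 100.0

a_control = 0.80  # DPPH blank absorbance (hypothetical)
a_sample = 0.32   # absorbance after reaction with the drink (hypothetical)
print(round(dpph_scavenging_percent(a_control, a_sample), 1))
```

    A calibration coefficient like the one the authors derived against the Hitachi U-1900 would simply rescale the colorimeter's absorbance readings before they enter this formula.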

  12. Cardiac-Specific Conversion Factors to Estimate Radiation Effective Dose From Dose-Length Product in Computed Tomography.

    PubMed

    Trattner, Sigal; Halliburton, Sandra; Thompson, Carla M; Xu, Yanping; Chelliah, Anjali; Jambawalikar, Sachin R; Peng, Boyu; Peters, M Robert; Jacobs, Jill E; Ghesani, Munir; Jang, James J; Al-Khalidi, Hussein; Einstein, Andrew J

    2018-01-01

    This study sought to determine updated conversion factors (k-factors) that would enable accurate estimation of radiation effective dose (ED) for coronary computed tomography angiography (CTA) and calcium scoring performed on 12 contemporary scanner models and current clinical cardiac protocols and to compare these methods to the standard chest k-factor of 0.014 mSv·mGy⁻¹·cm⁻¹. Accurate estimation of ED from cardiac CT scans is essential to meaningfully compare the benefits and risks of different cardiac imaging strategies and optimize test and protocol selection. Presently, ED from cardiac CT is generally estimated by multiplying a scanner-reported parameter, the dose-length product, by a k-factor which was determined for noncardiac chest CT, using single-slice scanners and a superseded definition of ED. Metal-oxide-semiconductor field-effect transistor radiation detectors were positioned in organs of anthropomorphic phantoms, which were scanned using all cardiac protocols, 120 clinical protocols in total, on 12 CT scanners representing the spectrum of scanners from 5 manufacturers (GE, Hitachi, Philips, Siemens, Toshiba). Organ doses were determined for each protocol, and ED was calculated as defined in International Commission on Radiological Protection Publication 103. Effective doses and scanner-reported dose-length products were used to determine k-factors for each scanner model and protocol. k-Factors averaged 0.026 mSv·mGy⁻¹·cm⁻¹ (95% confidence interval: 0.0258 to 0.0266) and ranged between 0.020 and 0.035 mSv·mGy⁻¹·cm⁻¹. The standard chest k-factor underestimates ED by an average of 46%, ranging from 30% to 60%, depending on scanner, mode, and tube potential. Factors were higher for prospective axial versus retrospective helical scan modes, calcium scoring versus coronary CTA, and higher (100 to 120 kV) versus lower (80 kV) tube potential and varied among scanner models (range of average k-factors: 0.0229 to 0.0277 mSv·mGy⁻¹·cm⁻¹).
Cardiac k-factors for all scanners and protocols are considerably higher than the k-factor currently used to estimate ED of cardiac CT studies, suggesting that radiation doses from cardiac CT have been significantly and systematically underestimated. Using cardiac-specific factors can more accurately inform the benefit-risk calculus of cardiac-imaging strategies. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
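    The conversion the study examines is a single multiplication, ED = k × DLP, so the practical impact of the updated factors is easy to sketch. The DLP value below is hypothetical; the two k-factors are the ones quoted in the abstract.

```python
# Sketch of the effective-dose estimate discussed above:
# ED (mSv) = k-factor (mSv/(mGy*cm)) * dose-length product (mGy*cm).
# The DLP value is hypothetical; the k-factors are those quoted above.

CHEST_K = 0.014         # standard chest k-factor, mSv/(mGy*cm)
CARDIAC_K_MEAN = 0.026  # mean cardiac k-factor reported by the study

def effective_dose_msv(dlp_mgy_cm: float, k_factor: float) -> float:
    """Effective dose in mSv from a scanner-reported DLP."""
    return dlp_mgy_cm * k_factor

dlp = 800.0  # mGy*cm, hypothetical coronary CTA scan
print(round(effective_dose_msv(dlp, CHEST_K), 1))         # chest k-factor
print(round(effective_dose_msv(dlp, CARDIAC_K_MEAN), 1))  # cardiac k-factor
```

    For this hypothetical scan the chest factor yields 11.2 mSv against 20.8 mSv with the mean cardiac factor, i.e. an underestimate of about 46%, consistent with the average the study reports.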

  13. Effects of annealing on the structure and magnetic properties of Fe80B20 magnetostrictive fibers.

    PubMed

    Zhu, Qianke; Zhang, Shuling; Geng, Guihong; Li, Qiushu; Zhang, Kewei; Zhang, Lin

    2016-07-04

    Fe80B20 amorphous alloys exhibit excellent soft magnetic properties, high abrasive resistance and outstanding corrosion resistance. In this work, Fe80B20 amorphous micro-fibers with an HC of 3.33 Oe were fabricated for the first time, and the effects of annealing temperature on the structure and magnetic properties of the fibers were investigated. The fibers were prepared by the single-roller melt-spinning method. The structures of as-spun and annealed fibers were investigated by X-ray diffraction (XRD) (PANalytical X'Pert Powder) using Cu Kα radiation. The morphology of the fibers was observed by scanning electron microscopy (SEM) (Hitachi S-4800). Differential scanning calorimetry (DSC) measurements of the fibers were performed on a Mettler Toledo TGA/DSC1 device under N2 protection. A vibrating sample magnetometer (VSM, VersaLab) was used to examine the magnetic properties of the fibers, and the resonance behavior of the fibers was characterized by an impedance analyzer (Agilent 4294A) with a home-made copper coil. The XRD patterns show that the fibers retain an amorphous structure until the annealing temperature reaches 500°C. The DSC results show that the crystallization temperature of the fibers is 449°C, and the crystallization activation energy is calculated to be 221 kJ/mol using the Kissinger formula. The SEM images show that a few dendrites appear at the fiber surface after annealing. The results indicate that the coercivities HC (//) and HC (⊥) increase slightly with increasing annealing temperature up to 400°C, and then increase dramatically with further increases in annealing temperature, owing to significant increases in magneto-crystalline anisotropy and magneto-elastic anisotropy. The Q value first increases slightly as the annealing temperature rises from room temperature (RT) to 300°C, then decreases up to 400°C. 
Eventually, the Q value increases to ~2000 at an annealing temperature of 500°C. In summary, Fe80B20 amorphous fibers with a diameter of 60 μm were prepared by the single-roller melt-spinning method and annealed at 200°C, 300°C, 400°C, and 500°C, respectively. XRD results indicate that the fiber structure remains amorphous when the annealing temperature is below 400°C. An α-Fe phase and an Fe3B phase appear when the annealing temperature rises to 500°C, above the crystallization temperature of 449°C. The recrystallization activation energy is calculated to be 221 kJ/mol. The coercivity increases with increasing annealing temperature, which is attributable to the increase in total anisotropy. All the as-spun and annealed fibers exhibit good resonance behavior for magnetostrictive sensors.
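    The Kissinger formula used above relates DSC peak temperature Tp at several heating rates β to the activation energy Ea via ln(β/Tp²) = −Ea/(R·Tp) + C, so Ea follows from the slope of ln(β/Tp²) versus 1/Tp. A sketch with synthetic (β, Tp) pairs generated to be consistent with Ea ≈ 221 kJ/mol; they are not data from the paper.

```python
import math

# Hedged sketch of the Kissinger method for estimating the crystallisation
# activation energy Ea from DSC peak temperatures Tp (K) at heating rates
# beta:  ln(beta / Tp^2) = -Ea / (R * Tp) + C.
# The (beta, Tp) pairs below are synthetic, for illustration only.

R = 8.314  # gas constant, J/(mol*K)

def kissinger_ea(points):
    """Least-squares slope of ln(beta/Tp^2) vs 1/Tp gives -Ea/R."""
    xs = [1.0 / tp for _, tp in points]
    ys = [math.log(beta / tp**2) for beta, tp in points]
    n = len(points)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
             / sum((x - x_mean) ** 2 for x in xs))
    return -slope * R  # Ea in J/mol

# Synthetic data consistent with Ea ~ 221 kJ/mol around Tp ~ 722 K:
points = [(5.0, 709.3), (10.0, 722.0), (20.0, 735.1), (40.0, 748.7)]
print(round(kissinger_ea(points) / 1000))  # Ea in kJ/mol
```

    In practice Tp is read off the DSC crystallization peak at each heating rate; the linearity of the Kissinger plot is itself a check on the analysis.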

  14. Simultaneous effect of crystal lattice and non magnetic substitution on magnetic properties of barium hexaferrite

    NASA Astrophysics Data System (ADS)

    Kumar, Sunil; Supriya, Sweety; Pradhan, Lagen Kumar; Pandey, Rabichandra; Kar, Manoranjan

    2018-05-01

    Aluminium-doped barium hexaferrites BaFe12-xAlxO19 with x = 0.0, 1.0, 2.0, 4.0 and 6.0 have been synthesized by the sol-gel method to modify the magnetic properties for technological applications. The crystal structure and phase purity of all the samples were explored by the X-ray diffraction (XRD) technique, which confirms that the samples are nanocrystalline with hexagonal symmetry and that all the intense peaks can be indexed to the P63/mmc space group. The lattice parameters obtained from the XRD analysis decrease with increasing Al3+ content in the samples. The microstructural morphology and particle sizes of all samples were studied using field emission scanning electron microscopy (FESEM, Hitachi S-4800). Magnetic hysteresis (M-H) loop measurements were carried out at room temperature with a vibrating sample magnetometer (VSM) over a field range of +20 kOe to -20 kOe. The M-H loops revealed the ferromagnetic (hard magnetic) nature of the samples and were analyzed using the Law of Approach to Saturation.
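    The Law of Approach to Saturation used in the analysis above models the high-field magnetization as M(H) = Ms(1 − a/H − b/H²), which is linear in the parameters (Ms, Ms·a, Ms·b) and can therefore be fitted by ordinary least squares. A minimal sketch with hypothetical data (none of the numbers below come from the abstract):

```python
# Law of Approach to Saturation: M(H) = Ms * (1 - a/H - b/H**2).
# With x = H0/H the model becomes a quadratic in x, so a linear
# least-squares fit recovers (Ms, a, b). All numbers are hypothetical.
Ms_true, a_true, b_true = 60.0, 50.0, 4.0e5   # emu/g, Oe, Oe^2
H0 = 5000.0                                   # scaling field (Oe)

H = [5000.0 + 500.0 * i for i in range(20)]   # high-field region (Oe)
M = [Ms_true * (1.0 - a_true / h - b_true / h**2) for h in H]

# Design-matrix rows [1, x, x^2] with x = H0/H; form the normal equations.
rows = [[1.0, H0 / h, (H0 / h) ** 2] for h in H]
A = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
rhs = [sum(r[i] * m for r, m in zip(rows, M)) for i in range(3)]

# Solve the 3x3 system by Gaussian elimination with partial pivoting.
for col in range(3):
    piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
    A[col], A[piv] = A[piv], A[col]
    rhs[col], rhs[piv] = rhs[piv], rhs[col]
    for r in range(col + 1, 3):
        f = A[r][col] / A[col][col]
        for c in range(col, 3):
            A[r][c] -= f * A[col][c]
        rhs[r] -= f * rhs[col]
coef = [0.0, 0.0, 0.0]
for r in (2, 1, 0):
    coef[r] = (rhs[r] - sum(A[r][k] * coef[k] for k in range(r + 1, 3))) / A[r][r]

# Map polynomial coefficients back: c0 = Ms, c1 = -Ms*a/H0, c2 = -Ms*b/H0^2.
Ms = coef[0]
a = -coef[1] * H0 / Ms
b = -coef[2] * H0 ** 2 / Ms
print(f"Ms = {Ms:.2f} emu/g, a = {a:.1f} Oe, b = {b:.3g} Oe^2")
```

    In the hexaferrite literature the fitted b coefficient is what ties the analysis to magnetocrystalline anisotropy; here the point is only the shape of the fit.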

  15. [Frequency distribution of dibucaine numbers in 24,830 patients].

    PubMed

    Pestel, G; Sprenger, H; Rothhammer, A

    2003-06-01

    Atypical cholinesterase prolongs the duration of neuromuscular blocking drugs such as succinylcholine and mivacurium. Measuring the dibucaine number identifies patients who are at risk. This study presents the frequency distribution of routinely measured dibucaine numbers and discusses avoidable clinical problems and economic implications. Dibucaine numbers were measured on a Hitachi 917 analyzer, and all dibucaine numbers recorded over a period of 4 years were taken into consideration; repeat observations were excluded. A total of 24,830 dibucaine numbers were analysed. Numbers below 30 were found in 0.07% (n=18), giving an incidence of 1:1,400, and numbers from 30 to 70 were found in 1.23% (n=306). On the basis of the identified dibucaine numbers we could avoid the administration of succinylcholine or mivacurium, resulting in a cost reduction of 12,280 Euro offset against total laboratory costs of 10,470 Euro. An incidence of 1:1,400 for dibucaine numbers below 30 is higher than documented in the literature. Therefore, routine measurement of the dibucaine number is a cost-effective method of identifying patients at increased risk of prolonged neuromuscular blockade due to atypical cholinesterase.
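    The reported frequencies follow directly from the counts; a quick sketch of the arithmetic (counts taken from the abstract):

```python
total = 24830          # dibucaine numbers analysed over 4 years
atypical = 18          # dibucaine number below 30
intermediate = 306     # dibucaine number from 30 to 70

pct_atypical = 100.0 * atypical / total          # ~0.07 %
pct_intermediate = 100.0 * intermediate / total  # ~1.23 %
one_in = total / atypical                        # ~1380, reported as 1:1,400

print(f"{pct_atypical:.2f} %, {pct_intermediate:.2f} %, 1:{one_in:.0f}")
```

    The exact ratio is about 1:1,380, which the abstract rounds to 1:1,400.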

  16. Clinical determination of glucose in human serum by a tomato skin biosensor.

    PubMed

    Han, Hui; Li, Yi; Yue, Huan; Zhou, Zaide; Xiao, Dan; Choi, Martin M F

    2008-09-01

    Glucose biosensors based on the enzyme reaction of glucose oxidase were studied because the symptomatic therapy of diabetes mellitus requires reliable assessment of blood glucose levels at frequent intervals. Tomato skin membranes have been successfully employed to entrap glucose oxidase for the fabrication of a glucose biosensor. Glucose oxidase was immobilized onto the tomato skin and the enzyme membrane was then positioned on the surface of an oxygen electrode. The glucose concentration was quantified by the change in dissolved oxygen. All the serum samples were also determined in parallel by a Hitachi 7060 chemistry analyzer. The response of the biosensor was linear over a concentration range of 1.0-30.0 mmol/l glucose, with a limit of detection of 0.20 mmol/l. Error Grid analysis demonstrated that 100% of the results fell within clinically acceptable zones A and B. The F- and t-tests showed no significant differences between the two methods. The recovery was 95.0-110.0% for the analysis of 30 serum samples. The tomato skin biosensor possesses the advantages of simple fabrication, fast response time, low cost and high sensitivity, and its results match well with those of the current clinical instrument method.

  17. SEM Imaging and Chemical Analysis of Aerosol Particles from Surface and Hi-altitudes in New Jersey.

    NASA Astrophysics Data System (ADS)

    Bandamede, M.; Boaggio, K.; Bancroft, L.; Hurler, K.; Magee, N. B.

    2016-12-01

    We report on scanning electron microscopy (SEM) analysis of aerosol particle morphology and chemistry, including the first comparative SEM analysis of aerosol particles captured by balloon at high altitude. The particles were acquired in an urban/suburban environment in central New Jersey. Particles were sampled near the surface using ambient air filtration and at high altitude using a novel balloon-borne instrument (ICE-Ball, see abstract by K. Boaggio). Particle images and 3D geometry are acquired by a Hitachi SU-5000 SEM, with resolution to approximately 3 nm; elemental analysis of the particles is provided by energy dispersive X-ray spectroscopy (EDS, EDAX, Inc.). Uncoated imaging is conducted in low vacuum within the variable-pressure SEM, which improves detection and analysis of light-element compositions, including carbon. Preliminary results suggest that some similar particle types and chemical species are sampled at both the surface and high altitude. However, as expected, particle morphologies, concentrations, chemistry, and apparent origin vary significantly at different altitudes and under different atmospheric flow regimes. Improved characterization of high-altitude aerosol particles, and of their differences from surface particulate composition, may improve inputs for atmospheric cloud and radiation models.

  18. Influence of Grand Multiparity on the Levels of Insulin, Glucose and HOMA-IR in Comparison with Nulliparity and Primiparity.

    PubMed

    Eldin Ahmed Abdelsalam, Kamal; Alobeid M Elamin, Abdelsamee

    2017-01-01

    The aim was to compare the levels of fasting glucose and insulin, as well as insulin resistance, in grand multiparas with primiparity and nulliparity. Fasting blood samples were collected from 100 non-pregnant women as a control group, 100 primiparous pregnant women and 100 grand multiparous pregnant women. Fasting blood glucose (FBG) and fasting serum insulin (FSI) concentrations were measured on a Hitachi 912 fully automated chemistry analyzer (Roche Diagnostics, Germany) according to the manufacturer's procedure. Insulin resistance was calculated with the formula: HOMA-IR = FBG (mg dL-1) × FSI (μU mL-1)/405. This study found a significant reduction in glucose level in primiparity when compared with the control group, but glucose was increased significantly in multiparity compared with primiparity and controls. Insulin showed significantly high concentrations in pregnant women and increased significantly in grand multiparas compared with primiparas and controls. As a result, HOMA-IR increased significantly with increasing parity. There was also a significant increase in fasting insulin and a decrease in insulin sensitivity with parity, in association with age and obesity. Grand multiparity is associated with an increased risk of subsequent clinical insulin resistance (HOMA-IR).
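    The HOMA-IR formula quoted in the abstract is a one-line computation; a minimal sketch (the example values are hypothetical, not taken from the study):

```python
def homa_ir(fbg_mg_dl: float, fsi_uU_ml: float) -> float:
    """HOMA-IR = fasting glucose (mg/dL) x fasting insulin (uU/mL) / 405,
    as given in the abstract."""
    return fbg_mg_dl * fsi_uU_ml / 405.0

# Hypothetical values: 90 mg/dL glucose, 9 uU/mL insulin.
print(homa_ir(90.0, 9.0))  # -> 2.0
```

    The divisor 405 is the mg/dL form of the usual HOMA normalization (it corresponds to 22.5 when glucose is expressed in mmol/L).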

  19. Low-pressure clathrate-hydrate formation in amorphous astrophysical ice analogs

    NASA Technical Reports Server (NTRS)

    Blake, D. F.; Allamandola, L. J.; Sandford, S.; Hudgins, D.; Freund, F.

    1991-01-01

    In modeling cometary ice, the properties of clathrate hydrates were used to explain anomalous gas release at large radial distances from the Sun, and the retention of particular gas inventories at elevated temperatures. Clathrates may also have been important early in solar system history. However, no reasonable mechanism had previously been proposed for clathrate formation under the low pressures typical of these environments. For the first time, it was shown that clathrate hydrates can be formed by warming and annealing amorphous mixed molecular ices at low pressures. The complex microstructures which occur as a result of clathrate formation from the solid state may provide an explanation for a variety of unexplained phenomena. The vacuum and imaging systems of a Hitachi H-500H analytical electron microscope were modified to study mixed molecular ices at temperatures between 12 and 373 K. The resulting ices are characterized by low-electron-dose transmission electron microscopy (TEM) and selected area electron diffraction (SAED). The implications of these results for the mechanical and gas release properties of comets are discussed. Laboratory IR data from similar ices are presented which suggest the possibility of remotely observing and identifying clathrates in astrophysical objects.

  20. The Effects of Exercise Therapy on CVD Risk Factors in Women

    PubMed Central

    Hur, Sun; Kim, Seon-Rye

    2014-01-01

    [Purpose] The purpose of this study was to examine the association between Type D personality and CVD risk factors by comparing the effect of exercise participation on CVD risk factors in women. [Subjects] The research subjects were randomly assigned to four groups: Type D+exercise (n=12), Type D+non-exercise (n=12), non-Type D+exercise (n=12), and non-Type D+non-exercise (n=10); the study consisted of 46 participants. [Methods] An aerobic exercise program and meditation were conducted in parallel for 10 months. Stretching was performed for 10 min as a warm-up, and then walking and running on a treadmill at 60 to 70% of HRmax were performed for 40 min three times a week. Blood samples were processed according to standard laboratory procedures, and the concentrations of TG and HDL cholesterol were determined enzymatically using a clinical chemistry analyzer (Hitachi High-Technologies Corporation, Tokyo, Japan). [Results] Weight, percentage of body fat, waist circumference, triglyceride concentration, HDL cholesterol concentration, systolic blood pressure, and diastolic blood pressure showed significant differences between measurement times in the exercise groups. [Conclusion] In conclusion, there were significant differences between groups in terms of cardiovascular disease risk factors. PMID:25276017

  1. Plant maintenance and advanced reactors issue, 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agnihotri, Newal

    The focus of the September-October issue is on plant maintenance and advanced reactors. Major articles/reports in this issue include: Technologies of national importance, by Tsutomu Ohkubo, Japan Atomic Energy Agency, Japan; Modeling and simulation advances brighten future nuclear power, by Hussein Khalil, Argonne National Laboratory; Energy and desalination projects, by Ratan Kumar Sinha, Bhabha Atomic Research Centre, India; A plant with simplified design, by John Higgins, GE Hitachi Nuclear Energy; A forward thinking design, by Ray Ganthner, AREVA; A passively safe design, by Ed Cummins, Westinghouse Electric Company; A market-ready design, by Ken Petrunik, Atomic Energy of Canada Limited, Canada; Generation IV Advanced Nuclear Energy Systems, by Jacques Bouchard, French Commissariat a l'Energie Atomique, France, and Ralph Bennett, Idaho National Laboratory; Innovative reactor designs, a report by IAEA, Vienna, Austria; Guidance for new vendors, by John Nakoski, U.S. Nuclear Regulatory Commission; Road map for future energy, by John Cleveland, International Atomic Energy Agency, Vienna, Austria; and Vermont's largest source of electricity, by Tyler Lamberts, Entergy Nuclear Operations, Inc. The Industry Innovation article is titled Intelligent monitoring technology, by Chris Demars, Exelon Nuclear.

  2. Hydroxyapatite Coating on TiO₂ Nanotube by Sol-Gel Method for Implant Applications.

    PubMed

    Lim, Hyun-Pil; Park, Sang-Won; Yun, Kwi-Dug; Park, Chan; Ji, Min-Kyung; Oh, Gye-Jeong; Lee, Jong-Tak; Lee, Kwangmin

    2018-02-01

    The aim of this study was to determine the effect of hydroxyapatite (HA) coating of titanium dioxide (TiO2) nanotubes by a sol-gel process on the viability of osteoblast-like cells (MC3T3-E1) and on bone formation in rat tibia. Specimens were divided into three groups: commercially pure titanium (control group), TiO2 nanotubes (group N), and HA-coated TiO2 nanotubes (group HN). Surface characteristics were determined using a field emission scanning electron microscope (FE-SEM; S-4700, Hitachi, Japan) and contact angles were measured. Cell viability was investigated in vitro after 1, 3, and 7 days of incubation. Implants (2.0 mm in diameter and 5.0 mm in length) were inserted into the tibiae of rats, and histomorphometric analysis was performed after 4 weeks. Both the N and HN groups showed enhanced hydrophilicity compared with the control group. After 7 days of incubation, group HN showed higher cell viability with marginal significance (0.05 < P < 0.1). The bone-to-implant contact (BIC) ratios in the control group, group N, and group HN were 32.5%, 33.1%, and 43.8%, respectively. These results show that HA coating of TiO2 nanotubes by a sol-gel process can be used to enhance hydrophilicity and improve osseointegration of dental implant surfaces.

  3. Parallelizing Compiler Framework and API for Power Reduction and Software Productivity of Real-Time Heterogeneous Multicores

    NASA Astrophysics Data System (ADS)

    Hayashi, Akihiro; Wada, Yasutaka; Watanabe, Takeshi; Sekiguchi, Takeshi; Mase, Masayoshi; Shirako, Jun; Kimura, Keiji; Kasahara, Hironori

    Heterogeneous multicores have been attracting much attention as a way to attain high performance while keeping power consumption low in a wide range of areas. However, heterogeneous multicores impose very difficult programming on developers, and the resulting long application development periods lower product competitiveness. To overcome this situation, this paper proposes a compilation framework which bridges the gap between programmers and heterogeneous multicores. In particular, it describes a compilation framework based on the OSCAR compiler which realizes coarse-grain task parallel processing, data transfer using a DMA controller, and power reduction control from user programs with DVFS and clock gating on various heterogeneous multicores from different vendors. The paper also evaluates the processing performance and power reduction achieved by the proposed framework on a newly developed 15-core heterogeneous multicore chip named RP-X, which integrates 8 general-purpose processor cores and 3 types of accelerator cores and was developed by Renesas Electronics, Hitachi, Tokyo Institute of Technology and Waseda University. The framework attains speedups of up to 32x for an optical flow program with eight general-purpose processor cores and four DRP (Dynamically Reconfigurable Processor) accelerator cores against sequential execution by a single processor core, and an 80% power reduction for real-time AAC encoding.

  4. Characterization of a novel antibacterial glycopeptide produced by Penicillium sp. M03.

    PubMed

    Yang, W H; Zhang, W C; Lu, X M; Jiang, G S; Gao, P J

    2009-04-01

    To isolate a novel antibiotic termed AF from the fermentation broth of Penicillium sp. M03 and to examine its antimicrobial activity, biological properties and structural characteristics. Sephadex LH-20 and HPLC were used to purify AF from the fermentation broth. The antimicrobial activity of AF was evaluated with the agar diffusion test. The amino acid and monosaccharide composition of AF was analysed with a HITACHI 835 analyser and by HPLC assay, respectively. Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry, FT-IR and (1)H nuclear magnetic resonance spectral analyses were performed to examine the initial structure of AF. Eighty milligrams of AF was isolated as a white powder from 1 l of Penicillium sp. M03 fermentation broth. It consists of five amino acid residues and two monosaccharide residues, its molecular weight is 1017, and it is stable to beta-lactamase, heat, acid and alkali. AF showed inhibitory activity against a wide range of bacteria, particularly multidrug-resistant Staphylococcus aureus. AF is a novel antibacterial glycopeptide with a broad inhibitory spectrum against pathogenic bacteria including multidrug-resistant agents; furthermore, it is difficult to generate bacteria resistant to AF. These characteristics make AF a potential antibiotic for fighting antibiotic-resistant bacterial pathogens.

  5. Nonalcoholic fatty liver in patients with Laron syndrome and GH gene deletion - preliminary report.

    PubMed

    Laron, Zvi; Ginsberg, Shira; Webb, Muriel

    2008-10-01

    There is little information on the relationship between growth hormone/insulin-like growth factor-I (GH/IGF-I) deficiency or IGF-I treatment and nonalcoholic fatty liver disease (NAFLD), a disorder linked to obesity and insulin resistance. Our aim was to find out whether markedly obese patients with Laron syndrome (LS) or GH gene deletion have fatty livers. We studied 11 untreated adult patients with LS (5M, 6F), five girls with LS treated with IGF-I, and five adult patients with GH gene deletion (3M, 3F), four of whom had been treated with hGH in childhood. Fatty liver was quantitatively evaluated by ultrasonography using a phased array US system (HITACHI 6500, Japan). Body adiposity was determined by DEXA, and insulin resistance was estimated by HOMA-IR using fasting serum glucose and insulin values. Six of the 11 adult patients with LS, two of the five IGF-I-treated girls with LS and three of the five adult hGH gene deletion patients were found to have NAFLD. NAFLD is thus a frequent complication in untreated and treated congenital IGF-I deficiency. No correlation between NAFLD and age, sex, degree of obesity, blood lipids, or degree of insulin resistance was observed.

  6. Voices from the front lines. Four leaders on the cross-border challenges they've faced.

    PubMed

    Minguet, Luc; Caride, Eduardo; Yamaguchi, Takeo; Tedjarati, Shane

    2014-09-01

    Executives on the front lines of managing across borders share their insights: Luc Minguet, of France's Michelin, talks about the importance of cultural training not just for managers taking on assignments abroad but also for local employees who work with colleagues from around the world. He describes how his own experience learning to communicate across cultures reflects the tire-maker's broader practices. Eduardo Caride, of Madrid-based Telefónica, explains how the relatively young multinational is investing in a diverse talent mix as it strives to become a truly global company. Whereas early on, leaders relied on exporting Spanish managers abroad, he notes, the street now runs both ways. Takeo Yamaguchi, of Japan's Hitachi, details his efforts to create standardized global HR systems and processes across the conglomerate's 948 separate companies. "Three years ago, we had no systematic way of tracking employees, evaluating performance, or identifying future leaders," Yamaguchi says. "Today we do." And Shane Tedjarati, from the United States' Honeywell, talks about how the industrial powerhouse is shifting its strategy toward new regions, such as China, India, Vietnam, and Indonesia. "We call these markets 'high-growth regions' instead of emerging markets," says Tedjarati, "because they now account for more than half of Honeywell's total growth."

  7. EELS Valence Mapping in Electron Beam Sensitive FeFx/C Nanocomposites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cosandey, F.; Al-Sharab, J.F.; Amatucci, Glenn G.

    A new type of positive electrode for Li-ion batteries has been synthesized based on FeF2/C and FeF3/C nanocomposites with particle sizes in the 8-12 nm range [1]. The measured high capacities rely on a complete reduction of Fe to its metallic state according to the following reaction: xLi+ + xe- + FeFx -> xLiF + Fe0, where x = 3 for FeF3/C and x = 2 for FeF2/C. This electrochemical reaction involves a change in the valence state of Fe from 3+ or 2+ to 0, which can be determined uniquely by EELS from the peak energy of the L3 line and from the L3/L2 line intensity ratio. In this paper, we report EELS mapping results on the electrochemical conversion processes and in particular the mapping of the Fe valence state before and after discharge. This work was performed with a Hitachi HF2000 equipped with a Gatan PEELS and with a FEI CM200 FEG TEM equipped with a Gatan GIF. Both instruments were operated in STEM mode at 200 kV with an EELS collection half-angle of β = 5 mrad and spectrum imaging software.

  8. Baseline Fracture Toughness and CGR testing of alloys X-750 and XM-19 (EPRI Phase I)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. H. Jackson; S. P. Teysseyre

    2012-10-01

    The Advanced Test Reactor National Scientific User Facility (ATR NSUF) and Electric Power Research Institute (EPRI) formed an agreement to test representative alloys used as reactor structural materials as a pilot program toward establishing guidelines for future ATR NSUF research programs. This report contains results from the portion of this program established as Phase I (of three phases) that entails baseline fracture toughness, stress corrosion cracking (SCC), and tensile testing of selected materials for comparison to similar tests conducted at GE Global Research. The intent of this Phase I research program is to determine baseline properties for the materials of interest prior to irradiation, and to ensure comparability between laboratories using similar testing techniques, prior to applying these techniques to the same materials after having been irradiated at the Advanced Test Reactor (ATR). The materials chosen for this research are the nickel based super alloy X-750, and nitrogen strengthened austenitic stainless steel XM-19. A spare core shroud upper support bracket of alloy X-750 was purchased by EPRI from Southern Co. and a section of XM-19 plate was purchased by EPRI from GE-Hitachi. These materials were sectioned at GE Global Research and provided to INL.

  9. Baseline Fracture Toughness and CGR testing of alloys X-750 and XM-19 (EPRI Phase I)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. H. Jackson; S. P. Teysseyre

    2012-02-01

    The Advanced Test Reactor National Scientific User Facility (ATR NSUF) and Electric Power Research Institute (EPRI) formed an agreement to test representative alloys used as reactor structural materials as a pilot program toward establishing guidelines for future ATR NSUF research programs. This report contains results from the portion of this program established as Phase I (of three phases) that entails baseline fracture toughness, stress corrosion cracking (SCC), and tensile testing of selected materials for comparison to similar tests conducted at GE Global Research. The intent of this Phase I research program is to determine baseline properties for the materials of interest prior to irradiation, and to ensure comparability between laboratories using similar testing techniques, prior to applying these techniques to the same materials after having been irradiated at the Advanced Test Reactor (ATR). The materials chosen for this research are the nickel based super alloy X-750, and nitrogen strengthened austenitic stainless steel XM-19. A spare core shroud upper support bracket of alloy X-750 was purchased by EPRI from Southern Co. and a section of XM-19 plate was purchased by EPRI from GE-Hitachi. These materials were sectioned at GE Global Research and provided to INL.

  10. Application of the DG-1199 methodology to the ESBWR and ABWR.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalinich, Donald A.; Gauntt, Randall O.; Walton, Fotini

    2010-09-01

    Appendix A-5 of Draft Regulatory Guide DG-1199, 'Alternative Radiological Source Term for Evaluating Design Basis Accidents at Nuclear Power Reactors', provides guidance - applicable to RADTRAD MSIV leakage models - for scaling the containment aerosol concentration to the expected steam dome concentration in order to preserve the simplified use of the Accident Source Term (AST) in assessing containment performance under assumed design basis accident (DBA) conditions. In this study, Economic and Safe Boiling Water Reactor (ESBWR) and Advanced Boiling Water Reactor (ABWR) RADTRAD models are developed using the DG-1199, Appendix A-5 guidance. The models were run using RADTRAD v3.03. Low Population Zone (LPZ), control room (CR), and worst-case 2-hr Exclusion Area Boundary (EAB) doses were calculated and compared to the relevant accident dose criteria in 10 CFR 50.67. For the ESBWR, the dose results were all lower than the MSIV leakage doses calculated by General Electric/Hitachi (GEH) in their licensing technical report. There are no comparable ABWR MSIV leakage doses; however, it should be noted that the ABWR doses are lower than the ESBWR doses. In addition, sensitivity cases were evaluated to ascertain the influence and importance of key input parameters and features of the models.

  11. Comparison of Endoanal Ultrasound with Clinical Diagnosis in Anal Fistula Assessment.

    PubMed

    Sirikurnpiboon, Siripong; Phadhana-anake, Oradee; Awapittaya, Burin

    2016-02-01

    Anal fistula anatomy and its relationship with the anal sphincters are important factors influencing the results of surgical management. Pre-operative definition of the fistulous track(s) and the internal opening plays a primary role in minimizing damage to the sphincters and recurrence of the fistula. The aim was to evaluate the relative accuracy of digital examination and endoanal ultrasound for pre-operative assessment of anal fistula by comparison with operative findings. A retrospective review was conducted of all patients with anal fistula admitted to the surgical unit between May 2008 and May 2012. Physical examination and hydrogen peroxide-enhanced endoanal ultrasound (utilising a 10 MHz endoprobe, HITACHI EUB-7500) were performed in 142 consecutive patients, and the results were matched with surgical findings to establish their accuracy in preoperative anal fistula assessment. A total of 142 patients (107 men, 35 women), 28 of whom had had previous surgery, were included in the study. Their mean age was 40 (range 18-71) years and their mean BMI was 26.37 (range 17.30-36.11) kg/m². The majority of the fistulas were transsphincteric (90.4%) and the rest were intersphincteric (9.6%). The accuracy rates of clinical examination and endoanal ultrasound were 55.63% and 95.07% (p < 0.01), respectively. Endoanal ultrasound is superior to digital examination for pre-operative classification of anal fistula.
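    The reported accuracy rates correspond to simple correct/total ratios over the 142 patients; the underlying counts used below (79 and 135 correct classifications) are inferred from the percentages, not stated in the abstract:

```python
total = 142             # consecutive patients assessed
clinical_correct = 79   # inferred: 79/142 = 55.63 %
eus_correct = 135       # inferred: 135/142 = 95.07 %

clinical_acc = 100.0 * clinical_correct / total
eus_acc = 100.0 * eus_correct / total
print(f"clinical: {clinical_acc:.2f} %, endoanal US: {eus_acc:.2f} %")
```

    Both inferred counts reproduce the published percentages exactly to two decimal places, which is a useful sanity check when only percentages are reported.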

  12. AVC/H.264 patent portfolio license

    NASA Astrophysics Data System (ADS)

    Skandalis, Dean A.

    2006-08-01

    MPEG LA, LLC offers a joint patent license for the AVC (a/k/a H.264) Standard (ISO/IEC IS 14496-10:2004). Like MPEG LA's other licenses, the AVC Patent Portfolio License is offered for the convenience of the marketplace as an alternative enabling users to access essential intellectual property owned by many patent holders under a single license rather than negotiating licenses with each of them individually. The AVC Patent Portfolio License includes essential patents owned by DAEWOO Electronics Corporation; Electronics and Telecommunications Research Institute (ETRI); France Telecom, societe anonyme; Fujitsu Limited; Hitachi, Ltd.; Koninklijke Philips Electronics N.V.; LG Electronics Inc.; Matsushita Electric Industrial Co., Ltd.; Microsoft Corporation; Mitsubishi Electric Corporation; Robert Bosch GmbH; Samsung Electronics Co., Ltd.; Sedna Patent Services, LLC; Sharp Kabushiki Kaisha; Siemens AG; Sony Corporation; The Trustees of Columbia University in the City of New York; Toshiba Corporation; UB Video Inc.; and Victor Company of Japan, Limited. Another is expected also to join as of August 1, 2006. MPEG LA's objective is to provide worldwide access to as much AVC essential intellectual property as possible for the benefit of AVC users. Therefore, any party that believes it has essential patents is welcome to submit them for evaluation of their essentiality and inclusion in the License if found essential.

  13. Shear wave elastography with a new reliability indicator.

    PubMed

    Dietrich, Christoph F; Dong, Yi

    2016-09-01

    Non-invasive methods for liver stiffness assessment have been introduced over recent years. Of these, two main methods for estimating liver fibrosis using ultrasound elastography have become established in clinical practice: shear wave elastography and quasi-static or strain elastography. Shear waves are waves with a motion perpendicular (lateral) to the direction of the generating force. Shear waves travel relatively slowly (between 1 and 10 m/s). The stiffness of the liver tissue can be assessed based on shear wave velocity (the stiffness increases with the speed). The European Federation of Societies for Ultrasound in Medicine and Biology has published Guidelines and Recommendations that describe these technologies and provide recommendations for their clinical use. Most of the data available to date has been published using the Fibroscan (Echosens, France), point shear wave speed measurement using an acoustic radiation force impulse (Siemens, Germany) and 2D shear wave elastography using the Aixplorer (SuperSonic Imagine, France). More recently, also other manufacturers have introduced shear wave elastography technology into the market. A comparison of data obtained using different techniques for shear wave propagation and velocity measurement is of key interest for future studies, recommendations and guidelines. Here, we present a recently introduced shear wave elastography technology from Hitachi and discuss its reproducibility and comparability to the already established technologies.
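    The velocity-to-stiffness conversion behind the measurements described above is commonly the incompressible soft-tissue approximation E = 3ρc², where ρ is tissue density and c is the shear wave speed; this specific formula is standard elastography practice and an assumption here, not a statement from the abstract. A minimal sketch:

```python
def youngs_modulus_kpa(c_m_per_s: float, rho_kg_per_m3: float = 1000.0) -> float:
    """Young's modulus E = 3 * rho * c^2 (incompressible soft-tissue
    approximation commonly used in shear wave elastography), in kPa."""
    return 3.0 * rho_kg_per_m3 * c_m_per_s ** 2 / 1000.0

# Shear waves travel at roughly 1-10 m/s in tissue (per the text);
# a speed of 1.5 m/s maps to a stiffness of about 6.8 kPa.
print(youngs_modulus_kpa(1.5))
```

    Because stiffness scales with the square of the speed, the 1-10 m/s velocity range quoted in the text spans roughly two orders of magnitude in modulus, which is why some systems report velocity and others report kPa.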

  14. Shear wave elastography with a new reliability indicator

    PubMed Central

    Dong, Yi

    2016-01-01

    Non-invasive methods for liver stiffness assessment have been introduced over recent years. Of these, two main methods for estimating liver fibrosis using ultrasound elastography have become established in clinical practice: shear wave elastography and quasi-static or strain elastography. Shear waves are waves with a motion perpendicular (lateral) to the direction of the generating force. Shear waves travel relatively slowly (between 1 and 10 m/s). The stiffness of the liver tissue can be assessed based on shear wave velocity (the stiffness increases with the speed). The European Federation of Societies for Ultrasound in Medicine and Biology has published Guidelines and Recommendations that describe these technologies and provide recommendations for their clinical use. Most of the data available to date has been published using the Fibroscan (Echosens, France), point shear wave speed measurement using an acoustic radiation force impulse (Siemens, Germany) and 2D shear wave elastography using the Aixplorer (SuperSonic Imagine, France). More recently, also other manufacturers have introduced shear wave elastography technology into the market. A comparison of data obtained using different techniques for shear wave propagation and velocity measurement is of key interest for future studies, recommendations and guidelines. Here, we present a recently introduced shear wave elastography technology from Hitachi and discuss its reproducibility and comparability to the already established technologies. PMID:27679731

  15. Fine structural features of nanoscale zero-valent iron characterized by spherical aberration corrected scanning transmission electron microscopy (Cs-STEM).

    PubMed

    Liu, Airong; Zhang, Wei-xian

    2014-09-21

    An angstrom-resolution physical model of nanoscale zero-valent iron (nZVI) is generated with a combination of spherical aberration corrected scanning transmission electron microscopy (Cs-STEM), selected area electron diffraction (SAED), energy-dispersive X-ray spectroscopy (EDS) and electron energy-loss spectroscopy (EELS) on the Fe L-edge. Bright-field (BF), high-angle annular dark-field (HAADF) and secondary electron (SE) imaging of nZVI acquired with a Hitachi HD-2700 STEM yield near-atomic-resolution images and detailed morphological and structural information on nZVI. The STEM-EDS technique confirms that fresh nZVI comprises a metallic iron core encapsulated in a thin layer of iron oxides or oxyhydroxides. SAED patterns suggest a polycrystalline structure in the metallic core and an amorphous oxide layer. Furthermore, the Fe L-edge EELS shows varied structural features from the innermost Fe core to the outer oxide shell. A qualitative analysis of the Fe L(2,3) edge fine structures reveals that the shell of nZVI consists of a mixed Fe(II)/Fe(III) phase close to the Fe(0) interface and predominantly Fe(III) at the outer surface of nZVI.

  16. SEM-based overlay measurement between via patterns and buried M1 patterns using high-voltage SEM

    NASA Astrophysics Data System (ADS)

    Hasumi, Kazuhisa; Inoue, Osamu; Okagawa, Yutaka; Shao, Chuanyu; Leray, Philippe; Halder, Sandip; Lorusso, Gian; Jehoul, Christiane

    2017-03-01

    As the miniaturization of semiconductors continues, the importance of overlay measurement is increasing. In 1998, we measured overlay with an analysis SEM called Miracle Eye, which can output an ultrahigh acceleration voltage. Since 2006, we have been working on SEM-based overlay measurement and developed a same-layer overlay measurement function using CD-SEM, which we used to evaluate overlay of same-layer patterns after etching. This time, in order to measure overlay after lithography, we evaluated see-through overlay using the high-voltage SEM CV5000, released in October 2016. In a collaboration between imec and Hitachi High-Technologies, we evaluated the repeatability and TIS of SEM-based overlay (SEM-OVL), as well as the correlation between SEM-OVL and optical overlay (Opt-OVL), in the M1@ADI and V0@ADI processes. Repeatability and TIS results are reasonable, and SEM-OVL correlates well with Opt-OVL. From overlay measurement using the CV5000, we draw the following conclusions: (1) SEM-OVL results for both M1 and V0 at ADI show good correlation to Opt-OVL. (2) High-voltage SEM can measure small, device-like patterns (less than 1 2um) that can be placed in the in-die area. (3) "In-die SEM-based overlay" shows potential for high-order control of the scanner.
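    The repeatability and TIS figures of merit evaluated above have conventional definitions in overlay metrology: TIS is estimated from readings taken at 0 and 180 degree wafer orientations, and repeatability is usually reported as 3 sigma of repeated static measurements. A hedged sketch of these standard definitions (function names are illustrative; the actual imec/Hitachi analysis procedure is not detailed in the abstract):

```python
import statistics

def tool_induced_shift(ovl_0_deg, ovl_180_deg):
    # TIS is the average of overlay readings at 0 and 180 degree wafer
    # orientations; a perfect measurement tool gives TIS = 0.
    return (ovl_0_deg + ovl_180_deg) / 2.0

def tis_corrected_overlay(ovl_0_deg, ovl_180_deg):
    # The tool-induced error cancels in the half-difference,
    # leaving the true wafer overlay.
    return (ovl_0_deg - ovl_180_deg) / 2.0

def repeatability_3sigma(readings):
    # Repeatability reported as 3 standard deviations of repeated
    # static measurements of the same site.
    return 3.0 * statistics.stdev(readings)
```

    For example, readings of +3.0 nm at 0 degrees and -1.0 nm at 180 degrees would give a TIS of 1.0 nm and a corrected overlay of 2.0 nm.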

  17. Transmission electron microscopy of polyhydroxybutyrate-co-valerate (PHBV)/nanocrystalline cellulose (NCC) bio-nanocomposite prepared using cryo-ultramicrotomy

    NASA Astrophysics Data System (ADS)

    Ismarul, N. I.; Engku, A. H. E. U.; Siti, N. K.; Tay, K. Y.

    2017-12-01

    Environmental issues concerning the disposal and end-of-life of products made from synthetic petroleum-derived polymers have prompted materials scientists to search for new materials with similar physical and mechanical properties that are also environmentally friendly, i.e. renewable and biodegradable. This work studies the effect of nanocrystalline cellulose in improving the thermal stability of polyhydroxybutyrate-co-valerate biopolymer for high-temperature processing of packaging material. A 10% w/w PHBV-NCC bio-nanocomposite feedstock pellet prepared using a RONDOL minilab compounder was used as the sample for Transmission Electron Microscopy (TEM) sample preparation. RMC cryo-ultramicrotomy equipment was used to prepare ultra-thin slices of the bio-nanocomposite pellet under liquid nitrogen at -60 °C. A diamond knife was used to slice off bio-nanocomposite films about 80-100 nm thick, which were transferred onto a lacey-carbon-film-coated grid using a cooled sugar solution. A few drops of phosphotungstic acid were used as a negative stain to improve contrast during the TEM analysis. A HITACHI TEM system was used to obtain TEM micrographs of the PHBV-NCC bio-nanocomposite at an accelerating voltage of 80 kV. Well-dispersed NCC in the PHBV matrix, ranging from 5 to 25 nm in width, was observed.

  18. Inkjet-Printed Nanocavities on a Photonic Crystal Template.

    PubMed

    Brossard, Frederic S F; Pecunia, Vincenzo; Ramsay, Andrew J; Griffiths, Jonathan P; Hugues, Maxime; Sirringhaus, Henning

    2017-12-01

    The last decade has witnessed the rapid development of inkjet printing as an attractive bottom-up microfabrication technology due to its simplicity and potentially low cost. The wealth of printable materials has been key to its widespread adoption in organic optoelectronics and biotechnology. However, its implementation in nanophotonics has so far been limited by the coarse resolution of conventional inkjet-printing methods. In addition, the low refractive index of organic materials prevents the use of "soft-photonics" in applications where strong light confinement is required. This study introduces a hybrid approach for creating and fine tuning high-Q nanocavities, involving the local deposition of an organic ink on the surface of an inorganic 2D photonic crystal template using a commercially available high-resolution inkjet printer. The controllability of this approach is demonstrated by tuning the resonance of the printed nanocavities by the number of printer passes and by the fabrication of photonic crystal molecules with controllable splitting. The versatility of this method is evidenced by the realization of nanocavities obtained by surface deposition on a blank photonic crystal. A new method for a free-form, high-density, material-independent, and high-throughput fabrication technique is thus established with a manifold of opportunities in photonic applications. © 2017 Hitachi Cambridge Laboratory. Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. A new automated multiple allergen simultaneous test-chemiluminescent assay (MAST-CLA) using an AP720S analyzer.

    PubMed

    Lee, Sungsil; Lim, Hwan Sub; Park, Jungyong; Kim, Hyon Suk

    2009-04-01

    In the diagnosis of atopic diseases, allergen detection is a crucial step. The multiple allergen simultaneous test-chemiluminescent assay (MAST-CLA) is a simple and noninvasive method for in vitro screening of allergen-specific IgE antibodies. The Korean Inhalant Panel test on 20 patients and the Food Panel test on 19 patients were performed simultaneously using the conventional manual MAST-CLA kit and the new automated MAST-CLA method (automated AP720S system for the Optigen Assay; Hitachi Chemical Diagnostics, Inc., USA). The results were evaluated for positive reactivity and concordance. The inhalant panel gave relatively higher class-level results than the food panel. Eight of the 20 patients (40%) in the inhalant panel and 9 of the 19 patients (47.4%) in the food panel showed 100% concordance between the two systems. Eighteen patients (90%) in the Inhalant Panel and sixteen patients (84.2%) in the Food Panel showed more than 91% concordance. These results suggest that the MAST-CLA assay using the new, automated AP720S analyzer performs well, showing a high concordance rate with conventional MAST-CLA. Compared with manual MAST-CLA, the automated AP720S system has a shorter assay time and uses a smaller serum volume (500 microl), along with other conveniences.

  20. Technical note: Evaluation of the diagnostic accuracy of 2 point-of-care β-hydroxybutyrate devices in stored bovine plasma at room temperature and at 37°C.

    PubMed

    Leal Yepes, F A; Nydam, D V; Heuwieser, W; Mann, S

    2018-04-25

    The use of point-of-care (POC) devices to measure blood metabolites, such as β-hydroxybutyrate (BHB), on farm has become an important diagnostic and screening tool in the modern dairy industry. POC devices allow for immediate decision making and are often more economical than laboratory-based methods; however, precision and accuracy may be lower when measurements are performed in an uncontrolled environment. Ideally, the advantages of POC devices and of a standardized laboratory environment could be combined when measuring samples that do not require an immediate result, for example in research applications or when immediate intervention is not the goal. The objective of this study was to compare the ability of 2 POC devices (TaiDoc, Pharmadoc, Lübeck, Germany; Precision Xtra, Abbott Diabetes Care, Abingdon, UK) to measure BHB concentrations either at room temperature (RT; 20-22°C) or at 37°C against the gold standard test in stored plasma samples. Whole blood from multiparous Holstein dairy cows (n = 113) was sampled from the coccygeal vessels between 28 d before expected calving and 42 DIM. Whole-blood BHB concentrations were determined cow-side using the TaiDoc POC device. Plasma was separated within 1 h of collection and stored until analysis. A subset of stored plasma samples (n = 100), consisting of 1 sample per animal, was chosen retrospectively based on whole-blood BHB concentrations within the range of 0.2 to 4.0 mmol/L. The samples were analyzed for plasma BHB concentration using an automated chemistry analyzer (Hitachi 917, Hitachi, Tokyo, Japan), which was considered the gold standard. On the same day, the samples were also measured with the 2 POC devices, either at RT or heated to 37°C. Our study showed high Spearman correlation coefficients (>0.99) for either device at both temperatures compared with the gold standard.
Passing-Bablok regression revealed a very strong correlation (>0.99), indicating good agreement between both POC devices and the gold standard method. For hyperketonemia detection, defined as BHB concentration ≥1.2 mmol/L, the sensitivity for both POC devices at RT and 37°C was equally high at 100%. Specificity was lowest (67.4%) for the TaiDoc used with plasma at RT and was highest (86.5%) when plasma was measured at 37°C with the Precision Xtra meter. Bland-Altman plots revealed a mean bias of 0.25 and 0.4 mmol/L for the Precision Xtra meter and TaiDoc, respectively, when tested on plasma at 37°C. Our data showed that both POC devices are suitable for measuring BHB concentration in stored bovine plasma, and accuracy was highest when samples were heated to 37°C compared with RT. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
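    The sensitivity and specificity reported above are evaluated at the hyperketonemia cutoff of 1.2 mmol/L by cross-tabulating paired reference and device readings. A minimal sketch with illustrative data (not the study's own values):

```python
def sens_spec(reference, device, cutoff=1.2):
    # Classify each paired reading against the cutoff and count the
    # true/false positives and negatives, treating the reference
    # (gold standard) classification as truth.
    tp = sum(1 for r, d in zip(reference, device) if r >= cutoff and d >= cutoff)
    fn = sum(1 for r, d in zip(reference, device) if r >= cutoff and d < cutoff)
    tn = sum(1 for r, d in zip(reference, device) if r < cutoff and d < cutoff)
    fp = sum(1 for r, d in zip(reference, device) if r < cutoff and d >= cutoff)
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity
```

    With illustrative reference values [0.5, 1.5, 2.0, 0.8] mmol/L and device readings [0.6, 1.4, 1.0, 1.3], one hyperketonemic cow is missed and one healthy cow is flagged, giving a sensitivity and specificity of 0.5 each.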

  1. Surface scanning inspection system particle detection dependence on aluminum film morphology

    NASA Astrophysics Data System (ADS)

    Prater, Walter; Tran, Natalie; McGarvey, Steve

    2012-03-01

    Physical vapor deposition (PVD) aluminum films present unique challenges when detecting particulate defects with a Surface Scanning Inspection System (SSIS). Aluminum (Al) films 4500 Å thick were deposited on 300 mm particle-grade bare Si wafers at two temperatures using a Novellus Systems INOVA® NExT system. Film surface roughness and morphology measurements were performed using a Veeco Vx310® atomic force microscope (AFM). AFM characterization found the high deposition temperature (TD) Al roughness (root mean square 16.5 nm) to be five times rougher than the low-TD Al roughness (rms 3.7 nm). High-TD Al had grooves at the grain boundaries that were measured to be 20 to 80 nm deep. Scanning electron microscopy (SEM) examination with a Hitachi RS6000 defect review SEM confirmed the presence of pronounced grain grooves. SEM images established that the low-TD film wafers have fine grains (0.1 to 0.3 um diameter) and the high-TD film wafers have fifty times larger equiaxed platelet-shaped grains (5 to 15 um diameter). Calibrated polystyrene latex (PSL) spheres ranging in size from 90 nm to 1 μm were deposited in circular patterns on the wafers using an aerosol deposition chamber. PSL sphere depositions at each spot were controlled to yield 2000 to 5000 counts. A Hitachi LS9100® dark-field full-wafer SSIS was used to experimentally determine the relationship of the PSL sphere scattered light intensity with S-polarized light, a measure of scattering cross-section, with respect to the calibrated PSL sphere diameter. Comparison of the SSIS scattered light versus PSL spot size calibration curves shows two distinct differences. The scattering cross-section (intensity) of the PSL spheres increased on the low-TD Al film with its smooth surface, and the low-TD Al film defect detection sensitivity was 126 nm, compared to 200 nm for the rougher high-TD Al film. This can be explained by the higher signal-to-noise ratio attributed to the smooth low-TD Al.
    Dark-field defect detection on surface scanning inspection systems is used to rapidly measure defectivity data. The user generates a calibration curve on the SSIS plotting the light-scattering intensity at each National Institute of Standards and Technology (NIST)-certified PSL deposition spot. It is not uncommon for the end user to embark upon the time-consuming process of attempting to "push" the maximal SSIS film-specific sensitivity curve beyond the optical performance capability of the SSIS. Bidirectional reflectance distribution function (BRDF) light-scattering modeling was utilized as a means of determining the most appropriate polarity prior to the SSIS recipe creation process. The modeling utilized the Al refractive index (n) and extinction coefficient (k) together with the SSIS detector angles and laser wavelength. The modeling results allowed predetermination of the maximal sensitivity for each Al thickness and eliminated unnecessary recipe-modification trial-and-error in search of the SSIS maximal sensitivity. The modeling accurately forecasted the optimal polarization and maximal sensitivity of the SSIS recipe, which, by avoiding a trial-and-error approach, can result in a substantial savings of time and resources.
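    As rough intuition for the shape of the intensity-versus-diameter calibration curve: for particles much smaller than the laser wavelength, the Rayleigh approximation predicts a sixth-power dependence of the scattering cross-section on particle diameter. This is only an idealization; the measured SSIS response also depends on film roughness, polarization, and collection optics as described above, and the larger PSL sizes used here fall outside the pure Rayleigh regime:

```python
def relative_rayleigh_intensity(d_nm, d_ref_nm=100.0):
    # In the Rayleigh regime (particle diameter << wavelength) the
    # scattering cross-section scales as the sixth power of diameter,
    # here expressed relative to a reference sphere size.
    return (d_nm / d_ref_nm) ** 6
```

    Doubling the sphere diameter from 100 nm to 200 nm thus raises the idealized scattered intensity by a factor of 64, which is why small changes in detectable scattered signal translate into large changes in minimum detectable particle size.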

  2. [Comparison of two methods for rapid determination of C-reactive protein with the Tina-quant].

    PubMed

    Oremek, G M; Luksaite, R; Bretschneider, I

    2008-03-01

    C-reactive protein (CRP), an acute phase protein, is an important diagnostic marker for the presence and course of inflammatory processes. Among the acute phase proteins, its concentration increases most rapidly, and its sensitivity is superior to other markers of inflammation, such as leukocytosis, erythrocyte sedimentation rate, and fever. This study compared two point-of-care assays with the standard laboratory method, Tina-quant CRP processed on a Hitachi 917: the immunofiltration assay NycoCard CRP Whole Blood and the turbidimetric immunoassay Micros CRP. Both methods are carried out in the presence of the patient, using capillary or venous blood. Seventy-eight blood samples were analyzed first in the standard laboratory routine and then by both rapid test assays. The precision of both assays was determined from the confidence interval. The results were statistically analyzed by means and standard deviations, coefficients of variation, Spearman correlation, Wilcoxon and Bland-Altman tests, and Passing-Bablok regression. Compared with Tina-quant CRP, NycoCard CRP Whole Blood showed a correlation coefficient of R = 0.9838 and a precision of CV = 1.8759%, while Micros CRP had R = 0.9934 and CV = 0.9160%. Both rapid assays gave results consistent with Tina-quant CRP and are well suited for the rapid determination of CRP.
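    The coefficient of variation used above as the precision metric is simply the standard deviation of replicate measurements expressed as a percentage of their mean. A minimal sketch:

```python
import statistics

def cv_percent(replicates):
    # Coefficient of variation: sample standard deviation as a
    # percentage of the mean of replicate measurements.
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)
```

    Replicates of 9, 10 and 11 mg/L, for example, yield a CV of 10%; the sub-2% CVs reported above indicate much tighter replicate agreement.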

  3. Prefrontal cerebral blood volume patterns while playing video games--a near-infrared spectroscopy study.

    PubMed

    Nagamitsu, Shinichiro; Nagano, Miki; Yamashita, Yushiro; Takashima, Sachio; Matsuishi, Toyojiro

    2006-06-01

    Video game playing is an attractive form of entertainment among school-age children. Although this activity reportedly has many adverse effects on child development, these effects remain controversial. To investigate the effect of video game playing on regional cerebral blood volume, we measured cerebral hemoglobin concentrations using near-infrared spectroscopy in 12 normal volunteers consisting of six children and six adults. A Hitachi Optical Topography system was used to measure hemoglobin changes. For all subjects, the video game Donkey Kong was played on a Game Boy device. After spectroscopic probes were positioned on the scalp near the target brain regions, the participants were asked to play the game for nine periods of 15s each, with 15-s rest intervals between these task periods. Significant increases in bilateral prefrontal total-hemoglobin concentrations were observed in four of the adults during video game playing. On the other hand, significant decreases in bilateral prefrontal total-hemoglobin concentrations were seen in two of the children. A significant positive correlation between mean oxy-hemoglobin changes in the prefrontal region and those in the bilateral motor cortex area was seen in adults. Playing video games gave rise to dynamic changes in cerebral blood volume in both age groups, while the difference in the prefrontal oxygenation patterns suggested an age-dependent utilization of different neural circuits during video game tasks.
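    NIRS systems conventionally recover hemoglobin concentration changes from optical density changes at two (or more) wavelengths via the modified Beer-Lambert law. A simplified two-wavelength sketch; the extinction-coefficient matrix here is a placeholder, and the Hitachi Optical Topography system's actual processing is not described in the abstract:

```python
def mbll_two_wavelength(delta_od, extinction, effective_path_cm):
    # Solve the 2x2 modified Beer-Lambert system
    #   dOD_i = L * (eps_i_HbO * dHbO + eps_i_HbR * dHbR)
    # for the oxy- (dHbO) and deoxy-hemoglobin (dHbR) changes,
    # using the closed-form 2x2 matrix inverse.
    (a, b), (c, d) = extinction       # rows: wavelengths; cols: HbO, HbR
    det = (a * d - b * c) * effective_path_cm
    d_od1, d_od2 = delta_od
    d_hbo = (d * d_od1 - b * d_od2) / det
    d_hbr = (a * d_od2 - c * d_od1) / det
    return d_hbo, d_hbr
```

    Total hemoglobin change, the quantity reported in the study, is then the sum of the two recovered components. The identity extinction matrix used in testing is purely illustrative; real analyses use tabulated extinction coefficients for the instrument's wavelengths and a differential pathlength factor.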

  4. Evaluation of the Microsemi CRP, an automated hematology analyzer for rapid 3-part WBC differential and CRP using whole blood.

    PubMed

    Nomura, N; Saito, K; Ikeda, M; Yuasa, S; Pastore, M; Chabert, C; Kono, E; Sakai, A; Tanaka, H; Ikemoto, T; Takubo, T

    2015-08-01

    We evaluated the basic performance of the Microsemi CRP, a unique automated hematology analyzer that can simultaneously measure CBC, including a 3-part WBC differential (3-Diff), and CRP using whole blood treated with EDTA-2K anticoagulant. We found that it generally produced acceptable results for all parameters examined (repeatability, reproducibility, linearity, interference, carryover, and correlation) using control materials, fresh human whole blood, and serum samples. CBC data from the Microsemi CRP correlated well with the previous model, the Micros CRP200 (r ≧ 0.9), and with the routine analyzer, the ADVIA 2120i (r ≧ 0.989). Concerning the 3-Diff, both GRA (%) and LYM (%) showed excellent correlation coefficients between the Microsemi CRP and the Micros CRP200 (r ≧ 0.992) as well as the ADVIA 2120i (r ≧ 0.957). MON (%) showed good correlation between the Microsemi CRP and the Micros CRP200 (r = 0.959), but lower correlation between the Microsemi CRP and the ADVIA 2120i (r = 0.471). CRP data correlated well with the HITACHI 7600 (r ≧ 0.997) and the Micros CRP200 (r ≧ 0.997). From these findings, we conclude that the Microsemi CRP is a convenient laboratory analyzer for point-of-care testing (POCT), especially in the NICU or a primary care unit. © 2014 John Wiley & Sons Ltd.

  5. [Evaluation of Optium Xceed (Abbott) and One Touch Ultra (Lifescan) glucose meters].

    PubMed

    Coyne, S; Lacour, B; Hennequin-Le Meur, C

    2008-01-01

    In order to build a continuous quality improvement approach for the control of glucose meters in clinical divisions at Necker-Enfants Malades hospital, the analytical performance (precision and accuracy) of 2 glucose meters was evaluated in our laboratory according to SFBC recommendations. Fifty-six heparinized whole-blood specimens from patients and thirty-nine from healthy volunteers were analyzed on each of the two meters and compared to plasma glucose measured on the Roche Hitachi 917 system. The correlation coefficient was 0.938 for the Optium Xceed and 0.911 for the One Touch Ultra. However, 14.7% and 18.9% of the results (n = 95) for the Optium Xceed and the One Touch Ultra, respectively, were discordant, i.e. more than 20% different from the reference blood glucose concentration. Inaccuracy was greater at low glucose concentrations (< 5 mmol/L; 12/14 discrepant samples for the Optium Xceed and 16/19 for the One Touch Ultra). These data suggest a lack of accuracy, particularly at low glucose concentrations. Capillary blood glucose concentrations must therefore be interpreted with caution in the diagnosis of hypoglycemia and the treatment of unstable patients. Moreover, quality control of glucose meters (blood glucose determinations performed concurrently at the bedside and in the laboratory) is difficult to perform. This also raises questions about the responsibility for "point-of-care testing", an area still subject to discussion.
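    The discordance criterion used above, a meter result differing from the reference by more than 20%, can be expressed directly. A minimal sketch with illustrative values (the function name is our own, not part of the SFBC recommendations):

```python
def discordant_fraction(reference, meter, tolerance=0.20):
    # A meter reading is discordant when it deviates from the reference
    # plasma glucose by more than the given relative tolerance.
    n_bad = sum(1 for ref, m in zip(reference, meter)
                if abs(m - ref) / ref > tolerance)
    return n_bad / len(reference)
```

    With illustrative reference values of 5.0 and 10.0 mmol/L and meter readings of 6.5 and 10.5 mmol/L, the first reading deviates by 30% and is discordant, giving a discordant fraction of 0.5.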

  6. The CloudBoard Research Platform: an interactive whiteboard for corporate users

    NASA Astrophysics Data System (ADS)

    Barrus, John; Schwartz, Edward L.

    2013-03-01

    Over one million interactive whiteboards (IWBs) are sold annually worldwide, predominantly for classroom use, with few sales for corporate use. Unmet needs for corporate IWB use were investigated, and the CloudBoard Research Platform (CBRP) was developed to investigate and test technology for meeting these needs. The CBRP supports audio conferencing with shared remote drawing activity, casual capture of whiteboard activity for long-term storage and retrieval, use of standard formats such as PDF for easy import of documents via the web and email, and easy export of documents. Company RFID badges and key fobs provide secure access to documents at the board, and automatic logout occurs after a period of inactivity. Users manage their documents with a web browser. Analytics and remote device management are provided for administrators. The IWB hardware consists of off-the-shelf components (a Hitachi UST projector, SMART Technologies, Inc. IWB hardware, a Mac Mini, a Polycom speakerphone, etc.) and a custom occupancy sensor. Three back-end servers provide the web interface, document storage, and stroke and audio streaming. Ease of use, security, and robustness sufficient for internal adoption were achieved. Five of the 10 boards installed at various Ricoh sites have been in daily or weekly use for the past year, and total system downtime was less than an hour in 2012. Since the CBRP was installed, 65 registered users, 9 of whom use the system regularly, have created over 2600 documents.

  7. Correlation of salivary glucose level with blood glucose level in diabetes mellitus.

    PubMed

    Gupta, Shreya; Nayak, Meghanand T; Sunitha, J D; Dawar, Geetanshu; Sinha, Nidhi; Rallan, Neelakshi Singh

    2017-01-01

    Saliva is a unique fluid, which is important for the normal functioning of the oral cavity. Diabetes mellitus (DM) is a disease of absolute or relative insulin deficiency characterized by insufficient secretion of insulin by pancreatic beta-cells. The diagnosis of diabetes through blood is difficult in children, older adults, and debilitated or chronically ill patients, so diagnosis by analysis of saliva can be potentially valuable, as the collection of saliva is noninvasive, easier, and less technique-sensitive than blood collection. The aim of the study was to correlate blood glucose level (BGL) and salivary glucose level (SGL) in DM patients. A cross-sectional study was conducted in 120 patients, categorized as 40 controlled diabetics, 40 uncontrolled diabetics, and 40 healthy, age- and sex-matched individuals who constituted the controls. Blood and unstimulated saliva samples were collected from the patients at different intervals for fasting, random, and postprandial levels. These samples were then analyzed for glucose in blood and saliva using glucose oxidase/peroxidase reagent in a HITACHI 902® automatic analyzer, and the results were recorded. The mean SGLs were higher in the uncontrolled and controlled diabetic groups than in the nondiabetic group. A highly statistically significant correlation was found between fasting saliva glucose and fasting blood glucose in all the groups. With an increase in BGL, an increase in SGL was observed in patients with diabetes, suggesting that SGL can be used for monitoring glycemic level in DM.

  8. Development of the compact proton beam therapy system dedicated to spot scanning with real-time tumor-tracking technology

    NASA Astrophysics Data System (ADS)

    Umezawa, Masumi; Fujimoto, Rintaro; Umekawa, Tooru; Fujii, Yuusuke; Takayanagi, Taisuke; Ebina, Futaro; Aoki, Takamichi; Nagamine, Yoshihiko; Matsuda, Koji; Hiramoto, Kazuo; Matsuura, Taeko; Miyamoto, Naoki; Nihongi, Hideaki; Umegaki, Kikuo; Shirato, Hiroki

    2013-04-01

    Hokkaido University and Hitachi Ltd. have started joint development of the Gated Spot Scanning Proton Therapy with Real-Time Tumor-Tracking System by integrating real-time tumor tracking technology (RTRT) and the proton therapy system dedicated to discrete spot scanning techniques under the "Funding Program for World-Leading Innovative R&D on Science and Technology (FIRST Program)". In this development, we have designed the synchrotron-based accelerator system by using the advantages of the spot scanning technique in order to realize a more compact and lower cost proton therapy system than the conventional system. In the gated irradiation, we have focused on the issues to maximize irradiation efficiency and minimize the dose errors caused by organ motion. In order to understand the interplay effect between scanning beam delivery and target motion, we conducted a simulation study. The newly designed system consists of the synchrotron, beam transport system, one compact rotating gantry treatment room with robotic couch, and one experimental room for future research. To improve the irradiation efficiency, the new control function which enables multiple gated irradiations per synchrotron cycle has been applied and its efficacy was confirmed by the irradiation time estimation. As for the interplay effect, we confirmed that the selection of a strict gating width and scan direction enables formation of the uniform dose distribution.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Brunett, Acacia J.; Passerini, Stefano

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory (Argonne) participated in a two-year collaboration to modernize and update the probabilistic risk assessment (PRA) for the PRISM sodium fast reactor. At a high level, the primary outcome of the project was the development of a next-generation PRA that is intended to enable risk-informed prioritization of safety- and reliability-focused research and development. A central Argonne task during this project was a reliability assessment of passive safety systems, which included the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedbacks of the metal fuel core. Both systems were examined utilizing a methodology derived from the Reliability Method for Passive Safety Functions (RMPS), with an emphasis on developing success criteria based on mechanistic system modeling while also maintaining consistency with the Fuel Damage Categories (FDCs) of the mechanistic source term assessment. This paper provides an overview of the reliability analyses of both systems, including highlights of the FMEAs, the construction of best-estimate models, uncertain parameter screening and propagation, and the quantification of system failure probability. In particular, special focus is given to the methodologies used to perform the analysis of uncertainty propagation and to determine the likelihood of violating FDC limits. Additionally, important lessons learned are also reviewed, such as optimal sampling methodologies for the discovery of low-likelihood failure events and strategies for the combined treatment of aleatory and epistemic uncertainties.
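    Quantification of passive-system failure probability of the kind described above is typically done by propagating uncertain parameters through a best-estimate model and counting limit violations. A minimal Monte Carlo sketch; the model, samplers, and limit here are placeholders, not the RMPS-based PRISM analysis itself:

```python
import random

def failure_probability(model, samplers, limit, n_samples=10000, seed=1):
    # Simple Monte Carlo: draw each uncertain parameter from its sampler,
    # evaluate the best-estimate model, and count how often the result
    # exceeds the failure (e.g., Fuel Damage Category) limit.
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n_samples)
        if model([draw(rng) for draw in samplers]) > limit
    )
    return failures / n_samples
```

    Plain sampling like this becomes inefficient for the low-likelihood failure events the text mentions; that is exactly the motivation for the variance-reduction and optimized-sampling strategies discussed as lessons learned.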

  10. Repeatability of shear wave elastography in liver fibrosis phantoms—Evaluation of five different systems

    PubMed Central

    2018-01-01

    This study aimed to assess and validate the repeatability and agreement of quantitative elastography with novel shear wave methods on four individual tissue-mimicking liver fibrosis phantoms with different known Young's moduli. We used GE Logiq E9 2D-SWE, Philips iU22 ARFI (pSWE), Samsung TS80A SWE (pSWE), Hitachi Ascendus (SWM) and Transient Elastography (TE). Two individual investigators performed all measurements separately and in parallel. The methods were evaluated for inter- and intraobserver variability by intraclass correlation, coefficient of variation and limits of agreement using the median elastography value. All systems used in this study provided high repeatability of quantitative measurements in a liver fibrosis phantom and excellent inter- and intraobserver correlations. All four elastography platforms showed excellent intra- and interobserver agreement (interobserver correlation 0.981–1.000 and intraobserver correlation 0.987–1.000) and no significant difference in mean elasticity measurements for all systems, except for TE on phantom 4. All four liver fibrosis phantoms could be differentiated by quantitative elastography on all platforms (p<0.001). In the Bland-Altman analysis, the differences in measurements were larger for the phantoms with higher Young's moduli. All platforms had a coefficient of variation in the range 0.00–0.21 for all four phantoms, equivalent to low variance and high repeatability. PMID:29293527
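    The limits-of-agreement statistic used above follows Bland and Altman: the mean of the paired differences (the bias) plus or minus 1.96 standard deviations of those differences. A minimal sketch:

```python
import statistics

def bland_altman_limits(method_a, method_b):
    # Mean bias and 95% limits of agreement (bias +/- 1.96 SD of the
    # paired differences between two measurement methods).
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

    The widening of these limits for the stiffer phantoms, noted above, corresponds to a larger standard deviation of the paired differences at higher Young's moduli.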

  11. Improvement of conventional transcutaneous bilirubinometry results in term newborn infants.

    PubMed

    Felc, Zlata

    2005-05-01

    This prospective study was performed to determine a way to improve conventional transcutaneous bilirubinometry results in healthy term newborn infants. In 118 infants during phototherapy (group A) and 118 infants without phototherapy (group B), bilirubin determinations were done in duplicate using the Minolta AirShields Jaundice Meter type 101 (transcutaneous bilirubin index [TcB]) and the diazometric method on the Hitachi 717 Automated Analyzer (total reacting serum bilirubin [SeB]). In 112 infants (group C), bilirubin determinations were done in triplicate, using simple direct-reading photometry on the Moltronic Bilirubinometer (direct serum bilirubin [BiB]). A close correlation between TcB and SeB values was observed in group A (r = 0.69; p < 0.001) and in group B (r = 0.59; p < 0.001). The 95% confidence intervals of TcB readings corresponding to SeB were +/- 80.7 micromol/L in group A and +/- 76.9 micromol/L in group B, respectively. In group C, using a correctly calibrated BiB with adult sera containing bilirubin concentrations in the range 271 to 344 micromol/L, the 95% confidence intervals of parallel TcB and BiB readings corresponding to SeB were +/- 28 micromol/L. Parallel determinations of TcB and BiB in healthy term newborn infants give results almost identical to those of bilirubin determination by the laboratory method.

  12. Comparison of low and high frequency transducers in the detection of liver metastases.

    PubMed

    Schacherer, D; Wrede, C; Obermeier, F; Schölmerich, J; Schlottmann, K; Klebl, F

    2006-09-01

    To evaluate the benefit of the additional use of a high frequency ultrasound probe (7.5 MHz) in finding suspicious liver lesions compared to the examination using a 3.5-MHz transducer only. One hundred and fifty-seven patients with underlying malignant disease were examined with both transducers using one of three ultrasound machines (Siemens Sonoline Elegra, GE Healthcare Logic 9, or Hitachi EUB-8500). Findings on hepatic lesions were collected on a standardised documentation sheet and evaluated by descriptive statistics. Ninety-three patients (59.2% of all patients) showed no evident liver lesion on conventional ultrasound with the 3.5 MHz probe. In 29 patients (18.5%) new suspicious liver lesions were found by using the high frequency transducer. Thirteen of these 29 patients (44.8%) were suspected to suffer from diffuse infiltration of the liver with malignant lesions or at least 10 additional visible lesions. In 14 patients, no liver lesion had been known before high frequency ultrasound examination. The size of newly described liver lesions ranged from 2 mm to 1.5 cm. Time needed for the additional examination with the high frequency transducer ranged between 1 and 15 min with an average of 4.0 min. The additional use of a high frequency transducer in patients with underlying malignant disease slightly extends the examination time, but reveals new, potentially malignant hepatic lesions in almost every fifth patient.

  13. Absolute quantum yield measurement of powder samples.

    PubMed

    Moreno, Luis A

    2012-05-12

    Measurement of fluorescence quantum yield has become an important tool in the search for new solutions in the development, evaluation, quality control and research of illumination, AV equipment, organic EL materials, films, filters and fluorescent probes for the bio-industry. Quantum yield is calculated as the ratio of the number of photons emitted by a material to the number of photons it absorbs. The higher the quantum yield, the better the efficiency of the fluorescent material. For the measurements featured in this video, we will use the Hitachi F-7000 fluorescence spectrophotometer equipped with the Quantum Yield measuring accessory and the Report Generator program. All the information provided applies to this system. Measurement of quantum yield in powder samples is performed following these steps: 1. Generation of instrument correction factors for the excitation and emission monochromators. This is an important requirement for the correct measurement of quantum yield. It has been performed in advance for the full measurement range of the instrument and will not be shown in this video due to time limitations. 2. Measurement of integrating sphere correction factors. The purpose of this step is to take into consideration the reflectivity characteristics of the integrating sphere used for the measurements. 3. Reference and sample measurement using direct excitation and indirect excitation. 4. Quantum yield calculation using direct and indirect excitation. Direct excitation is when the sample directly faces the excitation beam, which is the normal measurement setup. However, because we use an integrating sphere, a portion of the emitted photons resulting from the sample fluorescence is reflected by the integrating sphere and re-excites the sample, so we need to take indirect excitation into consideration. 
This is accomplished by measuring the sample placed in the port facing the emission monochromator, calculating the indirect quantum yield and correcting the direct quantum yield calculation. 5. Corrected quantum yield calculation. 6. Chromaticity coordinates calculation using the Report Generator program. The Hitachi F-7000 Quantum Yield Measurement System offers advantages for this application, as follows: High sensitivity (S/N ratio 800 or better RMS; the signal is the Raman band of water measured under the following conditions: Ex wavelength 350 nm, band pass Ex and Em 5 nm, response 2 sec; the noise is measured at the maximum of the Raman peak). High sensitivity allows measurement of samples even with low quantum yield. Using this system we have measured quantum yields as low as 0.1 for a sample of salicylic acid and as high as 0.8 for a sample of magnesium tungstate. Highly accurate measurement with a dynamic range of 6 orders of magnitude allows for measurements of both sharp scattering peaks with high intensity and broad fluorescence peaks of low intensity under the same conditions. High measuring throughput and reduced light exposure of the sample, due to a high scanning speed of up to 60,000 nm/minute and an automatic shutter function. Measurement of quantum yield over a wide wavelength range from 240 to 800 nm. Accurate quantum yield measurements are the result of collecting instrument spectral response and integrating sphere correction factors before measuring the sample. A large selection of calculated parameters is provided by dedicated and easy-to-use software. During this video we will measure sodium salicylate in powder form, which is known to have a quantum yield value of 0.4 to 0.5.
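    The photons-emitted-over-photons-absorbed ratio described above can be sketched numerically. This is a minimal, generic illustration only, assuming hypothetical integrated photon counts; the function and numbers are not part of the Hitachi software:

```python
# Direct-excitation quantum yield from integrated photon counts: photons
# emitted divided by photons absorbed. The absorbed count is estimated as the
# drop in the scattered excitation peak relative to a non-absorbing reference.
# All values are hypothetical illustrations.
def quantum_yield(ref_scatter, sample_scatter, sample_emission):
    absorbed = ref_scatter - sample_scatter   # photons removed from the beam
    emitted = sample_emission                 # photons in the emission band
    return emitted / absorbed

qy = quantum_yield(ref_scatter=1.00e6,     # reference scatter integral
                   sample_scatter=0.40e6,  # sample scatter, direct excitation
                   sample_emission=2.7e5)  # integrated emission counts
print(round(qy, 2))  # → 0.45, within the 0.4-0.5 range quoted for sodium salicylate
```

The indirect-excitation correction described in the text would adjust this value using a second measurement with the sample facing the emission monochromator.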

  14. Elastography in the differential diagnosis of thyroid nodules in Hashimoto thyroiditis.

    PubMed

    Şahin, Mustafa; Çakal, Erman; Özbek, Mustafa; Güngünes, Aşkin; Arslan, Müyesser Sayki; Akkaymak, Esra Tutal; Uçan, Bekir; Ünsal, Ilknur Öztürk; Bozkurt, Nujen Çolak; Delibaşi, Tuncay

    2014-08-01

    Elastography is a method that assesses the risk of malignancy and provides information about the degree of hardness of a tissue. Hashimoto's thyroiditis, with autoimmune lymphocytic infiltration and fibrosis, is a very common disease that can change the hardness of the tissue. The diagnostic value of elastography in this group of patients has not previously been reported. In our study, we aimed to determine the diagnostic value of elastography in 283 patients (255 female, 28 male) with Hashimoto's thyroiditis. Elastography score and strain index were measured with real-time ultrasound elastography (Hitachi® EUB 7000 HV machine with a 13 MHz linear transducer). Malignant nodules had higher elastography score (ES) and strain index (SI) values. ES ≥3 was observed in 16/20 malignant and 130/263 benign nodules. The area under the curve (AUC) for the elastography score was 0.72 (p = 0.001), and the AUC for the strain index was 0.77 (p < 0.0001). Accordingly, our study suggests that the strain index reflects malignancy better than the elastography score. We conclude that an elastography score ≥3 provides 80 % sensitivity and 50.6 % specificity for diagnosing malignancy. For the strain index, we found a cut-off point of 2.45 (72.2 % sensitivity and 70 % specificity). We detected a lower cut-off point for SI in Hashimoto patients, although sensitivity and specificity decrease in this population.
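    The cut-off performance can be reproduced directly from the nodule counts in the abstract (16/20 malignant and 130/263 benign nodules at ES ≥3). A minimal sketch:

```python
# Sensitivity and specificity of the ES >= 3 cut-off, recomputed from the
# counts reported in the abstract.
def sensitivity_specificity(tp, pos, fp, neg):
    """tp: positives at/above cut-off, pos: all positives,
    fp: negatives at/above cut-off, neg: all negatives."""
    sensitivity = tp / pos          # fraction of malignant nodules detected
    specificity = (neg - fp) / neg  # fraction of benign nodules below cut-off
    return sensitivity, specificity

sens, spec = sensitivity_specificity(tp=16, pos=20, fp=130, neg=263)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
# → sensitivity = 80.0%, specificity = 50.6%
```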

  15. New paleoradiological investigations of ancient human remains from North West Lombardy archaeological excavations.

    PubMed

    Licata, Marta; Borgo, Melania; Armocida, Giuseppe; Nicosia, Luca; Ferioli, Elena

    2016-03-01

    Since its birth in 1895, radiology has been used to study ancient mummies. The purpose of this article is to present paleoradiological investigations conducted on several medieval human remains from Varese province. Anthropological (generic identification) and paleopathological analyses were carried out with the support of diagnostic imaging (X-ray and CT scans). Human remains were discovered during excavations of medieval archaeological sites in northwest Lombardy. Classical physical anthropological methods were used for the macroscopic identification of the human remains. X-ray and CT scans were performed on the same scanner (16-slice Hitachi Eclos 16 X-ray equipment). Radiological analysis permitted investigation of (1) sex, (2) age at death, (3) type of trauma, (4) therapeutic interventions and (5) osteomas in ancient human remains. In particular, X-ray and CT examinations showed dimorphic facial traits on a mummified skull, and the same radiological approaches allowed determination of the age at death from a mummified lower limb. CT analyses allowed investigation of different types of traumatic lesions in skulls and postcranial skeletal portions and reconstruction of the gait and functional outcome of a fractured femur. Moreover, one case of possible Gardner's syndrome (GS) was postulated from the observation of multiple osteomas in an ancient skull. Among the medical tests available to the clinician, radiology is the most appropriate first-line procedure for a diagnostic approach to ancient human remains because it can be performed without causing any significant damage to the specimen.

  16. Effect of electric potential and current on mandibular linear measurements in cone beam CT.

    PubMed

    Panmekiate, S; Apinhasmit, W; Petersson, A

    2012-10-01

    The purpose of this study was to compare mandibular linear distances measured from cone beam CT (CBCT) images produced by different radiographic parameter settings (peak kilovoltage and milliampere value). 20 cadaver hemimandibles with edentulous ridges posterior to the mental foramen were embedded in clear resin blocks and scanned by a CBCT machine (CB MercuRay(TM); Hitachi Medico Technology Corp., Chiba-ken, Japan). The radiographic parameters comprised four peak kilovoltage settings (60 kVp, 80 kVp, 100 kVp and 120 kVp) and two milliampere settings (10 mA and 15 mA). A 102.4 mm field of view was chosen. Each hemimandible was scanned 8 times with 8 different parameter combinations, resulting in 160 CBCT data sets. On the cross-sectional images, six linear distances were measured. To assess intraobserver variation, the 160 data sets were remeasured after 2 weeks. The measurement precision was calculated using Dahlberg's formula. With the same peak kilovoltage, the measurements yielded by different milliampere values were compared using the paired t-test. With the same milliampere value, the measurements yielded by different peak kilovoltage settings were compared using analysis of variance. A difference was considered significant when p < 0.05. Measurement precision varied from 0.03 mm to 0.28 mm. No significant differences in the distances were found among the different radiographic parameter combinations. Based upon the specific machine used in the present study, low peak kilovoltage and milliampere values might be used for linear measurements in the posterior mandible.
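    Dahlberg's formula used for the precision estimate is d = sqrt(Σdᵢ² / 2n), where dᵢ are the differences between duplicate measurements. A minimal Python sketch with hypothetical paired measurements (not the study's data):

```python
# Dahlberg's formula for measurement precision from duplicate measurements:
# d = sqrt( sum((x1_i - x2_i)^2) / (2n) ).
import math

def dahlberg(first, second):
    n = len(first)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(first, second)) / (2 * n))

m1 = [10.10, 8.75, 12.30, 9.60]   # first measurement session (mm), hypothetical
m2 = [10.15, 8.70, 12.45, 9.55]   # remeasurement after 2 weeks (mm), hypothetical
print(round(dahlberg(m1, m2), 3))
```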

  17. Nano-enabled tribological thin film coatings: global patent scenario.

    PubMed

    Sivudu, Kurva S; Mahajan, Yashwant R; Joshi, Shrikant V

    2014-01-01

    The aim of this paper is to present the current status and future prospects of nano-enabled tribological thin film coatings based on a worldwide patent landscape analysis. The study also presents an overview of technological trends through a state-of-the-art literature analysis, including a survey of corporate websites. Nanostructured tribological coatings encompass a wide spectrum of nanoscale microstructures, including nanocrystalline, nanolayered, nano-multilayered, nanocomposite and nanogradient structures or their unique combinations, which are composed of single or multi-component phases. The distinct microstructural features of the coatings impart outstanding tribological properties combined with multifunctional attributes to the coated components. Their unique combination of remarkable properties makes them ideal candidates for a wide range of applications in diverse fields such as cutting and metalworking tools, biomedical devices, automotive engine components, wear parts and hard disc drives. The patent landscape analysis revealed that nano-enabled tribological thin film coatings have significant potential for commercial applications, given that corporate industry holds the lion's share of patenting activity. The largest patent portfolio is held by Japan, followed by the USA, Germany, Sweden and China. The prominent players in this field include Mitsubishi Materials Corp., Sandvik Aktiebolag, Hitachi Ltd., Sumitomo Electric Industries Ltd. and OC Oerlikon Corp. The outstanding potential of nanostructured thin film tribological coatings is yet to be fully unravelled, and immense opportunities therefore remain for microstructurally engineered novel coatings to enhance performance and functionality manyfold.

  18. Defect window analysis by using SEM-contour based shape quantifying method for sub-20nm node production

    NASA Astrophysics Data System (ADS)

    Hibino, Daisuke; Hsu, Mingyi; Shindo, Hiroyuki; Izawa, Masayuki; Enomoto, Yuji; Lin, J. F.; Hu, J. R.

    2013-04-01

    The impact on yield loss of systematic defects that remain after Optical Proximity Correction (OPC) modeling has increased, and achieving an acceptable yield has become more difficult in leading technology nodes beyond 20 nm. Furthermore, the process window has become narrower because of the complexity of IC design and reduced process margin. In the past, systematic defects were inspected by human eyes. However, judgment by human eyes is sometimes unstable and inaccurate. Moreover, an enormous amount of time and labor must be expended on one-by-one judgment of several thousand hot-spot defects. To overcome these difficulties and improve yield and manufacturability, an automated system that can quantify shape differences with high accuracy and speed is needed. If the automated system achieves this goal, inspection points could be increased to obtain higher yield. A Defect Window Analysis (DWA) system developed by Hitachi High-Technologies, which uses high-precision contour extraction from SEM images on real silicon and a quantifying method that automatically calculates the difference between defect and non-defect patterns, has been applied to defect judgment in place of judgment by human eyes. The DWA result, which describes process behavior, can be fed back to design, OPC or mask. This new methodology and evaluation results are presented in detail in this paper.
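    As a rough illustration of contour-based shape quantification (this is not Hitachi High-Technologies' actual DWA algorithm), one can score the difference between a defect contour and a reference contour as a mean nearest-point distance; the contours and the metric here are hypothetical assumptions:

```python
# A simple shape-difference score between two contours: the mean distance from
# each point of the defect contour to the nearest point of the reference
# contour. Contours are lists of (x, y) points; all values are hypothetical.
import math

def contour_difference(defect, reference):
    def nearest(p, pts):
        return min(math.dist(p, q) for q in pts)
    return sum(nearest(p, reference) for p in defect) / len(defect)

reference = [(x / 10, 0.0) for x in range(11)]        # straight reference edge
defect = [(x / 10, 0.05 if 4 <= x <= 6 else 0.0)      # local bump (a "defect")
          for x in range(11)]
print(round(contour_difference(defect, reference), 4))
```

A score of zero means the contours coincide; larger values flag stronger shape deviations for automated judgment.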

  19. Correlation of salivary glucose level with blood glucose level in diabetes mellitus

    PubMed Central

    Gupta, Shreya; Nayak, Meghanand T; Sunitha, JD; Dawar, Geetanshu; Sinha, Nidhi; Rallan, Neelakshi Singh

    2017-01-01

    Background: Saliva is a unique fluid that is important for the normal functioning of the oral cavity. Diabetes mellitus (DM) is a disease of absolute or relative insulin deficiency characterized by insufficient secretion of insulin by pancreatic beta-cells. Diagnosing diabetes through blood is difficult in children, older adults and debilitated or chronically ill patients, so diagnosis by analysis of saliva can be valuable because the collection of saliva is noninvasive, easier and less technique-sensitive than that of blood. The aim of the study was to correlate blood glucose level (BGL) and salivary glucose level (SGL) in DM patients. Methodology: A cross-sectional study was conducted in 120 patients: 40 controlled diabetics, 40 uncontrolled diabetics and 40 healthy, age- and sex-matched individuals who constituted the controls. Blood and unstimulated saliva samples were collected from the patients at different intervals for fasting, random and postprandial levels. These samples were then analyzed for glucose in blood and saliva using glucose oxidase/peroxidase reagent in a HITACHI 902® automatic analyzer, and the results were recorded. Results: The mean SGLs were higher in the uncontrolled and controlled diabetic groups than in the nondiabetic group. A highly statistically significant correlation was found between fasting saliva glucose and fasting blood glucose in all the groups. Conclusion: With an increase in BGL, an increase in SGL was observed in patients with diabetes, suggesting that SGL can be used for monitoring glycemic level in DM. PMID:29391704
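    The correlation reported above is a Pearson correlation of paired measurements. A minimal sketch, with hypothetical glucose values rather than data from the study:

```python
# Pearson correlation coefficient of paired glucose measurements, as used to
# relate salivary and blood glucose levels. All values are hypothetical.
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

blood = [92, 110, 145, 180, 220, 260]    # mg/dL, hypothetical fasting BGLs
saliva = [1.1, 1.4, 2.0, 2.6, 3.3, 4.1]  # mg/dL, hypothetical fasting SGLs
print(round(pearson_r(blood, saliva), 3))
```

An r near +1 indicates that salivary glucose rises almost linearly with blood glucose, which is the basis for using SGL as a monitoring surrogate.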

  20. Real-Time Elastography Visualization and Histopathological Characterization of Rabbit Atherosclerotic Carotid Arteries.

    PubMed

    Wang, ZhenZhen; Liu, NaNa; Zhang, LiFeng; Li, XiaoYing; Han, XueSong; Peng, YanQing; Dang, MeiZheng; Sun, LiTao; Tian, JiaWei

    2016-01-01

    To evaluate the feasibility of non-invasive vascular real-time elastography (RTE) in visualizing the composition of rabbit carotid atherosclerotic plaque as determined by histopathology, a rabbit model of accelerated carotid atherosclerosis was used. Thirty rabbits were randomly divided into two groups of 15 rabbits each. The first group was fed a cholesterol-rich diet and received balloon-induced injury to the left common carotid artery endothelium, whereas the second group only received a cholesterol-rich diet. The rabbits were all examined in vivo with HITACHI non-invasive vascular real-time elastography (Hi-RTE) at baseline and at 12 wk, and results from the elastography were compared with American Heart Association histologic classifications. Hi-RTE and the American Heart Association histologic classifications had good agreement, with a weighted Cohen's kappa (95% confidence interval) of 0.785 (0.649-0.920). Strains of segmented plaques that were stained in different colors were statistically different (p < 0.0001). The sensitivity and specificity of elastograms for detecting a lipid core were 95.5% and 61.5%, respectively, and the area under the receiver operating characteristic curve was 0.789, with a 95% confidence interval of 0.679 to 0.876. This study is the first to indicate the feasibility of utilizing Hi-RTE to visualize normal and atherosclerotic rabbit carotid arteries non-invasively. This affordable and reliable method can be widely applied in research on both animal and human peripheral artery atherosclerosis. Copyright © 2016 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
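    The agreement statistic reported above is a weighted Cohen's kappa. A minimal sketch of the linear-weighted variant, computed on a hypothetical rating cross-table (the study's own cross-table is not given in the abstract):

```python
# Linear-weighted Cohen's kappa for agreement between two ordinal ratings.
# matrix[i][j] counts items rated category i by one method and j by the other.
# The example matrix is hypothetical.
def weighted_kappa(matrix):
    k = len(matrix)
    n = sum(sum(row) for row in matrix)
    row_tot = [sum(row) for row in matrix]
    col_tot = [sum(matrix[i][j] for i in range(k)) for j in range(k)]
    po = pe = 0.0
    for i in range(k):
        for j in range(k):
            w = 1 - abs(i - j) / (k - 1)               # linear weights
            po += w * matrix[i][j] / n                 # weighted observed agreement
            pe += w * row_tot[i] * col_tot[j] / n**2   # weighted chance agreement
    return (po - pe) / (1 - pe)

m = [[20, 3, 0],
     [4, 15, 2],
     [1, 3, 12]]  # hypothetical Hi-RTE vs. histology cross-table
print(round(weighted_kappa(m), 3))
```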

  1. Multicentre physiological reference intervals for serum concentrations of immunoglobulins A, G and M, complement C3c and C4 measured with Tina-Quant reagents systems.

    PubMed

    Fuentes-Arderiu, Xavier; Alonso-Gregorio, Eduardo; Alvarez-Funes, Virtudes; Ambrós-Marigómez, Carmen; Coca-Fábregas, Lluís; Cruz-Placer, Marta; Díaz-Fernández, Julián; Pinel-Julián, María Pilar; Gutiérrez-Cecchini, Beatriz; Herrero-Bernal, Pilar; Sempere-Alcocer, Marcos; García-Caballero, Francisca; Del Mar Larrea-Ortiz-Quintana, María; La-Torre-Marcellán, Pedro; Del Señor López-Vélez, María; Mar-Medina, Carmen; Martín-Oncina, Javier; Rodríguez-Hernández, María Victoria; Romero-Sotomayor, María Victoria; Serrano-López, Cándido; Sicilia-Enríquez-de-Salamanca, Adolfo; Velasco-Romero, Ana María; Juvé-Cuxart, Santiago

    2007-01-01

    Clinical laboratories seeking accreditation for compliance with ISO 15189:2003 need to demonstrate that the physiological reference intervals communicated to all users of the laboratory service are appropriate for the patient population served and for the measurement systems used. In the case of immunological quantities, few articles have been published in peer-reviewed journals. A total of 21 clinical laboratories in different regions of Spain collaborated in identifying reference individuals and determining adult reference intervals for some immunological quantities measured using RD/Hitachi Modular Analytics analysers and Tina-Quant reagent systems. These immunological quantities are the mass concentrations of immunoglobulin A, immunoglobulin G, immunoglobulin M, complement C3c and complement C4 in serum. All the logistic work was carried out in co-operation with the supplier of the reagents and analysers (Roche Diagnostics España, S.L., Sant Cugat del Vallès, Catalonia, Spain). From the set of reference values obtained by each laboratory, multicentre reference limits were estimated non-parametrically. The reference intervals estimated in this study for concentrations of serum components under consideration are: complement C3c, 0.62-1.64 g/L for women and men; complement C4, 0.14-0.72 g/L for women and men; immunoglobulin A, 0.89-4.80 g/L for women and men; immunoglobulin G, 6.5-14.3 g/L for women and men; and immunoglobulin M, 0.48-3.38 g/L for women and 0.41-2.46 g/L for men.
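    The non-parametric estimation mentioned above takes the reference interval as the central 95% of the reference values, i.e. the 2.5th and 97.5th percentiles. A minimal sketch, using hypothetical IgG concentrations rather than the study's data:

```python
# Non-parametric reference interval: the 2.5th and 97.5th percentiles of the
# reference values, estimated by rank with linear interpolation.
import random

def reference_interval(values, lower=0.025, upper=0.975):
    xs = sorted(values)
    n = len(xs)
    def percentile(p):
        r = p * (n - 1)          # fractional rank
        lo = int(r)
        hi = min(lo + 1, n - 1)
        return xs[lo] + (r - lo) * (xs[hi] - xs[lo])
    return percentile(lower), percentile(upper)

random.seed(1)
igg = [random.gauss(10.4, 2.0) for _ in range(240)]  # hypothetical IgG, g/L
lo, hi = reference_interval(igg)
print(f"{lo:.1f}-{hi:.1f} g/L")
```

Guidelines generally recommend at least 120 reference individuals per partition for this percentile-based approach.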

  2. Development and Evaluation of a New Creatine Kinase MB Mass Determination Assay Using a Latex Agglutination Turbidimetric Immunoassay with an Automated Analyzer.

    PubMed

    Hoshino, Tadashi; Hanai, Kazuma; Tanimoto, Kazuhito; Nakayama, Tomohiro

    2016-01-01

    The diagnosis of myocardial infarction (MI) in patients presenting to the emergency department represents a clinical challenge. It is known that the creatine kinase-MB isoenzyme (CK-MB) is present in soluble cell fractions of cardiac muscle, and injury to those cells results in an increase of CK-MB in the blood. Therefore, CK-MB is a suitable clinical biomarker of myocardial infarction. To measure CK-MB mass rapidly and easily, we developed the new reagent 'L-type Wako CK-MB mass' (L-CK-MB mass) for the latex agglutination turbidimetric immunoassay method. Using a Hitachi LABOSPECT 008, we evaluated the performance of this assay as a method for quantifying CK-MB mass, and we compared the measurement of the serum CK-MB mass concentration with this assay to that obtained using an electrochemiluminescence immunoassay (ECLIA). A dilution test showed linearity from 5 μg/L to 190 μg/L, and the limit of quantification of the L-CK-MB mass assay was 3.0 μg/L. The within-run CV and between-day CV were 1.0 - 4.5% and 1.8 - 4.4%, respectively. The serum CK-MB mass concentration determined using the L-CK-MB mass assay was reliably and strongly correlated with that determined using ECLIA (n = 163, r = 0.999, y = 0.977x + 0.307). The L-CK-MB mass assay is able to specifically determine CK-MB mass and is a very useful method for the accurate measurement of CK-MB mass in routine clinical analyses.
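    The y = 0.977x + 0.307 relation above is a regression line from a method comparison. A minimal ordinary least-squares sketch, using hypothetical paired assay results rather than the study's data (the abstract does not state which regression model was used):

```python
# Ordinary least-squares line for a method comparison: new assay (y) against
# the comparison method (x). All paired values are hypothetical (ug/L).
def ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

x = [5, 20, 50, 90, 140, 190]              # ECLIA results, hypothetical
y = [5.2, 19.8, 49.1, 88.0, 137.2, 186.0]  # L-CK-MB mass results, hypothetical
slope, intercept = ols(x, y)
print(f"y = {slope:.3f}x + {intercept:.3f}")
```

A slope near 1 and an intercept near 0 indicate that the two methods are interchangeable across the tested range.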

  3. Traceability Assessment and Performance Evaluation of Results for Measurement of Abbott Clinical Chemistry Assays on 4 Chemistry Analyzers.

    PubMed

    Lim, Jinsook; Song, Kyung Eun; Song, Sang Hoon; Choi, Hyun-Jung; Koo, Sun Hoe; Kwon, Gye Choel

    2016-05-01

    The traceability of clinical results to internationally recognized and accepted reference materials and reference measurement procedures has become increasingly important. Therefore, the establishment of traceability has become a mandatory requirement for all in vitro diagnostic devices. The objectives were to evaluate the traceability of the Abbott Architect c8000 system (Abbott Laboratories, Abbott Park, Illinois), consisting of calibrators and reagents, across 4 different chemistry analyzers, and to evaluate its general performance on the Toshiba 2000FR NEO (Toshiba Medical Systems Corporation, Otawara-shi, Tochigi-ken, Japan). For the assessment of traceability, secondary reference materials were evaluated 5 times, and then bias was calculated. Precision, linearity, and carryover were determined according to the guidelines of the Clinical and Laboratory Standards Institute (Wayne, Pennsylvania). The biases from the 4 different analyzers ranged from -2.33% to 2.70% on the Toshiba 2000FR NEO, -2.33% to 5.12% on the Roche Hitachi 7600 (Roche Diagnostics International, Basel, Switzerland), -0.93% to 2.87% on the Roche Modular, and -2.16% to 2.86% on the Abbott Architect c16000. The total coefficients of variation of all analytes were less than 5%. The coefficients of determination (R(2)) were more than 0.9900. The carryover rate ranged from -0.54% to 0.17%. Abbott clinical chemistry assays met the performance criteria based on desirable biological variation for precision, bias, and total error. They also showed excellent linearity and carryover. Therefore, these clinical chemistry assays were found to be accurate and reliable and are readily applicable on the various platforms used in this study.
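    The bias figures above are percentage deviations of measured values from certified reference values. A minimal sketch of that calculation, with hypothetical replicate results and reference value:

```python
# Percentage bias of a measured result against a certified reference value,
# the quantity used in the traceability assessment. All values are hypothetical.
def percent_bias(measured, reference):
    return (measured - reference) / reference * 100

# e.g., mean of 5 replicate measurements vs. the certified concentration
replicates = [4.9, 5.1, 5.0, 5.2, 5.05]  # hypothetical replicate results
certified = 5.0                          # hypothetical certified value
mean = sum(replicates) / len(replicates)
print(f"bias = {percent_bias(mean, certified):+.2f}%")
```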

  4. Adenosine Deaminase activity and HLA-DRB as diagnostic markers for Rheumatoid Arthritis.

    PubMed

    Valadbeigi, Shirin; Ebrahimi-Rad, Mina; Khatami, Shohreh; Akhbari, Hadi; Saghiri, Reza

    2018-04-05

    Rheumatoid arthritis (RA) is a chronic multisystemic disorder with an unclarified etiopathology. Although several markers have been presented for the recognition of RA, none of them has been specific. New markers such as HLA typing and the activity of adenosine deaminase (ADA) isoenzymes could be useful and specific. The aim of this study was to evaluate the pattern of ADA isoenzyme activity and HLA typing in both RA patients and healthy cases. Blood samples were collected from 55 RA patients and 60 healthy subjects over a period of 6 months. Levels of C-reactive protein (CRP), rheumatoid factor (RF) and ADA (ADA1, ADA2, total ADA) were measured using an AVITEX kit and a HITACHI auto analyzer. In addition, HLA-DRB1*01, *04 and *10 were detected using PCR-SSP. ADA activity, particularly the ADA2 level, was significantly higher in the RA group (P<0.05). The concentrations of tADA in patients positive for RF and CRP were significantly higher (P<0.05). The allele prevalence of DRB1*10 and *01 was significantly higher in RA patients (8.3% and 13.1%, respectively) compared with the control group (2.51% and 5.5%, respectively) (P<0.05). The calculated sensitivities of the diagnostic tests in this study were CRP 75%, RF 80% and ADA 84%, and the specificities were RF 90%, ADA 83% and CRP 72%. Increased tADA levels and the frequency of DRB1*10 and *01 contribute to susceptibility to RA. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  5. New drilling rigs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tubb, M.

    1981-02-01

    Friede and Goldman Ltd. of New Orleans, Louisiana has a successful drilling rig, the L-780 jack-up series. The triangular-shaped drilling vessel measures 180 x 176 ft and is equipped with three 352 ft legs including spud cans. It is designed to work in up to 250 ft waters and drill to 20,000 ft depths. The unit is scheduled to begin initial drilling operations in the Gulf of Mexico for Arco. Design features are included for the unit. Davie Shipbuilding Ltd. has entered the Mexican offshore market with the signing of a $40,000,000 Canadian contract for a jack-up to work in 300 ft water depths. Baker Marine Corporation has contracted with the People's Republic of China for construction of two self-elevating jack-ups. The units will be built for Magnum Marine, headquartered in Houston. Details for the two rigs are given. Santa Fe International Corporation has ordered a new jack-up rig to work initially in the Gulf of Suez. The newly ordered unit, Rig 136, will be the company's fourth offshore drilling rig now being built in the Far East. Temple Drilling Company has signed a construction contract with Bethlehem Steel for a jack-up to work in 200 ft water depths. Penrod Drilling Company has ordered two additional cantilever-type jack-ups from Hitachi Shipbuilding and Engineering Co. Ltd. of Japan. Two semi-submersibles, capable of working in up to 2000 ft water depths, have been ordered by two Liberian companies. Details for these rigs are included. (DP)

  6. SU-F-T-156: Monte Carlo Simulation Using TOPAS for Synchrotron Based Proton Discrete Spot Scanning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moskvin, V; Pirlepesov, F; Tsiamas, P

    Purpose: This study provides an overview of the design and commissioning of the Monte Carlo (MC) model of the spot-scanning proton therapy nozzle and its implementation for patient plan simulation. Methods: The Hitachi PROBEAT V scanning nozzle was simulated based on vendor specifications using the TOPAS extension of the Geant4 code. FLUKA MC simulation was also utilized to provide supporting data for the main simulation. Validation of the MC model was performed using vendor-provided data and measurements collected during acceptance/commissioning of the proton therapy machine. Actual patient plans using CT-based treatment geometry were simulated and compared to the dose distributions produced by the treatment planning system (Varian Eclipse 13.6) and patient quality assurance measurements. In-house MATLAB scripts are used for converting DICOM data into TOPAS input files. Results: Comparison analysis of integrated depth doses (IDDs), therapeutic ranges (R90), and spot shapes/sizes at different distances from the isocenter indicates good agreement between MC and measurements. R90 agreement is within 0.15 mm across all energy tunes. IDD and spot shape/size differences are within the statistical error of simulation (less than 1.5%). The MC simulated data, validated with physical measurements, were used for the commissioning of the treatment planning system. Patient geometry simulations were conducted based on the Eclipse-produced DICOM plans. Conclusion: The treatment nozzle and standard option beam model were implemented in the TOPAS framework to simulate a highly conformal discrete spot-scanning proton beam system.
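    The R90 metric compared above is the depth, beyond the Bragg peak, at which the integrated depth dose falls to 90% of its maximum. A minimal sketch of that lookup with linear interpolation; the depth-dose samples are hypothetical, not from the commissioning data:

```python
# Distal therapeutic range R90: the depth past the Bragg peak where the
# integrated depth dose (IDD) drops to 90% of its maximum, found by linear
# interpolation between samples. All values are hypothetical.
def r90(depths, doses):
    dmax = max(doses)
    peak = doses.index(dmax)
    target = 0.9 * dmax
    for i in range(peak, len(doses) - 1):
        d1, d2 = doses[i], doses[i + 1]
        if d1 >= target > d2:
            # interpolate between the two samples that bracket 90% of max
            frac = (d1 - target) / (d1 - d2)
            return depths[i] + frac * (depths[i + 1] - depths[i])
    return None

depths = [0, 2, 4, 6, 8, 10, 12]      # depth in water (cm), hypothetical
doses = [30, 32, 36, 45, 100, 40, 5]  # relative IDD, hypothetical
print(round(r90(depths, doses), 2))   # → 8.33
```

Agreement of simulated and measured R90 within 0.15 mm, as reported above, means the two curves cross the 90% level at nearly identical depths.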

  7. Missed detection of significant positive and negative shifts in gentamicin assay: implications for routine laboratory quality practices.

    PubMed

    Koerbin, Gus; Liu, Jiakai; Eigenstetter, Alex; Tan, Chin Hon; Badrick, Tony; Loh, Tze Ping

    2018-02-15

    A product recall was issued for the Roche/Hitachi Cobas Gentamicin II assays on 25th May 2016 in Australia, after a 15-20% positive analytical shift was discovered. Laboratories were advised to employ the Thermo Fisher Gentamicin assay as an alternative. Following the reintroduction of the revised assay on 12th September 2016, a second reagent recall was made on 20th March 2017 after the discovery of a 20% negative analytical shift due to an erroneous instrument adjustment factor. The practices of an index laboratory were examined to determine how the analytical shifts evaded detection by routine internal quality control (IQC) and external quality assurance (EQA) systems. The ability of patient result-based approaches, including the moving average (MovAvg) and moving sum of outliers (MovSO) approaches, to detect these shifts was examined. Internal quality control data of the index laboratory were acceptable prior to the product recall. The practice of adjusting the IQC target following a change in assay method resulted in the missed negative shift when the revised Roche assay was reintroduced. While the EQA data of the Roche subgroup showed a clear negative bias relative to other laboratory methods, the results were considered a possible 'matrix effect'. The MovAvg method detected the positive shift before the product recall. The MovSO did not detect the negative shift in the index laboratory but did so in another laboratory 5 days before the second product recall. There are gaps in current laboratory quality practices that leave room for analytical errors to evade detection.
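    The MovAvg approach mentioned above tracks a sliding-window mean of consecutive patient results and flags it when it leaves control limits. A minimal sketch; the window size, limits and result stream are illustrative assumptions, not the paper's parameters:

```python
# Patient-result moving average for analytical shift detection: compute a
# sliding-window mean and flag windows outside the control limits.
def moving_average_flags(results, window, lo, hi):
    flags = []
    for i in range(window, len(results) + 1):
        avg = sum(results[i - window:i]) / window
        flags.append((i, round(avg, 2), not lo <= avg <= hi))
    return flags

baseline = [5.0, 4.8, 5.2, 5.1, 4.9, 5.0]   # stable patient results, hypothetical
shifted = [x * 1.2 for x in baseline]       # a 20% positive analytical shift
stream = baseline + shifted
for idx, avg, out in moving_average_flags(stream, window=4, lo=4.5, hi=5.5):
    if out:
        print(f"result #{idx}: MovAvg {avg} outside 4.5-5.5 -> flag")
```

Because each window mixes pre- and post-shift results, the alarm fires a few results after the shift begins; window size trades detection speed against false alarms.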

  8. High throughput web inspection system using time-stretch real-time imaging

    NASA Astrophysics Data System (ADS)

    Kim, Chanju

    Photonic time-stretch is a novel technology that enables the capture of fast, rare and non-repetitive events. It operates in real time, with the ability to record over long periods while maintaining fine temporal resolution. The powerful properties of photonic time-stretch have already been employed in various fields of application such as analog-to-digital conversion, spectroscopy, laser scanning and microscopy. Further expanding its scope, we fully exploit the time-stretch technology to demonstrate a high throughput web inspection system. Web inspection, namely surface inspection, is a nondestructive evaluation method that is crucial for semiconductor wafer and thin film production. We report a dark-field web inspection system with a line scan speed of 90.9 MHz, which is up to 1000 times faster than conventional inspection instruments. The manufacturing of high quality semiconductor wafers and thin films may directly benefit from this technology, as it can easily locate defects with an area of less than 10 μm x 10 μm at a maximum web flow speed of 1.8 km/s. The thesis provides an overview of our web inspection technique, followed by a description of the photonic time-stretch technique, which is the keystone of our system. A detailed explanation of each component is covered to provide a quantitative understanding of the system. Finally, imaging results from a hard-disk sample and flexible films are presented along with a performance analysis of the system. This project was the first application of time-stretch to industrial inspection, and was conducted under financial support and with close involvement by Hitachi, Ltd.

  9. Evaluation of Biomass and Coal Briquettes for a Spreader Stoker Boiler Using an Experimental Furnace --- Modeling and Test

    NASA Astrophysics Data System (ADS)

    Wiggins, Gavin Memminger

    The compliance of coal-fired boilers with emissions regulations is a concern for many facilities. The introduction of biomass briquettes in industrial boilers can help to reduce greenhouse gas emissions and coal usage. In this research project, a thermodynamic chemical equilibrium model was derived and analytical simulations performed for a coal boiler system for several types of biomass fuels such as beech, hickory, maple, poplar, white oak, willow, sawdust, torrefied willow, and switchgrass. The biomass emissions were compared to coal and charcoal emissions. The chemical equilibrium analysis numerically estimated the emissions of CO, CO2, NO, NO2, N 2O, SO2, and SO3. When examining the computer results, coal and charcoal emitted the highest CO, CO2, and SO x levels while the lowest (especially for SOx) were reached by the biomass fuels. Similarly, NOx levels were highest for the biomass and lowest for coal and charcoal. To validate these analytical results, a custom traveling grate furnace was designed and fabricated to evaluate different types of biofuels in the laboratory for operation temperatures and emissions. The furnace fuels tested included coal, charcoal, torrefied wood chips, and wood briquettes. As expected, the coal reached the highest temperature while the torrefied wood chips offered the lowest temperature. For CO and NO x emissions, the charcoal emitted the highest levels while the wood briquettes emitted the lowest levels. The highest SO2 emissions were reached by the coal while the lowest were emitted by the wood briquettes. When compared to the coal fuel, charcoal emissions for CO increased by 103%, NO and NOx decreased by 21% and 20% respectively, and SO2 levels decreased by 92%. For torrefied wood, emissions for CO increased by 17%, NO and NOx decreased by 58% and 57% respectively, and SO 2 decreased by 90%. For wood briquettes, emissions for CO decreased by 27%, NO and NOx decreased by 66%, and SO2 levels decreased by 97%. 
General trends in emissions levels for CO, CO2, SO2, and SO3 among the various fuels were the same for the two methods. From the modeling and experimental results, it is clear that the opportunity exists to reduce boiler emissions using biomass materials. In computer-controlled systems, electric motor and connector arcing can cause operational difficulties such as reduced motor life, connector/cable failure, and VFD tripping. To better understand the behavior of electric motors in diverse environments, experimental testing was conducted on two different 230/460 V 3-phase AC brushless motors at unloaded and loaded conditions. The motors were driven with a 200 VAC or 400 VAC class Hitachi variable-frequency drive (VFD) and operated in air, argon, and helium environments for a duration of eight hours. Voltage transients and temperatures were monitored during these tests. The largest recorded voltage spike of 1,852 V occurred during 480 VAC start/stop tests. In addition, two different cable lengths between the VFD and motor terminals were tested. The experimental results demonstrated that the shorter cable produced smaller voltage spikes than the longer cable. For all tests, both motors operated coolest in the helium environment and warmest in the argon environment.
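The fuel-to-coal comparisons reported above are plain relative changes against the coal baseline. A minimal sketch of that arithmetic (the readings below are hypothetical, not the study's measurements):

```python
def percent_change(fuel_level: float, coal_level: float) -> float:
    """Signed percent change of a fuel's emission level relative to the
    coal baseline: positive means higher than coal, negative means lower."""
    return (fuel_level - coal_level) / coal_level * 100.0

# Hypothetical emission readings in arbitrary units: a level of 1.03
# against a coal baseline of 1.00 is a +3% change; 0.5 against 1.0 is -50%.
print(round(percent_change(1.03, 1.00), 1))
print(percent_change(0.5, 1.0))
```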

  10. Studies on Anthropogenic Impact on Water Quality in Hilo (Hawaii) Bay and Mapping the Study Stations Using Geospatial Technologies

    NASA Astrophysics Data System (ADS)

    Cartier, A. J.; Williams, M. S.; Adolf, J.; Sriharan, S.

    2015-12-01

Hilo Bay has uncharacteristically brown waters compared to other waters found in Hawai'i. The majority of the freshwater entering Hilo Bay is from storm and surface water runoff. The anthropogenic impact on water quality at Hilo Bay is due to sediment entrance, cesspools (bacteria), and invasive species (Albizia). This poster presentation focuses on the water quality and phytoplankton collected on a weekly basis at a buoy positioned one meter from the shore of Hilo Bay; the phytoplankton were preserved intact, concentrated and dehydrated with ethanol, and viewed with a scanning electron microscope (Hitachi S-3400NII). GPS (Global Positioning System) points were collected at the sampling stations. Three transects on three separate dates were performed in Hilo Bay, with salinity, percent dissolved oxygen, turbidity, Secchi depth, temperature, and chlorophyll fluorescence data collected at each sampling station. A consistent trend observed in all transects was that, as distance from the river increased, turbidity decreased and salinity increased. The GPS data from June 30, 2015 showed a clear relationship between station readings and distance from shore: turbidity decreased across these stations, but temperature did not. The GPS points collected on July 7, 2015 at thirteen stations, with station one at the shore, showed that the salinity concentration fluctuates noticeably over the first six stations. Proceeding further from the shore, the salinity concentration increases from stations seven through thirteen. The water temperature shows little variation throughout the thirteen stations. The turbidity level was high at the shore and dropped noticeably by station thirteen.

  11. An evaluation of the DRI-ETG EIA method for the determination of ethyl glucuronide concentrations in clinical and post-mortem urine.

    PubMed

    Turfus, Sophie C; Vo, Tu; Niehaus, Nadia; Gerostamoulos, Dimitri; Beyer, Jochen

    2013-06-01

A commercial enzyme immunoassay for the qualitative and semi-quantitative measurement of ethyl glucuronide (EtG) in urine was evaluated. Post-mortem (n=800) and clinical (n=200) urine samples were assayed using a Hitachi 902 analyzer. The determined concentrations were compared with those obtained using a previously published liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the quantification of EtG and ethyl sulfate. Using a cut-off of 0.5 µg/ml and an LC-MS/MS limit of reporting of 0.1 µg/ml, there was a sensitivity of 60.8% and a specificity of 100% for clinical samples. For post-mortem samples, sensitivity and specificity were 82.4% and 97.1%, respectively. When the cut-off was reduced to 0.1 µg/ml, the sensitivity and specificity were 83.3% and 100% for clinical samples, whereas for post-mortem samples they were 90.3% and 88.3%, respectively. The best trade-offs between sensitivity and specificity for LC-MS/MS limits of reporting of 0.5 and 0.1 µg/ml were achieved when using immunoassay cut-offs of 0.3 and 0.092 µg/ml, respectively. There was good correlation between the quantitative results obtained by both methods, but analysis of samples by LC-MS/MS gave higher concentrations than enzyme immunoassay (EIA), with a statistically significant proportional bias (P<0.0001, Deming regression) for both sample types. The immunoassay is reliable for the qualitative and semi-quantitative presumptive detection of ethyl glucuronide in urine. Copyright © 2012 John Wiley & Sons, Ltd.
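Sensitivity and specificity here are computed with LC-MS/MS as the reference method. A minimal sketch of the two definitions, using hypothetical confusion-matrix counts (not the study's data):

```python
def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: immunoassay positives among reference-method positives."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: immunoassay negatives among reference-method negatives."""
    return tn / (tn + fp)

# Hypothetical counts for illustration: 76 true positives, 24 false
# negatives, 97 true negatives, 3 false positives.
print(sensitivity(tp=76, fn=24), specificity(tn=97, fp=3))
```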

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vinke, R; Peng, H; Xing, L

Purpose: In searching for a robust, efficient and cost-effective dual energy cone beam CT (DE-CBCT) solution for various radiation oncology applications, in particular for improved proton dose planning/replanning accuracy and DE-CBCT-guided radiation therapy, we investigate a novel energy modulation scheme using a beam modifier placed between the source and patient and optimize its geometric configuration for routine clinical use. Methods: The study was performed using a Hitachi CBCT scanner with the tube voltage set at 125 kVp. The higher energy beam was obtained by filtering the incident beam with a beam modulation layer (material: copper, thickness: 1.8 mm). To avoid the need for double scans (one with and one without the energy modulator), the modulation layer was configured to cover only half of the X-ray beam, so that two sets of sinograms corresponding to low and high energies were collected after a single gantry rotation of 360°. The average high-energy and low-energy HU numbers (HUhigh and HUlow) were derived for pixels in a defined region-of-interest. Results: The beam modifier increased the threshold of the energy spectrum from ∼20 keV up to ∼50 keV. Two complete sets of images were obtained with good alignment between the high-energy and low-energy cases, without any artifact observed (Fig. 2). The HUlow/HUhigh values are ∼0/0 (water), ∼394/238 (brain), ∼1283/1085 (cortical bone) and ∼3000/1800 (titanium). Conclusion: The feasibility of the proposed DECT implementation using a beam modifier has been demonstrated. Compared to existing DECT solutions, the proposed scheme is much more cost-effective and requires minimal hardware modification. The work lays the foundation for studying the quantification of HU values to derive material density images and atomic number (and electron density) of substances.
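The low/high HU pairs above already show the expected spectral behavior: the gap between HUlow and HUhigh widens for higher-effective-Z materials. A small sketch using only the values quoted in the abstract (the low-minus-high difference metric is an illustrative choice, not the authors' analysis):

```python
# HU pairs (HUlow, HUhigh) reported for the half-beam DE-CBCT scan.
hu_pairs = {
    "water": (0, 0),
    "brain": (394, 238),
    "cortical bone": (1283, 1085),
    "titanium": (3000, 1800),
}

# Crude spectral-separation measure: the low-minus-high HU difference,
# which generally grows with the material's effective atomic number
# because photoelectric absorption dominates at lower energies.
separation = {m: low - high for m, (low, high) in hu_pairs.items()}
for material, diff in separation.items():
    print(material, diff)
```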

  13. A Methodology for the Integration of a Mechanistic Source Term Analysis in a Probabilistic Framework for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk-significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk-significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.

  14. Comparison of removal torques between laser-treated and SLA-treated implant surfaces in rabbit tibiae

    PubMed Central

    Kang, Nam-Seok; Li, Lin-Jie

    2014-01-01

PURPOSE The purpose of this study was to compare removal torques and surface topography between laser-treated and sandblasted, large-grit, acid-etched (SLA) treated implants. MATERIALS AND METHODS Laser-treated implants (experimental group) and SLA-treated implants (control group), 8 mm in length and 3.4 mm in diameter, were inserted into both sides of the tibiae of 12 rabbits. Surface analysis was accomplished using a field emission scanning electron microscope (FE-SEM; Hitachi S-4800; Japan) under ×25, ×150 and ×1,000 magnification. Surface components were analyzed using energy dispersive spectroscopy (EDS). Rabbits were sacrificed after a 6-week healing period. The removal torque was measured using an MGT-12 digital torque meter (Mark-10 Co., Copiague, NY, USA). RESULTS In the experimental group, the surface analysis showed uniform porous structures under ×25, ×150 and ×1,000 magnification. Pore sizes in the experimental group were 20-40 μm with numerous small pores, whereas pore sizes in the control group were 0.5-2.0 μm. EDS analysis showed no significant difference between the two groups. The mean removal torques in the laser-treated and the SLA-treated implant groups were 79.4 Ncm (SD = 20.4; range 34.6-104.3 Ncm) and 52.7 Ncm (SD = 17.2; range 18.7-73.8 Ncm), respectively. The removal torque in the laser-treated surface implant group was significantly higher than that in the control group (P=.004). CONCLUSION In this study, removal torque values were significantly higher for laser-treated surface implants than for SLA-treated surface implants. PMID:25177474

  15. [Association between urinary polycyclic aromatic hydrocarbon metabolites and elevated serum uric acid levels in coke oven workers].

    PubMed

    Deng, Siyun; Deng, Qifei; Hu, Die; Li, Jun; Zhu, Xiaoyan; Guo, Huan; Wu, Tangchun

    2014-06-01

To analyze the relationship between metabolites of polycyclic aromatic hydrocarbons (PAHs) and serum uric acid levels in coke oven workers and to provide new clues to the pathogenic mechanism of PAHs, a total of 1302 coke oven workers were divided into four groups, namely a control group and low-, intermediate-, and high-dose exposure groups. The concentrations of ambient PAHs at each workplace were determined by high-performance liquid chromatography. Detailed information on the occupational history and health of the workers was collected by questionnaire survey and physical examination, and blood and urine samples were collected. Serum uric acid and creatinine levels were measured using a Hitachi 7020 automatic biochemical analyzer. Ten urinary PAH metabolites were detected by gas chromatography-mass spectrometry. Serum uric acid levels were highest in the high-dose exposure group, followed by the intermediate- and low-dose exposure groups, and were lowest in the control group. There were significant correlations between serum uric acid levels and the quartiles of 1-hydroxynaphthalene and 1-hydroxyphenanthrene (P < 0.05). After mutual adjustment among the PAH metabolites, only urinary 1-hydroxyphenanthrene was significantly correlated with serum uric acid levels (P = 0.001). After adjustment for confounding factors and using the 1st quartile of 1-hydroxyphenanthrene as a reference, the odds ratios for hyperuricemia in subjects in the 2nd, 3rd, and 4th quartiles of 1-hydroxyphenanthrene were 1.55, 1.57, and 2.35, respectively. Urinary 1-hydroxyphenanthrene is associated with a dose-response increase in serum uric acid levels in coke oven workers, and exposure to phenanthrene in PAHs may be a risk factor for hyperuricemia.

  16. Cell sheets image validation of phase-diversity homodyne OCT and effect of the light irradiation on cells

    NASA Astrophysics Data System (ADS)

    Senda, Naoko; Osawa, Kentaro

    2016-04-01

Optical coherence tomography (OCT) is a powerful 3D tissue imaging tool that requires no fluorescence staining. We have reported that the Phase-Diversity Homodyne OCT developed at Hitachi could be useful for non-invasive evaluation of regenerated tissue. This OCT enables cell imaging because of its high resolution (axial resolution ~2.6 μm, lateral resolution ~1 μm, in air), whereas conventional OCT, with its lower resolution (10~20 μm), has not been used for cell imaging. Furthermore, this OCT has a cost advantage over other 3D imaging devices because its light source and objective were originally used as the optical pickup of a compact disc. In this report, we aimed to assess the effectiveness and safety of Phase-Diversity Homodyne OCT cell imaging. Effectiveness was evaluated by imaging a living cell sheet of human oral mucosal epithelial cells. OCT images were compared with reflection confocal microscopy (RCM) images, because the confocal optical system is the highest-resolution (<1 μm) 3D in vivo imaging technique. Similar images of nuclei were confirmed with OCT and RCM, which suggests the OCT has sufficient resolution to image nuclei inside a cell sheet. The degree of differentiation could be estimated from OCT images, because cell size depends on the distribution of differentiation. The effect of the OCT light irradiation on cells was studied using NIH/3T3 cells. Light irradiation at an exposure equivalent to that of the OCT had no impact on cell shape, cell viability, or proliferation rate, suggesting that the light irradiation causes no cell damage under these conditions.

  17. Efficacy of levetiracetam versus fosphenytoin for the recurrence of seizures after status epilepticus.

    PubMed

    Nakamura, Kensuke; Inokuchi, Ryota; Daidoji, Hiroaki; Naraba, Hiromu; Sonoo, Tomohiro; Hashimoto, Hideki; Tokunaga, Kurato; Hiruma, Takahiro; Doi, Kent; Morimura, Naoto

    2017-06-01

Benzodiazepines are used as first-line treatments for status epilepticus. Fosphenytoin (FPHT) is recommended for second-line therapy; however, intravenous injection of levetiracetam (LEV) may also be effective against status epilepticus. Herein, we compared the efficacy and safety of LEV as a second-line treatment for status epilepticus with FPHT in Japanese patients. Patients with status epilepticus were selected from the database of the Emergency and Critical Care Center of Hitachi General Hospital. The subjects were patients whose status epilepticus was successfully stopped by diazepam, and in whom FPHT or LEV was administered after diazepam. As LEV injections only recently became clinically available in Japan, the choice of drug was determined by the treatment period. Thus, 21 patients who were intravenously injected with LEV as second-line therapy and 42 matched patients (historical controls) who were treated with FPHT (1:2) were selected. The subjects had a mean age of 64.0 ± 2.2 years, and included 48 males and 15 females. The status epilepticus control rates of the FPHT and LEV groups did not differ significantly (81.0% [34/42] vs 85.7% [18/21], respectively; P = .69). As for serious adverse events, a reduction in blood pressure was observed in the FPHT group, but not in the LEV group. The oral anticonvulsant switching rates of the 2 groups were similar, but the same-drug switching rates of the FPHT and LEV groups were 8.1% and 77.8%, respectively. The efficacy of intravenous LEV injections after status epilepticus was equivalent to that of FPHT, and the incidence of adverse events was lower in the LEV group. LEV is effective and safe at preventing recurrent seizures after status epilepticus following benzodiazepine treatment.
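The reported control rates can be checked with a simple two-proportion comparison. The sketch below uses a pooled normal approximation, which is not necessarily the test used in the study, but it reproduces the non-significant difference between the two groups:

```python
import math

def two_proportion_p(s1: int, n1: int, s2: int, n2: int) -> float:
    """Two-sided p-value for comparing two proportions s1/n1 and s2/n2
    using a pooled normal approximation (z-test)."""
    p1, p2 = s1 / n1, s2 / n2
    pooled = (s1 + s2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided tail probability of the standard normal distribution.
    return math.erfc(abs(z) / math.sqrt(2))

# Control rates reported above: FPHT 34/42 vs LEV 18/21.
p = two_proportion_p(34, 42, 18, 21)
print(round(p, 2))
```

The approximation gives a p-value in the same non-significant range as the study's reported P = .69 (exact tests on these small groups give slightly different values).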

  18. Morphology and Elemental Composition of Recent and Fossil Cyanobacteria

    NASA Technical Reports Server (NTRS)

    SaintAmand, Ann; Hoover, Richard B.; Jerman, Gregory; Rozanov, Alexei Yu.

    2005-01-01

Cyanobacteria (cyanophyta, cyanoprokaryota, and blue-green algae) are an ancient, diverse and abundant group of photosynthetic oxygenic microorganisms. Together with other bacteria and archaea, the cyanobacteria have been the dominant life forms on Earth for over 3.5 billion years. Cyanobacteria occur in some of our planet's most extreme environments - hot springs and geysers, hypersaline and alkaline lakes, hot and cold deserts, and the polar ice caps. They occur in a wide variety of morphologies. Unlike archaea and other bacteria, which are typically classified in pure culture by their physiological, biochemical and phylogenetic properties, the cyanobacteria have historically been classified based upon their size and morphological characteristics. These include the presence or absence of heterocysts, sheath, uniseriate or multiseriate trichomes, true or false branching, arrangement of thylakoids, reproduction by akinetes, binary fission, hormogonia, fragmentation, presence/absence of motility, etc. Their antiquity, distribution, structural and chemical differentiation, diversity, morphological complexity and large size compared to most other bacteria make the cyanobacteria ideal candidates for morphological biomarkers in returned Astromaterials. We have obtained optical (Nomarski and phase contrast)/fluorescent (blue and green excitation) microscopy images using an Olympus BX60 compound microscope, and Field Emission Scanning Electron Microscopy images and EDAX elemental compositions of living and fossil cyanobacteria. The S-4000 Hitachi Field Emission Scanning Electron Microscope (FESEM) has been used to investigate microfossils in freshly fractured interior surfaces of terrestrial rocks, and the cells, hormogonia, sheaths and trichomes of recent filamentous cyanobacteria.
We present Fluorescent and FESEM Secondary and Backscattered Electron images and associated EDAX elemental analyses of recent and fossil cyanobacteria, concentrating on representatives of the genera Calothrix, Leptolyngbya, Lyngbya, Planktolyngbya and Oscillatoria.

  19. Morphology and elemental composition of recent and fossil cyanobacteria

    NASA Astrophysics Data System (ADS)

    St. Amand, Ann; Hoover, Richard B.; Jerman, Gregory A.; Coston, James; Rozanov, Alexei Y.

    2005-09-01

Cyanobacteria (cyanophyta, cyanoprokaryota, and blue-green algae) are an ancient, diverse and abundant group of photosynthetic oxygenic microorganisms. Together with other bacteria and archaea, the cyanobacteria have been the dominant life forms on Earth for over 3.5 billion years. Cyanobacteria occur in some of our planet's most extreme environments - hot springs and geysers, hypersaline and alkaline lakes, hot and cold deserts, and the polar ice caps. They occur in a wide variety of morphologies. Unlike archaea and other bacteria, which are typically classified in pure culture by their physiological, biochemical and phylogenetic properties, the cyanobacteria have historically been classified based upon their size and morphological characteristics. These include the presence or absence of heterocysts, sheath, uniseriate or multiseriate trichomes, true or false branching, arrangement of thylakoids, reproduction by akinetes, binary fission, hormogonia, fragmentation, presence/absence of motility, etc. Their antiquity, distribution, structural and chemical differentiation, diversity, morphological complexity and large size compared to most other bacteria make the cyanobacteria ideal candidates for morphological biomarkers in returned Astromaterials. We have obtained optical (Nomarski and phase contrast)/fluorescent (blue and green excitation) microscopy images using an Olympus BX60 compound microscope, and Field Emission Scanning Electron Microscopy images and EDAX elemental compositions of living and fossil cyanobacteria. The S-4000 Hitachi Field Emission Scanning Electron Microscope (FESEM) has been used to investigate microfossils in freshly fractured interior surfaces of terrestrial rocks, and the cells, hormogonia, sheaths and trichomes of recent filamentous cyanobacteria.
We present Fluorescent and FESEM Secondary and Backscattered Electron images and associated EDAX elemental analyses of recent and fossil cyanobacteria, concentrating on representatives of the genera Calothrix, Leptolyngbya, Lyngbya, Planktolyngbya and Oscillatoria.

  20. The Hollow Spheres of the Orgueil Meteorite: A Re-Examination

    NASA Technical Reports Server (NTRS)

Hoover, Richard B.; Jerman, Gregory; Rossignol-Strick, Martine

    2005-01-01

In 1971, Rossignol-Strick and Barghoorn provided images and a description of a number of spherical hollow microstructures with well-defined walls in acid-macerated extract of the Orgueil CI carbonaceous meteorite. Other forms such as membranes and spiral-shaped structures were also reported. The carbon-rich (kerogen) hollow spheres were found to lie in a narrowly constrained distribution of sizes (mainly 7 to 10 microns in diameter). Electron microprobe analysis revealed that these spheres contained carbon and possibly P, N, and K. It was established that these forms could not be attributed to pollen or other recent terrestrial contaminants. It was concluded that they most probably represented organic coatings on globules of glass, olivine or magnetite in the meteorite. However, recent studies of the Orgueil meteorite have been carried out at the NASA/Marshall Space Flight Center with the S-4000 Hitachi Field Emission Scanning Electron Microscope (FESEM). These investigations have revealed the presence of numerous carbon-encrusted spherical magnetite platelets and spherical and ovoidal bodies of elemental iron in situ in freshly fractured interior surfaces of the meteorite. Their size range is also very narrowly constrained (typically approximately 6 to 12 microns in diameter). High-resolution images reveal that these bodies are also encrusted with a thin carbonaceous sheath and are surrounded by short nanofibrils that are shown by EDAX elemental analysis to be composed of high-purity iron. We present Secondary and Backscatter Electron FESEM images and associated EDAX elemental analyses and 2D X-ray maps of these forms as we re-examine the hollow spheres of Orgueil and attempt to determine whether they are representatives of the same population of indigenous microstructures.

  1. A new method to evaluate image quality of CBCT images quantitatively without observers

    PubMed Central

    Shimizu, Mayumi; Okamura, Kazutoshi; Yoshida, Shoko; Weerawanich, Warangkana; Tokumori, Kenji; Jasa, Gainer R; Yoshiura, Kazunori

    2017-01-01

Objectives: To develop an observer-free method for quantitatively evaluating the image quality of CBCT images by applying the just-noticeable difference (JND). Methods: We used two test objects: (1) a Teflon (polytetrafluoroethylene) plate phantom attached to a dry human mandible; and (2) a block phantom consisting of a Teflon step phantom and an aluminium step phantom. These phantoms had holes of different depths. They were immersed in water and scanned with a CB MercuRay (Hitachi Medical Corporation, Tokyo, Japan) at tube voltages of 120 kV, 100 kV, 80 kV and 60 kV. Superimposed images of the phantoms with holes were used for evaluation. The number of detectable holes was used as an index of image quality. To detect holes quantitatively, the threshold grey value (ΔG) that differentiated holes from the background was calculated using a specific threshold (the JND), and we extracted the holes with grey values above ΔG. The indices obtained by this quantitative method (the extracted hole values) were compared with the observer evaluations (the observed hole values). In addition, the contrast-to-noise ratios (CNRs) of the shallowest detectable holes and the deepest undetectable holes were measured to evaluate the contribution of CNR to detectability. Results: The results of this evaluation method corresponded almost exactly with the evaluations made by observers. The extracted hole values reflected the influence of different tube voltages. All extracted holes had an area with a CNR of ≥1.5. Conclusions: This quantitative method of evaluating CBCT image quality may be more useful and less time-consuming than evaluation by observation. PMID:28045343
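The detectability logic described above (a grey-value threshold plus a CNR check) can be sketched on synthetic data. The image, the region positions, and the 1.5 factor standing in for the JND-derived threshold below are all hypothetical, not the paper's parameters:

```python
import numpy as np

# Hypothetical 2D slice: noisy background plus one "hole" region whose
# grey values sit below the background by a fixed contrast.
rng = np.random.default_rng(1)
img = rng.normal(100.0, 4.0, (64, 64))   # background with noise
img[20:28, 20:28] -= 12.0                # simulated hole

hole = img[20:28, 20:28]
bg = img[40:60, 40:60]

# Contrast-to-noise ratio, as used for the detectability criterion above.
cnr = abs(hole.mean() - bg.mean()) / bg.std()

# Hypothetical JND-derived grey-value threshold: the hole counts as
# "extracted" when its mean differs from the background by more than dG.
dG = 1.5 * bg.std()
detected = abs(hole.mean() - bg.mean()) > dG
print(round(cnr, 1), detected)
```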

  2. Power Doppler signal calibration between ultrasound machines by use of a capillary-flow phantom for pannus vascularity in rheumatoid finger joints: a basic study.

    PubMed

    Sakano, Ryosuke; Kamishima, Tamotsu; Nishida, Mutsumi; Horie, Tatsunori

    2015-01-01

Ultrasound allows the detection and grading of inflammation in rheumatology. Despite these advantages of ultrasound in the management of rheumatoid patients, it is well known that there are significant machine-to-machine disagreements regarding signal quantification. In this study, we tried to calibrate the power Doppler (PD) signal of two models of ultrasound machines by using a capillary-flow phantom. After flow velocity analysis in the perfusion cartridge at various injection rates (0.1-0.5 ml/s), we measured the signal count in the perfusion cartridge at various injection rates and pulse repetition frequencies (PRFs) by using PD, perfusing an ultrasound microbubble contrast agent diluted with normal saline to simulate human blood. Using data from two models of ultrasound machines, the Aplio 500 (Toshiba) and the Avius (Hitachi Aloka), the quantitative PD (QPD) index [the summation of the colored pixels in a 1 cm × 1 cm rectangular region of interest (ROI)] was calculated with ImageJ (freely available image-analysis software). We found a positive correlation between the injection rate and the flow velocity. For both the Aplio 500 and the Avius, we found negative correlations between the PRF and the QPD index when the flow velocity was constant, and a positive correlation between flow velocity and the QPD index at constant PRF. The equation relating the PRFs of the two machines was y = 0.023x + 0.36 [y = PRF of Avius (kHz), x = PRF of Aplio 500 (kHz)]. Our results suggest that signal calibration across models of ultrasound machines is possible by adjustment of the PRF setting.
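As a minimal sketch, the reported calibration line and the QPD pixel count can be expressed directly; the ROI mask below is hypothetical:

```python
import numpy as np

def prf_avius(prf_aplio_khz: float) -> float:
    """Map an Aplio 500 PRF setting (kHz) to the matched Avius PRF (kHz)
    using the calibration line reported above: y = 0.023x + 0.36."""
    return 0.023 * prf_aplio_khz + 0.36

def qpd_index(colored_mask: np.ndarray) -> int:
    """QPD index: the number of colored (PD signal) pixels inside the
    fixed 1 cm x 1 cm ROI, here represented as a boolean mask."""
    return int(colored_mask.sum())

# Hypothetical ROI rendered as a 100 x 100 boolean mask with a
# simulated 20 x 20 block of perfusion signal.
roi = np.zeros((100, 100), dtype=bool)
roi[10:30, 10:30] = True
print(prf_avius(10.0), qpd_index(roi))
```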

  3. Investigation of hot cracking resistance of 2205 duplex steel

    NASA Astrophysics Data System (ADS)

    Adamiec, J.; Ścibisz, B.

    2010-02-01

Austenitic-ferritic duplex steel of grade 2205 (per Avesta Sheffield) is used for welded constructions (pipelines, tanks) in the petroleum, chemical and food industries. It is important to know the range of high-temperature brittleness when designing welding technology for constructions made of this steel type; there are no literature data concerning this issue. High-temperature brittleness tests were performed using a Gleeble 3800 thermo-mechanical simulator. The test results allowed the evaluation of the characteristic temperatures in the brittleness temperature range during the joining of duplex steels, specifically the nil-strength temperature (NST) and nil-ductility temperature (NDT) during heating, the strength and ductility recovery temperatures (DRT) during cooling, the Rf parameter (Rf = (Tliquidus - NDT)/NDT) describing the duplex steel's inclination for hot cracking, and the brittleness temperature range (BTR). For the examined steel this range is wide, amounting to ca. 90 °C, so joining duplex steels by welding carries a significant risk of hot cracking. Analysis of the DTA curves gave a liquidus temperature of TL = 1465 °C and a solidus temperature of TS = 1454 °C. For the NST, the mean temperature at which cracks appeared across six samples was used: 1381 °C. The NDT was taken as 1367 °C, and the DRT as 1375 °C. The microstructure of the fractures was observed using a Hitachi S-3400N scanning electron microscope (SEM), and analyses of the chemical composition were performed using an energy-dispersive X-ray spectrometer (EDS), a Noran System Six from Thermo Fisher Scientific. Essential differences in fracture morphology over the brittleness temperature range were observed and described.
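The characteristic temperatures quoted above determine the derived quantities directly. A minimal check, taking the BTR as the span from the liquidus down to the DRT (an assumption on our part, though it matches the ~90 °C stated):

```python
# Characteristic temperatures reported above for 2205 duplex steel (°C).
T_LIQUIDUS = 1465.0
T_SOLIDUS = 1454.0
NST = 1381.0
NDT = 1367.0
DRT = 1375.0

# Hot-cracking susceptibility parameter defined in the abstract:
# Rf = (Tliquidus - NDT) / NDT.
rf = (T_LIQUIDUS - NDT) / NDT
print(round(rf, 3))

# Brittleness temperature range, here computed as liquidus minus DRT,
# which reproduces the ca. 90 °C width reported for this steel.
btr = T_LIQUIDUS - DRT
print(btr)
```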

  4. Development of Effective Connectivity during Own- and Other-Race Face Processing: A Granger Causality Analysis

    PubMed Central

    Zhou, Guifei; Liu, Jiangang; Ding, Xiao Pan; Fu, Genyue; Lee, Kang

    2016-01-01

Numerous developmental studies have suggested that the other-race effect (ORE) in face recognition emerges as early as infancy and develops steadily throughout childhood. However, there is very limited research on the neural mechanisms underlying this developmental ORE. The present study used Granger causality analysis (GCA) to examine the development of children's cortical networks in processing own- and other-race faces. Children were between 3 and 13 years of age. An old-new paradigm was used to assess their own- and other-race face recognition, with an ETG-4000 system (Hitachi Medical Co., Japan) acquiring functional near-infrared spectroscopy (fNIRS) data. After preprocessing, for each participant and under each face condition, we obtained the causal map by calculating the weights of causal relations between the time courses of [oxy-Hb] for each pair of channels using GCA. To further investigate the differential causal connectivity for own-race and other-race faces at the group level, a repeated-measures analysis of variance (ANOVA) was performed on the GCA weights for each pair of channels, with the face race task (own-race face vs. other-race face) as the within-subject variable and age as a between-subject factor (continuous variable). We found an age-related increase in functional connectivity, paralleling a similar age-related improvement in behavioral face processing ability. More importantly, we found that the significant differences in neural functional connectivity between the recognition of own-race faces and that of other-race faces were modulated by age. Thus, like the behavioral ORE, the neural ORE emerges early and undergoes a protracted developmental course. PMID:27713696
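Bivariate Granger causality of the kind described can be sketched with ordinary least squares: channel x Granger-causes channel y if adding x's past to an autoregressive model of y reduces the residual variance. The sketch below (numpy only, synthetic series standing in for [oxy-Hb] time courses; not the authors' pipeline) uses the log variance ratio as the causal weight:

```python
import numpy as np

def gc_weight(x: np.ndarray, y: np.ndarray, p: int = 2) -> float:
    """Granger-causality weight x -> y with p lags: compares an AR(p)
    model of y on its own past (restricted) against one that also
    includes the past of x (full); returns ln(var_restricted/var_full)."""
    n = len(y)
    Y = y[p:]
    own = np.column_stack([y[p - k: n - k] for k in range(1, p + 1)])
    cross = np.column_stack([x[p - k: n - k] for k in range(1, p + 1)])
    ones = np.ones((n - p, 1))
    Xr = np.hstack([ones, own])           # restricted design matrix
    Xf = np.hstack([ones, own, cross])    # full design matrix
    rr = Y - Xr @ np.linalg.lstsq(Xr, Y, rcond=None)[0]
    rf = Y - Xf @ np.linalg.lstsq(Xf, Y, rcond=None)[0]
    return float(np.log(np.var(rr) / np.var(rf)))

# Toy channels: x drives y with a one-sample delay, so the x -> y
# weight should clearly exceed the y -> x weight.
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = 0.8 * np.roll(x, 1) + 0.2 * rng.standard_normal(500)
print(round(gc_weight(x, y), 2), round(gc_weight(y, x), 2))
```

In the study these pairwise weights, computed per participant and per face condition, form the causal map that the repeated-measures ANOVA is run on.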

  5. Laboratory Simulation of Electrical Discharge in Surface Lunar Regolith

    NASA Astrophysics Data System (ADS)

    Shusterman, M.; Izenberg, N.; Wing, B. R.; Liang, S.

    2016-12-01

Physical, chemical, and optical characteristics of space-weathered surface materials on airless bodies are produced primarily by bombardment by solar energetic particles and micrometeoroid impacts. On bodies such as the Moon and Mercury, soils in permanently shadowed regions (PSRs) are very cold, have low electrical conductivities, and are subjected to a high flux of incoming energetic particles accelerated by solar events. Theoretical models predict that up to 25% of gardened soils in the lunar polar regions are altered by dielectric breakdown, a potentially significant weathering process that is currently unconfirmed. Although the electrical properties of lunar soils have been studied in relation to flight electronics and spacecraft safety, no studies have characterized the alterations to soils that may result from electrical discharge. To replicate the surface charge field in PSRs, lunar regolith simulant JSC-1A was placed between two parallel plane electrodes under both low and high vacuum environments, 10e-3 torr and 2.5e-6 torr, respectively. Voltage was increased until discharge occurred within the sample. Grains were analyzed using an SVC fiber-fed point spectrometer, an Olympus BX51 upright metallurgical microscope, and a Hitachi TM3000 scanning electron microscope with a Bruker Quantax-70 X-ray spectrometer. Discharges occurring in samples under low vacuum resulted in surficial melting, silicate vapor deposition, coalescence of metallic iron, and micro-scale changes to surface topography. Samples treated under high vacuum showed similar types of effects, but fewer in number compared to the low-vacuum samples. The variation in alteration abundances between the two environments implies that discharges may be occurring across surface contaminants, even at high vacuum conditions, inhibiting dielectric breakdown in our laboratory simulations.

  6. Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henneke, Dennis W.; Robinson, James

    In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform Research and Development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH’s Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development, modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all-hazards scoping review to analyze the risk at a high level from external hazards such as earthquakes and high winds; an all-modes scoping review to understand the risk at a high level from operating modes other than at-power; and risk insights to integrate the results from each of the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all-hazards/all-modes PRA. The teams also advanced non-LWR PRA methodologies, which is an important outcome from this work. This report summarizes the project outcomes in two major phases. The first phase presents the methodologies developed for non-LWR PRAs. The methodologies are grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second phase presents details of the PRISM PRA model, which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus areas for future R&D, and conclusions about the PRISM design.

  7. SU-E-T-439: Fundamental Verification of Respiratory-Gated Spot Scanning Proton Beam Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamano, H; Yamakawa, T; Hayashi, N

    Purpose: Spot-scanning proton beam irradiation with a respiratory gating technique provides highly conformal dose distributions and requires both dosimetric and geometric verification prior to clinical implementation. The purpose of this study is to evaluate the impact of gated irradiation as a fundamental verification. Methods: We evaluated field width, flatness, symmetry, and penumbra in the gated and non-gated proton beams. The respiratory motion amplitude was set to three patterns: 10, 20, and 30 mm. We compared these quantities between the gated and non-gated beams. A 200 MeV proton beam from a PROBEAT-III unit (Hitachi, Ltd.) was used in this study. Respiratory gating irradiation was performed with a Quasar phantom (Modus Medical Devices) in combination with a dedicated respiratory gating system (ANZAI Medical Corporation). For radiochromic film dosimetry, the calibration curve was created with Gafchromic EBT3 film (Ashland) using FilmQA Pro 2014 (Ashland) as the film analysis software. Results: The film was calibrated at the middle of the spread-out Bragg peak in a passive proton beam. The field width, flatness, and penumbra in non-gated proton irradiation with respiratory motion were larger than those of the reference beam without respiratory motion: the maximum errors of the field width, flatness, and penumbra with respiratory motion of 30 mm were 1.75%, 40.3%, and 39.7%, respectively. The errors of flatness and penumbra in the gated beam (motion: 30 mm, gating rate: 25%) were 0.0% and 2.91%, respectively. The symmetry results for all proton beams with the gating technique were within 0.6%. Conclusion: The field width, flatness, symmetry, and penumbra were improved with the gating technique in the proton beam. Spot-scanning proton beam delivery with the gating technique is feasible for moving targets.
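
The beam-profile quantities compared in this abstract (field width, flatness, penumbra) can all be read off a 1-D dose profile. The following is a minimal sketch assuming a film-derived profile sampled on a regular grid; the function names and the central-80%-of-field-width flatness region are illustrative choices, not the FilmQA Pro definitions:

```python
def rising_crossing(xs, ds, level):
    """x where the dose first rises through `level`, by linear interpolation."""
    for (x0, d0), (x1, d1) in zip(zip(xs, ds), zip(xs[1:], ds[1:])):
        if d0 < level <= d1:
            return x0 + (level - d0) * (x1 - x0) / (d1 - d0)
    raise ValueError("level not crossed")

def profile_metrics(xs, ds):
    """Field width (FWHM), flatness (%), and left-edge 80%-20% penumbra."""
    dmax = max(ds)
    left50 = rising_crossing(xs, ds, 0.5 * dmax)
    right50 = rising_crossing(xs[::-1], ds[::-1], 0.5 * dmax)  # scan from the right edge
    width = right50 - left50
    # 80%-20% penumbra on the left edge
    penumbra = rising_crossing(xs, ds, 0.8 * dmax) - rising_crossing(xs, ds, 0.2 * dmax)
    # flatness evaluated over the central 80% of the field width
    centre = 0.5 * (left50 + right50)
    core = [d for x, d in zip(xs, ds) if abs(x - centre) <= 0.4 * width]
    flatness = 100 * (max(core) - min(core)) / (max(core) + min(core))
    return width, flatness, penumbra
```

On an ideal trapezoidal profile (flat top, linear edges) this returns a zero flatness and a penumbra equal to the edge ramp's 80%-20% extent, which is a quick sanity check before applying it to measured film data.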

  8. Real time elastography - a non-invasive diagnostic method of small hepatocellular carcinoma in cirrhosis.

    PubMed

    Gheorghe, Liana; Iacob, Speranta; Iacob, Razvan; Dumbrava, Mona; Becheanu, Gabriel; Herlea, Vlad; Gheorghe, Cristian; Lupescu, Ioana; Popescu, Irinel

    2009-12-01

    Small nodules (under 3 cm) detected on ultrasound (US) in cirrhotics represent the most challenging category for noninvasive diagnosis of hepatocellular carcinoma (HCC). To evaluate real-time sonoelastography as a noninvasive tool for the diagnosis of small HCC nodules in cirrhotic patients. 42 cirrhotic patients with 58 nodules (1-3 cm) were evaluated with real-time elastography (Hitachi EUB-6500); the mean intensity of colors red, blue, green were measured using a semi-quantitative method. Analysis of histograms for each color of the sonoelastography images was performed for quantifying the elasticity of nodule tissue in comparison with the cirrhotic liver tissue. AUROC curves were constructed to define the best cut-off points to distinguish malignant features of the nodules. Univariate and multivariate logistic regression analysis was performed. 595 sonoelastography images from 42 patients (25 men; 17 women) were analyzed. The mean age was 56.4 +/- 0.7 years and 69% patients were in Child-Pugh class A, 19% class B, 11% class C. For the mean intensity of green color AUROC=0.81, a cut-off value under 108.7 being diagnostic for HCC with a Sp=91.1%, Se=50%, PPV=92.1%, NPV=47.1%. Mean intensity of blue color proved to be an excellent diagnostic tool for HCC (AUROC=0.94); for a cut-off value greater than 128.9, Sp=92.2%, Se=78.9%, PPV=95.4%, NPV=68%. Independent predictive factors of HCC for a small nodule in cirrhotic patients were: blue color over 128.9 at sonoelastography and hypervascular appearance at Doppler US. US elastography is a promising method for the non-invasive diagnosis of early HCC. Blue color at elastography and hypervascular aspects are independent predictors of HCC.
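
The Se/Sp/PPV/NPV figures reported for each colour cut-off follow directly from the 2x2 contingency table at that threshold. A minimal sketch with hypothetical counts (the function name and example numbers are not from the study):

```python
def diagnostic_indices(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 contingency table."""
    se = tp / (tp + fn)    # fraction of true HCC nodules detected
    sp = tn / (tn + fp)    # fraction of benign nodules correctly excluded
    ppv = tp / (tp + fp)   # probability a positive call is truly HCC
    npv = tn / (tn + fn)   # probability a negative call is truly benign
    return se, sp, ppv, npv
```

Note that PPV and NPV, unlike Se and Sp, depend on the prevalence of HCC in the studied nodule population, which is why they are only directly comparable between studies with similar case mixes.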

  9. Unbiased roughness measurements: the key to better etch performance

    NASA Astrophysics Data System (ADS)

    Liang, Andrew; Mack, Chris; Sirard, Stephen; Liang, Chen-wei; Yang, Liu; Jiang, Justin; Shamma, Nader; Wise, Rich; Yu, Jengyi; Hymes, Diane

    2018-03-01

    Edge placement error (EPE) has become an increasingly critical metric to enable Moore's Law scaling. Stochastic variations, as characterized for lines by line width roughness (LWR) and line edge roughness (LER), are dominant factors in EPE and known to increase with the introduction of EUV lithography. However, despite recommendations from ITRS, NIST, and SEMI standards, the industry has not agreed upon a methodology to quantify these properties. Thus, differing methodologies applied to the same image often result in different roughness measurements and conclusions. To standardize LWR and LER measurements, Fractilia has developed an unbiased measurement that uses a raw unfiltered line scan to subtract out image noise and distortions. By using Fractilia's inverse linescan model (FILM) to guide development, we will highlight the key influences of roughness metrology on plasma-based resist smoothing processes. Test wafers were deposited to represent a 5 nm node EUV logic stack. The patterning stack consists of a core Si target layer with spin-on carbon (SOC) as the hardmask and spin-on glass (SOG) as the cap. Next, these wafers were exposed through an ASML NXE 3350B EUV scanner with an advanced chemically amplified resist (CAR). Afterwards, these wafers were etched through a variety of plasma-based resist smoothing techniques using a Lam Kiyo conductor etch system. Dense line and space patterns on the etched samples were imaged through advanced Hitachi CDSEMs and the LER and LWR were measured through both Fractilia and an industry standard roughness measurement software. By employing Fractilia to guide plasma-based etch development, we demonstrate that Fractilia produces accurate roughness measurements on resist in contrast to an industry standard measurement software. These results highlight the importance of subtracting out SEM image noise to obtain quicker developmental cycle times and lower target layer roughness.
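
The core idea behind unbiased roughness measurement is that SEM image noise adds to the true edge variance in quadrature, so a biased measurement is systematically inflated and the noise contribution must be subtracted. A hedged sketch of that variance subtraction (this is not Fractilia's FILM algorithm, which fits an inverse linescan model to raw data rather than applying a simple quadrature correction):

```python
import math

def unbiased_roughness(measured_3sigma_nm, noise_3sigma_nm):
    """Subtract the SEM noise variance from the measured (biased) variance.

    3-sigma roughness values scale linearly with sigma, so they can be
    converted to variances, differenced, and converted back.
    """
    return 3 * math.sqrt((measured_3sigma_nm / 3) ** 2 - (noise_3sigma_nm / 3) ** 2)
```

The example makes the practical point of the paper concrete: a 5 nm biased LWR with a 3 nm noise floor corresponds to only 4 nm of true roughness, so two measurement tools with different noise floors can rank the same etch processes differently unless the noise is removed.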

  10. 64nm pitch metal1 double patterning metrology: CD and OVL control by SEMCD, image based overlay and diffraction based overlay

    NASA Astrophysics Data System (ADS)

    Ducoté, Julien; Dettoni, Florent; Bouyssou, Régis; Le-Gratiet, Bertrand; Carau, Damien; Dezauzier, Christophe

    2015-03-01

    Patterning process control of advanced nodes has required major changes over the last few years. Process control needs for critical patterning levels since the 28nm technology node are extremely aggressive, showing that metrology accuracy/sensitivity must be finely tuned. The introduction of pitch splitting (Litho-Etch-Litho-Etch) at the 14nm FDSOI node requires the development of specific metrologies to adopt advanced process control (for CD, overlay, and focus corrections). The pitch splitting process leads to final line CD uniformities that are a combination of the CD uniformities of the two exposures, while the space CD uniformities depend on both CD and OVL variability. In this paper, investigations of CD and OVL process control of the 64nm minimum pitch at the Metal1 level of 14FDSOI technology, within the double patterning process flow (litho, hard mask etch, line etch), are presented. Various measurements with SEMCD tools (Hitachi) and overlay tools (KT for Image Based Overlay - IBO, and ASML for Diffraction Based Overlay - DBO) are compared. Metrology targets are embedded within a block instanced several times within the field to characterize intra-field process variations. Specific SEMCD targets were designed for independent measurement of both line CD (A and B) and space CD (A to B and B to A) for each exposure within a single measurement during the DP flow. Based on those measurements, the correlation between overlay determined with SEMCD and with standard overlay tools can be evaluated. Such correlation at different steps through the DP flow is investigated with respect to the metrology type. Process correction models are evaluated with respect to the measurement type and the intra-field sampling.
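
The stated dependence of space CD on both line CD and overlay can be seen from a simple 1-D model of the pitch-split grating: the two spaces flanking a second-exposure line shift in opposite directions as overlay changes, so their half-difference recovers overlay. A sketch under that idealized geometry (the constant and numbers are illustrative, not the paper's target design):

```python
PITCH = 64.0  # nm, minimum Metal1 pitch in this study

def spaces(cd_a, cd_b, overlay):
    """Spaces on either side of a line printed by the second exposure,
    whose centre is shifted by `overlay` from the ideal half-pitch position."""
    nominal = PITCH / 2 - (cd_a + cd_b) / 2
    return nominal + overlay, nominal - overlay   # (A-to-B, B-to-A)

def overlay_from_spaces(s_ab, s_ba):
    """Line CDs cancel in the difference, leaving only the overlay term."""
    return (s_ab - s_ba) / 2
```

This is why the paper's SEMCD targets measure both spaces (A to B and B to A) in a single acquisition: the asymmetry between them is an overlay signal that can be correlated against the IBO and DBO tools.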

  11. [Multisite validation of CDT measurement by the %CDT TIA and the Tina Quant %CDT kits].

    PubMed

    Boehrer, J L; Cano, Y; Capolaghi, B; Desch, G; Dosbaa, I; Estepa, L; Hennache, B; Schellenberg, F

    2007-01-01

    The measurement of CDT (Carbohydrate Deficient Transferrin) is an essential biological tool in the diagnosis and follow-up of alcohol abuse. It is also employed as a marker of abstinence for the restitution of driving licences. However, the precision of measurement and the between-laboratory homogeneity of the results are still debated. Ion exchange followed by immunodetermination of CDT is available in two products, the Tina Quant %CDT (Roche, Mannheim, Germany) and the %CDT TIA (Bio-Rad, Hercules, United States). This multicentre study was undertaken: 1) to evaluate the analytical characteristics of these kits and the homogeneity of the results from one laboratory to another, independently of the method used; 2) to validate the differences between the proposed normal values of both kits; 3) to study the possibility of using commercial control sera as external quality control. Four analytical systems were included in the study (Roche Modular/Hitachi 717, Beckman Coulter Immage and LX20, Dade Behring BNII). Determinations were carried out on pools of sera, commercial control sera, kit controls, and 30 patient sera. The latter were also analyzed by capillary electrophoresis in order to establish correlations between the techniques. The calibrations were stable over a two-week period. The repeatability of measurements ranged from 3.1% to 24.7%, with a mean value below 10%. The commercial control sera provided reliable results, with values suited to routine quality control use. The results of the Bio-Rad applications were lower by approximately 20% than those of the Roche application, which justifies the difference in the normal values (2.6% versus 3%); the classification of the patients was identical in at least 27 of the 30 samples. We conclude that the analytical quality of the compared techniques, even if it could be improved, is sufficient to guarantee good reliability of the results. An external quality control could be implemented by using the control sera that we tested.
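
Repeatability figures like the 3.1%-24.7% range above are coefficients of variation. A one-line helper for reference (the replicate values in the usage note are illustrative, not data from the study):

```python
import statistics

def cv_percent(replicates):
    """Repeatability expressed as a coefficient of variation, in percent."""
    return 100 * statistics.stdev(replicates) / statistics.mean(replicates)
```

For example, three replicate %CDT readings of 9.8, 10.0, and 10.2 give a CV of 2.0%.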

  12. A Methodology for the Development of a Reliability Database for an Advanced Reactor Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of a reliability database (RDB) methodology to determine applicable reliability data for inclusion in the quantification of the PRA. The RDB method developed during this project seeks to satisfy the requirements of the Data Analysis element of the ASME/ANS Non-LWR PRA standard. The RDB methodology utilizes a relevancy test to examine reliability data and determine whether it is appropriate to include as part of the reliability database for the PRA. The relevancy test compares three component properties to establish the level of similarity to components examined as part of the PRA. These properties include the component function, the component failure modes, and the environment/boundary conditions of the component. The relevancy test is used to gauge the quality of data found in a variety of sources, such as advanced reactor-specific databases, non-advanced reactor nuclear databases, and non-nuclear databases. The RDB also establishes the integration of expert judgment or separate reliability analysis with past reliability data. This paper provides details on the RDB methodology, and includes an example application of the RDB methodology for determining the reliability of the intermediate heat exchanger of a sodium fast reactor. The example explores a variety of reliability data sources, and assesses their applicability for the PRA of interest through the use of the relevancy test.
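
As a toy illustration of the relevancy test's three-property comparison (component function, failure modes, environment/boundary conditions), one might score a candidate data source against the target component. The scoring scheme below is invented for illustration; it is not the ASME/ANS or GEH/Argonne procedure, and the example component properties are hypothetical:

```python
# Hypothetical target: the intermediate heat exchanger of a sodium fast reactor.
TARGET = {
    "function": "heat transfer",
    "failure_modes": {"external leakage", "tube rupture"},
    "environment": "liquid sodium",
}

def relevancy_score(candidate):
    """Toy 0-3 score: one point per matching property."""
    score = 0
    if candidate["function"] == TARGET["function"]:
        score += 1
    # the candidate data must cover at least the failure modes modelled in the PRA
    if candidate["failure_modes"] >= TARGET["failure_modes"]:
        score += 1
    if candidate["environment"] == TARGET["environment"]:
        score += 1
    return score
```

A sodium-cooled heat exchanger record from an advanced-reactor database would score 3, while a water-service exchanger from a non-nuclear database might score only 1, signalling that its data needs expert adjustment before use.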

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanehira, T; Sutherland, K; Matsuura, T

    Purpose: To evaluate density inhomogeneities which can affect dose distributions for real-time image gated spot-scanning proton therapy (RGPT), a dose calculation system, using treatment planning system VQA (Hitachi Ltd., Tokyo) spot position data, was developed based on Geant4. Methods: A Geant4 application was developed to simulate spot-scanned proton beams at Hokkaido University Hospital. A CT scan (0.98 × 0.98 × 1.25 mm) was performed for prostate cancer treatment with three or four inserted gold markers (diameter 1.5 mm, volume 1.77 mm3) in or near the target tumor. The CT data was read into VQA. A spot scanning plan was generated and exported to text files, specifying the beam energy and position of each spot. The text files were converted and read into our Geant4-based software. The spot position was converted into steering magnet field strength (in tesla) for our beam nozzle. Individual protons were tracked from the vacuum chamber, through the helium chamber, steering magnets, dose monitors, etc., in a straight, horizontal line. The patient CT data was converted into materials with variable density and placed in a parametrized volume at the isocenter. Gold fiducial markers were represented in the CT data by two adjacent voxels (volume 2.38 mm3). 600,000 proton histories were tracked for each target spot. As one beam contained about 1,000 spots, approximately 600 million histories were recorded for each beam on a blade server. Two plans were considered: two-beam horizontal opposed (90 and 270 degrees) and three-beam (0, 90 and 270 degrees). Results: We are able to convert spot scanning plans from VQA and simulate them with our Geant4-based code. Our system can be used to evaluate the effect of dose reduction caused by gold markers used for RGPT. Conclusion: Our Geant4 application is able to calculate dose distributions for spot-scanned proton therapy.
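
Converting a planned spot position into a steering-magnet field strength amounts, to first order, to a small-angle deflection of a proton of known magnetic rigidity. A textbook-level sketch of that conversion (the magnet length and drift distance are hypothetical, not the parameters of the Hokkaido University nozzle):

```python
import math

C = 299_792_458.0   # speed of light, m/s
M0 = 938.272        # proton rest mass, MeV/c^2

def steering_field_T(spot_offset_m, drift_m, magnet_len_m, kinetic_MeV):
    """Small-angle estimate of the dipole field needed to place a spot
    `spot_offset_m` off-axis at the isocenter, `drift_m` downstream."""
    total = kinetic_MeV + M0
    p_MeV = math.sqrt(total**2 - M0**2)      # relativistic momentum, MeV/c
    brho = p_MeV * 1e6 / C                   # magnetic rigidity p/q, T*m
    theta = spot_offset_m / drift_m          # required deflection angle (small-angle)
    return theta * brho / magnet_len_m       # theta = B*L / (p/q)
```

For a 200 MeV proton (rigidity about 2.15 T·m), steering a spot 5 cm off-axis through a 2 m drift with a 0.3 m magnet calls for a field on the order of 0.18 T, which is the kind of lookup the text-file conversion step performs for every spot.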

  14. Interference studies with two hospital-grade and two home-grade glucose meters.

    PubMed

    Lyon, Martha E; Baskin, Leland B; Braakman, Sandy; Presti, Steven; Dubois, Jeffrey; Shirey, Terry

    2009-10-01

    Interference studies of four glucose meters (Nova Biomedical [Waltham, MA] StatStrip [hospital grade], Roche Diagnostics [Indianapolis, IN] Accu-Chek Aviva [home grade], Abbott Diabetes Care [Alameda, CA] Precision FreeStyle Freedom [home grade], and LifeScan [Milpitas, CA] SureStep Flexx [hospital grade]) were evaluated and compared to the clinical laboratory plasma hexokinase reference method (Roche Hitachi 912 chemistry analyzer). These meters were chosen to reflect the continuum of care from hospital-grade to home-grade meters commonly seen in North America. Within-run precision was determined using a freshly prepared whole blood sample spiked with concentrated glucose to give three glucose concentrations. Day-to-day precision was evaluated using aqueous control materials supplied by each vendor. Common interferences, including hematocrit, maltose, and ascorbate, were tested alone and in combination with one another on each of the four glucose testing devices at three blood glucose concentrations. Within-run precision for all glucose meters was <5% except for the FreeStyle (up to 7.6%). Between-day precision was <6% for all glucose meters. Ascorbate caused differences (percentage change from a sample without added interfering substances) of >5% with pyrroloquinolinequinone (PQQ)-glucose dehydrogenase-based technologies (Aviva and FreeStyle) and the glucose oxidase-based Flexx meter. Maltose strongly affected the PQQ-glucose dehydrogenase-based meter systems. When combinations of interferences (ascorbate, maltose, and hematocrit mixtures) were tested, the extent of the interference was up to 193% (Aviva), 179% (FreeStyle), 25.1% (Flexx), and 5.9% (StatStrip). The interference was most pronounced at low glucose (3.9-4.4 mmol/L). All evaluated glucose meter systems demonstrated varying degrees of interference by hematocrit, ascorbate, and maltose mixtures. PQQ-glucose dehydrogenase-based technologies showed greater susceptibility than glucose oxidase-based systems.
However, the modified glucose oxidase-based amperometric method (Nova StatStrip) was less affected in comparison with the glucose oxidase-based photometric method (LifeScan SureStep Flexx).

  15. LiF TLD-100 as a Dosimeter in High Energy Proton Beam Therapy-Can It Yield Accurate Results?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zullo, John R.; Kudchadker, Rajat J.; Zhu, X. Ronald

    In the region of high-dose gradients at the end of the proton range, the stopping power ratio of the protons undergoes significant changes, allowing for a broad spectrum of proton energies to be deposited within a relatively small volume. Because of the potential linear energy transfer dependence of LiF TLD-100 (thermoluminescent dosimeter), dose measurements made in the distal fall-off region of a proton beam may be less accurate than those made in regions of low-dose gradients. The purpose of this study is to determine the accuracy and precision of dose measured using TLD-100 for a pristine Bragg peak, particularly in the distal fall-off region. All measurements were made along the central axis of an unmodulated 200-MeV proton beam from a Probeat passive beam-scattering proton accelerator (Hitachi, Ltd., Tokyo, Japan) at varying depths along the Bragg peak. Measurements were made using TLD-100 powder flat packs placed in a virtual water slab phantom. The measurements were repeated using a parallel plate ionization chamber. The dose measurements using TLD-100 in a proton beam were accurate to within ±5.0% of the expected dose, consistent with our past photon and electron measurements. The ionization chamber and the TLD relative dose measurements agreed well with each other. Absolute dose measurements using TLD agreed with ionization chamber measurements to within ±3.0 cGy for an exposure of 100 cGy. In our study, the differences between the dose measured by the ionization chamber and that measured by TLD-100 were minimal, indicating that the accuracy and precision of measurements made in the distal fall-off region of a pristine Bragg peak are within the expected range. Thus, the rapid change in stopping power ratios at the end of the range should not affect such measurements, and TLD-100 may be used with confidence as an in vivo dosimeter for proton beam therapy.

  16. Copper Decoration of Carbon Nanotubes and High Resolution Electron Microscopy

    NASA Astrophysics Data System (ADS)

    Probst, Camille

    A new process of decorating carbon nanotubes with copper was developed for the fabrication of aluminum-nanotube nanocomposites. The process consists of three stages: oxidation, activation, and electroless copper plating on the nanotubes. The oxidation step was required to create chemical functions on the nanotubes, essential for the activation step. Then, catalytic nanoparticles of tin-palladium were deposited on the tubes. Finally, during the electroless copper plating, copper particles with a size between 20 and 60 nm were uniformly deposited on the nanotube surfaces. The reproducibility of the process was shown by using another type of carbon nanotube. The fabrication of aluminum-nanotube nanocomposites was tested by aluminum vacuum infiltration. Although the infiltration of carbon nanotubes did not produce the expected results, an interesting electron microscopy sample was discovered during the process development: the activated carbon nanotubes. Secondly, scanning transmission electron microscopy (STEM) imaging in the SEM was analysed. The images were obtained with a new detector on a field emission scanning electron microscope (Hitachi S-4700). Various parameters were analysed with the use of two different samples: the activated carbon nanotubes (previously obtained) and gold-palladium nanodeposits. The influences of working distance, accelerating voltage, and the sample used on the spatial resolution of images obtained with SMART (Scanning Microscope Assessment and Resolution Testing) were analysed. An optimum working distance for the best spatial resolution, related to the sample analysed, was found for imaging in STEM mode. Finally, the relation between probe size and the spatial resolution of backscattered electron (BSE) images was studied. An image synthesis method was developed to generate BSE images from backscattered electron coefficients obtained with the CASINO software. The spatial resolution of the images was determined using SMART. The analysis showed that using a probe size smaller than the size of the observed object (sample features) does not improve the spatial resolution. In addition, the effects of the accelerating voltage, the current intensity, and the sample geometry and composition were analysed.

  17. Spin-Valve and Spin-Tunneling Devices: Read Heads, MRAMs, Field Sensors

    NASA Astrophysics Data System (ADS)

    Freitas, P. P.

    Hard disk magnetic data storage is increasing at a steady rate in terms of units sold, with 144 million drives sold in 1998 (107 million for desktops, 18 million for portables, and 19 million for enterprise drives), corresponding to a total business of US$34 billion [1]. The growing need for storage coming from new PC operating systems, Internet applications, and a foreseen explosion of applications connected to consumer electronics (digital TV, video, digital cameras, GPS systems, etc.) keeps the magnetics community actively looking for new solutions concerning media, heads, tribology, and system electronics. Current state-of-the-art disk drives (January 2000), using dual inductive-write, magnetoresistive-read (MR) integrated heads, reach areal densities of 15 to 23 bit/μm², capable of putting a full 20 GB on one platter (a 2 hour film occupies 10 GB). Densities beyond 80 bit/μm² have already been demonstrated in the laboratory (Fujitsu 87 bit/μm² at Intermag 2000; Hitachi 81 bit/μm², Read-Rite 78 bit/μm², and Seagate 70 bit/μm², all three demos done in the first 6 months of 2000; IBM having demonstrated 56 bit/μm² already at the end of 1999). At densities near 60 bit/μm², the linear bit size is ~43 nm, and the width of the written tracks is ~0.23 μm. Areal density in commercial drives is increasing steadily at a rate of nearly 100% per year [1], and consumer products above 60 bit/μm² are expected by 2002. These remarkable achievements are only possible through a stream of technological innovations in media [2], write heads [3], read heads [4], and system electronics [5]. In this chapter, recent advances in spin valve materials and spin valve sensor architectures, low resistance tunnel junctions, and tunnel junction head architectures will be addressed.

  18. Dysfunction in gastric myoelectric and motor activity in Helicobacter pylori positive gastritis patients with non-ulcer dyspepsia.

    PubMed

    Thor, P; Lorens, K; Tabor, S; Herman, R; Konturek, J W; Konturek, S J

    1996-09-01

    Helicobacter pylori (Hp) infection has been shown to affect gastric acid secretion and the somatostatin-gastrin ratio, but its effects on gastric motility have not been evaluated. This study was carried out in 12 patients (10 males and 2 females, mean age 33 +/- 6 yrs) who underwent endoscopy and the Campylobacter-like Organism (CLO) test. All patients were found initially to be Hp positive according to the CLO test. Gastric emptying was evaluated by measuring antral diameter with ultrasonography (Hitachi EUB 240) in fasted and fed patients. Electrogastrography (EGG) with antral manometry was done 5 h before and 4 h after a meal, before therapy and one month after eradication with triple therapy (lansoprazole 30 mg daily, clarithromycin 500 mg t.i.d., and metronidazole 500 mg b.i.d.). In Hp positive patients before the triple therapy, the mean fasted antral diameter was 4.3 cm2; the initial EGG showed significant dysrhythmia of electrical control activity (ECA), with tachygastria up to 25% of recording time in 9 of 12 Hp positive patients, without the normal increase in signal power in any of the tested subjects. In 7 fasted Hp positive patients, antral manometry failed to exhibit gastric phase III of the migrating motor complex (MMC). Hp eradication was accomplished in 10 of 12 examined patients and was followed by reversion of tachygastria to a 3 cpm rhythm, with an increase in ECA power after the meal. Phase III of the MMC was observed again in 7 Hp negative patients, with a decrease in fasted antral diameter (p < 0.05). Fasted and fed antral motility increased after eradication. Two patients remained Hp positive after standard therapy. We conclude that most symptomatic non-ulcer dyspeptic Hp positive patients show changes in ECA and antral hypomotility that are associated with Hp infection.
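
Tachygastria is identified from the dominant frequency of the EGG trace (roughly 3 cycles per minute is normogastric; sustained activity above about 4 cpm is tachygastric). A naive DFT sketch of that dominant-frequency estimate, for illustration only, not the clinical EGG analysis software:

```python
import math

def dominant_cpm(samples, fs_hz):
    """Dominant frequency of an EGG trace, in cycles per minute, via a naive DFT."""
    n = len(samples)
    best_k, best_power = 1, -1.0
    for k in range(1, n // 2):
        # real and imaginary parts of DFT bin k
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        power = re * re + im * im
        if power > best_power:
            best_k, best_power = k, power
    # bin k corresponds to k * fs / n Hz; convert Hz to cycles per minute
    return 60.0 * best_k * fs_hz / n
```

Applied to a 120 s trace containing a clean 0.05 Hz oscillation, this returns 3.0 cpm, the normal gastric slow-wave rhythm; a real analysis would additionally examine what fraction of the recording time the dominant frequency spends in the tachygastric band.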

  19. PREFACE: Synthesis and integration of nanowires

    NASA Astrophysics Data System (ADS)

    Samuelson, L.

    2006-06-01

    The field of semiconductor nanowires has attracted much attention in recent years, from the areas of basic materials science, advanced characterization and technology, as well as from the perspective of the applications of nanowires. Research on large-sized whiskers and wires had already begun in the 1960s with the pioneering work of Wagner, as well as by other researchers. It was, however, in the early 1990s that Kenji Hiruma at Hitachi Central Research Laboratories in Japan first succeeded in developing methods for the growth of nanowires with dimensions on the scale of 10-100 nm, thereby initiating the field of growth and applications of nanowires, with a strong emphasis on epitaxial nucleation of nanowires on a single-crystalline substrate. Starting from the mid-1990s, the field developed very rapidly with the number of papers on the subject growing from ten per year to several thousand papers on the subject published annually today, although with a rather generous definition of the concept of nanowires. With this rapid development we have seen many new and different approaches to the growth of nanowires, technological advances leading to a more well-controlled formation of nanowires, new innovative methods for the characterization of structures, as well as a wealth of approaches towards the use of nanowires in electronics, photonics and sensor applications. This issue contains contributions from many different laboratories, each adding significant detail to the development of the field of research. The contributions cover issues such as basic growth, advanced characterization and technology, and application of nanowires. I would like to acknowledge the shared responsibilities for this special issue of Nanotechnology on the synthesis and integration of nanowires with my co-Editors, S Tong Lee and M Sunkara, as well as the highly professional support from Dr Nina Couzin, Dr Ian Forbes and the Nanotechnology team from the Institute of Physics Publishing.

  20. Influence of particle size on Cutting Forces and Surface Roughness in Machining of B4Cp - 6061 Aluminium Matrix Composites

    NASA Astrophysics Data System (ADS)

    Hiremath, Vijaykumar; Badiger, Pradeep; Auradi, V.; Dundur, S. T.; Kori, S. A.

    2016-02-01

    Amongst advanced materials, metal matrix composites (MMCs) are gaining importance as materials for structural applications; in particular, particulate reinforced aluminium MMCs have received considerable attention due to their superior properties such as high strength-to-weight ratio, excellent low-temperature performance, high wear resistance, and high thermal conductivity. The present study compares the machinability of B4Cp reinforced 6061Al alloy metal matrix composites reinforced with 37 μm and 88 μm particulates, produced by the stir casting method. The microstructural characterization of the prepared composites was done using scanning electron microscopy equipped with EDX analysis (Hitachi SU-1500 model) to identify the morphology and distribution of B4C particles in the 6061Al matrix. The specimens were turned on a conventional lathe using a polycrystalline diamond (PCD) tool to study the effect of particle size on cutting forces and surface roughness under varying machining parameters, viz. cutting speed (29-45 m/min), feed rate (0.11-0.33 mm/rev), and depth of cut (0.5-1 mm). Results of microstructural characterization revealed fairly uniform distribution of B4C particles (in both cases, i.e., 37 μm and 88 μm) in the 6061Al matrix. The surface roughness of the composite is influenced by cutting speed; feed rate and depth of cut have a negative influence on surface roughness. Cutting forces decreased with increasing cutting speed, whereas they increased with increasing feed and depth of cut. Higher cutting forces were noticed while machining the Al6061 base alloy compared to the reinforced composites. Surface finish was best when turning the 6061Al base alloy, and surface roughness was highest with the 88 μm particle reinforced composites; as particle size increases, surface roughness also increases.
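
Surface roughness in turning studies like this one is usually reported as the arithmetic-mean roughness Ra, the mean absolute deviation of the measured profile from its centerline. A minimal sketch (the abstract does not state which roughness parameter was used, so Ra here is an assumption):

```python
def roughness_ra(profile_heights_um):
    """Arithmetic-mean roughness Ra: mean absolute deviation of the surface
    profile from its mean line, in the same units as the input heights."""
    n = len(profile_heights_um)
    mean = sum(profile_heights_um) / n
    return sum(abs(z - mean) for z in profile_heights_um) / n
```

A square-wave profile alternating ±1 μm about its mean line gives Ra = 1 μm, while a perfectly flat trace gives zero, which matches the qualitative trend reported here: coarser 88 μm particles pulling out of the matrix leave a rougher profile and hence a larger Ra.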

  1. Effects of two types of medical contrast media on routine chemistry results by three automated chemistry analyzers.

    PubMed

    Park, Yu Jin; Rim, John Hoon; Yim, Jisook; Lee, Sang-Guk; Kim, Jeong-Ho

    2017-08-01

    The use of iodinated contrast media has grown in popularity in the past two decades, but relatively little attention has been paid to the possible interferential effects of contrast media on laboratory test results. Herein, we investigate medical contrast media interference with routine chemistry results obtained by three automated chemistry analyzers. Ten levels of pooled serum were used in the study. Two types of medical contrast media [Iopamiro (iopamidol) and Omnipaque (iohexol)] were evaluated. To evaluate the dose-dependent effects of the contrast media, iopamidol and iohexol were spiked separately into aliquots of serum at final concentrations of 1.8%, 3.6%, 5.5%, 7.3%, and 9.1%. The 28 analytes included in the routine chemistry panel were measured using Hitachi 7600, AU5800, and Cobas c702 analyzers. We calculated the delta percentage difference (DPD) between the samples and the control, and examined dose-dependent trends. When the mean DPD values were compared with the reference cut-off criteria, the only interferential effect observed uniformly across all analyzers was in total protein with iopamidol. Two additional analytes that showed trends toward interferential effects in only a few analyzers and exceeded the limits of allowable error were serum iron and total CO2. The other combinations of analyzer and contrast showed no consistent dose-dependent propensity for change in any analyte level. Our study suggests that most of the analytes included in routine chemistry results, except total protein and serum iron, are not significantly affected by iopamidol and iohexol. These results suggest that it would be beneficial to apply a flexible medical evaluation process for patients requiring both laboratory tests and imaging studies, minimizing the need for strict regulations for sequential tests. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
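
    The delta percentage difference used above can be sketched as a one-line calculation; this is a hypothetical illustration assuming the usual definition, (spiked − control) / control × 100, and the function name and sample values are not from the study:

```python
# Hypothetical sketch of the delta percentage difference (DPD) used to
# flag contrast-media interference: (spiked - control) / control * 100.
# Function name and values are illustrative, not from the study.

def delta_percentage_difference(spiked: float, control: float) -> float:
    """Percent change of a spiked-sample result relative to the control."""
    return (spiked - control) / control * 100.0

# Example: synthetic total-protein results (g/dL) at increasing
# contrast-media fractions, showing a dose-dependent downward trend.
control = 7.0
spiked_series = {0.018: 6.9, 0.036: 6.8, 0.055: 6.6, 0.073: 6.5, 0.091: 6.3}

for fraction, result in spiked_series.items():
    dpd = delta_percentage_difference(result, control)
    print(f"{fraction:.1%} contrast: DPD = {dpd:+.1f}%")
```

    A DPD exceeding a predefined allowable-error cut-off at increasing contrast fractions is what the study treats as a dose-dependent interference trend.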

  2. SU-F-T-189: Dosimetric Comparison of Spot-Scanning Proton Therapy Techniques for Liver Tumors Close to the Skin Surface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takao, S; Matsuzaki, Y; Matsuura, T

    Purpose: The spot-scanning technique has been utilized to achieve conformal dose distributions for large and complicated tumors. This technique generally does not require patient-specific devices such as apertures and compensators. Commercially available spot-scanning proton therapy (SSPT) systems, however, cannot deliver proton beams to regions shallower than 4 g/cm2, so some range compensation device is required to treat superficial tumors with SSPT. This study presents a dosimetric comparison of the following treatment techniques: (i) with a tabletop bolus, (ii) with a nozzle-mounted applicator, and (iii) without any devices, using the intensity-modulated proton therapy (IMPT) technique. Methods: The applicator, composed of a combination of a mini-ridge filter and a range shifter, was manufactured by Hitachi, Ltd., and the tabletop bolus was made by .decimal, Inc. Both devices have been clinically implemented in our facility. Three patients with liver tumors close to the skin surface were examined in this study. Each treatment plan was optimized so that the prescription dose of 76 Gy(RBE) or 66 Gy(RBE) would be delivered to 99% of the clinical target volume in 20 fractions. Three beams were used for the tabletop bolus plan and the IMPT plan, whereas two beams were used in the applicator plan because the available gantry angles were limited due to potential collision with the patient and couch. The normal liver, colon, and skin were considered as organs at risk (OARs). Results: The target heterogeneity index (HI = D5/D95) was 1.03 on average for each planning technique. The mean dose to the normal liver was considerably less than 20 Gy(RBE) in all cases. The dose to the skin could be reduced by 20 Gy(RBE) on average in the IMPT plan compared to the applicator plan. Conclusion: It has been confirmed that all treatment techniques met the dosimetric criteria for the OARs and could be implemented clinically.
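
    The heterogeneity index quoted above, HI = D5/D95, can be sketched from a flat list of voxel doses; this is a hedged illustration (function names and the synthetic dose values are assumptions, not the study's data):

```python
# Hedged sketch of the target heterogeneity index HI = D5/D95:
# D5 and D95 are the minimum doses received by the hottest 5% and 95%
# of the target volume. Names and values are illustrative only.

def dose_at_volume(doses, percent_volume):
    """Dx: the minimum dose received by the hottest x% of the volume."""
    ranked = sorted(doses, reverse=True)
    index = max(0, round(len(ranked) * percent_volume / 100.0) - 1)
    return ranked[index]

def heterogeneity_index(doses):
    return dose_at_volume(doses, 5.0) / dose_at_volume(doses, 95.0)

# A nearly uniform target dose gives HI close to 1, as reported (1.03).
doses = [76.0] * 90 + [78.0] * 5 + [74.5] * 5   # Gy(RBE), synthetic
print(f"HI = {heterogeneity_index(doses):.2f}")
```

    An HI of exactly 1 would mean a perfectly uniform target dose; values near 1.03 indicate the small residual heterogeneity typical of well-optimized plans.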

  3. Interface bonding of NiCrAlY coating on laser modified H13 tool steel surface

    NASA Astrophysics Data System (ADS)

    Reza, M. S.; Aqida, S. N.; Ismail, I.

    2016-06-01

    Bonding strength of thermal spray coatings depends on the interfacial adhesion between the bond coat and the substrate material. In this paper, NiCrAlY (Ni-164/211: Ni-22%Cr-10%Al-1.0%Y) coatings were developed on laser modified H13 tool steel surfaces using atmospheric plasma spray (APS). Different laser peak powers, Pp, and duty cycles, DC, were investigated in order to improve the mechanical properties of the H13 tool steel surface. The APS spraying parameters were held constant for all coatings. The coating microstructure near the interface was analyzed using an IM7000 inverted optical microscope. The interface bonding of NiCrAlY was investigated by the interfacial indentation test (IIT) method using an MMT-X7 Matsuzawa hardness tester with a Vickers indenter. Diffusion of atoms across the NiCrAlY coating, laser modified and substrate layers was investigated by energy-dispersive X-ray spectroscopy (EDXS) using a Hitachi Tabletop Microscope TM3030 Plus. Based on the IIT results, the average interfacial toughness, Kavg, for the reference sample was 2.15 MPa m1/2, compared with 6.02 to 6.96 MPa m1/2 for sample L1 and 2.47 to 3.46 MPa m1/2 for sample L2. Hence, according to Kavg, sample L1, which was laser modified at lower laser peak power, Pp, and higher duty cycle, DC, prior to coating, has the strongest interface bonding. The EDXS analysis indicated the presence of Fe in the NiCrAlY coating layer and increased Ni and Cr composition in the laser modified layer, showing that atomic diffusion of Fe, Ni and Cr occurred in both the coating and the laser modified layers. These findings demonstrate enhancement of the coating system by substrate surface modification that allows atomic diffusion.

  4. Attempt to assess the infiltration of enamel made with experimental preparation using a scanning electron microscope.

    PubMed

    Skucha-Nowak, Małgorzata

    2015-01-01

    The resin infiltration technique, a minimally invasive method, involves the saturation, strengthening, and stabilization of demineralized enamel by a mixture of polymer resins without the need to use rotary tools or the risk of losing healthy tooth structures. The aims were to design and synthesize an experimental infiltrant with potential bacteriostatic properties, and to compare the depth of infiltration of the designed experimental preparation with a commercially available infiltrant using a scanning electron microscope. The composition of the experimental infiltrant was established after analysis of 1H NMR spectra of commercially available compounds that can penetrate the pores of demineralized enamel. As the infiltrant should have bacteriostatic features by definition, 1% of a monomer containing metronidazole was added. Thirty extracted human teeth were soaked in an acidic solution to provide appropriate conditions for demineralization of enamel. Afterward, each tooth was divided along the coronal-root axis into two zones. One zone had the experimental preparation applied to it (the test group), while the other had the commercially available Icon (the control group). The teeth were dissected along the long axis, and the zones described above underwent initial observation with a Hitachi S-4200 scanning electron microscope. It was found that all samples contained only oxygen and carbon, regardless of the concentration of additions introduced into them. The occurrence of carbon is partially because it is a component of the preparation in question and partially because of sputter-coating of the sample with it. Hydrogen is also a component of the preparation, as a result of its phase composition; however, it cannot be detected by the EDS method. SEM, in combination with X-ray microanalysis, does not allow one to explicitly assess the depth of penetration of infiltration preparations into enamel. In order to assess the depth of penetration of infiltration preparations with X-ray microanalysis, it is recommended to introduce a contrast agent that is approved for use in dental materials, such as ytterbium(III) fluoride.

  5. Multi-scale 3D characterization of long period stacking ordered structure in Mg-Zn-Gd cast alloys.

    PubMed

    Ishida, Masahiro; Yoshioka, Satoru; Yamamoto, Tomokazu; Yasuda, Kazuhiro; Matsumura, Syo

    2014-11-01

    Magnesium alloys containing rare earth elements are attractive as lightweight structural materials due to their low density, high specific strength and recycling efficiency. The Mg-Zn-Gd system is one of the most promising because of its high creep resistance [1]. Coherent precipitation of the 14H long period stacking ordered (LPSO) structure has been reported in the Mg-Zn-Gd system at temperatures higher than 623 K [2,3]. In this study, the 14H LPSO phase formed in Mg-Zn-Gd alloys was investigated by multi-scale characterization with X-ray computed tomography (X-CT), focused ion beam (FIB) tomography and aberration-corrected STEM observation for further understanding of the LPSO formation mechanism. The Mg89.5Zn4.5Gd6 alloy ingots were cast using high-frequency induction heating in an argon atmosphere. The specimens were aged at 753 K for 24 h in air, then cut and polished mechanically for microstructural analysis. Micrometer-resolution X-CT observation was performed with a conventional scanner (Bruker SkyScan 1172) at 80 kV. FIB tomography and energy dispersive X-ray spectroscopy (EDS) were carried out with a dual beam FIB-SEM system (Hitachi MI-4000L) with a silicon drift detector (SDD) (Oxford X-Max(N)). Acceleration voltages of 3 kV were used for SEM observation and 10 kV for EDS. The 3D reconstruction from the image series was performed with Avizo Fire 8.0 software (FEI). TEM/STEM observations were also performed with transmission electron microscopes (JEOL JEM 2100, JEM-ARM 200F) at an acceleration voltage of 200 kV. The LPSO phase was observed clearly in the SEM image of the Mg89.5Zn4.5Gd6 alloy aged at 753 K for 2 h (Fig. 1(a)). The atomic structure of the LPSO phase, observed as the white-gray regions of the SEM image, was confirmed as the 14H LPSO structure using selected area electron diffraction patterns and high-resolution STEM observations. The elemental composition of the LPSO phase was determined as Mg97Zn1Gd2 by EDS analyses. The 3D representation of the LPSO phase shown in Fig. 1(b) reveals that the shape of the LPSO phase was disk-like. The calculated volume fraction of LPSO was about 20%, which is consistent with the value estimated from the initial composition. The stacked LPSO disks were distributed along a 3D network. It is suggested that this 3D structure is related to the distribution of Mg3Gd compounds observed in as-cast specimens. Fig. 1. (a) SEM image of the Mg89.5Zn4.5Gd6 alloy aged at 753 K for 2 h. (b) 3D representation of the tomographic reconstruction from SEM images; the solid parts of the 3D volume are the 14H LPSO phase. © The Author 2014. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
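
    The volume fraction quoted above (~20%) is, in essence, the fraction of voxels labelled as the phase in the segmented 3D reconstruction. A minimal sketch of that calculation, using a synthetic binary volume rather than the study's FIB-SEM data:

```python
import numpy as np

# Minimal sketch: a phase volume fraction read off a segmented
# tomographic volume as the fraction of voxels labelled as the phase.
# The 64^3 binary volume is synthetic; real input would come from the
# aligned FIB-SEM image series.

rng = np.random.default_rng(0)
segmented = rng.random((64, 64, 64)) < 0.20   # True = voxel labelled LPSO

volume_fraction = segmented.mean()            # fraction of LPSO voxels
print(f"LPSO volume fraction ≈ {volume_fraction:.1%}")
```

    In practice the segmentation step (thresholding the contrast between the LPSO phase and the Mg matrix) dominates the uncertainty; the fraction itself is a single mean over the labelled voxels.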

  6. SU-D-210-07: The Dependence On Acoustic Velocity of Medium On the Needle Template and Electronic Grid Alignment in Ultrasound QA for Prostate Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kapoor, P; Kapoor, R; Curran, B

    Purpose: To analyze the impact of the acoustic velocity (AV) of two different media (water and milk) on the needle template/electronic grid alignment test. Water, being easily available, makes a good material for testing the alignment of the template and grid, although water's AV (1498 m/s at 25°C) is significantly different from that of tissue (1540 m/s). Milk, with an AV much closer (1548 m/s) to that of prostate tissue, may be a good substitute for water in ultrasound quality assurance testing. Methods: Tests were performed using a Hitachi ultrasound unit with a mechanical arrangement designed to position needles parallel to the transducer. In this work, two materials, distilled water and homogenized whole milk (AVs of 1498 and 1548 m/s at 25°C), were used in a phantom to test ultrasound needle/grid alignment. Images were obtained with both materials and analyzed for placement accuracy. Results: The needle template/electronic grid alignment tests showed displacement errors between measured and calculated values. The measurements showed displacements from true needle positions of 2.3 mm (water) and 0.4 mm (milk) at a depth of 7 cm, and 1.6 mm (water) and 0.3 mm (milk) at a depth of 5 cm. The calculated results showed displacements of 2.36 mm (water) and 0.435 mm (milk) at 7 cm, and 1.66 mm (water) and 0.31 mm (milk) at 5 cm. The displacements in the X and Y directions were also calculated. At depths of 7 cm and 5 cm, the (ΔX, ΔY) displacements in water were (0.829 mm, 2.21 mm) and (0.273 mm, 1.634 mm), and for milk were (0.15 mm, 0.44 mm) and (0.05 mm, 0.302 mm), respectively. Conclusion: The measured and calculated values were in good agreement for all tests. They show that milk provides superior results when performing needle template and electronic grid alignment tests for ultrasound units used in prostate brachytherapy.
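
    The quoted total displacements are consistent with the root-sum-square of the (ΔX, ΔY) components, e.g. water at 7 cm: sqrt(0.829² + 2.21²) ≈ 2.36 mm. A quick check of that arithmetic (the combination rule is an assumption inferred from the numbers, not stated in the abstract):

```python
import math

# Check: total displacement as the root-sum-square of the X and Y
# components reported in the abstract.

def resultant_displacement(dx_mm, dy_mm):
    return math.hypot(dx_mm, dy_mm)

for medium, depth_cm, dx, dy in [
    ("water", 7, 0.829, 2.210),
    ("water", 5, 0.273, 1.634),
    ("milk",  7, 0.150, 0.440),
    ("milk",  5, 0.050, 0.302),
]:
    print(f"{medium} at {depth_cm} cm: {resultant_displacement(dx, dy):.2f} mm")
```

    Three of the four quoted totals match this root-sum-square to within 0.01 mm; the 7 cm milk value comes out slightly higher (0.46 mm vs the quoted 0.435 mm), so the exact calculation used there may differ.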

  7. Atomic force microscopy and scanning electron microscopy analysis of daily disposable limbal ring contact lenses.

    PubMed

    Lorenz, Kathrine Osborn; Kakkassery, Joseph; Boree, Danielle; Pinto, David

    2014-09-01

    Limbal ring (also known as 'circle') contact lenses are becoming increasingly popular, especially in Asian markets because of their eye-enhancing effects. The pigment particles that give the eye-enhancing effects of these lenses can be found on the front or back surface of the contact lens or 'enclosed' within the lens matrix. The purpose of this research was to evaluate the pigment location and surface roughness of seven types of 'circle' contact lenses. Scanning electron microscopic (SEM) analysis was performed using a variable pressure Hitachi S3400N instrument to discern the placement of lens pigments. Atomic force microscopy (Dimension Icon AFM from Bruker Nano) was used to determine the surface roughness of the pigmented regions of the contact lenses. Atomic force microscopic analysis was performed in fluid phase under contact mode using a Sharp Nitride Lever probe (SNL-10) with a spring constant of 0.06 N/m. Root mean square (RMS) roughness values were analysed using a generalised linear mixed model with a log-normal distribution. Least square means and their corresponding 95% confidence intervals were estimated for each brand, location and pigment combination. SEM cross-sectional images at 500× and 2,000× magnification showed pigment on the surface of six of the seven lens types tested. The mean depth of pigment for 1-DAY ACUVUE DEFINE (1DAD) lenses was 8.1 μm below the surface of the lens, while the remaining lens types tested had pigment particles on the front or back surface. Results of the atomic force microscopic analysis indicated that 1DAD lenses had significantly lower root mean square roughness values in the pigmented area of the lens than the other lens types tested. SEM and AFM analysis revealed pigment on the surface of the lens for all types tested with the exception of 1DAD. Further research is required to determine if the difference in pigment location influences on-eye performance. © 2014 The Authors. 
Clinical and Experimental Optometry © 2014 Optometrists Association Australia.
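
    The RMS roughness compared across lens types above is, in the usual AFM sense, the standard deviation of surface heights about the mean plane. A minimal sketch with synthetic height samples (the values and the flat-list simplification are illustrative assumptions; real AFM data is a 2D height map):

```python
import math

# Minimal sketch of root mean square (RMS) roughness as commonly derived
# from AFM height data: the standard deviation of surface heights (nm)
# about the mean plane. Height samples below are synthetic.

def rms_roughness(heights_nm):
    mean = sum(heights_nm) / len(heights_nm)
    return math.sqrt(sum((h - mean) ** 2 for h in heights_nm) / len(heights_nm))

smooth = [0.0, 1.0, -1.0, 0.5, -0.5]      # e.g. pigment below the surface
rough = [0.0, 12.0, -9.0, 15.0, -11.0]    # e.g. pigment on the surface
print(rms_roughness(smooth) < rms_roughness(rough))  # True
```

    This is why sub-surface pigment (as found for the 1DAD lenses) shows up as a significantly lower RMS value than surface pigment in the study's comparison.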

  8. Release of cobalt and chromium ions into the serum following implantation of the metal-on-metal Maverick-type artificial lumbar disc (Medtronic Sofamor Danek).

    PubMed

    Zeh, Alexander; Planert, Michael; Siegert, Gabriele; Lattke, Peter; Held, Andreas; Hein, Werner

    2007-02-01

    Cross-sectional study of 10 patients to measure the serum levels of cobalt and chromium after TDA (total disc arthroplasty). To investigate the release of cobalt and chromium ions into the serum following implantation of the metal-on-metal Maverick-type artificial lumbar disc. In total hip endoprosthetics, and consequently for TDA, metal-on-metal combinations are used with the aim of reducing wear debris. In metal-on-metal TDA, the release of metal ions has until now been secondary in the discussion. We investigated the serum cobalt and chromium concentrations following implantation of 15 Maverick TDAs (monosegmental L5-S1, n = 5; bisegmental L4-L5 and L5-S1, n = 5; average age, 36.5 years). Five healthy subjects (no metal implants) acted as a control group. The measurements of the metals were carried out using a HITACHI Z-8200 AAS polarized Zeeman atomic absorption spectrometer after an average of 14.8 months. The concentrations of cobalt and chromium ions in the serum amounted on average to 4.75 microg/L (SD, 2.71) for cobalt and 1.10 microg/L (SD, 1.24) for chromium. Compared with the control group, both the chromium and the cobalt levels in the serum showed significant increases (Mann-Whitney U test, P = 0.0120). At follow-up, the Oswestry Disability Score was significantly decreased, on average by 24.4 points (L5-S1) (t test, P < 0.05) and by 26.8 points (L4-S1) (t test, P < 0.05). The improved clinical situation is also reflected in a significant decrease of the Visual Analog Pain Scale of 42.2 points at follow-up (t test, P < 0.05). Significant systemic release of Cr/Co into the serum was proven compared with the control group. The serum Cr/Co concentrations are similar in level to, or exceed, the values measured for metal-on-metal combinations in THA reported in the literature. The long-term implications of this metal exposure are unknown and should be studied further.

  9. Atomic force microscopy and scanning electron microscopy analysis of daily disposable limbal ring contact lenses

    PubMed Central

    Lorenz, Kathrine Osborn; Kakkassery, Joseph; Boree, Danielle; Pinto, David

    2014-01-01

    Background Limbal ring (also known as ‘circle’) contact lenses are becoming increasingly popular, especially in Asian markets because of their eye-enhancing effects. The pigment particles that give the eye-enhancing effects of these lenses can be found on the front or back surface of the contact lens or ‘enclosed’ within the lens matrix. The purpose of this research was to evaluate the pigment location and surface roughness of seven types of ‘circle’ contact lenses. Methods Scanning electron microscopic (SEM) analysis was performed using a variable pressure Hitachi S3400N instrument to discern the placement of lens pigments. Atomic force microscopy (Dimension Icon AFM from Bruker Nano) was used to determine the surface roughness of the pigmented regions of the contact lenses. Atomic force microscopic analysis was performed in fluid phase under contact mode using a Sharp Nitride Lever probe (SNL-10) with a spring constant of 0.06 N/m. Root mean square (RMS) roughness values were analysed using a generalised linear mixed model with a log-normal distribution. Least square means and their corresponding 95% confidence intervals were estimated for each brand, location and pigment combination. Results SEM cross-sectional images at 500× and 2,000× magnification showed pigment on the surface of six of the seven lens types tested. The mean depth of pigment for 1-DAY ACUVUE DEFINE (1DAD) lenses was 8.1 μm below the surface of the lens, while the remaining lens types tested had pigment particles on the front or back surface. Results of the atomic force microscopic analysis indicated that 1DAD lenses had significantly lower root mean square roughness values in the pigmented area of the lens than the other lens types tested. Conclusions SEM and AFM analysis revealed pigment on the surface of the lens for all types tested with the exception of 1DAD. Further research is required to determine if the difference in pigment location influences on-eye performance. PMID:24689948

  10. LiF TLD-100 as a dosimeter in high energy proton beam therapy--can it yield accurate results?

    PubMed

    Zullo, John R; Kudchadker, Rajat J; Zhu, X Ronald; Sahoo, Narayan; Gillin, Michael T

    2010-01-01

    In the region of high-dose gradients at the end of the proton range, the stopping power ratio of the protons undergoes significant changes, allowing a broad spectrum of proton energies to be deposited within a relatively small volume. Because of the potential linear energy transfer dependence of LiF TLD-100 (thermoluminescent dosimeter), dose measurements made in the distal fall-off region of a proton beam may be less accurate than those made in regions of low-dose gradients. The purpose of this study is to determine the accuracy and precision of dose measured using TLD-100 for a pristine Bragg peak, particularly in the distal fall-off region. All measurements were made along the central axis of an unmodulated 200-MeV proton beam from a Probeat passive beam-scattering proton accelerator (Hitachi, Ltd., Tokyo, Japan) at varying depths along the Bragg peak. Measurements were made using TLD-100 powder flat packs placed in a virtual water slab phantom, and were repeated using a parallel plate ionization chamber. The dose measurements using TLD-100 in a proton beam were accurate to within +/-5.0% of the expected dose, consistent with our past photon and electron measurements. The ionization chamber and TLD relative dose measurements agreed well with each other. Absolute dose measurements using TLD agreed with ionization chamber measurements to within +/-3.0 cGy for an exposure of 100 cGy. In our study, the differences between the doses measured by the ionization chamber and those measured by TLD-100 were minimal, indicating that the accuracy and precision of measurements made in the distal fall-off region of a pristine Bragg peak are within the expected range. Thus, the rapid change in stopping power ratios at the end of the range should not affect such measurements, and TLD-100 may be used with confidence as an in vivo dosimeter for proton beam therapy. Copyright 2010 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.

  11. WE-EF-303-05: Development and Commissioning of Real-Time Imaging Function for Respiratory-Gated Spot-Scanning Proton Beam Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miyamoto, N; Takao, S; Matsuura, T

    2015-06-15

    Purpose: To realize real-time-image gated proton beam therapy (RGPT) for treating mobile tumors. Methods: The rotating gantry of the spot-scanning proton beam therapy system has been designed to carry two x-ray fluoroscopy devices that enable real-time imaging of internal fiducial markers during respiration. The three-dimensional position of a fiducial marker located near the tumor can be calculated from the fluoroscopic images obtained from orthogonal directions, and the therapeutic beam is gated on only when the fiducial marker is within the predefined gating window. The image acquisition rate can be selected from discrete values ranging from 0.1 Hz to 30 Hz. In order to confirm the effectiveness of RGPT and apply it clinically, clinical commissioning was conducted. The commissioning tests were categorized into three main parts: geometric accuracy, temporal accuracy and dosimetric evaluation. Results: The developed real-time imaging function has been installed and its basic performance confirmed. In the evaluation of geometric accuracy, the coincidence of the three-dimensional treatment room coordinate system and the imaging coordinate system was confirmed to be better than 1 mm. Fiducial markers (gold sphere and coil) could be tracked in a simulated clinical condition using an anthropomorphic chest phantom. In the evaluation of temporal accuracy, the latency from image acquisition to the gate on/off signal was about 60 msec in a typical case. In the dosimetric evaluation, treatment beam characteristics including beam irradiation position and dose output were stable during gated irradiation. Homogeneity indices for the mobile target were 0.99 (static), 0.89 (without gating, motion parallel to the direction of scan), 0.75 (without gating, perpendicular), 0.98 (with gating, parallel) and 0.93 (with gating, perpendicular), showing that dose homogeneity to the mobile target can be maintained with RGPT. Conclusion: A real-time imaging function utilizing x-ray fluoroscopy has been developed and commissioned successfully in order to realize RGPT. Funding Support: This research was partially supported by the Japan Society for the Promotion of Science (JSPS) through the FIRST Program. Conflict of Interest: Prof. Shirato has research funding from Hitachi Ltd, Mitsubishi Heavy Industries Ltd and Shimadzu Corporation.
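
    The gating decision described above reduces to a simple geometric predicate: the beam is on only while the triangulated 3D marker position stays inside a predefined window. A hedged sketch of that logic; the function names and the 2 mm window size are illustrative assumptions, not the system's actual parameters:

```python
# Hedged sketch of fiducial-marker gating: compare the triangulated
# marker position with its planned position, axis by axis, against a
# gating-window tolerance. Names and window size are illustrative.

def gate_on(marker_mm, planned_mm, window_mm=2.0):
    """True when the marker is within window_mm of its planned position on every axis."""
    return all(abs(m - p) <= window_mm for m, p in zip(marker_mm, planned_mm))

planned = (0.0, 0.0, 0.0)
print(gate_on((0.5, -1.2, 0.8), planned))   # marker inside: beam enabled
print(gate_on((0.5, -2.6, 0.8), planned))   # marker outside: beam held off
```

    The ~60 msec latency measured in commissioning is the time between acquiring the fluoroscopic frame and this decision reaching the beam on/off signal.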

  12. Research interface on a programmable ultrasound scanner.

    PubMed

    Shamdasani, Vijay; Bae, Unmin; Sikdar, Siddhartha; Yoo, Yang Mo; Karadayi, Kerem; Managuli, Ravi; Kim, Yongmin

    2008-07-01

    Commercial ultrasound machines in the past did not provide the ultrasound researchers access to raw ultrasound data. Lack of this ability has impeded evaluation and clinical testing of novel ultrasound algorithms and applications. Recently, we developed a flexible ultrasound back-end where all the processing for the conventional ultrasound modes, such as B, M, color flow and spectral Doppler, was performed in software. The back-end has been incorporated into a commercial ultrasound machine, the Hitachi HiVision 5500. The goal of this work is to develop an ultrasound research interface on the back-end for acquiring raw ultrasound data from the machine. The research interface has been designed as a software module on the ultrasound back-end. To increase the amount of raw ultrasound data that can be spooled in the limited memory available on the back-end, we have developed a method that can losslessly compress the ultrasound data in real time. The raw ultrasound data could be obtained in any conventional ultrasound mode, including duplex and triplex modes. Furthermore, use of the research interface does not decrease the frame rate or otherwise affect the clinical usability of the machine. The lossless compression of the ultrasound data in real time can increase the amount of data spooled by approximately 2.3 times, thus allowing more than 6s of raw ultrasound data to be acquired in all the modes. The interface has been used not only for early testing of new ideas with in vitro data from phantoms, but also for acquiring in vivo data for fine-tuning ultrasound applications and conducting clinical studies. We present several examples of how newer ultrasound applications, such as elastography, vibration imaging and 3D imaging, have benefited from this research interface. Since the research interface is entirely implemented in software, it can be deployed on existing HiVision 5500 ultrasound machines and may be easily upgraded in the future. 
The developed research interface can aid researchers in the rapid testing and clinical evaluation of new ultrasound algorithms and applications. Additionally, we believe that our approach would be applicable to designing research interfaces on other ultrasound machines.
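
    The spooling gain described above comes from compressing each raw frame losslessly before it is buffered, so the same memory holds roughly 2.3 times more data. A sketch of the idea; zlib stands in here for whatever codec the actual back-end implemented, and the synthetic frame is for illustration only:

```python
import zlib

# Sketch: real-time lossless compression lets a fixed memory buffer hold
# more raw ultrasound data (the paper reports roughly a 2.3x gain).
# zlib is a stand-in codec; the frame content is synthetic.

raw_frame = bytes(range(256)) * 256             # synthetic 64 KiB "frame"
compressed = zlib.compress(raw_frame, level=1)  # low level = fast, real-time-friendly

assert zlib.decompress(compressed) == raw_frame  # lossless round trip
ratio = len(raw_frame) / len(compressed)
print(f"compression ratio ≈ {ratio:.1f}x")
```

    The key constraint is that compression must keep up with the acquisition frame rate, which is why a fast, low-ratio codec setting is the natural choice over a slower, higher-ratio one.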

  13. Third-generation blood pumps with mechanical noncontact magnetic bearings.

    PubMed

    Hoshi, Hideo; Shinshi, Tadahiko; Takatani, Setsuo

    2006-05-01

    This article reviews third-generation blood pumps, focusing on the magnetic-levitation (maglev) system. The maglev system can be categorized into three types: (i) external motor-driven system, (ii) direct-drive motor-driven system, and (iii) self-bearing or bearingless motor system. In the external motor-driven system, the Terumo (Ann Arbor, MI, U.S.A.) DuraHeart is an example in which the impeller is levitated in the axial or z-direction. The disadvantage of this system is the mechanical wear in the mechanical bearings of the external motor. In the second system, the impeller is made into the rotor of the motor, and the magnetic flux, through the external stator, rotates the impeller, while the impeller levitation is maintained through another electromagnetic system. The Berlin Heart (Berlin, Germany) INCOR is the best example of this principle, where one-axis control combined with hydrodynamic force achieves high performance. In the third system, the stator core is shared by the levitation and drive coils, making it appear as if the bearing does not exist. The Levitronix CentriMag (Zürich, Switzerland), which appeared recently, employs this concept to achieve stable and safe operation of the extracorporeal system for a duration of up to 14 days. Experimental systems including the HeartMate III (Thoratec, Woburn, MA, U.S.A.), HeartQuest (WorldHeart, Ottawa, ON, Canada), MagneVAD (Gold Medical Technologies, Valhalla, NY, U.S.A.), MiTiHeart (MiTi Heart, Albany, NY, U.S.A.), Ibaraki University's heart (Hitachi, Japan) and the Tokyo Medical and Dental University/Tokyo Institute of Technology's disposable and implantable maglev blood pumps are also reviewed. In light of second-generation blood pumps such as the Jarvik 2000 (Jarvik Heart, New York, NY, U.S.A.), which is showing remarkable achievements, the question is raised whether a system as complicated as maglev is really needed. We should pay careful attention to the future clinical outcomes of the ongoing trials of second-generation devices before making any further remarks. What is best for patients is best for everyone. We should not waste any efforts unless they are actually needed to improve the quality of life of heart-failure patients.

  14. A Proton Beam Therapy System Dedicated to Spot-Scanning Increases Accuracy with Moving Tumors by Real-Time Imaging and Gating and Reduces Equipment Size

    PubMed Central

    Shimizu, Shinichi; Miyamoto, Naoki; Matsuura, Taeko; Fujii, Yusuke; Umezawa, Masumi; Umegaki, Kikuo; Hiramoto, Kazuo; Shirato, Hiroki

    2014-01-01

    Purpose A proton beam therapy (PBT) system has been designed that is dedicated to spot-scanning and has a gating function employing fluoroscopy-based real-time imaging of internal fiducial markers near tumors. The dose distribution and treatment time of the newly designed real-time-image gated, spot-scanning proton beam therapy (RGPT) were compared with free-breathing spot-scanning proton beam therapy (FBPT) in a simulation. Materials and Methods In-house simulation tools and the treatment planning system VQA (Hitachi, Ltd., Japan) were used for estimating the dose distribution and treatment time. Simulations were performed for 48 motion parameters (including 8 respiratory patterns and 6 initial breathing timings) on CT data from two patients, A and B, with hepatocellular carcinoma and clinical target volumes of 14.6 cc and 63.1 cc. The respiratory patterns were derived from the actual trajectories of internal fiducial markers recorded during X-ray real-time tumor-tracking radiotherapy (RTRT). Results With FBPT, 9/48 motion parameters achieved the criteria of successful delivery for patient A and 0/48 for patient B. With RGPT, 48/48 and 42/48 achieved the criteria. Compared with FBPT, the mean liver dose was smaller with RGPT with statistical significance (p<0.001); it decreased from 27% to 13% and from 28% to 23% of the prescribed doses for patients A and B, respectively. The relative lengthening of treatment time to administer 3 Gy (RBE) was estimated to be 1.22 (RGPT/FBPT: 138 s/113 s) and 1.72 (207 s/120 s) for patients A and B, respectively. Conclusions This simulation study demonstrated that RGPT markedly improved the dose distribution for moving tumors without a very large extension of treatment time. A proton beam therapy system dedicated to spot-scanning with a gating function for real-time imaging increases accuracy with moving tumors and reduces the physical size, and consequently the cost, of the equipment as well as of the building housing it.
PMID:24747601

  15. Compositional Characteristics of Dissolved Organic Matter released from the sediment of the Han River in Korea.

    NASA Astrophysics Data System (ADS)

    Oh, H.; Choi, J. H.

    2017-12-01

    Dissolved organic matter (DOM) has variable characteristics depending on its sources. The DOM of a river is affected by rain water, windborne material, surface and groundwater flow, and sediments. In particular, sediments act as sources and sinks of nutrients and pollutants in aquatic ecosystems by supplying large amounts of organic matter. DOM that absorbs ultraviolet and visible light is called colored dissolved organic matter (CDOM). CDOM is responsible for the optical properties of natural waters in several biogeochemical and photochemical processes and absorbs UV-A (315-400 nm) and UV-B (280-315 nm), which are harmful to aquatic ecosystems (Helms et al., 2008). In this study, we investigated the quantity and quality of DOM and CDOM released from the sediments of the Han River, which has been impacted by anthropogenic activities and by the hydrologic alteration of the Four Major Rivers Restoration Project. The target areas of this study were Gangchenbo (GC), Yeojubo (YJ), and Ipobo (IP) on the Han River, Korea. Sediment and water samples were taken in July and August of 2016 and were incubated at 20°C for up to 7 days. Absorbance was measured with a UV-visible spectrophotometer (Libra S32 PC, Biochrom). Fluorescence intensity was determined with fluorescence EEMs (F-7000, Hitachi). Absorbance and fluorescence intensity were used to calculate the specific ultraviolet absorbance (SUVA254), humification index (HIX), biological index (BIX), and spectral slope ratio (SR), and to perform component analysis. The DOC concentration increased after 3 days of incubation. According to the SUVA254 analysis, microbial activity was highest in the initial overlying water of IP. HIX ranged from 1.35 to 4.08 and decreased during incubation, indicating a loss of polyaromatic structures in the organic matter. The BIX results indicate that autochthonous organic matter was released from the sediments.
At all sites, humic-like DOM, microbial humic-like DOM, and protein-like DOM increased significantly between day 0 and day 3 (except humic-like and microbial humic-like DOM at IP). The spectral slope ratio at all sites increased over the incubation, which indicates that the amount of CDOM released from the sediment to the overlying water increased.
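The optical indices named above are simple quantities computed from absorbance and DOC measurements. As an illustration (a minimal sketch, not the authors' code; the example spectrum and values are hypothetical), SUVA254 is the decadal absorbance at 254 nm converted to m⁻¹ and divided by the DOC concentration, and the slopes entering SR come from regressing log absorbance against wavelength:

```python
import math

def suva254(a254_per_cm, doc_mg_per_l):
    """Specific UV absorbance (L mg-C^-1 m^-1): absorbance at 254 nm
    in a 1 cm cell, converted to 1/m, divided by DOC in mg/L."""
    return (a254_per_cm * 100.0) / doc_mg_per_l

def spectral_slope(wavelengths_nm, absorbances, ref_nm):
    """Fit a(lambda) = a(ref) * exp(-S * (lambda - ref)) by linear
    regression on ln(a); return the slope magnitude S (nm^-1)."""
    xs = [w - ref_nm for w in wavelengths_nm]
    ys = [math.log(a) for a in absorbances]
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return -slope

# Synthetic CDOM spectrum over 275-295 nm with S = 0.018 nm^-1
wl = list(range(275, 296))
ab = [0.12 * math.exp(-0.018 * (w - 275)) for w in wl]
s_275_295 = spectral_slope(wl, ab, 275)
```

SR, following Helms et al. (2008), is then the ratio of the slope fitted over 275-295 nm to the slope fitted over 350-400 nm.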

  16. Elevation of small, dense low density lipoprotein cholesterol-a possible antecedent of atherogenic lipoprotein phenotype in type 2 diabetes patients in Jos, North-Central Nigeria.

    PubMed

    Inaku, Kenneth O; Ogunkeye, Obasola O; Abbiyesuku, Fayeofori M; Chuhwak, Evelyn K; Isichei, Christian O; Imoh, Lucius C; Amadu, Noel O; Abu, Alexander O

    2017-01-01

    The global prevalence of type 2 diabetes is increasing. Dyslipidaemia is a known complication of diabetes mellitus, manifesting frequently as cardiovascular disease and stroke. Elevation of small, dense low density lipoprotein has been recognised as a component of the atherogenic lipoprotein phenotype associated with cardiovascular complications. We speculate that the elevation of this lipoprotein particle may be the antecedent of the atherogenic lipoprotein phenotype. This study therefore aimed to determine the pattern of dyslipidaemia among diabetes mellitus patients in Jos, North-Central Nigeria. One hundred and seventy-six patients with type 2 diabetes and 154 age-matched controls were studied. The patients with diabetes were regular clinic attenders and had stable glycaemic control. None were on lipid-lowering therapy. Anthropometric indices, blood pressure, and lipids (including total cholesterol, high density lipoprotein cholesterol, and triglyceride) were measured by chemical methods on a Hitachi 902 analyzer. Low density lipoprotein cholesterol was calculated using the Friedewald equation. Small, dense low density lipoprotein cholesterol (sdLDL-C) was measured using the precipitation method of Hirano et al. Means of the different groups were compared using Epi Info, and a P-value of <0.05 was accepted as indicating a significant difference. Total cholesterol, low density lipoprotein cholesterol, triglyceride, and small, dense low density lipoprotein cholesterol were all significantly higher in diabetes patients than in controls, with the exception of high density lipoprotein cholesterol. The percentage of LDL-C present as sdLDL-C was 45% ± 17.79 in the diabetes group versus 32.0% ± 15.93 in controls. Serum sdLDL-C concentration was 1.45 ± 0.64 among diabetes patients and 0.8 ± 0.54 among control subjects. Seventy-five percent of diabetes patients had hypertension and were taking blood-pressure-lowering medication.
The classical atherogenic lipoprotein phenotype was not demonstrated among subjects with type 2 diabetes mellitus in this study, but the elevation of serum small dense low density lipoprotein cholesterol in patients with sustained hypertension suggests the establishment of atherogenic complications among our diabetes patients.
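The Friedewald equation used above estimates LDL-C from the directly measured lipids. A minimal sketch, assuming mg/dL units (the divisor is TG/5 in mg/dL and TG/2.2 in mmol/L, and the estimate is considered invalid at high triglyceride levels):

```python
def friedewald_ldl(total_chol, hdl, triglycerides):
    """Friedewald estimate of LDL-C in mg/dL:
    LDL-C = TC - HDL-C - TG/5 (not valid when TG > 400 mg/dL)."""
    if triglycerides > 400:
        raise ValueError("Friedewald equation not valid for TG > 400 mg/dL")
    return total_chol - hdl - triglycerides / 5.0
```

For example, TC 200, HDL-C 50, and TG 150 mg/dL give an estimated LDL-C of 120 mg/dL.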

  17. Variability and Reproducibility of Segmental Longitudinal Strain Measurement: A Report From the EACVI-ASE Strain Standardization Task Force.

    PubMed

    Mirea, Oana; Pagourelias, Efstathios D; Duchenne, Jurgen; Bogaert, Jan; Thomas, James D; Badano, Luigi P; Voigt, Jens-Uwe

    2018-01-01

    In this study, we compared left ventricular (LV) segmental strain measurements obtained with different ultrasound machines and post-processing software packages. Global longitudinal strain (GLS) has proven to be a reproducible and valuable tool in clinical practice. Data about the reproducibility and intervendor differences of segmental strain measurements, however, are missing. We included 63 volunteers with cardiac magnetic resonance-proven infarct scar and segmental LV function ranging from normal to severely impaired. Each subject was examined within 2 h by a single expert sonographer with machines from multiple vendors. All 3 apical views were acquired twice to determine the test-retest and intervendor variability. Segmental longitudinal peak systolic, end-systolic, and post-systolic strain were measured using 7 vendor-specific systems (Hitachi, Tokyo, Japan; Esaote, Florence, Italy; GE Vingmed Ultrasound, Horten, Norway; Philips, Andover, Massachusetts; Samsung, Seoul, South Korea; Siemens, Mountain View, California; and Toshiba, Otawara, Japan) and 2 independent software packages (Epsilon, Ann Arbor, Michigan; and TOMTEC, Unterschleissheim, Germany) and compared among vendors. Image quality and tracking feasibility differed among vendors (analysis of variance, p < 0.05). The absolute test-retest difference ranged from 2.5% to 4.9% for peak systolic, 2.6% to 5.0% for end-systolic, and 2.5% to 5.0% for post-systolic strain. The average segmental strain values varied significantly between vendors (by up to 4.5%). Segmental strain parameters from each vendor correlated well with the mean of all vendors (r² range 0.58 to 0.81) but showed very different ranges of values. Bias and limits of agreement were up to -4.6 ± 7.5%. In contrast to GLS, LV segmental longitudinal strain measurements show higher variability on top of the known intervendor bias. The fidelity with which different software packages follow segmental function varies considerably.
We conclude that single segmental strain values should be used with caution in the clinic. Segmental strain pattern analysis might be a more robust alternative. Copyright © 2018 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
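Bias and limits of agreement of the kind quoted above are conventionally computed Bland-Altman style from paired measurements. A minimal sketch (illustrative only, not the task force's analysis pipeline):

```python
import math

def bland_altman(a, b):
    """Bias (mean difference) and 95% limits of agreement
    (bias +/- 1.96 * SD of the differences) for paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

Applied, for instance, to the same segments measured on two vendors' systems, this yields the bias and the agreement interval within which 95% of intervendor differences are expected to fall.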

  18. SU-E-T-146: Effects of Uncertainties of Radiation Sensitivity of Biological Modelling for Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oita, M; Department of Life System, Institute of Technology and Science, Graduate School, The Tokushima University; Uto, Y

    Purpose: The aim of this study was to evaluate the distribution of uncertainty of cell survival after irradiation, and to assess the usefulness of a stochastic biological model assuming a Gaussian distribution. Methods: For single-cell experiments, exponentially growing cells were harvested from standard cell culture dishes by trypsinization and suspended in test tubes containing 1 ml of MEM (2x10^6 cells/ml). The hypoxic cultures were treated with 95% N2-5% CO2 gas for 30 minutes. In vitro radiosensitization was also measured in EMT6/KU single cells by adding a radiosensitizer under hypoxic conditions. X-ray irradiation was carried out using an X-ray unit (Hitachi X-ray unit, model MBR-1505R3) with a 0.5 mm Al/1.0 mm Cu filter at 150 kV and 4 Gy/min. In the in vitro assay, cells on the dish were irradiated with 1 Gy to 24 Gy, respectively. After irradiation, colony formation assays were performed. Variations of the biological parameters were investigated for standard cell culture (n=16), hypoxic cell culture (n=45), and hypoxic cell culture with radiosensitizers (n=21), respectively. The data were obtained on separate schedules to take account of the variation of radiation sensitivity over the cell cycle. Results: For standard cell culture, hypoxic cell culture, and hypoxic cell culture with radiosensitizers, the median and standard deviation of the alpha/beta ratio were 37.1±73.4 Gy, 9.8±23.7 Gy, and 20.7±21.9 Gy, respectively. The average and standard deviation of D50 were 2.5±2.5 Gy, 6.1±2.2 Gy, and 3.6±1.3 Gy, respectively. Conclusion: In this study, we applied these parameter uncertainties to the biological model. The variation of alpha values, beta values, and D50, as well as the cell culture conditions, may strongly affect the probability of cell death. Further research is in progress toward precise prediction of cell death as well as tumor control probability for treatment planning.
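The alpha/beta ratio and D50 quoted above come from the linear-quadratic survival model. A minimal sketch of the relationship (assuming D50 here denotes the dose giving 50% survival; the parameter values in the test are hypothetical):

```python
import math

def lq_survival(dose, alpha, beta):
    """Linear-quadratic cell survival: S = exp(-alpha*D - beta*D^2)."""
    return math.exp(-alpha * dose - beta * dose ** 2)

def d50(alpha, beta):
    """Dose at 50% survival: solve beta*D^2 + alpha*D = ln 2
    for the positive root of the quadratic."""
    if beta == 0:
        return math.log(2) / alpha
    return (-alpha + math.sqrt(alpha ** 2 + 4 * beta * math.log(2))) / (2 * beta)
```

Because alpha and beta are fitted from noisy colony-count data, their spread propagates directly into the spread of derived quantities such as D50 and the alpha/beta ratio, which is the uncertainty this study characterizes.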

  19. Techno-Economic Analysis of a Secondary Air Stripper Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heberle, J.R.; Nikolic, Heather; Thompson, Jesse

    We present results of an initial techno-economic assessment of a post-combustion CO2 capture process developed by the Center for Applied Energy Research (CAER) at the University of Kentucky using Mitsubishi Hitachi Power Systems' H3-1 aqueous amine solvent. The analysis is based on data collected at a 0.7 MWe pilot unit combined with laboratory data and process simulations. The process adds a secondary air stripper to a conventional solvent process, which increases the cyclic loading of the solvent in two ways. First, air strips additional CO2 from the solvent downstream of the conventional steam-heated thermal stripper. This extra stripping of CO2 reduces the lean loading entering the absorber. Second, the CO2-enriched air is then sent to the boiler for use as secondary air. This recycling of CO2 results in a higher concentration of CO2 in the flue gas sent to the absorber, and hence a higher rich loading of the solvent exiting the absorber. A process model was incorporated into a full-scale supercritical pulverized coal power plant model to determine the plant performance and heat and mass balances. The performance and heat and mass balance data were used to size equipment and develop estimates of capital and operating costs. Lifecycle costs were considered through a levelized cost of electricity (LCOE) assessment based on the capital cost estimate and modeled performance. The results of the simulations show that the CAER process yields a regeneration energy of 3.12 GJ/t CO2, a capture cost of $53.05/t CO2, and an LCOE of $174.59/MWh. This compares with the U.S. Department of Energy's projected costs (Case 10) of 3.58 GJ/t CO2 regeneration energy, a $61.31/t CO2 capture cost, and an LCOE of $189.59/MWh. For H3-1, the CAER process results in a regeneration energy of 2.62 GJ/t CO2 with a stripper pressure of 5.2 bar, a capture cost of $46.93/t CO2, and an LCOE of $164.33/MWh.
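The comparison with DOE Case 10 reduces to simple percentage arithmetic on the quoted figures. A minimal sketch using only the numbers stated above:

```python
# Figures quoted in the abstract (CAER process vs. DOE reference Case 10)
caer = {"regen_GJ_per_t": 3.12, "capture_usd_per_t": 53.05, "lcoe_usd_per_MWh": 174.59}
case10 = {"regen_GJ_per_t": 3.58, "capture_usd_per_t": 61.31, "lcoe_usd_per_MWh": 189.59}

def pct_reduction(new, ref):
    """Percentage reduction of `new` relative to the reference value `ref`."""
    return 100.0 * (ref - new) / ref

capture_saving = pct_reduction(caer["capture_usd_per_t"], case10["capture_usd_per_t"])
regen_saving = pct_reduction(caer["regen_GJ_per_t"], case10["regen_GJ_per_t"])
lcoe_saving = pct_reduction(caer["lcoe_usd_per_MWh"], case10["lcoe_usd_per_MWh"])
```

On these figures, the CAER process lowers the capture cost by about 13.5%, the regeneration energy by about 12.8%, and the LCOE by about 7.9% relative to Case 10.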

  20. WE-D-17A-03: Improvement of Accuracy of Spot-Scanning Proton Beam Delivery for Liver Tumor by Real-Time Tumor-Monitoring and Gating System: A Simulation Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matsuura, T; Shimizu, S; Miyamoto, N

    2014-06-15

    Purpose: To improve the accuracy of spot-scanning proton beam delivery for targets in motion, a real-time tumor-monitoring and gating system using fluoroscopy images was developed. This study investigates the efficacy of this method for the treatment of liver tumors using simulation. Methods: The three-dimensional position of a fiducial marker inserted close to the tumor is calculated in real time, and the proton beam is gated according to the marker's distance from the planned position (Shirato, 2012). Efficient beam delivery is realized even for irregular and sporadic motion signals by employing multiple gated irradiations per operation cycle (Umezawa, 2012). For each of two breath-hold CTs (CTV=14.6cc, 63.1cc), dose distributions were calculated with internal margins corresponding to free-breathing (FB) and real-time gating (RG) with a 2-mm gating window. We applied 8 liver tumor trajectories recorded during RTRT X-ray therapy treatments and 6 initial timings. Dmax/Dmin in the CTV, mean liver dose (MLD), and the irradiation time to administer a 3 Gy (RBE) dose were estimated assuming rigid motion of targets, using in-house simulation tools and the VQA treatment planning system (Hitachi, Ltd., Tokyo). Results: Dmax/Dmin was degraded by less than 5% relative to the prescribed dose for all motion parameters for the smaller CTV, and by less than 7% for the larger CTV with one exception. Irradiation time showed only a modest increase when RG was used instead of FB; the average value over motion parameters was 113 s (FB) versus 138 s (RG) for the smaller CTV and 120 s (FB) versus 207 s (RG) for the larger CTV. With RG, it was within 5 min for all but one trajectory. MLD was markedly decreased, by 14% for the smaller CTV and 5-6% for the larger CTV, when RG was applied. Conclusions: Spot-scanning proton beams were shown to be delivered successfully to liver tumors without much lengthening of treatment time.
    This research was supported by the Cabinet Office, Government of Japan and the Japan Society for the Promotion of Science (JSPS) through the Funding Program for World-Leading Innovative R and D on Science and Technology (FIRST Program), initiated by the Council for Science and Technology Policy (CSTP).
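The gating criterion described in the Methods is geometrically simple: the beam is enabled only while the tracked marker stays within the gating window around its planned position. A minimal sketch (assuming a spherical 2-mm window for illustration; the actual system logic, with multiple gated irradiations per cycle, is more involved):

```python
import math

GATING_WINDOW_MM = 2.0  # gating window width used in the simulation

def beam_on(marker_pos, planned_pos, window_mm=GATING_WINDOW_MM):
    """Gate the beam on only while the fiducial marker's 3D position
    lies within `window_mm` of its planned position."""
    return math.dist(marker_pos, planned_pos) <= window_mm
```

Each simulated trajectory then reduces to a time series of beam-on/beam-off states, from which the lengthening of irradiation time under gating can be estimated.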

  1. Willingness to Pay for Elderly Telecare Service Using the Internet and Digital Terrestrial Broadcasting

    PubMed Central

    Kaga, Satoshi; Suzuki, Teppei

    2017-01-01

    Background In Japan over the past few years, more attention has been focused on unnoticed solitary death in the context of an aging society and the trend toward nuclear families. A number of institutions and companies have implemented a preventive measure using digital terrestrial broadcasting telecare services for the elderly: Hokkaido University; TV-Asahi Corporation; Hitachi, Ltd; Iwamizawa City; Hokkaido Television Broadcasting Co, Ltd; and Hamanasu Information Co, Ltd. Although this system is provided free of charge as a demonstration test, determining the appropriate price for the service is required for its sustainable operation. Objective The aim of this study was to quantify individual willingness to pay (WTP) so as to test the tenability of the digital terrestrial broadcasting service for elderly telecare. Methods We used the contingent valuation method (CVM) to estimate the WTP for this service among 305 citizens (valid response rate 76.0%) living in Japan. A questionnaire survey was conducted from September 2016 among people aged 18 to 100 years, sampled according to the Japanese age distribution. To elicit WTP, we adopted a double-bounded dichotomous choice method, asking respondents whether they would accept the price we offered. Results The median WTP for this service's monthly fee is estimated to be 431 JPY (approximately US $3.7). The findings suggest that gender (0.66, P=.01), health consciousness (1.08, P=.01), willingness to use (2.38, P<.001), and seeing others less than once a week (1.00, P=.06) had positive effects on WTP. Conclusions We conclude that reliable WTP estimates were elicited by CVM based on an Internet survey. The calculated median WTP for the digital terrestrial broadcasting service for elderly telecare was 431 JPY (approximately US $3.7). In the analysis of factors affecting WTP, the constant term, log-bid, health consciousness, gender, seeing others less than once a week, and willingness to use had positive effects on the probability of acceptance.
In a comparison of WTP across groups, WTP in the elderly group was higher than in the middle-aged and younger groups. However, WTP surveys need to be conducted carefully to minimize sampling bias and to reflect the accurate structure of the gender distribution. PMID:29066428

  2. Willingness to Pay for Elderly Telecare Service Using the Internet and Digital Terrestrial Broadcasting.

    PubMed

    Kaga, Satoshi; Suzuki, Teppei; Ogasawara, Katsuhiko

    2017-10-24

    In Japan over the past few years, more attention has been focused on unnoticed solitary death in the context of an aging society and the trend toward nuclear families. A number of institutions and companies have implemented a preventive measure using digital terrestrial broadcasting telecare services for the elderly: Hokkaido University; TV-Asahi Corporation; Hitachi, Ltd; Iwamizawa City; Hokkaido Television Broadcasting Co, Ltd; and Hamanasu Information Co, Ltd. Although this system is provided free of charge as a demonstration test, determining the appropriate price for the service is required for its sustainable operation. The aim of this study was to quantify individual willingness to pay (WTP) so as to test the tenability of the digital terrestrial broadcasting service for elderly telecare. We used the contingent valuation method (CVM) to estimate the WTP for this service among 305 citizens (valid response rate 76.0%) living in Japan. A questionnaire survey was conducted from September 2016 among people aged 18 to 100 years, sampled according to the Japanese age distribution. To elicit WTP, we adopted a double-bounded dichotomous choice method, asking respondents whether they would accept the price we offered. The median WTP for this service's monthly fee is estimated to be 431 JPY (approximately US $3.7). The findings suggest that gender (0.66, P=.01), health consciousness (1.08, P=.01), willingness to use (2.38, P<.001), and seeing others less than once a week (1.00, P=.06) had positive effects on WTP. We conclude that reliable WTP estimates were elicited by CVM based on an Internet survey. The calculated median WTP for the digital terrestrial broadcasting service for elderly telecare was 431 JPY (approximately US $3.7). In the analysis of factors affecting WTP, the constant term, log-bid, health consciousness, gender, seeing others less than once a week, and willingness to use had positive effects on the probability of acceptance.
In a comparison of WTP across groups, WTP in the elderly group was higher than in the middle-aged and younger groups. However, WTP surveys need to be conducted carefully to minimize sampling bias and to reflect the accurate structure of the gender distribution. ©Satoshi Kaga, Teppei Suzuki, Katsuhiko Ogasawara. Originally published in the Interactive Journal of Medical Research (http://www.i-jmr.org/), 24.10.2017.
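In a dichotomous-choice CVM model with log-bid as a covariate, the median WTP is the bid at which the modeled acceptance probability equals 0.5. A minimal sketch of that relationship (a common logistic specification; the coefficients below are placeholders, not the study's estimates):

```python
import math

def accept_prob(bid, a, b):
    """Logistic probability that a respondent accepts a monthly fee `bid`,
    with log(bid) as the covariate (a standard CVM specification)."""
    return 1.0 / (1.0 + math.exp(-(a - b * math.log(bid))))

def median_wtp(a, b):
    """Bid at which the acceptance probability is exactly 0.5:
    a - b*ln(bid) = 0  =>  bid = exp(a/b)."""
    return math.exp(a / b)
```

Positive coefficients on covariates such as health consciousness or willingness to use shift `a` upward for those respondents, raising the acceptance probability at every bid and hence their implied WTP.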

  3. Scanning electron microscope automatic defect classification of process induced defects

    NASA Astrophysics Data System (ADS)

    Wolfe, Scott; McGarvey, Steve

    2017-03-01

    With the integration of high-speed Scanning Electron Microscope (SEM) based Automated Defect Redetection (ADR) in both high volume semiconductor manufacturing and Research and Development (R and D), the need for reliable SEM Automated Defect Classification (ADC) has grown tremendously in the past few years. In many high volume manufacturing facilities and R and D operations, defect inspection is performed on E-Beam (EB), Bright Field (BF), or Dark Field (DF) defect inspection equipment. A comma-separated value (CSV) file is created by both the patterned and non-patterned defect inspection tools. The defect inspection result file contains a list of the inspection anomalies detected during the inspection tool's examination of each structure, or of the entire wafer's surface for non-patterned applications. This file is imported into the Defect Review Scanning Electron Microscope (DRSEM). Following the defect inspection result file import, the DRSEM automatically moves the wafer to each defect coordinate and performs ADR. During ADR the DRSEM operates in a reference mode, capturing a SEM image at the exact position given by the anomaly's coordinates and capturing a SEM image of a reference location in the center of the wafer. A defect reference image is created by subtracting the defect image from the reference image. The exact coordinates of the defect are calculated from the calculated defect position and the stage coordinates recorded when the high-magnification SEM defect image is captured. The captured SEM image is processed through DRSEM ADC binning, exported to a Yield Analysis System (YAS), or a combination of both. Process engineers, yield analysis engineers, or failure analysis engineers then manually review the captured images to ensure that either the YAS defect binning or the DRSEM defect binning is classifying the defects accurately.
This paper explores the feasibility of using a Hitachi RS4000 Defect Review SEM to perform Automatic Defect Classification, with the objective of achieving total automated classification accuracy greater than human-based defect classification binning when the defects do not require knowledge of multiple process steps for accurate classification. The implementation of DRSEM ADC has the potential to improve the response time between defect detection and defect classification. Faster defect classification will allow rapid response to yield anomalies, ultimately reducing wafer and/or die yield loss.
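The reference-mode step described above amounts to subtracting the defect-site image from a reference image and flagging pixels where the difference is large. A toy sketch on small integer "images" (illustrative only; real ADR also aligns the two images and suppresses noise before differencing):

```python
def defect_mask(defect_img, reference_img, threshold):
    """Flag pixels where the reference-minus-defect difference exceeds
    `threshold`; a crude sketch of reference-based defect detection.
    Images are rectangular lists of rows of grey levels."""
    return [[abs(r - d) > threshold
             for r, d in zip(ref_row, def_row)]
            for ref_row, def_row in zip(reference_img, defect_img)]

# Toy example: one bright anomaly against a flat reference
reference = [[10, 10], [10, 10]]
defect = [[10, 40], [10, 10]]
mask = defect_mask(defect, reference, threshold=20)
```

The centroid of the flagged region gives the refined defect coordinates, which are then combined with the stage position to locate the defect on the wafer.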

  4. 3D Micro-topography of Transferred Laboratory and Natural Ice Crystal Surfaces Imaged by Cryo and Environmental Scanning Electron Microscopy

    NASA Astrophysics Data System (ADS)

    Magee, N. B.; Boaggio, K.; Bancroft, L.; Bandamede, M.

    2015-12-01

    Recent work has highlighted micro-scale roughness on the surfaces of ice crystals grown and imaged in situ within the chambers of environmental scanning electron microscopes (ESEM). These observations appear to align with theoretical and satellite observations that suggest a prevalence of rough ice in cirrus clouds. However, the atmospheric applicability of the lab observations is uncertain because the observations have been based only on crystals grown on substrates and in pure water-vapor environments. In this work, we present details and results from the development of a transfer technique which allows natural and lab-grown ice and snow crystals to be captured, preserved, and transferred into the ESEM for 3D imaging. Ice crystals were gathered from 1) natural snow, 2) a balloon-borne cirrus particle capture device, and 3) lab-grown ice crystals from a diffusion chamber. Ice crystals were captured in a pre-conditioned small-volume (~1 cm3) cryo-containment cell. The cell was then sealed and transferred to a specially designed cryogenic dewar (filled with liquid nitrogen or crushed dry ice) for transport to a new Hitachi field-emission, variable-pressure SEM (SU-5000). The cryo-cell was then removed from the dewar and quickly placed onto the pre-conditioned cryo transfer stage attached to the ESEM (Quorum 3010T). Quantitative 3D topographical digital elevation models of ice surfaces are reported from SEM for the first time, including a variety of objective measures of statistical surface roughness. The surfaces of the transported crystals clearly exhibit signatures of mesoscopic roughening similar to examples of roughness seen in ESEM-grown crystals. For most transported crystals, the habits and crystal edges are more intricate than those observed for ice grown directly on substrates within the ESEM chamber. Portions of some crystals do appear smooth even at magnifications greater than 1000x, a rare observation in our ESEM-grown crystals.
The transported crystals hint at some significant differences in roughness morphology, but they do provide evidence that crystals grown in air/water mixtures and with minimal substrate influence also exhibit mesoscopic roughness similar to that observed in ESEM-grown crystals.
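One of the simplest objective roughness measures computable from such a digital elevation model is RMS roughness, the root-mean-square deviation of surface heights from the mean level. A minimal sketch (illustrative; the study's own roughness metrics are not specified here):

```python
import math

def rms_roughness(dem):
    """Root-mean-square roughness of a digital elevation model:
    RMS of height deviations from the mean height, over all grid cells."""
    heights = [h for row in dem for h in row]
    mean = sum(heights) / len(heights)
    return math.sqrt(sum((h - mean) ** 2 for h in heights) / len(heights))
```

Computed over matched spatial scales, such statistics allow transported crystals and ESEM-grown crystals to be compared quantitatively rather than by visual impression alone.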

  5. Distribution and predominance of genotype 3 in hepatitis C virus carriers in the province of kahramanmaras, Turkey.

    PubMed

    Caliskan, Ahmet; Kirisci, Ozlem; Ozkaya, Esra; Ozden, Sevinc; Tumer, Seray; Caglar, Serkan; Guler, Selma Ates; Senol, Hande

    2015-04-01

    The hepatitis C virus (HCV) has six major genotypes and more than 100 subtypes; determination of the responsible genotype, collection of epidemiological data, tailoring of antiviral therapy, and prediction of prognosis have an important place in disease management. The aim of the present study was to determine the distribution of HCV genotypes across geographic regions and to compare these data with those obtained from other geographic locations. HCV genotypes were identified in HCV RNA positive blood samples obtained from different centers. The HCV genotype was determined using molecular methods [real-time polymerase chain reaction (RT-PCR)] in 313 patients who were found to be positive for HCV RNA. The presence of HCV RNA was investigated using the RT-PCR method in serum samples delivered to the Microbiology Laboratory at Kahramanmaras Necip Fazıl City Hospital, Kahramanmaras, Turkey, from centers located in the Kahramanmaras city center and peripheral districts of the province between March 2010 and August 2014. HCV genotype analysis was performed on HCV RNA positive samples using an RT-PCR reagent kit. Urine samples from the patients were tested for amphetamine with an Amphetamines II (AMPS2) kit, cocaine with a Cocaine II (COC2) kit, opiates with an Opiates II (OPI2) kit, and cannabinoids with a Cannabinoids II (THC2) kit on a Roche/Hitachi Cobas c501 analyzer. Blood samples collected from 313 patients were included in the study. Of these patients, 212 (67.7%) were male and 101 (32.3%) were female. The mean age of the patients was 41.29 ± 20.32 years. In terms of HCV genotype distribution, 162 patients (51.7%) had genotype 1, 144 patients (46%) had genotype 3, four patients (1.3%) had genotype 2, and three patients (1%) had genotype 4. The results of urine drug tests were available for only 65 patients (20.2%). Of these, 61 (93.8%) had HCV genotype 3.
In conclusion, the prevalence of HCV genotype 1 was 51.7%, lower than the rates reported in other studies in Turkey, while the prevalence of HCV genotype 3 was 46%, remarkably higher than previously reported Turkish data. In addition, the prevalence rate for genotype 3 reported in the present study is the highest that has ever been reported in the literature.

  6. Biofouling of Cr-Nickel Spray Coated Films on Steel Surfaces

    NASA Astrophysics Data System (ADS)

    Yoshida, Kento; Kanematsu, Hideyuki; Kuroda, Daisuke; Ikigai, Hajime; Kogo, Takeshi; Yokoyama, Seiji

    2012-03-01

    Corrosion of metals causes serious economic losses, often amounting to several percent of GNP. Marine corrosion is particularly serious, and countermeasures are hard to establish because the number of contributing factors is large and their interactions are complicated. One of these complicating factors is biofouling. Biofouling is classified into two main categories, microfouling and macrofouling. The former consists mainly of biofilm formation: marine bacteria attach to material surfaces seeking nutrition in an oligotrophic environment and excrete polysaccharides to form biofilms on metal surfaces. Larger organisms then attach to the biofilms and develop the fouling further, which often leads to loss and failure of metals in marine environments. From the viewpoint of corrosion protection and maintenance of marine structures, biofouling should be mitigated as much as possible. In this study, we applied spray coatings to steels and investigated whether a chromium-nickel spray coating could mitigate biofouling in marine environments, compared with conventional aluminium-zinc spray coatings. The specimens used for this investigation were aluminium, zinc, aluminium-zinc, and stacked chromium/nickel films formed on carbon steel (JIS SS400). For some specimens, the pores formed by spray coating were sealed with a commercial sealing reagent. All specimens were immersed in seawater at Marina Kawage (854-3, Chisato, Tsu, Mie Prefecture) in Ise Bay for two weeks. The specimens were held two meters below the sea surface, and this depth was kept constant because they were suspended from a floating pier. The seawater temperature ranged from 10 to 15 degrees Celsius during the immersion test. The biofouling behavior was investigated by low-vacuum SEM (Hitachi Miniscope TM1000) and X-ray fluorescence analysis.
When spray-coated specimens with and without sealing agents were compared, the sealed specimens generally showed better antifouling properties. The aluminium-zinc alloy spray-coated films had the highest antifouling property, and the antifouling performance decreased in the order: Al-Zn alloy spray coating > zinc spray coating > aluminium spray coating > stacked chromium/nickel spray coating. Aluminium and zinc spray coatings have conventionally been rated highly for anti-biofouling in marine environments; however, the Cr/Ni spray coating also showed fairly high antifouling performance.

  7. The Metastable Persistence of Vapor-Deposited Amorphous Ice at Anomalously High Temperatures

    NASA Technical Reports Server (NTRS)

    Blake, David F.; Jenniskens, Peter; DeVincenzi, Donald L. (Technical Monitor)

    1995-01-01

    Studies of the gas release, vaporization behavior and infrared (IR) spectral properties of amorphous and crystalline water ice have direct application to cometary and planetary outgassing phenomena and contribute to an understanding of the physical properties of astrophysical ices. Several investigators report anomalous phenomena related to the warming of vapor-deposited astrophysical ice analogs. However gas release, ice volatilization and IR spectral features are secondary or tertiary manifestations of ice structure or morphology. These observations are useful in mimicking the bulk physical and chemical phenomena taking place in cometary and other extraterrestrial ices but do not directly reveal the structural changes which are their root cause. The phenomenological interpretation of spectral and gas release data is probably the cause of somewhat contradictory explanations invoked to account for differences in water ice behavior in similar temperature regimes. It is the microstructure, micromorphology and microchemical heterogeneity of astrophysical ices which must be characterized if the mechanisms underlying the observed phenomena are to be understood. We have been using a modified Transmission Electron Microscope to characterize the structure of vapor-deposited astrophysical ice analogs as a function of their deposition, temperature history and composition. For the present experiments, pure water vapor is deposited at high vacuum onto a 15 K amorphous carbon film inside an Hitachi H-500H TEM. The resulting ice film (approx. 0.05 micrometers thick) is warmed at the rate of 1 K per minute and diffraction patterns are collected at 1 K intervals. These patterns are converted into radial intensity distributions which are calibrated using patterns of crystalline gold deposited on a small part of the carbon substrate. The small intensity contributed by the amorphous substrate is removed by background subtraction. 
The proportions of amorphous and crystalline material in each pattern are determined by subtracting a scaled crystalline component, using pure amorphous and pure crystalline patterns as endmembers. Vapor-deposited water ice undergoes two amorphous-to-amorphous structural transformations in the temperature range 15-130 K, with important astrophysical implications. The onset of cubic crystallization occurs at 142-160 K (at 1 K per minute heating rates), during which the 220 and 311 diffraction maxima appear and 0.1 micrometer crystallites can be seen in bright-field images. This transition is time dependent.
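The endmember decomposition described above can be sketched as a one-parameter least-squares fit, assuming each calibrated radial intensity profile is a linear mixture of the two endmember patterns on a common grid (a minimal illustration, not the authors' code):

```python
import numpy as np

def crystalline_fraction(pattern, amorphous, crystalline):
    """Least-squares estimate of the crystalline fraction x in
    pattern ~ (1 - x) * amorphous + x * crystalline,
    where all three are radial intensity profiles on the same grid."""
    d = crystalline - amorphous                 # direction of increasing crystallinity
    x = np.dot(pattern - amorphous, d) / np.dot(d, d)
    return float(np.clip(x, 0.0, 1.0))          # the fraction is physically bounded
```

Applying this to each pattern in the warming sequence would yield crystalline fraction versus temperature, from which the crystallization onset can be read off.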

  8. Fast auto-acquisition tomography tilt series by using HD video camera in ultra-high voltage electron microscope.

    PubMed

    Nishi, Ryuji; Cao, Meng; Kanaji, Atsuko; Nishida, Tomoki; Yoshida, Kiyokazu; Isakozawa, Shigeto

    2014-11-01

The ultra-high voltage electron microscope (UHVEM) H-3000, with the world's highest acceleration voltage of 3 MV, can reveal remarkable three-dimensional microstructures in microns-thick samples[1]. Acquiring a tilt series for electron tomography is laborious work, and thus an automatic technique is highly desired. We proposed the Auto-Focus system using image Sharpness (AFS)[2,3] for UHVEM tomography tilt-series acquisition. In the method, five images with different defocus values are first acquired and their image sharpness is calculated. The sharpness values are then fitted to a quasi-Gaussian function to determine the best focus value[3]. Defocused images acquired by the slow-scan CCD (SS-CCD) camera (Hitachi F486BK) are of high quality, but one minute is needed to acquire five defocused images. In this study, we introduce a high-definition video camera (HD video camera; Hamamatsu Photonics K.K. C9721S) for fast image acquisition[4]. It is an analog camera, but the camera image is captured by a PC with an effective resolution of 1280×1023 pixels. This resolution is lower than the 4096×4096 pixels of the SS-CCD camera; however, the HD video camera captures one image in only 1/30 second. In exchange for the faster acquisition, the S/N of the images is low. To improve the S/N, 22 captured frames are integrated so that each sharpness measurement has a sufficiently low fitting error. As a countermeasure against the low resolution, we selected a large defocus step, typically five times the manual defocus step, to discriminate between the different defocused images. By using the HD video camera for the autofocus process, the time consumed by each autofocus procedure was reduced to about six seconds. Correction of the image position took one second, so the total correction time was seven seconds, shorter by one order of magnitude than with the SS-CCD camera. When we used the SS-CCD camera for final image capture, it took 30 seconds to record one tilt image.
We can obtain a tilt series of 61 images within 30 minutes. Accuracy and repeatability were good enough for practical use (Figure 1). We successfully reduced the total acquisition time of a tomography tilt series to half of what it was before. Fig. 1. Objective lens current change with tilt angle during acquisition of a tomography series (sample: a rat hepatocyte; thickness: 2 μm; magnification: 4k; acc. voltage: 2 MV). The tilt angle range is ±60 degrees with a 2 degree step. Two series were acquired in the same area. Both datasets were almost the same, and the deviation was smaller than the minimum manual step, so the auto-focus worked well. We also developed computer-aided three-dimensional (3D) visualization and analysis software for electron tomography, "HawkC", which can sectionalize the 3D data semi-automatically[5,6]. If this auto-acquisition system is used with the IMOD reconstruction software[7] and the HawkC software, we will be able to do on-line UHVEM tomography. The system could help pathology examination in the future. This work was supported by the Ministry of Education, Culture, Sports, Science and Technology (MEXT), Japan, under a Grant-in-Aid for Scientific Research (Grant No. 23560024, 23560786), and SENTAN, Japan Science and Technology Agency, Japan. © The Author 2014. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
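The AFS fitting step described above can be sketched as follows; this is a minimal illustration, not the authors' code, and it approximates the quasi-Gaussian fit by fitting a parabola to the logarithm of the sharpness values:

```python
import numpy as np

def best_focus(defocus, sharpness):
    """Estimate the best focus from a few (defocus, sharpness) samples by
    modelling the sharpness curve as a Gaussian peak: log(sharpness) is
    then quadratic in defocus, and the vertex gives the peak position."""
    a, b, _ = np.polyfit(defocus, np.log(sharpness), 2)
    return -b / (2.0 * a)          # vertex of the fitted parabola
```

With five defocus samples, as in the method above, the quadratic fit is overdetermined, so noisy sharpness values are averaged out rather than interpolated exactly.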

  9. PDRD (SR13046) TRITIUM PRODUCTION FINAL REPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, P.; Sheetz, S.

Utilizing the results of Texas A&M University (TAMU) senior design projects on tritium production in four different small modular reactors (SMRs), the Savannah River National Laboratory (SRNL) developed an optimization model evaluating tritium production versus uranium utilization under a FY2013 plant directed research and development (PDRD) project. The model is a tool that can evaluate varying scenarios and various reactor designs to maximize the production of tritium per unit of unobligated United States (US) origin uranium, which is in limited supply. The primary module in the model compares the consumption of uranium for various production reactors against the base case of Watts Bar 1 running a nominal load of 1,696 tritium-producing burnable absorber rods (TPBARs) with an average refueling of 41,000 kg of low enriched uranium (LEU) on an 18-month cycle. This is an annual tritium production rate of approximately 0.059 grams of tritium per kilogram of LEU (g-T/kg-LEU). After inputting an initial year, a starting inventory of unobligated uranium, and a tritium production forecast, the model will compare and contrast the depletion rate of the LEU between the entered alternatives. To date, the Nuclear Regulatory Commission (NRC) license has not been amended to accept a full load of TPBARs, so the nominal tritium production has not yet been achieved. The alternatives currently loaded into the model include the three light water SMRs evaluated in TAMU senior projects: the mPower, Holtec, and NuScale designs. Initial evaluations of tritium production in light water reactor (LWR) based SMRs using optimized loads of TPBARs are on the order of 0.02-0.06 grams of tritium per kilogram of LEU used. The TAMU students also chose to model tritium production in the GE-Hitachi S-PRISM, a pool-type sodium fast reactor (SFR) utilizing a modified TPBAR-type target. The team was unable to complete their project, so no data are available.
In order to include results from a fast reactor, the SRNL Technical Advisory Committee (TAC) ran a Monte Carlo N-Particle (MCNP) model of a basic SFR for comparison. A 600 MWth core surrounded by a lithium blanket produced approximately 1,000 grams of tritium annually with a 13%-enriched, 6-year core. These results are similar to a mid-1990s study in which the Fast Flux Test Facility (FFTF), a 400 MWth reactor at the Hanford Site, could produce about 1,000 grams with an external lithium target. Normalized to the LWR values, comparative tritium production for an SFR could be approximately 0.31 g-T/kg-LEU.
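The normalization used throughout these comparisons is simply tritium output divided by LEU consumption over the same period. A minimal sketch (the 2,419 g per-cycle figure below is a hypothetical illustration chosen to be consistent with the stated 0.059 g-T/kg-LEU and 41,000 kg refueling, not a number from the report):

```python
def g_t_per_kg_leu(tritium_g_per_cycle, leu_kg_per_cycle):
    """Tritium production normalized to LEU consumption. Annualizing the
    numerator and denominator over the same cycle length cancels, so the
    per-cycle ratio equals the annual rate in g-T/kg-LEU."""
    return tritium_g_per_cycle / leu_kg_per_cycle
```

Because the cycle length cancels, reactors with different refueling intervals can be compared directly on this figure of merit.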

  10. Cybersickness-related changes in brain hemodynamics: A pilot study comparing transcranial Doppler and near-infrared spectroscopy assessments during a virtual ride on a roller coaster.

    PubMed

    Gavgani, Alireza Mazloumi; Wong, Rachel H X; Howe, Peter R C; Hodgson, Deborah M; Walker, Frederick R; Nalivaiko, Eugene

    2018-07-01

Our aim was to assess cerebral blood flow changes during cybersickness. Transcranial Doppler (TCD) ultrasound and near-infrared spectroscopy (NIRS) were used separately in two independent experiments. In both studies, a 15-min virtual roller coaster ride was used as a provocative visual stimulus. Subjective nausea ratings were obtained at 1-min intervals. The TCD study was performed in 14 healthy subjects (8 males and 6 females); in this study we also measured heart rate and arterial pressure. In a separate study, a 52-channel NIRS device (Hitachi ETG-4000) was used to monitor activated brain regions by measuring oxy-hemoglobin (HbO2) concentration in 9 healthy subjects (4 males, 5 females). The TCD study results showed a significant increase in systolic (+3.8 ± 1.8 mm Hg) and diastolic (+6.7 ± 1.3 mm Hg) pressure at the end of the virtual ride (maximum nausea) compared to baseline (no nausea). We also found that middle cerebral artery (MCA) and posterior cerebral artery (PCA) systolic flow velocity decreased significantly at the end of the ride when compared to baseline values. Likewise, the relative systolic and diastolic conductance in the MCA decreased significantly (-0.03 ± 0.02 cm·s⁻¹·mm Hg⁻¹, p = 0.0058 and -0.03 ± 0.01 cm·s⁻¹·mm Hg⁻¹, p = 0.05, respectively) at maximum nausea when compared to no nausea. Additionally, there was a significant decrease (-0.02 ± 0.01 cm·s⁻¹·mm Hg⁻¹, p = 0.03) in the relative systolic conductance in the PCA at the end of the ride. Analysis of the NIRS results showed a significant increase in HbO2 concentration in 15/52 channels in parieto-temporal regions of both hemispheres in participants who experienced motion sickness symptoms during the experiment. This increase in HbO2 concentration correlated with increasing nausea and motion sickness symptoms.
We conclude that cybersickness causes complex changes in cerebral blood flow, with an increase in perfusion in some cortical regions but a decrease in global cerebral perfusion. Copyright © 2018 Elsevier Inc. All rights reserved.
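The conductance values quoted above are flow velocity divided by arterial pressure (hence the cm·s⁻¹·mm Hg⁻¹ units). A minimal sketch of the index and its baseline-to-ride-end change, with illustrative numbers only (not data from the study):

```python
def conductance_index(velocity_cm_s, pressure_mmhg):
    """Cerebrovascular conductance index: flow velocity per unit
    arterial pressure, in cm * s^-1 * mmHg^-1."""
    return velocity_cm_s / pressure_mmhg

def conductance_change(v_base, p_base, v_end, p_end):
    """Change in the index between baseline (no nausea) and ride end
    (maximum nausea); a negative value means reduced conductance."""
    return conductance_index(v_end, p_end) - conductance_index(v_base, p_base)
```

A drop in velocity combined with a rise in pressure, as reported above, necessarily produces a negative conductance change.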

  11. Quality assurance of proton beams using a multilayer ionization chamber system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dhanesar, Sandeep; Sahoo, Narayan; Kerr, Matthew

    2013-09-15

Purpose: The measurement of percentage depth-dose (PDD) distributions for the quality assurance of clinical proton beams is most commonly performed with a computerized water tank dosimetry system with ionization chamber, commonly referred to as a water tank. Although the accuracy and reproducibility of this method are well established, it can be time-consuming if a large number of measurements is required. In this work the authors evaluate the linearity, reproducibility, sensitivity to field size, accuracy, and time savings of another system: the Zebra, a multilayer ionization chamber system. Methods: The Zebra, consisting of 180 parallel-plate ionization chambers with 2 mm resolution, was used to measure depth-dose distributions. The measurements were performed for scattered and scanned proton pencil beams of multiple energies delivered by the Hitachi PROBEAT synchrotron-based delivery system. For scattered beams, the Zebra-measured depth-dose distributions were compared with those measured with the water tank. The principal descriptors extracted for comparisons were: range, the depth of the distal 90% dose; spread-out Bragg peak (SOBP) length, the region between the proximal 95% and distal 90% dose; and distal-dose falloff (DDF), the region between the distal 80% and 20% dose. For scanned beams, the Zebra-measured ranges were compared with those acquired using a Bragg peak chamber during commissioning. Results: The Zebra demonstrated better than 1% reproducibility and monitor unit linearity. The response of the Zebra was found to be sensitive to radiation field sizes greater than 12.5 × 12.5 cm; hence, the measurements used to determine accuracy were performed using a field size of 10 × 10 cm. For the scattered proton beams, PDD distributions showed 1.5% agreement within the SOBP, and 3.8% outside. Range values agreed within −0.1 ± 0.4 mm, with a maximum deviation of 1.2 mm. SOBP length values agreed within 0 ± 2 mm, with a maximum deviation of 6 mm.
DDF values agreed within 0.3 ± 0.1 mm, with a maximum deviation of 0.6 mm. For the scanned proton pencil beams, Zebra and Bragg peak chamber range values demonstrated agreement of 0.0 ± 0.3 mm, with a maximum deviation of 1.3 mm. The setup and measurement times for all Zebra measurements were 3 and 20 times shorter, respectively, than for the water tank measurements. Conclusions: Our investigation shows that the Zebra can be useful not only for fast but also for accurate measurements of the depth-dose distributions of both scattered and scanned proton beams. The analysis of a large set of measurements shows that the commonly assessed beam quality parameters obtained with the Zebra are within the acceptable variations specified by the manufacturer for our delivery system.
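The descriptors defined above (range, SOBP length, DDF) can be extracted from a sampled depth-dose curve by linear interpolation between the bracketing samples. A hedged sketch of a generic implementation, not the clinical analysis code:

```python
import numpy as np

def distal_depth(depth, dose, level):
    """Depth on the distal falloff where the dose drops to `level`
    (fraction of maximum), found by scanning from the deepest point."""
    d = dose / dose.max()
    for i in range(len(d) - 1, 0, -1):
        if d[i] <= level <= d[i - 1]:
            f = (level - d[i]) / (d[i - 1] - d[i])      # linear interpolation
            return depth[i] + f * (depth[i - 1] - depth[i])
    return None

def proximal_depth(depth, dose, level):
    """First depth from the surface where the dose reaches `level`."""
    d = dose / dose.max()
    for i in range(1, len(d)):
        if d[i - 1] <= level <= d[i] and d[i] > d[i - 1]:
            f = (level - d[i - 1]) / (d[i] - d[i - 1])
            return depth[i - 1] + f * (depth[i] - depth[i - 1])
    return None

def pdd_descriptors(depth, dose):
    rng = distal_depth(depth, dose, 0.90)              # range: distal 90% dose
    sobp = rng - proximal_depth(depth, dose, 0.95)     # proximal 95% to distal 90%
    ddf = distal_depth(depth, dose, 0.20) - distal_depth(depth, dose, 0.80)
    return rng, sobp, ddf
```

Scanning the distal crossing from the deepest point ensures the distal falloff, not the proximal build-up, is the edge that gets interpolated.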

  12. Enabling CD SEM metrology for 5nm technology node and beyond

    NASA Astrophysics Data System (ADS)

    Lorusso, Gian Francesco; Ohashi, Takeyoshi; Yamaguchi, Astuko; Inoue, Osamu; Sutani, Takumichi; Horiguchi, Naoto; Bömmels, Jürgen; Wilson, Christopher J.; Briggs, Basoene; Tan, Chi Lim; Raymaekers, Tom; Delhougne, Romain; Van den Bosch, Geert; Di Piazza, Luca; Kar, Gouri Sankar; Furnémont, Arnaud; Fantini, Andrea; Donadio, Gabriele Luca; Souriau, Laurent; Crotti, Davide; Yasin, Farrukh; Appeltans, Raf; Rao, Siddharth; De Simone, Danilo; Rincon Delgadillo, Paulina; Leray, Philippe; Charley, Anne-Laure; Zhou, Daisy; Veloso, Anabela; Collaert, Nadine; Hasumi, Kazuhisa; Koshihara, Shunsuke; Ikota, Masami; Okagawa, Yutaka; Ishimoto, Toru

    2017-03-01

The CD SEM (Critical Dimension Scanning Electron Microscope) is one of the main tools used to estimate Critical Dimension (CD) in semiconductor manufacturing today, but, like all metrology tools, it will face considerable challenges to keep up with the requirements of future technology nodes. The root causes of these challenges are not uniquely related to the shrinking CD values, as one might expect, but also to the increase in complexity of the devices in terms of morphology and chemical composition. In fact, complicated three-dimensional device architectures, high-aspect-ratio features, and a wide variety of materials are some of the unavoidable characteristics of these future nodes. This means that, besides an improvement in resolution, it is critical to develop a CD SEM metrology capable of satisfying the specific needs of the devices of the nodes to come, needs that sometimes will have to be addressed through dramatic changes in approach with respect to traditional CD SEM metrology. In this paper, we report on the development of advanced CD SEM metrology at imec on a variety of device platforms and processes, for both logic and memories. We discuss newly developed approaches for standard, III-V, and germanium FinFETs (Fin Field-Effect Transistors), for lateral and vertical nanowires (NWs), 3D NAND (three-dimensional NAND), STT-MRAM (Spin-Transfer Torque Magnetic Random-Access Memory), and ReRAM (Resistive Random-Access Memory). Applications for both front-end of line (FEOL) and back-end of line (BEOL) are developed. In terms of process, S/D Epi (Source/Drain Epitaxy), SAQP (Self-Aligned Quadruple Patterning), DSA (Directed Self-Assembly), and EUVL (Extreme Ultraviolet Lithography) have been used. The work reported here has been performed on Hitachi CG5000, CG6300, and CV5000 tools.
In terms of logic, we discuss the S/D epi defect classification, the metrology optimization for STI (Shallow Trench Isolation) Ge FinFETs, the defectivity of III-V STI FinFETs, and metrology for vertical and horizontal NWs. With respect to memory, we discuss an STT-MRAM statistical CD analysis and its comparison to electrical performance, ReRAM metrology for VMCO (vacancy-modulated conductive oxide) with comparison to electrical performance, and 3D NAND ONO (Oxide-Nitride-Oxide) thickness measurements. In addition, we report on 3D morphological reconstruction using CD SEM in conjunction with FIB (Focused Ion Beam), on optimized BKM (Best Known Methods) development methodologies, and on CD SEM overlay. The large variety of results reported here gives a clear overview of the creative effort put in place to ensure that the critical potential of CD SEM metrology tools is fully enabled for the 5 nm node and beyond.

  13. Microstructural investigation of a locally mirror-like surface collected at 4 km depth in a Pomeranian shale sample

    NASA Astrophysics Data System (ADS)

    Pluymakers, Anne; Renard, Francois

    2016-04-01

The presence of shiny sliding surfaces, or mirror surfaces, is sometimes thought to have been caused by slip at seismic velocities. Many fault mirrors reported so far occur in carbonate-rich rocks. Here we present microstructural data on a mirror-like slip surface in the Pomeranian shale, recovered from approximately 4 km depth. The slip accommodated on this fault is probably small, not more than one or two centimeters. The Pomeranian shale is a dark-grey to black shale composed of 40-60% illite plus mica, 1-10% organic matter, 10% chlorite, and 10% carbonates, plus minor amounts of K-feldspar, plagioclase and kaolinite. In this sample, the surface is optically smooth, with striations and some patches that reflect light. Observations using a Hitachi TM3000 (table-top) SEM show that the striations are omnipresent, though more prominent in the carbonate patches (determined using EDS analysis). The smooth surface is locally covered by granular material with a grain size up to 10 μm; this material consists of a mixture of elements and is thus likely locally derived fault gouge. The clay-rich parts of the smooth surface consist of equidimensional grains with sub-micron sizes, whereas in the unperturbed part of the shale core the individual clay platelets are easy to distinguish, with lengths up to 10 μm. The striated calcite-rich patches appear as single grains with sizes up to several millimeters, though occasionally they are smeared out in a direction parallel to the striations. We have analyzed surface roughness at magnifications of 2.5x to 100x using a standard white light interferometer, parallel and perpendicular to slip. At low magnifications, 2.5x and 5x, Hurst exponents were anomalously low, around 0.1 to 0.2, interpreted as reflecting insufficient resolution to pick up the striations. At higher magnification the Hurst exponent is 0.34 to 0.43 parallel to the striations, and 0.44 to 0.61 perpendicular to them.
This relatively low Hurst exponent suggests that the surface has not experienced high strains, even though it locally exhibits mirror-like properties. As such, these data support the notion that the formation of shiny surfaces is related to grain size reduction, but does not necessarily indicate major slip events. Additionally, the more strongly visible striations in the carbonate-rich parts indicate that some mineralogies are more prone to the formation of striations than others. A full interpretation of this sample is of course complicated by its small size, but these data suggest that when examining fault mirrors and the presence of striations, spatial differences in mineralogy need to be taken into account.
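The Hurst exponent used above characterizes self-affine roughness: the RMS height difference grows with lag l as sigma(l) ~ l^H. A minimal one-dimensional estimate of H from the log-log slope (a generic method sketch, not the authors' interferometry pipeline):

```python
import numpy as np

def hurst_exponent(profile, max_lag=64):
    """Estimate the Hurst exponent H of a 1-D height profile from the
    log-log slope of the RMS height difference versus lag."""
    lags = np.unique(np.logspace(0, np.log10(max_lag), 12).astype(int))
    sigma = np.array([np.std(profile[l:] - profile[:-l]) for l in lags])
    h, _ = np.polyfit(np.log(lags), np.log(sigma), 1)
    return h
```

As a sanity check, a Brownian (random-walk) profile should yield H close to 0.5; smoother, striated surfaces like those described above yield lower values.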

  14. Collectively loading an application in a parallel computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.

    Collectively loading an application in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: identifying, by a parallel computer control system, a subset of compute nodes in the parallel computer to execute a job; selecting, by the parallel computer control system, one of the subset of compute nodes in the parallel computer as a job leader compute node; retrieving, by the job leader compute node from computer memory, an application for executing the job; and broadcasting, by the job leader to the subset of compute nodes in the parallel computer, the application for executing the job.
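The four steps of this abstract (pick a subset of nodes, select a leader, have the leader retrieve the application, broadcast it) can be sketched as a plain-Python simulation. The `Node` class and dictionary "filesystem" are hypothetical stand-ins; a real machine would use its collective network for the broadcast:

```python
class Node:
    def __init__(self, node_id):
        self.id = node_id
        self.memory = {}          # application images loaded on this node

def collective_load(all_nodes, job_node_ids, filesystem, job):
    """Load `job`'s application onto the participating nodes with a single
    read from storage followed by a broadcast from the job leader."""
    subset = [n for n in all_nodes if n.id in job_node_ids]   # control system picks the subset
    leader = subset[0]                                        # one participant becomes job leader
    image = filesystem[job]                                   # only the leader touches storage
    for node in subset:                                       # leader broadcasts the image
        node.memory[job] = image
    return leader
```

The point of the pattern is that storage is read once rather than once per node, which matters when thousands of nodes start the same job simultaneously.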

  15. Distributing an executable job load file to compute nodes in a parallel computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gooding, Thomas M.

Distributing an executable job load file to compute nodes in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: determining, by a compute node in the parallel computer, whether the compute node is participating in a job; determining, by the compute node in the parallel computer, whether a descendant compute node is participating in the job; responsive to determining that the compute node is participating in the job or that the descendant compute node is participating in the job, communicating, by the compute node to a parent compute node, an identification of a data communications link over which the compute node receives data from the parent compute node; constructing a class route for the job, wherein the class route identifies all compute nodes participating in the job; and broadcasting the executable load file for the job along the class route for the job.
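The class-route construction can be sketched as a post-order walk of the compute-node tree: a node joins the route if it participates in the job or if any descendant does (a simplified illustration with a hypothetical `children` adjacency dict, not the patented implementation):

```python
def class_route(children, participating, root):
    """Return the set of nodes that must receive or forward the load
    file: every node whose subtree contains a job participant."""
    route = set()

    def visit(node):
        in_job = node in participating                 # does this node participate?
        for child in children.get(node, []):
            in_job = visit(child) or in_job            # ...or any descendant?
        if in_job:
            route.add(node)                            # node reports its link upward
        return in_job

    visit(root)
    return route
```

Non-participating interior nodes still appear on the route when a descendant participates, since they must forward the broadcast along their data communications links.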

  16. Organization of the secure distributed computing based on multi-agent system

    NASA Astrophysics Data System (ADS)

    Khovanskov, Sergey; Rumyantsev, Konstantin; Khovanskova, Vera

    2018-04-01

Methods for distributed computing are currently receiving much attention. One such method is the use of multi-agent systems. Distributed computing organized over conventional networked computers can face security threats posed by the computational processes themselves. The authors have developed a unified agent algorithm for a control system governing the operation of computing network nodes, with networked PCs used as the computing nodes. The proposed multi-agent control system makes it possible, in a short time, to harness the processing power of the computers of any existing network to solve large tasks by creating a distributed computing system. Agents deployed across a computer network can configure the distributed computing system, distribute the computational load among the computers they operate, and optimize the distributed computing system according to the computing power of the computers on the network. The number of connected computers can be increased by adding computers to the system, which raises the overall processing power. Adding a central agent to the multi-agent system increases the security of the distributed computing. This organization of the distributed computing system reduces problem-solving time and increases the fault tolerance (vitality) of computing processes in a changing computing environment (a dynamically changing number of computers on the network). The developed multi-agent system also detects falsification of the results of the distributed system, which could otherwise lead to wrong decisions, and it checks and corrects wrong results.
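The load-distribution step ("distribute the computational load according to the computing power of the computers") can be sketched with a largest-remainder proportional split. This is a hypothetical helper illustrating the idea, not the authors' agent algorithm:

```python
def distribute_load(total_tasks, power):
    """Split `total_tasks` indivisible work units across nodes in
    proportion to their relative computing power."""
    total_power = sum(power)
    shares = [total_tasks * p / total_power for p in power]
    alloc = [int(s) for s in shares]                     # floor of each fair share
    leftovers = sorted(range(len(power)),
                       key=lambda i: shares[i] - alloc[i],
                       reverse=True)
    for i in leftovers[:total_tasks - sum(alloc)]:       # hand out the remainder
        alloc[i] += 1
    return alloc
```

The largest-remainder rule guarantees the allocations sum exactly to the task count while staying within one unit of each node's proportional share.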

  17. Aggregating job exit statuses of a plurality of compute nodes executing a parallel application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.

Aggregating job exit statuses of a plurality of compute nodes executing a parallel application, including: identifying a subset of compute nodes in the parallel computer to execute the parallel application; selecting one compute node in the subset of compute nodes in the parallel computer as a job leader compute node; initiating execution of the parallel application on the subset of compute nodes; receiving an exit status from each compute node in the subset of compute nodes, where the exit status for each compute node includes information describing execution of some portion of the parallel application by the compute node; aggregating each exit status from each compute node in the subset of compute nodes; and sending an aggregated exit status for the subset of compute nodes in the parallel computer.
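The aggregation step can be sketched as a reduction over per-node exit statuses in which the job as a whole succeeds only if every node did (a minimal illustration of the pattern, not the patented mechanism):

```python
def aggregate_exit_statuses(statuses):
    """Reduce per-node exit statuses (node_id -> int) to a single job
    status plus the map of failing nodes; 0 means every node succeeded."""
    failures = {node: s for node, s in statuses.items() if s != 0}
    job_status = max(failures.values()) if failures else 0
    return job_status, failures
```

Keeping the per-node failure map alongside the single aggregated code preserves the diagnostic detail the abstract mentions ("information describing execution of some portion of the parallel application").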

  19. Neural Computation and the Computational Theory of Cognition

    ERIC Educational Resources Information Center

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  20. Investigation of the computer experiences and attitudes of pre-service mathematics teachers: new evidence from Turkey.

    PubMed

    Birgin, Osman; Catlioğlu, Hakan; Gürbüz, Ramazan; Aydin, Serhat

    2010-10-01

This study aimed to investigate the experiences of pre-service mathematics (PSM) teachers with computers and their attitudes toward them. The Computer Attitude Scale, Computer Competency Survey, and Computer Use Information Form were administered to 180 Turkish PSM teachers. Results revealed that most PSM teachers used computers at home and at Internet cafes, and that their competency was generally at the intermediate and upper levels. The study concludes that PSM teachers' attitudes about computers differ according to their years of study, computer ownership, level of computer competency, frequency of computer use, computer experience, and whether they had attended a computer-aided instruction course. However, computer attitudes were not affected by gender.

  1. Computer Fear and Anxiety in the United States Army

    DTIC Science & Technology

    1991-03-01

Subject terms: Computer Fear, Computer Anxiety, Computerphobia, Cyberphobia, Technostress, Computer Aversion, Computerphrenia ...physiological and psychological disorders that impact not only on individuals, but on organizations as well. "Technostress" is a related term which is...computers, technostress, computer anxious, computer resistance, terminal phobia, fear of technology, computer distrust, and computer aversion. Whatever

  2. Cloud@Home: A New Enhanced Computing Paradigm

    NASA Astrophysics Data System (ADS)

    Distefano, Salvatore; Cunsolo, Vincenzo D.; Puliafito, Antonio; Scarpa, Marco

Cloud computing is a distributed computing paradigm that mixes aspects of Grid computing ("… hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational capabilities" (Foster, 2002)), Internet computing ("… a computing platform geographically distributed across the Internet" (Milenkovic et al., 2003)), Utility computing ("a collection of technologies and business practices that enables computing to be delivered seamlessly and reliably across multiple computers, ... available as needed and billed according to usage, much like water and electricity are today" (Ross & Westerman, 2004)), Autonomic computing ("computing systems that can manage themselves given high-level objectives from administrators" (Kephart & Chess, 2003)), Edge computing ("… provides a generic template facility for any type of application to spread its execution across a dedicated grid, balancing the load …" (Davis, Parikh, & Weihl, 2004)), and Green computing (a new frontier of Ethical computing starting from the assumption that in the near future energy costs will be tied to environmental pollution).

  3. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    ERIC Educational Resources Information Center

    Erdogan, Yavuz

    2009-01-01

The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on the computer hardware achievement, computer anxiety and computer attitude of eighth-grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  4. When does a physical system compute?

    PubMed

    Horsman, Clare; Stepney, Susan; Wagner, Rob C; Kendon, Viv

    2014-09-08

Computing is a high-level process of a physical system. Recent interest in non-standard computing systems, including quantum and biological computers, has brought this physical basis of computing to the forefront. There has been, however, no consensus on how to tell if a given physical system is acting as a computer or not, leading to confusion over novel computational devices, and even claims that every physical event is a computation. In this paper, we introduce a formal framework that can be used to determine whether a physical system is performing a computation. We demonstrate how the abstract computational level interacts with the physical device level, in comparison with the use of mathematical models in experimental science. This powerful formulation allows a precise description of experiments, technology, computation and simulation, giving our central conclusion: physical computing is the use of a physical system to predict the outcome of an abstract evolution. We give conditions for computing, illustrated using a range of non-standard computing scenarios. The framework also covers broader computing contexts, where there is no obvious human computer user. We introduce the notion of a 'computational entity', and its critical role in defining when computing is taking place in physical systems.

  5. When does a physical system compute?

    PubMed Central

    Horsman, Clare; Stepney, Susan; Wagner, Rob C.; Kendon, Viv

    2014-01-01

Computing is a high-level process of a physical system. Recent interest in non-standard computing systems, including quantum and biological computers, has brought this physical basis of computing to the forefront. There has been, however, no consensus on how to tell if a given physical system is acting as a computer or not, leading to confusion over novel computational devices, and even claims that every physical event is a computation. In this paper, we introduce a formal framework that can be used to determine whether a physical system is performing a computation. We demonstrate how the abstract computational level interacts with the physical device level, in comparison with the use of mathematical models in experimental science. This powerful formulation allows a precise description of experiments, technology, computation and simulation, giving our central conclusion: physical computing is the use of a physical system to predict the outcome of an abstract evolution. We give conditions for computing, illustrated using a range of non-standard computing scenarios. The framework also covers broader computing contexts, where there is no obvious human computer user. We introduce the notion of a ‘computational entity’, and its critical role in defining when computing is taking place in physical systems. PMID:25197245

  6. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...

  7. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...

  8. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...

  9. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...

  10. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...

  11. 77 FR 26041 - Certain Computers and Computer Peripheral Devices and Components Thereof and Products Containing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-02

    ... INTERNATIONAL TRADE COMMISSION [Inv. No. 337-TA-841] Certain Computers and Computer Peripheral... after importation of certain computers and computer peripheral devices and components thereof and... computers and computer peripheral devices and components thereof and products containing the same that...

  12. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation...

  13. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation...

  14. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...

  15. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...

  16. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...

  17. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive markings...

  18. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...

  19. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation...

  20. 48 CFR 227.7203-14 - Conformity, acceptance, and warranty of computer software and computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...

  1. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive markings...

  2. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation...

  3. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive markings...

  4. 48 CFR 227.7203-14 - Conformity, acceptance, and warranty of computer software and computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...

  5. 48 CFR 227.7202-3 - Rights in commercial computer software or commercial computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation...

  6. 48 CFR 227.7203-14 - Conformity, acceptance, and warranty of computer software and computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...

  7. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive markings...

  8. 48 CFR 227.7203-14 - Conformity, acceptance, and warranty of computer software and computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...

  9. 48 CFR 227.7203-14 - Conformity, acceptance, and warranty of computer software and computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...

  10. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive markings...

  11. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...

  12. The Research of the Parallel Computing Development from the Angle of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun

    2017-10-01

    Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing makes parallel computing come into people’s lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes and studies the principles, advantages and disadvantages of OpenMP, MPI and MapReduce respectively. Finally, it compares the MPI and OpenMP models with MapReduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
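    The MapReduce model mentioned above splits a job into a map phase, a shuffle that groups intermediate results by key, and a reduce phase. A minimal single-process sketch (function names here are illustrative, not the API of any real framework; a production system such as Hadoop would run each phase across cluster nodes):

```python
from collections import defaultdict

def map_phase(documents):
    """Emit (word, 1) pairs, as a MapReduce mapper would."""
    for doc in documents:
        for word in doc.split():
            yield word, 1

def shuffle_phase(pairs):
    """Group values by key, standing in for the framework's shuffle step."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Sum the grouped counts for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["cloud computing", "parallel computing models", "cloud models"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
print(counts["computing"])  # → 2
```

    Each phase depends only on its input, which is what lets a framework distribute the map and reduce work independently across nodes.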

  13. Providing full point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer

    DOEpatents

    Archer, Charles J; Faraj, Ahmad A; Inglett, Todd A; Ratterman, Joseph D

    2013-04-16

    Methods, apparatus, and products are disclosed for providing full point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer, each compute node connected to each adjacent compute node in the global combining network through a link, that include: receiving a network packet in a compute node, the network packet specifying a destination compute node; selecting, in dependence upon the destination compute node, at least one of the links for the compute node along which to forward the network packet toward the destination compute node; and forwarding the network packet along the selected link to the adjacent compute node connected to the compute node through the selected link.
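    The link-selection step the abstract describes can be sketched for a tree-shaped combining network: a node forwards a packet up toward the lowest common ancestor of itself and the destination, then down the destination's branch. This is a hypothetical illustration of the idea, not the patented method; the node ids and parent map are invented:

```python
def path_to_root(node, parent):
    """List the nodes from `node` up to the root of the tree."""
    path = [node]
    while node in parent:
        node = parent[node]
        path.append(node)
    return path

def select_link(current, destination, parent):
    """Return the adjacent node along which to forward the packet."""
    up = path_to_root(current, parent)
    down = path_to_root(destination, parent)
    on_dest_path = set(down)
    # lowest common ancestor: first node on our upward path that also
    # lies on the destination's path to the root
    for hop in up:
        if hop in on_dest_path:
            lca = hop
            break
    if current == lca:
        # move downward: next hop is the child on the destination's branch
        return down[down.index(lca) - 1]
    return parent[current]  # still below the ancestor: move upward

# a small tree: 0 is the root, 1 and 2 its children, 3 and 4 children of 1
parent = {1: 0, 2: 0, 3: 1, 4: 1}
print(select_link(3, 4, parent))  # → 1 (up toward the shared parent)
print(select_link(1, 4, parent))  # → 4 (down to the destination)
```

    Repeating the selection at each hop yields full point-to-point delivery over links that, in a combining network, were originally provided for collective operations.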

  14. Computer Anxiety: How to Measure It?

    ERIC Educational Resources Information Center

    McPherson, Bill

    1997-01-01

    Provides an overview of five scales that are used to measure computer anxiety: Computer Anxiety Index, Computer Anxiety Scale, Computer Attitude Scale, Attitudes toward Computers, and Blombert-Erickson-Lowrey Computer Attitude Task. Includes background information and scale specifics. (JOW)

  15. The roles of 'subjective computer training' and management support in the use of computers in community health centres.

    PubMed

    Yaghmaie, Farideh; Jayasuriya, Rohan

    2004-01-01

    There have been many changes made to information systems in the last decade. Changes in information systems require users constantly to update their computer knowledge and skills. Computer training is a critical issue for any user because it offers them considerable new skills. The purpose of this study was to measure the effects of 'subjective computer training' and management support on attitudes to computers, computer anxiety and subjective norms to use computers. The data were collected from community health centre staff. The results of the study showed that health staff trained in computer use had more favourable attitudes to computers, less computer anxiety and more awareness of others' expectations about computer use than untrained users. However, there was no relationship between management support and computer attitude, computer anxiety or subjective norms. Lack of computer training for the majority of healthcare staff confirmed the need for more attention to this issue, particularly in health centres.

  16. 48 CFR 227.7203-8 - Deferred delivery and deferred ordering of computer software and computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... deferred ordering of computer software and computer software documentation. 227.7203-8 Section 227.7203-8... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-8 Deferred delivery and deferred ordering of computer software and computer...

  17. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government with...

  18. 48 CFR 227.7203-8 - Deferred delivery and deferred ordering of computer software and computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... deferred ordering of computer software and computer software documentation. 227.7203-8 Section 227.7203-8... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-8 Deferred delivery and deferred ordering of computer software and computer...

  19. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government with...

  20. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government with...

  1. 48 CFR 227.7203-8 - Deferred delivery and deferred ordering of computer software and computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... deferred ordering of computer software and computer software documentation. 227.7203-8 Section 227.7203-8... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-8 Deferred delivery and deferred ordering of computer software and computer...

  2. 48 CFR 227.7203-8 - Deferred delivery and deferred ordering of computer software and computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... deferred ordering of computer software and computer software documentation. 227.7203-8 Section 227.7203-8... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-8 Deferred delivery and deferred ordering of computer software and computer...

  3. 48 CFR 227.7203-8 - Deferred delivery and deferred ordering of computer software and computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... deferred ordering of computer software and computer software documentation. 227.7203-8 Section 227.7203-8... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-8 Deferred delivery and deferred ordering of computer software and computer...

  4. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government with...

  5. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government with...

  6. Cloud Computing Fundamentals

    NASA Astrophysics Data System (ADS)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  7. Educational Technology: Best Practices from America's Schools.

    ERIC Educational Resources Information Center

    Bozeman, William C.; Baumbach, Donna J.

    This book begins with an overview of computer technology concepts, including computer system configurations, computer communications, and software. Instructional computer applications are then discussed; topics include computer-assisted instruction, computer-managed instruction, computer-enhanced instruction, LOGO, authoring programs, presentation…

  8. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...

  9. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...

  10. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...

  11. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...

  12. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...

  13. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...

  14. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...

  15. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...

  16. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...

  17. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...

  18. Attitudes to Technology, Perceived Computer Self-Efficacy and Computer Anxiety as Predictors of Computer Supported Education

    ERIC Educational Resources Information Center

    Celik, Vehbi; Yesilyurt, Etem

    2013-01-01

    There is a large body of research regarding computer supported education, perceptions of computer self-efficacy, computer anxiety and the technological attitudes of teachers and teacher candidates. However, no study has been conducted on the correlation between and effect of computer supported education, perceived computer self-efficacy, computer…

  19. Proposal for grid computing for nuclear applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Idris, Faridah Mohamad; Ismail, Saaidi; Haris, Mohd Fauzi B.

    2014-02-12

    The use of computer clusters for computational sciences, including computational physics, is vital as it provides the computing power to crunch big numbers at a faster rate. In compute-intensive applications that require high resolution, such as Monte Carlo simulation, the use of computer clusters in a grid form, which supplies computational power to any node within the grid that needs it, has become a necessity. In this paper, we describe how clusters running a specific application can use resources within the grid to run the application and speed up the computing process.
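    Monte Carlo work divides naturally this way because each batch of samples is independent. As a single-machine stand-in for the grid setup described above (a process pool plays the role of grid nodes; on a real grid a scheduler would dispatch the chunks), here is a parallel Monte Carlo estimate of π:

```python
import random
from multiprocessing import Pool

def count_hits(args):
    """Count random points in the unit square that fall inside the quarter circle."""
    seed, samples = args
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return hits

def estimate_pi(total_samples, workers=4):
    """Split the sampling into independent chunks and farm them out."""
    per_worker = total_samples // workers
    with Pool(workers) as pool:
        hits = pool.map(count_hits, [(seed, per_worker) for seed in range(workers)])
    return 4.0 * sum(hits) / (per_worker * workers)

if __name__ == "__main__":
    print(round(estimate_pi(400_000), 2))  # close to 3.14
```

    Because the chunks share no state, the same decomposition scales from a local pool to grid nodes with no change to the sampling logic, which is why Monte Carlo codes are a standard fit for grid computing.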

  20. Demonstration of blind quantum computing.

    PubMed

    Barz, Stefanie; Kashefi, Elham; Broadbent, Anne; Fitzsimons, Joseph F; Zeilinger, Anton; Walther, Philip

    2012-01-20

    Quantum computers, besides offering substantial computational speedups, are also expected to preserve the privacy of a computation. We present an experimental demonstration of blind quantum computing in which the input, computation, and output all remain unknown to the computer. We exploit the conceptual framework of measurement-based quantum computation that enables a client to delegate a computation to a quantum server. Various blind delegated computations, including one- and two-qubit gates and the Deutsch and Grover quantum algorithms, are demonstrated. The client only needs to be able to prepare and transmit individual photonic qubits. Our demonstration is crucial for unconditionally secure quantum cloud computing and might become a key ingredient for real-life applications, especially when considering the challenges of making powerful quantum computers widely available.

  1. Providing full point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archer, Charles J.; Faraj, Daniel A.; Inglett, Todd A.

    Methods, apparatus, and products are disclosed for providing full point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer, each compute node connected to each adjacent compute node in the global combining network through a link, that include: receiving a network packet in a compute node, the network packet specifying a destination compute node; selecting, in dependence upon the destination compute node, at least one of the links for the compute node along which to forward the network packet toward the destination compute node; and forwarding the network packet along the selected link to the adjacent compute node connected to the compute node through the selected link.

  2. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  3. Gender differences in the use of computers, programming, and peer interactions in computer science classrooms

    NASA Astrophysics Data System (ADS)

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-12-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views are different on computer use, programming, and the pattern of student interactions. Female and male students did not have any major issues in using computers. In computer programming, female students were not much involved in computing activities whereas male students were heavily involved. As for opinions about successful computer science professionals, both female and male students emphasized hard-working, detail-oriented approaches, and enjoying playing with computers. The myth of the geek as a typical profile of successful computer science students was not found to be true.

  4. 75 FR 30839 - Privacy Act of 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-02

    ... of the Matching Program A. General The Computer Matching and Privacy Protection Act of 1988 (Pub. L.... 100-503, the Computer Matching and Privacy Protection Act (CMPPA) of 1988), the Office of Management... 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer Match No. 1048, IRS...

  5. A Short History of the Computer.

    ERIC Educational Resources Information Center

    Leon, George

    1984-01-01

    Briefly traces the development of computers from the abacus, John Napier's logarithms, the first computer/calculator (known as the Difference Engine), the first computer programming via punched cards, the electrical analog computer, and the electronic digital computer, through the transistor, to the microchip of today's computers. (MBR)

  6. 48 CFR 227.7203-16 - Providing computer software or computer software documentation to foreign governments, foreign...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...

  7. 48 CFR 227.7203-16 - Providing computer software or computer software documentation to foreign governments, foreign...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...

  8. 48 CFR 227.7203-16 - Providing computer software or computer software documentation to foreign governments, foreign...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...

  9. 48 CFR 227.7203-16 - Providing computer software or computer software documentation to foreign governments, foreign...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...

  10. 48 CFR 227.7203-16 - Providing computer software or computer software documentation to foreign governments, foreign...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...

  11. Differences in muscle load between computer and non-computer work among office workers.

    PubMed

    Richter, J M; Mathiassen, S E; Slijper, H P; Over, E A B; Frens, M A

    2009-12-01

    Introduction of more non-computer tasks has been suggested to increase exposure variation and thus reduce musculoskeletal complaints (MSC) in computer-intensive office work. This study investigated whether muscle activity did, indeed, differ between computer and non-computer activities. Whole-day logs of input device use in 30 office workers were used to identify computer and non-computer work, using a range of classification thresholds (non-computer thresholds (NCTs)). Exposure during these activities was assessed by bilateral electromyography recordings from the upper trapezius and lower arm. Contrasts in muscle activity between computer and non-computer work were distinct but small, even at the individualised, optimal NCT. Using an average group-based NCT resulted in less contrast, even in smaller subgroups defined by job function or MSC. Thus, computer activity logs should be used cautiously as proxies of biomechanical exposure. Conventional non-computer tasks may have a limited potential to increase variation in muscle activity during computer-intensive office work.

  12. Nurses' computer literacy and attitudes towards the use of computers in health care.

    PubMed

    Gürdaş Topkaya, Sati; Kaya, Nurten

    2015-05-01

    This descriptive and cross-sectional study was designed to address nurses' computer literacy and attitudes towards the use of computers in health care and to determine the correlation between these two variables. This study was conducted with the participation of 688 nurses who worked at two university-affiliated hospitals. These nurses were chosen using a stratified random sampling method. The data were collected using the Multicomponent Assessment of Computer Literacy and the Pretest for Attitudes Towards Computers in Healthcare Assessment Scale v. 2. The nurses, in general, had positive attitudes towards computers, and their computer literacy was good. Computer literacy in general had significant positive correlations with individual elements of computer competency and with attitudes towards computers. If the computer is to be an effective and beneficial part of the health-care system, it is necessary to help nurses improve their computer competency. © 2014 Wiley Publishing Asia Pty Ltd.

  13. Computer Experiences, Self-Efficacy and Knowledge of Students Enrolled in Introductory University Agriculture Courses.

    ERIC Educational Resources Information Center

    Johnson, Donald M.; Ferguson, James A.; Lester, Melissa L.

    1999-01-01

    Of 175 freshmen agriculture students, 74% had prior computer courses, 62% owned computers. The number of computer topics studied predicted both computer self-efficacy and computer knowledge. A substantial positive correlation was found between self-efficacy and computer knowledge. (SK)

  14. Impact of Classroom Computer Use on Computer Anxiety.

    ERIC Educational Resources Information Center

    Lambert, Matthew E.; And Others

    Increasing use of computer programs for undergraduate psychology education has raised concern over the impact of computer anxiety on educational performance. Additionally, some researchers have indicated that classroom computer use can exacerbate pre-existing computer anxiety. To evaluate the relationship between in-class computer use and computer…

  15. Power throttling of collections of computing elements

    DOEpatents

    Bellofatto, Ralph E [Ridgefield, CT]; Coteus, Paul W [Yorktown Heights, NY]; Crumley, Paul G [Yorktown Heights, NY]; Gara, Alan G [Mount Kisco, NY]; Giampapa, Mark E [Irvington, NY]; Gooding, Thomas M [Rochester, MN]; Haring, Rudolf A [Cortlandt Manor, NY]; Megerian, Mark G [Rochester, MN]; Ohmacht, Martin [Yorktown Heights, NY]; Reed, Don D [Mantorville, MN]; Swetz, Richard A [Mahopac, NY]; Takken, Todd [Brewster, NY]

    2011-08-16

    An apparatus and method for controlling power usage in a computer includes a plurality of computers communicating with a local control device, and a power source supplying power to the local control device and the computer. A plurality of sensors communicate with the computer for ascertaining power usage of the computer, and a system control device communicates with the computer for controlling power usage of the computer.
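
    The feedback loop implied by this abstract (sensors report power draw, a control device throttles usage to keep the collection within a power budget) can be sketched as follows. This is an illustrative assumption, not the patented design; the proportional-scaling policy and all numbers are invented.

```python
# Hedged sketch of a throttling policy for a collection of computing
# elements: when summed sensor readings exceed the budget, scale every
# node's power cap proportionally. Policy and numbers are invented.

def throttle(power_draw, budget):
    """Return per-node power caps that respect the total budget."""
    total = sum(power_draw)
    if total <= budget:
        return list(power_draw)      # under budget: no throttling needed
    scale = budget / total
    return [round(p * scale, 2) for p in power_draw]

print(throttle([50, 60, 90], budget=150))  # [37.5, 45.0, 67.5]
```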

  16. The Effect of Computer Assisted and Computer Based Teaching Methods on Computer Course Success and Computer Using Attitudes of Students

    ERIC Educational Resources Information Center

    Tosun, Nilgün; Suçsuz, Nursen; Yigit, Birol

    2006-01-01

    The purpose of this research was to investigate the effects of computer-assisted and computer-based instructional methods on students' achievement in computer classes and on their attitudes towards using computers. The study, which was completed in 6 weeks, was carried out with 94 sophomores studying in the formal education program of Primary…

  17. 48 CFR 252.227-7014 - Rights in noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... computer software and noncommercial computer software documentation. 252.227-7014 Section 252.227-7014... Rights in noncommercial computer software and noncommercial computer software documentation. As prescribed in 227.7203-6(a)(1), use the following clause. Rights in Noncommercial Computer Software and...

  18. 48 CFR 252.227-7014 - Rights in noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... computer software and noncommercial computer software documentation. 252.227-7014 Section 252.227-7014... Rights in noncommercial computer software and noncommercial computer software documentation. As prescribed in 227.7203-6(a)(1), use the following clause. Rights in Noncommercial Computer Software and...

  19. 48 CFR 252.227-7014 - Rights in noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... computer software and noncommercial computer software documentation. 252.227-7014 Section 252.227-7014... Rights in noncommercial computer software and noncommercial computer software documentation. As prescribed in 227.7203-6(a)(1), use the following clause. Rights in Noncommercial Computer Software and...

  20. 48 CFR 252.227-7014 - Rights in noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computer software and noncommercial computer software documentation. 252.227-7014 Section 252.227-7014... Rights in noncommercial computer software and noncommercial computer software documentation. As prescribed in 227.7203-6(a)(1), use the following clause. Rights in Noncommercial Computer Software and...

  1. Precollege Computer Literacy: A Personal Computing Approach. Second Edition.

    ERIC Educational Resources Information Center

    Moursund, David

    Intended for elementary and secondary teachers and curriculum specialists, this booklet discusses and defines computer literacy as a functional knowledge of computers and their effects on students and the rest of society. It analyzes personal computing and the aspects of computers that have direct impact on students. Outlining computer-assisted…

  2. 48 CFR 252.227-7014 - Rights in noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computer software and noncommercial computer software documentation. 252.227-7014 Section 252.227-7014... Rights in noncommercial computer software and noncommercial computer software documentation. As prescribed in 227.7203-6(a)(1), use the following clause. Rights in Noncommercial Computer Software and...

  3. Synchronizing compute node time bases in a parallel computer

    DOEpatents

    Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

    2015-01-27

    Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.

  4. Synchronizing compute node time bases in a parallel computer

    DOEpatents

    Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

    2014-12-30

    Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.
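
    The core of the scheme in the two records above is setting each compute node's time base to its data transmission latency from the root of the tree network. A minimal sketch, assuming a dict-of-lists tree and a per-link latency map (both invented here; the barrier and pulse machinery is elided):

```python
# Illustrative sketch (not the patented implementation): accumulate
# root-to-node latency along the tree and use it as each node's time base.

def sync_time_bases(tree, link_latency, root="n0"):
    """Return {node: cumulative latency from root}, mimicking the
    pulse-based time-base assignment described in the abstract."""
    time_base = {root: 0}
    stack = [root]
    while stack:
        parent = stack.pop()
        for child in tree.get(parent, []):
            # Latency from the root accumulates along the tree path.
            time_base[child] = time_base[parent] + link_latency[(parent, child)]
            stack.append(child)
    return time_base

tree = {"n0": ["n1", "n2"], "n1": ["n3"]}
latency = {("n0", "n1"): 5, ("n0", "n2"): 7, ("n1", "n3"): 4}
print(sync_time_bases(tree, latency))  # {'n0': 0, 'n1': 5, 'n2': 7, 'n3': 9}
```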

  5. Specialized computer architectures for computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to further development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, high in terms of both dollar expenditure and elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still-unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems that will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  6. Physarum machines: encapsulating reaction-diffusion to compute spanning tree

    NASA Astrophysics Data System (ADS)

    Adamatzky, Andrew

    2007-12-01

    The Physarum machine is a biological computing device that employs the plasmodium of Physarum polycephalum as an unconventional computing substrate. A reaction-diffusion computer is a chemical computing device that computes by propagating diffusive or excitation wave fronts. Reaction-diffusion computers, despite being computationally universal machines, are unable to construct certain classes of proximity graphs without the assistance of an external computing device. I demonstrate that the problem can be solved if the reaction-diffusion system is enclosed in a membrane with a few ‘growth points’, sites guiding the pattern propagation. Experimental approximation of spanning trees by the P. polycephalum slime mold demonstrates the feasibility of the approach. The findings advance the theory of reaction-diffusion computation by enriching it with ideas from slime mold computation.

  7. Efficient universal blind quantum computation.

    PubMed

    Giovannetti, Vittorio; Maccone, Lorenzo; Morimae, Tomoyuki; Rudolph, Terry G

    2013-12-06

    We give a cheat-sensitive protocol for blind universal quantum computation that is efficient in terms of computational and communication resources: it allows one party to perform an arbitrary computation on a second party's quantum computer without revealing either which computation is performed, or its input and output. The first party's computational capabilities can be extremely limited: she must only be able to create and measure single-qubit superposition states. The second party is not required to use measurement-based quantum computation. The protocol requires the (optimal) exchange of O(J log2(N)) single-qubit states, where J is the computational depth and N is the number of qubits needed for the computation.

  8. Computer Literacy Project. A General Orientation in Basic Computer Concepts and Applications.

    ERIC Educational Resources Information Center

    Murray, David R.

    This paper proposes a two-part, basic computer literacy program for university faculty, staff, and students with no prior exposure to computers. The program described would introduce basic computer concepts and computing center service programs and resources; provide fundamental preparation for other computer courses; and orient faculty towards…

  9. Know Your Discipline: Teaching the Philosophy of Computer Science

    ERIC Educational Resources Information Center

    Tedre, Matti

    2007-01-01

    The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…

  10. Computer Anxiety and Student Teachers: Interrelationships between Computer Anxiety, Demographic Variables and an Intervention Strategy.

    ERIC Educational Resources Information Center

    McInerney, Valentina; And Others

    This study examined the effects of increased computing experience on the computer anxiety of 101 first year preservice teacher education students at a regional university in Australia. Three instruments measuring computer anxiety and attitudes--the Computer Anxiety Rating Scale (CARS), Attitudes Towards Computers Scale (ATCS), and Computer…

  11. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...

  12. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...

  13. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...

  14. 37 CFR 201.26 - Recordation of documents pertaining to computer shareware and donation of public domain computer...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...

  15. All Roads Lead to Computing: Making, Participatory Simulations, and Social Computing as Pathways to Computer Science

    ERIC Educational Resources Information Center

    Brady, Corey; Orton, Kai; Weintrop, David; Anton, Gabriella; Rodriguez, Sebastian; Wilensky, Uri

    2017-01-01

    Computer science (CS) is becoming an increasingly diverse domain. This paper reports on an initiative designed to introduce underrepresented populations to computing using an eclectic, multifaceted approach. As part of a yearlong computing course, students engage in Maker activities, participatory simulations, and computing projects that…

  16. Computer-Related Success and Failure: A Longitudinal Field Study of the Factors Influencing Computer-Related Performance.

    ERIC Educational Resources Information Center

    Rozell, E. J.; Gardner, W. L., III

    1999-01-01

    A model of the intrapersonal processes impacting computer-related performance was tested using data from 75 manufacturing employees in a computer training course. Gender, computer experience, and attributional style were predictive of computer attitudes, which were in turn related to computer efficacy, task-specific performance expectations, and…

  17. Computer ergonomics: the medical practice guide to developing good computer habits.

    PubMed

    Hills, Laura

    2011-01-01

    Medical practice employees are likely to use computers for at least some of their work. Some sit several hours each day at computer workstations. Therefore, it is important that members of your medical practice team develop good computer work habits and that they know how to align equipment, furniture, and their bodies to prevent strain, stress, and computer-related injuries. This article delves into the field of computer ergonomics-the design of computer workstations and work habits to reduce user fatigue, discomfort, and injury. It describes practical strategies medical practice employees can use to improve their computer work habits. Specifically, this article describes the proper use of the computer workstation chair, the ideal placement of the computer monitor and keyboard, and the best lighting for computer work areas and tasks. Moreover, this article includes computer ergonomic guidelines especially for bifocal and progressive lens wearers and offers 10 tips for proper mousing. Ergonomically correct posture, movements, positioning, and equipment are all described in detail to enable the frequent computer user in your medical practice to remain healthy, pain-free, and productive.

  18. Increasing processor utilization during parallel computation rundown

    NASA Technical Reports Server (NTRS)

    Jones, W. H.

    1986-01-01

    Some parallel processing environments provide for asynchronous execution and completion of general purpose parallel computations from a single computational phase. When all the computations from such a phase are complete, a new parallel computational phase is begun. Depending upon the granularity of the parallel computations to be performed, there may be a shortage of available work as a particular computational phase draws to a close (computational rundown). This can result in the waste of computing resources and the delay of the overall problem. In many practical instances, strict sequential ordering of phases of parallel computation is not totally required. In such cases, the beginning of one phase can be correctly computed before the end of a previous phase is completed. This allows additional work to be generated somewhat earlier to keep computing resources busy during each computational rundown. The conditions under which this can occur are identified and the frequency of occurrence of such overlapping in an actual parallel Navier-Stokes code is reported. A language construct is suggested and possible control strategies for the management of such computational phase overlapping are discussed.
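
    The benefit described above (keeping computing resources busy by starting the next phase's independent work during rundown) can be illustrated with a toy scheduling model. The two-worker setup, task durations, and greedy earliest-free assignment are assumptions for illustration only, not the paper's method:

```python
# Toy model: compare idle worker-steps at a phase boundary when the next
# phase must wait for a full barrier vs. when it may overlap the rundown.

def simulate(num_workers, phase_a, phase_b, overlap):
    """phase_a/phase_b: lists of task durations. Returns total idle
    worker-steps over the run. Without overlap, phase-B tasks wait for
    all of phase A; with overlap, they start as soon as a worker frees."""
    workers = [0] * num_workers          # time at which each worker is free

    def run(tasks, not_before=0):
        for d in tasks:
            i = min(range(num_workers), key=lambda w: workers[w])
            start = max(workers[i], not_before)
            workers[i] = start + d

    run(phase_a)
    barrier = 0 if overlap else max(workers)
    run(phase_b, not_before=barrier)
    makespan = max(workers)
    busy = sum(phase_a) + sum(phase_b)
    return num_workers * makespan - busy

idle_strict = simulate(2, [4, 4, 6], [3, 3], overlap=False)
idle_overlap = simulate(2, [4, 4, 6], [3, 3], overlap=True)
print(idle_strict, idle_overlap)  # 6 0
```

    In this toy run, overlapping the phases eliminates the idle worker-steps that the strict barrier created during rundown.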

  19. Gender stereotypes, aggression, and computer games: an online survey of women.

    PubMed

    Norris, Kamala O

    2004-12-01

    Computer games were conceptualized as a potential mode of entry into computer-related employment for women. Computer games contain increasing levels of realism and violence, as well as biased gender portrayals. It has been suggested that aggressive personality characteristics attract people to aggressive video games, and that more women do not play computer games because they are socialized to be non-aggressive. To explore gender identity and aggressive personality in the context of computers, an online survey was conducted on women who played computer games and women who used the computer but did not play computer games. Women who played computer games perceived their online environments as less friendly but experienced less sexual harassment online, were more aggressive themselves, and did not differ in gender identity, degree of sex role stereotyping, or acceptance of sexual violence when compared to women who used the computer but did not play video games. Finally, computer gaming was associated with decreased participation in computer-related employment; however, women with high masculine gender identities were more likely to use computers at work.

  20. Recent development on computer aided tissue engineering--a review.

    PubMed

    Sun, Wei; Lal, Pallavi

    2002-02-01

    The utilization of computer-aided technologies in tissue engineering has evolved in the development of a new field of computer-aided tissue engineering (CATE). This article reviews recent development and application of enabling computer technology, imaging technology, computer-aided design and computer-aided manufacturing (CAD and CAM), and rapid prototyping (RP) technology in tissue engineering, particularly, in computer-aided tissue anatomical modeling, three-dimensional (3-D) anatomy visualization and 3-D reconstruction, CAD-based anatomical modeling, computer-aided tissue classification, computer-aided tissue implantation and prototype modeling assisted surgical planning and reconstruction.

  1. Computer hardware fault administration

    DOEpatents

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
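
    A minimal sketch of the routing idea, assuming simple adjacency-dict networks and breadth-first routing (the node numbering and topologies are invented; the patent's actual routing machinery is not shown):

```python
from collections import deque

# Hedged sketch: when a link is defective in the first network, fall back
# to routing the message over the second, independent network.

def route(src, dst, net):
    """Breadth-first search for a path from src to dst in `net`
    (adjacency dict); returns the path as a list, or None."""
    frontier, seen = deque([[src]]), {src}
    while frontier:
        path = frontier.popleft()
        if path[-1] == dst:
            return path
        for nxt in net.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None

# Two independent networks over the same compute nodes.
net_a = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
net_b = {0: [2], 2: [0, 3], 3: [2, 1], 1: [3]}

# Mark link (1, 2) defective in net_a and route around it via net_b.
net_a[1].remove(2); net_a[2].remove(1)
path = route(0, 3, net_a) or route(0, 3, net_b)
print(path)  # [0, 2, 3] via the second network
```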

  2. Beyond input-output computings: error-driven emergence with parallel non-distributed slime mold computer.

    PubMed

    Aono, Masashi; Gunji, Yukio-Pegio

    2003-10-01

    Emergence derived from errors is of key importance both for novel computing and for novel uses of the computer. In this paper, we propose an implementable experimental plan for biological computing that elicits the emergent properties of complex systems. An individual plasmodium of the true slime mold Physarum polycephalum acts as the slime mold computer. Modifying the Elementary Cellular Automaton so that it entails the global synchronization problem of parallel computing yields an NP-complete problem to be solved by the slime mold computer. We discuss the possibility of solving the problem while supplying neither all possible results nor an explicit prescription for solution-seeking. In slime mold computing, the distributivity of the local computing logic can change dynamically, and its parallel non-distributed computing cannot be reduced to the spatial addition of multiple serial computations. A computing system based on the exhaustive absence of a super-system may produce something more than merely filling the vacancy.

  3. Computer Use and Computer Anxiety in Older Korean Americans.

    PubMed

    Yoon, Hyunwoo; Jang, Yuri; Xie, Bo

    2016-09-01

    Responding to the limited literature on computer use in ethnic minority older populations, the present study examined predictors of computer use and computer anxiety in older Korean Americans. Separate regression models were estimated for computer use and computer anxiety with the common sets of predictors: (a) demographic variables (age, gender, marital status, and education), (b) physical health indicators (chronic conditions, functional disability, and self-rated health), and (c) sociocultural factors (acculturation and attitudes toward aging). Approximately 60% of the participants were computer-users, and they had significantly lower levels of computer anxiety than non-users. A higher likelihood of computer use and lower levels of computer anxiety were commonly observed among individuals with younger age, male gender, advanced education, more positive ratings of health, and higher levels of acculturation. In addition, positive attitudes toward aging were found to reduce computer anxiety. Findings provide implications for developing computer training and education programs for the target population. © The Author(s) 2015.

  4. Providing nearest neighbor point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer

    DOEpatents

    Archer, Charles J.; Faraj, Ahmad A.; Inglett, Todd A.; Ratterman, Joseph D.

    2012-10-23

    Methods, apparatus, and products are disclosed for providing nearest neighbor point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer, each compute node connected to each adjacent compute node in the global combining network through a link, that include: identifying each link in the global combining network for each compute node of the operational group; designating one of a plurality of point-to-point class routing identifiers for each link such that no compute node in the operational group is connected to two adjacent compute nodes in the operational group with links designated for the same class routing identifiers; and configuring each compute node of the operational group for point-to-point communications with each adjacent compute node in the global combining network through the link between that compute node and that adjacent compute node using that link's designated class routing identifier.
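
    The designation constraint in the abstract (no compute node connected to two adjacent nodes with links of the same class routing identifier) reads as a proper edge coloring. A hedged sketch under that reading, with an invented link list and a greedy assignment that is an illustrative choice, not the patented procedure:

```python
# Greedy assignment of class routing identifiers to links so that no node
# has two incident links sharing an identifier (a proper edge coloring).

def assign_class_ids(links, num_ids):
    used = {}                      # node -> ids already on its incident links
    assignment = {}
    for a, b in links:
        taken = used.setdefault(a, set()) | used.setdefault(b, set())
        cid = next(i for i in range(num_ids) if i not in taken)
        assignment[(a, b)] = cid
        used[a].add(cid); used[b].add(cid)
    return assignment

links = [("n0", "n1"), ("n0", "n2"), ("n1", "n3"), ("n1", "n4")]
ids = assign_class_ids(links, num_ids=4)

# Verify the invariant: no node sees the same id on two of its links.
for node in ["n0", "n1", "n2", "n3", "n4"]:
    incident = [cid for (a, b), cid in ids.items() if node in (a, b)]
    assert len(incident) == len(set(incident))
print(ids)
```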

  5. Attitudes and gender differences of high school seniors within one-to-one computing environments in South Dakota

    NASA Astrophysics Data System (ADS)

    Nelson, Mathew

    In today's age of exponential change and technological advancement, awareness of any gender gap in technology and computer science-related fields is crucial, but further research must be done to better understand the complex interacting factors contributing to the gender gap. This study used a survey to investigate specific gender differences relating to computing self-efficacy, computer usage, and the environmental factors of exposure, personal interests, and parental influence that affect gender differences among high school students within a one-to-one computing environment in South Dakota. The population that completed the One-to-One High School Computing Survey for this study consisted of South Dakota high school seniors who had been involved in a one-to-one computing environment for two or more years. The survey data were analyzed using descriptive and inferential statistics for the determined variables. From the review of literature and the data analysis, several conclusions were drawn. Among them: overall, there was very little difference in perceived computing self-efficacy and computing anxiety between male and female students within the one-to-one computing initiative. The study supported current research indicating that males and females use computers similarly, although males spent more time using their computers to play online games. Early exposure to computers, or the age at which a student was first exposed to a computer, and the number of computers present in the home (computer ownership) affected computing self-efficacy. The results also indicated that parental encouragement to work with computers contributed positively to both male and female students' computing self-efficacy. Finally, the study found that both mothers and fathers encouraged their male children more than their female children to work with computers and to pursue careers in computer science fields.

  6. Distributed processor allocation for launching applications in a massively connected processors complex

    DOEpatents

    Pedretti, Kevin

    2008-11-18

    A compute processor allocator architecture for allocating compute processors to run applications in a multiple processor computing apparatus is distributed among a subset of processors within the computing apparatus. Each processor of the subset includes a compute processor allocator. The compute processor allocators can share a common database of information pertinent to compute processor allocation. A communication path permits retrieval of information from the database independently of the compute processor allocators.

  7. Locating hardware faults in a data communications network of a parallel computer

    DOEpatents

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-01-12

Hardware fault location in a data communications network of a parallel computer. Such a parallel computer includes a plurality of compute nodes and a data communications network that couples the compute nodes for data communications and organizes the compute nodes as a tree. Locating hardware faults includes identifying a next compute node as a parent node and the root of a parent test tree, identifying for each child compute node of the parent node a child test tree having that child compute node as root, running the same test suite on the parent test tree and on each child test tree, and identifying the parent compute node as having a defective link to a child compute node if the test suite fails on the parent test tree but succeeds on all the child test trees.
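The fault-isolation rule in this abstract (a parent's downlink is suspect when the suite fails on the parent test tree but passes on every child test tree) can be sketched in a few lines of Python. The helper names and the toy link table below are illustrative assumptions, not part of the patent:

```python
# Sketch of the fault-isolation inference: a parent node's link to a child
# is suspect when the test suite fails on the parent's test tree but passes
# on every child's test tree.

def locate_faulty_parent(parent, children, run_test_suite):
    """run_test_suite(root) -> True if the suite passes on that test tree."""
    parent_fails = not run_test_suite(parent)
    all_children_pass = all(run_test_suite(c) for c in children)
    return parent_fails and all_children_pass

# Toy model: the link A->B is broken, so any test tree that exercises it
# fails, while the subtrees rooted at B and C alone pass.
def make_suite(broken_links):
    links = {"A": ["B", "C"], "B": ["D"], "C": [], "D": []}
    def run(root):
        stack, ok = [root], True
        while stack:
            n = stack.pop()
            for child in links.get(n, []):
                if (n, child) in broken_links:
                    ok = False
                stack.append(child)
        return ok
    return run

suite = make_suite({("A", "B")})
print(locate_faulty_parent("A", ["B", "C"], suite))  # → True
```

The same check is then repeated with each child as the next parent to narrow the fault to a single link.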

  8. Broadcasting collective operation contributions throughout a parallel computer

    DOEpatents

Faraj, Ahmad [Rochester, MN]

    2012-02-21

    Methods, systems, and products are disclosed for broadcasting collective operation contributions throughout a parallel computer. The parallel computer includes a plurality of compute nodes connected together through a data communications network. Each compute node has a plurality of processors for use in collective parallel operations on the parallel computer. Broadcasting collective operation contributions throughout a parallel computer according to embodiments of the present invention includes: transmitting, by each processor on each compute node, that processor's collective operation contribution to the other processors on that compute node using intra-node communications; and transmitting on a designated network link, by each processor on each compute node according to a serial processor transmission sequence, that processor's collective operation contribution to the other processors on the other compute nodes using inter-node communications.

  9. Reflections from the Computer Equity Training Project.

    ERIC Educational Resources Information Center

    Sanders, Jo Shuchat

    This paper addresses girls' patterns of computer avoidance at the middle school and other grade levels. It reviews the evidence for a gender gap in computer use in several areas: in school, at home, in computer camps, in computer magazines, and in computer-related jobs. It compares the computer equity issue to math avoidance, and cites the middle…

  10. Models of Computer Use in School Settings. Technical Report Series, Report No. 84.2.2.

    ERIC Educational Resources Information Center

    Sherwood, Robert D.

    Designed to focus on student learning and to illustrate techniques that might be used with computers to facilitate that process, this paper discusses five types of computer use in educational settings: (1) learning ABOUT computers; (2) learning WITH computers; (3) learning FROM computers; (4) learning ABOUT THINKING with computers; and (5)…

  11. Implementing Computer Technology in the Rehabilitation Process.

    ERIC Educational Resources Information Center

    McCollum, Paul S., Ed.; Chan, Fong, Ed.

    1985-01-01

    This special issue contains seven articles, addressing rehabilitation in the information age, computer-assisted rehabilitation services, computer technology in rehabilitation counseling, computer-assisted career exploration and vocational decision making, computer-assisted assessment, computer enhanced employment opportunities for persons with…

  12. Development and Initial Validation of an Instrument to Measure Physicians' Use of, Knowledge about, and Attitudes Toward Computers

    PubMed Central

    Cork, Randy D.; Detmer, William M.; Friedman, Charles P.

    1998-01-01

This paper describes details of four scales of a questionnaire, "Computers in Medical Care", measuring attributes of computer use, self-reported computer knowledge, computer feature demand, and computer optimism of academic physicians. The reliability (i.e., precision, or degree to which the scale's result is reproducible) and validity (i.e., accuracy, or degree to which the scale actually measures what it is supposed to measure) of each scale were examined by analysis of the responses of 771 full-time academic physicians across four departments at five academic medical centers in the United States. The objectives of this paper were to define the psychometric properties of the scales as the basis for a future demonstration study and, pending the results of further validity studies, to provide the questionnaire and scales to the medical informatics community as a tool for measuring the attitudes of health care providers. Methodology: The dimensionality of each scale and the degree of association of each item with the attribute of interest were determined by principal components factor analysis with orthogonal varimax rotation. Weakly associated items (factor loading < .40) were deleted. The reliability of each resultant scale was computed using Cronbach's alpha coefficient. Content validity was addressed during scale construction; construct validity was examined through factor analysis and by correlational analyses. Results: Attributes of computer use, computer knowledge, and computer optimism were unidimensional, with the corresponding scales having reliabilities of .79, .91, and .86, respectively. The computer-feature demand attribute differentiated into two dimensions: the first reflecting demand for high-level functionality, with a reliability of .81, and the second demand for usability, with a reliability of .69.
There were significant positive correlations between computer use, computer knowledge, and computer optimism scale scores and respondents' hands-on computer use, computer training, and self-reported computer sophistication. In addition, items posited on the computer knowledge scale to be more difficult generated significantly lower scores. Conclusion: The four scales of the questionnaire appear to measure with adequate reliability five attributes of academic physicians' attitudes toward computers in medical care: computer use, self-reported computer knowledge, demand for computer functionality, demand for computer usability, and computer optimism. Results of initial validity studies are positive, but further validation of the scales is needed. The URL of a downloadable HTML copy of the questionnaire is provided. PMID:9524349
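For context on the reliability figures above, Cronbach's alpha can be computed directly from item-level scores. This is a generic textbook sketch, not the authors' analysis code:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
# items is a list of respondent rows, one column per scale item.
def cronbach_alpha(items):
    k = len(items[0])                      # number of items on the scale
    def var(xs):                           # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([row[j] for row in items]) for j in range(k)]
    total_var = var([sum(row) for row in items])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Perfectly parallel items drive alpha toward 1.
rows = [(1, 1), (2, 2), (3, 3), (4, 4)]
print(round(cronbach_alpha(rows), 2))  # → 1.0
```

A scale reliability of .79, for example, means roughly 79% of the variance in total scores is attributable to the common attribute rather than item-level noise.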

  13. Computer-aided design and computer science technology

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  14. Supercomputing in Aerospace

    NASA Technical Reports Server (NTRS)

    Kutler, Paul; Yee, Helen

    1987-01-01

    Topics addressed include: numerical aerodynamic simulation; computational mechanics; supercomputers; aerospace propulsion systems; computational modeling in ballistics; turbulence modeling; computational chemistry; computational fluid dynamics; and computational astrophysics.

  15. Publishing Trends in Educational Computing.

    ERIC Educational Resources Information Center

    O'Hair, Marilyn; Johnson, D. LaMont

    1989-01-01

    Describes results of a survey of secondary school and college teachers that was conducted to determine subject matter that should be included in educational computing journals. Areas of interest included computer applications; artificial intelligence; computer-aided instruction; computer literacy; computer-managed instruction; databases; distance…

  16. Adaptive voting computer system

    NASA Technical Reports Server (NTRS)

    Koczela, L. J.; Wilgus, D. S. (Inventor)

    1974-01-01

A computer system is reported that uses adaptive voting to tolerate failures and operates in a fail-operational, fail-safe manner. Each of four computers is individually connected to one of four external input/output (I/O) busses which interface with external subsystems. Each computer is connected to receive input data and commands from the other three computers and to furnish output data and commands to the other three computers. An adaptive control apparatus including a voter-comparator-switch (VCS) is provided for each computer; it receives signals from each of the computers and performs adaptive voting among them, enabling fail-operational, fail-safe operation.

  17. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  18. Paging memory from random access memory to backing storage in a parallel computer

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Inglett, Todd A; Ratterman, Joseph D; Smith, Brian E

    2013-05-21

    Paging memory from random access memory (`RAM`) to backing storage in a parallel computer that includes a plurality of compute nodes, including: executing a data processing application on a virtual machine operating system in a virtual machine on a first compute node; providing, by a second compute node, backing storage for the contents of RAM on the first compute node; and swapping, by the virtual machine operating system in the virtual machine on the first compute node, a page of memory from RAM on the first compute node to the backing storage on the second compute node.

  19. Virtualization and cloud computing in dentistry.

    PubMed

    Chow, Frank; Muftu, Ali; Shorter, Richard

    2014-01-01

The use of virtualization and cloud computing has changed the way we use computers. Virtualization is a method of placing software called a hypervisor on the hardware of a computer or a host operating system. It allows a guest operating system to run on top of the physical computer as a virtual machine (i.e., a virtual computer). Virtualization allows multiple virtual computers to run on top of one physical computer and to share its hardware resources, such as printers, scanners, and modems. This increases the efficient use of the computer by decreasing costs (e.g., hardware, electricity, administration, and management), since only one physical computer needs to be purchased and kept running. This virtualization platform is the basis for cloud computing, and it has expanded into the areas of server and storage virtualization. One of the commonly used dental storage systems is cloud storage. Patient information is encrypted as required by the Health Insurance Portability and Accountability Act (HIPAA) and stored on off-site private cloud services for a monthly service fee. As computing demands continue to increase, so too will the need for more storage and processing power. Virtual and cloud computing will be a way for dentists to minimize costs and maximize computer efficiency in the near future. This article provides some useful information on current uses of cloud computing.

  20. Climate@Home: Crowdsourcing Climate Change Research

    NASA Astrophysics Data System (ADS)

    Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.

    2011-12-01

Climate change deeply impacts human wellbeing. Significant resources have been invested in building super-computers capable of running advanced climate models, which help scientists understand climate change mechanisms and predict its trend. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking communication mediums for effectively informing the public about climate change and its consequences. The Climate@Home project is devoted to connecting the two ends with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from participants. A distributed web-based computing platform will be built to support climate computing, and the general public can 'plug in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a computer model, which is used by scientists to predict climate change. Traditionally, only super-computers could handle such a large processing load. By orchestrating massive numbers of personal computers to perform atomized data processing tasks, investment in new super-computers, the energy they consume, and the carbon they release are all reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among the participants. A portal is to be built as the gateway to the Climate@Home project. Three types of roles and the corresponding functionalities are designed and supported. The end users include citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Climate models are defined on the server side.
Climate scientists configure computer model parameters through the portal user interface. After model configuration, scientists launch the computing task. Next, data are atomized and distributed to computing engines running on citizen participants' computers. Scientists receive notifications on the completion of computing tasks and examine modeling results via the portal's visualization modules. Computing tasks, computing resources, and participants are managed by project managers via portal tools. A portal prototype has been built as a proof of concept. Three forums have been set up for different groups of users to share information on the science, technology, and educational outreach aspects of the project. A Facebook account has been set up to distribute messages via the most popular social networking platform. New threads are synchronized from the forums to Facebook. A mapping tool displays the geographic locations of participants and the status of tasks on each client node. A group of users has been invited to test functions such as forums, blogs, and computing resource monitoring.

  1. Molecular Sticker Model Stimulation on Silicon for a Maximum Clique Problem

    PubMed Central

    Ning, Jianguo; Li, Yanmei; Yu, Wen

    2015-01-01

Molecular computers (also called DNA computers), as an alternative to traditional electronic computers, are smaller in size but more energy efficient, and have massive parallel processing capacity. However, DNA computers may not outperform electronic computers owing to their higher error rates and some limitations of the biological laboratory. The stickers model, as a typical DNA-based computer, is computationally complete and universal, and can be viewed as a bit-vertically operating machine. This makes it attractive for silicon implementation. Inspired by the information processing method of the stickers computer, we propose a novel parallel computing model called DEM (DNA Electronic Computing Model) on a System-on-a-Programmable-Chip (SOPC) architecture. Except for the significant difference in the computing medium (transistor chips rather than bio-molecules), the DEM works similarly to DNA computers in immense parallel information processing. Additionally, a plasma display panel (PDP) is used to show the changing solutions and lets us directly see the distribution of assignments. The feasibility of the DEM is tested by applying it to a maximum clique problem (MCP) with eight vertices. Owing to the limited computing resources of the SOPC architecture, the DEM can solve moderate-size problems in polynomial time.
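For scale, the eight-vertex MCP instance reported here is tiny: a plain bit-vector brute force already enumerates all 2^8 candidate sets. The sketch below echoes the bit-vertical flavour of the stickers model in ordinary Python; it is not the SOPC/DEM hardware design:

```python
# Brute-force maximum clique on a small graph using bitmask adjacency:
# each vertex's neighbour set is a machine word, so the clique test is a
# few bitwise operations per vertex.
def max_clique(adj):
    """adj[v] = bitmask of v's neighbours; returns the best vertex mask."""
    n = len(adj)
    best = 0
    for subset in range(1 << n):            # enumerate all candidate sets
        ok = all(
            (subset & ~adj[v]) == (1 << v)  # v's only non-neighbour in subset is v itself
            for v in range(n) if subset >> v & 1
        )
        if ok and bin(subset).count("1") > bin(best).count("1"):
            best = subset
    return best

# Triangle 0-1-2 plus a pendant vertex 3 attached to vertex 2.
adj = [0b0110, 0b0101, 0b1011, 0b0100]
clique = max_clique(adj)
print(bin(clique))  # the triangle {0, 1, 2}
```

The exponential enumeration here is exactly the work that a stickers-style machine spreads across parallel bit operations.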

  2. Elastic Cloud Computing Architecture and System for Heterogeneous Spatiotemporal Computing

    NASA Astrophysics Data System (ADS)

    Shi, X.

    2017-10-01

Spatiotemporal computation implements a variety of different algorithms. When big data are involved, a desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may behave differently on different computing infrastructures and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful for certain kinds of spatiotemporal computation. The same is true when utilizing a cluster of Intel's many-integrated-core (MIC) or Xeon Phi processors, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency requirement in general computation, a Field Programmable Gate Array (FPGA) may be a better solution for energy efficiency when its computational performance is similar to or better than that of GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates all of GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  3. Evaluation of computer usage in healthcare among private practitioners of NCT Delhi.

    PubMed

    Ganeshkumar, P; Arun Kumar, Sharma; Rajoura, O P

    2011-01-01

The objectives were (1) to evaluate the usage and knowledge of computers and Information and Communication Technology in health care delivery by private practitioners, and (2) to understand the determinants of computer usage by them. A cross-sectional study was conducted among private practitioners practising in three districts of the NCT of Delhi between November 2007 and December 2008, selected by a stratified random sampling method; knowledge and usage of computers in health care and the determinants of computer usage were evaluated with a pre-coded, semi-open-ended questionnaire. About 77% of the practitioners reported having a computer and access to the internet. Computer availability and internet accessibility were highest among super-speciality practitioners. Practitioners who had attended a computer course were 13.8 times [OR: 13.8 (7.3 - 25.8)] more likely to have installed an EHR in the clinic. Technical issues were the major perceived barrier to installing a computer in the clinic. Practice speciality, previous attendance of a computer course, and the age at which practitioners started using a computer influenced their knowledge about computers. Speciality of the practice, the presence of a computer professional, and gender were the determinants of computer usage.

  4. Research on Key Technologies of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Zhang, Shufen; Yan, Hongcan; Chen, Xuebin

With the development of multi-core processors, virtualization, distributed storage, broadband Internet, and automatic management, a new computing mode named cloud computing has emerged. It distributes computation tasks over a resource pool consisting of massive numbers of computers, so application systems can obtain computing power, storage space, and software services according to demand. It can concentrate all the computing resources and manage them automatically through software, without human intervention. This frees application providers from tedious details and lets them concentrate on their business, which is advantageous for innovation and reduces cost. The ultimate goal of cloud computing is to provide calculation, services, and applications as a public utility, so that people can use computer resources just as they use water, electricity, gas, and telephone service. Currently, the understanding of cloud computing is still developing and changing, and cloud computing has no unanimous definition. This paper describes the three main service forms of cloud computing (SAAS, PAAS, and IAAS), compares the definitions of cloud computing given by Google, Amazon, IBM, and other companies, summarizes the basic characteristics of cloud computing, and emphasizes key technologies such as data storage, data management, virtualization, and the programming model.

  5. Performing an allreduce operation on a plurality of compute nodes of a parallel computer

    DOEpatents

Faraj, Ahmad [Rochester, MN]

    2012-04-17

    Methods, apparatus, and products are disclosed for performing an allreduce operation on a plurality of compute nodes of a parallel computer. Each compute node includes at least two processing cores. Each processing core has contribution data for the allreduce operation. Performing an allreduce operation on a plurality of compute nodes of a parallel computer includes: establishing one or more logical rings among the compute nodes, each logical ring including at least one processing core from each compute node; performing, for each logical ring, a global allreduce operation using the contribution data for the processing cores included in that logical ring, yielding a global allreduce result for each processing core included in that logical ring; and performing, for each compute node, a local allreduce operation using the global allreduce results for each processing core on that compute node.
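The two-phase structure of the claim (a global allreduce around each logical ring, then a local allreduce within each node) can be illustrated with a schematic sum reduction in which the actual ring communication is abstracted away. The function below is an illustrative sketch, not the patented implementation:

```python
# Schematic two-phase allreduce, assuming a sum reduction. Cores sharing an
# index across nodes form one logical ring (phase 1); each node then
# combines its cores' ring results locally (phase 2).
def allreduce(node_core_data):
    """node_core_data[n][c] = contribution of core c on node n."""
    n_nodes = len(node_core_data)
    n_cores = len(node_core_data[0])
    # Phase 1: global allreduce around each logical ring.
    ring_sums = [sum(node_core_data[n][c] for n in range(n_nodes))
                 for c in range(n_cores)]
    # Phase 2: local allreduce combines the per-ring results on each node.
    total = sum(ring_sums)
    return [[total] * n_cores for _ in range(n_nodes)]

data = [[1, 2], [3, 4]]          # 2 nodes x 2 cores
print(allreduce(data))           # every core ends with the full sum, 10
```

Splitting the work this way keeps the expensive inter-node traffic in parallel rings while the cheap final combine happens entirely on-node.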

  6. Broadcasting a message in a parallel computer

    DOEpatents

    Berg, Jeremy E [Rochester, MN; Faraj, Ahmad A [Rochester, MN

    2011-08-02

Methods, systems, and products are disclosed for broadcasting a message in a parallel computer. The parallel computer includes a plurality of compute nodes connected together using a data communications network. The data communications network is optimized for point-to-point data communications and is characterized by at least two dimensions. The compute nodes are organized into at least one operational group of compute nodes for collective parallel operations of the parallel computer. One compute node of the operational group is assigned to be a logical root. Broadcasting a message in a parallel computer includes: establishing a Hamiltonian path along all of the compute nodes in at least one plane of the data communications network and in the operational group; and broadcasting, by the logical root to the remaining compute nodes, the logical root's message along the established Hamiltonian path.
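As a concrete illustration of one such path, a boustrophedon ("snake") ordering of a rectangular mesh is Hamiltonian, and the root can forward its message hop by hop along it. This sketch assumes a 2D grid for simplicity; the patent covers more general networks:

```python
# A snake ordering of a rows x cols mesh visits every node exactly once,
# moving between grid neighbours at each step: one Hamiltonian path.
def hamiltonian_path(rows, cols):
    path = []
    for r in range(rows):
        cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        path.extend((r, c) for c in cs)
    return path

def broadcast(rows, cols, message):
    """Simulate the root forwarding its message along the path."""
    received = {}
    for node in hamiltonian_path(rows, cols):
        received[node] = message   # each hop forwards to the next node
    return received

result = broadcast(2, 3, "hello")
print(len(result))  # all 6 nodes received the message
```

Because every hop uses a distinct point-to-point link, the broadcast pipelines naturally on a network optimized for point-to-point traffic.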

  7. Developing the next generation of diverse computer scientists: the need for enhanced, intersectional computing identity theory

    NASA Astrophysics Data System (ADS)

    Rodriguez, Sarah L.; Lehman, Kathleen

    2017-10-01

    This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.

  8. AV Programs for Computer Know-How.

    ERIC Educational Resources Information Center

    Mandell, Phyllis Levy

    1985-01-01

    Lists 44 audiovisual programs (most released between 1983 and 1984) grouped in seven categories: computers in society, introduction to computers, computer operations, languages and programing, computer graphics, robotics, computer careers. Excerpts from "School Library Journal" reviews, price, and intended grade level are included. Names…

  9. 29 CFR 541.401 - Computer manufacture and repair.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... DEFINING AND DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE..., the use of computers and computer software programs (e.g., engineers, drafters and others skilled in computer-aided design software), but who are not primarily engaged in computer systems analysis and...

  10. Controlling data transfers from an origin compute node to a target compute node

    DOEpatents

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2011-06-21

    Methods, apparatus, and products are disclosed for controlling data transfers from an origin compute node to a target compute node that include: receiving, by an application messaging module on the target compute node, an indication of a data transfer from an origin compute node to the target compute node; and administering, by the application messaging module on the target compute node, the data transfer using one or more messaging primitives of a system messaging module in dependence upon the indication.

  11. Guidelines for development of NASA (National Aeronautics and Space Administration) computer security training programs

    NASA Technical Reports Server (NTRS)

    Tompkins, F. G.

    1983-01-01

    The report presents guidance for the NASA Computer Security Program Manager and the NASA Center Computer Security Officials as they develop training requirements and implement computer security training programs. NASA audiences are categorized based on the computer security knowledge required to accomplish identified job functions. Training requirements, in terms of training subject areas, are presented for both computer security program management personnel and computer resource providers and users. Sources of computer security training are identified.

  12. ANL statement of site strategy for computing workstations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fenske, K.R.; Boxberger, L.M.; Amiot, L.W.

    1991-11-01

This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85) and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstation acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: supercomputers, parallel computers, centralized general-purpose computers, distributed multipurpose minicomputers, and computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.

  13. Computational Social Creativity.

    PubMed

    Saunders, Rob; Bown, Oliver

    2015-01-01

    This article reviews the development of computational models of creativity where social interactions are central. We refer to this area as computational social creativity. Its context is described, including the broader study of creativity, the computational modeling of other social phenomena, and computational models of individual creativity. Computational modeling has been applied to a number of areas of social creativity and has the potential to contribute to our understanding of creativity. A number of requirements for computational models of social creativity are common in artificial life and computational social science simulations. Three key themes are identified: (1) computational social creativity research has a critical role to play in understanding creativity as a social phenomenon and advancing computational creativity by making clear epistemological contributions in ways that would be challenging for other approaches; (2) the methodologies developed in artificial life and computational social science carry over directly to computational social creativity; and (3) the combination of computational social creativity with individual models of creativity presents significant opportunities and poses interesting challenges for the development of integrated models of creativity that have yet to be realized.

  14. Pulmonary lobar volumetry using novel volumetric computer-aided diagnosis and computed tomography

    PubMed Central

    Iwano, Shingo; Kitano, Mariko; Matsuo, Keiji; Kawakami, Kenichi; Koike, Wataru; Kishimoto, Mariko; Inoue, Tsutomu; Li, Yuanzhong; Naganawa, Shinji

    2013-01-01

OBJECTIVES: To compare the accuracy of pulmonary lobar volumetry using the conventional number-of-segments method and novel volumetric computer-aided diagnosis using 3D computed tomography images. METHODS: We acquired 50 consecutive preoperative 3D computed tomography examinations for lung tumours reconstructed at 1-mm slice thickness. We calculated the lobar volume and the emphysematous lobar volume < −950 HU of each lobe using (i) the slice-by-slice method (reference standard), (ii) the number-of-segments method, (iii) semi-automatic computer-aided diagnosis, and (iv) automatic computer-aided diagnosis. We determined Pearson correlation coefficients between the reference standard and the three other methods for lobar volumes and emphysematous lobar volumes. We also compared the relative errors among the three measurement methods. RESULTS: Both semi-automatic and automatic computer-aided diagnosis results were more strongly correlated with the reference standard than the number-of-segments method. The correlation coefficients for automatic computer-aided diagnosis were slightly lower than those for semi-automatic computer-aided diagnosis because there was one outlier among 50 cases (2%) in the right upper lobe and two outliers among 50 cases (4%) in the other lobes. The relative error of the number-of-segments method was significantly greater than those of semi-automatic and automatic computer-aided diagnosis (P < 0.001). The computational time for automatic computer-aided diagnosis was 1/2 to 2/3 that of semi-automatic computer-aided diagnosis. CONCLUSIONS: A novel lobar volumetry computer-aided diagnosis system could measure lobar volumes more precisely than the conventional number-of-segments method.
Because semi-automatic and automatic computer-aided diagnosis were complementary, it would be more practical in clinical use to measure volumes first by automatic computer-aided diagnosis and then fall back to semi-automatic measurement when the automatic method fails. PMID:23526418
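The emphysematous lobar volume used above is simply the volume of lung voxels whose attenuation falls below −950 HU. A minimal sketch of that threshold computation, with a plain list standing in for a real CT voxel array (the function name and voxel spacing are illustrative assumptions, not the study's software):

```python
def emphysema_volume(hu_values, voxel_volume_mm3, threshold=-950):
    """Sum the volume of voxels whose attenuation is below the threshold."""
    return sum(voxel_volume_mm3 for hu in hu_values if hu < threshold)

lobe = [-980, -960, -940, -850, -990]   # Hounsfield units, illustrative
print(emphysema_volume(lobe, voxel_volume_mm3=1.0))  # -> 3.0
```

In practice the per-voxel volume comes from the scanner's voxel spacing, and the same threshold pass is run once per lobe mask.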

  15. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    NASA Astrophysics Data System (ADS)

    Anderson, Delia Marie Castro

    Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16-week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity, or between computer confidence and gender or ethnicity. However, students who had used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who had used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships were found between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience. Dependent t-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over the 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. 
Overall, students in the experimental group who responded to the Use of Internet Resources survey were positive (mean of 3.4 on the 4-point scale) toward their use of Internet resources, which included the online courseware developed by the researcher. Findings from this study suggest that (1) the digital divide with respect to gender and ethnicity may be narrowing, and (2) students who are exposed to a course that augments traditional teaching methods with computer-driven courseware appear to have less anxiety, have a clearer perception of computer usefulness, and feel that online resources enhance their learning.

  16. Pacing a data transfer operation between compute nodes on a parallel computer

    DOEpatents

    Blocksome, Michael A [Rochester, MN

    2011-09-13

    Methods, systems, and products are disclosed for pacing a data transfer between compute nodes on a parallel computer that include: transferring, by an origin compute node, a chunk of an application message to a target compute node; sending, by the origin compute node, a pacing request to a target direct memory access (`DMA`) engine on the target compute node using a remote get DMA operation; determining, by the origin compute node, whether a pacing response to the pacing request has been received from the target DMA engine; and transferring, by the origin compute node, a next chunk of the application message if the pacing response to the pacing request has been received from the target DMA engine.
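The pacing loop in the claim above can be sketched as follows, with Python queues standing in for the network and the target's DMA engine. The names (`paced_transfer`, the `"ack"` pacing response) are illustrative assumptions, not the patent's actual interfaces:

```python
from queue import Queue

def paced_transfer(message, chunk_size):
    """Transfer `message` chunk by chunk, pacing on an echo from the target."""
    network = Queue()   # chunks in flight from origin to target
    pacing = Queue()    # pacing responses from the target DMA engine
    received = []

    def target_dma(chunk):
        received.append(chunk)  # target stores the arriving chunk
        pacing.put("ack")       # remote-get pacing response back to origin

    for start in range(0, len(message), chunk_size):
        network.put(message[start:start + chunk_size])
        target_dma(network.get())  # chunk arrives at the target
        pacing.get()               # origin blocks until the pacing response
    return "".join(received)

print(paced_transfer("abcdefghij", 4))  # -> abcdefghij
```

The point of the handshake is that the origin never injects the next chunk until the target's DMA engine has confirmed it kept pace with the previous one.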

  17. Scheduling applications for execution on a plurality of compute nodes of a parallel computer to manage temperature of the nodes during execution

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Peters, Amanda E; Ratterman, Joseph D; Smith, Brian E

    2012-10-16

    Methods, apparatus, and products are disclosed for scheduling applications for execution on a plurality of compute nodes of a parallel computer to manage temperature of the plurality of compute nodes during execution that include: identifying one or more applications for execution on the plurality of compute nodes; creating a plurality of physically discontiguous node partitions in dependence upon temperature characteristics for the compute nodes and a physical topology for the compute nodes, each discontiguous node partition specifying a collection of physically adjacent compute nodes; and assigning, for each application, that application to one or more of the discontiguous node partitions for execution on the compute nodes specified by the assigned discontiguous node partitions.
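On a simplified one-dimensional topology, the partitioning step above can be sketched like this: hot nodes split the machine into mutually discontiguous partitions of physically adjacent cool nodes, and applications are then assigned to partitions. The temperature threshold and the greedy assignment rule are assumptions for illustration, not the patent's method:

```python
def cool_partitions(temps, threshold):
    """Group indices of physically adjacent nodes cooler than `threshold`."""
    partitions, current = [], []
    for node, t in enumerate(temps):
        if t < threshold:
            current.append(node)
        elif current:               # a hot node ends the current partition
            partitions.append(current)
            current = []
    if current:
        partitions.append(current)
    return partitions

def assign(apps, partitions):
    """Greedily assign each application to the largest remaining partition."""
    by_size = sorted(partitions, key=len, reverse=True)
    return {app: by_size[i % len(by_size)] for i, app in enumerate(apps)}

parts = cool_partitions([40, 42, 85, 41, 43, 44, 90, 39], threshold=60)
print(parts)                          # [[0, 1], [3, 4, 5], [7]]
print(assign(["appA", "appB"], parts))  # {'appA': [3, 4, 5], 'appB': [0, 1]}
```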

  18. Reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application

    DOEpatents

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2012-01-10

    Methods, apparatus, and products are disclosed for reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application that include: beginning, by each compute node, performance of a blocking operation specified by the parallel application, each compute node beginning the blocking operation asynchronously with respect to the other compute nodes; reducing, for each compute node, power to one or more hardware components of that compute node in response to that compute node beginning the performance of the blocking operation; and restoring, for each compute node, the power to the hardware components having power reduced in response to all of the compute nodes beginning the performance of the blocking operation.
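A toy model of the scheme above: each node drops power to some components when it enters the blocking operation, and power is restored only once every node has entered, i.e. the barrier is satisfied. The class name, power levels, and bookkeeping are invented for illustration:

```python
class BarrierPower:
    def __init__(self, nodes, full=100, low=20):
        self.nodes = set(nodes)
        self.full, self.low = full, low
        self.power = {n: full for n in nodes}
        self.entered = set()

    def enter_blocking_op(self, node):
        """Node begins the blocking operation and reduces its power."""
        self.entered.add(node)
        self.power[node] = self.low
        if self.entered == self.nodes:    # last arrival releases everyone
            for n in self.nodes:
                self.power[n] = self.full  # restore power; computation resumes

bp = BarrierPower(["n0", "n1", "n2"])
bp.enter_blocking_op("n0")
bp.enter_blocking_op("n1")
print(bp.power["n0"])   # 20  (still waiting in the barrier at reduced power)
bp.enter_blocking_op("n2")
print(bp.power["n0"])   # 100 (all nodes arrived; power restored)
```

Because nodes enter asynchronously, early arrivals would otherwise burn full power while idling in the barrier; reducing power on entry recovers exactly that waste.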

  19. Reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application

    DOEpatents

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda E [Cambridge, MA; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2012-04-17

    Methods, apparatus, and products are disclosed for reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application that include: beginning, by each compute node, performance of a blocking operation specified by the parallel application, each compute node beginning the blocking operation asynchronously with respect to the other compute nodes; reducing, for each compute node, power to one or more hardware components of that compute node in response to that compute node beginning the performance of the blocking operation; and restoring, for each compute node, the power to the hardware components having power reduced in response to all of the compute nodes beginning the performance of the blocking operation.

  20. A novel quantum scheme for secure two-party distance computation

    NASA Astrophysics Data System (ADS)

    Peng, Zhen-wan; Shi, Run-hua; Zhong, Hong; Cui, Jie; Zhang, Shun

    2017-12-01

    Secure multiparty computational geometry is an essential branch of secure multiparty computation, in which a geometric problem is computed without revealing any private information of the parties. Secure two-party distance computation is a primitive of secure multiparty computational geometry: it computes the distance between two points without revealing either point's location information (i.e., its coordinates). Secure two-party distance computation has potential applications with high security requirements in military, business, engineering, and other domains. In this paper, we present a quantum solution to secure two-party distance computation that subtly uses quantum private query. Compared with related classical protocols, our quantum protocol can ensure higher security and better privacy protection because of the physical principles of quantum mechanics.

  1. New computing systems, future computing environment, and their implications on structural analysis and design

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  2. Reducing power consumption during execution of an application on a plurality of compute nodes

    DOEpatents

    Archer, Charles J.; Blocksome, Michael A.; Peters, Amanda E.; Ratterman, Joseph D.; Smith, Brian E.

    2013-09-10

    Methods, apparatus, and products are disclosed for reducing power consumption during execution of an application on a plurality of compute nodes that include: powering up, during compute node initialization, only a portion of computer memory of the compute node, including configuring an operating system for the compute node in the powered up portion of computer memory; receiving, by the operating system, an instruction to load an application for execution; allocating, by the operating system, additional portions of computer memory to the application for use during execution; powering up the additional portions of computer memory allocated for use by the application during execution; and loading, by the operating system, the application into the powered up additional portions of computer memory.
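The boot-time strategy above can be modelled with a toy memory of banks: only one bank is powered at initialization to hold the operating system, and further banks are powered up only when an application's allocation request needs them. Bank size and the allocator are invented for illustration:

```python
class NodeMemory:
    def __init__(self, banks, bank_size):
        self.bank_size = bank_size
        self.powered = [False] * banks
        self.powered[0] = True      # only the OS bank is powered at boot
        self.used = bank_size       # OS image occupies bank 0

    def allocate(self, nbytes):
        """Power up just enough additional banks to satisfy the request."""
        self.used += nbytes
        banks_needed = -(-self.used // self.bank_size)  # ceiling division
        for b in range(banks_needed):
            self.powered[b] = True
        return banks_needed

mem = NodeMemory(banks=8, bank_size=1024)
print(sum(mem.powered))   # 1  (only the OS bank after boot)
mem.allocate(3000)        # operating system loads an application
print(sum(mem.powered))   # 4  (banks powered on demand for the application)
```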

  3. Computer-assisted learning in critical care: from ENIAC to HAL.

    PubMed

    Tegtmeyer, K; Ibsen, L; Goldstein, B

    2001-08-01

    Computers are commonly used to serve many functions in today's modern intensive care unit. One of the most intriguing and perhaps most challenging applications of computers has been the attempt to improve medical education. With the introduction of the first computer, medical educators began looking for ways to incorporate its use into the modern curriculum. The cost and complexity of computers have decreased steadily since their introduction, making it increasingly feasible to incorporate them into medical education. Simultaneously, the capabilities and capacities of computers have increased. Combining the computer with other modern digital technology has allowed the development of more intricate and realistic educational tools. The purpose of this article is to briefly describe the history and use of computers in medical education, with special reference to critical care medicine. In addition, we examine the role of computers in teaching and learning and discuss the types of interaction between the computer user and the computer.

  4. A data mining technique for discovering distinct patterns of hand signs: implications in user training and computer interface design.

    PubMed

    Ye, Nong; Li, Xiangyang; Farley, Toni

    2003-01-15

    Hand signs are considered one of the important ways to enter information into computers for certain tasks. Computers receive sensor data of hand signs for recognition. When using hand signs as computer inputs, we need to (1) train computer users in the sign language so that their hand signs can be easily recognized by computers, and (2) design the computer interface to avoid the use of confusing signs, improving user input performance and user satisfaction. For user training and computer interface design, it is important to know which signs can be easily recognized by computers and which signs computers cannot distinguish. This paper presents a data mining technique to discover distinct patterns of hand signs from sensor data. Based on these patterns, we derive a group of signs that are indistinguishable by computers. Such information can in turn assist in user training and computer interface design.

  5. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  6. ComputerTown: A Do-It-Yourself Community Computer Project. [Computer Town, USA and Other Microcomputer Based Alternatives to Traditional Learning Environments].

    ERIC Educational Resources Information Center

    Zamora, Ramon M.

    Alternative learning environments offering computer-related instruction are developing around the world. Storefront learning centers, museum-based computer facilities, and special theme parks are some of the new concepts. ComputerTown, USA! is a public access computer literacy project begun in 1979 to serve both adults and children in Menlo Park…

  7. A Research Program in Computer Technology. 1982 Annual Technical Report

    DTIC Science & Technology

    1983-03-01

    for the Defense Advanced Research Projects Agency. The research applies computer science and technology to areas of high DoD/military impact. The ISI...implement the plan; New Computing Environment - investigation and adaptation of developing computer technologies to serve the research and military user communities; and Computer...

  8. Computer-based medical education in Benha University, Egypt: knowledge, attitude, limitations, and suggestions.

    PubMed

    Bayomy, Hanaa; El Awadi, Mona; El Araby, Eman; Abed, Hala A

    2016-12-01

    Computer-assisted medical education has been developed to enhance learning and enable high-quality medical care. This study aimed to assess computer knowledge and attitudes toward the inclusion of computers in medical education among second-year medical students in Benha Faculty of Medicine, Egypt, to identify limitations, and to obtain suggestions for successful computer-based learning. This was a one-group pre-post-test study carried out on second-year students in Benha Faculty of Medicine. A structured self-administered questionnaire was used to compare students' knowledge, attitudes, limitations, and suggestions regarding computer usage in medical education before and after the computer course, to evaluate the change in students' responses. The majority of students were familiar with use of the mouse and keyboard, basic word processing, internet and web searching, and e-mail both before and after the computer course. The proportion of students who were familiar with software programs other than word processing and who could troubleshoot software/hardware was significantly higher after the course (P<0.001). There was a significant increase in the proportion of students who agreed on owning a computer (P=0.008), the inclusion of a computer skills course in medical education, downloading lecture handouts, and computer-based exams (P<0.001) after the course. After the course, there was also a significant increase in the proportion of students who agreed that the lack of central computers limited the inclusion of computers in medical education (P<0.001). 
Although the lack of computer labs, lack of Information Technology staff mentoring, large numbers of students, unclear course outlines, and lack of internet access were more frequently reported before the course (P<0.001), the majority of students suggested the provision of computer labs, inviting Information Technology staff to support computer teaching, and the availability of free Wi-Fi internet access covering several areas of the university campus; all of these would support computer-assisted medical education. Medical students in Benha University are computer literate, which allows for computer-based medical education. Staff training, provision of computer labs, and internet access are essential requirements for enhancing computer usage in medical education in the university.

  9. Distributed computing system with dual independent communications paths between computers and employing split tokens

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert D. (Inventor); Manning, Robert M. (Inventor); Lewis, Blair F. (Inventor); Bolotin, Gary S. (Inventor); Ward, Richard S. (Inventor)

    1990-01-01

    This is a distributed computing system providing flexible fault tolerance; ease of software design and concurrency specification; and dynamic load balancing. The system comprises a plurality of computers, each having a first input/output interface and a second input/output interface for interfacing to communications networks, each second input/output interface including a bypass for bypassing the associated computer. A global communications network interconnects the first input/output interfaces, providing each computer the ability to broadcast messages simultaneously to the remainder of the computers. A meshwork communications network interconnects the second input/output interfaces, providing each computer with the ability to establish a communications link with another of the computers while bypassing the remainder of the computers. Each computer is controlled by a resident copy of a common operating system. Communications between respective computers are by means of split tokens, each having a moving first portion which is sent from computer to computer and a resident second portion which is disposed in the memory of at least one of the computers, wherein the location of the second portion is part of the first portion. The split tokens represent both functions to be executed by the computers and data to be employed in the execution of those functions. The first input/output interfaces each include logic for detecting a collision between messages and for terminating the broadcasting of a message, whereby collisions between messages are detected and avoided.

  10. Development of a SaaS application probe to the physical properties of the Earth's interior: An attempt at moving HPC to the cloud

    NASA Astrophysics Data System (ADS)

    Huang, Qian

    2014-09-01

    Scientific computing often requires the availability of a massive number of computers for performing large-scale simulations, and computing in mineral physics is no exception. In order to investigate the physical properties of minerals at extreme conditions, computational mineral physics uses parallel computing technology to speed up performance, utilizing multiple computer resources to process a computational task simultaneously and thereby greatly reducing computation time. Traditionally, parallel computing has been addressed with High Performance Computing (HPC) solutions and installed facilities such as clusters and supercomputers. Today, there is tremendous growth in cloud computing. Infrastructure as a Service (IaaS), the on-demand and pay-as-you-go model, creates a flexible and cost-effective means of accessing computing resources. In this paper, a feasibility report of HPC on a cloud infrastructure is presented. It is found that current cloud services in the IaaS layer still need to improve performance to be useful for research projects. On the other hand, Software as a Service (SaaS), another type of cloud computing, is introduced into an HPC system for computing in mineral physics, and an application of it is developed. An overall description of this SaaS application is presented. This contribution can promote cloud application development in computational mineral physics and cross-disciplinary studies.

  11. CSNS computing environment Based on OpenStack

    NASA Astrophysics Data System (ADS)

    Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu

    2017-10-01

    Cloud computing allows more flexible configuration of IT resources and optimized hardware utilization, and it can provide computing services according to real need. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of a cloud computing platform based on OpenStack are demonstrated from the aspects of the system framework, network, storage, and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized at the end of this paper.

  12. Distributed Accounting on the Grid

    NASA Technical Reports Server (NTRS)

    Thigpen, William; Hacker, Thomas J.; McGinnis, Laura F.; Athey, Brian D.

    2001-01-01

    By the late 1990s, the Internet was adequately equipped to move vast amounts of data between HPC (High Performance Computing) systems, and efforts were initiated to link together the national infrastructure of high performance computational and data storage resources together into a general computational utility 'grid', analogous to the national electrical power grid infrastructure. The purpose of the Computational grid is to provide dependable, consistent, pervasive, and inexpensive access to computational resources for the computing community in the form of a computing utility. This paper presents a fully distributed view of Grid usage accounting and a methodology for allocating Grid computational resources for use on a Grid computing system.

  13. The meaning of computers to a group of men who are homeless.

    PubMed

    Miller, Kathleen Swenson; Bunch-Harrison, Stacey; Brumbaugh, Brett; Kutty, Rekha Sankaran; FitzGerald, Kathleen

    2005-01-01

    The purpose of this pilot study was to explore the experience with computers and the meaning of computers to a group of homeless men living in a long-term shelter. This descriptive exploratory study used semistructured interviews with seven men who had been given access to computers and had participated in individually tailored, occupation-based interventions through a Work Readiness Program. Three themes emerged from analyzing the interviews: access to computers, computers as a bridge to life-skill development, and changed self-perceptions as a result of connecting to technology. Because they lacked computer knowledge and feared failure, the majority of study participants had not sought out computers available through public access. The need for access to computers, the potential use of computers as a medium for intervention, and the meaning of computers to these men, who represent the digital divide, are described in this study.

  14. Performing process migration with allreduce operations

    DOEpatents

    Archer, Charles Jens; Peters, Amanda; Wallenfelt, Brian Paul

    2010-12-14

    Compute nodes perform allreduce operations that swap processes between nodes. A first allreduce operation generates a first result using a first process from a first compute node, a second process from a second compute node, and zeros from the other compute nodes. The first compute node replaces the first process with the first result. A second allreduce operation generates a second result using the first result from the first compute node, the second process from the second compute node, and zeros from the others. The second compute node replaces the second process with the second result, which is the first process. A third allreduce operation generates a third result using the first result from the first compute node, the second result from the second compute node, and zeros from the others. The first compute node replaces the first result with the third result, which is the second process.
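One concrete way to realize the three-allreduce swap is with a bitwise-XOR reduction: XOR with zero is the identity, so the zero contributions from uninvolved nodes drop out, and the sequence reduces to the classic XOR swap. This sequential simulation is an illustrative reading of the description above, not the patent's literal mechanism:

```python
def allreduce_xor(contributions):
    """Every node contributes a value; all nodes receive the XOR of them."""
    result = 0
    for value in contributions:
        result ^= value
    return result

def swap_processes(nodes, i, j):
    """Swap the process images held at nodes i and j via three allreduces."""
    def reduce_pair():
        # nodes i and j contribute their current contents; everyone else zeros
        return allreduce_xor([v if k in (i, j) else 0 for k, v in enumerate(nodes)])
    nodes[i] = reduce_pair()   # r1 = p_i ^ p_j
    nodes[j] = reduce_pair()   # r2 = r1 ^ p_j = p_i
    nodes[i] = reduce_pair()   # r3 = r1 ^ r2 = p_j

nodes = [0xAA, 0xBB, 0xCC, 0xDD]
swap_processes(nodes, 0, 2)
print([hex(v) for v in nodes])  # ['0xcc', '0xbb', '0xaa', '0xdd']
```

Using collectives for the swap means no node needs a point-to-point channel to its partner; the existing allreduce network does the migration.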

  15. Cloud Computing with iPlant Atmosphere.

    PubMed

    McKay, Sheldon J; Skidmore, Edwin J; LaRose, Christopher J; Mercer, Andre W; Noutsos, Christos

    2013-10-15

    Cloud Computing refers to distributed computing platforms that use virtualization software to provide easy access to physical computing infrastructure and data storage, typically administered through a Web interface. Cloud-based computing provides access to powerful servers, with specific software and virtual hardware configurations, while eliminating the initial capital cost of expensive computers and reducing the ongoing operating costs of system administration, maintenance contracts, power consumption, and cooling. This eliminates a significant barrier to entry into bioinformatics and high-performance computing for many researchers. This is especially true of free or modestly priced cloud computing services. The iPlant Collaborative offers a free cloud computing service, Atmosphere, which allows users to easily create and use instances on virtual servers preconfigured for their analytical needs. Atmosphere is a self-service, on-demand platform for scientific computing. This unit demonstrates how to set up, access and use cloud computing in Atmosphere. Copyright © 2013 John Wiley & Sons, Inc.

  16. Link failure detection in a parallel computer

    DOEpatents

    Archer, Charles J.; Blocksome, Michael A.; Megerian, Mark G.; Smith, Brian E.

    2010-11-09

    Methods, apparatus, and products are disclosed for link failure detection in a parallel computer including compute nodes connected in a rectangular mesh network, each pair of adjacent compute nodes in the rectangular mesh network connected together using a pair of links, that includes: assigning each compute node to either a first group or a second group such that adjacent compute nodes in the rectangular mesh network are assigned to different groups; sending, by each of the compute nodes assigned to the first group, a first test message to each adjacent compute node assigned to the second group; determining, by each of the compute nodes assigned to the second group, whether the first test message was received from each adjacent compute node assigned to the first group; and notifying a user, by each of the compute nodes assigned to the second group, whether the first test message was received.
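The group assignment above is a checkerboard two-colouring of the mesh: nodes whose coordinate sum is even form the first group, so every link joins nodes of different groups and one round of test messages exercises every link. A toy sketch on a small 2-D mesh, with failed links modelled as a set of node pairs (all names are illustrative assumptions):

```python
def detect_failed_links(width, height, failed):
    """Return links whose test message was not received by the second group."""
    missing = []
    for x in range(width):
        for y in range(height):
            if (x + y) % 2 != 0:
                continue                      # only first-group nodes send
            for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if 0 <= nx < width and 0 <= ny < height:
                    link = frozenset({(x, y), (nx, ny)})
                    if link in failed:        # receiver notifies the user
                        missing.append(tuple(sorted(link)))
    return sorted(missing)

bad = {frozenset({(0, 0), (0, 1)})}
print(detect_failed_links(3, 3, bad))   # [((0, 0), (0, 1))]
```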

  17. Parallel computing works

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  18. 'I'm good, but not that good': digitally-skilled young people's identity in computing

    NASA Astrophysics Data System (ADS)

    Wong, Billy

    2016-12-01

    Computers and information technology are fast becoming a part of young people's everyday life. However, there remains a difference between the majority who can use computers and the minority who are computer scientists or professionals. Drawing on 32 semi-structured interviews with digitally skilled young people (aged 13-19), we explore their views and aspirations in computing, with a focus on the identities and discourses that these youngsters articulate in relation to the field. Our findings suggest that, even among digitally skilled young people, the traditional image of computing people as clever but antisocial still prevails, which can be unattractive to youths, especially girls. Digitally skilled youths identify with computing in different ways and for different reasons. Most enjoy doing computing, but few aspired to be a computer person. Implications of our findings for computing education are discussed, especially the continued need to broaden identities in computing, even for the digitally skilled.

  19. Internode data communications in a parallel computer

    DOEpatents

    Archer, Charles J.; Blocksome, Michael A.; Miller, Douglas R.; Parker, Jeffrey J.; Ratterman, Joseph D.; Smith, Brian E.

    2013-09-03

    Internode data communications in a parallel computer that includes compute nodes that each include main memory and a messaging unit, the messaging unit including computer memory and coupling compute nodes for data communications, in which, for each compute node at compute node boot time: a messaging unit allocates, in the messaging unit's computer memory, a predefined number of message buffers, each message buffer associated with a process to be initialized on the compute node; receives, prior to initialization of a particular process on the compute node, a data communications message intended for the particular process; and stores the data communications message in the message buffer associated with the particular process. Upon initialization of the particular process, the process establishes a messaging buffer in main memory of the compute node and copies the data communications message from the message buffer of the messaging unit into the message buffer of main memory.
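The early-arrival handling above can be sketched with a small model: the messaging unit pre-allocates one buffer per expected process at boot, parks any message that arrives before its process exists, and hands the messages over when the process initializes. Class and method names are assumptions for illustration:

```python
class MessagingUnit:
    def __init__(self, expected_procs):
        # one pre-allocated message buffer per process, created at boot time
        self.buffers = {pid: [] for pid in expected_procs}

    def receive(self, pid, message):
        """Store a message that may arrive before its process is initialized."""
        self.buffers[pid].append(message)

    def drain(self, pid, main_memory):
        """On process init: copy buffered messages into main memory."""
        main_memory.extend(self.buffers.pop(pid))

mu = MessagingUnit(expected_procs=[0, 1])
mu.receive(1, "early-msg")      # arrives before process 1 is initialized
proc1_mailbox = []              # messaging buffer in the node's main memory
mu.drain(1, proc1_mailbox)      # process 1 boots and claims its messages
print(proc1_mailbox)            # ['early-msg']
```

The pre-allocation is the key move: because the buffer exists before the process does, no message sent during node boot is dropped.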

  20. Internode data communications in a parallel computer

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Miller, Douglas R; Parker, Jeffrey J; Ratterman, Joseph D; Smith, Brian E

    2014-02-11

    Internode data communications in a parallel computer that includes compute nodes that each include main memory and a messaging unit, the messaging unit including computer memory and coupling compute nodes for data communications, in which, for each compute node at compute node boot time: a messaging unit allocates, in the messaging unit's computer memory, a predefined number of message buffers, each message buffer associated with a process to be initialized on the compute node; receives, prior to initialization of a particular process on the compute node, a data communications message intended for the particular process; and stores the data communications message in the message buffer associated with the particular process. Upon initialization of the particular process, the process establishes a messaging buffer in main memory of the compute node and copies the data communications message from the message buffer of the messaging unit into the message buffer of main memory.

  1. Computers in Undergraduate Science Education. Conference Proceedings.

    ERIC Educational Resources Information Center

    Blum, Ronald, Ed.

    Six areas of computer use in undergraduate education, particularly in the fields of mathematics and physics, are discussed in these proceedings. The areas included are: the computational mode; computer graphics; the simulation mode; analog computing; computer-assisted instruction; and the current politics and management of college level computer…

  2. The Old Computers' Home.

    ERIC Educational Resources Information Center

    Angier, Natalie

    1983-01-01

    The Computer Museum in Marlborough, Massachusetts houses old and not-so-old calculators, famous old computers and parts of computers, photographs and assorted memorabilia, computer-generated murals, and even a computer made of Tinkertoys that plays tick-tack-toe. The development of the museum and selected exhibits is described. (Author/JN)

  3. 37 CFR 201.40 - Exemption to prohibition against circumvention.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... security of the owner or operator of a computer, computer system, or computer network; and (ii) The... film and media studies students; (ii) Documentary filmmaking; (iii) Noncommercial videos. (2) Computer... lawfully obtained, with computer programs on the telephone handset. (3) Computer programs, in the form of...

  4. Computers in aeronautics and space research at the Lewis Research Center

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This brochure presents a general discussion of the role of computers in aerospace research at NASA's Lewis Research Center (LeRC). Four particular areas of computer applications are addressed: computer modeling and simulation, computer assisted engineering, data acquisition and analysis, and computer controlled testing.

  5. Sources of computer self-efficacy: The relationship to outcome expectations, computer anxiety, and intention to use computers

    NASA Astrophysics Data System (ADS)

    Antoine, Marilyn V.

    2011-12-01

The purpose of this research was to extend earlier research on sources of self-efficacy (Lent, Lopez, & Bieschke, 1991; Usher & Pajares, 2009) to the information technology domain. The principal investigator examined how Bandura's (1977) sources of self-efficacy information---mastery experience, vicarious experience, verbal persuasion, and physiological states---shape computer self-efficacy beliefs and influence the decision to use or not use computers. The study took place at a mid-sized Historically Black College or University in the South. A convenience sample of 105 undergraduates was drawn from students enrolled in multiple sections of two introductory computer courses. There were 67 females and 38 males. This research was a correlational study of the following variables: sources of computer self-efficacy, general computer self-efficacy, outcome expectations, computer anxiety, and intention to use computers. The principal investigator administered a survey questionnaire containing 52 Likert items to measure the major study variables. Additionally, the survey instrument collected demographic variables such as gender, age, race, intended major, classification, technology use, technology adoption category, and whether the student owns a computer. The results reveal the following: (1) Mastery experience and verbal persuasion had statistically significant relationships to general computer self-efficacy, while vicarious experience and physiological states had non-significant relationships. Mastery experience had the strongest correlation to general computer self-efficacy. (2) All of the sources of computer self-efficacy had statistically significant relationships to personal outcome expectations. Vicarious experience had the strongest correlation to personal outcome expectations. (3) All of the sources of self-efficacy had statistically significant relationships to performance outcome expectations. 
Vicarious experience had the strongest correlation to performance outcome expectations. (4) Mastery experience and physiological states had statistically significant relationships to computer anxiety, while vicarious experience and verbal persuasion had non-significant relationships. Physiological states had the strongest correlation to computer anxiety. (5) Mastery experience, vicarious experience, and physiological states had statistically significant relationships to intention to use computers, while verbal persuasion had a non-significant relationship. Mastery experience had the strongest correlation to intention to use computers. Gender-related findings indicate that females reported higher average mastery experience, vicarious experience, physiological states, and intention to use computers than males. Females reported lower average general computer self-efficacy, computer anxiety, verbal persuasion, personal outcome expectations, and performance outcome expectations than males. The results of this study can be used to develop strategies for increasing general computer self-efficacy, outcome expectations, and intention to use computers. The results can also be used to develop strategies for reducing computer anxiety.
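
    The correlational findings above are of the familiar Pearson kind; a minimal sketch of such an analysis, on synthetic Likert-style data rather than the study's data, might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 105  # sample size matching the study
# Synthetic 5-point Likert responses for one source of self-efficacy.
mastery = rng.integers(1, 6, n).astype(float)
# Make the outcome partly depend on mastery so the two correlate.
self_efficacy = 0.6 * mastery + rng.normal(0, 1, n)

r = np.corrcoef(mastery, self_efficacy)[0, 1]  # Pearson correlation
print(f"r = {r:.2f}")
```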

  6. The use of wireless laptop computers for computer-assisted learning in pharmacokinetics.

    PubMed

    Munar, Myrna Y; Singh, Harleen; Belle, Donna; Brackett, Carolyn C; Earle, Sandra B

    2006-02-15

    To implement computer-assisted learning workshops into pharmacokinetics courses in a doctor of pharmacy (PharmD) program. Workshops were designed for students to utilize computer software programs on laptop computers to build pharmacokinetic models to predict drug concentrations resulting from various dosage regimens. In addition, students were able to visualize through graphing programs how altering different parameters changed drug concentration-time curves. Surveys were conducted to measure students' attitudes toward computer technology before and after implementation. Finally, traditional examinations were used to evaluate student learning. Doctor of pharmacy students responded favorably to the use of wireless laptop computers in problem-based pharmacokinetic workshops. Eighty-eight percent (n = 61/69) and 82% (n = 55/67) of PharmD students completed surveys before and after computer implementation, respectively. Prior to implementation, 95% of students agreed that computers would enhance learning in pharmacokinetics. After implementation, 98% of students strongly agreed (p < 0.05) that computers enhanced learning. Examination results were significantly higher after computer implementation (89% with computers vs. 84% without computers; p = 0.01). Implementation of wireless laptop computers in a pharmacokinetic course enabled students to construct their own pharmacokinetic models that could respond to changing parameters. Students had greater comprehension and were better able to interpret results and provide appropriate recommendations. Computer-assisted pharmacokinetic techniques can be powerful tools when making decisions about drug therapy.
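
    The abstract does not specify which models the students built; a minimal sketch, assuming a standard one-compartment IV-bolus model with repeated dosing and illustrative parameter values, shows how a pharmacokinetic model predicts concentrations from a dosage regimen:

```python
import math

def concentration(t, dose, V, k, tau):
    """Plasma concentration at time t (h) for IV boluses every tau hours."""
    c = 0.0
    for i in range(int(t // tau) + 1):   # sum contributions of past doses
        c += (dose / V) * math.exp(-k * (t - i * tau))
    return c

# 500 mg every 8 h, volume of distribution 40 L, elimination rate 0.1 /h
for t in (0, 8, 16, 24):
    print(f"t = {t:2d} h  C = {concentration(t, 500, 40, 0.1, 8):.2f} mg/L")
```

Each successive trough is higher than the last as the drug accumulates toward steady state; changing k, V, or tau reshapes the whole concentration-time curve, which is the kind of parameter exploration the workshops describe.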

  7. The Use of Wireless Laptop Computers for Computer-Assisted Learning in Pharmacokinetics

    PubMed Central

    Munar, Myrna Y.; Singh, Harleen; Belle, Donna; Brackett, Carolyn C.; Earle, Sandra B.

    2006-01-01

    Objective To implement computer-assisted learning workshops into pharmacokinetics courses in a doctor of pharmacy (PharmD) program. Design Workshops were designed for students to utilize computer software programs on laptop computers to build pharmacokinetic models to predict drug concentrations resulting from various dosage regimens. In addition, students were able to visualize through graphing programs how altering different parameters changed drug concentration-time curves. Surveys were conducted to measure students’ attitudes toward computer technology before and after implementation. Finally, traditional examinations were used to evaluate student learning. Assessment Doctor of pharmacy students responded favorably to the use of wireless laptop computers in problem-based pharmacokinetic workshops. Eighty-eight percent (n = 61/69) and 82% (n = 55/67) of PharmD students completed surveys before and after computer implementation, respectively. Prior to implementation, 95% of students agreed that computers would enhance learning in pharmacokinetics. After implementation, 98% of students strongly agreed (p < 0.05) that computers enhanced learning. Examination results were significantly higher after computer implementation (89% with computers vs. 84% without computers; p = 0.01). Conclusion Implementation of wireless laptop computers in a pharmacokinetic course enabled students to construct their own pharmacokinetic models that could respond to changing parameters. Students had greater comprehension and were better able to interpret results and provide appropriate recommendations. Computer-assisted pharmacokinetic techniques can be powerful tools when making decisions about drug therapy. PMID:17136147

  8. A Position on a Computer Literacy Course.

    ERIC Educational Resources Information Center

    Self, Charles C.

    A position is put forth on the appropriate content of a computer literacy course and the role of computer literacy in the community college. First, various definitions of computer literacy are examined, including the programming, computer awareness, and comprehensive approaches. Next, five essential components of a computer literacy course are…

  9. Investigations in Computer-Aided Instruction and Computer-Aided Controls. Final Report.

    ERIC Educational Resources Information Center

    Rosenberg, R.C.; And Others

    These research projects, designed to delve into certain relationships between humans and computers, are focused on computer-assisted instruction and on man-computer interaction. One study demonstrates that within the limits of formal engineering theory, a computer simulated laboratory (Dynamic Systems Laboratory) can be built in which freshmen…

  10. Computational Science News | Computational Science | NREL

    Science.gov Websites

    … Cooled High-Performance Computing Technology at the ESIF (February 28, 2018) … NREL Launches New Website for High-Performance Computing System Users: The National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC) …

  11. High-Performance Computing and Visualization | Energy Systems Integration

    Science.gov Websites

    Facility | NREL … High-performance computing (HPC) and visualization at NREL propel technology innovation … Capabilities: High-Performance Computing. NREL is home to Peregrine, the largest high-performance computing system …

  12. Computer Training for Seniors: An Academic-Community Partnership

    ERIC Educational Resources Information Center

    Sanders, Martha J.; O'Sullivan, Beth; DeBurra, Katherine; Fedner, Alesha

    2013-01-01

    Computer technology is integral to information retrieval, social communication, and social interaction. However, only 47% of seniors aged 65 and older use computers. The purpose of this study was to determine the impact of a client-centered computer program on computer skills, attitudes toward computer use, and generativity in novice senior…

  13. Computing Education in Korea--Current Issues and Endeavors

    ERIC Educational Resources Information Center

    Choi, Jeongwon; An, Sangjin; Lee, Youngjun

    2015-01-01

    Computer education has been provided for a long period of time in Korea. Starting as a vocational program, the content of computer education for students evolved to include content on computer literacy, Information Communication Technology (ICT) literacy, and brand-new computer science. While a new curriculum related to computer science was…

  14. Computers in Electrical Engineering Education at Virginia Polytechnic Institute.

    ERIC Educational Resources Information Center

    Bennett, A. Wayne

    1982-01-01

    Discusses use of computers in Electrical Engineering (EE) at Virginia Polytechnic Institute. Topics include: departmental background, level of computing power using large scale systems, mini and microcomputers, use of digital logic trainers and analog/hybrid computers, comments on integrating computers into EE curricula, and computer use in…

  15. CAA: Computer Assisted Athletics.

    ERIC Educational Resources Information Center

    Hall, John H.

    Computers have been used in a variety of applications for athletics since the late 1950's. These have ranged from computer-controlled electric scoreboards to computer-designed pole vaulting poles. Described in this paper are a computer-based athletic injury reporting system and a computer-assisted football scouting system. The injury reporting…

  16. An Implemented Strategy for Campus Connectivity and Cooperative Computing.

    ERIC Educational Resources Information Center

    Halaris, Antony S.; Sloan, Lynda W.

    1989-01-01

    ConnectPac, a software package developed at Iona College to allow a computer user to access all services from a single personal computer, is described. ConnectPac uses mainframe computing to support a campus computing network, integrating personal and centralized computing into a menu-driven user environment. (Author/MLW)

  17. Overreaction to External Attacks on Computer Systems Could Be More Harmful than the Viruses Themselves.

    ERIC Educational Resources Information Center

    King, Kenneth M.

    1988-01-01

    Discussion of the recent computer virus attacks on computers with vulnerable operating systems focuses on the values of educational computer networks. The need for computer security procedures is emphasized, and the ethical use of computer hardware and software is discussed. (LRW)

  18. Computer Numerical Control: Instructional Manual. The North Dakota High Technology Mobile Laboratory Project.

    ERIC Educational Resources Information Center

    Sinn, John W.

    This instructional manual contains five learning activity packets for use in a workshop on computer numerical control for computer-aided manufacturing. The lessons cover the following topics: introduction to computer-aided manufacturing, understanding the lathe, using the computer, computer numerically controlled part programming, and executing a…

  19. 29 CFR 541.402 - Executive and administrative computer employees.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 3 2014-07-01 2014-07-01 false Executive and administrative computer employees. 541.402..., COMPUTER AND OUTSIDE SALES EMPLOYEES Computer Employees § 541.402 Executive and administrative computer employees. Computer employees within the scope of this exemption, as well as those employees not within its...

  20. 29 CFR 541.402 - Executive and administrative computer employees.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 3 2012-07-01 2012-07-01 false Executive and administrative computer employees. 541.402..., COMPUTER AND OUTSIDE SALES EMPLOYEES Computer Employees § 541.402 Executive and administrative computer employees. Computer employees within the scope of this exemption, as well as those employees not within its...

  1. 29 CFR 541.402 - Executive and administrative computer employees.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 3 2013-07-01 2013-07-01 false Executive and administrative computer employees. 541.402..., COMPUTER AND OUTSIDE SALES EMPLOYEES Computer Employees § 541.402 Executive and administrative computer employees. Computer employees within the scope of this exemption, as well as those employees not within its...

  2. Computer Connections for Gifted Children and Youth.

    ERIC Educational Resources Information Center

    Nazzaro, Jean N., Ed.

    Written by computer specialists, teachers, parents, and students, the 23 articles emphasize the role computers play in the development of thinking, problem solving, and creativity in gifted and talented students. Articles have the following titles and authors: "Computers and Computer Cultures" (S. Papert); "Classroom Computers--Beyond the 3 R's"…

  3. Redirecting Under-Utilised Computer Laboratories into Cluster Computing Facilities

    ERIC Educational Resources Information Center

    Atkinson, John S.; Spenneman, Dirk H. R.; Cornforth, David

    2005-01-01

    Purpose: To provide administrators at an Australian university with data on the feasibility of redirecting under-utilised computer laboratories facilities into a distributed high performance computing facility. Design/methodology/approach: The individual log-in records for each computer located in the computer laboratories at the university were…

  4. 48 CFR 552.216-72 - Placement of Orders.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Acquisition Service (FAS) will place all orders by EDI using computer-to-computer EDI. If computer-to-computer... EDI. (d) When computer-to-computer EDI procedures will be used to place orders, the Contractor shall... electronic orders are placed, the transaction sets used, security procedures, and guidelines for...

  5. 48 CFR 552.216-72 - Placement of Orders.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Acquisition Service (FAS) will place all orders by EDI using computer-to-computer EDI. If computer-to-computer... EDI. (d) When computer-to-computer EDI procedures will be used to place orders, the Contractor shall... electronic orders are placed, the transaction sets used, security procedures, and guidelines for...

  6. 48 CFR 552.216-72 - Placement of Orders.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Acquisition Service (FAS) will place all orders by EDI using computer-to-computer EDI. If computer-to-computer... EDI. (d) When computer-to-computer EDI procedures will be used to place orders, the Contractor shall... electronic orders are placed, the transaction sets used, security procedures, and guidelines for...

  7. 48 CFR 552.216-72 - Placement of Orders.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Acquisition Service (FAS) will place all orders by EDI using computer-to-computer EDI. If computer-to-computer... EDI. (d) When computer-to-computer EDI procedures will be used to place orders, the Contractor shall... electronic orders are placed, the transaction sets used, security procedures, and guidelines for...

  8. 48 CFR 552.216-72 - Placement of Orders.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Acquisition Service (FAS) will place all orders by EDI using computer-to-computer EDI. If computer-to-computer... EDI. (d) When computer-to-computer EDI procedures will be used to place orders, the Contractor shall... electronic orders are placed, the transaction sets used, security procedures, and guidelines for...

  9. A Call for Computational Thinking in Undergraduate Psychology

    ERIC Educational Resources Information Center

    Anderson, Nicole D.

    2016-01-01

    Computational thinking is an approach to problem solving that is typically employed by computer programmers. The advantage of this approach is that solutions can be generated through algorithms that can be implemented as computer code. Although computational thinking has historically been a skill that is exclusively taught within computer science,…

  10. 29 CFR 541.402 - Executive and administrative computer employees.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 3 2011-07-01 2011-07-01 false Executive and administrative computer employees. 541.402..., COMPUTER AND OUTSIDE SALES EMPLOYEES Computer Employees § 541.402 Executive and administrative computer employees. Computer employees within the scope of this exemption, as well as those employees not within its...

  11. An analysis of the 70-meter antenna hydrostatic bearing by means of computer simulation

    NASA Technical Reports Server (NTRS)

    Bartos, R. D.

    1993-01-01

    Recently, the computer program 'A Computer Solution for Hydrostatic Bearings with Variable Film Thickness,' used to design the hydrostatic bearing of the 70-meter antennas, was modified to improve the accuracy with which the program predicts the film height profile and oil pressure distribution between the hydrostatic bearing pad and the runner. This article presents a description of the modified computer program, the theory upon which the computer program computations are based, computer simulation results, and a discussion of the computer simulation results.

  12. Application Portable Parallel Library

    NASA Technical Reports Server (NTRS)

    Cole, Gary L.; Blech, Richard A.; Quealy, Angela; Townsend, Scott

    1995-01-01

    Application Portable Parallel Library (APPL) computer program is subroutine-based message-passing software library intended to provide consistent interface to variety of multiprocessor computers on market today. Minimizes effort needed to move application program from one computer to another. User develops application program once and then easily moves application program from parallel computer on which created to another parallel computer. ("Parallel computer" also includes heterogeneous collections of networked computers.) Written in C language with one FORTRAN 77 subroutine for UNIX-based computers and callable from application programs written in C language or FORTRAN 77.
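
    APPL's actual interface is not given in this summary; the sketch below illustrates only the general idea of a portable message-passing layer (all names are hypothetical, with an in-process queue standing in for the transport): application code targets one send/receive interface, so only the layer underneath changes when the program moves between machines.

```python
import queue

class Channel:
    """In-process stand-in for one node-to-node transport."""
    def __init__(self):
        self.q = queue.Queue()
    def send(self, msg):
        self.q.put(msg)
    def receive(self):
        return self.q.get()

class MessageLibrary:
    """Uniform interface; only this class would change per machine."""
    def __init__(self, n_nodes):
        # A real library would select shared memory, sockets, or a vendor
        # transport here; the application never sees the difference.
        self.channels = [Channel() for _ in range(n_nodes)]
    def send(self, dest, msg):
        self.channels[dest].send(msg)
    def receive(self, node):
        return self.channels[node].receive()

lib = MessageLibrary(2)
lib.send(1, "hello from node 0")
assert lib.receive(1) == "hello from node 0"
```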

  13. Quantum simulations with noisy quantum computers

    NASA Astrophysics Data System (ADS)

    Gambetta, Jay

    Quantum computing is a new computational paradigm that is expected to lie beyond the standard model of computation. This implies a quantum computer can solve problems that can't be solved by a conventional computer with tractable overhead. To fully harness this power we need a universal fault-tolerant quantum computer. However the overhead in building such a machine is high and a full solution appears to be many years away. Nevertheless, we believe that we can build machines in the near term that cannot be emulated by a conventional computer. It is then interesting to ask what these can be used for. In this talk we will present our advances in simulating complex quantum systems with noisy quantum computers. We will show experimental implementations of this on some small quantum computers.

  14. Computer literacy among first year medical students in a developing country: a cross sectional study.

    PubMed

    Ranasinghe, Priyanga; Wickramasinghe, Sashimali A; Pieris, Wa Rasanga; Karunathilake, Indika; Constantine, Godwin R

    2012-09-14

    The use of computer-assisted learning (CAL) has enhanced undergraduate medical education. CAL improves performance at examinations, develops problem-solving skills, and increases student satisfaction. This study evaluates computer literacy among first-year medical students in Sri Lanka. The study was conducted at the Faculty of Medicine, University of Colombo, Sri Lanka, between August and September 2008. First-year medical students (n = 190) were invited for the study. Data on computer literacy and associated factors were collected by an expert-validated, pre-tested, self-administered questionnaire. Computer literacy was evaluated by testing knowledge in six domains: common software packages, operating systems, database management, and the usage of the internet and e-mail. A linear regression was conducted using the total score for computer literacy as the continuous dependent variable and the other factors as independent covariates. The sample size was 181 (response rate 95.3%); 49.7% were male. The majority of students (77.3%) owned a computer (males 74.4%, females 80.2%). Students had gained their present computer knowledge through a formal training programme (64.1%), self-learning (63.0%), or peer learning (49.2%). Students used computers predominantly for word processing (95.6%), entertainment (95.0%), web browsing (80.1%), and preparing presentations (76.8%). The majority of students (75.7%) expressed willingness to attend a formal computer training programme at the faculty. The mean score on the computer literacy questionnaire was 48.4 ± 20.3, with no significant gender difference (males 47.8 ± 21.1, females 48.9 ± 19.6); 47.9% of students scored below 50%. Students from Colombo district or the Western Province and students owning a computer had significantly higher mean scores than other students (p < 0.001). 
In the linear regression analysis, formal computer training was the strongest predictor of computer literacy (β = 13.034), followed by using internet facility, being from Western province, using computers for Web browsing and computer programming, computer ownership and doing IT (Information Technology) as a subject in GCE (A/L) examination. Sri Lankan medical undergraduates had a low-intermediate level of computer literacy. There is a need to improve computer literacy, by increasing computer training in schools, or by introducing computer training in the initial stages of the undergraduate programme. These two options require improvement in infrastructure and other resources.
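
    A regression of this shape can be sketched on synthetic data (the coefficients, predictors, and data below are illustrative, not the study's):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 181  # matches the study's respondent count
training = rng.integers(0, 2, n)   # 1 = had formal computer training
owns_pc = rng.integers(0, 2, n)    # 1 = owns a computer
# Synthetic literacy scores with built-in effects of 13 and 6 points.
score = 35 + 13 * training + 6 * owns_pc + rng.normal(0, 10, n)

# Ordinary least squares recovers the built-in coefficients (approximately).
X = np.column_stack([np.ones(n), training, owns_pc])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print("intercept, training, ownership:", np.round(beta, 1))
```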

  15. Perspectives on Emerging/Novel Computing Paradigms and Future Aerospace Workforce Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2003-01-01

    The accelerating pace of the computing technology development shows no signs of abating. Computing power reaching 100 Tflop/s is likely to be reached by 2004 and Pflop/s (10(exp 15) Flop/s) by 2007. The fundamental physical limits of computation, including information storage limits, communication limits and computation rate limits will likely be reached by the middle of the present millennium. To overcome these limits, novel technologies and new computing paradigms will be developed. An attempt is made in this overview to put the diverse activities related to new computing-paradigms in perspective and to set the stage for the succeeding presentations. The presentation is divided into five parts. In the first part, a brief historical account is given of development of computer and networking technologies. The second part provides brief overviews of the three emerging computing paradigms grid, ubiquitous and autonomic computing. The third part lists future computing alternatives and the characteristics of future computing environment. The fourth part describes future aerospace workforce research, learning and design environments. The fifth part lists the objectives of the workshop and some of the sources of information on future computing paradigms.

  16. Flow Ambiguity: A Path Towards Classically Driven Blind Quantum Computation

    NASA Astrophysics Data System (ADS)

    Mantri, Atul; Demarie, Tommaso F.; Menicucci, Nicolas C.; Fitzsimons, Joseph F.

    2017-07-01

    Blind quantum computation protocols allow a user to delegate a computation to a remote quantum computer in such a way that the privacy of their computation is preserved, even from the device implementing the computation. To date, such protocols are only known for settings involving at least two quantum devices: either a user with some quantum capabilities and a remote quantum server or two or more entangled but noncommunicating servers. In this work, we take the first step towards the construction of a blind quantum computing protocol with a completely classical client and single quantum server. Specifically, we show how a classical client can exploit the ambiguity in the flow of information in measurement-based quantum computing to construct a protocol for hiding critical aspects of a computation delegated to a remote quantum computer. This ambiguity arises due to the fact that, for a fixed graph, there exist multiple choices of the input and output vertex sets that result in deterministic measurement patterns consistent with the same fixed total ordering of vertices. This allows a classical user, computing only measurement angles, to drive a measurement-based computation performed on a remote device while hiding critical aspects of the computation.
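
    The mechanism the authors exploit, a classically chosen measurement angle driving a unitary on a cluster state, can be checked numerically in a two-qubit toy case. This sketch is not the protocol itself, only the standard measurement-based primitive it builds on; one common MBQC convention is assumed, under which measuring in the basis (|0> ± e^{i*theta}|1>)/sqrt(2) with outcome 0 implements H Rz(-theta).

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
def Rz(t):
    return np.diag([np.exp(-1j * t / 2), np.exp(1j * t / 2)])

CZ = np.diag([1, 1, 1, -1]).astype(complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)

theta = 0.7
psi = np.array([0.6, 0.8j])                 # arbitrary single-qubit input
state = CZ @ np.kron(psi, plus)             # two-qubit cluster state

# Measure qubit 1 in the basis (|0> + e^{i*theta}|1>)/sqrt(2), outcome 0:
bra = np.array([1, np.exp(-1j * theta)]) / np.sqrt(2)
out = np.kron(bra, np.eye(2)) @ state       # post-measurement qubit-2 state
out /= np.linalg.norm(out)

# The classically chosen angle has driven a unitary on qubit 2:
expected = H @ Rz(-theta) @ psi
assert np.isclose(abs(np.vdot(expected, out)), 1.0)  # equal up to phase
```

Since only the angle is chosen classically, a classical client can steer such a computation; the paper's contribution is hiding which vertices act as inputs and outputs while doing so.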

  17. Low latency, high bandwidth data communications between compute nodes in a parallel computer

    DOEpatents

    Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.

    2010-11-02

    Methods, parallel computers, and computer program products are disclosed for low latency, high bandwidth data communications between compute nodes in a parallel computer. Embodiments include receiving, by an origin direct memory access (`DMA`) engine of an origin compute node, data for transfer to a target compute node; sending, by the origin DMA engine of the origin compute node to a target DMA engine on the target compute node, a request to send (`RTS`) message; transferring, by the origin DMA engine, a predetermined portion of the data to the target compute node using a memory FIFO operation; determining, by the origin DMA engine, whether an acknowledgement of the RTS message has been received from the target DMA engine; if an acknowledgement of the RTS message has not been received, transferring, by the origin DMA engine, another predetermined portion of the data to the target compute node using a memory FIFO operation; and if the acknowledgement of the RTS message has been received by the origin DMA engine, transferring, by the origin DMA engine, any remaining portion of the data to the target compute node using a direct put operation.
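
    The mode switch described in this abstract, eager memory-FIFO chunks until the RTS is acknowledged and then a single direct put for the remainder, can be sketched as a toy state machine (names and chunk size are hypothetical):

```python
CHUNK = 4  # bytes per memory-FIFO transfer in this toy model

def transfer(data, ack_after_chunks):
    """Return the operations an origin DMA engine would perform."""
    ops = [("rts", len(data))]          # request to send goes out first
    sent = 0
    chunks = 0
    while sent < len(data):
        if chunks >= ack_after_chunks:  # ACK received: switch modes
            ops.append(("direct_put", data[sent:]))
            sent = len(data)
        else:                           # no ACK yet: another FIFO chunk
            ops.append(("fifo", data[sent:sent + CHUNK]))
            sent += CHUNK
            chunks += 1
    return ops

ops = transfer(b"0123456789abcdef", ack_after_chunks=2)
# Two FIFO chunks go out before the ACK arrives, then one direct put.
assert ops == [("rts", 16), ("fifo", b"0123"), ("fifo", b"4567"),
               ("direct_put", b"89abcdef")]
```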

  18. Cloud Computing

    DTIC Science & Technology

    2010-04-29

    Cloud Computing: briefing slides by Bingue and Cook, STSC 2010. The deck opens with the epigraph "The answer, my friend, is blowing in the wind." Its stated objectives are to define the cloud, survey the risks and the essence of cloud computing, and review deployed clouds in DoD, before turning to definitions: "Cloud computing is a model for enabling …"

  19. Analysis on the security of cloud computing

    NASA Astrophysics Data System (ADS)

    He, Zhonglin; He, Yuhua

    2011-02-01

    Cloud computing is a new technology, which is the fusion of computer technology and Internet development. It will lead the revolution of IT and information field. However, in cloud computing data and application software is stored at large data centers, and the management of data and service is not completely trustable, resulting in safety problems, which is the difficult point to improve the quality of cloud service. This paper briefly introduces the concept of cloud computing. Considering the characteristics of cloud computing, it constructs the security architecture of cloud computing. At the same time, with an eye toward the security threats cloud computing faces, several corresponding strategies are provided from the aspect of cloud computing users and service providers.

  20. Lawrence Livermore National Laboratory's Computer Security Short Subjects Videos: Hidden Password, The Incident, Dangerous Games and The Mess; Computer Security Awareness Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    A video on computer security is described. Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL) and Gale Warshawsky, the Coordinator for Computer Security Education and Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced which ranged from 1--3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices.

  1. Computer applications in remote sensing education

    NASA Technical Reports Server (NTRS)

    Danielson, R. L.

    1980-01-01

    Computer applications to instruction in any field may be divided into two broad generic classes: computer-managed instruction and computer-assisted instruction. The division is based on how frequently the computer affects the instructional process and how active a role the computer takes in actually providing instruction. There are no inherent characteristics of remote sensing education to preclude the use of one or both of these techniques, depending on the computer facilities available to the instructor. The characteristics of the two classes are summarized, potential applications to remote sensing education are discussed, and the advantages and disadvantages of computer applications to the instructional process are considered.

  2. Removing the center from computing: biology's new mode of digital knowledge production.

    PubMed

    November, Joseph

    2011-06-01

    This article shows how the USA's National Institutes of Health (NIH) helped to bring about a major shift in the way computers are used to produce knowledge and in the design of computers themselves as a consequence of its early 1960s efforts to introduce information technology to biologists. Starting in 1960 the NIH sought to reform the life sciences by encouraging researchers to make use of digital electronic computers, but despite generous federal support biologists generally did not embrace the new technology. Initially the blame fell on biologists' lack of appropriate (i.e. digital) data for computers to process. However, when the NIH consulted MIT computer architect Wesley Clark about this problem, he argued that the computer's quality as a device that was centralized posed an even greater challenge to potential biologist users than did the computer's need for digital data. Clark convinced the NIH that if the agency hoped to effectively computerize biology, it would need to satisfy biologists' experimental and institutional needs by providing them the means to use a computer without going to a computing center. With NIH support, Clark developed the 1963 Laboratory Instrument Computer (LINC), a small, real-time interactive computer intended to be used inside the laboratory and controlled entirely by its biologist users. Once built, the LINC provided a viable alternative to the 1960s norm of large computers housed in computing centers. As such, the LINC not only became popular among biologists, but also served in later decades as an important precursor of today's computing norm in the sciences and far beyond, the personal computer.

  3. Reconfigurable Computing for Computational Science: A New Focus in High Performance Computing

    DTIC Science & Technology

    2006-11-01

    in the past decade. Researchers are regularly employing the power of large computing systems and parallel processing to tackle larger and more...complex problems in all of the physical sciences. For the past decade or so, most of this growth in computing power has been “free” with increased...the scientific computing community as a means to continued growth in computing capability. This paper offers a glimpse of the hardware and

  4. Computer Assistance in Information Work. Part I: Conceptual Framework for Improving the Computer/User Interface in Information Work. Part II: Catalog of Acceleration, Augmentation, and Delegation Functions in Information Work.

    ERIC Educational Resources Information Center

    Paisley, William; Butler, Matilda

    This study of the computer/user interface investigated the role of the computer in performing information tasks that users now perform without computer assistance. Users' perceptual/cognitive processes are to be accelerated or augmented by the computer; a long term goal is to delegate information tasks entirely to the computer. Cybernetic and…

  5. Human Expertise Helps Computer Classify Images

    NASA Technical Reports Server (NTRS)

    Rorvig, Mark E.

    1991-01-01

    Two-domain method of computational classification of images requires less computation than other methods for computational recognition, matching, or classification of images or patterns. Does not require explicit computational matching of features, and incorporates human expertise without requiring translation of mental processes of classification into language comprehensible to computer. Conceived to "train" computer to analyze photomicrographs of microscope-slide specimens of leucocytes from human peripheral blood to distinguish between specimens from healthy and specimens from traumatized patients.

  6. 29 CFR 541.401 - Computer manufacture and repair.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., the use of computers and computer software programs (e.g., engineers, drafters and others skilled in computer-aided design software), but who are not primarily engaged in computer systems analysis and...

  7. 29 CFR 541.401 - Computer manufacture and repair.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., the use of computers and computer software programs (e.g., engineers, drafters and others skilled in computer-aided design software), but who are not primarily engaged in computer systems analysis and...

  8. 29 CFR 541.401 - Computer manufacture and repair.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., the use of computers and computer software programs (e.g., engineers, drafters and others skilled in computer-aided design software), but who are not primarily engaged in computer systems analysis and...

  9. 29 CFR 541.401 - Computer manufacture and repair.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., the use of computers and computer software programs (e.g., engineers, drafters and others skilled in computer-aided design software), but who are not primarily engaged in computer systems analysis and...

  10. Cost Considerations in Nonlinear Finite-Element Computing

    NASA Technical Reports Server (NTRS)

    Utku, S.; Melosh, R. J.; Islam, M.; Salama, M.

    1985-01-01

    Conference paper discusses computational requirements for finite-element analysis using a quasi-linear approach to nonlinear problems. Paper evaluates the computational efficiency of different computer architectural types in terms of relative cost and computing time.

  11. Computing Literacy in the University of the Future.

    ERIC Educational Resources Information Center

    Gantt, Vernon W.

    In exploring the impact of microcomputers and the future of the university in 1985 and beyond, a distinction should be made between computing literacy--the ability to use a computer--and computer literacy, which goes beyond successful computer use to include knowing how to program in various computer languages and understanding what goes on…

  12. Ubiquitous Versus One-to-One

    ERIC Educational Resources Information Center

    McAnear, Anita

    2006-01-01

    When we planned the editorial calendar with the topic ubiquitous computing, we were thinking of ubiquitous computing as the one-to-one ratio of computers to students and teachers and 24/7 access to electronic resources. At the time, we were aware that ubiquitous computing in the computer science field had more to do with wearable computers. Our…

  13. Democratizing Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  14. Implementing an Affordable High-Performance Computing for Teaching-Oriented Computer Science Curriculum

    ERIC Educational Resources Information Center

    Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu

    2013-01-01

    With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…

  15. Suggested Approaches to the Measurement of Computer Anxiety.

    ERIC Educational Resources Information Center

    Toris, Carol

    Psychologists can gain insight into human behavior by examining what people feel about, know about, and do with, computers. Two extreme reactions to computers are computer phobia, or anxiety, and computer addiction, or "hacking". A four-part questionnaire was developed to measure computer anxiety. The first part is a projective technique which…

  16. Student Perceived Importance and Correlations of Selected Computer Literacy Course Topics

    ERIC Educational Resources Information Center

    Ciampa, Mark

    2013-01-01

    Traditional college-level courses designed to teach computer literacy are in a state of flux. Today's students have high rates of access to computing technology and computer ownership, leading many policy decision makers to conclude that students already are computer literate and thus computer literacy courses are dinosaurs in a modern digital…

  17. The Gender Factor in Computer Anxiety and Interest among Some Australian High School Students.

    ERIC Educational Resources Information Center

    Okebukola, Peter Akinsola

    1993-01-01

    Western Australia eleventh graders (142 boys, 139 girls) were compared on such variables as computers at home, computer classes, experience with computers, and socioeconomic status. Girls had higher anxiety levels, boys higher computer interest. Possible causes included social beliefs about computer use, teacher sex bias, and software (games) more…

  18. Computer Programming Languages and Expertise Needed by Practicing Engineers.

    ERIC Educational Resources Information Center

    Doelling, Irvin

    1980-01-01

    Discussed is the present engineering computer environment of a large aerospace company recognized as a leader in the application and development of computer-aided design and computer-aided manufacturing techniques. A review is given of the exposure spectrum of engineers to the world of computing, the computer languages used, and the career impacts…

  19. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    ERIC Educational Resources Information Center

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  20. Computer-Mediated Communication in a High School: The Users Shape the Medium--Part 1.

    ERIC Educational Resources Information Center

    Bresler, Liora

    1990-01-01

    This field study represents a departure from structured, or directed, computer-mediated communication as used in its natural environment, the computer lab. Using observations, interviews, and the computer medium itself, the investigators report how high school students interact with computers and create their own agendas for computer usage and…

  1. Computer-Game Construction: A Gender-Neutral Attractor to Computing Science

    ERIC Educational Resources Information Center

    Carbonaro, Mike; Szafron, Duane; Cutumisu, Maria; Schaeffer, Jonathan

    2010-01-01

    Enrollment in Computing Science university programs is at a dangerously low level. A major reason for this is the general lack of interest in Computing Science by females. In this paper, we discuss our experience with using a computer game construction environment as a vehicle to encourage female participation in Computing Science. Experiments…

  2. Cloud Computing in Support of Applied Learning: A Baseline Study of Infrastructure Design at Southern Polytechnic State University

    ERIC Educational Resources Information Center

    Conn, Samuel S.; Reichgelt, Han

    2013-01-01

    Cloud computing represents an architecture and paradigm of computing designed to deliver infrastructure, platforms, and software as constructible computing resources on demand to networked users. As campuses are challenged to better accommodate academic needs for applications and computing environments, cloud computing can provide an accommodating…

  3. 78 FR 41873 - Energy Conservation Program for Consumer Products and Certain Commercial and Industrial Equipment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-12

    ... limited to) desktop computers, integrated desktop computers, laptop/notebook/ netbook computers, and... computer, and 65% of U.S. households owning a notebook, laptop, or netbook computer, in 2013.\\4\\ Coverage... recently published studies. In these studies, the average annual energy use for a desktop computer was...

  4. Elementary EFL Teachers' Computer Phobia and Computer Self-Efficacy in Taiwan

    ERIC Educational Resources Information Center

    Chen, Kate Tzuching

    2012-01-01

    The advent and application of computer and information technology has increased the overall success of EFL teaching; however, such success is hard to assess, and teachers prone to computer avoidance face negative consequences. Two major obstacles are high computer phobia and low computer self-efficacy. However, little research has been carried out…

  5. Computer use changes generalization of movement learning.

    PubMed

    Wei, Kunlin; Yan, Xiang; Kong, Gaiqing; Yin, Cong; Zhang, Fan; Wang, Qining; Kording, Konrad Paul

    2014-01-06

    Over the past few decades, one of the most salient lifestyle changes for us has been the use of computers. For many of us, manual interaction with a computer occupies a large portion of our working time. Through neural plasticity, this extensive movement training should change our representation of movements (e.g., [1-3]), just like search engines affect memory [4]. However, how computer use affects motor learning is largely understudied. Additionally, as virtually all participants in studies of perception and actions are computer users, a legitimate question is whether insights from these studies bear the signature of computer-use experience. We compared non-computer users with age- and education-matched computer users in standard motor learning experiments. We found that people learned equally fast but that non-computer users generalized significantly less across space, a difference negated by two weeks of intensive computer training. Our findings suggest that computer-use experience shaped our basic sensorimotor behaviors, and this influence should be considered whenever computer users are recruited as study participants. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. NACA Computer at the Lewis Flight Propulsion Laboratory

    NASA Image and Video Library

    1951-02-21

    A female computer at the National Advisory Committee for Aeronautics (NACA) Lewis Flight Propulsion Laboratory with a slide rule and Friden adding machine to make computations. The computer staff was introduced during World War II to relieve short-handed research engineers of some of the tedious computational work. The Computing Section was staffed by “computers,” young female employees, who often worked overnight when most of the tests were run. The computers obtained test data from the manometers and other instruments, made the initial computations, and plotted the data graphically. Researchers then analyzed the data and summarized the findings in a report or made modifications and ran the test again. There were over 400 female employees at the laboratory in 1944, including 100 computers. The use of computers was originally planned only for the duration of the war. The system was so successful that it was extended into the 1960s. The computers and analysts were located in the Altitude Wind Tunnel Shop and Office Building office wing during the 1940s and transferred to the new 8- by 6-Foot Supersonic Wind Tunnel in 1948.

  7. Experiments in Computing: A Survey

    PubMed Central

    Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general. PMID:24688404

  8. Conjunctival impression cytology in computer users.

    PubMed

    Kumar, S; Bansal, R; Khare, A; Malik, K P S; Malik, V K; Jain, K; Jain, C

    2013-01-01

    It is known that computer users develop the features of dry eye. The aim was to study the cytological changes in the conjunctiva, using conjunctival impression cytology, in computer users and a control group. Fifteen eyes of computer users who had used computers for more than one year and ten eyes of an age- and sex-matched control group (those who had not used computers) were studied by conjunctival impression cytology. Conjunctival impression cytology (CIC) results in the control group were of stage 0 and stage I, while the computer user group showed CIC results between stage II and stage IV. Among the computer users, the majority (>90%) showed stage III and stage IV changes. We found that those who used computers daily for long hours developed more CIC changes than those who worked at the computer for a shorter daily duration. © NEPjOPH.

  9. Experiments in computing: a survey.

    PubMed

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  10. A survey of the computer literacy of undergraduate dental students at a University Dental School in Ireland during the academic year 1997-98.

    PubMed

    Ray, N J; Hannigan, A

    1999-05-01

    As dental practice management becomes more computer-based, the efficient functioning of the dentist will become dependent on adequate computer literacy. A survey has been carried out into the computer literacy of a cohort of 140 undergraduate dental students at a University Dental School in Ireland (years 1-5), in the academic year 1997-98. Aspects investigated by anonymous questionnaire were: (1) keyboard skills; (2) computer skills; (3) access to computer facilities; (4) software competencies and (5) use of medical library computer facilities. The students are relatively unfamiliar with basic computer hardware and software: 51.1% considered their expertise with computers as "poor"; 34.3% had taken a formal typewriting or computer keyboarding course; 7.9% had taken a formal computer course at university level and 67.2% were without access to computer facilities at their term-time residences. A majority of students had never used either word-processing, spreadsheet, or graphics programs. Programs relating to "informatics" were more popular, such as literature searching, accessing the Internet and the use of e-mail which represent the major use of the computers in the medical library. The lack of experience with computers may be addressed by including suitable computing courses at the secondary level (age 13-18 years) and/or tertiary level (FE/HE) education programmes. Such training may promote greater use of generic softwares, particularly in the library, with a more electronic-based approach to data handling.

  11. Human-computer interaction in multitask situations

    NASA Technical Reports Server (NTRS)

    Rouse, W. B.

    1977-01-01

    Human-computer interaction in multitask decisionmaking situations is considered, and it is proposed that humans and computers have overlapping responsibilities. Queueing theory is employed to model this dynamic approach to the allocation of responsibility between human and computer. Results of simulation experiments are used to illustrate the effects of several system variables including number of tasks, mean time between arrivals of action-evoking events, human-computer speed mismatch, probability of computer error, probability of human error, and the level of feedback between human and computer. Current experimental efforts are discussed and the practical issues involved in designing human-computer systems for multitask situations are considered.

  12. Performing a global barrier operation in a parallel computer

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2014-12-09

    Executing computing tasks on a parallel computer that includes compute nodes coupled for data communications, where each compute node executes tasks, with one task on each compute node designated as a master task, including: for each task on each compute node until all master tasks have joined a global barrier: determining whether the task is a master task; if the task is not a master task, joining a single local barrier; if the task is a master task, joining the global barrier and the single local barrier only after all other tasks on the compute node have joined the single local barrier.
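
    The two-level scheme described above can be sketched in plain Python threads: non-master tasks park at a node-local barrier, and each node's master joins the cross-node global barrier only once all of its local peers are waiting. This is an illustrative sketch of the idea, not the patented implementation; the node/task counts and names are invented for the example.

```python
import threading

N_NODES, TASKS_PER_NODE = 3, 4
events = []                     # records which masters passed the global barrier
lock = threading.Lock()

# one global barrier shared by the master task of every "compute node"
global_barrier = threading.Barrier(N_NODES)

def run_node(node_id):
    # single local barrier shared by all tasks on this node
    local = threading.Barrier(TASKS_PER_NODE)

    def task(task_id):
        if task_id != 0:
            local.wait()                      # non-master: just join the local barrier
        else:
            # master: wait until every other task on this node is parked locally
            while local.n_waiting < TASKS_PER_NODE - 1:
                pass
            global_barrier.wait()             # then join the global barrier
            local.wait()                      # and release the node-local barrier
            with lock:
                events.append(node_id)

    threads = [threading.Thread(target=task, args=(t,))
               for t in range(TASKS_PER_NODE)]
    for t in threads: t.start()
    for t in threads: t.join()

nodes = [threading.Thread(target=run_node, args=(n,)) for n in range(N_NODES)]
for t in nodes: t.start()
for t in nodes: t.join()
print(sorted(events))  # [0, 1, 2]
```

    The point of the design is that only one task per node participates in the expensive cross-node barrier, while intra-node synchronization stays cheap and local.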

  13. The CP-PACS parallel computer

    NASA Astrophysics Data System (ADS)

    Ukawa, Akira

    1998-05-01

    The CP-PACS computer is a massively parallel computer consisting of 2048 processing units and having a peak speed of 614 GFLOPS and 128 GByte of main memory. It was developed over the four years from 1992 to 1996 at the Center for Computational Physics, University of Tsukuba, for large-scale numerical simulations in computational physics, especially those of lattice QCD. The CP-PACS computer has been in full operation for physics computations since October 1996. In this article we describe the chronology of the development, the hardware and software characteristics of the computer, and its performance for lattice QCD simulations.

  14. Advanced flight computers for planetary exploration

    NASA Technical Reports Server (NTRS)

    Stephenson, R. Rhoads

    1988-01-01

    Research concerning flight computers for use on interplanetary probes is reviewed. The history of these computers from the Viking mission to the present is outlined. The differences between ground commercial computers and computers for planetary exploration are listed. The development of a computer for the Mariner Mark II comet rendezvous asteroid flyby mission is described. Various aspects of recently developed computer systems are examined, including the Max real time, embedded computer, a hypercube distributed supercomputer, a SAR data processor, a processor for the High Resolution IR Imaging Spectrometer, and a robotic vision multiresolution pyramid machine for processing images obtained by a Mars Rover.

  15. The role of dedicated data computing centers in the age of cloud computing

    NASA Astrophysics Data System (ADS)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost-benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.

  16. Does attitude matter in computer use in Australian general practice? A zero-inflated Poisson regression analysis.

    PubMed

    Khan, Asaduzzaman; Western, Mark

    The purpose of this study was to explore factors that facilitate or hinder effective use of computers in Australian general medical practice. This study is based on data extracted from a national telephone survey of 480 general practitioners (GPs) across Australia. Clinical functions performed by GPs using computers were examined using zero-inflated Poisson (ZIP) regression modelling. About 17% of GPs were not using a computer for any clinical function, while 18% reported using computers for all clinical functions. The ZIP model showed that computer anxiety was negatively associated with effective computer use, while practitioners' belief about the usefulness of computers was positively associated with effective computer use. Being a female GP or working in a partnership or group practice increased the odds of effectively using computers for clinical functions. To fully capitalise on the benefits of computer technology, GPs need to be convinced that this technology is useful and can make a difference.
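
    A zero-inflated Poisson model captures exactly the pattern in this abstract: an excess of "structural" zeros (GPs who use no clinical functions at all) mixed with ordinary Poisson counts. A minimal sketch of the ZIP probability mass function, with the 17% zero share and a mean count chosen purely for illustration (not the paper's fitted parameters):

```python
import math

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson mass: with probability `pi` the count is a
    structural zero; otherwise it is Poisson(lam). Illustrative values only."""
    poisson = math.exp(-lam) * lam**k / math.factorial(k)
    if k == 0:
        return pi + (1 - pi) * poisson   # structural zeros + Poisson zeros
    return (1 - pi) * poisson

# e.g. a 17% structural-zero share and a mean of 3 clinical functions otherwise
print(round(zip_pmf(0, 3.0, 0.17), 4))   # 0.2113
```

    The zero probability exceeds the plain Poisson value, which is what motivates ZIP over ordinary Poisson regression for data like these.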

  17. Administering truncated receive functions in a parallel messaging interface

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2014-12-09

    Administering truncated receive functions in a parallel messaging interface (`PMI`) of a parallel computer comprising a plurality of compute nodes coupled for data communications through the PMI and through a data communications network, including: sending, through the PMI on a source compute node, a quantity of data from the source compute node to a destination compute node; specifying, by an application on the destination compute node, a portion of the quantity of data to be received by the application on the destination compute node and a portion of the quantity of data to be discarded; receiving, by the PMI on the destination compute node, all of the quantity of data; providing, by the PMI on the destination compute node to the application on the destination compute node, only the portion of the quantity of data to be received by the application; and discarding, by the PMI on the destination compute node, the portion of the quantity of data to be discarded.
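
    The receive path described above (receive the full quantity, deliver one portion, discard the rest) can be sketched in a few lines. The function name and byte-slicing interface here are invented for illustration and are not the real PMI API:

```python
def truncated_recv(data: bytes, keep: int):
    """Receive all of `data`, deliver only the first `keep` bytes to the
    application, and report how many bytes were discarded."""
    received = bytes(data)            # the PMI always receives the full quantity
    delivered = received[:keep]       # portion handed to the application
    discarded = len(received) - len(delivered)
    return delivered, discarded

payload = b"0123456789"
kept, dropped = truncated_recv(payload, 4)
print(kept, dropped)  # b'0123' 6
```

    The design choice is that truncation happens inside the messaging layer, so the sender never needs to know how much of the message the receiver actually wants.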

  18. Hardware based redundant multi-threading inside a GPU for improved reliability

    DOEpatents

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-05-05

    A system and method for verifying computation output using computer hardware are provided. Instances of computation are generated and processed on hardware-based processors. As instances of computation are processed, each instance of computation receives a load accessible to other instances of computation. Instances of output are generated by processing the instances of computation. The instances of output are verified against each other in a hardware based processor to ensure accuracy of the output.
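
    The redundant-execution idea generalizes beyond GPUs: run identical instances of a computation on the same input and flag a fault if their outputs disagree. A minimal software sketch of that verification pattern (names and replica count are illustrative, not the patented hardware mechanism):

```python
from concurrent.futures import ThreadPoolExecutor

def compute(load):
    # the guarded computation; every instance receives the same load
    return sum(x * x for x in load)

def verified(load, replicas=2):
    """Run `replicas` identical instances and check their outputs agree."""
    with ThreadPoolExecutor(max_workers=replicas) as pool:
        outputs = list(pool.map(lambda _: compute(load), range(replicas)))
    if len(set(outputs)) != 1:
        raise RuntimeError("redundant instances disagree")
    return outputs[0]

print(verified([1, 2, 3]))  # 14
```

    In hardware the comparison is done by the processor itself, which avoids the obvious cost of this software version: every computation runs more than once.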

  19. Study on the application of mobile internet cloud computing platform

    NASA Astrophysics Data System (ADS)

    Gong, Songchun; Fu, Songyin; Chen, Zheng

    2012-04-01

    The innovative development of computer technology has promoted the application of the cloud computing platform, which is in essence a substitution and exchange of resource service models, meeting users' needs for different resources after adjustments in multiple aspects. Cloud computing offers advantages in many respects: it not only reduces the difficulty of operating the system but also makes it easy for users to search, acquire, and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in operation. The popularization and promotion of computer technology have driven the creation of digital library models, whose core idea is to strengthen the management of library resource information through computers and to construct a high-performance inquiry and search platform, allowing users to access the necessary information resources at any time. Cloud computing, meanwhile, distributes computations across a large number of distributed computers and thereby implements a connected service of multiple computers. Digital libraries, as a typical representative of cloud computing applications, can be used to carry out an analysis of the key technologies of cloud computing.

  20. Computers and Children: Problems and Possibilities.

    ERIC Educational Resources Information Center

    Siegfried, Pat

    1983-01-01

    Discusses the use of computers by children, highlighting a definition of computer literacy, computer education in schools, computer software, microcomputers, programming languages, and public library involvement. Seven references and a 40-item bibliography are included. (EJS)
