DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerber, Richard; Hack, James; Riley, Katherine
The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. Achieving these goals in today’s world requires investments not only in the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR’s mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for SC science programs whose researchers need to use high-performance computing and data systems effectively. Numerous significant modifications to today’s tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each. Participants at the reviews were drawn from the communities of leading domain scientists, experts in computer science and applied mathematics, ASCR facility staff, and DOE program managers in ASCR and the respective program offices. The purpose of these reviews was to identify mission-critical scientific problems within the DOE Office of Science (including experimental facilities) and determine the requirements for the exascale ecosystem needed to address those challenges. The exascale ecosystem includes exascale computing systems, high-end data capabilities, efficient software at scale, libraries, tools, and other capabilities. This effort will contribute to the development of a strategic roadmap for ASCR compute and data facility investments and will help the ASCR Facility Division establish partnerships with Office of Science stakeholders. It will also inform the Office of Science research needs and agenda. The results of the six reviews have been published in reports available on the web at http://exascaleage.org/. This report presents a summary of the individual reports and of common and crosscutting findings, and it identifies opportunities for productive collaborations among the DOE SC program offices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, Salman; Roser, Robert; Gerber, Richard
The U.S. Department of Energy (DOE) Office of Science (SC) Offices of High Energy Physics (HEP) and Advanced Scientific Computing Research (ASCR) convened a programmatic Exascale Requirements Review on June 10–12, 2015, in Bethesda, Maryland. This report summarizes the findings, results, and recommendations derived from that meeting. The high-level findings and observations are as follows. Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of the demand at the 2025 timescale is at least two orders of magnitude — and in some cases greater — than that available currently. The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. Data rates and volumes from experimental facilities are also straining the current HEP infrastructure in its ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. A close integration of high-performance computing (HPC) simulation and data analysis will greatly aid in interpreting the results of HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. Long-range planning between HEP and ASCR will be required to meet HEP’s research needs. To best use ASCR HPC resources, the experimental HEP program needs (1) an established, long-term plan for access to ASCR computational and data resources, (2) the ability to map workflows to HPC resources, (3) the ability for ASCR facilities to accommodate workflows run by collaborations potentially comprising thousands of individual members, (4) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, and (5) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems.
ASCR/HEP Exascale Requirements Review Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, Salman; et al.
2016-03-30
This draft report summarizes and details the findings, results, and recommendations derived from the ASCR/HEP Exascale Requirements Review meeting held in June 2015. The main conclusions are as follows. 1) Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of the demand at the 2025 timescale is at least two orders of magnitude -- and in some cases greater -- than that available currently. 2) The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. 3) Data rates and volumes from HEP experimental facilities are also straining the ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. 4) A close integration of HPC simulation and data analysis will greatly aid in interpreting results from HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. 5) Long-range planning between HEP and ASCR will be required to meet HEP's research needs. To best use ASCR HPC resources, the experimental HEP program needs a) an established long-term plan for access to ASCR computational and data resources, b) the ability to map workflows onto HPC resources, c) the ability for ASCR facilities to accommodate workflows run by collaborations that can have thousands of individual members, d) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, and e) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bacon, Charles; Bell, Greg; Canon, Shane
The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section and are described in more detail in the body of the report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Almgren, Ann; DeMar, Phil; Vetter, Jeffrey
The widespread use of computing in the American economy would not be possible without a thoughtful, exploratory research and development (R&D) community pushing the performance edge of operating systems, computer languages, and software libraries. These are the tools and building blocks — the hammers, chisels, bricks, and mortar — of the smartphone, the cloud, and the computing services on which we rely. Engineers and scientists need ever-more specialized computing tools to discover new material properties for manufacturing, make energy generation safer and more efficient, and provide insight into the fundamentals of the universe, for example. The research division of the U.S. Department of Energy’s (DOE’s) Office of Advanced Scientific Computing Research (ASCR Research) ensures that these tools and building blocks are being developed and honed to meet the extreme needs of modern science. See also http://exascaleage.org/ascr/ for additional information.
ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peisert, Sean; Potok, Thomas E.; Jones, Todd
At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long-term (10- to 20+-year) fundamental cybersecurity research and development challenges, strategies, and roadmaps facing future high-performance computing (HPC) systems, networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher-level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that take place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three topics and a representative of each of the four major DOE Office of Science Advanced Scientific Computing Research facilities: the Argonne Leadership Computing Facility (ALCF), the Energy Sciences Network (ESnet), the National Energy Research Scientific Computing Center (NERSC), and the Oak Ridge Leadership Computing Facility (OLCF). The rest of the workshop consisted of topical breakout discussions and focused writing periods that produced much of this report.
76 FR 45786 - Advanced Scientific Computing Advisory Committee; Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-01
... updates. EU Data Initiative. HPC & EERE Wind Program. Early Career Research on Energy Efficient Interconnect for Exascale Computing. Separating Algorithm and Implementation. Update on ASCR exascale planning...
ASCR Workshop on Quantum Computing for Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aspuru-Guzik, Alan; Van Dam, Wim; Farhi, Edward
This report details the findings of the DOE ASCR Workshop on Quantum Computing for Science, which was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE’s science and energy mission, and to identify the potential impact of quantum technologies. The workshop was held on February 17-18, 2015, in Bethesda, MD, to solicit input from members of the quantum computing community. The workshop considered models of quantum computation and programming environments, physical science applications relevant to DOE's science mission as well as quantum simulation, and applied mathematics topics including potential quantum algorithms for linear algebra, graph theory, and machine learning. This report summarizes these perspectives into an outlook on the opportunities for quantum computing to impact problems relevant to the DOE’s mission, as well as the additional research required to bring quantum computing to the point where it can have such impact.
NASA Astrophysics Data System (ADS)
Strayer, Michael
2007-09-01
Good morning. Welcome to Boston, the home of the Red Sox, Celtics and Bruins, baked beans, tea parties, Robert Parker, and SciDAC 2007. A year ago I stood before you to share the legacy of the first SciDAC program and identify the challenges that we must address on the road to petascale computing—a road E. E. Cummings described as `. . . never traveled, gladly beyond any experience.' Today, I want to explore the preparations for the rapidly approaching extreme scale (X-scale) generation. These preparations are the first step propelling us along the road of burgeoning scientific discovery enabled by the application of X-scale computing. We look to petascale computing and beyond to open up a world of discovery that cuts across scientific fields and leads us to a greater understanding of not only our world, but our universe. As part of the President's American Competitiveness Initiative, the ASCR Office has been preparing a ten-year vision for computing. As part of this planning, LBNL, together with ORNL and ANL, hosted three town hall meetings on Simulation and Modeling at the Exascale for Energy, Ecological Sustainability and Global Security (E3). The proposed E3 initiative is organized around four programmatic themes: engaging our top scientists, engineers, computer scientists and applied mathematicians; investing in pioneering large-scale science; developing scalable analysis algorithms and storage architectures to accelerate discovery; and accelerating the build-out and future development of the DOE open computing facilities. It is clear that we have only just started down the path to extreme scale computing. Plan to attend Thursday's session on the out-briefing and discussion of these meetings. The road to the petascale has been at best rocky. In FY07, the continuing resolution provided 12% less money for Advanced Scientific Computing than the President, the Senate, or the House had proposed. As a consequence, many of you had to absorb a no-cost extension for your SciDAC work. I am pleased that the President's FY08 budget restores the funding for SciDAC. Quoting from the Advanced Scientific Computing Research description in the House Energy and Water Development Appropriations Bill for FY08, "Perhaps no other area of research at the Department is so critical to sustaining U.S. leadership in science and technology, revolutionizing the way science is done and improving research productivity." As a society we need to revolutionize our approaches to energy, environmental and global security challenges. As we go forward along the road to the X-scale generation, the use of computation will continue to be a critical tool, along with theory and experiment, for understanding the behavior of the fundamental components of nature as well as for fundamental discovery and exploration of the behavior of complex systems. The foundation to overcome these societal challenges will build from the experiences and knowledge gained as you, members of our SciDAC research teams, work together to attack problems at the tera- and peta-scale. If SciDAC is viewed as an experiment for revolutionizing scientific methodology, then a strategic goal of the ASCR program must be to broaden the intellectual base prepared to address the challenges of the new X-scale generation of computing. We must focus our computational science experiences gained over the past five years on the opportunities introduced with extreme scale computing. Our facilities are on a path to provide the resources needed to undertake the first part of our journey.
Using the newly upgraded 119-teraflop Cray XT system at the Leadership Computing Facility, SciDAC research teams have in three days performed a 100-year study of the time evolution of the atmospheric CO2 concentration originating from the land surface. The simulation of the El Niño/Southern Oscillation that was part of this study has been characterized as `the most impressive new result in ten years.' Other teams gained new insight into the behavior of superheated ionic gas in the ITER reactor as a result of an AORSA run on 22,500 processors that achieved over 87 trillion calculations per second (87 teraflops), 74% of the system's theoretical peak. Tomorrow, Argonne and IBM will announce that the first IBM Blue Gene/P, a 100-teraflop system, will be shipped to the Argonne Leadership Computing Facility later this fiscal year. By the end of FY2007, ASCR high performance and leadership computing resources will include the 114-teraflop IBM Blue Gene/P, a 102-teraflop Cray XT4 at NERSC, and a 119-teraflop Cray XT system at Oak Ridge. Before ringing in the New Year, Oak Ridge will upgrade to 250 teraflops by replacing its dual-core processors with quad-core processors, Argonne will upgrade to between 250 and 500 teraflops, and next year a petascale Cray Baker system is scheduled for delivery at Oak Ridge. The multidisciplinary teams in our SciDAC Centers for Enabling Technologies and our SciDAC Institutes must continue to work with our Scientific Application teams to overcome the barriers that prevent effective use of these new systems. These challenges include: the need for new algorithms as well as operating system and runtime software and tools that scale to parallel systems composed of hundreds of thousands of processors; program development environments and tools that scale effectively and provide ease of use for developers and scientific end users; and visualization and data management systems that support moving, storing, analyzing, manipulating and visualizing multi-petabytes of scientific data and objects. The SciDAC Centers, located primarily at our DOE national laboratories, will take the lead in ensuring that critical computer science and applied mathematics issues are addressed in a timely and comprehensive fashion, and will address issues associated with the research software lifecycle. In contrast, the SciDAC Institutes, which are university-led centers of excellence, will have more flexibility to pursue new research topics through a range of research collaborations. The Institutes will also work to broaden the intellectual and researcher base—conducting short courses and summer schools to take advantage of new high performance computing capabilities. The SciDAC Outreach Center at Lawrence Berkeley National Laboratory complements the outreach efforts of the SciDAC Institutes. The Outreach Center is our clearinghouse for SciDAC activities and resources and will communicate with the high performance computing community in part to understand their needs for workshops, summer schools and institutes. SciDAC is not ASCR's only effort to broaden the computational science community needed to meet the challenges of the new X-scale generation. I hope that you were able to attend the Computational Science Graduate Fellowship poster session last night. ASCR developed the fellowship in 1991 to meet the nation's growing need for scientists and technology professionals with advanced computer skills. CSGF, now jointly funded by ASCR and NNSA, is more than a traditional academic fellowship.
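A quick check of the percent-of-peak figure quoted above: the percentage is simply the sustained rate divided by the machine's theoretical peak. A minimal sketch using only the numbers given in the talk (variable names are illustrative):

```python
# Percent-of-peak arithmetic for the AORSA run described above.
# "Over 87 teraflops" sustained on the 119-teraflop Cray XT system;
# a bit over 87 rounds to the quoted 74% of peak.
sustained_tflops = 87.0
peak_tflops = 119.0
print(f"{sustained_tflops / peak_tflops:.1%} of theoretical peak")  # 73.1%
```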
It has provided more than 200 of the best and brightest graduate students with guidance, support and community in preparing them as computational scientists. Today CSGF alumni are bringing their diverse top-level skills and knowledge to research teams at DOE laboratories and in industries such as Procter & Gamble, Lockheed Martin and Intel. At universities they are working to train the next generation of computational scientists. To build on this success, we intend to develop a wholly new Early Career Principal Investigator (ECPI) program. Our objective is to stimulate academic research in scientific areas within ASCR's purview, especially among faculty in the early stages of their academic careers. Last February, we lost Ken Kennedy, one of the leading lights of our community. As we move forward into the extreme computing generation, his vision and insight will be greatly missed. In memory of Ken Kennedy, we shall designate the ECPI grants to beginning faculty in Computer Science as the Ken Kennedy Fellowship. Watch the ASCR website for more information about ECPI and other early career programs in the computational sciences. We look to you, our scientists, researchers, and visionaries to take X-scale computing and use it to explode scientific discovery in your fields. We at SciDAC will work to ensure that this tool is the sharpest, most precise, and most efficient instrument to carve away the unknown and reveal the most exciting secrets and stimulating scientific discoveries of our time. The partnership between research and computing is the marriage that will spur greater discovery, and as Spenser said to Susan in Robert Parker's novel `Sudden Mischief', `We stick together long enough, and we may get as smart as hell'. Michael Strayer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sprague, Michael A.; Boldyrev, Stanislav; Fischer, Paul
This report details the impact that exascale computing will have on turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the DOE applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.
High Performance Computing and Storage Requirements for Nuclear Physics: Target 2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerber, Richard; Wasserman, Harvey
2014-04-30
In April 2014, NERSC, ASCR, and the DOE Office of Nuclear Physics (NP) held a review to characterize high performance computing (HPC) and storage requirements for NP research through 2017. This review is the 12th in a series of reviews held by NERSC and the Office of Science program offices that began in 2009. It is the second for NP, and the final review in the second round that covered the six Office of Science program offices. This report is the result of that review.
77 FR 12823 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-02
... Exascale ARRA projects--Magellan final report, Advanced Networking update Status from Computer Science COV Early Career technical talks Summary of Applied Math and Computer Science Workshops ASCR's new SBIR..., Office of Science. ACTION: Notice of Open Meeting. SUMMARY: This notice announces a meeting of the...
Research Data Acquired in World-Class, 60-atm Subsonic Combustion Rig
NASA Technical Reports Server (NTRS)
Lee, Chi-Ming; Wey, Changlie
1999-01-01
NASA Lewis Research Center's new, world-class, 60-atmosphere (atm) combustor research facility, the Advanced Subsonic Combustion Rig (ASCR), is in operation and producing unique research data. Specifically, data were acquired at high pressures and temperatures representative of future subsonic engines from a fundamental flametube configuration with an advanced fuel injector. The data acquired include exhaust emissions as well as pressure and temperature distributions. Results to date represent an improved understanding of nitrogen oxide (NOx) formation at high pressures and temperatures and include an NOx emissions reduction greater than 70 percent with an advanced fuel injector at operating pressures to 800 pounds per square inch absolute (psia). ASCR research is an integral part of the Advanced Subsonic Technology (AST) Propulsion Program. This program is developing critical low-emission combustion technology that will result in the next generation of gas turbine engines producing 50 to 70 percent less NOx emissions in comparison to 1996 International Civil Aviation Organization (ICAO) limits. The results to date indicate that the AST low-emission combustor goals of reducing NOx emissions by 50 to 70 percent are feasible. U.S. gas turbine manufacturers have started testing the low-emission combustors at the ASCR. This collaborative testing will enable the industry to develop low-emission combustors at the high pressure and temperature conditions of future subsonic engines. The first stage of the flametube testing has been implemented. Four GE Aircraft Engines low-emissions fuel injector concepts, three Pratt & Whitney concepts, and two Allison concepts have been tested at the Lewis ASCR facility. Subsequently, the flametube was removed from the test stand, and the sector combustor was installed. Testing of the low-emissions sector has begun. Low-emission combustors developed as a result of ASCR research will enable U.S. engine manufacturers to compete on a worldwide basis by producing environmentally acceptable commercial engines.
Data and Communications in Basic Energy Sciences: Creating a Pathway for Scientific Discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nugent, Peter E.; Simonson, J. Michael
2011-10-24
This report is based on the Department of Energy (DOE) Workshop on “Data and Communications in Basic Energy Sciences: Creating a Pathway for Scientific Discovery” that was held at the Bethesda Marriott in Maryland on October 24-25, 2011. The workshop brought together leading researchers from the Basic Energy Sciences (BES) facilities and Advanced Scientific Computing Research (ASCR). The workshop was co-sponsored by these two Offices to identify opportunities and needs for data analysis, ownership, storage, mining, provenance and data transfer at light sources, neutron sources, microscopy centers and other facilities. Their charge was to identify current and anticipated issues in the acquisition, analysis, communication and storage of experimental data that could impact the progress of scientific discovery, ascertain what knowledge, methods and tools are needed to mitigate present and projected shortcomings, and create the foundation for information exchanges and collaboration between ASCR and BES supported researchers and facilities. The workshop was organized in the context of the impending data tsunami that will be produced by DOE’s BES facilities. Current facilities, like SLAC National Accelerator Laboratory’s Linac Coherent Light Source, can produce up to 18 terabytes (TB) per day, while upgraded detectors at Lawrence Berkeley National Laboratory’s Advanced Light Source will generate ~10 TB per hour. The expectation is that these rates will increase by over an order of magnitude in the coming decade. The urgency to develop new strategies and methods in order to stay ahead of this deluge and extract the most science from these facilities was recognized by all. The four focus areas addressed in this workshop were: Workflow Management - Experiment to Science: identifying and managing the data path from experiment to publication. Theory and Algorithms: recognizing the need for new tools for computation at scale, supporting large data sets and realistic theoretical models. Visualization and Analysis: supporting near-real-time feedback for experiment optimization and new ways to extract and communicate critical information from large data sets. Data Processing and Management: outlining needs in computational and communication approaches and infrastructure needed to handle unprecedented data volume and information content. It should be noted that almost all participants recognized that there were unlikely to be any turn-key solutions available due to the unique, diverse nature of the BES community, where research at adjacent beamlines at a given light source facility often spans everything from biology to materials science to chemistry using scattering, imaging and/or spectroscopy. However, it was also noted that advances supported by other programs in data research, methodologies, and tool development could be implemented on reasonable time scales with modest effort. Adapting available standard file formats, robust workflows, and in-situ analysis tools for user facility needs could pay long-term dividends. Workshop participants assessed current requirements as well as future challenges and made the following recommendations in order to achieve the ultimate goal of enabling transformative science in current and future BES facilities: integrate theory and analysis components seamlessly within the experimental workflow; develop new algorithms for data analysis based on common data formats and toolsets; move the analysis closer to the experiment to enable real-time (in-situ) streaming capabilities, live visualization of the experiment, and an increase in overall experimental efficiency; match data management access and capabilities with advancements in detectors and sources; and remove bottlenecks, provide interoperability across different facilities/beamlines, and apply forefront mathematical techniques to extract science from the experiments more efficiently. This workshop report examines and reviews the status of several BES facilities and highlights the successes and shortcomings of the current data and communication pathways for scientific discovery. It then ascertains what methods and tools are needed to mitigate present and projected data bottlenecks to science over the next 10 years. The goal of this report is to create the foundation for information exchanges and collaborations among ASCR and BES supported researchers, the BES scientific user facilities, and ASCR computing and networking facilities. To jumpstart these activities, there was a strong desire to see a joint effort between ASCR and BES along the lines of the highly successful Scientific Discovery through Advanced Computing (SciDAC) program, in which integrated teams of engineers, scientists and computer scientists would be engaged to tackle a complete end-to-end workflow solution at one or more beamlines and to ascertain what challenges will need to be addressed in order to handle future increases in data.
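The rates quoted in this report fix the scale of the problem directly. A back-of-envelope sketch follows, using only the figures stated in the abstract; the variable names and the flat order-of-magnitude growth factor are assumptions for illustration.

```python
# Back-of-envelope data-volume estimates from the figures in the report.
lcls_tb_per_day = 18        # LCLS: up to 18 TB per day (quoted above)
als_tb_per_hour = 10        # upgraded ALS detectors: ~10 TB per hour
growth_factor = 10          # "over an order of magnitude" in the coming decade

als_tb_per_day = als_tb_per_hour * 24
print(f"ALS today:        {als_tb_per_day} TB/day")
print(f"ALS in a decade: ~{als_tb_per_day * growth_factor / 1000:.1f} PB/day")
print(f"LCLS in a decade: ~{lcls_tb_per_day * growth_factor} TB/day")
```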
DOE Advanced Scientific Advisory Committee (ASCAC): Workforce Subcommittee Letter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chapman, Barbara; Calandra, Henri; Crivelli, Silvia
2014-07-23
Simulation and computing are essential to much of the research conducted at the DOE national laboratories. Experts in the ASCR-relevant Computing Sciences, which encompass a range of disciplines including Computer Science, Applied Mathematics, Statistics, and domain Computational Sciences, are an essential element of the workforce in nearly all of the DOE national laboratories. This report seeks to identify the gaps and challenges facing DOE with respect to this workforce. This letter is ASCAC’s response to the charge of February 19, 2014, to identify disciplines in which significantly greater emphasis in workforce training at the graduate or postdoctoral levels is necessary to address workforce gaps in current and future Office of Science mission needs.
Outcomes from the DOE Workshop on Turbulent Flow Simulation at the Exascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sprague, Michael; Boldyrev, Stanislav; Chang, Choong-Seock
This paper summarizes the outcomes from the Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop, which was held 4-5 August 2015 and was sponsored by the U.S. Department of Energy Office of Advanced Scientific Computing Research. The workshop objective was to define and describe the challenges and opportunities that computing at the exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the U.S. Department of Energy applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.
One-Dimensional Spontaneous Raman Measurements of Temperature Made in a Gas Turbine Combustor
NASA Technical Reports Server (NTRS)
Hicks, Yolanda R.; Locke, Randy J.; DeGroot, Wilhelmus A.; Anderson, Robert C.
2002-01-01
The NASA Glenn Research Center is working with the aeronautics industry to develop highly fuel-efficient and environmentally friendly gas turbine combustor technology. This effort includes testing new hardware designs at conditions that simulate the high-temperature, high-pressure environment expected in the next generation of high-performance engines. Glenn has the only facilities in which such tests can be performed. One aspect of these tests is the use of nonintrusive optical and laser diagnostics to measure combustion species concentration, fuel/air ratio, fuel drop size, and velocity, and to visualize the fuel injector spray pattern and some combustion species distributions. These data not only help designers to determine the efficacy of specific designs, but provide a database for computer modelers and enhance our understanding of the many processes that take place within a combustor. Until recently, we lacked one critical capability: the ability to measure temperature. This article summarizes our latest developments in that area. Recently, we demonstrated the first-ever use of spontaneous Raman scattering to measure combustion temperatures within the Advanced Subsonic Combustion Rig (ASCR) sector rig. We also established the highest rig pressure ever achieved for a continuous-flow combustor facility, 54.4 bar. The ASCR facility can provide operating pressures from 1 to 60 bar (60 atm). A photograph accompanying the original article shows the Raman system set up next to the ASCR rig. The test was performed using a NASA-concept fuel injector and Jet-A fuel over a range of air inlet temperatures, pressures, and fuel/air ratios.
NASA Musculoskeletal Space Medicine and Reconditioning Program
NASA Technical Reports Server (NTRS)
Kerstman, Eric; Scheuring, Richard
2011-01-01
The Astronaut Strength, Conditioning, and Rehabilitation (ASCR) group comprises certified strength and conditioning coaches and licensed and certified athletic trainers. The ASCR group works within NASA's Space Medicine Division, providing direction and supervision to the astronaut corps with regard to physical readiness throughout all phases of space flight. The ASCR group is overseen by flight surgeons with specialized training in sports medicine or physical medicine and rehabilitation. The goals of the ASCR group include 1) designing and administering strength and conditioning programs that maximize the potential for physical performance while minimizing the rate of injury, 2) providing appropriate injury management and rehabilitation services, 3) collaborating with medical, research, engineering, and mission operations groups to develop and implement safe and effective in-flight exercise countermeasures, and 4) providing a structured, individualized post-flight reconditioning program for long-duration crew members. This panel will present the current approach to the management of musculoskeletal injuries commonly seen within the astronaut corps and will present an overview of the pre-flight physical training, in-flight exercise countermeasures, and post-flight reconditioning program for ISS astronauts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Windus, Theresa; Banda, Michael; Devereaux, Thomas
Computers have revolutionized every aspect of our lives. Yet in science, the most tantalizing applications of computing lie just beyond our reach. The current quest to build an exascale computer with one thousand times the capability of today’s fastest machines (and more than a million times that of a laptop) will take researchers over the next horizon. The field of materials, chemical reactions, and compounds is inherently complex. Imagine millions of new materials with new functionalities waiting to be discovered — while researchers also seek to extend those materials that are known to a dizzying number of new forms. We could translate massive amounts of data from high precision experiments into new understanding through data mining and analysis. We could have at our disposal the ability to predict the properties of these materials, to follow their transformations during reactions on an atom-by-atom basis, and to discover completely new chemical pathways or physical states of matter. Extending these predictions from the nanoscale to the mesoscale, from the ultrafast world of reactions to long-time simulations to predict the lifetime performance of materials, and to the discovery of new materials and processes will have a profound impact on energy technology. In addition, discovery of new materials is vital to move computing beyond Moore’s law. To realize this vision, more than hardware is needed. New algorithms to take advantage of the increase in computing power, new programming paradigms, and new ways of mining massive data sets are needed as well. This report summarizes the opportunities and the requisite computing ecosystem needed to realize the potential before us. In addition to pursuing new and more complete physical models and theoretical frameworks, this review found that the following broadly grouped areas relevant to the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR) would directly affect the Basic Energy Sciences (BES) mission need. Simulation, visualization, and data analysis are crucial for advances in energy science and technology. Revolutionary mathematical, software, and algorithm developments are required in all areas of BES science to take advantage of exascale computing architectures and to meet data analysis, management, and workflow needs. In partnership with ASCR, BES has an emerging and pressing need to develop new and disruptive capabilities in data science. More capable and larger high-performance computing (HPC) and data ecosystems are required to support priority research in BES. Continued success in BES research requires developing the next-generation workforce through education and training and by providing sustained career opportunities.
76 FR 64330 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-18
... talks on HPC Reliability, Diffusion on Complex Networks, and Reversible Software Execution Systems Report from Applied Math Workshop on Mathematics for the Analysis, Simulation, and Optimization of Complex Systems Report from ASCR-BES Workshop on Data Challenges from Next Generation Facilities Public...
The Magellan Final Report on Cloud Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coghlan, Susan; Yelick, Katherine
The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs of the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, including performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO) were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.
Concentration dependent differential activity of signalling molecules in Caenorhabditis elegans
USDA-ARS?s Scientific Manuscript database
Caenorhabditis elegans employs specific glycosides of the dideoxysugar ascarylose (the ‘ascarosides’) for monitoring population density/dauer formation and finding mates. A synergistic blend of three ascarosides, called ascr#2, ascr#3 and ascr#4, acts as a dauer pheromone at a high concentration na...
… the one illustrated here, the outer membrane protein OprF of Pseudomonas aeruginosa … In the mid-1990s, NWChem was designed to run on networked processors, as in an HPC system, using one-sided communication, says Jeff Hammond of Intel Corp.'s Parallel Computing Laboratory. In one-sided communication, a …
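The snippet above breaks off while defining one-sided communication. The sketch below illustrates the general idea only: one rank writes directly into another rank's exposed memory window, with no matching receive posted by the target. NWChem itself builds on the Global Arrays toolkit rather than raw MPI windows, and the buffer names and sizes here are invented.

```python
# Minimal one-sided communication demo with mpi4py.
# Run with: mpiexec -n 2 python onesided_demo.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Each rank exposes a small buffer as a remotely accessible window.
local = np.zeros(4, dtype='i')
win = MPI.Win.Create(local, comm=comm)

win.Fence()                      # open an access epoch on all ranks
if rank == 0:
    payload = np.arange(4, dtype='i')
    win.Put(payload, 1)          # write into rank 1's window; rank 1 posts no receive
win.Fence()                      # close the epoch; the Put is now visible at the target

if rank == 1:
    print("rank 1 window after Put:", local)
win.Free()
```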
High Performance Computing Operations Review Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cupps, Kimberly C.
2013-12-19
The High Performance Computing Operations Review (HPCOR) meeting—requested by the ASC and ASCR program headquarters at DOE—was held November 5 and 6, 2013, at the Marriott Hotel in San Francisco, CA. The purpose of the review was to discuss the processes and practices for HPC integration and its related software and facilities. Experiences and lessons learned from the most recent systems deployed were covered in order to benefit the deployment of new systems.
Workflow Management Systems for Molecular Dynamics on Leadership Computers
NASA Astrophysics Data System (ADS)
Wells, Jack; Panitkin, Sergey; Oleynik, Danila; Jha, Shantenu
Molecular Dynamics (MD) simulations play an important role in a range of disciplines from materials science to biophysical systems and account for a large fraction of cycles consumed on computing resources. Increasingly, science problems require the successful execution of "many" MD simulations rather than a single MD simulation, so there is a need for scalable and flexible approaches to the execution of the workload. We present preliminary results on the Titan computer at the Oak Ridge Leadership Computing Facility that demonstrate a general capability to manage workload execution agnostic of a specific MD simulation kernel or execution pattern, and in a manner that integrates disparate grid-based and supercomputing resources. Our results build upon our extensive experience of distributed workload management in the high-energy physics ATLAS project using PanDA (Production and Distributed Analysis System), coupled with recent conceptual advances in our understanding of workload management on heterogeneous resources. We will discuss how we will generalize these initial capabilities towards a more production-level service on DOE leadership resources. This research is sponsored by US DOE/ASCR and used resources of the OLCF computing facility.
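A minimal sketch of the "many MD simulations" pattern described above, with a hypothetical run_md stub standing in for a real MD kernel: a manager drains a list of task descriptions through a worker pool without looking inside the kernel. PanDA's actual workload management is far richer (brokerage, monitoring, and retries across grid and HPC resources); this only shows the shape of the pattern.

```python
# Toy ensemble manager for "many" independent MD tasks.
# run_md is a hypothetical stub; a real engine or PanDA job sits behind it.
from concurrent.futures import ProcessPoolExecutor, as_completed

def run_md(task):
    """Pretend to run one MD simulation described by 'task'."""
    # A real version would launch an engine with task-specific parameters.
    return task["task_id"], f"done (T = {task['temperature']} K)"

def main():
    # An ensemble is just data: the manager never looks inside the kernel.
    tasks = [{"task_id": i, "temperature": 280 + 10 * i} for i in range(8)]
    with ProcessPoolExecutor(max_workers=4) as pool:    # the resource pool
        futures = {pool.submit(run_md, t): t for t in tasks}
        for fut in as_completed(futures):
            task_id, status = fut.result()  # a failed task could be resubmitted here
            print(f"task {task_id}: {status}")

if __name__ == "__main__":
    main()
```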
Quantum Testbeds Stakeholder Workshop (QTSW) Report: meeting purpose and agenda.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hebner, Gregory A.
Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an exascale computer for specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available, and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks and solve the system integration issues needed to enable a revolutionary tool. Once realized, QC will have world-changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, the DOE Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE’s science and energy mission and to identify the potential impact of these technologies.
Interspecific nematode signals regulate dispersal behavior.
Kaplan, Fatma; Alborn, Hans T; von Reuss, Stephan H; Ajredini, Ramadan; Ali, Jared G; Akyazi, Faruk; Stelinski, Lukasz L; Edison, Arthur S; Schroeder, Frank C; Teal, Peter E
2012-01-01
Dispersal is an important nematode behavior. Upon crowding or food depletion, the free-living bacterivorous nematode Caenorhabditis elegans produces stress-resistant dispersal larvae, called dauer, which are analogous to second-stage juveniles (J2) of plant-parasitic Meloidogyne spp. and infective juveniles (IJs) of entomopathogenic nematodes (EPN), e.g., Steinernema feltiae. Regulation of dispersal behavior has not been thoroughly investigated for C. elegans or any other nematode species. Based on the fact that ascarosides regulate entry into the dauer stage as well as multiple behaviors in C. elegans adults, including mating, avoidance and aggregation, we hypothesized that ascarosides might also be involved in regulation of dispersal behavior in C. elegans and for other nematodes such as IJs of phylogenetically related EPNs. Liquid chromatography-mass spectrometry analysis of C. elegans dauer conditioned media, which shows strong dispersing activity, revealed four known ascarosides (ascr#2, ascr#3, ascr#8, icas#9). A synthetic blend of these ascarosides at physiologically relevant concentrations dispersed C. elegans dauer in the presence of food and also caused dispersion of IJs of S. feltiae and J2s of plant-parasitic Meloidogyne spp. Assay-guided fractionation revealed structural analogs as major active components of the S. feltiae (ascr#9) and C. elegans (ascr#2) dispersal blends. Further analysis revealed ascr#9 in all Steinernema spp. and Heterorhabditis spp. infected insect host cadavers. Ascaroside blends represent evolutionarily conserved, fundamentally important communication systems for nematodes from diverse habitats, and thus may provide sustainable means for control of parasitic nematodes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heroux, Michael; Lethin, Richard
Programming models and environments play an essential role in high-performance computing: they enable the conception, design, implementation, and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability, since our codes have lifespans measured in decades. The advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale, and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make design, prototyping, and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.
A Fault Oblivious Extreme-Scale Execution Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKie, Jim
The FOX project, funded under the ASCR X-Stack I program, developed systems software and runtime libraries for a new approach to data and work distribution for massively parallel, fault-oblivious application execution. Our work was motivated by the premise that exascale computing systems will provide a thousand-fold increase in parallelism and a proportional increase in failure rate relative to today’s machines. To deliver the capability of exascale hardware, the systems software must provide the infrastructure to support existing applications while simultaneously enabling efficient execution of new programming models that naturally express dynamic, adaptive, irregular computation; coupled simulations; and massive data analysis in a highly unreliable hardware environment with billions of threads of execution. Our OS research has prototyped new methods to provide efficient resource sharing, synchronization, and protection in a many-core compute node. We have experimented with alternative task/dataflow programming models and shown scalability in some cases to hundreds of thousands of cores. Much of our software is in active development through open source projects, and concepts from FOX are being pursued in next-generation exascale operating systems. Our OS work focused on adaptive, application-tailored OS services optimized for multi- to many-core processors. We developed a new operating system, NIX, that supports role-based allocation of cores to processes and was released to open source. We contributed to the IBM FusedOS project, which promoted the concept of latency-optimized and throughput-optimized cores. We built a task queue library based on a distributed, fault-tolerant key-value store and identified scaling issues. A second fault-tolerant task-parallel library was developed, based on the Linda tuple-space model, that used low-level interconnect primitives for optimized communication. We designed fault tolerance mechanisms for task-parallel computations employing work stealing for load balancing that scaled to the largest existing supercomputers. Finally, we implemented the Elastic Building Blocks runtime, a library to manage object-oriented distributed software components. To support the research, we won two INCITE awards for time on Intrepid (BG/P) and Mira (BG/Q). Much of our work has had impact in the OS and runtime community through the ASCR Exascale OS/R workshop and report, leading to the research agenda of the Exascale OS/R program. Our project was, however, also affected by attrition of multiple PIs. While the PIs continued to participate and offer guidance as time permitted, losing these key individuals was unfortunate both for the project and for the DOE HPC community.
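The Linda tuple-space model mentioned above coordinates workers through a shared associative store: producers deposit tuples, and consumers atomically withdraw tuples matching a pattern. Below is a minimal single-process sketch of that idea; the FOX library was distributed, fault tolerant, and built on low-level interconnect primitives, none of which is modeled here, and all names are illustrative.

```python
# Toy Linda-style tuple space (in-process; illustration only).
import threading

class TupleSpace:
    def __init__(self):
        self._tuples = []
        self._cv = threading.Condition()

    def out(self, tup):
        """Deposit a tuple into the space (Linda's 'out')."""
        with self._cv:
            self._tuples.append(tup)
            self._cv.notify_all()

    def take(self, pattern):
        """Atomically remove and return a tuple matching 'pattern';
        None fields are wildcards (Linda's 'in'). Blocks until found."""
        with self._cv:
            while True:
                for i, tup in enumerate(self._tuples):
                    if len(tup) == len(pattern) and all(
                            p is None or p == v for p, v in zip(pattern, tup)):
                        return self._tuples.pop(i)
                self._cv.wait()

space = TupleSpace()

def worker():
    while True:
        _, n = space.take(("task", None))   # block until a task tuple appears
        if n is None:                       # poison pill: shut down
            return
        space.out(("result", n, n * n))     # publish the result as a tuple

t = threading.Thread(target=worker)
t.start()
for n in range(5):
    space.out(("task", n))
space.out(("task", None))                   # tell the worker to stop
t.join()
print([space.take(("result", n, None))[2] for n in range(5)])
```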
Density-based regulation of ascr#2 and ascr#4 expression in Caenorhabditis elegans
USDA-ARS?s Scientific Manuscript database
The ascarosides are a family of nematode small molecules, many of which induce formation of long-lived and highly stress-resistant dauer larvae. More recent studies have shown that ascarosides serve additional functions as social signals and mating pheromones. For example, the male-attracting pherom...
Large Scale Computing and Storage Requirements for High Energy Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerber, Richard A.; Wasserman, Harvey
2010-11-24
The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five-year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report includes a section that describes efforts already underway or planned at NERSC that address requirements collected at the workshop. NERSC has many initiatives in progress that address key workshop findings and are aligned with NERSC's strategic plans.
Schallhorn, Julie M; Ciralsky, Jessica B; Yeu, Elizabeth
2017-05-01
A survey was offered to attendees of the 2016 annual meeting of the American Society of Cataract and Refractive Surgery (ASCRS) as well as online to ASCRS members. Of the 429 self-identified surgeons in training or those with fewer than 5 years in practice, 83% had performed complex cataract surgery using iris expansion devices or capsular tension rings (63%) and 70% had implanted a toric intraocular lens (IOL). A minority of respondents had performed laser-assisted cataract surgery (27%) or implanted presbyopia-correcting IOLs (39%), and only half (50%) had performed laser vision correction (LVC). Comfort with complex cataract and IOL procedures improved as the number of cases performed increased, plateauing beyond 10 cases. From this we can conclude that young surgeons have adequate exposure to complex cataracts but lack experience in refractive surgery and new IOL technology. Reported surgeon confidence improved with increased experience and exposure. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Performance Analysis, Modeling and Scaling of HPC Applications and Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhatele, Abhinav
2016-01-13
Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes, respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.
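As a hedged illustration of the kind of performance-modeling analysis listed under front 4, the sketch below fits a simple strong-scaling model to made-up timing data; neither the model form nor the numbers come from the PAMS project.

# Hypothetical example: fit a strong-scaling model T(p) = t_s + t_p / p
# (a fixed serial cost plus perfectly divisible parallel work) to measured
# runtimes. The data points are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def model(p, t_serial, t_parallel):
    return t_serial + t_parallel / p

procs = np.array([64, 128, 256, 512, 1024], dtype=float)
times = np.array([120.0, 63.5, 35.2, 21.1, 14.0])  # seconds (illustrative)

(t_s, t_p), _ = curve_fit(model, procs, times)
print(f"estimated serial fraction ~ {t_s / (t_s + t_p):.3%}")
print(f"predicted T(4096) ~ {model(4096.0, t_s, t_p):.1f} s")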
Sex-specific mating pheromones in the nematode Panagrellus redivivus.
Choe, Andrea; Chuman, Tatsuji; von Reuss, Stephan H; Dossey, Aaron T; Yim, Joshua J; Ajredini, Ramadan; Kolawa, Adam A; Kaplan, Fatma; Alborn, Hans T; Teal, Peter E A; Schroeder, Frank C; Sternberg, Paul W; Edison, Arthur S
2012-12-18
Nematodes use an extensive chemical language based on glycosides of the dideoxysugar ascarylose for developmental regulation (dauer formation), male sex attraction, aggregation, and dispersal. However, no examples of a female- or hermaphrodite-specific sex attractant have been identified to date. In this study, we investigated the pheromone system of the gonochoristic sour paste nematode Panagrellus redivivus, which produces sex-specific attractants of the opposite sex. Activity-guided fractionation of the P. redivivus exometabolome revealed that males are strongly attracted to ascr#1 (also known as daumone), an ascaroside previously identified from Caenorhabditis elegans hermaphrodites. Female P. redivivus are repelled by high concentrations of ascr#1 but are specifically attracted to a previously unknown ascaroside that we named dhas#18, a dihydroxy derivative of the known ascr#18 and an ascaroside that features extensive functionalization of the lipid-derived side chain. Targeted profiling of the P. redivivus exometabolome revealed several additional ascarosides that did not induce strong chemotaxis. We show that P. redivivus females, but not males, produce the male-attracting ascr#1, whereas males, but not females, produce the female-attracting dhas#18. These results show that ascaroside biosynthesis in P. redivivus is highly sex-specific. Furthermore, the extensive side chain functionalization in dhas#18, which is reminiscent of polyketide-derived natural products, indicates unanticipated biosynthetic capabilities in nematodes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rojas, Joseph Maurice
We summarize the contributions of the Texas A&M University Group to the project (DE-FG02-09ER25949/DE-SC0002505: Topology for Statistical Modeling of Petascale Data - an ASCR-funded collaboration between Sandia National Labs, Texas A&M U, and U Utah) during 6/9/2011 -- 2/27/2013.
The New Zealand cataract and refractive surgery survey 1997/1998.
Elder, M; Tarr, K; Leaming, D
2000-04-01
This study documents the current practice for cataract and refractive surgery in New Zealand. A postal questionnaire was distributed in late 1997 to all consultant members of the Ophthalmological Society of New Zealand who were resident in the country at that time. Most questions were identical to the 1997 survey of the American Society of Cataract and Refractive Surgery (ASCRS) to enable a comparison. There were 98 returns from 101 surveys distributed. Of the returns, 72 performed cataract surgery, 23 performed PRK and 11 performed LASIK. ASCRS members did more refractive surgery than did New Zealanders: 28 versus 1% performing 1-5 RK per month, 7 versus 1% performing 1-2 clear lens extractions per month, and 85 versus 51% having access to an excimer laser. For cataract surgery, ASCRS members used more topical anaesthesia (30 vs 5.5%), used no sutures more often (73 vs 51%), used more preoperative antibiotics (76 vs 26%) and used fewer injections of antibiotic/steroids (38 vs 61%). Otherwise the two groups were broadly similar.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drugan, C.
2009-12-07
The word 'breakthrough' aptly describes the transformational science and milestones achieved at the Argonne Leadership Computing Facility (ALCF) throughout 2008. The number of research endeavors undertaken at the ALCF through the U.S. Department of Energy's (DOE) Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program grew from 9 in 2007 to 20 in 2008. The allocation of computer time awarded to researchers on the Blue Gene/P also spiked significantly - from nearly 10 million processor hours in 2007 to 111 million in 2008. To support this research, we expanded the capabilities of Intrepid, an IBM Blue Gene/P system at the ALCF, to 557 teraflops (TF) for production use. Furthermore, we enabled breakthrough levels of productivity and capability in visualization and data analysis with Eureka, a powerful installation of NVIDIA Quadro Plex S4 external graphics processing units. Eureka delivered a quantum leap in visual compute density, providing more than 111 TF and more than 3.2 terabytes of RAM. On April 21, 2008, the dedication of the ALCF realized DOE's vision to bring the power of the Department's high performance computing to open scientific research. In June, the IBM Blue Gene/P supercomputer at the ALCF debuted as the world's fastest for open science and third fastest overall. There is no question that the science benefited from this growth and system improvement. Four research projects spearheaded by Argonne National Laboratory computer scientists and ALCF users were named to the list of top ten scientific accomplishments supported by DOE's Advanced Scientific Computing Research (ASCR) program. Three of the top ten projects used extensive grants of computing time on the ALCF's Blue Gene/P to model the molecular basis of Parkinson's disease, design proteins at atomic scale, and create enzymes. As the year came to a close, the ALCF was recognized with several prestigious awards at SC08 in November. We provided resources for Linear Scaling Divide-and-Conquer Electronic Structure Calculations for Thousand Atom Nanostructures, a collaborative effort between Argonne, Lawrence Berkeley National Laboratory, and Oak Ridge National Laboratory that received the ACM Gordon Bell Prize Special Award for Algorithmic Innovation. The ALCF also was named a winner in two of the four categories in the HPC Challenge best performance benchmark competition.
Simulation of Laboratory Tests of Steel Arch Support
NASA Astrophysics Data System (ADS)
Horyl, Petr; Šňupárek, Richard; Maršálek, Pavel; Pacześniowski, Krzysztof
2017-03-01
The total load-bearing capacity of yielding steel arch roadway supports is among their most important characteristics. These values can be obtained in two ways: experimental measurements in a specialized laboratory or computer modelling by FEM. Experimental measurements are significantly more expensive and more time-consuming. A properly tuned computer model is therefore very valuable, provided it is verified by experiment. This verification was carried out successfully by the cooperating institutions GIG Katowice, VSB-Technical University of Ostrava, and the Institute of Geonics ASCR. The present article discusses the conditions and results of this verification for static problems. The output is a tuned computer model, which may be used for further calculations to obtain the load-bearing capacity of other types of steel arch supports. The effects of changes in other parameters, such as the material properties of the steel, torque values, and friction coefficient values, can be determined relatively quickly by changing the properties of the investigated steel arch supports.
A pervasive parallel framework for visualization: final report for FWP 10-014707
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.
2014-01-01
We are on the threshold of a transformative change in the basic architecture of high-performance computing. The use of accelerator processors, characterized by large core counts, shared but asymmetrical memory, and heavy thread loading, is quickly becoming the norm in high performance computing. These accelerators represent significant challenges in updating our existing base of software. An intrinsic problem with this transition is a fundamental programming shift from message passing processes to much finer-grained thread scheduling with memory sharing. Another problem is the lack of stability in accelerator implementation; processor and compiler technology is currently changing rapidly. This report documents the results of our three-year ASCR project to address these challenges. Our project includes the development of the Dax toolkit, which contains the beginnings of new algorithms for a new generation of computers and the underlying infrastructure to rapidly prototype and build further algorithms as necessary.
A Multifaceted Mathematical Approach for Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexander, F.; Anitescu, M.; Bell, J.
2012-03-07
Applied mathematics has an important role to play in developing the tools needed for the analysis, simulation, and optimization of complex problems. These efforts require the development of the mathematical foundations for scientific discovery, engineering design, and risk analysis based on a sound integrated approach for the understanding of complex systems. However, maximizing the impact of applied mathematics on these challenges requires a novel perspective on approaching the mathematical enterprise. Previous reports that have surveyed the DOE's research needs in applied mathematics have played a key role in defining research directions with the community. Although these reports have had significant impact, accurately assessing current research needs requires an evaluation of today's challenges against the backdrop of recent advances in applied mathematics and computing. To address these needs, the DOE Applied Mathematics Program sponsored a Workshop for Mathematics for the Analysis, Simulation and Optimization of Complex Systems on September 13-14, 2011. The workshop had approximately 50 participants from both the national labs and academia. The goal of the workshop was to identify new research areas in applied mathematics that will complement and enhance the existing DOE ASCR Applied Mathematics Program efforts that are needed to address problems associated with complex systems. This report describes recommendations from the workshop and subsequent analysis of the workshop findings by the organizing committee.
Computer-animated model of accommodation and presbyopia.
Goldberg, Daniel B
2015-02-01
To understand, demonstrate, and further research the mechanisms of accommodation and presbyopia. Private practice, Little Silver, New Jersey, USA. Experimental study. The CAMA 2.0 computer-animated model of accommodation and presbyopia was produced in collaboration with an experienced medical animator using Autodesk Maya animation software and Adobe After Effects. The computer-animated model demonstrates the configuration and synchronous movements of all accommodative elements. A new classification of the zonular apparatus based on structure and function is proposed. There are 3 divisions of zonular fibers; that is, anterior, crossing, and posterior. The crossing zonular fibers form a scaffolding to support the lens; the anterior and posterior zonular fibers work reciprocally to achieve focused vision. The model demonstrates the important support function of Weiger ligament. Dynamic movement of the ora serrata demonstrates that the forces of ciliary muscle contraction store energy for disaccommodation in the elastic choroid. The flow of aqueous and vitreous provides strong evidence for our understanding of the hydrodynamic interactions during the accommodative cycle. The interaction may result from the elastic stretch in the choroid transmitted to the vitreous rather than from vitreous pressure. The model supports the concept that presbyopia results from loss of elasticity and increasing ocular rigidity in both the lenticular and extralenticular structures. The computer-animated model demonstrates the structures of accommodation moving in synchrony and might enhance understanding of the mechanisms of accommodation and presbyopia. Dr. Goldberg is a consultant to Acevision, Inc., and Bausch & Lomb. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Simple technique to measure toric intraocular lens alignment and stability using a smartphone.
Teichman, Joshua C; Baig, Kashif; Ahmed, Iqbal Ike K
2014-12-01
Toric intraocular lenses (IOLs) are commonly implanted to correct corneal astigmatism at the time of cataract surgery. Their use requires preoperative calculation of the axis of implantation and postoperative measurement to determine whether the IOL has been implanted with the proper orientation. Moreover, toric IOL alignment stability over time is important for the patient and for the longitudinal evaluation of toric IOLs. We present a simple, inexpensive, and precise method to measure the toric IOL axis using a camera-enabled cellular phone (iPhone 5S) and computer software (ImageJ). Copyright © 2014 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing
NASA Astrophysics Data System (ADS)
Klimentov, A.; Buncic, P.; De, K.; Jha, S.; Maeno, T.; Mount, R.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Porter, R. J.; Read, K. F.; Vaniachine, A.; Wells, J. C.; Wenaus, T.
2015-05-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, and O(10^3) users, and the ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled 'Next Generation Workload Management and Analysis System for Big Data' (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. We will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.
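To make the workload-management idea concrete, here is a small, hypothetical Python sketch of the broker/pilot matching pattern that systems in this class use: pilots running on heterogeneous resources ask a central broker for the highest-priority job that fits their slot. The class names and job fields are invented and do not reflect PanDA's actual interfaces.

# Illustrative sketch only; not PanDA's API.
import heapq

class Broker:
    def __init__(self):
        self.queue = []  # heap of (priority, job_id, cores, walltime_h)

    def submit(self, priority, job_id, cores, walltime_h):
        heapq.heappush(self.queue, (priority, job_id, cores, walltime_h))

    def match(self, site_cores, site_walltime_h):
        # Hand back the highest-priority job that fits the advertised
        # slot, returning non-matching jobs to the queue.
        fits, skipped = None, []
        while self.queue:
            job = heapq.heappop(self.queue)
            _, _, cores, wall = job
            if cores <= site_cores and wall <= site_walltime_h:
                fits = job
                break
            skipped.append(job)
        for job in skipped:
            heapq.heappush(self.queue, job)
        return fits

broker = Broker()
broker.submit(0, "simulate-001", cores=128, walltime_h=12)
broker.submit(1, "analyze-042", cores=8, walltime_h=2)
print(broker.match(site_cores=64, site_walltime_h=24))  # -> analyze-042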
ASCR Cybersecurity for Scientific Computing Integrity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peisert, Sean
The Department of Energy (DOE) has the responsibility to address the energy, environmental, and nuclear security challenges that face our nation. Much of DOE's enterprise involves distributed, collaborative teams; a significant fraction involves "open science," which depends on multi-institutional, often international collaborations that must access or share significant amounts of information between institutions and over networks around the world. The mission of the Office of Science is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security of the United States. The ability of DOE to execute its responsibilities depends critically on its ability to assure the integrity and availability of scientific facilities and computer systems, and of the scientific, engineering, and operational software and data that support its mission.
Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing
Klimentov, A.; Buncic, P.; De, K.; ...
2015-05-22
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, and O(10^3) users, and the ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled 'Next Generation Workload Management and Analysis System for Big Data' (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. Finally, we will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.
Non-Hodgkin’s Lymphomas, Version 4.2014
Zelenetz, Andrew D.; Gordon, Leo I.; Wierda, William G.; Abramson, Jeremy S.; Advani, Ranjana H.; Andreadis, C. Babis; Bartlett, Nancy; Byrd, John C.; Czuczman, Myron S.; Fayad, Luis E.; Fisher, Richard I.; Glenn, Martha J.; Harris, Nancy Lee; Hoppe, Richard T.; Horwitz, Steven M.; Kelsey, Christopher R.; Kim, Youn H.; Krivacic, Susan; LaCasce, Ann S.; Nademanee, Auayporn; Porcu, Pierluigi; Press, Oliver; Rabinovitch, Rachel; Reddy, Nishitha; Reid, Erin; Saad, Ayman A.; Sokol, Lubomir; Swinnen, Lode J.; Tsien, Christina; Vose, Julie M.; Yahalom, Joachim; Zafar, Nadeem; Dwyer, Mary; Sundar, Hema
2016-01-01
Non-Hodgkin’s lymphomas (NHL) are a heterogeneous group of lymphoproliferative disorders originating in B lymphocytes, T lymphocytes, or natural killer cells. Mantle cell lymphoma (MCL) accounts for approximately 6% of all newly diagnosed NHL cases. Radiation therapy with or without systemic therapy is a reasonable approach for the few patients who present with early-stage disease. Rituximab-based chemoimmunotherapy followed by high-dose therapy and autologous stem cell rescue (HDT/ASCR) is recommended for patients presenting with advanced-stage disease. Induction therapy followed by rituximab maintenance may provide extended disease control for those who are not candidates for HDT/ASCR. Ibrutinib, a Bruton tyrosine kinase inhibitor, was recently approved for the treatment of relapsed or refractory disease. This manuscript discusses the recommendations outlined in the NCCN Guidelines for NHL regarding the diagnosis and management of patients with MCL. PMID:25190696
... Chapter 31. Intestinal Stomas. Chapter in Beck, D.E., Roberts, P.L., Saclarides, T.J., Senagore, A.J., Stamos, M.J., Wexner, S.D., Eds. ASCRS Textbook of Colon and Rectal Surgery, 2nd Edition. Springer, New York, NY: 2011. National Digestive Disease Information ...
NCCN Guidelines Insights: Non-Hodgkin's Lymphomas, Version 3.2016.
Horwitz, Steven M; Zelenetz, Andrew D; Gordon, Leo I; Wierda, William G; Abramson, Jeremy S; Advani, Ranjana H; Andreadis, C Babis; Bartlett, Nancy; Byrd, John C; Fayad, Luis E; Fisher, Richard I; Glenn, Martha J; Habermann, Thomas M; Lee Harris, Nancy; Hernandez-Ilizaliturri, Francisco; Hoppe, Richard T; Kaminski, Mark S; Kelsey, Christopher R; Kim, Youn H; Krivacic, Susan; LaCasce, Ann S; Lunning, Matthew; Nademanee, Auayporn; Press, Oliver; Rabinovitch, Rachel; Reddy, Nishitha; Reid, Erin; Roberts, Kenneth; Saad, Ayman A; Sokol, Lubomir; Swinnen, Lode J; Vose, Julie M; Yahalom, Joachim; Zafar, Nadeem; Dwyer, Mary; Sundar, Hema; Porcu, Pierluigi
2016-09-01
Peripheral T-cell lymphomas (PTCLs) represent a relatively uncommon heterogeneous group of non-Hodgkin's lymphomas (NHLs) with an aggressive clinical course and poor prognosis. Anthracycline-based multiagent chemotherapy with or without radiation therapy followed by first-line consolidation with high-dose therapy followed by autologous stem cell rescue (HDT/ASCR) is the standard approach to most of the patients with newly diagnosed PTCL. Relapsed or refractory disease is managed with second-line systemic therapy followed by HDT/ASCR or allogeneic stem cell transplant, based on the patient's eligibility for transplant. In recent years, several newer agents have shown significant activity in patients with relapsed or refractory disease across all 4 subtypes of PTCL. These NCCN Guideline Insights highlight the important updates to the NCCN Guidelines for NHL, specific to the management of patients with relapsed or refractory PTCL. Copyright © 2016 by the National Comprehensive Cancer Network.
Models for joint ophthalmology-optometry patient management.
Kim, John J; Kim, Christine M
2011-07-01
The American Academy of Ophthalmology (AAO) and the American Society of Cataract and Refractive Surgery (ASCRS) presented a joint position paper in February 2000 declaring that they do not support routine comanagement of patients with optometrists. The American Optometric Association and the American Academy of Optometry quickly responded in support of AAO and ASCRS. None of the four entities, however, precluded legitimate and proper comanagement arrangements. Since that time, the pattern of practice has changed, requiring us to rethink our positions. This paper is written to provide a possible model for ophthalmology-optometry practice management in ophthalmic surgeries, including refractive surgery. Since the publication of the joint position paper, the concept of comanagement has faded and a new model of integrated management has evolved. This has occurred as changes in the employment pattern of ophthalmic practices have incorporated optometrists into their fold. This evolution has allowed the ophthalmic and optometric communities to coexist and thrive while providing better patient care.
... Dermatology and Pruritis Ani”. Chapter in Beck, D. E., Roberts, P. L., Saclarides, T. J., Senagore, A. J., Stamos, M. J., Wexner, S. D., Eds. ASCRS Textbook of Colon and Rectal Surgery, 2nd Edition. Springer, New York, NY; 2011. American Society of Colon ...
The Argonne Leadership Computing Facility 2010 annual report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drugan, C.
Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or for broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers that will be faster than petascale-class computers by a factor of a thousand. Pete Beckman, who served as the ALCF's Director for the past few years, has been named director of the newly created Exascale Technology and Computing Institute (ETCi). The institute will focus on developing exascale computing to extend scientific discovery and solve critical science and engineering problems. Just as Pete's leadership propelled the ALCF to great success, we know that ETCi will benefit immensely from his expertise and experience. Without question, the future of supercomputing is certainly in good hands. I would like to thank Pete for all his effort over the past two years, during which he oversaw the establishment of ALCF2, the deployment of the Magellan project, and increases in utilization, availability, and number of projects using ALCF1. He managed the rapid growth of ALCF staff and made the facility what it is today. All the staff and users are better for Pete's efforts.
77 FR 14952 - Delegations of Authority
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-14
... civil rights from the Secretary of Agriculture directly to the Assistant Secretary for Civil Rights... CONTACT: USDA's, Assistant General Counsel Civil Rights, Tami Trost at 202-690-3993 or email tami.trost... Civil Rights, overseen by the Assistant Secretary for Civil Rights (ASCR), was aligned within USDA's...
USDA-ARS's Scientific Manuscript database
A group of small signaling molecules called ascarosides, associated with dauer formation, male attraction and social behavior in the nematode Caenorhabditis elegans, are shown to be regulated by developmental stage and environmental factors. The concentration of dauer-inducing ascaroside, ascr#2, i...
Simple technique to treat pupillary capture after transscleral fixation of intraocular lens.
Jürgens, Ignasi; Rey, Amanda
2015-01-01
We describe a simple surgical technique to manage pupillary capture after previous transscleral fixation of an intraocular lens. Neither author has a financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-21
... DEPARTMENT OF AGRICULTURE Rural Housing Service U.S. Department of Agriculture Multi-Family Housing Program 2013 Industry Forums--Open Teleconference and/or Web Conference Meetings AGENCY: Rural... at www.ascr.usda.gov , or write to: U.S Department of Agriculture, Office of the Assistance Secretary...
Chang, David F; Braga-Mele, Rosa; Henderson, Bonnie An; Mamalis, Nick; Vasavada, Abhay
2015-06-01
A 2014 online survey of the American Society of Cataract and Refractive Surgery members indicated increasing use of intracameral antibiotic injection prophylaxis compared with a comparable survey from 2007. Forty-seven percent of respondents already used or planned to adopt this measure. One half of all surgeons not using intracameral prophylaxis expressed concern about the risks of noncommercially prepared antibiotic preparations. Overall, the large majority (75%) said they believe it is important to have a commercially available antibiotic approved for intracameral injection. Assuming reasonable cost, the survey indicates that commercial availability of Aprokam (cefuroxime) would increase the overall percentage of surgeons using intracameral antibiotic injection prophylaxis to nearly 84%. Although the majority used topical perioperative antibiotic prophylaxis, and gatifloxacin and moxifloxacin were still the most popular agents, there was a trend toward declining use of fourth-generation fluoroquinolones (60%, down from 81% in 2007) and greater use of topical ofloxacin and ciprofloxacin (21%, up from 9% in 2007). Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
7 CFR 15f.13 - What is the function of the ALJ and who may communicate with him?
Code of Federal Regulations, 2010 CFR
2010-01-01
...: (i) All such written communications; (ii) Memoranda stating the substance of all such oral... any oral responses to such communications. (c) Upon receipt of a communication knowingly made or..., unless the ASCR reviews the proposed determination. (b) What is an ex parte communication? An ex parte...
2001-06-01
[Garbled affiliation fragment; recoverable institutions: Institute of Inorganic Materials, IIC ASCR and ICT, Pelleova 24, Prague 6, Czech Republic; Instituto de Quimica, UNESP, C.P. 355, CEP 14801-970, Araraquara, SP, Brazil; Depto de Quimica, Universidade Federal de Juiz de Fora, Juiz de Fora, MG, Brazil; Escola de Engenharia de Sao Carlos, Universidade de Sao Paulo, Brazil.]
NASA Astrophysics Data System (ADS)
Zhang, Guannan; Del-Castillo-Negrete, Diego
2017-10-01
Kinetic descriptions of runaway electrons (RE) are usually based on the bounce-averaged Fokker-Planck model that determines the PDFs of RE. Despite the simplifications involved, the Fokker-Planck equation can rarely be solved analytically, and direct numerical approaches (e.g., continuum and particle-based Monte Carlo (MC)) can be time-consuming, especially in the computation of asymptotic-type observables including the runaway probability, the slowing-down and runaway mean times, and the energy limit probability. Here we present a novel backward MC approach to these problems based on backward stochastic differential equations (BSDEs). The BSDE model can simultaneously describe the PDF of RE and the runaway probabilities by means of the well-known Feynman-Kac theory. The key ingredient of the backward MC algorithm is to place all the particles in a runaway state and simulate them backward from the terminal time to the initial time. As such, our approach can provide much faster convergence than brute-force MC methods, which can significantly reduce the number of particles required to achieve a prescribed accuracy. Moreover, our algorithm can be parallelized as easily as a direct MC code, which paves the way for conducting large-scale RE simulations. This work is supported by DOE FES and ASCR under the Contract Numbers ERKJ320 and ERAT377.
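For intuition, the sketch below estimates a runaway-type probability through the Feynman-Kac connection named in the abstract, using a brute-force forward Euler-Maruyama estimator; the paper's backward algorithm starts from the terminal condition and converges faster, and all coefficients and the threshold here are illustrative only.

# Feynman-Kac intuition: u(t0, x0) = E[ 1{X_T >= x_c} | X_{t0} = x0 ] for
# dX = b(X) dt + sigma dW. Coefficients and x_c are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

def runaway_probability(x0, x_c=2.0, T=1.0, n_steps=200, n_paths=100_000):
    dt = T / n_steps
    x = np.full(n_paths, float(x0))
    for _ in range(n_steps):
        drift = 0.5 * x          # illustrative drift toward the runaway region
        sigma = 0.3              # illustrative constant diffusion
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    return np.mean(x >= x_c)    # fraction of paths ending in the runaway region

print(runaway_probability(1.0))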
Object Kinetic Monte Carlo Simulations of Radiation Damage In Bulk Tungsten
NASA Astrophysics Data System (ADS)
Nandipati, Giridhar; Setyawan, Wahyu; Heinisch, Howard; Roche, Kenneth; Kurtz, Richard; Wirth, Brian
2015-11-01
Results are presented for the evolution of radiation damage in bulk tungsten investigated using the object KMC simulation tool, KSOME, as a function of dose, dose rate and primary knock-on atom (PKA) energies in the range of 10 to 100 keV, at temperatures of 300, 1025 and 2050 K. At 300 K, the number density of vacancies changes minimally with dose rate, while the number density of vacancy clusters slightly decreases with dose rate, indicating that larger clusters are formed at higher dose rates. Although the average vacancy cluster size increases slightly, the vast majority of vacancies exist as mono-vacancies. At 1025 K, void lattice formation was observed at all dose rates for cascades below 60 keV and at lower dose rates for higher PKA energies. After the appearance of the initial features of the void lattice, the vacancy cluster density increased minimally while the average vacancy cluster size increased rapidly with dose. At 2050 K, no accumulation of defects was observed over a broad range of dose rates for all PKA energies studied in this work. Further comparisons of the results of irradiation simulations at various dose rates and PKA spectra, representative of the High Flux Isotope Reactor and future fusion-relevant irradiation facilities, will be discussed. The U.S. Department of Energy, Office of Fusion Energy Sciences (FES) and Office of Advanced Scientific Computing Research (ASCR) has supported this study through the SciDAC-3 program.
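For readers unfamiliar with object KMC, the following minimal sketch shows the residence-time (Gillespie-style) step that drives such simulations; the event list and rates are placeholders, not KSOME's actual tungsten defect physics.

# One residence-time KMC step: pick an event with probability proportional
# to its rate, execute it, and advance the clock by an exponentially
# distributed waiting time. Events and rates are illustrative placeholders.
import math
import random

def kmc_step(events, t):
    """events: list of (rate, action) pairs; returns the updated time."""
    total = sum(rate for rate, _ in events)
    r = random.random() * total
    acc = 0.0
    for rate, action in events:
        acc += rate
        if r <= acc:
            action()  # e.g., hop a vacancy, emit from a cluster, recombine
            break
    return t - math.log(random.random()) / total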
The Secret Life of Quarks, Final Report for the University of North Carolina at Chapel Hill
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fowler, Robert J.
This final report summarizes activities and results at the University of North Carolina as part of the SciDAC-2 project The Secret Life of Quarks: National Computational Infrastructure for Lattice Quantum Chromodynamics. The overall objective of the project is to construct the software needed to study quantum chromodynamics (QCD), the theory of the strong interactions of subatomic physics, and similar strongly coupled gauge theories anticipated to be of importance in the LHC era. It built upon the successful efforts of the SciDAC-1 project National Computational Infrastructure for Lattice Gauge Theory, in which a QCD Applications Programming Interface (QCD API) was developed that enables lattice gauge theorists to make effective use of a wide variety of massively parallel computers. In the SciDAC-2 project, optimized versions of the QCD API were being created for the IBM BlueGene/L (BG/L) and BlueGene/P (BG/P), the Cray XT3/XT4 and its successors, and clusters based on multi-core processors and Infiniband communications networks. The QCD API is being used to enhance the performance of the major QCD community codes and to create new applications. Software libraries of physics tools have been expanded to contain sharable building blocks for inclusion in application codes, performance analysis and visualization tools, and software for automation of physics workflow. New software tools were designed for managing the large data sets generated in lattice QCD simulations, and for sharing them through the International Lattice Data Grid consortium. As part of the overall project, researchers at UNC were funded through ASCR to work in three general areas. The main thrust has been performance instrumentation and analysis in support of the SciDAC QCD code base as it evolved and as it moved to new computation platforms. In support of the performance activities, performance data was to be collected in a database for the purpose of broader analysis. Third, the UNC work was done at RENCI (Renaissance Computing Institute), which has extensive expertise and facilities for scientific data visualization, so we acted in an ongoing consulting and support role in that area.
The inv dup (15) or idic (15) syndrome (Tetrasomy 15q).
Battaglia, Agatino
2008-11-19
The inv dup(15) or idic(15) syndrome displays distinctive clinical findings represented by early central hypotonia, developmental delay and intellectual disability, epilepsy, and autistic behaviour. Incidence at birth is estimated at 1 in 30,000 with a sex ratio of almost 1:1. Developmental delay and intellectual disability affect all individuals with inv dup(15) and are usually moderate to profound. Expressive language is absent or very poor and often echolalic. Comprehension is very limited and contextual. Intention to communicate is absent or very limited. The distinct behavioral disorder shown by children and adolescents has been widely described as autistic or autistic-like. Epilepsy with a wide variety of seizure types can occur in these individuals, with onset between 6 months and 9 years. Various EEG abnormalities have been described. Muscle hypotonia is observed in almost all individuals, associated, in most of them, with joint hyperextensibility and drooling. Facial dysmorphic features are absent or subtle, and major malformations are rare. Feeding difficulties are reported in the newborn period. Chromosome region 15q11q13, known for its instability, is highly susceptible to clinically relevant genomic rearrangements, such as supernumerary marker chromosomes formed by the inverted duplication of proximal chromosome 15. Inv dup(15) results in tetrasomy 15p and partial tetrasomy 15q. The large rearrangements, containing the Prader-Willi/Angelman syndrome critical region (PWS/ASCR), are responsible for the inv dup(15) or idic(15) syndrome. Diagnosis is achieved by standard cytogenetics and FISH analysis, using probes both from proximal chromosome 15 and from the PWS/ASCR. Microsatellite analysis on parental DNA or methylation analysis on the proband DNA are also needed to detect the parent-of-origin of the inv dup(15) chromosome. Array CGH has been shown to provide a powerful approach for identifying and detecting the extent of the duplication. The possible occurrence of double supernumerary isodicentric chromosomes derived from chromosome 15, resulting in partial hexasomy of the maternally inherited PWS/ASCR, should be considered in the differential diagnosis. Large idic(15) are nearly always sporadic. Antenatal diagnosis is possible. Management of inv dup(15) includes a comprehensive neurophysiologic and developmental evaluation. Survival is not significantly reduced.
NASA Astrophysics Data System (ADS)
Bhalla, Amneet Pal Singh; Johansen, Hans; Graves, Dan; Martin, Dan; Colella, Phillip; Applied Numerical Algorithms Group Team
2017-11-01
We present a consistent cell-averaged discretization for incompressible Navier-Stokes equations on complex domains using embedded boundaries. The embedded boundary is allowed to freely cut the locally-refined background Cartesian grid. An implicit-function representation is used for the embedded boundary, which allows us to convert the required geometric moments in the Taylor series expansion (up to arbitrary order) of polynomials into an algebraic problem in lower dimensions. The computed geometric moments are then used to construct stencils for various operators like the Laplacian, divergence, gradient, etc., by solving a least-squares system locally. We also construct the inter-level data-transfer operators like prolongation and restriction for multigrid solvers using the same least-squares system approach. This allows us to retain high order of accuracy near coarse-fine interfaces and near embedded boundaries. Canonical problems like Taylor-Green vortex flow and flow past bluff bodies will be presented to demonstrate the proposed method. U.S. Department of Energy, Office of Science, ASCR (Award Number DE-AC02-05CH11231).
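The least-squares stencil construction can be illustrated in one dimension: impose polynomial-reproduction constraints and read off Laplacian weights. This simplified sketch uses point values on a 1D stencil rather than the paper's cell averages and geometric moments, so it only conveys the flavor of the approach.

# Weights w such that sum(w * f(xs)) ~ f''(x0), from least squares on the
# constraint that each basis polynomial (x - x0)^k is differentiated exactly.
import numpy as np

def laplacian_stencil(xs, x0, order=2):
    # Row k of A holds (xs - x0)^k; the exact second derivative of
    # (x - x0)^k at x0 is 0 for k != 2 and 2 for k == 2.
    A = np.vander(xs - x0, order + 1, increasing=True).T
    b = np.zeros(order + 1)
    b[2] = 2.0
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

xs = np.array([-1.0, 0.0, 1.0])
print(laplacian_stencil(xs, 0.0))  # ~ [1, -2, 1] for unit spacing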
The Future of Software Engineering for High Performance Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, G
DOE ASCR requested that from May through mid-July 2015 a study group identify issues and recommend solutions from a software engineering perspective for transitioning into the next generation of High Performance Computing. The approach used was to ask some of the DOE complex experts who will be responsible for doing this work to contribute to the study group. The technique used was to solicit elevator speeches: a short and concise write-up done as if the author were a speaker with only a few minutes to convince a decision maker of their top issues. Pages 2-18 contain the original texts of the contributed elevator speeches and end notes identifying the 20 contributors. The study group also ranked the importance of each topic, and those scores are displayed with each topic heading. A perfect score (and highest priority) is three, two is medium priority, and one is lowest priority. The highest scoring topic areas were software engineering and testing resources; the lowest scoring area was compliance to DOE standards. The following two paragraphs are an elevator speech summarizing the contributed elevator speeches. Each sentence or phrase in the summary is hyperlinked to its source via a numeral embedded in the text. A risk one-liner has also been added to each topic to allow future risk tracking and mitigation.
NASA Astrophysics Data System (ADS)
Nangia, Nishant; Patankar, Neelesh A.; Bhalla, Amneet P. S.
2017-11-01
Fictitious domain methods for simulating fluid-structure interaction (FSI) have been gaining popularity in the past few decades because of their robustness in handling arbitrarily moving bodies. Often the transient net hydrodynamic forces and torques on the body are desired quantities for these types of simulations. In past studies using immersed boundary (IB) methods, force measurements are contaminated with spurious oscillations due to the evaluation of possibly discontinuous spatial velocity or pressure gradients within or on the surface of the body. Based on an application of the Reynolds transport theorem, we present a moving control volume (CV) approach to computing the net forces and torques on a moving body immersed in a fluid. The approach is shown to be accurate for a wide array of FSI problems, including flow past stationary and moving objects, Stokes flow, and high Reynolds number free-swimming. The approach only requires far-field (smooth) velocity and pressure information, thereby suppressing spurious force oscillations and eliminating the need for any filtering. The proposed moving CV method is not limited to a specific IB method and is straightforward to implement within existing parallel FSI simulation software. This work is supported by NSF (Award Numbers SI2-SSI-1450374, SI2-SSI-1450327, and DGE-1324585), the US Department of Energy, Office of Science, ASCR (Award Number DE-AC02-05CH11231), and NIH (Award Number HL117163).
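As a rough sketch of the underlying momentum-balance idea, the function below evaluates the net x-direction force on a body from field samples on the four faces of a rectangular 2D control volume, neglecting viscous stresses; the signature, array layout, and inviscid simplification are all assumptions made for illustration and do not reproduce the paper's formulation.

# F_x = -d/dt(CV x-momentum) - (net x-momentum flux out) + (pressure force).
import numpy as np

def x_force_on_body(u_l, u_r, u_b, u_t, v_b, v_t, p_l, p_r,
                    rho, dx, dy, dmom_dt):
    # Momentum flux through the vertical faces (left in, right out)...
    flux = rho * np.sum(u_r**2 - u_l**2) * dy
    # ...and through the horizontal faces, carried by the normal velocity v.
    flux += rho * np.sum(u_t * v_t - u_b * v_b) * dx
    # Pressure pushes inward on the left face and opposes on the right.
    pressure = np.sum(p_l - p_r) * dy
    return -dmom_dt - flux + pressure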
NASA Project Develops Next-Generation Low-Emissions Combustor Technologies
NASA Technical Reports Server (NTRS)
Lee, Chi-Ming; Chang, Clarence T.; Herbon, John T.; Kramer, Stephen K.
2013-01-01
NASA's Environmentally Responsible Aviation (ERA) Project is working with industry to develop fuel-flexible combustor technologies for a new generation of low-emissions engines targeted for the 2020 timeframe. These new combustors will reduce nitrogen oxide (NOx) emissions to half of those of current state-of-the-art (SOA) combustors, while simultaneously reducing noise and fuel burn. The purpose of the low-NOx fuel-flexible combustor research is to advance the Technology Readiness Level (TRL) and Integration Readiness Level (IRL) of a low-NOx, fuel-flexible combustor to the point where it can be integrated into the next generation of aircraft. To reduce project risk and optimize research benefit, NASA chose to fund two rounds of Phase 1 contracts. The first Phase 1 contracts went to engine manufacturers: General Electric Company and Pratt & Whitney Company. The second Phase 1 contracts went to fuel injector manufacturers: Goodrich Corporation, Parker Hannifin Corporation, and Woodward Fuel System Technology. In 2012, two sector combustors were tested at NASA's ASCR. The results indicated 75% NOx emission reduction below the 2004 CAEP/6 regulation level.
Optical Diagnosis of Gas Turbine Combustors Being Conducted
NASA Technical Reports Server (NTRS)
Hicks, Yolanda R.; Locke, Randy J.; Anderson, Robert C.; DeGroot, Wilhelmus A.
2001-01-01
Researchers at the NASA Glenn Research Center, in collaboration with industry, are reducing gas turbine engine emissions by studying visually the air-fuel interactions and combustion processes in combustors. This is especially critical for next generation engines that, in order to be more fuel-efficient, operate at higher temperatures and pressures than the current fleet engines. Optically based experiments were conducted in support of the Ultra-Efficient Engine Technology program in Glenn's unique, world-class, advanced subsonic combustion rig (ASCR) facility. The ASCR can supply air and jet fuel at the flow rates, temperatures, and pressures that simulate the conditions expected in the combustors of high-performance, civilian aircraft engines. In addition, this facility is large enough to support true sectors ("pie" slices of a full annular combustor). Sectors enable one to test true shapes rather than rectangular approximations of the actual hardware. Therefore, there is no compromise to actual engine geometry. A schematic drawing of the sector test stand is shown. The test hardware is mounted just upstream of the instrumentation section. The test stand can accommodate hardware up to 0.76-m diameter by 1.2-m long; thus sectors or small full annular combustors can be examined in this facility. Planar (two-dimensional) imaging using laser-induced fluorescence and Mie scattering, chemiluminescence, and video imagery were obtained for a variety of engine cycle conditions. The hardware tested was a double annular sector (two adjacent fuel injectors aligned radially) representing approximately 15° of a full annular combustor. An example of the two-dimensional data obtained for this configuration is also shown. The fluorescence data show the location of fuel and hydroxyl radical (OH) along the centerline of the fuel injectors. The chemiluminescence data show C2 within the total observable volume. The top row of this figure shows images obtained at an engine low-power condition, and the bottom row shows data from a higher power operating point. The data show distinctly the differences in flame structure between low-power and high-power engine conditions, in both location and amount of species produced (OH, C2) or consumed (fuel). The unique capability of the facility coupled with its optical accessibility helps to eliminate the need for high-pressure performance extrapolations. Tests such as described here have been used successfully to assess the performance of fuel-injection concepts and to modify those designs, if needed.
Analyses of Mobilization Manpower Supply and Demand.
1982-03-01
Administrative Sciences Corp, Springfield, VA, March 1982; report ASCR134..., contract ...79-C-0527. Developed for use in identifying and quantifying issues in the CPAM process, and to employ the model for selected quantitative and qualitative analyses ... (nurses and corpsmen) to operate on a Commander FX Microcomputer, to be used by the Bureau of Medicine and Surgery to develop inputs for Navy-wide ...
High Pressure Low NOx Emissions Research: Recent Progress at NASA Glenn Research Center
NASA Technical Reports Server (NTRS)
Chi-Ming, Lee; Tacina, Kathleen M.; Wey, Changlie
2007-01-01
In collaboration with U.S. aircraft engine companies, NASA Glenn Research Center has contributed to the advancement of low emissions combustion systems. For the High Speed Research Program (HSR), a 90% reduction in nitrogen oxides (NOx) emissions (relative to the then-current state of the art) has been demonstrated in sector rig testing at General Electric Aircraft Engines (GEAE). For the Advanced Subsonic Technology Program (AST), a 50% reduction in NOx emissions relative to the 1996 International Civil Aviation Organization (ICAO) standards has been demonstrated in sector rigs at both GEAE and Pratt & Whitney (P&W). During the Ultra Efficient Engine Technology Program (UEET), a 70% reduction in NOx emissions, relative to the 1996 ICAO standards, was achieved in sector rig testing at Glenn in the world class Advanced Subsonic Combustion Rig (ASCR) and at contractor facilities. Low NOx combustor development continues under the Fundamental Aeronautics Program. To achieve these reductions, experimental and analytical research has been conducted to advance the understanding of emissions formation in combustion processes. Lean direct injection (LDI) concept development uses advanced laser-based non-intrusive diagnostics and analytical work to complement the emissions measurements and to provide guidance for concept improvement. This paper describes emissions results from flametube tests of a 9-injection-point LDI fuel/air mixer tested at inlet pressures up to 5500 kPa. Sample results from CFD and laser diagnostics are also discussed.
NASA Astrophysics Data System (ADS)
Ku, Seung-Hoe; Hager, R.; Chang, C. S.; Chacon, L.; Chen, G.; EPSI Team
2016-10-01
The cancellation problem has been a long-standing issue for long-wavelength modes in electromagnetic gyrokinetic PIC simulations in toroidal geometry. In an attempt to resolve this issue, we implemented a fully implicit time integration scheme in the full-f, gyrokinetic PIC code XGC1. The new scheme - based on the implicit Vlasov-Darwin PIC algorithm by G. Chen and L. Chacon - can potentially resolve the cancellation problem. The time advance for the field and the particle equations is space-time-centered, with particle sub-cycling. The resulting system of equations is solved by a Picard iteration solver with a fixed-point accelerator. The algorithm is implemented in the parallel velocity formalism instead of the canonical parallel momentum formalism. XGC1 specializes in simulating the tokamak edge plasma with magnetic separatrix geometry. A fully implicit scheme could be a path to accurate and efficient gyrokinetic simulations. We will test whether this numerical scheme overcomes the cancellation problem and reproduces the dispersion relation of Alfven waves and tearing modes in cylindrical geometry. Funded by US DOE FES and ASCR, and computing resources provided by OLCF through ALCC.
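To illustrate the iteration pattern only, here is a minimal Picard solver with simple Anderson(1) fixed-point acceleration applied to a toy problem; the actual implicit field-particle solve in XGC1 is far more involved, and this sketch makes no claim about its internals.

# Picard iteration x_{k+1} = G(x_k) with Anderson(1) mixing of the two most
# recent residuals to accelerate convergence toward the fixed point G(x) = x.
import numpy as np

def picard_anderson(G, x0, tol=1e-10, max_iter=100):
    x_prev, g_prev = x0, G(x0)
    x, g = g_prev, G(g_prev)  # one plain Picard step to start the history
    for _ in range(max_iter):
        if np.linalg.norm(g - x) < tol:
            return x
        # Anderson(1): theta minimizes the mixed residual in least squares.
        df = (g - x) - (g_prev - x_prev)
        denom = np.dot(df, df)
        theta = np.dot(g - x, df) / denom if denom > 0 else 0.0
        x_new = g - theta * (g - g_prev)
        x_prev, g_prev = x, g
        x, g = x_new, G(x_new)
    return x

# Toy usage: solve x = cos(x) component-wise (fixed point ~ 0.739085).
print(picard_anderson(np.cos, np.array([0.5, 1.0])))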
Thermal characterization of phacoemulsification probes operated in axial and torsional modes.
Zacharias, Jaime
2015-01-01
To analyze temperature increases and identify potential sources of heat generated when sleeved and sleeveless phacoemulsification probes were operated in axial and torsional modes using the Infiniti Vision System with the Ozil torsional handpiece. Phacodynamics Laboratory, Pasteur Ophthalmic Clinic, Santiago, Chile. Experimental study. Two computer-controlled thermal transfer systems were developed to evaluate the contribution of internal metal stress and tip-to-sleeve friction on heat generation during phacoemulsification using axial and torsional ultrasound modalities. Both systems incorporated infrared thermal imaging and used a black-body film to accurately capture temperature measurements. Axial mode was consistently associated with greater temperature increases than torsional mode whether tips were operated with or without sleeves. In tests involving bare tips, axial mode and torsional mode peaked at 51.7°C and 34.2°C, respectively. In an example using sleeved tips in which a 30.0 g load was applied for 1 second, temperatures for axial mode reached 45°C and for torsional mode, 38°C. Friction between the sleeved probe and the incisional wall contributed more significantly to the temperature increase than internal metal stress regardless of the mode used. In all experiments, the temperature increase observed with axial mode was greater than that observed with torsional mode, even when conditions such as power or amplitude and flow rate were varied. Tip-to-sleeve friction was a more dominant source of phaco probe heating than internal metal stress. The temperature increase due to internal metal stress was greater with axial mode than with torsional mode. Dr. Zacharias received research funding from Alcon Laboratories, Inc., to conduct this study. He has no financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Portable Load Measurement Device for Use During ARED Exercise on ISS
NASA Technical Reports Server (NTRS)
Hanson, A.; Peters, B.; Caldwell, E.; Sinka, J.; Kreutzburg, G.; Ploutz-Snyder, L.
2014-01-01
The Advanced Resistive Exercise Device (ARED) is unique countermeasure hardware available to crewmembers aboard the International Space Station (ISS), used for resistance exercise training to protect against bone and muscle loss during long-duration space missions. The ARED instrumentation system was designed to measure and record exercise load data, but reliably accurate data has not been available due to a defective force platform, and no ARED data has been recorded since mid-2011 due to failures in the instrumentation power system. ARED load data supports ongoing HRP-funded research and is available to extramural researchers through the LSDA Repository. Astronaut Strength, Conditioning, and Rehabilitation specialists (ASCRs) use ARED data to track training progress and advance exercise prescriptions, and ARED load data is necessary to fulfill medical requirements. An HRP-directed task intends to reduce program risk (HRP IRMA Risk 1735) and evaluate the XSENS ForceShoe(TM) as a means of obtaining ARED load data during exercise sessions. The XSENS ForceShoes will fly as a hardware demonstration to the ISS in May 2014 (39S). Additional portable load monitoring devices (PLMDs) are under evaluation in the ExPC Lab. PLMDs are favored over a platform redesign as they support future exploration needs.
Advanced Subsonic Combustion Rig
NASA Technical Reports Server (NTRS)
Lee, Chi-Ming
1998-01-01
Researchers from the NASA Lewis Research Center have obtained the first combustion/emissions data under extreme future engine operating conditions. In Lewis' new world-class 60-atm combustor research facility, the Advanced Subsonic Combustion Rig (ASCR), a flametube was used to conduct combustion experiments in environments as extreme as 900 psia and 3400 F. The greatest challenge for combustion researchers is the uncertainty of the effects of pressure on the formation of nitrogen oxides (NOx). Consequently, U.S. engine manufacturers are using these data to guide their future combustor designs. The flametube's metal housing has an inside diameter of 12 in. and a length of 10.5 in. The flametube can be used with a variety of different flow paths. Each flow path is lined with a high-temperature, castable refractory material (alumina) to minimize heat loss. Upstream of the flametube is the injector section, which has an inside diameter of 13 in. and a length of 0.5 in. It was designed to provide for quick changeovers. This flametube is being used to give all U.S. engine manufacturers early assessments of advanced combustion concepts at full-power conditions prior to engine production. To date, seven concepts from engine manufacturers have been evaluated and improved. This collaborative development can potentially give U.S. engine manufacturers the competitive advantage of being first to market with advanced low-emission technologies.
NASA Glenn High Pressure Low NOx Emissions Research
NASA Technical Reports Server (NTRS)
Tacina, Kathleen M.; Wey, Changlie
2008-01-01
In collaboration with U.S. aircraft engine companies, NASA Glenn Research Center has contributed to the advancement of low emissions combustion systems. For the High Speed Research Program (HSR), a 90% reduction in nitrogen oxides (NOx) emissions (relative to the then-current state of the art) has been demonstrated in sector rig testing at General Electric Aircraft Engines (GEAE). For the Advanced Subsonic Technology Program (AST), a 50% reduction in NOx emissions relative to the 1996 International Civil Aviation Organization (ICAO) standards has been demonstrated in sector rigs at both GEAE and Pratt & Whitney (P&W). During the Ultra Efficient Engine Technology Program (UEET), a 70% reduction in NOx emissions, relative to the 1996 ICAO standards, was achieved in sector rig testing at Glenn in the world class Advanced Subsonic Combustion Rig (ASCR) and at contractor facilities. Low NOx combustor development continues under the Fundamental Aeronautics Program. To achieve these reductions, experimental and analytical research has been conducted to advance the understanding of emissions formation in combustion processes. Lean direct injection (LDI) concept development uses advanced laser-based non-intrusive diagnostics and analytical work to complement the emissions measurements and to provide guidance for concept improvement. This paper describes emissions results from flametube tests of a 9-injection-point LDI fuel/air mixer tested at inlet pressures up to 5500 kPa. Sample results from CFD and laser diagnostics are also discussed.
Anatomy and physiology of the cornea.
DelMonte, Derek W; Kim, Terry
2011-03-01
The importance of the cornea to the ocular structure and visual system is often overlooked because of the cornea's unassuming transparent nature. The cornea lacks the neurobiological sophistication of the retina and the dynamic movement of the lens; yet, without its clarity, the eye would not be able to perform its necessary functions. The complexity of structure and function necessary to maintain such elegant simplicity is the wonder that draws us to one of the most important components of our visual system. Copyright © 2011 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Use of a hydrogel sealant in epithelial ingrowth removal after laser in situ keratomileusis.
Ramsook, Sandhya S; Hersh, Peter S
2015-12-01
We describe 2 cases in which clinically significant epithelial ingrowth was removed by debridement and followed by the use of a hydrogel sealant (Resure) to seal the flap edge. In both cases, the epithelial ingrowth was seen after otherwise uneventful laser in situ keratomileusis retreatment. The visual outcomes were good with no recrudescence of interface epithelium. Neither author has a financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Cataract surgery in ancient Egypt.
Blomstedt, Patric
2014-03-01
Ophthalmology was one of the most important specialties in Egyptian medicine, and more specialists are known in this field than in any other. This specialization seems, however, to have been of a purely noninvasive nature. Even though it has been claimed that cataract surgery was performed in pharaonic Egypt, careful analysis of the sources does not support the claim. No example of cataract surgery or of any other invasive ophthalmologic procedure can be found in the original sources. Copyright © 2014 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Operational and Research Musculoskeletal Summit: Summit Recommendations
NASA Technical Reports Server (NTRS)
Scheuring, Richard A.; Walton, Marlei; Davis-Street, Janis; Smaka, Todd J.; Griffin, DeVon
2006-01-01
The Medical Informatics and Health Care Systems group in the Office of Space Medicine at NASA Johnson Space Center (JSC) has been tasked by NASA with improving overall medical care on the International Space Station (ISS) and providing insights for medical care for future exploration missions. To accomplish this task, a three-day Operational and Research Musculoskeletal Summit was held on August 23-25, 2005, at Space Center Houston. The purpose of the summit was to review NASA's a) current strategy for preflight health maintenance and injury screening, b) current treatment methods in-flight, and c) risk mitigation strategy for musculoskeletal injuries or syndromes that could occur or impact the mission. Additionally, summit participants provided a list of research topics NASA should consider to mitigate risks to astronaut health. Prior to the summit, participants took part in a web-based pre-summit forum to review the NASA Space Medical Conditions List (SMCL) of musculoskeletal conditions that may occur on the ISS as well as the resources currently available to treat them. Data from the participants were compiled and integrated with the summit proceedings. Summit participants included experts from the extramural physician and researcher communities, and representatives from NASA Headquarters, the astronaut corps, the JSC Medical Operations and Human Adaptations and Countermeasures Offices, the Glenn Research Center Human Research Office, and the Astronaut Strength, Conditioning, and Reconditioning (ASCR) group. The recommendations in this document are based on a summary of summit discussions and the best possible evidence-based recommendations for musculoskeletal care for astronauts while on the ISS, and include recommendations for exploration-class missions.
Khoramnia, Ramin; Auffarth, Gerd U; Rabsilber, Tanja M; Holzer, Mike P
2012-11-01
We report a 66-year-old patient who presented with increasing hyperopia, astigmatism, and presbyopia in both eyes 8 years after bilateral laser in situ keratomileusis (LASIK) and LASIK enhancement in the left eye aiming for spectacle independence. Bilateral multifocal toric Lentis Mplus intraocular lenses (IOLs) with an embedded near segment and individually customized cylinder correction were implanted uneventfully following phacoemulsification. The Haigis-L formula after previous hyperopia correction was chosen for IOL power calculation and provided reliable results. Emmetropia was targeted and achieved. Three months postoperatively, the uncorrected distance visual acuity had increased from 0.40 logMAR to 0.10 logMAR in the right eye and from 0.20 logMAR to 0.00 logMAR in the left eye. The patient gained 6 lines of uncorrected near visual acuity: 0.20 logMAR in the right eye and 0.10 logMAR in the left eye. This case shows that customized premium IOL implantation can provide accurate results even in challenging cases. The International Vision Correction Research Centre, Department of Ophthalmology, University of Heidelberg, Heidelberg, Germany, has received research grants, lecture fees, and travel reimbursement from Oculentis GmbH. Copyright © 2012 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
George, Monica C; Lazer, Zane P; George, David S
2016-05-01
We present a technique that uses a near-point string to demonstrate the anticipated near point of multifocal and accommodating intraocular lenses (IOLs). Beads are placed on the string at distances corresponding to the near points for diffractive and accommodating IOLs. The string is held up to the patient's eye to demonstrate where each of the IOLs is likely to provide the best near vision. None of the authors has a financial or proprietary interest in any material or method mentioned. Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Simple approach to prevent capsule tear-out during capsulorhexis creation in hypermature cataracts.
Robinson, Mark S; Olson, Randall J
2015-07-01
Capsule tear-out in hypermature cataracts (Argentinean flag sign) is a common and frustrating complication during the creation of a capsulorhexis. Completing the capsulorhexis through a small side-port incision, filling the anterior chamber with a highly viscous ophthalmic viscosurgical device such as a viscoadaptive agent, and using a 23- or 25-gauge microcapsulorhexis forceps to fill the side-port incision can reliably prevent this complication. Neither author has a financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Active angular alignment of gauge blocks in double-ended interferometers.
Buchta, Zdeněk; Reřucha, Simon; Hucl, Václav; Cížek, Martin; Sarbort, Martin; Lazar, Josef; Cíp, Ondřej
2013-09-27
This paper presents a method implemented in a system for automatic contactless calibration of gauge blocks designed at ISI ASCR. The system combines low-coherence interferometry and laser interferometry: the former identifies the positions of the gauge block sides, and the latter measures the gauge block length itself. A crucial part of the system is the algorithm for aligning the gauge block to the measuring beam, which is able to compensate for lateral and longitudinal tilt of the gauge block of up to 0.141 mrad. The algorithm is also important for monitoring the gauge block position during the length measurement.
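For orientation, the 0.141 mrad tilt limit can be related to what the interferometer actually sees: in the usual small-angle two-beam picture, a reflecting face tilted by an angle alpha produces straight fringes with spacing s = lambda / (2 * alpha). The sketch below is only a rough illustration of that relation, not the ISI ASCR alignment algorithm; the He-Ne wavelength and the function names are assumptions.

```python
LAMBDA = 633e-9          # He-Ne laser wavelength in metres (assumed)
TILT_LIMIT_MRAD = 0.141  # compensation range quoted in the abstract

def tilt_from_fringe_spacing(spacing_m: float) -> float:
    """Small-angle tilt (mrad) of a reflecting face that produces
    straight fringes with the given spacing: alpha = lambda / (2 s)."""
    return LAMBDA / (2.0 * spacing_m) * 1e3

def within_compensation_range(spacing_m: float) -> bool:
    return tilt_from_fringe_spacing(spacing_m) <= TILT_LIMIT_MRAD

# A 3 mm fringe spacing corresponds to ~0.106 mrad of tilt -> correctable.
print(tilt_from_fringe_spacing(3e-3), within_compensation_range(3e-3))
```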
Active Angular Alignment of Gauge Blocks in Double-Ended Interferometers
Buchta, Zdeněk; Řeřucha, Šimon; Hucl, Václav; Čížek, Martin; Šarbort, Martin; Lazar, Josef; Číp, Ondřej
2013-01-01
This paper presents a method implemented in a system for automatic contactless calibration of gauge blocks designed at ISI ASCR. The system combines low-coherence interferometry and laser interferometry: the former identifies the positions of the gauge block sides, and the latter measures the gauge block length itself. A crucial part of the system is the algorithm for aligning the gauge block to the measuring beam, which is able to compensate for lateral and longitudinal tilt of the gauge block of up to 0.141 mrad. The algorithm is also important for monitoring the gauge block position during the length measurement. PMID:24084107
COMPARISON OF PRESSURE DROP PRODUCED BY SPIRAL WRAPS, COOKIE CUTTERS AND OTHER ROD BUNDLE SPACERS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noyes, R.C.
The problem of predicting pressure drop due to various fuel bundle spacers is considered in some detail. Three sets of experimental data are reviewed and presented in reduced form. These data are compared to several semitheoretical approaches to pressure drop prediction and a best method is selected to make the required predictions. The comparison between predictions for the ASCR spiral wrap spacer and cookie cutter spacer shows that both types of spacers produce about the same pressure drop. Spacer pressure drop is shown to be strongly dependent on spacer frontal area and pitch. (auth)
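The strong dependence on spacer frontal area noted above is what later semi-empirical spacer-loss models make explicit. In Rehme's widely used form, the spacer loss is dp = C_v * (A_s/A_v)^2 * rho * v^2 / 2, where A_s is the projected spacer frontal area and A_v the unrestricted bundle flow area. The sketch below illustrates that form only; it is not necessarily the "best method" selected in this report, and the modifier value C_v and the input numbers are assumed for illustration.

```python
def spacer_pressure_drop(rho, v_bundle, a_spacer, a_bundle, c_v=7.0):
    """Rehme-style spacer loss: dp = C_v * (A_s/A_v)**2 * rho * v**2 / 2.
    c_v ~ 6-7 is a typical turbulent-flow modifier (assumed here)."""
    eps = a_spacer / a_bundle        # blockage ratio: frontal area / flow area
    return c_v * eps**2 * 0.5 * rho * v_bundle**2

# Coolant at 750 kg/m^3, 5 m/s bundle velocity, 25% flow blockage:
dp = spacer_pressure_drop(rho=750.0, v_bundle=5.0, a_spacer=0.25, a_bundle=1.0)
print(f"spacer pressure drop ~ {dp:.0f} Pa")   # ~4100 Pa
```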
Snyder, Michael E; Lindsell, Luke B
2010-02-01
Puncturing the anterior capsule in a patient with a very soft lens, an elastic capsule, and/or deficient zonular countertraction can be challenging even with a sharp needle or blade. The crossed-swords, capsule-pinch technique capitalizes on opposing forces from 2 needles directed toward each other with a "pinch" of the capsule between their tips. This affords a controlled and facile puncture of the capsule without creating stress on the zonules or anteroposterior displacement of the lens. Copyright 2010 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Tipotsch-Maca, Saskia M; Varsits, Ralph M; Ginzel, Christian; Vecsei-Marlovits, Pia V
2016-01-01
To assess whether a multimedia-assisted preoperative informed consent procedure has an effect on patients' knowledge concerning cataract surgery, satisfaction with the informed consent process, and reduction in anxiety levels. Hietzing Hospital, Vienna, Austria. Prospective randomized controlled clinical trial. Patients participated in an informed consent procedure for age-related cataract surgery that included the standard approach only (reading the information brochure and having a standardized face-to-face discussion) or supplemented with a computer-animated video. The main outcome was information retention assessed by a questionnaire. Further outcome measures used were the State-Trait Anxiety Inventory, the Visual Function-14 score, and an assessment of satisfaction. The study included 123 patients (64 in standard-only group; 59 in computer-animated video group). Both groups scored well on the questionnaire; however, patients who watched the video performed better (82% retention versus 72%) (P = .002). Scores tended to decrease with increasing age (r = -0.25, P = .005); however, this decrease was smaller in the group that watched the video. Both groups had elevated anxiety levels (means in video group: anxiety concerning the current situation [S-anxiety] = 63.8 ± 9.6 [SD], general tendency toward anxiety [T-anxiety] = 65.5 ± 7.9; means in control group: S-anxiety = 61.9 ± 10.3, T-anxiety = 66.2 ± 7.8). A high level of information retention was achieved using an informed consent procedure consisting of an information brochure and a standardized face-to-face discussion. A further increase in information retention was achieved, even with increasing patient age, by adding a multimedia presentation. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Hong, F C; Devine, P; McDonald, J J; Cologne, K; Brady, R R W
2018-05-01
Engagement by medical professionals with social media (SM) is increasing. Variation is noted in engagement between SM platforms and between surgical specialities and geographical regions. We aimed to study SM engagement by colorectal surgeons attending an international conference. Surgeons were identified from the delegate list of the 2017 Annual Meeting of the American Society of Colon and Rectal Surgeons (ASCRS) and Tripartite Meeting (Seattle, Washington, USA). Delegates were searched on Twitter and LinkedIn for the presence of a matching profile. SM presence, activity, gender and geographical region were analysed. Two hundred and seventy (13.2%) surgeons had Twitter accounts and 994 (44.3%) had LinkedIn profiles. UK surgeons were more likely to be on Twitter than surgeons from elsewhere (23.4% vs 12.7%, P = 0.0072). Significant variation in SM membership between each geographical region was noted, with usage rates for Twitter of 18.1% in Europe, 14.4% in North America, 12.9% in South America, 4.3% in Oceania, 3.7% in Asia and 0% in Africa. A similar picture for LinkedIn is seen. The #ASCRS17 meeting saw the highest participation of users to date (979 participants, over 7000 individual tweets and nearly 14 million impressions). SM engagement by colorectal surgeons continues to increase. Significant geographical variation is noted, suggesting that SM's unique potential for education and networking may not yet be widely appreciated globally. Future work should include further analysis into tweet contents to gain insights and optimize the use of SM as an educational adjunct. Colorectal Disease © 2018 The Association of Coloproctology of Great Britain and Ireland.
Comparison of Newer IOL Power Calculation Methods for Eyes With Previous Radial Keratotomy
Ma, Jack X.; Tang, Maolong; Wang, Li; Weikert, Mitchell P.; Huang, David; Koch, Douglas D.
2016-01-01
Purpose To evaluate the accuracy of the optical coherence tomography–based (OCT formula) and Barrett True K (True K) intraocular lens (IOL) calculation formulas in eyes with previous radial keratotomy (RK). Methods In 95 eyes of 65 patients, using the actual refraction following cataract surgery as target refraction, the predicted IOL power for each method was calculated. The IOL prediction error (PE) was obtained by subtracting the predicted IOL power from the implanted IOL power. The arithmetic IOL PE and median refractive PE were calculated and compared. Results All formulas except the True K produced hyperopic IOL PEs at 1 month, which decreased at ≥4 months (all P < 0.05). For the double-K Holladay 1, OCT formula, True K, and average of these three formulas (Average), the median absolute refractive PEs were, respectively, 0.78 diopters (D), 0.74 D, 0.60 D, and 0.59 D at 1 month; 0.69 D, 0.77 D, 0.77 D, and 0.61 D at 2 to 3 months; and 0.34 D, 0.65 D, 0.69 D, and 0.46 D at ≥4 months. The Average produced significantly smaller refractive PE than did the double-K Holladay 1 at 1 month (P < 0.05). There were no significant differences in refractive PEs among formulas at 4 months. Conclusions The OCT formula and True K were comparable to the double-K Holladay 1 method on the ASCRS (American Society of Cataract and Refractive Surgery) calculator. The Average IOL power on the ASCRS calculator may be considered when selecting the IOL power. Further improvements in the accuracy of IOL power calculation in RK eyes are desirable. PMID:27409468
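The error bookkeeping in the Methods above is simple arithmetic and can be made concrete. The sketch below computes the two kinds of quantities reported (arithmetic IOL prediction error, and a median absolute refractive prediction error) on made-up numbers; the values are hypothetical and are not data from the study.

```python
import statistics

# IOL prediction error (PE), as defined in the abstract:
# implanted IOL power minus formula-predicted IOL power (diopters).
implanted = [21.0, 22.5, 19.0, 24.0]      # hypothetical, not study data
predicted = [20.5, 23.0, 18.0, 24.5]      # hypothetical formula output
iol_pe = [i - p for i, p in zip(implanted, predicted)]
print("arithmetic mean IOL PE:", statistics.mean(iol_pe), "D")   # 0.125 D

# Median absolute refractive PE over a set of per-eye refractive errors:
refractive_pe = [0.78, -0.30, 0.55, -0.74, 0.10]                 # hypothetical
print("median absolute refractive PE:",
      statistics.median(abs(e) for e in refractive_pe), "D")     # 0.55 D
```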
Pheromone-sensing neurons regulate peripheral lipid metabolism in Caenorhabditis elegans
Stieglitz, Jon; Locke, Tiffany T.; Zhang, Ying K.; Schroeder, Frank C.; Srinivasan, Supriya
2017-01-01
It is now established that the central nervous system plays an important role in regulating whole body metabolism and energy balance. However, the extent to which sensory systems relay environmental information to modulate metabolic events in peripheral tissues has remained poorly understood. In addition, it has been challenging to map the molecular mechanisms underlying discrete sensory modalities with respect to their role in lipid metabolism. In previous work our lab has identified instructive roles for serotonin signaling as a surrogate for food availability, as well as oxygen sensing, in the control of whole body metabolism. In this study, we now identify a role for a pair of pheromone-sensing neurons in regulating fat metabolism in C. elegans, which has emerged as a tractable and highly informative model to study the neurobiology of metabolism. A genetic screen revealed that GPA-3, a member of the Gα family of G proteins, regulates body fat content in the intestine, the major metabolic organ for C. elegans. Genetic and reconstitution studies revealed that the potent body fat phenotype of gpa-3 null mutants is controlled from a pair of neurons called ADL(L/R). We show that cAMP functions as the second messenger in the ADL neurons, and regulates body fat stores via the neurotransmitter acetylcholine, from downstream neurons. We find that the pheromone ascr#3, which is detected by the ADL neurons, regulates body fat stores in a GPA-3-dependent manner. We define here a third sensory modality, pheromone sensing, as a major regulator of body fat metabolism. The pheromone ascr#3 is an indicator of population density, thus we hypothesize that pheromone sensing provides a salient 'denominator' to evaluate the amount of food available within a population and to accordingly adjust metabolic rate and body fat levels. PMID:28545126
Pheromone-sensing neurons regulate peripheral lipid metabolism in Caenorhabditis elegans.
Hussey, Rosalind; Stieglitz, Jon; Mesgarzadeh, Jaleh; Locke, Tiffany T; Zhang, Ying K; Schroeder, Frank C; Srinivasan, Supriya
2017-05-01
It is now established that the central nervous system plays an important role in regulating whole body metabolism and energy balance. However, the extent to which sensory systems relay environmental information to modulate metabolic events in peripheral tissues has remained poorly understood. In addition, it has been challenging to map the molecular mechanisms underlying discrete sensory modalities with respect to their role in lipid metabolism. In previous work our lab has identified instructive roles for serotonin signaling as a surrogate for food availability, as well as oxygen sensing, in the control of whole body metabolism. In this study, we now identify a role for a pair of pheromone-sensing neurons in regulating fat metabolism in C. elegans, which has emerged as a tractable and highly informative model to study the neurobiology of metabolism. A genetic screen revealed that GPA-3, a member of the Gα family of G proteins, regulates body fat content in the intestine, the major metabolic organ for C. elegans. Genetic and reconstitution studies revealed that the potent body fat phenotype of gpa-3 null mutants is controlled from a pair of neurons called ADL(L/R). We show that cAMP functions as the second messenger in the ADL neurons, and regulates body fat stores via the neurotransmitter acetylcholine, from downstream neurons. We find that the pheromone ascr#3, which is detected by the ADL neurons, regulates body fat stores in a GPA-3-dependent manner. We define here a third sensory modality, pheromone sensing, as a major regulator of body fat metabolism. The pheromone ascr#3 is an indicator of population density, thus we hypothesize that pheromone sensing provides a salient 'denominator' to evaluate the amount of food available within a population and to accordingly adjust metabolic rate and body fat levels.
Preflight and In-Flight Exercise Conditions for Astronauts on the International Space Station
NASA Technical Reports Server (NTRS)
Guilliams, Mark E.; Nieschwitz, Bruce; Hoellen, David; Loehr, Jim
2011-01-01
The physiological demands of spaceflight require astronauts to have certain physical abilities. They must be able to perform routine and off-nominal physical work during flight and upon re-entry into a gravity environment, such as an Extra Vehicular Activity (EVA) or an emergency egress, to ensure mission success. To prepare the astronauts for their mission, a Wyle Astronaut Strength, Conditioning, and Rehabilitation specialist (ASCR) works individually with the astronauts to prescribe preflight strength and conditioning programs and in-flight exercise, utilizing Countermeasure Systems (CMS) exercise hardware. PURPOSE: To describe the preflight and in-flight exercise programs for ISS crewmembers. METHODS: Approximately 2 years before a scheduled launch, an ASCR is assigned to each astronaut and physical training (PT) is routinely scheduled. Preflight PT of astronauts consists of strength, aerobic, and general conditioning, employing the principles of periodization. Exercise programs are prescribed to the astronauts to account for their individual fitness levels, planned mission-specific tasks, areas of concern, and travel schedules. Additionally, astronauts receive instruction on how to operate CMS exercise hardware and receive training for microgravity-specific conditions. For example, astronauts are scheduled for training sessions on the International Space Station (ISS) treadmill (TVIS) and cycle ergometer (CEVIS), as well as the Advanced Resistive Exercise Device (ARED). In-flight programs are designed to maintain or even improve the astronauts' preflight levels of fitness, bone health, muscle strength, power, and aerobic capacity. In-flight countermeasure sessions are scheduled in 2.5 h blocks, six days a week, which include 1.5 h for resistive training and 1 h for aerobic exercise. CONCLUSIONS: Crewmembers reported the need for more scheduled time for preflight training. During flight, crewmembers have indicated that the in-flight exercise is sufficient, but they would like more reliable and capable hardware.
Sectioning a luxated intraocular lens inside the vitreous cavity.
Vilaplana, Daniel; Pazos, Marta
2013-07-01
We describe a new technique for sectioning an intraocular lens (IOL) inside the vitreous cavity. The IOL had a broken haptic and was accidentally luxated after a complicated cataract surgery with posterior capsule rupture. The primary indication to cut the IOL in half inside the vitreous cavity is to preserve the anterior capsule integrity, especially in a small-sized capsulotomy, allowing subsequent implantation of a new IOL in the sulcus with the optical zone captured in the capsulorhexis. Neither author has a financial or proprietary interest in any material or method mentioned. Copyright © 2013 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Advanced on-site power plant development technology program
NASA Technical Reports Server (NTRS)
Kemp, F. S.
1985-01-01
A 30-cell stack was tested for 7200 hours. At 6000 hours the stack was successfully refilled with acid with no loss of performance. A second stack containing the advanced Configuration B cell package was fabricated and assembled for testing in 1985. A 200-kW brassboard inverter was successfully evaluated, verifying the two-bridge ASCR circuit design. A fuel processing catalyst train was tested for 2000 hours, verifying the catalyst for use in a 200-kW development reformer. The development reformer was fabricated for evaluation in 1985. The initial test plan was prepared for a 200-kW verification test article.
Electron microscopic evaluation of a gold glaucoma micro shunt after explantation.
Berk, Thomas A; Tam, Diamond Y; Werner, Liliana; Mamalis, Nick; Ahmed, Iqbal Ike K
2015-03-01
We present a case of an explanted gold glaucoma micro shunt (GMS Plus) and the subsequent light and electron microscopic analyses. The shunt was implanted in a patient with medically refractory glaucoma. The intraocular pressure (IOP) was stable at 12 mm Hg 6 months postoperatively but spiked to 26 mm Hg 6 months later; membranous growth was visible on the implant gonioscopically. A second gold micro shunt was placed 2 years after the first. The IOP was 7 mm Hg 1 week postoperatively but increased to 23 mm Hg 3 weeks later; similar membranous growth was visible on this implant. One of the shunts was explanted, and light and scanning electron microscopic analyses revealed encapsulation around the shunt exterior and connective tissue invasion of the microstructure. This represents the first electron microscopic analysis of an explanted gold glaucoma micro shunt and the first unequivocal images of the fibrotic pseudo-capsule traversing its microchannels and fenestrations. Dr. Ahmed is a consultant to and has received research grants from Solx, Inc. No other author has a financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Tensile strength and failure mechanisms of tantalum at extreme strain rates
NASA Astrophysics Data System (ADS)
Hahn, Eric; Fensin, Saryu; Germann, Timothy; Meyers, Marc
Non-equilibrium molecular dynamics simulations are used to probe the tensile response of monocrystalline, bicrystalline, and nanocrystalline tantalum over six orders of magnitude of strain rate. Our analysis of the strain rate dependence of strength is extended to over nine orders of magnitude by bridging the present simulations to recent laser-driven shock experiments. Tensile strength shows a power-law dependence on strain rate over this wide range, with different relationships depending on the initial microstructure and active deformation mechanism. At high strain rates, multiple spall events nucleate independently and continue to occur until the spalled regions communicate by means of relaxation waves. Temperature plays a significant role in the reduction of spall strength, as the initial shock required to achieve such large strain rates also contributes to temperature rise, through pressure-volume work as well as visco-plastic heating, which leads to softening and sometimes melting upon release. At ultra-high strain rates, those approaching or exceeding the atomic vibrational frequency, spall strength saturates at the ultimate cohesive strength of the material. UC Research Laboratories Grant (09-LR-06-118456-MEYM); Department of Energy NNSA/SSAP (DE-NA0002080); DOE ASCR Exascale Co-design Center for Materials in Extreme Environments.
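The quoted power-law dependence of spall strength on strain rate, sigma ~ A * (strain rate)^m, is typically extracted by a least-squares fit in log-log space. The sketch below shows that generic procedure on made-up numbers; the data points and the resulting exponent are illustrative only and are not values from this work.

```python
import numpy as np

# Hypothetical (strain rate [1/s], spall strength [GPa]) pairs:
rate = np.array([1e6, 1e7, 1e8, 1e9])
strength = np.array([7.0, 9.0, 11.5, 14.8])

# Fit sigma = A * rate**m  <=>  log(sigma) = m*log(rate) + log(A)
m, logA = np.polyfit(np.log(rate), np.log(strength), 1)
print(f"power-law exponent m ~ {m:.3f}, prefactor A ~ {np.exp(logA):.3f}")
```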
Imaging late capsular block syndrome: ultrasound biomicroscopy versus Scheimpflug camera.
Kucukevcilioglu, Murat; Hurmeric, Volkan; Erdurman, Fazıl Cuneyt; Ceylan, Osman Melih
2011-11-01
We describe 2 patients with late capsular block syndrome whose anterior chamber morphology was evaluated with ultrasound biomicroscopy and Scheimpflug imaging before and after neodymium:YAG laser capsulotomy. Pretreatment ultrasound biomicroscopy examination showed significant capsular bag distension in both patients. Scheimpflug imaging failed to capture the posterior capsule displaced far behind the intraocular lens. Automatic anterior chamber depth measurements were incorrect with Scheimpflug imaging in 1 patient. Ultrasound biomicroscopy seems to be superior to Scheimpflug imaging in eyes with extremely distended capsular bags. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2011 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Lenticular meridional astigmatism secondary to iris mesectodermal leiomyoma.
Chalam, K V; Cutler Peck, Carolee M; Grover, Sandeep; Radhakrishnan, Ravi
2012-01-01
A 61-year-old African American man presented with decreased vision of 2 months' duration. Examination revealed significant lenticular astigmatism and a sectoral cataract as a result of an amelanotic iris lesion. Slitlamp optical coherence tomography (OCT) revealed angle crowding. An excisional biopsy was performed along with phacoemulsification in the right eye, with intraocular lens implantation for meridional lenticular astigmatism. Histopathology and immunohistochemistry confirmed a diagnosis of uveal mesectodermal leiomyoma. Lenticular astigmatism may be a subtle sign of an anterior segment tumor. Anterior segment slitlamp OCT is an effective tool for diagnosing as well as monitoring small interval changes in these types of tumors. Copyright © 2012 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nalu is a Sierra ToolKit (STK) based application module, and it has provided a set of "lessons learned" for the STK transition effort through its early adoption of STK. It makes use of the open-sourced Trilinos/Tpetra library. Through the investment of LDRD and ASCR projects, the Nalu code module has been extended beyond prototype status. Physics capability includes low-Mach, variable-density turbulent flow. The ongoing objective for Nalu is to facilitate partnerships with external organizations in order to extend code capability and knowledge; however, it is not intended to support routine CFD analysis. The targeted usage of this module is for non-NW applications that support work-for-others in the multiphysics energy sector.
Microwash or macrowash technique to maintain a clear cornea during cataract surgery.
Amjadi, Shahriar; Roufas, Athena; Figueira, Edwin C; Bhardwaj, Gaurav; Francis, Katherine E; Masselos, Katherine; Francis, Ian C
2010-09-01
We describe a technique of irrigating and thereby rapidly and effectively clearing the cornea of relatively large amounts of surface contaminants that reduce surgical visibility and may contribute to endophthalmitis. This technique is referred to as "macrowash." If the technique is required, it is usually at the commencement of cataract surgery, immediately after placement of the surgical drape. The technique not only saves time, but also reduces the volume of irrigating solution required by the "microwash" technique, which is traditionally carried out by the scrub nurse/surgical assistant using a Rycroft cannula attached to a 15 mL container of irrigating solution. Copyright (c) 2010 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Biomechanical and optical behavior of human corneas before and after photorefractive keratectomy.
Sánchez, Paolo; Moutsouris, Kyros; Pandolfi, Anna
2014-06-01
To evaluate numerically the biomechanical and optical behavior of human corneas and quantitatively estimate the changes in refractive power and stress caused by photorefractive keratectomy (PRK). Athineum Refractive Center, Athens, Greece, and Politecnico di Milano, Milan, Italy. Retrospective comparative interventional cohort study. Corneal topographies of 10 human eyes were taken with a scanning-slit corneal topographer (Orbscan II) before and after PRK. Ten patient-specific finite element models were created to estimate the strain and stress fields in the cornea in preoperative and postoperative configurations. The biomechanical response in postoperative eyes was computed by directly modeling the postoperative geometry from the topographer and by reproducing the corneal ablation planned for the PRK with a numerical reprofiling procedure. Postoperative corneas were more compliant than preoperative corneas. In the optical zone, corneal thinning decreased the mechanical stiffness, causing local resteepening and making the central refractive power more sensitive to variations in intraocular pressure (IOP). At physiologic IOP, the postoperative corneas had a mean 7% forward increase in apical displacement and a mean 20% increase in the stress components at the center of the anterior surface over the preoperative condition. Patient-specific numerical models of the cornea can provide quantitative information on the changes in refractive power and in the stress field caused by refractive surgery. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2014 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
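The ablation reproduced in the numerical reprofiling step above is, in the simplest myopic-PRK description, captured by the classic Munnerlyn approximation: central ablation depth in micrometres is roughly (optical-zone diameter in mm)^2 times the correction in diopters, divided by 3. The paper does not state that it used this rule; the sketch below is only an illustration of how a planned ablation might be parameterized, with the example numbers assumed.

```python
def munnerlyn_central_depth_um(zone_diameter_mm: float, correction_d: float) -> float:
    """Approximate central ablation depth (micrometres) for a myopic
    correction, using the classic Munnerlyn rule of thumb: S**2 * D / 3."""
    return zone_diameter_mm**2 * abs(correction_d) / 3.0

# A -4.00 D correction over a 6.0 mm optical zone removes ~48 um centrally.
print(munnerlyn_central_depth_um(6.0, -4.0))
```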
Comparison of newer IOL power calculation methods for post-corneal refractive surgery eyes
Wang, Li; Tang, Maolong; Huang, David; Weikert, Mitchell P.; Koch, Douglas D.
2015-01-01
Objective To compare the newer formulas, the optical coherence tomography-based intraocular lens (IOL) power formula (OCT formula) and the Barrett True-K formula (True-K), to the methods on the ASCRS calculator in eyes with previous myopic LASIK/PRK. Design Prospective case series. Participants One hundred four eyes of 80 patients who had previous myopic LASIK/PRK and subsequent cataract surgery and IOL implantation. Methods Using the actual refraction following cataract surgery as target refraction, the predicted IOL power for each method was calculated. The IOL prediction error (PE) was obtained by subtracting the predicted IOL power from the power of the IOL implanted. Main outcome measures Arithmetic IOL PEs, variances of mean arithmetic IOL PE, median refractive PE, and percent of eyes within 0.5 D and 1.0 D of refractive PE. Results The OCT formula produced a smaller variance of IOL PE than did the Wang-Koch-Maloney and Shammas methods (P<0.05). With the OCT, True-K No History, Wang-Koch-Maloney, Shammas, Haigis-L, and Average of these 5 formulas, respectively, the median refractive PEs were 0.35 D, 0.42 D, 0.51 D, 0.48 D, 0.39 D, and 0.35 D; the percentages of eyes within 0.5 D of refractive PE were 68.3%, 58.7%, 50.0%, 52.9%, 55.8%, and 67.3%; and within 1.0 D of refractive PE, 92.3%, 90.4%, 86.9%, 88.5%, 90.4%, and 94.2%. The OCT formula had a smaller refractive PE compared with Wang-Koch-Maloney and Shammas, and the Average approach produced a significantly smaller refractive PE than did all methods except OCT (all P<0.05). Conclusions The OCT and True-K No History are promising formulas. The ASCRS IOL calculator has been updated to include the OCT and Barrett True-K formulas. Trial registration Intraocular Lens Power Calculation After Laser Refractive Surgery Based on Optical Coherence Tomography (OCT IOL); Identifier: NCT00532051; www.ClinicalTrials.gov PMID:26459996
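The "percent of eyes within 0.5 D and 1.0 D" outcome above is a simple tally over the per-eye refractive prediction errors. A minimal sketch follows; the error values are made up for illustration and are not study data.

```python
def percent_within(errors, threshold):
    """Share of eyes whose absolute refractive prediction error (D)
    falls within the given threshold, as a percentage."""
    hits = sum(1 for e in errors if abs(e) <= threshold)
    return 100.0 * hits / len(errors)

refractive_pe = [0.12, -0.40, 0.75, -0.95, 0.30, 1.20]   # hypothetical
for t in (0.5, 1.0):
    print(f"within {t:.1f} D: {percent_within(refractive_pe, t):.1f}%")
```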
Cirocco, William C
2011-04-01
A restrictive covenant (RC) may be a cause for concern for any physician entering into a contractual agreement. This survey of graduates of colorectal surgery residency training programs aimed to determine whether this was an important issue among the members of this group. An electronic survey generated by the Young Surgeons' Committee of the American Society of Colon and Rectal Surgeons (ASCRS) regarding employment contracts was sent to all graduates of colorectal surgery residency training programs from the preceding 10-year period. The survey included 5 questions (including an open-ended question for those who experienced an adverse effect when a RC was enforced), followed by queries designed to generate demographic data. This was an anonymous survey with an option to enter the name of the respondent. There were a total of 157 responses to the survey of 630 sent (25%). Of the 132 respondents to the survey who had signed a contract, 67 respondents (53%) had signed a contract that contained a RC, and 24 of these 67 respondents (35%) subsequently changed employment. The RC was enforced for 15 of these 24 respondents (63%), resulting in an adverse effect for 8 of the 15 (53%). The age range of those adversely affected was 32 to 41 years (mean, 36.3 y); all but 1 (86%) had been in private practice, for 1 to 6 years (mean, 2.9 y). Most were men (71%) and all were married. The majority of survey respondents signed an employment agreement that contained a RC. Overall, 35% of the respondents subsequently changed employment, and the RC was enforced in most cases (63%), often with an adverse effect on career and life (53%). ASCRS members should consider the potentially devastating consequences of signing a contract that contains a RC, especially if they would be averse to changing geographic location should they face a change in employment.
OPENING REMARKS: Scientific Discovery through Advanced Computing
NASA Astrophysics Data System (ADS)
Strayer, Michael
2006-01-01
Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims, these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I’m afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about $70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC, and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department’s applied energy programs about risk assessment, optimization of complex systems - such as the national and regional electricity grid, carbon sequestration, virtual engineering, and the nuclear fuel cycle. The successes of the first five years of SciDAC have demonstrated the power of using advanced computing to enable scientific discovery. One measure of this success could be found in the President’s State of the Union address in which President Bush identified ‘supercomputing’ as a major focus area of the American Competitiveness Initiative. Funds were provided in the FY 2007 President’s Budget request to increase the size of the NERSC-5 procurement to between 100 and 150 teraflops, to upgrade the LCF Cray XT3 at Oak Ridge to 250 teraflops, and to acquire a 100-teraflop IBM BlueGene/P to establish the Leadership Computing Facility at Argonne. We believe that we are on a path to establish a petascale computing resource for open science by 2009. We must develop software tools, packages, and libraries as well as the scientific application software that will scale to hundreds of thousands of processors. Computer scientists from universities and the DOE’s national laboratories will be asked to collaborate on the development of the critical system software components such as compilers, light-weight operating systems, and file systems. Standing up these large machines will not be business as usual for ASCR.
We intend to develop a series of interconnected projects that identify cost, schedule, risks, and scope for the upgrades at the LCF at Oak Ridge, the establishment of the LCF at Argonne, and the development of the software to support these high-end computers. The critical first step in defining the scope of the project is to identify a set of early application codes for each leadership class computing facility. These codes will have access to the resources during the commissioning phase of the facility projects and will be part of the acceptance tests for the machines. Applications will be selected, in part, by breakthrough science, scalability, and ability to exercise key hardware and software components. Possible early applications might include climate models; studies of the magnetic properties of nanoparticles as they relate to ultra-high density storage media; the rational design of chemical catalysts, the modeling of combustion processes that will lead to cleaner burning coal, and fusion and astrophysics research. I have presented just a few of the challenges that we look forward to on the road to petascale computing. Our road to petascale science might be paraphrased by the quote from e e cummings, ‘somewhere I have never traveled, gladly beyond any experience . . .’
Vitrectomy-assisted phacoemulsification for lenticular coloboma.
Agarwal, Ashvin; Narang, Priya; Agarwal, Amar
2017-02-01
We describe a technique to prevent continuous vitreous hydration during phacoemulsification in eyes with lenticular coloboma. The hydration results from communication between the anterior and posterior chambers from the edges of the colobomatous defect. To avoid this, a valved trocar is placed at the pars plana site around the area of the lenticular defect, which allows a limited dry vitrectomy during phacoemulsification. Intermittent vitrectomy with a moderate cutting rate and low vacuum parameters accompanied by temporary halting of the phacoemulsification procedure prevents vitreous herniation into the anterior chamber and limits the extension of zonular compromise, facilitating safe phacoemulsification with appropriate capsule expansion and fixation devices and implantation of an intraocular lens. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Fishtail on a line technique for capsular tension ring insertion.
Rixen, Jordan J; Oetting, Thomas A
2014-07-01
We describe a capsular tension ring (CTR) insertion technique that is a modification of the previously described fishtail technique. A suture line is used to pull the leading eyelet out through the main incision to form the fish configuration. Similar to the fishtail technique, this insertion technique minimizes the risk for zonular damage or a capsule tear because the CTR is not dialed into the capsular bag. The advantage of the suture line is that it prevents overbending of the CTR during insertion through the main incision, which can occur with the traditional fishtail technique. Neither author has a financial or proprietary interest in any material or method mentioned. Copyright © 2014 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Rotational stability of 2 intraocular lenses with an identical design and different materials.
Draschl, Petra; Hirnschall, Nino; Luft, Nikolaus; Schuschitz, Sandra; Wiesinger, Jörg; Rigal, Karl; Findl, Oliver
2017-02-01
To evaluate the rotational stability of nontoric intraocular lenses (IOLs) of the same design and different materials. Vienna Institute for Research in Ocular Surgery, Department of Ophthalmology, Hanusch Hospital, Vienna, Austria. Prospective randomized case series. This study included cataract patients with a corneal astigmatism of less than 1.75 diopters measured with the IOLMaster 500. Each patient received a hydrophilic IOL (Pod Ay 26P) in 1 eye and a hydrophobic IOL (Podeye) in the other eye. One hour and 3 months postoperatively, retroillumination photographs were taken to assess the rotational stability of the IOLs. In addition, autorefraction, subjective refraction, and Purkinje meter measurements were performed at the 3-month follow-up. Eighty eyes of 40 patients were included in this study. The IOL rotation within the first 3 months was 2.4 ± 1.85 degrees (SD) in the hydrophilic IOL group and 1.6 ± 1.61 degrees in the hydrophobic IOL group; this difference was statistically significant (P = .016). The hydrophobic IOL was rotationally more stable than the hydrophilic IOL, although both IOLs provided good capsular bag stability. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Strayer, Michael
2008-07-01
Welcome to Seattle and the 2008 SciDAC Conference. This conference, the fourth in the series, is a continuation of the PI meetings we first began under SciDAC-1. I would like to start by thanking the organizing committee, and Rick Stevens in particular, for organizing this year's meeting. This morning I would like to look briefly at SciDAC, to give you a brief history of SciDAC and also look ahead to see where we plan to go over the next few years. I think the best description of SciDAC, at least the simulation part, comes from a quote from Dr Ray Orbach, DOE's Under Secretary for Science and Director of the Office of Science. In an interview that appeared in the SciDAC Review magazine, Dr Orbach said, `SciDAC is unique in the world. There isn't any other program like it anywhere else, and it has the remarkable ability to do science by bringing together physical scientists, mathematicians, applied mathematicians, and computer scientists who recognize that computation is not something you do at the end, but rather it needs to be built into the solution of the very problem that one is addressing'. Of course, that is extended not just to physical scientists, but also to biological scientists. This is a theme of computational science, this partnership among disciplines, which goes all the way back to the early 1980s and Ken Wilson. It's a unique thread within the Department of Energy. SciDAC-1, launched around the turn of the millennium, created a new generation of scientific simulation codes. It advocated building out mathematical and computing system software in support of science and a new collaboratory software environment for data. The original concept for SciDAC-1 had topical centers for the execution of the various science codes, but several corrections and adjustments were needed. The ASCR scientific computing infrastructure was also upgraded, providing the hardware facilities for the program. The computing facility that we had at that time was the big 3 teraflop/s center at NERSC and that had to be shared with the programmatic side supporting research across DOE. At the time, ESnet was just slightly over half a gig per sec of bandwidth; and the science being addressed was accelerator science, climate, chemistry, fusion, astrophysics, materials science, and QCD. We built out the national collaboratories from the ASCR office, and in addition we built Integrated Software Infrastructure Centers (ISICs). Of these, three were in applied mathematics, four in computer science (including a performance evaluation research center), and four were collaboratories or Grid projects having to do with data management. For science, there were remarkable breakthroughs in simulation, such as full 3D laboratory scale flame simulation. There were also significant improvements in application codes - from factors of almost 3 to more than 100 - and code improvement as people began to realize they had to integrate mathematics tools and computer science tools into their codes to take advantage of the parallelism of the day. The SciDAC data-mining tool, Sapphire, received a 2006 R&D 100 award. And the community as a whole worked well together and began building a publication record that was substantial. In 2006, we recompeted the program with similar goals - SciDAC-1 was very successful, and we wanted to continue that success and extend what was happening under SciDAC to the broader science community. We opened up the partnership to all of the Offices of Science and the NSF and the NNSA. 
The goal was to create comprehensive scientific computing software and the infrastructure for the software to enable scientific discovery in the physical, biological, and environmental sciences and take the simulations to an extreme scale, in this case petascale. We would also build out a new generation of data management tools. What we observed during SciDAC-1 was that the data and the data communities - both experimental data from large experimental facilities and observational data, along with simulation data - were expanding at a rate significantly faster than Moore's law. In the past few weeks, the FastBit indexing technology software tool for data analyses and data mining developed under SciDAC's Scientific Data Management project was recognized with an R&D 100 Award, selected by an independent judging panel and editors of R&D Magazine as one of the 100 most technologically significant products introduced into the marketplace over the past year. For SciDAC-2 we had nearly 250 proposals requesting a total of slightly over $1 billion in funding. Of course, we had nowhere near $1 billion. The facilities and the science we ended up with were not significantly different from what we had in SciDAC-1. But we had put in place substantially increased facilities for science. When SciDAC-1 was originally executed with the facilities at NERSC, there was significant impact on the resources at NERSC, because not only did we have an expanding portfolio of programmatic science, but we had the SciDAC projects that also needed to run at NERSC. Suddenly, NERSC was incredibly oversubscribed. With SciDAC-2, we had in place leadership-class computing facilities at Argonne with slightly more than half a petaflop and at Oak Ridge with slightly more than a quarter petaflop, with an upgrade planned at the end of this year for a petaflop. And we increased the production computing capacity at NERSC to 104 teraflop/s just so that we would not impact the programmatic research and so that we would have a startup facility for SciDAC. At the end of the summer, NERSC will be at 360 teraflop/s. Both the Oak Ridge system and the principal resource at NERSC are Cray systems; Argonne has a different architecture, an IBM Blue Gene/P. At the same time, ESnet has been built out, and we are on a path where we will have dual rings around the country, from 10 to 40 gigabits per second - a factor of 20 to 80 over what was available during SciDAC-1. The science areas include accelerator science and simulation, astrophysics, climate modeling and simulation, computational biology, fusion science, high-energy physics, petabyte high-energy/nuclear physics, materials science and chemistry, nuclear physics, QCD, radiation transport, turbulence, and groundwater reactive transport modeling and simulation. They were supported by new enabling technology centers and university-based institutes to develop an educational thread for the SciDAC program. There were four mathematics projects and four computer science projects; and under data management, we see a significant difference in that we are bringing up new visualization projects to support and sustain data-intensive science. When we look at the budgets, we see growth in the budget from just under $60 million for SciDAC-1 to just over $80 million for SciDAC-2. Part of the growth is due to bringing in NSF and NNSA as new partners, and some of the growth is due to some program offices increasing their investment in SciDAC, while other program offices are constant or have decreased their investment.
This is not a reflection of their priorities per se but, rather, a reflection of the budget process and the difficult times in Washington during the past two years. New activities are under way in SciDAC - the annual PI meeting has turned into what I would describe as the premier interdisciplinary computational science meeting, one of the best in the world. Doing interdisciplinary meetings is difficult because people tend to develop a focus for their particular subject area. But this is the fourth in the series; and since the first meeting in San Francisco, these conferences have been remarkably successful. For SciDAC-2 we also created an outreach magazine, SciDAC Review, which highlights scientific discovery as well as high-performance computing. It's been very successful in telling the non-practitioners what SciDAC and computational science are all about. The other new instrument in SciDAC-2 is an outreach center. As we go from computing at the terascale to computing at the petascale, we face the problem of narrowing our research community. The number of people who are 'literate' enough to compute at the terascale is more than the number of those who can compute at the petascale. To address this problem, we established the SciDAC Outreach Center to bring people into the fold and educate them as to how we do SciDAC, how the teams are composed, and what it really means to compute at scale. The resources I have mentioned don't come for free. As part of the HECRTF law of 2005, Congress mandated that the Secretary would ensure that leadership-class facilities would be open to everyone across all agencies. So we took Congress at its word, and INCITE is our instrument for making allocations at the leadership-class facilities at Argonne and Oak Ridge, as well as smaller allocations at NERSC. Therefore, the selected proposals are very large projects that are computationally intensive, that compute at scale, and that have a high science impact. An important feature is that INCITE is completely open to anyone - there is no requirement of DOE Office of Science funding, and proposals are rigorously reviewed for both the science and the computational readiness. In 2008, more than 100 proposals were received, requesting about 600 million processor-hours. We allocated just over a quarter of a billion processor-hours. Astrophysics, materials science, lattice gauge theory, and high energy and nuclear physics were the major areas. These were the teams that were computationally ready for the big machines and that had significant science they could identify. In 2009, there will be a significant increase in the amount of time to be allocated, over half a billion processor-hours. The deadline is August 11 for new proposals and September 12 for renewals. We anticipate a significant increase in the number of requests this year. We expect you - as successful SciDAC centers, institutes, or partnerships - to compete for and win INCITE program allocation awards. If you have a successful SciDAC proposal, we believe it will make you successful in the INCITE review. We have the expectation that you will be among those most prepared and most ready to use the machines and to compute at scale. Over the past 18 months, we have assembled a team to look across our computational science portfolio and to judge what are the 10 most significant science accomplishments. The ASCR office, as it goes forward with OMB, the new administration, and Congress, will be judged by the science we have accomplished.
All of our proposals - such as for increasing SciDAC, increasing applied mathematics, and so on - are tied to what we have accomplished in science. And so these 10 big accomplishments are key to establishing credibility for new budget requests. Tony Mezzacappa, who chaired the committee, will also give a presentation on the ranking of these top 10, how they got there, and what the science is all about. Here is the list - numbers 2, 5, 6, 7, 9, and 10 are all SciDAC projects.
1. Modeling the Molecular Basis of Parkinson's Disease (Tsigelny)
2. Discovery of the Standing Accretion Shock Instability and Pulsar Birth Mechanism in a Core-Collapse Supernova Evolution and Explosion (Blondin)
3. Prediction and Design of Macromolecular Structures and Functions (Baker)
4. Understanding How a Lifted Flame Is Stabilized in a Hot Coflow (Yoo)
5. New Insights from LCF-Enabled Advanced Kinetic Simulations of Global Turbulence in Fusion Systems (Tang)
6. High Transition Temperature Superconductivity: A High-Temperature Superconductive State and a Pairing Mechanism in the 2-D Hubbard Model (Scalapino)
7. PETSc: Providing the Solvers for DOE High-Performance Simulations (Smith)
8. Via Lactea II, a Billion-Particle Simulation of the Dark Matter Halo of the Milky Way (Madau)
9. Probing the Properties of Water through Advanced Computing (Galli)
10. First Provably Scalable Maxwell Solver Enables Scalable Electromagnetic Simulations (Kolev)
So, what's the future going to look like for us? The office is putting together an initiative with the community, which we call the E3 Initiative. We're looking at a 10-year horizon for what's going to happen. Through the series of town hall meetings, in which many of you participated, we have produced a document on 'Transforming Energy, the Environment and Science through Simulations at the eXtreme Scale'; it can be found at http://www.science.doe.gov/ascr/ProgramDocuments/TownHall.pdf. We sometimes call it the Exascale initiative. Exascale computing is the gold-ring level of computing that seems just out of reach; but if we work hard and stretch, we just might be able to reach it. We envision that there will be a SciDAC-X, working at the extreme scale, with SciDAC teams that will perform and carry out science in areas that will have a great societal impact, such as alternative fuels and transportation, combustion, climate, fusion science, high-energy physics, advanced fuel cycles, carbon management, and groundwater. We envision institutes for applied mathematics and computer science that will probably segue into algorithms because, at the extreme scale, we see the applied math, the algorithm per se, and its implementation in computer science as inseparable. We envision an INCITE-X with multi-petaflop platforms, perhaps even exaflop computing resources. ESnet will be best in class - our 10-year plan calls for having 400 terabits per second of capacity available in dual rings around the country, an enormously fast data communications network for moving large amounts of data. In looking at where we've been and where we are going, we can see that the gigaflops and teraflops era was a regime where we were following Moore's law through advances in clock speed. In the current regime, we're introducing massive parallelism, which I think is exemplified by Intel's announcement of their teraflop chip, where they envision more than a thousand cores on a chip.
But in order to reach the exascale, extrapolations based on current architectures point to machines that would require 100 megawatts of power. It is clearly going to require novel architectures, things we have perhaps not yet envisioned. It is of course an era of challenge. There will be an unpredictable evolution of hardware if we are to reach the exascale; and there will clearly be multilevel heterogeneous parallelism, including multilevel memory hierarchies. We have no idea right now what programming models will be needed to execute at such an extreme scale. We have been incredibly successful at the petascale - we know that already. Managing data and just getting communications to scale is an enormous challenge. And it's not just the extreme scaling; it's the rapid increase in complexity that represents the challenge. Let me end with a metaphor. In previous meetings we have talked about the road to petascale. Indeed, we have seen in hindsight that it was a road well traveled. But perhaps the road to exascale is not a road at all. Perhaps the metaphor will be akin to scaling the south face of K2. That's clearly not something all of us will be able to do, and probably computing at the exascale is not something all of us will do. But if we achieve that goal, perhaps the words of Emily Dickinson will best summarize where we will be. Perhaps in her words, looking backward and down, you will say: I climb the 'Hill of Science,' / I view the landscape o'er; / Such transcendental prospect / I ne'er beheld before!
Trocar anterior chamber maintainer: Improvised infusion technique.
Agarwal, Amar; Narang, Priya; Kumar, Dhivya A; Agarwal, Ashvin
2016-02-01
We present an improvised technique of infusion that uses a trocar cannula as an anterior chamber maintainer (ACM). Although routinely used in posterior segment surgery, the trocar cannula has been infrequently used in complex anterior segment procedures. The trocar ACM creates a transconjunctival biplanar wound of appropriate size that is self-sealing and overcomes the shortcomings of an ACM, such as spontaneous extrusion and forced introduction into the eye from variability in the size of the corneal paracentesis incision. Constant infusion inflow through the trocar ACM is used to maintain positive intraocular pressure through a self-sealing sclerotomy incision at the limbus. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Simple steep-axis marking technique using a corneal analyzer.
Ng, Alex L K; Chan, Tommy C Y; Jhanji, Vishal; Cheng, George P M
2017-02-01
We describe a simple steep-axis marking technique that uses a corneal analyzer (OPD III scan) during arcuate keratotomy in femtosecond laser-assisted cataract surgery. The technique requires a single reference mark at the limbus, which does not have to be on the horizontal axis. Using the corneal analyzer, the angle between the steep axis and the reference line joining the reference mark and the center of the cornea can be determined. The angle from the reference mark is then used intraoperatively to locate the steep axis. This eliminates the potential error from different head positions during keratometry measurement and during traditional marking under the slitlamp. The marking technique can also be applied to toric intraocular lens implantation during cataract surgery. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Corneal collagen crosslinking and pigment dispersion syndrome.
LaHood, Benjamin R; Moore, Sacha
2017-03-01
We describe the case of a keratoconus patient with pigment dispersion syndrome (PDS) who was treated for progressive corneal ectasia with corneal collagen crosslinking (CXL). Pigment dispersion syndrome has been shown to have associated morphologic changes of the corneal endothelium. Corneal CXL has the potential to cause toxicity to the corneal endothelium, and adjacent pigment might increase the likelihood of damage. In this case, the presence of PDS had no detrimental effect on the outcome of treatment, and no complications were observed at 12 months of follow-up, indicating that it may be safe to perform corneal CXL in the setting of PDS. This is an important observation as the number of indications for corneal CXL grows. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Improving the trust in results of numerical simulations and scientific data analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cappello, Franck; Constantinescu, Emil; Hovland, Paul
This white paper investigates several key aspects of the trust that a user can give to the results of numerical simulations and scientific data analytics. In this document, the notion of trust is related to the integrity of numerical simulations and data analytics applications. This white paper complements the DOE ASCR report on Cybersecurity for Scientific Computing Integrity by (1) exploring the sources of trust loss; (2) reviewing the definitions of trust in several areas; (3) providing numerous cases of result alteration, some of them leading to catastrophic failures; (4) examining the current notion of trust in numerical simulation and scientific data analytics; (5) providing a gap analysis; and (6) suggesting two important research directions and their respective research topics. To simplify the presentation without loss of generality, we consider that trust in results can be lost (or the results' integrity impaired) because of any form of corruption happening during the execution of the numerical simulation or the data analytics application. In general, the sources of such corruption are threefold: errors, bugs, and attacks. Current applications are already using techniques to deal with different types of corruption. However, not all potential corruptions are covered by these techniques. We firmly believe that the current level of trust that a user has in the results is at least partially founded on ignorance of this issue or the hope that no undetected corruptions will occur during the execution. This white paper explores the notion of trust and suggests recommendations for developing a more scientifically grounded notion of trust in numerical simulation and scientific data analytics. We first formulate the problem and show that it goes beyond previous questions regarding the quality of results, such as verification and validation (V&V), uncertainty quantification, and data assimilation. We then explore the complexity of this difficult problem, and we sketch complementary general approaches to address it. This paper does not focus on the trust that the execution will actually complete. The product of simulation or of data analytics executions is the final element of a potentially long chain of transformations, where each stage has the potential to introduce harmful corruptions. These corruptions may produce results that deviate from the user-expected accuracy without notifying the user of this deviation. There are many potential sources of corruption before and during the execution; consequently, in this white paper we do not focus on the protection of the end result after the execution.
Laser-assisted marking for toric intraocular lens alignment.
Dick, H Burkhard; Schultz, Tim
2016-01-01
We describe a technique of 3-dimensional spectral-domain optical coherence tomography-controlled laser-assisted corneal marking for toric intraocular lens implantation. To facilitate accurate alignment, the technique creates 2 perpendicular intrastromal incisions (width 0.75 mm) using an image-guided femtosecond laser. This was performed in a case series comprising 10 eyes of 10 patients. No posterior corneal perforation or epithelial alterations occurred. The incisions were plainly visible under the operating microscope, and no optical phenomena were reported 6 weeks after surgery. Laser-assisted marking can be performed safely and has the potential to enable precise axis marking. Dr. Dick is a paid consultant to Abbott Medical Optics, Inc. Dr. Schultz has no financial or proprietary interest in any material or method mentioned. Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Parkash, Rohit Om; Mahajan, Shruti; Parkash, Tushya Om; Nayak, Vittal
2017-01-01
We describe a technique for performing safe phacoemulsification of a Morgagnian cataract using the intraocular lens (IOL) scaffold. An IOL scaffold has been used in cases in which posterior capsule rupture has occurred, leaving nonemulsified nuclear pieces. The scaffold provides a barrier that prevents the nuclear fragments from falling posteriorly into the vitreous cavity. Our technique uses the IOL as a scaffold to prevent the vulnerable posterior capsule from rupturing during nuclear emulsification in Morgagnian cataract. The technique prevents rupture of the floppy posterior capsule by providing a constant support to it. The scaffold provides stable inflation of the capsular bag and prevents inadvertent emulsification. Concurrently, it prevents dehiscence of weak zonular fibers by minimizing the stress on the zonular apparatus. Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Mifflin, Mark D; Kinard, Krista; Neuffer, Marcus C
2012-06-01
Anterior stromal pocket hydration was compared with conventional hydration for preventing wound leak after 2.8 mm uniplanar clear corneal incisions (CCIs) in patients having routine cataract surgery. Conventional hydration involves hydration of the lateral walls of the main incision with visible whitening of the stroma. The anterior stromal pocket hydration technique involves creation of an additional supraincisional stromal pocket overlying the main incision, which is then hydrated instead of the main incision. Sixty-six eyes of 48 patients were included in the data analysis with 33 assigned to each study group. The anterior stromal pocket hydration technique was significantly better than conventional hydration in preventing wound leak due to direct pressure on the posterior lip of the incision. Copyright © 2012 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Canovas, Carmen; van der Mooren, Marrie; Rosén, Robert; Piers, Patricia A; Wang, Li; Koch, Douglas D; Artal, Pablo
2015-05-01
To determine the impact of the equivalent refractive index (ERI) on intraocular lens (IOL) power prediction for eyes with previous myopic laser in situ keratomileusis (LASIK) using custom ray tracing. AMO B.V., Groningen, the Netherlands, and the Department of Ophthalmology, Baylor College of Medicine, Houston, Texas, USA. Retrospective data analysis. The ERI was calculated individually from the post-LASIK total corneal power. Two methods of accounting for the posterior corneal surface were tested: calculation from pre-LASIK data and calculation from post-LASIK data only. Four IOL power predictions were generated using a computer-based ray-tracing technique, including individual ERI results from both calculation methods, a mean ERI over the whole population, and the ERI for normal patients. For each patient, the IOL power results calculated from the four predictions as well as those obtained with the Haigis-L formula were compared with the optimum IOL power calculated after cataract surgery. The study evaluated 25 patients. The mean and range of ERI values determined using post-LASIK data were similar to those determined from pre-LASIK data. Introducing an individual or an average ERI in the ray-tracing IOL power calculation procedure resulted in mean IOL power errors that were not significantly different from zero. The ray-tracing procedure that includes an average ERI gave a greater percentage of eyes with an IOL power prediction error within ±0.5 diopter than the Haigis-L (84% versus 52%). For IOL power determination in post-LASIK patients, custom ray tracing including a modified ERI was an accurate procedure that exceeded the current standards for normal eyes. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Mismatch Negativity Encoding of Prediction Errors Predicts S-ketamine-Induced Cognitive Impairments
Schmidt, André; Bachmann, Rosilla; Kometer, Michael; Csomor, Philipp A; Stephan, Klaas E; Seifritz, Erich; Vollenweider, Franz X
2012-01-01
Psychotomimetics like the N-methyl-D-aspartate receptor (NMDAR) antagonist ketamine and the 5-hydroxytryptamine 2A receptor (5-HT2AR) agonist psilocybin induce psychotic symptoms in healthy volunteers that resemble those of schizophrenia. Recent theories of psychosis posit that aberrant encoding of prediction errors (PE) may underlie the expression of psychotic symptoms. This study used a roving mismatch negativity (MMN) paradigm to investigate whether the encoding of PE is affected by pharmacological manipulation of NMDAR or 5-HT2AR, and whether the encoding of PE under placebo can be used to predict drug-induced symptoms. Using a double-blind within-subject placebo-controlled design, S-ketamine and psilocybin, respectively, were administered to two groups of healthy subjects. Psychological alterations were assessed using a revised version of the Altered States of Consciousness (ASC-R) questionnaire. As an index of PE, we computed changes in MMN amplitudes as a function of the number of preceding standards (MMN memory trace effect) during a roving paradigm. S-ketamine, but not psilocybin, disrupted PE processing as expressed by a frontally disrupted MMN memory trace effect. Although both drugs produced positive-like symptoms, the extent of PE processing under placebo only correlated significantly with the severity of cognitive impairments induced by S-ketamine. Our results suggest that the NMDAR, but not the 5-HT2AR system, is implicated in PE processing during the MMN paradigm, and that aberrant PE signaling may contribute to the formation of cognitive impairments. The assessment of the MMN memory trace in schizophrenia may allow detecting early phases of the illness and might also serve to assess the efficacy of novel pharmacological treatments, in particular of cognitive impairments. PMID:22030715
NASA Astrophysics Data System (ADS)
Anantharaj, Valentine; Norman, Matthew; Evans, Katherine; Taylor, Mark; Worley, Patrick; Hack, James; Mayer, Benjamin
2014-05-01
During 2013, high-resolution climate model simulations accounted for over 100 million "core hours" using Titan at the Oak Ridge Leadership Computing Facility (OLCF). The suite of climate modeling experiments, primarily using the Community Earth System Model (CESM) at nearly 0.25-degree horizontal resolution, generated over a petabyte of data and nearly 100,000 files, ranging in size from 20 MB to over 100 GB. Effective utilization of leadership-class resources requires careful planning and preparation. The application software, such as CESM, needs to be ported, optimized, and benchmarked for the target platform in order to meet the computational readiness requirements. The model configuration needs to be "tuned and balanced" for the experiments. This can be a complicated and resource-intensive process, especially for high-resolution configurations using complex physics. The volume of I/O also increases with resolution, and new strategies may be required to manage I/O, especially for large checkpoint and restart files that may require more frequent output for resiliency. It is also essential to monitor the application performance during the course of the simulation exercises. Finally, the large volume of data needs to be analyzed to derive the scientific results, and appropriate data and information delivered to the stakeholders. Titan is currently the largest supercomputer available for open science. The computational resources, in terms of "titan core hours," are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) and ASCR Leadership Computing Challenge (ALCC) programs, both sponsored by the U.S. Department of Energy (DOE) Office of Science. Titan, a Cray XK7 system capable of a theoretical peak performance of over 27 PFlop/s, consists of 18,688 compute nodes, with an NVIDIA Kepler K20 GPU and a 16-core AMD Opteron CPU in every node, for a total of 299,008 Opteron cores and 18,688 GPUs offering a cumulative 560,640 equivalent cores. Scientific applications, such as CESM, are also required to demonstrate a "computational readiness capability" to efficiently scale across and utilize 20% of the entire system. The 0.25-degree configuration of the spectral element dynamical core of the Community Atmosphere Model (CAM-SE), the atmospheric component of CESM, has been demonstrated to scale efficiently across more than 5,000 nodes (80,000 CPU cores) on Titan. The tracer transport routines of CAM-SE have also been ported to take advantage of the hybrid many-core architecture of Titan using GPUs [see EGU2014-4233], yielding over 2X speedup when transporting over 100 tracers. The high-throughput I/O in CESM, based on the Parallel IO Library (PIO), is being further augmented to support even higher resolutions and enhance resiliency. The application performance of the individual runs is archived in a database and routinely analyzed to identify and rectify performance degradation during the course of the experiments. The various resources available at the OLCF now support a scientific workflow to facilitate high-resolution climate modeling. A high-speed center-wide parallel file system, called ATLAS, capable of 1 TB/s, is available on Titan as well as on the clusters used for analysis (Rhea) and visualization (Lens/EVEREST). Long-term archiving is facilitated by the HPSS storage system. The Earth System Grid (ESG), featuring search & discovery, is also used to deliver data.
The end-to-end workflow allows OLCF users to efficiently share data and publish results in a timely manner.
NASA Astrophysics Data System (ADS)
Strayer, Michael
2009-07-01
Welcome to San Diego and the 2009 SciDAC conference. Over the next four days, I would like to present an assessment of the SciDAC program. We will look at where we've been, how we got to where we are, and where we are going in the future. Our vision is to be first in computational science, to be best in class in modeling and simulation. When Ray Orbach asked me what I would do, in my job interview for the SciDAC Director position, I said we would achieve that vision. And with our collective dedicated efforts, we have managed to achieve this vision. In the past year, we have gained the most powerful supercomputer for open science, Jaguar, the Cray XT system at the Oak Ridge Leadership Computing Facility (OLCF). We also have NERSC, probably the world's best program for the scientific productivity on which the Office of Science so depends. And the Argonne Leadership Computing Facility offers architectural diversity with its IBM Blue Gene/P system as a counterbalance to Oak Ridge. There is also ESnet, which is often understated - the 40 gigabit per second dual backbone ring that connects all the labs and many DOE sites. With the President's Recovery Act funding, there is exciting news: ESnet is going to build out a 100 gigabit per second network using new optical technologies. That is very good news for simulations and large-scale scientific facilities. But as one noted SciDAC luminary said, it's not all about the computers - it's also about the science - and we are also achieving our vision in this area. Along with fielding the fastest supercomputer for science, SciDAC researchers won two ACM Gordon Bell Prizes at the SC08 conference for the outstanding performance of their applications. The DCA++ code, which solves some very interesting problems in materials, achieved a sustained performance of 1.3 petaflops, an astounding result and a mark I suspect will last for some time. The LS3DF application for studying nanomaterials required the development of a novel algorithm to produce results up to 400 times faster than a similar application, and it was recognized with a prize for algorithm innovation - a remarkable achievement. Day one of our conference will include examples of petascale science enabled at the OLCF. Although Jaguar has not been officially commissioned, it has gone through its acceptance tests, and during its shakedown phase the pioneer applications used for those tests have been running at scale. These include applications in the areas of astrophysics, biology, chemistry, combustion, fusion, geosciences, materials science, nuclear energy, and nuclear physics. We also have a whole compendium of science we do at our facilities; these were documented and reviewed at our last SciDAC conference. Many of these were highlighted in our Breakthroughs Report. One session at this week's conference will feature a cross-section of these breakthroughs. In the area of scalable electromagnetic simulations, the Auxiliary-space Maxwell Solver (AMS) uses specialized finite element discretizations and multigrid-based techniques, which decompose the original problem into easier-to-solve subproblems. Congratulations to the mathematicians on this. Another item on the list of breakthroughs was PETSc, which provides scalable solvers used in many DOE applications and has solved problems with over 3 billion unknowns, scaling to over 16,000 processors on DOE leadership-class computers.
This is becoming a very versatile and useful toolkit for achieving performance at scale. With the announcement of SIAM's first class of Fellows, we are remarkably well represented. Of the group of 191, more than 40 of these Fellows are in the 'DOE space.' We are delighted that SIAM has recognized them for their many achievements. In the coming months, we will illustrate our leadership in applied math and computer science by looking at our contributions in the areas of programming models, development and performance tools, math libraries, system software, collaboration, and visualization and data analytics. This is a large and diverse portfolio. We have asked for two panels, one chaired by David Keyes and composed of many of the nation's leading mathematicians, to produce a report on the most significant accomplishments in applied mathematics over the last eight years, taking us back to the start of the SciDAC program. In addition, we have a similar panel in computer science to be chaired by Kathy Yelick. They are going to identify the computer science accomplishments of the past eight years. These accomplishments are difficult to get a handle on, and I'm looking forward to this report. We will also have a follow-on to our report on breakthroughs in computational science, and this will also go back eight years, looking at the many accomplishments under the SciDAC and INCITE programs. This will be chaired by Tony Mezzacappa. So, where are we going in the SciDAC program? It might help to take a look at computational science and how it got started. I go back to Ken Wilson, who made the model and has written on computational science and computational science education. His model was this: the computational scientist plays the role of the experimentalist, the math and CS researchers play the role of theorists, and the computers themselves are the experimental apparatus. In simulation science, we are carrying out numerical experiments on the nature of the physical and biological sciences. Peter Lax, in the same time frame, developed a report on large-scale computing in science and engineering. Peter remarked, 'Perhaps the most important applications of scientific computing come not in the solution of old problems, but in the discovery of new phenomena through numerical experimentation.' And in the early years, I think the person who provided the most guidance, the most innovation, and the most vision for where the future might lie was Ed Oliver. Ed Oliver died last year. Ed accomplished a great deal in science; he knew exactly what to do, but he preferred to stay out of the limelight so that others could enjoy the fruits of his vision. We in the SciDAC program and ASCR Facilities are still enjoying the benefits of his vision. We will miss him. Twenty years after Ken Wilson, Ray Orbach laid out the fundamental premise for SciDAC in an interview that appeared in SciDAC Review: 'SciDAC is unique in the world. There isn't any other program like it anywhere else, and it has the remarkable ability to do science by bringing together physical scientists, mathematicians, applied mathematicians, and computer scientists who recognize that computation is not something you do at the end, but rather it needs to be built into the solution of the very problem that one is addressing.
' As you look at the Lax report from 1982, it talks about how 'Future significant improvements may have to come from architectures embodying parallel processing elements—perhaps several thousands of processors.' And it continues, 'Research in languages, algorithms and numerical analysis will be crucial in learning to exploit these new architectures fully.' In the early '90s, Sterling, Messina, and Smith developed a workshop report on petascale computing and concluded, 'A petaflops computer system will be feasible in two decades, or less, and rely in part on the continual advancement of the semiconductor industry both in speed enhancement and cost reduction through improved fabrication processes.' So they were not wrong, and today we are embarking on a forward look that is at a different scale, the exascale, going to 10^18 flops. In 2007, Stevens, Simon, and Zacharia chaired a series of town hall meetings looking at exascale computing, and in their report wrote, 'Exascale computer systems are expected to be technologically feasible within the next 15 years, or perhaps sooner. These systems will push the envelope in a number of important technologies: processor architecture, scale of multicore integration, power management and packaging.' The concept of computing on the Jaguar computer involves hundreds of thousands of cores, as do the IBM systems that are currently out there. So the scale of computing with systems with billions of processors is staggering to me, and I don't know how the software and math folks feel about it. We have now embarked on a road toward extreme scale computing. We have created a series of town hall meetings, and we are now in the process of holding workshops that address what I call, in DOE speak, 'the mission need' - the scientific justification for computing at that scale. We are going to have a total of 13 workshops. The workshops on climate, high energy physics, nuclear physics, fusion, and nuclear energy have been held. The report from the workshop on climate is out and available, and the other reports are being completed. The upcoming workshops are on biology, materials, and chemistry; workshops that engage science for nuclear security are a partnership between NNSA and ASCR. There are additional workshops on the applied math, computer science, and architecture needed for computing at the exascale. These extreme scale workshops will provide the foundation for our office, the Office of Science, the NNSA, and DOE, and we will engage the National Science Foundation and the Department of Defense as partners. We envision a 10-year program for an exascale initiative. It will be an integrated R&D program initially - you can think about five years for research and development - in hardware, operating systems, file systems, networking, and so on, as well as software for applications. Application software, the operating system, and the hardware all need to be bundled in this period so that at the end the system will execute the science applications at scale. We also believe that this process will have to have considerable investment from the manufacturers and vendors to be successful. We have formed laboratory, university, and industry working groups to start this process, formed a panel to look at where SciDAC needs to go to compute at the extreme scale, and formed an executive committee within the Office of Science and the NNSA to focus on these activities. We will have outreach to DoD in the next few months.
We are anticipating a solicitation within the next two years in which we will compete this bundled R&D process. We don't know how we will incorporate SciDAC into extreme scale computing, but we do know there will be many challenges. And as we have shown over the years, we have the expertise and determination to surmount these challenges.
Temperature in the anterior chamber during phacoemulsification.
Suzuki, Hisaharu; Oki, Kotaro; Igarashi, Tsutomu; Shiwa, Toshihiko; Takahashi, Hiroshi
2014-05-01
To evaluate changes in the aqueous humor temperature using 2 phacoemulsification units (Stellaris 28.5 kHz device and Whitestar Signature 40 kHz device). Nippon Medical School, Musashikosugi Hospital, Kawasaki City, Kanagawa, Japan. Experimental study. Aqueous humor temperatures were measured with a temperature probe set in the anterior chamber during ultrasound (US) oscillation in porcine eyes under 5 conditions. Continuous longitudinal oscillation caused a rapid rise in aqueous humor temperature, while the pulse and elliptical modes suppressed temperature elevation. Reducing the number of US tip vibrations did not reduce the temperature in the anterior chamber. However, raising the vacuum setting allowed the aspirations to rise to the set value, thereby lowering the temperature in the anterior chamber. Because differences in the phacoemulsification settings can lead to temperature elevations in the anterior chamber, surgeons must carefully monitor these settings to prevent corneal tissue damage. Copyright © 2014 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Rotation stability of a toric intraocular lens with a second capsular tension ring.
Sagiv, Oded; Sachs, Dan
2015-05-01
An AcrySof toric intraocular lens (IOL) and a capsular tension ring (CTR) were implanted in the highly myopic eye of a 74-year-old white man during cataract surgery. On the first postoperative day, the IOL was found 90 degrees from the required position, with consequent high astigmatism. A second procedure was performed, and because it was not possible to secure the toric IOL in the correct position, an additional in-the-bag CTR was inserted, with an immediate optimal outcome. The IOL remained stable up to the final follow-up examination. Co-implantation of a toric IOL and a single CTR has been reported. In our case, 2 CTRs were required to fixate the toric IOL in the correct position. This procedure is simple and safe and should be considered in cases of postoperatively misaligned toric IOLs. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Effects of Burning Alternative Fuel in a 5-Cup Combustor Sector
NASA Technical Reports Server (NTRS)
Tacina, K. M.; Chang, C. T.; Lee, C.-M.; He, Z.; Herbon, J.
2015-01-01
A goal of NASA's Environmentally Responsible Aviation (ERA) program is to develop a combustor that will reduce NOx emissions and that can burn both standard and alternative fuels. To meet this goal, NASA partnered with General Electric Aviation to develop a 5-cup combustor sector; this sector was tested in NASA Glenn's Advanced Subsonic Combustion Rig (ASCR). To verify that the combustor sector was fuel-flexible, it was tested with a 50-50 blend of JP-8 and a biofuel made from the Camelina sativa plant. Results from this test were compared to results from tests where the fuel was neat JP-8. Testing was done at three combustor inlet conditions: cruise, 30% power, and 7% power. When compared to burning JP-8, burning the 50-50 blend did not significantly affect emissions of NOx, CO, or total hydrocarbons. Furthermore, it did not significantly affect the magnitude and frequency of the dynamic pressure fluctuations.
Luck, Jonathan
2010-07-01
I report a case of pellucid marginal degeneration (PMD) with cataract that was successfully treated with implantation of an ultra-high-power customized bitoric AT.Comfort 646TLC intraocular lens (IOL). The preoperative uncorrected distance visual acuity (UDVA) was 6/120 and the corrected distance visual acuity (CDVA), 6/24 with 10.9 diopters (D) of keratometric astigmatism on Scheimpflug imaging. After implantation of an IOL with -0.5 +16.0 x 170, the UDVA was 6/9 with a manifest refraction of +0.25 +1.25 x 150 and the CDVA, 6/6(-1). No surgical complications or postoperative problems occurred, and the patient was very satisfied with the outcome. A longer follow-up is required to confirm this favorable clinical result. The author has no financial or proprietary interest in any material or method mentioned. Copyright 2010 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Itagaki, Hideo; Kunikata, Toshio; Hiratsuka, Kentaro; Saito, Junichiro; Oshika, Tetsuro
2013-12-01
A 61-year-old man with high myopia who had received a systemic α1A-adrenoceptor antagonist had phacoemulsification and in-the-bag intraocular lens implantation in the right eye. One day postoperatively, marked pigment dispersion in the anterior chamber, posterior bowing of the iris, and iridodonesis were noted associated with a subsequent elevation in intraocular pressure (IOP). Pharmacological pupil dilation was effective in reducing pigment dispersion and IOP, and laser peripheral iridotomy was performed to alleviate posterior bowing of the iris. We hypothesize that dynamic changes in the aqueous humor flow by cataract surgery and latent flaccidity of the iris due to the systemic α1A-adrenoceptor antagonist caused reverse pupillary block. High myopia may be another risk factor for this complication. Copyright © 2013 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Biosynthesis of Modular Ascarosides in C. elegans
Panda, Oishika; Akagi, Allison E.; Artyukhin, Alexander B.; Judkins, Joshua C.; Le, Henry H.; Mahanti, Parag; Cohen, Sarah M.; Sternberg, Paul W.
2017-01-01
The nematode Caenorhabditis elegans uses simple building blocks from primary metabolism and a strategy of modular assembly to build a great diversity of signaling molecules, the ascarosides, which function as a chemical language in this model organism. In the ascarosides, the dideoxysugar ascarylose serves as a scaffold to which diverse moieties from lipid, amino acid, neurotransmitter, and nucleoside metabolism are attached. However, the mechanisms that underlie the highly specific assembly of ascarosides are not understood. We show that the acyl-CoA synthetase ACS-7, which localizes to lysosome-related organelles, is specifically required for the attachment of different building blocks to the 4′-position of ascr#9. We further show that mutants lacking lysosome-related organelles are defective in the production of all 4′-modified ascarosides, thus identifying the waste disposal system of the cell as a hotspot for ascaroside biosynthesis. PMID:28371259
Lake, Jonathan C; Boianovsky, Celso; de Faria Pacini, Thiago; Crema, Armando
2018-06-14
We describe the technique of second-wave hydrodissection (the first wave being the initial cortical cleaving hydrodissection) performed after the removal of the cataract nucleus in femtosecond laser-assisted cataract surgery. After femtosecond laser application, the cortex is typically found adhered to the anterior capsule. Under high magnification, a steady stream of a balanced salt solution is directed toward the anterior capsule using a hydrodissection cannula. Full cleavage of the remaining cortex is observed by noting the appearance of a dark inner circle by the capsulotomy edge once the balanced salt solution wave has separated the cortex from the capsule. Irrigation/aspiration (I/A) of the cortical remains after the second wave is faster than I/A without this step in femtosecond laser-assisted cataract surgery. Copyright © 2018 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Managing residual refractive error after cataract surgery.
Sáles, Christopher S; Manche, Edward E
2015-06-01
We present a review of keratorefractive and intraocular approaches to managing residual astigmatic and spherical refractive error after cataract surgery, including laser in situ keratomileusis (LASIK), photorefractive keratectomy (PRK), arcuate keratotomy, intraocular lens (IOL) exchange, piggyback IOLs, and light-adjustable IOLs. Currently available literature suggests that laser vision correction, whether LASIK or PRK, yields more effective and predictable outcomes than intraocular surgery. Piggyback IOLs with a rounded-edge profile implanted in the sulcus may be superior to IOL exchange, but both options present potential risks that likely outweigh the refractive benefits except in cases with large residual spherical errors. The light-adjustable IOL may provide an ideal treatment to pseudophakic ametropia by obviating the need for secondary invasive procedures after cataract surgery, but it is not widely available nor has it been sufficiently studied. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
ELIMED: a new hadron therapy concept based on laser driven ion beams
NASA Astrophysics Data System (ADS)
Cirrone, Giuseppe A. P.; Margarone, Daniele; Maggiore, Mario; Anzalone, Antonello; Borghesi, Marco; Jia, S. Bijan; Bulanov, Stepan S.; Bulanov, Sergei; Carpinelli, Massimo; Cavallaro, Salvatore; Cutroneo, Mariapompea; Cuttone, Giacomo; Favetta, Marco; Gammino, Santo; Klimo, Ondrej; Manti, Lorenzo; Korn, Georg; La Malfa, Giuseppe; Limpouch, Jiri; Musumarra, Agatino; Petrovic, Ivan; Prokupek, Jan; Psikal, Jan; Ristic-Fira, Aleksandra; Renis, Marcella; Romano, Francesco P.; Romano, Francesco; Schettino, Giuseppe; Schillaci, Francesco; Scuderi, Valentina; Stancampiano, Concetta; Tramontana, Antonella; Ter-Avetisyan, Sargis; Tomasello, Barbara; Torrisi, Lorenzo; Tudisco, Salvo; Velyhan, Andriy
2013-05-01
Laser-accelerated proton beams have been proposed for use in different research fields. Great interest has arisen in the potential replacement of conventional accelerating machines with laser-based accelerators, and in particular in the development of new concepts for more compact and cheaper hadrontherapy centers. In this context the ELIMED (ELI MEDical applications) research project has been launched by INFN-LNS and ASCR-FZU researchers within the pan-European ELI-Beamlines facility framework. The ELIMED project aims to demonstrate the potential clinical applicability of optically accelerated proton beams and to realize a laser-accelerated ion transport beamline for multi-disciplinary user applications. In this framework, ocular melanoma - for instance, uveal melanoma, normally treated with 62 MeV proton beams produced by standard accelerators - will be considered as a model system to demonstrate the potential clinical use of laser-driven protons in hadrontherapy, especially because of the limited constraints in terms of proton energy and irradiation geometry for this particular tumour treatment. Several challenges, from laser-target interaction and beam transport development up to dosimetry and radiobiology, need to be overcome in order to reach the final ELIMED goals. A crucial role will be played by the final design and realization of a transport beamline capable of providing ion beams with the proper characteristics in terms of energy spectrum and angular distribution, which will allow dosimetric tests and biological cell irradiation to be performed. A first prototype of the transport beamline has already been designed, and other transport elements are under construction in order to perform a first experimental test with the TARANIS laser system by the end of 2013. A wide international collaboration among specialists of different disciplines - physics, biology, chemistry, and medicine - and medical doctors from Europe, Japan, and the US is growing around the ELIMED project, with the aim of working on the conceptual design and the technical and experimental realization of this core beamline of the ELI Beamlines facility.
Giers, Bert C; Khoramnia, Ramin; Weber, Lea F; Tandogan, Tamer; Auffarth, Gerd U
2016-03-01
We present the case of a 56-year-old woman with moderate myopia and bilateral cataract who had cataract extraction and intraocular lens (IOL) implantation. Due to the patient's desire for spectacle independence, a trifocal IOL with toric correction for astigmatism was implanted. During the follow-up, it became obvious that the implanted IOL had rotated and tilted due to insufficient fixation in the large capsular bag of the myopic eye. An IOL explantation was therefore performed, and the original IOL was exchanged for a bifocal toric IOL with a larger overall diameter. Stable fixation of the IOL in the capsular bag was achieved, and after surgery in the second eye, the patient recovered good bilateral vision. This case illustrates the need for careful selection of IOL diameter and sizing even in patients with moderate myopia due to the potentially larger ocular dimensions in these patients. Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Pham, Mai T; Peck, Rachel E; Dobbins, Kendall R B
2013-06-01
We report a case of ischemic optic neuropathy arising from elevated intraocular pressure (IOP) masked by interface fluid in a post-laser in situ keratomileusis (LASIK) eye. A 51-year-old man, who had had LASIK 6 years prior to presentation, sustained blunt trauma to the left eye that resulted in a hyphema and ocular hypertension. Elevated IOP resulted in accumulation of fluid in the stromal bed-LASIK flap interface, leading to underestimation of IOP when measured centrally over the flap. After days of unrecognized ocular hypertension, ischemic optic neuropathy developed. To our knowledge, this is the first reported case of ischemic optic neuropathy resulting from underestimated IOP measurements in a post-LASIK patient. It highlights the inaccuracy of IOP measurements in post-LASIK eyes and a vision-threatening potential complication. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2013 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Jacob, Soosan; Agarwal, Amar; Mazzotta, Cosimo; Agarwal, Athiya; Raj, John Michael
2017-04-01
Small-incision lenticule extraction may be associated with complications such as partial lenticular dissection, torn lenticule, lenticular adherence to cap, torn cap, and sub-cap epithelial ingrowth, some of which are more likely to occur during low-myopia corrections. We describe sequential segmental terminal lenticular side-cut dissection to facilitate minimally traumatic and smooth lenticular extraction. Anterior lamellar dissection is followed by central posterior lamellar dissection, leaving a thin peripheral rim and avoiding the lenticular side cut. This is followed by sequential segmental dissection of the lenticular side cut in a manner that fixates the lenticule and provides sufficient resistance for smooth and complete dissection of the posterior lamellar cut without undesired movements of the lenticule. The technique is advantageous in thin lenticules, where the risk for complications is high, but can also be used in thick lenticular dissection using wider sweeps to separate the lenticular side cut sequentially. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Gooi, Patrick; Ahmed, Yusuf; Ahmed, Iqbal Ike K
2014-07-01
We describe the use of a microscope-mounted wide-angle point-of-view camera to record optimal hand positions in ocular surgery. The camera is mounted close to the objective lens beneath the surgeon's oculars and faces the same direction as the surgeon, providing a surgeon's view. A wide-angle lens enables viewing of both hands simultaneously and does not require repositioning the camera during the case. Proper hand positioning and instrument placement through microincisions are critical for effective and atraumatic handling of tissue within the eye. Our technique has potential in the assessment and training of optimal hand position for surgeons performing intraocular surgery. It is an innovative way to routinely record instrument and operating hand positions in ophthalmic surgery and has minimal requirements in terms of cost, personnel, and operating-room space. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2014 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Schuster, Alexander K; Tesarz, Jonas; Vossmerbaeumer, Urs
2015-05-01
This review was conducted to compare the physical effect of aspheric intraocular lens (IOL) implantation on wavefront properties with that of spherical IOL implantation. The peer-reviewed literature was systematically searched in Medline, Embase, Web of Science, Biosis, and the Cochrane Library according to the Cochrane Collaboration method. Inclusion criteria were randomized controlled trials comparing the use of aspheric versus spherical monofocal IOL implantation that assessed visual acuity, contrast sensitivity, or quality of vision. A secondary outcome was ocular wavefront analysis; spherical aberration, higher-order aberrations (HOAs), coma, and trefoil were evaluated. Effects were calculated as standardized mean differences (Hedges g) and were pooled using random-effects models. Thirty-four of 43 studies provided data for wavefront analysis. Aspheric monofocal IOL implantation resulted in less ocular spherical aberration and fewer ocular HOAs than spherical IOL implantation. This might explain the better contrast sensitivity in patients with aspheric IOLs. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Faramarzi, Amir; Moshirfar, Majid; Karimian, Farid; Delfazayebaher, Siamak; Kheiri, Bahareh
2017-12-01
To compare the refractive and higher-order aberration (HOA) outcomes after photorefractive keratectomy (PRK) in patients with significant astigmatism using aspheric versus wavefront-guided aspheric profiles. Ophthalmic Research Center and Department of Ophthalmology, Shahid Beheshti University of Medical Sciences, Negah Eye Hospital, Tehran, Iran. Prospective randomized case series. One eye of each patient with refractive astigmatism of more than 2.00 diopters (D) randomly received aspheric PRK. In the other eye, wavefront-guided aspheric treatment was performed using a personalized treatment advanced algorithm. Visual acuity, refractive errors, and HOAs were compared between the 2 groups preoperatively and 12 months postoperatively. The study comprised 32 patients (64 eyes). The mean preoperative refractive astigmatism was -4.07 D ± 1.64 (SD) and -4.02 ± 1.55 D in the aspheric group and the wavefront-guided aspheric group, respectively (P = .2). The mean postoperative astigmatism was -0.46 ± 0.37 D and -0.82 ± 0.53 D in the aspheric group and the wavefront-guided aspheric group, respectively (P = .02). Postoperatively, the root mean square of total HOAs was significantly increased in both groups. However, compared with wavefront-guided aspheric PRK, aspheric PRK induced fewer HOAs (P = .003). In eyes with high astigmatism, post-PRK residual astigmatism was lower in the aspheric group than in the wavefront-guided aspheric group. The increase in HOAs was significantly higher in the wavefront-guided aspheric group than in the aspheric group. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Rosen, Emanuel; Alió, Jorge L; Dick, H Burkhard; Dell, Steven; Slade, Stephen
2016-02-01
We performed a metaanalysis of peer-reviewed studies involving implantation of a multifocal intraocular lens (IOL) in presbyopic patients with cataract or having refractive lens exchange (RLE). Previous reviews have considered the use of multifocal IOLs after cataract surgery but not after RLE, whereas greater insight might be gained from examining the full range of studies. Selected studies were examined to collate outcomes with monocular and binocular uncorrected distance, intermediate, and near visual acuity; spectacle independence; contrast sensitivity; visual symptoms; adverse events; and patient satisfaction. In 8797 eyes, the mean postoperative monocular uncorrected distance visual acuity (UDVA) was 0.05 logMAR ± 0.006 (SD) (Snellen equivalent 20/20(-3)). In 6334 patients, the mean binocular UDVA was 0.04 ± 0.00 logMAR (Snellen equivalent 20/20(-2)), with a mean spectacle independence of 80.1%. Monocular mean UDVA did not differ significantly between those who had a cataract procedure and those who had an RLE procedure. Neural adaptation to multifocality may vary among patients. Dr. Alió is a clinical research investigator for Hanita Lenses, Carl Zeiss Meditec AG, Topcon Medical Systems, Inc., Oculentis GmbH, and Akkolens International BV. Dr. Dell is a consultant to Bausch & Lomb and Abbott Medical Optics, Inc. Dr. Slade is a consultant to Alcon Surgical, Inc., Carl Zeiss Meditec AG, and Bausch & Lomb. None of the authors has a financial or proprietary interest in any material or method mentioned. Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Evaluation of loss in optical quality of multifocal intraocular lenses with glistenings.
DeHoog, Edward; Doraiswamy, Anand
2016-04-01
To study the impact of loss in optical quality from glistenings in diffractive multifocal intraocular lenses (IOLs) using ray tracing in a model eye. Independent research laboratory, Irvine, California, USA. Experimental study. A pseudophakic eye model was constructed in Zemax, an optical ray-tracing program, using the Arizona eye model as the basis. The Mie scattering theory was used to describe the intensity and direction of light as it is scattered by a spherical particle immersed in a diffractive multifocal IOL. To evaluate the impact of glistening scatter, a more advanced eye model was constructed in FRED, a nonsequential optical ray-tracing program. An evaluation of scatter and modulation transfer function (MTF) was performed for a hydrophobic biomaterial with a refractive index of 1.54 for various sizes and densities of glistenings under mesopic conditions. As predicted by the Mie theory, the amount of scatter was a function of the change in refractive index, the size of the scatterer, and the volume fraction of the scatterers. This modeling showed that an increase in the density of glistenings can lead to a significant drop in the MTF of the IOL. This effect was more pronounced in multifocal IOLs than in monofocal IOLs. Mathematical modeling showed that glistenings in multifocal IOLs lead to a reduction in the MTF of the IOL and the pseudophakic eye. The relative loss of MTF in multifocal IOLs was more significant than in monofocal IOLs because of the nature of the design. Drs. DeHoog and Doraiswamy are consultants to Advanced Vision Science, Inc. Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Evaluation of the impact of light scatter from glistenings in pseudophakic eyes.
DeHoog, Edward; Doraiswamy, Anand
2014-01-01
To study the impact of light scatter from glistenings in pseudophakic eyes using ray tracing in a model eye. Department of Research, Advanced Vision Science, Inc., Goleta, California, USA. Mathematical modeling and simulation. A pseudophakic eye model was constructed in Zemax using the Arizona eye model as the basis. The Mie scattering theory was used to describe the intensity and direction of light as it is scattered by a spherical particle immersed in a given medium (intraocular lens [IOL]). The modeling and evaluation of scatter and modulation transfer function (MTF) were performed for several biomaterials with various sizes and densities of glistenings under scotopic, mesopic, and photopic conditions. As predicted by the Mie theory, the amount of scatter was a function of the relative difference in refractive index between the medium and the scatterer, the size of the scatterer, and the volume fraction of the scatterer. The simulation demonstrated that an increase in the density of glistenings can lead to a significant drop in the MTF of the IOL and the pseudophakic eye. This effect was more pronounced in IOLs with smaller cavitations, and the observation was consistent for all tested biomaterials. Mathematical modeling demonstrated that glistenings in IOLs will lead to a reduction in the MTF of the IOL and the pseudophakic eye. The loss in MTF was more pronounced at high densities and small cavitation sizes across all biomaterials. Inconsistent and poor clinical quantification of glistenings in IOLs may explain some inconsistencies in the literature. Copyright © 2013 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Vasavada, Vaishali; Vasavada, Abhay R; Vasavada, Viraj A; Srivastava, Samaresh; Gajjar, Devarshi U; Mehta, Siddharth
2013-04-01
To compare incision integrity and clinical outcomes of 2 microcoaxial phacoemulsification systems. Iladevi Cataract & IOL Research Centre, Ahmedabad, India. Prospective randomized clinical trial. Eyes were randomized to have phacoemulsification using a 1.8 mm clear corneal incision (CCI) system (Group 1, Stellaris system) or a 2.2 mm CCI system (Group 2, Intrepid Infiniti system). Incision enlargement at the end of surgery was measured. At the conclusion of surgery, trypan blue was applied over the conjunctival surface, the anterior chamber aspirate was withdrawn, and trypan blue ingress into the anterior chamber was measured. Postoperative observations included evaluation of the CCI using anterior segment optical coherence tomography (AS-OCT), change in central corneal thickness (CCT), and anterior segment inflammation at 1 day, 1 week, and 1 month, and endothelial cell loss and surgically induced astigmatism (SIA) at 3 months. Incision enlargement (P<.001) and trypan blue ingress into the anterior chamber (mean 1.7 log units ± 0.6 [SD] versus 3.8 ± 0.6 log units, P<.001) were significantly greater in Group 1 (n = 50) than in Group 2 (n = 50). On AS-OCT, endothelial misalignment and gaping were more frequent in Group 1 at 1 day (P=.001) and 1 week (P=.018). There were no significant differences in SIA, change in CCT, endothelial cell loss, or anterior segment inflammation (P>.05). It is not the initial incision size alone but also the distortion of the incision during subsequent stages of surgery that determines the integrity of the CCI at the end of surgery. Copyright © 2013 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Luft, Nikolaus; Hirnschall, Nino; Farrokhi, Sanaz; Findl, Oliver
2015-08-01
To assess whether anterior chamber depth (ACD) measurements in pseudophakic eyes obtained with partial coherence interferometry (PCI) and optical low-coherence reflectometry (OLCR) devices can be used interchangeably. Vienna Institute for Research in Ocular Surgery, A Karl Landsteiner Institute, Hanusch Hospital, Vienna, Austria. Prospective case series. The ACD measurements in 1 eye of each pseudophakic patient were performed with the PCI-based ACMaster device and the OLCR-based Lenstar LS900 device at least 1 day postoperatively. The study comprised 65 eyes of 65 patients with a mean age of 71.7 years ± 9.0 (SD) (range 39 to 91 years). In 15 eyes, no valid ACD readings could be obtained with the OLCR device. No obvious reason for these measurement failures was identified; however, tear-film alterations shortly after surgery were suspected. No significant difference in the mean ACD in the remaining 50 eyes was found between PCI measurements (5019 ± 660 μm; range 4008 to 6181 μm) and OLCR measurements (5015 ± 663 μm; range 4017 to 6163 μm) (P = .06). Three (6%) of 50 measurements were not within the 95% limits of agreement in the Bland-Altman analysis. Pseudophakic ACD measurements with the PCI and OLCR devices can be used interchangeably. The OLCR device proved to be more user-friendly and faster; however, in a substantial number of eyes, no usable values were obtainable. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
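The interchangeability claim above rests on a Bland-Altman analysis: the bias and 95% limits of agreement between paired PCI and OLCR readings, plus the count of pairs falling outside those limits. A minimal sketch of that computation follows; the micrometer values are made-up stand-ins, not the study's data.

```python
import numpy as np

# Hypothetical paired ACD readings in micrometers (illustrative only).
pci  = np.array([5019., 4500., 5500., 4800., 5200., 4100., 6100.])
olcr = np.array([5015., 4510., 5480., 4795., 5210., 4095., 6085.])

diff = pci - olcr
bias = diff.mean()                      # mean difference between devices
sd = diff.std(ddof=1)                   # SD of the differences
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
outside = int(np.sum((diff < loa_low) | (diff > loa_high)))
print(f"bias = {bias:.1f} um, 95% LoA = [{loa_low:.1f}, {loa_high:.1f}] um, "
      f"{outside}/{len(diff)} pairs outside the limits")
```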
A Research Program in Computer Technology. 1982 Annual Technical Report
1983-03-01
for the Defense Advanced Research Projects Agency. The research applies computer science and technology to areas of high DoD/military impact. The ISI... implement the plan; New Computing Environment - investigation and adaptation of developing computer technologies to serve the research and military user communities; and Computer...
Computers in aeronautics and space research at the Lewis Research Center
NASA Technical Reports Server (NTRS)
1991-01-01
This brochure presents a general discussion of the role of computers in aerospace research at NASA's Lewis Research Center (LeRC). Four particular areas of computer applications are addressed: computer modeling and simulation, computer assisted engineering, data acquisition and analysis, and computer controlled testing.
PREFACE: Plasma Physics by Laser and Applications 2013 Conference (PPLA2013)
NASA Astrophysics Data System (ADS)
Nassisi, V.; Giulietti, D.; Torrisi, L.; Delle Side, D.
2014-04-01
The ''Plasma Physics by Laser and Applications'' Conference (PPLA 2013) is a biennial meeting in which the national teams involved in laser-plasma interaction at high intensities communicate their latest results and compare them with colleagues from the most important European laser facilities. The sixth meeting was organized in Lecce, Italy, from 2 to 4 October 2013 at the Rector Palace of the University of Salento. Surprising results obtained by laser-matter interaction at high intensities, as well as non-equilibrium plasma generation, laser-plasma acceleration and related secondary sources, diagnostic methodologies, and applications based on lasers and plasma pulses, have given researchers the enthusiasm to perform experiments ad maiora. The plasma generated by powerful laser pulses produces particles of high kinetic energy and energetic photons that may be employed in different fields, from medicine to microelectronics, from engineering to nuclear fusion, from chemistry to the environment. A relevant interest concerns the understanding of the fundamental physical phenomena, the lasers employed, plasma diagnostics, and their consequent applications. For this reason we need continuous updates, meetings, and expertise exchanges in this field in order to follow the evolution and disclose information, as was done this year in Lecce by discussing and comparing the experience gained in various international laboratories. The conference, although limited to just 3 days, highlighted important aspects of research in the aforementioned fields and gave opportunities to discuss the activities of researchers of high international prestige. The program consisted of 10 invited talks, 17 oral talks, and 17 poster contributions, for a total of 44 communications. The themes presented covered different areas and, far from being exhaustive, gave updates that stimulated useful scientific discussions. The organizers belong to three Italian universities: Professor V Nassisi of Salento University, Professor D Giulietti of Pisa University, and Professor L Torrisi of Messina University. The Scientific Committee was constituted by colleagues from different European laboratories: Dr F Belloni from the European Commission, Brussels, Belgium; Professor M Borghesi from Queen's University Belfast, United Kingdom; Professor L Calcagno from Catania University, Italy; Professor D Giulietti from Pisa University, Italy; Dr J Krása from the Academy of Sciences of the Czech Republic, Prague; Professor V Malka from the Laboratoire d'Optique Appliquée, Palaiseau, France; Professor V Nassisi from Salento University, Italy; Professor L Palladino from L'Aquila University, Italy; Professor L Torrisi from Messina University, Italy; Professor J Ullschmied from the Academy of Sciences of the Czech Republic, Prague; Professor J Wolowski from the Institute of Plasma Physics and Laser Microfusion, Warsaw, Poland; and Dr J Badziak from the Institute of Plasma Physics and Laser Microfusion, Warsaw, Poland. The local organizing team was composed of Dr G Buccolieri, Dr D Delle Side, Dr F Paladini, and Dr L Velardi from Salento University and Dr M Cutroneo from Messina University. The scientific secretariat was coordinated by Dr D Dell'Anna from Salento University.
The topics discussed in the conference were: laser-matter interactions; laser ion sources; electron beam generation; physics of non-equilibrium plasmas; theoretical models of plasmas; photon and particle emission from pulsed plasmas; ion acceleration from plasma; femtosecond laser pulses; pulsed laser deposition; applications of laser beams and pulsed plasmas; and techniques for the characterization of plasmas. About 80 colleagues attended the conference. The chairmen and presidents of the conference sessions were: Professor V Nassisi, Professor D Giulietti, Professor L Torrisi, Professor M Borghesi, Dr K Rohlena (ASCR, Prague, Czech Republic), Professor D Neely (RAL, Oxon, UK), Dr J Ullschmied (ASCR, Prague, Czech Republic), Professor S Ratynskaia (Royal Institute of Technology, Stockholm, Sweden), Dr J Krása, and Dr J Badziak. The Leos Laska award, named in memory of a Czech colleague who made important contributions in his country to the development of experimental activities in these research fields, was established to honor his work and to stimulate the interest of young researchers in this important sector. The Scientific Committee conferred the prize on Dr Mariapompea Cutroneo, PhD in Physics at Messina University, for her activity in the field of new methodologies related to ion acceleration in laser-generated plasma. The widespread success of the event suggests we will meet again in 2015, in another southern Italian venue as wonderful and welcoming as Lecce. Vincenzo Nassisi, Danilo Giulietti, Lorenzo Torrisi and Domenico Delle Side
Research | Computational Science | NREL
NREL's computational science experts use advanced high-performance computing (HPC) technologies, thereby accelerating the transformation of our nation's energy system. Enabling High-Impact Research: NREL's computational science capabilities enable high-impact research. Some recent examples...
NASA's computer science research program
NASA Technical Reports Server (NTRS)
Larsen, R. L.
1983-01-01
Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.
Comparison of intraocular lens decentration and tilt measurements using 2 Purkinje meter systems.
Maedel, Sophie; Hirnschall, Nino; Bayer, Natascha; Markovic, Sabine; Tabernero, Juan; Artal, Pablo; Schaeffel, Frank; Findl, Oliver
2017-05-01
To evaluate the difference in intraocular lens tilt and decentration measurements with 2 Purkinje meters. Vienna Institute for Research in Ocular Surgery, Hanusch Hospital, Vienna, Austria. Prospective evaluation of a diagnostic test. This single-center study included pseudophakic patients in 2 substudies in which 3 consecutive measurements were performed with 2 Purkinje meters (Spanish and German). In substudy 1, an inexperienced examiner performed all measurements after a short learning period. In substudy 2, all measurements were taken by experienced examiners under the direct supervision of the inventors of the devices. Substudy 1 included 53 pseudophakic eyes in which all 53 scans were successful with the Spanish device; however, only 35 measurements (66%) were successful with the German Purkinje meter. The mean tilt was 4.35 ± 2.50 (SD) degrees with the Spanish Purkinje meter and 9.20 ± 6.96 degrees with the German Purkinje meter. The mean decentration was 0.44 ± 0.19 mm and 0.74 ± 0.91 mm (P = .44), respectively. In substudy 2 (29 pseudophakic eyes), the number of successful scans was 29 (100%) with the Spanish Purkinje meter and 18 (62%) with the German Purkinje meter. The mean horizontal and vertical tilt difference vectors between the 2 systems were 4.89 ± 3.24 degrees and 7.57 ± 3.82 degrees, respectively. Concerning clinical feasibility, the Spanish Purkinje meter had a greater percentage of successful scans than the German device. In addition, the German device measured significantly higher tilt values than the Spanish Purkinje meter. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Vasavada, Abhay R; Johar, Kaid; Praveen, Mamidipudi R; Vasavada, Viraj A; Arora, Anshul I
2013-04-01
To compare changes in the incision's histomorphology and denaturation of collagen I in rabbit eyes having microcoaxial phacoemulsification through 2.2 mm and 1.8 mm incision-compatible systems. Randomized experimental trial. Iladevi Cataract & IOL Research Centre, Ahmedabad, India. Thirty rabbit eyes were randomized into Group 1 (microcoaxial phacoemulsification through 2.2 mm incisions using Infiniti system [torsional ultrasound]) and Group 2 (microcoaxial phacoemulsification through 1.8 mm incisions using Stellaris system [longitudinal ultrasound]). Each group was then divided into 3 subgroups of 5 eyes each based on 1 of the 3 intervention options: phacoemulsification only, intraocular lens (IOL) insertion only, and phacoemulsification with IOL insertion. Left eyes were randomized for microcoaxial phacoemulsification, and right eyes were treated as controls. After phacoemulsification, eyes in Group 1 showed loss of epithelium at the roof of the incisions and Descemet membrane detachment at the floor of the incisions. These findings did not change after IOL insertion. After phacoemulsification, eyes in Group 2 showed loss of epithelium, but Descemet membrane remained attached. There was a longitudinal split in the incision's stroma in the direction of internal entry. The stromal damage increased after IOL implantation. Immunofluorescence studies showed no obvious irregularities in the arrangement of collagen I in either group. A dot blot analysis showed significant denaturation of collagen I in Group 2. The histomorphology of the 2.2 mm system incision showed localized Descemet membrane detachment and endothelial cell loss. The 1.8 mm system incision showed exaggerated stromal damage after IOL insertion. Copyright © 2013 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Accelerated corneal crosslinking concurrent with laser in situ keratomileusis.
Celik, H Ugur; Alagöz, Nese; Yildirim, Yusuf; Agca, Alper; Marshall, John; Demirok, Ahmet; Yilmaz, Omer Faruk
2012-08-01
To assess accelerated corneal collagen crosslinking (CXL) applied concurrently with laser in situ keratomileusis (LASIK) in a small group of patients. Beyoglu Eye Research and Training Hospital, Istanbul, Turkey. Prospective pilot interventional case series. In May 2010, patients had LASIK with concurrent accelerated CXL in 1 eye and LASIK only in the fellow eye to treat myopia or myopic astigmatism. The follow-up was 12 months. The attempted correction (spherical equivalent) ranged from -5.00 to -8.50 diopters (D) in the LASIK-CXL group and from -3.00 to -7.25 D in the LASIK-only group. Main outcome measures were manifest refraction, uncorrected (UDVA) and corrected (CDVA) distance visual acuities, and the endothelial cell count. Eight eyes of 3 women and 1 man (aged 22 to 39 years) were enrolled. At the 12-month follow-up, the LASIK-CXL group had a UDVA and manifest refraction equal to or better than those in the LASIK-only group. No eye lost 1 or more lines of CDVA at the final visit. The endothelial cell loss in the LASIK-CXL eye was not greater than in the fellow eye. No side effects were associated with either procedure. Laser in situ keratomileusis with accelerated CXL appears to be a promising modality for future applications to prevent corneal ectasia after LASIK. The results in this pilot series suggest that evaluation of a larger study cohort is warranted. Drs. Yilmaz and Marshall are paid consultants to Avedro, Inc. No other author has a financial or proprietary interest in any material or method mentioned. Copyright © 2012 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Praveen, Mamidipudi R; Vasavada, Abhay R; Shah, Sajani K; Khamar, Mayuri B; Trivedi, Rupal H
2015-09-01
To evaluate the long-term impact of bilateral cataract surgery on postoperative complications, the influence of age at surgery on the pattern of axial growth and central corneal thickness (CCT), and visual and orthoptic assessment in microphthalmic eyes. Iladevi Cataract and IOL Research Centre, Ahmedabad, India. Prospective longitudinal study. This study assessed children with microphthalmos who had bilateral congenital cataract surgery. Microphthalmos was defined as an eye with an axial length (AL) 2 standard deviations smaller than normally expected at that age. All eyes were left aphakic. One of the 2 eyes was randomly selected for analysis. Postoperative complications, AL, CCT, and visual acuity were documented. This study included 72 eyes of 36 children. The mean age of the patients was 4.8 months ± 6.2 (SD) (range 0.5 to 15 months). Postoperative complications included secondary glaucoma (11/36, 30.6%), visual axis obscuration (4/36, 11.1%), and posterior synechiae (10/36, 27.8%). A significant rate of change was observed in axial growth up to 4 years and in CCT up to 3 years postoperatively. When age at the time of surgery was correlated with the profile of the rate of change in AL and CCT at 1 month and 1, 2, and 4 years, statistically significant differences in AL and CCT were found at all timepoints. Loss of vision after surgery occurred in 2 eyes. After early surgical intervention, an acceptable rate of serious postoperative complications and good visual outcomes were obtained in microphthalmic eyes. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Center for Computing Research Summer Research Proceedings 2015.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradley, Andrew Michael; Parks, Michael L.
2015-12-18
The Center for Computing Research (CCR) at Sandia National Laboratories organizes a student program each summer in coordination with the Computer Science Research Institute (CSRI) and the Cyber Engineering Research Institute (CERI).
Gururangan, Sridharan; Robinson, Giles; Ellison, David W; Wu, Gang; He, Xuelian; Lu, Q Richard; McLendon, Roger; Grant, Gerald; Driscoll, Timothy; Neuberg, Ronnie
2015-10-01
We present three cases of genetically confirmed Gorlin syndrome with desmoplastic medulloblastoma (DMB) in whom the tumor recurred despite standard therapy. One patient was found to have a novel germline missense PTCH1 mutation. Molecular analysis of recurrent tumor using fluorescence in situ hybridization (FISH) revealed PTEN and/or PTCH1 loss in 2 patients. Whole exome sequencing (WES) of the tumor in one patient revealed loss of heterozygosity of PTCH1 and a mutation of the GNAS gene in its non-coding 3'-untranslated region (UTR), with correspondingly decreased protein expression. While one patient died despite high-dose chemotherapy (HDC) plus stem cell rescue (ASCR) and palliative radiotherapy, two patients are currently alive at 18+ and 120+ months, respectively, following retrieval therapy that did not include irradiation. Infants with DMB and Gorlin syndrome should be treated aggressively with chemotherapy at diagnosis to prevent relapse, but radiotherapy should be avoided. Molecular prognostic markers for DMB should be used routinely to identify the subset of tumors that might have an aggressive course. © 2015 Wiley Periodicals, Inc.
Yokokura, Shunji; Hariya, Takehiro; Kobayashi, Wataru; Meguro, Yasuhiko; Nishida, Kohji; Nakazawa, Toru
2017-03-01
We describe a technique for the penetrating keratoplasty (PKP) triple procedure that uses 29-gauge dual-chandelier illumination during creation of a non-open-sky continuous curvilinear capsulorhexis (CCC). The chandeliers are inserted through the pars plana into the vitreous cavity through the bulbar conjunctiva at the 3 o'clock and 9 o'clock positions. We compared this approach with that of a core vitrectomy, in which a single 25-gauge port is inserted into the vitreous cavity transconjunctivally through the upper temporal pars plana. The area of halation around the corneal opacity was significantly smaller in the 29-gauge group than in the 25-gauge group. The reduction in halation improved visibility of the anterior capsule and enabled the surgeon to perform CCC with greater safety. The 29-gauge chandelier system was more suitable than the 25-gauge chandelier system for the non-open-sky CCC component of the PKP triple procedure. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Lee, F. C.; Chen, D. Y.; Jovanovic, M.; Hopkins, D. C.
1985-01-01
The results of an evaluation of power semiconductor devices for electric and hybrid vehicle ac drive applications are summarized. Three types of power devices were evaluated in the effort: high-power bipolar or Darlington transistors, power MOSFETs, and asymmetric silicon controlled rectifiers (ASCRs). The bipolar transistors, including discrete and Darlington devices, range from 100 A to 400 A and from 400 V to 900 V. These devices are currently used as key switching elements in inverters for ac motor drive applications. Power MOSFETs, on the other hand, are much smaller in current rating. For the 400 V device, the current rating is limited to 25 A. For the main drive of an electric vehicle, device paralleling is normally needed to achieve a practical power level. For other electric vehicle (EV) related applications such as battery charger circuits, however, the MOSFET is advantageous over other devices because of its drive circuit simplicity and high-frequency capability. The asymmetric SCR is basically an SCR and needs a commutation circuit for turn-off. However, the device offers several advantages, i.e., low conduction drop and low cost.
Epstein, Rachel H; Mamalis, Nick; Price, Francis W; Price, Marianne O
2015-02-01
A 68-year-old woman with bilateral keratoconus presented with persistent visual acuity deficits following cataract extraction with a neodymium:YAG capsulotomy in the right eye 2 years earlier. Penetrating keratoplasty (PKP) had been performed for keratoconus in the right eye without complications until steroid drops were discontinued after 10 years because of persistent elevated intraocular pressure. The right eye experienced immunologic rejection and failure of 3 PKPs, 1 Descemet-stripping endothelial keratoplasty (DSEK), and a trabeculectomy with an eventual anatomically successful DSEK before the patient died at 95 years of age. The left eye improved following a single PKP. Postmortem histopathologic analysis of the cornea showed an anatomically successful DSEK graft with intact donor Descemet membrane and viable graft endothelial cells. To our knowledge, this is the first histopathologic analysis of an anatomically successful DSEK after multiple failed PKPs and trabeculectomy. The course in this case supports early consideration of lamellar keratoplasty, especially in patients with ocular comorbidities. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
On the magnetized disruption of inertially-confined plasma flows
NASA Astrophysics Data System (ADS)
Manuel, Mario; Kuranz, Carolyn; Rasmus, Alexander; Klein, Sallee; MacDonald, Michael; Trantham, Matt; Fein, Jeff; Belancourt, Patrick; Young, Rachel; Keiter, Paul; Drake, R. Paul; Pollock, Brad; Park, Jaebum; Hazi, Andrew; Williams, Jackson; Chen, Hui
2016-10-01
The creation and disruption of inertially collimated plasma flows is investigated through experiment, simulation, and analytical modeling. Laser-generated plasma jets are shown to be disrupted by an applied 5 T B-field along the jet axis. An analytical model of the system describes the disruption mechanism through the competing effects of B-field advection and diffusion. These results indicate that for Rem ~ 10-100, the ratio of inertial to magnetic pressures plays an important role in determining whether a jet is formed, but at high enough Rem, axial B-field amplification prevents inertial collimation altogether. This work is funded by the U.S. DOE, through the NNSA-DS and SC-OFES Joint Program in HED Laboratory Plasmas, Grant Number DE-NA0001840, and in collaboration with LLNL under contract DE-AC52-07NA27344. Support for this work was provided by NASA, under contract NAS8-03060, through Einstein Postdoctoral Fellowship Grant Number PF3-140111. Software used in this work was developed in part by the DOE NNSA ASC- and DOE Office of Science ASCR-supported Flash Center.
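For readers unfamiliar with the dimensionless numbers invoked here, the sketch below evaluates the magnetic Reynolds number Rem = mu0*sigma*L*v (advection versus diffusion of the B-field) and the inertial-to-magnetic pressure ratio. Every parameter except the 5 T applied field is an illustrative order-of-magnitude guess, not a value from the experiment.

```python
import numpy as np

mu0 = 4e-7 * np.pi    # vacuum permeability, H/m
sigma = 1.0e5         # plasma conductivity, S/m (assumed)
L = 1.0e-3            # jet scale length, m (assumed)
v = 1.0e5             # flow speed, m/s (assumed)
rho = 1.0e-2          # mass density, kg/m^3 (assumed)
B = 5.0               # applied axial field, T (from the abstract)

# Rem = v*L/eta with magnetic diffusivity eta = 1/(mu0*sigma).
Rem = mu0 * sigma * L * v
# Ratio of ram (inertial) pressure to magnetic pressure B^2/(2*mu0).
pressure_ratio = rho * v**2 / (B**2 / (2 * mu0))
print(f"Rem ~ {Rem:.0f}, inertial/magnetic pressure ~ {pressure_ratio:.1f}")
```

With these assumed numbers Rem lands near 13, inside the 10-100 regime the abstract identifies as the one where the pressure ratio decides whether a collimated jet forms.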
Wright, Dannen D; Wright, Alex J; Boulter, Tyler D; Bernhisel, Ashlie A; Stagg, Brian C; Zaugg, Brian; Pettey, Jeff H; Ha, Larry; Ta, Brian T; Olson, Randall J
2017-09-01
To determine the optimum bottle height, vacuum, aspiration rate, and power settings in the peristaltic mode of the Whitestar Signature Pro machine with Ellips FX tip action (transversal). John A. Moran Eye Center Laboratories, University of Utah, Salt Lake City, Utah, USA. Experimental study. Porcine lens nuclei were hardened with formalin and cut into 2.0 mm cubes. Lens cubes were emulsified using transversal ultrasound; fragment removal time (efficiency) and fragment bounces off the tip (chatter) were measured to determine the optimum aspiration rate, bottle height, vacuum, and power settings in the peristaltic mode. Efficiency increased in a linear fashion with increasing bottle height and vacuum. The most efficient aspiration rate was 50 mL/min, with 60 mL/min statistically similar. Increasing power increased efficiency up to 90%, with increased chatter at 100%. The most efficient values for the settings tested were a bottle height of 100 cm, vacuum of 600 mm Hg, an aspiration rate of 50 or 60 mL/min, and power at 90%. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Injectable suture device for intraocular lens fixation.
Smith, Jesse M; Erlanger, Michael; Olson, Jeffrey L
2015-12-01
We describe a surgical technique for scleral fixation of a posterior chamber intraocular lens (PC IOL) using a 24-gauge injectable polypropylene suture delivery system. A 3-piece PC IOL is inserted into the anterior chamber of the eye. Two sclerotomies are made 1.5 mm posterior to the limbus using a microvitreoretinal blade. The 24-gauge injector delivers a preformed suture loop into the eye with the double-armed needles still external to the eye. Each polypropylene IOL haptic is directed through the loop using microforceps. The suture loop is tightened around the haptic, and the attached needles are used to fixate the IOL to the sclera and close the sclerotomies simultaneously. This technique has been used in an ex vivo porcine eye and in an aphakic patient. In the latter, the IOL was quickly fixated to the sclera and maintained a stable position postoperatively. Dr. Olson has a patent pending for the device described in this article. No other author has a financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Sykakis, Evripidis; Karim, Rushmia; Parmar, Dipak N
2013-08-01
To standardize the management of patients with herpetic eye disease scheduled for cataract surgery, a questionnaire was sent to each fellow of the Royal College of Ophthalmologists registered as a consultant with a subspecialty interest in cornea. Most respondents agreed that disease stability was required before cataract surgery was offered; 62.3% would operate on patients in whom the disease had been quiescent for 3 to 6 months. The decision to prescribe prophylactic antivirals divided the respondents, with 58.8% in favor of starting antiviral treatment. Most respondents (72.46%) did not start topical antiviral treatment. In regard to changing topical steroid use postoperatively, 80.9% would not change their routine regimen. Oral acyclovir was the first line of treatment for 92.5%. The conclusions were that a significant period of inactivity should be considered before cataract surgery is performed in patients with herpes simplex virus eye disease. Oral antiviral prophylaxis is common clinical practice, but no change in routine postoperative steroid use is needed. Copyright © 2013 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Bernstam, Elmer V.; Hersh, William R.; Johnson, Stephen B.; Chute, Christopher G.; Nguyen, Hien; Sim, Ida; Nahm, Meredith; Weiner, Mark; Miller, Perry; DiLaura, Robert P.; Overcash, Marc; Lehmann, Harold P.; Eichmann, David; Athey, Brian D.; Scheuermann, Richard H.; Anderson, Nick; Starren, Justin B.; Harris, Paul A.; Smith, Jack W.; Barbour, Ed; Silverstein, Jonathan C.; Krusch, David A.; Nagarajan, Rakesh; Becich, Michael J.
2010-01-01
Clinical and translational research increasingly requires computation. Projects may involve multiple computationally-oriented groups including information technology (IT) professionals, computer scientists and biomedical informaticians. However, many biomedical researchers are not aware of the distinctions among these complementary groups, leading to confusion, delays and sub-optimal results. Although written from the perspective of clinical and translational science award (CTSA) programs within academic medical centers, the paper addresses issues that extend beyond clinical and translational research. The authors describe the complementary but distinct roles of operational IT, research IT, computer science and biomedical informatics using a clinical data warehouse as a running example. In general, IT professionals focus on technology. The authors distinguish between two types of IT groups within academic medical centers: central or administrative IT (supporting the administrative computing needs of large organizations) and research IT (supporting the computing needs of researchers). Computer scientists focus on general issues of computation such as designing faster computers or more efficient algorithms, rather than specific applications. In contrast, informaticians are concerned with data, information and knowledge. Biomedical informaticians draw on a variety of tools, including but not limited to computers, to solve information problems in health care and biomedicine. The paper concludes with recommendations regarding administrative structures that can help to maximize the benefit of computation to biomedical research within academic health centers. PMID:19550198
Munabi, Ian G; Buwembo, William; Bajunirwe, Francis; Kitara, David Lagoro; Joseph, Ruberwa; Peter, Kawungezi; Obua, Celestino; Quinn, John; Mwaka, Erisa S
2015-02-25
Effective utilization of computers and their applications in medical education and research is of paramount importance to students. The objective of this study was to determine the association between owning a computer and the use of computers for research data analysis, as well as the other factors influencing health professions students' computer use for data analysis. We conducted a cross-sectional study among undergraduate health professions students at three public universities in Uganda using a self-administered questionnaire. The questionnaire was composed of questions on participant demographics, students' participation in research, computer ownership, and use of computers for data analysis. Descriptive and inferential statistics (uni-variable and multi-level logistic regression analysis) were used to analyse the data. The level of significance was set at 0.05. Six hundred (600) of 668 questionnaires were completed and returned (response rate 89.8%). A majority of respondents were male (68.8%), and 75.3% reported owning computers. Overall, 63.7% of respondents reported that they had ever done computer-based data analysis. The following factors were significant predictors of having ever done computer-based data analysis: ownership of a computer (adj. OR 1.80, p = 0.02), a recently completed course in statistics (adj. OR 1.48, p = 0.04), and participation in research (adj. OR 2.64, p < 0.01). Owning a computer, participation in research, and undertaking courses in research methods influence undergraduate students' use of computers for research data analysis. Students are increasingly participating in research and thus need competencies for the successful conduct of research. Medical training institutions should encourage both curricular and extra-curricular efforts to enhance research capacity in line with modern theories of adult learning.
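The adjusted odds ratios quoted above come from a multivariable logistic regression. The sketch below shows the general form of that analysis on simulated survey data; the coefficients and the data frame are invented, and only the model structure mirrors the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulate a 600-respondent survey with three binary predictors (hypothetical).
rng = np.random.default_rng(0)
n = 600
df = pd.DataFrame({
    "owns_computer": rng.integers(0, 2, n),
    "stats_course":  rng.integers(0, 2, n),
    "did_research":  rng.integers(0, 2, n),
})
# Generate the outcome from an assumed logistic model (coefficients invented).
logit = -0.5 + 0.6 * df.owns_computer + 0.4 * df.stats_course + 1.0 * df.did_research
df["did_analysis"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Multivariable logistic regression; exponentiated coefficients are the
# adjusted odds ratios reported in abstracts like the one above.
X = sm.add_constant(df[["owns_computer", "stats_course", "did_research"]])
fit = sm.Logit(df["did_analysis"], X).fit(disp=False)
print(np.exp(fit.params))
```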
Use of the Computer for Research on Instruction and Student Understanding in Physics.
NASA Astrophysics Data System (ADS)
Grayson, Diane Jeanette
This dissertation describes an investigation of how the computer may be utilized to perform research on instruction and on student understanding in physics. The research was conducted within three content areas: kinematics, waves, and dynamics. The main focus of the research on instruction was the determination of factors needed for a computer program to be instructionally effective. The emphasis in the research on student understanding was the identification of specific conceptual and reasoning difficulties students encounter with the subject matter. Most of the research was conducted using the computer-based interview, a technique developed during the early part of the work, conducted within the domain of kinematics. In a computer-based interview, a student makes a prediction about how a particular system will behave under given circumstances, observes a simulation of the event on a computer screen, and then is asked by an interviewer to explain any discrepancy between prediction and observation. In the course of the research, a model was developed for producing educational software. The model has three important components: (i) research on student difficulties in the content area to be addressed, (ii) observations of students using the computer program, and (iii) consequent program modification. This model was used to guide the development of an instructional computer program dealing with graphical representations of transverse pulses. Another facet of the research involved the design of a computer program explicitly for the purposes of research. A computer program was written that simulates a modified Atwood's machine. The program was then used in computer-based interviews and proved to be an effective means of probing student understanding of dynamics concepts. In order to ascertain whether or not the student difficulties identified were peculiar to the computer, laboratory-based interviews with real equipment were also conducted. The laboratory-based interviews were designed to parallel the computer-based interviews as closely as possible. The results of both types of interviews are discussed in detail. The dissertation concludes with a discussion of some of the benefits of using the computer in physics instruction and physics education research. Attention is also drawn to some of the limitations of the computer as a research instrument or instructional device.
Emerging Uses of Computer Technology in Qualitative Research.
ERIC Educational Resources Information Center
Parker, D. Randall
The application of computer technology in qualitative research and evaluation ranges from simple word processing to doing sophisticated data sorting and retrieval. How computer software can be used for qualitative research is discussed. Researchers should consider the use of computers in data analysis in light of their own familiarity and comfort…
ERIC Educational Resources Information Center
Kieren, Thomas E.
This last paper in a set of four reviews research on a wide variety of computer applications in the mathematics classroom. It covers computer-based instruction, especially drill-and-practice and tutorial modes; computer-managed instruction; and computer-augmented problem-solving. Analytical comments on the findings and status of the research are…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, W. E.
2004-08-16
Computational science plays a major role in research and development in the mathematical, scientific, engineering, and biomedical disciplines. The Alliance for Computational Science Collaboration (ACSC) has the goal of training African-American and other minority scientists in the computational science field for eventual employment with the Department of Energy (DOE). The involvement of Historically Black Colleges and Universities (HBCUs) in the Alliance provides avenues for producing future DOE African-American scientists. Fisk University has been participating in this program through grants from the DOE. The DOE grant supported computational science activities at Fisk University. The research areas included energy-related projects, distributed computing, visualization of scientific systems, and biomedical computing. Students' involvement in computational science research included undergraduate summer research at Oak Ridge National Lab, on-campus research involving the participation of undergraduates, participation of undergraduates and faculty members in workshops, and mentoring of students. These activities enhanced research and education in computational science, thereby adding to Fisk University's spectrum of research and educational capabilities. Among the successes of the computational science activities is the acceptance of three undergraduate students to graduate schools with full scholarships beginning fall 2002 (one in a master's degree program and two in doctoral degree programs).
A Research Program in Computer Technology. 1986 Annual Technical Report
1989-08-01
Annual Technical Report, 1 July 1985 - June 1986: A Research Program in Computer Technology, ISI/SR-87-178, USC Information Sciences Institute, by the ISI research staff (unclassified). Subject terms include: survivable networks; distributed processing, local networks, personal computers, workstation environment; computer acquisition, Strategic Computing.
Computer Plotting Data Points in the Engine Research Building
1956-09-21
A female computer plotting compressor data in the Engine Research Building at the NACA's Lewis Flight Propulsion Laboratory. The Computing Section was introduced during World War II to relieve short-handed research engineers of some of the tedious data-taking work. The computers made the initial computations and plotted the data graphically. The researcher then analyzed the data and either summarized the findings in a report, made modifications, or ran the test again. With the introduction of mechanical computer systems in the 1950s, the female computers learned how to encode the punch cards. As the data processing capabilities increased, fewer female computers were needed. Many left on their own to start families, while others earned mathematical degrees and moved into advanced positions.
Activities of the Research Institute for Advanced Computer Science
NASA Technical Reports Server (NTRS)
Oliger, Joseph
1994-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.
Human-computer interaction: psychological aspects of the human use of computing.
Olson, Gary M; Olson, Judith S
2003-01-01
Human-computer interaction (HCI) is a multidisciplinary field in which psychology and other social sciences unite with computer science and related technical fields with the goal of making computing systems that are both useful and usable. It is a blend of applied and basic research, both drawing from psychological research and contributing new ideas to it. New technologies continuously challenge HCI researchers with new options, as do the demands of new audiences and uses. A variety of usability methods have been developed that draw upon psychological principles. HCI research has expanded beyond its roots in the cognitive processes of individual users to include social and organizational processes involved in computer usage in real environments as well as the use of computers in collaboration. HCI researchers need to be mindful of the longer-term changes brought about by the use of computing in a variety of venues.
NASA Astrophysics Data System (ADS)
Ahn, Sul-Ah; Jung, Youngim
2016-10-01
The research activities of computational physicists utilizing high-performance computing are analyzed by bibliometric approaches. This study aims to provide computational physicists utilizing high-performance computing, as well as policy planners, with useful bibliometric results for an assessment of research activities. To achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles from Elsevier's Scopus database covering the time period 2004-2013. We ranked authors in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.
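A co-authorship network of the kind described can be assembled by treating each paper's author list as a clique, with edge weights counting repeated collaborations. A minimal sketch follows, with a toy paper list standing in for the Scopus 2004-2013 records.

```python
import itertools
import networkx as nx

# Toy stand-in for bibliographic records: one author list per paper.
papers = [
    ["Kim", "Lee", "Park"],
    ["Kim", "Lee"],
    ["Park", "Choi"],
    ["Lee", "Choi", "Kim"],
]

G = nx.Graph()
for authors in papers:
    # Every unordered author pair on a paper gets (or strengthens) an edge.
    for a, b in itertools.combinations(sorted(set(authors)), 2):
        w = G[a][b]["weight"] + 1 if G.has_edge(a, b) else 1
        G.add_edge(a, b, weight=w)

# Rank authors by weighted degree and inspect a simple network feature.
print(sorted(G.degree(weight="weight"), key=lambda kv: -kv[1]))
print("network density:", nx.density(G))
```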
An overview of computer viruses in a research environment
NASA Technical Reports Server (NTRS)
Bishop, Matt
1991-01-01
The threat of attack by computer viruses is in reality a very small part of a much more general threat, specifically threats aimed at subverting computer security. Here, computer viruses are examined as malicious logic in a research and development environment. A relation is drawn between the viruses and various models of security and integrity. Current research techniques aimed at controlling the threats posed to computer systems by viruses in particular and malicious logic in general are examined. Finally, a brief examination of the vulnerabilities of research and development systems that malicious logic and computer viruses may exploit is undertaken.
NASA Technical Reports Server (NTRS)
1993-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics and computer science during the period April 1, 1993 through September 30, 1993. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustic and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.
NASA Technical Reports Server (NTRS)
1994-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.
Using Computational Toxicology to Enable Risk-Based ...
Slide presentation at Drug Safety Gordon Research Conference 2016 on research efforts in NCCT to enable Computational Toxicology to support risk assessment.
NASA Astrophysics Data System (ADS)
2015-05-01
The 6th edition of the Workshop for Young Scientists on the Physics of Ultrarelativistic Nucleus-Nucleus Collisions (Hot Quarks 2014) was held in Las Negras, Spain from 21-28 September 2014. Following the traditions of the conference, this meeting gathered more than 70 participants in the first years of their scientific careers. The present issue contains the proceedings of this workshop. As in the past, the Hot Quarks workshop offered a unique atmosphere for a lively discussion and interpretation of the current measurements from high energy nuclear collisions. Recent results and upgrades at CERN's Large Hadron Collider (LHC) and Brookhaven's Relativistic Heavy Ion Collider (RHIC) were presented. Recent theoretical developments were also extensively discussed as well as the perspectives for future facilities such as the Facility for Antiproton and Ion Research (FAIR) at Darmstadt and the Electron-Ion Collider at Brookhaven. The conference's goal to provide a platform for young researchers to learn and foster their interactions was successfully met. We wish to thank the sponsors of the Hot Quarks 2014 Conference, who supported the authors of this volume: Brookhaven National Laboratory (USA), CPAN (Spain), Czech Science Foundation (GACR) under grant 13-20841S (Czech Republic), European Laboratory for Particle Physics CERN (Switzerland), European Research Council under grant 259612 (EU), ExtreMe Matter Institute EMMI (Germany), Helmholtz Association and GSI under grant VH-NG-822, Helmholtz International Center for FAIR (Germany), National Science Foundation under grant No.1359622 (USA), Nuclear Physics Institute ASCR (Czech Republic), Patronato de la Alhambra y Generalife (Spain) and the Universidad de Granada (Spain). Javier López Albacete, Universidad de Granada (Spain) Jana Bielcikova, Nuclear Physics Inst. and Academy of Sciences (Czech Republic) Rainer J. Fries, Texas A&M University (USA) Raphaël Granier de Cassagnac, CNRS-IN2P3 and École polytechnique (France) Boris Hippolyte, CNRS-IN2P3 and Université de Strasbourg (France) Jiangyong Jia, Stony Brook University and Brookhaven National Laboratory (USA) André Mischke, Utrecht University and Nikhef Amsterdam (The Netherlands) Ágnes Mócsy, Pratt Institute and Brookhaven National Laboratory (USA) Hannah Petersen, Goethe University, FIAS and GSI (Germany) Lijuan Ruan, Brookhaven National Laboratory (USA) Sevil Salur, Rutgers University, (USA)
Clinical study using a new phacoemulsification system with surgical intraocular pressure control.
Solomon, Kerry D; Lorente, Ramón; Fanney, Doug; Cionni, Robert J
2016-04-01
To compare cumulative dissipated energy (CDE), aspiration fluid used, and aspiration time during phacoemulsification cataract extraction using 2 surgical configurations. Two clinical sites in the United States and 1 in Spain. Prospective randomized clinical case series. For each patient, the first eye having surgery was randomized to the active-fluidics configuration (Centurion Vision System with Active Fluidics, 0.9 mm 45-degree Intrepid Balanced tip, and 0.9 mm Intrepid Ultra infusion sleeve) or the gravity-fluidics configuration (Infiniti Vision System with gravity fluidics, 0.9 mm 45-degree Mini-Flared Kelman tip, and 0.9 mm Ultra infusion sleeve). Second-eye surgery was completed within 14 days after first-eye surgery using the alternate configuration. The CDE, aspiration fluid used, and aspiration time were compared between configurations, and adverse events were summarized. Patient demographics and cataract characteristics were similar between configurations (100 per group). The CDE was significantly lower with the active-fluidics configuration than with the gravity-fluidics configuration (mean ± standard error, 4.32 ± 0.28 percent-seconds) (P < .001). The active-fluidics configuration used significantly less aspiration fluid than the gravity-fluidics configuration (mean 46.56 ± 1.39 mL versus 52.68 ± 1.40 mL) (P < .001) and required significantly shorter aspiration time (mean 151.9 ± 4.1 seconds versus 167.6 ± 4.1 seconds) (P < .001). No serious ocular adverse events related to the study devices or device deficiencies were observed. Significantly less CDE, aspiration fluid used, and aspiration time were observed with the active-fluidics configuration than with the gravity-fluidics configuration, showing improved surgical efficiency. Drs. Solomon and Cionni are consultants to Alcon Research, Ltd., and received compensation for conduct of the study. Dr. Lorente received compensation for clinical work in the study. Mr. Fanney is an employee of Alcon Research, Ltd. Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Computational structural mechanics methods research using an evolving framework
NASA Technical Reports Server (NTRS)
Knight, N. F., Jr.; Lotts, C. G.; Gillian, R. E.
1990-01-01
Advanced structural analysis and computational methods that exploit high-performance computers are being developed in a computational structural mechanics research activity sponsored by the NASA Langley Research Center. These new methods are developed in an evolving framework and applied to representative complex structural analysis problems from the aerospace industry. An overview of the methods development environment is presented, and methods research areas are described. Selected application studies are also summarized.
Research in applied mathematics, numerical analysis, and computer science
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.
Code of Federal Regulations, 2012 CFR
2012-10-01
Section 252.227-7018, Rights in noncommercial technical data and computer software—Small Business Innovation Research (SBIR) Program; includes the clause Rights in Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (MAR 2011)...
Models of Educational Computing @ Home: New Frontiers for Research on Technology in Learning.
ERIC Educational Resources Information Center
Kafai, Yasmin B.; Fishman, Barry J.; Bruckman, Amy S.; Rockman, Saul
2002-01-01
Discusses models of home educational computing that are linked to learning in school and recommends the need for research that addresses the home as a computer-based learning environment. Topics include a history of research on educational computing at home; technological infrastructure, including software and compatibility; Internet access;…
Division of Computer Research Summary of Awards. Fiscal Year 1984.
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC. Directorate for Mathematical and Physical Sciences.
Provided in this report are summaries of grants awarded by the National Science Foundation Division of Computer Research in fiscal year 1984. Similar areas of research are grouped (for the purposes of this report only) into these major categories: (1) computational mathematics; (2) computer systems design; (3) intelligent systems; (4) software…
Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George
2001-01-01
The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.
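To give a flavor of the message-passing workloads such Beowulf clusters of commodity PCs run, here is a minimal MPI example in Python (mpi4py assumed available); it illustrates the style of parallelism, not code from the USGS project.

```python
# Run with, e.g.: mpiexec -n 4 python pi_midpoint.py  (filename hypothetical)
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Midpoint-rule quadrature of 4/(1+x^2) on [0,1], which integrates to pi.
# Each rank takes every size-th sample point, so the work is evenly split.
n = 10_000_000
xs = (np.arange(rank, n, size) + 0.5) / n
local = np.sum(4.0 / (1.0 + xs * xs)) / n

# Combine the partial sums on rank 0.
pi = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print(f"pi ~= {pi:.10f}")
```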
NASA Astrophysics Data System (ADS)
Zuhrie, M. S.; Basuki, I.; Asto B, I. G. P.; Anifah, L.
2018-01-01
The focus of the research is a teaching module that incorporates manufacturing, mechanical design planning, system control through microprocessor technology, and the maneuverability of the robot. Computer-interactive and computer-assisted learning are strategies that emphasize the use of computers and learning aids in teaching and learning activities. This research applied the 4-D research and development model suggested by Thiagarajan et al. (1974). The 4-D model consists of four stages: the Define stage, the Design stage, the Develop stage, and the Disseminate stage. The research applied a development design with the objective of producing a learning tool in the form of intelligent robot modules and a kit based on computer-interactive learning and computer-assisted learning. From the data of the Indonesia Robot Contest during the period 2009-2015, it can be seen that the modules that have been developed confirm the fourth stage of the development method, the Disseminate stage. The modules guide students to produce an intelligent robot tool for teaching based on computer-interactive learning and computer-assisted learning. Student responses also showed positive feedback on the robotics module and computer-based interactive learning.
Buchanan, Elizabeth; Aycock, John; Dexter, Scott; Dittrich, David; Hvizdak, Erin
2011-06-01
This paper explores the growing concerns with computer science research, and in particular, computer security research and its relationship with the committees that review human subjects research. It offers cases that review boards are likely to confront, and provides a context for appropriate consideration of such research, as issues of bots, clouds, and worms enter the discourse of human subjects review.
Why Don't All Professors Use Computers?
ERIC Educational Resources Information Center
Drew, David Eli
1989-01-01
Discusses the adoption of computer technology at universities and examines reasons why some professors don't use computers. Topics discussed include computer applications, including artificial intelligence, social science research, statistical analysis, and cooperative research; appropriateness of the technology for the task; the Computer Aptitude…
Good enough practices in scientific computing.
Wilson, Greg; Bryan, Jennifer; Cranston, Karen; Kitzes, Justin; Nederbragt, Lex; Teal, Tracy K
2017-06-01
Computers are now essential in all branches of science, but most researchers are never taught the equivalent of basic lab skills for research computing. As a result, data can get lost, analyses can take much longer than necessary, and researchers are limited in how effectively they can work with software and data. Computing workflows need to follow the same practices as lab projects and notebooks, with organized data, documented steps, and the project structured for reproducibility, but researchers new to computing often don't know where to start. This paper presents a set of good computing practices that every researcher can adopt, regardless of their current level of computational skill. These practices, which encompass data management, programming, collaborating with colleagues, organizing projects, tracking work, and writing manuscripts, are drawn from a wide variety of published sources from our daily lives and from our work with volunteer organizations that have delivered workshops to over 11,000 people since 2010.
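The practices the paper describes begin with an organized, reproducible project structure. Below is a small sketch that creates a layout along the lines the paper recommends (doc, data, src, results, plus README and CITATION files); the specific names and the project directory are our own illustrative choices.

```python
from pathlib import Path

# Skeleton of a "good enough" research project layout (names illustrative).
project = Path("myproject")
for d in ["doc", "data/raw", "data/processed", "src", "results"]:
    (project / d).mkdir(parents=True, exist_ok=True)

# Top-level files documenting what the project is and how to credit it.
(project / "README.md").write_text("Project overview and provenance notes.\n")
(project / "CITATION.cff").touch()
```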
An Overview of NASA's Intelligent Systems Program
NASA Technical Reports Server (NTRS)
Cooke, Daniel E.; Norvig, Peter (Technical Monitor)
2001-01-01
NASA and the computer science research community are poised to enter a critical era, an era in which, it seems, each needs the other. Market forces, driven by the immediate economic viability of computer science research results, place computer science in a relatively novel position. These forces affect how research is done and could, in the worst case, drive the field away from significant innovation, opting instead for incremental advances that yield greater stability in the marketplace. NASA, however, requires significant advances in computer science research in order to accomplish the exploration and science agenda it has set out for itself. NASA may indeed be poised to advance computer science research in this century much the way it advanced aero-based research in the last.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-07
... Rehabilitation Research--Disability and Rehabilitation Research Projects--Inclusive Cloud and Web Computing... Rehabilitation Research Projects (DRRPs)--Inclusive Cloud and Web Computing. Notice inviting applications for new...#DRRP. Priorities: Priority 1--DRRP on Inclusive Cloud and Web Computing--is from the notice of final...
Advanced Biomedical Computing Center (ABCC) | DSITP
The Advanced Biomedical Computing Center (ABCC), located in Frederick, Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to engage in collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.
Graphics supercomputer for computational fluid dynamics research
NASA Astrophysics Data System (ADS)
Liaw, Goang S.
1994-11-01
The objective of this project is to purchase a state-of-the-art graphics supercomputer to improve the Computational Fluid Dynamics (CFD) research capability at Alabama A & M University (AAMU) and to support Air Force research projects. A cutting-edge graphics supercomputer system, Onyx VTX, from Silicon Graphics Computer Systems (SGI), was purchased and installed. Other equipment, including a desktop personal computer (PC-486 DX2) with a built-in 10-BaseT Ethernet card, a 10-BaseT hub, an Apple Laser Printer Select 360, and a notebook computer from Zenith, was also purchased. A reading room was converted into a research computer lab by adding furniture and an air conditioning unit in order to provide an appropriate working environment for researchers and the purchased equipment. All the purchased equipment was successfully installed and is fully functional. Several research projects, including two existing Air Force projects, are being performed using these facilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hules, John
This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review in the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.
ERIC Educational Resources Information Center
Lesgold, Alan; Reif, Frederick
The future of computers in education and the research needed to realize the computer's potential are discussed in this report, which presents a summary and the conclusions from an invitational conference involving 40 computer scientists, psychologists, educational researchers, teachers, school administrators, and parents. The summary stresses the…
Marshall, Thomas; Champagne-Langabeer, Tiffiany; Castelli, Darla; Hoelscher, Deanna
2017-12-01
To present research models based on artificial intelligence and discuss the concept of cognitive computing and eScience as disruptive factors in health and life science research methodologies. The paper identifies big data as a catalyst for innovation and for the development of artificial intelligence, presents a framework for computer-supported human problem solving, and describes a transformation of research support models. This framework includes traditional computer support; federated cognition using machine learning and cognitive agents to augment human intelligence; and a semi-autonomous/autonomous cognitive model, based on deep machine learning, which supports eScience. The paper provides a forward view of the impact of artificial intelligence on human-computer support and research methods in health and life science research. Cognitive computing and eScience research models, which augment or amplify human task performance with artificial intelligence, are discussed as novel and innovative systems for developing more effective adaptive obesity intervention programs.
Computer Science Research at Langley
NASA Technical Reports Server (NTRS)
Voigt, S. J. (Editor)
1982-01-01
A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, Dongwan; Claycomb, William R.; Urias, Vincent E.
Cloud computing is a paradigm rapidly being embraced by government and industry as a solution for cost-savings, scalability, and collaboration. While a multitude of applications and services are available commercially for cloud-based solutions, research in this area has yet to address the full spectrum of potential challenges facing cloud computing. This tutorial aims to provide researchers with a fundamental understanding of cloud computing, with the goals of identifying a broad range of potential research topics and inspiring a new surge in research to address current issues. We will also discuss real implementations of research-oriented cloud computing systems for both academia and government, including configuration options, hardware issues, challenges, and solutions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qian, Xiaoqing; Deng, Z. T.
2009-11-10
This is the final report for the Department of Energy (DOE) project DE-FG02-06ER25746, entitled "Continuing High Performance Computing Research and Education at AAMU". This three-year project began on August 15, 2006, and ended on August 14, 2009. The objective of this project was to enhance high performance computing research and education capabilities at Alabama A&M University (AAMU), and to train African-American and other minority students and scientists in the computational science field for eventual employment with DOE. AAMU successfully completed all the proposed research and educational tasks. Through the support of DOE, AAMU was able to provide opportunities to minority students through summer internships and the DOE computational science scholarship program. In the past three years, AAMU (1) supported three graduate research assistants in image processing for a hypersonic shockwave control experiment and in computational science related areas; (2) recruited and provided full financial support for six AAMU undergraduate summer research interns to participate in the Research Alliance in Math and Science (RAMS) program at Oak Ridge National Laboratory (ORNL); (3) awarded 30 highly competitive DOE High Performance Computing Scholarships ($1500 each) to qualified top AAMU undergraduate students in science and engineering majors; (4) improved the high performance computing laboratory at AAMU with the addition of three high performance Linux workstations; and (5) conducted image analysis for the electromagnetic shockwave control experiment and computation of shockwave interactions to verify the design and operation of the AAMU supersonic wind tunnel. The high performance computing research and education activities at AAMU had a great impact on minority students. As praised by the Accreditation Board for Engineering and Technology (ABET) in 2009, "The work on high performance computing that is funded by the Department of Energy provides scholarships to undergraduate students as computational science scholars. This is a wonderful opportunity to recruit under-represented students." Three ASEE papers were published in the 2007, 2008, and 2009 proceedings of the ASEE Annual Conferences, respectively, and were presented at those conferences. It is critical to continue these research and education activities.
NASA Technical Reports Server (NTRS)
Gillian, Ronnie E.; Lotts, Christine G.
1988-01-01
The Computational Structural Mechanics (CSM) Activity at Langley Research Center is developing methods for structural analysis on modern computers. To facilitate that research effort, an applications development environment has been constructed to insulate the researcher from the many computer operating systems of a widely distributed computer network. The CSM Testbed development system was ported to the Numerical Aerodynamic Simulator (NAS) Cray-2, at the Ames Research Center, to provide a high end computational capability. This paper describes the implementation experiences, the resulting capability, and the future directions for the Testbed on supercomputers.
First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)
NASA Technical Reports Server (NTRS)
Denning, P. J.
1986-01-01
The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.
ERIC Educational Resources Information Center
Tuncer, Murat
2013-01-01
The present research investigates reciprocal relations among computer self-efficacy, scientific research self-efficacy, and information literacy self-efficacy. Research findings demonstrated that, according to standardized regression coefficients, computer self-efficacy has a positive effect on information literacy self-efficacy. Likewise it has been detected…
Code of Federal Regulations, 2010 CFR
2010-10-01
... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Innovation Research (SBIR) Program. As prescribed in 227.7104(a), use the following clause: Rights in Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (JUN 1995...
Code of Federal Regulations, 2014 CFR
2014-10-01
... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Innovation Research (SBIR) Program. As prescribed in 227.7104(a), use the following clause: Rights in Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (FEB 2014...
Code of Federal Regulations, 2011 CFR
2011-10-01
... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Innovation Research (SBIR) Program. As prescribed in 227.7104(a), use the following clause: Rights in Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (MAR 2011...
Code of Federal Regulations, 2013 CFR
2013-10-01
... technical data and computer software-Small Business Innovation Research (SBIR) Program. 252.227-7018 Section... Innovation Research (SBIR) Program. As prescribed in 227.7104(a), use the following clause: Rights in Noncommercial Technical Data and Computer Software—Small Business Innovation Research (SBIR) Program (MAY 2013...
Action Research of Computer-Assisted-Remediation of Basic Research Concepts.
ERIC Educational Resources Information Center
Packard, Abbot L.; And Others
This study investigated the possibility of creating a computer-assisted remediation program to assist students having difficulties in basic college research and statistics courses. A team approach involving instructors and students drove the research into and creation of the computer program. The effect of student use was reviewed by looking at…
Values and Objectives in Computing Education Research
ERIC Educational Resources Information Center
Pears, Arnold; Malmi, Lauri
2009-01-01
What is Computing Education Research (CER), why are we doing this type of research, and what should the community achieve? As associate editors to this special edition we provide our perspectives and discuss how they have influenced the evolution of the Koli Calling International Conference on Computing Education Research over the last nine years.…
[Activities of Research Institute for Advanced Computer Science
NASA Technical Reports Server (NTRS)
Gross, Anthony R. (Technical Monitor); Leiner, Barry M.
2001-01-01
The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed to enable spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning, and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs, and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.
ERIC Educational Resources Information Center
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-01-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new…
Open-Source Software in Computational Research: A Case Study
Syamlal, Madhava; O'Brien, Thomas J.; Benyahia, Sofiane; ...
2008-01-01
A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; the facilitation of peer review of the results of computational research.
Structural biology computing: Lessons for the biomedical research sciences.
Morin, Andrew; Sliz, Piotr
2013-11-01
The field of structural biology, whose aim is to elucidate the molecular and atomic structures of biological macromolecules, has long been at the forefront of biomedical sciences in adopting and developing computational research methods. Operating at the intersection between biophysics, biochemistry, and molecular biology, structural biology's growth into a foundational framework on which many concepts and findings of molecular biology are interpreted has depended largely on parallel advancements in computational tools and techniques. Without these computing advances, modern structural biology would likely have remained an exclusive pursuit practiced by few, and not become the widely practiced, foundational field it is today. As other areas of biomedical research increasingly embrace research computing techniques, the successes, failures and lessons of structural biology computing can serve as a useful guide to progress in other biomedically related research fields. Copyright © 2013 Wiley Periodicals, Inc.
Average focal length and power of a section of any defined surface.
Kaye, Stephen B
2010-04-01
To provide a method to allow calculation of the average focal length and power of a lens through a specified meridian of any defined surface, not limited to the paraxial approximations. University of Liverpool, Liverpool, United Kingdom. Functions were derived to model back-vertex focal length and representative power through a meridian containing any defined surface. Average back-vertex focal length was based on the definition of the average of a function, using the angle of incidence as an independent variable. Univariate functions allowed determination of average focal length and power through a section of any defined or topographically measured surface of a known refractive index. These functions incorporated aberrations confined to the section. The proposed method closely approximates the average focal length, and by inference power, of a section (meridian) of a surface to a single or scalar value. It is not dependent on the paraxial and other nonconstant approximations and includes aberrations confined to that meridian. A generalization of this method to include all orthogonal and oblique meridians is needed before a comparison with measured wavefront values can be made. Copyright (c) 2010 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
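For readers unfamiliar with the averaging step invoked above, the "average of a function" is the standard calculus definition. Writing the back-vertex focal length as a function f(\theta) of the angle of incidence \theta over a section bounded by \theta_1 and \theta_2 (these symbols are illustrative, not the paper's own notation):

    \bar{f} = \frac{1}{\theta_2 - \theta_1} \int_{\theta_1}^{\theta_2} f(\theta)\, d\theta

The single scalar value the method reports is this mean, and the representative power then follows by inference from the average focal length, as the abstract notes.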
Cost analysis of objective resident cataract surgery assessments.
Nandigam, Kiran; Soh, Jonathan; Gensheimer, William G; Ghazi, Ahmed; Khalifa, Yousuf M
2015-05-01
To compare 8 ophthalmology resident surgical training tools to determine which is most cost effective. University of Rochester Medical Center, Rochester, New York, USA. Retrospective evaluation of technology. A cost-analysis model was created to compile all relevant costs of running each tool in a medium-sized ophthalmology program. Quantitative cost estimates were obtained based on the cost of tools, the cost of time in evaluations, and supply and maintenance costs. For wet laboratory simulation, the Eyesi was the least expensive cataract surgery simulation method; however, it is only capable of evaluating simulated cataract surgery rehearsal and requires supplementation with other evaluative methods for operating room performance and for noncataract wet lab training and evaluation. The most expensive training tool was the Eye Surgical Skills Assessment Test (ESSAT). The 2 most affordable methods for resident evaluation in operating room performance were the Objective Assessment of Skills in Intraocular Surgery (OASIS) and the Global Rating Assessment of Skills in Intraocular Surgery (GRASIS). Cost-based analysis of ophthalmology resident surgical training tools is needed so residency programs can implement tools that are valid, reliable, objective, and cost effective. There is no perfect training system at this time. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Minami, Keiichiro; Honbo, Masato; Mori, Yosai; Kataoka, Yasushi; Miyata, Kazunori
2015-11-01
To compare area densitometry analysis using rotating Scheimpflug photography with previous anterior-segment analyzer measurement in quantifications of posterior capsule opacification (PCO) and surface light scattering. Miyata Eye Hospital, Miyazaki, Japan. Prospective observational case series. Scheimpflug images of eyes with foldable intraocular lenses (IOLs) were obtained using rotating and fixed Scheimpflug photography. Area densitometry on the posterior and anterior surfaces was conducted for PCO and surface light scattering analyses, respectively, with an identical area size. Correlation between the two measurements was analyzed using linear regression. The study included 105 eyes of 74 patients who had received IOLs 1 to 18 years (mean 4.9 ± 4.5 years) previously. In the PCO analysis on the posterior IOL surface, there was a significant correlation between the two measurements (P < .001, R(2) = 0.60). In the surface light scattering analysis, a significant and higher correlation was obtained (P < .001, R(2) = 0.91) until the fixed Scheimpflug photography exhibited saturation due to intensive scattering. Area densitometry combined with rotating Scheimpflug photography was interchangeable with the previously established densitometry measurement and allowed successive evaluation over longer-term observations. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
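The R(2) values quoted above are the usual goodness-of-fit of a simple linear regression of one instrument's densitometry readings on the other's. A minimal sketch of that computation in Python, where the paired readings are hypothetical placeholders rather than the study's data:

    # Regression of rotating-Scheimpflug readings on fixed-Scheimpflug
    # readings; the paired values below are hypothetical, not study data.
    from scipy.stats import linregress

    fixed    = [9.8, 12.9, 14.6, 18.9, 20.5]
    rotating = [10.2, 12.5, 15.1, 18.4, 21.0]

    fit = linregress(fixed, rotating)
    print(f"slope={fit.slope:.2f}, R^2={fit.rvalue**2:.2f}, p={fit.pvalue:.3g}")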
Optimal Colostomy Placement in Spinal Cord Injury Patients.
Xu, Jiashou; Dharmarajan, Sekhar; Johnson, Frank E
2016-03-01
Barring unusual circumstances, sigmoid colostomy is the optimal technique for management of defecation in spinal cord injury (SCI) patients. We sought to provide evidence that a sigmoid colostomy is not difficult to perform in SCI patients and has better long-term results. The St. Louis Department of Veterans Affairs has a Commission on Accreditation of Rehabilitation Facilities (CARF)-approved SCI Unit. We reviewed the operative notes on all SCI patients who received a colostomy for fecal management by three ASCRS-certified colorectal surgeons at the St. Louis Department of Veterans Affairs from January 1, 2007 to November 26, 2012. There were 27 operations for which the recorded indication for surgery suggested that the primary disorder was SCI. Fourteen had traumatic SCI of the thoracic and/or lumbar spine and were evaluable. Of these 14 patients, 12 had laparoscopic sigmoid colostomy and two had open sigmoid colostomy. We encountered one evaluable patient with a remarkably large amount of retroperitoneal bony debris who successfully underwent laparoscopic sigmoid colostomy. In conclusion, sigmoid colostomy is the consensus optimal procedure for fecal management in SCI patients. Laparoscopic procedures are preferred. Care providers should specify sigmoid colostomy when contacting a surgeon.
Management of long-standing partially torn and flipped laser in situ keratomileusis flaps.
Kim, Jin Sun; Chung, Byunghoon; Lee, Taekjune; Kim, Woon Cho; Kim, Tae-im; Kim, Eung Kweon
2015-02-01
We describe 2 cases of traumatized and torn laser in situ keratomileusis (LASIK) flaps, partially flipped anteriorly or posteriorly, fixed for 8 months or 4 months, and accompanied by epithelial ingrowth. The 2 patients had had uneventful bilateral LASIK 6 years and 1 year before the trauma. In Case 1, the anteriorly flipped flap was removed with transepithelial phototherapeutic keratectomy. Next, mitomycin-C 0.04% was applied for 30 seconds. In Case 2, the portion of the flap that was flipped posteriorly and buried under the remaining intact LASIK flap was restored to its original normal position and epithelial ingrowth was removed mechanically with a microcurette. Irrigation with 20% ethanol was performed to inhibit the recurrence of interfacial epithelial ingrowth. The stretched amniotic membrane overlay over the cornea and sclera was sutured tightly to the episclera as the biologic pressure patch for the inhibition of epithelial re-ingrowth. Good visual acuity was restored in both cases. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Negative dysphotopsia: A perfect storm.
Henderson, Bonnie An; Geneva, Ivayla I
2015-10-01
The objective of this review was to provide a summary of the peer-reviewed literature on the etiologies of negative dysphotopsia that occurs after routine cataract surgery. A search of PubMed, Google Scholar, and Retina Medical identified 59 reports. Negative dysphotopsia has been associated with many types of intraocular lenses (IOLs), including hydrophobic and hydrophilic acrylic, silicone, and 1-piece and 3-piece designs. Proposed etiologies include edge design, edge smoothness, edge thickness, index of refraction of the IOL, pupil size, amount of functional nasal retina, edema from the clear corneal incision, distance between the iris and IOL, amount of pigmentation of the eye, corneal shape, prominent globe and shallow orbit, and interaction between the anterior capsulorhexis and IOL. Treatments include a piggyback IOL, reverse optic capture, dilation of the pupil, constriction of the pupil, neodymium:YAG capsulotomy of the nasal portion of the anterior capsule, IOL exchange with round-edged optics, and time alone. This review summarizes the findings. Dr. Henderson is a consultant to Alcon Laboratories, Inc., Abbott Medical Optics, Inc., Bausch & Lomb, and Genzyme Corp. Neither author has a financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Computational mechanics and physics at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
South, Jerry C., Jr.
1987-01-01
An overview is given of computational mechanics and physics at NASA Langley Research Center. Computational analysis is a major component and tool in many of Langley's diverse research disciplines, as well as in the interdisciplinary research. Examples are given for algorithm development and advanced applications in aerodynamics, transition to turbulence and turbulence simulation, hypersonics, structures, and interdisciplinary optimization.
ERIC Educational Resources Information Center
Paterson, Mark; Glass, Michael R.
2015-01-01
Google Glass was deployed in an Urban Studies field course to gather videographic data for team-based student research projects. We evaluate the potential for wearable computing technology such as Glass, in combination with other mobile computing devices, to enhance reflexive research skills, and videography in particular, during field research.…
Lynne M. Westphal
2000-01-01
By using computer packages designed for qualitative data analysis a researcher can increase trustworthiness (i.e., validity and reliability) of conclusions drawn from qualitative research results. This paper examines trustworthiness issues and the role of computer software (QSR's NUD*IST) in the context of a current research project investigating the social...
ISCR Annual Report: Fiscal Year 2004
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGraw, J R
2005-03-03
Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that ''high performance computing is the backbone of the nation's science and technology enterprise''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''feet and hands'' that carry those advances into the Laboratory and incorporates them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.
Computing through Scientific Abstractions in SysBioPS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, George; Stephan, Eric G.; Gracio, Deborah K.
2004-10-13
Today, biologists and bioinformaticists have a tremendous amount of computational power at their disposal. With the availability of supercomputers, burgeoning scientific databases and digital libraries such as GenBank and PubMed, and pervasive computational environments such as the Grid, biologists have access to a wealth of computational capabilities and scientific data at hand. Yet, the rapid development of computational technologies has far exceeded the typical biologist's ability to effectively apply the technology in their research. Computational sciences research and development efforts such as the Biology Workbench, BioSPICE (Biological Simulation Program for Intra-Cellular Evaluation), and BioCoRE (Biological Collaborative Research Environment) are important in connecting biologists and their scientific problems to computational infrastructures. On the Computational Cell Environment and Heuristic Entity-Relationship Building Environment projects at the Pacific Northwest National Laboratory, we are jointly developing a new breed of scientific problem solving environment called SysBioPSE that will allow biologists to access and apply computational resources in the scientific research context. In contrast to other computational science environments, SysBioPSE operates as an abstraction layer above a computational infrastructure. The goal of SysBioPSE is to allow biologists to apply computational resources in the context of the scientific problems they are addressing and the scientific perspectives from which they conduct their research. More specifically, SysBioPSE allows biologists to capture and represent scientific concepts and theories and experimental processes, and to link these views to scientific applications, data repositories, and computer systems.
Digital processing of mesoscale analysis and space sensor data
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.
1985-01-01
The mesoscale analysis and space sensor (MASS) data management and analysis system on the research computer system is presented. The MASS data base management and analysis system was implemented on the research computer system, which provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, data base management, and display capabilities of the research computer system are described in terms of providing a very effective interactive research tool for the digital processing of mesoscale analysis and space sensor data.
Computer Science Research Funding: How Much Is Too Little?
2009-06-01
Bioinformatics, parallel computing, computational biology, principles of programming, computational neuroscience, real-time and embedded systems, scientific... National Security Agency (NSA) • Missile Defense Agency (MDA) and others. The various research programs have been coordinated through the DDR&E... DOD funding included only DARPA and OSD programs. FY07 and FY08 PBR funding included DARPA, NSA, some of the Services' basic and applied research
van den Berg, Yvonne H M; Gommans, Rob
2017-09-01
New technologies have led to several major advances in psychological research over the past few decades. Peer nomination research is no exception. Thanks to these technological innovations, computerized data collection is becoming more common in peer nomination research. However, computer-based assessment is more than simply programming the questionnaire and asking respondents to fill it in on computers. In this chapter the advantages and challenges of computer-based assessments are discussed. In addition, a list of practical recommendations and considerations is provided to inform researchers on how computer-based methods can be applied to their own research. Although the focus is on the collection of peer nomination data in particular, many of the requirements, considerations, and implications are also relevant for those who consider the use of other sociometric assessment methods (e.g., paired comparisons, peer ratings, peer rankings) or computer-based assessments in general. © 2017 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1986 through September 30, 1986 is summarized.
ERIC Educational Resources Information Center
Lesgold, Alan M., Ed.; Reif, Frederick, Ed.
The full proceedings are provided here of a conference of 40 teachers, educational researchers, and scientists from both the public and private sectors that centered on the future of computers in education and the research required to realize the computer's educational potential. A summary of the research issues considered and suggested means for…
NASA Astrophysics Data System (ADS)
McFall, Steve
1994-03-01
With the increase in business automation and the widespread availability and low cost of computer systems, law enforcement agencies have seen a corresponding increase in criminal acts involving computers. The examination of computer evidence is a new field of forensic science with numerous opportunities for research and development. Research is needed to develop new software utilities to examine computer storage media, expert systems capable of finding criminal activity in large amounts of data, and to find methods of recovering data from chemically and physically damaged computer storage media. In addition, defeating encryption and password protection of computer files is also a topic requiring more research and development.
Computational Toxicology as Implemented by the US EPA ...
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the T
A parallel-processing approach to computing for the geographic sciences
Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Haga, Jim; Maddox, Brian; Feller, Mark
2001-01-01
The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting research into various areas, such as advanced computer architecture, algorithms to meet the processing needs for real-time image and data processing, the creation of custom datasets from seamless source data, rapid turn-around of products for emergency response, and support for computationally intense spatial and temporal modeling.
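The appeal of such clusters is coarse-grained data parallelism: independent tiles or scenes are farmed out to many inexpensive processors. A minimal single-node sketch of that pattern in Python, where process_tile and the tile list are hypothetical stand-ins for real per-tile image or data processing:

    # Coarse-grained data parallelism of the kind a Beowulf-style cluster
    # exploits, shown here on one node with the standard library.
    from multiprocessing import Pool

    def process_tile(tile_id: int):
        # Hypothetical stand-in for real per-tile processing work.
        return tile_id, sum(i * i for i in range(10_000))

    if __name__ == "__main__":
        with Pool() as pool:
            for tile_id, result in pool.map(process_tile, range(8)):
                print(tile_id, result)

On an actual cluster the same decomposition is distributed across nodes (for example with MPI), but the per-tile independence is what makes the approach cost-effective.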
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler)
2003-01-01
The document contains the proceedings of the training workshop on Emerging and Future Computing Paradigms and their impact on the Research, Training and Design Environments of the Aerospace Workforce. The workshop was held at NASA Langley Research Center, Hampton, Virginia, March 18 and 19, 2003. The workshop was jointly sponsored by Old Dominion University and NASA. Workshop attendees came from NASA, other government agencies, industry and universities. The objectives of the workshop were to a) provide broad overviews of the diverse activities related to new computing paradigms, including grid computing, pervasive computing, high-productivity computing, and the IBM-led autonomic computing; and b) identify future directions for research that have high potential for future aerospace workforce environments. The format of the workshop included twenty-one half-hour overview-type presentations and three exhibits by vendors.
Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming
Philip A. Araman
1990-01-01
This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...
Collective Computation of Neural Network
1990-03-15
Sciences, Beijing. ABSTRACT: Computational neuroscience is a new branch of neuroscience originating from current research on the theory of computer... scientists working in artificial intelligence engineering and neuroscience. The paper introduces the collective computational properties of model neural... vision research. On this basis, the authors analyzed the significance of the Hopfield model. Key phrases: Computational Neuroscience, Neural Network, Model
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-15
... Rehabilitation Research--Disability and Rehabilitation Research Project--Inclusive Cloud and Web Computing CFDA... inclusive Cloud and Web computing. The Assistant Secretary may use this priority for competitions in fiscal... Priority for Inclusive Cloud and Web Computing'' in the subject line of your electronic message. FOR...
Computational toxicology is a new research initiative being developed within the Office of Research and Development (ORD) of the US Environmental Protection Agency (EPA). Operationally, it is defined as the application of mathematical and computer models together with molecular c...
What Research Says about Keyboarding Skills and Computer Anxiety.
ERIC Educational Resources Information Center
Artwohl, Mary Jane
A literature search identified 14 studies that were examined concerning keyboarding and computer anxiety. Although research on the relationship between keyboarding skills and computer anxiety is scarce, studies are being conducted to measure the effects of basic keyboarding skills on increased productivity. In addition, research is being performed…
Applied Information Systems Research Program (AISRP). Workshop 2: Meeting Proceedings
NASA Technical Reports Server (NTRS)
1992-01-01
The Earth and space science participants were able to see where current research can be applied in their disciplines, and computer science participants could see potential areas for future application of computer and information systems research. The Earth and Space Science research proposals for the High Performance Computing and Communications (HPCC) program were under evaluation; therefore, this effort was not discussed at the AISRP Workshop. OSSA's other high-priority area in computer science is scientific visualization, with the entire second day of the workshop devoted to it.
NASA Technical Reports Server (NTRS)
1988-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1988 through September 30, 1988.
NASA Technical Reports Server (NTRS)
1984-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.
NASA Technical Reports Server (NTRS)
1987-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.
Computational fluid dynamics at NASA Ames and the numerical aerodynamic simulation program
NASA Technical Reports Server (NTRS)
Peterson, V. L.
1985-01-01
Computers are playing an increasingly important role in the field of aerodynamics, such that they now serve as a major complement to wind tunnels in aerospace research and development. Factors pacing advances in computational aerodynamics are identified, including the amount of computational power required to take the next major step in the discipline. The four main areas of computational aerodynamics research at NASA Ames Research Center that are directed toward extending the state of the art are identified and discussed. Example results obtained from approximate forms of the governing equations are presented and discussed, both in the context of levels of computer power required and the degree to which they either further the frontiers of research or apply to programs of practical importance. Finally, the Numerical Aerodynamic Simulation Program--with its 1988 target of achieving a sustained computational rate of 1 billion floating-point operations per second--is discussed in terms of its goals, status, and its projected effect on the future of computational aerodynamics.
The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences.
Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker
2016-01-01
The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant's platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses.
Visser, Marco D.; McMahon, Sean M.; Merow, Cory; Dixon, Philip M.; Record, Sydne; Jongejans, Eelke
2015-01-01
Computation has become a critical component of research in biology. A risk has emerged that computational and programming challenges may limit research scope, depth, and quality. We review various solutions to common computational efficiency problems in ecological and evolutionary research. Our review pulls together material that is currently scattered across many sources and emphasizes those techniques that are especially effective for typical ecological and environmental problems. We demonstrate how straightforward it can be to write efficient code and implement techniques such as profiling or parallel computing. We supply a newly developed R package (aprof) that helps to identify computational bottlenecks in R code and determine whether optimization can be effective. Our review is complemented by a practical set of examples and detailed Supporting Information material (S1–S3 Texts) that demonstrate large improvements in computational speed (ranging from 10.5 times to 14,000 times faster). By improving computational efficiency, biologists can feasibly solve more complex tasks, ask more ambitious questions, and include more sophisticated analyses in their research. PMID:25811842
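The profile-first workflow described above, built around the authors' R package aprof, has a direct analogue in other languages. As an illustration only, here is a minimal sketch using Python's standard-library profiler to locate a bottleneck; slow_analysis is a hypothetical stand-in for a real analysis function:

    # Profile a function and list the top cumulative-time entries.
    # slow_analysis is a hypothetical placeholder, not from the paper.
    import cProfile
    import pstats

    def slow_analysis(n: int = 200_000) -> float:
        return sum(i ** 0.5 for i in range(n))

    cProfile.run("slow_analysis()", "profile.out")
    stats = pstats.Stats("profile.out")
    stats.sort_stats("cumulative").print_stats(5)  # five biggest bottlenecks

Only after profiling identifies where time is actually spent can a researcher judge whether optimization such as vectorization or parallel computing will be effective, which is the workflow the paper advocates.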
Computer-Based Assessments. Information Capsule. Volume 0918
ERIC Educational Resources Information Center
Blazer, Christie
2010-01-01
This Information Capsule reviews research conducted on computer-based assessments. Advantages and disadvantages associated with computer-based testing programs are summarized and research on the comparability of computer-based and paper-and-pencil assessments is reviewed. Overall, studies suggest that for most students, there are few if any…
NASA Technical Reports Server (NTRS)
1989-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.
Implementation of cloud computing in higher education
NASA Astrophysics Data System (ADS)
Asniar; Budiawan, R.
2016-04-01
Cloud computing research is a new trend in distributed computing in which service-oriented architecture (SOA) based applications are developed. This technology is especially useful for higher education. This research studied the need for and feasibility of cloud computing in higher education, and proposes a model of cloud computing services for higher education in Indonesia that can be implemented to support academic activities. A literature study was used as the research methodology to arrive at the proposed model. Finally, SaaS and IaaS are the cloud computing services proposed for implementation in higher education in Indonesia, and a hybrid cloud is the recommended service model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keyes, D E; McGraw, J R
2006-02-02
Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that ''computational science has become critical to scientific leadership, economic competitiveness, and national security''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''hands and feet'' that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.
The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.
CBRAIN: a web-based, distributed computing platform for collaborative neuroimaging research
Sherif, Tarek; Rioux, Pierre; Rousseau, Marc-Etienne; Kassis, Nicolas; Beck, Natacha; Adalat, Reza; Das, Samir; Glatard, Tristan; Evans, Alan C.
2014-01-01
The Canadian Brain Imaging Research Platform (CBRAIN) is a web-based collaborative research platform developed in response to the challenges raised by data-heavy, compute-intensive neuroimaging research. CBRAIN offers transparent access to remote data sources, distributed computing sites, and an array of processing and visualization tools within a controlled, secure environment. Its web interface is accessible through any modern browser and uses graphical interface idioms to reduce the technical expertise required to perform large-scale computational analyses. CBRAIN's flexible meta-scheduling has allowed the incorporation of a wide range of heterogeneous computing sites, currently including nine national research High Performance Computing (HPC) centers in Canada, one in Korea, one in Germany, and several local research servers. CBRAIN leverages remote computing cycles and facilitates resource-interoperability in a transparent manner for the end-user. Compared with typical grid solutions available, our architecture was designed to be easily extendable and deployed on existing remote computing sites with no tool modification, administrative intervention, or special software/hardware configuration. As of October 2013, CBRAIN serves over 200 users spread across 53 cities in 17 countries. The platform is built as a generic framework that can accept data and analysis tools from any discipline. However, its current focus is primarily on neuroimaging research and studies of neurological diseases such as Autism, Parkinson's and Alzheimer's diseases, and Multiple Sclerosis, as well as on normal brain structure and development. This technical report presents the CBRAIN Platform, its current deployment and usage, and future directions. PMID:24904400
ERIC Educational Resources Information Center
Kim, Karen A.; Fann, Amy J.; Misa-Escalante, Kimberly O.
2011-01-01
Building on research that identifies and addresses issues of women's underrepresentation in computing, this article describes promising practices in undergraduate research experiences that promote women's long-term interest in computer science and engineering. Specifically, this article explores whether and how REU programs include programmatic…
Review of Research on the Cognitive Effects of Computer-Assisted Learning.
ERIC Educational Resources Information Center
Mandinach, E.; And Others
This review of the research on the cognitive effects of computer-assisted instruction begins with an overview of the ACCCEL (Assessing Cognitive Consequences of Computer Environments for Learning) research program at the University of California at Berkeley, which consists of several interrelated studies examining the acquisition of such higher…
What Does Research on Computer-Based Instruction Have to Say to the Reading Teacher?
ERIC Educational Resources Information Center
Balajthy, Ernest
1987-01-01
Examines questions typically asked about the effectiveness of computer-based reading instruction, suggesting that these questions must be refined to provide meaningful insight into the issues involved. Describes several critical problems with existing research and presents overviews of research on the effects of computer-based instruction on…
Researchers at EPA’s National Center for Computational Toxicology (NCCT) integrate advances in biology, chemistry, exposure and computer science to help prioritize chemicals for further research based on potential human health risks. The goal of this research is to quickly evalua...
Abstract
The EPA sponsored a workshop held September 29-30, 2003 at the EPA in RTP that was focused on a proposal entitled "A Framework for a Computational Toxicology Research Program in ORD" (www.epa.gov/computox). Computational toxicology is a new research ini...
Computer Applications in Reading. Third Edition.
ERIC Educational Resources Information Center
Blanchard, Jay S.; And Others
Intended as a reference for researchers, teachers, and administrators, this book chronicles research, programs, and uses of computers in reading. Chapter 1 provides a broad view of computer applications in education, while Chapter 2 provides annotated references for computer based reading and language arts programs for children and adults in…
Computer Education and Computer Use by Preschool Educators
ERIC Educational Resources Information Center
Towns, Bernadette
2010-01-01
Researchers have found that teachers seldom use computers in the preschool classroom. However, little research has examined why preschool teachers elect not to use computers. This case study focused on identifying whether community colleges that prepare teachers for early childhood education include in their curriculum how teachers can effectively…
ERIC Educational Resources Information Center
Zagami, Jason
2015-01-01
Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…
ERIC Educational Resources Information Center
Bower, Beverly L.
1998-01-01
Reviews research on the instructional benefits of computer technology. Discusses the computer readiness of students, faculty, and institutions, and suggests that despite mixed findings, political and organizational realities indicate computer-based instruction is a feasible alternative for community colleges. Therefore, educators should continue…
Internal fluid mechanics research on supercomputers for aerospace propulsion systems
NASA Technical Reports Server (NTRS)
Miller, Brent A.; Anderson, Bernhard H.; Szuch, John R.
1988-01-01
The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid mechanics (ICFM) to a state of practical application for aerospace propulsion systems. The strategies used to achieve this goal are to: (1) pursue an understanding of flow physics, surface heat transfer, and combustion via analysis and fundamental experiments, (2) incorporate improved understanding of these phenomena into verified 3-D CFD codes, and (3) utilize state-of-the-art computational technology to enhance experimental and CFD research. Presented is an overview of the ICFM program in high-speed propulsion, including work in inlets, turbomachinery, and chemical reacting flows. Ongoing efforts to integrate new computer technologies, such as parallel computing and artificial intelligence, into high-speed aeropropulsion research are described.
HPCCP/CAS Workshop Proceedings 1998
NASA Technical Reports Server (NTRS)
Schulbach, Catherine; Mata, Ellen (Editor); Schulbach, Catherine (Editor)
1999-01-01
This publication is a collection of extended abstracts of presentations given at the HPCCP/CAS (High Performance Computing and Communications Program/Computational Aerosciences Project) Workshop held on August 24-26, 1998, at NASA Ames Research Center, Moffett Field, California. The objective of the Workshop was to bring together the aerospace high performance computing community, consisting of airframe and propulsion companies, independent software vendors, university researchers, and government scientists and engineers. The Workshop was sponsored by the HPCCP Office at NASA Ames Research Center. The Workshop consisted of over 40 presentations, including an overview of NASA's High Performance Computing and Communications Program and the Computational Aerosciences Project; ten sessions of papers representative of the high performance computing research conducted within the Program by the aerospace industry, academia, NASA, and other government laboratories; two panel sessions; and a special presentation by Mr. James Bailey.
Uncover the Cloud for Geospatial Sciences and Applications to Adopt Cloud Computing
NASA Astrophysics Data System (ADS)
Yang, C.; Huang, Q.; Xia, J.; Liu, K.; Li, J.; Xu, C.; Sun, M.; Bambacus, M.; Xu, Y.; Fay, D.
2012-12-01
Cloud computing is emerging as the future infrastructure for providing computing resources to support and enable scientific research, engineering development, and application construction, as well as workforce education. On the other hand, there is considerable doubt about the readiness of cloud computing to support a variety of scientific research, development, and education activities. This research, a project funded by NASA SMD, investigates through holistic studies how ready cloud computing is to support the geosciences. Four applications with different computing characteristics, spanning data, computing, concurrency, and spatiotemporal intensities, are used to test the readiness of cloud computing to support geosciences. Three popular and representative cloud platforms, Amazon EC2, Microsoft Azure, and NASA Nebula, as well as a traditional cluster, are utilized in the study. Results illustrate that the cloud is ready to some degree, but more research needs to be done to fully realize the cloud benefits as advertised by many vendors and defined by NIST. Specifically: 1) most cloud platforms can stand up a new computing instance, in effect a new computer, in a few minutes as envisioned, and are therefore ready to support most computing needs in an on-demand fashion; 2) load balancing and elasticity, a defining characteristic, are ready in some cloud platforms, such as Amazon EC2, to support bigger jobs that need responses in minutes, while others do not yet support elasticity and load balancing well, and all cloud platforms need further research and development to support real-time applications at the subminute level; 3) the user interfaces and functionality of cloud platforms vary widely; some are professional and well supported and documented, such as Amazon EC2, while others need significant improvement before the general public can adopt cloud computing without professional training or knowledge of computing infrastructure; 4) security is a major concern: given the sharing spirit of cloud computing, it is very hard to ensure high-level security unless a private cloud is built for a specific organization without public access, and public cloud platforms do not yet support the FISMA medium level and may never be able to support the FISMA high level; 5) HPC jobs are not well supported in cloud computing, with only Amazon EC2 supporting them well. The research is being used by NASA and other agencies in considering cloud computing adoption. We hope its publication will also help the public adopt cloud computing.
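Finding 1, that most platforms can stand up a new instance within minutes, corresponds to a single SDK call on, for example, Amazon EC2. The following is a minimal sketch using the boto3 SDK; it is illustrative rather than taken from the study, the AMI ID is a placeholder, and AWS credentials are assumed to be configured in the environment.

    # Illustrative sketch, not from the study: launch one on-demand EC2
    # instance and wait until it is running.
    import boto3

    ec2 = boto3.resource("ec2")
    instances = ec2.create_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder image ID
        InstanceType="t2.micro",
        MinCount=1,
        MaxCount=1,
    )
    instances[0].wait_until_running()  # typically completes within minutes
    instances[0].reload()              # refresh cached attributes
    print(instances[0].id, instances[0].state["Name"])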
NASA Technical Reports Server (NTRS)
Weeks, Cindy Lou
1986-01-01
Experiments were conducted at NASA Ames Research Center to define multi-tasking software requirements for multiple-instruction, multiple-data stream (MIMD) computer architectures. The focus was on specifying solutions for algorithms in the field of computational fluid dynamics (CFD). The program objectives were to allow researchers to produce usable parallel application software as soon as possible after acquiring MIMD computer equipment, to provide researchers with an easy-to-learn and easy-to-use parallel software language which could be implemented on several different MIMD machines, and to enable researchers to list preferred design specifications for future MIMD computer architectures. Analysis of CFD algorithms indicated that extensions of an existing programming language, adaptable to new computer architectures, provided the best solution to meeting program objectives. The CoFORTRAN Language was written in response to these objectives and to provide researchers a means to experiment with parallel software solutions to CFD algorithms on machines with parallel architectures.
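CoFORTRAN's syntax is not reproduced in this report; as a rough modern analogue of the MIMD style it targeted, the sketch below runs independent instruction streams on separate subdomains of a 1-D grid using Python multiprocessing. The example is invented; a real CFD code would also exchange boundary ("halo") values between neighboring subdomains.

    # Rough MIMD-style analogue: four workers each apply one Jacobi
    # relaxation sweep to their own slice of a 1-D grid.
    from multiprocessing import Pool

    def relax(subdomain):
        """One Jacobi sweep over the interior points of a slice."""
        u = list(subdomain)
        for i in range(1, len(u) - 1):
            u[i] = 0.5 * (subdomain[i - 1] + subdomain[i + 1])
        return u

    if __name__ == "__main__":
        grid = [float(i) for i in range(16)]
        chunks = [grid[i:i + 4] for i in range(0, 16, 4)]
        with Pool(4) as pool:
            print(pool.map(relax, chunks))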
The possible usability of three-dimensional cone beam computed dental tomography in dental research
NASA Astrophysics Data System (ADS)
Yavuz, I.; Rizal, M. F.; Kiswanjaya, B.
2017-08-01
Innovations in three-dimensional cone beam computed dental tomography (3D CBCT) continue to grow, as do its advantages and its potential uses in dental research. Imaging techniques are important for planning research in dentistry. Newly improved 3D CBCT imaging systems and accessory computer programs have recently been proven effective for use in dental research. The aim of this study is to introduce 3D CBCT and open a window to future research possibilities that deserve attention in dental research.
Jain, Arun Kumar; Malhotra, Chintan; Pasari, Anand; Kumar, Pawan; Moshirfar, Majid
2016-09-01
To compare the outcomes of topography-guided and wavefront-optimized treatment in patients having laser in situ keratomileusis (LASIK) for myopia. Advanced Eye Centre, Post Graduate Institute of Medical Education and Research, Chandigarh, India. Prospective contralateral-eye case study. Patients had topography-guided LASIK in 1 eye and wavefront-optimized LASIK in the contralateral eye using the Customized Refractive Surgery Master software and Mel 80 excimer laser. Refractive (residual manifest refraction spherical equivalent [MRSE], higher-order aberrations [HOAs]), and visual (uncorrected distance visual acuity [UDVA] and photopic and mesopic contrast sensitivity) outcomes were prospectively analyzed 6 months postoperatively. The study comprised 35 patients. The UDVA was 0.0 logMAR or better and the postoperative residual MRSE was ±0.50 diopter in 94.29% of eyes in the topography-guided group and 85.71% of eyes in the wavefront-optimized group (P = .09). More eyes in the topography-guided group than in the wavefront-optimized group had a UDVA of -0.1 logMAR or better (P = .04). Topography-guided LASIK was associated with less deterioration of mesopic contrast sensitivity at higher spatial frequencies (12 cycles per degree [cpd] and 18 cpd) and lower amounts of induced coma (P = .04) and spherical aberration (P = .04). Less stromal tissue was ablated in the topography-guided group (mean 61.57 μm ± 16.23 [SD]) than in the wavefront-optimized group (mean 79.71 ± 14.81 μm) (P < .001). Although topography-guided LASIK and wavefront-optimized LASIK gave excellent results, topography-guided LASIK was associated with better contrast sensitivity, lower induction of HOAs, and a smaller amount of tissue ablation. None of the authors has a financial or proprietary interest in any material or method mentioned. Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
The applications of computers in biological research
NASA Technical Reports Server (NTRS)
Wei, Jennifer
1988-01-01
Research in many fields could not be done without computers. There is often a great deal of technical data, even in the biological fields, that need to be analyzed. These data, unfortunately, previously absorbed much of every researcher's time. Now, due to the steady increase in computer technology, biological researchers are able to make incredible advances in their work without the added worries of tedious and difficult tasks such as the many mathematical calculations involved in today's research and health care.
The impact of home computer use on children's activities and development.
Subrahmanyam, K; Kraut, R E; Greenfield, P M; Gross, E F
2000-01-01
The increasing amount of time children are spending on computers at home and school has raised questions about how the use of computer technology may make a difference in their lives--from helping with homework to causing depression to encouraging violent behavior. This article provides an overview of the limited research on the effects of home computer use on children's physical, cognitive, and social development. Initial research suggests, for example, that access to computers increases the total amount of time children spend in front of a television or computer screen at the expense of other activities, thereby putting them at risk for obesity. At the same time, cognitive research suggests that playing computer games can be an important building block to computer literacy because it enhances children's ability to read and visualize images in three-dimensional space and track multiple images simultaneously. The limited evidence available also indicates that home computer use is linked to slightly better academic performance. The research findings are more mixed, however, regarding the effects on children's social development. Although little evidence indicates that the moderate use of computers to play games has a negative impact on children's friendships and family relationships, recent survey data show that increased use of the Internet may be linked to increases in loneliness and depression. Of most concern are the findings that playing violent computer games may increase aggressiveness and desensitize a child to suffering, and that the use of computers may blur a child's ability to distinguish real life from simulation. The authors conclude that more systematic research is needed in these areas to help parents and policymakers maximize the positive effects and to minimize the negative effects of home computers in children's lives.
Research and the Personal Computer.
ERIC Educational Resources Information Center
Blackburn, D. A.
1989-01-01
Discussed is the history and elements of the personal computer. Its uses as a laboratory assistant and generic toolkit for mathematical analysis and modeling are included. The future of the personal computer in research is addressed. (KR)
Research in Applied Mathematics, Fluid Mechanics and Computer Science
NASA Technical Reports Server (NTRS)
1999-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1998 through March 31, 1999.
[Research activities in applied mathematics, fluid mechanics, and computer science
NASA Technical Reports Server (NTRS)
1995-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1995 through September 30, 1995.
ERIC Educational Resources Information Center
Çelik, Halil Coskun
2015-01-01
The purpose of this study is to investigate the effects of computer courses on young individuals' computer self-efficacy, attitudes and achievement. The study group of this research included 60 unemployed young individuals (18-25 ages) in total; 30 in the experimental group and 30 in the control group. An experimental research model with pretest…
The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences
Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker
2016-01-01
The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant’s platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses. PMID:26752627
2010-03-01
…functionality and plausibility distinguishes this research from most research in computational linguistics and computational psycholinguistics. … Psycholinguistic theory: there is extensive psycholinguistic evidence that human language processing is essentially incremental and interactive. … challenges of psycholinguistic research is to explain how humans can process language effortlessly and accurately given the complexity and ambiguity that is…
The River Basin Model: Computer Output. Water Pollution Control Research Series.
ERIC Educational Resources Information Center
Envirometrics, Inc., Washington, DC.
This research report is part of the Water Pollution Control Research Series which describes the results and progress in the control and abatement of pollution in our nation's waters. The River Basin Model described is a computer-assisted decision-making tool in which a number of computer programs simulate major processes related to water use that…
NASA Technical Reports Server (NTRS)
1994-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in the areas of (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving Langley facilities and scientists; and (4) computer science.
NASA Technical Reports Server (NTRS)
1985-01-01
Slides are reproduced that describe the importance of having high-performance number-crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user, and that, in the long term, Ames knows the best possible solutions for number-crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using real-time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random in the same way that the dense and regular computed results are displayed.
Computer Games for the Math Achievement of Diverse Students
ERIC Educational Resources Information Center
Kim, Sunha; Chang, Mido
2010-01-01
Although computer games as a way to improve students' learning have received attention from many educational researchers, no consensus has been reached on the effects of computer games on student achievement. Moreover, there is a lack of empirical research on the differential effects of computer games on diverse learners. In response, this study…
Tracking the PhD Students' Daily Computer Use
ERIC Educational Resources Information Center
Sim, Kwong Nui; van der Meer, Jacques
2015-01-01
This study investigated PhD students' computer activities in their daily research practice. Software that tracks computer usage (Manic Time) was installed on the computers of nine PhD students, who were at the early, mid, and final stages of their doctoral research in four different discipline areas (Commerce, Humanities, Health Sciences and…
Computer Skills Acquisition: A Review and Future Directions for Research.
ERIC Educational Resources Information Center
Gattiker, Urs E.
A review of past research on training employees for computer-mediated work leads to the development of theory and propositions concerning the relationship between different variables, such as: (1) individual factors; (2) task and person-computer interface; (3) characteristics of training design for the acquisition of computer skills; and (4) the…
Smolinski, Tomasz G
2010-01-01
Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of computers in their lives, seem to be largely unfamiliar with how computers are being used to pursue and answer such questions. This article describes an innovative undergraduate-level course, titled Computer Literacy for Life Sciences, that aims to teach students the basics of a computerized scientific research pursuit. The purpose of the course is for students to develop a hands-on working experience in using standard computer software tools as well as computer techniques and methodologies used in life sciences research. This paper provides a detailed description of the didactical tools and assessment methods used in and outside of the classroom as well as a discussion of the lessons learned during the first installment of the course taught at Emory University in fall semester 2009.
Computational Fluid Dynamics Program at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Holst, Terry L.
1989-01-01
The Computational Fluid Dynamics (CFD) Program at NASA Ames Research Center is reviewed and discussed. The technical elements of the CFD Program are listed and briefly discussed. These elements include algorithm research, research and pilot code development, scientific visualization, advanced surface representation, volume grid generation, and numerical optimization. Next, the discipline of CFD is briefly discussed and related to other areas of research at NASA Ames including experimental fluid dynamics, computer science research, computational chemistry, and numerical aerodynamic simulation. These areas combine with CFD to form a larger area of research, which might collectively be called computational technology. The ultimate goal of computational technology research at NASA Ames is to increase the physical understanding of the world in which we live, solve problems of national importance, and increase the technical capabilities of the aerospace community. Next, the major programs at NASA Ames that either use CFD technology or perform research in CFD are listed and discussed. Briefly, this list includes turbulent/transition physics and modeling, high-speed real gas flows, interdisciplinary research, turbomachinery demonstration computations, complete aircraft aerodynamics, rotorcraft applications, powered lift flows, high alpha flows, multiple body aerodynamics, and incompressible flow applications. Some of the individual problems actively being worked on in each of these areas are listed to help define the breadth or extent of CFD involvement in each of these major programs. State-of-the-art examples of various CFD applications are presented to highlight most of these areas. The main emphasis of this portion of the presentation is on examples which will not otherwise be treated at this conference by the individual presentations. Finally, a list of principal current limitations and expected future directions is given.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas
2012-07-14
The UCoMS research cluster has spearheaded three research areas since August 2004, including wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are three-fold: (1) creating new knowledge to push forward the technology forefronts on pertinent research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by research investigators in their respective areas of expertise, working cooperatively on such topics as sensors and sensor networks, wireless communication and systems, and computational Grids, particularly as relevant to petroleum applications.
[Research Conducted at the Institute for Computer Applications in Science and Engineering
NASA Technical Reports Server (NTRS)
1997-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period 1 Oct. 1996 - 31 Mar. 1997.
Applied Computational Fluid Dynamics at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Holst, Terry L.; Kwak, Dochan (Technical Monitor)
1994-01-01
The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.
Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999
NASA Technical Reports Server (NTRS)
Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)
1999-01-01
The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.
Research Institute for Advanced Computer Science
NASA Technical Reports Server (NTRS)
Gross, Anthony R. (Technical Monitor); Leiner, Barry M.
2000-01-01
The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.
Perspectives on an education in computational biology and medicine.
Rubinstein, Jill C
2012-09-01
The mainstream application of massively parallel, high-throughput assays in biomedical research has created a demand for scientists educated in Computational Biology and Bioinformatics (CBB). In response, formalized graduate programs have rapidly evolved over the past decade. Concurrently, there is increasing need for clinicians trained to oversee the responsible translation of CBB research into clinical tools. Physician-scientists with dedicated CBB training can facilitate such translation, positioning themselves at the intersection between computational biomedical research and medicine. This perspective explores key elements of the educational path to such a position, specifically addressing: 1) evolving perceptions of the role of the computational biologist and the impact on training and career opportunities; 2) challenges in and strategies for obtaining the core skill set required of a biomedical researcher in a computational world; and 3) how the combination of CBB with medical training provides a logical foundation for a career in academic medicine and/or biomedical research.
ERIC Educational Resources Information Center
Pederson, Kathleen Marshall
The status of research on computer-assisted language learning (CALL) is explored beginning with a historical perspective of research on the language laboratory, followed by analyses of applied research on CALL. A theoretical base is provided to illustrate the need for more basic research on CALL that considers computer capabilities, learner…
ERIC Educational Resources Information Center
Mayer, Richard E.
A review of the research on techniques for increasing the novice's understanding of computers and computer programming, this paper considers the potential usefulness of five tentative recommendations pertinent to the design of computer literacy curricula: (1) provide the learner with a concrete model of the computer; (2) encourage the learner to…
High Performance Computer Cluster for Theoretical Studies of Roaming in Chemical Reactions
2016-08-30
Final report: High-performance Computer Cluster for Theoretical Studies of Roaming in Chemical Reactions. A dedicated high-performance computer cluster was… Results were reported in peer-reviewed journals. Sponsoring/monitoring agency: U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211.
NASA Astrophysics Data System (ADS)
Fauzi, Ahmad
2017-11-01
Numerical computation has many pedagogical advantages: it develops analytical and problem-solving skills, helps students learn through visualization, and enhances physics education. Unfortunately, numerical computation is not taught to undergraduate physics education students in Indonesia. Incorporating numerical computation into the undergraduate physics education curriculum presents many challenges; the main ones are a dense curriculum that makes it difficult to add a new numerical computation course, and the fact that most students have no programming experience. In this research, we used a case study to review how to integrate numerical computation into the undergraduate physics education curriculum. The participants were 54 fourth-semester students in the physics education department. We conclude that numerical computation can be integrated into the undergraduate physics education curriculum using Excel spreadsheets combined with another course. The results of this research complement studies on how to integrate numerical computation into physics learning using Excel spreadsheets.
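As an illustration of the kind of exercise such a spreadsheet-based approach supports, here is a minimal sketch (in Python rather than Excel, and not taken from the study) of Euler's method for free fall with linear drag, the same stepwise update a spreadsheet column can carry out row by row.

    # Illustrative Euler's method for dv/dt = g - k*v (free fall with drag).
    g, k, dt = 9.8, 0.1, 0.1  # gravity (m/s^2), drag coefficient (1/s), step (s)

    t, v = 0.0, 0.0
    for step in range(50):
        v += (g - k * v) * dt  # advance the velocity one Euler step
        t += dt
    print(f"v({t:.1f} s) = {v:.2f} m/s")  # approaches terminal velocity g/k = 98 m/s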
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Bojanowski, C.; Shen, J.
2012-04-09
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance computing-based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve design allowing for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory. This quarterly report documents technical progress on the project tasks for the period of October through December 2011.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Bojanowski, C.; Shen, J.
2012-06-28
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance computing-based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve design allowing for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory. This quarterly report documents technical progress on the project tasks for the period of January through March 2012.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Kulak, R.F.; Bojanowski, C.
2011-08-26
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance computing-based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water loads on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of the salt spray transport into bridge girders to address suitability of using weathering steel in bridges, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. This quarterly report documents technical progress on the project tasks for the period of April through June 2011.
Join the Center for Applied Scientific Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamblin, Todd; Bremer, Timo; Van Essen, Brian
The Center for Applied Scientific Computing serves as Livermore Lab’s window to the broader computer science, computational physics, applied mathematics, and data science research communities. In collaboration with academic, industrial, and other government laboratory partners, we conduct world-class scientific research and development on problems critical to national security. CASC applies the power of high-performance computing and the efficiency of modern computational methods to the realms of stockpile stewardship, cyber and energy security, and knowledge discovery for intelligence applications.
COMPUTATIONAL TOXICOLOGY: FRAMEWORK, PARTNERSHIPS, AND PROGRAM DEVELOPMENT
Computational toxicology is a new research initiative being developed within the Office of Research and Development (ORD) of the US Environmental Protection Agency (EPA). Operationally, it is defined as the application of mathematical and computer models together with molecular c...
Research in progress at the Institute for Computer Applications in Science and Engineering
NASA Technical Reports Server (NTRS)
1987-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1987 through October 1, 1987.
NASA Technical Reports Server (NTRS)
Moore, Robert C.
1998-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year cooperative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. RIACS is chartered to carry out research and development in computer science, devoted mainly to tasks that are strategically enabling with respect to NASA's bold missions in space exploration and aeronautics. There are three foci for this work: (1) Automated Reasoning, (2) Human-Centered Computing, and (3) High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission and Super-Resolution Surface Modeling.
NASA Astrophysics Data System (ADS)
Moore, S. L.; Kar, A.; Gomez, R.
2015-12-01
A partnership between Fort Valley State University (FVSU), the Jackson School of Geosciences at The University of Texas (UT) at Austin, and the Texas Advanced Computing Center (TACC) is engaging computational geoscience faculty and researchers with academically talented underrepresented minority (URM) students, training them to solve grand challenges. These next-generation computational geoscientists are being trained to solve some of the world's most challenging geoscience grand challenges, which require data-intensive, large-scale modeling and simulation on high-performance computers. UT Austin's geoscience outreach program GeoFORCE, recently awarded the Presidential Award for Excellence in Science, Mathematics and Engineering Mentoring, contributes to the collaborative best practices in engaging researchers with URM students. Collaborative efforts over the past decade are providing data demonstrating that integrative pipeline programs with mentoring and paid internship opportunities, multi-year scholarships, computational training, and communication skills development are having an impact on URMs developing middle skills for geoscience careers. Since 1997, the Cooperative Developmental Energy Program at FVSU and its collaborating universities have graduated 87 engineers, 33 geoscientists, and eight health physicists. Recruited as early as high school, students enroll for three years at FVSU, majoring in mathematics, chemistry, or biology, and then transfer to UT Austin or other partner institutions to complete a second STEM degree, including in the geosciences. A partnership with the Integrative Computational Education and Research Traineeship (ICERT), a National Science Foundation (NSF) Research Experience for Undergraduates (REU) Site at TACC, provides students with a 10-week summer research experience at UT Austin. Mentored by TACC researchers, students with no previous background in computational science learn to use some of the world's most powerful high-performance computing resources to address a grand geosciences problem. Students increase their ability to understand and explain the societal impact of their research and communicate the research to multidisciplinary and lay audiences via near-peer mentoring, poster presentations, and publication opportunities.
Alford, Rebecca F; Leaver-Fay, Andrew; Gonzales, Lynda; Dolan, Erin L; Gray, Jeffrey J
2017-12-01
Computational biology is an interdisciplinary field, and many computational biology research projects involve distributed teams of scientists. To accomplish their work, these teams must overcome both disciplinary and geographic barriers. Introducing new training paradigms is one way to facilitate research progress in computational biology. Here, we describe a new undergraduate program in biomolecular structure prediction and design in which students conduct research at labs located at geographically-distributed institutions while remaining connected through an online community. This 10-week summer program begins with one week of training on computational biology methods development, transitions to eight weeks of research, and culminates in one week at the Rosetta annual conference. To date, two cohorts of students have participated, tackling research topics including vaccine design, enzyme design, protein-based materials, glycoprotein modeling, crowd-sourced science, RNA processing, hydrogen bond networks, and amyloid formation. Students in the program report outcomes comparable to students who participate in similar in-person programs. These outcomes include the development of a sense of community and increases in their scientific self-efficacy, scientific identity, and science values, all predictors of continuing in a science research career. Furthermore, the program attracted students from diverse backgrounds, which demonstrates the potential of this approach to broaden the participation of young scientists from backgrounds traditionally underrepresented in computational biology.
Benefits of Exchange Between Computer Scientists and Perceptual Scientists: A Panel Discussion
NASA Technical Reports Server (NTRS)
Kaiser, Mary K.; Null, Cynthia H. (Technical Monitor)
1995-01-01
We have established several major goals for this panel: 1) Introduce the computer graphics community to some specific leaders in the use of perceptual psychology relating to computer graphics; 2) Enumerate the major results that are known, and provide a set of resources for finding others; 3) Identify research areas where knowledge of perceptual psychology can help computer system designers improve their systems; and 4) Provide advice to researchers on how they can establish collaborations in their own research programs. We believe this will be a very important panel. In addition to generating lively discussion, we hope to point out some of the fundamental issues that occur at the boundary between computer science and perception, and possibly help researchers avoid some of the common pitfalls.
Parallel Computational Fluid Dynamics: Current Status and Future Requirements
NASA Technical Reports Server (NTRS)
Simon, Horst D.; VanDalsem, William R.; Dagum, Leonardo; Kutler, Paul (Technical Monitor)
1994-01-01
One of the key objectives of the Applied Research Branch in the Numerical Aerodynamic Simulation (NAS) Systems Division at NASA Ames Research Center is the accelerated introduction of highly parallel machines into a full operational environment. In this report we discuss the performance results obtained from the implementation of some computational fluid dynamics (CFD) applications on the Connection Machine CM-2 and the Intel iPSC/860. We summarize some of the experience gained so far with the parallel testbed machines at the NAS Applied Research Branch. We then discuss the long-term computational requirements for accomplishing some of the grand challenge problems in computational aerosciences. We argue that only massively parallel machines will be able to meet these grand challenge requirements, and we outline the computer science and algorithm research challenges ahead.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. Senate Committee on Commerce, Science, and Transportation.
This committee report is intended to accompany S. 1067, a bill designed to provide for a coordinated federal research program in high-performance computing (HPC). The primary objective of the legislation is given as the acceleration of research, development, and application of the most advanced computing technology in research, education, and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deng, Z.T.
2001-11-15
The objective of this project was to conduct high-performance computing research and teaching at AAMU and to train African-American and other minority students and scientists in the computational science field for eventual employment with DOE. During the project period, eight tasks were accomplished. Student research assistantships, work-study positions, summer internships, and scholarships proved to be among the best ways to attract top-quality minority students. With DOE support, through research, summer internship, collaboration, and scholarship programs, AAMU successfully provided research and educational opportunities to minority students in fields related to computational science.
Effect size calculation in meta-analyses of psychotherapy outcome research.
Hoyt, William T; Del Re, A C
2018-05-01
Meta-analysis of psychotherapy intervention research normally examines differences between treatment groups and some form of comparison group (e.g., wait list control; alternative treatment group). The effect of treatment is normally quantified as a standardized mean difference (SMD). We describe procedures for computing unbiased estimates of the population SMD from sample data (e.g., group Ms and SDs), and provide guidance about a number of complications that may arise related to effect size computation. These complications include (a) incomplete data in research reports; (b) use of baseline data in computing SMDs and estimating the population standard deviation (σ); (c) combining effect size data from studies using different research designs; and (d) appropriate techniques for analysis of data from studies providing multiple estimates of the effect of interest (i.e., dependent effect sizes). Clinical or Methodological Significance of this article: Meta-analysis is a set of techniques for producing valid summaries of existing research. The initial computational step for meta-analyses of research on intervention outcomes involves computing an effect size quantifying the change attributable to the intervention. We discuss common issues in the computation of effect sizes and provide recommended procedures to address them.
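For illustration, here is a minimal sketch of the bias-corrected SMD (Hedges' g) computed from group means and SDs, following the standard meta-analytic formulas; the summary statistics are invented, and the code is not taken from the article.

    # Sketch of the bias-corrected standardized mean difference (Hedges' g).
    import math

    def hedges_g(m1, sd1, n1, m2, sd2, n2):
        """Bias-corrected SMD between a treatment and a comparison group."""
        # Pooled standard deviation across the two groups
        sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
        d = (m1 - m2) / sp               # Cohen's d
        j = 1 - 3 / (4 * (n1 + n2) - 9)  # small-sample correction factor J
        return j * d

    # Treatment group vs. wait-list control (made-up summary statistics)
    print(round(hedges_g(24.0, 6.0, 30, 19.0, 6.5, 28), 3))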
Ethics Regulation in Social Computing Research: Examining the Role of Institutional Review Boards.
Vitak, Jessica; Proferes, Nicholas; Shilton, Katie; Ashktorab, Zahra
2017-12-01
The parallel rise of pervasive data collection platforms and computational methods for collecting, analyzing, and drawing inferences from large quantities of user data has advanced social computing research, investigating digital traces to understand mediated behaviors of individuals, groups, and societies. At the same time, methods employed to access these data have raised questions about ethical research practices. This article provides insights into U.S. institutional review boards' (IRBs) attitudes and practices regulating social computing research. Through descriptive and inferential analysis of survey data from staff at 59 IRBs at research universities, we examine how IRBs evaluate the growing variety of studies using pervasive digital data. Findings unpack the difficulties IRB staff face evaluating increasingly technical research proposals while highlighting the belief in their ability to surmount these difficulties. They also indicate a lack of consensus among IRB staff about what should be reviewed and a willingness to work closely with researchers.
Reading and Computers: Issues for Theory and Practice. Computers and Education Series.
ERIC Educational Resources Information Center
Reinking, David, Ed.
Embodying two themes--that the computer can become an even more exciting instructional tool than it is today, and that the research necessary for developing the potential of this tool is already underway, this book explores the theoretical, research, and instructional issues concerning computers and reading. The titles of the essays and their…
ERIC Educational Resources Information Center
Ates, Alev; Altunay, Ugur; Altun, Eralp
2006-01-01
The aim of this research was to discern the effects of computer assisted English instruction on English language preparatory students' attitudes towards computers and English in a Turkish-medium high school with an intensive English program. A quasi-experimental time series research design, also called "before-after" or "repeated…
Rutkowski, Tomasz M
2015-08-01
This paper presents an applied concept of a brain-computer interface (BCI) student research laboratory (BCI-LAB) at the Life Science Center of TARA, University of Tsukuba, Japan. Several successful case studies of the student projects are reviewed together with the BCI Research Award 2014 winner case. The BCI-LAB design and project-based teaching philosophy is also explained. Future teaching and research directions summarize the review.
ERIC Educational Resources Information Center
Patterson, Janice H.; Smith, Marshall S.
This report presents a national agenda for research on the learning of thinking skills via computer technology which was developed at a National Academy of Sciences conference on educational, methodological, and practical issues involved in the use of computers to promote complex thought in grades K-12. The discussion of research topics agreed…
Computer-aided drug discovery research at a global contract research organization
NASA Astrophysics Data System (ADS)
Kitchen, Douglas B.
2017-03-01
Computer-aided drug discovery started at Albany Molecular Research, Inc. (AMRI) in 1997. Over nearly 20 years, the role of cheminformatics and computational chemistry has grown throughout the pharmaceutical industry and at AMRI. This paper describes the infrastructure and roles of CADD throughout drug discovery and some of the lessons learned regarding the success of several methods. Various contributions made by computational chemistry and cheminformatics in chemical library design, hit triage, hit-to-lead, and lead optimization are discussed. Some frequently used computational chemistry techniques are described, and the ways in which they may contribute to discovery projects are presented based on a few examples from recent publications.
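To make one of these steps concrete, the following is a minimal sketch, assuming the open-source RDKit toolkit, of a descriptor-based filter of the kind used in hit triage and library design. The SMILES strings and cutoffs are illustrative, not taken from the paper.

from rdkit import Chem
from rdkit.Chem import Descriptors

candidates = ["CCO", "CC(=O)Oc1ccccc1C(=O)O", "c1ccccc1"]  # illustrative SMILES

for smiles in candidates:
    mol = Chem.MolFromSmiles(smiles)
    mw = Descriptors.MolWt(mol)
    logp = Descriptors.MolLogP(mol)
    hbd = Descriptors.NumHDonors(mol)
    hba = Descriptors.NumHAcceptors(mol)
    # Lipinski-style rule-of-five filter often used to triage screening hits
    passes = mw <= 500 and logp <= 5 and hbd <= 5 and hba <= 10
    print(f"{smiles}: MW={mw:.1f} logP={logp:.2f} HBD={hbd} HBA={hba} pass={passes}")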
Charlebois, Kathleen; Palmour, Nicole; Knoppers, Bartha Maria
2016-01-01
This study aims to understand the influence of ethical and legal issues on cloud computing adoption in the field of genomics research. To do so, we adapted Diffusion of Innovation (DoI) theory to enable understanding of how key stakeholders manage the various ethical and legal issues they encounter when adopting cloud computing. Twenty semi-structured interviews were conducted with genomics researchers, patient advocates and cloud service providers. Thematic analysis generated five major themes: 1) Getting comfortable with cloud computing; 2) Weighing the advantages and the risks of cloud computing; 3) Reconciling cloud computing with data privacy; 4) Maintaining trust; and 5) Anticipating the cloud by creating the conditions for cloud adoption. Our analysis highlights the tendency among genomics researchers to adopt cloud technology gradually. Efforts made by cloud service providers to promote cloud computing adoption are confronted by researchers' perpetual cost and security concerns, along with a lack of familiarity with the technology. Further underlying those fears is researchers' legal responsibility with respect to the data stored on the cloud. Alternative consent mechanisms aimed at increasing patients' control over the use of their data also provide a means to circumvent various institutional and jurisdictional hurdles that restrict access by creating siloed databases. However, the risk of creating new, cloud-based silos may run counter to the goal in genomics research of increasing data sharing on a global scale.
NASA Technical Reports Server (NTRS)
1994-01-01
CESDIS, the Center of Excellence in Space Data and Information Sciences, was developed jointly by NASA, the Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. Research areas of primary interest at CESDIS include: 1) high-performance computing, especially software design and performance evaluation for massively parallel machines; 2) parallel input/output and data storage systems for high-performance parallel computers; 3) database and intelligent data management systems for parallel computers; 4) image processing; 5) digital libraries; and 6) data compression. CESDIS funds multiyear projects at U.S. universities and colleges. Proposals are accepted in response to calls for proposals and are selected on the basis of peer reviews. Funds are provided to support faculty and graduate students working at their home institutions. Project personnel visit Goddard during academic recess periods to attend workshops, present seminars, and collaborate with NASA scientists on research projects. Additionally, CESDIS takes on shorter-duration computer science research tasks requested by NASA Goddard scientists.
Research on Student Thought Processes during Computer-Based Instruction.
ERIC Educational Resources Information Center
Clark, Richard E.
1984-01-01
Reviews cognitive research related to computer-based instruction in the areas of motivation; the relationship between computer-assisted instruction and learning; learner control; transfer of learning; hemispheric dominance; and anxiety. Design professionals are urged to consider cognitive views. (MBR)
Computer Supported Cooperative Work in Information Search and Retrieval.
ERIC Educational Resources Information Center
Twidale, Michael B.; Nichols, David M.
1998-01-01
Considers how research in collaborative technologies can inform research and development in library and information science. Topics include computer supported collaborative work; shared drawing; collaborative writing; MUDs; MOOs; workflow; World Wide Web; collaborative learning; computer mediated communication; ethnography; evaluation; remote…
Center for Advanced Computational Technology
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
2000-01-01
The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.
NASA Technical Reports Server (NTRS)
Moore, Robert C.
1998-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year cooperative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. Research is carried out by a staff of full-time scientists, augmented by visitors, students, postdoctoral candidates, and visiting university faculty. RIACS is chartered to carry out research and development in computer science, devoted in the main to tasks that are strategically enabling with respect to NASA's bold mission in space exploration and aeronautics. There are three foci for this work: Automated Reasoning, Human-Centered Computing, and High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission and Super-Resolution Surface Modeling.
1976-03-01
Research in Functionally Distributed Computer Systems Development. Kansas State University; P. S. Fisher and F. Maryanski; Virgil Wallentine, Principal Investigator; contract DAAG29-76-G-0108; report CS-76-08. Prepared for the U.S. Army Computer Systems Command, Ft. Belvoir, VA. Approved for public release; distribution unlimited.
Network and computing infrastructure for scientific applications in Georgia
NASA Astrophysics Data System (ADS)
Kvatadze, R.; Modebadze, Z.
2016-09-01
The status of the network and computing infrastructure and the services available to the research and education community of Georgia is presented. The Research and Educational Networking Association (GRENA) provides the following network services: Internet connectivity, network services, cyber security, technical support, etc. Computing resources used by the research teams are located at GRENA and at major state universities. The GE-01-GRENA site is included in the European Grid Infrastructure. The paper also contains information about the programs of the Learning Center and the research and development projects in which GRENA participates.
COMPUTER-AIDED DATA ACQUISITION FOR COMBUSTION EXPERIMENTS
The article describes the use of computer-aided data acquisition techniques to aid the research program of the Combustion Research Branch (CRB) of the U.S. EPA's Air and Energy Engineering Research Laboratory (AEERL) in Research Triangle Park, NC, in particular on CRB's bench-sca...
LaRC local area networks to support distributed computing
NASA Technical Reports Server (NTRS)
Riddle, E. P.
1984-01-01
The Langley Research Center's (LaRC) Local Area Network (LAN) effort is discussed. LaRC initiated the development of a LAN to support a growing distributed computing environment at the Center. The purpose of the network is to provide an improved capability (over interactive and RJE terminal access) for sharing multivendor computer resources. Specifically, the network will provide a data highway for the transfer of files between mainframe computers, minicomputers, workstations, and personal computers. An important influence on the overall network design was the vital need of LaRC researchers to efficiently utilize the large CDC mainframe computers in the central scientific computing facility. Although LaRC migrated steadily from a centralized to a distributed computing environment in recent years, the workload on the central resources increased. Major emphasis in the network design was on communication with the central resources within the distributed environment. The network to be implemented will allow researchers to utilize the central resources, distributed minicomputers, workstations, and personal computers to obtain the proper level of computing power to efficiently perform their jobs.
ERIC Educational Resources Information Center
Lawlor, Joseph, Ed.
Suggestions for integrating computer technology and composition instruction are presented in four conference papers, summaries of four conference courseware demonstrations, a paper describing computer-based evaluation of textual responses, and a reactor's address. In an overview of the current state of computer-based composition instruction,…
Summary of research in applied mathematics, numerical analysis, and computer sciences
NASA Technical Reports Server (NTRS)
1986-01-01
The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.
ERIC Educational Resources Information Center
Gecer, Aynur
2013-01-01
The aim of this research is to determine the computer self-efficacy perception of second grade primary school students and their opinions regarding computer ownership through metaphors. The research applied the scanning model and was conducted during the 2011-2012 academic year among seven primary schools of the Ministry of National Education in…
ERIC Educational Resources Information Center
Knoop, Patricia A.
The purpose of this report was to determine the research areas that appear most critical to achieving man-computer symbiosis. An operational definition of man-computer symbiosis was developed by: (1) reviewing and summarizing what others have said about it, and (2) attempting to distinguish it from other types of man-computer relationships. From…
Key Lessons in Building "Data Commons": The Open Science Data Cloud Ecosystem
NASA Astrophysics Data System (ADS)
Patterson, M.; Grossman, R.; Heath, A.; Murphy, M.; Wells, W.
2015-12-01
Cloud computing technology has created a shift around data and data analysis by allowing researchers to push computation to data as opposed to having to pull data to an individual researcher's computer. As a result, cloud-based resources can provide unique opportunities to capture the computing environments used both to access raw data in its original form and to create the analysis products that may be the source of data for tables and figures presented in research publications. Since 2008, the Open Cloud Consortium (OCC) has operated the Open Science Data Cloud (OSDC), which provides scientific researchers with computational resources for storing, sharing, and analyzing large (terabyte- and petabyte-scale) scientific datasets. The OSDC has provided compute and storage services to over 750 researchers in a wide variety of data-intensive disciplines. Recently, internal users have logged about 2 million core hours each month. The OSDC also serves the research community by colocating these resources with access to nearly a petabyte of public scientific datasets in a variety of fields, also accessible for download by the public. In our experience operating these resources, researchers are well served by "data commons," meaning cyberinfrastructure that colocates data archives, computing, and storage infrastructure and supports essential tools and services for working with scientific data. In addition to the OSDC public data commons, the OCC operates a data commons in collaboration with NASA and is developing a data commons for NOAA datasets. As cloud-based infrastructures for distributing and computing over data become more pervasive, we ask, "What does it mean to publish data in a data commons?" Here we present the OSDC perspective and discuss several services that are key in architecting data commons, including digital identifier services.
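As one illustration of what an identifier service must provide, here is a minimal sketch, assuming nothing about the OSDC's actual implementation, of a content-addressed identifier: the same bytes always resolve to the same ID, which supports verifiable data publication. The prefix and file name are hypothetical.

import hashlib

def content_id(path: str, prefix: str = "demo-commons") -> str:
    """Return a persistent identifier derived from the SHA-256 of a file."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # stream 1 MiB chunks
            h.update(chunk)
    return f"{prefix}:{h.hexdigest()}"

# Hypothetical usage:
# print(content_id("ocean_temps_2015.nc"))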
QUARTERLY TECHNICAL PROGRESS REPORT, JULY, AUGUST, SEPTEMBER 1967.
Contents: Circuit research program; Hardware systems research; Computer system software research; Illinois pattern recognition computer; ILLIAC II service, use, and program development; IBM 7094/1401 service, use, and program development; Problem specifications; General laboratory information.
The challenges of developing computational physics: the case of South Africa
NASA Astrophysics Data System (ADS)
Salagaram, T.; Chetty, N.
2013-08-01
Most modern scientific research problems are complex and interdisciplinary in nature. It is impossible to study such problems in detail without the use of computation in addition to theory and experiment. Although it is widely agreed that students should be introduced to computational methods at the undergraduate level, it remains a challenge to do this within a full traditional undergraduate curriculum. In this paper, we report on a survey that we conducted of undergraduate physics curricula in South Africa to determine the content and the approach taken in the teaching of computational physics. We also considered the pedagogy of computational physics at the postgraduate and research levels at various South African universities, research facilities and institutions. We conclude that the state of computational physics training in South Africa, especially at the undergraduate teaching level, is generally weak and needs to be given more attention at all universities. Failure to do so will impact negatively on the country's capacity to grow its endeavours in the computational sciences, with negative consequences for research, commerce, and industry.
NASA Technical Reports Server (NTRS)
1987-01-01
The Research Institute for Advanced Computer Science (RIACS) was established at the NASA Ames Research Center in June of 1983. RIACS is privately operated by the Universities Space Research Association (USRA), a consortium of 64 universities with graduate programs in the aerospace sciences, under several Cooperative Agreements with NASA. RIACS's goal is to provide preeminent leadership in basic and applied computer science research as partners in support of NASA's goals and missions. In pursuit of this goal, RIACS contributes to several of the grand challenges in science and engineering facing NASA: flying an airplane inside a computer; determining the chemical properties of materials under hostile conditions in the atmospheres of earth and the planets; sending intelligent machines on unmanned space missions; creating a one-world network that makes all scientific resources, including those in space, accessible to all the world's scientists; providing intelligent computational support to all stages of the process of scientific investigation from problem formulation to results dissemination; and developing accurate global models for climatic behavior throughout the world. In working with these challenges, we seek novel architectures, and novel ways to use them, that exploit the potential of parallel and distributed computation and make possible new functions that are beyond the current reach of computing machines. The investigation includes pattern computers as well as the more familiar numeric and symbolic computers, and it includes networked systems of resources distributed around the world. We believe that successful computer science research is interdisciplinary: it is driven by (and drives) important problems in other disciplines. We believe that research should be guided by a clear long-term vision with planned milestones. And we believe that our environment must foster and exploit innovation. Our activities and accomplishments for the calendar year 1987 and our plans for 1988 are reported.
Software Reuse Methods to Improve Technological Infrastructure for e-Science
NASA Technical Reports Server (NTRS)
Marshall, James J.; Downs, Robert R.; Mattmann, Chris A.
2011-01-01
Social computing has the potential to contribute to scientific research. Ongoing developments in information and communications technology improve capabilities for enabling scientific research, including research fostered by social computing capabilities. The recent emergence of e-Science practices has demonstrated the benefits from improvements in the technological infrastructure, or cyber-infrastructure, that has been developed to support science. Cloud computing is one example of this e-Science trend. Our own work in the area of software reuse offers methods that can be used to improve new technological development, including cloud computing capabilities, to support scientific research practices. In this paper, we focus on software reuse and its potential to contribute to the development and evaluation of information systems and related services designed to support new capabilities for conducting scientific research.
Making Cloud Computing Available For Researchers and Innovators (Invited)
NASA Astrophysics Data System (ADS)
Winsor, R.
2010-12-01
High Performance Computing (HPC) facilities exist in most academic institutions but are almost invariably over-subscribed. Access is allocated based on academic merit, the only practical method of assigning valuable finite compute resources. Cloud computing on the other hand, and particularly commercial clouds, draw flexibly on an almost limitless resource as long as the user has sufficient funds to pay the bill. How can the commercial cloud model be applied to scientific computing? Is there a case to be made for a publicly available research cloud and how would it be structured? This talk will explore these themes and describe how Cybera, a not-for-profit non-governmental organization in Alberta Canada, aims to leverage its high speed research and education network to provide cloud computing facilities for a much wider user base.
Institute for scientific computing research;fiscal year 1999 annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keyes, D
2000-03-28
Large-scale scientific computation, and all of the disciplines that support it and help to validate it, have been placed at the focus of Lawrence Livermore National Laboratory by the Accelerated Strategic Computing Initiative (ASCI). The Laboratory operates the computer with the highest peak performance in the world and has undertaken some of the largest and most compute-intensive simulations ever performed. Computers at the architectural extremes, however, are notoriously difficult to use efficiently. Even such successes as the Laboratory's two Bell Prizes awarded in November 1999 only emphasize the need for much better ways of interacting with the results of large-scale simulations. Advances in scientific computing research have, therefore, never been more vital to the core missions of the Laboratory than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, the Laboratory must engage researchers at many academic centers of excellence. In FY 1999, the Institute for Scientific Computing Research (ISCR) expanded the Laboratory's bridge to the academic community in the form of collaborative subcontracts, visiting faculty, student internships, a workshop, and a very active seminar series. ISCR research participants are integrated almost seamlessly with the Laboratory's Center for Applied Scientific Computing (CASC), which, in turn, addresses computational challenges arising throughout the Laboratory. Administratively, the ISCR flourishes under the Laboratory's University Relations Program (URP). Together with the other four Institutes of the URP, it must navigate a course that allows the Laboratory to benefit from academic exchanges while preserving national security. Although FY 1999 brought more than its share of challenges to the operation of an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and well worth the continued effort. A change of administration for the ISCR occurred during FY 1999. Acting Director John Fitzgerald retired from LLNL in August after 35 years of service, including the last two at the helm of the ISCR. David Keyes, who has been a regular visitor in conjunction with ASCI scalable algorithms research since October 1997, overlapped with John for three months and serves half-time as the new Acting Director.
Challenges and opportunities of cloud computing for atmospheric sciences
NASA Astrophysics Data System (ADS)
Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.
2016-04-01
Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is that a project no longer depends on access to a large local cyberinfrastructure. Cloud computing can avoid maintenance expenses for large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections. Both problems are closely related: uncertainty can usually be reduced when computational resources are available to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of the application of cloud computing resources to climate modeling, using cloud infrastructures from three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, both from the point of view of operational use and research.
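The cost/wall-time tradeoff analyzed in the paper can be illustrated with a back-of-envelope sketch; all prices and runtimes below are assumptions for illustration, not the paper's measurements. Renting more instances shortens an ensemble of climate runs while the total instance-hours billed stay roughly constant (ignoring parallel inefficiency).

ensemble_members = 50
hours_per_member = 12.0          # single-instance runtime (assumed)
price_per_instance_hour = 0.90   # USD, illustrative on-demand rate

for instances in (1, 10, 50):
    batches = -(-ensemble_members // instances)   # ceiling division
    wall_time = batches * hours_per_member        # members run in parallel batches
    cost = ensemble_members * hours_per_member * price_per_instance_hour
    print(f"{instances:3d} instances: {wall_time:6.1f} h wall time, ${cost:,.2f}")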
[Surgical treatment of recurrent urethral strictures in males after unsuccessful operations].
Simon, V; Vacík, J; Michálek, J; Novák, V; Lopour, P; Spunda, M
2000-12-06
Subvesical obstructions of any origin represent a frequent and serious disorder occurring predominantly in males. They often bring incontinence, erectile dysfunction, and urinary tract infection. Relapses of acute pyelonephritis can progress to chronic tubulointerstitial pyelonephritis and terminate in renal insufficiency. To treat strictures, dilation, intermittent catheterization and, more recently, stent introduction have been used. A stent made from composite polymers appears most suitable. The aim of our work was to test the properties of stents developed in the Institute of Macromolecular Chemistry ASCR. Stents from composite polymers are non-toxic and non-irritating, can swell in body fluids, and have mechanical properties similar to those of silicone rubber. The properties of the material are functionally graded, and casting or repoussé work can subsequently change its shape. Ten patients (males, aged 25 to 78 years) with long urethral strictures, 50% of them in the bulbocavernous part, were treated with this method. Strictures were caused by pelvic fractures (4 cases), prostate hypertrophy surgery (4 cases), and prolonged catheterization (2 cases). All patients were followed for 16 to 26 months and had no serious complications. Our results indicate that stents from composite polymers and silicone may have long-acting effects without irritation or crust formation and beneficially affected the healing of the spongio-fibrous process.
Light distribution in diffractive multifocal optics and its optimization.
Portney, Valdemar
2011-11-01
To expand a geometrical model of diffraction efficiency and its interpretation to the multifocal optic, and to introduce formulas for the analysis of far and near light distribution, their application to multifocal intraocular lenses (IOLs), and diffraction efficiency optimization. Medical device consulting firm, Newport Coast, California, USA. Experimental study. The application of a geometrical model to the kinoform (a single-focus diffractive optical element) was expanded to a multifocal optic to produce analytical definitions of the light split between far and near images and the light loss to other diffraction orders. The geometrical model gave a simple interpretation of the light split in a diffractive multifocal IOL. An analytical definition of the light split between far, near, and light loss was introduced as curve-fitting formulas. Several examples of application to common multifocal diffractive IOLs were developed, for example, the change of light split with wavelength. The analytical definition of diffraction efficiency may assist in the optimization of multifocal diffractive optics to minimize light loss. Formulas for the analysis of light split between different foci of multifocal diffractive IOLs are useful in interpreting the dependence of diffraction efficiency on physical characteristics, such as the blaze heights of the diffractive grooves and the wavelength of light, as well as in optimizing multifocal diffractive optics. Disclosure is found in the footnotes. Copyright © 2011 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
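For context, the textbook scalar-diffraction light split in a bifocal diffractive optic can be sketched as follows; this is the standard kinoform efficiency formula, not the author's specific model. A blaze phase height of alpha waves sends a fraction sinc^2(alpha - m) of the light into diffraction order m, with order 0 taken as the far focus and order 1 as the near focus.

import math

def sinc(x: float) -> float:
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def order_efficiency(alpha: float, m: int) -> float:
    """Fraction of light in diffraction order m for blaze height alpha (waves)."""
    return sinc(alpha - m) ** 2

alpha = 0.5  # half-wave blaze: the classic symmetric far/near bifocal design
far = order_efficiency(alpha, 0)
near = order_efficiency(alpha, 1)
print(f"far {far:.3f}, near {near:.3f}, lost to other orders {1 - far - near:.3f}")
# Prints roughly 0.405 / 0.405 / 0.19. Because alpha scales inversely with
# wavelength, the far/near split shifts with wavelength, as the paper discusses.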
Multidimensional, fully implicit, exactly conserving electromagnetic particle-in-cell simulations
NASA Astrophysics Data System (ADS)
Chacon, Luis
2015-09-01
We discuss a new, conservative, fully implicit 2D-3V particle-in-cell algorithm for non-radiative, electromagnetic kinetic plasma simulations, based on the Vlasov-Darwin model. Unlike earlier linearly implicit PIC schemes and standard explicit PIC schemes, fully implicit PIC algorithms are unconditionally stable and allow exact discrete energy and charge conservation. This has been demonstrated in 1D electrostatic and electromagnetic contexts. In this study, we build on these recent algorithms to develop an implicit, orbit-averaged, time-space-centered finite difference scheme for the Darwin field and particle orbit equations for multiple species in multiple dimensions. The Vlasov-Darwin model is very attractive for PIC simulations because it avoids radiative noise issues in non-radiative electromagnetic regimes. The algorithm conserves global energy, local charge, and particle canonical momentum exactly, even with grid packing. The nonlinear iteration is effectively accelerated with a fluid preconditioner, which allows efficient use of large timesteps, $O(\sqrt{m_i/m_e}\,c/v_{Te})$ larger than the explicit CFL limit. In this presentation, we will introduce the main algorithmic components of the approach and demonstrate the accuracy and efficiency properties of the algorithm with various numerical experiments in 1D and 2D. Supported by the LANL LDRD program and the DOE-SC ASCR office.
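The essence of the time-space-centered implicit update can be illustrated with a drastically simplified sketch: one particle in a prescribed 1D electrostatic field, advanced with a Crank-Nicolson step solved by Picard iteration. The field E(x) = -k*x and all parameters are assumptions for illustration; the actual algorithm couples many particles to the Darwin field equations and accelerates the nonlinear iteration with a fluid preconditioner, which plain Picard iteration does not capture.

def implicit_push(x, v, dt, qm=1.0, k=1.0, tol=1e-12, max_iter=200):
    """Time-centered (Crank-Nicolson) particle update via Picard iteration."""
    E = lambda pos: -k * pos
    x_new, v_new = x, v                      # initial guess: previous step
    for _ in range(max_iter):
        x_half = 0.5 * (x + x_new)           # space-centered position
        v_half = 0.5 * (v + v_new)           # time-centered velocity
        x_next = x + dt * v_half
        v_next = v + dt * qm * E(x_half)
        if abs(x_next - x_new) + abs(v_next - v_new) < tol:
            break
        x_new, v_new = x_next, v_next
    return x_next, v_next

# For this linear field the scheme conserves the energy 0.5*v**2 + 0.5*k*x**2
# to round-off, at time steps where an explicit forward-Euler push would blow up.
# (Plain Picard iteration limits dt; production codes use preconditioned
# Newton-Krylov to reach the much larger time steps quoted above.)
x, v = 1.0, 0.0
for _ in range(1000):
    x, v = implicit_push(x, v, dt=1.0)
print(0.5 * v**2 + 0.5 * x**2)               # stays ≈ 0.5, the initial energy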
Ultra-widefield retinal imaging through a black intraocular lens.
Yusuf, Imran H; Fung, Timothy H M; Patel, Chetan K
2015-09-01
To evaluate the feasibility of ultra-widefield retinal imaging in patients with near infrared (IR)-transmitting black intraocular lenses (IOLs). Oxford Eye Hospital, Oxford, United Kingdom. Laboratory evaluation of a diagnostic technology with interventional case report. The field of retinal imaging through a Morcher poly(methyl methacrylate) (PMMA) black IOL was determined in a purpose-built adult schematic model eye with the HRA2 Spectralis confocal scanning laser ophthalmoscope using standard imaging, Staurenghi retina lens-assisted imaging, and ultra-widefield noncontact imaging. Retinal imaging using each modality was then performed on a patient implanted with another Morcher PMMA black IOL model. Ultra-widefield noncontact imaging and lens-assisted imaging captured up to 150 degrees of field (versus 40 degrees with a standard confocal scanning laser ophthalmoscope). Ultra-widefield retinal images were successfully acquired in a patient eye with a black IOL. This study has identified the first ultra-widefield retinal imaging modalities for patients with near IR-transmitting black IOLs. Should larger studies confirm this finding, noncontact ultra-widefield confocal scanning laser ophthalmoscopy might be considered the gold standard imaging technique for retinal surveillance in patients with near IR-transmitting black IOLs. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
DOE Centers of Excellence Performance Portability Meeting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neely, J. R.
2016-04-21
Performance portability is a phrase often used, but not well understood. The DOE is deploying systems at all of the major facilities across ASCR and ASC that are forcing application developers to confront head-on the challenges of running applications across these diverse systems. With GPU-based systems at the OLCF and LLNL, and Phi-based systems landing at NERSC, ACES (LANL/SNL), and the ALCF, the issue of performance portability is confronting the DOE mission like never before. A new best practice in the DOE is to include "Centers of Excellence" with each major procurement, with a goal of focusing efforts on preparing key applications to be ready for the systems coming to each site, and engaging the vendors directly in a "shared fate" approach to ensuring success. While each COE is necessarily focused on a particular deployment, applications almost invariably must be able to run effectively across the entire DOE HPC ecosystem. This tension between optimizing performance for a particular platform, while still being able to run with acceptable performance wherever the resources are available, is the crux of the challenge we call "performance portability". This meeting was an opportunity to bring application developers, software providers, and vendors together to discuss this challenge and begin to chart a path forward.
NASA Astrophysics Data System (ADS)
Hut, R. W.; van de Giesen, N. C.; Drost, N.
2017-05-01
The suggestions by Hutton et al. might not be enough to guarantee reproducible computational hydrology: archiving software code and research data alone will not suffice. We add to the suggestions of Hutton et al. that hydrologists should not only document their (computer) work, but also use the latest best practices in designing research software, most notably containers and open interfaces. To make sure hydrologists know of these best practices, we urge close collaboration with Research Software Engineers (RSEs).
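To make "open interfaces" concrete, here is a minimal sketch loosely inspired by community conventions such as the CSDMS Basic Model Interface (BMI); the class, method names, and toy bucket model are illustrative, not a real standard's API. Exposing initialize/update/get_value lets other tools, and containerized workflows, drive a model without knowing its internals.

class BucketModel:
    """Toy linear-reservoir rainfall-runoff model behind a BMI-like interface."""

    def initialize(self, storage: float = 0.0, k: float = 0.1) -> None:
        self.storage = storage    # water stored in the bucket [mm]
        self.k = k                # outflow rate constant [1/day]
        self.discharge = 0.0      # outflow of the current step [mm/day]

    def update(self, rainfall: float) -> None:
        """Advance the model one day with the given rainfall [mm/day]."""
        self.storage += rainfall
        self.discharge = self.k * self.storage
        self.storage -= self.discharge

    def get_value(self, name: str) -> float:
        return {"storage": self.storage, "discharge": self.discharge}[name]

model = BucketModel()
model.initialize()
for rain in [10.0, 0.0, 5.0, 0.0]:
    model.update(rain)
    print(model.get_value("discharge"))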
Digital optical computers at the optoelectronic computing systems center
NASA Technical Reports Server (NTRS)
Jordan, Harry F.
1991-01-01
The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.
An Introduction to Research and the Computer: A Self-Instructional Package.
ERIC Educational Resources Information Center
Vasu, Ellen Storey; Palmer, Richard I.
This self-instructional package includes learning objectives, definitions, exercises, and feedback for learning some basic concepts and skills involved in using computers for analyzing data and understanding basic research terminology. Learning activities are divided into four sections: research and research hypotheses; variables, cases, and…
Update to Computational Aspects of Nitrogen-Rich HEDMs
2016-04-01
ARL-TR-7656, April 2016, US Army Research Laboratory. Update to "Computational Aspects of Nitrogen-Rich HEDMs" by Betsy M Rice, Edward FC Byrd, and William D Mattson, Weapons and Materials Research Directorate.
Argonne's Magellan Cloud Computing Research Project
Beckman, Pete
2017-12-11
Pete Beckman, head of Argonne's Leadership Computing Facility (ALCF), discusses the Department of Energy's new $32-million Magellan project, which is designed to test how cloud computing can be used for scientific research. More information: http://www.anl.gov/Media_Center/News/2009/news091014a.html
Computational Nanotechnology Molecular Electronics, Materials and Machines
NASA Technical Reports Server (NTRS)
Srivastava, Deepak; Biegel, Bryan A. (Technical Monitor)
2002-01-01
This presentation covers research being performed on computational nanotechnology, carbon nanotubes, and fullerenes at the NASA Ames Research Center. Topics covered include: nanomechanics of nanomaterials, nanotubes and composite materials, molecular electronics with nanotube junctions, kinky chemistry, and nanotechnology for solid-state quantum computers using fullerenes.
Understanding Emergency Care Delivery Through Computer Simulation Modeling.
Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L
2018-02-01
In 2017, Academic Emergency Medicine convened a consensus conference entitled "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. The article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and examples relevant to emergency care. The article also introduces available software modeling platforms, discusses how to explore their use for research, and presents a research agenda for computer simulation modeling. Through this article, our goal is to enhance the adoption of computer simulation, a set of methods that hold great promise in addressing challenges in emergency care organization and design. © 2017 by the Society for Academic Emergency Medicine.
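Of the four approaches named above, discrete-event simulation is perhaps the most commonly applied to emergency department flow. Here is a minimal sketch of a toy ED modeled as an M/M/c queue (Poisson arrivals, exponential treatment times, c beds); all rates are assumptions for illustration.

import heapq, random

def simulate_ed(arrival_rate=4.0, service_rate=1.0, beds=5, horizon=1000.0):
    """Return the mean wait (hours) for a bed in a toy M/M/c emergency dept."""
    random.seed(1)
    events = [(random.expovariate(arrival_rate), "arrival")]
    queue, busy = [], 0
    waits = []
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            # schedule the next arrival, then seize a bed or join the queue
            heapq.heappush(events, (t + random.expovariate(arrival_rate), "arrival"))
            if busy < beds:
                busy += 1
                waits.append(0.0)
                heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
            else:
                queue.append(t)
        else:  # a departure frees a bed
            if queue:
                waits.append(t - queue.pop(0))
                heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
            else:
                busy -= 1
    return sum(waits) / len(waits)

print(f"mean wait for a bed: {simulate_ed():.2f} h")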
NASA Technical Reports Server (NTRS)
Weed, Richard Allen; Sankar, L. N.
1994-01-01
An increasing amount of research activity in computational fluid dynamics has been devoted to the development of efficient algorithms for parallel computing systems. The increasing performance-to-price ratio of engineering workstations has motivated research on procedures for implementing a parallel computing system composed of distributed workstations. This thesis proposal outlines an ongoing research program to develop efficient strategies for performing three-dimensional flow analysis on distributed computing systems. The PVM parallel programming interface was used to modify an existing three-dimensional flow solver, the TEAM code developed by Lockheed for the Air Force, to function as a parallel flow solver on clusters of workstations. Steady flow solutions were generated for three different wing and body geometries to validate the code and evaluate code performance. The proposed research will extend the parallel code development to determine the most efficient strategies for unsteady flow simulations.
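The message-passing pattern behind such a distributed solver can be sketched as follows. The original work used PVM; this sketch uses the modern mpi4py equivalent, and the grid sizes, update rule, and file name are assumptions for illustration. Each worker owns a slab of the grid and exchanges one-cell "halo" boundaries with its neighbors every iteration.

# Run with, e.g.: mpiexec -n 4 python halo_demo.py   (file name hypothetical)
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each rank owns 10 interior cells plus one ghost cell on each side
u = np.full(12, float(rank))
left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for _ in range(5):  # a few mock solver iterations
    # send rightmost interior cell right, receive left ghost from the left
    comm.Sendrecv(u[10:11], dest=right, recvbuf=u[0:1], source=left)
    # send leftmost interior cell left, receive right ghost from the right
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[11:12], source=right)
    u[1:11] = 0.5 * (u[0:10] + u[2:12])  # toy smoothing update on interior cells

print(rank, u[1:11].mean())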
ERIC Educational Resources Information Center
Heslin, J. Alexander, Jr.
In senior-level undergraduate research courses in Computer Information Systems (CIS), students are required to read and assimilate a large volume of current research literature. One course objective is to demonstrate to the student that there are patterns or models or paradigms of research. A new approach in identifying research paradigms is…
Cloud computing for genomic data analysis and collaboration.
Langmead, Ben; Nellore, Abhinav
2018-04-01
Next-generation sequencing has made major strides in the past decade. Studies based on large sequencing data sets are growing in number, and public archives for raw sequencing data have been doubling in size every 18 months. Leveraging these data requires researchers to use large-scale computational resources. Cloud computing, a model whereby users rent computers and storage from large data centres, is a solution that is gaining traction in genomics research. Here, we describe how cloud computing is used in genomics for research and large-scale collaborations, and argue that its elasticity, reproducibility and privacy features make it ideally suited for the large-scale reanalysis of publicly available archived data, including privacy-protected data.
Computational chemistry in pharmaceutical research: at the crossroads.
Bajorath, Jürgen
2012-01-01
Computational approaches are an integral part of pharmaceutical research. However, there are many unsolved key questions that limit scientific progress in the still-evolving computational field and its impact on drug discovery. Importantly, a number of these questions are not new but date back many years; hence, it might be difficult to answer them conclusively in the foreseeable future. Moreover, the computational field as a whole is characterized by a high degree of heterogeneity, and so, unfortunately, is the quality of its scientific output. In light of this situation, it is proposed that changes in scientific standards and culture should be seriously considered now in order to lay a foundation for future progress in computational research.
NASA Astrophysics Data System (ADS)
Landgrebe, Anton J.
1987-03-01
An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.
The TeraShake Computational Platform for Large-Scale Earthquake Simulations
NASA Astrophysics Data System (ADS)
Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas
Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes at ever larger scales. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component of the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform, which provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the parallel performance of the TS-AWP on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
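The numerical kernel at the heart of such wave propagation codes can be illustrated with a minimal sketch: an explicit finite-difference update of the 1D acoustic wave equation u_tt = c^2 u_xx. The production TS-AWP solves 3D anelastic wave propagation; the grid sizes and parameters here are assumptions for illustration only.

import numpy as np

nx, c, dx = 400, 1.0, 1.0
dt = 0.5 * dx / c                      # satisfies the CFL stability condition
r2 = (c * dt / dx) ** 2

u_prev = np.zeros(nx)
u = np.zeros(nx)
u[nx // 2] = 1.0                       # initial pulse at the domain center

for _ in range(300):
    u_next = np.zeros(nx)
    # second-order centered differences in both space and time (leapfrog)
    u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                    + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
    u_prev, u = u, u_next              # fixed (reflecting) ends of the domain

print(f"peak amplitude after 300 steps: {u.max():.3f}")
# Parallel versions decompose the grid across processors and exchange halo
# layers each step, which is what enables the strong scaling reported above.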
NASA Technical Reports Server (NTRS)
Oliger, Joseph
1993-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing, Advanced Methods for Scientific Computing, High Performance Networks and Technology, and Learning Systems. Parallel compiler techniques, adaptive numerical methods for flows in complicated geometries, and optimization were identified as important problems to investigate for ARC's involvement in the Computational Grand Challenges of the next decade.
NASA Technical Reports Server (NTRS)
Oliger, Joseph
1992-01-01
The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a post doctoral program, and a student visitor program. Not only does this provide appropriate expertise but it also introduces scientists outside of NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing; Advanced Methods for Scientific Computing; Learning Systems; High Performance Networks and Technology; Graphics, Visualization, and Virtual Environments.
WE-B-BRD-01: Innovation in Radiation Therapy Planning II: Cloud Computing in RT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, K; Kagadis, G; Xing, L
As defined by the National Institute of Standards and Technology, cloud computing is "a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction." Despite the omnipresent role of computers in radiotherapy, cloud computing has yet to achieve widespread adoption in clinical or research applications, though the transition to such "on-demand" access is underway. As this transition proceeds, new opportunities for aggregate studies and efficient use of computational resources are set against new challenges in patient privacy protection, data integrity, and management of clinical informatics systems. In this Session, current and future applications of cloud computing and distributed computational resources will be discussed in the context of medical imaging, radiotherapy research, and clinical radiation oncology applications. Learning Objectives: 1. Understand basic concepts of cloud computing. 2. Understand how cloud computing could be used for medical imaging applications. 3. Understand how cloud computing could be employed for radiotherapy research. 4. Understand how clinical radiotherapy software applications would function in the cloud.
CSM research: Methods and application studies
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.
1989-01-01
Computational mechanics is that discipline of applied science and engineering devoted to the study of physical phenomena by means of computational methods based on mathematical modeling and simulation, utilizing digital computers. The discipline combines theoretical and applied mechanics, approximation theory, numerical analysis, and computer science. Computational mechanics has had a major impact on engineering analysis and design. When applied to structural mechanics, the discipline is referred to herein as computational structural mechanics. Complex structures being considered by NASA for the 1990's include composite primary aircraft structures and the space station. These structures will be much more difficult to analyze than today's structures and necessitate a major upgrade in computerized structural analysis technology. NASA has initiated a research activity in structural analysis called Computational Structural Mechanics (CSM). The broad objective of the CSM activity is to develop advanced structural analysis technology that will exploit modern and emerging computers, such as those with vector and/or parallel processing capabilities. Here, the current research directions for the Methods and Application Studies Team of the Langley CSM activity are described.
Computer Program for Steady Transonic Flow over Thin Airfoils by Finite Elements
1975-10-01
The report text is garbled in extraction; the recoverable front matter states that the report was prepared by personnel in the Computational Mechanics Section of Lockheed Missiles & Space Company, Inc., Huntsville Research & Engineering Center, Huntsville, Alabama.
A Heterogeneous High-Performance System for Computational and Computer Science
2016-11-15
The report text is garbled in extraction; the recoverable fragments describe a team of research faculty from the departments of computer science and natural science at Bowie State University, and a supercomputer suited both to research on accelerated HPC systems and to the research conducted by faculty in the Department of Natural Science.
Harry Mergler with His Modified Differential Analyzer
1951-06-21
Harry Mergler stands at the control board of a differential analyzer in the new Instrument Research Laboratory at the National Advisory Committee for Aeronautics (NACA) Lewis Flight Propulsion Laboratory. The differential analyzer was a multi-variable analog computation machine devised in 1931 by Massachusetts Institute of Technology researcher and future NACA Committee member Vannevar Bush. The mechanical device could solve computations up to the sixth order, but had to be rewired before each new computation. Mergler modified Bush’s differential analyzer in the late 1940s to calculate droplet trajectories for Lewis’ icing research program. In four days Mergler’s machine could calculate what previously required weeks. NACA Lewis built the Instrument Research Laboratory in 1950 and 1951 to house the large analog computer equipment. The two-story structure also provided offices for the Mechanical Computational Analysis, and Flow Physics sections of the Physics Division. The division had previously operated from the lab’s hangar because of its icing research and flight operations activities. Mergler joined the Instrument Research Section of the Physics Division in 1948 after earning an undergraduate degree in Physics from the Case Institute of Technology. Mergler’s focus was on the synthesis of analog computers with the machine tools used to create compressor and turbine blades for jet engines.
Human computer confluence applied in healthcare and rehabilitation.
Viaud-Delmon, Isabelle; Gaggioli, Andrea; Ferscha, Alois; Dunne, Stephen
2012-01-01
Human computer confluence (HCC) is an ambitious research program studying how the emerging symbiotic relation between humans and computing devices can enable radically new forms of sensing, perception, interaction, and understanding. It is an interdisciplinary field, bringing together researchers from areas as varied as pervasive computing, bio-signal processing, neuroscience, electronics, robotics, and virtual and augmented reality, and it offers remarkable potential for applications in medicine and rehabilitation.
[Animal experimentation, computer simulation and surgical research].
Carpentier, Alain
2009-11-01
We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During the research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.
NASA Technical Reports Server (NTRS)
Ortega, J. M.
1986-01-01
Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian elimination on parallel computers; (3) three-dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete Cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.
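As a concrete illustration of topic (6), the sketch below implements a plain Jacobi iteration for linear systems; this is a generic textbook example, not code from the reported research, but its component-wise update shows why such methods map naturally onto parallel and vector machines.

```python
# Illustrative Jacobi iteration for solving Ax = b (a generic example;
# the matrix, vector, and iteration count below are arbitrary).
import numpy as np

def jacobi(A, b, iterations=100):
    x = np.zeros_like(b, dtype=float)
    D = np.diag(A)             # diagonal entries of A
    R = A - np.diagflat(D)     # off-diagonal remainder
    for _ in range(iterations):
        # Every component of x is updated independently from the
        # previous iterate, which is what parallelizes so well.
        x = (b - R @ x) / D
    return x

A = np.array([[4.0, 1.0], [2.0, 5.0]])  # diagonally dominant, so Jacobi converges
b = np.array([1.0, 2.0])
print(jacobi(A, b))  # approaches the exact solution [1/6, 1/3]
```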
French Plans for Fifth Generation Computer Systems.
1984-12-07
The record text is garbled in extraction; the recoverable fragments describe a centrally managed French national project covering all facets of electronics, computers, software, and services, positioned as the French response to Japan's Fifth Generation Project, with cooperative research through the Centre National de la Recherche Scientifique (CNRS) and French Ministry of Research national projects on man-computer interaction, novel computer structures, and knowledge-based computer systems.
ERIC Educational Resources Information Center
Estepa, A.; And Others
1992-01-01
The recording of the interaction between pupil and computer is one of the data sources frequently used in research on the use of computers in teaching. Describes the analysis methodology of these recordings to determine the use of computers in statistics and its adaptation to other research work on the use of computers in education. (Author/MDH)
A Brief Analysis of Development Situations and Trend of Cloud Computing
NASA Astrophysics Data System (ADS)
Yang, Wenyan
2017-12-01
In recent years, the rapid development of Internet technology has radically changed people's work, learning, and lifestyles. More and more activities are completed by means of computers and networks, and the amount of information and data generated grows day by day, so that the computing power of individual computers fails to meet demands for accuracy and speed. Cloud computing technology has experienced rapid development and is widely applied in the computer industry thanks to its advantages of high precision, fast computation, and ease of use; it has become a focus of information research at present. In this paper, the development status and trends of cloud computing are analyzed and discussed.
ERIC Educational Resources Information Center
Harsh, Matthew; Bal, Ravtosh; Wetmore, Jameson; Zachary, G. Pascal; Holden, Kerry
2018-01-01
The emergence of vibrant research communities of computer scientists in Kenya and Uganda has occurred in the context of neoliberal privatization, commercialization, and transnational capital flows from donors and corporations. We explore how this funding environment configures research culture and research practices, which are conceptualized as…
NASA Technical Reports Server (NTRS)
Smith, Paul H.
1988-01-01
The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.
NASA Astrophysics Data System (ADS)
Serugendo, Giovanna Di Marzo; Risoldi, Matteo; Solemayni, Mohammad
The following sections are included: * Introduction * Problem and Research Questions * State of the Art * TSC Structure and Computational Awareness * Methodology and Research Directions * Case Study: Democracy * Conclusions
Switching from Computer to Microcomputer Architecture Education
ERIC Educational Resources Information Center
Bolanakis, Dimosthenis E.; Kotsis, Konstantinos T.; Laopoulos, Theodore
2010-01-01
In recent decades, the technological and scientific evolution of the computing discipline has broadly affected research in software engineering education, which nowadays advocates more enlightened and liberal ideas. This article reviews cross-disciplinary research on a computer architecture class in consideration of its switching to…
Computer Use in Research Exercises: Some Suggested Procedures for Undergraduate Political Science.
ERIC Educational Resources Information Center
Comer, John
1979-01-01
Describes some procedures designed to assist instructors in developing a research component using the computer. Benefits include development of research skills, kindling student interest in the field of political science, and recruitment potential. (Author/CK)
Design & implementation of distributed spatial computing node based on WPS
NASA Astrophysics Data System (ADS)
Liu, Liping; Li, Guoqing; Xie, Jibo
2014-03-01
Currently, research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in the grid environment, while the importance of spatial computing resources is ignored. In order to implement the sharing and cooperation of spatial computing resources in the grid environment, this paper systematically researches the key technologies for constructing a Spatial Computing Node based on the WPS (Web Processing Service) specification of the OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is designed according to the features of spatial computing resources. Finally, a prototype Spatial Computing Node is implemented and the relevant verification work in this environment is completed.
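WPS is an open OGC standard, so a Spatial Computing Node of this kind can be exercised with a plain HTTP request. The sketch below builds a WPS 1.0.0 Execute request using key-value-pair encoding; the endpoint URL, process identifier, and inputs are hypothetical placeholders, not details from the paper.

```python
# Minimal sketch of invoking an OGC WPS 1.0.0 process over HTTP (KVP
# encoding). The endpoint and process name below are placeholders.
import urllib.parse
import urllib.request

WPS_ENDPOINT = "http://example.org/wps"  # hypothetical node endpoint

def wps_execute(identifier: str, inputs: dict) -> bytes:
    """Send a WPS Execute request; returns the XML ExecuteResponse."""
    data_inputs = ";".join(
        f"{name}={urllib.parse.quote(str(value))}" for name, value in inputs.items()
    )
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": identifier,
        "datainputs": data_inputs,
    }
    url = WPS_ENDPOINT + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as response:
        return response.read()

# Hypothetical usage: ask the node to buffer a geometry by 100 meters.
# xml = wps_execute("geo:Buffer", {"distance": 100})
```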
The Many Colors and Shapes of Cloud
NASA Astrophysics Data System (ADS)
Yeh, James T.
While many enterprises and business entities are deploying and exploiting Cloud Computing, academic institutes and researchers are also busy trying to wrestle this beast and put a leash on this possible paradigm-changing computing model. Many have argued that Cloud Computing is nothing more than a renaming of Utility Computing. Others have argued that Cloud Computing is a revolutionary change in computing architecture. So it has been difficult to draw a boundary around what is in Cloud Computing and what is not. I assert that it is equally difficult to find a group of people who would agree even on the definition of Cloud Computing. In actuality, maybe all those arguments are unnecessary, as clouds have many shapes and colors. In this presentation, the speaker will attempt to illustrate that the shape and the color of the cloud depend very much on the business goals one intends to achieve. It will be a very rich territory both for businesses to take advantage of the benefits of Cloud Computing and for academia to integrate technology research and business research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Kulak, R.F.; Bojanowski, C.
2011-12-09
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project in August 2010 to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance computing based analysis capabilities. The project was established under a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges (superstructure, deck, cables, and substructure, including soil), primarily during storms and flood events, and the risk of structural failure that these loads pose. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. This quarterly report documents technical progress on the project tasks for the period of July through September 2011.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arkin, Adam; Bader, David C.; Coffey, Richard
Understanding the fundamentals of genomic systems and the processes governing impactful weather patterns are examples of the types of simulation and modeling performed on the most advanced computing resources in America. High-performance computing and computational science together provide a necessary platform for the mission science conducted by the Biological and Environmental Research (BER) office at the U.S. Department of Energy (DOE). This report reviews BER's computing needs and their importance for solving some of the toughest problems in BER's portfolio. BER's impact on science has been transformative. Mapping the human genome, including the U.S.-supported international Human Genome Project that DOE began in 1987, initiated the era of modern biotechnology and genomics-based systems biology. And since the 1950s, BER has been a core contributor to atmospheric, environmental, and climate science research, beginning with atmospheric circulation studies that were the forerunners of modern Earth system models (ESMs) and by pioneering the implementation of climate codes onto high-performance computers. See http://exascaleage.org/ber/ for more information.
Translational bioinformatics in the cloud: an affordable alternative
2010-01-01
With the continued exponential expansion of publicly available genomic data and access to low-cost, high-throughput molecular technologies for profiling patient populations, computational technologies and informatics are becoming vital considerations in genomic medicine. Although cloud computing technology is being heralded as a key enabling technology for the future of genomic research, available case studies are limited to applications in the domain of high-throughput sequence data analysis. The goal of this study was to evaluate the computational and economic characteristics of cloud computing in performing a large-scale data integration and analysis representative of research problems in genomic medicine. We find that the cloud-based analysis compares favorably in both performance and cost in comparison to a local computational cluster, suggesting that cloud computing technologies might be a viable resource for facilitating large-scale translational research in genomic medicine. PMID:20691073
An overview of computer-based natural language processing
NASA Technical Reports Server (NTRS)
Gevarter, W. B.
1983-01-01
Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines in natural language (like English, Japanese, or German, in contrast to formal computer languages). The doors that such an achievement can open have made this a major research area in Artificial Intelligence and Computational Linguistics. Commercial natural language interfaces to computers have recently entered the market, and the future looks bright for other applications as well. This report reviews the basic approaches to such systems, the techniques utilized, applications, the state of the art of the technology, issues and research requirements, the major participants, and finally, future trends and expectations. It is anticipated that this report will prove useful to engineering and research managers, potential users, and others who will be affected by this field as it unfolds.
Woo, E H C; White, P; Lai, C W K
2016-03-01
This paper presents an overview of global ergonomics standards and guidelines for the design of computer workstations, with particular focus on their inconsistency and the associated health risks. Overall, considerable disagreement was found in the design specifications of computer workstations globally, particularly between the results of previous ergonomics research and the recommendations of current ergonomics standards and guidelines. To cope with the rapid advancement of computer technology, this article provides justifications and suggestions for modifications to the current ergonomics standards and guidelines for the design of computer workstations. Practitioner Summary: A research gap exists in ergonomics standards and guidelines for computer workstations. We explore the validity and generalisability of ergonomics recommendations by comparing previous ergonomics research with the recommendations and outcomes of current ergonomics standards and guidelines.
I Use the Computer to ADVANCE Advances in Comprehension-Strategy Research.
ERIC Educational Resources Information Center
Blohm, Paul J.
Merging the instructional implications drawn from theory and research in the interactive reading model, schemata, and metacognition with computer based instruction seems a natural approach for actively involving students' participation in reading and learning from text. Computer based graphic organizers guide students' preview or review of lengthy…
Abstract: Researchers at the EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The intent...
Computer-Based Interaction Analysis with DEGREE Revisited
ERIC Educational Resources Information Center
Barros, B.; Verdejo, M. F.
2016-01-01
We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…
Eyetracking Methodology in SCMC: A Tool for Empowering Learning and Teaching
ERIC Educational Resources Information Center
Stickler, Ursula; Shi, Lijing
2017-01-01
Computer-assisted language learning, or CALL, is an interdisciplinary area of research, positioned between science and social science, computing and education, linguistics and applied linguistics. This paper argues that by appropriating methods originating in some areas of CALL-related research, for example human-computer interaction (HCI) or…
NASA Technical Reports Server (NTRS)
Bushnell, Dennis M. (Technical Monitor)
2000-01-01
This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, computer science, fluid mechanics, and structures and materials during the period October 1, 1999 through March 31, 2000.
Computer Use within a Play-Based Early Years Curriculum
ERIC Educational Resources Information Center
Howard, Justine; Miles, Gareth E.; Rees-Davies, Laura
2012-01-01
Early years curricula promote learning through play and in addition emphasise the development of computer literacy. Previous research, however, has described that teachers feel unprepared to integrate Information and Communication Technology (ICT) and play. Also, whereas research has suggested that effective computer use in the early years is…
Researchers at EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The goal of this researc...
Software design studies emphasizing Project LOGOS
NASA Technical Reports Server (NTRS)
1972-01-01
The results of a research project on the development of computer software are presented. Research funds of $200,000 were expended over a three-year period for software design and projects in connection with Project LOGOS (computer-aided design and certification of computing systems). Abstracts of theses prepared during the project are provided.
NASA Technical Reports Server (NTRS)
1992-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, fluid mechanics including fluid dynamics, acoustics, and combustion, aerodynamics, and computer science during the period 1 Apr. 1992 - 30 Sep. 1992 is summarized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snyder, L.; Notkin, D.; Adams, L.
1990-03-31
This task relates to research on programming massively parallel computers. Previous work on the Ensemble concept of programming was extended, and an investigation into nonshared-memory models of parallel computation was undertaken. The earlier work on the Ensemble concept defined a set of programming abstractions and was used to organize the programming task into three distinct levels: composition of machine instructions, composition of processes, and composition of phases. It was applied to shared-memory models of computation. During the present research period, these concepts were extended to nonshared-memory models; one Ph.D. thesis was completed, and one book chapter and six conference papers were published.
Multiscale Computation. Needs and Opportunities for BER Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scheibe, Timothy D.; Smith, Jeremy C.
2015-01-01
The Environmental Molecular Sciences Laboratory (EMSL), a scientific user facility managed by Pacific Northwest National Laboratory for the U.S. Department of Energy, Office of Biological and Environmental Research (BER), conducted a one-day workshop on August 26, 2014, on the topic of “Multiscale Computation: Needs and Opportunities for BER Science.” Twenty invited participants, from various computational disciplines within the BER program research areas, were charged with the following objectives: identify BER-relevant models and their potential cross-scale linkages that could be exploited to better connect molecular-scale research to BER research at larger scales; and identify critical science directions that will motivate EMSL decisions regarding future computational (hardware and software) architectures.
Visual Environments for CFD Research
NASA Technical Reports Server (NTRS)
Watson, Val; George, Michael W. (Technical Monitor)
1994-01-01
This viewgraph presentation gives an overview of the visual environments for computational fluid dynamics (CFD) research. It includes details on critical needs from the future computer environment, features needed to attain this environment, prospects for changes in and the impact of the visualization revolution on the human-computer interface, human processing capabilities, limits of personal environment and the extension of that environment with computers. Information is given on the need for more 'visual' thinking (including instances of visual thinking), an evaluation of the alternate approaches for and levels of interactive computer graphics, a visual analysis of computational fluid dynamics, and an analysis of visualization software.
Computational Omics Pre-Awardees | Office of Cancer Clinical Proteomics Research
The National Cancer Institute's Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce the pre-awardees of the Computational Omics solicitation. Working with NVIDIA Foundation's Compute the Cure initiative and Leidos Biomedical Research Inc., the NCI, through this solicitation, seeks to leverage computational efforts to provide tools for the mining and interpretation of large-scale publicly available ‘omics’ datasets.
ERIC Educational Resources Information Center
Office of Science and Technology Policy, Washington, DC.
This report presents the United States research and development program for 1993 for high performance computing and computer communications (HPCC) networks. The first of four chapters presents the program goals and an overview of the federal government's emphasis on high performance computing as an important factor in the nation's scientific and…
Robust Quantum Computing using Molecules with Switchable Dipole
2010-06-15
The report text is garbled in extraction; the recoverable fragments include the opening of the abstract (“Of the many systems studied to…”), the subject terms (ultracold polar molecules, quantum computing, phase gates), and the sponsoring research office address (P.O. Box 12211, Research Triangle Park, NC 27709-2211).
Ubiquitous computing in the military environment
NASA Astrophysics Data System (ADS)
Scholtz, Jean
2001-08-01
Increasingly, people work and live on the move. To support this mobile lifestyle, especially as our work becomes more intensely information-based, companies are producing various portable and embedded information devices. The late Mark Weiser coined the term 'ubiquitous computing' to describe an environment where computers have disappeared and are integrated into physical objects. Much industry research today is concerned with ubiquitous computing in the work and home environments. A ubiquitous computing environment would facilitate mobility by allowing information users to easily access and use information anytime, anywhere. As war fighters are inherently mobile, the question is what effect a ubiquitous computing environment would have on current military operations and doctrine. And, if ubiquitous computing is viewed as beneficial for the military, what research would be necessary to achieve a military ubiquitous computing environment? What is a vision for the use of mobile information access in a battle space? Are there different requirements for civilian and military users of this technology? What are those differences? Are there opportunities for research that will support both worlds? What types of research have been supported by the military, and what areas need to be investigated? Although we do not yet have all the answers to these questions, this paper discusses the issues and presents the work we are doing to address them.
omniClassifier: a Desktop Grid Computing System for Big Data Prediction Modeling
Phan, John H.; Kothari, Sonal; Wang, May D.
2016-01-01
Robust prediction models are important for numerous science, engineering, and biomedical applications. However, best-practice procedures for optimizing prediction models can be computationally complex, especially when choosing models from among hundreds or thousands of parameter choices. Computational complexity has further increased with the growth of data in these fields, concurrent with the era of “Big Data”. Grid computing is a potential solution to the computational challenges of Big Data. Desktop grid computing, which uses idle CPU cycles of commodity desktop machines, coupled with commercial cloud computing resources, can enable research labs to gain easier and more cost-effective access to vast computing resources. We have developed omniClassifier, a multi-purpose prediction modeling application that provides researchers with a tool for conducting machine learning research within the guidelines of recommended best practices. omniClassifier is implemented as a desktop grid computing system using the Berkeley Open Infrastructure for Network Computing (BOINC) middleware. In addition to describing implementation details, we use various gene expression datasets to demonstrate the potential scalability of omniClassifier for efficient and robust Big Data prediction modeling. A prototype of omniClassifier can be accessed at http://omniclassifier.bme.gatech.edu/. PMID:27532062
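To make concrete why this model-selection workload distributes so naturally over a desktop grid, the sketch below scores each parameter choice as an independent cross-validation job; it is an illustration using scikit-learn on synthetic data, not the authors' BOINC implementation, and the parameter grid is arbitrary.

```python
# Each parameter combination is an independent work unit, which is
# exactly the structure a desktop grid such as BOINC can farm out.
from itertools import product
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=50, random_state=0)

# Even a small grid multiplies out to many candidate configurations.
grid = list(product([0.01, 0.1, 1.0, 10.0, 100.0], ["linear", "rbf"]))

def evaluate(c, kernel):
    """One independent work unit: 5-fold CV score for one setting."""
    return cross_val_score(SVC(C=c, kernel=kernel), X, y, cv=5).mean()

# On a grid, each call below would run on an idle desktop CPU.
scores = {(c, k): evaluate(c, k) for c, k in grid}
best = max(scores, key=scores.get)
print("best parameters:", best, "mean accuracy:", round(scores[best], 3))
```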
Computational analyses in cognitive neuroscience: in defense of biological implausibility.
Dror, I E; Gallogly, D P
1999-06-01
Because cognitive neuroscience researchers attempt to understand the human mind by bridging behavior and brain, they expect computational analyses to be biologically plausible. In this paper, biologically implausible computational analyses are shown to have critical and essential roles in the various stages and domains of cognitive neuroscience research. Specifically, biologically implausible computational analyses can contribute to (1) understanding and characterizing the problem that is being studied, (2) examining the availability of information and its representation, and (3) evaluating and understanding the neuronal solution. In the context of the distinct types of contributions made by certain computational analyses, the biological plausibility of those analyses is altogether irrelevant. These biologically implausible models are nevertheless relevant and important for biologically driven research.
Xie, Tianwu; Zaidi, Habib
2016-01-01
The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models, with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of the categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.
Educational Technology Research Journals: Computers & Education, 2002-2011
ERIC Educational Resources Information Center
Rackham, David D.; Hyatt, Frederick R.; Macfarlane, David C.; Nisse, Tony; Woodfield, Wendy; West, Richard E.
2013-01-01
In this study, the authors examined the journal "Computers & Education" to discover research trends in the articles published during 2002-2011. Research articles were analyzed to determine trends in the research methods and types of articles published, as well as the key topics published, top authors, and some of the most-cited…
Children and Computers: New Technology--Old Concerns.
ERIC Educational Resources Information Center
Wartella, Ellen A.; Jennings, Nancy
2000-01-01
Places current research on children and computers in historical context with earlier research on the mass media, noting recurrent patterns in 20th century media research. Concludes that to inform and sustain the creation of more quality content for children, further research is needed on the effects of media on children, with new partnerships…
Some research advances in computer graphics that will enhance applications to engineering design
NASA Technical Reports Server (NTRS)
Allan, J. J., III
1975-01-01
Research in man/machine interactions and graphics hardware/software that will enhance applications to engineering design was described. Research aspects of executive systems, command languages, and networking used in the computer applications laboratory are mentioned. Finally, a few areas where little or no research is being done were identified.
Fiction as an Introduction to Computer Science Research
ERIC Educational Resources Information Center
Goldsmith, Judy; Mattei, Nicholas
2014-01-01
The undergraduate computer science curriculum is generally focused on skills and tools; most students are not exposed to much research in the field, and do not learn how to navigate the research literature. We describe how fiction reviews (and specifically science fiction) are used as a gateway to research reviews. Students learn a little about…
NASA Astrophysics Data System (ADS)
Pour Yousefian Barfeh, Davood; Ebron, Jonalyn G.; Pabico, Jaderick P.
2018-02-01
In this study, the researchers examine the essence of insertion sort and propose a sorter in membrane computing. The research shows how a theoretical computing device such as membrane computing can realize a basic operation such as sorting. To this end, the researchers introduce a conditional reproduction rule such that each membrane can reproduce another membrane having the same structure as the original membrane. They use the functionality of a comparator P system as a basis, in which two multisets are compared and then stored in two adjacent membranes. Finally, they present the process of sorting as a collection of transactions implemented in four levels, where each level has different steps.
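For reference, the classical insertion sort that the proposed membrane-computing sorter emulates is sketched below; this is the standard textbook algorithm, included only to make concrete the pairwise comparison that the comparator P system models.

```python
# Classical insertion sort (the operation the P-system sorter realizes
# with membranes and multisets rather than an array).
def insertion_sort(items: list) -> list:
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        # Shift larger elements right until key's slot is found; each
        # comparison corresponds to a comparator step in the P system.
        while j >= 0 and items[j] > key:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = key
    return items

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```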
Krell, Kristina; Laser, Kai Thorsten; Dalla-Pozza, Robert; Winkler, Christian; Hildebrandt, Ursula; Kececioglu, Deniz; Breuer, Johannes; Herberg, Ulrike
2018-03-28
Real-time three-dimensional echocardiography (RT3DE) is a promising method for accurate assessment of left ventricular (LV) volumes and function; however, pediatric reference values are scarce. The aim of the study was to establish pediatric percentiles in a large population and to compare the inherent influence of different evaluation software on the resulting measurements. In a multicenter prospective-design study, 497 healthy children (ages 1 day to 219 months) underwent RT3DE imaging of the LV (iE33, Philips, Andover, MA). Volume analysis was performed using QLab 9.0 (Philips) and TomTec 4DLV2.7 (vendor-independent; testing high (TomTec75) and low (TomTec30) contour-finding activity). Reference percentiles were computed using Cole's LMS method. In 22 subjects, cardiovascular magnetic resonance imaging (CMR) was used as the reference. A total of 370/497 (74.4%) of the subjects provided adequate data sets. LV volumes had a significant association with age, body size, and gender; therefore, sex-specific percentiles were indexed to body surface area. Intra- and interobserver variability for both workstations was good (relative bias ± SD for end-diastolic volume [EDV] in %: intraobserver: QLab = -0.8 ± 2.4; TomTec30 = -0.7 ± 7.2; TomTec75 = -1.9 ± 6.7; interobserver: QLab = 2.4 ± 7.5; TomTec30 = 1.2 ± 5.1; TomTec75 = 1.3 ± 4.5). Intervendor agreement between QLab and TomTec30 showed larger bias and wider limits of agreement (bias: QLab vs TomTec30: end-systolic volume [ESV] = 0.8% ± 23.6%; EDV = -2.2% ± 17.0%), with notable individual differences in small children. QLab and TomTec underestimated CMR values, with the highest agreement between CMR and QLab. RT3DE allows reproducible noninvasive assessment of LV volumes and function. However, intertechnique variability is relevant. Therefore, our software-specific percentiles, based on a large pediatric population, serve as a reference for both commonly used quantification programs.
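For reference, Cole's LMS method used above summarizes an age-dependent reference distribution with three smooth curves: skewness L(t), median M(t), and coefficient of variation S(t). In its standard formulation, a measurement X at age t converts to a z-score (and thence a percentile) as:

```latex
Z = \frac{\left(X / M(t)\right)^{L(t)} - 1}{L(t)\,S(t)} \quad (L(t) \neq 0),
\qquad
Z = \frac{\ln\left(X / M(t)\right)}{S(t)} \quad (L(t) = 0).
```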
Knowledge and utilization of computer-software for statistics among Nigerian dentists.
Chukwuneke, F N; Anyanechi, C E; Obiakor, A O; Amobi, O; Onyejiaka, N; Alamba, I
2013-01-01
The use of computer software for statistical analysis has transformed health information and data into their simplest form in the areas of access, storage, retrieval, and analysis in the field of research. This survey was therefore carried out to assess the level of knowledge and utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. Questionnaires on the use of computer software for statistical analysis were randomly distributed to 65 practicing dental surgeons with more than 5 years of experience in the tertiary academic hospitals in eastern Nigeria. The focus was on: years of clinical experience; research work experience; and knowledge and application of computer software for data processing and statistical analysis. Sixty-two (62/65; 95.4%) of these questionnaires were returned anonymously and used in our data analysis. Twenty-nine (29/62; 46.8%) respondents had 5-10 years of clinical experience, none of whom had completed the specialist training programme. Practitioners with more than 10 years of clinical experience numbered 33 (33/62; 53.2%), of whom 15 (15/33; 45.5%) are specialists, representing 24.2% (15/62) of the total number of respondents. All 15 specialists are actively involved in research activities, and only five (5/15; 33.3%) can use statistical analysis software unaided. This study has identified poor utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. This is strongly associated with a lack of early exposure to the use of such software, especially during undergraduate training. This calls for the introduction of a computer training programme into the dental curriculum to enable practitioners to develop the habit of using computer software for their research.
Buse, Kathleen; Hill, Catherine; Benson, Kathleen
2017-01-01
While there is an extensive body of research on gender equity in engineering and computing, there have been few efforts to glean insight from a dialog among experts. To encourage collaboration and to develop a shared vision of the future research agenda, a 2 day workshop of 50 scholars who work on the topic of gender in engineering and computing was held at a rural conference center. The structure of the conference and the location allowed for time to reflect, dialog, and to craft an innovative research agenda aimed at increasing the representation of women in engineering and computing. This paper has been written by the conference organizers and details the ideas and recommendations from the scholars. The result is an innovative, collaborative approach to future research that focuses on identifying effective interventions. The new approach includes the creation of partnerships with stakeholders including businesses, government agencies, non-profits and academic institutions to allow a broader voice in setting research priorities. Researchers recommend incorporating multiple disciplines and methodologies, while expanding the use of data analytics, merging and mining existing databases and creating new datasets. The future research agenda is detailed and includes studies focused on socio-cultural interventions particularly on career choice, within undergraduate and graduate programs, and for women in professional careers. The outcome is a vision for future research that can be shared with researchers, practitioners and other stakeholders that will lead to gender equity in the engineering and computing professions. PMID:28469591
ERIC Educational Resources Information Center
Zheng, Lanqin; Huang, Ronghuai; Yu, Junhui
2014-01-01
This study aims to identify the emerging research trends in the field of computer-supported collaborative learning (CSCL) so as to provide researchers and educators with insights into research topics and issues for further exploration. This paper analyzed the research topics, methods, and technology adoption of CSCL from 2003 to 2012. A total of 706…
UC Merced Center for Computational Biology Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colvin, Michael; Watanabe, Masakatsu
Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of undergraduate and graduate program in the biological sciences, one that emphasized biological concepts and treated biology as an information science, would have a dramatic impact in enabling the transformation of biology. UC Merced, the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy: to create a new biological sciences major and graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computational approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate, and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs made possible by the CCB from its inception until August 2010, the end of the final extension. Although DOE support for the center ended in August 2010, the CCB will continue to exist and support its original objectives. The research and academic programs fostered by the CCB have led to additional extramural funding from other agencies, and we anticipate that CCB will continue to support the quantitative and computational biology program at UC Merced for many years to come. Since its inception in fall 2004, CCB research projects have involved continuous multi-institutional collaboration with Lawrence Livermore National Laboratory (LLNL) and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, as well as individual collaborators at other sites. CCB-affiliated faculty cover a broad range of computational and mathematical research, including molecular modeling, cell biology, applied mathematics, evolutionary biology, and bioinformatics. The CCB sponsored the first distinguished speaker series at UC Merced, which played an important role in spreading the word about the computational biology emphasis at this new campus. One of CCB's original goals was to help train a new generation of biologists who bridge the gap between the computational and life sciences. To achieve this goal, by summer 2006 a summer undergraduate internship program had been established under CCB to train biological science researchers in highly mathematical and computationally intensive methods.
By the end of summer 2010, 44 undergraduate students had gone through this program. Of those participants, 11 students have been admitted to graduate schools and 10 more are interested in pursuing graduate studies in the sciences. The center is also continuing to facilitate the development and dissemination of undergraduate and graduate course materials based on the latest research in computational biology.
Where next for the reproducibility agenda in computational biology?
Lewis, Joanna; Breeze, Charles E; Charlesworth, Jane; Maclaren, Oliver J; Cooper, Jonathan
2016-07-15
The concept of reproducibility is a foundation of the scientific method. With the arrival of fast and powerful computers over the last few decades, there has been an explosion of results based on complex computational analyses and simulations. The reproducibility of these results has been addressed mainly in terms of exact replicability or numerical equivalence, ignoring the wider issue of the reproducibility of conclusions through equivalent, extended or alternative methods. We use case studies from our own research experience to illustrate how concepts of reproducibility might be applied in computational biology. Several fields have developed 'minimum information' checklists to support the full reporting of computational simulations, analyses and results, and standardised data formats and model description languages can facilitate the use of multiple systems to address the same research question. We note the importance of defining the key features of a result to be reproduced, and the expected agreement between original and subsequent results. Dynamic, updatable tools for publishing methods and results are becoming increasingly common, but sometimes come at the cost of clear communication. In general, the reproducibility of computational research is improving but would benefit from additional resources and incentives. We conclude with a series of linked recommendations for improving reproducibility in computational biology through communication, policy, education and research practice. More reproducible research will lead to higher quality conclusions, deeper understanding and more valuable knowledge.
Applications of computer-graphics animation for motion-perception research
NASA Technical Reports Server (NTRS)
Proffitt, D. R.; Kaiser, M. K.
1986-01-01
The advantages and limitations of using computer-animated stimuli in studying motion perception are presented and discussed. Most current programs of motion-perception research could not be pursued without the use of computer-graphics animation. Computer-generated displays afford latitudes of freedom and control that are almost impossible to attain through conventional methods. There are, however, limitations to this presentational medium. At present, computer-generated displays present simplified approximations of the dynamics in natural events. Very little is known about how the differences between natural events and computer simulations influence perceptual processing. In practice, the differences are assumed to be irrelevant to the questions under study, and findings with computer-generated stimuli are assumed to generalize to natural events.
Test Anxiety, Computer-Adaptive Testing and the Common Core
ERIC Educational Resources Information Center
Colwell, Nicole Makas
2013-01-01
This paper highlights the current findings and issues regarding the role of computer-adaptive testing in test anxiety. The computer-adaptive test (CAT) proposed by one of the Common Core consortia brings these issues to the forefront. Research has long indicated that test anxiety impairs student performance. More recent research indicates that…
Learning Oceanography from a Computer Simulation Compared with Direct Experience at Sea
ERIC Educational Resources Information Center
Winn, William; Stahr, Frederick; Sarason, Christian; Fruland, Ruth; Oppenheimer, Peter; Lee, Yen-Ling
2006-01-01
Considerable research has compared how students learn science from computer simulations with how they learn from "traditional" classes. Little research has compared how students learn science from computer simulations with how they learn from direct experience in the real environment on which the simulations are based. This study compared two…
Proposed Directions for Research in Computer-Based Education.
ERIC Educational Resources Information Center
Waugh, Michael L.
Several directions for potential research efforts in the field of computer-based education (CBE) are discussed. (For the purposes of this paper, CBE is defined as any use of computers to promote learning with no intended inference as to the specific nature or organization of the educational application under discussion.) Efforts should be directed…
Children Learning from Artfully Designed, Three-Dimensional Computer Animation
ERIC Educational Resources Information Center
Ju, Yoomi Choi; Cifuentes, Lauren
2002-01-01
An artfully designed, 3-D computer-generated video story was created to demonstrate the mixing of primary colors to obtain secondary colors. Two research questions were explored in this research: Do artfully designed 3-D computer-generated video stories enhance learning or are such entertaining works a distraction from learning? And, do children…
ERIC Educational Resources Information Center
Benda, Klara; Bruckman, Amy; Guzdial, Mark
2012-01-01
We present the results of an interview study investigating student experiences in two online introductory computer science courses. Our theoretical approach is situated at the intersection of two research traditions: "distance and adult education research," which tends to be sociologically oriented, and "computer science education…
A Meta-Analysis of Effectiveness Studies on Computer Technology-Supported Language Learning
ERIC Educational Resources Information Center
Grgurovic, Maja; Chapelle, Carol A.; Shelley, Mack C.
2013-01-01
With the aim of summarizing years of research comparing pedagogies for second/foreign language teaching supported with computer technology and pedagogy not-supported by computer technology, a meta-analysis was conducted of empirical research investigating language outcomes. Thirty-seven studies yielding 52 effect sizes were included, following a…
ERIC Educational Resources Information Center
Kumi, Richard; Reychav, Iris; Sabherwal, Rajiv
2016-01-01
Many educational institutions are integrating mobile-computing technologies (MCT) into the classroom to improve learning outcomes. There is also a growing interest in research to understand how MCT influence learning outcomes. The diversity of results in prior research indicates that computer-mediated learning has different effects on various…
ERIC Educational Resources Information Center
Mills, Steven C.; Ragan, Tillman J.
This paper examines a research paradigm that is particularly suited to experimentation-related computer-based instruction and integrated learning systems. The main assumption of the model is that one of the most powerful capabilities of computer-based instruction, and specifically of integrated learning systems, is the capacity to adapt…
The Data Collector: A Qualitative Research Tool.
ERIC Educational Resources Information Center
Handler, Marianne G.; Turner, Sandra V.
Computer software that is intended to assist the qualitative researcher in the analysis of textual data is relatively new. One such program, the Data Collector, is a HyperCard computer program designed for use on the Macintosh computer. A tool for organizing and analyzing textual data obtained from observations, interviews, surveys, and other…
The Role of Context-Related Parameters in Adults' Mental Computational Acts
ERIC Educational Resources Information Center
Naresh, Nirmala; Presmeg, Norma
2012-01-01
Researchers who have carried out studies pertaining to mental computation and everyday mathematics point out that adults and children reason intuitively based upon experiences within specific contexts; they use invented strategies of their own to solve real-life problems. We draw upon research areas of mental computation and everyday mathematics…
48 CFR 227.7104 - Contracts under the Small Business Innovation Research (SBIR) Program.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Data and Computer Software—Small Business Innovation Research (SBIR) Program, when technical data or computer software will be generated during performance of contracts under the SBIR program. (b) Under the clause at 252.227-7018, the Government obtains SBIR data rights in technical data and computer software...
48 CFR 227.7104 - Contracts under the Small Business Innovation Research (SBIR) Program.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Data and Computer Software—Small Business Innovation Research (SBIR) Program, when technical data or computer software will be generated during performance of contracts under the SBIR program. (b) Under the clause at 252.227-7018, the Government obtains SBIR data rights in technical data and computer software...
How to Teach Residue Number System to Computer Scientists and Engineers
ERIC Educational Resources Information Center
Navi, K.; Molahosseini, A. S.; Esmaeildoust, M.
2011-01-01
The residue number system (RNS) has been an important research field in computer arithmetic for many decades, mainly because of its carry-free nature, which can provide high-performance computing architectures with superior delay specifications. Recently, research on RNS has found new directions that have resulted in the introduction of efficient…
Developing a Research Agenda for Ubiquitous Computing in Schools
ERIC Educational Resources Information Center
Zucker, Andrew
2004-01-01
Increasing numbers of states, districts, and schools provide every student with a computing device; for example, the middle schools in Maine maintain wireless Internet access and the students receive laptops. Research can provide policymakers with better evidence of the benefits and costs of 1:1 computing and establish which factors make 1:1…
A Research and Development Strategy for High Performance Computing.
ERIC Educational Resources Information Center
Office of Science and Technology Policy, Washington, DC.
This report is the result of a systematic review of the status and directions of high performance computing and its relationship to federal research and development. Conducted by the Federal Coordinating Council for Science, Engineering, and Technology (FCCSET), the review involved a series of workshops attended by numerous computer scientists and…
An information retrieval system for research file data
Joan E. Lengel; John W. Koning
1978-01-01
Research file data have been successfully retrieved at the Forest Products Laboratory through a high-speed cross-referencing system involving the computer program FAMULUS as modified by the Madison Academic Computing Center at the University of Wisconsin. The method of data input, transfer to computer storage, system utilization, and effectiveness are discussed....
ERIC Educational Resources Information Center
Ceyhan, A. Aykut; Ceyhan, Esra
2007-01-01
This research examines the relationships between unethical computer usage behavior and the personality characteristics of locus of control, adjustment to social norms, antisocial tendency, and aggression in Turkish university students. The study involved 217 university students. Data were collected through Unethical Computer Using…
Computer-Based Simulations for Maintenance Training: Current ARI Research. Technical Report 544.
ERIC Educational Resources Information Center
Knerr, Bruce W.; And Others
Three research efforts that used computer-based simulations for maintenance training were in progress when this report was written: Game-Based Learning, which investigated the use of computer-based games to train electronics diagnostic skills; Human Performance in Fault Diagnosis Tasks, which evaluated the use of context-free tasks to train…
The 3d International Workshop on Computational Electronics
NASA Astrophysics Data System (ADS)
Goodnick, Stephen M.
1994-09-01
The Third International Workshop on Computational Electronics (IWCE) was held at the Benson Hotel in downtown Portland, Oregon, on May 18, 19, and 20, 1994. The workshop was devoted to a broad range of topics in computational electronics related to the simulation of electronic transport in semiconductors and semiconductor devices, particularly those which use large computational resources. The workshop was supported by the National Science Foundation (NSF), the Office of Naval Research and the Army Research Office, as well as local support from the Oregon Joint Graduate Schools of Engineering and the Oregon Center for Advanced Technology Education. There were over 100 participants in the Portland workshop, more than one quarter of whom represented research groups outside the United States, from Austria, Canada, France, Germany, Italy, Japan, Switzerland, and the United Kingdom. A total of 81 papers were presented at the workshop: 9 invited talks, 26 oral presentations, and 46 poster presentations. The emphasis of the contributions reflected the interdisciplinary nature of computational electronics, with researchers from the Chemistry, Computer Science, Mathematics, Engineering, and Physics communities participating in the workshop.
Eleven quick tips for architecting biomedical informatics workflows with cloud computing.
Cole, Brian S; Moore, Jason H
2018-03-01
Cloud computing has revolutionized the development and operations of hardware and software across diverse technological arenas, yet academic biomedical research has lagged behind despite the numerous and weighty advantages that cloud computing offers. Biomedical researchers who embrace cloud computing can reap rewards in cost reduction, decreased development and maintenance workload, increased reproducibility, ease of sharing data and software, enhanced security, horizontal and vertical scalability, high availability, a thriving technology partner ecosystem, and much more. Despite these advantages that cloud-based workflows offer, the majority of scientific software developed in academia does not utilize cloud computing and must be migrated to the cloud by the user. In this article, we present 11 quick tips for architecting biomedical informatics workflows on compute clouds, distilling knowledge gained from experience developing, operating, maintaining, and distributing software and virtualized appliances on the world's largest cloud. Researchers who follow these tips stand to benefit immediately by migrating their workflows to cloud computing and embracing the paradigm of abstraction.
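To make one of the article's themes concrete: a recurring tip in cloud-first workflow design is to treat durable object storage, not local disk, as the system of record for workflow data. The sketch below is a minimal illustration of that idea assuming the boto3 library and configured AWS credentials; the bucket, keys, and function names are hypothetical and are not drawn from the article.

```python
# A minimal sketch of the "embrace abstraction" tip: push workflow inputs
# and outputs through shared object storage so any collaborator or VM can
# reproduce a run. Requires boto3 and configured AWS credentials.
import boto3

s3 = boto3.client("s3")
BUCKET = "my-lab-genomics-data"   # hypothetical bucket name

def publish(local_path, key):
    """Push a result artifact to shared, durable object storage."""
    s3.upload_file(Filename=local_path, Bucket=BUCKET, Key=key)

def fetch(key, local_path):
    """Pull an input so the same step can run on any machine."""
    s3.download_file(Bucket=BUCKET, Key=key, Filename=local_path)

publish("results/variants.vcf", "runs/2018-03-01/variants.vcf")
```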
Eleven quick tips for architecting biomedical informatics workflows with cloud computing
Moore, Jason H.
2018-01-01
Cloud computing has revolutionized the development and operations of hardware and software across diverse technological arenas, yet academic biomedical research has lagged behind despite the numerous and weighty advantages that cloud computing offers. Biomedical researchers who embrace cloud computing can reap rewards in cost reduction, decreased development and maintenance workload, increased reproducibility, ease of sharing data and software, enhanced security, horizontal and vertical scalability, high availability, a thriving technology partner ecosystem, and much more. Despite these advantages that cloud-based workflows offer, the majority of scientific software developed in academia does not utilize cloud computing and must be migrated to the cloud by the user. In this article, we present 11 quick tips for architecting biomedical informatics workflows on compute clouds, distilling knowledge gained from experience developing, operating, maintaining, and distributing software and virtualized appliances on the world’s largest cloud. Researchers who follow these tips stand to benefit immediately by migrating their workflows to cloud computing and embracing the paradigm of abstraction. PMID:29596416
Implementation of Audio Computer-Assisted Interviewing Software in HIV/AIDS Research
Pluhar, Erika; Yeager, Katherine A.; Corkran, Carol; McCarty, Frances; Holstad, Marcia McDonnell; Denzmore-Nwagbara, Pamela; Fielder, Bridget; DiIorio, Colleen
2007-01-01
Computer assisted interviewing (CAI) has begun to play a more prominent role in HIV/AIDS prevention research. Despite the increased popularity of CAI, particularly audio computer assisted self-interviewing (ACASI), some research teams are still reluctant to implement ACASI technology due to lack of familiarity with the practical issues related to using these software packages. The purpose of this paper is to describe the implementation of one particular ACASI software package, the Questionnaire Development System™ (QDS™), in several nursing and HIV/AIDS prevention research settings. We present acceptability and satisfaction data from two large-scale public health studies in which we have used QDS with diverse populations. We also address issues related to developing and programming a questionnaire, discuss practical strategies related to planning for and implementing ACASI in the field, including selecting equipment, training staff, and collecting and transferring data, and summarize advantages and disadvantages of computer assisted research methods. PMID:17662924
Semiannual report, 1 April - 30 September 1991
NASA Technical Reports Server (NTRS)
1991-01-01
The major categories of the current Institute for Computer Applications in Science and Engineering (ICASE) research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification problems, with emphasis on effective numerical methods; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software for parallel computers. Research in these areas is discussed.
Dynamic interactions in neural networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arbib, M.A.; Amari, S.
The study of neural networks is enjoying a great renaissance, both in computational neuroscience, the development of information processing models of living brains, and in neural computing, the use of neurally inspired concepts in the construction of intelligent machines. This volume presents models and data on the dynamic interactions occurring in the brain, and exhibits the dynamic interactions between research in computational neuroscience and in neural computing. The authors present current research, future trends and open problems.
NASA Technical Reports Server (NTRS)
STACK S. H.
1981-01-01
A computer-aided design system has recently been developed specifically for the small research group environment. The system is implemented on a Prime 400 minicomputer linked with a CDC 6600 computer. The goal was to assign the minicomputer specific tasks, such as data input and graphics, thereby reserving the large mainframe computer for time-consuming analysis codes. The basic structure of the design system consists of GEMPAK, a computer code that generates detailed configuration geometry from a minimum of input; interface programs that reformat GEMPAK geometry for input to the analysis codes; and utility programs that simplify computer access and data interpretation. The working system has had a large positive impact on the quantity and quality of research performed by the originating group. This paper describes the system, the major factors that contributed to its particular form, and presents examples of its application.
Survey data collection using Audio Computer Assisted Self-Interview.
Jones, Rachel
2003-04-01
The Audio Computer Assisted Self-Interview (ACASI) is a computer application that allows a research participant to hear survey interview items over a computer headset and read the corresponding items on a computer monitor. The ACASI automates progression from one item to the next, skipping irrelevant items. The research participant responds by pressing a number keypad, sending the data directly into a database. The ACASI was used to enhance participants' sense of privacy. A convenience sample of 257 young urban women, ages 18 to 29 years, was interviewed in neighborhood settings concerning human immunodeficiency virus (HIV) sexual risk behaviors. Notebook computers were used to facilitate mobility. The overwhelming majority rated their experience with ACASI as easy to use. This article will focus on the use of ACASI in HIV behavioral research, its benefits, and approaches to resolve some identified problems with this method of data collection.
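The skip-logic and direct-to-database behavior described above can be pictured with a small sketch. This is not the ACASI software used in the study; the items, branching rules, and database schema below are invented purely for illustration.

```python
# A minimal sketch of ACASI-style skip logic: each item names the next
# item to ask, optionally depending on the keyed-in answer.
import sqlite3

ITEMS = {
    "q1": ("Have you been sexually active in the past year? 1=yes 2=no",
           lambda ans: "q2" if ans == 1 else "q3"),   # skip irrelevant item
    "q2": ("How many partners? (enter a number)", lambda ans: "q3"),
    "q3": ("Age in years?", lambda ans: None),        # end of interview
}

db = sqlite3.connect("responses.db")
db.execute("CREATE TABLE IF NOT EXISTS responses (item TEXT, answer INTEGER)")

item = "q1"
while item is not None:
    text, next_item = ITEMS[item]
    ans = int(input(text + " "))          # stands in for the number keypad
    db.execute("INSERT INTO responses VALUES (?, ?)", (item, ans))
    item = next_item(ans)                 # branch based on the answer
db.commit()
```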
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, James C. Jr.; Mason, Thomas; Guerrieri, Bruno
1997-10-01
Programs have been established at Florida A&M University to attract minority students to research careers in mathematics and computational science. The primary goal of the program was to increase the number of such students studying computational science via an interactive multimedia learning environment. One mechanism used for meeting this goal was the development of educational modules. This academic-year program, established within the mathematics department at Florida A&M University, introduced students to computational science projects using high-performance computers. Additional activities were conducted during the summer; these included workshops, meetings, and lectures. Through the exposure this program provided to scientific ideas and research in computational science, students are likely to apply tools from this interdisciplinary field successfully.
Computational manufacturing as a bridge between design and production.
Tikhonravov, Alexander V; Trubetskov, Michael K
2005-11-10
Computational manufacturing of optical coatings is a research area that can be placed between theoretical designing and practical manufacturing in the same way that computational physics can be placed between theoretical and experimental physics. Investigations in this area have been performed for more than 30 years under the name of computer simulation of manufacturing and monitoring processes. Our goal is to attract attention to the increasing importance of computational manufacturing at the current state of the art in the design and manufacture of optical coatings and to demonstrate possible applications of this research tool.
Computational manufacturing as a bridge between design and production
NASA Astrophysics Data System (ADS)
Tikhonravov, Alexander V.; Trubetskov, Michael K.
2005-11-01
Computational manufacturing of optical coatings is a research area that can be placed between theoretical designing and practical manufacturing in the same way that computational physics can be placed between theoretical and experimental physics. Investigations in this area have been performed for more than 30 years under the name of computer simulation of manufacturing and monitoring processes. Our goal is to attract attention to the increasing importance of computational manufacturing at the current state of the art in the design and manufacture of optical coatings and to demonstrate possible applications of this research tool.
When cloud computing meets bioinformatics: a review.
Zhou, Shuigeng; Liao, Ruiqi; Guan, Jihong
2013-10-01
In the past decades, with the rapid development of high-throughput technologies, biology research has generated an unprecedented amount of data. In order to store and process such a great amount of data, cloud computing and MapReduce were applied to many fields of bioinformatics. In this paper, we first introduce the basic concepts of cloud computing and MapReduce, and their applications in bioinformatics. We then highlight some problems challenging the applications of cloud computing and MapReduce to bioinformatics. Finally, we give a brief guideline for using cloud computing in biology research.
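For readers unfamiliar with the MapReduce pattern the review refers to, the toy example below counts k-mers in sequencing reads within a single process; the map, shuffle, and reduce phases are exactly the parts a framework such as Hadoop would distribute across cluster nodes. The reads and k value are invented for illustration.

```python
# A toy MapReduce pass counting k-mers in reads, to make the phases concrete.
from collections import defaultdict

K = 3
reads = ["GATTACA", "TACAGAT"]

# Map: each read independently emits (k-mer, 1) pairs; trivially parallel.
mapped = [(read[i:i+K], 1) for read in reads for i in range(len(read) - K + 1)]

# Shuffle: group intermediate pairs by key.
groups = defaultdict(list)
for kmer, count in mapped:
    groups[kmer].append(count)

# Reduce: aggregate each key's values; also parallel across keys.
counts = {kmer: sum(vals) for kmer, vals in groups.items()}
print(counts)   # e.g., {'GAT': 2, 'TAC': 2, 'ACA': 2, 'ATT': 1, ...}
```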
Parallel aeroelastic computations for wing and wing-body configurations
NASA Technical Reports Server (NTRS)
Byun, Chansup
1994-01-01
The objective of this research is to develop computationally efficient methods for solving fluid-structural interaction problems by directly coupling finite difference Euler/Navier-Stokes equations for fluids and finite element dynamics equations for structures on parallel computers. This capability will significantly impact many aerospace projects of national importance such as Advanced Subsonic Civil Transport (ASCT), where the structural stability margin becomes very critical at the transonic region. This research effort will have direct impact on the High Performance Computing and Communication (HPCC) Program of NASA in the area of parallel computing.
Dynamically allocating sets of fine-grained processors to running computations
NASA Technical Reports Server (NTRS)
Middleton, David
1988-01-01
Researchers explore an approach to using general purpose parallel computers which involves mapping hardware resources onto computations instead of mapping computations onto hardware. Problems such as processor allocation, task scheduling and load balancing, which have traditionally proven to be challenging, change significantly under this approach and may become amenable to new attacks. Researchers describe the implementation of this approach used by the FFP Machine whose computation and communication resources are repeatedly partitioned into disjoint groups that match the needs of available tasks from moment to moment. Several consequences of this system are examined.
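The moment-to-moment partitioning idea can be sketched as follows: each scheduling cycle, the free processor pool is carved into disjoint groups matched to the declared needs of the ready tasks, and tasks that cannot be satisfied wait for the next cycle. The task names and resource needs below are hypothetical; this illustrates the general idea, not the FFP Machine's actual mechanism.

```python
# A sketch of mapping hardware onto computations: partition the processor
# pool into disjoint groups sized to the needs of currently ready tasks.
def partition(processors, tasks):
    """Assign each task a disjoint slice of the pool, largest need first."""
    free = list(processors)
    assignment = {}
    for name, need in sorted(tasks.items(), key=lambda t: -t[1]):
        if need <= len(free):
            assignment[name], free = free[:need], free[need:]
    return assignment, free   # unassigned tasks wait for the next cycle

procs = range(16)
ready = {"parse": 6, "match": 4, "rewrite": 8}   # hypothetical needs
assigned, idle = partition(procs, ready)
print({t: len(g) for t, g in assigned.items()}, len(idle), "idle")
# "match" (needs 4, only 2 processors left) waits for the next cycle.
```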
Impact of Classroom Computer Use on Computer Anxiety.
ERIC Educational Resources Information Center
Lambert, Matthew E.; And Others
Increasing use of computer programs for undergraduate psychology education has raised concern over the impact of computer anxiety on educational performance. Additionally, some researchers have indicated that classroom computer use can exacerbate pre-existing computer anxiety. To evaluate the relationship between in-class computer use and computer…
The ACI-REF Program: Empowering Prospective Computational Researchers
NASA Astrophysics Data System (ADS)
Cuma, M.; Cardoen, W.; Collier, G.; Freeman, R. M., Jr.; Kitzmiller, A.; Michael, L.; Nomura, K. I.; Orendt, A.; Tanner, L.
2014-12-01
The ACI-REF program, Advanced Cyberinfrastructure - Research and Education Facilitation, represents a consortium of academic institutions seeking to further advance the capabilities of their respective campus research communities through an extension of the personal connections and educational activities that underlie the unique and often specialized cyberinfrastructure at each institution. This consortium currently includes Clemson University, Harvard University, University of Hawai'i, University of Southern California, University of Utah, and University of Wisconsin. Working together in a coordinated effort, the consortium is dedicated to the adoption of models and strategies which leverage the expertise and experience of its members with a goal of maximizing the impact of each institution's investment in research computing. The ACI-REFs (facilitators) are tasked with making connections and building bridges between the local campus researchers and the many different providers of campus, commercial, and national computing resources. Through these bridges, ACI-REFs assist researchers from all disciplines in understanding their computing and data needs and in mapping these needs to existing capabilities or providing assistance with development of these capabilities. From the Earth sciences perspective, we will give examples of how this assistance improved methods and workflows in geophysics, geography and atmospheric sciences. We anticipate that this effort will expand the number of researchers who become self-sufficient users of advanced computing resources, allowing them to focus on making research discoveries in a more timely and efficient manner.
Measuring Impact of EPA's Computational Toxicology Research (BOSC)
Computational Toxicology (CompTox) research at the EPA was initiated in 2005. Since 2005, CompTox research efforts have made tremendous advances in developing new approaches to evaluate thousands of chemicals for potential health effects. The purpose of this case study is to trac...
NASA Technical Reports Server (NTRS)
1994-01-01
CESDIS, the Center of Excellence in Space Data and Information Sciences, was developed jointly by NASA, Universities Space Research Association (USRA), and the University of Maryland in 1988 to focus on the design of advanced computing techniques and data systems to support NASA Earth and space science research programs. CESDIS is operated by USRA under contract to NASA. The Director, Associate Director, Staff Scientists, and administrative staff are located on-site at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The primary CESDIS mission is to increase the connection between computer science and engineering research programs at colleges and universities and NASA groups working with computer applications in Earth and space science. The 1993-94 CESDIS year included a broad range of computer science research applied to NASA problems. This report provides an overview of these research projects and programs as well as a summary of the various other activities of CESDIS in support of NASA and the university research community. We have had an exciting and challenging year.
Implementation of audio computer-assisted interviewing software in HIV/AIDS research.
Pluhar, Erika; McDonnell Holstad, Marcia; Yeager, Katherine A; Denzmore-Nwagbara, Pamela; Corkran, Carol; Fielder, Bridget; McCarty, Frances; Diiorio, Colleen
2007-01-01
Computer-assisted interviewing (CAI) has begun to play a more prominent role in HIV/AIDS prevention research. Despite the increased popularity of CAI, particularly audio computer-assisted self-interviewing (ACASI), some research teams are still reluctant to implement ACASI technology because of lack of familiarity with the practical issues related to using these software packages. The purpose of this report is to describe the implementation of one particular ACASI software package, the Questionnaire Development System (QDS; Nova Research Company, Bethesda, MD), in several nursing and HIV/AIDS prevention research settings. The authors present acceptability and satisfaction data from two large-scale public health studies in which they have used QDS with diverse populations. They also address issues related to developing and programming a questionnaire; discuss practical strategies related to planning for and implementing ACASI in the field, including selecting equipment, training staff, and collecting and transferring data; and summarize advantages and disadvantages of computer-assisted research methods.
NASA Astrophysics Data System (ADS)
Akı, Fatma Nur; Gürel, Zeynep
2017-02-01
The purpose of this research is to determine university students' learning experiences in a flipped physics laboratory class. The research was carried out during the fall semester of 2015 in the Computer Engineering Department of Istanbul Commerce University. An action research method, a qualitative design often used by teachers, was adopted. The participants were ten students, seven freshmen and three juniors, from the Computer Engineering Department. The research data were collected at the end of the semester through a focus group interview comprising structured and open-ended questions, and the data were evaluated with categorical content analysis. According to the results, students reported both similar and differing learning experiences with the flipped education method in the physics laboratory class.
A Virtual Astronomical Research Machine in No Time (VARMiNT)
NASA Astrophysics Data System (ADS)
Beaver, John
2012-05-01
We present early results of using virtual machine software to help make astronomical research computing accessible to a wider range of individuals. Our Virtual Astronomical Research Machine in No Time (VARMiNT) is an Ubuntu Linux virtual machine with free, open-source software already installed and configured (and in many cases documented). The purpose of VARMiNT is to provide a ready-to-go astronomical research computing environment that can be freely shared between researchers, or between amateur and professional, teacher and student, etc., and to circumvent the often-difficult task of configuring a suitable computing environment from scratch. Thus we hope that VARMiNT will make it easier for individuals to engage in research computing even if they have no ready access to the facilities of a research institution. We describe our current version of VARMiNT and some of the ways it is being used at the University of Wisconsin - Fox Valley, a two-year teaching campus of the University of Wisconsin System, as a means to enhance student independent study research projects and to facilitate collaborations with researchers at other locations. We also outline some future plans and prospects.
An Assessment of the Computer Science Activities of the Office of Naval Research
1986-01-01
A panel of the Naval Studies Board of the National Research Council met for two days in October 1985 to assess the computer science programs of the Office of Naval Research (ONR). These programs are supported by the Contract Research Program (CRP) as well as the Naval Research Laboratory (NRL), the Naval…
Phenomenography and Grounded Theory as Research Methods in Computing Education Research Field
ERIC Educational Resources Information Center
Kinnunen, Paivi; Simon, Beth
2012-01-01
This paper discusses two qualitative research methods, phenomenography and grounded theory. We introduce both methods' data collection and analysis processes and the type of results you may obtain, using examples from computing education research. We highlight some of the similarities and differences between the aim, data collection and…
Web-Based Integrated Research Environment for Aerodynamic Analyses and Design
NASA Astrophysics Data System (ADS)
Ahn, Jae Wan; Kim, Jin-Ho; Kim, Chongam; Cho, Jung-Hyun; Hur, Cinyoung; Kim, Yoonhee; Kang, Sang-Hyun; Kim, Byungsoo; Moon, Jong Bae; Cho, Kum Won
e-AIRS[1,2], an abbreviation of 'e-Science Aerospace Integrated Research System,' is a virtual organization designed to support aerodynamic flow analyses in aerospace engineering using the e-Science environment. As the first step toward a virtual aerospace engineering organization, e-AIRS intends to give full support to the aerodynamic research process. Currently, e-AIRS can handle both computational and experimental aerodynamic research on the e-Science infrastructure. In detail, users can conduct a full CFD (Computational Fluid Dynamics) research process, request wind tunnel experiments, perform comparative analysis between computational prediction and experimental measurement, and finally, collaborate with other researchers using the web portal. The present paper describes those services and the internal architecture of the e-AIRS system.
The UK Human Genome Mapping Project online computing service.
Rysavy, F R; Bishop, M J; Gibbs, G P; Williams, G W
1992-04-01
This paper presents an overview of computing and networking facilities developed by the Medical Research Council to provide online computing support to the Human Genome Mapping Project (HGMP) in the UK. The facility is connected to a number of other computing facilities in various centres of genetics and molecular biology research excellence, either directly via high-speed links or through national and international wide-area networks. The paper describes the design and implementation of the current system, a 'client/server' network of Sun, IBM, DEC and Apple servers, gateways and workstations. A short outline of online computing services currently delivered by this system to the UK human genetics research community is also provided. More information about the services and their availability could be obtained by a direct approach to the UK HGMP-RC.
ICASE semiannual report, April 1 - September 30, 1989
NASA Technical Reports Server (NTRS)
1990-01-01
The Institute conducts unclassified basic research in applied mathematics, numerical analysis, and computer science in order to extend and improve problem-solving capabilities in science and engineering, particularly in aeronautics and space. The major categories of the current Institute for Computer Applications in Science and Engineering (ICASE) research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification problems, with emphasis on effective numerical methods; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers. ICASE reports are considered to be primarily preprints of manuscripts that have been submitted to appropriate research journals or that are to appear in conference proceedings.
Architectural Aspects of Grid Computing and its Global Prospects for E-Science Community
NASA Astrophysics Data System (ADS)
Ahmad, Mushtaq
2008-05-01
The paper reviews the imminent architectural aspects of Grid Computing for the e-Science community, enabling scientific research and business/commercial collaboration beyond physical boundaries. Grid Computing provides all the needed facilities: hardware, software, communication interfaces, high-speed internet, safe authentication, and a secure environment for collaboration on research projects around the globe. It provides a very fast compute engine for those scientific and engineering research projects and business/commercial applications which are heavily compute intensive and/or require humongous amounts of data. It also makes possible the use of very advanced methodologies, simulation models, expert systems, and the treasure of knowledge available around the globe under the umbrella of knowledge sharing. Thus it helps realize the dream of a global village for the benefit of the e-Science community across the globe.
V/STOL AND digital avionics system for UH-1H
NASA Technical Reports Server (NTRS)
Liden, S.
1978-01-01
A hardware and software system for the Bell UH-1H helicopter was developed that provides sophisticated navigation, guidance, control, display, and data acquisition capabilities for performing terminal area navigation, guidance and control research. Two Sperry 1819B general purpose digital computers were used. One contains the development software that performs all the specified system flight computations. The second computer is available to NASA for experimental programs that run simultaneously with the other computer programs and which may, at the push of a button, replace selected computer computations. Other features that provide research flexibility include keyboard selectable gains and parameters and software generated alphanumeric and CRT displays.
NASA Astrophysics Data System (ADS)
Skrzypek, Josef; Mesrobian, Edmond; Gungner, David J.
1989-03-01
The development of autonomous land vehicles (ALV) capable of operating in an unconstrained environment has proven to be a formidable research effort. The unpredictability of events in such an environment calls for the design of a robust perceptual system, an impossible task requiring the programming of a system based on the expectation of future, unconstrained events. Hence, the need for a "general purpose" machine vision system that is capable of perceiving and understanding images in an unconstrained environment in real-time. The research undertaken at the UCLA Machine Perception Laboratory addresses this need by focusing on two specific issues: 1) the long-term goals for machine vision research as a joint effort between the neurosciences and computer science; and 2) a framework for evaluating progress in machine vision. In the past, vision research has been carried out independently within different fields including neurosciences, psychology, computer science, and electrical engineering. Our interdisciplinary approach to vision research is based on the rigorous combination of computational neuroscience, as derived from neurophysiology and neuropsychology, with computer science and electrical engineering. The primary motivation behind our approach is that the human visual system is the only existing example of a "general purpose" vision system and, using a neurally based computing substrate, it can complete all necessary visual tasks in real-time.
Cloud computing applications for biomedical science: A perspective.
Navale, Vivek; Bourne, Philip E
2018-06-01
Biomedical research has become a digital data-intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches. Cloud computing offers users pay-as-you-go access to services such as hardware infrastructure, platforms, and software for solving common biomedical computational problems. Cloud computing services offer secure on-demand storage and analysis and are differentiated from traditional high-performance computing by their rapid availability and scalability of services. As such, cloud services are engineered to address big data problems and enhance the likelihood of data and analytics sharing, reproducibility, and reuse. Here, we provide an introductory perspective on cloud computing to help the reader determine its value to their own research.
Cloud computing applications for biomedical science: A perspective
2018-01-01
Biomedical research has become a digital data–intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches. Cloud computing offers users pay-as-you-go access to services such as hardware infrastructure, platforms, and software for solving common biomedical computational problems. Cloud computing services offer secure on-demand storage and analysis and are differentiated from traditional high-performance computing by their rapid availability and scalability of services. As such, cloud services are engineered to address big data problems and enhance the likelihood of data and analytics sharing, reproducibility, and reuse. Here, we provide an introductory perspective on cloud computing to help the reader determine its value to their own research. PMID:29902176
Research on OpenStack of open source cloud computing in colleges and universities’ computer room
NASA Astrophysics Data System (ADS)
Wang, Lei; Zhang, Dandan
2017-06-01
In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large number of users through the advantages of openness and low cost, and has now reached large-scale promotion and application. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this research, we describe the specific application and deployment of OpenStack for university computer rooms. The experimental results show that OpenStack can efficiently and conveniently deploy a cloud for a university computer room, with stable performance and good functional value.
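As a rough sketch of what scripted lab provisioning with OpenStack can look like, the snippet below uses the openstacksdk cloud layer to create one identical VM per lab seat. The cloud name, image, flavor, and network are placeholders for whatever a local deployment defines; the paper does not specify its actual configuration.

```python
# A hedged sketch of scripted lab provisioning with the openstacksdk
# cloud layer; all resource names below are placeholders.
import openstack

conn = openstack.connect(cloud="campus-lab")   # credentials from clouds.yaml

# One identical desktop VM per seat in the teaching lab.
for seat in range(1, 31):
    conn.create_server(
        name=f"lab-desktop-{seat:02d}",
        image="ubuntu-16.04-desktop",   # placeholder image name
        flavor="m1.small",              # placeholder flavor
        network="lab-net",              # placeholder tenant network
        wait=True,
    )
```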
Team Effectiveness and Team Development in CSCL
ERIC Educational Resources Information Center
Fransen, Jos; Weinberger, Armin; Kirschner, Paul A.
2013-01-01
There is a wealth of research on computer-supported cooperative work (CSCW) that is neglected in computer-supported collaborative learning (CSCL) research. CSCW research is concerned with contextual factors, however, that may strongly influence collaborative learning processes as well, such as task characteristics, team formation, team members'…
Toward a superconducting quantum computer
Tsai, Jaw-Shen
2010-01-01
Intensive research on the construction of superconducting quantum computers has produced numerous important achievements. The quantum bit (qubit), based on the Josephson junction, is at the heart of this research. This macroscopic system has the ability to control quantum coherence. This article reviews the current state of quantum computing as well as its history, and discusses its future. Although progress has been rapid, the field remains beset with unsolved issues, and there are still many new research opportunities open to physicists and engineers. PMID:20431256
Researchers Mine Information from Next-Generation Subsurface Flow Simulations
Gedenk, Eric D.
2015-12-01
A research team based at Virginia Tech leveraged computing resources at the US Department of Energy's (DOE's) Oak Ridge National Laboratory to explore subsurface multiphase flow phenomena that cannot be experimentally observed. Using the Cray XK7 Titan supercomputer at the Oak Ridge Leadership Computing Facility, the team took micro-CT images of subsurface geologic systems and created two-phase flow simulations. The team's model development has implications for computational research pertaining to carbon sequestration, oil recovery, and contaminant transport.
Examining the Role of Religiosity in Moral Cognition, Specifically in the Formation of Sacred Values, and Researching Computational Models for Analyzing Sacred Rhetoric and its Consequential Emotions
Dehghani, Morteza
2015-08-13
Final report (YIP-12), AFRL-AFOSR-VA-TR-2015-0270, University of Southern California, Los Angeles.
Toward a superconducting quantum computer. Harnessing macroscopic quantum coherence.
Tsai, Jaw-Shen
2010-01-01
Intensive research on the construction of superconducting quantum computers has produced numerous important achievements. The quantum bit (qubit), based on the Josephson junction, is at the heart of this research. This macroscopic system has the ability to control quantum coherence. This article reviews the current state of quantum computing as well as its history, and discusses its future. Although progress has been rapid, the field remains beset with unsolved issues, and there are still many new research opportunities open to physicists and engineers.
NASA Computational Fluid Dynamics Conference. Volume 1: Sessions 1-6
NASA Technical Reports Server (NTRS)
1989-01-01
Presentations given at the NASA Computational Fluid Dynamics (CFD) Conference held at the NASA Ames Research Center, Moffett Field, California, March 7-9, 1989 are given. Topics covered include research facility overviews of CFD research and applications, validation programs, direct simulation of compressible turbulence, turbulence modeling, advances in Runge-Kutta schemes for solving 3-D Navier-Stokes equations, grid generation and invicid flow computation around aircraft geometries, numerical simulation of rotorcraft, and viscous drag prediction for rotor blades.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Tianwu; Zaidi, Habib, E-mail: habib.zaidi@hcuge.ch; Geneva Neuroscience Center, Geneva University, Geneva CH-1205
The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.
Computational Modeling and Treatment Identification in the Myelodysplastic Syndromes.
Drusbosky, Leylah M; Cogle, Christopher R
2017-10-01
This review discusses the need for computational modeling in myelodysplastic syndromes (MDS) and early test results. As our evolving understanding of MDS reveals a molecularly complicated disease, the need for sophisticated computer analytics is required to keep track of the number and complex interplay among the molecular abnormalities. Computational modeling and digital drug simulations using whole exome sequencing data input have produced early results showing high accuracy in predicting treatment response to standard of care drugs. Furthermore, the computational MDS models serve as clinically relevant MDS cell lines for pre-clinical assays of investigational agents. MDS is an ideal disease for computational modeling and digital drug simulations. Current research is focused on establishing the prediction value of computational modeling. Future research will test the clinical advantage of computer-informed therapy in MDS.
Toward Impactful Collaborations on Computing and Mental Health
Dinakar, Karthik; Picard, Rosalind; Christensen, Helen; Torous, John
2018-01-01
We describe an initiative to bring mental health researchers, computer scientists, human-computer interaction researchers, and other communities together to address the challenges of the global mental ill health epidemic. Two face-to-face events and one special issue of the Journal of Medical Internet Research were organized. The works presented in these events and publication reflect key state-of-the-art research in this interdisciplinary collaboration. We summarize the special issue articles and contextualize them to present a picture of the most recent research. In addition, we describe a series of collaborative activities held during the second symposium and where the community identified 5 challenges and their possible solutions. PMID:29426812
Artificial Intelligence and brain.
Shapshak, Paul
2018-01-01
From the start, Kurt Gödel observed that computer and brain paradigms were considered on a par by researchers and that researchers had misunderstood his theorems. He maintained, with displeasure, that the brain transcends computers. In this brief article, we point out that Artificial Intelligence (AI) comprises multitudes of human-made methodologies, systems, and languages, implemented with computer technology. These advances enhance development in the electron and quantum realms. In the biological realm, animal neurons function, also utilizing electron flow, and are products of evolution. Mirror neurons are an important paradigm in neuroscience research. Moreover, the paradigm shift proposed here, a 'hall of mirror neurons', is a potentially productive further research tactic. These concepts further expand AI and brain research.
A Test-Bed of Secure Mobile Cloud Computing for Military Applications
2016-09-13
… searching databases. This kind of application is a typical example of mobile cloud computing (MCC), which has many applications in the military domain. Final report, 1-Aug-2014 to 31-Jul-2016, Army Research Office, Research Triangle Park, NC. Approved for public release; distribution unlimited. Keywords: test-bed, mobile cloud computing, security, military applications.
Using Online Computer Games in the ELT Classroom: A Case Study
ERIC Educational Resources Information Center
Vasileiadou, Ioanna; Makrina, Zafiri
2017-01-01
The purpose of this research was to investigate the effectiveness of computer games in learning English as a foreign language and the extent to which they increase motivation in young students. More particularly, this research investigated the validity of the hypothesis that computer games are a particularly motivating means for young students to…
Choosing a Computer Language for Institutional Research. The AIR Professional File No. 6.
ERIC Educational Resources Information Center
Strenglein, Denise
1980-01-01
It is suggested that much thought should be given to choosing an appropriate computer language for an institutional research office, considering the sophistication of the staff, types of planned application, size and type of computer, and availability of central programming support in the institution. For offices that prepare straight reports and…
Inequities in Computer Education Due to Gender, Race, and Socioeconomic Status.
ERIC Educational Resources Information Center
Urban, Cynthia M.
Recent reports have revealed that inequalities exist between males and females, racial minorities and whites, and rich and poor in accessibility to and use of computers. This study reviews the research in the field of computer-based education to determine the extent of and reasons for these inequities. The annotated research articles are arranged…
ERIC Educational Resources Information Center
Hsu, Chung-Yuan; Tsai, Chin-Chung; Liang, Jyh-Chong
2011-01-01
Educational researchers have suggested that computer games have a profound influence on students' motivation, knowledge construction, and learning performance, but little empirical research has targeted preschoolers. Thus, the purpose of the present study was to investigate the effects of implementing a computer game that integrates the…
ERIC Educational Resources Information Center
Charleston, LaVar J.; Gilbert, Juan E.; Escobar, Barbara; Jackson, Jerlando F. L.
2014-01-01
African Americans represent 1.3% of all computing sciences faculty in PhD-granting departments, underscoring the severe underrepresentation of Black/African American tenure-track faculty in computing (CRA, 2012). The Future Faculty/Research Scientist Mentoring (FFRM) program, funded by the National Science Foundation, was found to be an effective…
A Study of Computer Techniques for Music Research. Final Report.
ERIC Educational Resources Information Center
Lincoln, Harry B.
Work in three areas comprised this study of computer use in thematic indexing for music research: (1) acquisition, encoding, and keypunching of data--themes of which now number about 50,000 (primarily 16th Century Italian vocal music) and serve as a test base for program development; (2) development of computer programs to process this data; and…
ERIC Educational Resources Information Center
Smolinski, Tomasz G.
2010-01-01
Computer literacy plays a critical role in today's life sciences research. Without the ability to use computers to efficiently manipulate and analyze large amounts of data resulting from biological experiments and simulations, many of the pressing questions in the life sciences could not be answered. Today's undergraduates, despite the ubiquity of…
Computer versus Paper-Based Reading: A Case Study in English Language Teaching Context
ERIC Educational Resources Information Center
Solak, Ekrem
2014-01-01
This research aims to determine the preference of prospective English teachers in performing computer and paper-based reading tasks and to what extent computer and paper-based reading influence their reading speed, accuracy and comprehension. The research was conducted at a State run University, English Language Teaching Department in Turkey. The…
Interactive Story Authoring: A Viable Form of Creative Expression for the Classroom
ERIC Educational Resources Information Center
Carbonaro, M.; Cutumisu, M.; Duff, H.; Gillis, S.; Onuczko, C.; Siegel, J.; Schaeffer, J.; Schumacher, A.; Szafron, D.; Waugh, K.
2008-01-01
The unprecedented growth in numbers of children playing computer games has stimulated discussion and research regarding what, if any, educational value these games have for teaching and learning. The research on this topic has primarily focused on children as players of computer games rather than builders/constructors of computer games. Recently,…
ERIC Educational Resources Information Center
Sun, Pei Chen; Finger, Glenn; Liu, Zhen Lan
2014-01-01
While there have been very limited studies of the educational computing literature to analyze the research trends since the early emergence of educational computing technologies, the authors argue that it is important for both researchers and educators to understand the major, historical educational computing trends in order to inform…
ERIC Educational Resources Information Center
Liu, Pei-Lin; Chen, Chiu-Jung; Chang, Yu-Ju
2010-01-01
The purpose of this research was to investigate the effects of a computer-assisted concept mapping learning strategy on EFL college learners' English reading comprehension. The research questions were: (1) what was the influence of the computer-assisted concept mapping learning strategy on different learners' English reading comprehension? (2) did…
Educational Outcomes and Research from 1:1 Computing Settings
ERIC Educational Resources Information Center
Bebell, Damian; O'Dwyer, Laura M.
2010-01-01
Despite the growing interest in 1:1 computing initiatives, relatively little empirical research has focused on the outcomes of these investments. The current special edition of the Journal of Technology and Assessment presents four empirical studies of K-12 1:1 computing programs and one review of key themes in the conversation about 1:1 computing…
An Empirical Look at Business Students' Attitudes towards Laptop Computers in the Classroom
ERIC Educational Resources Information Center
Dykstra, DeVee E.; Tracy, Daniel L.; Wergin, Rand
2013-01-01
Mobile computing technology has proliferated across university campuses with the goals of enhancing student learning outcomes and making courses more accessible. An increasing amount of research has been conducted about mobile computing's benefits in classroom settings. Yet, the research is still in its infancy. The purpose of this paper is to add…
Tran, Phuoc; Subrahmanyam, Kaveri
2013-01-01
The use of computers in the home has become very common among young children. This paper reviews research on the effects of informal computer use and identifies potential pathways through which computers may impact children's development. Based on the evidence reviewed, we present the following guidelines to arrange informal computer experiences that will promote the development of children's academic, cognitive and social skills: (1) children should be encouraged to use computers for moderate amounts of time (2-3 days a week for an hour or two per day) and (2) children's use of computers should (a) include non-violent action-based computer games as well as educational games, (b) not displace social activities but should instead be arranged to provide opportunities for social engagement with peers and family members and (c) involve content with pro-social and non-violent themes. We conclude the paper with questions that must be addressed in future research. This paper reviews research on the effects of informal computer use on children's academic, cognitive and social skills. Based on the evidence presented, we have presented guidelines to enable parents, teachers and other adults to arrange informal computer experiences so as to maximise their potential benefit for children's development.
Preaching What We Practice: Teaching Ethical Decision-Making to Computer Security Professionals
NASA Astrophysics Data System (ADS)
Fleischmann, Kenneth R.
The biggest challenge facing computer security researchers and professionals is not learning how to make ethical decisions; rather it is learning how to recognize ethical decisions. All too often, technology development suffers from what Langdon Winner terms technological somnambulism - we sleepwalk through our technology design, following past precedents without a second thought, and fail to consider the perspectives of other stakeholders [1]. Computer security research and practice involves a number of opportunities for ethical decisions. For example, decisions about whether or not to automatically provide security updates involve tradeoffs related to caring versus user autonomy. Decisions about online voting include tradeoffs between convenience and security. Finally, decisions about routinely screening e-mails for spam involve tradeoffs of efficiency and privacy. It is critical that these and other decisions facing computer security researchers and professionals are confronted head on as value-laden design decisions, and that computer security researchers and professionals consider the perspectives of various stakeholders in making these decisions.
Heterogeneous Distributed Computing for Computational Aerosciences
NASA Technical Reports Server (NTRS)
Sunderam, Vaidy S.
1998-01-01
The research supported under this award focuses on heterogeneous distributed computing for high-performance applications, with particular emphasis on computational aerosciences. The overall goal of this project was to investigate issues in, and develop solutions to, the efficient execution of computational aeroscience codes in heterogeneous concurrent computing environments. In particular, we worked in the context of the PVM[1] system and, subsequent to detailed conversion efforts and performance benchmarking, devised novel techniques to increase the efficacy of heterogeneous networked environments for computational aerosciences. Our work has been based upon the NAS Parallel Benchmark suite, but has also recently expanded in scope to include the NAS I/O benchmarks as specified in the NHT-1 document. In this report we summarize our research accomplishments under the auspices of the grant.
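One recurring concern in heterogeneous networked environments like the one described is that equal work shares let the slowest host set the pace. The sketch below is not the PVM API and the host names and benchmark scores are invented; it simply illustrates heterogeneity-aware scheduling by splitting loop iterations in proportion to measured node speed.

```python
# Split N iterations across unequal hosts so no machine becomes the straggler.
def split_work(n_items, speeds):
    """Return per-host iteration counts proportional to relative speed."""
    total = sum(speeds.values())
    shares = {h: int(n_items * s / total) for h, s in speeds.items()}
    # Hand any rounding remainder to the fastest host.
    fastest = max(speeds, key=speeds.get)
    shares[fastest] += n_items - sum(shares.values())
    return shares

# Hypothetical benchmark scores for a mixed workstation network.
speeds = {"sparc1": 1.0, "rs6000": 2.5, "sgi": 1.8}
print(split_work(100_000, speeds))   # e.g., {'sparc1': 18867, ...}
```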
ERIC Educational Resources Information Center
Celik, Vehbi; Yesilyurt, Etem
2013-01-01
There is a large body of research regarding computer supported education, perceptions of computer self-efficacy, computer anxiety and the technological attitudes of teachers and teacher candidates. However, no study has been conducted on the correlation between and effect of computer supported education, perceived computer self-efficacy, computer…
NASA Astrophysics Data System (ADS)
Herrick, Gregory Paul
The quest to accurately capture flow phenomena with length-scales both short and long and to accurately represent complex flow phenomena within disparately sized geometry inspires a need for an efficient, high-fidelity, multi-block structured computational fluid dynamics (CFD) parallel computational scheme. This research presents and demonstrates a more efficient computational method by which to perform multi-block structured CFD parallel computational simulations, thus facilitating higher-fidelity solutions of complicated geometries (due to the inclusion of grids for "small" flow areas which are often merely modeled) and their associated flows. This computational framework offers greater flexibility and user-control in allocating the resource balance between process count and wall-clock computation time. The principal modifications implemented in this revision consist of a "multiple grid block per processing core" software infrastructure and an analytic computation of viscous flux Jacobians. The development of this scheme is largely motivated by the desire to simulate axial compressor stall inception with more complete gridding of the flow passages (including rotor tip clearance regions) than has been previously done while maintaining high computational efficiency (i.e., minimal consumption of computational resources), and thus this paradigm shall be demonstrated with an examination of instability in a transonic axial compressor. However, the paradigm presented herein facilitates CFD simulation of myriad previously impractical geometries and flows and is not limited to detailed analyses of axial compressor flows. While the simulations presented herein were technically possible under the previous structure of the subject software, they were much less computationally efficient and thus not pragmatically feasible; the previous research using this software to perform three-dimensional, full-annulus, time-accurate, unsteady, full-stage (with sliding-interface) simulations of rotating stall inception in axial compressors utilized tip clearance periodic models, while the scheme here is demonstrated by a simulation of axial compressor stall inception utilizing gridded rotor tip clearance regions. As will be discussed, much previous research (experimental, theoretical, and computational) has suggested that understanding clearance flow behavior is critical to understanding stall inception, and previous computational research efforts which have used tip clearance models have begged the question, "What about the clearance flows?". This research begins to address that question.
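The "multiple grid block per processing core" idea amounts to a bin-packing problem: given many blocks of very different sizes, assign several blocks to each process so per-process cell counts stay balanced. The sketch below uses a greedy longest-processing-time heuristic; the block names and cell counts are invented, and the subject software's actual mapping strategy may differ.

```python
# Pack grid blocks onto a fixed process count so per-process load balances.
import heapq

def map_blocks(block_sizes, n_procs):
    """Greedy longest-processing-time packing of blocks onto processes."""
    heap = [(0, p, []) for p in range(n_procs)]   # (load, proc id, blocks)
    heapq.heapify(heap)
    for block, size in sorted(block_sizes.items(), key=lambda b: -b[1]):
        load, p, blocks = heapq.heappop(heap)     # least-loaded process
        heapq.heappush(heap, (load + size, p, blocks + [block]))
    return heap

# Hypothetical cell counts: big passage blocks plus many small tip gaps.
sizes = {f"passage{i}": 900_000 for i in range(4)}
sizes.update({f"tip{i}": 40_000 for i in range(12)})
for load, p, blocks in sorted(map_blocks(sizes, 3), key=lambda x: x[1]):
    print(f"rank {p}: {load:>9,} cells  {blocks}")
```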
Computational Toxicology at the US EPA | Science Inventory ...
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, EPA is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in America's air, water, and hazardous-waste sites. The ORD Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the EPA Science to Achieve Results (STAR) program. Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast™) and exposure (ExpoCast™), and creating virtual liver (v-Liver™) and virtual embryo (v-Embryo™) systems models. The models and underlying data are being made publicly available…
Use of PL/1 in a Bibliographic Information Retrieval System.
ERIC Educational Resources Information Center
Schipma, Peter B.; And Others
The Information Sciences section of IIT Research Institute (IITRI) has developed a Computer Search Center and is currently conducting a research project to explore computer searching of a variety of machine-readable databases. The Center provides Selective Dissemination of Information services to academic, industrial, and research organizations…
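Selective Dissemination of Information reduces, at its core, to matching each newly received record against stored user interest profiles. The sketch below is a hypothetical Python reduction of that matching step (record fields and terms are invented), not a reconstruction of IITRI's PL/1 system:

    def sdi_matches(records, profile_terms):
        """Core SDI step: flag new records whose title or keyword
        fields contain any of a user's interest-profile terms."""
        hits = []
        for rec in records:
            text = (rec["title"] + " " + " ".join(rec["keywords"])).lower()
            if any(term.lower() in text for term in profile_terms):
                hits.append(rec["title"])
        return hits

    new_batch = [
        {"title": "Kinetics of ester hydrolysis", "keywords": ["chemistry"]},
        {"title": "Soil nitrogen cycling", "keywords": ["agronomy"]},
    ]
    print(sdi_matches(new_batch, ["kinetics", "catalysis"]))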
Computer-Assisted Analysis of Qualitative Gerontological Research.
ERIC Educational Resources Information Center
Hiemstra, Roger; And Others
1987-01-01
Asserts that qualitative research has great potential for use in gerontological research. Describes QUALOG, a computer-assisted, qualitative data analysis scheme using logic programming developed at Syracuse University. Reviews development of QUALOG and discusses how QUALOG was used to analyze data from a qualitative study of older adult learners.…
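QUALOG itself was built on logic programming; as a loose illustration of rule-driven qualitative coding (the rules, codes, and passages below are invented, and Python stands in for the original logic-programming formulation), consider:

    # Toy rules: if a passage contains a trigger phrase, attach the code.
    RULES = {
        "motivation": ["want to learn", "curious", "goal"],
        "barriers":   ["too far", "cost", "health"],
        "social":     ["friends", "group", "together"],
    }

    def code_passages(passages):
        """Attach qualitative codes to each passage by keyword rules."""
        return [(p, [c for c, triggers in RULES.items()
                     if any(t in p.lower() for t in triggers)])
                for p in passages]

    for passage, codes in code_passages(
            ["I was curious about computers",
             "The class was too far to drive"]):
        print(codes, "<-", passage)

Encoding the coding scheme as explicit rules is what makes the analysis auditable and repeatable, the property the abstract credits to the logic-programming approach.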
Research in progress in applied mathematics, numerical analysis, and computer science
NASA Technical Reports Server (NTRS)
1990-01-01
Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem-solving capabilities in science and engineering, particularly in aeronautics and space.
CSM parallel structural methods research
NASA Technical Reports Server (NTRS)
Storaasli, Olaf O.
1989-01-01
Parallel structural methods, research team activities, advanced-architecture computers for parallel computational structural mechanics (CSM) research, the FLEX/32 multicomputer, a parallel structural analysis testbed, a blade-stiffened aluminum panel with a circular cutout, and the dynamic characteristics of a 60-meter, 54-bay, 3-longeron deployable truss beam are among the topics discussed.
TRAINING AND RESEARCH PROGRAM IN COMPUTER APPLICATIONS.
ERIC Educational Resources Information Center
HUNKA, S.
To make educational researchers and teachers more aware of the values of electronic automation, this article proposes a training-research program using the IBM 360/67 and the IBM 1500 computers. Participants would be selected from (1) post-doctoral and professional university staff members on sabbatical leave whose main interest is educational…
The Possibilities of Transformation: Critical Research and Peter McLaren
ERIC Educational Resources Information Center
Porfilio, Brad J.
2006-01-01
The purpose of this paper is to unveil how Peter McLaren's revolutionary brand of pedagogy, multiculturalism, and research colored my two-year qualitative research study, which unearthed twenty White female future teachers' experiences and perceptions in relationship to computing technology and male-centered computing culture. His ideas positioned…
Research in Distance Education: A System Modeling Approach.
ERIC Educational Resources Information Center
Saba, Farhad; Twitchell, David
1988-01-01
Describes how a computer simulation research method can be used for studying distance education systems. Topics discussed include systems research in distance education; a technique of model development using the System Dynamics approach and DYNAMO simulation language; and a computer simulation of a prototype model. (18 references) (LRW)
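The System Dynamics approach models a system as stocks changed by rate-controlled flows, numerically integrated over time, which is what the DYNAMO language automated. Below is a minimal hypothetical sketch in Python (the enrollment model and all rates are invented for illustration, not taken from the article's prototype):

    def simulate_enrollment(years=10, dt=0.25):
        """Toy System Dynamics model of a distance-education program:
        one stock (enrolled students) fed by an enrollment inflow and
        drained by completion and dropout outflows, integrated by the
        Euler stepping DYNAMO used."""
        enrolled = 500.0          # stock: current students
        recruit_rate = 120.0      # inflow: new students per year
        completion_frac = 0.20    # fraction completing per year
        dropout_frac = 0.08       # fraction dropping out per year
        t = 0.0
        while t < years:
            inflow = recruit_rate
            outflow = enrolled * (completion_frac + dropout_frac)
            enrolled += dt * (inflow - outflow)   # Euler integration step
            t += dt
        return enrolled

    # The stock settles toward inflow/outflow-fraction = 120/0.28, about 428.6.
    print(round(simulate_enrollment(), 1))

Running such a model repeatedly under varied assumptions, rather than collecting new field data each time, is the research method the abstract describes.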
Computational fluid dynamics research
NASA Technical Reports Server (NTRS)
Chandra, Suresh; Jones, Kenneth; Hassan, Hassan; Mcrae, David Scott
1992-01-01
The focus of research in the computational fluid dynamics (CFD) area is twofold: (1) to develop new approaches to turbulence modeling so that high-speed compressible flows can be studied for applications to entry and re-entry flows; and (2) to perform research to improve CFD algorithm accuracy and efficiency for high-speed flows. Research activities, faculty and student participation, publications, and financial information are outlined.
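As context for the algorithm accuracy-and-efficiency theme (this example is generic, not taken from the cited research), a first-order Lax-Friedrichs update of the one-dimensional compressible Euler equations shows the kind of simple, diffusive baseline scheme that such work seeks to improve upon:

    import numpy as np

    def lax_friedrichs_step(U, dx, dt, gamma=1.4):
        """One Lax-Friedrichs update of the 1D compressible Euler equations.
        U has shape (3, N) with rows [rho, rho*u, E]."""
        rho, mom, E = U
        u = mom / rho
        p = (gamma - 1.0) * (E - 0.5 * rho * u**2)
        F = np.array([mom, mom * u + p, u * (E + p)])   # flux vector
        Unew = U.copy()
        Unew[:, 1:-1] = 0.5 * (U[:, 2:] + U[:, :-2]) \
                        - 0.5 * (dt / dx) * (F[:, 2:] - F[:, :-2])
        return Unew

    # Sod-type shock-tube initial data on 200 cells (gas at rest, so E = p/0.4)
    N, dx = 200, 1.0 / 200
    rho0 = np.where(np.arange(N) < N // 2, 1.0, 0.125)
    p0 = np.where(np.arange(N) < N // 2, 1.0, 0.1)
    U = np.array([rho0, np.zeros(N), p0 / 0.4])
    for _ in range(100):
        U = lax_friedrichs_step(U, dx, 0.0005)
    print(U[0, 95:105].round(3))   # density across the developing wave system

First-order schemes like this smear shocks over many cells; higher-order, less dissipative methods recover sharp features at the cost of complexity, which is exactly the accuracy-versus-efficiency trade-off named in the abstract.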