Computational Aspects of Data Assimilation and the ESMF
NASA Technical Reports Server (NTRS)
daSilva, A.
2003-01-01
Developing advanced data assimilation applications is a daunting scientific challenge. Independently developed components may have incompatible interfaces or may be written in different computer languages. The high-performance computing (HPC) platforms required by numerically intensive Earth system applications are themselves complex, varied, rapidly evolving, multi-part systems. Since the market for high-end platforms is relatively small, there is little robust middleware available to buffer the modeler from the difficulties of HPC programming. To complicate matters further, the collaborations required to develop large Earth system applications often span initiatives, institutions and agencies, involve geoscience, software engineering, and computer science communities, and cross national borders. The Earth System Modeling Framework (ESMF) project is a concerted response to these challenges. Its goal is to increase software reuse, interoperability, ease of use and performance in Earth system models through the use of a common software framework, developed in an open manner by leaders in the modeling community. The ESMF addresses the technical, and to some extent the cultural, aspects of Earth system modeling, laying the groundwork for addressing the more difficult scientific aspects, such as the physical compatibility of components, in the future. In this talk we will discuss the general philosophy and architecture of the ESMF, focusing on those capabilities useful for developing advanced data assimilation applications.
Evaluation of Screening for Retinopathy of Prematurity by ROPtool or a Lay Reader.
Abbey, Ashkan M; Besirli, Cagri G; Musch, David C; Andrews, Chris A; Capone, Antonio; Drenser, Kimberly A; Wallace, David K; Ostmo, Susan; Chiang, Michael; Lee, Paul P; Trese, Michael T
2016-02-01
To determine if (1) tortuosity assessment by a computer program (ROPtool, developed at the University of North Carolina, Chapel Hill, and Duke University, and licensed by FocusROP) that traces retinal blood vessels and (2) assessment by a lay reader are comparable with assessment by a panel of 3 retinopathy of prematurity (ROP) experts for remote clinical grading of vascular abnormalities such as plus disease. Validity and reliability analysis of diagnostic tools. Three hundred thirty-five fundus images of prematurely born infants were obtained by neonatal intensive care unit nurses. A panel of 3 ROP experts graded 84 images showing vascular dilatation, tortuosity, or both and 251 images showing no evidence of vascular abnormalities. These images were sent electronically to an experienced lay reader who independently graded them for vascular abnormalities. The images also were analyzed using ROPtool, which assigns a numerical value to the level of vascular abnormality and tortuosity present in each of 4 quadrants or sectors. The ROPtool measurements of vascular abnormalities were graded and compared with expert panel grades with a receiver operating characteristic (ROC) curve. Grades between human readers were cross-tabulated. The area under the ROC curve was calculated for ROPtool, and sensitivity and specificity were computed for the lay reader. Measurements of vascular abnormalities by ROPtool and grading of vascular abnormalities by 3 ROP experts and 1 experienced lay reader. The ROC curve for ROPtool's tortuosity assessment had an area under the curve of 0.917. Using a threshold value of 4.97 for the second most tortuous quadrant, ROPtool's sensitivity was 91% and its specificity was 82%. Lay reader sensitivity and specificity were 99% and 73%, respectively, and had high reliability (κ, 0.87) in repeated measurements. ROPtool had very good accuracy for detection of vascular abnormalities suggestive of plus disease when compared with expert physician graders. The lay reader's results showed excellent sensitivity and good specificity when compared with those of the expert graders. These options for remote reading of images to detect vascular abnormalities deserve consideration in the quest to use telemedicine with remote reading for efficient delivery of high-quality care and to detect infants requiring bedside examination. Copyright © 2016 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
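The accuracy figures above follow standard ROC methodology. As a minimal illustration, assuming per-image tortuosity scores and binary expert labels are available as arrays (the data below are invented, not the study's), the AUC and the sensitivity/specificity at a fixed threshold could be computed like this:

```python
# Minimal ROC / operating-point sketch for a continuous tortuosity score
# against binary expert labels. Data layout and values are illustrative only.
import numpy as np
from sklearn.metrics import roc_auc_score

expert_label = np.array([1, 1, 0, 0, 1, 0, 0, 1])                   # 1 = abnormal per expert panel
roptool_score = np.array([6.1, 5.3, 2.0, 4.1, 7.4, 3.2, 1.8, 5.0])  # score for 2nd most tortuous quadrant

auc = roc_auc_score(expert_label, roptool_score)  # area under the ROC curve

threshold = 4.97                                   # operating point reported in the abstract
pred = roptool_score >= threshold
sensitivity = np.mean(pred[expert_label == 1])     # true-positive rate
specificity = np.mean(~pred[expert_label == 0])    # true-negative rate
print(f"AUC={auc:.3f}  sens={sensitivity:.2f}  spec={specificity:.2f}")
```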
NASA Astrophysics Data System (ADS)
Trani, L.; Spinuso, A.; Galea, M.; Atkinson, M.; Van Eck, T.; Vilotte, J.
2011-12-01
The data bonanza generated by today's digital revolution is forcing scientists to rethink their methodologies and working practices. Traditional approaches to knowledge discovery are pushed to their limit and struggle to keep pace with the data flows produced by modern systems. This work shows how the ADMIRE data-intensive architecture supports seismologists by enabling them to focus on their scientific goals and questions, abstracting away the underlying technology platform that enacts their data integration and analysis tasks. ADMIRE accomplishes this partly by recognizing three different types of expert whose interactions require clearly defined interfaces: the domain expert, who is the application specialist; the data-analysis expert, who is a specialist in extracting information from data; and the data-intensive engineer, who develops the infrastructure for data-intensive computation. In order to provide a context in which each category of expert may flourish, ADMIRE uses a three-level architecture: the upper (tool) level supports the work of both domain and data-analysis experts, housing an extensive and evolving set of portals, tools and development environments; the lower (enactment) level houses a large and dynamic community of providers delivering data and data-intensive enactment environments as an evolving infrastructure that supports all of the work underway in the upper layer, and most data-intensive engineers work here; the crucial innovation lies in the middle level, a gateway that is a tightly defined and stable interface through which the two diverse and dynamic upper and lower layers communicate. This is a minimal and simple protocol and language (DISPEL), ultimately to be controlled by standards, so that the upper and lower communities may invest, secure in the knowledge that changes in this interface will be carefully managed. We implemented a well-established procedure for processing seismic ambient noise on the prototype architecture. The primary goal was to evaluate its capabilities for large-scale integration and analysis of distributed data. A secondary goal was to gauge its potential and the added value that it might bring to the seismological community. Though still in its infancy, the architecture met the demands of our use case and promises to cater for our future requirements. We shall continue to develop its capabilities as part of an EU-funded project, VERCE (Virtual Earthquake and Seismology Research Community for Europe). VERCE aims to significantly advance our understanding of the Earth in order to aid society in its management of natural resources and hazards. Its strategy is to enable seismologists to fully exploit the under-utilized wealth of seismic data, and key to this is a data-intensive computation framework adapted to the scale and diversity of the community. This is a first step in building a data-intensive highway for geoscientists, smoothing their travel from the primary sources of data to new insights and rapid delivery of actionable information.
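As a rough illustration of the three-level pattern described above (not the actual DISPEL language or any ADMIRE API; all names here are hypothetical), a narrow gateway interface separating tool-level requests from enactment-level backends might look like this:

```python
# Schematic three-layer pattern: tools submit a request document through a
# narrow gateway interface; an enactment backend executes it. Names are
# hypothetical and do not reflect DISPEL or real ADMIRE components.
from typing import Callable, Dict

class Gateway:
    """Minimal, stable interface between the tool level and the enactment level."""
    def __init__(self) -> None:
        self._backends: Dict[str, Callable[[dict], dict]] = {}

    def register_backend(self, name: str, runner: Callable[[dict], dict]) -> None:
        self._backends[name] = runner  # data-intensive engineers plug in here

    def submit(self, request: dict) -> dict:
        # The request document is the only thing the two layers share.
        return self._backends[request["backend"]](request)

# Enactment level: a toy "workflow" runner.
def noise_correlation_runner(request: dict) -> dict:
    return {"status": "done", "stations": request["stations"]}

gw = Gateway()
gw.register_backend("seismic", noise_correlation_runner)

# Tool level: a domain expert's portal builds a request and submits it.
print(gw.submit({"backend": "seismic", "stations": ["ST01", "ST02"]}))
```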
ERIC Educational Resources Information Center
Office of Education (DHEW), Washington, DC.
A conference, held in Washington, D. C., in 1967 by the Association for Educational Data Systems and the U.S. Office of Education, attempted to lay the groundwork for an efficient automatic data processing training program for the Federal Government utilizing new instructional methodologies. The rapid growth of computer applications and computer…
Effects of colored light-emitting diode illumination on behavior and performance of laying hens.
Huber-Eicher, B; Suter, A; Spring-Stähli, P
2013-04-01
The best method for lighting poultry houses has been an issue for many decades, generating much interest in any new systems that become available. Poultry farmers are now increasingly using colored LEDs (light-emitting diodes) to illuminate hen houses (e.g., in Germany, Austria, the Netherlands, and England). In Switzerland, all newly installed systems are now equipped with LEDs, preferably green ones. LEDs emit monochromatic light at different wavelengths and have several advantages over conventional illuminants, including high energy efficiency, long life, high reliability, and low maintenance costs. The following study examines the effects of illumination with white, red, and green LEDs on behavior and production parameters of laying hens. Light intensities in the 3 treatments were adjusted to be perceived by hens as equal. Twenty-four groups of 25 laying hens were kept in identical compartments (5.0 × 3.3 m) equipped with a litter area, raised perches, feed and drinking facilities, and nest boxes. Initially, they were kept under white LEDs for a 2-wk adaptation period. For the next 4 wk, 8 randomly chosen compartments were lit with red LEDs (640 nm) and 8 others with green LEDs (520 nm). Behavior was monitored during the last 2 wk of the trial. Additionally, weight gain, feed consumption, onset of lay, and laying performance were recorded. The results showed minor effects of green light on explorative behavior, whereas red light reduced aggressiveness compared with white light. The accelerating effect of red light on sexual development of laying hens was confirmed, and the trial demonstrated that this effect was due to the specific wavelength and not the intensity of light. However, an additional effect of light intensity may exist and should not be excluded.
UMAMI: A Recipe for Generating Meaningful Metrics through Holistic I/O Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lockwood, Glenn K.; Yoo, Wucherl; Byna, Suren
I/O efficiency is essential to productivity in scientific computing, especially as many scientific domains become more data-intensive. Many characterization tools have been used to elucidate specific aspects of parallel I/O performance, but analyzing components of complex I/O subsystems in isolation fails to provide insight into critical questions: how do the I/O components interact, what are reasonable expectations for application performance, and what are the underlying causes of I/O performance problems? To address these questions while capitalizing on existing component-level characterization tools, we propose an approach that combines on-demand, modular synthesis of I/O characterization data into a unified monitoring and metrics interface (UMAMI) to provide a normalized, holistic view of I/O behavior. We evaluate the feasibility of this approach by applying it to a month-long benchmarking study on two distinct large-scale computing platforms. We present three case studies that highlight the importance of analyzing application I/O performance in context with both contemporaneous and historical component metrics, and we provide new insights into the factors affecting I/O performance. By demonstrating the generality of our approach, we lay the groundwork for a production-grade framework for holistic I/O analysis.
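As a sketch of the underlying idea, assuming per-component metrics are already collected as simple tables (column names and data are illustrative, not UMAMI's real schema), aligning and normalizing them into one holistic view could look like this:

```python
# Sketch of the core idea: align metrics from several I/O components on a
# common time axis and normalize them so one table gives a holistic view.
# Component and column names are illustrative only.
import pandas as pd

app = pd.DataFrame({"day": [1, 2, 3], "app_GiB_per_s": [5.2, 3.1, 4.8]})
fs  = pd.DataFrame({"day": [1, 2, 3], "fs_fullness_pct": [61, 74, 78]})
net = pd.DataFrame({"day": [1, 2, 3], "server_load": [0.4, 0.9, 0.5]})

merged = app.merge(fs, on="day").merge(net, on="day")

# Normalize each metric against its own history (z-score) so that very
# different units become comparable in a single dashboard-style view.
metrics = merged.drop(columns="day")
normalized = (metrics - metrics.mean()) / metrics.std()
print(pd.concat([merged["day"], normalized], axis=1))
```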
Applications in Data-Intensive Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shah, Anuj R.; Adkins, Joshua N.; Baxter, Douglas J.
2010-04-01
This book chapter, to be published in Advances in Computers, Volume 78, in 2010, describes applications of data-intensive computing (DIC). It is an invited chapter resulting from a previous publication on DIC, and it summarizes efforts coming out of PNNL's Data Intensive Computing Initiative. Advances in technology have empowered individuals with the ability to generate digital content with mouse clicks and voice commands. Digital pictures, emails, text messages, home videos, audio, and webpages are common examples of digital content that are generated on a regular basis. Data-intensive computing facilitates human understanding of complex problems. Data-intensive applications provide timely and meaningful analytical results in response to exponentially growing data complexity and associated analysis requirements through the development of new classes of software, algorithms, and hardware.
McClinchy, Jane; Dickinson, Angela; Barron, Duncan; Thomas, Hilary
2011-12-01
In primary care, leaflets are often used to communicate health information. Increasingly, primary healthcare practitioners need to provide dietary advice. There is limited research exploring how nutrition information leaflets are used in primary care. The present study explored practitioner and lay experiences with respect to providing and receiving nutrition information in primary care, focusing in particular on the use of leaflets for nutrition information. A qualitative design was used incorporating focus groups with 57 practitioners based at seven general practitioner practices and a purposive sample of 30 lay participants attending six Consumer Health Organisations within one primary care trust. Focus groups were taped and transcribed verbatim and data were analysed thematically, assisted by computer software n6® (QSR International Pty Ltd, Melbourne, Australia). Practitioners discussed barriers to giving nutritional advice, access to leaflets, lay receptiveness to advice and their perceptions about the value of leaflets to lay people. Food was not considered in terms of its nutritional components by lay participants and the need for nutritional information was not perceived to be relevant until they had received a medical diagnosis. Lay participants discussed the importance of receiving nutritional advice relating to their medical diagnosis and the altered status of written information that was delivered personally. Practitioner and lay groups suggested improvements to ensure that nutritional advice be supported by relevant and appropriate written information. This research has underlined the continuing importance of nutrition information leaflets and concludes that there is particular value in involving lay participants in the development of nutrition information leaflets. © 2011 The Authors. Journal of Human Nutrition and Dietetics © 2011 The British Dietetic Association Ltd.
Enhancing Lay Counselor Capacity to Improve Patient Outcomes with Multimedia Technology.
Robbins, Reuben N; Mellins, Claude A; Leu, Cheng-Shiun; Rowe, Jessica; Warne, Patricia; Abrams, Elaine J; Witte, Susan; Stein, Dan J; Remien, Robert H
2015-06-01
Multimedia technologies offer powerful tools to increase capacity of health workers to deliver standardized, effective, and engaging antiretroviral medication adherence counseling. Masivukeni is an innovative multimedia-based, computer-driven, lay counselor-delivered intervention designed to help people living with HIV in resource-limited settings achieve optimal adherence. This pilot study examined medication adherence and key psychosocial outcomes among 55 non-adherent South African HIV+ patients, on antiretroviral therapy (ART) for at least 6 months, who were randomized to receive either Masivukeni or standard of care (SOC) counseling for ART non-adherence. At baseline, there were no significant differences between the SOC and Masivukeni groups on any outcome variables. At post-intervention (approximately 5-6 weeks after baseline), clinic-based pill count adherence data available for 20 participants (10 per intervention arm) showed a 10% improvement for Masivukeni participants and a decrease of 8% for SOC participants. Masivukeni participants reported significantly more positive attitudes towards disclosure and medication social support, less social rejection, and better clinic-patient relationships than did SOC participants. Masivukeni shows promise to promote optimal adherence and provides preliminary evidence that multimedia, computer-based technology can help lay counselors offer better adherence counseling than standard approaches.
Enhancing Lay Counselor Capacity to Improve Patient Outcomes with Multimedia Technology
Robbins, Reuben N.; Mellins, Claude A.; Leu, Cheng-Shiun; Rowe, Jessica; Warne, Patricia; Abrams, Elaine J.; Witte, Susan; Stein, Dan J.; Remien, Robert H.
2015-01-01
Multimedia technologies offer powerful tools to increase capacity of health workers to deliver standardized, effective, and engaging antiretroviral medication adherence counseling. Masivukeni is an innovative multimedia-based, computer-driven, lay counselor-delivered intervention designed to help people living with HIV in resource-limited settings achieve optimal adherence. This pilot study examined medication adherence and key psychosocial outcomes among 55 non-adherent South African HIV+ patients, on ART for at least 6 months, who were randomized to receive either Masivukeni or standard of care (SOC) counseling for ART non-adherence. At baseline, there were no significant differences between the SOC and Masivukeni groups on any outcome variables. At post-intervention (approximately 5–6 weeks after baseline), clinic-based pill count adherence data available for 20 participants (10 per intervention arm) showed a 10% improvement for Masivukeni participants and a decrease of 8% for SOC participants. Masivukeni participants reported significantly more positive attitudes towards disclosure and medication social support, less social rejection, and better clinic-patient relationships than did SOC participants. Masivukeni shows promise to promote optimal adherence and provides preliminary evidence that multimedia, computer-based technology can help lay counselors offer better adherence counseling than standard approaches. PMID:25566763
Lay Navigator Model for Impacting Cancer Health Disparities
Meade, Cathy D.; Wells, Kristen J.; Arevalo, Mariana; Calcano, Ercilia R.; Rivera, Marlene; Sarmiento, Yolanda; Freeman, Harold P.; Roetzheim, Richard G.
2014-01-01
This paper recounts experiences, challenges, and lessons learned when implementing a lay patient navigator program to improve cancer care among medically underserved patients who presented in a primary care clinic with a breast or colorectal cancer abnormality. The program employed five lay navigators to navigate 588 patients. Central programmatic elements were: 1) use of bilingual lay navigators with familiarity of communities they served; 2) provision of training, education and supportive activities; 3) multidisciplinary clinical oversight that factored in caseload intensity; and 4) well-developed partnerships with community clinics and social service entities. Deconstruction of health care system information was fundamental to navigation processes. We conclude that a lay model of navigation is well suited to assist patients through complex health care systems; however, a stepped care model that includes both lay and professional navigation may be optimal to help patients across the entire continuum. PMID:24683043
Lay navigator model for impacting cancer health disparities.
Meade, Cathy D; Wells, Kristen J; Arevalo, Mariana; Calcano, Ercilia R; Rivera, Marlene; Sarmiento, Yolanda; Freeman, Harold P; Roetzheim, Richard G
2014-09-01
This paper recounts experiences, challenges, and lessons learned when implementing a lay patient navigator program to improve cancer care among medically underserved patients who presented in a primary care clinic with a breast or colorectal cancer abnormality. The program employed five lay navigators to navigate 588 patients. Central programmatic elements were the following: (1) use of bilingual lay navigators with familiarity of communities they served; (2) provision of training, education, and supportive activities; (3) multidisciplinary clinical oversight that factored in caseload intensity; and (4) well-developed partnerships with community clinics and social service entities. Deconstruction of healthcare system information was fundamental to navigation processes. We conclude that a lay model of navigation is well suited to assist patients through complex healthcare systems; however, a stepped care model that includes both lay and professional navigation may be optimal to help patients across the entire continuum.
Vaugoyeau, Marie; Adriaensen, Frank; Artemyev, Alexandr; Bańbura, Jerzy; Barba, Emilio; Biard, Clotilde; Blondel, Jacques; Bouslama, Zihad; Bouvier, Jean-Charles; Camprodon, Jordi; Cecere, Francesco; Charmantier, Anne; Charter, Motti; Cichoń, Mariusz; Cusimano, Camillo; Czeszczewik, Dorota; Demeyrier, Virginie; Doligez, Blandine; Doutrelant, Claire; Dubiec, Anna; Eens, Marcel; Eeva, Tapio; Faivre, Bruno; Ferns, Peter N; Forsman, Jukka T; García-Del-Rey, Eduardo; Goldshtein, Aya; Goodenough, Anne E; Gosler, Andrew G; Grégoire, Arnaud; Gustafsson, Lars; Harnist, Iga; Hartley, Ian R; Heeb, Philipp; Hinsley, Shelley A; Isenmann, Paul; Jacob, Staffan; Juškaitis, Rimvydas; Korpimäki, Erkki; Krams, Indrikis; Laaksonen, Toni; Lambrechts, Marcel M; Leclercq, Bernard; Lehikoinen, Esa; Loukola, Olli; Lundberg, Arne; Mainwaring, Mark C; Mänd, Raivo; Massa, Bruno; Mazgajski, Tomasz D; Merino, Santiago; Mitrus, Cezary; Mönkkönen, Mikko; Morin, Xavier; Nager, Ruedi G; Nilsson, Jan-Åke; Nilsson, Sven G; Norte, Ana C; Orell, Markku; Perret, Philippe; Perrins, Christopher M; Pimentel, Carla S; Pinxten, Rianne; Richner, Heinz; Robles, Hugo; Rytkönen, Seppo; Senar, Juan Carlos; Seppänen, Janne T; Pascoal da Silva, Luis; Slagsvold, Tore; Solonen, Tapio; Sorace, Alberto; Stenning, Martyn J; Tryjanowski, Piotr; von Numers, Mikael; Walankiewicz, Wieslaw; Møller, Anders Pape
2016-08-01
The increase in size of human populations in urban and agricultural areas has resulted in considerable habitat conversion globally. Such anthropogenic areas have specific environmental characteristics, which influence the physiology, life history, and population dynamics of plants and animals. For example, the date of bud burst is advanced in urban compared to nearby natural areas. In some birds, breeding success is determined by synchrony between timing of breeding and peak food abundance. Pertinently, caterpillars are an important food source for the nestlings of many bird species, and their abundance is influenced by environmental factors such as temperature and date of bud burst. Higher temperatures and advanced date of bud burst in urban areas could advance peak caterpillar abundance and thus affect breeding phenology of birds. To test whether laying dates advance and clutch sizes decrease with the intensity of urbanization, we analyzed the timing of breeding and clutch size in relation to intensity of urbanization as a measure of human impact in 199 nest box plots across Europe, North Africa, and the Middle East (i.e., the Western Palearctic) for four species of hole-nesters: blue tits (Cyanistes caeruleus), great tits (Parus major), collared flycatchers (Ficedula albicollis), and pied flycatchers (Ficedula hypoleuca). We estimated the intensity of urbanization as the density of buildings surrounding study plots measured on orthophotographs. For the four study species, the intensity of urbanization was not correlated with laying date. Clutch size in blue and great tits did not seem to be affected by the intensity of urbanization, while in collared and pied flycatchers it decreased with increasing intensity of urbanization. This is the first large-scale study showing a major, species-specific correlation between intensity of urbanization and the ecology of breeding. The underlying mechanisms for the relationships between life history and urbanization remain to be determined. We propose that effects of food abundance or quality, temperature, noise, pollution, or disturbance by humans may on their own or in combination affect laying date and/or clutch size.
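A minimal sketch of the kind of species-specific analysis described, assuming simulated data and illustrative variable names rather than the study's actual dataset or mixed-model structure:

```python
# Illustrative regression of clutch size on urbanization intensity with a
# species interaction, mirroring the type of analysis described above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "building_density": rng.uniform(0, 1, n),  # proxy for urbanization intensity
    "species": rng.choice(["blue_tit", "pied_flycatcher"], n),
})
# Simulated response: flycatcher clutch size declines with urbanization.
df["clutch_size"] = (8 - 2 * df["building_density"] * (df["species"] == "pied_flycatcher")
                     + rng.normal(0, 0.5, n))

# Species-specific slopes via an interaction term.
model = smf.ols("clutch_size ~ building_density * species", data=df).fit()
print(model.params)
```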
Science & Technology Review: September 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vogt, Ramona L.; Meissner, Caryn N.; Chinn, Ken B.
2016-09-30
This is the September issue of the Lawrence Livermore National Laboratory's Science & Technology Review, which communicates, to a broad audience, the Laboratory’s scientific and technological accomplishments in fulfilling its primary missions. This month, there are features on "Laboratory Investments Drive Computational Advances" and "Laying the Groundwork for Extreme-Scale Computing." Research highlights include "Nuclear Data Moves into the 21st Century", "Peering into the Future of Lick Observatory", and "Facility Drives Hydrogen Vehicle Innovations."
NASA Astrophysics Data System (ADS)
Moore, S. L.; Kar, A.; Gomez, R.
2015-12-01
A partnership between Fort Valley State University (FVSU), the Jackson School of Geosciences at The University of Texas (UT) at Austin, and the Texas Advanced Computing Center (TACC) is engaging computational geoscience faculty and researchers with academically talented underrepresented minority (URM) students, training them to solve some of the world's most challenging geoscience grand challenges, which require data-intensive, large-scale modeling and simulation on high-performance computers. UT Austin's geoscience outreach program GeoFORCE, recently awarded the Presidential Award for Excellence in Science, Mathematics and Engineering Mentoring, contributes to the collaborative best practices in engaging researchers with URM students. Collaborative efforts over the past decade are providing data demonstrating that integrative pipeline programs with mentoring and paid internship opportunities, multi-year scholarships, computational training, and communication skills development are having an impact on URMs developing middle skills for geoscience careers. Since 1997, the Cooperative Developmental Energy Program at FVSU and its collaborating universities have graduated 87 engineers, 33 geoscientists, and eight health physicists. Recruited as early as high school, students enroll for three years at FVSU majoring in mathematics, chemistry or biology, and then transfer to UT Austin or other partner institutions to complete a second STEM degree, including geosciences. A partnership with the Integrative Computational Education and Research Traineeship (ICERT), a National Science Foundation (NSF) Research Experience for Undergraduates (REU) Site at TACC, provides students with a 10-week summer research experience at UT Austin. Mentored by TACC researchers, students with no previous background in computational science learn to use some of the world's most powerful high-performance computing resources to address a grand geosciences problem. Students increase their ability to understand and explain the societal impact of their research and communicate the research to multidisciplinary and lay audiences via near-peer mentoring, poster presentations, and publication opportunities.
Data intensive computing at Sandia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Andrew T.
2010-09-01
Data-intensive computing is parallel computing where you design your algorithms and your software around efficient access and traversal of a data set, and where hardware requirements are dictated by data size as much as by desired run times, usually distilling compact results from massive data.
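A toy sketch of that pattern, streaming over data in chunks and keeping only a compact summary (purely illustrative, not Sandia's software):

```python
# Toy illustration of the "distill compact results from massive data" pattern:
# stream over the data in chunks and keep only a small running summary.
import numpy as np

def chunked_summary(chunks):
    count, total, maximum = 0, 0.0, float("-inf")
    for chunk in chunks:               # each chunk could come from disk or the network
        count += chunk.size
        total += chunk.sum()
        maximum = max(maximum, chunk.max())
    return {"n": count, "mean": total / count, "max": maximum}

# Simulated "massive" dataset delivered as chunks.
rng = np.random.default_rng(1)
print(chunked_summary(rng.normal(size=10_000) for _ in range(100)))
```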
Health and Welfare in Dutch Organic Laying Hens.
Bestman, Monique; Wagenaar, Jan-Paul
2014-06-20
From 2007 to 2008, data on animal health and welfare and farm management during rearing and laying periods were collected from 49 flocks of organic laying hens in the Netherlands. Our aim was to investigate how organic egg farms performed in terms of animal health and welfare and which farm factors affected this performance. The flocks in our study were kept on farms with 34 to 25,000 hens (average 9,300 hens). Seventy-one percent of the flocks consisted of 'silver hybrids': white hens that lay brown eggs. Fifty-five percent of the flocks were kept in floor-based housing and 45% of the flocks in aviaries. No relation was found between the amount of time spent outdoors during the laying period and mortality at 60 weeks. Flocks that used their outdoor run more intensively had better feather scores. In 40% of the flocks there was mortality caused by predators. The average feed intake was 129 g/day at 30 weeks and 133 g/day at 60 weeks of age. The average percentage of mislaid eggs decreased from three at 30 weeks to two at 60 weeks. The average mortality was 7.8% at 60 weeks. Twenty-five percent of the flocks were not treated for worms in their first 50 weeks. Flubenol© was applied to the flocks that were treated. Ten percent of the flocks followed the Flubenol© instructions for use and were wormed five or more times. The remaining 65% were treated irregularly between one and four times. Sixty-eight percent of the flocks showed little or no feather damage, 24% showed moderate damage and 8% showed severe damage. The feather score was better if the hens used the free-range area more intensely, if the laying percentage at 60 weeks was higher, and if they were allowed to go outside sooner after arrival on the laying farm. In 69% of the flocks, hens had peck wounds in the vent area: on average this was 18% of the hens. Keel bone deformations were found in all flocks, on average in 21% of the birds. In 78% of the flocks, an average of 13% of the hens had foot-sole wounds, mostly a small crust. Combs were darker in flocks that used the range area more intensively. More fearful flocks had lighter combs. We conclude that organic farms are potentially more animal friendly than other poultry systems based on the animal welfare benefits of the free-range areas. However, we also observed mortality rates, internal parasites, keel bone deformities, and foot-sole lesions on organic farms that were comparable to or worse than in other husbandry systems. It is unclear whether these 'remaining' problems can be attributed to housing or if they are the result of keeping highly productive genotypes in an artificial environment. Organic farms use the same highly productive genotypes as other husbandry systems.
Promoting soft mast for wildlife in intensively managed forests
John J. Stransky; John H. Roese
1984-01-01
The fruit of woody plants is important as food for wildlife (Martin et al. 1951, Lay 1965). The relation of fruit production to southern forest stand conditions has been explored in only a few studies. Fruit production is greater in forest clearings than in closed forest stands (Lay 1966, Halls and Alcaniz 1968). In Georgia slash pine (Pinus elliottii...
Kashani, Alireza G; Olsen, Michael J; Parrish, Christopher E; Wilson, Nicholas
2015-11-06
In addition to precise 3D coordinates, most light detection and ranging (LIDAR) systems also record "intensity", loosely defined as the strength of the backscattered echo for each measured point. To date, LIDAR intensity data have proven beneficial in a wide range of applications because they are related to surface parameters, such as reflectance. While numerous procedures have been introduced in the scientific literature, and even commercial software, to enhance the utility of intensity data through a variety of "normalization", "correction", or "calibration" techniques, the current situation is complicated by a lack of standardization, as well as confusing, inconsistent use of terminology. In this paper, we first provide an overview of basic principles of LIDAR intensity measurements and applications utilizing intensity information from terrestrial, airborne topographic, and airborne bathymetric LIDAR. Next, we review effective parameters on intensity measurements, basic theory, and current intensity processing methods. We define terminology adopted from the most commonly-used conventions based on a review of current literature. Finally, we identify topics in need of further research. Ultimately, the presented information helps lay the foundation for future standards and specifications for LIDAR radiometric calibration.
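As a hedged illustration of one commonly used simplification, correcting raw intensity for range (spherical spreading) and incidence angle; this is not a full radiometric calibration and the parameter values are invented:

```python
# A minimal sketch of one widely used intensity normalization: correct raw
# return intensity for range and incidence angle. Simplified model only;
# real calibration workflows account for many more effects.
import numpy as np

def normalize_intensity(i_raw, range_m, incidence_rad, ref_range_m=1000.0):
    # Radar-equation-style range correction plus cosine incidence correction.
    return i_raw * (range_m / ref_range_m) ** 2 / np.cos(incidence_rad)

i_raw = np.array([812.0, 640.0, 955.0])       # raw intensity values (toy data)
range_m = np.array([950.0, 1400.0, 700.0])    # sensor-to-target range in meters
incidence = np.deg2rad([10.0, 35.0, 5.0])     # incidence angles
print(normalize_intensity(i_raw, range_m, incidence))
```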
BUILD: A community development simulation game, appendix A
NASA Technical Reports Server (NTRS)
Orlando, J. A.; Pennington, A. J.
1973-01-01
The computer-based urban decision-making game BUILD is described. BUILD is aimed at: (1) allowing maximum expression of value positions by participants through resolution of intense, task-oriented conflicts; (2) heuristically gathering information on both the technical and social functioning of the city through feedback from participants; (3) providing community participants with access to technical expertise in urban decision making, and exposing professionals to the value positions of the community; and (4) laying the groundwork for eventual development of an actual policy-making tool. A brief description of the roles, sample input/output formats, an initial scenario, and information on accessing the game through a time-sharing system are included.
Selecting reference cities for i-Tree Streets
E.G. McPherson
2010-01-01
The i-Tree Streets (formerly STRATUM) computer program quantifies municipal forest structure, function, and value using tree growth and geographic data from sixteen U.S. reference cities, one for each of sixteen climate zones. Selecting the reference city that best matches a subject city is problematic when the subject city is outside the U.S., lies on the border...
Resource Use and Medicare Costs During Lay Navigation for Geriatric Patients With Cancer.
Rocque, Gabrielle B; Pisu, Maria; Jackson, Bradford E; Kvale, Elizabeth A; Demark-Wahnefried, Wendy; Martin, Michelle Y; Meneses, Karen; Li, Yufeng; Taylor, Richard A; Acemgil, Aras; Williams, Courtney P; Lisovicz, Nedra; Fouad, Mona; Kenzik, Kelly M; Partridge, Edward E
2017-06-01
Lay navigators in the Patient Care Connect Program support patients with cancer from diagnosis through survivorship to end of life. They empower patients to engage in their health care and navigate them through the increasingly complex health care system. Navigation programs can improve access to care, enhance coordination of care, and overcome barriers to timely, high-quality health care. However, few data exist regarding the financial implications of implementing a lay navigation program. To examine the influence of lay navigation on health care spending and resource use among geriatric patients with cancer within The University of Alabama at Birmingham Health System Cancer Community Network. This observational study from January 1, 2012, through December 31, 2015, used propensity score-matched regression analysis to compare quarterly changes in the mean total Medicare costs and resource use between navigated patients and nonnavigated, matched comparison patients. The setting was The University of Alabama at Birmingham Health System Cancer Community Network, which includes 2 academic and 10 community cancer centers across Alabama, Georgia, Florida, Mississippi, and Tennessee. Participants were Medicare beneficiaries with cancer who received care at participating institutions from 2012 through 2015. The primary exposure was contact with a patient navigator. Navigated patients were matched to nonnavigated patients on age, race, sex, cancer acuity (high vs low), comorbidity score, and preenrollment characteristics (costs, emergency department visits, hospitalizations, intensive care unit admissions, and chemotherapy in the preenrollment quarter). Total costs to Medicare, components of cost, and resource use (emergency department visits, hospitalizations, and intensive care unit admissions). In total, 12 428 patients (mean (SD) age at cancer diagnosis, 75 (7) years; 52.0% female) were propensity score matched, including 6214 patients in the navigated group and 6214 patients in the matched nonnavigated comparison group. Compared with the matched comparison group, the mean total costs declined by $781.29 more per quarter per navigated patient (β = -781.29, SE = 45.77, P < .001), for an estimated $19 million decline per year across the network. Inpatient and outpatient costs had the largest between-group quarterly declines, at $294 and $275, respectively, per patient. Emergency department visits, hospitalizations, and intensive care unit admissions decreased by 6.0%, 7.9%, and 10.6%, respectively, per quarter in navigated patients compared with matched comparison patients (P < .001). Costs to Medicare and health care use from 2012 through 2015 declined significantly for navigated patients compared with matched comparison patients. Lay navigation programs should be expanded as health systems transition to value-based health care.
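A rough sketch of the general analysis pattern (propensity score estimation, 1:1 nearest-neighbor matching, then a between-group comparison), using simulated data and illustrative variable names rather than the study's Medicare claims or its regression specification:

```python
# Sketch of propensity-score matching followed by a matched-group comparison.
# All data and variable names are simulated/illustrative.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "navigated": rng.integers(0, 2, n),
    "age": rng.normal(75, 7, n),
    "comorbidity": rng.poisson(2, n),
    "baseline_cost": rng.gamma(2, 3000, n),
})
df["quarterly_cost_change"] = -800 * df["navigated"] + rng.normal(0, 500, n)

# Propensity of being navigated, from pre-enrollment covariates.
X = df[["age", "comorbidity", "baseline_cost"]]
df["pscore"] = LogisticRegression(max_iter=1000).fit(X, df["navigated"]).predict_proba(X)[:, 1]

# 1:1 nearest-neighbor matching on the propensity score.
treated = df[df["navigated"] == 1]
control = df[df["navigated"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched_control = control.iloc[idx.ravel()]

diff = treated["quarterly_cost_change"].mean() - matched_control["quarterly_cost_change"].mean()
print(f"Matched difference in quarterly cost change: {diff:.0f}")
```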
From cosmos to connectomes: the evolution of data-intensive science.
Burns, Randal; Vogelstein, Joshua T; Szalay, Alexander S
2014-09-17
The analysis of data requires computation: originally by hand and more recently by computers. Different models of computing are designed and optimized for different kinds of data. In data-intensive science, the scale and complexity of data exceeds the comfort zone of local data stores on scientific workstations. Thus, cloud computing emerges as the preeminent model, utilizing data centers and high-performance clusters, enabling remote users to access and query subsets of the data efficiently. We examine how data-intensive computational systems originally built for cosmology, the Sloan Digital Sky Survey (SDSS), are now being used in connectomics, at the Open Connectome Project. We list lessons learned and outline the top challenges we expect to face. Success in computational connectomics would drastically reduce the time between idea and discovery, as SDSS did in cosmology. Copyright © 2014 Elsevier Inc. All rights reserved.
Druhl, Emily; Polepalli Ramesh, Balaji; Houston, Thomas K; Brandt, Cynthia A; Zulman, Donna M; Vimalananda, Varsha G; Malkani, Samir; Yu, Hong
2018-01-01
Background Many health care systems now allow patients to access their electronic health record (EHR) notes online through patient portals. Medical jargon in EHR notes can confuse patients, which may interfere with potential benefits of patient access to EHR notes. Objective The aim of this study was to develop and evaluate the usability and content quality of NoteAid, a Web-based natural language processing system that links medical terms in EHR notes to lay definitions, that is, definitions easily understood by lay people. Methods NoteAid incorporates two core components: CoDeMed, a lexical resource of lay definitions for medical terms, and MedLink, a computational unit that links medical terms to lay definitions. We developed innovative computational methods, including an adapted distant supervision algorithm to prioritize medical terms important for EHR comprehension to facilitate the effort of building CoDeMed. Ten physician domain experts evaluated the user interface and content quality of NoteAid. The evaluation protocol included a cognitive walkthrough session and a postsession questionnaire. Physician feedback sessions were audio-recorded. We used standard content analysis methods to analyze qualitative data from these sessions. Results Physician feedback was mixed. Positive feedback on NoteAid included (1) Easy to use, (2) Good visual display, (3) Satisfactory system speed, and (4) Adequate lay definitions. Opportunities for improvement arising from evaluation sessions and feedback included (1) improving the display of definitions for partially matched terms, (2) including more medical terms in CoDeMed, (3) improving the handling of terms whose definitions vary depending on different contexts, and (4) standardizing the scope of definitions for medicines. On the basis of these results, we have improved NoteAid’s user interface and a number of definitions, and added 4502 more definitions in CoDeMed. Conclusions Physician evaluation yielded useful feedback for content validation and refinement of this innovative tool that has the potential to improve patient EHR comprehension and experience using patient portals. Future ongoing work will develop algorithms to handle ambiguous medical terms and test and evaluate NoteAid with patients. PMID:29358159
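As a toy illustration of the term-linking idea only, with a tiny invented lexicon standing in for CoDeMed and none of NoteAid's actual handling of inflection, multi-word terms, or context:

```python
# Toy sketch: scan an EHR note for known medical terms and attach lay
# definitions. The lexicon below is illustrative and is not CoDeMed.
import re

lay_lexicon = {
    "hypertension": "high blood pressure",
    "dyspnea": "shortness of breath",
    "edema": "swelling caused by fluid buildup",
}

def link_terms(note: str) -> str:
    def annotate(match: re.Match) -> str:
        term = match.group(0)
        return f"{term} [{lay_lexicon[term.lower()]}]"
    pattern = r"\b(" + "|".join(map(re.escape, lay_lexicon)) + r")\b"
    return re.sub(pattern, annotate, note, flags=re.IGNORECASE)

print(link_terms("Patient reports dyspnea and lower-extremity edema; history of hypertension."))
```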
Kashani, Alireza G.; Olsen, Michael J.; Parrish, Christopher E.; Wilson, Nicholas
2015-01-01
In addition to precise 3D coordinates, most light detection and ranging (LIDAR) systems also record “intensity”, loosely defined as the strength of the backscattered echo for each measured point. To date, LIDAR intensity data have proven beneficial in a wide range of applications because they are related to surface parameters, such as reflectance. While numerous procedures have been introduced in the scientific literature, and even commercial software, to enhance the utility of intensity data through a variety of “normalization”, “correction”, or “calibration” techniques, the current situation is complicated by a lack of standardization, as well as confusing, inconsistent use of terminology. In this paper, we first provide an overview of basic principles of LIDAR intensity measurements and applications utilizing intensity information from terrestrial, airborne topographic, and airborne bathymetric LIDAR. Next, we review effective parameters on intensity measurements, basic theory, and current intensity processing methods. We define terminology adopted from the most commonly-used conventions based on a review of current literature. Finally, we identify topics in need of further research. Ultimately, the presented information helps lay the foundation for future standards and specifications for LIDAR radiometric calibration. PMID:26561813
Garfield, S; Jheeta, S; Husson, F; Jacklin, A; Bischler, A; Norton, C; Franklin, B D
2016-01-01
There is a consensus that patients and the public should be involved in research in a meaningful way. However, to date, lay people have been mostly involved in developing research ideas and commenting on patient information.We previously published a paper describing our experience with lay partners conducting observations in a study of how patients in hospital are involved with their medicines. In a later part of the same study, lay partners were also involved in analysing interviews that a researcher had conducted with patients, carers and healthcare professionals about patient and carer involvement with medicines in hospital. We therefore wanted to build on our previous paper and report on our experiences with lay partners helping to conduct data analysis. We therefore interviewed the lay members and researchers involved in the analysis to find out their views.Both lay members and researchers reported that lay partners added value to the study by bringing their own perspectives and identifying further areas for the researcher to look for in the interviews. In this way researchers and lay partners were able to work together to produce a richer analysis than would have been possible from either alone. Background It is recognised that involving lay people in research in a meaningful rather than tokenistic way is both important and challenging. In this paper, we contribute to this debate by describing our experiences of lay involvement in data analysis. Methods We conducted semi-structured interviews with the lay partners and researchers involved in qualitative data analysis in a wider study of inpatient involvement in medication safety. The interviews were transcribed verbatim and coded using open thematic analysis. Results We interviewed three lay partners and the three researchers involved. These interviews demonstrated that the lay members added value to the analysis by bringing their own perspectives; these were systematically integrated into the analysis by the lead researcher to create a synergistic output. Some challenges arose, including difficulties in recruiting a diverse range of members of the public to carry out the role; however there were generally fewer challenges in data analysis than there had been with our previous experience of lay partners' involvement in data collection. Conclusions Lay members can add value to health services research by being involved in qualitative data analysis.
Kaesberg, A-K U; Louton, H; Erhard, M; Schmidt, P; Zepp, M; Helmer, F; Schwarzer, A
2018-03-01
In July 2015, a German voluntary decree stipulated that the keeping of beak-trimmed laying hens would no longer be permitted after the 1st of January 2017. Simultaneously, the present project was initiated to validate a newly developed prognostic tool for laying hen farmers to forecast, at the beginning of a laying period, the probability of future problems with feather pecking and cannibalism in their flock. For this purpose, we used a computer-based prognostic tool in the form of a questionnaire that was easy and quick to complete and facilitated comparisons of different flocks. It contained various possible risk factors that were classified into 3 score categories (1 = "no need for action," 2 = "intermediate need for action," 3 = "instant need for action"). For the validation of this tool, 43 flocks of 41 farms were examined twice, at the beginning of the laying period (around the 20th wk of life) and around the 67th wk of life. At both visits, the designated investigators filled out the questionnaire and assessed the plumage condition and the skin lesions (as indicators of occurrence of feather pecking and cannibalism) of 50 laying hens of each flock. The average prognostic score of the first visit was compared with the existence of feather pecking and cannibalism in each flock at the end of the laying period. The results showed that the prognostic score was negatively correlated with the plumage score (r = -0.32; 95% confidence interval [CI]: [-0.56; -0.02]) and positively correlated with the skin lesion score (r = 0.38; 95% CI: [0.09; 0.61]). These relationships demonstrate that a better prognostic score was associated with better plumage and skin lesion scores. After performing a principal component analysis on the single scores, we found that only 6 components are sufficient to obtain highly sensitive and specific prognostic results. Thus, the data of this analysis should be used for creating applicable software for use on laying hen farms.
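A minimal sketch of the two analysis steps described (correlating a flock-level prognostic score with outcome scores, then a principal component analysis on the individual item scores), using simulated data rather than the project's questionnaire:

```python
# Illustrative sketch: correlate a flock prognostic score with its end-of-lay
# plumage score, then run a PCA on the individual questionnaire items.
import numpy as np
from scipy.stats import pearsonr
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n_flocks, n_items = 43, 20
item_scores = rng.integers(1, 4, size=(n_flocks, n_items))   # scores 1..3 per risk factor
prognostic_score = item_scores.mean(axis=1)
plumage_score = 4 - 0.5 * prognostic_score + rng.normal(0, 0.4, n_flocks)  # simulated outcome

r, p = pearsonr(prognostic_score, plumage_score)
print(f"r = {r:.2f}, p = {p:.3f}")

pca = PCA(n_components=6).fit(item_scores)
print("variance explained by 6 components:", pca.explained_variance_ratio_.sum().round(2))
```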
NASA Astrophysics Data System (ADS)
Kartikasari, L. R.; Hertanto, B. S.; Pranoto, D.; Salim, W. N.; Nuhriawangsa, A. M. P.
2017-04-01
Purslane is considered a rich vegetable source of alpha-linolenic acid, beta-carotene and various antioxidants. The objective of the study was to investigate the effect of different dietary levels of purslane meal (Portulaca oleracea) in the diets of laying hens on the physical quality of eggs. A total of 125 Hy-Line Brown hens (54 weeks old) were placed in individual cages and assigned to five dietary treatments. The diets were supplemented with 0, 2, 4, 6 and 8% purslane meal. Laying hens were fed for 5 weeks after a typical period of adaptation (7 days). Water and feed were provided ad libitum. A total of 25 egg samples from day 28 and day 35 (n = 5 egg yolks for each treatment) were collected to analyse the exterior and interior physical quality of eggs. The data were analysed using ANOVA. Differences between treatment means were further analysed using Duncan's New Multiple Range Test. Results showed that feeding different purslane meal levels in the diets improved egg weight, yolk weight, albumen weight and yolk colour. The highest intensity of yolk colour was obtained with the diet containing 8% purslane meal. However, dietary treatments did not affect egg index, albumen index, yolk index, shell weight, shell thickness and Haugh unit. It is concluded that including purslane meal in laying hen diets improves the physical quality of the eggs.
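A brief sketch of the statistical workflow on simulated egg-weight data; Duncan's multiple range test is not available in the common Python libraries, so Tukey's HSD stands in as the post hoc comparison here:

```python
# One-way ANOVA across the five purslane levels, followed by a pairwise
# post hoc comparison (Tukey's HSD as a stand-in for Duncan's test).
# Egg-weight values are simulated, not the study's data.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(4)
levels = [0, 2, 4, 6, 8]                                   # % purslane meal in the diet
egg_weight = {lv: rng.normal(60 + 0.3 * lv, 1.5, 5) for lv in levels}

f_stat, p_val = f_oneway(*egg_weight.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.3f}")

values = np.concatenate(list(egg_weight.values()))
groups = np.repeat(levels, 5)
print(pairwise_tukeyhsd(values, groups))
```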
A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing.
Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui
2017-01-08
Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, which can be considered to be a comprehensive data intensive and computing intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, which greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward from the aspects of programming model, HDFS configuration and scheduling. The experimental results show that the cloud computing based algorithm achieves 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy can improve about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing intensive and data intensive issues in SAR raw data simulation, and is easily extended to large scale computing to achieve higher acceleration.
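A toy serial analogue of the map/reduce decomposition described (each map task simulates echoes for a block of scatterers, the reduce step accumulates partial raw-data arrays); this is not the authors' Hadoop/HDFS implementation and the echo model is deliberately simplistic:

```python
# Schematic map/reduce decomposition of raw-data accumulation. Toy serial
# analogue only; a real cloud workflow would distribute map tasks and store
# partial results in a distributed file system.
from functools import reduce
import numpy as np

N_RANGE, N_AZIMUTH = 64, 64

def simulate_block(scatterers: np.ndarray) -> np.ndarray:
    """Map step: partial raw-data array from one block of point targets."""
    raw = np.zeros((N_AZIMUTH, N_RANGE), dtype=complex)
    for range_bin, az_bin, amp in scatterers:
        raw[int(az_bin), int(range_bin)] += amp * np.exp(1j * np.pi * amp)  # toy echo model
    return raw

def accumulate(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Reduce step: raw data from different blocks simply adds."""
    return a + b

rng = np.random.default_rng(5)
blocks = [np.column_stack([rng.integers(0, N_RANGE, 100),
                           rng.integers(0, N_AZIMUTH, 100),
                           rng.random(100)]) for _ in range(8)]

raw_data = reduce(accumulate, map(simulate_block, blocks))
print(raw_data.shape, np.abs(raw_data).max())
```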
Gharbi, Mohamed; Sakly, Nadhem; Darghouth, Mohamed Aziz
2013-01-01
Dermanyssus gallinae (Mesostigmata: Dermanyssidae), a mite of poultry, represents the most important ectoparasite of egg-laying poultry in several countries. We estimated the prevalence of D. gallinae infestation in 38 industrial poultry farms (28 egg-laying and 10 reproductive hen farms) in the governorate of Nabeul (North-East Tunisia). Traps were placed in two locations of each farm during 24 h in August. The overall prevalence at the farms was estimated to be 34%. A total number of 329 D. gallinae were collected, giving an intensity of 0.0028 and an abundance of 0.0015. Infestation intensity and abundance were significantly higher in egg production farms than reproductive farms. There was no correlation between the intensity of infestation and temperature. An exponential correlation was observed between the birds’ age and infestation intensity. We recommend a systematic survey of poultry farms during the whole breeding period. Prompt treatment is recommended to avoid the exponential increase of mite population. PMID:24160169
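For reference, the standard parasitological summary measures can be computed from per-host (or per-trap) counts as below; the toy counts are invented and the denominators may differ from those used in the survey:

```python
# Standard parasitological summary measures (prevalence, mean intensity,
# mean abundance) from per-host counts. Generic sketch with toy data.
import numpy as np

def summarize(counts: np.ndarray) -> dict:
    infested = counts > 0
    return {
        "prevalence": infested.mean(),                              # infested hosts / all hosts
        "mean_intensity": counts[infested].mean() if infested.any() else 0.0,
        "mean_abundance": counts.mean(),                            # parasites / all hosts
    }

mite_counts = np.array([0, 0, 12, 0, 3, 0, 0, 25, 0, 1])            # mites per farm (toy data)
print(summarize(mite_counts))
```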
Gharbi, Mohamed; Sakly, Nadhem; Darghouth, Mohamed Aziz
2013-01-01
Dermanyssus gallinae (Mesostigmata: Dermanyssidae), a mite of poultry, represents the most important ectoparasite of egg-laying poultry in several countries. We estimated the prevalence of D. gallinae infestation in 38 industrial poultry farms (28 egg-laying and 10 reproductive hen farms) in the governorate of Nabeul (North-East Tunisia). Traps were placed in two locations of each farm during 24 h in August. The overall prevalence at the farms was estimated to be 34%. A total number of 329 D. gallinae were collected, giving an intensity of 0.0028 and an abundance of 0.0015. Infestation intensity and abundance were significantly higher in egg production farms than reproductive farms. There was no correlation between the intensity of infestation and temperature. An exponential correlation was observed between the birds' age and infestation intensity. We recommend a systematic survey of poultry farms during the whole breeding period. Prompt treatment is recommended to avoid the exponential increase of mite population. © M. Gharbi et al., published by EDP Sciences, 2013.
The Role of Lay Health Workers in Pediatric Chronic Disease: A Systematic Review
Raphael, Jean L.; Rueda, Anna; Lion, K. Casey; Giordano, Thomas P.
2013-01-01
Background Children with chronic diseases represent a high-cost and resource-intensive population of children. With continued gaps in chronic disease management and persistent fragmentation in the health care system, stakeholders are seeking new strategies to address the needs of these children. Objective To systematically assess the effectiveness of lay health worker interventions in improving health care utilization, symptom management, and family psychosocial outcomes for children with chronic conditions. Data Source PubMed, PsycINFO, and Web of Science (January 1961- February 2013). Study Eligibility Criteria, Participants, and Interventions We developed a strategy to search citations to identify relevant articles. Search terms included randomized controlled trial (RCT), lay worker, parent mentor, peer mentor, peer educator, community health workers, community health aids, patient advocate, patient facilitator, patient liaison, promotoras (es), care ambassadors, patient navigator, and non-professional. Additional studies were identified by searching the reference lists of retrieved articles and contacting clinical experts. RCTs of lay health worker interventions for children with chronic conditions were included. Studies were restricted to those concentrated on children 0–18 years of age with chronic illnesses. Study Appraisal and Synthesis Methods Abstracts were independently screened by 2 reviewers. Articles with relevant abstracts underwent full text review and were evaluated for inclusion criteria. A structured tool was used to abstract data from selected articles. Due to heterogeneous interventions and outcomes, we did not conduct a meta-analysis. Results The search yielded 736 unique articles, of which 17 met inclusion criteria. All interventions focused on specific conditions: asthma, type I diabetes, obesity, and failure to thrive. Interventions were heterogeneous in frequency, mode, and duration of interactions between lay health workers and subjects. Several interventions were multi-faceted, including both one-on-one and group interactions. Improved outcomes most commonly reported were reduced urgent care use, decreases in symptoms, fewer missed work and school days, and increased parental quality of life. One study demonstrated that lay health worker interventions were cost-effective. Conclusions Lay health workers interventions in children with chronic conditions may lead to modest improvements in urgent care use, symptoms, and parental psychosocial outcomes. Such interventions may also be cost-effective. Future research should focus on interventions targeted toward other chronic conditions such as sickle cell disease or cystic fibrosis and medically complex children whose conditions are non-categorical. PMID:24011745
A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing
Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui
2017-01-01
Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, which can be considered to be a comprehensive data intensive and computing intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, which greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward from the aspects of programming model, HDFS configuration and scheduling. The experimental results show that the cloud computing based algorithm achieves 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy can improve about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing intensive and data intensive issues in SAR raw data simulation, and is easily extended to large scale computing to achieve higher acceleration. PMID:28075343
Barua, Animesh; Yellapa, Aparna; Bahr, Janice M; Machado, Sergio A; Bitterman, Pincas; Basu, Sanjib; Sharma, Sameer; Abramowicz, Jacques S
2015-07-01
Tumor-associated neoangiogenesis (TAN) is an early event in ovarian cancer (OVCA) development. Increased expression of vascular endothelial growth factor receptor 2 (VEGFR2) by TAN vessels presents a potential target for early detection by ultrasound imaging. The goal of this study was to examine the suitability of VEGFR2-targeted ultrasound contrast agents in detecting spontaneous OVCA in laying hens. Effects of VEGFR2-targeted contrast agents in enhancing the intensity of ultrasound imaging from spontaneous ovarian tumors in hens were examined in a cross-sectional study. Enhancement in the intensity of ultrasound imaging was determined before and after injection of VEGFR2-targeted contrast agents. All ultrasound images were digitally stored and analyzed off-line. Following scanning, ovarian tissues were collected and processed for histology and detection of VEGFR2-expressing microvessels. Enhancement in visualization of ovarian morphology was detected by gray-scale imaging following injection of VEGFR2-targeted contrast agents. Compared with pre-contrast, contrast imaging enhanced the intensities of ultrasound imaging significantly (p < 0.0001) irrespective of the pathological status of ovaries. In contrast to normal hens, the intensity of ultrasound imaging was significantly (p < 0.0001) higher in hens with early stage OVCA and increased further in hens with late stage OVCA. Higher intensities of ultrasound imaging in hens with OVCA were positively correlated with increased (p < 0.0001) frequencies of VEGFR2-expressing microvessels. The results of this study suggest that VEGFR2-targeted contrast agents enhance the visualization of spontaneous ovarian tumors in hens at early and late stages of OVCA. The laying hen may be a suitable model to test new imaging agents and develop targeted therapeutics. © The Author(s) 2014.
Russell, Kathleen M; Champion, Victoria L; Monahan, Patrick O; Millon-Underwood, Sandra; Zhao, Qianqian; Spacey, Nicole; Rush, Nathan L; Paskett, Electra D
2010-01-01
Low-income African American women face numerous barriers to mammography screening. We tested the efficacy of a combined interactive computer program and lay health advisor intervention to increase mammography screening. In this randomized, single blind study, participants were 181 African American female health center patients of ages 41 to 75 years, at < or =250% of poverty level, with no breast cancer history, and with no screening mammogram in the past 15 months. They were assigned to either (a) a low-dose comparison group consisting of a culturally appropriate mammography screening pamphlet or (b) interactive, tailored computer instruction at baseline and four monthly lay health advisor counseling sessions. Self-reported screening data were collected at baseline and 6 months and verified by medical record. For intent-to-treat analysis of primary outcome (medical record-verified mammography screening, available on all but two participants), the intervention group had increased screening to 51% (45 of 89) compared with 18% (16 of 90) for the comparison group at 6 months. When adjusted for employment status, disability, first-degree relatives with breast cancer, health insurance, and previous breast biopsies, the intervention group was three times more likely (adjusted relative risk, 2.7; 95% confidence interval, 1.8-3.7; P < 0.0001) to get screened than the low-dose comparison group. Similar results were found for self-reported mammography stage of screening adoption. The combined intervention was efficacious in improving mammography screening in low-income African American women, with an unadjusted effect size (relative risk, 2.84) significantly higher (P < 0.05) than that in previous studies of each intervention alone.
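As a quick arithmetic check, the reported unadjusted effect size can be reproduced directly from the screening counts given above (45 of 89 intervention participants versus 16 of 90 comparison participants):

```python
# Reproduce the unadjusted relative risk from the reported screening counts.
intervention_rate = 45 / 89   # ~0.506 screened in the intervention group
comparison_rate = 16 / 90     # ~0.178 screened in the low-dose comparison group
relative_risk = intervention_rate / comparison_rate
print(round(relative_risk, 2))  # 2.84, matching the reported unadjusted relative risk
```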
Daigle, Courtney L; Siegford, Janice M
2014-03-01
Continuous observation is the most accurate way to determine animals' actual time budget and can provide a 'gold standard' representation of resource use, behavior frequency, and duration. Continuous observation is useful for capturing behaviors that are of short duration or occur infrequently. However, collecting continuous data is labor intensive and time consuming, making multiple individual or long-term data collection difficult. Six non-cage laying hens were video recorded for 15 h and behavioral data collected every 2 s were compared with data collected using scan sampling intervals of 5, 10, 15, 30, and 60 min and subsamples of 2 second observations performed for 10 min every 30 min, 15 min every 1 h, 30 min every 1.5 h, and 15 min every 2 h. Three statistical approaches were used to provide a comprehensive analysis to examine the quality of the data obtained via different sampling methods. General linear mixed models identified how the time budget from the sampling techniques differed from continuous observation. Correlation analysis identified how strongly results from the sampling techniques were associated with those from continuous observation. Regression analysis identified how well the results from the sampling techniques were associated with those from continuous observation, changes in magnitude, and whether a sampling technique had bias. Static behaviors were well represented with scan and time sampling techniques, while dynamic behaviors were best represented with time sampling techniques. Methods for identifying an appropriate sampling strategy based upon the type of behavior of interest are outlined and results for non-caged laying hens are presented. Copyright © 2013 Elsevier B.V. All rights reserved.
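A minimal sketch of the comparison the study performs, under assumed behavior codes and a 2-s observation grid (not the authors' data or code): the continuous record gives the reference time budget, and thinning it to scan samples shows how the sampling interval changes the estimate.

```python
# Compare a continuous time budget with one estimated from scan sampling.
# Behavior codes, record length, and the 2-s grid are illustrative assumptions.
from collections import Counter

def time_budget(observations):
    """Proportion of observations spent in each behavior."""
    counts = Counter(observations)
    total = sum(counts.values())
    return {behavior: n / total for behavior, n in counts.items()}

continuous = ["rest"] * 900 + ["feed"] * 300 + ["dustbathe"] * 60  # one code per 2 s
scan_every = 150                          # 150 * 2 s = one scan every 5 minutes
scan_sample = continuous[::scan_every]

print(time_budget(continuous))    # 'gold standard' budget from continuous observation
print(time_budget(scan_sample))   # budget estimated from 5-minute scan sampling
```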
Chen, Jinying; Druhl, Emily; Polepalli Ramesh, Balaji; Houston, Thomas K; Brandt, Cynthia A; Zulman, Donna M; Vimalananda, Varsha G; Malkani, Samir; Yu, Hong
2018-01-22
Many health care systems now allow patients to access their electronic health record (EHR) notes online through patient portals. Medical jargon in EHR notes can confuse patients, which may interfere with potential benefits of patient access to EHR notes. The aim of this study was to develop and evaluate the usability and content quality of NoteAid, a Web-based natural language processing system that links medical terms in EHR notes to lay definitions, that is, definitions easily understood by lay people. NoteAid incorporates two core components: CoDeMed, a lexical resource of lay definitions for medical terms, and MedLink, a computational unit that links medical terms to lay definitions. We developed innovative computational methods, including an adapted distant supervision algorithm to prioritize medical terms important for EHR comprehension to facilitate the effort of building CoDeMed. Ten physician domain experts evaluated the user interface and content quality of NoteAid. The evaluation protocol included a cognitive walkthrough session and a postsession questionnaire. Physician feedback sessions were audio-recorded. We used standard content analysis methods to analyze qualitative data from these sessions. Physician feedback was mixed. Positive feedback on NoteAid included (1) Easy to use, (2) Good visual display, (3) Satisfactory system speed, and (4) Adequate lay definitions. Opportunities for improvement arising from evaluation sessions and feedback included (1) improving the display of definitions for partially matched terms, (2) including more medical terms in CoDeMed, (3) improving the handling of terms whose definitions vary depending on different contexts, and (4) standardizing the scope of definitions for medicines. On the basis of these results, we have improved NoteAid's user interface and a number of definitions, and added 4502 more definitions in CoDeMed. Physician evaluation yielded useful feedback for content validation and refinement of this innovative tool that has the potential to improve patient EHR comprehension and experience using patient portals. Future ongoing work will develop algorithms to handle ambiguous medical terms and test and evaluate NoteAid with patients. ©Jinying Chen, Emily Druhl, Balaji Polepalli Ramesh, Thomas K Houston, Cynthia A Brandt, Donna M Zulman, Varsha G Vimalananda, Samir Malkani, Hong Yu. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 22.01.2018.
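The linking step performed by MedLink can be pictured as a dictionary lookup that annotates recognized terms with lay definitions. The sketch below is a deliberate simplification with made-up terms and definitions; it is not NoteAid's algorithm or the CoDeMed resource.

```python
# Toy medical-term-to-lay-definition linker; NoteAid's MedLink component is far richer.
import re

lay_definitions = {                      # tiny stand-in for a lexical resource like CoDeMed
    "hypertension": "high blood pressure",
    "dyspnea": "shortness of breath",
}

def link_terms(note):
    """Annotate known medical terms in a note with lay definitions in parentheses."""
    pattern = re.compile("|".join(re.escape(t) for t in lay_definitions), re.IGNORECASE)

    def replace(match):
        definition = lay_definitions[match.group(0).lower()]
        return f"{match.group(0)} ({definition})"

    return pattern.sub(replace, note)

print(link_terms("Patient reports dyspnea; history of hypertension."))
```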
Parent's Guide to Computers in Education.
ERIC Educational Resources Information Center
Moursund, David
Addressed to the parents of children taking computer courses in school, this booklet outlines the rationales for computer use in schools and explains for a lay audience the features and functions of computers. A look at the school of the future shows computers aiding the study of reading, writing, arithmetic, geography, and history. The features…
Visual communication of engineering and scientific data in the courtroom
NASA Astrophysics Data System (ADS)
Jackson, Gerald W.; Henry, Andrew C.
1993-01-01
Presenting engineering and scientific information in the courtroom is challenging. Quite often the data is voluminous and, therefore, difficult to digest by engineering experts, let alone a lay judge, lawyer, or jury. This paper discusses computer visualization techniques designed to provide the court methods of communicating data in visual formats thus allowing a more accurate understanding of complicated concepts and results. Examples are presented that include accident reconstructions, technical concept illustration, and engineering data visualization. Also presented is the design of an electronic courtroom which facilitates the display and communication of information to the courtroom.
MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce.
Idris, Muhammad; Hussain, Shujaat; Siddiqi, Muhammad Hameed; Hassan, Waseem; Syed Muhammad Bilal, Hafiz; Lee, Sungyoung
2015-01-01
Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real time and streaming data in variety of formats. These characteristics give rise to challenges in its modeling, computation, and processing. Hadoop MapReduce (MR) is a well known data-intensive distributed processing framework using the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement.
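The single-job, multi-algorithm idea can be sketched with composite (algorithm, key) intermediate keys so that each reduce call only sees values belonging to one algorithm. This is an illustrative reconstruction in plain Python, not MRPack's Hadoop implementation or its skew-mitigation machinery.

```python
# Sketch of running several related algorithms in one map/reduce pass
# using composite (algorithm_id, key) intermediate keys.
from collections import defaultdict

def run_job(records, algorithms):
    """algorithms maps an id to a (map_fn, reduce_fn) pair sharing one job."""
    grouped = defaultdict(list)
    for record in records:                              # "map" phase
        for algo_id, (map_fn, _) in algorithms.items():
            for key, value in map_fn(record):
                grouped[(algo_id, key)].append(value)   # composite intermediate key
    results = {}
    for (algo_id, key), values in grouped.items():      # "reduce" phase
        _, reduce_fn = algorithms[algo_id]
        results[(algo_id, key)] = reduce_fn(key, values)
    return results

algorithms = {
    "word_count": (lambda rec: [(w, 1) for w in rec.split()], lambda k, vs: sum(vs)),
    "char_count": (lambda rec: [("chars", len(rec))], lambda k, vs: sum(vs)),
}
print(run_job(["big data on hadoop", "data intensive computing"], algorithms))
```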
Computational neurobiology is a useful tool in translational neurology: the example of ataxia
Brown, Sherry-Ann; McCullough, Louise D.; Loew, Leslie M.
2014-01-01
Hereditary ataxia, or motor incoordination, affects approximately 150,000 Americans and hundreds of thousands of individuals worldwide with onset from as early as mid-childhood. Affected individuals exhibit dysarthria, dysmetria, action tremor, and dysdiadochokinesia. In this review, we consider an array of computational studies derived from experimental observations relevant to human neuropathology. A survey of related studies illustrates the impact of integrating clinical evidence with data from mouse models and computational simulations. Results from these studies may help explain findings in mice, and after extensive laboratory study, may ultimately be translated to ataxic individuals. This inquiry lays a foundation for using computation to understand neurobiochemical and electrophysiological pathophysiology of spinocerebellar ataxias and may contribute to development of therapeutics. The interdisciplinary analysis suggests that computational neurobiology can be an important tool for translational neurology. PMID:25653585
Can trained lay providers perform HIV testing services? A review of national HIV testing policies.
Flynn, David E; Johnson, Cheryl; Sands, Anita; Wong, Vincent; Figueroa, Carmen; Baggaley, Rachel
2017-01-04
Only an estimated 54% of people living with HIV are aware of their status. Despite progress scaling up HIV testing services (HTS), a testing gap remains. Delivery of HTS by lay providers may help close this testing gap, while also increasing uptake and acceptability of HIV testing among key populations and other priority groups. Fifty national HIV testing policies were collated from WHO country intelligence databases, contacts and testing program websites. Data regarding lay provider use for HTS were extracted and collated; our search had no geographical or language restrictions. These data were then compared with reported data from the Global AIDS Response Progress Reporting (GARPR) from July 2015. Forty-two percent of countries permit lay providers to perform HIV testing and 56% permit lay providers to administer pre- and post-test counseling. Comparative analysis with GARPR found that less than half (46%) of reported data from countries were consistent with their corresponding national HIV testing policy. Given the low uptake of lay providers globally and their proven role in increasing HIV testing, countries should consider revising policies to support lay provider testing using rapid diagnostic tests.
Sensitivity analysis of navy aviation readiness based sparing model
2017-09-01
variability. [Figure 4: research design flowchart] Figure 4 lays out the four steps of the methodology, starting in the upper left-hand... as a function of changes in key inputs. We develop NAVARM Experimental Designs (NED), a computational tool created by applying a state-of-the-art... experimental design to the NAVARM model. Statistical analysis of the resulting data identifies the most influential cost factors. Those are, in order of
The locating ways of laying pipe manipulator
NASA Astrophysics Data System (ADS)
Wang, Dan; Li, Bin; Lei, DongLiang
2010-01-01
The laying pipe manipulator is a new piece of equipment for laying concrete pipe that mechanizes and automates the work of pipe laying. We report a new laying pipe manipulator with five degrees of freedom, driven by a hydraulic system. The paper studies one critical question: how the manipulator is located while laying concrete pipe. Locating is handled by a locating system consisting of photoelectric targets, a laser emitter, and a computer. Depending on construction conditions, one, two, or three photoelectric targets can be used: one target suffices when the pipe interfaces are already jointed and only the free end of the pipe deviates from the pipe way; two targets can be used when the angle through which the manipulator rotates about the held pipe's axis is 0°; and three targets can be used at any site. For each locating method, a theoretical analysis is carried out, and mathematical models of the manipulator's motion from its original position to the goal position are derived. Locating experiments were performed, and the results show that the working principles and mathematical models of the different locating methods meet the requirements; these models supply the basic control theory needed for the manipulator to lay and joint concrete pipe automatically.
PNNL Data-Intensive Computing for a Smarter Energy Grid
Carol Imhoff; Zhenyu (Henry) Huang; Daniel Chavarria
2017-12-09
The Middleware for Data-Intensive Computing (MeDICi) Integration Framework, an integrated platform for data analysis and processing, supports PNNL research on the U.S. electric power grid. MeDICi is enabling development of visualizations of grid operations and vulnerabilities, with the goal of near-real-time analysis to aid operators in preventing and mitigating grid failures.
NASA Astrophysics Data System (ADS)
Singh, Ranjan Kumar; Rinawa, Moti Lal
2018-04-01
The residual stresses arising in fiber-reinforced laminates during curing in closed molds lead to dimensional changes in the composites after removal from the molds and cooling. One such dimensional change of angle sections is called springback. Parameters such as lay-up, stacking sequence, material system, cure temperature, and thickness play an important role in it. In the present work, the lay-up and stacking sequence are optimized to maximize flexural stiffness and minimize the springback angle. Search algorithms are employed to obtain the best sequence through a repair strategy such as swap. A new search algorithm, termed the lay-up search algorithm (LSA), is also proposed as an extension of the permutation search algorithm (PSA). The efficacy of PSA and LSA is tested on laminates with a range of lay-ups, and a computer code implementing the above schemes is developed in MATLAB. Strategies for multi-objective optimization using the search algorithms are also suggested and tested.
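The swap-based repair idea can be illustrated with a small hill-climbing loop over a stacking sequence. The objective below is a toy surrogate standing in for the flexural-stiffness/springback model, and the sketch (in Python rather than the paper's MATLAB) does not reproduce the PSA or LSA algorithms themselves.

```python
# Illustrative swap-based search over a laminate stacking sequence.
# The objective is a toy surrogate, not the paper's stiffness/springback model.
import random

def objective(sequence):
    """Toy score: reward 0-degree plies placed toward the outer surfaces."""
    n = len(sequence)
    return sum(abs(i - n / 2) for i, angle in enumerate(sequence) if angle == 0)

def swap_search(sequence, iterations=2000, seed=1):
    rng = random.Random(seed)
    best, best_score = list(sequence), objective(sequence)
    for _ in range(iterations):
        i, j = rng.sample(range(len(best)), 2)
        candidate = list(best)
        candidate[i], candidate[j] = candidate[j], candidate[i]   # swap repair move
        score = objective(candidate)
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score

layup = [0, 45, -45, 90, 0, 45, -45, 90]
print(swap_search(layup))   # best sequence found and its surrogate score
```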
Barua, Animesh; Bitterman, Pincas; Bahr, Janice M.; Basu, Sanjib; Sheiner, Eyal; Bradaric, Michael J.; Hales, Dale B.; Luborsky, Judith L.; Abramowicz, Jacques S.
2011-01-01
Objective Our goal was to examine the feasibility of using laying hens, a preclinical model of human spontaneous ovarian cancer, in determining the kinetics of an ultrasound contrast agent indicative of ovarian tumor-associated neoangiogenesis in early-stage ovarian cancer. Methods Three-year-old White Leghorn laying hens with decreased ovarian function were scanned before and after intravenous injection of a human serum albumin–perflutren contrast agent at a dose of 5 µL/kg body weight. Gray scale morphologic characteristics, Doppler indices, the arrival time, peak intensity, and wash-out of the contrast agent were recorded and archived on still images and video clips. Hens were euthanized thereafter; sonographic predictions were compared at gross examination; and ovarian tissues were collected. Archived clips were analyzed to determine contrast parameters and Doppler intensities of vessels. A time-intensity curve per hen was drawn, and the area under the curve was derived. Tumor types and the density of ovarian microvessels were determined by histologic examination and immunohistochemistry and compared to sonographic predictions. Results The contrast agent significantly (P < .05) enhanced the visualization of microvessels, which was confirmed by immunohistochemistry. Contrast parameters, including the time of wash-out and area under the curve, were significantly different (P < .05) between ovaries of normal hens and hens with ovarian cancer and correctly detected cancer at earlier stages than the time of peak intensity. Conclusions The laying hen may be a useful animal model for determining ovarian tumor-associated vascular kinetics diagnostic of early-stage ovarian cancer using a contrast agent. This model may also be useful for testing the efficacy of different contrast agents in a preclinical setting. PMID:21357555
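The time-intensity curve analysis mentioned above (peak intensity, wash-out, area under the curve) reduces to simple numerical integration; the sketch below uses made-up sample values, not the study's measurements.

```python
# Area under a contrast time-intensity curve by trapezoidal integration
# (illustrative values, arbitrary intensity units).
import numpy as np

time_s = np.array([0, 5, 10, 20, 40, 60, 90, 120])               # seconds after injection
intensity = np.array([0.0, 0.2, 1.5, 2.8, 2.1, 1.4, 0.7, 0.3])   # arbitrary units

auc = np.trapz(intensity, time_s)          # area under the time-intensity curve
peak_time = time_s[np.argmax(intensity)]   # time of peak enhancement
print(f"AUC = {auc:.1f} a.u.*s, peak intensity at {peak_time} s")
```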
Developing a Business Intelligence Process for a Training Module in SharePoint 2010
NASA Technical Reports Server (NTRS)
Schmidtchen, Bryce; Solano, Wanda M.; Albasini, Colby
2015-01-01
Prior to this project, training information for the employees of the National Center for Critical Processing and Storage (NCCIPS) was stored in an array of unrelated spreadsheets and SharePoint lists that had to be manually updated. By developing a content management system through a web application platform named SharePoint, this training system is now highly automated and provides a much less intensive method of storing training data and scheduling training courses. This system was developed by using SharePoint Designer and laying out the data structure for the interaction between different lists of data about the employees. The automation of data population inside of the lists was accomplished by implementing SharePoint workflows which essentially lay out the logic for how data is connected and calculated between certain lists. The resulting training system is constructed from a combination of five lists of data with a single list acting as the user-friendly interface. This interface is populated with the courses required for each employee and includes past and future information about course requirements. The employees of NCCIPS now have the ability to view, log, and schedule their training information and courses with much more ease. This system will relieve a significant amount of manual input and serve as a powerful informational resource for the employees of NCCIPS in the future.
Structural dynamics of shroudless, hollow fan blades with composite in-lays
NASA Technical Reports Server (NTRS)
Aiello, R. A.; Hirschbein, M. S.; Chamis, C. C.
1982-01-01
Structural and dynamic analyses are presented for a shroudless, hollow titanium fan blade proposed for future use in aircraft turbine engines. The blade was modeled and analyzed using the composite blade structural analysis computer program (COBSTRAN); an integrated program consisting of mesh generators, composite mechanics codes, NASTRAN, and pre- and post-processors. Vibration and impact analyses are presented. The vibration analysis was conducted with COBSTRAN. Results show the effect of the centrifugal force field on frequencies, twist, and blade camber. Bird impact analysis was performed with the multi-mode blade impact computer program. This program uses the geometric model and modal analysis from the COBSTRAN vibration analysis to determine the gross impact response of the fan blades to bird strikes. The structural performance of this blade is also compared to a blade of similar design but with composite in-lays on the outer surface. Results show that the composite in-lays can be selected (designed) to substantially modify the mechanical performance of the shroudless, hollow fan blade.
De Georgia, Michael A.; Kaffashi, Farhad; Jacono, Frank J.; Loparo, Kenneth A.
2015-01-01
There is a broad consensus that 21st century health care will require intensive use of information technology to acquire and analyze data and then manage and disseminate information extracted from the data. No area is more data intensive than the intensive care unit. While there have been major improvements in intensive care monitoring, the medical industry, for the most part, has not incorporated many of the advances in computer science, biomedical engineering, signal processing, and mathematics that many other industries have embraced. Acquiring, synchronizing, integrating, and analyzing patient data remain frustratingly difficult because of incompatibilities among monitoring equipment, proprietary limitations from industry, and the absence of standard data formatting. In this paper, we will review the history of computers in the intensive care unit along with commonly used monitoring and data acquisition systems, both those commercially available and those being developed for research purposes. PMID:25734185
PNNL's Data Intensive Computing research battles Homeland Security threats
David Thurman; Joe Kielman; Katherine Wolf; David Atkinson
2018-05-11
The Pacific Northwest National Laboratory's (PNNL's) approach to data intensive computing (DIC) is focused on three key research areas: hybrid hardware architectures, software architectures, and analytic algorithms. Advancements in these areas will help to address, and solve, DIC issues associated with capturing, managing, analyzing and understanding, in near real time, data at volumes and rates that push the frontiers of current technologies.
PNNL pushing scientific discovery through data intensive computing breakthroughs
Deborah Gracio; David Koppenaal; Ruby Leung
2018-05-18
The Pacific Northwest National Laboratory's approach to data intensive computing (DIC) is focused on three key research areas: hybrid hardware architectures, software architectures, and analytic algorithms. Advancements in these areas will help to address, and solve, DIC issues associated with capturing, managing, analyzing and understanding, in near real time, data at volumes and rates that push the frontiers of current technologies.
Ory, Marcia G.; Lee, Shinduk; Zollinger, Alyson; Bhurtyal, Kiran; Jiang, Luohua; Smith, Matthew Lee
2015-01-01
The Fit & Strong! program is an evidence-based, multi-component program promoting physical activity among older adults, particularly those suffering from lower-extremity osteoarthritis. The primary purpose of the study is to examine if the Fit & Strong! program translated into a lay-leader model can produce comparable outcomes to the original program taught by physical therapists and/or certified exercise instructors. A single-group, pre–post study design was employed, and data were collected at the baseline (n = 136 participants) and the intervention conclusion (n = 71) with both baseline and post-intervention data. The measurements included socio-demographic information, health- and behavior-related information, and health-related quality of life. Various statistical tests were used for the program impact analysis and examination of the association between participant characteristics and program completion. As in the original study, there were statistically significant (p < 0.05) improvements in self-efficacy for exercise, aerobic capacity, joint stiffness, level of energy, and amount and intensity of physical activities. The odds of completing the program were significantly lower for the participants from rural areas and those having multiple chronic conditions. Successful adaptation of the Fit & Strong! program to a lay-leader model can increase the likelihood of program dissemination by broadening the selection pool of instructors and, hence, reducing the potential issue of resource limitation. However, high program attrition rates (54.1%) emphasize the importance of adopting evidence-based strategies for improving the retention of the participants from rural areas and those with multiple chronic conditions. PMID:25964912
MapReduce Based Parallel Neural Networks in Enabling Large Scale Machine Learning.
Liu, Yang; Yang, Jie; Huang, Yuan; Xu, Lixiong; Li, Siguang; Qi, Man
2015-01-01
Artificial neural networks (ANNs) have been widely used in pattern recognition and classification applications. However, ANNs are notably slow in computation especially when the size of data is large. Nowadays, big data has received a momentum from both industry and academia. To fulfill the potentials of ANNs for big data applications, the computation process must be speeded up. For this purpose, this paper parallelizes neural networks based on MapReduce, which has become a major computing model to facilitate data intensive applications. Three data intensive scenarios are considered in the parallelization process in terms of the volume of classification data, the size of the training data, and the number of neurons in the neural network. The performance of the parallelized neural networks is evaluated in an experimental MapReduce computer cluster from the aspects of accuracy in classification and efficiency in computation.
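The data-parallel pattern described above, where the training data are partitioned so mappers compute local updates that a reduce step combines, can be sketched as gradient averaging over shards. This is an illustrative reconstruction in plain Python/NumPy for a linear model, not the authors' Hadoop cluster code or network architecture.

```python
# Data-parallel training sketch: "map" computes per-shard gradients, "reduce" averages them.
import numpy as np

def map_gradient(shard, w):
    """Mean-squared-error gradient of a linear model on one data shard."""
    X, y = shard
    return X.T @ (X @ w - y) / len(y)

def reduce_gradients(gradients):
    return np.mean(gradients, axis=0)

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.normal(size=300)
shards = [(X[i::3], y[i::3]) for i in range(3)]       # three "mapper" partitions

w = np.zeros(3)
for _ in range(200):                                   # driver iterations
    grads = [map_gradient(s, w) for s in shards]       # map phase (parallelizable)
    w -= 0.1 * reduce_gradients(grads)                 # reduce phase + weight update
print(np.round(w, 2))                                  # approaches [ 1., -2., 0.5]
```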
A supportive architecture for CFD-based design optimisation
NASA Astrophysics Data System (ADS)
Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong
2014-03-01
Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for the implementation of enterprise systems (ES). MDO requiring the analysis of fluid dynamics raises a special challenge due to its extremely intensive computation. The rapid development of computational fluid dynamics (CFD) techniques has led to a rise in their applications in various fields. Especially for the exterior design of vehicles, CFD has become one of the three main design tools, comparable to analytical approaches and wind tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under the given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward, and solving a CFD-based design problem, which typically has high dimensionality and multiple objectives and constraints, remains a challenge. An integrated architecture for CFD-based design optimisation is therefore desirable, yet our review of existing work found that very few researchers have studied assistive tools to facilitate it. In this paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing techniques and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or the data level to fully utilise the capabilities of different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation. To illustrate the effectiveness of the proposed architecture and algorithms, case studies on the aerodynamic shape design of a hypersonic cruising vehicle are provided; the results show that the proposed architecture and algorithms performed successfully and efficiently on a design optimisation problem with over 200 design variables.
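A minimal sketch of the integration pattern the paper argues for, an optimizer driving expensive CFD-style evaluations in parallel, is given below. The `evaluate_design` function is a hypothetical placeholder for a call into an external CFD toolset, and the simple evolutionary loop is not the paper's architecture or its two embedded algorithms.

```python
# Optimizer dispatching expensive CFD-style evaluations in parallel.
# `evaluate_design` is a placeholder: in practice it would write a config/mesh,
# launch the CFD solver, and parse an objective (e.g., drag) from its output.
from concurrent.futures import ProcessPoolExecutor
import random

def evaluate_design(design):
    return sum((x - 0.3) ** 2 for x in design)          # stand-in objective

def optimize(generations=20, n_designs=16, n_vars=8, seed=42):
    rng = random.Random(seed)
    population = [[rng.uniform(0, 1) for _ in range(n_vars)] for _ in range(n_designs)]
    best = None
    with ProcessPoolExecutor() as pool:
        for _ in range(generations):
            scores = list(pool.map(evaluate_design, population))   # parallel evaluations
            ranked = sorted(zip(scores, population))
            best = ranked[0]                                       # (score, design)
            parents = [design for _, design in ranked[: n_designs // 2]]
            population = [[x + rng.gauss(0, 0.05) for x in rng.choice(parents)]
                          for _ in range(n_designs)]               # mutate survivors
    return best

if __name__ == "__main__":
    print(optimize())
```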
2013 Progress Report -- DOE Joint Genome Institute
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2013-11-01
In October 2012, we introduced a 10-Year Strategic Vision [http://bit.ly/JGI-Vision] for the Institute. A central focus of this Strategic Vision is to bridge the gap between sequenced genomes and an understanding of biological functions at the organism and ecosystem level. This involves the continued massive-scale generation of sequence data, complemented by orthogonal new capabilities to functionally annotate these large sequence data sets. Our Strategic Vision lays out a path to guide our decisions and ensure that the evolving set of experimental and computational capabilities available to DOE JGI users will continue to enable groundbreaking science.
Flexible Description Language for HPC based Processing of Remote Sense Data
NASA Astrophysics Data System (ADS)
Nandra, Constantin; Gorgan, Dorian; Bacu, Victor
2016-04-01
When talking about Big Data, the most challenging aspect lies in processing it in order to gain new insight, find new patterns and extract knowledge. This problem is perhaps most apparent in the case of Earth Observation (EO) data. With ever higher numbers of data sources and increasing data acquisition rates, dealing with EO data is indeed a challenge [1]. Geoscientists should address this challenge by using flexible and efficient tools and platforms. To answer this trend, the BigEarth project [2] aims to combine the advantages of high performance computing solutions with flexible processing description methodologies in order to reduce both task execution times and task definition time and effort. As a component of the BigEarth platform, WorDeL (Workflow Description Language) [3] is intended to offer a flexible, compact and modular approach to the task definition process. WorDeL, unlike other description alternatives such as Python or shell scripts, is oriented towards description topologies, using them as abstractions over the processing programs. This feature is intended to make it an attractive alternative for users lacking programming experience. By promoting modular designs, WorDeL not only makes processing descriptions more readable and intuitive, but also helps organize processing tasks into independent sub-tasks, which can be executed in parallel on multi-processor platforms in order to improve execution times. As a BigEarth platform [4] component, WorDeL represents the means by which the user interacts with the system, describing processing algorithms in terms of existing operators and workflows [5], which are ultimately translated into sets of executable commands. The WorDeL language has been designed to help define compute-intensive batch tasks that can be distributed and executed on high-performance, cloud or grid-based architectures in order to reduce processing time. Main references for further information:
[1] Gorgan, D., "Flexible and Adaptive Processing of Earth Observation Data over High Performance Computation Architectures", International Conference and Exhibition Satellite 2015, August 17-19, Houston, Texas, USA.
[2] BigEarth project - flexible processing of big Earth data over high performance computing architectures. http://cgis.utcluj.ro/bigearth, (2014).
[3] Nandra, C., Gorgan, D., "Workflow Description Language for Defining Big Earth Data Processing Tasks", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE-Press, pp. 461-468, (2015).
[4] Bacu, V., Stefan, T., Gorgan, D., "Adaptive Processing of Earth Observation Data on Cloud Infrastructures Based on Workflow Description", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE-Press, pp. 444-454, (2015).
[5] Mihon, D., Bacu, V., Colceriu, V., Gorgan, D., "Modeling of Earth Observation Use Cases through the KEOPS System", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE-Press, pp. 455-460, (2015).
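To convey the general idea of a topology-oriented task description without reproducing WorDeL's actual syntax, the generic sketch below registers a few stand-in operators and wires them into a small workflow whose independent branches run in parallel; the operator names and execution strategy are illustrative assumptions, not the BigEarth platform.

```python
# Generic workflow-composition sketch; this is NOT WorDeL syntax or the BigEarth engine.
from concurrent.futures import ThreadPoolExecutor

OPERATORS = {
    "load": lambda scene: {"scene": scene, "bands": [1, 2, 3]},
    "ndvi": lambda data: f"ndvi({data['scene']})",
    "cloud_mask": lambda data: f"mask({data['scene']})",
    "combine": lambda a, b: f"combine({a}, {b})",
}

def run_workflow(scene):
    data = OPERATORS["load"](scene)
    # Independent branches of the topology can execute in parallel.
    with ThreadPoolExecutor() as pool:
        ndvi = pool.submit(OPERATORS["ndvi"], data)
        mask = pool.submit(OPERATORS["cloud_mask"], data)
        return OPERATORS["combine"](ndvi.result(), mask.result())

print(run_workflow("sentinel2_tile_42"))
```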
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bianchini, G.; Burgio, N.; Carta, M.
The GUINEVERE experiment (Generation of Uninterrupted Intense Neutrons at the lead Venus Reactor) is an experimental program in support of the ADS technology presently carried out at SCK-CEN in Mol (Belgium). In the experiment a modified lay-out of the original thermal VENUS critical facility is coupled to an accelerator, built by the French body CNRS in Grenoble, working in both continuous and pulsed mode and delivering 14 MeV neutrons by bombardment of deuterons on a tritium-target. The modified lay-out of the facility consists of a fast subcritical core made of 30% U-235 enriched metallic Uranium in a lead matrix. Several off-line and on-line reactivity measurement techniques will be investigated during the experimental campaign. This report is focused on the simulation by deterministic (ERANOS French code) and Monte Carlo (MCNPX US code) calculations of three reactivity measurement techniques, Slope (α-fitting), Area-ratio and Source-jerk, applied to a GUINEVERE subcritical configuration (namely SC1). The inferred reactivity, in dollar units, by the Area-ratio method shows an overall agreement between the two deterministic and Monte Carlo computational approaches, whereas the MCNPX Source-jerk results are affected by large uncertainties and allow only partial conclusions about the comparison. Finally, no particular spatial dependence of the results is observed in the case of the GUINEVERE SC1 subcritical configuration. (authors)
NASA Technical Reports Server (NTRS)
1991-01-01
The purpose of the conference was to increase awareness of existing NASA developed technologies that are available for immediate use in the development of new products and processes, and to lay the groundwork for the effective utilization of emerging technologies. There were sessions on the following: Computer technology and software engineering; Human factors engineering and life sciences; Information and data management; Material sciences; Manufacturing and fabrication technology; Power, energy, and control systems; Robotics; Sensors and measurement technology; Artificial intelligence; Environmental technology; Optics and communications; and Superconductivity.
Sampled Data Adaptive Digital Computer Control of Surface Ship Maneuvers
1976-06-01
0.53 feet. Systems for which fuel considerations are not a motivating factor may be designed without this part of the control law to allow finer...
GLIDE: a grid-based light-weight infrastructure for data-intensive environments
NASA Technical Reports Server (NTRS)
Mattmann, Chris A.; Malek, Sam; Beckman, Nels; Mikic-Rakic, Marija; Medvidovic, Nenad; Chrichton, Daniel J.
2005-01-01
The promise of the grid is that it will enable public access and sharing of immense amounts of computational and data resources among dynamic coalitions of individuals and institutions. However, the current grid solutions make several limiting assumptions that curtail their widespread adoption. To address these limitations, we present GLIDE, a prototype light-weight, data-intensive middleware infrastructure that enables access to the robust data and computational power of the grid on DREAM platforms.
Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I
2015-11-03
We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for analysis of continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Determination of differential expression for each protein is based on a standardized Z-statistic derived from the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated with a well-known empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the clinical proteomic technology assessment for cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework and computational software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method, originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics data, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
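The central quantity described, a standardized statistic for the log fold change between two groups of protein intensities with missing values handled, can be sketched as follows. This is an illustrative frequentist simplification; it is not QPROT's Bayesian model, its normalization procedure, or its empirical-Bayes FDR estimation.

```python
# Simplified per-protein log fold change Z-like statistic, ignoring missing values.
# QPROT itself uses a Bayesian model with empirical-Bayes FDR control; this is only a sketch.
import numpy as np

def log_fold_change_z(intensities_a, intensities_b):
    """Return (log2 fold change, standardized statistic) for one protein."""
    a = np.log2(np.asarray(intensities_a, dtype=float))
    b = np.log2(np.asarray(intensities_b, dtype=float))
    a, b = a[~np.isnan(a)], b[~np.isnan(b)]            # drop missing measurements
    lfc = b.mean() - a.mean()
    pooled_se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    return lfc, lfc / pooled_se

control = [1.1e6, 0.9e6, np.nan, 1.0e6]                # replicate intensities, condition A
treated = [2.3e6, 2.0e6, 2.6e6, np.nan]                # replicate intensities, condition B
print(log_fold_change_z(control, treated))
```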
Federated data storage system prototype for LHC experiments and data intensive science
NASA Astrophysics Data System (ADS)
Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Ryabinkin, E.; Zarochentsev, A.
2017-10-01
The rapid increase of data volume from the experiments running at the Large Hadron Collider (LHC) has prompted the physics computing community to evaluate new data handling and processing solutions. Russian grid sites and university clusters scattered over a large area aim to unite their resources for future productive work while also supporting large physics collaborations. In our project we address the fundamental problem of designing a computing architecture to integrate distributed storage resources for LHC experiments and other data-intensive science applications and to provide access to data from heterogeneous computing facilities. Studies include the development and implementation of a federated data storage prototype for Worldwide LHC Computing Grid (WLCG) centres of different levels and university clusters within one national cloud. The prototype is based on computing resources located in Moscow, Dubna, Saint Petersburg, Gatchina and Geneva. The project intends to implement federated distributed storage for all kinds of operations, such as read/write/transfer and access via WAN from grid centres, university clusters, supercomputers, and academic and commercial clouds. The efficiency and performance of the system are demonstrated using synthetic and experiment-specific tests, including real data processing and analysis workflows from the ATLAS and ALICE experiments as well as compute-intensive bioinformatics applications (PALEOMIX) running on supercomputers. We present the topology and architecture of the designed system, report performance and statistics for different access patterns, and show how federated data storage can be used efficiently by physicists and biologists. We also describe how sharing data on a widely distributed storage system can lead to a new computing model and changes in computing style, for instance how a bioinformatics program running on supercomputers can read and write data from the federated storage.
Evaluation of Computed Tomography of Mock Uranium Fuel Rods at the Advanced Photon Source
Hunter, James F.; Brown, Donald William; Okuniewski, Maria
2015-06-01
This study discusses a multi-year effort to evaluate the utility of computed tomography at the Advanced Photon Source (APS) as a tool for non-destructive evaluation of uranium based fuel rods. The majority of the data presented is on mock material made with depleted uranium, which mimics the x-ray attenuation characteristics of fuel rods while allowing for simpler handling. A range of data is presented, including full thickness (5 mm diameter) fuel rodlets, reduced thickness (1.8 mm) sintering test samples, and pre/post irradiation samples (<1 mm thick). These data were taken on both a white beam (bending magnet) beamline and a high-energy, monochromatic beamline. The data show the utility of a synchrotron type source in the evaluation of manufacturing defects (pre-irradiation) and lay out the case for in situ CT of fuel pellet sintering. Finally, data are shown from small post-irradiation samples and a case is made for post-irradiation CT of larger samples.
The Development of Educational and/or Training Computer Games for Students with Disabilities
ERIC Educational Resources Information Center
Kwon, Jungmin
2012-01-01
Computer and video games have much in common with the strategies used in special education. Free resources for game development are becoming more widely available, so lay computer users, such as teachers and other practitioners, now have the capacity to develop games using a low budget and a little self-teaching. This article provides a guideline…
Lay Theories Regarding Computer-Mediated Communication in Remote Collaboration
ERIC Educational Resources Information Center
Parke, Karl; Marsden, Nicola; Connolly, Cornelia
2017-01-01
Computer-mediated communication and remote collaboration has become an unexceptional norm as an educational modality for distance and open education, therefore the need to research and analyze students' online learning experience is necessary. This paper seeks to examine the assumptions and expectations held by students in regard to…
Data Intensive Computing on Amazon Web Services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Magana-Zook, S. A.
The Geophysical Monitoring Program (GMP) has spent the past few years building up the capability to perform data intensive computing using what have been referred to as "big data" tools. These big data tools would be used against massive archives of seismic signals (>300 TB) to conduct research not previously possible. Examples of such tools include Hadoop (HDFS, MapReduce), HBase, Hive, Storm, Spark, Solr, and many more by the day. These tools are useful for performing data analytics on datasets that exceed the resources of traditional analytic approaches. To this end, a research big data cluster ("Cluster A") was set up as a collaboration between GMP and Livermore Computing (LC).
NASA Astrophysics Data System (ADS)
Pagnutti, Mary; Ryan, Robert E.; Cazenavette, George; Gold, Maxwell; Harlan, Ryan; Leggett, Edward; Pagnutti, James
2017-01-01
A comprehensive radiometric characterization of raw-data format imagery acquired with the Raspberry Pi 3 and V2.1 camera module is presented. The Raspberry Pi is a high-performance single-board computer designed to educate and solve real-world problems. This small computer supports a camera module that uses a Sony IMX219 8 megapixel CMOS sensor. This paper shows that scientific and engineering-grade imagery can be produced with the Raspberry Pi 3 and its V2.1 camera module. Raw imagery is shown to be linear with exposure and gain (ISO), which is essential for scientific and engineering applications. Dark frame, noise, and exposure stability assessments along with flat fielding results, spectral response measurements, and absolute radiometric calibration results are described. This low-cost imaging sensor, when calibrated to produce scientific quality data, can be used in computer vision, biophotonics, remote sensing, astronomy, high dynamic range imaging, and security applications, to name a few.
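The dark-frame and flat-field corrections described can be expressed in a few array operations. The sketch below assumes frames already decoded into NumPy arrays (the raw Bayer extraction from the camera is not shown) and uses simulated values purely to illustrate the correction.

```python
# Dark-frame subtraction and flat-field correction for raw sensor frames.
# Frames are simulated NumPy arrays; the raw Bayer decoding step is not shown.
import numpy as np

def calibrate(raw, dark, flat, flat_dark):
    """Return a relative-radiance image from raw, dark, and flat-field frames."""
    signal = raw.astype(float) - dark              # remove dark/offset signal
    flat_signal = flat.astype(float) - flat_dark
    flat_norm = flat_signal / flat_signal.mean()   # unit-mean flat field
    return signal / flat_norm                      # correct pixel-to-pixel response

rng = np.random.default_rng(7)
dark = rng.normal(64, 2, (8, 8))                   # dark frame (offset + noise)
response = 1 + 0.05 * rng.normal(size=(8, 8))      # pixel response non-uniformity
scene = 500 * response + dark                      # raw frame of a uniform scene
flat = 800 * response + dark                       # raw frame of a uniform flat target

corrected = calibrate(scene, dark, flat, dark)
print(corrected.std() / corrected.mean())          # residual non-uniformity, ~0
```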
Kennedy, C E; Yeh, P T; Johnson, C; Baggaley, R
2017-12-01
New strategies for HIV testing services (HTS) are needed to achieve UN 90-90-90 targets, including diagnosis of 90% of people living with HIV. Task-sharing HTS to trained lay providers may alleviate health worker shortages and better reach target groups. We conducted a systematic review of studies evaluating HTS by lay providers using rapid diagnostic tests (RDTs). Peer-reviewed articles were included if they compared HTS using RDTs performed by trained lay providers to HTS by health professionals, or to no intervention. We also reviewed data on end-users' values and preferences around lay providers performing HTS. Searching was conducted through 10 online databases, reviewing reference lists, and contacting experts. Screening and data abstraction were conducted in duplicate using systematic methods. Of 6113 unique citations identified, 5 studies were included in the effectiveness review and 6 in the values and preferences review. One US-based randomized trial found patients' uptake of HTS doubled with lay providers (57% vs. 27%, percent difference: 30, 95% confidence interval: 27-32, p < 0.001). In Malawi, a pre/post study showed increases in HTS sites and tests after delegation to lay providers. Studies from Cambodia, Malawi, and South Africa comparing testing quality between lay providers and laboratory staff found little discordance and high sensitivity and specificity (≥98%). Values and preferences studies generally found support for lay providers conducting HTS, particularly in non-hypothetical scenarios. Based on evidence supporting using trained lay providers, a WHO expert panel recommended lay providers be allowed to conduct HTS using HIV RDTs. Uptake of this recommendation could expand HIV testing to more people globally.
Template Interfaces for Agile Parallel Data-Intensive Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramakrishnan, Lavanya; Gunter, Daniel; Pastorello, Gilerto Z.
Tigres provides a programming library to compose and execute large-scale data-intensive scientific workflows from desktops to supercomputers. DOE User Facilities and large science collaborations are increasingly generating large enough data sets that it is no longer practical to download them to a desktop to operate on them. They are instead stored at centralized compute and storage resources such as high performance computing (HPC) centers. Analysis of this data requires an ability to run on these facilities, but with current technologies, scaling an analysis to an HPC center and to a large data set is difficult even for experts. Tigres is addressing the challenge of enabling collaborative analysis of DOE Science data through a new concept of reusable "templates" that enable scientists to easily compose, run and manage collaborative computational tasks. These templates define common computation patterns used in analyzing a data set.
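The template concept, reusable computation patterns that users fill with their own tasks, can be pictured with the generic sketch below. It is not the Tigres API; the template names and signatures here are illustrative stand-ins only.

```python
# Generic illustration of reusable workflow "templates"; not the Tigres library API.
from concurrent.futures import ProcessPoolExecutor

def sequence_template(tasks, data):
    """Run tasks one after another, piping each output into the next."""
    for task in tasks:
        data = task(data)
    return data

def parallel_template(task, inputs):
    """Apply one task to many inputs concurrently."""
    with ProcessPoolExecutor() as pool:
        return list(pool.map(task, inputs))

def clean(record):
    return record.strip().lower()

def tokenize(record):
    return record.split()

if __name__ == "__main__":
    records = ["  Alpha Beta ", "GAMMA delta  "]
    cleaned = parallel_template(clean, records)                    # parallel stage
    print([sequence_template([tokenize], r) for r in cleaned])     # sequential stage
```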
Issues central to a useful image understanding environment
NASA Astrophysics Data System (ADS)
Beveridge, J. Ross; Draper, Bruce A.; Hanson, Allen R.; Riseman, Edward M.
1992-04-01
A recent DARPA initiative has sparked interest in software environments for computer vision. The goal is a single environment to support both basic research and technology transfer. This paper lays out six fundamental attributes such a system must possess: (1) support for both C and Lisp, (2) extensibility, (3) data sharing, (4) data query facilities tailored to vision, (5) graphics, and (6) code sharing. The first three attributes fundamentally constrain the system design. Support for both C and Lisp demands some form of database or data-store for passing data between languages. Extensibility demands that system support facilities, such as spatial retrieval of data, be readily extended to new user-defined datatypes. Finally, data sharing demands that data saved by one user, including data of a user-defined type, must be readable by another user.
Rakedzon, Tzipora; Segev, Elad; Chapnik, Noam; Yosef, Roy; Baram-Tsabari, Ayelet
2017-01-01
Scientists are required to communicate science and research not only to other experts in the field, but also to scientists and experts from other fields, as well as to the public and policymakers. One fundamental suggestion when communicating with non-experts is to avoid professional jargon. However, because they are trained to speak with highly specialized language, avoiding jargon is difficult for scientists, and there is no standard to guide scientists in adjusting their messages. In this research project, we present the development and validation of the data produced by an up-to-date, scientist-friendly program for identifying jargon in popular written texts, based on a corpus of over 90 million words published in the BBC site during the years 2012-2015. The validation of results by the jargon identifier, the De-jargonizer, involved three mini studies: (1) comparison and correlation with existing frequency word lists in the literature; (2) a comparison with previous research on spoken language jargon use in TED transcripts of non-science lectures, TED transcripts of science lectures and transcripts of academic science lectures; and (3) a test of 5,000 pairs of published research abstracts and lay reader summaries describing the same article from the journals PLOS Computational Biology and PLOS Genetics. Validation procedures showed that the data classification of the De-jargonizer significantly correlates with existing frequency word lists, replicates similar jargon differences in previous studies on scientific versus general lectures, and identifies significant differences in jargon use between abstracts and lay summaries. As expected, more jargon was found in the academic abstracts than lay summaries; however, the percentage of jargon in the lay summaries exceeded the amount recommended for the public to understand the text. Thus, the De-jargonizer can help scientists identify problematic jargon when communicating science to non-experts, and be implemented by science communication instructors when evaluating the effectiveness and jargon use of participants in science communication workshops and programs.
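The underlying classification, scoring a text by the share of words that fall outside a general-frequency word list, can be sketched in a few lines. The common-word set below is a tiny stand-in, not the BBC-derived corpus behind the De-jargonizer, and the real tool's thresholds are not reproduced.

```python
# Toy jargon scorer: fraction of words not found in a common-word frequency list.
# The common-word set is a tiny stand-in for the De-jargonizer's BBC-based corpus.
import re

COMMON_WORDS = {"the", "of", "in", "we", "that", "found", "study", "cells", "levels", "protein"}

def jargon_rate(text):
    words = re.findall(r"[a-z]+", text.lower())
    jargon = [w for w in words if w not in COMMON_WORDS]
    return len(jargon) / len(words), jargon

rate, terms = jargon_rate("We found that phosphorylation of STAT3 modulates cytokine levels")
print(f"{rate:.0%} of words flagged as jargon: {terms}")
```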
Extracting the Data From the LCM vk4 Formatted Output File
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wendelberger, James G.
These are slides about extracting the data from the LCM vk4 formatted output file. The following is covered: the vk4 file produced by the Keyence VK software, custom analysis, no off-the-shelf way to read the file, reading the binary data in a vk4 file, the various offsets in decimal lines, finding the height image data directly in MATLAB, the binary output at the beginning of the height image data, color image information, color image binary data, color image decimal and binary data, MATLAB code to read a vk4 file (choose a file, read the file, compute offsets, read the optical image, the laser optical image, read and compute the laser intensity image, read the height image, timing, display the height image, display the laser intensity image, display the RGB laser optical images, display the RGB optical images, display the beginning data and save images to the workspace, gamma correction subroutine), reading intensity from the vk4 file, linear in the low range, linear in the high range, gamma correction for vk4 files, computing the gamma intensity correction, and observations.
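As a rough illustration of the offset-based reading the slides describe (in Python rather than MATLAB), the sketch below pulls a height image out of a binary file via an offset table. The header size, table slot index, and block layout used here are hypothetical placeholders, not the documented vk4 format; the real offsets must be taken from the slides.

```python
# Illustrative sketch only: the field positions used here (OFFSET_TABLE_START,
# HEIGHT_IMAGE_SLOT, block header size) are assumptions, not the real vk4 layout.
import struct
import numpy as np

OFFSET_TABLE_START = 12   # assumed: offset table begins after a 12-byte header
HEIGHT_IMAGE_SLOT = 5     # assumed: index of the height-image entry in that table

def read_u32(buf, pos):
    """Read a little-endian unsigned 32-bit integer at byte position pos."""
    return struct.unpack_from("<I", buf, pos)[0]

def read_height_image(path):
    with open(path, "rb") as fh:
        buf = fh.read()
    # Locate the height-image block via the offset table (positions are assumptions).
    block_offset = read_u32(buf, OFFSET_TABLE_START + 4 * HEIGHT_IMAGE_SLOT)
    # Assumed block layout: width, height, bit depth, then the pixel data.
    width = read_u32(buf, block_offset)
    height = read_u32(buf, block_offset + 4)
    bit_depth = read_u32(buf, block_offset + 8)
    dtype = {8: np.uint8, 16: np.uint16, 32: np.uint32}[bit_depth]
    pixel_start = block_offset + 12   # assumed size of the block header
    data = np.frombuffer(buf, dtype=dtype, count=width * height, offset=pixel_start)
    return data.reshape(height, width)
```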
Pishvar, Maya; Amirkhosravi, Mehrad; Altan, M Cengiz
2018-05-17
This work demonstrates a protocol to improve the quality of composite laminates fabricated by wet lay-up vacuum bag processes using the recently developed magnet assisted composite manufacturing (MACM) technique. In this technique, permanent magnets are utilized to apply a sufficiently high consolidation pressure during the curing stage. To enhance the intensity of the magnetic field, and thus, to increase the magnetic compaction pressure, the magnets are placed on a magnetic top plate. First, the entire procedure of preparing the composite lay-up on a magnetic bottom steel plate using the conventional wet lay-up vacuum bag process is described. Second, placement of a set of Neodymium-Iron-Boron permanent magnets, arranged in alternating polarity, on the vacuum bag is illustrated. Next, the experimental procedures to measure the magnetic compaction pressure and volume fractions of the composite constituents are presented. Finally, methods used to characterize microstructure and mechanical properties of composite laminates are discussed in detail. The results prove the effectiveness of the MACM method in improving the quality of wet lay-up vacuum bag laminates. This method does not require large capital investment for tooling or equipment and can also be used to consolidate geometrically complex composite parts by placing the magnets on a matching top mold positioned on the vacuum bag.
Nesting frequency and success: implications for the demography of painted turtles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tinkle, D.W.; Congdon, J.D.; Rosen, P.C.
1981-12-01
Nesting ecology and reproduction of painted turtles (Chrysemys picta) in southeast Michigan were intensively studied from 1975 to 1978. The average clutch size of Michigan painted turtles was 7.55, with body size accounting for only 9-13% of the variance. Data on nesting frequency indicate that from 30 to 50% of the females possibly do not reproduce every year and that approximately 6% reproduce twice in a given year. Predation within 48 h of egg-laying is responsible for the failure of 20% of the nests. An additional 12% nest failure is due to various other causes. These data substantially alter the life table previously reported for this population of painted turtles.
Ueberschär, Karl-Heinz; Dänicke, Sven; Matthes, Siegfried
2007-02-01
Technical short chain chlorinated paraffins (C10-C13 with 60% chlorine) were fed to 93 laying hens from 24 to 32 weeks of age in increasing concentrations of up to 100 mg/kg feed. No significant influence on health, relative organ weights or performance (laying intensity, egg weight, feed consumption) was noted. The chlorinated paraffin content of the tissues was linearly related to the concentration of short chain paraffins in the feed. The highest concentrations were found in abdominal fat, egg yolk and fatty tissues. Breast muscle, egg albumen and bile fluid contained minimal or no residues. Less than 1% of the chlorinated paraffins ingested were incorporated into the body (without head, feet, gut and feathers), whereas about 1.5% were eliminated with the egg yolk and 30% were excreted with urine and faeces. A six-week kinetic depuration study revealed a biphasic elimination with half-lives of 4-40 min (liver, kidneys, legs, fat, blood) for the initial rapid phase, and 15-30 days (blood, fat, liver, yolk, kidneys, legs) for the terminal slow phase.
Effects of selective serotonin antagonism on central neurotransmission
USDA-ARS?s Scientific Manuscript database
Aggression and cannibalism in laying hens can differ in intensity and degree due to many factors, including genetics. Behavioral analysis of DeKalb XL (DXL) and high group productivity and survivability (HGPS) strains revealed high and low aggressiveness, respectively. However, the exact genetic me...
The large scale microelectronics Computer-Aided Design and Test (CADAT) system
NASA Technical Reports Server (NTRS)
Gould, J. M.
1978-01-01
The CADAT system consists of a number of computer programs written in FORTRAN that provide the capability to simulate, lay out, analyze, and create the artwork for large scale microelectronics. The function of each software component of the system is described with references to specific documentation for each software component.
Barua, Animesh; Yellapa, Aparna; Bahr, Janice M; Adur, Malavika K; Utterback, Chet W; Bitterman, Pincas; Basu, Sanjib; Sharma, Sameer; Abramowicz, Jacques S
2015-01-01
Limited resolution of transvaginal ultrasound (TVUS) scanning is a significant barrier to early detection of ovarian cancer (OVCA). Contrast agents have been suggested to improve the resolution of TVUS scanning. Emerging evidence suggests that expression of interleukin 16 (IL-16) by the tumor epithelium and microvessels increases in association with OVCA development and offers a potential target for early OVCA detection. The goal of this study was to examine the feasibility of IL-16-targeted contrast agents in enhancing the intensity of ultrasound imaging from ovarian tumors in hens, a model of spontaneous OVCA. Contrast agents were developed by conjugating biotinylated anti-IL-16 antibodies with streptavidin coated microbubbles. Enhancement of ultrasound signal intensity was determined before and after injection of contrast agents. Following scanning, ovarian tissues were processed for the detection of IL-16 expressing cells and microvessels. Compared with precontrast, contrast imaging enhanced ultrasound signal intensity significantly in OVCA hens at early (P < 0.05) and late stages (P < 0.001). Higher intensities of ultrasound signals in OVCA hens were associated with increased frequencies of IL-16 expressing cells and microvessels. These results suggest that IL-16-targeted contrast agents improve the visualization of ovarian tumors. The laying hen may be a suitable model to test new imaging agents and develop targeted anti-OVCA therapeutics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tong, Dudu; Yang, Sichun; Lu, Lanyuan
2016-06-20
Structure modelling via small-angle X-ray scattering (SAXS) data generally requires intensive computations of scattering intensity from any given biomolecular structure, where the accurate evaluation of SAXS profiles using coarse-grained (CG) methods is vital to improve computational efficiency. To date, most CG SAXS computing methods have been based on a single-bead-per-residue approximation but have neglected structural correlations between amino acids. To improve the accuracy of scattering calculations, accurate CG form factors of amino acids are now derived using a rigorous optimization strategy, termed electron-density matching (EDM), to best fit electron-density distributions of protein structures. This EDM method is compared with and tested against other CG SAXS computing methods, and the resulting CG SAXS profiles from EDM agree better with all-atom theoretical SAXS data. By including the protein hydration shell represented by explicit CG water molecules and the correction of protein excluded volume, the developed CG form factors also reproduce the selected experimental SAXS profiles with very small deviations. Taken together, these EDM-derived CG form factors present an accurate and efficient computational approach for SAXS computing, especially when higher molecular details (represented by the q range of the SAXS data) become necessary for effective structure modelling.
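The abstract centres on evaluating SAXS profiles from coarse-grained beads with per-residue form factors. As a point of reference (not the EDM method itself, whose fitted form factors and hydration/excluded-volume corrections are described in the paper), the sketch below evaluates the standard Debye equation, I(q) = Σ_i Σ_j f_i(q) f_j(q) sin(q r_ij)/(q r_ij), for an arbitrary set of beads; the array layout is an assumption for illustration.

```python
import numpy as np

def debye_intensity(q_values, coords, form_factors):
    """Evaluate a coarse-grained SAXS profile with the Debye equation.

    q_values     : (Nq,) momentum-transfer values.
    coords       : (Nbead, 3) bead coordinates (e.g., one bead per residue).
    form_factors : (Nbead, Nq) coarse-grained form factors f_i(q).
    """
    diff = coords[:, None, :] - coords[None, :, :]
    r_ij = np.sqrt((diff ** 2).sum(axis=-1))            # pairwise distances, (N, N)
    intensity = np.empty(len(q_values), dtype=float)
    for k, q in enumerate(q_values):
        f = form_factors[:, k]
        x = q * r_ij
        # sin(x)/x with the i == j (x == 0) terms set to 1.
        sinc = np.where(x > 1e-12, np.sin(x) / np.where(x > 1e-12, x, 1.0), 1.0)
        intensity[k] = f @ sinc @ f
    return intensity

# Tiny illustrative call: two identical beads 5 units apart with flat form factors.
q = np.linspace(0.01, 0.5, 5)
xyz = np.array([[0.0, 0.0, 0.0], [5.0, 0.0, 0.0]])
f = np.ones((2, len(q)))
print(debye_intensity(q, xyz, f))
```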
Computational approaches for understanding the diagnosis and treatment of Parkinson’s disease
Smith, Stephen L.; Lones, Michael A.; Bedder, Matthew; Alty, Jane E.; Cosgrove, Jeremy; Maguire, Richard J.; Pownall, Mary Elizabeth; Ivanoiu, Diana; Lyle, Camille; Cording, Amy; Elliott, Christopher J.H.
2015-01-01
This study describes how the application of evolutionary algorithms (EAs) can be used to study motor function in humans with Parkinson’s disease (PD) and in animal models of PD. Human data is obtained using commercially available sensors via a range of non-invasive procedures that follow conventional clinical practice. EAs can then be used to classify human data for a range of uses, including diagnosis and disease monitoring. New results are presented that demonstrate how EAs can also be used to classify fruit flies with and without genetic mutations that cause Parkinson’s by using measurements of the proboscis extension reflex. The case is made for a computational approach that can be applied across human and animal studies of PD and lays the way for evaluation of existing and new drug therapies in a truly objective way. PMID:26577157
Direct computation of turbulence and noise
NASA Technical Reports Server (NTRS)
Berman, C.; Gordon, G.; Karniadakis, G.; Batcho, P.; Jackson, E.; Orszag, S.
1991-01-01
Jet exhaust turbulence noise is computed using a time dependent solution of the three dimensional Navier-Stokes equations to supply the source terms for an acoustic computation based on the Phillips convected wave equation. An extrapolation procedure is then used to determine the far field noise spectrum in terms of the near field sound. This will lay the groundwork for studies of more complex flows typical of noise suppression nozzles.
Domestic violence: recognition, intervention, and prevention.
Smith, M; Martin, F
1995-02-01
Domestic violence is a significant social and health problem that has received intensive recent publicity in the lay media. Nurses should play a major role in primary, secondary, and tertiary prevention interventions. Intensified health promotion and public policy initiatives can reduce the incidence of domestic violence in the future.
Xafis, Vicki
2015-11-17
A key ethical issue arising in data linkage research relates to consent requirements. Patients' consent preferences in the context of health research have been explored but their consent preferences regarding data linkage specifically have been under-explored. In addition, the views on data linkage are often those of patient groups. As a result, little is known about lay people's views and their preferences about consent requirements in the context of data linkage. This study explores lay people's views and justifications regarding the acceptability of conducting data linkage research without obtaining consent. A qualitative study explored lay people's views regarding consent requirements in data linkage via four hypothetical data linkage scenarios of increasing complexity. Prior to considering the scenarios, participants were provided with information regarding best practice data linkage processes via discussion and a diagrammatic representation of the process. Lay people were able to understand the intricate processes involved in data linkage and the key protections afforded within a short amount of time. They were supportive of data linkage research and, on the whole, believed it should be conducted without consent provided a data linkage organization de-identifies the data used so that researchers do not handle identifiable data. Many thought that de-identified data holds a different status to identifiable data and should be used without specific consent in research that aims to benefit society. In weighing up conflicting values and interests, participants shifted consent preferences before arriving at their final consent preference for each scenario and provided justifications for their choices. They considered the protection of people's information, societal benefits, and the nature and constraints of research and recognized that these need to be balanced. With some exposure to the features of data linkage, lay people have the capacity to understand the processes sufficiently in order to consider ethical issues associated with consent preferences. Shifts in views reveal the complexity of such decisions. While privacy protection remained an important consideration for most participants, adequate protection measures adopted in best practice data linkage were viewed by most as protection enough for data linkage to proceed without specific individual consent.
2008-02-27
between the PHY layer and, for example, a host PC computer. The PC wants to generate and receive a sequence of data packets. The PC may also want to send...the testbed is quite similar. Given the intense computational requirements of SVD and other matrix mode operations needed to support eigen spreading a...platform for real time operation. This task is probably the major challenge in the development of the testbed. All compute-intensive tasks will be
Uysal, Ahmet; Ascigil, Esra; Turunc, Gamze
2017-04-01
The present research examined the effect of spousal autonomy support on the need satisfaction and well-being of individuals with chronic pain. Married individuals with a diagnosed musculoskeletal chronic pain condition (N = 109) completed a baseline questionnaire and a follow-up questionnaire after a 6-month time period. Cross-lagged analyses indicated that spousal autonomy support predicted increases in basic need satisfaction, and need satisfaction predicted increases in well-being. Moreover, the analyses in the opposite direction were not significant. Similarly, cross-lagged analyses were more supportive of the direction from pain intensity to lower well-being, rather than well-being to pain intensity. Finally, we tested a longitudinal structural model using pain intensity and spousal autonomy support as the predictors, basic needs as the mediator, and well-being as the outcome. The model provided a good fit to the data. Results showed that spousal autonomy support had a positive effect on the need satisfaction and well-being of individuals with chronic pain, independent of pain intensity. These findings extend self-determination theory to the chronic pain context and lay the groundwork for future chronic pain studies using the self-determination theory framework.
Chang, Shen-Chang; Chiang, Hsin-I; Lin, Min-Jung; Jea, Yu-Shine; Chen, Lih-Ren; Fan, Yang-Kwang; Lee, Tzu-Tai
2016-07-01
The objective of this study is to investigate the effects of short light regimes and lower dietary protein content on the reproductive performance of White Roman geese in an environment-controlled house. Thirty-two ganders and 80 geese during the third laying period were allotted into 16 pens, randomly assigned into a split-plot design with two different lighting regimes: (1) short light regimes (SL) with 6.5h of light and 17.5h of dark (6.5L:17.5D), and (2) long light regimes (LL) with 19L:5D during the 6-wk prelaying period, followed by two different levels of protein diets (low CP: 15% vs. high CP: 18%) for the laying period. The results showed that birds treated with the SL light regime had a heavier body weight compared to those treated with LL at the arrival of the peak period of egg production (6.19 vs. 5.87kg, P<0.05). Geese under LL had a longer laying period than those under SL treatment (277 vs. 175 days, P<0.05), while the geese under SL treatment had a higher laying intensity (15.4% vs. 12.6%, P<0.05), fertility and hatchability than those under LL treatment. Our results suggest that a 6-wk short light regime during the prelaying period combined with a low CP diet during the laying period is sufficient to sustain regular reproductive performance in White Roman geese, which would benefit goose farmers in terms of energy saving and a prolonged laying period. Copyright © 2016. Published by Elsevier B.V.
Predicting tree mortality following gypsy moth defoliation
D.E. Fosbroke; R.R. Hicks; K.W. Gottschalk
1991-01-01
Appropriate application of gypsy moth control strategies requires an accurate prediction of the distribution and intensity of tree mortality prior to defoliation. This prior information is necessary to better target investments in control activities where they are needed. This poster lays the groundwork for developing hazard-rating systems for forests of the...
Wongkanya, Rapeeporn; Pankam, Tippawan; Wolf, Shauna; Pattanachaiwit, Supanit; Jantarapakde, Jureeporn; Pengnongyang, Supabhorn; Thapwong, Prasopsuk; Udomjirasirichot, Apichat; Churattanakraisri, Yutthana; Prawepray, Nanthika; Paksornsit, Apiluk; Sitthipau, Thidadaow; Petchaithong, Sarayut; Jitsakulchaidejt, Raruay; Nookhai, Somboon; Lertpiriyasuwat, Cheewanan; Ongwandee, Sumet; Phanuphak, Praphan; Phanuphak, Nittaya
2018-01-01
Introduction: Rapid diagnostic testing (RDT) for HIV has a quick turn-around time, which increases the proportion of people testing who receive their result. HIV RDT in Thailand has traditionally been performed only by medical technologists (MTs), which is a barrier to its being scaled up. We evaluated the performance of HIV RDT conducted by trained lay providers who were members of, or worked closely with, a group of men who have sex with men (MSM) and with transgender women (TG) communities, and compared it to tests conducted by MTs. Methods: Lay providers received a 3-day intensive training course on how to perform a finger-prick blood collection and an HIV RDT as part of the Key Population-led Health Services (KPLHS) programme among MSM and TG. All the samples were tested by lay providers using Alere Determine HIV 1/2. HIV-reactive samples were confirmed by DoubleCheckGold Ultra HIV 1&2 and SD Bioline HIV 1/2. All HIV-positive and 10% of HIV-negative samples were re-tested by MTs using Serodia HIV 1/2. Results: Of 1680 finger-prick blood samples collected and tested using HIV RDT by lay providers in six drop-in centres in Bangkok, Chiang Mai, Chonburi and Songkhla, 252 (15%) were HIV-positive. MTs re-tested these HIV-positive samples and 143 randomly selected HIV-negative samples with 100% concordant test results. Conclusion: Lay providers in Thailand can be trained and empowered to perform HIV RDT as they were found to achieve comparable results in sample testing with MTs. Based on the task-shifting concept, this rapid HIV testing performed by lay providers as part of the KPLHS programme has great potential to enhance HIV prevention and treatment programmes among key at-risk populations.
Yang, Shuai; Zhang, Xinlei; Diao, Lihong; Guo, Feifei; Wang, Dan; Liu, Zhongyang; Li, Honglei; Zheng, Junjie; Pan, Jingshan; Nice, Edouard C; Li, Dong; He, Fuchu
2015-09-04
The Chromosome-centric Human Proteome Project (C-HPP) aims to catalog genome-encoded proteins using a chromosome-by-chromosome strategy. As the C-HPP proceeds, the increasing requirement for data-intensive analysis of the MS/MS data poses a challenge to the proteomic community, especially small laboratories lacking computational infrastructure. To address this challenge, we have updated the previous CAPER browser into a higher version, CAPER 3.0, which is a scalable cloud-based system for data-intensive analysis of C-HPP data sets. CAPER 3.0 uses cloud computing technology to facilitate MS/MS-based peptide identification. In particular, it can use both public and private cloud, facilitating the analysis of C-HPP data sets. CAPER 3.0 provides a graphical user interface (GUI) to help users transfer data, configure jobs, track progress, and visualize the results comprehensively. These features enable users without programming expertise to easily conduct data-intensive analysis using CAPER 3.0. Here, we illustrate the usage of CAPER 3.0 with four specific mass spectral data-intensive problems: detecting novel peptides, identifying single amino acid variants (SAVs) derived from known missense mutations, identifying sample-specific SAVs, and identifying exon-skipping events. CAPER 3.0 is available at http://prodigy.bprc.ac.cn/caper3.
Calibration of Clinical Audio Recording and Analysis Systems for Sound Intensity Measurement.
Maryn, Youri; Zarowski, Andrzej
2015-11-01
Sound intensity is an important acoustic feature of voice/speech signals. Yet recordings are performed with different microphone, amplifier, and computer configurations, and it is therefore crucial to calibrate sound intensity measures of clinical audio recording and analysis systems on the basis of output of a sound-level meter. This study was designed to evaluate feasibility, validity, and accuracy of calibration methods, including audiometric speech noise signals and human voice signals under typical speech conditions. Calibration consisted of 3 comparisons between data from 29 measurement microphone-and-computer systems and data from the sound-level meter: signal-specific comparison with audiometric speech noise at 5 levels, signal-specific comparison with natural voice at 3 levels, and cross-signal comparison with natural voice at 3 levels. Intensity measures from recording systems were then linearly converted into calibrated data on the basis of these comparisons, and validity and accuracy of calibrated sound intensity were investigated. Very strong correlations and quasisimilarity were found between calibrated data and sound-level meter data across calibration methods and recording systems. Calibration of clinical sound intensity measures according to this method is feasible, valid, accurate, and representative for a heterogeneous set of microphones and data acquisition systems in real-life circumstances with distinct noise contexts.
NASA Astrophysics Data System (ADS)
Vilotte, J.-P.; Atkinson, M.; Michelini, A.; Igel, H.; van Eck, T.
2012-04-01
Increasingly dense seismic and geodetic networks are continuously transmitting a growing wealth of data from around the world. The multi-use of these data led the seismological community to pioneer globally distributed open-access data infrastructures, standard services and formats, e.g., the Federation of Digital Seismic Networks (FDSN) and the European Integrated Data Archives (EIDA). Our ability to acquire observational data outpaces our ability to manage, analyze and model them. Research in seismology is today facing a fundamental paradigm shift. Enabling advanced data-intensive analysis and modeling applications challenges conventional storage, computation and communication models and requires a new holistic approach. It is instrumental to exploit the cornucopia of data, and to guarantee optimal operation and design of the high-cost monitoring facilities. The strategy of VERCE is driven by the needs of the seismological data-intensive applications in data analysis and modeling. It aims to provide a comprehensive architecture and framework adapted to the scale and the diversity of those applications, and integrating the data infrastructures with Grid, Cloud and HPC infrastructures. It will allow prototyping solutions for new use cases as they emerge within the European Plate Observatory Systems (EPOS), the ESFRI initiative of the solid Earth community. Computational seismology, and information management, is increasingly revolving around massive amounts of data that stem from: (1) the flood of data from the observational systems; (2) the flood of data from large-scale simulations and inversions; (3) the ability to economically store petabytes of data online; (4) the evolving Internet and data-aware computing capabilities. As data-intensive applications are rapidly increasing in scale and complexity, they require additional service-oriented architectures offering a virtualization-based flexibility for complex and re-usable workflows. Scientific information management poses computer science challenges: acquisition, organization, query and visualization tasks scale almost linearly with the data volumes. The commonly used FTP-GREP metaphor today allows scanning of gigabyte-sized datasets but will not work for scanning terabyte-sized continuous waveform datasets. New data analysis and modeling methods, exploiting the signal coherence within dense network arrays, are nonlinear. Pair algorithms on N points scale as N^2. Waveform inversion and stochastic simulations raise computing and data handling challenges. These applications are unfeasible for tera-scale datasets without new parallel algorithms that use near-linear processing, storage and bandwidth, and that can exploit new computing paradigms enabled by the intersection of several technologies (HPC, parallel scalable database crawler, data-aware HPC). These issues will be discussed based on a number of core pilot data-intensive applications and use cases retained in VERCE. These core applications are related to: (1) data processing and data analysis methods based on correlation techniques; (2) CPU-intensive applications such as large-scale simulation of synthetic waveforms in complex earth systems, and full waveform inversion and tomography. We shall analyze their workflow and data flow, and their requirements for a new service-oriented architecture and a data-aware platform with services and tools.
Finally, we will outline the importance of a new collaborative environment between seismology and computer science, together with the need for the emergence and the recognition of 'research technologists' mastering the evolving data-aware technologies and the data-intensive research goals in seismology.
Efficient Memory Access with NumPy Global Arrays using Local Memory Access
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daily, Jeffrey A.; Berghofer, Dan C.
This paper discusses the work completed on Global Arrays of data on distributed multi-computer systems and on improving their performance. The tasks were completed at Pacific Northwest National Laboratory in the Science Undergraduate Laboratory Internship program in the summer of 2013 for the Data Intensive Computing Group in the Fundamental and Computational Sciences Directorate. This work was done on the Global Arrays Toolkit developed by this group. This toolkit is an interface for programmers to more easily create arrays of data on networks of computers. This is useful because scientific computation is often done on large amounts of data, sometimes so large that individual computers cannot hold all of it. This data is held in array form and can best be processed on supercomputers, which often consist of a network of individual computers doing their computation in parallel. One major challenge for this sort of programming is that operations on arrays spread across multiple computers are very complex, and an interface is needed so that these arrays seem like they are on a single computer. This is what Global Arrays does. The work done here is to use more efficient operations on that data that require less copying of data to be completed. This saves a lot of time because copying data across many different computers is time intensive. The way this challenge was solved is that when the data to be operated on with binary operations are on the same computer, they are not copied when they are accessed; when they are on separate computers, only one set is copied when accessed. This saves time through less copying, although more data access operations are performed.
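The copy-avoidance idea described above can be illustrated without the Global Arrays Toolkit itself: the sketch below (plain NumPy, not the GA API) contrasts operating on a copied patch of an array, which needs an explicit write-back, with operating on a local view that shares the underlying storage.

```python
import numpy as np

# A plain-NumPy illustration of the local-access idea (not the Global Arrays API):
# a slice of a NumPy array is a view that shares memory with the original, so
# operating on it in place avoids the copy that remote-style access would make.
global_like = np.arange(1_000_000, dtype=np.float64)

# Remote-style access: materialise a private copy of a patch, operate, write back.
patch_copy = global_like[1000:2000].copy()
patch_copy *= 2.0
global_like[1000:2000] = patch_copy        # explicit write-back required

# Local-style access: obtain a view of the same patch and update it in place.
patch_view = global_like[2000:3000]        # no data is copied here
patch_view *= 2.0                          # the "global" array is updated directly

assert patch_view.base is global_like      # confirms the view shares storage
```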
Suppadit, Tawadchai; Jaturasitha, Sanchai; Sunthorn, Napassawan; Poungsuk, Pukkapong
2012-10-01
Wolffia arrhiza meal (WAM) was evaluated as a protein replacement for soybean meal (SBM) in the diet of laying Japanese quails. A total of 480 4-week-old laying quails were randomly allocated to form six groups in a completely randomized design. Each group contained four replicates, with 20 quails per replicate. WAM was incorporated into the diets at levels of 0, 4.00, 8.00, 12.0, 16.0 and 20.0%. The results showed that feed intake per bird per day, daily egg-laying rate, feed cost per 100 egg weight, egg width, egg length, egg weight, eggshell thickness, yolk height and shell quality characteristics in terms of breaking time, Young's modulus, work, maximum force, fracturability, breaking stress, stiffness and power showed no statistically significant differences (P > 0.05) among the 0 to 16.0% levels of WAM. However, these performance measures were significantly lower with 20.0% WAM in the formulated ration (P < 0.05). Mortality showed no significant differences among dietary treatments (P > 0.05). The color intensity of the yolk increased as SBM was replaced with increasing amounts of WAM (P < 0.05). In conclusion, WAM could be successfully used in place of SBM. However, the amount used should not exceed 16.0%.
CT to Cone-beam CT Deformable Registration With Simultaneous Intensity Correction
Zhen, Xin; Gu, Xuejun; Yan, Hao; Zhou, Linghong; Jia, Xun; Jiang, Steve B.
2012-01-01
Computed tomography (CT) to cone-beam computed tomography (CBCT) deformable image registration (DIR) is a crucial step in adaptive radiation therapy. Current intensity-based registration algorithms, such as demons, may fail in the context of CT-CBCT DIR because of inconsistent intensities between the two modalities. In this paper, we propose a variant of demons, called Deformation with Intensity Simultaneously Corrected (DISC), to deal with CT-CBCT DIR. DISC distinguishes itself from the original demons algorithm by performing an adaptive intensity correction step on the CBCT image at every iteration step of the demons registration. Specifically, the intensity correction of a voxel in CBCT is achieved by matching the first and the second moments of the voxel intensities inside a patch around the voxel with those on the CT image. It is expected that such a strategy can remove artifacts in the CBCT image, as well as ensure the intensity consistency between the two modalities. DISC is implemented on computer graphics processing units (GPUs) in the compute unified device architecture (CUDA) programming environment. The performance of DISC is evaluated on a simulated patient case and data from six clinical head-and-neck cancer patients. It is found that DISC is robust against the CBCT artifacts and intensity inconsistency and significantly improves the registration accuracy when compared with the original demons. PMID:23032638
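The abstract describes the intensity-correction step only at a high level: each CBCT voxel is adjusted so that the first and second moments of a surrounding patch match those of the corresponding CT patch. One plausible, simplified reading of that step is sketched below; the exact formula, patch size, and edge handling in DISC are not specified in the abstract, so treat this as an assumption-laden illustration rather than the published algorithm.

```python
import numpy as np

def moment_match_voxel(cbct_patch, ct_patch, eps=1e-6):
    """Correct the centre voxel of a CBCT patch by matching the patch's mean and
    standard deviation (first and second moments) to those of the CT patch."""
    mu_cbct, sd_cbct = cbct_patch.mean(), cbct_patch.std()
    mu_ct, sd_ct = ct_patch.mean(), ct_patch.std()
    centre = tuple(s // 2 for s in cbct_patch.shape)
    v = cbct_patch[centre]
    # Standardise against the CBCT patch statistics, then rescale to the CT statistics.
    return (v - mu_cbct) / (sd_cbct + eps) * sd_ct + mu_ct

# Example: a 5x5x5 CBCT patch that is darker and noisier than its CT counterpart.
rng = np.random.default_rng(0)
ct = rng.normal(40.0, 5.0, size=(5, 5, 5))
cbct = rng.normal(10.0, 15.0, size=(5, 5, 5))
print(moment_match_voxel(cbct, ct))
```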
Moving GIS Research Indoors: Spatiotemporal Analysis of Agricultural Animals
Daigle, Courtney L.; Banerjee, Debasmit; Montgomery, Robert A.; Biswas, Subir; Siegford, Janice M.
2014-01-01
A proof of concept applying wildlife ecology techniques to animal welfare science in intensive agricultural environments was conducted using non-cage laying hens. Studies of wildlife ecology regularly use Geographic Information Systems (GIS) to assess wild animal movement and behavior within environments with relatively unlimited space and finite resources. However, rather than depicting landscapes, a GIS could be developed in animal production environments to provide insight into animal behavior as an indicator of animal welfare. We developed a GIS-based approach for studying agricultural animal behavior in an environment with finite space and unlimited resources. Concurrent data from wireless body-worn location tracking sensor and video-recording systems, which depicted spatially-explicit behavior of hens (135 hens/room) in two identical indoor enclosures, were collected. The spatial configuration of specific hen behaviors, variation in home range patterns, and variation in home range overlap show that individual hens respond to the same environment differently. Such information could catalyze management practice adjustments (e.g., modifying feeder design and/or location). Genetically-similar hens exhibited diverse behavioral and spatial patterns via a proof of concept approach enabling detailed examinations of individual non-cage laying hen behavior and welfare. PMID:25098421
Diabetes self-management education: acceptability of using trained lay educators.
Mandalia, P K; Stone, M A; Davies, M J; Khunti, K; Carey, M E
2014-11-01
The use of lay people to deliver education programmes for people with chronic conditions is a potential method of addressing healthcare staff capacity and increasing the cost efficiency of delivering education. This qualitative substudy is embedded within an equivalence trial (2008-2011 including development stage). In the qualitative substudy, we aimed to elicit the views of key stakeholders (patients, educators) about using lay people to deliver education to people recently diagnosed with type 2 diabetes, alongside a healthcare professional educator with an equal role. In this way, we sought to explore perceptions about acceptability and also contribute to understanding the reasons underlying positive or negative quantitative findings from main trial. We conducted 27 telephone interviews with a purposive sample of patients, lay educators and healthcare professional educators involved in the main trial. Thematic analysis of transcribed data was underpinned by the constant comparative approach and structured using Framework methodology. Overall, the data suggested that the use of lay educators was acceptable to educators and patients. Perceived difference in knowledge levels between lay and healthcare professional educators did not appear to have an impact on perceived acceptability or the effectiveness of the education received. Additional themes explored were related to peer status of educators and feasibility. Some concerns were raised about lay educators with diabetes, transferring personal issues and about the impact of healthcare professional time taken up by mentoring and supporting lay educators. Positive perceptions about the use of lay educators support the positive quantitative findings from the main trial. Acceptability is an important consideration in relation to implementation of the model of delivery studied. Concerns raised within the interviews should be considered in the design of training for lay educators. ISRCTN 99350009. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pike, Bill
Data—lots of data—generated in seconds and piling up on the internet, streaming and stored in countless databases. Big data is important for commerce, society and our nation’s security. Yet the volume, velocity, variety and veracity of data is simply too great for any single analyst to make sense of alone. It requires advanced, data-intensive computing. Simply put, data-intensive computing is the use of sophisticated computers to sort through mounds of information and present analysts with solutions in the form of graphics, scenarios, formulas, new hypotheses and more. This scientific capability is foundational to PNNL’s energy, environment and security missions. Senior Scientist and Division Director Bill Pike and his team are developing analytic tools that are used to solve important national challenges, including cyber systems defense, power grid control systems, intelligence analysis, climate change and scientific exploration.
The Caenorhabditis elegans interneuron ALA is (also) a high-threshold mechanosensor
2013-01-01
Background: To survive dynamic environments, it is essential for all animals to appropriately modulate their behavior in response to various stimulus intensities. For instance, the nematode Caenorhabditis elegans suppresses the rate of egg-laying in response to intense mechanical stimuli, in a manner dependent on the mechanosensory neurons FLP and PVD. We have found that the unilaterally placed single interneuron ALA acted as a high-threshold mechanosensor, and that it was required for this protective behavioral response. Results: ALA was required for the inhibition of egg-laying in response to a strong (picking-like) mechanical stimulus, characteristic of routine handling of the animals. Moreover, ALA did not respond physiologically to less intense touch stimuli, but exhibited distinct physiological responses to anterior and posterior picking-like touch, suggesting that it could distinguish between spatially separated stimuli. These responses required neither neurotransmitter nor neuropeptide release from potential upstream neurons. In contrast, the long, bilaterally symmetric processes of ALA itself were required for producing its physiological responses; when they were severed, responses to stimuli administered between the cut and the cell body were unaffected, while responses to stimuli administered posterior to the cut were abolished. Conclusion: C. elegans neurons are typically classified into three major groups: sensory neurons with specialized sensory dendrites, interneurons, and motoneurons with neuromuscular junctions. Our findings suggest that ALA can autonomously sense intense touch and is thus a dual-function neuron, i.e., an interneuron as well as a novel high-threshold mechanosensor. PMID:24341457
The Caenorhabditis elegans interneuron ALA is (also) a high-threshold mechanosensor.
Sanders, Jarred; Nagy, Stanislav; Fetterman, Graham; Wright, Charles; Treinin, Millet; Biron, David
2013-12-17
To survive dynamic environments, it is essential for all animals to appropriately modulate their behavior in response to various stimulus intensities. For instance, the nematode Caenorhabditis elegans suppresses the rate of egg-laying in response to intense mechanical stimuli, in a manner dependent on the mechanosensory neurons FLP and PVD. We have found that the unilaterally placed single interneuron ALA acted as a high-threshold mechanosensor, and that it was required for this protective behavioral response. ALA was required for the inhibition of egg-laying in response to a strong (picking-like) mechanical stimulus, characteristic of routine handling of the animals. Moreover, ALA did not respond physiologically to less intense touch stimuli, but exhibited distinct physiological responses to anterior and posterior picking-like touch, suggesting that it could distinguish between spatially separated stimuli. These responses required neither neurotransmitter nor neuropeptide release from potential upstream neurons. In contrast, the long, bilaterally symmetric processes of ALA itself were required for producing its physiological responses; when they were severed, responses to stimuli administered between the cut and the cell body were unaffected, while responses to stimuli administered posterior to the cut were abolished. C. elegans neurons are typically classified into three major groups: sensory neurons with specialized sensory dendrites, interneurons, and motoneurons with neuromuscular junctions. Our findings suggest that ALA can autonomously sense intense touch and is thus a dual-function neuron, i.e., an interneuron as well as a novel high-threshold mechanosensor.
Nurse led versus lay educators support for those with asthma in primary care: a costing study
2012-01-01
Background: Regular review and support for asthma self-management is promoted in guidelines. A randomised controlled trial suggested that unscheduled health care usage was similar when patients were offered self-management support by a lay-trainer or by practice nurses. Methods: Following the RCT, a costing study was undertaken using the trial data to account for the cost of delivery of the service under both strategies and the resulting impact on unscheduled healthcare (the measure of effectiveness) in this trial. Results: One year of data (n = 418) showed that 29% (61/205) of the nurse group required unscheduled healthcare (177 events) compared with 30.5% (65/213) for lay-trainers (178 events). The training costs for the lay-trainers were greater than for nurses (£36 versus £18 per patient, respectively, p<0.001); however, the consultation costs for lay-trainers were lower than for nurses (£6 per patient versus £24, p<0.001). If the costs of unscheduled healthcare are accounted for, then the cost for nurses is £161, and £135 for lay-trainers (mean difference £25, [95% CI = −£97, £149, p = 0.681]). The total costs (delivery and unscheduled healthcare) were £202 per patient for nurses versus £178 for lay-trainers (mean difference £24, [95% CI = −£100, £147, p = 0.707]). Conclusions: There were no significant differences in the cost of training and healthcare delivery between nurse and lay trainers, and no significant difference in the cost of unscheduled health care use. PMID:22958541
Challenges in reusing transactional data for daily documentation in neonatal intensive care.
Kim, G R; Lawson, E E; Lehmann, C U
2008-11-06
The reuse of transactional data for clinical documentation requires navigation of computational, institutional and adaptive barriers. We describe organizational and technical issues in developing and deploying a daily progress note tool in a tertiary neonatal intensive care unit that reuses and aggregates data from a commercial integrated clinical information system.
Analysis of Delays in Transmitting Time Code Using an Automated Computer Time Distribution System
1999-12-01
jlevine@clock.bldrdoc.gov Abstract An automated computer time distribution system broadcasts standard time to users using computers and modems via...contributed to delays - software platform (50% of the delay), transmission speed of time codes (25%), telephone network (15%), modem and others (10%). The... modems, and telephone lines. Users dial the ACTS server to receive time traceable to the national time scale of Singapore, UTC(PSB). The users can in
How the Public Engages With Brain Optimization
O’Connor, Cliodhna
2015-01-01
In the burgeoning debate about neuroscience’s role in contemporary society, the issue of brain optimization, or the application of neuroscientific knowledge and technologies to augment neurocognitive function, has taken center stage. Previous research has characterized media discourse on brain optimization as individualistic in ethos, pressuring individuals to expend calculated effort in cultivating culturally desirable forms of selves and bodies. However, little research has investigated whether the themes that characterize media dialogue are shared by lay populations. This article considers the relationship between the representations of brain optimization that surfaced in (i) a study of British press coverage between 2000 and 2012 and (ii) interviews with forty-eight London residents. Both data sets represented the brain as a resource that could be manipulated by the individual, with optimal brain function contingent on applying self-control in one’s lifestyle choices. However, these ideas emerged more sharply in the media than in the interviews: while most interviewees were aware of brain optimization practices, few were committed to carrying them out. The two data sets diverged in several ways: the media’s intense preoccupation with optimizing children’s brains was not apparent in lay dialogue, while interviewees elaborated beliefs about the underuse of brain tissue that showed no presence in the media. This article considers these continuities and discontinuities in light of their wider cultural significance and their implications for the media–mind relationship in public engagement with neuroscience. PMID:26336326
Lay Health Influencers: How They Tailor Brief Tobacco Cessation Interventions
Yuan, Nicole P.; Castañeda, Heide; Nichter, Mark; Nichter, Mimi; Wind, Steven; Carruth, Lauren; Muramoto, Myra
2014-01-01
Interventions tailored to individual smoker characteristics have increasingly received attention in the tobacco control literature. The majority of tailored interventions are generated by computers and administered with printed materials or Web-based programs. The purpose of this study was to examine the tailoring activities of community lay health influencers who were trained to perform face-to-face brief tobacco cessation interventions. Eighty participants of a large-scale, randomized controlled trial completed a 6-week qualitative follow-up interview. A majority of participants (86%) reported that they made adjustments in their intervention behaviors based on individual smoker characteristics, their relationship with the smoker, and/or setting. Situational contexts (i.e., location and timing) primarily played a role after targeted smokers were selected. The findings suggest that lay health influencers benefit from a training curriculum that emphasizes a motivational, person-centered approach to brief cessation interventions. Recommendations for future tobacco cessation intervention trainings are presented. PMID:21986244
Lay health influencers: how they tailor brief tobacco cessation interventions.
Yuan, Nicole P; Castañeda, Heide; Nichter, Mark; Nichter, Mimi; Wind, Steven; Carruth, Lauren; Muramoto, Myra
2012-10-01
Interventions tailored to individual smoker characteristics have increasingly received attention in the tobacco control literature. The majority of tailored interventions are generated by computers and administered with printed materials or web-based programs. The purpose of this study was to examine the tailoring activities of community lay health influencers who were trained to perform face-to-face brief tobacco cessation interventions. Eighty participants of a large-scale, randomized controlled trial completed a 6-week qualitative follow-up interview. A majority of participants (86%) reported that they made adjustments in their intervention behaviors based on individual smoker characteristics, their relationship with the smoker, and/or setting. Situational contexts (i.e., location and timing) primarily played a role after targeted smokers were selected. The findings suggest that lay health influencers benefit from a training curriculum that emphasizes a motivational, person-centered approach to brief cessation interventions. Recommendations for future tobacco cessation intervention trainings are presented.
A "Star Wars" Objector Lays His Research on the Line.
ERIC Educational Resources Information Center
Tobias, Sheila
1987-01-01
For one optical scientist, Harrison Barrett, the decision not to accept funding for research related to the Strategic Defense Initiative has meant giving up a major part of his work in optical computing. (MSE)
ERIC Educational Resources Information Center
Thompson, S. Anthony; Baumgartner, Lynsey
2008-01-01
In the inclusive/special education literature, practitioners often claim that using portfolios is excessively time-intensive, while other researchers lay claim to positive possibilities for students with disabilities/exceptionalities, such as increased self-esteem, internal locus of control, choice-making, and active participation in learning. To…
Towards a single seismological service infrastructure in Europe
NASA Astrophysics Data System (ADS)
Spinuso, A.; Trani, L.; Frobert, L.; Van Eck, T.
2012-04-01
In the last five years, services and data providers within the seismological community in Europe have focused their efforts on migrating the way they open their archives towards a Service Oriented Architecture (SOA). This process tries to follow pragmatically the technological trends and available solutions aiming at effectively improving all the data stewardship activities. These advancements are possible thanks to the cooperation and the follow-ups of several EC infrastructural projects that, by looking at general purpose techniques, combine their developments envisioning a multidisciplinary platform for earth observation as the final common objective (EPOS, Earth Plate Observation System). One of the first results of this effort is the Earthquake Data Portal (http://www.seismicportal.eu), which provides a collection of tools to discover, visualize and access a variety of seismological data sets like seismic waveform, accelerometric data, earthquake catalogs and parameters. The Portal offers a cohesive distributed search environment, linking data search and access across multiple data providers through interactive web-services, map-based tools and diverse command-line clients. Our work continues under other EU FP7 projects. Here we will address initiatives in two of those projects. The NERA (Network of European Research Infrastructures for Earthquake Risk Assessment and Mitigation) project will implement a Common Services Architecture based on OGC services APIs, in order to provide Resource-Oriented common interfaces across the data access and processing services. This will improve interoperability between tools and across projects, enabling the development of higher-level applications that can uniformly access the data and processing services of all participants. This effort will be conducted jointly with the VERCE project (Virtual Earthquake and Seismology Research Community for Europe). VERCE aims to enable seismologists to exploit the wealth of seismic data within a data-intensive computation framework, which will be tailored to the specific needs of the community. It will provide a new interoperable infrastructure, as the computational backbone lying behind the publicly available interfaces. VERCE will have to face the challenges of implementing a service oriented architecture providing an efficient layer between the Data and the Grid infrastructures, coupling HPC data analysis and HPC data modeling applications through the execution of workflows and data sharing mechanisms. Online registries of interoperable workflow components, storage of intermediate results and data provenance are those aspects that are currently under investigation to make the VERCE facilities usable by a wide range of users, data and service providers. For such purposes the adoption of a Digital Object Architecture, to create online catalogs referencing and describing semantically all these distributed resources, such as datasets, computational processes and derivative products, is seen as one of the viable solutions to monitor and steer the usage of the infrastructure, increasing its efficiency and the cooperation among the community.
Pazhur, R J; Kutter, B; Georgieff, M; Schraag, S
2003-06-01
Portable digital assistants (PDAs) may be of value to the anaesthesiologist as development in medical care is moving towards "bedside computing". Many different portable computers are currently available and it is now possible for the physician to carry a mobile computer at all times. It is a database, reference book, patient tracking aid, date planner, computer, book, magazine, calculator and much more in one mobile device. With the help of a PDA, information that is required for our work may be available at all times and everywhere at the point of care within seconds. In this overview the possibilities for the use of PDAs in anaesthesia and intensive care medicine are discussed. Developments in other countries, possibilities in use, but also problems such as data security and network technology are evaluated.
Remien, Robert H; Mellins, Claude A.; Robbins, Reuben N.; Kelsey, Ryan; Rowe, Jessica; Warne, Patricia; Chowdhury, Jenifar; Lalkhen, Nuruneesa; Hoppe, Lara; Abrams, Elaine J.; El-Bassel, Nabila; Witte, Susan; Stein, Dan J.
2013-01-01
Effective medical treatment for HIV/AIDS requires patients’ optimal adherence to antiretroviral therapy (ART). In resource-constrained settings, lack of adequate standardized counseling for patients on ART remains a significant barrier to adherence. Masivukeni (“Let’s Wake Up” in Xhosa) is an innovative multimedia-based intervention designed to help people living with HIV in resource-limited settings achieve and maintain high levels of ART adherence. Adapted from a couples-based intervention tested in the United States (US), Masivukeni was developed through community-based participatory research with US and South African partners and informed by Ewart’s Social Action Theory. Innovative computer-based multimedia strategies were used to translate a labor- and training-intensive intervention into one that could be readily and widely used by lay counselors with relatively little training with low-literacy patients. In this paper, we describe the foundations of this new intervention, the process of its development, and the evidence of its high acceptability and feasibility. PMID:23468079
Mapping species distributions: a comparison of skilled naturalist and lay citizen science recording.
van der Wal, René; Anderson, Helen; Robinson, Annie; Sharma, Nirwan; Mellish, Chris; Roberts, Stuart; Darvill, Ben; Siddharthan, Advaith
2015-11-01
To assess the ability of traditional biological recording schemes and lay citizen science approaches to gather data on species distributions and changes therein, we examined bumblebee records from the UK's national repository (National Biodiversity Network) and from BeeWatch. The two recording approaches revealed similar relative abundances of bumblebee species but different geographical distributions. For the widespread common carder (Bombus pascuorum), traditional recording scheme data were patchy, both spatially and temporally, reflecting active record centre rather than species distribution. Lay citizen science records displayed more extensive geographic coverage, reflecting human population density, thus offering better opportunities to account for recording effort. For the rapidly spreading tree bumblebee (Bombus hypnorum), both recording approaches revealed similar distributions due to a dedicated mapping project which overcame the patchy nature of naturalist records. We recommend, where possible, complementing skilled naturalist recording with lay citizen science programmes to obtain a nation-wide capability, and stress the need for timely uploading of data to the national repository.
NASA Astrophysics Data System (ADS)
Zavaletta, Vanessa A.; Bartholmai, Brian J.; Robb, Richard A.
2007-03-01
Diffuse lung diseases, such as idiopathic pulmonary fibrosis (IPF), can be characterized and quantified by analysis of volumetric high resolution CT scans of the lungs. These data sets typically have dimensions of 512 x 512 x 400. It is too subjective and labor-intensive for a radiologist to analyze each slice and quantify regional abnormalities manually. Thus, computer aided techniques are necessary, particularly texture analysis techniques which classify various lung tissue types. Second and higher order statistics, which relate the spatial variation of the intensity values, are good discriminatory features for various textures. The intensity values in lung CT scans range between [-1024, 1024]. Calculation of second order statistics on this range is too computationally intensive, so the data is typically binned into 16 or 32 gray levels. There are more effective ways of binning the gray level range to improve classification. An optimal and very efficient way to nonlinearly bin the histogram is to use a dynamic programming algorithm. The objective of this paper is to show that nonlinear binning using dynamic programming is computationally efficient and improves the discriminatory power of the second and higher order statistics for more accurate quantification of diffuse lung disease.
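The nonlinear binning step can be made concrete: one standard dynamic-programming formulation partitions the sorted gray levels into a fixed number of contiguous bins so that the total histogram-weighted within-bin squared error is minimal. The sketch below implements that formulation; the exact objective optimised in the paper is not stated in the abstract, so the squared-error criterion here is an assumption.

```python
import numpy as np

def optimal_bins(levels, counts, n_bins):
    """Partition sorted gray levels into n_bins contiguous bins so that the total
    histogram-weighted within-bin squared error is minimal. Returns the inclusive
    end index of each bin. Dynamic programming, O(n_bins * n^2) time."""
    levels = np.asarray(levels, dtype=float)
    counts = np.asarray(counts, dtype=float)
    n = len(levels)
    # Prefix sums of weights, weighted values and weighted squares give O(1) bin costs.
    w = np.concatenate(([0.0], np.cumsum(counts)))
    s = np.concatenate(([0.0], np.cumsum(counts * levels)))
    s2 = np.concatenate(([0.0], np.cumsum(counts * levels ** 2)))

    def cost(i, j):
        """Weighted squared error of levels[i..j] about their weighted mean."""
        wt = w[j + 1] - w[i]
        if wt == 0.0:
            return 0.0
        mean = (s[j + 1] - s[i]) / wt
        return (s2[j + 1] - s2[i]) - wt * mean ** 2

    dp = np.full((n_bins + 1, n), np.inf)
    back = np.zeros((n_bins + 1, n), dtype=int)
    for j in range(n):
        dp[1, j] = cost(0, j)
    for k in range(2, n_bins + 1):
        for j in range(k - 1, n):
            for i in range(k - 1, j + 1):          # bin k covers levels[i..j]
                c = dp[k - 1, i - 1] + cost(i, j)
                if c < dp[k, j]:
                    dp[k, j] = c
                    back[k, j] = i
    # Walk the back-pointers from the last level to recover the bin boundaries.
    bounds, j, k = [], n - 1, n_bins
    while k >= 1:
        bounds.append(j)
        j = back[k, j] - 1
        k -= 1
    return bounds[::-1]

# Small demo: rebin a 256-level histogram into 16 gray levels. (For the full
# [-1024, 1024] CT range a vectorised or compiled implementation would be preferable.)
rng = np.random.default_rng(1)
demo_levels = np.arange(256)
demo_counts = rng.integers(0, 50, size=demo_levels.size)
print(optimal_bins(demo_levels, demo_counts, 16))
```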
Egg size and laying order of snowy egrets, great egrets, and black-crowned night-herons
Custer, T.W.; Frederick, P.C.
1990-01-01
The nesting biology of the family Ardeidae (bitterns, herons, and egrets) has been intensively studied (e.g., Owen 1960, Milstein et al. 1970, Werschkul 1979), but egg size in relation to laying order has not received attention. The last egg laid in gull and tern clutches is generally smaller than preceding eggs (e.g., Parsons 1970, Nisbet 1978). The relative size of the final egg in a clutch decreases with increased body size among bird species, and this relationship may be correlated with an increased brood-reduction strategy (Slagsvold et al. 1984). Relative egg size could be an important component of brood reduction, because egg size can affect subsequent survival of young (Parsons 1970, Nisbet 1978, Lundberg and Vaisanen 1979).
NASA Astrophysics Data System (ADS)
Perrin, A.; Ndao, M.; Manceron, L.
2017-10-01
A recent paper [1] presents a high-resolution, high-temperature version of the Nitrogen Dioxide Spectroscopic Databank called NDSD-1000. The NDSD-1000 database contains line parameters (positions, intensities, self- and air-broadening coefficients, exponents of the temperature dependence of self- and air-broadening coefficients) for numerous cold and hot bands of the 14N16O2 isotopomer of nitrogen dioxide. The parameters used for the calculation of line positions and intensities were generated through a global modeling of experimental data collected in the literature within the framework of the method of effective operators. However, the form of the effective dipole moment operator used to compute the NO2 line intensities in the NDSD-1000 database differs from the classical one used for line intensity calculations in the NO2 infrared literature [12]. Using Fourier transform spectra recorded at high resolution in the 6.3 μm region, it is shown here that the NDSD-1000 formulation is incorrect, since the computed intensities do not account properly for the (Int(+)/Int(-)) intensity ratio between the (+) (J = N + 1/2) and (-) (J = N - 1/2) electron spin-rotation subcomponents of the computed vibration-rotation transitions. On the other hand, in the HITRAN and GEISA spectroscopic databases, the NO2 line intensities were computed using the classical theoretical approach, and it is shown here that these data lead to significantly better agreement between the observed and calculated spectra.
NASA Technical Reports Server (NTRS)
Dorband, John E.
1987-01-01
Generating graphics to faithfully represent information can be a computationally intensive task. A way of using the Massively Parallel Processor to generate images by ray tracing is presented. This technique uses sort computation, a method of performing generalized routing interspersed with computation on a single-instruction-multiple-data (SIMD) computer.
Itzhaki, Michal; Bar-Tal, Yoram; Barnoy, Sivia
2012-09-01
This article is a report on a study conducted to examine the views of healthcare professionals and lay people regarding the effect of family presence during resuscitation on both the staff performing the resuscitation and the relatives who witness it. Family presence during resuscitation is controversial. Although many professional groups in different countries have recently issued position statements about the practice and have recommended new policy moves, the Israel Ministry of Health has not issued guidelines on the matter. The study used a factorial within-between subjects design. Data were collected in Israel in 2008 from a convenience sample of 220 lay people and 201 healthcare staff (52 physicians and 149 nurses) using a questionnaire based on eight different resuscitation scenarios, manipulating blood involvement and resuscitation outcome. Data were analysed using one-way analysis of variance. Overall, both staff and lay people perceived family presence during resuscitation negatively. Visible bleeding and an unsuccessful outcome significantly influenced both staff's and lay people's perceptions. Female physicians and nurses reacted more negatively to family presence than did male physicians and nurses; lay men responded more negatively than lay women. Changing the current negative perceptions of family presence at resuscitation requires (a) establishing a new national policy, (b) educating healthcare staff to the benefits of the presence of close relatives and (c) training staff to support relatives who want to be present. © 2011 Blackwell Publishing Ltd.
NASA Astrophysics Data System (ADS)
Cheng, Tian-Le; Ma, Fengde D.; Zhou, Jie E.; Jennings, Guy; Ren, Yang; Jin, Yongmei M.; Wang, Yu U.
2012-01-01
Diffuse scattering contains rich information on various structural disorders, thus providing a useful means to study the nanoscale structural deviations from the average crystal structures determined by Bragg peak analysis. Extraction of maximal information from diffuse scattering requires concerted efforts in high-quality three-dimensional (3D) data measurement, quantitative data analysis and visualization, theoretical interpretation, and computer simulations. Such an endeavor is undertaken to study the correlated dynamic atomic position fluctuations caused by thermal vibrations (phonons) in precursor state of shape-memory alloys. High-quality 3D diffuse scattering intensity data around representative Bragg peaks are collected by using in situ high-energy synchrotron x-ray diffraction and two-dimensional digital x-ray detector (image plate). Computational algorithms and codes are developed to construct the 3D reciprocal-space map of diffuse scattering intensity distribution from the measured data, which are further visualized and quantitatively analyzed to reveal in situ physical behaviors. Diffuse scattering intensity distribution is explicitly formulated in terms of atomic position fluctuations to interpret the experimental observations and identify the most relevant physical mechanisms, which help set up reduced structural models with minimal parameters to be efficiently determined by computer simulations. Such combined procedures are demonstrated by a study of phonon softening phenomenon in precursor state and premartensitic transformation of Ni-Mn-Ga shape-memory alloy.
Effects of taurine and housing density on renal function in laying hens*
Ma, Zi-li; Gao, Yang; Ma, Hai-tian; Zheng, Liu-hai; Dai, Bin; Miao, Jin-feng; Zhang, Yuan-shu
2016-01-01
This study investigated the putative protective effects of supplemental 2-aminoethane sulfonic acid (taurine) and reduced housing density on renal function in laying hens. We randomly assigned fifteen thousand green-shell laying hens into three groups: a free-range group, a low-density caged group, and a high-density caged group. Each group was further divided equally into a control group (C) and a taurine treatment group (T). After 15 d, we analyzed histological changes in kidney cells, inflammatory mediator levels, and oxidation and anti-oxidation levels. Experimental data revealed that taurine supplementation, and rearing free range or in low-density housing, can lessen morphological renal damage, inflammatory mediator levels, and oxidation levels and increase anti-oxidation levels. Our data demonstrate that taurine supplementation and a reduction in housing density can ameliorate renal impairment, increase productivity, enhance health, and promote welfare in laying hens. PMID:27921400
USDA-ARS?s Scientific Manuscript database
The poultry industry is under intense pressure from the public and animal welfare advocates to eliminate the practice of beak trimming due to the potential for acute and chronic pain in the trimmed birds. However, elimination of beak trimming may have severe implications for animal welfare, as peck...
Evaluation of a Peer-Led, Low-Intensity Physical Activity Program for Older Adults
ERIC Educational Resources Information Center
Werner, Danilea; Teufel, James; Brown, Stephen L.
2014-01-01
Background: Physical inactivity is a primary contributor to decreasing functional physical fitness and increasing chronic disease in older adults. Purpose: This study assessed the health-related benefits of ExerStart for Lay Leaders, a 20-week, community based, peer-led, low-impact exercise program for older adults. ExerStart focuses on aerobic…
Reporting of Randomized Trials in Common Cancers in the Lay Media.
Ribnikar, Domen; Goldvaser, Hadar; Ocana, Alberto; Templeton, Arnoud J; Seruga, Bostjan; Amir, Eitan
2018-01-01
Limited data exist about the role of the lay media in the dissemination of results of randomized controlled trials (RCTs) in common cancers. Completed phase III RCTs evaluating new drugs in common cancers between January 2005 and October 2016 were identified from ClinicalTrials.gov. Lay media reporting was identified by searching LexisNexis Academic. Scientific reporting was defined as presentation at an academic conference or publication in full. Associations between reporting in the lay media before scientific reporting and study design and sponsorship were evaluated using logistic regression. Of 180 RCTs identified, 52% were reported in the lay media and in 27%, lay media reporting occurred before scientific reporting with an increasing trend over time (p = 0.009). Reporting in the lay media before scientific reporting was associated with positive results (OR: 2.10, p = 0.04), targeted therapy compared to chemotherapy (OR: 4.75, p = 0.006), immunotherapy compared to chemotherapy (OR: 7.60, p = 0.02), and prostate cancer compared to breast cancer (OR: 3.25, p = 0.02). Over a quarter of all RCTs in common cancers are reported in the lay media before they are reported scientifically with an increasing proportion over time. Positive trials, studies in prostate cancer, and trials of immunotherapy are associated with early reporting in the lay media. © 2017 S. Karger AG, Basel.
Enabling Large-Scale Biomedical Analysis in the Cloud
Lin, Ying-Chih; Yu, Chin-Sheng; Lin, Yen-Jen
2013-01-01
Recent progress in high-throughput instrumentation has led to an astonishing growth in both volume and complexity of biomedical data collected from various sources. Data at this scale bring serious challenges to storage and computing technologies. Cloud computing is an attractive alternative because it jointly addresses storage and high-performance computing for large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications should help biomedical researchers make the vast amount of diverse data meaningful and usable. PMID:24288665
Web-based interactive visualization in a Grid-enabled neuroimaging application using HTML5.
Siewert, René; Specovius, Svenja; Wu, Jie; Krefting, Dagmar
2012-01-01
Interactive visualization and correction of intermediate results are required in many medical image analysis pipelines. To allow such interaction during the remote execution of compute- and data-intensive applications, new features of HTML5 are used. They allow for transparent integration of user interaction into Grid- or Cloud-enabled scientific workflows. Both 2D and 3D visualization and data manipulation can be performed through a scientific gateway without the need to install specific software or web browser plugins. The possibilities of web-based visualization are presented along the FreeSurfer pipeline, a popular compute- and data-intensive software tool for quantitative neuroimaging.
Framework for emotional mobile computation for creating entertainment experience
NASA Astrophysics Data System (ADS)
Lugmayr, Artur R.
2007-02-01
Ambient media are media which manifest in the natural environment of the consumer. The perceivable borders between the media and the context where the media are used are becoming more and more blurred. The consumer is moving through a digital space of services throughout his daily life. As we are developing towards an experience society, the central point in the development of services is the creation of a consumer experience. This paper reviews possibilities and potentials of the creation of entertainment experiences with mobile phone platforms. It reviews sensor networks capable of acquiring consumer behavior data, interactivity strategies, and psychological models for emotional computation on mobile phones, and lays the foundations of a nomadic experience society. The paper concludes with a presentation of several different possible service scenarios in the field of entertainment and leisure computation on mobiles. The goal of this paper is to present a framework and an evaluation of possibilities of applying sensor technology on mobile platforms to enhance the consumer entertainment experience.
On the Modeling and Management of Cloud Data Analytics
NASA Astrophysics Data System (ADS)
Castillo, Claris; Tantawi, Asser; Steinder, Malgorzata; Pacifici, Giovanni
A new era is dawning where vast amounts of data are subjected to intensive analysis in a cloud computing environment. Over the years, data about a myriad of things, ranging from user clicks to galaxies, have been accumulated, and continue to be collected, on storage media. The increasing availability of such data, along with the abundant supply of compute power and the urge to create useful knowledge, gave rise to a new data analytics paradigm in which data is subjected to intensive analysis, and additional data is created in the process. Meanwhile, a new cloud computing environment has emerged where seemingly limitless compute and storage resources are being provided to host computation and data for multiple users through virtualization technologies. Such a cloud environment is becoming the home for data analytics. Consequently, providing good performance at run-time to data analytics workloads is an important issue for cloud management. In this paper, we provide an overview of the data analytics and cloud environment landscapes, and investigate the performance management issues related to running data analytics in the cloud. In particular, we focus on topics such as workload characterization, profiling analytics applications and their pattern of data usage, cloud resource allocation, placement of computation and data and their dynamic migration in the cloud, and performance prediction. In solving such management problems one relies on various run-time analytic models. We discuss approaches for modeling and optimizing the dynamic data analytics workload in the cloud environment. All along, we use the Map-Reduce paradigm as an illustration of data analytics.
Benchmarking Memory Performance with the Data Cube Operator
NASA Technical Reports Server (NTRS)
Frumkin, Michael A.; Shabanov, Leonid V.
2004-01-01
Data movement across a computer memory hierarchy and across computational grids is known to be a limiting factor for applications processing large data sets. We use the Data Cube Operator on an Arithmetic Data Set, called ADC, to benchmark the capabilities of computers and of computational grids to handle large distributed data sets. We present a prototype implementation of a parallel algorithm for computation of the operator. The algorithm follows a known approach for computing views from the smallest parent. The ADC stresses all levels of grid memory and storage by producing some of the 2^d views of an Arithmetic Data Set of d-tuples described by a small number of integers. We control the data intensity of the ADC by selecting the tuple parameters, the sizes of the views, and the number of realized views. Benchmarking results of memory performance of a number of computer architectures and of a small computational grid are presented.
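Illustrative sketch (not the ADC benchmark code): a data cube over d attributes consists of the group-by "views" for every attribute subset, and the toy example below materializes all non-empty views of a 3-attribute tuple set with pandas. In a real implementation each view would be aggregated from its smallest already-computed parent rather than from the base table; the column names and measure are hypothetical.

import itertools
import pandas as pd

# hypothetical d = 3 attribute tuples with a single numeric measure
df = pd.DataFrame({'a': [1, 1, 2, 2],
                   'b': [3, 4, 3, 4],
                   'c': [5, 5, 6, 6],
                   'measure': [10, 20, 30, 40]})

dims = ['a', 'b', 'c']
views = {}
for r in range(len(dims), 0, -1):              # 2^d - 1 non-empty views
    for subset in itertools.combinations(dims, r):
        views[subset] = df.groupby(list(subset), as_index=False)['measure'].sum()

print(views[('a',)])                            # the view aggregated over b and c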
[Knowledge retention of BLS and BLS-D theoretical contents in a long-term follow-up].
Paolini, Enrico; Conti, Elettra; Guerra, Federico; Capucci, Alessandro
2018-03-01
Despite the many recent improvements in basic life support (BLS) and the widespread training of a great number of lay rescuers, out-of-hospital cardiac arrest (OHCA) is still a major cause of death. Nowadays, BLS teaching protocols ask for many concepts to be learned and specific algorithms to be applied, without any available data on how well all these inputs are retained by the students. The present survey aims to evaluate how well BLS is really retained by those rescuers (laypersons and nurses) who do not often put it into practice. The first survey was targeted at laypersons who are responsible for security in their work environment. The second survey was targeted at nurses operating in low-intensity wards or out-of-hospital clinics. Both surveys were anonymous and asked specific questions aimed at evaluating how BLS/BLS-D (BLS and defibrillation) information was retained an average of 3 years after training completion. Lay rescuers showed difficulties in recognizing specific signs of OHCA, were unsure about when to call the emergency medical service, and retained little and sometimes erroneous information on the correct BLS sequence to perform. The nurses operating in low-intensity settings performed better in terms of OHCA recognition and advanced medical assistance activation, while how to set up and operate an automated external defibrillator was seldom clear. BLS information is usually not retained after a 3-year follow-up by people who are not involved in OHCA management as part of their everyday job. It could be useful to lower the number of concepts taught in current BLS courses in order to focus on the most significant aspects, such as prompt emergency medical system activation and early defibrillation.
Unsteady thermal blooming of intense laser beams
NASA Astrophysics Data System (ADS)
Ulrich, J. T.; Ulrich, P. B.
1980-01-01
A four dimensional (three space plus time) computer program has been written to compute the nonlinear heating of a gas by an intense laser beam. Unsteady, transient cases are capable of solution and no assumption of a steady state need be made. The transient results are shown to asymptotically approach the steady-state results calculated by the standard three dimensional thermal blooming computer codes. The report discusses the physics of the laser-absorber interaction, the numerical approximation used, and comparisons with experimental data. A flowchart is supplied in the appendix to the report.
MSFC crack growth analysis computer program, version 2 (users manual)
NASA Technical Reports Server (NTRS)
Creager, M.
1976-01-01
An updated version of the George C. Marshall Space Flight Center Crack Growth Analysis Program is described. The updated computer program has significantly expanded capabilities over the original one. This increased capability includes an extensive expansion of the library of stress intensity factors, plotting capability, increased design iteration capability, and the capability of performing proof test logic analysis. The technical approaches used within the computer program are presented, and the input and output formats and options are described. Details of the stress intensity equations, example data, and example problems are presented.
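As a hedged illustration of the kind of relation such a stress intensity library encodes (not taken from the MSFC program itself), the classical Mode-I expression for a through crack is K_I = Y·σ·√(πa), with geometry factor Y = 1 for a center crack in an infinite plate; the load case below is hypothetical.

import math

def stress_intensity_mode_i(sigma_mpa, a_m, Y=1.0):
    """Classical Mode-I stress intensity factor K_I = Y * sigma * sqrt(pi * a).
    Y = 1.0 corresponds to a center crack in an infinite plate; other
    geometries use tabulated correction factors."""
    return Y * sigma_mpa * math.sqrt(math.pi * a_m)

# hypothetical load case: 100 MPa remote stress, 5 mm half crack length
k_i = stress_intensity_mode_i(100.0, 0.005)
print(round(k_i, 2), "MPa*sqrt(m)")            # about 12.53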
NASA Astrophysics Data System (ADS)
Kropivnitskaya, Y. Y.; Tiampo, K. F.; Qin, J.; Bauer, M.
2015-12-01
Intensity is one of the most useful measures of earthquake hazard, as it quantifies the strength of shaking produced at a given distance from the epicenter. Today, there are several data sources that could be used to determine intensity level, which can be divided into two main categories. The first category is represented by social data sources, in which the intensity values are collected by interviewing people who experienced the earthquake-induced shaking. In this case, specially developed questionnaires can be used in addition to personal observations published on social networks such as Twitter. These observations are assigned to the appropriate intensity level by correlating specific details and descriptions to the Modified Mercalli Scale. The second category of data sources is represented by observations from different physical sensors installed with the specific purpose of obtaining an instrumentally derived intensity level. These are usually based on a regression of recorded peak acceleration and/or velocity amplitudes. This approach relates the recorded ground motions to the expected felt and damage distribution through empirical relationships. The goal of this work is to implement and evaluate streaming data processing, separately and jointly, from both social and physical sensors in order to produce near real-time intensity maps and to compare and analyze their quality and evolution through 10-minute time intervals immediately following an earthquake. Results are shown for the case study of the M6.0 2014 South Napa, CA earthquake that occurred on August 24, 2014. The use of innovative streaming and pipelining computing paradigms through the IBM InfoSphere Streams platform made it possible to read input data in real time for low-latency computation of combined intensity levels and production of combined intensity maps in near real time. The results compare three types of intensity maps created from physical, social and combined data sources. Here we correlate the count and density of Tweets with intensity level and show the importance of processing combined data sources at the earliest time stages after an earthquake happens. This method can supplement existing approaches to intensity-level detection, especially in regions with a high number of Twitter users and a low density of seismic networks.
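A minimal sketch of the instrumental half of such a pipeline, with a toy blend of the two sources: the peak-ground-acceleration-to-intensity conversion below uses the Wald et al. (1999) regression (MMI ≈ 3.66·log10(PGA) − 1.66, PGA in cm/s²) purely as an illustration, and the tweet-count weighting is a made-up placeholder, not the authors' InfoSphere Streams implementation.

import numpy as np

def mmi_from_pga(pga_cm_s2):
    """Instrumental intensity from peak ground acceleration using the
    Wald et al. (1999) regression (roughly valid for MMI V-VIII)."""
    return 3.66 * np.log10(pga_cm_s2) - 1.66

def combined_intensity(pga_cm_s2, tweet_mmi, tweet_count, k=50.0):
    """Blend instrumental and tweet-derived intensity; the weight model
    (saturating with tweet count, scale k) is purely illustrative."""
    w = tweet_count / (tweet_count + k)
    return (1.0 - w) * mmi_from_pga(pga_cm_s2) + w * tweet_mmi

# hypothetical grid cell: 100 cm/s^2 peak acceleration, 25 geotagged tweets
print(round(combined_intensity(100.0, tweet_mmi=6.0, tweet_count=25), 2))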
A comparison of the breeding ecology of birds nesting in boxes and tree cavities
Kathryn L. Purcell; Jared Verner; Lewis W. Oring
1997-01-01
We compared laying date, nesting success, clutch size, and productivity of four bird species that nest in boxes and tree cavities to examine whether data from nest boxes are comparable with data from tree cavities. Western Bluebirds (Sialia mexicana) gained the most advantage from nesting in boxes. They initiated egg laying earlier, had higher nesting success, lower...
Computer Series, 98. Electronics for Scientists: A Computer-Intensive Approach.
ERIC Educational Resources Information Center
Scheeline, Alexander; Mork, Brian J.
1988-01-01
Reports the design for a principles-before-details presentation of electronics for an instrumental analysis class. Uses computers for data collection and simulations. Requires one semester with two 2.5-hour periods and two lectures per week. Includes lab and lecture syllabi. (MVL)
NASA Astrophysics Data System (ADS)
Brown, Antony G.
2009-07-01
This paper presents geomorphic, soils and palaeoecological data from a small sub-catchment in the English Midlands in an attempt to provide an integrated picture of Holocene landscape change. The area has also been the focus of a multi-disciplinary and long-term archaeological survey (Raunds Area Project) and so has a wealth of archaeological and historical data which can be related to the environmental record. The paper combines these data, much of which is published only in the archaeological literature, with new interpretations based upon unpublished data and new data, particularly from the hillslopes, and new radiocarbon dating from the valley floor. It is inferred that despite a long history of pastoral and arable agriculture (since the Neolithic/Bronze Age), colluviation on lower slopes, significant soil redistribution and overbank alluviation only began to a measurable extent in the Late Saxon-Medieval period (9th century AD onwards). It is suggested that this is due to a combination of land-use factors, principally the laying out of an intensive open-field system and the establishment of villages, combined with a period of extremes in climate well known throughout Europe. Indeed, the critical element appears to have been the social changes in this period, which created a regionally distinctive landscape that happened to have high spatial connectivity and facilitated intensive arable production with high tillage rates. Intense rainfall events during this period could therefore detach and mobilize high volumes of soil, and the open-field system facilitated transport to slope bases and valley floors. The need for detailed and spatially precise land-use data in order to interpret accelerated landscape change is stressed.
Niranjan, Soumya J; Huang, Chao-Hui S; Dionne-Odom, J Nicholas; Halilova, Karina I; Pisu, Maria; Drentea, Patricia; Kvale, Elizabeth A; Bevis, Kerri S; Butler, Thomas W; Partridge, Edward E; Rocque, Gabrielle B
2018-04-01
Respecting Choices is an evidence-based model of facilitating advance care planning (ACP) conversations between health-care professionals and patients. However, whether lay patient navigators can successfully initiate Respecting Choices ACP conversations is unknown. As part of a large demonstration project (Patient Care Connect [PCC]), a cohort of lay patient navigators underwent Respecting Choices training and were tasked to initiate ACP conversations with Medicare beneficiaries diagnosed with cancer. This article explores PCC lay navigators' perceived barriers and facilitators in initiating Respecting Choices ACP conversations with older patients with cancer in order to inform implementation enhancements to lay navigator-facilitated ACP. Twenty-six lay navigators from 11 PCC cancer centers in 4 states (Alabama, Georgia, Tennessee, and Florida) completed in-depth, one-on-one semistructured interviews between June 2015 and August 2015. Data were analyzed using a thematic analysis approach. This evaluation identifies factors at 3 levels - patient, lay navigator, and organizational - in addition to training needs, that influence ACP implementation. Key facilitators included physician buy-in, patient readiness, and navigators' prior experience with end-of-life decision-making. Lay navigators' perceived challenges to initiating ACP conversations included timing of the conversation and social and personal taboos about discussing dying. Our results suggest that further training and health system support are needed for lay navigators playing a vital role in improving the implementation of ACP among older patients with cancer. The lived expertise of lay navigators, along with flexible longitudinal relationships with patients and caregivers, may uniquely position this workforce to promote ACP.
Hu, Janice; Geldsetzer, Pascal; Steele, Sarah Jane; Matthews, Philippa; Ortblad, Katrina; Solomon, Tsion; Shroufi, Amir; van Cutsem, Gilles; Tanser, Frank; Wyke, Sally; Vollmer, Sebastian; Pillay, Deenan; Mcconnell, Margaret; Bärnighausen, Till
2018-06-14
This study aimed to determine the causal effect of the number of lay counselors employed at a primary care clinic in rural South Africa on the number of clinic-based HIV tests performed. Fixed effects panel analysis. We collected monthly data on the number of lay counselors employed and HIV tests performed at nine primary care clinics in rural KwaZulu-Natal from January 2014 to December 2015. Using clinic- and month-level fixed effects regressions, we exploited the fact that lay counselors were removed from clinics at two quasi-random time points by a redeployment policy. A total of 24,526 HIV tests were conducted over the study period. 21 of 27 lay counselors were removed across the nine clinics in the two redeployment waves. A ten percent reduction in the number of lay counselors was associated with a 4.9% (95% confidence interval [CI]: 2.8 - 7.0, p < 0.001) decrease in the number of HIV tests performed. In absolute terms, losing one lay counselor from a clinic was associated with a mean of 29.7 (95% CI: 21.2 - 38.2, p < 0.001) fewer HIV tests carried out at the clinic per month. This study provides evidence for the crucial role that lay counselors play in the HIV response in rural South Africa. More broadly, this analysis supports the use of lay cadres in the HIV response and by extension UNAIDS' and the African Union's goal to triple the number of community health workers in sub-Saharan Africa by 2020.
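A minimal sketch of a clinic- and month-fixed-effects regression of this kind, using statsmodels with dummy variables; the data frame, column names and synthetic numbers are hypothetical, not the study's data.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical monthly panel: one row per clinic-month
rng = np.random.default_rng(0)
rows = [{'clinic': c, 'month': m} for c in ['A', 'B', 'C'] for m in range(1, 25)]
df = pd.DataFrame(rows)
df['counselors'] = rng.integers(0, 4, size=len(df))
df['hiv_tests'] = (30 * df['counselors'] + rng.normal(80, 10, size=len(df))).round()

# clinic- and month-level fixed effects enter as dummy variables
model = smf.ols('hiv_tests ~ counselors + C(clinic) + C(month)', data=df).fit()
print(model.params['counselors'])               # tests gained per counselor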
In Vivo Validation of Numerical Prediction for Turbulence Intensity in an Aortic Coarctation
Arzani, Amirhossein; Dyverfeldt, Petter; Ebbers, Tino; Shadden, Shawn C.
2013-01-01
This paper compares numerical predictions of turbulence intensity with in vivo measurement. Magnetic resonance imaging (MRI) was carried out on a 60-year-old female with a restenosed aortic coarctation. Time-resolved three-directional phase-contrast (PC) MRI data was acquired to enable turbulence intensity estimation. A contrast-enhanced MR angiography (MRA) and a time-resolved 2D PCMRI measurement were also performed to acquire data needed to perform subsequent image-based computational fluid dynamics (CFD) modeling. A 3D model of the aortic coarctation and surrounding vasculature was constructed from the MRA data, and physiologic boundary conditions were modeled to match 2D PCMRI and pressure pulse measurements. Blood flow velocity data was subsequently obtained by numerical simulation. Turbulent kinetic energy (TKE) was computed from the resulting CFD data. Results indicate relative agreement (error ≈10%) between the in vivo measurements and the CFD predictions of TKE. The discrepancies in modeled vs. measured TKE values were within expectations due to modeling and measurement errors. PMID:22016327
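A minimal sketch of the turbulent-kinetic-energy computation referred to above, under the common definition k = 0.5*(var(u) + var(v) + var(w)) taken over repeated realizations at the same cardiac phase; the array shapes and values are hypothetical, and the study's actual post-processing of the CFD fields is not reproduced.

import numpy as np

def turbulent_kinetic_energy(u, v, w):
    """TKE per unit mass at each spatial point, with velocity snapshots
    stacked along axis 0 (e.g. repeated cardiac cycles at the same phase):
    k = 0.5 * (var(u) + var(v) + var(w))."""
    return 0.5 * (u.var(axis=0) + v.var(axis=0) + w.var(axis=0))

# hypothetical: 20 cycles of 32x32x32-voxel velocity components (m/s)
rng = np.random.default_rng(1)
u, v, w = (rng.normal(0.5, 0.1, size=(20, 32, 32, 32)) for _ in range(3))
tke = turbulent_kinetic_energy(u, v, w)         # J/kg per voxel
print(tke.mean())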
Approximation of epidemic models by diffusion processes and their statistical inference.
Guy, Romain; Larédo, Catherine; Vergu, Elisabeta
2015-02-01
Multidimensional continuous-time Markov jump processes [Formula: see text] on [Formula: see text] form a usual set-up for modeling SIR-like epidemics. However, when facing incomplete epidemic data, inference based on these jump processes is not easy to achieve. Here, we start building a new framework for the estimation of key parameters of epidemic models based on statistics of diffusion processes approximating the jump process. First, previous results on the approximation of density-dependent SIR-like models by diffusion processes with small diffusion coefficient [Formula: see text], where N is the population size, are generalized to non-autonomous systems. Second, our previous inference results on discretely observed diffusion processes with small diffusion coefficient are extended to time-dependent diffusions. Consistent and asymptotically Gaussian estimates are obtained for a fixed number n of observations, which corresponds to the epidemic context, and for [Formula: see text]. A correction term, which yields better estimates non-asymptotically, is also included. Finally, the performance and robustness of our estimators with respect to various parameters such as R0 (the basic reproduction number), [Formula: see text], [Formula: see text] are investigated on simulations. Two models, SIR and SIRS, corresponding to single and recurrent outbreaks, respectively, are used to simulate data. The findings indicate that our estimators have good asymptotic properties and behave noticeably well for realistic numbers of observations and population sizes. This study lays the foundations of a generic inference method currently under extension to incompletely observed epidemic data. Indeed, contrary to the majority of current inference techniques for partially observed processes, which necessitate computer-intensive simulations, our method, being mostly an analytical approach, requires only the classical optimization steps.
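As a hedged illustration of the diffusion-approximation regime the abstract refers to (not the authors' inference code), the sketch below runs an Euler-Maruyama simulation of an SIR model in proportions whose noise amplitude scales as 1/sqrt(N); all parameter values are arbitrary.

import numpy as np

def simulate_sir_diffusion(beta=1.5, gamma=0.5, N=10000, s0=0.99, i0=0.01,
                           T=40.0, dt=0.01, seed=0):
    """Euler-Maruyama simulation of the diffusion approximation of an SIR
    epidemic in proportions (s, i); noise amplitude scales as 1/sqrt(N)."""
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    eps = 1.0 / np.sqrt(N)
    s, i = s0, i0
    path = np.empty((steps, 2))
    for t in range(steps):
        inf_rate, rec_rate = beta * s * i, gamma * i
        db1, db2 = rng.normal(size=2) * np.sqrt(dt)
        ds = -inf_rate * dt - eps * np.sqrt(inf_rate) * db1
        di = (inf_rate - rec_rate) * dt + eps * (np.sqrt(inf_rate) * db1
                                                 - np.sqrt(rec_rate) * db2)
        s, i = max(s + ds, 0.0), max(i + di, 0.0)
        path[t] = (s, i)
    return path

path = simulate_sir_diffusion()
print(path[:, 1].max())                         # peak infected proportion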
Computing moment to moment BOLD activation for real-time neurofeedback
Hinds, Oliver; Ghosh, Satrajit; Thompson, Todd W.; Yoo, Julie J.; Whitfield-Gabrieli, Susan; Triantafyllou, Christina; Gabrieli, John D.E.
2013-01-01
Estimating moment to moment changes in blood oxygenation level dependent (BOLD) activation levels from functional magnetic resonance imaging (fMRI) data has applications for learned regulation of regional activation, brain state monitoring, and brain-machine interfaces. In each of these contexts, accurate estimation of the BOLD signal in as little time as possible is desired. This is a challenging problem due to the low signal-to-noise ratio of fMRI data. Previous methods for real-time fMRI analysis have either sacrificed the ability to compute moment to moment activation changes by averaging several acquisitions into a single activation estimate or have sacrificed accuracy by failing to account for prominent sources of noise in the fMRI signal. Here we present a new method for computing the amount of activation present in a single fMRI acquisition that separates moment to moment changes in the fMRI signal intensity attributable to neural sources from those due to noise, resulting in a feedback signal more reflective of neural activation. This method computes an incremental general linear model fit to the fMRI timeseries, which is used to calculate the expected signal intensity at each new acquisition. The difference between the measured intensity and the expected intensity is scaled by the variance of the estimator in order to transform this residual difference into a statistic. Both synthetic and real data were used to validate this method and compare it to the only other published real-time fMRI method. PMID:20682350
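A simplified sketch of the single-acquisition statistic described above: fit a GLM to the time series up to the previous acquisition, predict the new sample, and scale the residual into a z-like value. For brevity it refits from scratch rather than updating incrementally and scales by the residual standard deviation only; the design matrix, regressors and data are hypothetical.

import numpy as np

def activation_stat(y, X):
    """For each new acquisition t, fit a GLM to samples 0..t-1, predict the
    new sample, and scale the residual by the residual standard deviation.
    (Simplified: full refit instead of an incremental update.)"""
    stats = np.full(len(y), np.nan)
    for t in range(X.shape[1] + 2, len(y)):     # need enough samples to fit
        beta, *_ = np.linalg.lstsq(X[:t], y[:t], rcond=None)
        resid = y[:t] - X[:t] @ beta
        stats[t] = (y[t] - X[t] @ beta) / resid.std(ddof=X.shape[1])
    return stats

# hypothetical voxel time series: 200 TRs, intercept + drift + task regressor
rng = np.random.default_rng(2)
n = 200
task = (np.arange(n) % 40 < 20).astype(float)
X = np.column_stack([np.ones(n), np.linspace(-1, 1, n), task])
y = 100 + 0.5 * task + rng.normal(0, 1, n)
z = activation_stat(y, X)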
GATECloud.net: a platform for large-scale, open-source text processing on the cloud.
Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina
2013-01-28
Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research--GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.
COMBAT: mobile-Cloud-based cOmpute/coMmunications infrastructure for BATtlefield applications
NASA Astrophysics Data System (ADS)
Soyata, Tolga; Muraleedharan, Rajani; Langdon, Jonathan; Funai, Colin; Ames, Scott; Kwon, Minseok; Heinzelman, Wendi
2012-05-01
The amount of data processed annually over the Internet has crossed the zettabyte boundary, yet this Big Data cannot be efficiently processed or stored using today's mobile devices. Parallel to this explosive growth in data, a substantial increase in mobile compute capability and the advances in cloud computing have brought the state-of-the-art in mobile-cloud computing to an inflection point, where the right architecture may allow mobile devices to run applications utilizing Big Data and intensive computing. In this paper, we propose the MObile Cloud-based Hybrid Architecture (MOCHA), which formulates a solution to permit mobile-cloud computing applications such as object recognition in the battlefield by introducing a mid-stage compute and storage layer, called the cloudlet. MOCHA is built on the key observation that many mobile-cloud applications have the following characteristics: 1) they are compute-intensive, requiring the compute power of a supercomputer, and 2) they use Big Data, requiring a communications link to cloud-based database sources in near real time. In this paper, we describe the operation of MOCHA in battlefield applications, by formulating the aforementioned mobile and cloudlet to be housed within a soldier's vest and inside a military vehicle, respectively, and enabling access to the cloud through high-latency satellite links. We provide simulations using the traditional mobile-cloud approach as well as utilizing MOCHA with a mid-stage cloudlet to quantify the utility of this architecture. We show that the MOCHA platform for mobile-cloud computing promises a future for critical battlefield applications that access Big Data, which is currently not possible using existing technology.
Debchoudhury, Indira; Welch, Alice E; Fairclough, Monique A; Cone, James E; Brackbill, Robert M; Stellman, Steven D; Farfel, Mark R
2011-12-01
Volunteers (non-professional rescue/recovery workers) are universally present at man-made and natural disasters and share experiences and exposures with victims. Little is known of their disaster-related health outcomes. We studied 4974 adult volunteers who completed the World Trade Center Health Registry 2006-07 survey to examine associations between volunteer type (affiliated vs. lay) and probable posttraumatic stress disorder (PTSD); new or worsening respiratory symptoms; post-9/11 first diagnosis of anxiety disorder, depression, and/or PTSD; and asthma or reactive airway dysfunction syndrome (RADS). Affiliated volunteers reported membership in a recognized organization. Lay volunteers reported no organizational affiliation and occupations unrelated to rescue/recovery work. Adjusted odds ratios (OR(adj)) were calculated using multinomial regression. Lay volunteers were more likely than affiliated volunteers to have been present in lower Manhattan, experience the dust cloud, horrific events and injury on 9/11 and subsequently to report unmet healthcare needs. They had greater odds of early post-9/11 mental health diagnosis (OR(adj) 1.6; 95% CI: 1.4-2.0) and asthma/RADS (1.8; 1.2-2.7), chronic PTSD (2.2; 1.7-2.8), late-onset PTSD (1.9; 1.5-2.5), and new or worsening lower respiratory symptoms (2.0; 1.8-2.4). Lay volunteers' poorer health outcomes reflect earlier, more intense exposure to and lack of protection from physical and psychological hazards. There is a need to limit volunteers' exposures during and after disasters, as well as to provide timely screening and health care post-disaster. Copyright © 2011 Elsevier Inc. All rights reserved.
Wuthijaree, K; Lambertz, C; Gauly, M
2017-12-01
1. A cross-sectional study was conducted from September 2015 to July 2016 in South Tyrol, Northern Italy to examine the prevalence of gastrointestinal helminths in free-range laying hens under mountain farming production conditions. 2. A total of 280 laying hens from 14 free-range mountain farms (4 organic, 10 conventional) were randomly collected at the end of the laying period. Faecal samples were taken to analyse faecal egg counts (FEC) and faecal oocyst counts (FOC). The gastrointestinal tracts were removed post mortem and examined for the presence of helminths. 3. In faeces, FEC values averaged 258 eggs per g of faeces, which were dominated by Ascaridia galli and Heterakis gallinarum. Mean FOC was 80 oocysts/g. In the gastrointestinal tract, at least one nematode species was found in 99.3% of the examined hens. H. gallinarum was the most prevalent nematode (95.7%), followed by Capillaria spp. (66.8%) and A. galli (63.6%). Thirty per cent of the chickens were infected with cestodes (tapeworms). Correlation coefficients between worm counts of H. gallinarum, Capillaria spp. and A. galli ranged from 0.41 to 0.51. 5. The helminth prevalence did not differ between conventional and organic farms, whereas total worm burden was higher in organic compared with conventional farms (318.9 vs. 112.0). Prevalence and infection intensity did not differ between farms that used anthelmintic treatments and those that did not. 6. In conclusion, free-range laying hens under the studied mountain farming conditions are at high risk of nematode infection, especially in organic systems. The vast majority of hens are subclinically infected with at least one helminth species.
Integrating the Apache Big Data Stack with HPC for Big Data
NASA Astrophysics Data System (ADS)
Fox, G. C.; Qiu, J.; Jha, S.
2014-12-01
There is perhaps a broad consensus as to important issues in practical parallel computing as applied to large-scale simulations; this is reflected in supercomputer architectures, algorithms, libraries, languages, compilers and best practice for application development. However, the same is not so true for data-intensive computing, even though commercial clouds devote much more resources to data analytics than supercomputers devote to simulations. We look at a sample of over 50 big data applications to identify characteristics of data-intensive applications and to deduce needed runtimes and architectures. We suggest a big data version of the famous Berkeley dwarfs and NAS parallel benchmarks and use these to identify a few key classes of hardware/software architectures. Our analysis builds on combining HPC and ABDS, the Apache Big Data Stack that is widely used in modern cloud computing. Initial results on clouds and HPC systems are encouraging. We propose the development of SPIDAL - Scalable Parallel Interoperable Data Analytics Library - built on the system and data abstractions suggested by the HPC-ABDS architecture. We discuss how it can be used in several application areas including Polar Science.
Communication and the laboratory physician
Penistan, J. L.
1973-01-01
A clinical laboratory documentation system is described, suitable for community hospitals without computer services. The system is cumulative and is designed to provide the laboratory physician with the clinical information necessary for intelligent review and comment on the laboratory's findings. The mode of presentation of requests to the laboratory and lay-out of the reports to the clinicians are designed to make the two-way communication as close and personal as possible; to encourage the selection of those investigations likely to prove rewarding, and to discourage unnecessary investigation. The possibility of important data escaping notice is minimized. The system is economical in capital equipment, labour and supplies. PMID:4758594
Social capital on poultry farms in South Sulawesi, Indonesia
NASA Astrophysics Data System (ADS)
Sri Lestari, V.; Natsir; Patrick, I. W.; Ali, H. M.; Asya, M.; Sirajuddin, S. N.
2018-05-01
Social capital plays an important role in the development of poultry farms in South Sulawesi. Poultry farms consist of laying hen and broiler farms. Most of the laying hen farms were located in Sidrap Regency, while the broiler farms were located in Maros Regency. The aim of this research was to examine social capital on poultry farms in South Sulawesi. The population of this research was 120 farmers, consisting of 60 laying hen farmers and 60 broiler farmers. The variables of social capital were mutual trust, reciprocity, shared norms and linkage. The data were collected from observation and in-depth interviews using a questionnaire. There were 10 questions. The answers were scored using a Likert scale: 1 = strongly agree; 2 = agree; 3 = not sure; 4 = disagree; 5 = strongly disagree. The data were analyzed descriptively using frequency distribution. The research revealed that mutual trust and shared norms among group members were at a higher level on broiler farms than on laying hen farms; on the other hand, linkage (networking) among group members was at a higher level among laying hen farmers than among broiler farmers.
Evaluation of a well-established task-shifting initiative: the lay counselor cadre in Botswana.
Ledikwe, Jenny H; Kejelepula, Mable; Maupo, Kabelo; Sebetso, Siwulani; Thekiso, Mothwana; Smith, Monica; Mbayi, Bagele; Houghton, Nankie; Thankane, Kabo; O'Malley, Gabrielle; Semo, Bazghina-Werq
2013-01-01
Evidence supports the implementation of task shifting to address health worker shortages that are common in resource-limited settings. However, there is a need to learn from established programs to identify ways to achieve the strongest, most sustainable impact. This study examined the Botswana lay counselor cadre, a task shifting initiative, to explore its effectiveness and contribution to the health workforce. This evaluation used multiple methods, including a desk review, a national lay counselor survey (n = 385; response = 94%), in-depth interviews (n = 79), lay counselor focus group discussions (n = 7), lay counselor observations (n = 25), and client exit interviews (n = 47). Interview and focus group data indicate that lay counselors contribute to essentially all HIV-related programs in Botswana and that they conduct the majority of HIV tests and related counseling at public health facilities throughout the country. Interviews showed that the lay counselor cadre is making the workload of more skilled health workers more manageable and increasing HIV acceptance in communities. The average score on a work-related knowledge test was 74.5%; however, for 3 questions, fewer than half answered correctly. During observations, lay counselors demonstrated average competence for most skills assessed, and clients (97.9%) were satisfied with services received. From the survey, lay counselors generally reported being comfortable with their duties; however, some reported clinical duties that extended beyond their training and mandate. Multiple factors affecting the performance of the lay counselors were identified, including insufficient resources (such as private counseling space and HIV test kits) and inadequate technical, administrative, and supervisory support. Lay counselors are fulfilling an important role in Botswana's healthcare system, serving as the entry point into HIV care, support, and treatment services. For this and other similar task shifting initiatives, it is important that lay counselors' responsibilities are clear and that training and support are adequate to optimize their effectiveness. PMID:23585912
Honeycombing The Icosahedron and Icosahedroning the Sphere
Joseph M McCollum
2001-01-01
This paper is an attempt to trace the theoretical foundations of the Forest Inventory and Analysis and Forest Health Monitoring hexagon networks. It is important in case one might desire to alter the intensity of the grid or lay down a new grid in Puerto Rico and the U.S. Virgin Islands, for instance. The network comes from tessellating an icosahedron with hexagons and...
Computational Process Modeling for Additive Manufacturing
NASA Technical Reports Server (NTRS)
Bagg, Stacey; Zhang, Wei
2014-01-01
Computational Process and Material Modeling of Powder Bed additive manufacturing of IN 718. Optimize material build parameters with reduced time and cost through modeling. Increase understanding of build properties. Increase reliability of builds. Decrease time to adoption of process for critical hardware. Potential to decrease post-build heat treatments. Conduct single-track and coupon builds at various build parameters. Record build parameter information and QM Meltpool data. Refine Applied Optimization powder bed AM process model using data. Report thermal modeling results. Conduct metallography of build samples. Calibrate STK models using metallography findings. Run STK models using AO thermal profiles and report STK modeling results. Validate modeling with additional build. Photodiode Intensity measurements highly linear with power input. Melt Pool Intensity highly correlated to Melt Pool Size. Melt Pool size and intensity increase with power. Applied Optimization will use data to develop powder bed additive manufacturing process model.
Camerlengo, Terry; Ozer, Hatice Gulcin; Onti-Srinivasan, Raghuram; Yan, Pearlly; Huang, Tim; Parvin, Jeffrey; Huang, Kun
2012-01-01
Next-generation sequencing (NGS) is highly resource intensive. NGS tasks related to data processing, management and analysis require high-end computing servers or even clusters. Additionally, processing NGS experiments requires suitable storage space and significant manual interaction. At The Ohio State University's Biomedical Informatics Shared Resource, we designed and implemented a scalable architecture to address the challenges associated with the resource-intensive nature of NGS secondary analysis, built around Illumina Genome Analyzer II sequencers and Illumina's Gerald data processing pipeline. The software infrastructure includes a distributed computing platform consisting of a LIMS called QUEST (http://bisr.osumc.edu), an Automation Server, a computer cluster for processing NGS pipelines, and a network-attached storage device expandable up to 40TB. The system has been architected to scale to multiple sequencers without requiring additional computing or labor resources. This platform demonstrates how to manage and automate NGS experiments in an institutional or core facility setting.
Milles, J; van der Geest, R J; Jerosch-Herold, M; Reiber, J H C; Lelieveldt, B P F
2007-01-01
This paper presents a novel method for registration of cardiac perfusion MRI. The presented method successfully corrects for breathing motion without any manual interaction, using Independent Component Analysis to extract physiologically relevant features together with their time-intensity behavior. A time-varying reference image mimicking intensity changes in the data of interest is computed based on the results of ICA, and used to compute the displacement caused by breathing for each frame. Qualitative and quantitative validation of the method is carried out using 46 clinical-quality, short-axis, perfusion MR datasets comprising 100 images each. Validation experiments showed a reduction of the average LV motion from 1.26+/-0.87 to 0.64+/-0.46 pixels. Time-intensity curves are also improved after registration, with an average error reduced from 2.65+/-7.89% to 0.87+/-3.88% between registered data and the manual gold standard. We conclude that this fully automatic ICA-based method shows excellent accuracy, robustness and computation speed, adequate for use in a clinical environment.
Validation of an automated mite counter for Dermanyssus gallinae in experimental laying hen cages.
Mul, Monique F; van Riel, Johan W; Meerburg, Bastiaan G; Dicke, Marcel; George, David R; Groot Koerkamp, Peter W G
2015-08-01
For integrated pest management (IPM) programs to be maximally effective, monitoring of the growth and decline of the pest populations is essential. Here, we present the validation results of a new automated monitoring device for the poultry red mite (Dermanyssus gallinae), a serious pest in laying hen facilities world-wide. This monitoring device (called an "automated mite counter") was validated in experimental laying hen cages with live birds and a growing population of D. gallinae. This validation study resulted in 17 data points of 'number of mites counted' by the automated mite counter and the 'number of mites present' in the experimental laying hen cages. The study demonstrated that the automated mite counter was able to track the D. gallinae population effectively. A wider evaluation showed that this automated mite counter can become a useful tool in IPM of D. gallinae in laying hen facilities.
A Cost-Benefit Study of Doing Astrophysics On The Cloud: Production of Image Mosaics
NASA Astrophysics Data System (ADS)
Berriman, G. B.; Good, J. C.; Deelman, E.; Singh, G.; Livny, M.
2009-09-01
Utility grids such as the Amazon EC2 and Amazon S3 clouds offer computational and storage resources that can be used on-demand for a fee by compute- and data-intensive applications. The cost of running an application on such a cloud depends on the compute, storage and communication resources it will provision and consume. Different execution plans of the same application may result in significantly different costs. We studied via simulation the cost performance trade-offs of different execution and resource provisioning plans by creating, under the Amazon cloud fee structure, mosaics with the Montage image mosaic engine, a widely used data- and compute-intensive application. Specifically, we studied the cost of building mosaics of 2MASS data that have sizes of 1, 2 and 4 square degrees, and a 2MASS all-sky mosaic. These are examples of mosaics commonly generated by astronomers. We also study these trade-offs in the context of the storage and communication fees of Amazon S3 when used for long-term application data archiving. Our results show that by provisioning the right amount of storage and compute resources cost can be significantly reduced with no significant impact on application performance.
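A back-of-envelope sketch of the kind of cost trade-off studied here: the fee structure (per CPU-hour, per GB transferred, per GB-month stored) mirrors the general shape of utility-cloud pricing, but all prices below are placeholders, not actual Amazon EC2/S3 rates, and the workload numbers are hypothetical.

def mosaic_cloud_cost(cpu_hours, gb_in, gb_out, gb_stored_months,
                      price_cpu_hour=0.10, price_gb_in=0.00,
                      price_gb_out=0.09, price_gb_month=0.023):
    """Toy cost model: compute + transfer + storage. All prices are
    placeholder values, not actual cloud provider rates."""
    compute = cpu_hours * price_cpu_hour
    transfer = gb_in * price_gb_in + gb_out * price_gb_out
    storage = gb_stored_months * price_gb_month
    return compute + transfer + storage

# hypothetical 4-square-degree mosaic run
print(mosaic_cloud_cost(cpu_hours=12, gb_in=8, gb_out=3, gb_stored_months=5))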
Dong, X F; Liu, S; Tong, J M
2018-04-14
Two hundred and sixteen 28-wk-old Hy-Line laying hens were randomly distributed to three dietary treatments and fed 1 of 3 diets containing 8% soybean oil, fish oil, or coconut oil from 28 to 47 wk of age to investigate the comparative effects of dietary soybean oil, fish oil, and coconut oil on performance, egg quality and blood malondialdehyde (MDA), aspartate transaminase (AST) and uric acid (UA). Hens fed fish oil showed poor performance compared with soybean oil or coconut oil, and in particular egg weight throughout the trial was significantly and consistently decreased (P < 0.05) due to dietary fish oil. Unexpectedly, shell reflectivity throughout the majority of the trial was consistently and significantly higher (P < 0.05) when hens were fed fish oil than when fed soybean oil or coconut oil. Dietary treatments affected (P < 0.05) shell shape at 4 of 8 time points tested. Average shell shape in the fish oil treatment was higher (P < 0.05) than that of the coconut oil group. Albumen height, Haugh unit and yolk color were influenced by dietary treatments only at 1 or 2 time points. However, average albumen height and Haugh unit in the fish oil treatment were higher (P < 0.05) than those of the soybean oil or coconut oil treatments, and average yolk color in the coconut oil treatment was higher (P < 0.05) than that of the soybean oil group. Serum MDA, AST and UA concentrations were increased (P < 0.05) by fish oil during the majority of the first 2 mo of the trial. These data suggest that the inclusion of fish oil in feed may reduce the performance of laying hens, especially egg weight, decrease the intensity of egg brown color and increase blood MDA, AST and UA levels compared with soybean oil or coconut oil. As a result, hens fed fish oil may lay smaller, longer and lighter-brown eggs, whereas those fed coconut oil produce blunter and darker-brown eggs relative to soybean oil.
Resampling: A Marriage of Computers and Statistics. ERIC/TM Digest.
ERIC Educational Resources Information Center
Rudner, Lawrence M.; Shafer, Mary Morello
Advances in computer technology are making it possible for educational researchers to use simpler statistical methods to address a wide range of questions with smaller data sets and fewer, and less restrictive, assumptions. This digest introduces computationally intensive statistics, collectively called resampling techniques. Resampling is a…
USDA-ARS?s Scientific Manuscript database
With enhanced data availability, distributed watershed models for large areas with high spatial and temporal resolution are increasingly used to understand water budgets and examine effects of human activities and climate change/variability on water resources. Developing parallel computing software...
Active Subspace Methods for Data-Intensive Inverse Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qiqi
2017-04-27
The project has developed theory and computational tools to exploit active subspaces to reduce the dimension in statistical calibration problems. This dimension reduction enables MCMC methods to calibrate otherwise intractable models. The same theoretical and computational tools can also reduce the measurement dimension for calibration problems that use large stores of data.
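The core active-subspace computation can be sketched briefly; this is a generic textbook-style illustration, not the project's code, and the toy model f(x) = sin(w·x) is an assumption made purely for demonstration.

```python
# Minimal sketch of the active-subspace idea described above: estimate the
# matrix C = E[grad f grad f^T] from sampled gradients and keep the leading
# eigenvectors as the reduced (calibration) directions.
import numpy as np

def active_subspace(grad_samples, k):
    """grad_samples: (M, d) array of gradient samples; k: subspace dimension."""
    C = grad_samples.T @ grad_samples / grad_samples.shape[0]
    eigvals, eigvecs = np.linalg.eigh(C)            # ascending order
    order = np.argsort(eigvals)[::-1]
    return eigvals[order], eigvecs[:, order[:k]]    # spectrum and active directions

# Toy model f(x) = sin(w . x): its gradient concentrates along w,
# so a 1-dimensional active subspace should be recovered.
rng = np.random.default_rng(0)
d, M = 10, 500
w = rng.normal(size=d)
X = rng.normal(size=(M, d))
grads = np.cos(X @ w)[:, None] * w                  # grad f = cos(w.x) * w
vals, W = active_subspace(grads, k=1)
print(vals[:3], abs(W[:, 0] @ w / np.linalg.norm(w)))   # alignment ~ 1
```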
Preliminary Evaluation of MapReduce for High-Performance Climate Data Analysis
NASA Technical Reports Server (NTRS)
Duffy, Daniel Q.; Schnase, John L.; Thompson, John H.; Freeman, Shawn M.; Clune, Thomas L.
2012-01-01
MapReduce is an approach to high-performance analytics that may be useful to data intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. We are particularly interested in the potential of MapReduce to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we are prototyping a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. Our initial focus has been on averaging operations over arbitrary spatial and temporal extents within Modern-Era Retrospective Analysis for Research and Applications (MERRA) data. Preliminary results suggest this approach can improve efficiencies within data intensive analytic workflows.
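The canonical averaging operation mentioned above can be illustrated with a toy map/reduce pair in pure Python; this is not the MERRA prototype, and the (lat, lon, value) record layout is assumed only for demonstration.

```python
# Toy illustration of a MapReduce-style averaging operation: mappers emit
# (cell, (sum, count)) pairs and the reducer combines them, so per-cell means
# over arbitrary spatial or temporal extents can be computed in parallel.
from collections import defaultdict

def mapper(records):
    # records: iterable of (lat_bin, lon_bin, value) tuples
    for lat, lon, value in records:
        yield (lat, lon), (value, 1)

def reducer(mapped):
    acc = defaultdict(lambda: [0.0, 0])
    for key, (s, c) in mapped:
        acc[key][0] += s
        acc[key][1] += c
    return {key: s / c for key, (s, c) in acc.items()}

chunk = [(10, 20, 290.1), (10, 20, 291.3), (11, 20, 288.7)]
print(reducer(mapper(chunk)))   # per-cell means
```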
Measuring and Estimating Normalized Contrast in Infrared Flash Thermography
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2013-01-01
Infrared flash thermography (IRFT) is used to detect void-like flaws in a test object. The IRFT technique involves heating the part surface using a flash from flash lamps. The post-flash evolution of the part surface temperature is sensed by an IR camera in terms of the pixel intensity of image pixels. The technique involves recording the IR video image data and analyzing the data using the normalized pixel intensity and temperature contrast analysis method to characterize void-like flaws for depth and width. This work introduces a new definition of the normalized IR pixel intensity contrast and the normalized surface temperature contrast. A procedure is provided to compute the pixel intensity contrast from the camera pixel intensity evolution data. The pixel intensity contrast and the corresponding surface temperature contrast differ but are related. This work provides a method to estimate the temperature evolution and the normalized temperature contrast from the measured pixel intensity evolution data and some additional measurements made during data acquisition.
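For readers unfamiliar with contrast evolution data, the sketch below computes one simple, commonly used normalized-contrast form from synthetic pixel intensity histories; the paper's own new definition differs in detail and is not reproduced here.

```python
# A simple, commonly used contrast form for flash thermography, shown only to
# make the idea concrete; the paper's specific normalized-contrast definition
# is not reproduced here.
import numpy as np

def normalized_contrast(pixel, reference, pre_flash_frames=5):
    """pixel, reference: 1-D intensity evolutions for a test pixel and a sound
    (defect-free) reference region. The pre-flash mean is subtracted so only
    the flash-induced rise is compared."""
    pixel_rise = pixel - pixel[:pre_flash_frames].mean()
    ref_rise = reference - reference[:pre_flash_frames].mean()
    return pixel_rise / np.where(ref_rise == 0, np.nan, ref_rise)

t = np.arange(95)
baseline = np.full(5, 100.0)                               # pre-flash frames
ref = np.concatenate([baseline, 100 + 50 * np.exp(-t / 30.0)])
flaw = np.concatenate([baseline, 100 + 60 * np.exp(-t / 40.0)])  # flaw retains heat longer
print(normalized_contrast(flaw, ref)[20])
```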
Large-Scale Compute-Intensive Analysis via a Combined In-situ and Co-scheduling Workflow Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Messer, Bronson; Sewell, Christopher; Heitmann, Katrin
2015-01-01
Large-scale simulations can produce tens of terabytes of data per analysis cycle, complicating and limiting the efficiency of workflows. Traditionally, outputs are stored on the file system and analyzed in post-processing. With the rapidly increasing size and complexity of simulations, this approach faces an uncertain future. Trending techniques consist of performing the analysis in situ, utilizing the same resources as the simulation, and/or off-loading subsets of the data to a compute-intensive analysis system. We introduce an analysis framework developed for HACC, a cosmological N-body code, that uses both in situ and co-scheduling approaches for handling petabyte-size outputs. An initial in situ step is used to reduce the amount of data to be analyzed, and to separate out the data-intensive tasks handled off-line. The analysis routines are implemented using the PISTON/VTK-m framework, allowing a single implementation of an algorithm that simultaneously targets a variety of GPU, multi-core, and many-core architectures.
Rasdaman for Big Spatial Raster Data
NASA Astrophysics Data System (ADS)
Hu, F.; Huang, Q.; Scheele, C. J.; Yang, C. P.; Yu, M.; Liu, K.
2015-12-01
Spatial raster data have grown exponentially over the past decade. Recent advancements in data acquisition technology, such as remote sensing, have allowed us to collect massive observation data of various spatial resolutions and domain coverages. The volume, velocity, and variety of such spatial data, along with the computationally intensive nature of spatial queries, pose a grand challenge to storage technologies for effective big data management. While high performance computing platforms (e.g., cloud computing) can be used to solve the computing-intensive issues in big data analysis, data have to be managed in a way that is suitable for distributed parallel processing. Recently, rasdaman (raster data manager) has emerged as a scalable and cost-effective database solution to store and retrieve massive multi-dimensional arrays, such as sensor, image, and statistics data. Within this paper, the pros and cons of using rasdaman to manage and query spatial raster data will be examined and compared with other common approaches, including file-based systems, relational databases (e.g., PostgreSQL/PostGIS), and NoSQL databases (e.g., MongoDB and Hive). Earth Observing System (EOS) data collected from NASA's Atmospheric Science Data Center (ASDC) will be used and stored in these selected database systems, and a set of spatial and non-spatial queries will be designed to benchmark their performance on retrieving large-scale, multi-dimensional arrays of EOS data. Lessons learnt from using rasdaman will be discussed as well.
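A benchmark of this kind reduces to timing the same query against each backend; the harness below is a generic illustration in which the backend names and query callables are stand-ins, not real rasdaman, PostGIS, MongoDB, or Hive client code.

```python
# Generic query-timing harness of the kind such a benchmark implies
# (illustrative only): each backend is represented by a callable that runs
# one query; the entries below are placeholders, not real client code.
import statistics
import time

def benchmark(query_fn, repeats=5):
    timings = []
    for _ in range(repeats):
        start = time.perf_counter()
        query_fn()
        timings.append(time.perf_counter() - start)
    return statistics.median(timings)

def run_suite(backends):
    # backends: {backend_name: {query_name: callable}}; returns median seconds.
    return {(name, query): benchmark(fn)
            for name, queries in backends.items()
            for query, fn in queries.items()}

backends = {"rasdaman": {"subset": lambda: sum(range(10**5))},
            "postgis":  {"subset": lambda: sum(range(2 * 10**5))}}
print(run_suite(backends))
```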
A Cyber-ITS framework for massive traffic data analysis using cyber infrastructure.
Xia, Yingjie; Hu, Jia; Fontaine, Michael D
2013-01-01
Traffic data are commonly collected from widely deployed sensors in urban areas. This brings up a new research topic, data-driven intelligent transportation systems (ITSs), which aim to integrate heterogeneous traffic data from different kinds of sensors and apply them to ITS applications. This research, taking into consideration the significant increase in the amount of traffic data and the complexity of data analysis, focuses mainly on the challenge of solving data-intensive and computation-intensive problems. As a solution to these problems, this paper proposes a Cyber-ITS framework to perform data analysis on Cyber Infrastructure (CI), which is by nature a parallel-computing hardware and software system, in the context of ITS. The techniques of the framework include data representation, domain decomposition, resource allocation, and parallel processing. All these techniques are based on data-driven and application-oriented models and are organized in a component-and-workflow-based model in order to achieve technical interoperability and data reusability. A case study of the Cyber-ITS framework is presented based on a traffic state estimation application that fuses massive Sydney Coordinated Adaptive Traffic System (SCATS) data and GPS data. The results show that the Cyber-ITS-based implementation can achieve a high accuracy of traffic state estimation and provide a significant computational speedup for the data fusion through parallel computing.
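The domain-decomposition and parallel-processing steps can be pictured with a small multiprocessing sketch; this illustrates the general idea only, with a made-up "mean speed per region" standing in for the paper's traffic state estimation and data fusion.

```python
# Illustrative sketch of domain decomposition plus parallel processing (not
# the Cyber-ITS code): sensor records are partitioned by region and each
# partition is processed by a separate worker before results are fused.
from multiprocessing import Pool

def estimate_state(partition):
    # Placeholder "traffic state" estimate: mean speed per region.
    region, records = partition
    speeds = [r["speed"] for r in records]
    return region, sum(speeds) / len(speeds)

def decompose(records):
    regions = {}
    for r in records:
        regions.setdefault(r["region"], []).append(r)
    return list(regions.items())

if __name__ == "__main__":
    data = [{"region": "cbd", "speed": 32.0}, {"region": "cbd", "speed": 28.5},
            {"region": "north", "speed": 55.0}]
    with Pool(processes=2) as pool:
        print(dict(pool.map(estimate_state, decompose(data))))
```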
Raw data normalization for a multi source inverse geometry CT system
Baek, Jongduk; De Man, Bruno; Harrison, Daniel; Pelc, Norbert J.
2015-01-01
A multi-source inverse-geometry CT (MS-IGCT) system consists of a small 2D detector array and multiple x-ray sources. During data acquisition, each source is activated sequentially and may show random intensity fluctuations relative to its nominal intensity. While a conventional 3rd-generation CT system uses a reference channel to monitor source intensity fluctuation, each source of the MS-IGCT system illuminates only a small portion of the entire field of view (FOV). It is therefore difficult for all sources to illuminate a reference channel, and projection data computed by standard normalization using each source's flat-field data contain errors that can cause significant artifacts. In this work, we present a raw data normalization algorithm to reduce the image artifacts caused by source intensity fluctuation. The proposed method was tested using computer simulations with a uniform water phantom and a Shepp-Logan phantom, and with experimental data of an ice-filled PMMA phantom and a rabbit. The effect on image resolution and the robustness to noise were tested using the MTF and the standard deviation of the reconstructed noise image. With intensity fluctuation and no correction, reconstructed images from simulation and experimental data show high-frequency artifacts and ring artifacts, which are removed effectively using the proposed method. It is also observed that the proposed method does not degrade the image resolution and is very robust to the presence of noise. PMID:25837090
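For context, the sketch below shows the standard flat-field normalization that the abstract says breaks down under source fluctuation; the paper's actual contribution, estimating each source's per-view fluctuation without a reference channel, is not reproduced.

```python
# Standard flat-field normalization for a single source, shown for context.
# A simulated +2% source fluctuation illustrates why uncorrected projections
# are biased, which is the problem the paper's algorithm addresses.
import numpy as np

def normalize_projection(raw, flat, dark=None):
    """Return line integrals p = -ln((raw - dark) / (flat - dark))."""
    dark = np.zeros_like(raw) if dark is None else dark
    transmission = (raw - dark) / (flat - dark)
    return -np.log(np.clip(transmission, 1e-6, None))

flat = np.full((4, 8), 1000.0)        # flat-field (air) scan for this source
raw = flat * np.exp(-0.3) * 1.02      # attenuated view with +2% source fluctuation
p = normalize_projection(raw, flat)
print(p.mean())                       # ~0.28 instead of the true 0.3
```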
Enabling NVM for Data-Intensive Scientific Services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carns, Philip; Jenkins, John; Seo, Sangmin
Specialized, transient data services are playing an increasingly prominent role in data-intensive scientific computing. These services offer flexible, on-demand pairing of applications with storage hardware using semantics that are optimized for the problem domain. Concurrent with this trend, upcoming scientific computing and big data systems will be deployed with emerging NVM technology to achieve the highest possible price/productivity ratio. Clearly, therefore, we must develop techniques to facilitate the confluence of specialized data services and NVM technology. In this work we explore how to enable the composition of NVM resources within transient distributed services while still retaining their essential performance characteristics. Our approach involves eschewing the conventional distributed file system model and instead projecting NVM devices as remote microservices that leverage user-level threads, RPC services, RMA-enabled network transports, and persistent memory libraries in order to maximize performance. We describe a prototype system that incorporates these concepts, evaluate its performance for key workloads on an exemplar system, and discuss how the system can be leveraged as a component of future data-intensive architectures.
Data Intensive Scientific Workflows on a Federated Cloud: CRADA Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garzoglio, Gabriele
The Fermilab Scientific Computing Division and the KISTI Global Science Experimental Data Hub Center have built a prototypical large-scale infrastructure to handle scientific workflows of stakeholders to run on multiple cloud resources. The demonstrations have been in the areas of (a) Data-Intensive Scientific Workflows on Federated Clouds, (b) Interoperability and Federation of Cloud Resources, and (c) Virtual Infrastructure Automation to enable On-Demand Services.
Radar cross calibration investigation TAMU radar polarimeter calibration measurements
NASA Technical Reports Server (NTRS)
Blanchard, A. J.; Newton, R. W.; Bong, S.; Kronke, C.; Warren, G. L.; Carey, D.
1982-01-01
A short-pulse, 20 MHz bandwidth, three-frequency radar polarimeter system (RPS) operates at center frequencies of 10.003 GHz, 4.75 GHz, and 1.6 GHz and utilizes dual-polarized transmit and receive antennas for each frequency. The basic layout of the RPS differs from other truck-mounted systems in that it uses a pulse-compression IF section common to all three RF heads. Separate transmit and receive antennas are used to improve the cross-polarization isolation at each frequency. The receiver is a digitally controlled, gain-modulated subsystem interfaced directly with a microprocessor computer for control and data manipulation. Antenna focusing distance, focusing of each antenna pair, RF head stability, and the polarization characteristics of the RPS antennas are discussed. Platform and data acquisition procedures are described.
Scalable Automated Model Search
2014-05-20
Extraction fragments only: the work concerns large-scale optimization for distributed big-data computing, using tools from Continuum Analytics, Apache Spark 0.8.1, and Hadoop 1.0.4 configured on local disks as the data store for large-scale experiments; cited related work includes Hyracks (Borkar et al., ICDE 2011), a flexible and extensible foundation for data-intensive computing.
Computational chemistry in pharmaceutical research: at the crossroads.
Bajorath, Jürgen
2012-01-01
Computational approaches are an integral part of pharmaceutical research. However, there are many unsolved key questions that limit scientific progress in the still-evolving computational field and its impact on drug discovery. Importantly, a number of these questions are not new but date back many years. Hence, it might be difficult to answer them conclusively in the foreseeable future. Moreover, the computational field as a whole is characterized by a high degree of heterogeneity, and so, unfortunately, is the quality of its scientific output. In light of this situation, it is proposed that changes in scientific standards and culture should be seriously considered now in order to lay a foundation for future progress in computational research.
[Spatial distribution pattern of Pontania dolichura larvae and sampling technique].
Zhang, Feng; Chen, Zhijie; Zhang, Shulian; Zhao, Huiyan
2006-03-01
In this paper, the spatial distribution pattern of Pontania dolichura larvae was analyzed with Taylor's power law, Iwao's distribution function, and six aggregation indexes. The results showed that the spatial distribution pattern of P. dolichura larvae was aggregated, that the basic component of the distribution was the individual colony, and that the aggregation intensity increased with density. On branches, the aggregation was caused by the egg-laying behavior of adults and the spatial position of leaves; on leaves, the aggregation was caused by the spatial position of new leaves in spring when m < 2.37, and by the spatial position of new leaves in spring together with the behavior of eclosion and egg laying when m > 2.37. Using the parameters alpha and beta in Iwao's m*-m regression equation, the optimal and sequential sampling numbers were determined.
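The two regression steps can be made concrete with textbook forms of Taylor's power law and Iwao's m*-m regression, together with a standard Iwao-type optimal sample size formula; this is an illustrative sketch with invented count data, not the authors' analysis.

```python
# Textbook forms of the two regressions named above: Taylor's power law fits
# log(s^2) on log(m); Iwao's regression fits mean crowding m* = m + s^2/m - 1
# on m. Iwao's alpha and beta then give a standard optimal sample size
# n = (t/D)^2 * ((alpha + 1)/m + beta - 1).
import numpy as np

def taylor_power_law(means, variances):
    b, log_a = np.polyfit(np.log(means), np.log(variances), 1)
    return np.exp(log_a), b                      # a, b (b > 1 suggests aggregation)

def iwao_regression(means, variances):
    m_star = means + variances / means - 1.0     # Lloyd's mean crowding
    beta, alpha = np.polyfit(means, m_star, 1)
    return alpha, beta

def optimal_sample_size(mean_density, alpha, beta, t=2.0, precision=0.2):
    return (t / precision) ** 2 * ((alpha + 1.0) / mean_density + beta - 1.0)

m = np.array([0.8, 1.5, 2.4, 3.9, 5.1])          # invented mean densities
s2 = np.array([1.1, 2.6, 4.9, 9.5, 14.0])        # variance > mean: aggregation
a, b = taylor_power_law(m, s2)
alpha, beta = iwao_regression(m, s2)
print(round(b, 2), round(optimal_sample_size(2.37, alpha, beta), 1))
```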
Evaluating virtual hosted desktops for graphics-intensive astronomy
NASA Astrophysics Data System (ADS)
Meade, B. F.; Fluke, C. J.
2018-04-01
Visualisation of data is critical to understanding astronomical phenomena. Today, many instruments produce datasets that are too big to be downloaded to a local computer, yet many of the visualisation tools used by astronomers are deployed only on desktop computers. Cloud computing is increasingly used to provide a computation and simulation platform in astronomy, but it also offers great potential as a visualisation platform. Virtual hosted desktops, with graphics processing unit (GPU) acceleration, allow interactive, graphics-intensive desktop applications to operate co-located with astronomy datasets stored in remote data centres. By combining benchmarking and user experience testing with a cohort of 20 astronomers, we investigate the viability of replacing physical desktop computers with virtual hosted desktops. In our work, we compare two Apple MacBook computers (one old and one new, representing hardware at opposite ends of its useful lifetime) with two virtual hosted desktops: one commercial (Amazon Web Services) and one in a private research cloud (the Australian NeCTAR Research Cloud). For two-dimensional image-based tasks and graphics-intensive three-dimensional operations - typical of astronomy visualisation workflows - we found that benchmarks do not necessarily provide the best indication of performance. When compared to typical laptop computers, virtual hosted desktops can provide a better user experience, even with lower-performing graphics cards. We also found that virtual hosted desktops are equally simple to use, provide greater flexibility in the choice of configuration, and may actually be a more cost-effective option for typical usage profiles.
ERIC Educational Resources Information Center
Henrichs, Lotte F.; Leseman, Paul P. M.
2014-01-01
Early science instruction is important in order to lay a firm basis for learning scientific concepts and scientific thinking. In addition, young children enjoy science. However, science plays only a minor role in the kindergarten curriculum. It has been reported that teachers feel they need to prioritize language and literacy practices over…
Gebhardt-Henrich, Sabine G.; Pfulg, Andreas; Fröhlich, Ernst K. F.; Käppeli, Susanna; Guggisberg, Dominik; Liesegang, Annette; Stoffel, Michael H.
2017-01-01
Keel bone damage is a widespread welfare problem in laying hens. It is unclear so far whether bone quality relates to keel bone damage. The goal of the present study was to detect possible associations between keel bone damage and bone properties of intact and damaged keel bones and of tibias in end-of-lay hens raised in loose housing systems. Bones were palpated and examined by peripheral quantitative computed tomography (PQCT), a three-point bending test, and analyses of bone ash. Contrary to our expectations, PQCT revealed higher cortical and trabecular contents in fractured than in intact keel bones. This might be due to structural bone repair after fractures. Density measurements of cortical and trabecular tissues of keel bones did not differ between individuals with and without fractures. In the three-point bending test of the tibias, ultimate shear strength was significantly higher in birds with intact vs. fractured keel bones. Likewise, birds with intact or slightly deviated keel bones had higher mineral and calcium contents of the keel bone than birds with fractured keel bones. Calcium content in keel bones was correlated with calcium content in tibias. Although there were some associations between bone traits related to bone strength and keel bone damage, other factors, such as stochastic housing-related events like falls and collisions, seem to be at least as important for the prevalence of keel bone damage. PMID:28848740
Long-Term Technology Planning: Laying the Foundation To Improve Illinois Schools.
ERIC Educational Resources Information Center
Barker, Bruce O.; Hall, Robert F.
This report provides guidelines for establishing a long-term technology plan for education, applicable to schools in all states. Advanced and emerging telecommunications and computer technologies have resulted in an ever increasing need for teachers and students to develop information processing and lifelong learning skills for gathering and…
Lay Health Influencers: How They Tailor Brief Tobacco Cessation Interventions
ERIC Educational Resources Information Center
Yuan, Nicole P.; Castaneda, Heide; Nichter, Mark; Nichter, Mimi; Wind, Steven; Carruth, Lauren; Muramoto, Myra
2012-01-01
Interventions tailored to individual smoker characteristics have increasingly received attention in the tobacco control literature. The majority of tailored interventions are generated by computers and administered with printed materials or web-based programs. The purpose of this study was to examine the tailoring activities of community lay…
Impedance computations and beam-based measurements: A problem of discrepancy
NASA Astrophysics Data System (ADS)
Smaluk, Victor
2018-04-01
High intensity of particle beams is crucial for high-performance operation of modern electron-positron storage rings, both colliders and light sources. The beam intensity is limited by the interaction of the beam with self-induced electromagnetic fields (wake fields) proportional to the vacuum chamber impedance. For a new accelerator project, the total broadband impedance is computed by element-wise wake-field simulations using computer codes. For a machine in operation, the impedance can be measured experimentally using beam-based techniques. In this article, a comparative analysis of impedance computations and beam-based measurements is presented for 15 electron-positron storage rings. The measured data and the predictions based on the computed impedance budgets show a significant discrepancy. Three possible reasons for the discrepancy are discussed: interference of the wake fields excited by a beam in adjacent components of the vacuum chamber, effect of computation mesh size, and effect of insufficient bandwidth of the computed impedance.
Casey-Trott, T M; Korver, D R; Guerin, M T; Sandilands, V; Torrey, S; Widowski, T M
2017-08-01
Osteoporosis in laying hens has been a production and welfare concern for several decades. The objective of this study was to determine whether differing opportunities for exercise during pullet rearing influences long-term bone quality characteristics in end-of-lay hens. A secondary objective was to assess whether differing opportunities for exercise in adult housing systems alters bone quality characteristics in end-of-lay hens. Four flock replicates of 588 Lohmann Selected Leghorn-Lite pullets were reared in either conventional cages (Conv) or an aviary rearing system (Avi) and placed into conventional cages (CC), 30-bird furnished cages (FC-S), or 60-bird furnished cages (FC-L) for adult housing. Wing and leg bones were collected at the end-of-lay to quantify bone composition and strength using quantitative computed tomography and bone breaking strength (BBS). At the end-of-lay, Avi hens had greater total and cortical cross-sectional area (P < 0.05) for the radius and tibia, greater total bone mineral content of the radius (P < 0.001), and greater tibial cortical bone mineral content (P = 0.029) than the Conv hens; however, total bone mineral density of the radius (P < 0.001) and cortical bone mineral density of the radius and tibia (P < 0.001) were greater in the Conv hens. Hens in the FC-L had greater total bone mineral density for the radius and tibia (P < 0.05) and greater trabecular bone mineral density for the radius (P = 0.027), compared to hens in the FC-S and CC. Total bone mineral content of the tibia (P = 0.030) and cortical bone mineral content of the radius (P = 0.030) and tibia (P = 0.013) were greater in the FC-L compared to the CC. The humerus of Conv hens had greater BBS than the Avi hens (P < 0.001), and the tibiae of FC-L and FC-S hens had greater BBS than CC hens (P = 0.006). Increased opportunities for exercise offered by the aviary rearing system provided improved bone quality characteristics lasting through to the end-of-lay. © The Author 2017. Published by Oxford University Press on behalf of Poultry Science Association.
Data communication network at the ASRM facility
NASA Astrophysics Data System (ADS)
Moorhead, Robert J., II; Smith, Wayne D.
1993-08-01
This report describes the simulation of the overall communication network structure for the Advanced Solid Rocket Motor (ASRM) facility being built at Yellow Creek near Iuka, Mississippi. The report is compiled using information received from NASA/MSFC, LMSC, AAD, and RUST Inc. Per the information gathered, the overall network structure will have one logical FDDI ring acting as a backbone for the whole complex. The buildings will be grouped into two categories: manufacturing-intensive and manufacturing-non-intensive. The manufacturing-intensive buildings will be connected via FDDI to the Operational Information System (OIS) in the main computing center in B_1000. The manufacturing-non-intensive buildings will be connected by 10BASE-FL to the OIS through the Business Information System (BIS) hub in the main computing center. All the devices inside B_1000 will communicate with the BIS. The workcells will be connected to the Area Supervisory Computers (ASCs) through the nearest manufacturing-intensive hub and one of the OIS hubs. Comdisco's Block Oriented Network Simulator (BONeS) has been used to simulate the performance of the network. BONeS models a network topology, traffic, data structures, and protocol functions using a graphical interface. The main aim of the simulations was to evaluate the loading of the OIS, the BIS, the ASCs, and the network links by the traffic generated by the workstations and workcells throughout the site.
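Before running a full BONeS simulation, a first-order loading check is simply offered traffic divided by link capacity; the sketch below illustrates that arithmetic with assumed traffic figures (only the nominal FDDI and 10BASE-FL capacities are standard values, the rest are invented).

```python
# Back-of-the-envelope link loading check of the kind the simulations quantify
# in detail (illustrative; traffic figures are assumptions, not ASRM values).
def utilization(sources, rate_per_source_mbps, link_capacity_mbps):
    # Fraction of link capacity consumed by the aggregate offered traffic.
    return sources * rate_per_source_mbps / link_capacity_mbps

fddi_backbone = utilization(sources=200, rate_per_source_mbps=0.3,
                            link_capacity_mbps=100.0)   # FDDI: nominal 100 Mb/s
hub_uplink = utilization(sources=40, rate_per_source_mbps=0.3,
                         link_capacity_mbps=10.0)       # 10BASE-FL: nominal 10 Mb/s
print(f"backbone {fddi_backbone:.0%}, hub uplink {hub_uplink:.0%}")
```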
Low Latency Workflow Scheduling and an Application of Hyperspectral Brightness Temperatures
NASA Astrophysics Data System (ADS)
Nguyen, P. T.; Chapman, D. R.; Halem, M.
2012-12-01
New system analytics for Big Data computing holds the promise of major scientific breakthroughs and discoveries from the exploration and mining of the massive data sets becoming available to the science community. However, such data intensive scientific applications face severe challenges in accessing, managing and analyzing petabytes of data. While the Hadoop MapReduce environment has been successfully applied to data intensive problems arising in business, there are still many scientific problem domains where limitations in the functionality of MapReduce systems prevent its wide adoption by those communities. This is mainly because MapReduce does not readily support the unique science discipline needs such as special science data formats, graphic and computational data analysis tools, maintaining high degrees of computational accuracy, and interfacing with an application's existing components across heterogeneous computing processors. We address some of these limitations by exploiting the MapReduce programming model for satellite data intensive scientific problems and address scalability, reliability, scheduling, and data management issues when dealing with climate data records and their complex observational challenges. In addition, we will present techniques to support the unique Earth science discipline needs such as dealing with special science data formats (HDF and NetCDF). We have developed a Hadoop task scheduling algorithm that improves latency by 2x for a scientific workflow including the gridding of the EOS AIRS hyperspectral Brightness Temperatures (BT). This workflow processing algorithm has been tested on the Multicore Computing Center's private Hadoop-based Intel Nehalem cluster, as well as in a virtual mode under the Open Source Eucalyptus cloud. The 55TB AIRS hyperspectral L1b Brightness Temperature record has been gridded at a resolution of 0.5 x 1.0 degrees, and we have computed a 0.9 annual anti-correlation to the El Niño-Southern Oscillation in the Niño 4 region, as well as a 1.9 Kelvin decadal Arctic warming in the 4 μm and 12 μm spectral regions. Additionally, we will present the frequency of extreme global warming events by the use of a normalized maximum BT in a grid cell relative to its local standard deviation. A low-latency Hadoop scheduling environment maintains data integrity and fault tolerance in a data-intensive MapReduce cloud environment while improving the "time to solution" metric by 35% when compared to a more traditional parallel processing system for the same dataset. Our next step will be to improve the usability of our Hadoop task scheduling system, to enable rapid prototyping of data intensive experiments by means of processing "kernels". We will report on the performance and experience of implementing these experiments on the NEX testbed, and propose the use of a graphical directed acyclic graph (DAG) interface to help us develop on-demand scientific experiments. Our workflow system works within the Hadoop infrastructure as a replacement for the FIFO or FairScheduler, thus the use of Apache Pig Latin or other Apache tools may also be worth investigating on the NEX system to improve the usability of our workflow scheduling infrastructure for rapid experimentation.
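Two of the operations described, gridding brightness temperatures at 0.5 x 1.0 degrees and flagging extremes by a normalized maximum, can be sketched with NumPy; this is an illustration on synthetic data, not the AIRS workflow code.

```python
# Sketch of the two operations described above: bin brightness temperatures
# onto a 0.5 x 1.0 degree grid, then score "extreme" cells by the normalized
# maximum, (max BT - cell mean) / cell standard deviation.
import numpy as np

def grid_mean(lats, lons, bt, dlat=0.5, dlon=1.0):
    i = ((lats + 90) / dlat).astype(int)
    j = ((lons + 180) / dlon).astype(int)
    shape = (int(180 / dlat), int(360 / dlon))
    total = np.zeros(shape); count = np.zeros(shape)
    np.add.at(total, (i, j), bt)
    np.add.at(count, (i, j), 1)
    return np.divide(total, count, out=np.full(shape, np.nan), where=count > 0)

def normalized_max(cell_values):
    return (cell_values.max() - cell_values.mean()) / cell_values.std()

rng = np.random.default_rng(1)
lats, lons = rng.uniform(-90, 89.9, 10_000), rng.uniform(-180, 179.9, 10_000)
bt = 250 + 30 * rng.standard_normal(10_000)          # synthetic BT values in Kelvin
grid = grid_mean(lats, lons, bt)
print(np.nanmean(grid), normalized_max(bt[:500]))
```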
Nuclear thermal propulsion workshop overview
NASA Technical Reports Server (NTRS)
Clark, John S.
1991-01-01
NASA is planning an Exploration Technology Program as part of the Space Exploration Initiative to return U.S. astronauts to the Moon, conduct intensive robotic exploration of the Moon and Mars, and conduct a piloted mission to Mars by 2019. Nuclear propulsion is one of the key technology thrusts for the human mission to Mars. The workshop addresses nuclear thermal propulsion (NTP) technologies, with the purpose to: assess the state of the art of nuclear propulsion concepts; assess the potential benefits of the concepts for the mission to Mars; identify critical, enabling technologies; lay out first-order technology development plans, including facility requirements; and estimate the cost of developing these technologies to flight-ready status. The output from the workshop will serve as a database for nuclear propulsion project planning.
The do re mi's of everyday life: the structure and personality correlates of music preferences.
Rentfrow, Peter J; Gosling, Samuel D
2003-06-01
The present research examined individual differences in music preferences. A series of 6 studies investigated lay beliefs about music, the structure underlying music preferences, and the links between music preferences and personality. The data indicated that people consider music an important aspect of their lives and listening to music an activity they engaged in frequently. Using multiple samples, methods, and geographic regions, analyses of the music preferences of over 3,500 individuals converged to reveal 4 music-preference dimensions: Reflective and Complex, Intense and Rebellious, Upbeat and Conventional, and Energetic and Rhythmic. Preferences for these music dimensions were related to a wide array of personality dimensions (e.g., Openness), self-views (e.g., political orientation), and cognitive abilities (e.g., verbal IQ).
Three years of biomedical FEL use in medicine and surgery How far have we come?
NASA Astrophysics Data System (ADS)
Jean, Benedikt
1997-02-01
Since the FEL has been made available for biophysical research in the IR, it has revolutionized strategies for optimizing laser-tissue interaction and minimizing adverse effects, in particular for photoablative use in surgery. Its tunability, together with the free combination of wavelength and energy, has made it an efficient research tool, allowing the reduction of the risks and costs of preclinical biomedical research. New computer-assisted surgical techniques have evolved, and the broader data basis of IR photothermal ablation allows more accurate predictive modelling of the efficiency and adverse effects of photoablation. New applications for diagnostic imaging as well as the first clinical applications in neurosurgery lie ahead.
Advanced construction management for lunar base construction - Surface operations planner
NASA Technical Reports Server (NTRS)
Kehoe, Robert P.
1992-01-01
The study proposes a conceptual solution and lays the framework for developing a new, sophisticated and intelligent tool for a lunar base construction crew to use. This concept integrates expert systems for critical decision making, virtual reality for training, logistics and laydown optimization, automated productivity measurements, and an advanced scheduling tool to form a unique new planning tool. The concept features extensive use of computers and expert systems software to support the actual work, while allowing the crew to control the project from the lunar surface. Consideration is given to a logistics data base, laydown area management, flexible critical progress scheduler, video simulation of assembly tasks, and assembly information and tracking documentation.
Framework Resources Multiply Computing Power
NASA Technical Reports Server (NTRS)
2010-01-01
As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California, (now FusionGeo Inc., of The Woodlands, Texas) to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.
Smith, Leigh M; Davidson, Patricia M; Halcomb, Elizabeth J; Andrew, Sharon
2007-11-01
The importance of early defibrillation in improving outcomes and reducing morbidity following out-of-hospital cardiac arrest underscores the need to examine novel approaches to treatment access. The increasing evidence supporting early defibrillation has focused attention on the potential for lay responders to deliver this therapy. This paper critically reviews the literature evaluating the impact of lay responder defibrillator programs on survival to hospital discharge following an out-of-hospital cardiac arrest in the adult population. The electronic databases Medline and CINAHL were searched using keywords including "first responder", "lay responder", "defibrillation" and "cardiac arrest". The reference lists of retrieved articles and the Internet were also searched. Articles were included in the review if they reported primary data, in the English language, describing the effect of a lay responder defibrillation program on survival to hospital discharge from out-of-hospital cardiac arrest in adults. Eleven studies met the inclusion criteria. The small number of published studies and the heterogeneity of study populations and outcome methods prohibited formal meta-analysis; therefore, narrative analysis was undertaken. The included studies provided inconsistent findings on survival to hospital discharge following out-of-hospital cardiac arrest. Although data are limited, the role of the lay responder appears promising in improving outcomes from out-of-hospital cardiac arrest through early defibrillation. Despite the inherent methodological difficulties in studying this population, future research should address outcomes related to morbidity, mortality and cost-effectiveness.
Cloud computing applications for biomedical science: A perspective.
Navale, Vivek; Bourne, Philip E
2018-06-01
Biomedical research has become a digital data-intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches. Cloud computing offers users pay-as-you-go access to services such as hardware infrastructure, platforms, and software for solving common biomedical computational problems. Cloud computing services offer secure on-demand storage and analysis and are differentiated from traditional high-performance computing by their rapid availability and scalability of services. As such, cloud services are engineered to address big data problems and enhance the likelihood of data and analytics sharing, reproducibility, and reuse. Here, we provide an introductory perspective on cloud computing to help the reader determine its value to their own research.
Modern Computational Techniques for the HMMER Sequence Analysis
2013-01-01
This paper focuses on the latest research and critical reviews on modern computing architectures and software- and hardware-accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications—hidden Markov models (HMM). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize using both traditional software approaches and innovative hardware acceleration technologies. PMID:25937944
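At the heart of HMM-based sequence analysis is the forward algorithm; the minimal discrete-HMM sketch below shows that core recurrence only, whereas real profile HMMs add insert/delete states, log-space arithmetic, and heavy vectorization.

```python
# Minimal forward-algorithm sketch for a discrete HMM, included only to make
# the core computation concrete (not HMMER's implementation).
import numpy as np

def forward(obs, start_p, trans_p, emit_p):
    """Return P(observation sequence) for a discrete HMM.
    start_p: (S,), trans_p: (S, S), emit_p: (S, V); obs: list of symbol indices."""
    alpha = start_p * emit_p[:, obs[0]]
    for symbol in obs[1:]:
        alpha = (alpha @ trans_p) * emit_p[:, symbol]
    return alpha.sum()

start = np.array([0.6, 0.4])
trans = np.array([[0.7, 0.3], [0.4, 0.6]])
emit = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])   # 2 states, 3 symbols
print(forward([0, 1, 2, 1], start, trans, emit))
```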
NASA Astrophysics Data System (ADS)
Hampton, S. E.
2015-12-01
The science necessary to unravel complex environmental problems confronts severe computational challenges - coping with huge volumes of heterogeneous data, spanning vast spatial scales at high resolution, and requiring integration of disparate measurements from multiple disciplines. But as cyberinfrastructure advances to support such work, scientists in many fields lack sufficient computational skills to participate in interdisciplinary, data-intensive research. In response, we developed innovative training workshops for early-career scientists, in order to explore both the needs and solutions for training next-generation scientists in skills for data-intensive environmental research. In 2013 and 2014 we ran intensive 3-week training workshops for early-career researchers. One of the workshops was run concurrently in California and North Carolina, connected by virtual technologies and coordinated schedules. We attracted applicants to the workshop with the opportunity to pursue data-intensive small-group research projects that they proposed. This approach presented a realistic possibility that publishable products could result from 3 weeks of focused hands-on classroom instruction combined with self-directed group research in which instructors were present to assist trainees. Instruction addressed 1) collaboration modes and technologies, 2) data management, preservation, and sharing, 3) preparing data for analysis using scripting, 4) reproducible research, 5) sustainable software practices, 6) data analysis and modeling, and 7) communicating results to broad communities. The most dramatic improvements in technical skills were in data management, version control, and working with spatial data outside of proprietary software. In addition, participants built strong networks and collaborative skills that later resulted in a successful student-led grant proposal, published manuscripts, and participants reported that the training was a highly influential experience.
Barriers and Incentives to Computer Usage in Teaching
1988-09-29
classes with one or two computers. Research Methods The two major methods of data-gathering employed in this study were intensive and extensive classroom ... observation and repeated extended interviews with students and teachers. Administrators were also interviewed when appropriate. Classroom observers used
NASA Astrophysics Data System (ADS)
Afanasyev, A. P.; Bazhenov, R. I.; Luchaninov, D. V.
2018-05-01
The main purpose of the research is to develop techniques for defining the best technical and economic trajectories of cables in urban power systems. The proposed algorithms for calculating cable-laying routes take into consideration the topological, technical and economic features of the cabling. A discrete variant of the fast marching method is applied as the calculating tool. It has certain advantages compared with other approaches; in particular, the algorithm is computationally inexpensive because it is not iterative. The resulting cable-laying trajectories are considered optimal from the point of view of technical and economic criteria and correspond to the current rules of modern urban development.
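The route-search idea can be illustrated with a discrete shortest-path sketch; note that Dijkstra's algorithm on a cost grid is used here only as a stand-in for the fast marching method (which solves the continuous eikonal equation), and the cost values are invented.

```python
# Illustrative stand-in for the discrete route search described above:
# Dijkstra on a per-cell cost grid, conveying the "cheapest trajectory
# through a cost field" idea rather than fast marching proper.
import heapq

def cheapest_route(cost, start, goal):
    """cost: 2-D list of per-cell laying costs; returns the minimal accumulated cost."""
    rows, cols = len(cost), len(cost[0])
    best = {start: cost[start[0]][start[1]]}
    heap = [(best[start], start)]
    while heap:
        dist, (i, j) = heapq.heappop(heap)
        if (i, j) == goal:
            return dist
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < rows and 0 <= nj < cols:
                ndist = dist + cost[ni][nj]
                if ndist < best.get((ni, nj), float("inf")):
                    best[(ni, nj)] = ndist
                    heapq.heappush(heap, (ndist, (ni, nj)))
    return float("inf")

grid = [[1, 1, 9],      # 9 models an expensive crossing (e.g. under a road)
        [9, 1, 9],
        [9, 1, 1]]
print(cheapest_route(grid, (0, 0), (2, 2)))   # -> 5 along the cheap corridor
```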
ERIC Educational Resources Information Center
Yu, Mei-yu; Song, Lixin; Seetoo, Amy; Cai, Cuijuan; Smith, Gary; Oakley, Deborah
2007-01-01
The lay health advisor (LHA) training program for breast cancer screening was conducted among Chinese-English bilingual trainees residing in Southeast Michigan. Guided by Bandura's Social Learning Theory, the development of the training curriculum followed the health communication process recommended by the National Cancer Institute. Data analysis…
Communal egg-laying in reptiles and amphibians: evolutionary patterns and hypotheses.
Doody, J Sean; Freedberg, Steve; Keogh, J Scott
2009-09-01
Communal egg-laying is widespread among animals, occurring in insects, mollusks, fish, amphibians, reptiles, and birds, just to name a few. While some benefits of communal egg-laying may be pervasive (e.g., it saves time and energy and may ensure the survival of mothers and their offspring), the remarkable diversity in the life histories of the animals that exhibit this behavior presents a great challenge to discovering any general explanation. Reptiles and amphibians offer ideal systems for investigating communal egg-laying because they generally lack parental care--a simplification that brings nest site choice behavior into sharp focus. We exhaustively reviewed the published literature for data on communal egg-laying in reptiles and amphibians. Our analysis demonstrates that the behavior is much more common than previously recognized (occurring in 481 spp.), especially among lizards (N = 255 spp.), where the behavior has evolved multiple times. Our conceptual review strongly suggests that different forces may be driving the evolution and maintenance of communal egg-laying in different taxa. Using a game theory approach, we demonstrate how a stable equilibrium may occur between solitary and communal layers, thus allowing both strategies to co-exist in some populations, and we discuss factors that may influence these proportions. We conclude by outlining future research directions for determining the proximate and ultimate causes of communal egg-laying.
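The game-theoretic argument can be illustrated with a toy frequency-dependent payoff model; the payoffs and replicator-dynamics form below are invented for illustration and are not the authors' model.

```python
# Toy frequency-dependent model (purely illustrative; payoff values are made
# up). Communal laying pays off when rare (b) but declines with crowding
# (c * p), while solitary laying pays a fixed s; replicator dynamics settle at
# the mixed equilibrium p* = (b - s) / c where both strategies coexist.
def replicator(p, b=1.0, c=0.8, s=0.7, steps=1000, dt=0.1):
    # p: initial fraction of communal layers
    for _ in range(steps):
        w_communal = b - c * p
        w_mean = p * w_communal + (1 - p) * s
        p += dt * p * (w_communal - w_mean)
    return p

print(round(replicator(0.05), 3), (1.0 - 0.7) / 0.8)   # both ~ 0.375: mixed equilibrium
```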
NASA Astrophysics Data System (ADS)
Li, J.; Zhang, T.; Huang, Q.; Liu, Q.
2014-12-01
Today's climate datasets are characterized by large volume, a high degree of spatiotemporal complexity, and fast evolution over time. As visualizing large, distributed climate datasets is computationally intensive, traditional desktop-based visualization applications fail to handle the computational load. Recently, scientists have developed remote visualization techniques to address this computational issue. Remote visualization techniques usually leverage server-side parallel computing capabilities to perform visualization tasks and deliver visualization results to clients through the network. In this research, we aim to build a remote parallel visualization platform for visualizing and analyzing massive climate data. Our visualization platform was built on ParaView, one of the most popular open-source remote visualization and analysis applications. To further enhance the scalability and stability of the platform, we employed cloud computing techniques to support its deployment. In this platform, all climate datasets are regular grid data stored in NetCDF format. Three types of data access methods are supported: accessing remote datasets provided by OPeNDAP servers, accessing datasets hosted on the web visualization server, and accessing local datasets. Regardless of the data access method, all visualization tasks are completed on the server side to reduce the workload of clients. As a proof of concept, we have implemented a set of scientific visualization methods to show the feasibility of the platform. Preliminary results indicate that the framework can address the computational limitations of desktop-based visualization applications.
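The server-side rendering step can be sketched as follows, assuming the netCDF4 and matplotlib libraries and a hypothetical file and variable name; this illustrates the "render on the server, ship an image to the client" idea, not the platform's ParaView-based implementation.

```python
# Minimal server-side rendering sketch (illustrative assumption, not the
# platform's code): open a NetCDF grid, render one 2-D slice off-screen, and
# write a PNG that a thin client could fetch.
import matplotlib
matplotlib.use("Agg")                      # off-screen rendering on the server
import matplotlib.pyplot as plt
from netCDF4 import Dataset

def render_slice(path, var_name, time_index, out_png):
    with Dataset(path) as ds:
        field = ds.variables[var_name][time_index, :, :]
    fig, ax = plt.subplots()
    im = ax.imshow(field, origin="lower")
    fig.colorbar(im, ax=ax, label=var_name)
    fig.savefig(out_png, dpi=150)
    plt.close(fig)

# render_slice("tas_monthly.nc", "tas", 0, "tas_t0.png")   # hypothetical file/variable
```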
Ali, Syed Mashhood; Shamim, Shazia
2015-07-01
Complexation of racemic citalopram with β-cyclodextrin (β-CD) in aqueous medium was investigated to determine atom-accurate structures of the inclusion complexes. 1H-NMR chemical shift change data for the β-CD cavity protons in the presence of citalopram confirmed the formation of 1 : 1 inclusion complexes. The ROESY spectrum confirmed the presence of an aromatic ring in the β-CD cavity, but it was not clear whether one or both rings were included. Molecular mechanics and molecular dynamics calculations showed entry of the fluoro-ring from the wider side of the β-CD cavity as the most favored mode of inclusion. Minimum-energy computational models were analyzed for the accuracy of their atomic coordinates by comparing calculated and experimental intermolecular ROESY peak intensities, which were not found to be in agreement. Several least-energy computational models were refined and analyzed until the calculated and experimental intensities were compatible. The results demonstrate that computational models of CD complexes need to be analyzed for atom accuracy and that quantitative ROESY analysis is a promising method for doing so. Moreover, the study also validates that quantitative use of ROESY is feasible even with longer mixing times if peak intensity ratios, rather than absolute intensities, are used. Copyright © 2015 John Wiley & Sons, Ltd.
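The ratio-based comparison can be illustrated with the common initial-rate approximation that ROE cross-peak intensity scales as r^-6; the distances and intensities below are invented numbers, and the published analysis is more involved.

```python
# Sketch of the ratio-based comparison described above, using the common
# initial-rate approximation that a ROE cross-peak intensity scales as r^-6.
# All distances and intensities are invented for illustration.
def intensity_ratios(distances_angstrom, reference_index=0):
    calc = [d ** -6 for d in distances_angstrom]
    return [c / calc[reference_index] for c in calc]

model_distances = [2.4, 3.1, 3.8]              # H...H distances in one candidate model
experimental = [1.00, 0.35, 0.05]              # peak intensities relative to reference peak
calculated = intensity_ratios(model_distances)
print([round(c, 2) for c in calculated], experimental)
# A mismatch between the two ratio lists would prompt refinement of the model.
```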
Cross-cultural perspectives on physician and lay models of the common cold.
Baer, Roberta D; Weller, Susan C; de Alba García, Javier García; Rocha, Ana L Salcedo
2008-06-01
We compare physicians and laypeople within and across cultures, focusing on similarities and differences across samples, to determine whether cultural differences or lay-professional differences have a greater effect on explanatory models of the common cold. Data on explanatory models for the common cold were collected from physicians and laypeople in South Texas and Guadalajara, Mexico. Structured interview materials were developed on the basis of open-ended interviews with samples of lay informants at each locale. A structured questionnaire was used to collect information from each sample on causes, symptoms, and treatments for the common cold. Consensus analysis was used to estimate the cultural beliefs for each sample. Instead of systematic differences between samples based on nationality or level of professional training, all four samples largely shared a single-explanatory model of the common cold, with some differences on subthemes, such as the role of hot and cold forces in the etiology of the common cold. An evaluation of our findings indicates that, although there has been conjecture about whether cultural or lay-professional differences are of greater importance in understanding variation in explanatory models of disease and illness, systematic data collected on community and professional beliefs indicate that such differences may be a function of the specific illness. Further generalizations about lay-professional differences need to be based on detailed data for a variety of illnesses, to discern patterns that may be present. Finally, a systematic approach indicates that agreement across individual explanatory models is sufficient to allow for a community-level explanatory model of the common cold.
Report on Computing and Networking in the Space Science Laboratory by the SSL Computer Committee
NASA Technical Reports Server (NTRS)
Gallagher, D. L. (Editor)
1993-01-01
The Space Science Laboratory (SSL) at Marshall Space Flight Center is a multiprogram facility. Scientific research is conducted in four discipline areas: earth science and applications, solar-terrestrial physics, astrophysics, and microgravity science and applications. Representatives from each of these discipline areas participate in a Laboratory computer requirements committee, which developed this document. The purpose is to establish and discuss Laboratory objectives for computing and networking in support of science. The purpose is also to lay the foundation for a collective, multiprogram approach to providing these services. Special recognition is given to the importance of the national and international efforts of our research communities toward the development of interoperable, network-based computer applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aguilo Valentin, Miguel Alejandro; Trujillo, Susie
During calendar year 2017, Sandia National Laboratories (SNL) made strides towards developing an open, portable design platform rich in high-performance computing (HPC) enabled modeling, analysis and synthesis tools. The main focus was to lay the foundations of the core interfaces that will enable plug-and-play insertion of synthesis optimization technologies in the areas of modeling, analysis and synthesis.
Paradigms of Evaluation in Natural Language Processing: Field Linguistics for Glass Box Testing
ERIC Educational Resources Information Center
Cohen, Kevin Bretonnel
2010-01-01
Although software testing has been well-studied in computer science, it has received little attention in natural language processing. Nonetheless, a fully developed methodology for glass box evaluation and testing of language processing applications already exists in the field methods of descriptive linguistics. This work lays out a number of…
Closing University Departments: The Perception of Tax Payers.
ERIC Educational Resources Information Center
Furnham, Adrian; Sisterson, Grant
2000-01-01
This pilot study looked at the British lay public's evaluation of 20 different disciplines by asking them to rank-order them. In a cutting-saving exercise those departments (disciplines) thought most worthy of saving were English, mathematics, and computer science; those rated as least important were anthropology, film and media studies, and…
Lay Hold! Heave! Building Speed: Excitement and Satisfaction in Pushing the BGE Flywheel
2009-04-01
…future. Accessions has stirred up a significant review of how the Army runs the business of bringing in new lieutenants and leveraging their academic… occasional plagiarism. Tablet personal computers (PCs) have been purchased for one ECCC small group, and the second pilot of use of the Tablet PC…
Samiullah, Sami; Roberts, Juliet; Chousalkar, Kapil
2016-10-01
The aim of the current study was to assess any effect of wild and vaccine Australian infectious bronchitis virus (IBV) strains on shell colour in brown-shelled eggs. In Experiment 1, eggs were collected from day 1 to day 13 post-inoculation (p.i.) from unvaccinated laying hens challenged with IBV wild strains T and N1/88 and from a negative control group of hens. In Experiment 2, eggs were collected from 2 to 22 days p.i. from unvaccinated and vaccinated laying hens challenged with either a wild or a vaccine strain of IBV. In Experiment 1, there was a significant effect (P < 0.05) of day p.i. and of viral strain on shell reflectivity, L* and protoporphyrin IX (PP IX) in eggshells, with and without cuticle. The mean PP IX/g of shell with and without cuticle was significantly higher on day 1 p.i. compared to day 7, after which PP IX increased with day p.i. In Experiment 2, shell reflectivity and L* increased and PP IX decreased with increased day p.i. until day 12. Shell reflectivity and L* decreased slightly after day 12 and increased again towards day 22. Shell reflectivity, L* and PP IX were not significantly different for eggshells from unvaccinated and vaccinated laying hens in the intact eggshell, but were significantly different in shells from which cuticle had been removed. In conclusion, the IBV strains reduced the intensity of brown shell colour to different extents with a lower amount of PP IX in eggshells.
Thin film ferroelectric electro-optic memory
NASA Technical Reports Server (NTRS)
Thakoor, Sarita (Inventor); Thakoor, Anilkumar P. (Inventor)
1993-01-01
An electrically programmable, optically readable data or memory cell is configured from a thin film of ferroelectric material, such as PZT, sandwiched between a transparent top electrode and a bottom electrode. The output photoresponse, which may be a photocurrent or photo-emf, is a function of the product of the remanent polarization from a previously applied polarization voltage and the incident light intensity. The cell is useful for analog and digital data storage as well as opto-electric computing. The optical read operation is non-destructive of the remanent polarization. The cell provides a method for computing the product of stored data and incident optical data by applying an electrical signal to store data by polarizing the thin film ferroelectric material, and then applying an intensity modulated optical signal incident onto the thin film material to generate a photoresponse therein related to the product of the electrical and optical signals.
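The stated read-out relation, a photoresponse proportional to the product of remanent polarization and incident intensity, can be captured in a few lines; the constant k and all values are arbitrary placeholders, not device parameters.

```python
# Toy model of the read-out relation stated above (photoresponse proportional
# to the product of remanent polarization and incident light intensity).
def photoresponse(remanent_polarization, light_intensity, k=1.0):
    return k * remanent_polarization * light_intensity

stored = [+1.0, -1.0, +1.0]          # signs of remanent polarization (written electrically)
optical_input = [0.2, 0.8, 0.5]      # intensity-modulated read beam
products = [photoresponse(p, i) for p, i in zip(stored, optical_input)]
print(products)                      # element-wise product computed opto-electrically
```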
The Montage architecture for grid-enabled science processing of large, distributed datasets
NASA Technical Reports Server (NTRS)
Jacob, Joseph C.; Katz, Daniel S .; Prince, Thomas; Berriman, Bruce G.; Good, John C.; Laity, Anastasia C.; Deelman, Ewa; Singh, Gurmeet; Su, Mei-Hui
2004-01-01
Montage is an Earth Science Technology Office (ESTO) Computational Technologies (CT) Round III Grand Challenge investigation to deploy a portable, compute-intensive, custom astronomical image mosaicking service for the National Virtual Observatory (NVO). Although Montage is developing a compute- and data-intensive service for the astronomy community, we are also helping to address a problem that spans both Earth and Space science, namely how to efficiently access and process multi-terabyte, distributed datasets. In both communities, the datasets are massive, and are stored in distributed archives that are, in most cases, remote from the available Computational resources. Therefore, state of the art computational grid technologies are a key element of the Montage portal architecture. This paper describes the aspects of the Montage design that are applicable to both the Earth and Space science communities.
Mattfeldt, Torsten
2011-04-01
Computer-intensive methods may be defined as data analytical procedures involving a huge number of highly repetitive computations. We mention resampling methods with replacement (bootstrap methods), resampling methods without replacement (randomization tests) and simulation methods. The resampling methods are based on simple and robust principles and are largely free from distributional assumptions. Bootstrap methods may be used to compute confidence intervals for a scalar model parameter and for summary statistics from replicated planar point patterns, and for significance tests. For some simple models of planar point processes, point patterns can be simulated by elementary Monte Carlo methods. The simulation of models with more complex interaction properties usually requires more advanced computing methods. In this context, we mention simulation of Gibbs processes with Markov chain Monte Carlo methods using the Metropolis-Hastings algorithm. An alternative to simulations on the basis of a parametric model consists of stochastic reconstruction methods. The basic ideas behind the methods are briefly reviewed and illustrated by simple worked examples in order to encourage novices in the field to use computer-intensive methods. © 2010 The Authors Journal of Microscopy © 2010 Royal Microscopical Society.
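As a concrete illustration of the resampling idea mentioned above, the following is a minimal percentile-bootstrap confidence interval for a scalar statistic; the sample, statistic and number of resamples are arbitrary choices for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_ci(sample, statistic=np.mean, n_boot=10_000, alpha=0.05):
    """Percentile bootstrap confidence interval for a scalar statistic:
    resample with replacement, recompute the statistic, take quantiles."""
    n = len(sample)
    boot_stats = np.array([
        statistic(rng.choice(sample, size=n, replace=True))
        for _ in range(n_boot)
    ])
    lo, hi = np.quantile(boot_stats, [alpha / 2, 1 - alpha / 2])
    return lo, hi

data = rng.lognormal(mean=0.0, sigma=0.7, size=40)   # skewed toy sample
print(bootstrap_ci(data))                            # 95% CI for the mean
```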
Betowski, Don; Bevington, Charles; Allison, Thomas C
2016-01-19
Halogenated chemical substances are used in a broad array of applications, and new chemical substances are continually being developed and introduced into commerce. While recent research has considerably increased our understanding of the global warming potentials (GWPs) of multiple individual chemical substances, this research inevitably lags behind the development of new chemical substances. There are currently over 200 substances known to have high GWP. Evaluation of schemes to estimate radiative efficiency (RE) based on computational chemistry are useful where no measured IR spectrum is available. This study assesses the reliability of values of RE calculated using computational chemistry techniques for 235 chemical substances against the best available values. Computed vibrational frequency data is used to estimate RE values using several Pinnock-type models, and reasonable agreement with reported values is found. Significant improvement is obtained through scaling of both vibrational frequencies and intensities. The effect of varying the computational method and basis set used to calculate the frequency data is discussed. It is found that the vibrational intensities have a strong dependence on basis set and are largely responsible for differences in computed RE values.
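A hedged sketch of a Pinnock-type estimate: the computed band intensities, after scaling of frequencies and intensities, are weighted by a per-unit-intensity radiative-efficiency curve and summed. The curve below is a crude Gaussian stand-in, not the published tabulation, and the scale factors are illustrative:

```python
import numpy as np

def pinnock_re_per_unit_intensity(wavenumber_cm1):
    """Crude stand-in for the tabulated per-unit-intensity radiative-efficiency
    curve of a Pinnock-type model; real work would interpolate published
    values rather than use this illustrative Gaussian window centred on the
    atmospheric window."""
    return np.exp(-((wavenumber_cm1 - 1000.0) / 250.0) ** 2)

def estimate_re(frequencies_cm1, intensities_km_mol,
                freq_scale=0.97, intensity_scale=1.0):
    """Scale the computed vibrational frequencies and intensities, weight each
    band by the radiative efficiency per unit intensity at its (scaled)
    frequency, and sum over bands. Scale factors here are illustrative."""
    total = 0.0
    for nu, a in zip(frequencies_cm1, intensities_km_mol):
        total += (intensity_scale * a) * pinnock_re_per_unit_intensity(freq_scale * nu)
    return total

# Toy computed spectrum (wavenumbers in cm^-1, intensities in km/mol).
print(estimate_re([1250.0, 1100.0, 700.0], [300.0, 150.0, 50.0]))
```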
Opportunities and challenges for the life sciences community.
Kolker, Eugene; Stewart, Elizabeth; Ozdemir, Vural
2012-03-01
Twenty-first century life sciences have transformed into data-enabled (also called data-intensive, data-driven, or big data) sciences. They principally depend on data-, computation-, and instrumentation-intensive approaches to seek comprehensive understanding of complex biological processes and systems (e.g., ecosystems, complex diseases, environmental, and health challenges). Federal agencies including the National Science Foundation (NSF) have played and continue to play an exceptional leadership role by innovatively addressing the challenges of data-enabled life sciences. Yet even more is required not only to keep up with the current developments, but also to pro-actively enable future research needs. Straightforward access to data, computing, and analysis resources will enable true democratization of research competitions; thus investigators will compete based on the merits and broader impact of their ideas and approaches rather than on the scale of their institutional resources. This is the Final Report for Data-Intensive Science Workshops DISW1 and DISW2. The first NSF-funded Data Intensive Science Workshop (DISW1, Seattle, WA, September 19-20, 2010) overviewed the status of the data-enabled life sciences and identified their challenges and opportunities. This served as a baseline for the second NSF-funded DIS workshop (DISW2, Washington, DC, May 16-17, 2011). Based on the findings of DISW2 the following overarching recommendation to the NSF was proposed: establish a community alliance to be the voice and framework of the data-enabled life sciences. After this Final Report was finished, Data-Enabled Life Sciences Alliance (DELSA, www.delsall.org ) was formed to become a Digital Commons for the life sciences community.
Guntur, Ananya R.; Gou, Bin; Gu, Pengyu; He, Ruo; Stern, Ulrich; Xiang, Yang; Yang, Chung-Hui
2017-01-01
The evolutionarily conserved TRPA1 channel can sense various stimuli including temperatures and chemical irritants. Recent results have suggested that specific isoforms of Drosophila TRPA1 (dTRPA1) are UV-sensitive and that their UV sensitivity is due to H2O2 sensitivity. However, whether such UV sensitivity served any physiological purposes in animal behavior was unclear. Here, we demonstrate that H2O2-sensitive dTRPA1 isoforms promote avoidance of UV when adult Drosophila females are selecting sites for egg-laying. First, we show that blind/visionless females are still capable of sensing and avoiding UV during egg-laying when intensity of UV is high yet within the range of natural sunlight. Second, we show that such vision-independent UV avoidance is mediated by a group of bitter-sensing neurons on the proboscis that express H2O2-sensitive dTRPA1 isoforms. We show that these bitter-sensing neurons exhibit dTRPA1-dependent UV sensitivity. Importantly, inhibiting activities of these bitter-sensing neurons, reducing their dTRPA1 expression, or reducing their H2O2-sensitivity all significantly reduced blind females’ UV avoidance, whereas selectively restoring a H2O2-sensitive isoform of dTRPA1 in these neurons restored UV avoidance. Lastly, we show that specifically expressing the red-shifted channelrhodopsin CsChrimson in these bitter-sensing neurons promotes egg-laying avoidance of red light, an otherwise neutral cue for egg-laying females. Together, these results demonstrate a physiological role of the UV-sensitive dTRPA1 isoforms, reveal that adult Drosophila possess at least two sensory systems for detecting UV, and uncover an unexpected role of bitter-sensing taste neurons in UV sensing. PMID:27932542
The importance of ray pathlengths when measuring objects in maximum intensity projection images.
Schreiner, S; Dawant, B M; Paschal, C B; Galloway, R L
1996-01-01
It is important to understand any process that affects medical data. Once the data have changed from the original form, one must consider the possibility that the information contained in the data has also changed. In general, false negative and false positive diagnoses caused by this post-processing must be minimized. Medical imaging is one area in which post-processing is commonly performed, but there is often little or no discussion of how these algorithms affect the data. This study uncovers some interesting properties of maximum intensity projection (MIP) algorithms, which are commonly used in the post-processing of magnetic resonance (MR) and computed tomography (CT) angiographic data. The appearance of the width of vessels and the extent of malformations such as aneurysms is of interest to clinicians. This study will show how MIP algorithms interact with the shape of the object being projected. MIPs can make objects appear thinner in the projection than in the original data set and can also alter the shape of the object's profile seen in the original data. These effects have consequences for width-measuring algorithms, which will be discussed. Each projected intensity is dependent upon the pathlength of the ray from which the projected pixel arises. The morphology (shape and intensity profile) of an object will change the pathlength that each ray experiences. This is termed the pathlength effect. In order to demonstrate the pathlength effect, simple computer models of an imaged vessel were created. Additionally, a static MR phantom verified that the derived equation for the projection-plane probability density function (pdf) predicts the projection-plane intensities well (R² = 0.96). Finally, examples of projections through in vivo MR angiography and CT angiography data are presented.
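A small synthetic illustration of the pathlength effect described above, assuming nothing beyond a cylindrical "vessel" in Gaussian noise and a simple threshold-based width estimate (thresholds and sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic volume: a cylindrical "vessel" of radius 8 voxels running along x,
# embedded in Gaussian background noise.
nz, ny, nx = 64, 64, 64
z, y = np.meshgrid(np.arange(nz), np.arange(ny), indexing="ij")
vessel_mask = (z - 32) ** 2 + (y - 32) ** 2 <= 8 ** 2          # cross-section in (z, y)
volume = rng.normal(50.0, 10.0, size=(nz, ny, nx))             # background
volume += 60.0 * vessel_mask[:, :, None]                       # vessel signal

# Maximum intensity projection along z: each projected pixel keeps the maximum
# along its ray, so its value depends on how many vessel (and noise) samples
# the ray traverses -- the pathlength effect.
mip = volume.max(axis=0)

# Apparent width of the vessel in the projection, from a simple threshold on
# the across-vessel profile (averaged along the vessel axis).
profile = mip.mean(axis=1)                                     # profile across y
threshold = 0.5 * (profile.max() + np.median(profile))
apparent_width = int(np.sum(profile > threshold))
print("apparent width:", apparent_width, "voxels (true diameter: 16 voxels)")
```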
A lightweight distributed framework for computational offloading in mobile cloud computing.
Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul
2014-01-01
The latest developments in mobile computing technology have enabled intensive applications on modern smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.
van Dijk, René E; Eising, Corine M; Merrill, Richard M; Karadas, Filiz; Hatchwell, Ben; Spottiswoode, Claire N
2013-02-01
Maternal effects can influence offspring phenotype with short- and long-term consequences. Yet, how the social environment may influence egg composition is not well understood. Here, we investigate how laying order and social environment predict maternal effects in the sociable weaver, Philetairus socius, a species that lives in massive communal nests which may be occupied by only a few to 100+ individuals in a single nest. This range of social environments is associated with variation in a number of phenotypic and life-history traits. We investigate whether maternal effects are adjusted accordingly. We found no evidence for the prediction that females might benefit from modifying brood hierarchies through an increased deposition of androgens with laying order. Instead, females appear to exacerbate brood reduction by decreasing the costly production of yolk mass and antioxidants with laying order. Additionally, we found that this effect did not depend on colony size. Finally, in accordance with an expected increased intensity of environmental stress with increasing colony size, we found that yolk androgen concentration increased with colony size. This result suggests that females may enhance the competitive ability of offspring raised in larger colonies, possibly preparing the offspring for a competitive social environment.
NASA Technical Reports Server (NTRS)
Curlis, J. D.; Frost, V. S.; Dellwig, L. F.
1986-01-01
Computer-enhancement techniques applied to the SIR-A data from the Lisbon Valley area in the northern portion of the Paradox basin increased the value of the imagery in the development of geologically useful maps. The enhancement techniques include filtering to remove image speckle from the SIR-A data and combining these data with Landsat multispectral scanner data. A method well-suited for the combination of the data sets utilized a three-dimensional domain defined by intensity-hue-saturation (IHS) coordinates. Such a system allows the Landsat data to modulate image intensity, while the SIR-A data control image hue and saturation. Whereas the addition of Landsat data to the SIR-A image by means of a pixel-by-pixel ratio accentuated textural variations within the image, the addition of color to the combined images enabled isolation of areas in which gray-tone contrast was minimal. This isolation resulted in a more precise definition of stratigraphic units.
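A hedged sketch of the IHS-style combination, using an HSV transform as a stand-in for the IHS coordinates: the SAR image drives hue and saturation while the Landsat image modulates intensity. The images, scaling and saturation mapping below are illustrative, not the authors' processing chain:

```python
import numpy as np
from matplotlib.colors import hsv_to_rgb

def combine_ihs(sar, landsat):
    """Combine co-registered images in HSV space (used here as a stand-in for
    IHS coordinates): the SAR image drives hue and saturation, the Landsat
    image modulates intensity (value). Inputs are 2-D arrays."""
    sar = (sar - sar.min()) / (np.ptp(sar) + 1e-12)
    landsat = (landsat - landsat.min()) / (np.ptp(landsat) + 1e-12)
    hsv = np.stack([sar,                # hue from SAR
                    0.4 + 0.6 * sar,    # saturation also driven by SAR
                    landsat],           # value (intensity) from Landsat
                   axis=-1)
    return hsv_to_rgb(hsv)

# Random arrays stand in for co-registered SIR-A and Landsat MSS scenes.
rng = np.random.default_rng(2)
rgb = combine_ihs(rng.random((128, 128)), rng.random((128, 128)))  # displayable RGB composite
```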
NASA Astrophysics Data System (ADS)
Brzuszek, Marcin; Daniluk, Andrzej
2006-11-01
Writing a concurrent program can be more difficult than writing a sequential program: the programmer needs to think about synchronisation, race conditions and shared variables. Transactions help reduce the inconvenience of using threads. A transaction is an abstraction that allows programmers to group a sequence of actions on the program into a logical, higher-level computation unit. This paper presents multithreaded versions of the GROWTH program, which allow calculation of the layer coverages during the growth of thin epitaxial films and the corresponding RHEED intensities according to the kinematical approximation. The presented programs also contain graphical user interfaces, which enable program data to be displayed at run time.
New version program summary
Titles of programs: GROWTHGr, GROWTH06
Catalogue identifier: ADVL_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVL_v2_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Catalogue identifier of previous version: ADVL
Does the new version supersede the original program: No
Computer for which the new version is designed and others on which it has been tested: Pentium-based PC
Operating systems or monitors under which the new version has been tested: Windows 9x, XP, NT
Programming language used: Object Pascal
Memory required to execute with typical data: more than 1 MB
Number of bits in a word: 64
Number of processors used: 1
No. of lines in distributed program, including test data, etc.: 20 931
Number of bytes in distributed program, including test data, etc.: 1 311 268
Distribution format: tar.gz
Nature of physical problem: The programs compute the RHEED intensities during the growth of thin epitaxial structures prepared using molecular beam epitaxy (MBE). The computations are based on kinematical diffraction theory [P.I. Cohen, G.S. Petrich, P.R. Pukite, G.J. Whaley, A.S. Arrott, Surf. Sci. 216 (1989) 222] [1].
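For orientation, one common kinematical-approximation expression for the RHEED intensity during layer-by-layer growth sums the exposed coverage of each layer with an interlayer phase factor; the sketch below assumes this textbook form and illustrative coverages, and is not extracted from GROWTH itself:

```python
import numpy as np

def rheed_intensity(coverages, phase=np.pi):
    """Kinematical-approximation sketch (assumed textbook form): the diffracted
    amplitude sums contributions from the exposed part of each layer with a
    phase shift `phase` between adjacent layers; the intensity is |amplitude|^2.
    `coverages` is [theta_0=1, theta_1, theta_2, ...], theta_n being the
    fractional coverage of layer n."""
    theta = np.asarray(list(coverages) + [0.0])
    exposed = theta[:-1] - theta[1:]              # exposed fraction of each layer
    n = np.arange(len(exposed))
    amplitude = np.sum(exposed * np.exp(1j * phase * n))
    return np.abs(amplitude) ** 2

# Toy coverages during growth of ~1.5 monolayers (illustrative numbers only).
print(rheed_intensity([1.0, 0.7, 0.1]))   # partially filled layers -> reduced intensity
print(rheed_intensity([1.0, 1.0, 0.0]))   # complete layers -> maximum at the anti-phase condition
```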
Fully automated motion correction in first-pass myocardial perfusion MR image sequences.
Milles, Julien; van der Geest, Rob J; Jerosch-Herold, Michael; Reiber, Johan H C; Lelieveldt, Boudewijn P F
2008-11-01
This paper presents a novel method for registration of cardiac perfusion magnetic resonance imaging (MRI). The presented method is capable of automatically registering perfusion data, using independent component analysis (ICA) to extract physiologically relevant features together with their time-intensity behavior. A time-varying reference image mimicking intensity changes in the data of interest is computed based on the results of that ICA. This reference image is used in a two-pass registration framework. Qualitative and quantitative validation of the method is carried out using 46 clinical-quality, short-axis, perfusion MR datasets comprising 100 images each. Despite varying image quality and motion patterns in the evaluation set, validation of the method showed a reduction of the average left ventricle (LV) motion from 1.26+/-0.87 to 0.64+/-0.46 pixels. Time-intensity curves are also improved after registration, with the average error reduced from 2.65+/-7.89% to 0.87+/-3.88% between registered data and the manual gold standard. Comparison of clinically relevant parameters computed using registered data and the manual gold standard shows good agreement. Additional tests with a simulated free-breathing protocol showed robustness against considerable deviations from a standard breathing protocol. We conclude that this fully automatic ICA-based method shows an accuracy, robustness and computation speed adequate for use in a clinical environment.
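A simplified sketch of the ICA step, assuming scikit-learn's FastICA and a low-rank reconstruction as the time-varying reference; the component count is arbitrary and the two-pass registration itself is not shown:

```python
import numpy as np
from sklearn.decomposition import FastICA

def build_reference_series(frames, n_components=3, random_state=0):
    """Sketch of the ICA step (simplified; not the full two-pass pipeline):
    decompose the perfusion series into components with time-intensity
    courses, then re-synthesize a low-rank, time-varying reference sequence
    that mimics the intensity changes in the data.
    `frames` has shape (n_frames, height, width)."""
    n_frames, h, w = frames.shape
    X = frames.reshape(n_frames, h * w).astype(float)

    ica = FastICA(n_components=n_components, random_state=random_state, max_iter=1000)
    time_courses = ica.fit_transform(X)                    # (n_frames, n_components)
    reference = time_courses @ ica.mixing_.T + ica.mean_   # low-rank reconstruction
    return reference.reshape(n_frames, h, w)

# Toy input standing in for one 100-frame short-axis perfusion series; each
# reference frame would then serve as the registration target for the
# corresponding acquired frame.
frames = np.random.default_rng(0).random((100, 64, 64))
reference_series = build_reference_series(frames)
```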
NASA Technical Reports Server (NTRS)
Rignot, E.; Chellappa, R.
1993-01-01
We present a maximum a posteriori (MAP) classifier for classifying multifrequency, multilook, single polarization SAR intensity data into regions or ensembles of pixels of homogeneous and similar radar backscatter characteristics. A model for the prior joint distribution of the multifrequency SAR intensity data is combined with a Markov random field for representing the interactions between region labels to obtain an expression for the posterior distribution of the region labels given the multifrequency SAR observations. The maximization of the posterior distribution yields Bayes's optimum region labeling or classification of the SAR data or its MAP estimate. The performance of the MAP classifier is evaluated by using computer-simulated multilook SAR intensity data as a function of the parameters in the classification process. Multilook SAR intensity data are shown to yield higher classification accuracies than one-look SAR complex amplitude data. The MAP classifier is extended to the case in which the radar backscatter from the remotely sensed surface varies within the SAR image because of incidence angle effects. The results obtained illustrate the practicality of the method for combining SAR intensity observations acquired at two different frequencies and for improving classification accuracy of SAR data.
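A minimal sketch of MAP labeling in the same spirit, assuming a single-channel gamma likelihood for multilook intensity and a Potts prior optimized by iterated conditional modes (ICM); the published classifier is multifrequency and more elaborate, so the parameters below are illustrative:

```python
import numpy as np

def icm_map_classify(intensity, class_means, n_looks=4, beta=1.5, n_iter=5):
    """Minimal ICM sketch of MAP labeling for single-channel multilook SAR
    intensity: gamma likelihood (L looks, class mean mu_k) plus a Potts
    (Markov random field) prior penalizing disagreeing 4-neighbors."""
    mus = np.asarray(class_means, dtype=float)
    # Negative log-likelihood per pixel and class (gamma model), dropping
    # terms that do not depend on the class.
    nll = n_looks * (np.log(mus)[None, None, :] +
                     intensity[:, :, None] / mus[None, None, :])
    labels = nll.argmin(axis=2)                       # maximum-likelihood initialization

    h, w = intensity.shape
    for _ in range(n_iter):
        for i in range(h):
            for j in range(w):
                neigh = [labels[x, y] for x, y in
                         ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                         if 0 <= x < h and 0 <= y < w]
                # Potts penalty: count disagreeing neighbors per candidate label.
                penalty = np.array([sum(k != n for n in neigh)
                                    for k in range(len(mus))], dtype=float)
                labels[i, j] = (nll[i, j] + beta * penalty).argmin()
    return labels

# Toy two-class scene: dark and bright halves with 4-look multiplicative speckle.
rng = np.random.default_rng(3)
truth = np.zeros((40, 40), dtype=int)
truth[:, 20:] = 1
means = np.array([1.0, 4.0])
img = rng.gamma(shape=4, scale=means[truth] / 4, size=truth.shape)
est = icm_map_classify(img, means, n_looks=4)
print("pixel accuracy:", (est == truth).mean())
```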
Mul, Monique F; van Riel, Johan W; Roy, Lise; Zoons, Johan; André, Geert; George, David R; Meerburg, Bastiaan G; Dicke, Marcel; van Mourik, Simon; Groot Koerkamp, Peter W G
2017-10-15
The poultry red mite, Dermanyssus gallinae, is the most significant pest of egg laying hens in many parts of the world. Control of D. gallinae could be greatly improved with advanced Integrated Pest Management (IPM) for D. gallinae in laying hen facilities. The development of a model forecasting the pests' population dynamics in laying hen facilities without and post-treatment will contribute to this advanced IPM and could consequently improve implementation of IPM by farmers. The current work describes the development and demonstration of a model which can follow and forecast the population dynamics of D. gallinae in laying hen facilities given the variation of the population growth of D. gallinae within and between flocks. This high variation could partly be explained by house temperature, flock age, treatment, and hen house. The total population growth variation within and between flocks, however, was in part explained by temporal variation. For a substantial part this variation was unexplained. A dynamic adaptive model (DAP) was consequently developed, as models of this type are able to handle such temporal variations. The developed DAP model can forecast the population dynamics of D. gallinae, requiring only current flock population monitoring data, temperature data and information of the dates of any D. gallinae treatment. Importantly, the DAP model forecasted treatment effects, while compensating for location and time specific interactions, handling the variability of these parameters. The characteristics of this DAP model, and its compatibility with different mite monitoring methods, represent progression from existing approaches for forecasting D. gallinae that could contribute to advancing improved Integrated Pest Management (IPM) for D. gallinae in laying hen facilities. Copyright © 2017 Elsevier B.V. All rights reserved.
Accessing the public MIMIC-II intensive care relational database for clinical research.
Scott, Daniel J; Lee, Joon; Silva, Ikaro; Park, Shinhyuk; Moody, George B; Celi, Leo A; Mark, Roger G
2013-01-10
The Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II) database is a free, public resource for intensive care research. The database was officially released in 2006, and has attracted a growing number of researchers in academia and industry. We present the two major software tools that facilitate accessing the relational database: the web-based QueryBuilder and a downloadable virtual machine (VM) image. QueryBuilder and the MIMIC-II VM have been developed successfully and are freely available to MIMIC-II users. Simple example SQL queries and the resulting data are presented. Clinical studies pertaining to acute kidney injury and prediction of fluid requirements in the intensive care unit are shown as typical examples of research performed with MIMIC-II. In addition, MIMIC-II has also provided data for annual PhysioNet/Computing in Cardiology Challenges, including the 2012 Challenge "Predicting mortality of ICU Patients". QueryBuilder is a web-based tool that provides easy access to MIMIC-II. For more computationally intensive queries, one can locally install a complete copy of MIMIC-II in a VM. Both publicly available tools provide the MIMIC-II research community with convenient querying interfaces and complement the value of the MIMIC-II relational database.
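A hedged example of programmatic access to a locally installed copy (e.g., inside the VM), assuming a PostgreSQL connection via psycopg2; the connection parameters and the table and column names are placeholders to be replaced with the actual MIMIC-II schema:

```python
import psycopg2

# Connection parameters and the table/column names below are placeholders;
# consult the MIMIC-II documentation for the actual schema.
conn = psycopg2.connect(host="localhost", dbname="mimic2",
                        user="mimic_reader", password="...")

query = """
    SELECT subject_id, COUNT(*) AS n_stays
    FROM icustay_detail          -- hypothetical table name
    GROUP BY subject_id
    ORDER BY n_stays DESC
    LIMIT 10;
"""

with conn, conn.cursor() as cur:
    cur.execute(query)
    for subject_id, n_stays in cur.fetchall():
        print(subject_id, n_stays)
conn.close()
```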
Integration of drug dosing data with physiological data streams using a cloud computing paradigm.
Bressan, Nadja; James, Andrew; McGregor, Carolyn
2013-01-01
Many drugs are used during the provision of intensive care for the preterm newborn infant. Recommendations for drug dosing in newborns depend upon data from population based pharmacokinetic research. There is a need to be able to modify drug dosing in response to the preterm infant's response to the standard dosing recommendations. The real-time integration of physiological data with drug dosing data would facilitate individualised drug dosing for these immature infants. This paper proposes the use of a novel computational framework that employs real-time, temporal data analysis for this task. Deployment of the framework within the cloud computing paradigm will enable widespread distribution of individualized drug dosing for newborn infants.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cohen, J; Dossa, D; Gokhale, M
Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems, and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063 Storage Intensive Supercomputing during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool iotrace developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared the performance of software-only and GPU-accelerated implementations. In addition to the work reported here, an additional text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification. The n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the NAND Flash 40GB parallel disk array, the Fusion-io. The Fusion system specs are as follows: SuperMicro X7DBE Xeon Dual Socket Blackford Server Motherboard; 2 Intel Xeon Dual-Core 2.66 GHz processors; 1 GB DDR2 PC2-5300 RAM (2 x 512); 80GB Hard Drive (Seagate SATA II Barracuda). The Fusion board is presently capable of 4X in a PCIe slot. The image resampling benchmark was run on a dual Xeon workstation with NVIDIA graphics card (see Chapter 5 for full specification). An XtremeData Opteron+FPGA was used for the language classification application. We observed that these benchmarks are not uniformly I/O intensive. The only benchmark that showed greater than 50% of the time in I/O was the graph algorithm when it accessed data files over NFS. When local disk was used, the graph benchmark spent at most 40% of its time in I/O. The other benchmarks were CPU dominated. The image resampling benchmark and language classification showed order of magnitude speedup over software by using co-processor technology to offload the CPU-intensive kernels. Our experiments to date suggest that emerging hardware technologies offer significant benefit to boosting the performance of data-intensive algorithms. Using GPU and FPGA co-processors, we were able to improve performance by more than an order of magnitude on the benchmark algorithms, eliminating the processor bottleneck of CPU-bound tasks. Experiments with a prototype solid state nonvolatile memory available today show 10X better throughput on random reads than disk, with a 2X speedup on a graph processing benchmark when compared to the use of local SATA disk.
Implementing direct, spatially isolated problems on transputer networks
NASA Technical Reports Server (NTRS)
Ellis, Graham K.
1988-01-01
Parametric studies were performed on transputer networks of up to 40 processors to determine how to implement and maximize the performance of the solution of problems where no processor-to-processor data transfer is required for the problem solution (spatially isolated). Two types of problems are investigated: a computationally intensive problem where the solution required the transmission of 160 bytes of data through the parallel network, and a communication-intensive example that required the transmission of 3 Mbytes of data through the network. These data consist of solutions being sent back to the host processor, not intermediate results for another processor to work on. Studies were performed on both integer and floating-point transputers. The latter features an on-chip floating-point math unit and offers approximately an order of magnitude performance increase over the integer transputer on real-valued computations. The results indicate that a minimum amount of work is required on each node per communication to achieve high network speedups (efficiencies). The floating-point processor requires approximately an order of magnitude more work per communication than the integer processor because of the floating-point unit's increased computing capacity.
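An illustrative back-of-the-envelope model (not taken from the paper) of why a minimum amount of work per communication is needed: if result transmission to the host is serialized, speedup saturates unless each task's compute time dominates its communication time. The timings below are arbitrary:

```python
def network_speedup(n_tasks, t_comp, t_comm, n_procs):
    """Illustrative model: each task needs t_comp seconds of computation on a
    worker and t_comm seconds to return its result through the host link,
    which is serialized. Speedup is relative to a single processor doing all
    the work with no communication."""
    t_serial = n_tasks * t_comp
    t_parallel = (n_tasks * t_comp) / n_procs + n_tasks * t_comm
    return t_serial / t_parallel

# More computation per communication -> efficiency approaches the ideal.
for t_comp in (0.1e-3, 1e-3, 10e-3):                 # seconds of work per task
    s = network_speedup(n_tasks=10_000, t_comp=t_comp, t_comm=0.05e-3, n_procs=40)
    print(f"work/comm = {t_comp / 0.05e-3:5.0f}x  speedup = {s:5.1f}  efficiency = {s / 40:.2f}")
```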
Evaluation of normalization methods for cDNA microarray data by k-NN classification
Wu, Wei; Xing, Eric P; Myers, Connie; Mian, I Saira; Bissell, Mina J
2005-01-01
Background: Non-biological factors give rise to unwanted variations in cDNA microarray data. There are many normalization methods designed to remove such variations. However, to date there have been few published systematic evaluations of these techniques for removing variations arising from dye biases in the context of downstream, higher-order analytical tasks such as classification.
Results: Ten location normalization methods that adjust spatial- and/or intensity-dependent dye biases, and three scale methods that adjust scale differences were applied, individually and in combination, to five distinct, published, cancer biology-related cDNA microarray data sets. Leave-one-out cross-validation (LOOCV) classification error was employed as the quantitative end-point for assessing the effectiveness of a normalization method. In particular, a known classifier, k-nearest neighbor (k-NN), was estimated from data normalized using a given technique, and the LOOCV error rate of the ensuing model was computed. We found that k-NN classifiers are sensitive to dye biases in the data. Using NONRM and GMEDIAN as baseline methods, our results show that single-bias-removal techniques which remove either spatial-dependent dye bias (referred to later as spatial effect) or intensity-dependent dye bias (referred to later as intensity effect) moderately reduce LOOCV classification errors; whereas double-bias-removal techniques which remove both spatial and intensity effects reduce LOOCV classification errors even further. Of the 41 different strategies examined, three two-step processes, IGLOESS-SLFILTERW7, ISTSPLINE-SLLOESS and IGLOESS-SLLOESS, all of which removed intensity effect globally and spatial effect locally, appear to reduce LOOCV classification errors most consistently and effectively across all data sets. We also found that the investigated scale normalization methods do not reduce LOOCV classification error.
Conclusion: Using LOOCV error of k-NNs as the evaluation criterion, three double-bias-removal normalization strategies, IGLOESS-SLFILTERW7, ISTSPLINE-SLLOESS and IGLOESS-SLLOESS, outperform other strategies for removing spatial effect, intensity effect and scale differences from cDNA microarray data. The apparent sensitivity of k-NN LOOCV classification error to dye biases suggests that this criterion provides an informative measure for evaluating normalization methods. All the computational tools used in this study were implemented using the R language for statistical computing and graphics. PMID:16045803
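The study's evaluation criterion, LOOCV error of a k-NN classifier, is easy to reproduce in outline. The study used R; the following is a Python sketch with scikit-learn on a synthetic stand-in data set, with the normalization methods themselves omitted:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def loocv_knn_error(X, y, k=3):
    """LOOCV error of a k-NN classifier: the evaluation criterion used to
    compare normalization methods (the methods themselves are not shown)."""
    scores = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y,
                             cv=LeaveOneOut())
    return 1.0 - scores.mean()

# Stand-in for one normalized microarray data set: samples x genes, two classes.
X, y = make_classification(n_samples=60, n_features=200, n_informative=10,
                           random_state=0)
print("LOOCV k-NN error:", loocv_knn_error(X, y))
```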
Arithmetic Data Cube as a Data Intensive Benchmark
NASA Technical Reports Server (NTRS)
Frumkin, Michael A.; Shabano, Leonid
2003-01-01
Data movement across computational grids and across the memory hierarchy of individual grid machines is known to be a limiting factor for applications involving large data sets. In this paper we introduce the Data Cube Operator on an Arithmetic Data Set, which we call the Arithmetic Data Cube (ADC). We propose to use the ADC to benchmark grid capabilities to handle large distributed data sets. The ADC stresses all levels of grid memory by producing the 2^d views of an Arithmetic Data Set of d-tuples described by a small number of parameters. We control the data intensity of the ADC by controlling the sizes of the views through choice of the tuple parameters.
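A hedged sketch of the data cube operator itself: for d dimension attributes, compute all 2^d group-by views. The parameterized tuple generator of the ADC is not reproduced; a random toy table stands in for it:

```python
from itertools import combinations
import numpy as np
import pandas as pd

def data_cube_views(df, dims, measure):
    """Compute all 2^d group-by views of a data set of d-tuples: one aggregate
    table per subset of the dimension attributes (the empty subset is the
    grand total)."""
    views = {(): pd.DataFrame({measure: [df[measure].sum()]})}
    for r in range(1, len(dims) + 1):
        for subset in combinations(dims, r):
            views[subset] = df.groupby(list(subset), as_index=False)[measure].sum()
    return views

# Toy data set of 3-tuples with one measure column.
rng = np.random.default_rng(4)
df = pd.DataFrame({"a": rng.integers(0, 3, 1000),
                   "b": rng.integers(0, 4, 1000),
                   "c": rng.integers(0, 5, 1000),
                   "value": rng.random(1000)})
cube = data_cube_views(df, dims=["a", "b", "c"], measure="value")
print(len(cube), "views")                 # 8 views for d = 3
print(cube[("a", "b")].head())
```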
Enabling Earth Science: The Facilities and People of the NCCS
NASA Technical Reports Server (NTRS)
2002-01-01
The NCCS's mass data storage system allows scientists to store and manage the vast amounts of data generated by these computations, and its high-speed network connections allow the data to be accessed quickly from the NCCS archives. Some NCCS users perform studies that are directly related to their ability to run computationally expensive and data-intensive simulations. Because the number and type of questions scientists research often are limited by computing power, the NCCS continually pursues the latest technologies in computing, mass storage, and networking technologies. Just as important as the processors, tapes, and routers of the NCCS are the personnel who administer this hardware, create and manage accounts, maintain security, and assist the scientists, often working one on one with them.
An Assessment of the State-of-the-art in Multidisciplinary Aeromechanical Analyses
NASA Technical Reports Server (NTRS)
Datta, Anubhav; Johnson, Wayne
2008-01-01
This paper presents a survey of the current state-of-the-art in multidisciplinary aeromechanical analyses which integrate advanced Computational Structural Dynamics (CSD) and Computational Fluid Dynamics (CFD) methods. The application areas to be surveyed include fixed wing aircraft, turbomachinery, and rotary wing aircraft. The objective of the authors in the present paper, together with a companion paper on requirements, is to lay out a path for a High Performance Computing (HPC) based next generation comprehensive rotorcraft analysis. From this survey of the key technologies in other application areas it is possible to identify the critical technology gaps that stem from unique rotorcraft requirements.
Impedance computations and beam-based measurements: A problem of discrepancy
Smaluk, Victor
2018-04-21
High intensity of particle beams is crucial for high-performance operation of modern electron-positron storage rings, both colliders and light sources. The beam intensity is limited by the interaction of the beam with self-induced electromagnetic fields (wake fields) proportional to the vacuum chamber impedance. For a new accelerator project, the total broadband impedance is computed by element-wise wake-field simulations using computer codes. For a machine in operation, the impedance can be measured experimentally using beam-based techniques. In this article, a comparative analysis of impedance computations and beam-based measurements is presented for 15 electron-positron storage rings. The measured data and the predictions based on the computed impedance budgets show a significant discrepancy. For this article, three possible reasons for the discrepancy are discussed: interference of the wake fields excited by a beam in adjacent components of the vacuum chamber, effect of computation mesh size, and effect of insufficient bandwidth of the computed impedance.
The Changing Conduct of Geoscience in a Data Intensive World (Ian McHarg Medal Lecture)
NASA Astrophysics Data System (ADS)
Fox, P.
2012-04-01
Electronic facilitation of scientific research (often called eResearch or eScience) is increasingly prevalent in the geosciences. One consequence of new and diversifying means of complex(*) data generation is that, as many branches of science become data-intensive (the so-called fourth paradigm), they in turn broaden their long-tail distributions: smaller-volume but often complex data will always lead to excellent science. There are many familiar informatics functions that enable the conduct of science (by specialists or non-specialists) in this new regime, for example the need for any user to be able to discover relations among and between the results of data analyses and informational queries. Unfortunately, true science exploration, for example visual discovery, over complex data remains more of an art form than an easily conducted practice. In general, the resource costs of creating useful visualizations have been increasing. Less than 10 years ago, it was assessed that data-centric science required a rough split between the time to generate, analyze, and publish data and the science based on that data. Today, however, the visualization and analysis component has become a bottleneck, requiring considerably more of the overall effort, and this trend will continue. Potentially even worse is the choice to simplify analyses to 'get the work out'. Extra effort to make data understandable, something that should be routine, is now consuming considerable resources that could be used for many other purposes. It is now time to change that trend. This contribution lays out informatics paths for a truly exploratory conduct of science, cast in the present and rapidly changing reality of Web/Internet-based data and software infrastructures. A logical consequence of these paths is that the people working in this new mode of research, i.e. data scientists, require additional and different education to become effective and routine users of new informatics capabilities. One goal is to achieve the same fluency that researchers may have in lab techniques, instrument utilization, model development and use, etc. Thus, in conclusion, curriculum and skill requirements for data scientists will be presented and discussed. (*) complex/intensive = large volume, multi-scale, multi-modal, multi-dimensional, multi-disciplinary, and heterogeneous structure.
Spatiotemporal Domain Decomposition for Massive Parallel Computation of Space-Time Kernel Density
NASA Astrophysics Data System (ADS)
Hohl, A.; Delmelle, E. M.; Tang, W.
2015-07-01
Accelerated processing capabilities are deemed critical when conducting analysis on spatiotemporal datasets of increasing size, diversity and availability. High-performance parallel computing offers the capacity to solve computationally demanding problems in a limited timeframe, but likewise poses the challenge of preventing processing inefficiency due to workload imbalance between computing resources. Therefore, when designing new algorithms capable of implementing parallel strategies, careful spatiotemporal domain decomposition is necessary to account for heterogeneity in the data. In this study, we perform octree-based adaptive decomposition of the spatiotemporal domain for parallel computation of space-time kernel density. In order to avoid edge effects near subdomain boundaries, we establish spatiotemporal buffers to include adjacent data points that are within the spatial and temporal kernel bandwidths. Then, we quantify the computational intensity of each subdomain to balance workloads among processors. We illustrate the benefits of our methodology using a space-time epidemiological dataset of Dengue fever, an infectious vector-borne disease that poses a severe threat to communities in tropical climates. Our parallel implementation of kernel density reaches substantial speedup compared to sequential processing, and achieves high levels of workload balance among processors due to great accuracy in quantifying computational intensity. Our approach is portable to other space-time analytical tests.
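A direct (serial) sketch of the space-time kernel density being parallelized, assuming a product of Epanechnikov kernels with separate spatial and temporal bandwidths; the octree decomposition, boundary buffers and workload balancing are not shown, and all data below are synthetic:

```python
import numpy as np

def st_kernel_density(eval_pts, data_pts, hs, ht):
    """Direct space-time kernel density estimate with Epanechnikov kernels:
    a 2-D spatial kernel with bandwidth hs and a 1-D temporal kernel with
    bandwidth ht (product form). Both inputs are arrays of (x, y, t) rows."""
    x, y, t = (eval_pts[:, i][:, None] for i in range(3))
    xd, yd, td = (data_pts[:, i][None, :] for i in range(3))

    us2 = ((x - xd) ** 2 + (y - yd) ** 2) / hs ** 2           # squared scaled spatial distance
    ut2 = ((t - td) / ht) ** 2                                # squared scaled temporal distance
    ks = np.where(us2 < 1, (2.0 / np.pi) * (1 - us2), 0.0)    # 2-D Epanechnikov
    kt = np.where(ut2 < 1, 0.75 * (1 - ut2), 0.0)             # 1-D Epanechnikov
    n = data_pts.shape[0]
    return (ks * kt).sum(axis=1) / (n * hs ** 2 * ht)

# Toy case-point data (x, y in km, t in days) and a few evaluation locations.
rng = np.random.default_rng(5)
cases = np.column_stack([rng.normal(0, 2, 500), rng.normal(0, 2, 500),
                         rng.uniform(0, 60, 500)])
grid = np.array([[0.0, 0.0, 30.0], [5.0, 5.0, 10.0]])
print(st_kernel_density(grid, cases, hs=1.5, ht=7.0))
```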
A Novel Technique to Measure In Vivo Uterine Suspensory Ligament Stiffness
Smith, Tovia M.; Luo, Jiajia; Hsu, Yvonne; Ashton-Miller, James A.; Delancey, John O.L.
2013-01-01
Objective: To describe a new computer-controlled research apparatus for measuring in vivo uterine ligament force-displacement behavior and stiffness and to present pilot data in women with and without prolapse.
Study Design: Seventeen women with varying uterine support underwent testing in the operating room (OR) after anesthetic induction. A tripod-mounted computer-controlled linear servoactuator was used to quantify force-displacement behavior of the cervix and supporting ligaments. The servoactuator applied a caudally-directed force to a tenaculum at 4 mm/s velocity until the traction force reached 17.8 N (4 lbs.). Cervix location on POP-Q in clinic, in the OR at rest, with minimal force (<1.1 N), and with maximum force (17.8 N) was recorded. Ligament "stiffness" between minimum and maximum force was calculated.
Results: The mean (SD) subject age was 54.5 (12.7) years, parity 2.9 (1.1), BMI 29.0 (4.3) kg/m2, and POP-Q point C −3.1 (3.9) cm. POP-Q point C was most strongly correlated with cervix location at maximum force (r=+0.68, p=.003) and at rest (r=+0.62, p=.009). Associations between cervix location at minimum force (r=+0.46, p=.059) and ligament stiffness (r= −0.44, p=.079) were not statistically significant. Cervix location in the OR with minimal traction lay below the lowest point found on POP-Q for 13 women.
Conclusions: POP-Q point C was strongly correlated with cervix location at rest and at maximum traction force; however only 19% of the variation in POP-Q point C location was explained by ligament stiffness. The cervix location in the OR at minimal traction lay below the POP-Q point C value in ¾ of women. PMID:23747493
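The stiffness calculation lends itself to a short sketch: fit a line to the force-displacement record between the minimum and maximum traction forces and take the slope. The function below assumes that simple definition and toy data; it is not the authors' processing code:

```python
import numpy as np

def ligament_stiffness(displacement_mm, force_n, f_min=1.1, f_max=17.8):
    """Stiffness estimated as the slope (N/mm) of a straight-line fit to the
    force-displacement record between the minimum and maximum traction
    forces (a simplified definition for illustration)."""
    displacement_mm = np.asarray(displacement_mm, dtype=float)
    force_n = np.asarray(force_n, dtype=float)
    window = (force_n >= f_min) & (force_n <= f_max)
    slope, _intercept = np.polyfit(displacement_mm[window], force_n[window], 1)
    return slope

# Toy force-displacement record from a 4 mm/s pull (illustrative numbers).
d = np.linspace(0, 30, 200)                    # displacement, mm
f = 0.5 * d + 0.01 * d ** 2                    # gently stiffening response, N
print("stiffness (N/mm):", round(ligament_stiffness(d, f), 2))
```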
High quality chemical structure inventories provide the foundation of the U.S. EPA’s ToxCast and Tox21 projects, which are employing high-throughput technologies to screen thousands of chemicals in hundreds of biochemical and cell-based assays, probing a wide diversity of targets...
A Computer-Assisted Learning Model Based on the Digital Game Exponential Reward System
ERIC Educational Resources Information Center
Moon, Man-Ki; Jahng, Surng-Gahb; Kim, Tae-Yong
2011-01-01
The aim of this research was to construct a motivational model which would stimulate voluntary and proactive learning using digital game methods offering players more freedom and control. The theoretical framework of this research lays the foundation for a pedagogical learning model based on digital games. We analyzed the game reward system, which…
In-silico experiments of zebrafish behaviour: modeling swimming in three dimensions
NASA Astrophysics Data System (ADS)
Mwaffo, Violet; Butail, Sachit; Porfiri, Maurizio
2017-01-01
Zebrafish is fast becoming a species of choice in biomedical research for the investigation of functional and dysfunctional processes coupled with their genetic and pharmacological modulation. As with mammals, experimentation with zebrafish constitutes a complicated ethical issue that calls for the exploration of alternative testing methods to reduce the number of subjects, refine experimental designs, and replace live animals. Inspired by the demonstrated advantages of computational studies in other life science domains, we establish an authentic data-driven modelling framework to simulate zebrafish swimming in three dimensions. The model encapsulates burst-and-coast swimming style, speed modulation, and wall interaction, laying the foundations for in-silico experiments of zebrafish behaviour. Through computational studies, we demonstrate the ability of the model to replicate common ethological observables such as speed and spatial preference, and anticipate experimental observations on the influence of tank dimensions on zebrafish behaviour. Extending to other experimental paradigms, our framework is expected to contribute to a reduction in animal use and suffering.
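For illustration only, a toy three-dimensional burst-and-coast swimmer with wall reflection; the rates, speeds and tank size are arbitrary and the sketch is not the calibrated, data-driven model described above:

```python
import numpy as np

def simulate_burst_coast(n_steps=2000, dt=0.05, tank=(0.3, 0.3, 0.2),
                         burst_speed=0.15, decay=2.0, turn_sd=0.6, seed=0):
    """Toy 3-D burst-and-coast swimmer: at random 'burst' events the speed
    jumps and the heading is perturbed; between bursts the speed decays
    exponentially; positions are reflected at the tank walls."""
    rng = np.random.default_rng(seed)
    tank = np.asarray(tank)
    pos = tank / 2
    heading = rng.normal(size=3)
    heading /= np.linalg.norm(heading)
    speed = 0.0
    traj = np.empty((n_steps, 3))
    for k in range(n_steps):
        if rng.random() < 0.1:                         # burst event
            speed = burst_speed
            heading = heading + turn_sd * rng.normal(size=3)
            heading /= np.linalg.norm(heading)
        speed *= np.exp(-decay * dt)                   # coast: exponential decay
        pos = pos + speed * heading * dt
        low, high = pos < 0, pos > tank                # reflect at the tank walls
        pos = np.where(low, -pos, np.where(high, 2 * tank - pos, pos))
        heading = np.where(low | high, -heading, heading)
        traj[k] = pos
    return traj

traj = simulate_burst_coast()
print("mean speed (m/s):", np.linalg.norm(np.diff(traj, axis=0), axis=1).mean() / 0.05)
```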
Description and detection of burst events in turbulent flows
NASA Astrophysics Data System (ADS)
Schmid, P. J.; García-Gutierrez, A.; Jiménez, J.
2018-04-01
A mathematical and computational framework is developed for the detection and identification of coherent structures in turbulent wall-bounded shear flows. In a first step, this data-based technique will use an embedding methodology to formulate the fluid motion as a phase-space trajectory, from which state-transition probabilities can be computed. Within this formalism, a second step then applies repeated clustering and graph-community techniques to determine a hierarchy of coherent structures ranked by their persistencies. This latter information will be used to detect highly transitory states that act as precursors to violent and intermittent events in turbulent fluid motion (e.g., bursts). Used as an analysis tool, this technique allows the objective identification of intermittent (but important) events in turbulent fluid motion; however, it also lays the foundation for advanced control strategies for their manipulation. The techniques are applied to low-dimensional model equations for turbulent transport, such as the self-sustaining process (SSP), for varying levels of complexity.
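A minimal sketch of the first two steps, assuming a time-delay embedding of a scalar signal, k-means clustering of embedded states, and an empirical state-transition matrix; the repeated clustering and graph-community hierarchy of the full method are not reproduced:

```python
import numpy as np
from sklearn.cluster import KMeans

def delay_embed(series, dim=5, lag=2):
    """Time-delay embedding: each row is (x[t], x[t+lag], ..., x[t+(dim-1)*lag])."""
    n = len(series) - (dim - 1) * lag
    return np.column_stack([series[i * lag:i * lag + n] for i in range(dim)])

def transition_matrix(labels, n_states):
    """Empirical state-transition probabilities between consecutive cluster labels."""
    P = np.zeros((n_states, n_states))
    for a, b in zip(labels[:-1], labels[1:]):
        P[a, b] += 1
    return P / np.maximum(P.sum(axis=1, keepdims=True), 1)

# Toy signal with rare, intermittent bursts (illustrative only).
rng = np.random.default_rng(6)
x = rng.normal(0, 1, 5000)
x[rng.random(5000) < 0.01] += 8.0

states = delay_embed(x, dim=5, lag=2)
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(states)
P = transition_matrix(labels, n_states=6)
# Low-persistence clusters (small diagonal entries) flag transitory states
# that may precede intermittent events.
print(np.round(np.diag(P), 2))
```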
Charabidze, Damien; Depeme, Aurore; Devigne, Cedric; Hedouin, Valery
2015-08-01
This study was designed to examine the common belief that necrophagous blowflies lay their eggs in wounds. The egg-laying behaviour of Lucilia sericata was observed under controlled conditions on wet, artificially wounded or short-haired areas of rat cadavers. Flies laid significantly more eggs on the wet area and the area with short hair than on the dry area or area with long hair. No eggs were observed inside the wounds in any of the replicates. The effect of egg immersion (body fluids often exudes in wounds) on the survival rate of larvae was also investigated. In low water condition, an average of 72.7±7.9% of the larvae survived and they reached a mean length of 7.5±0.6mm. In contrast, submerging eggs under a high volume of water strongly affected their survival rate (25±3.7%) and development. Similar results were observed using unfrozen pig blood instead of water. These data question the information found in the literature regarding the preferential egg-laying behaviour of Calliphorids flies in wounds. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Stewart, Jennifer M
2014-04-01
Clergy and lay leaders have a pivotal role in the development and maintenance of HIV Ministries within the African American church. However, little is known about the actual roles these men and women have, the barriers they face and the supports they have found in the development and maintenance of an HIV Ministry. The purpose of this study is to examine the role, barriers and supports clergy and lay leaders experienced in the development of a long-standing HIV ministry in an African American church. These data were gathered from a larger ethnographic study, which examined the role of religious culture in the development, implementation and maintenance of an HIV ministry. Data for this study were collected through in-depth semi-structured interviews. Results revealed that the primary role of clergy and lay leaders involved dispelling myths surrounding HIV and ensuring congregational support. The primary barrier to the development and maintenance was views regarding sexuality. The primary support was their relationships with congregants that lived with HIV and AIDS. This information can assist in developing interventions to enhance the African American church movement toward HIV ministries.
Metnitz, P G; Laback, P; Popow, C; Laback, O; Lenz, K; Hiesmayr, M
1995-01-01
Patient Data Management Systems (PDMS) for ICUs collect, present and store clinical data. Various intentions make analysis of those digitally stored data desirable, such as quality control or scientific purposes. The aim of the Intensive Care Data Evaluation project (ICDEV) was to provide a database tool for the analysis of data recorded at various ICUs at the University Clinics of the Vienna General Hospital, where two different PDMSs are used: CareVue 9000 (Hewlett Packard, Andover, USA) at two ICUs (one medical ICU and one neonatal ICU) and PICIS Chart+ (PICIS, Paris, France) at one cardiothoracic ICU. CONCEPT AND METHODS: Clinically oriented analysis of the data collected in a PDMS at an ICU was the starting point of the development. After defining the database structure we established a client-server based database system under Microsoft Windows NT and developed a user-friendly data querying application using Microsoft Visual C++ and Visual Basic. ICDEV was successfully installed at three different ICUs; adjustments to the different PDMS configurations were made within a few days. The database structure we developed enables a powerful query concept, an 'EXPERT QUESTION COMPILER', which may help to answer almost any clinical question. Several program modules facilitate queries at the patient, group and unit level. Results from ICDEV queries are automatically transferred to Microsoft Excel for display (in the form of configurable tables and graphs) and further processing. The ICDEV concept is configurable for adjustment to different intensive care information systems and can be used to support computerized quality control. However, as long as there is no sufficient artifact recognition or data validation software for automatically recorded patient data, the reliability of these data and their usage for computer-assisted quality control remain unclear and should be further studied.
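To give a flavour of the kind of group-level query such a system compiles and hands to Excel (the table and column names below are placeholders, not the actual ICDEV schema), a parameterized SQL query can be run from Python and the result exported for display:

```python
import sqlite3                     # stand-in for the production DBMS client
import pandas as pd

# Hypothetical schema (not the real ICDEV tables): vitals(patient_id, unit, ts, heart_rate)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vitals (patient_id TEXT, unit TEXT, ts TEXT, heart_rate REAL)")
conn.executemany("INSERT INTO vitals VALUES (?, ?, ?, ?)",
                 [("p1", "medical_icu", "1995-01-02", 88.0),
                  ("p1", "medical_icu", "1995-01-03", 92.0),
                  ("p2", "medical_icu", "1995-01-02", 110.0)])

query = """
    SELECT patient_id, AVG(heart_rate) AS mean_hr
    FROM vitals
    WHERE unit = ? AND ts BETWEEN ? AND ?
    GROUP BY patient_id
"""
df = pd.read_sql_query(query, conn, params=("medical_icu", "1995-01-01", "1995-01-31"))
df.to_excel("mean_hr_by_patient.xlsx", index=False)   # hand the result to Excel for tables/graphs
print(df)
```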
NASA Technical Reports Server (NTRS)
Brown, G. S.; Curry, W. J.
1977-01-01
The statistical error of the pointing angle estimation technique is determined as a function of the effective receiver signal-to-noise ratio. Other sources of error are addressed and evaluated, with inadequate calibration being of major concern. The impact of pointing error on the computation of normalized surface scattering cross section (sigma) from radar and on the waveform attitude-induced altitude bias is considered, and quantitative results are presented. Pointing angle and sigma processing algorithms are presented along with some initial data. The intensive mode clean vs. clutter AGC calibration problem is analytically resolved. The use of clutter AGC data in the intensive mode is confirmed as the correct calibration set for the sigma computations.
A Rich Metadata Filesystem for Scientific Data
ERIC Educational Resources Information Center
Bui, Hoang
2012-01-01
As scientific research becomes more data intensive, there is an increasing need for scalable, reliable, and high performance storage systems. Such data repositories must provide both data archival services and rich metadata, and cleanly integrate with large scale computing resources. ROARS is a hybrid approach to distributed storage that provides…
Wide-angle display developments by computer graphics
NASA Technical Reports Server (NTRS)
Fetter, William A.
1989-01-01
Computer graphics can now expand its new subset, wide-angle projection, to be as significant a generic capability as computer graphics itself. Some prior work in computer graphics is presented which leads to an attractive further subset of wide-angle projection, called hemispheric projection, as a major communication medium. Hemispheric film systems have long been present and such computer graphics systems are in use in simulators. This is the leading edge of capabilities which should ultimately be as ubiquitous as CRTs (cathode-ray tubes). These assertions stem not from degrees in science, nor solely from a degree in graphic design, but from a history of computer graphics innovations, laying groundwork by demonstration. The author believes that it is timely to look at several development strategies, since hemispheric projection is now at a point comparable to the early stages of computer graphics, requiring similar patterns of development again.
Bright, A
2008-05-01
1. In this study, the calling rates of vocalisations known to indicate distress and aversive events (Alarm calls, Squawks, Total vocalisations) and acoustic parameters of flock noise were quantified from feather and non-feather pecking laying flocks. 2. One hour of flock noise (background machinery and hen vocalisations) was recorded from 21 commercial free-range laying hen flocks aged ≥35 weeks. Ten of the flocks were classified as feather pecking (based on a plumage condition score) and 11 as non-feather pecking. 3. Recordings were made using a Sony DAT recorder and an Audio-Technica omni-directional microphone, placed in the centre of the house, 1.5 m from the ground. Avisoft-SASlab Pro was used to create and analyse audio spectrograms. 4. There was no effect of flock size or farm on call/s or acoustic parameters of flock noise. However, strain had an effect on the number of Total vocalisation/s; the Hebden Black flock made more calls than Lohmann flocks. Feather pecking flocks gave more Squawk/s and more Total vocalisation/s than non-feather pecking flocks. Feather pecking did not explain variation in alarm call rate, or in intensity (dB) and frequency (Hz) measures of flock noise. 5. The differences between Squawk and Total vocalisation call rates of feather and non-feather pecking flocks are a new finding. An increase or change in flock calling rate may be evident before other conventional measures of laying hen welfare such as a drop in egg production or increase in plumage damage, thus enabling farmers to make management or husbandry changes to prevent an outbreak of feather pecking.
A Parallel and Incremental Approach for Data-Intensive Learning of Bayesian Networks.
Yue, Kun; Fang, Qiyu; Wang, Xiaoling; Li, Jin; Liu, Weiyi
2015-12-01
Bayesian network (BN) has been adopted as the underlying model for representing and inferring uncertain knowledge. As the basis of realistic applications centered on probabilistic inferences, learning a BN from data is a critical subject of machine learning, artificial intelligence, and big data paradigms. Currently, it is necessary to extend the classical methods for learning BNs with respect to data-intensive computing or in cloud environments. In this paper, we propose a parallel and incremental approach for data-intensive learning of BNs from massive, distributed, and dynamically changing data by extending the classical scoring and search algorithm and using MapReduce. First, we adopt the minimum description length as the scoring metric and give the two-pass MapReduce-based algorithms for computing the required marginal probabilities and scoring the candidate graphical model from sample data. Then, we give the corresponding strategy for extending the classical hill-climbing algorithm to obtain the optimal structure, as well as that for storing a BN by
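As a hedged sketch of the counting step that a MapReduce pass over sample data typically performs when scoring candidate structures (the generic pattern only, not the authors' two-pass algorithm), each mapper can emit counts of child-parent configurations and a reducer can merge them before the MDL/likelihood term is computed:

```python
from collections import Counter
from functools import reduce

# Each record is a dict of variable -> observed value.
def map_counts(record, family):
    """Emit one count for the joint configuration of a child and its parents."""
    child, parents = family
    key = (record[child], tuple(record[p] for p in parents))
    return Counter({key: 1})

def reduce_counts(c1, c2):
    """Merge partial counts from different mappers."""
    return c1 + c2

data = [{"A": 0, "B": 1, "C": 1}, {"A": 1, "B": 1, "C": 0},
        {"A": 0, "B": 0, "C": 1}]
family = ("C", ["A", "B"])   # candidate family: C with parents {A, B}
counts = reduce(reduce_counts, (map_counts(r, family) for r in data))
print(counts)   # these joint counts feed the scoring metric for the candidate structure
```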
3D exploitation of large urban photo archives
NASA Astrophysics Data System (ADS)
Cho, Peter; Snavely, Noah; Anderson, Ross
2010-04-01
Recent work in computer vision has demonstrated the potential to automatically recover camera and scene geometry from large collections of uncooperatively-collected photos. At the same time, aerial ladar and Geographic Information System (GIS) data are becoming more readily accessible. In this paper, we present a system for fusing these data sources in order to transfer 3D and GIS information into outdoor urban imagery. Applying this system to 1000+ pictures shot of the lower Manhattan skyline and the Statue of Liberty, we present two proof-of-concept examples of geometry-based photo enhancement which are difficult to perform via conventional image processing: feature annotation and image-based querying. In these examples, high-level knowledge projects from 3D world-space into georegistered 2D image planes and/or propagates between different photos. Such automatic capabilities lay the groundwork for future real-time labeling of imagery shot in complex city environments by mobile smart phones.
GPU Accelerated Browser for Neuroimaging Genomics.
Zigon, Bob; Li, Huang; Yao, Xiaohui; Fang, Shiaofen; Hasan, Mohammad Al; Yan, Jingwen; Moore, Jason H; Saykin, Andrew J; Shen, Li
2018-04-25
Neuroimaging genomics is an emerging field that provides exciting opportunities to understand the genetic basis of brain structure and function. The unprecedented scale and complexity of the imaging and genomics data, however, have presented critical computational bottlenecks. In this work we present our initial efforts towards building an interactive visual exploratory system for mining big data in neuroimaging genomics. A GPU-accelerated browsing tool for neuroimaging genomics is created that implements the ANOVA algorithm for single nucleotide polymorphism (SNP) based analysis and the VEGAS algorithm for gene-based analysis, and executes them at interactive rates. The ANOVA algorithm is 110 times faster than the 4-core OpenMP version, while the VEGAS algorithm is 375 times faster than its 4-core OpenMP counterpart. This approach lays a solid foundation for researchers to address the challenges of mining large-scale imaging genomics datasets via interactive visual exploration.
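For orientation, the per-SNP ANOVA referred to above amounts to a one-way F-test of a quantitative imaging phenotype across the three genotype groups; a plain CPU reference in Python might look like this (toy random data, not the GPU kernel):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
phenotype = rng.normal(size=300)              # e.g. a regional brain measure
genotype = rng.integers(0, 3, size=300)       # SNP coded as 0/1/2 copies of the minor allele

groups = [phenotype[genotype == g] for g in range(3)]
f_stat, p_value = stats.f_oneway(*groups)     # one-way ANOVA across genotype groups
print(f"F = {f_stat:.3f}, p = {p_value:.3g}")
```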
Cloud computing applications for biomedical science: A perspective
2018-01-01
Biomedical research has become a digital data–intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches. Cloud computing offers users pay-as-you-go access to services such as hardware infrastructure, platforms, and software for solving common biomedical computational problems. Cloud computing services offer secure on-demand storage and analysis and are differentiated from traditional high-performance computing by their rapid availability and scalability of services. As such, cloud services are engineered to address big data problems and enhance the likelihood of data and analytics sharing, reproducibility, and reuse. Here, we provide an introductory perspective on cloud computing to help the reader determine its value to their own research. PMID:29902176
Data-intensive computing on numerically-insensitive supercomputers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahrens, James P; Fasel, Patricia K; Habib, Salman
2010-12-03
With the advent of the era of petascale supercomputing, via the delivery of the Roadrunner supercomputing platform at Los Alamos National Laboratory, there is a pressing need to address the problem of visualizing massive petascale-sized results. In this presentation, I discuss progress on a number of approaches including in-situ analysis, multi-resolution out-of-core streaming and interactive rendering on the supercomputing platform. These approaches are placed in context by the emerging area of data-intensive supercomputing.
Exploring quantum computing application to satellite data assimilation
NASA Astrophysics Data System (ADS)
Cheung, S.; Zhang, S. Q.
2015-12-01
This is an exploratory work on the potential application of quantum computing to a scientific data optimization problem. On classical computational platforms, the physical domain of a satellite data assimilation problem is represented by a discrete variable transform, and classical minimization algorithms are employed to find the optimal solution of the analysis cost function. The computation becomes intensive and time-consuming when the problem involves a large number of variables and data. The new quantum computer opens a very different approach, both in conceptual programming and in hardware architecture, for solving optimization problems. In order to explore whether we can utilize the quantum computing machine architecture, we formulate a satellite data assimilation experimental case in the form of a quadratic programming optimization problem. We find a transformation of the problem to map it into the Quadratic Unconstrained Binary Optimization (QUBO) framework. A Binary Wavelet Transform (BWT) will be applied to the data assimilation variables for its invertible decomposition, and all calculations in the BWT are performed by Boolean operations. The transformed problem will then be used to experiment with solving QUBO instances defined on the Chimera graphs of the quantum computer.
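To make the QUBO target concrete, here is a tiny brute-force illustration of the form of problem being mapped to (the coefficients are arbitrary and unrelated to any real assimilation case; a quantum annealer would replace the exhaustive search):

```python
import itertools
import numpy as np

# QUBO: minimize x^T Q x over binary vectors x, with Q given in upper-triangular form.
Q = np.array([[-1.0,  2.0,  0.0],
              [ 0.0, -1.5,  1.0],
              [ 0.0,  0.0, -0.5]])

best = min(itertools.product([0, 1], repeat=3),
           key=lambda x: np.array(x) @ Q @ np.array(x))
print("optimal binary assignment:", best)
```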
NASA Astrophysics Data System (ADS)
Sitzia, T.; Picco, L.; Ravazzolo, D.; Comiti, F.; Mao, L.; Lenzi, M. A.
2016-07-01
We compared three gravel-bed rivers in north-eastern Italy (Brenta, Piave, Tagliamento) having similar bioclimate, geology and fluvial morphology, but affected by different intensities of anthropogenic disturbance related particularly to hydropower dams, training works and instream gravel mining. Our aim was to test whether a corresponding difference in the interactions between vegetation and geomorphological patterns existed among the three rivers. In equally spaced and sized plots (n = 710) we collected descriptors of geomorphic conditions, and presence-absence of woody species. In the less disturbed river (Tagliamento), spatial succession of woody communities from the floodplain to the channel followed a profile where higher elevation floodplains featured more developed tree communities, and lower elevation islands and bars were covered by pioneer communities. In the intermediate-disturbed river (Piave), islands and floodplains lay at similar elevation and both showed species indicators of mature developed communities. In the most disturbed river (Brenta), all these patterns were simplified, all geomorphic units lay at similar elevations, were not well characterized by species composition, and presented similar persistence age. This indicates that in human-disturbed rivers, channel and vegetation adjustments are closely linked in the long term, and suggests that intermediate levels of anthropogenic disturbance, such as those encountered in the Piave River, could counteract the natural, more dynamic conditions that may periodically fragment vegetated landscapes in natural rivers.
A Computational Approach for Modeling Neutron Scattering Data from Lipid Bilayers
Carrillo, Jan-Michael Y.; Katsaras, John; Sumpter, Bobby G.; ...
2017-01-12
Biological cell membranes are responsible for a range of structural and dynamical phenomena crucial to a cell's well-being and its associated functions. Due to the complexity of cell membranes, lipid bilayer systems are often used as biomimetic models. These systems have led to significant insights into vital membrane phenomena such as domain formation, passive permeation and protein insertion. Experimental observations of membrane structure and dynamics are, however, limited in resolution, both spatially and temporally. Importantly, computer simulations are starting to play a more prominent role in interpreting experimental results, enabling a molecular understanding of lipid membranes. Particularly, the synergy between scattering experiments and simulations offers opportunities for new discoveries in membrane physics, as the length and time scales probed by molecular dynamics (MD) simulations parallel those of experiments. We also describe a coarse-grained MD simulation approach that mimics neutron scattering data from large unilamellar lipid vesicles over a range of bilayer rigidity. Specifically, we simulate vesicle form factors and membrane thickness fluctuations determined from small angle neutron scattering (SANS) and neutron spin echo (NSE) experiments, respectively. Our simulations accurately reproduce trends from experiments and lay the groundwork for investigations of more complex membrane systems.
Architecture and Programming Models for High Performance Intensive Computation
2016-06-29
Desktop Social Science: Coming of Age.
ERIC Educational Resources Information Center
Dwyer, David C.; And Others
Beginning in 1985, Apple Computer, Inc. and several school districts began a collaboration to examine the impact of intensive computer use on instruction and learning in K-12 classrooms. This paper follows the development of a Macintosh II-based management and retrieval system for text data undertaken to store and retrieve oral reflections of…
Discovering and understanding oncogenic gene fusions through data intensive computational approaches
Latysheva, Natasha S.; Babu, M. Madan
2016-01-01
Abstract Although gene fusions have been recognized as important drivers of cancer for decades, our understanding of the prevalence and function of gene fusions has been revolutionized by the rise of next-generation sequencing, advances in bioinformatics theory and an increasing capacity for large-scale computational biology. The computational work on gene fusions has been vastly diverse, and the present state of the literature is fragmented. It will be fruitful to merge three camps of gene fusion bioinformatics that appear to rarely cross over: (i) data-intensive computational work characterizing the molecular biology of gene fusions; (ii) development research on fusion detection tools, candidate fusion prioritization algorithms and dedicated fusion databases and (iii) clinical research that seeks to either therapeutically target fusion transcripts and proteins or leverages advances in detection tools to perform large-scale surveys of gene fusion landscapes in specific cancer types. In this review, we unify these different—yet highly complementary and symbiotic—approaches with the view that increased synergy will catalyze advancements in gene fusion identification, characterization and significance evaluation. PMID:27105842
BlueSNP: R package for highly scalable genome-wide association studies using Hadoop clusters.
Huang, Hailiang; Tata, Sandeep; Prill, Robert J
2013-01-01
Computational workloads for genome-wide association studies (GWAS) are growing in scale and complexity, outpacing the capabilities of single-threaded software designed for personal computers. The BlueSNP R package implements GWAS statistical tests in the R programming language and executes the calculations across computer clusters configured with Apache Hadoop, a de facto standard framework for distributed data processing using the MapReduce formalism. BlueSNP makes computationally intensive analyses, such as estimating empirical p-values via data permutation, and searching for expression quantitative trait loci over thousands of genes, feasible for large genotype-phenotype datasets. http://github.com/ibm-bioinformatics/bluesnp
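The computationally heavy step mentioned above, empirical p-values via data permutation, follows this general pattern (a plain NumPy sketch of the statistical idea, not the BlueSNP/Hadoop implementation; the association statistic here is simply a correlation):

```python
import numpy as np

def empirical_p(genotype, phenotype, n_perm=10000, seed=0):
    """Empirical p-value for a simple association statistic via permutation."""
    rng = np.random.default_rng(seed)
    observed = abs(np.corrcoef(genotype, phenotype)[0, 1])
    exceed = 0
    for _ in range(n_perm):
        permuted = rng.permutation(phenotype)
        if abs(np.corrcoef(genotype, permuted)[0, 1]) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)   # add-one correction avoids p = 0

rng = np.random.default_rng(42)
g = rng.integers(0, 3, 500).astype(float)      # toy genotypes
y = 0.2 * g + rng.normal(size=500)             # toy phenotype with a weak effect
print(f"empirical p = {empirical_p(g, y, n_perm=2000):.4f}")
```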
NASA Astrophysics Data System (ADS)
Seamon, E.; Gessler, P. E.; Flathers, E.
2015-12-01
The creation and use of large amounts of data in scientific investigations have become common practice. Data collection and analysis for large scientific computing efforts are increasing not only in volume but also in number, and the methods and analysis procedures are evolving toward greater complexity (Bell, 2009; Clarke, 2009; Maimon, 2010). In addition, the growth of diverse data-intensive scientific computing efforts (Soni, 2011; Turner, 2014; Wu, 2008) has demonstrated the value of supporting scientific data integration. Efforts to bridge this gap between the above perspectives have been attempted, in varying degrees, with modular scientific computing analysis regimes implemented with a modest amount of success (Perez, 2009). This constellation of effects - 1) an increasing growth in the volume and amount of data, 2) a growing data-intensive science base that has challenging needs, and 3) disparate data organization and integration efforts - has created a critical gap. Namely, systems of scientific data organization and management typically do not effectively enable integrated data collaboration or data-intensive science-based communications. Our research efforts attempt to address this gap by developing a modular technology framework for data science integration efforts, with climate variation as the focus. The intention is that this model, if successful, could be generalized to other application areas. Our research aim focused on the design and implementation of a modular, deployable technology architecture for data integration. Developed using aspects of R, interactive Python, SciDB, THREDDS, JavaScript, and varied data mining and machine learning techniques, the Modular Data Response Framework (MDRF) was implemented to explore case scenarios for bioclimatic variation as they relate to Pacific Northwest ecosystem regions. Our preliminary results, using historical NetCDF climate data for calibration purposes across the inland Pacific Northwest region (Abatzoglou, Brown, 2011), show clear ecosystem shifts over a ten-year period (2001-2011), based on multiple supervised classifier methods for bioclimatic indicators.
0-6767 : evaluation of existing smartphone applications and data needs for travel survey.
DOT National Transportation Integrated Search
2014-08-01
Current and reliable data on traffic movements : play a key role in transportation planning, : modeling, and air quality analysis. Traditional : travel surveys conducted via paper or computer : are costly, time consuming, and labor intensive : for su...
Bringing MapReduce Closer To Data With Active Drives
NASA Astrophysics Data System (ADS)
Golpayegani, N.; Prathapan, S.; Warmka, R.; Wyatt, B.; Halem, M.; Trantham, J. D.; Markey, C. A.
2017-12-01
Moving computation closer to the data location has been a much theorized improvement to computation for decades. The increase in processor performance, the decrease in processor size and power requirement combined with the increase in data intensive computing has created a push to move computation as close to data as possible. We will show the next logical step in this evolution in computing: moving computation directly to storage. Hypothetical systems, known as Active Drives, have been proposed as early as 1998. These Active Drives would have a general-purpose CPU on each disk allowing for computations to be performed on them without the need to transfer the data to the computer over the system bus or via a network. We will utilize Seagate's Active Drives to perform general purpose parallel computing using the MapReduce programming model directly on each drive. We will detail how the MapReduce programming model can be adapted to the Active Drive compute model to perform general purpose computing with comparable results to traditional MapReduce computations performed via Hadoop. We will show how an Active Drive based approach significantly reduces the amount of data leaving the drive when performing several common algorithms: subsetting and gridding. We will show that an Active Drive based design significantly improves data transfer speeds into and out of drives compared to Hadoop's HDFS while at the same time keeping comparable compute speeds as Hadoop.
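As an illustration of the gridding workload described above (a generic MapReduce pattern, not the Active Drive API itself), each drive could map its local points to grid cells and emit small partial aggregates, so that only the reduced summaries ever leave the storage device:

```python
from collections import defaultdict

def map_to_grid(points, cell_size=1.0):
    """Per-drive map step: aggregate local (lat, lon, value) points into grid cells."""
    partial = defaultdict(lambda: [0.0, 0])          # cell -> [sum, count]
    for lat, lon, value in points:
        cell = (int(lat // cell_size), int(lon // cell_size))
        partial[cell][0] += value
        partial[cell][1] += 1
    return partial

def reduce_grids(partials):
    """Reduce step: merge per-drive partial aggregates into a mean per cell."""
    merged = defaultdict(lambda: [0.0, 0])
    for partial in partials:
        for cell, (s, n) in partial.items():
            merged[cell][0] += s
            merged[cell][1] += n
    return {cell: s / n for cell, (s, n) in merged.items()}

drive_a = [(10.2, 20.1, 1.0), (10.7, 20.3, 3.0)]   # points local to drive A
drive_b = [(10.4, 20.8, 2.0)]                      # points local to drive B
print(reduce_grids([map_to_grid(drive_a), map_to_grid(drive_b)]))
```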
Three-dimensional surface profile intensity correction for spatially modulated imaging
NASA Astrophysics Data System (ADS)
Gioux, Sylvain; Mazhar, Amaan; Cuccia, David J.; Durkin, Anthony J.; Tromberg, Bruce J.; Frangioni, John V.
2009-05-01
We describe a noncontact profile correction technique for quantitative, wide-field optical measurement of tissue absorption (μa) and reduced scattering (μs') coefficients, based on geometric correction of the sample's Lambertian (diffuse) reflectance intensity. Because the projection of structured light onto an object is the basis for both phase-shifting profilometry and modulated imaging, we were able to develop a single instrument capable of performing both techniques. In so doing, the surface of the three-dimensional object could be acquired and used to extract the object's optical properties. The optical properties of flat polydimethylsiloxane (silicone) phantoms with homogenous tissue-like optical properties were extracted, with and without profilometry correction, after vertical translation and tilting of the phantoms at various angles. Objects having a complex shape, including a hemispheric silicone phantom and human fingers, were acquired and similarly processed, with vascular constriction of a finger being readily detectable through changes in its optical properties. Using profilometry correction, the accuracy of extracted absorption and reduced scattering coefficients improved from two- to ten-fold for surfaces having height variations as much as 3 cm and tilt angles as high as 40 deg. These data lay the foundation for employing structured light for quantitative imaging during surgery.
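The geometric correction rests on the Lambertian assumption named above; as a hedged first-order sketch (not necessarily the exact calibration used by the authors), the diffuse intensity measured from a surface element tilted by angle θ and at height h can be normalized by a flat-reference measurement and the cosine of the tilt:

```latex
% First-order Lambertian profile correction (sketch): h and theta come from
% profilometry; I_0(h) is the intensity of a flat reference at the same height.
R_{\mathrm{corr}} = \frac{I_{\mathrm{meas}}(h,\theta)}{I_{0}(h)\,\cos\theta}
```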
Digital signal conditioning for flight test instrumentation
NASA Technical Reports Server (NTRS)
Bever, Glenn A.
1991-01-01
An introduction to digital measurement processes on aircraft is provided. Flight test instrumentation systems are rapidly evolving from analog-intensive to digital-intensive systems, including the use of onboard digital computers. The topics include measurements that are digital in origin, as well as sampling, encoding, transmitting, and storing data. Particular emphasis is placed on modern avionic data bus architectures and what to be aware of when extracting data from them. Examples of data extraction techniques are given. Tradeoffs between digital logic families, trends in digital development, and design testing techniques are discussed. An introduction to digital filtering is also covered.
Accessing the public MIMIC-II intensive care relational database for clinical research
2013-01-01
Background The Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II) database is a free, public resource for intensive care research. The database was officially released in 2006, and has attracted a growing number of researchers in academia and industry. We present the two major software tools that facilitate accessing the relational database: the web-based QueryBuilder and a downloadable virtual machine (VM) image. Results QueryBuilder and the MIMIC-II VM have been developed successfully and are freely available to MIMIC-II users. Simple example SQL queries and the resulting data are presented. Clinical studies pertaining to acute kidney injury and prediction of fluid requirements in the intensive care unit are shown as typical examples of research performed with MIMIC-II. In addition, MIMIC-II has also provided data for annual PhysioNet/Computing in Cardiology Challenges, including the 2012 Challenge “Predicting mortality of ICU Patients”. Conclusions QueryBuilder is a web-based tool that provides easy access to MIMIC-II. For more computationally intensive queries, one can locally install a complete copy of MIMIC-II in a VM. Both publicly available tools provide the MIMIC-II research community with convenient querying interfaces and complement the value of the MIMIC-II relational database. PMID:23302652
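To convey the flavour of the simple SQL queries the paper refers to (the table and column names below are illustrative placeholders, not the actual MIMIC-II schema), a count of ICU stays per care unit might be expressed as follows:

```python
import sqlite3   # illustration only; the real database is a full relational DBMS

# Placeholder schema, not the actual MIMIC-II tables: icustays(stay_id, care_unit, los_hours)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE icustays (stay_id INTEGER, care_unit TEXT, los_hours REAL)")
conn.executemany("INSERT INTO icustays VALUES (?, ?, ?)",
                 [(1, "MICU", 52.0), (2, "MICU", 30.5), (3, "CCU", 71.2)])

query = """
    SELECT care_unit, COUNT(*) AS n_stays, AVG(los_hours) AS mean_los
    FROM icustays
    GROUP BY care_unit
    ORDER BY n_stays DESC
"""
for row in conn.execute(query):
    print(row)
```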
BAAL/CUP Seminar 2015: Eyetracking as a Research Method in Online Language Education
ERIC Educational Resources Information Center
Stickler, Ursula; Shi, Lijing
2016-01-01
The seminar took place at The Open University, Milton Keynes, United Kingdom, 12-13 June 2015. The objectives of this seminar were to explore the use of eyetracking in advancing online language education, and to provide a platform for the exchange of ideas across disciplines (language, education, and computing). The overarching aim was to lay the…
Energy 101: Energy Efficient Data Centers
None
2018-04-16
Data centers provide mission-critical computing functions vital to the daily operation of top U.S. economic, scientific, and technological organizations. These data centers consume large amounts of energy to run and maintain their computer systems, servers, and associated high-performance components; up to 3% of all U.S. electricity powers data centers. And as more information comes online, data centers will consume even more energy. Data centers can become more energy efficient by incorporating features like power-saving "stand-by" modes, energy monitoring software, and efficient cooling systems instead of energy-intensive air conditioners. These and other efficiency improvements to data centers can produce significant energy savings, reduce the load on the electric grid, and help protect the nation by increasing the reliability of critical computer operations.
Cloud Computing Boosts Business Intelligence of Telecommunication Industry
NASA Astrophysics Data System (ADS)
Xu, Meng; Gao, Dan; Deng, Chao; Luo, Zhiguo; Sun, Shaoling
Business Intelligence has become an attractive topic in today's data-intensive applications, especially in the telecommunication industry. Meanwhile, Cloud Computing, which provides IT infrastructure with excellent scalability, large-scale storage, and high performance, has become an effective way to implement parallel data processing and data mining algorithms. BC-PDM (Big Cloud based Parallel Data Miner) is a new MapReduce-based parallel data mining platform developed by CMRI (China Mobile Research Institute) to fit the urgent requirements of business intelligence in the telecommunication industry. In this paper, the architecture, functionality and performance of BC-PDM are presented, together with an experimental evaluation and case studies of its applications. The evaluation results demonstrate both the usability and the cost-effectiveness of a Cloud Computing based Business Intelligence system in applications of the telecommunication industry.
Computer simulation as a teaching aid in pharmacy management--Part 1: Principles of accounting.
Morrison, D J
1987-06-01
The need for pharmacists to develop management expertise through participation in formal courses is now widely acknowledged. Many schools of pharmacy lay the foundations for future management training by providing introductory courses as an integral or elective part of the undergraduate syllabus. The benefit of such courses may, however, be limited by the lack of opportunity for the student to apply the concepts and procedures in a practical working environment. Computer simulations provide a means to overcome this problem, particularly in the field of resource management. In this, the first of two articles, the use of a computer model to demonstrate basic accounting principles is described.
Jagannatha, Abhyuday N; Fodeh, Samah J; Yu, Hong
2017-01-01
Background Medical terms are a major obstacle for patients to comprehend their electronic health record (EHR) notes. Clinical natural language processing (NLP) systems that link EHR terms to lay terms or definitions allow patients to easily access helpful information when reading through their EHR notes, and have shown to improve patient EHR comprehension. However, high-quality lay language resources for EHR terms are very limited in the public domain. Because expanding and curating such a resource is a costly process, it is beneficial and even necessary to identify terms important for patient EHR comprehension first. Objective We aimed to develop an NLP system, called adapted distant supervision (ADS), to rank candidate terms mined from EHR corpora. We will give EHR terms ranked as high by ADS a higher priority for lay language annotation—that is, creating lay definitions for these terms. Methods Adapted distant supervision uses distant supervision from consumer health vocabulary and transfer learning to adapt itself to solve the problem of ranking EHR terms in the target domain. We investigated 2 state-of-the-art transfer learning algorithms (ie, feature space augmentation and supervised distant supervision) and designed 5 types of learning features, including distributed word representations learned from large EHR data for ADS. For evaluating ADS, we asked domain experts to annotate 6038 candidate terms as important or nonimportant for EHR comprehension. We then randomly divided these data into the target-domain training data (1000 examples) and the evaluation data (5038 examples). We compared ADS with 2 strong baselines, including standard supervised learning, on the evaluation data. Results The ADS system using feature space augmentation achieved the best average precision, 0.850, on the evaluation set when using 1000 target-domain training examples. The ADS system using supervised distant supervision achieved the best average precision, 0.819, on the evaluation set when using only 100 target-domain training examples. The 2 ADS systems both performed significantly better than the baseline systems (P<.001 for all measures and all conditions). Using a rich set of learning features contributed to ADS’s performance substantially. Conclusions ADS can effectively rank terms mined from EHRs. Transfer learning improved ADS’s performance even with a small number of target-domain training examples. EHR terms prioritized by ADS were used to expand a lay language resource that supports patient EHR comprehension. The top 10,000 EHR terms ranked by ADS are available upon request. PMID:29089288
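The feature space augmentation named above is commonly realized by replicating each feature vector into "general" and "domain-specific" blocks; the sketch below shows that generic trick (the authors' full ADS pipeline, features and classifier are not reproduced here), after which any standard classifier can be trained on the augmented matrix:

```python
import numpy as np

def augment(X, domain):
    """Feature-space augmentation: [general | source | target] blocks.

    X      : (n_samples, n_features) feature matrix
    domain : 'source' (distant-supervision data) or 'target' (hand-labelled data)
    """
    zeros = np.zeros_like(X)
    if domain == "source":
        return np.hstack([X, X, zeros])
    return np.hstack([X, zeros, X])

X_src = np.random.rand(5, 3)     # e.g. features from the distant-supervision domain
X_tgt = np.random.rand(2, 3)     # small target-domain training set
X_train = np.vstack([augment(X_src, "source"), augment(X_tgt, "target")])
print(X_train.shape)             # (7, 9): feature count triples, sample labels are unchanged
```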
Federated data storage and management infrastructure
NASA Astrophysics Data System (ADS)
Zarochentsev, A.; Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Hristov, P.
2016-10-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. Computing models for the High Luminosity LHC era anticipate storage needs to grow by orders of magnitude; this will require new approaches in data storage organization and data handling. In our project we address the fundamental problem of designing an architecture to integrate distributed heterogeneous disk resources for LHC experiments and other data-intensive science applications, and to provide access to data from heterogeneous computing facilities. We have prototyped a federated storage for Russian T1 and T2 centers located in Moscow, St. Petersburg and Gatchina, as well as a Russian/CERN federation. We have conducted extensive tests of the underlying network infrastructure and storage endpoints with synthetic performance measurement tools as well as with HENP-specific workloads, including those running on supercomputing, cloud computing and Grid platforms for the ALICE and ATLAS experiments. We will present our current accomplishments with running LHC data analysis remotely and locally to demonstrate our ability to efficiently use federated data storage experiment-wide within national academic facilities for High Energy and Nuclear Physics, as well as for other data-intensive science applications, such as bioinformatics.
A General-purpose Framework for Parallel Processing of Large-scale LiDAR Data
NASA Astrophysics Data System (ADS)
Li, Z.; Hodgson, M.; Li, W.
2016-12-01
Light detection and ranging (LiDAR) technologies have proven efficient for quickly obtaining very detailed Earth surface data over large spatial extents. Such data are important for scientific fields such as the Earth and ecological sciences and for natural disaster and environmental applications. However, handling LiDAR data poses grand geoprocessing challenges due to data intensity and computational intensity. Previous studies achieved notable success in applying parallel processing of LiDAR data to these challenges. However, these studies either relied on high performance computers and specialized hardware (GPUs) or focused mostly on finding customized solutions for some specific algorithms. We developed a general-purpose scalable framework coupled with a sophisticated data decomposition and parallelization strategy to efficiently handle big LiDAR data. Specifically, 1) a tile-based spatial index is proposed to manage big LiDAR data in the scalable and fault-tolerant Hadoop distributed file system, 2) two spatial decomposition techniques are developed to enable efficient parallelization of different types of LiDAR processing tasks, and 3) by coupling existing LiDAR processing tools with Hadoop, this framework is able to conduct a variety of LiDAR data processing tasks in parallel in a highly scalable distributed computing environment. The performance and scalability of the framework are evaluated with a series of experiments conducted on a real LiDAR dataset using a proof-of-concept prototype system. The results show that the proposed framework 1) is able to handle massive LiDAR data more efficiently than standalone tools; and 2) provides almost linear scalability in terms of either increased workload (data volume) or increased computing nodes with both spatial decomposition strategies. We believe that the proposed framework provides valuable references on developing a collaborative cyberinfrastructure for processing big earth science data in a highly scalable environment.
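A minimal sketch of the tile-keying idea behind such a spatial index (illustrative only; the actual framework stores and schedules tiles in HDFS) maps each return's planimetric coordinates to a tile identifier so that tiles become independent units of parallel work:

```python
from collections import defaultdict

def tile_key(x, y, tile_size=100.0):
    """Map a LiDAR return's planimetric coordinates to a tile identifier."""
    return (int(x // tile_size), int(y // tile_size))

# Group points by tile so each tile can be processed as an independent task.
points = [(512.3, 1040.7, 35.2), (588.9, 1099.1, 36.0), (1201.4, 250.0, 12.7)]
tiles = defaultdict(list)
for x, y, z in points:
    tiles[tile_key(x, y)].append((x, y, z))

for key, pts in sorted(tiles.items()):
    print(key, len(pts), "points")
```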
Angeles-Albores, David; Leighton, Daniel H W; Tsou, Tiffany; Khaw, Tiffany H; Antoshechkin, Igor; Sternberg, Paul W
2017-09-07
Understanding genome and gene function in a whole organism requires us to fully comprehend the life cycle and the physiology of the organism in question. Caenorhabditis elegans XX animals are hermaphrodites that exhaust their sperm after 3 d of egg-laying. Even though C. elegans can live for many days after cessation of egg-laying, the molecular physiology of this state has not been as intensely studied as other parts of the life cycle, despite documented changes in behavior and metabolism. To study the effects of sperm depletion and aging of C. elegans during the first 6 d of adulthood, we measured the transcriptomes of first-day adult hermaphrodites and sixth-day sperm-depleted adults, and, at the same time points, mutant fog-2(lf) worms that have a feminized germline phenotype. We found that we could separate the effects of biological aging from sperm depletion. For a large subset of genes, young adult fog-2(lf) animals had the same gene expression changes as sperm-depleted sixth-day wild-type hermaphrodites, and these genes did not change expression when fog-2(lf) females reached the sixth day of adulthood. Taken together, this indicates that changing sperm status causes a change in the internal state of the worm, which we call the female-like state. Our data provide a high-quality picture of the changes that happen in global gene expression throughout the period of early aging in the worm. Copyright © 2017 Angeles-Albores et al.
Campion, Thomas R.; Waitman, Lemuel R.; May, Addison K.; Ozdas, Asli; Lorenzi, Nancy M.; Gadd, Cynthia S.
2009-01-01
Introduction: Evaluations of computerized clinical decision support systems (CDSS) typically focus on clinical performance changes and do not include social, organizational, and contextual characteristics explaining use and effectiveness. Studies of CDSS for intensive insulin therapy (IIT) are no exception, and the literature lacks an understanding of effective computer-based IIT implementation and operation. Results: This paper presents (1) a literature review of computer-based IIT evaluations through the lens of institutional theory, a discipline from sociology and organization studies, to demonstrate the inconsistent reporting of workflow and care process execution and (2) a single-site case study to illustrate how computer-based IIT requires substantial organizational change and creates additional complexity with unintended consequences including error. Discussion: Computer-based IIT requires organizational commitment and attention to site-specific technology, workflow, and care processes to achieve intensive insulin therapy goals. The complex interaction between clinicians, blood glucose testing devices, and CDSS may contribute to workflow inefficiency and error. Evaluations rarely focus on the perspective of nurses, the primary users of computer-based IIT whose knowledge can potentially lead to process and care improvements. Conclusion: This paper addresses a gap in the literature concerning the social, organizational, and contextual characteristics of CDSS in general and for intensive insulin therapy specifically. Additionally, this paper identifies areas for future research to define optimal computer-based IIT process execution: the frequency and effect of manual data entry error of blood glucose values, the frequency and effect of nurse overrides of CDSS insulin dosing recommendations, and comprehensive ethnographic study of CDSS for IIT. PMID:19815452
NASA Technical Reports Server (NTRS)
1981-01-01
Progress in the study of the intensity of the urban heat island is reported. The intensity of the heat island is commonly defined as the temperature difference between the center of the city and the surrounding suburban and rural regions. The intensity is considered as a function of changes in the season and changes in meteorological conditions in order to derive various parameters which may be used in numerical models for urban climate. Twelve case studies were selected and CCT's were ordered. In situ data was obtained from sixteen stations scattered about the city of St. Louis. Upper-air meteorological data were obtained and the water vapor and the temperature data were processed. Atmospheric transmissivities were computed for each of the case studies.
Castaño-Díez, Daniel
2017-01-01
Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and long-term software maintenance. PMID:28580909
Aamodt, Carla B; Virtue, David W; Dobbie, Alison E
2006-05-01
Teaching physical examination skills effectively, consistently, and cost-effectively is challenging. Faculty time is the most expensive resource. One solution is to train medical students using lay physical examination teaching associates. In this study, we investigated the feasibility, acceptability, and cost-effectiveness of training medical students using teaching associates trained by a lay expert instead of a clinician. We used teaching associates to instruct students about techniques of physical examination. We measured students' satisfaction with this teaching approach. We also monitored the financial cost of this approach compared to the previously used approach in which faculty physicians taught physical examination skills. Our program proved practical to accomplish and acceptable to students. Students rated the program highly, and we saved approximately $9,100, compared with our previous faculty-intensive teaching program. We believe that our program is popular with students, cost-effective, and generalizable to other institutions.
A dangerous method? The German discourse on hypnotic suggestion therapy around 1900
Maehle, Andreas-Holger
2017-01-01
In the late nineteenth century, German-speaking physicians and psychiatrists intensely debated the benefits and risks of treatment by hypnotic suggestion. While practitioners of the method sought to provide convincing evidence for its therapeutic efficacy in many medical conditions, especially nervous disorders, critics pointed to dangerous side effects, including the triggering of hysterical attacks or deterioration of nervous symptoms. Other critics claimed that patients merely simulated hypnotic phenomena in order to appease their therapist. A widespread concern was the potential for abuses of hypnosis, either by giving criminal suggestions or in the form of sexual assaults on hypnotized patients. Official inquiries by the Prussian Minister for Religious, Educational and Medical Affairs in 1902 and 1906 indicated that relatively few doctors practised hypnotherapy, whereas the method was increasingly used by lay healers. Although the Ministry found no evidence for serious harm caused by hypnotic treatments, whether performed by doctors or by lay healers, many German doctors seem to have regarded hypnotic suggestion therapy as a problematic method and abstained from using it.
Regional Sustainability: The San Luis Basin Metrics Project
There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute. Moreover, individual metrics may not capture all aspects of a system that are relevant to sust...
Development of a Multidisciplinary Approach to Access Sustainability
There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute the metrics. Moreover, individual metrics do not capture all aspects of a system that are relevan...
Advancing Cyberinfrastructure to support high resolution water resources modeling
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Ogden, F. L.; Jones, N.; Horsburgh, J. S.
2012-12-01
Addressing the problem of how the availability and quality of water resources at large scales are sensitive to climate variability, watershed alterations and management activities requires computational resources that combine data from multiple sources and support integrated modeling. Related cyberinfrastructure challenges include: 1) how can we best structure data and computer models to address this scientific problem through the use of high-performance and data-intensive computing, and 2) how can we do this in a way that discipline scientists without extensive computational and algorithmic knowledge and experience can take advantage of advances in cyberinfrastructure? This presentation will describe a new system called CI-WATER that is being developed to address these challenges and advance high resolution water resources modeling in the Western U.S. We are building on existing tools that enable collaboration to develop model and data interfaces that link integrated system models running within an HPC environment to multiple data sources. Our goal is to enhance the use of computational simulation and data-intensive modeling to better understand water resources. Addressing water resource problems in the Western U.S. requires simulation of natural and engineered systems, as well as representation of legal (water rights) and institutional constraints alongside the representation of physical processes. We are establishing data services to represent the engineered infrastructure and legal and institutional systems in a way that they can be used with high resolution multi-physics watershed modeling at high spatial resolution. These services will enable incorporation of location-specific information on water management infrastructure and systems into the assessment of regional water availability in the face of growing demands, uncertain future meteorological forcings, and existing prior-appropriations water rights. This presentation will discuss the informatics challenges involved with data management and easy-to-use access to high performance computing being tackled in this project.
Lorhan, Shaun; Dennis, Darcy; van der Westhuizen, Michael; Hodgson, Sally; Berrang, Tanya; Daudt, Helena
2014-08-01
To describe the experiences of patients with lung cancer with a volunteer-based lay navigation intervention. Forty patients with newly diagnosed lung cancer enrolled in a three-step navigation intervention delivered by trained volunteer lay navigators (VLNs), beginning prior to their first oncologist's appointment and ending before the start of treatment. Methodological triangulation of data was used in a mixed method study design. Cases were categorized based on the predominant needs met by the VLN: emotional, practical/informational, family, and complex. Data were analyzed using framework analysis. The provision of emotional support, information, and referrals to other services by the VLN were of particular benefit to the patient and their families. Satisfaction with the program and its timing was high; it was considered an effective means for patients to share concerns and have their needs attended to before starting treatment. This study demonstrates capacity for lay volunteers to address the multifaceted needs of lung cancer patients during their transition from primary care in the diagnosis to treatment phase. Using volunteers as navigators offers an opportunity to meet patient needs with minimal resources, increase access to services for patients, and improve the sustainability of the program. Crown Copyright © 2014. Published by Elsevier Ireland Ltd. All rights reserved.
Buetow, Stephen; Henshaw, Jenny; Bryant, Linda; O'Sullivan, Deirdre
2010-01-01
Background. Common but seldom published are Parkinson's disease (PD) medication errors involving late, extra, or missed doses. These errors can reduce medication effectiveness and the quality of life of people with PD and their caregivers. Objective. To explore lay perspectives of factors contributing to medication timing errors for PD in hospital and community settings. Design and Methods. This qualitative research purposively sampled individuals with PD, or a proxy of their choice, throughout New Zealand during 2008-2009. Data collection involved 20 semistructured, personal interviews by telephone. A general inductive analysis of the data identified core insights consistent with the study objective. Results. Five themes help to account for possible timing adherence errors by people with PD, their caregivers or professionals. The themes are the abrupt withdrawal of PD medication; wrong, vague or misread instructions; devaluation of the lay role in managing PD medications; deficits in professional knowledge and in caring behavior around PD in formal health care settings; and lay forgetfulness. Conclusions. The results add to the limited published research on medication errors in PD and help to confirm anecdotal experience internationally. They indicate opportunities for professionals and lay people to work together to reduce errors in the timing of medication for PD in hospital and community settings. PMID:20975777
Improved Optics For Quasi-Elastic Light Scattering
NASA Technical Reports Server (NTRS)
Cheung, Harry Michael
1995-01-01
Improved optical train devised for use in light-scattering measurements of quasi-elastic light scattering (QELS) and laser spectroscopy. Measurements performed on solutions, microemulsions, micellar solutions, and colloidal dispersions. Simultaneous measurements of total intensity and fluctuations in total intensity of light scattered from the sample at various angles provide data used, in conjunction with diffusion coefficients, to compute the sizes of particles in the sample.
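In QELS measurements of this kind, the step from a measured diffusion coefficient to a particle size is conventionally the Stokes-Einstein relation; a small numerical example is shown below (the diffusion coefficient and the water-like viscosity are assumed values, not data from this work):

```python
import math

K_B = 1.380649e-23        # Boltzmann constant, J/K

def hydrodynamic_radius(D, T=298.15, eta=0.89e-3):
    """Stokes-Einstein radius (m) from diffusion coefficient D (m^2/s), temperature T (K)
    and solvent viscosity eta (Pa*s); defaults approximate water at 25 C."""
    return K_B * T / (6.0 * math.pi * eta * D)

D = 4.0e-12               # example diffusion coefficient, m^2/s
print(f"hydrodynamic radius ~ {hydrodynamic_radius(D) * 1e9:.1f} nm")
```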
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.
2008-05-04
This work presents the ScalaBLAST Web Application (SWA), a web-based application implemented using the PHP scripting language, MySQL DBMS, and Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology, and multiple whole genome comparisons which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web-based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.
System on a chip with MPEG-4 capability
NASA Astrophysics Data System (ADS)
Yassa, Fathy; Schonfeld, Dan
2002-12-01
Current products supporting video communication applications rely on existing computer architectures. RISC processors have been used successfully in numerous applications over several decades. DSP processors have become ubiquitous in signal processing and communication applications. Real-time applications such as speech processing in cellular telephony rely extensively on the computational power of these processors. Video processors designed to implement the computationally intensive codec operations have also been used to address the high demands of video communication applications (e.g., cable set-top boxes and DVDs). This paper presents an overview of a system-on-chip (SOC) architecture used for real-time video in wireless communication applications. The SOC specifications respond to the system requirements imposed by the application environment. A CAM-based video processor is used to accelerate data-intensive video compression tasks such as motion estimation and filtering. Other components are dedicated to system-level data processing and audio processing. A rich set of I/Os allows the SOC to communicate with other system components such as baseband and memory subsystems.
NASA Technical Reports Server (NTRS)
Anspaugh, B. E.; Miyahira, T. F.; Weiss, R. S.
1979-01-01
Computed statistical averages and standard deviations with respect to the measured cells for each intensity-temperature measurement condition are presented. Averages and standard deviations of the cell characteristics are displayed in a two-dimensional array format: one dimension representing incoming light intensity and the other the cell temperature. Programs for calculating the temperature coefficients of the pertinent cell electrical parameters are presented, and postirradiation data are summarized.
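As a rough illustration of the kind of tabulation described above (not the original reduction programs), the following Python sketch averages a hypothetical cell parameter over the measured cells and lays the results out as a two-dimensional intensity-by-temperature array; the intensity and temperature values and the parameter itself are purely illustrative assumptions.

```python
import numpy as np

# Hypothetical measurements: one I-V derived parameter (e.g. short-circuit
# current) for each cell, repeated at every intensity/temperature condition.
rng = np.random.default_rng(0)
intensities = np.array([50.0, 100.0, 135.3])      # mW/cm^2 (illustrative)
temperatures = np.array([-20.0, 28.0, 60.0])      # deg C (illustrative)
n_cells = 25
# measurements[i, j, k] = parameter of cell k at intensity i, temperature j
measurements = rng.normal(loc=140.0, scale=3.0,
                          size=(len(intensities), len(temperatures), n_cells))

# Average and standard deviation over the measured cells, laid out as a
# two-dimensional array: rows = intensity, columns = temperature.
cell_mean = measurements.mean(axis=2)
cell_std = measurements.std(axis=2, ddof=1)

for i, inten in enumerate(intensities):
    for j, temp in enumerate(temperatures):
        print(f"{inten:7.1f} mW/cm^2, {temp:6.1f} C: "
              f"mean = {cell_mean[i, j]:7.2f}, std = {cell_std[i, j]:5.2f}")
```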
Cloud-based Jupyter Notebooks for Water Data Analysis
NASA Astrophysics Data System (ADS)
Castronova, A. M.; Brazil, L.; Seul, M.
2017-12-01
The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA enables researchers to easily prototype and execute data intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative and reproducible science.
Professional and lay people perceptions of anterior maxillary esthetics
NASA Astrophysics Data System (ADS)
Roslan, Husniyati; Lillywhite, Graeme
2016-12-01
Achieving esthetic outcomes with implant-based restorations in the esthetic zone is a challenge due to the difficulty in replacing lost papillae. This study aimed to assess the influence of contact point position on the overall perception of esthetics as assessed by dental professionals and lay people. A cross-sectional study was conducted using a self-administered questionnaire distributed among 300 prosthodontists, general dentists, and lay people in the United Kingdom. The questionnaire consisted of photographic images of a smile, intentionally altered using image manipulation software. Variations in contact length between maxillary central incisors were created to mimic the clinical situation when missing teeth were replaced with implant-supported crowns. These images were rated using a visual analogue scale (VAS). One-way and two-way ANOVAs, and Tukey's test were used to analyze the data. The overall response rate by the three groups was 72%. Lay people and general dentists were more critical than prosthodontists in all VAS ratings (p < 0.001). Overall, all the groups perceived that the esthetic value decreased as the contact point length increased.
Ware, Norma C; Pisarski, Emily E; Haberer, Jessica E; Wyatt, Monique A; Tumwesigye, Elioda; Baeten, Jared M; Celum, Connie L; Bangsberg, David R
2015-05-01
Effectiveness of antiretroviral pre-exposure prophylaxis (PrEP) for HIV prevention will require high adherence. Using qualitative data, this paper identifies potential lay social resources for support of PrEP adherence by HIV serodiscordant couples in Uganda, laying the groundwork for incorporation of these resources into adherence support initiatives as part of implementation. The qualitative analysis characterizes support for PrEP adherence provided by HIV-infected spouses, children, extended family members, and the larger community. Results suggest social resources for support of PrEP adherence in Africa are plentiful outside formal health care settings and health systems and that couples will readily use them. The same shortage of health professionals that impeded scale-up of antiretroviral treatment for HIV/AIDS in Africa promises to challenge delivery of PrEP. Building on the treatment scale-up experience, implementers can address this challenge by examining the value of lay social resources for adherence support in developing strategies for delivery of PrEP.
Genomic cloud computing: legal and ethical points to consider
Dove, Edward S; Joly, Yann; Tassé, Anne-Marie; Burton, Paul; Chisholm, Rex; Fortier, Isabel; Goodwin, Pat; Harris, Jennifer; Hveem, Kristian; Kaye, Jane; Kent, Alistair; Knoppers, Bartha Maria; Lindpaintner, Klaus; Little, Julian; Riegman, Peter; Ripatti, Samuli; Stolk, Ronald; Bobrow, Martin; Cambon-Thomsen, Anne; Dressler, Lynn; Joly, Yann; Kato, Kazuto; Knoppers, Bartha Maria; Rodriguez, Laura Lyman; McPherson, Treasa; Nicolás, Pilar; Ouellette, Francis; Romeo-Casabona, Carlos; Sarin, Rajiv; Wallace, Susan; Wiesner, Georgia; Wilson, Julia; Zeps, Nikolajs; Simkevitz, Howard; De Rienzo, Assunta; Knoppers, Bartha M
2015-01-01
The biggest challenge in twenty-first century data-intensive genomic science is developing vast computer infrastructure and advanced software tools to perform comprehensive analyses of genomic data sets for biomedical research and clinical practice. Researchers are increasingly turning to cloud computing both as a solution to integrate data from genomics, systems biology and biomedical data mining and as an approach to analyze data to solve biomedical problems. Although cloud computing provides several benefits such as lower costs and greater efficiency, it also raises legal and ethical issues. In this article, we discuss three key 'points to consider' (data control; data security, confidentiality and transfer; and accountability) based on a preliminary review of several publicly available cloud service providers' Terms of Service. These 'points to consider' should be borne in mind by genomic research organizations when negotiating legal arrangements to store genomic data on a large commercial cloud service provider's servers. Diligent genomic cloud computing means leveraging security standards and evaluation processes as a means to protect data and entails many of the same good practices that researchers should always consider in securing their local infrastructure. PMID:25248396
Genomic cloud computing: legal and ethical points to consider.
Dove, Edward S; Joly, Yann; Tassé, Anne-Marie; Knoppers, Bartha M
2015-10-01
The biggest challenge in twenty-first century data-intensive genomic science is developing vast computer infrastructure and advanced software tools to perform comprehensive analyses of genomic data sets for biomedical research and clinical practice. Researchers are increasingly turning to cloud computing both as a solution to integrate data from genomics, systems biology and biomedical data mining and as an approach to analyze data to solve biomedical problems. Although cloud computing provides several benefits such as lower costs and greater efficiency, it also raises legal and ethical issues. In this article, we discuss three key 'points to consider' (data control; data security, confidentiality and transfer; and accountability) based on a preliminary review of several publicly available cloud service providers' Terms of Service. These 'points to consider' should be borne in mind by genomic research organizations when negotiating legal arrangements to store genomic data on a large commercial cloud service provider's servers. Diligent genomic cloud computing means leveraging security standards and evaluation processes as a means to protect data and entails many of the same good practices that researchers should always consider in securing their local infrastructure.
Shultz, M.T.; Piatt, John F.; Harding, A.M.A.; Kettle, Arthur B.; van Pelt, Thomas I.
2009-01-01
Seabirds are thought to time breeding to match the seasonal peak of food availability with peak chick energetic demands, but warming ocean temperatures have altered the timing of spring events, creating the potential for mismatches. The resilience of seabird populations to climate change depends on their ability to anticipate changes in the timing and magnitude of peak food availability and 'fine-tune' efforts to match ('Anticipation Hypothesis'). The degree to which inter-annual variation in seabird timing of breeding and reproductive performance represents anticipated food availability versus energetic constraints ('Constraint Hypothesis') is poorly understood. We examined the relative merits of the Constraint and Anticipation Hypotheses by testing 2 predictions of the Constraint Hypothesis: (1) seabird timing of breeding is related to food availability prior to egg laying rather than the date of peak food availability, (2) initial reproductive output (e.g. laying success, clutch size) is related to pre-lay food availability rather than anticipated chick-rearing food availability. We analyzed breeding biology data of common murres Uria aalge and black-legged kittiwakes Rissa tridactyla and 2 proxies of the seasonal dynamics of their food availability (near-shore forage fish abundance and sea-surface temperature) at 2 colonies in Lower Cook Inlet, Alaska, USA, from 1996 to 1999. Our results support the Constraint Hypothesis: (1) for both species, egg laying was later in years with warmer sea-surface temperature and lower food availability prior to egg laying, but was not related to the date of peak food availability, (2) pre-egg laying food availability explained variation in kittiwake laying success and clutch size. Murre reproductive success was best explained by food availability during chick rearing. © 2009 Inter-Research.
Caputo, Maria Luce; Muschietti, Sandro; Burkart, Roman; Benvenuti, Claudio; Conte, Giulio; Regoli, François; Mauri, Romano; Klersy, Catherine; Moccetti, Tiziano; Auricchio, Angelo
2017-05-01
We compared the time to initiation of cardiopulmonary resuscitation (CPR) by lay responders and/or first responders alerted either via Short Message Service (SMS) or by using a mobile application-based alert system (APP). The Ticino Registry of Cardiac Arrest collects all data about out-of-hospital cardiac arrests (OHCAs) occurring in the Canton of Ticino. At the time of a bystander's call, the EMS dispatcher sends one ambulance and alerts the first-responders network made up of police officers or fire brigade equipped with an automatic external defibrillator, the so-called "traditional" first responders, and - if the scene was considered safe - lay responders as well. We evaluated the time from call to arrival of traditional first responders and/or lay responders when alerted either via SMS or the newly developed mobile APP. Over the study period, 593 OHCAs occurred. Notification to the first responders network was sent via SMS in 198 cases and via mobile APP in 134 cases. Median time to first responder/lay responder arrival on scene was significantly reduced by the APP-based system (3.5 [2.8-5.2] min) compared to the SMS-based system (5.6 [4.2-8.5] min, p < 0.0001). The proportion of lay responders arriving first on the scene significantly increased (70% vs. 15%, p<0.01) with the APP. Earlier arrival of a first responder or of a lay responder determined a higher survival rate. The mobile APP system is highly efficient in the recruitment of first responders, significantly reducing the time to the initiation of CPR and thus increasing survival rates. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Sidhu, Manbinder S; Gale, Nicola K; Gill, Paramjit; Marshall, Tom; Jolly, Kate
2015-02-07
Self-management education is at the forefront of addressing the increasing prevalence of chronic diseases. For those at greatest risk, such as minority-ethnic and/or socio-economically deprived groups, self-management education can be culturally-tailored to encourage behavioural change. Yet, the application of culturally appropriate material and expertise within health promotion services continues to be debated. We critique the design, implementation, and delivery of a culturally-tailored self-management intervention, with particular focus on the experiences of lay educators. A mixed methods qualitative evaluation was undertaken to understand self-management service provision to culturally diverse communities (i.e. how components such as lay workers, group-based design, and culturally-appropriate educational material are intended to encourage behavioural change). We interviewed lay educators delivering the Chronic Disease Educator programme along with attendees, whilst observing workshops. Data were thematically analysed using a content-based constant comparison approach through a number of interpretative analytical stages. Lay educators felt part of the local community, relating to attendees from different races and ethnicities. However, lay educators faced challenges when addressing health beliefs and changing lifestyle practices. Culturally-tailored components aided communication, with educator's cultural awareness leading to close relationships with attendees, while the group-based design facilitated discussions of the emotional impact of illness. Lay educators bring with them a number of nuanced skills and knowledge when delivering self-management education. The development and training required for this role is inhibited by financial constraints at policy-level. The interpretation of being from the 'community' links with the identity and status of the lay role, overlapping notions of race, ethnicity, and language.
Werner, Perla; Heinik, Jeremia; Giveon, Shmuel; Segel-Karpas, Dikla; Kitai, Eliezer
2014-01-01
Mild cognitive impairment (MCI) or mild neurocognitive disorder is a well-established clinical entity included in current diagnostic guidelines for Alzheimer's disease and in major psychiatric classifications. In all, a loosely defined concern obtained from conceptually different sources (the individual, a knowledgeable informant, or a clinician) regarding a decline in cognition and change in functioning constitutes a sine qua non for initiating diagnostics and providing therapy and support. This concern in practice may translate into complex proactive help-seeking behavior. A better understanding of help-seeking preferences is required in order to promote early detection and management. To compare help-seeking preferences of family physicians and the lay public in the area of MCI. A structured questionnaire was used to collect data from 197 family physicians (self-administered) and 517 persons aged 45 and over from the lay public (face to face). Information regarding familiarity with MCI and help-seeking preferences was assessed. The vast majority in both samples reported that family physician, spouse, and children are the most highly recommended sources of help-seeking. In regard to professional sources of help-seeking, a higher percentage of the physicians than the lay public sample consistently recommended seeking help from nurses and social workers and psychiatrists, but a higher percentage of the lay public recommended turning to a neurologist for help. There were both similarities and differences between family physicians and the lay public in their preferences regarding help-seeking for a person with MCI. Most prominent is the physicians' greater tendency to recommend professional sources of help-seeking. Understanding of help-seeking preferences of both physicians and lay persons might help overcome barriers for establishing diagnosis, receiving care, and improving communication between doctors and patients.
Werner, Perla; Heinik, Jeremia; Giveon, Shmuel; Segel-Karpas, Dikla; Kitai, Eliezer
2014-01-01
Background Mild cognitive impairment (MCI) or mild neurocognitive disorder is a well-established clinical entity included in current diagnostic guidelines for Alzheimer’s disease and in major psychiatric classifications. In all, a loosely defined concern obtained from conceptually different sources (the individual, a knowledgeable informant, or a clinician) regarding a decline in cognition and change in functioning constitutes a sine qua non for initiating diagnostics and providing therapy and support. This concern in practice may translate into complex proactive help-seeking behavior. A better understanding of help-seeking preferences is required in order to promote early detection and management. Objectives To compare help-seeking preferences of family physicians and the lay public in the area of MCI. Methods A structured questionnaire was used to collect data from 197 family physicians (self-administered) and 517 persons aged 45 and over from the lay public (face to face). Information regarding familiarity with MCI and help-seeking preferences was assessed. Results The vast majority in both samples reported that family physician, spouse, and children are the most highly recommended sources of help-seeking. In regard to professional sources of help-seeking, a higher percentage of the physicians than the lay public sample consistently recommended seeking help from nurses and social workers and psychiatrists, but a higher percentage of the lay public recommended turning to a neurologist for help. Discussion There were both similarities and differences between family physicians and the lay public in their preferences regarding help-seeking for a person with MCI. Most prominent is the physicians’ greater tendency to recommend professional sources of help-seeking. Conclusion Understanding of help-seeking preferences of both physicians and lay persons might help overcome barriers for establishing diagnosis, receiving care, and improving communication between doctors and patients. PMID:24748779
Automated Tape Laying Machine for Composite Structures.
The invention comprises an automated tape laying machine for laying tape on a composite structure. The tape laying machine has a tape laying head...neatly cut. The automated tape laying device utilizes narrow-width tape to increase machine flexibility and reduce wastage.
Diversity in computing technologies and strategies for dynamic resource allocation
Garzoglio, G.; Gutsche, O.
2015-12-23
High Energy Physics (HEP) is a very data-intensive and trivially parallelizable science discipline. HEP is probing nature in increasingly fine detail, requiring ever-increasing computational resources to process and analyze experimental data. In this paper, we discuss how HEP has provisioned resources so far using Grid technologies, how HEP is starting to include new resource providers such as commercial Clouds and HPC installations, and how HEP is transparently provisioning resources at these diverse providers.
ESnet: Large-Scale Science and Data Management (LBNL Summer Lecture Series)
Johnston, Bill
2017-12-09
Summer Lecture Series 2004: Bill Johnston of Berkeley Lab's Computing Sciences is a distinguished networking and computing researcher. He managed the Energy Sciences Network (ESnet), a leading-edge, high-bandwidth network funded by DOE's Office of Science. ESnet is used for everything from videoconferencing to climate modeling and is flexible enough to accommodate a wide variety of data-intensive applications and services; its traffic volume is doubling every year and currently surpasses 200 terabytes per month.
Evolution of the ATLAS PanDA workload management system for exascale computational science
NASA Astrophysics Data System (ADS)
Maeno, T.; De, K.; Klimentov, A.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Schovancova, J.; Vaniachine, A.; Wenaus, T.; Yu, D.; Atlas Collaboration
2014-06-01
An important foundation underlying the impressive success of data processing and analysis in the ATLAS experiment [1] at the LHC [2] is the Production and Distributed Analysis (PanDA) workload management system [3]. PanDA was designed specifically for ATLAS and proved to be highly successful in meeting all the distributed computing needs of the experiment. However, the core design of PanDA is not experiment specific. The PanDA workload management system is capable of meeting the needs of other data intensive scientific applications. Alpha-Magnetic Spectrometer [4], an astro-particle experiment on the International Space Station, and the Compact Muon Solenoid [5], an LHC experiment, have successfully evaluated PanDA and are pursuing its adoption. In this paper, a description of the new program of work to develop a generic version of PanDA will be given, as well as the progress in extending PanDA's capabilities to support supercomputers and clouds and to leverage intelligent networking. PanDA has demonstrated at a very large scale the value of automated dynamic brokering of diverse workloads across distributed computing resources. The next generation of PanDA will allow other data-intensive sciences and a wider exascale community employing a variety of computing platforms to benefit from ATLAS' experience and proven tools.
A single-cell spiking model for the origin of grid-cell patterns
Kempter, Richard
2017-01-01
Spatial cognition in mammals is thought to rely on the activity of grid cells in the entorhinal cortex, yet the fundamental principles underlying the origin of grid-cell firing are still debated. Grid-like patterns could emerge via Hebbian learning and neuronal adaptation, but current computational models remained too abstract to allow direct confrontation with experimental data. Here, we propose a single-cell spiking model that generates grid firing fields via spike-rate adaptation and spike-timing dependent plasticity. Through rigorous mathematical analysis applicable in the linear limit, we quantitatively predict the requirements for grid-pattern formation, and we establish a direct link to classical pattern-forming systems of the Turing type. Our study lays the groundwork for biophysically-realistic models of grid-cell activity. PMID:28968386
Skills and Knowledge for Data-Intensive Environmental Research
Hampton, Stephanie E.; Jones, Matthew B.; Wasser, Leah A.; Schildhauer, Mark P.; Supp, Sarah R.; Brun, Julien; Hernandez, Rebecca R.; Boettiger, Carl; Collins, Scott L.; Gross, Louis J.; Fernández, Denny S.; Budden, Amber; White, Ethan P.; Teal, Tracy K.; Aukema, Juliann E.
2017-01-01
The scale and magnitude of complex and pressing environmental issues lend urgency to the need for integrative and reproducible analysis and synthesis, facilitated by data-intensive research approaches. However, the recent pace of technological change has been such that appropriate skills to accomplish data-intensive research are lacking among environmental scientists, who more than ever need greater access to training and mentorship in computational skills. Here, we provide a roadmap for raising data competencies of current and next-generation environmental researchers by describing the concepts and skills needed for effectively engaging with the heterogeneous, distributed, and rapidly growing volumes of available data. We articulate five key skills: (1) data management and processing, (2) analysis, (3) software skills for science, (4) visualization, and (5) communication methods for collaboration and dissemination. We provide an overview of the current suite of training initiatives available to environmental scientists and models for closing the skill-transfer gap. PMID:28584342
Skills and Knowledge for Data-Intensive Environmental Research.
Hampton, Stephanie E; Jones, Matthew B; Wasser, Leah A; Schildhauer, Mark P; Supp, Sarah R; Brun, Julien; Hernandez, Rebecca R; Boettiger, Carl; Collins, Scott L; Gross, Louis J; Fernández, Denny S; Budden, Amber; White, Ethan P; Teal, Tracy K; Labou, Stephanie G; Aukema, Juliann E
2017-06-01
The scale and magnitude of complex and pressing environmental issues lend urgency to the need for integrative and reproducible analysis and synthesis, facilitated by data-intensive research approaches. However, the recent pace of technological change has been such that appropriate skills to accomplish data-intensive research are lacking among environmental scientists, who more than ever need greater access to training and mentorship in computational skills. Here, we provide a roadmap for raising data competencies of current and next-generation environmental researchers by describing the concepts and skills needed for effectively engaging with the heterogeneous, distributed, and rapidly growing volumes of available data. We articulate five key skills: (1) data management and processing, (2) analysis, (3) software skills for science, (4) visualization, and (5) communication methods for collaboration and dissemination. We provide an overview of the current suite of training initiatives available to environmental scientists and models for closing the skill-transfer gap.
Design for cyclic loading endurance of composites
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Murthy, Pappu L. N.; Chamis, Christos C.; Liaw, Leslie D. G.
1993-01-01
The application of the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures) to aircraft wing-type structures is described. The code performs a complete probabilistic analysis for composites taking into account the uncertainties in geometry, boundary conditions, material properties, laminate lay-ups, and loads. Results of the analysis are presented in terms of cumulative distribution functions (CDFs) and probability density functions (PDFs) of the fatigue life of a wing-type composite structure under different hygrothermal environments subjected to random pressure loading. The sensitivity of the fatigue life to a number of critical structural/material variables is also computed from the analysis.
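As a hedged illustration of how a cumulative distribution function of fatigue life can be produced from uncertain inputs (a generic Monte Carlo sketch, not the IPACS code), the Python example below samples a Basquin-type life model; the input distributions and parameter values are assumptions chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples = 20_000

# Illustrative uncertain inputs (NOT the IPACS variables): fatigue strength
# coefficient, fatigue strength exponent, and applied stress amplitude.
sigma_f = rng.normal(2000.0, 100.0, n_samples)   # MPa
b = rng.normal(-0.10, 0.005, n_samples)          # dimensionless, negative
stress = rng.lognormal(mean=np.log(300.0), sigma=0.08, size=n_samples)  # MPa

# Basquin-type life model: stress = sigma_f * (2N)^b  =>  N = 0.5*(stress/sigma_f)^(1/b)
life_cycles = 0.5 * (stress / sigma_f) ** (1.0 / b)

# Empirical CDF of fatigue life (life_sorted vs. cdf would be the plotted curve).
life_sorted = np.sort(life_cycles)
cdf = np.arange(1, n_samples + 1) / n_samples
for p in (0.01, 0.10, 0.50):
    print(f"P(life <= {np.quantile(life_cycles, p):.3e} cycles) = {p:.2f}")
```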
QUANTUM: The Exhibition - quantum at the museum
NASA Astrophysics Data System (ADS)
Laforest, Martin; Olano, Angela; Day-Hamilton, Tobi
Distilling the essence of quantum phenomena, and how they are being harnessed to develop powerful quantum technologies, into a series of bite-sized, elementary-school-level pieces is what the scientific outreach team at the University of Waterloo's Institute for Quantum Computing was tasked with. QUANTUM: The Exhibition uses a series of informational panels, multimedia and interactive displays to introduce visitors to quantum phenomena and how they will revolutionize computing, information security and sensing. We'll discuss some of the approaches we took to convey the essence and impact of quantum mechanics and technologies to a lay audience while ensuring scientific accuracy.
Computational inverse methods of heat source in fatigue damage problems
NASA Astrophysics Data System (ADS)
Chen, Aizhou; Li, Yuan; Yan, Bo
2018-04-01
Fatigue dissipation energy is a current research focus in the field of fatigue damage. Introducing inverse heat-source methods into parameter identification for fatigue dissipation energy models is a new approach to the problem of calculating this energy. This paper reviews research advances in computational inverse methods for heat sources and in regularization techniques for solving the inverse problem, as well as existing heat-source solution methods for the fatigue process; it discusses the prospects of applying inverse heat-source methods in the fatigue damage field and lays the foundation for further improving the effectiveness of rapid prediction of fatigue dissipation energy.
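The following Python sketch illustrates one common form of regularized heat-source inversion, Tikhonov-regularized linear least squares; the linear forward model, noise level, and regularization weight are illustrative assumptions, not the specific methods surveyed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative linear forward model T = G q: each column of G is the (assumed
# known) temperature response to a unit heat source acting in one time bin.
n_times, n_sources = 120, 40
t = np.linspace(0.0, 1.0, n_times)
tau = np.linspace(0.0, 1.0, n_sources)
G = np.exp(-np.maximum(t[:, None] - tau[None, :], 0.0) / 0.05)
G[t[:, None] < tau[None, :]] = 0.0           # causal response only

q_true = np.exp(-((tau - 0.5) ** 2) / 0.01)  # "true" heat-source history
T_meas = G @ q_true + rng.normal(0.0, 0.05, n_times)  # noisy temperatures

# Tikhonov-regularized least squares: minimize ||G q - T||^2 + lam * ||q||^2
lam = 1e-2
q_hat = np.linalg.solve(G.T @ G + lam * np.eye(n_sources), G.T @ T_meas)

print("relative reconstruction error:",
      np.linalg.norm(q_hat - q_true) / np.linalg.norm(q_true))
```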
What is Data-Intensive Science?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Critchlow, Terence J.; Kleese van Dam, Kerstin
2013-06-03
What is Data Intensive Science? Today we are living in a digital world, where scientists often no longer interact directly with the physical object of their research, but do so via digitally captured, reduced, calibrated, analyzed, synthesized and, at times, visualized data. Advances in experimental and computational technologies have led to an exponential growth in the volumes, variety and complexity of this data and while the deluge is not happening everywhere in an absolute sense, it is in a relative one. Science today is data intensive. Data intensive science has the potential to transform not only how we do science, but how quickly we can translate scientific progress into complete solutions, policies, decisions and ultimately economic success. Critically, data intensive science touches some of the most important challenges we are facing. Consider a few of the grand challenges outlined by the U.S. National Academy of Engineering: make solar energy economical, provide energy from fusion, develop carbon sequestration methods, advance health informatics, engineer better medicines, secure cyberspace, and engineer the tools of scientific discovery. Arguably, meeting any of these challenges requires the collaborative effort of trans-disciplinary teams, but also significant contributions from enabling data intensive technologies. Indeed for many of them, advances in data intensive research will be the single most important factor in developing successful and timely solutions. Simple extrapolations of how we currently interact with and utilize data and knowledge are not sufficient to meet this need. Given the importance of these challenges, a new, bold vision for the role of data in science, and indeed how research will be conducted in a data intensive environment is evolving.
Contextual classification of multispectral image data: Approximate algorithm
NASA Technical Reports Server (NTRS)
Tilton, J. C. (Principal Investigator)
1980-01-01
A computationally less intensive approximation to a classification algorithm that incorporates spatial context information in a general, statistical manner is presented. The approximation produces classifications that are nearly as accurate.
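As a toy illustration of how spatial context can refine per-pixel classifications (a simple majority-vote smoother, not the approximate algorithm of the report), consider the following Python/SciPy sketch; the scene, noise rate, and window size are hypothetical.

```python
import numpy as np
from scipy.ndimage import generic_filter

def majority_vote(window):
    """Return the most frequent class label in the local window."""
    vals, counts = np.unique(window.astype(int), return_counts=True)
    return vals[np.argmax(counts)]

# Hypothetical per-pixel (non-contextual) classification of a small scene.
rng = np.random.default_rng(3)
labels = np.where(np.add.outer(np.arange(60), np.arange(60)) > 60, 1, 0)
noisy = np.where(rng.random(labels.shape) < 0.15, 1 - labels, labels)

# Contextual relabeling: 3x3 majority vote around every pixel.
smoothed = generic_filter(noisy, majority_vote, size=3, mode="nearest")
print("noisy error rate:   ", (noisy != labels).mean())
print("smoothed error rate:", (smoothed != labels).mean())
```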
Regional sustainable environmental management: sustainability metrics research for decision makers
There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute. Moreover, individual metrics may not capture all aspects of a system that are relevant to sust...
Development of a multidisciplinary approach to assess regional sustainability
There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute the metrics. Moreover, individual metrics do not capture all aspects of a system that are relev...
A Bayesian and Physics-Based Ground Motion Parameters Map Generation System
NASA Astrophysics Data System (ADS)
Ramirez-Guzman, L.; Quiroz, A.; Sandoval, H.; Perez-Yanez, C.; Ruiz, A. L.; Delgado, R.; Macias, M. A.; Alcántara, L.
2014-12-01
We present the Ground Motion Parameters Map Generation (GMPMG) system developed by the Institute of Engineering at the National Autonomous University of Mexico (UNAM). The system delivers estimates of information associated with the social impact of earthquakes, engineering ground motion parameters (gmp), and macroseismic intensity maps. The gmp calculated are peak ground acceleration and velocity (pga and pgv) and response spectral acceleration (SA). The GMPMG relies on real-time data received from strong ground motion stations belonging to UNAM's networks throughout Mexico. Data are gathered via satellite and internet service providers, and managed with the data acquisition software Earthworm. The system is self-contained and can perform all calculations required for estimating gmp and intensity maps due to earthquakes, automatically or manually. Initial data processing is performed by baseline-correcting the records and removing those containing glitches or low signal-to-noise ratios. The system then assigns a hypocentral location using first arrivals and a simplified 3D model, followed by a moment tensor inversion, which is performed using a pre-calculated Receiver Green's Tensors (RGT) database for a realistic 3D model of Mexico. A backup system to compute epicentral location and magnitude is in place. Bayesian Kriging is employed to combine recorded values with grids of computed gmp. The latter are obtained by using appropriate ground motion prediction equations (for pgv, pga and SA with T=0.3, 0.5, 1, and 1.5 s) and numerical simulations performed in real time, using the aforementioned RGT database (for SA with T=2, 2.5 and 3 s). Estimated intensity maps are then computed using SA(T=2 s) to Modified Mercalli Intensity correlations derived for central Mexico. The maps are made available to the institutions in charge of the disaster prevention systems. In order to analyze the accuracy of the maps, we compare them against observations not considered in the computations, and present some examples of recent earthquakes. We conclude that the system provides information with a fair goodness-of-fit against observations. This project is partially supported by DGAPA-PAPIIT (UNAM) project TB100313-RR170313.
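The sketch below illustrates, in highly simplified form, the idea of blending station observations with a grid of predicted ground motions by spreading station residuals with distance weights; it is a toy stand-in for the Bayesian Kriging step, and all coordinates, amplitudes, and correlation lengths are assumptions.

```python
import numpy as np

# Hypothetical grid of GMPE-predicted PGA (in g) and a few station observations.
gx, gy = np.meshgrid(np.linspace(0, 100, 101), np.linspace(0, 100, 101))
pga_pred = 0.30 * np.exp(-np.hypot(gx - 50, gy - 50) / 40.0)

sta_xy = np.array([[20.0, 30.0], [55.0, 48.0], [80.0, 75.0]])
sta_obs = np.array([0.12, 0.34, 0.05])

# Residuals (observed minus predicted) at the stations, spread onto the grid
# with Gaussian distance weights -- a crude stand-in for kriging the residuals.
corr_len = 15.0
num = np.zeros_like(pga_pred)
den = np.zeros_like(pga_pred)
for (sx, sy), obs in zip(sta_xy, sta_obs):
    pred_at_sta = 0.30 * np.exp(-np.hypot(sx - 50, sy - 50) / 40.0)
    w = np.exp(-0.5 * ((gx - sx) ** 2 + (gy - sy) ** 2) / corr_len**2)
    num += w * (obs - pred_at_sta)
    den += w

pga_map = pga_pred + num / (den + 1e-6)   # blended estimate on the grid
print("max blended PGA:", pga_map.max())
```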
Dynamic array processing for computationally intensive expert systems in CLIPS
NASA Technical Reports Server (NTRS)
Athavale, N. N.; Ragade, R. K.; Fenske, T. E.; Cassaro, M. A.
1990-01-01
This paper puts forth an architecture for implementing loops over an advanced array data structure in CLIPS. An attempt is made to use multi-field variables in such an architecture to process a set of data during the decision-making cycle. Current limitations of expert system shells are also discussed briefly. The resulting architecture is designed to circumvent the current limitations set by the expert system shell and also by the operating environment. Such advanced data structures are needed for tightly coupling symbolic and numeric computation modules.
Deformable registration of CT and cone-beam CT with local intensity matching.
Park, Seyoun; Plishker, William; Quon, Harry; Wong, John; Shekhar, Raj; Lee, Junghoon
2017-02-07
Cone-beam CT (CBCT) is a widely used intra-operative imaging modality in image-guided radiotherapy and surgery. A short scan followed by a filtered-backprojection is typically used for CBCT reconstruction. While data on the mid-plane (plane of source-detector rotation) is complete, off-mid-planes undergo different information deficiency and the computed reconstructions are approximate. This causes different reconstruction artifacts at off-mid-planes depending on slice locations, and therefore impedes accurate registration between CT and CBCT. In this paper, we propose a method to accurately register CT and CBCT by iteratively matching local CT and CBCT intensities. We correct CBCT intensities by matching local intensity histograms slice by slice in conjunction with intensity-based deformable registration. The correction-registration steps are repeated in an alternating way until the result image converges. We integrate the intensity matching into three different deformable registration methods, B-spline, demons, and optical flow that are widely used for CT-CBCT registration. All three registration methods were implemented on a graphics processing unit for efficient parallel computation. We tested the proposed methods on twenty five head and neck cancer cases and compared the performance with state-of-the-art registration methods. Normalized cross correlation (NCC), structural similarity index (SSIM), and target registration error (TRE) were computed to evaluate the registration performance. Our method produced overall NCC of 0.96, SSIM of 0.94, and TRE of 2.26 → 2.27 mm, outperforming existing methods by 9%, 12%, and 27%, respectively. Experimental results also show that our method performs consistently and is more accurate than existing algorithms, and also computationally efficient.
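A minimal sketch of the general idea of slice-wise intensity correction, matching each CBCT slice's intensity distribution to the corresponding CT slice via quantile mapping, is given below in Python; it is not the authors' implementation, and the toy volumes are assumptions.

```python
import numpy as np

def match_slice_histogram(cbct_slice, ct_slice):
    """Map CBCT slice intensities onto the CT slice's intensity distribution
    via quantile (histogram) matching."""
    src = cbct_slice.ravel()
    ref = np.sort(ct_slice.ravel())
    # Rank of every CBCT voxel within its own slice, scaled to [0, 1].
    ranks = np.argsort(np.argsort(src)) / (src.size - 1)
    matched = np.interp(ranks, np.linspace(0.0, 1.0, ref.size), ref)
    return matched.reshape(cbct_slice.shape)

def match_volume_slicewise(cbct, ct):
    """Apply histogram matching slice by slice along the first (axial) axis."""
    return np.stack([match_slice_histogram(cb, c) for cb, c in zip(cbct, ct)])

# Toy volumes standing in for a registered CT and an artifact-corrupted CBCT.
rng = np.random.default_rng(0)
ct = rng.normal(0.0, 50.0, size=(8, 64, 64))
cbct = 0.8 * ct + 30.0 + rng.normal(0.0, 5.0, size=ct.shape)  # shading-like bias
corrected = match_volume_slicewise(cbct, ct)
print("mean abs difference before:", np.abs(cbct - ct).mean())
print("mean abs difference after: ", np.abs(corrected - ct).mean())
```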
Deformable registration of CT and cone-beam CT with local intensity matching
NASA Astrophysics Data System (ADS)
Park, Seyoun; Plishker, William; Quon, Harry; Wong, John; Shekhar, Raj; Lee, Junghoon
2017-02-01
Cone-beam CT (CBCT) is a widely used intra-operative imaging modality in image-guided radiotherapy and surgery. A short scan followed by a filtered-backprojection is typically used for CBCT reconstruction. While data on the mid-plane (plane of source-detector rotation) is complete, off-mid-planes undergo different information deficiency and the computed reconstructions are approximate. This causes different reconstruction artifacts at off-mid-planes depending on slice locations, and therefore impedes accurate registration between CT and CBCT. In this paper, we propose a method to accurately register CT and CBCT by iteratively matching local CT and CBCT intensities. We correct CBCT intensities by matching local intensity histograms slice by slice in conjunction with intensity-based deformable registration. The correction-registration steps are repeated in an alternating way until the result image converges. We integrate the intensity matching into three different deformable registration methods, B-spline, demons, and optical flow that are widely used for CT-CBCT registration. All three registration methods were implemented on a graphics processing unit for efficient parallel computation. We tested the proposed methods on twenty five head and neck cancer cases and compared the performance with state-of-the-art registration methods. Normalized cross correlation (NCC), structural similarity index (SSIM), and target registration error (TRE) were computed to evaluate the registration performance. Our method produced overall NCC of 0.96, SSIM of 0.94, and TRE of 2.26 → 2.27 mm, outperforming existing methods by 9%, 12%, and 27%, respectively. Experimental results also show that our method performs consistently and is more accurate than existing algorithms, and also computationally efficient.
A spatial and genetic analysis of Cowbird host selection
Hahn, D.C.; Sedgwick, J.A.; Painter, I.S.; Casna, N.J.; Morrison, Michael L.; Hall, Linnea S.; Robinson, Scott K.; Rothstein, Stephen I.; Hahn, D. Caldwell; Rich, Terrell D.
1999-01-01
Our study of brood parasitism patterns in forest communities revealed the egg-laying frequency and host selection patterns of female cowbirds. By integrating molecular genetics and spatial data, we provide the first published estimate of cowbird laying rates from field studies. The 29 females in the study laid only 1-5 eggs each, much lower than previous estimates from captive cowbirds and extrapolations from ovarian development in capture/recapture studies that had suggested that as many as 40 eggs could be laid per individual cowbird. Cowbird females also were shown for the first time to lay significantly more eggs within the home range areas they established than outside them. No patterns were uncovered for individual females preferentially parasitizing particular host species.
A revised ground-motion and intensity interpolation scheme for shakemap
Worden, C.B.; Wald, D.J.; Allen, T.I.; Lin, K.; Garcia, D.; Cua, G.
2010-01-01
We describe a weighted-average approach for incorporating various types of data (observed peak ground motions and intensities and estimates from ground-motion prediction equations) into the ShakeMap ground motion and intensity mapping framework. This approach represents a fundamental revision of our existing ShakeMap methodology. In addition, the increased availability of near-real-time macroseismic intensity data, the development of new relationships between intensity and peak ground motions, and new relationships to directly predict intensity from earthquake source information have facilitated the inclusion of intensity measurements directly into ShakeMap computations. Our approach allows for the combination of (1) direct observations (ground-motion measurements or reported intensities), (2) observations converted from intensity to ground motion (or vice versa), and (3) estimated ground motions and intensities from prediction equations or numerical models. Critically, each of the aforementioned data types must include an estimate of its uncertainties, including those caused by scaling the influence of observations to surrounding grid points and those associated with estimates given an unknown fault geometry. The ShakeMap ground-motion and intensity estimates are an uncertainty-weighted combination of these various data and estimates. A natural by-product of this interpolation process is an estimate of total uncertainty at each point on the map, which can be vital for comprehensive inventory loss calculations. We perform a number of tests to validate this new methodology and find that it produces a substantial improvement in the accuracy of ground-motion predictions over empirical prediction equations alone.
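As a minimal illustration of uncertainty-weighted combination (not the ShakeMap code itself), the Python sketch below blends a scaled observation, a converted intensity, and a prediction-equation estimate at a single grid point using inverse-variance weights; the numerical values are assumptions.

```python
import numpy as np

def combine(estimates_ln, sigmas_ln):
    """Uncertainty-weighted combination of log ground-motion estimates.
    Returns the combined log estimate and its standard deviation."""
    w = 1.0 / np.asarray(sigmas_ln, dtype=float) ** 2
    mean = np.sum(w * np.asarray(estimates_ln, dtype=float)) / np.sum(w)
    sigma = np.sqrt(1.0 / np.sum(w))
    return mean, sigma

# Hypothetical inputs at one grid point (natural-log PGA in g):
# a nearby recorded value scaled to the grid point, an intensity observation
# converted to PGA, and a ground-motion prediction equation estimate.
estimates = [np.log(0.21), np.log(0.17), np.log(0.25)]
sigmas = [0.25, 0.55, 0.65]   # larger sigma -> smaller weight

ln_pga, ln_sigma = combine(estimates, sigmas)
print(f"combined PGA ~ {np.exp(ln_pga):.3f} g (ln-sigma {ln_sigma:.2f})")
```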
Say, Rebecca; Thomson, Richard; Robson, Stephen; Exley, Catherine
2013-01-16
Women who have a breech presentation at term have to decide whether to attempt external cephalic version (ECV) and how they want to give birth if the baby remains breech, either by planned caesarean section (CS) or vaginal breech birth. The aim of this study was to explore the attitudes of women with a breech presentation and health professionals who manage breech presentation to ECV. We carried out semi-structured interviews with pregnant women with a breech presentation (n=11) and health professionals who manage breech presentation (n=11) recruited from two hospitals in North East England. We used purposive sampling to include women who chose ECV and women who chose planned CS. We analysed data using thematic analysis, comparing between individuals and seeking out disconfirming cases. Four main themes emerged from the data collected during interviews with pregnant women with a breech presentation: ECV as a means of enabling natural birth; concerns about ECV; lay and professional accounts of ECV; and breech presentation as a means of choosing planned CS. Some women's attitudes to ECV were affected by their preferences for how to give birth. Other women chose CS because ECV was not acceptable to them. Two main themes emerged from the interview data about health professionals' attitudes towards ECV: directive counselling and attitudes towards lay beliefs about ECV and breech presentation. Women had a range of attitudes to ECV informed by their preferences for how to give birth; the acceptability of ECV to them; and lay accounts of ECV, which were frequently negative. Most professionals described having a preference for ECV and reported directively counselling women to choose it. Some professionals were dismissive of lay beliefs about ECV. Some key challenges for shared decision making about breech presentation were identified: health professionals counselling women directively about ECV and the differences between evidence-based information about ECV and lay beliefs. To address these challenges a number of approaches will be required.
Computing the Effects of Strain on Electronic States: A Survey of Methods and Issues
2012-12-01
We present a ... lays the foundations for first-principles approaches, including discussion of spin-orbit coupling. Section 3 presents an overview of empirical ... addition and removal energies of the independent-electron approximation. For simplicity, the energy levels in the figure have been presented as if they
Toward a Mobile Agent Relay Network
2010-03-01
in the study of particle movement. In computer science, flocking movement has been adapted for use in the collective, cooperative movement of ... MARN). For our approach, we utilize a modified flocking behavior to generate cooperative movement that utilizes the agent's relay capability. We ... Summary: Our testing focuses on measuring effective cooperative movement and robustness against malicious agents. The movement testing demonstrated that a
High Resolution Nature Runs and the Big Data Challenge
NASA Technical Reports Server (NTRS)
Webster, W. Phillip; Duffy, Daniel Q.
2015-01-01
NASA's Global Modeling and Assimilation Office at Goddard Space Flight Center is undertaking a series of very computationally intensive Nature Runs and a downscaled reanalysis. The nature runs use GEOS-5 as an Atmospheric General Circulation Model (AGCM), while the reanalysis uses GEOS-5 in Data Assimilation mode. This paper will present computational challenges from three runs, two of which are AGCM runs and one a downscaled reanalysis using the full DAS. The nature runs will be completed at two surface grid resolutions, 7 and 3 kilometers, with 72 vertical levels. The 7 km run spanned 2 years (2005-2006) and produced 4 PB of data, while the 3 km run will span one year and generate 4 PB of data. The downscaled reanalysis (MERRA-II, Modern-Era Reanalysis for Research and Applications) will cover 15 years and generate 1 PB of data. In our efforts to address the big data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS), a specialization of the concept of business process-as-a-service that is an evolving extension of IaaS, PaaS, and SaaS enabled by cloud computing. In this presentation, we will describe two projects that demonstrate this shift. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS. MERRA/AS enables MapReduce analytics over the MERRA reanalysis data collection by bringing together high-performance computing, scalable data management, and a domain-specific climate data services API. NASA's High-Performance Science Cloud (HPSC) is an example of the type of compute-storage fabric required to support CAaaS. The HPSC comprises a high-speed InfiniBand network, high-performance file systems and object storage, and virtual system environments tailored for data-intensive science applications. These technologies are providing a new tier in the data and analytic services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. In our experience, CAaaS lowers the barriers and risk to organizational change, fosters innovation and experimentation, and provides the agility required to meet our customers' increasing and changing needs.
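The MERRA/AS services themselves are not reproduced here; as a schematic of the map/reduce pattern mentioned above, the Python sketch below rebuilds a gridded long-term mean from per-chunk partial sums, with the grid shape and field values simulated as assumptions.

```python
import numpy as np
from functools import reduce

# Hypothetical "chunks" of a reanalysis variable (e.g. 2-m temperature),
# each chunk being one month of gridded data; all values are simulated.
rng = np.random.default_rng(5)
chunks = [rng.normal(288.0, 5.0, size=(30, 46, 72)) for _ in range(12)]

def map_chunk(chunk):
    """Map step: per-chunk partial sum and count (enough to rebuild a mean)."""
    return chunk.sum(axis=0), chunk.shape[0]

def reduce_partials(a, b):
    """Reduce step: combine partial sums and counts."""
    return a[0] + b[0], a[1] + b[1]

total_sum, total_count = reduce(reduce_partials, map(map_chunk, chunks))
climatological_mean = total_sum / total_count     # gridded long-term mean
print("global mean of the field:", climatological_mean.mean())
```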
NASA Astrophysics Data System (ADS)
Dave, Gaurav P.; Sureshkumar, N.; Blessy Trencia Lincy, S. S.
2017-11-01
The current trend in processor manufacturing focuses on multi-core architectures rather than increasing clock speed for performance improvement. Graphics processors have become commodity hardware for providing fast co-processing in computer systems. Developments in IoT, social networking web applications, and big data have created huge demand for data processing activities, and such throughput-intensive applications inherently contain data-level parallelism, which is well suited to SIMD-architecture-based GPUs. This paper reviews the architectural aspects of multi/many-core processors and graphics processors. Different case studies are taken to compare the performance of throughput computing applications using shared-memory programming in OpenMP and CUDA API based programming.
Lay Led Ministries in USAREUR: Impact on Command and Community.
1985-04-17
Second Coming. However, with the exception of the Jewish program, these church groups appeared to generally fall into two (2) categories: Black/Ethnic and ... same denomination -- Church of God in Christ. Summary Statement: The data which was received from the group of interviewed lay ministers could generally ... Chaplain Center and School at Fort Monmouth, New Jersey. 3. Questionnaires distributed to those persons most closely connected with the issue in the
Computer program for determining rotational line intensity factors for diatomic molecules
NASA Technical Reports Server (NTRS)
Whiting, E. E.
1973-01-01
A FORTRAN IV computer program that provides a new research tool for determining reliable rotational line intensity factors (also known as Honl-London factors) for most electric and magnetic dipole allowed diatomic transitions is described in detail. This user's manual includes instructions for preparing the input data, a program listing, detailed flow charts, and three sample cases. The program is applicable to spin-allowed dipole transitions with either or both states intermediate between Hund's case (a) and Hund's case (b) coupling and to spin-forbidden dipole transitions with either or both states intermediate between Hund's case (c) and Hund's case (b) coupling.
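For the special case of a 1Σ-1Σ transition the factors reduce to simple closed forms (P branch S = J'', R branch S = J'' + 1, Q branch forbidden), which the short Python sketch below evaluates and checks against the 2J'' + 1 sum rule; this is a textbook special case offered for orientation, not a reproduction of the FORTRAN program.

```python
def honl_london_1sigma(j_lower):
    """Honl-London (rotational line intensity) factors for a 1Sigma-1Sigma
    transition, as a function of the lower-state rotational quantum number J''."""
    return {"P": float(j_lower),        # J' = J'' - 1
            "Q": 0.0,                   # Q branch forbidden for Sigma-Sigma
            "R": float(j_lower + 1)}    # J' = J'' + 1

for j in range(4):
    s = honl_london_1sigma(j)
    # Sum rule: factors over all branches sum to the rotational degeneracy 2J''+1.
    assert abs(sum(s.values()) - (2 * j + 1)) < 1e-12
    print(j, s)
```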
GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data
NASA Astrophysics Data System (ADS)
Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.
2016-12-01
Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for data storage, computing, and analysis. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark - a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build the multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph, and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that apply to other domains with spatial properties. We tested the performance of the platform on a taxi trajectory analysis. Results suggest that GISpark achieves excellent run-time performance in spatiotemporal big data applications.
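GISpark's internals are not reproduced here; as a hedged sketch of the kind of Spark-based trajectory processing the abstract describes, the PySpark example below filters hypothetical taxi GPS points to a bounding box and counts them by hour (the file path, column names, and coordinates are assumptions).

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("taxi-trajectory-sketch").getOrCreate()

# Hypothetical CSV of GPS fixes with columns: taxi_id, timestamp, lon, lat
points = (spark.read.option("header", True)
          .csv("hdfs:///data/taxi_points.csv")
          .withColumn("lon", F.col("lon").cast("double"))
          .withColumn("lat", F.col("lat").cast("double")))

# Spatial filter (bounding box around a hypothetical study area) plus an
# hourly aggregation -- a simple stand-in for richer spatiotemporal queries.
in_box = points.filter((F.col("lon").between(116.25, 116.50)) &
                       (F.col("lat").between(39.85, 40.05)))
hourly = (in_box
          .withColumn("hour", F.hour(F.to_timestamp("timestamp")))
          .groupBy("hour").count()
          .orderBy("hour"))

hourly.show()
spark.stop()
```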
Exploring Cloud Computing for Large-scale Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Guang; Han, Binh; Yin, Jian
This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on-demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often just require a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high performance hardware with low latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a systems biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.
Transformation of OODT CAS to Perform Larger Tasks
NASA Technical Reports Server (NTRS)
Mattmann, Chris; Freeborn, Dana; Crichton, Daniel; Hughes, John; Ramirez, Paul; Hardman, Sean; Woollard, David; Kelly, Sean
2008-01-01
A computer program denoted OODT CAS has been transformed to enable performance of larger tasks that involve greatly increased data volumes and increasingly intensive processing of data on heterogeneous, geographically dispersed computers. Prior to the transformation, OODT CAS (also alternatively denoted, simply, 'CAS') [wherein 'OODT' signifies 'Object-Oriented Data Technology' and 'CAS' signifies 'Catalog and Archive Service'] was a proven software component used to manage scientific data from spaceflight missions. In the transformation, CAS was split into two separate components representing its canonical capabilities: file management and workflow management. In addition, CAS was augmented by addition of a resource-management component. This third component enables CAS to manage heterogeneous computing by use of diverse resources, including high-performance clusters of computers, commodity computing hardware, and grid computing infrastructures. CAS is now more easily maintainable, evolvable, and reusable. These components can be used separately or, taking advantage of synergies, can be used together. Other elements of the transformation included addition of a separate Web presentation layer that supports distribution of data products via Really Simple Syndication (RSS) feeds, and provision for full Resource Description Framework (RDF) exports of metadata.
The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diachin, L F; Garaizar, F X; Henson, V E
2009-10-12
In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.
Charalambous, Charalambos C; Alcantara, Carolina C; French, Margaret A; Li, Xin; Matt, Kathleen S; Kim, Hyosub E; Morton, Susanne M; Reisman, Darcy S
2018-05-15
Previous work demonstrated an effect of a single high-intensity exercise bout coupled with motor practice on the retention of a newly acquired skilled arm movement, in both neurologically intact and impaired adults. In the present study, using behavioural and computational analyses, we demonstrated that a single exercise bout, regardless of its intensity and timing, did not increase the retention of a novel locomotor task after stroke. Considering both present and previous work, we postulate that the benefits of exercise may depend on the type of motor learning (e.g. skill learning, sensorimotor adaptation) and/or task (e.g. arm accuracy-tracking task, walking). Acute high-intensity exercise coupled with motor practice improves the retention of motor learning in neurologically intact adults. However, whether exercise could improve the retention of locomotor learning after stroke is still unknown. Here, we investigated the effect of exercise intensity and timing on the retention of a novel locomotor learning task (i.e. split-belt treadmill walking) after stroke. Thirty-seven people post stroke participated in two sessions, 24 h apart, and were allocated to active control (CON), treadmill walking (TMW), or total body exercise on a cycle ergometer (TBE). In session 1, all groups exercised for a short bout (∼5 min) at low (CON) or high (TMW and TBE) intensity and before (CON and TMW) or after (TBE) the locomotor learning task. In both sessions, the locomotor learning task was to walk on a split-belt treadmill in a 2:1 speed ratio (100% and 50% fast-comfortable walking speed) for 15 min. To test the effect of exercise on 24 h retention, we applied behavioural and computational analyses. Behavioural data showed that neither high-intensity group showed greater 24 h retention compared to CON, and computational data showed that 24 h retention was attributable to a slow learning process for sensorimotor adaptation. Our findings demonstrated that acute exercise coupled with a locomotor adaptation task, regardless of its intensity and timing, does not improve retention of the novel locomotor task after stroke. We postulate that exercise effects on motor learning may be context specific (e.g. type of motor learning and/or task) and interact with the presence of genetic variants such as BDNF Val66Met. © 2018 The Authors. The Journal of Physiology © 2018 The Physiological Society.
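The abstract above attributes 24 h retention to a slow learning process identified through computational analysis. As a rough illustration of how such an analysis is commonly set up, the sketch below implements a generic two-state (fast/slow) model of sensorimotor adaptation in Python; the parameter values and the model choice are assumptions for illustration, not the model or fits reported in the study.

```python
import numpy as np

def two_state_adaptation(perturbation, A_f=0.92, B_f=0.03, A_s=0.996, B_s=0.004):
    """Generic two-state (fast/slow) model of sensorimotor adaptation.
    Parameter values are illustrative only, not those fitted in the study above."""
    n = len(perturbation)
    x_f = np.zeros(n + 1)  # fast state: learns quickly, forgets quickly
    x_s = np.zeros(n + 1)  # slow state: learns slowly, retains well
    output = np.zeros(n)
    for t in range(n):
        output[t] = x_f[t] + x_s[t]              # net adaptation on stride t
        error = perturbation[t] - output[t]      # residual error drives learning
        x_f[t + 1] = A_f * x_f[t] + B_f * error
        x_s[t + 1] = A_s * x_s[t] + B_s * error
    return output, x_f[:-1], x_s[:-1]

# Example: constant split-belt perturbation for 900 strides (roughly 15 min)
adapt, fast, slow = two_state_adaptation(np.ones(900))
print(f"final adaptation {adapt[-1]:.2f}, slow-state share {slow[-1] / adapt[-1]:.2f}")
```

In this class of models, whatever the slow state has accumulated by the end of practice is what tends to persist at a delayed retest, which is the quantity the behavioural 24 h retention measure is compared against.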
Statistical methods and computing for big data.
Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing; Yan, Jun
2016-01-01
Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with a focus on the open-source R language and R packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay.
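Of the three methodological classes named above, online updating is the easiest to illustrate in code. The minimal sketch below is a plain least-squares example rather than any method from the article: it keeps only the sufficient statistics and updates them block by block, so the full stream never has to reside in memory.

```python
import numpy as np

class OnlineLeastSquares:
    """Minimal sketch of the online-updating idea for stream data:
    accumulate X'X and X'y block by block instead of storing the raw data."""
    def __init__(self, p):
        self.xtx = np.zeros((p, p))
        self.xty = np.zeros(p)

    def update(self, X_block, y_block):
        self.xtx += X_block.T @ X_block
        self.xty += X_block.T @ y_block

    def coefficients(self):
        return np.linalg.solve(self.xtx, self.xty)

rng = np.random.default_rng(0)
beta_true = np.array([1.0, -2.0, 0.5])
model = OnlineLeastSquares(p=3)
for _ in range(100):                      # 100 blocks arriving as a stream
    X = rng.normal(size=(500, 3))
    y = X @ beta_true + rng.normal(scale=0.1, size=500)
    model.update(X, y)
print(model.coefficients())               # close to beta_true
```

Online updating for generalized linear models and for variable-selection criteria follows the same pattern, with the accumulated quantities chosen to match the estimating equations.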
Streaming support for data intensive cloud-based sequence analysis.
Issa, Shadi A; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed
2013-01-01
Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of "resources-on-demand" and "pay-as-you-go", scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
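The elastream package itself is not reproduced here; the following generator-based Python sketch, with a hypothetical file name, simply illustrates the streaming idea the abstract describes: reads are consumed and analysed incrementally as data arrive rather than after the full transfer completes.

```python
import gzip

def stream_reads(fastq_gz_path):
    """Yield (header, sequence) pairs from a gzipped FASTQ file as it is read,
    so downstream analysis can start before the whole file is available."""
    with gzip.open(fastq_gz_path, "rt") as handle:
        while True:
            header = handle.readline().rstrip()
            seq = handle.readline().rstrip()
            handle.readline()   # '+' separator line
            handle.readline()   # quality line
            if not header:
                break
            yield header, seq

def gc_content(read_stream):
    """Toy per-read analysis that consumes the stream incrementally."""
    for header, seq in read_stream:
        gc = (seq.count("G") + seq.count("C")) / max(len(seq), 1)
        yield header, gc

# Hypothetical usage; "sample.fastq.gz" is a placeholder path:
# for name, gc in gc_content(stream_reads("sample.fastq.gz")):
#     print(name, round(gc, 3))
```

The key property, mirrored in the paper's scheme, is that each read is processed independently, so computation can overlap with data transfer.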
Statistical methods and computing for big data
Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing
2016-01-01
Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with a focus on the open-source R language and R packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay. PMID:27695593
Sensor network based vehicle classification and license plate identification system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frigo, Janette Rose; Brennan, Sean M; Rosten, Edward J
Typically, for energy efficiency and scalability purposes, sensor networks have been used in the context of environmental and traffic monitoring applications in which operations at the sensor level are not computationally intensive. But increasingly, sensor network applications require data- and compute-intensive sensors such as video cameras and microphones. In this paper, we describe the design and implementation of two such systems: a vehicle classifier based on acoustic signals and a license plate identification system using a camera. The systems are implemented in an energy-efficient manner to the extent possible using commercially available hardware, the Mica motes and the Stargate platform. Our experience in designing these systems leads us to consider an alternative, more flexible, modular, low-power mote architecture that uses a combination of FPGAs, specialized embedded processing units and sensor data acquisition systems.
Campion, Thomas R; Waitman, Lemuel R; May, Addison K; Ozdas, Asli; Lorenzi, Nancy M; Gadd, Cynthia S
2010-01-01
Evaluations of computerized clinical decision support systems (CDSS) typically focus on clinical performance changes and do not include social, organizational, and contextual characteristics explaining use and effectiveness. Studies of CDSS for intensive insulin therapy (IIT) are no exception, and the literature lacks an understanding of effective computer-based IIT implementation and operation. This paper presents (1) a literature review of computer-based IIT evaluations through the lens of institutional theory, a discipline from sociology and organization studies, to demonstrate the inconsistent reporting of workflow and care process execution and (2) a single-site case study to illustrate how computer-based IIT requires substantial organizational change and creates additional complexity with unintended consequences including error. Computer-based IIT requires organizational commitment and attention to site-specific technology, workflow, and care processes to achieve intensive insulin therapy goals. The complex interaction between clinicians, blood glucose testing devices, and CDSS may contribute to workflow inefficiency and error. Evaluations rarely focus on the perspective of nurses, the primary users of computer-based IIT whose knowledge can potentially lead to process and care improvements. This paper addresses a gap in the literature concerning the social, organizational, and contextual characteristics of CDSS in general and for intensive insulin therapy specifically. Additionally, this paper identifies areas for future research to define optimal computer-based IIT process execution: the frequency and effect of manual data entry error of blood glucose values, the frequency and effect of nurse overrides of CDSS insulin dosing recommendations, and comprehensive ethnographic study of CDSS for IIT. Copyright (c) 2009. Published by Elsevier Ireland Ltd.
Accelerated Adaptive MGS Phase Retrieval
NASA Technical Reports Server (NTRS)
Lam, Raymond K.; Ohara, Catherine M.; Green, Joseph J.; Bikkannavar, Siddarayappa A.; Basinger, Scott A.; Redding, David C.; Shi, Fang
2011-01-01
The Modified Gerchberg-Saxton (MGS) algorithm is an image-based wavefront-sensing method that can turn any science instrument focal plane into a wavefront sensor. MGS characterizes optical systems by estimating the wavefront errors in the exit pupil using only intensity images of a star or other point source of light. This innovative implementation significantly accelerates the MGS phase retrieval algorithm by using stream-processing hardware on conventional graphics cards. Stream processing is a relatively new, yet powerful, paradigm that allows parallel processing of applications that apply single instructions to multiple data (SIMD). These stream processors are designed specifically to support large-scale parallel computing on a single graphics chip. Computationally intensive algorithms, such as the Fast Fourier Transform (FFT), are particularly well suited for this computing environment. This high-speed version of MGS exploits commercially available hardware to accomplish the same objective in a fraction of the original time, performing the matrix calculations on nVidia graphics cards. The graphics processing unit (GPU) is hardware that is specialized for computationally intensive, highly parallel computation. From the software perspective, a parallel programming model called CUDA is used to transparently scale multicore parallelism in hardware. This technology gives computationally intensive applications access to the processing power of the nVidia GPUs through a C/C++ programming interface. The AAMGS (Accelerated Adaptive MGS) software takes advantage of these advanced technologies to accelerate the optical phase error characterization. With a single PC that contains four nVidia GTX-280 graphics cards, the new implementation can process four images simultaneously to produce a JWST (James Webb Space Telescope) wavefront measurement 60 times faster than the previous code.
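As a rough analogue of the FFT-heavy, GPU-offloaded computation described above (not the AAMGS code itself, which is built on CUDA and C/C++), the sketch below uses CuPy when an NVIDIA GPU is available and falls back to NumPy otherwise; the pupil-propagation step is a generic stand-in for the kind of kernel that phase retrieval repeats many times.

```python
import numpy as np

try:
    import cupy as xp          # GPU path: requires an NVIDIA GPU and CuPy
    on_gpu = True
except ImportError:
    xp = np                    # CPU fallback keeps the sketch runnable anywhere
    on_gpu = False

def pupil_to_focal_intensity(pupil_amplitude, phase):
    """One FFT-bound step typical of image-based phase retrieval: propagate a
    complex pupil function to the focal plane and return the intensity image.
    Generic illustration, not the MGS/AAMGS implementation."""
    field = pupil_amplitude * xp.exp(1j * phase)
    focal = xp.fft.fftshift(xp.fft.fft2(field))
    return xp.abs(focal) ** 2

n = 2048
pupil = xp.ones((n, n), dtype=xp.complex64)
phase = xp.random.random((n, n)).astype(xp.float32)
intensity = pupil_to_focal_intensity(pupil, phase)
print("computed on GPU" if on_gpu else "computed on CPU", intensity.shape)
```

Because the same array code runs on either backend, the speedup comes entirely from where the FFTs execute, which is the essence of the acceleration strategy the abstract describes.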
Analysis of Transportation Options for Commercial Spent Fuel in the U.S.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalinina, Elena; Busch, Ingrid Karin
The U.S. Department of Energy (DOE) is laying the groundwork for implementing interim storage and associated transportation of spent nuclear fuel (SNF) and high...
Computer simulation of reconstructed image for computer-generated holograms
NASA Astrophysics Data System (ADS)
Yasuda, Tomoki; Kitamura, Mitsuru; Watanabe, Masachika; Tsumuta, Masato; Yamaguchi, Takeshi; Yoshikawa, Hiroshi
2009-02-01
This report presents computer-simulated reconstruction results for image-type Computer-Generated Holograms (CGHs), observable under white light, fabricated with an electron beam lithography system. The simulated image is obtained by calculating the wavelength and intensity of the diffracted light traveling toward the viewing point from the CGH. The wavelength and intensity of the diffracted light are calculated from an FFT image generated from the interference fringe data. A parallax image of the CGH corresponding to the viewing point can be easily obtained with this simulation method. The image simulated from the interference fringe data was compared with the reconstructed image of a real CGH fabricated with an electron beam (EB) lithography system. The simulated image closely resembled the reconstructed image of the CGH in shape, parallax, coloring and shade. In addition, depending on the shape of the light sources, the simulated images varied in chroma saturation and blur under two kinds of simulation: the several-light-sources method and the smoothing method. Finally, as applications of the CGH, a full-color CGH and a CGH with multiple images were simulated; the simulated images of these CGHs also closely resembled the reconstructed images of the real CGHs.
NASA Astrophysics Data System (ADS)
Vilotte, Jean-Pierre; Atkinson, Malcolm; Carpené, Michele; Casarotti, Emanuele; Frank, Anton; Igel, Heiner; Rietbrock, Andreas; Schwichtenberg, Horst; Spinuso, Alessandro
2016-04-01
Seismology pioneers global and open-data access -- with internationally approved data, metadata and exchange standards facilitated worldwide by the Federation of Digital Seismic Networks (FDSN) and in Europe the European Integrated Data Archives (EIDA). The growing wealth of data generated by dense observation and monitoring systems and recent advances in seismic wave simulation capabilities induces a change in paradigm. Data-intensive seismology research requires a new holistic approach combining scalable high-performance wave simulation codes and statistical data analysis methods, and integrating distributed data and computing resources. The European E-Infrastructure project "Virtual Earthquake and seismology Research Community e-science environment in Europe" (VERCE) pioneers the federation of autonomous organisations providing data and computing resources, together with a comprehensive, integrated and operational virtual research environment (VRE) and E-infrastructure devoted to the full path of data use in a research-driven context. VERCE delivers to a broad base of seismology researchers in Europe easily used high-performance full waveform simulations and misfit calculations, together with a data-intensive framework for the collaborative development of innovative statistical data analysis methods, all of which were previously only accessible to a small number of well-resourced groups. It balances flexibility with new integrated capabilities to provide a fluent path from research innovation to production. As such, VERCE is a major contribution to the implementation phase of the "European Plate Observatory System" (EPOS), the ESFRI initiative of the solid-Earth community. The VRE meets a range of seismic research needs by eliminating chores and technical difficulties to allow users to focus on their research questions. It empowers researchers to harvest the new opportunities provided by well-established and mature high-performance wave simulation codes of the community. It enables active researchers to invent and refine scalable methods for innovative statistical analysis of seismic waveforms in a wide range of application contexts. The VRE paves the way towards a flexible shared framework for seismic waveform inversion, lowering the barriers to uptake for the next generation of researchers. The VRE can be accessed through the science gateway that puts together computational and data-intensive research into the same framework, integrating multiple data sources and services. It provides a context for task-oriented and data-streaming workflows, and maps user actions to the full gamut of the federated platform resources and procurement policies, activating the necessary behind-the-scenes automation and transformation. The platform manages and produces domain metadata, coupling them with the provenance information describing the relationships and the dependencies, which characterise the whole workflow process. This dynamic knowledge base can be explored for validation purposes via a graphical interface and a web API. Moreover, it fosters the assisted selection and re-use of the data within each phase of the scientific analysis. These phases can be identified as Simulation, Data Access, Preprocessing, Misfit and data processing, and are presented to the users of the gateway as dedicated and interactive workspaces.
By enabling researchers to share results and provenance information, VERCE steers open-science behaviour, allowing researchers to discover and build on prior work and thereby to progress faster. A key asset is the agile strategy that VERCE deployed in a multi-organisational context, engaging seismologists, data scientists, ICT researchers, HPC and data resource providers, and system administrators in short-lived tasks, each with a goal that is a seismology priority, and intimately coupling research thinking with technical innovation. This changes the focus from HPC production environments and community data services to user-focused scenarios, avoiding wasteful bouts of technology centricity where technologists collect requirements and develop a system that is not used because the ideas of the planned users have moved on. As such, the technologies and concepts developed in VERCE are relevant to many other disciplines in computational and data-driven Earth Sciences and can provide the key technologies for a Europe-wide computational and data-intensive framework in Earth Sciences.
A parallel-processing approach to computing for the geographic sciences
Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Haga, Jim; Maddox, Brian; Feller, Mark
2001-01-01
The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting research into various areas, such as advanced computer architecture, algorithms to meet the processing needs for real-time image and data processing, the creation of custom datasets from seamless source data, rapid turn-around of products for emergency response, and support for computationally intense spatial and temporal modeling.
The BioIntelligence Framework: a new computational platform for biomedical knowledge computing.
Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles; Mousses, Spyro
2013-01-01
Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information.
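The BioIntelligence Framework's actual data model and query language are not described in detail in the abstract above; the short Python sketch below is only a minimal in-memory illustration, with hypothetical node and edge names, of why a hypergraph suits multi-lateral relationships: a single edge can bind a patient, a variant, and a therapy at once.

```python
from collections import defaultdict

class Hypergraph:
    """Minimal in-memory hypergraph: each hyperedge connects any number of
    nodes, which makes multi-lateral relationships (patient, variant, drug,
    evidence) natural to represent. Illustrative sketch, not the framework's API."""
    def __init__(self):
        self.edges = {}                       # edge id -> (label, set of nodes)
        self.node_index = defaultdict(set)    # node -> edge ids touching it

    def add_edge(self, edge_id, label, nodes):
        self.edges[edge_id] = (label, set(nodes))
        for node in nodes:
            self.node_index[node].add(edge_id)

    def neighbors(self, node, label=None):
        """All nodes co-occurring with `node` in some hyperedge, optionally
        restricted to edges carrying a given label."""
        result = set()
        for edge_id in self.node_index[node]:
            edge_label, members = self.edges[edge_id]
            if label is None or edge_label == label:
                result |= members - {node}
        return result

hg = Hypergraph()
hg.add_edge("e1", "treatment_response",
            ["patient:42", "variant:BRAF_V600E", "drug:vemurafenib"])
hg.add_edge("e2", "pathway_membership", ["variant:BRAF_V600E", "pathway:MAPK"])
print(hg.neighbors("variant:BRAF_V600E"))
```

A relational schema would need join tables to express the same three-way relationship, which is the representational overhead the abstract argues against.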
An independent software system for the analysis of dynamic MR images.
Torheim, G; Lombardi, M; Rinck, P A
1997-01-01
A computer system for the manual, semi-automatic, and automatic analysis of dynamic MR images was to be developed on UNIX and personal computer platforms. The system was to offer an integrated and standardized way of performing both image processing and analysis that was independent of the MR unit used. The system consists of modules that are easily adaptable to special needs. Data from MR units or other diagnostic imaging equipment in techniques such as CT, ultrasonography, or nuclear medicine can be processed through the ACR-NEMA/DICOM standard file formats. A full set of functions is available, among them cine-loop visual analysis, and generation of time-intensity curves. Parameters such as cross-correlation coefficients, area under the curve, peak/maximum intensity, wash-in and wash-out slopes, time to peak, and relative signal intensity/contrast enhancement can be calculated. Other parameters can be extracted by fitting functions like the gamma-variate function. Region-of-interest data and parametric values can easily be exported. The system has been successfully tested in animal and patient examinations.
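Of the curve parameters mentioned above, the gamma-variate fit is the step most easily shown in code. The sketch below assumes SciPy and a conventional gamma-variate parameterisation rather than the software's own implementation; it fits a synthetic time-intensity curve and derives the time to peak analytically.

```python
import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, A, t0, alpha, beta):
    """Conventional gamma-variate model for a first-pass time-intensity curve:
    zero before arrival time t0, then a skewed peak."""
    dt = np.clip(t - t0, 0.0, None)
    return A * dt**alpha * np.exp(-dt / beta)

# Synthetic ROI time-intensity curve (arbitrary units, one frame per second)
t = np.arange(0, 60, 1.0)
truth = gamma_variate(t, A=5.0, t0=8.0, alpha=2.0, beta=4.0)
signal = truth + np.random.default_rng(1).normal(scale=2.0, size=t.size)

popt, _ = curve_fit(gamma_variate, t, signal, p0=[1.0, 5.0, 1.5, 3.0],
                    bounds=([0, 0, 0.1, 0.1], [np.inf, 30, 10, 30]))
A, t0, alpha, beta = popt
time_to_peak = t0 + alpha * beta          # analytic peak location of the model
print(f"fitted time to peak: {time_to_peak:.1f} s")
```

Quantities such as wash-in slope and area under the curve can then be read off the fitted model rather than the noisy raw curve, which is one reason such fits are offered alongside the direct measurements.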
Three-dimensional spiral CT during arterial portography: comparison of three rendering techniques.
Heath, D G; Soyer, P A; Kuszyk, B S; Bliss, D F; Calhoun, P S; Bluemke, D A; Choti, M A; Fishman, E K
1995-07-01
The three most common techniques for three-dimensional reconstruction are surface rendering, maximum-intensity projection (MIP), and volume rendering. Surface-rendering algorithms model objects as collections of geometric primitives that are displayed with surface shading. The MIP algorithm renders an image by selecting the voxel with the maximum intensity signal along a line extended from the viewer's eye through the data volume. Volume-rendering algorithms sum the weighted contributions of all voxels along the line. Each technique has advantages and shortcomings that must be considered during selection of one for a specific clinical problem and during interpretation of the resulting images. With surface rendering, sharp-edged, clear three-dimensional reconstruction can be completed on modest computer systems; however, overlapping structures cannot be visualized and artifacts are a problem. MIP is computationally a fast technique, but it does not allow depiction of overlapping structures, and its images are three-dimensionally ambiguous unless depth cues are provided. Both surface rendering and MIP use less than 10% of the image data. In contrast, volume rendering uses nearly all of the data, allows demonstration of overlapping structures, and engenders few artifacts, but it requires substantially more computer power than the other techniques.
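The MIP rule (keep only the brightest voxel along each ray) and the volume-rendering idea of summing weighted contributions can be shown in a few lines. The NumPy sketch below projects along one array axis only and uses an ad hoc intensity weighting in place of a real opacity transfer function, so it illustrates the two rules rather than a clinical renderer.

```python
import numpy as np

def maximum_intensity_projection(volume, axis=0):
    """MIP: keep only the brightest voxel along each ray (here, rays run
    parallel to one array axis)."""
    return volume.max(axis=axis)

def weighted_sum_projection(volume, axis=0):
    """Crude volume-rendering-style projection: every voxel along the ray
    contributes, weighted by its own intensity (a stand-in for a real
    opacity transfer function)."""
    weights = volume / (volume.sum(axis=axis, keepdims=True) + 1e-9)
    return (volume * weights).sum(axis=axis)

# Synthetic CT-like volume: dim background plus one bright vessel-like structure
rng = np.random.default_rng(2)
vol = rng.normal(100, 10, size=(64, 128, 128))
vol[:, 60:68, 60:68] += 400        # bright structure running through the slab

mip = maximum_intensity_projection(vol, axis=0)
vr = weighted_sum_projection(vol, axis=0)
print(mip.shape, vr.shape)         # both collapse the 64-slice stack to 2-D
```

The sketch also makes the abstract's trade-off concrete: the MIP image discards everything except one voxel per ray, while the weighted sum retains contributions from overlapping structures at the cost of more arithmetic per ray.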
NASA Astrophysics Data System (ADS)
Derkachov, G.; Jakubczyk, T.; Jakubczyk, D.; Archer, J.; Woźniak, M.
2017-07-01
Utilising the Compute Unified Device Architecture (CUDA) platform for Graphics Processing Units (GPUs) enables a significant reduction of computation time at a moderate cost, by means of parallel computing. In the paper [Jakubczyk et al., Opto-Electron. Rev., 2016] we reported using GPU for Mie scattering inverse problem solving (up to 800-fold speed-up). Here we report the development of two subroutines utilising GPU at the data preprocessing stages of the inversion procedure: (i) a subroutine, based on ray tracing, for finding the spherical aberration correction function, and (ii) a subroutine performing the conversion of an image to a 1D distribution of light intensity versus azimuth angle (i.e. scattering diagram), fed from a movie-reading CPU subroutine running in parallel. All subroutines are incorporated in the PikeReader application, which we make available in a GitHub repository. PikeReader returns a sequence of intensity distributions versus a common azimuth angle vector, corresponding to the recorded movie. We obtained an overall ∼400-fold speed-up of calculations at the data preprocessing stages using CUDA codes running on GPU in comparison to single-thread MATLAB-only code running on CPU.
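A CPU-only sketch of the second preprocessing step, converting an image to a 1D distribution of intensity versus azimuth angle, is given below; it is a plain NumPy illustration of the data reduction, not the CUDA subroutine in PikeReader, and the image centre is assumed to be known.

```python
import numpy as np

def intensity_vs_azimuth(image, center, n_bins=360):
    """Reduce a 2-D scattering image to mean intensity per azimuth-angle bin
    around `center` (row, col). CPU illustration of the data reduction that
    the paper off-loads to the GPU."""
    rows, cols = np.indices(image.shape)
    angles = np.arctan2(rows - center[0], cols - center[1])        # [-pi, pi]
    bins = ((angles + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    sums = np.bincount(bins.ravel(), weights=image.ravel(), minlength=n_bins)
    counts = np.bincount(bins.ravel(), minlength=n_bins)
    azimuth = np.linspace(-np.pi, np.pi, n_bins, endpoint=False)
    return azimuth, sums / np.maximum(counts, 1)

frame = np.random.default_rng(3).random((480, 640))
az, profile = intensity_vs_azimuth(frame, center=(240, 320))
print(az.shape, profile.shape)       # (360,), (360,)
```

Because every pixel is binned independently, the operation is embarrassingly parallel, which is what makes it a good candidate for the GPU off-loading reported above.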
Chen, Jinying; Jagannatha, Abhyuday N; Fodeh, Samah J; Yu, Hong
2017-10-31
Medical terms are a major obstacle for patients to comprehend their electronic health record (EHR) notes. Clinical natural language processing (NLP) systems that link EHR terms to lay terms or definitions allow patients to easily access helpful information when reading through their EHR notes, and have been shown to improve patient EHR comprehension. However, high-quality lay language resources for EHR terms are very limited in the public domain. Because expanding and curating such a resource is a costly process, it is beneficial and even necessary to identify terms important for patient EHR comprehension first. We aimed to develop an NLP system, called adapted distant supervision (ADS), to rank candidate terms mined from EHR corpora. We will give EHR terms ranked as high by ADS a higher priority for lay language annotation, that is, creating lay definitions for these terms. Adapted distant supervision uses distant supervision from consumer health vocabulary and transfer learning to adapt itself to solve the problem of ranking EHR terms in the target domain. We investigated 2 state-of-the-art transfer learning algorithms (i.e., feature space augmentation and supervised distant supervision) and designed 5 types of learning features, including distributed word representations learned from large EHR data for ADS. For evaluating ADS, we asked domain experts to annotate 6038 candidate terms as important or nonimportant for EHR comprehension. We then randomly divided these data into the target-domain training data (1000 examples) and the evaluation data (5038 examples). We compared ADS with 2 strong baselines, including standard supervised learning, on the evaluation data. The ADS system using feature space augmentation achieved the best average precision, 0.850, on the evaluation set when using 1000 target-domain training examples. The ADS system using supervised distant supervision achieved the best average precision, 0.819, on the evaluation set when using only 100 target-domain training examples. The 2 ADS systems both performed significantly better than the baseline systems (P<.001 for all measures and all conditions). Using a rich set of learning features contributed to ADS's performance substantially. ADS can effectively rank terms mined from EHRs. Transfer learning improved ADS's performance even with a small number of target-domain training examples. EHR terms prioritized by ADS were used to expand a lay language resource that supports patient EHR comprehension. The top 10,000 EHR terms ranked by ADS are available upon request. ©Jinying Chen, Abhyuday N Jagannatha, Samah J Fodeh, Hong Yu. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 31.10.2017.
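One of the two transfer-learning algorithms named above, feature space augmentation, is conventionally implemented as the "frustratingly easy" augmentation of Daumé III (2007). The sketch below shows that feature transformation on placeholder data; the feature content, dimensions, and labels here are illustrative assumptions, not the ADS features or corpus.

```python
import numpy as np

def augment_features(X, domain):
    """Frustratingly-easy domain adaptation (Daume III, 2007): each example
    gets three feature blocks -- shared, source-only, target-only -- so a
    single linear model can learn both common and domain-specific weights."""
    zeros = np.zeros_like(X)
    if domain == "source":
        return np.hstack([X, X, zeros])
    return np.hstack([X, zeros, X])      # target domain

# Placeholder feature matrices (e.g. embedding features for candidate terms)
rng = np.random.default_rng(4)
X_source, y_source = rng.random((2000, 50)), rng.integers(0, 2, 2000)
X_target, y_target = rng.random((100, 50)), rng.integers(0, 2, 100)

X_train = np.vstack([augment_features(X_source, "source"),
                     augment_features(X_target, "target")])
y_train = np.concatenate([y_source, y_target])
print(X_train.shape)                     # (2100, 150): 3 x original feature count
```

Any linear classifier or ranker can then be trained on the augmented matrix; the shared block lets abundant source-domain examples inform predictions on the small target domain, which matches the abstract's finding that performance holds up with few target-domain examples.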
The challenge of assessing the potential developmental health risks for the tens of thousands of environmental chemicals is beyond the capacity for resource-intensive animal protocols. Large data streams coming from high-throughput (HTS) and high-content (HCS) profiling of biolog...
Shi, Yulin; Veidenbaum, Alexander V; Nicolau, Alex; Xu, Xiangmin
2015-01-15
Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post hoc processing and analysis. Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that the GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU with up to a 22× speedup, depending on computational tasks. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. To our best knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Together, GPU enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. Copyright © 2014 Elsevier B.V. All rights reserved.
Shi, Yulin; Veidenbaum, Alexander V.; Nicolau, Alex; Xu, Xiangmin
2014-01-01
Background Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post-hoc processing and analysis. New Method Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. Results We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that the GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU with up to a 22x speedup, depending on computational tasks. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. Comparison with Existing Method(s) To our best knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Conclusions Together, GPU enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. PMID:25277633
Hickey, Catherine
2015-01-01
Davanloo's Intensive Short-term Dynamic Psychotherapy has been the subject of various reviews. The first article in this series focused on a review of Davanloo's early work as well as a discussion of some of his most recent research findings. A case from the Montreal closed circuit training program was reviewed. This second article will focus on Davanloo's views on the transference neurosis and how its development should be avoided at all costs. There will be further exploration of the case presented in Part I from the Montreal closed circuit training program. There will also be a special focus on detecting the transference neurosis when present and the technical interventions needed to lay the foundations for removing it.
Media multitasking behavior: concurrent television and computer usage.
Brasel, S Adam; Gips, James
2011-09-01
Changes in the media landscape have made simultaneous usage of the computer and television increasingly commonplace, but little research has explored how individuals navigate this media multitasking environment. Prior work suggests that self-insight may be limited in media consumption and multitasking environments, reinforcing a rising need for direct observational research. A laboratory experiment recorded both younger and older individuals as they used a computer and television concurrently, multitasking across television and Internet content. Results show that individuals are attending primarily to the computer during media multitasking. Although gazes last longer on the computer when compared to the television, the overall distribution of gazes is strongly skewed toward very short gazes only a few seconds in duration. People switched between media at an extreme rate, averaging more than 4 switches per min and 120 switches over the 27.5-minute study exposure. Participants had little insight into their switching activity and recalled their switching behavior at an average of only 12 percent of their actual switching rate revealed in the objective data. Younger individuals switched more often than older individuals, but other individual differences such as stated multitasking preference and polychronicity had little effect on switching patterns or gaze duration. This overall pattern of results highlights the importance of exploring new media environments, such as the current drive toward media multitasking, and reinforces that self-monitoring, post hoc surveying, and lay theory may offer only limited insight into how individuals interact with media.
Media Multitasking Behavior: Concurrent Television and Computer Usage
Gips, James
2011-01-01
Abstract Changes in the media landscape have made simultaneous usage of the computer and television increasingly commonplace, but little research has explored how individuals navigate this media multitasking environment. Prior work suggests that self-insight may be limited in media consumption and multitasking environments, reinforcing a rising need for direct observational research. A laboratory experiment recorded both younger and older individuals as they used a computer and television concurrently, multitasking across television and Internet content. Results show that individuals are attending primarily to the computer during media multitasking. Although gazes last longer on the computer when compared to the television, the overall distribution of gazes is strongly skewed toward very short gazes only a few seconds in duration. People switched between media at an extreme rate, averaging more than 4 switches per min and 120 switches over the 27.5-minute study exposure. Participants had little insight into their switching activity and recalled their switching behavior at an average of only 12 percent of their actual switching rate revealed in the objective data. Younger individuals switched more often than older individuals, but other individual differences such as stated multitasking preference and polychronicity had little effect on switching patterns or gaze duration. This overall pattern of results highlights the importance of exploring new media environments, such as the current drive toward media multitasking, and reinforces that self-monitoring, post hoc surveying, and lay theory may offer only limited insight into how individuals interact with media. PMID:21381969
Long live the Data Scientist, but can he/she persist?
NASA Astrophysics Data System (ADS)
Wyborn, L. A.
2011-12-01
In recent years the fourth paradigm of data intensive science has slowly taken hold as the increased capacity of instruments and an increasing number of instruments (in particular sensor networks) have changed how fundamental research is undertaken. Most modern scientific research is about digital capture of data direct from instruments, processing it by computers, storing the results on computers and only publishing a small fraction of data in hard copy publications. At the same time, the rapid increase in capacity of supercomputers, particularly at petascale, means that far larger data sets can be analysed, and at greater resolution, than previously possible. The new cloud computing paradigm, which allows distributed data, software and compute resources to be linked by seamless workflows, is creating new opportunities in processing of high volumes of data to an increasingly larger number of researchers. However, to take full advantage of these compute resources, data sets for analysis have to be aggregated from multiple sources to create high performance data sets. These new technology developments require that scientists must become more skilled in data management and/or have a higher degree of computer literacy. In almost every science discipline there is now an X-informatics branch and a computational X branch (e.g., Geoinformatics and Computational Geoscience): both require a new breed of researcher that has skills in both the science fundamentals and also knowledge of some ICT aspects (computer programming, database design and development, data curation, software engineering). People that can operate in both science and ICT are increasingly known as 'data scientists'. Data scientists are a critical element of many large scale earth and space science informatics projects, particularly those that are tackling current grand challenges at an international level on issues such as climate change, hazard prediction and sustainable development of our natural resources. These projects by their very nature require the integration of multiple digital data sets from multiple sources. Often the preparation of the data for computational analysis can take months and requires painstaking attention to detail to ensure that anomalies identified are real and are not just artefacts of the data preparation and/or the computational analysis. Although data scientists are increasingly vital to successful data intensive earth and space science projects, unless they are recognised for their capabilities in both the science and the computational domains they are likely to migrate to either a science role or an ICT role as their career advances. Most reward and recognition systems do not recognise those with skills in both, hence, getting trained data scientists to persist beyond one or two projects can be a challenge. Those data scientists that persist in the profession are characteristically committed and enthusiastic people who have the support of their organisations to take on this role. They also tend to be people who share developments and are critical to the success of the open source software movement. However, the fact remains that survival of the data scientist as a species is being threatened unless something is done to recognise their invaluable contributions to the new fourth paradigm of science.
Hsieh, Hong-Po; Ko, Fan-Hua; Sung, Kung-Bin
2018-04-20
An iterative curve fitting method has been applied in both simulation [J. Biomed. Opt. 17, 107003 (2012), doi:10.1117/1.JBO.17.10.107003] and phantom [J. Biomed. Opt. 19, 077002 (2014), doi:10.1117/1.JBO.19.7.077002] studies to accurately extract the optical properties and the top layer thickness of a two-layered superficial tissue model from diffuse reflectance spectroscopy (DRS) data. This paper describes a hybrid two-step parameter estimation procedure to address two main issues of the previous method, namely (1) high computational intensity and (2) convergence to local minima. The parameter estimation procedure contained a novel initial estimation step to obtain an initial guess, which was used by a subsequent iterative fitting step to optimize the parameter estimation. A lookup table was used in both steps to quickly obtain reflectance spectra and reduce computational intensity. On simulated DRS data, the proposed parameter estimation procedure achieved high estimation accuracy and a 95% reduction of computational time compared to previous studies. Furthermore, the proposed initial estimation step led to better convergence of the following fitting step. Strategies used in the proposed procedure could benefit both the modeling and experimental data processing of not only DRS but also related approaches such as near-infrared spectroscopy.
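The hybrid two-step pattern, a lookup-table initial estimate followed by iterative fitting, can be illustrated independently of the layered-tissue reflectance model. The sketch below uses a deliberately simple two-parameter forward model as a stand-in; only the estimation pattern, not the physics, mirrors the procedure described above.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical two-parameter forward model standing in for the layered-tissue
# reflectance model; the real DRS forward model is far more involved.
wavelengths = np.linspace(450, 650, 101)
def forward_model(p1, p2):
    return p1 * np.exp(-((wavelengths - 550) / 60.0) ** 2) + p2

# Step 0: precompute a lookup table of spectra on a coarse parameter grid.
p1_grid = np.linspace(0.1, 2.0, 40)
p2_grid = np.linspace(0.0, 0.5, 40)
table = np.array([[forward_model(a, b) for b in p2_grid] for a in p1_grid])

def estimate(measured):
    # Step 1: initial estimate = grid point whose tabulated spectrum is closest.
    errors = ((table - measured) ** 2).sum(axis=2)
    i, j = np.unravel_index(np.argmin(errors), errors.shape)
    x0 = [p1_grid[i], p2_grid[j]]
    # Step 2: iterative fitting, started from the lookup-table guess.
    res = least_squares(lambda p: forward_model(*p) - measured, x0)
    return res.x

measured = forward_model(1.3, 0.2) + np.random.default_rng(5).normal(0, 0.01, wavelengths.size)
print(estimate(measured))               # approximately [1.3, 0.2]
```

Starting the iterative fit from the nearest tabulated spectrum is what reduces both the iteration count and the risk of settling in a local minimum, which are exactly the two issues the hybrid procedure targets.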
Jayaraman, Sudha; Mabweijano, Jacqueline R; Lipnick, Michael S; Caldwell, Nolan; Miyamoto, Justin; Wangoda, Robert; Mijumbi, Cephas; Hsia, Renee; Dicker, Rochelle; Ozgediz, Doruk
2009-12-01
Uganda currently has no organized prehospital emergency system. We sought to measure the current burden of injury seen by lay people in Kampala, Uganda and to determine the feasibility of a lay first-responder training program. We conducted a cross-sectional survey of current prehospital care providers in Kampala: police officers, minibus taxi drivers, and Local Council officials, and collected data on types and frequencies of emergencies witnessed, barriers to aid provision, history of training, and current availability of first-aid supplies. A context-appropriate course on basic first-aid for trauma was designed and implemented. We measured changes in trainees' fund of knowledge before and after training. A total of 309 lay people participated in the study, and during the previous 6 months saw 18 traumatic emergencies each; 39% saw an injury-related death. The most common injury mechanisms were road crashes, assault, and burns. In these cases, 90% of trainees provided some aid, most commonly lifting (82%) or transport (76%). Fifty-two percent of trainees had previous first-aid training, 44% had some access to equipment, and 32% had ever purchased a first-aid kit. Before training, participants answered 45% of test questions correctly (mean %) and this increased to 86% after training (p < 0.0001). Lay people witness many emergencies and deaths in Kampala, Uganda and provide much needed care but are ill-prepared to do so. A context-appropriate prehospital trauma care course can be developed and improve lay people's knowledge of basic trauma care. The effectiveness of such a training program needs to be evaluated prospectively.
Graveland, J; Berends, A E
1997-01-01
The calcium demand of egg-laying birds is much higher than in other vertebrates during reproduction. We showed elsewhere that a low level of calcium availability can greatly affect the eggshell quality and reproduction of free-living passerines. However, there are few data on calcium demand and calcium intake in relation to egg laying and behaviour and egg-laying performance under conditions of calcium shortage in nondomesticated birds. We examined these aspects in an experiment with captive great tits, Parus major, on a diet deficient in calcium, with or without snail shells as an additional calcium source. More than 90% of the calcium intake for egg production took place during the egg-laying period. Females ingested about 1.7 times as much calcium as they deposited in eggshells. Removing the snail shells after the first egg resulted in eggshell defects and interruptions of laying after 1-3 d. Females without snail shells doubled their searching effort and started to burrow in the soil and to eat sand, small stones, and their own eggs. Most calcium was consumed in the evening, probably to supplement the calcium available from the medullary bone with an additional calcium source in the gut during eggshell formation. The results demonstrated that eggshell formation requires accurate timing of the calcium intake and that obtaining sufficient calcium is time-consuming, even in calcium-rich environments. These factors pertaining to calcium intake greatly affect the ability of birds to collect sufficient calcium for eggshell formation in calcium-poor areas.
A smartphone application for dispatch of lay responders to out-of-hospital cardiac arrests.
Berglund, Ellinor; Claesson, Andreas; Nordberg, Per; Djärv, Therese; Lundgren, Peter; Folke, Fredrik; Forsberg, Sune; Riva, Gabriel; Ringh, Mattias
2018-05-01
Dispatch of lay volunteers trained in cardiopulmonary resuscitation (CPR) and equipped with automated external defibrillators (AEDs) may improve survival in cases of out-of-hospital cardiac arrest (OHCA). The aim of this study was to investigate the functionality and performance of a smartphone application for locating and alerting nearby trained laymen/women in cases of OHCA. A system using a smartphone application activated by Emergency Dispatch Centres was used to locate and alert laymen/women to nearby suspected OHCAs. Lay responders were instructed either to perform CPR or collect a nearby AED. An online survey was carried out among the responders. From February to August 2016, the system was activated in 685 cases of suspected OHCA. Among these, 224 cases were Emergency Medical Services (EMSs)-treated OHCAs (33%). EMS-witnessed cases (n = 11) and cases with missing survey data (n = 15) were excluded. In the remaining 198 OHCAs, lay responders arrived at the scene in 116 cases (58%), and prior to EMSs in 51 cases (26%). An AED was attached in 17 cases (9%) and 4 (2%) were defibrillated. Lay responders performed CPR in 54 cases (27%). Median distance to the OHCA was 560 m (IQR 332-860 m), and 1280 m (IQR 748-1776 m) via AED pick-up. The survey-answering rate was 82%. A smartphone application can be used to alert CPR-trained lay volunteers to OHCAs for CPR. Further improvements are needed to shorten the time to defibrillation before EMS arrival. Copyright © 2018 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aliaga, José I., E-mail: aliaga@uji.es; Alonso, Pedro; Badía, José M.
We introduce a new iterative Krylov subspace-based eigensolver for the simulation of macromolecular motions on desktop multithreaded platforms equipped with multicore processors and, possibly, a graphics accelerator (GPU). The method consists of two stages, with the original problem first reduced into a simpler band-structured form by means of a high-performance compute-intensive procedure. This is followed by a memory-intensive but low-cost Krylov iteration, which is off-loaded to be computed on the GPU by means of an efficient data-parallel kernel. The experimental results reveal the performance of the new eigensolver. Concretely, when applied to the simulation of macromolecules with a few thousand degrees of freedom and when the number of eigenpairs to be computed is small to moderate, the new solver outperforms other methods implemented as part of high-performance numerical linear algebra packages for multithreaded architectures.
MeDICi Software Superglue for Data Analysis Pipelines
Ian Gorton
2017-12-09
The Middleware for Data-Intensive Computing (MeDICi) Integration Framework is an integrated middleware platform developed to solve the data analysis and processing needs of scientists across many domains. MeDICi is scalable, easily modified, and robust across multiple languages, protocols, and hardware platforms, and is in use today by PNNL scientists for bioinformatics, power grid failure analysis, and text analysis.
Scoping the polymer genome: A roadmap for rational polymer dielectrics design and beyond
Mannodi-Kanakkithodi, Arun; Chandrasekaran, Anand; Kim, Chiho; ...
2017-12-19
The Materials Genome Initiative (MGI) has heralded a sea change in the philosophy of materials design. In an increasing number of applications, the successful deployment of novel materials has benefited from the use of computational methodologies, data descriptors, and machine learning. Polymers have long suffered from a lack of data on electronic, mechanical, and dielectric properties across large chemical spaces, causing a stagnation in the set of suitable candidates for various applications. Extensive efforts over the last few years have seen the fruitful application of MGI principles toward the accelerated discovery of attractive polymer dielectrics for capacitive energy storage. Here, we review these efforts, highlighting the importance of computational data generation and screening, targeted synthesis and characterization, polymer fingerprinting and machine-learning prediction models, and the creation of an online knowledgebase to guide ongoing and future polymer discovery and design. We lay special emphasis on the fingerprinting of polymers in terms of their genome or constituent atomic and molecular fragments, an idea that pays homage to the pioneers of the human genome project who identified the basic building blocks of the human DNA. As a result, by scoping the polymer genome, we present an essential roadmap for the design of polymer dielectrics, and provide future perspectives and directions for expansions to other polymer subclasses and properties.
Scoping the polymer genome: A roadmap for rational polymer dielectrics design and beyond
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mannodi-Kanakkithodi, Arun; Chandrasekaran, Anand; Kim, Chiho
The Materials Genome Initiative (MGI) has heralded a sea change in the philosophy of materials design. In an increasing number of applications, the successful deployment of novel materials has benefited from the use of computational methodologies, data descriptors, and machine learning. Polymers have long suffered from a lack of data on electronic, mechanical, and dielectric properties across large chemical spaces, causing a stagnation in the set of suitable candidates for various applications. Extensive efforts over the last few years have seen the fruitful application of MGI principles toward the accelerated discovery of attractive polymer dielectrics for capacitive energy storage. Here, we review these efforts, highlighting the importance of computational data generation and screening, targeted synthesis and characterization, polymer fingerprinting and machine-learning prediction models, and the creation of an online knowledgebase to guide ongoing and future polymer discovery and design. We lay special emphasis on the fingerprinting of polymers in terms of their genome or constituent atomic and molecular fragments, an idea that pays homage to the pioneers of the human genome project who identified the basic building blocks of the human DNA. As a result, by scoping the polymer genome, we present an essential roadmap for the design of polymer dielectrics, and provide future perspectives and directions for expansions to other polymer subclasses and properties.
A Fast Algorithm for Lattice Hyperonic Potentials
NASA Astrophysics Data System (ADS)
Nemura, Hidekatsu; Aoki, Sinya; Doi, Takumi; Gongyo, Shinya; Hatsuda, Tetsuo; Ikeda, Yoichi; Inoue, Takashi; Iritani, Takumi; Ishii, Noriyoshi; Miyamoto, Takaya; Murano, Keiko; Sasaki, Kenji
We describe an efficient algorithm to compute a large number of baryon-baryon interactions from NN to ΞΞ by means of the HAL QCD method, which lays the groundwork for the nearly physical point lattice QCD calculation with volume (96a)⁴ ≈ (8.2 fm)⁴. Preliminary results of the ΛN potential calculated with quark masses corresponding to (mπ, mK) ≈ (146, 525) MeV are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikolic, R J
This month's issue has the following articles: (1) Dawn of a New Era of Scientific Discovery - Commentary by Edward I. Moses; (2) At the Frontiers of Fundamental Science Research - Collaborators from national laboratories, universities, and international organizations are using the National Ignition Facility to probe key fundamental science questions; (3) Livermore Responds to Crisis in Post-Earthquake Japan - More than 70 Laboratory scientists provided round-the-clock expertise in radionuclide analysis and atmospheric dispersion modeling as part of the nation's support to Japan following the March 2011 earthquake and nuclear accident; (4) A Comprehensive Resource for Modeling, Simulation, and Experiments - A new Web-based resource called MIDAS is a central repository for material properties, experimental data, and computer models; and (5) Finding Data Needles in Gigabit Haystacks - Livermore computer scientists have developed a novel computer architecture based on 'persistent' memory to ease data-intensive computations.
Building a Data Science capability for USGS water research and communication
NASA Astrophysics Data System (ADS)
Appling, A.; Read, E. K.
2015-12-01
Interpreting and communicating water issues in an era of exponentially increasing information requires a blend of domain expertise, computational proficiency, and communication skills. The USGS Office of Water Information has established a Data Science team to meet these needs, providing challenging careers for diverse domain scientists and innovators in the fields of information technology and data visualization. Here, we detail the experience of building a Data Science capability as a bridging element between traditional water resources analyses and modern computing tools and data management techniques. This approach includes four major components: 1) building reusable research tools, 2) documenting data-intensive research approaches in peer reviewed journals, 3) communicating complex water resources issues with interactive web visualizations, and 4) offering training programs for our peers in scientific computing. These components collectively improve the efficiency, transparency, and reproducibility of USGS data analyses and scientific workflows.
A vectorized Lanczos eigensolver for high-performance computers
NASA Technical Reports Server (NTRS)
Bostic, Susan W.
1990-01-01
The computational strategies used to implement a Lanczos-based-method eigensolver on the latest generation of supercomputers are described. Several examples of structural vibration and buckling problems are presented that show the effects of using optimization techniques to increase the vectorization of the computational steps. The data storage and access schemes and the tools and strategies that best exploit the computer resources are presented. The method is implemented on the Convex C220, the Cray 2, and the Cray Y-MP computers. Results show that very good computation rates are achieved for the most computationally intensive steps of the Lanczos algorithm and that the Lanczos algorithm is many times faster than other methods extensively used in the past.
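For orientation, the core of the eigensolver described above is the Lanczos recurrence, which reduces a large symmetric matrix to a small tridiagonal one whose eigenvalues (Ritz values) approximate the extreme eigenvalues sought in vibration and buckling problems. The following is a minimal, unvectorized NumPy sketch of that recurrence only; it omits the reorthogonalization, shift-invert, and storage strategies a production solver such as the one described would need, and the matrix and step count are arbitrary examples.

    import numpy as np

    def lanczos_ritz(A, m, rng=np.random.default_rng(0)):
        """Run m Lanczos steps on symmetric A and return the Ritz values."""
        n = A.shape[0]
        q = rng.standard_normal(n)
        q /= np.linalg.norm(q)
        q_prev = np.zeros(n)
        alpha, beta = [], []
        b = 0.0
        for _ in range(m):
            w = A @ q - b * q_prev          # three-term recurrence
            a = q @ w
            w -= a * q
            b = np.linalg.norm(w)
            alpha.append(a)
            beta.append(b)
            q_prev, q = q, w / b
        # Tridiagonal projection of A onto the Krylov subspace.
        T = np.diag(alpha) + np.diag(beta[:-1], 1) + np.diag(beta[:-1], -1)
        return np.linalg.eigvalsh(T)

    # Example: approximate the largest eigenvalues of a random symmetric matrix.
    A = np.random.default_rng(1).random((200, 200))
    A = (A + A.T) / 2
    print(lanczos_ritz(A, 30)[-3:])         # Ritz values near the top of the spectrum
    print(np.linalg.eigvalsh(A)[-3:])       # dense reference for comparison

The heavy, vectorizable work is the matrix-vector product A @ q, which is exactly the step the paper reports tuning for the Convex and Cray machines.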
Aerial radiometric and magnetic survey: Aztec National Topographic Map, New Mexico
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-01-01
The results of analyses of the airborne gamma radiation and total magnetic field survey flown for the region identified as the Aztec National Topographic Map NJ13-10 are presented. The airborne data gathered are reduced by ground computer facilities to yield profile plots of the basic uranium, thorium and potassium equivalent gamma radiation intensities, ratios of these intensities, aircraft altitude above the earth's surface, total gamma ray and earth's magnetic field intensity, correlated as a function of geologic units. The distribution of data within each geologic unit, for all surveyed map lines and tie lines, has been calculated and is included. Two sets of profiled data for each line are included, with one set displaying the above-cited data. The second set includes only flight line magnetic field, temperature, pressure, altitude data plus magnetic field data as measured at a base station. A general description of the area, including descriptions of the various geologic units and the corresponding airborne data, is included also.
Aerial radiometric and magnetic survey: Lander National Topographic Map, Wyoming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-01-01
The results of analyses of the airborne gamma radiation and total magnetic field survey flown for the region identified as the Lander National Topographic Map NK12-6 are presented. The airborne data gathered are reduced by ground computer facilities to yield profile plots of the basic uranium, thorium and potassium equivalent gamma radiation intensities, ratios of these intensities, aircraft altitude above the earth's surface, total gamma ray and earth's magnetic field intensity, correlated as a function of geologic units. The distribution of data within each geologic unit, for all surveyed map lines and tie lines, has been calculated and is included. Two sets of profiled data for each line are included, with one set displaying the above-cited data. The second set includes only flight line magnetic field, temperature, pressure, altitude data plus magnetic field data as measured at a base station. A general description of the area, including descriptions of the various geologic units and the corresponding airborne data, is included also.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venkata, Manjunath Gorentla; Aderholdt, William F
The pre-exascale systems are expected to have a significant amount of hierarchical and heterogeneous on-node memory, and this trend of system architecture in extreme-scale systems is expected to continue into the exascale era. Along with hierarchical-heterogeneous memory, the system typically has a high-performing network and a compute accelerator. This system architecture is not only effective for running traditional High Performance Computing (HPC) applications (Big-Compute), but also for running data-intensive HPC applications and Big-Data applications. As a consequence, there is a growing desire to have a single system serve the needs of both Big-Compute and Big-Data applications. Though the system architecture supports the convergence of Big-Compute and Big-Data, the programming models and software layer have yet to evolve to support either hierarchical-heterogeneous memory systems or the convergence. This work presents a programming abstraction to address this problem. The programming abstraction is implemented as a software library and runs on pre-exascale and exascale systems supporting current and emerging system architectures. Using distributed data structures as a central concept, it provides (1) a simple, usable, and portable abstraction for hierarchical-heterogeneous memory and (2) a unified programming abstraction for Big-Compute and Big-Data applications.
Are Clinicians Better Than Lay Judges at Recalling Case Details? An Evaluation of Expert Memory.
Webb, Christopher A; Keeley, Jared W; Eakin, Deborah K
2016-04-01
This study examined the role of expertise in clinicians' memory for case details. Clinicians' diagnostic formulations may afford mechanisms for retaining and retrieving information. Experts (N = 41; 47.6% males, 23.8% females; 28.6% did not report gender; age: mean [M] = 54.69) were members of the American Board of Professional Psychologists. Lay judges (N = 156; 25.4% males, 74.1% females; age: M = 18.85) were undergraduates enrolled in general psychology. Three vignettes were presented to each group, creating a 2 (group: expert, lay judge) x 3 (vignettes: simple, complex-coherent, complex-incoherent) mixed factorial design. Recall accuracy for vignette details was the dependent variable. Data analyses used multivariate analyses of variance to detect group differences among multiple continuous variables. Experts recalled more information than lay judges, overall. However, experts also exhibited more false memories for the complex-incoherent case because of their schema-based knowledge. This study supported clinical expertise as beneficial. Nonetheless, negative influences from experts' schema-based knowledge, as exhibited, could adversely affect clinical practices. © 2016 Wiley Periodicals, Inc.
Sun, Christina J.; García, Manuel; Mann, Lilli; Alonzo, Jorge; Eng, Eugenia; Rhodes, Scott D.
2015-01-01
The HOLA intervention was a lay health advisor intervention designed to reduce the disproportionate HIV burden borne by Latino sexual and gender identity minorities (gay, bisexual, and other men who have sex with men, and transgender persons) living in the United States. Process evaluation data were collected for over a year of intervention implementation from 11 trained Latino male and transgender lay health advisors (Navegantes) to document the activities each Navegante conducted to promote condom use and HIV testing among his or her 8 social network members enrolled in the study. Over 13 months, the Navegantes reported conducting 1,820 activities. The most common activity was condom distribution. Navegantes had extensive reach beyond their enrolled social network members, and they engaged in health promotion activities beyond social network members enrolled in the study. There were significant differences between the types of activities conducted by Navegantes depending on who was present. Results suggest that lay health advisor interventions reach a large number of at-risk community members and may benefit populations disproportionately impacted by HIV. PMID:25416309
Spectral mapping of soil organic matter
NASA Technical Reports Server (NTRS)
Kristof, S. J.; Baumgardner, M. F.; Johannsen, C. J.
1974-01-01
Multispectral remote sensing data were examined for use in the mapping of soil organic matter content. Computer-implemented pattern recognition techniques were used to analyze data collected in May 1969 and May 1970 by an airborne multispectral scanner over a 40-km flightline. Two fields within the flightline were selected for intensive study. Approximately 400 surface soil samples from these fields were obtained for organic matter analysis. The analytical data were used as training sets for computer-implemented analysis of the spectral data. It was found that within the geographical limitations included in this study, multispectral data and automatic data processing techniques could be used very effectively to delineate and map surface soils areas containing different levels of soil organic matter.
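As a rough illustration of the supervised, training-set-based classification described above (and not the actual pattern-recognition software used in that study), the sketch below assigns multispectral pixel vectors to organic-matter classes with a nearest-centroid rule learned from labeled training samples; the band values, class labels, and noise level are synthetic assumptions.

    import numpy as np

    # Synthetic "pixels": 4 spectral bands, 3 organic-matter classes (low/medium/high).
    rng = np.random.default_rng(0)
    class_means = np.array([[40, 55, 60, 70],    # low organic matter (brighter soil)
                            [30, 42, 48, 55],    # medium
                            [20, 30, 35, 40]])   # high organic matter (darker soil)
    train_X = np.vstack([m + rng.normal(0, 3, size=(50, 4)) for m in class_means])
    train_y = np.repeat([0, 1, 2], 50)

    # "Training": per-class mean spectra, analogous to training fields with
    # laboratory-measured organic matter used as reference data.
    centroids = np.array([train_X[train_y == c].mean(axis=0) for c in range(3)])

    def classify(pixels):
        # Nearest-centroid assignment for each pixel spectrum.
        d2 = ((pixels[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        return d2.argmin(axis=1)

    test = class_means + rng.normal(0, 3, size=class_means.shape)
    print(classify(test))   # expected: [0 1 2]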
Novel centrifugal technology for measuring sperm concentration in the home.
Schaff, Ulrich Y; Fredriksen, Laura L; Epperson, Jon G; Quebral, Tiffany R; Naab, Sara; Sarno, Mark J; Eisenberg, Michael L; Sommer, Greg J
2017-02-01
To evaluate the analytical performance and usability of the Trak Male Fertility Testing System, a semiquantitative (categorical) device recently US Food and Drug Administration (FDA)-cleared for measuring sperm concentration in the home by untrained users. A three-site clinical trial comparing self-reported lay user results versus reference results obtained by computer-aided semen analysis (CASA). Simulated home use environments at fertility centers and urologist offices. A total of 239 untrained users. None. Sperm concentration results reported from self-testing lay users and laboratory reference method by CASA were evaluated semiquantitatively against the device's clinical cutoffs of 15 M/mL (current World Health Organization cutoff) and 55 M/mL (associated with faster time to pregnancy). Additional reported metrics include assay linearity, precision, limit of detection, and ease-of-use ratings from lay users. Lay users achieved an accuracy (versus the reference) of 93.3% (95% confidence interval [CI] 84.1%-97.4%) for results categorized as ≤15 M/mL, 82.4% (95% CI 73.3%-88.9%) for results categorized as 15-55 M/mL, and 95.5% (95% CI 88.9%-98.2%) for results categorized as >55 M/mL. When measured quantitatively, Trak results had a strong linear correlation with CASA measurements (r = 0.99). The precision and limit of detection studies show that the device has adequate reproducibility and detection range for home use. Subjects generally rated the device as easy to use. The Trak System is an accurate tool for semiquantitatively measuring sperm concentration in the home. The system may enable screening and longitudinal assessment of sperm concentration at home. ClinicalTrials.gov identifier: NCT02475395. Copyright © 2016 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
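As a reminder of how agreement statistics of this kind are typically derived (the counts below are hypothetical placeholders, not taken from the trial), sensitivity and specificity come from a 2x2 cross-tabulation of lay-user categories against the CASA reference, and confidence intervals for the proportions can be computed with the Wilson score method:

    import math

    def wilson_ci(successes, n, z=1.96):
        """Approximate 95% Wilson score interval for a binomial proportion."""
        p = successes / n
        denom = 1 + z**2 / n
        centre = (p + z**2 / (2 * n)) / denom
        half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
        return centre - half, centre + half

    # Hypothetical 2x2 counts for one clinical cutoff (lay-user category vs. CASA).
    tp, fn = 68, 7     # reference-positive samples read as positive / negative
    tn, fp = 150, 14   # reference-negative samples read as negative / positive

    sens, spec = tp / (tp + fn), tn / (tn + fp)
    print(f"sensitivity {sens:.1%}  95% CI {wilson_ci(tp, tp + fn)}")
    print(f"specificity {spec:.1%}  95% CI {wilson_ci(tn, tn + fp)}")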
Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing
2011-01-01
Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779
Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing
2011-04-05
Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.
Visualizing risks in cancer communication: A systematic review of computer-supported visual aids.
Stellamanns, Jan; Ruetters, Dana; Dahal, Keshav; Schillmoeller, Zita; Huebner, Jutta
2017-08-01
Health websites are becoming important sources for cancer information. Lay users, patients and carers seek support for critical decisions, but they are prone to common biases when quantitative information is presented. Graphical representations of risk data can facilitate comprehension, and interactive visualizations are popular. This review summarizes the evidence on computer-supported graphs that present risk data and their effects on various measures. The systematic literature search was conducted in several databases, including MEDLINE, EMBASE and CINAHL. Only studies with a controlled design were included. Relevant publications were carefully selected and critically appraised by two reviewers. Thirteen studies were included. Ten studies evaluated static graphs and three dynamic formats. Most decision scenarios were hypothetical. Static graphs could improve accuracy, comprehension, and behavioural intention. But the results were heterogeneous and inconsistent among the studies. Dynamic formats were not superior to static formats, and in some studies even impaired performance. Static graphs show promising but inconsistent results, while research on dynamic visualizations is scarce and must be interpreted cautiously due to methodical limitations. Well-designed and context-specific static graphs can support web-based cancer risk communication in particular populations. The application of dynamic formats cannot be recommended and needs further research. Copyright © 2017 Elsevier B.V. All rights reserved.
Machine learning: Trends, perspectives, and prospects.
Jordan, M I; Mitchell, T M
2015-07-17
Machine learning addresses the question of how to build computers that improve automatically through experience. It is one of today's most rapidly growing technical fields, lying at the intersection of computer science and statistics, and at the core of artificial intelligence and data science. Recent progress in machine learning has been driven both by the development of new learning algorithms and theory and by the ongoing explosion in the availability of online data and low-cost computation. The adoption of data-intensive machine-learning methods can be found throughout science, technology and commerce, leading to more evidence-based decision-making across many walks of life, including health care, manufacturing, education, financial modeling, policing, and marketing. Copyright © 2015, American Association for the Advancement of Science.
DiCarlo, Abby; Fayorsey, Ruby; Syengo, Masila; Chege, Duncan; Sirengo, Martin; Reidy, William; Otieno, Juliana; Omoto, Jackton; Hawken, Mark P; Abrams, Elaine J
2018-01-10
The recent scale-up of prevention of mother-to-child transmission of HIV (PMTCT) services has rapidly accelerated antiretroviral therapy (ART) uptake among pregnant and postpartum women in sub-Saharan Africa. The Mother and Infant Retention for Health (MIR4Health) study evaluates the impact of a combination intervention administered by trained lay health workers to decrease attrition among HIV-positive women initiating PMTCT services and their infants through 6 months postpartum. This was a qualitative study nested within the MIR4Health trial. MIR4Health was conducted at 10 health facilities in Nyanza, Kenya from September 2013 to September 2015. The trial intervention addressed behavioral, social, and structural barriers to PMTCT retention and included: appointment reminders via text and phone calls, follow-up and tracking for missed clinic visits, PMTCT health education at home visits and during clinic visits, and retention and adherence support and counseling. All interventions were administered by lay health workers. We describe the results of a small qualitative inquiry nested within the trial, which conducted two focus groups to assess the experiences and perceptions of lay health workers administering the interventions. Discussions were recorded and simultaneously transcribed and translated into English. Data were analyzed using a framework analysis approach. Study findings show lay health workers played a critical role supporting mothers in PMTCT services across a range of behavioral, social, and structural domains, including improved communication and contact, health education, peer support, and patient advocacy and assistance. Findings also identified barriers to the uptake and implementation of the interventions, such as concerns about privacy and stigma, and the limitations of the healthcare system including healthcare worker attitudes. Overall, study findings indicate that lay health workers found the interventions to be feasible, acceptable, and well received by clients. Lay health workers played a fundamental role in supporting mothers engaged in PMTCT services and provided valuable feedback on the implementation of PMTCT interventions. Future interventions must include strategies to ensure client privacy, decrease stigma within communities, and address the practical limitations of health systems. This study adds important insight to the growing body of research on lay health worker experiences in HIV and PMTCT care. ClinicalTrials.gov NCT01962220.
Schaal, T P; Arango, J; Wolc, A; Brady, J V; Fulton, J E; Rubinoff, I; Ehr, I J; Persia, M E; O'Sullivan, N P
2016-02-01
Venous blood gas and chemistry reference ranges were determined for commercial Hy-Line W-36 pullets and laying hens utilizing the portable i-STAT®1 analyzer and CG8+ cartridges. A total of 632 samples were analyzed from birds between 4 and 110 wk of age. Reference ranges were established for pullets (4 to 15 wk), first cycle laying hens (20 to 68 wk), and second cycle (post molt) laying hens (70 to 110 wk) for the following traits: sodium (Na mmol/L), potassium (K mmol/L), ionized calcium (iCa mmol/L), glucose (Glu mg/dl), hematocrit (Hct% Packed Cell Volume [PCV]), pH, partial pressure carbon dioxide (PCO2 mm Hg), partial pressure oxygen (PO2 mm Hg), total concentration carbon dioxide (TCO2 mmol/L), bicarbonate (HCO3 mmol/L), base excess (BE mmol/L), oxygen saturation (sO2%), and hemoglobin (Hb g/dl). Data were analyzed using ANOVA to investigate the effect of production status as categorized by bird age. Trait relationships were evaluated by linear correlation and their spectral decomposition. All traits differed significantly among pullets and mature laying hens in both first and second lay cycles. Levels for K, iCa, Hct, pH, TCO2, HCO3, BE, sO2, and Hb differed significantly between first cycle and second cycle laying hens. Many venous blood gas and chemistry parameters were significantly correlated. The first 3 eigenvalues explained ∼2/3 of total variation. The first 2 principal components (PC) explained 51% of the total variation and indicated acid-balance and relationship between blood O2 and CO2. The third PC explained 16% of variation and seems to be related to blood iCa. Establishing reference ranges for pullet and laying hen blood gas and chemistry with the i-STAT®1 handheld unit provides a mechanism to further investigate pullet and layer physiology, evaluate metabolic disturbances, and may potentially serve as a means to select breeder candidates with optimal blood gas or chemistry levels on-farm. © The Author 2015. Published by Oxford University Press on behalf of the Poultry Science Association.
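The eigenvalue and principal-component statements above follow from the standard spectral decomposition of the trait correlation matrix; the snippet below is a generic illustration of that computation on synthetic data (the trait values are made up, not Hy-Line W-36 measurements).

    import numpy as np

    # Synthetic stand-in for the blood gas/chemistry table: 200 birds x 13 traits.
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 3))                  # three underlying factors
    loadings = rng.normal(size=(3, 13))
    traits = latent @ loadings + 0.5 * rng.normal(size=(200, 13))

    # Spectral decomposition of the correlation matrix.
    corr = np.corrcoef(traits, rowvar=False)
    eigvals = np.linalg.eigvalsh(corr)[::-1]            # descending order

    explained = eigvals / eigvals.sum()
    print("fraction of total variation, first 3 PCs:", explained[:3].round(2),
          "cumulative:", explained[:3].sum().round(2))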
Using the FORTH Language to Develop an ICU Data Acquisition System
Goldberg, Arthur; SooHoo, Spencer L.; Koerner, Spencer K.; Chang, Robert S. Y.
1980-01-01
This paper describes a powerful programming tool that should be considered as an alternative to the more conventional programming languages now in use for developing medical computer systems. Forth provides instantaneous response to user commands, rapid program execution and tremendous programming versatility. An operating system and a language in one carefully designed unit, Forth is well suited for developing data acquisition systems and for interfacing computers to other instruments. We present some of the general features of Forth and describe its use in implementing a data collection system for a Respiratory Intensive Care Unit (RICU).
A novel approach to multiple sequence alignment using hadoop data grids.
Sudha Sadasivam, G; Baktavatchalam, G
2010-01-01
Multiple alignment of protein sequences helps to determine evolutionary linkage and to predict molecular structures. The factors to be considered while aligning multiple sequences are speed and accuracy of alignment. Although dynamic programming algorithms produce accurate alignments, they are computation intensive. In this paper we propose a time-efficient approach to sequence alignment that also produces quality alignment. The dynamic nature of the algorithm coupled with the data and computational parallelism of Hadoop data grids improves the accuracy and speed of sequence alignment. The principle of block splitting in Hadoop coupled with its scalability facilitates alignment of very large sequences.
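The dynamic-programming recurrence that approaches like this distribute over Hadoop blocks can be illustrated with the classic pairwise Needleman-Wunsch score; the minimal sketch below shows only that scoring step (the block splitting and job orchestration described in the paper are not reproduced here, and the scoring parameters are arbitrary).

    def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
        """Minimal global-alignment score by dynamic programming (Needleman-Wunsch)."""
        n, m = len(a), len(b)
        score = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            score[i][0] = i * gap
        for j in range(1, m + 1):
            score[0][j] = j * gap
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
        return score[n][m]

    print(needleman_wunsch("GATTACA", "GCATGCU"))

The quadratic table is what makes the method computation-intensive for long sequences, which is the cost the Hadoop-based splitting is meant to amortize.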
The dipole moment surface for hydrogen sulfide H2S
NASA Astrophysics Data System (ADS)
Azzam, Ala'a A. A.; Lodi, Lorenzo; Yurchenko, Sergey N.; Tennyson, Jonathan
2015-08-01
In this work we perform a systematic ab initio study of the dipole moment surface (DMS) of H2S at various levels of theory and of its effect on the intensities of vibration-rotation transitions; H2S intensities are known from the experiment to display anomalies which have so far been difficult to reproduce by theoretical calculations. We use the transition intensities from the HITRAN database of 14 vibrational bands for our comparisons. The intensities of all fundamental bands show strong sensitivity to the ab initio method used for constructing the DMS while hot, overtone and combination bands up to 4000 cm-1 do not. The core-correlation and relativistic effects are found to be important for computed line intensities, for instance affecting the most intense fundamental band (ν2) by about 20%. Our recommended DMS, called ALYT2, is based on the CCSD(T)/aug-cc-pV(6+d)Z level of theory supplemented by a core-correlation/relativistic corrective surface obtained at the CCSD[T]/aug-cc-pCV5Z-DK level. The corresponding computed intensities agree significantly better (to within 10%) with experimental data taken directly from original papers. Worse agreement (differences of about 25%) is found for those HITRAN intensities obtained from fitted effective dipole models, suggesting the presence of underlying problems in those fits.
Evaluating open-source cloud computing solutions for geosciences
NASA Astrophysics Data System (ADS)
Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong
2013-09-01
Many organizations start to adopt cloud computing for better utilizing computing resources by taking advantage of its scalability, cost reduction, and easy-to-access characteristics. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting Geosciences. This paper provides a comprehensive study of three open-source cloud solutions, including OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating the cloud resources as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in central processing unit (CPU), memory and I/O of virtual machines created and managed by different solutions, (2) OpenNebula has the fastest internal network while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies, (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula, and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing intensive applications, and small-scale model simulations without intensive data communication.
Ultraviolet continuum absorption /less than about 1000 A/ above the quiet sun transition region
NASA Technical Reports Server (NTRS)
Doschek, G. A.; Feldman, U.
1982-01-01
Lyman continuum absorption shortward of 912 Å in the quiet sun solar transition region is investigated by combining spectra obtained from the Apollo Telescope Mount experiments on Skylab. The most recent atomic data are used to compute line intensities for lines that fall on both sides of the Lyman limit. Lines of O III, O IV, O V, and S IV are considered. The computed intensity ratios of most lines from O IV, O V, and S IV agree with the experimental ratios to within a factor of 2. However, the discrepancies show no apparent wavelength dependence. From this fact, it is concluded that at least part of the discrepancy between theory and observation for lines of these ions can be accounted for by uncertainties in instrumental calibration and atomic data. However, difficulties remain in reconciling observation and theory, particularly for lines of O III, and one line of S IV. The results of Schmahl and Orrall (1979) are also discussed in terms of newer atomic data.
Efficient Parallel Video Processing Techniques on GPU: From Framework to Implementation
Su, Huayou; Wen, Mei; Wu, Nan; Ren, Ju; Zhang, Chunyuan
2014-01-01
Through reorganizing the execution order and optimizing the data structure, we proposed an efficient parallel framework for the H.264/AVC encoder based on a massively parallel architecture. We implemented the proposed framework with CUDA on NVIDIA's GPU. Not only are the compute-intensive components of the H.264 encoder parallelized, but the control-intensive components, such as CAVLC and the deblocking filter, are also realized effectively. In addition, we proposed serial optimization methods, including the multiresolution multiwindow for motion estimation, a multilevel parallel strategy to enhance the parallelism of intracoding as much as possible, component-based parallel CAVLC, and a direction-priority deblocking filter. More than 96% of the workload of the H.264 encoder is offloaded to the GPU. Experimental results show that the parallel implementation outperforms the serial program by a speedup ratio of 20 times and satisfies the requirement of real-time HD encoding at 30 fps. The loss of PSNR is from 0.14 dB to 0.77 dB when keeping the same bitrate. Through the analysis of the kernels, we found that the speedup ratios of the compute-intensive algorithms are proportional to the computation power of the GPU. However, the performance of the control-intensive parts (CAVLC) is much more related to the memory bandwidth, which gives insight for new architecture design. PMID:24757432
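The relationship between the fraction of work offloaded and the achievable overall speedup can be read through Amdahl's law; the short calculation below is only an illustrative consistency check against the figures quoted above (96% offloaded workload, roughly 20x overall speedup), not an analysis performed in the paper.

    # Amdahl's law: overall speedup S = 1 / ((1 - f) + f / s),
    # where f is the parallelized (offloaded) fraction and s its local speedup.
    f = 0.96

    def overall(s):
        return 1.0 / ((1 - f) + f / s)

    for s in (10, 30, 96, 1000):
        print(f"GPU-side speedup {s:4d}x -> overall {overall(s):5.1f}x")
    # Even with an infinitely fast GPU the remaining 4% serial work caps the overall
    # speedup at 25x, so a measured ~20x overall implies the offloaded kernels run
    # on the order of 100x faster than the serial code.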
Computation of acoustic pressure fields produced in feline brain by high-intensity focused ultrasound
NASA Astrophysics Data System (ADS)
Omidi, Nazanin
In 1975, Dunn et al. (JASA 58:512-514) showed that a simple relation describes the ultrasonic threshold for cavitation-induced changes in the mammalian brain. The thresholds for tissue damage were estimated for a variety of acoustic parameters in exposed feline brain. The goal of this study was to improve the estimates for acoustic pressures and intensities present in vivo during those experimental exposures by estimating them using nonlinear rather than linear theory. In our current project, the acoustic pressure waveforms produced in the brains of anesthetized felines were numerically simulated for a spherically focused, nominally f/1 transducer (focal length = 13 cm) at increasing values of the source pressure at frequencies of 1, 3, and 9 MHz. The corresponding focal intensities were correlated with the experimental data of Dunn et al. The focal pressure waveforms were also computed at the location of the true maximum. For low source pressures, the computed waveforms were the same as those determined using linear theory, and the focal intensities matched experimentally determined values. For higher source pressures, the focal pressure waveforms became increasingly distorted, with the compressional amplitude of the wave becoming greater, and the rarefactional amplitude becoming lower than the values calculated using linear theory. The implications of these results for clinical exposures are discussed.
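For context, focal intensities of the kind referred to above are conventionally related to the focal pressure amplitude through the plane-wave relation (stated here as an orienting assumption; the nonlinear simulations described go beyond it by computing the full distorted waveform),

    I = \frac{p_0^2}{2 \rho c},

where p_0 is the peak pressure amplitude, \rho the tissue density, and c the sound speed; for distorted waveforms the intensity is instead obtained by time-averaging the product p(t)u(t) over an acoustic cycle.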
DOE Office of Scientific and Technical Information (OSTI.GOV)
De, K; Jha, S; Klimentov, A
2016-01-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with supercomputers in the United States, Europe and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the MIRA supercomputer at the Argonne Leadership Computing Facility (ALCF), the supercomputer at the National Research Center Kurchatov Institute, IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and for local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and has been in full production for the ATLAS experiment since September 2015. We will present our current accomplishments with running the PanDA WMS at supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications, such as bioinformatics and astro-particle physics.
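The "light-weight MPI wrapper" idea mentioned above can be pictured as a thin launcher in which each MPI rank on a multi-core worker node runs an independent single-threaded payload; the sketch below (using mpi4py, with a hypothetical payload command) illustrates only the pattern and is not the actual PanDA pilot code.

    # Run with, e.g.:  mpiexec -n 16 python mpi_wrapper.py
    from mpi4py import MPI
    import subprocess
    import sys

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Hypothetical single-threaded payload commands (one Monte Carlo batch each).
    jobs = [f"./simulate_events --seed {i}" for i in range(64)]

    # Static round-robin assignment: rank r runs jobs r, r+size, r+2*size, ...
    for cmd in jobs[rank::size]:
        subprocess.run(cmd.split(), check=False)

    comm.Barrier()
    if rank == 0:
        print("all payloads finished", file=sys.stderr)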
SAVLOC, computer program for automatic control and analysis of X-ray fluorescence experiments
NASA Technical Reports Server (NTRS)
Leonard, R. F.
1977-01-01
A program for a PDP-15 computer is presented which provides for control and analysis of trace element determinations by using X-ray fluorescence. The program simultaneously handles data accumulation for one sample and analysis of data from previous samples. Data accumulation consists of sample changing, timing, and data storage. Analysis requires locating peaks in X-ray spectra, determining the intensities of the peaks, identifying the origins of the peaks, and determining the areal density of the element responsible for each peak. The program may be run in either a manual (supervised) mode or an automatic (unsupervised) mode.
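The peak-location and intensity step described above is the part most easily illustrated with modern tools; the sketch below finds peaks and their integrated intensities in a synthetic X-ray spectrum using SciPy, as a generic stand-in rather than the PDP-15 SAVLOC algorithm itself.

    import numpy as np
    from scipy.signal import find_peaks

    # Synthetic spectrum: two Gaussian fluorescence peaks on a smooth background.
    channels = np.arange(1024)
    spectrum = (200 * np.exp(-0.5 * ((channels - 300) / 8) ** 2)
                + 120 * np.exp(-0.5 * ((channels - 640) / 10) ** 2)
                + 30 + 0.01 * channels)
    spectrum = np.random.default_rng(0).poisson(spectrum).astype(float)

    # Locate peaks and estimate their net intensities above a local linear background.
    peaks, props = find_peaks(spectrum, prominence=50, width=3)
    for p, left, right in zip(peaks, props["left_bases"], props["right_bases"]):
        window = spectrum[left:right]
        background = np.linspace(spectrum[left], spectrum[right - 1], window.size)
        print(f"peak at channel {p}: net counts ~ {(window - background).sum():.0f}")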
Washburn, Lisa T; Cornell, Carol E; Phillips, Martha; Felix, Holly; Traywick, LaVona
2014-09-01
The effect of volunteer lay leaders on availability and sustainability of strength-training programs for older adults has not been well explored. We describe implementation of the StrongWomen strength training program by the Arkansas Cooperative Extension Service, and report on the relationship between delivery approach (agent-led, lay-led, or combination of agent- and lay-led) and program access and sustainability. All state Extension agents (n = 66) were surveyed on program implementation, continuance, and use of lay leaders. Program records were used to identify the number of trained lay leaders. Regression models were used to examine the relationship between delivery approach and group availability. Counties using lay leaders had twice as many groups as counties using only agents. There was a significant, positive relationship between the number of lay leaders and the number of groups. Counties using lay leaders were 8.3 times more likely to have continuing groups compared with counties not using lay leaders. Program continuance was significantly and positively associated with lay leader use. Lay delivery expanded access to strength training programs and increased the likelihood that programs would continue. This approach can be used to increase access to and sustainability of strength training programs, particularly in resource-constrained areas.
Modeling the Proton Radiation Belt With Van Allen Probes Relativistic Electron-Proton Telescope Data
NASA Technical Reports Server (NTRS)
Kanekal, S. G.; Li, X.; Baker, D. N.; Selesnick, R. S.; Hoxie, V. C.
2018-01-01
An empirical model of the proton radiation belt is constructed from data taken during 2013-2017 by the Relativistic Electron-Proton Telescopes on the Van Allen Probes satellites. The model intensity is a function of time, kinetic energy in the range 18-600 megaelectronvolts, equatorial pitch angle, and L shell of proton guiding centers. Data are selected, on the basis of energy deposits in each of the nine silicon detectors, to reduce background caused by hard proton energy spectra at low L. Instrument response functions are computed by Monte Carlo integration, using simulated proton paths through a simplified structural model, to account for energy loss in shielding material for protons outside the nominal field of view. Overlap of energy channels, their wide angular response, and changing satellite orientation require the model dependencies on all three independent variables be determined simultaneously. This is done by least squares minimization with a customized steepest descent algorithm. Model uncertainty accounts for statistical data error and systematic error in the simulated instrument response. A proton energy spectrum is also computed from data taken during the 8 January 2014 solar event, to illustrate methods for the simpler case of an isotropic and homogeneous model distribution. Radiation belt and solar proton results are compared to intensities computed with a simplified, on-axis response that can provide a good approximation under limited circumstances.
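To make the fitting step concrete, the sketch below performs a least-squares fit by steepest descent with a simple backtracking line search, applied to a synthetic power-law proton spectrum. It is a toy one-dimensional stand-in for the model's simultaneous fit over time, energy, pitch angle, and L shell, and it is not the customized algorithm used by the authors.

    import numpy as np

    # Hypothetical proton observations following a power-law spectrum j = A * E**-g.
    rng = np.random.default_rng(1)
    E = np.linspace(20.0, 600.0, 40)                    # MeV
    A_true, g_true = 1.0e6, 2.5
    j = A_true * E**-g_true * np.exp(0.05 * rng.standard_normal(E.size))

    x = np.log(E) - np.log(E).mean()                    # centred log-energy
    z = np.log(j)

    def cost_grad(theta):
        a, b = theta                                    # model: ln j ~ a + b*x (b = -g)
        r = a + b * x - z
        return 0.5 * np.dot(r, r), np.array([r.sum(), np.dot(r, x)])

    theta = np.zeros(2)
    c, grad = cost_grad(theta)
    for _ in range(200):                                # steepest descent with backtracking
        step = 1.0
        while True:
            trial = theta - step * grad
            c_new, g_new = cost_grad(trial)
            if c_new < c or step < 1e-12:
                break
            step *= 0.5
        theta, c, grad = trial, c_new, g_new

    g_fit = -theta[1]
    A_fit = np.exp(theta[0] - theta[1] * np.log(E).mean())
    print(f"fitted A ~ {A_fit:.3g}, gamma ~ {g_fit:.3f}")   # close to 1e6 and 2.5

Centring the log-energy keeps the two parameters nearly decoupled, which is what lets plain steepest descent converge quickly in this toy case.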
Hybrid cloud and cluster computing paradigms for life science applications
2010-01-01
Background: Clouds and MapReduce have shown themselves to be a broadly useful approach to scientific computing especially for parallel data intensive applications. However they have limited applicability to some areas such as data mining because MapReduce has poor performance on problems with an iterative structure present in the linear algebra that underlies much data analysis. Such problems can be run efficiently on clusters using MPI leading to a hybrid cloud and cluster environment. This motivates the design and implementation of an open source Iterative MapReduce system Twister. Results: Comparisons of Amazon, Azure, and traditional Linux and Windows environments on common applications have shown encouraging performance and usability comparisons in several important non iterative cases. These are linked to MPI applications for final stages of the data analysis. Further we have released the open source Twister Iterative MapReduce and benchmarked it against basic MapReduce (Hadoop) and MPI in information retrieval and life sciences applications. Conclusions: The hybrid cloud (MapReduce) and cluster (MPI) approach offers an attractive production environment while Twister promises a uniform programming environment for many Life Sciences applications. Methods: We used commercial clouds Amazon and Azure and the NSF resource FutureGrid to perform detailed comparisons and evaluations of different approaches to data intensive computing. Several applications were developed in MPI, MapReduce and Twister in these different environments. PMID:21210982
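The iterative structure that plain MapReduce handles poorly, and that Twister targets, can be pictured with k-means: every iteration is a map (assign points to the nearest centroid) followed by a reduce (recompute centroids), repeated until convergence. The sketch below is a single-process NumPy illustration of that pattern; the function names are illustrative and are not the Twister API.

    import numpy as np

    rng = np.random.default_rng(0)

    def map_phase(points, centroids):
        """Map step: emit the index of the nearest centroid for every point."""
        d2 = ((points[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        return d2.argmin(axis=1)

    def reduce_phase(points, keys, k):
        """Reduce step: average the points assigned to each centroid."""
        return np.array([points[keys == j].mean(axis=0) if np.any(keys == j)
                         else points[rng.integers(len(points))]
                         for j in range(k)])

    # Toy data: three well-separated clusters in the plane.
    points = np.vstack([rng.normal(c, 0.3, size=(100, 2))
                        for c in ((0.0, 0.0), (3.0, 3.0), (0.0, 3.0))])
    centroids = points[rng.choice(len(points), 3, replace=False)]

    for _ in range(20):          # the iterative loop that plain MapReduce re-launches each pass
        keys = map_phase(points, centroids)
        centroids = reduce_phase(points, keys, 3)

    print(np.round(centroids, 2))   # approximately the three cluster centres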
Modeling the Proton Radiation Belt With Van Allen Probes Relativistic Electron-Proton Telescope Data
NASA Astrophysics Data System (ADS)
Selesnick, R. S.; Baker, D. N.; Kanekal, S. G.; Hoxie, V. C.; Li, X.
2018-01-01
An empirical model of the proton radiation belt is constructed from data taken during 2013-2017 by the Relativistic Electron-Proton Telescopes on the Van Allen Probes satellites. The model intensity is a function of time, kinetic energy in the range 18-600 MeV, equatorial pitch angle, and L shell of proton guiding centers. Data are selected, on the basis of energy deposits in each of the nine silicon detectors, to reduce background caused by hard proton energy spectra at low L. Instrument response functions are computed by Monte Carlo integration, using simulated proton paths through a simplified structural model, to account for energy loss in shielding material for protons outside the nominal field of view. Overlap of energy channels, their wide angular response, and changing satellite orientation require the model dependencies on all three independent variables be determined simultaneously. This is done by least squares minimization with a customized steepest descent algorithm. Model uncertainty accounts for statistical data error and systematic error in the simulated instrument response. A proton energy spectrum is also computed from data taken during the 8 January 2014 solar event, to illustrate methods for the simpler case of an isotropic and homogeneous model distribution. Radiation belt and solar proton results are compared to intensities computed with a simplified, on-axis response that can provide a good approximation under limited circumstances.
Hybrid cloud and cluster computing paradigms for life science applications.
Qiu, Judy; Ekanayake, Jaliya; Gunarathne, Thilina; Choi, Jong Youl; Bae, Seung-Hee; Li, Hui; Zhang, Bingjing; Wu, Tak-Lon; Ruan, Yang; Ekanayake, Saliya; Hughes, Adam; Fox, Geoffrey
2010-12-21
Clouds and MapReduce have shown themselves to be a broadly useful approach to scientific computing especially for parallel data intensive applications. However they have limited applicability to some areas such as data mining because MapReduce has poor performance on problems with an iterative structure present in the linear algebra that underlies much data analysis. Such problems can be run efficiently on clusters using MPI leading to a hybrid cloud and cluster environment. This motivates the design and implementation of an open source Iterative MapReduce system Twister. Comparisons of Amazon, Azure, and traditional Linux and Windows environments on common applications have shown encouraging performance and usability comparisons in several important non iterative cases. These are linked to MPI applications for final stages of the data analysis. Further we have released the open source Twister Iterative MapReduce and benchmarked it against basic MapReduce (Hadoop) and MPI in information retrieval and life sciences applications. The hybrid cloud (MapReduce) and cluster (MPI) approach offers an attractive production environment while Twister promises a uniform programming environment for many Life Sciences applications. We used commercial clouds Amazon and Azure and the NSF resource FutureGrid to perform detailed comparisons and evaluations of different approaches to data intensive computing. Several applications were developed in MPI, MapReduce and Twister in these different environments.
Electromagnetic Pulses Generated From Laser Target Interactions at Shenguang II Laser Facility
NASA Astrophysics Data System (ADS)
Yang, Jinwen; Li, Tingshuai; Yi, Tao; Wang, Chuanke; Yang, Ming; Yang, Weiming; Liu, Shenye; Jiang, Shaoen; Ding, Yongkun
2016-10-01
Significant electromagnetic pulses (EMP) can be generated by intense laser irradiation of solid targets in inertial confinement fusion (ICF). To evaluate the EMP intensity and distribution in and outside the laser chamber, we designed and fabricated a discone antenna with an ultra-wide band of over 10 GHz. The return loss (S11 parameter) of this antenna was below -10 dB and reached below -30 dB at 3.1 GHz. The EMP intensity in this study reached 400 kV/m and 2000 kV/m at 80 cm and 40 cm away from the target chamber center (TCC), respectively. The current results are expected to offer preliminary information for studying the physics of laser-plasma interactions and will also lay the experimental foundation for EMI shielding design to protect various diagnostics. Supported by the Fundamental Research Funds for the Central Universities of China (No. ZYGX2015J108) and National Natural Science Foundation of China (Nos. 11575166 and 51581140)
NASA Technical Reports Server (NTRS)
Smith, C. W.; Bhateley, I. C.
1976-01-01
Two techniques for extending the range of applicability of the basic vortex-lattice method are discussed. The first improves the computation of aerodynamic forces on thin, low-aspect-ratio wings of arbitrary planforms at subsonic Mach numbers by including the effects of leading-edge and tip vortex separation, characteristic of this type of wing, through use of the well-known suction-analogy method of E. C. Polhamus. Comparisons with experimental data for a variety of planforms are presented. The second consists of the use of the vortex-lattice method to predict pressure distributions over thick multi-element wings (wings with leading- and trailing-edge devices). A method of laying out the lattice is described which gives accurate pressures on the top and part of the bottom surface of the wing. Limited comparisons between the results predicted by this method, the conventional lattice arrangement method, experimental data, and 2-D potential flow analysis techniques are presented.
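For reference, the suction-analogy lift estimate referred to above combines a potential-flow term and a vortex-lift term; in Polhamus's form,

    C_L = K_p \sin\alpha \cos^2\alpha + K_v \sin^2\alpha \cos\alpha,

where K_p and K_v are the potential-flow and vortex-lift constants, which in an approach of the kind described would be obtained from the vortex-lattice solution for the given planform.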
Simulation Based Exploration of Critical Zone Dynamics in Intensively Managed Landscapes
NASA Astrophysics Data System (ADS)
Kumar, P.
2017-12-01
The advent of high-resolution measurements of topographic and (vertical) vegetation features using aerial LiDAR is enabling us to resolve micro-scale (~1 m) landscape structural characteristics over large areas. Availability of hyperspectral measurements is further augmenting these LiDAR data by enabling the biogeochemical characterization of vegetation and soils at unprecedented spatial resolutions (~1-10 m). Such data have opened up novel opportunities for modeling Critical Zone processes and exploring questions that were not possible before. We show how an integrated 3-D model at 1 m grid resolution can enable us to resolve micro-topographic and ecological dynamics and their control on hydrologic and biogeochemical processes over large areas. We address the computational challenge of such detailed modeling by exploiting hybrid CPU and GPU computing technologies. We show results of moisture, biogeochemical, and vegetation dynamics from studies in the Critical Zone Observatory for Intensively Managed Landscapes (IMLCZO) in the Midwestern United States.
A world-wide databridge supported by a commercial cloud provider
NASA Astrophysics Data System (ADS)
Tat Cheung, Kwong; Field, Laurence; Furano, Fabrizio
2017-10-01
Volunteer computing has the potential to provide significant additional computing capacity for the LHC experiments. One of the challenges with exploiting volunteer computing is to support a global community of volunteers that provides heterogeneous resources. However, high energy physics applications require more data input and output than the CPU-intensive applications that are typically used by other volunteer computing projects. While the so-called databridge has already been successfully proposed as a method to span the untrusted and trusted domains of volunteer computing and Grid computing respectively, globally transferring data between potentially poor-performing residential networks and CERN could be unreliable, leading to wasted resource usage. The expectation is that by placing a storage endpoint that is part of a wider, flexible geographical databridge deployment closer to the volunteers, the transfer success rate and the overall performance can be improved. This contribution investigates the provision of a globally distributed databridge implemented upon a commercial cloud provider.
Don't put all your eggs in one nest: spread them and cut time at risk.
Andersson, Malte; Åhlund, Matti
2012-09-01
In many egg-laying animals, some females spread their clutch among several nests. The fitness effects of this reproductive tactic are obscure. Using mathematical modeling and field observations, we analyze an unexplored benefit of egg spreading in brood parasitic and other breeding systems: reduced time at risk for offspring. If a clutch takes many days to lay until incubation and embryo development starts after the last egg, by spreading her eggs a parasitic female can reduce offspring time in the vulnerable nest at risk of predation or other destruction. The model suggests that she can achieve much of this benefit by spreading her eggs among a few nests, even if her total clutch is large. Field data from goldeneye ducks Bucephala clangula show that egg spreading enables a fecund female to lay a clutch that is much larger than average without increasing offspring time at risk in a nest. This advantage increases with female condition (fecundity) and can markedly raise female reproductive success. These results help explain the puzzle of nesting parasites in some precocial birds, which lay eggs in the nests of other females before laying eggs in their own nest. Risk reduction by egg spreading may also play a role in the evolution of other breeding systems and taxa, for instance polyandry with male parental care in some birds and fishes.
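A minimal back-of-the-envelope version of the time-at-risk argument, offered only as an illustration and much simpler than the authors' model, assumes one egg laid per day and incubation starting in each nest once its share of eggs is complete. If a clutch of c eggs goes into a single nest, the egg laid on day i waits c - i days before incubation begins, so the summed pre-incubation exposure is

    \sum_{i=1}^{c} (c - i) = \frac{c(c-1)}{2}.

Splitting the same clutch evenly among k nests, each receiving c/k consecutively laid eggs, gives

    k \cdot \frac{(c/k)\,(c/k - 1)}{2} = \frac{c\,(c/k - 1)}{2},

so most of the reduction in pre-incubation exposure is already gained after the first few splits, consistent with the statement above that a few nests capture much of the benefit.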
Germanium Plasmon Enhanced Resonators for Label-Free Terahertz Protein Sensing
NASA Astrophysics Data System (ADS)
Bettenhausen, Maximilian; Römer, Friedhard; Witzigmann, Bernd; Flesch, Julia; Kurre, Rainer; Korneev, Sergej; Piehler, Jacob; You, Changjiang; Kazmierczak, Marcin; Guha, Subhajit; Capellini, Giovanni; Schröder, Thomas
2018-03-01
A terahertz protein sensing concept based on subwavelength Ge resonators is presented. Ge bowtie resonators, compatible with CMOS fabrication technology, have been designed and characterized with a resonance frequency of 0.5 THz and a calculated local intensity enhancement of 10,000. Selective biofunctionalization of Ge resonators on Si wafer was achieved in one step using lipoic acid-HaloTag ligand (LA-HTL) for biofunctionalization and passivation. The results lay the foundation for future investigation of protein tertiary structure and the dynamics of the protein hydration shell in response to protein conformation changes.
How Digital Image Processing Became Really Easy
NASA Astrophysics Data System (ADS)
Cannon, Michael
1988-02-01
In the early and mid-1970s, digital image processing was the subject of intense university and corporate research. The research lay along two lines: (1) developing mathematical techniques for improving the appearance of or analyzing the contents of images represented in digital form, and (2) creating cost-effective hardware to carry out these techniques. The research has been very effective, as evidenced by the continued decline of image processing as a research topic, and the rapid increase of commercial companies to market digital image processing software and hardware.
Ahmad, Aftab; Khan, Vikram; Badola, Smita; Arya, Gaurav; Bansal, Nayanci; Saxena, A. K.
2010-01-01
The prevalence, intensities of infestation, range of infestation and population composition of two phthirapteran species, Ardeicola expallidus Blagoveshtchensky (Phthiraptera: Philopteridae) and Ciconiphilus decimfasciatus Boisduval and Lacordaire (Menoponidae) on seventy cattle egrets were recorded during August 2004 to March 2005, in India. The frequency distribution patterns of both the species were skewed but did not correspond to the negative binomial model. The oviposition sites, egg laying patterns and the nature of the eggs of the two species were markedly different. PMID:21067416
Hogg, Christine; Williamson, Charlotte
2008-01-01
Increasingly, lay people are appointed as members to health service committees. The term ‘lay’ is used loosely and the reasons for involving lay people are seldom clearly defined. This paper argues that the different roles that lay people play need to be explicitly defined in order for their contributions to be realized. Although lay members of health service committees are generally assumed to be working for patients’ interests, our observations lead us to think that some lay people tend to support professionals’ or managers’ interests rather than patients’ interests as patients would define them. We suggest that lay people fall into three broad categories: supporters of dominant (professional) interests, supporters of challenging (managerial) interests and supporters of repressed (patient) interests. These alignments should be taken into account in appointments to health service bodies. Further research is needed on the alignments and roles of lay members. PMID:11286594
NASA Astrophysics Data System (ADS)
Englander, J. G.; Austin, A. T.; Brandt, A. R.
2016-12-01
The need to quantify flaring by oil and gas fields is receiving more scrutiny, as there has been scientific and regulatory interest in quantifying the greenhouse gas (GHG) impact of oil and gas production. The National Oceanic and Atmospheric Administration (NOAA) has developed a method to track flaring activity using the Visible Infrared Imaging Radiometer Suite (VIIRS) satellite instrument.[1] This method reports data on the average size, power, and light intensity of each flare. However, outside of some small studies, flaring intensity has generally been estimated at the country level.[2] While informative, country-level assessments cannot provide guidance about the sustainability of particular crude streams or products produced. In this work we generate detailed oil-field-level flaring intensities for a number of global oilfield operations. We do this by merging the VIIRS dataset with global oilfield atlases and other spatial data sources. Joining these datasets together with production data allows us to provide better estimates for the GHG intensity of flaring at the field level for these countries.[3] First, we compute flaring intensities at the field level for 75 global oil fields representing approximately 25% of global production. In addition, we examine in detail three oil producing countries known to have high rates of flaring: Egypt, Nigeria, and Venezuela. For these countries we compute the flaring rate for all fields in the country and explore within- and between-country variation. The countries' fields are analyzed to determine the correlation of flare activity with field type, crude type, region, or production method. [1] Cao, C. "Visible Infrared Imaging Radiometer Suite (VIIRS)." NOAA NPP VIIRS. NOAA, 2013. Web. 30 July 2016. [2] Elvidge, C. D. et al., "A Fifteen Year Record of Global Natural Gas Flaring Derived from Satellite Data," Energies, vol. 2, no. 3, pp. 595-622, Aug. 2009. [3] World Energy Atlas. 6th ed. London: Petroleum Economist, 2011. Print.
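The field-level attribution step described above amounts to a spatial join of flare detections onto oil-field polygons followed by aggregation against production; the hedged GeoPandas sketch below illustrates that workflow, with file names, column names, and the radiant-heat-per-barrel metric all hypothetical placeholders rather than the authors' actual pipeline.

    import geopandas as gpd
    import pandas as pd

    # Hypothetical inputs: VIIRS flare detections (points with radiant heat in MW)
    # and oil-field outlines (polygons with a field_name attribute).
    flares = gpd.read_file("viirs_flares_2015.geojson")          # point geometries
    fields = gpd.read_file("global_oilfield_atlas.geojson")      # polygon geometries
    production = pd.read_csv("field_production_bbl.csv")         # field_name, annual_bbl

    # Attribute each flare to the field polygon that contains it.
    joined = gpd.sjoin(flares, fields[["field_name", "geometry"]], predicate="within")

    # Aggregate flare radiant heat per field and normalize by production.
    per_field = (joined.groupby("field_name")["radiant_heat_mw"].sum()
                 .to_frame("flare_mw")
                 .join(production.set_index("field_name")))
    per_field["flaring_intensity"] = per_field["flare_mw"] / per_field["annual_bbl"]
    print(per_field.sort_values("flaring_intensity", ascending=False).head())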
NASA Astrophysics Data System (ADS)
Stone, S.; Parker, M. S.; Howe, B.; Lazowska, E.
2015-12-01
Rapid advances in technology are transforming nearly every field from "data-poor" to "data-rich." The ability to extract knowledge from this abundance of data is the cornerstone of 21st century discovery. At the University of Washington eScience Institute, our mission is to engage researchers across disciplines in developing and applying advanced computational methods and tools to real world problems in data-intensive discovery. Our research team consists of individuals with diverse backgrounds in domain sciences such as astronomy, oceanography and geology, with complementary expertise in advanced statistical and computational techniques such as data management, visualization, and machine learning. Two key elements are necessary to foster careers in data science: individuals with cross-disciplinary training in both method and domain sciences, and career paths emphasizing alternative metrics for advancement. We see persistent and deep-rooted challenges for the career paths of people whose skills, activities and work patterns don't fit neatly into the traditional roles and success metrics of academia. To address these challenges the eScience Institute has developed training programs and established new career opportunities for data-intensive research in academia. Our graduate students and post-docs have mentors in both a methodology and an application field. They also participate in coursework and tutorials to advance technical skill and foster community. Professional Data Scientist positions were created to support research independence while encouraging the development and adoption of domain-specific tools and techniques. The eScience Institute also supports the appointment of faculty who are innovators in developing and applying data science methodologies to advance their field of discovery. Our ultimate goal is to create a supportive environment for data science in academia and to establish global recognition for data-intensive discovery across all fields.
CIM for 300-mm semiconductor fab
NASA Astrophysics Data System (ADS)
Luk, Arthur
1997-08-01
Five years ago, factory automation (F/A) was not prevalent in the fab. Today, facing a drastically changed market and intense competition, management requests that plant-floor data be forwarded to their desktop computers. This growing demand has rapidly pushed F/A toward computer-integrated manufacturing (CIM). Through personalization, computer size was reduced to the point that computers fit on our desktops, and the PC ushered in a new era of computing. With the advent of the network, the network computer (NC) creates fresh problems for us. As we plan to invest more than $3 billion to build a new 300 mm fab, the next-generation technology raises the bar.
Sensory characteristics and consumer preference for chicken meat in Guinea.
Sow, T M A; Grongnet, J F
2010-10-01
This study identified the sensory characteristics and consumer preference for chicken meat in Guinea. Five chicken samples [live village chicken, live broiler, live spent laying hen, ready-to-cook broiler, and ready-to-cook broiler (imported)] bought from different locations were assessed by 10 trained panelists using 19 sensory attributes. The ANOVA results showed that 3 chicken appearance attributes (brown, yellow, and white), 5 chicken odor attributes (oily, intense, medicine smell, roasted, and mouth persistent), 3 chicken flavor attributes (sweet, bitter, and astringent), and 8 chicken texture attributes (firm, tender, juicy, chew, smooth, springy, hard, and fibrous) significantly discriminated between the chicken samples (P < 0.05). Principal component analysis of the sensory data showed that the first 2 principal components explained 84% of the sensory data variance. The principal component analysis results showed that the live village chicken, the live spent laying hen, and the ready-to-cook broiler (imported) were very well represented and clearly distinguished from the live broiler and the ready-to-cook broiler. One hundred twenty consumers expressed their preferences for the chicken samples using a 5-point Likert scale. The hierarchical cluster analysis of the preference data identified 4 homogeneous consumer clusters. The hierarchical cluster analysis results showed that the live village chicken was the most preferred chicken sample, whereas the ready-to-cook broiler was the least preferred one. The partial least squares regression (PLSR) type 1 showed that the first 2 principal components, accounting for 72% of the sensory data, explained 83% of the chicken preference. The PLSR1 identified the sensory characteristics juicy, oily, sweet, hard, mouth persistent, and yellow as the most relevant sensory drivers of Guinean chicken preference. The PLSR2 (with multiple responses) identified the relationship between the chicken samples, their sensory attributes, and the consumer clusters. Our results showed that no chicken category was exclusively preferred over the other chicken samples, which highlights room in the local market for the development of all chicken categories.
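The analysis chain described above (PCA of the sensory profile, then PLS regression of preference on sensory attributes) can be sketched with scikit-learn as follows; the arrays are random stand-ins, not the study data.

```python
# Illustrative sketch of the PCA + PLSR workflow; data are random placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
sensory = rng.normal(size=(5, 19))     # 5 chicken samples x 19 sensory attributes
preference = rng.normal(size=(5, 1))   # mean consumer preference per sample

# Principal component analysis of the sensory profile.
pca = PCA(n_components=2).fit(sensory)
print("variance explained by PC1+PC2:", pca.explained_variance_ratio_.sum())

# PLS regression of preference on the sensory attributes.
pls = PLSRegression(n_components=2).fit(sensory, preference)
print("R^2 of preference on sensory attributes:", pls.score(sensory, preference))
```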
Fatigue Crack Growth Rate and Stress-Intensity Factor Corrections for Out-of-Plane Crack Growth
NASA Technical Reports Server (NTRS)
Forth, Scott C.; Herman, Dave J.; James, Mark A.
2003-01-01
Fatigue crack growth rate testing is performed by automated data collection systems that assume straight crack growth in the plane of symmetry and use standard polynomial solutions to compute crack length and stress-intensity factors from compliance or potential drop measurements. Visual measurements used to correct the collected data typically include only the horizontal crack length, which for cracks that propagate out-of-plane, under-estimates the crack growth rates and over-estimates the stress-intensity factors. The authors have devised an approach for correcting both the crack growth rates and stress-intensity factors based on two-dimensional mixed mode-I/II finite element analysis (FEA). The approach is used to correct out-of-plane data for 7050-T7451 and 2025-T6 aluminum alloys. Results indicate the correction process works well for high ΔK levels but fails to capture the mixed-mode effects at ΔK levels approaching threshold (da/dN approximately 10^-10 m/cycle).
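For orientation, one frequently quoted way of collapsing the mode-I and mode-II contributions into a single driving force is the effective stress-intensity range below; this is a common textbook combination and not necessarily the specific correction derived from the authors' finite element analysis.

```latex
% A common mixed-mode combination (illustrative; not necessarily the authors' correction)
\[
  \Delta K_{\mathrm{eff}} = \sqrt{\Delta K_{I}^{2} + \Delta K_{II}^{2}}
\]
```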
Vereecken, C; Covents, M; Maes, L; Moyson, T
2014-01-01
The increased availability of computers and the efficiency and user-acceptability of computer-assisted questioning have increased the attractiveness of computer-administered querying for large-scale population nutrition research during the last decade. The Young Adolescents' Nutrition Assessment on Computer (YANA-C), a computer-based 24-h dietary recall, was originally developed to collect dietary data among Belgian-Flemish adolescents. A web-based version was created to collect parentally reported dietary data of preschoolers, called Young Children's Nutrition Assessment on the Web (YCNA-W), which has been improved and adapted for use in young adolescents: Children and Adolescents' Nutrition Assessment and Advice on the Web (CANAA-W). The present study describes recent developments and the formative evaluation of the dietary assessment component. A feasibility questionnaire was completed by 131 children [mean (SD) age: 11.3 (0.7) years] and 53 parents. Eight focus groups were held with children (n = 65) and three with parents (n = 17). Children (C) and parents (P) found the instrument clear (C: 97%; P: 94%), comprehensible (C: 92%; P: 100%), attractive (C: 84%; P: 85%), fun (C: 93%; P: 83%) and easy to complete (C: 91%; P: 83%). There was ample explanation (C: 95%; P: 94%); the pictures were clear (C: 97%; P: 96%); and most respondents found the food items easy to find (C: 71%, P: 85%). The results helped to refine the layout and structure of the instrument and the list of food items included. Children and parents were enthusiastic. The major challenge will be to convince parents who are less interested in dietary intake and less computer literate to participate in this type of study. Children in this age group (11-12 years) should complete the instrument with assistance from an adult. © 2013 The Authors Journal of Human Nutrition and Dietetics © 2013 The British Dietetic Association Ltd.
Big Data Ecosystems Enable Scientific Discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Critchlow, Terence J.; Kleese van Dam, Kerstin
Over the past 5 years, advances in experimental, sensor and computational technologies have driven the exponential growth in the volumes, acquisition rates, variety and complexity of scientific data. As noted by Hey et al. in their 2009 e-book The Fourth Paradigm, this availability of large quantities of scientifically meaningful data has given rise to a new scientific methodology - data intensive science. Data intensive science is the ability to formulate and evaluate hypotheses using data and analysis to extend, complement and, at times, replace experimentation, theory, or simulation. This new approach to science no longer requires scientists to interact directly with the objects of their research; instead they can utilize digitally captured, reduced, calibrated, analyzed, synthesized and visualized results - allowing them to carry out 'experiments' in data.
Thermal-neutron capture gamma-rays. Volume 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuli, J.K.
1997-05-01
The energy and photon intensity of gamma rays as seen in thermal-neutron capture are presented ordered by Z, A of target nuclei. All gamma rays with intensity ≥2% of the strongest transition are included. The strongest transition is indicated in each case. Where the target nuclide mass number is indicated as nat, the natural target was used. The gamma energies given are in keV. The gamma intensities given are relative to 100 for the strongest transition. All data for A > 44 are taken from the Evaluated Nuclear Structure Data File (4/97), a computer file of evaluated nuclear structure data maintained by the National Nuclear Data Center, Brookhaven National Laboratory, on behalf of the Nuclear Structure and Decay Data network, coordinated by the International Atomic Energy Agency, Vienna. These data are published in Nuclear Data Sheets, Academic Press, San Diego, CA. The data for A ≤ 44 are taken from "Prompt Gamma Rays from Thermal-Neutron Capture," M.A. Lone, R.A. Leavitt, D.A. Harrison, Atomic Data and Nuclear Data Tables 26, 511 (1981).
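The intensity convention described above (relative to 100 for the strongest transition, with a 2% cutoff) amounts to a simple normalization; a toy sketch with made-up lines:

```python
# Toy sketch of the relative-intensity convention; energies/intensities are made up.
lines = {661.7: 12.0, 1173.2: 85.0, 1332.5: 99.9, 511.0: 1.2}  # keV -> raw intensity

strongest = max(lines.values())
# Scale so the strongest transition is 100 and drop lines below 2% of it.
relative = {e: 100.0 * i / strongest for e, i in lines.items() if i / strongest >= 0.02}

for energy, rel in sorted(relative.items()):
    print(f"{energy:7.1f} keV  I_rel = {rel:5.1f}")
```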
Not Scotch, but Rum: The Scope and Diffusion of the Scottish Presence in the Published Record
ERIC Educational Resources Information Center
Lavoie, Brian
2013-01-01
Big data sets and powerful computing capacity have transformed scholarly inquiry across many disciplines. While the impact of data-intensive research methodologies is perhaps most distinct in the natural and social sciences, the humanities have also benefited from these new analytical tools. While full-text data is necessary to study topics such…
Free-field propagation of high intensity noise
NASA Technical Reports Server (NTRS)
Welz, Joseph P.; Mcdaniel, Oliver H.
1990-01-01
Observed spectral data from supersonic jet aircraft are known to contain much more high frequency energy than can be explained by linear acoustic propagation theory. It is believed that the high frequency energy is an effect of nonlinear distortion due to the extremely high acoustic levels generated by the jet engines. The objective, to measure acoustic waveform distortion for spherically diverging high intensity noise, was reached by using an electropneumatic acoustic source capable of generating sound pressure levels in the range of 140 to 160 decibels (re 20 micro Pa). The noise spectrum was shaped to represent the spectra generated by jet engines. Two microphones were used to capture the acoustic pressure waveform at different points along the propagation path in order to provide a direct measure of the waveform distortion as well as spectral distortion. A secondary objective was to determine that the observed distortion is an acoustic effect. To do this an existing computer prediction code that deals with nonlinear acoustic propagation was used on data representative of the measured data. The results clearly demonstrate that high intensity jet noise does shift the energy in the spectrum to the higher frequencies along the propagation path. In addition, the data from the computer model are in good agreement with the measurements, thus demonstrating that the waveform distortion can be accounted for with nonlinear acoustic theory.
Implementing biological logic gates using gold nanoparticles conjugated to fluorophores
NASA Astrophysics Data System (ADS)
Barnoy, Eran A.; Popovtzer, Rachela; Fixler, Dror
2018-02-01
We describe recent research in which we explored biologically relevant logic gates using gold nanoparticles (GNPs) conjugated to fluorophores and tracing the results remotely by time-domain fluorescence lifetime imaging microscopy (FLIM). GNPs have a well-known effect on nearby fluorophores in terms of their fluorescence intensity (FI - increase or decrease) as well as fluorescence lifetime (FLT). We have designed a few bio-switch systems in which the FLIM-detected fluorescence varies after biologically relevant stimulation. Some of our tools include fluorescein diacetate (FDA) which can be activated by either esterases or pH, peptide chains cleavable by caspase 3, and the polymer polyacrylic acid which varies in size based on surrounding pH. After conjugating GNPs to chosen fluorophores, we have successfully demonstrated the logic gates of NOT, AND, OR, NAND, NOR, and XOR by imaging different stages of activation. These logic gates have been demonstrated both in solutions as well as within cultured cells, thereby possibly opening the door for nanoparticulate in vivo smart detection. While these initial probes are mainly tools for intelligent detection systems, they lay the foundation for logic gates functioning in conjunction so as to lead to a form of in vivo biological computing, where the system would be able to release proper treatment options in specific situations without external influence.
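As a purely conceptual illustration of the gate logic demonstrated optically in this work (not of the GNP-fluorophore chemistry), the two-input gates can be tabulated as follows; the boolean "stimuli" are hypothetical stand-ins for inputs such as esterase activity or pH.

```python
# Conceptual truth tables for the two-input gates named above (NOT, a one-input
# gate, is omitted). Inputs model abstract biochemical stimuli, not the FLIM readout.
GATES = {
    "AND":  lambda a, b: a and b,
    "OR":   lambda a, b: a or b,
    "NAND": lambda a, b: not (a and b),
    "NOR":  lambda a, b: not (a or b),
    "XOR":  lambda a, b: a != b,
}

for name, gate in GATES.items():
    table = {(a, b): gate(a, b) for a in (False, True) for b in (False, True)}
    print(name, table)
```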
SOME RARE-EARTH ALLOY SYSTEMS. I. La-Gd, La-Y, Gd-Y
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spedding, F.H.; Valletta, R.M.; Daane, A.H.
The La-Y, La-Gd, and Gd-Y alloy systems were examined by conventional metallurgical research techniques. As would be expected from the similarity of the parent metals, the Gd-Y system exhibits complete solid solubility across the system in both the alpha and beta regions, with nearly perfect behavior indicated by the essentially linear plots of lattice constants and other related data. The La-Y and La-Gd systems show complete solid solubility in the high-temperature bcc region, with limited solubility in the room-temperature forms. In the central region of these two systems at room temperature, an ordered phase with the samarium structure is observed. Some correlation of the structure and lattice constants of this phase with the properties of the related pure metals is observed.
Impact of the lay-off length on +Gz tolerance.
Mikuliszyn, Romuald; Kowalski, Wieslaw; Kowalczuk, Krzysztof
2002-07-01
There are many factors affecting pilots' +Gz tolerance. Recently, the attention of the aviation community has been focused on lay-off and its impact on +Gz tolerance. Pilots of the Polish Air Force (PAF) have dealt with that problem for several years now. The aim of the study was to provide insight into how lay-off periods of different duration affect +Gz tolerance. 95 male jet pilots from the PAF participated in the study. Each had a lay-off period of at least two weeks (for non-medical reasons). Subjects were divided into four groups according to the length of the lay-off period (2-4 weeks; 5-13 weeks; 14-26 weeks; 27-154 weeks). All pilots were subjected to a centrifuge exposure in GOR (0.1 G/s) or ROR (1.0 G/s) profiles, depending on the pre-lay-off exposure. Post-lay-off exposures were carried out directly after lay-off. 18 jet pilots without any lay-off constituted the control group. The difference between pre- and post-lay-off G-tolerance limits (-0.93 +/- 0.53) was statistically significant (p < 0.01) only for one group, where the lay-off period ranged between two and four weeks. No statistically significant differences were found when the influence of other factors such as total and yearly flight hours, heart rate gain (ΔHR), or physical activity measured as maximal oxygen intake was considered. A lay-off period of 2-4 weeks decreases +Gz tolerance in a statistically significant manner. A subsequent increase of the lay-off period does not result in mean tolerance changes for the group; however, in certain individuals a critical decrement of +Gz tolerance occurs. Total and last-year flying hours and physical fitness do not modify the impact of the lay-off period on +Gz tolerance.
WPS mediation: An approach to process geospatial data on different computing backends
NASA Astrophysics Data System (ADS)
Giuliani, Gregory; Nativi, Stefano; Lehmann, Anthony; Ray, Nicolas
2012-10-01
The OGC Web Processing Service (WPS) specification allows generating information by processing distributed geospatial data made available through Spatial Data Infrastructures (SDIs). However, current SDIs have limited analytical capacities and various problems emerge when trying to use them in data- and computing-intensive domains such as environmental sciences. These problems are usually not, or only partially, solvable using single computing resources. Therefore, the Geographic Information (GI) community is trying to benefit from the superior storage and computing capabilities offered by distributed computing methods and technologies (e.g., Grids, Clouds). Currently, there is no commonly agreed approach to grid-enable WPS. No implementation allows one to seamlessly execute a geoprocessing calculation following user requirements on different computing backends, ranging from a stand-alone GIS server up to computer clusters and large Grid infrastructures. Considering this issue, this paper presents a proof of concept by mediating different geospatial and Grid software packages, and by proposing an extension of the WPS specification through two optional parameters. The applicability of this approach is demonstrated using a Normalized Difference Vegetation Index (NDVI) mediated WPS process, highlighting benefits and issues that need to be further investigated to improve performance.
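The NDVI used in the example process is the standard band ratio; a minimal numpy sketch (with synthetic band values, not an actual WPS invocation) is:

```python
# Minimal NDVI computation: (NIR - Red) / (NIR + Red). Band arrays are synthetic.
import numpy as np

red = np.array([[0.10, 0.20], [0.30, 0.25]])
nir = np.array([[0.50, 0.60], [0.40, 0.55]])

ndvi = (nir - red) / (nir + red + 1e-9)   # small epsilon avoids division by zero
print(ndvi)
```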
Key Lessons in Building "Data Commons": The Open Science Data Cloud Ecosystem
NASA Astrophysics Data System (ADS)
Patterson, M.; Grossman, R.; Heath, A.; Murphy, M.; Wells, W.
2015-12-01
Cloud computing technology has created a shift around data and data analysis by allowing researchers to push computation to data as opposed to having to pull data to an individual researcher's computer. Subsequently, cloud-based resources can provide unique opportunities to capture computing environments used both to access raw data in its original form and also to create analysis products which may be the source of data for tables and figures presented in research publications. Since 2008, the Open Cloud Consortium (OCC) has operated the Open Science Data Cloud (OSDC), which provides scientific researchers with computational resources for storing, sharing, and analyzing large (terabyte and petabyte-scale) scientific datasets. OSDC has provided compute and storage services to over 750 researchers in a wide variety of data intensive disciplines. Recently, internal users have logged about 2 million core hours each month. The OSDC also serves the research community by colocating these resources with access to nearly a petabyte of public scientific datasets in a variety of fields also accessible for download externally by the public. In our experience operating these resources, researchers are well served by "data commons," meaning cyberinfrastructure that colocates data archives, computing, and storage infrastructure and supports essential tools and services for working with scientific data. In addition to the OSDC public data commons, the OCC operates a data commons in collaboration with NASA and is developing a data commons for NOAA datasets. As cloud-based infrastructures for distributing and computing over data become more pervasive, we ask, "What does it mean to publish data in a data commons?" Here we present the OSDC perspective and discuss several services that are key in architecting data commons, including digital identifier services.
Designing highly flexible and usable cyberinfrastructures for convergence.
Herr, Bruce W; Huang, Weixia; Penumarthy, Shashikant; Börner, Katy
2006-12-01
This article presents the results of a 7-year-long quest into the development of a "dream tool" for our research in information science and scientometrics and more recently, network science. The results are two cyberinfrastructures (CI): The Cyberinfrastructure for Information Visualization and the Network Workbench that enjoy a growing national and interdisciplinary user community. Both CIs use the cyberinfrastructure shell (CIShell) software specification, which defines interfaces between data sets and algorithms/services and provides a means to bundle them into powerful tools and (Web) services. In fact, CIShell might be our major contribution to progress in convergence. Just as Wikipedia is an "empty shell" that empowers lay persons to share text, a CIShell implementation is an "empty shell" that empowers user communities to plug-and-play, share, compare and combine data sets, algorithms, and compute resources across national and disciplinary boundaries. It is argued here that CIs will not only transform the way science is conducted but also will play a major role in the diffusion of expertise, data sets, algorithms, and technologies across multiple disciplines and business sectors leading to a more integrative science.
NASA Technical Reports Server (NTRS)
Schweikhard, W. G.; Dennon, S. R.
1986-01-01
A review of the Melick method of inlet flow dynamic distortion prediction by statistical means is provided. These developments include the general Melick approach with full dynamic measurements, a limited dynamic measurement approach, and a turbulence modelling approach which requires no dynamic rms pressure fluctuation measurements. These modifications are evaluated by comparing predicted and measured peak instantaneous distortion levels from provisional inlet data sets. A nonlinear mean-line following vortex model is proposed and evaluated as a potential criterion for improving the peak instantaneous distortion map generated from the conventional linear vortex of the Melick method. The model is simplified to a series of linear vortex segments which lie along the mean line. Maps generated with this new approach are compared with conventionally generated maps, as well as measured peak instantaneous maps. Inlet data sets include subsonic, transonic, and supersonic inlets under various flight conditions.
48 CFR 1352.271-86 - Lay days.
Code of Federal Regulations, 2010 CFR
2010-10-01
SOLICITATION PROVISIONS AND CONTRACT CLAUSES, Text of Provisions and Clauses, 1352.271-86 Lay days. As prescribed in 48 CFR 1371.117, insert the following clause: Lay Days (APR 2010) (a) A lay day is defined as an...
48 CFR 1252.217-75 - Lay days.
Code of Federal Regulations, 2010 CFR
2010-10-01
SOLICITATION PROVISIONS AND CONTRACT CLAUSES, Text of Provisions and Clauses, 1252.217-75 Lay days. As prescribed at (TAR) 48 CFR 1217.7001(c) and (e), insert the following clause: Lay Days (OCT 1994) (a) Lay day...
2001-12-01
and Lieutenant Namik Kaplan, Turkish Navy. Maj Tiefert's thesis, "Modeling Control Channel Dynamics of SAAM using NS Network Simulation", helped lay... [DEC99] Deconinck, Dr. ir. Geert, Fault Tolerant Systems, ESAT / Division ACCA, Katholieke Universiteit Leuven, October 1999. [FRE00] Freed... Systems", Addison-Wesley, 1989. [KAP99] Kaplan, Namik, "Prototyping of an Active and Lightweight Router," March 1999. [KAT99] Kati, Effraim
Interoperability of GADU in using heterogeneous Grid resources for bioinformatics applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sulakhe, D.; Rodriguez, A.; Wilde, M.
2008-03-01
Bioinformatics tools used for efficient and computationally intensive analysis of genetic sequences require large-scale computational resources to accommodate the growing data. Grid computational resources such as the Open Science Grid and TeraGrid have proved useful for scientific discovery. The genome analysis and database update system (GADU) is a high-throughput computational system developed to automate the steps involved in accessing the Grid resources for running bioinformatics applications. This paper describes the requirements for building an automated scalable system such as GADU that can run jobs on different Grids. The paper describes the resource-independent configuration of GADU using the Pegasus-based virtual data system that makes high-throughput computational tools interoperable on heterogeneous Grid resources. The paper also highlights the features implemented to make GADU a gateway to computationally intensive bioinformatics applications on the Grid. The paper will not go into the details of problems involved or the lessons learned in using individual Grid resources as it has already been published in our paper on genome analysis research environment (GNARE) and will focus primarily on the architecture that makes GADU resource independent and interoperable across heterogeneous Grid resources.
Hainsworth, J; Barlow, J
2001-08-01
To determine whether undergoing training to become a lay leader and conducting an arthritis self-management course is associated with improvements in physical and psychological health status, arthritis self-efficacy, use of self-management techniques, and visits to the general practitioner. In addition, we aimed to describe the experiences of training and course delivery from the older volunteers' perspective. 21 participants completed all assessments and had a median age of 58, median disease duration of 10 years, and either osteoarthritis (n = 13) or rheumatoid arthritis (n = 8). The study was a pretest-posttest design with qualitative data collected at 3 points in time: before training, 6 weeks after training, and 6 months after training. Quantitative data were collected through self-administered postal questionnaires at baseline and 6-month followup. Six months after training, participants reported small, significant increases in arthritis self-efficacy for pain (P = 0.002), cognitive symptom management (P = 0.004), and communication with their physician (P = 0.024) and a small, significant decrease in depressed mood (P = 0.04). Qualitative data supported these findings, with participants reporting more confidence, happiness, and a changed outlook on life in general. Volunteerism was associated with altruistic behavior and with filling the vocational void caused by retirement. Findings support the value of volunteerism and training to become lay leaders in arthritis self-management programs. Volunteers reported positive changes both in themselves and in course participants. They enjoyed helping similar others and being involved in a worthwhile activity, and they valued their newly acquired status as lay leaders. Many had begun to apply their newfound knowledge about self-management to their own situation, reporting less pain and more willingness "to get on with life."
2013-01-01
Background Women who have a breech presentation at term have to decide whether to attempt external cephalic version (ECV) and how they want to give birth if the baby remains breech, either by planned caesarean section (CS) or vaginal breech birth. The aim of this study was to explore the attitudes of women with a breech presentation and health professionals who manage breech presentation to ECV. Methods We carried out semi-structured interviews with pregnant women with a breech presentation (n=11) and health professionals who manage breech presentation (n=11) recruited from two hospitals in North East England. We used purposive sampling to include women who chose ECV and women who chose planned CS. We analysed data using thematic analysis, comparing between individuals and seeking out disconfirming cases. Results Four main themes emerged from the data collected during interviews with pregnant women with a breech presentation: ECV as a means of enabling natural birth; concerns about ECV; lay and professional accounts of ECV; and breech presentation as a means of choosing planned CS. Some women’s attitudes to ECV were affected by their preferences for how to give birth. Other women chose CS because ECV was not acceptable to them. Two main themes emerged from the interview data about health professionals’ attitudes towards ECV: directive counselling and attitudes towards lay beliefs about ECV and breech presentation. Conclusions Women had a range of attitudes to ECV informed by their preferences for how to give birth; the acceptability of ECV to them; and lay accounts of ECV, which were frequently negative. Most professionals described having a preference for ECV and reported directively counselling women to choose it. Some professionals were dismissive of lay beliefs about ECV. Some key challenges for shared decision making about breech presentation were identified: health professionals counselling women directively about ECV and the differences between evidence-based information about ECV and lay beliefs. To address these challenges a number of approaches will be required. PMID:23324533
Streaming Support for Data Intensive Cloud-Based Sequence Analysis
Issa, Shadi A.; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J.; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed
2013-01-01
Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation. PMID:23710461
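A schematic of the streaming idea, processing each read as soon as it arrives rather than waiting for the full transfer; the generator below is a stand-in for the incoming sequence stream and is not part of the elastream API.

```python
# Sketch of streamed processing: work overlaps with "transfer" because each read is
# handled as soon as it is available, not after the whole dataset has arrived.
def incoming_reads():
    # In a real setting this would yield records as they arrive over the network.
    for i in range(5):
        yield f"read_{i}", "ACGT" * 10

def process(read_id, seq):
    # Placeholder per-read analysis: count G+C bases.
    return read_id, seq.count("G") + seq.count("C")

for read_id, seq in incoming_reads():
    rid, gc = process(read_id, seq)
    print(rid, "GC bases:", gc)
```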
Collins, Kevin M.; Koelle, Michael R.
2013-01-01
C. elegans regulates egg laying by alternating between an inactive phase and a serotonin-triggered active phase. We found that the conserved ERG potassium channel UNC-103 enables this two-state behavior by limiting excitability of the egg-laying muscles. Using both high-speed video recording and calcium imaging of egg-laying muscles in behaving animals, we found that the muscles appear to be excited at a particular phase of each locomotor body bend. During the inactive phase, this rhythmic excitation infrequently evokes calcium transients or contraction of the egg-laying muscles. During the serotonin-triggered active phase, however, these muscles are more excitable and each body bend is accompanied by a calcium transient that drives twitching or full contraction of the egg-laying muscles. We found that ERG null mutants lay eggs too frequently, and that ERG function is necessary and sufficient in the egg-laying muscles to limit egg laying. ERG K+ channels localize to postsynaptic sites in the egg-laying muscle, and mutants lacking ERG have more frequent calcium transients and contractions of the egg-laying muscles even during the inactive phase. Thus ERG channels set postsynaptic excitability at a threshold so that further adjustments of excitability by serotonin generate two distinct behavioral states. PMID:23303953
Technologies for Large Data Management in Scientific Computing
NASA Astrophysics Data System (ADS)
Pace, Alberto
2014-01-01
In recent years, intense usage of computing has been the main strategy of investigations in several scientific research projects. The progress in computing technology has opened unprecedented opportunities for systematic collection of experimental data and the associated analysis that were considered impossible only a few years ago. This paper focuses on the strategies in use: it reviews the various components that are necessary for an effective solution that ensures the storage, the long term preservation, and the worldwide distribution of large quantities of data that are necessary in a large scientific research project. The paper also mentions several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that need to be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.
NASA Astrophysics Data System (ADS)
Skirnisson, Karl
2016-07-01
Common eider Somateria mollissima L. 1758, subsp. borealis, is widely distributed along the coasts of Iceland. In this study, the association of parasite infections and food composition was studied among 40 females and 38 males (66 adults, 12 subadults), shot under license on four occasions within the same year (February; before egg-laying in May; after the breeding period in late June; and in November) in Skerjafjörður, SW Iceland. Parasitological examinations revealed 31 helminth species (11 digeneans, ten cestodes, seven nematodes, and three acanthocephalans). Distinct digenean species parasitized the gallbladder, kidney and bursa of Fabricius, whereas other helminths parasitized the gastrointestinal tract. Thirty-six invertebrate prey species were identified as food; waste and bread fed by humans were also consumed by some birds. Amidostomum acutum was the only parasite found with a direct life cycle, whereas other species were food transmitted and ingested with different invertebrate prey. In contrast to females, male birds rarely utilized periwinkles and gammarids as a food source. As a result, Microphallus and Microsomacanthus infection intensities were low except in February, when subadult males were responsible for an infection peak. Females caring for young increased their consumption of periwinkles close to the littoral zone in June; during pre-breeding, females also increased their gammarid intake. As a consequence, Microphallus and Microsomacanthus infection intensities temporarily peaked. Increased food intake (including Mytilus edulis) of females before the egg-laying period resulted in twofold higher Gymnophallus bursicola infection intensity than observed for males. Profilicollis botulus infection reflected seasonal changes in decapod consumption in both genders. Different life history strategies of males and females, especially before and during the breeding season and caring of young, and during molting in distinct feeding areas in summer, promote differences in consumption of prey-transmitted parasites that result in distinct infection patterns of the genders.
Arora, C; Savulescu, J; Maslen, H; Selgelid, M; Wilkinson, D
2016-11-08
Resuscitation and treatment of critically ill newborn infants is associated with relatively high mortality, morbidity and cost. Guidelines relating to resuscitation have traditionally focused on the best interests of infants. There are, however, limited resources available in the neonatal intensive care unit (NICU), meaning that difficult decisions sometimes need to be made. This study explores the intuitions of lay people (non-health professionals) regarding resource allocation decisions in the NICU. The study design was a cross-sectional quantitative survey, consisting of 20 hypothetical rationing scenarios. There were 119 respondents who entered the questionnaire, and 109 who completed it. The respondents were adult US and Indian participants of the online crowdsourcing platform Mechanical Turk. Respondents were asked to decide which of two infants to treat in a situation of scarce resources. Demographic characteristics, personality traits and political views were recorded. Respondents were also asked to respond to a widely cited thought experiment involving rationing. The majority of respondents, in all except one scenario, chose the utilitarian option of directing treatment to the infant with the higher chance of survival, higher life expectancy, less severe disability, and less expensive treatment. As discrepancy between outcomes decreased, however, there was a statistically significant increase in egalitarian responses and decrease in utilitarian responses in scenarios involving chance of survival (P = 0.001), life expectancy (P = 0.0001), and cost of treatment (P = 0.01). In the classic 'lifeboat' scenario, all but two respondents were utilitarian. This survey suggests that in situations of scarcity and equal clinical need, non-health professionals support rationing of life-saving treatment based on probability of survival, duration of survival, cost of treatment or quality of life. However, where the difference in prognosis or cost is very small, non-health professionals preferred to give infants an equal chance of receiving treatment.
Affective cognition: Exploring lay theories of emotion.
Ong, Desmond C; Zaki, Jamil; Goodman, Noah D
2015-10-01
Humans skillfully reason about others' emotions, a phenomenon we term affective cognition. Despite its importance, few formal, quantitative theories have described the mechanisms supporting this phenomenon. We propose that affective cognition involves applying domain-general reasoning processes to domain-specific content knowledge. Observers' knowledge about emotions is represented in rich and coherent lay theories, which comprise consistent relationships between situations, emotions, and behaviors. Observers utilize this knowledge in deciphering social agents' behavior and signals (e.g., facial expressions), in a manner similar to rational inference in other domains. We construct a computational model of a lay theory of emotion, drawing on tools from Bayesian statistics, and test this model across four experiments in which observers drew inferences about others' emotions in a simple gambling paradigm. This work makes two main contributions. First, the model accurately captures observers' flexible but consistent reasoning about the ways that events and others' emotional responses to those events relate to each other. Second, our work models the problem of emotional cue integration-reasoning about others' emotion from multiple emotional cues-as rational inference via Bayes' rule, and we show that this model tightly tracks human observers' empirical judgments. Our results reveal a deep structural relationship between affective cognition and other forms of inference, and suggest wide-ranging applications to basic psychological theory and psychiatry. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
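Emotional cue integration via Bayes' rule can be illustrated with a toy computation; the priors, likelihoods and cue labels below are invented for the example and are not the model parameters fitted in this work.

```python
# Toy Bayesian cue integration: combine several cues about another agent's emotion.
priors = {"happy": 0.5, "disappointed": 0.5}

# P(cue | emotion) for two invented cues: the gamble's outcome and a facial expression.
likelihoods = {
    "won_small":    {"happy": 0.6, "disappointed": 0.4},
    "slight_frown": {"happy": 0.2, "disappointed": 0.8},
}

def posterior(cues, priors, likelihoods):
    # Bayes' rule with conditionally independent cues, then normalization.
    unnorm = dict(priors)
    for cue in cues:
        for emotion in unnorm:
            unnorm[emotion] *= likelihoods[cue][emotion]
    total = sum(unnorm.values())
    return {emotion: p / total for emotion, p in unnorm.items()}

print(posterior(["won_small", "slight_frown"], priors, likelihoods))
```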
Numerical minimization of AC losses in coaxial coated conductor cables
NASA Astrophysics Data System (ADS)
Rostila, L.; Suuriniemi, S.; Lehtonen, J.; Grasso, G.
2010-02-01
Power cables are one of the most promising applications for superconducting coated conductors. In AC use, only a small resistive loss is generated, but the removal of the dissipated heat from the cryostat is inefficient due to the large temperature difference. The aim of this work is to minimize the AC losses in a multilayer coaxial cable, in which the tapes form current carrying cylinders. The optimized parameters are the tape numbers and lay angles in these cylinders. This work shows how to cope with the mechanical constraints on the lay angles and the discrete tape number in optimization. Three common types of coaxial cables are studied here to demonstrate the feasibility of optimization, in which the AC losses were computed with a circuit analysis model formulated here for arbitrary phase currents, number of phases, and layers. Because the current sharing is practically determined by the inductances of the layers, the optima were obtained much faster by neglecting the nonlinear resistances caused by the AC losses. In addition, the example calculations show that the optimal cable structure does not usually depend on the AC loss model for the individual tapes. On the other hand, depending on the cable type, the losses of the optimized cables may be sensitive to the lay angles, and therefore we recommend studying this sensitivity for new cable designs individually.
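The discrete, constrained search space described above (integer tape counts, mechanically limited lay angles) can be illustrated with a brute-force sketch; the loss() placeholder below is hypothetical and stands in for, but is not, the paper's circuit-analysis AC-loss model.

```python
# Schematic brute-force search over discrete tape numbers and constrained lay angles
# for a two-layer example. loss() is an invented placeholder objective.
import itertools

def loss(tapes, angles_deg):
    # Placeholder: penalize unbalanced layers and large lay angles (NOT a physical model).
    imbalance = max(tapes) - min(tapes)
    return imbalance + sum(abs(a) for a in angles_deg) / 100.0

tape_options = range(18, 25)                                            # discrete tape counts
angle_options = [a for a in range(-40, 41, 5) if 10 <= abs(a) <= 40]    # assumed mechanical limits

best = None
for tapes in itertools.product(tape_options, repeat=2):
    for angles in itertools.product(angle_options, repeat=2):
        value = loss(tapes, angles)
        if best is None or value < best[0]:
            best = (value, tapes, angles)

print("best loss, tapes, lay angles:", best)
```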
48 CFR 3052.217-94 - Lay days (USCG).
Code of Federal Regulations, 2010 CFR
2010-10-01
CLAUSES, Text of Provisions and Clauses, 3052.217-94 Lay days (USCG). As prescribed in USCG guidance at (HSAR) 48 CFR 3017.9000(a) and (b), insert the following clause: Lay Days (DEC 2003) (a) Lay day time...
Real-time data-intensive computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parkinson, Dilworth Y., E-mail: dyparkinson@lbl.gov; Chen, Xian; Hexemer, Alexander
2016-07-27
Today users visit synchrotrons as sources of understanding and discovery—not as sources of just light, and not as sources of data. To achieve this, the synchrotron facilities frequently provide not just light but often the entire end station and increasingly, advanced computational facilities that can reduce terabytes of data into a form that can reveal a new key insight. The Advanced Light Source (ALS) has partnered with high performance computing, fast networking, and applied mathematics groups to create a "super-facility", giving users simultaneous access to the experimental, computational, and algorithmic resources to make this possible. This combination forms an efficient closed loop, where data—despite its high rate and volume—is transferred and processed immediately and automatically on appropriate computing resources, and results are extracted, visualized, and presented to users or to the experimental control system, both to provide immediate insight and to guide decisions about subsequent experiments during beamtime. We will describe our work at the ALS ptychography, scattering, micro-diffraction, and micro-tomography beamlines.
NASA Astrophysics Data System (ADS)
Evseev, D. G.; Savrukhin, A. V.; Neklyudov, A. N.
2018-01-01
Computer simulation of the kinetics of thermal processes and structural and phase transformations in the wall of a bogie side frame produced from steel 20GL is performed with allowance for the differences in the cooling intensity under volume-surface hardening. The simulation is based on the developed method employing the diagram of decomposition of austenite at different cooling rates. The data obtained are used to draw conclusions about the effect of the cooling intensity on the propagation of the martensite structure over the wall section.
High-intensity positron microprobe at Jefferson Lab
Golge, Serkan; Vlahovic, Branislav; Wojtsekhowski, Bogdan B.
2014-06-19
We present a conceptual design for a novel continuous wave electron-linac based high-intensity slow-positron production source with a projected intensity on the order of 10^10 e+/s. Reaching this intensity in our design relies on the transport of positrons (T+ below 600 keV) from the electron-positron pair production converter target to a low-radiation and low-temperature area for moderation in a high-efficiency cryogenic rare gas moderator, solid Ne. The performance of the integrated beamline has been verified through computational studies. The computational results include Monte Carlo calculations of the optimized electron/positron beam energies, converter target thickness, synchronized raster system, transport of the beam from the converter target to the moderator, extraction of the beam from the channel, and moderation efficiency calculations. For the extraction of positrons from the magnetic channel a magnetic field terminator plug prototype has been built and experimental data on the effectiveness of this prototype are presented. The dissipation of the heat away from the converter target and radiation protection measures are also discussed.
Technical Note: scuda: A software platform for cumulative dose assessment.
Park, Seyoun; McNutt, Todd; Plishker, William; Quon, Harry; Wong, John; Shekhar, Raj; Lee, Junghoon
2016-10-01
Accurate tracking of anatomical changes and computation of actually delivered dose to the patient are critical for successful adaptive radiation therapy (ART). Additionally, efficient data management and fast processing are practically important for the adoption in clinic as ART involves a large amount of image and treatment data. The purpose of this study was to develop an accurate and efficient Software platform for CUmulative Dose Assessment (scuda) that can be seamlessly integrated into the clinical workflow. scuda consists of deformable image registration (DIR), segmentation, dose computation modules, and a graphical user interface. It is connected to our image PACS and radiotherapy informatics databases from which it automatically queries/retrieves patient images, radiotherapy plan, beam data, and daily treatment information, thus providing an efficient and unified workflow. For accurate registration of the planning CT and daily CBCTs, the authors iteratively correct CBCT intensities by matching local intensity histograms during the DIR process. Contours of the target tumor and critical structures are then propagated from the planning CT to daily CBCTs using the computed deformations. The actual delivered daily dose is computed using the registered CT and patient setup information by a superposition/convolution algorithm, and accumulated using the computed deformation fields. Both DIR and dose computation modules are accelerated by a graphics processing unit. The cumulative dose computation process has been validated on 30 head and neck (HN) cancer cases, showing 3.5 ± 5.0 Gy (mean±STD) absolute mean dose differences between the planned and the actually delivered doses in the parotid glands. On average, DIR, dose computation, and segmentation take 20 s/fraction and 17 min for a 35-fraction treatment including additional computation for dose accumulation. The authors developed a unified software platform that provides accurate and efficient monitoring of anatomical changes and computation of actually delivered dose to the patient, thus realizing an efficient cumulative dose computation workflow. Evaluation on HN cases demonstrated the utility of our platform for monitoring the treatment quality and detecting significant dosimetric variations that are keys to successful ART.
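Histogram matching of CBCT to planning-CT intensities can be illustrated generically with scikit-image; note that the platform described above matches local histograms iteratively within the DIR loop, whereas the sketch below performs only a single global match on synthetic images.

```python
# Simplified stand-in: global histogram matching between a CBCT-like image and a
# planning-CT-like reference, using synthetic data rather than clinical images.
import numpy as np
from skimage.exposure import match_histograms

rng = np.random.default_rng(1)
cbct = rng.normal(loc=40.0, scale=20.0, size=(64, 64))          # synthetic daily image
planning_ct = rng.normal(loc=0.0, scale=35.0, size=(64, 64))    # synthetic reference

corrected = match_histograms(cbct, planning_ct)
print("mean before/after matching:", cbct.mean(), corrected.mean())
```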
Technical Note: SCUDA: A software platform for cumulative dose assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Seyoun; McNutt, Todd; Quon, Harry
Purpose: Accurate tracking of anatomical changes and computation of actually delivered dose to the patient are critical for successful adaptive radiation therapy (ART). Additionally, efficient data management and fast processing are practically important for the adoption in clinic as ART involves a large amount of image and treatment data. The purpose of this study was to develop an accurate and efficient Software platform for CUmulative Dose Assessment (SCUDA) that can be seamlessly integrated into the clinical workflow. Methods: SCUDA consists of deformable image registration (DIR), segmentation, dose computation modules, and a graphical user interface. It is connected to our image PACS and radiotherapy informatics databases from which it automatically queries/retrieves patient images, radiotherapy plan, beam data, and daily treatment information, thus providing an efficient and unified workflow. For accurate registration of the planning CT and daily CBCTs, the authors iteratively correct CBCT intensities by matching local intensity histograms during the DIR process. Contours of the target tumor and critical structures are then propagated from the planning CT to daily CBCTs using the computed deformations. The actual delivered daily dose is computed using the registered CT and patient setup information by a superposition/convolution algorithm, and accumulated using the computed deformation fields. Both DIR and dose computation modules are accelerated by a graphics processing unit. Results: The cumulative dose computation process has been validated on 30 head and neck (HN) cancer cases, showing 3.5 ± 5.0 Gy (mean±STD) absolute mean dose differences between the planned and the actually delivered doses in the parotid glands. On average, DIR, dose computation, and segmentation take 20 s/fraction and 17 min for a 35-fraction treatment including additional computation for dose accumulation. Conclusions: The authors developed a unified software platform that provides accurate and efficient monitoring of anatomical changes and computation of actually delivered dose to the patient, thus realizing an efficient cumulative dose computation workflow. Evaluation on HN cases demonstrated the utility of our platform for monitoring the treatment quality and detecting significant dosimetric variations that are keys to successful ART.
NASA Astrophysics Data System (ADS)
Klimentov, A.; De, K.; Jha, S.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Wells, J.; Wenaus, T.
2016-10-01
The LHC, operating at CERN, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data taking runs require more resources than the grid can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of the PanDA WMS with supercomputers in the United States, in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility. The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte-Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments, and it has been in full production for ATLAS since September 2015. We will present our current accomplishments with running PanDA at supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications, such as bioinformatics and astro-particle physics.
Verloigne, Maïté; Van Lippevelde, Wendy; Bere, Elling; Manios, Yannis; Kovács, Éva; Grillenberger, Monika; Maes, Lea; Brug, Johannes; De Bourdeaudhuij, Ilse
2015-09-18
The aim was to investigate which individual and family environmental factors are related to television and computer time separately in 10- to 12-year-old children within and across five European countries (Belgium, Germany, Greece, Hungary, Norway). Data were used from the ENERGY-project. Children and one of their parents completed a questionnaire, including questions on screen time behaviours and related individual and family environmental factors. Family environmental factors included social, political, economic and physical environmental factors. Complete data were obtained from 2022 child-parent dyads (53.8 % girls, mean child age 11.2 ± 0.8 years; mean parental age 40.5 ± 5.1 years). To examine the association between individual and family environmental factors (i.e. independent variables) and television/computer time (i.e. dependent variables) in each country, multilevel regression analyses were performed using MLwiN 2.22, adjusting for children's sex and age. In all countries, children reported more television and/or computer time if children and their parents thought that the maximum recommended level for watching television and/or using the computer was higher and if children had a higher preference for television watching and/or computer use and a lower self-efficacy to control television watching and/or computer use. Most physical and economic environmental variables were not significantly associated with television or computer time. Slightly more individual factors were related to children's computer time and more parental social environmental factors to children's television time. We also found different correlates across countries: parental co-participation in television watching was significantly positively associated with children's television time in all countries, except for Greece. A higher level of parental television and computer time was only associated with a higher level of children's television and computer time in Hungary. Having rules regarding children's television time was related to less television time in all countries, except for Belgium and Norway. Most evidence was found for an association between screen time and individual and parental social environmental factors, which means that future interventions aiming to reduce screen time should focus on children's individual beliefs and habits as well as parental social factors. As we identified some different correlates for television and computer time and across countries, cross-European interventions could make small adaptations per specific screen time activity and lay different emphases per country.
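A comparable multilevel (random-intercept) model can be sketched in Python with statsmodels rather than MLwiN; the variable names and simulated data below are placeholders, not the ENERGY-project variables.

```python
# Sketch of a random-intercept multilevel regression with statsmodels; data are simulated
# and variable names (tv_time, preference, self_efficacy, school) are invented placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 200
df = pd.DataFrame({
    "tv_time": rng.normal(120, 30, n),
    "preference": rng.normal(0, 1, n),
    "self_efficacy": rng.normal(0, 1, n),
    "age": rng.integers(10, 13, n),
    "sex": rng.integers(0, 2, n),
    "school": rng.integers(0, 20, n),   # grouping level (children nested in schools)
})

model = smf.mixedlm("tv_time ~ preference + self_efficacy + age + sex",
                    df, groups=df["school"])
result = model.fit()
print(result.summary())
```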
Bryant, Barbara
2012-01-01
In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this “chromatin computer” to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal – and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines. PMID:22567109
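A minimal toy in the spirit of this model, with nucleosomes as a one-dimensional tape of marks and a single read-write rule acting on adjacent positions; the specific rule is invented for illustration and is not taken from the paper.

```python
# Toy "chromatin tape": nucleosomes carry marks, and a rule reads a window of adjacent
# nucleosomes and writes new marks. The spreading rule below is invented for illustration.
tape = ["me", "-", "-", "-", "ac"]   # marks on a 1-D string of nucleosomes

def spread_methylation(tape):
    """If a nucleosome carries 'me', write 'me' onto its right-hand neighbour
    unless that neighbour is protected by 'ac'."""
    new = list(tape)
    for i in range(len(tape) - 1):
        if tape[i] == "me" and tape[i + 1] != "ac":
            new[i + 1] = "me"
    return new

for step in range(4):
    print(step, tape)
    tape = spread_methylation(tape)
```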
Preliminary evaluation of a nest usage sensor to detect double nest occupations of laying hens.
Zaninelli, Mauro; Costa, Annamaria; Tangorra, Francesco Maria; Rossi, Luciana; Agazzi, Alessandro; Savoini, Giovanni
2015-01-26
Conventional cage systems will be replaced by housing systems that allow hens to move freely. These systems may improve hens' welfare, but they lead to some disadvantages: disease, bone fractures, cannibalism, piling and lower egg production. New selection criteria for existing commercial strains should be identified considering individual data about laying performance and the behavior of hens. Many recording systems have been developed to collect these data. However, the management of double nest occupations remains critical for the correct egg-to-hen assignment. To limit such events, most systems adopt specific trap devices and additional mechanical components. Others, instead, only prevent these occurrences by narrowing the nest, without any detection and management. The aim of this study was to develop and test a nest usage "sensor", based on imaging analysis, that is able to automatically detect a double nest occupation. Results showed that the developed sensor correctly identified the double nest occupation occurrences. Therefore, imaging analysis proved to be a useful solution that could simplify the nest construction for this type of recording system, allowing the collection of more precise and accurate data, since double nest occupations would be managed and the normal laying behavior of hens would not be discouraged by the presence of the trap devices.
Preliminary Evaluation of a Nest Usage Sensor to Detect Double Nest Occupations of Laying Hens
Zaninelli, Mauro; Costa, Annamaria; Tangorra, Francesco Maria; Rossi, Luciana; Agazzi, Alessandro; Savoini, Giovanni
2015-01-01
Conventional cage systems will be replaced by housing systems that allow hens to move freely. These systems may improve hens' welfare, but they lead to some disadvantages: disease, bone fractures, cannibalism, piling and lower egg production. New selection criteria for existing commercial strains should be identified considering individual data about laying performance and the behavior of hens. Many recording systems have been developed to collect these data. However, the management of double nest occupations remains critical for the correct egg-to-hen assignment. To limit such events, most systems adopt specific trap devices and additional mechanical components. Others, instead, only prevent these occurrences by narrowing the nest, without any detection and management. The aim of this study was to develop and test a nest usage “sensor”, based on imaging analysis, that is able to automatically detect a double nest occupation. Results showed that the developed sensor correctly identified the double nest occupation occurrences. Therefore, imaging analysis proved to be a useful solution that could simplify the nest construction for this type of recording system, allowing the collection of more precise and accurate data, since double nest occupations would be managed and the normal laying behavior of hens would not be discouraged by the presence of the trap devices. PMID:25629704
Randomization Procedures Applied to Analysis of Ballistic Data
1991-06-01
Technical Report BRL-TR-3245 (AD-A238 389), Malcolm S. Taylor and Barry A. Bodt, June 1991. Keywords: data analysis; computationally intensive statistics; randomization tests; permutation tests; nonparametric statistics. Any reasonable statistical procedure would fail to support the notion of improvement of dynamic over standard indexing based on these data.
Dakua, Sarada Prasad; Abinahed, Julien; Al-Ansari, Abdulla
2015-01-01
Liver segmentation continues to remain a major challenge, largely due to its intense complexity with surrounding anatomical structures (stomach, kidney, and heart), high noise level and lack of contrast in pathological computed tomography (CT) data. We present an approach to reconstructing the liver surface in low contrast CT. The main contributions are: (1) a stochastic resonance-based methodology in discrete cosine transform domain is developed to enhance the contrast of pathological liver images, (2) a new formulation is proposed to prevent the object boundary, resulting from the cellular automata method, from leaking into the surrounding areas of similar intensity, and (3) a level-set method is suggested to generate intermediate segmentation contours from two segmented slices distantly located in a subject sequence. We have tested the algorithm on real datasets obtained from two sources, Hamad General Hospital and medical image computing and computer-assisted interventions grand challenge workshop. Various parameters in the algorithm, such as w, Δt, z, α, μ, α1, and α2, play imperative roles, thus their values are precisely selected. Both qualitative and quantitative evaluation performed on liver data show promising segmentation accuracy when compared with ground truth data reflecting the potential of the proposed method. PMID:26158101
Hadoop for High-Performance Climate Analytics: Use Cases and Lessons Learned
NASA Technical Reports Server (NTRS)
Tamkin, Glenn
2013-01-01
Scientific data services are a critical aspect of the mission of the NASA Center for Climate Simulation (NCCS). Hadoop, via MapReduce, provides an approach to high-performance analytics that is proving to be useful for data-intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. The NCCS is particularly interested in the potential of Hadoop to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we prototyped a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. The initial focus was on averaging operations over arbitrary spatial and temporal extents within Modern-Era Retrospective Analysis for Research and Applications (MERRA) data. After preliminary results suggested that this approach improves efficiencies within data-intensive analytic workflows, we invested in building a cyberinfrastructure resource for developing a new generation of climate data analysis capabilities using Hadoop. This resource is focused on reducing the time spent in the preparation of reanalysis data used in data-model intercomparison, a long-sought goal of the climate community. This paper summarizes the related use cases and lessons learned.
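To make the "canonical MapReduce operations" concrete, here is a minimal Hadoop Streaming-style mapper/reducer pair that averages a variable by month. It is a sketch under stated assumptions: the tab-separated record layout and field names are invented for illustration and are not taken from the MERRA workflow described above.

```python
#!/usr/bin/env python3
"""Hadoop Streaming-style sketch of an averaging operation (illustrative only;
the record layout and field names are assumed, not the NCCS implementation).
Run as `script.py map < input` and `script.py reduce < sorted_map_output`."""
import sys

def mapper(lines):
    # Assume tab-separated records: time, lat, lon, value
    for line in lines:
        time, lat, lon, value = line.rstrip("\n").split("\t")
        month = time[:7]                      # e.g. "1979-01"
        print(f"{month}\t{value}")            # key by month for monthly means

def reducer(lines):
    # Input arrives sorted by key, as Hadoop Streaming guarantees.
    current, total, count = None, 0.0, 0
    for line in lines:
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{total / count}")
            current, total, count = key, 0.0, 0
        total += float(value)
        count += 1
    if current is not None:
        print(f"{current}\t{total / count}")

if __name__ == "__main__":
    (mapper if sys.argv[1] == "map" else reducer)(sys.stdin)
```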
State of the practice on data access, sharing, and integration.
DOT National Transportation Integrated Search
2016-12-01
The purpose of this state-of-the-practice review was to lay both technical and institutional foundation for all aspects of the development of the Virtual Data Access Framework. The review focused on current data sharing and integration practices amon...
NASA Astrophysics Data System (ADS)
Evans, Ben; Allen, Chris; Antony, Joseph; Bastrakova, Irina; Gohar, Kashif; Porter, David; Pugh, Tim; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley
2015-04-01
The National Computational Infrastructure (NCI) has established a powerful and flexible in-situ petascale computational environment to enable both high performance computing and Data-intensive Science across a wide spectrum of national environmental and earth science data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress so far to harmonise the underlying data collections for future interdisciplinary research across these large volume data collections. NCI has established 10+ PBytes of major national and international data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data are largely sourced from NCI's partners (which include the custodians of many of the major Australian national-scale scientific collections), leading research communities, and collaborating overseas organisations. New infrastructures created at NCI mean the data collections are now accessible within an integrated High Performance Computing and Data (HPC-HPD) environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale high-bandwidth Lustre filesystems. The hardware was designed at inception to ensure that it would allow the layered software environment to flexibly accommodate the advancement of future data science. New approaches to software technology and data models have also had to be developed to enable access to these large and exponentially increasing data volumes at NCI. Traditional HPC and data environments are still made available in a way that flexibly provides the tools, services and supporting software systems on these new petascale infrastructures. But to enable the research to take place at this scale, the data, metadata and software now need to evolve together - creating a new integrated high performance infrastructure. The new infrastructure at NCI currently supports a catalogue of integrated, reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. One of the challenges for NCI has been to support existing techniques and methods, while carefully preparing the underlying infrastructure for the transition needed for the next class of Data-intensive Science. In doing so, a flexible range of techniques and software can be made available for application across the corpus of data collections available, and to provide a new infrastructure for future interdisciplinary research.
BioPig: a Hadoop-based analytic toolkit for large-scale sequence data.
Nordberg, Henrik; Bhatia, Karan; Wang, Kai; Wang, Zhong
2013-12-01
The recent revolution in sequencing technologies has led to an exponential growth of sequence data. As a result, most of the current bioinformatics tools become obsolete as they fail to scale with data. To tackle this 'data deluge', here we introduce the BioPig sequence analysis toolkit as one of the solutions that scale to data and computation. We built BioPig on the Apache Hadoop MapReduce system and the Pig data flow language. Compared with traditional serial and MPI-based algorithms, BioPig has three major advantages: first, BioPig's programmability greatly reduces development time for parallel bioinformatics applications; second, testing BioPig with up to 500 Gb sequences demonstrates that it scales automatically with the size of the data; and finally, BioPig can be ported without modification on many Hadoop infrastructures, as tested with the Magellan system at the National Energy Research Scientific Computing Center and the Amazon Elastic Compute Cloud. In summary, BioPig represents a novel program framework with the potential to greatly accelerate data-intensive bioinformatics analysis.
The BioIntelligence Framework: a new computational platform for biomedical knowledge computing
Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles
2013-01-01
Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information. PMID:22859646
Computational provenance in hydrologic science: a snow mapping example.
Dozier, Jeff; Frew, James
2009-03-13
Computational provenance - a record of the antecedents and processing history of digital information - is key to properly documenting computer-based scientific research. To support investigations in hydrologic science, we produce the daily fractional snow-covered area from NASA's moderate-resolution imaging spectroradiometer (MODIS). From the MODIS reflectance data in seven wavelengths, we estimate the fraction of each 500 m pixel that snow covers. The daily products have data gaps and errors because of cloud cover and sensor viewing geometry, so we interpolate and smooth to produce our best estimate of the daily snow cover. To manage the data, we have developed the Earth System Science Server (ES3), a software environment for data-intensive Earth science, with unique capabilities for automatically and transparently capturing and managing the provenance of arbitrary computations. Transparent acquisition means that scientists do not have to express their computations in specific languages or schemas in order for provenance to be acquired and maintained. ES3 models provenance as relationships between processes and their input and output files. It is particularly suited to capturing the provenance of an evolving algorithm whose components span multiple languages and execution environments.
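The following sketch illustrates the general idea of modelling provenance as relationships between processes and their input and output files, and of walking that graph backwards to recover a file's lineage. Class names, fields and the example file names are illustrative assumptions, not the ES3 schema.

```python
# Minimal sketch of file/process provenance records, in the spirit of the
# "processes and their input and output files" model described above.
# All names are hypothetical, not the ES3 data model.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ProcessRecord:
    name: str
    inputs: List[str]
    outputs: List[str]

class ProvenanceStore:
    def __init__(self):
        self.producers: Dict[str, ProcessRecord] = {}   # file -> process that wrote it

    def register(self, record: ProcessRecord):
        for f in record.outputs:
            self.producers[f] = record

    def lineage(self, filename: str) -> List[str]:
        """Walk backwards from a file to all of its antecedent files and processes."""
        history, stack = [], [filename]
        while stack:
            f = stack.pop()
            proc = self.producers.get(f)
            if proc:
                history.append(f"{f} <- {proc.name}({', '.join(proc.inputs)})")
                stack.extend(proc.inputs)
        return history

store = ProvenanceStore()
store.register(ProcessRecord("snow_fraction", ["MOD09GA_refl.hdf"], ["fsca_raw.tif"]))
store.register(ProcessRecord("smooth_interp", ["fsca_raw.tif"], ["fsca_daily.tif"]))
print("\n".join(store.lineage("fsca_daily.tif")))
```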
Building Research Cyberinfrastructure at Small/Medium Research Institutions
ERIC Educational Resources Information Center
Agee, Anne; Rowe, Theresa; Woo, Melissa; Woods, David
2010-01-01
A 2006 ECAR study defined cyberinfrastructure as the coordinated aggregate of "hardware, software, communications, services, facilities, and personnel that enable researchers to conduct advanced computational, collaborative, and data-intensive research." While cyberinfrastructure was initially seen as support for scientific and…
Computational chemistry and aeroassisted orbital transfer vehicles
NASA Technical Reports Server (NTRS)
Cooper, D. M.; Jaffe, R. L.; Arnold, J. O.
1985-01-01
An analysis of the radiative heating phenomena encountered during a typical aeroassisted orbital transfer vehicle (AOTV) trajectory was made to determine the potential impact of computational chemistry on AOTV design technology. Both equilibrium and nonequilibrium radiation mechanisms were considered. This analysis showed that computational chemistry can be used to predict (1) radiative intensity factors and spectroscopic data; (2) the excitation rates of both atoms and molecules; (3) high-temperature reaction rate constants for metathesis and charge exchange reactions; (4) particle ionization and neutralization rates and cross sections; and (5) spectral line widths.
Sun, Christina J; García, Manuel; Mann, Lilli; Alonzo, Jorge; Eng, Eugenia; Rhodes, Scott D
2015-05-01
The HOLA intervention was a lay health advisor intervention designed to reduce the disproportionate HIV burden borne by Latino sexual and gender identity minorities (gay, bisexual, and other men who have sex with men, and transgender persons) living in the United States. Process evaluation data were collected for over a year of intervention implementation from 11 trained Latino male and transgender lay health advisors (Navegantes) to document the activities each Navegante conducted to promote condom use and HIV testing among his or her eight social network members enrolled in the study. Over 13 months, the Navegantes reported conducting 1,820 activities. The most common activity was condom distribution. Navegantes had extensive reach, engaging in health promotion activities well beyond the social network members enrolled in the study. There were significant differences between the types of activities conducted by Navegantes depending on who was present. Results suggest that lay health advisor interventions reach large numbers of at-risk community members and may benefit populations disproportionately affected by HIV. © 2014 Society for Public Health Education.
Origins of suicidality: compatibility of lay and expert beliefs - qualitative study.
Zadravec, Tina; Grad, Onja
2013-06-01
Different views exist today on the origins of suicidal behaviour, which can influence the help-seeking behaviour and the adherence to treatment of suicidal people. The beliefs lay people and patients have about the origins of suicidal behaviour, as well as the compatibility of their beliefs with the views of mental health personnel (general practitioners and psychiatrists), were assessed. 45 semi-structured interviews with the general population, suicide attempters, general practitioners and psychiatrists were conducted, audio-taped and transcribed, and a thematic analysis of the data was carried out. The results indicated the incompatibility of the views. The general population and the suicide attempters favoured psychological explanations of suicidal behaviour, whereas the general practitioners and psychiatrists promoted medical explanations. The only common theme was perception of the suicidal crisis as a crucial factor in suicidality. Lay people and experts believe that suicidal crisis is the main origin of suicidal behaviour. The awareness of this common denominator and also of the differences in opinions between lay people and experts should be kept in mind when planning and implementing prevention and treatment programmes if we wish to promote help-seeking behaviour and attain good adherence to treatment.
Consumerism, reflexivity and the medical encounter.
Lupton, D
1997-08-01
Much emphasis has been placed recently in sociological, policy and popular discourses on changes in lay people's attitudes towards the medical profession that have been labelled by some as a move towards the embracing of "consumerism". Notions of consumerism tend to assume that lay people act as "rational" actors in the context of the medical encounter. They align with broader sociological concepts of the "reflexive self" as a product of late modernity; that is, the self who acts in a calculated manner to engage in self-improvement and who is sceptical about expert knowledges. To explore the ways that people think and feel about medicine and the medical profession, this article draws on findings from a study involving in-depth interviews with 60 lay people from a wide range of backgrounds living in Sydney. These data suggest that, in their interactions with doctors and other health care workers, lay people may pursue both the ideal-type "consumerist" and the "passive patient" subject position simultaneously or variously, depending on the context. The article concludes that late modernist notions of reflexivity as applied to issues of consumerism fail to recognize the complexity and changeable nature of the desires, emotions and needs that characterize the patient-doctor relationship.
Flow management for hydropower extirpates aquatic insects, undermining river food webs
Kennedy, Theodore A.; Muehlbauer, Jeffrey D.; Yackulic, Charles B.; Lytle, D.A.; Miller, S.A.; Dibble, Kimberly L.; Kortenhoeven, Eric W.; Metcalfe, Anya; Baxter, Colden V.
2016-01-01
Dams impound the majority of rivers and provide important societal benefits, especially daily water releases that enable on-peak hydroelectricity generation. Such “hydropeaking” is common worldwide, but its downstream impacts remain unclear. We evaluated the response of aquatic insects, a cornerstone of river food webs, to hydropeaking using a life history–hydrodynamic model. Our model predicts that aquatic-insect abundance will depend on a basic life-history trait—adult egg-laying behavior—such that open-water egg layers will be unaffected by hydropeaking, whereas ecologically important and widespread river-edge egg layers, such as mayflies, will be extirpated. These predictions are supported by a more-than-2500-sample, citizen-science data set of aquatic insects from the Colorado River in the Grand Canyon and by a survey of insect diversity and hydropeaking intensity across dammed rivers of the Western United States. Our study reveals a hydropeaking-related life history bottleneck that precludes viable populations of many aquatic insects from inhabiting regulated rivers.
Myth or Reality-Transdermal Magnesium?
Gröber, Uwe; Werner, Tanja; Vormann, Jürgen; Kisters, Klaus
2017-07-28
In the following review, we evaluated the current literature and evidence-based data on transdermal magnesium application and show that the propagation of transdermal magnesium is scientifically unsupported. The importance of magnesium and the positive effects of magnesium supplementation are extensively documented in magnesium deficiency, e.g., cardiovascular disease and diabetes mellitus. The effectiveness of oral magnesium supplementation for the treatment of magnesium deficiency has been studied in detail. However, the proven and well-documented oral magnesium supplementation has been called into question in recent years through intensive marketing of its transdermal application (e.g., magnesium-containing sprays, magnesium flakes, and magnesium salt baths). In both the specialist and lay press, as well as on the internet, there are increasing numbers of articles claiming the effectiveness and superiority of transdermal magnesium over oral application. It is claimed that transdermal absorption of magnesium is more effective than oral application, with better absorption and fewer side effects, because it bypasses the gastrointestinal tract.
Myth or Reality—Transdermal Magnesium?
Gröber, Uwe; Werner, Tanja; Vormann, Jürgen
2017-01-01
In the following review, we evaluated the current literature and evidence-based data on transdermal magnesium application and show that the propagation of transdermal magnesium is scientifically unsupported. The importance of magnesium and the positive effects of magnesium supplementation are extensively documented in magnesium deficiency, e.g., cardiovascular disease and diabetes mellitus. The effectiveness of oral magnesium supplementation for the treatment of magnesium deficiency has been studied in detail. However, the proven and well-documented oral magnesium supplementation has been called into question in recent years through intensive marketing of its transdermal application (e.g., magnesium-containing sprays, magnesium flakes, and magnesium salt baths). In both the specialist and lay press, as well as on the internet, there are increasing numbers of articles claiming the effectiveness and superiority of transdermal magnesium over oral application. It is claimed that transdermal absorption of magnesium is more effective than oral application, with better absorption and fewer side effects, because it bypasses the gastrointestinal tract. PMID:28788060
INTERFRAGMENTARY SURFACE AREA AS AN INDEX OF COMMINUTION SEVERITY IN CORTICAL BONE IMPACT
Beardsley, Christina L.; Anderson, Donald D.; Marsh, J. Lawrence; Brown, Thomas D.
2008-01-01
A monotonic relationship is expected between energy absorption and fracture surface area generation for brittle solids, based on fracture mechanics principles. It was hypothesized that this relationship is demonstrable in bone, to the point that on a continuous scale, comminuted fractures created with specific levels of energy delivery could be discriminated from one another. Using bovine cortical bone segments in conjunction with digital image analysis of CT fracture data, the surface area freed by controlled impact fracture events was measured. The results demonstrated a statistically significant (p<0.0001) difference in measured de novo surface area between three specimen groups, over a range of input energies from 0.423 to 0.702 J/g. Local material properties were also incorporated into these measurements via CT Hounsfield intensities. This study confirms that comminution severity of bone fractures can indeed be measured on a continuous scale, based on energy absorption. This lays a foundation for similar assessments in human injuries. PMID:15885492
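For orientation, the fracture-mechanics expectation referred to above can be written in its textbook form: for an ideally brittle solid, absorbed energy scales with the new surface area created, with the critical energy release rate (or, equivalently, twice the surface energy) as the proportionality constant. The symbols below are generic and are not taken from the paper.

```latex
% Generic Griffith-type relation between absorbed energy and new fracture
% surface area for an ideally brittle solid (symbols are generic, not the paper's):
\[
  E_{\mathrm{absorbed}} \;\approx\; G_c \,\Delta A \;=\; 2\gamma_s \,\Delta A
\]
```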
LANDSAT-1 data, its use in a soil survey program
NASA Technical Reports Server (NTRS)
Westin, F. C.; Frazee, C. J.
1975-01-01
The following applications of LANDSAT imagery were investigated: assistance in recognizing soil survey boundaries, low intensity soil surveys, and preparation of a base map for publishing thematic soils maps. The following characteristics of LANDSAT imagery were tested as they apply to the recognition of soil boundaries in South Dakota and western Minnesota: synoptic views due to the large areas covered, near-orthography and lack of distortion, flexibility of selecting the proper season, data recording in four parts of the spectrum, and the use of computer compatible tapes. A low intensity soil survey of Pennington County, South Dakota was completed in 1974. Low intensity inexpensive soil surveys can provide the data needed to evaluate agricultural land for the remaining counties until detailed soil surveys are completed. In using LANDSAT imagery as a base map for publishing thematic soil maps, the first step was to prepare a mosaic with 20 LANDSAT scenes from several late spring passes in 1973.
The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update
Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy
2016-01-01
High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in the life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889
Open source data logger for low-cost environmental monitoring
2014-01-01
The increasing transformation of biodiversity into a data-intensive science has seen numerous independent systems linked and aggregated into the current landscape of biodiversity informatics. This paper outlines how we can move forward with this programme, incorporating real time environmental monitoring into our methodology using low-power and low-cost computing platforms. PMID:24855446
Dear, Rachel; Barratt, Alexandra; Askie, Lisa; McGeechan, Kevin; Arora, Sheena; Crossing, Sally; Currow, David; Tattersall, Martin
2011-02-01
Clinical trials registries are now operating in the USA, Europe, Australia, China, and India, and more are planned. Trial registries could be an excellent source of information about clinical trials for patients and others affected by cancer as well as health care professionals, but may be difficult for patients to navigate and use. An opportunity arose in Australia to develop a consumer-friendly cancer clinical trials website (Australian Cancer Trials Online (ACTO), www.australiancancertrials.gov.au) using an automated data feed from two large clinical trial registries. In this article, we describe aspects of this new website, and explore ways in which such a website may add value to clinical trial data which are already collected and held by trial registries. The development of ACTO was completed by a Web company working in close association with staff at the Australian New Zealand Clinical Trials Registry (ANZCTR), and with consumer representatives. Data for the website were sourced directly and only from clinical trial registries, thus avoiding the creation of an additional trials database. It receives an automated, daily data feed of newly registered cancer clinical trials from both the ANZCTR and ClinicalTrials.gov. The development of ACTO exemplifies the advantage of a local clinical trial registry working with consumers to provide accessible information about cancer clinical trials to meet consumers' information needs. We found that the inclusion of a lay summary added substantial value for consumers, and recommend that consideration be given to adding a lay summary to the mandatory data items collected by all trial registries. Furthermore, improved navigation, decision support tools, and consistency in data collection between clinical trial registries will also enable consumer websites to provide additional value for users. Clinical trial registration is not compulsory in Australia. If the additional cancer items (including a lay summary) are not provided by registrants of cancer trials on ANZCTR, this can compromise the quality and usefulness of the data for the end-user, in this case consumers, as they may encounter gaps in the data. Expanding the World Health Organization Trial Registration Data Set to include this additional information, particularly the lay summary, would be valuable. A well-coordinated system of clinical trial registration is critical to the success of efforts to provide better access for all to information about clinical trials.
NASA Technical Reports Server (NTRS)
Ly, Vuong T.; Mandl, Daniel J.
2014-01-01
This presentation lays out the data processing products that exist and are planned for the Matsu cloud for Earth Observing 1. The presentation focuses on a new feature called co-registration of Earth Observing 1 with Landsat Global Land Survey chips.
Adsorption of n-hexane and intermediate molecular weight aromatic hydrocarbons on LaY zeolite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruthven, D.M.; Kaul, B.K.
Experimental equilibrium isotherms, Henry's law constants, and heats of sorption are reported for n-hexane, benzene, toluene, p-xylene, mesitylene, naphthalene, trimethylbenzene (TMP), and hexamethylbenzene (HMB) in La-exchanged zeolite Y (Si/Al = 1.8). Henry's law constants and energies of adsorption are substantially smaller than those for NaX zeolite, reflecting the absence of accessible cations in LaY. These data provide a basis for the estimation of adsorbed phase concentrations of the relevant hydrocarbons on REY cracking catalysts under reaction conditions.
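As background to how Henry's law constants and energies of adsorption relate to one another, the standard van't Hoff form is typically used to extract the limiting heat of sorption from constants measured at several temperatures. The notation below is generic and not necessarily that used in the report.

```latex
% Standard van't Hoff form for the temperature dependence of the Henry's law
% constant, from which the limiting energy of adsorption follows as the slope
% of ln K' versus 1/T (generic notation, not necessarily the paper's):
\[
  K'(T) = K'_0 \, e^{-\Delta U_0 / RT},
  \qquad
  -\Delta U_0 = R \, \frac{d \ln K'}{d(1/T)}
\]
```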
Cai, L; Nyachoti, C M; Hancock, J D; Lee, J Y; Kim, Y H; Lee, D H; Kim, I H
2016-06-01
The objective of this study was to determine the effects of rare earth element-enriched yeast (RY) on egg production, coefficient of total tract apparent digestibility (CTTAD), egg quality, excreta gas emission and excreta microbiota of laying hens. A total of 216 ISA brown laying hens of 52 weeks of age were used in a 5-week feeding trial and data were collected every week. Birds were randomly allotted to three dietary treatments each with six replicates and 12 hens per replicate. Each cage (38 cm width × 50 cm length × 40 cm height) contained one hen. Treatments consisted of corn-soya bean meal-based diet supplemented with 0, 500 or 1000 mg/kg of RY. From weeks 55 to 56, inclusion of RY linearly increased (p < 0.05) egg production. The CTTAD of nitrogen was increased (linear, p < 0.05) with increasing dietary level of RY. In week 55, yolk height and Haugh units were increased linearly (p < 0.05) with increasing dietary RY content. However, no significant effects were observed in terms of excreta emissions and excreta microbiota in laying hens. In conclusion, dietary supplementation with RY improved egg production and CTTAD of nitrogen and slightly improved egg quality in laying hens of the late period of peak egg production. Journal of Animal Physiology and Animal Nutrition © 2015 Blackwell Verlag GmbH.
Iskender, H; Yenice, G; Dokumacioglu, E; Kaynar, O; Hayirli, A; Kaya, A
2017-10-01
1. The aim of this experiment was to compare the effects of dietary supplementation of hesperidin, naringin and quercetin on laying hen performance, egg quality and egg yolk lipid and protein profiles. 2. A total of 96 Lohmann White laying hens weighing an average of 1500 g at 28 weeks of age were randomly assigned to a basal diet and the basal diet supplemented (0.5 g/kg) with either hesperidin, naringin or quercetin. Each treatment was replicated in 6 cages in an 8-week experimental period. Data were analysed using one-way analysis of variance. 3. None of the dietary flavonoids affected laying performance and eggshell quality. Hesperidin and quercetin supplementations decreased albumen and yolk indexes. 4. As compared to the control group, egg yolk cholesterol content decreased and egg yolk protein content increased in response to dietary hesperidin and quercetin supplementation. The mean egg yolk cholesterol (mg/g) and protein (g/100 g) contents were 10.08/14.28, 16.12/14.08, 14.75/15.04 and 15.15/14.85 for the control group and groups supplemented with naringin, hesperidin and quercetin, respectively. 5. Egg yolk lipid and protein profiles were variable. 6. In conclusion, dietary supplementation of hesperidin or quercetin could be used in the diets during the early laying period to reduce egg yolk cholesterol and increase egg yolk protein, which may be attractive to consumers.
Muirhead, Vanessa; Levine, Alissa; Nicolau, Belinda; Landry, Anne; Bedos, Christophe
2013-02-01
This study aimed to better understand low-income parents' child dental care decisions through a life course approach that captured parents' experiences within the social context of poverty. We conducted 43 qualitative life history interviews with 10 parents, who were long-term social assistance recipients living in Montreal, Canada. Thematic analysis involved interview debriefing, transcript coding, theme identification and data interpretation. Our interviews identified two emergent themes: lay diagnosis and parental oral health management. Parents described a process of 'lay diagnosis' that consisted of examining their children's teeth and interpreting their children's oral signs and symptoms based on their observations. These lay diagnoses were also shaped by their own dental crises, care experiences and oral health knowledge gained across a life course of poverty and dental disadvantage. Parents' management strategies included monitoring and managing their children's oral health themselves or by seeking professional recourse. Parents' management strategies were influenced both by their lay diagnoses and their perceived ability to manage their children's oral health. Parents felt responsible for their children's dental care, empowered to manage their oral health and sometimes forgo dental visits for their children because of their own self-management life history. This original approach revealed insights that help to understand why low-income parents may underutilize free dental services. Further research should consider how dental programs can nurture parental empowerment and capitalize on parents' perceived ability to diagnose and manage their children's oral health.
Nutrient reserve dynamics of breeding canvasbacks
Barzen, Jeb A.; Serie, Jerome R.
1990-01-01
We compared nutrients in reproductive and nonreproductive tissues of breeding Canvasbacks (Aythya valisineria) to assess the relative importance of endogenous reserves and exogenous foods. Fat reserves of females increased during rapid follicle growth and varied more widely in size during the early phase of this period. Females began laying with ca. 205 g of fat in reserve and lost 1.8 g of carcass fat for every 1 g of fat contained in their ovary and eggs. Females lost body mass (primarily fat) at a declining rate as incubation advanced. Protein reserves increased directly with dry oviduct mass during rapid follicle growth. This direct relationship was highly dependent upon data from 2 birds and likely biased by structural size. During laying, protein reserves did not vary with the combined mass of dry oviduct and dry egg protein. Between laying and incubation, mean protein reserves decreased by an amount equal to the protein found in 2.1 Canvasback eggs. Calcium reserves did not vary with the cumulative total of calcium deposited in eggs. Mean calcium reserve declined by the equivalent content of 1.2 eggs between laying and incubation. We believe that protein and calcium were stored in small amounts during laying, and that they were supplemented continually by exogenous sources. In contrast, fat was stored in large amounts and contributed significantly to egg production and body maintenance. Male Canvasbacks lost fat steadily-but not protein or calcium-as the breeding season progressed.
Visualizing multiattribute Web transactions using a freeze technique
NASA Astrophysics Data System (ADS)
Hao, Ming C.; Cotting, Daniel; Dayal, Umeshwar; Machiraju, Vijay; Garg, Pankaj
2003-05-01
Web transactions are multidimensional and have a number of attributes: client, URL, response times, and numbers of messages. One of the key questions is how to simultaneously lay out in a graph the multiple relationships, such as the relationships between the web client response times and URLs in a web access application. In this paper, we describe a freeze technique to enhance a physics-based visualization system for web transactions. The idea is to freeze one set of objects before laying out the next set of objects during the construction of the graph. As a result, we substantially reduce the force computation time. This technique consists of three steps: automated classification, a freeze operation, and a graph layout. These three steps are iterated until the final graph is generated. This iterated-freeze technique has been prototyped in several e-service applications at Hewlett Packard Laboratories. It has been used to visually analyze large volumes of service and sales transactions at online web sites.
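The toy sketch below illustrates the freezing idea in the abstract above: a batch of already-placed nodes is held fixed while the next batch is laid out against it, so repulsive and spring forces are still computed against the frozen nodes but only the new nodes move. It is a simplified illustration, not the HP Labs system; node names, constants and the force model are invented.

```python
# Illustrative sketch of layout with a frozen node set: forces are computed
# against all nodes, but only the movable batch is updated. Constants, node
# names and the force model are hypothetical simplifications.
import random

def layout(positions, frozen, edges, steps=200, k=0.05):
    """positions: dict node -> [x, y]; frozen: set of node names held fixed."""
    movable = [n for n in positions if n not in frozen]
    for _ in range(steps):
        for n in movable:
            fx = fy = 0.0
            for m in positions:                 # repulsion from every node, frozen or not
                if m == n:
                    continue
                dx = positions[n][0] - positions[m][0]
                dy = positions[n][1] - positions[m][1]
                d2 = dx * dx + dy * dy + 1e-9
                fx += dx / d2
                fy += dy / d2
            for a, b in edges:                  # spring attraction along edges
                if n in (a, b):
                    other = b if n == a else a
                    fx -= k * (positions[n][0] - positions[other][0])
                    fy -= k * (positions[n][1] - positions[other][1])
            positions[n][0] += k * fx
            positions[n][1] += k * fy
    return positions

# First batch (URLs) already placed and frozen; a client node is laid out against it.
pos = {"urlA": [0.0, 0.0], "urlB": [1.0, 0.0], "client1": [random.random(), random.random()]}
layout(pos, frozen={"urlA", "urlB"}, edges=[("client1", "urlA"), ("client1", "urlB")])
print(pos)
```

Because the frozen nodes never move, the expensive force updates apply only to the new batch, which is the source of the reduction in force computation time the abstract describes.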
Room temperature linelists for CO2 asymmetric isotopologues with ab initio computed intensities
NASA Astrophysics Data System (ADS)
Zak, Emil J.; Tennyson, Jonathan; Polyansky, Oleg L.; Lodi, Lorenzo; Zobov, Nikolay F.; Tashkun, Sergei A.; Perevalov, Valery I.
2017-12-01
The present paper reports room temperature line lists for six asymmetric isotopologues of carbon dioxide: 16O12C18O (628), 16O12C17O (627), 16O13C18O (638),16O13C17O (637), 17O12C18O (728) and 17O13C18O (738), covering the range 0-8000 cm-1. Variational rotation-vibration wavefunctions and energy levels are computed using the DVR3D software suite and a high quality semi-empirical potential energy surface (PES), followed by computation of intensities using an ab initio dipole moment surface (DMS). A theoretical procedure for quantifying sensitivity of line intensities to minor distortions of the PES/DMS renders our theoretical model as critically evaluated. Several recent high quality measurements and theoretical approaches are discussed to provide a benchmark of our results against the most accurate available data. Indeed, the thesis of transferability of accuracy among different isotopologues with the use of mass-independent PES is supported by several examples. Thereby, we conclude that the majority of line intensities for strong bands are predicted with sub-percent accuracy. Accurate line positions are generated using an effective Hamiltonian, constructed from the latest experiments. This study completes the list of relevant isotopologues of carbon dioxide; these line lists are available to remote sensing studies and inclusion in databases.
Advanced processing for high-bandwidth sensor systems
NASA Astrophysics Data System (ADS)
Szymanski, John J.; Blain, Phil C.; Bloch, Jeffrey J.; Brislawn, Christopher M.; Brumby, Steven P.; Cafferty, Maureen M.; Dunham, Mark E.; Frigo, Janette R.; Gokhale, Maya; Harvey, Neal R.; Kenyon, Garrett; Kim, Won-Ha; Layne, J.; Lavenier, Dominique D.; McCabe, Kevin P.; Mitchell, Melanie; Moore, Kurt R.; Perkins, Simon J.; Porter, Reid B.; Robinson, S.; Salazar, Alfonso; Theiler, James P.; Young, Aaron C.
2000-11-01
Compute performance and algorithm design are key problems of image processing and scientific computing in general. For example, imaging spectrometers are capable of producing data in hundreds of spectral bands with millions of pixels. These data sets show great promise for remote sensing applications, but require new and computationally intensive processing. The goal of the Deployable Adaptive Processing Systems (DAPS) project at Los Alamos National Laboratory is to develop advanced processing hardware and algorithms for high-bandwidth sensor applications. The project has produced electronics for processing multi- and hyper-spectral sensor data, as well as LIDAR data, while employing processing elements using a variety of technologies. The project team is currently working on reconfigurable computing technology and advanced feature extraction techniques, with an emphasis on their application to image and RF signal processing. This paper presents reconfigurable computing technology and advanced feature extraction algorithm work and their application to multi- and hyperspectral image processing. Related projects on genetic algorithms as applied to image processing will be introduced, as will the collaboration between the DAPS project and the DARPA Adaptive Computing Systems program. Further details are presented in other talks during this conference and in other conferences taking place during this symposium.
Barnfield, Sarah; Pitts, Alison Clara; Kalaria, Raj; Allan, Louise; Tullo, Ellen
2017-01-01
Why did we do this study? It can be difficult for scientists to communicate their research findings to the public. This is partly due to the complexity of translating scientific language into words that the public understand. Further, it may be hard for the public to find out about and locate information about research studies. We aimed to adapt some scientific articles about the links between dementia and stroke into lay summaries to be displayed online for the general public. How did we do it? We collaborated with five people from a volunteer organisation, VOICENorth. They took part in two group discussions about studies reporting on the link between dementia and stroke, and selected four studies to translate into lay summaries and display on a website. We discussed the layout and language of the summaries and made adaptations to make them more understandable to the general public. What did we find? We were able to work with members of the public to translate research findings into lay summaries suitable for a general audience. We made changes to language and layout including the use of 'question and answer' style layouts, the addition of a reference list of scientific terms, and removing certain words. What does this mean? Working with members of the public is a realistic way to create resources that improve the accessibility of research findings to the wider public. Background Scientific research is often poorly understood by the general public and difficult for them to access. This presents a major barrier to disseminating and translating research findings. Stroke and dementia are both major public health issues, and research has shown lifestyle measures help to prevent them. This project aimed to select a series of studies from the Newcastle Cognitive Function after Stroke cohort (COGFAST) and create lay summaries comprehensible and accessible to the public. Methods We used a focus group format to collaborate with five members of the public to review COGFAST studies, prioritise those of most interest to the wider public, and modify the language and layout of the selected lay summaries. Focus groups were audio-taped and the team used the data to make iterative amendments, as suggested by members of the public, to the summaries and to a research website. We calculated the Flesch reading ease and Flesch-Kincaid grade level for each summary before and after the changes were made. Results In total, we worked with five members of the public in two focus groups to examine draft lay summaries, created by researchers, relating to eight COGFAST studies. Members of the public prioritised four COGFAST lay summaries according to the importance of the topic to the general public. We made a series of revisions to the summaries including the use of 'question and answer' style layouts, the addition of a glossary, and the exclusion of scientific jargon. Group discussion highlighted that lay summaries should be engaging, concise and comprehensible. We incorporated suggestions from members of the public into the design of a study website to display the summaries. The application of existing quantitative tools to estimate readability resulted in an apparently paradoxical increase in complexity of the lay summaries following the changes made. Conclusion This study supports previous literature demonstrating challenges in creating generic guidelines for researchers to create lay summaries. Existing quantitative metrics to assess readability may be inappropriate for assessing scientific lay summaries. 
We have shown it is feasible and successful to involve members of the public to create lay summaries to communicate the findings of complex scientific research. Trial registration Not applicable to the lay summary project.
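For reference, the two readability metrics mentioned above can be computed with the standard Flesch formulas, as in the short sketch below. The syllable counter is a naive vowel-group heuristic of my own, so scores will differ somewhat from the tools the authors used.

```python
# Standard Flesch Reading Ease and Flesch-Kincaid Grade Level formulas.
# The syllable counter is a rough vowel-group heuristic, not a dictionary lookup.
import re

def count_syllables(word):
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    n_syll = sum(count_syllables(w) for w in words)
    flesch = 206.835 - 1.015 * (n_words / sentences) - 84.6 * (n_syll / n_words)
    fk_grade = 0.39 * (n_words / sentences) + 11.8 * (n_syll / n_words) - 15.59
    return round(flesch, 1), round(fk_grade, 1)

print(readability("Stroke can raise the risk of memory problems later in life."))
```

As the study notes, such word- and syllable-count metrics can rise when a summary is expanded with glossaries and question-and-answer headings, even though the text becomes easier for a lay reader to follow.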
Hadoop-GIS: A High Performance Spatial Data Warehousing System over MapReduce.
Aji, Ablimit; Wang, Fusheng; Vo, Hoang; Lee, Rubao; Liu, Qiaoling; Zhang, Xiaodong; Saltz, Joel
2013-08-01
Support of high performance queries on large volumes of spatial data becomes increasingly important in many application domains, including geospatial problems in numerous fields, location based services, and emerging scientific applications that are increasingly data- and compute-intensive. The emergence of massive scale spatial data is due to the proliferation of cost effective and ubiquitous positioning technologies, development of high resolution imaging technologies, and contribution from a large number of community users. There are two major challenges for managing and querying massive spatial data to support spatial queries: the explosion of spatial data, and the high computational complexity of spatial queries. In this paper, we present Hadoop-GIS - a scalable and high performance spatial data warehousing system for running large scale spatial queries on Hadoop. Hadoop-GIS supports multiple types of spatial queries on MapReduce through spatial partitioning, customizable spatial query engine RESQUE, implicit parallel spatial query execution on MapReduce, and effective methods for amending query results through handling boundary objects. Hadoop-GIS utilizes global partition indexing and customizable on demand local spatial indexing to achieve efficient query processing. Hadoop-GIS is integrated into Hive to support declarative spatial queries with an integrated architecture. Our experiments have demonstrated the high efficiency of Hadoop-GIS on query response and high scalability to run on commodity clusters. Our comparative experiments have shown that the performance of Hadoop-GIS is on par with parallel SDBMS and outperforms SDBMS for compute-intensive queries. Hadoop-GIS is available as a set of libraries for processing spatial queries, and as an integrated software package in Hive.
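The sketch below shows the general idea of grid-based spatial partitioning for parallel query execution: each object is assigned to every tile its bounding box overlaps, one reduce task per tile, which is also why results later need boundary-object handling. The tile scheme and names are generic illustrations, not Hadoop-GIS internals.

```python
# Simplified illustration of grid-based spatial partitioning for MapReduce-style
# parallel spatial queries (generic scheme, not Hadoop-GIS internals). Objects
# straddling a tile boundary land in every tile they touch, so query results
# must later be de-duplicated / amended at the boundaries.
def tile_ids(bbox, tile_size=1.0):
    """bbox = (xmin, ymin, xmax, ymax); return every tile the box overlaps."""
    x0, y0, x1, y1 = bbox
    tiles = []
    ix = int(x0 // tile_size)
    while ix * tile_size <= x1:
        iy = int(y0 // tile_size)
        while iy * tile_size <= y1:
            tiles.append((ix, iy))
            iy += 1
        ix += 1
    return tiles

def partition(objects, tile_size=1.0):
    """Map each (object_id, bbox) to the tiles it overlaps - one bucket per tile."""
    buckets = {}
    for oid, bbox in objects:
        for t in tile_ids(bbox, tile_size):
            buckets.setdefault(t, []).append(oid)
    return buckets

print(partition([("a", (0.2, 0.2, 0.4, 0.4)), ("b", (0.9, 0.9, 1.1, 1.2))]))
```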
Hadoop-GIS: A High Performance Spatial Data Warehousing System over MapReduce
Aji, Ablimit; Wang, Fusheng; Vo, Hoang; Lee, Rubao; Liu, Qiaoling; Zhang, Xiaodong; Saltz, Joel
2013-01-01
Support of high performance queries on large volumes of spatial data becomes increasingly important in many application domains, including geospatial problems in numerous fields, location based services, and emerging scientific applications that are increasingly data- and compute-intensive. The emergence of massive scale spatial data is due to the proliferation of cost effective and ubiquitous positioning technologies, development of high resolution imaging technologies, and contribution from a large number of community users. There are two major challenges for managing and querying massive spatial data to support spatial queries: the explosion of spatial data, and the high computational complexity of spatial queries. In this paper, we present Hadoop-GIS – a scalable and high performance spatial data warehousing system for running large scale spatial queries on Hadoop. Hadoop-GIS supports multiple types of spatial queries on MapReduce through spatial partitioning, customizable spatial query engine RESQUE, implicit parallel spatial query execution on MapReduce, and effective methods for amending query results through handling boundary objects. Hadoop-GIS utilizes global partition indexing and customizable on demand local spatial indexing to achieve efficient query processing. Hadoop-GIS is integrated into Hive to support declarative spatial queries with an integrated architecture. Our experiments have demonstrated the high efficiency of Hadoop-GIS on query response and high scalability to run on commodity clusters. Our comparative experiments have shown that the performance of Hadoop-GIS is on par with parallel SDBMS and outperforms SDBMS for compute-intensive queries. Hadoop-GIS is available as a set of libraries for processing spatial queries, and as an integrated software package in Hive. PMID:24187650
Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R; Bock, Davi D; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R Clay; Smith, Stephen J; Szalay, Alexander S; Vogelstein, Joshua T; Vogelstein, R Jacob
2013-01-01
We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes - neural connectivity maps of the brain - using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems - reads to parallel disk arrays and writes to solid-state storage - to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization.
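One common way to partition a 3-d spatial index across cluster nodes is to interleave the bits of the (x, y, z) coordinates into a Morton (Z-order) code and shard on that key. The abstract says only that data are distributed by partitioning a spatial index; the code below is an illustrative sketch of that general technique, not necessarily the system's actual scheme, and the cuboid size and node count are invented.

```python
# Sketch of spatial-index partitioning via a Morton (Z-order) code: interleave
# the bits of (x, y, z) so that spatially nearby blocks get nearby keys, then
# shard the keys across nodes. Illustrative only; cuboid size and node count
# are assumptions, not the deployed system's parameters.
def morton3(x, y, z, bits=10):
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (3 * i)
        code |= ((y >> i) & 1) << (3 * i + 1)
        code |= ((z >> i) & 1) << (3 * i + 2)
    return code

def node_for_cuboid(x, y, z, cuboid=128, n_nodes=8):
    """Assign the cuboid containing voxel (x, y, z) to a cluster node."""
    key = morton3(x // cuboid, y // cuboid, z // cuboid)
    return key % n_nodes   # simple sharding of the Z-order key across nodes

print(node_for_cuboid(4096, 4096, 512))
```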
Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R.; Bock, Davi D.; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C.; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R. Clay; Smith, Stephen J.; Szalay, Alexander S.; Vogelstein, Joshua T.; Vogelstein, R. Jacob
2013-01-01
We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes— neural connectivity maps of the brain—using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems—reads to parallel disk arrays and writes to solid-state storage—to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization. PMID:24401992
DDDAMS-based Urban Surveillance and Crowd Control via UAVs and UGVs
2015-12-04
Crowd dynamics are modelled by incorporating multi-resolution data: a grid-based method models crowd motion from the UAVs' low-resolution information, while higher-fidelity information is more computationally intensive and time-consuming, so fidelity selection is deployed within the simulation. (Table 1 of the report lists the detection parameters for the UAV and UGV.)
Kinematic synthesis of adjustable robotic mechanisms
NASA Astrophysics Data System (ADS)
Chuenchom, Thatchai
1993-01-01
Conventional hard automation, such as a linkage-based or a cam-driven system, provides high-speed capability and repeatability but not the flexibility required in many industrial applications. The conventional mechanisms, which are typically single-degree-of-freedom systems, are being increasingly replaced by multi-degree-of-freedom multi-actuators driven by logic controllers. Although this new trend in sophistication provides greatly enhanced flexibility, there are many instances where the flexibility needs are exaggerated and the associated complexity is unnecessary. Traditional mechanism-based hard automation, on the other hand, can neither fulfill multi-task requirements nor be cost-effective, mainly due to a lack of methods and tools to design-in flexibility. This dissertation attempts to bridge this technological gap by developing Adjustable Robotic Mechanisms (ARM's) or 'programmable mechanisms' as a middle ground between high-speed hard automation and expensive serial jointed-arm robots. This research introduces the concept of adjustable robotic mechanisms towards cost-effective manufacturing automation. A generalized analytical synthesis technique has been developed to support the computational design of ARM's that lays the theoretical foundation for synthesis of adjustable mechanisms. The synthesis method developed in this dissertation, called generalized adjustable dyad and triad synthesis, advances the well-known Burmester theory in kinematics to a new level. While this method provides planar solutions, a novel patented scheme is utilized for converting prescribed three-dimensional motion specifications into sets of planar projections. This provides an analytical and a computational tool for designing adjustable mechanisms that satisfy multiple sets of three-dimensional motion specifications. Several design issues were addressed, including adjustable parameter identification, branching defect, and mechanical errors. An efficient mathematical scheme for identification of the adjustable member was also developed. The analytical synthesis techniques developed in this dissertation were successfully implemented in a graphic-intensive user-friendly computer program. A physical prototype of a general purpose adjustable robotic mechanism has been constructed to serve as a proof-of-concept model.
Understanding trends in C-H bond activation in heterogeneous catalysis.
Latimer, Allegra A; Kulkarni, Ambarish R; Aljama, Hassan; Montoya, Joseph H; Yoo, Jong Suk; Tsai, Charlie; Abild-Pedersen, Frank; Studt, Felix; Nørskov, Jens K
2017-02-01
While the search for catalysts capable of directly converting methane to higher value commodity chemicals and liquid fuels has been active for over a century, a viable industrial process for selective methane activation has yet to be developed. Electronic structure calculations are playing an increasingly relevant role in this search, but large-scale materials screening efforts are hindered by computationally expensive transition state barrier calculations. The purpose of the present letter is twofold. First, we show that, for the wide range of catalysts that proceed via a radical intermediate, a unifying framework for predicting C-H activation barriers using a single universal descriptor can be established. Second, we combine this scaling approach with a thermodynamic analysis of active site formation to provide a map of methane activation rates. Our model successfully rationalizes the available empirical data and lays the foundation for future catalyst design strategies that transcend different catalyst classes.
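As a hedged illustration of the descriptor-based approach, the sketch below fits and applies a linear scaling relation of the form E_a ~ gamma * dE + xi; the descriptor values, barriers and resulting coefficients are placeholders, not data or parameters from the letter.

    # Fit a linear scaling relation between a single descriptor and C-H
    # activation barriers, then use it for screening. All numbers are
    # placeholders chosen only to show the workflow.
    import numpy as np

    dE = np.array([-0.8, -0.3, 0.1, 0.6, 1.1])   # hypothetical descriptor values (eV)
    Ea = np.array([0.4, 0.7, 0.9, 1.3, 1.6])     # hypothetical computed barriers (eV)

    gamma, xi = np.polyfit(dE, Ea, 1)            # least-squares slope and intercept
    predicted = gamma * dE + xi                  # barriers predicted from the descriptor alone
    print(f"E_a ~ {gamma:.2f} * dE + {xi:.2f} (eV)")

Once such a relation is established, only the (cheap) descriptor needs to be computed for new candidate sites, which is what makes large-scale screening tractable.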
Understanding trends in C–H bond activation in heterogeneous catalysis
Latimer, Allegra A.; Kulkarni, Ambarish R.; Aljama, Hassan; ...
2016-10-10
While the search for catalysts capable of directly converting methane to higher value commodity chemicals and liquid fuels has been active for over a century, a viable industrial process for selective methane activation has yet to be developed. Electronic structure calculations are playing an increasingly relevant role in this search, but large-scale materials screening efforts are hindered by computationally expensive transition state barrier calculations. The purpose of the present letter is twofold. First, we show that, for the wide range of catalysts that proceed via a radical intermediate, a unifying framework for predicting C–H activation barriers using a single universal descriptor can be established. Second, we combine this scaling approach with a thermodynamic analysis of active site formation to provide a map of methane activation rates. Lastly, our model successfully rationalizes the available empirical data and lays the foundation for future catalyst design strategies that transcend different catalyst classes.
Development of on line automatic separation device for apple and sleeve
NASA Astrophysics Data System (ADS)
Xin, Dengke; Ning, Duo; Wang, Kangle; Han, Yuhang
2018-04-01
An automatic separation device for fruit sleeves was designed around an STM32F407 single-chip microcomputer as the control core. The design comprises hardware and software. The hardware includes a mechanical tooth separator and a three-degree-of-freedom manipulator, as well as an industrial control computer, an image data acquisition card, an end effector and other components. The software system, built in the Visual C++ development environment, locates and recognizes the fruit sleeve using image processing and machine vision, and drives the manipulator to grasp the foam net sleeve, transfer it, and place it at the designated position. Tests show that the device responds quickly and achieves a high separation success rate; it can separate the apple from the plastic foam sleeve and lays the foundation for further study and for application on enterprise production lines.
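A minimal sketch of the kind of machine-vision step described above (locating a bright foam sleeve by color thresholding and reporting its centroid as a grasp target) follows; the image path, HSV bounds and OpenCV-based pipeline are assumptions, since the abstract does not specify the implementation details.

    # Locate a bright, low-saturation foam sleeve by color thresholding and
    # report its centroid as a grasp target. Image path and HSV bounds are
    # assumptions for illustration only.
    import cv2
    import numpy as np

    frame = cv2.imread("apple_with_sleeve.jpg")           # hypothetical camera frame
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 0, 180])                          # bright, nearly white pixels
    upper = np.array([180, 60, 255])
    mask = cv2.inRange(hsv, lower, upper)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    if contours:
        sleeve = max(contours, key=cv2.contourArea)        # largest bright blob = sleeve
        m = cv2.moments(sleeve)
        cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
        print("sleeve centroid:", cx, cy)                  # target pixel for the manipulator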
Molecular expressions: exploring the world of optics and microscopy. http://microscopy.fsu.edu.
Eliceiri, Kevin W
2004-08-01
Our knowledge of the structure, dynamics and physiology of a cell has increased significantly in the last ten years through the emergence of new optical imaging modalities such as optical sectioning microscopy, computer-enhanced video microscopy and laser-scanning microscopy. These techniques, together with the use of genetically engineered fluorophores, have helped scientists visualize the 3-dimensional dynamic processes of living cells. However, as powerful as these imaging tools are, they can often be difficult to understand and fully utilize. Below I will discuss my favorite website, the Molecular Expressions Web site, which endeavors to present the power of microscopy to its visitors. The Molecular Expressions group does a remarkable job not only of clearly presenting the principles behind these techniques in a manner approachable by lay and scientific audiences alike, but also of providing representative data from each.
Whitfield, Richard H; Newcombe, Robert G; Woollard, Malcolm
2003-12-01
The introduction of the European Resuscitation Guidelines (2000) for cardiopulmonary resuscitation (CPR) and automated external defibrillation (AED) prompted the development of an up-to-date and reliable method of assessing the quality of performance of CPR in combination with the use of an AED. The Cardiff Test of basic life support (BLS) and AED version 3.1 was developed to meet this need and uses standardised checklists to retrospectively evaluate performance from analyses of video recordings and data drawn from a laptop computer attached to a training manikin. This paper reports the inter- and intra-observer reliability of this test. Data used to assess reliability were obtained from an investigation of CPR and AED skill acquisition in a lay responder AED training programme. Six observers were recruited to evaluate performance in 33 data sets, repeating their evaluation after a minimum interval of 3 weeks. More than 70% of the 42 variables considered in this study had a kappa score of 0.70 or above for inter-observer reliability or were drawn from computer data and therefore not subject to evaluator variability. 85% of the 42 variables had kappa scores for intra-observer reliability of 0.70 or above or were drawn from computer data. The standard deviations for inter- and intra-observer measures of time to first shock were 11.6 and 7.7 s, respectively. The inter- and intra-observer reliability for the majority of the variables in the Cardiff Test of BLS and AED version 3.1 is satisfactory. However, reliability is less acceptable with respect to shaking when checking for responsiveness, initial check/clearing of the airway, checks for signs of circulation, time to first shock and performance of interventions in the correct sequence. Further research is required to determine if modifications to the method of assessing these variables can increase reliability.
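For readers unfamiliar with the kappa statistic used above, the following is a minimal sketch of computing Cohen's kappa for a single checklist variable rated by two observers; the ratings are invented for illustration and are not data from the study.

    # Cohen's kappa for one binary checklist item scored by two observers.
    # The ratings below are invented; kappa >= 0.70 was treated as satisfactory.
    from sklearn.metrics import cohen_kappa_score

    observer_1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]   # 1 = performed correctly, 0 = not
    observer_2 = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]

    kappa = cohen_kappa_score(observer_1, observer_2)
    print(f"kappa = {kappa:.2f}")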
ERIC Educational Resources Information Center
Buche, Mari W.; Davis, Larry R.; Vician, Chelley
2007-01-01
Computers are pervasive in business and education, and it would be easy to assume that all individuals embrace technology. However, evidence shows that roughly 30 to 40 percent of individuals experience some level of computer anxiety. Many academic programs involve computing-intensive courses, but the actual effects of this exposure on computer…
Myers, Matthew R; Giridhar, Dushyanth
2011-06-01
In the characterization of high-intensity focused ultrasound (HIFU) systems, it is desirable to know the intensity field within a tissue phantom. Infrared (IR) thermography is a potentially useful method for inferring this intensity field from the heating pattern within the phantom. However, IR measurements require an air layer between the phantom and the camera, making inferences about the thermal field in the absence of the air complicated. For example, convection currents can arise in the air layer and distort the measurements relative to the phantom-only situation. Quantitative predictions of intensity fields based upon IR temperature data are also complicated by axial and radial diffusion of heat. In this paper, mathematical expressions are derived for use with IR temperature data acquired at times long enough that noise is a relatively small fraction of the temperature trace, but short enough that convection currents have not yet developed. The relations were applied to simulated IR data sets derived from computed pressure and temperature fields. The simulation was performed in a finite-element geometry involving a HIFU transducer sonicating upward in a phantom toward an air interface, with an IR camera mounted atop an air layer, looking down at the heated interface. It was found that, when compared to the intensity field determined directly from acoustic propagation simulations, intensity profiles could be obtained from the simulated IR temperature data with an accuracy of better than 10%, at pre-focal, focal, and post-focal locations. © 2011 Acoustical Society of America
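As context for inferring intensity from early-time heating, the sketch below applies the generic short-time, diffusion-free relation dT/dt = 2*alpha*I/(rho*c_p); this is the textbook approximation, not the corrected expressions derived in the paper, and the property values are typical phantom numbers chosen only for illustration.

    # Infer local acoustic intensity from the initial temperature slope using the
    # diffusion-free approximation dT/dt = 2*alpha*I/(rho*c_p). Generic relation
    # and illustrative property values only; not the paper's derived expressions.
    alpha = 5.0      # acoustic absorption coefficient at the drive frequency (Np/m)
    rho = 1050.0     # phantom density (kg/m^3)
    c_p = 3700.0     # specific heat (J/kg/K)
    dT_dt = 1.2      # measured early-time temperature slope (K/s)

    intensity = rho * c_p * dT_dt / (2.0 * alpha)   # W/m^2
    print(f"I ~ {intensity / 1e4:.1f} W/cm^2")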
Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew
2015-01-01
Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, these tasks are challenging for geoscientists because processing the massive amounts of data is both computing- and data-intensive, and the analytics require complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. The framework leverages cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers, a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012
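A minimal pure-Python sketch of the map/reduce pattern that such a framework applies to gridded geoscience data (here, a per-cell mean) is shown below; it illustrates only the programming model, not the HBase/Hadoop cloud deployment itself, and the records are invented.

    # Map each (cell, value) record to (cell, (value, 1)), then reduce to a
    # per-cell mean. Illustrates the MapReduce pattern only; the framework
    # itself runs these phases in parallel on a Hadoop/HBase cluster.
    from collections import defaultdict

    records = [("cell_01", 281.4), ("cell_02", 279.9), ("cell_01", 282.1), ("cell_02", 280.3)]

    def map_phase(record):
        cell, value = record
        return cell, (value, 1)

    def reduce_phase(pairs):
        sums = defaultdict(lambda: [0.0, 0])
        for cell, (value, count) in pairs:
            sums[cell][0] += value
            sums[cell][1] += count
        return {cell: s / n for cell, (s, n) in sums.items()}

    print(reduce_phase(map(map_phase, records)))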
South, J; Kinsella, K; Meah, A
2012-08-01
This paper examines lay interpretations of lay health worker roles within three UK community-based health promotion projects. It argues that understanding lay health worker roles requires critical analysis of the complex interrelationships between professionals, lay workers and the communities receiving a programme. Findings are presented that are drawn from a qualitative study of lay engagement in public health programme delivery where a key objective was to examine the perspectives of community members with the experience of receiving services delivered by lay health workers. Interviews and focus groups were conducted with 46 programme recipients from three case study projects; a breastfeeding peer support service, a walking for health scheme and a neighbourhood health project. The results show how participants interpreted the function and responsibilities of lay health workers and how those roles provided personalized support and facilitated engagement in group activities. Further insights into community participation processes are provided revealing the potential for active engagement in both formal and informal roles. The paper concludes that social relationships are core to understanding lay health worker programmes and therefore analysis needs to take account of the capacity for community members to move within a spectrum of participation defined by increasing responsibility for others.
Presentation Of Comparative Data for Transportation Planning Studies
DOT National Transportation Integrated Search
1997-01-01
Clear, yet detailed, presentations of transportation planning data to lay groups as well as to technical groups is becoming more and more of a necessity in the planning process. Presentation of technical information in understandable terms has become...
NASA Astrophysics Data System (ADS)
Lescinsky, D. T.; Wyborn, L. A.; Evans, B. J. K.; Allen, C.; Fraser, R.; Rankine, T.
2014-12-01
We present collaborative work on a generic, modular infrastructure for virtual laboratories (VLs, similar to science gateways) that combine online access to data, scientific code, and computing resources as services that support multiple data-intensive scientific computing needs across a wide range of science disciplines. We are leveraging access to 10+ PB of earth science data on Lustre filesystems at Australia's National Computational Infrastructure (NCI) Research Data Storage Infrastructure (RDSI) node, co-located with NCI's 1.2 PFlop Raijin supercomputer and a 3000 CPU core research cloud. The development, maintenance and sustainability of VLs are best accomplished through modularisation and standardisation of interfaces between components. Our approach has been to break up tightly-coupled, specialised application packages into modules, with identified best techniques and algorithms repackaged either as data services or scientific tools that are accessible across domains. The data services can be used to manipulate, visualise and transform multiple data types, whilst the scientific tools can be used in concert with multiple scientific codes. We are currently designing a scalable generic infrastructure that will handle scientific code as modularised services and thereby enable the rapid and easy deployment of new codes or versions of codes. The goal is to build open source libraries and collections of scientific tools, scripts and modelling codes that can be combined in specially designed deployments. Additional services in development include provenance, publication of results, monitoring, and workflow tools. The generic VL infrastructure will be hosted at NCI, but can access alternative computing infrastructures (i.e., public/private cloud, HPC). The Virtual Geophysics Laboratory (VGL) was developed as a pilot project to demonstrate the underlying technology. This base is now being redesigned and generalised to develop a Virtual Hazards Impact and Risk Laboratory (VHIRL); any enhancements and new capabilities will be incorporated into a generic VL infrastructure. At the same time, we are scoping seven new VLs and, in the process, identifying other common components to prioritise and focus development.
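As a hedged sketch of the modularisation described above, the snippet below wraps a stub scientific tool as a stateless web service using Flask; the endpoint name and payload fields are hypothetical and are not part of the VGL or VHIRL implementations.

    # Expose a (stub) scientific tool as a stateless web service. Endpoint and
    # payload fields are hypothetical, for illustration of the service pattern.
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    @app.route("/services/subset", methods=["POST"])
    def subset():
        req = request.get_json()
        # A real deployment would dispatch to a registered tool or modelling code;
        # here we simply acknowledge the requested dataset and bounding box.
        return jsonify({"dataset": req.get("dataset"), "bbox": req.get("bbox"), "status": "queued"})

    if __name__ == "__main__":
        app.run(port=8080)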
GPU and APU computations of Finite Time Lyapunov Exponent fields
NASA Astrophysics Data System (ADS)
Conti, Christian; Rossinelli, Diego; Koumoutsakos, Petros
2012-03-01
We present GPU and APU accelerated computations of Finite-Time Lyapunov Exponent (FTLE) fields. The calculation of FTLEs is a computationally intensive process, as in order to obtain the sharp ridges associated with the Lagrangian Coherent Structures an extensive resampling of the flow field is required. The computational performance of this resampling is limited by the memory bandwidth of the underlying computer architecture. The present technique harnesses data-parallel execution of many-core architectures and relies on fast and accurate evaluations of moment conserving functions for the mesh to particle interpolations. We demonstrate how the computation of FTLEs can be efficiently performed on a GPU and on an APU through OpenCL and we report over one order of magnitude improvements over multi-threaded executions in FTLE computations of bluff body flows.
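For reference, a minimal serial NumPy sketch of the FTLE definition follows: form the flow-map gradient F, build the Cauchy-Green tensor C = F^T F, and take sigma = ln(sqrt(lambda_max(C))) / |T|. This illustrates the quantity being computed and is not the paper's GPU/APU OpenCL implementation.

    # Serial FTLE from a precomputed flow map on a regular grid.
    import numpy as np

    def ftle(flow_map_x, flow_map_y, dx, dy, T):
        dxdx, dxdy = np.gradient(flow_map_x, dx, dy)   # derivatives of Phi_x along x, y
        dydx, dydy = np.gradient(flow_map_y, dx, dy)   # derivatives of Phi_y along x, y
        sigma = np.zeros_like(flow_map_x)
        for i in range(flow_map_x.shape[0]):
            for j in range(flow_map_x.shape[1]):
                F = np.array([[dxdx[i, j], dxdy[i, j]], [dydx[i, j], dydy[i, j]]])
                C = F.T @ F                              # Cauchy-Green deformation tensor
                lam = np.linalg.eigvalsh(C)[-1]          # largest eigenvalue
                sigma[i, j] = np.log(np.sqrt(max(lam, 1e-30))) / abs(T)
        return sigma

    # Check: a pure stretching map x -> 2x, y -> y/2 over T = 1 gives FTLE = ln(2).
    x, y = np.meshgrid(np.linspace(0, 1, 32), np.linspace(0, 1, 32), indexing="ij")
    print(ftle(2 * x, 0.5 * y, x[1, 0] - x[0, 0], y[0, 1] - y[0, 0], 1.0).max())

The nested loops make the memory-bandwidth-bound nature of the computation evident, which is what the GPU/APU implementations in the paper are designed to exploit.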
Cloud Based Metalearning System for Predictive Modeling of Biomedical Data
Vukićević, Milan
2014-01-01
Rapid growth and storage of biomedical data have enabled many opportunities for predictive modeling and improvement of healthcare processes. On the other hand, analysis of such large amounts of data is a difficult and computationally intensive task for most existing data mining algorithms. This problem is addressed by proposing a cloud-based system that integrates a metalearning framework for ranking and selecting the best predictive algorithms for the data at hand with open source big data technologies for the analysis of biomedical data. PMID:24892101
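The algorithm-selection step can be sketched as ranking candidate classifiers by cross-validated score on the data at hand, as below; the full metalearning framework also uses meta-features from previously analysed datasets, which this sketch omits, and the scikit-learn toy dataset merely stands in for biomedical data.

    # Rank candidate classifiers by 5-fold cross-validated accuracy.
    # A stand-in dataset is used; the metalearning component is omitted.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)
    candidates = {
        "logistic_regression": LogisticRegression(max_iter=5000),
        "decision_tree": DecisionTreeClassifier(),
        "random_forest": RandomForestClassifier(n_estimators=100),
    }
    ranking = sorted(
        ((cross_val_score(model, X, y, cv=5).mean(), name) for name, model in candidates.items()),
        reverse=True,
    )
    for score, name in ranking:
        print(f"{name}: {score:.3f}")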
Should I lay or should I wait? Egg-laying in the two-spotted spider mite Tetranychus urticae Koch.
Clotuche, Gwendoline; Turlure, Camille; Mailleux, Anne-Catherine; Detrain, Claire; Hance, Thierry
2013-01-01
Optimality theory predicts that females tend to maximize their offspring survival by choosing the egg-laying site. In this context, the use of conspecific cues allows a more reliable assessment of habitat quality. To test this hypothesis, Tetranychus urticae Koch is an appropriate biological model, as it is a phytophagous mite that lives in groups, protected against external aggression by a common web. Experiments were conducted to determine the respective influence of substrate (living substrate: bean leaf vs. non-living substrate: glass plate), silk and the presence of conspecific eggs on the egg-laying behavior of T. urticae females. On both living and non-living substrates, the presence of silk positively influenced the probability of a female to lay an egg, but had no influence on the number of eggs deposited. The egg-laying behavior was mainly determined by the nature of the substrate, with mites laying fewer eggs on a non-living substrate than on a living one. The presence of a conspecific egg had no impact on either the probability of laying an egg or the oviposition rate. This study showed a high variability among females in their fecundity and egg-laying performance. The physiology of females (individual fecundity), the egg-laying substrate and, to a lesser extent, the presence of silk influenced the decision of spider mites to lay eggs. Copyright © 2012 Elsevier B.V. All rights reserved.