Robust Coordination for Large Sets of Simple Rovers
NASA Technical Reports Server (NTRS)
Tumer, Kagan; Agogino, Adrian
2006-01-01
The ability to coordinate sets of rovers in an unknown environment is critical to the long-term success of many of NASA's exploration missions. Such coordination policies must have the ability to adapt in unmodeled or partially modeled domains and must be robust against environmental noise and rover failures. In addition, such coordination policies must accommodate a large number of rovers without excessive and burdensome hand-tuning. In this paper we present a distributed coordination method that addresses these issues in the domain of controlling a set of simple rovers. The application of these methods allows reliable and efficient robotic exploration in dangerous, dynamic, and previously unexplored domains. Most control policies for space missions are directly programmed by engineers or created through the use of planning tools, and are appropriate for single-rover missions or missions requiring the coordination of a small number of rovers. Such methods typically require significant amounts of domain knowledge, and are difficult to scale to large numbers of rovers. The method described in this article aims to address cases where a large number of rovers need to coordinate to solve a complex time-dependent problem in a noisy environment. In this approach, each rover decomposes a global utility, representing the overall goal of the system, into rover-specific utilities that properly assign credit to the rover's actions. Each rover then has the responsibility to create a control policy that maximizes its own rover-specific utility. We show a method of creating rover-utilities that are "aligned" with the global utility, such that when the rovers maximize their own utility, they also maximize the global utility. In addition, we show that our method creates rover-utilities that allow the rovers to create their control policies quickly and reliably.
Our distributed learning method allows large sets of rovers to be used in unmodeled domains, while providing robustness against rover failures and changing environments. In experimental simulations we show that our method scales well with large numbers of rovers, in addition to being robust against noisy sensor inputs and noisy servo control. The results show that our method is able to scale to large numbers of rovers and achieves up to a 400% performance improvement over standard machine learning methods.
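A minimal sketch of the utility-alignment idea described in the abstract, under assumptions of ours: the global utility and the "remove one rover" form of per-rover credit below are illustrative (a 1-D world, an additive value over points of interest), not the authors' exact formulation.

```python
# Sketch of decomposing a global utility into rover-specific utilities that
# assign credit to each rover's own actions (illustrative, not the authors'
# exact equations). G sums the value of points of interest, discounted by the
# distance to the closest rover; a rover's utility is G minus G computed with
# that rover removed, so a rover that contributes nothing receives zero.

def global_utility(rover_positions, points):
    """points: list of (position, value); closer coverage scores higher."""
    total = 0.0
    for p, value in points:
        if rover_positions:
            total += value / (1.0 + min(abs(p - r) for r in rover_positions))
    return total

def difference_utility(i, rover_positions, points):
    """D_i = G(all rovers) - G(all rovers except rover i)."""
    others = rover_positions[:i] + rover_positions[i + 1:]
    return global_utility(rover_positions, points) - global_utility(others, points)

rovers = [0.0, 4.0, 9.0]
points = [(1.0, 5.0), (8.0, 3.0)]
# The middle rover is redundant (both points are covered better by others),
# so its rover-specific utility is zero -- maximizing D_i cannot hurt G.
d = [difference_utility(i, rovers, points) for i in range(len(rovers))]
```

Because each rover's utility is a difference of two evaluations of the same global function, improving it can only improve the global utility, which is the "alignment" property the abstract describes.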
Large-Eddy Simulation of Wind-Plant Aerodynamics: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Churchfield, M. J.; Lee, S.; Moriarty, P. J.
In this work, we present results of a large-eddy simulation of the 48 multi-megawatt turbines composing the Lillgrund wind plant. Turbulent inflow wind is created by performing an atmospheric boundary layer precursor simulation, and turbines are modeled using a rotating, variable-speed actuator line representation. The motivation for this work is that few others have done wind plant large-eddy simulations with a substantial number of turbines, and the methods for carrying out the simulations are varied. We wish to draw upon the strengths of the existing simulations and our growing atmospheric large-eddy simulation capability to create a sound methodology for performing this type of simulation. We have used the OpenFOAM CFD toolbox to create our solver.
English Teaching and the Economic Development of Colombia.
ERIC Educational Resources Information Center
Stansfield, Charles W.
To supply the large number of workers qualified for complex jobs, a demand created by the growing needs of a rapidly growing population, Colombia must make provisions for an expanded system of higher education. This can be accomplished by sending students abroad to study at the university level. The large number of students coming to the United…
Translations on Eastern Europe, Political, Sociological, and Military Affairs, Number 1404-A
1977-06-22
Personality Development in Light of Technological Progress (Harry Nick; EINHEIT, Apr 77) ... Significance of National Culture in Socialism ... industrialization in such a way as to create "labor intensive" technologies in the Hungarian regions, preferably those which require low levels of training ... technology, as well as the large number of well-paid personnel, create excellent conditions for the implementation of the plans of the ideological
Finding Safety in Small Numbers.
ERIC Educational Resources Information Center
McPartland, James; Jordan, Will; Legters, Nettie; Balfanz, Robert
1997-01-01
A large Baltimore high school has shown how personalizing relationships and focusing the curriculum can turn around an unsafe school and create a climate conducive to learning. The school adopted the Talent Development model, which created six smaller units or academies. Instead of suspending or transferring ill-behaved students, Patterson…
Creating Seamless K-16 Pathways: Role of Assessment
ERIC Educational Resources Information Center
Michaels, Hillary R.; Hawthorne, Katrice; Cuevas, Nuria M.; Mateev, Alexei G.
2011-01-01
The large number of underprepared students entering the nation's two-and four-year colleges and universities has created what Levin and Calcagno (2008) consider a "remediation crisis" (p.181). Despite the recent attainment of high school diplomas, many incoming students are academically unprepared for college-level coursework in reading,…
Integrating Community with Collections in Educational Digital Libraries
ERIC Educational Resources Information Center
Akbar, Monika
2013-01-01
Some classes of Internet users have specific information needs and specialized information-seeking behaviors. For example, educators who are designing a course might create a syllabus, recommend books, create lecture slides, and use tools as lecture aid. All of these resources are available online, but are scattered across a large number of…
Long, A. D.; Mullaney, S. L.; Reid, L. A.; Fry, J. D.; Langley, C. H.; Mackay, TFC.
1995-01-01
Factors responsible for selection response for abdominal bristle number, and correlated responses in sternopleural bristle number, were mapped to the X and third chromosomes of Drosophila melanogaster. Lines divergent for high and low abdominal bristle number were created by 25 generations of artificial selection from a large base population, with an intensity of 25 individuals of each sex selected from 100 individuals of each sex scored per generation. Isogenic chromosome substitution lines, in which the high (H) X or third chromosome was placed in an isogenic low (L) background, were derived from the selection lines, and 93 recombinant isogenic (RI) HL X and 67 RI chromosome 3 lines were constructed from them. Highly polymorphic neutral roo transposable elements were hybridized in situ to the polytene chromosomes of the RI lines to create a set of cytogenetic markers. These techniques yielded a dense map with an average spacing of 4 cM between informative markers. Factors affecting bristle number, and relative viability of the chromosome 3 RI lines, were mapped using a multiple regression interval mapping approach, conditioning on all markers ≥10 cM from the tested interval. Two factors with large effects on abdominal bristle number were mapped on the X chromosome and five factors on the third chromosome. One factor with a large effect on sternopleural bristle number was mapped to the X and two were mapped to the third chromosome; all factors with sternopleural effects corresponded to those with effects on abdominal bristle number. Two of the chromosome 3 factors with large effects on abdominal bristle number were also associated with reduced viability. Significant sex-specific effects, and epistatic interactions between mapped factors of the same order of magnitude as the additive effects, were observed.
All factors mapped to the approximate positions of likely candidate loci (ASC, bb, emc, h, mab, Dl and E(spl)), previously characterized by mutations with large effects on bristle number. PMID:7768438
LARGE AREA LAND COVER MAPPING THROUGH SCENE-BASED CLASSIFICATION COMPOSITING
Over the past decade, a number of initiatives have been undertaken to create definitive national and global data sets consisting of precision corrected Landsat MSS and TM scenes. One important application of these data is the derivation of large area land cover products spanning ...
Identification of forgeries in handwritten petitions for ballot propositions
NASA Astrophysics Data System (ADS)
Srihari, Sargur; Ramakrishnan, Veshnu; Malgireddy, Manavender; Ball, Gregory R.
2009-01-01
Many governments have some form of "direct democracy" legislation procedure whereby individual citizens can propose various measures creating or altering laws. Generally, such a process is started with the gathering of a large number of signatures. There is interest in whether or not there are fraudulent signatures present in such a petition, and if so what percentage of the signatures are indeed fraudulent. However, due to the large number of signatures (tens of thousands), it is not feasible to have a document examiner verify the signatures directly. Instead, there is interest in creating a subset of signatures where there is a high probability of fraud that can be verified. We present a method by which a pairwise comparison of signatures can be performed and subsequent sorting can generate such subsets.
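The pipeline the abstract describes, pairwise comparison followed by sorting to surface a small high-suspicion subset, can be sketched as follows. The feature vectors and Euclidean distance are our illustrative assumptions; the paper relies on document-examiner features that are not reproduced here.

```python
# Illustrative shape of the method: given feature vectors extracted from
# scanned signatures, score every pair by similarity and sort, so that
# suspiciously near-identical pairs (a common sign of copied/forged
# signatures) land in a small subset a human examiner can verify directly.

from itertools import combinations
import math

def distance(a, b):
    """Euclidean distance between two feature vectors (an assumed metric)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def suspicious_subset(features, top_k):
    """Return the top_k most mutually similar signature pairs, by index."""
    pairs = combinations(range(len(features)), 2)
    scored = sorted(pairs, key=lambda ij: distance(features[ij[0]], features[ij[1]]))
    return scored[:top_k]

# Toy data: signatures 0 and 3 are near-identical copies.
feats = [[0.10, 0.90], [0.55, 0.20], [0.80, 0.75], [0.11, 0.89]]
subset = suspicious_subset(feats, top_k=1)
```

This keeps examiner effort proportional to the subset size rather than the tens of thousands of raw signatures, which is the point of the approach.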
Geographic Skills: A Case Study of Students in the United Arab Emirates
ERIC Educational Resources Information Center
Alhosani, Naeema Mohamed Dawood; Yagoub, M. M.
2015-01-01
The worldwide technology boom has created an information revolution. Consequently, a large number of people who previously had limited access to geographic data can now use Internet-based geographic information for a number of diverse purposes. The average person has access to geographic information for tourism, shopping, business, and even route…
Michael D. Conner; Robert C. Wilkinson
1983-01-01
Ips beetles usually attack weakened, dying, or recently felled trees and fresh logging debris. Large numbers of Ips may build up when natural events such as lightning storms, ice storms, tornadoes, wildfires, and droughts create large amounts of pine suitable for the breeding of these beetles. Ips populations may also build up following forestry activities, such as...
The large scale microelectronics Computer-Aided Design and Test (CADAT) system
NASA Technical Reports Server (NTRS)
Gould, J. M.
1978-01-01
The CADAT system consists of a number of computer programs written in FORTRAN that provide the capability to simulate, lay out, analyze, and create the artwork for large scale microelectronics. The function of each software component of the system is described with references to specific documentation for each software component.
Ziaja, Beata; Saxena, Vikrant; Son, Sang-Kil; Medvedev, Nikita; Barbrel, Benjamin; Woloncewicz, Bianca; Stransky, Michal
2016-05-01
We report on the kinetic Boltzmann approach adapted for simulations of highly ionized matter created from a solid by its x-ray irradiation. X rays can excite inner-shell electrons, which leads to the creation of deeply lying core holes. Their relaxation, especially in heavier elements, can take complicated paths, leading to a large number of active configurations. Their number can be so large that solving the set of respective evolution equations becomes computationally inefficient and another modeling approach should be used instead. To circumvent this complexity, the commonly used continuum models employ a superconfiguration scheme. Here, we propose an alternative approach which still uses "true" atomic configurations but limits their number by restricting the sample relaxation to the predominant relaxation paths. We test its reliability, performing respective calculations for a bulk material consisting of light atoms and comparing the results with a full calculation including all relaxation paths. Prospective application for heavy elements is discussed.
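The path-restriction idea can be illustrated with a toy rate-equation model. Everything concrete below (the configuration names, rates, and cutoff) is invented for illustration; the real calculation evolves atomic-configuration populations from x-ray-irradiation cross sections.

```python
# Toy sketch of restricting relaxation to predominant paths: evolve
# populations of atomic configurations with simple rate equations, but drop
# transitions whose rate falls below a cutoff, so the set of active
# configurations (and evolution equations) stays computationally small.

def evolve(populations, rates, dt, steps, cutoff=0.0):
    """rates: {(src, dst): rate}. Transitions with rate <= cutoff are pruned,
    mimicking restriction to the predominant relaxation paths."""
    active = {k: v for k, v in rates.items() if v > cutoff}
    pop = dict(populations)
    for _ in range(steps):
        flow = {}
        for (src, dst), r in active.items():
            f = r * pop.get(src, 0.0) * dt   # forward-Euler transfer this step
            flow[src] = flow.get(src, 0.0) - f
            flow[dst] = flow.get(dst, 0.0) + f
        for k, df in flow.items():
            pop[k] = pop.get(k, 0.0) + df
    return pop

# One dominant decay channel and one rare one; the cutoff removes the rare
# channel, so its destination configuration never enters the active set.
pop = evolve({"core_hole": 1.0},
             {("core_hole", "valence_exciton"): 0.5,
              ("core_hole", "rare_double_hole"): 1e-4},
             dt=0.01, steps=100, cutoff=1e-3)
```

The trade-off tested in the paper is exactly this: how much accuracy is lost by pruning, compared against a full calculation that keeps every relaxation path.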
Smart Objects, Dumb Archives: A User-Centric, Layered Digital Library Framework
NASA Technical Reports Server (NTRS)
Maly, Kurt; Nelson, Michael L.; Zubair, Mohammad
1999-01-01
Currently, there exist a large number of superb digital libraries, all of which are, unfortunately, vertically integrated, each presenting a monolithic interface to its users. Ideally, a user would want to locate resources from a variety of digital libraries while dealing with only one interface. A number of approaches to this interoperability issue exist, including defining a universal protocol for all libraries to adhere to, or developing mechanisms to translate between protocols. The approach we illustrate in this paper is to push the level of universal protocols down to one for digital object communication and one for communication with simple archives. This approach creates the opportunity for digital library service providers to create digital libraries tailored to the needs of user communities, drawing from available archives and individual publishers who adhere to this standard. We have created a reference implementation based on the hypertext transfer protocol (HTTP), with the protocols being derived from the Dienst protocol. We have created a special class of digital objects called buckets and a number of archives based on a NASA collection and NSF-funded projects. Starting from NCSTRL we have developed a set of digital library services called NCSTRL+ and have created digital libraries for researchers, educators, and students that can each draw on all the archives and individually created buckets.
How institutions shaped the last major evolutionary transition to large-scale human societies
Powers, Simon T.; van Schaik, Carel P.; Lehmann, Laurent
2016-01-01
What drove the transition from small-scale human societies centred on kinship and personal exchange, to large-scale societies comprising cooperation and division of labour among untold numbers of unrelated individuals? We propose that the unique human capacity to negotiate institutional rules that coordinate social actions was a key driver of this transition. By creating institutions, humans have been able to move from the default ‘Hobbesian’ rules of the ‘game of life’, determined by physical/environmental constraints, into self-created rules of social organization where cooperation can be individually advantageous even in large groups of unrelated individuals. Examples include rules of food sharing in hunter–gatherers, rules for the usage of irrigation systems in agriculturalists, property rights and systems for sharing reputation between mediaeval traders. Successful institutions create rules of interaction that are self-enforcing, providing direct benefits both to individuals that follow them, and to individuals that sanction rule breakers. Forming institutions requires shared intentionality, language and other cognitive abilities largely absent in other primates. We explain how cooperative breeding likely selected for these abilities early in the Homo lineage. This allowed anatomically modern humans to create institutions that transformed the self-reliance of our primate ancestors into the division of labour of large-scale human social organization. PMID:26729937
Choosing the best partition of the output from a large-scale simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Challacombe, Chelsea Jordan; Casleton, Emily Michele
Data partitioning becomes necessary when a large-scale simulation produces more data than can be feasibly stored. The goal is to partition the data, typically so that every element belongs to one and only one partition, and store summary information about each partition, either a representative value plus an estimate of the error, or a distribution. Once the partitions are determined and the summary information stored, the raw data is discarded. This process can be performed in situ, meaning while the simulation is running. When creating the partitions there are many decisions that researchers must make: for instance, how to determine when an adequate number of partitions has been created, how the partitions are created with respect to dividing the data, or how many variables should be considered simultaneously. In addition, decisions must be made about how to summarize the information within each partition. Because of the combinatorial number of possible ways to partition and summarize the data, a method of comparing the different possibilities will help guide researchers in choosing a good partitioning and summarization scheme for their application.
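One concrete instance of the partition-and-summarize step can be sketched as below. The scheme shown (equal-width 1-D bins, mean plus worst-case error) is just one of the many possibilities the abstract says must be compared, and is our assumption, not the paper's recommended choice.

```python
# Hedged sketch of partition-and-summarize: place every element in exactly
# one equal-width bin, then keep only a representative value (the mean), an
# error estimate (worst-case deviation), and a count. After this, the raw
# data could be discarded, as in the in-situ workflow the paper describes.

def partition_summarize(values, n_bins):
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0        # guard against a constant stream
    bins = [[] for _ in range(n_bins)]
    for v in values:
        i = min(int((v - lo) / width), n_bins - 1)   # one and only one partition
        bins[i].append(v)
    summaries = []
    for b in bins:
        if b:
            mean = sum(b) / len(b)
            err = max(abs(v - mean) for v in b)      # worst-case error bound
            summaries.append((mean, err, len(b)))
        else:
            summaries.append(None)
    return summaries

summaries = partition_summarize([0, 1, 2, 10, 11, 12], n_bins=2)
```

Comparing schemes then amounts to trading the storage cost of the summaries against the error bound they guarantee.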
Universities Report More Licensing Income but Fewer Start-Ups in 2005
ERIC Educational Resources Information Center
Blumenstyk, Goldie
2007-01-01
According to a survey conducted by the Association of University Technology Managers, at least two dozen universities each earned more than $10-million from their licensing of rights to new drugs, software, and other inventions in the 2005 fiscal year. The number of institutions creating large numbers of spinoff companies based on their…
Progress Report Number 2 for Contract Number N00014-93-C-0051, April 16 thru May 15, 1993
1993-05-26
accompanied with subharmonics ... when the modulation frequency is further decreased to 375 Hz, the signal intensity suddenly shows large fluctuations and the frequency spectrum ... corresponding phase portraits show a strange attractor (Fig. 3d) at this modulation frequency of 375 Hz. SUMMARY: A new technique has been developed to create
During the last decade, a number of initiatives have been undertaken to create systematic national and global data sets of processed satellite imagery. An important application of these data is the derivation of large area (i.e. multi-scene) land cover products. Such products, ho...
Criminal Intent with Property: A Study of Real Estate Fraud Prediction and Detection
ERIC Educational Resources Information Center
Blackman, David H.
2013-01-01
The large number of real estate transactions across the United States, combined with closing process complexity, creates extremely large data sets that conceal anomalies indicative of fraud. The quantitative amount of damage due to fraud is immeasurable to the lives of individuals who are victims, not to mention the financial impact to…
Shumaker, L; Fetterolf, D E; Suhrie, J
1998-01-01
The recent availability of inexpensive document scanners and optical character recognition technology has created the ability to process surveys in large numbers with a minimum of operator time. Programs, which allow computer entry of such scanned questionnaire results directly into PC-based relational databases, have further made it possible to quickly collect and analyze significant amounts of information. We have created an internal capability to easily generate survey data and conduct surveillance across a number of medical practice sites within a managed care/practice management organization. Patient satisfaction surveys, referring physician surveys and a variety of other evidence gathering tools have been deployed.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-08
... disabilities can obtain a copy of the application package in an accessible format (e.g., braille, large print... your application. You can obtain a DUNS number from Dun and Bradstreet. A DUNS number can be created...- Grants help desk at 1-888-336-8930. If e-Application is unavailable due to technical problems with the...
Facilitating and Learning at the Edge of Chaos: Expanding the Context of Experiential Education.
ERIC Educational Resources Information Center
Oekerman, Carl
Significant recent discoveries within a number of scientific disciplines, collectively referred to as the science of complexity, are creating a major shift in how human beings understand the complex, adaptive systems that make up the world. A complex adaptive system consists of networks of large numbers of agents that interact with each other and…
Development of EnergyPlus Utility to Batch Simulate Building Energy Performance on a National Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valencia, Jayson F.; Dirks, James A.
2008-08-29
EnergyPlus is a simulation program that requires a large number of details to fully define and model a building. Hundreds or even thousands of lines in a text file are needed to run the EnergyPlus simulation depending on the size of the building. To manually create these files is a time-consuming process that would not be practical when trying to create input files for thousands of buildings needed to simulate national building energy performance. To streamline the process needed to create the input files for EnergyPlus, two methods were created to work in conjunction with the National Renewable Energy Laboratory (NREL) Preprocessor; this reduced the hundreds of inputs needed to define a building in EnergyPlus to a small set of high-level parameters. The first method uses Java routines to perform all of the preprocessing on a Windows machine while the second method carries out all of the preprocessing on the Linux cluster by using an in-house built utility called Generalized Parametrics (GPARM). A comma delimited (CSV) input file is created to define the high-level parameters for any number of buildings. Each method then takes this CSV file and uses the data entered for each parameter to populate an extensible markup language (XML) file used by the NREL Preprocessor to automatically prepare EnergyPlus input data files (idf) using automatic building routines and macro templates. Using a Linux utility called “make”, the idf files can then be automatically run through the Linux cluster and the desired data from each building can be aggregated into one table to be analyzed. Creating a large number of EnergyPlus input files results in the ability to batch simulate building energy performance and scale the result to national energy consumption estimates.
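The fan-out step, one CSV row of high-level parameters per building expanded into one input file, can be sketched as below. The parameter names and the template are invented for illustration; the real pipeline goes through the NREL Preprocessor's XML schema and macro templates, which are not reproduced here.

```python
# Sketch of CSV-driven batch generation of per-building input files
# (illustrative field names; not the real idf schema or NREL XML format).
# Each row of high-level parameters becomes one idf-like text blob, ready
# to be written to disk and dispatched to a cluster in bulk.

import csv
import io
from string import Template

IDF_TEMPLATE = Template(
    "Building,\n"
    "  $name,        !- Name\n"
    "  $orientation, !- North Axis (deg)\n"
    "  $floor_area;  !- Floor Area (illustrative field)\n"
)

def build_idf_files(csv_text):
    """Map each CSV row of high-level parameters to one generated file body."""
    out = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        out[row["name"] + ".idf"] = IDF_TEMPLATE.substitute(row)
    return out

csv_text = "name,orientation,floor_area\nstore_01,0,5000\noffice_02,90,12000\n"
files = build_idf_files(csv_text)
```

From here, a build tool such as `make` can treat each generated file as a target, which is how the described pipeline parallelizes thousands of simulations on the cluster.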
Specifications Used for ASA24® Digital Images
The Children's Nutrition Research Center (CNRC) at the Baylor College of Medicine developed a food photography system to photograph precise portion sizes of a large number of food items to create quality standardized images used for dietary recall protocols.
Fjield, T; Hynynen, K
2000-01-01
Phased-array technology offers an incredible advantage to therapeutic ultrasound due to the ability to electronically steer foci, create multiple foci, or to create an enlarged focal region by using phase cancellation. However, to take advantage of this flexibility, the phased-arrays generally consist of many elements. Each of these elements requires its own radio-frequency generator with independent amplitude and phase control, resulting in a large, complex, and expensive driving system. A method is presented here where in certain cases the number of amplifier channels can be reduced to a fraction of the number of transducer elements, thereby simplifying the driving system and reducing the overall system complexity and cost, by using isolation transformers to produce 180 degrees phase shifts.
Pouching a draining duodenal cutaneous fistula: a case study.
Zwanziger, P J
1999-01-01
Blockage of the mesenteric artery typically causes necrosis to the colon, requiring extensive surgical resection. In severe cases, the necrosis requires removal of the entire colon, creating numerous problems for the WOC nurse when pouching the opening created for effluent. This article describes the management of a draining duodenal fistula in a middle-aged woman, who survived surgery for a blocked mesenteric artery that necessitated the removal of the majority of the small and large intestine. Nutrition, skin management, and pouch options are described over a number of months as the fistula evolved and a stoma was created.
Introducing HEP to high-school and university students through ATLAS event analysis tools
NASA Astrophysics Data System (ADS)
Fassouliotis, Dimitris; Kourkoumelis, Christine; Vourakis, Stylianos
2017-12-01
Several EU outreach projects have been running for a few years now and have created a large number of inquiry-based educational resources for high-school teachers and students. Their goal is the promotion of science education in schools through new methods built on inquiry-based education techniques, involving large consortia of European partners and the implementation of large-scale pilots in a very large number of European schools. Until recently there had been a shortage of educational scenarios addressed to university students to be implemented in the framework of laboratory courses. Two such scenarios were introduced recently at the National and Kapodistrian University undergraduate labs and are described below.
Asteroid Systems: Binaries, Triples, and Pairs
NASA Astrophysics Data System (ADS)
Margot, J.-L.; Pravec, P.; Taylor, P.; Carry, B.; Jacobson, S.
In the past decade, the number of known binary near-Earth asteroids has more than quadrupled and the number of known large main-belt asteroids with satellites has doubled. Half a dozen triple asteroids have been discovered, and the previously unrecognized populations of asteroid pairs and small main-belt binaries have been identified. The current observational evidence confirms that small (≲20 km) binaries form by rotational fission and establishes that the Yarkovsky-O'Keefe-Radzievskii-Paddack (YORP) effect powers the spin-up process. A unifying paradigm based on rotational fission and post-fission dynamics can explain the formation of small binaries, triples, and pairs. Large (≳20 km) binaries with small satellites are most likely created during large collisions.
Multiplex-Ready Technology for mid-throughput genotyping of molecular markers.
Bonneau, Julien; Hayden, Matthew
2014-01-01
Screening molecular markers across large populations in breeding programs is generally time-consuming and expensive. The Multiplex-Ready Technology (MRT) (Hayden et al., BMC Genomics 9:80, 2008) was created to optimize polymorphism screening and genotyping using standardized PCR reaction conditions. The flexibility of this method maximizes the number of markers (up to 24 SSR or SNP markers, ideally highly polymorphic with small PCR products <500 bp) by using fluorescent dyes (VIC, FAM, NED, and PET) and semiautomated capillary electrophoresis on a DNA fragment analyzer (ABI3730) for large numbers of DNA samples (96 or 384).
Phage diabody repertoires for selection of large numbers of bispecific antibody fragments.
McGuinness, B T; Walter, G; FitzGerald, K; Schuler, P; Mahoney, W; Duncan, A R; Hoogenboom, H R
1996-09-01
Methods for the generation of large numbers of different bispecific antibodies are presented. Cloning strategies are detailed to create repertoires of bispecific diabody molecules with variability on one or both of the antigen binding sites. This diabody format, when combined with the power of phage display technology, allows the generation and analysis of thousands of different bispecific molecules. Selection for binding presumably also selects for more stable diabodies. Phage diabody libraries enable screening or selection of the best combination bispecific molecule with regards to affinity of binding, epitope recognition and pairing before manufacture of the best candidate.
Generation of large-scale density fluctuations by buoyancy
NASA Technical Reports Server (NTRS)
Chasnov, J. R.; Rogallo, R. S.
1990-01-01
The generation of fluid motion from a state of rest by buoyancy forces acting on a homogeneous isotropic small-scale density field is considered. Nonlinear interactions between the generated fluid motion and the initial isotropic small-scale density field are found to create an anisotropic large-scale density field with spectrum proportional to kappa(exp 4). This large-scale density field is observed to result in an increasing Reynolds number of the fluid turbulence in its final period of decay.
Monitoring landscape influence on nearshore condition
A major source of stress to the Great Lakes comes from tributary and landscape run-off. The large number of watersheds and the disparate landuse within them create variability in the tributary input along the extent of the nearshore. Identifying the local or regional response t...
The emergence of understanding in a computer model of concepts and analogy-making
NASA Astrophysics Data System (ADS)
Mitchell, Melanie; Hofstadter, Douglas R.
1990-06-01
This paper describes Copycat, a computer model of the mental mechanisms underlying the fluidity and adaptability of the human conceptual system in the context of analogy-making. Copycat creates analogies between idealized situations in a microworld that has been designed to capture and isolate many of the central issues of analogy-making. In Copycat, an understanding of the essence of a situation and the recognition of deep similarity between two superficially different situations emerge from the interaction of a large number of perceptual agents with an associative, overlapping, and context-sensitive network of concepts. Central features of the model are: a high degree of parallelism; competition and cooperation among a large number of small, locally acting agents that together create a global understanding of the situation at hand; and a computational temperature that measures the amount of perceptual organization as processing proceeds and that in turn controls the degree of randomness with which decisions are made in the system.
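The computational temperature the abstract mentions can be illustrated with a toy stochastic-choice routine. This is a simplification of ours: in the real Copycat model, temperature is coupled to the measured perceptual organization rather than set by hand, and the decision rule below (a softmax over option strengths) is only one plausible reading of "temperature controls randomness".

```python
# Toy illustration of temperature-controlled decision making: competing
# options are chosen stochastically, and lowering the temperature sharpens
# the distribution so the strongest option wins almost deterministically,
# while raising it makes choices nearly random.

import math
import random

def choose(options, temperature, rng):
    """options: {name: strength}. Sample a name with probability proportional
    to exp(strength / temperature) -- a standard softmax choice rule."""
    names = list(options)
    weights = [math.exp(options[n] / temperature) for n in names]
    r = rng.random() * sum(weights)
    acc = 0.0
    for n, w in zip(names, weights):
        acc += w
        if r <= acc:
            return n
    return names[-1]

rng = random.Random(0)
opts = {"weak": 1.0, "strong": 5.0}
# Cold system: nearly deterministic; hot system: close to a coin flip.
cold = sum(choose(opts, 0.5, rng) == "strong" for _ in range(1000))
hot = sum(choose(opts, 50.0, rng) == "strong" for _ in range(1000))
```

In Copycat, this kind of knob lets early, disorganized processing explore widely and late, well-organized processing commit, which is the behavior the abstract summarizes.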
Enhanced Cultivation Of Stimulated Murine B Cells
NASA Technical Reports Server (NTRS)
Sammons, David W.
1994-01-01
Method of in vitro cultivation of large numbers of stimulated murine B lymphocytes. Cells electrofused with other cells to produce hybridomas and monoclonal antibodies. Offers several advantages: polyclonally stimulated B-cell blasts cultivated for as long as 14 days, hybridomas created throughout culture period, yield of hybridomas increases during cultivation, and possible to expand polyclonally in vitro number of B cells specific for antigenic determinants first recognized in vivo.
[Dual process in large number estimation under uncertainty].
Matsumuro, Miki; Miwa, Kazuhisa; Terai, Hitoshi; Yamada, Kento
2016-08-01
According to dual process theory, there are two systems in the mind: an intuitive and automatic System 1 and a logical and effortful System 2. While many previous studies about number estimation have focused on simple heuristics and automatic processes, the deliberative System 2 process has not been sufficiently studied. This study focused on the System 2 process for large number estimation. First, we described an estimation process based on participants’ verbal reports. The task, corresponding to the problem-solving process, consisted of creating subgoals, retrieving values, and applying operations. Second, we investigated the influence of such deliberative process by System 2 on intuitive estimation by System 1, using anchoring effects. The results of the experiment showed that the System 2 process could mitigate anchoring effects.
A new R function, exsic, to assist taxonomists in creating indices
USDA-ARS?s Scientific Manuscript database
Taxonomists manage large amounts of specimen data. This is usually initiated in spreadsheets and then converted for publication into locality lists and in indices to associate collectors and collector numbers from herbarium sheets to identifications, a format technically termed an exsiccate list. Th...
Occupational cancer in the European part of the Commonwealth of Independent States.
Bulbulyan, M A; Boffetta, P
1999-01-01
Precise information on the number of workers currently exposed to carcinogens in the Commonwealth of Independent States (CIS) is lacking. However, the large number of workers employed in high-risk industries such as the chemical and metal industries suggests that the number of workers potentially exposed to carcinogens may be large. In the CIS, women account for almost 50% of the industrial work force. Although no precise data are available on the number of cancers caused by occupational exposures, indirect evidence suggests that the magnitude of the problem is comparable to that observed in Western Europe, representing some 20,000 cases per year. The large number of women employed in the past and at present in industries that create potential exposure to carcinogens is a special characteristic of the CIS. In recent years an increasing amount of high-quality research has been conducted on occupational cancer in the CIS; there is, however, room for further improvement. International training programs should be established, and funds from international research and development programs should be devoted to this area. In recent years, following privatization of many large-scale industries, access to employment and exposure data is becoming increasingly difficult. PMID:10350512
1987-02-17
that this has created a predisposed social bias to the firstborn position. This report will use somewhat more recent data, and that of a more scientific...submissive sons, an unwholesome development’ (8:56). The number of single-parent households is continuing to grow in this country. Will this social ...development result in a large number of maladjusted adolescents and young adults in the years to come? Time will tell. Harvard Professor Emeritus, Dr Carle
ERIC Educational Resources Information Center
Mayadas, A. Frank; Bourne, John; Bacsich, Paul
2009-01-01
Online education is established, growing, and here to stay. It is creating new opportunities for students and also for faculty, regulators of education, and the educational institutions themselves. Much of what is being learned by the practitioners will flow into the large numbers of blended courses that will be developed and delivered on most…
Editorial: The Advent of a Molecular Genetics of General Intelligence.
ERIC Educational Resources Information Center
Weiss, Volkmar
1995-01-01
Raw IQ scores do not show the bell curve created by normalized scores, and even a bell-shaped distribution does not require large numbers of underlying genes. Family data support a major gene locus for IQ. The correlation between glutathione peroxidase and IQ should be investigated through molecular genetics. (SLD)
ERIC Educational Resources Information Center
Mendels, Pamela; Mitgang, Lee D.
2013-01-01
Principals have a substantial effect on the quality of learning in their schools. Likewise, districts have a substantial effect on the quality of their leaders. A growing number of large school districts are focusing on two objectives to strengthen school leadership: (1) building a pipeline of new principals who are ready to tackle the most…
ERIC Educational Resources Information Center
Stillman, Jennifer Burns
2013-01-01
The gentrification of many of the country's big cities is providing a once-in-a-generation opportunity to create a large number of racially and socioeconomically integrated schools. But to capitalize on this opportunity, urban schools that currently serve a predominantly poor and minority population must find a way to attract and retain the…
Increasing Accessibility by Pooling Digital Resources
ERIC Educational Resources Information Center
Cushion, Steve
2004-01-01
There are now many CALL authoring packages that can create interactive websites and a large number of language teachers are writing materials for the whole range of such packages. Currently, each product stores its data in different formats thus hindering interoperability, pooling of digital resources and moving between software packages based in…
Impact of Learning Assistance Center Utilization on Success
ERIC Educational Resources Information Center
Wurtz, Keith A.
2015-01-01
A large number of community college students are developmental students. One of the most important challenges for community colleges today is to create programs that effectively educate community college developmental students. This study examines the effect of learning assistance centers on the success and persistence of students at a Southern…
Creating Educational Opportunities for Engineers with Communication Technologies.
ERIC Educational Resources Information Center
Baldwin, Lionel V.
The large number and known career patterns of engineers make them an important target population for the use of videotechnology in programs of continuing professional education. Currently, universities use videobased instruction with engineering students on and off campus. A variety of signal delivery systems are used to link job sites to…
Planning and Managing School Facilities. Second Edition.
ERIC Educational Resources Information Center
Kowalski, Theodore J.
This book addresses the administrative procedures associated with planning and managing school facilities. As noted at the outset, practitioner interest in school facilities has been growing rapidly in recent years because decades of neglect, poor planning, and cost cutting have created a situation in which large numbers of America's school…
App Development Paradigms for Instructional Developers
ERIC Educational Resources Information Center
Luterbach, Kenneth J.; Hubbell, Kenneth R.
2015-01-01
To create instructional apps for desktop, laptop and mobile devices, developers must select a development tool. Tool selection is critical and complicated by the large number and variety of app development tools. One important criterion to consider is the type of development environment, which may primarily be visual or symbolic. Those distinct…
Multiprocessor Neural Network in Healthcare.
Godó, Zoltán Attila; Kiss, Gábor; Kocsis, Dénes
2015-01-01
A possible way of creating a multiprocessor artificial neural network is through the use of microcontrollers. The RISC processors' high performance and large number of I/O ports make them well suited for creating such a system. During our research, we wanted to see whether it is possible to efficiently create interaction between the artificial neural network and the natural nervous system. To achieve as much analogy to the living nervous system as possible, we created a frequency-modulated analog connection between the units. Our system is connected to the living nervous system through 128 microelectrodes. Two-way communication is provided through A/D conversion, which is even capable of testing psychopharmacons. The microcontroller-based analog artificial neural network can play a great role in medical signal processing, such as ECG and EEG.
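The frequency-modulated link between units described above can be sketched in code. A minimal illustration, assuming a normalized activation level and an arbitrary 10-200 Hz pulse range (both assumptions, not the authors' actual hardware parameters):

```python
def activation_to_frequency(activation, f_min=10.0, f_max=200.0):
    """Map a normalized activation level (0..1) to a pulse frequency in Hz.

    Hypothetical encoding: the abstract describes a frequency-modulated
    analog link between units; the frequency range here is illustrative.
    """
    activation = max(0.0, min(1.0, activation))  # clamp to the valid range
    return f_min + activation * (f_max - f_min)


def pulse_times(activation, duration=1.0):
    """Timestamps (seconds) of the pulses one unit emits over `duration`."""
    f = activation_to_frequency(activation)
    n = int(duration * f)            # number of pulses in the window
    return [i / f for i in range(n)]
```

A microcontroller would emit these pulses on an output pin; here the minimum activation maps to a 10 Hz train, so `pulse_times(0.0)` yields ten evenly spaced timestamps in one second.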
Waterbird use of saltmarsh ponds created for open marsh water management
Erwin, R.M.; Hatfield, J.S.; Howe, M.A.; Klugman, S.K.
1994-01-01
Open Marsh Water Management (OMWM) as an alternative to pesticides for mosquito control in saltmarshes along the Atlantic Coast has created debate among biologists. We designed an experiment to determine waterbird (American black duck (Anas rubripes) and other waterfowl, wading birds, shorebirds, gulls, and terns) use (during daylight) of ponds created for mosquito control compared with use of pre-existing water bodies (i.e., natural tidal ponds, creeks, old ditches) and refuge impoundments. We also evaluated the influence of pond size and depth on waterbird use of wetlands. We documented bird use of different habitats for 1 year. The highest densities of waterfowl, in autumn, occurred in 0.03-0.06 ha ponds (P < 0.05) versus ponds either < 0.02 ha or > 0.08 ha; the highest shorebird densities occurred in summer in ponds > 0.10 ha (P < 0.05). Pond depth affected shorebird and other waterfowl use in some seasons. Comparisons of the mean number of birds using created (OMWM) ponds with the mean number using other water bodies revealed that most species showed no pattern (P > 0.05) of disproportionate use versus availability. At high tidal levels, most species groups used OMWM ponds in the marsh more often (P < 0.05) than other water bodies. Black ducks and other waterfowl used nearby refuge impoundments in higher densities than they did OMWM ponds, for nesting and during autumn-winter (all P < 0.05). Creating small (< 0.1 ha) ponds for mosquito control does not enhance waterbird habitat, at least not where large impoundments are in close proximity. We recommend that in areas where OMWM practices seem appropriate, fewer large (> 0.10 ha) ponds be constructed, with shallow (< 15 cm) basins and sloping sides.
NASA Astrophysics Data System (ADS)
Andersen, G.; Dearborn, M.; Hcharg, G.
2010-09-01
We are investigating new technologies for creating ultra-large apertures (>20 m) for space-based imagery. Our approach has been to create diffractive primaries in flat membranes deployed from compact payloads. These structures are attractive in that they are much simpler to fabricate, launch, and deploy than conventional three-dimensional optics. In this case the flat focusing element is a photon sieve: a large number of holes in an otherwise opaque substrate, located according to an underlying Fresnel Zone Plate (FZP) geometry. The advantages over the FZP are that there are no support struts, which lead to diffraction spikes in the far field, and no non-uniform tension, which can cause wrinkling of the substrate. Furthermore, with modifications in hole size and distribution we can achieve improved resolution and contrast over conventional optics. The trade-offs in using diffractive optics are the large amounts of dispersion and decreased efficiency. We present both theoretical and experimental results from small-scale prototypes. Several key solutions to issues of limited bandwidth and efficiency have been addressed. Along with these, we have studied the materials aspects in order to optimize performance and achieve a scalable solution for an on-orbit demonstrator. Our current efforts are directed towards an on-orbit 1 m solar observatory demonstration deployed from a CubeSat bus.
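The FZP geometry that governs hole placement can be sketched numerically: in the paraxial approximation the n-th zone boundary sits at r_n = sqrt(n * wavelength * focal_length). The hole count per zone, random azimuthal placement, and odd-zone transparency convention below are illustrative assumptions, not the authors' design:

```python
import math
import random


def zone_radius(n, wavelength, focal_length):
    """Radius of the n-th Fresnel zone boundary (paraxial approximation)."""
    return math.sqrt(n * wavelength * focal_length)


def sieve_holes(wavelength, focal_length, n_zones, holes_per_zone=50, seed=0):
    """Place holes along every other (assumed transparent) Fresnel zone.

    Returns (x, y, diameter) tuples; the hole diameter is set to the local
    zone width. Real designs tune hole size and density to shape the
    point-spread function, as the abstract notes.
    """
    rng = random.Random(seed)
    holes = []
    for n in range(1, n_zones, 2):            # odd zones -> transparent
        r_in = zone_radius(n, wavelength, focal_length)
        r_out = zone_radius(n + 1, wavelength, focal_length)
        r_mid = 0.5 * (r_in + r_out)          # centre holes mid-zone
        d = r_out - r_in                      # hole diameter ~ zone width
        for _ in range(holes_per_zone):
            theta = rng.uniform(0.0, 2.0 * math.pi)
            holes.append((r_mid * math.cos(theta), r_mid * math.sin(theta), d))
    return holes
```

For a 500 nm design wavelength and 1 m focal length, the first zone boundary falls at about 0.7 mm, and zone widths shrink outward, which is why the outermost holes dominate the achievable resolution.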
Streaming of Continuous Media for Distance Education Systems
ERIC Educational Resources Information Center
Dashti, Ali; Safar, Maytham
2007-01-01
Distance education created new challenges regarding the delivery of large size isochronous continuous streaming media (SM) objects. In this paper, we consider the design of a framework for customized SM presentations, where each presentation consists of a number of SM objects that should be retrieved and displayed to the user in a coherent…
Endangered Species: Real Life in Two Dimensions
ERIC Educational Resources Information Center
Henderson, Lynette K.
2012-01-01
The focus of "Endangered Species: Real Life in Two Dimensions" is to create awareness about a critical environmental issue. There is a special urgency to this project because large numbers of animal species are currently endangered or on the brink of extinction. In addition to being enlightened about this important topic through research, students…
ERIC Educational Resources Information Center
Yu, Hong Qing; Pedrinaci, C.; Dietze, S.; Domingue, J.
2012-01-01
Multimedia educational resources play an important role in education, particularly for distance learning environments. With the rapid growth of the multimedia web, large numbers of educational video resources are increasingly being created by several different organizations. It is crucial to explore, share, reuse, and link these educational…
Tomorrow's School Leaders: What Do We Know about Them?--A Case Study
ERIC Educational Resources Information Center
Ritchie, Jim; Lindstrom, Phyllis H.; Mendoza-Reis, Noni
2004-01-01
Large numbers of projected retirements have created the need for recruiting and preparing capable school leaders for the future. This study explored the characteristics of candidates in an administration preparation program, factors for their career decisions, and the implications of these factors for recruiting candidates and building stronger…
Social and Economic Change in Rural Iowa: The Development of Rural Ghettos.
ERIC Educational Resources Information Center
Jacobsen, G. Michael; Albertson, Bonnie Sanderson
1987-01-01
Rural ghettos are created when proportionately large numbers of unemployed, low income, and elderly residents live in communities located 30 miles or more from communities which can provide the range of goods and services necessary for a decent quality of life. This definition may apply to 300 Iowa towns. (JHZ)
Art Education for a Change: Contemporary Issues and the Visual Arts
ERIC Educational Resources Information Center
Darts, David
2006-01-01
Throughout the year, students of Contemporary Issues and the Visual Arts class, an interdisciplinary course for high school juniors and seniors at a large Canadian suburban high school, devised and created a number of individual and collective artistic investigations and creative cultural interventions, both within the classroom and the larger…
ASPEN--A Web-Based Application for Managing Student Server Accounts
ERIC Educational Resources Information Center
Sandvig, J. Christopher
2004-01-01
The growth of the Internet has greatly increased the demand for server-side programming courses at colleges and universities. Students enrolled in such courses must be provided with server-based accounts that support the technologies that they are learning. The process of creating, managing and removing large numbers of student server accounts is…
A Survey of Speech Programs in Community Colleges.
ERIC Educational Resources Information Center
Meyer, Arthur C.
The rapid growth of community colleges in the last decade resulted in large numbers of students enrolled in programs previously unavailable to them in a single comprehensive institution. The purpose of this study was to gather and analyze data to provide information about the speech programs that community colleges created or expanded as a result…
ERIC Educational Resources Information Center
Williford, A. Michael; Wadley, Joni Y.
2008-01-01
This study reports how an institutional research office at a large public research university has taken the lead to call attention to retention problems, describe attrition/retention predictors, and influence policy. Building on existing retention theories and previous institutional research studies, the institutional research office began…
Brief Trauma and Mental Health Assessments for Female Offenders in Addiction Treatment
ERIC Educational Resources Information Center
Rowan-Szal, Grace A.; Joe, George W.; Bartholomew, Norma G.; Pankow, Jennifer; Simpson, D. Dwayne
2012-01-01
Increasing numbers of women in prison raise concerns about gender-specific problems and needs severity. Female offenders report higher trauma as well as mental and medical health complications than males, but large inmate populations and limited resources create challenges in administering proper diagnostic screening and assessments. This study…
NASA Technical Reports Server (NTRS)
Kolesar, C. E.
1987-01-01
Research activity on an airfoil designed for a large airplane capable of very long endurance times at a low Mach number of 0.22 is examined. Airplane mission objectives and design optimization resulted in requirements for a very high design lift coefficient and a large amount of laminar flow at high Reynolds number to increase the lift/drag ratio and reduce the loiter lift coefficient. Natural laminar flow was selected instead of distributed mechanical suction for the measurement technique. A design lift coefficient of 1.5 was identified as the highest which could be achieved with a large extent of laminar flow. A single element airfoil was designed using an inverse boundary layer solution and inverse airfoil design computer codes to create an airfoil section that would achieve performance goals. The design process and results, including airfoil shape, pressure distributions, and aerodynamic characteristics are presented. A two dimensional wind tunnel model was constructed and tested in a NASA Low Turbulence Pressure Tunnel which enabled testing at full scale design Reynolds number. A comparison is made between theoretical and measured results to establish accuracy and quality of the airfoil design technique.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Select Committee on Children, Youth, and Families.
This Congressional report found that today's social and economic conditions harm large numbers of American families in ways that the child welfare, mental health, and juvenile justice systems were not created and are ill-prepared to address. The following findings are reported: (1) the number of children placed outside of their homes during the…
The benefits of adaptive parametrization in multi-objective Tabu Search optimization
NASA Astrophysics Data System (ADS)
Ghisu, Tiziano; Parks, Geoffrey T.; Jaeggi, Daniel M.; Jarrett, Jerome P.; Clarkson, P. John
2010-10-01
In real-world optimization problems, large design spaces and conflicting objectives are often combined with a large number of constraints, resulting in a highly multi-modal, challenging, fragmented landscape. The local search at the heart of Tabu Search, while one of its strengths in highly constrained optimization problems, requires a large number of evaluations per optimization step. In this work, a modification of the pattern search algorithm is proposed: this modification, based on a Principal Components Analysis of the approximation set, allows both a re-alignment of the search directions, thereby creating a more effective parametrization, and an informed reduction of the size of the design space itself. These changes make the optimization process more computationally efficient and more effective: higher-quality solutions are identified in fewer iterations. These advantages are demonstrated on a number of standard analytical test functions (from the ZDT and DTLZ families) and on a real-world problem (the optimization of an axial compressor preliminary design).
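The PCA-based re-alignment of search directions can be illustrated with a short sketch, assuming the approximation set is available as a matrix of design vectors (a generic PCA, not the authors' exact implementation):

```python
import numpy as np


def pca_search_directions(points):
    """Principal axes of an approximation set, for re-aligning pattern search.

    `points` is an (m, d) array of design vectors from the current
    approximation set. Returns unit direction vectors (rows), sorted by
    decreasing variance, together with the variances; trailing directions
    with negligible variance could be dropped to shrink the effective
    design space, as the abstract suggests.
    """
    X = np.asarray(points, dtype=float)
    X = X - X.mean(axis=0)                      # centre the point cloud
    # SVD of the centred data: rows of Vt are the principal directions
    _, s, Vt = np.linalg.svd(X, full_matrices=False)
    variances = (s ** 2) / max(len(X) - 1, 1)
    return Vt, variances
```

If the non-dominated points cluster along a diagonal of the design space, the first returned direction aligns with that diagonal, so pattern-search moves along it make progress with fewer wasted evaluations.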
Targeted Cellular Drug Delivery using Tailored Dendritic Nanostructures
NASA Astrophysics Data System (ADS)
Kannan, Rangaramanujam; Kolhe, Parag; Kannan, Sujatha; Lieh-Lai, Mary
2002-03-01
Dendrimers and hyperbranched polymers possess highly branched architectures with a large number of controllable, tailorable, 'peripheral' functionalities. Since the surface chemistry of these materials can be modified with relative ease, they have tremendous potential in targeted drug and gene delivery. The large number of end groups can be tailored to create special affinity for targeted cells, and the polymers can also encapsulate drugs and deliver them in a controlled manner. We are developing tailor-modified dendritic systems for drug delivery. Synthesis, in-vitro drug loading, in-vitro drug delivery, and targeting efficiency to the cell are being studied systematically using a wide variety of experimental tools. Polyamidoamine and polyol dendrimers with different generations and end groups are studied, with drugs such as ibuprofen and methotrexate. Our results indicate that a large number of drug molecules can be encapsulated in or attached to the dendrimers, depending on the end groups. The drug-encapsulated dendrimer is able to enter cells rapidly and deliver the drug. Targeting strategies are also being explored.
Applying automatic item generation to create cohesive physics testlets
NASA Astrophysics Data System (ADS)
Mindyarto, B. N.; Nugroho, S. E.; Linuwih, S.
2018-03-01
Computer-based testing has created demand for large numbers of items. This paper discusses the production of cohesive physics testlets using automatic item generation concepts and procedures. The testlets were composed by restructuring physics problems to reveal deeper understanding of the underlying physical concepts, by inserting a qualitative question and its scientific-reasoning question. A template-based testlet generator was used to generate the testlet variants. Using this methodology, 1248 testlet variants were effectively generated from 25 testlet templates. Some issues related to the effective application of the generated physics testlets in practical assessments are discussed.
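A template-based generator of the kind described can be sketched as follows; the template text and variable names are hypothetical illustrations, not taken from the study:

```python
import itertools


def generate_testlets(template, value_sets):
    """Instantiate a testlet template over all combinations of parameters.

    A toy version of template-based automatic item generation: each
    variable in `value_sets` maps to its allowed values, and one variant
    is produced per combination.
    """
    keys = sorted(value_sets)
    variants = []
    for combo in itertools.product(*(value_sets[k] for k in keys)):
        variants.append(template.format(**dict(zip(keys, combo))))
    return variants


# Hypothetical template pairing a quantitative item with a reasoning prompt:
template = ("A {mass} kg cart moves at {speed} m/s. "
            "What is its kinetic energy, and why does doubling the speed "
            "quadruple it?")
items = generate_testlets(template, {"mass": [2, 4], "speed": [3, 5, 7]})
```

Two masses and three speeds yield six variants; scaled up, 25 templates with modest value sets can plausibly produce the 1248 variants the abstract reports.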
2016-07-20
Nili Fossae is a large band of parallel graben located to the northeast of Syrtis Major. The graben in this image from NASA's 2001 Mars Odyssey spacecraft were formed by tectonic activity, with faulting that creates the linear depressions. Orbit Number: 64105 Latitude: 23.3115 Longitude: 78.6126 Instrument: VIS Captured: 2016-05-27 05:24 http://photojournal.jpl.nasa.gov/catalog/PIA20785
A Critical Mass: Creating Comprehensive Services for Dual Language Learners in Harrisonburg
ERIC Educational Resources Information Center
Garcia, Amaya; Carnock, Janie Tankard
2016-01-01
Harrisonburg, Virginia, a community nestled in the fertile hills of the Shenandoah Valley, is emblematic of the demographic changes taking shape in the U.S. for some years now. The town's agricultural industry has attracted a large number of immigrant workers from Central America. In addition, Harrisonburg's refugee resettlement center has drawn…
Aspects of Education for Democratic Citizenship in Post-War Germany
ERIC Educational Resources Information Center
Phillips, David
2012-01-01
Interest in post-crisis education and concomitantly in education for democracy and citizenship, manifest in a large number of recent initiatives and publications, provides an opportunity to revisit the period of occupation in Germany after the Second World War, when there was concern--at least in the Western Zones--to create an awareness of the…
A review of climate change impacts on birds
Robert W. Butler; William Taylor
2005-01-01
Regions of the world with high coastal zone biological productivity often support large numbers of birds. Important sources of this productivity are oceanographic upwelling created by winds and ocean currents, and runoff from the land. It is suggested that climate change effects on winds and ocean currents will potentially affect the timing and magnitude of coastal...
Factors determining the location of forest products firms
R.F. Fraser; F.M. Goode
1991-01-01
In the past decade there has been an increase in the number of intrastate, state, and regional programs aimed at encouraging forest-resource-based economic development in the northeastern USA. These programs were aimed at deriving economic benefits from the large volume of mature forest resources in the region. The improvement of rural communities by creating job...
Is Epenthesis a Means to Optimize Feet? A Reanalysis of the CLPF Database
ERIC Educational Resources Information Center
Taelman, Helena; Gillis, Steven
2008-01-01
Fikkert (1994) analyzed a large corpus of Dutch children's early language production, and found that they often add targetless syllables to their words in order to create bisyllabic feet. In this note we point out a methodological problem with that analysis: in an important number of cases, epenthetic vowels occur at places where grammatical…
The Zoot Suit Riots: Exploring Social Issues in American History
ERIC Educational Resources Information Center
Chiodo, John J.
2013-01-01
The Zoot Suit Riots provide students with a case study of social unrest in American history. The influx of Latinos into the Los Angeles area prior to World War II created high levels of social tension among Mexican Americans, military servicemen, and local residents. With large numbers of soldiers stationed in the area during the Second World…
Creating Grander Families: Older Adults Adopting Younger Kin and Nonkin
ERIC Educational Resources Information Center
Hinterlong, James; Ryan, Scott
2008-01-01
Purpose: There is a dearth of research on older adoptive parents caring for minor children, despite a growing number of such adoptions finalized each year. This study offers a large-scale investigation of adoptive families headed by older parents. We describe these families and explore how preadoptive kinship between the adoptive parent and the…
Community Colleges Hope to Keep Aging Professors in the Classroom
ERIC Educational Resources Information Center
McCormack, Eugene
2008-01-01
This article discusses how community colleges respond to the rising number of faculty members who are eligible for retirement. Many faculty members at community colleges are near retirement largely because many of the colleges were created and did the bulk of their hiring between 1965 and 1975, when the first group of baby boomers was entering the…
Vocational Discernment among Tibetan Buddhist Monks in Dharamsala, India
ERIC Educational Resources Information Center
Thomas, Alvin; Kellom, Gar E.
2009-01-01
A major historical shift is taking place in Tibetan Buddhism with the relocation of large numbers of monks from Tibet and the establishment of monasteries in Dharamsala, India and other parts of South Asia. This has created a shift in the way that young men are joining these monasteries and leading this age old religious tradition. Fifteen college…
Multi-resource and multi-scale approaches for meeting the challenge of managing multiple species
Frank R. Thompson; Deborah M. Finch; John R. Probst; Glen D. Gaines; David S. Dobkin
1999-01-01
The large number of Neotropical migratory bird (NTMB) species and their diverse habitat requirements create conflicts and difficulties for land managers and conservationists. We provide examples of assessments or conservation efforts that attempt to address the problem of managing for multiple NTMB species. We advocate approaches at a variety of spatial and geographic...
A biological rationale for musical scales.
Gill, Kamraan Z; Purves, Dale
2009-12-03
Scales are collections of tones that divide octaves into specific intervals used to create music. Since humans can distinguish about 240 different pitches over an octave in the mid-range of hearing, in principle a very large number of tone combinations could have been used for this purpose. Nonetheless, compositions in Western classical, folk and popular music as well as in many other musical traditions are based on a relatively small number of scales that typically comprise only five to seven tones. Why humans employ only a few of the enormous number of possible tone combinations to create music is not known. Here we show that the component intervals of the most widely used scales throughout history and across cultures are those with the greatest overall spectral similarity to a harmonic series. These findings suggest that humans prefer tone combinations that reflect the spectral characteristics of conspecific vocalizations. The analysis also highlights the spectral similarity among the scales used by different cultures.
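The notion of spectral similarity to a harmonic series can be illustrated with a rough proxy that counts coinciding harmonics of two tones. This simplification is not the paper's actual percentage-similarity metric, only a sketch of the idea:

```python
def harmonic_overlap(ratio, max_harmonic=20, tol=1e-6):
    """Rough proxy for spectral similarity between two tones.

    Counts how many harmonic frequencies of two tones (fundamentals 1 and
    `ratio`) coincide, up to `max_harmonic` harmonics each, divided by the
    number of distinct harmonics in the combined spectrum. Intervals whose
    ratios are small-integer fractions score highest.
    """
    a = {n for n in range(1, max_harmonic + 1)}            # tone 1 harmonics
    b = {n * ratio for n in range(1, max_harmonic + 1)}    # tone 2 harmonics
    shared = sum(1 for x in b for y in a if abs(x - y) < tol)
    return shared / (2 * max_harmonic - shared)


# The perfect fifth (3:2) overlaps; an equal-tempered tritone barely does:
fifth = harmonic_overlap(1.5)
tritone = harmonic_overlap(2 ** 0.5)
```

With 20 harmonics per tone, the fifth shares six harmonics (at 3, 6, 9, 12, 15, 18 times the lower fundamental) while the irrational tritone ratio shares none, mirroring the paper's claim that widely used intervals are the ones most similar to a harmonic series.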
Incremental terrain processing for large digital elevation models
NASA Astrophysics Data System (ADS)
Ye, Z.
2012-12-01
Zichuan Ye, Dean Djokic, Lori Armstrong, Esri, 380 New York Street, Redlands, CA 92373, USA (E-mail: zye@esri.com, ddjokic@esri.com, larmstrong@esri.com)
Efficient analyses of large digital elevation models (DEMs) require generation of additional DEM artifacts such as flow direction, flow accumulation, and other DEM derivatives. When the DEMs to analyze have a large number of grid cells (usually > 1,000,000,000), the generation of these DEM derivatives is either impractical (it takes too long) or impossible (the software cannot process so many cells). Different strategies and algorithms can be put in place to alleviate this situation. This paper describes an approach in which the overall DEM is partitioned into smaller processing units that can be processed efficiently. The processed DEM derivatives for each partition can then be either mosaicked back into a single large entity or managed at the partition level. For dendritic terrain morphologies, the way in which partitions are derived and the order in which they are processed depend on the river and catchment patterns. These patterns are not available until the flow pattern of the whole region is created, which in turn cannot be established upfront because of the size issues. This paper describes a procedure that solves this problem: (1) Resample the original large DEM grid so that the total number of cells is reduced to a level for which the drainage pattern can be established. (2) Run standard terrain preprocessing operations on the resampled DEM to generate the river and catchment system. (3) Define the processing units and their processing order based on the river and catchment system created in step (2). (4) Based on the processing order, apply the analysis (i.e., the flow accumulation operation) to each of the processing units at the full-resolution DEM. (5) As each processing unit is processed in the order defined in step (3), compare the resulting drainage pattern with the drainage pattern established at the coarser scale, and adjust the drainage boundaries and rivers if necessary.
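Step (1) of the procedure, coarsening the DEM so a drainage pattern can be established, can be sketched as block averaging. A plain-Python illustration on a list-of-lists grid; a production workflow would use a raster library instead:

```python
def resample_dem(dem, factor):
    """Coarsen a DEM grid by averaging `factor` x `factor` cell blocks.

    `dem` is a rectangular list of lists of elevations. Trailing rows or
    columns that do not fill a whole block are dropped, which is one of
    several reasonable edge policies.
    """
    rows, cols = len(dem), len(dem[0])
    out = []
    for i in range(0, rows - rows % factor, factor):
        row = []
        for j in range(0, cols - cols % factor, factor):
            block = [dem[i + di][j + dj]
                     for di in range(factor) for dj in range(factor)]
            row.append(sum(block) / len(block))  # mean elevation of block
        out.append(row)
    return out
```

A factor of 32 reduces a billion-cell grid to under a million cells, small enough for standard terrain preprocessing to derive the river and catchment system that then drives the partitioning.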
Skyscape Archaeology: an emerging interdiscipline for archaeoastronomers and archaeologists
NASA Astrophysics Data System (ADS)
Henty, Liz
2016-02-01
For historical reasons, archaeoastronomy and archaeology differ in their approach to prehistoric monuments, and this has created a divide between the disciplines, which adopt seemingly incompatible methodologies. The reasons behind the impasse will be explored to show how these different approaches gave rise to their respective methods. Archaeological investigations tend to concentrate on single-site analysis, whereas archaeoastronomical surveys tend to be data-driven, based on the examination of large numbers of similar sites. A comparison will be made between traditional archaeoastronomical data gathering and an emerging methodology which looks at sites on a small scale and combines archaeology and astronomy. Silva's recent research in Portugal and this author's survey in Scotland have explored this methodology and termed it skyscape archaeology. This paper argues that this type of phenomenological skyscape archaeology offers an alternative to large-scale statistical studies which analyse astronomical data obtained from large numbers of superficially similar archaeological sites.
Stratification during evaporative assembly of multicomponent nanoparticle films
Liu, Xiao; Liu, Weiping; Carr, Amanda J.; ...
2018-01-03
Multicomponent coatings with layers comprising different functionalities are of interest for a variety of applications, including electronic devices, energy storage, and biomaterials. Rather than creating such a film using multiple deposition steps, we explore a single-step method to create such films by varying the particle Peclet numbers, Pe. Our hypothesis, based on recent theoretical descriptions of the stratification process, is that by varying particle size and evaporation rate such that the Pe of large and small particles are above and below unity, we can create stratified films of polymeric and inorganic particles. In this paper, we present AFM measurements of the surface composition of films comprising poly(styrene) nanoparticles (diameter 25–90 nm) and silica nanoparticles (diameter 8–14 nm). Previous studies on films containing both inorganic and polymeric particles correspond to large Pe values (e.g., 120–460), while we utilize Pe ~ 0.3–4, enabling us to test theories that have been developed for different regimes of Pe. We demonstrate evidence of stratification and the effect of the Pe ratio, although our results agree only qualitatively with theory. Finally, our results also provide validation of recent theoretical descriptions of the film drying process that predict different regimes for large-on-top and small-on-top stratification.
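The particle Peclet number driving the hypothesis compares drying-induced advection with particle diffusion, Pe = H * E / D, with D from the Stokes-Einstein relation. The film height and evaporation rate below are illustrative values chosen so the two particle sizes straddle Pe = 1, not the experimental conditions:

```python
import math


def stokes_einstein_D(radius_m, temp_K=298.0, viscosity=1.0e-3):
    """Diffusion coefficient of a sphere in water (Stokes-Einstein)."""
    k_B = 1.380649e-23                     # Boltzmann constant, J/K
    return k_B * temp_K / (6.0 * math.pi * viscosity * radius_m)


def peclet(radius_m, film_height_m, evap_rate_m_s):
    """Pe = H * E / D: evaporation-driven advection vs. particle diffusion."""
    return film_height_m * evap_rate_m_s / stokes_einstein_D(radius_m)


# Illustrative parameters: 90 nm polystyrene vs. 10 nm silica (radii below),
# a 100-micron wet film, and a 0.1 micron/s evaporation rate.
pe_large = peclet(4.5e-8, 1e-4, 1e-7)
pe_small = peclet(5e-9, 1e-4, 1e-7)
```

With these assumed conditions the large particles sit above unity and the small ones below it, the regime in which the theory predicts the small particles can out-diffuse the descending surface and stratify.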
Adams, Bradley J; Aschheim, Kenneth W
2016-01-01
Comparison of antemortem and postmortem dental records is a leading method of victim identification, especially for incidents involving a large number of decedents. This process may be expedited with computer software that provides a ranked list of the best possible matches. This study compares the conventional coding and sorting algorithm most commonly used in the United States (WinID3) with a simplified coding format that uses an optimized sorting algorithm. The simplified system consists of seven basic codes and an optimized algorithm based largely on the percentage of matches. To perform this research, a large reference database of approximately 50,000 antemortem and postmortem records was created. For most disaster scenarios, the proposed simplified codes, paired with the optimized algorithm, performed better than WinID3, which uses more complex codes. The detailed coding system does show better performance with extremely large numbers of records and/or significant body fragmentation. © 2015 American Academy of Forensic Sciences.
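The simplified scheme is not specified in detail in the abstract; the following is a hypothetical sketch of percentage-of-matches scoring and ranked-list generation over seven-character tooth-code records (the codes and the 'X' no-information placeholder are assumptions for illustration):

```python
def match_score(antemortem, postmortem):
    """Percentage of positions whose codes agree, ignoring positions
    where either record has no information ('X', a hypothetical placeholder)."""
    comparable = [(a, p) for a, p in zip(antemortem, postmortem)
                  if a != 'X' and p != 'X']
    if not comparable:
        return 0.0
    hits = sum(a == p for a, p in comparable)
    return 100.0 * hits / len(comparable)

def rank_candidates(postmortem, antemortem_db):
    """Return antemortem record IDs sorted by descending match score."""
    scored = [(match_score(rec, postmortem), rid)
              for rid, rec in antemortem_db.items()]
    return [rid for score, rid in sorted(scored, reverse=True)]
```

Percentage-based scoring naturally tolerates records with many unknown positions, which the abstract suggests matters less than in heavily fragmented, very large databases.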
Maquiladoras and National Security: Design Theory as a Guide
2011-10-25
created in 1964 when the Mexican government established the Border Industrialization Program. Previously, under the Bracero program, large numbers ... "valve" for the Mexican economy in terms of unemployment. Previously, under the Bracero program, Mexican laborers were allowed temporary entry into ... the United States to pursue seasonal labor opportunities. A guest worker program, perhaps modeled after the Bracero program that was discontinued in
ERIC Educational Resources Information Center
Banados, Emerita
2006-01-01
Faced with the need to teach English to a large number of students, the "Universidad de Concepcion," Chile, has created an innovative Communicative English Program using ICT, which is made up of four modules covered in four academic terms. The English program aims to develop integrated linguistic skills with a focus on learning for…
Jose F. Negron; Willis C. Schaupp; Lee Pederson
2011-01-01
There are about 500 species of bark beetles (Coleoptera: Curculionidae: Scolytinae) in the United States (Wood 1982). A number of them are important disturbance agents in forested ecosystems, occasionally creating large tracts of dead trees. One eruptive species is the Douglas-fir beetle, Dendroctonus pseudotsugae Hopkins, which utilizes Douglas-fir, Pseudotsuga...
Farm Workers in a Specialized Seasonal Crop Area, Stanislaus County, California.
ERIC Educational Resources Information Center
METZLER, WILLIAM H.
Specialization in the crops best adapted to the local area is seen as a highly productive system of agriculture, but by creating the need for large numbers of workers for short periods of time, it causes unemployment and migration. A survey of fruit and vegetable workers in Stanislaus County, California in 1962-63 reveals--(1) their earnings are…
The Perceptions of New Middle School Teachers Regarding Teacher Job Satisfaction
ERIC Educational Resources Information Center
Evans, Paula Joan
2017-01-01
Teacher attrition has been a problem for school systems for more than 30 years. Large numbers of new teachers leave the profession within their first 5 years of service, creating a significant cost associated with hiring and training of replacement teachers. Attrition is problematic for a middle school in the state of Georgia. New teachers at the…
Unbiased multi-fidelity estimate of failure probability of a free plane jet
NASA Astrophysics Data System (ADS)
Marques, Alexandre; Kramer, Boris; Willcox, Karen; Peherstorfer, Benjamin
2017-11-01
Estimating failure probability related to fluid flows is a challenge because it requires a large number of evaluations of expensive models. We address this challenge by leveraging multiple low-fidelity models of the flow dynamics to create an optimal unbiased estimator. In particular, we investigate the effects of uncertain inlet conditions on the width of a free plane jet. We classify a condition as failure when the corresponding jet width is below a small threshold, such that failure is a rare event (failure probability smaller than 0.001). We estimate failure probability by combining the frameworks of multi-fidelity importance sampling and optimal fusion of estimators. Multi-fidelity importance sampling uses a low-fidelity model to explore the parameter space and create a biasing distribution. An unbiased estimate is then computed with a relatively small number of evaluations of the high-fidelity model. In the presence of multiple low-fidelity models, this framework offers multiple competing estimators. Optimal fusion combines all competing estimators into a single estimator with minimal variance. We show that this combined framework can significantly reduce the cost of estimating failure probabilities, and thus can have a large impact on fluid flow applications. This work was funded by DARPA.
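The fusion step can be sketched under an independence assumption (the full framework also handles correlated estimators): the minimum-variance unbiased combination weights each competing estimate by its inverse variance.

```python
def fuse(estimates, variances):
    """Minimum-variance combination of independent unbiased estimators
    via inverse-variance weighting. Independence is a simplifying
    assumption relative to the paper's general fusion framework."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * e for w, e in zip(weights, estimates)) / total
    fused_variance = 1.0 / total  # never exceeds the best input variance
    return fused, fused_variance
```

Adding any estimator, however noisy, can only decrease the fused variance, which is why multiple low-fidelity models are worth combining rather than discarding.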
FINITE ELEMENT MODEL FOR TIDAL AND RESIDUAL CIRCULATION.
Walters, Roy A.
1986-01-01
Harmonic decomposition is applied to the shallow water equations, thereby creating a system of equations for the amplitude of the various tidal constituents and for the residual motions. The resulting equations are elliptic in nature, are well posed and in practice are shown to be numerically well-behaved. There are a number of strategies for choosing elements: the two extremes are to use a few high-order elements with continuous derivatives, or to use a large number of simpler linear elements. In this paper simple linear elements are used and prove effective.
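As a generic illustration of the "large number of simple linear elements" strategy (a model 1D Poisson problem, not the paper's shallow-water system): linear elements on a uniform mesh reduce the discrete equations to a tridiagonal system solvable in linear time.

```python
def solve_poisson_1d(n):
    """Solve -u'' = 1 on (0,1) with u(0)=u(1)=0 using n equal linear
    elements. Assembly gives stiffness tridiag(-1, 2, -1)/h and a
    consistent load of h per interior node; the tridiagonal system is
    solved with the Thomas algorithm. Returns interior nodal values."""
    h = 1.0 / n
    m = n - 1                 # number of interior nodes
    a = [-1.0 / h] * m        # sub-diagonal
    b = [2.0 / h] * m         # diagonal
    c = [-1.0 / h] * m        # super-diagonal
    f = [h] * m               # load vector
    # Forward elimination
    for i in range(1, m):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        f[i] -= w * f[i - 1]
    # Back substitution
    u = [0.0] * m
    u[-1] = f[-1] / b[-1]
    for i in range(m - 2, -1, -1):
        u[i] = (f[i] - c[i] * u[i + 1]) / b[i]
    return u
```

For this model problem the exact solution is u(x) = x(1-x)/2, and linear elements reproduce it exactly at the nodes; the same "many simple elements plus a fast sparse solve" trade-off is what the abstract argues for.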
Computer-generated forces in distributed interactive simulation
NASA Astrophysics Data System (ADS)
Petty, Mikel D.
1995-04-01
Distributed Interactive Simulation (DIS) is an architecture for building large-scale simulation models from a set of independent simulator nodes communicating via a common network protocol. DIS is most often used to create a simulated battlefield for military training. Computer Generated Forces (CGF) systems control large numbers of autonomous battlefield entities in a DIS simulation using computer equipment and software rather than humans in simulators. CGF entities serve as both enemy forces and supplemental friendly forces in a DIS exercise. Research into various aspects of CGF systems is ongoing. Several CGF systems have been implemented.
NASA Technical Reports Server (NTRS)
Hawke, Veronica; Gage, Peter; Manning, Ted
2007-01-01
ComGeom2, a tool developed to generate Common Geometry representation for multidisciplinary analysis, has been used to create a large set of geometries for use in a design study requiring analysis by two computational codes. This paper describes the process used to generate the large number of configurations and suggests ways to further automate the process and make it more efficient for future studies. The design geometry for this study is the launch abort system of the NASA Crew Launch Vehicle.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Dong; Heidelberger, Philip; Sugawara, Yutaka
An apparatus and method for extending the scalability and improving the partitionability of networks that contain all-to-all links for transporting packet traffic from a source endpoint to a destination endpoint with low per-endpoint (per-server) cost and a small number of hops. The all-to-all wiring in the baseline topology is decomposed into smaller all-to-all components, in which each smaller all-to-all connection is replaced with a star topology using global switches. Stacking multiple copies of the star-topology baseline network creates a multi-planed switching topology for transporting packet traffic. The point-to-point unified stacking method connects multiple planes of a baseline topology using global-switch wiring, creating a large network size with a low number of hops, i.e., low network latency. The grouped unified stacking method increases the scalability (network size) of a stacked topology.
Survey of reconstructive microsurgery training in Korea.
Moon, Seong June; Hong, Joon Pio; Kang, So Ra; Suh, Hyun Suk
2015-01-01
Microsurgical technique is important in reconstructive surgery. Despite recognition of this fact, there are no systematized microsurgery training programs in Korea. The purpose of this study was to assess the current training programs and discuss the direction needed to improve them. The authors conducted a survey of graduates of a plastic surgery residency program. The questionnaire covered the volume of microsurgery, the training environment, the areas of microsurgery, the department(s) performing microsurgery, and the frequency with which flaps were used. Many specialties other than plastic surgery involve microsurgical procedures. The volume of microsurgery cases was disproportionate between large and small hospitals, creating an imbalance in residents' experience with microsurgical procedures. The increase in microsurgical procedures being performed has increased the number of surgeons who want to train in microsurgery. Increasing the number of microsurgery training programs will create more microsurgeons in Korea.
Partial Cavity Flows at High Reynolds Numbers
NASA Astrophysics Data System (ADS)
Makiharju, Simo; Elbing, Brian; Wiggins, Andrew; Dowling, David; Perlin, Marc; Ceccio, Steven
2009-11-01
Partial cavity flows created for friction drag reduction were examined at large scale. Partial cavities were investigated at Reynolds numbers up to 120 million, and stable cavities with frictional drag reduction of more than 95% were attained at optimal conditions. The model used was a 3 m wide and 12 m long flat plate with a plenum on the bottom. To create the partial cavity, air was injected at the base of an 18 cm backwards-facing step 2.1 m from the leading edge. The geometry at the cavity closure was varied for different flow speeds to optimize the closure of the cavity. Cavity gas flux, thickness, frictional loads, and cavity pressures were measured over a range of flow speeds and air injection fluxes. High-speed video was used extensively to investigate the unsteady three-dimensional cavity closure, the overall cavity shape, and oscillations.
Large-Eddy Simulation of Wind-Plant Aerodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Churchfield, M. J.; Lee, S.; Moriarty, P. J.
In this work, we present results of a large-eddy simulation of the 48 multi-megawatt turbines composing the Lillgrund wind plant. Turbulent inflow wind is created by performing an atmospheric boundary layer precursor simulation, and turbines are modeled using a rotating, variable-speed actuator line representation. The motivation for this work is that few others have done large-eddy simulations of wind plants with a substantial number of turbines, and the methods for carrying out the simulations are varied. We wish to draw upon the strengths of the existing simulations and our growing atmospheric large-eddy simulation capability to create a sound methodology for performing this type of simulation. We used the OpenFOAM CFD toolbox to create our solver. The simulated time-averaged power production of the turbines in the plant agrees well with field observations, except for the sixth turbine and beyond in each wind-aligned column. The power produced by each of those turbines is overpredicted by 25-40%. A direct comparison between simulated and field data is difficult because we simulate one wind direction with a speed and turbulence intensity characteristic of Lillgrund, but the field observations were taken over a year of varying conditions. The simulation shows the significant 60-70% decrease in the performance of the turbines behind the front row in this plant, which has a spacing of 4.3 rotor diameters in this direction. The overall plant efficiency is well predicted. This work shows the importance of using local grid refinement to simultaneously capture the meter-scale details of the turbine wake and the kilometer-scale turbulent atmospheric structures. Although this work illustrates the power of large-eddy simulation in producing a time-accurate solution, it required about one million processor-hours, showing the significant cost of large-eddy simulation.
Changing the Lines in the Coloring Book
2004-05-18
multitude of SCADA programmers creates a multitude of idiosyncratic programs, making it very difficult to know how to hack into large numbers of them...an observed pattern of hacking that would alert authorities seems not to have been discussed.) However, some SCADAs are not physically connected to the...normal hacking . Simultaneously, it would identify software vulnerabilities in other products and design viruses and worms for attacking them. (The
An Earth-System Approach to Understanding the Deepwater Horizon Oil Spill
ERIC Educational Resources Information Center
Robeck, Edward
2011-01-01
The Deepwater Horizon explosion on April 20, 2010, and the subsequent release of oil into the Gulf of Mexico created an ecological disaster of immense proportions. The estimates of the amounts of oil, whether for the amount released per day or the total amount of oil disgorged from the well, call on numbers so large they defy the capacity of most…
Nguyen, Hung; Badie, Nima; McSpadden, Luke; Pedrotty, Dawn; Bursac, Nenad
2014-01-01
Micropatterning is a powerful technique to control cell shape and position on a culture substrate. In this chapter, we describe the method to reproducibly create large numbers of micropatterned heterotypic cell pairs with defined size, shape, and length of cell–cell contact. These cell pairs can be utilized in patch clamp recordings to quantify electrical interactions between cardiomyocytes and non-cardiomyocytes. PMID:25070342
ERIC Educational Resources Information Center
Stackhouse, Shannon Alexis
2009-01-01
The importance of education for individual well-being, social cohesion and economic growth is widely accepted by researchers and policymakers alike. Yet there exist vast numbers of people around the world, largely poor, who continue to lag behind wealthier people, often within their own nations. Conditional cash transfer programs were created to…
ERIC Educational Resources Information Center
Khoshhal, Yasin
2016-01-01
With the ever-growing needs for more resources, the lack of concentration on preparing an exclusive activity for a particular classroom can be observed in a large number of educational contexts. The present study investigates the efficiency of ready-made activities for busy teachers. To this end, an activity from the ready-made resource book,…
Roswitha Schmickl; Aaron Liston; Vojtěch Zeisek; Kenneth Oberlander; Kevin Weitemier; Shannon C. K. Straub; Richard C. Cronn; Léanne L. Dreyer; Jan Suda
2016-01-01
Phylogenetics benefits from using a large number of putatively independent nuclear loci and their combination with other sources of information, such as the plastid and mitochondrial genomes. To facilitate the selection of orthologous low-copy nuclear (LCN) loci for phylogenetics in nonmodel organisms, we created an automated and interactive script to select hundreds...
Nurses "Fit for Purpose": Using a Task-Centred Group to Help Students Learn from Experience
ERIC Educational Resources Information Center
Platt, Chris
2002-01-01
In a time of high demand for qualified nurses, mass education has become the norm. This is a necessary constraint within which we have to learn to function. However, the need to educate large numbers of nurses may lead to a one-sided unimaginative reliance on the traditional lecture method, creating a learning environment that inadvertently limits…
NASA Astrophysics Data System (ADS)
Niwa, Masaki; Takashina, Shoichi; Mori, Yojiro; Hasegawa, Hiroshi; Sato, Ken-ichi; Watanabe, Toshio
2015-01-01
With the continuous increase in Internet traffic, reconfigurable optical add-drop multiplexers (ROADMs) have been widely adopted in the core and metro core networks. Current ROADMs, however, allow only static operation. To realize future dynamic optical-network services, and to minimize any human intervention in network operation, the optical signal add/drop part should have colorless/directionless/contentionless (C/D/C) capabilities. This is possible with matrix switches or a combination of splitter-switches and optical tunable filters. The scale of the matrix switch increases with the square of the number of supported channels, and hence, the matrix-switch-based architecture is not suitable for creating future large-scale ROADMs. In contrast, the numbers of splitter ports, switches, and tunable filters increase linearly with the number of supported channels, and hence the tunable-filter-based architecture will support all future traffic. So far, we have succeeded in fabricating a compact tunable filter that consists of multi-stage cyclic arrayed-waveguide gratings (AWGs) and switches by using planar-lightwave-circuit (PLC) technologies. However, this multistage configuration suffers from large insertion loss and filter narrowing. Moreover, power-consuming temperature control is necessary since it is difficult to make cyclic AWGs athermal. We propose here novel tunable-filter architecture that sandwiches a single-stage non-cyclic athermal AWG having flatter-topped passbands between small-scale switches. With this configuration, the optical tunable filter attains low insertion loss, large passband bandwidths, low power consumption, compactness, and high cost-effectiveness. A prototype is monolithically fabricated with PLC technologies and its excellent performance is experimentally confirmed utilizing 80-channel 30-GBaud dual-polarization quadrature phase-shift-keying (QPSK) signals.
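The abstract's scaling argument (matrix-switch elements growing with the square of the supported channel count, versus linear growth for the splitter/switch/tunable-filter architecture) can be made concrete with schematic element counts; the constants below are assumptions, not the paper's exact device counts.

```python
def matrix_switch_elements(n_channels):
    """Switching elements in a matrix-switch C/D/C add/drop stage:
    grows with the square of the supported channel count
    (schematic O(n^2) model, not the paper's exact figure)."""
    return n_channels * n_channels

def tunable_filter_elements(n_channels):
    """Splitter ports + switches + tunable filters in the
    splitter-and-tunable-filter architecture: linear growth
    (the factor of 3 is a schematic assumption)."""
    return 3 * n_channels
```

At 80 channels the quadratic model already needs roughly 6,400 elements against 240 for the linear one, which is the abstract's case for the tunable-filter architecture at future traffic scales.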
Impact phenomena as factors in the evolution of the Earth
NASA Technical Reports Server (NTRS)
Grieve, R. A. F.; Parmentier, E. M.
1984-01-01
It is estimated that 30 to 200 large impact basins could have been formed on the early Earth. These large impacts may have resulted in extensive volcanism and enhanced endogenic geologic activity over large areas. Initial modelling of the thermal and subsidence history of large terrestrial basins indicates that they created geologic and thermal anomalies which lasted for geologically significant times. The role of large-scale impact in the biological evolution of the Earth has been highlighted by the discovery of siderophile anomalies at the Cretaceous-Tertiary boundary and associated with North American microtektites. Although in neither case has an associated crater been identified, the observations are consistent with the deposition of projectile-contaminated high-speed ejecta from major impact events. Consideration of impact processes reveals a number of mechanisms by which large-scale impact may induce extinctions.
Challenges in Creating Online Exercises and Exams in Organic Chemistry.
Jaun, Bernhard; Thilgen, Carlo
2018-02-01
e-Learning has become increasingly important in chemical education and online exams can be an attractive alternative to traditional exams written on paper, particularly in classes with a large number of students. Ten years ago, we began to set up an e-course complementing our lecture courses Organic Chemistry I and II within the open-source e-learning environment Moodle. In this article, we retrace a number of decisions we took over time, thereby illustrating the challenges one faces when creating online exercises and exams in (organic) chemistry. Special emphasis is put on the development of MOSFECCS (MOlecular Structural Formula Editor and Calculator of Canonical SMILES), our new editor for drawing structural formulae and converting them to alphanumeric SMILES codes that can be submitted as answers to e-problems. Convinced that the possibility for structure input is essential to set up sensible chemistry quizzes and exams, and realising that existing tools present major flaws in an educational context, we decided to embark on the implementation of MOSFECCS which takes into account a number of didactic aspects.
Computations on Wings With Full-Span Oscillating Control Surfaces Using Navier-Stokes Equations
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.
2013-01-01
A dual-level parallel procedure is presented for computing large databases to support aerospace vehicle design. This procedure has been developed as a single Unix script within the Parallel Batch Submission environment, utilizing MPIexec to run MPI-based analysis software. It provides a process for aerospace designers to generate data for large numbers of cases with the highest possible fidelity and reasonable wall-clock time. A single job-submission environment has been created to avoid keeping track of multiple jobs and the associated system administration overhead. The process has been demonstrated by computing large databases for the design of typical aerospace configurations: a launch vehicle and a rotorcraft.
Transition to adulthood of female garment-factory workers in Bangladesh.
Amin, S; Diamond, I; Naved, R T; Newby, M
1998-06-01
This article examines data from a study on garment-factory workers in Bangladesh to explore the implications of work for the early socialization of young women. For the first time, large numbers of young Bangladeshi women are being given an alternative to lives in which they move directly from childhood to adulthood through early marriage and childbearing. Employment creates a period of transition in contrast to the abrupt assumption of adult roles at very young ages that marriage and childbearing mandate. This longer transition creates a period of adolescence for young women working in the garment sector that is shown to have strong implications for the women's long-term reproductive health.
NASA Technical Reports Server (NTRS)
Leake, Stephen; Green, Tom; Cofer, Sue; Sauerwein, Tim
1989-01-01
HARPS is a telerobot control system that can perform some simple but useful tasks. This capability is demonstrated by performing the ORU exchange demonstration. HARPS is based on NASREM (NASA Standard Reference Model). All software is developed in Ada, and the project incorporates a number of different CASE (computer-aided software engineering) tools. NASREM was found to be a valid and useful model for building a telerobot control system. Its hierarchical and distributed structure creates a natural and logical flow for implementing large complex robust control systems. The ability of Ada to create and enforce abstraction enhanced the implementation of such control systems.
ERIC Educational Resources Information Center
Angouri, Jo
2010-01-01
The current international nature of socio-economic activities is reshaping workplace settings and creating the need for large numbers of employees to perform successful communicative acts with a wider range of interactants than in the past, often using a language other than their mother tongue. Against this backdrop much emphasis has been placed…
NASA Technical Reports Server (NTRS)
Wang, Wenlong; Mandra, Salvatore; Katzgraber, Helmut G.
2016-01-01
In this paper, we propose a patch planting method for creating arbitrarily large spin glass instances with known ground states. The scaling of the computational complexity of these instances with various block numbers and sizes is investigated and compared with random instances using population annealing Monte Carlo and the quantum annealing DW2X machine. The method can be useful for benchmarking tests for future generation quantum annealing machines, classical and quantum mechanical optimization algorithms.
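Patch planting composes planted sub-instances into one large instance. As a minimal illustration of the underlying idea of "an instance with a known ground state" (using the simpler Mattis-style construction, not the paper's patch method):

```python
import random

def plant_instance(n, seed=0):
    """Mattis-style planting: choose a target configuration s*, then set
    couplings J[i][j] = s*[i] * s*[j] on a ring of n spins. Every term of
    H = -sum J_ij s_i s_j is then minimized by s* (and by -s*), so the
    ground state is known by construction. This is a simpler special
    case than patch planting, shown only for illustration."""
    rng = random.Random(seed)
    s_star = [rng.choice([-1, 1]) for _ in range(n)]
    edges = [(i, (i + 1) % n) for i in range(n)]
    J = {(i, j): s_star[i] * s_star[j] for i, j in edges}
    return s_star, J

def energy(s, J):
    """Ising energy H(s) = -sum over edges of J_ij * s_i * s_j."""
    return -sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
```

Knowing the ground state exactly is what makes such instances useful benchmarks: an annealer's output can be scored against the planted energy rather than against a heuristic best-found value.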
Risk Management in Media Engagement
2009-02-01
the US foreign policy of backing Israel remained a sour point for a number of Muslim countries. Gulf War I, because there was a large Muslim...Muslim World). 31 Bush Administration‟s appointment of Charlotte Beers as Under Secretary of State for Public Diplomacy, a newly created post...reasons of the Soviets‟ demise. Beers attempted to engage the target audience emotionally rather than discursively, with one of the first initiatives to
Tunneling into fuzzball states
NASA Astrophysics Data System (ADS)
Mathur, Samir D.
2010-01-01
String theory suggests that black hole microstates are quantum, horizon-sized 'fuzzballs', rather than smooth geometries with a horizon. Radiation from fuzzballs can carry information and does not lead to information loss. But if we let a shell of matter collapse, then it creates a horizon, and it seems that subsequent radiation will lead to information loss. We argue that the resolution to this problem is that the shell can tunnel to the fuzzball configurations. The amplitude for tunneling is small because we are relating two macroscopically different configurations, but the number of states that we can tunnel to, given through the Bekenstein entropy, is very large. These small and large numbers can cancel each other, making it possible for the shell to tunnel into fuzzball states before a significant amount of radiation has been emitted. This offers a way to resolve the information paradox.
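The cancellation of "small amplitude times large number of states" can be written schematically (the precise form of the exponents is an assumption beyond the abstract; S_bek denotes the Bekenstein entropy):

```latex
% Schematic: total tunneling rate ~ (amplitude per state)^2 x (number of accessible states)
\Gamma \;\sim\; \underbrace{e^{-\alpha\, G M^{2}}}_{\text{tunneling probability per state}}
\times \underbrace{e^{S_{\mathrm{bek}}}}_{\text{number of fuzzball states}},
\qquad S_{\mathrm{bek}} = \frac{A}{4G} \sim G M^{2},
```

so for a dimensionless coefficient α of order unity, the exponentially small per-state amplitude can be offset by the exponentially large count of fuzzball states, which is the cancellation the abstract invokes.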
Accreting Black Hole Binaries in Globular Clusters
NASA Astrophysics Data System (ADS)
Kremer, Kyle; Chatterjee, Sourav; Rodriguez, Carl L.; Rasio, Frederic A.
2018-01-01
We explore the formation of mass-transferring binary systems containing black holes (BHs) within globular clusters (GC). We show that it is possible to form mass-transferring BH binaries with main sequence, giant, and white dwarf companions with a variety of orbital parameters in GCs spanning a large range in present-day properties. All mass-transferring BH binaries found in our models at late times are dynamically created. The BHs in these systems experienced a median of ∼30 dynamical encounters within the cluster before and after acquiring the donor. Furthermore, we show that the presence of mass-transferring BH systems has little correlation with the total number of BHs within the cluster at any time. This is because the net rate of formation of BH–non-BH binaries in a cluster is largely independent of the total number of retained BHs. Our results suggest that the detection of a mass-transferring BH binary in a GC does not necessarily indicate that the host cluster contains a large BH population.
Two Rounds of Whole Genome Duplication in the Ancestral Vertebrate
Dehal, Paramvir; Boore, Jeffrey L
2005-01-01
The hypothesis that the relatively large and complex vertebrate genome was created by two ancient, whole genome duplications has been hotly debated, but remains unresolved. We reconstructed the evolutionary relationships of all gene families from the complete gene sets of a tunicate, fish, mouse, and human, and then determined when each gene duplicated relative to the evolutionary tree of the organisms. We confirmed the results of earlier studies that there remains little signal of these events in numbers of duplicated genes, gene tree topology, or the number of genes per multigene family. However, when we plotted the genomic map positions of only the subset of paralogous genes that were duplicated prior to the fish–tetrapod split, their global physical organization provides unmistakable evidence of two distinct genome duplication events early in vertebrate evolution indicated by clear patterns of four-way paralogous regions covering a large part of the human genome. Our results highlight the potential for these large-scale genomic events to have driven the evolutionary success of the vertebrate lineage. PMID:16128622
A replacement for islet equivalents with improved reliability and validity.
Huang, Han-Hung; Ramachandran, Karthik; Stehno-Bittel, Lisa
2013-10-01
Islet equivalent (IE), the standard estimate of isolated islet volume, is an essential measure to determine the amount of transplanted islet tissue in the clinic and is used in research laboratories to normalize results, yet it is based on the false assumption that all islets are spherical. Here, we developed and tested a new easy-to-use method to quantify islet volume with greater accuracy. Isolated rat islets were dissociated into single cells, and the total cell number per islet was determined by using computer-assisted cytometry. Based on the cell number per islet, we created a regression model to convert islet diameter to cell number with a high R² value (0.8) and good validity and reliability, with the same model applicable to young and old rats and males or females. Conventional IE measurements overestimated the tissue volume of islets. To compare results obtained using IE or our new method, we compared Glut2 protein levels determined by Western Blot and proinsulin content via ELISA between small (diameter ≤ 100 μm) and large (diameter ≥ 200 μm) islets. When normalized by IE, large islets showed significantly lower Glut2 level and proinsulin content. However, when normalized by cell number, large and small islets had no difference in Glut2 levels, but large islets contained more proinsulin. In conclusion, normalizing islet volume by IE overestimated the tissue volume, which may lead to erroneous results. Normalizing by cell number is a more accurate method to quantify tissue amounts used in islet transplantation and research.
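The abstract reports the regression's R² but not its coefficients. The sketch below fits a hypothetical power law N = a·d^b by log-log least squares on synthetic data; the coefficient 0.002 and exponent 3 are assumptions (consistent only with the rough idea that cell number scales with volume), not the paper's model.

```python
import math

def fit_power_law(diams, counts):
    """Least-squares fit of N = a * d**b via linear regression in log-log space."""
    xs = [math.log(d) for d in diams]
    ys = [math.log(n) for n in counts]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

# Synthetic diameter (um) -> cell-count data following an assumed cubic law.
diams = [50, 100, 150, 200, 250]
counts = [0.002 * d ** 3 for d in diams]
a, b = fit_power_law(diams, counts)
```

A calibration of this shape lets routine diameter measurements stand in for the labor-intensive dissociation-and-count procedure, which is the practical point of the paper's model.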
Clinical review: SARS - lessons in disaster management.
Hawryluck, Laura; Lapinsky, Stephen E; Stewart, Thomas E
2005-08-01
Disaster management plans have traditionally been required to manage major traumatic events that create a large number of victims. Infectious diseases, whether they be natural (e.g. SARS [severe acute respiratory syndrome] and influenza) or the result of bioterrorism, have the potential to create a large influx of critically ill into our already strained hospital systems. With proper planning, hospitals, health care workers and our health care systems can be better prepared to deal with such an eventuality. This review explores the Toronto critical care experience of coping in the SARS outbreak disaster. Our health care system and, in particular, our critical care system were unprepared for this event, and as a result the impact that SARS had was worse than it could have been. Nonetheless, we were able to organize a response rapidly during the outbreak. By describing our successes and failures, we hope to help others to learn and avoid the problems we encountered as they develop their own disaster management plans in anticipation of similar future situations.
[Recent Progress in Promoting Research Integrity].
Tanaka, Satoshi
2018-01-01
An increasing number of cases of research misconduct and whistle-blowing in the fields of medicine and life sciences has created public concern about research integrity. In Europe and the United States, there has been a large focus on poor reproducibility in life science research, and poor reproducibility is largely associated with research misconduct. Research integrity is equally crucial in the pharmaceutical sciences, which play an important role in medical and life sciences. Individual cases of research misconduct have not been investigated in detail in Japan, because it was generally believed that only researchers with strong or strange personalities would participate in misconduct. However, a better understanding of research misconduct will enable more in-depth discussions about research integrity, which is now known to be closely associated with normal research activities. Here I will introduce information on various contemporary activities being performed to create a sound research environment, drawn from practices in universities, pharmaceutical companies, and government agencies. I will also discuss ways in which individual researchers can promote research integrity.
Sparse synthetic aperture with Fresnel elements (S-SAFE) using digital incoherent holograms
Kashter, Yuval; Rivenson, Yair; Stern, Adrian; Rosen, Joseph
2015-01-01
Creating a large-scale synthetic aperture makes it possible to break the resolution boundaries dictated by the wave nature of light in common optical systems. However, its implementation is challenging, since the generation of a large, continuous mosaic synthetic aperture composed of many patterns is complicated in terms of both phase matching and time-multiplexing duration. In this study we present an advanced configuration for an incoherent holographic imaging system with super-resolution qualities that creates a partial synthetic aperture. The new system, termed sparse synthetic aperture with Fresnel elements (S-SAFE), significantly decreases the number of recorded elements and is free from positional constraints on their location. Additionally, in order to obtain the best image quality, we propose an optimal mosaicking structure derived on the basis of physical and numerical considerations, and introduce three reconstruction approaches which are compared and discussed. The super-resolution capabilities of the proposed scheme and its limitations are analyzed, numerically simulated and experimentally demonstrated. PMID:26367947
ERIC Educational Resources Information Center
Brown, J. C.
1915-01-01
The International Commission on the Teaching of Mathematics created by the International Congress of Mathematics at Rome, Italy, in 1908, submitted a large body of reports to the congress at Cambridge, England, in 1912. Those for the United States have been published as bulletins for the Bureau of Education. The material in this bulletin shows…
2016-01-14
This image, captured by NASA's 2001 Mars Odyssey spacecraft, shows a portion of one of the larger depressions on the NW edge of the Elysium volcanic complex. Portions of this large channel system appear to have been created by liquid flow, while other portions appear to have been formed by tectonic action. Orbit Number: 61770 Latitude: 28.4502 Longitude: 138.828 Instrument: VIS Captured: 2015-11-16 21:51 http://photojournal.jpl.nasa.gov/catalog/PIA20236
NASA Applications of Molecular Nanotechnology
NASA Technical Reports Server (NTRS)
Globus, Al; Bailey, David; Han, Jie; Jaffe, Richard; Levit, Creon; Merkle, Ralph; Srivastava, Deepak
1998-01-01
Laboratories throughout the world are rapidly gaining atomically precise control over matter. As this control extends to an ever wider variety of materials, processes and devices, opportunities for applications relevant to NASA's missions will be created. This document surveys a number of future molecular nanotechnology capabilities of aerospace interest. Computer applications, launch vehicle improvements, and active materials appear to be of particular interest. We also list a number of applications for each of NASA's enterprises. If advanced molecular nanotechnology can be developed, almost all of NASA's endeavors will be radically improved. In particular, a sufficiently advanced molecular nanotechnology can arguably bring large scale space colonization within our grasp.
Wind-turbine-performance assessment
NASA Astrophysics Data System (ADS)
Vachon, W. A.
1982-06-01
An updated summary of recent test data and experiences is reported from both federally and privately funded large wind turbine (WT) development and test programs, and from key WT programs in Europe. Progress and experiences with the cluster of three MOD-2 2.5-MW WTs, the MOD-1 2-MW WT, and other WT installations are described. An examination of recent test experiences and plans from approximately five privately funded large WT programs in the United States indicates that, during machine checkout and startup, a number of technical problems are identified which will require design changes and create program delays.
Creating databases for biological information: an introduction.
Stein, Lincoln
2002-08-01
The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, and relational databases, as well as ACeDB. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system.
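As a minimal illustration of the relational option the unit discusses, the sketch below stores a small strain catalog (the kind of insertional-mutagenesis record mentioned above) in SQLite, Python's standard-library relational engine. The table, columns, and strain names are invented:

```python
import sqlite3

# Minimal relational store for a strain catalog; schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE strain (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    insertion_site TEXT,
    phenotype TEXT)""")
conn.executemany(
    "INSERT INTO strain (name, insertion_site, phenotype) VALUES (?, ?, ?)",
    [("mut-101", "chrII:1203944", "slow growth"),
     ("mut-102", "chrIV:884210", "wild type")])

# A declarative query replaces scanning files and directories by hand.
rows = conn.execute(
    "SELECT name FROM strain WHERE phenotype = ?", ("slow growth",)).fetchall()
```

Once records outgrow flat files, queries like the one above (filtering, joining, counting) come essentially for free, which is the trade-off the unit weighs against the simplicity of flat and indexed files.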
Design and fabrication of multispectral optics using expanded glass map
NASA Astrophysics Data System (ADS)
Bayya, Shyam; Gibson, Daniel; Nguyen, Vinh; Sanghera, Jasbinder; Kotov, Mikhail; Drake, Gryphon; Deegan, John; Lindberg, George
2015-06-01
As the desire to have compact multispectral imagers in various DoD platforms is growing, the dearth of multispectral optics is widely felt. With the limited number of material choices for optics, these multispectral imagers are often very bulky and impractical on several weight sensitive platforms. To address this issue, NRL has developed a large set of unique infrared glasses that transmit from 0.9 to > 14 μm in wavelength and expand the glass map for multispectral optics with refractive indices from 2.38 to 3.17. They show a large spread in dispersion (Abbe number) and offer some unique solutions for multispectral optics designs. The new NRL glasses can be easily molded and also fused together to make bonded doublets. A Zemax compatible glass file has been created and is available upon request. In this paper we present some designs, optics fabrication and imaging, all using NRL materials.
NASA Technical Reports Server (NTRS)
Hudson, W. R.
1976-01-01
A microscopic surface texture is created by sputter etching a surface while simultaneously sputter depositing a lower sputter yield material onto the surface. A xenon ion beam source has been used to perform this texturing process on samples as large as three centimeters in diameter. Ion beam textured surface structures have been characterized with SEM photomicrographs for a large number of materials including Cu, Al, Si, Ti, Ni, Fe, stainless steel, Au, and Ag. Surfaces have been textured using a variety of low sputter yield materials: Ta, Mo, Nb, and Ti. The initial stages of the texture creation have been documented, and the technique of ion beam sputter removal of any remaining deposited material has been studied. A number of other texturing parameters have been studied, such as the variation of the texture with ion beam power, surface temperature, and the rate of texture growth with sputter etching time.
[Application of Kohonen Self-Organizing Feature Maps in QSAR of human ADMET and kinase data sets].
Hegymegi-Barakonyi, Bálint; Orfi, László; Kéri, György; Kövesdi, István
2013-01-01
QSAR predictions have proven very useful in a large number of studies for drug design, such as the design of kinase inhibitors as targets for cancer therapy; however, the overall predictability often remains unsatisfactory. To improve predictability of ADMET features and kinase inhibitory data, we present a new method using Kohonen's Self-Organizing Feature Map (SOFM) to cluster molecules based on explanatory variables (X) and separate dissimilar ones. We calculated SOFM clusters for a large number of molecules with human ADMET and kinase inhibitory data, and we showed that chemically similar molecules were in the same SOFM cluster, and within such clusters the QSAR models had significantly better predictability. We also used target variables (Y, e.g. ADMET) jointly with X variables to create a novel type of clustering. With our method, cells of loosely coupled XY data could be identified and separated into different model building sets.
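A toy version of the Kohonen SOFM clustering step might look like the following. The two molecular descriptors, the 1-D map size, and the training schedule are all invented for illustration; real SOFM-based QSAR work uses many more descriptors and a 2-D map:

```python
import math, random

random.seed(0)

# Two invented "molecule" groups described by two scaled descriptors in [0,1].
data = ([(random.gauss(0.2, 0.05), random.gauss(0.2, 0.05)) for _ in range(30)]
        + [(random.gauss(0.8, 0.05), random.gauss(0.8, 0.05)) for _ in range(30)])

n_nodes = 4
weights = [(random.random(), random.random()) for _ in range(n_nodes)]

def bmu(x):
    """Index of the best-matching unit (closest node) for sample x."""
    return min(range(n_nodes),
               key=lambda i: (weights[i][0] - x[0]) ** 2 + (weights[i][1] - x[1]) ** 2)

for epoch in range(50):
    lr = 0.5 * (1 - epoch / 50)                 # decaying learning rate
    radius = max(1.0 * (1 - epoch / 50), 0.01)  # shrinking neighborhood
    for x in data:
        w = bmu(x)
        for i in range(n_nodes):
            # Neighborhood function pulls nodes near the winner toward x.
            h = math.exp(-((i - w) ** 2) / (2 * radius ** 2))
            weights[i] = (weights[i][0] + lr * h * (x[0] - weights[i][0]),
                          weights[i][1] + lr * h * (x[1] - weights[i][1]))

# Chemically similar samples should now map to the same (or adjacent) node.
clusters = [bmu(x) for x in data]
```

The point of the abstract, building separate QSAR models within each such cluster, then operates on the per-node groups returned here.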
NASA Astrophysics Data System (ADS)
Min, M.
2017-10-01
Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line lists for these molecules. The line lists available today contain, for many species, up to several billion lines. Computation of the spectral line profile created by pressure and temperature broadening, the Voigt profile, for all of these lines is becoming a computational challenge. Aims: We aim to create a method to compute the Voigt profile in a way that automatically focuses the computation time on the strongest lines, while still maintaining the continuum contribution of the high number of weaker lines. Methods: Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high-accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results: The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (∼3.5 × 10⁵ lines per second per core on a standard current-day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe on how to perform the computations is given.
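The core idea of the sampling scheme, depositing each line's total strength through a number of Monte Carlo samples scaled to its strength so that the integrated opacity is preserved, can be sketched as follows. A Lorentzian stands in for the Voigt profile to keep the example dependency-free, and all line parameters and scaling constants are invented:

```python
import math, random

random.seed(1)

grid = [0.0] * 200   # opacity grid over frequency nu in [0, 20), arbitrary units
dnu = 0.1            # grid spacing

def add_line(nu0, strength, gamma, n_min=10, per_strength=1000):
    """Deposit one line's opacity via Monte Carlo profile sampling."""
    n = n_min + int(per_strength * strength)   # more samples for strong lines
    for _ in range(n):
        # Inverse-CDF draw from a Lorentzian of half-width gamma at nu0
        # (stand-in for the Voigt profile).
        nu = nu0 + gamma * math.tan(math.pi * (random.random() - 0.5))
        i = math.floor(nu / dnu)
        if 0 <= i < len(grid):
            grid[i] += strength / n / dnu      # preserves integrated opacity

# Invented line list: (center, strength, half-width).
lines = [(5.0, 2.0, 0.05), (12.0, 0.01, 0.05), (15.0, 0.5, 0.05)]
for nu0, s, g in lines:
    add_line(nu0, s, g)

total = sum(grid) * dnu   # approx. the summed strength of all in-grid lines
```

Strong lines get thousands of samples and therefore sharp profiles, while the weak line at nu = 12 contributes only a coarse continuum-level deposit at negligible cost, which mirrors the accuracy/speed trade-off the paper describes.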
Dynamic permeability in fault damage zones induced by repeated coseismic fracturing events
NASA Astrophysics Data System (ADS)
Aben, F. M.; Doan, M. L.; Mitchell, T. M.
2017-12-01
Off-fault fracture damage in upper crustal fault zones changes the fault zone properties and affects various co- and interseismic processes. One of these properties is the permeability of the fault damage zone rocks, which is generally higher than that of the surrounding host rock. This allows large-scale fluid flow through the fault zone, which affects fault healing and promotes mineral transformation processes. Moreover, it might play an important role in thermal fluid pressurization during an earthquake rupture. The damage zone permeability is dynamic due to coseismic damaging. It is crucial for earthquake mechanics and for longer-term processes to understand what the dynamic permeability structure of a fault looks like and how it evolves with repeated earthquakes. To better detail coseismically induced permeability, we have performed uniaxial split Hopkinson pressure bar experiments on quartz-monzonite rock samples. Two sample sets were created and analyzed: single-loaded samples subjected to varying loading intensities, with damage varying from apparently intact to pulverized, and samples loaded at a constant intensity but with a varying number of repeated loadings. The first set resembles a dynamic permeability structure created by a single large earthquake. The second set resembles a permeability structure created by several earthquakes. Afterwards, the permeability and acoustic velocities were measured as a function of confining pressure. The permeability in both datasets shows a large and non-linear increase over several orders of magnitude (from 10⁻²⁰ m² up to 10⁻¹⁴ m²) with an increasing amount of fracture damage. This, combined with microstructural analyses of the varying degrees of damage, suggests a percolation threshold. The percolation threshold does not coincide with the pulverization threshold.
With increasing confining pressure, the permeability might drop up to two orders of magnitude, which supports the possibility of large coseismic fluid pulses over relatively large distances along a fault. Also, a relatively small threshold could potentially increase permeability in a large volume of rock, given that previous earthquakes already damaged these rocks.
Light-induced defects in hybrid lead halide perovskite
NASA Astrophysics Data System (ADS)
Sharia, Onise; Schneider, William
One of the main challenges facing organohalide perovskites for solar applications is stability. Solar cells must last decades to be economically viable alternatives to traditional energy sources. While some causes of instability can be avoided through engineering, light-induced defects can be a fundamentally limiting factor for practical application of the material. Light creates large numbers of electron-hole pairs that can contribute to degradation processes. Using ab initio theoretical methods, we systematically explore the first steps of light-induced defect formation in methyl ammonium lead iodide, MAPbI3. In particular, we study charged and neutral Frenkel pair formation involving Pb and I atoms. We find that most of the defects, except negatively charged Pb Frenkel pairs, are reversible, and thus most do not lead to degradation. Negative Pb defects create a mid-gap state and localize the conduction band electron. A minimum energy path study shows that, once the first defect is created, Pb atoms migrate relatively fast. The defects have two detrimental effects on the material. First, they create charge traps below the conduction band. Second, they can lead to degradation of the material by forming Pb clusters.
Short-crested waves in the surf zone
NASA Astrophysics Data System (ADS)
Wei, Zhangping; Dalrymple, Robert A.; Xu, Munan; Garnier, Roland; Derakhti, Morteza
2017-05-01
This study investigates short-crested waves in the surf zone by using the mesh-free Smoothed Particle Hydrodynamics model, GPUSPH. The short-crested waves are created by generating intersecting wave trains in a numerical wave basin with a beach. We first validate the numerical model for short-crested waves by comparison with large-scale laboratory measurements. Then short-crested wave breaking over a planar beach is studied comprehensively. We observe rip currents as discussed in Dalrymple (1975) and undertow created by synchronous intersecting waves. The wave breaking of the short-crested wavefield created by the nonlinear superposition of intersecting waves and wave-current interaction result in the formation of isolated breakers at the ends of breaking wave crests. Wave amplitude diffraction at these isolated breakers gives rise to an increase in the alongshore wave number in the inner surf zone. Moreover, 3-D vortices and multiple circulation cells with a rotation frequency much lower than the incident wave frequency are observed across the outer surf zone to the beach. Finally, we investigate vertical vorticity generation under short-crested wave breaking and find that breaking of short-crested waves generates vorticity as pointed out by Peregrine (1998). Vorticity generation is not only observed under short-crested waves with a limited number of wave components but also under directional wave spectra.
Contribution to terminology internationalization by word alignment in parallel corpora.
Deléger, Louise; Merkel, Magnus; Zweigenbaum, Pierre
2006-01-01
Creating a complete translation of a large vocabulary is a time-consuming task, which requires skilled and knowledgeable medical translators. Our goal is to examine to which extent such a task can be alleviated by a specific natural language processing technique, word alignment in parallel corpora. We experiment with translation from English to French. Build a large corpus of parallel, English-French documents, and automatically align it at the document, sentence and word levels using state-of-the-art alignment methods and tools. Then project English terms from existing controlled vocabularies to the aligned word pairs, and examine the number and quality of the putative French translations obtained thereby. We considered three American vocabularies present in the UMLS with three different translation statuses: the MeSH, SNOMED CT, and the MedlinePlus Health Topics. We obtained several thousand new translations of our input terms, this number being closely linked to the number of terms in the input vocabularies. Our study shows that alignment methods can extract a number of new term translations from large bodies of text with a moderate human reviewing effort, and thus contribute to help a human translator obtain better translation coverage of an input vocabulary. Short-term perspectives include their application to a corpus 20 times larger than that used here, together with more focused methods for term extraction.
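The projection step, looking up English vocabulary terms in word-level alignment statistics to propose French translations, might be sketched as below. The alignment counts and vocabulary are invented; real systems work from millions of aligned sentence pairs:

```python
# Invented word-alignment counts extracted from a parallel corpus:
# English word -> {aligned French word: co-occurrence count}.
alignments = {
    "fever": {"fièvre": 12, "température": 2},
    "headache": {"céphalée": 7, "mal": 3},
    "rash": {"éruption": 5},
}

# English terms from a controlled vocabulary to be projected.
vocabulary = ["fever", "headache", "migraine"]

def project(term):
    """Most frequently aligned French word, or None if the term never occurs."""
    counts = alignments.get(term)
    if not counts:
        return None
    return max(counts, key=counts.get)

# Putative translations, to be checked by a human reviewer.
translations = {t: project(t) for t in vocabulary}
```

Terms absent from the corpus (here "migraine") yield no candidate, which is why the number of translations obtained tracks the overlap between the vocabulary and the corpus, as the abstract notes.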
Scalability improvements to NRLMOL for DFT calculations of large molecules
NASA Astrophysics Data System (ADS)
Diaz, Carlos Manuel
Advances in high performance computing (HPC) have provided a way to treat large, computationally demanding tasks using thousands of processors. With the development of more powerful HPC architectures, the need to create efficient and scalable code has grown more important. Electronic structure calculations are valuable in understanding experimental observations and are routinely used for new materials predictions. For these electronic structure calculations, memory requirements scale as N², where N is the number of atoms, and computation time grows correspondingly. While the recent advances in HPC offer platforms with large numbers of cores, the limited amount of memory available on a given node and poor scalability of the electronic structure code hinder efficient usage of these platforms. This thesis will present some developments to overcome these bottlenecks in order to study large systems. These developments, which are implemented in the NRLMOL electronic structure code, involve the use of sparse matrix storage formats and of linear algebra on sparse and distributed matrices. These developments, along with other related developments, now allow ground-state density functional calculations using up to 25,000 basis functions and excited-state calculations using up to 17,000 basis functions while utilizing all cores on a node. An example on a light-harvesting triad molecule is described. Finally, future plans to further improve the scalability will be presented.
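The sparse-matrix storage idea can be illustrated with a minimal CSR (compressed sparse row) layout, in which memory scales with the number of nonzero entries rather than N². The matrix values are arbitrary, and production codes would of course use optimized libraries rather than this pure-Python sketch:

```python
# Arbitrary dense matrix with mostly zero entries.
dense = [[4.0, 0.0, 0.0],
         [0.0, 0.0, 2.0],
         [1.0, 0.0, 3.0]]

# Build the three CSR arrays: nonzero values, their column indices,
# and row pointers delimiting each row's slice of the value array.
values, col_idx, row_ptr = [], [], [0]
for row in dense:
    for j, v in enumerate(row):
        if v != 0.0:
            values.append(v)
            col_idx.append(j)
    row_ptr.append(len(values))

def csr_matvec(x):
    """y = A @ x using only the CSR arrays (never touching the zeros)."""
    y = []
    for i in range(len(row_ptr) - 1):
        s = 0.0
        for k in range(row_ptr[i], row_ptr[i + 1]):
            s += values[k] * x[col_idx[k]]
        y.append(s)
    return y
```

Since Hamiltonian and overlap matrices in large basis sets are dominated by negligible entries, this kind of format is what lets calculations with tens of thousands of basis functions fit in a node's memory.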
Hanchard, Neil A; Umana, Luis A; D'Alessandro, Lisa; Azamian, Mahshid; Poopola, Mojisola; Morris, Shaine A; Fernbach, Susan; Lalani, Seema R; Towbin, Jeffrey A; Zender, Gloria A; Fitzgerald-Butt, Sara; Garg, Vidu; Bowman, Jessica; Zapata, Gladys; Hernandez, Patricia; Arrington, Cammon B; Furthner, Dieter; Prakash, Siddharth K; Bowles, Neil E; McBride, Kim L; Belmont, John W
2017-08-01
Congenital left-sided cardiac lesions (LSLs) are a significant contributor to the mortality and morbidity of congenital heart disease (CHD). Structural copy number variants (CNVs) have been implicated in LSL without extra-cardiac features; however, non-penetrance and variable expressivity have created uncertainty over the use of CNV analyses in such patients. High-density SNP microarray genotyping data were used to infer large, likely-pathogenic, autosomal CNVs in a cohort of 1,139 probands with LSL and their families. CNVs were molecularly confirmed and the medical records of individual carriers reviewed. The gene content of novel CNVs was then compared with public CNV data from CHD patients. Large CNVs (>1 MB) were observed in 33 probands (∼3%). Six of these were de novo and 14 were not observed in the only available parent sample. Associated cardiac phenotypes spanned a broad spectrum without clear predilection. Candidate CNVs were largely non-recurrent, associated with heterozygous loss of copy number, and overlapped known CHD genomic regions. Novel CNV regions were enriched for cardiac development genes, including seven that have not been previously associated with human CHD. CNV analysis can be a clinically useful and molecularly informative tool in LSLs without obvious extra-cardiac defects, and may identify a clinically relevant genomic disorder in a small but important proportion of these individuals. © 2017 Wiley Periodicals, Inc.
Burke, Holly M.; Moret, Whitney; Field, Samuel; Chen, Mario; Zeng, Yanwu; Seka, Firmin M.
2016-01-01
The objective of this study was to identify and describe levels of household economic vulnerability in HIV-affected communities in Côte d’Ivoire, defined as those with a high prevalence of HIV and large numbers of orphans and vulnerable children. We conducted a cross-sectional survey of 3,749 households in five health regions of Côte d’Ivoire. Using principal component analysis, we attempted to identify sets of correlated vulnerabilities and derive a small number of composite scores to create an index for targeting interventions to vulnerable populations. The 65 vulnerability measures examined did not cluster in ways that would allow for the creation of a small number of composite measures. Instead, we found that households face numerous unique pathways to vulnerability. PMID:27655530
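The PCA step, checking how much variance a leading component captures across correlated vulnerability indicators, can be sketched for two invented indicators using the closed-form eigenvalues of a 2×2 covariance matrix. With only weak correlation, the first component explains little beyond an even split, which is the situation that prevents collapsing many measures into a few composites:

```python
import math, random

random.seed(2)

# Two invented standardized vulnerability indicators per household,
# with deliberately weak correlation (not data from the survey).
n = 500
a = [random.gauss(0, 1) for _ in range(n)]
b = [0.2 * x + random.gauss(0, 1) for x in a]

def cov(u, v):
    mu, mv = sum(u) / n, sum(v) / n
    return sum((x - mu) * (y - mv) for x, y in zip(u, v)) / (n - 1)

caa, cbb, cab = cov(a, a), cov(b, b), cov(a, b)

# Eigenvalues of the 2x2 covariance matrix in closed form.
tr, det = caa + cbb, caa * cbb - cab * cab
disc = math.sqrt(tr * tr / 4 - det)
lam1, lam2 = tr / 2 + disc, tr / 2 - disc

explained = lam1 / (lam1 + lam2)   # share of variance on component 1
```

Had the 65 measures clustered strongly, the leading components would have carried most of the variance; an `explained` value near 0.5 for a pair like this signals the opposite.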
The reliability and stability of visual working memory capacity.
Xu, Z; Adam, K C S; Fang, X; Vogel, E K
2018-04-01
Because of the central role of working memory capacity in cognition, many studies have used short measures of working memory capacity to examine its relationship to other domains. Here, we measured the reliability and stability of visual working memory capacity, measured using a single-probe change detection task. In Experiment 1, the participants (N = 135) completed a large number of trials of a change detection task (540 in total, 180 each of set sizes 4, 6, and 8). With large numbers of both trials and participants, reliability estimates were high (α > .9). We then used an iterative down-sampling procedure to create a look-up table for expected reliability in experiments with small sample sizes. In Experiment 2, the participants (N = 79) completed 31 sessions of single-probe change detection. The first 30 sessions took place over 30 consecutive days, and the last session took place 30 days later. This unprecedented number of sessions allowed us to examine the effects of practice on stability and internal reliability. Even after much practice, individual differences were stable over time (average between-session r = .76).
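The split-half reliability logic behind the down-sampling procedure might be sketched as follows, on simulated per-trial accuracy for subjects with stable but distinct true performance levels. All numbers are invented, and the single-probe change detection task itself is not modeled, only the trial-count/reliability relationship:

```python
import random

random.seed(3)

# Simulated data: each subject has a stable true accuracy; each trial is
# a Bernoulli draw at that accuracy (illustrative stand-in for K estimates).
n_subj, n_trials = 100, 200
true_p = [random.uniform(0.5, 0.95) for _ in range(n_subj)]
trials = [[1 if random.random() < p else 0 for _ in range(n_trials)]
          for p in true_p]

def pearson(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    su = sum((x - mu) ** 2 for x in u) ** 0.5
    sv = sum((y - mv) ** 2 for y in v) ** 0.5
    return sum((x - mu) * (y - mv) for x, y in zip(u, v)) / (su * sv)

def split_half(k):
    """Spearman-Brown-corrected split-half reliability using k trials."""
    odd = [sum(t[:k:2]) for t in trials]    # odd-numbered trials
    even = [sum(t[1:k:2]) for t in trials]  # even-numbered trials
    r = pearson(odd, even)
    return 2 * r / (1 + r)

# Reliability grows with the number of trials retained per subject.
r_small, r_large = split_half(20), split_half(200)
```

Repeating `split_half` over many random subsets at each trial count is exactly the kind of iterative down-sampling that yields a look-up table of expected reliability for small experiments.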
Enhanced cellular transport and drug targeting using dendritic nanostructures
NASA Astrophysics Data System (ADS)
Kannan, R. M.; Kolhe, Parag; Kannan, Sujatha; Lieh-Lai, Mary
2003-03-01
Dendrimers and hyperbranched polymers possess highly branched architectures with a large number of controllable, tailorable peripheral functionalities. Since the surface chemistry of these materials can be modified with relative ease, these materials have tremendous potential in targeted drug delivery. The large density of end groups can be tailored to create enhanced affinity to targeted cells, and these materials can also encapsulate drugs and deliver them in a controlled manner. We are developing tailor-modified dendritic systems for drug delivery. Synthesis, drug/ligand conjugation, in vitro cellular and in vivo drug delivery, and the targeting efficiency to the cell are being studied systematically using a wide variety of experimental tools. Results on PAMAM dendrimers and polyol hyperbranched polymers suggest that: (1) These materials complex/encapsulate a large number of drug molecules and release them at tailorable rates; (2) The drug-dendrimer complex is transported very rapidly through an A549 lung epithelial cancer cell line, compared to free drug, perhaps by endocytosis. The ability of the drug-dendrimer-ligand complexes to target specific asthma and cancer cells is currently being explored using in vitro and in vivo animal models.
Melchardt, Thomas; Hufnagl, Clemens; Weinstock, David M; Kopp, Nadja; Neureiter, Daniel; Tränkenschuh, Wolfgang; Hackl, Hubert; Weiss, Lukas; Rinnerthaler, Gabriel; Hartmann, Tanja N; Greil, Richard; Weigert, Oliver; Egle, Alexander
2016-08-09
Little information is available about the role of certain mutations in clonal evolution and the clinical outcome during relapse in diffuse large B-cell lymphoma (DLBCL). Therefore, we analyzed formalin-fixed, paraffin-embedded tumor samples from first diagnosis and from relapsed or refractory disease from 28 patients using next-generation sequencing of the exons of 104 coding genes. Non-synonymous mutations were present in 74 of the 104 genes tested. Primary tumor samples showed a median of 8 non-synonymous mutations (range: 0-24) with the gene set used. Lower numbers of non-synonymous mutations in the primary tumor were associated with a better median OS compared with higher numbers (28 versus 15 months, p=0.031). We observed three patterns of clonal evolution during relapse of disease: large global change, subclonal selection, and no or minimal change, possibly suggesting preprogrammed resistance. We conclude that targeted re-sequencing is a feasible and informative approach to characterize the molecular pattern of relapse, and it creates novel insights into the role of the dynamics of individual genes.
Multibiodose radiation emergency triage categorization software.
Ainsbury, Elizabeth A; Barnard, Stephen; Barrios, Lleonard; Fattibene, Paola; de Gelder, Virginie; Gregoire, Eric; Lindholm, Carita; Lloyd, David; Nergaard, Inger; Rothkamm, Kai; Romm, Horst; Scherthan, Harry; Thierens, Hubert; Vandevoorde, Charlot; Woda, Clemens; Wojcik, Andrzej
2014-07-01
In this note, the authors describe the MULTIBIODOSE software, which has been created as part of the MULTIBIODOSE project. The software enables doses estimated by networks of laboratories, using up to five retrospective (biological and physical) assays, to be combined to give a single estimate of triage category for each individual potentially exposed to ionizing radiation in a large scale radiation accident or incident. The MULTIBIODOSE software has been created in Java. The usage of the software is based on the MULTIBIODOSE Guidance: the program creates a link to a single SQLite database for each incident, and the database is administered by the lead laboratory. The software has been tested with Java runtime environment 6 and 7 on a number of different Windows, Mac, and Linux systems, using data from a recent intercomparison exercise. The Java program MULTIBIODOSE_1.0.jar is freely available to download from http://www.multibiodose.eu/software or by contacting the software administrator: MULTIBIODOSE-software@gmx.com.
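One plausible way to combine several assay dose estimates into a single triage category is inverse-variance weighting, sketched below. The thresholds, assay values, and category names are illustrative assumptions only, not the actual rules of the MULTIBIODOSE Guidance or software:

```python
def combine(estimates):
    """Inverse-variance weighted mean of (dose_gy, std_gy) assay estimates."""
    wsum = sum(1 / s ** 2 for _, s in estimates)
    return sum(d / s ** 2 for d, s in estimates) / wsum

def triage(dose_gy):
    """Map a combined dose to an illustrative triage category."""
    if dose_gy < 1.0:
        return "low exposure"
    if dose_gy < 2.0:
        return "medium exposure"
    return "high exposure"

# Hypothetical estimates from three assays, e.g. dicentrics, gamma-H2AX, EPR.
assays = [(2.4, 0.5), (2.0, 0.3), (2.6, 0.8)]
dose = combine(assays)
category = triage(dose)
```

Inverse-variance weighting lets the most precise assay dominate the combined estimate, which is the natural behavior when networks of laboratories report results of very different uncertainty.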
Postinflationary Higgs relaxation and the origin of matter-antimatter asymmetry.
Kusenko, Alexander; Pearce, Lauren; Yang, Louis
2015-02-13
The recent measurement of the Higgs boson mass implies a relatively slow rise of the standard model Higgs potential at large scales, and a possible second minimum at even larger scales. Consequently, the Higgs field may develop a large vacuum expectation value during inflation. The relaxation of the Higgs field from its large postinflationary value to the minimum of the effective potential represents an important stage in the evolution of the Universe. During this epoch, the time-dependent Higgs condensate can create an effective chemical potential for the lepton number, leading to a generation of the lepton asymmetry in the presence of some large right-handed Majorana neutrino masses. The electroweak sphalerons redistribute this asymmetry between leptons and baryons. This Higgs relaxation leptogenesis can explain the observed matter-antimatter asymmetry of the Universe even if the standard model is valid up to the scale of inflation, and any new physics is suppressed by that high scale.
Symmetric large momentum transfer for atom interferometry with BECs
NASA Astrophysics Data System (ADS)
Abend, Sven; Gebbe, Martina; Gersemann, Matthias; Rasel, Ernst M.; Quantus Collaboration
2017-04-01
We develop and demonstrate a novel scheme for a symmetric large momentum transfer beam splitter for interferometry with Bose-Einstein condensates. Large momentum transfer beam splitters are a key technique to enhance the scaling factor and sensitivity of an atom interferometer and to create largely delocalized superposition states. To realize the beam splitter, double Bragg diffraction is used to create a superposition of two symmetric momentum states. Afterwards both momentum states are loaded into a retro-reflected optical lattice and accelerated by Bloch oscillations in opposite directions, keeping the initial symmetry. The favorable scaling behavior of this symmetric acceleration allows the transfer of more than 1000 ℏk of total differential splitting in a single acceleration sequence of 6 ms duration, while we still maintain a fraction of approx. 25% of the initial atom number. As proof of the coherence of this beam splitter, contrast in a closed Mach-Zehnder atom interferometer has been observed with up to 208 ℏk of momentum separation, which equals a differential wave-packet velocity of approx. 1.1 m/s for 87Rb. The presented work is supported by the CRC 1128 geo-Q and the DLR with funds provided by the Federal Ministry of Economic Affairs and Energy (BMWi) due to an enactment of the German Bundestag under Grant No. DLR 50WM1552-1557 (QUANTUS-IV-Fallturm).
Design of structurally distinct proteins using strategies inspired by evolution
Jacobs, T. M.; Williams, B.; Williams, T.; ...
2016-05-06
Natural recombination combines pieces of preexisting proteins to create new tertiary structures and functions. In this paper, we describe a computational protocol, called SEWING, which is inspired by this process and builds new proteins from connected or disconnected pieces of existing structures. Helical proteins designed with SEWING contain structural features absent from other de novo designed proteins and, in some cases, remain folded at more than 100°C. High-resolution structures of the designed proteins CA01 and DA05R1 were solved by x-ray crystallography (2.2 angstrom resolution) and nuclear magnetic resonance, respectively, and there was excellent agreement with the design models. Finally, this method provides a new strategy to rapidly create large numbers of diverse and designable protein scaffolds.
Creating databases for biological information: an introduction.
Stein, Lincoln
2013-06-01
The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat files, indexed files, relational databases, and NoSQL databases. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system. Copyright 2013 by John Wiley & Sons, Inc.
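As a concrete example of the transition from files to a database, the sketch below stores mutagenesis-strain records in SQLite via Python's standard sqlite3 module; the table schema, strain names, and records are invented purely for illustration.

```python
import sqlite3

# A minimal relational store for strain records (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE strains (id INTEGER PRIMARY KEY, name TEXT, gene TEXT, phenotype TEXT)"
)
records = [
    (1, "YKO-0001", "ADE2", "adenine auxotroph"),
    (2, "YKO-0002", "URA3", "uracil auxotroph"),
]
conn.executemany("INSERT INTO strains VALUES (?, ?, ?, ?)", records)
conn.commit()

# Indexed, declarative queries are what flat files cannot do efficiently
# once the collection grows.
rows = conn.execute(
    "SELECT name FROM strains WHERE gene = ?", ("ADE2",)
).fetchall()
print(rows)  # [('YKO-0001',)]
```

Once queries start joining across several such tables (strains, genes, phenotype annotations), the relational model pays for its extra setup cost; for simpler lookup-only workloads an indexed file may suffice.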
NASA Astrophysics Data System (ADS)
Ambjørn, J.; Watabiki, Y.
2017-12-01
We recently formulated a model of the universe based on an underlying W3-symmetry. It allows the creation of the universe from nothing and the creation of baby universes and wormholes for spacetimes of dimension 2, 3, 4, 6 and 10. Here we show that the classical large time and large space limit of these universes is one of exponentially fast expansion without the need for a cosmological constant. Under a number of simplifying assumptions, our model predicts that w = ‑1.2 in the case of four-dimensional spacetime. The possibility of obtaining a w-value less than ‑1 is linked to the ability of our model to create baby universes and wormholes.
Polymer Formulations for Cartilage Repair
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutowska, Anna; Jasionowski, Marek; Morris, J. E.
2001-05-15
Regeneration of destroyed articular cartilage can be induced by transplantation of cartilage cells into a defect. The best results are obtained with the use of autologous cells. However, obtaining large amounts of autologous cartilage cells creates the problem of leaving a large cartilage defect at the donor site. Techniques are currently being developed to harvest a small number of cells and propagate them in vitro. This is a challenging task, however, because ordinarily, in cell culture on flat surfaces, chondrocytes do not maintain their in vivo phenotype and irreversibly diminish or cease the synthesis of aggregating proteoglycans. Therefore, research is continuing to develop culture conditions for chondrocytes that preserve their phenotype.
2018-03-26
Off the image to the right is Yuty Crater, located between Simud and Tiu Valles. The crater ejecta forms the large lobes along the right side of this VIS image. This type of ejecta was created by surface flow rather than air fall. It is thought that the near surface materials contained volatiles (like water) which mixed with the ejecta at the time of the impact. Orbit Number: 68736 Latitude: 22.247 Longitude: 325.213 Instrument: VIS Captured: 2017-06-12 17:57 https://photojournal.jpl.nasa.gov/catalog/PIA22303
Geology of the American Southwest
NASA Astrophysics Data System (ADS)
Baldridge, W. Scott
2004-06-01
Scott Baldridge presents a concise guide to the geology of the Southwestern U.S. Two billion years of Earth history are represented in the rocks and landscape of the Southwest U.S., creating natural wonders such as the Grand Canyon, Monument Valley, and Death Valley. This region is considered a geologist's "dream", attracting a large number of undergraduate field classes and amateur geologists. The volume will prove invaluable to students and will also appeal to anyone interested in the geology and landscape of the region's National Parks.
2013-09-01
considered acts of redemption. A necessary part of this is the defensive dehumanization of the victim which will deprive them of their unique value based...fosters their identity, dehumanizes the enemy and creates a “killer” mentality that is capable of murdering large numbers of innocent people. She...alienated religious group can inflict upon its perceived outgroup. She finds religion to be the ideal motivator of people to violence. Dehumanization
A Debugger for Computational Grid Applications
NASA Technical Reports Server (NTRS)
Hood, Robert; Jost, Gabriele
2000-01-01
The p2d2 project at NAS has built a debugger for applications running on heterogeneous computational grids. It employs a client-server architecture to simplify the implementation. Its user interface has been designed to provide process control and state examination functions on a computation containing a large number of processes. It can find processes participating in distributed computations even when those processes were not created under debugger control. These process identification techniques work both on conventional distributed executions as well as those on a computational grid.
Russell Crater Dunes - False Color
2017-07-07
The THEMIS VIS camera contains 5 filters. The data from different filters can be combined in multiple ways to create a false color image. These false color images may reveal subtle variations of the surface not easily identified in a single band image. Today's false color image shows part of the large dune form on the floor of Russell Crater. Orbit Number: 59672 Latitude: -54.337 Longitude: 13.1087 Instrument: VIS Captured: 2015-05-28 02:39 https://photojournal.jpl.nasa.gov/catalog/PIA21701
Yildiz, Ahmet; Ozdemir, Ercan; Gulturk, Sefa; Erdal, Sena
2009-01-01
Creatine (Cr) has been shown to increase total muscle mass. The purpose of this study was to investigate the effect of Cr supplementation on muscle morphology and swimming performance, using an animal model. Each rat was subjected to a 15-minute exercise period daily for 12 weeks. The rats were randomly divided into four groups: no Cr supplementation (CON); no Cr supplementation with incomplete food intake (a diet lacking lysine and methionine) (INCO); Cr supplementation at 1 g·kg-1·day-1 (CREAT-I); and Cr supplementation at 2 g·kg-1·day-1 (CREAT-II). Three months later, the adult rats in all groups exercised in swimming pool chambers. Swimming time was recorded in minutes for each rat. Following the swimming performance period, the animals were killed by cervical dislocation and the gastrocnemius and diaphragm muscles were dissected. Serial sections of 5-7 μm were embedded in paraffin wax, and histochemical staining of cross-sections was carried out with the hematoxylin-eosin technique. All groups gained body weight by the end of the 12 weeks, but there was no statistical difference among them. Swimming times differed significantly between the CREAT-II and CON groups as well as between the CREAT-I and CON groups (p < 0.05). In the INCO group, an increase in connective tissue was observed in the muscle samples. In contrast, in the CREAT-I and CREAT-II groups, the main histological changes were large-scale muscle fibers and hypertrophic muscle cells. These results suggest that long-term creatine supplementation increases the number of muscle fibers and enhances endurance swimming performance in rats. Key points: There is no previous study of the effects of long-term creatine supplementation on muscle morphology and swimming performance in rats. Long-term creatine supplementation increases muscle hypertrophy (but not body weight) and enhances endurance swimming performance in rats.
The quantitative analysis indicated that the number of muscle fibers per defined area increased in creatine supplementation groups. PMID:24149591
Magma ocean formation due to giant impacts
NASA Technical Reports Server (NTRS)
Tonks, W. B.; Melosh, H. J.
1992-01-01
The effect of giant impacts on the initial chemical and thermal states of the terrestrial planets is just now being explored. A large high speed impact creates an approximately hemispherical melt region with a radius that depends on the projectile's radius and impact speed. It is shown that giant impacts on large planets can create large, intact melt regions containing melt volumes up to a few times the volume of the projectile. These large melt regions are not created on asteroid sized bodies. If extruded to the surface, these regions contain enough melt to create a magma ocean of considerable depth, depending on the impact speed, projectile radius, and gravity of the target planet.
Automatic identification of variables in epidemiological datasets using logic regression.
Lorenz, Matthias W; Abdi, Negin Ashtiani; Scheckenbach, Frank; Pflug, Anja; Bülbül, Alpaslan; Catapano, Alberico L; Agewall, Stefan; Ezhov, Marat; Bots, Michiel L; Kiechl, Stefan; Orth, Andreas
2017-04-13
For an individual participant data (IPD) meta-analysis, multiple datasets must be transformed into a consistent format, e.g. using uniform variable names. When large numbers of datasets have to be processed, this can be a time-consuming and error-prone task. Automated or semi-automated identification of variables can help to reduce the workload and improve the data quality. For semi-automation, high sensitivity in the recognition of matching variables is particularly important, because it allows the creation of software which, for a target variable, presents a choice of source variables from which a user can choose the matching one, with only a low risk of having missed a correct source variable. For each variable in a set of target variables, a number of simple rules were manually created. With logic regression, an optimal Boolean combination of these rules was searched for every target variable, using a random subset of a large database of epidemiological and clinical cohort data (construction subset). In a second subset of this database (validation subset), these optimal combination rules were validated. In the construction sample, 41 target variables were allocated on average with a positive predictive value (PPV) of 34% and a negative predictive value (NPV) of 95%. In the validation sample, PPV was 33%, whereas NPV remained at 94%. In the construction sample, PPV was 50% or less for 63% of all variables, and in the validation sample for 71% of all variables. We demonstrated that the application of logic regression in a complex data management task in large epidemiological IPD meta-analyses is feasible. However, the performance of the algorithm is poor, which may require backup strategies.
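The rule-combination idea can be sketched as follows. The keyword rules, the hand-picked Boolean combination (standing in for the one logic regression would learn), and the variable names are all invented for illustration:

```python
# Crude keyword rules over a candidate variable's name, as a stand-in for
# the manually created rules described in the abstract.
def rule_contains_sbp(name):
    return "sbp" in name.lower() or "systol" in name.lower()

def rule_not_free_text(name):
    return not name.lower().endswith("_txt")

def matches_target(name):
    # A hand-picked Boolean combination; logic regression would search the
    # space of such combinations automatically.
    return rule_contains_sbp(name) and rule_not_free_text(name)

# Toy labeled examples: does each source variable match the target
# "systolic blood pressure"?
candidates = ["SBP_mmHg", "systolic_bp", "sbp_note_txt", "heart_rate"]
labels     = [True,        True,          False,          False]

predictions = [matches_target(v) for v in candidates]

tp = sum(p and l for p, l in zip(predictions, labels))
fp = sum(p and not l for p, l in zip(predictions, labels))
tn = sum((not p) and (not l) for p, l in zip(predictions, labels))
fn = sum((not p) and l for p, l in zip(predictions, labels))
ppv = tp / (tp + fp)  # positive predictive value, as reported in the study
npv = tn / (tn + fn)  # negative predictive value
print(ppv, npv)
```

On real cohort data the rules are far noisier, which is why the reported PPV is low while NPV stays high: the combination is tuned to rarely miss a true match.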
Implementation and evaluation of a community-based interprofessional learning activity.
Luebbers, Ellen L; Dolansky, Mary A; Vehovec, Anton; Petty, Gayle
2017-01-01
Implementation of large-scale, meaningful interprofessional learning activities for pre-licensure students has significant barriers and requires novel approaches to ensure success. To accomplish this goal, faculty at Case Western Reserve University, Ohio, USA, used the Ottawa Model of Research Use (OMRU) framework to create, improve, and sustain a community-based interprofessional learning activity for large numbers of medical students (N = 177) and nursing students (N = 154). The model guided the process and included identification of context-specific barriers and facilitators, continual monitoring and improvement using data, and evaluation of student learning outcomes as well as programme outcomes. First year Case Western Reserve University medical students and undergraduate nursing students participated in team-structured prevention screening clinics in the Cleveland Metropolitan Public School District. Identification of barriers and facilitators assisted with overcoming logistic and scheduling issues, large class size, differing ages and skill levels of students and creating sustainability. Continual monitoring led to three distinct phases of improvement and resulted in the creation of an authentic team structure, role clarification, and relevance for students. Evaluation of student learning included both qualitative and quantitative methods, resulting in statistically significant findings and qualitative themes of learner outcomes. The OMRU implementation model provided a useful framework for successful implementation resulting in a sustainable interprofessional learning activity.
NASA Astrophysics Data System (ADS)
Eroglu, Deniz; Marwan, Norbert
2017-04-01
The complex nature of a variety of phenomena in physical, biological, or earth sciences is driven by a large number of degrees of freedom which are strongly interconnected. Although the evolution of such systems is described by multivariate time series (MTS), so far research mostly focuses on analyzing these components one by one. Recurrence based analyses are powerful methods to understand the underlying dynamics of a dynamical system and have been used for many successful applications, including examples from earth science, economics, or chemical reactions. The backbone of these techniques is creating the phase space of the system. However, increasing the dimension of a system requires increasing the length of the time series in order to get significant and reliable results. This requirement is one of the challenges in many disciplines, in particular in palaeoclimate; thus, it is not easy to create a phase space from measured MTS due to the limited number of available observations (samples). To overcome this problem, we suggest creating recurrence networks from each component of the system and combining them into a multiplex network structure, the multiplex recurrence network (MRN). We test the MRN by using prototypical mathematical models and demonstrate its use by studying high-dimensional palaeoclimate dynamics derived from pollen data from Bear Lake (Utah, US). By using the MRN, we can distinguish typical climate transition events, e.g., those between Marine Isotope Stages.
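A minimal sketch of the MRN construction: one recurrence network per component of the multivariate series, all sharing the time points as nodes. The threshold and the toy series are invented, and a real analysis would build each recurrence matrix from embedded phase-space vectors rather than raw scalar values:

```python
# Build one recurrence "layer" per MTS component: node i and node j are
# linked if the component's values at times i and j are closer than eps.
def recurrence_layer(series, eps):
    n = len(series)
    return [[1 if abs(series[i] - series[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

# Toy two-component series (names and values are illustrative only).
mts = {
    "pollen_A": [0.10, 0.12, 0.50, 0.52, 0.11],
    "pollen_B": [1.00, 1.05, 1.02, 0.40, 0.42],
}

# The multiplex recurrence network: same node set (time points), one
# adjacency layer per component.
mrn = {name: recurrence_layer(x, eps=0.05) for name, x in mts.items()}

layer = mrn["pollen_A"]
assert all(layer[i][i] == 1 for i in range(len(layer)))   # self-recurrence
assert layer == [list(row) for row in zip(*layer)]        # symmetric
```

Inter-layer quantities (e.g. edge overlap between layers) then summarize how similarly the components recur, without ever embedding the full high-dimensional system.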
Reiner, Bruce I
2013-02-01
While occupational stress and fatigue have been well described throughout medicine, the radiology community is particularly susceptible due to declining reimbursements, heightened demands for service deliverables, and increasing exam volume and complexity. The resulting occupational stress can be variable in nature and dependent upon a number of intrinsic and extrinsic stressors. Intrinsic stressors largely account for inter-radiologist stress variability and relate to unique attributes of the radiologist such as personality, emotional state, education/training, and experience. Extrinsic stressors may account for intra-radiologist stress variability and include cumulative workload and task complexity. Personalized stress profiles provide a mechanism for accounting for both inter- and intra-radiologist stress variability, which is essential in creating customizable stress intervention strategies. One viable option for real-time occupational stress measurement is voice stress analysis, which can be directly implemented through existing speech recognition technology and has been proven effective in stress measurement and analysis outside of medicine. This technology operates by detecting stress in the acoustic properties of speech through a number of different variables, including duration, glottis source factors, pitch distribution, spectral structure, and intensity. The correlation of these speech-derived stress measures with outcomes data can be used to determine the user-specific inflection point at which stress becomes detrimental to clinical performance.
Simple Patchy-Based Simulators Used to Explore Pondscape Systematic Dynamics
Fang, Wei-Ta; Chou, Jui-Yu; Lu, Shiau-Yun
2014-01-01
Thousands of farm ponds have disappeared from the tableland in Taoyuan County, Taiwan since the 1920s. The number of farm ponds that disappeared is 1,895 (37%), 2,667 ponds remain (52%), and only 537 (11%) new ponds were created within a 757 km2 area in Taoyuan, Taiwan between 1926 and 1960. In this study, a geographic information system (GIS) and a logistic stepwise regression model were used to detect pond-loss rates and to understand the driving forces behind pondscape changes. The logistic stepwise regression model was used to develop a series of relationships between pondscapes affected by intrinsic driving forces (patch size, perimeter, and patch shape) and external driving forces (distance from the edge of the ponds to the edges of roads, rivers, and canals). The authors concluded that the loss of ponds was caused by intrinsic factors, such as pond perimeter; a large perimeter increases the chance of pond loss, but also increases the possibility of creating new ponds. However, a large perimeter is closely associated with circular shapes (a lower value of the mean pond-patch fractal dimension [MPFD]), which characterize the majority of newly created ponds. The method used in this study might be helpful to those seeking to protect this unique landscape by enabling the monitoring of patch-loss problems using simple patchy-based simulators. PMID:24466281
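The shape index mentioned above, the mean pond-patch fractal dimension (MPFD), is commonly defined per patch as FRAC = 2 ln(0.25 P) / ln A, with P the patch perimeter and A its area in consistent units (the FRAGSTATS convention; that the authors used exactly this form is an assumption). A minimal sketch with invented pond dimensions:

```python
import math

# Patch fractal dimension, FRAGSTATS-style: ~1 for compact (circular/square)
# shapes, approaching 2 for highly convoluted perimeters.
def frac(perimeter_m, area_m2):
    return 2.0 * math.log(0.25 * perimeter_m) / math.log(area_m2)

# Two hypothetical 1-hectare ponds: one compact, one crinkly.
ponds = [(400.0, 10000.0), (1200.0, 10000.0)]  # (perimeter, area)
mpfd = sum(frac(p, a) for p, a in ponds) / len(ponds)

# At fixed area, a longer perimeter means a more convoluted shape and a
# larger FRAC value; compact shapes sit near the minimum.
assert frac(1200.0, 10000.0) > frac(400.0, 10000.0)
print(round(mpfd, 3))
```

This is why newly created, nearly circular ponds cluster at low MPFD values in the study.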
A large area cosmic muon detector located at Ohya stone mine
NASA Technical Reports Server (NTRS)
Nii, N.; Mizutani, K.; Aoki, T.; Kitamura, T.; Mitsui, K.; Matsuno, S.; Muraki, Y.; Ohashi, Y.; Okada, A.; Kamiya, Y.
1985-01-01
The chemical composition of the primary cosmic rays between 10^15 eV and 10^18 eV was determined by a Large Area Cosmic Muon Detector located at the Ohya stone mine. The experimental aims of the Ohya project are: (1) a search for ultra-high-energy gamma rays; (2) a search for the GUT monopole created by the Big Bang; and (3) a search for muon bundles. A large number of muon chambers were installed at shallow underground depth near Nikko (approx. 100 km north of Tokyo, in Ohya-town, Utsunomiya-city). At the surface of the mine, very fast 100-channel scintillation counters were installed in order to measure the direction of air showers. These air shower arrays were operated simultaneously with the underground muon chambers.
Molecular transport through large-diameter DNA nanopores
NASA Astrophysics Data System (ADS)
Krishnan, Swati; Ziegler, Daniela; Arnaut, Vera; Martin, Thomas G.; Kapsner, Korbinian; Henneberg, Katharina; Bausch, Andreas R.; Dietz, Hendrik; Simmel, Friedrich C.
2016-09-01
DNA-based nanopores are synthetic biomolecular membrane pores, whose geometry and chemical functionality can be tuned using the tools of DNA nanotechnology, making them promising molecular devices for applications in single-molecule biosensing and synthetic biology. Here we introduce a large DNA membrane channel with an ~4 nm diameter pore, which has stable electrical properties and spontaneously inserts into flat lipid bilayer membranes. Membrane incorporation is facilitated by a large number of hydrophobic functionalizations or, alternatively, streptavidin linkages between biotinylated channels and lipids. The channel displays an Ohmic conductance of ~3 nS, consistent with its size, and allows electrically driven translocation of single-stranded and double-stranded DNA analytes. Using confocal microscopy and a dye influx assay, we demonstrate the spontaneous formation of membrane pores in giant unilamellar vesicles. Pores can be created both in an outside-in and an inside-out configuration.
2017-06-09
DIGITAL GUNNERY: HOW COMBAT VEHICLE GUNNERY TRAINING CREATES A MODEL FOR TRAINING THE MISSION COMMAND SYSTEM A thesis presented...Training Creates a Model for Training the Mission Command System 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S...digital systems that give commanders an unprecedented ability to understand and lead in the battlefields where they operate. Unfortunately, units
Phylogenetically-informed priorities for amphibian conservation.
Isaac, Nick J B; Redding, David W; Meredith, Helen M; Safi, Kamran
2012-01-01
The amphibian decline and extinction crisis demands urgent action to prevent further large-scale species extinctions. Lists of priority species for conservation, based on a combination of species' threat status and unique contribution to phylogenetic diversity, are one tool for the direction and catalyzation of conservation action. We describe the construction of a near-complete species-level phylogeny of 5713 amphibian species, which we use to create a list of evolutionarily distinct and globally endangered species (EDGE list) for the entire class Amphibia. We present sensitivity analyses to test the robustness of our priority list to uncertainty in species' phylogenetic position and threat status. We find that both sources of uncertainty have only minor impacts on our 'top 100' list of priority species, indicating the robustness of the approach. By contrast, our analyses suggest that a large number of Data Deficient species are likely to be high priorities for conservation action from the perspective of their contribution to evolutionary history.
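The prioritization combines threat status with evolutionary distinctness via the EDGE score of Isaac et al. (2007), EDGE = ln(1 + ED) + GE · ln 2, where ED is a species' evolutionary distinctness and GE its IUCN category coded 0 (Least Concern) through 4 (Critically Endangered). The sketch below applies the formula to invented ED values:

```python
import math

# EDGE score: log-transformed evolutionary distinctness (ED, in millions of
# years of unique evolutionary history) plus a threat term that doubles the
# weight for each step up the IUCN categories.
def edge_score(ed, ge):
    return math.log(1.0 + ed) + ge * math.log(2.0)

# Hypothetical species with invented ED values and coded threat categories.
species = {
    "Archey's frog": (70.0, 4),  # high ED, Critically Endangered
    "Common frog":   (5.0, 0),   # low ED, Least Concern
}
ranked = sorted(species, key=lambda s: edge_score(*species[s]), reverse=True)
print(ranked[0])
```

Because ln 2 multiplies GE, each threat-category step doubles the effective (1 + ED) term, so a distinct but Data Deficient species can leap up the list once its true status is known.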
Creation of a Unified Set of Core-Collapse Supernovae for Training of Photometric Classifiers
NASA Astrophysics Data System (ADS)
D'Arcy Kenworthy, William; Scolnic, Daniel; Kessler, Richard
2017-01-01
One of the key tasks for future supernova cosmology analyses is to photometrically distinguish type Ia supernovae (SNe) from their core-collapse (CC) counterparts. Training programs for this purpose requires a large number of core-collapse SNe, yet only a handful are used in current programs. We plan to use the large amount of CC lightcurves available in the Open Supernova Catalog (OSC). Since these data are scraped from many different surveys, they are given in a number of photometric systems with different calibrations and filters. We therefore created a program to fit smooth lightcurves (as a function of time) to photometric observations of arbitrary SNe. The Supercal method is then used to translate the smoothed lightcurves to a single photometric system. We can thus compile a training set of 782 supernovae, of which 127 are not type Ia. These smoothed lightcurves are also being contributed upstream to the OSC as derived data.
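As an illustration of the lightcurve-smoothing step, the sketch below stands in for the actual fitting procedure (which the abstract does not specify) with a simple windowed average of flux over time; the observation times and fluxes are synthetic:

```python
# Smooth a sparsely, irregularly sampled lightcurve: for each observation
# time, average all observations within +/- `window` days. A real pipeline
# would fit a smooth parametric or spline model instead.
def smooth_lightcurve(times, fluxes, window=3.0):
    out = []
    for t0 in times:
        vals = [f for t, f in zip(times, fluxes) if abs(t - t0) <= window]
        out.append(sum(vals) / len(vals))
    return out

# Synthetic observations: an early cluster and a late, fainter cluster.
times  = [0.0, 1.0, 2.0, 10.0, 11.0]
fluxes = [1.0, 3.0, 2.0, 0.5, 0.7]
smoothed = smooth_lightcurve(times, fluxes)
print(smoothed)
```

A smooth curve defined at all epochs is what makes the subsequent cross-survey step possible, since lightcurves from different surveys are never sampled at the same times.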
Statistical mechanics of complex economies
NASA Astrophysics Data System (ADS)
Bardoscia, Marco; Livan, Giacomo; Marsili, Matteo
2017-04-01
In the pursuit of ever increasing efficiency and growth, our economies have evolved to remarkable degrees of complexity, with nested production processes feeding each other in order to create products of greater sophistication from less sophisticated ones, down to raw materials. The engine of such an expansion has been competitive markets that, according to general equilibrium theory (GET), achieve efficient allocations under specific conditions. We study large random economies within the GET framework, as templates of complex economies, and we find that a non-trivial phase transition occurs: the economy freezes in a state where all production processes collapse when either the number of primary goods or the number of available technologies falls below a critical threshold. As in other examples of phase transitions in large random systems, this is an unintended consequence of the growth in complexity. Our findings suggest that the Industrial Revolution can be regarded as a sharp transition between different phases, but also imply that well developed economies can collapse if too many intermediate goods are introduced.
NASA Astrophysics Data System (ADS)
Petrila, S.; Brabie, G.; Chirita, B.
2016-08-01
The analysis of manufacturing flows within industrial enterprises producing hydrostatic components was based on a number of factors that influence the smooth running of production, such as: distance between pieces; waiting time from one operation to another; time to complete setups on CNC machines; and tool changing in the case of a large number of operators and the manufacturing complexity of large files [2]. To optimize the manufacturing flow, the software Tecnomatix was used. This software is a complete portfolio of digital manufacturing solutions produced by Siemens. It supports innovation by linking all production stages of a product, from process design through process simulation and validation to the manufacturing process itself. Among its many capabilities for creating a wide range of simulations, the program offers various demonstrations of the behavior of manufacturing cycles. This program allows the simulation and optimization of production systems and processes in several areas, such as: automotive suppliers; production of industrial equipment; electronics manufacturing; and the design and production of aerospace and defense parts.
Molecular communication and networking: opportunities and challenges.
Nakano, Tadashi; Moore, Michael J; Wei, Fang; Vasilakos, Athanasios V; Shuai, Jianwei
2012-06-01
The ability of engineered biological nanomachines to communicate with biological systems at the molecular level is anticipated to enable future applications such as monitoring the condition of a human body, regenerating biological tissues and organs, and interfacing artificial devices with neural systems. From the viewpoint of communication theory and engineering, molecular communication is proposed as a new paradigm for engineered biological nanomachines to communicate with the natural biological nanomachines which form a biological system. Distinct from the current telecommunication paradigm, molecular communication uses molecules as the carriers of information; sender biological nanomachines encode information on molecules and release the molecules in the environment, the molecules then propagate in the environment to receiver biological nanomachines, and the receiver biological nanomachines biochemically react with the molecules to decode information. Current molecular communication research is limited to small-scale networks of several biological nanomachines. Key challenges to bridge the gap between current research and practical applications include developing robust and scalable techniques to create a functional network from a large number of biological nanomachines. Developing networking mechanisms and communication protocols is anticipated to introduce new avenues into integrating engineered and natural biological nanomachines into a single networked system. In this paper, we present the state-of-the-art in the area of molecular communication by discussing its architecture, features, applications, design, engineering, and physical modeling. We then discuss challenges and opportunities in developing networking mechanisms and communication protocols to create a network from a large number of bio-nanomachines for future applications.
Wood, Paul L
2014-01-01
Metabolomics research has the potential to provide biomarkers for the detection of disease, for subtyping complex disease populations, for monitoring disease progression and therapy, and for defining new molecular targets for therapeutic intervention. These potentials are far from being realized because of a number of technical, conceptual, financial, and bioinformatics issues. Mass spectrometry provides analytical platforms that address the technical barriers to success in metabolomics research; however, the limited commercial availability of analytical and stable isotope standards has created a bottleneck for the absolute quantitation of a number of metabolites. Conceptual and financial factors contribute to the generation of statistically under-powered clinical studies, whereas bioinformatics issues result in the publication of a large number of unidentified metabolites. The path forward in this field involves targeted metabolomics analyses of large control and patient populations to define both the normal range of a defined metabolite and the potential heterogeneity (eg, bimodal) in complex patient populations. This approach requires that metabolomics research groups, in addition to developing a number of analytical platforms, build sufficient chemistry resources to supply the analytical standards required for absolute metabolite quantitation. Examples of metabolomics evaluations of sulfur amino-acid metabolism in psychiatry, neurology, and neuro-oncology and of lipidomics in neurology will be reviewed. PMID:23842599
A calibration method based on virtual large planar target for cameras with large FOV
NASA Astrophysics Data System (ADS)
Yu, Lei; Han, Yangyang; Nie, Hong; Ou, Qiaofeng; Xiong, Bangshu
2018-02-01
In order to obtain high precision in camera calibration, a target should be large enough to cover the whole field of view (FOV). For cameras with large FOV, using a small target will seriously reduce the precision of calibration. However, using a large target causes many difficulties in fabricating, transporting, and deploying it. In order to solve this problem, a calibration method based on the virtual large planar target (VLPT), which is virtually constructed from multiple small targets (STs), is proposed for cameras with large FOV. In the VLPT-based calibration method, first, the positions and directions of the STs are changed several times to obtain a number of calibration images. Secondly, the VLPT of each calibration image is created by finding the virtual points corresponding to the feature points of the STs. Finally, the intrinsic and extrinsic parameters of the camera are calculated by using the VLPTs. Experimental results show that the proposed method not only achieves calibration precision similar to that obtained with a large target, but also has good stability over the whole measurement area. Thus, the difficulties of accurately calibrating cameras with large FOV can be effectively tackled by the proposed method, which also offers good operability.
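The core of the VLPT idea, stacking the feature points of several small targets into one virtual target in a common plane frame, can be sketched as follows. The 3x3 grids and the in-plane poses are invented, and in the real method the relative poses are recovered from the calibration images rather than assumed known:

```python
import math

# Map a small target's local grid points into the common plane frame via a
# known in-plane pose (rotation theta, translation tx, ty).
def to_common_frame(points, theta, tx, ty):
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]

# One small target: a 3x3 grid of feature points with 5 cm pitch.
st_grid = [(x * 0.05, y * 0.05) for x in range(3) for y in range(3)]

# Hypothetical in-plane poses of three small targets (radians, metres).
poses = [(0.0, 0.0, 0.0), (0.0, 1.0, 0.0), (math.pi / 2, 0.0, 1.0)]

vlpt = []
for theta, tx, ty in poses:
    vlpt.extend(to_common_frame(st_grid, theta, tx, ty))

# The stacked points now act as one large planar target spanning the FOV
# and can be fed to a standard planar calibration (e.g. Zhang's method).
print(len(vlpt))  # 27
```

The gain is practical: each small target stays easy to make and move, while the virtual point set provides the wide coverage that calibration precision requires.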
On the analysis of large data sets
NASA Astrophysics Data System (ADS)
Ruch, Gerald T., Jr.
We present a set of tools and techniques for performing detailed comparisons between computational models with high-dimensional parameter spaces and large sets of archival data. By combining a principal component analysis of a large grid of samples from the model with an artificial neural network, we create a powerful data visualization tool as well as a way to robustly recover physical parameters from a large set of experimental data. Our techniques are applied in the context of circumstellar disks, the likely sites of planetary formation. An analysis is performed applying the two-layer approximation of Chiang et al. (2001) and Dullemond et al. (2001) to the archive created by the Spitzer Space Telescope Cores to Disks Legacy program. We find two populations of disk sources. The first population is characterized by the lack of a puffed-up inner rim, while the second population appears to contain an inner rim which casts a shadow across the disk. The first population also exhibits a trend of increasing spectral index, while the second population exhibits a decreasing trend in the strength of the 20 μm silicate emission feature. We also present images of the giant molecular cloud W3 obtained with the Infrared Array Camera (IRAC) and the Multiband Imaging Photometer (MIPS) on board the Spitzer Space Telescope. The images encompass the star-forming regions W3 Main, W3(OH), and a region that we refer to as the Central Cluster, which encloses the emission nebula IC 1795. We present a star count analysis of the point sources detected in W3. The star count analysis shows that the stellar population of the Central Cluster, when compared to that in the background, contains an overdensity of sources. The Central Cluster also contains an excess of sources with colors consistent with Class II Young Stellar Objects (YSOs). An analysis of the color-color diagrams also reveals a large number of Class II YSOs in the Central Cluster.
Our results suggest that an earlier epoch of star formation created the Central Cluster, created a cavity, and triggered the active star formation in the W3 Main and W3(OH) regions. We also detect a new outflow and its candidate exciting star.
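The model-comparison pipeline described above — PCA compression of a model grid plus a learned inverse mapping — can be sketched with synthetic data. Here a one-component PCA is computed by power iteration, and a nearest-neighbour lookup in the reduced space stands in for the artificial neural network; the toy model, parameter grid, and noise level are all invented for illustration.

```python
import random

# Sketch: compress a grid of model "spectra" with a one-component PCA,
# then recover a physical parameter for a noisy observation by
# nearest-neighbour lookup in the reduced space (a stand-in for the
# paper's neural network; all data here are synthetic).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

params = [i / 49 for i in range(50)]          # toy model parameter grid
wave = [i / 199 for i in range(200)]          # toy wavelength axis
grid = [[p * w for w in wave] for p in params]

mean = [sum(col) / len(grid) for col in zip(*grid)]
centered = [[x - m for x, m in zip(row, mean)] for row in grid]

# Power iteration for the leading principal component.
pc = [1.0] * len(wave)
for _ in range(50):
    xv = [dot(row, pc) for row in centered]   # X v
    pc = [sum(xv[i] * centered[i][j] for i in range(len(centered)))
          for j in range(len(wave))]          # X^T (X v)
    norm = sum(x * x for x in pc) ** 0.5
    pc = [x / norm for x in pc]

coeffs = [dot(row, pc) for row in centered]   # grid in reduced space

def recover(observed):
    c = dot([x - m for x, m in zip(observed, mean)], pc)
    i = min(range(len(coeffs)), key=lambda k: abs(coeffs[k] - c))
    return params[i]

rng = random.Random(1)
obs = [0.37 * w + rng.gauss(0, 0.002) for w in wave]
est = recover(obs)
```

With a dense model grid the lookup returns the grid parameter nearest the true value; a trained network would interpolate smoothly between grid points.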
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zarnoch, Stanley J.; Vukovich, Mark A.; Kilgo, John C.
A 14-year study of snag characteristics was established in 41- to 44-year-old loblolly pine (Pinus taeda L.) stands in the southeastern USA. During the initial 5.5 years, no stand manipulation or unusually high mortality events occurred. Afterwards, three treatments were applied, consisting of trees thinned and removed, trees felled and not removed, and artificial creation of snags by girdling and herbicide injection. The thinned treatments were designed to maintain the same live canopy density as the snag-created treatment, disregarding snags that remained standing. We monitored snag height, diameter, density, volume, and bark percentage; the number of cavities was monitored in natural snags only. During the first 5.5 years, recruitment and loss rates were stable, resulting in a stable snag population. Large snags (≥25 cm diameter) were common, but subcanopy small snags (10 to <25 cm diameter) dominated numerically. Large natural snags survived (90% quantile) significantly longer (6.0–9.4 years) than smaller snags (4.4–6.9 years). Large artificial snags persisted the longest (11.8 years). Cavities in natural snags developed within 3 years following tree death. The mean number of cavities per snag was five times greater in large versus small snags, and large snags were more likely to have multiple cavities, emphasizing the importance of mature pine stands for cavity-dependent wildlife species.
Texture for script identification.
Busch, Andrew; Boles, Wageeh W; Sridharan, Sridha
2005-11-01
The problem of determining the script and language of a document image has a number of important applications in the field of document analysis, such as indexing and sorting of large collections of such images, or as a precursor to optical character recognition (OCR). In this paper, we investigate the use of texture as a tool for determining the script of a document image, based on the observation that text has a distinct visual texture. An experimental evaluation of a number of commonly used texture features is conducted on a newly created script database, providing a qualitative measure of which features are most appropriate for this task. Strategies for improving classification results in situations with limited training data and multiple font types are also proposed.
Free space-planning solutions in the architecture of multi-storey buildings
NASA Astrophysics Data System (ADS)
Ibragimov, Alexander; Danilov, Alexander
2018-03-01
Here we consider some aspects of the development of steel frame structure design from the standpoint of the geometry and morphogenesis of load-bearing steel structures in civil engineering. An alternative approach to forming constructive schemes may be the application of curved steel elements in the main load-bearing system; examples include circular and parabolic arches, or segments of varying outline and orientation. The considered approach implies creating large internal volumes without loss in the load-bearing capacity of the frame. The basic concept makes possible a wide variety of layout and design solutions. The presence of free internal spaces of large volume in buildings of the "skyscraper" type contributes to resolving a great number of problems, including those of a communicative nature.
Honeck, Patrick; Michel, Maurice Stephan; Trojan, Lutz; Alken, Peter
2009-02-01
Despite the large number of surgical techniques for continent cutaneous diversion described in the literature, the creation of a reliable, continent, and easily catheterizable continence mechanism remains a complex surgical procedure. The aim of this study was the evaluation of a new method for a catheterizable continence mechanism using stapled pig intestine. Small and large pig intestines were used for construction. A 3 or 6 cm double-row stapling system was used. Three variations using small and large intestine segments were constructed. A 3 or 6 cm long stapler line was placed alongside a 12 Fr catheter positioned at the antimesenterial side, creating a partially two-luminal segment. Construction time for the tube was measured. The created tube was then embedded into the pouch. Pressure evaluation of the continence mechanism was performed for each variation. Intermittent external manual compression was used to simulate sudden pressure exposure. All variations were 100% continent under filling volumes of up to 700 ml and pressure levels of 58 +/- 6 cm H2O for large intestine, and 266 ml and 87 +/- 18 cm H2O for small intestine, respectively. With further filling above the mentioned capacity, suture insufficiency occurred but no tube insufficiency. Construction time for all variations was less than 12 min. The described technique is an easy and fast method to construct a continence mechanism using small or large intestine. Our ex vivo experiments have shown a sufficient continence mechanism. Further investigations in an in vivo model are needed to confirm these results.
NASA Astrophysics Data System (ADS)
Jose, Tony; Narayanan, Vijayakumar
2018-03-01
Radio over fiber (RoF) systems use a large number of base stations (BSs) and a number of central stations (CSs), which are interlinked to form the network. RoF systems use multiple wavelengths for communication between CSs or between CSs and BSs to accommodate the huge amount of data traffic arising from the multiple services offered to a large number of users. When erbium-doped fiber amplifiers (EDFAs) are used in such wavelength-division multiplexed systems, the nonuniform gain spectrum of the EDFAs causes instability in some of the channels while providing faithful amplification to others. To avoid this inconsistency, the gain spectrum of the amplifier needs to be uniform over the whole usable range of wavelengths. A gain contouring technique is proposed to provide uniform gain to all channels irrespective of wavelength. Optical add/drop multiplexers (OADMs) and different lengths of erbium-doped fiber are used to create such a gain contouring mechanism in the optical domain itself. The effect of a cascade of nonuniform gain amplifiers is studied, and the proposed system effectively mitigates the adverse effects caused by nonuniform gain-induced channel instability.
Progress, pitfalls and parallel universes: a history of insect phylogenetics
Simon, Chris; Yavorskaya, Margarita; Beutel, Rolf G.
2016-01-01
The phylogeny of insects has been both extensively studied and vigorously debated for over a century. A relatively accurate deep phylogeny had been produced by 1904. It was not substantially improved in topology until recently, when phylogenomics settled many long-standing controversies. Intervening advances came instead through methodological improvement. Early molecular phylogenetic studies (1985–2005), dominated by a few genes, provided datasets that were too small to resolve controversial phylogenetic problems. Adding to the lack of consensus, this period was characterized by a polarization of philosophies, with individuals belonging to either parsimony or maximum-likelihood camps, each largely ignoring the insights of the other. The result was an unfortunate detour in which the few perceived phylogenetic revolutions published by both sides of the philosophical divide were probably erroneous. The size of datasets has been growing exponentially since the mid-1980s, accompanied by a wave of confidence that all relationships will soon be known. However, large datasets create new challenges, and a large number of genes does not guarantee reliable results. If history is a guide, then the quality of conclusions will be determined by an improved understanding of both molecular and morphological evolution, and not simply the number of genes analysed. PMID:27558853
Controlling Ionic Transport for Device Design in Synthetic Nanopores
NASA Astrophysics Data System (ADS)
Kalman, Eric Boyd
Polymer nanopores present a number of behaviors not seen in microscale systems, such as ion current rectification, ionic selectivity, size exclusion, and potential-dependent ion concentrations in and near the pore. These effects stem from the small size of nanopores relative to the characteristic length scales of surface interactions at the interface between the nanopore surface and the solution within it. The large surface-to-volume ratio arising from the nanoscale geometry of a nanopore, together with the similarity in scale between the geometry and the interaction length, forces the solution to interact with the nanopore walls. As surfaces in solution almost always carry residual charge, these surface forces are primarily the electrostatic interactions between the charged groups on the pore surface and the ions in solution. The experimentalist can exploit these interactions to control ionic transport through synthetic nanopores and to use them as a template for the construction of devices. In this research, we present our work on creating a number of ionic analogs to seminal electronic devices, specifically diodes and transistors, by controlling ionic transport through the electrostatic interactions between a single synthetic nanopore and ions. Control is achieved by "doping" the effective charge carrier concentration in specific regions of the nanopore through manipulation of the pore's surface charge. This manipulation occurs through two mechanisms: chemical modification of the surface charge, and electrostatic manipulation of the local internal nanopore potential using a gate electrode. Additionally, the innate selectivity of the charged nanopore walls allows for the separation of charges in solution. This well-known effect, which gives rise to the measurable streaming potential and streaming current, has been used to create nanoscale water desalination membranes.
We attempt to create a device using membranes with large nanopore densities for the desalination of water, which should theoretically outperform currently available devices, since through our previous work we have developed techniques allowing for transport manipulation not currently accessible in traditional membrane motifs.
Caron-Lormier, Geoffrey; Harvey, Naomi D; England, Gary C W; Asher, Lucy
2016-01-01
In a resource-limited world, organisations attempting to reduce the impact of health or behaviour issues need to choose carefully how to allocate resources for the highest overall impact. However, such choices may not always be obvious. Which has the bigger impact: a large change to a small number of individuals, or a small change to a large number of individuals? The challenge is identifying the issues that have the greatest impact on the population so that potential interventions can be prioritised. We addressed this by developing a score to quantify the impact of health conditions and behaviour problems in a population of working guide dogs, using data from Guide Dogs, UK. The cumulative incidence of different issues was combined with information about their impact, in terms of reduction in working life, to create a work score. The work score was created at population level to illustrate the issues with the greatest impact on the population and to understand the contributions of breeds or crossbreeds to the workforce. An individual work deficit score was also created, and means of this score were used to illustrate the impact on working life within a subgroup of the population, such as a breed or crossbreed generation. The work deficit scores showed that dogs removed for behavioural issues had a greater impact on the overall workforce than those removed for health reasons. Additionally, trends over time illustrated the positive influence of interventions Guide Dogs have made to improve their workforce. The information highlighted by these scores is pertinent to the effort of Guide Dogs to ensure partnerships are lasting. Recognising that the scores developed here could be transferable to a wide variety of contexts and species, most notably human workforce decisions, we discuss possible uses and adaptations such as reduction in lifespan, quality of life, and yield in production animals.
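A minimal sketch of the scoring idea — combining each issue's cumulative incidence with its mean reduction in working life — might look like the following; the issue names and numbers are invented for illustration, not Guide Dogs data.

```python
# Sketch of a population-level work-deficit score: weight each issue's
# cumulative incidence by its mean reduction in working life (years).
# All issue names and values below are hypothetical.

issues = {
    # issue: (cumulative incidence, mean working-life reduction, years)
    "behavioural": (0.10, 4.0),
    "musculoskeletal": (0.06, 2.5),
    "ocular": (0.02, 3.0),
}

def work_deficit(issue_table):
    """Expected years of working life lost per dog, by issue."""
    return {name: inc * loss for name, (inc, loss) in issue_table.items()}

deficits = work_deficit(issues)
ranked = sorted(deficits, key=deficits.get, reverse=True)
```

Ranking by this product, rather than by incidence alone, is what lets a rare but career-ending issue outrank a common but mild one.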
ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wieselquist, William A.; Thompson, Adam B.; Bowman, Stephen M.
2016-04-01
Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
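The per-assembly input generation that the Automator automates can be sketched as a simple templating loop; the template keywords and record fields below are hypothetical placeholders, not actual ORIGAMI syntax.

```python
# Sketch: one rendered input deck per assembly record, the kind of
# scripting/bookkeeping described above. Keywords are invented, not
# real ORIGAMI input syntax.

TEMPLATE = """=origami
title "{asm_id}"
enrichment {enrich}
power_history {history}
end
"""

assemblies = [
    {"asm_id": "A-001", "enrich": 3.2, "history": "cycle1 cycle2"},
    {"asm_id": "A-002", "enrich": 4.1, "history": "cycle2"},
]

def render_inputs(records):
    """Return {assembly id: rendered input deck} for every record."""
    return {r["asm_id"]: TEMPLATE.format(**r) for r in records}

decks = render_inputs(assemblies)
```

Scaling the record list to thousands of assemblies is what makes a GUI-driven automator preferable to hand-maintained scripts.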
Assessing healthcare market trends and capital needs: 1996-2000.
Coile, R C
1995-08-01
An analysis of recent data suggests several significant trends for the next five years, including a continuation of market-based reform, increases in managed care penetration, growth of Medicare and Medicaid health maintenance organizations, and erosion of hospital profits. A common response to these trends is to create integrated delivery systems, which can require significant capital investment. The wisest capital investment strategy may be to avoid asset-based integration in favor of "virtual integration," which emphasizes coordination through patient-management agreements, provider incentives, and information systems, rather than investment in a large number of facilities.
Detection of electromagnetic radiation using micromechanical multiple quantum wells structures
Datskos, Panagiotis G [Knoxville, TN; Rajic, Slobodan [Knoxville, TN; Datskou, Irene [Knoxville, TN
2007-07-17
An apparatus and method for detecting electromagnetic radiation employs a deflectable micromechanical apparatus incorporating multiple quantum-well structures. When photons strike the quantum-well structure, physical stresses are created within the sensor, similar to a "bimetallic effect." The stresses cause the sensor to bend. The extent of deflection of the sensor can be measured through any of a variety of conventional means to provide a measurement of the photons striking the sensor. A large number of such sensors can be arranged in a two-dimensional array to provide imaging capability.
Classifying and Finding Nearby Compact Stellar Systems
NASA Astrophysics Data System (ADS)
Colebaugh, Alexander; Cunningham, Devin; Dixon, Christopher; Romanowsky, Aaron; Striegel, Stephanie
2018-01-01
Compact stellar systems (CSSs) such as compact ellipticals (cEs) and ultracompact dwarfs (UCDs) are relatively rare and poorly understood types of galaxies. To build a more complete picture of these objects, we create search queries using the Sloan Digital Sky Survey, to inventory CSSs in the nearby universe and to explore their properties. We develop an objective set of criteria for classifying cEs, and use these to construct a large, novel catalog of cEs both during and after formation. We also investigate the numbers of cEs and UCDs around nearby giant galaxies.
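A search query of the kind described might be assembled as below; the table name, column names, and cuts are simplified assumptions based on common SDSS photometric fields, not the catalog's actual selection criteria.

```python
# Sketch: building an SDSS-style SQL selection for compact candidates.
# `Galaxy`, `petroRad_r`, and `petroMag_r` are standard SDSS SkyServer
# names, but the cut values here are illustrative only.

def compact_candidate_query(max_radius_arcsec=2.0, max_mag=18.0):
    """Return a SQL string selecting small, bright extended sources."""
    return (
        "SELECT objid, ra, dec, petroRad_r, petroMag_r\n"
        "FROM Galaxy\n"
        f"WHERE petroRad_r < {max_radius_arcsec}\n"
        f"  AND petroMag_r < {max_mag}"
    )

q = compact_candidate_query()
```

In practice such a query would be refined with color and surface-brightness criteria before any object is classified as a cE or UCD candidate.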
Trace elements in Antarctic meteorites: Weathering and genetic information
NASA Technical Reports Server (NTRS)
Lipschutz, M. E.
1986-01-01
Antarctic meteorite discoveries have created great scientific interest due to the large number of specimens recovered (approximately 7000) and because they include representatives of hitherto rare or unknown types. Antarctic meteorites are abundant because they have fallen over long periods and were preserved, transported, and concentrated by the ice sheets. The weathering effects on the Antarctic meteorites are described. Weathering effects on the trace element contents of H5 chondrites were studied in detail, and the results are examined. The properties of Antarctic finds and non-Antarctic falls are discussed.
NASA Astrophysics Data System (ADS)
Baru, Chaitan; Nandigam, Viswanath; Krishnan, Sriram
2010-05-01
Increasingly, the geoscience user community expects modern IT capabilities to be available in service of their research and education activities, including the ability to easily access and process large remote sensing datasets via online portals such as GEON (www.geongrid.org) and OpenTopography (opentopography.org). However, serving such datasets via online data portals presents a number of challenges. In this talk, we will evaluate the pros and cons of alternative storage strategies for management and processing of such datasets using binary large object (BLOB) implementations in database systems versus implementation in Hadoop files using the Hadoop Distributed File System (HDFS). The storage and I/O requirements for providing online access to large datasets dictate the need for declustering data across multiple disks, for capacity as well as bandwidth and response time performance. This requires partitioning larger files into a set of smaller files, and is accompanied by the concomitant requirement for managing large numbers of files. Storing these sub-files as BLOBs in a shared-nothing database implemented across a cluster provides the advantage that all the distributed storage management is done by the DBMS. Furthermore, subsetting and processing routines can be implemented as user-defined functions (UDFs) on these BLOBs and would run in parallel across the set of nodes in the cluster. On the other hand, there are both storage overheads and constraints, and software licensing dependencies created by such an implementation. Another approach is to store the files in an external filesystem with pointers to them from within database tables. The filesystem may be a regular UNIX filesystem, a parallel filesystem, or HDFS. In the HDFS case, HDFS would provide the file management capability, while the subsetting and processing routines would be implemented as Hadoop programs using the MapReduce model. Hadoop and its related software libraries are freely available.
Another consideration is the strategy used for partitioning large data collections, and large datasets within collections, using round-robin, hash, or range partitioning methods. Each has different characteristics in terms of spatial locality of data and the resultant degree of declustering of the computations on the data. Furthermore, we have observed that, in practice, there can be large variations in the frequency of access to different parts of a large data collection and/or dataset, thereby creating "hotspots" in the data. We will evaluate the ability of different approaches to deal effectively with such hotspots, and alternative strategies for mitigating them.
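The three partitioning strategies under comparison can be sketched on a toy set of dataset tiles; the node-assignment functions below are illustrative only, not the portals' actual placement logic.

```python
# Sketch: round-robin vs hash vs range placement of dataset tiles
# across n storage nodes (toy illustration of the trade-offs above).

def round_robin(keys, n):
    """Even spread, no locality: node cycles with arrival order."""
    return {k: i % n for i, k in enumerate(keys)}

def hash_partition(keys, n):
    """Even spread without coordination; destroys spatial locality."""
    return {k: hash(k) % n for k in keys}

def range_partition(keys, boundaries):
    """Preserves locality; boundaries are sorted per-node upper bounds."""
    def node(k):
        for i, b in enumerate(boundaries):
            if k <= b:
                return i
        return len(boundaries)
    return {k: node(k) for k in keys}

tiles = list(range(10))
rr = round_robin(tiles, 3)
rp = range_partition(tiles, [3, 6])
```

Range placement keeps spatially adjacent tiles together, which helps subsetting queries but concentrates load when one region becomes a hotspot; round-robin and hash placement spread a hotspot across nodes at the cost of locality.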
NASA Astrophysics Data System (ADS)
Lieu, Richard
2018-01-01
A hierarchy of statistics of increasing sophistication and accuracy is proposed, to exploit an interesting and fundamental arithmetic structure in the photon bunching noise of incoherent light of large photon occupation number, with the purpose of suppressing the noise and rendering a more reliable and unbiased measurement of the light intensity. The method does not require any new hardware, rather it operates at the software level, with the help of high precision computers, to reprocess the intensity time series of the incident light to create a new series with smaller bunching noise coherence length. The ultimate accuracy improvement of this method of flux measurement is limited by the timing resolution of the detector and the photon occupation number of the beam (the higher the photon number the better the performance). The principal application is accuracy improvement in the bolometric flux measurement of a radio source.
Stochastic algorithm for simulating gas transport coefficients
NASA Astrophysics Data System (ADS)
Rudyak, V. Ya.; Lezhnev, E. V.
2018-02-01
The aim of this paper is to create a molecular algorithm for modeling transport processes in gases that is more efficient than the molecular dynamics method. To this end, the dynamics of the molecules are modeled stochastically. In a rarefied gas, it is sufficient to consider the evolution of molecules only in velocity space, whereas for a dense gas it is necessary to model the dynamics of molecules in physical space as well. Adequate integral characteristics of the studied system are obtained by averaging over a sufficiently large number of independent phase trajectories. The efficiency of the proposed algorithm was demonstrated by modeling the self-diffusion coefficients and the viscosity of several gases. It was shown that accuracy comparable to experiment can be obtained with a relatively small number of molecules. The modeling accuracy increases with the number of molecules and phase trajectories used.
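The trajectory-averaging idea can be illustrated with a toy one-dimensional random walk: a self-diffusion coefficient is estimated from the mean-square displacement averaged over many independent stochastic trajectories. This is a generic illustration of the averaging principle, not the authors' molecular algorithm.

```python
import random

# Sketch: estimate a diffusion coefficient D from the mean-square
# displacement (MSD) of many independent 1-D Gaussian random walks.
# For this walk the exact value is D = sigma**2 / (2 * dt).

def mean_square_displacement(n_traj, n_steps, step_sigma, rng):
    total = 0.0
    for _ in range(n_traj):        # independent phase trajectories
        x = 0.0
        for _ in range(n_steps):
            x += rng.gauss(0.0, step_sigma)
        total += x * x
    return total / n_traj

rng = random.Random(42)
dt, sigma, n_steps = 1.0, 1.0, 100
msd = mean_square_displacement(4000, n_steps, sigma, rng)
D_est = msd / (2.0 * n_steps * dt)   # Einstein relation in 1-D
```

The statistical error of `D_est` shrinks as the number of trajectories grows, mirroring the abstract's point that accuracy increases with the number of molecules and phase trajectories used.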
Filler/ Polycarbosilane Systems as CMC Matrix Precursors
NASA Technical Reports Server (NTRS)
Hurwitz, Frances I.
1998-01-01
Pyrolytic conversion of polymeric precursors to ceramics is accompanied by the loss of volatiles and large volume changes. Infiltration of a low-viscosity polymer into a fiber preform will fill small spaces within fiber tows by capillary forces, but create large matrix cracks within large intertow areas. One approach to minimizing shrinkage and reducing the number of required infiltration cycles is to use particulate fillers. In this study, Starfire allylhydridopolycarbosilane (AHPCS) was blended with a silicon carbide powder, with and without dispersant, using shear mixing. The polymer and polymer/particle interactions were characterized using nuclear magnetic resonance, differential scanning calorimetry, thermogravimetric analysis, and rheometry. Polymer/particulate slurries and suspensions were used to infiltrate a rigidized preform of an eight-ply, five-harness satin CG Nicalon fiber having a dual-layer BN/SiC interface coating, and the resulting composites were characterized by optical and scanning electron microscopy.
Deterministic Generation of All-Photonic Quantum Repeaters from Solid-State Emitters
NASA Astrophysics Data System (ADS)
Buterakos, Donovan; Barnes, Edwin; Economou, Sophia E.
2017-10-01
Quantum repeaters are nodes in a quantum communication network that allow reliable transmission of entanglement over large distances. It was recently shown that highly entangled photons in so-called graph states can be used for all-photonic quantum repeaters, which require substantially fewer resources compared to atomic-memory-based repeaters. However, standard approaches to building multiphoton entangled states through pairwise probabilistic entanglement generation severely limit the size of the state that can be created. Here, we present a protocol for the deterministic generation of large photonic repeater states using quantum emitters such as semiconductor quantum dots and defect centers in solids. We show that arbitrarily large repeater states can be generated using only one emitter coupled to a single qubit, potentially reducing the necessary number of photon sources by many orders of magnitude. Our protocol includes a built-in redundancy, which makes it resilient to photon loss.
NASA Astrophysics Data System (ADS)
Urbánek, Michal; Flajšman, Lukáš; Křižáková, Viola; Gloss, Jonáš; Horký, Michal; Schmid, Michael; Varga, Peter
2018-06-01
Focused ion beam irradiation of metastable Fe78Ni22 thin films grown on Cu(100) substrates is used to create ferromagnetic, body-centered-cubic patterns embedded in a paramagnetic, face-centered-cubic surrounding. The structural and magnetic phase transformation can be controlled by varying the parameters of the transforming gallium ion beam. Focused ion beam parameters such as the ion dose, number of scans, and scanning direction can be used not only to control the degree of transformation but also to change the otherwise four-fold in-plane magnetic anisotropy into a uniaxial anisotropy along a specific crystallographic direction. This change is associated with preferred growth of specific crystallographic domains. The possibility to create magnetic patterns with continuous magnetization transitions, and at the same time to create patterns with periodic changes in magnetic anisotropy, makes this system an ideal candidate for rapid prototyping of a large variety of nanostructured samples. Namely, spin-wave waveguides and magnonic crystals can easily be combined into complex devices in a single fabrication step.
Weber, J Mark; Reeves, Andrew; Cernota, William H; Wesley, Roy K
2017-01-01
Transposon mutagenesis is an invaluable technique in molecular biology for the creation of random mutations that can be easily identified and mapped. However, in the field of microbial strain improvement, transposon mutagenesis has scarcely been used; instead, chemical and physical mutagenic methods have been traditionally favored. Transposons have the advantage of creating single mutations in the genome, making phenotype to genotype assignments less challenging than with traditional mutagens which commonly create multiple mutations in the genome. The site of a transposon mutation can also be readily mapped using DNA sequencing primer sites engineered into the transposon termini. In this chapter an in vitro method for transposon mutagenesis of Saccharopolyspora erythraea is presented. Since in vivo transposon tools are not available for most actinomycetes including S. erythraea, an in vitro method was developed. The in vitro method involves a significant investment in time and effort to create the mutants, but once the mutants are made and screened, a large number of highly relevant mutations of direct interest to erythromycin production can be found.
Newsome, A E
1975-12-01
This paper discusses the interactions between the large and medium-sized marsupials, the introduced ruminant domestic stock, and the environment in the arid zone of Australia. The grazing of sheep and cattle has produced suitable subclimax pastures which today favor two sympatric kangaroos but not the smaller bandicoots and wallabies. Tall grass tussocks used as shelter by the latter have been grazed down by the ruminants, and replaced by "marsupial lawns" or xeric spinifex, depending on locality, thereby improving the food supplies for the plains kangaroo and the hill kangaroo, respectively. It is argued, however, that even these smaller marsupials benefited originally from the new grazing regime. Patchy grazing of the grasslands probably created edge effects and other early seral changes which improved the food supplies while leaving adequate shelter. Continued grazing by increasingly large numbers of sheep and cattle ultimately and critically removed the shelter and, therefore, eliminated the bandicoots and wallabies. There is evidence that the plains kangaroo, though generally abundant at the present time, is vulnerable to competitive displacement by sheep, cattle, rabbits, and, in one region, by the hill kangaroo when it invades the plains. The plains kangaroo with its diet of green herbage is most threatened during droughts because the other herbivores have finer-grained diets. Like the bandicoots and wallabies the plains kangaroo in at least two localities appears to have first increased in numbers and then decreased. Sheep and cattle numbers have generally done the same. It is postulated, therefore, that there may not be two opposing response curves for the large and medium-sized marsupials to the ruminant invasion of the inland plains, but, in the long run, only one: an initial numerical increase and then decline. 
Only the time-scale is different, taking 50 years or more for the plains kangaroo, but perhaps half that time or less for the bandicoots and wallabies. The hill kangaroo may be the ultimate winner because it requires the least nitrogen, and the spinifex it eats during drought has spread as part of the subclimax created by ruminants.
Area law microstate entropy from criticality and spherical symmetry
NASA Astrophysics Data System (ADS)
Dvali, Gia
2018-05-01
It is often assumed that the area law of microstate entropy and holography are intrinsic properties exclusively of gravitational systems, such as black holes. We construct a nongravitational model that exhibits an entropy scaling as the area of a sphere of one dimension less. It is represented by a nonrelativistic bosonic field living on a d-dimensional sphere of radius R and experiencing an angular-momentum-dependent attractive interaction. We show that the system possesses a quantum critical point with emergent gapless modes. Their number is equal to the area of a (d-1)-dimensional sphere of the same radius R. These gapless modes create an exponentially large number of degenerate microstates, with the corresponding microstate entropy given by the area of the same (d-1)-dimensional sphere. Thanks to a double-scaling limit, the counting of the entropy and of the number of gapless modes is made exact. The phenomenon takes place for an arbitrary number of dimensions and can be viewed as a version of holography.
NASA Astrophysics Data System (ADS)
Eversman, Walter
The differences in the radiated acoustic fields of ducted and unducted propellers of the same thrust operating under similar conditions are investigated. An FEM model is created for the generation, propagation, and radiation of steady, rotor alone noise and exit guide vane interaction noise of a ducted fan. For a specified number of blades, angular mode harmonic, and rotor angular velocity, the acoustic field is described in a cylindrical coordinate system reduced to only the axial and radial directions. It is found that, contrary to the usual understanding of the Tyler and Sofrin (1962) result, supersonic tip speed rotor noise can be cut off if the tip Mach number is only slightly in excess of unity and if the number of blades is relatively small. If there are many blades, the fundamental angular mode number is large, and the Tyler and Sofrin result for thin annuli becomes more relevant. Shrouding of subsonic tip speed propellers is a very effective means of controlling rotor alone noise.
An individual-based model for population viability analysis of humpback chub in Grand Canyon
Pine, William; Healy, Brian; Smith, Emily Omana; Trammell, Melissa; Speas, Dave; Valdez, Rich; Yard, Mike; Walters, Carl; Ahrens, Rob; Vanhaverbeke, Randy; Stone, Dennis; Wilson, Wade
2013-01-01
We developed an individual-based population viability analysis model (females only) for evaluating risk to populations from catastrophic events or conservation and research actions. This model tracks attributes (size, weight, viability, etc.) for individual fish through time and then compiles this information to assess the extinction risk of the population across large numbers of simulation trials. Using a case history for the Little Colorado River population of Humpback Chub Gila cypha in Grand Canyon, Arizona, we assessed extinction risk and resiliency to a catastrophic event for this population and then assessed a series of conservation actions related to removing specific numbers of Humpback Chub at different sizes for conservation purposes, such as translocating individuals to establish other spawning populations or hatchery refuge development. Our results suggested that the Little Colorado River population is generally resilient to a single catastrophic event and also to removals of larvae and juveniles for conservation purposes, including translocations to establish new populations. Our results also suggested that translocation success is dependent on similar survival rates in receiving and donor streams and low emigration rates from recipient streams. In addition, translocating either large numbers of larvae or small numbers of large juveniles has generally an equal likelihood of successful population establishment at similar extinction risk levels to the Little Colorado River donor population. Our model created a transparent platform to consider extinction risk to populations from catastrophe or conservation actions and should prove useful to managers assessing these risks for endangered species such as Humpback Chub.
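The trial-based extinction-risk calculation the abstract describes can be sketched as a minimal Monte Carlo loop over individual females. All vital rates, the quasi-extinction threshold, and the trial count below are hypothetical placeholders, not the Humpback Chub estimates from the paper.

```python
import random

def simulate_population(n_females=100, years=30, n_trials=200,
                        survival=0.8, fecundity=0.2,
                        quasi_extinct=50, seed=42):
    """Quasi-extinction risk from a minimal female-only
    individual-based model (illustrative rates only)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        pop = n_females
        for _ in range(years):
            # each female survives the year independently
            survivors = sum(rng.random() < survival for _ in range(pop))
            # each survivor recruits one female offspring with prob. fecundity
            recruits = sum(rng.random() < fecundity for _ in range(survivors))
            pop = survivors + recruits
        if pop < quasi_extinct:
            failures += 1
    return failures / n_trials
```

A real PVA of this kind would track per-individual attributes (size, weight) and condition the rates on them, as the abstract notes; the loop structure is the same.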
Deterministic quantum nonlinear optics with single atoms and virtual photons
NASA Astrophysics Data System (ADS)
Kockum, Anton Frisk; Miranowicz, Adam; Macrì, Vincenzo; Savasta, Salvatore; Nori, Franco
2017-06-01
We show how analogs of a large number of well-known nonlinear-optics phenomena can be realized with one or more two-level atoms coupled to one or more resonator modes. Through higher-order processes, where virtual photons are created and annihilated, an effective deterministic coupling between two states of such a system can be created. In this way, analogs of three-wave mixing, four-wave mixing, higher-harmonic and -subharmonic generation (i.e., up- and down-conversion), multiphoton absorption, parametric amplification, Raman and hyper-Raman scattering, the Kerr effect, and other nonlinear processes can be realized. In contrast to most conventional implementations of nonlinear optics, these analogs can reach unit efficiency, only use a minimal number of photons (they do not require any strong external drive), and do not require more than two atomic levels. The strength of the effective coupling in our proposed setups becomes weaker the more intermediate transition steps are needed. However, given the recent experimental progress in ultrastrong light-matter coupling and improvement of coherence times for engineered quantum systems, especially in the field of circuit quantum electrodynamics, we estimate that many of these nonlinear-optics analogs can be realized with currently available technology.
Creating an outpatient center of excellence in CT.
Itri, Jason N; Bakow, Eric; Woods, Jordan
2014-12-01
CT examinations represent a substantial portion of the workload for many radiology departments, and optimizing service delivery is a critical function to ensure customer satisfaction. This article describes how the Six Sigma methodology was used in the radiology department at a large academic hospital to improve the patient experience and increase CT capacity while reducing waste and improving staff satisfaction. The 5 distinct phases of Six Sigma are reviewed as they apply to our CT Center of Excellence project: define, measure, analyze, improve, and control. Process metrics used in this project include the percentage of outpatient CT exams started within 5 minutes of the scheduled appointment time, and the number of studies with protocols selected >48 hours before the CT exam is performed. Outcome metrics include monthly department expense per scan and CT Press Ganey "standard test and treatment" mean scores. An approach to developing interventions is described based on identifying critical sources of variation, ranking these by creating risk prioritization numbers, performing root cause analysis, and utilizing the failure mode and effects analysis tool to prioritize possible solutions. Finally, the key features of action plans and a control plan are reviewed. Copyright © 2014 American College of Radiology. Published by Elsevier Inc. All rights reserved.
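The risk prioritization numbers mentioned in the abstract follow the standard FMEA formula (severity × occurrence × detection, each rated 1-10). A minimal sketch of ranking failure modes this way; the failure modes and ratings below are illustrative, not taken from the project.

```python
def risk_priority_number(severity, occurrence, detection):
    """Standard FMEA risk priority number; each factor is rated 1-10."""
    return severity * occurrence * detection

# Hypothetical failure modes with (severity, occurrence, detection) ratings.
failure_modes = {
    "protocol not selected 48h ahead": (6, 8, 4),   # RPN = 192
    "patient arrives late": (4, 7, 2),              # RPN = 56
}

# Highest-RPN failure modes are addressed first.
ranked = sorted(failure_modes.items(),
                key=lambda kv: risk_priority_number(*kv[1]),
                reverse=True)
```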
Moretti, Rocco; Lyskov, Sergey; Das, Rhiju; Meiler, Jens; Gray, Jeffrey J
2018-01-01
The Rosetta molecular modeling software package provides a large number of experimentally validated tools for modeling and designing proteins, nucleic acids, and other biopolymers, with new protocols being added continually. While freely available to academic users, external usage is limited by the need for expertise in the Unix command line environment. To make Rosetta protocols available to a wider audience, we previously created a web server called Rosetta Online Server that Includes Everyone (ROSIE), which provides a common environment for hosting web-accessible Rosetta protocols. Here we describe a simplification of the ROSIE protocol specification format, one that permits easier implementation of Rosetta protocols. Whereas the previous format required creating multiple separate files in different locations, the new format allows specification of the protocol in a single file. This new, simplified protocol specification has more than doubled the number of Rosetta protocols available under ROSIE. These new applications include pKa determination, lipid accessibility calculation, ribonucleic acid redesign, protein-protein docking, protein-small molecule docking, symmetric docking, antibody docking, cyclic toxin docking, critical binding peptide determination, and mapping small molecule binding sites. ROSIE is freely available to academic users at http://rosie.rosettacommons.org. © 2017 The Protein Society.
Flexible conformable hydrophobized surfaces for turbulent flow drag reduction
NASA Astrophysics Data System (ADS)
Brennan, Joseph C.; Geraldi, Nicasio R.; Morris, Robert H.; Fairhurst, David J.; McHale, Glen; Newton, Michael I.
2015-05-01
In recent years extensive work has been focused onto using superhydrophobic surfaces for drag reduction applications. Superhydrophobic surfaces retain a gas layer, called a plastron, when submerged underwater in the Cassie-Baxter state with water in contact with the tops of surface roughness features. In this state the plastron allows slip to occur across the surface which results in a drag reduction. In this work we report flexible and relatively large area superhydrophobic surfaces produced using two different methods: Large roughness features were created by electrodeposition on copper meshes; Small roughness features were created by embedding carbon nanoparticles (soot) into Polydimethylsiloxane (PDMS). Both samples were made into cylinders with a diameter under 12 mm. To characterize the samples, scanning electron microscope (SEM) images and confocal microscope images were taken. The confocal microscope images were taken with each sample submerged in water to show the extent of the plastron. The hydrophobized electrodeposited copper mesh cylinders showed drag reductions of up to 32% when comparing the superhydrophobic state with a wetted out state. The soot covered cylinders achieved a 30% drag reduction when comparing the superhydrophobic state to a plain cylinder. These results were obtained for turbulent flows with Reynolds numbers 10,000 to 32,500.
Modeling and visualizing borehole information on virtual globes using KML
NASA Astrophysics Data System (ADS)
Zhu, Liang-feng; Wang, Xi-feng; Zhang, Bing
2014-01-01
Advances in virtual globes and Keyhole Markup Language (KML) are providing Earth scientists with universal platforms to manage, visualize, integrate and disseminate geospatial information. In order to use KML to represent and disseminate subsurface geological information on virtual globes, we present an automatic method for modeling and visualizing a large volume of borehole information. Based on a standard form of borehole database, the method first creates a variety of borehole models with different levels of detail (LODs), including point placemarks representing drilling locations, scatter dots representing contacts and tube models representing strata. Subsequently, the level-of-detail-based (LOD-based) multi-scale representation is constructed to enhance the efficiency of visualizing large numbers of boreholes. Finally, the modeling result can be loaded into a virtual globe application for 3D visualization. An implementation program, termed Borehole2KML, is developed to automatically convert borehole data into KML documents. A case study of using Borehole2KML to create borehole models in Shanghai shows that the modeling method is applicable to visualize, integrate and disseminate borehole information on the Internet. The method we have developed has potential use in delivering geological information as a public service.
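The simplest conversion step Borehole2KML performs, one point Placemark per drilling location, can be sketched as below. The record fields (`id`, `lon`, `lat`, `depth`) are assumed for illustration and are not the paper's database schema; KML expects coordinates in longitude,latitude,altitude order.

```python
def boreholes_to_kml(boreholes):
    """Emit a minimal KML document with one point Placemark per
    borehole record. Field names are illustrative assumptions."""
    placemarks = []
    for b in boreholes:
        placemarks.append(
            "  <Placemark>\n"
            f"    <name>{b['id']}</name>\n"
            f"    <description>depth: {b['depth']} m</description>\n"
            f"    <Point><coordinates>{b['lon']},{b['lat']},0</coordinates></Point>\n"
            "  </Placemark>"
        )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
            + "\n".join(placemarks)
            + "\n</Document>\n</kml>")
```

The paper's tube and scatter-dot models would use KML geometry elements beyond simple points, but the document skeleton is the same.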
Grandgeorge, Paul; Antkowiak, Arnaud; Neukirch, Sébastien
2017-09-18
A flexible fiber carrying a liquid drop may coil inside the drop thereby creating a drop-on-fiber system with an ultra-extensible behavior. During compression, the excess fiber is spooled inside the droplet and capillary forces keep the system taut. During subsequent elongation, the fiber is gradually released and if a large number of spools is uncoiled a high stretchability is achieved. This mechanical behaviour is of interest for stretchable connectors but information, may it be electronic or photonic, usually travels through stiff functional materials. These high Young's moduli, leading to large bending rigidity, prevent in-drop coiling. Here we overcome this limitation by attaching a beam of soft elastomer to the functional fiber, thereby creating a composite system which exhibits in-drop coiling and carries information while being ultra-extensible. We present a simple model to explain the underlying mechanics of the addition of the soft beam and we show how it favors in-drop coiling. We illustrate the method with a two-centimeter long micronic PEDOT:PSS conductive fiber joined to a PVS soft beam, showing that the system conveys electricity throughout a 1900% elongation. Copyright © 2017 Elsevier B.V. All rights reserved.
X-ray techniques for innovation in industry
Lawniczak-Jablonska, Krystyna; Cutler, Jeffrey
2014-01-01
The smart specialization declared in the European program Horizon 2020, and the increasing cooperation between research and development found in companies and researchers at universities and research institutions have created a new paradigm where many calls for proposals require participation and funding from public and private entities. This has created a unique opportunity for large-scale facilities, such as synchrotron research laboratories, to participate in and support applied research programs. Scientific staff at synchrotron facilities have developed many advanced tools that make optimal use of the characteristics of the light generated by the storage ring. These tools have been exceptionally valuable for materials characterization including X-ray absorption spectroscopy, diffraction, tomography and scattering, and have been key in solving many research and development issues. Progress in optics and detectors, as well as a large effort put into the improvement of data analysis codes, have resulted in the development of reliable and reproducible procedures for materials characterization. Research with photons has contributed to the development of a wide variety of products such as plastics, cosmetics, chemicals, building materials, packaging materials and pharma. In this review, a few examples are highlighted of successful cooperation leading to solutions of a variety of industrial technological problems which have been exploited by industry including lessons learned from the Science Link project, supported by the European Commission, as a new approach to increase the number of commercial users at large-scale research infrastructures. PMID:25485139
Effects of acidic precipitation on waterbirds in Maine
Longcore, J.R.; McAuley, D.G.; Stromborg, K.L.; Hensler, G.L.
1985-01-01
During 1982-84 waterbird use and numbers of waterbird broods were recorded for 29 wetlands on two study areas (25 and 77 km2) in east-central Maine underlain with bedrock having low acid-neutralizing capacity (ANC). Twenty-nine wetlands over bedrock with high ANC (Class 3) and 31 wetlands over bedrock of low ANC (Class 1) were evaluated as predictors of wetland pH and alkalinity. Using the alkalinity value of 25, waterbird use was greater (P < .0001) for downstream (84%) versus headwater (16%) wetlands during 1982-84. Avian use was similar when wetlands were classified either as beaver-created or glacial in origin. Headwater wetlands, which are most vulnerable to acidification within the low-ANC areas, are used mostly by common goldeneye (Bucephala clangula) and common loon (Gavia immer). Common merganser (Mergus merganser), spotted sandpiper (Actitis macularia), and chimney swift (Chaetura pelagica) were associated with headwater wetlands about equally. The majority of species (16), including dabbling ducks, used almost exclusively wetlands classified as downstream or beaver-created. For all years, 87% of the 246 broods observed were on wetlands classified as either downstream or beaver-created. Our data suggest that avian use of wetlands is influenced more by the morphometric and vegetative characteristics of the wetland basin than by the wetland water chemistry. Nevertheless, large numbers of a variety of avian species are associated with wetlands underlain with bedrock that has little or no capacity to neutralize acidic depositions.
2017-01-01
Background: Digital health social networks (DHSNs) are widespread, and the consensus is that they contribute to wellness by offering social support and knowledge sharing. The success of a DHSN is based on the number of participants and their consistent creation of externalities through the generation of new content. To promote network growth, it would be helpful to identify characteristics of superusers or actors who create value by generating positive network externalities.
Objective: The aim of the study was to investigate the feasibility of developing predictive models that identify potential superusers in real time. This study examined associations between posting behavior, 4 demographic variables, and 20 indication-specific variables.
Methods: Data were extracted from the custom structured query language (SQL) databases of 4 digital health behavior change interventions with DHSNs. Of these, 2 were designed to assist in the treatment of addictions (problem drinking and smoking cessation), and 2 for mental health (depressive disorder, panic disorder). To analyze posting behavior, 10 models were developed, and negative binomial regressions were conducted to examine associations between number of posts, and demographic and indication-specific variables.
Results: The DHSNs varied in number of days active (3658-5210), number of registrants (5049-52,396), number of actors (1085-8452), and number of posts (16,231-521,997). In the sample, all 10 models had low R2 values (.013-.086) with limited statistically significant demographic and indication-specific variables.
Conclusions: Very few variables were associated with social network engagement. Although some variables were statistically significant, they did not appear to be practically significant.
Based on the large number of study participants, variation in DHSN theme, and extensive time-period, we did not find strong evidence that demographic characteristics or indication severity sufficiently explain the variability in number of posts per actor. Researchers should investigate alternative models that identify superusers or other individuals who create social network externalities. PMID:28213340
Visual thinking in action: visualizations as used on whiteboards.
Walny, Jagoda; Carpendale, Sheelagh; Riche, Nathalie Henry; Venolia, Gina; Fawcett, Philip
2011-12-01
While it is still most common for information visualization researchers to develop new visualizations from a data- or task-driven perspective, there is growing interest in understanding the types of visualizations people create by themselves for personal use. As part of this recent direction, we have studied a large collection of whiteboards in a research institution, where people make active use of combinations of words, diagrams and various types of visuals to help them further their thought processes. Our goal is to arrive at a better understanding of the nature of visuals that are created spontaneously during brainstorming, thinking, communicating, and general problem solving on whiteboards. We use the qualitative approaches of open coding, interviewing, and affinity diagramming to explore the use of recognizable and novel visuals, and the interplay between visualization and diagrammatic elements with words, numbers and labels. We discuss the potential implications of our findings on information visualization design. © 2011 IEEE
An Autograding (Student) Problem Management System for the Compeuwtir Ilittur8
NASA Technical Reports Server (NTRS)
Kohne, Glenn S.
1996-01-01
In order to develop analysis skills necessary in engineering disciplines, students need practice solving problems using specified analytical techniques. Unless homework is collected and graded, students tend not to spend much time or effort in performing it. Teachers do not, realistically, have the time to grade large numbers of homework problems on a regular basis. This paper presents and makes available a miracle cure. The Autograding Problem Management System (APMS) provides a discipline-independent mechanism for teachers to create (quickly and easily) sets of homework problems. The APMS system provides CRT and/or printed summaries of the graded student responses. This presentation will demonstrate both the speed and the drag-and-drop simplicity of using the APMS to create self-grading homework problem sets comprised of traditional types of problems and of problems which would not be possible without the use of computers.
NASA Astrophysics Data System (ADS)
Bruno, Luigi; Decuzzi, Paolo; Gentile, Francesco
2016-01-01
The promise of nanotechnology lies in the possibility of engineering matter on the nanoscale and creating technological interfaces that, because of their small scales, may directly interact with biological objects, creating new strategies for the treatment of pathologies that are otherwise beyond the reach of conventional medicine. Nanotechnology is inherently a multiscale, multiphenomena challenge. Fundamental understanding and highly accurate predictive methods are critical to successful manufacturing of nanostructured materials, bio/mechanical devices and systems. In biomedical engineering, and in the mechanical analysis of biological tissues, classical continuum approaches are routinely utilized, even though they disregard the discrete nature of tissues, which are an interpenetrating network of a matrix (the extracellular matrix, ECM) and a generally large but finite number of cells with a size falling in the micrometer range. Here, we introduce a nano-mechanical theory that accounts for the non-continuum nature of biological systems and other discrete systems. This discrete field theory, doublet mechanics (DM), is a technique to model the mechanical behavior of materials over multiple scales, ranging from some millimeters down to a few nanometers. In the paper, we use this theory to predict the response of a granular material to an external applied load. Such a representation is extremely attractive in modeling biological tissues, which may be considered as a spatial set of a large number of particulates (cells) dispersed in an extracellular matrix. Perhaps more importantly, using digital image correlation (DIC) optical methods, we provide an experimental verification of the model.
Slovenia’s Construction Act and Implementation Plans: A Case Study of Izola IPA-8
NASA Astrophysics Data System (ADS)
Ažman Momirski, Lucija
2017-10-01
The guidelines for urban design in Izola’s IPA-8 planning area, which is earmarked for hotels, apartment complexes, and sports, specify diverse forms of leisure living space required by modern society. The new tourist complex is not a large monotonous hotel complex, but rather a spatial arrangement in which guests experience an authentic local environment and city residents enjoy the new high-quality ambience. The hotel area is defined by three major communication axes from north to south, linking the countryside to the coastal area and opening up attractive sea views in the new complex. Internal east-west links connect buildings and public spaces. Because of the terraced terrain, a large number of paved ramps and internal public gardens have been designed between the structures. The extensions of the communication axes are laid out as squares, named based on the function of the public spaces. Hotel Street is the central axis and main connecting street, with public hotel services and restaurants. The west axis extends into Culture Square, where activities related to Izola’s culture and history are presented; here there is an opportunity to create new galleries, a small local museum, and an exhibition room. Apartment Square is located on the east communication axis, along which only a limited number of trade, catering, and service activities are planned. The plan received first prize in a public competition, and it later developed into a detailed municipal spatial plan. In this process, it became clear that Slovenia’s Construction Act (ZGO-1) does not support plans to create terraced buildings.
Kale, Nimish; Lee, Jaeseong; Lotfian, Reza; Jafari, Roozbeh
2012-10-01
Daily living activity monitoring is important for early detection of the onset of many diseases and for improving quality of life, especially in the elderly. A wireless wearable network of inertial sensor nodes can be used to observe daily motions. The continuous stream of data generated by these sensor networks can be used to recognize the movements of interest. Dynamic Time Warping (DTW) is a widely used signal processing method for time-series pattern matching because of its robustness to variations in time and speed as opposed to other template matching methods. Despite this flexibility, for the application of activity recognition, DTW can only find the similarity between the template of a movement and the incoming samples when the location and orientation of the sensor remain unchanged. Due to this restriction, small sensor misplacements can lead to a decrease in the classification accuracy. In this work, we adopt DTW distance as a feature for real-time detection of human daily activities such as sit-to-stand in the presence of sensor misplacement. To measure the performance of DTW, we need to create a large number of sensor configurations while the sensors are rotated or misplaced. Creating a large number of closely spaced sensors is impractical. To address this problem, we use a marker-based optical motion capture system and generate simulated inertial sensor data for different locations and orientations on the body. We study the performance of DTW under these conditions to determine the worst-case sensor location variations that the algorithm can accommodate.
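The DTW distance used as a feature here is the standard dynamic-programming recurrence; a minimal sketch for 1-D sequences with absolute difference as the local cost (the paper's multi-axis inertial signals would use a vector norm instead):

```python
def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) dynamic time warping distance
    between two 1-D sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    # d[i][j] = cost of the best warping path aligning a[:i] with b[:j]
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of: insertion, deletion, match
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]
```

Because the warping path may stretch or repeat samples, a template matches a slowed-down repetition of itself at zero cost, which is exactly the robustness to speed variation the abstract relies on.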
Plasma Gradient Piston: a new approach to precision pulse shaping
NASA Astrophysics Data System (ADS)
Prisbrey, Shon T.
2011-10-01
We have successfully developed a method to create shaped pressure drives from large shocks that can be applied to a wide variety of experimental platforms. The method consists of transforming a large shock or blast wave into a ramped pressure drive by utilizing a graded density reservoir that unloads across a gap and stagnates against the sample being studied. The utilization of a graded density reservoir, different materials, and a gap transforms the energy in the initial large shock into a quasi-isentropic ramped compression. Control of the ramp history is via the size of the initial shock, the chosen reservoir materials, their densities, the thickness of each density layer, and the gap size. There are two keys to utilizing this approach to create ramped drives: the ability to produce a large shock, and making the layered density reservoir. A number of facilities can produce the strong initial shock (Z, Omega, NIF, Phoenix, high explosives, NIKE, LMJ, pulsed power,...). We have demonstrated ramped drives from 0.5 to 1.5 Mbar utilizing a large shock created at the Omega laser facility. We recently concluded a pair of NIF drive shots where we successfully converted a hohlraum-generated shock into a stepped, ramped pressure drive with a peak pressure of ~4-5 Mbar in a Ta sample. We will explain the basic concepts needed for producing a ramped pressure drive, compare experimental data with simulations from Omega (Pmax ~ 1 Mbar) and NIF (Pmax ~ 5-10 Mbar), and present designs for ramped, staged-shock designs up to Pmax ~ 30 Mbar. The approach that we have developed enables precision pulse shaping of the drive (applied pressure vs. time) via target characteristics, as opposed to tailoring laser power vs. time or Z-pinch facility current vs. time. This enables ramped, quasi-isentropic materials studies to be performed on a wide variety of HED facilities. This work performed under the auspices of the U.S.
Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-490532.
Using Mosix for Wide-Area Computational Resources
Maddox, Brian G.
2004-01-01
One of the problems with using traditional Beowulf-type distributed processing clusters is that they require an investment in dedicated computer resources. These resources are usually needed in addition to pre-existing ones such as desktop computers and file servers. Mosix is a series of modifications to the Linux kernel that creates a virtual computer, featuring automatic load balancing by migrating processes from heavily loaded nodes to less used ones. An extension of the Beowulf concept is to run a Mosix-enabled Linux kernel on a large number of computer resources in an organization. This configuration would provide a very large amount of computational resources based on pre-existing equipment. The advantage of this method is that it provides much more processing power than a traditional Beowulf cluster without the added costs of dedicating resources.
Robotic surgery - advance or gimmick?
De Wilde, Rudy L; Herrmann, Anja
2013-06-01
Robotic surgery is increasingly implemented as a minimally invasive approach to a variety of gynaecological procedures. The use of conventional laparoscopy by a broad range of surgeons, especially in complex procedures, is hampered by several drawbacks. Robotic surgery was created with the aim of overcoming some of the limitations. Although robotic surgery has many advantages, it is also associated with clear disadvantages. At present, the proof of superiority over access by laparotomy or laparoscopy through large randomised controlled trials is still lacking. Until results of such trials are present, a firm conclusion about the usefulness of robotic surgery cannot be drawn. Robotic surgery is promising, making the advantages of minimally invasive surgery potentially available to a large number of surgeons and patients in the future. Copyright © 2013 Elsevier Ltd. All rights reserved.
Beckers, Jacques M; Andersen, Torben E; Owner-Petersen, Mette
2007-03-05
Under seeing limited conditions very high resolution spectroscopy becomes very difficult for extremely large telescopes (ELTs). Using adaptive optics (AO) the stellar image size decreases proportional with the telescope diameter. This makes the spectrograph optics and hence its resolution independent of the telescope diameter. However AO for use with ELTs at visible wavelengths require deformable mirrors with many elements. Those are not likely to be available for quite some time. We propose to use the pupil slicing technique to create a number of sub-pupils each of which having its own deformable mirror. The images from all sub-pupils are combined incoherently with a diameter corresponding to the diffraction limit of the sub-pupil. The technique is referred to as "Pupil Slicing Adaptive Optics" or PSAO.
Changes in size of deforested patches in the Brazilian Amazon.
Rosa, Isabel M D; Souza, Carlos; Ewers, Robert M
2012-10-01
Different deforestation agents, such as small farmers and large agricultural businesses, create different spatial patterns of deforestation. We analyzed the proportion of deforestation associated with different-sized clearings in the Brazilian Amazon from 2002 through 2009. We used annual deforestation maps to determine total area deforested and the size distribution of deforested patches per year. The size distribution of deforested areas changed over time in a consistent, directional manner. Large clearings (>1000 ha) comprised progressively smaller amounts of total annual deforestation. The number of smaller clearings (6.25-50.00 ha) remained unchanged over time. Small clearings accounted for 73% of all deforestation in 2009, up from 30% in 2002, whereas the proportion of deforestation attributable to large clearings decreased from 13% to 3% between 2002 and 2009. Large clearings were concentrated in Mato Grosso, but also occurred in eastern Pará and in Rondônia. In 2002 large clearings accounted for 17%, 15%, and 10% of all deforestation in Mato Grosso, Pará, and Rondônia, respectively. Even in these states, where there is a highly developed agricultural business dominated by soybean production and cattle ranching, the proportional contribution of large clearings to total deforestation declined. By 2009 large clearings accounted for 2.5%, 3.5%, and 1% of all deforestation in Mato Grosso, Pará, and Rondônia, respectively. These changes in deforestation patch size are coincident with the implementation of new conservation policies by the Brazilian government, which suggests that these policies are not effectively reducing the number of small clearings in primary forest, whether these are caused by large landholders or smallholders, but have been more effective at reducing the frequency of larger clearings. ©2012 Society for Conservation Biology.
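The per-class proportions the abstract reports (small clearings of 6.25-50 ha, large clearings of >1000 ha) can be computed from a list of patch areas; the function name and argument layout below are illustrative, not the authors' code.

```python
def deforestation_share_by_class(patch_areas_ha,
                                 small=(6.25, 50.0), large=1000.0):
    """Fraction of total deforested area contributed by small
    (6.25-50 ha) and large (>1000 ha) clearings, using the size
    classes quoted in the abstract."""
    total = sum(patch_areas_ha)
    small_area = sum(a for a in patch_areas_ha if small[0] <= a <= small[1])
    large_area = sum(a for a in patch_areas_ha if a > large)
    return small_area / total, large_area / total
```

Running this per year over the annual deforestation maps yields the time series of class shares whose trends the study analyzes.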
Visual Analysis of Cloud Computing Performance Using Behavioral Lines.
Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu
2016-02-29
Cloud computing is an essential technology for Big Data analytics and services. A cloud computing system is often comprised of a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues. But profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visual analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear, suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visual analysis approach is effective in identifying trends and anomalies of the systems.
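The node-similarity idea at the core of this kind of analysis can be illustrated with a minimal sketch: score each compute node's multivariate time series by its mean distance to all other nodes, so that nodes behaving unlike the rest stand out. This is synthetic data and a plain Euclidean distance, not the paper's actual similarity measure or layout method.

```python
import numpy as np

# Synthetic profile data: n_nodes compute nodes, each sampled over n_steps
# time steps for n_metrics properties (e.g. CPU, memory, net in, net out).
rng = np.random.default_rng(0)
n_nodes, n_steps, n_metrics = 10, 50, 4
series = rng.normal(0.5, 0.05, (n_nodes, n_steps, n_metrics))
series[3] += 0.4                               # make node 3 anomalous

# Pairwise Euclidean distance between flattened node time series,
# then an anomaly score = mean distance to every other node.
flat = series.reshape(n_nodes, -1)
dists = np.linalg.norm(flat[:, None, :] - flat[None, :, :], axis=2)
anomaly_score = dists.mean(axis=1)
print(int(anomaly_score.argmax()))             # the anomalous node stands out
```

A real system would feed such scores into the behavioral-line layout rather than just reporting an argmax, but the distance computation is the common first step.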
EMERGE: Engineered Materials that Create Environments for ReGeneration via Electric Field
2016-10-01
Award number: W81XWH-14-1-0542. Report date: 23 Sep 2016. Cited work: "Recruitment of multiple cell lines by collagen-synthetic copolymer matrices in corneal regeneration," Biomaterials (2004).
The Challenges of Adopting a Culture of Mission Command in the US Army
2015-05-23
Author: LTC(P) James W. Wright. Subject terms: mission command. Recoverable abstract fragment: "...centralized control and less risk. Likewise, the development and implementation of high-end information technology creates a paradox for mission command."
[An alternative continence mechanism for continent catheterisable micturition].
Honeck, P; Alken, P
2010-01-01
The creation of a stable, reliable, continent and easily catheterisable continence mechanism is an essential prerequisite for the construction of a continent cutaneous urinary reservoir. Although a substantial number of surgical methods have been described, construction is still a complex surgical procedure. The aim of this study was the evaluation of a new method for a continence mechanism using stapled small or large intestine. Small and large pig intestine was used for construction. For stapling the tube, a 3 cm or 6 cm double-row stapling system was used. Two variations using small and large intestine segments were constructed (IL 1, COL 1, COL 2). A 3 or 6 cm long stapler line was placed alongside a 12 Fr catheter positioned at the antimesenterial side, creating a partially two-luminal segment. The open end of the non-catheterised lumen and the opposite intestinal end were closed by continuous sutures. The created tube was then embedded into the pouch. Pressure evaluation was performed for each variation. Intermittent external manual compression was used to simulate sudden pressure exposure. Construction times were 10 +/- 1.5 min for the IL 1 and COL 1 variations and 6.2 +/- 1.3 min for COL 2. All variations showed no leakage during filling or external compression. The maximum capacity was lower for IL 1 than for the COL variations. The maximum pressure levels reached did not differ significantly. The described technique is a fast and easy method of constructing a continent and easily catheterisable continence mechanism using small or large intestine.
Managing the Earth’s Biggest Mass Gathering Event and WASH Conditions: Maha Kumbh Mela (India)
Baranwal, Annu; Anand, Ankit; Singh, Ravikant; Deka, Mridul; Paul, Abhishek; Borgohain, Sunny; Roy, Nobhojit
2015-01-01
Background: Mass gatherings involving large numbers of people make the planning and management of an event a difficult task. Kumbh Mela is one such internationally famous religious mass gathering. It creates the substantial challenge of building a temporary city in which millions of people can stay for a defined period of time. The arrangements need to allow this very large number of people to reside with proper human waste disposal, medical services, adequate supplies of food and clean water, transportation, etc. Methods: We report a case study of the Maha Kumbh, 2013, which focuses on the management and planning that went into the preparation of Kumbh Mela and on understanding its water, sanitation and hygiene conditions. It was an observational cross-sectional study; the field work was done over 13 days, from 21 January to 2 February 2013. Results: Our findings suggest that the Mela committee and all other agencies involved in Mela management proved successful in supervising the event and making it convenient, efficient and safe. Health care services and water, sanitation and hygiene conditions were found to be satisfactory. Bhule Bhatke Kendra (a centre for helping people who got separated from their families) had the major task of finding missing people and helping them reunite with their families. Among the shortfalls identified, drainage was a major problem and some fire incidents were reported. Therefore, improvement in drainage facilities and reduction in fire incidents are essential to making the Mela cleaner and safer. The number of persons per toilet was high and there were no separate toilets for males and females. Special facilities and separate toilets for men and women would improve their stay at the Mela. Conclusion: Incorporating modern methods and technologies is likely to help in supporting crowd management and improving water, sanitation and hygiene conditions in the continuously expanding Kumbh Mela in the coming years. PMID:25932345
Modelling of information diffusion on social networks with applications to WeChat
NASA Astrophysics Data System (ADS)
Liu, Liang; Qu, Bo; Chen, Bin; Hanjalic, Alan; Wang, Huijuan
2018-04-01
Traces of user activities recorded in online social networks open new possibilities to systematically understand the information diffusion process on social networks. From the online social network WeChat, we collected a large number of information cascade trees, each of which tells the spreading trajectory of a message/information such as which user creates the information and which users view or forward the information shared by which neighbours. In this work, we propose two heterogeneous non-linear models, one for the topologies of the information cascade trees and the other for the stochastic process of information diffusion on a social network. Both models are validated by the WeChat data in reproducing and explaining key features of cascade trees. Specifically, we apply the Random Recursive Tree (RRT) to model the growth of cascade trees. The RRT model could capture key features, i.e. the average path length and degree variance of a cascade tree in relation to the number of nodes (size) of the tree. Its single identified parameter quantifies the relative depth or broadness of the cascade trees and indicates that information propagates via a star-like broadcasting or viral-like hop by hop spreading. The RRT model explains the appearance of hubs, thus a possibly smaller average path length as the cascade size increases, as observed in WeChat. We further propose the stochastic Susceptible View Forward Removed (SVFR) model to depict the dynamic user behaviour including creating, viewing, forwarding and ignoring a message on a given social network. Beside the average path length and degree variance of the cascade trees in relation to their sizes, the SVFR model could further explain the power-law cascade size distribution in WeChat and unravel that a user with a large number of friends may actually have a smaller probability to read a message (s)he receives due to limited attention.
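The growth rule of a Random Recursive Tree is simple enough to sketch. The version below is the uniform RRT (each new node attaches to a uniformly chosen existing node); the paper's model has an identified parameter biasing the attachment depth toward star-like or viral-like spreading, which this sketch omits.

```python
import random

# Grow a uniform Random Recursive Tree and record each node's depth.
# Node 0 is the root (the user who creates the message); every later
# node attaches to a uniformly random existing node (a view/forward).
def rrt_depths(n, seed=1):
    random.seed(seed)
    depth = [0]                           # depth of the root
    for _ in range(1, n):
        parent = random.randrange(len(depth))
        depth.append(depth[parent] + 1)
    return depth

d = rrt_depths(1000)
avg = sum(d) / len(d)
print(round(avg, 2))   # average depth grows roughly like ln(n) for uniform RRT
```

Biasing the parent choice toward the root would flatten the tree (broadcast-like cascades); biasing it toward recent, deep nodes would elongate it (hop-by-hop viral cascades), which is the behavior the model's single parameter interpolates between.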
Solutions for extracting file level spatial metadata from airborne mission data
NASA Astrophysics Data System (ADS)
Schwab, M. J.; Stanley, M.; Pals, J.; Brodzik, M.; Fowler, C.; Icebridge Engineering/Spatial Metadata
2011-12-01
Affiliations: Raytheon EED, 5700 Rivertech Court, Riverdale, MD 20737; NSIDC, University of Colorado, UCB 449, Boulder, CO 80309-0449. Data sets acquired from satellites and aircraft may differ in many ways. We will focus on the differences in spatial coverage between the two platforms. Satellite data sets over a given period typically cover large geographic regions. These data are collected in a consistent, predictable and well understood manner due to the uniformity of satellite orbits. Since satellite data collection paths are typically smooth and uniform, the data from satellite instruments can usually be described with simple spatial metadata. Subsequently, these spatial metadata can be stored and searched easily and efficiently. Conversely, aircraft have significantly more freedom to change paths, circle, overlap, and vary altitude, all of which add complexity to the spatial metadata. Aircraft are also subject to wind and other elements that result in even more complicated and unpredictable spatial coverage areas. This unpredictability and complexity make it more difficult to extract usable spatial metadata from data sets collected on aircraft missions. It is not feasible to use all of the location data from aircraft mission data sets as spatial metadata: the number of data points in typical data sets poses serious performance problems for spatial searching. In order to provide efficient spatial searching of the large number of files cataloged in our systems, we need to extract approximate spatial descriptions as geo-polygons with a small number of vertices (fewer than two hundred). We present some of the challenges and solutions for creating airborne mission-derived spatial metadata.
We are implementing these methods to create the spatial metadata for insertion of IceBridge mission data into ECS for public access through NSIDC and ECHO, but they are potentially extensible to any aircraft mission data.
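One standard way to reduce a dense flight track to a polygon outline with few vertices is the Ramer-Douglas-Peucker algorithm. The sketch below illustrates the idea on a planar (x, y) track with an invented tolerance; it is one common approach, not necessarily the method used for the IceBridge metadata.

```python
# Ramer-Douglas-Peucker polyline simplification: keep the point farthest
# from the chord between the endpoints if it exceeds epsilon, and recurse.
def rdp(points, epsilon):
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = (dx * dx + dy * dy) ** 0.5 or 1.0   # guard zero-length chord
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        d = abs(dy * (px - x1) - dx * (py - y1)) / norm
        if d > dmax:
            dmax, idx = d, i
    if dmax <= epsilon:
        return [points[0], points[-1]]
    left = rdp(points[: idx + 1], epsilon)
    right = rdp(points[idx:], epsilon)
    return left[:-1] + right                    # drop duplicated split point

track = [(x / 10.0, (x / 10.0) ** 2) for x in range(101)]  # dense curved path
simplified = rdp(track, 1.0)
print(len(track), len(simplified))
```

A real flight track would first need great-circle or projected distances rather than planar ones, and self-intersecting loops handled separately, but the vertex-reduction step is the same.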
Asteroid Discovery and Characterization with the Large Synoptic Survey Telescope
NASA Astrophysics Data System (ADS)
Jones, R. Lynne; Jurić, Mario; Ivezić, Željko
2016-01-01
The Large Synoptic Survey Telescope (LSST) will be a ground-based, optical, all-sky, rapid-cadence survey project with tremendous potential for discovering and characterizing asteroids. With LSST's large 6.5 m diameter primary mirror, a 3.2-gigapixel camera with a wide 9.6-square-degree field of view, and rapid observational cadence, LSST will discover more than 5 million asteroids over its ten-year survey lifetime. With a single-visit limiting magnitude of 24.5 in r band, LSST will be able to detect asteroids in the Main Belt down to sub-kilometer sizes. The current strawman for the LSST survey strategy is to obtain two visits (each `visit' being a pair of back-to-back 15 s exposures) per field, separated by about 30 minutes, covering the entire visible sky every 3-4 days throughout the observing season, for ten years. The catalogs generated by LSST will increase the known number of small bodies in the Solar System by a factor of 10-100, among all populations. The median number of observations for Main Belt asteroids will be on the order of 200-300, with Near Earth Objects receiving a median of 90 observations. These observations will be spread among ugrizy bandpasses, providing photometric colors and allowing sparse lightcurve inversion to determine rotation periods, spin axes, and shape information. These catalogs will be created using automated detection software, the LSST Moving Object Processing System (MOPS), that will take advantage of the carefully characterized LSST optical system, cosmetically clean camera, and recent improvements in difference imaging. Tests with the prototype MOPS software indicate that linking detections (and thus `discovery') will be possible at LSST depths with our working model for the survey strategy, but evaluation of MOPS and improvements in the survey strategy will continue. All data products and software created by LSST will be publicly available.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ishihara, T
Currently, the problem at hand is distributing identical copies of OEP and filter software to a large number of farm nodes. A common method for transferring this software is unicast. The unicast protocol repetitiously sends the same data over the network; since the sending rate is limited, this process becomes a bottleneck. One possible solution therefore lies in creating a reliable multicast protocol. A specific type of multicast protocol is the Bulk Multicast Protocol [4]. This system consists of one sender distributing data to many receivers. The sender delivers data at a given rate of data packets. In response, each receiver replies to the sender with a status packet, which contains information about packet loss in the form of Negative Acknowledgments. The probability that a status packet is sent back to the sender is 1/N, where N is the number of receivers; the protocol is thus designed to have approximately 1 status packet for each data packet sent. In this project, we were able to show that the complete transfer of a file to multiple receivers was about 12 times faster with multicast than with unicast. The implementation of this experimental protocol shows remarkable improvement in mass data transfer to a large number of farm machines.
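The design goal of roughly one status packet per data packet can be checked with a short simulation: if each of N receivers replies to a data packet independently with probability 1/N, the sender sees about one reply per packet regardless of N. This is a sketch of the feedback-suppression idea only, not the protocol implementation.

```python
import random

# Simulate the status-packet scheme: for each data packet, each of
# n_receivers replies with probability 1/n_receivers. Returns the
# average number of status packets the sender receives per data packet.
def mean_status_packets(n_receivers, n_data_packets, seed=7):
    random.seed(seed)
    total = 0
    for _ in range(n_data_packets):
        total += sum(random.random() < 1.0 / n_receivers
                     for _ in range(n_receivers))
    return total / n_data_packets

print(round(mean_status_packets(100, 10_000), 2))  # close to 1.0
```

The point of the 1/N reply probability is exactly this scaling: the sender's feedback load stays constant as the receiver pool grows, avoiding the ACK implosion that per-receiver acknowledgments would cause.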
Baker, Michael S
2007-03-01
How do we train for the entire spectrum of potential emergency and crisis scenarios? Will we suddenly face large numbers of combat casualties, an earthquake, a plane crash, an industrial explosion, or a terrorist bombing? The daily routine can suddenly be complicated by large numbers of patients, exceeding the ability to treat in a routine fashion. Disaster events can result in patients with penetrating wounds, burns, blast injuries, chemical contamination, or all of these at once. Some events may disrupt infrastructure or result in loss of essential equipment or key personnel. The chaos of a catastrophic event impedes decision-making and effective treatment of patients. Disasters require a paradigm shift from the application of unlimited resources for the greatest good of each individual patient to the allocation of care, with limited resources, for the greatest good for the greatest number of patients. Training and preparation are essential to remain effective during crises and major catastrophic events. Disaster triage and crisis management represent a tactical art that incorporates clinical skills, didactic information, communication ability, leadership, and decision-making. Planning, rehearsing, and exercising various scenarios encourage the flexibility, adaptability, and innovation required in disaster settings. These skills can bring order to the chaos of overwhelming disaster events.
Residual Ductility and Microstructural Evolution in Continuous-Bending-under-Tension of AA-6022-T4
Zecevic, Milovan; Roemer, Timothy J.; Knezevic, Marko; Korkolis, Yannis P.; Kinsey, Brad L.
2016-01-01
A ubiquitous experiment to characterize the formability of sheet metal is the simple tension test. Past research has shown that if the material is repeatedly bent and unbent during this test (i.e., Continuous-Bending-under-Tension, CBT), the percent elongation at failure can increase significantly. In this paper, this phenomenon is evaluated in detail for AA-6022-T4 sheets using a custom-built CBT device. In particular, the residual ductility of specimens that are subjected to CBT processing is investigated. This is achieved by subjecting a specimen to CBT processing and then creating subsize tensile test and microstructural samples from the specimens after varying numbers of CBT cycles. Interestingly, the engineering stress initially increases with CBT processing up to a certain number of cycles, but then decreases, with less elongation achieved, for increasing numbers of CBT cycles. Additionally, a detailed microstructure and texture characterization is performed using standard scanning electron microscopy and electron backscattered diffraction imaging. The results show that the material under CBT preserves high integrity to large plastic strains due to a uniform distribution of damage formation and evolution in the material. The ability to delay ductile fracture during the CBT process to large plastic strains results in the formation of a strong <111> fiber texture throughout the material. PMID:28773257
Opportunities and constraints for organizations to help sustain tropical forest resources
NASA Astrophysics Data System (ADS)
Hyman, Eric L.
1986-01-01
A large number of organizations make decisions that directly or indirectly affect tropical forests. The principal constraints that affect these organizations are (1) insufficient funds; (2) insufficient knowledge about the resources and appropriate technologies; (3) institutional, cultural, and political factors; (4) inadequate communication; and (5) contradictory efforts. Opportunities for improving the efficiency and effectiveness of these organizations include (1) increasing cooperation among US government agencies; (2) redirecting international organizations; (3) increasing coordination among organizations; (4) boosting support of nongovernmental organizations and universities; (5) encouraging responsible involvement by private corporations; (6) strengthening existing organizations; and (7) creating new organizations.
Why do proteases mess up with antigen presentation by re-shuffling antigen sequences?
Liepe, Juliane; Ovaa, Huib; Mishto, Michele
2018-04-30
The sequence of a large number of MHC-presented epitopes is not present as such in the original antigen because it has been re-shuffled by the proteasome or other proteases. Why do proteases throw a spanner in the works of our model of antigen tagging and immune recognition? We describe in this review what we know about the immunological relevance of post-translationally spliced epitopes and why proteases seem to have a second (dark) personality, which is keen to create new peptide bonds. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
2017-01-09
Today's VIS image shows some of the extensive wind-etched terrain in Memnonia Sulci, located southwest of Olympus Mons. The linear ridges are called yardangs and form by wind removal of semi-cemented material. The ridges are parallel to the wind direction, so the predominant winds that created the yardangs in this image blew NW/SE. At the bottom of the image several of the ridges have been eroded into smaller ridges aligned perpendicular to the large yardangs, indicating winds at a different angle. Orbit Number: 66197 Latitude: -5.91796 Longitude: 183.886 Instrument: VIS Captured: 2016-11-15 13:08 http://photojournal.jpl.nasa.gov/catalog/PIA21283
Computational Design of Functionalized Metal–Organic Framework Nodes for Catalysis
2017-01-01
Recent progress in the synthesis and characterization of metal–organic frameworks (MOFs) has opened the door to an increasing number of possible catalytic applications. The great versatility of MOFs creates a large chemical space, whose thorough experimental examination becomes practically impossible. Therefore, computational modeling is a key tool to support, rationalize, and guide experimental efforts. In this outlook we survey the main methodologies employed to model MOFs for catalysis, and we review selected recent studies on the functionalization of their nodes. We pay special attention to catalytic applications involving natural gas conversion. PMID:29392172
Study of alumina-trichite reinforcement of a nickel-based matrix by means of powder metallurgy
NASA Technical Reports Server (NTRS)
Walder, A.; Hivert, A.
1982-01-01
Research was conducted on reinforcing nickel-based matrices with alumina trichites by using powder metallurgy. Alumina trichites previously coated with nickel are magnetically aligned. The felt obtained is then sintered under a light pressure at a temperature just below the melting point of nickel. The halogenated-atmosphere technique makes it possible to incorporate a large number of additive elements such as chromium, titanium, zirconium, tantalum, niobium, aluminum, etc. It does not appear that going from laboratory scale to a semi-industrial scale in production would create any major problems.
A world of cities and the end of TB
Prasad, Amit; Ross, Alex; Rosenberg, Paul; Dye, Christopher
2016-01-01
The WHO's End TB Strategy aims to reduce TB deaths by 95% and incidence by 90% between 2015 and 2035. As the world rapidly urbanizes, more people could have access to better infrastructure and services to help combat poverty and infectious diseases, including TB. And yet large numbers of people now live in overcrowded slums, with poor access to urban health services, amplifying the burden of TB. An alignment of the Sustainable Development Goals (SDGs) for health and for urban development provides an opportunity to accelerate the overall decline in infection and disease, and to create cities free of TB. PMID:26884491
Electronic labelling in recycling of manufactured articles.
Olejnik, Lech; Krammer, Alfred
2002-12-01
The concept of a recycling system aiming at the recovery of resources from manufactured articles is proposed. The system integrates electronic labels for product identification and internet for global data exchange. A prototype for the recycling of electric motors has been developed, which implements a condition-based recycling decision system to automatically select the environmentally and economically appropriate recycling strategy, thereby opening a potential market for second-hand motors and creating a profitable recycling process itself. The project has been designed to evaluate the feasibility of electronic identification applied on a large number of motors and to validate the system in real field conditions.
NASA Astrophysics Data System (ADS)
Mukherjee, Sayak; Stewart, David; Stewart, William; Lanier, Lewis L.; Das, Jayajit
2017-08-01
Single-cell responses are shaped by the geometry of signalling kinetic trajectories carved in a multidimensional space spanned by signalling protein abundances. It is, however, challenging to assay a large number (more than 3) of signalling species in live-cell imaging, which makes it difficult to probe single-cell signalling kinetic trajectories in large dimensions. Flow and mass cytometry techniques can measure a large number (4 to more than 40) of signalling species but are unable to track single cells. Thus, cytometry experiments provide detailed time-stamped snapshots of single-cell signalling kinetics. Is it possible to use the time-stamped cytometry data to reconstruct single-cell signalling trajectories? Borrowing concepts of conserved and slow variables from non-equilibrium statistical physics we develop an approach to reconstruct signalling trajectories using snapshot data by creating new variables that remain invariant or vary slowly during the signalling kinetics. We apply this approach to reconstruct trajectories using snapshot data obtained from in silico simulations, live-cell imaging measurements, and, synthetic flow cytometry datasets. The application of invariants and slow variables to reconstruct trajectories provides a radically different way to track objects using snapshot data. The approach is likely to have implications for solving matching problems in a wide range of disciplines.
Fast flexible modeling of RNA structure using internal coordinates.
Flores, Samuel Coulbourn; Sherman, Michael A; Bruns, Christopher M; Eastman, Peter; Altman, Russ Biagio
2011-01-01
Modeling the structure and dynamics of large macromolecules remains a critical challenge. Molecular dynamics (MD) simulations are expensive because they model every atom independently, and are difficult to combine with experimentally derived knowledge. Assembly of molecules using fragments from libraries relies on the database of known structures and thus may not work for novel motifs. Coarse-grained modeling methods have yielded good results on large molecules but can suffer from difficulties in creating more detailed full atomic realizations. There is therefore a need for molecular modeling algorithms that remain chemically accurate and economical for large molecules, do not rely on fragment libraries, and can incorporate experimental information. RNABuilder works in the internal coordinate space of dihedral angles and thus has time requirements proportional to the number of moving parts rather than the number of atoms. It provides accurate physics-based response to applied forces, but also allows user-specified forces for incorporating experimental information. A particular strength of RNABuilder is that all Leontis-Westhof basepairs can be specified as primitives by the user to be satisfied during model construction. We apply RNABuilder to predict the structure of an RNA molecule with 160 bases from its secondary structure, as well as experimental information. Our model matches the known structure to 10.2 Angstroms RMSD and has low computational expense.
Strategic Planning for the Air Force. Leveraging Business Planning Insights to Create Future Value
1998-01-01
Authors: Deborah L. Westphal, Richard Szafranski. Subject terms: long-range planning, strategic thinking. Recoverable abstract fragment: "...can be so, unless leaders and planners are willing to think in the boundary between order and chaos."
Robust point matching via vector field consensus.
Jiayi Ma; Ji Zhao; Jinwen Tian; Yuille, Alan L; Zhuowen Tu
2014-04-01
In this paper, we propose an efficient algorithm, called vector field consensus, for establishing robust point correspondences between two sets of points. Our algorithm starts by creating a set of putative correspondences which can contain a very large number of false correspondences, or outliers, in addition to a limited number of true correspondences (inliers). Next, we solve for correspondence by interpolating a vector field between the two point sets, which involves estimating a consensus of inlier points whose matching follows a nonparametric geometrical constraint. We formulate this as maximum a posteriori (MAP) estimation of a Bayesian model with hidden/latent variables indicating whether matches in the putative set are outliers or inliers. We impose nonparametric geometrical constraints on the correspondence, as a prior distribution, using Tikhonov regularizers in a reproducing kernel Hilbert space. MAP estimation is performed by the EM algorithm, which, by also estimating the variance of the prior model (initialized to a large value), is able to obtain good estimates very quickly (e.g., avoiding many of the local minima inherent in this formulation). We illustrate this method on data sets in 2D and 3D and demonstrate that it is robust to a very large number of outliers (even up to 90%). We also show that in the special case where there is an underlying parametric geometrical model (e.g., the epipolar line constraint), we obtain better results than standard alternatives like RANSAC if a large number of outliers are present. This suggests a two-stage strategy, where we use our nonparametric model to reduce the size of the putative set and then apply a parametric variant of our approach to estimate the geometric parameters. Our algorithm is computationally efficient and we provide code for others to use it. In addition, our approach is general and can be applied to other problems, such as learning with a badly corrupted training data set.
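The EM machinery behind this kind of formulation can be illustrated with a drastically simplified sketch: model the displacement vectors of putative matches as a mixture of a Gaussian around a common motion (inliers) and a uniform background (outliers). The full method fits a nonparametric field in an RKHS and updates the mixing weight; here the "field" is a single translation and the mixing weight is fixed at 0.5, purely for brevity. All data are synthetic.

```python
import numpy as np

# Synthetic putative matches: 60 inliers share a common translation
# (plus small noise); 40 outliers are uniform on a 20x20 box.
rng = np.random.default_rng(2)
true_shift = np.array([3.0, -1.0])
inliers = rng.normal(0, 0.05, (60, 2)) + true_shift
outliers = rng.uniform(-10, 10, (40, 2))
disp = np.vstack([inliers, outliers])          # displacement per match

# EM: gamma[i] = posterior probability that match i is an inlier.
mu = disp.mean(axis=0)                         # initial motion estimate
var = 1.0                                      # large initial variance
p_out = 1.0 / 400.0                            # uniform density on the box
for _ in range(30):
    sq = ((disp - mu) ** 2).sum(axis=1)
    g_in = np.exp(-sq / (2 * var)) / (2 * np.pi * var)   # 2D isotropic Gaussian
    gamma = g_in * 0.5 / (g_in * 0.5 + p_out * 0.5)      # E-step
    mu = (gamma[:, None] * disp).sum(axis=0) / gamma.sum()  # M-step: mean
    var = (gamma * sq).sum() / (2 * gamma.sum())            # M-step: variance

print(np.round(mu, 1), int((gamma > 0.5).sum()))
```

As in the paper's formulation, starting the variance large lets early iterations weight many matches softly and avoids locking onto a spurious cluster; the variance then shrinks as the consensus sharpens.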
DES Science Portal: II- Creating Science-Ready Catalogs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fausti Neto, Angelo; et al.
We present a novel approach for creating science-ready catalogs through a software infrastructure developed for the Dark Energy Survey (DES). We integrate the data products released by the DES Data Management and additional products created by the DES collaboration in an environment known as DES Science Portal. Each step involved in the creation of a science-ready catalog is recorded in a relational database and can be recovered at any time. We describe how the DES Science Portal automates the creation and characterization of lightweight catalogs for DES Year 1 Annual Release, and show its flexibility in creating multiple catalogs with different inputs and configurations. Finally, we discuss the advantages of this infrastructure for large surveys such as DES and the Large Synoptic Survey Telescope. The capability of creating science-ready catalogs efficiently and with full control of the inputs and configurations used is an important asset for supporting science analysis using data from large astronomical surveys.
The effect of fast created inbreeding on litter size and body weights in mice
Holt, Marte; Meuwissen, Theo; Vangen, Odd
2005-01-01
This study was designed to reveal any differences in effects of fast created versus total inbreeding on reproduction and body weights in mice. A line selected for large litter size for 124 generations (H) and a control line (K) maintained without selection for the same number of generations were crossed (HK) and used as a basis for the experiment. Within the HK cross, full sib, cousin or random mating were practised for two generations in order to create new inbreeding (IBF) at a fast rate. In the first generation of systematic mating, old inbreeding was regenerated in addition to the creation of new inbreeding from the mating design, giving total inbreeding (IBT). The number of pups born alive (NBA) and body weights of the animals were then analysed by a model including both IBT and IBF. In the present study, the IBT of the dam was found to reduce the mean NBA by 0.48 (± 0.22) pups (p < 0.05) per 10% increase in the inbreeding coefficient, while the additional effect of IBF was 0.42 (± 0.27) pups. For the trait NBA per female mated, the effect of IBT was estimated to be 0.45 (± 0.29) pups per 10% increase in the inbreeding coefficient and the effect of IBF was 0.90 (± 0.37) pups (p < 0.05). In the present study, only small or non-significant effects of IBF of the dam could be found on sex ratio and body weights at three and six weeks of age in a population already adjusted for IBT. PMID:16093013
ORBIT: an integrated environment for user-customized bioinformatics tools.
Bellgard, M I; Hiew, H L; Hunter, A; Wiebrands, M
1999-10-01
There are a large number of computational programs freely available to bioinformaticians via a client/server, web-based environment. However, the client interface to these tools (typically an html form page) cannot be customized from the client side as it is created by the service provider. The form page is usually generic enough to cater for a wide range of users. However, this implies that a user cannot set as 'default' advanced program parameters on the form or even customize the interface to his/her specific requirements or preferences. Currently, there is a lack of end-user interface environments that can be modified by the user when accessing computer programs available on a remote server running on an intranet or over the Internet. We have implemented a client/server system called ORBIT (Online Researcher's Bioinformatics Interface Tools) where individual clients can have interfaces created and customized to command-line-driven, server-side programs. Thus, Internet-based interfaces can be tailored to a user's specific bioinformatic needs. As interfaces are created on the client machine independent of the server, there can be different interfaces to the same server-side program to cater for different parameter settings. The interface customization is relatively quick (between 10 and 60 min) and all client interfaces are integrated into a single modular environment which will run on any computer platform supporting Java. The system has been developed to allow for a number of future enhancements and features. ORBIT represents an important advance in the way researchers gain access to bioinformatics tools on the Internet.
The Continuum of Health Professions
Jensen, Clyde B.
2015-01-01
The large number of health care professions with overlapping scopes of practice is intimidating to students, confusing to patients, and frustrating to policymakers. As abundant and diverse as the hundreds of health care professions are, they possess sufficient numbers of common characteristics to warrant their placement on a common continuum of health professions that permits methodical comparisons. From 2009–2012, the author developed and delivered experimental courses at 2 community colleges for the purposes of creating and validating a novel method for comparing health care professions. This paper describes the bidirectional health professions continuum that emerged from these courses and its potential value in helping students select a health care career, motivating health care providers to seek interprofessional collaboration, assisting patients with the selection of health care providers, and helping policymakers to better understand the health care professions they regulate. PMID:26770147
If this then that: an introduction to automated task services.
Hoy, Matthew B
2015-01-01
This article explores automated task services, a type of website that allows users to create rules that are triggered by activity on one website and perform a task on another site. The most well-known automated task service is If This Then That (IFTTT), but recently a large number of these services have sprung up. These services can be used to connect websites, apps, business services, and even devices such as phones and home automation equipment. This allows for millions of possible combinations of rules, triggers, and actions. Librarians can put these services to use in many ways, from automating social media postings to remembering to bring their umbrella when rain is in the forecast. A list of popular automated task services is included, as well as a number of ideas for using these services in libraries.
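The trigger/action rules such services implement can be modelled minimally as follows; the class and event format here are illustrative inventions, not any real service's API:

```python
class AutomationEngine:
    """Toy 'if this then that' engine (illustrative, not a real service's API)."""

    def __init__(self):
        self.rules = []

    def add_rule(self, trigger, action):
        # trigger: predicate over an event dict; action: callable on the event
        self.rules.append((trigger, action))

    def handle(self, event):
        # Run every rule whose trigger matches the incoming event
        return [action(event) for trigger, action in self.rules if trigger(event)]
```

A librarian's umbrella reminder, for example, would pair a trigger on a rain forecast event with an action that posts the reminder.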
Disease registries on the nationwide health information network.
Russler, Daniel
2011-05-01
Donation by individuals of their protected health information (PHI) for evidence-based research potentially benefits all individuals with disease through improved understandings of disease patterns. In the future, a better understanding of how disease features combine into unique patterns of disease will generate new disease classifications, supporting greater specificity in health management techniques. However, without large numbers of people who donate their PHI to disease registries designed for research, it is difficult for researchers to discover the existence of complex patterns or to create more specific evidence-based management techniques. In order to identify new opportunities in disease registry design, an analysis of the current stage of maturity of the newly created U.S. Nationwide Health Information Network (NwHIN) related to large-scale consumer donation of PHI is presented. Utilizing a use-case analysis methodology, the consumer-centric designs of the policies and technologies created for the NwHIN were examined for the potential to support consumer donations of PHI to research. The NwHIN design has placed the enforcement point for the policy-based release of PHI over the Internet into a specialized gateway accessible to consumer authorization. However, current NwHIN policies leave the final decision regarding release of PHI for research to the health care providers rather than to the consumers themselves. Should disease registries designed for research be established on the NwHIN, consumers might then directly authorize the donation of their PHI to these disease registries. However, under current NwHIN policies, consumer authorization does not guarantee release of PHI by health providers. © 2011 Diabetes Technology Society.
Calvo, Sarah E; Tucker, Elena J; Compton, Alison G; Kirby, Denise M; Crawford, Gabriel; Burtt, Noel P; Rivas, Manuel A; Guiducci, Candace; Bruno, Damien L; Goldberger, Olga A; Redman, Michelle C; Wiltshire, Esko; Wilson, Callum J; Altshuler, David; Gabriel, Stacey B; Daly, Mark J; Thorburn, David R; Mootha, Vamsi K
2010-01-01
Discovering the molecular basis of mitochondrial respiratory chain disease is challenging given the large number of both mitochondrial and nuclear genes involved. We report a strategy of focused candidate gene prediction, high-throughput sequencing, and experimental validation to uncover the molecular basis of mitochondrial complex I (CI) disorders. We created five pools of DNA from a cohort of 103 patients and then performed deep sequencing of 103 candidate genes to spotlight 151 rare variants predicted to impact protein function. We used confirmatory experiments to establish genetic diagnoses in 22% of previously unsolved cases, and discovered that defects in NUBPL and FOXRED1 can cause CI deficiency. Our study illustrates how large-scale sequencing, coupled with functional prediction and experimental validation, can reveal novel disease-causing mutations in individual patients. PMID:20818383
NASA Technical Reports Server (NTRS)
Leptoukh, Gregory; Shen, Suhung; Csiszar, Ivan; Romanov, Peter; Loboda, Tatiana; Gerasimov, Irina
2008-01-01
A large number of fires was detected in July of 2003 - a nearly 200-fold increase in fire detections compared to other years during 2001-2006 - despite the summer monsoon, which normally suppresses large fire occurrence. Traditional vegetation indices (NDVI and EVI) included in operational fire danger assessment provide little information on the fuel state in this ecosystem pre- or post-fire. No considerable differences in surface temperature and soil moisture in July were observed between the catastrophic year of 2003 and the two subsequent years of low summer fire occurrence, 2004 and 2005. However, the temporal analysis indicates that dry spring conditions in 2003 (detected through low soil moisture measurements in April and May) may have led to a stressed vegetative state and created conditions conducive to catastrophic fire occurrence.
Integrative analysis of the Caenorhabditis elegans genome by the modENCODE project.
Gerstein, Mark B; Lu, Zhi John; Van Nostrand, Eric L; Cheng, Chao; Arshinoff, Bradley I; Liu, Tao; Yip, Kevin Y; Robilotto, Rebecca; Rechtsteiner, Andreas; Ikegami, Kohta; Alves, Pedro; Chateigner, Aurelien; Perry, Marc; Morris, Mitzi; Auerbach, Raymond K; Feng, Xin; Leng, Jing; Vielle, Anne; Niu, Wei; Rhrissorrakrai, Kahn; Agarwal, Ashish; Alexander, Roger P; Barber, Galt; Brdlik, Cathleen M; Brennan, Jennifer; Brouillet, Jeremy Jean; Carr, Adrian; Cheung, Ming-Sin; Clawson, Hiram; Contrino, Sergio; Dannenberg, Luke O; Dernburg, Abby F; Desai, Arshad; Dick, Lindsay; Dosé, Andréa C; Du, Jiang; Egelhofer, Thea; Ercan, Sevinc; Euskirchen, Ghia; Ewing, Brent; Feingold, Elise A; Gassmann, Reto; Good, Peter J; Green, Phil; Gullier, Francois; Gutwein, Michelle; Guyer, Mark S; Habegger, Lukas; Han, Ting; Henikoff, Jorja G; Henz, Stefan R; Hinrichs, Angie; Holster, Heather; Hyman, Tony; Iniguez, A Leo; Janette, Judith; Jensen, Morten; Kato, Masaomi; Kent, W James; Kephart, Ellen; Khivansara, Vishal; Khurana, Ekta; Kim, John K; Kolasinska-Zwierz, Paulina; Lai, Eric C; Latorre, Isabel; Leahey, Amber; Lewis, Suzanna; Lloyd, Paul; Lochovsky, Lucas; Lowdon, Rebecca F; Lubling, Yaniv; Lyne, Rachel; MacCoss, Michael; Mackowiak, Sebastian D; Mangone, Marco; McKay, Sheldon; Mecenas, Desirea; Merrihew, Gennifer; Miller, David M; Muroyama, Andrew; Murray, John I; Ooi, Siew-Loon; Pham, Hoang; Phippen, Taryn; Preston, Elicia A; Rajewsky, Nikolaus; Rätsch, Gunnar; Rosenbaum, Heidi; Rozowsky, Joel; Rutherford, Kim; Ruzanov, Peter; Sarov, Mihail; Sasidharan, Rajkumar; Sboner, Andrea; Scheid, Paul; Segal, Eran; Shin, Hyunjin; Shou, Chong; Slack, Frank J; Slightam, Cindie; Smith, Richard; Spencer, William C; Stinson, E O; Taing, Scott; Takasaki, Teruaki; Vafeados, Dionne; Voronina, Ksenia; Wang, Guilin; Washington, Nicole L; Whittle, Christina M; Wu, Beijing; Yan, Koon-Kiu; Zeller, Georg; Zha, Zheng; Zhong, Mei; Zhou, Xingliang; Ahringer, Julie; Strome, Susan; Gunsalus, Kristin C; Micklem, 
Gos; Liu, X Shirley; Reinke, Valerie; Kim, Stuart K; Hillier, LaDeana W; Henikoff, Steven; Piano, Fabio; Snyder, Michael; Stein, Lincoln; Lieb, Jason D; Waterston, Robert H
2010-12-24
We systematically generated large-scale data sets to improve genome annotation for the nematode Caenorhabditis elegans, a key model organism. These data sets include transcriptome profiling across a developmental time course, genome-wide identification of transcription factor-binding sites, and maps of chromatin organization. From this, we created more complete and accurate gene models, including alternative splice forms and candidate noncoding RNAs. We constructed hierarchical networks of transcription factor-binding and microRNA interactions and discovered chromosomal locations bound by an unusually large number of transcription factors. Different patterns of chromatin composition and histone modification were revealed between chromosome arms and centers, with similarly prominent differences between autosomes and the X chromosome. Integrating data types, we built statistical models relating chromatin, transcription factor binding, and gene expression. Overall, our analyses ascribed putative functions to most of the conserved genome.
Phylogenetic search through partial tree mixing
2012-01-01
Background: Recent advances in sequencing technology have created large data sets upon which phylogenetic inference can be performed. Current research is limited by the prohibitive time necessary to perform tree search on a reasonable number of individuals. This research develops new phylogenetic algorithms that can operate on tens of thousands of species in a reasonable amount of time through several innovative search techniques. Results: When compared to popular phylogenetic search algorithms, better trees are found much more quickly for large data sets. These algorithms are incorporated in the PSODA application available at http://dna.cs.byu.edu/psoda. Conclusions: The use of Partial Tree Mixing in a partition-based tree space allows the algorithm to quickly converge on near-optimal tree regions. These regions can then be searched in a methodical way to determine the overall optimal phylogenetic solution. PMID:23320449
Simplified Deployment of Health Informatics Applications by Providing Docker Images.
Löbe, Matthias; Ganslandt, Thomas; Lotzmann, Lydia; Mate, Sebastian; Christoph, Jan; Baum, Benjamin; Sariyar, Murat; Wu, Jie; Stäubert, Sebastian
2016-01-01
Due to the specific needs of biomedical researchers, in-house development of software is widespread. A common problem is to maintain and enhance software after the funded project has ended. Even if many tools are made open source, only a couple of projects manage to attract a user basis large enough to ensure sustainability. Reasons for this include complex installation and configuration of biomedical software as well as an ambiguous terminology of the features provided; all of which make evaluation of software laborious. Docker is a para-virtualization technology based on Linux containers that eases deployment of applications and facilitates evaluation. We investigated a suite of software developments funded by a large umbrella organization for networked medical research within the last 10 years and created Docker containers for a number of applications to support utilization and dissemination.
Effect of steady and time-harmonic magnetic fields on macrosegregation in alloy solidification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Incropera, F.P.; Prescott, P.J.
Buoyancy-induced convection during the solidification of alloys can contribute significantly to the redistribution of alloy constituents, thereby creating large composition gradients in the final ingot. Termed macrosegregation, the condition diminishes the quality of the casting and, in the extreme, may require that the casting be remelted. The deleterious effects of buoyancy-driven flows may be suppressed through application of an external magnetic field, and in this study the effects of both steady and time-harmonic fields have been considered. For a steady magnetic field, extremely large field strengths would be required to effectively dampen convection patterns that contribute to macrosegregation. However, by reducing spatial variations in temperature and composition, turbulent mixing induced by a time-harmonic field reduces the number and severity of segregates in the final casting.
DeRobertis, Christopher V.; Lu, Yantian T.
2010-02-23
A method, system, and program storage device for creating a new user account or user group with a unique identification number in a computing environment having multiple user registries is provided. In response to receiving a command to create a new user account or user group, an operating system of a clustered computing environment automatically checks multiple registries configured for the operating system to determine whether a candidate identification number for the new user account or user group has been assigned already to one or more existing user accounts or groups, respectively. The operating system automatically assigns the candidate identification number to the new user account or user group created in a target user registry if the checking indicates that the candidate identification number has not been assigned already to any of the existing user accounts or user groups, respectively.
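The check-then-assign flow described in this patent abstract can be sketched as follows; the registries are modelled here as plain dictionaries (a hypothetical structure, not the patented system's actual data model):

```python
def create_account(registries, target, username, candidate_id):
    """Assign candidate_id to a new account in the target registry only if
    no existing account in ANY configured registry already holds that id.
    Registries are modelled as {registry_name: {username: id}}."""
    # Check every configured registry for a collision before assigning
    for reg in registries.values():
        if candidate_id in reg.values():
            raise ValueError("candidate id %d is already assigned" % candidate_id)
    registries[target][username] = candidate_id
    return candidate_id
```

The key design point is that uniqueness is enforced across all registries at creation time, rather than within the single registry receiving the new account.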
Are "New Building" Learning Gains Sustainable?
ERIC Educational Resources Information Center
Walczak, Mary M.; Van Wylen, David G. L.
2015-01-01
New science facilities have become a reality on many college campuses in the last few decades. Large time investments in creating shared programmatic vision and designing flexible spaces, partnered with large fiscal investments, have created a new generation of science building. Unfortunately, few studies provide evidence about whether the…
Occupational health and safety aspects of animal handling in dairy production.
Lindahl, Cecilia; Lundqvist, Peter; Hagevoort, G Robert; Lunner Kolstrup, Christina; Douphrate, David I; Pinzke, Stefan; Grandin, Temple
2013-01-01
Livestock handling in dairy production is associated with a number of health and safety issues. A large number of fatal and nonfatal injuries still occur when handling livestock. The many animal handling tasks on a dairy farm include moving cattle between different locations, vaccination, administration of medication, hoof care, artificial insemination, ear tagging, milking, and loading onto trucks. There are particular problems with bulls, which continue to cause considerable numbers of injuries and fatalities in dairy production. In order to reduce the number of injuries during animal handling on dairy farms, it is important to understand the key factors in human-animal interactions. These include handler attitudes and behavior, animal behavior, and fear in cows. Care when in close proximity to the animal is the key for safe handling, including knowledge of the flight zone, and use of the right types of tools and suitable restraint equipment. Thus, in order to create safe working conditions during livestock handling, it is important to provide handlers with adequate training and to establish sound safety management procedures on the farm.
Biological effects of two successive shock waves focused on liver tissues and melanoma cells.
Benes, J; Sunka, P; Králová, J; Kaspar, J; Poucková, P
2007-01-01
A new generator of two successive shock waves focused to a common focal point has been developed. Cylindrical pressure waves created by multichannel electrical discharges on two cylindrical composite anodes are focused by a metallic parabolic reflector (the cathode), and near the focus they are transformed into strong shock waves. Schlieren photos of the focal region have demonstrated that mutual interaction of the two waves results in generation of a large number of secondary short-wavelength shocks. Interaction of the focused shock waves with liver tissues and cancer cell suspensions was investigated. Localized injury of rabbit liver induced by the shock waves was demonstrated by magnetic resonance imaging. Histological analysis of liver samples taken from the injured region revealed that the transition between the injured and the healthy tissues is sharp. A suspension of melanoma B16 cells was exposed, and the number of surviving cells decreased rapidly with increasing number of shocks; only 8% of cells survived 350 shocks. Photographs of the cells demonstrate that even a small number of shocks results in perforation of cell membranes.
The GLAS Science Algorithm Software (GSAS) User's Guide Version 7
NASA Technical Reports Server (NTRS)
Lee, Jeffrey E.
2013-01-01
The Geoscience Laser Altimeter System (GLAS) is the primary instrument for the ICESat (Ice, Cloud and Land Elevation Satellite) laser altimetry mission. ICESat was the benchmark Earth Observing System (EOS) mission for measuring ice sheet mass balance, cloud and aerosol heights, as well as land topography and vegetation characteristics. From 2003 to 2009, the ICESat mission provided multi-year elevation data needed to determine ice sheet mass balance as well as cloud property information, especially for stratospheric clouds common over polar areas. It also provided topography and vegetation data around the globe, in addition to the polar-specific coverage over the Greenland and Antarctic ice sheets. This document is the final version of the GLAS Science Algorithm Software User's Guide. It contains the instructions to install the GLAS Science Algorithm Software (GSAS) in the production environment that was used to create the standard data products. It also describes the usage of each GSAS program in that environment, with their required inputs and outputs. Included are a number of utility programs that are used to create ancillary data files that are used in the processing but generally are not distributed to the public as data products. Importantly, the values of the large number of constants used by the GSAS algorithms during processing are provided in an appendix.
Trivelpiece-Gould modes in a uniform unbounded plasma
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stenzel, R. L.; Urrutia, J. M.
Trivelpiece-Gould (TG) modes originally described electrostatic surface waves on an axially magnetized cylindrical plasma column. Subsequent studies of electromagnetic waves in such plasma columns revealed two modes, a predominantly magnetic helicon mode (H) and the mixed magnetic and electrostatic Trivelpiece-Gould modes (TG). The latter are similar to whistler modes near the oblique cyclotron resonance in unbounded plasmas. The wave propagation in cylindrical geometry is assumed to be paraxial while the modes exhibit radial standing waves. The present work shows that TG modes also arise in a uniform plasma without radial standing waves. It is shown experimentally that oblique cyclotron resonance arises in large mode number helicons. Their azimuthal wave number far exceeds the axial wave number, which creates whistlers near the oblique cyclotron resonance. Cyclotron damping absorbs the TG mode and can energize electrons in the center of a plasma column rather than at the edge, as conventional TG modes do. The angular orbital field momentum can produce new perpendicular wave-particle interactions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lieu, Richard
A hierarchy of statistics of increasing sophistication and accuracy is proposed to exploit an interesting and fundamental arithmetic structure in the photon bunching noise of incoherent light of large photon occupation number, with the purpose of suppressing the noise and rendering a more reliable and unbiased measurement of the light intensity. The method does not require any new hardware; rather, it operates at the software level with the help of high-precision computers to reprocess the intensity time series of the incident light to create a new series with smaller bunching noise coherence length. The ultimate accuracy improvement of this method of flux measurement is limited by the timing resolution of the detector and the photon occupation number of the beam (the higher the photon number, the better the performance). The principal application is accuracy improvement in the signal-limited bolometric flux measurement of a radio source.
Large-scale neuromorphic computing systems
NASA Astrophysics Data System (ADS)
Furber, Steve
2016-10-01
Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.
Geeleher, Paul; Zhang, Zhenyu; Wang, Fan; Gruener, Robert F; Nath, Aritro; Morrison, Gladys; Bhutra, Steven; Grossman, Robert L; Huang, R Stephanie
2017-10-01
Obtaining accurate drug response data in large cohorts of cancer patients is very challenging; thus, most cancer pharmacogenomics discovery is conducted in preclinical studies, typically using cell lines and mouse models. However, these platforms suffer from serious limitations, including small sample sizes. Here, we have developed a novel computational method that allows us to impute drug response in very large clinical cancer genomics data sets, such as The Cancer Genome Atlas (TCGA). The approach works by creating statistical models relating gene expression to drug response in large panels of cancer cell lines and applying these models to tumor gene expression data in the clinical data sets (e.g., TCGA). This yields an imputed drug response for every drug in each patient. These imputed drug response data are then associated with somatic genetic variants measured in the clinical cohort, such as copy number changes or mutations in protein coding genes. These analyses recapitulated drug associations for known clinically actionable somatic genetic alterations and identified new predictive biomarkers for existing drugs. © 2017 Geeleher et al.; Published by Cold Spring Harbor Laboratory Press.
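The core idea, training an expression-to-response model on cell lines and then applying it to tumor expression profiles, can be sketched with a simple ridge regression. This is a deliberately simplified stand-in for the published modelling pipeline, with invented variable names:

```python
import numpy as np

def impute_drug_response(cl_expr, cl_resp, tumor_expr, alpha=1.0):
    """Fit a ridge model (gene expression -> drug response) on a cell-line
    panel, then impute a response for each tumor from its expression.
    cl_expr: (cell lines x genes), cl_resp: (cell lines,),
    tumor_expr: (tumors x genes)."""
    mu = cl_expr.mean(axis=0)
    X = cl_expr - mu                  # center genes on the training panel
    y = cl_resp - cl_resp.mean()
    G = X.shape[1]
    # Closed-form ridge solution: (X'X + alpha I) w = X'y
    w = np.linalg.solve(X.T @ X + alpha * np.eye(G), X.T @ y)
    return (tumor_expr - mu) @ w + cl_resp.mean()
```

The imputed responses then play the role of a measured phenotype: they can be tested for association against somatic variants in the clinical cohort.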
PharmTeX: a LaTeX-Based Open-Source Platform for Automated Reporting Workflow.
Rasmussen, Christian Hove; Smith, Mike K; Ito, Kaori; Sundararajan, Vijayakumar; Magnusson, Mats O; Niclas Jonsson, E; Fostvedt, Luke; Burger, Paula; McFadyen, Lynn; Tensfeldt, Thomas G; Nicholas, Timothy
2018-03-16
Every year, the pharmaceutical industry generates a large number of scientific reports related to drug research, development, and regulatory submissions. Many of these reports are created using text processing tools such as Microsoft Word. Given the large number of figures, tables, references, and other elements, this is often a tedious task involving hours of copying and pasting and substantial efforts in quality control (QC). In the present article, we present the LaTeX-based open-source reporting platform, PharmTeX, a community-based effort to make reporting simple, reproducible, and user-friendly. The PharmTeX creators put a substantial effort into simplifying the sometimes complex elements of LaTeX into user-friendly functions that rely on advanced LaTeX and Perl code running in the background. Using this setup makes LaTeX much more accessible for users with no prior LaTeX experience. A software collection was compiled for users not wanting to manually install the required software components. The PharmTeX templates allow for inclusion of tables directly from mathematical software output, as well as figures in several formats. Code listings can be included directly from source. No previous experience and only a few hours of training are required to start writing reports using PharmTeX. PharmTeX significantly reduces the time required for creating a scientific report fully compliant with regulatory and industry expectations. QC is made much simpler, since there is a direct link between analysis output and report input. PharmTeX makes available to report authors the strengths of LaTeX document processing without the need for extensive training.
Case study: the introduction of stereoscopic games on the Sony PlayStation 3
NASA Astrophysics Data System (ADS)
Bickerstaff, Ian
2012-03-01
A free stereoscopic firmware update on Sony Computer Entertainment's PlayStation® 3 console provides the potential to increase enormously the popularity of stereoscopic 3D in the home. For this to succeed though, a large selection of content has to become available that exploits 3D in the best way possible. In addition to the existing challenges found in creating 3D movies and television programmes, the stereography must compensate for the dynamic and unpredictable environments found in games. Automatically, the software must map the depth range of the scene into the display's comfort zone, while minimising depth compression. This paper presents a range of techniques developed to solve this problem and the challenge of creating twice as many images as the 2D version without excessively compromising the frame rate or image quality. At the time of writing, over 80 stereoscopic PlayStation 3 games have been released and notable titles are used as examples to illustrate how the techniques have been adapted for different game genres. Since the firmware's introduction in 2010, the industry has matured with a large number of developers now producing increasingly sophisticated 3D content. New technologies such as viewer head tracking and head-mounted displays should increase the appeal of 3D in the home still further.
Reid, Beth; Ho, Shirley; Padmanabhan, Nikhil; ...
2015-11-17
The Baryon Oscillation Spectroscopic Survey (BOSS), part of the Sloan Digital Sky Survey (SDSS) III project, has provided the largest survey of galaxy redshifts available to date, in terms of both the number of galaxy redshifts measured by a single survey, and the effective cosmological volume covered. Key to analysing the clustering of these data to provide cosmological measurements is understanding the detailed properties of this sample. Potential issues include variations in the target catalogue caused by changes either in the targeting algorithm or properties of the data used, the pattern of spectroscopic observations, the spatial distribution of targets for which redshifts were not obtained, and variations in the target sky density due to observational systematics. We document here the target selection algorithms used to create the galaxy samples that comprise BOSS. We also present the algorithms used to create large-scale structure catalogues for the final Data Release (DR12) samples and the associated random catalogues that quantify the survey mask. The algorithms are an evolution of those used by the BOSS team to construct catalogues from earlier data, and have been designed to accurately quantify the galaxy sample. Furthermore, the code used, designated mksample, is released with this paper.
Flexible conformable hydrophobized surfaces for turbulent flow drag reduction
Brennan, Joseph C; Geraldi, Nicasio R; Morris, Robert H; Fairhurst, David J; McHale, Glen; Newton, Michael I
2015-01-01
In recent years extensive work has been focused onto using superhydrophobic surfaces for drag reduction applications. Superhydrophobic surfaces retain a gas layer, called a plastron, when submerged underwater in the Cassie-Baxter state with water in contact with the tops of surface roughness features. In this state the plastron allows slip to occur across the surface which results in a drag reduction. In this work we report flexible and relatively large area superhydrophobic surfaces produced using two different methods: Large roughness features were created by electrodeposition on copper meshes; Small roughness features were created by embedding carbon nanoparticles (soot) into Polydimethylsiloxane (PDMS). Both samples were made into cylinders with a diameter under 12 mm. To characterize the samples, scanning electron microscope (SEM) images and confocal microscope images were taken. The confocal microscope images were taken with each sample submerged in water to show the extent of the plastron. The hydrophobized electrodeposited copper mesh cylinders showed drag reductions of up to 32% when comparing the superhydrophobic state with a wetted out state. The soot covered cylinders achieved a 30% drag reduction when comparing the superhydrophobic state to a plain cylinder. These results were obtained for turbulent flows with Reynolds numbers 10,000 to 32,500. PMID:25975704
NASA Astrophysics Data System (ADS)
Kearney, K.; Aydin, K.
2016-02-01
Oceanic food webs are often depicted as network graphs, with the major organisms or functional groups displayed as nodes and the fluxes between them as the edges. However, the large number of nodes and edges and the high connectance of many management-oriented food webs, coupled with graph layout algorithms poorly suited to the desired characteristics of food web visualizations, often lead to hopelessly tangled diagrams that convey little information other than, "It's complex." Here, I combine several new graph visualization techniques, including a new node layout algorithm based on trophic similarity (a quantification of shared predators and prey) and trophic level, divided edge bundling for edge routing, and intelligent automated placement of labels, to create a much clearer visualization of the important fluxes through a food web. The technique will be used to highlight the differences in energy flow within three Alaskan Large Marine Ecosystems (the Bering Sea, Gulf of Alaska, and Aleutian Islands) that include very similar functional groups but unique energy pathways.
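As a sketch of the node-grouping idea, trophic similarity between two nodes can be quantified as the overlap of their predator and prey sets. This is a minimal Python illustration: the Jaccard form and the toy web are my own assumptions, not necessarily the exact metric used in the work above.

```python
def trophic_similarity(web, a, b):
    """Jaccard overlap of two nodes' combined prey and predator sets.
    web maps each node name to (set_of_prey, set_of_predators)."""
    prey_a, pred_a = web[a]
    prey_b, pred_b = web[b]
    links_a = {("eats", p) for p in prey_a} | {("eaten_by", p) for p in pred_a}
    links_b = {("eats", p) for p in prey_b} | {("eaten_by", p) for p in pred_b}
    if not links_a and not links_b:
        return 0.0
    return len(links_a & links_b) / len(links_a | links_b)

# Toy web: two planktivorous fish share both prey and a common predator,
# so a layout based on this measure would place them close together.
web = {
    "copepod": (set(), {"herring", "pollock"}),
    "krill":   (set(), {"herring", "pollock"}),
    "herring": ({"copepod", "krill"}, {"cod"}),
    "pollock": ({"copepod", "krill"}, {"cod"}),
    "cod":     ({"herring", "pollock"}, set()),
}
```

Nodes with identical feeding relationships score 1.0; nodes sharing no links score 0.0, giving the layout algorithm a distance to work with.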
Jang, Mi; Shim, Won Joon; Han, Gi Myung; Song, Young Kyoung; Hong, Sang Hee
2018-06-01
Fragmentation of large plastic debris into smaller particles results in increasing microplastic concentrations in the marine environment. In plastic debris fragmentation processes, the influence of biological factors remains largely unknown. This study investigated the fragmentation of expanded polystyrene (EPS) debris by polychaetes (Marphysa sanguinea) living on the debris. A large number of EPS particles (131 ± 131 particles/individual, 0.2-3.8 mm in length) were found in the digestive tracts of burrowing polychaetes living on EPS debris. To confirm the formation of microplastics by polychaetes and identify the quantity and morphology of produced microplastics, polychaetes were exposed to EPS blocks in filtered seawater under laboratory conditions. Polychaetes burrowed into the blocks and created numerous EPS microplastic particles, indicating that a single polychaete can produce hundreds of thousands of microplastic particles per year. These results reveal the potential role of marine organisms as microplastic producers in the marine environment. Copyright © 2018 Elsevier Ltd. All rights reserved.
Conformal bootstrap at large charge
NASA Astrophysics Data System (ADS)
Jafferis, Daniel; Mukhametzhanov, Baur; Zhiboedov, Alexander
2018-05-01
We consider unitary CFTs with continuous global symmetries in d > 2. We consider a state created by the lightest operator of large charge Q ≫ 1 and analyze the correlator of two light charged operators in this state. We assume that the correlator admits a well-defined large Q expansion and, relatedly, that the macroscopic (thermodynamic) limit of the correlator exists. We find that the crossing equations admit a consistent truncation, where only a finite number N of Regge trajectories contribute to the correlator at leading nontrivial order. We classify all such truncated solutions to the crossing. For one Regge trajectory N = 1, the solution is unique and given by the effective field theory of a Goldstone mode. For two or more Regge trajectories N ≥ 2, the solutions are encoded in roots of a certain degree N polynomial. Some of the solutions admit a simple weakly coupled EFT description, whereas others do not. In the weakly coupled case, each Regge trajectory corresponds to a field in the effective Lagrangian.
Suarez, Celina A; You, Hai-Lu; Suarez, Marina B; Li, Da-Qing; Trieschmann, J B
2017-11-10
Lanzhousaurus magnidens, a large non-hadrosauriform iguanodontian dinosaur from the Lower Cretaceous Hekou Group of Gansu Province, China, has the largest known herbivorous dinosaur teeth. Unlike its hadrosauriform relatives, which possessed tooth batteries of many small teeth, Lanzhousaurus utilized a small number (14) of very large teeth (~10 cm long) to create a large, continuous surface for mastication. Here we investigate the significance of Lanzhousaurus in the evolutionary history of the iguanodontian-hadrosauriform transition by using a combination of stable isotope analysis and CT imagery. We infer that Lanzhousaurus had a rapid rate of tooth enamel elongation, or amelogenesis, of 0.24 mm/day, with dental tissues common to other iguanodontian dinosaurs. Among ornithopods, high rates of amelogenesis have previously been observed in hadrosaurids, where they have been associated with a sophisticated masticatory apparatus. These data suggest that rapid amelogenesis evolved among non-hadrosauriform iguanodontians such as Lanzhousaurus, representing a crucial step that was exapted for the evolution of the hadrosaurian feeding mechanism.
Using First Differences to Reduce Inhomogeneity in Radiosonde Temperature Datasets.
NASA Astrophysics Data System (ADS)
Free, Melissa; Angell, James K.; Durre, Imke; Lanzante, John; Peterson, Thomas C.; Seidel, Dian J.
2004-11-01
The utility of a “first difference” method for producing temporally homogeneous large-scale mean time series is assessed. Starting with monthly averages, the method involves dropping data around the time of suspected discontinuities and then calculating differences in temperature from one year to the next, resulting in a time series of year-to-year differences for each month at each station. These first difference time series are then combined to form large-scale mean difference series, from which mean temperature time series are constructed. When applied to radiosonde temperature data, the method introduces random errors that decrease with the number of station time series used to create the large-scale time series and increase with the number of temporal gaps in the station time series. Root-mean-square errors for annual means of datasets produced with this method using over 500 stations are estimated at no more than 0.03 K, with errors in trends less than 0.02 K decade⁻¹ for 1960-97 at 500 mb. For a 50-station dataset, errors in trends in annual global means introduced by the first differencing procedure may be as large as 0.06 K decade⁻¹ (for six breaks per series), which is greater than the standard error of the trend. Although the first difference method offers significant resource and labor advantages over methods that attempt to adjust the data, it introduces an error in large-scale mean time series that may be unacceptable in some cases.
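The first-difference procedure described above can be sketched in a few lines. This is a minimal illustration: the function names, the single-year break removal, and anchoring the reconstructed series at zero are simplifying assumptions of mine, not the paper's implementation.

```python
import numpy as np

def first_difference_series(temps, breaks=()):
    """temps: annual-mean temperatures for one station (NaN = missing).
    Values at suspected discontinuities (`breaks`, year indices) are
    dropped before forming year-to-year differences."""
    x = np.asarray(temps, dtype=float).copy()
    for b in breaks:
        x[b] = np.nan                      # drop data at the break
    return x[1:] - x[:-1]                  # NaN wherever data are missing

def large_scale_mean_series(stations, breaks_per_station):
    """Average first-difference series across stations, then cumulatively
    sum to reconstruct a mean temperature series (anchored at 0)."""
    diffs = np.array([first_difference_series(s, b)
                      for s, b in zip(stations, breaks_per_station)])
    mean_diff = np.nanmean(diffs, axis=0)  # combine stations
    return np.concatenate([[0.0], np.cumsum(mean_diff)])
```

Flagging a discontinuity removes the differences that span it, so a spurious station-level jump never enters the large-scale mean, while the surrounding year-to-year changes are retained.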
Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications
NASA Astrophysics Data System (ADS)
Maskey, M.; Ramachandran, R.; Miller, J.
2017-12-01
Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.
A Study of the Efficiency of Spatial Indexing Methods Applied to Large Astronomical Databases
NASA Astrophysics Data System (ADS)
Donaldson, Tom; Berriman, G. Bruce; Good, John; Shiao, Bernie
2018-01-01
Spatial indexing of astronomical databases generally uses quadrature methods, which partition the sky into cells used to create an index (usually a B-tree) written as a database column. We report the results of a study to compare the performance of two common indexing methods, HTM and HEALPix, on Solaris and Windows database servers installed with a PostgreSQL database, and a Windows server installed with MS SQL Server. The indexing was applied to the 2MASS All-Sky Catalog and to the Hubble Source Catalog. On each server, the study compared indexing performance by submitting 1 million queries at each index level with random sky positions and random cone search radii, computed on a logarithmic scale between 1 arcsec and 1 degree, and measuring the time to complete the query and write the output. These simulated queries, intended to model realistic use patterns, were run in a uniform way on many combinations of indexing method and indexing level. The query times in all simulations are strongly I/O-bound and are linear with the number of records returned for large numbers of sources. There are, however, considerable differences between simulations, which reveal that hardware I/O throughput is a more important factor in managing the performance of a DBMS than the choice of indexing scheme. The choice of index itself is relatively unimportant: for comparable index levels, the performance is consistent within the scatter of the timings. At small index levels (large cells; e.g. level 4, cell size 3.7 deg), there is large scatter in the timings because of wide variations in the number of sources found in the cells. At larger index levels, performance improves and scatter decreases, but the improvement at level 8 (cell size 14 arcmin) and higher is masked to some extent by the timing scatter caused by the range of query sizes. 
At very high levels (e.g. level 20; 0.0004 arcsec), the granularity of the cells becomes so high that a large number of extraneous empty cells begin to degrade performance. Thus, for the use patterns studied here, the database performance is not critically dependent on the exact choice of index or level.
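The simulated query mix described above (random sky positions with cone radii drawn on a logarithmic scale between 1 arcsec and 1 degree) can be generated with a short helper. The function name and defaults are my own; the sampling follows the abstract's description.

```python
import math
import random

def random_cone_queries(n, r_min_arcsec=1.0, r_max_arcsec=3600.0, seed=0):
    """Generate n (ra, dec, radius) cone-search queries: positions uniform
    on the sphere, radii log-uniform between the two limits (arcsec)."""
    rng = random.Random(seed)
    queries = []
    for _ in range(n):
        ra = rng.uniform(0.0, 360.0)                           # degrees
        dec = math.degrees(math.asin(rng.uniform(-1.0, 1.0)))  # uniform on sphere
        log_r = rng.uniform(math.log10(r_min_arcsec),
                            math.log10(r_max_arcsec))
        queries.append((ra, dec, 10.0 ** log_r))               # radius in arcsec
    return queries
```

Sampling the declination via `asin` of a uniform variate keeps positions uniform on the sphere rather than piling up at the poles, which matters when the benchmark is meant to model realistic all-sky use.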
NASA Astrophysics Data System (ADS)
Warrier, M.; Bhardwaj, U.; Hemani, H.; Schneider, R.; Mutzke, A.; Valsakumar, M. C.
2015-12-01
We report on Molecular Dynamics (MD) simulations carried out in fcc Cu and bcc W using the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) code to study (i) the statistical variations in the number of interstitials and vacancies produced by energetic primary knock-on atoms (PKA) (0.1-5 keV) directed in random directions and (ii) the in-cascade cluster size distributions. It is seen that around 60-80 random directions have to be explored for the average number of displaced atoms to become steady in the case of fcc Cu, whereas for bcc W around 50-60 random directions need to be explored. The number of Frenkel pairs produced in the MD simulations is compared with that from the Binary Collision Approximation Monte Carlo (BCA-MC) code SDTRIM-SP and with the results from the NRT model. It is seen that a proper choice of the damage energy, i.e. the energy required to create a stable interstitial, is essential for the BCA-MC results to match the MD results. On the computational front, it is seen that in-situ processing avoids the need to read and write (I/O) several terabytes of atomic position data when exploring a large number of random directions, with no difference in run-time because the extra time spent processing data is offset by the time saved in I/O.
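A running-average convergence check of the kind implied above (explore random PKA directions until the mean defect count becomes steady) might look like the following sketch. Here `sample_defects` is a hypothetical stand-in for running one cascade in LAMMPS, and the tolerance and window are illustrative choices of mine.

```python
import random
import statistics

def directions_to_converge(sample_defects, tol=0.01, window=10,
                           max_dirs=200, seed=1):
    """Launch cascades in random directions until the running average of
    the defect count changes by less than `tol` (relative) over the last
    `window` directions. Returns (directions used, final average)."""
    rng = random.Random(seed)
    counts, averages = [], []
    for i in range(1, max_dirs + 1):
        counts.append(sample_defects(rng))       # one cascade's defect count
        averages.append(statistics.fmean(counts))
        if i > window:
            prev, cur = averages[-window - 1], averages[-1]
            if abs(cur - prev) <= tol * abs(cur):
                return i, cur
    return max_dirs, averages[-1]

# Stand-in sampler: defect counts scattered around 40 per cascade.
n_dirs, avg = directions_to_converge(lambda rng: rng.gauss(40.0, 5.0))
```

With a real MD backend the sampler would launch one cascade per call; the check simply formalizes "explore directions until the average becomes steady".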
The statistical power to detect cross-scale interactions at macroscales
Wagner, Tyler; Fergus, C. Emi; Stow, Craig A.; Cheruvelil, Kendra S.; Soranno, Patricia A.
2016-01-01
Macroscale studies of ecological phenomena are increasingly common because stressors such as climate and land-use change operate at large spatial and temporal scales. Cross-scale interactions (CSIs), where ecological processes operating at one spatial or temporal scale interact with processes operating at another scale, have been documented in a variety of ecosystems and contribute to complex system dynamics. However, studies investigating CSIs often depend on compiling multiple data sets from different sources to create multithematic, multiscaled data sets, which results in structurally complex, and sometimes incomplete, data sets. The statistical power to detect CSIs needs to be evaluated because of their importance and the challenge of quantifying CSIs using data sets with complex structures and missing observations. We studied this problem using a spatially hierarchical model that measures CSIs between regional agriculture and its effects on the relationship between lake nutrients and lake productivity. We used an existing large multithematic, multiscaled database, the LAke multi-scaled GeOSpatial and temporal database (LAGOS), to parameterize the power analysis simulations. We found that the power to detect CSIs was more strongly related to the number of regions in the study than to the number of lakes nested within each region. CSI power analyses will not only help ecologists design large-scale studies aimed at detecting CSIs, but will also focus attention on CSI effect sizes and the degree to which they are ecologically relevant and detectable with large data sets.
NASA Technical Reports Server (NTRS)
Miles, Jeffrey Hilton
1999-01-01
A linear spatial instability model for multiple spatially periodic supersonic rectangular jets is solved using Floquet-Bloch theory. It is assumed that in the region of interest a coherent wave can propagate. For the case studied, large spatial growth rates are found. This work is motivated by an increase in mixing found in experimental measurements of spatially periodic supersonic rectangular jets with phase-locked screech and edge tone feedback locked subsonic jets. The results obtained in this paper suggest that phase-locked screech or edge tones may produce correlated spatially periodic jet flow downstream of the nozzles, which creates a large spanwise multi-nozzle region where a coherent wave can propagate. The large spatial growth rates for eddies obtained by the model calculation herein are related to the increased mixing, since eddies are the primary mechanism that transfers energy from the mean flow to the large turbulent structures. Spatial growth rates are calculated for a set of relative Mach numbers and spacings for which experimental measurements have been made: relative Mach numbers from 1.25 to 1.75 with nozzle spacing to nozzle width ratios from s/w(sub N) = 4 to s/w(sub N) = 13.7. The model may be of significant scientific and engineering value in the quest to understand and construct supersonic mixer-ejector nozzles which provide increased mixing and reduced noise.
Kane, Van R.; North, Malcolm P.; Lutz, James A.; Churchill, Derek J.; Roberts, Susan L.; Smith, Douglas F.; McGaughey, Robert J.; Kane, Jonathan T.; Brooks, Matthew L.
2014-01-01
Mosaics of tree clumps and openings are characteristic of forests dominated by frequent, low- and moderate-severity fires. When restoring these fire-suppressed forests, managers often try to reproduce these structures to increase ecosystem resilience. We examined unburned and burned forest structures for 1937 sample areas (0.81 ha each) in Yosemite National Park, USA. We estimated severity for fires from 1984 to 2010 using the Landsat-derived Relativized differenced Normalized Burn Ratio (RdNBR) and measured openings and canopy clumps in five height strata using airborne LiDAR data. Because our study area lacked concurrent field data, we identified methods to allow structural analysis using LiDAR data alone. We found three spatial structures, canopy-gap, clump-open, and open, that differed in spatial arrangement and proportion of canopy and openings. As fire severity increased, the total area in canopy decreased while the number of clumps increased, creating a patchwork of openings and multistory tree clumps. The presence of openings > 0.3 ha, an approximate minimum gap size needed to favor shade-intolerant pine regeneration, increased rapidly with loss of canopy area. The range and variation of structures for a given fire severity were specific to each forest type. Low- to moderate-severity fires best replicated the historic clump-opening patterns that were common in forests with frequent fire regimes. 
Our results suggest that managers consider the following goals for their forest restoration: 1) reduce total canopy cover by breaking up large contiguous areas into variable-sized tree clumps and scattered large individual trees; 2) create a range of opening sizes and shapes, including ~ 50% of the open area in gaps > 0.3 ha; 3) create multistory clumps in addition to single story clumps; 4) retain historic densities of large trees; and 5) vary treatments to include canopy-gap, clump-open, and open mosaics across project areas to mimic the range of patterns found for each forest type in our study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chiu, J; Ma, L
2015-06-15
Purpose: To develop a treatment delivery and planning strategy by increasing the number of beams to minimize dose to brain tissue surrounding a target, while maximizing dose coverage to the target. Methods: We analyzed 14 different treatment plans via Leksell PFX and 4C. For standardization, single tumor cases were chosen. Original treatment plans were compared with two optimized plans. The number of beams was increased in treatment plans by varying tilt angles of the patient head, while maintaining the original isocenter and the beam positions in the x-, y- and z-axes, collimator size, and beam blocking. PFX optimized plans increased beam numbers with three pre-set tilt angles (70, 90, 110 degrees), and 4C optimized plans increased beam numbers with tilt angles varying arbitrarily over a range of 30 to 150 degrees. Optimized treatment plans were compared dosimetrically with original treatment plans. Results: Comparing total normal tissue isodose volumes between original and optimized plans, the low-level percentage isodose volumes decreased in all plans. Despite the addition of multiple beams up to a factor of 25, beam-on times for 1 tilt angle versus 3 or more tilt angles were comparable (<1 min.). In 64% (9/14) of the studied cases, the volume percentage decreased by >5%, with the highest value reaching 19%. The addition of more tilt angles correlates with a greater decrease in normal brain irradiated volume. Selectivity and coverage for original and optimized plans remained comparable. Conclusion: Adding a large number of additional focused beams with variable patient head tilt improves dose fall-off for brain radiosurgery. The study demonstrates the technical feasibility of adding beams to decrease the normal brain irradiated volume.
Experimental Investigation of Reynolds Number Effects on Test Quality in a Hypersonic Expansion Tube
NASA Astrophysics Data System (ADS)
Rossmann, Tobias; Devin, Alyssa; Shi, Wen; Verhoog, Charles
2017-11-01
Reynolds number effects on test time and on the temporal and spatial flow quality in a hypersonic expansion tube are explored using high-speed pressure, infrared optical, and Schlieren imaging measurements. Boundary layer models for shock tube flows are fairly well established and assist in the determination of test time and flow dimensions at typical high enthalpy test conditions. However, the application of these models needs to be more fully explored because of the unsteady expansion of turbulent boundary layers and the contact regions separating dissimilar gases present in expansion tube flows. Additionally, expansion tubes rely on the development of a steady jet with a large enough core-flow region at the exit of the acceleration tube to create a constant velocity region inside the test section. High-speed measurements of pressure and Mach number at several locations within the expansion tube allow the determination of an experimental x-t diagram. Comparison of the experimentally determined x-t diagram with theory highlights the Reynolds-number-dependent effects on expansion tube operation. Additionally, spatially resolved measurements of the Reynolds-number-dependent, steady core-flow in the expansion tube viewing section are shown. NSF MRI CBET #1531475, Lafayette College, McCutcheon Foundation.
Flexible method for monitoring fuel cell voltage
Mowery, Kenneth D.; Ripley, Eugene V.
2002-01-01
A method for equalizing the measured voltage of each cluster in a fuel cell stack, wherein at least one of the clusters has a different number of cells than the identical number of cells in the remaining clusters, by creating a pseudo voltage for the cluster with the different cell count. The average cell voltage of all of the cells in the fuel cell stack is calculated and multiplied by a constant equal to the difference between the number of cells in the identical cell clusters and the number of cells in the different-numbered cell cluster. The resultant product is added to the actual voltage measured across the different-numbered cell cluster to create a pseudo voltage which is equivalent in cell number to the other identical-numbered cell clusters.
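The pseudo-voltage arithmetic in this abstract is simple enough to state directly in code. This is a sketch; the function and argument names are mine.

```python
def pseudo_voltage(cluster_voltages, cells_per_cluster, odd_index):
    """Scale the measured voltage of the cluster with a different cell
    count to an equivalent full-cluster voltage, per the abstract:
    pseudo = measured + average_cell_voltage * (n_full - n_odd)."""
    avg_cell_v = sum(cluster_voltages) / sum(cells_per_cluster)
    # Cell count of any of the identical clusters:
    n_full = next(n for i, n in enumerate(cells_per_cluster) if i != odd_index)
    n_odd = cells_per_cluster[odd_index]
    return cluster_voltages[odd_index] + avg_cell_v * (n_full - n_odd)

# Stack of three clusters (10, 10, and 6 cells), every cell at 0.7 V:
# the 6-cell cluster reads 4.2 V but gets a 7.0 V pseudo voltage,
# matching the 10-cell clusters.
```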
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schanen, Michel; Marin, Oana; Zhang, Hong
Adjoints are an important computational tool for large-scale sensitivity evaluation, uncertainty quantification, and derivative-based optimization. An essential component of their performance is the storage/recomputation balance, in which efficient checkpointing methods play a key role. We introduce a novel asynchronous two-level adjoint checkpointing scheme for multistep numerical time discretizations targeted at large-scale numerical simulations. The checkpointing scheme combines bandwidth-limited disk checkpointing and binomial memory checkpointing. Based on assumptions about the target petascale systems, which we later demonstrate to be realistic on the IBM Blue Gene/Q system Mira, we create a model of the expected performance of our checkpointing approach and validate it using the highly scalable Navier-Stokes spectral-element solver Nek5000 on small to moderate subsystems of the Mira supercomputer. In turn, this allows us to predict optimal algorithmic choices when using all of Mira. We also demonstrate that two-level checkpointing is significantly superior to single-level checkpointing when adjoining a large number of time integration steps. To our knowledge, this is the first time two-level checkpointing has been designed, implemented, tuned, and demonstrated on fluid dynamics codes at large scale (50k+ cores).
Virtual reality based surgical assistance and training system for long duration space missions.
Montgomery, K; Thonier, G; Stephanides, M; Schendel, S
2001-01-01
Access to medical care during long duration space missions is extremely important. Numerous unanticipated medical problems will need to be addressed promptly and efficiently. Although telemedicine provides a convenient tool for remote diagnosis and treatment, it is impractical due to the long delay between data transmission and reception when communicating with Earth. While a well-trained surgeon-internist-astronaut would be an essential addition to the crew, the vast number of potential medical problems necessitates instant access to computerized, skill-enhancing and diagnostic tools. A functional prototype of a virtual reality based surgical training and assistance tool was created at our center, using low-power, small, lightweight components that would be easy to transport on a space mission. The system consists of a tracked, head-mounted display, a computer system, and a number of tracked surgical instruments. The software provides a real-time surgical simulation system with integrated monitoring and information retrieval and a voice input/output subsystem. Initial medical content for the system has been created, comprising craniofacial, hand, inner ear, and general anatomy, as well as information on a number of surgical procedures and techniques. One surgical specialty in particular, microsurgery, was provided as a full simulation due to its long training requirements, the significant effect of operator experience on outcomes, and the likelihood of need. However, the system is easily adapted to realistically simulate a large number of other surgical procedures. By providing a general system for surgical simulation and assistance, the astronaut-surgeon can maintain existing skills, acquire new specialty skills, and use tools for computer-based surgical planning and assistance to minimize overall crew and mission risk.
NASA Astrophysics Data System (ADS)
Cheek, Kim A.
2017-08-01
Ideas about temporal (and spatial) scale impact students' understanding across science disciplines. Learners have difficulty comprehending the long time periods associated with natural processes because they have no referent for the magnitudes involved. When people have a good "feel" for quantity, they estimate cardinal number magnitudes linearly. Magnitude estimation errors can be explained by confusion about the structure of the decimal number system, particularly in terms of how powers of ten are related to one another. Indonesian children regularly use large currency units. This study investigated whether they estimate long time periods accurately and whether they estimate those time periods the same way they estimate analogous currency units. Thirty-nine children from a private International Baccalaureate school estimated temporal magnitudes up to 10,000,000,000 years in a two-part study. Artifacts the children created were compared to theoretical model predictions previously used in number magnitude estimation studies (Landy et al., Cognitive Science 37:775-799, 2013). Over one third estimated the magnitude of time periods up to 10,000,000,000 years linearly, exceeding what would be expected based upon prior research with children this age who lack daily experience with large quantities. About half treated successive powers of ten as a count sequence instead of as multiplicatively related when estimating magnitudes of time periods. Children generally estimated the magnitudes of long time periods and familiar, analogous currency units the same way. Implications for ways to improve the teaching and learning of this crosscutting concept/overarching idea are discussed.
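The two estimation patterns contrasted above, linear placement versus treating successive powers of ten as equally spaced steps, can be illustrated with a toy number-line model. This is a simplification of the models used in the magnitude-estimation literature, with function names of my own.

```python
import math

def linear_position(value, max_value):
    """Accurate (linear) placement of `value` on a 0-to-1 number line
    whose right endpoint is max_value."""
    return value / max_value

def count_sequence_position(value, max_value):
    """Placement if successive powers of ten are treated as a count
    sequence (equally spaced), which amounts to a logarithmic mapping."""
    return math.log10(value) / math.log10(max_value)

# Placing 10 million years on a line ending at 10 billion years:
# linear placement puts it at 0.1% of the line, while the
# count-sequence pattern puts it at 70%.
```

The gap between the two placements is what makes the response pattern diagnosable from the artifacts children produce.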
Medical and Scientific Evaluations aboard the KC-135. Microgravity-Compatible Flow Cytometer
NASA Technical Reports Server (NTRS)
Crucian, Brian; Nelman-Gonzalez, Mayra; Sams, Clarence
2005-01-01
A spaceflight-compatible flow cytometer would be useful for the diagnosis of astronaut illness during long duration spaceflight and for conducting in-flight research to evaluate the effects of microgravity on human physiology. Until recently, the primary limitations preventing the development of a spaceflight compatible flow cytometer have been largely mechanical. Standard commercially available flow cytometers are large, complex instruments that use high-energy lasers and require significant training to operate. Standard flow cytometers function by suspending the particles to be analyzed inside a sheath fluid for analysis. This requires the presence of several liters of sheath fluid for operation, and generates a corresponding amount of liquid hazardous waste. The particles are then passed through a flow cell which uses the fluid mechanical property of hydrodynamic focusing to place the cells in single-file (laminar flow) as they pass through a laser beam for scanning and evaluation. Many spaceflight experiments have demonstrated that fluid physics is dramatically altered in microgravity (MSF [Manned Space Flight] Fluid Physics Data Sheet-August 1997) and previous studies have shown that sheath-fluid based hydrodynamic focusing may also be altered during microgravity (Crucian et al, 2000). For these reasons it is likely that any spaceflight compatible design for a flow cytometer would abandon the sheath fluid requirement. The elimination of sheath fluid would remove both the problems of weight associated with large volumes of liquids as well as the large volume of liquid waste generated. It would also create the need for a method to create laminar particle flow distinct from the standard sheath-fluid based method. The spaceflight prototype instrument is based on a recently developed commercial flow cytometer possessing a novel flow cell design that creates single-particle laser scanning and evaluation without the need for sheath-fluid based hydrodynamic focusing. 
This instrument also possesses a number of design advances that make it conditionally microgravity compatible: it is highly miniaturized and lightweight, uses a low energy diode laser, has a small number of moving parts, does not use sheath fluid and does not generate significant liquid waste. Although possessing certain limitations, the commercial cytometer functions operationally like a standard bench top laboratory flow cytometer, aspirating liquid particle samples and generating histogram or dot-plot data in standard FCS file format. In its current configuration however, the cytometer is limited to three parameter/two-color capability (two color PMTs + forward scatter), does not allow compensation between colors, does not allow linear analysis and is operated by rather inflexible software with limited capabilities. This is due to the fact that the cytometer has been designed and marketed as an instrument specific to a few particular assays, not as a multipurpose cytometer.
Shock wave loading of a magnetic guide
NASA Astrophysics Data System (ADS)
Kindt, L.
2011-10-01
The atom laser has long been a holy grail within atomic physics, and with the creation of an atom laser we hope to bring a similar revolution to the field of atom optics. With the creation of the Bose-Einstein Condensate (BEC) in 1995, the path to an atom laser was initiated. An atom laser is a continuous source of BEC. In a Bose condensate all the atoms occupy the same quantum state and can be described by the same wave function and phase. With an atom laser the de Broglie wavelength of the atoms can be much smaller than the wavelength of light. Due to this ultimate control over the atoms, the atom laser is very interesting for atom optics, lithography, metrology, etching, and deposition of atoms on a surface. All previous atom lasers have been created from atoms coupled out of an existing Bose-Einstein Condensate. There are different approaches, but common to them all is that the duration of the output is limited by the size of the initial BEC, and they all have a low flux. This leaves the quest to build a continuous high-flux atom laser. An alternative approach to a continuous BEC beam is to channel a continuous ultracold atomic beam into a magnetic guide and then cool this beam down to degeneracy. Cooling a continuous beam of atoms faces three large problems: the collision rate has to be large enough for effective rethermalization; evaporative cooling in 2D is not as effective as in 3D; and a large thermal conductivity due to atoms with high angular momentum causes heating downstream in the guide. We have built a 4 meter magnetic guide that is placed on a downward slope with a magnetic barrier at the end. Into the guide we load packets of ultracold rubidium atoms at a repetition rate high enough for the packets to merge together to form a continuous atomic beam. The atomic beam is supersonic, and when it reaches the end barrier it returns and collides with itself. 
The collisions lower the velocity of the beam to subsonic values, and a shock wave is created between the two velocity regions. To conserve particle number, momentum, and enthalpy, the density of the atomic beam passing through the shock wave must increase. We have built such a shock wave in an atomic beam and observed the resulting density increase. As an extra feature, having a subsonic beam on a downward slope adds a further density increase due to gravitational compression. Loading ultracold atoms into a 3D trap from the dense subsonic beam overcomes the problems of 2D cooling and thermal conductivity. This was done, and evaporative cooling was applied, creating a rubidium BEC with an unprecedentedly large atom number.
Origin-Dependent Inverted-Repeat Amplification: Tests of a Model for Inverted DNA Amplification
Brewer, Bonita J.; Payen, Celia; Di Rienzi, Sara C.; Higgins, Megan M.; Ong, Giang; Dunham, Maitreya J.; Raghuraman, M. K.
2015-01-01
DNA replication errors are a major driver of evolution—from single nucleotide polymorphisms to large-scale copy number variations (CNVs). Here we test a specific replication-based model to explain the generation of interstitial, inverted triplications. While no genetic information is lost, the novel inversion junctions and increased copy number of the included sequences create the potential for adaptive phenotypes. The model—Origin-Dependent Inverted-Repeat Amplification (ODIRA)—proposes that a replication error at pre-existing short, interrupted, inverted repeats in genomic sequences generates an extrachromosomal, inverted dimeric, autonomously replicating intermediate; subsequent genomic integration of the dimer yields this class of CNV without loss of distal chromosomal sequences. We used a combination of in vitro and in vivo approaches to test the feasibility of the proposed replication error and its downstream consequences on chromosome structure in the yeast Saccharomyces cerevisiae. We show that the proposed replication error—the ligation of leading and lagging nascent strands to create “closed” forks—can occur in vitro at short, interrupted inverted repeats. The removal of molecules with two closed forks results in a hairpin-capped linear duplex that we show replicates in vivo to create an inverted, dimeric plasmid that subsequently integrates into the genome by homologous recombination, creating an inverted triplication. While other models have been proposed to explain inverted triplications and their derivatives, our model can also explain the generation of human, de novo, inverted amplicons that have a 2:1 mixture of sequences from both homologues of a single parent—a feature readily explained by a plasmid intermediate that arises from one homologue and integrates into the other homologue prior to meiosis. 
Our tests of key features of ODIRA lend support to this mechanism and suggest further avenues of enquiry to unravel the origins of interstitial, inverted CNVs pivotal in human health and evolution. PMID:26700858
International Design Concepts for the SKA
NASA Astrophysics Data System (ADS)
Tarter, J.
2001-12-01
In August of 2000, representatives of eleven countries signed a Memorandum of Understanding to establish the International Square Kilometre Array Steering Committee (ISSC). Arguably, the SKA could be built today, but without question it would be unaffordable. Increasing collecting area by a factor of 100 beyond today's largest array cannot be done cost-effectively by simple extensions of what has been done before. New concepts, new designs, and new technologies will be required, as well as a paradigm shift: it will be necessary to heavily exploit emerging communications and consumer-market technologies, to "hammer" them into the shapes required to solve the SKA challenges rather than inventing our own solutions from scratch. Or, if we do invent ab initio solutions, we should look at creating consumer markets to embrace them, so that the full benefits of mass production and manufacturing can be realized. The strawman science goals of the SKA are extremely ambitious. Today there are six primary design concepts being studied that attempt to meet some or all of these goals: phased arrays of active elements embedded into flat tiles; "super Arecibo" antennas constructed in individual limestone karst sinkholes and arrayed together; large arrays of small, spherical (or hemispherical) Luneberg lenses; large deformable apertures with long focal ratios and aerostat-borne focal-plane-array receivers; arrays of large parabolic antennas constructed from steel "ropes"; and large arrays of small parabolic dishes derived from the TVRO industry. This talk summarizes the strengths and weaknesses of these various designs in their current, incomplete state. In the US, the US SKA Consortium of 10 academic and research organizations has generated a roadmap to guide and assess the technology development that will be required to produce a successful SKA design, with well-understood costs, performance, and minimal risk.
The design and construction efforts for the ATA, LOFAR and the EVLA will provide essential opportunities for proofs-of-concept for portions of the preferred US design; a very large number of small elements configured into a Large-N number of stations. An aggressive timetable has been adopted for choosing a final (hybrid?) SKA design and the selection of a site, with a target date of 2005. The first, tentative steps have been taken to create an international project office capable of overseeing the development and construction of this facility, negotiating creative solutions to problems of radio frequency interference, and along the way, inventing the infrastructure and management appropriate to this "born international" venture.
Vascular surgical data registries for small computers.
Kaufman, J L; Rosenberg, N
1984-08-01
Recent designs for computer-based vascular surgical registries and clinical data bases have employed large centralized systems with formal programming and mass storage. Small computers, of the types created for office use or for word processing, now contain sufficient speed and memory storage capacity to allow construction of decentralized office-based registries. Using a standardized dictionary of terms and a method of data organization adapted to word processing, we have created a new vascular surgery data registry, "VASREG." Data files are organized without programming, and a limited number of powerful logical statements in English are used for sorting. The capacity is 25,000 records with current inexpensive memory technology. VASREG is adaptable to computers made by a variety of manufacturers, and interface programs are available for conversion of the word-processor-formatted registry data into forms suitable for analysis by programs written in a standard programming language. This is a low-cost clinical data registry available to any physician. With a standardized dictionary, preparation of regional and national statistical summaries may be facilitated.
Solenoid for Laser Induced Plasma Experiments at Janus
NASA Astrophysics Data System (ADS)
Klein, Sallee; Leferve, Heath; Kemp, Gregory; Mariscal, Derek; Rasmus, Alex; Williams, Jackson; Gillespie, Robb; Manuel, Mario; Kuranz, Carolyn; Keiter, Paul; Drake, R.
2017-10-01
Creating invariant magnetic fields for experiments involving laser induced plasmas is particularly challenging due to the high voltages at which the solenoid must be pulsed. Creating a solenoid resilient enough to survive through large numbers of voltage discharges, enabling it to endure a campaign lasting several weeks, is exceptionally difficult. Here we present a solenoid that is robust through 40 μs pulses at a 13 kV potential. This solenoid is a vast improvement over our previously fielded designs in peak magnetic field capabilities and robustness. Designed to be operated at small-scale laser facilities, the solenoid housing allows for versatility of experimental set-ups among diagnostic and target positions. Within the perpendicular field axis at the center there is 300 degrees of clearance which can be easily modified to meet the needs of a specific experiment, as well as an f/3 cone for transmitted or backscattered light. After initial design efforts, these solenoids are relatively inexpensive to manufacture.
Yang, Jianji J.; Cohen, Aaron M.; McDonagh, Marian S.
2008-01-01
Automatic document classification can be valuable in increasing the efficiency in updating systematic reviews (SR). In order for the machine learning process to work well, it is critical to create and maintain high-quality training datasets consisting of expert SR inclusion/exclusion decisions. This task can be laborious, especially when the number of topics is large and source data format is inconsistent. To approach this problem, we build an automated system to streamline the required steps, from initial notification of update in source annotation files to loading the data warehouse, along with a web interface to monitor the status of each topic. In our current collection of 26 SR topics, we were able to standardize almost all of the relevance judgments and recovered PMIDs for over 80% of all articles. Of those PMIDs, over 99% were correct in a manual random sample study. Our system performs an essential function in creating training and evaluation datasets for SR text mining research. PMID:18999194
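The standardization and PMID-validation steps described above could be sketched as follows; this is an illustrative toy, not the authors' actual pipeline (the label vocabulary and the syntactic PMID check are assumptions):

```python
import re

# Assumed mapping from heterogeneous labels found in source annotation files
# to a standard include/exclude vocabulary (the labels are illustrative).
LABEL_MAP = {"included": "include", "include": "include", "yes": "include",
             "excluded": "exclude", "exclude": "exclude", "no": "exclude"}

def standardize(judgment: str) -> str:
    """Map a free-text inclusion/exclusion decision onto the standard vocabulary."""
    key = judgment.strip().lower()
    if key not in LABEL_MAP:
        raise ValueError(f"unmapped judgment: {judgment!r}")
    return LABEL_MAP[key]

def looks_like_pmid(s: str) -> bool:
    """PMIDs are plain positive integers; a cheap syntactic check before any lookup."""
    return bool(re.fullmatch(r"[1-9]\d{0,7}", s.strip()))

print(standardize(" Included "), looks_like_pmid("18999194"))  # → include True
```

A real system would additionally resolve articles to PMIDs via a bibliographic lookup service; only records passing both steps would be loaded into the training data warehouse.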
[Insert Your Science Here] Week: Creating science-driven public awareness campaigns
NASA Astrophysics Data System (ADS)
Mattson, Barbara; Mitchell, Sara; McElvery, Raleigh; Reddy, Francis; Wiessinger, Scott; Skelly, Clare; Saravia, Claire; Straughn, Amber N.; Washington, Dewayne
2018-01-01
NASA Goddard’s in-house Astrophysics Communications Team is responsible for facilitating the production of traditional and social media products to provide understanding and inspiration about NASA’s astrophysics missions and discoveries. Our team is largely driven by the scientific news cycle of launches, mission milestones, anniversaries, and discoveries, which can leave a number of topics behind, waiting for a discovery to be highlighted. These overlooked topics include compelling stories about ongoing research, underlying science, and science not tied to a specific mission. Looking for a way to boost coverage of these unsung topics, we struck upon the idea of creating “theme weeks” that bring together the broader scientific community around a topic, object, or scientific concept. This poster will present the first two of our Goddard-led theme weeks: Pulsar Week and Dark Energy Week. We will describe the efforts involved, our metrics, and the benefits and challenges we encountered. We will also suggest a template for doing this for your own science, based on our successes.
A knowledge discovery object model API for Java
Zuyderduyn, Scott D; Jones, Steven JM
2003-01-01
Background Biological data resources have become heterogeneous and derive from multiple sources. This introduces challenges in the management and utilization of this data in software development. Although efforts are underway to create a standard format for the transmission and storage of biological data, this objective has yet to be fully realized. Results This work describes an application programming interface (API) that provides a framework for developing an effective biological knowledge ontology for Java-based software projects. The API provides a robust framework for the data acquisition and management needs of an ontology implementation. In addition, the API contains classes to assist in creating GUIs to represent this data visually. Conclusions The Knowledge Discovery Object Model (KDOM) API is particularly useful for medium to large applications, or for a number of smaller software projects with common characteristics or objectives. KDOM can be coupled effectively with other biologically relevant APIs and classes. Source code, libraries, documentation and examples are available at . PMID:14583100
Creating Value with Long Term R&D: The life science industry
NASA Astrophysics Data System (ADS)
Soloman, Darlene J. S.
2008-03-01
Agilent Laboratories looks to the future to identify, invest in, and enable technologies and applications that will nurture the world’s people, environment, and economies, and help ensure Agilent’s continuing leadership. Following a brief introduction to Agilent Technologies and Agilent Laboratories, Solomon will discuss how innovation and long-term R&D are transcending traditional boundaries. Focusing on the life sciences industry, she will discuss current trends in R&D and the importance of measurement in advancing the industry. She will describe some of the challenges that are disrupting the pharmaceutical industry, where significant and sustained investment in R&D has not translated into large numbers of blockbuster therapeutics. Much of this gap results from the profound complexity of biological systems. New discoveries quickly generate new questions, which in turn drive more research and necessitate new business models. Solomon will highlight examples of Agilent’s long-range R&D in life sciences, emphasizing the importance of physics. She will conclude with the importance of creating sustainable value with R&D.
Exiling children, creating orphans: when immigration policies hurt citizens.
Zayas, Luis H; Bradlee, Mollie H
2014-04-01
Citizen-children born in the United States to undocumented immigrants have become collateral damage of immigration enforcement. These children suffer the effects of immigration laws designed to deport large numbers of people. In removal proceedings, parents often must decide to either leave their citizen-children behind in the care of others or take them to a country the child may have never known. Accordingly, immigration policy frequently creates two de facto classes of children: exiles and orphans. In discussing these classes, the authors offer a summary of how U.S. citizen-children come into contact with the immigration enforcement system. The article explores the impact of detention and deportation on the health, mental health, and developmental trajectories of citizen-children and argues for reforms in policy and practice that will adhere to the highest standards of child welfare practice. By integrating these children into the immigration discourse, practitioners and policymakers will be better able to understand the effects of immigration enforcement, reduce harm to children, and provide for the protection of their rights.
van Mierlo, Trevor; Li, Xinlong; Hyatt, Douglas; Ching, Andrew T
2017-02-17
Digital health social networks (DHSNs) are widespread, and the consensus is that they contribute to wellness by offering social support and knowledge sharing. The success of a DHSN is based on the number of participants and their consistent creation of externalities through the generation of new content. To promote network growth, it would be helpful to identify characteristics of superusers, or actors who create value by generating positive network externalities. The aim of the study was to investigate the feasibility of developing predictive models that identify potential superusers in real time. This study examined associations between posting behavior, 4 demographic variables, and 20 indication-specific variables. Data were extracted from the custom structured query language (SQL) databases of 4 digital health behavior change interventions with DHSNs. Of these, 2 were designed to assist in the treatment of addictions (problem drinking and smoking cessation), and 2 for mental health (depressive disorder, panic disorder). To analyze posting behavior, 10 models were developed, and negative binomial regressions were conducted to examine associations between the number of posts and the demographic and indication-specific variables. The DHSNs varied in number of days active (3658-5210), number of registrants (5049-52,396), number of actors (1085-8452), and number of posts (16,231-521,997). In the sample, all 10 models had low R² values (.013-.086) with limited statistically significant demographic and indication-specific variables. Very few variables were associated with social network engagement. Although some variables were statistically significant, they did not appear to be practically significant. Based on the large number of study participants, the variation in DHSN theme, and the extensive time period, we did not find strong evidence that demographic characteristics or indication severity sufficiently explain the variability in number of posts per actor.
Researchers should investigate alternative models that identify superusers or other individuals who create social network externalities. ©Trevor van Mierlo, Xinlong Li, Douglas Hyatt, Andrew T Ching. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 17.02.2017.
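The study's actual models are negative binomial regressions; as a much simpler illustration of the "identify potential superusers in real time" goal, one could flag the most active actors by running post counts through a top-fraction threshold (the threshold, data, and function name are hypothetical, not the study's method):

```python
from collections import Counter

def flag_superusers(post_authors, top_fraction=0.1):
    """Rank actors by post count and flag the top fraction as candidate superusers."""
    counts = Counter(post_authors)
    ranked = sorted(counts, key=counts.get, reverse=True)
    k = max(1, int(len(ranked) * top_fraction))
    return ranked[:k]

# Toy network: one highly active actor among ten.
posts = ["ann"] * 40 + ["bob"] * 5 + ["cat"] * 3 + \
        ["dan", "eve", "fay", "gus", "hal", "ivy", "joe"]
print(flag_superusers(posts))  # → ['ann']
```

A count-based cutoff like this identifies superusers only after the fact; the point of the paper is precisely that demographic and indication variables did not predict such actors in advance.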
Shanower, G A; Kantor, G J
1997-11-01
Xeroderma pigmentosum group C cells repair DNA damaged by ultraviolet radiation in an unusual pattern throughout the genome. They remove cyclobutane pyrimidine dimers only from the DNA of transcriptionally active chromatin regions and only from the transcribed strand. The repair proceeds in a manner that creates damage-free islands which are in some cases much larger than the active gene associated with them. For example, the small transcriptionally active beta-actin gene (3.5 kb) is repaired as part of a 50 kb single-stranded region. The repair responsible for creating these islands requires active transcription, suggesting that the two activities are coupled. A preferential repair pathway in normal human cells promotes repair of actively transcribed DNA strands and is coupled to transcription. It is not known if similar large islands, referred to as repair domains, are preferentially created as a result of the coupling. Data are presented showing that in normal cells, preferential repair in the beta-actin region is associated with the creation of a large, completely repaired region in the partially repaired genome. Repair at other genomic locations which contain inactive genes (insulin, 754) does not create similar large regions as quickly. In contrast, repair in Cockayne syndrome cells, which are defective in the preferential repair pathway but not in genome-overall repair, proceeds in the beta-actin region by a mechanism which does not preferentially create a large repaired region. Thus a correlation between the activity required to preferentially repair active genes and that required to create repaired domains is detected. We propose an involvement of the transcription-repair coupling factor in a coordinated repair pathway for removing DNA damage from entire transcription units.
Generalizing ecological site concepts of the Colorado Plateau for landscape-level applications
Duniway, Michael C.; Nauman, Travis; Johanson, Jamin K.; Green, Shane; Miller, Mark E.; Bestelmeyer, Brandon T.
2016-01-01
Numerous ecological site descriptions in the southern Utah portion of the Colorado Plateau can be difficult to navigate, so we held a workshop aimed at adding value and functionality to the current ecological site system. We created new groups of ecological sites and drafted state-and-transition models for these new groups. We were able to distill the current large number of ecological sites in the study area (ca. 150) into eight ecological site groups that capture important variability in ecosystem dynamics. Several inventory and monitoring programs and landscape-scale planning actions will likely benefit from more generalized ecological site group concepts.
NASA Technical Reports Server (NTRS)
Grunes, Mitchell R.; Choi, Junho
1995-01-01
We are in the preliminary stages of creating an operational system for losslessly compressing packet data streams. The end goal is to reduce costs. Real world constraints include transmission in the presence of error, tradeoffs between the costs of compression and the costs of transmission and storage, and imperfect knowledge of the data streams to be transmitted. The overall method is to bring together packets of similar type, split the data into bit fields, and test a large number of compression algorithms. Preliminary results are very encouraging, typically offering compression factors substantially higher than those obtained with simpler generic byte stream compressors, such as Unix Compress and HA 0.98.
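The "test a large number of compression algorithms" step can be sketched with Python's standard-library codecs standing in for the authors' algorithm suite (the codec set and the telemetry-like field are illustrative assumptions, not the NTRS system's actual components):

```python
import bz2
import lzma
import zlib

def best_compressor(data: bytes):
    """Try several lossless codecs on one bit-field's bytes and
    return (codec_name, compression_factor) for the smallest output."""
    candidates = {
        "zlib": zlib.compress(data, 9),
        "bz2": bz2.compress(data, 9),
        "lzma": lzma.compress(data),
    }
    name, out = min(candidates.items(), key=lambda kv: len(kv[1]))
    return name, len(data) / len(out)

# A highly repetitive telemetry-like field compresses far better than
# random-looking bytes, which is why per-field codec selection pays off.
field = b"\x00\x01" * 4096
name, ratio = best_compressor(field)
print(name, round(ratio, 1))
```

In the system described, the same idea would be applied per packet type and per bit field, with the winning codec recorded so the decompressor can invert the choice.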
Alcohol advertising bans and alcohol abuse.
Young, D J
1993-07-01
Henry Saffer [Saffer (1991) Journal of Health Economics 10, 65-79] concludes that bans on broadcast advertising for alcoholic beverages reduce total alcohol consumption, motor vehicle fatalities, and cirrhosis deaths. A reexamination of his data and procedures reveals a number of flaws. First, there is evidence of reverse causation: countries with low consumption/death rates tend to adopt advertising bans, creating a (spurious) negative correlation between bans and consumption/death rates. Second, even this correlation largely disappears when the estimates are corrected for serial correlation. Third, estimates based on the components of consumption--spirits, beer and wine--mostly indicate that bans are associated with increased consumption.
Overview of Animal Models of Obesity
Lutz, Thomas A.; Woods, Stephen C.
2012-01-01
This is a review of animal models of obesity currently used in research. We have focused upon more commonly utilized models since there are far too many newly created models to consider, especially those caused by selective molecular genetic approaches modifying one or more genes in specific populations of cells. Further, we will not discuss the generation and use of inducible transgenic animals (induced knock-out or knock-in) even though they often bear significant advantages compared to traditional transgenic animals; influences of the genetic modification during the development of the animals can be minimized. The number of these animal models is simply too large to be covered in this chapter. PMID:22948848
From integrative bioethics to pseudoscience.
Bracanović, Tomislav
2012-12-01
Integrative bioethics is a brand of bioethics conceived and propagated by a group of Croatian philosophers and other scholars. This article discusses and shows that the approach encounters several serious difficulties. In criticizing certain standard views on bioethics and in presenting their own, the advocates of integrative bioethics fall into various conceptual confusions and inconsistencies. Although presented as a project that promises to deal with moral dilemmas created by modern science and technology, integrative bioethics does not contain the slightest normativity or action-guiding capacity. Portrayed as a scientific and interdisciplinary enterprise, integrative bioethics displays a large number of pseudoscientific features that throw into doubt its overall credibility. © 2012 Blackwell Publishing Ltd.
Toward a 3D video format for auto-stereoscopic displays
NASA Astrophysics Data System (ADS)
Vetro, Anthony; Yea, Sehoon; Smolic, Aljoscha
2008-08-01
There has been increased momentum recently in the production of 3D content for cinema applications; for the most part, this has been limited to stereo content. There are also a variety of display technologies on the market that support 3DTV, each offering a different viewing experience and having different input requirements. More specifically, stereoscopic displays support stereo content and require glasses, while auto-stereoscopic displays avoid the need for glasses by rendering view-dependent stereo pairs for a multitude of viewing angles. To realize high quality auto-stereoscopic displays, multiple views of the video must either be provided as input to the display, or these views must be created locally at the display. The former approach has difficulties in that the production environment is typically limited to stereo, and transmission bandwidth for a large number of views is not likely to be available. This paper discusses an emerging 3D data format that enables the latter approach to be realized. A new framework for efficiently representing a 3D scene and enabling the reconstruction of an arbitrarily large number of views prior to rendering is introduced. Several design challenges are also highlighted through experimental results.
High performance photonic ADC for space applications
NASA Astrophysics Data System (ADS)
Pantoja, S.; Piqueras, M. A.; Villalba, P.; Martínez, B.; Rico, E.
2017-11-01
The flexibility required for future telecom payloads will require more digital processing capabilities, moving from conventional analogue repeaters to more advanced and efficient analog subsystems or DSP-based solutions. Aggregate data throughputs will have to be handled onboard, creating the need for effective, high-speed ADC/DSP and DSP/DAC links. Broadband payloads will have to receive, route, and retransmit hundreds of channels, and need to be designed so as to meet such requirements of larger bandwidth, system transparency, and flexibility [1][2]. One important device in these new architectures is the analog-to-digital converter (ADC) and its counterpart, the digital-to-analog converter (DAC). These will be the input/output interface for the use of digital processing in order to provide flexible beam-to-beam connectivity and variable bandwidth allocation. For telecom payloads having a large number of feeds, and thus a large number of converters, the mass and consumption of the mixer stage has become significant. Moreover, the inclusion of ADCs in the payload presents new trade-offs in design (jitter, quantization noise, ambiguity). This paper deals with an alternative solution to these two main problems through the exploitation of photonic techniques.
Two Non Linear Dynamics Plasma Astrophysics Experiments At LANL
NASA Astrophysics Data System (ADS)
Intrator, T.; Weber, T.; Feng, Y.; Sears, J.; Smith, R. J.; Swan, H.; Hutchinson, T.; Boguski, J.; Gao, K.; Chapdelaine, L.; Dunn, J. P.
2013-12-01
Two laboratory experiments at Los Alamos National Laboratory (LANL) have been built to gain access to a wide range of fundamental plasma physics issues germane to astrophysical, space, and fusion plasmas. The overarching theme is magnetized plasma dynamics, including currents, MHD forces and instabilities, sheared flows and shocks, along with the creation and annihilation of magnetic field. The Relaxation Scaling Experiment (RSX) creates current sheets and flux ropes that exhibit fully 3D dynamics and are observed to kink, bounce, merge and reconnect, shred, and reform in complicated ways. We show recent movies from a large, detailed data set that describe the 3D magnetic structure and helicity budget of a driven and dissipative system that spontaneously self-saturates a kink instability. The Magnetized Shock Experiment (MSX) uses a field-reversed configuration (FRC) that is ejected at high speed and then stagnated against a stopping mirror field, which drives a collisionless magnetized shock. A plasmoid accelerator will also access supercritical shocks at much larger Alfvén Mach numbers. Unique features include access to parallel, oblique, and perpendicular shocks in regions much larger than the ion gyroradius and inertial length, large magnetic and fluid Reynolds numbers, and volume for turbulence.
NASA Astrophysics Data System (ADS)
Yeom, Jae Min; Yum, Seong Soo; Liu, Yangang; Lu, Chunsong
2017-09-01
Entrainment and mixing processes and their effects on cloud microphysics in the continental stratocumulus clouds observed in Oklahoma during the RACORO campaign are analyzed in the frame of homogeneous and inhomogeneous mixing concepts by combining the approaches of microphysical correlation, mixing diagram, and transition scale (number). A total of 110 horizontally penetrated cloud segments is analyzed. Mixing diagram and cloud microphysical relationship analyses show homogeneous mixing trait of positive relationship between liquid water content (L) and mean volume of droplets (V) (i.e., smaller droplets in more diluted parcel) in most cloud segments. Relatively small temperature and humidity differences between the entraining air from above the cloud top and cloudy air and relatively large turbulent dissipation rate are found to be responsible for this finding. The related scale parameters (i.e., transition length and transition scale number) are relatively large, which also indicates high likelihood of homogeneous mixing. Clear positive relationship between L and vertical velocity (W) for some cloud segments is suggested to be evidence of vertical circulation mixing, which may further enhance the positive relationship between L and V created by homogeneous mixing.
The origin and emergence of life under impact bombardment.
Cockell, Charles S
2006-10-29
Craters formed by asteroids and comets offer a number of possibilities as sites for prebiotic chemistry, and they invite a literal application of Darwin's 'warm little pond'. Some of these attributes, such as prolonged circulation of heated water, are found in deep-ocean hydrothermal vent systems, previously proposed as sites for prebiotic chemistry. However, impact craters host important characteristics in a single location, which include the formation of diverse metal sulphides, clays and zeolites as secondary hydrothermal minerals (which can act as templates or catalysts for prebiotic syntheses), fracturing of rock during impact (creating a large surface area for reactions), the delivery of iron in the case of the impact of iron-containing meteorites (which might itself act as a substrate for prebiotic reactions), diverse impact energies resulting in different rates of hydrothermal cooling and thus organic syntheses, and the indiscriminate nature of impacts into every available lithology, generating large numbers of 'experiments' in the origin of life. Following the evolution of life, craters provide cryptoendolithic and chasmoendolithic habitats, particularly in non-sedimentary lithologies, where limited pore space would otherwise restrict colonization. In impact melt sheets, shattered, mixed rocks ultimately provided diverse geochemical gradients, which in present-day craters support the growth of microbial communities.
EventThread: Visual Summarization and Stage Analysis of Event Sequence Data.
Guo, Shunan; Xu, Ke; Zhao, Rongwen; Gotz, David; Zha, Hongyuan; Cao, Nan
2018-01-01
Event sequence data such as electronic health records, a person's academic records, or car service records, are ordered series of events which have occurred over a period of time. Analyzing collections of event sequences can reveal common or semantically important sequential patterns. For example, event sequence analysis might reveal frequently used care plans for treating a disease, typical publishing patterns of professors, and the patterns of service that result in a well-maintained car. It is challenging, however, to visually explore large numbers of event sequences, or sequences with large numbers of event types. Existing methods focus on extracting explicitly matching patterns of events using statistical analysis to create stages of event progression over time. However, these methods fail to capture latent clusters of similar but not identical evolutions of event sequences. In this paper, we introduce a novel visualization system named EventThread which clusters event sequences into threads based on tensor analysis and visualizes the latent stage categories and evolution patterns by interactively grouping the threads by similarity into time-specific clusters. We demonstrate the effectiveness of EventThread through usage scenarios in three different application domains and via interviews with an expert user.
NASA Astrophysics Data System (ADS)
Tajima, T.; Nakajima, K.; Mourou, G.
2017-02-01
The fundamental idea of Laser Wakefield Acceleration (LWFA) is reviewed. An ultrafast intense laser pulse drives a coherent wakefield with a relativistic amplitude robustly supported by the plasma. While the large amplitude of the wakefield involves collective resonant oscillations of the eigenmode of the entire plasma electrons, the wake phase velocity (~c) and the ultrafastness of the laser pulse give the wake its stability and rigidity. A large number of experiments worldwide show rapid progress in realizing this concept, toward both the high-energy accelerator prospect and broad applications. The strong interest in this field has spurred and stimulated novel laser technologies, including Chirped Pulse Amplification, Thin Film Compression, the Coherent Amplification Network, and Relativistic Mirror Compression. These in turn have combined with LWFA to form a new genre of high-field science, with many figures of merit in this field increasing exponentially in recent years. This science has triggered a number of research centers and initiatives worldwide. The associated physics of ion acceleration, X-ray generation, and astrophysical processes of ultrahigh-energy cosmic rays is reviewed. Applications such as X-ray free-electron lasers, cancer therapy, and radioisotope production are considered. A new avenue of LWFA using nanomaterials is also emerging.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yeom, Jae Min; Yum, Seong Soo; Liu, Yangang
Entrainment and mixing processes and their effects on cloud microphysics in the continental stratocumulus clouds observed in Oklahoma during the RACORO campaign are analyzed in the frame of homogeneous and inhomogeneous mixing concepts by combining the approaches of microphysical correlation, mixing diagram, and transition scale (number). A total of 110 horizontally penetrated cloud segments is analyzed in this paper. Mixing diagram and cloud microphysical relationship analyses show homogeneous mixing trait of positive relationship between liquid water content (L) and mean volume of droplets (V) (i.e., smaller droplets in more diluted parcel) in most cloud segments. Relatively small temperature and humidity differences between the entraining air from above the cloud top and cloudy air and relatively large turbulent dissipation rate are found to be responsible for this finding. The related scale parameters (i.e., transition length and transition scale number) are relatively large, which also indicates high likelihood of homogeneous mixing. Finally, clear positive relationship between L and vertical velocity (W) for some cloud segments is suggested to be evidence of vertical circulation mixing, which may further enhance the positive relationship between L and V created by homogeneous mixing.
Ding, Wei; Li, Li; Xiong, Kun; Wang, Yao; Li, Wei; Nie, Yao; Chen, Siguo; Qi, Xueqiang; Wei, Zidong
2015-04-29
Herein, we report a "shape fixing via salt recrystallization" method to efficiently synthesize nitrogen-doped carbon material with a large number of active sites exposed to the three-phase zones, for use as an ORR catalyst. Self-assembled polyaniline with a 3D network structure was fixed and fully sealed inside NaCl via recrystallization of NaCl solution. During pyrolysis, the NaCl crystal functions as a fully sealed nanoreactor, which facilitates nitrogen incorporation and graphitization. The gasification in such a closed nanoreactor creates a large number of pores in the resultant samples. The 3D network structure, which is conducive to mass transport and high utilization of active sites, was found to have been accurately transferred to the final N-doped carbon materials after dissolution of the NaCl. Use of this cathode catalyst in a proton exchange membrane fuel cell produces a peak power of 600 mW cm(-2), making it among the best nonprecious metal catalysts for the ORR reported so far. Furthermore, N-doped carbon materials with a nanotube or nanoshell morphology can also be realized by this method.
Bioactive Components in Fish Venoms
Ziegman, Rebekah; Alewood, Paul
2015-01-01
Animal venoms are widely recognized as excellent resources for the discovery of novel drug leads and physiological tools. Most comprise a large number of components, of which the enzymes, small peptides, and proteins are studied for their important bioactivities. However, in spite of there being over 2000 venomous fish species, piscine venoms have been relatively underrepresented in the literature thus far. Most studies have explored whole or partially fractionated venom, revealing broad pharmacology, which includes cardiovascular, neuromuscular, cytotoxic, inflammatory, and nociceptive activities. Several large proteinaceous toxins, such as stonustoxin, verrucotoxin, and Sp-CTx, have been isolated from scorpaenoid fish. These form pores in cell membranes, resulting in cell death and creating a cascade of reactions that result in many, but not all, of the physiological symptoms observed from envenomation. Additionally, Natterins, a novel family of toxins possessing kininogenase activity, have been found in toadfish venom. A variety of smaller protein toxins, as well as a small number of peptides, enzymes, and non-proteinaceous molecules, have also been isolated from a range of fish venoms, but most remain poorly characterized. Many other bioactive fish venom components remain to be discovered and investigated. These represent an untapped treasure of potentially useful molecules. PMID:25941767
Global Parameter Optimization of CLM4.5 Using Sparse-Grid Based Surrogates
NASA Astrophysics Data System (ADS)
Lu, D.; Ricciuto, D. M.; Gu, L.
2016-12-01
Calibration of the Community Land Model (CLM) is challenging because of the model's complexity, large parameter sets, and significant computational requirements. Therefore, only a limited number of simulations can be afforded in any attempt to find a near-optimal solution within a reasonable time. The goal of this study is to calibrate some of the CLM parameters in order to improve model projections of carbon fluxes. To this end, we propose a computationally efficient global optimization procedure using sparse-grid-based surrogates. We first use advanced sparse-grid (SG) interpolation to construct a surrogate system of the actual CLM model, and then we calibrate the surrogate model in the optimization process. As the surrogate model is a polynomial whose evaluation is fast, it can be evaluated a sufficiently large number of times during the optimization, which facilitates the global search. We calibrate five parameters against 12 months of GPP, NEP, and TLAI data from the U.S. Missouri Ozark (US-MOz) tower. The results indicate that an accurate surrogate model can be created for CLM4.5 with a relatively small number of SG points (i.e., CLM4.5 simulations), and that the optimized parameters lead to a higher predictive capacity than the default parameter values in CLM4.5 for the US-MOz site.
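As a rough illustration of the surrogate idea (not the authors' sparse-grid code), the sketch below fits a cheap polynomial to a handful of runs of a stand-in model and hands the polynomial to a global optimizer; expensive_model is a hypothetical response surface, not CLM4.5.

```python
# Illustrative sketch of surrogate-based calibration: fit a cheap
# polynomial to a few runs of an expensive model, then let a global
# optimizer search the polynomial instead of the model itself.
import numpy as np
from scipy.optimize import differential_evolution

def expensive_model(x):
    # Hypothetical stand-in for a CLM4.5 simulation; optimum at (0.3, -0.1).
    return (x[0] - 0.3) ** 2 + (x[1] + 0.1) ** 2

# 1. Evaluate the model at a small number of design points (a sparse grid
#    in the paper; a coarse tensor grid here for brevity).
grid = np.linspace(-1.0, 1.0, 5)
pts = np.array([(a, b) for a in grid for b in grid])
vals = np.array([expensive_model(p) for p in pts])

# 2. Fit a low-order 2D polynomial surrogate by least squares.
def design_matrix(points, deg=2):
    return np.array([[x ** i * y ** j
                      for i in range(deg + 1) for j in range(deg + 1)]
                     for x, y in points])

coef, *_ = np.linalg.lstsq(design_matrix(pts), vals, rcond=None)

def surrogate(x):
    # Polynomial evaluation is cheap, so the optimizer can call it freely.
    return float(design_matrix(np.atleast_2d(x)) @ coef)

# 3. Run the global search on the surrogate instead of the model.
res = differential_evolution(surrogate, bounds=[(-1, 1), (-1, 1)], seed=0)
print(res.x)  # close to the true optimum (0.3, -0.1)
```

Because the surrogate is a polynomial, the thousands of evaluations a global search needs cost almost nothing compared with rerunning the full model.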
Supré, K; Lommelen, K; De Meulemeester, L
2014-07-16
In dairy farms, antimicrobial drugs are frequently used for treatment of (sub)clinical mastitis. Determining the antimicrobial susceptibility of mastitis pathogens is needed to guide correct antimicrobial use. Strains of Staphylococcus aureus (n=768), Streptococcus uberis (n=939), Streptococcus dysgalactiae (n=444), Escherichia coli (n=563), and Klebsiella species (n=59) originating from routine milk samples from (sub)clinical mastitis were subjected to the disk diffusion method. The disks contained representatives of the antibiotics frequently used in dairy. A limited number of clinical breakpoints were available through CLSI; these showed that susceptibility of Staph. aureus, E. coli, and Klebsiella was moderate to high. For the streptococcal species, however, large variation between the tested species and the different antimicrobials was observed. In a next step, wild-type populations were described based on epidemiological cut-off values (EUCAST). Because few official cut-off values were available, the data were treated as a mastitis subpopulation, cut-off values were generated from the data themselves, and a putative wild-type population was suggested. The need for accurate clinical breakpoints for veterinary pathogens is high. Despite the lack of these breakpoints, however, a population study can be performed based on the distribution of inhibition zone diameters, on the condition that a large number of strains is tested. Copyright © 2014 Elsevier B.V. All rights reserved.
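The zone-diameter population analysis can be sketched numerically. The diameters and the step-down rule below are invented for illustration; this is not the EUCAST ECOFF procedure, only the general idea of reading a putative wild-type window off the distribution.

```python
# Invented sketch of a putative wild-type cut-off from inhibition-zone
# diameters: find the wild-type mode, then extend the window to smaller
# diameters while counts stay above a fraction of the peak. This is an
# illustration of the idea, not the official EUCAST ECOFF algorithm.
from collections import Counter

zones_mm = [6, 6, 7, 14, 15, 15, 16, 16, 16, 17, 17, 18, 18, 19, 20]
hist = Counter(zones_mm)
mode = max(hist, key=hist.get)          # centre of the wild-type peak (16 mm)

cutoff = mode
while hist.get(cutoff - 1, 0) >= 0.25 * hist[mode]:
    cutoff -= 1                         # extend the wild-type window leftward
print(cutoff)  # 14: smaller zones fall outside the putative wild type
```

With a large number of tested strains, the wild-type peak separates cleanly from the low-diameter (non-wild-type) strains, which is what makes this population approach workable despite missing breakpoints.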
Jagtap, Pratik; Goslinga, Jill; Kooren, Joel A; McGowan, Thomas; Wroblewski, Matthew S; Seymour, Sean L; Griffin, Timothy J
2013-04-01
Large databases (>10(6) sequences) used in metaproteomic and proteogenomic studies present challenges in matching peptide sequences to MS/MS data using database-search programs. Most notably, strict filtering to avoid false-positive matches leads to more false negatives, thus constraining the number of peptide matches. To address this challenge, we developed a two-step method wherein matches derived from a primary search against a large database were used to create a smaller subset database. The second search was performed against a target-decoy version of this subset database merged with a host database. High-confidence peptide sequence matches were then used to infer protein identities. Applying our two-step method to both metaproteomic and proteogenomic analysis resulted in twice the number of high-confidence peptide sequence matches in each case, as compared to the conventional one-step method. The two-step method captured almost all of the peptides matched by the one-step method, with a majority of the additional matches being false negatives from the one-step method. Furthermore, the two-step method improved results regardless of the database-search program used. Our results show that our two-step method maximizes peptide-matching sensitivity for applications requiring large databases, which is especially valuable for proteogenomic and metaproteomic studies. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
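The two-step workflow lends itself to a compact sketch. The scoring here is a toy (peptide length standing in for a real search-engine score) and match_spectra is hypothetical, but the flow follows the description above: permissive first pass, subset database, stricter second pass against the subset plus decoys.

```python
# Toy sketch of the two-step database-search strategy. The "score" is just
# peptide length, standing in for a real search-engine score, and the
# reversed-sequence decoys crudely mimic a target-decoy database.
def match_spectra(peptides, database, min_score):
    """Return (peptide, protein) pairs matching above min_score."""
    return [(pep, prot) for pep in peptides
            for prot, seq in database.items()
            if pep in seq and len(pep) >= min_score]

peptides = ["PEPTIDE", "SEQ", "KR"]          # inferred from MS/MS spectra
full_db = {"protA": "XXPEPTIDEXX", "protB": "XXSEQXX", "protC": "XXKRXX"}

# Step 1: permissive search of the large database (low threshold).
step1 = match_spectra(peptides, full_db, min_score=2)
subset_db = {prot: full_db[prot] for _, prot in step1}

# Step 2: strict search against the much smaller subset database merged
# with decoy sequences (and, in the paper, a host database).
decoys = {"decoy_" + p: seq[::-1] for p, seq in subset_db.items()}
step2 = match_spectra(peptides, {**subset_db, **decoys}, min_score=3)
print(sorted({prot for _, prot in step2}))  # ['protA', 'protB']
```

The second search can afford a stricter threshold precisely because the subset database is small, so fewer random decoy hits compete with true matches.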
Elias, Andrew; Crayton, Samuel H; Warden-Rothman, Robert; Tsourkas, Andrew
2014-07-28
Given the rapidly expanding library of disease biomarkers and targeting agents, the number of unique targeted nanoparticles is growing exponentially. The high variability and expense of animal testing often makes it unfeasible to examine this large number of nanoparticles in vivo. This often leads to the investigation of a single formulation that performed best in vitro. However, nanoparticle performance in vivo depends on many variables, many of which cannot be adequately assessed with cell-based assays. To address this issue, we developed a lanthanide-doped nanoparticle method that allows quantitative comparison of multiple targeted nanoparticles simultaneously. Specifically, superparamagnetic iron oxide (SPIO) nanoparticles with different targeting ligands were created, each with a unique lanthanide dopant. Following the simultaneous injection of the various SPIO compositions into tumor-bearing mice, inductively coupled plasma mass spectrometry was used to quantitatively and orthogonally assess the concentration of each SPIO composition in serial blood and resected tumor samples.
2009-03-06
CAPE CANAVERAL, Fla. – On Launch Pad 17-B at Cape Canaveral Air Force Station in Florida, United Launch Alliance's Delta II rocket carrying NASA's Kepler spacecraft rises through the exhaust cloud created by the firing of the rocket’s engines. Liftoff was on time at 10:49 p.m. EST. Kepler is a spaceborne telescope designed to search the nearby region of our galaxy for Earth-size planets orbiting in the habitable zone of stars like our sun. The habitable zone is the region around a star where temperatures permit water to be liquid on a planet's surface. The challenge for Kepler is to look at a large number of stars in order to statistically estimate the total number of Earth-size planets orbiting sun-like stars in the habitable zone. Kepler will survey more than 100,000 stars in our galaxy. Photo credit: NASA/Regina Mitchell-Ryall, Tom Farrar
Efficient Sample Tracking With OpenLabFramework
List, Markus; Schmidt, Steffen; Trojnar, Jakub; Thomas, Jochen; Thomassen, Mads; Kruse, Torben A.; Tan, Qihua; Baumbach, Jan; Mollenhauer, Jan
2014-01-01
The advance of new technologies in biomedical research has led to a dramatic growth in experimental throughput. Projects therefore steadily grow in size and involve a larger number of researchers. Spreadsheets traditionally used are thus no longer suitable for keeping track of the vast amounts of samples created and need to be replaced with state-of-the-art laboratory information management systems. Such systems have been developed in large numbers, but they are often limited to specific research domains and types of data. One domain so far neglected is the management of libraries of vector clones and genetically engineered cell lines. OpenLabFramework is a newly developed web-application for sample tracking, particularly laid out to fill this gap, but with an open architecture allowing it to be extended for other biological materials and functional data. Its sample tracking mechanism is fully customizable and aids productivity further through support for mobile devices and barcoded labels. PMID:24589879
Anthropometrical data of middle-aged Japanese women for industrial design applications.
Ashizawa, K; Okada, A; Kouchi, M; Horino, S; Kikuchi, Y
1994-06-01
Despite the growing importance of human interface design, and despite the growing number of working women, little consideration has been given to women's working spaces and tools. Their designs are based on men's anthropometrical data, and this does not assure the safety and amenity of women's working environments. Moreover, few data on women's body measurements are available. The Research Institute of Human Engineering for Quality Life is carrying out an ergonomic anthropometrical study on a large number of Japanese people to create a database for industrial use. The fee for the use of these data, however, makes it very difficult to benefit from their acquisition. Therefore, we conducted an anthropometrical study for industrial design use on middle-aged female subjects, who are in the most difficult age group to access. This report should be useful in designing working spaces and tools for women as laborers as well as users.
Multislice CT urography: state of the art.
Noroozian, M; Cohan, R H; Caoili, E M; Cowan, N C; Ellis, J H
2004-01-01
Recent improvements in helical CT hardware and software have provided imagers with the tools to obtain an increasingly large number of very thin axial images. As a result, a number of new applications for multislice CT have recently been developed, one of which is CT urography. The motivation for performing CT urography is the desire to create a single imaging test that can completely assess the kidneys and urinary tract for urolithiasis, renal masses and mucosal abnormalities of the renal collecting system, ureters and bladder. Although the preferred technique for performing multislice CT urography has not yet been determined and results are preliminary, early indications suggest that this examination can detect even subtle benign and malignant urothelial abnormalities and that it has the potential to completely replace excretory urography within the next several years. An important limitation of multislice CT urography is increased patient radiation exposure encountered when some of the more thorough recommended techniques are utilized.
Soranno, Patricia A.; Cheruvelil, Kendra Spence; Webster, Katherine E.; Bremigan, Mary T.; Wagner, Tyler; Stow, Craig A.
2010-01-01
Governmental entities are responsible for managing and conserving large numbers of lake, river, and wetland ecosystems that can be addressed only rarely on a case-by-case basis. We present a system for predictive classification modeling, grounded in the theoretical foundation of landscape limnology, that creates a tractable number of ecosystem classes to which management actions may be tailored. We demonstrate our system by applying two types of predictive classification modeling approaches to develop nutrient criteria for eutrophication management in 1998 north temperate lakes. Our predictive classification system promotes the effective management of multiple ecosystems across broad geographic scales by explicitly connecting management and conservation goals to the classification modeling approach, considering multiple spatial scales as drivers of ecosystem dynamics, and acknowledging the hierarchical structure of freshwater ecosystems. Such a system is critical for adaptive management of complex mosaics of freshwater ecosystems and for balancing competing needs for ecosystem services in a changing world.
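A rough numerical illustration of class-specific criteria (the data, predictors, and thresholds below are invented, not the landscape-limnology models used in the study): lakes are binned into a tractable number of ecosystem classes by landscape predictors, and one nutrient criterion is set per class rather than per lake.

```python
# Invented illustration of per-class nutrient criteria: bin lakes into a
# small number of classes using landscape predictors, then derive a
# class-specific total-phosphorus criterion (here, the 75th percentile).
import numpy as np

rng = np.random.default_rng(0)
n = 200
ag = rng.uniform(0, 100, n)            # % agriculture in the catchment
depth = rng.uniform(2, 30, n)          # lake depth (m)
tp = 5 + 0.4 * ag + 40 / depth + rng.normal(0, 3, n)   # total P (ug/L)

# Two binary landscape predictors define four ecosystem classes.
classes = (ag > 50).astype(int) * 2 + (depth > 10).astype(int)

# One nutrient criterion per class instead of one per lake.
criteria = {int(c): float(np.percentile(tp[classes == c], 75))
            for c in np.unique(classes)}
print(criteria)
```

Grouping thousands of lakes into a handful of classes is what makes case-by-case management tractable: the criterion is tailored to the class, not recomputed for every ecosystem.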
Testing the robustness of Citizen Science projects: Evaluating the results of pilot project COMBER.
Chatzigeorgiou, Giorgos; Faulwetter, Sarah; Dailianis, Thanos; Smith, Vincent Stuart; Koulouri, Panagiota; Dounas, Costas; Arvanitidis, Christos
2016-01-01
Citizen Science (CS) as a term covers a great variety of approaches and scopes involving many different fields of science. The number of relevant projects globally has increased significantly in recent years. Large-scale ecological questions can be answered only through extended observation networks, and CS projects can support this effort. Although the need for such projects is apparent, an important part of the scientific community casts doubt on the reliability of CS data sets. The pilot CS project COMBER was created in order to provide evidence to answer this question for coastal marine biodiversity monitoring. The results of the current analysis show that a carefully designed CS project with clear hypotheses, wide participation, and data-set validation can be a valuable tool for detecting large-scale, long-term changes in marine biodiversity patterns, and therefore for relevant management and conservation issues.
NASA Technical Reports Server (NTRS)
Bernhardt, Paul A.; Scales, W. A.
1990-01-01
Ionospheric plasma density irregularities can be produced by chemical releases into the upper atmosphere. F-region plasma modification occurs by: (1) chemically enhancing the electron number density; (2) chemically reducing the electron population; or (3) physically convecting the plasma from one region to another. These three processes (production, loss, and transport) determine the effectiveness of ionospheric chemical releases in subtle and surprising ways. Initially, a chemical release produces a localized change in plasma density. Subsequent processes, however, can lead to enhanced transport in chemically modified regions. Ionospheric modification by chemical releases also excites artificial enhancements in airglow intensities through exothermic chemical reactions between the newly created plasma species. Numerical models were developed to describe the creation and evolution of large-scale density irregularities and airglow clouds generated by artificial means. Experimental data compare favorably with these models. It was found that chemical releases produce transient, large-amplitude perturbations in electron density which can evolve into fine-scale irregularities via nonlinear transport properties.
Drought and Epidemic Typhus, Central Mexico, 1655–1918
Acuna-Soto, Rodolfo; Stahle, David W.
2014-01-01
Epidemic typhus is an infectious disease caused by the bacterium Rickettsia prowazekii and transmitted by body lice (Pediculus humanus corporis). This disease occurs where conditions are crowded and unsanitary. This disease accompanied war, famine, and poverty for centuries. Historical and proxy climate data indicate that drought was a major factor in the development of typhus epidemics in Mexico during 1655–1918. Evidence was found for 22 large typhus epidemics in central Mexico, and tree-ring chronologies were used to reconstruct moisture levels over central Mexico for the past 500 years. Below-average tree growth, reconstructed drought, and low crop yields occurred during 19 of these 22 typhus epidemics. Historical documents describe how drought created large numbers of environmental refugees that fled the famine-stricken countryside for food relief in towns. These refugees often ended up in improvised shelters in which crowding encouraged conditions necessary for spread of typhus. PMID:24564928
Gardner, W.P.; Susong, D.D.; Solomon, D.K.; Heasler, H.P.
2011-01-01
Multiple environmental tracers are used to investigate age distribution, evolution, and mixing in local- to regional-scale groundwater circulation around the Norris Geyser Basin area in Yellowstone National Park. Springs ranging in temperature from 3°C to 90°C in the Norris Geyser Basin area were sampled for stable isotopes of hydrogen and oxygen, major and minor element chemistry, dissolved chlorofluorocarbons, and tritium. Groundwater near Norris Geyser Basin comprises two distinct systems: a shallow, cool-water system and a deep, high-temperature hydrothermal system. These two end-member systems mix to create springs with intermediate temperature and composition. Using multiple tracers from a large number of springs, it is possible to constrain the distribution of possible flow paths and refine conceptual models of groundwater circulation in and around a large, complex hydrothermal system. Copyright 2011 by the American Geophysical Union.
Browsing schematics: Query-filtered graphs with context nodes
NASA Technical Reports Server (NTRS)
Ciccarelli, Eugene C.; Nardi, Bonnie A.
1988-01-01
The early results of a research project to create tools for building interfaces to intelligent systems on the NASA Space Station are reported. One such tool is the Schematic Browser which helps users engaged in engineering problem solving find and select schematics from among a large set. Users query for schematics with certain components, and the Schematic Browser presents a graph whose nodes represent the schematics with those components. The query greatly reduces the number of choices presented to the user, filtering the graph to a manageable size. Users can reformulate and refine the query serially until they locate the schematics of interest. To help users maintain orientation as they navigate a large body of data, the graph also includes nodes that are not matches but provide global and local context for the matching nodes. Context nodes include landmarks, ancestors, siblings, children and previous matches.
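The filter-plus-context idea can be sketched directly; the schematic taxonomy and component sets below are hypothetical stand-ins for the Schematic Browser's data.

```python
# Sketch of query filtering with context nodes: keep the schematics whose
# components satisfy the query, then add their ancestors so the user keeps
# their orientation. The taxonomy and component lists are hypothetical.
hierarchy = {                 # child -> parent in the schematic taxonomy
    "power_dist": "power", "battery_ctl": "power",
    "thermal_ctl": "thermal", "power": "root", "thermal": "root",
}
components = {
    "power_dist": {"relay", "bus"},
    "battery_ctl": {"relay", "cell"},
    "thermal_ctl": {"pump"},
}

def browse(query):
    """Return (matching schematics, ancestor context nodes)."""
    matches = {s for s, comps in components.items() if query <= comps}
    context = set()
    for node in matches:      # walk up the hierarchy from every match
        while node in hierarchy:
            node = hierarchy[node]
            context.add(node)
    return matches, context - matches

matches, context = browse({"relay"})
print(sorted(matches), sorted(context))
```

Reformulating the query simply calls browse again with a new component set, mirroring the serial refinement the text describes; siblings and previous matches could be added to the context set the same way.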
Self-Propulsion of Pure Water Droplets by Spontaneous Marangoni-Stress-Driven Motion
NASA Astrophysics Data System (ADS)
Izri, Ziane; van der Linden, Marjolein N.; Michelin, Sébastien; Dauchot, Olivier
2014-12-01
We report spontaneous motion in a fully biocompatible system consisting of pure water droplets in an oil-surfactant medium of squalane and monoolein. Water from the droplet is solubilized by the reverse micellar solution, creating a concentration gradient of swollen reverse micelles around each droplet. The strong advection and weak diffusion conditions allow for the first experimental realization of spontaneous motion in a system of isotropic particles at sufficiently large Péclet number according to a straightforward generalization of a recently proposed mechanism [S. Michelin, E. Lauga, and D. Bartolo, Phys. Fluids 25, 061701 (2013); S. Michelin and E. Lauga, J. Fluid Mech. 747, 572 (2014)]. Experiments with a highly concentrated solution of salt instead of water, and tetradecane instead of squalane, confirm the above mechanism. The present swimming droplets are able to carry external bodies such as large colloids, salt crystals, and even cells.
Self-propulsion of pure water droplets by spontaneous Marangoni-stress-driven motion.
Izri, Ziane; van der Linden, Marjolein N; Michelin, Sébastien; Dauchot, Olivier
2014-12-12
We report spontaneous motion in a fully biocompatible system consisting of pure water droplets in an oil-surfactant medium of squalane and monoolein. Water from the droplet is solubilized by the reverse micellar solution, creating a concentration gradient of swollen reverse micelles around each droplet. The strong advection and weak diffusion conditions allow for the first experimental realization of spontaneous motion in a system of isotropic particles at sufficiently large Péclet number according to a straightforward generalization of a recently proposed mechanism [S. Michelin, E. Lauga, and D. Bartolo, Phys. Fluids 25, 061701 (2013); S. Michelin and E. Lauga, J. Fluid Mech. 747, 572 (2014)]. Experiments with a highly concentrated solution of salt instead of water, and tetradecane instead of squalane, confirm the above mechanism. The present swimming droplets are able to carry external bodies such as large colloids, salt crystals, and even cells.
Drought and epidemic typhus, central Mexico, 1655-1918.
Burns, Jordan N; Acuna-Soto, Rodolfo; Stahle, David W
2014-03-01
Epidemic typhus is an infectious disease caused by the bacterium Rickettsia prowazekii and transmitted by body lice (Pediculus humanus corporis). This disease occurs where conditions are crowded and unsanitary. This disease accompanied war, famine, and poverty for centuries. Historical and proxy climate data indicate that drought was a major factor in the development of typhus epidemics in Mexico during 1655-1918. Evidence was found for 22 large typhus epidemics in central Mexico, and tree-ring chronologies were used to reconstruct moisture levels over central Mexico for the past 500 years. Below-average tree growth, reconstructed drought, and low crop yields occurred during 19 of these 22 typhus epidemics. Historical documents describe how drought created large numbers of environmental refugees that fled the famine-stricken countryside for food relief in towns. These refugees often ended up in improvised shelters in which crowding encouraged conditions necessary for spread of typhus.
The effect of mass loading on the temperature of a flowing plasma. [in vicinity of Io
NASA Technical Reports Server (NTRS)
Linker, Jon A.; Kivelson, Margaret G.; Walker, Raymond J.
1989-01-01
How the addition of ions at rest (mass loading) affects the temperature of a flowing plasma in an MHD approximation is investigated, using analytic theory and time-dependent, three-dimensional MHD simulations of plasma flow past Io. The MHD equations show that the temperature can increase or decrease relative to the background, depending on the local sonic Mach number M_S of the flow. For flows with M_S > sqrt(9/5) (when gamma = 5/3), mass loading increases the plasma temperature. However, the simulations show a nonlinear response to the addition of mass. If the mass loading rate is large enough, the temperature increase may be smaller than expected, or the temperature may actually decrease, because a large mass loading rate slows the flow and decreases the thermal energy of the newly created plasma.
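The quoted threshold reduces to a one-line check (a restatement of the criterion above, valid only for gamma = 5/3 and in the linear regime, before the nonlinear slowing effects the simulations reveal):

```python
# Restating the mass-loading criterion quoted above: for gamma = 5/3,
# mass addition heats the flow when the sonic Mach number M_S exceeds
# sqrt(9/5) ~ 1.342 (linear regime only).
import math

M_CRIT = math.sqrt(9.0 / 5.0)

def mass_loading_heats(mach_s):
    """True when mass loading raises the plasma temperature (linear MHD)."""
    return mach_s > M_CRIT

print(mass_loading_heats(1.0), mass_loading_heats(1.5))  # False True
```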
ERIC Educational Resources Information Center
Whitehead, Linda C.; Ginsberg, Stacey I.
1999-01-01
Presents suggestions for creating family-like programs in large child-care centers in three areas: (1) physical environment, incorporating cozy spaces, beauty, and space for family interaction; (2) caregiving climate, such as sharing home photographs, and serving meals family style; and (3) family involvement, including regular conversations with…
Aerodynamic Optimization of Rocket Control Surface Geometry Using Cartesian Methods and CAD Geometry
NASA Technical Reports Server (NTRS)
Nelson, Andrea; Aftosmis, Michael J.; Nemec, Marian; Pulliam, Thomas H.
2004-01-01
Aerodynamic design is an iterative process involving geometry manipulation and complex computational analysis subject to physical constraints and aerodynamic objectives. A design cycle consists of first establishing the performance of a baseline design, which is usually created with low-fidelity engineering tools, and then progressively optimizing the design to maximize its performance. Optimization techniques have evolved from relying exclusively on designer intuition and insight in traditional trial and error methods, to sophisticated local and global search methods. Recent attempts at automating the search through a large design space with formal optimization methods include both database driven and direct evaluation schemes. Databases are being used in conjunction with surrogate and neural network models as a basis on which to run optimization algorithms. Optimization algorithms are also being driven by the direct evaluation of objectives and constraints using high-fidelity simulations. Surrogate methods use data points obtained from simulations, and possibly gradients evaluated at the data points, to create mathematical approximations of a database. Neural network models work in a similar fashion, using a number of high-fidelity database calculations as training iterations to create a database model. Optimal designs are obtained by coupling an optimization algorithm to the database model. Evaluation of the current best design then gives either a new local optima and/or increases the fidelity of the approximation model for the next iteration. Surrogate methods have also been developed that iterate on the selection of data points to decrease the uncertainty of the approximation model prior to searching for an optimal design. The database approximation models for each of these cases, however, become computationally expensive with increase in dimensionality. 
Thus the method of using optimization algorithms to search a database model becomes problematic as the number of design variables is increased.
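The database-plus-surrogate loop described in the abstract can be sketched in a few lines. The following is a deliberately minimal one-dimensional illustration with an invented objective function, not the authors' method: a quadratic surrogate is fit through the three best points in the database, the surrogate's minimum proposes the next expensive evaluation, and the new result is fed back into the database.

```python
def true_objective(x):
    # Stand-in for an expensive high-fidelity simulation (invented for
    # illustration); its true minimum is at x = 1.7 with value 0.5.
    return (x - 1.7) ** 2 + 0.5

def parabola_vertex(points):
    # Exact quadratic (surrogate) through three distinct points; return
    # the x-location of its vertex.
    (x0, y0), (x1, y1), (x2, y2) = points
    d0 = y0 / ((x0 - x1) * (x0 - x2))
    d1 = y1 / ((x1 - x0) * (x1 - x2))
    d2 = y2 / ((x2 - x0) * (x2 - x1))
    a = d0 + d1 + d2
    b = -(d0 * (x1 + x2) + d1 * (x0 + x2) + d2 * (x0 + x1))
    return -b / (2 * a)

def surrogate_search(samples, iterations=5):
    # Database of (design variable, objective) pairs from "simulations".
    db = [(x, true_objective(x)) for x in samples]
    for _ in range(iterations):
        best3 = sorted(db, key=lambda p: p[1])[:3]
        x_new = parabola_vertex(best3)
        if any(abs(x_new - x) < 1e-9 for x, _ in db):
            break  # surrogate proposes an already-evaluated point
        db.append((x_new, true_objective(x_new)))  # expensive evaluation
    return min(db, key=lambda p: p[1])

best_x, best_y = surrogate_search([0.0, 1.0, 3.0])
```

Because the toy objective is itself quadratic, the first surrogate already lands on the optimum; for realistic objectives the loop refines the model near the current best design, which is exactly where the curse of dimensionality makes database approximation expensive.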
Landspotting: Social gaming to collect vast amounts of data for satellite validation
NASA Astrophysics Data System (ADS)
Fritz, S.; Purgathofer, P.; Kayali, F.; Fellner, M.; Wimmer, M.; Sturn, T.; Triebnig, G.; Krause, S.; Schindler, F.; Kollegger, M.; Perger, C.; Dürauer, M.; Haberl, W.; See, L.; McCallum, I.
2012-04-01
At present there is no single satellite-derived global land cover product that is accurate enough to provide reliable estimates of forest or cropland area to determine, e.g., how much additional land is available to grow biofuels or to tackle problems of food security. The Landspotting Project aims to improve the quality of this land cover information by vastly increasing the amount of in-situ validation data available for calibration and validation of satellite-derived land cover. The Geo-Wiki (Geo-Wiki.org) system currently allows users to compare three satellite-derived land cover products and validate them using Google Earth. However, there has been little incentive for anyone to provide this data, so the amount of validation through Geo-Wiki has been limited. Recent competitions have proven that incentive-driven campaigns can rapidly generate large amounts of input. The Landspotting Project is taking a truly innovative approach through the development of the Landspotting game. The game engages users whilst simultaneously collecting a large amount of in-situ land cover information. The development of the game is informed by the current raft of successful social games available on the internet and as mobile applications, many of which are geo-spatial in nature. Games that are integrated within a social networking site such as Facebook illustrate the power to reach and continually engage a large number of individuals. The number of active Facebook users is estimated to be greater than 400 million, of whom 100 million access Facebook from mobile devices. The Landspotting game has game mechanics similar to those of the famous strategy game "Civilization" (i.e. build, harvest, research, war, diplomacy, etc.). When a player wishes to make a settlement, they must first classify the land cover over the area they wish to settle. 
As the game is played on the Earth's surface with Google Maps, we are able to record and store this land cover/land use classification geographically. Every player can play the game for free (i.e. it is a massive multiplayer online game). Furthermore, it is a social game on Facebook (e.g. invite friends, send friends messages, purchase gifts, help friends, post messages onto the wall, etc.). The game is played in a web browser, so it runs everywhere Flash is supported without requiring the user to install anything additional. At the same time, the Geo-Wiki system will be modified to use the acquired in-situ validation information to create new outputs: a hybrid land cover map, which takes the best information from each individual product to create a single integrated version; a database of validation points that will be freely available to the land cover user community; and a facility that allows users to create a specific targeted validation area, which will then be provided to the crowdsourcing community for validation. These outputs will turn Geo-Wiki into a valuable system for earth system scientists.
NASA Astrophysics Data System (ADS)
Saksena, S.; Merwade, V.; Singhofen, P.
2017-12-01
There is an increasing global trend towards developing large-scale flood models that account for spatial heterogeneity at watershed scales to drive future flood risk planning. Integrated surface water-groundwater modeling procedures can represent the hydrologic processes at work during a flood event and thereby provide accurate flood outputs. Even though the advantages of integrated modeling are widely acknowledged, the complexity of integrated process representation, the computation time, and the number of input parameters required have deterred its application to flood inundation mapping, especially for large watersheds. This study presents a faster approach for creating watershed-scale flood models using a hybrid design that breaks the watershed into multiple regions of variable spatial resolution by prioritizing higher-order streams. The methodology involves creating a hybrid model for the Upper Wabash River Basin in Indiana using Interconnected Channel and Pond Routing (ICPR) and comparing its performance with a fully integrated 2D hydrodynamic model. The hybrid approach involves simplification procedures such as 1D channel-2D floodplain coupling; hydrologic basin (HUC-12) integration with 2D groundwater for rainfall-runoff routing; and varying the spatial resolution of 2D overland flow based on stream order. The results for a 50-year return period storm event show that the hybrid model's performance (NSE = 0.87) is similar to that of the 2D integrated model (NSE = 0.88), while the computational time is halved. The results suggest that significant computational efficiency can be gained while maintaining model accuracy for large-scale flood models by using hybrid approaches for model creation.
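The model comparison above relies on the Nash-Sutcliffe efficiency (NSE). For reference, a minimal implementation of that metric; the toy hydrographs below are invented for illustration and are not the study's data.

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model
    predicts no better than the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    err = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - err / var

# Toy discharge series (m^3/s), purely illustrative.
obs = [10.0, 14.0, 30.0, 22.0, 12.0]
sim = [11.0, 15.0, 27.0, 21.0, 13.0]
score = nse(obs, sim)
```

An NSE near 0.87-0.88, as reported for both models in the study, indicates that most of the observed variance is captured by the simulation.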
Larger sized wire arrays on 1.5 MA Z-pinch generator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Safronova, A. S., E-mail: alla@unr.edu; Kantsyrev, V. L.; Weller, M. E.
Experiments on the UNR Zebra generator with a Load Current Multiplier (LCM) allow for implosions of larger sized wire array loads than at the standard current of 1 MA. Advantages of larger sized planar wire array implosions include enhanced energy coupling to plasmas, better diagnostic access to observable plasma regions, and more complex geometries of the wire loads. The experiments with larger sized wire arrays were performed on the 1.5 MA Zebra with LCM (the anode-cathode gap was 1 cm, which is half the gap used in the standard mode). In particular, larger sized multi-planar wire arrays had two outer wire planes made from mid-atomic-number wires to create a global magnetic field (gmf) and a plasma flow between them. A modified central plane with a few Al wires at the edges was placed in the middle between the outer planes to influence the gmf and to create an Al plasma flow in the perpendicular direction (to the outer arrays' plasma flow). This modified plane had a varying number of empty slots: the number was increased from 6 up to 10, thereby increasing the gap inside the middle plane from 4.9 to 7.7 mm, respectively. Such a load configuration allows for a more independent study of the flows of L-shell mid-atomic-number plasma (between the outer planes) and K-shell Al plasma (which first fills the gap between the edge wires along the middle plane) and of their radiation in space and time. We demonstrate that this configuration produces higher linear radiation yield and electron temperatures, as well as better diagnostic access to observable plasma regions, and show how the load geometry (the size of the gap in the middle plane) influences K-shell Al radiation. In particular, K-shell Al radiation was delayed compared to L-shell mid-atomic-number radiation when the gap in the middle plane was large enough (when the number of empty slots was increased up to ten).
NASA Technical Reports Server (NTRS)
Niles, P.B.
2008-01-01
The chemistry, sedimentology, and geology of the Meridiani sedimentary deposits are best explained by eolian reworking of the sublimation residue of a large-scale ice/dust deposit. This large ice deposit was located in close proximity to Terra Meridiani and incorporated large amounts of dust, sand, and SO2 aerosols generated by impacts and volcanism during early Martian history. Sulfate formation and chemical weathering of the initial igneous material are hypothesized to have occurred inside the ice when the darker mineral grains were heated by solar radiant energy. This created conditions in which small films of liquid water formed in and around the mineral grains. This water dissolved the SO2 and reacted with the mineral grains, forming an acidic environment under low water/rock conditions. Subsequent sublimation of this ice deposit left behind large amounts of weathered sublimation residue, which became the source material for the eolian process that deposited the Terra Meridiani deposit. The following features of the Meridiani sediments are best explained by this model: the large scale of the deposit, its mineralogic similarity across large distances, the cation-conservative nature of the weathering processes, the presence of acidic groundwaters on a basaltic planet, the accumulation of a thick sedimentary sequence outside of a topographic basin, and the low water/rock ratio needed to explain the presence of very soluble minerals and elements in the deposit. Remote sensing studies have linked the Meridiani deposits to a number of other Martian surface features through mineralogic similarities, geomorphic similarities, and regional associations. These include layered deposits in Arabia Terra; interior layered deposits in the Valles Marineris system, southern Elysium/Aeolis, Amazonis Planitia, and the Hellas basin; and Aram Chaos, Aureum Chaos, and Iani Chaos. 
The common properties shared by these deposits suggest that all of these deposits share a common formation process which must have acted over a large area of Mars. The results of this study suggest a mechanism for volatile transport on Mars without invoking an early greenhouse. They also imply a common formation mechanism for most of the sulfate minerals and layered deposits on Mars, which explains their common occurrence.
Coupling large scale hydrologic-reservoir-hydraulic models for impact studies in data sparse regions
NASA Astrophysics Data System (ADS)
O'Loughlin, Fiachra; Neal, Jeff; Wagener, Thorsten; Bates, Paul; Freer, Jim; Woods, Ross; Pianosi, Francesca; Sheffield, Justin
2017-04-01
As hydraulic modelling moves to increasingly large spatial domains, it has become essential to take reservoirs and their operations into account. Large-scale hydrological models have included reservoirs for at least the past two decades, yet they cannot explicitly model the variations in the spatial extent of reservoirs, and in many hydrological models reservoir operations are not simulated at run time. Capturing reservoir extent requires a hydraulic model, yet to date no continental-scale hydraulic model has directly simulated reservoirs and their operations. In addition to the need to include reservoirs and their operations in hydraulic models as they move to global coverage, there is also a need to link such models to large-scale hydrology models or land surface schemes. This is especially true for Africa, where the number of river gauges has consistently declined since the middle of the twentieth century. In this study we address these two major issues by developing: 1) a coupling methodology for the VIC large-scale hydrological model and the LISFLOOD-FP hydraulic model, and 2) a reservoir module for the LISFLOOD-FP model, which currently includes four sets of reservoir operating rules taken from the major large-scale hydrological models. The Volta Basin, West Africa, was chosen to demonstrate the capability of the modelling framework as it is a large river basin (~400,000 km2) and contains the largest man-made lake in terms of area (8,482 km2), Lake Volta, created by the Akosombo dam. Lake Volta also experiences a seasonal variation in water levels of between two and six metres that creates a dynamic shoreline. In this study, we first run our coupled VIC and LISFLOOD-FP model without explicitly modelling Lake Volta and then compare these results with those from model runs where the dam operations and Lake Volta are included. 
The results show that we are able to obtain variation in the Lake Volta water levels and that including the dam operations and Lake Volta has significant impacts on the water levels across the domain.
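The abstract does not spell out the four sets of operating rules implemented in LISFLOOD-FP, but the flavour of a run-time reservoir rule can be sketched. The linear storage-based release below is an invented toy rule for illustration only, not one of the four rule sets from the major hydrological models.

```python
def reservoir_step(storage, inflow, capacity, mean_inflow):
    """One toy time step of a storage-based operating rule: release scales
    linearly with how full the reservoir is, and any volume that would
    exceed capacity after inflow is spilled. All quantities are volumes
    per time step in consistent (arbitrary) units."""
    release = mean_inflow * (storage / capacity)  # fuller -> release more
    storage = storage + inflow - release
    spill = max(0.0, storage - capacity)          # dam safety overflow
    return storage - spill, release + spill       # (new storage, outflow)

# Illustrative half-full reservoir receiving an above-average inflow.
new_storage, outflow = reservoir_step(
    storage=50.0, inflow=10.0, capacity=100.0, mean_inflow=8.0)
```

Simulating such a rule inside the hydraulic model, rather than prescribing releases offline, is what lets the coupled model reproduce the dynamic shoreline of Lake Volta.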
Bornemann-Shepherd, Melanie; Le-Lazar, Jamie; Makic, Mary Beth Flynn; DeVine, Deborah; McDevitt, Kelly; Paul, Marcee
2015-01-01
Hospital capacity constraints lead to large numbers of inpatients being held for extended periods in the emergency department. This creates concerns with safety, quality of care, and dissatisfaction of patients and staff. The aim of this quality-improvement project was to improve satisfaction and processes in which nurses provided care to inpatient boarders held in the emergency department. A quality-improvement project framework that included the use of a questionnaire was used to ascertain employee and patient dissatisfaction and identify opportunities for improvement. A task force was created to develop action plans related to holding and caring for inpatients in the emergency department. A questionnaire was sent to nursing staff in spring 2012, and responses from the questionnaire identified improvements that could be implemented to improve care for inpatient boarders. Situation-background-assessment-recommendation (SBAR) communications and direct observations were also used to identify specific improvements. Post-questionnaire results indicated improved satisfaction for both staff and patients. It was recognized early that the ED inpatient area would benefit from the supervision of an inpatient director, managers, and staff. Outcomes showed that creating an inpatient unit within the emergency department had a positive effect on staff and patient satisfaction. Copyright © 2015 Emergency Nurses Association. Published by Elsevier Inc. All rights reserved.
Semantic Web repositories for genomics data using the eXframe platform.
Merrill, Emily; Corlosquet, Stéphane; Ciccarese, Paolo; Clark, Tim; Das, Sudeshna
2014-01-01
With the advent of inexpensive assay technologies, there has been an unprecedented growth in genomics data as well as in the number of databases in which it is stored. In these databases, sample annotation using ontologies and controlled vocabularies is becoming more common. However, the annotation is rarely available as Linked Data, in a machine-readable format, or for standardized queries using SPARQL. This makes large-scale reuse, or integration with other knowledge bases, very difficult. To address this challenge, we have developed the second generation of our eXframe platform, a reusable framework for creating online repositories of genomics experiments. This second-generation model now publishes Semantic Web data. To accomplish this, we created an experiment model that covers provenance, citations, external links, assays, biomaterials used in the experiment, and the data collected during the process. The elements of our model are mapped to classes and properties from various established biomedical ontologies. Resource Description Framework (RDF) data is automatically produced using these mappings and indexed in an RDF store with a built-in SPARQL Protocol and RDF Query Language (SPARQL) endpoint. Using the open-source eXframe software, institutions and laboratories can create Semantic Web repositories of their experiments, integrate them with heterogeneous resources, and make them interoperable with the vast Semantic Web of biomedical knowledge.
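The kind of query a SPARQL endpoint answers over RDF experiment metadata can be illustrated with a tiny in-memory triple pattern matcher. The triples, prefixes, and predicate names below are invented for illustration; eXframe's real model maps to established biomedical ontologies and is served by a full RDF store.

```python
# Invented RDF-style (subject, predicate, object) triples.
triples = [
    ("exp:1", "rdf:type", "obo:GeneExpressionExperiment"),
    ("exp:1", "ex:usesBiomaterial", "bio:sampleA"),
    ("bio:sampleA", "ex:organism", "taxon:9606"),
    ("exp:2", "rdf:type", "obo:GeneExpressionExperiment"),
    ("exp:2", "ex:usesBiomaterial", "bio:sampleB"),
    ("bio:sampleB", "ex:organism", "taxon:10090"),
]

def match(pattern, binding):
    """Yield extended variable bindings for one (s, p, o) pattern.
    Variables are strings starting with '?'."""
    for triple in triples:
        b = dict(binding)
        ok = True
        for term, value in zip(pattern, triple):
            if term.startswith("?"):
                if b.get(term, value) != value:
                    ok = False
                    break
                b[term] = value
            elif term != value:
                ok = False
                break
        if ok:
            yield b

def query(patterns):
    # Join basic graph patterns left to right, as a SPARQL engine would.
    bindings = [{}]
    for pat in patterns:
        bindings = [b2 for b in bindings for b2 in match(pat, b)]
    return bindings

# "Which experiments use a human (taxon:9606) biomaterial?"
human_experiments = [
    b["?e"] for b in query([
        ("?e", "rdf:type", "obo:GeneExpressionExperiment"),
        ("?e", "ex:usesBiomaterial", "?m"),
        ("?m", "ex:organism", "taxon:9606"),
    ])
]
```

The join across triples is what makes federated queries over heterogeneous annotated repositories possible once the data is published as Linked Data.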
Laboratory studies of magnetized collisionless flows and shocks using accelerated plasmoids
NASA Astrophysics Data System (ADS)
Weber, T. E.; Smith, R. J.; Hsu, S. C.
2015-11-01
Magnetized collisionless shocks are thought to play a dominant role in the overall partition of energy throughout the universe, but have historically proven difficult to create in the laboratory. The Magnetized Shock Experiment (MSX) at LANL creates conditions similar to those found in both space and astrophysical shocks by accelerating hot (100s of eV during translation), dense (10^22-10^23 m^-3) Field Reversed Configuration (FRC) plasmoids to high velocities (100s of km/s), resulting in β ~ 1 collisionless plasma flows with sonic and Alfvén Mach numbers of ~10. The FRC subsequently impacts a static target such as a strong parallel or anti-parallel (reconnection-wise) magnetic mirror, a solid obstacle, or a neutral gas cloud to create shocks with characteristic length and time scales that are both large enough to observe yet small enough to fit within the experiment. This enables study of the complex interplay of kinetic and fluid processes that mediate cosmic shocks and can generate non-thermal distributions, produce density and magnetic field enhancements much greater than predicted by fluid theory, and accelerate particles. An overview of the experimental capabilities of MSX will be presented, including diagnostics, selected recent results, and future directions. Supported by the DOE Office of Fusion Energy Sciences under contract DE-AC52-06NA25369.
NASA Astrophysics Data System (ADS)
Sokolova, Inna
2015-04-01
The presence of an acoustic wave on a microbarograph record is one of the discriminating signs of an atmospheric (surface-layer) or contact explosion. Nowadays there is a large number of air-wave records from chemical explosions recorded by the IMS infrasound stations installed during the recent decade, but only a small number of air-wave records from nuclear explosions, as atmospheric and contact nuclear explosions were conducted from 1945 to 1962, before the Limited Test Ban Treaty (the treaty banning nuclear weapon tests in the atmosphere, in outer space and under water) was signed in 1963 by Great Britain, the USSR and the USA. At that time only a small number of microbarographs were installed. The first infrasound stations in the USSR appeared in 1954, and by the time of the USSR's collapse the network consisted of 25 infrasound stations, 3 of which were located on Kazakhstan territory: in Kurchatov (East Kazakhstan), in Borovoye Observatory (North Kazakhstan) and in Talgar Observatory (Northern Tien Shan). The microbarograph of Talgar Observatory was installed in 1962 and recorded a large number of atmospheric nuclear explosions conducted at the Semipalatinsk Test Site and the Novaya Zemlya Test Site. The epicentral distance to the STS was ~700 km, and to the Novaya Zemlya Test Site ~3500 km. The historical analog records of the microbarograph were analyzed for the presence of acoustic waves. The selected records were digitized, and a database of acoustic signals from nuclear explosions was created. In addition, acoustic signals from atmospheric nuclear explosions conducted at the USSR test sites were recorded by analogue broadband seismic stations over a wide range of epicentral distances, 300-3600 km. These signals coincide well in form and spectral content with the records of microbarographs and can be used for monitoring and discrimination tasks in places where infrasound observations are absent. 
The nuclear explosions whose records contained acoustic waves ranged from 0.03 to 30 kt in yield for the STS, and from 8.3 to 25 Mt for the Novaya Zemlya Test Site region. The peculiarities of the wave pattern and spectral content of the acoustic-wave records, and the regularities relating acoustic-wave amplitude and period to explosion yield and distance, were investigated. The created database can be applied to different monitoring tasks, such as infrasound station calibration, discrimination of nuclear explosions, refinement of nuclear explosion parameters, determination of explosion yield, etc.
NASA Astrophysics Data System (ADS)
Kivotides, Demosthenes
2018-03-01
The interactions between vortex tubes and magnetic-flux rings in incompressible magnetohydrodynamics are investigated at high kinetic and magnetic Reynolds numbers, and over a wide range of the interaction parameter. The latter is a measure of the turnover time of the large-scale fluid motions in units of the magnetic damping time, or of the strength of the Lorentz force in units of the inertial force. The small interaction parameter results, which are related to kinematic turbulent dynamo studies, indicate the evolution of magnetic rings into flattened spirals wrapped around the vortex tubes. This process is also observed at intermediate interaction parameter values, only now the Lorentz force creates new vortical structures at the magnetic spiral edges, which have a striking solenoid vortex-line structure and endow the flattened magnetic-spiral surfaces with a curvature. At high interaction parameter values, Lorentz force effects become the decisive physical factor. They create two vortex rings (adjacent to the magnetic ring) that reconnect with the vortex tube by forming an intriguing, serpentine-like vortex-line structure, and generate, in turn, two new magnetic rings adjacent to the initial one. In this regime, the morphologies of the vorticity and magnetic field structures are similar. The effects of these structures on kinetic and magnetic energy spectra, as well as on the direction of energy transfer between flow and magnetic fields, are also indicated.
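For reference, the interaction parameter described above has a standard form in incompressible MHD. Writing σ for the electrical conductivity, B for the magnetic field strength, L and U for the large-scale length and velocity, and ρ for the density (this is the textbook definition, not reproduced from the paper):

```latex
N \;=\; \frac{\sigma B^2 L}{\rho U}
  \;=\; \frac{\tau_{\mathrm{eddy}}}{\tau_m},
\qquad
\tau_{\mathrm{eddy}} = \frac{L}{U},
\qquad
\tau_m = \frac{\rho}{\sigma B^2},
```

so N measures the eddy turnover time in units of the magnetic damping time, or equivalently the Lorentz force (~ σUB²) in units of the inertial force (~ ρU²/L), consistent with the two readings given in the abstract.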
Lean management systems: creating a culture of continuous quality improvement.
Clark, David M; Silvester, Kate; Knowles, Simon
2013-08-01
This is the first in a series of articles describing the application of Lean management systems to laboratory medicine. Lean is the term used to describe a principle-based continuous quality improvement (CQI) management system based on the Toyota production system (TPS) that has been evolving for over 70 years. Its origins go back much further and are heavily influenced by the work of W Edwards Deming and the scientific method that forms the basis of most quality management systems. Lean has two fundamental elements: a systematic approach to process improvement that removes waste in order to maximise value for the end-user of the service, and a commitment to respect, challenge and develop the people who work within the service to create a culture of continuous improvement. Lean principles have been applied to a growing number of healthcare systems throughout the world to improve the quality and cost-effectiveness of services for patients, and a number of laboratories from all the pathology disciplines have used Lean to shorten turnaround times, improve quality (reduce errors) and improve productivity. Increasingly, models used to plan and implement large-scale change in healthcare systems, including the National Health Service (NHS) change model, have evidence-based improvement methodologies (such as Lean CQI) as a core component. Consequently, a working knowledge of improvement methodology will be a core skill for pathologists involved in leadership and management.
Natural light illumination system.
Whang, Allen Jong-Woei; Chen, Yi-Yung; Yang, Shu-Hua; Pan, Po-Hsuan; Chou, Kao-Hsu; Lee, Yu-Chi; Lee, Zong-Yi; Chen, Chi-An; Chen, Cheng-Nan
2010-12-10
In recent years, green energy has undergone a lot of development and has been the subject of many applications. Many research studies have focused on illumination with sunlight as a means of saving energy and creating healthy lighting. Natural light illumination systems have collecting, transmitting, and lighting elements. Today, most daylight collectors use dynamic concentrators; these include Sun tracking systems. However, this design is too expensive to be cost effective. To create a low-cost collector that can be easily installed on a large building, we have designed a static concentrator, which is prismatic and cascadable, to collect sunlight for indoor illumination. The transmission component uses a large number of optical fibers. Because optical fibers are expensive, this means that most of the cost for the system will be related to transmission. In this paper, we also use a prismatic structure to design an optical coupler for coupling n to 1. With the n-to-1 coupler, the number of optical fibers necessary can be greatly reduced. Although this new natural light illumination system can effectively guide collected sunlight and send it to the basement or to other indoor places for healthy lighting, previously there has been no way to manage the collected sunlight when lighting was not desired. To solve this problem, we have designed an optical switch and a beam splitter to control and separate the transmitted light. When replacing traditional sources, the lighting should have similar characteristics, such as intensity distribution and geometric parameters, to those of traditional artificial sources. We have designed, simulated, and optimized an illumination lightpipe with a dot pattern to redistribute the collected sunlight from the natural light illumination system such that it equals the qualities of a traditional lighting system. 
We also provide an active lighting module that provides lighting from the natural light illumination system or LED auxiliary sources, depending on circumstances. The system is controlled by a light detector. We used optical simulation tools to design and simulate the efficiency of the active module. Finally, we used the natural light illumination system to provide natural illumination for a traffic tunnel. This system will provide a great number of benefits for the people who use it.
Gibbons, Brittney R; Xu, Minzhong; Bacić, Zlatko
2009-04-23
We report rigorous quantum three-dimensional calculations of highly excited intermolecular vibrational states of the van der Waals (vdW) complex phthalocyanine·He (Pc·He). The Pc molecule was treated as rigid and the intermolecular potential energy surface (IPES) was represented as a sum of atom-atom Lennard-Jones pair potentials. The IPES has four equivalent global minima on the diagonals of the square-shaped Pc, inside its five-membered rings, and four slightly shallower local minima between them, creating a distinctive corrugation pattern of the molecular nanosurface. The vdW vibrational states analyzed in this work extend to about two-thirds of the well depth of the IPES. For the assignment of the in-plane (xy) vdW vibrational excitations it was necessary to resort to two sets of quantum numbers, the Cartesian quantum numbers [ν_x, ν_y] and the quantum numbers (v, l) of the 2D isotropic oscillator, depending on the nodal structure and the symmetry of the wave functions. The delocalization of the He atom parallel to the molecular surface is large already in the ground vdW state. It increases rapidly with the number of quanta in the in-plane vdW vibrations, with maximum root-mean-square amplitudes Δx and Δy of about 7 au at excitation energies around 40 cm^-1. The wave functions of the highly excited states tend to be delocalized over the entire nanosurface and often have a square shape, reflecting that of the substrate.
ROOT: A C++ framework for petabyte data storage, statistical analysis and visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antcheva, I.; /CERN; Ballintijn, M.
2009-01-01
ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece in these analysis tools is the set of histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like PostScript and PDF or in bitmap formats like JPG or GIF. The results can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks - e.g. 
data mining in HEP - by using PROOF, which will take care of optimally distributing the work over the available resources in a transparent way.
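The fixed-width binning at the heart of the histogram classes described above is conceptually simple. The sketch below is a toy 1-D histogram in Python to illustrate the idea only; it is not ROOT's actual C++ TH1 interface.

```python
class Hist1D:
    """Toy fixed-width 1-D histogram illustrating the binning idea behind
    histogram classes such as ROOT's (this is not ROOT's API)."""

    def __init__(self, nbins, lo, hi):
        self.nbins, self.lo, self.hi = nbins, lo, hi
        self.counts = [0] * nbins
        self.underflow = 0  # entries below the axis range
        self.overflow = 0   # entries at or above the upper edge

    def fill(self, x):
        if x < self.lo:
            self.underflow += 1
        elif x >= self.hi:
            self.overflow += 1
        else:
            # Map x linearly onto a bin index in [0, nbins).
            i = int((x - self.lo) / (self.hi - self.lo) * self.nbins)
            self.counts[i] += 1

# Four unit-width bins over [0, 4); two entries fall outside the range.
h = Hist1D(4, 0.0, 4.0)
for v in [0.5, 1.5, 1.9, 3.2, -1.0, 7.0]:
    h.fill(v)
```

Tracking under/overflow separately, as sketched here, mirrors the bookkeeping that makes histograms safe accumulators for statistical analysis of arbitrarily distributed data.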
Heterogeneity and scale of sustainable development in cities.
Brelsford, Christa; Lobo, José; Hand, Joe; Bettencourt, Luís M A
2017-08-22
Rapid worldwide urbanization is at once the main cause and, potentially, the main solution to global sustainable development challenges. The growth of cities is typically associated with increases in socioeconomic productivity, but it also creates strong inequalities. Despite a growing body of evidence characterizing these heterogeneities in developed urban areas, not much is known systematically about their most extreme forms in developing cities and their consequences for sustainability. Here, we characterize the general patterns of income and access to services in a large number of developing cities, with an emphasis on an extensive, high-resolution analysis of the urban areas of Brazil and South Africa. We use detailed census data to construct sustainable development indices in hundreds of thousands of neighborhoods and show that their statistics are scale-dependent and point to the critical role of large cities in creating higher average incomes and greater access to services within their national context. We then quantify the general statistical trajectory toward universal basic service provision at different scales to show that it is characterized by varying levels of inequality, with initial increases in access being typically accompanied by growing disparities over characteristic spatial scales. These results demonstrate how extensions of these methods to other goals and data can be used over time and space to produce a simple but general quantitative assessment of progress toward internationally agreed sustainable development goals.
Reichardt, Anne; Polchow, Bianca; Shakibaei, Mehdi; Henrich, Wolfgang; Hetzer, Roland; Lueders, Cora
2013-01-01
Widespread use of human umbilical cord cells for cardiovascular tissue engineering requires production of large numbers of well-characterized cells under controlled conditions. In current research projects, the expansion of cells to be used to create a tissue construct is usually performed in static cell culture systems, which are often not satisfactory due to limitations in nutrient and oxygen supply. To overcome these limitations, dynamic cell expansion in bioreactor systems under controllable conditions could be an important tool, providing continuous perfusion for the generation of large numbers of viable pre-conditioned cells in a short time period. For this purpose, cells derived from human umbilical cord arteries were expanded in a rotating bed system bioreactor for up to 9 days. For a comparative study, cells were cultivated under static conditions in standard culture devices. Our results demonstrated that the microenvironment in the perfusion bioreactor was more favorable than that of the standard cell culture flasks. Data suggested that cells in the bioreactor expanded 39-fold (38.7 ± 6.1) in comparison to statically cultured cells (31.8 ± 3.0-fold). Large-scale production of cells in the bioreactor resulted in more than 3 × 10^8 cells from a single umbilical cord fragment within 9 days. Furthermore, cell doubling time was lower in the bioreactor system and production of extracellular matrix components was higher. With this study, we present an appropriate method to expand human umbilical cord artery-derived cells with high cellular proliferation rates in a well-defined bioreactor system under GMP conditions. PMID:23847691
Too much of a good thing? An observational study of prolific authors.
Wager, Elizabeth; Singhvi, Sanjay; Kleinert, Sabine
2015-01-01
Introduction. Researchers' productivity is usually measured in terms of their publication output. A minimum number of publications is required for some medical qualifications and professional appointments. However, authoring an unfeasibly large number of publications might indicate disregard of authorship criteria or even fraud. We therefore examined publication patterns of highly prolific authors in 4 medical specialties. Methods. We analysed Medline publications from 2008-12 using bespoke software to disambiguate individual authors focusing on 4 discrete topics (to further reduce the risk of combining publications from authors with the same name and affiliation). This enabled us to assess the number and type of publications per author per year. Results. While 99% of authors were listed on fewer than 20 publications in the 5-year period, 24 authors in the chosen areas were listed on at least 25 publications in a single year (i.e., >1 publication per 10 working days). Types of publication by the prolific authors varied but included substantial numbers of original research papers (not simply editorials or letters). Conclusions. Institutions and funders should be alert to unfeasibly prolific authors when measuring and creating incentives for researcher productivity.
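The feasibility threshold used above can be sanity-checked with a few lines of arithmetic. This is a sketch only: the 250 working-days-per-year figure is an assumption for illustration, not a number from the study.

```python
# Sketch: flag implausibly prolific authors, assuming ~250 working days/year.
WORKING_DAYS_PER_YEAR = 250

def days_per_publication(pubs_per_year: int) -> float:
    """Average working days available per publication."""
    return WORKING_DAYS_PER_YEAR / pubs_per_year

def is_unfeasibly_prolific(pubs_per_year: int, threshold: int = 25) -> bool:
    """The study's cut-off: at least 25 publications in a single year,
    i.e. more than one publication per 10 working days."""
    return pubs_per_year >= threshold

print(days_per_publication(25))    # 10.0 working days per paper
print(is_unfeasibly_prolific(25))  # True
print(is_unfeasibly_prolific(19))  # False
```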
Blueprint for a microwave trapped ion quantum computer.
Lekitsch, Bjoern; Weidt, Sebastian; Fowler, Austin G; Mølmer, Klaus; Devitt, Simon J; Wunderlich, Christof; Hensinger, Winfried K
2017-02-01
The availability of a universal quantum computer may have a fundamental impact on a vast number of research fields and on society as a whole. An increasingly large scientific and industrial community is working toward the realization of such a device. An arbitrarily large quantum computer may best be constructed using a modular approach. We present a blueprint for a trapped ion-based scalable quantum computer module, making it possible to create a scalable quantum computer architecture based on long-wavelength radiation quantum gates. The modules control all operations as stand-alone units, are constructed using silicon microfabrication techniques, and are within reach of current technology. To perform the required quantum computations, the modules make use of long-wavelength radiation-based quantum gate technology. To scale this microwave quantum computer architecture to a large size, we present a fully scalable design that makes use of ion transport between different modules, thereby allowing arbitrarily many modules to be connected to construct a large-scale device. A high error-threshold surface error correction code can be implemented in the proposed architecture to execute fault-tolerant operations. With appropriate adjustments, the proposed modules are also suitable for alternative trapped ion quantum computer architectures, such as schemes using photonic interconnects.
Collective Interaction of a Compressible Periodic Parallel Jet Flow
NASA Technical Reports Server (NTRS)
Miles, Jeffrey Hilton
1997-01-01
A linear instability model for multiple spatially periodic supersonic rectangular jets is solved using Floquet-Bloch theory. The disturbance environment is investigated using a two-dimensional perturbation of a mean flow. For all cases, large temporal growth rates are found. This work is motivated by an increase in mixing found in experimental measurements of spatially periodic supersonic rectangular jets with phase-locked screech. The results obtained in this paper suggest that phase-locked screech or edge tones may produce correlated spatially periodic jet flow downstream of the nozzles, creating a large spanwise multi-nozzle region where a disturbance can propagate. The large temporal growth rates for eddies obtained by the model calculations herein are related to the increased mixing, since eddies are the primary mechanism by which energy is transferred from the mean flow to the large turbulent structures. Calculations of growth rates are presented for a range of Mach numbers and nozzle spacings corresponding to experimental test conditions where screech-synchronized phase locking was observed. The model may be of significant scientific and engineering value in the quest to understand and construct supersonic mixer-ejector nozzles which provide increased mixing and reduced noise.
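The temporal-instability framing above can be made concrete with the standard Floquet-Bloch ansatz for a spanwise-periodic mean flow (generic notation, not necessarily the paper's):

```latex
\[
  q'(x,y,z,t) = \hat{q}(y,z)\, e^{i(\alpha x + \mu z - \omega t)},
  \qquad \hat{q}(y, z + L_z) = \hat{q}(y,z),
\]
```

where $L_z$ is the nozzle spacing (the spanwise period), $\alpha$ is a real streamwise wavenumber, $\mu$ is the Bloch wavenumber, and $\omega = \omega_r + i\omega_i$ is the complex frequency; in a temporal analysis the growth rate reported is $\omega_i > 0$.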
Pynamic: the Python Dynamic Benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, G L; Ahn, D H; de Supinski, B R
2007-07-10
Python is widely used in scientific computing to facilitate application development and to support features such as computational steering. Making full use of some of Python's popular features, which improve programmer productivity, leads to applications that access extremely high numbers of dynamically linked libraries (DLLs). As a result, some important Python-based applications severely stress a system's dynamic linking and loading capabilities and also cause significant difficulties for most development environment tools, such as debuggers. Furthermore, using the Python paradigm for large scale MPI-based applications can create significant file IO and further stress tools and operating systems. In this paper, we present Pynamic, the first benchmark program to support configurable emulation of a wide range of the DLL usage of Python-based applications for large scale systems. Pynamic has already accurately reproduced system software and tool issues encountered by important large Python-based scientific applications on our supercomputers. Pynamic provided insight for our system software and tool vendors, and our application developers, into the impact of several design decisions. As we describe the Pynamic benchmark, we will highlight some of the issues discovered in our large scale system software and tools using Pynamic.
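Pynamic itself generates and links real shared libraries. As a rough, self-contained illustration of the loading pattern it emulates, one can generate and import many small Python modules and time the load; module names and the count here are arbitrary stand-ins for DLLs.

```python
# Sketch of the stress pattern Pynamic emulates: generate many small
# modules, then dynamically load them all and measure the cost.
import importlib
import os
import sys
import tempfile
import time

def generate_modules(directory: str, count: int) -> list:
    """Write `count` tiny modules to `directory`, return their names."""
    names = []
    for i in range(count):
        name = "gen_mod_%d" % i
        with open(os.path.join(directory, name + ".py"), "w") as f:
            f.write("def entry():\n    return %d\n" % i)
        names.append(name)
    return names

def load_all(directory: str, names: list) -> float:
    """Import every generated module, returning elapsed wall time."""
    sys.path.insert(0, directory)
    start = time.perf_counter()
    for name in names:
        importlib.import_module(name)
    return time.perf_counter() - start

with tempfile.TemporaryDirectory() as d:
    names = generate_modules(d, 200)
    elapsed = load_all(d, names)
    print("imported %d modules in %.3fs" % (len(names), elapsed))
```

Real DLL stress additionally involves symbol relocation in the dynamic linker, which plain Python imports only approximate.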
The Inspiring Science Education project and the resources for HEP analysis by university students
NASA Astrophysics Data System (ADS)
Fassouliotis, Dimitris; Kourkoumelis, Christine; Vourakis, Stylianos
2016-11-01
The Inspiring Science Education outreach project has been running for more than two years, creating a large number of inquiry-based educational resources for high-school teachers and students. Its goal is the promotion of science education in schools through new methods built on inquiry-based education techniques, involving large consortia of European partners and implementation of large-scale pilots in schools. Recent hands-on activities, developing and testing the above-mentioned innovative applications, are reviewed. In general, there is a lack of educational scenarios and laboratory courses aimed at more advanced, namely university, students. At the University of Athens, for the last four years the HYPATIA on-line event analysis tool has been used as a lab course for fourth-year undergraduate physics students majoring in HEP. Up to now, the course was limited to visual inspection of a few tens of ATLAS events. Recently the course was enriched with additional analysis exercises, which involve large samples of events. Through a user-friendly interface, the students can analyse the samples and optimize the cut selection in order to search for new physics. The implementation of this analysis is described.
NASA Astrophysics Data System (ADS)
Beichner, Robert
2015-03-01
The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students to learn and succeed not only in acquiring content, but also to practice important 21st century skills like problem solving, communication, and teamsmanship. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicate highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).
How Many Is Too Many? On the Relationship between Research Productivity and Impact
Larivière, Vincent; Costas, Rodrigo
2016-01-01
Over the last few decades, the institutionalisation of quantitative research evaluations has created incentives for scholars to publish as many papers as possible. This paper assesses the effects of such incentives on individual researchers’ scientific impact, by analysing the relationship between their number of articles and their proportion of highly cited papers. In other words, does the share of an author’s top 1% most cited papers increase, remain stable, or decrease as his/her total number of papers increases? Using a large dataset of disambiguated researchers (N = 28,078,476) over the 1980–2013 period, this paper shows that, on average, the higher the number of papers a researcher publishes, the higher the proportion of those papers that are amongst the most cited. This relationship is stronger for older cohorts of researchers, while decreasing returns to scale are observed for recent cohorts. On the whole, these results suggest that for established researchers, the strategy of publishing as many papers as possible did not yield lower shares of highly cited publications, but such a pattern is not always observed for younger scholars. PMID:27682366
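The quantity the study tracks per author (the share of papers in the global top 1% by citations) is straightforward to compute. The sketch below uses synthetic, heavy-tailed citation counts and hypothetical authors, not the study's dataset.

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic "global" citation distribution (heavy-tailed, like real citations).
citations = rng.lognormal(mean=1.0, sigma=1.5, size=100_000)
top1_cutoff = np.quantile(citations, 0.99)  # global top-1% threshold

def top1_share(author_citations) -> float:
    """Fraction of an author's papers in the global top 1% by citations."""
    return float(np.mean(np.asarray(author_citations) >= top1_cutoff))

# Two hypothetical authors whose papers are drawn from the global pool.
prolific = rng.choice(citations, size=200)  # 200 papers over the period
modest = rng.choice(citations, size=10)     # 10 papers over the period
print(top1_share(prolific), top1_share(modest))
```

The study's finding concerns how this share varies with the author's paper count across millions of disambiguated researchers; the snippet only shows the metric's mechanics.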
NASA Astrophysics Data System (ADS)
Motes, Keith R.; Olson, Jonathan P.; Rabeaux, Evan J.; Dowling, Jonathan P.; Olson, S. Jay; Rohde, Peter P.
2015-05-01
Quantum number-path entanglement is a resource for supersensitive quantum metrology and in particular provides for sub-shot-noise or even Heisenberg-limited sensitivity. However, such number-path entanglement has been thought to be resource intensive to create in the first place—typically requiring either very strong nonlinearities, or nondeterministic preparation schemes with feedforward, which are difficult to implement. Very recently, arising from the study of quantum random walks with multiphoton walkers, as well as the study of the computational complexity of passive linear optical interferometers fed with single-photon inputs, it has been shown that such passive linear optical devices generate a superexponentially large amount of number-path entanglement. A logical question to ask is whether this entanglement may be exploited for quantum metrology. We answer that question here in the affirmative by showing that a simple, passive, linear-optical interferometer—fed with only uncorrelated, single-photon inputs, coupled with simple, single-mode, disjoint photodetection—is capable of significantly beating the shot-noise limit. Our result implies a pathway forward to practical quantum metrology with readily available technology.
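For context, the sensitivity limits referred to above are the standard definitions (not derived from this paper):

```latex
\[
  \Delta\phi_{\mathrm{SNL}} = \frac{1}{\sqrt{n}}, \qquad
  \Delta\phi_{\mathrm{HL}} = \frac{1}{n},
\]
```

where $n$ is the number of photons used in the interferometer; "sub-shot-noise" sensitivity means $\Delta\phi < 1/\sqrt{n}$, with $1/n$ the ultimate Heisenberg bound.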
Snow Tweets: Emergency Information Dissemination in a US County During 2014 Winter Storms
Bonnan-White, Jess; Shulman, Jason; Bielecke, Abigail
2014-01-01
Introduction: This paper describes how American federal, state, and local organizations created, sourced, and disseminated emergency information via social media in preparation for several winter storms in one county in the state of New Jersey (USA). Methods: Postings submitted to Twitter for three winter storm periods were collected from selected organizations, along with a purposeful sample of select private local users. Storm-related posts were analyzed for stylistic features (hashtags, retweet mentions, embedded URLs). Sharing and re-tweeting patterns were also mapped using NodeXL. Results: Results indicate emergency management entities were active in providing preparedness and response information during the selected winter weather events. A large number of posts, however, did not include unique Twitter features that maximize dissemination and discovery by users. Visual representations of interactions illustrate opportunities for developing stronger relationships among agencies. Discussion: Whereas previous research predominantly focuses on large-scale national or international disaster contexts, the current study instead provides needed analysis in a small-scale context. With practice during localized events like extreme weather, effective information dissemination in large events can be enhanced. PMID:25685629
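The stylistic-feature coding described above (hashtags, mentions, embedded URLs) can be sketched with simple pattern matching; the example tweet is a hypothetical illustration, not data from the study.

```python
import re

def tweet_features(text: str) -> dict:
    """Count the Twitter features the study coded in each post:
    hashtags, retweet/mentions, and embedded URLs."""
    return {
        "hashtags": len(re.findall(r"#\w+", text)),
        "mentions": len(re.findall(r"@\w+", text)),
        "urls": len(re.findall(r"https?://\S+", text)),
    }

tweet = ("RT @NJ_OEM: Plows out on Route 9. "
         "Updates: https://example.gov/storm #njsnow #safety")
print(tweet_features(tweet))
# {'hashtags': 2, 'mentions': 1, 'urls': 1}
```

A post with none of these features, like many flagged in the results, would score zero on all three counts and thus be less discoverable by other users.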
Targeting vector construction through recombineering.
Malureanu, Liviu A
2011-01-01
Gene targeting in mouse embryonic stem cells is an essential, yet still very expensive and highly time-consuming, method for studying gene function at the organismal level or for creating mouse models of human diseases. Conventional cloning-based methods have been largely used for generating targeting vectors, but are hampered by a number of limiting factors, including the variety and location of restriction enzymes in the gene locus of interest, the specific PCR amplification of repetitive DNA sequences, and cloning of large DNA fragments. Recombineering is a technique that exploits the highly efficient homologous recombination function encoded by λ phage in Escherichia coli. Bacteriophage-based recombination can recombine homologous sequences as short as 30-50 bases, allowing manipulations such as insertion, deletion, or mutation of virtually any genomic region. The wide availability of mouse genomic bacterial artificial chromosome (BAC) libraries covering most of the genome facilitates the retrieval of genomic DNA sequences from the bacterial chromosomes through recombineering. This chapter describes a successfully applied protocol and aims to be a detailed guide through the steps of generation of targeting vectors through recombineering.
Is the negative IOD during 2016 the reason for monsoon failure over southwest peninsular India?
NASA Astrophysics Data System (ADS)
Sreelekha, P. N.; Babu, C. A.
2018-01-01
The study investigates the mechanism responsible for the deficit rainfall over southwest peninsular India during the 2016 monsoon season. Analysis shows that the large-scale variation in circulation pattern due to the strong, negative Indian Ocean Dipole phenomenon was the reason for the deficit rainfall. A significant reduction in the number of northward-propagating monsoon-organized convections, together with their fast propagation over southwest peninsular India, resulted in reduced rainfall. On the other hand, their persistence for a longer time over the central part of India resulted in normal rainfall there. It was found that the strong convection over the eastern equatorial Indian Ocean creates strong convergence over that region. The combined effect of the sinking due to the well-developed Walker circulation originating over the eastern equatorial Indian Ocean and the descending limb of the monsoon Hadley cell caused strong subsidence over the western equatorial Indian Ocean. The tail of this large-scale sinking extended up to the southern parts of India. This hindered the formation of monsoon-organized convections, leading to a large rainfall deficit over southwest peninsular India during monsoon 2016.
The future of management: The NASA paradigm
NASA Technical Reports Server (NTRS)
Harris, Philip R.
1992-01-01
Prototypes of 21st century management, especially for large scale enterprises, may well be found within the aerospace industry. The space era inaugurated a number of projects of such scope and magnitude that another type of management had to be created to ensure successful achievement. The challenges will be not just in terms of technology and its management, but also human and cultural in dimension. Futurists, students of management, and those concerned with technological administration would do well to review the literature of emerging space management for its wider implications. NASA offers a paradigm, or demonstrated model, of future trends in the field of management at large. More research is needed on issues of leadership for Earth-based projects in space and space-based programs with managers there. There is a need to recognize that large scale technical enterprises, such as those undertaken in space, require a new form of management. NASA and other responsible agencies are urged to study excellence in space macromanagement, including the necessary multidisciplinary skills. Two recommended targets are the application of general living systems theory and macromanagement concepts for space stations in the 1990s.
Process and information integration via hypermedia
NASA Technical Reports Server (NTRS)
Hammen, David G.; Labasse, Daniel L.; Myers, Robert M.
1990-01-01
Success stories for advanced automation prototypes abound in the literature, but the deployments of practical large systems are few in number. There are several factors that militate against the maturation of such prototypes into products. Here, the integration of advanced automation software into large systems is discussed. Advanced automation systems tend to be specific applications that need to be integrated and aggregated into larger systems. Systems integration can be achieved by providing expert user-developers with verified tools to efficiently create small systems that interface to large systems through standard interfaces. The use of hypermedia as such a tool in the context of the ground control centers that support Shuttle and space station operations is explored. Hypermedia can be an integrating platform for data, conventional software, and advanced automation software, enabling data integration through the display of diverse types of information and through the creation of associative links between chunks of information. Further, hypermedia enables process integration through graphical invoking of system functions. Through analysis and examples, we illustrate how diverse information and processing paradigms can be integrated into a single software platform.
NASA Astrophysics Data System (ADS)
Ercolano, Barbara; Weber, Michael L.; Owen, James E.
2018-01-01
Circumstellar discs with large dust depleted cavities and vigorous accretion on to the central star are often considered signposts for (multiple) giant planet formation. In this Letter, we show that X-ray photoevaporation operating in discs with modest (factors 3-10) gas-phase depletion of carbon and oxygen at large radii ( > 15 au) yields the inner radius and accretion rates for most of the observed discs, without the need to invoke giant planet formation. We present one-dimensional viscous evolution models of discs affected by X-ray photoevaporation assuming moderate gas-phase depletion of carbon and oxygen, well within the range reported by recent observations. Our models use a simplified prescription for scaling the X-ray photoevaporation rates and profiles at different metallicity, and our quantitative result depends on this scaling. While more rigorous hydrodynamical modelling of mass-loss profiles at low metallicities is required to constrain the observational parameter space that can be explained by our models, the general conclusion that metal sequestering at large radii may be responsible for the observed diversity of transition discs is shown to be robust. Gap opening by giant planet formation may still be responsible for a number of observed transition discs with large cavities and very high accretion rate.
The Large Synoptic Survey Telescope
NASA Astrophysics Data System (ADS)
Axelrod, T. S.
2006-07-01
The Large Synoptic Survey Telescope (LSST) is an 8.4 meter telescope with a 10 square degree field of view and a 3 Gigapixel imager, planned to be on-sky in 2012. It is a dedicated all-sky survey instrument, with several complementary science missions. These include understanding dark energy through weak lensing and supernovae; exploring transients and variable objects; creating and maintaining a solar system map, with particular emphasis on potentially hazardous objects; and increasing the precision with which we understand the structure of the Milky Way. The instrument operates continuously at a rapid cadence, repetitively scanning the visible sky every few nights. The data flow rates from LSST are larger than those from current surveys by roughly a factor of 1000: a few GB/night is typical today, while LSST will deliver a few TB/night. From a computing hardware perspective, this factor of 1000 can be dealt with easily in 2012. The major issues in designing the LSST data management system arise from the fact that the number of people available to critically examine the data will not grow from current levels. This has a number of implications. For example, every large imaging survey today is resigned to the fact that their image reduction pipelines fail at some significant rate. Many of these failures are dealt with by rerunning the reduction pipeline under human supervision, with carefully "tweaked" parameters to deal with the original problem. For LSST, this will no longer be feasible. The problem is compounded by the fact that the processing must of necessity occur on clusters with large numbers of CPUs and disk drives, and with some components connected by long-haul networks. This inevitably results in a significant rate of hardware component failures, which can easily lead to further software failures. Both hardware and software failures must be seen as a routine fact of life rather than rare exceptions to normality.
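A back-of-envelope check of the quoted "few TB/night" figure: the camera size is from the abstract, while bytes per pixel and images per night are assumed values for illustration only.

```python
# Rough nightly raw-data volume for a rapid-cadence 3 Gigapixel survey camera.
pixels_per_image = 3e9    # 3 Gigapixel imager (from the abstract)
bytes_per_pixel = 2       # assumed: 16-bit raw pixels
images_per_night = 1000   # assumed: rapid-cadence all-sky scanning

nightly_bytes = pixels_per_image * bytes_per_pixel * images_per_night
print(f"{nightly_bytes / 1e12:.0f} TB/night")  # 6 TB/night
```

Under these assumptions the raw volume lands in the single-digit TB/night range, consistent with the factor-of-1000 jump over the few-GB/night surveys of the time.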
NASA Astrophysics Data System (ADS)
Jaenisch, Holger M.; Handley, James W.
2010-04-01
Malware programs are analogs of viruses. Viruses are comprised of large numbers of polypeptide proteins. The shape and function of a protein strand determines the functionality of the segment, similar to a subroutine in malware. The full combination of subroutines is the malware organism, just as a collection of polypeptides forms information-bearing protein structures. We propose to apply the methods of bioinformatics to analyze malware, providing a rich feature set for creating a unique and novel detection and classification scheme of the kind originally applied to amino acid sequencing in bioinformatics. Our proposed methods enable real-time in situ (in contrast to in vivo) detection applications.
lcps: Light curve pre-selection
NASA Astrophysics Data System (ADS)
Schlecker, Martin
2018-05-01
lcps searches for transit-like features (i.e., dips) in photometric data. Its main purpose is to restrict large sets of light curves to a number of files that show interesting behavior, such as drops in flux. While lcps is adaptable to any format of time series, its I/O module is designed specifically for photometry of the Kepler spacecraft. It extracts the pre-conditioned PDCSAP data from light curves files created by the standard Kepler pipeline. It can also handle csv-formatted ascii files. lcps uses a sliding window technique to compare a section of flux time series with its surroundings. A dip is detected if the flux within the window is lower than a threshold fraction of the surrounding fluxes.
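The sliding-window comparison lcps describes can be sketched in a few lines. This is a minimal illustration of the idea, not lcps's actual implementation; the window size and threshold fraction are arbitrary choices.

```python
import numpy as np

def has_dip(flux: np.ndarray, window: int = 10, threshold: float = 0.99) -> bool:
    """Flag a light curve if the median flux inside any sliding window
    drops below a threshold fraction of the median of the surrounding flux."""
    for start in range(0, len(flux) - window):
        inside = flux[start:start + window]
        outside = np.concatenate([flux[:start], flux[start + window:]])
        if np.median(inside) < threshold * np.median(outside):
            return True
    return False

flat = np.ones(200)            # featureless light curve
transit = flat.copy()
transit[90:100] = 0.97         # a 3% drop in flux, transit-like
print(has_dip(flat), has_dip(transit))   # False True
```

A pre-selection pass like this reduces a large set of light curves to the small subset worth detailed vetting, which is lcps's stated purpose.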
Nanolaminate microfluidic device for mobility selection of particles
Surh, Michael P [Livermore, CA; Wilson, William D [Pleasanton, CA; Barbee, Jr., Troy W.; Lane, Stephen M [Oakland, CA
2006-10-10
A microfluidic device made from nanolaminate materials that is capable of electrophoretic selection of particles on the basis of their mobility. Nanolaminate materials are generally alternating layers of two materials (one conducting, one insulating) that are made by sputter coating a flat substrate with a large number of layers. Specific subsets of the conducting layers are coupled together to form a single, extended electrode, interleaved with other similar electrodes. Thereby, the subsets of conducting layers may be dynamically charged to create time-dependent potential fields that can trap or transport charged colloidal particles. The addition of time-dependence is applicable to all geometries of nanolaminate electrophoretic and electrochemical designs from sinusoidal to nearly step-like.
Cognitive ergonomics of operational tools
NASA Astrophysics Data System (ADS)
Lüdeke, A.
2012-10-01
Control systems have become increasingly more powerful over the past decades. The availability of high data throughput and sophisticated graphical interactions has opened a variety of new possibilities. But has this helped to provide intuitive, easy to use applications to simplify the operation of modern large scale accelerator facilities? We will discuss what makes an application useful to operation and what is necessary to make a tool easy to use. We will show that even the implementation of a small number of simple application design rules can help to create ergonomic operational tools. The author is convinced that such tools do indeed help to achieve higher beam availability and better beam performance at accelerator facilities.
Räsänen, Pekka
2012-01-01
The information society has raised the value of numeracy. This is a challenge to schools and societies, because individual differences are already large in basic number sense and calculation skills. Approximately 5-7% of school children, i.e. one child in every classroom, have extensive difficulties keeping up with the pace of curricular demands. These children often have difficulties in other areas of learning too, but learning disorders can also manifest only in mathematics. Undiagnosed and untreated mathematical disorders become a lifelong handicap, creating a barrier to vocational education. They also hinder independent management of the mathematical activities of daily living. Low numeracy is a measurable social problem. Intensive and early special education or neuropsychological rehabilitation can diminish the negative effects of these disorders.
Camilo, Cesar M; Lima, Gustavo M A; Maluf, Fernando V; Guido, Rafael V C; Polikarpov, Igor
2016-01-01
Following burgeoning genomic and transcriptomic sequencing data, biochemical and molecular biology groups worldwide are implementing high-throughput cloning and mutagenesis facilities in order to obtain large numbers of soluble proteins for structural and functional characterization. Since manual primer design can be a time-consuming and error-prone step, particularly when working with hundreds of targets, the automation of the primer design process becomes highly desirable. HTP-OligoDesigner was created to provide the scientific community with a simple and intuitive online primer design tool for both laboratory-scale and high-throughput projects of sequence-independent gene cloning and site-directed mutagenesis, and a Tm calculator for quick queries.
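The simplest melting-temperature estimate a primer-design tool can offer is the classic Wallace rule for short oligos. This is a standard textbook formula shown for illustration, not necessarily the formula HTP-OligoDesigner implements.

```python
def wallace_tm(primer: str) -> int:
    """Wallace rule: Tm ≈ 2*(A+T) + 4*(G+C) in degrees C,
    a rough estimate valid for oligos shorter than ~14 nt."""
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

print(wallace_tm("ACGTACGTACGT"))  # 36
```

Production tools typically use nearest-neighbor thermodynamic models instead, which account for sequence context and salt concentration.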
Supramolecular catalysis beyond enzyme mimics.
Meeuwissen, Jurjen; Reek, Joost N H
2010-08-01
Supramolecular catalysis - the assembly of catalyst species by harnessing multiple weak intermolecular interactions - has, until recently, been dominated by enzyme-inspired approaches. Such approaches often attempt to create an enzyme-like 'active site' and have concentrated on reactions similar to those catalysed by enzymes themselves. Here, we discuss the application of supramolecular assembly to the more traditional transition metal catalysis and to small-molecule organocatalysis. The modularity of self-assembled multicomponent catalysts means that a relatively small pool of catalyst components can provide rapid access to a large number of catalysts that can be evaluated for industrially relevant reactions. In addition, we discuss how catalyst-substrate interactions can be tailored to direct substrates along particular reaction paths and selectivities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burke, F.; Donoghue, N.
1994-11-01
Vietnam's energy needs are clear and acute. Economic reforms have triggered a dynamic development process with a large and growing appetite for power. In view of the Vietnamese government's own shortages of capital, private international power companies have been identified as key problem-solvers in the country's efforts to meet a skyrocketing demand for energy resources. There are no restrictions on the nature of projects in which non-Vietnamese investors may participate. A number of legal issues need resolution before independent power producers can take advantage of the Republic's recently created Build-Operate-Transfer contracts (the BOT Regulations). This paper discusses these regulations and how they affect independent power producers.
Improving Grasp Skills Using Schema Structured Learning
NASA Technical Reports Server (NTRS)
Platt, Robert; Grupen, Roderic A.; Fagg, Andrew H.
2006-01-01
In the control-based approach to robotics, complex behavior is created by sequencing and combining control primitives. While it is desirable for the robot to autonomously learn the correct control sequence, searching through the large number of potential solutions can be time-consuming. This paper constrains this search to variations of a generalized solution encoded in a framework known as an action schema. A new algorithm, schema structured learning, is proposed that repeatedly executes variations of the generalized solution in search of instantiations that satisfy the action schema's objectives. This approach is tested in a grasping task where Dexter, the UMass humanoid robot, learns which reaching and grasping controllers maximize the probability of grasp success.
NASA Technical Reports Server (NTRS)
Freeman, D. C., Jr.; Powell, R. W.
1979-01-01
The aft center-of-gravity locations dictated by the large number of rocket engines required have been a continuing problem for single-stage-to-orbit vehicles. Recent work at Langley has demonstrated that these aft center-of-gravity problems become more pronounced for the proposed heavy-lift mission, creating some unique design problems for both the SSTO and staged vehicle systems. During the course of this study, an effort was made to bring together automated vehicle design, wind-tunnel tests, and flight control analyses to assess the impact of longitudinal and lateral-directional instability and control philosophy on entry vehicle design technology.
Gopalaswamy, Arjun M.; Royle, J. Andrew; Hines, James E.; Singh, Pallavi; Jathanna, Devcharan; Kumar, N. Samba; Karanth, K. Ullas
2012-01-01
1. The advent of spatially explicit capture-recapture models is changing the way ecologists analyse capture-recapture data. However, the advantages offered by these new models are not fully exploited because they can be difficult to implement. 2. To address this need, we developed a user-friendly software package, created within the R programming environment, called SPACECAP. This package implements Bayesian spatially explicit hierarchical models to analyse spatial capture-recapture data. 3. Given that a large number of field biologists prefer software with graphical user interfaces for analysing their data, SPACECAP is particularly useful as a tool to increase the adoption of Bayesian spatially explicit capture-recapture methods in practice.
Adaptable mission planning for kino-dynamic systems
NASA Astrophysics Data System (ADS)
Bush, Lawrence A. M.; Jimenez, Tony R.; Williams, Brian C.
Autonomous systems can perform tasks that are dangerous, monotonous, or even impossible for humans. To approach the problem of planning for Unmanned Aerial Vehicles (UAVs) we present a hierarchical method that combines a high-level planner with a low-level planner. We pose the problem of high-level planning as a Selective Traveling Salesman Problem (STSP) and select the order in which to visit our science sites. We then use a kino-dynamic path planner to create a large number of intermediate waypoints. This is a complete system that combines high and low level planning to achieve a goal. This paper demonstrates the benefits gained by adaptable high-level plans versus static and greedy plans.
Optimization of a Monte Carlo Model of the Transient Reactor Test Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kristin; DeHart, Mark; Goluoglu, Sedat
2017-03-01
The ultimate goal of modeling and simulation is to obtain reasonable answers to problems that lack easily evaluated representations, while minimizing the amount of computational resources used. With the advances of the last twenty years in large-scale computing centers, researchers have been able to create a multitude of tools that minimize the number of approximations necessary when modeling a system. The tremendous power of these centers requires the user to possess an immense amount of knowledge to optimize models for accuracy and efficiency. This paper seeks to evaluate the KENO model of TREAT to optimize calculational efforts.
ERIC Educational Resources Information Center
Weber, Jonathan
2006-01-01
Creating a digital library might seem like a task best left to a large research collection with a vast staff and generous budget. However, tools for successfully creating digital libraries are getting easier to use all the time. The explosion of people creating content for the web has led to the availability of many high-quality applications and…
Radial inlet guide vanes for a combustor
Zuo, Baifang; Simons, Derrick; York, William; Ziminsky, Willy S
2013-02-12
A combustor may include an interior flow path therethrough, a number of fuel nozzles in communication with the interior flow path, and an inlet guide vane system positioned about the interior flow path to create a swirled flow therein. The inlet guide vane system may include a number of windows positioned circumferentially around the fuel nozzles. The inlet guide vane system may also include a number of inlet guide vanes positioned circumferentially around the fuel nozzles and adjacent to the windows to create a swirled flow within the interior flow path.
a Cache Design Method for Spatial Information Visualization in 3d Real-Time Rendering Engine
NASA Astrophysics Data System (ADS)
Dai, X.; Xiong, H.; Zheng, X.
2012-07-01
A well-designed cache system has a positive impact on a 3D real-time rendering engine, and the effect becomes more pronounced as the volume of visualization data grows: the cache is what allows the engine to browse smoothly through data held outside core memory or fetched from the internet. In this article, a new cache design based on multiple threads and large files is introduced. The memory cache consists of three parts: the rendering cache, the pre-rendering cache and the elimination cache. The rendering cache stores the data currently being rendered by the engine; the pre-rendering cache stores data dispatched according to the position of the viewpoint in the horizontal and vertical directions; the elimination cache stores data evicted from the other two caches and waiting to be written to the disk cache. The disk cache uses multiple large files. When a disk cache file reaches the length limit (128 MB in the experiment), no item is eliminated from it; instead, a new large cache file is created, and if the number of large files exceeds a pre-set maximum, the earliest file is deleted from disk. In this way only one file is open for both writing and reading while the rest are read-only, so the disk cache can be used in a highly asynchronous way. The size of each large file is limited so that it can be mapped into core memory to save loading time. Multiple threads update the cache data: loading data into the rendering cache as soon as possible for rendering, loading data into the pre-rendering cache for the next few frames, and loading data that is not immediately needed into the elimination cache. In our experiment, two threads are designed.
The first thread organizes the memory cache according to the viewpoint and maintains two lists: an adding list, which indexes data to be loaded into the pre-rendering cache immediately, and a deleting list, which indexes data no longer visible in the rendered scene and to be moved to the elimination cache. The second thread moves data between the memory and disk caches according to the adding and deleting lists, creates download requests when data indexed in the adding list is found in neither the memory cache nor the disk cache, and moves elimination-cache data to the disk cache when the adding and deleting lists are empty. The cache designed as described proved reliable and efficient in our experiment; data loading time and file I/O time decreased sharply, especially as the rendering data grew larger.
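The disk-cache rotation policy described above can be sketched in a few lines. `DiskCache`, `file_limit`, and `max_files` are illustrative names, and real file I/O is replaced with in-memory lists to keep the sketch self-contained; only the rotation/eviction logic mirrors the article's design:

```python
class DiskCache:
    """Sketch of the multi-large-file disk cache policy.

    Items are only appended to the newest file; when that file reaches
    `file_limit` bytes a fresh file is started (no per-item eviction),
    and when the file count exceeds `max_files` the oldest file is
    dropped whole.  Only the newest file is ever written to; the rest
    stay read-only, which is what enables asynchronous access.
    """
    def __init__(self, file_limit, max_files):
        self.file_limit = file_limit
        self.max_files = max_files
        self.files = [[]]              # newest file is files[-1]
        self.sizes = [0]

    def write(self, key, nbytes):
        if self.sizes[-1] + nbytes > self.file_limit:
            self.files.append([])      # rotate: start a new large file
            self.sizes.append(0)
            if len(self.files) > self.max_files:
                self.files.pop(0)      # evict the earliest file whole
                self.sizes.pop(0)
        self.files[-1].append(key)
        self.sizes[-1] += nbytes
```

Because whole files are evicted rather than individual items, no bookkeeping is needed inside the read-only files, which is the point of the design.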
Number Sense on the Number Line
ERIC Educational Resources Information Center
Woods, Dawn Marie; Ketterlin Geller, Leanne; Basaraba, Deni
2018-01-01
A strong foundation in early number concepts is critical for students' future success in mathematics. Research suggests that visual representations, like a number line, support students' development of number sense by helping them create a mental representation of the order and magnitude of numbers. In addition, explicitly sequencing instruction…
System for creating on site, remote from a sterile environment, parenteral solutions
NASA Technical Reports Server (NTRS)
Finley, Mike (Inventor); Scharf, Mike (Inventor); Packard, Jeff (Inventor); Kipp, Jim (Inventor); Dudar, Tom (Inventor); Owens, Jim (Inventor); Bindokas, Al (Inventor)
1996-01-01
The present invention provides a system and method for creating on site, remote from a sterile environment, parenteral solutions in large volume parenteral containers for intravenous administration to a patient. In an embodiment, this system comprises an empty large volume container including at least one port for accessing an interior of the container. The port includes a sterilizing filter for sterilizing a fluid fed through the port into the container. A second container is provided including a solute and having means for coupling the second container to the large volume container and thereby providing fluid communication therebetween allowing the solute to be received within the interior of the container. A sterile water source is also provided including means for placing the sterile water source in fluid communication with the port and allowing water to flow from the sterile water source into the interior of the container. This allows the solute, and sterile water that has been fed through the filter, to create a parenteral solution in the large volume parenteral container.
Boundary effects and the onset of Taylor vortices
NASA Astrophysics Data System (ADS)
Rucklidge, A. M.; Champneys, A. R.
2004-05-01
It is well established that the onset of spatially periodic vortex states in the Taylor-Couette flow between rotating cylinders occurs at the value of Reynolds number predicted by local bifurcation theory. However, the symmetry breaking induced by the top and bottom plates means that the true situation should be a disconnected pitchfork. Indeed, experiments have shown that the fold on the disconnected branch can occur at more than double the Reynolds number of onset. This leads to an apparent contradiction: why should Taylor vortices set in so sharply at the Reynolds number predicted by the symmetric theory, given such large symmetry-breaking effects caused by the boundary conditions? This paper offers a generic explanation. The details are worked out using a Swift-Hohenberg pattern formation model that shares the same qualitative features as the Taylor-Couette flow. Onset occurs via a wall mode whose exponential tail penetrates further into the bulk of the domain as the driving parameter increases. In a large domain of length L, we show that the wall mode creates significant amplitude in the centre at parameter values that are O(L^-2) away from the value of onset in the problem with ideal boundary conditions. We explain this as being due to a Hamiltonian Hopf bifurcation in space, which occurs at the same parameter value as the pitchfork bifurcation of the temporal dynamics. The disconnected anomalous branch remains O(1) away from the onset parameter since it does not arise as a bifurcation from the wall mode.
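For reference, the Swift-Hohenberg model invoked above has the standard one-dimensional form below, with driving parameter r playing the role of the Reynolds number; the paper's variant may differ in its nonlinear terms, so this is the textbook form only:

```latex
\frac{\partial u}{\partial t} = r\,u - \left(1 + \partial_x^{2}\right)^{2} u - u^{3}
```

Onset of the periodic pattern in the ideal (unbounded or periodic) problem occurs at r = 0, which is the value the boundary-induced wall mode approaches as O(L^-2).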
Two LANL laboratory astrophysics experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Intrator, Thomas P.
2014-01-24
Two laboratory experiments are described that have been built at Los Alamos (LANL) to gain access to a wide range of fundamental plasma physics issues germane to astrophysical, space, and fusion plasmas. The overarching theme is magnetized plasma dynamics, which includes significant currents, MHD forces and instabilities, magnetic field creation and annihilation, sheared flows, and shocks. The Relaxation Scaling Experiment (RSX) creates current sheets and flux ropes that exhibit fully 3D dynamics and can kink, bounce, merge and reconnect, shred, and reform in complicated ways. Recent movies from a large data set describe the 3D magnetic structure of a driven and dissipative single flux rope that spontaneously self-saturates a kink instability. Examples of a coherent shear-flow dynamo driven by colliding flux ropes are also shown. The Magnetized Shock Experiment (MSX) uses field-reversed configuration (FRC) experimental hardware that forms and ejects FRCs at 150 km/s. This is sufficient to drive a collisionless magnetized shock when stagnated into a mirror stopping-field region with Alfvén Mach number MA = 3, so that supercritical shocks can be studied. We are building a plasmoid accelerator to drive Mach numbers MA >> 3 to access solar wind and more exotic astrophysical regimes. Unique features of this experiment include access to parallel, oblique, and perpendicular shocks, a shock region much larger than the ion gyroradius and ion inertial length, room for turbulence, and large magnetic and fluid Reynolds numbers.
Fisher, Frederick S.; Bultman, Mark W.; Pappagianis, Demosthenes
2000-01-01
Coccidioidomycosis (Valley Fever) is a disease caused by the inhalation of the arthroconidia (spores) of Coccidioides immitis, a fungus that lives in the soils of the southwestern United States. Although large numbers of people are exposed to the arthroconidia and are consequently infected, very few individuals contract the more serious forms of the disease. Earth scientists working in field areas where Coccidioides immitis is endemic have an increased risk of becoming infected: because field operations often disturb the upper surface of the ground, field workers may inhale large numbers of arthroconidia, which also increases their risk of developing more severe forms of the disease. Any other occupations or activities that create dusty conditions in endemic areas also carry an increased risk of infection. Risk management strategies can lower the incidence of infection and reduce the number of arthroconidia inhaled, thereby decreasing the chances of developing more serious disease. Dust control, by utilizing dust masks, and dust prevention, by limiting ground-disturbing activities, are the primary weapons against infection. However, infection risk can also be lowered by conducting field studies in the winter months; avoiding sites favorable for Coccidioides immitis growth; seeking prompt medical treatment if flu-like or respiratory illness occurs during, or within a few weeks following, fieldwork; getting a coccidioidin skin test to determine susceptibility to the disease; and by educating all members of the field party about the possibilities and consequences of infection.
A Weighted Configuration Model and Inhomogeneous Epidemics
NASA Astrophysics Data System (ADS)
Britton, Tom; Deijfen, Maria; Liljeros, Fredrik
2011-12-01
A random graph model with prescribed degree distribution and degree-dependent edge weights is introduced. Each vertex is independently equipped with a random number of half-edges and each half-edge is assigned an integer-valued weight according to a distribution that is allowed to depend on the degree of its vertex. Half-edges with the same weight are then paired randomly to create edges. An expression for the threshold for the appearance of a giant component in the resulting graph is derived using results on multi-type branching processes. The same technique also gives an expression for the basic reproduction number for an epidemic on the graph where the probability that a certain edge is used for transmission is a function of the edge weight (reflecting how closely 'connected' the corresponding vertices are). It is demonstrated that, if vertices with large degree tend to have large (small) weights on their edges and if the transmission probability increases with the edge weight, then it is easier (harder) for the epidemic to take off compared to a randomized epidemic with the same degree and weight distribution. A recipe for calculating the probability of a large outbreak in the epidemic and the size of such an outbreak is also given. Finally, the model is fitted to three empirical weighted networks of importance for the spread of contagious diseases and it is shown that R0 can be substantially over- or underestimated if the correlation between degree and weight is not taken into account.
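The graph construction described above (half-edges with degree-dependent weights, paired within each weight class) can be sketched directly. `weight_dist` is an assumed callable standing in for the paper's weight distribution; the model only requires that it may depend on the vertex degree:

```python
import random

def weighted_configuration_graph(degrees, weight_dist):
    """Build a weighted configuration-model graph.

    Vertex i receives degrees[i] half-edges; each half-edge draws an
    integer weight from weight_dist(degree of its vertex); half-edges
    with equal weight are then paired uniformly at random into edges.
    Any unmatched half-edge (odd count in a weight class) is discarded.
    Returns a list of (u, v, weight) edges.
    """
    half_edges = {}                        # weight -> list of vertices
    for v, d in enumerate(degrees):
        for _ in range(d):
            w = weight_dist(d)
            half_edges.setdefault(w, []).append(v)
    edges = []
    for w, stubs in half_edges.items():
        random.shuffle(stubs)              # uniform random pairing
        for i in range(0, len(stubs) - 1, 2):
            edges.append((stubs[i], stubs[i + 1], w))
    return edges
```

With a degenerate weight distribution this reduces to the ordinary configuration model, which is a useful sanity check.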
Time-series animation techniques for visualizing urban growth
Acevedo, W.; Masuoka, P.
1997-01-01
Time-series animation is a visually intuitive way to display urban growth. Animations of land-use change for the Baltimore-Washington region were generated by showing a series of images one after the other in sequential order. Before creating an animation, various issues which will affect the appearance of the animation should be considered, including the number of original data frames to use, the optimal animation display speed, the number of intermediate frames to create between the known frames, and the output media on which the animations will be displayed. To create new frames between the known years of data, the change in each theme (i.e. urban development, water bodies, transportation routes) must be characterized and an algorithm developed to create the in-between frames. Example time-series animations were created using a temporal GIS database of the Baltimore-Washington area. Creating the animations involved generating raster images of the urban development, water bodies, and principal transportation routes; overlaying the raster images on a background image; and importing the frames to a movie file. Three-dimensional perspective animations were created by draping each image over digital elevation data prior to importing the frames to a movie file. © 1997 Elsevier Science Ltd.
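The simplest possible in-between-frame algorithm is a linear cross-dissolve between two known frames. This is a sketch only; the paper characterizes change per theme (urban, water, roads) rather than interpolating raw pixel values, so `tween_frames` illustrates the idea, not the authors' method:

```python
def tween_frames(frame_a, frame_b, n_between):
    """Create n_between linearly interpolated frames between two known
    frames, each represented as a flat list of pixel values.

    Frame k sits at fraction t = k / (n_between + 1) of the way from
    frame_a to frame_b, so the in-betweens are evenly spaced and never
    duplicate the two known endpoint frames.
    """
    frames = []
    for k in range(1, n_between + 1):
        t = k / (n_between + 1)            # 0 < t < 1
        frames.append([(1 - t) * a + t * b
                       for a, b in zip(frame_a, frame_b)])
    return frames
```

One in-between frame between pixel rows [0, 0] and [4, 8] is their midpoint [2.0, 4.0].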
NASA Astrophysics Data System (ADS)
Hronusov, V. V.
2006-12-01
We suggest a method of using external public servers for rearranging, restructuring and rapidly sharing environmental data for the purpose of quick presentations in numerous GE clients. The method adds a new philosophy for the presentation (publication) of data (mostly static) stored in the public domain (e.g., Blue Marble, Visible Earth, etc.): freely accessible spreadsheets are published that contain enough information and links to the data. Because most large depositories of environmental monitoring data have rather simple net address systems and simple hierarchies, mostly based on the date and type of the data, it is possible to construct the http-based link to the file that contains the data. Publication of new data on the server is recorded by simply entering a new address into a cell in the spreadsheet. At the moment we use the EditGrid (www.editgrid.com) system as a spreadsheet platform. The generation of KML codes is achieved on the basis of XML data and XSLT procedures. Since the EditGrid environment supports "fetch" and similar commands, it is possible to create "smart-adaptive" KML generation on the fly based on data streams from RSS and XML sources. Previous GIS-based methods could combine high-definition data from various sources, but large-scale comparisons of dynamic processes have usually been out of reach of the technology. The suggested method allows an unlimited number of GE clients to view, review and compare dynamic and static processes from previously un-combinable sources, and at unprecedented scales. The ease of automated or computer-assisted georeferencing has already led to the translation of about 3000 raster public-domain imagery, point and linear data sources into GE language.
In addition, the suggested method allows a user to create rapid animations to demonstrate dynamic processes; such products are in high demand in education, meteorology, volcanology and potentially in a number of industries. In general, the new approach, which we have tested on numerous projects, saves time and energy in creating huge amounts of georeferenced data of various kinds, and thus provides an excellent tool for education and science.
NASA Astrophysics Data System (ADS)
Morrison, S. M.; Downs, R. T.; Golden, J. J.; Pires, A.; Fox, P. A.; Ma, X.; Zednik, S.; Eleish, A.; Prabhu, A.; Hummer, D. R.; Liu, C.; Meyer, M.; Ralph, J.; Hystad, G.; Hazen, R. M.
2016-12-01
We have developed a comprehensive database of copper (Cu) mineral characteristics. These data include crystallographic, paragenetic, chemical, locality, age, structural complexity, and physical property information for the 689 Cu mineral species approved by the International Mineralogical Association (rruff.info/ima). Synthesis of this large, varied dataset allows for in-depth exploration of statistical trends and visualization techniques. With social network analysis (SNA) and cluster analysis of minerals, we create sociograms and chord diagrams. SNA visualizations illustrate the relationships and connectivity between mineral species, which often form cliques associated with rock type and/or geochemistry. Using mineral ecology statistics, we analyze mineral-locality frequency distribution and predict the number of missing mineral species, visualized with accumulation curves. By assembly of 2-dimensional KLEE diagrams of co-existing elements in minerals, we illustrate geochemical trends within a mineral system. To explore mineral age and chemical oxidation state, we create skyline diagrams and compare trends with varying chemistry. These trends illustrate mineral redox changes through geologic time and correlate with significant geologic occurrences, such as the Great Oxidation Event (GOE) or Wilson Cycles.
Young, Robert S
2016-07-01
Frequent evolutionary birth and death events have created a large quantity of biologically important, lineage-specific DNA within mammalian genomes. The birth and death of DNA sequences is so frequent that the total number of these insertions and deletions in the human population remains unknown, although there are differences between these groups, e.g. transposable elements contribute predominantly to sequence insertion. Functional turnover - where the activity of a locus is specific to one lineage, but the underlying DNA remains conserved - can also drive birth and death. However, this does not appear to be a major driver of divergent transcriptional regulation. Both sequence and functional turnover have contributed to the birth and death of thousands of functional promoters in the human and mouse genomes. These findings reveal the pervasive nature of evolutionary birth and death and suggest that lineage-specific regions may play an important but previously underappreciated role in human biology and disease. © 2016 The Authors BioEssays Published by WILEY Periodicals, Inc.
Optimization of freeform surfaces using intelligent deformation techniques for LED applications
NASA Astrophysics Data System (ADS)
Isaac, Annie Shalom; Neumann, Cornelius
2018-04-01
For many years, optical designers have taken great interest in designing efficient optimization algorithms that bring significant improvement to an initial design. However, optimization is limited by the large number of parameters present in Non-Uniform Rational B-Spline (NURBS) surfaces. This limitation was overcome by an indirect technique known as optimization using freeform deformation (FFD). In this approach, the optical surface is placed inside a cubical grid. The vertices of this grid are modified, which deforms the underlying optical surface during the optimization. One of the challenges in this technique is the selection of appropriate vertices of the cubical grid, because these vertices share no relationship with the optical performance. When irrelevant vertices are selected, the computational complexity increases. Moreover, the surfaces created by them are not always feasible to manufacture, a problem faced by any optimization technique that creates freeform surfaces. This research therefore addresses these two important issues and provides feasible design techniques to solve them. Finally, the proposed techniques are validated using two different illumination examples: a street-lighting lens and a stop lamp for automobiles.
Helicon and Trivelpiece-Gould modes in uniform unbounded plasmas
NASA Astrophysics Data System (ADS)
Stenzel, R. L.; Urrutia, J. M.
2016-10-01
Helicon modes are whistler modes with angular orbital momentum caused by phase rotation in addition to the axial phase propagation. Although these modes have been associated with whistler eigenmodes in bounded plasma columns, they do exist in unbounded plasmas. Experiments in a large laboratory plasma show the wave excitation with phased antenna arrays, the wave field topology and the propagation of helicons. Low frequency whistlers can have two modes with different wavelengths at a given frequency, called helicons and Trivelpiece-Gould modes. The latter are whistler modes near the oblique cyclotron resonance. The oblique propagation is due to short radial wavelengths near the boundary. In unbounded plasmas, the oblique propagation arises from short azimuthal wavelengths. This has been observed in high-mode number helicons (e.g., m = 8). It creates wave absorption in the center of the helicon mode. The strong absorption of the wave can heat electrons and create perpendicular wave-particle interactions. These results may be of interest in space plasmas for scattering of energetic electrons and in helicon plasma sources for plasma processing and thruster applications. Work supported by NSF/DOE.
Surface Modification of ICF Target Capsules by Pulsed Laser Ablation
Carlson, Lane C.; Johnson, Michael A.; Bunn, Thomas L.
2016-06-30
Topographical modifications of spherical surfaces are imprinted on National Ignition Facility (NIF) target capsules by extending the capabilities of a recently developed full-surface (4π) laser ablation and mapping apparatus. The laser ablation method combines the precision, energy density and long reach of a focused laser beam to pre-impose sinusoidal modulations on the outside surface of High Density Carbon (HDC) capsules and the inside surface of Glow Discharge Polymer (GDP) capsules. The sinusoidal modulations described in this paper have vertical scales from sub-micron to tens of microns and wavelengths as small as 30 μm and as large as 200 μm. The modulated patterns are created by rastering a focused laser fired at discrete capsule surface locations for a specified number of pulses. The computer program developed to create these raster patterns uses inputs such as the laser beam intensity profile, the material removal function, the starting surface figure and the desired surface figure. The patterns are optimized to minimize surface roughness. Lastly, in this paper, simulated surfaces are compared with actual ablated surfaces measured using confocal microscopy.
Pathway Tools version 13.0: integrated software for pathway/genome informatics and systems biology
Paley, Suzanne M.; Krummenacker, Markus; Latendresse, Mario; Dale, Joseph M.; Lee, Thomas J.; Kaipa, Pallavi; Gilham, Fred; Spaulding, Aaron; Popescu, Liviu; Altman, Tomer; Paulsen, Ian; Keseler, Ingrid M.; Caspi, Ron
2010-01-01
Pathway Tools is a production-quality software environment for creating a type of model-organism database called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc integrates the evolving understanding of the genes, proteins, metabolic network and regulatory network of an organism. This article provides an overview of Pathway Tools capabilities. The software performs multiple computational inferences including prediction of metabolic pathways, prediction of metabolic pathway hole fillers and prediction of operons. It enables interactive editing of PGDBs by DB curators. It supports web publishing of PGDBs, and provides a large number of query and visualization tools. The software also supports comparative analyses of PGDBs, and provides several systems biology analyses of PGDBs including reachability analysis of metabolic networks, and interactive tracing of metabolites through a metabolic network. More than 800 PGDBs have been created using Pathway Tools by scientists around the world, many of which are curated DBs for important model organisms. Those PGDBs can be exchanged using a peer-to-peer DB sharing system called the PGDB Registry. PMID:19955237
A rat RNA-Seq transcriptomic BodyMap across 11 organs and 4 developmental stages
Yu, Ying; Fuscoe, James C.; Zhao, Chen; Guo, Chao; Jia, Meiwen; Qing, Tao; Bannon, Desmond I.; Lancashire, Lee; Bao, Wenjun; Du, Tingting; Luo, Heng; Su, Zhenqiang; Jones, Wendell D.; Moland, Carrie L.; Branham, William S.; Qian, Feng; Ning, Baitang; Li, Yan; Hong, Huixiao; Guo, Lei; Mei, Nan; Shi, Tieliu; Wang, Kevin Y.; Wolfinger, Russell D.; Nikolsky, Yuri; Walker, Stephen J.; Duerksen-Hughes, Penelope; Mason, Christopher E.; Tong, Weida; Thierry-Mieg, Jean; Thierry-Mieg, Danielle; Shi, Leming; Wang, Charles
2014-01-01
The rat has been used extensively as a model for evaluating chemical toxicities and for understanding drug mechanisms. However, its transcriptome across multiple organs, or developmental stages, has not yet been reported. Here we show, as part of the SEQC consortium efforts, a comprehensive rat transcriptomic BodyMap created by performing RNA-Seq on 320 samples from 11 organs of both sexes of juvenile, adolescent, adult and aged Fischer 344 rats. We catalogue the expression profiles of 40,064 genes, 65,167 transcripts, 31,909 alternatively spliced transcript variants and 2,367 non-coding genes/non-coding RNAs (ncRNAs) annotated in AceView. We find that organ-enriched, differentially expressed genes reflect the known organ-specific biological activities. A large number of transcripts show organ-specific, age-dependent or sex-specific differential expression patterns. We create a web-based, open-access rat BodyMap database of expression profiles with crosslinks to other widely used databases, anticipating that it will serve as a primary resource for biomedical research using the rat model. PMID:24510058
Extracting Databases from Dark Data with DeepDive.
Zhang, Ce; Shin, Jaeho; Ré, Christopher; Cafarella, Michael; Niu, Feng
2016-01-01
DeepDive is a system for extracting relational databases from dark data: the mass of text, tables, and images that are widely collected and stored but which cannot be exploited by standard relational tools. If the information in dark data - scientific papers, Web classified ads, customer service notes, and so on - were instead in a relational database, it would give analysts a massive and valuable new set of "big data." DeepDive is distinctive when compared to previous information extraction systems in its ability to obtain very high precision and recall at reasonable engineering cost; in a number of applications, we have used DeepDive to create databases with accuracy that meets that of human annotators. To date we have successfully deployed DeepDive to create data-centric applications for insurance, materials science, genomics, paleontology, law enforcement, and others. The data unlocked by DeepDive represents a massive opportunity for industry, government, and scientific researchers. DeepDive is enabled by an unusual design that combines large-scale probabilistic inference with a novel developer interaction cycle. This design is enabled by several core innovations around probabilistic training and inference.
[Evaluation of Educational Effect of Problem-Posing System in Nursing Processing Study].
Tsuji, Keiko; Takano, Yasuomi; Yamakawa, Hiroto; Kaneko, Daisuke; Takai, Kiyako; Kodama, Hiromi; Hagiwara, Tomoko; Komatsugawa, Hiroshi
2015-09-01
The nursing processing study is generally difficult because nursing college students must both understand knowledge and utilize it. We have developed an integrated system to help students understand, utilize, and share knowledge. We added a problem-posing function to this system, expecting that students would come to understand the nursing processing study more deeply through the new system. The system consists of four steps: create a problem, create an answer input section, create a hint, and verification. Nursing students created problems related to nursing processing with this system. When we gave a lecture on nursing processing to second-year students of A university, we tried the problem-creating function of the system. We evaluated its effect by the number of problems created and by their contents, that is, whether each problem corresponded to the learning stage of the lecture. We also evaluated the correlation between these measures and regular examination and report scores. We found the following: 1. a weak correlation between the number of created problems and report score (r=0.27); 2. significant differences between the regular examination and report scores of students who created problems corresponding to the learning stage and those of students who created problems not corresponding to it (P<0.05). These results suggest that problem-posing is effective for consolidating and utilizing knowledge in lectures on nursing processing theory.
Propulsion at low Reynolds number via beam extrusion
NASA Astrophysics Data System (ADS)
Gosselin, Frederick; Neetzow, Paul
2014-03-01
We present experimental and theoretical results on the extrusion of a slender beam in a viscous fluid. We are particularly interested in the force necessary to extrude the beam as it buckles with large amplitude due to viscous friction. The problem is inspired by the propulsion of Paramecium via trichocyst extrusion. Self-propulsion in micro-organisms is mostly achieved through the beating of flagella or cilia. However, to avoid a severe aggression, unicellular Paramecium has been observed to extrude trichocysts in the direction of the aggression to burst away. These trichocysts are rod-like organelles which, upon activation, grow to about 40 μm in length in 3 milliseconds before detaching from the animal. The drag force created by these extruding rods pushing against the viscous fluid generates thrust in the opposite direction. We developed an experimental setup to measure the force required to push a steel piano wire into an aquarium filled with corn syrup. This setup offers a near-zero Reynolds number, and allows studying deployments for a range of constant extrusion speeds. The experimental results are reproduced with a numerical model coupling a large amplitude Euler-Bernoulli beam theory with a fluid load model proportional to the local beam velocity. This study was funded in part by the Natural Sciences and Engineering Research Council of Canada.
The origin and emergence of life under impact bombardment
Cockell, Charles S
2006-01-01
Craters formed by asteroids and comets offer a number of possibilities as sites for prebiotic chemistry, and they invite a literal application of Darwin's ‘warm little pond’. Some of these attributes, such as prolonged circulation of heated water, are found in deep-ocean hydrothermal vent systems, previously proposed as sites for prebiotic chemistry. However, impact craters host important characteristics in a single location, which include the formation of diverse metal sulphides, clays and zeolites as secondary hydrothermal minerals (which can act as templates or catalysts for prebiotic syntheses), fracturing of rock during impact (creating a large surface area for reactions), the delivery of iron in the case of the impact of iron-containing meteorites (which might itself act as a substrate for prebiotic reactions), diverse impact energies resulting in different rates of hydrothermal cooling and thus organic syntheses, and the indiscriminate nature of impacts into every available lithology—generating large numbers of ‘experiments’ in the origin of life. Following the evolution of life, craters provide cryptoendolithic and chasmoendolithic habitats, particularly in non-sedimentary lithologies, where limited pore space would otherwise restrict colonization. In impact melt sheets, shattered, mixed rocks ultimately provided diverse geochemical gradients, which in present-day craters support the growth of microbial communities. PMID:17008223
NASA Technical Reports Server (NTRS)
Lamb, M.; Stallings, R. L., Jr.
1976-01-01
An experimental investigation was conducted in the Langley Unitary Plan wind tunnel to estimate the peak aerodynamic heating on the space shuttle solid rocket booster during the descent phase of its flight. Heat transfer measurements were obtained using 0.013 scale models instrumented with thermocouples at a Mach number of 3.70, Reynolds number per meter of 11.48 million, and angles of attack from 0 to 180 deg. At angles of attack of 0 and 180 deg, heat transfer measurements on the cylindrical section of the model between the conical nose and ring interaction region were in good agreement with flat plate strip theory for laminar and turbulent flow. At angles of attack up to 30 deg, measurements on this section of the model were in good agreement with laminar swept-cylinder theory, whereas at angles of attack from 120 to 180 deg, the measurements were in good agreement with turbulent swept-cylinder theory. The good agreement with turbulent theory indicated that large flow disturbances created by the nozzle and afterbody flare at these large angles of attack influenced the downstream heating primarily by promoting boundary layer transition. Measurements obtained at 90 deg angle of attack were indicative of laminar flow.
Composition bias and the origin of ORFan genes
Yomtovian, Inbal; Teerakulkittipong, Nuttinee; Lee, Byungkook; Moult, John; Unger, Ron
2010-01-01
Motivation: Intriguingly, sequence analysis of genomes reveals that a large number of genes are unique to each organism. The origin of these genes, termed ORFans, is not known. Here, we explore the origin of ORFan genes by defining a simple measure called ‘composition bias’, based on the deviation of the amino acid composition of a given sequence from the average composition of all proteins of a given genome. Results: For a set of 47 prokaryotic genomes, we show that the amino acid composition bias of real proteins, random ‘proteins’ (created by using the nucleotide frequencies of each genome) and ‘proteins’ translated from intergenic regions are distinct. For ORFans, we observed a correlation between their composition bias and their relative evolutionary age. Recent ORFan proteins have compositions more similar to those of random ‘proteins’, while the compositions of more ancient ORFan proteins are more similar to those of the set of all proteins of the organism. This observation is consistent with an evolutionary scenario wherein ORFan genes emerged and underwent a large number of random mutations and selection, eventually adapting to the composition preference of their organism over time. Contact: ron@biocoml.ls.biu.ac.il Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20231229
OneD: increasing reproducibility of Hi-C samples with abnormal karyotypes.
Vidal, Enrique; le Dily, François; Quilez, Javier; Stadhouders, Ralph; Cuartero, Yasmina; Graf, Thomas; Marti-Renom, Marc A; Beato, Miguel; Filion, Guillaume J
2018-05-04
The three-dimensional conformation of genomes is an essential component of their biological activity. The advent of the Hi-C technology enabled unprecedented progress in our understanding of genome structures. However, Hi-C is subject to systematic biases that can compromise downstream analyses. Several strategies have been proposed to remove those biases, but the issue of abnormal karyotypes has received little attention. Many experiments are performed in cancer cell lines, which typically harbor large-scale copy number variations that create visible defects on the raw Hi-C maps. The consequences of these widespread artifacts on the normalized maps are mostly unexplored. We observed that current normalization methods are not robust to the presence of large-scale copy number variations, potentially obscuring biological differences and enhancing batch effects. To address this issue, we developed an alternative approach designed to take into account chromosomal abnormalities. The method, called OneD, increases reproducibility among replicates of Hi-C samples with abnormal karyotype, outperforming previous methods significantly. On normal karyotypes, OneD performed as well as state-of-the-art methods, making it a safe choice for Hi-C normalization. OneD is fast and scales well in terms of computing resources for resolutions up to 5 kb.
Ghosh, Purabi R.; Fawcett, Derek; Sharma, Shashi B.; Poinern, Gerrard E. J.
2017-01-01
The quantities of organic waste produced globally by aquaculture and horticulture are extremely large and offer an attractive renewable source of biomolecules and bioactive compounds. The availability of such large and diverse sources of waste materials creates a unique opportunity to develop new recycling and food waste utilisation strategies. The aim of this review is to report the current status of research in the emerging field of producing high-value nanoparticles from food waste. Eco-friendly biogenic processes are quite rapid, and are usually carried out at normal room temperature and pressure. These alternative clean technologies do not rely on the use of the toxic chemicals and solvents commonly associated with traditional nanoparticle manufacturing processes. The relatively small number of research articles in the field have been surveyed and evaluated. Among the diversity of waste types, promising candidates and their ability to produce various high-value nanoparticles are discussed. Experimental parameters, nanoparticle characteristics and potential applications for nanoparticles in pharmaceuticals and biomedical applications are discussed. In spite of the advantages, there are a number of challenges, including nanoparticle reproducibility and understanding the formation mechanisms between different food waste products. Thus, there is considerable scope and opportunity for further research in this emerging field. PMID:28773212
Two non linear dynamics plasma astrophysics experiments at LANL
NASA Astrophysics Data System (ADS)
Intrator, T. P.; Weber, T. E.; Feng, Y.; Sears, J. A.; Swan, H.; Hutchinson, T.; Boguski, J.; Gao, K.; Chapdelaine, L.; Dunn, J.
2013-10-01
Two laboratory experiments at Los Alamos National Laboratory (LANL) have been built to gain access to a wide range of fundamental plasma physics issues germane to astrophysical, space, and fusion plasmas. The overarching theme is magnetized plasma dynamics, including currents, MHD forces and instabilities, sheared flows and shocks, and the creation and annihilation of magnetic fields. The Reconnection Scaling Experiment (RSX) creates current sheets and flux ropes that exhibit fully 3D dynamics and can kink, bounce, merge and reconnect, shred, and reform in complicated ways. The most recent movies from a large, detailed data set describe the 3D magnetic structure and helicity budget of a driven and dissipative system that spontaneously self-saturates a kink instability. The Magnetized Shock Experiment (MSX) uses a field-reversed configuration (FRC) that is ejected at high speed and then stagnated against a stopping mirror field, which drives a collisionless magnetized shock. A plasmoid accelerator will also access supercritical shocks at much larger Alfvén Mach numbers. Unique features include access to parallel, oblique and perpendicular shocks in regions much larger than the ion gyroradius and inertial length, large magnetic and fluid Reynolds numbers, and volume for turbulence. Center for Magnetic Self Organization, NASA Geospace NNHIOA044I-Basic, Department of Energy DE-AC52-06NA25369.
Can Big Pharma Behavior Change to Benefit Patients?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saul Rosenberg MD & Gilbert Chu PhD
2005-03-09
Professors Rosenberg and Chu will discuss how the behavior of large pharmaceutical companies can sometimes compromise the needs of patients. The behavior includes strategies for lobbying Congress, exploiting patent law, targeting large consumer markets, creating demand from patients, and influencing physicians. In some cases, this behavior has created ethical and legal problems. The talk will conclude with a discussion of possible ways to encourage changes that will benefit patients.
O'Connell, Megan E; Tuokko, Holly; Voll, Stacey; Simard, Martine; Griffith, Lauren E; Taler, Vanessa; Wolfson, Christina; Kirkland, Susan; Raina, Parminder
We detail a new approach to the creation of normative data for neuropsychological tests. The traditional approach to normative data creation is to make demographic adjustments based on observations of correlations between single neuropsychological tests and selected demographic variables. We argue, however, that this does not describe the implications for clinical practice, such as increased likelihood of misclassification of cognitive impairment, nor does it elucidate the impact on decision-making with a neuropsychological battery. We propose base rate analyses; specifically, differential base rates of impaired scores between theoretical and actual base rates as the basis for decisions to create demographic adjustments within normative data. Differential base rates empirically describe the potential clinical implications of failing to create an appropriate normative group. We demonstrate this approach with data from a short telephone-administered neuropsychological battery given to a large, neurologically healthy sample aged 45-85 years. We explored whether adjustments for age and medical conditions were warranted based on differential base rates of spuriously impaired scores. Theoretical base rates underestimated the frequency of impaired scores in older adults and overestimated the frequency of impaired scores in younger adults, providing an evidence base for the creation of age-corrected normative data. In contrast, the number of medical conditions (numerous cardiovascular, hormonal, and metabolic conditions) was not related to differential base rates of impaired scores. Despite a small correlation between the number of medical conditions and each neuropsychological variable, normative adjustments for the number of medical conditions do not appear warranted. Implications for the creation of normative data are discussed.
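The differential base rate logic described above can be illustrated with a short sketch (an illustration only, not the authors' procedure): with k independent tests and a per-test impairment cutoff at the pth percentile, the theoretical base rate of at least one "impaired" score is 1 - (1 - p)^k, which can then be compared against the rate actually observed in a normative sample.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical battery: 5 standardized test scores per person
n_people, n_tests = 1000, 5
scores = rng.standard_normal((n_people, n_tests))

cutoff = -1.645   # 5th percentile of a standard normal
p = 0.05          # per-test probability of an "impaired" score

# Theoretical base rate of >=1 impaired score, assuming independent tests
theoretical = 1 - (1 - p) ** n_tests

# Actual base rate observed in the (simulated) normative sample
actual = np.mean((scores < cutoff).any(axis=1))

print(f"theoretical: {theoretical:.3f}, actual: {actual:.3f}")
```

In real batteries, test scores are correlated, so the actual base rate of at least one impaired score is typically lower than the independence-based theoretical value; a large gap that varies with age is the kind of evidence the authors use to justify demographic adjustments.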
Efficient Use of Video for 3d Modelling of Cultural Heritage Objects
NASA Astrophysics Data System (ADS)
Alsadik, B.; Gerke, M.; Vosselman, G.
2015-03-01
Currently, there is a rapid development in the techniques of automated image-based modelling (IBM), especially in advanced structure-from-motion (SFM) and dense image matching methods, and in camera technology. One possibility is to use video imaging to create 3D reality-based models of cultural heritage architectures and monuments. In practice, video imaging is much easier to apply than still-image shooting in IBM techniques because the latter needs thorough planning and proficiency. However, one faces mainly three problems when video image sequences are used for highly detailed modelling and dimensional survey of cultural heritage objects: the low resolution of video images, the need to process a large number of short-baseline video images, and blur effects due to camera shake on a significant number of images. In this research, the feasibility of using video images for efficient 3D modelling is investigated. A method is developed to find the minimal significant number of video images in terms of object coverage and blur effect. This reduction in video images decreases the processing time and yields a reliable textured 3D model comparable with models produced by still imaging. Two experiments, modelling a building and a monument, are carried out using a video image resolution of 1920×1080 pixels. Internal and external validation of the produced models is applied to determine the final predicted accuracy and the model level of detail. Depending on the object complexity and the video imaging resolution, the tests show an achievable average accuracy between 1 and 5 cm when using video imaging, which is suitable for visualization, virtual museums and low-detail documentation.
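The frame-selection step described above (discarding blurred and redundant video frames before dense matching) can be sketched as follows. The variance-of-Laplacian blur metric and the subsampling/threshold parameters are common illustrative choices, not necessarily the authors' method:

```python
import numpy as np

def laplacian_variance(img: np.ndarray) -> float:
    """Blur metric: variance of the discrete Laplacian; lower means blurrier."""
    lap = (-4 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())

def select_sharp_frames(frames, keep_every=10, blur_thresh=1e-4):
    """Subsample the sequence to thin out short baselines,
    then drop frames below the sharpness threshold."""
    candidates = frames[::keep_every]
    return [f for f in candidates if laplacian_variance(f) > blur_thresh]

# Synthetic demo: a sharp random-texture "frame" vs. a box-blurred copy
rng = np.random.default_rng(1)
sharp = rng.random((100, 100))
kernel = np.ones(9) / 9
blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, 'same'), 1, sharp)
blurred = np.apply_along_axis(lambda c: np.convolve(c, kernel, 'same'), 0, blurred)

print(laplacian_variance(sharp) > laplacian_variance(blurred))  # True

frames = [sharp, blurred, sharp]
kept = select_sharp_frames(frames, keep_every=1,
                           blur_thresh=2 * laplacian_variance(blurred))
print(len(kept))
```

In a real pipeline the threshold would be set relative to the distribution of blur scores across the whole sequence, and coverage checks would ensure that discarding frames does not leave parts of the object unobserved.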
Orth, Patrick; Zurakowski, David; Alini, Mauro; Cucchiarini, Magali
2013-01-01
Advanced tissue engineering approaches for articular cartilage repair in the knee joint rely on translational animal models. In these investigations, cartilage defects may be established either in one joint (unilateral design) or in both joints of the same animal (bilateral design). We hypothesized that a lower intraindividual variability following the bilateral strategy would reduce the number of required joints. Standardized osteochondral defects were created in the trochlear groove of 18 rabbits. In 12 animals, defects were produced unilaterally (unilateral design; n=12 defects), while defects were created bilaterally in 6 animals (bilateral design; n=12 defects). After 3 weeks, osteochondral repair was evaluated histologically applying an established grading system. Based on intra- and interindividual variabilities, required sample sizes for the detection of discrete differences in the histological score were determined for both study designs (α=0.05, β=0.20). Coefficients of variation (%CV) of the total histological score values were 1.9-fold increased following the unilateral design when compared with the bilateral approach (26% versus 14% CV). The resulting numbers of joints needed to treat were always higher for the unilateral design, resulting in an up to 3.9-fold increase in the required number of experimental animals. This effect was most pronounced for the detection of small effect sizes and when estimating large standard deviations. The data underline the possible benefit of bilateral study designs for the decrease of sample size requirements for certain investigations in articular cartilage research. These findings might also be transferred to other scoring systems, defect types, or translational animal models in the field of cartilage tissue engineering. PMID:23510128
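The sample size logic behind the reported increase can be sketched with the standard two-sample formula n = 2(z_α + z_β)²(σ/δ)² per group. The effect size and standard deviations below are hypothetical, with the unilateral SD scaled by the 1.9-fold variability increase reported in the abstract:

```python
import math

def n_per_group(sd: float, delta: float,
                z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Two-sample n per group to detect a mean difference `delta`
    with alpha=0.05 (two-sided) and approximately 80% power."""
    return math.ceil(2 * (z_alpha + z_beta) ** 2 * (sd / delta) ** 2)

delta = 2.0                          # hypothetical score difference to detect
sd_bilateral = 3.0                   # hypothetical SD under the bilateral design
sd_unilateral = 1.9 * sd_bilateral   # 1.9-fold higher variability (as reported)

print(n_per_group(sd_bilateral, delta), n_per_group(sd_unilateral, delta))
```

Because n scales with the square of the standard deviation, a 1.9-fold increase in variability inflates the required sample size by roughly 1.9² ≈ 3.6-fold, consistent with the up to 3.9-fold increase reported.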
Using Galaxy to Perform Large-Scale Interactive Data Analyses
Hillman-Jackson, Jennifer; Clements, Dave; Blankenberg, Daniel; Taylor, James; Nekrutenko, Anton
2012-01-01
Innovations in biomedical research technologies continue to provide experimental biologists with novel and increasingly large genomic and high-throughput data resources to be analyzed. As creating and obtaining data has become easier, the key decision faced by many researchers is a practical one: where and how should an analysis be performed? Datasets are large and analysis tool set-up and use is riddled with complexities outside of the scope of core research activities. The authors believe that Galaxy (galaxyproject.org) provides a powerful solution that simplifies data acquisition and analysis in an intuitive web-application, granting all researchers access to key informatics tools previously only available to computational specialists working in Unix-based environments. We will demonstrate through a series of biomedically relevant protocols how Galaxy specifically brings together 1) data retrieval from public and private sources, for example, UCSC’s Eukaryote and Microbial Genome Browsers (genome.ucsc.edu), 2) custom tools (wrapped Unix functions, format standardization/conversions, interval operations) and 3rd party analysis tools, for example, Bowtie/Tuxedo Suite (bowtie-bio.sourceforge.net), Lastz (www.bx.psu.edu/~rsharris/lastz/), SAMTools (samtools.sourceforge.net), FASTX-toolkit (hannonlab.cshl.edu/fastx_toolkit), and MACS (liulab.dfci.harvard.edu/MACS), and creates results formatted for visualization in tools such as the Galaxy Track Browser (GTB, galaxyproject.org/wiki/Learn/Visualization), UCSC Genome Browser (genome.ucsc.edu), Ensembl (www.ensembl.org), and GeneTrack (genetrack.bx.psu.edu). Galaxy rapidly has become the most popular choice for integrated next generation sequencing (NGS) analytics and collaboration, where users can perform, document, and share complex analysis within a single interface in an unprecedented number of ways. PMID:18428782
Urbanization and the problem of restricting the growth of very large cities.
Bialkovskaia, V; Novikov, V
1983-10-01
This article discusses the problem of preventing the excessive growth of very large cities to the detriment of the development of smaller urban settlements in the USSR. The increase in the size of the urban population throughout the entire USSR is mainly connected with the increase in the number of city dwellers. In 1960 and 1970 the number of the largest cities in the USSR increased, along with the share of the nation's population living in these large cities. The low natural increase in the population of very large cities creates a high demand for labor power, which must come from the population of other cities. In 1970-1980, Moscow, one of the largest millionaire cities, had the lowest population growth rate of all major USSR cities (113.7%). The growth of Moscow and other very large cities in the last few years has been due to the mechanical increase in population and the increase in area. The analysis of Moscow's pattern of population growth over time focuses on changes in the level of availability of social and everyday services. The prewar period is characterized by a reserve of labor resources, the highest growth in industry and science, but a low overall population dynamic in the city. In the postwar period there was a significant decline in the annual increase of all indicators; this was a period of strong social development of the city. The period between 1966 and 1980 shows a further slowdown in the growth rate of city-forming branches, accompanied by an accelerated development of municipal service branches. The demand for measures to restrict the growth of very large Soviet cities depends on: 1) the reorientation of the development of the economic base, 2) the restructuring of their economy, and 3) the siting of various types of production of goods and services. Developing the specialization of the urban economy consists of planned development of the production of goods and services based on the use of available resources.
Managing Large Datasets for Atmospheric Research
NASA Technical Reports Server (NTRS)
Chen, Gao
2015-01-01
Since the mid-1980s, airborne and ground measurements have been widely used to provide comprehensive characterization of atmospheric composition and processes. Field campaigns have generated a wealth of in situ data and have grown considerably over the years in terms of both the number of measured parameters and the data volume. This can largely be attributed to the rapid advances in instrument development and computing power. The users of field data may face a number of challenges spanning data access, understanding, and proper use in scientific analysis. This tutorial is designed to provide an introduction to using data sets, with a focus on airborne measurements, for atmospheric research. The first part of the tutorial provides an overview of airborne measurements and data discovery. This will be followed by a discussion on the understanding of airborne data files. An actual data file will be used to illustrate how data are reported, including the use of data flags to indicate missing data and limits of detection. Retrieving information from the file header will be discussed, which is essential to properly interpreting the data. Field measurements are typically reported as a function of sampling time, but different instruments often have different sampling intervals. To create a combined data set, the data merge process (interpolation of all data to a common time base) will be discussed in terms of the algorithm, data merge products available from airborne studies, and their application in research. Statistical treatment of missing data and data flagged for limit of detection will also be covered in this section. These basic data processing techniques are applicable to both airborne and ground-based observational data sets. Finally, the recently developed Toolsets for Airborne Data (TAD) will be introduced.
TAD (tad.larc.nasa.gov) is an airborne data portal offering tools to create user defined merged data products with the capability to provide descriptive statistics and the option to treat measurement uncertainty.
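The merge process described in the tutorial (interpolating all instruments to a common time base, with flagged missing values excluded) can be sketched as follows. The species names, sampling rates, and the -9999 missing-data flag are illustrative assumptions, not the TAD implementation:

```python
import numpy as np

MISSING = -9999.0  # a common airborne-data flag for missing values

# Two hypothetical instruments sampling at different rates
t_fast = np.arange(0.0, 60.0, 1.0)       # 1 s ozone record
o3 = 30.0 + 0.1 * t_fast
o3[[5, 6]] = MISSING                     # a short data gap

t_slow = np.arange(0.0, 60.0, 10.0)      # 10 s CO record
co = 100.0 + 0.5 * t_slow

# Merge: interpolate everything onto a common 5 s time base,
# excluding flagged points so gaps are not filled from bad values.
t_merge = np.arange(0.0, 60.0, 5.0)
valid = o3 != MISSING
o3_merged = np.interp(t_merge, t_fast[valid], o3[valid])
co_merged = np.interp(t_merge, t_slow, co)

merged = np.column_stack([t_merge, o3_merged, co_merged])
print(merged.shape)  # (12, 3)
```

A production merge would also propagate the missing-data and limit-of-detection flags into the merged product (e.g., leaving a merge point flagged when the surrounding gap is too wide to interpolate across), rather than silently interpolating through every gap as this sketch does.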
Sample size determination for bibliographic retrieval studies
Yao, Xiaomei; Wilczynski, Nancy L; Walter, Stephen D; Haynes, R Brian
2008-01-01
Background: Research for developing search strategies to retrieve high-quality clinical journal articles from MEDLINE is expensive and time-consuming. The objective of this study was to determine the minimal number of high-quality articles in a journal subset that would need to be hand-searched to update or create new MEDLINE search strategies for treatment, diagnosis, and prognosis studies. Methods: The desired width of the 95% confidence intervals (W) for the lowest sensitivity among existing search strategies was used to calculate the number of high-quality articles needed to reliably update search strategies. New search strategies were derived in journal subsets formed by 2 approaches: random sampling of journals and top journals (having the most high-quality articles). The new strategies were tested in both the original large journal database and in a low-yielding journal (having few high-quality articles) subset. Results: For treatment studies, if W was 10% or less for the lowest sensitivity among our existing search strategies, a subset of 15 randomly selected journals or 2 top journals were adequate for updating search strategies, based on each approach having at least 99 high-quality articles. The new strategies derived in 15 randomly selected journals or 2 top journals performed well in the original large journal database. Nevertheless, the new search strategies developed using the random sampling approach performed better than those developed using the top journal approach in a low-yielding journal subset. For studies of diagnosis and prognosis, no journal subset had enough high-quality articles to achieve the expected W (10%). Conclusion: The approach of randomly sampling a small subset of journals that includes sufficient high-quality articles is an efficient way to update or create search strategies for high-quality articles on therapy in MEDLINE. The concentrations of diagnosis and prognosis articles are too low for this approach. PMID:18823538
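The calculation implied by the Methods (choosing the number of high-quality articles so that a 95% confidence interval for the lowest sensitivity has total width at most W) can be sketched with the normal approximation for a proportion; the sensitivity value used below is illustrative, not taken from the study:

```python
import math

def n_for_ci_width(sens: float, width: float, z: float = 1.96) -> int:
    """Smallest n so that a normal-approximation 95% CI for a
    proportion `sens` has total width <= `width` (half-width width/2)."""
    half = width / 2.0
    return math.ceil(z ** 2 * sens * (1.0 - sens) / half ** 2)

# e.g. lowest sensitivity 0.90, desired total CI width 10%
print(n_for_ci_width(0.90, 0.10))  # 139
```

The required n depends strongly on the assumed sensitivity: sensitivities closer to 1 shrink the variance term sens × (1 − sens) and therefore the number of high-quality articles needed, which is why the threshold reported in the abstract (at least 99 articles) corresponds to a particular lowest sensitivity among the existing strategies.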
Zarate, Oscar A; Brody, Julia Green; Brown, Phil; Ramirez-Andreotta, Mónica D; Perovich, Laura; Matz, Jacob
2016-01-01
An individual's health, genetic, or environmental-exposure data, placed in an online repository, creates a valuable shared resource that can accelerate biomedical research and even open opportunities for crowd-sourcing discoveries by members of the public. But these data become "immortalized" in ways that may create lasting risk as well as benefit. Once shared on the Internet, the data are difficult or impossible to redact, and identities may be revealed by a process called data linkage, in which online data sets are matched to each other. Reidentification (re-ID), the process of associating an individual's name with data that were considered deidentified, poses risks such as insurance or employment discrimination, social stigma, and breach of the promises often made in informed-consent documents. At the same time, re-ID poses risks to researchers and indeed to the future of science, should re-ID end up undermining the trust and participation of potential research participants. The ethical challenges of online data sharing are heightened as so-called big data becomes an increasingly important research tool and driver of new research structures. Big data is shifting research to include large numbers of researchers and institutions as well as large numbers of participants providing diverse types of data, so the participants' consent relationship is no longer with a person or even a research institution. In addition, consent is further transformed because big data analysis often begins with descriptive inquiry and generation of a hypothesis, and the research questions cannot be clearly defined at the outset and may be unforeseeable over the long term. In this article, we consider how expanded data sharing poses new challenges, illustrated by genomics and the transition to new models of consent. 
We draw on the experiences of participants in an open data platform-the Personal Genome Project-to allow study participants to contribute their voices to inform ethical consent practices and protocol reviews for big-data research. © 2015 The Hastings Center.
ERIC Educational Resources Information Center
Tsang, Chin Fu
1975-01-01
Discusses the possibility of creating elements with an atomic number of around 114. Describes the underlying physics responsible for the limited extent of the periodic table and enumerates problems that must be overcome in creating a superheavy nucleus. (GS)
NASA Astrophysics Data System (ADS)
Lubowich, Donald
2015-08-01
I describe how to create an astronomy program for thousands of people at outdoor concerts based on my $308,000 NASA-funded Music and Astronomy Under the Stars (MAUS) program (60 events, 2009 - 2013), and the Astronomy Festival on the National Mall (AFNM, 10,000 people/yr). MAUS reached 50,000 music lovers at local parks and at the Central Park Jazz, Newport Folk, Ravinia, or Tanglewood Music Festivals with classical, folk, pop/rock, opera, Caribbean, or country-western concerts assisted by astronomy clubs. Yo-Yo Ma, the Chicago and Boston Symphony Orchestras, Ravi Coltrane, Esperanza Spalding, Phish, Blood Sweat and Tears, Deep Purple, Tony Orlando, and Wilco performed at these events. AFNM was started in 2010 with co-sponsorship by the White House Office of Science and Technology Policy. MAUS and AFNM combine solar, optical, and radio telescope observations; large posters/banners; hands-on activities; imaging with a cell phone mount; citizen science activities; hand-outs; and a teacher info packet. Representatives from scientific institutions participated. Tycho Brahe, Johannes Kepler, and Caroline Herschel made guest appearances. MAUS reached underserved groups and attracted large crowds. Young kids participated in this family learning experience, often the first time they looked through a telescope. While < 50% of the participants took part in a science activity in the past year, they found MAUS enjoyable and understandable; learned about astronomy; wanted to learn more; and increased their interest in science (ave. rating 3.6/4).
MAUS is effective in promoting science education! Lessons learned: plan early; create partnerships with parks, concert organizers, and astronomy clubs; test equipment; have backup equipment; create professional displays; select the best location to reach the largest number of participants; use social media/www sites to promote the events; use many telescopes for multiple targets; project a live image or video; select equipment that is easy to use, store, set up, and take down; use hands-on astronomy activities; position the displays for maximum visibility (they are teachable moments); have educator hand-outs; show citizen science projects; promote astronomy clubs and science museums.
Zens, Martin; Grotejohann, Birgit; Tassoni, Adrian; Duttenhoefer, Fabian; Südkamp, Norbert P; Niemeyer, Philipp
2017-05-23
Observational studies have proven to be a valuable resource in medical research, especially when performed on a large scale. Recently, mobile device-based observational studies have been discovered by an increasing number of researchers as a promising new source of information. However, the development and deployment of app-based studies is not trivial and requires profound programming skills. The aim of this project was to develop a modular online research platform that allows researchers to create medical studies for mobile devices without extensive programming skills. The platform concept consists of three major components. A Web-based platform forms the researchers' main workplace. This platform communicates via a shared database with a platform-independent mobile app. Furthermore, a separate Web-based login platform for physicians and other health care professionals is outlined and completes the concept. A prototype of the research platform has been developed and is currently in beta testing. Simple questionnaire studies can be created within minutes and published for testing purposes. Screenshots of an example study are provided, and the general working principle is displayed. In this project, we have created a basis for a novel research platform. The necessity and implications of a modular approach were displayed and an outline for future development given. International researchers are invited and encouraged to participate in this ongoing project. ©Martin Zens, Birgit Grotejohann, Adrian Tassoni, Fabian Duttenhoefer, Norbert P Südkamp, Philipp Niemeyer. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 23.05.2017.
Torlinska, Barbara; Bath, Sarah C; Janjua, Aisha; Boelaert, Kristien; Chan, Shiao-Yng
2018-03-01
Severe iodine deficiency during pregnancy has been associated with pregnancy/neonatal loss, and adverse pregnancy outcomes; however, the impact of mild-to-moderate iodine insufficiency, though prevalent in pregnancy, is not well-documented. We assessed whether mild iodine deficiency during pregnancy was associated with pregnancy/infant loss, or with other adverse pregnancy outcomes. We used samples and data from the Avon Longitudinal Study of Parents and Children (ALSPAC), from 3140 singleton pregnancies and from a further 42 women with pregnancy/infant loss. The group was classified as mildly-to-moderately iodine deficient with a median urinary iodine concentration of 95.3 µg/L (IQR 57.0-153.0; median urinary iodine-to-creatinine ratio (UI/Creat) 124 µg/g, IQR 82-198). The likelihood of pregnancy/infant loss was not different across four UI/Creat groups (<50, 50-149, 150-250, >250 µg/g). The incidence of pre-eclampsia, non-proteinuric gestational hypertension, gestational diabetes, glycosuria, anaemia, post-partum haemorrhage, preterm delivery, mode of delivery, being small for gestational age, and large for gestational age did not differ significantly among UI/Creat groups, nor were there any significant differences in the median UI/Creat. We conclude that maternal iodine status was not associated with adverse pregnancy outcomes in a mildly-to-moderately iodine-deficient pregnant population. However, in view of the low number of women with pregnancy/infant loss in our study, further research is required.
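The four exposure bands used above translate directly into a simple classification. A minimal sketch in Python; treating the 150-250 band as inclusive of its upper bound is an assumption, since the abstract lists the bands only as <50, 50-149, 150-250, >250 µg/g:

```python
def uic_group(ui_creat):
    """Assign a urinary iodine-to-creatinine ratio (ug/g) to one of the
    four study groups: <50, 50-149, 150-250, >250. The handling of the
    exact boundary at 250 is an assumption, not stated in the abstract."""
    if ui_creat < 50:
        return "<50"
    elif ui_creat < 150:
        return "50-149"
    elif ui_creat <= 250:
        return "150-250"
    else:
        return ">250"

# The study's median UI/Creat of 124 ug/g falls in the second band:
groups = [uic_group(x) for x in (42.0, 124.0, 200.0, 310.0)]
```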
Pre- and Post-Processing Tools to Create and Characterize Particle-Based Composite Model Structures
2017-11-01
ARL-TR-8213 ● NOV 2017 ● US Army Research Laboratory. Pre- and Post-Processing Tools to Create and Characterize Particle-Based Composite Model Structures.
An Evaluation of Techniques for Clustering Search Results
2005-01-01
[Report-form extraction residue; the recoverable fragments are example search-result clusters such as "Star Wars" and "park, disney, ride, attraction, theme, film, change, disneyland", with a clustered result titled "Creating Fantasies: Disney's 'Imagineers' Build Space Attrac…".]
Digital disruption ?syndromes.
Sullivan, Clair; Staib, Andrew
2017-05-18
The digital transformation of hospitals in Australia is occurring rapidly in order to facilitate innovation and improve efficiency. Rapid transformation can cause temporary disruption of hospital workflows and staff as processes are adapted to the new digital workflows. The aim of this paper is to outline various types of digital disruption and some strategies for effective management. A large tertiary university hospital recently underwent a rapid, successful roll-out of an integrated electronic medical record (EMR). We observed this transformation and propose several digital disruption "syndromes" to assist with understanding and management during digital transformation: digital deceleration, digital transparency, digital hypervigilance, data discordance, digital churn and post-digital 'depression'. These 'syndromes' are defined and discussed in detail. Successful management of this temporary digital disruption is important to ensure a successful transition to a digital platform. What is known about this topic? Digital disruption is defined as the changes facilitated by digital technologies that occur at a pace and magnitude that disrupt established ways of value creation, social interactions, doing business and more generally our thinking. Increasing numbers of Australian hospitals are implementing digital solutions to replace traditional paper-based systems for patient care in order to create opportunities for improved care and efficiencies. Such large scale change has the potential to create transient disruption to workflows and staff. Managing this temporary disruption effectively is an important factor in the successful implementation of an EMR. What does this paper add? A large tertiary university hospital recently underwent a successful rapid roll-out of an integrated electronic medical record (EMR) to become Australia's largest digital hospital over a 3-week period. 
We observed and assisted with the management of several cultural, behavioural and operational forms of digital disruption, which led us to propose some digital disruption 'syndromes'. The definition and management of these 'syndromes' are discussed in detail. What are the implications for practitioners? Minimising the temporary effects of digital disruption in hospitals requires an understanding that these digital 'syndromes' are to be expected and actively managed during large-scale transformation.
CFD Script for Rapid TPS Damage Assessment
NASA Technical Reports Server (NTRS)
McCloud, Peter
2013-01-01
This grid generation script creates unstructured CFD grids for rapid thermal protection system (TPS) damage aeroheating assessments. The existing manual solution is cumbersome, error-prone, and slow. The invention takes a large-scale geometry grid and its large-scale CFD solution, and creates an unstructured patch grid that models the TPS damage. The flow-field boundary condition for the patch grid is then interpolated from the large-scale CFD solution. It speeds up the generation of CFD grids and solutions in the modeling of TPS damage and its aeroheating assessment. This process was successfully utilized during STS-134.
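The interpolation step, taking the patch-grid boundary condition from the large-scale CFD solution, can be sketched with a nearest-neighbour lookup. This is an illustrative stand-in on invented sample data; the abstract does not specify the script's actual interpolation scheme:

```python
def interpolate_bc(patch_boundary, donor_points, donor_values):
    """Nearest-neighbour sketch of the interpolation step: each node on
    the patch-grid boundary takes the value of the closest point in the
    large-scale CFD solution. (An illustrative stand-in; production
    tools typically use higher-order interpolation.)"""
    def nearest(p):
        return min(range(len(donor_points)),
                   key=lambda i: sum((a - b) ** 2
                                     for a, b in zip(p, donor_points[i])))
    return [donor_values[nearest(p)] for p in patch_boundary]

# Hypothetical coarse-solution sample points with, e.g., temperatures:
donors = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
values = [300.0, 450.0, 380.0]
bc = interpolate_bc([(0.1, 0.1), (0.9, 0.05)], donors, values)
```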
NASA Astrophysics Data System (ADS)
Agafonova, Oxana; Avramenko, Anna; Chaudhari, Ashvinkumar; Hellsten, Antti
2016-09-01
Large Eddy Simulations (LES) are carried out using OpenFOAM to investigate the canopy-created velocity inflection in the wake development of a large wind turbine array. Simulations are performed separately for two cases, with and without forest. Results of the simulations are then compared to clearly show the changes in the wake and turbulence structure due to the forest. Moreover, the actual mechanical shaft power produced by a single turbine in the array is calculated for both cases. Aerodynamic efficiency and power losses due to the forest are discussed as well.
Edwards, Roger A; Dee, Deborah; Umer, Amna; Perrine, Cria G; Shealy, Katherine R; Grummer-Strawn, Laurence M
2014-02-01
A substantial proportion of US maternity care facilities engage in practices that are not evidence-based and that interfere with breastfeeding. The CDC Survey of Maternity Practices in Infant Nutrition and Care (mPINC) showed significant variation in maternity practices among US states. The purpose of this article is to use benchmarking techniques to identify states within relevant peer groups that were top performers on mPINC survey indicators related to breastfeeding support. We used 11 indicators of breastfeeding-related maternity care from the 2011 mPINC survey and benchmarking techniques to organize and compare hospital-based maternity practices across the 50 states and Washington, DC. We created peer categories for benchmarking first by region (grouping states by West, Midwest, South, and Northeast) and then by size (grouping states by the number of maternity facilities and dividing each region into approximately equal halves based on the number of facilities). Thirty-four states had scores high enough to serve as benchmarks, and 32 states had scores low enough to reflect the lowest score gap from the benchmark on at least 1 indicator. No state served as the benchmark on more than 5 indicators and no state was furthest from the benchmark on more than 7 indicators. The small peer group benchmarks in the South, West, and Midwest were better than the large peer group benchmarks on 91%, 82%, and 36% of the indicators, respectively. In the West large, the Midwest large, the Midwest small, and the South large peer groups, 4-6 benchmarks showed that less than 50% of hospitals have ideal practice in all states. The evaluation presents benchmarks for peer group state comparisons that provide potential and feasible targets for improvement.
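The benchmarking logic, finding the top performer and the largest score gap per indicator within a peer group, can be sketched as below. The state scores are invented for illustration, not actual mPINC data:

```python
def find_benchmarks(scores, peer_group):
    """For each indicator, return the benchmark (top-scoring state) and
    the state with the largest gap from it, within one peer group.

    scores: {state: {indicator: score}}; peer_group: list of state codes.
    """
    indicators = next(iter(scores.values()))
    result = {}
    for ind in indicators:
        ranked = sorted(peer_group, key=lambda s: scores[s][ind])
        result[ind] = (ranked[-1], ranked[0])  # (benchmark, furthest)
    return result

# Invented scores for a hypothetical three-state peer group (0-100 scale):
scores = {
    "WA": {"rooming_in": 88, "skin_to_skin": 72},
    "OR": {"rooming_in": 91, "skin_to_skin": 65},
    "ID": {"rooming_in": 70, "skin_to_skin": 80},
}
result = find_benchmarks(scores, ["WA", "OR", "ID"])
```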
Carpenter, Danielle; Walker, Susan; Prescott, Natalie; Schalkwijk, Joost; Armour, John Al
2011-08-18
Copy number variation (CNV) contributes to the variation observed between individuals and can influence human disease progression, but the accurate measurement of individual copy numbers is technically challenging. In the work presented here we describe a modification to a previously described paralogue ratio test (PRT) method for genotyping the CCL3L1/CCL4L1 copy variable region, which we use to ascertain CCL3L1/CCL4L1 copy number in 1581 European samples. As the products of CCL3L1 and CCL4L1 potentially play a role in autoimmunity we performed case control association studies with Crohn's disease, rheumatoid arthritis and psoriasis clinical cohorts. We evaluate the PRT methodology used, paying particular attention to accuracy and precision, and highlight the problems of differential bias in copy number measurements. Our PRT methods for measuring copy number were of sufficient precision to detect very slight but systematic differential bias between results from case and control DNA samples in one study. We find no evidence for an association between CCL3L1 copy number and Crohn's disease, rheumatoid arthritis or psoriasis. Differential bias of this small magnitude, but applied systematically across large numbers of samples, would create a serious risk of false positive associations in copy number, if measured using methods of lower precision, or methods relying on single uncorroborated measurements. In this study the small differential bias detected by PRT in one sample set was resolved by a simple pre-treatment by restriction enzyme digestion.
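The core idea of a paralogue ratio test, scaling the signal ratio between a copy-variable test locus and a fixed-copy reference locus into an integer copy-number call, can be sketched schematically. This is an illustration of the principle only, not the authors' genotyping pipeline; `calibration` stands in for the correction derived from samples of known copy number:

```python
def call_copy_number(test_signal, ref_signal, ref_copies=2, calibration=1.0):
    """Schematic paralogue-ratio call: the test/reference signal ratio,
    corrected by a calibration factor from samples of known copy number,
    is proportional to the test-locus copy number (the reference locus
    is assumed present in ref_copies copies)."""
    ratio = (test_signal / ref_signal) * calibration
    return round(ratio * ref_copies)

three_copies = call_copy_number(1.52, 1.0)  # ratio ~1.5 on a diploid reference
```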
GOGrapher: A Python library for GO graph representation and analysis.
Muller, Brian; Richards, Adam J; Jin, Bo; Lu, Xinghua
2009-07-07
The Gene Ontology is the most commonly used controlled vocabulary for annotating proteins. The concepts in the ontology are organized as a directed acyclic graph, in which a node corresponds to a biological concept and a directed edge denotes the parent-child semantic relationship between a pair of terms. A large number of protein annotations further create links between proteins and their functional annotations, reflecting the contemporary knowledge about proteins and their functional relationships. This leads to a complex graph consisting of interleaved biological concepts and their associated proteins. What is needed is a simple, open source library that provides tools to not only create and view the Gene Ontology graph, but to analyze and manipulate it as well. Here we describe the development and use of GOGrapher, a Python library that can be used for the creation, analysis, manipulation, and visualization of Gene Ontology related graphs. An object-oriented approach was adopted to organize the hierarchy of the graph types and associated classes. An Application Programming Interface is provided through which different types of graphs can be programmatically created, manipulated, and visualized. GOGrapher has been successfully utilized in multiple research projects, e.g., a graph-based multi-label text classifier for protein annotation. The GOGrapher project provides a reusable programming library designed for the manipulation and analysis of Gene Ontology graphs. The library is freely available for the scientific community to use and improve.
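The data model described, a DAG of GO terms with proteins attached to the terms that annotate them, can be sketched without the library itself. This is not the GOGrapher API, just a minimal illustration of the structure it manages (the GO IDs and the protein accession are real examples, but the edges are simplified):

```python
class GOGraph:
    """Minimal DAG sketch of a GO-style ontology: nodes are term IDs,
    edges point from parent term to child term, and proteins attach
    to the terms that annotate them."""
    def __init__(self):
        self.children = {}     # term -> set of child terms
        self.annotations = {}  # term -> set of protein IDs

    def add_edge(self, parent, child):
        self.children.setdefault(parent, set()).add(child)
        self.children.setdefault(child, set())

    def annotate(self, term, protein):
        self.annotations.setdefault(term, set()).add(protein)

    def descendants(self, term):
        """All terms reachable from `term` via parent->child edges."""
        seen, stack = set(), [term]
        while stack:
            for c in self.children.get(stack.pop(), ()):
                if c not in seen:
                    seen.add(c)
                    stack.append(c)
        return seen

g = GOGraph()
g.add_edge("GO:0008150", "GO:0009987")  # biological_process -> cellular process
g.add_edge("GO:0009987", "GO:0007049")  # cellular process -> cell cycle
g.annotate("GO:0007049", "P06493")      # a protein annotated to cell cycle
```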
A National Virtual Specimen Database for Early Cancer Detection
NASA Technical Reports Server (NTRS)
Crichton, Daniel; Kincaid, Heather; Kelly, Sean; Thornquist, Mark; Johnsey, Donald; Winget, Marcy
2003-01-01
Access to biospecimens is essential for enabling cancer biomarker discovery. The National Cancer Institute's (NCI) Early Detection Research Network (EDRN) comprises and integrates a large number of laboratories into a network in order to establish a collaborative scientific environment to discover and validate disease markers. The diversity of both the institutions and the collaborative focus has created the need for establishing cross-disciplinary teams focused on integrating expertise in biomedical research, computational and biostatistics, and computer science. Given the collaborative design of the network, the EDRN needed an informatics infrastructure. The Fred Hutchinson Cancer Research Center, the National Cancer Institute, and NASA's Jet Propulsion Laboratory (JPL) teamed up to build an informatics infrastructure creating a collaborative, science-driven research environment despite the geographic and morphological differences of the information systems that existed within the diverse network. EDRN investigators identified the need to share biospecimen data captured across the country and managed in disparate databases. As a result, the informatics team initiated an effort to create a virtual tissue database whereby scientists could search and locate details about specimens located at collaborating laboratories. Each database, however, was locally implemented and integrated into collection processes and methods unique to each institution. This meant that efforts to integrate databases needed to be done in a manner that did not require redesign or re-implementation of existing systems.
Shearer, Barbara S.; Nagy, Suzanne P.
2003-01-01
The Florida State University (FSU) College of Medicine Medical Library is the first academic medical library to be established since the Web's dramatic appearance during the 1990s. A large customer base for electronic medical information resources is both comfortable with and eager to migrate to the electronic format completely, and vendors are designing radical pricing models that make print journal cancellations economically advantageous. In this (almost) post-print environment, the new FSU Medical Library is being created and will continue to evolve. By analyzing print journal subscription lists of eighteen academic medical libraries with similar missions to the community-based FSU College of Medicine and by entering these and selected quality indicators into a Microsoft Access database, a core list was created. This list serves as a selection guide, as a point for discussion with faculty and curriculum leaders when creating budgets, and for financial negotiations in a broader university environment. After journal titles specific to allied health sciences, veterinary medicine, dentistry, pharmacy, library science, and nursing were eliminated from the list, 4,225 unique journal titles emerged. Based on a ten-point scale including SERHOLD holdings and DOCLINE borrowing activity, a list of 449 core titles is identified. The core list has been saved in spreadsheet format for easy sorting by a number of parameters. PMID:12883565
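The core-list construction, counting how many peer libraries hold each title and keeping titles above a cutoff, can be sketched as below. The titles and the simple count-based score are illustrative stand-ins for the article's ten-point scale, which also incorporated SERHOLD holdings and DOCLINE borrowing activity:

```python
def core_titles(holdings, threshold):
    """Count how many peer libraries hold each journal title and keep
    the titles held by at least `threshold` libraries (a stand-in for
    the article's ten-point scoring scale)."""
    counts = {}
    for library in holdings:
        for title in set(library):  # each library counts a title once
            counts[title] = counts.get(title, 0) + 1
    return sorted(t for t, n in counts.items() if n >= threshold)

# Hypothetical subscription lists from three peer libraries:
holdings = [
    ["JAMA", "Lancet", "Acad Med"],
    ["JAMA", "Lancet"],
    ["JAMA", "Acad Med"],
]
core = core_titles(holdings, threshold=3)
```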
Jeddah Historical Building Information Modelling "JHBIM" - Object Library
NASA Astrophysics Data System (ADS)
Baik, A.; Alitany, A.; Boehm, J.; Robson, S.
2014-05-01
Building Information Modelling (BIM) has been applied at heritage sites worldwide for conserving, documenting, and managing historical buildings and for creating full engineering drawings and information. However, one of the most serious issues facing experts who wish to use Historical Building Information Modelling (HBIM) is modelling the complicated architectural elements of these historical buildings. Many of these outstanding elements were designed and created on site to fit their exact location. Experts face the same issue in applying BIM to the historical buildings of Old Jeddah. The Saudi Arabian city has a long history and contains a large number of historic houses and buildings built since the 16th century. Producing a BIM model of a historical building in Old Jeddah is always time-consuming, because the Hijazi architectural elements are unique and no library of such elements exists, so each element must be modelled from scratch. This paper focuses on building a Hijazi architectural element library from laser scanner and image survey data. This solution will reduce the time needed to complete an HBIM model and offer an in-depth, rich digital library of architectural elements for use in any heritage project in the Al-Balad district of Jeddah City.
Semantic Web repositories for genomics data using the eXframe platform
2014-01-01
Background With the advent of inexpensive assay technologies, there has been an unprecedented growth in genomics data as well as the number of databases in which it is stored. In these databases, sample annotation using ontologies and controlled vocabularies is becoming more common. However, the annotation is rarely available as Linked Data, in a machine-readable format, or for standardized queries using SPARQL. This makes large-scale reuse, or integration with other knowledge bases very difficult. Methods To address this challenge, we have developed the second generation of our eXframe platform, a reusable framework for creating online repositories of genomics experiments. This second generation model now publishes Semantic Web data. To accomplish this, we created an experiment model that covers provenance, citations, external links, assays, biomaterials used in the experiment, and the data collected during the process. The elements of our model are mapped to classes and properties from various established biomedical ontologies. Resource Description Framework (RDF) data is automatically produced using these mappings and indexed in an RDF store with a built-in Sparql Protocol and RDF Query Language (SPARQL) endpoint. Conclusions Using the open-source eXframe software, institutions and laboratories can create Semantic Web repositories of their experiments, integrate it with heterogeneous resources and make it interoperable with the vast Semantic Web of biomedical knowledge. PMID:25093072
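The field-to-property mapping and RDF serialization described above can be sketched with a minimal N-Triples emitter. The subject and property URIs below are hypothetical examples, not eXframe's actual vocabulary (only the Dublin Core and OBI URIs are real public identifiers):

```python
def to_ntriples(subject, field_map):
    """Serialize one experiment record as N-Triples lines. Values that
    look like URIs become resource objects; everything else becomes a
    plain string literal. A sketch of the mapping idea, not eXframe's
    actual implementation."""
    lines = []
    for predicate, value in field_map:
        obj = f"<{value}>" if value.startswith("http") else f'"{value}"'
        lines.append(f"<{subject}> <{predicate}> {obj} .")
    return "\n".join(lines)

triples = to_ntriples(
    "http://example.org/experiment/42",
    [("http://purl.org/dc/terms/title", "Stem cell assay"),
     ("http://example.org/vocab/assayType",
      "http://purl.obolibrary.org/obo/OBI_0000070")],
)
```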
Schopflocher, Donald; VanSpronsen, Eric; Spence, John C; Vallianatos, Helen; Raine, Kim D; Plotnikoff, Ronald C; Nykiforuk, Candace I J
2012-07-26
Detailed assessments of the built environment often resist data reduction and summarization. This project sought to develop a method of reducing built environment data so that they can be effectively communicated to researchers and community stakeholders. We aim to show how these data can be used to create neighbourhood groupings based on built environment characteristics, and how the process of discussing these neighbourhoods with community stakeholders can result in the development of community-informed health promotion interventions. We used the Irvine Minnesota Inventory (IMI) to assess 296 segments of a semi-rural community in Alberta. Expert raters "created" neighbourhoods by examining the data. Then, a consensus grouping was developed using cluster analysis, and the number of IMI variables needed to characterize the neighbourhoods was reduced by multiple discriminant function analysis. The 296 segments were reduced to a consensus set of 10 neighbourhoods, which could be separated from each other by 9 functions constructed from 24 IMI variables. Biplots of these functions were an effective means of summarizing and presenting the results of the community assessment, and stimulated community action. It is possible to use principled quantitative methods to reduce large amounts of information about the built environment into meaningful summaries. These summaries, or built environment neighbourhoods, were useful in catalyzing action with community stakeholders and led to the development of health-promoting built environment interventions.
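The consensus-grouping step, clustering audited segments into candidate neighbourhoods, can be sketched with a plain k-means on invented two-feature segments. The authors used cluster analysis followed by multiple discriminant function analysis, so this covers only the first stage:

```python
def kmeans(points, k, iters=20):
    """Plain k-means sketch: partition street segments (feature vectors
    from a built-environment audit) into k candidate 'neighbourhoods'."""
    # Deterministic seeding: spread initial centers across the input order.
    centers = [list(points[i * len(points) // k]) for i in range(k)]
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        for c, members in enumerate(clusters):
            if members:  # recompute each center as the mean of its members
                centers[c] = [sum(vals) / len(members)
                              for vals in zip(*members)]
    return clusters

# Four audited segments with two invented walkability features each:
segments = [(0.1, 0.2), (0.2, 0.1), (5.0, 5.1), (5.2, 4.9)]
clusters = kmeans(segments, 2)
```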
From Lobster Shells to Plastic Objects: A Bioplastics Activity
ERIC Educational Resources Information Center
Hudson, Reuben; Glaisher, Samuel; Bishop, Alexandra; Katz, Jeffrey L.
2015-01-01
A multiple day activity for students to create large-scale plastic objects from the biopolymer chitin (major component of lobster, crab, and shrimp shells) is described. The plastic objects created are durable and made from benign materials, making them suitable for students to take home to play with. Since the student-created plastic objects are…
Quantum graviton creation in a model universe
NASA Technical Reports Server (NTRS)
Berger, B. K.
1974-01-01
Consideration of the mechanism of production of gravitons in the empty, anisotropic, spatially inhomogeneous Gowdy three-torus cosmology. The Gowdy cosmology is an exact solution of the vacuum Einstein equations and is obtained as a generalization of the homogeneous empty Bianchi Type I (Kasner) cosmology by permitting the metric components to depend on one of the space variables in addition to time. The Hamiltonian methods of Arnowitt, Deser, and Misner are employed to identify the dynamical variables which are to be quantized. The WKB regime solution is identical to that found by Doroshkevich, Zel'dovich, and Novikov (DZN) for a universe containing collisionless anisotropic radiation. Using a procedure similar to that of Parker (1971) or Zel'dovich and Starobinskii (1971) for defining quantum number, it is found that the DZN large-time radiation consists of quanta (gravitons) created from an initial vacuum. The quantum behavior is much like the semiclassical enhancement of quantum number with the added feature of creation of quanta from vacuum fluctuations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mamaluy, Denis; Gao, Xujiao; Tierney, Brian David
We created a highly efficient, universal 3D quantum transport simulator. We demonstrated that the simulator scales linearly, both with the problem size (N) and with the number of CPUs, which presents an important breakthrough in the field of computational nanoelectronics. It allowed us, for the first time, to accurately simulate and optimize a large number of realistic nanodevices in a much shorter time when compared to other methods/codes such as RGF[~N^2.333]/KNIT, KWANT, and QTBM[~N^3]/NEMO5. In order to determine the best-in-class for different beyond-CMOS paradigms, we performed rigorous device optimization for high-performance logic devices at 6-, 5-, and 4-nm gate lengths. We have discovered that there exists a fundamental down-scaling limit for CMOS technology and other field-effect transistors (FETs). We have found that, at room temperature, all FETs, irrespective of their channel material, will start experiencing an unacceptable level of thermally induced errors around 5-nm gate lengths.
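The practical meaning of the quoted scaling exponents is easy to make concrete: under an O(N^e) runtime, doubling the problem size multiplies the work by 2^e. A quick comparison (the problem sizes are arbitrary; only the ratio matters):

```python
def relative_cost(n_large, n_small, exponent):
    """Relative work for growing a problem from n_small to n_large
    under an O(N**exponent) runtime scaling."""
    return (n_large / n_small) ** exponent

# Doubling the device size under the scalings quoted above:
linear = relative_cost(2_000_000, 1_000_000, 1.0)    # this simulator: 2x
rgf    = relative_cost(2_000_000, 1_000_000, 2.333)  # RGF, ~N^2.333: ~5x
qtbm   = relative_cost(2_000_000, 1_000_000, 3.0)    # QTBM, ~N^3: 8x
```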
On the generation and evolution of internal gravity waves
NASA Technical Reports Server (NTRS)
Lansing, F. S.; Maxworthy, T.
1984-01-01
The tidal generation and evolution of internal gravity waves is investigated experimentally and theoretically using a two-dimensional two-layer model. Time-dependent flow is created by moving a profile of maximum submerged depth 7.7 cm through a total stroke of 29 cm in water above a freon-kerosene mixture in an 8.6-m-long 30-cm-deep 20-cm-wide transparent channel, and the deformation of the fluid interface is recorded photographically. A theoretical model of the interface as a set of discrete vortices is constructed numerically; the rigid structures are represented by a source distribution; governing equations in Lagrangian form are obtained; and two integrodifferential equations relating baroclinic vorticity generation and source-density generation are derived. The experimental and computed results are shown in photographs and graphs, respectively, and found to be in good agreement at small Froude numbers. The reasons for small discrepancies in the position of the maximum interface displacement at large Froude numbers are examined.
Genovo: De Novo Assembly for Metagenomes
NASA Astrophysics Data System (ADS)
Laserson, Jonathan; Jojic, Vladimir; Koller, Daphne
Next-generation sequencing technologies produce a large number of noisy reads from the DNA in a sample. Metagenomics and population sequencing aim to recover the genomic sequences of the species in the sample, which could be of high diversity. Methods geared towards single sequence reconstruction are not sensitive enough when applied in this setting. We introduce a generative probabilistic model of read generation from environmental samples and present Genovo, a novel de novo sequence assembler that discovers likely sequence reconstructions under the model. A Chinese restaurant process prior accounts for the unknown number of genomes in the sample. Inference is made by applying a series of hill-climbing steps iteratively until convergence. We compare the performance of Genovo to three other short read assembly programs across one synthetic dataset and eight metagenomic datasets created using the 454 platform, the largest of which has 311k reads. Genovo's reconstructions cover more bases and recover more genes than the other methods, and yield a higher assembly score.
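The Chinese restaurant process prior mentioned above can be sketched directly: each read joins an existing genome bin with probability proportional to the bin's current size, or opens a new bin with probability proportional to a concentration parameter, so the number of genomes is inferred rather than fixed. This sketches the prior only, not Genovo's full read-generation model or hill-climbing inference:

```python
import random

def crp_assign(n_reads, alpha, seed=0):
    """Chinese restaurant process sketch: reads join an existing genome
    bin with probability proportional to that bin's size, or open a new
    bin with probability proportional to alpha."""
    rng = random.Random(seed)
    bins = []  # bins[i] = number of reads currently assigned to genome i
    for _ in range(n_reads):
        weights = bins + [alpha]          # existing bins, plus "new bin"
        r = rng.random() * sum(weights)
        choice = 0
        while r >= weights[choice]:
            r -= weights[choice]
            choice += 1
        if choice == len(bins):
            bins.append(1)                # read opens a new genome bin
        else:
            bins[choice] += 1
    return bins

bins = crp_assign(1000, alpha=1.0)
```

With alpha = 1, the expected number of bins after 1000 reads grows only logarithmically (roughly 7 or 8), which is what lets the model keep the genome count small without fixing it in advance.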
Quantity, Revisited: An Object-Oriented Reusable Class
NASA Technical Reports Server (NTRS)
Funston, Monica Gayle; Gerstle, Walter; Panthaki, Malcolm
1998-01-01
"Quantity", a prototype implementation of an object-oriented class, was developed for two reasons: to help engineers and scientists manipulate the many types of quantities encountered during routine analysis, and to create a reusable software component to for large domain-specific applications. From being used as a stand-alone application to being incorporated into an existing computational mechanics toolkit, "Quantity" appears to be a useful and powerful object. "Quantity" has been designed to maintain the full engineering meaning of values with respect to units and coordinate systems. A value is a scalar, vector, tensor, or matrix, each of which is composed of Value Components, each of which may be an integer, floating point number, fuzzy number, etc., and its associated physical unit. Operations such as coordinate transformation and arithmetic operations are handled by member functions of "Quantity". The prototype has successfully tested such characteristics as maintaining a numeric value, an associated unit, and an annotation. In this paper we further explore the design of "Quantity", with particular attention to coordinate systems.
Inkjet formation of unilamellar lipid vesicles for cell-like encapsulation†
Stachowiak, Jeanne C.; Richmond, David L.; Li, Thomas H.; Brochard-Wyart, Françoise
2010-01-01
Encapsulation of macromolecules within lipid vesicles has the potential to drive biological discovery and enable development of novel, cell-like therapeutics and sensors. However, rapid and reliable production of large numbers of unilamellar vesicles loaded with unrestricted and precisely-controlled contents requires new technologies that overcome size, uniformity, and throughput limitations of existing approaches. Here we present a high-throughput microfluidic method for vesicle formation and encapsulation using an inkjet printer at rates up to 200 Hz. We show how multiple high-frequency pulses of the inkjet’s piezoelectric actuator create a microfluidic jet that deforms a bilayer lipid membrane, controlling formation of individual vesicles. Variations in pulse number, pulse voltage, and solution viscosity are used to control the vesicle size. As a first step toward cell-like reconstitution using this method, we encapsulate the cytoskeletal protein actin and use co-encapsulated microspheres to track its polymerization into a densely entangled cytoskeletal network upon vesicle formation. PMID:19568667
Otte, Jean-Bernard; Meyers, Rebecka
2010-11-01
PLUTO is a registry developed by an international collaboration of the Liver Tumors Strategy Group (SIOPEL) of SIOP. Although the number of patients collected in PLUTO to date is too small to add any analytic power to the existing literature, this new registry has great promise. It has been created to clarify issues regarding the role of liver transplantation in the treatment of children with unresectable liver tumors. By reviewing the results to date, we hope we can motivate more centers to participate, enroll patients, complete data entry, and boost the potential impact of the collaborative effort. To achieve this goal, a large number of patients are needed, which requires an intensified international collaboration. Pediatric oncologists, pediatric surgical oncologists, and pediatric liver transplant surgeons are all encouraged to participate and contribute. This is a preliminary glimpse of what we hope to be a series of interim reports over the next decade from the steering committee to help guide therapy in this very challenging group of children. © 2010 John Wiley & Sons A/S.
Toward quantitative fluorescence microscopy with DNA origami nanorulers.
Beater, Susanne; Raab, Mario; Tinnefeld, Philip
2014-01-01
The dynamic development of fluorescence microscopy has created a large number of new techniques, many of which are able to overcome the diffraction limit. This chapter describes the use of DNA origami nanostructures as scaffolds for quantifying microscope properties such as sensitivity and resolution. The DNA origami technique enables the placement of a defined number of fluorescent dyes in programmed geometries. We present a variety of DNA origami nanorulers that include nanorulers with defined labeling density and defined distances between marks. The chapter summarizes the advantages, such as a practically free choice of dyes and labeling density, and presents examples of nanorulers in use. New triangular DNA origami nanorulers that do not require photoinduced switching by imaging transient binding to DNA nanostructures are also reported. Finally, we simulate fluorescence images of DNA origami nanorulers and reveal that the optimal DNA nanoruler for a specific application has an intermark distance that is roughly 1.3-fold the expected optical resolution. © 2014 Elsevier Inc. All rights reserved.
Elias, Andrew; Crayton, Samuel H.; Warden-Rothman, Robert; Tsourkas, Andrew
2014-01-01
Given the rapidly expanding library of disease biomarkers and targeting agents, the number of unique targeted nanoparticles is growing exponentially. The high variability and expense of animal testing often makes it unfeasible to examine this large number of nanoparticles in vivo. This often leads to the investigation of a single formulation that performed best in vitro. However, nanoparticle performance in vivo depends on many variables, many of which cannot be adequately assessed with cell-based assays. To address this issue, we developed a lanthanide-doped nanoparticle method that allows quantitative comparison of multiple targeted nanoparticles simultaneously. Specifically, superparamagnetic iron oxide (SPIO) nanoparticles with different targeting ligands were created, each with a unique lanthanide dopant. Following the simultaneous injection of the various SPIO compositions into tumor-bearing mice, inductively coupled plasma mass spectrometry was used to quantitatively and orthogonally assess the concentration of each SPIO composition in serial blood and resected tumor samples. PMID:25068300
Tug-of-war of microtubule filaments at the boundary of a kinesin- and dynein-patterned surface
NASA Astrophysics Data System (ADS)
Ikuta, Junya; Kamisetty, Nagendra K.; Shintaku, Hirofumi; Kotera, Hidetoshi; Kon, Takahide; Yokokawa, Ryuji
2014-06-01
Intracellular cargo is transported by multiple motor proteins. Because of the force balance of motors with mixed polarities, cargo moves bidirectionally to achieve biological functions. Here, we propose a microtubule gliding assay for a tug-of-war study of kinesin and dynein. A boundary of the two motor groups is created by photolithographically patterning gold to selectively attach kinesin to the glass and dynein to the gold surface using a self-assembled monolayer. The relationship between the ratio of two antagonistic motor numbers and the velocity is derived from a force-velocity relationship for each motor to calculate the detachment force and motor backward velocity. Although the tug-of-war involves >100 motors, values are calculated for a single molecule and reflect the collective dynein and non-collective kinesin functions when they work as a team. This assay would be useful for detailed in vitro analysis of intracellular motility, e.g., mitosis, where a large number of motors with mixed polarities are involved.
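The force-balance reasoning above can be illustrated with a toy model: motors of one polarity drag the filament according to a linear force-velocity curve, while the opposing motors resist with a fixed detachment force. This is not the paper's actual model or parameter set; the functional form and every numerical value below are illustrative assumptions.

```python
# Toy tug-of-war force balance (assumed linear force-velocity relationship).
# All parameter values are illustrative placeholders, not measured values.
def gliding_velocity(n_kinesin, n_dynein,
                     v_max=0.8,      # unloaded gliding speed, um/s (assumed)
                     f_stall=7.0,    # kinesin stall force, pN (assumed)
                     f_detach=1.0):  # dynein detachment force, pN (assumed)
    """Velocity at which the driving team's total force,
    n_kinesin * f_stall * (1 - v / v_max), balances the opposing load
    n_dynein * f_detach; clipped to zero when the opposing team wins."""
    v = v_max * (1.0 - n_dynein * f_detach / (n_kinesin * f_stall))
    return max(v, 0.0)

# Velocity falls as the ratio of antagonistic motors grows, then stalls.
for ratio in (0.0, 0.5, 1.0, 7.0):
    print(ratio, round(gliding_velocity(100, int(100 * ratio)), 3))
```

Inverting a measured velocity-versus-ratio curve against such a relationship is what allows single-molecule quantities (detachment force, backward velocity) to be extracted from an assay involving >100 motors.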
Campbell, Rebecca; Pierce, Steven J; Sharma, Dhruv B; Shaw, Jessica; Feeney, Hannah; Nye, Jeffrey; Schelling, Kristin; Fehler-Cabral, Giannina
2017-01-01
A growing number of U.S. cities have large numbers of untested sexual assault kits (SAKs) in police property facilities. Testing older kits and maintaining current case work will be challenging for forensic laboratories, creating a need for more efficient testing methods. We evaluated selective degradation methods for DNA extraction using actual case work from a sample of previously unsubmitted SAKs in Detroit, Michigan. We randomly assigned 350 kits to either standard or selective degradation testing methods and then compared DNA testing rates and CODIS entry rates between the two groups. Continuation-ratio modeling showed no significant differences, indicating that the selective degradation method had no decrement in performance relative to customary methods. Follow-up equivalence tests indicated that CODIS entry rates for the two methods could differ by more than ±5%. Selective degradation methods required less personnel time for testing and scientific review than standard testing. © 2016 American Academy of Forensic Sciences.
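An equivalence test of the kind mentioned above can be sketched with the two-one-sided-tests (TOST) procedure for two proportions. The ±5% margin comes from the abstract, but the counts below are hypothetical stand-ins, not the study's data, and the study's continuation-ratio modeling is not reproduced.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def tost_two_proportions(x1, n1, x2, n2, margin):
    """Two one-sided tests (TOST) for equivalence of two proportions,
    normal approximation. Returns the larger one-sided p-value;
    equivalence within +/-margin is supported when it falls below alpha."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    z_lower = (diff + margin) / se   # tests H0: diff <= -margin
    z_upper = (diff - margin) / se   # tests H0: diff >= +margin
    p_lower = 1.0 - norm_cdf(z_lower)
    p_upper = norm_cdf(z_upper)
    return max(p_lower, p_upper)

# Hypothetical CODIS entry counts for two testing arms of 175 kits each.
p = tost_two_proportions(120, 175, 118, 175, margin=0.05)
print(p)
```

Note the asymmetry this example surfaces: a non-significant difference test does not by itself establish equivalence; with modest sample sizes the TOST p-value can remain well above 0.05 even when the observed rates are nearly identical.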
System steganalysis with automatic fingerprint extraction
Sloan, Tom; Hernandez-Castro, Julio; Isasi, Pedro
2018-01-01
This paper tackles the modern challenge of practical steganalysis over large data by presenting a novel approach whose aim is to perform with perfect accuracy and in a completely automatic manner. The objective is to detect changes introduced by the steganographic process in data objects, including signatures related to the tools being used. Our approach achieves this by first extracting reliable regularities through the analysis of pairs of modified and unmodified data objects, and then combining these findings into general patterns present in the training data. Finally, we construct a Naive Bayes model that performs the classification, operating on attributes extracted using the aforementioned patterns. This technique has been applied to different steganographic tools that operate on media files of several types. We are able to replicate or improve on a number of previously published results and, more importantly, present new steganalytic findings for a number of popular tools that had no previously known attacks. PMID:29694366
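The classification stage described above can be sketched as a Bernoulli Naive Bayes over binary attributes recording whether each automatically extracted fingerprint pattern is present in a file. The fingerprint-extraction stage itself is not shown, and the feature vectors and labels below are made-up stand-ins, not the paper's data.

```python
import math

# Hand-rolled Bernoulli Naive Bayes over binary "pattern present" attributes.
class BernoulliNB:
    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.prior = {}
        self.theta = {}   # per-class probability that each attribute is 1
        n_feat = len(X[0])
        for c in self.classes:
            rows = [x for x, label in zip(X, y) if label == c]
            self.prior[c] = len(rows) / len(X)
            # Laplace smoothing avoids zero probabilities.
            self.theta[c] = [
                (sum(r[j] for r in rows) + 1) / (len(rows) + 2)
                for j in range(n_feat)
            ]
        return self

    def predict(self, x):
        def log_post(c):
            lp = math.log(self.prior[c])
            for j, v in enumerate(x):
                t = self.theta[c][j]
                lp += math.log(t if v else 1.0 - t)
            return lp
        return max(self.classes, key=log_post)

# 1 = a stego-tool fingerprint pattern was found in the file, 0 = not found.
X = [[1, 1, 0], [1, 0, 1], [0, 0, 0], [0, 1, 0]]
y = ["stego", "stego", "clean", "clean"]
clf = BernoulliNB().fit(X, y)
print(clf.predict([1, 0, 1]))  # stego
```

Naive Bayes suits this setting because each fingerprint pattern contributes an independent log-likelihood term, so adding newly discovered tool signatures does not require retraining the whole model structure.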
Mazanov, Jason
2016-04-01
Debate about the ethics of drug control in sport has largely focused on arguing the relative merits of the existing antidoping policy or the adoption of a health-based harm-minimisation approach. A number of ethical challenges arising from antidoping have been identified, and a number of as-yet-unanswered questions remain for the maturing ethics of applying harm-minimisation principles to drug control for sport. This paper introduces a 'third approach' to the debate, examining some implications of applying a stakeholder theory of corporate social responsibility (CSR) to the issue of doping in sport. The introduction of the stakeholder-CSR model creates an opportunity to challenge the two dominant schools by enabling a different perspective to contribute to the development of an ethically robust drug control for sport. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
MINC 2.0: A Flexible Format for Multi-Modal Images.
Vincent, Robert D; Neelin, Peter; Khalili-Mahani, Najmeh; Janke, Andrew L; Fonov, Vladimir S; Robbins, Steven M; Baghdadi, Leila; Lerch, Jason; Sled, John G; Adalat, Reza; MacDonald, David; Zijdenbos, Alex P; Collins, D Louis; Evans, Alan C
2016-01-01
It is often useful that an imaging data format can afford rich metadata, be flexible, scale to very large file sizes, support multi-modal data, and have strong inbuilt mechanisms for data provenance. Beginning in 1992, MINC was developed as a system for flexible, self-documenting representation of neuroscientific imaging data with arbitrary orientation and dimensionality. The MINC system incorporates three broad components: a file format specification, a programming library, and a growing set of tools. In the early 2000s, the MINC developers created MINC 2.0, which added support for 64-bit file sizes, internal compression, and a number of other modern features. Because of its extensible design, it has been easy to incorporate details of provenance in the header metadata, including an explicit processing history, unique identifiers, and vendor-specific scanner settings. This makes MINC ideal for use in large scale imaging studies and databases. It also makes it easy to adapt to new scanning sequences and modalities.
Antrobus, T.J.; Guilfoyle, M.P.; Barrow, W.C.; Hamel, P.B.; Wakeley, J.S.
2000-01-01
Neotropical migrants are birds that breed in North America and winter primarily in Central and South America. Long-term population studies of birds in the Eastern United States indicated declines of some forest-dwelling birds, many of which winter in the Neotropics (Peterjohn and others 1995). These declines were attributed to loss of wintering and breeding habitat due to deforestation and fragmentation, respectively. Many species of Nearctic migrants--birds that breed in the northern regions of North America and winter in the Southern United States--are also experiencing population declines. Because large areas of undisturbed, older, bottomland hardwood forests often contain large numbers of habitat specialists, including forest-interior Neotropical migrants and wintering Nearctic migrants, these forests may be critical in maintaining avian diversity. This study had two primary objectives: (1) to create a baseline data set that can be used as a standard against which other bottomland hardwood forests can be compared, and (2) to establish long-term monitoring stations during both breeding and wintering seasons to discern population trends of avian species using bottomland hardwood forests.
Conceptual design proposal: HUGO global range/mobility transport aircraft
NASA Technical Reports Server (NTRS)
Johnston, Tom; Perretta, Dave; Mcbane, Doug; Morin, Greg; Thomas, Greg; Woodward, Joe; Gulakowski, Steve
1993-01-01
With the collapse of the former Soviet Union and the emergence of the United Nations actively pursuing a peacekeeping role in world affairs, the United States has been forced into a position as the world's leading peace enforcer. It is still a very dangerous world with seemingly never-ending ideological, territorial, and economic disputes requiring the U.S. to maintain a credible deterrent posture in this uncertain environment. This has created an urgent need to rapidly transport large numbers of troops and equipment from the continental United States (CONUS) to any potential world trouble spot by means of a global range/mobility transport aircraft. The most recent examples are Operation Desert Shield/Storm and Operation Restore Hope. To meet this challenge head-on, a request for proposal (RFP) was developed and incorporated into the 1992/1993 AIAA/McDonnell Douglas Corporation Graduate Team Aircraft Design Competition. The RFP calls for the conceptual design and justification of a large aircraft capable of power projecting a significant military force without surface transportation reliance.
NASA Astrophysics Data System (ADS)
Thornton, Ronald
2010-10-01
Physics education research has shown that learning environments that engage students and allow them to take an active part in their learning can lead to large conceptual gains compared to traditional instruction. Examples of successful curricula and methods include Peer Instruction, Just in Time Teaching, RealTime Physics, Workshop Physics, Scale-Up, and Interactive Lecture Demonstrations (ILDs). An active learning environment is often difficult to achieve in lecture sessions. This presentation will demonstrate the use of sequences of Interactive Lecture Demonstrations (ILDs) that use real experiments often involving real-time data collection and display combined with student interaction to create an active learning environment in large or small lecture classes. Interactive lecture demonstrations will be done in the area of mechanics using real-time motion probes and the Visualizer. A video tape of students involved in interactive lecture demonstrations will be shown. The results of a number of research studies at various institutions (including international) to measure the effectiveness of ILDs and guided inquiry conceptual laboratories will be presented.
Testing the robustness of Citizen Science projects: Evaluating the results of pilot project COMBER
Faulwetter, Sarah; Dailianis, Thanos; Smith, Vincent Stuart; Koulouri, Panagiota; Dounas, Costas; Arvanitidis, Christos
2016-01-01
Background: Citizen Science (CS) as a term covers a great variety of approaches and scopes involving many different fields of science. The number of relevant projects globally has increased significantly in recent years. Large-scale ecological questions can be answered only through extended observation networks, and CS projects can support this effort. Although the need for such projects is apparent, an important part of the scientific community casts doubt on the reliability of CS data sets. New information: The pilot CS project COMBER was created in order to provide evidence to answer the aforementioned question in coastal marine biodiversity monitoring. The results of the current analysis show that a carefully designed CS project with clear hypotheses, wide participation, and data-set validation can be a valuable tool for detecting large-scale, long-term changes in marine biodiversity patterns, and therefore for relevant management and conservation issues. PMID:28174507
Parameter optimization of differential evolution algorithm for automatic playlist generation problem
NASA Astrophysics Data System (ADS)
Alamag, Kaye Melina Natividad B.; Addawe, Joel M.
2017-11-01
With the digitalization of music, the number of music collections has grown substantially, and there is a need to create lists of music that filter a collection according to user preferences, giving rise to the Automatic Playlist Generation Problem (APGP). Previous attempts to solve this problem include the use of search and optimization algorithms. If a music database is very large, the algorithm used must be able to search the lists thoroughly while taking into account the quality of the playlist given a set of user constraints. In this paper we apply an evolutionary metaheuristic optimization algorithm, Differential Evolution (DE), using different combinations of parameter values and select the best-performing set when used to solve four standard test functions. The performance of the proposed algorithm is then compared with a standard Genetic Algorithm (GA) and a hybrid GA with Tabu Search. Numerical simulations are carried out to show the better results obtained with the Differential Evolution approach using the optimized parameter values.
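The DE loop being tuned can be sketched as the classic DE/rand/1/bin variant, with the control parameters F (differential weight) and CR (crossover rate) that such parameter studies vary. This is a generic sketch on the sphere test function; the paper's four test functions, parameter grid, and playlist-specific fitness are not reproduced.

```python
import random

def sphere(x):
    """A standard test function: minimum 0 at the origin."""
    return sum(v * v for v in x)

def differential_evolution(f, dim, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    # Bounds are used only for initialization in this sketch.
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            # DE/rand/1: three distinct vectors other than the target.
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = rng.randrange(dim)   # guarantees one mutated gene
            trial = [
                a[j] + F * (b[j] - c[j])
                if (rng.random() < CR or j == j_rand) else pop[i][j]
                for j in range(dim)
            ]
            if f(trial) <= f(pop[i]):     # greedy one-to-one selection
                pop[i] = trial
    return min(pop, key=f)

best = differential_evolution(sphere, dim=5, bounds=(-5.0, 5.0))
print(sphere(best))  # should be close to the optimum 0
```

The sensitivity of this loop to F and CR is exactly why a parameter-optimization pass over standard test functions is worthwhile before attacking the playlist fitness itself.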
Enhancement of gaps in thin graphitic films for heterostructure formation
NASA Astrophysics Data System (ADS)
Hague, J. P.
2014-04-01
There are a large number of atomically thin graphitic films with a structure similar to that of graphene. These films have a spread of band gaps relating to their ionicity and, also, to the substrate on which they are grown. Such films could have a range of applications in digital electronics, where graphene is difficult to use. I use the dynamical cluster approximation to show how electron-phonon coupling between film and substrate can enhance these gaps in a way that depends on the range and strength of the coupling. It is found that one of the driving factors in this effect is a charge density wave instability for electrons on a honeycomb lattice that can open a gap in monolayer graphene. The enhancement at intermediate coupling is sufficiently large that spatially varying substrates and superstrates could be used to create heterostructures in thin graphitic films with position-dependent electron-phonon coupling and gaps, leading to advanced electronic components.
Non-radial pulsations and large-scale structure in stellar winds
NASA Astrophysics Data System (ADS)
Blomme, R.
2009-07-01
Almost all early-type stars show Discrete Absorption Components (DACs) in their ultraviolet spectral lines. These can be attributed to Co-rotating Interaction Regions (CIRs): large-scale spiral-shaped structures that sweep through the stellar wind. We used the Zeus hydrodynamical code to model the CIRs. In the model, the CIRs are caused by "spots" on the stellar surface. Through the radiative acceleration these spots create fast streams in the stellar wind material. Where the fast and slow streams collide, a CIR is formed. By varying the parameters of the spots, we quantitatively fit the observed DACs in HD 64760. An important result from our work is that the spots do not rotate with the same velocity as the stellar surface. The fact that the cause of the CIRs is not fixed on the surface eliminates many potential explanations. The only remaining explanation is that the CIRs are due to the interference pattern of a number of non-radial pulsations.
NASA Technical Reports Server (NTRS)
Shaffer, Joe R.; Headley, David E.
1993-01-01
Compact storable components expand to create large shelter. Fully deployed structure provides large, unobstructed bay. Deployed trusses support wall and roof blankets. Provides temporary cover for vehicles, people, and materials. Terrestrial version used as garage, hangar, or large tent.
Hot working behavior of selective laser melted and laser metal deposited Inconel 718
NASA Astrophysics Data System (ADS)
Bambach, Markus; Sizova, Irina
2018-05-01
The production of Nickel-based high-temperature components is of great importance for the transport and energy sector. Forging of high-temperature alloys often requires expensive dies, multiple forming steps and leads to forged parts with tolerances that require machining to create the final shape and a large amount of scrap. Additive manufacturing offers the possibility to print the desired shapes directly as net-shape components, requiring only little additional effort in machining. Especially for high-temperature alloys carrying a large amount of energy per unit mass, additive manufacturing could be more energy-efficient than forging if the energy contained in the machining scrap exceeds the energy needed for powder production and laser processing. However, the microstructure and performance of 3D-printed parts will not reach the level of forged material unless further expensive processes such as hot-isostatic pressing are used. Using the design freedom and possibilities to locally engineer material, additive manufacturing could be combined with forging operations into novel process chains, offering the possibility to reduce the number of forging steps and to create near-net shape forgings with desired local properties. Some innovative process chains combining additive manufacturing and forging have been patented recently, but almost no scientific knowledge on the workability of 3D-printed preforms exists. The present study investigates the flow stress and microstructure evolution during hot working of pre-forms produced by laser powder deposition and selective laser melting (Figure 1) and puts forward a model for the flow stress.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Femec, D.A.
This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases.
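The CREATE-SCHEMA idea, generating the SQL that defines a database from a declarative description, can be illustrated in miniature. The function name, the tuple-based input format, and the example table are assumptions for illustration; the real tool's template language and its FORTRAN declarations and precompiled SQL calls are not reproduced.

```python
# Toy schema-to-SQL generator in the spirit of CREATE-SCHEMA (illustrative
# input format; the actual tool's templates are not shown in the report text).
def create_schema(table, columns):
    """columns: list of (name, sql_type, constraint) tuples."""
    lines = [
        "    {} {} {}".format(name, sql_type, constraint).rstrip()
        for name, sql_type, constraint in columns
    ]
    return "CREATE TABLE {} (\n{}\n);".format(table, ",\n".join(lines))

ddl = create_schema("sample", [
    ("sample_id", "INTEGER", "PRIMARY KEY"),
    ("collected", "DATE", "NOT NULL"),
    ("analyst", "VARCHAR(40)", ""),
])
print(ddl)
```

Generating DDL from a single declarative source is what lets such tools also emit the matching host-language declarations, keeping database schema and application code from drifting apart.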
32 CFR 21.560 - Must DoD Components assign numbers uniformly to awards?
Code of Federal Regulations, 2010 CFR
2010-07-01
... nonprocurement instrument. (c) The 9th position must be a number: (1) “1” for grants. (2) “2” for cooperative... assigning these numbers and may create multiple series of letters and numbers to meet internal needs for...
Dynamics of molecules in extreme rotational states
Yuan, Liwei; Teitelbaum, Samuel W.; Robinson, Allison; Mullin, Amy S.
2011-01-01
We have constructed an optical centrifuge with a pulse energy that is more than 2 orders of magnitude larger than previously reported instruments. This high pulse energy enables us to create large enough number densities of molecules in extreme rotational states to perform high-resolution state-resolved transient IR absorption measurements. Here we report the first studies of energy transfer dynamics involving molecules in extreme rotational states. In these studies, the optical centrifuge drives CO2 molecules into states with J ∼ 220 and we use transient IR probing to monitor the subsequent rotational, translational, and vibrational energy flow dynamics. The results reported here provide the first molecular insights into the relaxation of molecules with rotational energy that is comparable to that of a chemical bond.
[Fostering a breastfeeding-friendly workplace].
Chen, Yi-Chun; Kuo, Shu-Chen
2013-02-01
Creating supportive environments that encourage mothers to breastfeed their children has emerged in recent years as a key health issue for women and children. Taiwan has a large and still growing number of new mothers in the workplace. Early postpartum return to work and inconvenient workplace conditions often discourage women from breastfeeding or cause early discontinuation. This study describes the current status of worksite breastfeeding-friendly policies in Taiwan and selected other countries and assesses the effects of work-related factors on working mother breastfeeding behavior. Although maternity leave has been positively correlated with breastfeeding duration, maternity leave in Taiwan remains significantly shorter than in other countries. Flexible working conditions, the provision of lactation rooms, and support from colleagues are critical components of promoting breastfeeding in the workplace.
Moon Zoo: Making the public part of a crater survey algorithm
NASA Astrophysics Data System (ADS)
Gay, P. L.; Brown, S.; Huang, D.; Daus, C.; Lehan, C.; Robbins, S.
2011-10-01
The Moon Zoo citizen science website launched in May 2010 and invited the public to annotate images from the Lunar Reconnaissance Orbiter's Narrow Angle Camera (NAC). Tasks included marking the edges of craters with an ellipse tool, indicating where linear features (e.g. scarps) and special types of craters (e.g. dark haloed) are located with a box, and rating the number of boulders in an image. The goal of this project is to create crater and feature catalogues for large areas of the moon. In addition to doing science, Moon Zoo also seeks to educate its audience through educational content, to engage them through social media, and to understand them through research into their motivations and behaviors.
NASA Technical Reports Server (NTRS)
Horvath, N. C.; Gray, T. I.; Mccrary, D. G. (Principal Investigator)
1982-01-01
Data from the National Oceanic and Atmospheric Administration satellite system (NOAA-6 satellite) were analyzed to study their nonmeteorological uses. A file of charts, graphs, and tables was created from the products generated. It was found that the most useful data lie between pixel numbers 400 and 2000 on a given scan line. The analysis of the generated products indicates that the Gray-McCrary Index can discern vegetation and associated daily and seasonal changes. The solar zenith-angle correction used in previous studies was found to be a useful adjustment to the index. The METSAT system seems best suited for providing large-area analyses of surface features on a daily basis.
Knechtel, Johann
2017-01-01
Abstract We have developed a novel approach for creating membrane-spanning protein-based pores. The construction principle is based on using well-defined, circular DNA nanostructures to arrange a precise number of pore-forming protein toxin monomers. We can thereby obtain, for the first time, protein pores with specifically set diameters. We demonstrate this principle by constructing artificial alpha-hemolysin (αHL) pores. The DNA/αHL hybrid nanopores composed of twelve, twenty or twenty-six monomers show stable insertions into lipid bilayers during electrical recordings, along with steady, pore size-dependent current levels. Our approach successfully advances the applicability of nanopores, in particular towards label-free studies of single molecules in large nanoscaled biological structures. PMID:29088457
World Virtual Observatory Organization
NASA Astrophysics Data System (ADS)
Ignatyev, Mikhail; Pinigin, Gennadij
On the basis of the experience of our University and Observatory, we investigate a seven-block model of a virtual organization for the consolidation of resources. The model consists of the following blocks: 1. Population: scientists, students, robots, and agents. 2. Aspirations of population groups. 3. Territory. 4. Production. 5. Ecology and safety. 6. Finance. 7. External relations: input and output flows of population, information, and resources. The world virtual observatory is a virtual world consisting of three groups of variables - appearances, essences, and structured uncertainty - the last of which defines the number and distribution of arbitrary coefficients in the equivalent equations. The consolidation of resources makes it possible to create large telescopes with a structure distributed across our planet and in space. Virtual instruments can achieve better characteristics by means of the collective effects that we have investigated in our paper.
An investigation of turbulence structure in a low-Reynolds-number incompressible turbulent boundary layer
NASA Technical Reports Server (NTRS)
White, B. R.; Strataridakis, C. J.
1987-01-01
An existing high turbulence intensity level (5%) atmospheric boundary-layer wind tunnel has been successfully converted to a relatively low level turbulence (0.3%) wind tunnel through extensive modification, testing, and calibration. A splitter plate was designed, built, and installed into the wind-tunnel facility to create thick, mature, two-dimensional turbulent boundary layer flow at zero pressure gradient. Single and cross hot-wire measurements show turbulent boundary layer characteristics of good quality with unusually large physical size, i.e., viscous sublayer of the order of 1 mm high. These confirm the potential ability of the tunnel to be utilized for future high-quality near-wall turbulent boundary layer measurements. It compares very favorably with many low turbulence research tunnels.
Ackerman, Joshua T.; Hartman, C. Alex; Herzog, Mark P.; Smith, Lacy M.; Moskal, Stacy M.; De La Cruz, Susan E. W.; Yee, Julie L.; Takekawa, John Y.
2014-01-01
The South Bay Salt Pond Restoration Project aims to restore 50–90 percent of former salt evaporation ponds into tidal marsh in South San Francisco Bay, California. However, large numbers of waterbirds use these ponds annually as nesting and foraging habitat. Islands within ponds are particularly important habitat for nesting, foraging, and roosting waterbirds. To maintain current waterbird populations, the South Bay Salt Pond Restoration Project plans to create new islands within former salt ponds in South San Francisco Bay. In a series of studies, we investigated pond and individual island attributes that are most beneficial to nesting, foraging, and roosting waterbirds.
Polarization imaging of imperfect m-plane GaN surfaces
NASA Astrophysics Data System (ADS)
Sakai, Yuji; Kawayama, Iwao; Nakanishi, Hidetoshi; Tonouchi, Masayoshi
2017-04-01
Surface polar states in m-plane GaN wafers were studied using a laser terahertz (THz) emission microscope (LTEM). Femtosecond laser illumination excites THz waves from the surface due to photocarrier acceleration by local spontaneous polarization and/or the surface built-in electric field. The m-plane, in general, has a large number of unfavorable defects and unintentional polarization inversion created during the regrowth process. The LTEM images can visualize surface domains with different polarizations, some of which are hard to visualize with photoluminescence mapping, i.e., non-radiative defect areas. The present study demonstrates that the LTEM provides rich information about the surface polar states of GaN, which is crucial to improve the performance of GaN-based optoelectronic and power devices.
Digital Reconstruction of 3D Polydisperse Dry Foam
NASA Astrophysics Data System (ADS)
Chieco, A.; Feitosa, K.; Roth, A. E.; Korda, P. T.; Durian, D. J.
2012-02-01
Dry foam is a disordered packing of bubbles that distort into familiar polyhedral shapes. We have implemented a method that uses optical axial tomography to reconstruct the internal structure of a dry foam in three dimensions. The technique consists of taking a series of photographs of the dry foam against a uniformly illuminated background at successive angles. By summing the projections we create images of the foam cross section. Image analysis of the cross sections allows us to locate Plateau borders and vertices. The vertices are then connected according to Plateau's rules to reconstruct the internal structure of the foam. Using this technique we are able to visualize a large number of bubbles of real 3D foams and obtain statistics of faces and edges.
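The "sum the projections" step of the reconstruction can be sketched as unfiltered backprojection on a toy grid. The synthetic point-like phantom and grid size below are illustrative stand-ins; real input would be the projections extracted from the photographs, and the subsequent Plateau-border analysis is not shown.

```python
import math

# Unfiltered backprojection: forward-project a phantom at several angles,
# then smear each projection back across the grid and sum over angles.
N = 21                                            # grid is N x N
angles = [k * math.pi / 18 for k in range(18)]    # 18 views over 180 degrees

def project(image, theta):
    """Parallel-beam projection: bin each pixel by its signed distance s
    along the direction (cos theta, sin theta)."""
    proj = [0.0] * (2 * N)
    for i in range(N):
        for j in range(N):
            x, y = j - N // 2, i - N // 2
            s = x * math.cos(theta) + y * math.sin(theta)
            proj[int(round(s)) + N] += image[i][j]
    return proj

phantom = [[0.0] * N for _ in range(N)]
phantom[N // 2][N // 2] = 1.0                     # a single bright pixel

recon = [[0.0] * N for _ in range(N)]
for theta in angles:
    proj = project(phantom, theta)
    for i in range(N):
        for j in range(N):
            x, y = j - N // 2, i - N // 2
            s = x * math.cos(theta) + y * math.sin(theta)
            recon[i][j] += proj[int(round(s)) + N]

peak = max((recon[i][j], i, j) for i in range(N) for j in range(N))
print(peak[1], peak[2])  # 10 10 -- the phantom's location is recovered
```

Unfiltered backprojection blurs features (each point spreads into a star of rays), which is why image analysis of the resulting cross sections is still needed to locate Plateau borders and vertices precisely.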
NASA Astrophysics Data System (ADS)
Lundegård, Iann
2015-09-01
Today an increasing number of countries around the world have adopted almost the same metaphorical language about teaching and learning. These theories, which grew up in the Western world, are largely produced within the framework of psychology and an individualistically oriented educational philosophy, and fit the ever-expanding paradigm of economic growth. This article gives a brief account of an exchange that took place in the early 1900s between two different directions in American educational philosophy. It then follows John Dewey's choice of route, which took a step away from attempts to create a rationalistic, ultimate definition of teaching and learning. Instead, a couple of different metaphors for education are presented that can be used as a basis for pragmatically organizing teaching toward specific purposes and consequences in relation to different cultural traditions.
Scada Malware, a Proof of Concept
NASA Astrophysics Data System (ADS)
Carcano, Andrea; Fovino, Igor Nai; Masera, Marcelo; Trombetta, Alberto
Critical infrastructures are nowadays exposed to new kinds of threats. These threats stem from the large number of new vulnerabilities and architectural weaknesses introduced by the extensive use of ICT and network technologies in such complex critical systems. Of particular interest is the set of vulnerabilities in the class of communication protocols normally known as “SCADA” protocols, under which fall all the communication protocols used to remotely control the RTU devices of an industrial system. In this paper we present a proof of concept of the potential effects of a set of computer malware specifically designed and created to impact a typical Supervisory Control and Data Acquisition system by taking advantage of some vulnerabilities of the ModBUS protocol.
Computer-generated imagery for 4-D meteorological data
NASA Technical Reports Server (NTRS)
Hibbard, William L.
1986-01-01
The University of Wisconsin-Madison Space Science and Engineering Center is developing animated stereo display terminals for use with McIDAS (Man-computer Interactive Data Access System). This paper describes image-generation techniques which have been developed to take maximum advantage of these terminals, integrating large quantities of four-dimensional meteorological data from balloon and satellite soundings, satellite images, Doppler and volumetric radar, and conventional surface observations. The images have been designed to use perspective, shading, hidden-surface removal, and transparency to augment the animation and stereo-display geometry. They create an illusion of a moving three-dimensional model of the atmosphere. This paper describes the design of these images and a number of rules of thumb for generating four-dimensional meteorological displays.
Antipodal hotspot pairs on the earth
NASA Technical Reports Server (NTRS)
Rampino, Michael R.; Caldeira, Ken
1992-01-01
The results of statistical analyses performed on three published hotspot distributions suggest that significantly more hotspots occur as nearly antipodal pairs than is anticipated from a random distribution, or from their association with geoid highs and divergent plate margins. The observed number of antipodal hotspot pairs depends on the maximum allowable deviation from exact antipodality. At a maximum deviation of not greater than 700 km, 26 to 37 percent of hotspots form antipodal pairs in the published lists examined here, significantly more than would be expected from the general hotspot distribution. Two possible mechanisms that might create such a distribution include: (1) symmetry in the generation of mantle plumes, and (2) melting related to antipodal focusing of seismic energy from large-body impacts.
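The antipodality criterion described above lends itself to a direct computation. The sketch below is a hypothetical illustration, not code from the paper: it pairs hotspots whose antipodes lie within a maximum allowable deviation (700 km by default) of another hotspot, using the haversine great-circle distance. The function names and the spherical-Earth radius of 6371 km are assumptions.

```python
import math

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    # Haversine great-circle distance in kilometers.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def antipode(lat, lon):
    # The antipode of (lat, lon) is (-lat, lon + 180), with longitude wrapped to (-180, 180].
    alon = lon + 180.0
    if alon > 180.0:
        alon -= 360.0
    return -lat, alon

def antipodal_pairs(hotspots, max_deviation_km=700.0):
    # Return index pairs (i, j) whose members lie within max_deviation_km
    # of exact antipodality; hotspots is a list of (lat, lon) in degrees.
    pairs = []
    for i in range(len(hotspots)):
        alat, alon = antipode(*hotspots[i])
        for j in range(i + 1, len(hotspots)):
            if great_circle_km(alat, alon, *hotspots[j]) <= max_deviation_km:
                pairs.append((i, j))
    return pairs
```

The observed fraction of paired hotspots could then be compared against the same statistic computed for randomly drawn locations, in the spirit of the significance test the abstract describes.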
Ethnic boxes: the unintended consequences of Habsburg bureaucratic classification
2018-01-01
The classificatory efforts that accompanied the modernization of the Habsburg state inadvertently helped establish, promote, and perpetuate national categories of identification, often contrary to the intentions of the Habsburg bureaucracy. The state did not create nations, but its classification of languages made available some ethnolinguistic identity categories that nationalists used to make political claims. The institutionalization of these categories also made them more relevant, especially as nationalist movements simultaneously worked toward the same goal. Yet identification with a nation did not follow an algorithmic logic. By the beginning of the twentieth century, sometimes earlier, various nationalisms could undoubtedly mobilize large numbers of people in Austria–Hungary, but people still had agency and nation-ness remained contingent and situational. PMID:29932174
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pereira, Ana I.; ALGORITMI, University of Minho; Lima, José
There are several approaches to humanoid robot gait planning. This problem presents a large number of unknown parameters that must be found to make the humanoid robot walk. Optimization in simulation models can be used to find the gait based on several criteria, such as energy minimization, acceleration, and step length, among others. The energy consumption can also be reduced with elastic elements coupled to each joint. The present paper addresses an optimization method, Stretched Simulated Annealing, that runs in an accurate and stable simulation model to find the optimal gait combined with elastic elements. Final results demonstrate that optimization is a valid gait planning technique.
A world of cities and the end of TB.
Prasad, Amit; Ross, Alex; Rosenberg, Paul; Dye, Christopher
2016-03-01
The WHO's End TB Strategy aims to reduce TB deaths by 95% and incidence by 90% between 2015 and 2035. As the world rapidly urbanizes, more people could have access to better infrastructure and services to help combat poverty and infectious diseases, including TB. And yet large numbers of people now live in overcrowded slums, with poor access to urban health services, amplifying the burden of TB. An alignment of the Sustainable Development Goals (SDGs) for health and for urban development provides an opportunity to accelerate the overall decline in infection and disease, and to create cities free of TB. © The Author 2016. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.
Information integration for a sky survey by data warehousing
NASA Astrophysics Data System (ADS)
Luo, A.; Zhang, Y.; Zhao, Y.
The virtualization service of the data system for the sky survey LAMOST is very important for astronomers. The service needs to integrate information from data collections, catalogs, and references, and support simple federation of a set of distributed files and associated metadata. Data warehousing has been in existence for several years and has demonstrated superiority over traditional relational database management systems by providing novel indexing schemes that support efficient on-line analytical processing (OLAP) of large databases. Relational database systems such as Oracle now support the warehouse capability, including extensions to the SQL language to support OLAP operations, and a number of metadata management tools have been created. The information integration of LAMOST by applying data warehousing is intended to effectively provide data and knowledge on-line.
Branches of Triangulated Origami Near the Unfolded State
NASA Astrophysics Data System (ADS)
Chen, Bryan Gin-ge; Santangelo, Christian D.
2018-01-01
Origami structures are characterized by a network of folds and vertices joining unbendable plates. For applications to mechanical design and self-folding structures, it is essential to understand the interplay between the set of folds in the unfolded origami and the possible 3D folded configurations. When deforming a structure that has been folded, one can often linearize the geometric constraints, but the degeneracy of the unfolded state makes a linear approach impossible there. We derive a theory for the second-order infinitesimal rigidity of an initially unfolded triangulated origami structure and use it to study the set of nearly unfolded configurations of origami with four boundary vertices. We find that locally, this set consists of a number of distinct "branches" which intersect at the unfolded state, and that the number of these branches is exponential in the number of vertices. We find numerical and analytical evidence that suggests that the branches are characterized by choosing each internal vertex to either "pop up" or "pop down." The large number of pathways along which one can fold an initially unfolded origami structure strongly indicates that a generic structure is likely to become trapped in a "misfolded" state. Thus, new techniques for creating self-folding origami are likely necessary; controlling the popping state of the vertices may be one possibility.
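The suggested correspondence between branches and per-vertex popping states implies a count of 2^n for n internal vertices. A minimal sketch of that enumeration (the function name is illustrative, not from the paper):

```python
from itertools import product

def popping_states(n_internal_vertices):
    # Each internal vertex independently "pops up" (+1) or "pops down" (-1),
    # giving 2**n candidate branches near the unfolded state.
    return list(product((+1, -1), repeat=n_internal_vertices))
```

Even a modest origami with 20 internal vertices would have on the order of a million candidate branches, which is why the abstract warns that a generic structure is likely to become trapped in a misfolded state.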
The BRAMS Zoo, a citizen science project
NASA Astrophysics Data System (ADS)
Calders, S.
2015-01-01
Currently, the BRAMS network comprises around 30 receiving stations, and each station collects 24 hours of data per day. With such a large amount of raw data, automatic detection of meteor echoes is mandatory. Several algorithms have been developed, using different techniques (they are discussed in the Proceedings of IMC 2014). This task is complicated by the presence of parasitic signals (mostly airplane echoes) on the one hand, and by the fact that some (overdense) meteor echoes exhibit complex shapes that are hard to recognize on the other. Currently, none of the algorithms can perfectly mimic the human eye, which remains the best detector. We therefore plan a citizen-science collaboration to create a "BRAMS zoo": the idea is to ask a very large community of users to draw boxes around meteor echoes in spectrograms. The results will be used to assess the accuracy of the automatic detection algorithms on a large data set. We will focus on a few selected meteor showers, which are always more fascinating for the general public than the sporadic background. Moreover, during meteor showers, many more complex overdense echoes are observed, for which current automatic detection methods might fail. Finally, the dataset of manually detected meteors can also be useful, e.g., for IMCCE to study the dynamic evolution of cometary dust.
Blueprint for a microwave trapped ion quantum computer
Lekitsch, Bjoern; Weidt, Sebastian; Fowler, Austin G.; Mølmer, Klaus; Devitt, Simon J.; Wunderlich, Christof; Hensinger, Winfried K.
2017-01-01
The availability of a universal quantum computer may have a fundamental impact on a vast number of research fields and on society as a whole. An increasingly large scientific and industrial community is working toward the realization of such a device. An arbitrarily large quantum computer may best be constructed using a modular approach. We present a blueprint for a trapped ion–based scalable quantum computer module, making it possible to create a scalable quantum computer architecture based on long-wavelength radiation quantum gates. The modules control all operations as stand-alone units, are constructed using silicon microfabrication techniques, and are within reach of current technology. To perform the required quantum computations, the modules make use of long-wavelength radiation–based quantum gate technology. To scale this microwave quantum computer architecture to a large size, we present a fully scalable design that makes use of ion transport between different modules, thereby allowing arbitrarily many modules to be connected to construct a large-scale device. A high error–threshold surface error correction code can be implemented in the proposed architecture to execute fault-tolerant operations. With appropriate adjustments, the proposed modules are also suitable for alternative trapped ion quantum computer architectures, such as schemes using photonic interconnects. PMID:28164154
Bennema-Broos, M; Groenewegen, P P; Westert, G P
2001-06-01
In this paper, the hypothesis that the spatial distribution of hospital beds is more even in countries with socialist or social democratic governments than in countries with conservative or Christian democratic governments was tested. To avoid the confounding influences of historical and institutional differences between countries, we used the Federal Republic of Germany as a case study. The German federal states have their own governments, which play an important role in creating structures for the planning of hospital facilities. The test of the hypothesis was largely quantitative. At the level of federal states, the rank correlation was computed between the weighted number of years of left-wing government participation and the coefficient of variation in the number of hospital beds per 1000 inhabitants. In addition, the hospital plans of two federal states were studied. The hypothesis was supported by the data, showing a positive association between the number of years of left-wing government participation and regional variation in the number of hospital beds. A comparison of the hospital plans of two contrasting federal states showed less government interference in hospital planning in the state with a tradition of right-wing government. There seems to be a relation between left-wing government participation in West German states and a more equal distribution of the number of hospital beds per 1000 inhabitants.
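The rank correlation computed at the level of federal states can be sketched as a Spearman coefficient: rank both variables, then take the Pearson correlation of the rank vectors. This is an illustrative stand-in, not the authors' code; tie handling via average ranks is assumed.

```python
def ranks(values):
    # Average ranks (1-based); tied values share the mean of their rank positions.
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            r[order[k]] = mean_rank
        i = j + 1
    return r

def spearman(x, y):
    # Spearman rank correlation = Pearson correlation of the rank vectors.
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Feeding in years of left-wing government participation per state against the coefficient of variation of beds per 1000 inhabitants would reproduce the kind of state-level test the abstract describes.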
de Vries, Rob B M; Buma, Pieter; Leenaars, Marlies; Ritskes-Hoitinga, Merel; Gordijn, Bert
2012-12-01
The use of laboratory animals in tissue engineering research is an important underexposed ethical issue. Several ethical questions may be raised about this use of animals. This article focuses on the possibilities of reducing the number of animals used. Given that there is considerable debate about the adequacy of the current animal models in tissue engineering research, we investigate whether it is possible to reduce the number of laboratory animals by selecting and using only those models that have greatest predictive value for future clinical application of the tissue engineered product. The field of articular cartilage tissue engineering is used as a case study. Based on a study of the scientific literature and interviews with leading experts in the field, an overview is provided of the animal models used and the advantages and disadvantages of each model, particularly in terms of extrapolation to the human situation. Starting from this overview, it is shown that, by skipping the small models and using only one large preclinical model, it is indeed possible to restrict the number of animal models, thereby reducing the number of laboratory animals used. Moreover, it is argued that the selection of animal models should become more evidence based and that researchers should seize more opportunities to choose or create characteristics in the animal models that increase their predictive value.
Electric Propulsion Laboratory Vacuum Chamber
1964-06-21
Engineer Paul Reader and his colleagues take environmental measurements during testing of a 20-inch diameter ion engine in a vacuum tank at the Electric Propulsion Laboratory (EPL). Researchers at the Lewis Research Center were investigating the use of a permanent-magnet circuit to create the magnetic field required to power electron bombardment ion engines. Typical ion engines use a solenoid coil to create this magnetic field. It was thought that the substitution of a permanent magnet would create a comparable magnetic field with a lower weight. Testing of the magnet system in the EPL vacuum tanks revealed no significant operational problems. Reader found the weight of the two systems was similar, but that the thruster’s efficiency increased with the magnet. The EPL contained a series of large vacuum tanks that could be used to simulate conditions in space. Large vacuum pumps reduced the internal air pressure, and a refrigeration system created the cryogenic temperatures found in space.
75 FR 47883 - Elimination of USDOT Number Registrant-Only Classification
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-09
... DEPARTMENT OF TRANSPORTATION Federal Motor Carrier Safety Administration Elimination of USDOT Number Registrant-Only Classification AGENCY: Federal Motor Carrier Safety Administration (FMCSA), DOT..., FMCSA created the ``registrant-only'' USDOT number classification to identify registered owners of CMVs...
Vanniyasingam, Thuva; Cunningham, Charles E; Foster, Gary; Thabane, Lehana
2016-01-01
Objectives Discrete choice experiments (DCEs) are routinely used to elicit patient preferences to improve health outcomes and healthcare services. While many fractional factorial designs can be created, some are more statistically optimal than others. The objective of this simulation study was to investigate how varying the number of (1) attributes, (2) levels within attributes, (3) alternatives and (4) choice tasks per survey will improve or compromise the statistical efficiency of an experimental design. Design and methods A total of 3204 DCE designs were created to assess how relative design efficiency (d-efficiency) is influenced by varying the number of choice tasks (2–20), alternatives (2–5), attributes (2–20) and attribute levels (2–5) of a design. Choice tasks were created by randomly allocating attribute and attribute level combinations into alternatives. Outcome Relative d-efficiency was used to measure the optimality of each DCE design. Results DCE design complexity influenced statistical efficiency. Across all designs, relative d-efficiency decreased as the number of attributes and attribute levels increased. It increased for designs with more alternatives. Lastly, relative d-efficiency converges as the number of choice tasks increases, though convergence may not occur at 100% statistical optimality. Conclusions Achieving 100% d-efficiency is heavily dependent on the number of attributes, attribute levels, choice tasks and alternatives. Further exploration of overlaps and block sizes is needed. This study's results are widely applicable for researchers interested in creating optimal DCE designs to elicit individual preferences on health services, programmes, policies and products. PMID:27436671
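Relative D-efficiency for a coded design matrix X with N runs and p columns is commonly computed as 100 * det(X'X)^(1/p) / N. The following sketch is an illustration of that formula, not the authors' simulation code; an effects-coded (plus or minus 1) design matrix is assumed.

```python
def determinant(m):
    # Determinant by Gaussian elimination with partial pivoting.
    a = [row[:] for row in m]
    n = len(a)
    det = 1.0
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        if abs(a[pivot][col]) < 1e-12:
            return 0.0
        if pivot != col:
            a[col], a[pivot] = a[pivot], a[col]
            det = -det
        det *= a[col][col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= f * a[col][c]
    return det

def d_efficiency(design):
    # Relative D-efficiency (in %) of a coded design matrix X:
    #   D-eff = 100 * det(X'X)^(1/p) / N
    # where N is the number of runs and p the number of columns.
    n, p = len(design), len(design[0])
    info = [[sum(design[r][i] * design[r][j] for r in range(n)) for j in range(p)]
            for i in range(p)]  # information matrix X'X
    return 100.0 * determinant(info) ** (1.0 / p) / n
```

A full-factorial two-level design with orthogonal effects-coded columns attains 100% under this measure, which is the benchmark against which fractional designs are compared.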
NASA Astrophysics Data System (ADS)
Chang, C.-C.; Yang, R.-J.
2004-04-01
Electroosmotic flow in microchannels is restricted to low Reynolds number regimes characterized by extremely weak inertia forces and laminar flow. Consequently, the mixing of different species occurs primarily through diffusion, and hence cannot readily be achieved within a short mixing channel. The current study presents a numerical investigation of electrokinetically driven flow mixing in microchannels with various numbers of incorporated patterned rectangular blocks. Furthermore, a novel approach is introduced which patterns heterogeneous surfaces on the upper faces of these rectangular blocks in order to enhance species mixing. The simulation results confirm that the introduction of rectangular blocks within the mixing channel slightly enhances species mixing by constricting the bulk flow, hence creating a stronger diffusion effect. However, it is noted that a large number of blocks and hence a long mixing channel are required if a complete mixing of the species is to be obtained. The results also indicate that patterning heterogeneous upper surfaces on the rectangular blocks is an effective means of enhancing the species mixing. It is shown that increasing the magnitude of the heterogeneous surface zeta potential enables a reduction in the mixing channel length and an improved degree of mixing efficiency.
Formation of free round jets with long laminar regions at large Reynolds numbers
NASA Astrophysics Data System (ADS)
Zayko, Julia; Teplovodskii, Sergey; Chicherina, Anastasia; Vedeneev, Vasily; Reshmin, Alexander
2018-04-01
The paper describes a new, simple method for the formation of free round jets with long laminar regions by a jet-forming device of ~1.5 jet diameters in size. Submerged jets of 0.12 m diameter at Reynolds numbers of 2000-12 560 are experimentally studied. It is shown that for the optimal regime, the laminar region length reaches 5.5 diameters for a Reynolds number of ~10 000, which is not achievable with other methods of laminar jet formation. To explain the existence of the optimal regime, a steady flow calculation in the forming unit and a stability analysis of outcoming jet velocity profiles are conducted. The shortening of the laminar regions, compared with the optimal regime, is explained by the higher incoming turbulence level at lower velocities and by the increase of perturbation growth rates at larger velocities. The initial laminar regions of free jets can be used for organising air curtains for the protection of objects in medicine and technology by creating an air field with desired properties that does not mix with ambient air. Free jets with long laminar regions can also be used for detailed studies of perturbation growth and transition to turbulence in round jets.
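For orientation, the jet Reynolds number quoted above follows the usual definition Re = U*D/nu. A small sketch (illustrative; the default kinematic viscosity of air, 1.5e-5 m^2/s at roughly room temperature, is an assumption) shows that the 0.12 m jet at about 1.25 m/s lands near the Re ~ 10 000 optimal regime:

```python
def reynolds_number(velocity_m_s, diameter_m, kinematic_viscosity_m2_s=1.5e-5):
    # Re = U * D / nu; the default nu approximates air at room temperature.
    return velocity_m_s * diameter_m / kinematic_viscosity_m2_s
```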
NASA Astrophysics Data System (ADS)
Kaiser, Bryan E.; Poroseva, Svetlana V.; Canfield, Jesse M.; Sauer, Jeremy A.; Linn, Rodman R.
2013-11-01
The High Gradient hydrodynamics (HIGRAD) code is an atmospheric computational fluid dynamics code created by Los Alamos National Laboratory to accurately represent flows characterized by sharp gradients in velocity, concentration, and temperature. HIGRAD uses a fully compressible finite-volume formulation for explicit Large Eddy Simulation (LES) and features an advection scheme that is second-order accurate in time and space. In the current study, boundary conditions implemented in HIGRAD are varied to find those that better reproduce the reduced physics of a flat plate boundary layer to compare with complex physics of the atmospheric boundary layer. Numerical predictions are compared with available DNS, experimental, and LES data obtained by other researchers. High-order turbulence statistics are collected. The Reynolds number based on the free-stream velocity and the momentum thickness is 120 at the inflow and the Mach number for the flow is 0.2. Results are compared at Reynolds numbers of 670 and 1410. A part of the material is based upon work supported by NASA under award NNX12AJ61A and by the Junior Faculty UNM-LANL Collaborative Research Grant.
Flight Test Measurements From The Tu-144LL Structure/Cabin Noise Follow-On Experiment
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Rackl, Robert G.; Andrianov, Eduard V.
2000-01-01
This follow-on flight experiment on the TU-144LL Supersonic Flying Laboratory, conducted during the period September 1998 to April 1999, was a continuation of the previous Structure/Cabin Noise Experiment 2.1. Data were obtained over a wide range of altitudes and Mach numbers. Measured were: turbulent boundary layer pressure fluctuations on the fuselage over its length; structural response on skin panels using accelerometers; and flow direction over three windows using 'flow cones'. The effect of steps in the flow was also measured using two window blank pairs, each pair bridged by a plate which created small sharp forward and aft facing steps. The effect of transducer flushness with the exterior surface was also measured during flight. Flight test points were chosen to cover much of the TU-144's flight envelope, as well as to obtain as large a unit Reynolds number range as possible at various Mach numbers: takeoff, subsonic, transonic, and supersonic cruise conditions up to Mach 2. Data on engine runups and background noise were acquired on the ground. The data, in the form of time histories of the acoustic signals, together with auxiliary data and basic MATLAB processing modules, are available on CD-R disks.
IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.
Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M
2016-04-01
Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.
NASA Astrophysics Data System (ADS)
Gao, Chan; Tian, Dongfeng; Li, Maosheng; Qian, Dazhi
2018-03-01
In fusion applications, helium, implanted or created by transmutation, plays an important role in the response of reduced-activation ferritic/martensitic steels to neutron radiation damage. The effects of helium concentration and radiation temperature on the interaction of interstitial helium atoms with displacement cascades have been studied in the Fe-He system using molecular dynamics with a recently developed Fe-He potential. Results indicate that interstitial helium atoms produce no additional defects at peak time and promote recombination of Frenkel pairs at lower helium concentrations, but suppress recombination of Frenkel pairs at larger helium concentrations. Moreover, large helium concentrations promote the production of defects at the end of cascades. The number of substitutional helium atoms increases with helium concentration at peak time and at the end of cascades, but the number at peak time is smaller than that at the end of displacement cascades. High radiation temperatures promote the production of defects at peak time and the recombination of defects at the end of cascades. The number of substitutional helium atoms increases with radiation temperature, but that at peak time is smaller than that at the end of cascades.
Improving the Operations of the Earth Observing One Mission via Automated Mission Planning
NASA Technical Reports Server (NTRS)
Chien, Steve A.; Tran, Daniel; Rabideau, Gregg; Schaffer, Steve; Mandl, Daniel; Frye, Stuart
2010-01-01
We describe the modeling and reasoning about operations constraints in an automated mission planning system for an earth observing satellite - EO-1. We first discuss the large number of elements that can be naturally represented in an expressive planning and scheduling framework. We then describe a number of constraints that challenge the current state of the art in automated planning systems and discuss how we modeled these constraints as well as discuss tradeoffs in representation versus efficiency. Finally we describe the challenges in efficiently generating operations plans for this mission. These discussions involve lessons learned from an operations model that has been in use since Fall 2004 (called R4) as well as a newer more accurate operations model operational since June 2009 (called R5). We present analysis of the R5 software documenting a significant (greater than 50%) increase in the number of weekly observations scheduled by the EO-1 mission. We also show that the R5 mission planning system produces schedules within 15% of an upper bound on optimal schedules. This operational enhancement has created value of millions of dollars US over the projected remaining lifetime of the EO-1 mission.
Development of an Efficient Binaural Simulation for the Analysis of Structural Acoustic Data
NASA Technical Reports Server (NTRS)
Johnson, Marty E.; Lalime, Aimee L.; Grosveld, Ferdinand W.; Rizzi, Stephen A.; Sullivan, Brenda M.
2003-01-01
Applying binaural simulation techniques to structural acoustic data can be very computationally intensive because the number of discrete noise sources can be very large. Typically, Head Related Transfer Functions (HRTFs) are used to individually filter the signals from each of the sources in the acoustic field. Therefore, creating a binaural simulation implies the use of potentially hundreds of real-time filters. This paper details two methods of reducing the number of real-time computations required by: (i) using the singular value decomposition (SVD) to reduce the complexity of the HRTFs by breaking them into dominant singular values and vectors and (ii) using equivalent source reduction (ESR) to reduce the number of sources to be analyzed in real time by replacing sources on the scale of a structural wavelength with sources on the scale of an acoustic wavelength. The ESR and SVD reduction methods can be combined to provide an estimated computation time reduction of 99.4% for the structural acoustic data tested. In addition, preliminary tests have shown that there is a 97% correlation between the results of the combined reduction methods and the results found with the current binaural simulation techniques.
Robust pulmonary lobe segmentation against incomplete fissures
NASA Astrophysics Data System (ADS)
Gu, Suicheng; Zheng, Qingfeng; Siegfried, Jill; Pu, Jiantao
2012-03-01
The lobes are important anatomical landmarks of the human lung, and accurate lobe segmentation may be useful for characterizing specific lung diseases (e.g., inflammatory, granulomatous, and neoplastic diseases). A number of investigations showed that pulmonary fissures were often incomplete in image depiction, thereby making the computerized identification of individual lobes a challenging task. Our purpose is to develop a fully automated algorithm for accurate identification of individual lobes regardless of the integrity of pulmonary fissures. The underlying idea of the developed lobe segmentation scheme is to use piecewise planes to approximate the detected fissures. After a rotation and a global smoothing, a number of small planes were fitted using local fissure points. The local surfaces are finally combined for lobe segmentation using a quadratic B-spline weighting strategy to ensure that the segmentation is smooth. The performance of the developed scheme was assessed by comparison with a manually created reference standard on a dataset of 30 lung CT examinations. These examinations covered a number of lung diseases and were selected from a large chronic obstructive pulmonary disease (COPD) dataset. The results indicate that our lobe segmentation scheme is efficient and accurate against incomplete fissures.
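The piecewise-plane approximation can be grounded in an ordinary least-squares plane fit to local fissure points. The sketch below is illustrative only; the paper's rotation, global smoothing, and B-spline weighting steps are not reproduced. It fits z = a*x + b*y + c via the 3x3 normal equations:

```python
def solve3(a, b):
    # Gaussian elimination with partial pivoting for a 3x3 linear system.
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for c in range(col, 4):
                m[r][c] -= f * m[col][c]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (m[r][3] - sum(m[r][c] * x[c] for c in range(r + 1, 3))) / m[r][r]
    return x

def fit_plane(points):
    # Least-squares fit of z = a*x + b*y + c to 3D points via the
    # normal equations (A'A) [a, b, c]' = A'z.
    sxx = syy = sxy = sx = sy = sz = sxz = syz = 0.0
    n = float(len(points))
    for x, y, z in points:
        sxx += x * x; syy += y * y; sxy += x * y
        sx += x; sy += y; sz += z
        sxz += x * z; syz += y * z
    a_mat = [[sxx, sxy, sx],
             [sxy, syy, sy],
             [sx,  sy,  n ]]
    return solve3(a_mat, [sxz, syz, sz])
```

Fitting many such local planes and blending them smoothly is the essence of the approach the abstract describes for coping with incomplete fissures.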
Serial analysis of gene expression in the silkworm, Bombyx mori.
Huang, Jianhua; Miao, Xuexia; Jin, Weirong; Couble, Pierre; Mita, Kasuei; Zhang, Yong; Liu, Wenbin; Zhuang, Leijun; Shen, Yan; Keime, Celine; Gandrillon, Olivier; Brouilly, Patrick; Briolay, Jerome; Zhao, Guoping; Huang, Yongping
2005-08-01
The silkworm Bombyx mori is one of the most economically important insects and serves as a model for Lepidoptera insects. We used serial analysis of gene expression (SAGE) to derive profiles of expressed genes during the developmental life cycle of the silkworm and to create a reference for understanding silkworm metamorphosis. We generated four SAGE libraries, one from each of the four developmental stages of the silkworm. In total we obtained 257,964 SAGE tags, of which 39,485 were unique tags. Sorted by copy number, 14.1% of the unique tags were detected at a median to high level (five or more copies), 24.2% at lower levels (two to four copies), and 61.7% as single copies. Using a basic local alignment search tool on the EST database, 35% of the tags matched known silkworm expressed sequence tags. SAGE demonstrated that a number of the genes were up- or down-regulated during the four developmental phases of the egg, larva, pupa, and adult. Furthermore, we found that the generation of longer cDNA fragments from SAGE tags constituted the most efficient method of gene identification, which facilitated the analysis of a large number of unknown genes.
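The copy-number breakdown reported above (singletons, two to four copies, five or more copies) is a simple binning of unique-tag counts. A hypothetical sketch of that tabulation, not the authors' pipeline:

```python
from collections import Counter

def copy_number_profile(tags):
    # Classify unique SAGE tags by copy number: singletons (1 copy),
    # low (2-4 copies), and medium-to-high (5+ copies); return the
    # percentage of unique tags falling in each bin.
    counts = Counter(tags)
    total_unique = len(counts)
    bins = {"single": 0, "low": 0, "medium_high": 0}
    for c in counts.values():
        if c == 1:
            bins["single"] += 1
        elif c <= 4:
            bins["low"] += 1
        else:
            bins["medium_high"] += 1
    return {k: 100.0 * v / total_unique for k, v in bins.items()}
```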
NASA Astrophysics Data System (ADS)
Rainer, M.; Poretti, E.; Mistò, A.; Panzera, M. R.; Molinaro, M.; Cepparo, F.; Roth, M.; Michel, E.; Monteiro, M. J. P. F. G.
2016-12-01
We created a large database of physical parameters and variability indicators by fully reducing and analyzing the large number of spectra taken to complement the asteroseismic observations of the COnvection, ROtation and planetary Transits (CoRoT) satellite. 7103 spectra of 261 stars obtained with the ESO echelle spectrograph HARPS have been stored in the VO-compliant database Spectroscopic Indicators in a SeisMic Archive (SISMA), along with the CoRoT photometric data of the 72 CoRoT asteroseismic targets. The remaining stars belong to the same variability classes as the CoRoT targets and were observed to better characterize the properties of such classes. Several useful variability indicators (mean line profiles, indices of differential rotation, activity, and emission lines), together with v sin i and radial-velocity measurements, have been extracted from the spectra. The atmospheric parameters T_eff, log g, and [Fe/H] have been computed following a homogeneous procedure. As a result, we fully characterize a sample of new and known variable stars by computing several spectroscopic indicators, also providing some cases of simultaneous photometry and spectroscopy.
Huang, Kuo-Wei; Su, Ting-Wei; Ozcan, Aydogan; Chiou, Pei-Yu
2013-06-21
We demonstrate an optoelectronic tweezer (OET) coupled to a lensfree holographic microscope for real-time interactive manipulation of cells and micro-particles over a large field-of-view (FOV). This integrated platform can record the holographic images of cells and particles over the entire active area of a CCD sensor array, perform digital image reconstruction to identify target cells, dynamically track the positions of cells and particles, and project light beams to trigger light-induced dielectrophoretic forces to pattern and sort cells on a chip. OET technology has been previously shown to be capable of performing parallel single cell manipulation over a large area. However, its throughput has been bottlenecked by the number of cells that can be imaged within the limited FOV of a conventional microscope objective lens. Integrating lensfree holographic imaging with OET solves this fundamental FOV barrier, while also creating a compact on-chip cell/particle manipulation platform. Using this unique platform, we have successfully demonstrated real-time interactive manipulation of thousands of single cells and micro-particles over an ultra-large area of 240 mm² (17.96 mm × 13.52 mm).
When Gravity Fails: Local Search Topology
NASA Technical Reports Server (NTRS)
Frank, Jeremy; Cheeseman, Peter; Stutz, John; Lau, Sonie (Technical Monitor)
1997-01-01
Local search algorithms for combinatorial search problems frequently encounter a sequence of states in which it is impossible to improve the value of the objective function; moves through these regions, called plateau moves, dominate the time spent in local search. We analyze and characterize plateaus for three different classes of randomly generated Boolean Satisfiability problems. We identify several interesting features of plateaus that impact the performance of local search algorithms. We show that local minima tend to be small but occasionally may be very large. We also show that local minima can be escaped without unsatisfying a large number of clauses, but that systematically searching for an escape route may be computationally expensive if the local minimum is large. We show that plateaus with exits, called benches, tend to be much larger than minima, and that some benches have very few exit states which local search can use to escape. We show that the solutions (i.e. global minima) of randomly generated problem instances form clusters, which behave similarly to local minima. We revisit several enhancements of local search algorithms and explain their performance in light of our results. Finally we discuss strategies for creating the next generation of local search algorithms.
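A plateau move in this setting is a variable flip that leaves the number of unsatisfied clauses unchanged. A minimal sketch of how a GSAT-style local search might classify candidate flips is below; the clause encoding (signed integers, so -2 means "variable 2 is false") and the function names are illustrative assumptions, not the paper's code.

```python
def unsat_count(clauses, assign):
    """Number of clauses not satisfied under `assign` (var -> bool).
    A clause is a list of signed ints; a positive literal is satisfied
    when its variable is True, a negative literal when it is False."""
    def clause_sat(clause):
        return any((lit > 0) == assign[abs(lit)] for lit in clause)
    return sum(0 if clause_sat(c) else 1 for c in clauses)

def classify_flip(clauses, assign, var):
    """Classify flipping `var` by its effect on the objective
    (number of unsatisfied clauses): improving, plateau, or worsening."""
    before = unsat_count(clauses, assign)
    flipped = dict(assign)
    flipped[var] = not flipped[var]
    after = unsat_count(clauses, flipped)
    if after < before:
        return "improving"
    if after == before:
        return "plateau"
    return "worsening"
```

When every candidate flip is classified as "plateau", the search is on one of the flat regions analyzed above: a bench if some sequence of plateau moves reaches an improving flip, a local minimum otherwise.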