A statistical parts-based appearance model of inter-subject variability.
Toews, Matthew; Collins, D Louis; Arbel, Tal
2006-01-01
In this article, we present a general statistical parts-based model for representing the appearance of an image set, applied to the problem of inter-subject MR brain image matching. In contrast with global image representations such as active appearance models, the parts-based model consists of a collection of localized image parts whose appearance, geometry and occurrence frequency are quantified statistically. The parts-based approach explicitly addresses the case where one-to-one correspondence does not exist between subjects due to anatomical differences, as parts are not expected to occur in all subjects. The model can be learned automatically, discovering structures that appear with statistical regularity in a large set of subject images, and can be robustly fit to new images, all in the presence of significant inter-subject variability. As parts are derived from generic scale-invariant features, the framework can be applied in a wide variety of image contexts, in order to study the commonality of anatomical parts or to group subjects according to the parts they share. Experimentation shows that a parts-based model can be learned from a large set of MR brain images, and used to determine parts that are common within the group of subjects. Preliminary results indicate that the model can be used to automatically identify distinctive features for inter-subject image registration despite large changes in appearance.
Part-based deep representation for product tagging and search
NASA Astrophysics Data System (ADS)
Chen, Keqing
2017-06-01
Despite previous studies, tagging and indexing product images remain challenging due to the large inner-class variation of the products. In traditional methods, quantized hand-crafted features such as SIFTs are extracted as the representation of the product images, which are not discriminative enough to handle the inner-class variation. For discriminative image representation, this paper first presents a novel deep convolutional neural network (DCNN) architecture pre-trained on a large-scale general image dataset. Compared to the traditional features, our DCNN representation has more discriminative power with fewer dimensions. Moreover, we incorporate the part-based model into the framework to overcome the negative effect of bad alignment and cluttered background, and hence the descriptive ability of the deep representation is further enhanced. Finally, we collect and contribute a well-labeled shoe image database, i.e., the TBShoes, on which we apply the part-based deep representation for product image tagging and search, respectively. The experimental results highlight the advantages of the proposed part-based deep representation.
Aucott, W.R.; Meadows, R.S.; Patterson, G.G.
1987-01-01
Base flow was computed to estimate discharge from regional aquifers for six large streams in the upper Coastal Plain of South Carolina and parts of North Carolina and Georgia. Aquifers that sustain the base flow of both large and small streams are stratified into shallow and deep flow systems. Base flow during dry conditions on main stems of large streams was assumed to be the discharge from the deep groundwater flow system. Six streams were analyzed: the Savannah, South and North Fork Edisto, Lynches, Pee Dee, and Lumber Rivers. Stream reaches in the upper Coastal Plain were studied because of the relatively large aquifer discharge in these areas in comparison to the lower Coastal Plain. Estimates of discharge from the deep groundwater flow system to the six large streams averaged 1.8 cu ft/sec/mi of stream and 0.11 cu ft/sec/sq mi of surface drainage area. The estimates were made by subtracting all tributary inflows from the discharge gain between two gaging stations on a large stream during an extreme low-flow period. These estimates pertain only to flow in the deep groundwater flow system; flow in the shallow systems and total base flow are greater than flow in the deep system. (USGS)
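The reach-gain arithmetic described above is easy to make concrete. The sketch below uses invented discharge values (not figures from the report) to show how the deep-system discharge and the per-mile and per-square-mile rates would be computed.

```python
# Hypothetical illustration of the reach-gain bookkeeping described above.
# All numbers are made up; the report's actual gaging data are not reproduced here.

upstream_q = 850.0           # low-flow discharge at upstream gage, cu ft/sec
downstream_q = 1030.0        # low-flow discharge at downstream gage, cu ft/sec
tributary_inflows = [25.0, 40.0, 10.0]   # measured tributary inflows between gages, cu ft/sec

reach_length_mi = 58.0       # stream miles between the two gages
drainage_area_sq_mi = 950.0  # surface drainage area contributing to the reach

# Deep-system discharge = gain between gages minus everything delivered by tributaries.
reach_gain = downstream_q - upstream_q
deep_system_discharge = reach_gain - sum(tributary_inflows)

print(f"deep-system discharge: {deep_system_discharge:.0f} cu ft/sec")
print(f"per stream mile:       {deep_system_discharge / reach_length_mi:.2f} cu ft/sec/mi")
print(f"per sq mi of drainage: {deep_system_discharge / drainage_area_sq_mi:.3f} cu ft/sec/sq mi")
```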
Voice vs. Text-Based Discussion Forums: An Implementation of Wimba Voice Boards.
ERIC Educational Resources Information Center
Marriott, Philip; Hiscock, Jane
This paper reports on a two-year exploratory study to determine the viability of voice-based threaded discussion forums as a means of stimulating discussion and understanding of weekly readings as part of a large undergraduate communications course. From March to June 2001, 600 students participating in a large introduction to communication…
Eastern Colorado mobility study : final report
DOT National Transportation Integrated Search
2002-04-01
Colorado, with an economy based in large part on agriculture, has a need to transport large quantities of commodities. The rapidly growing urban areas in the state also need many products and goods to support the growth. Furthermore, Colorado is stra...
Schulz, S; Romacker, M; Hahn, U
1998-01-01
The development of powerful and comprehensive medical ontologies that support formal reasoning on a large scale is one of the key requirements for clinical computing in the next millennium. Taxonomic medical knowledge, a major portion of these ontologies, is mainly characterized by generalization and part-whole relations between concepts. While reasoning in generalization hierarchies is quite well understood, no fully conclusive mechanism as yet exists for part-whole reasoning. The approach we take emulates part-whole reasoning via classification-based reasoning using SEP triplets, a special data structure for encoding part-whole relations that is fully embedded in the formal framework of standard description logics.
Schulz, S.; Romacker, M.; Hahn, U.
1998-01-01
The development of powerful and comprehensive medical ontologies that support formal reasoning on a large scale is one of the key requirements for clinical computing in the next millennium. Taxonomic medical knowledge, a major portion of these ontologies, is mainly characterized by generalization and part-whole relations between concepts. While reasoning in generalization hierarchies is quite well understood, no fully conclusive mechanism as yet exists for part-whole reasoning. The approach we take emulates part-whole reasoning via classification-based reasoning using SEP triplets, a special data structure for encoding part-whole relations that is fully embedded in the formal framework of standard description logics. PMID:9929335
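As a rough illustration of the SEP-triplet idea (our reading of the encoding, not axioms quoted from the paper), each anatomical concept X is split into a structure node X_S that dominates an entity node X_E and a part node X_P, so that ordinary subsumption can stand in for transitive part-whole inference:

```latex
% Illustrative SEP-style axioms (a sketch of the encoding, not text from the paper):
% for each concept X, a structure node X_S dominates an entity node X_E and a part node X_P.
X_E \sqsubseteq X_S, \qquad
X_P \sqsubseteq X_S, \qquad
X_P \equiv \exists\, \mathit{anatomicalPartOf}.\, X_E
```

Stating, for example, Finger_S ⊑ Hand_P and Hand_S ⊑ Arm_P then lets a standard description-logic classifier derive Finger_S ⊑ Hand_S ⊑ Arm_P ⊑ Arm_S, so transitive part-whole chains fall out of ordinary subsumption.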
NASA Astrophysics Data System (ADS)
Fitzgerald, Michael; Danaia, Lena; McKinnon, David H.
2017-07-01
In recent years, calls for the adoption of inquiry-based pedagogies in the science classroom have formed a part of the recommendations for large-scale high school science reforms. However, these pedagogies have been problematic to implement at scale. This research explores the perceptions of 34 positively inclined early-adopter teachers in relation to their implementation of inquiry-based pedagogies. The teachers were part of a large-scale Australian high school intervention project based around astronomy. In a series of semi-structured interviews, the teachers identified a number of common barriers that prevented them from implementing inquiry-based approaches. The most important barriers identified include the extreme time restrictions on all scales, the poverty of their common professional development experiences, their lack of good models and definitions for what inquiry-based teaching actually is, and the lack of good resources enabling the capacity for change. Implications for expectations of teachers and their professional learning during educational reform and curriculum change are discussed.
ERIC Educational Resources Information Center
Kelly, Diana K.
In 1990, a study was conducted at Fullerton College (FC), a large suburban community college in Southern California, to determine if the use of classroom research by part-time faculty would stimulate greater involvement in learning and increase the course completion rate of adult learners in evening classes. A group of 16 part-time faculty…
Another HISA--the new standard: health informatics--service architecture.
Klein, Gunnar O; Sottile, Pier Angelo; Endsleff, Frederik
2007-01-01
In addition to the meaning as Health Informatics Society of Australia, HISA is the acronym used for the new European Standard: Health Informatics - Service Architecture. This EN 12967 standard has been developed by CEN - the federation of 29 national standards bodies in Europe. This standard defines the essential elements of a Service Oriented Architecture and a methodology for localization particularly useful for large healthcare organizations. It is based on the Open Distributed Processing (ODP) framework from ISO 10746 and contains the following parts: Part 1: Enterprise viewpoint. Part 2: Information viewpoint. Part 3: Computational viewpoint. This standard is now also the starting point for the consideration for an International standard in ISO/TC 215. The basic principles with a set of health specific middleware services as a common platform for various applications for regional health information systems, or large integrated hospital information systems, are well established following a previous prestandard. Examples of large scale deployments in Sweden, Denmark and Italy are described.
Practical Elements in Danish Engineering Programmes, Including the European Project Semester
ERIC Educational Resources Information Center
Hansen, Jorgen
2012-01-01
In Denmark, all engineering programmes in HE have practical elements; for instance, at Bachelor's level, an internship is an integrated part of the programme. Furthermore, Denmark has a long-established tradition of problem-based and project-organized learning, and a large part of students' projects, including their final projects, is done in…
Implementing a Reliability Centered Maintenance Program at NASA's Kennedy Space Center
NASA Technical Reports Server (NTRS)
Tuttle, Raymond E.; Pete, Robert R.
1998-01-01
Maintenance practices have long focused on time-based "preventive maintenance" techniques. Components were changed out and parts replaced based on how long they had been in place instead of what condition they were in. A reliability centered maintenance (RCM) program seeks to offer equal or greater reliability at decreased cost by ensuring that only applicable, effective maintenance is performed and, in large part, by replacing time-based maintenance with condition-based maintenance. A significant portion of this program involved introducing non-intrusive technologies, such as vibration analysis, oil analysis and IR cameras, to an existing labor force and management team.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berkel, M. van (FOM Institute DIFFER - Dutch Institute for Fundamental Energy Research, Nieuwegein; Fellow of the Japan Society for the Promotion of Science)
In this paper, a number of new explicit approximations are introduced to estimate the perturbative diffusivity (χ), convectivity (V), and damping (τ) in cylindrical geometry. For this purpose, the harmonic components of heat waves induced by localized deposition of modulated power are used. The approximations are based on the heat equation in cylindrical geometry using the symmetry (Neumann) boundary condition at the plasma center. This means that the approximations derived here should be used only to estimate transport coefficients between the plasma center and the off-axis perturbative source. If the effect of cylindrical geometry is small, it is also possible to use semi-infinite domain approximations presented in Part I and Part II of this series. A number of new approximations are derived in this part, Part III, based upon continued fractions of the modified Bessel function of the first kind and the confluent hypergeometric function of the first kind. These approximations together with the approximations based on semi-infinite domains are compared for heat waves traveling towards the center. The relative error for the different derived approximations is presented for different values of the frequency, transport coefficients, and dimensionless radius. Moreover, it is shown how combinations of different explicit formulas can be used to estimate the transport coefficients over a large parameter range for cases without convection and damping, cases with damping only, and cases with convection and damping. The relative error between the approximation and its underlying model is below 2% for the case where only diffusivity and damping are considered. If convectivity is also considered, the diffusivity can be estimated well in a large region, but there is also a large region in which no suitable approximation is found. This paper is the third part (Part III) of a series of three papers. In Part I, the semi-infinite slab approximations have been treated. In Part II, cylindrical approximations are treated for heat waves traveling towards the plasma edge assuming a semi-infinite domain.
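For orientation, the sketch below works out the simplest member of this family of estimates: the diffusion-only, semi-infinite slab case referred to in Part I (our simplification, with invented amplitudes and phases). There the harmonic amplitude decays as exp(-kx) and the phase grows as kx with k = sqrt(ω/2χ), so χ follows from measurements at two radii.

```python
import numpy as np

# Toy example: estimate diffusivity chi from one harmonic measured at two radii,
# using the semi-infinite slab, diffusion-only approximation (no convection, no damping).
# Amplitudes and phases below are invented for illustration.

f_mod = 25.0                     # modulation frequency of the heat source [Hz]
omega = 2.0 * np.pi * f_mod      # angular frequency [rad/s]

r1, r2 = 0.10, 0.14              # radii of the two measurement channels [m]
A1, A2 = 40.0, 22.0              # harmonic amplitudes at r1, r2 [a.u.]
phi1, phi2 = 0.80, 1.45          # harmonic phases at r1, r2 [rad]

dx = r2 - r1
k_amp = np.log(A1 / A2) / dx     # spatial decay rate from the amplitude ratio
k_phase = (phi2 - phi1) / dx     # spatial phase gradient

# For pure diffusion both rates equal sqrt(omega / (2 chi)), hence:
chi = omega / (2.0 * k_amp * k_phase)
print(f"estimated chi = {chi:.3f} m^2/s")
```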
Analysis of Deformation and Equivalent Stress during Biomass Material Compression Molding
NASA Astrophysics Data System (ADS)
Xu, Guiying; Wei, Hetao; Zhang, Zhien; Yu, Shaohui; Wang, Congzhe; Huang, Guowen
2018-02-01
Ansys is adopted to analyze mold deformation and the stress field distribution during the compression of biomass under a pressure of 20 MPa. Through element selection, material property setting, mesh partitioning, contact pair establishment, load and constraint application, and solver setup, the stress and strain of the overall mold are analyzed. The deformation and equivalent stress of the compression structure, base, mold, and compression bar were examined. The following conclusions can be drawn: the stress on the compressor is not completely uniform, with a slight decrease at the base; the stress and strain in the compression bar are the largest, and stress concentration may occur at the top of the compression bar, which shortens its service life; the overall deformation of the main mold is small, and although there is a slight difference between the upper and lower parts, the overall variation is not obvious, yet the stress difference between the upper and lower parts of the main mold is extremely large, reaching roughly a factor of 10; the stress and strain in the base decrease in a circular pattern, but stress concentration remains at the ledge, which shortens service life; the contact stress is not uniformly distributed, with increasing or decreasing trends between adjacent parts that are very large in some locations.
An interior-point method-based solver for simulation of aircraft parts riveting
NASA Astrophysics Data System (ADS)
Stefanova, Maria; Yakunin, Sergey; Petukhova, Margarita; Lupuleac, Sergey; Kokkolaras, Michael
2018-05-01
The particularities of the aircraft parts riveting process simulation necessitate the solution of a large number of contact problems. A primal-dual interior-point method-based solver is proposed for solving such problems efficiently. The proposed method features a worst-case polynomial complexity bound of O(√n log(1/ε)) on the number of iterations, where n is the dimension of the problem and ε is a threshold related to the desired accuracy. In practice, the convergence is often faster than this worst-case bound, which makes the method applicable to large-scale problems. The computational challenge is solving the system of linear equations because the associated matrix is ill-conditioned. To that end, the authors introduce a preconditioner and a strategy for determining effective initial guesses based on the physics of the problem. Numerical results are compared with ones obtained using the Goldfarb-Idnani algorithm. The results demonstrate the efficiency of the proposed method.
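The authors' primal-dual solver and preconditioner are specific to the riveting problem, but the interior-point idea itself can be shown with a toy log-barrier Newton iteration for a small non-negativity-constrained quadratic program (a generic sketch with invented data, not the algorithm of the paper):

```python
import numpy as np

def barrier_qp(Q, c, x0, t0=1.0, mu=10.0, outer=8, inner=30, tol=1e-9):
    """Minimize 0.5 x^T Q x + c^T x subject to x >= 0 with a log-barrier method.

    Toy illustration of the interior-point idea only; real contact solvers
    (like the primal-dual method in the paper) are far more elaborate.
    """
    x, t = x0.astype(float).copy(), t0
    for _ in range(outer):
        for _ in range(inner):
            grad = t * (Q @ x + c) - 1.0 / x        # gradient of the barrier objective
            hess = t * Q + np.diag(1.0 / x**2)      # Hessian of the barrier objective
            step = np.linalg.solve(hess, -grad)     # Newton direction
            alpha = 1.0
            while np.any(x + alpha * step <= 0):    # stay strictly feasible
                alpha *= 0.5
            x = x + 0.9 * alpha * step
            if np.linalg.norm(grad) < tol:
                break
        t *= mu                                     # sharpen the barrier
    return x

# Tiny 3-variable example with an active constraint at the optimum.
Q = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 0.0], [0.0, 0.0, 2.0]])
c = np.array([-1.0, -2.0, 1.0])     # the +1 pushes x[2] toward the bound x[2] >= 0
x = barrier_qp(Q, c, x0=np.ones(3))
print(np.round(x, 4))               # x[2] should end up near zero
```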
NASA Astrophysics Data System (ADS)
Oda, Masahiro; Kitasaka, Takayuki; Furukawa, Kazuhiro; Watanabe, Osamu; Ando, Takafumi; Goto, Hidemi; Mori, Kensaku
2011-03-01
The purpose of this paper is to present a new method to detect ulcers, which are among the symptoms of Crohn's disease, from CT images. Crohn's disease is an inflammatory disease of the digestive tract that commonly affects the small intestine. An optical or a capsule endoscope is used for small intestine examinations; however, these endoscopes cannot pass through intestinal stenosis in some cases. A CT image based diagnosis allows a physician to observe the whole intestine even if intestinal stenosis exists. However, because of the complicated shape of the small and large intestines, understanding the shapes of the intestines and the positions of lesions is difficult in CT image based diagnosis. A computer-aided diagnosis system for Crohn's disease with automated lesion detection is required for efficient diagnosis. We propose an automated method to detect ulcers from CT images. Longitudinal ulcers roughen the surface of the small and large intestinal wall. The rough surface consists of a combination of convex and concave parts on the intestinal wall. We detect convex and concave parts on the intestinal wall with blob and inverse-blob structure enhancement filters. Many convex and concave parts concentrate on the roughened regions. We introduce a roughness value to differentiate the convex and concave parts concentrated on the roughened regions from the others on the intestinal wall. The roughness value effectively reduces false positives in ulcer detection. Experimental results showed that the proposed method can detect convex and concave parts on the ulcers.
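A rough 2D analogue of the filters described above can be sketched as follows (a simplified illustration with invented parameters and synthetic data, not the authors' 3D implementation): blob-like and inverse-blob-like responses are taken from the eigenvalues of a Gaussian-smoothed Hessian, and a roughness score is the local density of the two kinds of responses.

```python
import numpy as np
from scipy import ndimage

def hessian_eigenvalues(img, sigma=2.0):
    """Eigenvalues of the Gaussian-smoothed Hessian at every pixel (2D toy version)."""
    Ixx = ndimage.gaussian_filter(img, sigma, order=(0, 2))
    Iyy = ndimage.gaussian_filter(img, sigma, order=(2, 0))
    Ixy = ndimage.gaussian_filter(img, sigma, order=(1, 1))
    tr = Ixx + Iyy
    root = np.sqrt((Ixx - Iyy) ** 2 + 4.0 * Ixy ** 2)
    return 0.5 * (tr - root), 0.5 * (tr + root)            # lam1 <= lam2

def roughness_map(img, sigma=2.0, thresh=0.05, window=15):
    """Local density of convex (bright-bump) plus concave (dark-pit) responses."""
    lam1, lam2 = hessian_eigenvalues(img, sigma)
    convex = lam2 < -thresh            # both eigenvalues strongly negative -> bright bump
    concave = lam1 > thresh            # both eigenvalues strongly positive -> dark pit
    hits = (convex | concave).astype(float)
    return ndimage.uniform_filter(hits, size=window)       # local density ~ "roughness"

# Synthetic texture standing in for an intestinal wall surface.
rng = np.random.default_rng(0)
img = ndimage.gaussian_filter(rng.normal(size=(128, 128)), 4.0) * 10.0
rough = roughness_map(img)
print("max local roughness:", float(rough.max()))
```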
Development and Evaluation of Thesauri-Based Bibliographic Biomedical Search Engine
ERIC Educational Resources Information Center
Alghoson, Abdullah
2017-01-01
Due to the large volume and exponential growth of biomedical documents (e.g., books, journal articles), it has become increasingly challenging for biomedical search engines to retrieve relevant documents based on users' search queries. Part of the challenge is the matching mechanism of free-text indexing that performs matching based on…
Annama H chondrite—Mineralogy, physical properties, cosmic ray exposure, and parent body history
NASA Astrophysics Data System (ADS)
Kohout, Tomáš; Haloda, Jakub; Halodová, Patricie; Meier, Matthias M. M.; Maden, Colin; Busemann, Henner; Laubenstein, Matthias; Caffee, Marc W.; Welten, Kees C.; Hopp, Jens; Trieloff, Mario; Mahajan, Ramakant R.; Naik, Sekhar; Trigo-Rodriguez, Josep M.; Moyano-Cambero, Carles E.; Oshtrakh, Michael I.; Maksimova, Alevtina A.; Chukin, Andrey V.; Semionkin, Vladimir A.; Karabanalov, Maksim S.; Felner, Israel; Petrova, Evgeniya V.; Brusnitsyna, Evgeniia V.; Grokhovsky, Victor I.; Yakovlev, Grigoriy A.; Gritsevich, Maria; Lyytinen, Esko; Moilanen, Jarmo; Kruglikov, Nikolai A.; Ishchenko, Aleksey V.
2017-08-01
The fall of the Annama meteorite occurred early morning (local time) on April 19, 2014 on the Kola Peninsula (Russia). Based on mineralogy and physical properties, Annama is a typical H chondrite. It has a high Ar-Ar age of 4.4 Ga. Its cosmic ray exposure history is atypical as it is not part of the large group of H chondrites with a prominent 7-8 Ma peak in the exposure age histograms. Instead, its exposure age is within uncertainty of a smaller peak at 30 ± 4 Ma. The results from short-lived radionuclides are compatible with an atmospheric pre-entry radius of 30-40 cm. However, based on noble gas and cosmogenic radionuclide data, Annama must have been part of a larger body (radius >65 cm) for a large part of its cosmic ray exposure history. The 10Be concentration indicates a recent (3-5 Ma) breakup which may be responsible for the Annama parent body size reduction to 30-35 cm pre-entry radius.
Denkinger, Michael D; Franke, Sebastian; Rapp, Kilian; Weinmayr, Gudrun; Duran-Tauleria, Enric; Nikolaus, Thorsten; Peter, Richard
2010-07-27
A large number of studies have demonstrated a positive effect of increased physical activity (PA) on various health outcomes. In all large geriatric studies, however, PA has only been assessed by interview-based instruments, which are all subject to substantial bias. This may represent one reason why associations of PA with geriatric syndromes such as falls show controversial results. The general aim of the Active-Ulm study was to determine the association of accelerometer-based physical activity with different health-related parameters, and to study the influence of this standardized objective measure of physical activity on health- and disability-related parameters in a longitudinal setting. We have set up an observational cohort study in 1500 community-dwelling older persons (65 to 90 years) stratified by age and sex. Addresses have been obtained from the local residents' registration offices. The study is carried out jointly with the IMCA--Respiratory Health Survey in the Elderly implemented in the context of the European project IMCA II. The study has a cross-sectional part (1), which focuses on PA and disability, and two longitudinal parts (2) and (3). The primary information for part (2) is a prospective 1-year falls calendar including assessment of medication change. Part (3) will be performed about 36 months following baseline. Primary variables of interest include disability, PA, falls and cognitive function. Baseline recruitment started in March 2009 and will be finished in April 2010. All participants are visited three times within one week, either at home or in the study center. Assessments include interviews on quality of life, diagnosed diseases, and common risk factors, as well as novel cognitive tests and established tests of physical functioning. PA is measured using an accelerometer-based sensor device, worn continuously over a one-week period and accompanied by a prospective activity diary. The assessment of PA using a high-standard accelerometer-based device is feasible in a large population-based study. The results obtained from cross-sectional and longitudinal analyses will shed light on important associations between PA and various outcomes and may provide information for specific interventions in older people.
Secondary Analysis of Large-Scale Assessment Data: An Alternative to Variable-Centred Analysis
ERIC Educational Resources Information Center
Chow, Kui Foon; Kennedy, Kerry John
2014-01-01
International large-scale assessments are now part of the educational landscape in many countries and often feed into major policy decisions. Yet, such assessments also provide data sets for secondary analysis that can address key issues of concern to educators and policymakers alike. Traditionally, such secondary analyses have been based on a…
Questions of scientific responsibility: the Baltimore case.
Lang, Serge
1993-01-01
A number of cases of questionable behavior in science have been extensively reported in the media during the last two or three years. What standards are upheld by the scientific community affect the community internally, and also affect its relations with society at large, including Congress. Here I wish to address questions of scientific responsibility, using the Baltimore case as a concrete instance where they came up. The first part containing historical background is necessary to provide readers with documentation so that they can have some factual bases on which to evaluate respective positions and my conclusions that follow -- based on further but more succinctly summarized documentation. I have reproduced many quotes because I firmly believe people are entitled to be represented by their own wording. Conversely, I hold people accountable for their official positions. Some of these are reproduced in footnotes, and some longer ones are reproduced in appendices. I also do not ask to be trusted. By providing numerous references, I hope that readers who find my documentation insufficient can follow up by looking up these references.... The article is in six parts: Part I. Historical Background. This part gives mostly a historical background of the early phases of the Baltimore case. Part II. The First Issue of Responsibility. This part presents a discussion of certain scientific responsibilities based on that background, specifically: the responsibility of answering questions about one's work, and the responsibility whether to submit to authority. Part III. The NIH Investigations. This part summarizes the two NIH investigations. Part IV. The Dingell Subcommittee. This part deals with the responsibilities of a Congressional Committee vis-à-vis science. Part V. Further Issues of Responsibility. This part goes into an open ended discussion of many issues of responsibility facing scientists, vis-à-vis themselves and vis-à-vis society at large, including Congress. The list is long, and readers can look at the section and paragraph headings to get an idea of their content. Part VI. Personal Credibility, a Shift at the Grass Roots, and Baltimore's Persistence.
Residual stand damage survey for three small tractors used in harvesting northern hardwoods
Neil K. Huyler; George D. Aiken; Chris B. LeDoux
1994-01-01
There have always been concerns about the impact of timber harvesting with conventional ground-based harvesting equipment on many parts of the forest ecosystem. One of these parts, which is easily measured, is the residual stand. The interest in small tractors (less than 60 horsepower) has increased in recent years because private landowners are concerned that large...
Carbon Nanotube Thermoelectric Coolers
2015-02-06
project, we studied other approaches to highly efficient thermoelectric energy transformation using nanotube and monoatomic materials. This... and implementing advanced semiconducting materials with large thermal conductance Λ. In this respect, carbon-based low-dimensional materials like... with a material whose phonon part, Λph, is very low whereas the electron part, Λe, is high. Here decimating the phonon part of heat conductance is accomplished by
The telencephalon of the Göttingen minipig, cytoarchitecture and cortical surface anatomy.
Bjarkam, Carsten R; Glud, Andreas N; Orlowski, Dariusz; Sørensen, Jens Christian H; Palomero-Gallagher, Nicola
2017-07-01
During the last 20 years pigs have become increasingly popular in large animal translational neuroscience research as an economically and ethically feasible substitute for non-human primates. The anatomy of the pig telencephalon is, however, not well known. We present, accordingly, a detailed description of the surface anatomy and cytoarchitecture of the Göttingen minipig telencephalon based on macrophotos and consecutive high-power microphotographs of 15 μm thick, paraffin-embedded, Nissl-stained coronal sections. In 1-year-old specimens the formalin-perfused brain measures approximately 55 × 47 × 36 mm (length, width, height) and weighs around 69 g. The telencephalic part of the Göttingen minipig cerebrum covers a large surface area, which can be divided into a neocortical gyrencephalic part located dorsal to the rhinal fissure, and a ventral subrhinal part dominated by olfactory, amygdaloid, septal, and hippocampal structures. This part of the telencephalon is named the subrhinal lobe and, based on cytoarchitectural and sulcal anatomy, can be discerned from the remaining dorsally located neocortical perirhinal/insular, pericallosal, frontal, parietal, temporal, and occipital lobes. The inner subcortical structure of the minipig telencephalon is dominated by a prominent ventricular system and large basal ganglia, wherein the putamen and the caudate nucleus are separated posteriorly and dorsally into two entities by the internal capsule, whereas ventrally both structures fuse into a large accumbens nucleus. The presented anatomical data are accompanied by surface renderings and high-power macrophotographs illustrating the telencephalic sulcal pattern and the localization of the identified lobes and cytoarchitectonic areas. Additionally, 24 representative Nissl-stained telencephalic coronal sections are presented as supplementary material in atlas form on http://www.cense.dk/minipig_atlas/index.html and referred to as S1-S24 throughout the manuscript.
NASA Astrophysics Data System (ADS)
Zhu, Yi; Zhang, Jiping; Wang, Junxia; Chen, Wenyuan; Han, Yiqun; Ye, Chunxiang; Li, Yingruo; Liu, Jun; Zeng, Limin; Wu, Yusheng; Wang, Xinfeng; Wang, Wenxing; Chen, Jianmin; Zhu, Tong
2016-10-01
The North China Plain (NCP) has been experiencing severe air pollution problems with rapid economic growth and urbanisation. Many field and model studies have examined the distribution of air pollutants in the NCP, but convincing results have not been achieved, mainly due to a lack of direct measurements of pollutants over large areas. Here, we employed a mobile laboratory to observe the main air pollutants in a large part of the NCP from 11 June to 15 July 2013. High median concentrations of sulfur dioxide (SO2) (12 ppb), nitrogen oxides (NOx) (NO + NO2; 452 ppb), carbon monoxide (CO) (956 ppb), black carbon (BC; 5.5 µg m⁻³) and ultrafine particles (28 350 cm⁻³) were measured. Most of the high values, i.e. 95th percentile concentrations, were distributed near large cities, suggesting the influence of local emissions. In addition, we analysed the regional transport of SO2 and CO, relatively long-lived pollutants, based on our mobile observations together with wind field and satellite data analyses. Our results suggested that, for border areas of the NCP, wind from outside this area would have a diluting effect on pollutants, while south winds would bring in pollutants that have accumulated during transport through other parts of the NCP. For the central NCP, the concentrations of pollutants were likely to remain at high levels, partly due to the influence of regional transport by prevalent south-north winds over the NCP and partly due to local emissions.
Tactile mental body parts representation in obesity.
Scarpina, Federica; Castelnuovo, Gianluca; Molinari, Enrico
2014-12-30
Obese people's distortions in visually-based mental body-parts representations have been reported in previous studies, but other sensory modalities have largely been neglected. In the present study, we investigated possible differences in tactilely-based body-parts representation between an obese and a healthy-weight group; additionally, we explored the possible relationship between the tactile- and the visually-based body representation. Participants were asked to estimate the distance between two tactile stimuli that were simultaneously administered on the arm or on the abdomen, in the absence of visual input. The visually-based body-parts representation was investigated by a visual imagery method in which subjects were instructed to compare the horizontal extension of pairs of body parts. According to the results, the obese participants overestimated the size of the tactilely-perceived distances more than the healthy-weight group when the arm, and not the abdomen, was stimulated. Moreover, they reported a lower level of accuracy than did the healthy-weight group when estimating horizontal distances relative to their bodies, confirming an inappropriate visually-based mental body representation. Our results imply that body representation disturbance in obese people is not limited to the visual mental domain, but spreads to tactilely perceived distances. The inaccuracy was not a generalized tendency but was body-part related.
Capturing the benefits of complete streets : [summary].
DOT National Transportation Integrated Search
2016-01-01
Many transportation investment decisions are based in large part on a return on investment in terms of increased vehicular throughput and increased travel speeds/reduced travel times. Alternatively, today's planning conventions also recognize t...
Large Scale Helium Liquefaction and Considerations for Site Services for a Plant Located in Algeria
NASA Astrophysics Data System (ADS)
Froehlich, P.; Clausen, J. J.
2008-03-01
The large-scale liquefaction of helium extracted from natural gas is described. Based on a block diagram, the process chain, from the pipeline downstream of the natural-gas plant to the final storage of liquid helium, is explained. Information is provided about recent experiences during installation and start-up of a bulk helium liquefaction plant located in Skikda, Algeria, including part-load operation based on a reduced feed-gas supply. The local working and ambient conditions are described, including challenging logistic problems such as shipping and receiving of parts, qualified and semi-qualified subcontractors, basic provisions and tools on site, and precautions against sea water and ambient conditions. Finally, the differences in commissioning, both technically and in the evaluation of time and work packages, relative to European locations and standards are discussed.
Schmidt, Olga; Hausmann, Axel; Cancian de Araujo, Bruno; Sutrisno, Hari; Peggie, Djunijanti; Schmidt, Stefan
2017-01-01
Here we present a general collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, and a comparison with alternative preserving and vouchering methods. About 98% of the sequenced specimens processed using the present collecting and preparation protocol yielded sequences with more than 500 base pairs. The study is based on the first outcomes of the Indonesian Biodiversity Discovery and Information System (IndoBioSys). IndoBioSys is a German-Indonesian research project that is conducted by the Museum für Naturkunde in Berlin and the Zoologische Staatssammlung München, in close cooperation with the Research Center for Biology - Indonesian Institute of Sciences (RCB-LIPI, Bogor).
ERIC Educational Resources Information Center
Archbald, Douglas A.; Kaplan, David
2004-01-01
Inter- and intra-district public school choice, vouchers, tuition tax credits and other forms of school choice have been advocated for decades, in large part on grounds that the market forces engendered will improve public education. There are many studies of school choice policies and programs and a large theoretical literature on school choice,…
Response Variability in Commercial MOSFET SEE Qualification
George, J. S.; Clymer, D. A.; Turflinger, T. L.; ...
2016-12-01
Single-event effects (SEE) evaluation of five different part types of next generation, commercial trench MOSFETs indicates large part-to-part variation in determining a safe operating area (SOA) for drain-source voltage (VDS) following a test campaign that exposed >50 samples per part type to heavy ions. These results suggest a determination of a SOA using small sample sizes may fail to capture the full extent of the part-to-part variability. An example method is discussed for establishing a Safe Operating Area using a one-sided statistical tolerance limit based on the number of test samples. Finally, burn-in is shown to be a critical factor in reducing part-to-part variation in part response. Implications for radiation qualification requirements are also explored.
Response Variability in Commercial MOSFET SEE Qualification
DOE Office of Scientific and Technical Information (OSTI.GOV)
George, J. S.; Clymer, D. A.; Turflinger, T. L.
Single-event effects (SEE) evaluation of five different part types of next generation, commercial trench MOSFETs indicates large part-to-part variation in determining a safe operating area (SOA) for drain-source voltage (VDS) following a test campaign that exposed >50 samples per part type to heavy ions. These results suggest a determination of a SOA using small sample sizes may fail to capture the full extent of the part-to-part variability. An example method is discussed for establishing a Safe Operating Area using a one-sided statistical tolerance limit based on the number of test samples. Finally, burn-in is shown to be a critical factor in reducing part-to-part variation in part response. Implications for radiation qualification requirements are also explored.
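One concrete way to set such a one-sided limit is the standard normal tolerance bound computed from the noncentral t distribution; the sketch below uses invented VDS onset voltages and is a generic statistical recipe, not the procedure or data of the paper.

```python
import numpy as np
from scipy import stats

def one_sided_lower_tolerance_bound(samples, proportion=0.99, confidence=0.90):
    """Lower bound that, with the given confidence, is exceeded by at least
    `proportion` of the (assumed normal) population -- a candidate SOA limit."""
    x = np.asarray(samples, dtype=float)
    n = x.size
    z_p = stats.norm.ppf(proportion)
    # One-sided normal tolerance factor via the noncentral t distribution.
    k = stats.nct.ppf(confidence, df=n - 1, nc=z_p * np.sqrt(n)) / np.sqrt(n)
    return x.mean() - k * x.std(ddof=1)

# Invented VDS failure-onset voltages (in volts) for a small heavy-ion test lot.
vds_onset = [82.0, 78.5, 85.1, 80.2, 79.4, 83.6, 77.9, 81.3]
print(f"derated SOA limit: {one_sided_lower_tolerance_bound(vds_onset):.1f} V")
```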
Issues Confronting Rural Pharmacies after a Decade of Medicare Part D.
Ullrich, Fred; Salako, Abiodun; Mueller, Keith
2017-04-01
Purpose. The RUPRI Center for Rural Health Policy Analysis has been monitoring the status of rural independent pharmacies since the implementation of Medicare Part D in 2005. After a decade of Part D, we reassess in this brief the issues that concern rural pharmacies and may ultimately challenge their provision of services. This reassessment is based on survey responses from rural pharmacists. Key Findings: (1) Rural pharmacists indicated that two challenges--direct and indirect remuneration (DIR) fees, and delayed maximum allowable cost (MAC) adjustment--ranked highest on scales of both magnitude and immediacy. Nearly eighty (79.8) percent of respondents reported DIR fees as a very large magnitude challenge, with 83.3 percent reporting this as a very immediate challenge. Seventy-eight percent of respondents reported MACs not being updated quickly enough to reflect changes in wholesale drug costs as a very large magnitude challenge, with 79.7 percent indicating it as a very immediate challenge. (2) Medicare Part D continues to be a concern for rural pharmacies--58.8 percent of pharmacists said being an out-of-network pharmacy for Part D plans was a very large magnitude challenge (an additional 29.0 percent said large magnitude) and 60.5 percent said it was a very immediate challenge (an additional 28.1 percent said moderately immediate). (3) Pharmacy staffing, competition from pharmacy chains, and contracts for services for Medicaid patients were less likely to be reported as significant or immediate challenges.
Consequences of Part Temperature Variability in Electron Beam Melting of Ti-6Al-4V
NASA Astrophysics Data System (ADS)
Fisher, Brian A.; Mireles, Jorge; Ridwan, Shakerur; Wicker, Ryan B.; Beuth, Jack
2017-12-01
To facilitate adoption of Ti-6Al-4V (Ti64) parts produced via additive manufacturing (AM), the ability to ensure part quality is critical. Measuring temperatures is an important component of part quality monitoring in all direct metal AM processes. In this work, surface temperatures were monitored using a custom infrared camera system attached to an Arcam electron beam melting (EBM®) machine. These temperatures were analyzed to understand their possible effect on solidification microstructure based on solidification cooling rates extracted from finite element simulations. Complicated thermal histories were seen during part builds, and temperature changes occurring during typical Ti64 builds may be large enough to affect solidification microstructure. There is, however, enough time between fusion of individual layers for spatial temperature variations (i.e., hot spots) to dissipate. This means that an effective thermal control strategy for EBM® can be based on average measured surface temperatures, ignoring temperature variability.
ERIC Educational Resources Information Center
Brown, Stephen C., Ed.
This book contains 14 case studies, written by those involved in the teaching and training initiatives, that illustrate the use of open and distance learning strategies. The case studies, drawn from many parts of the world (but mostly British based), feature efforts in large and small companies in a variety of industries. The first part of the…
ERIC Educational Resources Information Center
Beaino, Ghada; Khoshnood, Babak; Kaminski, Monique; Pierrat, Veronique; Marret, Stephane; Matis, Jacqueline; Ledesert, Bernard; Thiriez, Gerard; Fresson, Jeanne; Roze, Jean-Christophe; Zupan-Simunek, Veronique; Arnaud, Catherine; Burguet, Antoine; Larroque, Beatrice; Breart, Gerard; Ancel, Pierre-Yves
2010-01-01
Aim: The aim of this study was to assess the independent role of cerebral lesions on ultrasound scan, and several other neonatal and obstetric factors, as potential predictors of cerebral palsy (CP) in a large population-based cohort of very preterm infants. Method: As part of EPIPAGE, a population-based prospective cohort study, perinatal data…
The Use of Mapping in Child Welfare Investigations: A Strength-Based Hybrid Intervention
ERIC Educational Resources Information Center
Lwin, Kristen; Versanov, Avi; Cheung, Connie; Goodman, Deborah; Andrews, Nancy
2014-01-01
To enhance strengths-based service, a large urban child welfare agency in Ontario, Canada implemented part of the Signs of Safety (SOS) model in 2010. SOS was created to engage families involved with the child welfare system, and is rooted in the beliefs of collaboration, strengths-based practice, and safety. The hybrid of the full SOS model…
Biomedical information retrieval across languages.
Daumke, Philipp; Markó, Kornél; Poprat, Michael; Schulz, Stefan; Klar, Rüdiger
2007-06-01
This work presents a new dictionary-based approach to biomedical cross-language information retrieval (CLIR) that addresses many of the general and domain-specific challenges in current CLIR research. Our method is based on a multilingual lexicon that was generated partly manually and partly automatically, and currently covers six European languages. It contains morphologically meaningful word fragments, termed subwords. Using subwords instead of entire words significantly reduces the number of lexical entries necessary to sufficiently cover a specific language and domain. Mediation between queries and documents is based on these subwords as well as on lists of word-n-grams that are generated from large monolingual corpora and constitute possible translation units. The translations are then sent to a standard Internet search engine. This process makes our approach an effective tool for searching the biomedical content of the World Wide Web in different languages. We evaluate this approach using the OHSUMED corpus, a large medical document collection, within a cross-language retrieval setting.
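The dictionary step can be pictured as a greedy decomposition of terms into lexicon subwords that map to language-independent identifiers. The sketch below uses a toy lexicon with invented fragments and identifiers, not the project's actual lexicon or retrieval pipeline.

```python
# Toy subword lexicon mapping surface fragments to interlingual identifiers.
# Entries and identifiers are invented for illustration.
LEXICON = {
    "gastr": "#stomach", "enter": "#intestine", "hepat": "#liver",
    "itis": "#inflammation", "o": None,   # connecting vowel, carries no meaning
}

def segment(term, lexicon=LEXICON):
    """Greedy longest-match decomposition of a term into lexicon subwords."""
    term, pieces, i = term.lower(), [], 0
    while i < len(term):
        for j in range(len(term), i, -1):      # try the longest fragment first
            if term[i:j] in lexicon:
                pieces.append(term[i:j])
                i = j
                break
        else:
            pieces.append(term[i])             # unknown character: pass through
            i += 1
    return pieces

def to_identifiers(term, lexicon=LEXICON):
    """Map a term to its interlingual identifiers, dropping connectors/unknowns."""
    return [lexicon[p] for p in segment(term, lexicon) if lexicon.get(p)]

# Terms from different languages that share these subwords map to the same identifiers,
# which is what lets a single query reach documents in several languages.
print(to_identifiers("Gastroenteritis"))   # ['#stomach', '#intestine', '#inflammation']
print(to_identifiers("hepatitis"))         # ['#liver', '#inflammation']
```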
Yu, Yinan; Diamantaras, Konstantinos I; McKelvey, Tomas; Kung, Sun-Yuan
2018-02-01
In kernel-based classification models, given limited computational power and storage capacity, operations over the full kernel matrix become prohibitive. In this paper, we propose a new supervised learning framework using kernel models for sequential data processing. The framework is based on two components that both aim at enhancing the classification capability with a subset selection scheme. The first part is a subspace projection technique in the reproducing kernel Hilbert space using a CLAss-specific Subspace Kernel representation for kernel approximation. In the second part, we propose a novel structural risk minimization algorithm called adaptive margin slack minimization to iteratively improve the classification accuracy through adaptive data selection. We motivate each part separately, and then integrate them into learning frameworks for large-scale data. We propose two such frameworks: the memory-efficient sequential processing for sequential data processing and the parallelized sequential processing for distributed computing with sequential data acquisition. We test our methods on several benchmark data sets and compare them with state-of-the-art techniques to verify the validity of the proposed techniques.
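The flavour of the first component can be conveyed with a generic landmark-based kernel approximation: project each sample onto kernel columns taken from class-specific subsets and fit a simple linear model on those features. This is a sketch of the general idea only, not the paper's CLAss-specific Subspace Kernel or adaptive margin slack minimization algorithms.

```python
import numpy as np

def rbf(X, Z, gamma=0.5):
    """RBF kernel matrix between rows of X and rows of Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(1)
# Two Gaussian blobs as a toy binary problem.
X = np.vstack([rng.normal(-1.5, 1.0, (200, 2)), rng.normal(1.5, 1.0, (200, 2))])
y = np.hstack([-np.ones(200), np.ones(200)])

# Class-specific landmark subsets instead of the full 400 x 400 kernel matrix.
landmarks = np.vstack([X[y == c][rng.choice(200, 10, replace=False)] for c in (-1, 1)])

Phi = rbf(X, landmarks)                   # n x 20 approximate kernel features
w = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(Phi.shape[1]), Phi.T @ y)  # ridge fit

pred = np.sign(Phi @ w)
print("training accuracy:", float((pred == y).mean()))
```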
From Static to Dynamic: Choosing and Implementing a Web-Based CMS
ERIC Educational Resources Information Center
Kneale, Ruth
2008-01-01
Working as systems librarian for the Advanced Technology Solar Telescope (ATST), a project for the National Solar Observatory (NSO) based in Tucson, Arizona, a large part of the author's responsibilities involves running the web site. She began looking into content management systems (CMSs), specifically ones for website control. A CMS is generally…
ERIC Educational Resources Information Center
Dahl, Kari Kragh Blume; Millora, Christopher Malagad
2016-01-01
This study explores reflective experience during transformative, group-based learning among university leaders following a natural disaster such as a typhoon in two Philippine universities. Natural disasters are recurrent phenomena in many parts of the world, but the literature largely ignores their impact on lifelong human learning, for instance…
A Simple Mechanical Experiment on Exponential Growth
ERIC Educational Resources Information Center
McGrew, Ralph
2015-01-01
With a rod, cord, pulleys, and slotted masses, students can observe and graph exponential growth in the cord tension over a factor of increase as large as several hundred. This experiment is adaptable for use either in algebra-based or calculus-based physics courses, fitting naturally with the study of sliding friction. Significant parts of the…
Prediction of microstructure, residual stress, and deformation in laser powder bed fusion process
NASA Astrophysics Data System (ADS)
Yang, Y. P.; Jamshidinia, M.; Boulware, P.; Kelly, S. M.
2018-05-01
Laser powder bed fusion (L-PBF) has been investigated extensively as a process for building production parts with complex shapes. Modeling tools that can be used at the part level are essential to allow engineers to fine-tune the shape design and process parameters for additive manufacturing. This study focuses on developing modeling methods to predict microstructure, hardness, residual stress, and deformation in large L-PBF built parts. A transient, sequentially coupled thermal and metallurgical analysis method was developed to predict microstructure and hardness in L-PBF built high-strength, low-alloy steel parts. A moving heat-source model was used in this analysis to accurately predict the temperature history. A kinetics-based model, originally developed to predict microstructure in the heat-affected zone of a welded joint, was extended to predict the microstructure and hardness in an L-PBF build by inputting the predicted temperature history. The tempering effect of subsequently built layers on the current-layer microstructural phases was modeled, which is key to predicting the final hardness correctly. It was also found that the top layers of a built part have higher hardness because of the lack of this tempering effect. A sequentially coupled thermal and mechanical analysis method was developed to predict residual stress and deformation of an L-PBF built part. It was found that a line-heating model is not suitable for analyzing a large L-PBF built part, while the layer-heating method is a potential approach for such parts. Experiments were conducted to validate the model predictions.
Prediction of microstructure, residual stress, and deformation in laser powder bed fusion process
NASA Astrophysics Data System (ADS)
Yang, Y. P.; Jamshidinia, M.; Boulware, P.; Kelly, S. M.
2017-12-01
Laser powder bed fusion (L-PBF) has been investigated extensively as a process for building production parts with complex shapes. Modeling tools that can be used at the part level are essential to allow engineers to fine-tune the shape design and process parameters for additive manufacturing. This study focuses on developing modeling methods to predict microstructure, hardness, residual stress, and deformation in large L-PBF built parts. A transient, sequentially coupled thermal and metallurgical analysis method was developed to predict microstructure and hardness in L-PBF built high-strength, low-alloy steel parts. A moving heat-source model was used in this analysis to accurately predict the temperature history. A kinetics-based model, originally developed to predict microstructure in the heat-affected zone of a welded joint, was extended to predict the microstructure and hardness in an L-PBF build by inputting the predicted temperature history. The tempering effect of subsequently built layers on the current-layer microstructural phases was modeled, which is key to predicting the final hardness correctly. It was also found that the top layers of a built part have higher hardness because of the lack of this tempering effect. A sequentially coupled thermal and mechanical analysis method was developed to predict residual stress and deformation of an L-PBF built part. It was found that a line-heating model is not suitable for analyzing a large L-PBF built part, while the layer-heating method is a potential approach for such parts. Experiments were conducted to validate the model predictions.
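To indicate the kind of thermal input such models start from, the sketch below evaluates the classical Rosenthal quasi-steady solution for a point heat source moving over a semi-infinite body and reads off an approximate cooling rate behind the melt pool. All parameter values are invented, and this textbook idealization is not the moving heat-source model or material data used in the study.

```python
import numpy as np

# Rosenthal solution for a moving point source on a semi-infinite body:
#   T(xi, y, z) = T0 + (eta * P) / (2 * pi * k * R) * exp(-v * (xi + R) / (2 * alpha))
# with xi = x - v*t the distance from the source along the travel direction.
# The point-source field diverges at the source itself; values very close to xi = 0
# are unphysical. All parameters below are illustrative only.

T0    = 353.0      # preheat / build-plate temperature [K]
P     = 200.0      # laser power [W]
eta   = 0.4        # absorbed fraction
v     = 0.8        # scan speed [m/s]
k     = 25.0       # thermal conductivity [W/(m K)]
alpha = 7.0e-6     # thermal diffusivity [m^2/s]

def rosenthal_T(xi, y, z):
    R = np.sqrt(xi**2 + y**2 + z**2) + 1e-12
    return T0 + eta * P / (2.0 * np.pi * k * R) * np.exp(-v * (xi + R) / (2.0 * alpha))

# Centreline temperatures behind the source and the implied cooling rate |dT/dt| = v * dT/dxi.
xi = np.linspace(-5e-4, -5e-5, 200)      # 500 down to 50 micrometres behind the source
T = rosenthal_T(xi, 0.0, 0.0)
cooling_rate = v * np.gradient(T, xi)

print(f"T at 300 um behind source: {np.interp(-3e-4, xi, T):.0f} K")
print(f"peak cooling rate sampled: {cooling_rate.max():.2e} K/s")
```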
4TH Mediterranean Workshop and Tropical Meeting "Novel Optical Materials and Applications" NOMA 99.
1999-07-19
If excitons have anharmonicity, the combination of large oscillator strength and the anharmonicity leads to large optical nonlinearity. Numerous... almost all NLO-polymers have a large optical loss compared with passive WG polymers. In these hybrid structures, only an active part for signal... quasi-phase matching based on chirality. AZOBENZENE POLYMERS FOR OPTICAL INFORMATION
NASA Astrophysics Data System (ADS)
de la Torre, S.; Guzzo, L.; Peacock, J. A.; Branchini, E.; Iovino, A.; Granett, B. R.; Abbas, U.; Adami, C.; Arnouts, S.; Bel, J.; Bolzonella, M.; Bottini, D.; Cappi, A.; Coupon, J.; Cucciati, O.; Davidzon, I.; De Lucia, G.; Fritz, A.; Franzetti, P.; Fumana, M.; Garilli, B.; Ilbert, O.; Krywult, J.; Le Brun, V.; Le Fèvre, O.; Maccagni, D.; Małek, K.; Marulli, F.; McCracken, H. J.; Moscardini, L.; Paioro, L.; Percival, W. J.; Polletta, M.; Pollo, A.; Schlagenhaufer, H.; Scodeggio, M.; Tasca, L. A. M.; Tojeiro, R.; Vergani, D.; Zanichelli, A.; Burden, A.; Di Porto, C.; Marchetti, A.; Marinoni, C.; Mellier, Y.; Monaco, P.; Nichol, R. C.; Phleps, S.; Wolk, M.; Zamorani, G.
2013-09-01
We present the general real- and redshift-space clustering properties of galaxies as measured in the first data release of the VIPERS survey. VIPERS is a large redshift survey designed to probe in detail the distant Universe and its large-scale structure at 0.5 < z < 1.2. We describe in this analysis the global properties of the sample and discuss the survey completeness and associated corrections. This sample allows us to measure the galaxy clustering with an unprecedented accuracy at these redshifts. From the redshift-space distortions observed in the galaxy clustering pattern we provide a first measurement of the growth rate of structure at z = 0.8: fσ8 = 0.47 ± 0.08. This is completely consistent with the predictions of standard cosmological models based on Einstein gravity, although this measurement alone does not discriminate between different gravity models. Based on observations collected at the European Southern Observatory, Cerro Paranal, Chile, using the Very Large Telescope under programmes 182.A-0886 and partly 070.A-9007. Also based on observations obtained with MegaPrime/MegaCam, a joint project of CFHT and CEA/DAPNIA, at the Canada-France-Hawaii Telescope (CFHT), which is operated by the National Research Council (NRC) of Canada, the Institut National des Sciences de l'Univers of the Centre National de la Recherche Scientifique (CNRS) of France, and the University of Hawaii. This work is based in part on data products produced at TERAPIX and the Canadian Astronomy Data Centre as part of the Canada-France-Hawaii Telescope Legacy Survey, a collaborative project of NRC and CNRS. The VIPERS web site is http://www.vipers.inaf.it/
Giorgio Vacchiano; John D. Shaw; R. Justin DeRose; James N. Long
2008-01-01
Diameter increment is an important variable in modeling tree growth. Most facets of predicted tree development are dependent in part on diameter or diameter increment, the most commonly measured stand variable. The behavior of the Forest Vegetation Simulator (FVS) largely relies on the performance of the diameter increment model and the subsequent use of predicted dbh...
NASA Astrophysics Data System (ADS)
Cheng, Gong; Han, Junwei; Zhou, Peicheng; Guo, Lei
2014-12-01
The rapid development of remote sensing technology has facilitated the acquisition of remote sensing images with higher and higher spatial resolution, but how to automatically understand the image contents is still a big challenge. In this paper, we develop a practical and rotation-invariant framework for multi-class geospatial object detection and geographic image classification based on a collection of part detectors (COPD). The COPD is composed of a set of representative and discriminative part detectors, where each part detector is a linear support vector machine (SVM) classifier used for the detection of objects or recurring spatial patterns within a certain range of orientation. Specifically, when performing multi-class geospatial object detection, we learn a set of seed-based part detectors where each part detector corresponds to a particular viewpoint of an object class, so the collection of them provides a solution for rotation-invariant detection of multi-class objects. When performing geographic image classification, we utilize a large number of pre-trained part detectors to discover distinctive visual parts from images and use them as attributes to represent the images. Comprehensive evaluations on two remote sensing image databases and comparisons with some state-of-the-art approaches demonstrate the effectiveness and superiority of the developed framework.
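The rotation-invariance mechanism can be sketched generically: keep one detector per orientation bin and score a window by the maximum response over the bins. The toy below uses plain correlation with rotated templates as a stand-in for the trained linear SVM part detectors described above.

```python
import numpy as np
from scipy import ndimage

def orientation_bank(template, n_bins=12):
    """One detector per orientation bin, here simply rotated copies of a template."""
    return [ndimage.rotate(template, 360.0 * k / n_bins, reshape=False, order=1)
            for k in range(n_bins)]

def rotation_invariant_score(window, bank):
    """Score of a window = maximum normalized correlation over the orientation bank."""
    w = (window - window.mean()) / (window.std() + 1e-9)
    scores = []
    for det in bank:
        d = (det - det.mean()) / (det.std() + 1e-9)
        scores.append(float((w * d).sum()) / w.size)
    return max(scores)

# Toy "part": a bright bar; the detector bank should respond to it at any angle.
template = np.zeros((21, 21))
template[9:12, 3:18] = 1.0
bank = orientation_bank(template)

rotated_part = ndimage.rotate(template, 65.0, reshape=False, order=1)
background = np.random.default_rng(0).normal(size=(21, 21))
print("score on rotated part:", round(rotation_invariant_score(rotated_part, bank), 3))
print("score on background:  ", round(rotation_invariant_score(background, bank), 3))
```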
Hausmann, Axel; Cancian de Araujo, Bruno; Sutrisno, Hari; Peggie, Djunijanti; Schmidt, Stefan
2017-01-01
Here we present a general collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, and a comparison with alternative preserving and vouchering methods. About 98% of the sequenced specimens processed using the present collecting and preparation protocol yielded sequences with more than 500 base pairs. The study is based on the first outcomes of the Indonesian Biodiversity Discovery and Information System (IndoBioSys). IndoBioSys is a German-Indonesian research project that is conducted by the Museum für Naturkunde in Berlin and the Zoologische Staatssammlung München, in close cooperation with the Research Center for Biology – Indonesian Institute of Sciences (RCB-LIPI, Bogor). PMID:29134041
Liu, Tao; Guo, Yin; Yang, Shourui; Yin, Shibin; Zhu, Jigui
2017-01-01
Industrial robots are expected to undertake ever more advanced tasks in the modern manufacturing industry, such as intelligent grasping, in which robots should be capable of recognizing the position and orientation of a part before grasping it. In this paper, a monocular-based 6-degree of freedom (DOF) pose estimation technology to enable robots to grasp large-size parts at informal poses is proposed. A camera was mounted on the robot end-flange and oriented to measure several featured points on the part before the robot moved to grasp it. In order to estimate the part pose, a nonlinear optimization model based on the camera object space collinearity error in different poses is established, and the initial iteration value is estimated with the differential transformation. Measuring poses of the camera are optimized based on uncertainty analysis. Also, the principle of the robotic intelligent grasping system was developed, with which the robot could adjust its pose to grasp the part. In experimental tests, the part poses estimated with the method described in this paper were compared with those produced by a laser tracker, and results show the RMS angle and position error are about 0.0228° and 0.4603 mm. Robotic intelligent grasping tests were also successfully performed in the experiments. PMID:28216555
Liu, Tao; Guo, Yin; Yang, Shourui; Yin, Shibin; Zhu, Jigui
2017-02-14
Industrial robots are expected to undertake ever more advanced tasks in the modern manufacturing industry, such as intelligent grasping, in which robots should be capable of recognizing the position and orientation of a part before grasping it. In this paper, a monocular-based 6-degree of freedom (DOF) pose estimation technology to enable robots to grasp large-size parts at informal poses is proposed. A camera was mounted on the robot end-flange and oriented to measure several featured points on the part before the robot moved to grasp it. In order to estimate the part pose, a nonlinear optimization model based on the camera object space collinearity error in different poses is established, and the initial iteration value is estimated with the differential transformation. Measuring poses of the camera are optimized based on uncertainty analysis. Also, the principle of the robotic intelligent grasping system was developed, with which the robot could adjust its pose to grasp the part. In experimental tests, the part poses estimated with the method described in this paper were compared with those produced by a laser tracker, and results show the RMS angle and position error are about 0.0228° and 0.4603 mm. Robotic intelligent grasping tests were also successfully performed in the experiments.
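A generic version of such a pose fit minimizes image reprojection error over a rotation vector and translation with scipy; this is a simplified stand-in for the object-space collinearity formulation and multi-pose optimization described above, with invented camera intrinsics and feature coordinates.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

# Invented pinhole intrinsics and featured points on the part (part frame, metres).
K = np.array([[2500.0, 0.0, 640.0], [0.0, 2500.0, 480.0], [0.0, 0.0, 1.0]])
pts_part = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0], [1.2, 0.8, 0.0],
                     [0.0, 0.8, 0.0], [0.6, 0.4, 0.3]])

def project(pose, pts):
    """Project part-frame points into the image for pose = (rotation vector, translation)."""
    R, t = Rotation.from_rotvec(pose[:3]).as_matrix(), pose[3:]
    cam = pts @ R.T + t                  # part frame -> camera frame
    uvw = cam @ K.T
    return uvw[:, :2] / uvw[:, 2:3]      # perspective division -> pixel coordinates

# Simulate an observation from a "true" pose, then recover it from the 2D points.
true_pose = np.array([0.1, -0.2, 0.3, 0.4, -0.1, 5.0])
obs = project(true_pose, pts_part) + np.random.default_rng(0).normal(0, 0.3, (5, 2))

def residual(p):
    return (project(p, pts_part) - obs).ravel()

fit = least_squares(residual, x0=np.array([0.0, 0.0, 0.0, 0.0, 0.0, 4.0]))
print("recovered pose:", np.round(fit.x, 3))
```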
Water resources of the Wild Rice River watershed, northwestern Minnesota
Winter, Thomas C.; Bidwell, L.E.; Maclay, Robert W.
1970-01-01
The area of the watershed is about 2,600 square miles and includes most of Mahnomen and Norman Counties and parts of Becker, Clay, Clearwater, and Polk Counties. The population of the area is about 37,000 people, of which about 70 percent live on farms. The economy is based principally on farming. The area of lake clay and silt is used mostly for raising sugar beets and wheat, while potatoes are grown largely on the sandy soils. In the western part of the morainal area, small grain, dairy, and cattle farming are most common. The eastern part of the morainal area is important for forest products and recreation. Industries in the area are small and are based on agricultural processing and service.
Hierarchical Parallelism in Finite Difference Analysis of Heat Conduction
NASA Technical Reports Server (NTRS)
Padovan, Joseph; Krishna, Lala; Gute, Douglas
1997-01-01
Based on the concept of hierarchical parallelism, this research effort resulted in highly efficient parallel solution strategies for very large scale heat conduction problems. The method of hierarchical parallelism involves partitioning thermal models into several substructured levels wherein an optimal balance among the associated bandwidths is achieved. The details are described in this report, which is organized into two parts. Part 1 describes the parallel modelling methodology and the associated multilevel direct, iterative and mixed solution schemes. Part 2 establishes both the formal and computational properties of the scheme.
Multi-pose system for geometric measurement of large-scale assembled rotational parts
NASA Astrophysics Data System (ADS)
Deng, Bowen; Wang, Zhaoba; Jin, Yong; Chen, Youxing
2017-05-01
To achieve virtual assembly of large-scale assembled rotational parts based on in-field geometric data, we develop a multi-pose rotative arm measurement system with a gantry and a 2D laser sensor (RAMSGL) to measure and provide the geometry of these parts. We mount a 2D laser sensor onto the end of a six-jointed rotative arm to guarantee accuracy and efficiency, and combine the rotative arm with a gantry to measure pairs of assembled rotational parts. By establishing and using the D-H model of the system, the 2D laser data are converted into point clouds, from which the geometry is finally calculated. In addition, we design three experiments to evaluate the performance of the system. Experimental results show that the system's maximum length-measuring deviation using gauge blocks is 35 µm, its maximum length-measuring deviation using ball plates is 50 µm, its maximum single-point repeatability error is 25 µm, and its measurement scope extends from a radius of 0 mm to 500 mm.
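The conversion of each 2D laser profile into base-frame point clouds amounts to chaining homogeneous transforms through the arm's D-H model. Below is a minimal sketch of that step; the D-H table, joint readings, sensor offset and profile values are invented placeholders, not the calibrated RAMSGL parameters.

```python
# Minimal sketch: mapping a 2D laser profile into the base frame through a
# Denavit-Hartenberg kinematic chain.  All parameters below are hypothetical.
import numpy as np

def dh_matrix(theta, d, a, alpha):
    """Standard D-H homogeneous transform for one joint."""
    ct, st, ca, sa = np.cos(theta), np.sin(theta), np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def profile_to_base(profile_xz, joints, dh_table, sensor_offset):
    """Transform a 2D laser profile (x, z in the sensor plane) to base-frame points."""
    T = np.eye(4)
    for (d, a, alpha), theta in zip(dh_table, joints):
        T = T @ dh_matrix(theta, d, a, alpha)
    T = T @ sensor_offset                       # flange -> laser sensor frame
    pts = np.c_[profile_xz[:, 0], np.zeros(len(profile_xz)), profile_xz[:, 1],
                np.ones(len(profile_xz))]       # sensor plane is y = 0
    return (pts @ T.T)[:, :3]

# Hypothetical six-joint D-H table (d, a, alpha) and joint angles.
dh_table = [(0.3, 0.05, np.pi / 2), (0.0, 0.4, 0.0), (0.0, 0.35, 0.0),
            (0.25, 0.0, np.pi / 2), (0.0, 0.0, -np.pi / 2), (0.1, 0.0, 0.0)]
joints = np.deg2rad([10, -30, 45, 0, 20, 5])
sensor_offset = np.eye(4)
sensor_offset[2, 3] = 0.08                      # laser 80 mm past the flange
profile = np.c_[np.linspace(-0.05, 0.05, 5), np.full(5, 0.20)]  # one scan line
print(profile_to_base(profile, joints, dh_table, sensor_offset))
```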
NASA Astrophysics Data System (ADS)
Medina, Neiler; Sanchez, Arlex; Nokolic, Igor; Vojinovic, Zoran
2016-04-01
This research explores the use of Agent-Based Models (ABMs) and their potential to test large-scale evacuation strategies in coastal cities at risk from flooding due to extreme hydro-meteorological events, with the ultimate purpose of disaster risk reduction by decreasing human exposure to the hazard. The first part of the paper covers the theory used to build the models, namely complex adaptive systems (CAS) and the principles and uses of ABMs in this field, and outlines the pros and cons of using ABMs to test city evacuation strategies at medium and large scale. The second part of the paper focuses on the central theory used to build the ABM, specifically the psychological and behavioural model and the framework used in this research, the PECS reference model; it also covers the main attributes or characteristics of human beings used to describe the agents. The third part of the paper presents the methodology used to build and implement the ABM using Repast Simphony as an open-source agent-based modelling and simulation platform. The preliminary results of a first implementation in a region of Sint Maarten, a Dutch Caribbean island, are presented and discussed in the fourth section of the paper. The results obtained so far are promising for further development of the model and its implementation and testing in a full-scale city.
Differences in Access to Care among Students Using School-Based Health Centers
ERIC Educational Resources Information Center
Parasuraman, Sarika Rane; Shi, Leiyu
2015-01-01
Health care reform has changed the landscape for the nation's health safety net, and school-based health centers (SBHCs) remain an important part of this system. However, few large-scale studies have been conducted to assess their impact on access to care. This study investigated differences in access among a nationally representative sample of…
Insulating board, hardboard, and other structural fiberboards
W. C. Lewis; S. L. Schwartz
1965-01-01
The wood-base fiber panel materials are a part of the rapidly evolving technology based on converting lignocellulose to fiber and reconstituting the fiber into large sheets and panels. While some equipment and techniques used are the same as for producing paper, there are enough differences in techniques used and other requirements for manufacture that a separate...
Using Response Ratios for Meta-Analyzing Single-Case Designs with Behavioral Outcomes
ERIC Educational Resources Information Center
Pustejovsky, James E.
2018-01-01
Methods for meta-analyzing single-case designs (SCDs) are needed to inform evidence-based practice in clinical and school settings and to draw broader and more defensible generalizations in areas where SCDs comprise a large part of the research base. The most widely used outcomes in single-case research are measures of behavior collected using…
Evaluation of advanced microelectronics for inclusion in MIL-STD-975
NASA Technical Reports Server (NTRS)
Scott, W. Richard
1991-01-01
The approach taken by NASA and JPL (Jet Propulsion Laboratory) in the development of a MIL-STD-975 section containing advanced technology such as Large Scale Integration and Very Large Scale Integration (LSI/VLSI) microelectronic devices is described. The parts listed in this section are recommended as satisfactory for NASA flight applications, in the absence of alternate qualified devices, based on satisfactory results of a vendor capability audit, the availability of sufficient characterization and reliability data from manufacturers and users, and negotiated detail procurement specifications. The criteria used in the selection and evaluation of the vendors and candidate parts, the preparation of procurement specifications, and the status of this activity are discussed.
1991-08-11
The southern half of the island of Okinawa, Japan (26.5N, 128.0E) can be seen in this nearly cloud-free view. Okinawa is part of the Ryukyu Islands, which extend from Taiwan northeastward to Kyushu, the southernmost of the Japanese Home Islands. The large military base at Kadena, with large runways, is visible near the center of the scene. Kadena is one of several emergency landing sites around the world for the space shuttle.
NASA Astrophysics Data System (ADS)
Moore, Keegan J.; Bunyan, Jonathan; Tawfick, Sameh; Gendelman, Oleg V.; Li, Shuangbao; Leamy, Michael; Vakakis, Alexander F.
2018-01-01
In linear time-invariant dynamical and acoustical systems, reciprocity holds by the Onsager-Casimir principle of microscopic reversibility, and this can be broken only by odd external biases, nonlinearities, or time-dependent properties. A concept is proposed in this work for breaking dynamic reciprocity based on irreversible nonlinear energy transfers from large to small scales in a system with nonlinear hierarchical internal structure, asymmetry, and intentional strong stiffness nonlinearity. The resulting nonreciprocal large-to-small scale energy transfers mimic analogous nonlinear energy transfer cascades that occur in nature (e.g., in turbulent flows), and are caused by the strong frequency-energy dependence of the essentially nonlinear small-scale components of the system considered. The theoretical part of this work is mainly based on action-angle transformations, followed by direct numerical simulations of the resulting system of nonlinear coupled oscillators. The experimental part considers a system with two scales—a linear large-scale oscillator coupled to a small scale by a nonlinear spring—and validates the theoretical findings demonstrating nonreciprocal large-to-small scale energy transfer. The proposed study promotes a paradigm for designing nonreciprocal acoustic materials harnessing strong nonlinearity, which in a future application will be implemented in designing lattices incorporating nonlinear hierarchical internal structures, asymmetry, and scale mixing.
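To make the large-to-small-scale transfer concrete, the sketch below numerically integrates a two-scale system of the kind described: a damped linear large-scale oscillator coupled to a light attachment through a purely cubic (essentially nonlinear) spring, and reports how much of the initial impulsive energy ends up at the small scale. The parameter values are illustrative assumptions, not those of the experimental fixture.

```python
# Minimal sketch: a linear large-scale oscillator coupled to a light small-scale
# attachment through a purely cubic (essentially nonlinear) spring.
# Parameter values are illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

M, K = 1.0, 1.0          # large-scale mass and linear stiffness
m, knl = 0.05, 5.0       # small-scale mass and cubic coupling coefficient
c1, c2 = 0.002, 0.002    # light damping on each scale

def rhs(t, y):
    x, v, u, w = y                       # large scale (x, v), small scale (u, w)
    f_nl = knl * (x - u) ** 3            # essentially nonlinear coupling force
    return [v, (-K * x - c1 * v - f_nl) / M,
            w, (f_nl - c2 * w) / m]

# Impulsive excitation of the large scale only.
sol = solve_ivp(rhs, (0.0, 200.0), [0.0, 1.0, 0.0, 0.0], max_step=0.05)
x, v, u, w = sol.y
E_large = 0.5 * M * v**2 + 0.5 * K * x**2
# Coupling-spring energy is lumped with the small scale for illustration.
E_small = 0.5 * m * w**2 + 0.25 * knl * (x - u)**4
print("final energy fraction in small scale:",
      round(E_small[-1] / (E_large[-1] + E_small[-1] + 1e-12), 3))
```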
NASA Astrophysics Data System (ADS)
Cornillon, L.; Devilliers, C.; Behar-Lafenetre, S.; Ait-Zaid, S.; Berroth, K.; Bravo, A. C.
2017-11-01
Dealing with ceramic materials for more than two decades, Thales Alenia Space - France has identified silicon nitride Si3N4 as a high-potential material for the manufacturing of stiff, stable and lightweight truss structures for future large telescopes. Indeed, for Earth observation or astronomical observation, space missions require more and more telescopes with high spatial resolution, which leads to the use of large primary mirrors and a long distance between the primary and secondary mirrors. Therefore, current and future large space telescopes require a huge truss structure to hold and locate the mirrors precisely. Such large structures require very strong materials with a high specific stiffness and a low coefficient of thermal expansion (CTE). Based on the performance of silicon nitride and on the know-how of FCT Ingenieurkeramik in manufacturing complex parts, Thales Alenia Space (TAS) has engaged, in cooperation with FCT, activities to develop and qualify silicon nitride parts for applications in space projects.
ERIC Educational Resources Information Center
Fussler, Herman; Payne, Charles T.
Part I is a discussion of the following project tasks: A) development of an on-line, real-time bibliographic data processing system; B) implementation in library operations; C) character sets; D) Project MARC; E) circulation; and F) processing operation studies. Part II is a brief discussion of efforts to work out cooperative library systems…
Design Considerations for Large Computer Communication Networks,
1976-04-01
particular, we will discuss the last three assumptions in order to motivate some of the models to be considered in this chapter. Independence Assumption...channels. Part (a), again motivated by an earlier remark on deterministic routing, will become more accurate when we include in the model, based on fixed...hierarchical routing, then this assumption appears to be quite acceptable. Part (b) is motivated by the quite symmetrical structure of the networks considered
Occupational Health in Eastern Europe
Malan, R. M.
1963-01-01
Progress may be fostered as much by spreading information as by research. The aim of this review is to add to the existing knowledge of the pattern of occupational health services in the socialist countries of Eastern Europe. The work consists of two main parts. Part I is based on official information issued by government departments or typewritten reports prepared by government officials, and relates mostly to the Union of Soviet Socialist Republics and to Czechoslovakia. Part II is largely based on direct observation, discussion, and comparison of the occupational health services in Czechoslovakia, of which I have more extensive knowledge than of the other countries of Eastern Europe. This part embodies a number of conclusions and is followed by a list of bibliographical references. Throughout the review I have endeavoured to show how problems which exist all over the world are dealt with in Eastern Europe. PMID:13932439
Mineral resource potential of the Middle Santiam Roadless Area, Linn County, Oregon
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, G.W.
1984-01-01
This report discusses the results of a mineral survey of the Middle Santiam Roadless Area (06929), Willamette National Forest, Linn County, Oregon. The Middle Santiam Roadless Area is adjacent on the east to the Quartzville mining district, a district that has yielded small amounts of base- and precious-metal ores. Many rock types and alteration features that characterize the mining district occur only in the western part of the roadless area, and analysis of a few samples from this part of the roadless area indicates evidence of weak mineralization. The western part of the roadless area is therefore identified as having a moderate potential for small deposits of base and precious metals and a low potential for large, very low-grade precious-metal deposits. The eastern part of the roadless area has a low potential for metalliferous deposits. 7 refs., 4 figs., 1 tab.
POLLUTION PREVENTION RESEARCH STRATEGY
One of the strategic goals of the U.S. Environmental Protection Agency (EPA) is to prevent pollution and reduce risk in communities, homes, workplaces, and ecosystems. This goal must be based in large part on the application of the best available science and technology associat...
Using CLIPS in the domain of knowledge-based massively parallel programming
NASA Technical Reports Server (NTRS)
Dvorak, Jiri J.
1994-01-01
The Program Development Environment (PDE) is a tool for massively parallel programming of distributed-memory architectures. Adopting a knowledge-based approach, the PDE eliminates the complexity introduced by parallel hardware with distributed memory and offers complete transparency in respect of parallelism exploitation. The knowledge-based part of the PDE is realized in CLIPS. Its principal task is to find an efficient parallel realization of the application specified by the user in a comfortable, abstract, domain-oriented formalism. A large collection of fine-grain parallel algorithmic skeletons, represented as COOL objects in a tree hierarchy, contains the algorithmic knowledge. A hybrid knowledge base with rule modules and procedural parts, encoding expertise about application domain, parallel programming, software engineering, and parallel hardware, enables a high degree of automation in the software development process. In this paper, important aspects of the implementation of the PDE using CLIPS and COOL are shown, including the embedding of CLIPS with C++-based parts of the PDE. The appropriateness of the chosen approach and of the CLIPS language for knowledge-based software engineering are discussed.
Paws without claws? Ecological effects of large carnivores in anthropogenic landscapes
Sahlén, E.; Elmhagen, B.; Chamaillé-Jammes, S.; Sand, H.; Lone, K.; Cromsigt, J. P. G. M.
2016-01-01
Large carnivores are frequently presented as saviours of biodiversity and ecosystem functioning through their creation of trophic cascades, an idea largely based on studies coming primarily out of relatively natural landscapes. However, in large parts of the world, particularly in Europe, large carnivores live in and are returning to strongly human-modified ecosystems. At present, we lack a coherent framework to predict the effects of large carnivores in these anthropogenic landscapes. We review how human actions influence the ecological roles of large carnivores by affecting their density or behaviour or those of mesopredators or prey species. We argue that the potential for density-mediated trophic cascades in anthropogenic landscapes is limited to unproductive areas where even low carnivore numbers may impact prey densities or to the limited parts of the landscape where carnivores are allowed to reach ecologically functional densities. The potential for behaviourally mediated trophic cascades may be larger and more widespread, because even low carnivore densities affect prey behaviour. We conclude that predator–prey interactions in anthropogenic landscapes will be highly context-dependent and human actions will often attenuate the ecological effects of large carnivores. We highlight the knowledge gaps and outline a new research avenue to study the role of carnivores in anthropogenic landscapes. PMID:27798302
Large-Eddy Simulation of the Base Flow of a Cylindrical Space Vehicle Configuration
NASA Astrophysics Data System (ADS)
Meiß, J.-H.; Schröder, W.
2009-01-01
A Large-Eddy Simulation (LES) is performed to investigate the high Reynolds number base flow of an axisymmetric rocket-like configuration with an underexpanded nozzle flow. The subsonic base region of low pressure levels is characterized and bounded by the interaction of the Mach 5.3 freestream and the wide plume of the hot Mach 3.8 exhaust jet. An analysis of the base flow shows that the system of base-area vortices determines the highly time-dependent pressure distribution and causes an upstream convection of hot exhaust gas. A comparison of the results with experiments conducted at the German Aerospace Center (DLR) Cologne shows good agreement. The investigation is part of the German RESPACE Program, which focuses on Key Technologies for Reusable Space Systems.
Management of fish populations in large rivers: a review of tools and approaches
Petts, Geoffrey E.; Imhoff, Jack G.; Manny, Bruce A.; Maher, John F. B.; Weisberg, Stephen B.
1989-01-01
In common with most branches of science, the management of riverine fish populations is characterised by reductionist and isolationist philosophies. Traditional fish management focuses on stocking and controls on fishing. This paper presents a consensus of scientists involved in the LARS workshop on the management of fish populations in large rivers. A move towards a more holistic philosophy is advocated, with fish management forming an integral part of sustainable river development. Based upon a questionnaire survey of LARS members, with wide-ranging expertise and experience from all parts of the world, lists of management tools currently in use are presented. Four categories of tools are described: flow, water-quality, habitat, and biological. The potential applications of tools for fish management in large rivers are discussed and research needs are identified. The lack of scientific evaluation of the different tools remains the major constraint to their wider application.
Bayer image parallel decoding based on GPU
NASA Astrophysics Data System (ADS)
Hu, Rihui; Xu, Zhiyong; Wei, Yuxing; Sun, Shaohua
2012-11-01
In photoelectric tracking systems, Bayer images are traditionally decompressed on the CPU. However, this is too slow when the images become large, for example 2K×2K×16 bit. To accelerate Bayer image decoding, this paper introduces a parallel speedup method for NVIDIA Graphics Processing Units (GPUs) that support the CUDA architecture. The decoding procedure can be divided into three parts: a serial part, a task-parallelism part, and a data-parallelism part comprising inverse quantization, the inverse discrete wavelet transform (IDWT), and image post-processing. To reduce the execution time, the task-parallelism part is optimized with OpenMP techniques. The data-parallelism part improves its efficiency by executing on the GPU as a CUDA parallel program. The optimization techniques include instruction optimization, shared memory access optimization, coalesced global memory access optimization, and texture memory optimization. In particular, the IDWT is significantly sped up by rewriting the 2D (two-dimensional) serial IDWT as a 1D parallel IDWT. In experiments with a 1K×1K×16 bit Bayer image, the data-parallelism part is more than 10 times faster than the CPU-based implementation. Finally, a CPU+GPU heterogeneous decompression system was designed. The experimental results show that it achieves a 3 to 5 times speed increase compared to the serial CPU method.
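The central data-parallel idea, recasting the 2D IDWT as independent 1D inverse transforms along rows and columns, can be illustrated compactly on the CPU. The sketch below uses a single-level Haar wavelet in NumPy as a stand-in; it is not the paper's CUDA kernel, and the sub-band layout is an assumption.

```python
# Minimal sketch: a single-level 2D inverse Haar wavelet transform expressed as
# two passes of independent 1D inverse transforms (the structure that maps onto
# one GPU thread per row/column).  The sub-band layout [LL LH; HL HH] is assumed.
import numpy as np

def idwt1d_haar(approx, detail):
    """Inverse 1D Haar transform along the last axis (vectorized over rows)."""
    even = (approx + detail) / np.sqrt(2.0)
    odd = (approx - detail) / np.sqrt(2.0)
    out = np.empty(approx.shape[:-1] + (2 * approx.shape[-1],), approx.dtype)
    out[..., 0::2], out[..., 1::2] = even, odd
    return out

def idwt2d_haar(coeffs):
    """2D inverse transform = independent 1D inverses along rows, then columns."""
    h, w = coeffs.shape
    ll, lh = coeffs[:h // 2, :w // 2], coeffs[:h // 2, w // 2:]
    hl, hh = coeffs[h // 2:, :w // 2], coeffs[h // 2:, w // 2:]
    ca = idwt1d_haar(ll, lh)                 # row pass (each row independent)
    cd = idwt1d_haar(hl, hh)
    return idwt1d_haar(ca.T, cd.T).T         # column pass (each column independent)

# Round-trip check against the matching forward transform on a random "image".
rng = np.random.default_rng(0)
img = rng.random((8, 8))
ca = (img[0::2] + img[1::2]) / np.sqrt(2.0)      # forward Haar, columns...
cd = (img[0::2] - img[1::2]) / np.sqrt(2.0)
ra = (ca[:, 0::2] + ca[:, 1::2]) / np.sqrt(2.0)  # ...then rows
rd = (ca[:, 0::2] - ca[:, 1::2]) / np.sqrt(2.0)
da = (cd[:, 0::2] + cd[:, 1::2]) / np.sqrt(2.0)
dd = (cd[:, 0::2] - cd[:, 1::2]) / np.sqrt(2.0)
coeffs = np.block([[ra, rd], [da, dd]])
print("max reconstruction error:", np.abs(idwt2d_haar(coeffs) - img).max())
```

On a GPU, each call to the 1D inverse would map naturally onto one thread or thread block per row or column, which is the parallel structure the paper exploits.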
ERIC Educational Resources Information Center
Gan, Zhengdong
2012-01-01
This study, which is part of a large-scale study of using objective measures to validate assessment rating scales and assessment tasks in a high-profile school-based assessment initiative in Hong Kong, examined how grammatical complexity measures relate to task type and analytic evaluations of students' speaking proficiency in a classroom-based…
M. D. Brinckman; J. F. Munsell
2009-01-01
Interest in wood-based bio-energy production systems is increasing. Multiscalar, mixed-method approaches focusing on both biophysical and social aspects of procurable feedstock are needed. Family forests will likely play an important role in supplying forest-based biomass. However, access depends in large part on the management trends among family forest owners. This...
Coal-tar-based pavement sealcoat, polycyclic aromatic Hydrocarbons (PAHs), and environmental health
Mahler, B.J.; Van Metre, P.C.
2011-01-01
Studies by the U.S. Geological Survey (USGS) have identified coal-tar-based sealcoat (the black, viscous liquid sprayed or painted on asphalt pavement such as parking lots) as a major source of polycyclic aromatic hydrocarbon (PAH) contamination in urban areas for large parts of the Nation. Several PAHs are suspected human carcinogens and are toxic to aquatic life.
Dowel Bar Retrofit Mix Design and Specification : Technical Report
DOT National Transportation Integrated Search
2012-01-01
Current INDOT specifications for repair materials to be used in dowel bar retrofit (DBR) applications (Sections 507.08 and 901.07 of INDOT's Book of Specifications) are based, in large part, on the requirements of ASTM C 928 and the manufacturer-pr...
Electrical Equipment of Electrical Stations and Substations,
1979-10-25
of Communist society. In 1921 he wrote: "the sole material base of socialism can be the large/coarse machine industry, capable of reorganizing and...produced with the aid of a special switching system, structurally/constructionally being part of the transformer itself. The transformers, supplied with this
Desertification in the south Junggar Basin, 2000-2009: Part II. Model development and trend analysis
NASA Astrophysics Data System (ADS)
Jiang, Miao; Lin, Yi
2018-07-01
The substantial objective of desertification monitoring is to derive its development trend, which facilitates making policies in advance to handle its potential influences. Toward this goal, previous studies have proposed a large number of remote sensing (RS) based methods to retrieve multifold indicators, as reviewed in Part I. However, most of these indicators, each capable of characterizing only a single aspect of land attributes (e.g., albedo quantifying land surface reflectivity), cannot show a full picture of desertification processes, and few comprehensive RS-based models have been published. To fill this gap, this Part II was dedicated to developing an RS information model for comprehensively characterizing desertification and deriving its trend, based on the indicators retrieved in Part I for the same case of the south Junggar Basin, China over the last decade (2000-2009). The proposed model has three dominant component modules, i.e., the vegetation-relevant sub-model, the soil-relevant sub-model, and the water-relevant sub-model, which synthesize all of the retrieved indicators to integrally reflect the processes of desertification; based on the model-output indices, the desertification trends were derived using the least absolute deviation fitting algorithm. Tests indicated that the proposed model works and that the study area shows different development tendencies for different desertification levels. Overall, this Part II established a new comprehensive RS information model for desertification risk assessment and trend derivation, and the whole study comprising Part I and Part II advanced a relatively standard framework for RS-based desertification monitoring.
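The trend-derivation step, a least absolute deviation (LAD) fit of a linear trend to the yearly model-output index, can be posed as a small linear program. The sketch below illustrates this with SciPy; the yearly index values are made-up placeholders, not the study's retrieved indices.

```python
# Minimal sketch: least-absolute-deviation (LAD) fit of a linear trend to a
# yearly desertification index, posed as a linear program.  The index values
# below are made-up placeholders, not the study's retrieved indices.
import numpy as np
from scipy.optimize import linprog

years = np.arange(2000, 2010)
index = np.array([0.42, 0.44, 0.43, 0.47, 0.50, 0.48, 0.53, 0.52, 0.55, 0.57])

t = years - years[0]
n = len(t)
# Variables: [intercept a, slope b, e_1..e_n]; minimize sum(e_i) with |residual_i| <= e_i.
c = np.r_[0.0, 0.0, np.ones(n)]
A, b = [], []
for i in range(n):
    e = np.zeros(n)
    e[i] = -1.0
    A.append(np.r_[-1.0, -t[i], e]); b.append(-index[i])   #  y_i - a - b*t_i <= e_i
    A.append(np.r_[ 1.0,  t[i], e]); b.append( index[i])   # -(y_i - a - b*t_i) <= e_i
bounds = [(None, None), (None, None)] + [(0.0, None)] * n
res = linprog(c, A_ub=np.array(A), b_ub=np.array(b), bounds=bounds)
a_hat, b_hat = res.x[:2]
print(f"LAD trend: index = {a_hat:.3f} + {b_hat:.4f} * (year - 2000)")
```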
Semantic Annotations and Querying of Web Data Sources
NASA Astrophysics Data System (ADS)
Hornung, Thomas; May, Wolfgang
A large part of the Web, actually holding a significant portion of the useful information throughout the Web, consists of views on hidden databases, provided by numerous heterogeneous interfaces that are partly human-oriented via Web forms ("Deep Web") and partly based on Web Services (only machine-accessible). In this paper we present an approach for annotating these sources in a way that makes them citizens of the Semantic Web. We illustrate how queries can be stated in terms of the ontology, and how the annotations are used to select and access appropriate sources and to answer the queries.
Reshaping of large aeronautical structural parts: A simplified simulation approach
NASA Astrophysics Data System (ADS)
Mena, Ramiro; Aguado, José V.; Guinard, Stéphane; Huerta, Antonio
2018-05-01
Large aeronautical structural parts exhibit significant distortions after machining. This problem is caused by residual stresses that develop during previous manufacturing steps (quenching). Before the part is put into service, the nominal geometry is restored by mechanical methods. This operation, called reshaping, depends exclusively on the skills of a well-trained and experienced operator; moreover, it is time-consuming and currently relies only on a trial-and-error approach. There is therefore a need at the industrial level to address this problem with the support of numerical simulation tools. Using a simplifying hypothesis, the springback phenomenon was found to behave linearly, which allows a strategy for implementing reshaping at an industrial level to be developed.
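Under the linearity finding, the reshaping correction can be framed as a superposition problem: the residual distortion is modelled as the initial distortion plus an influence matrix times the press magnitudes, and the presses are chosen by least squares. The sketch below is a schematic illustration of that idea only; the influence matrix and distortion values are hypothetical.

```python
# Minimal sketch of reshaping under the linear-springback assumption:
# distortion_after = distortion_before + G @ press_forces, where column j of G
# is the (precomputed) residual deflection response at the control points to a
# unit press at candidate location j.  All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n_ctrl, n_press = 12, 4                               # control points / press locations
G = rng.normal(scale=0.05, size=(n_ctrl, n_press))    # influence matrix (mm/kN)
d0 = rng.normal(scale=0.8, size=n_ctrl)               # measured distortion (mm)

# Choose press magnitudes that minimize the remaining distortion (least squares).
forces, residual, *_ = np.linalg.lstsq(G, -d0, rcond=None)
d_after = d0 + G @ forces
print("press forces (kN):", np.round(forces, 2))
print("max |distortion| before/after (mm):",
      round(np.abs(d0).max(), 3), "/", round(np.abs(d_after).max(), 3))
```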
Evaluation of Global Observations-Based Evapotranspiration Datasets and IPCC AR4 Simulations
NASA Technical Reports Server (NTRS)
Mueller, B.; Seneviratne, S. I.; Jimenez, C.; Corti, T.; Hirschi, M.; Balsamo, G.; Ciais, P.; Dirmeyer, P.; Fisher, J. B.; Guo, Z.;
2011-01-01
Quantification of global land evapotranspiration (ET) has long been associated with large uncertainties due to the lack of reference observations. Several recently developed products now provide the capacity to estimate ET at global scales. These products, partly based on observational data, include satellite-based products, land surface model (LSM) simulations, atmospheric reanalysis output, estimates based on empirical upscaling of eddy-covariance flux measurements, and atmospheric water balance datasets. The LandFlux-EVAL project aims to evaluate and compare these newly developed datasets. Additionally, an evaluation of IPCC AR4 global climate model (GCM) simulations is presented, providing an assessment of their capacity to reproduce flux behavior relative to the observations-based products. Though differently constrained with observations, the analyzed reference datasets display similar large-scale ET patterns. ET from the IPCC AR4 simulations was significantly smaller than that from the other products for India (up to 1 mm/d) and parts of eastern South America, and larger in the western USA, Australia and China. The inter-product variance is lower across the IPCC AR4 simulations than across the reference datasets in several regions, which indicates that uncertainties may be underestimated in the IPCC AR4 models due to shared biases of these simulations.
NASA Astrophysics Data System (ADS)
Haer, Toon; Aerts, Jeroen
2015-04-01
Between 1998 and 2009, Europe suffered over 213 major damaging floods, causing 1126 deaths and displacing around half a million people. In this period, floods caused at least 52 billion euro in insured economic losses, making floods the most costly natural hazard faced in Europe. In many low-lying areas, the main strategy to cope with floods is to reduce the risk of the hazard through flood defence structures, like dikes and levees. However, it is suggested that part of the responsibility for flood protection needs to shift to households and businesses in areas at risk, and that governments and insurers can effectively stimulate the implementation of individual protective measures. Nevertheless, adaptive behaviour towards flood risk reduction and the interaction between governments, insurers, and individuals have hardly been studied in large-scale flood risk assessments. In this study, a European Agent-Based Model is developed that includes agent representatives for the administrative stakeholders of European Member States, insurer and reinsurer markets, and individuals following complex behaviour models. The Agent-Based Modelling approach allows for an in-depth analysis of the interaction between heterogeneous autonomous agents and the resulting (non-)adaptive behaviour. Existing flood damage models are part of the European Agent-Based Model to allow for a dynamic response of both the agents and the environment to changing flood risk and protective efforts. By following an Agent-Based Modelling approach, this study is a first contribution to overcoming the limitations of traditional large-scale flood risk models, in which the influence of individual adaptive behaviour towards flood risk reduction is often lacking.
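A minimal sketch of the household-level decision loop in such a model is given below: each household agent compares the perceived benefit of a protective measure (possibly nudged by an insurance premium discount) against its cost, and flood experience raises risk perception. This is an illustrative toy with made-up parameters, not the European ABM described above.

```python
# Toy sketch of household adaptation decisions in an agent-based flood model.
# Parameters (probabilities, damages, costs, discount) are made-up.
import random

class Household:
    def __init__(self):
        self.protected = False
        self.risk_perception = random.uniform(0.5, 1.5)   # heterogeneity

    def decide(self, flood_prob, damage, measure_cost, premium_discount):
        """Adopt the measure if perceived expected benefit exceeds its cost."""
        expected_damage = self.risk_perception * flood_prob * damage
        avoided = 0.6 * expected_damage * 20               # 60% avoided over an assumed 20-year horizon
        if not self.protected and avoided + premium_discount > measure_cost:
            self.protected = True

def run(n_households=1000, years=30, flood_prob=0.01, damage=50_000,
        measure_cost=8_000, premium_discount=150):
    random.seed(0)
    agents = [Household() for _ in range(n_households)]
    for _ in range(years):
        flooded = random.random() < flood_prob
        for a in agents:
            # Experiencing a flood raises risk perception (bounded).
            if flooded:
                a.risk_perception = min(a.risk_perception * 1.5, 3.0)
            a.decide(flood_prob, damage, measure_cost, premium_discount)
    return sum(a.protected for a in agents) / n_households

print("share of protected households after 30 years:", run())
```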
Challenges in the global-scale quantification of permafrost changes
NASA Astrophysics Data System (ADS)
Gruber, S.
2012-12-01
Permafrost underlies much of Earth's surface and interacts with climate, land-surface phenomena and human systems. This presentation highlights heterogeneity and near-isothermal ground, two simple and well-known phenomena, as important challenges for investigating the current and future states of permafrost. Heterogeneity, which can be introduced by, e.g., topography, vegetation or subsurface material, is shown to be important for large parts of the global permafrost areas based on two proxies calculated from a global model of permafrost distribution. The model is based on a 1 km DEM and NCEP-NCAR as well as CRU TS 2.0 air temperature data. Near-isothermal ground occurs when heat flow into a volume of ground material is accompanied by only a minute temperature change due to the dominance of latent heat transfer near 0°C. This causes our monitoring systems, which are to a large extent based on temperature measurements, to lose much of their sensitivity as instruments for measuring permafrost change. The importance of this is argued for based on (a) the long duration that soil columns are usually exposed to this effect, (b) the abundance of boreholes with temperatures close to 0°C in the IPY-TSP data set, and (c) the global abundance and relative importance of ground near 0°C. The results presented indicate that systems and methods for gathering permafrost evidence and monitoring data need to better account for heterogeneity and near-isothermal ground in order to maintain long-term relevance, and that in large-area models sub-grid heterogeneity needs explicit attention.
Huang, Xiaoxi; Tao, Zhimin; Praskavich, John C; Goswami, Anandarup; Al-Sharab, Jafar F; Minko, Tamara; Polshettiwar, Vivek; Asefa, Tewodros
2014-09-16
The pore size and pore structure of nanoporous materials can affect the materials' physical properties, as well as potential applications in different areas, including catalysis, drug delivery, and biomolecular therapeutics. KCC-1, one of the newest members of the silica nanomaterial family, possesses fibrous, large-pore, dendritic pore networks with wide pore entrances, a large pore size distribution, spacious pore volume and large surface area, structural features that are conducive to the adsorption and release of large guest molecules and biomacromolecules (e.g., proteins and DNAs). Here, we report the results of our comparative studies of the adsorption of salmon DNA in a series of KCC-1-based nanomaterials that are functionalized with different organoamine groups on different parts of their surfaces (channel walls, external surfaces or both). For comparison, the results of our studies of the adsorption of salmon DNA in similarly functionalized MCM-41 mesoporous silica nanomaterials with cylindrical pores, some of the most studied silica nanomaterials for drug/gene delivery, are also included. Our results indicate that, despite their relatively lower specific surface area, the KCC-1-based nanomaterials show a higher adsorption capacity for DNA than the corresponding MCM-41-based nanomaterials, most likely because of KCC-1's large pores, wide pore mouths, fibrous pore network, and thereby more accessible structure for DNA molecules to diffuse through. Conversely, the MCM-41-based nanomaterials adsorb much less DNA, presumably because their outer surfaces and cylindrical channel pore entrances can become blocked by the DNA molecules, making the inner parts of the materials inaccessible. Moreover, experiments involving fluorescent dye-tagged DNAs suggest that the amine-grafted KCC-1 materials are better suited for delivering the DNAs adsorbed on their surfaces into cellular environments than their MCM-41 counterparts. Finally, cellular toxicity tests show that the KCC-1-based materials are biocompatible. On the basis of these results, the fibrous and porous KCC-1-based nanomaterials can be said to be more suitable to carry, transport, and deliver DNAs and genes than cylindrical porous nanomaterials such as MCM-41.
DOT National Transportation Integrated Search
1976-01-01
Prior to 1932, road maintenance and construction in Virginia were largely the responsibility of the individual county governments. Bridge construction projects, based on local requirements, formed a natural part of these activities. Local responsibil...
ERIC Educational Resources Information Center
Wilkins, Jesse L. M.; Norton, Anderson; Boyce, Steven J.
2013-01-01
Previous research has documented schemes and operations that undergird students' understanding of fractions. This prior research was based, in large part, on small-group teaching experiments. However, written assessments are needed in order for teachers and researchers to assess students' ways of operating on a whole-class scale. In this study,…
New installation for inclined EAS investigations
NASA Astrophysics Data System (ADS)
Zadeba, E. A.; Ampilogov, N. V.; Barbashina, N. S.; Bogdanov, A. G.; Borisov, A. A.; Chernov, D. V.; Dushkin, L. I.; Fakhrutdinov, R. M.; Kokoulin, R. P.; Kompaniets, K. G.; Kozhin, A. S.; Ovchinnikov, V. V.; Ovechkin, A. S.; Petrukhin, A. A.; Shutenko, V. V.; Volkov, N. S.; Vorobjev, V. S.; Yashin, I. I.
2017-06-01
The large-scale coordinate-tracking detector TREK for the registration of inclined EAS is being developed at MEPhI. The detector is based on multiwire drift chambers from the neutrino experiment at the IHEP U-70 accelerator. Their key advantages are a large effective area (1.85 m2) and good coordinate and angular resolution with a small number of measuring channels. The detector will be operated as part of the experimental complex NEVOD, in particular jointly with a Cherenkov water detector (CWD) with a volume of 2000 cubic meters and the coordinate detector DECOR. The first part of the detector, named the Coordinate-Tracking Unit based on Drift Chambers (CTUDC) and consisting of two coordinate planes of 8 drift chambers each, has been developed and mounted on opposite sides of the CWD. It uses the same principle of joint operation with the NEVOD-DECOR triggering system and the same drift chamber alignment, so the main features of the TREK detector can be examined. Results of the CTUDC development and its joint operation with the NEVOD-DECOR complex are presented.
Analysis of Computer Network Information Based on "Big Data"
NASA Astrophysics Data System (ADS)
Li, Tianli
2017-11-01
As computer networks and big data have developed, they have gradually become part of people's lives. People use computers to make their lives more convenient, but at the same time many network information security problems demand attention. This paper analyzes computer network information security from a "big data" perspective and puts forward some solutions.
Design of a Multi-Touch Tabletop for Simulation-Based Training
2014-06-01
receive, for example using point and click mouse-based computer interactions to specify the routes that vehicles take as part of a convoy...learning, coordination and support for planning. We first provide background in tabletop interaction in general and survey earlier efforts to use...tremendous progress over the past five years. Touch detection technologies now enable multiple users to interact simultaneously on large areas with
Extensibility in local sensor based planning for hyper-redundant manipulators (robot snakes)
NASA Technical Reports Server (NTRS)
Choset, Howie; Burdick, Joel
1994-01-01
Partial Shape Modification (PSM) is a local sensor feedback method for hyper-redundant robot manipulators, in which the redundancy is very large or infinite, as in a robot snake. This redundancy enables local obstacle avoidance and end-effector placement in real time. Due to the large number of joints or actuators in a hyper-redundant manipulator, small displacement errors easily accumulate into large errors in the position of the tip relative to the base. The accuracy can be improved by a local sensor-based planning method in which sensors are distributed along the length of the hyper-redundant robot. This paper extends the local sensor-based planning strategy beyond the limitation of the manipulator's fixed length when its joint limits are reached. This is achieved with an algorithm in which the length of the deforming part of the robot is variable. Thus, the robot's local avoidance of obstacles is improved through the enhancement of its extensibility.
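The error-accumulation argument can be made concrete with a short forward-kinematics experiment: perturb each of many joint angles by a small random error and compare the resulting tip position with the nominal one. The sketch below is a planar illustration with assumed link lengths and error magnitudes; it is not the PSM algorithm itself.

```python
# Illustrative sketch: small joint-angle errors in a planar hyper-redundant
# chain accumulate into a large tip-position error relative to the base.
# Link lengths, joint count and error magnitude are assumed values.
import numpy as np

def tip_position(joint_angles, link_length=0.1):
    """Planar forward kinematics: cumulative angles, equal link lengths."""
    phi = np.cumsum(joint_angles)
    return np.array([np.sum(link_length * np.cos(phi)),
                     np.sum(link_length * np.sin(phi))])

rng = np.random.default_rng(42)
n_joints = 30                                    # "snake" with 30 segments
nominal = np.full(n_joints, np.deg2rad(3.0))     # gentle nominal curve
errors = rng.normal(scale=np.deg2rad(0.2), size=n_joints)  # 0.2 deg per joint

tip_err = np.linalg.norm(tip_position(nominal + errors) - tip_position(nominal))
print(f"tip error from 0.2-degree joint errors over {n_joints} joints: "
      f"{1000 * tip_err:.1f} mm")
```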
NASA Technical Reports Server (NTRS)
1979-01-01
The performance, design and verification requirements for the space Construction Automated Fabrication Experiment (SCAFE) are defined. The SCAFE program defines, develops, and demonstrates the techniques, processes, and equipment required for the automatic fabrication of structural elements in space and for the assembly of such elements into a large, lightweight structure. The program defines a large structural platform to be constructed in orbit using the space shuttle as a launch vehicle and construction base.
Metrology of Large Parts. Chapter 5
NASA Technical Reports Server (NTRS)
Stahl, H. Philip
2012-01-01
As discussed in the first chapter of this book, there are many different methods to measure a part using optical technology. Chapter 2 discussed the use of machine vision to measure macroscopic features such as length and position, which was extended to the use of interferometry as a linear measurement tool in chapter 3, and laser or other trackers to find the relation of key points on large parts in chapter 4. This chapter looks at measuring large parts to optical tolerances in the sub-micron range using interferometry, ranging, and the optical tools discussed in the previous chapters. The purpose of this chapter is not to discuss specific metrology tools (such as interferometers or gauges), but to describe a systems engineering approach to testing large parts. Issues such as material warpage and temperature drifts that may be insignificant when measuring a part to micron levels under a microscope, as will be discussed in later chapters, can prove to be very important when making the same measurement over a larger part. In this chapter, we will define a set of guiding principles for successfully overcoming these challenges and illustrate the application of these principles with real world examples. While these examples are drawn from specific large optical testing applications, they inform the problems associated with testing any large part to optical tolerances.

Manufacturing today relies on micrometer level part performance. Fields such as energy and transportation are demanding higher tolerances to provide increased efficiencies and fuel savings. By looking at how the optics industry approaches sub-micrometer metrology, one can gain a better understanding of the metrology challenges for any larger part specified to micrometer tolerances.

Testing large parts, whether optical components or precision structures, to optical tolerances is just like testing small parts, only harder. Just as with small parts, a metrologist tests large parts, and optics in particular, to quantify their mechanical properties (such as dimensions, mass, etc.); their optical prescription or design (i.e. radius of curvature, conic constant, vertex location, size); and their full part shape. And, just as with small parts, a metrologist accomplishes these tests using distance measuring instruments such as tape measures, inside micrometers, coordinate measuring machines and distance measuring interferometers; angle measuring instruments such as theodolites and autocollimators; and surface measuring instruments including interferometers, stylus profilers, interference microscopes, photogrammetric cameras, or other tools. However, while the methodology may be similar, it is more difficult to test a large object for the simple reason that most metrologists do not have the necessary intuition. The skills used to test small parts or optics in a laboratory do not extrapolate to testing large parts in an industrial setting any more than a backyard gardener might successfully operate a farm.

But first, what is a large part? A simple definition might be the part's size or diameter. For optics and diffuse surface parts alike, the driving constraint is the ability to illuminate the part's surface. For reflective convex mirrors, large is typically anything greater than 1 meter. But for refractive optics, flats or convex mirrors, large is typically greater than 0.5 meter. While a size definition is simple, it may be less than universal.
A more nuanced definition might be that a large part is any component which cannot be easily tested in a standard laboratory environment, on a standard vibration isolated table using standard laboratory infrastructure. A micro-switch or a precision lens might be easily measured to nanometer levels under a microscope in a lab, but a power turbine spline or a larger telescope mirror will not fit under that microscope and may not even fit on the table.
Overholser, Brian R; Sowinski, Kevin M
2007-12-01
Biostatistics is the application of statistics to biologic data. The field of statistics can be broken down into 2 fundamental parts: descriptive and inferential. Descriptive statistics are commonly used to categorize, display, and summarize data. Inferential statistics can be used to make predictions based on a sample obtained from a population or some large body of information. It is these inferences that are used to test specific research hypotheses. This 2-part review will outline important features of descriptive and inferential statistics as they apply to commonly conducted research studies in the biomedical literature. Part 1 in this issue will discuss fundamental topics of statistics and data analysis. Additionally, some of the most commonly used statistical tests found in the biomedical literature will be reviewed in Part 2 in the February 2008 issue.
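As a concrete illustration of the two branches, the short sketch below computes descriptive summaries for two simulated groups and then draws an inference about their difference with a two-sample t-test; the data are fabricated for demonstration.

```python
# Descriptive vs. inferential statistics on simulated data (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
control = rng.normal(loc=120.0, scale=12.0, size=40)    # e.g. blood pressure
treated = rng.normal(loc=113.0, scale=12.0, size=40)

# Descriptive: categorize, display and summarize the sample itself.
for name, x in [("control", control), ("treated", treated)]:
    print(f"{name}: mean={x.mean():.1f}, SD={x.std(ddof=1):.1f}, "
          f"median={np.median(x):.1f}, n={len(x)}")

# Inferential: use the samples to test a hypothesis about the populations.
t_stat, p_value = stats.ttest_ind(treated, control)
print(f"two-sample t-test: t={t_stat:.2f}, p={p_value:.4f}")
```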
Content-Based Medical Image Retrieval
NASA Astrophysics Data System (ADS)
Müller, Henning; Deserno, Thomas M.
This chapter details the necessity for alternative access concepts to the currently mainly text-based methods in medical information retrieval. This need is partly due to the large amount of visual data produced, the increasing variety of medical imaging data, and changing user patterns. The stored visual data contain large amounts of unused information that, if well exploited, can help diagnosis, teaching and research. The chapter briefly reviews the history of image retrieval and its general methods before focusing on technologies that have been developed in the medical domain. We also discuss the evaluation of medical content-based image retrieval (CBIR) systems and conclude by pointing out their strengths, gaps, and further developments. As examples, the MedGIFT project and the Image Retrieval in Medical Applications (IRMA) framework are presented.
TiO2 Nanotubes: Recent Advances in Synthesis and Gas Sensing Properties
Galstyan, Vardan; Comini, Elisabetta; Faglia, Guido; Sberveglieri, Giorgio
2013-01-01
The synthesis (particularly by electrochemical anodization), growth mechanism and chemical sensing properties of pure, doped and mixed titania tubular arrays are reviewed. The first part deals with how anodization parameters affect the size, shape and morphology of titania nanotubes. The second part presents the fabrication of sensing devices based on titania nanotubes, together with their most notable gas sensing performances. Doping largely improves the conductivity and enhances the gas sensing performance of TiO2 nanotubes. PMID:24184919
NASA Technical Reports Server (NTRS)
Payne, Fred R.
1992-01-01
Lumley's 1967 Moscow paper provided, for the first time, a completely rational definition of the physically useful term 'large eddy', which had been popular for a half-century. The numerical procedures based upon his results are: (1) PODT (Proper Orthogonal Decomposition Theorem), which extracts the large-eddy structure of stochastic processes from physical or computer-simulated two-point covariances, and (2) LEIM (Large-Eddy Interaction Model), a predictive scheme for the dynamical large eddies based upon higher-order turbulence modeling. Lumley's earlier work (1964) forms the basis for the final member of the triad of numerical procedures: this predicts the global neutral modes of turbulence, which show surprising agreement both with structural eigenmodes and with those obtained from the dynamical equations. The ultimate goal of improved engineering design tools for turbulence may be near at hand, partly because the power and storage of 'supermicrocomputer' workstations are finally becoming adequate for the demanding numerics of these procedures.
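In the discrete setting, the PODT extraction of 'large eddy' modes from two-point covariances reduces to an eigen-decomposition of the covariance matrix, or equivalently an SVD of the mean-removed snapshot matrix. The sketch below demonstrates this on synthetic data; the 'flow' snapshots are fabricated and unrelated to any turbulence dataset.

```python
# Minimal sketch of the Proper Orthogonal Decomposition (POD): the empirical
# "large eddies" are the leading eigenvectors of the two-point covariance,
# obtained here via an SVD of mean-removed snapshots.  Data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n_points, n_snapshots = 200, 500
x = np.linspace(0.0, 1.0, n_points)

# Fabricated "velocity" snapshots: two coherent structures plus noise.
a1 = rng.normal(size=n_snapshots)
a2 = 0.4 * rng.normal(size=n_snapshots)
snapshots = (np.outer(np.sin(2 * np.pi * x), a1)
             + np.outer(np.sin(4 * np.pi * x), a2)
             + 0.05 * rng.normal(size=(n_points, n_snapshots)))

fluct = snapshots - snapshots.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(fluct, full_matrices=False)
energy = s**2 / np.sum(s**2)
print("energy captured by first two POD modes:", np.round(energy[:2], 3))
# U[:, 0] and U[:, 1] approximate the two imposed coherent structures.
```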
Tephritid fruit fly transgenesis and applications
USDA-ARS?s Scientific Manuscript database
Tephritid fruit flies are among the most serious agricultural pests in the world, owing in large part to those species having broad host ranges including hundreds of fruits and vegetables. They are the largest group of insects subject to population control by biologically based systems, most notab...
Advanced Carbon Fabric/Phenolics for Thermal Protection Applications.
1982-02-01
structural properties are lower than rayon-based carbon fabric analogues, they appear to be adequate for most ablative heat-shielding applications..."Development of Ablative Nozzles. Part II Ablative Nozzle Concept, Scaling Law, and Test Results," IAS Mtg. on Large Rockets, Sacramento, CA., Oct. 30
PROSPECTS ON BEHAVIORAL STUDIES OF MARINE AND FRESHWATER TOXINS.
This manuscript is based in large part on an invited presentation. The manuscript provides a brief overview of the growing issue of the human-health and environmental impact of a variety of toxins of marine and freshwater origin, the current (generally crude) state of behavioral...
NASA Astrophysics Data System (ADS)
Fakhari, Vahid; Choi, Seung-Bok; Cho, Chang-Hyun
2015-04-01
This work presents a new robust model reference adaptive control (MRAC) for controlling vibration caused by a vehicle engine using an electromagnetic active engine mount. The vibration isolation performance of the active mount with the robust controller is evaluated in the presence of large uncertainties. As a first step, an active mount with a linear solenoid actuator is prepared and its dynamic model is identified via experimental tests. Subsequently, a new robust MRAC based on the gradient method with σ-modification is designed by selecting a proper reference model. In designing the robust adaptive control, structured (parametric) uncertainties in the stiffness of the passive part of the mount and in the damping ratio of the active part of the mount are considered to investigate the robustness of the proposed controller. Experimental and simulation results are presented to evaluate performance, focusing on the robustness of the controller in the face of large uncertainties. The obtained results show that the proposed controller provides robust vibration control performance even in the presence of large uncertainties, yielding effective vibration isolation.
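The gradient-type adaptation with σ-modification can be illustrated on a first-order example: an adaptive feedback gain and feedforward gain are updated from the model-following error, with a small leakage term (the σ-modification) added for robustness. The sketch below is a generic textbook-style illustration; the plant, reference model and tuning values are assumed, not the identified engine-mount dynamics.

```python
# Minimal sketch of a first-order MRAC with sigma-modification (gradient-type
# adaptation).  Plant, reference-model and tuning parameters are illustrative
# values, not the identified engine-mount model.
import numpy as np

dt, T = 1e-3, 10.0
steps = int(T / dt)
a_p, b_p = 1.0, 3.0            # "unknown" plant (unstable, to stress adaptation)
a_m, b_m = -4.0, 4.0           # stable reference model
gamma, sigma = 20.0, 0.01      # adaptation gain and sigma-modification factor

x = x_m = 0.0
th_x = th_r = 0.0              # adaptive feedback / feedforward gains
t = np.arange(steps) * dt
r = np.sin(2 * np.pi * 1.0 * t)            # command (e.g. desired mount motion)
err = np.empty(steps)

for k in range(steps):
    u = th_x * x + th_r * r[k]             # adaptive control law
    e = x - x_m                            # model-following error
    # Gradient adaptation with sigma-modification (leakage adds robustness).
    th_x += dt * (-gamma * (e * x + sigma * th_x))
    th_r += dt * (-gamma * (e * r[k] + sigma * th_r))
    # Euler integration of the plant and the reference model.
    x += dt * (a_p * x + b_p * u)
    x_m += dt * (a_m * x_m + b_m * r[k])
    err[k] = e

print("mean |tracking error| over last second:", round(np.abs(err[-1000:]).mean(), 4))
```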
A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1
NASA Technical Reports Server (NTRS)
Rais-Rohani, Masoud
1998-01-01
This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis, and (3) optimization. The product and process definitions are part of input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depend largely on the available data on specific methods of manufacture and assembly and associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented, and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here for evaluation of six different design concepts for a wing spar.
An Application of CICCT Accident Categories to Aviation Accidents in 1988-2004
NASA Technical Reports Server (NTRS)
Evans, Joni K.
2007-01-01
Interventions or technologies developed to improve aviation safety often focus on specific causes or accident categories. Evaluation of the potential effectiveness of those interventions is dependent upon mapping the historical aviation accidents into those same accident categories. To that end, the United States civil aviation accidents occurring between 1988 and 2004 (n=26,117) were assigned accident categories based upon the taxonomy developed by the CAST/ICAO Common Taxonomy Team (CICTT). Results are presented separately for four main categories of flight rules: Part 121 (large commercial air carriers), Scheduled Part 135 (commuter airlines), Non-Scheduled Part 135 (on-demand air taxi) and Part 91 (general aviation). Injuries and aircraft damage are summarized by year and by accident category.
33 CFR Appendix A to Part 279 - Sample Resource Use Objectives
Code of Federal Regulations, 2011 CFR
2011-07-01
... technical advice. Resource use objective: To establish an ecological study area at Wakulla Wash for the protection and study of its unique vegetative associations. (Discussion) The analysis of pertinent factors... educational purposes; there is a large population base within two hours drive of the project; two local...
Towards a Constructivist Pedagogy for Year 12 Mathematics
ERIC Educational Resources Information Center
Sheppard, Ian
2008-01-01
Constructivist pedagogies are generally not considered to support the teaching of mathematics for externally assessed examination-based courses. In large part, teachers have believed that such approaches are inefficient in covering a set syllabus. This article summarises the author's learning journey in Year 12 mathematics in 2004 where attempts…
E-Sponsor Mentoring: Support for Students in Developmental Education
ERIC Educational Resources Information Center
Hodges, Russ; Payne, Emily Miller; Dietz, Albert; Hajovsky, Michelle
2014-01-01
Researchers investigated the use of two mentoring programs for students who were part of a support component of Fundamentals of Conceptual Understanding and Success (FOCUS), a comprehensive intervention grant for students enrolled in developmental mathematics coursework at a large public Texas university. The technology-based mentoring program,…
Laser system for identification, tracking, and control of flying insects
USDA-ARS?s Scientific Manuscript database
Flying insects are common vectors for transmission of pathogens and inflict significant harm on humans in large parts of the developing world. Besides the direct impact to humans, these pathogens also cause harm to crops and result in agricultural losses. Here, we present a laser-based system that c...
Effective public health policy should not be based solely on clinical, individual-based information, but requires a broad characterization of human health conditions across large geographic areas. For the most part, the necessary monitoring of human health to ...
Factors Affecting Intervention Fidelity of Differentiated Instruction in Kindergarten
ERIC Educational Resources Information Center
Dijkstra, Elma M.; Walraven, Amber; Mooij, Ton; Kirschner, Paul A.
2017-01-01
This paper reports on the findings in the first phase of a design-based research project as part of a large-scale intervention study in Dutch kindergartens. The project aims at enhancing differentiated instruction and evaluating its effects on children's development, in particular high-ability children. This study investigates relevant…
ERIC Educational Resources Information Center
Education Digest: Essential Readings Condensed for Quick Review, 2011
2011-01-01
A new data analysis, based on data collected as part of The Broad Prize process, provides insights into which large urban school districts in the United States are doing the best job of educating traditionally disadvantaged groups: African-American, Hispanics, and low-income students. Since 2002, The Eli and Edythe Broad Foundation has awarded The…
A WHOLE-LAKE WATER QUALITY SURVEY OF LAKE OAHE BASED ON A SPATIALLY-BALANCED PROBABILISTIC DESIGN
Assessing conditions on large bodies of water presents multiple statistical and logistical challenges. As part of the Upper Missouri River Program of the Environmental Monitoring and Assessment Project (EMAP) we surveyed the water quality of Lake Oahe in July-August, 2002 using a spat...
Using Mobile Phones to Increase Classroom Interaction
ERIC Educational Resources Information Center
Cobb, Stephanie; Heaney, Rose; Corcoran, Olivia; Henderson-Begg, Stephanie
2010-01-01
This study examines the possible benefits of using mobile phones to increase interaction and promote active learning in large classroom settings. First year undergraduate students studying Cellular Processes at the University of East London took part in a trial of a new text-based classroom interaction system and evaluated their experience by…
ERIC Educational Resources Information Center
Wergin, Jon F.
2011-01-01
In this essay, Jon Wergin reminds readers of the philosophical and historical foundations of the doctor of education (EdD) degree. He argues that the EdD should be based, in large part, on John Dewey's progressive ideals of democratization and Paulo Freire's concepts of emancipatory education. Drawing on theories of reflective practice,…
Management of Classroom Behaviors: Perceived Readiness of Education Interns
ERIC Educational Resources Information Center
Garland, Dennis; Garland, Krista Vince; Vasquez, Eleazar, III
2013-01-01
Education students at a large research university participated in internships during their final semesters as part of their respective programs of study as a capstone experience. Qualitative and quantitative methods were used to collect data on the perceptions of interns' readiness and knowledge of evidence-based practices to manage classroom…
Limitations of Experiments in Education Research
ERIC Educational Resources Information Center
Schanzenbach, Diane Whitmore
2012-01-01
Research based on randomized experiments (along with high-quality quasi-experiments) has gained traction in education circles in recent years. There is little doubt this has been driven in large part by the shift in research funding strategy by the Department of Education's Institute of Education Sciences under Grover Whitehurst's lead, described…
Predicting agricultural impacts of large-scale drought: 2012 and the case for better modeling
USDA-ARS?s Scientific Manuscript database
We present an example of a simulation-based forecast for the 2012 U.S. maize growing season produced as part of a high-resolution, multi-scale, predictive mechanistic modeling study designed for decision support, risk management, and counterfactual analysis. The simulations undertaken for this analy...
ERIC Educational Resources Information Center
Buckley, Barbara C.; Gobert, Janice D.; Kindfield, Ann C. H.; Horwitz, Paul; Tinker, Robert F.; Gerlits, Bobbi; Wilensky, Uri; Dede, Chris; Willett, John
2004-01-01
This paper describes part of a project called Modeling Across the Curriculum which is a large-scale research study in 15 schools across the United States. The specific data presented and discussed here in this paper is based on BioLogica, a hypermodel, interactive environment for learning genetics, which was implemented in multiple classes in…
Phenomenology-Based Inverse Scattering for Sensor Information Fusion
2006-09-15
abilities in the past. Rule-based systems and mathematics of logic implied significant similarities between the two: Thoughts, words, and phrases...all are logical statements. The situation has changed, in part due to the fact that logic-rule systems have not been sufficiently powerful to explain...references]. Language mechanisms of our mind include abilities to acquire a large vocabulary, rules of grammar, and to use the finite set of
Robotic Range Clearance Competition (R2C2)
2011-10-01
unexploded ordnance (UXO). A large part of the debris field consists of ferrous metal objects that magnetic...was set at 7 degrees above horizontal based on terrain around the Base station. We used the BSUBR file for all fields except the Subsurface...and subsurface clearance test areas had numerous pieces of simulated unexploded ordnance (SUXO) buried at random locations around the field. These
Using vignettes to explore work-based learning: Part 1.
Wareing, Mark
This is the first of two articles exploring the use of vignettes as an alternative method of presenting the data arising from interviews. The interviews were carried out as part of research into work-based learning: both articles are based on findings from a hermeneutic phenomenological study into the lived experience of foundation degree mentors and their students-healthcare assistants undertaking a foundation degree in health and social care in order to become assistant practitioners. Part 1 presents a vignette of a notional workplace mentor (Staff Nurse Sophie) that describes her lived experience supporting two equally notional foundation degree students. Sophie's perspective will be a distillation of data arising from interviews with eight workplace mentors, all employed on acute wards within a large NHS hospital trust. The vignette attempts to demonstrate the role of the workplace mentor in the support of work-based learning, and the interprofessional factors that determine the landscape of workplace learning for foundation degree students. The potential of a vignette to assist in a deeper hermeneutic understanding of meanings arising from data will be explored, and the limitations of the approach considered.
Automated Cutting And Drilling Of Composite Parts
NASA Technical Reports Server (NTRS)
Warren, Charles W.
1993-01-01
Proposed automated system precisely cuts and drills large, odd-shaped parts made of composite materials. System conceived for manufacturing lightweight composite parts to replace heavier parts in Space Shuttle. Also useful in making large composite parts for other applications. Includes a robot that locates the part to be machined, positions the cutter, and positions the drill. Gantry-type robot best suited for task.
ERIC Educational Resources Information Center
Knutson, Kristopher; Smith, Jennifer; Nichols, Paul; Wallert, Mark A.; Provost, Joseph J.
2010-01-01
Research-based learning in a teaching environment is an effective way to help bring the excitement and experience of independent bench research to a large number of students. The program described here is the second of a two-semester biochemistry laboratory series. Here, students are empowered to design, execute and analyze their own experiments…
Müller, Hermann J; O'Grady, Rebecca; Krummenacher, Joseph; Heller, Dieter
2008-11-01
Three experiments re-examined Baylis and Driver's (1993) strong evidence for object-based selection, that making relative apex location judgments is harder between two objects than within a single object, with object (figure-ground) segmentation determined solely by color-based perceptual set. Using variations of the Baylis and Driver paradigm, the experiments replicated a two-object cost. However, they also showed a large part of the two-object cost to be attributable to space-based factors, though there remained an irreducible cost consistent with 'true' object-based selection.
Transaction-Based Building Controls Framework, Volume 1: Reference Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Somasundaram, Sriram; Pratt, Robert G.; Akyol, Bora A.
This document proposes a framework concept to achieve the objectives of raising buildings’ efficiency and energy savings potential, benefitting building owners and operators. We call it a transaction-based framework, wherein mutually beneficial and cost-effective market-based transactions can be enabled between multiple players across different domains. Transaction-based building controls are one part of the transactional energy framework. While these controls realize benefits by enabling automatic, market-based intra-building efficiency optimizations, the transactional energy framework provides similar benefits using the same market-based structure, yet on a larger scale and beyond just buildings, to society at large.
Static and Dynamic Properties of DNA Confined in Nanochannels
NASA Astrophysics Data System (ADS)
Gupta, Damini
Next-generation sequencing (NGS) techniques have considerably reduced the cost of high-throughput DNA sequencing. However, it is challenging to detect large-scale genomic variations by NGS due to short read lengths. Genome mapping can easily detect large-scale structural variations because it operates on extremely large intact molecules of DNA with adequate resolution. One of the promising methods of genome mapping is based on confining large DNA molecules inside a nanochannel whose cross-sectional dimensions are approximately 50 nm. Even though this genome mapping technology has been commercialized, the current understanding of the polymer physics of DNA in nanochannel confinement is based on theories and lacks much needed experimental support. The results of this dissertation are aimed at providing a detailed experimental understanding of equilibrium properties of nanochannel-confined DNA molecules. The results are divided into three parts. In the first part, we evaluate the role of channel shape on thermodynamic properties of channel-confined DNA molecules using a combination of fluorescence microscopy and simulations. Specifically, we show that the high aspect ratio of rectangular channels significantly alters the chain statistics as compared to an equivalent square channel with the same cross-sectional area. In the second part, we present experimental evidence that weak excluded volume effects arise in DNA nanochannel confinement, which form the physical basis for the extended de Gennes regime. We also show how confinement spectroscopy and simulations can be combined to reduce molecular weight dispersity effects arising from shearing, photo-cleavage, and nonuniform staining of DNA. Finally, the third part of the thesis concerns the dynamic properties of nanochannel-confined DNA. We directly measure the center-of-mass diffusivity of single DNA molecules in confinement and show that it is necessary to modify the classical results of de Gennes to account for local chain stiffness of DNA in order to explain the experimental results. In the end, we believe that our findings from the experimental test of the phase diagram for channel-confined DNA, with careful control over molecular weight dispersity, channel geometry, and electrostatic interactions, will provide a firm foundation for the emerging genome mapping technology.
Automatic Extraction of Destinations, Origins and Route Parts from Human Generated Route Directions
NASA Astrophysics Data System (ADS)
Zhang, Xiao; Mitra, Prasenjit; Klippel, Alexander; Maceachren, Alan
Researchers from the cognitive and spatial sciences are studying text descriptions of movement patterns in order to examine how humans communicate and understand spatial information. In particular, route directions offer a rich source of information on how cognitive systems conceptualize movement patterns by segmenting them into meaningful parts. Route directions are composed using a plethora of cognitive spatial organization principles: changing levels of granularity, hierarchical organization, incorporation of cognitively and perceptually salient elements, and so forth. Identifying such information in text documents automatically is crucial for enabling machine-understanding of human spatial language. The benefits are: a) creating opportunities for large-scale studies of human linguistic behavior; b) extracting and georeferencing salient entities (landmarks) that are used by human route direction providers; c) developing methods to translate route directions to sketches and maps; and d) enabling queries on large corpora of crawled/analyzed movement data. In this paper, we introduce our approach and implementations that bring us closer to the goal of automatically processing linguistic route directions. We report on research directed at one part of the larger problem, that is, extracting the three most critical parts of route directions and movement patterns in general: origin, destination, and route parts. We use machine-learning based algorithms to extract these parts of routes, including, for example, destination names and types. We prove the effectiveness of our approach in several experiments using hand-tagged corpora.
NASA Astrophysics Data System (ADS)
Xu, Jianxin; Liang, Hong
2013-07-01
Terrestrial laser scanning creates a point cloud composed of thousands or millions of 3D points. Through pre-processing, generating TINs, and mapping texture, a 3D model of a real object is obtained. When the object is too large, it is separated into several parts. This paper mainly focuses on the problem of uneven gray levels at the intersection of two adjacent textures. A new algorithm is presented in the paper, which performs per-pixel linear interpolation along a loop line buffer. The experimental data derive from a point cloud of a stone lion situated in front of the west gate of Henan Polytechnic University. The modeling workflow is composed of three parts: first, the large object is separated into two parts; then each part is modeled; finally, the whole 3D model of the stone lion is composed from the two part models. When the two part models are combined, there is an obvious fissure line in the overlapping section of the two adjacent textures. Some researchers decrease the brightness values of all pixels of the two adjacent textures using various algorithms; however, these algorithms are not fully effective and the fissure line still exists. The gray unevenness of two adjacent textures is handled by the algorithm in this paper: the fissure line in the overlapping section of the textures is eliminated, and the gray transition in the overlapping section becomes smoother.
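The per-pixel linear interpolation described above can be illustrated with a minimal sketch (array names, overlap width, and grayscale data are assumed for illustration; this is not the authors' loop-line-buffer implementation): blend the two textures across their shared columns with weights that ramp linearly from one texture to the other.

```python
import numpy as np

def blend_overlap(tex_a, tex_b, overlap):
    """Join two grayscale texture strips whose last/first `overlap` columns
    cover the same surface region, using per-pixel linear interpolation so
    the gray transition across the seam is smooth (illustrative only)."""
    h, wa = tex_a.shape
    # weights ramp from 1 (pure tex_a) to 0 (pure tex_b) across the overlap
    w = np.linspace(1.0, 0.0, overlap)
    blended = w * tex_a[:, wa - overlap:] + (1.0 - w) * tex_b[:, :overlap]
    return np.hstack([tex_a[:, :wa - overlap], blended, tex_b[:, overlap:]])

# toy usage: two strips with different brightness levels
a = np.full((4, 6), 200.0)
b = np.full((4, 6), 120.0)
print(blend_overlap(a, b, overlap=3))
```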
NASA Astrophysics Data System (ADS)
Fanget, Alain
2009-06-01
Many authors claim that to understand the response of a propellant, specifically under quasi static and dynamic loading, the mesostructural morphology and the mechanical behaviour of each of its components have to be known. However the scale of the mechanical description of the behaviour of a propellant is relative to its heterogeneities and the wavelength of loading. The shorter it is, the more important the topological description of the material is. In our problems, involving the safety of energetic materials, the propellant can be subjected to a large spectrum of loadings. This presentation is divided into five parts. The first part describes the processes used to extract the information about the morphology of the meso-structure of the material and presents some results. The results, the difficulties and the perspectives for this part will be recalled. The second part determines the physical processes involved at this scale from experimental results. Taking into account the knowledge of the morphology, two ways have been chosen to describe the response of the material. One concerns the quasi static loading, the object of the third part, in which we show how we use the mesoscopic scale as a base of development to build constitutive models. The fourth part presents for low but dynamic loading the comparison between numerical analysis and experiments.
Feedback control in deep drawing based on experimental datasets
NASA Astrophysics Data System (ADS)
Fischer, P.; Heingärtner, J.; Aichholzer, W.; Hortig, D.; Hora, P.
2017-09-01
In large-scale production of deep drawing parts, as in the automotive industry, the effects of scattering material properties as well as warming of the tools have a significant impact on the drawing result. Within the scope of this work, an approach is presented to minimize the influence of these effects on part quality by optically measuring the draw-in of each part and adjusting the settings of the press to keep the strain distribution, which is represented by the draw-in, inside a certain limit. For the design of the control algorithm, a design of experiments for in-line tests is used to quantify the influence of the blank holder force as well as the force distribution on the draw-in. The results of this experimental dataset are used to model the process behavior. Based on this model, a feedback control loop is designed. Finally, the performance of the control algorithm is validated in the production line.
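A minimal sketch of such a draw-in feedback loop is given below. The linear process model, sensitivity, target value, gain, and force limits are all assumptions for illustration; the actual control law and press settings used in the study are not reproduced here.

```python
# Minimal sketch of a draw-in feedback loop: a linear process model identified
# from in-line experiments maps blank holder force to draw-in; each stroke the
# force is corrected proportionally to the measured draw-in error.
# All numbers (sensitivity, target, gain, force limits) are illustrative.

def simulate_press(force_kN: float) -> float:
    """Stand-in for the real press: draw-in decreases with holder force."""
    return 60.0 - 0.05 * force_kN          # mm, assumed linear behaviour

sensitivity = -0.05      # mm per kN, as identified from the DOE dataset (assumed)
target_draw_in = 42.0    # mm
force = 300.0            # kN, initial press setting
gain = 0.6               # fraction of the model-based correction applied per stroke

for stroke in range(5):
    draw_in = simulate_press(force)
    error = target_draw_in - draw_in
    force += gain * error / sensitivity    # model-inverting correction
    force = min(max(force, 100.0), 600.0)  # respect press limits
    print(f"stroke {stroke}: draw-in {draw_in:.2f} mm -> force {force:.1f} kN")
```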
Velasco, Veronica; Griffin, Kenneth W; Antichi, Mariella; Celata, Corrado
2015-10-01
Across developed countries, experimentation with alcohol, tobacco, and other drugs often begins in the early adolescent years. Several evidence-based programs have been developed to prevent adolescent substance use. Many of the most rigorously tested and empirically supported prevention programs were initially developed and tested in the United States. Increasingly, these interventions are being adopted for use in Europe and throughout the world. This paper reports on a large-scale comprehensive initiative designed to select, adapt, implement, and sustain an evidence-based drug abuse prevention program in Italy. As part of a large-scale regionally funded collaboration in the Lombardy region of Italy, we report on processes through which a team of stakeholders selected, translated and culturally adapted, planned, implemented and evaluated the Life Skills Training (LST) school-based drug abuse prevention program, an evidence-based intervention developed in the United States. We discuss several challenges and lessons learned and implications for prevention practitioners and researchers attempting to undertake similar international dissemination projects. We review several published conceptual models designed to promote the replication and widespread dissemination of effective programs, and discuss their strengths and limitations in the context of planning and implementing a complex, large-scale real-world dissemination effort. Copyright © 2015 Elsevier Ltd. All rights reserved.
Professional development in inquiry-based science for elementary teachers of diverse student groups
NASA Astrophysics Data System (ADS)
Lee, Okhee; Hart, Juliet E.; Cuevas, Peggy; Enders, Craig
2004-12-01
As part of a larger project aimed at promoting science and literacy for culturally and linguistically diverse elementary students, this study has two objectives: (a) to describe teachers' initial beliefs and practices about inquiry-based science and (b) to examine the impact of the professional development intervention (primarily through instructional units and teacher workshops) on teachers' beliefs and practices related to inquiry-based science. The research involved 53 third- and fourth-grade teachers at six elementary schools in a large urban school district. At the end of the school year, teachers reported enhanced knowledge of science content and stronger beliefs about the importance of science instruction with diverse student groups, although their actual practices did not change significantly. Based on the results of this first year of implementation as part of a 3-year longitudinal design, implications for professional development and further research are discussed.
The capital-asset-pricing model and arbitrage pricing theory: A unification
Khan, M. Ali; Sun, Yeneng
1997-01-01
We present a model of a financial market in which naive diversification, based simply on portfolio size and obtained as a consequence of the law of large numbers, is distinguished from efficient diversification, based on mean-variance analysis. This distinction yields a valuation formula involving only the essential risk embodied in an asset’s return, where the overall risk can be decomposed into a systematic and an unsystematic part, as in the arbitrage pricing theory; and the systematic component further decomposed into an essential and an inessential part, as in the capital-asset-pricing model. The two theories are thus unified, and their individual asset-pricing formulas shown to be equivalent to the pervasive economic principle of no arbitrage. The factors in the model are endogenously chosen by a procedure analogous to the Karhunen–Loéve expansion of continuous time stochastic processes; it has an optimality property justifying the use of a relatively small number of them to describe the underlying correlational structures. Our idealized limit model is based on a continuum of assets indexed by a hyperfinite Loeb measure space, and it is asymptotically implementable in a setting with a large but finite number of assets. Because the difficulties in the formulation of the law of large numbers with a standard continuum of random variables are well known, the model uncovers some basic phenomena not amenable to classical methods, and whose approximate counterparts are not already, or even readily, apparent in the asymptotic setting. PMID:11038614
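In conventional finite-factor notation (a sketch only; the paper itself works with a continuum of assets indexed by a hyperfinite Loeb measure space), the decomposition described above can be written as

```latex
\begin{align*}
  r_i &= \mu_i
        + \underbrace{\sum_{k \le K} \beta_{ik} f_k}_{\text{essential systematic}}
        + \underbrace{\sum_{k > K} \beta_{ik} f_k}_{\text{inessential systematic}}
        + \underbrace{\varepsilon_i}_{\text{unsystematic}}, \\
  \mu_i - r_0 &\approx \sum_{k \le K} \beta_{ik} \lambda_k
        \qquad \text{(no-arbitrage valuation using only the essential risk).}
\end{align*}
```

Here the first K factors, chosen by the Karhunen-Loève-type procedure, carry the essential risk, and no arbitrage ties expected excess returns to the essential betas alone.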
Milne, Roger L.; Herranz, Jesús; Michailidou, Kyriaki; Dennis, Joe; Tyrer, Jonathan P.; Zamora, M. Pilar; Arias-Perez, José Ignacio; González-Neira, Anna; Pita, Guillermo; Alonso, M. Rosario; Wang, Qin; Bolla, Manjeet K.; Czene, Kamila; Eriksson, Mikael; Humphreys, Keith; Darabi, Hatef; Li, Jingmei; Anton-Culver, Hoda; Neuhausen, Susan L.; Ziogas, Argyrios; Clarke, Christina A.; Hopper, John L.; Dite, Gillian S.; Apicella, Carmel; Southey, Melissa C.; Chenevix-Trench, Georgia; Swerdlow, Anthony; Ashworth, Alan; Orr, Nicholas; Schoemaker, Minouk; Jakubowska, Anna; Lubinski, Jan; Jaworska-Bieniek, Katarzyna; Durda, Katarzyna; Andrulis, Irene L.; Knight, Julia A.; Glendon, Gord; Mulligan, Anna Marie; Bojesen, Stig E.; Nordestgaard, Børge G.; Flyger, Henrik; Nevanlinna, Heli; Muranen, Taru A.; Aittomäki, Kristiina; Blomqvist, Carl; Chang-Claude, Jenny; Rudolph, Anja; Seibold, Petra; Flesch-Janys, Dieter; Wang, Xianshu; Olson, Janet E.; Vachon, Celine; Purrington, Kristen; Winqvist, Robert; Pylkäs, Katri; Jukkola-Vuorinen, Arja; Grip, Mervi; Dunning, Alison M.; Shah, Mitul; Guénel, Pascal; Truong, Thérèse; Sanchez, Marie; Mulot, Claire; Brenner, Hermann; Dieffenbach, Aida Karina; Arndt, Volker; Stegmaier, Christa; Lindblom, Annika; Margolin, Sara; Hooning, Maartje J.; Hollestelle, Antoinette; Collée, J. Margriet; Jager, Agnes; Cox, Angela; Brock, Ian W.; Reed, Malcolm W.R.; Devilee, Peter; Tollenaar, Robert A.E.M.; Seynaeve, Caroline; Haiman, Christopher A.; Henderson, Brian E.; Schumacher, Fredrick; Le Marchand, Loic; Simard, Jacques; Dumont, Martine; Soucy, Penny; Dörk, Thilo; Bogdanova, Natalia V.; Hamann, Ute; Försti, Asta; Rüdiger, Thomas; Ulmer, Hans-Ulrich; Fasching, Peter A.; Häberle, Lothar; Ekici, Arif B.; Beckmann, Matthias W.; Fletcher, Olivia; Johnson, Nichola; dos Santos Silva, Isabel; Peto, Julian; Radice, Paolo; Peterlongo, Paolo; Peissel, Bernard; Mariani, Paolo; Giles, Graham G.; Severi, Gianluca; Baglietto, Laura; Sawyer, Elinor; Tomlinson, Ian; Kerin, Michael; Miller, Nicola; Marme, Federik; Burwinkel, Barbara; Mannermaa, Arto; Kataja, Vesa; Kosma, Veli-Matti; Hartikainen, Jaana M.; Lambrechts, Diether; Yesilyurt, Betul T.; Floris, Giuseppe; Leunen, Karin; Alnæs, Grethe Grenaker; Kristensen, Vessela; Børresen-Dale, Anne-Lise; García-Closas, Montserrat; Chanock, Stephen J.; Lissowska, Jolanta; Figueroa, Jonine D.; Schmidt, Marjanka K.; Broeks, Annegien; Verhoef, Senno; Rutgers, Emiel J.; Brauch, Hiltrud; Brüning, Thomas; Ko, Yon-Dschun; Couch, Fergus J.; Toland, Amanda E.; Yannoukakos, Drakoulis; Pharoah, Paul D.P.; Hall, Per; Benítez, Javier; Malats, Núria; Easton, Douglas F.
2014-01-01
Part of the substantial unexplained familial aggregation of breast cancer may be due to interactions between common variants, but few studies have had adequate statistical power to detect interactions of realistic magnitude. We aimed to assess all two-way interactions in breast cancer susceptibility between 70 917 single nucleotide polymorphisms (SNPs) selected primarily based on prior evidence of a marginal effect. Thirty-eight international studies contributed data for 46 450 breast cancer cases and 42 461 controls of European origin as part of a multi-consortium project (COGS). First, SNPs were preselected based on evidence (P < 0.01) of a per-allele main effect, and all two-way combinations of those were evaluated by a per-allele (1 d.f.) test for interaction using logistic regression. Second, all 2.5 billion possible two-SNP combinations were evaluated using Boolean operation-based screening and testing, and SNP pairs with the strongest evidence of interaction (P < 10^−4) were selected for more careful assessment by logistic regression. Under the first approach, 3277 SNPs were preselected, but an evaluation of all possible two-SNP combinations (1 d.f.) identified no interactions at P < 10^−8. Results from the second analytic approach were consistent with those from the first (P > 10^−10). In summary, we observed little evidence of two-way SNP interactions in breast cancer susceptibility, despite the large number of SNPs with potential marginal effects considered and the very large sample size. This finding may have important implications for risk prediction, simplifying the modelling required. Further comprehensive, large-scale genome-wide interaction studies may identify novel interacting loci if the inherent logistic and computational challenges can be overcome. PMID:24242184
Workflow management in large distributed systems
NASA Astrophysics Data System (ADS)
Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.
2011-12-01
The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near realtime. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.
2007-09-01
simulation modeling approach to describing carbon-flow-based, ecophysiological processes and biomass dynamics of freshwater submersed aquatic plant...the distribution and abundance of SAV. In aquatic systems a small part of the irradiance can be reflected by the water surface, and further...to the fact that water temperatures in the lake were relatively low compared to air temperatures because of the large inflow of groundwater (Titus
NASA Technical Reports Server (NTRS)
Jeun, B. H.; Barger, G. L.
1977-01-01
A data base of synoptic meteorological information was compiled for the People's Republic of China, as an integral part of the Large Area Crop Inventory Experiment. A system description is provided, including hardware and software specifications, computation algorithms and an evaluation of output validity. Operations are also outlined, with emphasis placed on least squares interpolation.
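As a hedged illustration of least squares interpolation of scattered station observations to a target point (the report's actual algorithm, weighting, and polynomial order are not specified here), one can fit a low-order surface to nearby values and evaluate it at the desired location:

```python
import numpy as np

def ls_interpolate(x, y, values, x0, y0):
    """Fit a first-order surface v = a + b*x + c*y to nearby station values
    by least squares and evaluate it at (x0, y0). Purely illustrative of
    least-squares interpolation; not the LACIE algorithm itself."""
    A = np.column_stack([np.ones_like(x), x, y])
    coeff, *_ = np.linalg.lstsq(A, values, rcond=None)
    return coeff[0] + coeff[1] * x0 + coeff[2] * y0

# toy usage: four surrounding stations reporting temperature (deg C)
x = np.array([0.0, 1.0, 0.0, 1.0])
y = np.array([0.0, 0.0, 1.0, 1.0])
t = np.array([10.0, 12.0, 11.0, 13.5])
print(round(ls_interpolate(x, y, t, 0.5, 0.5), 2))
```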
Optimization of Regenerators for AMRR Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nellis, Gregory; Klein, Sanford; Brey, William
Active Magnetic Regenerative Refrigeration (AMRR) systems have no direct global warming potential or ozone depletion potential and hold the potential for providing refrigeration with efficiencies that are equal to or greater than the vapor compression systems used today. The work carried out in this project has developed and improved modeling tools that can be used to optimize and evaluate the magnetocaloric materials and geometric structure of the regenerator beds required for AMRR systems. There has been an explosion in the development of magnetocaloric materials for AMRR systems over the past few decades. The most attractive materials, based on the magnitude of the measured magnetocaloric effect, tend to also have large amounts of hysteresis. This project has provided for the first time a thermodynamically consistent method for evaluating these hysteretic materials in the context of an AMRR cycle. An additional, practical challenge that has been identified for AMRR systems is related to the participation of the regenerator wall in the cyclic process. The impact of housing heat capacity on both passive and active regenerative systems has been studied and clarified within this project. This report is divided into two parts corresponding to these two efforts. Part 1 describes the work related to modeling magnetic hysteresis, while Part 2 discusses the modeling of the heat capacity of the housing. A key outcome of this project is the development of a publicly available modeling tool that allows researchers to identify a truly optimal magnetocaloric refrigerant. Typically, the refrigeration potential of a magnetocaloric material is judged entirely on the magnitude of the magnetocaloric effect, while other properties of the material are deemed unimportant. This project has shown that a material with a large magnetocaloric effect (as evidenced, for example, by a large adiabatic temperature change) may not be optimal when it is accompanied by a large hysteresis. The trade-off between these various material properties and the proper design of an AMRR system can only be evaluated correctly using the comprehensive, physics-based model developed by this project. The development of these modeling tools and optimization studies will provide the knowledge base that is required to achieve transformational discoveries. The widespread adoption of AMRR technology will change the character of energy demand in this country and provide manufacturing jobs as well as employment associated with retrofitting existing HVAC&R applications.
Wireless gas detection with a smartphone via rf communication
Azzarelli, Joseph M.; Mirica, Katherine A.; Ravnsbæk, Jens B.; Swager, Timothy M.
2014-01-01
Chemical sensing is of critical importance to human health, safety, and security, yet it is not broadly implemented because existing sensors often require trained personnel, expensive and bulky equipment, and have large power requirements. This study reports the development of a smartphone-based sensing strategy that employs chemiresponsive nanomaterials integrated into the circuitry of commercial near-field communication tags to achieve non-line-of-sight, portable, and inexpensive detection and discrimination of gas-phase chemicals (e.g., ammonia, hydrogen peroxide, cyclohexanone, and water) at part-per-thousand and part-per-million concentrations. PMID:25489066
Computed Flow Through An Artificial Heart Valve
NASA Technical Reports Server (NTRS)
Rogers, Stewart E.; Kwak, Dochan; Kiris, Cetin; Chang, I-Dee
1994-01-01
Report discusses computations of blood flow through prosthetic tilting disk valve. Computational procedure developed in simulation used to design better artificial hearts and valves by reducing or eliminating following adverse flow characteristics: large pressure losses, which prevent hearts from working efficiently; separated and secondary flows, which cause clotting; and high turbulent shear stresses, which damage red blood cells. Report reiterates and expands upon part of NASA technical memorandum "Computed Flow Through an Artificial Heart and Valve" (ARC-12983). Also based partly on research described in "Numerical Simulation of Flow Through an Artificial Heart" (ARC-12478).
Integrating Information Technology into an Accounting Communication Class
ERIC Educational Resources Information Center
Vik, Gretchen N.
2007-01-01
In the accounting communication class, which includes both writing and making presentations, the article-based memo has always been the first assignment, in which students learn business formats and writing style, use of headings, audience analysis, and adapting material for different audiences. As part of a large project to revise the accounting…
A Standard Procedure for Conducting Cognitive Task Analysis.
ERIC Educational Resources Information Center
Redding, Richard E.
Traditional methods for task analysis have been largely based on the Instructional Systems Development (ISD) model, which is widely used throughout industry and the military. The first part of this document gives an overview of cognitive task analysis, which is conducted within the first phase of ISD. The following steps of cognitive task analysis…
NASA Technical Reports Server (NTRS)
Francois, J.
1981-01-01
The effects of airplane noise on the mental equilibrium of residents living near airports are discussed, based on population sample surveys involving health questionnaires and self-administered personality tests. Progressive changes were observed on the part of residents living near a large airport.
The Why's and How's of Integrating Downloadable Academic Ebooks
ERIC Educational Resources Information Center
Buckley, Matthew J.; Johnson, Melissa Maria
2013-01-01
There has been a noticeable divide the past few years within the library world regarding electronic books. Many academic libraries have been purchasing or leasing web-based academic ebooks for years. Most public libraries on the other hand (thanks in large part to services such as OverDrive) have directed their attention toward downloadable…
The Researcher's Role in the Renewal of Vocational Education.
ERIC Educational Resources Information Center
Asche, F. Marion
1986-01-01
The author argues that the potential effectiveness of researchers in the renewal of vocational education will depend in large measure on their ability to participate in the larger shift from total dependence on physical models of research and their ability to build new interdisciplinary models based in part on emerging practices in business and…
Teaching Team Invasion Games and Motivational Climate
ERIC Educational Resources Information Center
Gray, Shirley; Sproule, John; Morgan, Kevin
2009-01-01
Team invasion games (TIG) make up a large part of the PE curriculum in Scottish schools. It is important, therefore, to understand the environmental conditions that contribute to pupils' motivation to learn to play TIG. Consequently, this study aimed to identify the teaching behaviours exhibited when teaching TIG using a game-based approach and a…
Forest structure and development: implications for forest management
Kevin L. O' Hara
2004-01-01
A general premise of forest managers is that modern silviculture should be based, in large part, on natural disturbance patterns and species' adaptations to these disturbances. An understanding of forest stand dynamics is therefore a prerequisite to sound forest management. This paper provides a brief overview of forest stand development, stand structures, and...
Decentralized Control of a Large Space Structure as Applied to the CSDL 2 Model.
1982-12-01
and are arranged by their respective controller assignments. The individual modes may be identified by the imaginary parts of the eigenvalues, as...August 1958 to Phillip Z. and Hanako Y. Aldridge in Johnson Air Force Base, Japan. After a childhood in the United States, he returned to Japan
Experiences with Autonomy: Learners' Voices on Language Learning
ERIC Educational Resources Information Center
Kristmanson, Paula; Lafargue, Chantal; Culligan, Karla
2013-01-01
This article focuses on the experiences of Grade 12 students using a language portfolio based on the principles and guidelines of the European Language Portfolio (ELP) in their second language classes in a large urban high school. As part of a larger action-research project, focus group interviews were conducted to gather data related to…
ERIC Educational Resources Information Center
Barber, Wendy; King, Sherry
2016-01-01
Universities and institutions of higher education are facing economic pressures to sustain large classes, while simultaneously maintaining the quality of the online learning environment (Deming et al., 2015). Digital learning environments require significant pedagogical shifts on the part of the teacher. This paper is a qualitative examination of…
77 FR 67671 - Larry Elbert Perry, M.D.; Decision and Order
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-13
... Respondent does "not have authority to practice medicine or handle controlled substances in the State of... Request contended that the loss of his Kentucky authority was based, in large part, on a disciplinary action by the Tennessee Board of Medicine, and that an extension should be granted for "a reasonable...
NASA Technical Reports Server (NTRS)
Homemdemello, Luiz S.
1992-01-01
An assembly planner for tetrahedral truss structures is presented. To overcome the difficulties due to the large number of parts, the planner exploits the simplicity and uniformity of the shapes of the parts and the regularity of their interconnection. The planning automation is based on the computational formalism known as production system. The global data base consists of a hexagonal grid representation of the truss structure. This representation captures the regularity of tetrahedral truss structures and their multiple hierarchies. It maps into quadratic grids and can be implemented in a computer by using a two-dimensional array data structure. By maintaining the multiple hierarchies explicitly in the model, the choice of a particular hierarchy is only made when needed, thus allowing a more informed decision. Furthermore, testing the preconditions of the production rules is simple because the patterned way in which the struts are interconnected is incorporated into the topology of the hexagonal grid. A directed graph representation of assembly sequences allows the use of both graph search and backtracking control strategies.
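A minimal sketch of how a hexagonal grid can live in a two-dimensional array, with production-rule preconditions reduced to index lookups, is shown below; the grid size, occupancy test, and rule are hypothetical illustrations, not the planner's actual data structures.

```python
# Illustrative sketch of storing a hexagonal grid of truss nodes in a 2-D
# array ("offset" coordinates). Neighbor tests become simple index lookups,
# which is the kind of regularity a production-rule precondition can exploit.
# Grid size and the toy rule are hypothetical.

ROWS, COLS = 6, 8
grid = [[None for _ in range(COLS)] for _ in range(ROWS)]   # None = no node yet

def hex_neighbors(r, c):
    """In-bounds neighbor indices of cell (r, c) on an odd-row-offset hex grid."""
    if r % 2 == 0:
        offsets = [(-1, -1), (-1, 0), (0, -1), (0, 1), (1, -1), (1, 0)]
    else:
        offsets = [(-1, 0), (-1, 1), (0, -1), (0, 1), (1, 0), (1, 1)]
    return [(r + dr, c + dc) for dr, dc in offsets
            if 0 <= r + dr < ROWS and 0 <= c + dc < COLS]

def rule_applicable(r, c):
    """Toy production-rule precondition: a node may be placed at (r, c) only
    if at least two neighboring nodes are already assembled."""
    placed = sum(1 for nr, nc in hex_neighbors(r, c) if grid[nr][nc] is not None)
    return grid[r][c] is None and placed >= 2

grid[2][3] = "node"
grid[2][4] = "node"
print(hex_neighbors(3, 3), rule_applicable(3, 3))
```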
Asynchronous transfer mode distribution network by use of an optoelectronic VLSI switching chip.
Lentine, A L; Reiley, D J; Novotny, R A; Morrison, R L; Sasian, J M; Beckman, M G; Buchholz, D B; Hinterlong, S J; Cloonan, T J; Richards, G W; McCormick, F B
1997-03-10
We describe a new optoelectronic switching system demonstration that implements part of the distribution fabric for a large asynchronous transfer mode (ATM) switch. The system uses a single optoelectronic VLSI modulator-based switching chip with more than 4000 optical input-outputs. The optical system images the input fibers from a two-dimensional fiber bundle onto this chip. A new optomechanical design allows the system to be mounted in a standard electronic equipment frame. A large section of the switch was operated as a 208-Mbits/s time-multiplexed space switch, which can serve as part of an ATM switch by use of an appropriate out-of-band controller. A larger section with 896 input light beams and 256 output beams was operated at 160 Mbits/s as a slowly reconfigurable space switch.
Vikre, Peter G.; John, David A.; du Bray, Edward A.; Fleck, Robert J.
2015-09-25
Based on volcanic stratigraphy, geochronology, remnant paleosurfaces, and paleopotentiometric surfaces in mining districts and alteration zones, present landforms in the Bodie Hills volcanic field reflect incremental construction of stratovolcanoes and large- to small-volume flow-domes, magmatic inflation, and fault displacements. Landform evolution began with construction of the 15–13 Ma Masonic and 13–12 Ma Aurora volcanic centers in the northwestern and northeastern parts of the field, respectively. Smaller volcanoes erupted at ~11–10 Ma in, between, and south of these centers as erosional detritus accumulated north of the field in Fletcher Valley. Distally sourced, 9.7–9.3 Ma Eureka Valley Tuff filled drainages and depressions among older volcanoes and was partly covered by nearly synchronous eruptives during construction of four large 10–8 Ma volcanoes, in the southern part of the field. The lack of significant internal fault displacement, distribution of Eureka Valley Tuff, and elevation estimates derived from floras, suggest that the Bodie Hills volcanic field attained present elevations mostly through volcano construction and magmatic inflation, and that maximum paleoelevations (>8,500 ft) at the end of large volume eruptions at ~8 Ma are similar to present elevations.
Measurement and prediction of broadband noise from large horizontal axis wind turbine generators
NASA Technical Reports Server (NTRS)
Grosveld, F. W.; Shepherd, K. P.; Hubbard, H. H.
1995-01-01
A method is presented for predicting the broadband noise spectra of large wind turbine generators. It includes contributions from such noise sources as the inflow turbulence to the rotor, the interactions of the turbulent boundary layers on the blade surfaces with their trailing edges, and the wake due to a blunt trailing edge. The method is partly empirical and is based on acoustic measurements of large wind turbines and airfoil models. Spectra are predicted for several large machines including the proposed MOD-5B. Measured data are presented for the MOD-2, the WTS-4, the MOD-OA, and the U.S. Windpower Inc. machines. Good agreement is shown between the predicted and measured far field noise spectra.
Evaluation of Genetic Algorithm Concepts using Model Problems. Part 1; Single-Objective Optimization
NASA Technical Reports Server (NTRS)
Holst, Terry L.; Pulliam, Thomas H.
2003-01-01
A genetic-algorithm-based optimization approach is described and evaluated using a simple hill-climbing model problem. The model problem utilized herein allows for the broad specification of a large number of search spaces, including spaces with an arbitrary number of genes or decision variables and an arbitrary number of hills or modes. In the present study, only single-objective problems are considered. Results indicate that the genetic algorithm optimization approach is flexible in application and extremely reliable, providing optimal results for all problems attempted. The most difficult problems - those with large hyper-volumes and multi-mode search spaces containing a large number of genes - require a large number of function evaluations for GA convergence, but they always converge.
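A minimal single-objective GA of the kind described can be sketched as follows; the test landscape, population size, selection scheme, and mutation rate are illustrative assumptions rather than the settings used in the study.

```python
import math
import random

def fitness(genes):
    """Toy multi-modal landscape: each gene sees three 'hills' on [0, 1];
    the tallest hill, and hence the global optimum, is at 0.5 for every gene."""
    hills = [(0.2, 0.7), (0.5, 1.0), (0.8, 0.6)]            # (center, height)
    return sum(max(h * math.exp(-((g - c) ** 2) / 0.005) for c, h in hills)
               for g in genes)

def genetic_algorithm(n_genes=5, pop_size=40, generations=60,
                      mutation_rate=0.1, seed=0):
    random.seed(seed)
    pop = [[random.random() for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                       # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_genes)               # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n_genes):                         # per-gene mutation
                if random.random() < mutation_rate:
                    child[i] = random.random()
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = genetic_algorithm()
print([round(g, 3) for g in best], round(fitness(best), 3))
```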
Ohba, Hideo; Yamaguchi, Satoshi; Sadatomo, Takashi; Takeda, Masaaki; Kolakshyapati, Manish; Kurisu, Kaoru
2017-03-01
The first-line treatment of encephalocele is reduction of herniated structures. Large irreducible encephalocele entails resection of the lesion. In such cases, it is essential to ascertain preoperatively whether the herniated structure encloses critical venous drainage. Two cases of encephalocele presenting with a large occipital mass underwent magnetic resonance (MR) imaging. In the first case, the skin mass enclosed a broad space containing cerebrospinal fluid and a part of the occipital lobe and cerebellum. The second case had an occipital mass harboring a large portion of cerebrum enclosing a dilated ventricular space. Both cases had common venous anomalies such as a split superior sagittal sinus and a high-positioned torcular herophili. They underwent resection of the encephalocele without subsequent venous congestion. We could explain the pattern of venous anomalies in encephalocele based on normal developmental theory, which implies that major dural sinuses cannot herniate into the sac of an encephalocele. Irrespective of its size, an encephalocele can be resected safely at the neck without subsequent venous congestion.
Laghari, Samreen; Niazi, Muaz A
2016-01-01
Computer networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: first, a CABC-based modeling approach such as Agent-based Modeling can be an effective approach to modeling complex problems in the domain of IoT; second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.
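As a hedged illustration of such an exploratory agent-based model (the device types, usage-policy window, and emission factor below are invented for the sketch and are not the authors' architecture), agents can represent networked devices whose energy use, and hence carbon footprint, depends on a company-defined usage policy:

```python
EMISSION_FACTOR = 0.4  # kg CO2 per kWh (assumed grid average)

class DeviceAgent:
    """Toy agent: a networked device that idles in standby outside the
    company-defined usage window unless it is exempt (e.g., a server)."""
    def __init__(self, name, power_kw, exempt=False):
        self.name, self.power_kw, self.exempt = name, power_kw, exempt

    def energy_for_hour(self, hour, policy):
        active = self.exempt or policy["start"] <= hour < policy["end"]
        return self.power_kw if active else self.power_kw * 0.05  # standby draw

def simulate_day(agents, policy):
    kwh = sum(a.energy_for_hour(h, policy) for a in agents for h in range(24))
    return kwh, kwh * EMISSION_FACTOR

fleet = [DeviceAgent(f"pc{i}", 0.12) for i in range(50)]
fleet.append(DeviceAgent("server", 0.5, exempt=True))
kwh, co2 = simulate_day(fleet, policy={"start": 8, "end": 18})
print(f"{kwh:.1f} kWh/day, {co2:.1f} kg CO2/day")
```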
NASA Astrophysics Data System (ADS)
Miyazawa, Y.; Inoue, A.; Maruyama, A.
2013-12-01
Grassland within a caldera of Mt. Aso has been maintained for fertilizer production from grasses and for cattle feeding. Due to changes in the agricultural and social structure since the 1950s, a large part of the grassland was converted to plantations or abandoned to shrubland. Because vegetation of different plant functional types differs in evapotranspiration (ET), a research project was launched to examine the effects of the ongoing land use change on ET within the caldera, which consequently affects the surface and groundwater discharge of the region. As part of the project, the transpiration rate (E) of the three major forest types was investigated using sap flow measurements. Based on the measured data, stomatal conductance (Gs) was inversely calculated and its response to environmental factors was modeled using a Jarvis-type equation in order to estimate ET of a given part of the caldera from the plant functional type and weather data. The selected forests were a conifer plantation, a deciduous broadleaved plantation, and shrubland, which were installed with sap flow sensors to calculate stand-level transpiration rate. Sap flux (Js) did not show clear differences among sites despite the large differences in sapwood area. In early summer, solar radiation was limited to low levels due to frequent rainfall events and, therefore, Js was a function of solar radiation rather than of other environmental factors such as vapor pressure deficit and soil water content. Gs was well regressed with vapor pressure deficit and solar radiation. The estimated E based on the Gs model and the weather data was 0.3-1.2 mm day-1 for each site and was comparable to the E of grassland in other study sites. The results suggested that transpiration rate during the growing season did not differ between vegetation types, but its annual value is thought to differ due to their different phenology.
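A Jarvis-type conductance model of the kind referred to above multiplies a maximum conductance by independent response functions of the drivers; the sketch below uses hypothetical functional forms and parameter values, not those fitted in this study.

```python
import math

def jarvis_gs(Q, D, gs_max=0.25, a=0.005, d0=1.2):
    """Jarvis-type stomatal conductance (mol m-2 s-1): gs_max reduced by a
    light response f(Q) and a vapor-pressure-deficit response f(D).
    Functional forms and parameters are illustrative assumptions."""
    f_light = 1.0 - math.exp(-a * Q)        # Q: solar radiation, W m-2
    f_vpd = 1.0 / (1.0 + D / d0)            # D: vapor pressure deficit, kPa
    return gs_max * f_light * f_vpd

# example: overcast, humid conditions typical of the early-summer rainy season
print(round(jarvis_gs(Q=150.0, D=0.6), 4))
```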
Electronic management: Exploring its impact on small business
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bewayo, E.D.
Macworld magazine recently reported that more than one in five companies eavesdrops electronically on its employees. Electronic eavesdropping is one name given to electronic management. Besides being known as electronic eavesdropping, electronic management also goes by electronic monitoring, electronic supervision, electronic snooping, electronic sweat-shopping, electronic surveillance, electronic Big Brothering, and computerized performance monitoring. Some of these labels connote negative things about electronic management, and relate to applications of electronic management to extreme and unreasonable levels. In the rest of this paper the terms electronic management and electronic monitoring will be used interchangeably. In this paper we discuss the impacts of electronic management, positive and negative, on workplaces, with emphasis on small businesses. This small business emphasis is partly because of the author's research interests, and partly because most of what has been written on electronic management has been based on large business contexts. This large business bias has been partly due to the fact that the early inroads of electronic management were almost exclusively limited to large companies--beginning with telephone service observation in the late 1800s. However, now with the growing affordability and, consequently, the proliferation of electronic technology (especially the computer), electronic management is no longer the monopoly of large corporations. Electronic management has now reached restaurants, drug stores, liquor stores, convenience stores, and trucking companies. And in some industries, e.g., banking, every business, regardless of size, uses electronic monitoring.
Factors Influencing Retention Among Part-Time Clinical Nursing Faculty.
Carlson, Joanne S
This study sought to determine job characteristics influencing retention of part-time clinical nurse faculty teaching in pre-licensure nursing education. Large numbers of part-time faculty are needed to educate students in the clinical setting. Faculty retention helps maintain consistency and may positively influence student learning. A national sample of part-time clinical nurse faculty teaching in baccalaureate programs responded to a web-based survey. Respondents were asked to identify the primary reason for wanting or not wanting to continue working for a school of nursing (SON). Affinity for students; pay and benefits; and support and feeling valued were the top three reasons given for continuing to work at an SON. Conflicts with life and other job responsibilities, low pay, and workload were the top three reasons given for not continuing. Results from this study can assist nursing programs in finding strategies to help reduce attrition among part-time clinical faculty.
Parts and Components Reliability Assessment: A Cost Effective Approach
NASA Technical Reports Server (NTRS)
Lee, Lydia
2009-01-01
System reliability assessment is a methodology which incorporates reliability analyses performed at the parts and components level, such as Reliability Prediction, Failure Modes and Effects Analysis (FMEA), and Fault Tree Analysis (FTA), to assess risks and perform design tradeoffs, and therefore to ensure effective productivity and/or mission success. The system reliability is used to optimize the product design to accommodate today's mandated budget, manpower, and schedule constraints. Standard-based reliability assessment is an effective approach consisting of reliability predictions together with other reliability analyses for electronic, electrical, and electro-mechanical (EEE) complex parts and components of large systems, based on failure rate estimates published by the United States (U.S.) military or commercial standards and handbooks. Many of these standards are globally accepted and recognized. The reliability assessment is especially useful during the initial stages when the system design is still in development and hard failure data are not yet available or manufacturers are not contractually obliged by their customers to publish the reliability estimates/predictions for their parts and components. This paper presents a methodology to assess system reliability using parts and components reliability estimates to ensure effective productivity and/or mission success in an efficient manner, at low cost, and on a tight schedule.
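As a hedged illustration of handbook-style reliability prediction for a series system (the part names and failure rates below are placeholders, not values from any particular standard), part failure rates add and system reliability over a mission follows the constant-failure-rate model:

```python
import math

# Handbook-style failure-rate estimates in failures per million hours (FPMH).
# The part names and values are illustrative placeholders, not real data.
part_failure_rates_fpmh = {
    "microcontroller": 0.35,
    "dc_dc_converter": 0.80,
    "connector": 0.05,
    "relay": 1.20,
}

def series_system_reliability(rates_fpmh, mission_hours):
    """For a series system, part failure rates add; reliability over the
    mission is R(t) = exp(-lambda_total * t) under the constant-rate model."""
    lam_total = sum(rates_fpmh.values()) * 1e-6      # convert FPMH to per hour
    mtbf = 1.0 / lam_total
    return math.exp(-lam_total * mission_hours), mtbf

r, mtbf = series_system_reliability(part_failure_rates_fpmh, mission_hours=5000)
print(f"R(5000 h) = {r:.4f}, MTBF = {mtbf:,.0f} h")
```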
Smith predictor with sliding mode control for processes with large dead times
NASA Astrophysics Data System (ADS)
Mehta, Utkal; Kaya, İbrahim
2017-11-01
The paper discusses a Smith Predictor scheme with a Sliding Mode Controller (SP-SMC) for processes with large dead times. This technique gives improved load-disturbance rejection with optimal input control signal variations. A power-rate reaching law is incorporated in the discontinuous (switching) part of the sliding mode control so that the overall performance improves meaningfully. The proposed scheme obtains parameter values by satisfying a new performance index based on a bi-objective constraint. In a simulation study, the efficiency of the method is evaluated for robustness and transient performance against previously reported techniques.
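To make the power-rate reaching law concrete, here is a minimal numeric sketch of the reaching dynamics ds/dt = -k|s|^a sign(s) with 0 < a < 1. The gains and the first-order Euler integration are illustrative assumptions; this is not the paper's full SP-SMC design.

```python
import numpy as np

def power_rate_reaching(s0=1.0, k=2.0, alpha=0.5, dt=1e-3, t_end=2.0):
    """Integrate the power-rate reaching law  ds/dt = -k * |s|**alpha * sign(s).

    With 0 < alpha < 1 the sliding variable s reaches zero in finite time,
    faster when |s| is large and more gently near s = 0 (less chattering).
    Gains k and alpha here are illustrative, not taken from the paper.
    """
    n = int(t_end / dt)
    s = np.empty(n)
    s[0] = s0
    for i in range(1, n):
        s[i] = s[i - 1] - k * abs(s[i - 1]) ** alpha * np.sign(s[i - 1]) * dt
    return s

s = power_rate_reaching()
print("approximate reaching time:", np.argmax(np.abs(s) < 1e-3) * 1e-3, "s")
```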
The database design of LAMOST based on MYSQL/LINUX
NASA Astrophysics Data System (ADS)
Li, Hui-Xian; Sang, Jian; Wang, Sha; Luo, A.-Li
2006-03-01
The Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) will be set up in the coming years. A fully automated software system for reducing and analyzing the spectra has to be developed with the telescope. This database system is an important part of the software system. The requirements for the database of the LAMOST, the design of the LAMOST database system based on MYSQL/LINUX and performance tests of this system are described in this paper.
Quantum communication and information processing
NASA Astrophysics Data System (ADS)
Beals, Travis Roland
Quantum computers enable dramatically more efficient algorithms for solving certain classes of computational problems, but, in doing so, they create new problems. In particular, Shor's Algorithm allows for efficient cryptanalysis of many public-key cryptosystems. As public key cryptography is a critical component of present-day electronic commerce, it is crucial that a working, secure replacement be found. Quantum key distribution (QKD), first developed by C.H. Bennett and G. Brassard, offers a partial solution, but many challenges remain, both in terms of hardware limitations and in designing cryptographic protocols for a viable large-scale quantum communication infrastructure. In Part I, I investigate optical lattice-based approaches to quantum information processing. I look at details of a proposal for an optical lattice-based quantum computer, which could potentially be used for both quantum communications and for more sophisticated quantum information processing. In Part III, I propose a method for converting and storing photonic quantum bits in the internal state of periodically-spaced neutral atoms by generating and manipulating a photonic band gap and associated defect states. In Part II, I present a cryptographic protocol which allows for the extension of present-day QKD networks over much longer distances without the development of new hardware. I also present a second, related protocol which effectively solves the authentication problem faced by a large QKD network, thus making QKD a viable, information-theoretic secure replacement for public key cryptosystems.
NASA Astrophysics Data System (ADS)
Zingl, Manuel; Nuss, Martin; Bauernfeind, Daniel; Aichhorn, Markus
2018-05-01
Recently solvers for the Anderson impurity model (AIM) working directly on the real-frequency axis have gained much interest. A simple and yet frequently used impurity solver is exact diagonalization (ED), which is based on a discretization of the AIM bath degrees of freedom. Usually, the bath parameters cannot be obtained directly on the real-frequency axis, but have to be determined by a fit procedure on the Matsubara axis. In this work we present an approach where the bath degrees of freedom are first discretized directly on the real-frequency axis using a large number of bath sites (≈ 50). Then, the bath is optimized by unitary transformations such that it separates into two parts that are weakly coupled. One part contains the impurity site and its interacting Green's functions can be determined with ED. The other (larger) part is a non-interacting system containing all the remaining bath sites. Finally, the Green's function of the full AIM is calculated via coupling these two parts with cluster perturbation theory.
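The first step described above, discretizing the bath directly on the real-frequency axis into many sites, can be sketched as simple spectral-weight binning. The semicircular density of states and the equal-width binning below are assumptions for illustration; they are not the authors' exact discretization scheme.

```python
import numpy as np

def discretize_bath(omega, hyb_spectral, n_bath=50):
    """Discretize a bath hybridization function on the real-frequency axis.

    omega: frequency grid; hyb_spectral: the (non-negative) spectral density.
    Each bin is represented by one bath site with energy eps_k (weighted bin
    center) and coupling V_k = sqrt(bin weight). Simple binning, for illustration.
    """
    edges = np.linspace(omega[0], omega[-1], n_bath + 1)
    eps, V = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (omega >= lo) & (omega < hi)
        w = np.trapz(hyb_spectral[mask], omega[mask])   # spectral weight in the bin
        if w <= 0:
            continue
        eps.append(np.trapz(omega[mask] * hyb_spectral[mask], omega[mask]) / w)
        V.append(np.sqrt(w))
    return np.array(eps), np.array(V)

# Example: semicircular spectral density of half-bandwidth 1 (assumed shape).
w = np.linspace(-1, 1, 4001)
rho = np.sqrt(np.clip(1 - w**2, 0, None)) / (np.pi / 2)
eps_k, V_k = discretize_bath(w, rho, n_bath=50)
print(len(eps_k), "bath sites; total discretized weight =", round(np.sum(V_k**2), 4))
```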
Hosoya, Haruo; Hyvärinen, Aapo
2017-07-01
Experimental studies have revealed evidence of both parts-based and holistic representations of objects and faces in the primate visual system. However, it is still a mystery how such seemingly contradictory types of processing can coexist within a single system. Here, we propose a novel theory called mixture of sparse coding models, inspired by the formation of category-specific subregions in the inferotemporal (IT) cortex. We developed a hierarchical network that constructed a mixture of two sparse coding submodels on top of a simple Gabor analysis. The submodels were each trained with face or non-face object images, which resulted in separate representations of facial parts and object parts. Importantly, evoked neural activities were modeled by Bayesian inference, which had a top-down explaining-away effect that enabled recognition of an individual part to depend strongly on the category of the whole input. We show that this explaining-away effect was indeed crucial for the units in the face submodel to exhibit significant selectivity to face images over object images in a similar way to actual face-selective neurons in the macaque IT cortex. Furthermore, the model explained, qualitatively and quantitatively, several tuning properties to facial features found in the middle patch of face processing in IT as documented by Freiwald, Tsao, and Livingstone (2009). These included, in particular, tuning to only a small number of facial features that were often related to geometrically large parts like face outline and hair, preference and anti-preference of extreme facial features (e.g., very large/small inter-eye distance), and reduction of the gain of feature tuning for partial face stimuli compared to whole face stimuli. Thus, we hypothesize that the coding principle of facial features in the middle patch of face processing in the macaque IT cortex may be closely related to mixture of sparse coding models.
NASA Astrophysics Data System (ADS)
Goldberg, Fred; Price, Edward; Robinson, Stephen; Boyd-Harlow, Danielle; McKean, Michael
2012-06-01
We report on the adaptation of the small enrollment, lab and discussion based physical science course, Physical Science and Everyday Thinking (PSET), for a large-enrollment, lecture-style setting. Like PSET, the new Learning Physical Science (LEPS) curriculum was designed around specific principles based on research on learning to meet the needs of nonscience students, especially prospective and practicing elementary and middle school teachers. We describe the structure of the two curricula and the adaptation process, including a detailed comparison of similar activities from the two curricula and a case study of a LEPS classroom implementation. In LEPS, short instructor-guided lessons replace lengthier small group activities, and movies, rather than hands-on investigations, provide the evidence used to support and test ideas. LEPS promotes student peer interaction as an important part of sense making via “clicker” questions, rather than small group and whole class discussions typical of PSET. Examples of student dialog indicate that this format is capable of generating substantive student discussion and successfully enacting the design principles. Field-test data show similar student content learning gains with the two curricula. Nevertheless, because of classroom constraints, some important practices of science that were an integral part of PSET were not included in LEPS.
Programming secure mobile agents in healthcare environments using role-based permissions.
Georgiadis, C K; Baltatzis, J; Pangalos, G I
2003-01-01
The healthcare environment consists of vast amounts of dynamic and unstructured information distributed over a large number of information systems. Mobile agent technology is having an ever-growing impact on the delivery of medical information. It supports acquiring and manipulating information distributed across a large number of information systems, and it is moreover suitable for computer-untrained medical staff. But the introduction of mobile agents generates advanced threats to sensitive healthcare information unless the proper countermeasures are taken. By applying the role-based approach to the authorization problem, we ease the sharing of information between hospital information systems and we reduce the administration burden. Different ways of initiating the agent's migration result in different methods of assigning roles to the agent.
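A minimal sketch of the kind of role-based permission check described above is given here. The roles, permissions, and the idea of a visiting mobile agent being assigned a role at migration time are illustrative assumptions, not the authors' actual design.

```python
# Minimal role-based access control (RBAC) sketch with hypothetical roles.
ROLE_PERMISSIONS = {
    "physician": {"read_record", "write_record", "order_lab"},
    "nurse": {"read_record", "record_vitals"},
    "mobile_agent_query": {"read_record"},   # role granted to a visiting agent
}

def is_authorized(role: str, permission: str) -> bool:
    """Return True if the given role includes the requested permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

# A mobile agent arriving at a hospital information system is assigned a role
# depending on how its migration was initiated, then checked on every action.
print(is_authorized("mobile_agent_query", "read_record"))    # True
print(is_authorized("mobile_agent_query", "write_record"))   # False
```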
NASA Astrophysics Data System (ADS)
Li, Shimiao; Guo, Tong; Yuan, Lin; Chen, Jinping
2018-01-01
Surface topography measurement is an important tool widely used in many fields to determine the characteristics and functionality of a part or material. Among existing methods for this purpose, the focus variation method has demonstrated high performance, particularly in large-slope scenarios. However, its performance depends largely on the effectiveness of the focus function. This paper presents a method for surface topography measurement using a new focus measure function based on the dual-tree complex wavelet transform. Experiments were conducted on simulated defocused images to demonstrate its high performance in comparison with traditional approaches. The results showed that the new algorithm has better unimodality and sharpness. The method was also verified by measuring a MEMS micro-resonator structure.
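For readers unfamiliar with focus measures, the sketch below uses a simple variance-of-Laplacian measure as a stand-in to show the role such a function plays in focus variation. It is explicitly not the paper's dual-tree complex wavelet measure; a good measure should peak sharply and unimodally at the best-focused frame.

```python
import numpy as np

def focus_measure_laplacian(img: np.ndarray) -> float:
    """Stand-in focus measure: variance of a discrete Laplacian response."""
    lap = (-4 * img
           + np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return float(lap.var())

# Toy check: a blurred copy of an image should score lower than the sharp one.
rng = np.random.default_rng(0)
sharp = rng.random((64, 64))
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1)
           + np.roll(np.roll(sharp, 1, 0), 1, 1)) / 4
print(focus_measure_laplacian(sharp) > focus_measure_laplacian(blurred))  # True
```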
NASA Astrophysics Data System (ADS)
Ashchepkov, Igor; Ntaflos, Theodoros; Vladykin, Nikolai; Yudin, Denis; Prokopiev, Sergei; Salikhov, Ravil; Travin, Alexei; Kutnenko, Olga
2014-05-01
The Daldyn terrain includes two large kimberlite fields, Daldyn and Alakite, located in the eastern and western parts of this area, respectively, and dated to 367-355 Ma. Concentrates and mantle xenoliths from almost 50 kimberlite pipes allow reconstruction of P-T-X-fO2 sections and transects, as well as 3D images, for each kimberlite field and for the whole area. In general, the common division into six large layers holds for each part of the SCLM, but the compositions of the layers and the rock sequences differ. The Daldyn SCLM is built from an alternation of very cold (33 mW/m2) and relatively heated (37-40 mW/m2) large layers, while the Alakite SCLM is more uniform and colder in its lower part. The base of the SCLM is much more strongly heated in the Daldyn terrain and in most studied sections is represented by deformed or high-temperature porphyroclastic peridotites, which are rare in the Alakite field. The pyroxenite layer is thicker and more pronounced within the Daldyn SCLM. The amount of eclogites, and their Mg# in general, is also higher in the SCLM of the Daldyn field. The compositions of the peridotites are closer to abyssal MORB peridotites, while those from Alakite are in general more depleted and closer to a continental back-arc environment. However, the alkalinity of the pyroxenes and the abundance of metasomatic minerals such as phlogopite, richterite, and pargasite are much higher in the Alakite SCLM. The trace elements of primary peridotite Cpx from the Alakite SCLM reveal lower melting degrees in the Alakite field. The boundary between the fields, located between the Sytykanskaya and Zagadochnaya pipes, is characterized by upwelling of the SCLM base. The SCLM layering in the eastern part of the Daldyn field near the Zarnitsa and Dalnyaya pipes suggests inflection and is more permeable, allowing large-scale melt-fluid interaction and metasomatism. The NE part of the Alakite mantle SCLM, from Sytykan to Molodost and further to the Fainshteinovskaya pipe, is more fertile and consists of four evident units with Fe# rising upward. In the central part of the Alakite region, the pipes lying along the Yubileinaya-Aykhal line and the surrounding pipes in clusters represent the most depleted and relatively little-metasomatized dunite core. The thermal structure of the Alakite field at its base is relatively uniform and colder than in the Daldyn SCLM. The thermal gradients are steeper in the Alakite mantle. The metasomatic associations in the Daldyn field mark mainly an Archean history in the upper part (2.3-2.5 Ga) of the SCLM (Pokhilenko et al., 2012) and refer the influence of the protokimberlites to the lower part (Ashchepkov et al., 2013; Pokhilenko, Alifirova, 2012). A range of more recent events refers to the Rodinia history (0.7-1.2 Ga) (Pokhilenko et al., 2012), while our data for the Alakite SCLM mainly mark more recent events of disseminated abundant metasomatism in the lower and middle parts (1.3-0.6 Ga). RFBR 11-05-00060; 11-05-91060-PICS.
Domestic Violence between Same-Sex Partners: Implications for Counseling.
ERIC Educational Resources Information Center
Peterman, Linda M.; Dixon, Charlotte G.
2003-01-01
Discusses the dynamics of domestic violence between partners of the same sex. The social and cultural issues in the gay and lesbian communities play a large part in perpetuating the myths of domestic violence, which keeps the abuse hidden. This article is based on an extensive review of the literature and a clinical consensus among experts in the…
Molecular Electronic Shift Registers
NASA Technical Reports Server (NTRS)
Beratan, David N.; Onuchic, Jose N.
1990-01-01
Molecular-scale shift registers eventually constructed as parts of high-density integrated memory circuits. In principle, variety of organic molecules makes possible large number of different configurations and modes of operation for such shift-register devices. Several classes of devices and implementations in some specific types of molecules proposed. All based on transfer of electrons or holes along chains of repeating molecular units.
Behind the Veil: An In-Depth Exploration of Egyptian Muslim Women's Lives through Dance
ERIC Educational Resources Information Center
Toncy, Nancy
2008-01-01
Muslim women in Arabic countries have unique experiences that are shaped in large part by their cultures' beliefs regarding the female body. Mandated behaviors and men's attitudes towards women's role in society have likewise created oppressive situations which have affected women's sense of self. Because many of those experiences are body-based,…
A closed-loop control-loading system
NASA Technical Reports Server (NTRS)
Ashworth, B. R.; Parrish, R. V.
1979-01-01
Langley Differential Maneuvering Simulator (DMS) realistically simulates two aircraft operating in differential mode. It consists of two identical fixed-base cockpits and dome projection systems. Each projection system consists of sky/Earth projector and target-image generator and projector. Although programmable control forces are small part of overall system, they play large role in providing pilot with kinesthetic cues.
Automatic Gait Recognition for Human ID at a Distance
2004-11-01
at the modeling and understanding of human movement through image sequences. The ongoing interest in gait as a biometric is in large part the wider... 2.2 Model-Based Approaches... with Canonical Analysis (CA) [11]. At that stage, only one approach had used a model to analyze leg movement [12] as opposed to using human body shape
ERIC Educational Resources Information Center
Badilescu-Buga, Emil
2012-01-01
Learning Activity Management System (LAMS) has been trialled and used by users from many countries around the globe, but despite the positive attitude towards its potential benefits to pedagogical processes its adoption in practice has been uneven, reflecting how difficult it is to make a new technology based concept an integral part of the…
ERIC Educational Resources Information Center
Kanchewa, Stella S.; Rhodes, Jean E.; Schwartz, Sarah E. O.; Olsho, Lauren E. W.
2014-01-01
Although assigned mentoring relationships have typically involved same-gender matches, a growing number of programs, particularly those in schools, have begun pairing female mentors with male mentees. This practice stems, in large part, from the relative dearth of male mentors and programs' efforts to increase the availability of youth mentoring…
Accelerating NASA GN&C Flight Software Development
NASA Technical Reports Server (NTRS)
Tamblyn, Scott; Henry, Joel; Rapp, John
2010-01-01
When the guidance, navigation, and control (GN&C) system for the Orion crew vehicle undergoes Critical Design Review (CDR), more than 90% of the flight software will already be developed - a first for NASA on a project of this scope and complexity. This achievement is due in large part to a new development approach using Model-Based Design.
Perceptions of Educational Opportunities in Small Schools in Rural Australia and Canada
ERIC Educational Resources Information Center
Stevens, Ken
2009-01-01
Australia and Canada are large countries with small populations relative to their size, in which a not inconsiderable number of citizens live beyond major centres of population. In both resource-based economies, the provision of quality education in rural schools is an important part of the national social and economic infrastructure. This article…
The Effect of Gestational Age on Symptom Severity in Children with Autism Spectrum Disorder
ERIC Educational Resources Information Center
Movsas, Tammy Z.; Paneth, Nigel
2012-01-01
Between 2006 and 2010, two research-validated instruments, Social Communication Questionnaire (SCQ) and Social Responsiveness Scale (SRS) were filled out online by 4,188 mothers of Autism Spectrum Disorder (ASD) children, aged 4-21, as part of voluntary parental participation in a large web-based registry. Univariate and multivariate linear…
ERIC Educational Resources Information Center
National Center for Education Evaluation and Regional Assistance, 2014
2014-01-01
The Teacher Incentive Fund (TIF) provides grants to support performance-based compensation systems for teachers and principals in high-need schools. The study measures the impact of pay-for-performance bonuses as part of a comprehensive compensation system within a large, multisite random assignment study design. The treatment schools were to…
ERIC Educational Resources Information Center
Hilbig, Benjamin E.; Pohl, Rudiger F.
2009-01-01
According to part of the adaptive toolbox notion of decision making known as the recognition heuristic (RH), the decision process in comparative judgments--and its duration--is determined by whether recognition discriminates between objects. By contrast, some recently proposed alternative models predict that choices largely depend on the amount of…
ERIC Educational Resources Information Center
DiMino, John L.; Risler, Robin
2014-01-01
This article focuses on the experiences of predoctoral interns supervising the clinical work of less experienced externs in psychology and social work as part of a training program in a large university counseling center. After 4 years of running a relationally based supervision of supervision group, the authors believe that providing supervision…
The National Map - Geographic Names
,
2002-01-01
Governments depend on a common set of base geographic information as a tool for economic and community development, land and natural resource management, and health and safety services. Emergency management and homeland security applications rely on this information. Private industry, nongovernmental organizations, and individual citizens use the same geographic data. Geographic information underpins an increasingly large part of the Nation's economy.
The National Map - Orthoimagery
,
2002-01-01
Governments depend on a common set of base geographic information as a tool for economic and community development, land and natural resource management, and health and safety services. Emergency management and homeland security applications rely on this information. Private industry, nongovernmental organizations, and individual citizens use the same geographic data. Geographic information underpins an increasingly large part of the Nation's economy.
Electromagnetic Waves in a Uniform Gravitational Field and Planck's Postulate
ERIC Educational Resources Information Center
Acedo, Luis; Tung, Michael M.
2012-01-01
The gravitational redshift forms the central part of the majority of the classical tests for the general theory of relativity. It could be successfully checked even in laboratory experiments on the earth's surface. The standard derivation of this effect is based on the distortion of the local structure of spacetime induced by large masses. The…
ERIC Educational Resources Information Center
Frimberger, Katja
2016-01-01
The following article explores the conceptual background and pedagogical realities of establishing a well-being focussed language pedagogy in the context of an informal educational event called "Language Fest". The event was organised as part of the UK Arts and Humanities Research Council-funded large grant project "Researching…
Building Communities of Learners. A Collaboration among Teachers, Students, Families, and Community.
ERIC Educational Resources Information Center
McCaleb, Sudia Paloma
This book suggests an approach to education that includes students' family members as valuable citizens in a community of learners which also includes students, teachers, and other members of the community at large. Part 1 examines current trends in parental involvement and the hidden assumptions on which many such programs are based. It is argued…
Teaching Data Analysis with Interactive Visual Narratives
ERIC Educational Resources Information Center
Saundage, Dilal; Cybulski, Jacob L.; Keller, Susan; Dharmasena, Lasitha
2016-01-01
Data analysis is a major part of business analytics (BA), which refers to the skills, methods, and technologies that enable managers to make swift, quality decisions based on large amounts of data. BA has become a major component of Information Systems (IS) courses all over the world. The challenge for IS educators is to teach data analysis--the…
Attitudes towards the Euro: An Empirical Study Based on the German Socio-Economic Panel (SOEP)
ERIC Educational Resources Information Center
Isengard, Bettina; Schneider, Thorsten
2007-01-01
This paper investigates changing attitudes towards the euro over time in Germany using longitudinal micro-data from the German Socio Economic Panel Study. We observe that a large part of the German population was worried about the new currency both before and after its introduction. Social psychological theories provide insight into these…
Fossil fuel combustion results in the emission of greenhouse gases. Currently, the earth is experiencing unprecedented, human-induced changes in the atmosphere with consequent and threatening changes to its climate. These changes are due, in large part, to fossil fuel emissions.
,
2002-01-01
Governments depend on a common set of base geographic information as a tool for economic and community development, land and natural resource management, and health and safety services. Emergency management and homeland security applications rely on this information. Private industry, nongovernmental organizations, and individual citizens use the same geographic data. Geographic information underpins an increasingly large part of the Nation's economy.
The National Map - Hydrography
,
2002-01-01
Governments depend on a common set of base geographic information as a tool for economic and community development, land and natural resource management, and health and safety services. Emergency management and homeland security applications rely on this information. Private industry, nongovernmental organizations, and individual citizens use the same geographic data. Geographic information underpins an increasingly large part of the Nation's economy.
ERIC Educational Resources Information Center
Benjamin, Aaron S.
2010-01-01
It is widely assumed that older adults suffer a deficit in the psychological processes that underlie remembering of contextual or source information. This conclusion is based in large part on empirical interactions, including disordinal ones, that reveal differential effects of manipulations of memory strength on recognition in young and old…
Design and realization of the real-time spectrograph controller for LAMOST based on FPGA
NASA Astrophysics Data System (ADS)
Wang, Jianing; Wu, Liyan; Zeng, Yizhong; Dai, Songxin; Hu, Zhongwen; Zhu, Yongtian; Wang, Lei; Wu, Zhen; Chen, Yi
2008-08-01
A large Schmidt reflector telescope, the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST), is being built in China; it has an effective aperture of 4 meters and can observe the spectra of as many as 4000 objects simultaneously. To handle such a large number of observational objects, the dispersion part is composed of a set of 16 multipurpose fiber-fed double-beam Schmidt spectrographs, each of which has about ten movable components accommodated and manipulated in real time by a controller. An industrial Ethernet network connects those 16 spectrograph controllers. The light from stars is fed to the entrance slits of the spectrographs with optical fibers. In this paper, we mainly introduce the design and realization of our real-time controller for the spectrograph. Our design uses the System-on-a-Programmable-Chip (SOPC) technique based on a Field Programmable Gate Array (FPGA) and realizes the control of the spectrographs through the NIOS II soft-core embedded processor. We package the stepper motor controller as an intellectual property (IP) core and reuse it, greatly simplifying the design process and shortening the development time. Under the embedded operating system μC/OS-II, a multi-task control program has been written to realize real-time control of the movable parts of the spectrographs. At present, a number of such controllers have been applied in the spectrographs of LAMOST.
Manufacture of large glass honeycomb mirrors. [for astronomical telescopes
NASA Technical Reports Server (NTRS)
Angel, J. R. P.; Hill, J. M.
1982-01-01
The problem of making very large glass mirrors for astronomical telescopes is examined, and the advantages of honeycomb mirrors made of borosilicate glass are discussed. Thermal gradients in the glass that degrade the figure of thick borosilicate mirrors during use can be largely eliminated in a honeycomb structure by internal ventilation (in air) or careful control of the radiation environment (in space). It is expected that ground-based telescopes with honeycomb mirrors will give better images than those with solid mirrors. Materials, techniques, and the experience that has been gained making trial mirrors and test castings as part of a program to develop 8-10-m-diameter lightweight mirrors are discussed.
Large antenna measurement and compensation techniques
NASA Technical Reports Server (NTRS)
Rahmatsamii, Y.
1989-01-01
Antennas in the range of 20 meters or larger will be an integral part of future satellite communication and scientific payloads. In order to commercially use these large, low-sidelobe and multiple-beam antennas, a high level of confidence must be established as to their performance in the 0-g and space environment. It is also desirable to compensate for slowly varying surface distortions which could result from thermal effects. An overview of recent advances in performing rf measurements on large antennas is presented, with emphasis given to the application of a space-based far-field range utilizing the Space Shuttle. The concept of surface distortion compensation is discussed by providing numerical and measurement results.
Dohm, J.M.; Ferris, J.C.; Baker, V.R.; Anderson, R.C.; Hare, T.M.; Strom, R.G.; Barlow, N.G.; Tanaka, K.L.; Klemaszewski, J.E.; Scott, D.H.
2001-01-01
Paleotopographic reconstructions based on a synthesis of published geologic information and high-resolution topography, including topographic profiles, reveal the potential existence of an enormous drainage basin/aquifer system in the eastern part of the Tharsis region during the Noachian Period. Large topographic highs formed the margin of the gigantic drainage basin. Subsequently, lavas, sediments, and volatiles partly infilled the basin, resulting in an enormous and productive regional aquifer. The stacked sequences of water-bearing strata were then deformed locally and, in places, exposed by magmatic-driven uplifts, tectonic deformation, and erosion. This basin model provides a potential source of water necessary to carve the large outflow channel systems of the Tharsis and surrounding regions and to contribute to the formation of putative northern-plains ocean(s) and/or paleolakes. Copyright 2001 by the American Geophysical Union.
Expanding roles in a library-based bioinformatics service program: a case study
Li, Meng; Chen, Yi-Bu; Clintworth, William A
2013-01-01
Question: How can a library-based bioinformatics support program be implemented and expanded to continuously support the growing and changing needs of the research community? Setting: A program at a health sciences library serving a large academic medical center with a strong research focus is described. Methods: The bioinformatics service program was established at the Norris Medical Library in 2005. As part of program development, the library assessed users' bioinformatics needs, acquired additional funds, established and expanded service offerings, and explored additional roles in promoting on-campus collaboration. Results: Personnel and software have increased along with the number of registered software users and use of the provided services. Conclusion: With strategic efforts and persistent advocacy within the broader university environment, library-based bioinformatics service programs can become a key part of an institution's comprehensive solution to researchers' ever-increasing bioinformatics needs. PMID:24163602
Iavindrasana, Jimison; Depeursinge, Adrien; Ruch, Patrick; Spahni, Stéphane; Geissbuhler, Antoine; Müller, Henning
2007-01-01
The diagnostic and therapeutic processes, as well as the development of new treatments, are hindered by the fragmentation of the information which underlies them. In a multi-institutional research study database, the clinical information system (CIS) contains the primary data input. An important part of the budget of large-scale clinical studies is often spent on data creation and maintenance. The objective of this work is to design a decentralized, scalable, reusable database architecture with lower maintenance costs for managing and integrating the distributed heterogeneous data required as the basis for a large-scale research project. Technical and legal aspects are taken into account based on various use case scenarios. The architecture contains four layers: data storage and access are decentralized at their production source, a connector acts as a proxy between the CIS and the external world, an information mediator serves as a data access point, and the client side completes the stack. The proposed design will be implemented at the six clinical centers participating in the @neurIST project as part of a larger system for data integration and reuse for aneurysm treatment.
Kalan, Katja; Ivovic, Vladimir; Glasnovic, Peter; Buzan, Elena
2017-11-07
In Slovenia, two invasive mosquito species are present, Aedes albopictus (Skuse, 1895) (Diptera: Culicidae) and Aedes japonicus (Theobald, 1901) (Diptera: Culicidae). In this study, we examined their actual distribution and suitable habitats for new colonizations. Data from survey of species presence in 2013 and 2015, bioclimatic variables and altitude were used for the construction of predictive maps. We produced various models in Maxent software and tested two bioclimatic variable sets, WorldClim and CHELSA. For the variable selection of A. albopictus modeling we used statistical and expert knowledge-based approach, whereas for A. j. japonicus we used only a statistically based approach. The best performing models for both species were chosen according to AIC score-based evaluation. In 2 yr of sampling, A. albopictus was largely confined to the western half of Slovenia, whereas A. j. japonicus spread significantly and can be considered as an established species in a large part of the country. Comparison of models with WorldClim and CHELSA variables for both species showed models with CHELSA variables as a better tool for prediction. Finally, we validated the models performance in predicting distribution of species according to collected field data. Our study confirms that both species are co-occurring and are sympatric in a large part of the country area. The tested models could be used for future prevention of invasive mosquitoes spreading in other countries with similar bioclimatic conditions. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
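The AIC-based choice among candidate models described above can be illustrated with a small sketch. It uses a logistic regression on synthetic presence/background data as a stand-in for Maxent, and hypothetical covariate sets in place of the WorldClim/CHELSA variables; only the AIC comparison step is the point.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic presence/background records with two hypothetical covariate sets.
n = 500
X_set_a = rng.normal(size=(n, 3))   # e.g., temperature, precipitation, altitude
X_set_b = rng.normal(size=(n, 5))   # a larger candidate variable set
y = (X_set_a[:, 0] + 0.5 * X_set_a[:, 1] + rng.normal(scale=1.0, size=n) > 0).astype(int)

def aic_of(X, y):
    """Fit a simple presence/background logistic model and return its AIC."""
    result = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
    return result.aic

print("AIC, variable set A:", round(aic_of(X_set_a, y), 1))
print("AIC, variable set B:", round(aic_of(X_set_b, y), 1))
# The candidate with the lower AIC would be preferred, mirroring the
# AIC score-based evaluation used to choose among the Maxent models.
```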
[25 years of the DRG-based health-financing system in Hungary].
Babarczy, Balázs; Gyenes, Péter; Imre, László
2015-07-19
After a thorough development phase, a new system of health financing was introduced in Hungary in 1993. One of the cornerstones of the system was the financing of acute hospital care through Diagnosis-Related Groups (DRGs). This method was part of a comprehensive healthcare model, elaborated and published around 1990 by experts of Gyógyinfok, a public institute. The health financing system that was finally introduced reflected in large part this theoretical model, while the current Hungarian system differs from it in some important respects. The objective of this article is to identify these points of divergence.
Christianson, J B
1988-01-01
Publicly funded programs that increase the use of formal community-based care by the elderly could cause less reliance on informal care. The effect of channeling on informal caregiving was examined using data collected from frail elderly and from their primary caregivers. The findings suggest some withdrawal from caregiving on the part of neighbors and friends during the demonstration. Overall, however, these reductions were not large relative to the increased use of formal community-based services. PMID:3130333
The Application of Design to Cost and Life Cycle to Aircraft Engines.
1980-05-01
appearing in both columns include AGE (common and peculiar), transportation, management, and training. These cost elements are not usually large in... [flattened table fragment listing cost elements: procurement of installed engines, CIP, spare engines, spare parts (base/depot), depot labor, base labor, ECPs-mod/retro, AGE (peculiar/common)] ...introduced randomly within a framework of assumptions. Moreover, the engines or subassemblies, given the tracking of their age and of their...
A soft actuation system for segmented reflector articulation and isolation
NASA Technical Reports Server (NTRS)
Agronin, Michael L.; Jandura, Louise
1990-01-01
Segmented reflectors have been proposed for space based applications such as optical communication and large diameter telescopes. An actuation system for mirrors in a space based segmented mirror array was developed as part of NASA's Precision Segmented Reflector program. The actuation system, called the Articulated Panel Module (APM), provides 3 degrees of freedom mirror articulation, gives isolation from structural motion, and simplifies space assembly of the mirrors to the reflector backup truss. A breadboard of the APM was built and is described.
Handling missing values in the MDS-UPDRS.
Goetz, Christopher G; Luo, Sheng; Wang, Lu; Tilley, Barbara C; LaPelle, Nancy R; Stebbins, Glenn T
2015-10-01
This study was undertaken to define the number of missing values permissible to render valid total scores for each Movement Disorder Society Unified Parkinson's Disease Rating Scale (MDS-UPDRS) part. To handle missing values, imputation strategies serve as guidelines to reject an incomplete rating or create a surrogate score. We tested a rigorous, scale-specific, data-based approach to handling missing values for the MDS-UPDRS. From two large MDS-UPDRS datasets, we sequentially deleted item scores, either consistently (same items) or randomly (different items) across all subjects. Lin's Concordance Correlation Coefficient (CCC) compared scores calculated without missing values with prorated scores based on sequentially increasing missing values. The maximal number of missing values retaining a CCC greater than 0.95 determined the threshold for rendering a valid prorated score. A second confirmatory sample was selected from the MDS-UPDRS international translation program. To provide valid part scores applicable across all Hoehn and Yahr (H&Y) stages when the same items are consistently missing, one missing item from Part I, one from Part II, three from Part III, but none from Part IV can be allowed. To provide valid part scores applicable across all H&Y stages when random item entries are missing, one missing item from Part I, two from Part II, seven from Part III, but none from Part IV can be allowed. All cutoff values were confirmed in the validation sample. These analyses are useful for constructing valid surrogate part scores for MDS-UPDRS when missing items fall within the identified threshold and give scientific justification for rejecting partially completed ratings that fall below the threshold. © 2015 International Parkinson and Movement Disorder Society.
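The prorating idea above can be sketched as a small rule: accept a part rating only if its number of missing items is at or below the reported threshold, then scale the observed total up to the full item count. The per-part item counts and the proration-by-mean rule below are assumptions for illustration; the paper defines the exact procedure and the distinct thresholds for consistently versus randomly missing items.

```python
# Thresholds follow the "same items consistently missing" case reported above.
MAX_MISSING_SAME_ITEMS = {"I": 1, "II": 1, "III": 3, "IV": 0}
N_ITEMS = {"I": 13, "II": 13, "III": 33, "IV": 6}   # MDS-UPDRS item counts per part

def prorated_part_score(part: str, item_scores: list) -> float | None:
    """Return a prorated part score, or None if the rating must be rejected."""
    if len(item_scores) != N_ITEMS[part]:
        return None
    observed = [s for s in item_scores if s is not None]
    n_missing = len(item_scores) - len(observed)
    if n_missing > MAX_MISSING_SAME_ITEMS[part]:
        return None   # too many missing items: reject as invalid
    # Prorate: scale the observed total up to the full number of items.
    return sum(observed) * N_ITEMS[part] / len(observed)

# Example: a Part III rating with two missing items (allowed, threshold is 3).
scores = [2, 1, 0, 3, None, 1, 2, 0, 1, None, 2, 1, 0, 1, 2, 3, 0, 1, 1, 2,
          0, 1, 2, 1, 0, 3, 1, 2, 0, 1, 2, 1, 0]
print(prorated_part_score("III", scores))
```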
Genotype Specification Language.
Wilson, Erin H; Sagawa, Shiori; Weis, James W; Schubert, Max G; Bissell, Michael; Hawthorne, Brian; Reeves, Christopher D; Dean, Jed; Platt, Darren
2016-06-17
We describe here the Genotype Specification Language (GSL), a language that facilitates the rapid design of large and complex DNA constructs used to engineer genomes. The GSL compiler implements a high-level language based on traditional genetic notation, as well as a set of low-level DNA manipulation primitives. The language allows facile incorporation of parts from a library of cloned DNA constructs and from the "natural" library of parts in fully sequenced and annotated genomes. GSL was designed to engage genetic engineers in their native language while providing a framework for higher level abstract tooling. To this end we define four language levels, Level 0 (literal DNA sequence) through Level 3, with increasing abstraction of part selection and construction paths. GSL targets an intermediate language based on DNA slices that translates efficiently into a wide range of final output formats, such as FASTA and GenBank, and includes formats that specify instructions and materials such as oligonucleotide primers to allow the physical construction of the GSL designs by individual strain engineers or an automated DNA assembly core facility.
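To illustrate the part-library idea in plain Python (this is not GSL syntax, and the part names and sequences are hypothetical placeholders), the sketch below pulls named parts from a library, concatenates them in a specified order, and emits a FASTA record, mirroring the translation from an abstract design to a concrete output format.

```python
# Hypothetical part library; sequences are truncated placeholders, not real DNA.
PART_LIBRARY = {
    "pTDH3":  "ATGACCA",   # promoter
    "gERG10": "ATGTCTC",   # open reading frame
    "tADH1":  "GCGAATT",   # terminator
}

def assemble(design_name: str, part_names: list) -> str:
    """Concatenate parts in order and return a FASTA-formatted record."""
    sequence = "".join(PART_LIBRARY[name] for name in part_names)
    return f">{design_name} parts={';'.join(part_names)}\n{sequence}\n"

print(assemble("demo_cassette", ["pTDH3", "gERG10", "tADH1"]))
```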
Advanced Extraction of Spatial Information from High Resolution Satellite Data
NASA Astrophysics Data System (ADS)
Pour, T.; Burian, J.; Miřijovský, J.
2016-06-01
In this paper the authors processed five satellite images of five different Middle European cities taken by five different sensors. The aim of the paper was to find methods and approaches leading to the evaluation and extraction of spatial data from areas of interest. For this reason, the data were first pre-processed using image fusion, mosaicking and segmentation processes. The results going into the next step were two polygon layers: the first representing single objects and the second representing city blocks. In the second step, the polygon layers were classified and exported into Esri shapefile format. Classification was partly hierarchical expert-based and partly based on the SEaTH tool, used for separability assessment and thresholding. Final results, along with visual previews, were attached to the original thesis. Results are evaluated visually and statistically in the last part of the paper. In the discussion, the authors describe the difficulties of working with data of large size, taken by different sensors and differing thematically.
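The separability-and-threshold step can be illustrated with a small sketch. It computes a Jeffries-Matusita separability for one feature between two classes and a decision threshold where the two fitted Gaussian densities intersect, in the spirit of SEaTH; the Gaussian assumption, the numeric intersection search, and the sample data are assumptions for illustration, not the tool's exact implementation.

```python
import numpy as np

def jm_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Jeffries-Matusita separability of one feature between two classes,
    assuming each class is Gaussian in that feature."""
    m1, m2 = a.mean(), b.mean()
    v1, v2 = a.var(), b.var()
    bhat = (0.25 * (m1 - m2) ** 2 / (v1 + v2)
            + 0.5 * np.log((v1 + v2) / (2 * np.sqrt(v1 * v2))))
    return 2.0 * (1.0 - np.exp(-bhat))

def gaussian_threshold(a: np.ndarray, b: np.ndarray) -> float:
    """Threshold where the two fitted Gaussian densities intersect
    (searched numerically between the class means; a closed form also exists)."""
    grid = np.linspace(min(a.mean(), b.mean()), max(a.mean(), b.mean()), 10_000)
    pdf = lambda x, m, v: np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)
    diff = pdf(grid, a.mean(), a.var()) - pdf(grid, b.mean(), b.var())
    return float(grid[np.argmin(np.abs(diff))])

# Hypothetical per-segment feature samples (e.g., mean NIR) for two classes.
rng = np.random.default_rng(1)
buildings, vegetation = rng.normal(0.3, 0.05, 400), rng.normal(0.6, 0.08, 400)
print("JM separability:", round(jm_distance(buildings, vegetation), 3))
print("threshold:", round(gaussian_threshold(buildings, vegetation), 3))
```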
Educating for ethical leadership through web-based coaching.
Eide, Tom; Dulmen, Sandra van; Eide, Hilde
2016-12-01
Ethical leadership is important for developing ethical healthcare practice. However, there is little research-based knowledge on how to stimulate and educate for ethical leadership. The aim was to develop and investigate the feasibility of a 6-week web-based, ethical leadership educational programme and learn from participants' experience. Training programme and research design: A training programme was developed consisting of (1) a practice part, where the participating middle managers developed and ran an ethics project in their own departments aiming at enhancing the ethical mindfulness of the organizational culture, and (2) a web-based reflection part, including online reflections and coaching while executing the ethics project. Focus group interviews were used to explore the participants' experiences with and the feasibility of the training. Participants and research context: Nine middle managers were recruited from a part-time master's programme in leadership in Oslo, Norway. The research context was the participating leaders' work situation during the 6 weeks of training. Ethical considerations: Participation was voluntary, data anonymized and the confidentiality of the participating leaders/students and their institutions maintained. No patient or medical information was involved. Eight of the nine recruited leaders completed the programme. They evaluated the training programme as efficient and supportive, with the written, situational feedback/coaching as the most important element, enhancing reflection and motivation, counteracting a feeling of loneliness and promoting the execution of change. The findings seem consistent with the basic assumptions behind the educational design, based partly on e-health research, feedback studies and organizational ethics methodology, partly on theories on workplace learning, reflection, recognition and motivation. The training programme seems feasible. It should be adjusted according to participants' proposals and tested further in a large-scale study.
Hydrogeology of the western part of the Salt River Valley area, Maricopa County, Arizona
Brown, James G.; Pool, D.R.
1989-01-01
The Salt River Valley is a major population and agricultural center of more than 3,000 mi2 in central Arizona (fig. 1). The western part of the Salt River Valley area (area of this report) covers about 1,500 mi2. The Phoenix metropolitan area, with a population of more than 1.6 million in 1985 (Valley National Bank, 1987), is located within the valley. The watersheds of the Salt, Verde, and Agua Fria Rivers provide the valley with a reliable but limited surface-water supply that must be augmented with ground water even in years of plentiful rainfall. Large-scale ground-water withdrawals began in the Salt River Valley in the early part of the 20th century; between 1915 and 1983, the total estimated ground-water pumpage was 81 million acre-ft (U.S. Geological Survey, 1984). Because of the low average annual rainfall and high potential evapotranspiration, the principal sources of ground-water recharge are urban runoff, excess irrigation, canal seepage, and surface-water flows during years of higher-than-normal rainfall. Withdrawals greatly exceed recharge and, in some areas, ground-water levels have declined as much as 350 ft (Laney and others, 1978; Ross, 1978). In the study area, ground-water declines of more than 300 ft have occurred in Deer Valley and from Luke Air Force Base north to Beardsley. As a result, a large depression of the water table has developed west of Luke Air Force Base (fig. 2). Ground-water use has decreased in recent years because precipitation and surface-water supplies have been greater than normal. Increased precipitation also caused large quantities of runoff to be released into the normally dry Salt and Gila River channels. From February 1978 to June 1980, streamflow losses of at least 90,000 acre-ft occurred between Jointhead Dam near the east boundary of the study area and Gillespie Dam several miles southwest of the west edge of the study area (Mann and Rhone, 1983). Consequently, ground-water declines in a large part of the basin have slowed, and ground-water levels in some areas have risen significantly. In many areas along the Salt River and northeast of the confluence of the Salt and Agua Fria Rivers, ground-water levels rose more than 25 ft between 1978 and 1984 (Reeter and Remick, 1986).
Code of Federal Regulations, 2014 CFR
2014-04-01
... for Sale at United States Boat Shows C Appendix C to Part 113 Customs Duties U.S. CUSTOMS AND BORDER... Appendix C to Part 113—Bond for Deferral of Duty on Large Yachts Imported for Sale at United States Boat Shows Bond for Deferral of Duty on Large Yachts Imported for Sale at United States Boat Shows ____, as...
Code of Federal Regulations, 2013 CFR
2013-04-01
... for Sale at United States Boat Shows C Appendix C to Part 113 Customs Duties U.S. CUSTOMS AND BORDER... Appendix C to Part 113—Bond for Deferral of Duty on Large Yachts Imported for Sale at United States Boat Shows Bond for Deferral of Duty on Large Yachts Imported for Sale at United States Boat Shows ____, as...
Code of Federal Regulations, 2012 CFR
2012-04-01
... for Sale at United States Boat Shows C Appendix C to Part 113 Customs Duties U.S. CUSTOMS AND BORDER... Appendix C to Part 113—Bond for Deferral of Duty on Large Yachts Imported for Sale at United States Boat Shows Bond for Deferral of Duty on Large Yachts Imported for Sale at United States Boat Shows ____, as...
Ground-water resources of Olmsted Air Force Base, Middletown, Pennsylvania
Meisler, Harold; Longwill, Stanley Miller
1961-01-01
Olmsted Air Force Base is underlain by the Gettysburg shale of Triassic age. The Gettysburg shale at the Air Force Base consists of interbedded red sandstone, siltstone, and shale. The average strike of the strata is N. 43° E., and the strata dip to the northwest at an average angle of 26°. The transmissibility of known aquifers in the warehouse area of the Air Force Base is low. Therefore, wells in the warehouse area have low specific capacities and yield only small supplies of water. Wells on the main base, however, yield relatively large supplies of water because the transmissibilities of the aquifers are relatively high. Pumping tests in the warehouse area and the eastern area of the main base indicated the presence of impermeable boundaries in both areas. Pumping tests in the central and western parts of the main base revealed that the Susquehanna River probably is acting as a source of recharge (forms a recharge boundary) for wells in those areas. Data obtained during this investigation indicate that additional supplies of ground water for Olmsted Air Force Base could best be obtained from the western part of the main base.
Terminological reference of a knowledge-based system: the data dictionary.
Stausberg, J; Wormek, A; Kraut, U
1995-01-01
The development of open and integrated knowledge bases makes new demands on the definition of the terminology used. The definition should be realized in a data dictionary separated from the knowledge base. Within the work done on a reference model of medical knowledge, a data dictionary has been developed and used in different applications: a term definition shell, a documentation tool, and a knowledge base. The data dictionary includes that part of the terminology which is largely independent of a particular knowledge model. For that reason, the data dictionary can be used as a basis for integrating knowledge bases into information systems, for knowledge sharing and reuse, and for the modular development of knowledge-based systems.
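A minimal sketch of the separation described above: terms are defined once in a dictionary, and knowledge-base rules refer to them only by name, so several applications can share the same definitions. The term names, attributes, and rule format are hypothetical, chosen only to illustrate the idea.

```python
# Data dictionary: term definitions kept separate from the knowledge base.
DATA_DICTIONARY = {
    "body_temperature": {"type": "float", "unit": "degC", "range": (30.0, 45.0)},
    "fever":            {"type": "bool", "definition": "body_temperature > 38.0"},
}

# Knowledge base: rules reference dictionary terms by name, not by local
# re-definitions, so a documentation tool and an expert system can share them.
KNOWLEDGE_BASE = [
    {"if": ["fever"], "then": "consider_infection_workup"},
]

def validate_rule_terms(rules, dictionary):
    """Check that every term used in the knowledge base is defined in the dictionary."""
    missing = {t for r in rules for t in r["if"] if t not in dictionary}
    return missing or "all terms defined"

print(validate_rule_terms(KNOWLEDGE_BASE, DATA_DICTIONARY))
```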
2016-01-01
Background: Computer networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices, ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. Purpose: It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. Method: We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. Results: The conducted experiments demonstrated two important results. First, a CABC-based modeling approach such as Agent-based Modeling can be an effective approach to modeling complex problems in the domain of IoT. Second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach. PMID:26812235
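In the spirit of exploratory agent-based modeling, the sketch below treats each networked computer as an agent that either follows or ignores a company usage policy (power off when idle) and aggregates a daily carbon footprint. The compliance rate, power draws, and emission factor are illustrative assumptions, not values from the study.

```python
import random

POLICY_COMPLIANCE = 0.7          # fraction of agents following the usage policy (assumed)
IDLE_WATTS, ACTIVE_WATTS = 60, 150
KG_CO2_PER_KWH = 0.4             # illustrative grid emission factor

class ComputerAgent:
    def __init__(self):
        self.compliant = random.random() < POLICY_COMPLIANCE

    def energy_kwh(self, active_hours=8, idle_hours=16):
        # Compliant agents power off when idle, so idle consumption is zero.
        idle = 0 if self.compliant else idle_hours * IDLE_WATTS
        return (active_hours * ACTIVE_WATTS + idle) / 1000.0

def daily_footprint(n_agents=1000):
    agents = [ComputerAgent() for _ in range(n_agents)]
    return sum(a.energy_kwh() for a in agents) * KG_CO2_PER_KWH

random.seed(0)
print(f"Estimated daily footprint: {daily_footprint():.1f} kg CO2")
```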
Study on the high-frequency laser measurement of slot surface difference
NASA Astrophysics Data System (ADS)
Bing, Jia; Lv, Qiongying; Cao, Guohua
2017-10-01
In view of the need to measure the slot surface difference in large-scale mechanical assembly processes, and based on high-frequency laser scanning technology and the laser detection imaging principle, this paper designs a double-galvanometer pulsed laser scanning system. The laser probe scanning system architecture consists of three parts: a laser ranging part, a mechanical scanning part, and a data acquisition and processing part. The laser ranging part uses a high-frequency laser rangefinder to measure distance information over the target shape, producing a large amount of point cloud data. The mechanical scanning part includes a high-speed rotary table, a high-speed transit, and the related structural design, so that the whole system can perform three-dimensional laser scanning of the target along the designed scanning path. The data processing part is built mainly around FPGA hardware together with LabVIEW software, to process the point cloud data collected by the laser rangefinder at high speed and to fit the point cloud data, establishing a three-dimensional model of the target and thus realizing laser scanning imaging.
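The step from rangefinder distances plus scan angles to a 3D point cloud is a standard spherical-to-Cartesian conversion; a minimal sketch follows. The angle conventions and the flat-target example are assumptions for illustration, not the paper's calibration model.

```python
import numpy as np

def scan_to_point_cloud(ranges, azimuth_deg, elevation_deg):
    """Convert rangefinder distances plus the two scan angles into Cartesian
    point-cloud coordinates (angle conventions here are an assumption)."""
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    x = ranges * np.cos(el) * np.cos(az)
    y = ranges * np.cos(el) * np.sin(az)
    z = ranges * np.sin(el)
    return np.column_stack([x, y, z])

# Hypothetical scan of a flat face 2 m away, swept over a small angular window.
az, el = np.meshgrid(np.linspace(-5, 5, 50), np.linspace(-2, 2, 20))
r = 2.0 / (np.cos(np.radians(az)) * np.cos(np.radians(el)))   # planar target at x = 2 m
cloud = scan_to_point_cloud(r.ravel(), az.ravel(), el.ravel())
print(cloud.shape)   # (1000, 3)
```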
ERIC Educational Resources Information Center
Geluso, Joe
2013-01-01
Usage-based theories of language learning suggest that native speakers of a language are acutely aware of formulaic language due in large part to frequency effects. Corpora and data-driven learning can offer useful insights into frequent patterns of naturally occurring language to second/foreign language learners who, unlike native speakers, are…
ERIC Educational Resources Information Center
Hof, Barbara
2018-01-01
After the Sputnik shock of 1957, the United States initiated education reform, based in part on the hope that technology could facilitate efficient school learning. This development was largely driven by the confrontation between the eastern and western Blocs: on both sides of the Iron Curtain, reformists promoted educational technology for the…
Matt Bumgardner; Robert J. Bush; Cynthia D. West
2001-01-01
Previous research has shown that substantial yield improvements are possible when character-marks are not removed from hardwood furniture parts. Attempts to promote increased use of character-marked wood in furniture should be based on an understanding of how design concepts originate and move through the stages of product development. However, very little has...
ERIC Educational Resources Information Center
Goedegebuure, Leo, Ed.; And Others
This book is the result of a research project on the most important principles, structural features, and functionalities of higher education policies in 11 developed nations around the world. Reports on each nation are based in large part on analysis of responses to a common questionnaire by national experts in each nation. An opening chapter,…
Spathodea campanulata P. Beauv.
K.F. Connor; J.K. Francis
2002-01-01
S. campanulata is a medium-sized tree that commonly reaches a height of 21 m; however, in some parts of West Africa, it may reach 30 m. Heart and butt rots are common in trees older than 20 to 25 years that have suffered mechanical or fire damage. In Hawaii, large trees form narrow buttresses at the base. It grows naturally in the secondary forests...
Implementation at the School Building Level: The Development and Analysis of Nine Mini-Case Studies.
ERIC Educational Resources Information Center
Hall, Gene; And Others
As part of a district-wide longitudinal study of the implementation of a science curriculum innovation, researchers developed case studies of a sample of nine elementary schools in the Jefferson County School District, a large suburban system in Colorado. The study applied the Concerns-Based Adoption Model, which assumes that change is carried out…
ERIC Educational Resources Information Center
Cocchiarella, Fabrizio; Booth, Paul
2015-01-01
This article presents the findings of a cross-disciplinary project between BA (Hons) Interior Design, Creative Multimedia and Film and Media Studies at a large Metropolitan University in the North of England. The collaboration was part of Unit X, a faculty-wide credit-bearing initiative to enable better collaboration across art and design courses.…
Ecological Correlates of Spanish Adolescents' Physical Activity during Physical Education Classes
ERIC Educational Resources Information Center
Molina-García, Javier; Queralt, Ana; Estevan, Isaac; Sallis, James F.
2016-01-01
The public health benefit of school physical education (PE) depends in large part on physical activity (PA) provided during class. According to the literature, PE has a valuable role in public health, and PA levels during PE classes depend on a wide range of factors. The main objective of this study, based on ecological models of behaviour, was to…
Weighing In--Healthy at Any Size?
ERIC Educational Resources Information Center
Jackson, Camille
2012-01-01
It's easy for overweight children to feel singled out and shamed about their body size, at home and at school. Experts say children can easily interpret even the well-intentioned "war on childhood obesity," meant to promote health, to mean a war on their bodies and on them. Size-based stigma stems in large part from the myth that being fat is a…
ERIC Educational Resources Information Center
Blau, Ina; Hameiri, Mira
2017-01-01
Digital educational data management has become an integral part of school practices. Accessing school database by teachers, students, and parents from mobile devices promotes data-driven educational interactions based on real-time information. This paper analyses mobile access of educational database in a large sample of 429 schools during an…
Remilling of salvaged wood siding coated with lead-based paint. Part 2, Wood product yield
John J. Janowiak; Robert H. Falk; Brian W. Beakler; Richard G. Lampo; Thomas R. Napier
2005-01-01
Many U.S. military buildings being targeted for removal contain large quantities of potentially reusable wood materials. In this study, we evaluated approximately 2180 m (7,152 ft) of painted Douglas-fir siding salvaged from U.S. Army barracks. Utilizing a conventional woodworking molder, we evaluated the feasibility of producing several standardized wood product...
Data Storage Hierarchy Systems for Data Base Computers
1979-08-01
…with very large capacity and small access time. As part of the INFOPLEX research effort, this thesis is focused on the study of high performance, highly…
ERIC Educational Resources Information Center
Abi-Nader, Jeannette
This report is based on an ethnographic study of a multicultural "college prep" program catering to minority students. The program was part of the elective bilingual education offering at a large urban high school and had an 11-year history of successfully graduating Hispanic high school students and sending at least 65% of them on to college. The…
Active Galaxies Educational Unit: An Educator's Guide with Activities in Science and Mathematics.
ERIC Educational Resources Information Center
National Aeronautics and Space Administration, Washington, DC.
As a part of its educational effort, the National Aeronautics and Space Administration (NASA) Education and Public Outreach group at Sonoma State University (SSU) has put together a series of activities based on the science of one of NASA's exciting space missions, the Gamma-ray Large Area Space Telescope (GLAST). GLAST is a NASA satellite planned…
ERIC Educational Resources Information Center
Harvey, Francis A.
This paper describes the evolution and development of an intelligent information system, i.e., a knowledge base for steel structures being undertaken as part of the Technical Information Center for Steel Structures at Lehigh University's Center of Advanced Technology for Large Structural Systems (ATLSS). The initial development of the Technical…
ERIC Educational Resources Information Center
Max, Jeffrey; Constantine, Jill; Wellington, Alison; Hallgren, Kristin; Glazerman, Steven; Chiang, Hanley; Speroni, Cecilia
2014-01-01
The Teacher Incentive Fund (TIF) provides grants to support performance-based compensation systems for teachers and principals in high-need schools. The study measures the impact of pay-for-performance bonuses as part of a comprehensive compensation system within a large, multisite random assignment study design. The treatment schools were to…
ERIC Educational Resources Information Center
Chiang, Hanley; Wellington, Alison; Hallgren, Kristin; Speroni, Cecilia; Herrmann, Mariesa; Glazerman, Steven; Constantine, Jill
2015-01-01
The Teacher Incentive Fund (TIF) provides grants to support performance-based compensation systems for teachers and principals in high-need schools. The study measures the impact of pay-for-performance bonuses as part of a comprehensive compensation system within a large, multisite random assignment study design. The treatment schools were to…
The End of Need-Based Student Financial Aid in Canada?
ERIC Educational Resources Information Center
Junor, Sean; Usher, Alex
2007-01-01
There was a major change in Canadian student aid in the late 1990s, due largely to a package of measures adopted by the Government of Canada as part of its "Canada Opportunities Strategy". At the time, what aroused the most comment was the creation in 1998 of the $2.5 billion Canada Millennium Scholarship Foundation (Foundation). But while…
Automation and hypermedia technology applications
NASA Technical Reports Server (NTRS)
Jupin, Joseph H.; Ng, Edward W.; James, Mark L.
1993-01-01
This paper presents a progress report on HyLite (Hypermedia Library technology): a research and development activity to produce a versatile system as part of NASA's technology thrusts in automation, information sciences, and communications. HyLite can be used as a system or tool to facilitate the creation and maintenance of large distributed electronic libraries. The contents of such a library may be software components, hardware parts or designs, scientific data sets or databases, configuration management information, etc. The proliferation of computer use has made the diversity and quantity of information too large for any single user to sort, process, and utilize effectively. In response to this information deluge, we have created HyLite to enable the user to organize relevant information more efficiently for presentation, retrieval, and readability. To accomplish this end, we have incorporated various AI techniques into the HyLite hypermedia engine to enhance the system's search, retrieval, and navigation capabilities. The proposed techniques include intelligent searching tools for the libraries, intelligent retrievals, and navigational assistance based on user histories. HyLite itself is based on an earlier project, the Encyclopedia of Software Components (ESC), which used hypermedia to facilitate and encourage software reuse.
White, William T; Ebert, David A; Naylor, Gavin J P; Ho, Hsuan-Ching; Clerkin, Paul; Veríssimo, Ana; Cotton, Charles F
2013-01-01
The genus Centrophorus is one of the most taxonomically complex and confusing elasmobranch groups. A revision of this group is currently underway and this first paper sets an important foundation in this process by redescribing the type species of the genus--Centrophorus granulosus. This taxon name has been previously applied to two different morphotypes: a large species > 1.5 m TL and a smaller species ~1 m TL. Centrophorus acus and C. niaukang are the most commonly used names applied to the larger morphotype. The original description of C. granulosus was based on a large specimen of ~1.5 m TL, but subsequent redescriptions were based on either of the large or small morphotypes. Centrophorus granulosus is herein redescribed as a large species and a neotype is designated. Centrophorus acus and C. niaukang are found to be junior synonyms of C. granulosus. Centrophorus granulosus is distinguishable from its congeners by its large size, dermal denticle shape, colouration and a number of morphological and biological characteristics. Ontogenetic changes in morphology, dentition and denticle shape for this species are described in detail.
STT Doubles with Large Delta M - Part VII: Andromeda, Pisces, Auriga
NASA Astrophysics Data System (ADS)
Knapp, Wilfried; Nanson, John
2017-01-01
The results of visual double star observing sessions suggested a pattern of STT doubles with large DM being harder to resolve than would be expected based on the WDS catalog data. It was felt this might be a problem with expectations on one hand, and on the other might be an indication of a need for new precise measurements, so we decided to take a closer look at a selected sample of STT doubles and do some research. As with the other objects covered so far, several of the components show parameters quite different from the current WDS data.
The Optical Gravitational Lensing Experiment. Eclipsing Binary Stars in the Large Magellanic Cloud
NASA Astrophysics Data System (ADS)
Wyrzykowski, L.; Udalski, A.; Kubiak, M.; Szymanski, M.; Zebrun, K.; Soszynski, I.; Wozniak, P. R.; Pietrzynski, G.; Szewczyk, O.
2003-03-01
We present the catalog of 2580 eclipsing binary stars detected in a 4.6 square degree area of the central parts of the Large Magellanic Cloud. The photometric data were collected during the second phase of the OGLE microlensing search from 1997 to 2000. The eclipsing objects were selected with an automatic search algorithm based on an artificial neural network. Basic statistics of the eclipsing stars are presented. Also, a list of 36 candidate detached eclipsing binaries for spectroscopic study and for precise LMC distance determination is provided. The full catalog is accessible from the OGLE Internet archive.
Large-pitch kagome-structured hollow-core photonic crystal fiber
NASA Astrophysics Data System (ADS)
Couny, F.; Benabid, F.; Light, P. S.
2006-12-01
We report the fabrication and characterization of a new type of hollow-core photonic crystal fiber based on large-pitch (˜12μm) kagome lattice cladding. The optical characteristics of the 19-cell, 7-cell, and single-cell core defect fibers include broad optical transmission bands covering the visible and near-IR parts of the spectrum with relatively low loss and low chromatic dispersion, no detectable surface modes and high confinement of light in the core. Various applications of such a novel fiber are also discussed, including gas sensing, quantum optics, and high harmonic generation.
NASA Astrophysics Data System (ADS)
Wu, Yanhui; Han, Mangui; Liu, Tao; Deng, Longjiang
2015-07-01
The effective permittivity of composites containing Fe-Cu-Nb-Si-B nanocrystalline micro flakes has been studied within 0.5-10 GHz. Obvious differences in microwave permittivity have been observed between composites consisting of large flakes (size range: 23-111 μm, average thickness: 4.5 μm) and small flakes (size range: 3-21 μm, average thickness: 1.3 μm). Both the real and imaginary parts of the permittivity of the large-flake composite are much larger than those of the small-flake composite at a given frequency, and the permittivity of the large-flake composite decreases faster with increasing frequency. These differences in the permittivity spectra have been explained from the perspective of interfacial polarization and ac conductivity. The assumption of a more extensive ohmic contact interface between the large flakes and the matrix has been validated by the fittings and the calculated percolation threshold. Meanwhile, the permeability spectra of both composites have also been studied using a Lorentzian dispersion law. The broadened spectra can be attributed to the distribution of magnetic anisotropy fields of the two ferromagnetic phases in the particles. Finally, the composite containing the small flakes exhibits better electromagnetic wave absorption properties.
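To make the Lorentzian dispersion analysis referred to above concrete, the sketch below evaluates a single-term Lorentzian model of the complex permeability, μ(f) = 1 + Δμ·f_r²/(f_r² − f² + i·β·f), over the 0.5-10 GHz band; the parameter values are illustrative assumptions, not the fitted values from the paper.

```python
import numpy as np

def lorentzian_permeability(f_ghz, delta_mu, f_res_ghz, damping_ghz):
    """Single-term Lorentzian dispersion for the complex relative permeability.

    mu(f) = 1 + delta_mu * f_res^2 / (f_res^2 - f^2 + 1j * damping * f)
    All frequencies are in GHz; the parameter values used below are illustrative only.
    """
    f = np.asarray(f_ghz, dtype=float)
    return 1.0 + delta_mu * f_res_ghz**2 / (f_res_ghz**2 - f**2 + 1j * damping_ghz * f)

if __name__ == "__main__":
    freqs = np.linspace(0.5, 10.0, 200)               # GHz, matching the measured band
    mu = lorentzian_permeability(freqs, delta_mu=2.0, f_res_ghz=1.5, damping_ghz=2.5)
    # mu.real is the real part (mu'); -mu.imag is the magnetic loss (mu'')
    print(f"mu'({freqs[0]:.1f} GHz) = {mu.real[0]:.2f}, mu'' = {-mu.imag[0]:.2f}")
```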
Code of Federal Regulations, 2014 CFR
2014-07-01
Table 1 to Subpart HHH of Part 62 (Protection of Environment)—Emission Limits for Small Rural, Small, Medium, and Large HMIWI: for each air pollutant, the table gives the emission limit that must be met by HMIWI size (small rural, small, medium, large), with the listed units referenced to 7 percent oxygen…
Code of Federal Regulations, 2013 CFR
2013-07-01
Table 1 to Subpart HHH of Part 62 (Protection of Environment)—Emissions Limits for Small Rural, Small, Medium and Large HMIWI: for each air pollutant, the table gives the emissions limit that must be met by HMIWI size (small rural, small, medium, large), with the listed units referenced to 7 percent oxygen…
Overcoming limitations of model-based diagnostic reasoning systems
NASA Technical Reports Server (NTRS)
Holtzblatt, Lester J.; Marcotte, Richard A.; Piazza, Richard L.
1989-01-01
The development of a model-based diagnostic system to overcome the limitations of model-based reasoning systems is discussed. It is noted that model-based reasoning techniques can be used to analyze the failure behavior and diagnosability of system and circuit designs as part of the system process itself. One goal of current research is the development of a diagnostic algorithm which can reason efficiently about large numbers of diagnostic suspects and can handle both combinational and sequential circuits. A second goal is to address the model-creation problem by developing an approach for using design models to construct the GMODS model in an automated fashion.
Formalism Challenges of the Cougaar Model Driven Architecture
NASA Technical Reports Server (NTRS)
Bohner, Shawn A.; George, Boby; Gracanin, Denis; Hinchey, Michael G.
2004-01-01
The Cognitive Agent Architecture (Cougaar) is one of the most sophisticated distributed agent architectures developed today. As part of its research and evolution, Cougaar is being studied for application to large, logistics-based applications for the Department of Defense (DoD). Anticipating future complex applications of Cougaar, we are investigating the Model Driven Architecture (MDA) approach to understand how effective it would be for increasing productivity in Cougaar-based development efforts. Recognizing the sophistication of the Cougaar development environment and the limitations of transformation technologies for agents, we have systematically developed an approach that combines component assembly in the large and transformation in the small. This paper describes some of the key elements that went into the Cougaar Model Driven Architecture approach and the characteristics that drove the approach.
Amplification of Angular Rotations Using Weak Measurements
NASA Astrophysics Data System (ADS)
Magaña-Loaiza, Omar S.; Mirhosseini, Mohammad; Rodenburg, Brandon; Boyd, Robert W.
2014-05-01
We present a weak measurement protocol that permits a sensitive estimation of angular rotations based on the concept of weak-value amplification. The shift in the state of a pointer, in both angular position and the conjugate orbital angular momentum bases, is used to estimate angular rotations. This is done by an amplification of both the real and imaginary parts of the weak-value of a polarization operator that has been coupled to the pointer, which is a spatial mode, via a spin-orbit coupling. Our experiment demonstrates the first realization of weak-value amplification in the azimuthal degree of freedom. We have achieved effective amplification factors as large as 100, providing a sensitivity that is on par with more complicated methods that employ quantum states of light or extremely large values of orbital angular momentum.
A NEW THREE-DIMENSIONAL SOLAR WIND MODEL IN SPHERICAL COORDINATES WITH A SIX-COMPONENT GRID
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feng, Xueshang; Zhang, Man; Zhou, Yufen, E-mail: fengx@spaceweather.ac.cn
In this paper, we introduce a new three-dimensional magnetohydrodynamics numerical model to simulate the steady state ambient solar wind from the solar surface to 215 R_s or beyond, and the model adopts a splitting finite-volume scheme based on a six-component grid system in spherical coordinates. By splitting the magnetohydrodynamics equations into a fluid part and a magnetic part, a finite volume method can be used for the fluid part and a constrained-transport method able to maintain the divergence-free constraint on the magnetic field can be used for the magnetic induction part. This new second-order model in space and time is validated when modeling the large-scale structure of the solar wind. The numerical results for Carrington rotation 2064 show its ability to produce structured solar wind in agreement with observations.
Kupek, Emil; de Assis, Maria Alice A
2016-09-01
External validation of food recall over 24 h in schoolchildren is often restricted to eating events in schools and is based on direct observation as the reference method. The aim of this study was to estimate the dietary intake out of school, and consequently the bias in such a research design based on only part-time validated food recall, using multiple imputation (MI) conditioned on the information on child age, sex, BMI, family income, parental education and the school attended. The previous-day, web-based questionnaire WebCAAFE, structured as six meals/snacks and thirty-two foods/beverages, was answered by a sample of 7-11-year-old Brazilian schoolchildren (n 602) from five public schools. Food/beverage intake recalled by children was compared with the records provided by trained observers during school meals. Sensitivity analysis was performed with artificial data emulating those recalled by children on WebCAAFE in order to evaluate the impact of both differential and non-differential bias. Estimated bias was within a ±30 % interval for 84·4 % of the thirty-two foods/beverages evaluated in WebCAAFE, and half of the latter reached statistical significance (P<0·05). Rarely (<3 %) consumed dietary items were often under-reported (fish/seafood, vegetable soup, cheese bread, French fries), whereas some of those most frequently reported (meat, bread/biscuits, fruits) showed large overestimation. Compared with the analysis restricted to fully validated data, MI reduced differential bias in sensitivity analysis but the bias still remained large in most cases. MI provided a suitable statistical framework for the part-time validation design of dietary intake over six daily eating events.
NASA Astrophysics Data System (ADS)
Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry
2015-11-01
In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
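As a small, generic illustration of the Fourier-descriptor shape features used in the recognition step above (not the authors' implementation), translation-, scale-, and rotation-invariant descriptors can be computed from a closed 2D contour as follows; the contour here is a synthetic stand-in for a parenchyma outline.

```python
import numpy as np

def fourier_descriptors(contour_xy, n_descriptors=10):
    """Compute translation-, scale-, and rotation-invariant Fourier descriptors
    from a closed 2D contour given as an (N, 2) array of (x, y) points."""
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]    # complex contour representation
    coeffs = np.fft.fft(z)
    coeffs[0] = 0.0                                 # drop DC term -> translation invariance
    mags = np.abs(coeffs)                           # magnitudes -> rotation/start-point invariance
    mags /= mags[1]                                 # normalize by first harmonic -> scale invariance
    return mags[2:2 + n_descriptors]

if __name__ == "__main__":
    # Hypothetical elliptical contour standing in for an organ cross-section outline.
    t = np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False)
    contour = np.column_stack([30.0 * np.cos(t) + 50.0, 18.0 * np.sin(t) + 40.0])
    print(fourier_descriptors(contour).round(4))
```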
Pithovirus sibericum, a new bona fide member of the "Fourth TRUC" club.
Sharma, Vikas; Colson, Philippe; Chabrol, Olivier; Pontarotti, Pierre; Raoult, Didier
2015-01-01
Nucleocytoplasmic large DNA viruses, or representatives of the proposed order Megavirales, include giant viruses of Acanthamoeba that were discovered over the last 12 years and are bona fide microbes. Phylogenies based on a few genes conserved amongst these megaviruses and shared by microbes classified as Eukarya, Bacteria, and Archaea allowed for the delineation of a fourth monophylogenetic group or "TRUC" (Things Resisting Uncompleted Classification) composed of the Megavirales representatives. A new Megavirales member named Pithovirus sibericum was isolated from a Siberian permafrost sample dated at >30,000 years old. This virion is as large as recently described pandoraviruses but has a genome that is approximately three to four times shorter. Our objective was to update the classification of P. sibericum as a new member of the "Fourth TRUC" club. Phylogenetic trees were constructed based on four conserved ancient genes and a phyletic analysis was concurrently conducted based on the presence/absence patterns of a set of informational genes from members of Megavirales, Bacteria, Archaea, and Eukarya. Phylogenetic analyses based on the four conserved genes revealed that P. sibericum is part of the fourth TRUC composed of Megavirales members, and is closely related to the families Marseilleviridae and Ascoviridae/Iridoviridae. Additionally, hierarchical clustering delineated four branches, and showed that P. sibericum is part of this fourth TRUC. Overall, phylogenetic and phyletic analyses using informational genes clearly indicate that P. sibericum is a new bona fide member of the "Fourth TRUC" club composed of representatives of Megavirales, alongside Bacteria, Archaea, and Eukarya.
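For readers unfamiliar with phyletic (presence/absence) clustering of the kind described above, the following is a minimal sketch using a made-up binary gene-content matrix and SciPy's hierarchical clustering; the taxa labels and gene patterns are hypothetical and do not reproduce the study's data.

```python
import numpy as np
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import pdist

# Hypothetical presence (1) / absence (0) matrix: rows = taxa, columns = informational genes.
taxa = ["virus A", "virus B", "virus C", "bacterium", "archaeon", "eukaryote"]
gene_matrix = np.array([
    [1, 1, 0, 0, 1, 0, 1],
    [1, 1, 0, 0, 1, 1, 1],
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
    [1, 1, 1, 1, 1, 0, 1],
])

# Jaccard distance on the binary profiles, then average-linkage hierarchical clustering.
distances = pdist(gene_matrix, metric="jaccard")
tree = linkage(distances, method="average")
print(dendrogram(tree, labels=taxa, no_plot=True)["ivl"])   # leaf order of the clustering
```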
Miniaturized microscope for high throughput screening of tumor spheroids in microfluidic devices
NASA Astrophysics Data System (ADS)
Uranga, Javier; Rodríguez-Pena, Alejandro; Gahigiro, Desiré; Ortiz-de-Solorzano, Carlos
2018-02-01
High-throughput in vitro screening of highly physiological three-dimensional cell cultures (3D-HTS) is rapidly gaining importance in preclinical studies, to study the effect of the microenvironment on tumor development and to evaluate the efficacy of new anticancer drugs. Furthermore, the use of 3D-HTS systems in personalized anti-cancer treatment planning, based on tumor organoids or spheroids grown from tumor biopsies or isolated circulating tumor cells, could also be envisioned. Most commercial, multi-well-plate-based 3D-HTS systems are large and expensive, and the multi-well plates they rely on hardly provide a physiological environment and require the use of large amounts of biological material and reagents. In this paper we present a novel, miniaturized inverted microscope (hereinafter miniscope), made up of low-cost, mass-producible parts, that can be used to monitor the growth of living tumor cell spheroids within customized three-dimensional microfluidic platforms. Our 3D-HTS miniscope combines phase contrast imaging based on an oblique back-illumination technique with traditional widefield epi-fluorescence imaging, implemented using miniaturized electro-optical parts and gradient-index lenses. This small (3 x 6 x 2 cm), lightweight device can effectively image over time the growth of (>200) tumor spheroids in a controlled and reproducible environment. Our miniscope can be used to acquire time-lapse images of living cell spheroids over the course of several hours and captures their growth before and after drug treatment, to evaluate the effectiveness of the drug.
Approximate kernel competitive learning.
Wu, Jian-Sheng; Zheng, Wei-Shi; Lai, Jian-Huang
2015-03-01
Kernel competitive learning has been successfully used to achieve robust clustering. However, kernel competitive learning (KCL) is not scalable for large scale data processing, because (1) it has to calculate and store the full kernel matrix, which is too large to be computed and kept in memory, and (2) it cannot be computed in parallel. In this paper we develop a framework of approximate kernel competitive learning for processing large scale datasets. The proposed framework consists of two parts. First, it derives an approximate kernel competitive learning (AKCL) method, which performs kernel competitive learning in a subspace via sampling. We provide solid theoretical analysis on why the proposed approximation modelling would work for kernel competitive learning, and furthermore, we show that the computational complexity of AKCL is largely reduced. Second, we propose a pseudo-parallelled approximate kernel competitive learning (PAKCL) based on a set-based kernel competitive learning strategy, which overcomes the obstacle of using parallel programming in kernel competitive learning and significantly accelerates the approximate kernel competitive learning for large scale clustering. The empirical evaluation on publicly available datasets shows that the proposed AKCL and PAKCL perform comparably to KCL, with a large reduction in computational cost. Also, the proposed methods achieve more effective clustering performance in terms of clustering precision than related approximate clustering approaches. Copyright © 2014 Elsevier Ltd. All rights reserved.
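The sketch below illustrates the general idea of kernel competitive learning in a sampled subspace, here approximated with a Nyström feature map followed by winner-take-all prototype updates; it is a simplified stand-in for AKCL, not the authors' algorithm, and all parameters and data are illustrative.

```python
import numpy as np

def nystrom_features(X, landmarks, gamma):
    """Approximate RBF-kernel feature map via the Nystrom method (sampled landmarks)."""
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    K_mm = rbf(landmarks, landmarks)
    K_nm = rbf(X, landmarks)
    # Inverse square root of K_mm via eigendecomposition (small jitter for stability).
    w, V = np.linalg.eigh(K_mm + 1e-8 * np.eye(len(landmarks)))
    return K_nm @ (V / np.sqrt(w)) @ V.T

def competitive_learning(features, n_clusters, lr=0.1, epochs=20, seed=0):
    """Winner-take-all (online k-means style) competitive learning in the feature space."""
    rng = np.random.default_rng(seed)
    prototypes = features[rng.choice(len(features), n_clusters, replace=False)].copy()
    for _ in range(epochs):
        for x in features[rng.permutation(len(features))]:
            winner = np.argmin(((prototypes - x) ** 2).sum(1))
            prototypes[winner] += lr * (x - prototypes[winner])   # move winner toward sample
    return np.argmin(((features[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1), axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 0.3, (100, 2)), rng.normal(2, 0.3, (100, 2))])  # toy data
    landmarks = X[rng.choice(len(X), 20, replace=False)]          # sampled subspace
    labels = competitive_learning(nystrom_features(X, landmarks, gamma=1.0), n_clusters=2)
    print(np.bincount(labels))
```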
Chen, Xi; Cui, Qiang; Tang, Yuye; Yoo, Jejoong; Yethiraj, Arun
2008-01-01
A hierarchical simulation framework that integrates information from molecular dynamics (MD) simulations into a continuum model is established to study the mechanical response of mechanosensitive channel of large-conductance (MscL) using the finite element method (FEM). The proposed MD-decorated FEM (MDeFEM) approach is used to explore the detailed gating mechanisms of the MscL in Escherichia coli embedded in a palmitoyloleoylphosphatidylethanolamine lipid bilayer. In Part I of this study, the framework of MDeFEM is established. The transmembrane and cytoplasmic helices are taken to be elastic rods, the loops are modeled as springs, and the lipid bilayer is approximated by a three-layer sheet. The mechanical properties of the continuum components, as well as their interactions, are derived from molecular simulations based on atomic force fields. In addition, analytical closed-form continuum model and elastic network model are established to complement the MDeFEM approach and to capture the most essential features of gating. In Part II of this study, the detailed gating mechanisms of E. coli-MscL under various types of loading are presented and compared with experiments, structural model, and all-atom simulations, as well as the analytical models established in Part I. It is envisioned that such a hierarchical multiscale framework will find great value in the study of a variety of biological processes involving complex mechanical deformations such as muscle contraction and mechanotransduction. PMID:18390626
Thimmaiah, Tim; Voje, William E; Carothers, James M
2015-01-01
With progress toward inexpensive, large-scale DNA assembly, the demand for simulation tools that allow the rapid construction of synthetic biological devices with predictable behaviors continues to increase. By combining engineered transcript components, such as ribosome binding sites, transcriptional terminators, ligand-binding aptamers, catalytic ribozymes, and aptamer-controlled ribozymes (aptazymes), gene expression in bacteria can be fine-tuned, with many corollaries and applications in yeast and mammalian cells. The successful design of genetic constructs that implement these kinds of RNA-based control mechanisms requires modeling and analyzing kinetically determined co-transcriptional folding pathways. Transcript design methods using stochastic kinetic folding simulations to search spacer sequence libraries for motifs enabling the assembly of RNA component parts into static ribozyme- and dynamic aptazyme-regulated expression devices with quantitatively predictable functions (rREDs and aREDs, respectively) have been described (Carothers et al., Science 334:1716-1719, 2011). Here, we provide a detailed practical procedure for computational transcript design by illustrating a high throughput, multiprocessor approach for evaluating spacer sequences and generating functional rREDs. This chapter is written as a tutorial, complete with pseudo-code and step-by-step instructions for setting up a computational cluster with an Amazon, Inc. web server and performing the large numbers of kinefold-based stochastic kinetic co-transcriptional folding simulations needed to design functional rREDs and aREDs. The method described here should be broadly applicable for designing and analyzing a variety of synthetic RNA parts, devices and transcripts.
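The following sketch shows the shape of such a high-throughput, multiprocessor spacer-evaluation loop; the scoring function is a deliberately trivial placeholder, since the real workflow would call kinefold-based stochastic folding simulations, and the spacer length and library size are assumptions.

```python
import random
from multiprocessing import Pool

BASES = "ACGU"

def random_spacer(length=12, seed=None):
    """Generate one candidate spacer sequence for the library."""
    rng = random.Random(seed)
    return "".join(rng.choice(BASES) for _ in range(length))

def score_spacer(spacer):
    """Placeholder for a co-transcriptional folding score.

    In the real workflow this would launch a kinefold-based stochastic folding
    simulation of the full transcript and return a folding-based metric; here a
    trivial GC-content proxy stands in so the sketch is runnable.
    """
    gc = sum(base in "GC" for base in spacer) / len(spacer)
    return spacer, gc

if __name__ == "__main__":
    library = [random_spacer(seed=i) for i in range(1000)]
    with Pool() as pool:                         # distribute scoring across processors
        results = pool.map(score_spacer, library)
    best = sorted(results, key=lambda r: r[1], reverse=True)[:5]
    print(best)
```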
Community-based native seed production for restoration in Brazil - the role of science and policy.
Schmidt, I B; de Urzedo, D I; Piña-Rodrigues, F C M; Vieira, D L M; de Rezende, G M; Sampaio, A B; Junqueira, R G P
2018-05-20
Large-scale restoration programmes in the tropics require large volumes of high quality, genetically diverse and locally adapted seeds from a large number of species. However, scarcity of native seeds is a critical restriction to achieving restoration targets. In this paper, we analyse three successful community-based networks that supply native seeds and seedlings for Brazilian Amazon and Cerrado restoration projects. In addition, we propose directions to promote local participation and to address legal, technical and commercialisation issues for up-scaling the market of native seeds for restoration with high quality and social justice. We argue that effective community-based restoration arrangements should follow some principles: (i) seed production must be based on real market demand; (ii) non-governmental and governmental organisations have a key role in supporting local organisation, legal requirements and selling processes; (iii) local ecological knowledge and labour should be valued, enabling local communities to promote large-scale seed production; (iv) applied research can help develop appropriate techniques and solve technical issues. The case studies from Brazil and the principles presented here can be useful for up-scaling restoration ecology efforts in many other parts of the world, especially in tropical countries where improving rural community income is a strategy for biodiversity conservation and restoration. © 2018 German Society for Plant Sciences and The Royal Botanical Society of the Netherlands.
Redefining the genetics of Murine Gammaherpesvirus 68 via transcriptome-based annotation
Johnson, L. Steven; Willert, Erin K.; Virgin, Herbert W.
2010-01-01
Viral genetic studies often focus on large open reading frames (ORFs) identified during genome annotation (ORF-based annotation). Here we provide a tool and software set for defining gene expression by murine gammaherpesvirus 68 (γHV68) nucleotide-by-nucleotide across the 119,450 basepair (bp) genome. These tools allowed us to determine that viral RNA expression was significantly more complex than predicted from ORF-based annotation, including over 73,000 nucleotides of unexpected transcription within 30 expressed genomic regions (EGRs). Approximately 90% of this RNA expression was antisense to genomic regions containing known large ORFs. We verified the existence of novel transcripts in three EGRs using standard methods to validate the approach and determined which parts of the transcriptome depend on protein or viral DNA synthesis. This redefines the genetic map of γHV68, indicates that herpesviruses contain significantly more genetic complexity than predicted from ORF-based genome annotations, and provides new tools and approaches for viral genetic studies. PMID:20542255
Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment
NASA Astrophysics Data System (ADS)
Ritsch, E.; Atlas Collaboration
2014-06-01
The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production currently accounts for the largest share of the computing resources used by ATLAS. In this document we describe the plans to overcome the computing resource limitations for large-scale Monte Carlo production in the ATLAS Experiment for Run 2 and beyond. A number of fast detector simulation, digitization and reconstruction techniques are discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.
NASA Astrophysics Data System (ADS)
Mariño, Marcos
2015-09-01
Preface; Part I. Instantons: 1. Instantons in quantum mechanics; 2. Unstable vacua in quantum field theory; 3. Large order behavior and Borel summability; 4. Non-perturbative aspects of Yang-Mills theories; 5. Instantons and fermions; Part II. Large N: 6. Sigma models at large N; 7. The 1/N expansion in QCD; 8. Matrix models and matrix quantum mechanics at large N; 9. Large N QCD in two dimensions; 10. Instantons at large N; Appendix A. Harmonic analysis on S3; Appendix B. Heat kernel and zeta functions; Appendix C. Effective action for large N sigma models; References; Author index; Subject index.
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Hall, Philip P.
1985-01-01
The amount of information contained in the data bases of large-scale information storage and retrieval systems is very large and growing at a rapid rate. The methods available for accessing this information have not been successful in making the information easily available to the people who have the greatest need for it. This thesis describes the design of a personal computer-based system which will provide a means for these individuals to retrieve this data through one standardized interface. The thesis identifies each of the major problems associated with providing access to casual users of IS and R systems and describes the manner in which these problems are to be solved by utilizing the local processing power of a PC. Additional capabilities, not available with standard access methods, are also provided to improve the user's ability to make use of this information. The design of PC/MISI is intended to facilitate its use as a research vehicle. Evaluation mechanisms and possible areas of future research are described. The PC/MISI development effort is part of a larger research effort directed at improving access to remote IS and R systems. This research effort, supported in part by NASA, is also reviewed.
A Ranking Approach on Large-Scale Graph With Multidimensional Heterogeneous Information.
Wei, Wei; Gao, Bin; Liu, Tie-Yan; Wang, Taifeng; Li, Guohui; Li, Hang
2016-04-01
Graph-based ranking has been extensively studied and frequently applied in many applications, such as webpage ranking. It aims at mining potentially valuable information from the raw graph-structured data. Recently, with the proliferation of rich heterogeneous information (e.g., node/edge features and prior knowledge) available in many real-world graphs, how to effectively and efficiently leverage all information to improve the ranking performance becomes a new challenging problem. Previous methods only utilize part of such information and attempt to rank graph nodes according to link-based methods, of which the ranking performances are severely affected by several well-known issues, e.g., over-fitting or high computational complexity, especially when the scale of graph is very large. In this paper, we address the large-scale graph-based ranking problem and focus on how to effectively exploit rich heterogeneous information of the graph to improve the ranking performance. Specifically, we propose an innovative and effective semi-supervised PageRank (SSP) approach to parameterize the derived information within a unified semi-supervised learning framework (SSLF-GR), then simultaneously optimize the parameters and the ranking scores of graph nodes. Experiments on the real-world large-scale graphs demonstrate that our method significantly outperforms the algorithms that consider such graph information only partially.
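For context on the link-based ranking baseline such methods build on, here is a standard personalized PageRank power iteration on a toy adjacency matrix; it is not the proposed semi-supervised PageRank (SSP) method, and the graph, restart vector, and damping factor are illustrative.

```python
import numpy as np

def personalized_pagerank(adjacency, restart, damping=0.85, tol=1e-10, max_iter=200):
    """Power iteration for personalized PageRank on a dense adjacency matrix."""
    adjacency = np.asarray(adjacency, dtype=float)
    out_degree = adjacency.sum(axis=1)
    dangling = out_degree == 0                       # nodes with no outgoing links
    # Row-stochastic transition matrix (rows of dangling nodes stay all-zero here).
    transition = np.divide(adjacency, out_degree[:, None],
                           out=np.zeros_like(adjacency), where=out_degree[:, None] > 0)
    restart = np.asarray(restart, dtype=float)
    restart = restart / restart.sum()
    rank = restart.copy()
    for _ in range(max_iter):
        new_rank = damping * (rank @ transition + rank[dangling].sum() * restart) \
                   + (1.0 - damping) * restart
        if np.abs(new_rank - rank).sum() < tol:
            break
        rank = new_rank
    return rank

if __name__ == "__main__":
    A = [[0, 1, 1, 0],
         [0, 0, 1, 0],
         [1, 0, 0, 1],
         [0, 0, 0, 0]]                               # toy directed graph; node 3 is dangling
    print(personalized_pagerank(A, restart=np.ones(4)).round(4))
```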
NASA Astrophysics Data System (ADS)
Wu, Yanhui; Han, Mangui; Tang, Zhongkai; Deng, Longjiang
2014-04-01
The effective permeability values of composites containing Fe-Cu-Nb-Si-B nanocrystalline flakes have been studied within 0.5-10 GHz. Obvious differences in microwave permeability have been observed between large flakes (size range: 23-111 μm, average thickness: 4.5 μm) and small flakes (size range: 3-21 μm, average thickness: 1.3 μm). The initial real part of the microwave permeability of the large flakes is larger but decreases faster with frequency, and the large flakes also show a larger magnetic loss. Taking into account the eddy current effect, the intrinsic microwave permeability values have been extracted based on the modified Maxwell-Garnett law and have also been verified by Acher's law. The dependence of skin depth on frequency has been calculated for both kinds of flakes. It is shown that the eddy current effect in the large flakes is significant, whereas it can be ignored in the small flakes.
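As a numerical note on the skin-depth dependence used above, the classical expression δ = sqrt(ρ/(π f μ0 μr)) can be evaluated across the measured band; the resistivity and relative permeability below are illustrative assumptions rather than the measured flake properties.

```python
import numpy as np

MU_0 = 4e-7 * np.pi   # vacuum permeability, H/m

def skin_depth(freq_hz, resistivity_ohm_m, mu_r):
    """Classical skin depth delta = sqrt(rho / (pi * f * mu0 * mu_r)), in metres."""
    return np.sqrt(resistivity_ohm_m / (np.pi * freq_hz * MU_0 * mu_r))

if __name__ == "__main__":
    freqs = np.array([0.5e9, 1e9, 5e9, 10e9])            # 0.5-10 GHz band
    # Illustrative values only: resistivity ~1.2e-6 ohm*m, relative permeability ~10.
    for f, d in zip(freqs, skin_depth(freqs, 1.2e-6, 10.0)):
        print(f"{f/1e9:4.1f} GHz: skin depth = {d*1e6:.2f} um")
```

For flakes a few micrometres thick, skin depths of this order explain why the eddy current effect matters for the thicker, larger flakes but not for the thinner, smaller ones.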
40 CFR Appendix II to Part 1048 - Large Spark-ignition (SI) Composite Transient Cycle
Code of Federal Regulations, 2010 CFR
2010-07-01
Appendix II to Part 1048 (Protection of Environment, Environmental Protection Agency)—Large Spark-ignition (SI) Composite Transient Cycle. The following table shows the transient duty-cycle for engines that are not constant-speed engines, as described…
Large temporal scale and capacity subsurface bulk energy storage with CO2
NASA Astrophysics Data System (ADS)
Saar, M. O.; Fleming, M. R.; Adams, B. M.; Ogland-Hand, J.; Nelson, E. S.; Randolph, J.; Sioshansi, R.; Kuehn, T. H.; Buscheck, T. A.; Bielicki, J. M.
2017-12-01
Decarbonizing energy systems by increasing the penetration of variable renewable energy (VRE) technologies requires efficient and short- to long-term energy storage. Very large amounts of energy can be stored in the subsurface as heat and/or pressure energy in order to provide both short- and long-term (seasonal) storage, depending on the implementation. This energy storage approach can be quite efficient, especially where geothermal energy is naturally added to the system. Here, we present subsurface heat and/or pressure energy storage with supercritical carbon dioxide (CO2) and discuss the system's efficiency and deployment options, as well as its advantages and disadvantages compared to several other energy storage options. CO2-based subsurface bulk energy storage has the potential to be particularly efficient and large-scale, both temporally (i.e., seasonal) and spatially. The latter refers to the amount of energy that can be stored underground, using CO2, at a geologically conducive location, potentially enabling the storage of excess power from a substantial portion of the power grid. The implication is that it would be possible to employ centralized energy storage for (a substantial part of) the power grid, where the geology enables CO2-based bulk subsurface energy storage, whereas the VRE technologies (solar, wind) are located on that same power grid, where (solar, wind) conditions are ideal. However, this may require reinforcing the power grid's transmission lines in certain parts of the grid to enable high-load power transmission from/to a few locations.
NASA Astrophysics Data System (ADS)
Chen, Ting; Zheng, Xianghao; Zhang, Yu-ning; Li, Shengcai
2018-02-01
Owing to part-load operation for the enhancement of grid flexibility, the Francis turbine often suffers from severe low-frequency and large-amplitude hydraulic instability, which is mostly pertinent to the highly unsteady swirling vortex rope in the draft tube. The influence of disturbances in the upstream flow (e.g., large-scale vortex structures in the spiral casing) on the draft-tube vortex flow is not yet well understood. In the present paper, the influence of the upstream disturbances on the vortical flow in the draft tube is studied based on a vortex identification method and the analysis of several important parameters (e.g., the swirl number and the velocity profile). For a small guide vane opening (representing the part-load condition), the vortices triggered in the spiral casing propagate downstream and significantly affect the swirling vortex-rope precession in the draft tube, leading to changes in the intensity and the precessional frequency of the swirling vortex rope. When the guide vane opening approaches the optimum one (representing the full-load condition), the upstream disturbance becomes weaker and thus its influence on the downstream flow is very limited.
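As a small computational aside on the swirl number mentioned above, the commonly used definition S = ∫ u_z u_θ r² dr / (R ∫ u_z² r dr) can be evaluated from a radial velocity profile; the profile below is hypothetical, not the paper's CFD data.

```python
import numpy as np

def _trapezoid(y, x):
    """Local trapezoidal integration (kept local to avoid NumPy version differences)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def swirl_number(r, u_axial, u_tangential):
    """Swirl number S = int(u_ax * u_tan * r^2 dr) / (R * int(u_ax^2 * r dr)):
    the ratio of angular momentum flux to axial momentum flux."""
    R = r[-1]
    return _trapezoid(u_axial * u_tangential * r**2, r) / (R * _trapezoid(u_axial**2 * r, r))

if __name__ == "__main__":
    r = np.linspace(0.0, 0.25, 200)          # hypothetical draft-tube radius coordinate, m
    u_ax = np.full_like(r, 4.0)              # illustrative uniform axial velocity, m/s
    u_tan = 6.0 * r / r[-1]                  # illustrative solid-body-like swirl, m/s
    print(f"S = {swirl_number(r, u_ax, u_tan):.2f}")   # ~0.75 for this profile
```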
Hazards, Disasters, and The National Map
,
2003-01-01
Governments depend on base geographic information that describes the Earth's surface and locates features. They use this information for economic and community development, land and natural resource management, delivery of health services, and ensuring public safety. It is also the foundation for studying and solving geographically based problems. Geographic information underpins an increasingly large part of the Nation's economy. It is an important part of our national infrastructure in the same way that the Interstate Highway System is an essential element of our transportation network. Federal, State, and local response and management personnel must have current, reliable, and easily accessible geographic information and maps to prepare for, respond to, or recover from emergency situations. In life-threatening events, such as earthquakes, floods, or wildland fires, geographic information is essential for locating critical infrastructure and carrying out evacuation and rescue operations.
Alali, Sanaz; Gribble, Adam; Vitkin, I Alex
2016-03-01
A new polarimetry method is demonstrated to image the entire Mueller matrix of a turbid sample using four photoelastic modulators (PEMs) and a charge coupled device (CCD) camera, with no moving parts. Accurate wide-field imaging is enabled with a field-programmable gate array (FPGA) optical gating technique and an evolutionary algorithm (EA) that optimizes imaging times. This technique accurately and rapidly measured the Mueller matrices of air, polarization elements, and turbid phantoms. The system should prove advantageous for Mueller matrix analysis of turbid samples (e.g., biological tissues) over large fields of view, in less than a second.
ERIC Educational Resources Information Center
Schmidt, Michael D.; Blizzard, C. Leigh; Venn, Alison J.; Cochrane, Jennifer A.; Dwyer, Terence
2007-01-01
The aim of this study was to summarize both practical and methodological issues in using pedometers to assess physical activity in a large epidemiologic study. As part of a population-based survey of cardiovascular disease risk factors, physical activity was assessed using pedometers and activity diaries in 775 men and women ages 25-64 years who…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-16
....S.C. 552(a), 1 CFR part 51, and Sec. 97.20 of Title 14 of the Code of Federal Regulations. The large... contained in this amendment are based on the criteria contained in the U.S. Standard for Terminal Instrument... 97 continues to read as follows: Authority: 49 U.S.C. 106(g), 40103, 40106, 40113, 40114, 40120...
ERIC Educational Resources Information Center
Tyler, John H.; Murnane, Richard J.; Willett, John B.
2004-01-01
As part of standards-based educational reform efforts, more than 40 states will soon require students to achieve passing scores on standardized exams in order to obtain a high school diploma. Currently, many states are struggling with the design of their examination systems, debating such questions as which subjects should be tested, what should…
Mineral resource potential map of the Fossil Ridge Wilderness Study Area, Gunnison County, Colorado
DeWitt, Ed; Stoneman, R.J.; Clark, J.R.; Kluender, S.E.
1985-01-01
Areas that immediately adjoin the Fossil Ridge Wilderness Study Area have a high potential for molybdenum in large deposits, lead in medium-size deposits, and zinc in small- to medium-size deposits. Depending on the extraction of base metals, parts of the adjoining areas could have a low resource potential for bismuth and cadmium as byproducts in medium-size deposits.
ERIC Educational Resources Information Center
Guice, Sherry; Brooks, Gregory W.
A study, part of a 5-year investigation (1991-1995) of patterns of implementation of literature-based instruction in schools serving large numbers of children from low-income families, recounts children's literacy experiences as observed in a third-grade classroom in an urban school in upstate New York. The primary goal was to understand the…
Undergraduate Social Work Students: Learning Interviewing Skills in a Hybrid Practice Class
ERIC Educational Resources Information Center
Barclay, Barbara
2012-01-01
This action research case study explored undergraduate social work students' perceived learning of interviewing skills in a hybrid environment course delivery. The single case study consisted of 19 students enrolled in a practice course blending web-based and face-to-face (f2f) meetings (4 of 15 f2f) within a large urban college. As part of…
2006-01-01
preparing a Continuation in Part (CIP) to add the new I7L cleavage assays recently developed by SIGA. Conclusions: By using homology-based... developmental cycle. RNA viruses and retroviruses commonly undergo formative proteolysis in which large polyproteins are cleaved by viral encoded proteinases to... structural model of the vaccinia virus (VV) I7L proteinase was developed at Transtech Pharma. A unique chemical library of ~51,000 compounds was
ERIC Educational Resources Information Center
National Center for Education Evaluation and Regional Assistance, 2015
2015-01-01
The Teacher Incentive Fund (TIF) provides grants to support performance-based compensation systems for teachers and principals in high-need schools. The study measures the impact of pay-for-performance bonuses as part of a comprehensive compensation system within a large, multisite random assignment study design. The treatment schools were to…
ERIC Educational Resources Information Center
Mayer, John; Kieras, David E.
Using a system based on a standard augmented transition network (ATN) parsing approach, this report describes a technique for the rapid development of natural language parsers, called the High-Level Grammar Specification Language (HGSL). The first part of the report describes the syntax and semantics of HGSL and the network implementation of each of its…
ERIC Educational Resources Information Center
Chiang, Hanley; Speroni, Cecilia; Herrmann, Mariesa; Hallgren, Kristin; Burkander, Paul; Wellington, Alison
2017-01-01
The Teacher Incentive Fund (TIF) provides grants to support performance-based compensation systems for teachers and principals in high-need schools. The study measures the impact of pay-for-performance bonuses as part of a comprehensive compensation system within a large, multisite random assignment study design. The treatment schools were to…
ERIC Educational Resources Information Center
Chiang, Hanley; Speroni, Cecilia; Herrmann, Mariesa; Hallgren, Kristin; Burkander, Paul; Wellington, Alison
2017-01-01
The Teacher Incentive Fund (TIF) provides grants to support performance-based compensation systems for teachers and principals in high-need schools. The study measures the impact of pay-for-performance bonuses as part of a comprehensive compensation system within a large, multisite random assignment study design. The treatment schools were to…
ERIC Educational Resources Information Center
Max, Jeffrey; Constantine, Jill; Wellington, Alison; Hallgren, Kristin; Glazerman, Steven; Chiang, Hanley; Speroni, Cecilia
2014-01-01
The Teacher Incentive Fund (TIF) provides grants to support performance-based compensation systems for teachers and principals in high-need schools. The study measures the impact of pay-for-performance bonuses as part of a comprehensive compensation system within a large, multisite random assignment study design. The treatment schools were to…
NASA Astrophysics Data System (ADS)
Liu, Z.; LU, G.; He, H.; Wu, Z.; He, J.
2017-12-01
Seasonal pluvial-drought transition processes are unique natural phenomena. To explore possible mechanisms, we took Southwest China (SWC) as the study region and comprehensively investigated the temporal evolution of large-scale and regional atmospheric variables with the simple method of Standardized Anomalies (SA). Some key results include: (1) The net vertical integral of water vapour flux (VIWVF) across the four boundaries may be a feasible indicator of pluvial-drought transition processes over SWC, because its SA-based index is almost consistent with process development. (2) The vertical SA-based patterns of regional horizontal divergence (D) and vertical motion (ω) also coincide well with the pluvial-drought transition processes, and the SA-based index of regional D shows relatively high correlation with the identified processes over SWC. (3) With respect to large-scale anomalies of circulation patterns, a well-organized Eurasian Pattern is one important feature during the pluvial-drought transition over SWC. (4) To explore the possibility of simulating drought development using previous pluvial anomalies, large-scale and regional atmospheric SA-based indices were used. On the whole, when SA-based indices of regional dynamic and water-vapour variables are introduced, drought development simulated with large-scale anomalies alone can be improved considerably. (5) Finally, pluvial-drought transition processes and associated regional atmospheric anomalies over nine Chinese drought study regions were investigated. With respect to regional D, vertically single or double "upper-positive-lower-negative" and "upper-negative-lower-positive" patterns are the most common vertical SA-based patterns during the pluvial and drought parts of the transition processes, respectively.
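For reference, the Standardized Anomalies (SA) used above follow the usual form SA = (x − climatological mean)/(climatological standard deviation); the sketch below applies it, month by month, to made-up monthly values of a flux-like variable rather than the study's reanalysis fields.

```python
import numpy as np

def standardized_anomaly(series, month_index):
    """Standardized anomaly of a monthly series: for each value, subtract the
    climatological mean of its calendar month and divide by that month's
    climatological standard deviation."""
    series = np.asarray(series, dtype=float)
    sa = np.empty_like(series)
    for m in range(12):
        sel = month_index == m
        sa[sel] = (series[sel] - series[sel].mean()) / series[sel].std(ddof=1)
    return sa

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    months = np.tile(np.arange(12), 30)                        # 30 hypothetical years
    flux = 50 + 20 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, months.size)
    print(standardized_anomaly(flux, months)[:12].round(2))    # SA-based index, first year
```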
The Tsaoling 1941 Landslide, New Insight of Numerical Simulation of Discrete Element Model
NASA Astrophysics Data System (ADS)
Tang, C.-L.; Hu, J.-C.; Lin, M.-L.
2009-04-01
Large earthquakes in southeastern Taiwan are not rare in the historical catalogue. At Tsaoling, located in southeastern Taiwan, the last five large landslides occurred in the 19th and 20th centuries. From the literature on the Tsaoling landslide, we conclude four characteristics: (1) it is repeated, (2) it involves multiple landslide surfaces, (3) the landslide blocks are huge, and (4) some people survived after sliding a long distance (>2 km). This is why we want to understand the causes of the repeated landslides at Tsaoling and their mechanisms. However, there is no record of the 1862 landslide and most of the landslide evidence has disappeared. Hence, we focus on the landslide dynamics of the 1941 landslide in this study. The Tsaoling area is located on a large monocline dipping towards the south-southwest. The dip of the strata toward the SSW is similar on both sides of the Chinshui River valley. The bedrock of the Tsaoling area is Pliocene in age and belongs to the upper Chinshui Shale and the lower Cholan Formation. Plane failure analysis and the Newmark displacement method have been commonly used for slope stability analysis in recent years. However, plane failure analysis only provides a safety factor: when the safety factor (FS) is less than 1, it can only indicate that the slope is unstable. The result of the Newmark displacement method is a single displacement value. Both analyses assume a rigid body. For a large landslide like the Tsaoling landslide, the volume of the landslide mass is over 10^8 m^3, and the landslide block cannot be considered a rigid body. We considered the block a quasi-rigid body, because the blocks are deformable and jointed. The original version of the Distinct Element Method (DEM) was devoted to the modeling of rock-block systems and was later applied to the modeling of granular material. The calculation cycle in PFC2D is a time-stepping algorithm that consists of the repeated application of the law of motion to each particle, a force-displacement law to each contact, and a constant updating of wall positions. The physical properties of the particles in the model can be traced through time (i.e., velocity, displacement, force, and stress). During the simulation we can track the variation of these physical properties, so inter-block changes in displacement, force, and stress can be monitored. After the seismic shaking, the result of the PFC model can be divided into three portions: upper (thick), middle (transitional) and lower (thin). The shear displacements of the three parts on the sliding plane are not in agreement: the displacement of the lower part of the block is larger than that of the upper and middle parts, and the shear displacement of the middle part lies between those of the upper and lower parts. During the earthquake shaking, the different parts of the block collide with each other; the upper part of the block is knocked back and stays in its original position or slides a short distance, while the lower part of the block is knocked down by the upper block. The collision pushed the lower part of the block down a certain distance. This shear displacement caused the sliding plane to lose its strength and induced the landslide during the 1941 earthquake. The upper part of the block stayed on the slope but remained unstable. Eight months later, the upper part of the block slid down, induced by a 700 mm downpour over three days.
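To make the PFC-style calculation cycle described above concrete, the following is a highly simplified one-dimensional sketch of the repeated force-displacement (contact) and law-of-motion (particle) updates; it illustrates the distinct element time-stepping idea only, not the actual PFC2D model of the Tsaoling block, and all parameter values are arbitrary.

```python
import numpy as np

def dem_step(x, v, masses, stiffness, rest_length, dt, gravity=-9.81, damping=0.995):
    """One explicit time step for a 1D chain of particles: a force-displacement law
    is applied to every contact (spring between neighbours), then Newton's law of
    motion is integrated for every particle, as in a distinct-element cycle."""
    forces = masses * gravity                        # body force on each particle
    for i in range(len(x) - 1):                      # force-displacement law per contact
        f = stiffness * ((x[i + 1] - x[i]) - rest_length)
        forces[i] += f
        forces[i + 1] -= f
    v_new = damping * (v + forces / masses * dt)     # law of motion (with mild damping)
    x_new = x + v_new * dt
    return x_new, v_new

if __name__ == "__main__":
    n = 5
    x = np.arange(n, dtype=float)                    # hypothetical vertical chain, 1 m spacing
    v = np.zeros(n)
    m = np.ones(n)
    for _ in range(5000):                            # repeated calculation cycle
        x, v = dem_step(x, v, m, stiffness=1e4, rest_length=1.0, dt=1e-3)
        x[0], v[0] = 0.0, 0.0                        # pin the lowest particle (stable toe)
    print(x.round(4))                                # settled positions under gravity
```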
NASA Technical Reports Server (NTRS)
Clausen, Christian A., III
1996-01-01
Liquid oxygen is used as the oxidizer for the liquid-fueled main engines during the launch of the space shuttle. Any hardware that comes into contact with pure oxygen, either during servicing of the shuttle or in the operation of the shuttle, must be validated as being free of nonvolatile residue (NVR). This is a safety requirement to prevent spontaneous combustion of carbonaceous NVR if it were to come into contact with pure oxygen. Previous NVR validation testing of space hardware used Freon (CFC-113) as the test solvent. Because CFC-113 can no longer be used, a program was conducted to develop an NVR test procedure that uses a safe, environmentally friendly solvent. The solvent used in the new NVR test procedure is water. Work conducted over the past three years has demonstrated that when small parts are subjected to ultrasound in a water bath and NVR is present, a sufficient quantity is dispersed into the water to allow its concentration to be determined by the TOC method. The work described in this report extends the water-wash NVR validation test to large-scale parts; that is, parts too large to be subjected to ultrasound. The method consists of concentrating the NVR in the water wash onto a bed of silica gel. The total adsorbent bed is then analyzed for TOC content by using a solid sample probe. Work completed thus far has demonstrated that hydrocarbon-based NVRs can be detected at levels of less than 0.1 mg per square foot of part surface area by using a simple water wash.
H I observations of the nearest starburst galaxy NGC 253 with the SKA precursor KAT-7
NASA Astrophysics Data System (ADS)
Lucero, D. M.; Carignan, C.; Elson, E. C.; Randriamampandry, T. H.; Jarrett, T. H.; Oosterloo, T. A.; Heald, G. H.
2015-07-01
We present H I observations of the Sculptor group starburst spiral galaxy NGC 253, obtained with the Karoo Array Telescope (KAT-7). KAT-7 is a pathfinder for the Square Kilometre Array precursor MeerKAT, under construction. The short baselines and low system temperature of the telescope make it very sensitive to large-scale, low-surface-brightness emission. The KAT-7 observations detected 33 per cent more flux than previous Very Large Array observations, mainly in the outer parts and in the halo for a total H I mass of 2.1 ± 0.1 × 109 M⊙. H I can be found at large distances perpendicular to the plane out to projected distances of ˜9-10 kpc away from the nucleus and ˜13-14 kpc at the edge of the disc. A novel technique, based on interactive profile fitting, was used to separate the main disc gas from the anomalous (halo) gas. The rotation curve (RC) derived for the H I disc confirms that it is declining in the outer parts, as seen in previous optical Fabry-Perot measurements. As for the anomalous component, its RC has a very shallow gradient in the inner parts and turns over at the same radius as the disc, kinematically lagging by 100 km s-1. The kinematics of the observed extra-planar gas is compatible with an outflow due to the central starburst and galactic fountains in the outer parts. However, the gas kinematics shows no evidence for inflow. Analysis of the near-IR WISE data, shows clearly that the star formation rate is compatible with the starburst nature of NGC 253.
A Component-Based Vocabulary-Extensible Sign Language Gesture Recognition Framework.
Wei, Shengjing; Chen, Xiang; Yang, Xidong; Cao, Shuai; Zhang, Xu
2016-04-19
Sign language recognition (SLR) can provide a helpful tool for the communication between the deaf and the external world. This paper proposed a component-based vocabulary extensible SLR framework using data from surface electromyographic (sEMG) sensors, accelerometers (ACC), and gyroscopes (GYRO). In this framework, a sign word was considered to be a combination of five common sign components, including hand shape, axis, orientation, rotation, and trajectory, and sign classification was implemented based on the recognition of the five components. Specifically, the proposed SLR framework consisted of two major parts. The first part was to obtain the component-based form of sign gestures and establish the code table of the target sign gesture set using data from a reference subject. In the second part, which was designed for new users, component classifiers were trained using a training set suggested by the reference subject, and the classification of unknown gestures was performed with a code matching method. Five subjects participated in this study and recognition experiments under different sizes of training sets were implemented on a target gesture set consisting of 110 frequently used Chinese Sign Language (CSL) sign words. The experimental results demonstrated that the proposed framework can realize large-scale gesture set recognition with a small-scale training set. With the smallest training sets (containing about one-third of the gestures of the target gesture set) suggested by two reference subjects, average recognition accuracies of (82.6 ± 13.2)% and (79.7 ± 13.4)% were obtained for the 110 words, respectively, and the average recognition accuracy climbed to (88 ± 13.7)% and (86.3 ± 13.7)% when the training set included 50~60 gestures (about half of the target gesture set). The proposed framework can significantly reduce the user's training burden in large-scale gesture recognition, which will facilitate the implementation of a practical SLR system.
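As an illustration of the code matching step described above, the sketch below assumes each sign word is encoded as a five-component tuple (hand shape, axis, orientation, rotation, trajectory) and that the five component classifiers each output one label; the example words and component labels are hypothetical and are not taken from the CSL gesture set used in the study.

```python
# Hypothetical code table: sign word -> 5-tuple of component labels.
CODE_TABLE = {
    "thanks": ("flat", "x", "palm-up", "none", "arc"),
    "friend": ("hook", "y", "palm-in", "twist", "line"),
}

def match_sign(component_predictions, code_table=CODE_TABLE):
    """Return the sign word whose component code agrees with the most
    predicted components (simple Hamming-style code matching)."""
    best_word, best_score = None, -1
    for word, code in code_table.items():
        score = sum(p == c for p, c in zip(component_predictions, code))
        if score > best_score:
            best_word, best_score = word, score
    return best_word

# e.g. outputs of the five component classifiers for an unknown gesture
print(match_sign(("flat", "x", "palm-up", "none", "line")))  # -> "thanks"
```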
Numerical simulation of seismic wave propagation from land-excited large volume air-gun source
NASA Astrophysics Data System (ADS)
Cao, W.; Zhang, W.
2017-12-01
The land-excited large volume air-gun source can be used to study regional underground structures and to detect temporal velocity changes. The air-gun source is characterized by rich low-frequency energy (from bubble oscillation, 2-8 Hz) and high repeatability. It can be excited in rivers, reservoirs or man-made pools. Numerical simulation of the seismic wave propagation from the air-gun source helps to understand the energy partitioning and the characteristics of the waveform records at stations. However, the effective energy recorded at a distant station comes from the process of bubble oscillation, which cannot be approximated by a single point source. We propose a method to simulate the seismic wave propagation from the land-excited large volume air-gun source by the finite difference method. The process can be divided into three parts: bubble oscillation and source coupling, solid-fluid coupling, and propagation in the solid medium. For the first part, the wavelet of the bubble oscillation can be simulated by a bubble model. We use a wave injection method combining the bubble wavelet with the elastic wave equation to achieve the source coupling. Then, the solid-fluid boundary condition is implemented along the water bottom. The last part is the seismic wave propagation in the solid medium, which can be readily implemented by the finite difference method. Our method can obtain accurate waveforms for the land-excited large volume air-gun source. Based on the above forward modeling technology, we analyze the effect of different water body shapes on the excited P wave and the energy of the converted S wave. We study two land-excited large volume air-gun fields, one in Binchuan, Yunnan, and the other in Hutubi, Xinjiang. The station in Binchuan, Yunnan is located in a large irregular reservoir, and the waveform records show a clear S wave. In contrast, the station in Hutubi, Xinjiang is located in a small man-made pool, and the waveform records show a very weak S wave. A better understanding of the characteristics of the land-excited large volume air-gun source can help to make better use of it.
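The wave-injection idea described above can be illustrated with a deliberately simplified 1-D acoustic finite-difference scheme in which an assumed low-frequency wavelet (standing in for the bubble-oscillation source) is added at one grid point each time step; the real simulation is 2-D/3-D elastic with a fluid-solid boundary, so this is only a sketch of the coupling mechanism, with all parameter values invented.

```python
import numpy as np

def fd_1d_with_source(nx=400, nt=1200, dx=10.0, dt=1e-3, c=1500.0,
                      src_idx=40, freq=5.0):
    """Simplified 1-D acoustic finite-difference propagation with a
    low-frequency source wavelet injected at one grid point (a stand-in
    for the bubble-oscillation wavelet coupled into the wave equation)."""
    p_prev = np.zeros(nx)
    p_curr = np.zeros(nx)
    seismogram = np.zeros(nt)                  # receiver near the far boundary
    t = np.arange(nt) * dt
    t0 = 1.5 / freq                            # Ricker-like wavelet, assumed 5 Hz
    arg = (np.pi * freq * (t - t0)) ** 2
    wavelet = (1.0 - 2.0 * arg) * np.exp(-arg)
    r2 = (c * dt / dx) ** 2                    # must satisfy the CFL condition
    for it in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = p_curr[2:] - 2.0 * p_curr[1:-1] + p_curr[:-2]
        p_next = 2.0 * p_curr - p_prev + r2 * lap
        p_next[src_idx] += wavelet[it] * dt ** 2   # wave injection of the source
        p_prev, p_curr = p_curr, p_next
        seismogram[it] = p_curr[-2]
    return seismogram
```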
Apollo 16 view of the earth from translunar injection
NASA Technical Reports Server (NTRS)
1972-01-01
A good view of the Earth photographed shortly after translunar injection of April 16, 1972. Although there is much cloud cover (over Canada and the oceans), the United States in large part, most of Mexico and some parts of Central America are clearly visible. Note Lake Michigan and Lake Superior. Also note the Bahama Banks at upper right part of the sphere. A large part of the Rocky Mountain Range is also visible.
A new species of Odorrana (Amphibia: Anura: Ranidae) from Vietnam.
Pham, Cuong The; Nguyen, Truong Quang; Le, Minh Duc; Bonkowski, Michael; Ziegler, Thomas
2016-02-26
A new species of Odorrana is described from the karst forests in northeastern Vietnam based on morphological differences and molecular divergence. Morphologically, the new species is distinguishable from its congeners on the basis of a combination of the following diagnostic characters: (1) size large (SVL 85.9-91.6 mm in males, 108.7-110.1 mm in females); (2) head longer than wide; (3) vomerine teeth present; (4) external vocal sacs absent; (5) snout short (SL/SVL 0.16-0.17); (6) tympanum large (TD/ED 0.70 in males, 0.68 in females); (7) dorsal surface of head and anterior part of body smooth, posterior part of body and flanks with small tubercles; (8) supratympanic fold present; (9) dorsolateral fold absent; (10) webbing formula I0-0II0-0III0-1/2IV1/2-0V; (11) in life, dorsum green with dark brown spots; (12) flanks greyish brown with dark brown spots; (13) throat and chest grey, underside of limbs with large dark brown spots, edged in white, forming a network. In the phylogenetic analyses, the new species is unambiguously nested within the O. andersonii group, and placed as the sister taxon to O. wuchuanensis.
Large-Scale Production of Carbon Nanotubes Using the Jefferson Lab Free Electron Laser
NASA Technical Reports Server (NTRS)
Holloway, Brian C.
2003-01-01
We report on our interdisciplinary program to use the Free Electron Laser (FEL) at the Thomas Jefferson National Accelerator Facility (J-Lab) for high-volume pulsed laser vaporization synthesis of carbon nanotubes. Based in part on the funding from this project, a novel nanotube production system was designed, tested, and patented. Using this new system, nanotube production rates over 100 times faster than those of conventional laser systems were achieved. Analysis of the material produced shows that it is of as high a quality as the standard laser-based materials.
Aerodynamic studies of delta-wing shuttle orbiters. Part 1: Low speed
NASA Technical Reports Server (NTRS)
Freeman, D. C., Jr.; Ellison, J. C.
1972-01-01
Numerous wind tunnel tests conducted on the evolving delta-wing orbiters have generated a fairly large aerodynamic data base over the entire entry operation range of these vehicles. A limited assessment is made of some of the aerodynamics of the current HO type orbiters, and several specific problem areas selected from the broad data base are discussed. These include, from a subsonic viewpoint, discussions of trim drag effect; effects of the installation of main rocket engine nozzles, OMS and RCS packages, Reynolds number effects, lateral-directional stability characteristics, and landing characteristics.
New method of processing heat treatment experiments with numerical simulation support
NASA Astrophysics Data System (ADS)
Kik, T.; Moravec, J.; Novakova, I.
2017-08-01
In this work, the benefits of combining modern software for numerical simulations of welding processes with laboratory research are described. A newly proposed method of processing heat treatment experiments, leading to relevant input data for numerical simulations of the heat treatment of large parts, is presented. By using experiments on small test samples, it is now possible to simulate cooling conditions comparable with the cooling of bigger parts. Results from this method of testing make the boundary conditions used for the real cooling process more accurate, and can also be used for the improvement of software databases and the optimization of computational models. The point is to make the computation of temperature fields for large-scale hardened parts more precise, based on a new method for determining the temperature dependence of the heat transfer coefficient into the hardening medium for the particular material, a defined maximal thickness of the processed part, and the cooling conditions. In the paper we also present a comparison of the standard and modified (according to the newly suggested methodology) heat transfer coefficient data and their influence on the simulation results. It shows how even small changes mainly influence the distributions of temperature, metallurgical phases, hardness and stresses. With this experiment it is also possible to obtain not only input data and data enabling optimization of the computational model but at the same time also verification data. The greatest advantage of the described method is its independence of the type of cooling medium used.
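A minimal sketch of how a temperature-dependent heat transfer coefficient obtained from such experiments could enter a cooling computation is given below, using a lumped-capacitance model; the tabulated h(T) values, material properties and geometry are placeholders, not data from the study.

```python
import numpy as np

def quench_curve(T0=850.0, T_medium=40.0, rho=7800.0, cp=600.0,
                 thickness=0.05, dt=0.05, t_end=120.0):
    """Lumped-capacitance cooling of a plate-like part with a
    temperature-dependent heat transfer coefficient h(T); the tabulated
    h values are placeholders for an experimentally determined curve."""
    # illustrative h(T) table: surface temperature [degC] -> h [W/(m^2 K)]
    T_tab = np.array([100.0, 300.0, 500.0, 700.0, 900.0])
    h_tab = np.array([800.0, 2500.0, 4000.0, 1800.0, 600.0])
    T, times, temps = T0, [0.0], [T0]
    volume_per_area = thickness / 2.0          # characteristic length of the plate
    for step in range(int(t_end / dt)):
        h = np.interp(T, T_tab, h_tab)         # interpolate the experimental curve
        dTdt = -h * (T - T_medium) / (rho * cp * volume_per_area)
        T += dTdt * dt
        times.append((step + 1) * dt)
        temps.append(T)
    return np.array(times), np.array(temps)
```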
The pack size effect: Influence on consumer perceptions of portion sizes.
Hieke, Sophie; Palascha, Aikaterini; Jola, Corinne; Wills, Josephine; Raats, Monique M
2016-01-01
Larger portions as well as larger packs can lead to larger prospective consumption estimates, larger servings and increased consumption, described as 'portion-size effects' and 'pack size effects'. Although related, the effects of pack sizes on portion estimates have received less attention. While it is not possible to generalize consumer behaviour across cultures, external cues taken from pack size may affect us all. We thus examined whether pack sizes influence portion size estimates across cultures, leading to a general 'pack size effect'. We compared portion size estimates based on digital presentations of different product pack sizes of solid and liquid products. The study with 13,177 participants across six European countries consisted of three parts. Parts 1 and 2 asked participants to indicate the number of portions present in a combined photographic and text-based description of different pack sizes. The estimated portion size was calculated as the quotient of the content weight or volume of the food presented and the number of stated portions. In Part 3, participants stated the number of food items that make up a portion when presented with packs of food containing either a small or a large number of items. The estimated portion size was calculated as the item weight times the item number. For all three parts and across all countries, we found that participants' portion estimates were based on larger portions for larger packs compared to smaller packs (Part 1 and 2) as well as more items to make up a portion (Part 3); hence, portions were stated to be larger in all cases. Considering that the larger estimated portions are likely to be consumed, there are implications for energy intake and weight status. Copyright © 2015 Elsevier Ltd. All rights reserved.
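The two portion-size calculations described above amount to simple arithmetic; the sketch below restates them with invented example numbers (a hypothetical 500 g pack and 12 g items), not data from the study.

```python
def estimated_portion_from_pack(content_weight_g, stated_portions):
    """Parts 1 and 2: estimated portion = pack content / number of stated portions."""
    return content_weight_g / stated_portions

def estimated_portion_from_items(item_weight_g, stated_item_count):
    """Part 3: estimated portion = item weight x number of items per portion."""
    return item_weight_g * stated_item_count

# Illustrative numbers only: a 500 g pack judged to hold 4 portions -> 125 g,
# and 5 items of 12 g each judged to make up one portion -> 60 g.
print(estimated_portion_from_pack(500, 4))    # 125.0
print(estimated_portion_from_items(12, 5))    # 60
```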
Analysis and advocacy in home- and community-based care: an approach in three parts.
Hudson, Robert B
2010-01-01
A new chapter in health policy presents both challenges and opportunities for aging policy analysts and advocates concerned with long-term care. Millions of long-term care recipients and providers live today in a public policy netherworld, one principally associated with Medicaid. I suggest here that moving policy forward will entail three key steps: (a) to overcome structural lag in key home and community-based care (HCBC) policy arenas; (b) to reverse a contemporary pattern of risk-shifting from institutions to individuals; and (c) to inform and empower caregivers to have their own pressing needs recognized. Recent developments in Washington provide new optimism on these fronts. Voluntary long-term care and community-based care (LTC/HCBC) proposals are on the table within the broad context of health care reform. Whether they remain will be, in large part, a function of how far we have moved along the fronts described: modernizing policies, recognizing risks, and activating neglected policy constituencies.
Region Evolution eXplorer - A tool for discovering evolution trends in ontology regions.
Christen, Victor; Hartung, Michael; Groß, Anika
2015-01-01
A large number of life science ontologies have been developed to support different application scenarios such as gene annotation or functional analysis. The continuous accumulation of new insights and knowledge affects specific portions of ontologies and thus leads to their adaptation. Therefore, it is valuable to study which ontology parts have been extensively modified or remained unchanged. Users can monitor the evolution of an ontology to improve its further development or apply the knowledge in their applications. Here we present REX (Region Evolution eXplorer), a web-based system for exploring the evolution of ontology parts (regions). REX provides an analysis platform for currently about 1,000 versions of 16 well-known life science ontologies. Interactive workflows allow an explorative analysis of changing ontology regions and can be used to study evolution trends over long-term periods. REX is a web application providing an interactive and user-friendly interface to identify (un)stable regions in large life science ontologies. It is available at http://www.izbi.de/rex.
NATIONAL POLICIES TO MEET THE CHALLENGE OF SUBSTANCE ABUSE : PROGRAMMES AND IMPLEMENTATION
Malhotra, Anil; Mohan, Ashwin
2000-01-01
Drug abuse has become a growing issue of concern to humanity. India has a large consumer base of drug and alcohol abusers. This has serious repercussions in terms of morbidity and mortality; hence the need for a national policy. In India, the Narcotic Drugs and Psychotropic Substances Act, 1985 (NDPS) provides the framework for drug abuse control in the country. A large number of measures have been undertaken as part of demand reduction activities. These include framing policies and programmes, setting up of centres, developing pilot projects, etc. However, the implementation still leaves a lot to be desired. The efforts have not yet been streamlined and no revision of policies has taken place based on experience. This paper critically reviews the initiatives taken thus far to control drug abuse in our country. PMID:21407973
STT Doubles with Large Delta_M - Part VIII: Tau Per Ori Cam Mon Cnc Peg
NASA Astrophysics Data System (ADS)
Knapp, Wilfried; Nanson, John
2017-04-01
The results of visual double star observing sessions suggested a pattern for STT doubles with large delta_M of being harder to resolve than would be expected based on the WDS catalog data. It was felt this might be a problem with expectations on the one hand, and on the other might be an indication of a need for new precise measurements, so we decided to take a closer look at a selected sample of STT doubles and do some research. Again, as with the other STT objects covered so far, several of the components show parameters quite different from the current WDS data.
Web-Based Consumer Health Information: Public Access, Digital Division, and Remainders
Lorence, Daniel; Park, Heeyoung
2006-01-01
Public access Internet portals and decreasing costs of personal computers have created a growing consensus that unequal access to information, or a “digital divide,” has largely disappeared for US consumers. A series of technology initiatives in the late 1990s were believed to have largely eliminated the divide. For healthcare patients, access to information is an essential part of the consumer-centric framework outlined in the recently proposed national health information initiative. Data from a recent study of health information-seeking behaviors on the Internet suggest that a “digitally underserved group” persists, effectively limiting the planned national health information infrastructure to wealthier Americans. PMID:16926743
Preliminary analysis of Dione Regio, Venus: The final Magellan regional imaging gap
NASA Technical Reports Server (NTRS)
Keddie, S. T.
1993-01-01
In Sep. 1992, the Magellan spacecraft filled the final large gap in its coverage of Venus when it imaged an area west of Alpha Regio. F-BIDR's and some test MIDR's of parts of this area were available as of late December. Dione Regio was imaged by the Arecibo observatory, and a preliminary investigation of Magellan images supports the interpretations made based on these earlier images: Dione Regio is a regional highland on which are superposed three large, very distinct volcanic edifices. The superior resolution and different viewing geometry of the Magellan images also clarified some uncertainties and revealed fascinating details about this region.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 19 Customs Duties 1 2011-04-01 2011-04-01 false Bond for Deferral of Duty on Large Yachts Imported... Appendix C to Part 113—Bond for Deferral of Duty on Large Yachts Imported for Sale at United States Boat Shows Bond for Deferral of Duty on Large Yachts Imported for Sale at United States Boat Shows ____, as...
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 1 2010-04-01 2010-04-01 false Bond for Deferral of Duty on Large Yachts Imported... Appendix C to Part 113—Bond for Deferral of Duty on Large Yachts Imported for Sale at United States Boat Shows Bond for Deferral of Duty on Large Yachts Imported for Sale at United States Boat Shows ____, as...
Towards Exascale Seismic Imaging and Inversion
NASA Astrophysics Data System (ADS)
Tromp, J.; Bozdag, E.; Lefebvre, M. P.; Smith, J. A.; Lei, W.; Ruan, Y.
2015-12-01
Post-petascale supercomputers are now available to solve complex scientific problems that were thought unreachable a few decades ago. They also bring a cohort of concerns tied to obtaining optimum performance. Several issues are currently being investigated by the HPC community. These include energy consumption, fault resilience, scalability of the current parallel paradigms, workflow management, I/O performance and feature extraction with large datasets. In this presentation, we focus on the last three issues. In the context of seismic imaging and inversion, in particular for simulations based on adjoint methods, workflows are well defined. They consist of a few collective steps (e.g., mesh generation or model updates) and of a large number of independent steps (e.g., forward and adjoint simulations of each seismic event, pre- and postprocessing of seismic traces). The greater goal is to reduce the time to solution, that is, to obtain a more precise representation of the subsurface as fast as possible. This brings us to consider both the workflow in its entirety and the parts comprising it. The usual approach is to speed up the purely computational parts based on code optimization in order to reach higher FLOPS and better memory management. This still remains an important concern, but larger scale experiments show that the imaging workflow suffers from severe I/O bottlenecks. Such limitations occur both for purely computational data and seismic time series. The latter are dealt with by the introduction of a new Adaptable Seismic Data Format (ASDF). Parallel I/O libraries, namely HDF5 and ADIOS, are used to drastically reduce the cost of disk access. Parallel visualization tools, such as VisIt, are able to take advantage of ADIOS metadata to extract features and display massive datasets. Because large parts of the workflow are embarrassingly parallel, we are investigating the possibility of automating the imaging process with the integration of scientific workflow management tools, specifically Pegasus.
Schäfer, Ingmar; von Leitner, Eike-Christin; Schön, Gerhard; Koller, Daniela; Hansen, Heike; Kolonko, Tina; Kaduszkiewicz, Hanna; Wegscheider, Karl; Glaeske, Gerd; van den Bussche, Hendrik
2010-01-01
Objective Multimorbidity is a common problem in the elderly that is significantly associated with higher mortality, increased disability and functional decline. Information about interactions of chronic diseases can help to facilitate diagnosis, amend prevention and enhance the patients' quality of life. The aim of this study was to increase the knowledge of specific processes of multimorbidity in an unselected elderly population by identifying patterns of statistically significantly associated comorbidity. Methods Multimorbidity patterns were identified by exploratory tetrachoric factor analysis based on claims data of 63,104 males and 86,176 females in the age group 65+. Analyses were based on 46 diagnosis groups incorporating all ICD-10 diagnoses of chronic diseases with a prevalence ≥ 1%. Both genders were analyzed separately. Persons were assigned to multimorbidity patterns if they had at least three diagnosis groups with a factor loading of 0.25 on the corresponding pattern. Results Three multimorbidity patterns were found: 1) cardiovascular/metabolic disorders [prevalence female: 30%; male: 39%], 2) anxiety/depression/somatoform disorders and pain [34%; 22%], and 3) neuropsychiatric disorders [6%; 0.8%]. The sampling adequacy was meritorious (Kaiser-Meyer-Olkin measure: 0.85 and 0.84, respectively) and the factors explained a large part of the variance (cumulative percent: 78% and 75%, respectively). The patterns were largely age-dependent and overlapped in a sizeable part of the population. Altogether 50% of female and 48% of male persons were assigned to at least one of the three multimorbidity patterns. Conclusion This study shows that statistically significant co-occurrence of chronic diseases can be subsumed in three prevalent multimorbidity patterns if accounting for the fact that different multimorbidity patterns share some diagnosis groups, influence each other and overlap in a large part of the population. In recognizing the full complexity of multimorbidity we might improve our ability to predict needs and achieve possible benefits for elderly patients who suffer from multimorbidity. PMID:21209965
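The assignment rule described above (at least three diagnosis groups loading at or above 0.25 on a pattern) can be sketched as follows; the example loadings are invented for illustration and are not the published factor solution.

```python
def assign_patterns(person_diagnosis_groups, loadings, threshold=0.25, min_groups=3):
    """Assign a person to every multimorbidity pattern for which he or she has
    at least `min_groups` diagnosis groups loading >= `threshold` on that pattern.
    `loadings` maps pattern name -> {diagnosis group -> factor loading}."""
    assigned = []
    for pattern, group_loadings in loadings.items():
        hits = [g for g in person_diagnosis_groups
                if group_loadings.get(g, 0.0) >= threshold]
        if len(hits) >= min_groups:
            assigned.append(pattern)
    return assigned

# Illustrative loadings only (not the published factor solution).
loadings = {
    "cardiovascular/metabolic": {"hypertension": 0.62, "diabetes": 0.48,
                                 "lipid disorder": 0.55, "obesity": 0.31},
    "neuropsychiatric":         {"dementia": 0.58, "depression": 0.27,
                                 "parkinson": 0.41},
}
print(assign_patterns({"hypertension", "diabetes", "obesity"}, loadings))
# -> ['cardiovascular/metabolic']
```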
NASA Astrophysics Data System (ADS)
Chen, Yi-Zhe; Liu, Wei; Yuan, Shi-Jian
2015-05-01
Normally, the strength and formability of aluminum alloys can be increased largely by severe plastic deformation and heat treatment. However, many plastic deformation processes are more suitable for making raw material than for producing formed parts. In this article, an experimental study of thermomechanical treatment using the sheet hydroforming process was carried out to improve both the mechanical strength and the formability of aluminum alloys when forming complex parts. The limiting drawing ratio, thickness, and strain distribution of complex parts formed by sheet hydroforming were investigated to study the formability and sheet-deformation behavior. Based on the optimal formed parts, the tensile strength, microhardness, grain structure, and strengthening precipitates were analyzed to identify the strengthening effect of the thermomechanical treatment. The results show that in the solution state, the limiting drawing ratio of cylindrical parts could be increased by 10.9% compared with the traditional deep drawing process. The peak values of tensile stress and microhardness of the formed parts are 18.0% and 12.5% higher than those in the T6 state. This investigation shows that thermomechanical treatment by sheet hydroforming is a potential method for manufacturing aluminum alloy products with high strength and good formability.
Ductile fracture of cylindrical vessels containing a large flaw
NASA Technical Reports Server (NTRS)
Erdogan, F.; Irwin, G. R.; Ratwani, M.
1976-01-01
The fracture process in pressurized cylindrical vessels containing a relatively large flaw is considered. The flaw is assumed to be a part-through or through meridional crack. The flaw geometry, the yield behavior of the material, and the internal pressure are assumed to be such that in the neighborhood of the flaw the cylinder wall undergoes large-scale plastic deformations. Thus, the problem falls outside the range of applicability of conventional brittle fracture theories. To study the problem, plasticity considerations are introduced into the shell theory through the assumptions of fully-yielded net ligaments using a plastic strip model. Then a ductile fracture criterion is developed which is based on the concept of net ligament plastic instability. A limited verification is attempted by comparing the theoretical predictions with some existing experimental results.
III. FROM SMALL TO BIG: METHODS FOR INCORPORATING LARGE SCALE DATA INTO DEVELOPMENTAL SCIENCE.
Davis-Kean, Pamela E; Jager, Justin
2017-06-01
For decades, developmental science has been based primarily on relatively small-scale data collections with children and families. Part of the reason for the dominance of this type of data collection is the complexity of collecting cognitive and social data on infants and small children. These small data sets are limited in both the power to detect differences and the demographic diversity needed to generalize clearly and broadly. Thus, in this chapter we will discuss the value of using existing large-scale data sets to test the complex questions of child development and how to develop future large-scale data sets that are both representative and can answer the important questions of developmental scientists. © 2017 The Society for Research in Child Development, Inc.
NASA Astrophysics Data System (ADS)
Brax, Christoffer; Niklasson, Lars
2009-05-01
Maritime Domain Awareness (MDA) is important for both civilian and military applications. An important part of MDA is the detection of unusual vessel activities such as piracy, smuggling, poaching, collisions, etc. Today's interconnected sensor systems provide huge amounts of information over large geographical areas, which can cause operators to reach their cognitive capacity and start to miss important events. We propose an agent-based situation management system that automatically analyses sensor information to detect unusual activity and anomalies. The system combines knowledge-based detection with data-driven anomaly detection. The system is evaluated using information from both radar and AIS sensors.
Customizing a rangefinder for community-based wildlife conservation initiatives
Ransom, Jason I.
2011-01-01
Population size of many threatened and endangered species is relatively unknown because estimating animal abundance in remote parts of the world, without access to aircraft for surveying vast areas, is a scientific challenge with few proposed solutions. One option is to enlist local community members and train them in data collection for large line transect or point count surveys, but financial and sometimes technological constraints prevent access to the necessary equipment and training for accurately quantifying distance measurements. Such measurements are paramount for generating reliable estimates of animal density. This problem was overcome in a survey of Asiatic wild ass (Equus hemionus) in the Great Gobi B Strictly Protected Area, Mongolia, by converting an inexpensive optical sporting rangefinder into a species-specific rangefinder with visual-based categorical labels. Accuracy trials concluded 96.86% of 350 distance measures matched those from a laser rangefinder. This simple customized optic subsequently allowed for a large group of minimally-trained observers to simultaneously record quantitative measures of distance, despite language, education, and skill differences among the diverse group. The large community-based effort actively engaged local residents in species conservation by including them as the foundation for collecting scientific data.
Geohydrologic reconnaissance of the upper Potomac River basin
Trainer, Frank W.; Watkins, Frank A.
1975-01-01
The upper Potomac River basin, in the central Appalachian region in Pennsylvania, Maryland, Virginia, and West Virginia, is a humid temperate region of diverse fractured rocks. Three geohydrologic terranes, which underlie large parts of the basin, are described in terms of their aquifer characteristics and of the magnitude and duration of their base runoff: (1) fractured rock having a thin regolith, (2) fractured rock having a thick regolith, and (3) carbonate rock. Crystalline rock in the mountainous part of the Blue Ridge province and shale with tight sandstone in the folded Appalachians are covered with thin regolith. Water is stored in and moves through fairly unmodified fractures. Average transmissivity (T) is estimated to be 150 feet squared per day, and average storage coefficient (S), 0.005. Base runoff declines rapidly from its high levels during spring and is poorly sustained during the summer season of high evapotranspiration. The rocks in this geohydrologic terrane are the least effective in the basin for the development of water supplies and as a source of dry-weather streamflow. Crystalline and sedimentary rocks in the Piedmont province and in the lowland part of the Blue Ridge province are covered with thick regolith. Water is stored in and moves through both the regolith and the underlying fractured rock. Estimated average values for aquifer characteristics are T, 200 feet squared per day, and S, 0.01. Base runoff is better sustained in this terrane than in the thin-regolith terrane and on the average is about twice as great. Carbonate rock, in which fractures have been widened selectively by solution, especially near streams, has estimated average aquifer characteristics of T, 500 feet squared per day, and S, 0.03-0.04. This rock is the most effective in the basin in terms of water supply and base runoff. Where its fractures have not been widened by solution, the carbonate rock is a fractured-rock aquifer much like the noncarbonate rock. At low values the frequency of specific capacities of wells is much the same in all rocks in the basin, but high values of specific capacity are as much as 10 times more frequent in carbonate rock than in noncarbonate rock. Nearly all the large springs and high-capacity wells in the basin are in carbonate rock. Base runoff from the carbonate rock is better sustained during dry weather and on the average is about three times as great as base runoff from fractured rock having a thin regolith. The potential role of these water-bearing terranes in water management probably lies in the local development of large water supplies from the carbonate rock and in the possible manipulation of underground storage for such purposes as providing space for artificial recharge of ground water and providing ground water to be used for the augmentation of low streamflow. The chief water-quality problems in the basin--acidic mine-drainage water in the western part of the basin, local highly mineralized ground water, and the high nitrate content of ground water in some of the densely populated parts of the basin--would probably have little adverse effect on the use of ground water for low-flow augmentation.
NASA Astrophysics Data System (ADS)
Bazilevs, Y.; Kamran, K.; Moutsanidis, G.; Benson, D. J.; Oñate, E.
2017-07-01
In this two-part paper we begin the development of a new class of methods for modeling fluid-structure interaction (FSI) phenomena for air blast. We aim to develop accurate, robust, and practical computational methodology, which is capable of modeling the dynamics of air blast coupled with the structure response, where the latter involves large, inelastic deformations and disintegration into fragments. An immersed approach is adopted, which leads to an a-priori monolithic FSI formulation with intrinsic contact detection between solid objects, and without formal restrictions on the solid motions. In Part I of this paper, the core air-blast FSI methodology suitable for a variety of discretizations is presented and tested using standard finite elements. Part II of this paper focuses on a particular instantiation of the proposed framework, which couples isogeometric analysis (IGA) based on non-uniform rational B-splines and a reproducing-kernel particle method (RKPM), which is a Meshfree technique. The combination of IGA and RKPM is felt to be particularly attractive for the problem class of interest due to the higher-order accuracy and smoothness of both discretizations, and relative simplicity of RKPM in handling fragmentation scenarios. A collection of mostly 2D numerical examples is presented in each of the parts to illustrate the good performance of the proposed air-blast FSI framework.
Energy Absorption Mechanisms in Unidirectional Composites Subjected to Dynamic Loading Events
2012-03-30
[Only fragments of this abstract were recoverable:] Fiber-reinforced composites have become an integral part of commercial, recreation, and defense markets, with applications including soft body armors; the growth of composites in high-performance markets continues to outpace the development of new and improved physics-based ... The remaining extracted text consists of reference fragments (e.g. G. C. Jacob, J. F. Fellers, S. Simunovic, and J. M. Starbuck, "Energy Absorption in Polymer Composites for ...", pp. 718-730, 2008).
ERIC Educational Resources Information Center
Alex, Jogymol K.; Mammen, Kuttickattu J.
2016-01-01
This paper reports on one part of a large study which attempted to identify the linguistic and hierarchical characteristics of van Hiele theory amongst grade 10 learners. The sample consisted of a total of 359 participants from five purposively selected schools from Mthatha District in the Eastern Cape Province of South Africa. The performance of…
ERIC Educational Resources Information Center
Nwabude, Aaron A. R.; Ade-Ojo, Gordon O.
2012-01-01
The work reported in this paper is part of a study that explored some of the roles of Further Education Colleges in the United Kingdom. The paper is based largely on literature from books and on-line resources and short interviews from five British further education colleges, but also on the author's views and experience. The major aim of the…
ERIC Educational Resources Information Center
Lee, Mimi Miyoung; Chauvot, Jennifer; Plankis, Brian; Vowell, Julie; Culpepper, Shea
2011-01-01
iSMART (Integration of Science, Mathematics, and Reflective Teaching) Program is an online science and mathematics integrated graduate program for middle school teachers across the state of Texas. As part of a large design-based research project, this paper describes the initial stages of the design process of the iSMART program for its first…
ERIC Educational Resources Information Center
Waxman, Hersh C.; Padron, Yolanda N.; Lee, Yuan-Hsuan
2010-01-01
The No Child Left Behind Act (NCLB) of 2002 calls for several changes in the K-12 education system in the United States. It focuses on evidence-based educational practices for schools in the United States. This study was part of a large-scale, 8-year research project that examined the quality of classroom instruction from three elementary schools…
Cosmonumerology, Cosmophysics, and the Large Numbers Hypothesis: British Cosmology in the 1930s
NASA Astrophysics Data System (ADS)
Durham, Ian
2001-04-01
A number of unorthodox cosmological models were developed in the 1930s, many by British theoreticians. Three of the most notable of these theories included Eddington's cosmonumerology, Milne's cosmophysics, and Dirac's large numbers hypothesis (LNH). Dirac's LNH was based partly on the other two and it has been argued that modern steady-state theories are based partly on Milne's cosmophysics. But what influenced Eddington and Milne? Both were products of the late Victorian education system in Britain and could conceivably have been influenced by Victorian thought which, in addition to its strict (though technically unofficial) social caste system, had a flair for the unusual. Victorianism was filled with a fascination for the occult and the supernatural, and science was not insulated from this trend (witness the Henry Slade trial in 1877). It is conceivable that the normally strict mentality of the scientific process in the minds of Eddington and Milne was affected, indirectly, by this trend for the unusual, possibly pushing them into thinking "outside the box" as it were. In addition, cosmonumerology and the LNH exhibit signs of Pythagorean and Aristotelian thought. It is the aim of this ongoing project at St. Andrews to determine the influences and characterize the relations existing in and within these and related theories.
IT Governance in SMEs: Trust or Control?
NASA Astrophysics Data System (ADS)
Devos, Jan; van Landeghem, Hendrik; Deschoolmeester, Dirk
It is believed by many scholars that a small and medium-sized enterprise (SME) cannot be seen through the lens of a large firm. Theories which explain IT governance in large organizations and methodologies used by practitioners can therefore not be extrapolated to SMEs, which have a completely different economic, cultural and managerial environment. SMEs suffer from resource poverty, have less IS experience and need more external support. SMEs largely contribute to the failure of many IS projects. We define an outsourced information system failure (OISF) as a failure of IT governance in an SME environment and propose a structure for stating propositions derived from both agency theory and theory of trust. The theoretical question addressed in this paper is: how and why do OISFs occur in SMEs? We have chosen a qualitative and positivistic IS case study research strategy based on multiple cases. Eight cases of IS projects were selected. We found that trust is more important than control issues like output-based contracts and structured controls for eliminating opportunistic behaviour in SMEs. We conclude that the world of SMEs is significantly different from that of large companies. This necessitates extra care to be taken on the part of researchers and practitioners when designing artefacts for SMEs.
Organization and management of heterogeneous, dispersed data bases in nuclear engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eastman, C.M.
1986-01-01
Large, complex, multiperson engineering projects in many areas (nuclear, aerospace, electronics, and manufacturing) have inherent needs for coordination, control, and management of the related engineering data. Taken in the abstract, the notion of an integrated engineering data base (IED) for such projects is attractive. The potential capabilities of an IED are that all data are managed in a coordinated way, are made accessible to all users who need it, allow relations between all parts of the data to be tracked and managed, provide backup, recovery, audit trails, security and access control, and allow overall project status to be monitored and managed. Common data accessing schemes and user interfaces to applications are also part of an IED. This paper describes a new software product that allows incremental realization of many of the capabilities of an IED, without the massive disruption and risk.
The research of hourglass worm dynamic balancing simulation based on SolidWorks motion
NASA Astrophysics Data System (ADS)
Wang, Zhuangzhuang; Yang, Jie; Liu, Pingyi; Zhao, Junpeng
2018-02-01
The hourglass worm is extensively used in industry due to its heavy-load capacity and large reduction ratio. Varying degrees of unbalanced mass distribution appear in the design of a single-head worm. With machines developing towards higher speed and precision, the vibration and shock caused by the unbalanced mass distribution of rotating parts must be considered. Therefore, the balance grade of these parts must meet higher requirements. A method based on theoretical analysis and SolidWorks Motion software simulation is presented in this paper; the virtual dynamic balance simulation test of the hourglass worm was carried out during the design of the product, so as to ensure that the hourglass worm meets the requirements of dynamic balance in the design process. This can effectively support the structural design of the hourglass worm and provide a way of thinking about and designing the same type of products.
Politics, power and poverty: health for all in 2000 in the Third World?
Green, R H
1991-01-01
Health for All by 2000 could become a reality in Third World countries. Given present resource allocations and medical, professional and political patterns and trends, that is unlikely to happen in more than a few countries. For it to happen requires basic priority shifts towards universal-access primary health care (including prevention). The main obstacles to such a shift are not absolute resource constraints but medical professional conservatism, together with its interaction with elite interests and with political priorities based partly on perceived demand and partly on (largely medical) professional advice. These obstacles are surmountable, as illustrated by divergent performances among countries, but only if education, promotion, efficiency in terms of lives saved and healthy years gained, community participation and political activism for Health for All are more carefully analytically based and pursued more seriously and widely than they have been to date.
Formal Methods for Automated Diagnosis of Autosub 6000
NASA Technical Reports Server (NTRS)
Ernits, Juhan; Dearden, Richard; Pebody, Miles
2009-01-01
This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.
A Parameterization of Dry Thermals and Shallow Cumuli for Mesoscale Numerical Weather Prediction
NASA Astrophysics Data System (ADS)
Pergaud, Julien; Masson, Valéry; Malardel, Sylvie; Couvreux, Fleur
2009-07-01
For numerical weather prediction models and models resolving deep convection, shallow convective ascents are subgrid processes that are not parameterized by classical local turbulent schemes. The mass flux formulation of convective mixing is now largely accepted as an efficient approach for parameterizing the contribution of larger plumes in convective dry and cloudy boundary layers. We propose a new formulation of the EDMF scheme (for Eddy Diffusivity/Mass Flux) based on a single updraft that improves the representation of dry thermals and shallow convective clouds and conserves a correct representation of stratocumulus in mesoscale models. The definition of entrainment and detrainment in the dry part of the updraft is original, and is specified as proportional to the ratio of buoyancy to vertical velocity. In the cloudy part of the updraft, the classical buoyancy sorting approach is chosen. The main closure of the scheme is based on the mass flux near the surface, which is proportional to the sub-cloud layer convective velocity scale w*. The link with the prognostic grid-scale cloud content and cloud cover and the projection on the non-conservative variables is processed by the cloud scheme. The validation of this new formulation using large-eddy simulations focused on showing the robustness of the scheme to represent three different boundary layer regimes. For dry convective cases, this parameterization enables a correct representation of the countergradient zone where the mass flux part represents the top entrainment (IHOP case). It can also handle the diurnal cycle of boundary-layer cumulus clouds (EUROCS ARM) and conserve a realistic evolution of stratocumulus (EUROCS FIRE).
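The relations described above can be summarized schematically as follows; the abstract states only "ratio of buoyancy to vertical velocity", so the squared exponent on the updraft velocity and the constant C_{M_0} are written here as assumptions for dimensional consistency, not as the scheme's exact formulation.

```latex
% Schematic relations only; the exponent on w_u and the constant C_{M_0} are assumptions.
\epsilon \;\propto\; \max\!\left(0,\ \frac{B_u}{w_u^{2}}\right),
\qquad
\delta \;\propto\; \max\!\left(0,\ -\frac{B_u}{w_u^{2}}\right),
\qquad
M(z \to 0) \;=\; C_{M_0}\,\rho\,w_{*}.
```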
Framework Support For Knowledge-Based Software Development
NASA Astrophysics Data System (ADS)
Huseth, Steve
1988-03-01
The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.
Quantification of uncertainty in machining operations for on-machine acceptance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Claudet, Andre A.; Tran, Hy D.; Su, Jiann-Chemg
2008-09-01
Manufactured parts are designed with acceptance tolerances, i.e. deviations from ideal design conditions, due to unavoidable errors in the manufacturing process. It is necessary to measure and evaluate the manufactured part, compared to the nominal design, to determine whether the part meets design specifications. The scope of this research project is dimensional acceptance of machined parts; specifically, parts machined using numerically controlled (NC, or also CNC for Computer Numerically Controlled) machines. In the design/build/accept cycle, the designer will specify both a nominal value and an acceptable tolerance. As part of the typical design/build/accept business practice, it is required to verify that the part did meet acceptable values prior to acceptance. Manufacturing cost must include not only raw materials and added labor, but also the cost of ensuring conformance to specifications. Ensuring conformance is a substantial portion of the cost of manufacturing. In this project, the costs of measurements were approximately 50% of the cost of the machined part. In production, the cost of measurement would be smaller, but still a substantial proportion of manufacturing cost. The results of this research project will point to a science-based approach to reducing the cost of ensuring conformance to specifications. The approach that we take is to determine, a priori, how well a CNC machine can manufacture a particular geometry from stock. Based on the knowledge of the manufacturing process, we are then able to separate features which need further measurements from features which can be accepted 'as is' from the CNC. By calibration of the machine tool, and by establishing a machining accuracy ratio, we can validate the ability of the CNC to fabricate to a particular level of tolerance. This will eliminate the costs of checking for conformance for relatively large tolerances.
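A sketch of the accept-as-is versus measure decision implied by the machining accuracy ratio is given below; the 4:1 minimum ratio and the numerical values are assumptions for illustration, not values from the project.

```python
def needs_inspection(tolerance_mm, machine_capability_mm, min_accuracy_ratio=4.0):
    """Decide whether a feature can be accepted 'as is' from the CNC or needs a
    separate measurement, based on an assumed machining accuracy ratio
    (design tolerance divided by the calibrated machine capability)."""
    ratio = tolerance_mm / machine_capability_mm
    return ratio < min_accuracy_ratio          # True -> measure, False -> accept

# Illustrative values: a +/-0.10 mm feature on a machine calibrated to 0.015 mm
print(needs_inspection(0.10, 0.015))   # False: ratio ~6.7, accept without measuring
print(needs_inspection(0.03, 0.015))   # True: ratio 2.0, measure this feature
```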
Analysis of Critical Parts and Materials
1980-12-01
[List residue; recoverable items only:] Candidate actions include large orders; manual ordering of some critical parts; ordering spares with the original order; incentives; better capital investment; long lead procurement funding (including raw materials and facility funding); manpower analysis and training; a more active role in schedule negotiation; multiple source procurements; and multi-year program funding.
Fan, Long; Hui, Jerome H L; Yu, Zu Guo; Chu, Ka Hou
2014-07-01
Species identification based on short sequences of DNA markers, that is, DNA barcoding, has emerged as an integral part of modern taxonomy. However, software for the analysis of large and multilocus barcoding data sets is scarce. The Basic Local Alignment Search Tool (BLAST) is currently the fastest tool capable of handling large databases (e.g. >5000 sequences), but its accuracy is a concern and has been criticized for its local optimization. However, current more accurate software requires sequence alignment or complex calculations, which are time-consuming when dealing with large data sets during data preprocessing or during the search stage. Therefore, it is imperative to develop a practical program for both accurate and scalable species identification for DNA barcoding. In this context, we present VIP Barcoding: a user-friendly software in graphical user interface for rapid DNA barcoding. It adopts a hybrid, two-stage algorithm. First, an alignment-free composition vector (CV) method is utilized to reduce searching space by screening a reference database. The alignment-based K2P distance nearest-neighbour method is then employed to analyse the smaller data set generated in the first stage. In comparison with other software, we demonstrate that VIP Barcoding has (i) higher accuracy than Blastn and several alignment-free methods and (ii) higher scalability than alignment-based distance methods and character-based methods. These results suggest that this platform is able to deal with both large-scale and multilocus barcoding data with accuracy and can contribute to DNA barcoding for modern taxonomy. VIP Barcoding is free and available at http://msl.sls.cuhk.edu.hk/vipbarcoding/. © 2014 John Wiley & Sons Ltd.
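The two-stage idea described above (alignment-free composition-vector screening followed by an alignment-based K2P nearest-neighbour search) can be sketched as follows; this is not the VIP Barcoding implementation, and it assumes for simplicity that the shortlisted reference sequences are already aligned to the query.

```python
import math
from collections import Counter

def kmer_vector(seq, k=4):
    """Alignment-free composition vector: normalised k-mer frequencies."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: c / total for kmer, c in counts.items()}

def cv_distance(v1, v2):
    """Cosine-type distance between two composition vectors."""
    keys = set(v1) | set(v2)
    dot = sum(v1.get(x, 0.0) * v2.get(x, 0.0) for x in keys)
    n1 = math.sqrt(sum(v * v for v in v1.values()))
    n2 = math.sqrt(sum(v * v for v in v2.values()))
    return 1.0 - dot / (n1 * n2)

def k2p_distance(a, b):
    """Kimura 2-parameter distance for two aligned, equal-length sequences."""
    transitions = {("A", "G"), ("G", "A"), ("C", "T"), ("T", "C")}
    pairs = [(x, y) for x, y in zip(a, b) if x != "-" and y != "-"]
    n = len(pairs)
    P = sum((x, y) in transitions for x, y in pairs) / n            # transitions
    Q = sum(x != y and (x, y) not in transitions for x, y in pairs) / n  # transversions
    return -0.5 * math.log((1 - 2 * P - Q) * math.sqrt(1 - 2 * Q))

def identify(query, reference, keep=5):
    """Stage 1: CV screening keeps the `keep` closest references;
    stage 2: K2P nearest neighbour on the reduced set.
    `reference` is a list of (species_name, sequence) pairs."""
    qv = kmer_vector(query)
    shortlist = sorted(reference, key=lambda r: cv_distance(qv, kmer_vector(r[1])))[:keep]
    best = min(shortlist, key=lambda r: k2p_distance(query, r[1]))
    return best[0]
```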
Enriching text with images and colored light
NASA Astrophysics Data System (ADS)
Sekulovski, Dragan; Geleijnse, Gijs; Kater, Bram; Korst, Jan; Pauws, Steffen; Clout, Ramon
2008-01-01
We present an unsupervised method to enrich textual applications with relevant images and colors. The images are collected by querying large image repositories and subsequently the colors are computed using image processing. A prototype system based on this method is presented where the method is applied to song lyrics. In combination with a lyrics synchronization algorithm the system produces a rich multimedia experience. In order to identify terms within the text that may be associated with images and colors, we select noun phrases using a part of speech tagger. Large image repositories are queried with these terms. Per term, representative colors are extracted using the collected images. To this end, we use either a histogram-based or a mean shift-based algorithm. The representative color extraction uses the non-uniform distribution of the colors found in the large repositories. The images that are ranked best by the search engine are displayed on a screen, while the extracted representative colors are rendered on controllable lighting devices in the living room. We evaluate our method by comparing the computed colors to standard color representations of a set of English color terms. A second evaluation focuses on the distance in color between a queried term in English and its translation in a foreign language. Based on results from three sets of terms, a measure of suitability of a term for color extraction based on KL divergence is proposed. Finally, we compare the performance of the algorithm using either the automatically indexed repository of Google Images or the manually annotated Flickr.com. Based on the results of these experiments, we conclude that using the presented method we can compute the relevant color for a term using a large image repository and image processing.
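A histogram-based variant of the representative colour extraction described above could look like the following sketch; the bin count and pooling strategy are assumptions for illustration, not the authors' exact algorithm.

```python
import numpy as np

def representative_color(images, bins=8):
    """Histogram-based representative colour: pool the pixels of all images
    retrieved for a term into a coarse RGB histogram and return the centre
    of the most populated bin. `images` are HxWx3 uint8 arrays."""
    hist = np.zeros((bins, bins, bins))
    for img in images:
        idx = img.reshape(-1, 3) // (256 // bins)           # bin index per pixel
        np.add.at(hist, (idx[:, 0], idx[:, 1], idx[:, 2]), 1)
    r, g, b = np.unravel_index(np.argmax(hist), hist.shape)
    bin_width = 256 / bins
    return ((r + 0.5) * bin_width, (g + 0.5) * bin_width, (b + 0.5) * bin_width)
```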
The Devil Is in the Details! On Regulating Cannabis Use in Canada Based on Public Health Criteria
Rehm, Jürgen; Crépault, Jean-François; Fischer, Benedikt
2017-01-01
This commentary to the editorial of Hajizadeh argues that the economic, social and health consequences of legalizing cannabis in Canada will depend in large part on the exact stipulations (mainly from the federal government) and on the implementation, regulation and practice of the legalization act (on provincial and municipal levels). A strict regulatory framework is necessary to minimize the health burden attributable to cannabis use. This includes prominently control of production and sale of the legal cannabis including control of price and content with ban of marketing and advertisement. Regulation of medical marijuana should be part of such a framework as well. PMID:28812798
NASA Astrophysics Data System (ADS)
Philipps, V.; Malaquias, A.; Hakola, A.; Karhunen, J.; Maddaluno, G.; Almaviva, S.; Caneve, L.; Colao, F.; Fortuna, E.; Gasior, P.; Kubkowska, M.; Czarnecka, A.; Laan, M.; Lissovski, A.; Paris, P.; van der Meiden, H. J.; Petersson, P.; Rubel, M.; Huber, A.; Zlobinski, M.; Schweer, B.; Gierse, N.; Xiao, Q.; Sergienko, G.
2013-09-01
Analysis and understanding of wall erosion, material transport and fuel retention are among the most important tasks for ITER and future devices, since these questions determine largely the lifetime and availability of the fusion reactor. These data are also of extreme value to improve the understanding and validate the models of the in vessel build-up of the T inventory in ITER and future D-T devices. So far, research in these areas is largely supported by post-mortem analysis of wall tiles. However, access to samples will be very much restricted in the next-generation devices (such as ITER, JT-60SA, W7-X, etc) with actively cooled plasma-facing components (PFC) and increasing duty cycle. This has motivated the development of methods to measure the deposition of material and retention of plasma fuel on the walls of fusion devices in situ, without removal of PFC samples. For this purpose, laser-based methods are the most promising candidates. Their feasibility has been assessed in a cooperative undertaking in various European associations under EFDA coordination. Different laser techniques have been explored both under laboratory and tokamak conditions with the emphasis to develop a conceptual design for a laser-based wall diagnostic which is integrated into an ITER port plug, aiming to characterize in situ relevant parts of the inner wall, the upper region of the inner divertor, part of the dome and the upper X-point region.
Chen, Yang; Ren, Xiaofeng; Zhang, Guo-Qiang; Xu, Rong
2013-01-01
Visual information is a crucial aspect of medical knowledge. Building a comprehensive medical image base, in the spirit of the Unified Medical Language System (UMLS), would greatly benefit patient education and self-care. However, collection and annotation of such a large-scale image base is challenging. Our objective was to combine visual object detection techniques with medical ontology to automatically mine web photos and retrieve a large number of disease manifestation images with minimal manual labeling effort. As a proof of concept, we first learnt five organ detectors on three detection scales for eyes, ears, lips, hands, and feet. Given a disease, we used information from the UMLS to select affected body parts, ran the pretrained organ detectors on web images, and combined the detection outputs to retrieve disease images. Compared with a supervised image retrieval approach that requires training images for every disease, our ontology-guided approach exploits shared visual information of body parts across diseases. In retrieving 2220 web images of 32 diseases, we reduced manual labeling effort to 15.6% while improving the average precision by 3.9% from 77.7% to 81.6%. For 40.6% of the diseases, we improved the precision by 10%. The results confirm the concept that the web is a feasible source for automatic disease image retrieval for health image database construction. Our approach requires a small amount of manual effort to collect complex disease images, and to annotate them by standard medical ontology terms.
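The ontology-guided retrieval step can be sketched as below, assuming a hypothetical disease-to-body-part mapping and per-part detector functions returning scores in [0, 1]; combining detector outputs by their maximum is an illustrative choice, not necessarily the rule used in the study.

```python
def retrieve_disease_images(disease, web_images, disease_to_parts, detectors,
                            threshold=0.5):
    """Keep web images in which a detector for any body part affected by
    `disease` fires above `threshold`, combining detector outputs by max.
    `disease_to_parts`: disease -> list of body parts (from the ontology);
    `detectors`: body part -> callable(img) -> score in [0, 1]."""
    parts = disease_to_parts.get(disease, [])
    kept = []
    for img in web_images:
        score = max((detectors[p](img) for p in parts if p in detectors),
                    default=0.0)
        if score >= threshold:
            kept.append((img, score))
    return sorted(kept, key=lambda x: x[1], reverse=True)
```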
Springback effects during single point incremental forming: Optimization of the tool path
NASA Astrophysics Data System (ADS)
Giraud-Moreau, Laurence; Belchior, Jérémy; Lafon, Pascal; Lotoing, Lionel; Cherouat, Abel; Courtielle, Eric; Guines, Dominique; Maurine, Patrick
2018-05-01
Incremental sheet forming is an emerging process to manufacture sheet metal parts. This process is more flexible than conventional ones and is well suited for small batch production or prototyping. During the process, the sheet metal blank is clamped by a blank-holder and a small-size smooth-end hemispherical tool moves along a user-specified path to deform the sheet incrementally. Classical three-axis CNC milling machines, dedicated structures, or serial robots can be used to perform the forming operation. Whatever the machine considered, large deviations between the theoretical shape and the real shape can be observed after unclamping the part. These deviations are due to both the lack of stiffness of the machine and residual stresses in the part at the end of the forming stage. In this paper, an optimization strategy for the tool path is proposed in order to minimize the elastic springback induced by residual stresses after unclamping. A finite element model of the SPIF process allowing the shape prediction of the formed part with good accuracy is defined. This model, based on appropriate assumptions, leads to calculation times which remain compatible with an optimization procedure. The proposed optimization method is based on an iterative correction of the tool path. The efficiency of the method is shown by an improvement of the final shape.
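A minimal sketch of the iterative tool-path correction idea is shown below, with the finite element simulation abstracted as a black-box function; the relaxation factor and convergence tolerance are assumptions for illustration, not values from the paper.

```python
import numpy as np

def optimise_tool_path(target_z, simulate, max_iter=10, tol=0.05, relax=0.8):
    """Iterative tool-path correction: simulate forming plus unclamping with the
    current path, then shift each path point against the predicted springback
    deviation. `simulate(path_z) -> formed_z` stands in for the FE model."""
    path_z = target_z.copy()
    for _ in range(max_iter):
        formed_z = simulate(path_z)
        deviation = formed_z - target_z          # shape error after unclamping
        if np.max(np.abs(deviation)) < tol:
            break
        path_z -= relax * deviation              # over-form where the part springs back
    return path_z
```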
Segmentation algorithm of colon based on multi-slice CT colonography
NASA Astrophysics Data System (ADS)
Hu, Yizhong; Ahamed, Mohammed Shabbir; Takahashi, Eiji; Suzuki, Hidenobu; Kawata, Yoshiki; Niki, Noboru; Suzuki, Masahiro; Iinuma, Gen; Moriyama, Noriyuki
2012-02-01
CT colonography is a radiology test that examines the large intestine (colon). It can screen for several forms of colon cancer and is used to detect polyps or cancers of the colon. CT colonography is safe and reliable, and it can be used if people are too sick to undergo other forms of colon cancer screening. In our research, we propose a method for automatic segmentation of the colon from abdominal computed tomography (CT) images. Our multistage detection method extracts the colon and splits it into different parts according to colon anatomy information. We found that among the five segmented parts of the colon, the sigmoid (20%) and rectum (50%) are more prone to polyps and masses than the other three parts. Our research therefore focuses on examining the colon through individual diagnosis of the sigmoid and rectum. We believe this would enable rapid and easy diagnosis of the colon at an earlier stage, help doctors analyse the correct position of each part, and make detection of colorectal cancer easier.
Kolivand, Hoshang; Fern, Bong Mei; Rahim, Mohd Shafry Mohd; Sulong, Ghazali; Baker, Thar; Tully, David
2018-01-01
In this paper, we present a new method to recognise the leaf type and identify plant species using phenetic parts of the leaf: lobes, apex and base detection. Most of the research in this area focuses on popular features such as shape, colour, vein, and texture, which consume large amounts of computational processing and are not efficient, especially for the Acer database with its highly complex leaf structures. This paper focuses on phenetic parts of the leaf, which increases accuracy. Detection of the local maxima and local minima is based on the Centroid Contour Distance of every boundary point, using the north and south regions to recognise the apex and base. Digital morphology is used to measure the leaf shape and the leaf margin. Centroid Contour Gradient is presented to extract the curvature of the leaf apex and base. We analyse 32 leaf images of tropical plants and evaluate the method on two different datasets, Flavia and Acer. The best accuracies obtained are 94.76% and 82.6%, respectively. Experimental results show the effectiveness of the proposed technique without considering the commonly used features with high computational cost.
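The Centroid Contour Distance idea can be sketched as follows (an illustrative reconstruction, not the authors' implementation): compute the distance of every boundary point from the leaf centroid and locate the local maxima and minima of that signal, which mark candidate lobe tips and sinuses.

```python
import numpy as np

# Sketch of the Centroid Contour Distance (CCD) idea described above:
# distance of every boundary point from the centroid, with local maxima and
# minima of that signal marking candidate lobe tips and sinuses.

def centroid_contour_distance(boundary):
    """boundary: (N, 2) array of (x, y) contour points of the leaf."""
    centroid = boundary.mean(axis=0)
    return np.linalg.norm(boundary - centroid, axis=1)

def local_extrema(signal):
    """Indices where the (cyclic) CCD signal has local maxima and minima."""
    prev, nxt = np.roll(signal, 1), np.roll(signal, -1)
    maxima = np.where((signal > prev) & (signal > nxt))[0]
    minima = np.where((signal < prev) & (signal < nxt))[0]
    return maxima, minima

# Toy five-lobed contour: radius modulated by a cosine.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
r = 1.0 + 0.3 * np.cos(5 * theta)
contour = np.c_[r * np.cos(theta), r * np.sin(theta)]
ccd = centroid_contour_distance(contour)
print(local_extrema(ccd))  # five maxima (lobe tips) and five minima
```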
Bilal, Muhammad; Asgher, Muhammad; Parra-Saldivar, Roberto; Hu, Hongbo; Wang, Wei; Zhang, Xuehong; Iqbal, Hafiz M N
2017-01-15
In the twenty-first century, the chemical and associated industries seek a transition from traditional chemical-based concepts to greener, sustainable and more environmentally friendly catalytic alternatives, both at the laboratory and the industrial scale. In this context, bio-based catalysis offers numerous benefits along with potential biotechnological and environmental applications. Bio-based catalytic processes are more energy efficient than conventional methodologies, operate under moderate processing conditions, and generate little or no secondary waste pollution. Thanks to key scientific advances, solid-phase biocatalysts can now be economically tailored on a large scale. Nevertheless, the enzymes must be recovered and reused for commercial feasibility, and immobilization engineering can efficiently accomplish this challenge. The first part of the present review briefly outlines the immobilization of lignin-modifying enzymes (LMEs), including lignin peroxidase (LiP), manganese peroxidase (MnP) and laccase of white-rot fungi (WRF). In the second part, particular emphasis is given to recent achievements of carrier-immobilized LMEs for the degradation, decolorization, or detoxification of industrial dyes and dye-based industrial wastewater effluents. Copyright © 2016 Elsevier B.V. All rights reserved.
Maes, W H; Steppe, K
2012-08-01
As evaporation of water is an energy-demanding process, increasing evapotranspiration rates decrease the surface temperature (Ts) of leaves and plants. Based on this principle, ground-based thermal remote sensing has become one of the most important methods for estimating evapotranspiration and drought stress and for irrigation. This paper reviews its application in agriculture. The review consists of four parts. First, the basics of thermal remote sensing are briefly reviewed. Second, the theoretical relation between Ts and the sensible and latent heat flux is elaborated. A modelling approach was used to evaluate the effect of weather conditions and leaf or vegetation properties on leaf and canopy temperature. Ts increases with increasing air temperature and incoming radiation and with decreasing wind speed and relative humidity. At the leaf level, the leaf angle and leaf dimension have a large influence on Ts; at the vegetation level, Ts is strongly impacted by the roughness length; hence, by canopy height and structure. In the third part, an overview of the different ground-based thermal remote sensing techniques and approaches used to estimate drought stress or evapotranspiration in agriculture is provided. Among other methods, stress time, stress degree day, crop water stress index (CWSI), and stomatal conductance index are discussed. The theoretical models are used to evaluate the performance and sensitivity of the most important methods, corroborating the literature data. In the fourth and final part, a critical view on the future and remaining challenges of ground-based thermal remote sensing is presented.
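As an illustration of one of the indices mentioned above, the crop water stress index is often written in its reference-surface form, with canopy temperature normalised between a fully transpiring ("wet") and a non-transpiring ("dry") reference; the sketch below assumes that form and uses made-up temperatures:

```python
# One common formulation of the crop water stress index (CWSI):
# canopy temperature normalised between a fully transpiring ("wet") and a
# non-transpiring ("dry") reference temperature.
# CWSI = 0 -> unstressed, CWSI = 1 -> fully stressed.

def cwsi(t_canopy, t_wet, t_dry):
    if t_dry <= t_wet:
        raise ValueError("dry reference must be warmer than wet reference")
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Example values (degrees Celsius, purely illustrative).
print(cwsi(t_canopy=28.5, t_wet=25.0, t_dry=33.0))  # 0.4375
```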
Terry, Paul E; Seaverson, Erin Ld; Staufacker, Michael J; Tanaka, Akiko
2011-06-01
Extensive research on tobacco cessation affirms the effectiveness of interventions, although the literature is more limited concerning the impact of programs designed specifically for the workplace. The present study examines the effectiveness of a telephone-based health coaching tobacco cessation program that was provided as part of worksite health promotion programs by 10 large employers. The participants were recruited based on their health risks as identified by health assessments, and the program was personalized to meet their individual needs and stages of change. The results indicate that at 12 months, health coaching program participants achieved a 32% quit rate, compared to 18% for nonparticipants. The quit rate was highest (44%) among program completers who were ready to change at baseline. These results suggest that a tobacco cessation program offered as part of a worksite health promotion program can be highly effective, especially for those who are ready to change. However, the relatively low annual participation rate may indicate that tobacco users remain among the most difficult to engage and to support in their efforts to complete programs. Therefore, implementing a variety of engagement strategies, such as policy changes, as well as social and financial incentives and penalties will most likely have a positive effect at the population level.
Avoiding space robot collisions utilizing the NASA/GSFC tri-mode skin sensor
NASA Technical Reports Server (NTRS)
Prinz, F. B.
1991-01-01
Sensor-based robot motion planning research has primarily focused on mobile robots. Consider, however, the case of a robot manipulator expected to operate autonomously in a dynamic environment where unexpected collisions can occur with many parts of the robot. Only a sensor-based system capable of generating collision-free paths would be acceptable in such situations. Recently, work in this area has been reported in which a deterministic solution for 2-DOF systems has been generated. The arm was sensitized with a 'skin' of infra-red sensors. We have proposed a heuristic (potential-field based) methodology for redundant robots with many DOFs. The key concepts are solving the path planning problem by cooperating global and local planning modules, the use of complete information from the sensors and partial (but appropriate) information from a world model, representation of objects with hyper-ellipsoids in the world model, and the use of variational planning. We intend to sensitize the robot arm with a 'skin' of capacitive proximity sensors. These sensors were developed at NASA and are exceptionally well suited for space applications. In the first part of the report, we discuss the development and modeling of the capacitive proximity sensor. In the second part we discuss the motion planning algorithm.
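A generic potential-field step of the kind alluded to above can be sketched as follows (an illustration of the general technique, not the report's planner): an attractive term pulls the configuration toward the goal and a repulsive term, active only near obstacles sensed by the proximity skin, pushes it away.

```python
import numpy as np

# Generic potential-field step (illustrative, not the report's planner):
# attractive pull toward the goal plus a repulsive push away from nearby
# sensed obstacles, followed by a small gradient step.

def potential_field_step(pos, goal, obstacles, k_att=1.0, k_rep=0.5,
                         influence=1.0, step=0.05):
    force = k_att * (goal - pos)                     # attractive component
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 1e-9 < d < influence:                     # repel only when close
            force += k_rep * (1.0 / d - 1.0 / influence) * diff / d**3
    return pos + step * force

pos, goal = np.array([0.0, 0.0]), np.array([5.0, 5.0])
obstacles = [np.array([2.0, 2.6])]
for _ in range(200):
    pos = potential_field_step(pos, goal, obstacles)
print(pos)  # ends close to the goal, with the path deflected around the obstacle
```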
Explosive force of primacord grid forms large sheet metal parts
NASA Technical Reports Server (NTRS)
1966-01-01
Primacord which is woven through fish netting in a grid pattern is used for explosive forming of large sheet metal parts. The explosive force generated by the primacord detonation is uniformly distributed over the entire surface of the sheet metal workpiece.
The ODD protocol: A review and first update
Grimm, Volker; Berger, Uta; DeAngelis, Donald L.; Polhill, J. Gary; Giske, Jarl; Railsback, Steve F.
2010-01-01
The 'ODD' (Overview, Design concepts, and Details) protocol was published in 2006 to standardize the published descriptions of individual-based and agent-based models (ABMs). The primary objectives of ODD are to make model descriptions more understandable and complete, thereby making ABMs less subject to criticism for being irreproducible. We have systematically evaluated existing uses of the ODD protocol and identified, as expected, parts of ODD needing improvement and clarification. Accordingly, we revise the definition of ODD to clarify aspects of the original version and thereby facilitate future standardization of ABM descriptions. We discuss frequently raised critiques in ODD but also two emerging, and unanticipated, benefits: ODD improves the rigorous formulation of models and helps make the theoretical foundations of large models more visible. Although the protocol was designed for ABMs, it can help with documenting any large, complex model, alleviating some general objections against such models.
Belmeziti, Ali; Coutard, Olivier; de Gouvello, Bernard
2014-01-01
This paper is based on a prospective scenario of the development of rainwater harvesting (RWH) over a given large urban area (such as a metropolitan area or region). In this perspective, a new method is proposed to quantify the potential potable water savings (PPWS) indicator for this type of area by adapting the reference model usually used at the building level. The method is based on four principles: gathering (definition of building types and municipality types), progressing (use of an intermediate level), increasing (choice of an upper estimate) and prioritizing (ranking the stakes of RWH). Its application to the Paris agglomeration shows that it is possible to save up to 11% of the total current potable water consumption through the use of RWH. It also shows that the residential sector accounts for the largest share, holding two-thirds of the agglomeration's PPWS.
Rensing, Stefan A; Ick, Julia; Fawcett, Jeffrey A; Lang, Daniel; Zimmer, Andreas; Van de Peer, Yves; Reski, Ralf
2007-01-01
Background: Analyses of complete genomes and large collections of gene transcripts have shown that most, if not all seed plants have undergone one or more genome duplications in their evolutionary past. Results: In this study, based on a large collection of EST sequences, we provide evidence that the haploid moss Physcomitrella patens is a paleopolyploid as well. Based on the construction of linearized phylogenetic trees we infer the genome duplication to have occurred between 30 and 60 million years ago. Gene Ontology and pathway association of the duplicated genes in P. patens reveal different biases of gene retention compared with seed plants. Conclusion: Metabolic genes seem to have been retained in excess following the genome duplication in P. patens. This might, at least partly, explain the versatility of metabolism, as described for P. patens and other mosses, in comparison to other land plants. PMID:17683536
A Web-Based Framework For a Time-Domain Warehouse
NASA Astrophysics Data System (ADS)
Brewer, J. M.; Bloom, J. S.; Kennedy, R.; Starr, D. L.
2009-09-01
The Berkeley Transients Classification Pipeline (TCP) uses a machine-learning classifier to automatically categorize transients from large data torrents and provide automated notification of astronomical events of scientific interest. As part of the training process, we created a large warehouse of light-curve sources with well-labelled classes that serve as priors to the classification engine. This web-based interactive framework, which we are now making public via DotAstro.org (http://dotastro.org/), allows us to ingest time-variable source data in a wide variety of formats and store it in a common internal data model. Data is passed between pipeline modules in a prototype XML representation of time-series format (VOTimeseries), which can also be emitted to collaborators through dotastro.org. After import, the sources can be visualized using Google Sky, light curves can be inspected interactively, and classifications can be manually adjusted.
NASA Technical Reports Server (NTRS)
McElwain, Michael; Van Gorkom, Kyle; Bowers, Charles W.; Carnahan, Timothy M.; Kimble, Randy A.; Knight, J. Scott; Lightsey, Paul; Maghami, Peiman G.; Mustelier, David; Niedner, Malcolm B.;
2017-01-01
The James Webb Space Telescope (JWST) is a large (6.5 m) cryogenic segmented aperture telescope with science instruments that cover the near- and mid-infrared from 0.6-27 microns. The large aperture not only provides high photometric sensitivity, but it also enables high angular resolution across the bandpass, with a diffraction limited point spread function (PSF) at wavelengths longer than 2 microns. The JWST PSF quality and stability are intimately tied to the science capabilities as it is convolved with the astrophysical scene. However, the PSF evolves at a variety of timescales based on telescope jitter and thermal distortion as the observatory attitude is varied. We present the image quality and stability requirements, recent predictions from integrated modeling, measurements made during ground-based testing, and performance characterization activities that will be carried out as part of the commissioning process.
A geometric modeler based on a dual-geometry representation polyhedra and rational b-splines
NASA Technical Reports Server (NTRS)
Klosterman, A. L.
1984-01-01
For speed and database reasons, solid geometric modeling of large, complex practical systems is usually approximated by a polyhedral representation. Precise parametric-surface and implicit-algebraic modelers are available, but it is not yet practical to model the same level of system complexity with these precise modelers. In response to this contrast, the GEOMOD geometric modeling system was built so that a polyhedral abstraction of the geometry would be available for interactive modeling without losing the precise definition of the geometry. Part of the reason that polyhedral modelers are effective is that all bounded surfaces can be represented in a single canonical format (i.e., sets of planar polygons). This permits a very simple and compact data structure. Nonuniform rational B-splines are currently the best representation for describing a very large class of geometry precisely with one canonical format. The specific capabilities of the modeler are described.
Learning invariance from natural images inspired by observations in the primary visual cortex.
Teichmann, Michael; Wiltschut, Jan; Hamker, Fred
2012-05-01
The human visual system has the remarkable ability to recognize objects largely invariant of their position, rotation, and scale. A good interpretation of neurobiological findings involves a computational model that simulates the signal processing of the visual cortex. In part, this is likely achieved step by step from early to late areas of visual perception. While several algorithms have been proposed for learning feature detectors, only a few studies cover the issue of biologically plausible learning of such invariance. In this study, a set of Hebbian learning rules based on calcium dynamics and homeostatic regulation of single neurons is proposed. Their performance is verified within a simple model of the primary visual cortex to learn so-called complex cells, based on a sequence of static images. As a result, the learned complex-cell responses are largely invariant to phase and position.
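A generic flavour of such a rule can be sketched as follows (an illustration of Hebbian learning with a simple homeostatic normalisation, not the paper's calcium-based formulation):

```python
import numpy as np

# Generic illustration (not the paper's calcium-based rule): a Hebbian update
# proportional to the product of pre- and postsynaptic activity, with a simple
# homeostatic normalisation that keeps each neuron's weight vector bounded.

def hebbian_step(W, x, lr=0.01):
    """W: (n_post, n_pre) weights, x: (n_pre,) input patch."""
    y = np.maximum(W @ x, 0.0)            # rectified postsynaptic responses
    W = W + lr * np.outer(y, x)           # Hebbian: strengthen co-active pairs
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    return W / np.maximum(norms, 1.0)     # homeostasis: cap weight-vector norm

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(16, 64))  # 16 units, 8x8 input patches
for _ in range(1000):
    patch = rng.normal(size=64)           # stand-in for natural-image patches
    W = hebbian_step(W, patch)
```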
Mapping of unknown industrial plant using ROS-based navigation mobile robot
NASA Astrophysics Data System (ADS)
Priyandoko, G.; Ming, T. Y.; Achmad, M. S. H.
2017-10-01
This research examines how humans work with a teleoperated unmanned mobile robot for inspection of an industrial plant area, resulting in a 2D/3D map for further critical evaluation. The experiment focuses on two parts: the way human and robot perform remote interactions using a robust method, and the way the robot perceives the surrounding environment as a 2D/3D perspective map. ROS (Robot Operating System) was utilized during development and implementation, providing a robust data communication method in the form of messages and topics. RGB-D SLAM performs the visual mapping function to construct the 2D/3D map using a Kinect sensor. The results showed that the teleoperated mobile robot system successfully extends the human perspective for remote surveillance of large industrial plant areas. It was concluded that the proposed work is a robust solution for large-area mapping within an unknown building.
Advanced Video Analysis Needs for Human Performance Evaluation
NASA Technical Reports Server (NTRS)
Campbell, Paul D.
1994-01-01
Evaluators of human task performance in space missions make use of video as a primary source of data. Extraction of relevant human performance information from video is often a labor-intensive process requiring a large amount of time on the part of the evaluator. Based on the experiences of several human performance evaluators, needs were defined for advanced tools which could aid in the analysis of video data from space missions. Such tools should increase the efficiency with which useful information is retrieved from large quantities of raw video. They should also provide the evaluator with new analytical functions which are not present in currently used methods. Video analysis tools based on the needs defined by this study would also have uses in U.S. industry and education. Evaluation of human performance from video data can be a valuable technique in many industrial and institutional settings where humans are involved in operational systems and processes.
The comparison and analysis of extracting video key frame
NASA Astrophysics Data System (ADS)
Ouyang, S. Z.; Zhong, L.; Luo, R. Q.
2018-05-01
Video key frame extraction is an important part of large-scale data processing. Building on previous work in key frame extraction, we summarize four important key frame extraction algorithms; these methods are largely based on comparing the difference between pairs of frames, and if the difference exceeds a threshold value, the corresponding frames are taken as two different key frames. Following this analysis, key frame extraction based on mutual information is proposed: information entropy is introduced, appropriate threshold values are selected to form the initial classes, and finally frames with a similar mean mutual information are taken as candidate key frames. In this paper, these algorithms are used to extract key frames from tunnel traffic videos. Analysis of the experimental results and comparison of the pros and cons of the algorithms then provide a sound basis for practical applications.
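The two ideas summarised above, thresholding the inter-frame difference and scoring frame pairs by mutual information, can be sketched as follows (illustrative code with made-up thresholds, not the paper's implementation):

```python
import numpy as np

# Sketch of the two ideas above: (1) declare a new key frame when the mean
# absolute difference from the last key frame exceeds a threshold;
# (2) score frame pairs by histogram-based mutual information instead.

def keyframes_by_difference(frames, threshold=10.0):
    keys = [0]
    for i in range(1, len(frames)):
        if np.mean(np.abs(frames[i] - frames[keys[-1]])) > threshold:
            keys.append(i)
    return keys

def mutual_information(a, b, bins=32):
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = joint / joint.sum()
    px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz])))

# Toy sequence: a bright square drifts across a dark frame.
frames = [np.zeros((64, 64)) for _ in range(20)]
for t, f in enumerate(frames):
    f[10:30, t:t + 20] = 255.0
print(keyframes_by_difference(np.array(frames)))   # [0, 5, 10, 15] here
# Adjacent frames share more information than distant ones:
print(mutual_information(frames[0], frames[1]),
      mutual_information(frames[0], frames[15]))
```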
Calibrated peer review assignments for the earth sciences
Rudd, J.A.; Wang, V.Z.; Cervato, C.; Ridky, R.W.
2009-01-01
Calibrated Peer Review (CPR), a web-based instructional tool developed as part of the National Science Foundation reform initiatives in undergraduate science education, allows instructors to incorporate multiple writing assignments in large courses without overwhelming the instructor. This study reports successful implementation of CPR in a large, introductory geology course and student learning of geoscience content. For each CPR assignment in this study, students studied web-based and paper resources, wrote an essay, and reviewed seven essays (three from the instructor, three from peers, and their own) on the topic. Although many students expressed negative attitudes and concerns, particularly about the peer review process of this innovative instructional approach, they also recognized the learning potential of completing CPR assignments. Comparing instruction on earthquakes and plate boundaries using a CPR assignment vs. an instructional video lecture and homework essay with extensive instructor feedback, students mastered more content via CPR instruction.
Kerkmeijer, Linda G W; Fuller, Clifton D; Verkooijen, Helena M; Verheij, Marcel; Choudhury, Ananya; Harrington, Kevin J; Schultz, Chris; Sahgal, Arjun; Frank, Steven J; Goldwein, Joel; Brown, Kevin J; Minsky, Bruce D; van Vulpen, Marco
2016-01-01
An international research consortium has been formed to facilitate evidence-based introduction of MR-guided radiotherapy (MR-linac) and to address how the MR-linac could be used to achieve an optimized radiation treatment approach to improve patients' survival, local, and regional tumor control and quality of life. The present paper describes the organizational structure of the clinical part of the MR-linac consortium. Furthermore, it elucidates why collaboration on this large project is necessary, and how a central data registry program will be implemented.
Stable isotope dimethyl labelling for quantitative proteomics and beyond
Hsu, Jue-Liang; Chen, Shu-Hui
2016-01-01
Stable-isotope reductive dimethylation, a cost-effective, simple, robust, reliable and easy-to-multiplex labelling method, is widely applied to quantitative proteomics using liquid chromatography-mass spectrometry. This review focuses on biological applications of stable-isotope dimethyl labelling for a large-scale comparative analysis of protein expression and post-translational modifications based on its unique properties of the labelling chemistry. Some other applications of the labelling method for sample preparation and mass spectrometry-based protein identification and characterization are also summarized. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644970
Rule-Based Event Processing and Reaction Rules
NASA Astrophysics Data System (ADS)
Paschke, Adrian; Kozlenkov, Alexander
Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
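As a minimal illustration of the event-condition-action pattern described above (a generic sketch, not tied to any of the surveyed languages):

```python
# Generic event-condition-action (ECA) sketch: each reaction rule states a
# condition over an incoming event and the action to invoke when it holds.
from dataclasses import dataclass
from typing import Callable

@dataclass
class ReactionRule:
    condition: Callable[[dict], bool]
    action: Callable[[dict], None]

rules = [
    ReactionRule(condition=lambda e: e["type"] == "order" and e["amount"] > 10_000,
                 action=lambda e: print(f"escalate large order {e['id']}")),
    ReactionRule(condition=lambda e: e["type"] == "sensor" and e["temp"] > 90,
                 action=lambda e: print(f"trigger cooling for {e['id']}")),
]

def process(event_stream):
    for event in event_stream:            # events detected from a stream/cloud
        for rule in rules:
            if rule.condition(event):     # condition part of the reaction rule
                rule.action(event)        # action part: react to the situation

process([{"type": "order", "id": "o-17", "amount": 25_000},
         {"type": "sensor", "id": "s-3", "temp": 95}])
```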
Contrasting runoff trends between dry and wet parts of eastern Tibetan Plateau.
Wang, Yuanyuan; Zhang, Yongqiang; Chiew, Francis H S; McVicar, Tim R; Zhang, Lu; Li, Hongxia; Qin, Guanghua
2017-11-13
As the "Asian Water Tower", the Tibetan Plateau (TP) provides water resources for more than 1.4 billion people, but suffers from climatic and environmental changes, followed by the changes in water balance components. We used state-of-the-art satellite-based products to estimate spatial and temporal variations and trends in annual precipitation, evapotranspiration and total water storage change across eastern TP, which were then used to reconstruct an annual runoff variability series for 2003-2014. The basin-scale reconstructed streamflow variability matched well with gauge observations for five large rivers. Annual runoff increased strongly in dry part because of increases in precipitation, but decreased in wet part because of decreases in precipitation, aggravated by noticeable increases in evapotranspiration in the north of wet part. Although precipitation primarily governed temporal-spatial pattern of runoff, total water storage change contributed greatly to runoff variation in regions with wide-spread permanent snow/ice or permafrost. Our study indicates that the contrasting runoff trends between the dry and wet parts of eastern TP requires a change in water security strategy, and attention should be paid to the negative water resources impacts detected for southwestern part which has undergone vast glacier retreat and decreasing precipitation.
Summation-by-Parts operators with minimal dispersion error for coarse grid flow calculations
NASA Astrophysics Data System (ADS)
Linders, Viktor; Kupiainen, Marco; Nordström, Jan
2017-07-01
We present a procedure for constructing Summation-by-Parts operators with minimal dispersion error both near and far from numerical interfaces. Examples of such operators are constructed and compared with a higher order non-optimised Summation-by-Parts operator. Experiments show that the optimised operators are superior for wave propagation and turbulent flows involving large wavenumbers, long solution times and large ranges of resolution scales.
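For reference, the defining first-derivative Summation-by-Parts property (the standard definition, which the optimised operators above also satisfy) can be written as:

```latex
% Defining property of a first-derivative Summation-by-Parts operator
% (standard definition; not specific to the optimised operators above).
\[
  D = H^{-1} Q, \qquad H = H^{\mathsf{T}} > 0, \qquad
  Q + Q^{\mathsf{T}} = B = \mathrm{diag}(-1, 0, \ldots, 0, 1),
\]
\[
  u^{\mathsf{T}} H (D v) + (D u)^{\mathsf{T}} H v = u^{\mathsf{T}} B v
    = u_{N} v_{N} - u_{0} v_{0}.
\]
```

Here H acts as a quadrature (norm) matrix, so the discrete derivative mimics integration by parts exactly; dispersion optimisation then adjusts the remaining free entries of Q without breaking this property.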
NASA Astrophysics Data System (ADS)
Han, Junwon
The remarkable development of polymer synthesis techniques to make complex polymers with controlled chain architectures has inevitably demanded the advancement of polymer characterization tools to analyze the molecular dispersity in polymeric materials beyond size exclusion chromatography (SEC). In particular, man-made synthetic copolymers that consist of more than one monomer type are disperse mixtures of polymer chains that have distributions in terms of both chemical heterogeneity and chain length (molar mass). While the molecular weight distribution has been quite reliably estimated by the SEC, it is still challenging to properly characterize the chemical composition distribution in the copolymers. Here, I have developed and applied adsorption-based interaction chromatography (IC) techniques as a promising tool to characterize and fractionate polystyrene-based block, random and branched copolymers in terms of their chemical heterogeneity. The first part of this thesis is focused on the adsorption-desorption based purification of PS-b-PMMA diblock copolymers using nanoporous silica. The liquid chromatography analysis and large scale purification are discussed for the PS-b-PMMA block copolymers that have been synthesized by sequential anionic polymerization. SEC and IC are compared to critically analyze the contents of PS homopolymers in the as-synthesized block copolymers. In addition, I have developed an IC technique to provide faster and more reliable information on the chemical heterogeneity in the as-synthesized block copolymers. Finally, a large scale (multi-gram) separation technique is developed to obtain "homopolymer-free" block copolymers via a simple chromatographic filtration technique. By taking advantage of the large specific surface area of nanoporous silica (≈300 m2/g), large scale purification of neat PS-b-PMMA has successfully been achieved by controlling adsorption and desorption of the block copolymers on the silica gel surface using a gravity column. The second part of this thesis is focused on the liquid chromatography analysis and fractionation of RAFT-polymerized PS-b-PMMA diblock copolymers and AFM studies. In this study, PS-b-PMMA block copolymers were synthesized by a RAFT free radical polymerization process---the PMMA block with a phenyldithiobenzoate end group was synthesized first. The contents of unreacted PS and PMMA homopolymers in as-synthesized PS-b-PMMA block copolymers were quantitatively analyzed by a solvent gradient interaction chromatography (SGIC) technique employing bare silica and C18-bonded silica columns, respectively. In addition, by a 2-dimensional large-scale IC fractionation method, atomic force microscopy (AFM) study of these fractionated samples revealed various morphologies with respect to the chemical composition of each fraction. The third part of this thesis is to analyze random copolymers with tunable monomer sequence distributions using interaction chromatography. Here, IC was used for characterizing the composition and monomer sequence distribution in statistical copolymers of poly(styrene-co-4-bromostyrene) (PBrxS). The PBrS copolymers were synthesized by the bromination of monodisperse polystyrenes; the degree of bromination (x) and the sequence distribution were adjusted by varying the bromination time and the solvent quality, respectively.
Both normal-phase (bare silica) and reversed-phase (C18-bonded silica) columns were used at different combinations of solvents and non-solvents to monitor the content of the 4-bromostyrene units in the copolymer and their average monomer sequence distribution. The fourth part of this thesis is to analyze and fractionate highly branched polymers such as dendronized polymers and star-shaped homo and copolymers. I have developed an interaction chromatography technique to separate polymers with nonlinear chain architecture. Specifically, the IC technique has been used to separate dendronized polymers and PS-based highly branched copolymers and to ultimately obtain well-defined dendronized or branched copolymers with a low polydispersity. The effects of excess arm-polymers on (1) the micellar self-assembly of dendronized polymers and (2) the regularity of the pore morphology in the low-k applications by the sol-gel process have been studied.
Map showing general availability of ground water in the Kaiparowits coal-basin area, Utah
Price, Don
1977-01-01
This is one of a series of maps that describe the geology and related natural resources in the Kaiparowits coal-basin area. This map is based partly on records of water wells, springs, and coal and petroleum exploration holes, partly on unpublished reports of field evaluations of prospective stock-water well sites by personnel of the U.S. Geological Survey, and partly on a 6-day field reconnaissance by the writer in parts of the mapped area.Most of the data used to compile this map were collected by the U.S. Geological Survey in cooperation with State, local, and other Federal agencies. Published sources of data included Phoenix (1963), Iorns, Hembree, and Phoenix (1964), Cooley (1965), Feltis (1966), Goode (1966, 1969), and the final environmental impact statement for the proposed Kaiparowits power project (U.S. Bureau of Land Management, 1976).Few data about the availability or depth of ground water could be obtained for large areas in the Kaiparowits coal basin. In those areas, expected yields of individual wells are inferred from the geology as compiled by Stokes (1964) and Hackman and Wyant (1973), and depths of ground water in wells are inferred largely from the local topography.El Paso Natural Gas Co., Resources Co., Kaiser Engineers, and Southern California Edison Co. provided specific information regarding the availability and depth of ground water in their exploratory holes on the Kaiparowits Plateau. The cooperation of those firms is gratefully acknowledged.
D. M. Jimenez; B. W. Butler; J. Reardon
2003-01-01
Current methods for predicting fire-induced plant mortality in shrubs and trees are largely empirical. These methods are not readily linked to duff burning, soil heating, and surface fire behavior models. In response to the need for a physics-based model of this process, a detailed model for predicting the temperature distribution through a tree stem as a function of...
Part-Task Training Strategies in Simulated Carrier Landing Final Approach Training
1983-11-01
received a large amount of attention in the recent past. However, the notion that the value of flight simulation may be enhanced when principles of...as training devices through the application of principles of learning. The research proposed here is based on this point of view. THIS EXPERIMENT The...tracking. Following Goldstein's suggestion, one should look for training techniques suggested by learning principles developed from research on
1986-08-01
Technology Laboratory, Watertown, MA AIR FORCE BASIC RESEARCH IN DYNAMICS AND CONTROL OF LARGE SPACE STRUCTURES Anthony K. Amos, Bolling Air Force Base...Engineering, Watchung, NJ TEMPERATURE SHIFT CONSIDERATIONS FOR DAMPING MATERIALS L. Rogers, Air Force Wright Aeronautical Laboratories, Wright...INDUCED CAVITY ACOUSTICS L.L. Shaw, Air Force Wright Aeronautical Laboratories, Wright-Patterson AFB, OH SESSION CHAIRMEN AND COCHAIRMEN 56th Shock and
Manned Mars missions: A working group report
NASA Technical Reports Server (NTRS)
Duke, Michael B. (Editor); Keaton, Paul W. (Editor)
1986-01-01
The discussions of the Working Group (based in large part on working papers, which will shortly be published separately) are summarized. These papers cover a broad range of subjects which need to be addressed in the formulation of such a formidable enterprise as a manned Mars program. Science objective and operations; Mars surface infrastructure and activities; mission and system concepts and configurations; life sciences; impacts on the space infrastructure; and costs, schedules, and organizations are addressed.
H. Viana; Warren B. Cohen; D. Lopes; J. Aranha
2010-01-01
Following the European Union strategy concerning renewable energy (RE), Portugal established in their national policy programmes that the production of electrical energy from RE should reach 45% of the total supply by 2010. Since Portugal has large forest biomass resources, a significant part of this energy will be obtained from this source. In addition to the two...
NASA Astrophysics Data System (ADS)
Thorne, Ben; Alonso, David; Naess, Sigurd; Dunkley, Jo
2017-04-01
PySM generates full-sky simulations of Galactic foregrounds in intensity and polarization relevant for CMB experiments. The components simulated are thermal dust, synchrotron, AME, free-free, and CMB at a given Nside, with an option to integrate over a top-hat bandpass, to add white instrument noise, and to smooth with a given beam. PySM is based on the large-scale Galactic part of the Planck Sky Model code and uses some of its inputs.
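The kind of operation such a code performs can be illustrated schematically (this is not the PySM API; the template amplitudes, reference frequency and spectral indices below are made up): per-pixel component templates are scaled to the requested frequency with simple spectral laws and summed into a single map.

```python
import numpy as np

# Schematic illustration of a foreground simulator of this kind (NOT the PySM
# API): per-pixel component templates are scaled to the requested frequency
# with simple spectral laws and summed. All numbers below are made up.

def simulate_sky(freq_ghz, templates, nu0=30.0, beta_sync=-3.0, beta_dust=1.6):
    """templates: dict of per-pixel maps defined at the reference frequency nu0 (GHz)."""
    scale = {
        "cmb": 1.0,                                   # CMB template added as-is
        "synchrotron": (freq_ghz / nu0) ** beta_sync, # steep power law
        "dust": (freq_ghz / nu0) ** beta_dust,        # rising towards high freq.
    }
    return sum(scale[name] * m for name, m in templates.items())

npix = 12 * 64**2                                     # HEALPix Nside = 64
rng = np.random.default_rng(1)
templates = {"cmb": rng.normal(0, 100e-6, npix),
             "synchrotron": np.full(npix, 20e-6),
             "dust": np.full(npix, 5e-6)}
map_100ghz = simulate_sky(100.0, templates)
print(map_100ghz.shape, map_100ghz.mean())
```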
Global View of Mars Topography
NASA Technical Reports Server (NTRS)
2007-01-01
[figure removed for brevity, see original site] Annotated Version
This global map of Mars is based on topographical information collected by the Mars Orbiter Laser Altimeter instrument on NASA's Mars Global Surveyor orbiter. Illumination is from the upper right. The image width is approximately 18,000 kilometers (11,185 miles). Candor Chasma forms part of the large Martian canyon system named Valles Marineris. The location of Southwest Candor Chasma is indicated in the annotated version.
Faust: Flexible Acquisition and Understanding System for Text
2013-07-01
second version is still underway and it will continue in development as part of the DARPA DEFT program; it is written in Java and Clojure with MySQL and...SUTime, a Java library that recognizes and normalizes temporal expressions using deterministic patterns [101]. UIUC made another such framework... Java-based, large-scale inference engine called Tuffy. It leverages the full power of a relational optimizer in an RDBMS to perform the grounding of MLN
Fiber-Optic Communications Systems,
1983-12-15
inherent noise level and a large dynamic range. Usually, an amplifier with feedback, which is composed of a preamplifier and a wide-band amplifier...which are attached to a preamplifier based on an FET transistor and has a gain of around 20 dB, an adaptation part and a low...systems are used to connect anti-submarine sonar to computers for signal processing. One such transmission line can connect 52 parallel channels with a
2012-12-01
Nitrosodiethylamine; NDMA N-Nitrosodimethylamine; NDPA N-Nitrosodi-n-propylamine; ng/L nanogram/liter; NMEA N-Nitrosomethylethylamine; NMOR N...using USEPA Method 521. N-nitrosodimethylamine (NDMA) was 2.6 parts per trillion (ppt) with a detection limit of 2 ppt. All other nitrosamines...was returned to service. All samples were analyzed by Weck Laboratories using USEPA Method 521. Analytes included NDEA, NDMA, NDBA, NDPA, NMEA
NASA Technical Reports Server (NTRS)
1977-01-01
Power levels up to 100 kWe average were baselined for the electrical power system of the space construction base, a long-duration manned facility capable of supporting manufacturing and large scale construction projects in space. Alternatives to the solar array battery systems discussed include: (1) solar concentrator/brayton; (2) solar concentrator/thermionic; (3) isotope/brayton; (4) nuclear/brayton; (5) nuclear thermoelectric; and (6) nuclear thermionic.
Self-sustained magnetoelectric oscillations in magnetic resonant tunneling structures.
Ertler, Christian; Fabian, Jaroslav
2008-08-15
The dynamic interplay of transport, electrostatic, and magnetic effects in the resonant tunneling through ferromagnetic quantum wells is theoretically investigated. It is shown that the carrier-mediated magnetic order in the ferromagnetic region not only induces, but also takes part in intrinsic, robust, and sustainable high-frequency current oscillations over a large window of nominally steady bias voltages. This phenomenon could spawn a new class of quantum electronic devices based on ferromagnetic semiconductors.
Metal and Oxide Additives as Agents for Munitions Self-Remediation
2010-07-01
properties of TiO2 can be modified by adding various dopants which serve to expand the range of light energy absorbed into the visible part of the...spectrum. Photocatalyst development is an extremely active area of research with respect to both substrate and dopant. The selection of an anatase...based photocatalyst is largely due to its established dominance and chemical stability (Diebold 2003). Tungsten trioxide (WO3) is one of many dopants
3D Heart: a new visual training method for electrocardiographic analysis.
Olson, Charles W; Lange, David; Chan, Jack-Kang; Olson, Kim E; Albano, Alfred; Wagner, Galen S; Selvester, Ronald H S
2007-01-01
This new training method is based on developing a sound understanding of the sequence in which electrical excitation spreads through both the normal and the infarcted myocardium. The student is made aware of the cardiac electrical performance through a series of 3-dimensional pictures during the excitation process. The 3D Heart 3-dimensional electrocardiogram program contains a variety of different activation simulations. Currently, this program enables the user to view the activation simulation for all of the following pathology examples: normal activation; large, medium, and small anterior myocardial infarction (MI); large, medium, and small posterolateral MI; large, medium, and small inferior MI. Simulations relating to other cardiac abnormalities, such as bundle branch block, left ventricular hypertrophy, and fascicular block, are being developed as part of a National Institutes of Health (NIH) Phase 1 Small Business Innovation Research (SBIR) program.
A numerical projection technique for large-scale eigenvalue problems
NASA Astrophysics Data System (ADS)
Gamillscheg, Ralf; Haase, Gundolf; von der Linden, Wolfgang
2011-10-01
We present a new numerical technique to solve large-scale eigenvalue problems. It is based on the projection technique, used in strongly correlated quantum many-body systems, where first an effective approximate model of smaller complexity is constructed by projecting out high energy degrees of freedom and in turn solving the resulting model by some standard eigenvalue solver. Here we introduce a generalization of this idea, where both steps are performed numerically and which in contrast to the standard projection technique converges in principle to the exact eigenvalues. This approach is not just applicable to eigenvalue problems encountered in many-body systems but also in other areas of research that result in large-scale eigenvalue problems for matrices which have, roughly speaking, mostly a pronounced dominant diagonal part. We will present detailed studies of the approach guided by two many-body models.
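The projection step can be sketched with a standard Löwdin-style downfolding (not necessarily the authors' exact scheme): keep the degrees of freedom with the smallest diagonal entries, fold the high-energy block into an effective matrix, and diagonalise the much smaller result.

```python
import numpy as np

# Sketch of the projection idea via a standard Lowdin-style downfolding
# (illustrative; not necessarily the authors' exact scheme).

def downfold(H, n_keep, e_ref=0.0):
    order = np.argsort(np.diag(H))
    P, Q = order[:n_keep], order[n_keep:]          # low/high-energy indices
    Hpp, Hpq = H[np.ix_(P, P)], H[np.ix_(P, Q)]
    Hqp, Hqq = H[np.ix_(Q, P)], H[np.ix_(Q, Q)]
    # Effective matrix evaluated at the reference energy e_ref:
    H_eff = Hpp + Hpq @ np.linalg.solve(e_ref * np.eye(len(Q)) - Hqq, Hqp)
    return np.linalg.eigvalsh(H_eff)

rng = np.random.default_rng(2)
n = 400
H = np.diag(np.linspace(0.0, 50.0, n)) + 0.1 * rng.standard_normal((n, n))
H = 0.5 * (H + H.T)                                # symmetric test matrix with dominant diagonal
print(downfold(H, n_keep=40)[:5])                  # approximate lowest eigenvalues
print(np.linalg.eigvalsh(H)[:5])                   # exact reference
```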
Two cases of exenteration of the brain from Brenneke shotgun slugs.
Karger, B; Banaschak, S
1997-01-01
A case of extended suicide resulted in two fatalities due to craniocerebral gunshots from a 12-gauge shotgun firing Brenneke shotgun slugs. In each case, the gunshot shattered the skull and the brain and in one case, large parts of the brain including a complete hemisphere were ejected similar to a "Krönlein shot". The location of the trajectory close to the base of the skull, the muzzle gases and the ballistic characteristics of the missile contributed to this rare form of head injury. The high mass and the large diameter of the lead missile do not necessitate a high muzzle velocity to crush large amounts of tissue or to produce an explosive type of head injury. The wadding material and the metal screw attached to the Brenneke slug can be of forensic significance.
A time for transformative leadership in academic health sciences.
Armstrong, Paul W
2007-01-01
Academic medicine, in its broadest sense, has made major contributions to human health in the past quarter century. This has been achieved in large part because it has attracted an outstanding cadre of--largely altruistic--professionals. These pioneering efforts have served as the life-blood of the discipline. Their journeys of discovery, often complemented by collaboration with the pharmaceutical, biotechnological and device industry have yielded remarkable insights into the diagnosis, treatment and prevention of disease and been celebrated by a stunning array of Nobel laureates in medicine and related arenas of endeavour.1 The translation of discovery to the bedside, clinic and the community coupled, most recently, with insights into the gap between potential effectiveness and what ultimately occurs as part of health care delivery, have been monumental in scope. This progress has unquestionably been the province of the university based clinician scientist. Within Canada, the emergence of the Canadian Institutes of Health Research, the Canadian Foundation for Innovation, and the Canada Research Chairs has been pivotal in launching the careers of a new generation of clinician scientists. The excitement of discovery, gratification associated with direct patient care, and satisfaction of inspiring learning while engaging the next generation of emerging health professionals is rewarded by a career in academic medicine characterized by extraordinary challenge, fulfillment and meaning. As remarkable as these advances in quantity and quality of life have been (in large part attributable to health care research and its implementation) the promises of molecular medicine and abundant new technologies portend an exciting future whereby academic medicine can build upon its noble and traditional contributions to human health.
Schellenberg, Florian; Taylor, William R; Lorenzetti, Silvio
2017-01-01
To ensure an efficient and targeted adaptation with low injury risk during strength exercises, knowledge of the participant-specific internal loading conditions is essential. The goal of this study was to calculate the lower limb muscle forces during the strength exercises deadlifts, goodmornings and split squats by means of musculoskeletal simulation. 11 participants were assessed performing 10 different variations of split squats, varying the step length as well as the maximal frontal tibia angle, and 13 participants were measured performing deadlift and goodmorning exercises. Using individualised musculoskeletal models, forces of the Quadriceps (four parts), Hamstrings (four parts) and m. gluteus maximus (three parts) were computed. Deadlifts produced the highest loading for the Quadriceps, especially for the vasti (18-34 N/kg), but not for the rectus femoris (8-10 N/kg), which exhibited its greatest loading during split squats (13-27 N/kg) in the rear limb. The Hamstrings were loaded isometrically during goodmornings but dynamically during deadlifts. For the m. gluteus maximus, the highest loading was observed during split squats in the front limb (up to 25 N/kg), while deadlifts produced increasingly large loading over large ranges of motion at the hip and knee. Acting muscle forces vary between exercises, execution form and joint angle. For all examined muscles, deadlifts produced considerable loading over large ranges of motion, while split squats seem to be highly dependent upon exercise variation. This study provides key information for designing strength-training programs with respect to the loading conditions and ranges of motion of lower extremity muscles.
Incorporating social and cultural significance of large old trees in conservation policy.
Blicharska, Malgorzata; Mikusiński, Grzegorz
2014-12-01
In addition to providing key ecological functions, large old trees are a part of a social realm and as such provide numerous social-cultural benefits to people. However, their social and cultural values are often neglected when designing conservation policies and management guidelines. We believe that awareness of large old trees as a part of human identity and cultural heritage is essential when addressing the issue of their decline worldwide. Large old trees provide humans with aesthetic, symbolic, religious, and historic values, as well as concrete tangible benefits, such as leaves, branches, or nuts. In many cultures particularly large trees are treated with reverence. Also, contemporary popular culture utilizes the image of trees as sentient beings and builds on the ancient myths that attribute great powers to large trees. Although the social and cultural role of large old trees is usually not taken into account in conservation, accounting for human-related values of these trees is an important part of conservation policy because it may strengthen conservation by highlighting the potential synergies in protecting ecological and social values. © 2014 Society for Conservation Biology.
Detection of early caries by laser-induced breakdown spectroscopy
NASA Astrophysics Data System (ADS)
Sasazawa, Shuhei; Kakino, Satoko; Matsuura, Yuji
2015-07-01
To improve the sensitivity of dental caries detection by laser-induced breakdown spectroscopy (LIBS) analysis, we propose utilizing emission peaks in the ultraviolet. We focused on zinc, whose emission peaks lie in the ultraviolet, because zinc exists at high concentration in the outer layer of enamel. It was shown that by using ratios between the heights of a Zn emission peak and a Ca emission peak, the detection sensitivity and stability are largely improved. It was also shown that early caries can be differentiated from healthy regions by properly setting a threshold on the detected ratios. The proposed caries detection system can be applied to dental laser systems such as those based on Er:YAG lasers. When ablating an early caries region with laser light, the system notifies the dentist that the ablation of the caries region is finished. We also show that the intensity of the zinc emission peaks decreases during ablation with Er:YAG laser light.
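The peak-ratio decision described above can be sketched as follows (the threshold value and the decision direction are illustrative assumptions; in practice both would come from calibration measurements):

```python
# Minimal sketch of the peak-ratio criterion: the height of a Zn emission line
# is divided by that of a Ca line and compared to a calibrated threshold.
# Threshold value and decision direction here are illustrative assumptions.

def zn_ca_ratio(zn_peak_height, ca_peak_height):
    return zn_peak_height / ca_peak_height

def flag_spot(ratio, threshold=0.15):
    # Assumed convention: ratios above the calibrated threshold are flagged.
    return "flagged (possible early caries)" if ratio > threshold else "not flagged"

r = zn_ca_ratio(zn_peak_height=480.0, ca_peak_height=1600.0)  # 0.3
print(r, flag_spot(r))
```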
Developments in platinum anticancer drugs
NASA Astrophysics Data System (ADS)
Tylkowski, Bartosz; Jastrząb, Renata; Odani, Akira
2018-01-01
Platinum compounds represent one of the great success stories of metals in medicine. Following the unexpected discovery of the anticancer activity of cisplatin (Fig. 1) in 1965 by Prof. Rosenberg [1], a large number of its variants have been prepared and tested for their ability to kill cancer cells and inhibit tumor growth. Although cisplatin has been in use for over four decades, new and more effective platinum-based therapeutics are finally on the horizon. A wide introduction to anticancer studies is given by the authors of the previous chapter. This chapter aims at providing the readers with a comprehensive and in-depth understanding of recent developments of platinum anticancer drugs and to review the state of the art. The chapter is divided into two parts. In the first part we present a historical aspect of platinum and its complexes, while in the second part we give an overview of developments in the field of platinum anticancer agents.
Ground Based Studies of Thermocapillary Flows in Levitated Drops: Analytical Part
NASA Technical Reports Server (NTRS)
Sadhal, S. S.; Trinh, Eugene H.
1997-01-01
The main objectives of the analytical part of this investigation are to study the fluid flow phenomena together with the thermal effects on drops levitated in an acoustic field. To a large extent, experimentation on the ground requires a strong acoustic field that interferes significantly with other thermal-fluid effects. While most of the work has been directed towards particles in strong acoustic fields to overcome gravity, some results for microgravity have been obtained. One of the objectives was to obtain the thermocapillary flow in a spot-heated drop and to set up a model for the prediction of thermophysical properties. In addition, for acoustically levitated particles, a clear understanding of the underlying fluid mechanics was required. Also, the interaction of acoustics with steady and pulsating thermal stimuli needed to be analyzed. The experimental part of the work was funded through JPL and has been reported separately.
The use of COD and plastic instability in crack propagation and arrest in shells
NASA Technical Reports Server (NTRS)
Erdogan, F.; Ratwani, M.
1974-01-01
The initiation, growth, and possible arrest of fracture in cylindrical shells containing initial defects are dealt with. For those defects which may be approximated by a part-through semi-elliptic surface crack which is sufficiently shallow so that part of the net ligament in the plane of the crack is still elastic, the existing flat plate solution is modified to take into account the shell curvature effect as well as the effect of the thickness and the small scale plastic deformations. The problem of large defects is then considered under the assumptions that the defect may be approximated by a relatively deep meridional part-through surface crack and the net ligament through the shell wall is fully yielded. The results given are based on an 8th order bending theory of shallow shells using a conventional plastic strip model to account for the plastic deformations around the crack border.
Complexity analysis and mathematical tools towards the modelling of living systems.
Bellomo, N; Bianca, C; Delitala, M
2009-09-01
This paper is a review and critical analysis of the mathematical kinetic theory of active particles applied to the modelling of large living systems made up of interacting entities. The first part of the paper is focused on a general presentation of the mathematical tools of the kinetic theory of active particles. The second part provides a review of a variety of mathematical models in life sciences, namely complex social systems, opinion formation, evolution of epidemics with virus mutations, and vehicular traffic, crowds and swarms. All the applications are technically related to the mathematical structures reviewed in the first part of the paper. The overall contents are based on the concept that living systems, unlike the inert matter, have the ability to develop behaviour geared towards their survival, or simply to improve the quality of their life. In some cases, the behaviour evolves in time and generates destructive and/or proliferative events.
Transboundary Groundwater Body Karavanke/Karawanken Between Austria and Slovenia
NASA Astrophysics Data System (ADS)
Brencic, M.; Poltnig, W.
2009-04-01
A large part of the border region between the Republic of Slovenia and the Republic of Austria is occupied by the high, east-west trending mountain ridge of Karavanke/Karawanken. It is a range extending along the Slovenian-Austrian border for almost 150 km. Its terrain consists of long and prominent ridges, whose slopes fall steeply to the northern and southern sides. The ridges are interrupted by long, deep and narrow valleys. The highest peaks reach over 2000 m above sea level. In the entire range, prominent ridges with mountain meadows and forests prevail. The area is sparsely populated; the main economic activities are grazing and forestry, and in some places tourism is also developing, especially winter sports centres. Karavanke/Karawanken lies on the contact between two continental plates, the large European plate in the north and the smaller Adriatic plate in the south. When the Adriatic plate was thrust over the European one towards the north, the collision resulted in the folding of sediments previously deposited in the space between the plates. The contact of the two plates caused large lateral displacements, causing the rocks of both plates to fold and fault and then extend along the contact. This is the area of the Periadriatic lineament, which divides the Karavanke/Karawanken range into its northern and southern parts. The Periadriatic lineament is a large strike-slip tectonic structure along which rocks on the northern side were extruded to the east and rocks on the southern side to the west. Along the lineament, metamorphic (e.g. biotitic and feldspathic paragneiss, amphibolites) and magmatic (e.g. diabase, granite and tonalite) rocks of various ages are present. Palaeozoic sedimentary rocks cover a large part of the mountain ridge. The oldest are Silurian and Ordovician limestones on the northern border, followed by Devonian ridge limestones. They are covered by molasse sedimentation from the Carboniferous and by predominantly clastic shallow-marine and fluvial sedimentation from the Permian. The most abundant rocks, occurring in numerous varieties, are of Triassic age. In general they can be divided into rocks of the Northern and Southern Karavanke/Karawanken, deposited in different sedimentation basins. In the lower part clastic rocks prevail, while towards the upper part of the Triassic more and more carbonate rocks are present. In the Southern Karavanke/Karawanken, sedimentary rocks formed both in the deeper basin and on the carbonate platform are present, whereas in the Northern Karavanke/Karawanken sedimentary rocks of shallower sedimentary environments predominate. In the Upper Triassic of the Northern Karavanke/Karawanken, large zinc and lead ore deposits were formed. Among younger rocks, only small patches are present. The most abundant are the Rosenbacher coal-bearing beds of the Jauntal/Juna in Austria, of Miocene age, in which the uplift history of Karavanke/Karawanken is very well reflected. Extensive Quaternary sediments are present as slope sediments and as sediments filling deep valleys. At the end of the 20th century a decision was made to construct a 7.8 km long road tunnel through Karavanke/Karawanken between Hrušica on the Slovenian side and Rosenbach/Podrožca on the Austrian side. It was established already during construction that the waters flowing from the tunnel represent an important water resource. In Slovenia some of these springs were captured and led into the water supply network, while in Austria they remained a well-protected water resource for the future. Such important water resources require protection, which in turn demands knowledge about their recharge areas.
This fact stimulated the authorities of both countries to support the start of hydrogeological investigations in the western Karavanke/Karawanken region through the joint "Drava/Drau water-management commission" and its subcommission "Drinking water reserves of Karavanke/Karawanken mountains". During these investigations, detailed hydrogeological mapping of the whole Karavanke/Karawanken ridge was carried out, followed by sampling of important springs and low-water discharge measurements. Samples were taken for basic chemistry and for stable isotope determination of the water, as well as for some more sophisticated analyses (e.g. isotope analyses of noble gases) in the areas where mineral waters occur. An important part of the investigations was the production and compilation of a new geological map based on older published and unpublished geological maps from both sides of the state border. This map provided the background for the definition of hydrogeological and other detailed and specific maps (e.g. risk potential and vulnerability maps). Based on these results, a basic hydrological balance of the area was calculated, cross-border flow was identified, and protection measures were finally suggested. A large part of Karavanke/Karawanken is built of karstified carbonate rocks, limestone and dolomite, with underlying Palaeozoic limestones. The largest part of the karstified rocks lies in the area of the Northern Karavanke/Karawanken, the Košuta unit and the Kamnik-Savinja Alps. About 3600 springs were recorded in the Karavanke/Karawanken area on both sides of the Austrian-Slovenian state border between 1990 and 2002. For each spring, discharge, electrical conductivity and water temperature were determined. Most of the springs have a small discharge. Only a few very large springs flowing from karstic aquifers were found to have recharge areas extending across the state border. In 2004, based on the bilateral agreement between the Republic of Slovenia and the Republic of Austria, the common transboundary groundwater body Karavanke/Karawanken was defined. The body is defined according to the Water Framework Directive requirements and extends over the area of the main border ridge. It is divided into areas where surface outflow prevails, which depends only on the surface topography, and areas where groundwater outflow is present. Within this common water body, five cross-border aquifers were determined.
STT Doubles with Large DM - Part IV: Ophiuchus and Hercules
NASA Astrophysics Data System (ADS)
Knapp, Wilfried; Nanson, John
2016-04-01
The results of visual double star observing sessions suggested a pattern for STT doubles with large DM of being harder to resolve than would be expected based on the WDS catalog data. It was felt this might be a problem with expectations on one hand, and on the other might be an indication of a need for new precise measurements, so we decided to take a closer look at a selected sample of STT doubles and do some research. We found that, as in the other constellations covered so far (Gem, Leo, UMa, etc.), at least several of the selected objects in Ophiuchus and Hercules show parameters quite different from the current WDS data.
STT Doubles with Large DM - Part V: Aquila, Delphinus, Cygnus, Aquarius
NASA Astrophysics Data System (ADS)
Knapp, Wilfried; Nanson, John
2016-07-01
The results of visual double star observing sessions suggested a pattern for STT doubles with large DM of being harder to resolve than would be expected based on the WDS catalog data. It was felt this might be a problem with expectations on one hand, and on the other might be an indication of a need for new precise measurements, so we decided to take a closer look at a selected sample of STT doubles and do some research. We found that, as in the other constellations covered so far (Gem, Leo, UMa, etc.), at least several of the selected objects in Aql, Del, Cyg and Aqr show parameters quite different from the current WDS data.
Aerosol backscatter lidar calibration and data interpretation
NASA Technical Reports Server (NTRS)
Kavaya, M. J.; Menzies, R. T.
1984-01-01
A treatment of the various factors involved in lidar data acquisition and analysis is presented. This treatment highlights sources of fundamental, systematic, modeling, and calibration errors that may affect the accurate interpretation and calibration of lidar aerosol backscatter data. The discussion primarily pertains to ground based, pulsed CO2 lidars that probe the troposphere and are calibrated using large, hard calibration targets. However, a large part of the analysis is relevant to other types of lidar systems such as lidars operating at other wavelengths; continuous wave (CW) lidars; lidars operating in other regions of the atmosphere; lidars measuring nonaerosol elastic or inelastic backscatter; airborne or Earth-orbiting lidar platforms; and lidars employing combinations of the above characteristics.
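The calibration discussion above rests on the elastic-backscatter lidar equation, which the abstract does not reproduce. As background only, a commonly used single-scattering form (with all system factors lumped into a calibration constant K, an assumption of this simplified presentation) is \( P(R) = K\,\frac{A}{R^{2}}\,\beta(R)\,\exp\!\left(-2\int_{0}^{R}\alpha(r)\,dr\right) \), where P(R) is the received power from range R, A the effective receiver area, β the aerosol backscatter coefficient and α the extinction coefficient; estimating K against a hard target of known reflectance is the calibration step the abstract refers to.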
Zhang, Qin
2015-07-01
Probabilistic graphical models (PGMs) such as Bayesian network (BN) have been widely applied in uncertain causality representation and probabilistic reasoning. Dynamic uncertain causality graph (DUCG) is a newly presented model of PGMs, which can be applied to fault diagnosis of large and complex industrial systems, disease diagnosis, and so on. The basic methodology of DUCG has been previously presented, in which only the directed acyclic graph (DAG) was addressed. However, the mathematical meaning of DUCG was not discussed. In this paper, the DUCG with directed cyclic graphs (DCGs) is addressed. In contrast, BN does not allow DCGs, as otherwise the conditional independence will not be satisfied. The inference algorithm for the DUCG with DCGs is presented, which not only extends the capabilities of DUCG from DAGs to DCGs but also enables users to decompose a large and complex DUCG into a set of small, simple sub-DUCGs, so that a large and complex knowledge base can be easily constructed, understood, and maintained. The basic mathematical definition of a complete DUCG with or without DCGs is proved to be a joint probability distribution (JPD) over a set of random variables. The incomplete DUCG as a part of a complete DUCG may represent a part of JPD. Examples are provided to illustrate the methodology.
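As a point of reference for the joint probability distribution statement above, the sketch below (Python, with illustrative variable names and probability tables that are assumptions, not taken from the DUCG paper) shows how a DAG-structured model factorizes its joint distribution into per-node conditionals and how a posterior then follows by marginalization; it is not the DUCG inference algorithm itself.

```python
# Minimal sketch: the JPD of a DAG-structured model factorizes as
# P(X1..Xn) = prod_i P(Xi | parents(Xi)). Tables below are made up.
# Tiny DAG: Fault -> Alarm, Fault -> SensorHigh (all variables binary).
p_fault = {True: 0.01, False: 0.99}
p_alarm_given_fault = {True: {True: 0.95, False: 0.05},
                       False: {True: 0.02, False: 0.98}}
p_sensor_given_fault = {True: {True: 0.90, False: 0.10},
                        False: {True: 0.05, False: 0.95}}

def joint(fault, alarm, sensor):
    """Chain-rule factorization over the DAG."""
    return (p_fault[fault]
            * p_alarm_given_fault[fault][alarm]
            * p_sensor_given_fault[fault][sensor])

# Posterior probability of a fault given an observed alarm,
# obtained by marginalizing the joint distribution.
num = sum(joint(True, True, s) for s in (True, False))
den = sum(joint(f, True, s) for f in (True, False) for s in (True, False))
print("P(Fault | Alarm) =", num / den)
```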
Seal, Dakshina R.; Martin, Cliff G.
2016-01-01
Peppers (Capsicum spp.) are an important crop in the USA, with about 32,000 ha cultivated in 2007, which resulted in $588 million in farm revenue. The pepper weevil, Anthonomus eugenii Cano (Coleoptera: Curculionidae), is the most troublesome insect pest of peppers in the southern United States. It is therefore urgent to determine the relative vulnerabilities of pepper cultivars, fruit and plant parts, fruit colors and sizes, and the timing of infestation by A. eugenii. Also relevant is testing whether fruit length and infestation state affect fruit numbers, weights, and the proportions of fruit that are infested. Counts of A. eugenii adults and marks from oviposition and feeding suggested that C. chinense Jacquin “Habanero” was least susceptible, and C. annuum L. cultivars “SY” and “SR” were most susceptible. Comparison of plant parts and fruit sizes revealed that A. eugenii preferred the peduncle, calyx, and top of pepper fruits over the middle, bottom, leaves, or remainder of flowers. Anthonomus eugenii did not discriminate between green and yellow fruit colors, nor did its numbers vary diurnally. Based on adult counts, medium to extra-large fruits (≥1.5 cm long) attracted more weevils than small fruits (<1.5 cm). However, based on the proportions of fruit numbers or fruit weights that were infested, there were no differences between large and small fruits. Choice of pepper cultivar can thus be an important part of an IPM cultural control program designed to combat A. eugenii through reduced susceptibility or through synchronous drop of infested fruits. Our results are potentially helpful in developing scouting programs, including paying particular attention to the preferred locations of adults and their sites of feeding and oviposition on the fruit. The results also suggest the potential value of spraying when the fruits are still immature to prevent and control infestation. PMID:26959066
Kinematic modeling of a double octahedral Variable Geometry Truss (VGT) as an extensible gimbal
NASA Technical Reports Server (NTRS)
Williams, Robert L., II
1994-01-01
This paper presents the complete forward and inverse kinematics solutions for control of the three degree-of-freedom (DOF) double octahedral variable geometry truss (VGT) module as an extensible gimbal. A VGT is a truss structure partially comprised of linearly actuated members. A VGT can be used as joints in a large, lightweight, high load-bearing manipulator for earth- and space-based remote operations, plus industrial applications. The results have been used to control the NASA VGT hardware as an extensible gimbal, demonstrating the capability of this device to be a joint in a VGT-based manipulator. This work is an integral part of a VGT-based manipulator design, simulation, and control tool.
Short Duration Base Heating Test Improvements
NASA Technical Reports Server (NTRS)
Bender, Robert L.; Dagostino, Mark G.; Engel, Bradley A.; Engel, Carl D.
1999-01-01
Significant improvements have been made to a short duration space launch vehicle base heating test technique. This technique was first developed during the 1960's to investigate launch vehicle plume induced convective environments. Recent improvements include the use of coiled nitrogen buffer gas lines upstream of the hydrogen / oxygen propellant charge tubes, fast acting solenoid valves, stand alone gas delivery and data acquisition systems, and an integrated model design code. Technique improvements were successfully demonstrated during a 2.25% scale X-33 base heating test conducted in the NASA/MSFC Nozzle Test Facility in early 1999. Cost savings of approximately an order of magnitude over previous tests were realized due in large part to these improvements.
NASA Astrophysics Data System (ADS)
Nishii, R.; Imaizumi, F.; Murakami, W.; Daimaru, H.; Miyamae, T.; Ogawa, Y.
2012-04-01
The Akakuzure landslide in the Japanese Alps is located on a steep mountain slope that has experienced deep-seated gravitational slope deformation. The landslide is 700 m high (1200-1900 m a.s.l.), 700 m wide and 400000 m2 in area, with post-collapse sediment of ca. 27 million m3 in volume. The steep rock slope (>40°) within the landslide shows an anaclinal structure consisting of sandstone interbedded with shale. The large volume of sediment produced from the landslide has actively formed an alluvial fan at the outlet of the landslide. The volume and processes of sediment production in the upper part (ca. 40000 m2) of the landslide were evaluated by geodetic surveys using airborne and ground-based LiDAR (Light Detection and Ranging). The airborne and ground-based LiDAR surveys were performed twice (2003 and 2007) and three times (2010-2011), respectively. Ground surface temperatures were monitored at 3 locations within the landslide from 2010 to 2011. Precipitation and air temperature were also observed at a meteorological station near the study site. The average erosion depths in the observed rock slope reached 0.89 m (0.22 m/yr) during the first 4 years (2003-2007) and 0.55 m (0.18 m/yr) during the later 3 years (2007-2010). The erosion occurred mainly within the landslide rather than on its edge (i.e. no significant retreat of the main scarp). This large sediment production can be divided into three processes based on the depth of detachment. Deep detachment (>5 m in depth), contributing significantly to the retreat of the rock slope, occurred in large blocks located just above knick lines. During the observation period at least five large blocks fell down, which appears to originate from sliding along detachment zones steeper than 30°. Second, anaclinal bedding-parallel blocks (1-2 m in depth) fell down, mainly around sandstone layers. Finally, thin detachment (<1 m in depth) occurred widely on the rock slope; on one part of the shale layers, the erosion depth reached 0.35 m from 2010 to 2011. In the Akakuzure landslide, numerous fractures of the bedrock, probably produced by gravitational deformation, play an important role in promoting the rapid erosion, in addition to external triggers such as heavy rainfall and frost action.
De Sanctis, A; Russo, S; Craciun, M F; Alexeev, A; Barnes, M D; Nagareddy, V K; Wright, C D
2018-06-06
Graphene-based materials are being widely explored for a range of biomedical applications, from targeted drug delivery to biosensing, bioimaging and use for antibacterial treatments, to name but a few. In many such applications, it is not graphene itself that is used as the active agent, but one of its chemically functionalized forms. The type of chemical species used for functionalization will play a key role in determining the utility of any graphene-based device in any particular biomedical application, because this determines to a large part its physical, chemical, electrical and optical interactions. However, other factors will also be important in determining the eventual uptake of graphene-based biomedical technologies, in particular the ease and cost of manufacture of proposed device and system designs. In this work, we describe three novel routes for the chemical functionalization of graphene using oxygen, iron chloride and fluorine. We also introduce novel in situ methods for controlling and patterning such functionalization on the micro- and nanoscales. Our approaches are readily transferable to large-scale manufacturing, potentially paving the way for the eventual cost-effective production of functionalized graphene-based materials, devices and systems for a range of important biomedical applications.
A Review of Large Solid Rocket Motor Free Field Acoustics, Part I
NASA Technical Reports Server (NTRS)
Pilkey, Debbie; Kenny, Robert Jeremy
2011-01-01
At the ATK facility in Utah, large full-scale solid rocket motors are tested. The largest is a five-segment version of the Reusable Solid Rocket Motor, which is intended for use on future launch vehicles. Since 2006, acoustic measurements have been taken on large solid rocket motors at ATK; both the four-segment RSRM and the five-segment RSRMV have been instrumented. Measurements are used to update acoustic prediction models and to correlate against vibration responses of the motor. The presentation focuses on two major sections: Part I) unique challenges associated with measuring rocket acoustics; Part II) a summary of acoustic measurements over the past five years.
Paleoglaciation of the Tibetan Plateau based on exposure ages and ELA depression estimates
NASA Astrophysics Data System (ADS)
Heyman, Jakob
2014-05-01
The Tibetan Plateau holds a major part of all glaciers outside the polar regions and an ample record of past glaciations. The glacial history of the Tibetan Plateau has attracted significant interest, with a large body of research investigating the extent, timing, and climatic implications of past glaciations. Here I present an extensive compilation of exposure ages and equilibrium line altitude (ELA) depression estimates from glacial deposits across the Tibetan Plateau to address the timing and degree of past glaciations. I compiled Be-10 exposure age data for a total of 1877 samples and recalculated exposure ages using an updated (lower) global Be-10 production rate. All samples were organized in groups of individual glacial deposits where each deposit represents one glacial event enabling evaluation of the exposure age clustering. For each glacial deposit I estimated the ELA depression based on a simple toe to headwall ratio approach using Google Earth. To discriminate good (well-clustered) from poor (scattered) exposure age groups the glacial deposits were divided into three groups based on exposure age clustering. A major part of the glacial deposits have scattered exposure ages affected by prior or incomplete exposure, complicating exposure age interpretations. The well-clustered exposure age groups are primarily from mountain ranges along the margins of the Tibetan Plateau with a main peak in age between 10 and 30 ka, indicating glacial advances during the global last glacial maximum (LGM). A large number of exposure ages older than 30 ka indicates maximum glaciation predating the LGM, but the exposure age scatter generally prohibits accurate definition of the glacial chronology. The ELA depression estimates scatter significantly, but a major part is remarkably low. Average ELA depressions of 333 ± 191 m for the LGM and 494 ± 280 m for the pre-LGM exposure indicate restricted glacier expansion and limited glacial cooling.
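The toe-to-headwall ratio approach mentioned above reduces to a one-line estimate; the sketch below is a minimal illustration assuming a ratio of 0.5 and hypothetical altitudes, not values taken from the compilation itself.

```python
def ela_thar(toe_alt_m, headwall_alt_m, thar=0.5):
    """Toe-to-headwall altitude ratio (THAR) estimate of the equilibrium
    line altitude: ELA = toe + THAR * (headwall - toe)."""
    return toe_alt_m + thar * (headwall_alt_m - toe_alt_m)

# Hypothetical example: paleo-glacier toe at 4200 m, headwall at 5400 m,
# compared against an assumed modern ELA of 5300 m.
paleo_ela = ela_thar(4200.0, 5400.0)      # 4800 m
ela_depression = 5300.0 - paleo_ela       # 500 m
print(paleo_ela, ela_depression)
```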
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-23
... and plastic parts coatings; large appliance coatings; offset lithographic printing and letterpress... local air pollution control authorities information that should assist them in determining RACT for VOC... plastic parts coatings; (4) large appliance coatings; (5) offset lithographic printing and letterpress...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berkel, M. van; Fellow of the Japan Society for the Promotion of Science; FOM Institute DIFFER-Dutch Institute for Fundamental Energy Research, Association EURATOM- FOM, Trilateral Euregio Cluster, PO Box 1207, 3430 BE Nieuwegein
2014-11-15
In this paper, a number of new approximations are introduced to estimate the perturbative diffusivity (χ), convectivity (V), and damping (τ) in cylindrical geometry. For this purpose, the harmonic components of heat waves induced by localized deposition of modulated power are used. The approximations are based on semi-infinite slab approximations of the heat equation. The main result is the approximation of χ under the influence of V and τ based on the phase of two harmonics making the estimate less sensitive to calibration errors. To understand why the slab approximations can estimate χ well in cylindrical geometry, the relationships between heat transport models in slab and cylindrical geometry are studied. In addition, the relationship between amplitude and phase with respect to their derivatives, used to estimate χ, is discussed. The results are presented in terms of the relative error for the different derived approximations for different values of frequency, transport coefficients, and dimensionless radius. The approximations show a significant region in which χ, V, and τ can be estimated well, but also regions in which the error is large. Also, it is shown that some compensation is necessary to estimate V and τ in a cylindrical geometry. On the other hand, errors resulting from the simplified assumptions are also discussed showing that estimating realistic values for V and τ based on infinite domains will be difficult in practice. This paper is the first part (Part I) of a series of three papers. In Part II and Part III, cylindrical approximations based directly on semi-infinite cylindrical domain (outward propagating heat pulses) and inward propagating heat pulses in a cylindrical domain, respectively, will be treated.
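For reference, the simplest of these slab relations (pure diffusion, no convectivity or damping, and therefore not the paper's two-harmonic estimator) links the phase delay of a single harmonic at angular frequency ω between two radii separated by Δr to the diffusivity: \( \tilde{T}(r,\omega) \propto \exp\!\left[-(1+i)\,r\sqrt{\tfrac{\omega}{2\chi}}\right] \;\Rightarrow\; \chi \approx \dfrac{\omega\,\Delta r^{2}}{2\,(\Delta\phi)^{2}} \), with an analogous relation following from the amplitude decay.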
Ensemble Kalman filters for dynamical systems with unresolved turbulence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grooms, Ian, E-mail: grooms@cims.nyu.edu; Lee, Yoonsang; Majda, Andrew J.
Ensemble Kalman filters are developed for turbulent dynamical systems where the forecast model does not resolve all the active scales of motion. Coarse-resolution models are intended to predict the large-scale part of the true dynamics, but observations invariably include contributions from both the resolved large scales and the unresolved small scales. The error due to the contribution of unresolved scales to the observations, called 'representation' or 'representativeness' error, is often included as part of the observation error, in addition to the raw measurement error, when estimating the large-scale part of the system. It is here shown how stochastic superparameterization (a multiscale method for subgridscale parameterization) can be used to provide estimates of the statistics of the unresolved scales. In addition, a new framework is developed wherein small-scale statistics can be used to estimate both the resolved and unresolved components of the solution. The one-dimensional test problem from dispersive wave turbulence used here is computationally tractable yet is particularly difficult for filtering because of the non-Gaussian extreme event statistics and substantial small scale turbulence: a shallow energy spectrum proportional to k^(−5/6) (where k is the wavenumber) results in two-thirds of the climatological variance being carried by the unresolved small scales. Because the unresolved scales contain so much energy, filters that ignore the representation error fail utterly to provide meaningful estimates of the system state. Inclusion of a time-independent climatological estimate of the representation error in a standard framework leads to inaccurate estimates of the large-scale part of the signal; accurate estimates of the large scales are only achieved by using stochastic superparameterization to provide evolving, large-scale dependent predictions of the small-scale statistics. Again, because the unresolved scales contain so much energy, even an accurate estimate of the large-scale part of the system does not provide an accurate estimate of the true state. By providing simultaneous estimates of both the large- and small-scale parts of the solution, the new framework is able to provide accurate estimates of the true system state.
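For readers unfamiliar with the filter itself, the sketch below shows a generic perturbed-observation EnKF analysis step, the building block the abstract assumes; it does not include the stochastic superparameterization or representation-error treatment described above, and all array shapes and values are illustrative.

```python
import numpy as np

def enkf_analysis(ensemble, y_obs, H, R, rng):
    """One perturbed-observation EnKF analysis step.
    ensemble: (n_state, n_members) forecast ensemble
    y_obs:    (n_obs,) observation vector
    H:        (n_obs, n_state) linear observation operator
    R:        (n_obs, n_obs) observation-error covariance
    """
    n_members = ensemble.shape[1]
    x_mean = ensemble.mean(axis=1, keepdims=True)
    X = ensemble - x_mean                      # state anomalies
    Y = H @ X                                  # observation-space anomalies
    P_yy = Y @ Y.T / (n_members - 1) + R
    P_xy = X @ Y.T / (n_members - 1)
    K = P_xy @ np.linalg.inv(P_yy)             # Kalman gain
    # Perturb observations so the analysis ensemble keeps the right spread.
    y_pert = y_obs[:, None] + rng.multivariate_normal(
        np.zeros(len(y_obs)), R, size=n_members).T
    return ensemble + K @ (y_pert - H @ ensemble)

# Toy usage: 3-variable state, 20 members, observing the first variable only.
rng = np.random.default_rng(0)
ens = rng.normal(size=(3, 20))
H = np.array([[1.0, 0.0, 0.0]])
R = np.array([[0.1]])
analysis = enkf_analysis(ens, np.array([0.5]), H, R, rng)
```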
The Great Observatories Origins Deep Survey (GOODS): Overview and Status
NASA Astrophysics Data System (ADS)
Hook, R. N.; GOODS Team
2002-12-01
GOODS is a very large project to gather deep imaging data and spectroscopic followup of two fields, the Hubble Deep Field North (HDF-N) and the Chandra Deep Field South (CDF-S), with both space and ground-based instruments to create an extensive multiwavelength public data set for community research on the distant Universe. GOODS includes a SIRTF Legacy Program (PI: Mark Dickinson) and a Hubble Treasury Program of ACS imaging (PI: Mauro Giavalisco). The ACS imaging was also optimized for the detection of high-z supernovae which are being followed up by a further target of opportunity Hubble GO Program (PI: Adam Riess). The bulk of the CDF-S ground-based data presently available comes from an ESO Large Programme (PI: Catherine Cesarsky) which includes both deep imaging and multi-object followup spectroscopy. This is currently complemented in the South by additional CTIO imaging. Currently available HDF-N ground-based data forming part of GOODS includes NOAO imaging. Although the SIRTF part of the survey will not begin until later in the year, the ACS imaging is well advanced and there is also a huge body of complementary ground-based imaging and some follow-up spectroscopy which is already publicly available. We summarize the current status of GOODS, give an overview of the data products currently available, and present the timescales for the future. Many early science results from the survey are presented in other GOODS papers at this meeting. Support for the HST GOODS program presented here and in companion abstracts was provided by NASA through grant number GO-9425 from the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Incorporated, under NASA contract NAS5-26555.
Family Registration Card as electronic medical carrier in Bosnia and Herzegovina.
Novo, Ahmed; Masic, Izet; Toromanovic, Selim; Loncarevic, Nedim; Junuzovic, Dzelaludin; Dizdarevic, Jadranka
2004-01-01
Medical documentation is a very important part of medical documentalistics and occupies a large part of the daily work of medical staff in primary health care. In Bosnia and Herzegovina, paper documentation is to be replaced by electronic cards, and a new health care system based on an Electronic Family Registration Card is under development. Developed countries have proceeded from manual and semiautomatic methods of medical data processing to new methods of entering, storing, transferring, searching and protecting data using electronic equipment. Currently, many European countries have developed card-based electronic medical information systems. Three types of electronic card are currently in use: a hybrid card, a smart card and a laser card. The dilemma is which card should be used as the data carrier. The Electronic Family Registration Card is a question of strategic interest for B&H, but also a great investment. We should avoid the errors of other countries that have been developing card-based systems. In this article we present all of the mentioned cards and compare the advantages and disadvantages of the different technologies.
Sample size for post-marketing safety studies based on historical controls.
Wu, Yu-te; Makuch, Robert W
2010-08-01
As part of a drug's entire life cycle, post-marketing studies are an important part of the identification of rare, serious adverse events. Recently, the US Food and Drug Administration (FDA) has begun to implement new post-marketing safety mandates as a consequence of increased emphasis on safety. The purpose of this research is to provide an exact sample size formula for the proposed hybrid design, based on a two-group cohort study with incorporation of historical external data. An exact sample size formula based on the Poisson distribution is developed, because the detection of rare events is our outcome of interest. Performance of the exact method is compared to its approximate large-sample theory counterpart. The proposed hybrid design requires a smaller sample size compared to the standard, two-group prospective study design. In addition, the exact method reduces the number of subjects required in the treatment group by up to 30% compared to the approximate method for the study scenarios examined. The proposed hybrid design thus offers the advantages and rationale of the two-group design with generally smaller sample sizes. 2010 John Wiley & Sons, Ltd.
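The abstract's exact formula for the hybrid design is not reproduced here; as a rough illustration of the kind of exact Poisson calculation involved, the sketch below computes the power of a one-sided exact Poisson test and searches for the smallest sample size reaching a target power, using hypothetical event rates (this is a generic calculation, not the paper's historical-control formula).

```python
from scipy.stats import poisson

def exact_poisson_power(n, rate0, rate1, alpha=0.05):
    """Power of a one-sided exact Poisson test of H0: rate = rate0 against
    rate1 > rate0, with n subjects each contributing one unit of follow-up."""
    mu0, mu1 = n * rate0, n * rate1
    # Smallest event count k such that P(X >= k | mu0) <= alpha.
    k_crit = int(poisson.ppf(1 - alpha, mu0)) + 1
    return poisson.sf(k_crit - 1, mu1)

def required_n(rate0, rate1, power=0.8, alpha=0.05):
    """Smallest sample size reaching the target power (brute-force search)."""
    n = 1
    while exact_poisson_power(n, rate0, rate1, alpha) < power:
        n += 1
    return n

# Hypothetical rates: background 1 event per 1000 person-years vs 3 per 1000.
print(required_n(0.001, 0.003))
```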
Liu, Weina; Sun, Haoran; Xu, Lei
2018-05-05
We present a microwave method for the dielectric characterization of small liquid volumes based on a metamaterial-based sensor. The proposed sensor consists of a micro-strip line and a double split-ring resonator (SRR). A large electric field is observed on the two splits of the double SRR at the resonance frequency (1.9 GHz). The dielectric property data of the samples under test (SUTs) were obtained with two measurements: one with the sensor loaded with the reference liquid (REF) and the other with the sensor loaded with the SUTs. Additionally, the principle of extracting permittivity from the measured changes of the resonance characteristics of the sensor loaded with REF and SUTs is given. Measurements were carried out at 1.9 GHz, and the calculated results for methanol-water mixtures with different molar fractions agree well with the time-domain reflectometry method. Moreover, the proposed sensor is compact and highly sensitive owing to its use of sub-wavelength resonance. In comparison with literature data, relative errors are less than 3% for the real parts and 2% for the imaginary parts of the complex permittivity.
The Stanford equivalence principle program
NASA Technical Reports Server (NTRS)
Worden, Paul W., Jr.; Everitt, C. W. Francis; Bye, M.
1989-01-01
The Stanford Equivalence Principle Program (Worden, Jr. 1983) is intended to test the uniqueness of free fall to the ultimate possible accuracy. The program is being conducted in two phases: first, a ground-based version of the experiment, which should have a sensitivity to differences in rate of fall of one part in 10^12; followed by an orbital experiment with a sensitivity of one part in 10^17 or better. The ground-based experiment, although a sensitive equivalence principle test in its own right, is being used for technology development for the orbital experiment. A secondary goal of the experiment is a search for exotic forces. The instrument is very well suited for this search, which would be conducted mostly with the ground-based apparatus. The short range predicted for these forces means that forces originating in the Earth would not be detectable in orbit. But detection of Yukawa-type exotic forces from a nearby large satellite (such as Space Station) is feasible, and gives a very sensitive and controllable test for little more effort than the orbiting equivalence principle test itself.
Assessing the Gap Between Top-down and Bottom-up Measured Methane Emissions in Indianapolis, IN.
NASA Astrophysics Data System (ADS)
Prasad, K.; Lamb, B. K.; Cambaliza, M. O. L.; Shepson, P. B.; Stirm, B. H.; Salmon, O. E.; Lavoie, T. N.; Lauvaux, T.; Ferrara, T.; Howard, T.; Edburg, S. L.; Whetstone, J. R.
2014-12-01
Releases of methane (CH4) from the natural gas supply chain in the United States account for approximately 30% of total US CH4 emissions. However, there continue to be large questions regarding the accuracy of current emission inventories for methane emissions from natural gas usage. In this paper, we describe results from top-down and bottom-up measurements of methane emissions from the large isolated city of Indianapolis. The top-down results are based on aircraft mass balance and tower-based inverse modeling methods, while the bottom-up results are based on direct component sampling at metering and regulating stations, surface enclosure measurements of surveyed pipeline leaks, and tracer/modeling methods for other urban sources. Mobile mapping of urban methane concentrations was also used to identify significant sources and revealed a city-wide low-level enhancement of methane. The residual difference between top-down and bottom-up measured emissions is large and cannot be fully explained by the uncertainties in the top-down and bottom-up emission measurements and estimates. Thus, the residual appears to be at least partly attributable to a significant, widespread diffusive source. Analyses are included to estimate the size and nature of this diffusive source.
Bait preference by the Argentine ant (Hymenoptera: Formicidae) in Haleakala National Park, Hawaii
Krushelnycky, Paul D.; Reimer, Neil J.
1998-01-01
The Argentine ant, Linepithema humile (Mayr), has proven to be a threat to native arthropod species in Haleakala National Park, Maui, HI, and is also a potential threat to the park's native flora. As it continues to expand its range, an effort has been undertaken to eradicate it, or at the least, control its spread. The 1st part of this effort focused on finding a bait carrier for subsequent toxicant-based control tests. A year-long bait preference test was implemented at each of the ant's 2 infestation sites in Haleakala National Park, in which 6 solid baits and 2 liquid baits were assessed for attractiveness and feasibility for large scale control. At both sites, a toxicant-free formulation of Maxforce, a protein-based granular bait made from ground silkworm, Bombyx mori (L.), pupae, and a 25% sugar water solution were the most attractive baits. Ants took more Maxforce (without toxicant) and sugar water than all other baits, including honey granules and a fish protein bait. Sugar water, however, is difficult to distribute over large natural areas. Maxforce was therefore concluded to be the best bait carrier for toxicant-based control at Haleakala National Park because of its attractiveness and its ease for large scale broadcast dispersal.
Leaders Leading and Learning (Part 1)
ERIC Educational Resources Information Center
Hannay, Lynne M.; Manning, Michael; Earl, Sandra; Blair, Don
2006-01-01
Internationally, large scale reform is big business and yet relatively little is known about the senior administrators who manage and lead local educational reform implementation. In this first of a two-part article, the authors focus on the role of senior administrators in facilitating large-scale reform in one Ontario, Canada school district…
17 CFR Appendix B to Part 420 - Sample Large Position Report
Code of Federal Regulations, 2010 CFR
2010-04-01
..., and as collateral for financial derivatives and other securities transactions $ Total Memorandum 1... 17 Commodity and Securities Exchanges 3 2010-04-01 2010-04-01 false Sample Large Position Report B Appendix B to Part 420 Commodity and Securities Exchanges DEPARTMENT OF THE TREASURY REGULATIONS UNDER...
Till, Alison B.; Dumoulin, Julie A.; Ayuso, Robert A.; Aleinikoff, John N.; Amato, Jeffrey M.; Slack, John F.; Shanks, W.C. Pat
2014-01-01
The Nome Complex is a large metamorphic unit that sits along the southern boundary of the Arctic Alaska–Chukotka terrane, the largest of several microcontinental fragments of uncertain origin located between the Siberian and Laurentian cratons. The Arctic Alaska–Chukotka terrane moved into its present position during the Mesozoic; its Mesozoic and older movements are central to reconstruction of Arctic tectonic history. Accurate representation of the Arctic Alaska–Chukotka terrane in reconstructions of Late Proterozoic and early Paleozoic paleogeography is hampered by the paucity of information available. Most of the Late Proterozoic to Paleozoic rocks in the Arctic Alaska–Chukotka terrane were penetratively deformed and recrystallized during Mesozoic deformational events; primary features and relationships have been obliterated, and age control is sparse. We use a variety of geochemical, geochronologic, paleontologic, and geologic tools to read through the penetrative deformation and reconstruct the protolith sequence of part of the Arctic Alaska–Chukotka terrane, the Nome Complex. We confirm that the protoliths of the Nome Complex were part of the same Late Proterozoic to Devonian continental margin as weakly deformed rocks in the southern and central part of the terrane, the Brooks Range. We show that the protoliths of the Nome Complex represent a carbonate platform (and related rocks) that underwent incipient rifting, probably during the Ordovician, and that the carbonate platform was overrun by an influx of siliciclastic detritus during the Devonian. During early phases of the transition to siliciclastic deposition, restricted basins formed that were the site of sedimentary exhalative base-metal sulfide deposition. Finally, we propose that most of the basement on which the largely Paleozoic sedimentary protolith was deposited was subducted during the Mesozoic.
NASA Astrophysics Data System (ADS)
Brun, Christophe
2017-05-01
This paper is the second part of a study of a katabatic jet along a convexly curved slope with a maximum angle of about 35.5°. Large-Eddy Simulation (LES) is performed with a special focus on the outer-layer shear of the katabatic jet. In the first part, a basic statistical quantitative analysis of the flow was performed. Here, a qualitative and quantitative description of vortical structures is used to gain insight into the present 3-D turbulent flow. It is shown that Görtler vortices oriented in the streamwise downslope direction develop in the shear layer. They spread with a characteristic mushroom shape in the vertical direction up to about 100 m height. They play a major role in local turbulent mixing in the ground surface boundary layer. The present curved-slope configuration constitutes a realistic model for alpine orography. This paper provides a procedure based on local turbulence anisotropy to track Görtler vortices for in situ measurements, which has never been proposed in the literature.
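The anisotropy-based tracking procedure itself is not detailed in the abstract; the sketch below only illustrates the standard ingredient such procedures typically start from, the Reynolds-stress anisotropy tensor and its invariants computed from point-wise velocity fluctuations, here applied to synthetic signals (one common invariant convention is used).

```python
import numpy as np

def anisotropy_invariants(u, v, w):
    """Reynolds-stress anisotropy tensor b_ij = <u_i'u_j'>/(2k) - delta_ij/3
    and two of its invariants (Lumley-triangle coordinates) from time series
    of the three velocity components at one point."""
    fluct = np.vstack([u - u.mean(), v - v.mean(), w - w.mean()])
    R = fluct @ fluct.T / fluct.shape[1]      # Reynolds stress tensor <u_i'u_j'>
    k = 0.5 * np.trace(R)                     # turbulent kinetic energy
    b = R / (2.0 * k) - np.eye(3) / 3.0
    II = np.trace(b @ b)                      # second invariant (one convention)
    III = np.trace(b @ b @ b)                 # third invariant
    return b, II, III

# Toy usage with synthetic, deliberately anisotropic velocity signals.
rng = np.random.default_rng(1)
u = 2.0 * rng.normal(size=10000)
v = 1.0 * rng.normal(size=10000)
w = 0.5 * rng.normal(size=10000)
_, II, III = anisotropy_invariants(u, v, w)
print(II, III)
```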
Comparing DNA damage-processing pathways by computer analysis of chromosome painting data.
Levy, Dan; Vazquez, Mariel; Cornforth, Michael; Loucas, Bradford; Sachs, Rainer K; Arsuaga, Javier
2004-01-01
Chromosome aberrations are large-scale illegitimate rearrangements of the genome. They are indicative of DNA damage and informative about damage-processing pathways. Despite extensive investigations over many years, the mechanisms underlying aberration formation remain controversial. New experimental assays such as multiplex fluorescent in situ hybridization (mFISH) allow combinatorial "painting" of chromosomes and are promising for elucidating aberration formation mechanisms. Recently observed mFISH aberration patterns are so complex that computer and graph-theoretical methods are needed for their full analysis. An important part of the analysis is decomposing a chromosome rearrangement process into "cycles." A cycle of order n, characterized formally by the cyclic graph with 2n vertices, indicates that n chromatin breaks take part in a single irreducible reaction. We here describe algorithms for computing cycle structures from experimentally observed or computer-simulated mFISH aberration patterns. We show that analyzing cycles quantitatively can distinguish between different aberration formation mechanisms. In particular, we show that homology-based mechanisms do not generate the large number of complex aberrations, involving higher-order cycles, observed in irradiated human lymphocytes.
NASA Astrophysics Data System (ADS)
Wallace, Colin; Prather, Edward; Duncan, Douglas
2011-10-01
We recently completed a large-scale, systematic study of general education introductory astronomy students' conceptual and reasoning difficulties related to cosmology. As part of this study, we analyzed a total of 4359 surveys (pre- and post-instruction) containing students' responses to questions about the Big Bang, the evolution and expansion of the universe, using Hubble plots to reason about the age and expansion rate of the universe, and using galaxy rotation curves to infer the presence of dark matter. We also designed, piloted, and validated a new suite of five cosmology Lecture-Tutorials. We found that students who use the new Lecture-Tutorials can achieve larger learning gains than their peers who did not. This material is based in part upon work supported by the National Science Foundation under Grant Nos. 0833364 and 0715517, a CCLI Phase III Grant for the Collaboration of Astronomy Teaching Scholars (CATS). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
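The abstract does not state how "learning gains" were scored; one measure commonly used in astronomy and physics education research is the class-average normalized gain, sketched below with hypothetical pre/post means (the specific numbers are illustrative, not the study's results).

```python
def normalized_gain(pre_mean, post_mean, max_score=100.0):
    """Class-average normalized gain <g> = (post - pre) / (max - pre),
    i.e. the fraction of the possible improvement actually achieved."""
    return (post_mean - pre_mean) / (max_score - pre_mean)

# Hypothetical class averages on a cosmology survey scored out of 100.
print(normalized_gain(35.0, 70.0))   # ~0.54
```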
NASA Astrophysics Data System (ADS)
Wallace, Colin Scott; Prather, E. E.; Duncan, D. K.; Collaboration of Astronomy Teaching Scholars CATS
2012-01-01
We recently completed a large-scale, systematic study of general education introductory astronomy students’ conceptual and reasoning difficulties related to cosmology. As part of this study, we analyzed a total of 4359 surveys (pre- and post-instruction) containing students’ responses to questions about the Big Bang, the evolution and expansion of the universe, using Hubble plots to reason about the age and expansion rate of the universe, and using galaxy rotation curves to infer the presence of dark matter. We also designed, piloted, and validated a new suite of five cosmology Lecture-Tutorials. We found that students who use the new Lecture-Tutorials can achieve larger learning gains than their peers who did not. This material is based in part upon work supported by the National Science Foundation under Grant Nos. 0833364 and 0715517, a CCLI Phase III Grant for the Collaboration of Astronomy Teaching Scholars (CATS). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
The changing pattern of ground-water development on Long Island, New York
Heath, Ralph C.; Foxworthy, B.L.; Cohen, Philip M.
1966-01-01
Ground-water development on Long Island has followed a pattern that has reflected changing population trends, attendant changes in the use and disposal of water, and the response of the hydrologic system to these changes. The historic pattern of development has ranged from individually owned shallow wells tapping glacial deposits to large-capacity public-supply wells tapping deep artesian aquifers. Sewage disposal has ranged from privately owned cesspools to modern large-capacity sewage-treatment plants discharging more than 70 mgd of water to the sea. At present (1965), different parts of Long Island are characterized by different stages of ground-water development. In parts of Suffolk County in eastern Long Island, development is similar to the earliest historical stages. Westward toward New York City, ground-water development becomes more intensive and complex, and the attendant problems become more acute. The alleviation of present problems and those that arise in the future will require management decisions based on the soundest possible knowledge of the hydrologic system, including an understanding of the factors involved in the changing pattern of ground-water development on the island.
Li, Ying; Ji, Xiaoting; Song, Weiling; Guo, Yingshu
2013-04-03
A cross-circular amplification system for sensitive detection of adenosine triphosphate (ATP) in cancer cells was developed based on aptamer-target interaction, magnetic microbead (MB)-assisted strand displacement amplification and target recycling. Here we describe a new recognition probe possessing two parts, the ATP aptamer and an extension part. The recognition probe was first immobilized on the surface of the MBs and hybridized with its complementary sequence to form a duplex. When combined with ATP, the probe changed its conformation, exposing the extension part in single-stranded form, which then served as a toehold for subsequent target recycling. The released complementary sequence of the probe acted as the catalyst of the MB-assisted strand displacement reaction. Combined with target recycling, a large amount of biotin-tagged MB complexes were formed, which generated a chemiluminescence (CL) signal in the presence of luminol and H2O2 when incorporating streptavidin-HRP, reaching a detection limit for ATP as low as 6.1×10^−10 M. Moreover, sample assays of ATP in Ramos Burkitt's lymphoma B cells were performed, which confirmed the reliability and practicality of the protocol. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Wilder, Anna
The purpose of this study was to investigate the effects of a visualization-centered curriculum, Hemoglobin: A Case of Double Identity, on conceptual understanding and representational competence in high school biology. Sixty-nine students enrolled in three sections of freshman biology taught by the same teacher participated in this study. Online Chemscape Chime computer-based molecular visualizations were incorporated into the 10-week curriculum to introduce students to fundamental structure and function relationships. Measures used in this study included a Hemoglobin Structure and Function Test, Mental Imagery Questionnaire, Exam Difficulty Survey, the Student Assessment of Learning Gains, the Group Assessment of Logical Thinking, the Attitude Toward Science in School Assessment, audiotapes of student interviews, students' artifacts, weekly unit activity surveys, informal researcher observations and a teacher's weekly questionnaire. The Hemoglobin Structure and Function Test, consisting of Parts A and B, was administered as a pre and posttest. Part A used exclusively verbal test items to measure conceptual understanding, while Part B used visual-verbal test items to measure conceptual understanding and representational competence. Results of the Hemoglobin Structure and Function pre and posttest revealed statistically significant gains in conceptual understanding and representational competence, suggesting the visualization-centered curriculum implemented in this study was effective in supporting positive learning outcomes. The large positive correlation between posttest results on Part A, comprised of all-verbal test items, and Part B, using visual-verbal test items, suggests this curriculum supported students' mutual development of conceptual understanding and representational competence. Evidence based on student interviews, Student Assessment of Learning Gains ratings and weekly activity surveys indicated positive attitudes toward the use of Chemscape Chime software and the computer-based molecular visualization activities as learning tools. Evidence from these same sources also indicated that students felt computer-based molecular visualization activities in conjunction with other classroom activities supported their learning. Implications for instructional design are discussed.
Population Stratification in the Context of Diverse Epidemiologic Surveys Sans Genome-Wide Data
Oetjens, Matthew T.; Brown-Gentry, Kristin; Goodloe, Robert; Dilks, Holli H.; Crawford, Dana C.
2016-01-01
Population stratification or confounding by genetic ancestry is a potential cause of false associations in genetic association studies. Estimation of and adjustment for genetic ancestry has become common practice thanks in part to the availability of ancestry informative markers on genome-wide association study (GWAS) arrays. While array data are now widespread, these data are not ubiquitous, as several large epidemiologic and clinic-based studies lack genome-wide data. One such large epidemiologic study lacking genome-wide data accessible to investigators is the National Health and Nutrition Examination Surveys (NHANES), population-based cross-sectional surveys of Americans linked to demographic, health, and lifestyle data conducted by the Centers for Disease Control and Prevention. DNA samples (n = 14,998) were extracted from biospecimens from consented NHANES participants between 1991–1994 (NHANES III, phase 2) and 1999–2002 and represent three major self-identified racial/ethnic groups: non-Hispanic whites (n = 6,634), non-Hispanic blacks (n = 3,458), and Mexican Americans (n = 3,950). We as the Epidemiologic Architecture for Genes Linked to Environment study genotyped candidate gene and GWAS-identified index variants in NHANES as part of the larger Population Architecture using Genomics and Epidemiology I study for collaborative genetic association studies. To enable basic quality control such as estimation of genetic ancestry to control for population stratification in NHANES sans genome-wide data, we outline here strategies that use limited genetic data to identify the markers optimal for characterizing genetic ancestry. From among 411 and 295 autosomal SNPs available in NHANES III and NHANES 1999–2002, we demonstrate that markers with ancestry information can be identified to estimate global ancestry. Despite limited resolution, global genetic ancestry is highly correlated with self-identified race for the majority of participants, although less so for ethnicity. Overall, the strategies outlined here for a large epidemiologic study can be applied to other datasets accessible for genotype–phenotype studies but lacking genome-wide data. PMID:27200085
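The marker-selection strategies themselves are not spelled out in the abstract; as a generic illustration of how ancestry components are typically derived from a limited genotype panel, the sketch below runs a principal-components decomposition on a synthetic samples-by-SNPs matrix (the data and dimensions are assumptions, not NHANES data).

```python
import numpy as np

def ancestry_pcs(genotypes, n_components=2):
    """Principal components of a (samples x SNPs) genotype matrix coded as
    0/1/2 minor-allele counts; each row of the result gives per-sample
    ancestry coordinates usable as covariates against stratification."""
    G = np.asarray(genotypes, dtype=float)
    G = G - G.mean(axis=0)                    # center each SNP
    sd = G.std(axis=0)
    G = G / np.where(sd > 0, sd, 1.0)         # scale, guarding monomorphic SNPs
    _, _, vt = np.linalg.svd(G, full_matrices=False)
    return G @ vt[:n_components].T

# Synthetic toy data: 100 samples, 300 SNPs.
rng = np.random.default_rng(2)
geno = rng.integers(0, 3, size=(100, 300))
pcs = ancestry_pcs(geno)
print(pcs.shape)    # (100, 2)
```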
Coupling plant growth and waste recycling systems in a controlled life support system (CELSS)
NASA Technical Reports Server (NTRS)
Garland, Jay L.
1992-01-01
The development of bioregenerative systems as part of the Controlled Ecological Life Support System (CELSS) program depends, in large part, on the ability to recycle inorganic nutrients, contained in waste material, into plant growth systems. One significant waste (resource) stream is inedible plant material. This research compared wheat growth in hydroponic solutions based on inorganic salts (modified Hoagland's) with solutions based on the soluble fraction of inedible wheat biomass (leachate). Recycled nutrients in leachate solutions provided the majority of mineral nutrients for plant growth, although additions of inorganic nutrients to leachate solutions were necessary. Results indicate that plant growth and waste recycling systems can be effectively coupled within CELSS, based on equivalent wheat yield in leachate and Hoagland solutions and the rapid mineralization of waste organic material in the hydroponic systems. Selective enrichment for microbial communities able to mineralize organic material within the leachate was necessary to prevent accumulation of dissolved organic matter in leachate-based solutions. Extensive analysis of microbial abundance, growth, and activity in the hydroponic systems indicated that addition of soluble organic material from plants does not cause excessive microbial growth or 'biofouling', and helped define the microbially-mediated flux of carbon in hydroponic solutions.
Yang, Yingzhen; Jittayasothorn, Yingyos; Chronis, Demosthenis; Wang, Xiaohong; Cousins, Peter; Zhong, Gan-Yuan
2013-01-01
Root-knot nematodes (RKNs) infect many annual and perennial crops and are the most devastating soil-borne pests in vineyards. To develop a biotech-based solution for controlling RKNs in grapes, we evaluated the efficacy of plant-derived RNA interference (RNAi) silencing of a conserved RKN effector gene, 16D10, for nematode resistance in transgenic grape hairy roots. Two hairpin-based silencing constructs, containing a stem sequence of 42 bp (pART27-42) or 271 bp (pART27-271) of the 16D10 gene, were transformed into grape hairy roots and compared for their small interfering RNA (siRNA) production and efficacy in suppressing nematode infection. Transgenic hairy root lines carrying either of the two RNAi constructs showed less susceptibility to nematode infection compared with the control. Small RNA libraries from four pART27-42 and two pART27-271 hairy root lines were sequenced using Illumina sequencing technology. The pART27-42 lines produced a hundred times more 16D10-specific siRNAs than the pART27-271 lines. On average the 16D10 siRNA population had higher GC content than the 16D10 stem sequences in the RNAi constructs, supporting the previous observation that plant dicer-like enzymes prefer GC-rich sequences as substrates for siRNA production. The stems of the 16D10 RNAi constructs were not equally processed into siRNAs. Several hot spots for siRNA production were found at similar positions of the hairpin stems in pART27-42 and pART27-271. Interestingly, stem sequences at the loop terminus produced more siRNAs than those at the stem base. Furthermore, the relative abundance of guide and passenger single-stranded RNAs from putative siRNA duplexes was largely correlated with their 5' end thermodynamic strength. This study demonstrated the feasibility of using a plant-derived RNAi approach for generation of novel nematode resistance in grapes and revealed several interesting molecular characteristics of transgene siRNAs important for optimizing plant RNAi constructs.
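The GC-content comparison above is a simple per-sequence calculation; the sketch below shows it on a couple of made-up 21-nt reads (real input would be the sequenced small-RNA libraries, e.g. FASTQ files parsed with a standard reader).

```python
def gc_content(seq):
    """Fraction of G and C bases in a nucleotide sequence (DNA or RNA)."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

# Hypothetical 21-nt small-RNA reads, for illustration only.
reads = ["AUGGCCGUAGCUAGGCCGUAA", "UUAGCGCAUGCGGCAUCGGCA"]
mean_gc = sum(gc_content(r) for r in reads) / len(reads)
print(round(mean_gc, 3))
```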
Chronis, Demosthenis; Wang, Xiaohong; Cousins, Peter; Zhong, Gan-Yuan
2013-01-01
Root-knot nematodes (RKNs) infect many annual and perennial crops and are the most devastating soil-borne pests in vineyards. To develop a biotech-based solution for controlling RKNs in grapes, we evaluated the efficacy of plant-derived RNA interference (RNAi) silencing of a conserved RKN effector gene, 16D10, for nematode resistance in transgenic grape hairy roots. Two hairpin-based silencing constructs, containing a stem sequence of 42 bp (pART27-42) or 271 bp (pART27-271) of the 16D10 gene, were transformed into grape hairy roots and compared for their small interfering RNA (siRNA) production and efficacy in suppressing nematode infection. Transgenic hairy root lines carrying either of the two RNAi constructs showed less susceptibility to nematode infection compared with the control. Small RNA libraries from four pART27-42 and two pART27-271 hairy root lines were sequenced using Illumina sequencing technology. The pART27-42 lines produced a hundred times more 16D10-specific siRNAs than the pART27-271 lines. On average the 16D10 siRNA population had higher GC content than the 16D10 stem sequences in the RNAi constructs, supporting the previous observation that plant dicer-like enzymes prefer GC-rich sequences as substrates for siRNA production. The stems of the 16D10 RNAi constructs were not equally processed into siRNAs. Several hot spots for siRNA production were found at similar positions of the hairpin stems in pART27-42 and pART27-271. Interestingly, stem sequences at the loop terminus produced more siRNAs than those at the stem base. Furthermore, the relative abundance of guide and passenger single-stranded RNAs from putative siRNA duplexes was largely correlated with their 5′ end thermodynamic strength. This study demonstrated the feasibility of using a plant-derived RNAi approach for generation of novel nematode resistance in grapes and revealed several interesting molecular characteristics of transgene siRNAs important for optimizing plant RNAi constructs. PMID:23874962
Mechanisms for the elevation structure of a giant telescope
NASA Astrophysics Data System (ADS)
Hu, Shouwei; Song, Xiaoli; Zhang, Hui
2018-06-01
This paper describes an innovative mechanism based on hydrostatic pads and linear motors for the elevation structure of next-generation extremely large telescopes. Both hydrostatic pads and linear motors are integrated on the frame that includes a kinematical joint, such that the upper part is properly positioned with respect to the elevation runner tracks, while the lower part is connected to the azimuth structure. Potential deflections of the elevation runner bearings at the radial pad locations are absorbed by this flexible kinematic connection and not transmitted to the linear motors and hydrostatic pads. Extensive simulations using finite-element analysis are carried out to verify that the auxiliary whiffletree hydraulic design of the mechanism is sufficient to satisfy the assigned optical length variation errors.
Ormand, W. E.; Brown, B. A.; Hjorth-Jensen, M.
2017-08-01
We present calculations for the c coefficients of the isobaric mass multiplet equation for nuclei from A = 42 to A = 54 based on input from three realistic nucleon-nucleon interactions. We demonstrate that there is a clear dependence on the short-range charge-symmetry-breaking (CSB) part of the strong interaction and that there is significant disagreement in the CSB part between the commonly used CD-Bonn, chiral effective field theory at next-to-next-to-next-to-leading-order, and Argonne V18 nucleon-nucleon interactions. In addition, we show that all three interactions give a CSB contribution to the c coefficient that is too large when compared to experiment.
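For reference, the c coefficients referred to above are those of the quadratic isobaric multiplet mass equation (IMME), which expresses the mass excess of the members of an isobaric multiplet as a function of the isospin projection T_z: \( M(\alpha, T, T_z) = a(\alpha,T) + b(\alpha,T)\,T_z + c(\alpha,T)\,T_z^{2} \), where α denotes the remaining quantum numbers of the multiplet.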
Analysis on the restriction factors of the green building scale promotion based on DEMATEL
NASA Astrophysics Data System (ADS)
Wenxia, Hong; Zhenyao, Jiang; Zhao, Yang
2017-03-01
In order to promote the large-scale development of green building in our country, the DEMATEL method was used to classify the factors influencing green building development into three parts: the green building market, green technology and the macro economy. Through the DEMATEL model, the interaction mechanism of each part was analyzed, the mutual influence degrees of the barrier factors affecting green building promotion were quantitatively analyzed, and the key factors for the development of green building in China were finally determined. In addition, some implementation strategies for promoting large-scale green building development in our country are put forward. This research provides important reference value and practical value for making policies to promote green building.
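The DEMATEL computation referred to above follows a standard recipe (normalize the direct-influence matrix, form the total-relation matrix, then read off prominence and relation); the sketch below applies it to a hypothetical 3x3 direct-influence matrix for the three factor groups named, since the paper's actual matrix is not given here.

```python
import numpy as np

def dematel(direct):
    """Basic DEMATEL computation: normalize the direct-influence matrix,
    form the total-relation matrix T = X (I - X)^-1, and return the
    prominence (D + R) and relation (D - R) of each factor."""
    A = np.asarray(direct, dtype=float)
    X = A / max(A.sum(axis=1).max(), A.sum(axis=0).max())   # normalization
    T = X @ np.linalg.inv(np.eye(len(A)) - X)                # total relation
    D, R = T.sum(axis=1), T.sum(axis=0)                      # row / column sums
    return T, D + R, D - R

# Hypothetical 0-4 scale direct influences among the three factor groups
# named in the abstract: [market, technology, macro economy].
direct = [[0, 3, 2],
          [2, 0, 1],
          [3, 2, 0]]
T, prominence, relation = dematel(direct)
print(prominence, relation)
```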
Semantic Technologies for Re-Use of Clinical Routine Data.
Kreuzthaler, Markus; Martínez-Costa, Catalina; Kaiser, Peter; Schulz, Stefan
2017-01-01
Routine patient data in electronic patient records are only partly structured, and an even smaller segment is coded, mainly for administrative purposes. Large parts are only available as free text. Transforming this content into a structured and semantically explicit form is a prerequisite for querying and information extraction. The core of the system architecture presented in this paper is based on SAP HANA in-memory database technology using the SAP Connected Health platform for data integration as well as for clinical data warehousing. A natural language processing pipeline analyses unstructured content and maps it to a standardized vocabulary within a well-defined information model. The resulting semantically standardized patient profiles are used for a broad range of clinical and research application scenarios.
Large-scale flow experiments for managing river systems
Konrad, Christopher P.; Olden, Julian D.; Lytle, David A.; Melis, Theodore S.; Schmidt, John C.; Bray, Erin N.; Freeman, Mary C.; Gido, Keith B.; Hemphill, Nina P.; Kennard, Mark J.; McMullen, Laura E.; Mims, Meryl C.; Pyron, Mark; Robinson, Christopher T.; Williams, John G.
2011-01-01
Experimental manipulations of streamflow have been used globally in recent decades to mitigate the impacts of dam operations on river systems. Rivers are challenging subjects for experimentation, because they are open systems that cannot be isolated from their social context. We identify principles to address the challenges of conducting effective large-scale flow experiments. Flow experiments have both scientific and social value when they help to resolve specific questions about the ecological action of flow with a clear nexus to water policies and decisions. Water managers must integrate new information into operating policies for large-scale experiments to be effective. Modeling and monitoring can be integrated with experiments to analyze long-term ecological responses. Experimental design should include spatially extensive observations and well-defined, repeated treatments. Large-scale flow manipulations are only a part of dam operations that affect river systems. Scientists can ensure that experimental manipulations continue to be a valuable approach for the scientifically based management of river systems.
Vaklavas, Christos
2012-01-01
Antibody-based immunotherapy has become an integral part of cancer therapeutics. However, monoclonal antibodies have their limitations, as identifying an antigen selectively expressed on malignant cells and developing a high-affinity antibody may not by itself alter tumor growth. This is illustrated in the case of CD30; CD30 epitomizes many properties of an ideal pharmacologic target, such as high expression on malignant cells and limited expression on normal tissues. However, until the advent of brentuximab vedotin, CD30 remained an elusive target, as antibody-based anti-CD30 immunotherapy had been largely clinically unsuccessful. Brentuximab vedotin (cAC10-vcMMAE, SGN-35) is an antibody-drug conjugate consisting of a chimeric anti-CD30 monoclonal antibody to which the potent microtubule inhibitor monomethyl auristatin E (MMAE) is attached via a valine–citrulline linker. Once bound to CD30, brentuximab vedotin is internalized and MMAE is released through the action of lysosomal enzymes on the linker. In phase I studies in relapsed or refractory Hodgkin lymphoma and anaplastic large cell lymphoma, brentuximab vedotin induced unprecedented responses with manageable toxicity. In phase II studies, brentuximab vedotin induced overall response rates of 75% and 86% in relapsed or refractory Hodgkin lymphoma and anaplastic large cell lymphoma, respectively. The results of these trials led to the accelerated approval of the drug by the US Food and Drug Administration in a patient population with few other alternative options. Brentuximab vedotin has an overall manageable toxicity profile; however, cumulative peripheral neuropathy constitutes an important clinical consideration, as it may limit prolonged administration of the drug. The mechanism by which brentuximab vedotin exerts its antitumor activity is not entirely clear. Diffusion of MMAE in the tumor microenvironment and cytotoxicity on bystander cells may in part explain its activity, especially in Hodgkin lymphoma. Herein, we review the biology of CD30 and brentuximab vedotin, and the clinical data that have accumulated thus far with SGN-35. PMID:23606932
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ouellette, Brittany Joy
Additive Manufacturing (AM) technology has been around for decades, but until recently, machines have been expensive, relatively large, and not available to most institutions. Technological advances in 3D printing and growing awareness throughout industry, universities, and even among hobbyists have increased demand to substitute AM parts for traditionally manufactured (subtractive) designs; however, there is large variability in part quality and mechanical behavior due to the inherent printing process, which must be understood before AM parts are used for load-bearing and structural design.
Hotspot Patterns: The Formal Definition and Automatic Detection of Architecture Smells
2015-01-15
…a serious question for a project manager or architect: how to determine which parts of the code base should be given higher priority for maintenance and… services framework; Hadoop is a tool for distributed processing of large data sets; HBase is the Hadoop database; Ivy is a dependency management tool… To answer this question more rigorously, we conducted a Pearson correlation analysis to test the dependency between the number of issues a file involves…
NASA Astrophysics Data System (ADS)
Nazarian, Robert H.; Legg, Sonya
2017-10-01
When internal waves interact with topography, such as continental slopes, they can transfer wave energy to local dissipation and diapycnal mixing. Submarine canyons comprise approximately ten percent of global continental slopes, and can enhance the local dissipation of internal wave energy, yet parameterizations of canyon mixing processes are currently missing from large-scale ocean models. As a first step in the development of such parameterizations, we conduct a parameter space study of M2 tidal-frequency, low-mode internal waves interacting with idealized V-shaped canyon topographies. Specifically, we examine the effects of varying the canyon mouth width, shape and slope of the thalweg (line of lowest elevation). This effort is divided into two parts. In the first part, presented here, we extend the theory of 3-dimensional internal wave reflection to a rotated coordinate system aligned with our idealized V-shaped canyons. Based on the updated linear internal wave reflection solution that we derive, we construct a ray tracing algorithm which traces a large number of rays (the discrete analog of a continuous wave) into the canyon region where they can scatter off topography. Although a ray tracing approach has been employed in other studies, we have, for the first time, used ray tracing to calculate changes in wavenumber and ray density which, in turn, can be used to calculate the Froude number (a measure of the likelihood of instability). We show that for canyons of intermediate aspect ratio, large spatial envelopes of instability can form in the presence of supercritical sidewalls. Additionally, the canyon height and length can modulate the Froude number. The second part of this study, a diagnosis of internal wave scattering in continental slope canyons using both numerical simulations and this ray tracing algorithm, as well as a test of robustness of the ray tracing, is presented in the companion article.
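As a minimal illustration of the linear theory underlying the ray tracing described above, the sketch below computes the characteristic slope of an M2 internal-wave ray from the dispersion relation and compares it with a topographic slope to classify a sidewall as sub- or supercritical. The parameter values are generic mid-latitude assumptions, not the study's configuration.

```python
import numpy as np

def ray_slope(omega, f, N):
    """Characteristic slope of a linear internal-wave ray: tan(theta) = sqrt((w^2 - f^2)/(N^2 - w^2))."""
    return np.sqrt((omega**2 - f**2) / (N**2 - omega**2))

# Illustrative mid-latitude values (not taken from the paper).
omega = 1.41e-4        # M2 tidal frequency [rad/s]
f     = 1.0e-4         # Coriolis frequency [rad/s]
N     = 2.0e-3         # buoyancy frequency [rad/s]

s_wave = ray_slope(omega, f, N)

# A sidewall is "supercritical" when its topographic slope exceeds the wave slope,
# which favours back-reflection and, in the study above, larger envelopes of instability.
s_topo = 0.15
print(f"wave slope = {s_wave:.3f}, sidewall is",
      "supercritical" if s_topo > s_wave else "subcritical")
```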
NASA Astrophysics Data System (ADS)
den Hollander, Richard J. M.; Bouma, Henri; van Rest, Jeroen H. C.; ten Hove, Johan-Martijn; ter Haar, Frank B.; Burghouts, Gertjan J.
2017-10-01
Video analytics is essential for managing large quantities of raw data that are produced by video surveillance systems (VSS) for the prevention, repression and investigation of crime and terrorism. Analytics is highly sensitive to changes in the scene and to changes in the optical chain, so a VSS with analytics needs careful configuration and prompt maintenance to avoid false alarms. However, there is a trend from static VSS consisting of fixed CCTV cameras towards more dynamic VSS deployments over public/private multi-organization networks, consisting of a wider variety of visual sensors, including pan-tilt-zoom (PTZ) cameras, body-worn cameras and cameras on moving platforms. This trend will lead to more dynamic scenes and more frequent changes in the optical chain, creating structural problems for analytics. If these problems are not adequately addressed, analytics will not be able to continue to meet end users' developing needs. In this paper, we present a three-part solution for managing the performance of complex analytics deployments. The first part is a register containing meta data describing relevant properties of the optical chain, such as intrinsic and extrinsic calibration, and parameters of the scene such as lighting conditions or measures for scene complexity (e.g. number of people). A second part frequently assesses these parameters in the deployed VSS, stores changes in the register, and signals relevant changes in the setup to the VSS administrator. A third part uses the information in the register to dynamically configure analytics tasks based on VSS operator input. In order to support the feasibility of this solution, we give an overview of related state-of-the-art technologies for autocalibration (self-calibration), scene recognition and lighting estimation in relation to person detection. The presented solution allows for rapid and robust deployment of Video Content Analysis (VCA) tasks in large scale ad-hoc networks.
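A minimal sketch of what one entry of such a register might look like, assuming hypothetical field names (the paper does not specify a schema); the drift check stands in for the second part's assessment component.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class CameraRegisterEntry:
    """Hypothetical register record for one camera in the optical chain."""
    camera_id: str
    intrinsics: dict                 # e.g. focal length, principal point, distortion
    extrinsics: dict                 # pose of the camera in a common world frame
    lighting_lux: float              # last estimated scene illumination
    scene_complexity: float          # e.g. average number of people in view
    last_assessed: datetime = field(default_factory=datetime.now)

def needs_reconfiguration(old: CameraRegisterEntry, new: CameraRegisterEntry,
                          lux_tol: float = 50.0) -> bool:
    """Signal the administrator when a relevant optical-chain or scene parameter drifts."""
    return (old.intrinsics != new.intrinsics
            or old.extrinsics != new.extrinsics
            or abs(old.lighting_lux - new.lighting_lux) > lux_tol)
```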
Robinson, Philip G; Newman, David; Reitz, Cara L; Vaynberg, Lena Z; Bahga, Dalbir K; Levitt, Morton H
2018-06-01
The purpose of this study was to see whether a large drawing of a nephron helped medical students in self-directed learning groups learn renal physiology, histology, and pharmacology before discussing clinical cases. The end points were the grades on the renal examination and a student survey. The classes in the fall of 2014 and 2015 used the drawing, but not those of 2012 and 2013. The Charles E. Schmidt College of Medicine at Florida Atlantic University is a newly formed Florida medical school, which enrolled its first class in the fall of 2011. The school relies on self-directed problem-based learning in year 1 and changes over to a case inquiry method in the latter part of year 1 and throughout year 2. At the start of the renal course, each student group received a poster of a nephron with the objective of learning the cell functions of the different nephron parts. During the first year of using the drawing, there was no improvement in grades. After a student-suggested adjustment to the drawing, there was a statistically significant difference in the total test score in the second year (P < 0.001). An unexpected finding was lower grades in all 4 yr in the area of acid-base balance and electrolytes compared with the other four areas tested. In the survey, the students found the drawing useful.
Fuessinger, Marc Anton; Schwarz, Steffen; Cornelius, Carl-Peter; Metzger, Marc Christian; Ellis, Edward; Probst, Florian; Semper-Hogg, Wiebke; Gass, Mathieu; Schlager, Stefan
2018-04-01
Virtual reconstruction of large cranial defects is still a challenging task. The current reconstruction procedures depend on the surgeon's experience and skills in planning the reconstruction based on mirroring and manual adaptation. The aim of this study is to propose and evaluate a computer-based approach employing a statistical shape model (SSM) of the cranial vault. An SSM was created based on 131 CT scans of pathologically unaffected adult crania. After segmentation, the resulting surface mesh of one patient was established as template and subsequently registered to the entire sample. Using the registered surface meshes, an SSM was generated capturing the shape variability of the cranial vault. The knowledge about this shape variation in healthy patients was used to estimate the missing parts. The accuracy of the reconstruction was evaluated by using 31 CT scans not included in the SSM. Both unilateral and bilateral bony defects were created on each skull. The reconstruction was performed using the current gold standard of mirroring the intact to the affected side, and the result was compared to the outcome of our proposed SSM-driven method. The accuracy of the reconstruction was determined by calculating the distances to the corresponding parts on the intact skull. While unilateral defects could be reconstructed with both methods, the reconstruction of bilateral defects was, for obvious reasons, only possible employing the SSM-based method. Comparing all groups, the analysis shows a significantly higher precision of the SSM group, with a mean error of 0.47 mm compared to the mirroring group which exhibited a mean error of 1.13 mm. Reconstructions of bilateral defects yielded only slightly higher estimation errors than those of unilateral defects. The presented computer-based approach using SSM is a precise and simple tool in the field of computer-assisted surgery. It helps to reconstruct large-size defects of the skull considering the natural asymmetry of the cranium and is not limited to unilateral defects.
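A minimal numpy sketch of the general SSM idea described above: principal components are extracted from registered, flattened vertex coordinates, and the shape coefficients are then estimated from the intact vertices only in order to predict the defect region. The array layout and the plain least-squares fit are simplifying assumptions, not the authors' implementation.

```python
import numpy as np

# X: (n_subjects, 3 * n_vertices) registered cranial meshes, flattened per subject.
def build_ssm(X, n_modes=20):
    mean = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
    modes = Vt[:n_modes]                        # principal shape modes
    return mean, modes

def reconstruct(defective, observed_idx, mean, modes):
    """Estimate shape coefficients from intact vertices only, then predict all vertices."""
    A = modes[:, observed_idx].T                # (n_observed, n_modes)
    b = defective[observed_idx] - mean[observed_idx]
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return mean + coeffs @ modes                # full, completed shape
```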
NASA Astrophysics Data System (ADS)
Luo, Keqin
1999-11-01
The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which cost the industry tremendously in waste treatment and disposal and hinder the further development of the industry. It has therefore become an urgent need for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while production competitiveness is maintained. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis on bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time for the electroplating industry to (i) systematically use available WM strategies, (ii) know quantitatively and accurately what is going on in each tank, and (iii) identify all WM opportunities through process improvement. This work has formed a solid foundation for the further development of powerful WM technologies for comprehensive WM in the following decade.
NASA Astrophysics Data System (ADS)
Saeedimoghaddam, M.; Kim, C.
2017-10-01
Understanding individual travel behavior is vital in travel demand management as well as in urban and transportation planning. New data sources, including mobile phone data and location-based social media (LBSM) data, allow us to understand mobility behavior at an unprecedented level of detail. Recent studies of trip purpose prediction tend to use machine learning (ML) methods, since they generally produce high levels of predictive accuracy. Few studies have used LBSM as a large data source to extend its potential in predicting individual travel destinations using ML techniques. In the presented research, we created a spatio-temporal probabilistic model based on an ensemble ML framework named "Random Forests", utilizing the travels extracted from geotagged Tweets in 419 census tracts of the Greater Cincinnati area, for predicting the tract ID of an individual's travel destination at any time using the information of its origin. We evaluated the model accuracy using the travels extracted from the Tweets themselves as well as the travels from a household travel survey. The Tweets- and survey-based travels that start from the same tract in the southwestern parts of the study area are more likely to select the same destination compared to the other parts. Also, both Tweets- and survey-based travels were affected by the attraction points in downtown Cincinnati and the tracts in the northeastern part of the area. Finally, both evaluations show that the model predictions are acceptable, but the model cannot predict destinations using inputs from other data sources as precisely as with the Tweets-based data.
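A minimal scikit-learn sketch of the kind of Random Forest destination classifier described above; the feature set and the synthetic trip table are illustrative assumptions, not the study's data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical trip table: origin tract ID, hour of day, day of week -> destination tract ID.
rng = np.random.default_rng(0)
X = np.column_stack([rng.integers(0, 419, 5000),    # origin tract
                     rng.integers(0, 24, 5000),     # hour of day
                     rng.integers(0, 7, 5000)])     # day of week
y = rng.integers(0, 419, 5000)                      # destination tract (random placeholder)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```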
Water Use by Urban Landscapes in Semi-Arid Environments
NASA Astrophysics Data System (ADS)
Litvak, E.; Pataki, D. E.
2017-12-01
Water use by urban trees and lawns constitutes a significant yet uncertain portion of urban water budgets. Reducing this uncertainty is essential for developing effective water conservation strategies that are critically needed in dry regions. Landscape water use is particularly difficult to estimate in semi-arid cities with diverse plant compositions and large proportions of non-native species sustained by irrigation. We developed an empirical model of urban evapotranspiration based on in situ measurements of 11 lawns and 108 trees that we previously collected in the greater Los Angeles area. The model in its current state considers urban landscapes as two-component systems comprised of lawns and trees, which have contrasting patterns of water use. Turfgrass lawns consume large amounts of irrigation water (up to 10 mm/d) that may be effectively reduced by the shade from trees. Trees consume much smaller amounts of water at common urban planting densities (0.1-2.6 mm/d), and provide shade over lawns. We estimated water use by irrigated landscapes in Los Angeles by combining this model with remotely sensed estimates of vegetation cover and ground-based vegetation surveys and weather data. According to our estimates, water use by Los Angeles landscapes was close to potential evapotranspiration (approximately 1,100 mm/yr), with turfgrass responsible for 64-84% of total water use. Landscape water use increased linearly with median household income across Los Angeles, where wealthier parts of the city were consistently more vegetated than less affluent parts. Our results indicate extremely high water use by urban landscapes in semi-arid environments, largely owing to high spatial coverage of excessively irrigated lawns. These results have important implications for constraining municipal water budgets and developing water-saving landscaping practices.
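A minimal sketch of the two-component (lawn plus tree) evapotranspiration bookkeeping described above; the functional form of the shade reduction and all coefficient values are illustrative assumptions, not the authors' fitted model.

```python
def landscape_et(et_lawn, et_tree, frac_lawn, frac_tree, shade_frac, shade_reduction=0.5):
    """
    Daily landscape ET [mm/d] from a lawn + tree mixture.
    et_lawn, et_tree : component ET rates [mm/d]
    frac_lawn, frac_tree : cover fractions of the landscaped area
    shade_frac : fraction of lawn shaded by tree canopy
    shade_reduction : fractional reduction of lawn ET under shade (assumed value)
    """
    lawn_term = et_lawn * frac_lawn * (1.0 - shade_reduction * shade_frac)
    tree_term = et_tree * frac_tree
    return lawn_term + tree_term

# Illustrative values within the ranges quoted above (up to 10 mm/d lawns, 0.1-2.6 mm/d trees).
print(landscape_et(et_lawn=8.0, et_tree=1.5, frac_lawn=0.6, frac_tree=0.3, shade_frac=0.4))
```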
NASA Astrophysics Data System (ADS)
Hikosaka, Tomoyuki; Miyamoto, Masahiro; Yamada, Mamoru; Morita, Tadashi
1993-05-01
It is very important to obtain saturated magnetic properties from reverse saturation (the full B-H curve) of ferromagnetic cores in order to design the magnetic switches used in high-power pulse generators. The magnetic switch is excited in the high-frequency range (˜MHz). However, it is extremely difficult to measure the full B-H curve of large toroidal cores, whose diameters are several hundreds of mm, using the conventional ac excitation method at high frequency; the main reason is the poor output capability of the power source used for core excitation. Therefore we have developed a pulse excitation method to obtain high-frequency magnetic properties. The measurement circuit has two sections: an excitation part composed of a charge transfer circuit, and a reset part for adjusting the initial point on the direct B-H curve. The sample core is excited by a sinusoidal voltage pulse expressed as 1-cos(2π ft). The excitation frequency f is determined by the constants of the elements of the charge transfer circuit. The change of magnetic flux density ΔB and the magnetic field H are calculated, respectively, by measuring the induced voltage of the search coil and the magnetizing current. ΔB-H characteristics from reverse saturation of four different kinds of large cores were measured in the frequency range from 50 kHz to 1 MHz. Core loss increases in proportion to the Nth power of the frequency, where the index N depends on the core. N is about 0.5 in the case of wound ribbon cores, such as Fe-based amorphous, Co-based amorphous, and Finemet, but about 0.2 in the case of the Ni-Zn ferrite.
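A minimal sketch of the data reduction described above: ΔB is obtained by integrating the search-coil voltage and dividing by the turns-area product, and H is obtained from the magnetizing current and the mean magnetic path length. The waveforms and the coil/core geometry are invented for illustration.

```python
import numpy as np

def delta_b(v_search, t, n_search, area):
    """Delta B(t) [T] from the search-coil voltage: (1/(N*A)) * integral of v dt."""
    flux = np.concatenate(([0.0], np.cumsum(0.5 * (v_search[1:] + v_search[:-1]) * np.diff(t))))
    return flux / (n_search * area)

def field_h(i_mag, n_prim, path_len):
    """H(t) [A/m] from the magnetizing current: N * i / (mean magnetic path length)."""
    return n_prim * i_mag / path_len

# Example with a synthetic 1-cos(2*pi*f*t) style pulse, as used for the excitation above.
t = np.linspace(0, 10e-6, 1000)                     # 10 us record
v = 5.0 * (1 - np.cos(2 * np.pi * 1e5 * t))         # hypothetical search-coil voltage [V]
i = 2.0 * np.sin(2 * np.pi * 1e5 * t)               # hypothetical magnetizing current [A]
dB = delta_b(v, t, n_search=10, area=1e-4)          # 10 turns, 1 cm^2 core cross-section
H  = field_h(i, n_prim=5, path_len=0.5)             # 5 turns, 0.5 m mean path length
```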
Large format 15μm pitch XBn detector
NASA Astrophysics Data System (ADS)
Karni, Yoram; Avnon, Eran; Ben Ezra, Michael; Berkowitz, Eyal; Cohen, Omer; Cohen, Yossef; Dobromislin, Roman; Hirsh, Itay; Klin, Olga; Klipstein, Philip; Lukomsky, Inna; Nitzani, Michal; Pivnik, Igor; Rozenberg, Omer; Shtrichman, Itay; Singer, Michael; Sulimani, Shay; Tuito, Avi; Weiss, Eliezer
2014-06-01
Over the past few years, a new type of High Operating Temperature (HOT) photon detector has been developed at SCD, which operates in the blue part of the MWIR atmospheric window (3.4 - 4.2 μm). This window is generally more transparent than the red part of the MWIR window (4.4 - 4.9 μm), and thus is especially useful for mid and long range applications. The detector has an InAsSb active layer and is based on the new "XBn" device concept, which eliminates Generation-Recombination dark current and enables operation at temperatures of 150K or higher, while maintaining excellent image quality. Such high operating temperatures reduce the cooling requirements of Focal Plane Array (FPA) detectors dramatically, and allow the use of a smaller closed-cycle Stirling cooler. As a result, the complete Integrated Detector Cooler Assembly (IDCA) has about 60% lower power consumption and a much longer lifetime compared with IDCAs based on standard InSb detectors and coolers operating at 77K. In this work we present a new large format IDCA designed for 150K operation. The 15 μm pitch 1280×1024 FPA is based on SCD's XBn technology and digital Hercules ROIC. The FPA is housed in a robust Dewar and is integrated with Ricor's K508N Stirling cryo-cooler. The IDCA has a weight of ~750 gram and its power consumption is ~ 5.5 W at a frame rate of 100Hz. The Mean Time to Failure (MTTF) of the IDCA is more than 20,000 hours, greatly facilitating 24/7 operation.
Hirai, Yuko; Kodama, Yoshiaki; Cullings, Harry M; Miyazawa, Chuzo; Nakamura, Nori
2011-01-01
The atomic bombs in Hiroshima and Nagasaki led to two different types of radiation exposure; one was direct and brief and the other was indirect and persistent. The latter (so-called exposure to residual radiation) resulted from the presence of neutron activation products in the soil, or from fission products present in the fallout. Compared with the doses from direct exposures, estimations of individual doses from residual radiation have been much more complicated, and estimates vary widely among researchers. The present report bases its conclusions on radiation doses recorded in tooth enamel from survivors in Hiroshima. Those survivors were present at distances of about 3 km or greater from the hypocenter at the time of the explosion, and have DS02 estimated doses (direct exposure doses) of less than 5 mGy (and are regarded as control subjects). Individual doses were estimated by measuring CO(2)(-) radicals in tooth enamel with the electron spin resonance (ESR; or electron paramagnetic resonance, EPR) method. The results from 56 molars donated by 49 survivors provided estimated doses which vary from -200 mGy to 500 mGy, and the median dose was 17 mGy (25% and 75% quartiles are -54 mGy and 137 mGy, respectively) for the buccal parts and 13 mGy (25% and 75% quartiles: -49 mGy and 87 mGy, respectively) for the lingual parts of the molars. Three molars had ESR-estimated doses of 300 to 400 mGy for both the buccal and lingual parts, which indicates possible exposures to excess doses of penetrating radiation, although the origin of such radiation remains to be determined. The results did not support claims that a large fraction of distally-exposed survivors received large doses (e.g. 1 Gy) of external penetrating radiation resulting from residual radiation.
Fernández, O; Delvecchio, M; Edan, G; Fredrikson, S; Giovannoni, G; Hartung, H-P; Havrdova, E; Kappos, L; Pozzilli, C; Soerensen, P S; Tackenberg, B; Vermersch, P; Comi, G
2018-05-01
The European Charcot Foundation supported the development of a set of surveys to understand current practice patterns for the diagnosis and management of multiple sclerosis (MS) in Europe. Part 2 of the report summarizes survey results related to secondary progressive MS (SPMS), primary progressive MS (PPMS), pregnancy, paediatric MS and overall patient management. A steering committee of MS neurologists developed case- and practice-based questions for two sequential surveys distributed to MS neurologists throughout Europe. Respondents generally favoured changing rather than stopping disease-modifying treatment (DMT) in patients transitioning from relapsing-remitting MS to SPMS, particularly with active disease. Respondents would not initiate DMT in patients with typical PPMS symptoms, although the presence of ≥1 spinal cord or brain gadolinium-enhancing lesion might affect that decision. For patients considering pregnancy, respondents were equally divided on whether to stop treatment before or after conception. Respondents strongly favoured starting DMT in paediatric MS with active disease; recommended treatments included interferon, glatiramer acetate and, in John Cunningham virus negative patients, natalizumab. Additional results regarding practice-based questions and management are summarized. Results of part 2 of the survey of diagnostic and treatment practices for MS in Europe largely mirror results for part 1, with neurologists in general agreement about the treatment and management of SPMS, PPMS, pregnancy and paediatric MS as well as the general management of MS. However, there are also many areas of disagreement, indicating the need for evidence-based recommendations and/or guidelines. © 2018 EAN.
The HI Content of Galaxies as a Function of Local Density and Large-Scale Environment
NASA Astrophysics Data System (ADS)
Thoreen, Henry; Cantwell, Kelly; Maloney, Erin; Cane, Thomas; Brough Morris, Theodore; Flory, Oscar; Raskin, Mark; Crone-Odekon, Mary; ALFALFA Team
2017-01-01
We examine the HI content of galaxies as a function of environment, based on a catalogue of 41527 galaxies that are part of the 70% complete Arecibo Legacy Fast-ALFA (ALFALFA) survey. We use nearest-neighbor methods to characterize local environment, and a modified version of the algorithm developed for the Galaxy and Mass Assembly (GAMA) survey to classify large-scale environment as group, filament, tendril, or void. We compare the HI content in these environments using statistics that include both HI detections and the upper limits on detections from ALFALFA. The large size of the sample allows us to statistically compare the HI content in different environments for early-type galaxies as well as late-type galaxies. This work is supported by NSF grants AST-1211005 and AST-1637339, the Skidmore Faculty-Student Summer Research program, and the Schupf Scholars program.
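A minimal sketch of a nearest-neighbor local-density estimate of the kind used to characterize local environment above; the projected positions and the choice k = 5 are illustrative assumptions.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Hypothetical projected galaxy positions [Mpc] for a volume-limited neighbour sample.
rng = np.random.default_rng(5)
positions = rng.uniform(0, 100, size=(2000, 2))

k = 5
nn = NearestNeighbors(n_neighbors=k + 1).fit(positions)   # +1 because a point is its own neighbour
dist, _ = nn.kneighbors(positions)
d_k = dist[:, k]                                           # distance to the k-th neighbour

# Projected local density estimate: Sigma_k = k / (pi * d_k^2) [galaxies per Mpc^2]
sigma_k = k / (np.pi * d_k**2)
print(f"median Sigma_5 = {np.median(sigma_k):.3f} per Mpc^2")
```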
Object Recognition and Localization: The Role of Tactile Sensors
Aggarwal, Achint; Kirchner, Frank
2014-01-01
Tactile sensors, because of their intrinsic insensitivity to lighting conditions and water turbidity, provide promising opportunities for augmenting the capabilities of vision sensors in applications involving object recognition and localization. This paper presents two approaches for haptic object recognition and localization for ground and underwater environments. The first approach called Batch Ransac and Iterative Closest Point augmented Particle Filter (BRICPPF) is based on an innovative combination of particle filters, Iterative-Closest-Point algorithm, and a feature-based Random Sampling and Consensus (RANSAC) algorithm for database matching. It can handle a large database of 3D-objects of complex shapes and performs a complete six-degree-of-freedom localization of static objects. The algorithms are validated by experimentation in ground and underwater environments using real hardware. To our knowledge this is the first instance of haptic object recognition and localization in underwater environments. The second approach is biologically inspired, and provides a close integration between exploration and recognition. An edge following exploration strategy is developed that receives feedback from the current state of recognition. A recognition by parts approach is developed which uses the BRICPPF for object sub-part recognition. Object exploration is either directed to explore a part until it is successfully recognized, or is directed towards new parts to endorse the current recognition belief. This approach is validated by simulation experiments. PMID:24553087
X-ray micro-CT measurement of large parts at very low temperature
NASA Astrophysics Data System (ADS)
Koutecký, T.; Zikmund, T.; Glittová, D.; Paloušek, D.; Živčák, J.; Kaiser, J.
2017-03-01
At present, the automotive industry, along with other industries, has increasing demands on accuracy of produced parts and assemblies. Besides the regular dimensional and geometrical inspection, in some cases, also a verification at very low temperatures is required. X-ray computed tomography (CT), as a tool for non-destructive testing, is able to examine samples and then determine dimensions for strictly stable temperature conditions necessary for the stability of the CT system. Until now, no system that allows scanning of samples larger than a few millimeters at temperatures much below 0 °C has been presented. This paper presents a cooling system for CT imaging of parts with length up to 300 mm at the extreme temperature conditions of -40 °C, which are based on automotive industry requests. It describes the equipment and conditions under which it is possible to achieve a temperature stability of samples at low temperatures, while keeping an independent temperature regulation of the CT system. The presented system uses a standard industrial CT device and a newly designed cooling stage with passive cooling based on phase-change material. The system is demonstrated on the measurement of plastic part (car door handle) at temperatures of -40 °C and 20 °C. The paper also presents the method of how to interpret the thermal changes using tools of the commercial software VGStudio MAX (Volume Graphics GmbH, Germany).
Reducing Information Overload in Large Seismic Data Sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
HAMPTON,JEFFERY W.; YOUNG,CHRISTOPHER J.; MERCHANT,BION J.
2000-08-02
Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers to make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g. single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important to bridge gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site. As part of the research into how to winnow the reference events in these large reconciled event sets, additional database query approaches have been developed to provide windows into these datasets. These custom-built content analysis tools help identify dataset characteristics that can potentially aid in providing a basis for comparing similar reference events in these large reconciled event sets. Once these characteristics can be identified, algorithms can be developed to create and add to the reduced set of events used by the Event Search Engine. These content analysis tools have already been useful in providing information on station coverage of the referenced events and basic statistical information on events in the research datasets. The tools can also provide researchers with a quick way to find interesting and useful events within the research datasets. The tools could also be used as a means to review reference event datasets as part of a dataset delivery verification process. There has also been an effort to explore the usefulness of commercially available web-based software to help with this problem. The advantages of using off-the-shelf software applications, such as Oracle's WebDB, to manipulate, customize and manage research data are being investigated. These types of applications are being examined to provide access to large integrated data sets for regional seismic research in Asia. All of these software tools would provide the researcher with unprecedented power without having to learn the intricacies and complexities of relational database systems.
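A minimal sketch of the waveform-correlation clustering that a dendrogram tool of this kind is built on, using scipy's hierarchical clustering; the synthetic waveforms, the zero-lag correlation distance and the single-linkage choice are illustrative assumptions, not the MatSeis implementation.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
waveforms = rng.standard_normal((12, 512))           # 12 hypothetical event waveforms

# Distance = 1 - normalized cross-correlation at zero lag (simplified).
def corr_dist(a, b):
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return 1.0 - np.dot(a, b) / len(a)

n = len(waveforms)
condensed = [corr_dist(waveforms[i], waveforms[j]) for i in range(n) for j in range(i + 1, n)]

Z = linkage(condensed, method="single")              # single-linkage clustering
clusters = fcluster(Z, t=0.8, criterion="distance")  # cut the dendrogram at distance 0.8
print(clusters)
```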
Lin, Yi; Puttonen, Eetu; Hyyppä, Juha
2013-01-01
In mobile terrestrial hyperspectral imaging, individual trees often present large variations in spectral reflectance that may impact the relevant applications, but related studies have seldom been reported. To fill this gap, this study was dedicated to investigating the spectral reflectance characteristics of individual trees with a Sensei mobile mapping system, which comprises a Specim line spectrometer and an Ibeo Lux laser scanner. The addition of the latter unit facilitates recording the structural characteristics of the target trees synchronously, and this is beneficial for revealing the characteristics of the spatial distributions of tree spectral reflectance with variations at different levels. The parts of trees with relatively low-level variations can then be extracted. At the same time, since it is difficult to manipulate the whole spectrum, the traditional concept of vegetation indices (VI) based on particular spectral bands was taken into account here. Whether the assumed VIs are capable of behaving consistently over the whole crown of each tree was also checked. The specific analyses were deployed based on four deciduous tree species and six kinds of VIs. The test showed that, with the help of the laser scanner data, the parts of individual trees with relatively low-level variations can be located. Based on these parts, the relatively stable spectral reflectance characteristics of different tree species can be learnt. PMID:23877127
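A minimal sketch of computing a band-ratio vegetation index (here NDVI) per scanned point and screening for the low-variation parts of a crown; the band values and the stability criterion are illustrative assumptions, not the study's procedure.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and red reflectance."""
    return (nir - red) / (nir + red)

# Hypothetical per-point reflectance for one tree crown (one row per scanned point).
rng = np.random.default_rng(2)
red = rng.uniform(0.03, 0.10, 500)
nir = rng.uniform(0.30, 0.60, 500)
vi = ndvi(nir, red)

# Keep only the crown parts whose VI deviation from the crown median is low (illustrative criterion).
stable = np.abs(vi - np.median(vi)) < 1.5 * np.std(vi)
print(f"stable points: {stable.sum()} / {len(vi)}, median NDVI = {np.median(vi[stable]):.2f}")
```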
Variations on a theme: novel immersed grating based spectrometer designs for space
NASA Astrophysics Data System (ADS)
Agócs, T.; Navarro, R.; Venema, L.
2017-11-01
We present novel immersed grating (IG) based spectrometer designs that can be used in space instrumentation. They are based on a design approach that aims to optimize the optical design using the expanded parameter space that IG technology offers. In principle, the wavefront error (WFE) of any optical system is most conveniently corrected in the pupil, where, in the case of the IG-based spectrometer, the IG itself is positioned. By modifying existing three-mirror based optical systems, which can form the main part of double-pass spectrometer designs, a large portion of the WFE of the optical system can be transferred to the pupil and to the IG. In these cases the IG can compensate simple low-order aberrations of the system, and consequently the main benefit is that the mirrors, which tend to be off-axis conical sections, can be replaced by spherical mirrors. The WFE budget of such designs has only a minor contribution from the very high quality spherical mirrors, and the majority of the WFE can then be allocated to the most complex part of the system, the IG. The latter can be designed so that the errors are compensated by a special grating pattern that in turn can be manufactured using the expertise and experience of the semiconductor industry.
Laser Heating in a Dense Plasma Focus.
The report is divided into two parts. In the first part, an account is given of the measurement of the momentum distribution of the deuterons ejected from a dense plasma focus. The results show the existence of a pronounced non-Maxwellian distribution and a small population of deuterons accelerated to the voltage of the condenser bank. In the second part, theoretical calculations of laser heating establish the presence of a large density gradient, which probably accounts for the large currents detected in such plasmas. (Author)
Effects of partitioned enthalpy of mixing on glass-forming ability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Song, Wen-Xiong; Zhao, Shi-Jin, E-mail: shijin.zhao@shu.edu.cn
2015-04-14
We explore the inherent reason at the atomic level for the glass-forming ability of alloys by molecular simulation, in which the effect of partitioned enthalpy of mixing is studied. Based on the Morse potential, we divide the enthalpy of mixing into three parts: the chemical part (ΔE_nn), the strain part (ΔE_strain), and the non-bond part (ΔE_nnn). We find that a large negative ΔE_nn value represents strong AB chemical bonding in an AB alloy and is the driving force to form a locally ordered structure; meanwhile, the transformed locally ordered structure needs to satisfy the condition (ΔE_nn/2 + ΔE_strain) < 0 to be stabilized. Understanding the chemical and strain parts of the enthalpy of mixing is helpful for designing a new metallic glass with good glass-forming ability. Moreover, two types of metallic glasses (i.e., "strain dominant" and "chemical dominant") are classified according to the relative importance of the chemical effect and the strain effect, which enriches our knowledge of the forming mechanism of metallic glass. Finally, a soft sphere model is established, different from the common hard sphere model.
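A minimal sketch of the Morse pair potential on which the partitioning above is based, with a schematic estimate of the nearest-neighbour (chemical) part as the A-B bond energy relative to the average of the A-A and B-B bond energies; all parameters are generic illustrative values, not fitted to any real alloy.

```python
import numpy as np

def morse(r, D_e, a, r_e):
    """Morse pair potential: D_e * ((1 - exp(-a (r - r_e)))^2 - 1), tending to zero as r -> infinity."""
    return D_e * ((1.0 - np.exp(-a * (r - r_e)))**2 - 1.0)

# Generic A-A, B-B and A-B parameters (illustrative only): well depth, stiffness, equilibrium distance.
params = {"AA": (0.4, 1.5, 2.8), "BB": (0.5, 1.4, 3.0), "AB": (0.6, 1.5, 2.9)}

r_nn = 2.9    # hypothetical nearest-neighbour distance in the mixed structure [angstrom]
# "Chemical" part ~ energy gain of forming A-B bonds relative to the average of A-A and B-B bonds.
dE_nn = morse(r_nn, *params["AB"]) - 0.5 * (morse(r_nn, *params["AA"]) + morse(r_nn, *params["BB"]))
print(f"dE_nn (chemical part, per A-B bond) = {dE_nn:.3f} (illustrative energy units)")
```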
Non-Destructive Thermography Analysis of Impact Damage on Large-Scale CFRP Automotive Parts.
Maier, Alexander; Schmidt, Roland; Oswald-Tranta, Beate; Schledjewski, Ralf
2014-01-14
Laminated composites are increasingly used in aeronautics and the wind energy industry, as well as in the automotive industry. In these applications, the construction and processing need to fulfill the highest requirements regarding weight and mechanical properties. Environmental issues, like fuel consumption and CO₂ footprint, set new challenges for producing lightweight parts that meet the closely monitored standards of these sectors. In the automotive industry, one main aspect of construction is the impact behavior of structural parts. To verify the quality of parts made from composite materials with little effort, cost and time, non-destructive test methods are increasingly used. A highly recommended non-destructive testing method is thermography analysis. In this work, a prototype for a car's base plate was produced by using vacuum infusion. For research purposes, test specimens were produced with the same multi-layer build-up as the prototypes. These specimens were subjected to defined loads in impact tests to simulate the effect of stone chips. Afterwards, the impacted specimens were investigated with thermography analysis. The research results of this work will help in understanding the possible fields of application and the usage of thermography analysis as a first quick and economic failure detection method for automotive parts.
Reinventing the Solar Power Satellite
NASA Technical Reports Server (NTRS)
Landis, Geoffrey A.
2002-01-01
Economy of scale is inherent in the microwave power transmission aperture/spot-size trade-off, resulting in a requirement for large space systems in the existing design concepts. Unfortunately, this large size means that the initial investment required before the first return, and the price of amortization of this initial investment, is a daunting (and perhaps insurmountable) barrier to economic viability. As the growth of ground-based solar power applications will fund the development of the PV technology required for space solar power and will also create the demand for space solar power by manufacturing a ready-made market, space power systems must be designed with an understanding that ground-based solar technologies will be implemented as a precursor to space-based solar. […] for low initial cost, (3) operation in synergy with ground solar systems, and (4) a power production profile tailored to peak rates. A key to simplicity of design is to maximize the integration of the system components. Microwave, millimeter-wave, and laser systems are analyzed. A new solar power satellite design concept with no sun-tracking and no moving parts is proposed to reduce the required cost to initial operational capability.
Innovative Approaches to Space-Based Manufacturing and Rapid Prototyping of Composite Materials
NASA Technical Reports Server (NTRS)
Hill, Charles S.
2012-01-01
The ability to deploy large habitable structures, construct, and service exploration vehicles in low earth orbit will be an enabling capability for continued human exploration of the solar system. It is evident that advanced manufacturing methods to fabricate replacement parts and re-utilize launch vehicle structural mass by converting it to different uses will be necessary to minimize costs and allow flexibility to remote crews engaged in space travel. Recent conceptual developments and the combination of inter-related approaches to low-cost manufacturing of composite materials and structures are described in context leading to the possibility of on-orbit and space-based manufacturing.
Impact of oscillations of shafts on machining accuracy using non-stationary machines
NASA Astrophysics Data System (ADS)
Fedorenko, M. A.; Bondarenko, J. A.; Pogonin, A. A.
2018-03-01
Restoring large and heavy parts and units of equipment is possible on the basis of developing the research base, including the development of models and theoretical relations that reveal the complex causes of damage and equipment failure. This allows one to develop new, effective technologies of maintenance and repair, the implementation of which ensures the efficiency and durability of the machines. The development of new forms of technical maintenance and repair of equipment, based on a systematic evaluation of its technical condition with the help of modern diagnostic tools, can significantly reduce the duration of downtime.
Large Scale Portability of Hospital Information System Software
Munnecke, Thomas H.; Kuhn, Ingeborg M.
1986-01-01
As part of its Decentralized Hospital Computer Program (DHCP) the Veterans Administration installed new hospital information systems in 169 of its facilities during 1984 and 1985. The application software for these systems is based on the ANS MUMPS language, is public domain, and is designed to be operating system and hardware independent. The software, developed by VA employees, is built upon a layered approach, where application packages layer on a common data dictionary which is supported by a Kernel of software. Communications between facilities are based on public domain Department of Defense ARPA net standards for domain naming, mail transfer protocols, and message formats, layered on a variety of communications technologies.
A Component-Based Extension Framework for Large-Scale Parallel Simulations in NEURON
King, James G.; Hines, Michael; Hill, Sean; Goodman, Philip H.; Markram, Henry; Schürmann, Felix
2008-01-01
As neuronal simulations approach larger scales with increasing levels of detail, the neurosimulator software represents only one part of a chain of tools ranging from setup and simulation to interaction with virtual environments, analysis and visualization. Previously published approaches to abstracting simulator engines have not received widespread acceptance, which in part may be due to the fact that they tried to address the challenge of solving the model specification problem. Here, we present an approach that uses a neurosimulator, in this case NEURON, to describe and instantiate the network model in the simulator's native model language but then replaces the main integration loop with its own. Existing parallel network models are easily adopted to run in the presented framework. The presented approach is thus an extension to NEURON but uses a component-based architecture to allow for replaceable spike exchange components and pluggable components for monitoring, analysis, or control that can run in this framework alongside the simulation. PMID:19430597
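A minimal sketch of an externally owned integration loop of the kind the framework substitutes for NEURON's standard run control, using NEURON's Python API on a trivial passive section; the monitoring hook is a hypothetical stand-in for the pluggable components described, and this is not the authors' code.

```python
from neuron import h

# Trivial model: one passive section, just to have something to integrate.
soma = h.Section(name="soma")
soma.insert("pas")

h.dt = 0.025          # ms
tstop = 5.0           # ms

def monitor(t, v):
    """Hypothetical pluggable monitoring component."""
    if t % 1.0 < h.dt:
        print(f"t={t:.2f} ms  v={v:.2f} mV")

# External main loop replacing NEURON's standard run control: the host framework,
# not NEURON, owns the time stepping and can interleave spike exchange or analysis here.
h.finitialize(-65)
while h.t < tstop:
    h.fadvance()
    monitor(h.t, soma(0.5).v)
```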
NASA Astrophysics Data System (ADS)
Cécillon, Lauric; Quénéa, Katell; Anquetil, Christelle; Barré, Pierre
2015-04-01
Due to its large heterogeneity at all scales (from the soil core to the globe), several measurements are often mandatory to get a meaningful value of a measured soil property. A large number of measurements can therefore be needed to study a soil property, whatever the scale of the study. Moreover, several soil investigation techniques produce large and complex datasets, such as pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS), which produces complex three-way data. In this context, straightforward methods designed to speed up data treatment are needed to deal with large datasets. Py-GC-MS is a powerful and frequently used tool to characterize soil organic matter (SOM). However, the treatment of the results of a Py-GC-MS analysis of a soil sample is time consuming (number of peaks, co-elution, etc.), and the treatment of a large data set of Py-GC-MS results is rather laborious. Moreover, peak position shifts and baseline drifts between analyses make the automation of GC-MS data treatment difficult. These problems can be addressed using Parallel Factor Analysis 2 (PARAFAC2; Kiers et al., 1999; Bro et al., 1999). This algorithm has been applied frequently to chromatography data but has never been applied to analyses of SOM. We developed a Matlab routine, based on existing Matlab packages, dedicated to the simultaneous treatment of dozens of pyro-chromatogram mass spectra. We applied this routine to 40 soil samples. The benefits and expected improvements of our method will be discussed in our poster. References: Kiers et al. (1999) PARAFAC2 - Part I. A direct fitting algorithm for the PARAFAC2 model. Journal of Chemometrics, 13: 275-294. Bro et al. (1999) PARAFAC2 - Part II. Modeling chromatographic data with retention time shifts. Journal of Chemometrics, 13: 295-309.
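A minimal sketch of fitting a PARAFAC2 model to a stack of pyro-chromatogram slices, assuming the Python tensorly library's parafac2 routine as the tooling (the authors used a Matlab routine); the synthetic slices mimic runs with slightly different numbers of scans, i.e. retention-time shifts.

```python
import numpy as np
from tensorly.decomposition import parafac2

# Stack of hypothetical pyro-chromatograms: one (retention time x m/z) matrix per sample,
# with slightly different numbers of scans per run to mimic retention-time shifts.
rng = np.random.default_rng(3)
slices = [rng.random((120 + 5 * k, 80)) for k in range(5)]   # 5 samples

# Fit a 5-component PARAFAC2 model; the row (retention-time) mode of each slice may differ,
# which is what lets the model absorb retention-time shifts between runs.
decomposition = parafac2(slices, rank=5, random_state=0)
print("PARAFAC2 model fitted:", type(decomposition).__name__)
```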
Mining the Galaxy Zoo Database: Machine Learning Applications
NASA Astrophysics Data System (ADS)
Borne, Kirk D.; Wallin, J.; Vedachalam, A.; Baehr, S.; Lintott, C.; Darg, D.; Smith, A.; Fortson, L.
2010-01-01
The new Zooniverse initiative is addressing the data flood in the sciences through a transformative partnership between professional scientists, volunteer citizen scientists, and machines. As part of this project, we are exploring the application of machine learning techniques to data mining problems associated with the large and growing database of volunteer science results gathered by the Galaxy Zoo citizen science project. We will describe the basic challenge, some machine learning approaches, and early results. One of the motivators for this study is the acquisition (through the Galaxy Zoo results database) of approximately 100 million classification labels for roughly one million galaxies, yielding a tremendously large and rich set of training examples for improving automated galaxy morphological classification algorithms. In our first case study, the goal is to learn which morphological and photometric features in the Sloan Digital Sky Survey (SDSS) database correlate most strongly with user-selected galaxy morphological class. As a corollary to this study, we are also aiming to identify which galaxy parameters in the SDSS database correspond to galaxies that have been the most difficult to classify (based upon large dispersion in their volunteer-provided classifications). Our second case study will focus on similar data mining analyses and machine learning algorithms applied to the Galaxy Zoo catalog of merging and interacting galaxies. The outcomes of this project will have applications in future large sky surveys, such as the LSST (Large Synoptic Survey Telescope) project, which will generate a catalog of 20 billion galaxies and will produce an additional astronomical alert database of approximately 100 thousand events each night for 10 years -- the capabilities and algorithms that we are exploring will assist in the rapid characterization and classification of such massive data streams. This research has been supported in part through NSF award #0941610.
NASA Astrophysics Data System (ADS)
Martynenko, S.; Rozumenko, V.; Tyrnov, O.; Manson, A.; Meek, C.
The large V/m electric fields inherent in the mesosphere play an essential role in lower ionospheric electrodynamics. They must be the cause of large variations in the electron temperature and the electron collision frequency at D-region altitudes, and consequently the ionospheric plasma in the lower part of the D region undergoes a transition into a nonisothermal state. This study is based on the databases of large mesospheric electric fields collected with the 2.2-MHz radar of the Institute of Space and Atmospheric Studies, University of Saskatchewan, Canada (52°N geographic latitude, 60.4°N geomagnetic latitude) and with the 2.3-MHz radar of the Kharkiv V. Karazin National University (49.6°N geographic latitude, 45.6°N geomagnetic latitude). The statistical analysis of these data is presented in Meek, C. E., A. H. Manson, S. I. Martynenko, V. T. Rozumenko, O. F. Tyrnov, Remote sensing of mesospheric electric fields using MF radars, Journal of Atmospheric and Solar-Terrestrial Physics, in press. The large mesospheric electric fields are experimentally established to follow a Rayleigh distribution in the interval 0
NASA Astrophysics Data System (ADS)
Zhang, Yang; Liu, Wei; Li, Xiaodong; Yang, Fan; Gao, Peng; Jia, Zhenyuan
2015-10-01
Large-scale triangulation scanning measurement systems are widely used to measure the three-dimensional profile of large-scale components and parts. The accuracy and speed of the laser stripe center extraction are essential for guaranteeing the accuracy and efficiency of the measuring system. However, in the process of large-scale measurement, multiple factors can cause deviation of the laser stripe center, including the spatial light intensity distribution, material reflectivity characteristics, and spatial transmission characteristics. A center extraction method is proposed for improving the accuracy of the laser stripe center extraction based on image evaluation of Gaussian fitting structural similarity and analysis of the multiple source factors. First, according to the features of the gray distribution of the laser stripe, evaluation of the Gaussian fitting structural similarity is estimated to provide a threshold value for center compensation. Then using the relationships between the gray distribution of the laser stripe and the multiple source factors, a compensation method of center extraction is presented. Finally, measurement experiments for a large-scale aviation composite component are carried out. The experimental results for this specific implementation verify the feasibility of the proposed center extraction method and the improved accuracy for large-scale triangulation scanning measurements.
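A minimal sketch of sub-pixel stripe-center extraction by fitting a Gaussian to the gray-level profile of one image column, which is the kind of gray-distribution model the evaluation and compensation above build on; the synthetic profile and noise level are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma, offset):
    return amp * np.exp(-(x - mu)**2 / (2 * sigma**2)) + offset

# Synthetic gray-level profile of one image column crossing the laser stripe.
rows = np.arange(64)
true_center = 30.4
profile = gaussian(rows, 180, true_center, 2.5, 12) + np.random.default_rng(4).normal(0, 3, rows.size)

# Initial guess from the raw peak, then refine the sub-pixel center by least squares.
p0 = [profile.max() - profile.min(), rows[profile.argmax()], 2.0, profile.min()]
(amp, mu, sigma, offset), _ = curve_fit(gaussian, rows, profile, p0=p0)
print(f"estimated stripe center = {mu:.2f} px (true {true_center})")
```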
Shen, Jian; Hua, Baozhen
2013-08-01
Male adults of Panorpidae possess a special sperm pump, through which the males transfer liquid sperm to the females. However, the structures of the sperm pump and the transfer mechanism have not been satisfactorily elucidated hitherto. In this paper the structures of the ejaculatory sac and sperm pump of the scorpionfly Panorpa liui Hua were investigated using light microscopy and scanning electron microscopy. The ejaculatory sac is located between the basal end of the paired vasa deferentia and the aedeagus, comprising a small anterior part and a large posterior part. The anterior part is simple and functions only as a channel for sperm transfer. The epithelial cells of the large posterior part likely have secretory functions. The sperm pump is formed by the posterior region of the ejaculatory sac and derivates of the genital field, which enclose the pumping chamber, a piston and the associated muscles. The orifice of the ejaculatory duct lies ventrad of the piston. The piston of the sperm pump is heavily sclerotized and controlled by two antagonistic muscle pairs. A pair of simple tubular accessory glands opens to the pumping chamber. Two well-developed sex pheromone glands are located on the ventral side of the ejaculatory sac, and are composed of two fan-shaped lamellae. The epithelium of the sex pheromone glands is single-layered, and forms densely filamentous processes. The ejaculation mechanism is briefly discussed based on the morphology of ejaculatory sac and sperm pump. Copyright © 2013 Elsevier Ltd. All rights reserved.
Application of AIS Technology to Forest Mapping
NASA Technical Reports Server (NTRS)
Yool, S. R.; Star, J. L.
1985-01-01
Concerns about the environmental effects of large-scale deforestation have prompted efforts to map forests over large areas using various remote sensing data and image processing techniques. Basic research on the spectral characteristics of forest vegetation is required to form a basis for the development of new techniques and for image interpretation. Examination of LANDSAT data and image processing algorithms over a portion of boreal forest has demonstrated the complexity of the relations between the various expressions of forest canopies, environmental variability, and the relative capacities of different image processing algorithms to achieve high classification accuracies under these conditions. Airborne Imaging Spectrometer (AIS) data may in part provide the means to interpret the responses of standard data and techniques to the vegetation, based on its relatively high spectral resolution.
Research on Fault Rate Prediction Method of T/R Component
NASA Astrophysics Data System (ADS)
Hou, Xiaodong; Yang, Jiangping; Bi, Zengjun; Zhang, Yu
2017-07-01
The T/R component is an important part of the antenna array of a large phased-array radar; because of its large numbers and high fault rate, fault prediction for it is of considerable significance. To address the problems of the traditional grey model GM(1,1) in practical operation, a discrete grey model is established in this paper based on the original model, an optimization factor is introduced to optimize the background value, and a linear term is added to the prediction model, yielding an improved discrete grey model with linear regression. Finally, an example is simulated and compared with other models. The results show that the method proposed in this paper has higher accuracy, a simple solution procedure, and a wider scope of application.
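A minimal sketch of the classical GM(1,1) grey model that the improved discrete model above takes as its starting point; this is the textbook formulation with hypothetical fault counts, not the paper's optimized variant.

```python
import numpy as np

def gm11_predict(x0, n_ahead=3):
    """Classical GM(1,1): fit a, b on the cumulated series and extrapolate n_ahead steps."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                  # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    Y = x0[1:]
    (a, b), *_ = np.linalg.lstsq(B, Y, rcond=None)      # grey development and control coefficients
    k = np.arange(1, len(x0) + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # time-response sequence
    x0_hat = np.diff(np.concatenate(([x0[0]], x1_hat))) # inverse AGO -> restored series
    return x0_hat[-n_ahead:]

# Hypothetical monthly T/R module fault counts.
print(gm11_predict([3, 4, 4, 5, 6, 7], n_ahead=3))
```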
Large-scale mapping of hard-rock aquifer properties applied to Burkina Faso.
Courtois, Nathalie; Lachassagne, Patrick; Wyns, Robert; Blanchin, Raymonde; Bougaïré, Francis D; Somé, Sylvain; Tapsoba, Aïssata
2010-01-01
A country-scale (1:1,000,000) methodology has been developed for hydrogeologic mapping of hard-rock aquifers (granitic and metamorphic rocks) of the type that underlie a large part of the African continent. The method is based on quantifying the "useful thickness" and hydrodynamic properties of such aquifers and uses a recent conceptual model developed for this hydrogeologic context. This model links hydrodynamic parameters (transmissivity, storativity) to lithology and the geometry of the various layers constituting a weathering profile. The country-scale hydrogeological mapping was implemented in Burkina Faso, where a recent 1:1,000,000-scale digital geological map and a database of some 16,000 water wells were used to evaluate the methodology.
A Review of Astronomy Education Research
NASA Astrophysics Data System (ADS)
Bailey, Janelle M.; Slater, Timothy F.
The field of astronomy education is rapidly growing beyond merely sharing effective activities or curriculum ideas. This paper categorizes and summarizes the literature in astronomy education research and contains more than 100 references to articles, books, and Web-based materials. Research into student understanding on a variety of topics now occupies a large part of the literature. Topics include the shape of Earth and gravity, lunar phases, seasons, astrobiology, and cosmology. The effectiveness of instructional methods is now being tested systematically, taking data beyond the anecdotal with powerful research designs and statistical analyses. Quantitative, qualitative, and mixed-methods approaches have found their places in the researcher's toolbox. In all cases, the connection between the research performed and its effect on classroom instruction is largely lacking.
Evidence-Based Approach to Fiber Supplements and Clinically Meaningful Health Benefits, Part 1
McRorie, Johnson W.
2015-01-01
Dietary fiber that is intrinsic and intact in fiber-rich foods (eg, fruits, vegetables, legumes, whole grains) is widely recognized to have beneficial effects on health when consumed at recommended levels (25 g/d for adult women, 38 g/d for adult men). Most (90%) of the US population does not consume this level of dietary fiber, averaging only 15 g/d. In an attempt to bridge this “fiber gap,” many consumers are turning to fiber supplements, which are typically isolated from a single source. Fiber supplements cannot be presumed to provide the health benefits that are associated with dietary fiber from whole foods. Of the fiber supplements on the market today, only a minority possess the physical characteristics that underlie the mechanisms driving clinically meaningful health benefits. The first part (current issue) of this 2-part series will focus on the 4 main characteristics of fiber supplements that drive clinical efficacy (solubility, degree/rate of fermentation, viscosity, and gel formation), the 4 clinically meaningful designations that identify which health benefits are associated with specific fibers, and the gel-dependent mechanisms in the small bowel that drive specific health benefits (eg, cholesterol lowering, improved glycemic control). The second part (next issue) of this 2-part series will focus on the effects of fiber supplements in the large bowel, including the 2 mechanisms by which fiber prevents/relieves constipation (insoluble mechanical irritant and soluble gel-dependent water-holding capacity), the gel-dependent mechanism for attenuating diarrhea and normalizing stool form in irritable bowel syndrome, and the combined large bowel/small bowel fiber effects for weight loss/maintenance. The second part will also discuss how processing for marketed products can attenuate efficacy, why fiber supplements can cause gastrointestinal symptoms, and how to avoid symptoms for better long-term compliance. PMID:25972618
Evidence-Based Approach to Fiber Supplements and Clinically Meaningful Health Benefits, Part 2
McRorie, Johnson W.
2015-01-01
Dietary fiber that is intrinsic and intact in fiber-rich foods (eg, fruits, vegetables, legumes, whole grains) is widely recognized to have beneficial effects on health when consumed at recommended levels (25 g/d for adult women, 38 g/d for adult men). Most (90%) of the US population does not consume this level of dietary fiber, averaging only 15 g/d. In an attempt to bridge this “fiber gap,” many consumers are turning to fiber supplements, which are typically isolated from a single source. Fiber supplements cannot be presumed to provide the health benefits that are associated with dietary fiber from whole foods. Of the fiber supplements on the market today, only a minority possess the physical characteristics that underlie the mechanisms driving clinically meaningful health benefits. In this 2-part series, the first part (previous issue) described the 4 main characteristics of fiber supplements that drive clinical efficacy (solubility, degree/rate of fermentation, viscosity, and gel formation), the 4 clinically meaningful designations that identify which health benefits are associated with specific fibers, and the gel-dependent mechanisms in the small bowel that drive specific health benefits (eg, cholesterol lowering, improved glycemic control). The second part (current issue) of this 2-part series will focus on the effects of fiber supplements in the large bowel, including the 2 mechanisms by which fiber prevents/relieves constipation (insoluble mechanical irritant and soluble gel-dependent water-holding capacity), the gel-dependent mechanism for attenuating diarrhea and normalizing stool form in irritable bowel syndrome, and the combined large bowel/small bowel fiber effects for weight loss/maintenance. The second part will also discuss how processing for marketed products can attenuate efficacy, why fiber supplements can cause gastrointestinal symptoms, and how to avoid symptoms for better long-term compliance. PMID:25972619
Investigation of relationships between parameters of solar nano-flares and solar activity
NASA Astrophysics Data System (ADS)
Safari, Hossein; Javaherian, Mohsen; Kaki, Bardia
2016-07-01
Solar flares are important coronal events that originate in solar magnetic activity. They release large amounts of energy into the surrounding medium right after the trigger. Flare prediction can play a main role in mitigating potential damage on Earth. Here, to interpret solar large-scale events (e.g., flares), we investigate relationships between small-scale events (nano-flares) and large-scale events. In our method, the intensity time series of nano-flares are simulated using a Monte Carlo method. Then, full-disk solar images taken at 171 angstrom and recorded by SDO/AIA are employed. Parts of the solar disk (quiet Sun (QS), coronal holes (CHs), and active regions (ARs)) are cropped and the time series of these regions are extracted. To compare the simulated intensity time series of nano-flares with the intensity time series of real data extracted from different parts of the Sun, artificial neural networks are employed. We are therefore able to extract physical parameters of nano-flares, such as the kick and decay lifetimes and the power of their power-law distributions. The variation of the power-law index within the QS and CHs follows a pattern similar to that in ARs. Thus, by observing a small part of the Sun, we can follow the course of solar activity.
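A minimal sketch of a Monte Carlo nano-flare light curve of the kind described, assuming Poisson-distributed kick times, power-law-distributed kick amplitudes, and exponential decay of each event; all parameter names and values are illustrative.

```python
# Sketch: synthetic nano-flare intensity time series built by Monte Carlo sampling.
import numpy as np

rng = np.random.default_rng(0)

def simulate_nanoflare_series(n=2000, rate=0.02, alpha=2.0, tau=30.0, a_min=1.0):
    kicks = rng.random(n) < rate                       # event occurrence at each time step
    # power-law amplitudes via inverse-CDF sampling: p(a) ~ a^(-alpha), a >= a_min
    amp = a_min * (1.0 - rng.random(kicks.sum())) ** (-1.0 / (alpha - 1.0))
    series = np.zeros(n)
    series[np.flatnonzero(kicks)] = amp
    # convolve the impulses with an exponential decay kernel of lifetime tau
    kernel = np.exp(-np.arange(int(10 * tau)) / tau)
    return np.convolve(series, kernel)[:n]

lightcurve = simulate_nanoflare_series()
print(lightcurve.max(), lightcurve.mean())
```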
OPTICAL–NEAR-INFRARED PHOTOMETRIC CALIBRATION OF M DWARF METALLICITY AND ITS APPLICATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hejazi, N.; Robertis, M. M. De; Dawson, P. C., E-mail: nedahej@yorku.ca, E-mail: mmdr@yorku.ca, E-mail: pdawson@trentu.ca
2015-04-15
Based on a carefully constructed sample of dwarf stars, a new optical–near-infrared photometric calibration to estimate the metallicity of late-type K and early-to-mid-type M dwarfs is presented. The calibration sample has two parts; the first part includes 18 M dwarfs with metallicities determined by high-resolution spectroscopy and the second part contains 49 dwarfs with metallicities obtained through moderate-resolution spectra. By applying this calibration to a large sample of around 1.3 million M dwarfs from the Sloan Digital Sky Survey and 2MASS, the metallicity distribution of this sample is determined and compared with those of previous studies. Using photometric parallaxes, the Galactic heights of M dwarfs in the large sample are also estimated. Our results show that stars farther from the Galactic plane, on average, have lower metallicity, which can be attributed to the age–metallicity relation. A scarcity of metal-poor dwarf stars in the metallicity distribution relative to the Simple Closed Box Model indicates the existence of the “M dwarf problem,” similar to the previously known G and K dwarf problems. Several more complicated Galactic chemical evolution models which have been proposed to resolve the G and K dwarf problems are tested and it is shown that these models could, to some extent, mitigate the M dwarf problem as well.
Voltage stability analysis in the new deregulated environment
NASA Astrophysics Data System (ADS)
Zhu, Tong
Nowadays, a significant portion of the power industry is under deregulation. Under this new circumstance, network security analysis is more critical and more difficult. One of the most important issues in network security analysis is voltage stability analysis. Due to the expected higher utilization of equipment induced by competition in a power market that covers bigger power systems, this issue is increasingly acute after deregulation. In this dissertation, some selected topics of voltage stability analysis are covered. In the first part, after a brief review of general concepts of continuation power flow (CPF), investigations on various matrix analysis techniques to improve the speed of CPF calculation for large systems are reported. Based on these improvements, a new CPF algorithm is proposed. This new method is then tested by an inter-area transaction in a large inter-connected power system. In the second part, the Arnoldi algorithm, the best method to find a few minimum singular values for a large sparse matrix, is introduced into the modal analysis for the first time. This new modal analysis is applied to the estimation of the point of voltage collapse and contingency evaluation in voltage security assessment. Simulations show that the new method is very efficient. In the third part, after transient voltage stability component models are investigated systematically, a novel system model for transient voltage stability analysis, which is a logical-algebraic-differential-difference equation (LADDE), is offered. As an example, TCSC (Thyristor controlled series capacitors) is addressed as a transient voltage stabilizing controller. After a TCSC transient voltage stability model is outlined, a new TCSC controller is proposed to enhance both fault related and load increasing related transient voltage stability. Its ability is proven by the simulation.
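A minimal sketch of finding a few smallest singular values of a large sparse matrix with an iterative Krylov (Arnoldi/Lanczos-type) solver, as in the modal analysis described above; the matrix below is a random sparse stand-in rather than an actual power-flow Jacobian, and SciPy's ARPACK-based routine stands in for the author's implementation.

```python
# Sketch: a few smallest singular values of a large sparse matrix via scipy's ARPACK wrapper.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import svds

n = 1000
J = sp.random(n, n, density=5.0 / n, random_state=1, format="csr") + sp.identity(n)

# smallest singular values indicate proximity to voltage collapse (sigma_min -> 0)
sigma = svds(J, k=3, which="SM", return_singular_vectors=False)
print(np.sort(sigma))
```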
Sumant, Anirudha V.; Divan, Ralu; Posada, Chrystian M.; Castano, Carlos H.; Grant, Edwin J.; Lee, Hyoung K.
2016-03-29
A cold cathode field emission array (FEA) source based on ultra-nanocrystalline diamond (UNCD) field emitters. This system was constructed as an alternative for detection of obscured objects and material. Depending on the geometry of the given situation, a flat-panel source can be used in tomography, radiography, or tomosynthesis. Furthermore, the unit can be used as a portable electron or X-ray scanner or as an integral part of an existing detection system. UNCD field emitters show high field emission output and can be deposited over large areas, as is the case with carbon nanotube "forest" (CNT) cathodes. Furthermore, UNCDs have better mechanical and thermal properties compared to CNT tips, which further extends the lifetime of UNCD-based FEAs.
Temperature Control Diagnostics for Sample Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santodonato, Louis J; Walker, Lakeisha MH; Church, Andrew J
2010-01-01
In a scientific laboratory setting, standard equipment such as cryocoolers are often used as part of a custom sample environment system designed to regulate temperature over a wide range. The end user may be more concerned with precise sample temperature control than with base temperature. But cryogenic systems tend to be specified mainly in terms of cooling capacity and base temperature. Technical staff at scientific user facilities (and perhaps elsewhere) often wonder how to best specify and evaluate temperature control capabilities. Here we describe test methods and give results obtained at a user facility that operates a large sample environment inventory. Although this inventory includes a wide variety of temperature, pressure, and magnetic field devices, the present work focuses on cryocooler-based systems.
NASA Astrophysics Data System (ADS)
Lawlor, John; Conneely, Claire; Tangney, Brendan
The poor assimilation of ICT in formal education is firmly rooted in models of learning prevalent in the classroom which are largely teacher-led, individualistic and reproductive, with little connection between theory and practice and poor linkages across the curriculum. A new model of classroom practice is required to allow for creativity, peer-learning, thematic learning, collaboration and problem solving, i.e. the skills commonly deemed necessary for the knowledge-based society of the 21st century. This paper describes the B2C model for group-based, technology-mediated, project-oriented learning which, while being developed as part of an out of school programme, offers a pragmatic alternative to traditional classroom pedagogy.
NASA Astrophysics Data System (ADS)
Zhang, Wenqing
2015-03-01
A concept of a part-crystalline part-liquid state (or liquid-like), and even a part-crystalline part-glass state (or glass-like), was demonstrated in some materials with chemical-bond hierarchy, such as Cu3SbSe3, in which certain constituent species bond only weakly to the rest of the crystal. Such a material can intrinsically exhibit the coexistence of rigid crystalline sublattices and other fluctuating noncrystalline sublattices with thermally induced large-amplitude vibrations and even flow of groups of atoms. The large-amplitude vibrations and movement of atoms can generate unusually severe phonon scattering and thermal damping due to collective low-frequency vibrations similar to the Boson peak in amorphous or liquid materials. Because different phases or states can differ considerably in energy, whether the thermally induced part-crystalline state undergoes a phase transition becomes an interesting issue. In addition, our earlier work reported that a second-order phase transition could induce extreme electron and phonon scattering in thermoelectrics. This work clearly demonstrates that the unusual effects of structural fluctuations on thermal and electrical transport in thermoelectrics deserve attention. While materials with these structural changes can retain extremely low lattice thermal conductivity and unusual electron transport, and thus become promising candidates for high-performance thermoelectrics, the underlying mechanism is yet to be explored.
Mechanism of Rock Burst Occurrence in Specially Thick Coal Seam with Rock Parting
NASA Astrophysics Data System (ADS)
Wang, Jian-chao; Jiang, Fu-xing; Meng, Xiang-jun; Wang, Xu-you; Zhu, Si-tao; Feng, Yu
2016-05-01
A specially thick coal seam with complex structure, such as rock parting and alternating soft and hard coal, is called a specially thick coal seam with rock parting (STCSRP); such seams easily lead to rock bursts during mining. Based on the stress distribution of the rock parting zone, this study investigated the mechanism, engineering discriminant conditions, prevention methods, and risk evaluation method of rock burst occurrence in STCSRP through setting up a mechanical model. The main conclusions of this study are as follows. (1) When the mining face moves closer to the rock parting zone, the original non-uniform stress of the rock parting zone and the advancing stress of the mining face combine to intensify gradually the shearing action on coal near the mining face. When the shearing action reaches a certain degree, rock burst easily occurs near the mining face. (2) Rock burst occurrence in STCSRP is positively associated with mining depth, advancing stress concentration factor of the mining face, thickness of rock parting, bursting liability of coal, thickness ratio of rock parting to coal seam, and difference of elastic modulus between rock parting and coal, whereas it is negatively associated with shear strength. (3) Technologies of large-diameter drilling, coal seam water injection, and deep-hole blasting can reduce the advancing stress concentration factor, thickness of rock parting, and difference of elastic modulus between rock parting and coal to lower the risk of rock burst in STCSRP. (4) The research result was applied to evaluate and control the risk of rock burst occurrence in STCSRP.
Indentured Parts List Maintenance and Part Assembly Capture Tool - IMPACT
NASA Technical Reports Server (NTRS)
Jain, Bobby; Morris, Jill; Sharpe, Kelly
2004-01-01
Johnson Space Center's (JSC's) indentured parts list (IPL) maintenance and parts assembly capture tool (IMPACT) is an easy-to-use graphical interface for viewing and maintaining the complex assembly hierarchies of large databases. IMPACT, already in use at JSC to support the International Space Station (ISS), queries, updates, modifies, and views data in IPL and associated resource data, functions that it can also perform, with modification, for any large commercial database. By enabling its users to efficiently view and manipulate IPL hierarchical data, IMPACT performs a function unlike that of any other tool. Through IMPACT, users will achieve results quickly, efficiently, and cost effectively.
Kim, Eun Young; Magnotta, Vincent A; Liu, Dawei; Johnson, Hans J
2014-09-01
Machine learning (ML)-based segmentation methods are a common technique in the medical image processing field. In spite of numerous research groups that have investigated ML-based segmentation frameworks, unanswered questions remain about performance variability with respect to two key components: the ML algorithm and the intensity normalization. This investigation reveals that the choice of those elements plays a major part in determining segmentation accuracy and generalizability. The approach used in this study aims to evaluate the relative benefits of the two elements within a subcortical MRI segmentation framework. Experiments were conducted to contrast eight machine-learning algorithm configurations and 11 normalization strategies for our brain MR segmentation framework. For the intensity normalization, a Stable Atlas-based Mapped Prior (STAMP) was utilized to take better account of contrast along boundaries of structures. A comparison of the eight machine learning algorithms on down-sampled segmentation MR data showed that a significant improvement was obtained using ensemble-based ML algorithms (i.e., random forest) or ANN algorithms. Further investigation between these two algorithms also revealed that the random forest results provided exceptionally good agreement with manual delineations by experts. Additional experiments showed that STAMP-based intensity normalization also improved the robustness of segmentation for multicenter data sets. The constructed framework obtained good multicenter reliability and was successfully applied to a large multicenter MR data set (n>3000). Less than 10% of automated segmentations were recommended for minimal expert intervention. These results demonstrate the feasibility of using ML-based segmentation tools for processing large amounts of multicenter MR images. Dramatically different result profiles in segmentation accuracy were obtained depending on the choice of ML algorithm and intensity normalization. Copyright © 2014 Elsevier Inc. All rights reserved.
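A minimal sketch of the kind of voxel-wise random-forest labeling compared in the study; the feature set and data below are synthetic stand-ins (the actual framework uses STAMP-normalized multimodal intensities and atlas priors), and scikit-learn is assumed purely for illustration.

```python
# Sketch: voxel-wise random-forest classification on synthetic features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_voxels = 5000
X = np.column_stack([
    rng.normal(0, 1, n_voxels),      # normalized T1 intensity (stand-in)
    rng.normal(0, 1, n_voxels),      # normalized T2 intensity (stand-in)
    rng.random(n_voxels),            # atlas-based spatial prior probability
])
y = (0.8 * X[:, 2] + 0.3 * X[:, 0] + rng.normal(0, 0.3, n_voxels) > 0.6).astype(int)

clf = RandomForestClassifier(n_estimators=100, max_depth=12, n_jobs=-1, random_state=0)
clf.fit(X[:4000], y[:4000])
print("held-out accuracy:", clf.score(X[4000:], y[4000:]))
```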
Brouwers, Elisabeth M.
1994-01-01
Shallow-marine ostracode assemblages from upper Pliocene sediments of the upper part of the Sagavanirktok Formation and lower part of the Gubik Formation record the last warm period that occurred before the onset of significant cooling of the Arctic Ocean and the initiation of Northern Hemisphere continental glaciation. The informally named Colvillian and Bigbendian transgressions represent the oldest deposits of the Gubik Formation and are dated, based on various lines of evidence, between 2.48 and 3 Ma. Ostracode faunas from the lower part of the Gubik Formation indicate a cold-temperate to subfrigid marine climate with summer bottom temperatures 1-4 C warmer than today. Deposits of the upper part of the Sagavanirktok Formation at Manning Point and Barter Island are older than Colvillian sediments but are believed to be late Pliocene in age and contain an ostracode fauna that has many species in common with the lower part of the Gubik Formation. The Sagavanirktok ostracode faunas indicate a cold-temperature to subfrigid marine climate, similar to that inferred for the lower part of the Gubik Formation, with summer bottom temperatures 1-3 C warmer than today. The opening of Bering Strait at about 3 Ma altered Arctic Ocean assemblage composition as Pacific species migrated into the Arctic and North Atlantic oceans. The admixture of evolutionarily distinct faunas from the Atlantic and Pacific oceans identifies Colvillian (and younger) faunas and provides a convenient reference horizon in the Alaskan fossil record. The marine climatic deterioration that followed the Bigbendian appears to have been abrupt and is documented by biotic turnover, with large numbers of species extinctions and first appearances of new species. The change in species composition can be attributed to the cooling of the Arctic Ocean during the late Pliocene.
NASA Technical Reports Server (NTRS)
Baker, C. R.
1975-01-01
Liquid hydrogen is being considered as a substitute for conventional hydrocarbon-based fuels for future generations of commercial jet aircraft. Its acceptance will depend, in part, upon the technology and cost of liquefaction. The process and economic requirements for providing a sufficient quantity of liquid hydrogen to service a major airport are described. The design is supported by thermodynamic studies which determine the effect of process arrangement and operating parameters on the process efficiency and work of liquefaction.
1980-12-01
Draft Environmental Impact Statement: MX Deployment Area Selection and Land Withdrawal/Acquisition, Department of the Air Force, December 1980. The impact-analysis process asks, among other questions, whether the consequences are such that the ecosystem will not recover at all, and (7) whether the consequences are such that the impact may be large but the recovery process...
Employing Land-Based Anti-Ship Missiles in the Western Pacific
2013-01-01
In the past several years, some strategists have argued that China is shifting the balance of power in the Western Pacific in its favor, in large part...
1976-08-01
extensive areas of good agreement with measured loadings where the prediction is based on acoustic theory. Acoustic theory as applied to thin airfoils... Acoustic theory has been demonstrated by references 12 through 18 to provide fairly good agreement with measured airloads due to blast and shock... is found to rise to large values near the leading edge. Higher observed values of Ac further rearward of the leading edge are found to compel...
2013-09-01
considered acts of redemption. A necessary part of this is the defensive dehumanization of the victim which will deprive them of their unique value based...fosters their identity, dehumanizes the enemy and creates a “killer” mentality that is capable of murdering large numbers of innocent people. She...alienated religious group can inflict upon its perceived outgroup. She finds religion to be the ideal motivator of people to violence. Dehumanization
Survey-Guided Development: Data Based Organizational Change
1975-06-01
are largely environmentally determined; these experiences "impact," and as they do so, move from the more existential surface level to the more... personal values with those humanistic values which he believes are held by top managers today--form the major basis, in Bennis' view, for the change... as "society." If they are, it is because we adhere to a set of humanistic values and define society's "work" at least in part in these terms.
NASA Technical Reports Server (NTRS)
Naderi, F. (Editor)
1982-01-01
A system design for a satellite aided land mobile service is described. The advanced system is based on a geostationary satellite which employs a large UHF reflector to communicate with small user antennas on mobile vehicles. It is shown that the system through multiple beam antennas and frequency reuse provides for radiotelephone and dispatch channels. It is concluded that the system is technologically feasible to provide service to rural and remote regions.
Personal identification by eyes.
Marinović, Dunja; Njirić, Sanja; Coklo, Miran; Muzić, Vedrana
2011-09-01
Identification of persons through the eyes falls within the field of biometric science. Many security systems are based on biometric methods of personal identification, to verify that a person is who they claim to be. The human eye contains an extremely large number of individual characteristics that make it particularly suitable for the process of identifying a person. Today, the eye is considered to be one of the most reliable body parts for human identification. Systems using iris recognition are among the most secure biometric systems.
Urbanisation, poverty and employment: the large metropolis in the third world.
Singh, A
1992-01-01
"The main purpose of this paper is to provide an overall review of the chief analytical as well as economic policy issues in relation to Third World cities in the light of the available theoretical and empirical studies on urbanisation, poverty and employment in the developing countries.... Part I...provides basic information on urbanisation in the Third World...[and] outlines the nature and extent of urban poverty in these large cities and considers the impact of the world economic crisis on the urban poor. Part II of the paper discusses the most important structural features of urbanisation in relation to economic development....Finally, Part III briefly examines policy issues in relation to urbanisation and poverty in the Third World's large cities." excerpt
Questionnaire-based assessment of executive functioning: Psychometrics.
Castellanos, Irina; Kronenberger, William G; Pisoni, David B
2018-01-01
The psychometric properties of the Learning, Executive, and Attention Functioning (LEAF) scale were investigated in an outpatient clinical pediatric sample. As a part of clinical testing, the LEAF scale, which broadly measures neuropsychological abilities related to executive functioning and learning, was administered to parents of 118 children and adolescents referred for psychological testing at a pediatric psychology clinic; 85 teachers also completed LEAF scales to assess reliability across different raters and settings. Scores on neuropsychological tests of executive functioning and academic achievement were abstracted from charts. Psychometric analyses of the LEAF scale demonstrated satisfactory internal consistency, parent-teacher inter-rater reliability in the small to large effect size range, and test-retest reliability in the large effect size range, similar to values for other executive functioning checklists. Correlations between corresponding subscales on the LEAF and other behavior checklists were large, while most correlations with neuropsychological tests of executive functioning and achievement were significant but in the small to medium range. Results support the utility of the LEAF as a reliable and valid questionnaire-based assessment of delays and disturbances in executive functioning and learning. Applications and advantages of the LEAF and other questionnaire measures of executive functioning in clinical neuropsychology settings are discussed.
NASA Astrophysics Data System (ADS)
Zhou, Lifan; Chai, Dengfeng; Xia, Yu; Ma, Peifeng; Lin, Hui
2018-01-01
Phase unwrapping (PU) is one of the key processes in reconstructing the digital elevation model of a scene from its interferometric synthetic aperture radar (InSAR) data. It is known that two-dimensional (2-D) PU problems can be formulated as maximum a posteriori estimation of Markov random fields (MRFs). However, because the traditional MRF algorithm is usually defined on a rectangular grid, it fails easily if large parts of the wrapped data are dominated by noise caused by large low-coherence areas or rapid topography variation. A PU solution based on a sparse MRF is presented to extend the traditional MRF algorithm to deal with sparse data, which allows the unwrapping of InSAR data dominated by high phase noise. To speed up the graph-cuts algorithm for the sparse MRF, we designed dual elementary graphs and merged them to obtain the Delaunay triangle graph, which is used to minimize the energy function efficiently. The experiments on simulated and real data, compared with other existing algorithms, both confirm the effectiveness of the proposed MRF approach, which suffers less from decorrelation effects caused by large low-coherence areas or rapid topography variation.
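A minimal sketch of how a sparse MRF neighborhood of the kind described can be built: keep only high-coherence pixels and connect them by a Delaunay triangulation; SciPy is assumed, the coherence map is random, and the graph-cuts energy minimization itself is not shown.

```python
# Sketch: build the sparse-MRF graph (nodes = reliable pixels, edges = Delaunay neighbors).
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
coherence = rng.random((128, 128))
rows, cols = np.nonzero(coherence > 0.7)          # sparse set of reliable pixels
points = np.column_stack([rows, cols]).astype(float)

tri = Delaunay(points)
# unique edges of the triangulation = pairwise MRF cliques
edges = set()
for simplex in tri.simplices:
    for i in range(3):
        a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
        edges.add((a, b))
print(f"{len(points)} nodes, {len(edges)} edges in the sparse MRF graph")
```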
NASA Technical Reports Server (NTRS)
Davis, Don; Bennett, Toby; Short, Nicholas M., Jr.
1994-01-01
The Earth Observing System (EOS), part of a cohesive national effort to study global change, will deploy a constellation of remote sensing spacecraft over a 15 year period. Science data from the EOS spacecraft will be processed and made available to a large community of earth scientists via NASA institutional facilities. A number of these spacecraft are also providing an additional interface to broadcast data directly to users. Direct broadcast of real-time science data from overhead spacecraft has valuable applications including validation of field measurements, planning science campaigns, and science and engineering education. The success and usefulness of EOS direct broadcast depends largely on the end-user cost of receiving the data. To extend this capability to the largest possible user base, the cost of receiving ground stations must be as low as possible. To achieve this goal, NASA Goddard Space Flight Center is developing a prototype low-cost transportable ground station for EOS direct broadcast data based on Very Large Scale Integration (VLSI) components and pipelined, multiprocessing architectures. The targeted reproduction cost of this system is less than $200K. This paper describes a prototype ground station and its constituent components.
Towards Biological Restoration of Tehran Megalopolis River Valleys- Case Study: Farahzad River
NASA Astrophysics Data System (ADS)
Samadi, Nafishe; Oveis Torabi, Seyed; Akhani, Hossein
2017-04-01
Tehran is located in the north-central part of Iran on the alluvium of the southern Alborz Mountains. Seven rivers originating from the highlands of northern Tehran run inside and around the city. Many of these river valleys have been deformed by a variety of urban uses such as gardens, buildings, canals, parks, and autobahns. Tehran, with a population of more than eight million, suffers from adverse environmental conditions such as pollution and a scarcity of natural habitats for recreational activities. Ecological restoration of the altered river valleys of Tehran is one of the priorities of the Tehran municipality and was started as a pilot project on the Farahzad river. Intensive disturbance, conversion to various urban uses, illegal building construction, waste water released into the river, garbage accumulation, artificial park construction, and the dominance of invasive species have largely altered the river. The part of the river located in Pardisan Nature Park was studied before its complete deformation into a modern park. The riparian vegetation consisted of Tamarix ramosissima and Salix acmophylla shrubs with a large number of aquatic and palustrine plants. The northern parts of the river still contain semi-natural vegetation, which changes into patchy and intensively degraded habitats towards the southern parts. In the northern parts of the valley there are old gardens of Morus alba and Juglans regia, and planted trees such as Platanus orientalis and Acer negundo. Salix acmophylla, Fraxinus excelsior and Celtis caucasica are native species growing on the river margin or surrounding steep slopes. The rare local endemic Convolvulus gracillimus still occurs on the surrounding dry slopes. Ailanthus altissima is an invasive introduced tree that has largely occupied disturbed habitats and slopes of the valley, associated with a large number of ruderals belonging to the genera Amaranthus, Bassia, Chenopodium, Echinochloa, Heliotropium, Tribulus, etc. The restoration plan includes 1. study of the past biological and geomorphological conditions of the area based on remnants of vegetation and aerial and satellite imagery; 2. survey of the present environmental conditions of the area, including identification of native and introduced plants and animals, assessment of the degree of originality of the existing vegetation and cultural landscapes, and abiotic factors; 3. soil reclamation and topography improvements towards cultivation and/or formation of natural vegetation.
NASA Astrophysics Data System (ADS)
Hikiji, R.
2018-01-01
The trend toward engine downsizing is increasing the number of turbochargers in Europe. Because the exhaust-gas temperature in a turbocharger is very high, parts made of the nickel-base superalloy Inconel 713C are used for their high-temperature strength. External turning of Inconel 713C as used in actual automotive parts was carried out. The effect of the cutting fluids and cutting conditions on the surface integrity and tool wear was investigated, considering the global environment and cost performance. Within the range of cutting conditions used, good surface integrity and tool life were obtained when the depth of cut was small. However, with a large corner radius, tool wear increased as the cutting length increased; at large cutting lengths, the surface integrity and tool life deteriorated. As for the cutting fluids, the synthetic type showed better performance in surface integrity and tool life than the conventional emulsion. A large corner radius improved the surface roughness and tool life, but it affected the dimensional error, etc., when machining a workpiece held in a cantilever style.
NASA Astrophysics Data System (ADS)
Zhu, Hong; Huang, Mai; Sadagopan, Sriram; Yao, Hong
2017-09-01
With increasing vehicle fuel economy standards, automotive OEMs are widely using various AHSS grades, including DP, TRIP, CP and 3rd Gen AHSS, to reduce vehicle weight because of their good combination of strength and formability. As an enabling technology for AHSS application, the requirement for accurate prediction of springback in cold-stamped AHSS parts has stimulated a large number of investigations in the past decade into reversed loading paths at large strains and the associated constitutive modeling. With a spectrum of complex loading histories occurring in production stamping processes, there are many challenges in this field, including issues of test data reliability, loading path representability, constitutive model robustness and non-unique constitutive parameter identification. In this paper, various testing approaches and constitutive models are reviewed briefly, and a systematic methodology spanning stress-strain characterization and constitutive-model parameter identification for material card generation is presented in order to support automotive OEMs' needs in virtual stamping. This systematic methodology features a tension-compression test at large strain with a robust anti-buckling device and concurrent friction force correction, properly selected loading paths to represent material behavior during different springback modes, as well as the 10-parameter Yoshida model with knowledge-based parameter identification through nonlinear optimization. Validation cases for lab AHSS parts are also discussed to check the applicability of this methodology.
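A minimal sketch of the kind of nonlinear-least-squares parameter identification described, with a simple three-parameter Voce hardening law standing in for the 10-parameter Yoshida model; the stress-strain data are synthetic and all parameter values are illustrative.

```python
# Sketch: identify hardening-law parameters from (synthetic) stress-strain data.
import numpy as np
from scipy.optimize import least_squares

def voce(params, strain):
    sigma0, Q, b = params
    return sigma0 + Q * (1.0 - np.exp(-b * strain))

strain = np.linspace(0.0, 0.2, 50)
measured = voce([350.0, 250.0, 15.0], strain) + np.random.default_rng(0).normal(0, 3, 50)

def residuals(params):
    return voce(params, strain) - measured

fit = least_squares(residuals, x0=[300.0, 200.0, 10.0],
                    bounds=([0, 0, 0], [1000, 1000, 100]))
print("identified [sigma0, Q, b]:", np.round(fit.x, 2))
```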
DGEBF epoxy blends for use in the resin impregnation of extremely large composite parts
Madhukar, M. S.; Martovetsky, N. N.
2015-01-16
Large superconducting electromagnets used in fusion reactors utilize a large amount of glass/epoxy composite for electrical insulation and for mechanical and thermal strength. The manufacture of these magnets involves wrapping each superconducting cable bundle with dry glass cloth followed by vacuum-assisted resin transfer molding of the entire magnet. Because of their enormous size (more than 100 tons), more than 40 h are required for resin impregnation and the subsequent pressure cycles that ensure complete impregnation and removal of any trapped air pockets. Diglycidyl ether of bisphenol F epoxy resin cross-linked with methyltetrahydrophthalic anhydride and an accelerator has been shown to be a good candidate for use in composite parts requiring long impregnation cycles. The viscosity, gel time, and glass transition temperature of four resin blends of the diglycidyl ether of bisphenol F resin system were monitored as a function of time and temperature, with the objective of finding the blend that provides a working window longer than 40 h at low viscosity without lowering its glass transition temperature. Based on the results, a resin blend in the weight ratio resin:hardener:accelerator = 100:82:0.125 is shown to provide more than 60 h at low resin viscosity while maintaining the same glass transition temperature as previously used resin blends.
A zonally symmetric model for the monsoon-Hadley circulation with stochastic convective forcing
NASA Astrophysics Data System (ADS)
De La Chevrotière, Michèle; Khouider, Boualem
2017-02-01
Idealized models of reduced complexity are important tools to understand key processes underlying a complex system. In climate science in particular, they are important for helping the community improve our ability to predict the effect of climate change on the earth system. Climate models are large computer codes based on the discretization of the fluid dynamics equations on grids of horizontal resolution in the order of 100 km, whereas unresolved processes are handled by subgrid models. For instance, simple models are routinely used to help understand the interactions between small-scale processes due to atmospheric moist convection and large-scale circulation patterns. Here, a zonally symmetric model for the monsoon circulation is presented and solved numerically. The model is based on the Galerkin projection of the primitive equations of atmospheric synoptic dynamics onto the first modes of vertical structure to represent free tropospheric circulation and is coupled to a bulk atmospheric boundary layer (ABL) model. The model carries bulk equations for water vapor in both the free troposphere and the ABL, while the processes of convection and precipitation are represented through a stochastic model for clouds. The model equations are coupled through advective nonlinearities, and the resulting system is not conservative and not necessarily hyperbolic. This makes the design of a numerical method for the solution of this system particularly difficult. Here, we develop a numerical scheme based on the operator time-splitting strategy, which decomposes the system into three pieces: a conservative part and two purely advective parts, each of which is solved iteratively using an appropriate method. The conservative system is solved via a central scheme, which does not require hyperbolicity since it avoids the Riemann problem by design. One of the advective parts is a hyperbolic diagonal matrix, which is easily handled by classical methods for hyperbolic equations, while the other advective part is a nilpotent matrix, which is solved via the method of lines. Validation tests using a synthetic exact solution are presented, and formal second-order convergence under grid refinement is demonstrated. Moreover, the model is tested under realistic monsoon conditions, and the ability of the model to simulate key features of the monsoon circulation is illustrated in two distinct parameter regimes.
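A minimal sketch of the operator time-splitting strategy described above, illustrated on a toy 1-D advection-relaxation problem rather than the monsoon model equations; the split substeps (an upwind update for the hyperbolic piece, an exact update for the local relaxation) and all parameter values are illustrative.

```python
# Sketch: Strang operator splitting on a toy advection-relaxation problem.
import numpy as np

nx, L, c, tau = 200, 1.0, 1.0, 0.5
dx = L / nx
dt = 0.4 * dx / c
x = np.linspace(0.0, L, nx, endpoint=False)
q = np.exp(-200.0 * (x - 0.3) ** 2)          # initial condition

def advect(q, dt):
    # upwind step for the hyperbolic piece q_t + c q_x = 0 (periodic boundary)
    return q - c * dt / dx * (q - np.roll(q, 1))

def relax(q, dt):
    # exact update for the stiff local piece q_t = -q / tau
    return q * np.exp(-dt / tau)

for _ in range(200):                          # Strang splitting: half-relax, advect, half-relax
    q = relax(q, 0.5 * dt)
    q = advect(q, dt)
    q = relax(q, 0.5 * dt)
print("remaining mass after split integration:", q.sum() * dx)
```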
A dynamic regularized gradient model of the subgrid-scale stress tensor for large-eddy simulation
NASA Astrophysics Data System (ADS)
Vollant, A.; Balarac, G.; Corre, C.
2016-02-01
Large-eddy simulation (LES) solves only the large scales part of turbulent flows by using a scales separation based on a filtering operation. The solution of the filtered Navier-Stokes equations requires then to model the subgrid-scale (SGS) stress tensor to take into account the effect of scales smaller than the filter size. In this work, a new model is proposed for the SGS stress model. The model formulation is based on a regularization procedure of the gradient model to correct its unstable behavior. The model is developed based on a priori tests to improve the accuracy of the modeling for both structural and functional performances, i.e., the model ability to locally approximate the SGS unknown term and to reproduce enough global SGS dissipation, respectively. LES is then performed for a posteriori validation. This work is an extension to the SGS stress tensor of the regularization procedure proposed by Balarac et al. ["A dynamic regularized gradient model of the subgrid-scale scalar flux for large eddy simulations," Phys. Fluids 25(7), 075107 (2013)] to model the SGS scalar flux. A set of dynamic regularized gradient (DRG) models is thus made available for both the momentum and the scalar equations. The second objective of this work is to compare this new set of DRG models with direct numerical simulations (DNS), filtered DNS in the case of classic flows simulated with a pseudo-spectral solver and with the standard set of models based on the dynamic Smagorinsky model. Various flow configurations are considered: decaying homogeneous isotropic turbulence, turbulent plane jet, and turbulent channel flows. These tests demonstrate the stable behavior provided by the regularization procedure, along with substantial improvement for velocity and scalar statistics predictions.
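A minimal sketch of the (unregularized) gradient model that the paper's regularization procedure starts from, tau_ij ≈ (Δ²/12) ∂u_i/∂x_k ∂u_j/∂x_k, evaluated here on a synthetic filtered velocity field; the dynamic regularization itself is not reproduced.

```python
# Sketch: gradient model for the SGS stress tensor on a periodic synthetic velocity field.
import numpy as np

n, delta = 32, 2.0 * np.pi / 32
grid = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
X, Y, Z = np.meshgrid(grid, grid, grid, indexing="ij")
u = np.stack([np.sin(X) * np.cos(Y), -np.cos(X) * np.sin(Y), np.zeros_like(X)])  # divergence-free field

# velocity gradients: grad[i, k] = d u_i / d x_k
grad = np.stack([np.stack(np.gradient(u[i], delta), axis=0) for i in range(3)])

tau = np.einsum("ik...,jk...->ij...", grad, grad) * delta ** 2 / 12.0
print("SGS stress tensor shape:", tau.shape)                       # (3, 3, n, n, n)
print("mean modeled SGS kinetic energy:", 0.5 * np.einsum("ii...->...", tau).mean())
```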
A bibliographical survey of large-scale systems
NASA Technical Reports Server (NTRS)
Corliss, W. R.
1970-01-01
A limited, partly annotated bibliography was prepared on the subject of large-scale system control. Approximately 400 references are divided into thirteen application areas, such as large societal systems and large communication systems. A first-author index is provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutton, Joe; Morgan, Huw, E-mail: joh9@aber.ac.uk
2015-11-01
The 3-part appearance of many coronal mass ejections (CMEs) arising from erupting filaments emerges from a large magnetic flux tube structure, consistent with the form of the erupting filament system. Other CMEs arising from erupting filaments lack a clear 3-part structure, and the reasons for this have not been researched in detail. This paper aims to further establish the link between CME structure and the structure of the erupting filament system and to investigate whether CMEs that lack a 3-part structure have different eruption characteristics. A survey is made of 221 near-limb filament eruptions observed from 2013 May 03 to 2014 June 30 by Extreme UltraViolet (EUV) imagers and coronagraphs. Ninety-two filament eruptions are associated with 3-part structured CMEs, and 41 eruptions are associated with unstructured CMEs. The remaining 88 are categorized as failed eruptions. For 34% of the 3-part CMEs, processing applied to EUV images reveals that the erupting front edge is a pre-existing loop structure surrounding the filament, which subsequently erupts with the filament to form the leading bright front edge of the CME. This connection is confirmed by a flux-rope density model. Furthermore, the unstructured CMEs have a narrower distribution of mass compared to structured CMEs, with total mass comparable to the mass of 3-part CME cores. This study supports the interpretation of 3-part CME leading fronts as the outer boundaries of a large pre-existing flux tube. Unstructured (non-3-part) CMEs are a different family from structured CMEs, arising from the eruption of filaments that are compact flux tubes in the absence of a large system of enclosing closed field.
Large scale track analysis for wide area motion imagery surveillance
NASA Astrophysics Data System (ADS)
van Leeuwen, C. J.; van Huis, J. R.; Baan, J.
2016-10-01
Wide Area Motion Imagery (WAMI) enables image-based surveillance of areas that can cover multiple square kilometers. Interpreting and analyzing information from such sources becomes increasingly time consuming as more data are added from newly developed methods for information extraction. Captured from a moving Unmanned Aerial Vehicle (UAV), the high-resolution images allow detection and tracking of moving vehicles, but this is a highly challenging task. By using a chain of computer vision detectors and machine learning techniques, we are capable of producing high-quality track information for more than 40 thousand vehicles per five minutes. When faced with such a vast number of vehicular tracks, it is useful for analysts to be able to quickly query information based on region of interest, color, maneuvers or other high-level types of information, to gain insight and find relevant activities in the flood of information. In this paper we propose a set of tools, combined in a graphical user interface, which allows data analysts to survey vehicles in a large observed area. In order to retrieve (parts of) images from the high-resolution data, we developed a multi-scale tile-based video file format that allows quick retrieval of only a part, or a sub-sampling, of the original high-resolution image. By storing the tiles of a still image according to a predefined order, we can quickly retrieve a particular region of the image at any relevant scale by skipping to the correct frames and reconstructing the image. Location-based queries allow a user to select tracks around a particular region of interest such as a landmark, building or street. By using an integrated search engine, users can quickly select tracks that are in the vicinity of locations of interest. Another time-reducing method when searching for a particular vehicle is to filter on color or color intensity. Automatic maneuver detection adds information to the tracks that can be used to find vehicles based on their behavior.
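A minimal sketch of how a multi-scale tiled layout can be indexed for region-of-interest retrieval; the tile size, the halving-per-level pyramid scheme, and the function below are assumptions for illustration and not the authors' actual file format.

```python
# Sketch: compute which (level, row, col) tiles cover a pixel region of interest.
def tiles_for_region(x0, y0, x1, y1, level, tile=512):
    """Return the tile keys covering a full-resolution pixel region at a given pyramid level."""
    scale = 2 ** level                                  # level 0 = full resolution
    cols = range(int(x0 / scale) // tile, int(x1 / scale) // tile + 1)
    rows = range(int(y0 / scale) // tile, int(y1 / scale) // tile + 1)
    return [(level, r, c) for r in rows for c in cols]

# fetch a 2000 x 1500-pixel region of interest at quarter resolution (level 2)
print(tiles_for_region(10_000, 8_000, 12_000, 9_500, level=2))
```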
NASA Astrophysics Data System (ADS)
Doke, R.; Harada, M.; Miyaoka, K.; Satomura, M.
2016-12-01
The Izu collision zone, which is characterized by the collision between the Izu-Bonin arc (Izu Peninsula) and the Honshu arc (the main island of Japan), is located in the northernmost part of the Philippine Sea (PHS) plate. Particularly in the northeastern margin of the zone, numerous large earthquakes have occurred. To clarify the convergent tectonics of the zone related to the occurrence of these earthquakes, in this study, we performed Global Positioning System (GPS) observations and analysis around the Izu collision zone. Based on the results of mapping the steady state of the GPS velocity and strain rate fields, we verified that there has been wide shear deformation in the northeastern part of the Izu collision zone, which agrees with the maximum shear directions in the left-lateral slip of the active faults in the study area. Based on the relative motion between the western Izu Peninsula and the eastern subducting forearc, the shear zone can be considered as a transition zone affected by both collision and subduction. The Higashi-Izu Monogenic Volcano Group, which is located in the southern part of the shear deformation zone, may have formed as a result of the steady motion of the subducting PHS plate and the collision of the Izu Peninsula with the Honshu arc. The seismic activities in the Tanzawa Mountains, which is located in the northern part of the shear deformation zone, and the eastern part of the Izu Peninsula may be related to the shear deformation zone, because the temporal patterns of the seismic activity in both areas are correlated.
Adjoint-based Sensitivity of Jet Noise to Near-nozzle Forcing
NASA Astrophysics Data System (ADS)
Chung, Seung Whan; Vishnampet, Ramanathan; Bodony, Daniel; Freund, Jonathan
2017-11-01
Past efforts have used optimal control theory, based on the numerical solution of the adjoint flow equations, to perturb turbulent jets in order to reduce their radiated sound. These efforts have been successful in that sound is reduced, with concomitant changes to the large-scale turbulence structures in the flow. However, they have also been inconclusive, in that the ultimate level of reduction seemed to depend upon the accuracy of the adjoint-based gradient rather than a physical limitation of the flow. The chaotic dynamics of the turbulence can degrade the smoothness of the cost functional in the control-parameter space, which is necessary for gradient-based optimization. We introduce a route to overcoming this challenge, in part by leveraging the regularity and accuracy of a dual-consistent, discrete-exact adjoint formulation. We confirm its properties and use it to study the sensitivity and controllability of the acoustic radiation from a simulation of a M = 1.3 turbulent jet whose statistics match data. The smoothness of the cost functional over time is quantified by a minimum optimization step size below which the gradient cannot attain a given accuracy. Based on this, we achieve a moderate level of sound reduction in the first few optimization steps. This material is based [in part] upon work supported by the Department of Energy, National Nuclear Security Administration, under Award Number DE-NA0002374.
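A minimal sketch of the kind of step-size test implied above: comparing a computed gradient against finite differences of a cost functional that carries a small noise term (mimicking chaotic sensitivity) over a range of step sizes; the toy quadratic functional and all values are illustrative and are not the jet-noise adjoint formulation.

```python
# Sketch: relative gradient error vs. finite-difference step size for a noisy cost functional.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=20)

def cost(v, noise=1e-6):
    return 0.5 * np.dot(v, v) + noise * rng.normal()    # noise mimics chaotic sensitivity

grad = x.copy()                                          # exact gradient of the smooth part
direction = rng.normal(size=20)
direction /= np.linalg.norm(direction)

for h in [1e-1, 1e-2, 1e-3, 1e-4, 1e-5]:
    fd = (cost(x + h * direction) - cost(x - h * direction)) / (2.0 * h)
    err = abs(fd - grad @ direction) / abs(grad @ direction)
    print(f"h = {h:.0e}  relative gradient error = {err:.2e}")
```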
Epoxy blanket protects milled part during explosive forming
NASA Technical Reports Server (NTRS)
1966-01-01
Epoxy blanket protects chemically milled or machined sections of large, complex structural parts during explosive forming. The blanket uniformly covers all exposed surfaces and fills any voids to support and protect the entire part.