Bioremediation of oil-contaminated beaches typically involves fertilization with nutrients that are thought to limit the growth rate of hydrocarbon-degrading bacteria. Much of the available technology involves application of fertilizers that release nutrients in a water-soluble ...
ERIC Educational Resources Information Center
Hopwood, Christopher J.
2007-01-01
Second-generation early intervention research typically involves the specification of multivariate relations between interventions, outcomes, and other variables. Moderation and mediation involve variables or sets of variables that influence relations between interventions and outcomes. Following the framework of Baron and Kenny's (1986) seminal…
Automated watershed subdivision for simulations using multi-objective optimization
USDA-ARS's Scientific Manuscript database
The development of watershed management plans to evaluate placement of conservation practices typically involves application of watershed models. Incorporating spatially variable watershed characteristics into a model often requires subdividing the watershed into small areas to accurately account f...
Overview 1993: Computational applications
NASA Technical Reports Server (NTRS)
Benek, John A.
1993-01-01
Computational applications include projects that apply or develop computationally intensive computer programs. Such programs typically require supercomputers to obtain solutions in a timely fashion. This report describes two CSTAR projects involving Computational Fluid Dynamics (CFD) technology. The first, the Parallel Processing Initiative, is a joint development effort and the second, the Chimera Technology Development, is a transfer of government developed technology to American industry.
Universal Design: Process, Principles, and Applications
ERIC Educational Resources Information Center
Burgstahler, Sheryl
2009-01-01
Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design, "is…
An integrated power/attitude control system /IPACS/ for space vehicle application
NASA Technical Reports Server (NTRS)
Anderson, W. W.; Keckler, C. R.
1973-01-01
An integrated power and attitude control system (IPACS) concept with potential application to a broad class of space missions is discussed. The concept involves the storage and supply on demand of electrical energy in rotating flywheels while simultaneously providing control torques by controlled precession of the flywheels. The system is thus an alternative to the storage batteries used on present spacecraft while providing similar capability for attitude control as that represented by a control moment gyroscope (CMG) system. Potential IPACS configurations discussed include single- and double-rotor double-gimbal IPACS units. Typical sets of control laws which would manage the momentum and energy exchange between the IPACS and a typical space vehicle are discussed. Discussion of a simulation of a typical potential IPACS configuration and candidate mission concerned with pointing capability, power supply and demand flow, and discussion of the interactions between stabilization and control requirements and power flow requirements are presented.
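The energy-and-momentum trade at the heart of the IPACS concept can be sketched numerically: stored kinetic energy E = ½Iω² and angular momentum H = Iω both depend on wheel speed, so supplying electrical power from the rotor necessarily reduces the momentum available for control torques. A minimal illustration with assumed rotor parameters (not taken from the paper):

```python
# Illustrative energy/momentum bookkeeping for a single IPACS flywheel.
# All numbers below are hypothetical, chosen only to show the coupling.

def flywheel_state(inertia, omega):
    """Return (stored kinetic energy in J, angular momentum in N*m*s)."""
    energy = 0.5 * inertia * omega**2
    momentum = inertia * omega
    return energy, momentum

I = 0.5          # rotor moment of inertia, kg*m^2 (assumed)
omega = 4000.0   # spin rate, rad/s (assumed)

e0, h0 = flywheel_state(I, omega)

# Drawing electrical energy from the wheel spins it down, which also
# shrinks the angular momentum available for attitude-control torques.
e_drawn = 100e3                      # J delivered to the spacecraft bus
e1 = e0 - e_drawn
omega1 = (2 * e1 / I) ** 0.5         # new spin rate after discharge
_, h1 = flywheel_state(I, omega1)
```

This is the coupling the paper's control laws must manage: energy demand and momentum management cannot be commanded independently on the same rotor.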
Universal Design in Postsecondary Education: Process, Principles, and Applications
ERIC Educational Resources Information Center
Burgstahler, Sheryl
2009-01-01
Designing any product or environment involves the consideration of many factors, including aesthetics, engineering options, environmental issues, safety concerns, industry standards, and cost. Typically, designers focus their attention on the average user. In contrast, universal design (UD), according to the Center for Universal Design, "is…
Characterization of Developer Application Methods Used in Fluorescent Penetrant Inspection
NASA Astrophysics Data System (ADS)
Brasche, L. J. H.; Lopez, R.; Eisenmann, D.
2006-03-01
Fluorescent penetrant inspection (FPI) is the most widely used inspection method for aviation components, seeing use in production as well as in-service inspection applications. FPI is a multiple-step process requiring attention to the process parameters of each step in order to enable a successful inspection. A multiyear program is underway to evaluate the most important factors affecting the performance of FPI, to determine whether existing industry specifications adequately address control of the process parameters, and to provide the needed engineering data to the public domain. The final step prior to the inspection is the application of developer, with typical aviation inspections involving the use of dry powder (form d), usually applied using either a pressure wand or a dust storm chamber. Results from several typical dust storm chambers and wand applications have shown less than optimal performance. Measurements of indication brightness, recordings of the UV-A image, and in some cases formal probability of detection (POD) studies were used to assess the developer application methods. Key conclusions and initial recommendations are provided.
USDA-ARS's Scientific Manuscript database
Initially discovered in Georgia in 2009, the exotic invasive plataspid Megacopta cribraria Fabricius has become a serious pest of soybean. Managing M. cribraria in soybean typically involves the application of broad-spectrum insecticides. Soybean host plant resistance is an attractive alternative...
Issues Involved in Developing Ada Real-Time Systems
1989-02-15
expensive modifications to the compiler or Ada runtime system to fit a particular application. Whether we can solve the problems of programming real-time systems in...lock in solutions to problems that are not yet well understood in standards as rigorous as the Ada language. Moreover, real-time systems typically have
Organizational Decision Making
1975-08-01
the lack of formal techniques typically used by large organizations, digress on the advantages of formal over informal... optimization; for example one might do a number of optimization calculations, each time using a different measure of effectiveness as the optimized... final decision. The next level of computer application involves the use of computerized optimization techniques. Optimization
ERIC Educational Resources Information Center
Scherger, Nicole
2012-01-01
Among the most universal applications in integral calculus are those involving the volumes of solids of revolution. These profound problems are typically taught with the traditional approaches of the disk and shell methods, after which most calculus curricula will additionally cover arc length and surfaces of revolution. Even in these visibly…
USDA-ARS's Scientific Manuscript database
Among the most important and visible weeds in the Southeastern U.S. is the exotic invasive vine, kudzu (Pueraria montana var. lobata). Efforts to eradicate it typically involve many years of application of restricted-use pesticides. Recent availability of effective, non-restricted-use pesticides and...
COLLECTION OF A SINGLE ALVEOLAR EXHALED BREATH FOR VOLATILE ORGANIC COMPOUNDS ANALYSIS
Measurement of specific organic compounds in exhaled breath has been used as an indicator of recent exposure to pollutants or as an indicator of the health of an individual. A typical application involves the collection of multiple breaths onto a sorbent cartridge or into an evacua...
A Cognitive Diagnosis Model for Cognitively Based Multiple-Choice Options
ERIC Educational Resources Information Center
de la Torre, Jimmy
2009-01-01
Cognitive or skills diagnosis models are discrete latent variable models developed specifically for the purpose of identifying the presence or absence of multiple fine-grained skills. However, applications of these models typically involve dichotomous or dichotomized data, including data from multiple-choice (MC) assessments that are scored as…
Stand development and silviculture in bottomland hardwoods
J. Steven Meadows
1993-01-01
Silviculture for the production of high-quality timber in southern bottomland hardwood forests involves the application of environmentally sound practices in order to enhance the growth and quality of both individual trees and stands. To accomplish this purpose, silvicultural practices are typically used to regulate stand density, species composition, and stem quality...
The history of ceramic filters.
Fujishima, S
2000-01-01
The history of ceramic filters is surveyed. Included is the history of piezoelectric ceramics. Ceramic filters were developed using technology similar to that of quartz crystal and electro-mechanical filters. However, the key to this development involved the theoretical analysis of vibration modes and material improvements of piezoelectric ceramics. The primary application of ceramic filters has been for consumer-market use. Accordingly, a major emphasis has involved mass production technology, leading to low-priced devices. A typical ceramic filter includes monolithic resonators and capacitors packaged in unique configurations.
Continuum-Kinetic Models and Numerical Methods for Multiphase Applications
NASA Astrophysics Data System (ADS)
Nault, Isaac Michael
This thesis presents a continuum-kinetic approach for modeling general problems in multiphase solid mechanics. In this context, a continuum model refers to any model, typically on the macro-scale, in which continuous state variables are used to capture the most important physics: conservation of mass, momentum, and energy. A kinetic model refers to any model, typically on the meso-scale, which captures the statistical motion and evolution of microscopic entities. Multiphase phenomena usually involve non-negligible micro- or meso-scopic effects at the interfaces between phases. The approach developed in the thesis attempts to combine the computational performance benefits of a continuum model with the physical accuracy of a kinetic model when applied to a multiphase problem. The approach is applied to modeling a single particle impact in Cold Spray, an engineering process that intimately involves the interaction of crystal grains with high-magnitude elastic waves. Such a situation could be classified as a multiphase application due to the discrete nature of grains on the spatial scale of the problem. For this application, a hyper-elasto-plastic model is solved by a finite volume method with an approximate Riemann solver. The results of this model are compared for two types of plastic closure: a phenomenological macro-scale constitutive law, and a physics-based meso-scale Crystal Plasticity model.
ERIC Educational Resources Information Center
Satake, Eiki; Vashlishan Murray, Amy
2015-01-01
This paper presents a comparison of three approaches to the teaching of probability to demonstrate how the truth table of elementary mathematical logic can be used to teach the calculations of conditional probabilities. Students are typically introduced to the topic of conditional probabilities--especially the ones that involve Bayes' rule--with…
Spent refractory reuse as a slag conditioning additive in the EAF
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, James P.; Kwong, Kyei-Sing; Krabbe, Rick
2000-01-01
Refractories removed from service in EAF applications are typically landfilled. A joint USDOE and Steel Manufacturers Association program involving industrial cooperators is evaluating spent refractory recycling/reuse. A review of current recycling practices and a review of progress towards controlling EAF slag chemistry and properties with the additions of basic spent refractories will be discussed.
Polymethylsilsesquioxanes through base-catalyzed redistribution of oligomethylhydridosiloxanes
DOE Office of Scientific and Technical Information (OSTI.GOV)
RAHIMIAN,KAMYAR; ASSINK,ROGER A.; LOY,DOUGLAS A.
2000-04-04
There has been an increasing amount of interest in silsesquioxanes and polysilsesquioxanes. They have been used as models for silica surfaces and have been shown to have great potential for several industrial applications. Typical synthesis of polysilsesquioxanes involves the hydrolysis of organotrichlorosilanes and/or organotrialkoxysilanes in the presence of acid or base catalysts, usually in organic solvents.
Visualization of multi-INT fusion data using Java Viewer (JVIEW)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Aved, Alex; Nagy, James; Scott, Stephen
2014-05-01
Visualization is important for multi-intelligence fusion, and we demonstrate issues in presenting physics-derived (i.e., hard) and human-derived (i.e., soft) fusion results. Physics-derived solutions (e.g., imagery) typically involve sensor measurements that are objective, while human-derived solutions (e.g., text) typically involve language processing. Both results can be geographically displayed for user-machine fusion. Attributes of an effective and efficient display are not well understood, so we demonstrate issues and results for filtering, correlation, and association of data for users, be they operators or analysts. Operators require near-real-time solutions, while analysts have the opportunity of non-real-time solutions for forensic analysis. In a use case, we demonstrate examples using the JVIEW concept that has been applied to piloting, space situation awareness, and cyber analysis. Using the open-source JVIEW software, we showcase a big data solution for a multi-intelligence fusion application for context-enhanced information fusion.
Peckner, Ryan; Myers, Samuel A; Jacome, Alvaro Sebastian Vaca; Egertson, Jarrett D; Abelin, Jennifer G; MacCoss, Michael J; Carr, Steven A; Jaffe, Jacob D
2018-05-01
Mass spectrometry with data-independent acquisition (DIA) is a promising method to improve the comprehensiveness and reproducibility of targeted and discovery proteomics, in theory by systematically measuring all peptide precursors in a biological sample. However, the analytical challenges involved in discriminating between peptides with similar sequences in convoluted spectra have limited its applicability in important cases, such as the detection of single-nucleotide polymorphisms (SNPs) and alternative site localizations in phosphoproteomics data. We report Specter (https://github.com/rpeckner-broad/Specter), an open-source software tool that uses linear algebra to deconvolute DIA mixture spectra directly through comparison to a spectral library, thus circumventing the problems associated with typical fragment-correlation-based approaches. We validate the sensitivity of Specter and its performance relative to that of other methods, and show that Specter is able to successfully analyze cases involving highly similar peptides that are typically challenging for DIA analysis methods.
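The linear-algebra idea behind this kind of library-based deconvolution can be sketched in a few lines: treat each acquired mixture spectrum as an (approximately) linear combination of library spectra and solve for the combination coefficients. The toy below uses synthetic data and plain least squares; it is not Specter's actual code, which additionally handles nonnegativity, noise, and scale:

```python
import numpy as np

# Toy library-based deconvolution of a DIA-style mixture spectrum.
# Columns of L are reference fragment spectra from a spectral library;
# the mixture m is modeled as m ≈ L @ c, and we solve for the peptide
# coefficients c.
rng = np.random.default_rng(0)
L = rng.random((50, 3))             # 50 fragment-ion bins, 3 library peptides
true_c = np.array([2.0, 0.0, 1.5])  # peptides 0 and 2 present, 1 absent
m = L @ true_c                      # noise-free synthetic mixture spectrum

# Least-squares recovery of the coefficients from the mixture alone.
c, *_ = np.linalg.lstsq(L, m, rcond=None)
```

Because the recovery works on the full spectrum at once rather than on individual fragment correlations, highly similar peptides remain distinguishable as long as their library spectra are linearly independent.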
Revision of the Rawls et al. (1982) pedotransfer functions for their applicability to US croplands
USDA-ARS's Scientific Manuscript database
Large scale environmental impact studies typically involve the use of simulation models and require a variety of inputs, some of which may need to be estimated in absence of adequate measured data. As an example, soil water retention needs to be estimated for a large number of soils that are to be u...
Christopher J. Fettig; A. Steven Munson; Donald M. Grosman; Parshall B. Bush
2014-01-01
Protection of conifers from bark beetle colonization typically involves applications of liquid formulations of contact insecticides to the tree bole. An evaluation was made of the efficacy of bole injections of emamectin benzoate alone and combined with the fungicide propiconazole for protecting individual lodgepole pine, Pinus contorta Dougl. ex...
NASA Small Business Innovation Research program
NASA Technical Reports Server (NTRS)
Johnson, Harry W.
1985-01-01
NASA activities in the framework of the 11-agency federal Small Business Innovation Research program are outlined in tables and graphs and briefly characterized. Statistics on the program are given; the technical topics covered are listed; and the procedures involved in evaluating applications for support are discussed. A number of typical defects in proposals are indicated, and recommendations for avoiding them are provided.
Comparison of Low-Thrust Control Laws for Application in Planetocentric Space
NASA Technical Reports Server (NTRS)
Falck, Robert D.; Sjauw, Waldy K.; Smith, David A.
2014-01-01
Recent interest at NASA for the application of solar electric propulsion for the transfer of significant payloads in cislunar space has led to the development of high-fidelity simulations of such missions. With such transfers involving transfer times on the order of months, simulation time can be significant. In the past, the examination of such missions typically began with the use of lower-fidelity trajectory optimization tools such as SEPSPOT to develop and tune guidance laws which delivered optimal or near-optimal trajectories, where optimal is generally defined as minimizing propellant expenditure or time of flight. The transfer of these solutions to a high-fidelity simulation is typically an iterative process whereby the initial solution may nearly, but not precisely, meet mission objectives. Further tuning of the guidance algorithm is typically necessary when accounting for high-fidelity perturbations such as those due to more detailed gravity models, secondary-body effects, solar radiation pressure, etc. While trajectory optimization is a useful method for determining optimal performance metrics, algorithms which deliver nearly optimal performance with minimal tuning are an attractive alternative.
Age-Related Brain Activation Changes during Rule Repetition in Word-Matching.
Methqal, Ikram; Pinsard, Basile; Amiri, Mahnoush; Wilson, Maximiliano A; Monchi, Oury; Provost, Jean-Sebastien; Joanette, Yves
2017-01-01
Objective: The purpose of this study was to explore the age-related brain activation changes during a word-matching semantic-category-based task, which required either repeating or changing a semantic rule to be applied. In order to do so, a word-semantic rule-based task was adapted from the Wisconsin Sorting Card Test, involving the repeated feedback-driven selection of given pairs of words based on semantic category-based criteria. Method: Forty healthy adults (20 younger and 20 older) performed a word-matching task while undergoing an fMRI scan in which they were required to pair a target word with another word from a group of three words. The required pairing is based on three word-pair semantic rules which correspond to different levels of semantic control demands: functional relatedness, moderately typical-relatedness (which were considered as low control demands), and atypical-relatedness (high control demands). The sorting period consisted of a continuous execution of the same sorting rule, and inferred trial-by-trial feedback was given. Results: Behavioral performance revealed increases in response times and decreases in correct responses according to the level of semantic control demands (functional vs. typical vs. atypical) for both age groups (younger and older), reflecting graded differences in the repetition of the application of a given semantic rule. Neuroimaging findings of significant brain activation showed two main results: (1) Greater task-related activation changes for the repetition of the application of atypical rules relative to typical and functional rules, and (2) Changes (older > younger) in the inferior prefrontal regions for functional rules and more extensive and bilateral activations for typical and atypical rules.
Regarding the inter-semantic rules comparison, only task-related activation differences were observed for functional > typical (e.g., inferior parietal and temporal regions bilaterally) and atypical > typical (e.g., prefrontal, inferior parietal, posterior temporal, and subcortical regions). Conclusion: These results suggest that healthy cognitive aging relies on the adaptive changes of inferior prefrontal resources involved in the repetitive execution of semantic rules, thus reflecting graded differences in support of task demands.
The Development Status and Key Technologies of Solar Powered Unmanned Air Vehicle
NASA Astrophysics Data System (ADS)
Sai, Li; Wei, Zhou; Xueren, Wang
2017-03-01
By analyzing the development status of several typical solar-powered unmanned aerial vehicles (UAVs) at home and abroad, the key technologies involved in the design and manufacture of solar-powered UAVs and the technical difficulties that need to be solved at present are identified. It is pointed out that with the improvement of energy-system efficiency, advanced aerodynamic configuration design, realization of highly applicable flight stability and control systems, and breakthroughs in efficient propulsion systems, the application prospects of solar-powered UAVs will be more extensive.
Hon, Carol K H; Liu, Yulin
2016-09-22
The safety of repair, maintenance, minor alteration and addition (RMAA) work is an under-explored area. This study explored the typical and atypical safety climate perceptions of practitioners in the RMAA sector in Hong Kong, based on a self-administered questionnaire survey of 662 local practitioners in the industry. Profile analysis, via multidimensional scaling of the respondents' scores of three safety climate scales, identified one typical perception: high in management commitment to occupational health and safety (OHS) and employee involvement, low in applicability for safety rules and regulations, and low in responsibility for OHS. The respondents were clustered into typical and atypical perception groups according to their safety climate scores' match to the typical perception. A comparison of demographics between the two groups with logistic regression found that work level and direct employer significantly affect their classification. A multivariate analysis of variance of safety performance measures between the two groups indicated that the typical group had a significantly higher level of safety compliance than the atypical group, with no significant difference in safety participation or injury. The significance of this study lies in revealing the typical safety climate perception profile pattern of RMAA works and offering a new perspective of safety climate research.
Visual Modelling of Data Warehousing Flows with UML Profiles
NASA Astrophysics Data System (ADS)
Pardillo, Jesús; Golfarelli, Matteo; Rizzi, Stefano; Trujillo, Juan
Data warehousing involves complex processes that transform source data through several stages to deliver suitable information ready to be analysed. Though many techniques for visual modelling of data warehouses from the static point of view have been devised, only few attempts have been made to model the data flows involved in a data warehousing process. Besides, each attempt was mainly aimed at a specific application, such as ETL, OLAP, what-if analysis, data mining. Data flows are typically very complex in this domain; for this reason, we argue, designers would greatly benefit from a technique for uniformly modelling data warehousing flows for all applications. In this paper, we propose an integrated visual modelling technique for data cubes and data flows. This technique is based on UML profiling; its feasibility is evaluated by means of a prototype implementation.
NASA Technical Reports Server (NTRS)
Young, S. G.
1976-01-01
An ultrasonic cavitation method for restoring obliterated serial numbers has been further explored by application to articles involved in police cases. The method was applied successfully to gun parts. In one case portions of numbers were restored after prior failure by other laboratories using chemical etching techniques. The ultrasonic method was not successful on a heavily obliterated and restamped automobile engine block, but it was partially successful on a motorcycle gear-case housing. Additional studies were made on the effect of a larger diameter ultrasonic probe, and on the method's ability to restore numbers obliterated by peening.
Schiecke, Karin; Pester, Britta; Feucht, Martha; Leistritz, Lutz; Witte, Herbert
2015-01-01
In neuroscience, data are typically generated from neural network activity. Complex interactions between the measured time series are involved, and nothing or only little is known about the underlying dynamic system. Convergent Cross Mapping (CCM) provides the possibility to investigate nonlinear causal interactions between time series by using nonlinear state-space reconstruction. The aim of this study is to investigate the general applicability of CCM and to show its potentials and limitations. The influence of estimation parameters is demonstrated by means of simulated data, whereas an interval-based application of CCM on real data is adapted for the investigation of interactions between heart rate and specific EEG components of children with temporal lobe epilepsy.
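The CCM machinery the study builds on can be sketched in miniature: delay-embed one series into its shadow manifold, then use nearest neighbors there to cross-estimate the other series; high estimation skill indicates the second series causally influences the first. This is a bare-bones sketch following Sugihara et al.'s formulation, using coupled logistic maps rather than EEG or heart-rate data; the embedding and coupling parameters are illustrative:

```python
import numpy as np

def delay_embed(x, E, tau):
    """Build the delay-coordinate (shadow) manifold of a series."""
    n = len(x) - (E - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(E)])

def cross_map(x, y, E=2, tau=1):
    """Estimate y from x's shadow manifold; return the correlation skill."""
    Mx = delay_embed(x, E, tau)
    y_aligned = y[(E - 1) * tau :]
    preds = np.empty(len(Mx))
    for t in range(len(Mx)):
        d = np.linalg.norm(Mx - Mx[t], axis=1)
        d[t] = np.inf                      # exclude the point itself
        nbrs = np.argsort(d)[: E + 1]      # E+1 nearest neighbors
        w = np.exp(-d[nbrs] / max(d[nbrs].min(), 1e-12))
        preds[t] = np.dot(w / w.sum(), y_aligned[nbrs])
    return np.corrcoef(preds, y_aligned)[0, 1]

# Coupled logistic maps: x drives y. Because y's history then encodes x,
# cross-mapping from y's manifold recovers x with nontrivial skill.
n = 500
x = np.empty(n); y = np.empty(n)
x[0], y[0] = 0.4, 0.2
for t in range(n - 1):
    x[t + 1] = x[t] * (3.8 - 3.8 * x[t])
    y[t + 1] = y[t] * (3.5 - 3.5 * y[t] - 0.2 * x[t])

skill_y_drives = cross_map(y, x)   # reconstruct the driver x from y
```

The study's point about estimation parameters shows up directly here: skill depends on the embedding dimension E, the lag tau, and the series length, which is why those choices must be examined before interpreting CCM on real data.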
An Ontology-based Architecture for Integration of Clinical Trials Management Applications
Shankar, Ravi D.; Martins, Susana B.; O’Connor, Martin; Parrish, David B.; Das, Amar K.
2007-01-01
Management of complex clinical trials involves the coordinated use of a myriad of software applications by trial personnel. The applications typically use distinct knowledge representations and generate an enormous amount of information during the course of a trial. It becomes vital that the applications exchange trial semantics in order for efficient management of the trials and subsequent analysis of clinical trial data. Existing model-based frameworks do not address the requirements of semantic integration of heterogeneous applications. We have built an ontology-based architecture to support the interoperation of clinical trial software applications. Central to our approach is a suite of clinical trial ontologies, which we call Epoch, that define the vocabulary and semantics necessary to represent information on clinical trials. We are continuing to demonstrate and validate our approach with different clinical trials management applications and with a growing number of clinical trials. PMID:18693919
Papadakis, Georgios Z; Jha, Smita; Bhattacharyya, Timothy; Millo, Corina; Tu, Tsang-Wei; Bagci, Ulas; Marias, Kostas; Karantanas, Apostolos H; Patronas, Nicholas J
2017-07-01
Melorheostosis is a rare, nonhereditary, benign, sclerotic bone dysplasia with no sex predilection, typically occurring in late childhood or early adulthood, which can lead to substantial functional morbidity, depending on the sites of involvement. We report on a patient with extensive melorheostosis in the axial and appendicular skeleton, as well as in the soft tissues, who was evaluated with a whole-body 18F-NaF PET/CT scan. All melorheostotic lesions of the skeleton and of the ossified soft-tissue masses demonstrated intensely increased 18F-NaF activity, suggesting the application of this modality in assessing and monitoring the disease activity.
Project summary: Application of a trailer-mounted slash bundler for southern logging
S. Meadows; T. Gallagher; D. Mitchell
2010-01-01
The John Deere bundler was originally designed to collect material behind a cut-to-length (CTL) operation, where the biomass feedstock is distributed across the harvested site. While the occurrence of a CTL operation is common in Europe, it is rarely used in the southern United States. Southern logging typically involves a tree-length operation, where the whole tree is...
The Interaction of UV-Laser Radiation with Metal and Semiconductor Surfaces
1992-05-26
order of magnitude larger than the typical widths of non-resonant...fundamental chemistry and practical applications of laser chemical processing techniques involved photofragmentation of relatively simple metal-alkyl...pressure of the gas was monitored with a capacitance manometer. A variety of techniques were used in this work to examine the surface-phase chemistry and
Sun-pumped lasers: revisiting an old problem with nonimaging optics.
Cooke, D
1992-12-20
The techniques of nonimaging optics have permitted the production of a world-record intensity of sunlight, 72 W/mm², by using a sapphire concentrator. Such an intensity exceeds the intensity of light at the surface of the Sun itself (63 W/mm²) by 15% and may have useful applications in pumping lasers, which require high intensities of light to function. The author describes the production of high-intensity sunlight and reports its application in generating over 3 W of laser power from a 72.5-cm-diameter telescope mirror at an efficiency exceeding that typically attained in approaches not involving nonimaging optics.
Leveraging Technology to Reduce Patient Transaction Costs.
Edlow, Richard C
2015-01-01
Medical practices are under significant pressure to provide superior customer service in an environment of declining or flat reimbursement. The solution for many practices involves the integration of a variety of third-party technologies that conveniently interface with one's electronic practice management and medical records systems. Typically, the applications allow the practice to reduce the cost of each patient interaction. Drilling down to quantify the cost of each individual patient interaction helps to determine the practicality of implementation.
Modeling Longitudinal Data Containing Non-Normal Within Subject Errors
NASA Technical Reports Server (NTRS)
Feiveson, Alan; Glenn, Nancy L.
2013-01-01
The mission of the National Aeronautics and Space Administration’s (NASA) human research program is to advance safe human spaceflight. This involves conducting experiments, collecting data, and analyzing data. The data are longitudinal and result from a relatively small number of subjects, typically 10-20. A longitudinal study refers to an investigation where participant outcomes and possibly treatments are collected at multiple follow-up times. Standard statistical designs such as mean regression with random effects and mixed-effects regression are inadequate for such data because the population is typically not approximately normally distributed. Hence, more advanced data analysis methods are necessary. This research focuses on four such methods for longitudinal data analysis: the recently proposed linear quantile mixed models (lqmm) by Geraci and Bottai (2013), quantile regression, multilevel mixed-effects linear regression, and robust regression. This research also provides computational algorithms for longitudinal data that scientists can directly use for human spaceflight and other longitudinal data applications, then presents statistical evidence that verifies which method is best for specific situations. This advances the study of longitudinal data in a broad range of applications, including applications in the sciences, technology, engineering, and mathematics fields.
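What quantile regression optimizes instead of the mean can be shown with the check (pinball) loss: for a constant predictor, the loss is minimized at the q-th sample quantile, which is exactly why these methods remain valid for skewed, non-normal errors. A small numerical check on synthetic skewed data (illustrative only; not the NASA dataset or the lqmm machinery):

```python
import numpy as np

def pinball_loss(y, pred, q):
    """Check (pinball) loss: penalizes under-prediction with weight q
    and over-prediction with weight 1-q."""
    r = y - pred
    return np.mean(np.maximum(q * r, (q - 1) * r))

rng = np.random.default_rng(1)
y = rng.exponential(scale=2.0, size=2000)   # skewed, non-normal sample

# Minimize the q=0.9 pinball loss over a grid of constant predictors;
# the minimizer should land at the 90th percentile of the sample.
grid = np.linspace(y.min(), y.max(), 4001)
losses = [pinball_loss(y, g, 0.9) for g in grid]
best = grid[int(np.argmin(losses))]
```

The same loss, with covariates replacing the constant predictor, underlies both plain quantile regression and the lqmm models compared in the study.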
Potential use of advanced process control for safety purposes during attack of a process plant.
Whiteley, James R
2006-03-17
Many refineries and commodity chemical plants employ advanced process control (APC) systems to improve throughputs and yields. These APC systems utilize empirical process models for control purposes and enable operation closer to constraints than can be achieved with traditional PID regulatory feedback control. Substantial economic benefits are typically realized from the addition of APC systems. This paper considers leveraging the control capabilities of existing APC systems to minimize the potential impact of a terrorist attack on a process plant (e.g., petroleum refinery). Two potential uses of APC are described. The first is a conventional application of APC and involves automatically moving the process to a reduced operating rate when an attack first begins. The second is a non-conventional application and involves reconfiguring the APC system to optimize safety rather than economics. The underlying intent in both cases is to reduce the demands on the operator to allow focus on situation assessment and optimal response planning. An overview of APC is provided along with a brief description of the modifications required for the proposed new applications of the technology.
Joshi, Ashish; de Araujo Novaes, Magdala; Machiavelli, Josiane; Iyengar, Sriram; Vogler, Robert; Johnson, Craig; Zhang, Jiajie; Hsu, Chiehwen E
2012-01-01
Public health data are typically organized by geospatial unit. GeoVisualization (GeoVis) allows users to see information visually on a map. This study examines telehealth users' perceptions of existing public health GeoVis applications and obtains users' feedback about features important for the design and development of a human-centered GeoVis application, "the SanaViz". We employed a cross-sectional study design using a mixed-methods approach for this pilot study. Twenty users involved with the NUTES telehealth center at the Federal University of Pernambuco (UFPE), Recife, Brazil, were enrolled. Open- and closed-ended questionnaires were used to gather data, and the interviews were audio recorded. Information gathered included socio-demographics, prior spatial skills, and perceptions of the use of GeoVis to evaluate telehealth services. Card sorting and sketching methods were employed. Univariate analysis was performed for the continuous and categorical variables, and qualitative analysis for the open-ended questions. Existing public health GeoVis applications were difficult to use. The most preferred interaction features were zooming, linking, and brushing; the most preferred representation features were Google Maps, tables, and bar charts. Early involvement of users is essential to identify the features that should be part of the human-centered GeoVis application "the SanaViz".
Minkov, V; Klammer, H; Brix, G
2017-07-01
In Germany, persons who are to be exposed to radiation for medical research purposes are protected by a licensing requirement. However, there are considerable uncertainties on the part of the applicants as to whether licensing by the competent Federal Office for Radiation Protection is necessary, and regarding the choice of application procedure. The article provides explanatory notes and practical assistance for applicants and an outlook on the forthcoming new regulations concerning the law on radiation protection of persons in the field of medical research. Questions and typical mistakes in the application process were identified and evaluated. The qualified physicians involved in a study are responsible for deciding whether a license is required for the intended application of radiation. The decision can be guided by answering the key question whether the study participants would undergo the same exposures regarding type and extent if they had not taken part in the study. When physicians are still unsure about their decision, they can seek the advisory service provided by the professional medical societies. Certain groups of people are particularly protected through the prohibition or restriction of radiation exposure. A simplified licensing procedure is used for a proportion of diagnostic procedures involving radiation when all related requirements are met; otherwise, the regular licensing procedure should be used. The new radiation protection law, which will enter into force on the 31st of December 2018, provides a notification procedure in addition to deadlines for both the notification and the licensing procedures. In the article, the authors consider how eligible studies involving applications of radiation that are legally not admissible at present may be feasible in the future, while still ensuring a high protection level for study participants.
Electrochemical capacitors: mechanism, materials, systems, characterization and applications.
Wang, Yonggang; Song, Yanfang; Xia, Yongyao
2016-10-24
Electrochemical capacitors (i.e. supercapacitors) include electrochemical double-layer capacitors, whose charge storage relies on ion adsorption, and pseudo-capacitors, whose charge storage involves fast surface redox reactions. The energy storage capacities of supercapacitors are several orders of magnitude higher than those of conventional dielectric capacitors, but much lower than those of secondary batteries. Supercapacitors typically offer high power density, long cycling stability, and high safety, and can thus be considered an alternative or complement to rechargeable batteries in applications that require high power delivery or fast energy harvesting. This article reviews the latest progress in supercapacitor charge storage mechanisms, electrode materials, electrolyte materials, systems, characterization methods, and applications. In particular, the newly developed charge storage mechanism of intercalative pseudocapacitive behaviour, which bridges the gap between battery behaviour and conventional pseudocapacitive behaviour, is clarified for comparison. Finally, the prospects and challenges associated with supercapacitors in practical applications are discussed.
TREATING HEMOGLOBINOPATHIES USING GENE CORRECTION APPROACHES: PROMISES AND CHALLENGES
Cottle, Renee N.; Lee, Ciaran M.; Bao, Gang
2016-01-01
Hemoglobinopathies are genetic disorders caused by aberrant hemoglobin expression or structure changes, resulting in severe mortality and health disparities worldwide. Sickle cell disease (SCD) and β-thalassemia, the most common forms of hemoglobinopathies, are typically treated using transfusions and pharmacological agents. Allogeneic hematopoietic stem cell transplantation is the only curative therapy, but has limited clinical applicability. Although gene therapy approaches have been proposed based on the insertion and forced expression of wild-type or anti-sickling β-globin variants, safety concerns may impede their clinical application. A novel curative approach is nuclease-based gene correction, which involves the application of precision genome editing tools to correct the disease-causing mutation. This review describes the development and potential application of gene therapy and precision genome editing approaches for treating SCD and β-thalassemia. The opportunities and challenges in advancing a curative therapy for hemoglobinopathies are also discussed. PMID:27314256
Two-photon excited photoconversion of cyanine-based dyes
NASA Astrophysics Data System (ADS)
Kwok, Sheldon J. J.; Choi, Myunghwan; Bhayana, Brijesh; Zhang, Xueli; Ran, Chongzhao; Yun, Seok-Hyun
2016-03-01
The advent of phototransformable fluorescent proteins has led to significant advances in optical imaging, including the unambiguous tracking of cells over large spatiotemporal scales. However, these proteins typically require activating light in the UV-blue spectrum, which limits their in vivo applicability due to poor light penetration and associated phototoxicity on cells and tissue. We report that cyanine-based, organic dyes can be efficiently photoconverted by nonlinear excitation at the near infrared (NIR) window. Photoconversion likely involves singlet-oxygen mediated photochemical cleavage, yielding blue-shifted fluorescent products. Using SYTO62, a biocompatible and cell-permeable dye, we demonstrate photoconversion in a variety of cell lines, including depth-resolved labeling of cells in 3D culture. Two-photon photoconversion of cyanine-based dyes offer several advantages over existing photoconvertible proteins, including use of minimally toxic NIR light, labeling without need for genetic intervention, rapid kinetics, remote subsurface targeting, and long persistence of photoconverted signal. These findings are expected to be useful for applications involving rapid labeling of cells deep in tissue.
A hybrid life cycle inventory of nano-scale semiconductor manufacturing.
Krishnan, Nikhil; Boyd, Sarah; Somani, Ajay; Raoux, Sebastien; Clark, Daniel; Dornfeld, David
2008-04-15
The manufacturing of modern semiconductor devices involves a complex set of nanoscale fabrication processes that are energy and resource intensive, and generate significant waste. It is important to understand and reduce the environmental impacts of semiconductor manufacturing because these devices are ubiquitous components in electronics. Furthermore, the fabrication processes used in the semiconductor industry are finding increasing application in other products, such as microelectromechanical systems (MEMS), flat panel displays, and photovoltaics. In this work we develop a library of typical gate-to-gate materials and energy requirements, as well as emissions associated with a complete set of fabrication process models used in manufacturing a modern microprocessor. In addition, we evaluate upstream energy requirements associated with chemicals and materials using both existing process life cycle assessment (LCA) databases and an economic input-output (EIO) model. The result is a comprehensive data set and methodology that may be used to estimate and improve the environmental performance of a broad range of electronics and other emerging applications that involve nano and micro fabrication.
Bacterial signaling ecology and potential applications during aquatic biofilm construction.
Vega, Leticia M; Alvarez, Pedro J; McLean, Robert J C
2014-07-01
In their natural environment, bacteria and other microorganisms typically grow as surface-adherent biofilm communities. Cell signal processes, including quorum signaling, are now recognized as being intimately involved in the development and function of biofilms. In contrast to their planktonic (unattached) counterparts, bacteria within biofilms are notoriously resistant to many traditional antimicrobial agents and so represent a major challenge in industry and medicine. Although biofilms impact many human activities, they actually represent an ancient mode of bacterial growth as shown in the fossil record. Consequently, many aquatic organisms have evolved strategies involving signal manipulation to control or co-exist with biofilms. Here, we review the chemical ecology of biofilms and propose mechanisms whereby signal manipulation can be used to promote or control biofilms.
Motion based parsing for video from observational psychology
NASA Astrophysics Data System (ADS)
Kokaram, Anil; Doyle, Erika; Lennon, Daire; Joyeux, Laurent; Fuller, Ray
2006-01-01
In Psychology it is common to conduct studies involving the observation of humans undertaking some task. The sessions are typically recorded on video and used for subjective visual analysis. The subjective analysis is tedious and time consuming, not only because much useless video material is recorded but also because subjective measures of human behaviour are not necessarily repeatable. This paper presents tools using content based video analysis that allow automated parsing of video from one such study involving Dyslexia. The tools rely on implicit measures of human motion that can be generalised to other applications in the domain of human observation. Results comparing quantitative assessment of human motion with subjective assessment are also presented, illustrating that the system is a useful scientific tool.
Beres, Anna M
2017-12-01
The discovery of electroencephalography (EEG) over a century ago has changed the way we understand brain structure and function, in terms of both clinical and research applications. This paper starts with a short description of EEG and then focuses on event-related brain potentials (ERPs) and their use in experimental settings. It describes the typical set-up of an ERP experiment and presents a number of ERP components typically involved in language research. Finally, the advantages and disadvantages of using ERPs in language research are discussed. EEG is used extensively today, including in medical, psychological, and linguistic research. The excellent temporal resolution of EEG allows one to track a brain response in milliseconds, making it uniquely suited to research on language processing.
Transient analysis of a thermal storage unit involving a phase change material
NASA Technical Reports Server (NTRS)
Griggs, E. I.; Pitts, D. R.; Humphries, W. R.
1974-01-01
The transient response of a single cell of a typical phase-change-material thermal capacitor has been modeled using numerical conductive heat transfer techniques. The cell consists of a base plate, an insulated top, and two vertical walls (fins) forming a two-dimensional cavity filled with a phase change material. Both explicit and implicit numerical formulations are outlined. A mixed explicit-implicit scheme which treats the fin implicitly while treating the phase change material explicitly is discussed. A band algorithmic scheme is used to reduce computer storage requirements for the implicit approach while retaining a relatively fine grid. All formulations are presented in dimensionless form, thereby enabling application to geometrically similar problems. Typical parametric results are graphically presented for the case of melting with constant heat input to the base of the cell.
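The explicit formulation mentioned above can be illustrated with a minimal one-dimensional sketch (assumed dimensionless setup, not the paper's two-dimensional phase-change cell): a forward-time, centered-space (FTCS) update, which is stable only while the grid Fourier number r = Δt/Δx² stays at or below 1/2 — the kind of constraint that motivates implicit and mixed schemes.

```python
def ftcs_step(T, r):
    """One explicit (FTCS) update of dimensionless conduction dT/dt = d2T/dx2.
    r = dt / dx**2 is the grid Fourier number; stability requires r <= 0.5.
    The two endpoint temperatures are held fixed (Dirichlet boundaries)."""
    return [T[0]] + [
        T[i] + r * (T[i + 1] - 2.0 * T[i] + T[i - 1])
        for i in range(1, len(T) - 1)
    ] + [T[-1]]

# Heated left wall, cold right wall, initially cold interior.
T = [1.0] + [0.0] * 9
for _ in range(2000):
    T = ftcs_step(T, 0.4)  # r = 0.4 < 0.5: stable march
# After many steps the profile relaxes toward the linear steady state T = 1 - x.
```

With r above 0.5 the same loop produces growing oscillations, which is why implicit treatment of the stiff part (here, the fin) pays off.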
Three-dimensional microbubble streaming flows
NASA Astrophysics Data System (ADS)
Rallabandi, Bhargav; Marin, Alvaro; Rossi, Massimiliano; Kaehler, Christian; Hilgenfeldt, Sascha
2014-11-01
Streaming due to acoustically excited bubbles has been used successfully for applications such as size-sorting, trapping and focusing of particles, as well as fluid mixing. Many of these applications involve the precise control of particle trajectories, typically achieved using cylindrical bubbles, which establish planar flows. Using astigmatic particle tracking velocimetry (APTV), we show that, while this two-dimensional picture is a useful description of the flow over short times, a systematic three-dimensional flow structure is evident over long time scales. We demonstrate that this long-time three-dimensional fluid motion can be understood through asymptotic theory, superimposing secondary axial flows (induced by boundary conditions at the device walls) onto the two-dimensional description. This leads to a general framework that describes three-dimensional flows in confined microstreaming systems, guiding the design of applications that profit from minimizing or maximizing these effects.
Supervised learning of probability distributions by neural networks
NASA Technical Reports Server (NTRS)
Baum, Eric B.; Wilczek, Frank
1988-01-01
Supervised learning algorithms for feedforward neural networks are investigated analytically. The back-propagation algorithm described by Werbos (1974), Parker (1985), and Rumelhart et al. (1986) is generalized by redefining the values of the input and output neurons as probabilities. The synaptic weights are then varied to follow gradients in the logarithm of likelihood rather than in the error. This modification is shown to provide a more rigorous theoretical basis for the algorithm and to permit more accurate predictions. A typical application involving a medical-diagnosis expert system is discussed.
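The abstract's central idea — following gradients of the log-likelihood rather than of the squared error when outputs are probabilities — reduces, for a single sigmoid unit, to the familiar cross-entropy gradient. A minimal sketch under assumed toy data and learning rate (not the paper's networks or its medical-diagnosis system):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(xs, ys, lr=0.5, steps=2000):
    """Maximize the log-likelihood sum(y*log(p) + (1-y)*log(1-p)) by gradient
    ascent. For a sigmoid output the gradient w.r.t. the weight is simply
    (y - p) * x, so probabilities are fit directly rather than via error."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = sigmoid(w * x + b)
            gw += (y - p) * x
            gb += (y - p)
        w += lr * gw / len(xs)
        b += lr * gb / len(xs)
    return w, b

# Toy separable data: negative inputs -> class 0, positive -> class 1.
xs = [-2.0, -1.0, 1.0, 2.0]
ys = [0, 0, 1, 1]
w, b = train_logistic(xs, ys)
```

After training, the unit's output is a calibrated probability: close to 1 for clearly positive inputs and close to 0 for clearly negative ones.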
A tensor approach to modeling of nonhomogeneous nonlinear systems
NASA Technical Reports Server (NTRS)
Yurkovich, S.; Sain, M.
1980-01-01
Model following control methodology plays a key role in numerous application areas; cases in point include flight control systems and gas turbine engine control systems. Typical uses of such a design strategy involve the determination of nonlinear models that generate requested control and response trajectories for various commands. Linear multivariable techniques provide trim about these motions, and protection logic is added to secure the hardware from excursions beyond the specification range. This paper reports on experience in developing a general class of such nonlinear models based upon the idea of the algebraic tensor product.
Alimonti, Luca; Atalla, Noureddine; Berry, Alain; Sgard, Franck
2015-02-01
Practical vibroacoustic systems involve passive acoustic treatments consisting of highly dissipative media such as poroelastic materials. The numerical modeling of such systems at low to mid frequencies typically relies on substructuring methodologies based on finite element models. Namely, the master subsystems (i.e., structural and acoustic domains) are described by a finite set of uncoupled modes, whereas condensation procedures are typically preferred for the acoustic treatments. However, although accurate, such methodology is computationally expensive when real-life applications are considered. A potential reduction of the computational burden could be obtained by approximating the effect of the acoustic treatment on the master subsystems without introducing physical degrees of freedom. To do that, the treatment has to be assumed homogeneous, flat, and of infinite lateral extent. Under these hypotheses, simple analytical tools like the transfer matrix method can be employed. In this paper, a hybrid finite element-transfer matrix methodology is proposed. The impact of the limiting assumptions inherent in the analytical framework is assessed for the case of plate-cavity systems involving flat and homogeneous acoustic treatments. The results prove that the hybrid model can capture the qualitative behavior of the vibroacoustic system while reducing the computational effort.
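The transfer matrix method referenced above can be sketched for the simplest building block — a homogeneous fluid layer of thickness d, wavenumber k, and characteristic impedance Z, relating pressure and normal velocity on its two faces at normal incidence. This is the standard fluid-layer matrix only; the poroelastic-layer matrices used in the paper are considerably more involved.

```python
import cmath

def fluid_layer_matrix(k, Z, d):
    """2x2 transfer matrix of a fluid layer at normal incidence, relating
    (pressure, velocity) on the front face to those on the back face:
        [p1, v1]^T = T @ [p2, v2]^T
    Its determinant is 1 (reciprocity)."""
    kd = k * d
    return [[cmath.cos(kd), 1j * Z * cmath.sin(kd)],
            [1j * cmath.sin(kd) / Z, cmath.cos(kd)]]

def matmul2(A, B):
    """Multiply 2x2 matrices, so layered treatments chain as T = T1 @ T2."""
    return [[sum(A[i][m] * B[m][j] for m in range(2)) for j in range(2)]
            for i in range(2)]

# Two stacked layers (illustrative, air-like values).
T = matmul2(fluid_layer_matrix(18.3, 413.0, 0.05),
            fluid_layer_matrix(40.0, 800.0, 0.02))
det = T[0][0] * T[1][1] - T[0][1] * T[1][0]
# Reciprocity check: the determinant remains 1 for any chain of such layers.
```

Chaining per-layer matrices like this is what makes the analytical side of the hybrid method cheap compared to meshing the treatment with finite elements.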
Calibration of asynchronous smart phone cameras from moving objects
NASA Astrophysics Data System (ADS)
Hagen, Oksana; Istenič, Klemen; Bharti, Vibhav; Dhali, Maruf Ahmed; Barmaimon, Daniel; Houssineau, Jérémie; Clark, Daniel
2015-04-01
Calibrating multiple cameras is a fundamental prerequisite for many Computer Vision applications. Typically this involves using a pair of identical synchronized industrial or high-end consumer cameras. This paper considers an application on a pair of low-cost portable cameras with different parameters that are found in smart phones. This paper addresses the issues of acquisition, detection of moving objects, dynamic camera registration and tracking of arbitrary number of targets. The acquisition of data is performed using two standard smart phone cameras and later processed using detections of moving objects in the scene. The registration of cameras onto the same world reference frame is performed using a recently developed method for camera calibration using a disparity space parameterisation and the single-cluster PHD filter.
The need for higher-order averaging in the stability analysis of hovering, flapping-wing flight.
Taha, Haithem E; Tahmasian, Sevak; Woolsey, Craig A; Nayfeh, Ali H; Hajj, Muhammad R
2015-01-05
Because of the relatively high flapping frequency associated with hovering insects and flapping wing micro-air vehicles (FWMAVs), dynamic stability analysis typically involves direct averaging of the time-periodic dynamics over a flapping cycle. However, direct application of the averaging theorem may lead to false conclusions about the dynamics and stability of hovering insects and FWMAVs. Higher-order averaging techniques may be needed to understand the dynamics of flapping wing flight and to analyze its stability. We use second-order averaging to analyze the hovering dynamics of five insects in response to high-amplitude, high-frequency, periodic wing motion. We discuss the applicability of direct averaging versus second-order averaging for these insects.
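The idea of direct averaging can be seen in a toy scalar example (illustrative only — unrelated to the insect flight models themselves): for a weakly forced time-periodic system, replacing the periodic coefficient by its cycle average reproduces the slow dynamics to first order in the small parameter, which is exactly the approximation whose limits the paper probes with second-order averaging.

```python
import math

def full_vs_averaged(eps=0.01, periods=50, dt=1e-3):
    """Integrate dx/dt = eps * x * sin(t)**2 (time-periodic) with forward
    Euler, and compare against the first-order averaged system
    dx/dt = (eps / 2) * x, whose solution is exp(eps * t / 2).
    Over whole periods the two agree to O(eps)."""
    t_end = 2.0 * math.pi * periods
    x, t = 1.0, 0.0
    for _ in range(int(t_end / dt)):
        x += dt * eps * x * math.sin(t) ** 2
        t += dt
    return x, math.exp(eps * t_end / 2.0)

x_full, x_avg = full_vs_averaged()
# At this eps the two trajectories differ by well under 1%.
```

When the "small" parameter is not small — as with the high-amplitude wing motions discussed above — this first-order agreement degrades, and higher-order averaging terms are needed.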
Hybrid magnet devices for molecule manipulation and small scale high gradient-field applications
Humphries, David E [El Cerrito, CA; Hong, Seok-Cheol [Seoul, KR; Cozzarelli, legal representative, Linda A.; Pollard, Martin J [El Cerrito, CA; Cozzarelli, Nicholas R [Berkeley, CA
2009-01-06
The present disclosure provides a high performance hybrid magnetic structure made from a combination of permanent magnets and ferromagnetic pole materials which are assembled in a predetermined array. The hybrid magnetic structure provides means for separation and other biotechnology applications involving holding, manipulation, or separation of magnetizable molecular structures and targets. Also disclosed are hybrid magnetic tweezers able to exert approximately 1 nN of force on a 4.5 µm magnetic bead. The maximum force was experimentally measured to be ~900 pN, which is in good agreement with theoretical estimations and other measurements. In addition, a new analysis scheme that permits fast real-time position measurement in the typical geometry of magnetic tweezers has been developed and described in detail.
The NASA Aerospace Battery Safety Handbook
NASA Technical Reports Server (NTRS)
Halpert, Gerald; Subbarao, Surampudi; Rowlette, John J.
1986-01-01
This handbook has been written for the purpose of acquainting those involved with batteries with the information necessary for the safe handling, storage, and disposal of these energy storage devices. Included in the document is a discussion of the cell and battery design considerations and the role of the components within a cell. The cell and battery hazards are related to user- and/or manufacturer-induced causes. The Johnson Space Center (JSC) Payload Safety Guidelines for battery use in Shuttle applications are also provided. The electrochemical systems are divided into zinc anode and lithium anode primaries, secondary cells, and fuel cells. Each system is briefly described, typical applications are given, advantages and disadvantages are tabulated, and most importantly, safety hazards associated with its use are given.
Assessing cost-effectiveness of drug interventions for schizophrenia.
Magnus, Anne; Carr, Vaughan; Mihalopoulos, Cathrine; Carter, Rob; Vos, Theo
2005-01-01
To assess from a health sector perspective the incremental cost-effectiveness of eight drug treatment scenarios for established schizophrenia. Using a standardized methodology, costs and outcomes are modelled over the lifetime of prevalent cases of schizophrenia in Australia in 2000. A two-stage approach to assessment of health benefit is used. The first stage involves a quantitative analysis based on disability-adjusted life years (DALYs) averted, using the best available evidence. The robustness of results is tested using probabilistic uncertainty analysis. The second stage involves application of 'second filter' criteria (equity, strength of evidence, feasibility and acceptability) to allow broader concepts of benefit to be considered. Replacing oral typicals with risperidone or olanzapine has an incremental cost-effectiveness ratio (ICER) of 48,000 and 92,000 Australian dollars/DALY, respectively. Switching from low-dose typicals to risperidone has an ICER of 80,000 Australian dollars/DALY. Giving risperidone to people experiencing side-effects on typicals is more cost-effective, at 20,000 Australian dollars/DALY. Giving clozapine to people taking typicals, with the worst course of the disorder and either little or clear deterioration, is cost-effective at 42,000 or 23,000 Australian dollars/DALY, respectively. The least cost-effective intervention is to replace risperidone with olanzapine, at 160,000 Australian dollars/DALY. Based on a 50,000 Australian dollars/DALY threshold, low-dose typical neuroleptics are indicated as the treatment of choice for established schizophrenia, with risperidone reserved for those experiencing moderate to severe side-effects on typicals. The more expensive olanzapine should only be prescribed when risperidone is not clinically indicated. The high cost of risperidone and olanzapine relative to modest health gains underlies this conclusion.
Earlier introduction of clozapine, however, would be cost-effective. This work is limited by weaknesses in the trials (lack of long-term efficacy data and of quality-of-life and consumer satisfaction evidence) and by the translation of effect size into a DALY change. Some stakeholders, including SANE Australia, argue that the modest health gains reported in the literature do not adequately reflect the improvements in quality of life that patients, clinicians and carers perceive with these atypicals.
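The incremental cost-effectiveness ratio used throughout the abstract is a one-line computation; a sketch with made-up numbers (the function name and all inputs here are hypothetical, not figures from the study):

```python
def icer(cost_new, cost_current, dalys_averted_new, dalys_averted_current):
    """Incremental cost-effectiveness ratio: extra lifetime cost per
    additional DALY averted when switching from the current treatment."""
    return (cost_new - cost_current) / (dalys_averted_new - dalys_averted_current)

# Hypothetical inputs: the new drug costs AUD 30,000 more over a lifetime
# and averts 0.6 additional DALYs.
ratio = icer(130_000, 100_000, 2.1, 1.5)
print(round(ratio))  # -> 50000, i.e. 50,000 AUD per DALY averted
```

A treatment is then judged against a willingness-to-pay threshold such as the 50,000 AUD/DALY figure used in the study.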
12 CFR 1070.22 - Fees for processing requests for CFPB records.
Code of Federal Regulations, 2012 CFR
2012-01-01
... of grades typically involved may be established. This charge shall include transportation of...), an average rate for the range of grades typically involved may be established. Fees shall be charged... research. (iii) Non-commercial scientific institution refers to an institution that is not operated on a...
Congdon, Peter
2013-01-01
This paper considers estimation of disease prevalence for small areas (neighbourhoods) when the available observations on prevalence are for an alternative partition of a region, such as service areas. Interpolation to neighbourhoods uses a kernel method extended to take account of two types of collateral information. The first is morbidity and service use data, such as hospital admissions, observed for neighbourhoods. Variations in morbidity and service use are expected to reflect prevalence. The second type of collateral information is ecological risk factors (e.g., pollution indices) that are expected to explain variability in prevalence in service areas, but are typically observed only for neighbourhoods. An application involves estimating neighbourhood asthma prevalence in a London health region involving 562 neighbourhoods and 189 service (primary care) areas. PMID:24129116
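The kernel interpolation at the core of the method can be sketched in its basic, unextended form — a Gaussian-kernel weighted average of the service-area observations, without the morbidity or ecological-covariate adjustments the paper adds. Coordinates, prevalences, and bandwidth below are illustrative assumptions.

```python
import math

def kernel_estimate(target, sources, bandwidth):
    """Estimate prevalence at a neighbourhood centroid `target` = (x, y) as a
    Gaussian-kernel weighted average of ((x, y), prevalence) observations
    made on a different partition (e.g. service areas)."""
    tx, ty = target
    num = den = 0.0
    for (x, y), value in sources:
        w = math.exp(-((x - tx) ** 2 + (y - ty) ** 2) / (2.0 * bandwidth ** 2))
        num += w * value
        den += w
    return num / den

# Two service areas with prevalence 0.04 and 0.10: a neighbourhood midway
# between them gets their average, while one close to the first area
# inherits a value near 0.04.
areas = [((0.0, 0.0), 0.04), ((10.0, 0.0), 0.10)]
mid = kernel_estimate((5.0, 0.0), areas, bandwidth=3.0)
near = kernel_estimate((0.5, 0.0), areas, bandwidth=3.0)
```

The paper's extension effectively reweights such averages using neighbourhood-level morbidity data and ecological risk factors.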
Leptin applications in 2015: What have we learned about leptin and obesity?
Farr, Olivia M.; Gavrieli, Anna; Mantzoros, Christos S.
2015-01-01
Purpose of review: To summarize previous and current advancements in leptin therapeutics, we describe how leptin may be useful in leptin-deficient states such as lipodystrophy, for which leptin was recently approved, and how it may be useful in the future for typical obesity. Recent findings: The discovery of leptin in 1994 built the foundation for understanding the pathophysiology and treatment of obesity. Leptin therapy reverses morbid obesity related to congenital leptin deficiency and appears to effectively treat lipodystrophy, a finding which has led to the approval of leptin for the treatment of lipodystrophy in the USA and Japan. Typical obesity, on the other hand, is characterized by hyperleptinemia and leptin resistance. Thus, leptin administration has proven ineffective for inducing weight loss on its own but may be useful in combination with other therapies or for weight loss maintenance. Summary: Leptin is not yet able to treat typical obesity; however, it is effective for reversing leptin deficiency-induced obesity and lipodystrophy. New mechanisms and pathways involved in leptin resistance are continuously being discovered, while the development of new techniques and drug combinations that may improve leptin's efficacy and safety renews the hope for its use as an effective treatment for typical obesity. PMID:26313897
NASA Astrophysics Data System (ADS)
Park, Jihoon; Yang, Guang; Satija, Addy; Scheidt, Céline; Caers, Jef
2016-12-01
Sensitivity analysis plays an important role in geoscientific computer experiments, whether for forecasting, data assimilation or model calibration. In this paper we focus on an extension of a method of regionalized sensitivity analysis (RSA) to applications typical in the Earth Sciences. Such applications involve the building of large complex spatial models, the application of computationally intensive forward modeling codes and the integration of heterogeneous sources of model uncertainty. The aim of this paper is to be practical: 1) provide Matlab code, 2) provide novel visualization methods that help users gain a better understanding of the sensitivity, 3) provide a method based on kernel principal component analysis (KPCA) and self-organizing maps (SOM) to account for the spatial uncertainty typical of Earth Science applications, and 4) provide an illustration on a real field case where the above-mentioned complexities present themselves. We present methods that extend the original RSA method in several ways. First, we present the calculation of conditional effects, defined as the sensitivity of a parameter given a level of another parameter. Second, we show how this conditional effect can be used to choose nominal values or ranges at which to fix insensitive parameters while minimally affecting uncertainty in the response. Third, we develop a method based on KPCA and SOM to assign a rank to spatial models in order to calculate the sensitivity to spatial variability in the models. A large oil/gas reservoir case is used as an illustration of these ideas.
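The regionalized sensitivity analysis being extended has a simple core, sketched below under assumed synthetic data (pure Python; the paper's conditional effects and KPCA/SOM machinery are not reproduced): split the Monte Carlo parameter samples into behavioral and non-behavioral sets by thresholding the response, then score each parameter's sensitivity by the Kolmogorov-Smirnov distance between its two empirical distributions.

```python
import bisect

def rsa_ks(params, responses, threshold):
    """Kolmogorov-Smirnov distance between a parameter's empirical CDFs in
    the 'behavioral' (response <= threshold) and 'non-behavioral' groups.
    A large distance marks a sensitive parameter."""
    behav = sorted(p for p, r in zip(params, responses) if r <= threshold)
    non = sorted(p for p, r in zip(params, responses) if r > threshold)

    def ecdf(xs, v):
        return bisect.bisect_right(xs, v) / len(xs)

    return max(abs(ecdf(behav, v) - ecdf(non, v)) for v in sorted(params))

# Synthetic experiment: the response is driven entirely by p1; p2 is noise.
p1 = [i / 100.0 for i in range(100)]
p2 = [(i * 37) % 100 / 100.0 for i in range(100)]
response = p1  # response depends on p1 only
ks_p1 = rsa_ks(p1, response, threshold=0.5)  # near 1: highly sensitive
ks_p2 = rsa_ks(p2, response, threshold=0.5)  # small: insensitive
```

The paper's KPCA/SOM ranking plays the role of `params` for spatial models, which otherwise have no natural scalar value to build a CDF from.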
General solutions for the oxidation kinetics of polymers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillen, K.T.; Clough, R.L.; Wise, J.
1996-08-01
The simplest general kinetic schemes applicable to the oxidation of polymers are presented, discussed and analyzed in terms of the underlying kinetic assumptions. For the classic basic autoxidation scheme (BAS), which involves three bimolecular termination steps and is applicable mainly to unstabilized polymers, typical assumptions used singly or in groups include (1) long kinetic chain length, (2) a specific ratio of the termination rate constants and (3) insensitivity to the oxygen concentration (e.g., domination by a single termination step). Steady-state solutions for the rate of oxidation are given in terms of one, two, three, or four parameters, corresponding respectively to three, two, one, or zero kinetic assumptions. The recently derived four-parameter solution predicts conditions yielding unusual dependencies of the oxidation rate on oxygen concentration and on initiation rate, as well as conditions leading to some unusual diffusion-limited oxidation profile shapes. For stabilized polymers, unimolecular termination schemes are typically more appropriate than bimolecular. Kinetics incorporating unimolecular termination reactions are shown to result in very simple oxidation expressions which have been experimentally verified for both radiation-initiated oxidation of an EPDM and thermoxidative degradation of nitrile and chloroprene elastomers.
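The contrast between unimolecular and bimolecular termination can be made concrete with a standard steady-state sketch (textbook autoxidation kinetics under the long-kinetic-chain assumption; this is not the paper's full four-parameter solution). With initiation rate $R_i$, propagation constant $k_p$, termination constant $k_t$, and peroxy radical concentration $[\mathrm{ROO}^\bullet]$:

```latex
\text{Unimolecular termination: } R_i = k_t[\mathrm{ROO}^\bullet]
  \;\Rightarrow\;
  r_{\mathrm{ox}} \approx k_p[\mathrm{RH}]\,\frac{R_i}{k_t}
  \quad (\text{linear in } R_i),
\qquad
\text{bimolecular: } R_i = 2k_t[\mathrm{ROO}^\bullet]^2
  \;\Rightarrow\;
  r_{\mathrm{ox}} \approx k_p[\mathrm{RH}]\sqrt{\frac{R_i}{2k_t}}.
```

The linear dependence on initiation rate in the unimolecular case is what yields the "very simple oxidation expressions" the abstract refers to for stabilized polymers.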
Spatially selective assembly of quantum dot light emitters in an LED using engineered peptides.
Demir, Hilmi Volkan; Seker, Urartu Ozgur Safak; Zengin, Gulis; Mutlugun, Evren; Sari, Emre; Tamerler, Candan; Sarikaya, Mehmet
2011-04-26
Semiconductor nanocrystal quantum dots are utilized in numerous applications in nano- and biotechnology. In device applications, where several different material components are involved, quantum dots typically need to be assembled at explicit locations for enhanced functionality. Conventional approaches, in which the assembly of nanocrystals is usually material-nonspecific, cannot meet these requirements and thereby limit control over the nanocrystals' spatial distribution. Here we demonstrate directed self-assembly of quantum dot emitters at material-specific locations in a color-conversion LED containing several material components, including a metal, a dielectric, and a semiconductor. We achieve spatially selective immobilization of quantum dot emitters by using the unique material-selectivity characteristics of engineered solid-binding peptides as smart linkers. Peptide-decorated quantum dots exhibited several orders of magnitude higher photoluminescence than the control groups, thus potentially opening up novel ways to advance these photonic platforms in applications ranging from chemical detection to biodetection.
Laser-Induced Fluorescence Velocity Measurements in Supersonic Underexpanded Impinging Jets
NASA Technical Reports Server (NTRS)
Inman, Jennifer A.; Danehy, Paul M.; Barthel, Brett; Alderfer, David W.; Novak, Robert J.
2010-01-01
We report on an application of nitric oxide (NO) flow-tagging velocimetry to impinging underexpanded jet flows issuing from a Mach 2.6 nozzle. The technique reported herein utilizes a single laser, single camera system to obtain planar maps of the streamwise component of velocity. Whereas typical applications of this technique involve comparing two images acquired at different time delays, this application uses a single image and time delay. The technique extracts velocity by assuming that particular regions outside the jet flowfield have negligible velocity and may therefore serve as a stationary reference against which to measure motion of the jet flowfield. By taking the average of measurements made in 100 single-shot images for each flow condition, streamwise velocities of between -200 and +1,000 m/s with accuracies of between 15 and 50 m/s are reported within the jets. Velocity measurements are shown to explain otherwise seemingly anomalous impingement surface pressure measurements.
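The single-image extraction described above reduces to displacement over delay, with a quiescent region supplying the zero-velocity reference. A minimal sketch with hypothetical numbers (the pixel scale, delay, and shifts are illustrative, not the paper's values):

```python
def streamwise_velocity(x0_px, x1_px, dt_s, px_per_m, reference_shift_px=0.0):
    """Single-delay flow-tagging estimate: the tagged line's pixel displacement
    over one known delay, minus the apparent shift of a quiescent region used
    as the zero-velocity reference, converted to metres per second."""
    return ((x1_px - x0_px) - reference_shift_px) / px_per_m / dt_s

# Hypothetical values: 52 px apparent motion in 1 microsecond at 50,000 px/m,
# of which 2 px is the stationary-reference drift.
v = streamwise_velocity(100.0, 152.0, dt_s=1.0e-6, px_per_m=5.0e4,
                        reference_shift_px=2.0)
```

With these numbers the estimate is 1,000 m/s, the upper end of the velocities reported in the abstract; averaging such estimates over many single-shot images is what yields the quoted 15 to 50 m/s accuracies.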
Joint FACET: the Canada-Netherlands initiative to study multisensor data fusion systems
NASA Astrophysics Data System (ADS)
Bosse, Eloi; Theil, Arne; Roy, Jean; Huizing, Albert G.; van Aartsen, Simon
1998-09-01
This paper presents the progress of a collaborative effort between Canada and The Netherlands in analyzing multi-sensor data fusion systems, e.g. for potential application to their respective frigates. In view of their overlapping interest in studying and comparing the applicability and performance of advanced state-of-the-art Multi-Sensor Data Fusion (MSDF) techniques, the two research establishments involved have decided to join their efforts in the development of MSDF testbeds. This resulted in the so-called Joint-FACET, a highly modular and flexible series of applications capable of processing both real and synthetic input data. Joint-FACET allows the user to create and edit test scenarios with multiple ships, sensors, and targets, generate realistic sensor outputs, and process these outputs with a variety of MSDF algorithms. These MSDF algorithms can also be tested using typical experimental data collected during live military exercises.
Atmospheric propagation and combining of high-power lasers.
Nelson, W; Sprangle, P; Davis, C C
2016-03-01
In this paper, we analyze beam combining and atmospheric propagation of high-power lasers for directed-energy (DE) applications. The large linewidths inherent in high-power fiber and slab lasers cause random phase and intensity fluctuations that occur on subnanosecond time scales. Coherently combining these high-power lasers would involve instruments capable of precise phase control and operation at rates greater than ∼10 GHz. To the best of our knowledge, this technology does not currently exist. This presents a challenging problem when attempting to phase lock high-power lasers that is not encountered when phase locking low-power lasers, for example, at milliwatt power levels. Regardless, we demonstrate that even if instruments are developed that can precisely control the phase of high-power lasers, coherent combining is problematic for DE applications. The dephasing effects of atmospheric turbulence typically encountered in DE applications will degrade the coherent properties of the beam before it reaches the target. Through simulations, we find that coherent beam combining in moderate turbulence and over multikilometer propagation distances has little advantage over incoherent combining. Additionally, in cases of strong turbulence and multikilometer propagation ranges, we find nearly indistinguishable intensity profiles and virtually no difference in the energy on the target between coherently and incoherently combined laser beams. Consequently, we find that coherent beam combining at the transmitter plane is ineffective under typical atmospheric conditions.
Havelaar, A H; De Hollander, A E; Teunis, P F; Evers, E G; Van Kranen, H J; Versteegh, J F; Van Koten, J E; Slob, W
2000-04-01
To evaluate the applicability of disability adjusted life-years (DALYs) as a measure to compare positive and negative health effects of drinking water disinfection, we conducted a case study involving a hypothetical drinking water supply from surface water. This drinking water supply is typical in The Netherlands. We compared the reduction of the risk of infection with Cryptosporidium parvum by ozonation of water to the concomitant increase in risk of renal cell cancer arising from the production of bromate. We applied clinical, epidemiologic, and toxicologic data on morbidity and mortality to calculate the net health benefit in DALYs. We estimated the median risk of infection with C. parvum as 10(-3)/person-year. Ozonation reduces the median risk in the baseline approximately 7-fold, but bromate is produced in a concentration above current guideline levels. However, the health benefits of preventing gastroenteritis in the general population and premature death in patients with acquired immunodeficiency syndrome outweigh health losses by premature death from renal cell cancer by a factor of > 10. The net benefit is approximately 1 DALY/million person-years. The application of DALYs in principle allows us to more explicitly compare the public health risks and benefits of different management options. In practice, the application of DALYs may be hampered by the substantial degree of uncertainty, as is typical for risk assessment.
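The DALY bookkeeping underlying such comparisons sums years of life lost to mortality and years lived with disability. The sketch below illustrates the arithmetic only; the case counts, disability weights, and durations are hypothetical placeholders, not the study's data:

```python
def dalys(cases, disability_weight, duration_yr, deaths=0.0, life_years_lost=0.0):
    """DALY = YLD + YLL: years lived with disability (cases x severity x
    duration) plus years of life lost to premature mortality."""
    yld = cases * disability_weight * duration_yr
    yll = deaths * life_years_lost
    return yld + yll

# Hypothetical per-million-person-year figures, for illustration only.
averted = dalys(cases=800, disability_weight=0.07, duration_yr=0.02,
                deaths=1, life_years_lost=10)      # gastroenteritis prevented
incurred = dalys(cases=0.6, disability_weight=0.2, duration_yr=5,
                 deaths=0.1, life_years_lost=12)   # bromate-attributed cancer
net_benefit = averted - incurred
```

Expressing both the prevented infections and the induced cancers on this single scale is what lets the study report a net benefit of approximately 1 DALY per million person-years.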
MS-QI: A Modulation Spectrum-Based ECG Quality Index for Telehealth Applications.
Tobon V, Diana P; Falk, Tiago H; Maier, Martin
2016-08-01
As telehealth applications emerge, the need for accurate and reliable biosignal quality indices has increased. One typical modality used in remote patient monitoring is the electrocardiogram (ECG), which is inherently susceptible to several different noise sources, including environmental (e.g., powerline interference), experimental (e.g., movement artifacts), and physiological (e.g., muscle and breathing artifacts). Accurate measurement of ECG quality can allow for automated decision support systems to make intelligent decisions about patient conditions. This is particularly true for in-home monitoring applications, where the patient is mobile and the ECG signal can be severely corrupted by movement artifacts. In this paper, we propose an innovative ECG quality index based on the so-called modulation spectral signal representation. The representation quantifies the rate of change of ECG spectral components, which are shown to be different from the rate of change of typical ECG noise sources. The proposed modulation spectral-based quality index, MS-QI, was tested on 1) synthetic ECG signals corrupted by varying levels of noise, 2) single-lead recorded data using the Hexoskin garment during three activity levels (sitting, walking, running), 3) 12-lead recorded data using conventional ECG machines (Computing in Cardiology 2011 dataset), and 4) two-lead ambulatory ECG recorded from arrhythmia patients (MIT-BIH Arrhythmia Database). Experimental results showed the proposed index outperforming two conventional benchmark quality measures, particularly in the scenarios involving recorded data in real-world environments.
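The modulation-spectral representation the index builds on is a second transform applied across the time trajectories of short-time spectral magnitudes. A minimal sketch (naive DFTs and toy frame sizes; this illustrates the representation, not the MS-QI formula itself):

```python
import cmath
import math

def dft_mag(x):
    """Magnitudes of the first half of a naive DFT (real-valued input)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) for k in range(n // 2)]

def modulation_spectrum(signal, frame, hop):
    """Two-stage transform behind modulation-spectral measures: (1) short-time
    spectra over sliding frames, then (2) a second DFT along each
    acoustic-frequency bin's trajectory, quantifying how fast that spectral
    component changes over time."""
    frames = [signal[i:i + frame]
              for i in range(0, len(signal) - frame + 1, hop)]
    spectra = [dft_mag(f) for f in frames]          # shape: frames x bins
    return [dft_mag([s[k] for s in spectra])        # shape: bins x mod-bins
            for k in range(len(spectra[0]))]

# A clean steady tone: all modulation energy should sit at modulation bin 0.
tone = [math.sin(2 * math.pi * 5 * t / 64) for t in range(256)]
ms = modulation_spectrum(tone, frame=64, hop=32)
```

For a clean, quasi-periodic signal the modulation energy concentrates at low modulation frequencies, while movement and muscle artifacts spread energy to higher ones; that contrast is the kind of cue a modulation-spectral quality index can exploit.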
Uncertainties in the Thermal and Mechanical Properties of Particulate Composites Quantified
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Mital, Subodh K.
2001-01-01
Particle-reinforced composites are candidate materials for a wide variety of aerospace and nonaerospace applications. The high costs and technical difficulties involved with the use of many fiber-reinforced composites often limit their use in many applications. Consequently, particulate composites have emerged as viable alternatives to conventional fiber-reinforced composites. Particulate composites can be processed to near net shape, potentially reducing manufacturing costs. They are candidate materials where shock or impact properties are important. For example, particle-reinforced metal matrix composites have shown great potential for many automotive applications. Typically, these materials are an aluminum matrix reinforced with SiC or TiC particles. Reinforced concrete can also be thought of as a particle-reinforced composite. In situ ceramics can be modeled as particulate composites and are candidate materials for many high-temperature applications. The characterization of these materials is fundamental to their reliable use. It has been observed that the overall properties of these composites exhibit scatter because of uncertainty in the constituent material properties and fabrication-related parameters.
Conformal mapping for multiple terminals
Wang, Weimin; Ma, Wenying; Wang, Qiang; Ren, Hao
2016-01-01
Conformal mapping is an important mathematical tool that can be used to solve various physical and engineering problems in many fields, including electrostatics, fluid mechanics, classical mechanics, and transformation optics. It is an accurate and convenient way to solve problems involving two terminals. However, when faced with problems involving three or more terminals, which are more common in practical applications, existing conformal mapping methods apply assumptions or approximations. A general exact method does not exist for a structure with an arbitrary number of terminals. This study presents a conformal mapping method for multiple terminals. Through an accurate analysis of boundary conditions, additional terminals or boundaries are folded into the inner part of a mapped region. The method is applied to several typical situations, and the calculation process is described for two examples of an electrostatic actuator with three electrodes and of a light beam splitter with three ports. Compared with previously reported results, the solutions for the two examples based on our method are more precise and general. The proposed method is helpful in promoting the application of conformal mapping in analysis of practical problems.
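As a reminder of the two-terminal baseline the paper generalizes: the textbook map w = log z flattens a coaxial (annular) geometry into a parallel-plate strip, solving the two-electrode electrostatics in closed form. This is a standard example, not the paper's multi-terminal method:

```python
import cmath
import math

EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def coax_potential(z, a, b, v_inner, v_outer):
    """Two-terminal conformal-mapping workhorse: w = log(z) maps the annulus
    a < |z| < b onto a strip in which the potential is linear in Re(w);
    pulling that solution back gives the coaxial-capacitor potential at z."""
    u = cmath.log(z).real                      # Re(w) = ln|z|
    frac = (u - math.log(a)) / (math.log(b) - math.log(a))
    return v_inner + frac * (v_outer - v_inner)

def coax_capacitance_per_m(a, b):
    """Capacitance per unit length of the same geometry: C = 2*pi*eps0/ln(b/a)."""
    return 2.0 * math.pi * EPS0 / math.log(b / a)

# On the geometric-mean radius sqrt(a*b) the potential is exactly halfway.
phi_mid = coax_potential(cmath.rect(2.0, 0.7), a=1.0, b=4.0,
                         v_inner=0.0, v_outer=10.0)
c_per_m = coax_capacitance_per_m(1.0, 4.0)
```

The paper's contribution is precisely that no such closed-form map is generally available once a third terminal is introduced, motivating the boundary-folding construction in the abstract.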
ERIC Educational Resources Information Center
Solomon, Olga; Heritage, John; Yin, Larry; Maynard, Douglas W.; Bauman, Margaret L.
2016-01-01
Conversation and discourse analyses were used to examine medical problem presentation in pediatric care. Healthcare visits involving children with ASD and typically developing children were analyzed. We examined how children's communicative and epistemic capabilities, and their opportunities to be socialized into a competent patient role are…
An expert systems approach to automated fault management in a regenerative life support subsystem
NASA Technical Reports Server (NTRS)
Malin, J. T.; Lance, N., Jr.
1986-01-01
This paper describes FIXER, a prototype expert system for automated fault management in a regenerative life support subsystem typical of Space Station applications. The development project provided an evaluation of the use of expert systems technology to enhance controller functions in space subsystems. The software development approach permitted evaluation of the effectiveness of direct involvement of the expert in design and development. The approach also permitted intensive observation of the knowledge and methods of the expert. This paper describes the development of the prototype expert system and presents results of the evaluation.
On applying cognitive psychology.
Baddeley, Alan
2013-11-01
Recent attempts to assess the practical impact of scientific research prompted my own reflections on over 40 years' worth of combining basic and applied cognitive psychology. Examples are drawn principally from the study of memory disorders, but also include applications to the assessment of attention, reading, and intelligence. The most striking conclusion concerns the many years it typically takes to go from an initial study to the final practical outcome. Although the complexity and sheer timescale involved make external evaluation problematic, the combination of practical satisfaction and theoretical stimulation makes the attempt to combine basic and applied research very rewarding. © 2013 The British Psychological Society.
A rapid boundary integral equation technique for protein electrostatics
NASA Astrophysics Data System (ADS)
Grandison, Scott; Penfold, Robert; Vanden-Broeck, Jean-Marc
2007-06-01
A new boundary integral formulation is proposed for the solution of electrostatic field problems involving piecewise uniform dielectric continua. Direct Coulomb contributions to the total potential are treated exactly and Green's theorem is applied only to the residual reaction field generated by surface polarisation charge induced at dielectric boundaries. The implementation shows significantly improved numerical stability over alternative schemes involving the total field or its surface normal derivatives. Although strictly respecting the electrostatic boundary conditions, the partitioned scheme does introduce a jump artefact at the interface. Comparison against analytic results in canonical geometries, however, demonstrates that simple interpolation near the boundary is a cheap and effective way to circumvent this characteristic in typical applications. The new scheme is tested in a naive model to successfully predict the ground state orientation of biomolecular aggregates comprising the soybean storage protein, glycinin.
Holyfield, Christine; Drager, Kathryn; Light, Janice; Caron, Jessica Gosnell
2017-01-01
Purpose: Augmentative and alternative communication (AAC) promotes communicative participation and language development for young children with complex communication needs. However, the motor, linguistic, and cognitive demands of many AAC technologies restrict young children's operational use of and influence over these technologies. The purpose of the current study is to better understand young children's participation in programming vocabulary “just in time” on an AAC application with minimized demands. Method: A descriptive study was implemented to highlight the participation of 10 typically developing toddlers (M age: 16 months, range: 10–22 months) in just-in-time vocabulary programming in an AAC app with visual scene displays. Results: All 10 toddlers participated in some capacity in adding new visual scene displays and vocabulary to the app just in time. Differences in participation across steps were observed, suggesting variation in the developmental demands of controls involved in vocabulary programming. Conclusions: Results from the current study provide clinical insights toward involving young children in AAC programming just in time and steps that may allow for more independent participation or require more scaffolding. Technology designed to minimize motor, cognitive, and linguistic demands may allow children to participate in programming devices at a younger age.
Judicial intervention in alcohol regulation: an empirical legal analysis.
Muhunthan, Janani; Angell, Blake; Wilson, Andrew; Reeve, Belinda; Jan, Stephen
2017-08-01
While governments draft law and policy to promote public health, it is through cases put before the judiciary that the implementation of law can be challenged and where its practical implications are typically determined. In this paper, we examine the role of court judgements on efforts in Australia to regulate the harmful use of alcohol. Australian case law (2010 to June 2015) involving the judicial review of administrative decisions relating to development applications or liquor licences for retail liquor outlets (bottle shops), hotels, pubs and clubs was identified using a case law database (WestLaw AU). Data were extracted and analysed using standard systematic review techniques. A total of 44 cases were included in the analysis. Of these, 90% involved appeals brought by industry actors against local or state government stakeholders seeking to reject applications for development applications and liquor licences. The proportion of judicial decisions resulting in outcomes in favour of industry was 77%. Public health research evidence appeared to have little or no influence, as there is no requirement for legislation to consider public health benefit. Implications for public health: A requirement that the impact on public health is considered in legislation will help to offset its strong pro-competition emphasis, which in turn has strongly influenced judicial decision making in this area. © 2017 The Authors.
241Am Ingrowth and Its Effect on Internal Dose
Konzen, Kevin
2016-07-01
Generally, plutonium has been manufactured to support commercial and military applications involving heat sources, weapons and reactor fuel. This work focuses on three typical plutonium mixtures, while observing the potential of 241Am ingrowth and its effect on internal dose. The term “ingrowth” is used to describe 241Am production due solely from the decay of 241Pu as part of a plutonium mixture, where it is initially absent or present in a smaller quantity. Dose calculation models do not account for 241Am ingrowth unless the 241Pu quantity is specified. This work suggested that 241Am ingrowth be considered in bioassay analysis when there is a potential of a 10% increase to the individual’s committed effective dose. It was determined that plutonium fuel mixtures, initially absent of 241Am, would likely exceed 10% for typical reactor grade fuel aged less than 30 years; however, heat source grade and aged weapons grade fuel would normally fall below this threshold. In conclusion, although this work addresses typical plutonium mixtures following separation, it may be extended to irradiated commercial uranium fuel and is expected to be a concern in the recycling of spent fuel.
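The ingrowth itself is the standard parent-daughter Bateman relation; a sketch using commonly tabulated half-lives (an assumption; dose conversion is deliberately omitted):

```python
import math

LN2 = math.log(2.0)
T_HALF_PU241 = 14.35   # years; commonly tabulated value (assumption)
T_HALF_AM241 = 432.2   # years; commonly tabulated value (assumption)

def am241_atoms(n_pu241_0, t_years):
    """Bateman solution for the 241Pu -> 241Am parent-daughter pair, assuming
    no 241Am is present at separation (t = 0). Returns 241Am atoms per
    n_pu241_0 initial atoms of 241Pu; converting to committed dose would
    additionally require activities and dose coefficients."""
    lam_p = LN2 / T_HALF_PU241
    lam_d = LN2 / T_HALF_AM241
    return (n_pu241_0 * lam_p / (lam_d - lam_p)
            * (math.exp(-lam_p * t_years) - math.exp(-lam_d * t_years)))
```

Because 241Pu decays much faster than 241Am, roughly half of an initially pure 241Pu inventory has become 241Am after one 241Pu half-life, and in this model the 241Am atom inventory peaks roughly 70 years after separation, consistent with the abstract's emphasis on fuel age.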
Complement Coercion: The Joint Effects of Type and Typicality.
Zarcone, Alessandra; McRae, Ken; Lenci, Alessandro; Padó, Sebastian
2017-01-01
Complement coercion (begin a book → reading) involves a type clash between an event-selecting verb and an entity-denoting object, triggering a covert event (reading). Two main factors involved in complement coercion have been investigated: the semantic type of the object (event vs. entity), and the typicality of the covert event (the author began a book → writing). In previous research, reading times have been measured at the object. However, the influence of the typicality of the subject-object combination on processing an aspectual verb such as begin has not been studied. Using a self-paced reading study, we manipulated semantic type and subject-object typicality, exploiting German word order to measure reading times at the aspectual verb. These variables interacted at the target verb. We conclude that both type and typicality probabilistically guide expectations about upcoming input. These results are compatible with an expectation-based view of complement coercion and language comprehension more generally in which there is rapid interaction between what is typically viewed as linguistic knowledge, and what is typically viewed as domain general knowledge about how the world works.
Applications of hybrid genetic algorithms in seismic tomography
NASA Astrophysics Data System (ADS)
Soupios, Pantelis; Akca, Irfan; Mpogiatzis, Petros; Basokur, Ahmet T.; Papazachos, Constantinos
2011-11-01
Almost all earth sciences inverse problems are nonlinear and involve a large number of unknown parameters, making the application of analytical inversion methods quite restrictive. In practice, most analytical methods are local in nature and rely on a linearized form of the problem equations, adopting an iterative procedure which typically employs partial derivatives in order to optimize the starting (initial) model by minimizing a misfit (penalty) function. Unfortunately, especially for highly non-linear cases, the final model strongly depends on the initial model, hence it is prone to solution-entrapment in local minima of the misfit function, while the derivative calculation is often computationally inefficient and creates instabilities when numerical approximations are used. An alternative is to employ global techniques which do not rely on partial derivatives, are independent of the misfit form and are computationally robust. Such methods employ pseudo-randomly generated models (sampling an appropriately selected section of the model space) which are assessed in terms of their data-fit. A typical example is the class of methods known as genetic algorithms (GA), which achieves the aforementioned approximation through model representation and manipulations, and has attracted the attention of the earth sciences community during the last decade, with several applications already presented for several geophysical problems. In this paper, we examine the efficiency of the combination of the typical regularized least-squares and genetic methods for a typical seismic tomography problem. The proposed approach combines a local (LOM) and a global (GOM) optimization method, in an attempt to overcome the limitations of each individual approach, such as local minima and slow convergence, respectively. 
The potential of both optimization methods is tested and compared, both independently and jointly, using several test models and synthetic refraction travel-time data sets that employ the same experimental geometry, wavelength and geometrical characteristics of the model anomalies. Moreover, real data from a crosswell tomographic project for the subsurface mapping of an ancient wall foundation are used for testing the efficiency of the proposed algorithm. The results show that the combined use of both methods can exploit the benefits of each approach, leading to improved final models and producing realistic velocity models, without significantly increasing the required computation time.
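The global-plus-local pairing can be miniaturized as follows. The GA operators, the shrinking compass search standing in for the regularized least-squares stage, and the straight-ray travel-time forward model are all illustrative simplifications, not the paper's algorithm:

```python
import random

def misfit(model, data, forward):
    """Sum of squared travel-time residuals (the penalty function)."""
    return sum((forward(model, x) - t) ** 2 for x, t in data)

def hybrid_invert(forward, data, bounds, pop=40, gens=60, seed=1):
    """Global-then-local hybrid in miniature: a plain genetic algorithm
    (elitism, averaging crossover, Gaussian mutation) explores the model
    space, then a shrinking compass search refines the best individual,
    standing in for the local regularized least-squares stage."""
    rng = random.Random(seed)
    dim = len(bounds)
    population = [[rng.uniform(lo, hi) for lo, hi in bounds]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda m: misfit(m, data, forward))
        parents = population[:pop // 2]                 # elitist selection
        children = []
        for _ in range(pop - len(parents)):
            a, b = rng.sample(parents, 2)
            child = [(ai + bi) / 2 for ai, bi in zip(a, b)]
            j = rng.randrange(dim)                       # mutate one gene
            lo, hi = bounds[j]
            child[j] = min(hi, max(lo, child[j] + rng.gauss(0, 0.1 * (hi - lo))))
            children.append(child)
        population = parents + children
    best = min(population, key=lambda m: misfit(m, data, forward))
    step = [0.1 * (hi - lo) for lo, hi in bounds]        # local refinement
    for _ in range(120):
        for j in range(dim):
            for s in (step[j], -step[j]):
                trial = list(best)
                trial[j] += s
                if misfit(trial, data, forward) < misfit(best, data, forward):
                    best = trial
        step = [0.85 * s for s in step]
    return best

# Toy inversion: recover velocity v and delay t0 from t = x/v + t0.
def straight_ray(model, x):
    v, t0 = model
    return x / v + t0

data = [(x, x / 2.5 + 0.1) for x in range(1, 9)]
best = hybrid_invert(straight_ray, data, bounds=[(0.5, 5.0), (0.0, 1.0)])
```

The division of labor mirrors the abstract: the global stage avoids entrapment in local minima and supplies a good starting model, while the derivative-free local stage supplies the fast final convergence.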
Automated Transition State Search and Its Application to Diverse Types of Organic Reactions.
Jacobson, Leif D; Bochevarov, Art D; Watson, Mark A; Hughes, Thomas F; Rinaldo, David; Ehrlich, Stephan; Steinbrecher, Thomas B; Vaitheeswaran, S; Philipp, Dean M; Halls, Mathew D; Friesner, Richard A
2017-11-14
Transition state search is at the center of multiple types of computational chemical predictions related to mechanistic investigations, reactivity and regioselectivity predictions, and catalyst design. The process of finding transition states in practice is, however, a laborious multistep operation that requires significant user involvement. Here, we report a highly automated workflow designed to locate transition states for a given elementary reaction with minimal setup overhead. The only essential inputs required from the user are the structures of the separated reactants and products. The seamless workflow combining computational technologies from the fields of cheminformatics, molecular mechanics, and quantum chemistry automatically finds the most probable correspondence between the atoms in the reactants and the products, generates a transition state guess, launches a transition state search through a combined approach involving the relaxing string method and the quadratic synchronous transit, and finally validates the transition state via the analysis of the reactive chemical bonds and imaginary vibrational frequencies as well as by the intrinsic reaction coordinate method. Our approach does not target any specific reaction type, nor does it depend on training data; instead, it is meant to be of general applicability for a wide variety of reaction types. The workflow is highly flexible, permitting modifications such as a choice of accuracy, level of theory, basis set, or solvation treatment. Successfully located transition states can be used for setting up transition state guesses in related reactions, saving computational time and increasing the probability of success. The utility and performance of the method are demonstrated in applications to transition state searches in reactions typical for organic chemistry, medicinal chemistry, and homogeneous catalysis research. 
In particular, applications of our code to Michael additions, hydrogen abstractions, Diels-Alder cycloadditions, carbene insertions, and an enzyme reaction model involving a molybdenum complex are shown and discussed.
Performance evaluation of a distance learning program.
Dailey, D J; Eno, K R; Brinkley, J F
1994-01-01
This paper presents a performance metric which uses a single number to characterize the response time for a non-deterministic client-server application operating over the Internet. When applied to a Macintosh-based distance learning application called the Digital Anatomist Browser, the metric allowed us to observe that "A typical student doing a typical mix of Browser commands on a typical data set will experience the same delay if they use a slow Macintosh on a local network or a fast Macintosh on the other side of the country accessing the data over the Internet." The methodology presented is applicable to other client-server applications that are rapidly appearing on the Internet.
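The paper's exact formula is not reproduced here; one plausible shape for such a single-number index is a mean response time weighted by a "typical" command mix, sketched below with hypothetical commands and timings:

```python
def weighted_response_time(command_mix, timings_s):
    """One-number index: each command's mean response time weighted by its
    share of a 'typical' command mix (a plausible construction for
    illustration; the paper's exact metric is not reproduced here)."""
    total = sum(command_mix.values())
    return sum((weight / total) * (sum(timings_s[cmd]) / len(timings_s[cmd]))
               for cmd, weight in command_mix.items())

# Hypothetical Browser-like command mix and measured latencies in seconds.
mix = {"load_image": 5, "rotate": 3, "label_query": 2}
measured = {"load_image": [1.8, 2.2, 2.0],
            "rotate": [0.4, 0.6],
            "label_query": [1.0, 1.0]}
index_s = weighted_response_time(mix, measured)
```

Collapsing the mix into one number is what allows the comparison quoted in the abstract, e.g. a slow local machine versus a fast machine accessing the data across the country.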
Human exposures to monomers resulting from consumer contact with polymers.
Leber, A P
2001-06-01
Many consumer products are composed completely, or in part, of polymeric materials. Direct or indirect human contact results in potential exposures to monomers as a result of migration of trace amounts from the polymeric matrix into foods, the skin, or other bodily surfaces. Typically, residual monomer levels in these polymers are <100 p.p.m., and represent exposures well below those observable in traditional toxicity testing. These product applications thus require alternative methods for evaluating health risks relating to monomer exposures. A typical approach includes: (a) assessment of potential human contacts for specific polymer uses; (b) utilization of data from toxicity testing of pure monomers, e.g. cancer bioassay results; and (c) mathematical risk assessment methods. Exposure potentials are measured by one of two analytical procedures: (1) migration of monomer from polymer into a simulant solvent (e.g. alcohol, acidic water, vegetable oil) appropriate for the intended use of the product (e.g. beer cans, food jars, packaging adhesive, dairy hose); or (2) total monomer content of the polymer, providing worst-case values for migratable monomer. Application of toxicity data typically involves NOEL or benchmark values for non-cancer endpoints, or tumorigenicity potencies for monomers demonstrated to be carcinogens. Risk assessments provide exposure 'safety margin' ratios between (1) levels projected to be safe according to toxicity information and (2) the potential monomer exposures posed by the intended use of the consumer product. This paper includes an example of a health risk assessment for a chewing gum polymer for which exposures to trace levels of butadiene monomer occur.
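The arithmetic behind such safety margins is simple; the numbers below are hypothetical illustrations, not the paper's butadiene assessment:

```python
def migration_exposure(conc_mg_per_kg_food, intake_kg_per_day, body_weight_kg):
    """Estimated daily dose from monomer migration: simulant concentration
    times daily food intake, normalized to body weight (mg/kg bw/day)."""
    return conc_mg_per_kg_food * intake_kg_per_day / body_weight_kg

def safety_margin(noel_mg_per_kg_day, exposure_mg_per_kg_day):
    """Ratio between a level projected to be safe (e.g., a NOEL-derived
    value) and the potential exposure posed by the product's intended use."""
    return noel_mg_per_kg_day / exposure_mg_per_kg_day

# Hypothetical numbers: 0.005 mg/kg migrating into 0.2 kg of food per day
# for a 60 kg consumer, compared against a 1 mg/kg/day NOEL.
exposure = migration_exposure(0.005, 0.2, 60.0)
margin = safety_margin(1.0, exposure)
```

A large margin (here several orders of magnitude) is the quantitative form of the "safety margin" conclusion the abstract describes; carcinogenic endpoints would instead use potency-based risk estimates.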
Disentangling patient and public involvement in healthcare decisions: why the difference matters.
Fredriksson, Mio; Tritter, Jonathan Q
2017-01-01
Patient and public involvement has become an integral aspect of many developed health systems and is judged to be an essential driver for reform. However, little attention has been paid to the distinctions between patients and the public, and the views of patients are often taken to encompass those of the general public. Using an ideal-type approach, we analyse crucial distinctions between patient involvement and public involvement using examples from Sweden and England. We highlight that patients have sectional interests as health service users, in contrast to citizens who engage as public policy agents reflecting societal interests. Patients draw on experiential knowledge, focus on output legitimacy and performance accountability, aim at typical representativeness, and expect direct responsiveness to individual needs and preferences. In contrast, the public contributes collective perspectives generated from diversity, and centres on input legitimacy achieved through statistical representativeness, democratic accountability and indirect responsiveness to general citizen preferences. Thus, using patients as proxies for the public fails to achieve the intended goals and benefits of involvement. We conclude that understanding and measuring the impact of patient and public involvement can only develop with the application of a clearer comprehension of the differences. © 2016 Foundation for the Sociology of Health & Illness.
Testing and Evaluation of Passive Radiation Detection Equipment for Homeland Security
West, David L.; Wood, Nathan L.; Forrester, Christina D.
2017-12-01
This article is concerned with test and evaluation methods for passive radiation detection equipment used in homeland security applications. The different types of equipment used in these applications are briefly reviewed, and then test and evaluation methods are discussed. The primary emphasis is on the test and evaluation standards developed by the American National Standards Institute’s N42 committees. Commonalities among the standards are then reviewed, as well as examples of unique aspects for specific equipment types. Throughout, sample test configurations and results from testing and evaluation at Oak Ridge National Laboratory are given. The article concludes with a brief discussion of typical tests and evaluations not covered by the N42 standards and some examples of test and evaluation that involve the end users of the equipment.
Selective Plasma Etching of Polymeric Substrates for Advanced Applications
Puliyalil, Harinarayanan; Cvelbar, Uroš
2016-01-01
In today’s nanoworld, there is a strong need to manipulate and process materials on an atom-by-atom scale with new tools such as reactive plasma, which in some states enables high selectivity of interaction between plasma species and materials. These interactions first involve preferential interactions with precise bonds in materials and later cause etching. This typically occurs based on material stability, which leads to preferential etching of one material over another. This process is especially interesting for polymeric substrates with increasing complexity and a “zoo” of bonds, which are used in numerous applications. In this comprehensive summary, we encompass the complete selective etching of polymers and polymer matrix micro-/nanocomposites with plasma and unravel the mechanisms behind the scenes, which ultimately leads to the enhancement of surface properties and device performance. PMID:28335238
From astronomy and telecommunications to biomedicine
NASA Astrophysics Data System (ADS)
Behr, Bradford B.; Baker, Scott A.; Bismilla, Yusuf; Cenko, Andrew T.; DesRoches, Brandon; Hajian, Arsen R.; Meade, Jeffrey T.; Nitkowski, Arthur; Preston, Kyle J.; Schmidt, Bradley S.; Sherwood-Droz, Nicolás.; Slaa, Jared
2015-03-01
Photonics is an inherently interdisciplinary endeavor, as technologies and techniques invented or developed in one scientific field are often found to be applicable to other fields or disciplines. We present two case studies in which optical spectroscopy technologies originating from stellar astrophysics and optical telecommunications multiplexing have been successfully adapted for biomedical applications. The first case involves a design concept called the High Throughput Virtual Slit, or HTVS, which provides high spectral resolution without the throughput inefficiency typically associated with a narrow spectrometer slit. HTVS-enhanced spectrometers have been found to significantly improve the sensitivity and speed of fiber-fed Raman analysis systems, and the method is now being adapted for hyperspectral imaging for medical and biological sensing. The second example of technology transfer into biomedicine centers on integrated optics, in which optical waveguides are fabricated onto silicon substrates in substantially the same fashion as integrated circuits in computer chips. We describe an architecture referred to as OCTANE which implements a small and robust "spectrometer-on-a-chip" optimized for optical coherence tomography (OCT). OCTANE-based OCT systems deliver three-dimensional imaging resolution at the micron scale with greater stability and lower cost than equivalent conventional OCT approaches. Both HTVS and OCTANE enable higher precision and improved reliability under environmental conditions that are typically found in a clinical or laboratory setting.
Improving P2P live-content delivery using SVC
NASA Astrophysics Data System (ADS)
Schierl, T.; Sánchez, Y.; Hellge, C.; Wiegand, T.
2010-07-01
P2P content delivery techniques for video transmission have attracted considerable interest in recent years. By involving clients in the delivery process, P2P approaches can significantly reduce the load and cost on servers, especially for popular services. However, previous studies have already pointed out the unreliability of P2P-based live streaming approaches due to peer churn, where peers may ungracefully leave the P2P infrastructure, typically an overlay network. Peers ungracefully leaving the system cause connection losses in the overlay, which require repair operations. During such repair operations, which typically take a few round-trip times, no data is received over the lost connection. Taking low delay for fast channel tune-in into account as a key feature for broadcast-like streaming applications, a P2P live streaming approach can only rely on a limited media pre-buffer during such repair operations. In this paper, multi-tree-based Application Layer Multicast is considered as a P2P overlay technique for live streaming. The use of Flow Forwarding (FF), a.k.a. Retransmission, or Forward Error Correction (FEC) in combination with Scalable Video Coding (SVC) for concealment during overlay repair operations is shown. Furthermore, the benefits of using SVC over AVC single-layer transmission are presented.
NASA Astrophysics Data System (ADS)
Cauquil, Jean-Marc; Martin, Jean-Yves; Bruins, Peter; Benschop, A. A. J.
2003-01-01
The lifetime tests realised on the serial production of Rotary Monobloc RM2 coolers show a measured MTTF of 4900 hours. The conventional test profile applied to these coolers is representative of operation in a typical application. The duration of such lifetime tests is very long: the results of a design change and its impact on MTTF are available only several months after the assembly of the prototypes. We decided to develop a test method to reduce the duration of these lifetime tests. The principle is to define a test protocol that is easy to implement and more severe than the typical application profile, in order to accelerate lifetime tests. The accelerated test profile was defined and tested successfully. This new technique allows us to reduce lifetime test costs and duration, and thus the costs involved. As a consequence, we decided to screen our production with this accelerated test. This allows us to continuously monitor the quality of our serial products and to collect additional data. This paper presents the results of lifetime tests performed on RM2 coolers according to the conventional and accelerated test profiles, as well as the first results on the new RM2 design, which show a calculated MTTF of 10000 hours.
Havelaar, A H; De Hollander, A E; Teunis, P F; Evers, E G; Van Kranen, H J; Versteegh, J F; Van Koten, J E; Slob, W
2000-01-01
To evaluate the applicability of disability adjusted life-years (DALYs) as a measure to compare positive and negative health effects of drinking water disinfection, we conducted a case study involving a hypothetical drinking water supply from surface water. This drinking water supply is typical in The Netherlands. We compared the reduction of the risk of infection with Cryptosporidium parvum by ozonation of water to the concomitant increase in risk of renal cell cancer arising from the production of bromate. We applied clinical, epidemiologic, and toxicologic data on morbidity and mortality to calculate the net health benefit in DALYs. We estimated the median risk of infection with C. parvum as 10(-3)/person-year. Ozonation reduces the median risk in the baseline approximately 7-fold, but bromate is produced in a concentration above current guideline levels. However, the health benefits of preventing gastroenteritis in the general population and premature death in patients with acquired immunodeficiency syndrome outweigh health losses by premature death from renal cell cancer by a factor of > 10. The net benefit is approximately 1 DALY/million person-years. The application of DALYs in principle allows us to more explicitly compare the public health risks and benefits of different management options. In practice, the application of DALYs may be hampered by the substantial degree of uncertainty, as is typical for risk assessment. PMID:10753089
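The DALY comparison amounts to netting health losses against gains on one common scale; the figures below are illustrative placeholders only (the study itself reports a net benefit of roughly 1 DALY/million person-years):

```python
def net_benefit_dalys(dalys_averted, dalys_caused):
    """Net health benefit on a common scale: DALYs averted by an
    intervention minus the DALYs it causes as a side effect."""
    return dalys_averted - dalys_caused

# Hypothetical per-million-person-year figures: gastroenteritis and AIDS
# deaths averted by ozonation vs. renal cell cancer caused by bromate
averted = 1.2
caused = 0.1
net = net_benefit_dalys(averted, caused)
print(net > 0)  # prints True: the intervention nets out beneficial
```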
Visualization of energy: light dose indicator based on electrochromic gyroid nano-materials
NASA Astrophysics Data System (ADS)
Wei, Di; Scherer, Maik R. J.; Astley, Michael; Steiner, Ullrich
2015-06-01
The typical applications of electrochromic devices do not make use of the charge-dependent, gradual optical response, owing to their slow voltage-sensitive coloration. However, in this paper we present a design for a reusable, self-powered light dose indicator consisting of a solar cell and a gyroid-structured nickel oxide (NiO) electrochromic display that measures the cumulative charge per se, making use of the efficient voltage-sensitive coloration of gyroid materials. To circumvent the stability issues associated with the standard aqueous electrolyte, which typically suffers from water splitting and gas evolution, we investigate a novel nano-gyroid NiO electrochromic device based on the organic solvent 1,1,1,3,3,3-hexafluoropropan-2-ol and the room temperature ionic liquid (RTIL) triethylsulfonium bis(trifluoromethylsulfonyl) imide ([SET3][TFSI]) containing lithium bis(trifluoromethylsulfonyl) imide. We show that an effective light dose indicator can be enabled by nano-gyroid NiO with RTIL; this proves to be a reliable device since it does not involve solvent degradation or gas generation.
NASA Astrophysics Data System (ADS)
Gong, Rui; Wang, Qing; Shao, Xiaopeng; Zhou, Conghao
2016-12-01
This study aims to expand the applications of color appearance models to representing the perceptual attributes of digital images, supplying more accurate methods for predicting image brightness and image colorfulness. Two typical models, the CIELAB model and CIECAM02, were involved in developing algorithms to predict brightness and colorfulness for various images, in which three methods were designed to handle pixels of different color contents. Moreover, massive visual data were collected from psychophysical experiments on two mobile displays under three lighting conditions to analyze the characteristics of visual perception of these two attributes and to test the prediction accuracy of each algorithm. Afterward, detailed analyses revealed that image brightness and image colorfulness were predicted well by calculating the CIECAM02 parameters of lightness and chroma; thus, the suitable methods for dealing with different color pixels were determined for image brightness and image colorfulness, respectively. This study supplies an example of extending color appearance models to describe image perception.
Oral Storytelling as Evidence of Pedagogy in Forager Societies.
Scalise Sugiyama, Michelle
2017-01-01
Teaching is reportedly rare in hunter-gatherer societies, raising the question of whether it is a species-typical trait in humans. A problem with past studies is that they tend to conceptualize teaching in terms of Western pedagogical practices. In contrast, this study proceeds from the premise that teaching requires the ostensive manifestation of generalizable knowledge: the teacher must signal intent to share information, indicate the intended recipient, and transmit knowledge that is applicable beyond the present context. Certain features of human communication appear to be ostensive in function (e.g., eye contact, pointing, contingency, prosodic variation), and collectively serve as "natural pedagogy." Tellingly, oral storytelling in forager societies typically employs these and other ostensive behaviors, and is widely reported to be an important source of generalizable ecological and social knowledge. Despite this, oral storytelling has been conspicuously overlooked in studies of teaching in preliterate societies. Accordingly, this study presents evidence that oral storytelling involves the use of ostension and the transmission of generic knowledge, thereby meeting the criteria of pedagogy.
How Ag Nanospheres Are Transformed into AgAu Nanocages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreau, Liane M.; Schurman, Charles A.; Kewalramani, Sumit
Bimetallic hollow, porous noble metal nanoparticles are of broad interest for biomedical, optical and catalytic applications. The most straightforward method for preparing such structures involves the reaction between HAuCl4 and well-formed Ag particles, typically spheres, cubes, or triangular prisms, yet the mechanism underlying their formation is poorly understood at the atomic scale. By combining in situ nanoscopic and atomic-scale characterization techniques (XAFS, SAXS, XRF, and electron microscopy) to follow the process, we elucidate a plausible reaction pathway for the conversion of citrate-capped Ag nanospheres to AgAu nanocages; importantly, the hollowing event cannot be explained by the nanoscale Kirkendall effect, nor by Galvanic exchange alone, two processes that have been previously proposed. We propose a modification of the bulk Galvanic exchange process that takes into account considerations that can only occur with nanoscale particles. This nanoscale Galvanic exchange process explains the novel morphological and chemical changes associated with the typically observed hollowing process.
Case for diagnosis. Systemic light chain amyloidosis with cutaneous involvement*
Gontijo, João Renato Vianna; Pinto, Jackson Machado; de Paula, Maysa Carla
2017-01-01
Systemic light chain amyloidosis is a rare disease. Due to its typical cutaneous lesions, dermatologists play an essential role in its diagnosis. Clinical manifestations vary according to the affected organ and are often unspecific. Definitive diagnosis is achieved through biopsy. We report a patient with palpebral amyloidosis, typical bilateral ecchymoses and cardiac involvement, without plasma cell dyscrasia or lymphomas. The patient died shortly after the diagnosis. PMID:29166521
Reimers, Jeffrey R; McKemmish, Laura K; McKenzie, Ross H; Hush, Noel S
2015-10-14
While diabatic approaches are ubiquitous for the understanding of electron-transfer reactions and have been mooted as being of general relevance, alternate applications have not been able to unify the same wide range of observed spectroscopic and kinetic properties. The cause of this is identified as the fundamentally different orbital configurations involved: charge-transfer phenomena involve typically either 1 or 3 electrons in two orbitals whereas most reactions are typically closed shell. As a result, two vibrationally coupled electronic states depict charge-transfer scenarios whereas three coupled states arise for closed-shell reactions of non-degenerate molecules and seven states for the reactions implicated in the aromaticity of benzene. Previous diabatic treatments of closed-shell processes have considered only two arbitrarily chosen states as being critical, mapping these states to those for electron transfer. We show that such effective two-state diabatic models are feasible but involve renormalized electronic coupling and vibrational coupling parameters, with this renormalization being property dependent. With this caveat, diabatic models are shown to provide excellent descriptions of the spectroscopy and kinetics of the ammonia inversion reaction, proton transfer in N2H7(+), and aromaticity in benzene. This allows for the development of a single simple theory that can semi-quantitatively describe all of these chemical phenomena, as well as of course electron-transfer reactions. It forms a basis for understanding many technologically relevant aspects of chemical reactions, condensed-matter physics, chemical quantum entanglement, nanotechnology, and natural or artificial solar energy capture and conversion.
Current opinion in Alzheimer's disease therapy by nanotechnology-based approaches.
Ansari, Shakeel Ahmed; Satar, Rukhsana; Perveen, Asma; Ashraf, Ghulam Md
2017-03-01
Nanotechnology typically deals with the measuring and modeling of matter at nanometer scale by incorporating the fields of engineering and technology. The most prominent feature of these engineered materials involves their manipulation/modification for imparting new functional properties. The current review covers the most recent findings of Alzheimer's disease (AD) therapeutics based on nanoscience and technology. Current studies involve the application of nanotechnology in developing novel diagnostic and therapeutic tools for neurological disorders. Nanotechnology-based approaches can be exploited for limiting/reversing these diseases for promoting functional regeneration of damaged neurons. These strategies offer neuroprotection by facilitating the delivery of drugs and small molecules more effectively across the blood-brain barrier. Nanotechnology based approaches show promise in improving AD therapeutics. Further replication work on synthesis and surface modification of nanoparticles, longer-term clinical trials, and attempts to increase their impact in treating AD are required.
Optimization of Operations Resources via Discrete Event Simulation Modeling
NASA Technical Reports Server (NTRS)
Joshi, B.; Morris, D.; White, N.; Unal, R.
1996-01-01
The resource levels required for operation and support of reusable launch vehicles are typically defined through discrete event simulation modeling. Minimizing these resources constitutes an optimization problem involving discrete variables and simulation. Conventional approaches to solving such optimization problems involving integer-valued decision variables are pattern search and statistical methods. However, in a simulation environment characterized by search spaces of unknown topology and stochastic measures, these optimization approaches often prove inadequate. In this paper, we have explored the applicability of genetic algorithms to the simulation domain. Genetic algorithms provide a robust search strategy that does not require continuity and differentiability of the problem domain. The genetic algorithm successfully minimized the operation and support activities for a space vehicle, through a discrete event simulation model. The practical issues associated with simulation optimization, such as stochastic variables and constraints, were also taken into consideration.
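The approach can be sketched as a genetic algorithm whose fitness function is a (noisy) simulation run; the quadratic toy landscape below stands in for the real discrete event model, whose variables and constraints the abstract does not specify:

```python
import random

def simulate_cost(levels):
    """Toy stand-in for a discrete event simulation run: returns a noisy
    cost estimate for a vector of integer resource levels. The quadratic
    landscape and noise model are invented for illustration."""
    base = sum((lv - 3) ** 2 for lv in levels)
    return base + random.gauss(0, 0.1)  # stochastic measure, as in simulation

def genetic_minimize(n_vars=4, pop_size=20, generations=40, bounds=(0, 10)):
    """Genetic algorithm over integer decision variables: no continuity or
    differentiability of the cost landscape is assumed."""
    rng = random.Random(42)
    pop = [[rng.randint(*bounds) for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=simulate_cost)
        parents = scored[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_vars)          # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                  # mutation keeps values integer
                child[rng.randrange(n_vars)] = rng.randint(*bounds)
            children.append(child)
        pop = parents + children                    # elitism: parents survive
    return min(pop, key=simulate_cost)

best = genetic_minimize()
print(best)  # resource levels found by the search
```

Because selection only compares costs, the noise in each simulated evaluation perturbs rankings without requiring any smoothness of the objective.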
Ephus: Multipurpose Data Acquisition Software for Neuroscience Experiments
Suter, Benjamin A.; O'Connor, Timothy; Iyer, Vijay; Petreanu, Leopoldo T.; Hooks, Bryan M.; Kiritani, Taro; Svoboda, Karel; Shepherd, Gordon M. G.
2010-01-01
Physiological measurements in neuroscience experiments often involve complex stimulus paradigms and multiple data channels. Ephus (http://www.ephus.org) is an open-source software package designed for general-purpose data acquisition and instrument control. Ephus operates as a collection of modular programs, including an ephys program for standard whole-cell recording with single or multiple electrodes in typical electrophysiological experiments, and a mapper program for synaptic circuit mapping experiments involving laser scanning photostimulation based on glutamate uncaging or channelrhodopsin-2 excitation. Custom user functions allow user-extensibility at multiple levels, including on-line analysis and closed-loop experiments, where experimental parameters can be changed based on recently acquired data, such as during in vivo behavioral experiments. Ephus is compatible with a variety of data acquisition and imaging hardware. This paper describes the main features and modules of Ephus and their use in representative experimental applications. PMID:21960959
M-DAS: System for multispectral data analysis. [in Saginaw Bay, Michigan
NASA Technical Reports Server (NTRS)
Johnson, R. H.
1975-01-01
M-DAS is a ground data processing system designed for analysis of multispectral data. M-DAS operates on multispectral data from LANDSAT, S-192, M2S and other sources in CCT form. Interactive training by operator-investigators using a variable cursor on a color display was used to derive optimum processing coefficients and data on cluster separability. An advanced multivariate normal-maximum likelihood processing algorithm was used to produce output in various formats: color-coded film images, geometrically corrected map overlays, moving displays of scene sections, coverage tabulations and categorized CCTs. The analysis procedure for M-DAS involves three phases: (1) screening and training, (2) analysis of training data to compute performance predictions and processing coefficients, and (3) processing of multichannel input data into categorized results. Typical M-DAS applications involve iteration between each of these phases. A series of photographs of the M-DAS display are used to illustrate M-DAS operation.
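The multivariate normal maximum-likelihood step in phase (3) can be sketched as below; the class names, band values, and training signatures are invented for illustration and are not M-DAS data:

```python
import numpy as np

def train_ml(classes):
    """Fit one multivariate normal per class from training pixels
    (rows = pixels, columns = spectral bands)."""
    stats = {}
    for name, X in classes.items():
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False)
        stats[name] = (mu, np.linalg.inv(cov), np.log(np.linalg.det(cov)))
    return stats

def classify(pixel, stats):
    """Assign the class with the highest Gaussian log-likelihood
    (constant terms dropped, equal priors assumed)."""
    best, best_ll = None, -np.inf
    for name, (mu, inv_cov, log_det) in stats.items():
        d = pixel - mu
        ll = -0.5 * (log_det + d @ inv_cov @ d)
        if ll > best_ll:
            best, best_ll = name, ll
    return best

# Invented two-band training signatures for two cover classes
rng = np.random.default_rng(0)
train = {
    "water":  rng.normal([20.0, 15.0], 2.0, size=(50, 2)),
    "forest": rng.normal([60.0, 80.0], 2.0, size=(50, 2)),
}
stats = train_ml(train)
print(classify(np.array([21.0, 14.0]), stats))  # prints water
```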
Emerging Opportunities for Serotypes of Botulinum Neurotoxins
Peng Chen, Zhongxing; Morris, J. Glenn; Rodriguez, Ramon L.; Shukla, Aparna Wagle; Tapia-Núñez, John; Okun, Michael S.
2012-01-01
Background: Two decades ago, botulinum neurotoxin (BoNT) type A was introduced to the commercial market. Subsequently, the toxin was approved by the FDA to address several neurological syndromes, involving muscle, nerve, and gland hyperactivity. These syndromes have typically been associated with abnormalities in cholinergic transmission. Despite the multiplicity of botulinal serotypes (designated as types A through G), therapeutic preparations are currently only available for BoNT types A and B. However, other BoNT serotypes are under study for possible clinical use and new clinical indications; Objective: To review the current research on botulinum neurotoxin serotypes A-G, and to analyze potential applications within basic science and clinical settings; Conclusions: The increasing understanding of botulinal neurotoxin pathophysiology, including the neurotoxin’s effects on specific neuronal populations, will help us in tailoring treatments for specific diagnoses, symptoms and patients. Scientists and clinicians should be aware of the full range of available data involving neurotoxin subtypes A-G. PMID:23202312
Femtosecond Laser Fabrication of Monolithically Integrated Microfluidic Sensors in Glass
He, Fei; Liao, Yang; Lin, Jintian; Song, Jiangxin; Qiao, Lingling; Cheng, Ya; Sugioka, Koji
2014-01-01
Femtosecond lasers have revolutionized the processing of materials, since their ultrashort pulse width and extremely high peak intensity allows high-quality micro- and nanofabrication of three-dimensional (3D) structures. This unique capability opens up a new route for fabrication of microfluidic sensors for biochemical applications. The present paper presents a comprehensive review of recent advancements in femtosecond laser processing of glass for a variety of microfluidic sensor applications. These include 3D integration of micro-/nanofluidic, optofluidic, electrofluidic, surface-enhanced Raman-scattering devices, in addition to fabrication of devices for microfluidic bioassays and lab-on-fiber sensors. This paper describes the unique characteristics of femtosecond laser processing and the basic concepts involved in femtosecond laser direct writing. Advanced spatiotemporal beam shaping methods are also discussed. Typical examples of microfluidic sensors fabricated using femtosecond lasers are then highlighted, and their applications in chemical and biological sensing are described. Finally, a summary of the technology is given and the outlook for further developments in this field is considered. PMID:25330047
Scalable DB+IR Technology: Processing Probabilistic Datalog with HySpirit.
Frommholz, Ingo; Roelleke, Thomas
2016-01-01
Probabilistic Datalog (PDatalog, proposed in 1995) is a probabilistic variant of Datalog and a nice conceptual idea to model Information Retrieval in a logical, rule-based programming paradigm. Making PDatalog work in real-world applications requires more than probabilistic facts and rules, and the semantics associated with the evaluation of the programs. We report in this paper some of the key features of the HySpirit system required to scale the execution of PDatalog programs. Firstly, there is the requirement to express probability estimation in PDatalog. Secondly, fuzzy-like predicates are required to model vague predicates (e.g. vague match of attributes such as age or price). Thirdly, to handle large data sets there are scalability issues to be addressed, and therefore HySpirit provides probabilistic relational indexes and parallel and distributed processing. The main contribution of this paper is a consolidated view on the methods of the HySpirit system to make PDatalog applicable in real-scale applications that involve a wide range of requirements typical for data (information) management and analysis.
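A minimal sketch of probabilistic rule evaluation, assuming independent facts so conjunction probabilities multiply (the predicates, documents, and probabilities are invented, and this is not HySpirit's actual evaluation engine):

```python
# Probability-weighted facts: P(term(d1, ir)) = 0.8, P(term(d1, db)) = 0.5
facts = {("term", "d1", "ir"): 0.8, ("term", "d1", "db"): 0.5}

def rule_about(doc, topic_a, topic_b):
    """about(D) :- term(D, A), term(D, B) -- the conjunction evaluated
    under the usual independence assumption, so probabilities multiply."""
    p_a = facts.get(("term", doc, topic_a), 0.0)
    p_b = facts.get(("term", doc, topic_b), 0.0)
    return p_a * p_b

print(rule_about("d1", "ir", "db"))  # 0.8 * 0.5 = 0.4
```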
Perspective: Evolutionary design of granular media and block copolymer patterns
NASA Astrophysics Data System (ADS)
Jaeger, Heinrich M.; de Pablo, Juan J.
2016-05-01
The creation of new materials "by design" is a process that starts from desired materials properties and proceeds to identify requirements for the constituent components. Such a process is challenging because it inverts the typical modeling approach, which starts from given micro-level components to predict macro-level properties. We describe how to tackle this inverse problem using concepts from evolutionary computation. These concepts have widespread applicability and open up new opportunities for design as well as discovery. Here we apply them to design tasks involving two very different classes of soft materials, shape-optimized granular media and nanopatterned block copolymer thin films.
Impact of Site Elevation on Mg Smelter Design
NASA Astrophysics Data System (ADS)
Baker, Phillip W.
Site elevation has many surprising and significant impacts on the engineering design of metallurgical plants of all types. Electrolytic magnesium smelters may be built at high elevation for a variety of reasons, including the availability of raw materials, energy or electric power. Because of the unit processes they typically involve, Mg smelters can be extensively impacted by site elevation. In this paper, generic examples of the design changes required to adapt a smelter originally designed for sea level to operate at 2700 m are presented. While the examples are drawn from a magnesium plant design case, these changes are generically applicable to all industrial plants utilizing similar unit processes, irrespective of product.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrada, J.J.; Osborne-Lee, I.W.; Grizzaffi, P.A.
Expert systems are known to be useful in capturing expertise and applying knowledge to chemical engineering problems such as diagnosis, process control, process simulation, and process advisory. However, expert system applications are traditionally limited to knowledge domains that are heuristic and involve only simple mathematics. Neural networks, on the other hand, represent an emerging technology capable of rapid recognition of patterned behavior without regard to mathematical complexity. Although useful in problem identification, neural networks are not very efficient in providing in-depth solutions and typically do not promote full understanding of the problem or the reasoning behind its solutions. Hence, applications of neural networks have certain limitations. This paper explores the potential for expanding the scope of chemical engineering areas where neural networks might be utilized by incorporating expert systems and neural networks into the same application, a process called hybridization. In addition, hybrid applications are compared with those using more traditional approaches, the results of the different applications are analyzed, and the feasibility of converting the preliminary prototypes described herein into useful final products is evaluated. 12 refs., 8 figs.
Exposure to TCDD from base perimeter application of Agent Orange in Vietnam.
Ross, John H; Hewitt, Andrew; Armitage, James; Solomon, Keith; Watkins, Deborah K; Ginevan, Michael E
2015-04-01
Using recognized methods routinely employed by pesticide regulatory agencies, the exposures of military personnel who were mixer/loader/applicators (M/L/A) of Agent Orange (AO) for perimeter foliage at bases during the Vietnam War were estimated. From the fraction of TCDD in AO, the absorbed dosage of the manufacturing contaminant was estimated. Dermal exposure from spray drift to residents of the bases was calculated using internationally recognized software that accounted for proximity, foliar density of the application site, droplet size and wind speed, among other factors, and produced estimates of deposition. Those who directly handled AO generally had much higher exposures than those further from the areas of use. The differences in exposure potential varied by M/L/A activity, but were typically orders of magnitude greater than those of bystanders. However, even the most-exposed M/L/A involved in perimeter application had lifetime exposures comparable to persons living in the U.S. at the time, i.e., ~1.3 to 5 pg TCDD/kg bodyweight. Copyright © 2014 Elsevier B.V. All rights reserved.
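The deposition-to-dose step can be sketched as below; every input value is a placeholder for illustration, not one of the paper's estimates:

```python
def absorbed_dose_pg_per_kg(deposition_ug_cm2, skin_area_cm2,
                            dermal_absorption, bodyweight_kg):
    """Dermal dose from spray drift: deposition on exposed skin times the
    fraction absorbed, normalized to bodyweight. All inputs hypothetical."""
    absorbed_ug = deposition_ug_cm2 * skin_area_cm2 * dermal_absorption
    return absorbed_ug * 1e6 / bodyweight_kg  # micrograms -> picograms

# Illustrative numbers only: deposition per cm^2, exposed skin area,
# dermal absorption fraction, and bodyweight
dose = absorbed_dose_pg_per_kg(1e-7, 2000, 0.03, 70)
print(round(dose, 3))  # prints 0.086 (pg/kg bodyweight)
```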
Performance of a Half-Heusler Thermoelectric Generator for Automotive Application
Szybist, James; Davis, Steven; Thomas, John; ...
2018-04-03
Thermoelectric generators (TEGs) have been researched and developed for harvesting energy from otherwise wasted heat. For automotive applications this will most likely involve using internal combustion engine exhaust as the heat source, with the TEG positioned after the catalyst system. Applications to exhaust gas recirculation systems and compressed air coolers have also been suggested. A thermoelectric generator based on half-Heusler thermoelectric materials was developed, engineered, and fabricated, targeting a gasoline passenger sedan application. This generator was installed on a gasoline engine exhaust system in a dynamometer cell, and positioned immediately downstream of the close-coupled three-way catalyst. The generator was characterized using a matrix of steady-state conditions representing the important portions of the engine map. Detailed performance results are presented. Measurements indicate the generator can produce over 300 W of power with 900 °C exhaust at relatively high flow rates, but less than 50 W when the exhaust is 600 °C and at lower flow rates. The latter condition is typical of standard test cycles and most driving scenarios.
Applications and Mechanisms of Ionic Liquids in Whole-Cell Biotransformation
Fan, Lin-Lin; Li, Hong-Ji; Chen, Qi-He
2014-01-01
Ionic liquids (ILs), entirely composed of cations and anions, are liquid solvents at room temperature. They are interesting due to their low vapor pressure, high polarity and thermostability, and also for the possibility to fine-tune their physicochemical properties through modification of the chemical structures of their cations or anions. In recent years, ILs have been widely used in biotechnological fields involving whole-cell biotransformations of biodiesel or biomass, and organic compound synthesis with cells. Research in these fields has increased over the past decades, and compared to typical solvents, ILs are the most promising alternative solvents for cell biotransformations. However, there are growing limitations and new challenges in whole-cell biotransformations with ILs. There is little understanding of the mechanisms of ILs' interactions with cells, and much remains to be clarified. Further investigations are required to overcome the drawbacks of their applications and to broaden their application spectrum. This work mainly reviews the applications of ILs in whole-cell biotransformations, and the possible mechanisms of ILs in microbial cell biotransformation are proposed and discussed. PMID:25007820
Recent advances in superhydrophobic surfaces and their relevance to biology and medicine.
Ciasca, G; Papi, M; Businaro, L; Campi, G; Ortolani, M; Palmieri, V; Cedola, A; De Ninno, A; Gerardino, A; Maulucci, G; De Spirito, M
2016-02-04
By mimicking naturally occurring superhydrophobic surfaces, scientists can now realize artificial surfaces on which droplets of a few microliters of water are forced to assume an almost spherical shape and an extremely high contact angle. In recent decades, these surfaces have attracted much attention due to their technological applications for anti-wetting and self-cleaning materials. Very recently, researchers have shifted their interest to investigate whether superhydrophobic surfaces can be exploited to study biological systems. This research effort has stimulated the design and realization of new devices that allow us to actively organize, visualize and manipulate matter at both the microscale and nanoscale levels. Such precise control opens up wide applications in biomedicine, as it allows us to directly manipulate objects at the typical length scale of cells and macromolecules. This progress report focuses on recent biological and medical applications of superhydrophobicity. Particular regard is paid to those applications that involve the detection, manipulation and study of extremely small quantities of molecules, and to those that allow high throughput cell and biomaterial screening.
NASA Astrophysics Data System (ADS)
Aaronson, Judith N.; Nablo, Sam V.
1985-05-01
Self-shielded electron accelerators have been successfully used in industry for more than ten years. One of the important advantages of these machines is their compactness for easy adaptation to conventional coating and product finishing machinery. It is equally important that these machines qualify for use under "unrestricted" conditions as specified by OSHA. The shielding and product handling configurations which make this unrestricted designation possible for operating voltages under 300 kV are discussed. Thin film dosimetry techniques used for the determination of the machine performance parameters are discussed along with the rotary scanner techniques employed for the dose rate studies which are important in the application of these processors. Paper and wood coatings, which are important industrial applications involving electron initiated polymerization, are reviewed. The sterilization and disinfestation applications are also discussed. The increasing concern of these industries for the more efficient use of energy and for compliance with more stringent pollution regulations, coupled with the novel processes this energy source makes possible, assure a bright future for this developing technology.
[Local involvement of the optic nerve by acute lymphoblastic leukemia].
Bernardczyk-Meller, Jadwiga; Stefańska, Katarzyna
2005-01-01
The leukemias quite commonly involve the eyes and adnexa, and in some cases cause visual complaints. Both the anterior chamber of the eye and the posterior portion of the globe may be sites of acute or chronic leukemia and of leukemic relapse. We report a unique case of a 14-year-old leukemic patient who suffered visual loss and papilloedema due to unilateral local involvement of the optic nerve during a second relapse of acute lymphocytic leukemia. Despite typical treatment of the main disease, the boy died. The authors also present the typical ophthalmic features of leukemia.
Arab, Ali; Holan, Scott H.; Wikle, Christopher K.; Wildhaber, Mark L.
2012-01-01
Ecological studies involving counts of abundance, presence–absence or occupancy rates often produce data having a substantial proportion of zeros. Furthermore, these types of processes are typically multivariate and only adequately described by complex nonlinear relationships involving externally measured covariates. Ignoring these aspects of the data and implementing standard approaches can lead to models that fail to provide adequate scientific understanding of the underlying ecological processes, possibly resulting in a loss of inferential power. One method of dealing with data having excess zeros is to consider the class of univariate zero-inflated generalized linear models. However, this class of models fails to address the multivariate and nonlinear aspects associated with the data usually encountered in practice. Therefore, we propose a semiparametric bivariate zero-inflated Poisson model that takes into account both of these data attributes. The general modeling framework is hierarchical Bayes and is suitable for a broad range of applications. We demonstrate the effectiveness of our model through a motivating example on modeling catch per unit area for multiple species using data from the Missouri River Benthic Fishes Study, implemented by the United States Geological Survey.
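The zero-inflated Poisson likelihood that such models build on can be stated compactly. A minimal univariate sketch (the function name and toy counts are illustrative, not from the Benthic Fishes Study; the paper's actual model is bivariate, semiparametric and hierarchical Bayes):

```python
import numpy as np
from math import lgamma

def zip_loglik(y, lam, pi):
    """Log-likelihood of a univariate zero-inflated Poisson:
    P(0) = pi + (1 - pi) * exp(-lam);  P(k > 0) = (1 - pi) * Poisson(k; lam)."""
    ll = 0.0
    for k in y:
        if k == 0:
            # zeros mix two sources: structural zeros and Poisson-sampling zeros
            ll += np.log(pi + (1 - pi) * np.exp(-lam))
        else:
            # positive counts come only from the Poisson component
            ll += np.log(1 - pi) - lam + k * np.log(lam) - lgamma(k + 1)
    return ll

counts = [0, 0, 0, 1, 2, 0, 4]   # excess zeros relative to a plain Poisson
print(zip_loglik(counts, lam=1.5, pi=0.3))
```

With pi = 0 the expression reduces to the ordinary Poisson log-likelihood, which is why the zero-inflated class strictly generalizes the standard GLM mentioned in the abstract.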
NASA Astrophysics Data System (ADS)
Kass, R.; Kass, J.
On February 7, 1994, four Canadian Astronauts were sealed off in a hyperbaric chamber at the Canadian Government's Defense and Civil Institute for Environmental Medicine in Toronto, Canada. This space lab training mission lasted seven days and was the first to be conducted with astronauts outside of Russia. The objective of this mission was to give Canadian astronauts, space scientists and the staff of the Canadian Space Agency (CSA) the opportunity to gain first-hand experience of the preparatory and operational aspects of a typical space mission. Twenty-one scientific experiments from several disciplines, involving six countries, were conducted during this mission. This paper describes the goals and preliminary results of a psychological experiment/training program that used the CAPSULS mission as a test bed for its application in the manned space flight environment. The objective of this project was to enhance the understanding of small group behaviour with a view to maximizing team effectiveness and task accomplishment in teams living and working in isolation under difficult and confined conditions. The application of this model in the light of future missions is a key thesis in this paper.
Frosini, Francesco; Miniati, Roberto; Grillone, Saverio; Dori, Fabrizio; Gentili, Guido Biffi; Belardinelli, Andrea
2016-11-14
The following study proposes and tests an integrated methodology involving Health Technology Assessment (HTA) and Failure Modes, Effects and Criticality Analysis (FMECA) for the assessment of specific aspects of robotic surgery involving safety, process and technology. The integrated methodology consists of the application of specific techniques from HTA combined with the most typical models from reliability engineering, such as FMEA/FMECA. The study also included on-site data collection and interviews with medical personnel. The total number of robotic procedures included in the analysis was 44: 28 for urology and 16 for general surgery. The main outcomes refer to the comparative evaluation between robotic, laparoscopic and open surgery. Risk analysis and mitigation interventions come from the FMECA application. The small sample size available for the study represents an important bias, especially for the reliability of the clinical outcomes. Despite this, the study seems to confirm the better trend for robotic surgical times in comparison with the open technique, as well as confirming the clinical benefits of robotics in urology. The situation is more complex for general surgery, where the only directly measured clinical benefit of robotics is a lower blood transfusion rate.
Herpes zoster - typical and atypical presentations.
Dayan, Roy Rafael; Peleg, Roni
2017-08-01
Varicella-zoster virus infection is an intriguing medical entity that involves many medical specialties including infectious diseases, immunology, dermatology, and neurology. It can affect patients from early childhood to old age. Its treatment requires expertise in pain management and psychological support. While varicella is caused by acute viremia, herpes zoster occurs after the dormant viral infection, involving the cranial nerve or sensory root ganglia, is re-activated and spreads orthodromically from the ganglion, via the sensory nerve root, to the innervated target tissue (skin, cornea, auditory canal, etc.). Typically, a single dermatome is involved, although two or three adjacent dermatomes may be affected. The lesions usually do not cross the midline. Herpes zoster can also present with unique or atypical clinical manifestations, such as glioma, zoster sine herpete and bilateral herpes zoster, which can be a challenging diagnosis even for experienced physicians. We discuss the epidemiology, pathophysiology, diagnosis and management of herpes zoster, including its typical and atypical presentations.
AC electrothermal technique in microchannels
NASA Astrophysics Data System (ADS)
Salari, Alinaghi; Navi, Maryam; Dalton, Colin
2017-02-01
Electrokinetic techniques have a wide range of applications in droplet, particle, and fluid manipulation systems. In general, they can be categorized into different subgroups including electroosmosis, electrothermal, electrophoresis, dielectrophoresis, etc. The AC electrothermal (ACET) technique has been shown to be very effective in applications which involve high conductivity fluids, such as blood, which are typically used in biomedical applications. In the past few years, the ACET effect has received considerable attention. Unlike AC electroosmosis (ACEO), the ACET effect shows plateaus in force over a wide frequency range. In other words, with electrothermal force, velocity is more steady and predictable at different frequencies, compared to ACEO and dielectrophoresis (DEP). Although electrothermal microflows form as a result of Joule heating in the fluid, due to high conduction of heat to the surroundings, the temperature rise in the fluid is not so high as to threaten the nature of the biofluids. The average temperature rise resulting from the ACET effect is below 5 K. In order to generate high strength AC electric fields, microfabricated electrode arrays are commonly used in microchannels. For pumping applications, it is essential to create asymmetry in the electric field, typically by having asymmetrical electrode pairs. There is no defined border between many electrokinetic techniques, and as such the point where electrothermal processes interfere with other electrokinetic techniques is not clear in the literature. In addition, there have been comprehensive reviews on micropumps, electrokinetics, and their subcategories, but the literature lacks a detailed up-to-date review on electrothermal microdevices. In this paper, a brief review is made specifically on electric fields in ACET devices, in order to provide an insight for the reader about the importance of this aspect of ACET devices and the improvements made to date.
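The few-kelvin temperature rise quoted for ACET devices can be sanity-checked with the order-of-magnitude scaling ΔT ~ σV²rms/(8k) commonly used in the electrokinetics scaling literature. This is a rough estimate, not a result from this review, and the parameter values below are illustrative:

```python
def acet_temperature_rise(sigma, v_rms, k_th):
    """Order-of-magnitude Joule-heating temperature rise (K) driving AC
    electrothermal flow: Delta_T ~ sigma * V_rms**2 / (8 * k_th).
    sigma: fluid conductivity (S/m); v_rms: applied voltage (V);
    k_th: thermal conductivity of the fluid (W/(m K))."""
    return sigma * v_rms**2 / (8 * k_th)

# Physiological-strength saline: sigma ~ 1 S/m, k_th ~ 0.6 W/(m K), 5 V rms
dT = acet_temperature_rise(sigma=1.0, v_rms=5.0, k_th=0.6)
print(dT)  # ~5.2 K, consistent with the few-kelvin rise cited in the abstract
```

The quadratic dependence on voltage is why ACET devices keep drive voltages low even though high-conductivity fluids make Joule heating unavoidable.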
Social and Non-Social Cueing of Visuospatial Attention in Autism and Typical Development
ERIC Educational Resources Information Center
Pruett, John R.; LaMacchia, Angela; Hoertel, Sarah; Squire, Emma; McVey, Kelly; Todd, Richard D.; Constantino, John N.; Petersen, Steven E.
2011-01-01
Three experiments explored attention to eye gaze, which is incompletely understood in typical development and is hypothesized to be disrupted in autism. Experiment 1 (n = 26 typical adults) involved covert orienting to box, arrow, and gaze cues at two probabilities and cue-target times to test whether reorienting for gaze is endogenous, exogenous,…
Kushniruk, A. W.; Patel, V. L.; Cimino, J. J.
1997-01-01
This paper describes an approach to the evaluation of health care information technologies based on usability engineering and a methodological framework from the study of medical cognition. The approach involves collection of a rich set of data including video recording of health care workers as they interact with systems, such as computerized patient records and decision support tools. The methodology can be applied in the laboratory setting, typically involving subjects "thinking aloud" as they interact with a system. A similar approach to data collection and analysis can also be extended to the study of computer systems in the "live" environment of hospital clinics. Our approach is also influenced by work in the area of cognitive task analysis, which aims to characterize the decision making and reasoning of subjects of varied levels of expertise as they interact with information technology in carrying out representative tasks. The stages involved in conducting cognitively-based usability analyses are detailed and the application of such analysis in the iterative process of system and interface development is discussed. PMID:9357620
Photoactivated methods for enabling cartilage-to-cartilage tissue fixation
NASA Astrophysics Data System (ADS)
Sitterle, Valerie B.; Roberts, David W.
2003-06-01
The present study investigates whether photoactivated attachment of cartilage can provide a viable method for more effective repair of damaged articular surfaces by providing an alternative to sutures, barbs, or fibrin glues for initial fixation. Unlike artificial materials, biological constructs do not possess the initial strength for press-fitting and are instead sutured or pinned in place, typically inducing even more tissue trauma. A possible alternative involves the application of a photosensitive material, which is then photoactivated with a laser source to attach the implant and host tissues together in either a photothermal or photochemical process. The photothermal version of this method shows potential, but has been almost entirely applied to vascularized tissues. Cartilage, however, exhibits several characteristics that produce appreciable differences between applying and refining these techniques when compared to previous efforts involving vascularized tissues. Preliminary investigations involving photochemical photosensitizers based on singlet oxygen and electron transfer mechanisms are discussed, and characterization of the photodynamic effects on bulk collagen gels as a simplified model system using FTIR is performed. Previous efforts using photothermal welding applied to cartilaginous tissues are reviewed.
Biologically Inspired Purification and Dispersion of SWCNTs
NASA Technical Reports Server (NTRS)
Feeback, Daniel L.; Clarke, Mark S.; Nikolaev, Pavel
2009-01-01
A biologically inspired method has been developed for (1) separating single-wall carbon nanotubes (SWCNTs) from other materials (principally, amorphous carbon and metal catalysts) in raw production batches and (2) dispersing the SWCNTs as individual particles (in contradistinction to ropes and bundles) in suspension, as required for a number of applications. Prior methods of purification and dispersal of SWCNTs involve, variously, harsh physical processes (e.g., sonication) or harsh chemical processes (e.g., acid reflux). These processes do not completely remove the undesired materials and do not disperse bundles and ropes into individual suspended SWCNTs. Moreover, these processes cut long SWCNTs into shorter pieces, yielding typical nanotube lengths between 150 and 250 nm. In contrast, the present method does not involve harsh physical or chemical processes. The method involves the use of biologically derived dispersal agents (BDDAs) in an aqueous solution that is mechanically homogenized (but not sonicated) and centrifuged. The dense solid material remaining after centrifugation is resuspended by vortexing in distilled water, yielding an aqueous suspension of individual, separated SWCNTs having lengths from about 10 to about 15 microns.
Application and systems software in Ada: Development experiences
NASA Technical Reports Server (NTRS)
Kuschill, Jim
1986-01-01
In its most basic sense software development involves describing the tasks to be solved, including the given objects and the operations to be performed on those objects. Unfortunately, the way people describe objects and operations usually bears little resemblance to source code in most contemporary computer languages. There are two ways around this problem. One is to allow users to describe what they want the computer to do in everyday, typically imprecise English. The PRODOC methodology and software development environment is based on a second more flexible and possibly even easier to use approach. Rather than hiding program structure, PRODOC represents such structure graphically using visual programming techniques. In addition, the program terminology used in PRODOC may be customized so as to match the way human experts in any given application area naturally describe the relevant data and operations. The PRODOC methodology is described in detail.
NASA Astrophysics Data System (ADS)
Siti Nor Qamarina, M.; Fatimah Rubaizah, M. R.; Nurul Suhaira, A.; Norhanifah, M. Y.
2017-12-01
Epoxidized natural rubber latex (ENRL) is a chemically modified natural rubber latex produced by an epoxidation process that involves the use of organic peracids. Conversion of ENRL into dry rubber products has been known to exhibit many beneficial properties; however, limited published work was found on diversifying the applications of ENRL latex-based products. In this preliminary work, different sources of raw materials and neutralization systems were investigated. The objective was to explore possibilities in producing distinctive ENRL. Findings demonstrated that different sources of raw materials and neutralization systems influenced the typical ENRL specifications, stability behavior and particle size distribution. Morphological observations performed on these ENRL systems appeared to agree with the ENRL characteristics achieved. Since experimenting with these two main factors produced encouraging ENRL findings, detailed work shall be further pursued to search for an optimum condition for producing marketable ENRL, specifically for latex-based product applications.
Exploring methods to expedite the recording of CEST datasets using selective pulse excitation
NASA Astrophysics Data System (ADS)
Yuwen, Tairan; Bouvignies, Guillaume; Kay, Lewis E.
2018-07-01
Chemical Exchange Saturation Transfer (CEST) has emerged as a powerful tool for studies of biomolecular conformational exchange involving the interconversion between a major, visible conformer and one or more minor, invisible states. Applications typically entail recording a large number of 2D datasets, each of which differs in the position of a weak radio frequency field, so as to generate a CEST profile for each nucleus from which the chemical shifts of spins in the invisible state(s) are obtained. Here we compare a number of band-selective CEST schemes for speeding up the process using either DANTE or cosine-modulated excitation approaches. We show that while both are essentially identical for applications such as 15N CEST, in cases where the probed spins are dipolar or scalar coupled to other like spins there can be advantages for the cosine-excitation scheme.
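The effect of cosine amplitude modulation, splitting a weak irradiation field into two sidebands so that two offsets are probed per scan, can be illustrated numerically. This is a generic signal-processing demonstration; the sampling rate, duration and offset below are illustrative and not taken from the paper:

```python
import numpy as np

# Cosine modulation of a weak B1 field moves the irradiation power from the
# carrier to two sidebands at +/- the modulation offset, which is the basis
# of band-selective (multi-offset) CEST excitation.
dt, n = 1e-5, 100_000                 # 1 s pulse sampled at 100 kHz
t = np.arange(n) * dt
offset = 2000.0                       # Hz; hypothetical sideband position
pulse = np.cos(2 * np.pi * offset * t)
spec = np.abs(np.fft.rfft(pulse))     # one-sided spectrum of the envelope
freqs = np.fft.rfftfreq(n, dt)
peak = freqs[np.argmax(spec)]
print(peak)                           # the power sits at the +/- offset sidebands
```

The one-sided FFT shows only the +offset component; in the rotating frame the modulation produces symmetric irradiation at ±offset, which is why a single cosine-modulated scan samples two CEST regions.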
Research and competition: Best partners
NASA Technical Reports Server (NTRS)
Shaw, J. M.
1986-01-01
NASA's Microgravity Science and Applications Program is directed toward research in the science and technology of processing materials under conditions of low gravity. The objective is to make a detailed examination of the constraints imposed by gravitational forces on Earth. The program is expected to lead ultimately to the development of new materials and processes in Earth-based commercial applications, adding to this nation's technological base. An important resource that U.S. researchers have readily available to them is the new Microgravity Materials Science Laboratory (MMSL) at NASA Lewis Research Center in Cleveland. A typical scenario for a microgravity materials experiment at Lewis would begin by establishing 1-g baseline data in the MMSL and then proceeding, if it is indicated, to a drop tower or to simulated microgravity conditions in a research aircraft to qualify the project for space flight. A major component of Lewis microgravity materials research work involves the study of metal and alloy solidification fundamentals.
Parallel multigrid smoothing: polynomial versus Gauss-Seidel
NASA Astrophysics Data System (ADS)
Adams, Mark; Brezina, Marian; Hu, Jonathan; Tuminaro, Ray
2003-07-01
Gauss-Seidel is often the smoother of choice within multigrid applications. In the context of unstructured meshes, however, maintaining good parallel efficiency is difficult with multiplicative iterative methods such as Gauss-Seidel. This leads us to consider alternative smoothers. We discuss the computational advantages of polynomial smoothers within parallel multigrid algorithms for positive definite symmetric systems. Two particular polynomials are considered: Chebyshev and a multilevel specific polynomial. The advantages of polynomial smoothing over traditional smoothers such as Gauss-Seidel are illustrated on several applications: Poisson's equation, thin-body elasticity, and eddy current approximations to Maxwell's equations. While parallelizing the Gauss-Seidel method typically involves a compromise between a scalable convergence rate and maintaining high flop rates, polynomial smoothers achieve parallel scalable multigrid convergence rates without sacrificing flop rates. We show that, although parallel computers are the main motivation, polynomial smoothers are often surprisingly competitive with Gauss-Seidel smoothers on serial machines.
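A Chebyshev smoother needs only matrix-vector products, which is why it parallelizes well where Gauss-Seidel's multiplicative sweeps do not. A minimal sketch of the textbook Chebyshev semi-iteration for a symmetric positive definite matrix (this is the standard recurrence, not the authors' multilevel-specific polynomial; the eigenvalue bounds are assumed given):

```python
import numpy as np

def chebyshev_smooth(A, b, x, lmin, lmax, iters=10):
    """Chebyshev semi-iteration for SPD A with eigenvalues in [lmin, lmax].
    Every step is a mat-vec plus vector updates, so it is parallel-friendly."""
    theta = 0.5 * (lmax + lmin)          # center of the target interval
    delta = 0.5 * (lmax - lmin)          # half-width of the target interval
    sigma = theta / delta
    rho = 1.0 / sigma
    d = (b - A @ x) / theta
    x = x + d
    for _ in range(iters - 1):
        rho_new = 1.0 / (2.0 * sigma - rho)
        d = rho_new * rho * d + (2.0 * rho_new / delta) * (b - A @ x)
        x = x + d
        rho = rho_new
    return x

# 1D Poisson test matrix; eigenvalue bounds from the known spectrum of
# tridiag(-1, 2, -1), which lies in (0, 4)
n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = chebyshev_smooth(A, b, np.zeros(n), lmin=0.003, lmax=4.0, iters=20)
print(np.linalg.norm(b - A @ x))      # residual shrinks from ||b|| = sqrt(50)
```

As a smoother inside multigrid, lmin is usually set to a fraction of lmax (e.g. lmax/30) so the polynomial damps only the high-frequency range, leaving the low frequencies to the coarse grid.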
Lock hopper valves for coal gasification plant service
NASA Technical Reports Server (NTRS)
Schoeneweis, E. F.
1977-01-01
Although the operating principle of the lock hopper system is extremely simple, valve applications involving this service in coal gasification plants are likewise extremely difficult. The difficulties center on the requirement of handling highly erosive pulverized coal or char (either in dry or slurry form) combined with the requirement of providing tight sealing against high-pressure (possibly very hot) gas. Operating pressures and temperatures in these applications typically range up to 1600 psi (110 bar) and 600 °F (316 °C), with certain process requirements going even higher. In addition, and of primary concern, is the need for reliable operation over long service periods with provision for practical and economical maintenance. Currently available data indicate a requirement for something on the order of 20,000 to 30,000 open-close cycles per year and a desire to operate at least that long without valve failure.
Strength tests for elite rowers: low- or high-repetition?
Lawton, Trent W; Cronin, John B; McGuigan, Michael R
2014-01-01
The purpose of this project was to evaluate the utility of low- and high-repetition maximum (RM) strength tests used to assess rowers. Twenty elite heavyweight males (age 23.7 ± 4.0 years) performed four tests (5 RM, 30 RM, 60 RM and 120 RM) using leg press and seated arm pulling exercise on a dynamometer. Each test was repeated on two further occasions; 3 and 7 days from the initial trial. Per cent typical error (within-participant variation) and intraclass correlation coefficients (ICCs) were calculated using log-transformed repeated-measures data. High-repetition tests (30 RM, 60 RM and 120 RM), involving seated arm pulling exercise are not recommended to be included in an assessment battery, as they had unsatisfactory measurement precision (per cent typical error > 5% or ICC < 0.9). Conversely, low-repetition tests (5 RM) involving leg press and seated arm pulling exercises could be used to assess elite rowers (per cent typical error ≤ 5% and ICC ≥ 0.9); however, only 5 RM leg pressing met criteria (per cent typical error = 2.7%, ICC = 0.98) for research involving small samples (n = 20). In summary, low-repetition 5 RM strength testing offers greater utility as assessments of rowers, as they can be used to measure upper- and lower-body strength; however, only the leg press exercise is recommended for research involving small squads of elite rowers.
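The "per cent typical error" used as the precision criterion above is conventionally computed from log-transformed repeated trials (the within-participant coefficient of variation). A minimal sketch with made-up leg-press scores; the data and thresholds below illustrate the method only and are not the study's data:

```python
import numpy as np

def percent_typical_error(trial1, trial2):
    """Within-participant variation from two repeated trials, computed on
    log-transformed scores: SD of the paired differences divided by sqrt(2),
    back-transformed to a coefficient of variation in percent."""
    d = np.log(np.asarray(trial2, float)) - np.log(np.asarray(trial1, float))
    te_log = np.std(d, ddof=1) / np.sqrt(2)   # typical error on the log scale
    return 100.0 * (np.exp(te_log) - 1.0)

# Hypothetical 5 RM leg-press loads (kg) for five rowers on two test days
t1 = np.array([150.0, 170.0, 160.0, 180.0, 155.0])
t2 = np.array([152.0, 168.0, 163.0, 178.0, 157.0])
print(percent_typical_error(t1, t2))
```

Against the paper's criteria, a test passes for routine assessment when this value is at most 5% (with ICC at least 0.9), and for small-sample research only at a tighter threshold such as the 2.7% reported for the 5 RM leg press.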
DOT National Transportation Integrated Search
1988-01-01
Operational monitoring situations, in contrast to typical laboratory vigilance tasks, generally involve more than just stimulus detection and recognition. They frequently involve complex multidimensional discriminations, interpretations of significan...
Low vibration microminiature split Stirling cryogenic cooler for infrared aerospace applications
NASA Astrophysics Data System (ADS)
Veprik, A.; Zechtzer, S.; Pundak, N.; Kirkconnel, C.; Freeman, J.; Riabzev, S.
2011-06-01
The operation of the thermo-mechanical unit of a cryogenic cooler may produce resonant excitation of the spacecraft frame, optical bench or components of the optical train. This may result in degraded functionality of the inherently vibration sensitive space-borne infrared imager directly associated with the cooler, or of neighboring instrumentation typically requiring a quiet micro-g environment. The best practice for controlling cooler induced vibration relies on the principle of active momentum cancellation. In particular, the pressure wave generator typically contains two oppositely actuated piston compressors, while the single piston expander is counterbalanced by an auxiliary active counter-balancer. Active vibration cancellation is supervised by a dedicated DSP feed-forward controller, where the error signals are delivered by the vibration sensors (accelerometers or load cells). This can result in oversized, overweight and overpriced cryogenic coolers with degraded electromechanical performance and impaired reliability. The authors advocate a reliable, compact, cost and power saving approach capitalizing on the combined application of a passive tuned dynamic absorber and a low frequency vibration isolator. This concept appears to be especially suitable for low budget missions involving mini and micro satellites, where price, size, weight and power consumption are of concern. The authors reveal the results of a theoretical study and experimentation on the attainable performance using a full-scale technology demonstrator relying on a Ricor model K527 tactical split Stirling cryogenic cooler. The theoretical predictions are in fair agreement with the experimental data. From experimentation, the residual vibration export is quite suitable for a wide range of demanding aerospace applications. The authors give practical recommendations on heatsinking and further maximizing performance.
XBoard: A Framework for Integrating and Enhancing Collaborative Work Practices
NASA Technical Reports Server (NTRS)
Shab, Ted
2006-01-01
Teams typically collaborate in different modes, including face-to-face meetings, meetings that are synchronous (i.e., require parties to participate at the same time) but distributed geographically, and meetings involving asynchronous work on common tasks at different times. The XBoard platform was designed to create an integrated environment for creating applications that enhance collaborative work practices. Specifically, it takes large, touch-screen enabled displays as the starting point for enhancing face-to-face meetings by providing common facilities such as whiteboarding/electronic flipcharts, laptop projection, web access, screen capture and content distribution. These capabilities are built upon by making these functions inherently distributed, allowing sessions to be easily connected between two or more systems at different locations. Finally, an information repository is integrated into the functionality to provide facilities for work practices that involve work being done at different times, such as reports that span different shifts. The XBoard is designed to be extensible, allowing customization of the general functionality and the addition of new functionality to the core facilities by means of a plugin architecture. This, in essence, makes it a collaborative framework for extending or integrating work practices for different mission scenarios. XBoard relies heavily on standards such as Web Services and SVG, and is built using predominantly Java and well-known open-source products such as Apache and Postgres. Increasingly, organizations are geographically dispersed, and rely on "virtual teams" that are assembled from a pool of various partner organizations. These organizations often have different infrastructures of applications and workflows. 
The XBoard has been designed to be a good partner in these situations, providing the flexibility to integrate with typical legacy applications while providing a standards-based infrastructure that is readily accepted by most organizations. The XBoard has been used on the Mars Exploration Rovers mission at JPL, and is currently being used or considered for use in pilot projects at Johnson Space Center (JSC) Mission Control, the University of Arizona Lunar and Planetary Laboratory (Phoenix Mars Lander), and MBARI (Monterey Bay Aquarium Research Institute).
Baxter, Ruth; Taylor, Natalie; Kellar, Ian; Lawton, Rebecca
2016-01-01
Background The positive deviance approach focuses on those who demonstrate exceptional performance, despite facing the same constraints as others. ‘Positive deviants’ are identified and hypotheses about how they succeed are generated. These hypotheses are tested and then disseminated within the wider community. The positive deviance approach is being increasingly applied within healthcare organisations, although limited guidance exists and different methods, of varying quality, are used. This paper systematically reviews healthcare applications of the positive deviance approach to explore how positive deviance is defined, the quality of existing applications and the methods used within them, including the extent to which staff and patients are involved. Methods Peer-reviewed articles, published prior to September 2014, reporting empirical research on the use of the positive deviance approach within healthcare, were identified from seven electronic databases. A previously defined four-stage process for positive deviance in healthcare was used as the basis for data extraction. Quality assessments were conducted using a validated tool, and a narrative synthesis approach was followed. Results 37 of 818 articles met the inclusion criteria. The positive deviance approach was most frequently applied within North America, in secondary care, and to address healthcare-associated infections. Research predominantly identified positive deviants and generated hypotheses about how they succeeded. The approach and processes followed were poorly defined. Research quality was low, articles lacked detail and comparison groups were rarely included. Applications of positive deviance typically lacked staff and/or patient involvement, and the methods used often required extensive resources. Conclusion Further research is required to develop high quality yet practical methods which involve staff and patients in all stages of the positive deviance approach. 
The efficacy and efficiency of positive deviance must be assessed and compared with other quality improvement approaches. PROSPERO registration number CRD42014009365. PMID:26590198
de Solla, Shane Raymond; Martin, Pamela Anne; Mikoda, Paul
2011-09-15
Many reptiles oviposit in soils associated with agricultural landscapes. We evaluated the toxicity of a pesticide and fertilizer regime similar to those used in corn production in Ontario on the survivorship of exposed snapping turtle (Chelydra serpentina) eggs. The herbicides atrazine, dimethenamid, and glyphosate, the pyrethroid insecticide tefluthrin, and the fertilizer ammonia were applied to clean soil, both as partial mixtures within chemical classes and as complete mixtures. Eggs were incubated in the soil in a garden plot in which these mixtures were applied at a typical field application rate, and at higher rates. Otherwise, the eggs were unmanipulated and were subject to ambient temperature and weather conditions. Eggs were also exposed at male-producing temperatures in the laboratory in covered bins in the same soil, where there was less opportunity for loss through volatilization or leaching. Egg mortality was 100% at 10× the typical field application rate of the complete mixture, both with and without tefluthrin. At typical field application rates, hatching success ranged between 91.7 and 95.8%. Eggs exposed only to herbicides were not negatively affected at any application rate. Although fertilizer treatments at typical field application rates did not affect eggs, mortality was markedly higher at three times this rate, and 100% at higher rates. The frequency of deformities of hatchlings was elevated at the highest application rate of the insecticide tefluthrin. The majority of the toxicity of the mixture was not due to the herbicides or insecticide, but to the ammonia fertilizer. At typical field application rates, the chemical regime associated with corn production does not appear to have any detrimental impacts upon turtle egg development; however, toxicity increases dramatically if this threshold is exceeded. Copyright © 2011. Published by Elsevier B.V.
A sensor monitoring system for telemedicine, safety and security applications
NASA Astrophysics Data System (ADS)
Vlissidis, Nikolaos; Leonidas, Filippos; Giovanis, Christos; Marinos, Dimitrios; Aidinis, Konstantinos; Vassilopoulos, Christos; Pagiatakis, Gerasimos; Schmitt, Nikolaus; Pistner, Thomas; Klaue, Jirka
2017-02-01
A sensor system capable of medical, safety and security monitoring in avionic and other environments (e.g. homes) is examined. For application inside an aircraft cabin, the system relies on an optical cellular network that connects each seat to a server and uses a set of database applications to process data related to passengers' health, safety and security status. Health monitoring typically encompasses electrocardiogram, pulse oximetry, blood pressure, body temperature and respiration rate, while safety and security monitoring is related to the standard flight attendant duties, such as cabin preparation for take-off, landing, flight in regions of turbulence, etc. In contrast to previous related works, this article focuses on the system's modules (medical and safety sensors and associated hardware), the database applications used for the overall control of the monitoring function, and the potential use of the system for security applications. Further tests involving medical, safety and security sensing performed in a real A340 mock-up set-up are also described, and reference is made to the possible use of the sensing system in alternative environments and applications, such as health monitoring within other means of transport (e.g. trains or small passenger sea vessels) as well as for remotely located home users, over a wired Ethernet network or the Internet.
NASA Astrophysics Data System (ADS)
Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.
2017-12-01
Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are being used and documented in hundreds of peer-reviewed publications each year, and likely more are applied in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions, versus improved data and enhanced assumptions, on model outcomes and thus, ultimately, study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study region. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.
Replaceable Microfluidic Cartridges for a PCR Biosensor
NASA Technical Reports Server (NTRS)
Francis, Kevin; Sullivan, Ron
2005-01-01
The figure depicts a replaceable microfluidic cartridge that is a component of a miniature biosensor that detects target deoxyribonucleic acid (DNA) sequences. The biosensor utilizes (1) polymerase chain reactions (PCRs) to multiply the amount of DNA to be detected, (2) fluorogenic polynucleotide probe chemicals for labeling the target DNA sequences, and (3) a high-sensitivity epifluorescence-detection optoelectronic subsystem. Microfluidics is a relatively new field of device development in which one applies techniques for fabricating microelectromechanical systems (MEMS) to miniature systems for containing and/or moving fluids. Typically, microfluidic devices are microfabricated, variously, from silicon or polymers. The development of microfluidic devices for applications that involve PCR and fluorescence-based detection of PCR products poses special challenges.
Aerodynamic preliminary analysis system 2. Part 1: Theory
NASA Technical Reports Server (NTRS)
Bonner, E.; Clever, W.; Dunn, K.
1981-01-01
A subsonic/supersonic/hypersonic aerodynamic analysis capability was developed by integrating the Aerodynamic Preliminary Analysis System (APAS) and the inviscid force calculation modules of the Hypersonic Arbitrary Body Program. APAS analysis was extended for nonlinear vortex forces using a generalization of the Polhamus analogy. The interactive system provides appropriate aerodynamic models for a single input geometry database and has a run/output format similar to a wind tunnel test program. The user's manual was organized to cover the principal system activities of a typical application: geometric input/editing, aerodynamic evaluation, and post-analysis review/display. Sample sessions are included to illustrate the specific tasks involved and are followed by a comprehensive command/subcommand dictionary used to operate the system.
Applications of multi-spectral imaging: failsafe industrial flame detector
NASA Astrophysics Data System (ADS)
Wing Au, Kwong; Larsen, Christopher; Cole, Barry; Venkatesha, Sharath
2016-05-01
Industrial and petrochemical facilities present unique challenges for fire protection and safety. A typical scenario involves detecting an unintended fire in a scene that also includes a flare stack in the background. Maintaining a high level of process and plant safety is a critical concern. In this paper, we present a failsafe industrial flame detector with significant performance benefits compared to current flame detectors. The design involves the use of a microbolometer operating in the MWIR and LWIR bands together with a dual-band filter. This novel flame detector can help industrial facilities meet their plant safety and critical infrastructure protection requirements while ensuring operational and business readiness at project start-up.
Harnessing recombination to speed adaptive evolution in Escherichia coli.
Winkler, James; Kao, Katy C
2012-09-01
Evolutionary engineering typically involves asexual propagation of a strain to improve a desired phenotype. However, asexual populations suffer from extensive clonal interference, a phenomenon where distinct lineages of beneficial clones compete and are often lost from the population given sufficient time. Improved adaptive mutants can likely be generated by genetic exchange between lineages, thereby reducing clonal interference. We present a system that allows continuous in situ recombination by using an Escherichia coli F-based conjugation system lacking surface exclusion. Evolution experiments revealed that Hfr-mediated recombination significantly speeds adaptation in certain circumstances. These results show that our system is stable, effective, and suitable for use in evolutionary engineering applications. Copyright © 2012 Elsevier Inc. All rights reserved.
A fuzzy logic approach to modeling a vehicle crash test
NASA Astrophysics Data System (ADS)
Pawlus, Witold; Karimi, Hamid Reza; Robbersmyr, Kjell G.
2013-03-01
This paper presents an application of a fuzzy approach to vehicle crash modeling. A typical vehicle-to-pole collision is described and the kinematics of a car involved in this type of crash event are thoroughly characterized. The basics of fuzzy set theory and modeling principles based on the fuzzy logic approach are presented. In particular, special attention is paid to explaining the methodology for creating a fuzzy model of a vehicle collision. Furthermore, the simulation results are presented and compared to the original vehicle's kinematics. It is concluded which factors influence the accuracy of the fuzzy model's output and how they can be adjusted to improve the model's fidelity.
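As a hedged illustration of the general fuzzy-modeling methodology (not the authors' actual crash model), a toy single-input fuzzy system with triangular membership functions and weighted-average defuzzification might look like this; the rule breakpoints and output levels are invented for illustration only.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def crash_decel_model(t_ms):
    """Toy fuzzy model: time into a crash pulse (ms) -> deceleration (g).
    Each rule pairs a membership weight with a crisp output level; the
    output is the weighted average (Sugeno-style zero-order defuzzification)."""
    rules = [
        (tri(t_ms, -20, 0, 40), 5.0),    # early contact phase -> low decel
        (tri(t_ms, 20, 60, 100), 40.0),  # peak crush phase    -> high decel
        (tri(t_ms, 80, 120, 160), 8.0),  # rebound phase       -> low decel
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

A real model of this kind would be identified against measured crash kinematics, which is precisely the tuning process the paper investigates.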
Fabrication of boron sputter targets
Makowiecki, Daniel M.; McKernan, Mark A.
1995-01-01
A process for fabricating high-density boron sputtering targets with sufficient mechanical strength to function reliably at typical magnetron sputtering power densities and at normal process parameters. The process involves the fabrication of a high-density boron monolith by hot isostatically compacting high-purity (99.9%) boron powder, machining the boron monolith to the final dimensions, and brazing the finished boron piece to a matching boron carbide (B.sub.4 C) piece by placing aluminum foil between them and applying pressure and heat in a vacuum. An alternative is the application of aluminum metallization to the back of the boron monolith by vacuum deposition. Also, a titanium-based vacuum braze alloy can be used in place of the aluminum foil.
A note on adding viscoelasticity to earthquake simulators
Pollitz, Fred
2017-01-01
Here, I describe how time‐dependent quasi‐static stress transfer can be implemented in an earthquake simulator code that is used to generate long synthetic seismicity catalogs. Most existing seismicity simulators use precomputed static stress interaction coefficients to rapidly implement static stress transfer in fault networks with typically tens of thousands of fault patches. The extension to quasi‐static deformation, which accounts for viscoelasticity of Earth’s ductile lower crust and mantle, involves the precomputation of additional interaction coefficients that represent time‐dependent stress transfer among the model fault patches, combined with defining and evolving additional state variables that track this stress transfer. The new approach is illustrated with application to a California‐wide synthetic fault network.
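The precomputed-coefficient scheme described above can be sketched as follows. This is a minimal illustration under simplifying assumptions (a single shared relaxation time and a one-exponential kernel); the function and matrix names are hypothetical, not taken from the simulator code.

```python
import numpy as np

def slip_event(stress, visco_state, K_static, K_visco, slip):
    """Apply coseismic slip on the fault patches: an instantaneous elastic
    stress step (precomputed static coefficients K_static) plus loading of
    a viscoelastic state variable that will relax onto the faults later."""
    stress = stress + K_static @ slip
    visco_state = visco_state + K_visco @ slip
    return stress, visco_state

def relax(stress, visco_state, dt, tau):
    """Advance time by dt: the stored viscoelastic stress decays onto the
    fault patches with a single relaxation time tau (toy one-exponential
    kernel standing in for the full lower-crust/mantle rheology)."""
    transferred = visco_state * (1.0 - np.exp(-dt / tau))
    return stress + transferred, visco_state - transferred

# Two-patch example with symmetric, purely off-diagonal interactions
K_static = np.array([[0.0, 0.5], [0.5, 0.0]])
K_visco = np.array([[0.0, 0.2], [0.2, 0.0]])
stress, visco = np.zeros(2), np.zeros(2)
stress, visco = slip_event(stress, visco, K_static, K_visco, np.array([1.0, 0.0]))
stress, visco = relax(stress, visco, dt=1.0, tau=1.0)
```

The key point mirrored from the text: only matrix-vector products and state updates are needed at run time, so the expensive viscoelastic Green's-function calculations stay in the precomputation stage.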
NASA Astrophysics Data System (ADS)
Baker, G. N.
This paper examines the constraints upon a typical manufacturer of gyros and strapdown systems. It explains that, while the manufacturer must remain responsive to change and keep abreast of 'state of the art' technology, there are many reasons why it must satisfy the market using existing technology and production equipment. The Single-Degree-of-Freedom Rate Integrating Gyro is a well-established product, yet is capable of achieving far higher performance than originally envisaged, owing to modelling and characterization within digital strapdown systems. The parameters involved are discussed, and a description is given of the calibration process undertaken on a strapdown system being manufactured in a production environment in batch quantities.
Airborne EM survey in volcanoes : Application to a volcanic hazards assessment
NASA Astrophysics Data System (ADS)
Mogi, T.
2010-12-01
Airborne electromagnetics (AEM) is a useful tool for investigating subsurface structures of volcanoes because it can survey large areas, including inaccessible ones. Disadvantages include lower accuracy and limited depth of investigation. AEM has been widely used in mineral exploration in frontier areas, and has more recently been applied to engineering and environmental problems, particularly in studies involving active volcanoes. AEM systems typically comprise a transmitter and a receiver on an aircraft or in a towed bird, and although effective for surveying large areas, their penetration depth is limited because the distance between the transmitter and receiver is small and higher-frequency signals are used. To explore deeper structures using AEM, a semi-airborne system called GRounded Electrical source Airborne Transient ElectroMagnetics (GREATEM) has been developed. The system uses a grounded electrical dipole as the transmitter and generates horizontal electric fields. The GREATEM technology, first proposed by Mogi et al. (1998), has recently been improved and used in practical surveys (Mogi et al., 2009). The GREATEM survey system was developed to increase the depth of investigation possible using AEM. The method was tested on several volcanoes during 2004-2005. Here I present results of typical AEM surveys and GREATEM surveys of several volcanoes in Japan, aimed at mitigating hazards associated with volcanic eruptions. Geologic hazards caused by volcanic eruptions can be mitigated by a combination of prediction, preparedness and land-use control. Risk management depends on the identification of hazard zones and forecasting of eruptions. Hazard zoning involves the mapping of deposits which have formed during particular phases of volcanic activity and their extrapolation to identify the area which would be likely to suffer a similar hazard at some future time. The mapping is usually performed by surface geological surveys of volcanic deposits. 
Resistivity mapping by AEM is a useful tool to identify individual volcanic deposits at the surface and at shallow depth as well. This suggests that a more informative hazard map incorporating subsurface information can be produced using AEM resistivity mapping.
Airframe Noise Prediction by Acoustic Analogy: Revisited
NASA Technical Reports Server (NTRS)
Farassat, F.; Casper, Jay H.; Tinetti, A.; Dunn, M. H.
2006-01-01
The present work follows a recent survey of airframe noise prediction methodologies. In that survey, Lighthill's acoustic analogy was identified as the most prominent analytical basis for current approaches to airframe noise research. Within this approach, a problem is typically modeled with the Ffowcs Williams and Hawkings (FW-H) equation, for which a geometry-independent solution is obtained by means of the free-space Green function (FSGF). Nonetheless, the aeroacoustic literature would suggest some interest in the use of tailored or exact Green's functions (EGF) for aerodynamic noise problems involving solid boundaries, in particular, for trailing edge (TE) noise. A study of possible applications of EGF for prediction of broadband noise from turbulent flow over an airfoil surface and the TE is, therefore, the primary topic of the present work. Typically, the applications of EGF in the literature have been limited to TE noise prediction at low Mach numbers assuming that the normal derivative of the pressure vanishes on the airfoil surface. To extend the application of EGF to higher Mach numbers, the uniqueness of the solution of the wave equation is examined when either the Dirichlet or the Neumann boundary condition (BC) is specified on a deformable surface in motion. The solution of Lighthill's equation with either the Dirichlet or the Neumann BC is given for such a surface using EGFs. These solutions involve both surface and volume integrals, just like the solution of the FW-H equation using FSGF. Insight drawn from this analysis is evoked to discuss the potential application of EGF to broadband noise prediction. It appears that the use of an EGF offers distinct advantages for predicting TE noise of an airfoil when the normal pressure gradient vanishes on the airfoil surface. It is argued that such an approach may also apply to an airfoil in motion. 
However, for the prediction of broadband noise not directly associated with a trailing edge, the use of EGF does not appear to offer any advantages over the use of FSGF at the present stage of development. It is suggested here that the applications of EGF for airframe noise analysis be continued. As an example pertinent to airframe noise prediction, the Fast Scattering Code of NASA Langley is utilized to obtain the EGF numerically on the surface of a three dimensional wing with a flap and leading edge slat in uniform rectilinear motion. The interpretation and use of these numerical Green functions are then discussed.
Oral Storytelling as Evidence of Pedagogy in Forager Societies
Scalise Sugiyama, Michelle
2017-01-01
Teaching is reportedly rare in hunter-gatherer societies, raising the question of whether it is a species-typical trait in humans. A problem with past studies is that they tend to conceptualize teaching in terms of Western pedagogical practices. In contrast, this study proceeds from the premise that teaching requires the ostensive manifestation of generalizable knowledge: the teacher must signal intent to share information, indicate the intended recipient, and transmit knowledge that is applicable beyond the present context. Certain features of human communication appear to be ostensive in function (e.g., eye contact, pointing, contingency, prosodic variation), and collectively serve as “natural pedagogy.” Tellingly, oral storytelling in forager societies typically employs these and other ostensive behaviors, and is widely reported to be an important source of generalizable ecological and social knowledge. Despite this, oral storytelling has been conspicuously overlooked in studies of teaching in preliterate societies. Accordingly, this study presents evidence that oral storytelling involves the use of ostension and the transmission of generic knowledge, thereby meeting the criteria of pedagogy. PMID:28424643
Fast algorithm for spectral processing with application to on-line welding quality assurance
NASA Astrophysics Data System (ADS)
Mirapeix, J.; Cobo, A.; Jaúregui, C.; López-Higuera, J. M.
2006-10-01
A new technique is presented in this paper for the analysis of welding process emission spectra to accurately estimate in real-time the plasma electronic temperature. The estimation of the electronic temperature of the plasma, through the analysis of the emission lines from multiple atomic species, may be used to monitor possible perturbations during the welding process. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, sub-pixel algorithms are used to more accurately estimate the central wavelength of the peaks. Three different sub-pixel algorithms will be analysed and compared, and it will be shown that the LPO (linear phase operator) sub-pixel algorithm is a better solution within the proposed system. Experimental tests during TIG-welding using a fibre optic to capture the arc light, together with a low cost CCD-based spectrometer, show that some typical defects associated with perturbations in the electron temperature can be easily detected and identified with this technique. A typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
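For readers unfamiliar with sub-pixel peak estimation, a generic three-point parabolic estimator can be sketched as below. Note this is a common textbook method, simpler than and distinct from the LPO operator the paper advocates; it merely illustrates the idea of refining a peak's central wavelength to a fraction of a pixel.

```python
def subpixel_peak(y, i):
    """Refine the location of a spectral peak at sample index i by fitting
    a parabola through the three samples (i-1, i, i+1) and returning the
    fractional index of the parabola's vertex."""
    y0, y1, y2 = y[i - 1], y[i], y[i + 1]
    denom = y0 - 2.0 * y1 + y2
    if denom == 0:  # flat-top degenerate case: fall back to the integer index
        return float(i)
    return i + 0.5 * (y0 - y2) / denom

# Example: a sampled parabola peaking at x = 10.3 is recovered exactly
y = [-(x - 10.3) ** 2 for x in range(20)]
center = subpixel_peak(y, 10)
```

For an exactly parabolic peak the estimator is exact; for Gaussian or Voigt line shapes it carries a small bias, which is one motivation for more elaborate operators such as the LPO algorithm studied in the paper.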
Image simulation for automatic license plate recognition
NASA Astrophysics Data System (ADS)
Bala, Raja; Zhao, Yonghui; Burry, Aaron; Kozitsky, Vladimir; Fillion, Claude; Saunders, Craig; Rodríguez-Serrano, José
2012-01-01
Automatic license plate recognition (ALPR) is an important capability for traffic surveillance applications, including toll monitoring and detection of different types of traffic violations. ALPR is a multi-stage process comprising plate localization, character segmentation, optical character recognition (OCR), and identification of originating jurisdiction (i.e. state or province). Training of an ALPR system for a new jurisdiction typically involves gathering vast amounts of license plate images and associated ground truth data, followed by iterative tuning and optimization of the ALPR algorithms. The substantial time and effort required to train and optimize the ALPR system can result in excessive operational cost and overhead. In this paper we propose a framework to create an artificial set of license plate images for accelerated training and optimization of ALPR algorithms. The framework comprises two steps: the synthesis of license plate images according to the design and layout for a jurisdiction of interest; and the modeling of imaging transformations and distortions typically encountered in the image capture process. Distortion parameters are estimated by measurements of real plate images. The simulation methodology is successfully demonstrated for training of OCR.
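The second step of the framework, modeling capture distortions, can be sketched with a toy camera model. The gain, offset, and noise values below are illustrative placeholders; the paper instead estimates distortion parameters from measurements of real plate images, and a realistic pipeline would also include perspective warps, motion blur, and compression artifacts.

```python
import numpy as np

def distort(plate, gain=0.8, offset=20.0, noise_sigma=5.0, seed=0):
    """Apply a toy capture model to a synthetic plate image (uint8 array):
    a global gain/offset (illumination and contrast change) followed by
    additive Gaussian sensor noise, clipped back to the 8-bit range."""
    rng = np.random.default_rng(seed)
    img = plate.astype(np.float64) * gain + offset
    img += rng.normal(0.0, noise_sigma, img.shape)
    return np.clip(img, 0, 255).astype(np.uint8)

# Usage: degrade a flat synthetic "plate" patch for OCR training
plate = np.full((10, 10), 200, dtype=np.uint8)
degraded = distort(plate)
```

Generating many such degraded variants of each synthesized plate is what lets OCR training proceed without gathering vast sets of real captures.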
Energy levels distribution in supersaturated silicon with titanium for photovoltaic applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pérez, E., E-mail: eduper@ele.uva.es; Castán, H.; García, H.
2015-01-12
In the attempt to form an intermediate band in the bandgap of silicon substrates to give it the capability to absorb infrared radiation, we studied the deep levels in supersaturated silicon with titanium. The technique used to characterize the energy levels was thermal admittance spectroscopy. Our experimental results showed that in samples with titanium concentration just under the Mott limit there was a relationship between the activation energy value and the capture cross section value. This relationship obeys the well-known Meyer-Neldel rule, which typically appears in processes involving multiple excitations, like carrier capture/emission in deep levels, and is generally observed in disordered systems. The obtained characteristic Meyer-Neldel parameters were Tmn = 176 K and kTmn = 15 meV. The energy value could be associated with the typical energy of the phonons in the substrate. The almost perfect fit of all experimental data to the same straight line provides further evidence of the validity of the Meyer-Neldel rule, and may contribute to a deeper insight into the ultimate meaning of this phenomenon.
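The quoted characteristic parameters can be cross-checked with a two-line calculation: multiplying the Boltzmann constant by the reported isokinetic temperature Tmn should reproduce the reported Meyer-Neldel energy kTmn of about 15 meV.

```python
# Sanity check of the reported Meyer-Neldel parameters from the abstract.
K_B_EV = 8.617333262e-5   # Boltzmann constant in eV/K (CODATA value)
T_MN = 176.0              # reported isokinetic temperature, K

kT_mev = K_B_EV * T_MN * 1000.0  # convert eV -> meV
print(round(kT_mev, 1))  # 15.2, consistent with the quoted ~15 meV
```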
Spectra of conditionalization and typicality in the multiverse
NASA Astrophysics Data System (ADS)
Azhar, Feraz
2016-02-01
An approach to testing theories describing a multiverse, that has gained interest of late, involves comparing theory-generated probability distributions over observables with their experimentally measured values. It is likely that such distributions, were we indeed able to calculate them unambiguously, will assign low probabilities to any such experimental measurements. An alternative to thereby rejecting these theories, is to conditionalize the distributions involved by restricting attention to domains of the multiverse in which we might arise. In order to elicit a crisp prediction, however, one needs to make a further assumption about how typical we are of the chosen domains. In this paper, we investigate interactions between the spectra of available assumptions regarding both conditionalization and typicality, and draw out the effects of these interactions in a concrete setting; namely, on predictions of the total number of species that contribute significantly to dark matter. In particular, for each conditionalization scheme studied, we analyze how correlations between densities of different dark matter species affect the prediction, and explicate the effects of assumptions regarding typicality. We find that the effects of correlations can depend on the conditionalization scheme, and that in each case atypicality can significantly change the prediction. In doing so, we demonstrate the existence of overlaps in the predictions of different "frameworks" consisting of conjunctions of theory, conditionalization scheme and typicality assumption. This conclusion highlights the acute challenges involved in using such tests to identify a preferred framework that aims to describe our observational situation in a multiverse.
Peptide based therapeutics and their use for the treatment of neurodegenerative and other diseases.
Baig, Mohammad Hassan; Ahmad, Khurshid; Saeed, Mohd; Alharbi, Ahmed M; Barreto, George E; Ashraf, Ghulam Md; Choi, Inho
2018-04-17
Bioactive peptides are actively involved in different biological functions and importantly contribute to human health, and the use of peptides as therapeutics has a long successful history in disease management. A number of peptides have wide-ranging therapeutic effects, such as antioxidant, antimicrobial, and antithrombotic effects. Neurodegenerative diseases are typically caused by abnormal aggregations of proteins or peptides, and the depositions of these aggregates in or on neurons, disrupt signaling and eventually kill neurons. During recent years, research on short peptides has advanced tremendously. This review offers a brief introduction to peptide based therapeutics and their application in disease management and provides an overview of peptide vaccines, and toxicity related issues. In addition, the importance of peptides in the management of different neurodegenerative diseases and their therapeutic applications is discussed. The present review provides an understanding of peptides and their applications for the management of different diseases, but with focus on neurodegenerative diseases. The role of peptides as anti-cancer, antimicrobial and antidiabetic agents has also been discussed. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
Metal colloids and semiconductor quantum dots: Linear and nonlinear optical properties
NASA Technical Reports Server (NTRS)
Henderson, D. O.; My, R.; Tung, Y.; Ueda, A.; Zhu, J.; Collins, W. E.; Hall, Christopher
1995-01-01
One aspect of this project involves a collaborative effort with the Solid State Division of ORNL. The thrust behind this research is to develop ion implantation for synthesizing novel materials (quantum dots, wires and wells, and metal colloids) for applications in all-optical switching devices, up-conversion, and the synthesis of novel refractory materials. In general the host material is typically a glass such as optical-grade silica. The ions of interest are Au, Ag, Cd, Se, In, P, Sb, Ga and As. An emphasis is placed on host-guest interactions between the matrix and the implanted ion, and on how matrix effects and implantation parameters can be used to obtain designer-level optical devices tailored for specific applications. The specific materials of interest are: CdSe, CdTe, InAs, GaAs, InP, GaP, InSb, GaSb and InGaAs. A second aspect of this research program involves using porous glass (25-200 Å) for fabricating materials of finite size. In this part of the program, we are particularly interested in characterizing the thermodynamic and optical properties of these nanocomposite materials. We also address how the phase diagram of the confined material is altered by the interfacial properties between the confined material and the pore wall.
Friction Stir Welding Development at NASA, Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
McGill, Preston; Gentz, Steve (Technical Monitor)
2001-01-01
Friction stir welding (FSW) is a solid-state process that can be used to join materials without melting. The process was invented by The Welding Institute (TWI), Cambridge, England. Friction stir welding exhibits several advantages over fusion welding in that it produces welds with fewer defects and higher joint efficiency, and it is capable of joining alloys that are generally considered non-weldable with a fusion weld process. In 1994, NASA-Marshall began collaborating with TWI to transform FSW from a laboratory curiosity into a viable metal-joining process suitable for manufacturing hardware. While teamed with TWI, NASA-Marshall began its own FSW research and development effort to investigate possible aerospace applications for the FSW process. The work involved nearly all aspects of FSW development, including process modeling, scale-up issues, applications to advanced materials, and development of tooling to use FSW on components of the Space Shuttle, with particular emphasis on aluminum tanks. The friction stir welding process involves spinning a pin-tool at an appropriate speed, plunging it into the base metal pieces to be joined, and then translating it along the joint of the work pieces. In aluminum alloys the rotating speed typically ranges from 200 to 400 revolutions per minute and the translation speed is approximately two to five inches per minute. The pin-tool is inserted at a small lead angle from the axis normal to the work piece and requires significant loading along the axis of the tool. An anvil or reaction structure is required behind the welded material to react the load along the axis of the pin-tool. The process requires none of the external heat input, filler material, protective shielding gas, or inert atmosphere typical of fusion weld processes. The FSW solid-state weld process has resulted in aluminum welds with significantly higher strengths, higher joint efficiencies, and fewer defects than fusion welds used to join similar alloys.
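The quoted process parameters imply a characteristic tool advance per spindle revolution (the weld pitch). As an illustrative aside (the helper function below is not from the report; only the parameter ranges are), the extremes work out as follows:

```python
def weld_pitch(travel_in_per_min, rpm):
    """Tool advance per spindle revolution, in inches per revolution."""
    return travel_in_per_min / rpm

# Parameter ranges quoted above for aluminum alloys:
# 200-400 rpm rotation, 2-5 in/min travel
finest = weld_pitch(2.0, 400)    # slowest travel, fastest rotation
coarsest = weld_pitch(5.0, 200)  # fastest travel, slowest rotation
print(finest, coarsest)          # 0.005 to 0.025 inches per revolution
```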
NASA Astrophysics Data System (ADS)
Aquilanti, Vincenzo; Bitencourt, Ana Carla P.; Ferreira, Cristiane da S.; Marzuoli, Annalisa; Ragni, Mirco
2008-11-01
The mathematical apparatus of quantum-mechanical angular momentum (re)coupling, developed originally to describe spectroscopic phenomena in atomic, molecular, optical and nuclear physics, is embedded in modern algebraic settings which emphasize the underlying combinatorial aspects. SU(2) recoupling theory, involving Wigner's 3nj symbols, as well as the related problems of their calculations, general properties, asymptotic limits for large entries, nowadays plays a prominent role also in quantum gravity and quantum computing applications. We refer to the ingredients of this theory—and of its extension to other Lie and quantum groups—by using the collective term of 'spin networks'. Recent progress is recorded about the already established connections with the mathematical theory of discrete orthogonal polynomials (the so-called Askey scheme), providing powerful tools based on asymptotic expansions, which correspond on the physical side to various levels of semi-classical limits. These results are useful not only in theoretical molecular physics but also in motivating algorithms for the computationally demanding problems of molecular dynamics and chemical reaction theory, where large angular momenta are typically involved. As for quantum chemistry, applications of these techniques include selection and classification of complete orthogonal basis sets in atomic and molecular problems, either in configuration space (Sturmian orbitals) or in momentum space. In this paper, we list and discuss some aspects of these developments—such as for instance the hyperquantization algorithm—as well as a few applications to quantum gravity and topology, thus providing evidence of a unifying background structure.
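As an illustrative aside on the computational side of recoupling theory (not part of the paper itself), a Wigner 6j symbol can be evaluated directly from the Racah single-sum formula; the sketch below is restricted to integer angular momenta for simplicity:

```python
from math import factorial, sqrt, prod

def _tri(a, b, c):
    """Triangle coefficient Delta(a,b,c); zero if the triad is forbidden."""
    if a + b < c or abs(a - b) > c:
        return 0.0
    return sqrt(factorial(a + b - c) * factorial(a - b + c)
                * factorial(-a + b + c) / factorial(a + b + c + 1))

def wigner_6j(j1, j2, j3, j4, j5, j6):
    """Racah formula for {j1 j2 j3; j4 j5 j6}, integer arguments only."""
    pre = (_tri(j1, j2, j3) * _tri(j1, j5, j6)
           * _tri(j4, j2, j6) * _tri(j4, j5, j3))
    if pre == 0.0:
        return 0.0
    triads = (j1 + j2 + j3, j1 + j5 + j6, j4 + j2 + j6, j4 + j5 + j3)
    pairs = (j1 + j2 + j4 + j5, j2 + j3 + j5 + j6, j3 + j1 + j6 + j4)
    total = 0.0
    for t in range(max(triads), min(pairs) + 1):
        den = (prod(factorial(t - s) for s in triads)
               * prod(factorial(q - t) for q in pairs))
        total += (-1) ** t * factorial(t + 1) / den
    return pre * total

print(wigner_6j(1, 1, 1, 1, 1, 1))  # 1/6
```

For large entries this direct sum becomes numerically delicate, which is precisely where the asymptotic expansions discussed in the paper become useful.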
Fernandez, A M; Schrogie, J J; Wilson, W W; Nash, D B
1997-01-01
Technology assessment has become a rapidly growing component of the healthcare system. It has assumed a functional role in operational settings and increasingly drives decisions involving purchasing, coverage, and reimbursement. This review is intended to assist the healthcare decision maker in considering the application of technology assessment in healthcare, so as to maximize the efficiency of future purchasing decisions. This "best practice" was synthesized after identifying key institutions performing technology assessment in healthcare and analyzing their working processes, including literature review, consensus panel discussions, and expert opinion. We describe this best practice as a reiterative loop consisting of five processes: awareness, strategic appropriateness, analysis versus need, acquisition and implementation, and reassessment. Typical barriers to the adoption of technology assessment are also identified and discussed. This review suggests a common terminology for the core processes involved in technology assessment, thereby facilitating a more uniform understanding among the different components of the healthcare system (i.e., payer, provider, and society) while recognizing their different perspectives.
Schepis, M M; Reid, D H; Ownbey, J; Parsons, M B
2001-01-01
We evaluated a program for training 4 support staff to embed instruction within the existing activities of 5 children with disabilities in an inclusive preschool. The program involved classroom-based instruction, role playing, and feedback regarding how to effectively prompt, correct, and reinforce child behavior. Descriptions of naturally occurring teaching opportunities in which to use the teaching skills were also provided. Following classroom training, brief on-the-job training was provided to each staff member, followed by on-the-job feedback. Results indicated that each staff member increased her use of correct teaching procedures when training was implemented. Improvements in child performance accompanied each application of the staff training program. Results are discussed in terms of using effective staff training as one means of increasing the use of recommended intervention procedures in inclusive settings. Areas for future research could focus on training staff to embed other types of recommended practices within typical preschool routines involving children with disabilities.
Li, Yang; Li, Xiaotong; Shen, Fei; Wang, Zhanghong; Yang, Gang; Lin, Lili; Zhang, Yanzong; Zeng, Yongmei; Deng, Shihuai
2014-01-01
Although lignocellulosic biomass has been extensively regarded as the most important resource for bioethanol, its wide application is seriously restricted by the high transportation cost of biomass. Currently, biomass densification is regarded as an acceptable solution to this issue. Herein, briquettes, pellets and their corresponding undensified biomass were pretreated by dilute-NaOH and hydrothermal methods to investigate the responses of biomass densification to these typical water-involved pretreatments and subsequent enzymatic hydrolysis. Auto-swelling of the densified biomass was investigated before pretreatment. Results indicated pellets could be totally auto-swollen in an hour, while it took about 24 h for briquettes. When dilute-NaOH pretreatment was performed, biomass briquetting and pelleting improved the sugar conversion rate by 20.1% and 5.5%, respectively, compared with the corresponding undensified biomass. Pelleting improved the sugar conversion rate by 7.0% after hydrothermal pretreatment compared with the undensified biomass. However, briquetting disturbed hydrothermal pretreatment, decreasing the sugar conversion rate by 15.0%. Copyright © 2013 Elsevier Ltd. All rights reserved.
High-Q/V Monolithic Diamond Microdisks Fabricated with Quasi-isotropic Etching.
Khanaliloo, Behzad; Mitchell, Matthew; Hryciw, Aaron C; Barclay, Paul E
2015-08-12
Optical microcavities enhance light-matter interactions and are essential for many experiments in solid state quantum optics, optomechanics, and nonlinear optics. Single crystal diamond microcavities are particularly sought after for applications involving diamond quantum emitters, such as nitrogen vacancy centers, and for experiments that benefit from diamond's excellent optical and mechanical properties. Light-matter coupling rates in experiments involving microcavities typically scale with Q/V, where Q and V are the microcavity quality-factor and mode-volume, respectively. Here we demonstrate that microdisk whispering gallery mode cavities with high Q/V can be fabricated directly from bulk single crystal diamond. By using a quasi-isotropic oxygen plasma to etch along diamond crystal planes and undercut passivated diamond structures, we create monolithic diamond microdisks. Fiber taper based measurements show that these devices support TE- and TM-like optical modes with Q > 1.1 × 10^5 and V < 11(λ/n)^3 at a wavelength of 1.5 μm.
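As a quick sanity check on the reported bounds (the diamond refractive index n ≈ 2.4 used below is an assumption, not stated in the abstract), Q > 1.1 × 10^5 and V < 11(λ/n)^3 together imply Q/V > 10^4 in units of (λ/n)^-3:

```python
Q = 1.1e5                      # reported quality-factor lower bound
V = 11.0                       # reported mode-volume upper bound, (lam/n)**3
q_over_v = Q / V               # figure of merit, units of (lam/n)**-3
print(q_over_v)                # 10000.0

# Physical mode volume at lam = 1.5 um, assuming n = 2.4 for diamond
lam_um, n = 1.5, 2.4
v_um3 = V * (lam_um / n) ** 3
print(round(v_um3, 2))         # roughly 2.69 um^3
```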
CSMA Versus Prioritized CSMA for Air-Traffic-Control Improvement
NASA Technical Reports Server (NTRS)
Robinson, Daryl C.
2001-01-01
OPNET version 7.0 simulations are presented involving an important application of the Aeronautical Telecommunications Network (ATN): Controller Pilot Data Link Communications (CPDLC) over the Very High Frequency Data Link, Mode 2 (VDL-2). Communication is modeled for essentially all incoming and outgoing nonstop air traffic for just three United States cities: Cleveland, Cincinnati, and Detroit. There are 32 airports in the simulation, 29 of which are either sources or destinations for the air traffic of the aforementioned three airports. The simulation involves 111 Air Traffic Control (ATC) ground stations and 1,235 equally equipped aircraft, taking off, flying realistic free-flight trajectories, and landing in a 24-hr period. Collisionless, Prioritized Carrier Sense Multiple Access (CSMA) is successfully tested and compared with the traditional CSMA typically associated with VDL-2. The performance measures include latency, throughput, and packet loss. As expected, Prioritized CSMA is much quicker and more efficient than traditional CSMA. These simulation results show the potency of Prioritized CSMA for implementing low-latency, high-throughput, and efficient connectivity.
Goodspeed, Kimberly; Newsom, Cassandra; Morris, Mary Ann; Powell, Craig; Evans, Patricia; Golla, Sailaja
2018-03-01
Pitt-Hopkins syndrome (PTHS) is a rare genetic disorder caused by a molecular variant of TCF4, which is involved in embryologic neuronal differentiation. PTHS is characterized by syndromic facies, psychomotor delay, and intellectual disability. Other associated features include early-onset myopia, seizures, constipation, and hyperventilation-apneic spells. Many patients also meet criteria for autism spectrum disorder. Here the authors present a series of 23 PTHS patients with molecularly confirmed TCF4 variants and describe 3 unique individuals. The first carries a small deletion but exhibits neither the typical facial features nor the typical pattern of developmental delay. The second exhibits typical facial features but has attained more advanced motor and verbal skills than other cases reported to date. The third displays typical features of PTHS; however, he inherited a large chromosomal duplication involving TCF4 from his unaffected father, who exhibits somatic mosaicism. To the authors' knowledge, this is the first chromosomal duplication case reported to date.
Involving Teachers in Charter School Governance: A Guide for State Policymakers
ERIC Educational Resources Information Center
Sam, Cecilia
2008-01-01
This guide for state policymakers examines teacher involvement in charter school governance. Teacher involvement is defined to include the gamut of decision-making roles not typically afforded teachers in traditional public schools, including founding schools, serving on governing boards, and engaging in site-based collective bargaining. Different…
ERIC Educational Resources Information Center
Gilbert, George L., Ed.
1985-01-01
Background information, procedures, and typical results obtained are provided for two demonstrations. The first involves the colorful complexes of copper(II). The second involves reverse-phase separation of Food, Drug, and Cosmetic (FD & C) dyes using a solvent gradient. (JN)
Processing of speech signals for physical and sensory disabilities.
Levitt, H
1995-01-01
Assistive technology involving voice communication is used primarily by people who are deaf, hard of hearing, or who have speech and/or language disabilities. It is also used to a lesser extent by people with visual or motor disabilities. A very wide range of devices has been developed for people with hearing loss. These devices can be categorized not only by the modality of stimulation [i.e., auditory, visual, tactile, or direct electrical stimulation of the auditory nerve (auditory-neural)] but also in terms of the degree of speech processing that is used. At least four such categories can be distinguished: assistive devices (a) that are not designed specifically for speech, (b) that take the average characteristics of speech into account, (c) that process articulatory or phonetic characteristics of speech, and (d) that embody some degree of automatic speech recognition. Assistive devices for people with speech and/or language disabilities typically involve some form of speech synthesis or symbol generation for severe forms of language disability. Speech synthesis is also used in text-to-speech systems for sightless persons. Other applications of assistive technology involving voice communication include voice control of wheelchairs and other devices for people with mobility disabilities. PMID:7479816
[Local fixation of antibiotics by fibrin spray: in bone defects with soft tissue involvement].
Janko, Maren; Nau, Christoph; Marzi, Ingo; Frank, Johannes
2017-02-01
In acute and chronic bone infections with concomitant soft tissue involvement, the current gold standard is radical surgical debridement, including explantation of infected prosthetic devices, followed by initiation of systemic antibiotic therapy appropriate for the antibiogram. Several revision operations are often necessary to achieve complete healing. Additional treatment with local antibiotics or antibiotic-containing substances is routinely used in bone surgery. Beyond the typical procedures with commercially available products, we conducted a study of 21 patients treated with local antibiotics applied in combination with the fibrin glue spray technique and evaluated the results. Of nine wounds of the lower extremities with bone involvement, complete healing was achieved in eight cases. We were also successful in two of three very complex pelvic wounds; however, as expected, the implant infections were complicated. Of the seven most desperate cases, we achieved complete long-term healing in only two. In the meantime, we routinely use the described method in such disastrous infection situations, although only in combination with established surgical procedures in sepsis surgery and anti-infection management.
Image re-sampling detection through a novel interpolation kernel.
Hilal, Alaa
2018-06-01
Image re-sampling, involved in re-size and rotation transformations, is an essential building block of typical digital image alteration. Fortunately, traces left by such processes are detectable, proving that the image has undergone a re-sampling transformation. Within this context, we present in this paper two original contributions. First, we propose a new re-sampling interpolation kernel. It depends on five independent parameters that control its amplitude, angular frequency, standard deviation, and duration. We then demonstrate its capacity to imitate the behavior of the interpolation kernels most frequently used in digital image re-sampling applications. Second, the proposed model is used to characterize and detect the correlation coefficients involved in re-sampling transformations. The process involves minimizing an error function using the gradient method. The proposed method is assessed over a large database of 11,000 re-sampled images. Additionally, it is implemented within an algorithm in order to assess images that have undergone complex transformations. The results obtained demonstrate better performance and reduced processing time compared to a reference method, validating the suitability of the proposed approaches. Copyright © 2018 Elsevier B.V. All rights reserved.
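To make the minimization step concrete: the sketch below fits a parametric kernel to samples of the standard triangle (linear-interpolation) kernel by gradient descent on a squared-error function, using numeric gradients and a backtracking step size. The kernel form and parameter names are hypothetical stand-ins, not the paper's actual five-parameter model:

```python
import math

# Target: samples of the triangle (linear-interpolation) kernel on [-2, 2]
xs = [i / 10 for i in range(-20, 21)]
target = [max(0.0, 1.0 - abs(x)) for x in xs]

def kernel(x, amp, freq, sigma):
    """Hypothetical parametric kernel: Gaussian envelope times a cosine."""
    return amp * math.exp(-x * x / (2 * sigma * sigma)) * math.cos(freq * x)

def loss(p):
    """Squared error between the parametric kernel and the target samples."""
    return sum((kernel(x, *p) - t) ** 2 for x, t in zip(xs, target))

p = [0.5, 0.5, 1.5]            # initial amplitude, angular frequency, std
loss0 = loss(p)
h = 1e-6                       # forward-difference step for numeric gradients
for _ in range(200):
    grad = []
    for i in range(len(p)):
        q = list(p); q[i] += h
        grad.append((loss(q) - loss(p)) / h)
    step = 0.05                # backtracking keeps the loss decreasing
    while step > 1e-9:
        cand = [pi - step * gi for pi, gi in zip(p, grad)]
        if loss(cand) < loss(p):
            p = cand
            break
        step /= 2

print(loss0, loss(p))          # fitted loss is lower than the initial loss
```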
29 CFR 780.210 - The typical hatchery operations constitute “agriculture.”
Code of Federal Regulations, 2014 CFR
2014-07-01
... EXEMPTIONS APPLICABLE TO AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Agriculture as It Relates to Specific Situations Hatchery Operations § 780.210 The typical hatchery operations constitute “agriculture.” As stated in § 780.127, the typical hatchery...
29 CFR 780.210 - The typical hatchery operations constitute “agriculture.”
Code of Federal Regulations, 2010 CFR
2010-07-01
... EXEMPTIONS APPLICABLE TO AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Agriculture as It Relates to Specific Situations Hatchery Operations § 780.210 The typical hatchery operations constitute “agriculture.” As stated in § 780.127, the typical hatchery...
29 CFR 780.210 - The typical hatchery operations constitute “agriculture.”
Code of Federal Regulations, 2011 CFR
2011-07-01
... EXEMPTIONS APPLICABLE TO AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Agriculture as It Relates to Specific Situations Hatchery Operations § 780.210 The typical hatchery operations constitute “agriculture.” As stated in § 780.127, the typical hatchery...
29 CFR 780.210 - The typical hatchery operations constitute “agriculture.”
Code of Federal Regulations, 2013 CFR
2013-07-01
... EXEMPTIONS APPLICABLE TO AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Agriculture as It Relates to Specific Situations Hatchery Operations § 780.210 The typical hatchery operations constitute “agriculture.” As stated in § 780.127, the typical hatchery...
29 CFR 780.210 - The typical hatchery operations constitute “agriculture.”
Code of Federal Regulations, 2012 CFR
2012-07-01
... EXEMPTIONS APPLICABLE TO AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Agriculture as It Relates to Specific Situations Hatchery Operations § 780.210 The typical hatchery operations constitute “agriculture.” As stated in § 780.127, the typical hatchery...
Filippini, D; Tejle, K; Lundström, I
2005-08-15
The computer screen photo-assisted technique (CSPT), a method for substance classification based on spectral fingerprinting that involves just a computer screen and a web camera as the measuring platform, is used here for the evaluation of a prospective enzyme-linked immunosorbent assay (ELISA). An anti-neutrophil cytoplasm antibodies (ANCA) ELISA test, typically used for diagnosing patients suffering from chronic inflammatory disorders of the skin, joints, blood vessels and other tissues, is comparatively tested with a standard microplate reader and with CSPT, yielding equivalent results at a fraction of the instrumental cost. The CSPT approach is discussed as a distributed measuring platform allowing decentralized measurements in routine applications while keeping information management centralized, owing to its naturally network-embedded operation.
Numerical investigation of finite-volume effects for the HVP
NASA Astrophysics Data System (ADS)
Boyle, Peter; Gülpers, Vera; Harrison, James; Jüttner, Andreas; Portelli, Antonin; Sachrajda, Christopher
2018-03-01
It is important to correct for finite-volume (FV) effects in the presence of QED, since these effects are typically large due to the long range of the electromagnetic interaction. We recently made the first lattice calculation of electromagnetic corrections to the hadronic vacuum polarisation (HVP). For the HVP, an analytical derivation of FV corrections involves a two-loop calculation which has not yet been carried out. We instead calculate the universal FV corrections numerically, using lattice scalar QED as an effective theory. We show that this method gives agreement with known analytical results for scalar mass FV effects, before applying it to calculate FV corrections for the HVP. This method for numerical calculation of FV effects is also widely applicable to quantities beyond the HVP.
On the Use of Surface Porosity to Reduce Unsteady Lift
NASA Technical Reports Server (NTRS)
Tinetti, Ana F.; Kelly, Jeffrey J.; Bauer, Steven X. S.; Thomas, Russell H.
2001-01-01
An innovative application of existing technology is proposed for attenuating the effects of transient phenomena, such as rotor-stator and rotor-strut interactions, linked to noise and fatigue failure in turbomachinery environments. A computational study was designed to assess the potential of passive porosity technology as a mechanism for alleviating interaction effects by reducing the unsteady lift developed on a stator airfoil subject to wake impingement. The study involved a typical high-bypass fan stator airfoil (solid baseline and several porous configurations), immersed in a free field and exposed to the effects of a transversely moving wake. It was found that, for the airfoil under consideration, the magnitude of the unsteady lift could be reduced by more than 18% without incurring significant performance losses.
Fabrication of boron sputter targets
Makowiecki, D.M.; McKernan, M.A.
1995-02-28
A process is disclosed for fabricating high-density boron sputtering targets with sufficient mechanical strength to function reliably at typical magnetron sputtering power densities and at normal process parameters. The process involves the fabrication of a high-density boron monolith by hot isostatically compacting high-purity (99.9%) boron powder, machining the boron monolith to the final dimensions, and brazing the finished boron piece to a matching boron carbide (B{sub 4}C) piece by placing aluminum foil between them and applying pressure and heat in a vacuum. An alternative is the application of aluminum metallization to the back of the boron monolith by vacuum deposition. Also, a titanium-based vacuum braze alloy can be used in place of the aluminum foil. 7 figs.
NASA Astrophysics Data System (ADS)
Yahiaoui, Riad; Manjappa, Manukumara; Srivastava, Yogesh Kumar; Singh, Ranjan
2017-07-01
Electromagnetically induced transparency (EIT) arises from coupling between the bright and dark mode resonances that typically involve subwavelength structures with broken symmetry, which results in an extremely sharp transparency band. Here, we demonstrate a tunable broadband EIT effect in a symmetry preserved metamaterial structure at the terahertz frequencies. Alongside, we also envisage a photo-active EIT effect in a hybrid metal-semiconductor metamaterial, where the transparency window can be dynamically switched by shining near-infrared light beam. A robust coupled oscillator model explains the coupling mechanism in the proposed design, which shows a good agreement with the observed results on tunable broadband transparency effect. Such active, switchable, and broadband metadevices could have applications in delay bandwidth management, terahertz filtering, and slow light effects.
In search of how people change. Applications to addictive behaviors.
Prochaska, J O; DiClemente, C C; Norcross, J C
1992-09-01
How people intentionally change addictive behaviors with and without treatment is not well understood by behavioral scientists. This article summarizes research on self-initiated and professionally facilitated change of addictive behaviors using the key transtheoretical constructs of stages and processes of change. Modification of addictive behaviors involves progression through five stages (precontemplation, contemplation, preparation, action, and maintenance), and individuals typically recycle through these stages several times before termination of the addiction. Multiple studies provide strong support for these stages as well as for a finite and common set of change processes used to progress through the stages. Research to date supports a transtheoretical model of change that systematically integrates the stages with processes of change from diverse theories of psychotherapy.
Ghosh, Samik; Kim, Ki-Hyun; Sohn, Jong Ryeul
2011-01-01
In this study, we examined the patterns of VOCs released from used Tedlar bags that had previously been used for sample collection under strong source activities. In this way, we attempted to account for the possible bias associated with the repetitive use of Tedlar bags. To this end, we selected bags that had never been heated. All of the target bags had been used at ambient temperature (typically at or below 30°C). These bags were also handled carefully to avoid any mechanical abrasion. This study provides essential information regarding the interaction between VOCs and Tedlar bag materials as a potential source of bias in bag sampling approaches. PMID:22235175
USDA-ARS?s Scientific Manuscript database
Introduction: Detection of foodborne pathogens typically involves microbiological enrichment with subsequent isolation and identification of a pure culture. This is typically followed by strain typing, which provides information critical to outbreak and source investigations. In the early 1990’s pul...
ERIC Educational Resources Information Center
Wand, Sean; Thermos, Adam C.
1998-01-01
Explains the issues to consider before a college decides to purchase a card-access system. The benefits of automation, questions involving implementation, the criteria for technology selection, what typical card technology involves, privacy concerns, and the placement of card readers are discussed. (GR)
Embedded I&C for Extreme Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kisner, Roger A.
2016-04-01
This project uses embedded instrumentation and control (I&C) technologies to demonstrate potential performance gains of nuclear power plant components in extreme environments. Extreme environments include high-temperature, radiation, high-pressure, high-vibration, and high-EMI conditions. For extreme environments, performance gains arise from moment-to-moment sensing of local variables and immediate application of local feedback control. Planning for embedded I&C during early system design phases contrasts with the traditional, serial design approach that incorporates minimal I&C after mechanical and electrical design is complete. The demonstration application involves the development and control of a novel, proof-of-concept motor/pump design. The motor and pump combination operates within the fluid environment, eliminating the need for rotating seals. Actively controlled magnetic bearings also replace the failure-prone mechanical contact bearings that typically suspend rotating components. Such a design has the potential to significantly enhance the reliability and life of the pumping system and would not be possible without embedded I&C.
Spin-Transfer Studies in Magnetic Multilayer Nanostructures
NASA Astrophysics Data System (ADS)
Emley, N. C.; Albert, F. J.; Ryan, E. M.; Krivorotov, I. N.; Ralph, D. C.; Buhrman, R. A.
2003-03-01
Numerous experiments have demonstrated current-induced magnetization reversal in ferromagnet/paramagnet/ferromagnet nanostructures with the current in the CPP geometry. The primary mechanism for this reversal is the transfer of angular momentum from the spin-polarized conduction electrons to the nanomagnet moment: the spin-transfer effect. This phenomenon has potential application in nanoscale, current-controlled non-volatile memory elements, but several challenges must be overcome for realistic device implementation. Typical Co/Cu/Co nanopillar devices, although effective for fundamental studies, are not advantageous for technological applications because of their large switching currents Ic (3-10 mA) and small R·A (< 1 mΩ·µm^2). Here we report initial results testing some possible approaches for enhancing spin-transfer device performance, which involve the addition of more layers, and hence more complexity, to the simple Co/Cu/Co trilayer structure. These additions include synthetic antiferromagnet (SAF) layers, exchange-biased layers, nano-oxide layers (NOL), and additional magnetic layers. Research supported by NSF and DARPA.
Current trends in small vocabulary speech recognition for equipment control
NASA Astrophysics Data System (ADS)
Doukas, Nikolaos; Bardis, Nikolaos G.
2017-09-01
Speech recognition systems allow human - machine communication to acquire an intuitive nature that approaches the simplicity of inter - human communication. Small vocabulary speech recognition is a subset of the overall speech recognition problem, where only a small number of words need to be recognized. Speaker independent small vocabulary recognition can find significant applications in field equipment used by military personnel. Such equipment may typically be controlled by a small number of commands that need to be given quickly and accurately, under conditions where delicate manual operations are difficult to achieve. This type of application could hence significantly benefit by the use of robust voice operated control components, as they would facilitate the interaction with their users and render it much more reliable in times of crisis. This paper presents current challenges involved in attaining efficient and robust small vocabulary speech recognition. These challenges concern feature selection, classification techniques, speaker diversity and noise effects. A state machine approach is presented that facilitates the voice guidance of different equipment in a variety of situations.
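A classic, lightweight classification technique for speaker-dependent small-vocabulary recognition is template matching with dynamic time warping (DTW), which tolerates the timing variation of spoken commands. The sketch below runs on made-up 1-D feature tracks; a real system would compare frame-wise spectral features such as MFCCs:

```python
def dtw_distance(a, b):
    """Dynamic-time-warping distance between two feature sequences."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

def recognize(utterance, templates):
    """Return the vocabulary word whose template is nearest under DTW."""
    return min(templates, key=lambda w: dtw_distance(utterance, templates[w]))

# Toy templates for a two-word vocabulary (hypothetical feature tracks)
templates = {"stop": [0, 1, 2, 1, 0], "go": [2, 2, 0, 0, 2]}
# A time-stretched rendition of "stop" should still match "stop"
print(recognize([0, 0, 1, 1, 2, 2, 1, 0], templates))  # stop
```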
The ReaxFF reactive force-field: Development, applications, and future directions
Senftle, Thomas; Hong, Sungwook; Islam, Md Mahbubul; ...
2016-03-04
The reactive force-field (ReaxFF) interatomic potential is a powerful computational tool for exploring, developing and optimizing material properties. Methods based on the principles of quantum mechanics (QM), while offering valuable theoretical guidance at the electronic level, are often too computationally intense for simulations that consider the full dynamic evolution of a system. Alternatively, empirical interatomic potentials that are based on classical principles require significantly fewer computational resources, which enables simulations to better describe dynamic processes over longer timeframes and on larger scales. Such methods, however, typically require a predefined connectivity between atoms, precluding simulations that involve reactive events. The ReaxFF method was developed to help bridge this gap. Approaching the gap from the classical side, ReaxFF casts the empirical interatomic potential within a bond-order formalism, thus implicitly describing chemical bonding without expensive QM calculations. This article provides an overview of the development, application, and future directions of the ReaxFF method.
Bharucha, Ashok J.; Anand, Vivek; Forlizzi, Jodi; Dew, Mary Amanda; Reynolds, Charles F.; Stevens, Scott; Wactlar, Howard
2009-01-01
The number of older Americans afflicted by Alzheimer disease and related dementias will triple to 13 million persons by 2050, thus greatly increasing healthcare needs. An approach to this emerging crisis is the development and deployment of intelligent assistive technologies that compensate for the specific physical and cognitive deficits of older adults with dementia, and thereby also reduce caregiver burden. The authors conducted an extensive search of the computer science, engineering, and medical databases to review intelligent cognitive devices, physiologic and environmental sensors, and advanced integrated sensor networks that may find future applications in dementia care. Review of the extant literature reveals an overwhelming focus on the physical disability of younger persons with typically nonprogressive anoxic and traumatic brain injuries, with few clinical studies specifically involving persons with dementia. A discussion of the specific capabilities, strengths, and limitations of each technology is followed by an overview of research methodological challenges that must be addressed to achieve measurable progress to meet the healthcare needs of an aging America. PMID:18849532
Application of the AHP method in modeling the trust and reputation of software agents
NASA Astrophysics Data System (ADS)
Zytniewski, Mariusz; Klementa, Marek; Skorupka, Dariusz; Stanek, Stanislaw; Duchaczek, Artur
2016-06-01
Given the unique characteristics of cyberspace and, in particular, the number of inherent security threats, communication between software agents becomes a highly complex issue and a major challenge that, on the one hand, needs to be continuously monitored and, on the other, awaits new solutions addressing its vulnerabilities. An approach that has recently come into view mimics mechanisms typical of social systems and is based on trust and reputation that assist agents in deciding which other agents to interact with. The paper offers an enhancement to existing trust and reputation models, involving the application of the AHP method that is widely used for decision support in social systems, notably for risks analysis. To this end, it is proposed to expand the underlying conceptual basis by including such notions as self-trust and social trust, and to apply these to software agents. The discussion is concluded with an account of an experiment aimed at testing the effectiveness of the proposed solution.
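The AHP step the paper applies to trust modeling boils down to extracting priority weights from a pairwise-comparison matrix via its principal eigenvector. A minimal sketch follows; the comparison values and the three trust criteria are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Hedged sketch of the AHP priority-weight computation: the principal
# eigenvector of a positive reciprocal pairwise-comparison matrix,
# normalized to sum to one. Matrix entries are illustrative.
def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)            # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    return w / w.sum()                  # normalize to priorities

# Hypothetical criteria: self-trust vs social trust vs reputation,
# compared on Saaty's 1-9 scale (reciprocal entries below the diagonal).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
weights = ahp_weights(A)
```

The resulting vector ranks the criteria; in a trust model these weights would then scale the per-criterion scores an agent assigns to a potential interaction partner.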
Multilevel acceleration of scattering-source iterations with application to electron transport
Drumm, Clif; Fan, Wesley
2017-08-18
Acceleration/preconditioning strategies available in the SCEPTRE radiation transport code are described. A flexible transport synthetic acceleration (TSA) algorithm that uses a low-order discrete-ordinates (SN) or spherical-harmonics (PN) solve to accelerate convergence of a high-order SN source-iteration (SI) solve is described. Convergence of the low-order solves can be further accelerated by applying off-the-shelf incomplete-factorization or algebraic-multigrid methods. Also available is an algorithm that uses a generalized minimum residual (GMRES) iterative method rather than SI for convergence, using a parallel sweep-based solver to build up a Krylov subspace. TSA has been applied as a preconditioner to accelerate the convergence of the GMRES iterations. The methods are applied to several problems involving electron transport and problems with artificial cross sections with large scattering ratios. These methods were compared and evaluated by considering material discontinuities and scattering anisotropy. Observed accelerations obtained are highly problem dependent, but speedup factors around 10 have been observed in typical applications.
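The idea of using a cheap approximate solve to precondition a Krylov iteration can be illustrated with SciPy. This is not SCEPTRE and not a transport sweep; it is a generic sketch in which an incomplete LU factorization plays the role of the low-order accelerator for GMRES:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Hedged sketch: preconditioned GMRES on a simple 1D diffusion-like
# operator, standing in for the high-order transport system. The ILU
# factorization plays the role of the cheap low-order solve.
n = 200
main = 2.5 * np.ones(n)                  # diagonal (removal + coupling)
off = -1.0 * np.ones(n - 1)              # nearest-neighbor coupling
A = sp.diags([main, off, off], [0, -1, 1], format="csc")
b = np.ones(n)

ilu = spla.spilu(A)                                # incomplete factorization
M = spla.LinearOperator((n, n), ilu.solve)         # preconditioner action

x, info = spla.gmres(A, b, M=M, atol=1e-10)
residual = np.linalg.norm(b - A @ x)
```

With a good preconditioner the Krylov subspace needed for convergence stays small, which is the same mechanism by which TSA accelerates the GMRES iterations in the abstract above.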
Pollitz, F.F.
2002-01-01
I present a new algorithm for calculating seismic wave propagation through a three-dimensional heterogeneous medium using the framework of mode coupling theory originally developed to perform very low frequency (f < ~0.01-0.05 Hz) seismic wavefield computation. It is a Green's function approach for multiple scattering within a defined volume and employs a truncated traveling wave basis set using the locked mode approximation. Interactions between incident and scattered wavefields are prescribed by mode coupling theory and account for the coupling among surface waves, body waves, and evanescent waves. The described algorithm is, in principle, applicable to global and regional wave propagation problems, but I focus on higher frequency (typically f ≳ 0.25 Hz) applications at regional and local distances where the locked mode approximation is best utilized and which involve wavefields strongly shaped by propagation through a highly heterogeneous crust. Synthetic examples are shown for P-SV-wave propagation through a semi-ellipsoidal basin and SH-wave propagation through a fault zone.
Future trends in commercial and military systems
NASA Astrophysics Data System (ADS)
Bond, F. E.
Commercial and military satellite communication systems are addressed, with a review of current applications and typical communication characteristics of the space and earth segments. Drivers for the development of future commercial systems include: the pervasion of digital techniques and services, growing orbit and frequency congestion, demand for more entertainment, and the large potential market for commercial 'roof-top' service. For military systems, survivability, improved flexibility, and the need for service to small mobile terminals are the principal factors involved. Technical trends include the use of higher frequency bands, multibeam antennas and a significant increase in the application of onboard processing. Military systems will employ a variety of techniques to counter both physical and electronic threats. The use of redundant transmission paths is a particularly effective approach. Successful implementation requires transmission standards to achieve the required interoperability among the pertinent networks. For both the military and commercial sectors, the trend toward larger numbers of terminals and more complex spacecraft is still persisting.
Bioinspired Methodology for Artificial Olfaction
Raman, Baranidharan; Hertz, Joshua L.; Benkstein, Kurt D.; Semancik, Steve
2008-01-01
Artificial olfaction is a potential tool for noninvasive chemical monitoring. Application of “electronic noses” typically involves recognition of “pretrained” chemicals, while long-term operation and generalization of training to allow chemical classification of “unknown” analytes remain challenges. The latter analytical capability is critically important, as it is unfeasible to pre-expose the sensor to every analyte it might encounter. Here, we demonstrate a biologically inspired approach where the recognition and generalization problems are decoupled and resolved in a hierarchical fashion. Analyte composition is refined in a progression from general (e.g., target is a hydrocarbon) to precise (e.g., target is ethane), using highly optimized response features for each step. We validate this approach using a MEMS-based chemiresistive microsensor array. We show that this approach, a unique departure from existing methodologies in artificial olfaction, allows the recognition module to better mitigate sensor-aging effects and to better classify unknowns, enhancing the utility of chemical sensors for real-world applications. PMID:18855409
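The hierarchical "general to precise" refinement can be sketched as a two-stage classifier: one model picks the chemical family, and a separate per-family model picks the specific analyte, each free to use its own optimized features. The class tree, centroids, and feature values below are illustrative assumptions, not the paper's sensor data:

```python
import numpy as np

# Hedged sketch of hierarchical analyte recognition: nearest-centroid
# classification first at the family level, then within the chosen family.
# All values are synthetic; feature 0 separates families, feature 1 members.
tree = {"hydrocarbon": ["methane", "ethane"],
        "alcohol":     ["methanol", "ethanol"]}
centroids = {
    ("hydrocarbon",): np.array([1.0, 0.0]),
    ("alcohol",):     np.array([-1.0, 0.0]),
    ("hydrocarbon", "methane"): np.array([1.0, 1.0]),
    ("hydrocarbon", "ethane"):  np.array([1.0, -1.0]),
    ("alcohol", "methanol"):    np.array([-1.0, 1.0]),
    ("alcohol", "ethanol"):     np.array([-1.0, -1.0]),
}

def classify(x: np.ndarray):
    """Return (family, member) by hierarchical nearest-centroid matching."""
    family = min(tree, key=lambda f: np.linalg.norm(x - centroids[(f,)]))
    member = min(tree[family],
                 key=lambda m: np.linalg.norm(x - centroids[(family, m)]))
    return family, member

print(classify(np.array([0.9, -0.8])))   # ('hydrocarbon', 'ethane')
```

Decoupling the levels is what lets an unknown analyte still receive a correct coarse label (generalization) even when no fine-level centroid matches it well.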
An fMRI Study of Parietal Cortex Involvement in the Visual Guidance of Locomotion
ERIC Educational Resources Information Center
Billington, Jac; Field, David T.; Wilkie, Richard M.; Wann, John P.
2010-01-01
Locomoting through the environment typically involves anticipating impending changes in heading trajectory in addition to maintaining the current direction of travel. We explored the neural systems involved in the "far road" and "near road" mechanisms proposed by Land and Horwood (1995) using simulated forward or backward travel where participants…
ERIC Educational Resources Information Center
Potter, Carol
2016-01-01
Father involvement in education has been shown to result in a range of positive outcomes for typically developing children. However, the nature of paternal involvement in the education of children with disabilities and especially autism has been under-researched and is little understood. This study aimed to explore the nature of the involvement of…
A graphic user interface for efficient 3D photo-reconstruction based on free software
NASA Astrophysics Data System (ADS)
Castillo, Carlos; James, Michael; Gómez, Jose A.
2015-04-01
Recently, different studies have stressed the applicability of 3D photo-reconstruction based on Structure from Motion algorithms in a wide range of geoscience applications. For the purpose of image photo-reconstruction, a number of commercial and freely available software packages have been developed (e.g. Agisoft Photoscan, VisualSFM). The workflow involves typically different stages such as image matching, sparse and dense photo-reconstruction, point cloud filtering and georeferencing. For approaches using open and free software, each of these stages usually require different applications. In this communication, we present an easy-to-use graphic user interface (GUI) developed in Matlab® code as a tool for efficient 3D photo-reconstruction making use of powerful existing software: VisualSFM (Wu, 2015) for photo-reconstruction and CloudCompare (Girardeau-Montaut, 2015) for point cloud processing. The GUI performs as a manager of configurations and algorithms, taking advantage of the command line modes of existing software, which allows an intuitive and automated processing workflow for the geoscience user. The GUI includes several additional features: a) a routine for significantly reducing the duration of the image matching operation, normally the most time consuming stage; b) graphical outputs for understanding the overall performance of the algorithm (e.g. camera connectivity, point cloud density); c) a number of useful options typically performed before and after the photo-reconstruction stage (e.g. removal of blurry images, image renaming, vegetation filtering); d) a manager of batch processing for the automated reconstruction of different image datasets. In this study we explore the advantages of this new tool by testing its performance using imagery collected in several soil erosion applications. References Girardeau-Montaut, D. 2015. CloudCompare documentation accessed at http://cloudcompare.org/ Wu, C. 2015. 
VisualSFM documentation accessed at http://ccwu.me/vsfm/doc.html#.
Nabavizadeh, Seyed Ali; Mamourian, Alexander; Schmitt, James E; Cloran, Francis; Vossough, Arastoo; Pukenas, Bryan; Loevner, Laurie A; Mohan, Suyash
2016-01-01
While haemangiomas are common benign vascular lesions involving the spine, some behave in an aggressive fashion. We investigated the utility of fat-suppressed sequences to differentiate between benign and aggressive vertebral haemangiomas. Patients with the diagnosis of aggressive vertebral haemangioma and available short tau inversion-recovery or T2 fat saturation sequence were included in the study. 11 patients with typical asymptomatic vertebral body haemangiomas were selected as the control group. Region of interest signal intensity (SI) analysis of the entire haemangioma as well as the portion of each haemangioma with highest signal on fat-saturation sequences was performed and normalized to a reference normal vertebral body. A total of 8 patients with aggressive vertebral haemangioma and 11 patients with asymptomatic typical vertebral haemangioma were included. There were significant differences between the groups in total normalized mean SI ratio (3.14 vs 1.48, p = 0.0002), total normalized maximum SI ratio (5.72 vs 2.55, p = 0.0003), brightest normalized mean SI ratio (4.28 vs 1.72, p < 0.0001) and brightest normalized maximum SI ratio (5.25 vs 2.45, p = 0.0003). Multiple measures were able to discriminate between groups with high sensitivity (>88%) and specificity (>82%). In addition to conventional imaging features such as vertebral expansion and the presence of an extravertebral component, quantitative evaluation of fat-suppression sequences is another imaging feature that can differentiate aggressive haemangioma and typical asymptomatic haemangioma. The use of quantitative fat-suppressed MRI in vertebral haemangiomas is demonstrated. Quantitative fat-suppressed MRI can have a role in confirming the diagnosis of aggressive haemangiomas. In addition, this application can be further investigated in future studies to predict aggressiveness of vertebral haemangiomas in early stages.
ERIC Educational Resources Information Center
Egbert, Robert I.; Stone, Lorene H.; Adams, David L.
2011-01-01
Four-year cooperative engineering programs are becoming more common in the United States. Cooperative engineering programs typically involve a "parent" institution with an established engineering program and one or more "satellite" institutions which typically have few or no engineering programs and are located in an area where…
Distributed Group Design Process: Lessons Learned.
ERIC Educational Resources Information Center
Eseryel, Deniz; Ganesan, Radha
A typical Web-based training development team consists of a project manager, an instructional designer, a subject-matter expert, a graphic artist, and a Web programmer. The typical scenario involves team members working together in the same setting during the entire design and development process. What happens when the team is distributed, that is…
Planning for robust reserve networks using uncertainty analysis
Moilanen, A.; Runge, M.C.; Elith, Jane; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.
2006-01-01
Planning land-use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence/absence in sites, or on species-specific distributions of model predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence/absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there is no uncertainty in the data or models. Having two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within a framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. Search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer-programming and stochastic global search.
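The modification the abstract describes (optimizing a worst-case rather than nominal objective, while keeping a standard stepwise heuristic) can be sketched as follows. This is an illustrative toy, not the authors' algorithm: each site carries an occurrence-probability estimate with an error horizon, and sites are added greedily to maximize the lower-bound coverage:

```python
import numpy as np

# Hedged sketch of an uncertainty-aware stepwise reserve-selection heuristic.
# Each site i and species j has a point estimate p[i, j] and an error
# horizon e[i, j]; robustness is sought against the worst case p - e.
rng = np.random.default_rng(0)
n_sites, n_species = 20, 5
p = rng.uniform(0.0, 1.0, size=(n_sites, n_species))   # point estimates
e = rng.uniform(0.0, 0.3, size=(n_sites, n_species))   # error horizons
lower = np.clip(p - e, 0.0, 1.0)                        # worst-case values

def greedy_robust(lower: np.ndarray, budget: int) -> list:
    """Greedily add sites maximizing worst-case coverage (capped at 1)."""
    chosen = []
    covered = np.zeros(lower.shape[1])
    for _ in range(budget):
        gains = []
        for i in range(lower.shape[0]):
            if i in chosen:
                gains.append(-np.inf)
            else:
                gains.append(np.minimum(covered + lower[i], 1.0).sum()
                             - covered.sum())
        best = int(np.argmax(gains))
        chosen.append(best)
        covered = np.minimum(covered + lower[best], 1.0)
    return chosen

reserve = greedy_robust(lower, budget=5)
```

Running the same heuristic on `p` instead of `lower` recovers the conventional, uncertainty-blind selection, making the robustness modification a one-line swap of the objective inputs.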
Topical anaesthesia for needle-related pain in newborn infants.
Foster, Jann P; Taylor, Christine; Spence, Kaye
2017-02-04
Hospitalised newborn neonates frequently undergo painful invasive procedures that involve penetration of the skin and other tissues by a needle. One intervention that can be used prior to a needle insertion procedure is application of a topical local anaesthetic. To evaluate the efficacy and safety of topical anaesthetics such as amethocaine and EMLA in newborn term or preterm infants requiring an invasive procedure involving puncture of skin and other tissues with a needle. We searched the Cochrane Central Register of Controlled Trials (CENTRAL), PubMed, Embase and CINAHL up to 15 May 2016; previous reviews including cross-references, abstracts, and conference proceedings. We contacted expert informants. We contacted authors directly to obtain additional data. We imposed no language restrictions. Randomised, quasi-randomised controlled trials, and cluster and cross-over randomised trials that compared the topical anaesthetics amethocaine and eutectic mixture of local anaesthetics (EMLA) in terms of anaesthetic efficacy and safety in newborn term or preterm infants requiring an invasive procedure involving puncture of skin and other tissues with a needle. Data collection and analysis: From the reports of the clinical trials we extracted data regarding clinical outcomes including pain, number of infants with a methaemoglobin level of 5% and above, number of needle prick attempts prior to successful needle-related procedure, crying, time taken to complete the procedure, episodes of apnoea, episodes of bradycardia, episodes of oxygen desaturation, neurodevelopmental disability and other adverse events. Eight small randomised controlled trials met the inclusion criteria (n = 506). These studies compared either EMLA and placebo or amethocaine and placebo. No studies compared EMLA and amethocaine. We were unable to meta-analyse the outcome of pain due to differing outcome measures and methods of reporting.
For EMLA, two individual studies reported a statistically significant reduction in pain compared to placebo during lumbar puncture and venepuncture. Three studies found no statistical difference between the groups during heel lancing. For amethocaine, three studies reported a statistically significant reduction in pain compared to placebo during venepuncture and one study reported a statistically significant reduction in pain compared to placebo during cannulation. One study reported no statistical difference between the two groups during intramuscular injection. One study reported no statistical difference between EMLA and the placebo group for successful venepuncture at first attempt. One study similarly reported no statistically significant difference between amethocaine and the placebo group for successful cannulation at first attempt. Risk for local redness, swelling or blanching was significantly higher with EMLA (typical risk ratio (RR) 1.65, 95% confidence interval (CI) 1.24 to 2.19; typical risk difference (RD) 0.17, 95% CI 0.09 to 0.26; n = 272; number needed to treat for an additional harmful outcome (NNTH) 6, 95% CI 4 to 11; I² = 92%, indicating considerable heterogeneity) although not for amethocaine (typical RR 2.11, 95% CI 0.72 to 6.16; typical RD 0.05, 95% CI -0.02 to 0.11, n = 221). These local skin reactions for EMLA and amethocaine were reported as short-lasting. Two studies reported no methaemoglobinaemia with single application of EMLA. The quality of the evidence on outcomes assessed according to GRADE was low to moderate. Overall, all the trials were small, and the effects of uncertain clinical significance. The evidence regarding the effectiveness or safety of the interventions studied is inadequate to support clinical recommendations.
There has been no evaluation regarding any long-term effects of topical anaesthetics in newborn infants. High-quality studies evaluating the efficacy and safety of topical anaesthetics such as amethocaine and EMLA for needle-related pain in newborn term or preterm infants are required. These studies should aim to determine the efficacy of these topical anaesthetics in groups of infants that are homogeneous for gestational age. While there was no methaemoglobinaemia in the studies that reported methaemoglobin, the efficacy and safety of EMLA, especially in very preterm infants and for repeated application, need to be further evaluated in future studies.
Modernizing Earth and Space Science Modeling Workflows in the Big Data Era
NASA Astrophysics Data System (ADS)
Kinter, J. L.; Feigelson, E.; Walker, R. J.; Tino, C.
2017-12-01
Modeling is a major aspect of Earth and space science research. The development of numerical models of the Earth system, planetary systems or astrophysical systems is essential to linking theory with observations. Optimal use of observations that are quite expensive to obtain and maintain typically requires data assimilation that involves numerical models. In the Earth sciences, models of the physical climate system are typically used for data assimilation, climate projection, and inter-disciplinary research, spanning applications from analysis of multi-sensor data sets to decision-making in climate-sensitive sectors with applications to ecosystems, hazards, and various biogeochemical processes. In space physics, most models are from first principles, require considerable expertise to run and are frequently modified significantly for each case study. The volume and variety of model output data from modeling Earth and space systems are rapidly increasing and have reached a scale where human interaction with the data is prohibitively inefficient. A major barrier to progress is that modeling workflows are not treated by practitioners as a design problem. Existing workflows have been created by a slow accretion of software, typically based on undocumented, inflexible scripts haphazardly modified by a succession of scientists and students not trained in modern software engineering methods. As a result, existing modeling workflows suffer from an inability to onboard new datasets into models; an inability to keep pace with accelerating data production rates; and irreproducibility, among other problems. These factors are creating an untenable situation for those conducting and supporting Earth system and space science. Improving modeling workflows requires investments in hardware, software and human resources.
This paper describes the critical path issues that must be targeted to accelerate modeling workflows, including script modularization, parallelization, and automation in the near term, and longer term investments in virtualized environments for improved scalability, tolerance for lossy data compression, novel data-centric memory and storage technologies, and tools for peer reviewing, preserving and sharing workflows, as well as fundamental statistical and machine learning algorithms.
ERIC Educational Resources Information Center
Zou, Di
2017-01-01
This research inspects the allocation of involvement load to the evaluation component of the involvement load hypothesis, examining how three typical approaches to evaluation (cloze-exercises, sentence-writing, and composition-writing) promote word learning. The results of this research were partially consistent with the predictions of the…
Runkel, Robert L.
1998-01-01
OTIS is a mathematical simulation model used to characterize the fate and transport of water-borne solutes in streams and rivers. The governing equation underlying the model is the advection-dispersion equation with additional terms to account for transient storage, lateral inflow, first-order decay, and sorption. This equation and the associated equations describing transient storage and sorption are solved using a Crank-Nicolson finite-difference solution. OTIS may be used in conjunction with data from field-scale tracer experiments to quantify the hydrologic parameters affecting solute transport. This application typically involves a trial-and-error approach wherein parameter estimates are adjusted to obtain an acceptable match between simulated and observed tracer concentrations. Additional applications include analyses of nonconservative solutes that are subject to sorption processes or first-order decay. OTIS-P, a modified version of OTIS, couples the solution of the governing equation with a nonlinear regression package. OTIS-P determines an optimal set of parameter estimates that minimize the squared differences between the simulated and observed concentrations, thereby automating the parameter estimation process. This report details the development and application of OTIS and OTIS-P. Sections of the report describe model theory, input/output specifications, sample applications, and installation instructions.
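The governing equation and solution scheme the report names can be sketched compactly. The code below is an illustrative simplification, not OTIS: it solves the advection-dispersion equation with first-order decay by Crank-Nicolson finite differences, omitting the transient-storage, lateral-inflow, and sorption terms for brevity:

```python
import numpy as np

# Hedged sketch of an OTIS-style solve: dC/dt = -u dC/dx + D d2C/dx2 - lam*C
# advanced with a Crank-Nicolson scheme and zero-gradient boundaries.
def crank_nicolson_ade(c0, u, D, lam, dx, dt, steps):
    n = len(c0)
    a = u * dt / (4 * dx)        # central-difference advection coefficient
    b = D * dt / (2 * dx**2)     # dispersion coefficient
    L = np.eye(n) * (1 + 2 * b + lam * dt / 2)   # implicit operator
    R = np.eye(n) * (1 - 2 * b - lam * dt / 2)   # explicit operator
    for i in range(1, n - 1):
        L[i, i - 1], L[i, i + 1] = -b - a, -b + a
        R[i, i - 1], R[i, i + 1] = b + a, b - a
    # zero-gradient ends via a reflected ghost node (advection term cancels)
    L[0, 1], R[0, 1] = -2 * b, 2 * b
    L[-1, -2], R[-1, -2] = -2 * b, 2 * b
    c = c0.copy()
    for _ in range(steps):
        c = np.linalg.solve(L, R @ c)
    return c

x = np.linspace(0, 100, 101)
c0 = np.exp(-((x - 20.0) ** 2) / 10.0)   # injected tracer pulse
c = crank_nicolson_ade(c0, u=0.5, D=0.1, lam=0.0, dx=1.0, dt=0.5, steps=40)
```

In the trial-and-error calibration the report describes, `u`, `D`, and `lam` would be adjusted until the simulated pulse matches observed tracer concentrations; OTIS-P automates exactly this loop with nonlinear regression.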
Amestoy, Anouck; Guillaud, Etienne; Bouvard, Manuel P.; Cazalets, Jean-René
2015-01-01
Individuals with autism spectrum disorder (ASD) present reduced visual attention to faces. However, contradictory conclusions have been drawn about the strategies involved in visual face scanning due to the various methodologies implemented in the study of facial screening. Here, we used a data-driven approach to compare children and adults with ASD subjected to the same free viewing task and to address developmental aspects of face scanning, including its temporal patterning, in healthy children, and adults. Four groups (54 subjects) were included in the study: typical adults, typically developing children, and adults and children with ASD. Eye tracking was performed on subjects viewing unfamiliar faces. Fixations were analyzed using a data-driven approach that employed spatial statistics to provide an objective, unbiased definition of the areas of interest. Typical adults expressed a spatial and temporal strategy for visual scanning that differed from the three other groups, involving a sequential fixation of the right eye (RE), left eye (LE), and mouth. Typically developing children, adults and children with autism exhibited similar fixation patterns and they always started by looking at the RE. Children (typical or with ASD) subsequently looked at the LE or the mouth. Based on the present results, the patterns of fixation for static faces that mature from childhood to adulthood in typical subjects are not found in adults with ASD. The atypical patterns found after developmental progression and experience in ASD groups appear to remain blocked in an immature state that cannot be differentiated from typical developmental child patterns of fixation. PMID:26236264
Probabilistic treatment of the uncertainty from the finite size of weighted Monte Carlo data
NASA Astrophysics Data System (ADS)
Glüsenkamp, Thorsten
2018-06-01
Parameter estimation in HEP experiments often involves Monte Carlo simulation to model the experimental response function. Typical applications are forward-folding likelihood analyses with re-weighting, or time-consuming minimization schemes with a new simulation set for each parameter value. Problematically, the finite size of such Monte Carlo samples carries intrinsic uncertainty that can lead to a substantial bias in parameter estimation if it is neglected and the sample size is small. We introduce a probabilistic treatment of this problem by replacing the usual likelihood functions with novel generalized probability distributions that incorporate the finite statistics via suitable marginalization. These new PDFs are analytic, and can be used to replace the Poisson, multinomial, and sample-based unbinned likelihoods, which covers many use cases in high-energy physics. In the limit of infinite statistics, they reduce to the respective standard probability distributions. In the general case of arbitrary Monte Carlo weights, the expressions involve the fourth Lauricella function F_D, for which we find a new finite-sum representation in a certain parameter setting. The result also represents an exact form for Carlson's Dirichlet average R_n with n > 0, and thereby an efficient way to calculate the probability generating function of the Dirichlet-multinomial distribution, the extended divided difference of a monomial, or arbitrary moments of univariate B-splines. We demonstrate the bias reduction of our approach with a typical toy Monte Carlo problem, estimating the normalization of a peak in a falling energy spectrum, and compare the results with previously published methods from the literature.
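The marginalization idea can be shown in its simplest special case, which does not require the general F_D machinery: for n unweighted MC events that all carry the same weight w, marginalizing the Poisson mean over a flat prior yields a negative-binomial distribution, visibly broader than the naive Poisson built from the MC point estimate. This is an illustrative reduction, not the paper's general result:

```python
import numpy as np
from scipy.stats import nbinom, poisson
from scipy.special import gammaln

# Hedged sketch: P(k | n_mc equal-weight MC events), flat prior on the
# true rate. The closed form is negative binomial with r = n_mc + 1 and
# p = 1/(1+w); finite MC statistics broadens the standard Poisson PDF.
def marginal_pmf(k, n_mc, w):
    logp = (gammaln(k + n_mc + 1) - gammaln(k + 1) - gammaln(n_mc + 1)
            + k * np.log(w) - (k + n_mc + 1) * np.log1p(w))
    return np.exp(logp)

k = np.arange(0, 200)
n_mc, w = 10, 1.5                    # small MC sample -> visible broadening
pm = marginal_pmf(k, n_mc, w)
# Same distribution via scipy's negative binomial parameterization
pm_nb = nbinom.pmf(k, n_mc + 1, 1.0 / (1.0 + w))
# Naive alternative: plug the MC estimate n_mc * w into a plain Poisson
pm_naive = poisson.pmf(k, n_mc * w)
```

As `n_mc` grows the negative binomial collapses onto the Poisson, matching the paper's statement that the generalized PDFs reduce to the standard ones in the infinite-statistics limit.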
Westbrook, J. I.
2015-01-01
Summary Objectives To examine if human factors methods were applied in the design, development, and evaluation of mobile applications developed to facilitate aspects of patient-centered care coordination. Methods We searched MEDLINE and EMBASE (2013-2014) for studies describing the design or the evaluation of a mobile health application that aimed to support patients’ active involvement in the coordination of their care. Results 34 papers met the inclusion criteria. Applications ranged from tools that supported self-management of specific conditions (e.g. asthma) to tools that provided coaching or education. Twelve of the 15 papers describing the design or development of an app reported the use of a human factors approach. The most frequently used methods were interviews and surveys, which often included an exploration of participants’ current use of information technology. Sixteen papers described the evaluation of a patient application in practice. All of them adopted a human factors approach, typically an examination of the use of app features and/or surveys or interviews which enquired about patients’ views of the effects of using the app on their behaviors (e.g. medication adherence), knowledge, and relationships with healthcare providers. No study in our review assessed the impact of mobile applications on health outcomes. Conclusion The potential of mobile health applications to assist patients to more actively engage in the management of their care has resulted in a large number of applications being developed. Our review showed that human factors approaches are nearly always adopted to some extent in the design, development, and evaluation of mobile applications. PMID:26293851
Absorbable energy monitoring scheme: new design protocol to test vehicle structural crashworthiness.
Ofochebe, Sunday M; Enibe, Samuel O; Ozoegwu, Chigbogu G
2016-05-01
In vehicle crashworthiness design optimization, detailed system evaluations capable of producing reliable results are typically achieved through high-order numerical computational (HNC) models such as the dynamic finite element model, the mesh-free model, etc. However, the application of these models, especially during optimization studies, is challenged by their inherent high demand on computational resources, the conditional stability of the solution process, and a lack of knowledge of the viable parameter range for detailed optimization studies. The absorbable energy monitoring scheme (AEMS) presented in this paper suggests a new design protocol that attempts to overcome such problems in the evaluation of vehicle structures for crashworthiness. The implementation of the AEMS involves studying the crash performance of vehicle components at various absorbable energy ratios based on a 2DOF lumped-mass-spring (LMS) vehicle impact model. This allows for prompt prediction of useful parameter values in a given design problem. The application of the classical one-dimensional LMS model in vehicle crash analysis is further improved in the present work by developing a critical load matching criterion which allows for quantitative interpretation of the results of the abstract model in a typical vehicle crash design. The adequacy of the proposed AEMS for preliminary vehicle crashworthiness design is demonstrated in this paper; however, its extension to a full-scale design-optimization problem involving a full vehicle model that shows greater structural detail requires more theoretical development.
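A 2DOF lumped-mass-spring impact model of the kind the AEMS builds on can be integrated directly. The masses, stiffnesses, and impact speed below are illustrative assumptions, and the springs are kept linear (no contact loss), which is adequate for the peak-crush phase this sketch examines:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hedged sketch of a 2DOF lumped-mass-spring vehicle impact model:
# m1 (front structure) hits a rigid barrier through spring k1, and
# m2 (occupant compartment) loads m1 through coupling spring k2.
m1, m2 = 300.0, 1200.0        # kg (illustrative)
k1, k2 = 4.0e5, 8.0e5         # N/m (illustrative)
v0 = 15.0                     # m/s initial impact speed

def rhs(t, y):
    x1, v1, x2, v2 = y
    f1 = -k1 * x1 + k2 * (x2 - x1)   # barrier spring + coupling spring
    f2 = -k2 * (x2 - x1)
    return [v1, f1 / m1, v2, f2 / m2]

sol = solve_ivp(rhs, (0.0, 0.15), [0.0, v0, 0.0, v0], max_step=1e-4)
crush = sol.y[0].max()               # peak front-structure deformation (m)
rel = (sol.y[2] - sol.y[0]).max()    # peak intermediate deformation (m)
```

Sweeping the ratio of energy absorbed in `k1` versus `k2` (by varying the stiffnesses at fixed total energy) is the kind of absorbable-energy-ratio study the AEMS uses to bracket viable parameters before any high-order model is run.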
NASA Astrophysics Data System (ADS)
Boon, J. A.
Education innovation is here to stay. This chapter gives the results of a study of the application of information and communication technology to advanced teaching and learning activities. It is strategically important that the technology opens up new ways of teaching and learning. The purpose of this chapter is firstly to identify the typical advanced teaching and learning activities/functions that can be applied in e-Learning and in face-to-face teaching and learning. Case studies were selected from a group of teachers who had already been involved in both teaching modes for some years and thus had experience in blended teaching and learning. A number of teaching activities/functions were seen as positive in their application to the e-Learning situation. Those that stand out are peer review and collaboration, promotion of reflection and stimulation of critical and creative thinking, team teaching, promotion of discovery and extension of knowledge, and problematization of the curriculum. In face-to-face teaching and learning, inviting engagement, showing how to come to know, using metaphors and analogies, teaching that connects to learning, inspiring change, and promoting understanding stand out. As seen by the teachers in the case studies, e-Learning and face-to-face teaching and learning are complementary to each other. We define this view as blended teaching and learning.
Man, mind, and machine: the past and future of virtual reality simulation in neurologic surgery.
Robison, R Aaron; Liu, Charles Y; Apuzzo, Michael L J
2011-11-01
To review virtual reality in neurosurgery, including the history of simulation and virtual reality and some of the current implementations; to examine some of the technical challenges involved; and to propose a potential paradigm for the development of virtual reality in neurosurgery going forward. A search was made on PubMed using key words surgical simulation, virtual reality, haptics, collision detection, and volumetric modeling to assess the current status of virtual reality in neurosurgery. Based on previous results, investigators extrapolated the possible integration of existing efforts and potential future directions. Simulation has a rich history in surgical training, and there are numerous currently existing applications and systems that involve virtual reality. All existing applications are limited to specific task-oriented functions and typically sacrifice visual realism for real-time interactivity or vice versa, owing to numerous technical challenges in rendering a virtual space in real time, including graphic and tissue modeling, collision detection, and direction of the haptic interface. With ongoing technical advancements in computer hardware and graphic and physical rendering, incremental or modular development of a fully immersive, multipurpose virtual reality neurosurgical simulator is feasible. The use of virtual reality in neurosurgery is predicted to change the nature of neurosurgical education, and to play an increased role in surgical rehearsal and the continuing education and credentialing of surgical practitioners. Copyright © 2011 Elsevier Inc. All rights reserved.
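Among the technical challenges the review lists, collision detection is the easiest to make concrete. A minimal sketch, far simpler than the hierarchical schemes real simulators use, tests whether two bounding spheres, the cheapest collision primitive, overlap (all values are illustrative):

```python
import numpy as np

# Sphere-sphere overlap test: the simplest collision-detection primitive.
# Real surgical simulators wrap deformable tissue in hierarchies of such
# bounding volumes to keep the test cheap enough for real-time haptics.
def spheres_collide(c1, r1, c2, r2):
    """True if spheres (center, radius) overlap or touch."""
    return np.linalg.norm(np.asarray(c1) - np.asarray(c2)) <= r1 + r2

print(spheres_collide([0, 0, 0], 1.0, [1.5, 0, 0], 1.0))   # overlapping
print(spheres_collide([0, 0, 0], 1.0, [3.0, 0, 0], 1.0))   # separated
```

The real-time-versus-realism tradeoff the review describes arises because finer collision primitives (triangle meshes, deformable volumes) make this per-frame test vastly more expensive.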
Structural equation modeling for observational studies
Grace, J.B.
2008-01-01
Structural equation modeling (SEM) represents a framework for developing and evaluating complex hypotheses about systems. This method of data analysis differs from conventional univariate and multivariate approaches familiar to most biologists in several ways. First, SEMs are multiequational and capable of representing a wide array of complex hypotheses about how system components interrelate. Second, models are typically developed based on theoretical knowledge and designed to represent competing hypotheses about the processes responsible for data structure. Third, SEM is conceptually based on the analysis of covariance relations. Most commonly, solutions are obtained using maximum-likelihood solution procedures, although a variety of solution procedures are used, including Bayesian estimation. Numerous extensions give SEM a very high degree of flexibility in dealing with nonnormal data, categorical responses, latent variables, hierarchical structure, multigroup comparisons, nonlinearities, and other complicating factors. Structural equation modeling allows researchers to address a variety of questions about systems, such as how different processes work in concert, how the influences of perturbations cascade through systems, and the relative importance of different influences. I present 2 example applications of SEM, one involving interactions among lynx (Lynx pardinus), mongooses (Herpestes ichneumon), and rabbits (Oryctolagus cuniculus), and the second involving anuran species richness. Many wildlife ecologists may find SEM useful for understanding how populations function within their environments. Along with the capability of the methodology comes a need for care in the proper application of SEM.
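The observed-variable core of SEM, path analysis, can be sketched with simulated data. The variables echo the lynx-mongoose-rabbit example, but the path coefficients and the equation-by-equation OLS estimator are purely illustrative; full SEM fits the whole covariance matrix, typically by maximum likelihood (e.g., in lavaan or semopy).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# Hypothetical multiequational system: rabbits -> lynx,
# and both rabbits and lynx -> mongoose (coefficients are invented).
rabbits = rng.normal(size=n)
lynx = 0.6 * rabbits + rng.normal(scale=0.8, size=n)
mongoose = -0.4 * lynx + 0.3 * rabbits + rng.normal(scale=0.8, size=n)

def ols(y, X):
    """Least-squares path coefficients for one structural equation."""
    X = np.column_stack(X)
    return np.linalg.lstsq(X, y, rcond=None)[0]

b_lynx = ols(lynx, [rabbits])            # path: rabbits -> lynx
b_mong = ols(mongoose, [lynx, rabbits])  # paths: lynx -> mongoose, rabbits -> mongoose
print(b_lynx, b_mong)
```

The recovered coefficients approximate the generating paths (0.6; -0.4 and 0.3), illustrating how a multiequational model separates direct and cascading influences.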
Macrophage polarization in virus-host interactions
USDA-ARS?s Scientific Manuscript database
Macrophage involvement in viral infections and antiviral states is common. However, this involvement has not been well-studied in the paradigm of macrophage polarization, which typically has been categorized by the dichotomy of classical (M1) and alternative (M2) statuses. Recent studies have reveal...
Sensemaking Handoffs: Why? How? and When?
ERIC Educational Resources Information Center
Sharma, Nikhil
2010-01-01
Sensemaking tasks are challenging and typically involve collecting, organizing and understanding information. Sensemaking often involves a handoff where a subsequent recipient picks up work done by a provider. Sensemaking handoffs are very challenging because handoffs introduce discontinuity in sensemaking. This dissertation attempts to explore…
An approach to the design and implementation of spacecraft attitude control systems
NASA Technical Reports Server (NTRS)
ODonnell, James R., Jr.; Mangus, David J.
1998-01-01
Over 39 years and a long list of missions, the guidance, navigation, and control (GN&C) groups at the Goddard Space Flight Center have gradually developed approaches to the design and implementation of successful spacecraft attitude control systems. With the recent creation of the Guidance, Navigation, and Control Center at Goddard, there is a desire to document some of these design practices to help to ensure their consistent application in the future. In this paper, we will discuss the beginnings of this effort, drawing primarily on the experience of one of the past attitude control system (ACS) groups at Goddard (what was formerly known as Code 712, the Guidance, Navigation, and Control Branch). We will discuss the analysis and design methods and criteria used, including guidelines for linear and nonlinear analysis, as well as the use of low- and high-fidelity simulation for system design and verification of performance. Descriptions of typical ACS sensor and actuator hardware will be shown, and typical sensor/actuator suites for a variety of mission types detailed. A description of the software and hardware test effort will be given, along with an attempt to make some qualitative estimates on how much effort is involved. The spacecraft and GN&C subsystem review cycles will be discussed, giving an outline of what design reviews are typically held and what information should be presented at each stage. Finally, we will point out some of the lessons learned at Goddard.
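One building block of the linear analysis alluded to above is the single-axis rigid-body pointing loop, sketched below. The inertia, bandwidth, and damping values are assumed for illustration and are not Goddard design criteria.

```python
import numpy as np

# Single-axis rigid-body attitude sketch: I*theta'' = u, with a PD law
# u = -kp*theta - kd*theta_dot. Gains are set from a target natural
# frequency and damping ratio (all numbers illustrative).
I = 50.0                 # kg m^2, assumed inertia about one axis
wn, zeta = 0.5, 0.7      # target bandwidth (rad/s) and damping ratio
kp = I * wn**2
kd = 2.0 * zeta * wn * I

dt = 0.01
theta, omega = np.radians(10.0), 0.0   # 10-degree initial pointing error
for _ in range(6000):                  # 60 s low-fidelity simulation
    u = -kp * theta - kd * omega       # control torque
    omega += (u / I) * dt              # semi-implicit Euler integration
    theta += omega * dt
print(f"residual error after 60 s: {np.degrees(abs(theta)):.6f} deg")
```

With these gains the 2%-settling time is roughly 4/(zeta*wn), about 11 s, so the error is negligible after 60 s; verifying such behavior in both low- and high-fidelity simulation is exactly the design/verification split the paper describes.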
McNerney, Monica P.; Watstein, Daniel M.; Styczynski, Mark P.
2015-01-01
Metabolic engineering is generally focused on static optimization of cells to maximize production of a desired product, though recently dynamic metabolic engineering has explored how metabolic programs can be varied over time to improve titer. However, these are not the only types of applications where metabolic engineering could make a significant impact. Here, we discuss a new conceptual framework, termed “precision metabolic engineering,” involving the design and engineering of systems that make different products in response to different signals. Rather than focusing on maximizing titer, these types of applications typically have three hallmarks: sensing signals that determine the desired metabolic target, completely directing metabolic flux in response to those signals, and producing sharp responses at specific signal thresholds. In this review, we will first discuss and provide examples of precision metabolic engineering. We will then discuss each of these hallmarks and identify which existing metabolic engineering methods can be applied to accomplish those tasks, as well as some of their shortcomings. Ultimately, precise control of metabolic systems has the potential to enable a host of new metabolic engineering and synthetic biology applications for any problem where flexibility of response to an external signal could be useful. PMID:26189665
Composite Overwrapped Pressure Vessels, A Primer
NASA Technical Reports Server (NTRS)
McLaughlan, Pat B.; Forth, Scott C.; Grimes-Ledesma, Lorie R.
2011-01-01
Due to the extensive amount of detailed information that has been published on composite overwrapped pressure vessels (COPVs), this document has been written to serve as a primer for those who desire an elementary knowledge of COPVs and the factors affecting composite safety. In this application, the word "composite" simply refers to a matrix of continuous fibers contained within a resin and wrapped over a pressure barrier to form a vessel for gas or liquid containment. COPVs are currently used at NASA to contain high pressure fluids in propulsion, science experiments, and life support applications. They have a significant weight advantage over all metal vessels but require unique design, manufacturing, and test requirements. COPVs also involve a much more complex mechanical understanding due to the interplay between the composite overwrap and the inner liner. A metallic liner is typically used in a COPV as a fluid permeation barrier. The liner design concepts and requirements have been borrowed from all-metal vessels. However, application of metallic vessel design standards to a very thin liner is not straightforward. Different failure modes exist for COPVs than for all-metal vessels, and understanding of these failure modes is at a much more rudimentary level than for metal vessels.
Varadavenkatesan, Thivaharan; Murty, Vytla Ramachandra
2013-01-01
Biosurfactants are surface-active compounds derived from varied microbial sources, including bacteria and fungi. They are secreted extracellularly and have a wide range of exciting properties for bioremediation purposes. They also have vast applications in the food and medicine industries. With the objective of isolating microorganisms for enhanced oil recovery (EOR) operations, the study involved screening organisms from an oil-contaminated site. Morphological, biochemical, and 16S rRNA analysis of the most promising candidate revealed it to be Bacillus siamensis, which is here associated with biosurfactant production for the first time. Initial fermentation studies using mineral salt medium supplemented with crude oil resulted in a maximum biosurfactant yield of 0.64 g/L and a reduction of surface tension to 36.1 mN/m at 96 h. Characterization studies were done using thin-layer chromatography and Fourier transform infrared (FTIR) spectroscopy. FTIR spectra indicated the presence of carbonyl groups, alkyl bonds, and C-H and N-H stretching vibrations, typical of peptides. The extracted biosurfactant was stable at extreme temperatures, pH, and salinity. Its applicability to EOR was further verified by conducting sand pack column studies that yielded up to 60% oil recovery.
Semantic Web meets Integrative Biology: a survey.
Chen, Huajun; Yu, Tong; Chen, Jake Y
2013-01-01
Integrative Biology (IB) uses experimental or computational quantitative technologies to characterize biological systems at the molecular, cellular, tissue and population levels. IB typically involves the integration of data, knowledge and capabilities across disciplinary boundaries in order to solve complex problems. We identify a series of bioinformatics problems posed by interdisciplinary integration: (i) data integration that interconnects structured data across related biomedical domains; (ii) ontology integration that brings jargons, terminologies and taxonomies from various disciplines into a unified network of ontologies; (iii) knowledge integration that integrates disparate knowledge elements from multiple sources; (iv) service integration that builds applications out of services provided by different vendors. We argue that IB can benefit significantly from the integration solutions enabled by Semantic Web (SW) technologies. The SW enables scientists to share content beyond the boundaries of applications and websites, resulting in a web of data that is meaningful and understandable to any computer. In this review, we provide insight into how SW technologies can be used to build open, standardized and interoperable solutions for interdisciplinary integration on a global basis. We present a rich set of case studies in systems biology, integrative neuroscience, bio-pharmaceutics and translational medicine to highlight the technical features and benefits of SW applications in IB.
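The data-integration idea can be made concrete with a toy triple store: facts from two sources are merged into one graph keyed on shared URIs and queried across the source boundary. This is a dependency-free sketch of the Semantic Web pattern; the URIs, predicates, and biology below are invented for illustration (a real system would use rdflib/SPARQL and standard ontologies).

```python
# Triples (subject, predicate, object) from two hypothetical sources.
gene_db = [
    ("ex:TP53", "ex:encodes", "ex:p53_protein"),
    ("ex:TP53", "rdfs:label", "tumor protein p53"),
]
pathway_db = [
    ("ex:p53_protein", "ex:participatesIn", "ex:apoptosis_pathway"),
]

# Integration is just set union: shared URIs interconnect the sources.
graph = set(gene_db) | set(pathway_db)

def query(graph, s=None, p=None, o=None):
    """Match triples against an (s, p, o) pattern; None is a wildcard."""
    return [t for t in graph
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Cross-source question: which pathways involve the protein encoded by TP53?
proteins = [o for _, _, o in query(graph, s="ex:TP53", p="ex:encodes")]
pathways = [o for prot in proteins
            for _, _, o in query(graph, s=prot, p="ex:participatesIn")]
print(pathways)   # → ['ex:apoptosis_pathway']
```

The answer requires joining facts that neither source holds alone, which is the benefit the review claims for SW-based integration.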
Close-range photogrammetry in underground mining ground control
NASA Astrophysics Data System (ADS)
Benton, Donovan J.; Chambers, Amy J.; Raffaldi, Michael J.; Finley, Seth A.; Powers, Mark J.
2016-09-01
Monitoring underground mine deformation and support conditions has traditionally involved visual inspection and geotechnical instrumentation. Monitoring displacements with conventional instrumentation can be expensive and time-consuming, and the number of locations that can be effectively monitored is generally limited. Moreover, conventional methods typically produce vector rather than tensor descriptions of geometry changes. Tensor descriptions can provide greater insight into hazardous ground movements, particularly in recently excavated openings and in older workings that have been negatively impacted by high stress concentrations, time-dependent deformation, or corrosion of ground support elements. To address these issues, researchers with the National Institute for Occupational Safety and Health, Spokane Mining Research Division are developing and evaluating photogrammetric systems for ground control monitoring applications in underground mines. This research has demonstrated that photogrammetric systems can produce millimeter-level measurements that are comparable to conventional displacement-measuring instruments. This paper provides an overview of the beneficial use of close-range photogrammetry for the following three ground control applications in underground mines: monitoring the deformation of surface support, monitoring rock mass movement, and monitoring the corrosion of surface support. Preliminary field analyses, case studies, limitations, and best practices for these applications are also discussed.
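The vector-versus-tensor distinction can be made concrete: from matched 3-D points before and after deformation, a deformation-gradient tensor can be fitted by least squares, whereas a conventional instrument yields only per-point displacement vectors. The data and deformation below are synthetic, not measurements from the study.

```python
import numpy as np

# Fit a deformation gradient F from matched point clouds: X (before) and
# Y (after), with Y ≈ X F^T. The symmetric part of F - I is the
# small-strain tensor, a tensor description of the geometry change.
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))                 # points before deformation
F_true = np.array([[1.02, 0.01, 0.0],         # hypothetical deformation:
                   [0.0,  0.99, 0.0],         # 2% extension, 1% shortening,
                   [0.0,  0.0,  1.0]])        # slight shear
Y = X @ F_true.T                              # points after deformation

B, *_ = np.linalg.lstsq(X, Y, rcond=None)     # solves X @ B ≈ Y
F = B.T
strain = 0.5 * (F + F.T) - np.eye(3)          # small-strain tensor
print(np.round(strain, 3))
```

Photogrammetric point clouds supply exactly the dense correspondences this fit needs, which is why they support tensor descriptions that sparse instruments cannot.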
Kanin, Maralee R; Pontrello, Jason K
2016-01-01
Calls to bring interdisciplinary content and examples into introductory science courses have increased, yet strategies that involve course restructuring often suffer from the need for a significant faculty commitment to motivate change. Minimizing the need for dramatic course reorganization, the structure, reactivity, and chemical biology applications of classes of biological monomers and polymers have been integrated into introductory organic chemistry courses through three series of semester-long weekly assignments that explored (a) Carbohydrates and Oligosaccharides, (b) Amino Acids, Peptides, and Proteins, and (c) Nucleosides, Nucleotides, and Nucleic Acids. Comparisons of unannounced pre- and post-tests revealed improved understanding of a reaction introduced in the assignments, and course examinations evaluated cumulative assignment topics. Course surveys revealed that demonstrating biologically relevant applications consistently throughout the semesters enhanced student interest in the connection between basic organic chemistry content and its application to new and unfamiliar bio-related examples. Covering basic material related to these classes of molecules outside of the classroom opened lecture time to allow the instructor to further build on information developed through the weekly assignments, teaching advanced topics and applications typically not covered in an introductory organic chemistry lecture course. Assignments were implemented as homework, either with or without accompanying discussion, in both laboratory and lecture organic courses within the context of the existing course structures. © 2015 The International Union of Biochemistry and Molecular Biology.
Pharmaceutical Applications of Ion-Exchange Resins
NASA Astrophysics Data System (ADS)
Elder, David P.
2005-04-01
The historical uses of ion-exchange resins and a summary of the basic chemical principles involved in the ion-exchange process are discussed. Specific applications of ion-exchange resins are provided. The utility of these agents to stabilize drugs is evaluated. Commonly occurring chemical and physical incompatibilities are reviewed. Ion-exchange resins have found applicability as inactive pharmaceutical constituents, particularly as disintegrants (inactive tablet ingredients whose function is to rapidly disrupt the tablet matrix on contact with gastric fluid). One of the more elegant approaches to improving the palatability of ionizable drugs is the use of ion-exchange resins as taste-masking agents. The selection and optimization of drug:resin ratio and particle size, together with a review of the scale-up of typical manufacturing processes for taste-masked products, are provided. Ion-exchange resins have been extensively utilized in oral sustained-release products. The selection and optimization of drug:resin ratio and particle size, together with a summary of commonly occurring commercial sustained-release products, are discussed. Ion-exchange resins have also been used in topical products for local application to the skin, including those where drug flux is controlled by a differential electrical current (iontophoretic delivery). The general applicability of ion-exchange resins, including ophthalmic delivery, nasal delivery, use as drugs in their own right (e.g., colestyramine, formerly referred to as cholestyramine), and the measurement of gastrointestinal transit times, is discussed. Finally, pharmaceutical monographs for ion-exchange resins are reviewed.
Fabrication, functionalization, and application of electrospun biopolymer nanofibers.
Kriegel, Christina; Arecchi, Alessandra; Arrechi, Alessandra; Kit, Kevin; McClements, D J; Weiss, Jochen
2008-09-01
The use of novel nanostructured materials has attracted considerable interest in the food industry for their utilization as highly functional ingredients, high-performance packaging materials, processing aids, and food quality and safety sensors. Most previous application interest has focused on the development of nanoparticles. More recently, however, the ability to produce non-woven mats composed of nanofibers that can be used in food applications has begun to be investigated. Electrospinning is a novel fabrication technique that can be used to produce fibers with diameters below 100 nm from (bio-)polymer solutions. These nanofibers have been shown to possess unique properties that distinguish them from non-woven fibers produced by other methods, e.g., melt-blowing. First, the process results in a high orientation of polymers within the fibers, which leads to mechanically superior properties, e.g., increased tensile strength. Second, during the spinning of the fibers from polymer solutions, the solvent is rapidly evaporated, allowing the production of fibers composed of polymer blends that would typically phase-separate if spun with other processes. Third, the small dimensions of the fibers lead to very high specific surface areas; because of this, the fiber properties may be greatly influenced by surface properties, giving rise to fiber functionalities not found in fibers of larger sizes. For food applications, the fibers may find use as ingredients if they are composed solely of edible polymers and GRAS ingredients (e.g., fibers could contain functional ingredients such as nutraceuticals, antioxidants, antimicrobials, and flavors), as active packaging materials, or as processing aids (e.g., catalytic reactors, membranes, filters (Lala et al., 2007), and sensors (Manesh et al., 2007; Ren et al., 2006; Sawicka et al., 2005)).
This review is therefore intended to introduce interested food and agricultural scientists to the concept of nano-fiber manufacturing with a particular emphasis on the use of biopolymers. We will review typical fabrication set-ups, discuss the influence of process conditions on nanofiber properties, and then review previous studies that describe the production of biopolymer-based nanofibers. Finally we briefly discuss emerging methods to further functionalize fibers and discuss potential applications in the area of food science and technology.
Training Class Inclusion Responding in Typically Developing Children and Individuals with Autism
ERIC Educational Resources Information Center
Ming, Siri; Mulhern, Teresa; Stewart, Ian; Moran, Laura; Bynum, Kellie
2018-01-01
In a "class inclusion" task, a child must respond to stimuli as being involved in two different though hierarchically related categories. This study used a Relational Frame Theory (RFT) paradigm to assess and train this ability in three typically developing preschoolers and three individuals with autism spectrum disorder, all of whom had…
Applying ecological concepts to the management of widespread grass invasions [Chapter 7
Carla M. D' Antonio; Jeanne C. Chambers; Rhonda Loh; J. Tim Tunison
2009-01-01
The management of plant invasions has typically focused on the removal of invading populations or control of existing widespread species to unspecified but lower levels. Invasive plant management typically has not involved active restoration of background vegetation to reduce the likelihood of invader reestablishment. Here, we argue that land managers could benefit...
ERIC Educational Resources Information Center
Tobia, Valentina; Bonifacci, Paola; Ottaviani, Cristina; Borsato, Thomas; Marzocchi, Gian Marco
2016-01-01
The aim of this study was to investigate physiological activation during reading and control tasks in children with dyslexia and typical readers. Skin conductance response (SCR) recorded during four tasks involving reading aloud, reading silently, and describing illustrated stories aloud and silently was compared for children with dyslexia (n =…
Ferroxidase-Mediated Iron Oxide Biomineralization: Novel Pathways to Multifunctional Nanoparticles.
Zeth, Kornelius; Hoiczyk, Egbert; Okuda, Mitsuhiro
2016-02-01
Iron oxide biomineralization occurs in all living organisms and typically involves protein compartments ranging from 5 to 100 nm in size. The smallest iron-oxo particles are formed inside dodecameric Dps protein cages, while the structurally related ferritin compartments consist of twice as many identical protein subunits. The largest known compartments are encapsulins, icosahedra made of up to 180 protein subunits that harbor additional ferritin-like proteins in their interior. The formation of iron-oxo particles in all these compartments requires a series of steps including recruitment of iron, translocation, oxidation, nucleation, and storage, that are mediated by ferroxidase centers. Thus, compartmentalized iron oxide biomineralization yields uniform nanoparticles strictly determined by the sizes of the compartments, allowing customization for highly diverse nanotechnological applications. Copyright © 2015 Elsevier Ltd. All rights reserved.
Multi-asteroid comet missions using solar electric propulsion.
NASA Technical Reports Server (NTRS)
Bender, D. F.; Bourke, R. D.
1972-01-01
Multitarget flyby missions to asteroids and comets are attractive candidates for solar electric propulsion (SEP) application because SEP can efficiently provide the thrust required for carefully chosen sequences of encounters. In this paper, techniques for finding encounter sequences for these missions are described, and examples involving flyby and rendezvous missions to P/Encke, P/Kopff and 20/Massalia are presented. In addition, examples of four asteroid flyby sequences are given. Encounters typically have flyby speeds on the order of 5-10 km/sec and are limited only by navigational capability as regards flyby distance, which is taken as zero in the study. Flights traversing the asteroid belt can be modified by SEP to pass one or more asteroids, and the performance penalty is small if the encounters are properly spaced.
Metallic superhydrophobic surfaces via thermal sensitization
NASA Astrophysics Data System (ADS)
Vahabi, Hamed; Wang, Wei; Popat, Ketul C.; Kwon, Gibum; Holland, Troy B.; Kota, Arun K.
2017-06-01
Superhydrophobic surfaces (i.e., surfaces extremely repellent to water) allow water droplets to bead up and easily roll off from the surface. While a few methods have been developed to fabricate metallic superhydrophobic surfaces, these methods typically involve expensive equipment, environmental hazards, or multi-step processes. In this work, we developed a universal, scalable, solvent-free, one-step methodology based on thermal sensitization to create appropriate surface texture and fabricate metallic superhydrophobic surfaces. To demonstrate the feasibility of our methodology and elucidate the underlying mechanism, we fabricated superhydrophobic surfaces using ferritic (430) and austenitic (316) stainless steels (representative alloys) with roll off angles as low as 4° and 7°, respectively. We envision that our approach will enable the fabrication of superhydrophobic metal alloys for a wide range of civilian and military applications.
Modelling multiple sources of dissemination bias in meta-analysis.
Bowden, Jack; Jackson, Dan; Thompson, Simon G
2010-03-30
Asymmetry in the funnel plot for a meta-analysis suggests the presence of dissemination bias. This may be caused by publication bias through the decisions of journal editors, by selective reporting of research results by authors or by a combination of both. Typically, study results that are statistically significant or have larger estimated effect sizes are more likely to appear in the published literature, hence giving a biased picture of the evidence-base. Previous statistical approaches for addressing dissemination bias have assumed only a single selection mechanism. Here we consider a more realistic scenario in which multiple dissemination processes, involving both the publishing authors and journals, are operating. In practical applications, the methods can be used to provide sensitivity analyses for the potential effects of multiple dissemination biases operating in meta-analysis.
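The single-mechanism starting point, funnel-plot asymmetry, is commonly quantified with Egger's regression of standardized effects on precision; a nonzero intercept suggests small-study bias. The sketch below uses simulated data with an invented bias model, not the paper's multi-process framework.

```python
import numpy as np

# Egger's regression sketch on simulated meta-analysis data.
rng = np.random.default_rng(1)
n = 200
se = rng.uniform(0.05, 0.5, n)                    # study standard errors
true_effect = 0.2
bias = 0.8 * se                                    # small studies exaggerated
effect = true_effect + bias + rng.normal(0, se)    # observed effect sizes

precision = 1.0 / se
z = effect / se
# Regress z on precision: the slope estimates the pooled effect,
# the intercept measures funnel-plot asymmetry.
X = np.column_stack([np.ones(n), precision])
intercept, slope = np.linalg.lstsq(X, z, rcond=None)[0]
print(f"Egger intercept: {intercept:.2f} (nonzero suggests asymmetry), "
      f"slope (pooled effect): {slope:.2f}")
```

The intercept recovers the injected bias while the slope recovers the true effect; the paper's contribution is to model several such selection processes operating at once rather than this single one.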
Quantum speedup in solving the maximal-clique problem
NASA Astrophysics Data System (ADS)
Chang, Weng-Long; Yu, Qi; Li, Zhaokai; Chen, Jiahui; Peng, Xinhua; Feng, Mang
2018-03-01
The maximal-clique problem, to find the maximally sized clique in a given graph, is classically an NP-complete computational problem, which has potential applications ranging from electrical engineering, computational chemistry, and bioinformatics to social networks. Here we develop a quantum algorithm to solve the maximal-clique problem for any graph G with n vertices with quadratic speedup over its classical counterparts, where the time and spatial complexities are reduced to O(√(2^n)) and O(n^2), respectively. With respect to oracle-related quantum algorithms for the NP-complete problems, we identify our algorithm as optimal. To justify the feasibility of the proposed quantum algorithm, we successfully solve a typical clique problem for a graph G with two vertices and one edge by carrying out a nuclear magnetic resonance experiment involving four qubits.
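For contrast with the quantum O(√(2^n)) scaling, the classical exhaustive search can be sketched in a few lines. Only the two-vertex, one-edge graph comes from the paper's NMR demonstration; the 4-vertex example is ours.

```python
from itertools import combinations

# Classical brute-force maximum-clique search: checks subsets from largest
# to smallest, illustrating the O(2^n) cost the quantum algorithm improves.
def max_clique(vertices, edges):
    adj = {v: set() for v in vertices}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    for r in range(len(vertices), 0, -1):
        for sub in combinations(vertices, r):
            # sub is a clique iff every pair of its vertices is adjacent
            if all(b in adj[a] for a, b in combinations(sub, 2)):
                return sub          # first hit at size r is a maximum clique
    return ()

# The paper's demonstration graph: 2 vertices, 1 edge.
print(max_clique([0, 1], [(0, 1)]))                                 # → (0, 1)
# A larger toy graph with maximum clique {0, 1, 2}.
print(max_clique([0, 1, 2, 3], [(0, 1), (1, 2), (0, 2), (2, 3)]))   # → (0, 1, 2)
```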
NASA Astrophysics Data System (ADS)
Lin, Fubiao; Meleshko, Sergey V.; Flood, Adrian E.
2018-06-01
The population balance equation (PBE) has received an unprecedented amount of attention in recent years from both academics and industrial practitioners because of its long history, widespread use in engineering, and applicability to a wide variety of particulate and discrete-phase processes. However, it is typically impossible to obtain analytical solutions, although in almost every case a numerical solution of the PBEs can be obtained. In this article, the symmetries of PBEs with homogeneous coagulation kernels involving aggregation, breakage and growth processes and particle transport in one dimension are found by directly solving the determining equations. Using the optimal system of one- and two-dimensional subalgebras, all invariant solutions and reduced equations are obtained. In particular, an explicit analytical physical solution is also presented.
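A numerical sketch of the simplest PBE case, pure aggregation with a constant (degree-zero homogeneous) kernel, shows why numerical solutions are routine even when analytical ones are not. The kernel value, size cutoff, and time step are illustrative choices, and breakage, growth, and transport are omitted.

```python
import numpy as np

# Discrete Smoluchowski coagulation equations, constant kernel K:
#   dn_k/dt = (K/2) * sum_{i+j=k} n_i n_j  -  K * n_k * sum_j n_j
K = 1.0                       # constant coagulation kernel (illustrative)
N = 50                        # largest cluster size tracked
n = np.zeros(N + 1)           # n[k] = number density of size-k clusters
n[1] = 1.0                    # monodisperse initial condition

dt, steps = 0.01, 500         # integrate to t = 5 with forward Euler
for _ in range(steps):
    dn = np.zeros_like(n)
    total = n.sum()
    for k in range(1, N + 1):
        gain = 0.5 * sum(n[i] * n[k - i] for i in range(1, k))
        dn[k] = K * (gain - n[k] * total)
    n += dn * dt

mass = sum(k * n[k] for k in range(1, N + 1))
print(f"total number: {n.sum():.3f}, retained mass: {mass:.3f}")
```

For this kernel the exact total number density is 1/(1 + Kt/2) ≈ 0.286 at t = 5, and mass is conserved up to the size-N truncation, which gives a quick accuracy check on the scheme.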
Delivery of chemotherapeutic drugs in tumour cell-derived microparticles.
Tang, Ke; Zhang, Yi; Zhang, Huafeng; Xu, Pingwei; Liu, Jing; Ma, Jingwei; Lv, Meng; Li, Dapeng; Katirai, Foad; Shen, Guan-Xin; Zhang, Guimei; Feng, Zuo-Hua; Ye, Duyun; Huang, Bo
2012-01-01
Cellular microparticles are vesicular plasma membrane fragments with a diameter of 100-1,000 nanometres that are shed by cells in response to various physiological and artificial stimuli. Here we demonstrate that tumour cell-derived microparticles can be used as vectors to deliver chemotherapeutic drugs. We show that tumour cells incubated with chemotherapeutic drugs package these drugs into microparticles, which can be collected and used to effectively kill tumour cells in murine tumour models without typical side effects. We describe several mechanisms involved in this process, including uptake of drug-containing microparticles by tumour cells, synthesis of additional drug-packaging microparticles by these cells that contribute to the cytotoxic effect and the inhibition of drug efflux from tumour cells. This study highlights a novel drug delivery strategy with potential clinical application.
Diabatic models with transferrable parameters for generalized chemical reactions
NASA Astrophysics Data System (ADS)
Reimers, Jeffrey R.; McKemmish, Laura K.; McKenzie, Ross H.; Hush, Noel S.
2017-05-01
Diabatic models applied to adiabatic electron-transfer theory yield many equations involving just a few parameters that connect ground-state geometries and vibration frequencies to excited-state transition energies and vibration frequencies, and to the rate constants for electron-transfer reactions, utilizing properties of the conical-intersection seam linking the ground and excited states through the pseudo-Jahn-Teller effect. We review how such simplicity in basic understanding can also be obtained for general chemical reactions. The key feature that must be recognized is that electron-transfer (or hole-transfer) processes typically involve one electron (hole) moving between two orbitals, whereas general reactions typically involve two electrons, or even four electrons for processes in aromatic molecules. Each additional moving electron leads to new high-energy but interrelated conical-intersection seams that distort the shape of the critical lowest-energy seam. Recognizing this feature shows how conical-intersection descriptors can be transferred between systems, and how general chemical reactions can be compared using the same set of simple parameters. Mathematical relationships are presented depicting how different conical-intersection seams relate to each other, showing that complex problems can be reduced to an effective interaction between the ground state and a critical excited state, providing the first semi-quantitative implementation of Shaik's "twin state" concept.
Applications are made (i) demonstrating why the chemistry of the first-row elements is qualitatively so different from that of the second and later rows, (ii) deducing the bond-length alternation in hypothetical cyclohexatriene from the observed UV spectroscopy of benzene, (iii) demonstrating that commonly used procedures for modelling surface hopping based on inclusion of only the first-derivative correction to the Born-Oppenheimer approximation are valid in no region of the chemical parameter space, and (iv) demonstrating the types of chemical reactions that may be suitable for exploitation as a chemical qubit in a quantum information processor.
Adequacy of the Regular Early Education Classroom Environment for Students with Visual Impairment
ERIC Educational Resources Information Center
Brown, Cherylee M.; Packer, Tanya L.; Passmore, Anne
2013-01-01
This study describes the classroom environment that students with visual impairment typically experience in regular Australian early education. Adequacy of the classroom environment (teacher training and experience, teacher support, parent involvement, adult involvement, inclusive attitude, individualization of the curriculum, physical…
Treating Families of Bone Marrow Recipients and Donors
ERIC Educational Resources Information Center
Cohen, Marie; And Others
1977-01-01
Leukemia and aplastic anemia are beginning to be treated by bone marrow transplants involving donors and recipients from the same family. Such intimate involvement in the patient's life-and-death struggles typically produces a family crisis and frequent maladaptive responses by various family members. (Author)
NASA Technical Reports Server (NTRS)
Lupton, W. F.; Conrad, A. R.
1992-01-01
KTL is a set of routines which eases the job of writing applications which must interact with a variety of underlying sub-systems (known as services). A typical application is an X Window user interface coordinating telescope and instruments. In order to connect to a service, application code specifies a service name--typically an instrument name--and a style, which defines the way in which the application will interact with the service. Two styles are currently supported: keyword, where the application reads and writes named keywords and the resulting inter-task message traffic is hidden; and message, where the application deals directly with messages. The keyword style is intended mainly for user interfaces, and the message style is intended mainly for lower-level applications. KTL applications are event driven: a typical application first connects to all its desired services, then expresses interest in specified events. The application then enters an event dispatch loop in which it waits for events and calls the appropriate service's event-handling routine. Each event is associated with a call-back routine which is invoked when the event occurs. Call-back routines may (and typically do) interact with other sub-systems and KTL provides the means of doing so without blocking the application (vital for X Window user interfaces). This approach is a marriage of ideas culled from the X window, ADAM, Keck instrument, and Keck telescope control systems. A novel feature of KTL is that it knows nothing about any services or styles. Instead it defines a generic set of routines which must be implemented by all services and styles (essentially open(), ioctl(), read(), write(), event(), and close()) and activates sharable libraries at run-time. 
Services have been implemented (in both keyword and message styles) for HIRES (the Keck high resolution echelle spectrograph built by Lick Observatory), LWS (the Keck long wavelength spectrometer built by UC San Diego), and the Keck telescope. Each of these implementations uses different underlying message systems: the Lick MUSIC system, RPC's, and direct sockets (respectively). Services for the remaining three front-line Keck instruments will be implemented over the next few months.
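The event-driven pattern that KTL embodies (register call-backs for events of interest, then run a dispatch loop) can be sketched generically. All names below (Service, register, post, dispatch_once) are invented for illustration and are not the actual KTL routines, which the abstract identifies only as open(), ioctl(), read(), write(), event(), and close():

```python
# Generic sketch of an event-driven client: call-backs are registered
# for named events on a service, then a dispatch loop invokes them as
# events arrive. Names here are illustrative, not the KTL interface.
class Service:
    def __init__(self, name):
        self.name = name
        self._callbacks = {}   # event name -> call-back routine
        self._pending = []     # queued (event, value) pairs

    def register(self, event, callback):
        """Express interest in a named event."""
        self._callbacks[event] = callback

    def post(self, event, value):
        """Simulate an event arriving from the underlying sub-system."""
        self._pending.append((event, value))

    def dispatch_once(self):
        """Dispatch one pending event to its call-back, if any."""
        if self._pending:
            event, value = self._pending.pop(0)
            handler = self._callbacks.get(event)
            if handler is not None:
                handler(value)

# A toy "telescope" service: record each right-ascension update.
telescope = Service("telescope")
seen = []
telescope.register("RA", seen.append)
telescope.post("RA", 12.5)
telescope.dispatch_once()
print(seen)
```

The key design property the abstract emphasizes is that the dispatch loop never blocks on any one service, which is why the same loop can serve an X Window interface and instrument traffic concurrently.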
Arya, Nisha G; Weissbart, Steven J
2017-04-01
Urinary incontinence disproportionately affects women. Anatomical textbooks typically describe continence mechanisms in women in the context of the pelvic floor support of the urinary bladder and the urethral sphincters. However, the urinary bladder and urethral sphincters are under the central control of the brain through a complex network of neurons that allow storage of urine followed by voiding when socially appropriate. Recent studies suggest that the most common type of urinary incontinence in women, urgency urinary incontinence, involves significant dysfunction of the central control of micturition. In this paper, we review the anatomy and functional connectivity of the nervous system structures involved in the control of micturition. Clinical application of this anatomy in the context of urgency urinary incontinence is also discussed. Understanding the anatomy of the neural structures that control continence will allow clinicians to better understand the underlying pathology of urge incontinence and consider new ways of treating this distressing condition. Clin. Anat. 30:373-384, 2017. © 2017 Wiley Periodicals, Inc.
Applied Physics Education: PER focused on Physics-Intensive Careers
NASA Astrophysics Data System (ADS)
Zwickl, Benjamin
2017-01-01
Physics education research is moving beyond classroom learning to study the application of physics education within STEM jobs and PhD-level research. Workforce-related PER is vital to supporting physics departments as they educate students for a diverse range of careers. Results from an on-going study involving interviews with entry-level employees, academic researchers, and supervisors in STEM jobs describe the ways that mathematics, physics, and communication are needed for workplace success. Math and physics are often used for solving ill-structured problems that involve data analysis, computational modeling, or hands-on work. Communication and collaboration are utilized in leadership, in sales, and as a way to transfer information capital throughout the organization through documentation, emails, memos, and face-to-face discussions. While managers and advisors think a physics degree typically establishes technical competency, communication skills are vetted through interviews and developed on the job. Significant learning continues after graduation, showing the importance of cultivating self-directed learning habits and the critical role of employers as educators of specialized technical abilities through on-the-job training. Supported by NSF DGE-1432578.
Mechanistic Insight on the Activity and Substrate Selectivity of Nonheme Iron Dioxygenases.
de Visser, Sam P
2018-06-07
Nonheme iron dioxygenases catalyze reactions vital to human health, particularly those related to aging processes. They are involved in the biosynthesis of amino acids, but also in the biodegradation of toxic compounds. Typically, they react with their substrate(s) through oxygen atom transfer, although often with the assistance of a co-substrate like α-ketoglutarate that is converted to succinate and CO2. Many reaction processes catalyzed by the nonheme iron dioxygenases are stereoselective or regiospecific, and hence understanding the mechanism and the protein's involvement in the selectivity is important for the design of biotechnological applications of these enzymes. To this end, I will review recent work of our group on nonheme iron dioxygenases and include background information on their general structure and catalytic cycle. Examples of stereoselective and regiospecific reaction mechanisms we elucidated are for the AlkB repair enzyme, prolyl-4-hydroxylase and the ergothioneine biosynthesis enzyme. Finally, I cover an example where we bioengineered S-p-hydroxymandelate synthase into the R-p-hydroxymandelate synthase. © 2018 The Chemical Society of Japan & Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
León-López, Liliana; Márquez-Mota, Claudia C; Velázquez-Villegas, Laura A; Gálvez-Mariscal, Amanda; Arrieta-Báez, Daniel; Dávila-Ortiz, Gloria; Tovar, Armando R; Torres, Nimbe
2015-09-01
Jatropha curcas is an oil seed plant that belongs to the Euphorbiaceae family. Nontoxic genotypes have been reported in Mexico. The purpose of the present work was to evaluate the effect of a Mexican variety of J. curcas protein concentrate (JCP) on weight gain, biochemical parameters, and the expression of genes and proteins involved in insulin signaling, lipogenesis, and cholesterol and protein synthesis in rats. The results demonstrated that short-term consumption of JCP increased serum glucose, insulin, triglyceride and cholesterol levels as well as the expression of transcription factors involved in lipogenesis and cholesterol synthesis (SREBP-1 and LXRα). Moreover, there was an increase in insulin signaling mediated by Akt phosphorylation and mTOR. JCP also increased PKCα protein abundance and the activation of downstream signaling pathway targets such as the AP1 and NF-κB transcription factors, which are typically activated by phorbol esters. These results suggested that phorbol esters are present in JCP and that they could be involved in the activation of PKC, which may be responsible for the high insulin secretion and, consequently, the activation of insulin-dependent pathways. Our data suggest that this Mexican Jatropha variety contains toxic compounds that produce negative metabolic effects, which warrants caution in the application of Jatropha-based products in medicine and nutrition.
Involvement of Activated Oxygen in Nitrate-Induced Senescence of Pea Root Nodules.
Escuredo, P. R.; Minchin, F. R.; Gogorcena, Y.; Iturbe-Ormaetxe, I.; Klucas, R. V.; Becana, M.
1996-01-01
The effect of short-term nitrate application (10 mM, 0-4 d) on nitrogenase (N2ase) activity, antioxidant defenses, and related parameters was investigated in pea (Pisum sativum L. cv Frilene) nodules. The response of nodules to nitrate comprised two stages. In the first stage (0-2 d), there were major decreases in N2ase activity and N2ase-linked respiration and concomitant increases in carbon cost of N2ase and oxygen diffusion resistance of nodules. There was no apparent oxidative damage, and the decline in N2ase activity was, to a certain extent, reversible. The second stage (>2 d) was typical of a senescent, essentially irreversible process. It was characterized by moderate increases in oxidized proteins and catalytic Fe and by major decreases in antioxidant enzymes and metabolites. The restriction in oxygen supply to bacteroids may explain the initial decline in N2ase activity. The decrease in antioxidant protection is not involved in this process and is not specifically caused by nitrate, since it also occurs with drought stress. However, comparison of nitrate- and drought-induced senescence shows an important difference: there is no lipid degradation or lipid peroxide accumulation with nitrate, indicating that lipid peroxidation is not necessarily involved in nodule senescence. PMID:12226252
de Solla, Shane Raymond; Palonen, Kimberley Elizabeth; Martin, Pamela Anne
2014-01-01
Turtles frequently oviposit in soils associated with agriculture and, thus, may be exposed to pesticides or fertilizers. The toxicity of a pesticide regime that is used for potato production in Ontario on the survivorship of snapping turtle (Chelydra serpentina) eggs was evaluated. The following treatments were applied to clean soil: 1) a mixture of the pesticides chlorothalonil, S-metolachlor, metribuzin, and chlorpyrifos, and 2) the soil fumigant metam sodium. Turtle eggs were incubated in soil in outdoor plots in which these mixtures were applied at typical and higher field application rates, where the eggs were subject to ambient temperature and weather conditions. The pesticide mixture consisting of chlorothalonil, S-metolachlor, metribuzin, and chlorpyrifos did not affect survivorship, deformities, or body size at applications up to 10 times the typical field application rates. Hatching success ranged between 87% and 100% for these treatments. Metam sodium was applied at 0.11, 0.33, 1, and 3 times field application rates. Eggs exposed to any application of metam sodium had 100% mortality. At typical field application rates, the chemical regime associated with potato production does not appear to have any detrimental impacts on turtle egg development, except for the use of the soil fumigant metam sodium, which is highly toxic to turtle eggs at the lowest recommended application rate. © 2013 SETAC.
SOCIAL AND NON-SOCIAL CUEING OF VISUOSPATIAL ATTENTION IN AUTISM AND TYPICAL DEVELOPMENT
Pruett, John R.; LaMacchia, Angela; Hoertel, Sarah; Squire, Emma; McVey, Kelly; Todd, Richard D.; Constantino, John N.; Petersen, Steven E.
2013-01-01
Three experiments explored attention to eye gaze, which is incompletely understood in typical development and is hypothesized to be disrupted in autism. Experiment 1 (n=26 typical adults) involved covert orienting to box, arrow, and gaze cues at two probabilities and cue-target times to test whether reorienting for gaze is endogenous, exogenous, or unique; experiment 2 (total n=80: male and female children and adults) studied age and sex effects on gaze cueing. Gaze cueing appears endogenous and may strengthen in typical development. Experiment 3 tested exogenous, endogenous, and/or gaze-based orienting in 25 typical and 27 Autistic Spectrum Disorder (ASD) children. ASD children made more saccades, slowing their reaction times; however, exogenous and endogenous orienting, including gaze cueing, appear intact in ASD. PMID:20809377
At the forefront of thought: the effect of media exposure on airplane typicality.
Novick, Laura R
2003-12-01
The terrorist attacks of September 11, 2001 provided a unique opportunity to investigate the causal status of frequency on typicality for one exemplar of a common conceptual category--namely, the typicality of airplane as a member of the category of vehicles. The extensive media coverage following the attacks included numerous references to the hijacked airplanes and to the consequences of suspending air travel to and from the United States for several days. The present study, involving 152 undergraduates, assessed airplane typicality at three time points ranging from 5 h to 1 month after the attacks and then again at 4.5 months after the attacks. Airplane was judged to be a more typical vehicle for 1 month following the attacks, relative to a baseline calculated from data collected yearly for 5 years preceding the attacks. By 4.5 months, however, typicality was back to baseline.
Montemayor, Beta P; Price, Bradford B; van Egmond, Roger A
2013-10-01
Decamethylcyclopentasiloxane, commonly known as D5 (cyclopentasiloxane), is widely used across a multitude of personal care product categories. The relative volatility of D5 is one of the key properties of this substance that provide the performance benefits derived from the use of this raw material in personal care formulations. On this basis, rapid evaporative loss is expected following typical use application of many products comprising D5 and the corresponding wear time. Studies were conducted on three key product categories containing D5 (antiperspirants, skin care products and hair care products) to characterize the amount of D5 that may be destined to 'go down the drain' following simulated typical personal care use scenarios. Marketed antiperspirants and skin care products were applied to human subjects, and hair care products were applied to human hair tresses and subsequently rinsed off at designated time points representative of typical consumer cleansing and personal hygiene habits. Wash water was collected at 0, 8 and 24 h (antiperspirant and hair care analysis) and additionally at 4 h (skin care analysis) post product application, and samples were analyzed by isotope dilution headspace gas chromatography/mass spectrometry (GC/MS) to quantify the concentration of D5 available to go down the drain in captured wash water. It is demonstrated that significant amounts of D5 in 'leave-on' application products evaporate during typical use and that the concentration of D5 available to go down the drain under such conditions of use is only a very small (negligible) fraction of that delivered immediately upon product application. Copyright © 2012 Elsevier Ltd. All rights reserved.
Composite Flywheels Assessed Analytically by NDE and FEA
NASA Technical Reports Server (NTRS)
Abdul-Aziz, Ali; Baaklini, George Y.
2000-01-01
As an alternative to expensive and short-lived lead-acid batteries, composite flywheels are being developed to provide an uninterruptible power supply for advanced aerospace and industrial applications. Flywheels can help prevent irregularities in voltage caused by power spikes, sags, surges, burnout, and blackouts. Other applications include load-leveling systems for wind and solar power facilities, where energy output fluctuates with weather. Advanced composite materials are being considered for these components because they are significantly lighter than typical metallic alloys and have high specific strength and stiffness. However, much more research is needed before these materials can be fully utilized, because there is insufficient data concerning their fatigue characteristics and nonlinear behavior, especially at elevated temperatures. Moreover, these advanced types of structural composites pose greater challenges for nondestructive evaluation (NDE) techniques than are encountered with typical monolithic engineering metals. This is particularly true for ceramic polymer and metal matrix composites, where structural properties are tailored during the processing stages. Current efforts involving the NDE group at the NASA Glenn Research Center at Lewis Field are focused on evaluating many important structural components, including the flywheel system. Glenn's in-house analytical and experimental capabilities are being applied to analyze data produced by computed tomography (CT) scans to help assess the damage and defects of high-temperature structural composite materials. Finite element analysis (FEA) has been used extensively to model the effects of static and dynamic loading on aerospace propulsion components. This technique allows the use of complicated loading schemes by breaking the complex part geometry into many smaller, geometrically simple elements.
Stakeholder Perceptions of Cyberbullying Cases: Application of the Uniform Definition of Bullying.
Moreno, Megan A; Suthamjariya, Nina; Selkie, Ellen
2018-04-01
The Uniform Definition of Bullying was developed to address bullying and cyberbullying, and to promote consistency in measurement and policy. The purpose of this study was to understand community stakeholder perceptions of typical cyberbullying cases, and to evaluate how these case descriptions align with the Uniform Definition. In this qualitative case analysis we recruited stakeholders commonly involved in cyberbullying. We used purposeful sampling to identify and recruit adolescents and young adults, parents, and professionals representing education and health care. Participants were asked to write a typical case of cyberbullying and descriptors in the context of a group discussion. We applied content analysis to case excerpts using inductive and deductive approaches, and chi-squared tests for mixed methods analyses. A total of 68 participants contributed; participants included 73% adults and 27% adolescents and young adults. A total of 650 excerpts were coded from participants' example cases and 362 (55.6%) were consistent with components of the Uniform Definition. The most frequently mentioned component of the Uniform Definition was Aggressive Behavior (n = 218 excerpts), whereas Repeated was mentioned infrequently (n = 19). Most participants included two to three components of the Uniform Definition within an example case; none of the example cases included all components of the Uniform Definition. We found that most participants described cyberbullying cases using few components of the Uniform Definition. Findings can be applied toward considering refinement of the Uniform Definition to ensure stakeholders find it applicable to cyberbullying. Copyright © 2017 The Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
Micrometeoroid and Lunar Secondary Ejecta Flux Measurements: Comparison of Three Acoustic Systems
NASA Technical Reports Server (NTRS)
Corsaro, R. D.; Giovane, F.; Liou, Jer-Chyi; Burtchell, M.; Pisacane, V.; Lagakos, N.; Williams, E.; Stansbery, E.
2010-01-01
This report examines the inherent capability of three large-area acoustic sensor systems and their applicability to micrometeoroid (MM) and lunar secondary ejecta (SE) detection and characterization for future lunar exploration activities. Discussion is limited to instruments that can be fabricated and deployed with low resource requirements. Previously deployed impact detection probes typically have instrumented capture areas of less than 0.2 square meters. Since the particle flux decreases rapidly with increased particle size, such small-area sensors rarely encounter particles in the size range above 50 microns, and even their sampling of the population above 10 microns is typically limited. Characterizing the sparse dust population in the size range above 50 microns requires a very large-area capture instrument. However, it is also important that such an instrument simultaneously measure the population of the smaller particles, so as to provide a complete instantaneous snapshot of the population. For lunar or planetary surface studies, the system constraints are significant. The instrument must be as large as possible to sample the population of the largest MM. This is needed to reliably assess the particle impact risks and to develop cost-effective shielding designs for habitats, astronauts, and critical instruments. The instrument should also have very high sensitivity to measure the flux of small and slow SE particles. The SE environment is currently poorly characterized and poses a contamination risk to machinery and personnel involved in exploration. Deployment also requires that the instrument add very little additional mass to the spacecraft. Three acoustic systems are being explored for this application.
Relative Age Effects in a Cognitive Task: A Case Study of Youth Chess
ERIC Educational Resources Information Center
Helsen, Werner F.; Baker, Joseph; Schorer, Joerg; Steingröver, Christina; Wattie, Nick; Starkes, Janet L.
2016-01-01
The relative age effect (RAE) has been demonstrated in many youth and professional sports. In this study, we hypothesized that there would also be a RAE among youth chess players who are typically involved in a complex cognitive task without significant physical requirements. While typical RAEs have been observed in adult chess players, in this…
ERIC Educational Resources Information Center
Teixeira, Jennifer M.; Byers, Jessie Nedrow; Perez, Marilu G.; Holman, R. W.
2010-01-01
Experimental exercises within second-year-level organic laboratory manuals typically involve a statement of a principle that is then validated by student generation of data in a single experiment. These experiments are structured in the exact opposite order of the scientific method, in which data interpretation, typically from multiple related…
ERIC Educational Resources Information Center
Recker, Margaret M.; Pirolli, Peter
Students learning to program recursive LISP functions in a typical school-like lesson on recursion were observed. The typical lesson contains text and examples and involves solving a series of programming problems. The focus of this study is on students' learning strategies in new domains. In this light, a Soar computational model of…
Muiño, Elena; Gallego-Fabrega, Cristina; Cullell, Natalia; Carrera, Caty; Torres, Nuria; Krupinski, Jurek; Roquer, Jaume; Montaner, Joan; Fernández-Cadenas, Israel
2017-09-13
CADASIL (cerebral autosomal dominant arteriopathy with subcortical infarcts and leukoencephalopathy) is caused by mutations in the NOTCH3 gene, affecting the number of cysteines in the extracellular domain of the receptor, causing protein misfolding and receptor aggregation. The pathogenic role of cysteine-sparing NOTCH3 missense mutations in patients with typical clinical CADASIL syndrome is unknown. The aim of this article is to describe these mutations to clarify whether any could be potentially pathogenic. Articles on cysteine-sparing NOTCH3 missense mutations in patients with clinical suspicion of CADASIL were reviewed. Mutations were considered potentially pathogenic if patients had: (a) typical clinical CADASIL syndrome; (b) diffuse white matter hyperintensities; (c) all 33 NOTCH3 exons analyzed; (d) mutations that were not polymorphisms; and (e) granular osmiophilic material (GOM) deposits in the skin biopsy. Twenty-five different mutations were listed. Four fulfilled the above criteria: p.R61W; p.R75P; p.D80G; and p.R213K. Patients carrying these mutations had typical clinical CADASIL syndrome and diffuse white matter hyperintensities, mostly without anterior temporal pole involvement. Cysteine-sparing NOTCH3 missense mutations are associated with typical clinical CADASIL syndrome and typical magnetic resonance imaging (MRI) findings, although with less involvement of the anterior temporal lobe. Hence, these mutations should be further studied to confirm their pathological role in CADASIL.
Agelastos, Anthony; Allan, Benjamin; Brandt, Jim; ...
2016-05-18
A detailed understanding of HPC applications' resource needs and their complex interactions with each other and with HPC platform resources is critical to achieving scalability and performance. Such understanding has been difficult to achieve because typical application profiling tools do not capture the behaviors of codes under the potentially wide spectrum of actual production conditions, and because typical monitoring tools do not capture system resource usage information with high enough fidelity to gain sufficient insight into application performance and demands. In this paper we present both system and application profiling results based on data obtained through synchronized system-wide monitoring on a production HPC cluster at Sandia National Laboratories (SNL). We demonstrate analytic and visualization techniques that we are using to characterize application and system resource usage under production conditions for better understanding of application resource needs. Furthermore, our goals are to improve application performance (through understanding application-to-resource mapping and system throughput) and to ensure that future system capabilities match their intended workloads.
Engaging Undergraduates in Social Science Research: The Taking the Pulse of Saskatchewan Project
ERIC Educational Resources Information Center
Berdahl, Loleen
2014-01-01
Although student involvement in research and inquiry can advance undergraduate learning, there are limited opportunities for undergraduate students to be directly involved in social science research. Social science faculty members typically work outside of laboratory settings, with the limited research assistance work being completed by graduate…
Estimating the effect of changes in water quality on non-market values for recreation involves estimating a change in aggregate consumer surplus. This aggregate value typically involves estimating both a per-person, per-trip change in willingness to pay, as well as defining the m...
43 CFR 10005.12 - Policy regarding the scope of measures to be included in the plan.
Code of Federal Regulations, 2011 CFR
2011-10-01
... the site of the impact typically involves restoration or replacement. Off-site mitigation might involve protection, restoration, or enhancement of a similar resource value at a different location... responsibilities, the Commission sees an obligation to give priority to protection and restoration activities that...
Small School Ritual and Parent Involvement.
ERIC Educational Resources Information Center
Bushnell, Mary
This paper examines the ritual socialization of parents into a school community. Rituals may be mundane or sacred and typically involve actions that have transformative potential. In the context of groups, rituals may serve the purposes of identifying and constructing group identity, maintaining cohesion, and constructing and communicating values.…
[Algorithms of artificial neural networks--practical application in medical science].
Stefaniak, Bogusław; Cholewiński, Witold; Tarkowska, Anna
2005-12-01
Artificial Neural Networks (ANN) may be an alternative and complementary tool to typical statistical analysis. However, despite the many ready-to-use computer implementations of various ANN algorithms, artificial intelligence is relatively rarely applied to data processing. This paper presents practical aspects of the scientific application of ANN in medicine using widely available algorithms. Several main steps of analysis with ANN are discussed, from material selection and its division into groups to the final quality assessment of the obtained results. The most frequent, typical sources of error, as well as a comparison of the ANN method with modeling by regression analysis, are also described.
Reason, emotion and decision-making: risk and reward computation with feeling.
Quartz, Steven R
2009-05-01
Many models of judgment and decision-making posit distinct cognitive and emotional contributions to decision-making under uncertainty. Cognitive processes typically involve exact computations according to a cost-benefit calculus, whereas emotional processes typically involve approximate, heuristic processes that deliver rapid evaluations without mental effort. However, it remains largely unknown what specific parameters of uncertain decisions the brain encodes, the extent to which these parameters correspond to various decision-making frameworks, and their correspondence to emotional and rational processes. Here, I review research suggesting that emotional processes encode, in a precise quantitative manner, the basic parameters of financial decision theory, indicating a reorientation of emotional and cognitive contributions to risky choice.
Mass balance and swath displacement evaluations from agricultural application field trials
USDA-ARS?s Scientific Manuscript database
Spray drift is an ongoing concern for any agricultural application and continues to be the focus of new developments and research efforts dealing with drift reduction technologies, best management application practices and the development of new decision support systems for applicators. Typical...
Rapid microwave-assisted preparation of binary and ternary transition metal sulfide compounds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butala, Megan M.; Perez, Minue A.; Arnon, Shiri
Transition metal chalcogenides are of interest for energy applications, including energy generation in photoelectrochemical cells and as electrodes for next-generation electrochemical energy storage. Synthetic routes for such chalcogenides typically involve extended heating at elevated temperatures for multiple weeks. We demonstrate here the feasibility of rapidly preparing select sulfide compounds in a matter of minutes, rather than weeks, using microwave-assisted heating in domestic microwaves. We report the preparations of phase pure FeS2, CoS2, and solid solutions thereof from the elements with only 40 min of heating. Conventional furnace and rapid microwave preparations of CuTi2S4 both result in a majority of the targeted phase, even with the significantly shorter heating time of 40 min for microwave methods relative to 12 days using a conventional furnace. The preparations we describe for these compounds can be extended to related structures and chemistries and thus enable rapid screening of the properties and performance of various compositions of interest for electronic, optical, and electrochemical applications.
Kinetic Monte Carlo Simulation of Oxygen Diffusion in Ytterbium Disilicate
NASA Technical Reports Server (NTRS)
Good, Brian S.
2015-01-01
Ytterbium disilicate is of interest as a potential environmental barrier coating for aerospace applications, notably for use in next generation jet turbine engines. In such applications, the transport of oxygen and water vapor through these coatings to the ceramic substrate is undesirable if high temperature oxidation is to be avoided. In an effort to understand the diffusion process in these materials, we have performed kinetic Monte Carlo simulations of vacancy-mediated and interstitial oxygen diffusion in Ytterbium disilicate. Oxygen vacancy and interstitial site energies, vacancy and interstitial formation energies, and migration barrier energies were computed using Density Functional Theory. We have found that, in the case of vacancy-mediated diffusion, many potential diffusion paths involve large barrier energies, but some paths have barrier energies smaller than one electron volt. However, computed vacancy formation energies suggest that the intrinsic vacancy concentration is small. In the case of interstitial diffusion, migration barrier energies are typically around one electron volt, but the interstitial defect formation energies are positive, with the result that the disilicate is unlikely to exhibit significant oxygen permeability except at very high temperature.
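The barrier-hopping picture behind such simulations can be illustrated with a minimal rejection-free kinetic Monte Carlo loop: rates follow an Arrhenius law, an event is chosen with probability proportional to its rate, and the clock advances by an exponentially distributed residence time. The 1D lattice, barrier values, and attempt frequency below are generic placeholders, not the DFT-computed values for ytterbium disilicate.

```python
import math, random

random.seed(1)

KB = 8.617e-5   # Boltzmann constant, eV/K
NU = 1.0e13     # attempt frequency, 1/s (typical phonon-scale value)
T = 1500.0      # temperature, K

def rate(barrier_ev):
    """Arrhenius hopping rate for a given migration barrier."""
    return NU * math.exp(-barrier_ev / (KB * T))

# Hypothetical (left, right) hop barriers for four inequivalent lattice sites
barriers = [(0.9, 1.1), (1.0, 0.8), (1.2, 0.9), (0.85, 1.0)]

pos, t = 0, 0.0
for _ in range(10000):
    left, right = barriers[pos % len(barriers)]
    rates = [rate(left), rate(right)]
    total = sum(rates)
    # choose an event with probability proportional to its rate
    r = random.random() * total
    pos += -1 if r < rates[0] else +1
    # advance the clock by an exponentially distributed residence time
    t += -math.log(random.random()) / total

# crude single-trajectory diffusivity estimate (lattice units^2 per second)
D = pos * pos / (2.0 * t)
```

In a production study one would average many trajectories and use the full 3D site network; the loop structure, however, is the same.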
Varnes, Jeffrey G; Geschwindner, Stefan; Holmquist, Christopher R; Forst, Janet; Wang, Xia; Dekker, Niek; Scott, Clay W; Tian, Gaochao; Wood, Michael W; Albert, Jeffrey S
2016-01-01
Fragment-based drug design (FBDD) relies on direct elaboration of fragment hits and typically requires high resolution structural information to guide optimization. In fragment-assisted drug discovery (FADD), fragments provide information to guide selection and design but do not serve as starting points for elaboration. We describe FADD and high-throughput screening (HTS) campaign strategies conducted in parallel against PDE10A where fragment hit co-crystallography was not available. The fragment screen led to prioritized fragment hits (IC50's ∼500μM), which were used to generate a hypothetical core scaffold. Application of this scaffold as a filter to HTS output afforded a 4μM hit, which, after preparation of a small number of analogs, was elaborated into a 16nM lead. This approach highlights the strength of FADD, as fragment methods were applied despite the absence of co-crystallographical information to efficiently identify a lead compound for further optimization. Copyright © 2015 Elsevier Ltd. All rights reserved.
Maximum-Entropy Inference with a Programmable Annealer
Chancellor, Nicholas; Szoke, Szilard; Vinci, Walter; Aeppli, Gabriel; Warburton, Paul A.
2016-01-01
Optimisation problems typically involve finding the ground state (i.e. the minimum energy configuration) of a cost function with respect to many variables. If the variables are corrupted by noise then this maximises the likelihood that the solution is correct. The maximum entropy solution on the other hand takes the form of a Boltzmann distribution over the ground and excited states of the cost function to correct for noise. Here we use a programmable annealer for the information decoding problem which we simulate as a random Ising model in a field. We show experimentally that finite temperature maximum entropy decoding can give slightly better bit-error-rates than the maximum likelihood approach, confirming that useful information can be extracted from the excited states of the annealer. Furthermore we introduce a bit-by-bit analytical method which is agnostic to the specific application and use it to show that the annealer samples from a highly Boltzmann-like distribution. Machines of this kind are therefore candidates for use in a variety of machine learning applications which exploit maximum entropy inference, including language processing and image recognition. PMID:26936311
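The contrast drawn above — maximum-likelihood (ground-state) decoding versus maximum-entropy decoding from a Boltzmann distribution over ground and excited states — can be made concrete on a toy Ising model small enough to enumerate exactly. The couplings, fields, and inverse temperature below are arbitrary illustrative values, not the paper's decoding problem.

```python
import itertools, math

# Hypothetical 4-spin Ising model in a field: E(s) = -sum J_ij s_i s_j - sum h_i s_i
J = {(0, 1): 1.0, (1, 2): 1.0, (2, 3): 1.0}
h = [0.3, -0.2, 0.1, 0.4]   # noisy local fields encoding the transmitted bits
n = 4

def energy(s):
    e = -sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
    e -= sum(hi * si for hi, si in zip(h, s))
    return e

states = list(itertools.product([-1, +1], repeat=n))

# Maximum-likelihood decoding: the ground state of the cost function
ml = min(states, key=energy)

# Maximum-entropy decoding: Boltzmann-weighted bit marginals at finite temperature
beta = 2.0   # inverse temperature (assumed)
Z = sum(math.exp(-beta * energy(s)) for s in states)
marginals = [
    sum(s[i] * math.exp(-beta * energy(s)) for s in states) / Z
    for i in range(n)
]
mpm = tuple(+1 if m >= 0 else -1 for m in marginals)
```

The annealer in the paper plays the role of the enumeration here: it supplies (approximate) Boltzmann samples from which the bit marginals are estimated, which is what makes its excited states informative rather than mere noise.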
Rath, N; Kato, S; Levesque, J P; Mauel, M E; Navratil, G A; Peng, Q
2014-04-01
Fast digital signal processing (DSP) has many applications. Typical hardware options for performing DSP are field-programmable gate arrays (FPGAs), application-specific integrated DSP chips, or general purpose personal computer systems. This paper presents a novel DSP platform that has been developed for feedback control on the HBT-EP tokamak device. The system runs all signal processing exclusively on a Graphics Processing Unit (GPU) to achieve real-time performance with latencies below 8 μs. Signals are transferred into and out of the GPU using PCI Express peer-to-peer direct-memory-access transfers without involvement of the central processing unit or host memory. Tests were performed on the feedback control system of the HBT-EP tokamak using forty 16-bit floating point inputs and outputs each and a sampling rate of up to 250 kHz. Signals were digitized by a D-TACQ ACQ196 module, processing done on an NVIDIA GTX 580 GPU programmed in CUDA, and analog output was generated by D-TACQ AO32CPCI modules.
NASA Astrophysics Data System (ADS)
Gauthier, Jean-Christophe; Robichaud, Louis-Rafaël; Fortin, Vincent; Vallée, Réal; Bernier, Martin
2018-06-01
The quest for a compact and efficient broadband laser source able to probe the numerous fundamental molecular absorption lines in the mid-infrared (3-8 µm) for various applications has been going on for more than a decade. While robust commercial fiber-based supercontinuum (SC) systems have started to appear on the market, they still exhibit poor energy conversion into the mid-infrared (typically under 30%) and are generally not producing wavelengths exceeding 4.7 µm. Here, we present an overview of the results obtained from a novel approach to SC generation based on spectral broadening inside of an erbium-doped fluoride fiber amplifier seeded directly at 2.8 µm, allowing mid-infrared conversion efficiencies reaching up to 95% and spectral coverage approaching the transparency limit of ZrF4 (4.2 µm) and InF3 (5.5 µm) fibers. The general concept of the approach and the physical mechanisms involved are presented alongside the various configurations of the system to adjust the output characteristics in terms of spectral coverage and output power for different applications.
NASA Astrophysics Data System (ADS)
An, Suyeong; Kim, Byoungsoo; Lee, Jonghwi
2017-07-01
Porous materials with surprisingly diverse structures have been utilized in nature for many functional purposes. However, the structures and applications of porous man-made polymer materials have been limited by the use of processing techniques involving foaming agents. Herein, we demonstrate for the first time the outstanding hardness and modulus properties of an elastomer that originate from the novel processing approach applied. Polyurethane films of 100-μm thickness with biomimetic ordered porous structures were prepared using directional melt crystallization of a solvent and exhibited hardness and modulus values that were 6.8 and 4.3 times higher than those of the random pore structure, respectively. These values surpass the theoretical prediction of the typical model for porous materials, which works reasonably well for random pores but not for directional pores. Both the ordered and random pore structures exhibited similar porosities and pore sizes, which decreased with increasing solution concentration. This unexpectedly significant improvement of the hardness and modulus could open up new application areas for porous polymeric materials using this relatively novel processing technique.
Rights of Conscience Protections for Armed Forces Service Members and Their Chaplains
2015-07-22
established five categories of religious accommodation requests: dietary, grooming, medical, uniform, and worship practices. • Dietary: typically, these... • Medical: typically, these are requests for a waiver of mandatory immunizations. • Uniform: typically, these are requests to wear religious jewelry or... service members in their units. Requirements: a chaplain applicant is required to meet DoD medical and physical standards for commissioning as an
Human-Robot Interface: Issues in Operator Performance, Interface Design, and Technologies
2006-07-01
and the use of lightweight portable robotic sensor platforms. Robotics has reached a point where some generalities of HRI transcend specific... displays with control devices such as joysticks, wheels, and pedals (Kamsickas, 2003). Typical control stations include panels displaying (a) sensor... tasks that do not involve mobility and usually involve camera control or data fusion from sensors. Active search: search tasks that involve mobility
ERIC Educational Resources Information Center
Jarrold, Christopher; Thorn, Annabel S. C.; Stephens, Emma
2009-01-01
This study examined the correlates of new word learning in a sample of 64 typically developing children between 5 and 8 years of age and a group of 22 teenagers and young adults with Down syndrome. Verbal short-term memory and phonological awareness skills were assessed to determine whether learning new words involved accurately representing…
ERIC Educational Resources Information Center
Nowakowski, Matilda E.; Tasker, Susan L.; Cunningham, Charles E.; McHolm, Angela E.; Edison, Shannon; St. Pierre, Jeff; Boyle, Michael H.; Schmidt, Louis A.
2011-01-01
Although joint attention processes are known to play an important role in adaptive social behavior in typical development, we know little about these processes in clinical child populations. We compared early school age children with selective mutism (SM; n = 19) versus mixed anxiety (MA; n = 18) and community controls (CC; n = 26) on joint…
Molecular genetic models related to schizophrenia and psychotic illness: heuristics and challenges.
O'Tuathaigh, Colm M P; Desbonnet, Lieve; Moran, Paula M; Kirby, Brian P; Waddington, John L
2011-01-01
Schizophrenia is a heritable disorder that may involve several common genes of small effect and/or rare copy number variation, with phenotypic heterogeneity across patients. Furthermore, any boundaries vis-à-vis other psychotic disorders are far from clear. Consequently, identification of informative animal models for this disorder, which typically relate to pharmacological and putative pathophysiological processes of uncertain validity, faces considerable challenges. In juxtaposition, the majority of mutant models for schizophrenia relate to the functional roles of a diverse set of genes associated with risk for the disorder or with such putative pathophysiological processes. This chapter seeks to outline the evidence from phenotypic studies in mutant models related to schizophrenia. These have commonly assessed the degree to which mutation of a schizophrenia-related gene is associated with the expression of several aspects of the schizophrenia phenotype or more circumscribed, schizophrenia-related endophenotypes; typically, they place specific emphasis on positive and negative symptoms and cognitive deficits, and extend to structural and other pathological features. We first consider the primary technological approaches to the generation of such mutants, to include their relative merits and demerits, and then highlight the diverse phenotypic approaches that have been developed for their assessment. The chapter then considers the application of mutant phenotypes to study pathobiological and pharmacological mechanisms thought to be relevant for schizophrenia, particularly in terms of dopaminergic and glutamatergic dysfunction, and to an increasing range of candidate susceptibility genes and copy number variants. Finally, we discuss several pertinent issues and challenges within the field which relate to both phenotypic evaluation and a growing appreciation of the functional genomics of schizophrenia and the involvement of gene × environment interactions.
MutScan: fast detection and visualization of target mutations by scanning FASTQ data.
Chen, Shifu; Huang, Tanxiao; Wen, Tiexiang; Li, Hong; Xu, Mingyan; Gu, Jia
2018-01-22
Some types of clinical genetic tests, such as cancer testing using circulating tumor DNA (ctDNA), require sensitive detection of known target mutations. However, conventional next-generation sequencing (NGS) data analysis pipelines typically involve different steps of filtering, which may cause missed detection of key mutations with low frequencies. Variant validation is also indicated for key mutations detected by bioinformatics pipelines. Typically, this process can be executed using alignment visualization tools such as IGV or GenomeBrowse. However, these tools are too heavyweight and therefore unsuitable for validating mutations in ultra-deep sequencing data. We developed MutScan to address the problems of sensitive detection and efficient validation of target mutations. MutScan involves highly optimized string-searching algorithms, which can scan input FASTQ files to grab all reads that support target mutations. The collected supporting reads for each target mutation will be piled up and visualized using web technologies such as HTML and JavaScript. Algorithms such as rolling hash and bloom filter are applied to accelerate scanning and make MutScan applicable to detect or visualize target mutations in a very fast way. MutScan is a tool for the detection and visualization of target mutations by only scanning FASTQ raw data directly. Compared to conventional pipelines, this offers very high performance, executing about 20 times faster, and offers maximal sensitivity since it can grab mutations with even a single supporting read. MutScan visualizes detected mutations by generating interactive pile-ups using web technologies. These can serve to validate target mutations, thus avoiding false positives. Furthermore, MutScan can visualize all mutation records in a VCF file to HTML pages for cloud-friendly VCF validation. MutScan is an open source tool available at GitHub: https://github.com/OpenGene/MutScan.
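The core trick described — scanning raw reads for sequences that support a target mutation using a rolling hash — can be sketched with a Rabin-Karp matcher. This is a generic illustration of the technique, not MutScan's actual implementation (which adds bloom filters, FASTQ parsing, and HTML pile-up output):

```python
BASE = 4
MOD = (1 << 61) - 1
ENC = {"A": 0, "C": 1, "G": 2, "T": 3}

def scan_read(read, target):
    """Return start positions of `target` in `read` using a rolling hash,
    so each read is scanned in O(len(read)) regardless of target length."""
    k = len(target)
    if len(read) < k:
        return []
    t_hash = 0
    for c in target:
        t_hash = (t_hash * BASE + ENC[c]) % MOD
    pow_bk = pow(BASE, k, MOD)
    h, hits = 0, []
    for i, c in enumerate(read):
        h = (h * BASE + ENC[c]) % MOD
        if i >= k:                         # drop the base leaving the window
            h = (h - ENC[read[i - k]] * pow_bk) % MOD
        # verify the substring on a hash match to rule out collisions
        if i >= k - 1 and h == t_hash and read[i - k + 1 : i + 1] == target:
            hits.append(i - k + 1)
    return hits

# e.g. a read supporting a hypothetical mutation flanked by known sequence
hits = scan_read("TTACGTACGT", "ACGT")   # -> [2, 6]
```

In practice the target string would be the mutated allele plus flanking bases, and every read with at least one hit would be collected as a supporting read.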
Intraepidermal Merkel cell carcinoma: A case series of a rare entity with clinical follow up.
Jour, George; Aung, Phyu P; Rozas-Muñoz, Eduardo; Curry, Johnathan L; Prieto, Victor; Ivan, Doina
2017-08-01
Merkel cell carcinoma (MCC) is a rare but aggressive cutaneous carcinoma. MCC typically involves the dermis, and although epidermotropism has been reported, MCC that is strictly intraepidermal or in situ (MCCIS) is exceedingly rare. Most of the cases of MCCIS described so far have other associated lesions, such as squamous or basal cell carcinoma, actinic keratosis and so on. Herein, we describe three patients with MCC strictly in situ, without a dermal component. Our patients were elderly. Two of the lesions involved the head and neck area and one was on a finger. All tumors were strictly intraepidermal in the diagnostic biopsies, and had histomorphologic features and an immunohistochemical profile supporting the diagnosis of MCC. Excisional biopsies were performed in two cases and failed to reveal dermal involvement by MCC or other associated malignancies. Our findings raise awareness that MCC strictly in situ does exist and should be included in the differential diagnosis of Paget's or extramammary Paget's disease, pagetoid squamous cell carcinoma, melanoma and other neoplasms that typically show histologically pagetoid extension of neoplastic cells. Considering the limited number of cases reported to date, the diagnosis of isolated MCCIS should not warrant a change in management from that of typical MCC. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Grip force coordination during bimanual tasks in unilateral cerebral palsy.
Islam, Mominul; Gordon, Andrew M; Sköld, Annika; Forssberg, Hans; Eliasson, Ann-Christin
2011-10-01
The aim of the study was to investigate coordination of fingertip forces during an asymmetrical bimanual task in children with unilateral cerebral palsy (CP). Twelve participants (six males, six females; mean age 14y 4mo, SD 3.3y; range 9-20y) with unilateral CP (eight right-sided, four left-sided) and 15 age-matched typically developing participants (five males, 10 females; mean age 14y 3mo, SD 2.9y; range 9-18y) were included. Participants were instructed to hold custom-made grip devices in each hand and place one device on top of the other. The grip force and load force were recorded simultaneously in both hands. Temporal coordination between the two hands was impaired in the participants with CP (compared with that in typically developing participants); that is, they initiated the task by decreasing grip force in the releasing hand before increasing the force in the holding hand. The grip force increase in the holding hand was also smaller in participants with CP (involved hand/non-dominant hand releasing, p<0.001; non-involved hand/dominant hand releasing, p=0.007), indicating deficient scaling of force amplitude. The impairment was greater when participants with CP used their non-involved hand as the holding hand. Temporal coordination and scaling of fingertip forces were impaired in both hands in participants with CP. The non-involved hand was strongly affected by activity in the involved hand, which may explain why children with unilateral CP prefer to use only one hand during tasks that are typically performed with both hands. © The Authors. Developmental Medicine & Child Neurology © 2011 Mac Keith Press.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agelastos, Anthony; Allan, Benjamin; Brandt, Jim
A detailed understanding of HPC applications' resource needs and their complex interactions with each other and with HPC platform resources is critical to achieving scalability and performance. Such understanding has been difficult to achieve because typical application profiling tools do not capture the behaviors of codes under the potentially wide spectrum of actual production conditions and because typical monitoring tools do not capture system resource usage information with high enough fidelity to gain sufficient insight into application performance and demands. In this paper we present both system and application profiling results based on data obtained through synchronized system-wide monitoring on a production HPC cluster at Sandia National Laboratories (SNL). We demonstrate analytic and visualization techniques that we are using to characterize application and system resource usage under production conditions for better understanding of application resource needs. Furthermore, our goals are to improve application performance (through understanding application-to-resource mapping and system throughput) and to ensure that future system capabilities match their intended workloads.
Real-Time Processing Library for Open-Source Hardware Biomedical Sensors
Castro-García, Juan A.; Lebrato-Vázquez, Clara
2018-01-01
Applications involving data acquisition from sensors need samples at a preset frequency rate, the filtering out of noise and/or analysis of certain frequency components. We propose a novel software architecture based on open-software hardware platforms which allows programmers to create data streams from input channels and easily implement filters and frequency analysis objects. The performance of the different classes, in terms of memory allocated and execution time (number of clock cycles), was analyzed on the low-cost platform Arduino Genuino. In addition, 11 people took part in an experiment in which they had to implement several exercises and complete a usability test. Sampling rates under 250 Hz (typical for many biomedical applications) make it feasible to implement filters, sliding windows and Fourier analysis, operating in real time. Participants rated software usability at 70.2 out of 100 and the ease of use when implementing several signal processing applications was rated at just over 4.4 out of 5. Participants showed their intention of using this software because it was perceived as useful and very easy to use. The performance of the library showed that it may be appropriate for implementing small biomedical real-time applications or for human movement monitoring, even in a simple open-source hardware device like Arduino Genuino. The general perception about this library is that it is easy to use and intuitive. PMID:29596394
Paper-based device for separation and cultivation of single microalga.
Chen, Chih-Chung; Liu, Yi-Ju; Yao, Da-Jeng
2015-12-01
Single-cell separation is among the most useful techniques in biochemical research, diagnosis and various industrial applications. Microalgae species have great economic importance as industrial raw materials. Microalgae collected from the environment are typically a mixed and heterogeneous population of species that must be isolated and purified for examination and further application. Conventional methods, such as serial dilution and a streaking-plate method, are labor-intensive and inefficient. We developed a paper-based device for the separation and cultivation of a single microalga. The fabrication was simply conducted with a common laser printer and required only a few minutes, without lithographic instruments or a clean-room. The driving force of the paper device was simple capillarity, without the complicated pump connection that is part of most microfluidic devices. The open-structure design of the paper device makes it operable with a common laboratory micropipette for sample transfer and manipulation with the naked eye, or adaptable to a robotic system with high-throughput retrieval and analysis functionality. The efficiency of isolating a single cell from mixed microalgae species is seven times as great as with a conventional method involving serial dilution. The paper device can also serve as an incubator for microalgae growth upon simply rinsing the paper with a growth medium. Many applications, such as selection of highly expressing cells and various single-cell analyses, would be possible. Copyright © 2015 Elsevier B.V. All rights reserved.
The Cloud-Based Integrated Data Viewer (IDV)
NASA Astrophysics Data System (ADS)
Fisher, Ward
2015-04-01
Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While there is a suite of tools and methodologies used in traditional software engineering environments to mitigate this issue, they are typically ignored by developers lacking a background in software engineering. The result is a large body of software which is simultaneously critical and difficult to maintain. Visualization software is particularly vulnerable to this problem, given the inherent dependency on particular graphics hardware and software APIs. The advent of cloud computing has provided a solution to this problem that was not previously practical on a large scale: Application Streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little-to-no re-engineering required. Through application streaming we are able to bring the same visualization to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness Application Streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved. We will also discuss the differences between local software and software-as-a-service.
Real-Time Processing Library for Open-Source Hardware Biomedical Sensors.
Molina-Cantero, Alberto J; Castro-García, Juan A; Lebrato-Vázquez, Clara; Gómez-González, Isabel M; Merino-Monge, Manuel
2018-03-29
Applications involving data acquisition from sensors need samples at a preset frequency rate, the filtering out of noise and/or analysis of certain frequency components. We propose a novel software architecture based on open-software hardware platforms which allows programmers to create data streams from input channels and easily implement filters and frequency analysis objects. The performance of the different classes, in terms of memory allocated and execution time (number of clock cycles), was analyzed on the low-cost platform Arduino Genuino. In addition, 11 people took part in an experiment in which they had to implement several exercises and complete a usability test. Sampling rates under 250 Hz (typical for many biomedical applications) make it feasible to implement filters, sliding windows and Fourier analysis, operating in real time. Participants rated software usability at 70.2 out of 100 and the ease of use when implementing several signal processing applications was rated at just over 4.4 out of 5. Participants showed their intention of using this software because it was perceived as useful and very easy to use. The performance of the library showed that it may be appropriate for implementing small biomedical real-time applications or for human movement monitoring, even in a simple open-source hardware device like Arduino Genuino. The general perception about this library is that it is easy to use and intuitive.
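The stream-and-filter architecture described — input channels sampled at a fixed rate with chainable filter objects — can be sketched in a few lines. The class names and API below are hypothetical, meant only to mirror the design idea (and written in Python rather than the library's Arduino C++):

```python
from collections import deque

class MovingAverageFilter:
    """Sliding-window low-pass filter of the kind such a library chains onto a stream."""
    def __init__(self, size):
        self.buf = deque(maxlen=size)

    def step(self, sample):
        self.buf.append(sample)
        return sum(self.buf) / len(self.buf)

class Stream:
    """An input channel sampled at a fixed rate with a chain of filters (hypothetical API)."""
    def __init__(self, rate_hz):
        self.dt = 1.0 / rate_hz   # sample period implied by the preset rate
        self.filters = []

    def attach(self, f):
        self.filters.append(f)

    def process(self, samples):
        out = []
        for s in samples:
            for f in self.filters:   # run each sample through the filter chain
                s = f.step(s)
            out.append(s)
        return out

# A 250 Hz stream (a typical biomedical rate) with a 5-sample moving average
stream = Stream(rate_hz=250)
stream.attach(MovingAverageFilter(5))
smoothed = stream.process([0, 0, 10, 10, 10, 10, 10])
```

The moving average smooths the step edge over the window length; on a microcontroller the same `step` call would run inside a timer interrupt at the preset rate.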
NASA Astrophysics Data System (ADS)
Koch, Wolfgang
1996-05-01
Sensor data processing in a dense target/dense clutter environment is inevitably confronted with data association conflicts which correspond with the multiple hypothesis character of many modern approaches (MHT: multiple hypothesis tracking). In this paper we analyze the efficiency of retrodictive techniques that generalize standard fixed interval smoothing to MHT applications. 'Delayed estimation' based on retrodiction provides uniquely interpretable and accurate trajectories from ambiguous MHT output if a certain time delay is tolerated. In a Bayesian framework the theoretical background of retrodiction and its intimate relation to Bayesian MHT is sketched. By a simulated example with two closely-spaced targets, relatively low detection probabilities, and rather high false return densities, we demonstrate the benefits of retrodiction and quantitatively discuss the achievable track accuracies and the time delays involved for typical radar parameters.
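Retrodiction generalizes fixed-interval smoothing to the multiple-hypothesis setting; the single-hypothesis building block is the Rauch-Tung-Striebel smoother, in which a backward pass revises past state estimates once later measurements are available (the "delayed estimation" above). A minimal scalar sketch, with an assumed random-walk target and assumed noise levels:

```python
import numpy as np

rng = np.random.default_rng(3)

# Scalar random-walk target: x_k = x_{k-1} + w,  z_k = x_k + v  (assumed model)
Q, R, N = 0.1, 1.0, 50
x_true = np.cumsum(rng.normal(0, np.sqrt(Q), N))
z = x_true + rng.normal(0, np.sqrt(R), N)

# Forward Kalman filter
xf, Pf = np.zeros(N), np.zeros(N)   # filtered estimates and variances
xp, Pp = np.zeros(N), np.zeros(N)   # one-step predictions
x, P = 0.0, 10.0
for k in range(N):
    xp[k], Pp[k] = x, P + Q                 # predict
    K = Pp[k] / (Pp[k] + R)                 # Kalman gain
    x = xp[k] + K * (z[k] - xp[k])          # update
    P = (1 - K) * Pp[k]
    xf[k], Pf[k] = x, P

# Backward Rauch-Tung-Striebel pass: smooth past states with later data
xs, Ps = xf.copy(), Pf.copy()
for k in range(N - 2, -1, -1):
    C = Pf[k] / Pp[k + 1]                   # smoother gain
    xs[k] = xf[k] + C * (xs[k + 1] - xp[k + 1])
    Ps[k] = Pf[k] + C**2 * (Ps[k + 1] - Pp[k + 1])
```

The smoothed variances `Ps` never exceed the filtered variances `Pf`, which is the formal sense in which tolerating a time delay buys accuracy; MHT retrodiction applies the same idea across association hypotheses rather than a single track.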
Ozaki, Vitor A.; Ghosh, Sujit K.; Goodwin, Barry K.; Shirota, Ricardo
2009-01-01
This article presents a statistical model of agricultural yield data based on a set of hierarchical Bayesian models that allows joint modeling of temporal and spatial autocorrelation. This method captures a comprehensive range of the various uncertainties involved in predicting crop insurance premium rates as opposed to the more traditional ad hoc, two-stage methods that are typically based on independent estimation and prediction. A panel data set of county-average yield data was analyzed for 290 counties in the State of Paraná (Brazil) for the period of 1990 through 2002. Posterior predictive criteria are used to evaluate different model specifications. This article provides substantial improvements in the statistical and actuarial methods often applied to the calculation of insurance premium rates. These improvements are especially relevant to situations where data are limited. PMID:19890450
Oleson, Jacob J; Cavanaugh, Joseph E; McMurray, Bob; Brown, Grant
2015-01-01
In multiple fields of study, time series measured at high frequencies are used to estimate population curves that describe the temporal evolution of some characteristic of interest. These curves are typically nonlinear, and the deviations of each series from the corresponding curve are highly autocorrelated. In this scenario, we propose a procedure to compare the response curves for different groups at specific points in time. The method involves fitting the curves, performing potentially hundreds of serially correlated tests, and appropriately adjusting the overall alpha level of the tests. Our motivating application comes from psycholinguistics and the visual world paradigm. We describe how the proposed technique can be adapted to compare fixation curves within subjects as well as between groups. Our results lead to conclusions beyond the scope of previous analyses. PMID:26400088
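As a toy version of the "hundreds of pointwise tests with an adjusted overall alpha" idea: flag the timepoints whose test statistic survives splitting the family-wise alpha across all tests. The paper's actual adjustment accounts for the serial correlation of the tests; the plain Bonferroni split shown here is a deliberately conservative stand-in, and the z-statistics are hypothetical.

```python
from statistics import NormalDist

def significant_timepoints(z_stats, alpha=0.05):
    """Flag timepoints whose pointwise z-statistic survives a Bonferroni
    split of the overall alpha across all T (serially correlated) tests."""
    T = len(z_stats)
    # two-sided critical value at the per-test level alpha / T
    crit = NormalDist().inv_cdf(1 - alpha / (2 * T))
    return [t for t, z in enumerate(z_stats) if abs(z) > crit]

# Hypothetical z-statistics comparing two fixation curves at 5 timepoints
flagged = significant_timepoints([0.5, 1.2, 4.8, 5.1, 2.0])   # -> [2, 3]
```

With five tests the per-test level is 0.01 (critical z about 2.58), so only the two large statistics survive; a correlation-aware adjustment would typically use a less severe critical value.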
Hyperfine excitation of CH in collisions with atomic and molecular hydrogen
NASA Astrophysics Data System (ADS)
Dagdigian, Paul J.
2018-04-01
We investigate here the excitation of methylidene (CH) induced by collisions with atomic and molecular hydrogen (H and H2). The hyperfine-resolved rate coefficients were obtained from close coupling nuclear-spin-free scattering calculations. The calculations are based upon recent, high-accuracy calculations of the CH(X2Π)-H(2S) and CH(X2Π)-H2 potential energy surfaces. Cross-sections and rate coefficients for collisions with atomic H, para-H2, and ortho-H2 were computed for all transitions between the 32 hyperfine levels for CH(X2Π) involving the n ≤ 4 rotational levels for temperatures between 10 and 300 K. These rate coefficients should significantly aid in the interpretation of astronomical observations of CH spectra. As a first application, the excitation of CH is simulated for conditions in typical molecular clouds.
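Turning cross-sections sigma(E) into the thermal rate coefficients used in such astrophysical modeling amounts to a Maxwell-Boltzmann average, k(T) = (8/(pi*mu))^(1/2) (k_B T)^(-3/2) * integral of sigma(E) E exp(-E/k_B T) dE. A numerical sketch with a toy constant cross-section (an illustrative placeholder, not the actual CH-H/H2 data):

```python
import math

KB = 1.380649e-23     # Boltzmann constant, J/K
AMU = 1.66053907e-27  # atomic mass unit, kg

def rate_coefficient(sigma, mu, T, emax_kt=30.0, n=2000):
    """Thermal rate coefficient k(T) = <sigma * v>: average the cross section
    over a Maxwell-Boltzmann distribution of collision energies."""
    kt = KB * T
    pref = math.sqrt(8.0 / (math.pi * mu)) * kt ** -1.5
    de = emax_kt * kt / n
    s = 0.0
    for i in range(1, n + 1):   # simple Riemann sum, truncated at emax_kt * kT
        e = i * de
        s += sigma(e) * e * math.exp(-e / kt) * de
    return pref * s             # m^3 / s

# Toy constant cross section of 10 A^2 and an approximate CH + H2 reduced mass
sigma0 = 10e-20                       # m^2
mu = (13.0 * 2.0 / 15.0) * AMU        # kg
k300 = rate_coefficient(lambda e: sigma0, mu, 300.0)
```

For a constant cross-section the integral collapses to sigma times the mean relative speed, sqrt(8 k_B T / (pi * mu)), which gives a handy closed-form check on the quadrature.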
Basics of Compounding: Clinical Pharmaceutics, Part 2.
Allen, Loyd V
2016-01-01
This article represents part 2 of a 2-part article on the topic of clinical pharmaceutics. Pharmaceutics is relevant far beyond the pharmaceutical industry, compounding, and the laboratory. Pharmaceutics can be used to solve many clinical problems in medication therapy. A pharmacist's knowledge of the physicochemical aspects of drugs and drug products should help the patient, physician, and healthcare professionals resolve issues in the increasingly complex world of modern medicine. Part 1 of this series of articles discussed incompatibilities that can directly affect a clinical outcome and utilized pharmaceutics case examples of the application and importance of clinical pharmaceutics covering different characteristics. Part 2 continues to illustrate the scientific principles and clinical effects involved in clinical pharmaceutics. Also covered in this article are many of the scientific principles typical to patient care. Copyright© by International Journal of Pharmaceutical Compounding, Inc.
Algorithm for covert convoy of a moving target using a group of autonomous robots
NASA Astrophysics Data System (ADS)
Polyakov, Igor; Shvets, Evgeny
2018-04-01
An important application of autonomous robot systems is to substitute for human personnel in dangerous environments, reducing the risk to human lives. In this paper we solve the problem of covertly convoying a civilian through a dangerous area with a group of unmanned ground vehicles (UGVs) using social potential fields. The novelty of our work lies in the use of UGVs, as compared to the unmanned aerial vehicles typically employed for this task in the approaches described in the literature. Additionally, we assume that the group of UGVs must simultaneously solve the problem of patrolling the area to detect intruders. We develop a simulation system to test our algorithms, provide numerical results, and give recommendations on how to tune the potentials governing the robots' behaviour to prioritize between the patrolling and convoying tasks.
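A minimal sketch of the social-potential-field idea the abstract relies on: each UGV feels an attractive pull toward the convoyed target and short-range repulsion from its teammates. The function below is illustrative only; the gains, potential shapes, and names (`potential_force`, `safe_dist`) are assumptions, not the paper's formulation.

```python
import math

def potential_force(pos, target, others, k_att=1.0, k_rep=0.5, safe_dist=2.0):
    """Net 2-D force on a UGV: attraction toward the convoy target plus
    short-range repulsion from other robots (a generic social-potential
    sketch, not the paper's exact potentials)."""
    fx = k_att * (target[0] - pos[0])
    fy = k_att * (target[1] - pos[1])
    for ox, oy in others:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < safe_dist:
            # repulsion grows as a teammate gets closer than safe_dist
            mag = k_rep * (1.0 / d - 1.0 / safe_dist) / d**2
            fx += mag * dx
            fy += mag * dy
    return fx, fy
```

Tuning `k_att` against `k_rep` is exactly the kind of prioritization between convoying and spreading out to patrol that the paper's recommendations address.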
NASA Technical Reports Server (NTRS)
Daigle, Matthew J.; Sankararaman, Shankar
2013-01-01
Prognostics is centered on predicting the time of and time until adverse events in components, subsystems, and systems. It typically involves both a state estimation phase, in which the current health state of a system is identified, and a prediction phase, in which the state is projected forward in time. Since prognostics is mainly a prediction problem, prognostic approaches cannot avoid uncertainty, which arises due to several sources. Prognostics algorithms must both characterize this uncertainty and incorporate it into the predictions so that informed decisions can be made about the system. In this paper, we describe three methods to solve these problems, including Monte Carlo-, unscented transform-, and first-order reliability-based methods. Using a planetary rover as a case study, we demonstrate and compare the different methods in simulation for battery end-of-discharge prediction.
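A toy illustration of the Monte Carlo method the paper compares: sample the uncertain current state and parameters, propagate each sample forward until the adverse-event threshold, and summarize the distribution of predicted event times. The linear discharge model and all numbers below are invented for illustration; the paper's rover battery model is far more detailed.

```python
import random
import statistics

def predict_eod(v0_mean=4.1, v0_sd=0.05, drain=0.01, drain_sd=0.002,
                v_cutoff=3.0, n_samples=2000, seed=0):
    """Monte Carlo end-of-discharge prediction sketch: sample the uncertain
    initial voltage and drain rate, propagate a toy linear discharge model,
    and summarize the resulting EOD-time distribution."""
    rng = random.Random(seed)
    eod_times = []
    for _ in range(n_samples):
        v = rng.gauss(v0_mean, v0_sd)               # uncertain current state
        d = max(1e-4, rng.gauss(drain, drain_sd))   # uncertain drain rate
        t = 0
        while v > v_cutoff:                         # project forward in time
            v -= d
            t += 1
        eod_times.append(t)
    return statistics.mean(eod_times), statistics.stdev(eod_times)

mean_eod, sd_eod = predict_eod()
```

The unscented-transform and first-order reliability methods in the paper replace this brute-force sampling with deterministic sigma points or an analytic approximation of the same distribution.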
Hybrid membrane contactor system for creating semi-breathing air
NASA Astrophysics Data System (ADS)
Timofeev, D. V.
2012-02-01
Typically, equipment for creating an artificial climate does not alter the composition of the respiratory air, even though medical institutions are assumed to have artificial-climate and disinfection plants in operating rooms and intensive care wards. With a hybrid membrane-absorption system for generating artificial atmospheres, respiratory function can be improved: the blood is enriched in or depleted of various gases, resulting in increased stamina, a faster or slower metabolism as required, and improved concentration and memory. Application of the system contributes to easy and rapid recovery after an operation. By adding special components with drug activity, ionizing the air, and adjusting its composition, a special atmosphere more favourable for patients can be created. These factors allow for the treatment and rehabilitation of patients and reduce the mortality of critically ill patients.
How to Appropriately Extrapolate Costs and Utilities in Cost-Effectiveness Analysis.
Bojke, Laura; Manca, Andrea; Asaria, Miqdad; Mahon, Ronan; Ren, Shijie; Palmer, Stephen
2017-08-01
Costs and utilities are key inputs into any cost-effectiveness analysis. Their estimates are typically derived from individual patient-level data collected as part of clinical studies whose follow-up duration is often too short to allow a robust quantification of the likely costs and benefits a technology will yield over the patient's entire lifetime. In the absence of long-term data, some form of temporal extrapolation (projecting short-term evidence over a longer time horizon) is required. Temporal extrapolation inevitably involves assumptions regarding the behaviour of the quantities of interest beyond the time horizon supported by the clinical evidence. Unfortunately, the implications for decisions made on the basis of evidence derived following this practice, and the degree of uncertainty surrounding the validity of any assumptions made, are often not fully appreciated. The issue is compounded by the absence of methodological guidance concerning the extrapolation of non-time-to-event outcomes such as costs and utilities. This paper considers current approaches to predicting long-term costs and utilities, highlights some of the challenges with the existing methods, and provides recommendations for future applications. It finds that, typically, economic evaluation models employ a simplistic approach to temporal extrapolation of costs and utilities. For instance, their parameters (e.g. mean) are typically assumed to be homogeneous with respect to both time and patients' characteristics. Furthermore, costs and utilities have often been modelled to follow the dynamics of the associated time-to-event outcomes. However, cost and utility estimates may be more nuanced, and it is important to ensure extrapolation is carried out appropriately for these parameters.
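The "simplistic approach" the paper critiques can be sketched in a few lines: assume the mean annual cost observed during trial follow-up is homogeneous over time, and apply it, discounted, to every extrapolated year. The function and parameter names are hypothetical.

```python
def extrapolate_costs(observed_annual_costs, horizon_years, discount_rate=0.035):
    """Naive temporal extrapolation of costs: within-trial years use observed
    values; beyond follow-up, the trial mean is assumed to hold (the
    homogeneity assumption the paper questions). Future years are discounted."""
    mean_cost = sum(observed_annual_costs) / len(observed_annual_costs)
    total = 0.0
    for year in range(horizon_years):
        if year < len(observed_annual_costs):
            cost = observed_annual_costs[year]   # within-trial evidence
        else:
            cost = mean_cost                     # extrapolated assumption
        total += cost / (1 + discount_rate) ** year
    return total
```

The paper's point is that replacing the flat `mean_cost` with a model that varies with time and patient characteristics can change the lifetime total substantially.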
Fatal crash involvement and laws against alcohol-impaired driving.
Zador, P L; Lund, A K; Fields, M; Weinberg, K
1989-01-01
It is estimated that in 1985 about 1,560 fewer drivers were involved in fatal crashes because of three types of drinking-driving laws. The laws studied were per se laws that define driving under the influence using blood alcohol concentration (BAC) thresholds; laws that provide for administrative license suspension or revocation prior to conviction for driving under the influence (often referred to as "administrative per se" laws); and laws that mandate jail or community service for first convictions of driving under the influence. It is estimated that if all 48 of the contiguous states adopted laws similar to those studied here, and if these new laws had effects comparable to those reported here, another 2,600 fatal driver involvements could be prevented each year. During hours when typically at least half of all fatally injured drivers have a BAC over 0.10 percent, administrative suspension/revocation is estimated to reduce the involvement of drivers in fatal crashes by about 9 percent; during the same hours, first offense mandatory jail/community service laws are estimated to have reduced driver involvement by about 6 percent. The effect of per se laws was estimated to be a 6 percent reduction during hours when fatal crashes typically are less likely to involve alcohol. These results are based on analyses of drivers involved in fatal crashes in the 48 contiguous states of the United States during the years 1978 to 1985.
NASA Astrophysics Data System (ADS)
Jacques, Diederik
2017-04-01
As soil functions are governed by a multitude of interacting hydrological, geochemical and biological processes, simulation tools coupling mathematical models of these interacting processes are needed. Coupled reactive transport models are a typical example of such tools, mainly focusing on hydrological and geochemical coupling (see e.g. Steefel et al., 2015). The mathematical and numerical complexity of both the tool itself and a specific conceptual model can increase rapidly. Numerical verification of such models is therefore a prerequisite for guaranteeing reliability and confidence and for qualifying simulation tools and approaches for any further model application. In 2011, a first SeSBench (Subsurface Environmental Simulation Benchmarking) workshop was held in Berkeley (USA), followed by four others. The objective is to benchmark subsurface environmental simulation models and methods, with a current focus on reactive transport processes. The final outcome was a special issue in Computational Geosciences (2015, issue 3, Reactive transport benchmarks for subsurface environmental simulation) with a collection of 11 benchmarks. Benchmarks proposed by the workshop participants should be relevant for environmental or geo-engineering applications (the latter mostly related to radioactive waste disposal issues); benchmarks defined for purely mathematical reasons were excluded. Another important feature is the tiered approach within a benchmark: a single principal problem is defined together with different subproblems, which typically benchmark individual or simplified processes (e.g. inert solute transport, a simplified geochemical conceptual model) or geometries (e.g. batch or one-dimensional, homogeneous). Finally, three codes should be involved in each benchmark. The SeSBench initiative contributes to confidence building for applying reactive transport codes. Furthermore, it illustrates the use of these types of models for different environmental and geo-engineering applications. SeSBench will organize new workshops to add new benchmarks in a new special issue. Steefel, C. I., et al. (2015). "Reactive transport codes for subsurface environmental simulation." Computational Geosciences 19: 445-478.
Simulating variable source problems via post processing of individual particle tallies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bleuel, D.L.; Donahue, R.J.; Ludewigt, B.A.
2000-10-20
Monte Carlo is an extremely powerful method of simulating complex, three dimensional environments without excessive problem simplification. However, it is often time consuming to simulate models in which the source can be highly varied. Similarly difficult are optimization studies involving sources in which many input parameters are variable, such as particle energy, angle, and spatial distribution. Such studies are often approached using brute force methods or intelligent guesswork. One field in which these problems are often encountered is accelerator-driven Boron Neutron Capture Therapy (BNCT) for the treatment of cancers. Solving the reverse problem of determining the best neutron source for optimal BNCT treatment can be accomplished by separating the time-consuming particle-tracking process of a full Monte Carlo simulation from the calculation of the source weighting factors, which is typically performed at the beginning of a Monte Carlo simulation. By post-processing these weighting factors on a recorded file of individual particle tally information, the effect of changing source variables can be realized in a matter of seconds, instead of requiring hours or days for additional complete simulations. By intelligent source biasing, any number of different source distributions can be calculated quickly from a single Monte Carlo simulation. The source description can be treated as variable and the effect of changing multiple interdependent source variables on the problem's solution can be determined. Though the focus of this study is on BNCT applications, this procedure may be applicable to any problem that involves a variable source.
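The post-processing idea can be sketched as follows: if each recorded tally contribution is tagged with the source bin its particle was born in, evaluating a new source distribution only requires reweighting each contribution by the ratio of new to old source probabilities. The record format below is hypothetical, and a real implementation would also propagate statistical uncertainties.

```python
def reweight_tallies(tallies, old_pdf, new_pdf):
    """Post-process recorded per-particle tallies for a new source
    distribution: a particle born in source bin b is reweighted by
    new_pdf[b] / old_pdf[b], so no new transport run is needed.

    tallies: list of (source_bin, tally_contribution) records."""
    total = 0.0
    for source_bin, contribution in tallies:
        if old_pdf[source_bin] > 0:   # bins never sampled contribute nothing
            total += contribution * new_pdf[source_bin] / old_pdf[source_bin]
    return total
```

With the transport history recorded once, any number of candidate source spectra can be scored this way in seconds, which is the speed-up the abstract describes.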
Schultz, David; Jones, Shelby S; Pinder, Wendy M; Wiprovnick, Alicia E; Groth, Elisabeth C; Shanty, Lisa M; Duggan, Anne
2018-06-23
Purpose: Home visiting programs have produced inconsistent outcomes. One challenge for the field is the design and implementation of effective training to support home visiting staff. In part due to a lack of formal training, most home visitors need to develop the majority of their skills on the job. Home visitors typically receive training in their agency's specific model (e.g., HFA, NFP) and, if applicable, curriculum. Increasingly, states and other home visiting systems are developing and/or coordinating more extensive training and support systems beyond model-specific and curricula trainings. To help guide these training efforts and future evaluations of them, this paper reviews research on effective training, particularly principles of training transfer and adult learning. Description: Our review summarizes several meta-analyses, reviews, and more recent publications on training transfer and adult learning principles. Assessment: Effective training involves not only the introduction and modeling of concepts and skills but also the practice of, evaluation of, and reflection upon these skills. Further, ongoing encouragement of, reward for, and reflection upon use of these skills, particularly by a home visitor's supervisor, are critical for the home visitor's continued use of these skills with families. Conclusion: Application of principles of adult learning and training transfer to home visiting training will likely lead to greater transfer of skills from the training environment to work with families. The involvement of both home visitors and their supervisors in training is likely important for this transfer to occur.
The Dynamics of "Market-Making" in Higher Education
ERIC Educational Resources Information Center
Komljenovic, Janja; Robertson, Susan L.
2016-01-01
This paper examines what to some is a well-worked furrow: the processes and outcomes involved in what is typically referred to as "marketization" in the higher education sector. We do this through a case study of Newton University, where we reveal a rapid proliferation of market exchanges involving the administrative division of the…
Impact of parental weight status on weight loss efforts in Hispanic children
USDA-ARS?s Scientific Manuscript database
Parents have been shown to play an important role in weight loss for children. Parents are typically involved either as models for change or as supporters of children's weight loss efforts. It is likely that overweight/obese parents will need to be involved in changing the environment for themselv...
ERIC Educational Resources Information Center
Bugeja, Clare
2009-01-01
This article investigates parental involvement in the musical education of violin students and the changing role of the parents' across the learning process. Two contexts were compared, one emphasising the Suzuki methodology and the other a "traditional" approach. Students learning "traditionally" are typically taught note reading from the…
ERIC Educational Resources Information Center
Hall, Natalie; Durand, Marie-Anne; Mengoni, Silvana E.
2017-01-01
Background: Despite experiencing health inequalities, people with intellectual disabilities are under-represented in health research. Previous research has identified barriers but has typically focused on under-recruitment to specific studies. This study aimed to explore care staff's attitudes to health research involving people with intellectual…
Natale, Alessandra; Boeckmans, Joost; Desmae, Terry; De Boe, Veerle; De Kock, Joery; Vanhaecke, Tamara; Rogiers, Vera; Rodrigues, Robim M
2018-03-01
Phospholipidosis is a metabolic disorder characterized by the intracellular accumulation of phospholipids. It can be caused by short-term or chronic exposure to cationic amphiphilic drugs (CADs). These compounds bind to phospholipids and inhibit their degradation, leading to their accumulation in lysosomes. Drug-induced phospholipidosis (DIPL) frequently underlies the discontinuation of drug development and post-market drug withdrawal. Therefore, reliable human-relevant in vitro models must be developed to speed up the identification of compounds that are potential inducers of phospholipidosis. Here, hepatic cells derived from human skin (hSKP-HPC) were evaluated as an in vitro model for DIPL. These cells were exposed over time to amiodarone, a CAD known to induce phospholipidosis in humans. Transmission electron microscopy revealed the formation of the typical lamellar inclusions in the cell cytoplasm. An increase in phospholipids was detected after 24 h of exposure to amiodarone, whereas a significant increase in neutral lipid vesicles could be observed after 72 h. At the transcriptional level, modulation of genes involved in DIPL was detected. These results provide a valuable indication of the applicability of hSKP-HPC for the quick assessment of drug-induced phospholipidosis in vitro, early in the drug development process. Copyright © 2017 Elsevier B.V. All rights reserved.
Yan, Bao; Liu, Rongjia; Li, Yibo; Wang, Yan; Gao, Guanjun; Zhang, Qinglu; Liu, Xing; Jiang, Gonghao; He, Yuqing
2014-01-01
Rice grain shape and yield are usually controlled by multiple quantitative trait loci (QTL). This study used a set of F9–10 recombinant inbred lines (RILs) derived from a cross of Huahui 3 (Bt/Xa21) and Zhongguoxiangdao, and detected 27 QTLs on ten rice chromosomes. Among them, twelve QTLs responsible for grain shape or yield were mostly reproducibly detected and had not been reported before. Interestingly, the two known genes involved in the materials, the insect-resistance Bt gene and the disease-resistance Xa21 gene, were found to be closely linked to QTLs for grain shape and weight. The Bt fragment insertion was mapped for the first time to chromosome 10 in Huahui 3 and may disrupt grain-related QTLs, resulting in weaker yield performance in transgenic plants. The introgression of the Xa21 gene by backcrossing from the donor material into the receptor Minghui 63 may also carry a donor linkage drag that includes minor-effect QTL alleles positively affecting grain shape and yield. The QTL analysis of rice grain appearance quality exemplifies typical events in transgenic and backcross breeding. The QTL findings in this study will facilitate future gene isolation and breeding applications for the improvement of rice grain shape and yield. PMID:25320558
Improving solubility and refolding efficiency of human V(H)s by a novel mutational approach.
Tanha, Jamshid; Nguyen, Thanh-Dung; Ng, Andy; Ryan, Shannon; Ni, Feng; Mackenzie, Roger
2006-11-01
The antibody V(H) domains of camelids tend to be soluble and to resist aggregation, in contrast to human V(H) domains. For immunotherapy, attempts have therefore been made to improve the properties of human V(H)s by camelization of a small set of framework residues. Here, through sequence comparison of well-folded llama V(H) domains, we have identified an alternative set of residues (not typically camelid) for mutation. Thus, the solubility and thermal refolding efficiency of a typical human V(H), derived from the human antibody BT32/A6, were improved by the introduction of two mutations in framework regions (FR) 1 and 4 to generate BT32/A6.L1. Three more mutations in FR3 of BT32/A6.L1 further improved the thermal refolding efficiency while retaining solubility and cooperative melting profiles. To demonstrate practical utility, BT32/A6.L1 was used to construct a phage display library from which human V(H)s with good antigen-binding activity and solubility were isolated. The engineered human V(H) domains described here may be useful for immunotherapy, due to their expected low immunogenicity, and in applications involving transient high temperatures, due to their efficient refolding after thermal denaturation.
Brackley, Victoria; Ball, Kevin; Tor, Elaine
2018-05-12
The effectiveness of the swimming turn is highly influential on overall performance in competitive swimming. The push-off, or wall contact, within the turn phase directly determines the speed at which the swimmer leaves the wall. It is therefore paramount to develop reliable methods of measuring wall-contact-time during the turn phase for training and research purposes. The aim of this study was to determine the concurrent validity and reliability of the Pool Pad App for measuring wall-contact-time during the freestyle and backstroke tumble turn. The wall-contact-times of nine elite and sub-elite participants were recorded during their regular training sessions. Concurrent validity statistics included the standardised typical error estimate, linear analysis, and effect sizes, while the intraclass correlation coefficient (ICC) was used for the reliability statistics. The standardised typical error estimate resulted in a moderate Cohen's d effect size with an R² value of 0.80, and the ICC between the Pool Pad and 2D video footage was 0.89. Despite these measurement differences, the results of these concurrent validity and reliability analyses demonstrate that the Pool Pad is suitable for measuring wall-contact-time during the freestyle and backstroke tumble turn within a training environment.
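The reliability statistic reported above can be computed from paired timings (e.g. app vs. 2D video) with the standard two-way random-effects, absolute-agreement ICC formula. A minimal sketch, using the textbook formula rather than code from the study:

```python
def icc_2_1(x, y):
    """ICC(2,1): two-way random-effects, absolute-agreement, single-measure
    intraclass correlation for two measurement methods of the same subjects."""
    n = len(x)
    pairs = list(zip(x, y))
    grand = sum(x + y) / (2 * n)
    row_means = [(a + b) / 2 for a, b in pairs]       # per-subject means
    col_means = [sum(x) / n, sum(y) / n]              # per-method means
    ssr = 2 * sum((m - grand) ** 2 for m in row_means)   # subjects (k = 2)
    ssc = n * sum((m - grand) ** 2 for m in col_means)   # methods
    sst = sum((v - grand) ** 2 for a, b in pairs for v in (a, b))
    sse = sst - ssr - ssc
    msr = ssr / (n - 1)
    msc = ssc / 1                  # k - 1 = 1 for two methods
    mse = sse / (n - 1)            # (n - 1)(k - 1) with k = 2
    return (msr - mse) / (msr + mse + 2 * (msc - mse) / n)
```

Perfect agreement between the two methods yields 1.0; values near the study's 0.89 indicate good but imperfect agreement.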
An incremental strategy for calculating consistent discrete CFD sensitivity derivatives
NASA Technical Reports Server (NTRS)
Korivi, Vamshi Mohan; Taylor, Arthur C., III; Newman, Perry A.; Hou, Gene W.; Jones, Henry E.
1992-01-01
In this preliminary study involving advanced computational fluid dynamics (CFD) codes, an incremental formulation, also known as the 'delta' or 'correction' form, is presented for solving the very large sparse systems of linear equations associated with aerodynamic sensitivity analysis. For typical problems in 2D, a direct solution method can be applied to these linear equations in either the standard or the incremental form, in which case the two are equivalent. Iterative methods appear to be needed for future 3D applications, however, because direct solver methods require much more computer memory than is currently available. Iterative methods for solving these equations in the standard form present certain difficulties, such as ill-conditioning of the coefficient matrix, which can be overcome when the equations are cast in the incremental form; these and other benefits are discussed. The methodology is successfully implemented and tested in 2D using an upwind, cell-centered, finite-volume formulation applied to the thin-layer Navier-Stokes equations. Results are presented for two laminar sample problems: (1) transonic flow through a double-throat nozzle; and (2) flow over an isolated airfoil.
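A minimal sketch of the incremental ('delta') form: instead of iterating on A x = b directly, repeatedly solve an approximate system M d = r for the correction d, where r = b - A x is the current residual, and increment x. Here M is simply the diagonal of A (a Jacobi-style stand-in; the paper's approximate operators are more sophisticated).

```python
def solve_incremental(A, b, x0=None, iters=200, omega=1.0):
    """Iterative solve of A x = b in incremental ('delta') form:
    each sweep solves M d = r with M = diag(A), then updates x <- x + d.
    A is a dense list-of-lists; convergence needs a well-behaved A
    (e.g. diagonally dominant for this simple choice of M)."""
    n = len(b)
    x = list(x0) if x0 else [0.0] * n
    for _ in range(iters):
        # residual r = b - A x (the 'correction' right-hand side)
        r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        # approximate solve M d = r with M = diag(A), then increment
        for i in range(n):
            x[i] += omega * r[i] / A[i][i]
    return x
```

The practical point of the delta form is that the right-hand side is always the true residual, so the converged answer is exact even when M is a rough approximation to A; only the convergence rate depends on M.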
Similarity Metrics for Closed Loop Dynamic Systems
NASA Technical Reports Server (NTRS)
Whorton, Mark S.; Yang, Lee C.; Bedrossian, Naz; Hall, Robert A.
2008-01-01
To what extent, and in what ways, can two closed-loop dynamic systems be said to be "similar"? This question arises in a wide range of dynamic systems modeling and control system design applications. For example, bounds on error models are fundamental to controller optimization with modern control design methods. Metrics such as the structured singular value are direct measures of the degree to which properties such as stability or performance are maintained in the presence of specified uncertainties or variations in the plant model. Similarly, controls-related areas such as system identification, model reduction, and experimental model validation employ measures of similarity between multiple realizations of a dynamic system. Each area has its tools and approaches, with each tool more or less suited to one application or another. Similarity in the context of closed-loop model validation via flight test is subtly different from error measures in the typical controls-oriented application. Whereas similarity in a robust control context relates to plant variation and its attendant effect on stability and performance, in this context similarity metrics are sought that assess the relevance of a dynamic system test for the purpose of validating the stability and performance of a "similar" dynamic system. Similarity in the context of system identification is much more relevant than robust control analogies, in that errors between one dynamic system (the test article) and another (the nominal "design" model) are sought for the purpose of bounding the validity of a model for control design and analysis. Yet system identification typically involves open-loop plant models which are independent of the control system (with the exception of limited developments in closed-loop system identification, which is nonetheless focused on obtaining open-loop plant models from closed-loop data).
Moreover, the objectives of system identification are not the same as those of a flight test, and hence system identification error metrics are not directly relevant. In applications such as launch vehicles, where the open-loop plant is unstable, it is the similarity of the closed-loop system dynamics of a flight test that is relevant.
Micromechanics of compression failures in open hole composite laminates
NASA Technical Reports Server (NTRS)
Guynn, E. Gail; Bradley, Walter L.
1987-01-01
The high strength-to-weight ratio of composite materials is ideally suited to aerospace applications, where they are already used in commercial and military aircraft secondary structures and will soon be used for heavily loaded primary structures. One area impeding the widespread application of composites is their inherent weakness in compressive strength compared to the tensile properties of the same material. Furthermore, these airframe designs typically contain many bolted or riveted joints, as well as electrical and hydraulic control lines. These features produce areas of stress concentration and thus further complicate the compression failure problem. Open hole compression failures, which represent a typical failure mode for composite materials, are addressed.
Scalable wide-field optical coherence tomography-based angiography for in vivo imaging applications
Xu, Jingjiang; Wei, Wei; Song, Shaozhen; Qi, Xiaoli; Wang, Ruikang K.
2016-01-01
Recent advances in optical coherence tomography (OCT)-based angiography have demonstrated a variety of biomedical applications in the diagnosis and therapeutic monitoring of diseases with vascular involvement. While promising, its imaging field of view (FOV) is still limited (typically less than 9 mm2), which has somewhat slowed its clinical acceptance. In this paper, we report a high-speed spectral-domain OCT system operating at 1310 nm that enables a wide FOV of up to 750 mm2. Using the optical microangiography (OMAG) algorithm, we are able to map vascular networks within living biological tissues. Thanks to a 2,048-pixel line-scan InGaAs camera operating at a 147 kHz scan rate, the system delivers a ranging depth of ~7.5 mm and provides wide-field OCT-based angiography in a single data acquisition. We implement two imaging modes (i.e., wide-field mode and high-resolution mode) in the OCT system, which gives a highly scalable FOV with flexible lateral resolution. We demonstrate scalable wide-field vascular imaging of multiple finger nail beds in humans and of the whole brain in mice with the skull left intact in a single 3D scan, promising new opportunities for wide-field OCT-based angiography in many clinical applications. PMID:27231630
Performance Analysis of ICA in Sensor Array
Cai, Xin; Wang, Xiang; Huang, Zhitao; Wang, Fenghua
2016-01-01
As the best-known scheme in the field of Blind Source Separation (BSS), Independent Component Analysis (ICA) has been used intensively in various domains, including biomedical and acoustics applications and cooperative or non-cooperative communication. While sensor arrays are involved in most of these applications, the influence of practical factors on the performance of ICA has not yet been sufficiently investigated. In this manuscript, the issue is researched by taking the typical antenna array as an illustrative example. Factors taken into consideration include the environmental noise level and the properties of the array and of the radiators. We analyze the analytic relationship between the noise variance, the source variance, the condition number of the mixing matrix, and the optimal signal to interference-plus-noise ratio, as well as the relationship between the singularity of the mixing matrix and the practical factors concerned. Special attention has been paid to situations where the mixing process becomes (nearly) singular, since such circumstances are critical in applications. The results and conclusions obtained should be instructive when applying ICA algorithms to mixtures from sensor arrays. Moreover, on the basis of this analysis, an effective countermeasure against singular mixtures is proposed. Experiments validating the theoretical conclusions as well as the effectiveness of the proposed scheme are included. PMID:27164100
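The singularity analysis centers on the condition number of the mixing matrix, the ratio of its largest to smallest singular value. For a 2x2 mixture it can be computed in closed form from the eigenvalues of AᵀA; the sketch below is generic linear algebra, not the manuscript's derivation.

```python
import math

def cond_2x2(A):
    """Condition number (ratio of singular values) of a 2x2 mixing matrix,
    computed from the eigenvalues of A^T A. A large value signals a nearly
    singular mixture, the critical case for ICA separation performance."""
    (a, b), (c, d) = A
    # entries of the symmetric matrix A^T A = [[p, q], [q, r]]
    p = a * a + c * c
    q = a * b + c * d
    r = b * b + d * d
    # eigenvalues via the 2x2 closed form: mean +/- discriminant
    mean = (p + r) / 2
    disc = math.sqrt(((p - r) / 2) ** 2 + q * q)
    s_max = math.sqrt(mean + disc)
    s_min = math.sqrt(max(mean - disc, 0.0))
    return float('inf') if s_min == 0 else s_max / s_min
```

Well-conditioned mixtures (values near 1) leave the sources cleanly separable; as the value grows, noise is amplified and the attainable signal to interference-plus-noise ratio degrades.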
The clinical applications of genome editing in HIV.
Wang, Cathy X; Cannon, Paula M
2016-05-26
HIV/AIDS has long been at the forefront of the development of gene- and cell-based therapies. Although conventional gene therapy approaches typically involve the addition of anti-HIV genes to cells using semirandomly integrating viral vectors, newer genome editing technologies based on engineered nucleases are now allowing more precise genetic manipulations. The possible outcomes of genome editing include gene disruption, which has been most notably applied to the CCR5 coreceptor gene, or the introduction of small mutations or larger whole gene cassette insertions at a targeted locus. Disruption of CCR5 using zinc finger nucleases was the first-in-human application of genome editing and remains the most clinically advanced platform, with 7 completed or ongoing clinical trials in T cells and hematopoietic stem/progenitor cells (HSPCs). Here we review the laboratory and clinical findings of CCR5 editing in T cells and HSPCs for HIV therapy and summarize other promising genome editing approaches for future clinical development. In particular, recent advances in the delivery of genome editing reagents and the demonstration of highly efficient homology-directed editing in both T cells and HSPCs are expected to spur the development of even more sophisticated applications of this technology for HIV therapy. © 2016 by The American Society of Hematology.
NASA Astrophysics Data System (ADS)
McMillan, Norman D.; Baker, M.; O'Neill, M.; Smith, Stuart; Augousti, Andreas T.; Mason, Julian; Ryan, Bernard; Ryan, R. A.
1999-01-01
The multianalyzer is a powerful amplitude-modulated fiber optic sensor that is perhaps typical of many sensor innovations in being a technology looking for an application. Consequently, a series of collaborations with the fruit juice, brewing, distilling, biotechnology, and polymer industries were undertaken with the objective of identifying potential applications of the multianalyzer. An assessment of these interactions is made for each of the industrial fields explored by giving, for each, just one positive result from the work. The results are then critically assessed. While these studies have illustrated the universal nature of the technology, in every case lessons of a general nature have been drawn. This experience in particular underlined the difficulty of gaining acceptance for a fiber-based technology in industrial process monitoring, against the backdrop of the conservative practice of industry with long-established instrumentation. The hard-won experience of this product development has shown the vital importance of technologists understanding the difference between the marketing concepts of features, benefits, and advantages. Three categories of conclusions are drawn: technical, commercial, and, finally, conclusions drawn from generalizations of the project by the Kingston partners based on their own independent experience in sensor development involving industrial and medical collaborations.
Cyclododecane exposure in the field of conservation and restoration of art objects.
Vernez, David; Wognin, Barthélémy; Tomicic, Catherine; Plateel, Gregory; Charrière, Nicole; Bruhin, Stefanie
2011-04-01
Recent work practices in conservation and restoration involve the use of cyclododecane (CDD, CAS 294-62-2) to protect fragile artifacts during handling or transportation. Little is known about its toxicity, and no previous exposure data have been reported. A short field investigation was conducted to characterize exposure conditions for both CDD vapors and aerosols. Measurements were conducted in the conservation and restoration laboratory of the archeological service in Bern (Switzerland). Three indoor and four outdoor typical work situations, during either brush or spray gun application, were investigated. Samples were collected on charcoal adsorbent tubes and analyzed by gas chromatography with flame ionization detection. Indoor exposures were 0.75-15.5 mg/m(3), while outdoor exposures were 19.5-53.9 mg/m(3). Exposures appear to be extremely localized owing to both the physicochemical properties and the application methods of CDD. Vapor exposure increases dramatically with the confinement of the workplace. Preventive measures should be undertaken to limit these exposures as much as possible. Field work in confined areas (ditches, underground) is of particular concern. CDD-coated artifacts or materials should be stored in ventilated areas to avoid delayed exposures.
McNerney, Monica P; Watstein, Daniel M; Styczynski, Mark P
2015-09-01
Metabolic engineering is generally focused on static optimization of cells to maximize production of a desired product, though recently dynamic metabolic engineering has explored how metabolic programs can be varied over time to improve titer. However, these are not the only types of applications where metabolic engineering could make a significant impact. Here, we discuss a new conceptual framework, termed "precision metabolic engineering," involving the design and engineering of systems that make different products in response to different signals. Rather than focusing on maximizing titer, these types of applications typically have three hallmarks: sensing signals that determine the desired metabolic target, completely directing metabolic flux in response to those signals, and producing sharp responses at specific signal thresholds. In this review, we will first discuss and provide examples of precision metabolic engineering. We will then discuss each of these hallmarks and identify which existing metabolic engineering methods can be applied to accomplish those tasks, as well as some of their shortcomings. Ultimately, precise control of metabolic systems has the potential to enable a host of new metabolic engineering and synthetic biology applications for any problem where flexibility of response to an external signal could be useful. Copyright © 2015 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
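The third hallmark above, a sharp response at a specific signal threshold, is commonly modeled with a Hill function. The following sketch is purely illustrative (the function, threshold and coefficient are assumptions, not from the review); it shows how a high Hill coefficient turns a graded input signal into a near switch-like allocation of metabolic flux:

```python
import math

def hill_response(signal, threshold, n):
    """Fraction of metabolic flux routed to the target product.

    A Hill function gives a sharp, switch-like response at a signal
    threshold; the larger the Hill coefficient n, the steeper the switch.
    """
    return signal**n / (threshold**n + signal**n)

# With n = 8, signals well below the threshold route almost no flux to the
# product, while signals well above it route almost all of it.
fractions = {s: hill_response(s, threshold=1.0, n=8)
             for s in (0.5, 0.9, 1.1, 2.0)}
```

A real design would couple such a response curve to a biosensor and flux-control actuator; here it only demonstrates the threshold behavior the framework calls for.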
Policies for implementing network firewalls
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, C.D.
1994-05-01
Corporate networks are frequently protected by "firewalls" or gateway systems that control access to and from other networks, e.g., the Internet, in order to reduce the network's vulnerability to hackers and other unauthorized access. Firewalls typically limit access to particular network nodes and application protocols, and they often perform special authentication and authorization functions. One of the difficult issues associated with network firewalls is determining which applications should be permitted through the firewall. For example, many networks permit the exchange of electronic mail with the outside but do not permit file access to be initiated by outside users, as this might allow outside users to access sensitive data or to surreptitiously modify data or programs (e.g., to install Trojan Horse software). However, if access through firewalls is severely restricted, legitimate network users may find it difficult or impossible to collaborate with outside users and to share data. Some of the most serious issues regarding firewalls involve setting policies with the goal of achieving an acceptable balance between the need for greater functionality and the associated risks. Two common firewall implementation techniques, screening routers and application gateways, are discussed below, followed by some common policies implemented by network firewalls.
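The mail-in/file-access-out policy trade-off described above can be sketched as a first-match rule table with a default-deny stance. The protocol names and the default-deny choice here are illustrative assumptions, not taken from the report:

```python
# Minimal sketch of a firewall policy table: permit electronic mail in both
# directions, explicitly deny inbound file access, and refuse anything not
# covered by a rule (default deny).
RULES = [
    {"direction": "inbound",  "protocol": "smtp", "action": "allow"},
    {"direction": "outbound", "protocol": "smtp", "action": "allow"},
    {"direction": "inbound",  "protocol": "nfs",  "action": "deny"},
    {"direction": "outbound", "protocol": "http", "action": "allow"},
]

def evaluate(direction, protocol):
    """Return the action of the first matching rule; default-deny otherwise."""
    for rule in RULES:
        if rule["direction"] == direction and rule["protocol"] == protocol:
            return rule["action"]
    return "deny"  # anything not explicitly permitted is refused
```

First-match semantics and a default-deny final rule are the usual way screening-router policies are expressed; a permissive default would invert the balance between functionality and risk discussed above.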
Distilled Water Distribution Systems. Laboratory Design Notes.
ERIC Educational Resources Information Center
Sell, J.C.
Factors concerning water distribution systems, including an evaluation of materials and a recommendation of materials best suited for service in typical facilities are discussed. Several installations are discussed in an effort to bring out typical features in selected applications. The following system types are included--(1) industrial…
29 CFR 784.149 - Typical operations that may qualify for exemption.
Code of Federal Regulations, 2010 CFR
2010-07-01
... THE FAIR LABOR STANDARDS ACT APPLICABLE TO FISHING AND OPERATIONS ON AQUATIC PRODUCTS Exemptions Provisions Relating to Fishing and Aquatic Products Processing, Freezing, and Curing § 784.149 Typical operations that may qualify for exemption. Such operations as transporting the specified aquatic products to...
29 CFR 784.149 - Typical operations that may qualify for exemption.
Code of Federal Regulations, 2011 CFR
2011-07-01
... THE FAIR LABOR STANDARDS ACT APPLICABLE TO FISHING AND OPERATIONS ON AQUATIC PRODUCTS Exemptions Provisions Relating to Fishing and Aquatic Products Processing, Freezing, and Curing § 784.149 Typical operations that may qualify for exemption. Such operations as transporting the specified aquatic products to...
ERIC Educational Resources Information Center
Jelinek, Mariann
Until recently, research on management and careers typically examined white, middle-class male subjects. Patterns, norms, and career problems brought to light by this research are not necessarily applicable to wider populations. When studies on women did appear, at first they were typically more polemical than scientific; they sought to prove…
Lindsay, Sally; McDougall, Carolyn; Sanford, Robyn; Menna-Dack, Dolly; Kingsnorth, Shauna; Adams, Tracey
2015-01-01
To assess performance differences in a mock job interview and a workplace role-play exercise for youth with disabilities compared to their typically developing peers, we evaluated a purposive sample of 31 youth (15 with a physical disability and 16 typically developing) on their performance (content and delivery) in employment readiness role-play exercises. Our findings show significant differences between youth with disabilities and typically developing peers in several areas of the mock interview content (i.e. responses to the questions "tell me about yourself" and "how would you provide feedback to someone not doing their share", and a problem-solving scenario question) and delivery (i.e. voice clarity and mean latency). We found no significant differences in the workplace role-play performances of youth with and without disabilities. Youth with physical disabilities performed more poorly in some areas of a job interview than their typically developing peers and could benefit from further targeted employment readiness training. Clinicians should: coach youth with physical disability on how to "sell" their abilities to potential employers and encourage youth to get involved in volunteer activities and employment readiness training programs; consider using mock job interviews and other employment role-play exercises as assessment and training tools for youth with physical disabilities; and involve speech pathologists in the development of employment readiness programs that address voice clarity as a potential delivery issue.
ERIC Educational Resources Information Center
Burack, Jacob A.; Russo, Natalie; Kovshoff, Hannah; Palma Fernandes, Tania; Ringo, Jason; Landry, Oriane; Iarocci, Grace
2016-01-01
Evidence from the study of attention among persons with autism spectrum disorder (ASD) and typically developing (TD) children suggests a rethinking of the notion that performance inherently reflects disability, ability, or capacity in favor of a more nuanced story that involves an emphasis on styles and biases that reflect real-world attending. We…
Programmable architecture for pixel level processing tasks in lightweight strapdown IR seekers
NASA Astrophysics Data System (ADS)
Coates, James L.
1993-06-01
Typical processing tasks associated with missile IR seeker applications are described, and a straw man suite of algorithms is presented. A fully programmable multiprocessor architecture is realized on a multimedia video processor (MVP) developed by Texas Instruments. The MVP combines the elements of RISC, floating point, advanced DSPs, graphics processors, display and acquisition control, RAM, and external memory. Front end pixel level tasks typical of missile interceptor applications, operating on 256 x 256 sensor imagery, can be processed at frame rates exceeding 100 Hz in a single MVP chip.
NASA Technical Reports Server (NTRS)
1985-01-01
Typical R&D limited partnership arrangements, advantages and disadvantages of R&D limited partnership (RDLPs) and antitrust and tax implications are described. A number of typical forms of RDLPs are described that may be applicable for use in stimulating R&D and experimental programs using the advanced communications technology satellite. The ultimate goal is to increase the rate of market penetration of goods and/or services based upon advanced satellite communications technology. The conditions necessary for these RDLP forms to be advantageous are outlined.
Nabavizadeh, Seyed Ali; Mamourian, Alexander; Schmitt, James E; Cloran, Francis; Vossough, Arastoo; Pukenas, Bryan; Loevner, Laurie A
2016-01-01
Objective: While haemangiomas are common benign vascular lesions involving the spine, some behave in an aggressive fashion. We investigated the utility of fat-suppressed sequences to differentiate between benign and aggressive vertebral haemangiomas. Methods: Patients with the diagnosis of aggressive vertebral haemangioma and available short tau inversion-recovery or T2 fat saturation sequence were included in the study. 11 patients with typical asymptomatic vertebral body haemangiomas were selected as the control group. Region of interest signal intensity (SI) analysis of the entire haemangioma as well as the portion of each haemangioma with highest signal on fat-saturation sequences was performed and normalized to a reference normal vertebral body. Results: A total of 8 patients with aggressive vertebral haemangioma and 11 patients with asymptomatic typical vertebral haemangioma were included. There was a significant difference between total normalized mean SI ratio (3.14 vs 1.48, p = 0.0002), total normalized maximum SI ratio (5.72 vs 2.55, p = 0.0003), brightest normalized mean SI ratio (4.28 vs 1.72, p < 0.0001) and brightest normalized maximum SI ratio (5.25 vs 2.45, p = 0.0003). Multiple measures were able to discriminate between groups with high sensitivity (>88%) and specificity (>82%). Conclusion: In addition to the conventional imaging features such as vertebral expansion and presence of extravertebral component, quantitative evaluation of fat-suppression sequences is also another imaging feature that can differentiate aggressive haemangioma and typical asymptomatic haemangioma. Advances in knowledge: The use of quantitative fat-suppressed MRI in vertebral haemangiomas is demonstrated. Quantitative fat-suppressed MRI can have a role in confirming the diagnosis of aggressive haemangiomas. In addition, this application can be further investigated in future studies to predict aggressiveness of vertebral haemangiomas in early stages. PMID:26511277
The Importance of Modelling in the Teaching and Popularization of Science.
ERIC Educational Resources Information Center
Giordan, Andre
1991-01-01
Discusses the epistemology and typical applications of learning models, focusing on practical methods to operationally introduce the distinctive allosteric models into the educational environment. Allosteric learning models strive to minimize the characteristic resistance that learners typically exhibit when confronted with the need to reorganize or…
Direct Multizone System -- DMS1-275.
ERIC Educational Resources Information Center
Lennox Industries, Inc., Marshalltown, IA.
Lennox Direct Multizone System as a new concept for integrated comfort control is described. The following areas of concern are included--(1) flexibility - typical applications, (2) detailed engineering data, (3) accessories, (4) approvals, (5) guide specifications, (6) dimensional drawings of a typical unit, (7) blower data, (8) mounting data,…
NASA Technical Reports Server (NTRS)
Cramer, K. E.; Winfree, W. P.
2005-01-01
The Nondestructive Evaluation Sciences Branch at NASA's Langley Research Center has been actively involved in the development of thermographic inspection techniques for more than 15 years. Since the Space Shuttle Columbia accident, NASA has focused on the improvement of advanced NDE techniques for the Reinforced Carbon-Carbon (RCC) panels that comprise the orbiter's wing leading edge. Various nondestructive inspection techniques have been used in the examination of the RCC, but thermography has emerged as an effective inspection alternative to more traditional methods. Thermography is a non-contact inspection method as compared to ultrasonic techniques which typically require the use of a coupling medium between the transducer and material. Like radiographic techniques, thermography can be used to inspect large areas, but has the advantage of minimal safety concerns and the ability for single-sided measurements. Principal Component Analysis (PCA) has been shown effective for reducing thermographic NDE data. A typical implementation of PCA is when the eigenvectors are generated from the data set being analyzed. Although it is a powerful tool for enhancing the visibility of defects in thermal data, PCA can be computationally intense and time consuming when applied to the large data sets typical in thermography. Additionally, PCA can experience problems when very large defects are present (defects that dominate the field-of-view), since the calculation of the eigenvectors is now governed by the presence of the defect, not the "good" material. To increase the processing speed and to minimize the negative effects of large defects, an alternative method of PCA is being pursued where a fixed set of eigenvectors, generated from an analytic model of the thermal response of the material under examination, is used to process the thermal data from the RCC materials. Details of a one-dimensional analytic model and a two-dimensional finite-element model will be presented.
An overview of the PCA process as well as a quantitative signal-to-noise comparison of the results of performing both embodiments of PCA on thermographic data from various RCC specimens will be shown. Finally, a number of different applications of this technology to various RCC components will be presented.
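The fixed-eigenvector variant of PCA described above can be sketched as a projection of each pixel's thermal time history onto a precomputed orthonormal basis. The 4-sample basis vectors and pixel histories below are illustrative stand-ins, not NASA's actual model-derived eigenvectors:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

# Fixed "model" eigenvectors over 4 time samples. Using a fixed basis avoids
# recomputing eigenvectors for every data set and keeps a large defect from
# dominating the basis, since the basis never sees the measured data.
e1 = normalize([1.0, 1.0, 1.0, 1.0])    # overall cooling level
e2 = normalize([1.5, 0.5, -0.5, -1.5])  # linear trend over time

def pca_scores(pixel_history):
    """Project one pixel's thermal time history onto the fixed basis."""
    return dot(pixel_history, e1), dot(pixel_history, e2)

# A nominal pixel and a "defect" pixel (slower cooling) separate on the
# second component, which captures the cooling trend.
good = [4.0, 3.0, 2.0, 1.0]
defect = [4.0, 3.6, 3.2, 2.8]
```

In practice the basis would come from the one-dimensional analytic or finite-element thermal model, and each score pair would form one pixel of a component image.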
An investigation on defect-generation conditions in immersion lithography
NASA Astrophysics Data System (ADS)
Tomita, Tadatoshi; Shimoaoki, Takeshi; Enomoto, Masashi; Kyoda, Hideharu; Kitano, Junichi; Suganaga, Toshifumi
2006-03-01
As a powerful candidate for a lithography technique that can accommodate the scaling-down of semiconductors, 193-nm immersion lithography, which realizes a high numerical aperture (NA) and uses deionized water as the medium between the lens and wafer in the exposure system, has been developing at a rapid pace and has reached the stage of practical application. With regard to the defects that are a cause for concern in 193-nm immersion lithography, however, many components are still unclear and many problems remain to be solved. It has been pointed out, for example, that immersion of the resist film in deionized water during exposure causes infiltration of moisture into the resist film, that internal components of the resist dissolve into the deionized water, and that residual water generated during exposure affects post-processing. To prevent the effects of directly immersing the resist in deionized water, application of a protective film is regarded as effective; however, even if such a film is applied, it is still highly likely that the above-mentioned defects will occur. Accordingly, to reduce these defects, it is essential to identify the typical defects occurring in 193-nm immersion lithography and to understand the conditions for defect generation with various kinds of protective films and resist materials. Furthermore, as the scaling-down of semiconductors continues, it is important to maintain a clear understanding of the relation between defect-generation conditions and critical dimensions (CD). Aiming to extract typical defects occurring in 193-nm immersion lithography, the authors carried out a comparative study with dry exposure lithography, thereby confirming several typical defects associated with immersion lithography. We then investigated the conditions for defect generation with several kinds of protective films.
In addition to that, by investigating the defect-generation conditions and comparing the classification data between wet and dry exposure, we were able to determine the origin of each particular defect involved in immersion lithography. Furthermore, the comparison of CD for wet and dry processing could indicate the future defectivity levels to be expected with shrinking immersion process critical dimensions.
Approximation of wave action flux velocity in strongly sheared mean flows
NASA Astrophysics Data System (ADS)
Banihashemi, Saeideh; Kirby, James T.; Dong, Zhifei
2017-08-01
Spectral wave models based on the wave action equation typically use a theoretical framework based on a depth-uniform current to account for current effects on waves. In the real world, however, currents often vary over depth. Several recent studies have made use of a depth-weighted current Ũ due to [Skop, R. A., 1987. Approximate dispersion relation for wave-current interactions. J. Waterway, Port, Coastal, and Ocean Eng. 113, 187-195.] or [Kirby, J. T., Chen, T., 1989. Surface waves on vertically sheared flows: approximate dispersion relations. J. Geophys. Res. 94, 1013-1027.] in order to account for the effect of vertical current shear. Use of the depth-weighted velocity, which is a function of wavenumber (or frequency and direction), has been further simplified in recent applications by only utilizing a weighted current based on the spectral peak wavenumber. These applications typically neglect the dependence of Ũ on wavenumber k, and they erroneously identify Ũ as the proper choice of current velocity in the wave action equation. Here, we derive a corrected expression for the current component of the group velocity. We demonstrate its consistency using analytic results for a current with constant vorticity and numerical results for a measured, strongly sheared current profile obtained in the Columbia River. The effect of choosing a single current velocity based on the peak wave frequency is examined, and we suggest an alternate strategy, involving a Taylor series expansion about the peak frequency, which should significantly extend the range of accuracy of current estimates available to the wave model with minimal additional programming and data transfer.
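The Kirby and Chen (1989) depth-weighted current referenced above has the form Ũ(k) = 2k/sinh(2kh) ∫ U(z) cosh(2k(z+h)) dz over the water column. The sketch below evaluates it by simple trapezoidal quadrature; the current profiles and parameter values are illustrative, not the Columbia River data of the paper:

```python
import math

def u_tilde(k, h, U, n=2000):
    """Depth-weighted current of Kirby & Chen (1989):
    Ũ(k) = 2k/sinh(2kh) * ∫_{-h}^{0} U(z) cosh(2k(z+h)) dz,
    with z = 0 at the surface and z = -h at the bed, computed by the
    trapezoidal rule with n intervals.
    """
    dz = h / n
    total = 0.0
    for i in range(n + 1):
        z = -h + i * dz
        w = 0.5 if i in (0, n) else 1.0  # trapezoid end-point weights
        total += w * U(z) * math.cosh(2.0 * k * (z + h)) * dz
    return 2.0 * k / math.sinh(2.0 * k * h) * total

# For a depth-uniform current the weighting returns that current exactly;
# for a linearly sheared current it is biased toward the near-surface value,
# since cosh(2k(z+h)) peaks at the surface.
uniform = u_tilde(k=0.1, h=10.0, U=lambda z: 0.5)
sheared = u_tilde(k=0.1, h=10.0, U=lambda z: 1.0 + 0.05 * z)
```

The wavenumber dependence the abstract emphasizes is visible here: evaluating `u_tilde` at different k for the same sheared profile gives different Ũ, which is what peak-frequency-only applications discard.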
QRAP: A numerical code for projected (Q)uasiparticle (RA)ndom (P)hase approximation
NASA Astrophysics Data System (ADS)
Samana, A. R.; Krmpotić, F.; Bertulani, C. A.
2010-06-01
A computer code for the quasiparticle random phase approximation (QRPA) and projected quasiparticle random phase approximation (PQRPA) models of nuclear structure is explained in detail. The residual interaction is approximated by a simple δ-force. An important application of the code consists in evaluating nuclear matrix elements involved in neutrino-nucleus reactions. As an example, cross sections for 56Fe and 12C are calculated and the code output is explained. The application to other nuclei and the description of other nuclear and weak decay processes are also discussed. Program summary: Title of program: QRAP (Quasiparticle RAndom Phase approximation). Computers: The code has been created on a PC, but also runs on UNIX or LINUX machines. Operating systems: WINDOWS or UNIX. Program language used: Fortran-77. Memory required to execute with typical data: 16 MB of RAM and 2 MB of hard disk space. No. of lines in distributed program, including test data, etc.: ~8000. No. of bytes in distributed program, including test data, etc.: ~256 kB. Distribution format: tar.gz. Nature of physical problem: The program calculates neutrino- and antineutrino-nucleus cross sections as a function of the incident neutrino energy, and muon capture rates, using the QRPA or PQRPA as nuclear structure models. Method of solution: The QRPA, or PQRPA, equations are solved in a self-consistent way for even-even nuclei. The nuclear matrix elements for the neutrino-nucleus interaction are treated as the inverse beta reaction of odd-odd nuclei as a function of the transferred momentum. Typical running time: ≈5 min on a 3 GHz processor for Data set 1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vanswijgenhoven, E.; Holmes, J.; Wevers, M.
Fiber-reinforced ceramic-matrix composites (CMCs) are under development for high-temperature structural applications. These applications involve fatigue loading under a wide range of frequencies. To date, high-temperature fatigue experiments have typically been performed at loading frequencies of 10 Hz or lower. At higher frequencies, a strong effect of loading frequency on fatigue life has been demonstrated for certain CMCs tested at room temperature. The fatigue life of CMCs with weak fiber-matrix interfaces typically decreases as the loading frequency increases. This decrease is attributed to frictional heating and frequency-dependent interface and fiber damage. More recently, it has been shown that the room temperature fatigue life of a Nicalon-fabric-reinforced composite with a strong interface (SYLRAMIC(TM)) appears to be independent of loading frequency. The high-temperature low-frequency fatigue behavior of the SYLRAMIC composite has also been investigated. For a fatigue peak stress sigma_peak above a proportional limit stress of 70 MPa, the number of cycles to failure N_f decreased with an increase in sigma_peak. The material endured more than 10^6 cycles for sigma_peak below 70 MPa. In this paper, the influence of loading frequency on the high-temperature fatigue behavior of the SYLRAMIC composite is reported. It will be shown that the fatigue limit is unaffected by the loading frequency, that the number of fatigue cycles to failure N_f increases with an increase in frequency, and that the time to failure t_f decreases with an increase in frequency.
Yang, Yu; Lian, Xin-Ying; Jiang, Yong-Hai; Xi, Bei-Dou; He, Xiao-Song
2017-11-01
Agricultural regions are a significant source of groundwater pesticide pollution. To ensure that agricultural regions with a significantly high risk of groundwater pesticide contamination are properly managed, a risk-based method for ranking groundwater pesticide contamination is needed. In the present paper, a risk-based prioritization method for the classification of groundwater pesticide pollution from agricultural regions was established. The method comprises 3 phases: indicator selection, characterization, and classification. In the risk ranking index system employed here, 17 indicators involving the physicochemical properties, environmental behavior characteristics, pesticide application methods, and inherent vulnerability of groundwater in the agricultural region were selected. The boundary of each indicator was determined using K-means cluster analysis based on a survey of a typical agricultural region and the physical and chemical properties of 300 typical pesticides. The total risk characterization was calculated by multiplying the risk value of each indicator, which effectively avoids the subjectivity of index weighting and identifies the main factors driving the risk. The results indicated that the risk of groundwater pesticide contamination from agriculture in a region could be ranked into 4 classes from low to high risk. This method was applied to an agricultural region in Jiangsu Province, China, and showed that this region had a relatively high risk of groundwater contamination from pesticides, with the pesticide application method being the primary factor contributing to the relatively high risk. The risk ranking method was determined to be feasible and valid, and able to provide reference data for the risk management of groundwater pesticide pollution from agricultural regions. Integr Environ Assess Manag 2017;13:1052-1059. © 2017 SETAC.
Spectroscopic analysis technique for arc-welding process control
NASA Astrophysics Data System (ADS)
Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel
2005-09-01
The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. In particular, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable, as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding using fiber-optic capture of light and a low-cost CCD-based spectrometer show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
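Once peak intensities have been extracted, a standard way to estimate the plasma electronic temperature from two emission lines of the same species is the Boltzmann line-ratio method (the paper uses peaks from multiple species; this two-line sketch is a simplification, and the wavelengths, transition probabilities A, statistical weights g and level energies E below are synthetic, not real line data):

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def electron_temperature(line1, line2):
    """Two-line Boltzmann ratio estimate of the electronic temperature.

    Each line is (intensity, wavelength_nm, A, g, E_eV). Assuming
    I ∝ (A*g/λ) * exp(-E / kT) for both lines of one species,
    kT = (E2 - E1) / ln[(I1*A2*g2*λ1) / (I2*A1*g1*λ2)].
    """
    I1, w1, A1, g1, E1 = line1
    I2, w2, A2, g2, E2 = line2
    ratio = (I1 * A2 * g2 * w1) / (I2 * A1 * g1 * w2)
    return (E2 - E1) / (K_B * math.log(ratio))

def synthetic_intensity(w, A, g, E, T):
    # Relative emission intensity I ∝ (A*g/λ) * exp(-E / kT)
    return (A * g / w) * math.exp(-E / (K_B * T))

# Round-trip check: generate two synthetic lines at a known temperature
# and recover that temperature from their intensity ratio.
T_true = 10000.0
l1 = (synthetic_intensity(400.0, 2e7, 5, 3.0, T_true), 400.0, 2e7, 5, 3.0)
l2 = (synthetic_intensity(420.0, 5e6, 7, 5.5, T_true), 420.0, 5e6, 7, 5.5)
```

In a monitoring loop, a sudden shift in the estimated temperature flags a change in process parameters that may produce a weld defect, which is the quantity the paper tracks in real time.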
Vasculogenic and Angiogenic Pathways in Moyamoya Disease.
Bedini, Gloria; Blecharz, Kinga G; Nava, Sara; Vajkoczy, Peter; Alessandri, Giulio; Ranieri, Michela; Acerbi, Francesco; Ferroli, Paolo; Riva, Daria; Esposito, Silvia; Pantaleoni, Chiara; Nardocci, Nardo; Zibordi, Federica; Ciceri, Elisa; Parati, Eugenio A; Bersano, Anna
2016-01-01
Moyamoya disease (MMD) is a slowly progressing steno-occlusive cerebrovascular disease. The typical moyamoya vessels, which originate from an initial stenosis of the internal carotid artery, indicate that increased and/or abnormal angiogenic, vasculogenic and arteriogenic processes are involved in the disease pathophysiology. Herein, we summarize the current knowledge on the most important signaling pathways involved in MMD vessel formation, focusing particularly on the expression of growth factors and the function of endothelial progenitor cells (EPCs). Higher plasma concentrations of vascular endothelial growth factor, matrix metalloproteinase, hepatocyte growth factor, and interleukin-1β have been reported in MMD. A specifically higher level of basic fibroblast growth factor was also found in the cerebrospinal fluid of these patients. Finally, the number and functionality of EPCs were found to be increased. Despite the available data, the approaches and findings reported so far do not establish a clear correlation between the expression levels of the aforementioned growth factors and MMD severity. Furthermore, the conflicting results provided by studies on EPCs do not allow us to understand the true involvement of these cells in MMD pathophysiology. Further studies should thus be undertaken to extend our knowledge of the processes regulating both the arterial stenosis and the excessive formation of collateral vessels. Moreover, we suggest developing integrated approaches and functional assays to correlate biological and clinical data, arguing for the development of new therapeutic applications for MMD.
ERIC Educational Resources Information Center
Schoonenboom, Judith
2016-01-01
Educational innovations often involve intact subgroups, such as school classes or university departments. In small-scale educational evaluation research, typically involving 1 to 20 subgroups, differences among these subgroups are often neglected. This article presents a mixed method from a qualitative perspective, in which differences among…
[Secondary bladder lymphoma in a patient with AIDS].
Vendrell, J R; Alcaraz, A; Gutíerrez, R; Rodríguez, A; Barranco, M A; Carretero, P
1996-10-01
We report one case of non-Hodgkin lymphoma (NHL) with bladder involvement that presented clinically with urological symptoms. Bladder involvement is typical of NHL, and is becoming more frequent with the increasing number of AIDS patients under immunosuppressive therapy. It should be expected that this currently unusual entity will become more common in the future.
Discussion of David Thissen's Bad Questions: An Essay Involving Item Response Theory
ERIC Educational Resources Information Center
Wainer, Howard
2016-01-01
The usual role of a discussant is to clarify and correct the paper being discussed, but in this case, the author, Howard Wainer, generally agrees with everything David Thissen says in his essay, "Bad Questions: An Essay Involving Item Response Theory." This essay expands on David Thissen's statement that there are typically two principal…
Involving Your Child or Teen with ASD in Integrated Community Activities
ERIC Educational Resources Information Center
McKee, Rebecca
2011-01-01
Participating in outside activities and community-based endeavors can be tricky for people with special needs, like Autism Spectrum Disorder (ASD). Families meet more than a few obstacles attempting to integrate their children or teens who have special needs like ASD. Most typical children are highly involved in sports, clubs and camps. If a…
Foster Care Involvement among Medicaid-Enrolled Children with Autism
ERIC Educational Resources Information Center
Cidav, Zuleyha; Xie, Ming; Mandell, David S.
2018-01-01
The prevalence and risk of foster care involvement among children with autism spectrum disorder (ASD) relative to children with intellectual disability (ID), children with ASD and ID, and typically developing children were examined using 2001-2007 Medicaid data. Children were followed up to the first foster care placement or until the end of 2007;…
Developing Deep Learning Applications for Life Science and Pharma Industry.
Siegismund, Daniel; Tolkachev, Vasily; Heyse, Stephan; Sick, Beate; Duerr, Oliver; Steigele, Stephan
2018-06-01
Deep learning has boosted artificial intelligence over the past 5 years and is now seen as one of the major areas of technological innovation, predicted to replace many repetitive but complex tasks of human labor within the next decade. It is also expected to be 'game changing' for research activities in pharma and life sciences, where large sets of similar yet complex data samples are systematically analyzed. Deep learning is currently conquering former expert domains, especially in areas requiring perception that were previously not amenable to standard machine learning. A typical example is the automated analysis of images, which are produced en masse in many domains, e.g., in high-content screening or digital pathology. Deep learning makes it possible to create competitive applications in so-far defined core domains of 'human intelligence'. Applications of artificial intelligence have been enabled in recent years by (i) the massive availability of data samples collected in pharma-driven drug programs (='big data'), (ii) algorithmic advances in deep learning, and (iii) increases in compute power. Such applications are based on software frameworks with specific strengths and weaknesses. Here, we introduce typical applications and underlying frameworks for deep learning, with a set of practical criteria for developing production-ready solutions in life science and pharma research. Based on our own experience in successfully developing deep learning applications, we provide suggestions and a baseline for selecting the most suitable frameworks for future-proof and cost-effective development. © Georg Thieme Verlag KG Stuttgart · New York.
Ogden, Rob
2010-09-01
Wildlife DNA forensics is receiving increasing coverage in the popular press and has begun to appear in the scientific literature in relation to several different fields. Recognized as an applied subject, it rests on top of very diverse scientific pillars ranging from biochemistry through to evolutionary genetics, all embedded within the context of modern forensic science. This breadth of scope, combined with typically limited resources, has often left wildlife DNA forensics hanging precariously between human DNA forensics and academics keen to seek novel applications for biological research. How best to bridge this gap is a matter for regular debate among the relatively few full-time practitioners in the field. The decisions involved in establishing forensic genetic services to investigate wildlife crime can be complex, particularly where crimes involve a wide range of species and evidential questions. This paper examines some of the issues relevant to setting up a wildlife DNA forensics laboratory based on experiences of working in this area over the past 7 years. It includes a discussion of various models for operating individual laboratories as well as options for organizing forensic testing at higher national and international levels.
Finite difference time domain calculation of transients in antennas with nonlinear loads
NASA Technical Reports Server (NTRS)
Luebbers, Raymond J.; Beggs, John H.; Kunz, Karl S.; Chamberlin, Kent
1991-01-01
Determining transient electromagnetic fields in antennas with nonlinear loads is a challenging problem. Typical methods involve calculating frequency domain parameters at a large number of different frequencies, then applying Fourier transform methods plus nonlinear equation solution techniques. If the antenna is simple enough that the open circuit time domain voltage can be determined independently of the effects of the nonlinear load on the antenna's current, time stepping methods can be applied in a straightforward way. Here, transient fields for antennas with more general geometries are calculated directly using Finite Difference Time Domain (FDTD) methods. In each FDTD cell which contains a nonlinear load, a nonlinear equation is solved at each time step. As a test case, the transient current in a long dipole antenna with a nonlinear load excited by a pulsed plane wave is computed using this approach. The results agree well with both calculated and measured results previously published. The approach given here extends the applicability of the FDTD method to problems involving scattering from targets, including nonlinear loads and materials, and to coupling between antennas containing nonlinear loads. It may also be extended to propagation through nonlinear materials.
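The per-cell nonlinear solve described above can be sketched generically: at a cell containing the load, the field update reduces to a scalar nonlinear equation in the load voltage, solved here by Newton iteration for a hypothetical capacitor-plus-diode load (the load model, parameter values, and function name are illustrative assumptions, not the authors' implementation).

```python
import math

def solve_diode_voltage(i_drive, v_prev, dt, c=1e-12, i_s=1e-14, v_t=0.0259):
    """Newton iteration for the load-cell voltage at one FDTD time step.

    Hypothetical capacitor + diode load driven by current i_drive:
        c*(v - v_prev)/dt + i_s*(exp(v/v_t) - 1) - i_drive = 0
    """
    v = v_prev
    for _ in range(50):
        f = c * (v - v_prev) / dt + i_s * math.expm1(v / v_t) - i_drive
        df = c / dt + (i_s / v_t) * math.exp(v / v_t)
        step = f / df
        v -= step
        if abs(step) < 1e-12:
            break
    return v
```

In a full simulation this solve would replace the ordinary linear field update in exactly those cells that contain the load, once per time step.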
Lead sorption-desorption from organic residues.
Duarte Zaragoza, Victor M; Carrillo, Rogelio; Gutierrez Castorena, Carmen M
2011-01-01
Sorption and desorption are mechanisms involved in the reduction of metal mobility and bioavailability in organic materials. Metal release from substrates is controlled by desorption. The capacity of coffee husk and pulp residues, vermicompost and cow manure to adsorb Pb2+ was evaluated. The mechanisms involved in the sorption process were also studied. Organic materials retained high concentrations of lead (up to 36,000 mg L(-1)); however, the mechanisms of sorption varied according to the characteristics of each material: degree of decomposition, pH, cation exchange capacity and percentage of organic matter. Vermicompost and manure removed 98% of the Pb from solution. Lead precipitated in manure and vermicompost, forming lead oxide (PbO) and lead ferrite (PbFe4O7). Adsorption isotherms did not fit the typical Freundlich and Langmuir equations. Not only specific and non-specific adsorption but also precipitation and coprecipitation were observed. Lead desorption from vermicompost and cow manure was less than 2%. For remediation of Pb-polluted sites, application of vermicompost and manure is recommended in places with alkaline soils, because Pb precipitation can be induced, whereas coffee pulp residue is recommended for acidic soils, where Pb is adsorbed.
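The Langmuir equation the isotherms were tested against can be fitted with a short sketch: the linearized form c/q = c/q_max + 1/(K*q_max) turns the fit into ordinary least squares (the synthetic numbers and function name here are illustrative, not the study's data).

```python
import numpy as np

def fit_langmuir(c, q):
    """Fit q = q_max*K*c / (1 + K*c) via the linearized form
       c/q = c/q_max + 1/(K*q_max), i.e. a straight line in c."""
    c, q = np.asarray(c, float), np.asarray(q, float)
    slope, intercept = np.polyfit(c, c / q, 1)
    q_max = 1.0 / slope        # plateau sorption capacity
    K = slope / intercept      # Langmuir affinity constant
    return q_max, K
```

A systematic misfit of this line (curvature in c/q vs. c) is one quick diagnostic that, as in the study, mechanisms other than simple adsorption are at work.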
Separation of time scales in one-dimensional directed nucleation-growth processes
NASA Astrophysics Data System (ADS)
Pierobon, Paolo; Miné-Hattab, Judith; Cappello, Giovanni; Viovy, Jean-Louis; Lagomarsino, Marco Cosentino
2010-12-01
Proteins involved in homologous recombination such as RecA and hRad51 polymerize on single- and double-stranded DNA according to nucleation-growth kinetics, which can be monitored by single-molecule in vitro assays. The basic models currently used to extract biochemical rates rely on ensemble averages and are typically based on an underlying process of bidirectional polymerization, in contrast with the often observed anisotropic polymerization of similar proteins. For these reasons, if one considers single-molecule experiments, the available models are useful for understanding observations only in some regimes. In particular, recent experiments have highlighted a steplike polymerization kinetics. The classical model of one-dimensional nucleation growth, the Kolmogorov-Avrami-Mehl-Johnson (KAMJ) model, predicts the correct polymerization kinetics only in some regimes and fails to predict the steplike behavior. This work illustrates, by simulations and analytical arguments, the limits of applicability of the KAMJ description and proposes a minimal model for the statistics of the steps based on the so-called stick-breaking stochastic process. We argue that this insight might be useful to extract information on the time and length scales involved in the polymerization kinetics.
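The stick-breaking process invoked for the step statistics can be sketched in a few lines: fragment sizes are drawn as successive Beta-distributed fractions of the remaining substrate. This is a generic sketch; the distribution choice and parameter are assumptions, not the paper's calibrated model.

```python
import random

def stick_breaking(n_pieces, alpha=1.0, rng=None):
    """Break a unit 'stick' (e.g. the DNA substrate) into n_pieces
    successive fragments: each piece takes a Beta(1, alpha) fraction
    of whatever remains, so pieces tend to shrink down the sequence."""
    rng = rng or random.Random(0)
    remaining, pieces = 1.0, []
    for _ in range(n_pieces):
        frac = rng.betavariate(1.0, alpha)
        pieces.append(remaining * frac)
        remaining *= 1.0 - frac
    return pieces
```

Histogramming the pieces over many draws gives the step-size statistics such a model predicts, which can then be compared to observed polymerization steps.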
"Tactic": Traffic Aware Cloud for Tiered Infrastructure Consolidation
ERIC Educational Resources Information Center
Sangpetch, Akkarit
2013-01-01
Large-scale enterprise applications are deployed as distributed applications. These applications consist of many inter-connected components with heterogeneous roles and complex dependencies. Each component typically consumes 5-15% of the server capacity. Deploying each component as a separate virtual machine (VM) allows us to consolidate the…
Application of Computer Technology to Educational Administration in the United States.
ERIC Educational Resources Information Center
Bozeman, William C.; And Others
1991-01-01
Description of evolution of computer applications in U.S. educational administration is followed by an overview of the structure and governance of public education and Visscher's developmental framework. Typical administrative computer applications in education are discussed, including student records, personnel management, budgeting, library…
Environmental applications activity at Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
Paludan, C. T. N.
1972-01-01
MSFC environmental applications demonstration projects have emphasized application of aerospace technology to community needs of southeastern U.S. Some of the typical projects underway are: hydrological parameter determination; land use surveys; agricultural stress detection; new community site surveys; pollution monitoring; urban transportation studies; and urban environmental quality.
Silverstein, Jonathan C; Dech, Fred; Kouchoukos, Philip L
2004-01-01
Radiological volumes are typically reviewed by surgeons using cross-sections and iso-surface reconstructions. Applications that combine collaborative stereo volume visualization with symbolic anatomic information and data fusion would expand surgeons' capabilities in interpretation of data and in planning treatment. Such an application has not been seen clinically. We are developing methods to systematically combine symbolic anatomy (term hierarchies and iso-surface atlases) with patient data using data fusion. We describe our progress toward integrating these methods into our collaborative virtual reality application. The fully combined application will be a feature-rich stereo collaborative volume visualization environment for use by surgeons in which DICOM datasets will self-report underlying anatomy with visual feedback. Using hierarchical navigation of SNOMED-CT anatomic terms integrated with our existing Tele-immersive DICOM-based volumetric rendering application, we will display polygonal representations of anatomic systems on the fly from menus that query a database. The methods and tools involved in this application development are SNOMED-CT, DICOM, VISIBLE HUMAN, volumetric fusion and C++ on a Tele-immersive platform. This application will allow us to identify structures and display polygonal representations from atlas data overlaid with the volume rendering. First, atlas data is automatically translated, rotated, and scaled to the patient data during loading using a public domain volumetric fusion algorithm. This generates a modified symbolic representation of the underlying canonical anatomy. Then, through the use of collision detection or intersection testing of various transparent polygonal representations, the polygonal structures are highlighted into the volumetric representation while the SNOMED names are displayed. Thus, structural names and polygonal models are associated with the visualized DICOM data. This novel juxtaposition of information promises to expand surgeons' abilities to interpret images and plan treatment.
McCrea, Simon M.; Robinson, Thomas P.
2011-01-01
In this study, five consecutive patients with focal strokes and/or cortical excisions were examined with the Wechsler Adult Intelligence Scale and Wechsler Memory Scale—Fourth Editions along with a comprehensive battery of other neuropsychological tasks. All five of the lesions were large and typically involved frontal, temporal, and/or parietal lobes and were lateralized to one hemisphere. The clinical case method was used to determine the cognitive neuropsychological correlates of mental rotation (Visual Puzzles), Piagetian balance beam (Figure Weights), and visual search (Cancellation) tasks. The pattern of results on Visual Puzzles and Figure Weights suggested that both subtests involve predominately right frontoparietal networks involved in visual working memory. It appeared that Visual Puzzles could also critically rely on the integrity of the left temporoparietal junction. The left temporoparietal junction could be involved in temporal ordering and integration of local elements into a nonverbal gestalt. In contrast, the Figure Weights task appears to critically involve the right temporoparietal junction involved in numerical magnitude estimation. Cancellation was sensitive to left frontotemporal lesions and not right posterior parietal lesions typical of other visual search tasks. In addition, the Cancellation subtest was sensitive to verbal search strategies and perhaps object-based attention demands, thereby constituting a unique task in comparison with previous visual search tasks. PMID:22389807
The kinetics and acoustics of fingering and note transitions on the flute.
Almeida, André; Chow, Renee; Smith, John; Wolfe, Joe
2009-09-01
Motion of the keys was measured in a transverse flute while beginner, amateur, and professional flutists played a range of exercises. The time taken for a key to open or close was typically 10 ms when pushed by a finger or 16 ms when moved by a spring. Because the opening and closing of keys will never be exactly simultaneous, transitions between notes that involve the movement of multiple fingers can occur via several possible pathways with different intermediate fingerings. A transition is classified as "safe" if it is possible to be slurred from the initial to final note with little perceptible change in pitch or volume. Some transitions are "unsafe" and possibly involve a transient change in pitch or a decrease in volume. Players, on average, used safe transitions more frequently than unsafe transitions. Delays between the motion of the fingers were typically tens of milliseconds, with longer delays as more fingers become involved. Professionals exhibited smaller average delays between the motion of their fingers than did amateurs.
NASA Astrophysics Data System (ADS)
Pedretti, Daniele; Beckie, Roger Daniel
2014-05-01
Missing data in hydrological time-series databases are ubiquitous in practical applications, yet it is of fundamental importance to make educated decisions in problems requiring exhaustive time-series knowledge. This includes precipitation datasets, since recording or human failures can produce gaps in these time series. For some applications, directly involving the ratio between precipitation and some other quantity, lack of complete information can result in poor understanding of basic physical and chemical dynamics involving precipitated water. For instance, the ratio between precipitation (recharge) and outflow rates at a discharge point of an aquifer (e.g. rivers, pumping wells, lysimeters) can be used to obtain aquifer parameters and thus to constrain model-based predictions. We tested a suite of methodologies to reconstruct missing information in rainfall datasets. The goal was to obtain a suitable and versatile method to reduce the errors caused by the lack of data in specific time windows. Our analyses included both a classical chronological pairing approach between rainfall stations and a probability-based approach, which accounted for the probability of exceedance of rain depths measured at two or multiple stations. Our analyses showed that it is not clear a priori which method performs best; rather, the selection should be based on the specific statistical properties of the rainfall dataset. In this presentation, our emphasis is on the effects of a few typical parametric distributions used to model the behavior of rainfall. Specifically, we analyzed the role of distributional "tails", which have an important control on the occurrence of extreme rainfall events. The latter strongly affect several hydrological applications, including recharge-discharge relationships. The heavy-tailed distributions we considered were parametric Log-Normal, Generalized Pareto, Generalized Extreme Value and Gamma distributions.
The methods were first tested on synthetic examples, to have a complete control of the impact of several variables such as minimum amount of data required to obtain reliable statistical distributions from the selected parametric functions. Then, we applied the methodology to precipitation datasets collected in the Vancouver area and on a mining site in Peru.
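One simple probability-based reconstruction in the spirit described above is empirical quantile mapping between two stations: a reference reading is converted to its non-exceedance probability among concurrent observations, then to the corresponding quantile of the target station's own record. This is a hedged sketch, not the authors' code; the function name and estimator details are assumptions, and parametric (e.g. heavy-tailed) fits could replace the empirical quantiles.

```python
import numpy as np

def fill_by_quantile_mapping(target, reference):
    """Fill NaN gaps in `target` using the concurrent `reference` station.

    Assumes `reference` is gap-free; empirical CDFs are built only from
    time steps where `target` was actually observed."""
    target = np.asarray(target, float)
    reference = np.asarray(reference, float)
    obs = ~np.isnan(target)
    t_sorted = np.sort(target[obs])
    r_sorted = np.sort(reference[obs])
    filled = target.copy()
    for i in np.where(~obs)[0]:
        # non-exceedance probability of the reference value, then map
        # that probability onto the target station's own distribution
        p = np.searchsorted(r_sorted, reference[i]) / len(r_sorted)
        filled[i] = np.quantile(t_sorted, min(p, 1.0))
    return filled
```

Unlike chronological pairing (copying a scaled concurrent value), this preserves the target station's marginal distribution, including its tail behavior.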
BIOSENSORS FOR ENVIRONMENTAL APPLICATIONS
A review, with 19 references, is given on challenges and possible opportunities for the development of biosensors for environmental monitoring applications. The high cost and slow turnaround times typically associated with the measurement of regulated pollutants clearly indicates...
Space Transportation systems overview
NASA Technical Reports Server (NTRS)
Lee, C. M.
1979-01-01
Planning for the operations phase of the Space Transportation system is reviewed. Attention is given to mission profile (typical), applications, manifesting rationale, the Operational Flight Test manifest, the operations manifest, pricing policy, and potential applications of the STS.
A forestry application simulation of man-machine techniques for analyzing remotely sensed data
NASA Technical Reports Server (NTRS)
Berkebile, J.; Russell, J.; Lube, B.
1976-01-01
The typical steps in the analysis of remotely sensed data for a forestry applications example are simulated. The example uses numerically-oriented pattern recognition techniques and emphasizes man-machine interaction.
The Typicality Ranking Task: A New Method to Derive Typicality Judgments from Children.
Djalal, Farah Mutiasari; Ameel, Eef; Storms, Gert
2016-01-01
An alternative method for deriving typicality judgments, applicable to young children who are not yet familiar with numerical values, is introduced, allowing researchers to study gradedness at younger ages in concept development. Contrary to the long tradition of using rating-based procedures to derive typicality judgments, we propose a method that is based on typicality ranking rather than rating, in which items are gradually sorted according to their typicality and which requires a minimum of linguistic knowledge. The validity of the method is investigated and the method is compared to the traditional typicality rating measurement in a large empirical study with eight different semantic concepts. The results show that the typicality ranking task can be used to assess children's category knowledge and to evaluate how this knowledge evolves over time. Contrary to earlier held assumptions in studies on typicality in young children, our results also show that preference is not so much a confounding variable to be avoided: the two variables are often significantly correlated in older children and even in adults.
Parent-directed approaches to enrich the early language environments of children living in poverty.
Leffel, Kristin; Suskind, Dana
2013-11-01
Children's early language environments are critical for their cognitive development, school readiness, and ultimate educational attainment. Significant disparities exist in these environments, with profound and lasting impacts upon children's ultimate outcomes. Children from backgrounds of low socioeconomic status experience diminished language inputs and enter school at a disadvantage, with disparities persisting throughout their educational careers. Parents are positioned as powerful agents of change in their children's lives, however, and evidence indicates that parent-directed intervention is effective in improving child outcomes. This article explores the efficacy of parent-directed interventions and their potential applicability to the wider educational achievement gap seen in typically developing populations of low socioeconomic status and then describes efforts to develop such interventions with the Thirty Million Words Project and Project ASPIRE (Achieving Superior Parental Involvement for Rehabilitative Excellence) curricula.
Fast large-scale clustering of protein structures using Gauss integrals.
Harder, Tim; Borg, Mikael; Boomsma, Wouter; Røgen, Peter; Hamelryck, Thomas
2012-02-15
Clustering protein structures is an important task in structural bioinformatics. De novo structure prediction, for example, often involves a clustering step for finding the best prediction. Other applications include assigning proteins to fold families and analyzing molecular dynamics trajectories. We present Pleiades, a novel approach to clustering protein structures with a rigorous mathematical underpinning. The method approximates clustering based on the root mean square deviation by first mapping structures to Gauss integral vectors--which were introduced by Røgen and co-workers--and subsequently performing K-means clustering. Compared to current methods, Pleiades dramatically improves on the time needed to perform clustering, and can cluster a significantly larger number of structures, while providing state-of-the-art results. The number of low energy structures generated in a typical folding study, which is in the order of 50,000 structures, can be clustered within seconds to minutes.
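The clustering step itself is standard K-means applied to the Gauss integral vectors. A minimal sketch follows (deterministic farthest-point seeding is an assumption made here for reproducibility, and computing the Gauss integral vectors from structures is out of scope):

```python
import numpy as np

def kmeans(vectors, k, n_iter=100):
    """Plain K-means; `vectors` stands in for precomputed
    Gauss integral descriptors, one row per structure."""
    X = np.asarray(vectors, float)
    centers = [X[0]]
    while len(centers) < k:  # farthest-point initialization
        d = np.min([((X - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(X[int(d.argmax())])
    centers = np.array(centers)
    for _ in range(n_iter):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        new = np.array([X[labels == j].mean(0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels
```

The speed advantage reported in the abstract comes from clustering short fixed-length vectors instead of computing pairwise RMSD superpositions between all structures.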
Seeking maximum linearity of transfer functions
NASA Astrophysics Data System (ADS)
Silva, Filipi N.; Comin, Cesar H.; Costa, Luciano da F.
2016-12-01
Linearity is an important and frequently sought property in electronics and instrumentation. Here, we report a method capable of, given a transfer function (theoretical or derived from some real system), identifying its most linear region of operation with a fixed width. The methodology, based on least squares regression and systematic consideration of all possible regions, is illustrated on both an analytical case (a sigmoid transfer function) and a simple experimental case involving a low-power, one-stage class A transistor current amplifier. The approach, applied to transfer functions derived from an experimentally obtained characteristic surface, also yielded contributions such as the estimation of local constants of the device, as opposed to the typically considered average values. The reported method and results pave the way to several further applications in other types of devices and systems, intelligent control operation, and other areas such as identifying regions of power law behavior.
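The exhaustive fixed-width search described above amounts to fitting a least-squares line in every window and keeping the window with the smallest residual. A minimal sketch (function and variable names are assumptions, not the authors' code):

```python
import numpy as np

def most_linear_region(x, y, width):
    """Scan every contiguous window of `width` samples of the transfer
    function (x, y); return the start index, slope, and intercept of the
    window whose least-squares line has the smallest residual sum of squares."""
    best = (np.inf, 0, 0.0, 0.0)  # (rss, start, slope, intercept)
    for s in range(len(x) - width + 1):
        xs, ys = x[s:s + width], y[s:s + width]
        slope, intercept = np.polyfit(xs, ys, 1)
        rss = float(((ys - (slope * xs + intercept)) ** 2).sum())
        if rss < best[0]:
            best = (rss, s, slope, intercept)
    _, start, slope, intercept = best
    return start, slope, intercept
```

The returned slope is precisely the kind of local constant (e.g. local gain) the abstract contrasts with a single device-wide average.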
Variable Structure Control of a Hand-Launched Glider
NASA Technical Reports Server (NTRS)
Anderson, Mark R.; Waszak, Martin R.
2005-01-01
Variable structure control system design methods are applied to the problem of aircraft spin recovery. A variable structure control law typically has two phases of operation. The reaching mode phase uses a nonlinear relay control strategy to drive the system trajectory to a pre-defined switching surface within the motion state space. The sliding mode phase involves motion along the surface as the system moves toward an equilibrium or critical point. Analysis results presented in this paper reveal that the conventional method for spin recovery can be interpreted as a variable structure controller with a switching surface defined at zero yaw rate. Application of Lyapunov stability methods shows that deflecting the ailerons in the direction of the spin helps to ensure that this switching surface is stable. Flight test results, obtained using an instrumented hand-launched glider, are used to verify stability of the reaching mode dynamics.
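The two phases can be sketched on a textbook double-integrator plant: a relay law drives the state to the switching surface s = x2 + lam*x1 (reaching mode), after which the trajectory chatters along the surface toward the origin (sliding mode). The plant, gains, and integration scheme are illustrative assumptions, not the glider controller.

```python
def simulate_vsc(x1=1.0, x2=0.0, lam=1.0, k=2.0, dt=1e-3, t_end=10.0):
    """Relay (reaching) + sliding phase on surface s = x2 + lam*x1
    for a double-integrator plant x1' = x2, x2' = u; forward Euler."""
    steps = int(t_end / dt)
    for _ in range(steps):
        s = x2 + lam * x1
        u = -k * (1.0 if s > 0 else -1.0)  # relay control law
        x1 += dt * x2
        x2 += dt * u
    return x1, x2
```

Once on the surface, the closed loop behaves like x1' = -lam*x1, so the state decays exponentially regardless of the plant's open-loop dynamics, which is the robustness property that motivates sliding mode design.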
Flexible and stackable terahertz metamaterials via silver-nanoparticle inkjet printing
NASA Astrophysics Data System (ADS)
Kashiwagi, K.; Xie, L.; Li, X.; Kageyama, T.; Miura, M.; Miyashita, H.; Kono, J.; Lee, S.-S.
2018-04-01
There is presently much interest in tunable, flexible, or reconfigurable metamaterial structures that work in the terahertz frequency range. They can be useful for a range of applications, including spectroscopy, sensing, imaging, and communications. Various methods based on microelectromechanical systems have been used for fabricating terahertz metamaterials, but they typically require high-cost facilities and involve a number of time-consuming and intricate processes. Here, we demonstrate a simple, robust, and cost-effective method for fabricating flexible and stackable multiresonant terahertz metamaterials, using silver nanoparticle inkjet printing. Using this method, we designed and fabricated two arrays of split-ring resonators (SRRs) having different resonant frequencies on separate sheets of paper and then combined the two arrays by stacking. Through terahertz time-domain spectroscopy, we observed resonances at the frequencies expected for the individual SRR arrays as well as at a new frequency due to coupling between the two SRR arrays.
[Bus drivers' biomechanical risk assessment in two different contexts].
Baracco, A; Coggiola, M; Perrelli, F; Banchio, M; Martignone, S; Gullino, A; Romano, C
2012-01-01
The application of standardized methods for biomechanical risk assessment of non-industrial cyclic activity is not always possible. A typical case is the public transport sector, where workers report shoulder pain more often than elbow and wrist pain. The Authors present the results of two studies, involving two public transport companies, on the risk of biomechanical overload of the upper limbs for bus and tram drivers. The analysis was made using three different approaches: focus groups; static analysis using anthropometric manikins; and work sampling, monitoring the worker's activity and posture each minute, for two hours and for each vehicle-route pairing, considering P5F and P95M drivers and assessing perceived effort through the Borg CR10 scale. The conclusive results show that an ergonomic analysis managed by multiple non-standardized techniques can reach consistent and repeatable results in accordance with the epidemiological evidence.
NASA Astrophysics Data System (ADS)
Kemp, Melissa M.; Kumar, Ashavani; Mousa, Shaymaa; Dyskin, Evgeny; Yalcin, Murat; Ajayan, Pulickel; Linhardt, Robert J.; Mousa, Shaker A.
2009-11-01
Silver and gold nanoparticles display unique physical and biological properties that have been extensively studied for biological and medical applications. Typically, gold and silver nanoparticles are prepared with chemical reductants that require excess toxic reactants, which must be removed for biological purposes. We utilized a clean method involving a single synthetic step to prepare metal nanoparticles for evaluating potential effects on angiogenesis modulation. These nanoparticles were prepared by reducing silver nitrate and gold chloride with diaminopyridinyl (DAP)-derivatized heparin (HP) polysaccharides. Both gold and silver nanoparticles reduced with DAPHP exhibited effective inhibition of basic fibroblast growth factor (FGF-2)-induced angiogenesis, with enhanced anti-angiogenesis efficacy for DAPHP conjugation compared to glucose conjugation (P<0.01). These results suggest that DAPHP-reduced silver and gold nanoparticles have potential in disorders driven by pathological angiogenesis, such as cancer and inflammatory diseases.
Real-time fuzzy inference based robot path planning
NASA Technical Reports Server (NTRS)
Pacini, Peter J.; Teichrow, Jon S.
1990-01-01
This project addresses the problem of adaptive trajectory generation for a robot arm. Conventional trajectory generation involves computing a path in real time to minimize a performance measure such as expended energy. This method can be computationally intensive, and it may yield poor results if the trajectory is weakly constrained. Typically some implicit constraints are known, but cannot be encoded analytically. The alternative approach used here is to formulate domain-specific knowledge, including implicit and ill-defined constraints, in terms of fuzzy rules. These rules utilize linguistic terms to relate input variables to output variables. Since the fuzzy rulebase is determined off-line, only high-level, computationally light processing is required in real time. Potential applications for adaptive trajectory generation include missile guidance and various sophisticated robot control tasks, such as automotive assembly, high speed electrical parts insertion, stepper alignment, and motion control for high speed parcel transfer systems.
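A minimal fuzzy inference step of the kind described, with triangular memberships, two linguistic rules, and weighted-average defuzzification, might be sketched as follows (the rule base, membership shapes, and names are illustrative assumptions, not the project's rulebase):

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_speed(distance):
    """Two linguistic rules mapping normalized obstacle distance to arm speed:
        IF distance is NEAR THEN speed is SLOW
        IF distance is FAR  THEN speed is FAST
    Weighted average of singleton outputs (a simple defuzzification)."""
    near = tri(distance, -0.1, 0.0, 1.0)
    far = tri(distance, 0.0, 1.0, 1.1)
    slow, fast = 0.1, 0.9  # singleton output levels
    den = near + far
    return (near * slow + far * fast) / den if den else 0.0
```

Because the rulebase is fixed off-line, the on-line cost per control cycle is only a handful of membership evaluations and one weighted average, matching the "computationally light" real-time requirement in the abstract.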
Sequential geophysical and flow inversion to characterize fracture networks in subsurface systems
Mudunuru, Maruti Kumar; Karra, Satish; Makedonska, Nataliia; ...
2017-09-05
Subsurface applications, including geothermal, geological carbon sequestration, and oil and gas, typically involve maximizing either the extraction of energy or the storage of fluids. Fractures form the main pathways for flow in these systems, and locating these fractures is critical for predicting flow. However, fracture characterization is a highly uncertain process, and data from multiple sources, such as flow and geophysical measurements, are needed to reduce this uncertainty. We present a nonintrusive, sequential inversion framework for integrating data from geophysical and flow sources to constrain fracture networks in the subsurface. In this framework, we first estimate bounds on the statistics for the fracture orientations using microseismic data. These bounds are estimated through a combination of a focal mechanism (physics-based approach) and clustering analysis (statistical approach) of seismic data. Then, the fracture lengths are constrained using flow data. Finally, the efficacy of this inversion is demonstrated through a representative example.
Lipid Informed Quantitation and Identification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevin Crowell, PNNL
2014-07-21
LIQUID (Lipid Informed Quantitation and Identification) is a software program that has been developed to enable users to conduct both informed and high-throughput global liquid chromatography-tandem mass spectrometry (LC-MS/MS)-based lipidomics analysis. This newly designed desktop application can quickly identify and quantify lipids from LC-MS/MS datasets while providing a friendly graphical user interface for users to fully explore the data. Informed data analysis simply involves the user specifying an electrospray ionization mode, lipid common name (i.e. PE(16:0/18:2)), and associated charge carrier. A stemplot of the isotopic profile and a line plot of the extracted ion chromatogram are also provided to show the MS-level evidence of the identified lipid. In addition to plots, other information such as intensity, mass measurement error, and elution time are also provided. Typically, a global analysis for 15,000 lipid targets
Superiority of terahertz over infrared transmission through bandages and burn wound ointments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suen, Jonathan Y., E-mail: j.suen@duke.edu; Padilla, Willie J.
Terahertz electromagnetic waves have long been proposed to be ideal for spectroscopy and imaging through non-polar dielectric materials that contain no water. Terahertz radiation may thus be useful for monitoring burn and wound injury recovery, as common care treatments involve application of both a clinical dressing and topical ointment. Here, we investigate the optical properties of typical care treatments in the millimeter wave (150–300 GHz), terahertz (0.3–3 THz), and infrared (14.5–0.67 μm) ranges of the electromagnetic spectrum. We find that THz radiation realizes low absorption coefficients and high levels of transmission compared to infrared wavelengths, which were strongly attenuated. Terahertz imaging can enable safe, non-ionizing, noninvasive monitoring of the healing process directly through clinical dressings and recovery ointments, minimizing the frequency of dressing changes and thus increasing the rate of the healing process.
A sequential solution for anisotropic total variation image denoising with interval constraints
NASA Astrophysics Data System (ADS)
Xu, Jingyan; Noo, Frédéric
2017-09-01
We show that two problems involving the anisotropic total variation (TV) and interval constraints on the unknown variables admit, under some conditions, a simple sequential solution. Problem 1 is a constrained TV penalized image denoising problem; problem 2 is a constrained fused lasso signal approximator. The sequential solution entails finding first the solution to the unconstrained problem, and then applying a thresholding to satisfy the constraints. If the interval constraints are uniform, this sequential solution solves problem 1. If the interval constraints furthermore contain zero, the sequential solution solves problem 2. Here uniform interval constraints refer to all unknowns being constrained to the same interval. A typical example of application is image denoising in x-ray CT, where the image intensities are non-negative as they physically represent linear attenuation coefficient in the patient body. Our results are simple yet seem unknown; we establish them using the Karush-Kuhn-Tucker conditions for constrained convex optimization.
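The two-step recipe in the abstract, solve the unconstrained problem first and then threshold onto the interval, can be sketched as follows. The `moving_average` function below is a hypothetical stand-in for the unconstrained denoiser, not the paper's TV solver; only the projection step mirrors the sequential solution described.

```python
def clip_to_interval(u, lo, hi):
    """Project every value onto the same (uniform) interval [lo, hi]."""
    return [min(max(x, lo), hi) for x in u]

def moving_average(y, w=3):
    """Hypothetical stand-in for the unconstrained denoiser (NOT a TV solver)."""
    n = len(y)
    out = []
    for i in range(n):
        window = y[max(0, i - w // 2):min(n, i + w // 2 + 1)]
        out.append(sum(window) / len(window))
    return out

# Sequential solution: denoise without constraints, then threshold.
# Example: non-negative intensities bounded by 1, as in x-ray CT images.
noisy = [-0.2, 0.1, 1.3, 0.9, 1.1, 0.4, -0.1]
u = moving_average(noisy)
u_constrained = clip_to_interval(u, 0.0, 1.0)
```

The key point of the paper is that, under the stated conditions, this cheap projection is not an approximation: it recovers the exact solution of the constrained problem.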
Controlled nanostructure formation by ultrafast laser pulses for color marking.
Dusser, B; Sagan, Z; Soder, H; Faure, N; Colombier, J P; Jourlin, M; Audouard, E
2010-02-01
Precise nanostructuring of surfaces, and the resulting improvement in material properties, is a significant outcome of ultrafast laser irradiation. Material characteristics can be designed on mesoscopic scales, carrying new optical properties. In this work we demonstrate the possibility of achieving material modifications with ultrashort pulses, via polarization-dependent structure generation, that produce specific color patterns. These oriented nanostructures created on the metal surface, called ripples, are typically smaller than the laser wavelength and in the range of the visible spectrum. In this way, a complex colorization process of the material, involving imprinting, calibration, and reading, has been performed to associate a priori defined colors. This new method, based on control of the laser-driven nanostructure orientation, allows a high quantity of information to be accumulated on a minimal surface, suggesting new applications for laser marking and new types of identifying codes.
Techniques for Investigating Molecular Toxicology of Nanomaterials.
Wang, Yanli; Li, Chenchen; Yao, Chenjie; Ding, Lin; Lei, Zhendong; Wu, Minghong
2016-06-01
Nanotechnology has been a rapidly developing field in the past few decades, resulting in ever-greater human exposure to nanomaterials. The increased application of nanomaterials for industrial, commercial, and everyday purposes, such as fillers, catalysts, semiconductors, paints, cosmetic additives, and drug carriers, has caused both obvious and potential impacts on human health and the environment. Nanotoxicology, which studies the safety of nanomaterials, has grown up in response. Molecular toxicology is a new subdiscipline that studies the interactions and impacts of materials at the molecular level. To better relate molecular toxicology to nanomaterials, this review summarizes the typical techniques and methods of molecular toxicology that are applied when investigating the toxicology of nanomaterials, grouped into six categories: genetic mutation detection, gene expression analysis, DNA damage detection, chromosomal aberration analysis, proteomics, and metabolomics. Each category involves several experimental techniques and methods.
NASA Technical Reports Server (NTRS)
Holt, James B.; Monk, Timothy S.
2009-01-01
Propellant Mass Fraction (pmf) calculation methods vary throughout the aerospace industry. While typically used as a means of comparison between candidate launch vehicle designs, the actual pmf calculation method varies slightly from one entity to another. It is the purpose of this paper to present various methods used to calculate the pmf of launch vehicles. This includes fundamental methods of pmf calculation that consider only the total propellant mass and the dry mass of the vehicle; more involved methods that consider the residuals, reserves and any other unusable propellant remaining in the vehicle; and calculations excluding large mass quantities such as the installed engine mass. Finally, a historical comparison is made between launch vehicles on the basis of the differing calculation methodologies, while the unique mission and design requirements of the Ares V Earth Departure Stage (EDS) are examined in terms of impact to pmf.
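The fundamental definition mentioned above is simply total propellant mass over total vehicle mass; the variants differ in what they subtract. A minimal sketch follows, where the usable-propellant convention is one plausible reading of the text, not any organization's official method:

```python
def pmf_basic(m_propellant, m_dry):
    """Fundamental pmf: total propellant mass over total (wet) vehicle mass."""
    return m_propellant / (m_propellant + m_dry)

def pmf_usable(m_propellant, m_dry, m_residuals=0.0, m_reserves=0.0):
    """Variant counting only usable propellant in the numerator.
    Residuals and reserves remain aboard, so they stay in the denominator.
    (One plausible convention; entities differ, as the paper notes.)"""
    usable = m_propellant - m_residuals - m_reserves
    return usable / (m_propellant + m_dry)

# Example: 90 t of propellant, 10 t dry, 5 t residuals, 5 t reserves.
# pmf_basic -> 0.9, pmf_usable -> 0.8
```

The spread between the two numbers for the same vehicle is exactly why cross-entity pmf comparisons require knowing the calculation method.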
Ten-minute analysis of drugs and metabolites in saliva by surface-enhanced Raman spectroscopy
NASA Astrophysics Data System (ADS)
Shende, Chetan; Inscore, Frank; Maksymiuk, Paul; Farquharson, Stuart
2005-11-01
Rapid analysis of drugs in emergency room overdose patients is critical to selecting appropriate medical care. Saliva analysis has long been considered an attractive alternative to blood plasma analysis for this application. However, current clinical laboratory analysis methods involve extensive sample extraction followed by gas chromatography and mass spectrometry, and typically require as much as one hour to perform. In an effort to overcome this limitation we have been investigating metal-doped sol-gels to both separate drugs and their metabolites from saliva and generate surface-enhanced Raman spectra. We have incorporated the sol-gel in a disposable lab-on-a-chip format, and generally no more than a drop of sample is required. The detailed molecular vibrational information allows chemical identification, while the increase in Raman scattering by six orders of magnitude or more allows detection of microg/mL concentrations. Measurements of cocaine, its metabolite benzoylecgonine, and several barbiturates are presented.
Zimmerman, Gregory R.
1994-01-01
Carpal tunnel syndrome is a neuropathy resulting from compression of the median nerve as it passes through a narrow tunnel in the wrist on its way to the hand. The lack of precise objective and clinical tests, along with symptoms that overlap with other syndromes of the upper extremity, causes carpal tunnel syndrome to appear to be a rare entity in athletics. However, it should not be ruled out as a possible etiology of upper extremity paralysis in the athlete. More typically, carpal tunnel syndrome is the most common peripheral entrapment neuropathy encountered in industry. Treatment may include rest and/or splinting of the involved wrist; ice application, galvanic stimulation, or iontophoresis to reduce inflammation; and then a transition to heat modalities and therapeutic exercises for developing flexibility, strength, and endurance. In addition, an ergonomic assessment should be conducted, resulting in modifications to accommodate the carpal tunnel syndrome patient. PMID:16558255
Aerodynamic design of electric and hybrid vehicles: A guidebook
NASA Technical Reports Server (NTRS)
Kurtz, D. W.
1980-01-01
A typical present-day subcompact electric hybrid vehicle (EHV), operating on an SAE J227a D driving cycle, consumes up to 35% of its road energy requirement overcoming aerodynamic resistance. The application of an integrated system design approach, where drag reduction is an important design parameter, can increase the cycle range by more than 15%. This guidebook highlights a logic strategy for including aerodynamic drag reduction in the design of electric and hybrid vehicles to the degree appropriate to the mission requirements. Backup information and procedures are included in order to implement the strategy. Elements of the procedure are based on extensive wind tunnel tests involving generic subscale models and full-scale prototype EHVs. The user need not have any previous aerodynamic background. By necessity, the procedure utilizes many generic approximations and assumptions resulting in various levels of uncertainty. Dealing with these uncertainties, however, is a key feature of the strategy.
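The aerodynamic resistance the guidebook targets follows the standard drag power law, P = ½ ρ C_d A v³. This is textbook physics rather than a formula quoted from the guidebook, but it shows why drag dominates road load at highway speed (cubic in velocity):

```python
def aero_drag_power(v_mps, cd, frontal_area_m2, rho=1.225):
    """Aerodynamic drag power in watts: P = 0.5 * rho * Cd * A * v^3.
    rho defaults to sea-level air density (kg/m^3)."""
    return 0.5 * rho * cd * frontal_area_m2 * v_mps ** 3

# Illustrative subcompact-like numbers (assumed, not from the guidebook):
# Cd = 0.5, A = 2.0 m^2, at 10 m/s -> 612.5 W; halving Cd halves this power.
```

Because the power grows as v³, a modest drag-coefficient reduction yields the disproportionate cycle-range gains the guidebook describes.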
Sharma, Pankaj; Liu, Rai-Shung
2015-03-16
A one-pot, two-step synthesis of α-O-, S-, and N-substituted 4-methylquinoline derivatives through Cu-catalyzed aerobic oxidations of N-hydroxyaminoallenes with alcohols, thiols, and amines is described. This reaction sequence involves an initial oxidation of N-hydroxyaminoallenes with NuH (Nu = OH, OR, NHR, and SR) to form 3-substituted 2-en-1-ones, followed by Brønsted acid catalyzed intramolecular cyclizations of the resulting products. Our mechanistic analysis suggests that the reactions proceed through a radical-type mechanism rather than a typical nitrone-intermediate route. The utility of this new Cu-catalyzed reaction is shown by its applicability to the synthesis of several 2-amino-4-methylquinoline derivatives, which are known to be key precursors to several bioactive molecules. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A multitasking general executive for compound continuous tasks.
Salvucci, Dario D
2005-05-06
As cognitive architectures move to account for increasingly complex real-world tasks, one of the most pressing challenges involves understanding and modeling human multitasking. Although a number of existing models now perform multitasking in real-world scenarios, these models typically employ customized executives that schedule tasks for the particular domain but do not generalize easily to other domains. This article outlines a general executive for the Adaptive Control of Thought-Rational (ACT-R) cognitive architecture that, given independent models of individual tasks, schedules and interleaves the models' behavior into integrated multitasking behavior. To demonstrate the power of the proposed approach, the article describes an application to the domain of driving, showing how the general executive can interleave component subtasks of the driving task (namely, control and monitoring) and interleave driving with in-vehicle secondary tasks (radio tuning and phone dialing). 2005 Lawrence Erlbaum Associates, Inc.
Associations among multiple markers and complex disease: models, algorithms, and applications.
Assimes, Themistocles L; Olshen, Adam B; Narasimhan, Balasubramanian; Olshen, Richard A
2008-01-01
This chapter is a report on collaborations among its authors and others over many years. It devolves from our goal of understanding genes, their main and epistatic effects, combined with interactions involving demographic and environmental features, as together they predict genetically complex diseases. Thus, our goal is "association." Particular phenotypes of interest to us are hypertension, insulin resistance, angina, and myocardial infarction. Prediction of complex disease is notoriously difficult, though it would be made easier were we given strand-specific information on genotype. Unfortunately, with current technology, genotypic information comes to us "unphased." While obviously we have strand-specific information when a genotype is homozygous, we do not have such information when a genotype is heterozygous. To summarize, the ultimate goal of the approaches we provide is to predict phenotype, typically untoward or not, within a specific window of time. Our approach is neither through linkage nor from finding haplotype frequencies per se.
Sequential geophysical and flow inversion to characterize fracture networks in subsurface systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mudunuru, Maruti Kumar; Karra, Satish; Makedonska, Nataliia
Subsurface applications, including geothermal, geological carbon sequestration, and oil and gas, typically involve maximizing either the extraction of energy or the storage of fluids. Fractures form the main pathways for flow in these systems, and locating these fractures is critical for predicting flow. However, fracture characterization is a highly uncertain process, and data from multiple sources, such as flow and geophysical measurements, are needed to reduce this uncertainty. We present a nonintrusive, sequential inversion framework for integrating data from geophysical and flow sources to constrain fracture networks in the subsurface. In this framework, we first estimate bounds on the statistics for the fracture orientations using microseismic data. These bounds are estimated through a combination of focal mechanism (physics-based) and clustering (statistical) analyses of seismic data. Then, the fracture lengths are constrained using flow data. In conclusion, the efficacy of this inversion is demonstrated through a representative example.
Burgess, Darren J
2017-04-01
Research describing load-monitoring techniques for team sport is plentiful. Much of this research is conducted retrospectively and typically involves recreational or semielite teams. Load-monitoring research conducted on professional team sports is largely observational. Challenges exist for the practitioner in implementing peer-reviewed research into the applied setting. These challenges include match scheduling, player adherence, manager/coach buy-in, sport traditions, and staff availability. External-load monitoring often attracts questions surrounding technology reliability and validity, while internal-load monitoring makes some assumptions about player adherence, as well as having some uncertainty around the impact these measures have on player performance. This commentary outlines examples of load-monitoring research, discusses the issues associated with the application of this research in an elite team-sport setting, and suggests practical adjustments to the existing research where necessary.
Murfee, Walter L.; Sweat, Richard S.; Tsubota, Ken-ichi; Gabhann, Feilim Mac; Khismatullin, Damir; Peirce, Shayn M.
2015-01-01
Microvascular network remodelling is a common denominator for multiple pathologies and involves both angiogenesis, defined as the sprouting of new capillaries, and network patterning associated with the organization and connectivity of existing vessels. Much of what we know about microvascular remodelling at the network, cellular and molecular scales has been derived from reductionist biological experiments, yet what happens when the experiments provide incomplete (or only qualitative) information? This review will emphasize the value of applying computational approaches to advance our understanding of the underlying mechanisms and effects of microvascular remodelling. Examples of individual computational models applied to each of the scales will highlight the potential of answering specific questions that cannot be answered using typical biological experimentation alone. Looking into the future, we will also identify the needs and challenges associated with integrating computational models across scales. PMID:25844149
Fused electron deficient semiconducting polymers for air stable electron transport.
Onwubiko, Ada; Yue, Wan; Jellett, Cameron; Xiao, Mingfei; Chen, Hung-Yang; Ravva, Mahesh Kumar; Hanifi, David A; Knall, Astrid-Caroline; Purushothaman, Balaji; Nikolka, Mark; Flores, Jean-Charles; Salleo, Alberto; Bredas, Jean-Luc; Sirringhaus, Henning; Hayoz, Pascal; McCulloch, Iain
2018-01-29
Conventional semiconducting polymer synthesis typically involves transition metal-mediated coupling reactions that link aromatic units with single bonds along the backbone. Rotation around these bonds contributes to conformational and energetic disorder and therefore potentially limits charge delocalisation, whereas the use of transition metals presents difficulties for sustainability and application in biological environments. Here we show that a simple aldol condensation reaction can prepare polymers where double bonds lock-in a rigid backbone conformation, thus eliminating free rotation along the conjugated backbone. This polymerisation route requires neither organometallic monomers nor transition metal catalysts and offers a reliable design strategy to facilitate delocalisation of frontier molecular orbitals, elimination of energetic disorder arising from rotational torsion and allowing closer interchain electronic coupling. These characteristics are desirable for high charge carrier mobilities. Our polymers with a high electron affinity display long wavelength NIR absorption with air stable electron transport in solution processed organic thin film transistors.
Probabilistic self-organizing maps for continuous data.
Lopez-Rubio, Ezequiel
2010-10-01
The original self-organizing feature map did not define any probability distribution on the input space. However, the advantages of introducing probabilistic methodologies into self-organizing map models were soon evident. This has led to a wide range of proposals which reflect the current emergence of probabilistic approaches to computational intelligence. The underlying estimation theories behind them derive from two main lines of thought: the expectation maximization methodology and stochastic approximation methods. Here, we present a comprehensive view of the state of the art, with a unifying perspective of the involved theoretical frameworks. In particular, we examine the most commonly used continuous probability distributions, self-organization mechanisms, and learning schemes. Special emphasis is given to the connections among them and their relative advantages depending on the characteristics of the problem at hand. Furthermore, we evaluate their performance in two typical applications of self-organizing maps: classification and visualization.
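For readers unfamiliar with the baseline that the probabilistic variants extend, the classical (non-probabilistic) SOM update can be sketched in a few lines. This is the original deterministic algorithm, shown here for scalar data on a 1-D lattice; it is deliberately not one of the probabilistic models the review surveys:

```python
import math
import random

def train_som(data, n_units=4, epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal classical 1-D self-organizing map for scalar inputs."""
    rng = random.Random(seed)
    lo, hi = min(data), max(data)
    weights = [rng.uniform(lo, hi) for _ in range(n_units)]
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)                  # learning rate decays
        sigma = max(sigma0 * (1.0 - t / epochs), 0.5)  # neighbourhood shrinks
        for x in data:
            # Best-matching unit: the lattice node whose weight is closest.
            bmu = min(range(n_units), key=lambda i: abs(weights[i] - x))
            for i in range(n_units):
                # Gaussian neighbourhood on the lattice pulls nearby units too.
                h = math.exp(-((i - bmu) ** 2) / (2.0 * sigma ** 2))
                weights[i] += lr * h * (x - weights[i])
    return weights
```

The probabilistic reformulations discussed in the paper replace this heuristic update with density estimation (e.g., expectation maximization over mixture components), which is what endows the map with a proper distribution on the input space.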
Fracture surfaces of granular pastes.
Mohamed Abdelhaye, Y O; Chaouche, M; Van Damme, H
2013-11-01
Granular pastes are dense dispersions of non-colloidal grains in a simple or a complex fluid. Typical examples are the coating, gluing or sealing mortars used in building applications. We study the cohesive rupture of thick mortar layers in a simple pulling test where the paste is initially confined between two flat surfaces. After hardening, the morphology of the fracture surfaces was investigated, using either the box counting method to analyze fracture profiles perpendicular to the mean fracture plane, or the slit-island method to analyze the islands obtained by cutting the fracture surfaces at different heights, parallel to the mean fracture plane. The fracture surfaces were shown to exhibit scaling properties over several decades. However, contrary to what has been observed in the brittle or ductile fracture of solid materials, the islands were shown to be mass fractals. This was related to the extensive plastic flow involved in the fracture process.
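The box-counting analysis mentioned above has a generic form: count the boxes of side ε occupied by the object at several scales and fit the slope of log N(ε) against log(1/ε). The sketch below is that generic method for a 2-D point cloud, not the authors' exact fracture-profile pipeline:

```python
import math

def box_count(points, eps):
    """Count boxes of side eps occupied by a set of 2-D points."""
    return len({(math.floor(x / eps), math.floor(y / eps)) for x, y in points})

def box_dimension(points, epsilons):
    """Least-squares slope of log N(eps) versus log(1/eps)."""
    xs = [math.log(1.0 / e) for e in epsilons]
    ys = [math.log(box_count(points, e)) for e in epsilons]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

For a mass fractal, as the islands here were shown to be, the analogous count of occupied boxes containing material scales with a non-integer exponent, which is what the slope estimates.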
Restoration of out-of-focus images based on circle of confusion estimate
NASA Astrophysics Data System (ADS)
Vivirito, Paolo; Battiato, Sebastiano; Curti, Salvatore; La Cascia, M.; Pirrone, Roberto
2002-11-01
In this paper a new method for fast out-of-focus blur estimation and restoration is proposed. It is suitable for CFA (Color Filter Array) images acquired by typical CCD/CMOS sensors. The method is based on the analysis of a single image and consists of two steps: 1) out-of-focus blur estimation via Bayer pattern analysis; 2) image restoration. Blur estimation is based on a block-wise edge detection technique. This edge detection is carried out on the green pixels of the CFA sensor image, also called the Bayer pattern. Once the blur level has been estimated, the image is restored through the application of a new inverse filtering technique. This algorithm yields sharp images, reducing ringing and crispening artifacts over a wider frequency range. Experimental results show the effectiveness of the method, both subjectively and numerically, by comparison with other techniques found in the literature.
Face recognition using an enhanced independent component analysis approach.
Kwak, Keun-Chang; Pedrycz, Witold
2007-03-01
This paper is concerned with an enhanced independent component analysis (ICA) and its application to face recognition. Typically, face representations obtained by ICA involve unsupervised learning and high-order statistics. In this paper, we develop an enhancement of the generic ICA by augmenting this method by the Fisher linear discriminant analysis (LDA); hence, its abbreviation, FICA. The FICA is systematically developed and presented along with its underlying architecture. A comparative analysis explores four distance metrics, as well as classification with support vector machines (SVMs). We demonstrate that the FICA approach leads to the formation of well-separated classes in low-dimension subspace and is endowed with a great deal of insensitivity to large variation in illumination and facial expression. The comprehensive experiments are completed for the facial-recognition technology (FERET) face database; a comparative analysis demonstrates that FICA comes with improved classification rates when compared with some other conventional approaches such as eigenface, fisherface, and the ICA itself.
Computers and the design of ion beam optical systems
NASA Astrophysics Data System (ADS)
White, Nicholas R.
Advances in microcomputers have made it possible to maintain a library of advanced ion optical programs that run on inexpensive computer hardware and are suitable for the design of a variety of ion beam systems, including ion implanters, giving excellent results. This paper outlines the steps typically involved in designing a complete ion beam system for materials modification applications. Two computer programs are described which, although based largely on algorithms that have been in use for many years, make detailed beam optical calculations possible on microcomputers, specifically the IBM PC. OPTICIAN is an interactive first-order program for tracing beam envelopes through complex optical systems. SORCERY is a versatile program for solving Laplace's and Poisson's equations by finite difference methods using successive over-relaxation. Ion and electron trajectories can be traced through these potential fields, and plots of beam emittance obtained.
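The successive over-relaxation scheme that SORCERY is said to use has a classic textbook form: sweep the grid, replace each free node with the average of its four neighbours, and over-correct by a factor ω. The sketch below solves Laplace's equation on a small Dirichlet grid; the function name and grid handling are illustrative, not SORCERY's actual interface:

```python
def solve_laplace_sor(grid, fixed, omega=1.8, tol=1e-6, max_iter=10000):
    """Successive over-relaxation for Laplace's equation on a 2-D grid.
    `fixed[j][i]` marks Dirichlet cells whose potential is held constant."""
    ny, nx = len(grid), len(grid[0])
    for _ in range(max_iter):
        max_delta = 0.0
        for j in range(1, ny - 1):
            for i in range(1, nx - 1):
                if fixed[j][i]:
                    continue
                # Five-point stencil: average of the four neighbours.
                avg = 0.25 * (grid[j - 1][i] + grid[j + 1][i]
                              + grid[j][i - 1] + grid[j][i + 1])
                delta = omega * (avg - grid[j][i])  # over-relaxed correction
                grid[j][i] += delta
                max_delta = max(max_delta, abs(delta))
        if max_delta < tol:  # converged
            break
    return grid
```

With 1 < ω < 2 the iteration converges far faster than plain Gauss-Seidel (ω = 1), which is why SOR suited the microcomputers of the era.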
Minimalistic Liquid-Assisted Route to Highly Crystalline α-Zirconium Phosphate.
Cheng, Yu; Wang, Xiaodong Tony; Jaenicke, Stephan; Chuah, Gaik-Khuan
2017-08-24
Zirconium phosphates have potential applications in areas of ion exchange, catalysis, photochemistry, and biotechnology. However, synthesis methodologies to form crystalline α-zirconium phosphate (Zr(HPO4)2·H2O) typically involve the use of excess phosphoric acid, addition of HF or oxalic acid, and long reflux times or hydrothermal conditions. A minimalistic sustainable route to its synthesis has been developed by using only zirconium oxychloride and concentrated phosphoric acid to form highly crystalline α-zirconium phosphate within hours. The morphology can be changed from platelets to rod-shaped particles by fluoride addition. By varying the temperature and time, α-zirconium phosphate with particle sizes from nanometers to microns can be obtained. Key features of this minimal solvent synthesis are the excellent yields obtained with high atom economy under mild conditions and ease of scalability. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Stimulant Paste Preparation and Bark Streak Tapping Technique for Pine Oleoresin Extraction.
Füller, Thanise Nogueira; de Lima, Júlio César; de Costa, Fernanda; Rodrigues-Corrêa, Kelly C S; Fett-Neto, Arthur G
2016-01-01
Tapping technique comprises the extraction of pine oleoresin, a non-wood forest product consisting of a complex mixture of mono, sesqui, and diterpenes biosynthesized and exuded as a defense response to wounding. Oleoresin is used to produce gum rosin, turpentine, and their multiple derivatives. Oleoresin yield and quality are objects of interest in pine tree biotechnology, both in terms of environmental and genetic control. Monitoring these parameters in individual trees grown in the field provides a means to examine the control of terpene production in resin canals, as well as the identification of genetic-based differences in resinosis. A typical method of tapping involves the removal of bark and application of a chemical stimulant on the wounded area. Here we describe the methods for preparing the resin-stimulant paste with different adjuvants, as well as the bark streaking process in adult pine trees.
Ishii, Jun; Fukuda, Nobuo; Tanaka, Tsutomu; Ogino, Chiaki; Kondo, Akihiko
2010-05-01
For elucidating protein–protein interactions, many methodologies have been developed during the past two decades. For investigation of interactions inside cells under physiological conditions, yeast is an attractive organism with which to quickly screen for promising candidates using versatile genetic technologies, and various types of approaches are now available. Among them, a variety of unique systems using the guanine nucleotide-binding protein (G-protein) signaling pathway in yeast have been established to investigate the interactions of proteins for biological study and pharmaceutical research. G-proteins involved in various cellular processes are mainly divided into two groups: small monomeric G-proteins, and heterotrimeric G-proteins. In this minireview, we summarize the basic principles and applications of yeast-based screening systems using these two types of G-protein, which are typically used for elucidating biological protein interactions but are differentiated from traditional yeast two-hybrid systems.
Bridging different perspectives of the physiological and mathematical disciplines.
Batzel, Jerry Joseph; Hinghofer-Szalkay, Helmut; Kappel, Franz; Schneditz, Daniel; Kenner, Thomas; Goswami, Nandu
2012-12-01
The goal of this report is to discuss educational approaches for bridging the different perspectives of the physiological and mathematical disciplines. These approaches can enhance the learning experience for physiology, medical, and mathematics students and simultaneously act to stimulate mathematical/physiological/clinical interdisciplinary research. While physiology education incorporates mathematics, via equations and formulas, it does not typically provide a foundation for interdisciplinary research linking mathematics and physiology. Here, we provide insights and ideas derived from interdisciplinary seminars involving mathematicians and physiologists that have been conducted over the last decade. The approaches described here can be used as templates for giving physiology and medical students insights into how sophisticated tools from mathematics can be applied and how the disciplines of mathematics and physiology can be integrated in research, thereby fostering a foundation for interdisciplinary collaboration. These templates are equally applicable to linking mathematical methods with other life and health sciences in the educational process.
Khalaj, Mohammadreza; Kamali, Mohammadreza; Khodaparast, Zahra; Jahanshahi, Akram
2018-02-01
Synthesis of various types of engineered nanomaterials has gained considerable attention in recent years for diverse applications. Copper-based nanomaterials, one branch of this category, appear able to provide an efficient and cost-effective way to treat persistent effluents. The present work aimed to study the various parameters that may be involved in the overall performance of copper-based nanomaterials for environmental clean-up purposes. To this end, the relevant characteristics of copper-based nanomaterials, their effects on nanomaterial reactivity, and the environmental and operating parameters have been critically reviewed. Toxicological study of copper-based nanomaterials has also been considered a factor of high importance for the selection of a nanomaterial with optimum performance and minimal subsequent environmental and health effects. Copyright © 2017 Elsevier Inc. All rights reserved.
Remote measurement of surface roughness, surface reflectance, and body reflectance with LiDAR.
Li, Xiaolu; Liang, Yu
2015-10-20
Light detection and ranging (LiDAR) intensity data are attracting increasing attention because of the great potential for use of such data in a variety of remote sensing applications. To fully investigate the data potential for target classification and identification, we carried out a series of experiments with typical urban building materials and employed our reconstructed built-in-lab LiDAR system. Received intensity data were analyzed on the basis of the derived bidirectional reflectance distribution function (BRDF) model and the established integration method. With an improved fitting algorithm, parameters involved in the BRDF model can be obtained to depict the surface characteristics. One of these parameters related to surface roughness was converted to a most used roughness parameter, the arithmetical mean deviation of the roughness profile (Ra), which can be used to validate the feasibility of the BRDF model in surface characterizations and performance evaluations.
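The conversion target mentioned, Ra, has a simple standard definition: the arithmetic mean of the absolute deviations of the height profile from its mean line. A sketch of that definition follows (the textbook formula, not the paper's BRDF-parameter conversion itself):

```python
def arithmetic_mean_roughness(profile):
    """Ra: mean absolute deviation of a sampled height profile (any units)
    from its mean line, per the standard surface-texture definition."""
    mean = sum(profile) / len(profile)
    return sum(abs(z - mean) for z in profile) / len(profile)

# Example: a square-wave profile of amplitude 1 has Ra = 1.
```

Expressing the fitted BRDF roughness parameter as Ra lets the optical estimate be checked directly against contact profilometer readings.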
Microfluidic Controlled Conformal Coating of Particles
NASA Astrophysics Data System (ADS)
Tsai, Scott; Wexler, Jason; Wan, Jiandi; Stone, Howard
2011-11-01
Coating flows are an important class of fluid mechanics problems. Typically a substrate is coated with a moving continuous film, but it is also possible to consider coating of discrete objects. In particular, in applications involving coating of particles that are useful in drug delivery, the coatings act as drug-carrying vehicles, while in cell therapy a thin polymeric coating is required to protect the cells from the host's immune system. Although many functional capabilities have been developed for lab-on-a-chip devices, a technique for coating has not been demonstrated. We present a microfluidic platform developed to coat micron-size spheres with a thin aqueous layer by magnetically pulling the particles from the aqueous phase to the non-aqueous phase in a co-flow. Coating thickness can be adjusted by the average fluid speed and the number of beads encapsulated inside a single coat is tuned by the ratio of magnetic to interfacial forces acting on the beads.
Theoretical investigations of X-ray bursts
NASA Technical Reports Server (NTRS)
Taam, Ronald E.
1987-01-01
Current theoretical understanding of the X-ray burst phenomenon is reviewed, providing a framework in which the burst radiation can be used as a diagnostic of the fundamental properties of the underlying neutron star. The typical Type I X-ray burst is detected as a rapid increase in emission to a level about a factor of 10 above that seen during the quiescent state and recurs on time scales which range from several hours to several days. The thermonuclear flash model has successfully reproduced the basic features of the X-ray burst phenomenon and thereby provided strong theoretical evidence that neutron stars are involved. Topics covered include: theory of the emission spectrum; oscillation modes and prospects for diagnosing the thermal state of neutron stars through experiments on board the X-Ray Timing Explorer or the Advanced X-Ray Astrophysics Facility; applications to the mass and radius of a neutron star.
An overview of smart grid routing algorithms
NASA Astrophysics Data System (ADS)
Wang, Junsheng; OU, Qinghai; Shen, Haijuan
2017-08-01
This paper summarizes typical routing algorithms for the smart grid by analyzing the communication services and communication requirements of an intelligent grid. Two classes of routing algorithm are analyzed, chiefly clustering routing algorithms, and the advantages, disadvantages, and applicability of each class are discussed.
ERIC Educational Resources Information Center
Gump, Steven E.
2007-01-01
General education classrooms provide a common milieu for understanding and appropriating results of classroom research projects, which are typically viewed as having little application outside their original contexts. Here, results of an investigation into the "sophomore slump," where grades and class attendance rates typically suffer, are…
... usually involves taking prescription hormones. This can include hydrocortisone, prednisone, or cortisone acetate. If your body is ... treatment typically consists of intravenous (IV) injections of hydrocortisone, saline (salt water), and dextrose (sugar). These injections ...
Atypical Pityriasis rosea in a black child: a case report
2009-01-01
Introduction Pityriasis rosea is a self-limited inflammatory condition of the skin that mostly affects healthy children and adolescents. Atypical cases of Pityriasis rosea are fairly common and less readily recognized than typical eruptions, and may pose a diagnostic challenge. Case presentation We report the case of a 12-year-old black child who developed an intensely pruritic papular eruption with marked facial involvement, which was diagnosed as Pityriasis rosea and resolved after five weeks, leaving slight hyperpigmentation. Conclusion Facial and scalp involvement, post-inflammatory disorders of pigmentation, and papular lesions are characteristics typically associated with black patients with Pityriasis rosea. Knowledge of the features found more frequently in dark-skinned populations may help physicians diagnose atypical Pityriasis rosea in these patients. PMID:20181179
Peterson, Candida C; Garnett, Michelle; Kelly, Adrian; Attwood, Tony
2009-02-01
Children with autism-spectrum disorders (ASD) often fail laboratory false-belief tests of theory of mind (ToM). Yet how this impacts on their everyday social behavior is less clear, partly owing to uncertainty over which specific everyday conversational and social skills require ToM understanding. A new caregiver-report scale of these everyday applications of ToM was developed and validated in two studies. Study 1 obtained parent ratings of 339 children (85 with autism; 230 with Asperger's; 24 typically-developing) on the new scale and results revealed (a) that the scale had good psychometric properties and (b) that children with ASD had significantly more everyday mindreading difficulties than typical developers. In Study 2, we directly tested links between laboratory ToM and everyday mindreading using teacher ratings on the new scale. The sample of 25 children included 15 with autism and 10 typical developers aged 5-12 years. Children in both groups who passed laboratory ToM tests had fewer everyday mindreading difficulties than those of the same diagnosis who failed. Yet, intriguingly, autistic ToM-passers still had more problems with everyday mindreading than younger typically-developing ToM-failers. The possible roles of family conversation and peer interaction, along with ToM, in everyday social functioning were considered.
Multiple Cranial Nerve Palsies in Giant Cell Arteritis.
Ross, Michael; Bursztyn, Lulu; Superstein, Rosanne; Gans, Mark
2017-12-01
Giant cell arteritis (GCA) is a systemic vasculitis of medium and large arteries often with ophthalmic involvement, including ischemic optic neuropathy, retinal artery occlusion, and ocular motor cranial nerve palsies. This last complication occurs in 2%-15% of patients, but typically involves only 1 cranial nerve. We present 2 patients with biopsy-proven GCA associated with multiple cranial nerve palsies.
Jeffrey, Jennifer; Whelan, Jodie; Pirouz, Dante M; Snowdon, Anne W
2016-07-01
Campaigns advocating behavioural changes often employ social norms as a motivating technique, favouring injunctive norms (what is typically approved or disapproved) over descriptive norms (what is typically done). Here, we investigate an upside to including descriptive norms in health and safety appeals. Because descriptive norms are easy to process and understand, they should provide a heuristic to guide behaviour in those individuals who lack the interest or motivation to reflect on the advocated behaviour more deeply. When those descriptive norms are positive - suggesting that what is done is consistent with what ought to be done - including them in campaigns should be particularly beneficial at influencing this low-involvement segment. We test this proposition via research examining booster seat use amongst parents with children of booster seat age, and find that incorporating positive descriptive norms into a related campaign is particularly impactful for parents who report low involvement in the topic of booster seat safety. Descriptive norms are easy to state and easy to understand, and our research suggests that these norms resonate with low involvement individuals. As a result, we recommend incorporating descriptive norms when possible into health and safety campaigns. Copyright © 2016. Published by Elsevier Ltd.
Optimal neighborhood indexing for protein similarity search.
Peterlongo, Pierre; Noé, Laurent; Lavenier, Dominique; Nguyen, Van Hoa; Kucherov, Gregory; Giraud, Mathieu
2008-12-16
Similarity inference, one of the main bioinformatics tasks, has to face an exponential growth of the biological data. A classical approach used to cope with this data flow involves heuristics with large seed indexes. In order to speed up this technique, the index can be enhanced by storing additional information to limit the number of random memory accesses. However, this improvement leads to a larger index that may become a bottleneck. In the case of protein similarity search, we propose to decrease the index size by reducing the amino acid alphabet. The paper presents two main contributions. First, we show that an optimal neighborhood indexing combining an alphabet reduction and a longer neighborhood leads to a 35% reduction in the memory involved in the process, without sacrificing the quality of the results or the computational time. Second, our approach led us to develop a new kind of substitution score matrices and their associated e-value parameters. In contrast to usual matrices, these matrices are rectangular since they compare amino acid groups from different alphabets. We describe the method used for computing those matrices and we provide some typical examples that can be used in such comparisons. Supplementary data can be found on the website http://bioinfo.lifl.fr/reblosum. We propose a practical index size reduction of the neighborhood data that does not negatively affect the performance of large-scale search in protein sequences. Such an index can be used in any study involving large protein data. Moreover, rectangular substitution score matrices and their associated statistical parameters can have applications in any study involving an alphabet reduction.
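The alphabet-reduction idea can be illustrated with a minimal sketch: re-encode sequences over a smaller alphabet, then index short seeds over the reduced encoding, so fewer distinct seeds exist and the index shrinks. The four groups below are a common illustrative partition of the amino acids, not the reduction used in the paper, and `build_index` is a hypothetical helper, not the authors' implementation.

```python
# Illustrative 4-group reduction of the 20 amino acids (an assumption, not
# the grouping from the paper).
REDUCTION = {
    "hydrophobic": "AVLIMFWC",
    "polar": "STNQYH",
    "charged": "DEKR",
    "special": "GP",
}

# Map each amino acid to the first letter of its group name.
AA_TO_GROUP = {aa: name[0].upper()
               for name, members in REDUCTION.items() for aa in members}

def reduce_sequence(seq):
    """Re-encode a protein sequence over the reduced 4-letter alphabet."""
    return "".join(AA_TO_GROUP[aa] for aa in seq)

def build_index(sequences, seed_len=3):
    """Index reduced seeds -> list of (seq_id, offset). A smaller alphabet
    means fewer distinct seeds, hence a smaller index."""
    index = {}
    for sid, seq in enumerate(sequences):
        reduced = reduce_sequence(seq)
        for i in range(len(reduced) - seed_len + 1):
            index.setdefault(reduced[i:i + seed_len], []).append((sid, i))
    return index

db = ["MKTAYIAKQR", "GAVLIMKTAY"]
idx = build_index(db)
print(reduce_sequence("MKTAYIAKQR"))  # HCPHPHHCPC
```

A rectangular substitution matrix in this setting would score a plain amino acid (query side) against one of the four group letters (index side), which is why it cannot be square.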
Cortical and subcortical mechanisms of brain-machine interfaces.
Marchesotti, Silvia; Martuzzi, Roberto; Schurger, Aaron; Blefari, Maria Laura; Del Millán, José R; Bleuler, Hannes; Blanke, Olaf
2017-06-01
Technical advances in the field of Brain-Machine Interfaces (BMIs) enable users to control a variety of external devices such as robotic arms, wheelchairs, virtual entities and communication systems through the decoding of brain signals in real time. Most BMI systems sample activity from restricted brain regions, typically the motor and premotor cortex, with limited spatial resolution. Despite the growing number of applications, the cortical and subcortical systems involved in BMI control are currently unknown at the whole-brain level. Here, we provide a comprehensive and detailed report of the areas active during on-line BMI control. We recorded functional magnetic resonance imaging (fMRI) data while participants controlled an EEG-based BMI inside the scanner. We identified the regions activated during BMI control and how they overlap with those involved in motor imagery (without any BMI control). In addition, we investigated which regions reflect the subjective sense of controlling a BMI, the sense of agency for BMI-actions. Our data revealed an extended cortical-subcortical network involved in operating a motor-imagery BMI. This includes not only sensorimotor regions but also the posterior parietal cortex, the insula and the lateral occipital cortex. Interestingly, the basal ganglia and the anterior cingulate cortex were involved in the subjective sense of controlling the BMI. These results inform basic neuroscience by showing that the mechanisms of BMI control extend beyond sensorimotor cortices. This knowledge may be useful for the development of BMIs that offer a more natural and embodied feeling of control for the user. Hum Brain Mapp 38:2971-2989, 2017. © 2017 Wiley Periodicals, Inc.
Rheumatoid pseudocyst (geode) of the femoral neck without apparent joint involvement.
Morrey, B F
1987-05-01
Typically, rheumatoid cysts are associated with obvious joint involvement and are located in the subchondral portion of the adjacent joint. Giant pseudocysts (geodes) are uncommon and are characteristically associated with extensive joint destruction. The patient described in this report had a giant pseudocyst of the femoral neck but no joint involvement. To the best of my knowledge, this is the first report of such a manifestation of a giant pseudocyst. As such, it posed a somewhat difficult diagnostic problem.
Langille, Megan M.; Desai, Jay
2015-01-01
Encephalitis due to antibodies to voltage-gated potassium channel (VGKC) typically presents with limbic encephalitis and medial temporal lobe involvement on neuroimaging. We describe the case of a 13-year-old girl with encephalitis due to antibodies to VGKC, with bilateral signal changes in the cerebellar dentate nuclei and clinical features suggesting predominant cerebellar involvement. These findings have never been reported previously in the literature. Our case expands the phenotypic spectrum of this rare condition. PMID:26019428
Coated-Wire Ion Selective Electrodes and Their Application to the Teaching Laboratory.
ERIC Educational Resources Information Center
Martin, Charles R.; Freiser, Henry
1980-01-01
Describes the procedures for construction of a nitrate coated-wire ion selective electrode and suggests experiments for evaluation of electrode response and illustration of typical analytical applications of ion selective electrodes. (CS)
Applicability of ISO 16697 Data to Spacecraft Fire Fighting Strategies
NASA Technical Reports Server (NTRS)
Hirsch, David B.; Beeson, Harold D.
2012-01-01
Presentation Agenda: (1) Selected variables affecting oxygen consumption during spacecraft fires, (2) General overview of ISO 16697, (3) Estimated amounts of material consumed during combustion in typical ISS enclosures, (4) Discussion on potential applications.
Nanopyroxene Grafting with β-Cyclodextrin Monomer for Wastewater Applications.
Nafie, Ghada; Vitale, Gerardo; Carbognani Ortega, Lante; Nassar, Nashaat N
2017-12-06
Emerging nanoparticle technology provides opportunities for environmentally friendly wastewater treatment applications, including those in the large liquid tailings containments in the Alberta oil sands. In this study, we synthesize β-cyclodextrin-grafted nanopyroxenes to offer an ecofriendly platform for the selective removal of organic compounds typically present in these types of applications. We carry out computational modeling at the micro level, through molecular mechanics and molecular dynamics simulations, and laboratory experiments at the macro level to understand the interactions between the synthesized nanomaterials and two model naphthenic acid molecules (cyclopentanecarboxylic and trans-4-pentylcyclohexanecarboxylic acids) typically found in tailing ponds. The proof-of-concept computational modeling and experiments demonstrate that monomer-grafted nanopyroxenes (nano-AE) of the sodium iron silicate aegirine are promising candidates for the removal of polar organic compounds from wastewater, among other applications. These nano-AE offer new possibilities for treating tailing ponds generated by the oil sands industry.
... a life-threatening reaction called anaphylaxis (an-a-fi-LAK-sis). Symptoms of anaphylaxis typically involve more ...
Fecal microbiota transplantation and its potential therapeutic uses in gastrointestinal disorders.
Heath, Ryan D; Cockerell, Courtney; Mankoo, Ravinder; Ibdah, Jamal A; Tahan, Veysel
2018-01-01
Typical human gut flora has been well characterized in previous studies and has been noted to have significant differences when compared with the typical microbiome of various disease states involving the gastrointestinal tract. Such diseases include Clostridium difficile colitis, inflammatory bowel disease, functional bowel syndromes, and various states of liver disease. A growing number of studies have investigated the use of a fecal microbiota transplant as a potential therapy for these disease states.
2014-06-11
... down to a base pressure typically of a few 10^-11 torr using oil-free magnetically suspended turbomolecular pumps backed with dry scroll pumps. A cold finger assembled from ... on line and in situ utilizing a Faraday cup mounted inside a differentially pumped chamber on an ultrahigh-vacuum-compatible translation stage. ...
Neuroradiological findings in maple syrup urine disease
Indiran, Venkatraman; Gunaseelan, R. Emmanuel
2013-01-01
Maple syrup urine disease is a rare inborn error of amino acid metabolism involving the catabolic pathway of the branched-chain amino acids. If left untreated, the disease may cause damage to the brain and may even cause death. Patients typically present with a distinctive maple syrup odour of the sweat and urine. Here we describe a case with the relevant magnetic resonance imaging findings and confirmatory biochemical findings. PMID:23772241
Reduced-Order Models Based on POD-Tpwl for Compositional Subsurface Flow Simulation
NASA Astrophysics Data System (ADS)
Durlofsky, L. J.; He, J.; Jin, L. Z.
2014-12-01
A reduced-order modeling procedure applicable for compositional subsurface flow simulation will be described and applied. The technique combines trajectory piecewise linearization (TPWL) and proper orthogonal decomposition (POD) to provide highly efficient surrogate models. The method is based on a molar formulation (which uses pressure and overall component mole fractions as the primary variables) and is applicable for two-phase, multicomponent systems. The POD-TPWL procedure expresses new solutions in terms of linearizations around solution states generated and saved during previously simulated 'training' runs. High-dimensional states are projected into a low-dimensional subspace using POD. Thus, at each time step, only a low-dimensional linear system needs to be solved. Results will be presented for heterogeneous three-dimensional simulation models involving CO2 injection. Both enhanced oil recovery and carbon storage applications (with horizontal CO2 injectors) will be considered. Reasonably close agreement between full-order reference solutions and compositional POD-TPWL simulations will be demonstrated for 'test' runs in which the well controls differ from those used for training. Construction of the POD-TPWL model requires preprocessing overhead computations equivalent to about 3-4 full-order runs. Runtime speedups using POD-TPWL are, however, very significant - typically O(100-1000). The use of POD-TPWL for well control optimization will also be illustrated. For this application, some amount of retraining during the course of the optimization is required, which leads to smaller, but still significant, speedup factors.
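The POD-TPWL mechanics summarized above, projecting the state with a POD basis and stepping by linearizing about the nearest saved training state, can be sketched compactly. Everything numerical below is a toy placeholder: the snapshots are random, and the reduced Jacobians `E` and `G` are random stand-ins for the projected full-order Jacobians that a real implementation would save during the training run.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for quantities saved during a full-order "training" run.
n_full, n_snap, n_red, n_ctrl = 200, 50, 5, 2
snapshots = rng.standard_normal((n_full, n_snap))   # saved solution states

# POD: the low-dimensional subspace is spanned by the leading left singular
# vectors of the snapshot matrix.
U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
Phi = U[:, :n_red]                                  # n_full x n_red basis
Z = Phi.T @ snapshots                               # reduced training states

# Reduced Jacobians at each training step (random here purely for illustration).
E = [0.1 * rng.standard_normal((n_red, n_red)) for _ in range(n_snap - 1)]
G = [0.1 * rng.standard_normal((n_red, n_ctrl)) for _ in range(n_snap - 1)]
U_train = [rng.standard_normal(n_ctrl) for _ in range(n_snap - 1)]

def tpwl_step(z, u):
    """Advance the reduced state by linearizing about the nearest saved
    training state i:  z+ = z_{i+1} + E_i (z - z_i) + G_i (u - u_i)."""
    i = int(np.argmin(np.linalg.norm(Z[:, :-1].T - z, axis=1)))
    return Z[:, i + 1] + E[i] @ (z - Z[:, i]) + G[i] @ (u - U_train[i])

z = Phi.T @ snapshots[:, 0]          # project an initial full-order state
z_next = tpwl_step(z, np.zeros(n_ctrl))
print(z_next.shape)
```

Because each step only solves (here, evaluates) a system of size `n_red` rather than `n_full`, speedups of the O(100-1000) magnitude quoted above become plausible once `n_full` is the size of a real compositional simulation grid.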
Bishop, Christopher M
2013-02-13
Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications.
3D surface scan of biological samples with a Push-broom Imaging Spectrometer
NASA Astrophysics Data System (ADS)
Yao, Haibo; Kincaid, Russell; Hruska, Zuzana; Brown, Robert L.; Bhatnagar, Deepak; Cleveland, Thomas E.
2013-08-01
The food industry is always on the lookout for sensing technologies for rapid and nondestructive inspection of food products. Hyperspectral imaging technology integrates both imaging and spectroscopy into unique imaging sensors. Its application for food safety and quality inspection has made significant progress in recent years. Specifically, hyperspectral imaging has shown its potential for surface contamination detection in many food related applications. Most existing hyperspectral imaging systems use pushbroom scanning which is generally used for flat surface inspection. In some applications it is desirable to be able to acquire hyperspectral images on circular objects such as corn ears, apples, and cucumbers. Past research describes inspection systems that examine all surfaces of individual objects. Most of these systems did not employ hyperspectral imaging. These systems typically utilized a roller to rotate an object, such as an apple. During apple rotation, the camera took multiple images in order to cover the complete surface of the apple. The acquired image data lacked the spectral component present in a hyperspectral image. This paper discusses the development of a hyperspectral imaging system for a 3-D surface scan of biological samples. The new instrument is based on a pushbroom hyperspectral line scanner using a rotational stage to turn the sample. The system is suitable for whole surface hyperspectral imaging of circular objects. In addition to its value to the food industry, the system could be useful for other applications involving 3-D surface inspection.
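The acquisition scheme described above, one spectral line per exposure while a rotational stage turns the sample, amounts to stacking line scans into a hypercube. The sketch below shows only that bookkeeping; the dimensions and the `grab_line` placeholder are assumptions, not the instrument's actual interface.

```python
import numpy as np

# A pushbroom scanner yields one spatial line x full spectrum per exposure;
# rotating the sample a fixed increment between exposures builds a
# (rotation step, line pixel, wavelength band) hypercube covering the whole
# circumference. Sizes are arbitrary.
n_steps, n_line, n_bands = 360, 512, 128

def grab_line(step):
    """Placeholder for one line-scan acquisition (line pixels x bands)."""
    return np.full((n_line, n_bands), fill_value=step, dtype=np.float32)

cube = np.stack([grab_line(s) for s in range(n_steps)], axis=0)
print(cube.shape)  # (360, 512, 128)
```

Unlike the roller-plus-camera systems mentioned above, every voxel of this cube carries a full spectrum, which is what enables spectral contamination detection on the curved surface.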
Deterministic Ethernet for Space Applications
NASA Astrophysics Data System (ADS)
Fidi, C.; Wolff, B.
2015-09-01
Typical spacecraft systems are distributed in order to achieve the reliability and availability targets of the mission. However, the requirements on these systems differ between launchers, satellites, human space flight and exploration missions. Launchers typically require high reliability with very short mission times, whereas satellites or space exploration missions require very high availability over very long mission times. Comparing the distributed systems of launchers with those of satellites shows very fast reaction times in launchers versus much slower ones in satellite applications. Human space flight missions are perhaps the most challenging in terms of reliability and availability, since human lives are involved and the mission times can be very long, e.g. on the ISS. The reaction times of these vehicles can also become challenging during mission scenarios such as landing or re-entry, leading to very fast control loops. In these different applications, more and more autonomous functions are required to fulfil the needs of current and future missions. This autonomy leads to new requirements with respect to increased performance, determinism, reliability and availability. On the other hand, the pressure to reduce the cost of electronic components in space applications is increasing, leading to the use of more and more COTS components, especially for launchers and LEO satellites. This requires a technology that can provide a cost-competitive solution for both the highly reliable and available deep-space market and the low-cost "new space" market. Future spacecraft communication standards therefore have to be much more flexible, scalable and modular to deal with these upcoming challenges. These requirements can only be fulfilled if the standards are based on open standards used across industries, leading to a reduction in lifecycle costs and an increase in performance.
The use of a communication network that fulfills these requirements will be essential for such spacecraft to allow use in launcher, satellite, human space flight and exploration missions. Using one technology and the related infrastructure for these different applications will lead to a significant reduction in complexity and to significant savings in size, weight and power while increasing the performance of the overall system. The paper focuses on the use of the TTEthernet technology for launchers, satellites and human spaceflight and demonstrates the scalability of the technology for the different applications. The data used is derived from the ESA TRP 7594 on "Reliable High-Speed Data Bus/Network for Safety-Oriented Missions".
Pyne, Saumyadipta; Lee, Sharon X; Wang, Kui; Irish, Jonathan; Tamayo, Pablo; Nazaire, Marc-Danie; Duong, Tarn; Ng, Shu-Kay; Hafler, David; Levy, Ronald; Nolan, Garry P; Mesirov, Jill; McLachlan, Geoffrey J
2014-01-01
In biomedical applications, an experimenter encounters different potential sources of variation in data such as individual samples, multiple experimental conditions, and multivariate responses of a panel of markers such as from a signaling network. In multiparametric cytometry, which is often used for analyzing patient samples, such issues are critical. While computational methods can identify cell populations in individual samples, without the ability to automatically match them across samples, it is difficult to compare and characterize the populations in typical experiments, such as those responding to various stimulations or distinctive of particular patients or time-points, especially when there are many samples. Joint Clustering and Matching (JCM) is a multi-level framework for simultaneous modeling and registration of populations across a cohort. JCM models every population with a robust multivariate probability distribution. Simultaneously, JCM fits a random-effects model to construct an overall batch template--used for registering populations across samples, and classifying new samples. By tackling systems-level variation, JCM supports practical biomedical applications involving large cohorts. Software for fitting the JCM models have been implemented in an R package EMMIX-JCM, available from http://www.maths.uq.edu.au/~gjm/mix_soft/EMMIX-JCM/.
Supersonic gas-liquid cleaning system
NASA Technical Reports Server (NTRS)
Caimi, Raoul E. B.; Thaxton, Eric A.
1994-01-01
A system to perform cleaning and cleanliness verification is being developed to replace solvent flush methods using CFC 113 for fluid system components. The system is designed for two purposes: internal and external cleaning and verification. External cleaning is performed with the nozzle mounted at the end of a wand similar to a conventional pressure washer. Internal cleaning is performed with a variety of fixtures designed for specific applications. Internal cleaning includes tubes, pipes, flex hoses, and active fluid components such as valves and regulators. The system uses gas-liquid supersonic nozzles to generate high impingement velocities at the surface of the object to be cleaned. Compressed air or any inert gas may be used to provide the conveying medium for the liquid. The converging-diverging nozzles accelerate the gas-liquid mixture to supersonic velocities. The liquid being accelerated may be any solvent including water. This system may be used commercially to replace CFC and other solvent cleaning methods widely used to remove dust, dirt, flux, and lubricants. In addition, cleanliness verification can be performed without the solvents which are typically involved. This paper will present the technical details of the system, the results achieved during testing at KSC, and future applications for this system.
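The converging-diverging nozzles mentioned above reach supersonic exit velocities because, once the flow chokes at the throat, further area increase accelerates it. The classic isentropic area-Mach relation makes this quantitative; the sketch below assumes an ideal gas with γ = 1.4 and pure gas flow (the two-phase gas-liquid mixture in the actual system would behave differently, so this is an idealized illustration only).

```python
import math

def area_ratio(M, gamma=1.4):
    """Isentropic area-Mach relation A/A* for a converging-diverging nozzle:
    A/A* = (1/M) * [ (2/(g+1)) * (1 + (g-1)/2 * M^2) ]^((g+1)/(2(g-1)))."""
    term = (2.0 / (gamma + 1.0)) * (1.0 + 0.5 * (gamma - 1.0) * M**2)
    return term ** ((gamma + 1.0) / (2.0 * (gamma - 1.0))) / M

# The flow chokes (M = 1) at the throat; an exit-to-throat area ratio of
# about 1.69 in the diverging section corresponds to Mach 2 at the exit.
print(f"A/A* at M=2: {area_ratio(2.0):.4f}")  # A/A* at M=2: 1.6875
```

The same relation explains why the nozzle geometry, not the supply pressure alone, sets the impingement velocity at the cleaned surface.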
A travelling standard for radiopharmaceutical production centres in Italy
NASA Astrophysics Data System (ADS)
Capogni, M.; de Felice, P.; Fazio, A.
Short-lived radionuclides (γ, β+ and/or β- emitters) such as 18F and 99mTc, which are particularly useful for nuclear medicine applications, both diagnostic and radiotherapeutic, can be produced with high specific activity in a small biomedical cyclotron or by a radionuclide generator. While [18F]Fludeoxyglucose ([18F]FDG) is a widely used radiopharmaceutical for positron emission tomography, the development of innovative diagnostic techniques and therapies involves the use of new radio-labelled molecules and emerging radionuclides, such as 64Cu and 124I. During the last 3 years, an extensive supply of [18F]FDG was started by many production sites in Italy, and new radiopharmaceuticals are being studied for future nuclear medical applications. Therefore, a special nuclear medicine research programme for primary standard development and transferral to the end-users has been carried out by the ENEA-INMRI. Because of the short half-lives of these nuclides, a portable well-type ionisation chamber was established as a secondary travelling standard. This device has been calibrated and transported to the radiopharmaceutical production centres in Italy, where the local instrumentation, typically radionuclide calibrators, has been calibrated by a simple comparison, with an uncertainty level lower than 2%.
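Because the nuclides involved are short-lived, any comparison between a local calibrator and the travelling standard requires decay-correcting both readings to a common reference time. The sketch below shows that correction for 18F; the half-life value and the example activities are assumptions for illustration, not figures from the programme.

```python
# Decay correction to a common reference time for a short-lived nuclide.
# Half-life of 18F taken as 109.77 min (an assumed literature value).
T_HALF_F18_MIN = 109.77

def decay_correct(activity, minutes_elapsed, t_half=T_HALF_F18_MIN):
    """Activity at the reference time, given a reading taken `minutes_elapsed`
    later: A_ref = A_meas * 2^(dt / T_half)."""
    return activity * 2.0 ** (minutes_elapsed / t_half)

# A reading of 370 MBq taken exactly one half-life (109.77 min) after the
# reference time corresponds to 740 MBq at the reference time.
print(decay_correct(370.0, 109.77))  # 740.0
```

Only after both instruments' readings are referred to the same instant can the relative deviation be checked against the quoted 2% uncertainty level.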
Submillimeter sources for radiometry using high power Indium Phosphide Gunn diode oscillators
NASA Technical Reports Server (NTRS)
Deo, Naresh C.
1990-01-01
A study aimed at developing high frequency millimeter wave and submillimeter wave local oscillator sources in the 60-600 GHz range was conducted. Sources involved both fundamental and harmonic-extraction type Indium Phosphide Gunn diode oscillators as well as varactor multipliers. In particular, a high power balanced-doubler using varactor diodes was developed for 166 GHz. It is capable of handling 100 mW input power, and typically produced 25 mW output power. A high frequency tripler operating at 500 GHz output frequency was also developed and cascaded with the balanced-doubler. A dual-diode InP Gunn diode combiner was used to pump this cascaded multiplier to produce on the order of 0.5 mW at 500 GHz. In addition, considerable development and characterization work on InP Gunn diode oscillators was carried out. Design data and operating characteristics were documented for a very wide range of oscillators. The reliability of InP devices was examined, and packaging techniques to enhance the performance were analyzed. A theoretical study of a new class of high power multipliers was conducted for future applications. The sources developed here find many commercial applications for radio astronomy and remote sensing.
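The reported power levels imply simple conversion-loss arithmetic; a quick sketch (the ~100 mW pump level for the full chain is an assumption based on the doubler's rated input, not a figure stated for the chain itself):

```python
import math

def conversion_loss_db(p_in_mw, p_out_mw):
    """Conversion loss of a multiplier stage or chain, in dB."""
    return 10 * math.log10(p_in_mw / p_out_mw)

# Balanced doubler: 100 mW in, ~25 mW out at 166 GHz.
print(round(conversion_loss_db(100, 25), 1))   # 6.0 dB (25% efficiency)

# Doubler + tripler chain: ~0.5 mW at 500 GHz from an assumed ~100 mW pump.
print(round(conversion_loss_db(100, 0.5), 1))  # 23.0 dB total conversion loss
```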
Microshell-tipped optical fibers as sensors of high-pressure pulses in adverse environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benjamin, R.F.; Mayer, F.J.; Maynard, R.L.
1984-01-01
An optical-fiber sensor for detecting the arrival of strong pressure pulses was developed. The sensor consists of an optical fiber tipped with a gas-filled microballoon. These sensors have been used successfully in adverse environments involving explosives, ballistics, and electromagnetic pulses (EMP). The sensor produces a bright optical pulse caused by the rapid shock-heating of a gas, typically argon or xenon, confined in the spherical glass or plastic microballoon. The light pulse is transmitted via the optical fiber to a photodetector, usually a streak camera or photomultiplier tube. The microballoon optical sensor (called an optical pin by analogy to standard electrical pins) was originally developed for diagnosing an explosive pulsed-power generator, where optical pins are required because of the EMP. The optical pins are economical arrival-time indicators because many channels can be recorded by one streak camera. The generator tests and related experiments, involving projectile and detonation velocities of several kilometers per second, have demonstrated the usefulness of the sensors in explosives and ballistics applications. The technical and cost advantages of this optical pin make it potentially useful for many electromagnetic, explosive, and ballistics applications.
Cowan, Don A; Fernandez-Lafuente, Roberto
2011-09-10
The immobilization of proteins (most typically enzymes) onto solid supports is a mature technology that has been used successfully to enhance biocatalytic processes in a wide range of industrial applications. However, continued developments in immobilization technology have led to more sophisticated and specialized applications of the process. A combination of targeted chemistries, for both the support and the protein, sometimes in combination with additional chemical and/or genetic engineering, has led to the development of methods for the modification of protein functional properties, for enhancing protein stability and for the recovery of specific proteins from complex mixtures. In particular, the development of effective methods for immobilizing large multi-subunit proteins with multiple covalent linkages (multi-point immobilization) has been effective in stabilizing proteins where subunit dissociation is the initial step in enzyme inactivation. In some instances, multiple benefits are achievable in a single process. Here we comprehensively review the literature pertaining to immobilization and chemical modification of different enzyme classes from thermophiles, with emphasis on the chemistries involved and their implications for modification of the enzyme functional properties. We also highlight the potential for synergies in the combined use of immobilization and other chemical modifications. Copyright © 2011 Elsevier Inc. All rights reserved.
Tey, Chuang-Kit; An, Jinyoung; Chung, Wan-Young
2017-01-01
Chronic obstructive pulmonary disease is a type of lung disease caused by chronically poor airflow that makes breathing difficult. As a chronic illness, it typically worsens over time; therefore, pulmonary rehabilitation exercises and patient management over extensive periods of time are required. This paper presents a multimodal sensor-based remote rehabilitation system for patients who have chronic breathing difficulties. The process involves the fusion of sensory data (motion data captured by a stereo camera and a photoplethysmogram signal from a wearable PPG sensor) that form the input variables of a detection and evaluation framework. In addition, we incorporated a set of rehabilitation exercises specific to pulmonary patients into the system by fusing the sensory data. The system also features medical functions that accommodate the needs of medical professionals and ease the use of the application for patients, including tracking of exercise progress and patient performance, exercise assignments, and exercise guidance. Finally, the results indicate accurate determination of pulmonary exercises from the fusion of sensory data. This remote rehabilitation system provides a comfortable and cost-effective option in healthcare rehabilitation.
Fluorescence from the maillard reaction and its potential applications in food science.
Matiacevich, Silvia B; Santagapita, Patricio R; Buera, M Pilar
2005-01-01
The chemistry of the Maillard reaction involves a complex set of steps, and its interpretation represents a challenge in basic and applied aspects of food science. Fluorescent compounds have been recognized as important early markers of the reaction in food products since 1942. However, recent advances in the characterization of fluorophore development have come from the biological and biomedical areas. The in vivo non-enzymatic glycosylation of proteins produces biological effects that promote health deterioration. The characteristic fluorescence of advanced glycosylation end products (AGEs) is similar to that of Maillard food products and represents an indicator of the level of AGE-modified proteins, but the structure of the fluorescent groups is typically unknown. Fluorescence measurement is considered a potential tool for addressing key problems of food deterioration, serving as an early marker or index of damage to biomolecules. Fluorophores may be precursors of the brown pigments and/or end products. A general scheme of the Maillard reaction is proposed in this article, incorporating the pool concept. A correct interpretation of the effect of environmental and compositional conditions and their influence on the reaction kinetics may help to define the meaning of fluorescence development for each particular system.
NASA Astrophysics Data System (ADS)
Manan, N. H.; Majid, D. L.; Romli, F. I.
2016-10-01
Sandwich structures with a honeycomb core are known to significantly improve stiffness at lower weight and possess high flexural rigidity. They have found wide applications in aerospace as part of primary structures, as well as in interior paneling and floors. High-performance aluminum and aramid are the typical honeycomb-core materials, whereas in other industries materials such as fibre glass, carbon fibre, Nomex and Kevlar reinforced with polymer are used. Recently, growing interest in developing composite structures with natural-fibre reinforcement has also spurred research on natural-fibre honeycomb material. Most of the research done, however, has emphasized the use of random chopped fibre, and only a few studies report the development of honeycomb structures using unidirectional fibre as the reinforcement. This is mainly due to its processing difficulties, which often involve several stages to account for the arrangement of fibres and curing. Since unidirectional fibre offers greater strength than random chopped fibre, a single-stage process in conjunction with vacuum infusion is suggested, with a mould design that supports fibre arrangement in the direction of honeycomb loading.
Supersonic gas-liquid cleaning system
NASA Astrophysics Data System (ADS)
Caimi, Raoul E. B.; Thaxton, Eric A.
1994-02-01
A system to perform cleaning and cleanliness verification is being developed to replace solvent flush methods using CFC 113 for fluid system components. The system is designed for two purposes: internal and external cleaning and verification. External cleaning is performed with the nozzle mounted at the end of a wand similar to a conventional pressure washer. Internal cleaning is performed with a variety of fixtures designed for specific applications. Internal cleaning includes tubes, pipes, flex hoses, and active fluid components such as valves and regulators. The system uses gas-liquid supersonic nozzles to generate high impingement velocities at the surface of the object to be cleaned. Compressed air or any inert gas may be used to provide the conveying medium for the liquid. The converging-diverging nozzles accelerate the gas-liquid mixture to supersonic velocities. The liquid being accelerated may be any solvent including water. This system may be used commercially to replace CFC and other solvent cleaning methods widely used to remove dust, dirt, flux, and lubricants. In addition, cleanliness verification can be performed without the solvents which are typically involved. This paper will present the technical details of the system, the results achieved during testing at KSC, and future applications for this system.
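The exit Mach number of such a converging-diverging nozzle is governed by the standard isentropic area-Mach relation; a sketch assuming an ideal gas with γ = 1.4 (a generic gas-dynamics calculation, not one from the paper):

```python
import math

def area_ratio(M, gamma=1.4):
    """Isentropic area ratio A/A* for Mach number M (1-D gas dynamics)."""
    t = (2 / (gamma + 1)) * (1 + (gamma - 1) / 2 * M * M)
    return t ** ((gamma + 1) / (2 * (gamma - 1))) / M

def exit_mach(ar, gamma=1.4):
    """Supersonic exit Mach number for a given exit-to-throat area ratio,
    found by bisection on the monotonic supersonic branch (M > 1)."""
    lo, hi = 1.0, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if area_ratio(mid, gamma) < ar:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(round(exit_mach(2.0), 2))  # 2.2 (standard tables give M ≈ 2.197)
```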
Atypical cross talk between mentalizing and mirror neuron networks in autism spectrum disorder.
Fishman, Inna; Keown, Christopher L; Lincoln, Alan J; Pineda, Jaime A; Müller, Ralph-Axel
2014-07-01
Converging evidence indicates that brain abnormalities in autism spectrum disorder (ASD) involve atypical network connectivity, but it is unclear whether altered connectivity is especially prominent in brain networks that participate in social cognition. To investigate whether adolescents with ASD show altered functional connectivity in 2 brain networks putatively impaired in ASD and involved in social processing, theory of mind (ToM) and mirror neuron system (MNS). Cross-sectional study using resting-state functional magnetic resonance imaging involving 25 adolescents with ASD between the ages of 11 and 18 years and 25 typically developing adolescents matched for age, handedness, and nonverbal IQ. Statistical parametric maps testing the degree of whole-brain functional connectivity and social functioning measures. Relative to typically developing controls, participants with ASD showed a mixed pattern of both over- and underconnectivity in the ToM network, which was associated with greater social impairment. Increased connectivity in the ASD group was detected primarily between the regions of the MNS and ToM, and was correlated with sociocommunicative measures, suggesting that excessive ToM-MNS cross talk might be associated with social impairment. In a secondary analysis comparing a subset of the 15 participants with ASD with the most severe symptomology and a tightly matched subset of 15 typically developing controls, participants with ASD showed exclusive overconnectivity effects in both ToM and MNS networks, which were also associated with greater social dysfunction. Adolescents with ASD showed atypically increased functional connectivity involving the mentalizing and mirror neuron systems, largely reflecting greater cross talk between the 2. This finding is consistent with emerging evidence of reduced network segregation in ASD and challenges the prevailing theory of general long-distance underconnectivity in ASD. 
This excess ToM-MNS connectivity may reflect immature or aberrant developmental processes in 2 brain networks involved in understanding of others, a domain of impairment in ASD. Further, robust links with sociocommunicative symptoms of ASD implicate atypically increased ToM-MNS connectivity in social deficits observed in ASD.
Code of Federal Regulations, 2010 CFR
2010-10-01
... connection with industrial heating operations utilized in a manufacturing or production process. (e) Medical.... (c) Industrial, scientific, and medical (ISM) equipment. Equipment or appliances designed to generate... applications in the field of telecommunication. Typical ISM applications are the production of physical...
DEVELOPMENT AND APPLICATION OF PLANNING PROCESS TO ACHIEVE SUSTAINABILITY
Concepts of sustainability are numerous, widely discussed, and necessary, but sustainability needs to be applied to development projects to succeed. However, few applications are made and their measures are unclear. Sustainability indicators are typically used as measures, but ...
Understanding the Physical Optics Phenomena by Using a Digital Application for Light Propagation
NASA Astrophysics Data System (ADS)
Sierra-Sosa, Daniel-Esteban; Ángel-Toro, Luciano
2011-01-01
Understanding light propagation on the basis of the Huygens-Fresnel principle is fundamental for deeper comprehension of different physical-optics phenomena such as diffraction, self-imaging, image formation, Fourier analysis and spatial filtering. This constitutes the physical approach of Fourier optics, whose principles and applications have been developed since the 1950s. For both analytical and digital applications, light propagation can be formulated in terms of the Fresnel integral transform. In this work, a digital optics application based on the implementation of the Discrete Fresnel Transform (DFT), addressed to serve as a tool for didactics of optics, is presented. At a basic and intermediate learning level, this tool allows exercising with the identification of basic phenomena and observing changes associated with modifications of physical parameters. This is achieved through a friendly graphic user interface (GUI). It also assists users in developing their capacity for abstracting and predicting the characteristics of more complicated phenomena. At an upper level of learning, the application can be used to favor a deeper comprehension of the physics and models involved, and to experiment with new models and configurations. To achieve this, two characteristics of the didactic tool were taken into account in its design. First, all physical operations, ranging from simple diffraction experiments to digital holography and interferometry, were developed on the basis of the more fundamental concept of light propagation. Second, the algorithm was conceived to be easily upgradable due to its modular architecture based in the MATLAB® software environment. Typical results are presented and briefly discussed in connection with didactics of optics.
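Fresnel propagation of a sampled field can be sketched with an FFT-based transfer-function implementation (a generic approach, not the interface of the tool described in the paper; grid and aperture parameters are illustrative):

```python
import numpy as np

def fresnel_propagate(field, wavelength, dx, z):
    """Propagate a sampled complex field a distance z using the
    transfer-function form of the Fresnel integral (illustrative sketch)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)          # spatial frequencies
    FX, FY = np.meshgrid(fx, fx)
    # Fresnel transfer function H(fx, fy); |H| = 1, so energy is conserved.
    H = np.exp(1j * 2 * np.pi * z / wavelength) * \
        np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Diffraction from a square aperture at 633 nm
n, dx = 256, 10e-6
x = (np.arange(n) - n // 2) * dx
X, Y = np.meshgrid(x, x)
aperture = ((np.abs(X) < 0.2e-3) & (np.abs(Y) < 0.2e-3)).astype(complex)
out = fresnel_propagate(aperture, wavelength=633e-9, dx=dx, z=0.05)
print(out.shape)  # (256, 256)
```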
The Missing Stakeholder Group: Why Patients Should be Involved in Health Economic Modelling.
van Voorn, George A K; Vemer, Pepijn; Hamerlijnck, Dominique; Ramos, Isaac Corro; Teunissen, Geertruida J; Al, Maiwenn; Feenstra, Talitha L
2016-04-01
Evaluations of healthcare interventions, e.g. new drugs or other new treatment strategies, commonly include a cost-effectiveness analysis (CEA) that is based on the application of health economic (HE) models. As end users, patients are important stakeholders regarding the outcomes of CEAs, yet their knowledge of HE model development and application, or their involvement therein, is absent. This paper considers possible benefits and risks of patient involvement in HE model development and application for modellers and patients. An exploratory review of the literature has been performed on stakeholder-involved modelling in various disciplines. In addition, Dutch patient experts have been interviewed about their experience in, and opinion about, the application of HE models. Patients have little to no knowledge of HE models and are seldom involved in HE model development and application. Benefits of becoming involved would include a greater understanding and possible acceptance by patients of HE model application, improved model validation, and a more direct infusion of patient expertise. Risks would include patient bias and increased costs of modelling. Patient involvement in HE modelling seems to carry several benefits as well as risks. We claim that the benefits may outweigh the risks and that patients should become involved.
... shop employees, people who work in poultry processing plants, and veterinarians. Typical birds involved are parrots, parakeets, and budgerigars, although other birds have also caused the disease. Psittacosis is a rare disease; very few cases are reported each year ...
Click It or Ticket Evaluation, 2011
DOT National Transportation Integrated Search
2013-05-01
The 2011 Click It or Ticket (CIOT) mobilization followed a typical selective traffic enforcement program (STEP) sequence, involving paid media, earned media, and enforcement. A nationally representative telephone survey indicated that the mobilizatio...
Genetics Home Reference: Swyer syndrome
... they help determine whether a person will develop male or female sex characteristics. Girls and women typically ... Y protein starts processes that are involved in male sexual development. These processes cause a fetus to ...
NASA Astrophysics Data System (ADS)
Chen, Hudong
2001-06-01
There have been considerable advances in Lattice Boltzmann (LB) based methods in the last decade. By now, the fundamental concept of using the approach as an alternative tool for computational fluid dynamics (CFD) has been substantially appreciated and validated in mainstream scientific research and in industrial engineering communities. Lattice Boltzmann based methods possess several major advantages: a) less numerical dissipation due to the linear Lagrange type advection operator in the Boltzmann equation; b) local dynamic interactions suitable for highly parallel processing; c) physical handling of boundary conditions for complicated geometries and accurate control of fluxes; d) microscopically consistent modeling of thermodynamics and of interface properties in complex multiphase flows. It provides a great opportunity to apply the method to practical engineering problems encountered in a wide range of industries from automotive, aerospace to chemical, biomedical, petroleum, nuclear, and others. One of the key challenges is to extend the applicability of this alternative approach to regimes of highly turbulent flows commonly encountered in practical engineering situations involving high Reynolds numbers. Over the past ten years, significant efforts have been made on this front at Exa Corporation in developing a lattice Boltzmann based commercial CFD software, PowerFLOW. It has become a useful computational tool for the simulation of turbulent aerodynamics in practical engineering problems involving extremely complex geometries and flow situations, such as in new automotive vehicle designs world wide. In this talk, we present an overall LB based algorithm concept along with certain key extensions in order to accurately handle turbulent flows involving extremely complex geometries. To demonstrate the accuracy of turbulent flow simulations, we provide a set of validation results for some well known academic benchmarks. 
These include straight channels, backward-facing steps, flows over a curved hill and typical NACA airfoils at various angles of attack including prediction of stall angle. We further provide numerous engineering cases, ranging from external aerodynamics around various car bodies to internal flows involved in various industrial devices. We conclude with a discussion of certain future extensions for complex fluids.
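The stream-and-collide structure of LB methods can be illustrated with a minimal D2Q9 BGK sketch (a generic textbook scheme, not the PowerFLOW algorithm; lattice size, relaxation time and flow speed are illustrative):

```python
import numpy as np

# D2Q9 lattice: weights W and discrete velocities C
W = np.array([4/9] + [1/9]*4 + [1/36]*4)
C = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])

def equilibrium(rho, u):
    """Second-order Maxwell-Boltzmann equilibrium distribution."""
    cu = np.einsum('qd,xyd->qxy', C, u)
    usq = (u**2).sum(-1)
    return W[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def step(f, tau=0.8):
    """One lattice Boltzmann time step: BGK collision, then streaming."""
    rho = f.sum(0)
    u = np.einsum('qxy,qd->xyd', f, C) / rho[..., None]
    f = f + (equilibrium(rho, u) - f) / tau          # BGK relaxation
    for q, (cx, cy) in enumerate(C):                 # periodic streaming
        f[q] = np.roll(np.roll(f[q], cx, axis=0), cy, axis=1)
    return f

nx = ny = 16
rho0 = np.ones((nx, ny))
u0 = np.zeros((nx, ny, 2)); u0[..., 0] = 0.05       # uniform flow
f = equilibrium(rho0, u0)
for _ in range(50):
    f = step(f)
print(np.allclose(f.sum(0), 1.0))  # True: uniform flow is a steady state
```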
Selective dry etching of silicon containing anti-reflective coating
NASA Astrophysics Data System (ADS)
Sridhar, Shyam; Nolan, Andrew; Wang, Li; Karakas, Erdinc; Voronin, Sergey; Biolsi, Peter; Ranjan, Alok
2018-03-01
Multi-layer patterning schemes involve the use of silicon-containing anti-reflective coating (SiARC) films for their anti-reflective properties. Completing the pattern transfer requires complete and selective removal of the SiARC, which is very difficult due to its high silicon content (>40%). Typically, SiARC removal is accomplished either through a non-selective etch during the pattern transfer process using fluorine-containing plasmas, or through an ex-situ wet etch in hydrofluoric acid that removes the residual SiARC after pattern transfer. Using a non-selective etch may result in profile distortion or wiggling due to distortion of the underlying organic layer. The drawbacks of the wet etch process for SiARC removal are increased overall processing time and the need for additional equipment. Many applications may involve patterning of active structures in a poly-Si layer with an underlying oxide stopping layer; in such applications, removing SiARC selectively to oxide with a wet process may prove futile. Removing SiARC selectively to SiO2 with a dry etch process is also challenging, due to the similar nature of the chemical bonds (Si-O) in the two materials. In this work, we present highly selective etching of SiARC in a plasma driven by a surface-wave radial line slot antenna. The first step in the process is an in-situ modification of the SiARC layer in O2 plasma, followed by selective etching in a NF3/H2 plasma. Surface treatment in O2 plasma resulted in enhanced etching of the SiARC layer. Under the right processing conditions, the in-situ NF3/H2 dry etch process demonstrated selectivity values greater than 15:1 with respect to SiO2. The etching chemistry, however, was sensitive to the NF3:H2 gas ratio: for dilute NF3 in H2, no SiARC etching was observed, presumably due to the deposition of an ammonium fluorosilicate layer that occurs in dilute NF3/H2 plasmas.
Additionally, challenges involved in selective SiARC removal (selective to SiO2, organic and Si layers) post pattern transfer, in a multi-layer structure will be discussed.
Identifying typical patterns of vulnerability: A 5-step approach based on cluster analysis
NASA Astrophysics Data System (ADS)
Sietz, Diana; Lüdeke, Matthias; Kok, Marcel; Lucas, Paul; Carsten, Walther; Janssen, Peter
2013-04-01
Specific processes that shape the vulnerability of socio-ecological systems to climate, market and other stresses derive from diverse background conditions. Within the multitude of vulnerability-creating mechanisms, distinct processes recur in various regions, inspiring research on typical patterns of vulnerability. The vulnerability patterns display typical combinations of the natural and socio-economic properties that shape a system's vulnerability to particular stresses. Based on the identification of a limited number of vulnerability patterns, pattern analysis provides an efficient approach to improving our understanding of vulnerability and decision-making for vulnerability reduction. However, current pattern analyses often miss explicit descriptions of their methods and pay insufficient attention to the validity of their groupings. Therefore, the question arises: how do we identify typical vulnerability patterns in order to enhance our understanding of a system's vulnerability to stresses? A cluster-based pattern recognition applied at global and local levels is scrutinised with a focus on an applicable methodology and practicable insights. Taking the example of drylands, this presentation demonstrates the conditions necessary to identify typical vulnerability patterns. They are summarised in five methodological steps comprising the elicitation of relevant cause-effect hypotheses and the quantitative indication of mechanisms, as well as an evaluation of robustness, a validation and a ranking of the identified patterns. Reflecting scale-dependent opportunities, a global study is able to support decision-making with insights into the up-scaling of interventions when available funds are limited. In contrast, local investigations encourage an outcome-based validation. This constitutes a crucial step in establishing the credibility of the patterns and hence their suitability for informing extension services and individual decisions.
In this respect, working at the local level provides a clear advantage since, to a large extent, limitations in globally available observational data constrain such a validation on the global scale. Overall, the five steps are outlined in detail in order to facilitate and motivate the application of pattern recognition in other research studies concerned with vulnerability analysis, including future applications to different vulnerability frameworks. Such applications could promote the refinement of mechanisms in specific contexts and advance methodological adjustments. This would further increase the value of identifying typical patterns in the properties of socio-ecological systems for an improved understanding and management of the relation between these systems and particular stresses.
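The cluster-based grouping at the core of such pattern recognition can be sketched with a minimal k-means (illustrative only; the indicator names and region values are hypothetical, and the study's five-step methodology involves far more than clustering):

```python
import numpy as np

def kmeans(data, k, iters=100, seed=0):
    """Minimal k-means: group regions by standardized vulnerability
    indicators (an illustrative stand-in for the cluster analysis)."""
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((data[:, None] - centers) ** 2).sum(-1), axis=1)
        new = np.array([data[labels == j].mean(0) for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Hypothetical standardized indicators per region:
# [water scarcity, poverty, soil degradation]
regions = np.array([[0.9, 0.8, 0.7], [0.8, 0.9, 0.6],
                    [0.1, 0.2, 0.1], [0.2, 0.1, 0.2]])
labels, _ = kmeans(regions, k=2)
print(labels[0] == labels[1], labels[2] == labels[3])  # True True
```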
ERIC Educational Resources Information Center
Ozgun, Ozkan; Honig, Alice Sterling
2005-01-01
In this low-income Turkish sample, parents reported on father and mother division of childcare labor and satisfaction with division. Regardless of whether they were rearing typical or atypical children, mothers reported a higher level of involvement than fathers in every domain of childcare. In general, both mothers and fathers reported slight…
ERIC Educational Resources Information Center
Smith, Herbert A.
This study involved examining an instructional unit with regard to its concept content and appropriateness for its target audience. The study attempted to determine (1) what concepts are treated explicitly or implicitly, (2) whether there is a hierarchical conceptual structure within the unit, (3) what level of sophistication is required to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gowitt, G.T.; Hanzlick, R.L.
1992-06-01
So-called 'typical' autoerotic fatalities are the result of asphyxia due to mechanical compression of the neck, chest, or abdomen, whereas 'atypical' autoeroticism involves sexual self-stimulation by other means. The authors present five atypical autoerotic fatalities that involved the use of dichlorodifluoromethane, nitrous oxide, isobutyl nitrite, cocaine, or compounds containing 1,1,1-trichloroethane. Mechanisms of death are discussed in each case and the pertinent literature is reviewed.
NASA Technical Reports Server (NTRS)
Hirshberg, A. S.
1975-01-01
The following topics are discussed: (1) Assignment of population to microclimatic zones; (2) specifications of the mix of buildings in the SCE territory; (3) specification of four typical buildings for thermal analysis and market penetration studies; (4) identification of the materials and energy conserving characteristics of these typical buildings; (5) specifications of the HVAC functions used in each typical building, and determination of the HVAC systems used in each building; and (6) identification of the type of fuel used in each building.
Johnson, Matthew W; Bruner, Natalie R; Johnson, Patrick S
2015-01-01
Cocaine dependence and other forms of drug dependence are associated with steeper devaluation of future outcomes (delay discounting). Although studies in this domain have typically assessed choices between monetary gains (e.g., receive less money now versus receive more money after a delay), delay discounting is also applicable to decisions involving losses (e.g., small loss now versus larger delayed loss), with gains typically discounted more than losses (the "sign effect"). It is also known that drugs are discounted more than equivalently valued money. In the context of drug dependence, however, relatively little is known about the discounting of delayed monetary and drug losses and the presence of the sign effect. In this within-subject, laboratory study, delay discounting for gains and losses was assessed for cocaine and money outcomes in cocaine-dependent individuals (n=89). Both cocaine and monetary gains were discounted at significantly greater rates than cocaine and monetary losses, respectively (i.e., the sign effect). Cocaine gains were discounted significantly more than monetary gains, but cocaine and monetary losses were discounted similarly. Results suggest that cocaine is discounted by cocaine-dependent individuals in a systematic manner similar to other rewards. Because the sign effect was shown for both cocaine and money, delayed aversive outcomes may generally have greater impact than delayed rewards in shaping present behavior in this population. Copyright © 2014. Published by Elsevier Ltd.
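Delay discounting of this kind is commonly modelled with Mazur's hyperbolic function, V = A/(1 + kD); a minimal sketch with hypothetical discount rates illustrating the sign effect (steeper discounting of gains than losses):

```python
def hyperbolic_value(amount, delay, k):
    """Mazur's hyperbolic discounting: subjective value of a delayed outcome
    of size `amount` after `delay` time units, with discount rate k."""
    return amount / (1.0 + k * delay)

# Hypothetical rates illustrating the sign effect: the gain is discounted
# more steeply (larger k) than the equivalently sized loss.
k_gain, k_loss = 0.05, 0.01
print(hyperbolic_value(100, 30, k_gain))  # 40.0  (gain loses most of its value)
print(hyperbolic_value(100, 30, k_loss))  # ~76.9 (loss retains more impact)
```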
Validity and reliability of four language mapping paradigms.
Wilson, Stephen M; Bautista, Alexa; Yen, Melodie; Lauderdale, Stefanie; Eriksson, Dana K
2017-01-01
Language areas of the brain can be mapped in individual participants with functional MRI. We investigated the validity and reliability of four language mapping paradigms that may be appropriate for individuals with acquired aphasia: sentence completion, picture naming, naturalistic comprehension, and narrative comprehension. Five neurologically normal older adults were scanned on each of the four paradigms on four separate occasions. Validity was assessed in terms of whether activation patterns reflected the known typical organization of language regions, that is, lateralization to the left hemisphere, and involvement of the left inferior frontal gyrus and the left middle and/or superior temporal gyri. Reliability (test-retest reproducibility) was quantified in terms of the Dice coefficient of similarity, which measures overlap of activations across time points. We explored the impact of different absolute and relative voxelwise thresholds, a range of cluster size cutoffs, and limitation of analyses to a priori potential language regions. We found that the narrative comprehension and sentence completion paradigms offered the best balance of validity and reliability. However, even with optimal combinations of analysis parameters, there were many scans on which known features of typical language organization were not demonstrated, and test-retest reproducibility was only moderate for realistic parameter choices. These limitations in terms of validity and reliability may constitute significant limitations for many clinical or research applications that depend on identifying language regions in individual participants.
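The Dice coefficient used here to quantify test-retest overlap is computed directly from two binary activation masks (standard definition; the example masks are illustrative):

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity between two binary activation masks:
    2 * |A ∩ B| / (|A| + |B|), ranging from 0 (no overlap) to 1 (identical)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treated as perfectly reproducible
    return 2.0 * np.logical_and(a, b).sum() / denom

# Suprathreshold voxels from two scan sessions (flattened, illustrative)
scan1 = np.array([1, 1, 1, 0, 0, 1])
scan2 = np.array([1, 1, 0, 0, 1, 1])
print(dice_coefficient(scan1, scan2))  # 0.75
```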
Grassmann phase space methods for fermions. II. Field theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dalton, B.J., E-mail: bdalton@swin.edu.au; Jeffers, J.; Barnett, S.M.
In both quantum optics and cold atom physics, the behaviour of bosonic photons and atoms is often treated using phase space methods, where mode annihilation and creation operators are represented by c-number phase space variables, with the density operator equivalent to a distribution function of these variables. The anti-commutation rules for fermion annihilation and creation operators suggest the possibility of using anti-commuting Grassmann variables to represent these operators. However, in spite of the seminal work by Cahill and Glauber and a few applications, the use of Grassmann phase space methods in quantum-atom optics to treat fermionic systems is rather rare, though fermion coherent states using Grassmann variables are widely used in particle physics. This paper presents a phase space theory for fermion systems based on distribution functionals, which replace the density operator and involve Grassmann fields representing anti-commuting fermion field annihilation and creation operators. It is an extension of a previous phase space theory paper for fermions (Paper I) based on separate modes, in which the density operator is replaced by a distribution function depending on Grassmann phase space variables which represent the mode annihilation and creation operators. This further development of the theory is important for the situation when large numbers of fermions are involved, resulting in too many modes to treat separately. Here Grassmann fields, distribution functionals, functional Fokker–Planck equations and Ito stochastic field equations are involved. Typical applications to a trapped Fermi gas of interacting spin 1/2 fermionic atoms and to multi-component Fermi gases with non-zero range interactions are presented, showing that the Ito stochastic field equations are local in these cases. For the spin 1/2 case we also show how simple solutions can be obtained both for the untrapped case and for an optical lattice trapping potential.
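The algebraic motivation can be summarised in standard relations (textbook identities, not specific to the functional formalism of this paper):

```latex
% Fermion mode operators obey anti-commutation rules:
\{\hat{c}_i,\hat{c}_j^\dagger\} = \delta_{ij}, \qquad
\{\hat{c}_i,\hat{c}_j\} = \{\hat{c}_i^\dagger,\hat{c}_j^\dagger\} = 0,
% so the phase-space variables that replace them cannot be ordinary
% c-numbers; they must be anti-commuting Grassmann variables g_i:
g_i g_j = -g_j g_i, \qquad g_i^2 = 0 .
```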
Extraction of drainage networks from large terrain datasets using high throughput computing
NASA Astrophysics Data System (ADS)
Gong, Jianya; Xie, Jibo
2009-02-01
Advanced digital photogrammetry and remote sensing technology produces large terrain datasets (LTD). How to process and use these LTD has become a major challenge for GIS users. Extracting drainage networks, which are fundamental to hydrological applications, from LTD is one of the typical applications of digital terrain analysis (DTA) in geographical information applications. Existing serial drainage algorithms cannot deal with large data volumes in a timely fashion, and few GIS platforms can process LTD beyond the GB size. High throughput computing (HTC), a distributed parallel computing mode, is proposed to improve the efficiency of drainage network extraction from LTD. Drainage network extraction using HTC involves two key issues: (1) how to decompose the large DEM datasets into independent computing units and (2) how to merge the separate outputs into a final result. A new decomposition method is presented in which the large datasets are partitioned into independent computing units along natural watershed boundaries instead of using regular 1-dimensional (strip-wise) and 2-dimensional (block-wise) decomposition. Because the distribution of drainage networks is strongly related to watershed boundaries, the new decomposition method is more effective and natural. The method to extract natural watershed boundaries was improved by using multi-scale DEMs instead of single-scale DEMs. An HTC environment is employed to test the proposed methods with real datasets.
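The decompose/compute/merge pattern this abstract describes can be sketched as follows. This is a minimal illustration, not the authors' code: a local process pool stands in for the HTC cluster, and `partition_by_watersheds` and `extract_drainage` are placeholder functions for the watershed-based decomposition and the serial drainage algorithm.

```python
# Map/merge sketch: partition a DEM into independent units along watershed
# boundaries, extract drainage per unit in parallel, then merge.
from multiprocessing import Pool

def partition_by_watersheds(dem):
    """Placeholder: split DEM cells into independent computing units along
    natural watershed boundaries (here, a trivial pre-made grouping)."""
    return list(dem.items())

def extract_drainage(unit):
    """Placeholder: run a serial drainage-extraction algorithm on one
    watershed unit and return its stream segments."""
    watershed_id, cells = unit
    return [(watershed_id, cell) for cell in cells if cell % 2 == 0]

def merge(per_unit_results):
    """Because units follow watershed boundaries, streams do not cross
    unit edges, so merging is a simple concatenation."""
    return [seg for result in per_unit_results for seg in result]

if __name__ == "__main__":
    dem = {0: [1, 2, 3, 4], 1: [5, 6, 7, 8]}
    with Pool(2) as pool:  # stands in for distributed HTC workers
        per_unit = pool.map(extract_drainage, partition_by_watersheds(dem))
    print(merge(per_unit))  # segments tagged by watershed id
```

The key property motivating the watershed-based decomposition is visible in `merge`: no cross-unit stitching is needed, which a strip-wise or block-wise split would require.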
Students' conceptual performance on synthesis physics problems with varying mathematical complexity
NASA Astrophysics Data System (ADS)
Ibrahim, Bashirah; Ding, Lin; Heckler, Andrew F.; White, Daniel R.; Badeau, Ryan
2017-06-01
A body of research on physics problem solving has focused on single-concept problems. In this study we use "synthesis problems" that involve multiple concepts typically taught in different chapters. We use two types of synthesis problems, sequential and simultaneous synthesis tasks. Sequential problems require a consecutive application of fundamental principles, and simultaneous problems require a concurrent application of pertinent concepts. We explore students' conceptual performance when they solve quantitative synthesis problems with varying mathematical complexity. Conceptual performance refers to the identification, follow-up, and correct application of the pertinent concepts. Mathematical complexity is determined by the type and the number of equations to be manipulated concurrently due to the number of unknowns in each equation. Data were collected from written tasks and individual interviews administered to physics major students (N = 179) enrolled in a second year mechanics course. The results indicate that mathematical complexity does not impact students' conceptual performance on the sequential tasks. In contrast, for the simultaneous problems, mathematical complexity negatively influences the students' conceptual performance. This difference may be explained by the students' familiarity with and confidence in particular concepts coupled with cognitive load associated with manipulating complex quantitative equations. Another explanation pertains to the type of synthesis problems, either sequential or simultaneous task. The students split the situation presented in the sequential synthesis tasks into segments but treated the situation in the simultaneous synthesis tasks as a single event.
Performance Basis for Airborne Separation
NASA Technical Reports Server (NTRS)
Wing, David J.
2008-01-01
Emerging applications of Airborne Separation Assistance System (ASAS) technologies make possible new and powerful methods in Air Traffic Management (ATM) that may significantly improve the system-level performance of operations in the future ATM system. These applications typically involve the aircraft managing certain components of its Four Dimensional (4D) trajectory within the degrees of freedom defined by a set of operational constraints negotiated with the Air Navigation Service Provider. It is hypothesized that reliable individual performance by many aircraft will translate into higher total system-level performance. To actually realize this improvement, the new capabilities must be attracted to high demand and complexity regions where high ATM performance is critical. Operational approval for use in such environments will require participating aircraft to be certified to rigorous and appropriate performance standards. Currently, no formal basis exists for defining these standards. This paper provides a context for defining the performance basis for 4D-ASAS operations. The trajectory constraints to be met by the aircraft are defined, categorized, and assessed for performance requirements. A proposed extension of the existing Required Navigation Performance (RNP) construct into a dynamic standard (Dynamic RNP) is outlined. Sample data is presented from an ongoing high-fidelity batch simulation series that is characterizing the performance of an advanced 4D-ASAS application. Data of this type will contribute to the evaluation and validation of the proposed performance basis.
Spatial distribution of errors associated with multistatic meteor radar
NASA Astrophysics Data System (ADS)
Hocking, W. K.
2018-06-01
With the recent increase in numbers of small and versatile low-power meteor radars, the opportunity exists to benefit from simultaneous application of multiple systems spaced by only a few hundred km and less. Transmissions from one site can be recorded at adjacent receiving sites using various degrees of forward scatter, potentially allowing atmospheric conditions in the mesopause regions between stations to be diagnosed. This can allow a better spatial overview of the atmospheric conditions at any time. Such studies have been carried out using a small version of such so-called multistatic meteor radars, e.g. Chau et al. (Radio Sci 52:811-828, 2017,
Static and Dynamic Frequency Scaling on Multicore CPUs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bao, Wenlei; Hong, Changwan; Chunduri, Sudheer
2016-12-28
Dynamic voltage and frequency scaling (DVFS) adapts CPU power consumption by modifying a processor’s operating frequency (and the associated voltage). Typical approaches employing DVFS involve default strategies such as running at the lowest or the highest frequency, or observing the CPU’s runtime behavior and dynamically adapting the voltage/frequency configuration based on CPU usage. In this paper, we argue that many previous approaches suffer from inherent limitations, such as not accounting for the processor-specific impact of frequency changes on energy for different workload types. We first propose a lightweight runtime-based approach to automatically adapt the frequency based on the CPU workload that is agnostic of the processor characteristics. We then show that further improvements can be achieved for affine kernels in the application, using a compile-time characterization instead of run-time monitoring to select the frequency and number of CPU cores to use. Our framework relies on a one-time energy characterization of CPU-specific DVFS profiles followed by a compile-time categorization of loop-based code segments in the application. These are combined to determine a priori the frequency and the number of cores to use to execute the application so as to optimize energy or energy-delay product, outperforming the runtime approach. Extensive evaluation on 60 benchmarks and five multi-core CPUs shows that our approach systematically outperforms the powersave Linux governor, while improving overall performance.
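The compile-time selection idea can be illustrated with a small sketch: given a one-time, CPU-specific characterization table recording energy and execution time per kernel category at each frequency/core-count setting, pick the configuration that minimizes energy or energy-delay product before the application runs. The table values below are invented for illustration, not measurements from the paper.

```python
# Characterization table for one kernel category:
# (frequency_GHz, cores) -> (energy_J, time_s)
profile = {
    (1.2, 2): (10.0, 4.0),
    (1.2, 4): (12.0, 2.5),
    (2.4, 2): (14.0, 2.2),
    (2.4, 4): (18.0, 1.4),
}

def select_config(profile, metric="edp"):
    """Choose the (frequency, cores) setting minimizing either raw energy
    or energy-delay product (EDP = energy * time), a priori."""
    def score(item):
        energy, time = item[1]
        return energy if metric == "energy" else energy * time
    return min(profile.items(), key=score)[0]

print(select_config(profile))            # EDP favors a faster setting
print(select_config(profile, "energy"))  # raw energy favors a slower one
```

The two metrics pull in opposite directions, which is why the paper optimizes them separately: the lowest-energy setting here is the slowest one, while EDP rewards finishing sooner at modestly higher energy.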
Wong, Wang I; Pasterski, Vickie; Hindmarsh, Peter C; Geffner, Mitchell E; Hines, Melissa
2013-04-01
Influences of prenatal androgen exposure on human sex-typical behavior have been established largely through studies of individuals with congenital adrenal hyperplasia (CAH). However, evidence that addresses the potential confounding influence of parental socialization is limited. Parental socialization and its relationship to sex-typical toy play and spatial ability were investigated in two samples involving 137 individuals with CAH and 107 healthy controls. Females with CAH showed more boy-typical toy play and better targeting performance than control females, but did not differ in mental rotations performance. Males with CAH showed worse mental rotations performance than control males, but did not differ in sex-typical toy play or targeting. Reported parental encouragement of girl-typical toy play correlated with girl-typical toy play in all four groups. Moreover, parents reported encouraging less girl-typical, and more boy-typical, toy play in females with CAH than in control females and this reported encouragement partially mediated the relationship between CAH status and sex-typical toy play. Other evidence suggests that the reported parental encouragement of sex-atypical toy play in girls with CAH may be a response to the girls' preferences for boys' toys. Nevertheless, this encouragement could further increase boy-typical behavior in girls with CAH. In contrast to the results for toy play, we found no differential parental socialization for spatial activities and little evidence linking parental socialization to spatial ability. Overall, evidence suggests that prenatal androgen exposure and parental socialization both contribute to sex-typical toy play.
Click It or Ticket Evaluation, 2010
DOT National Transportation Integrated Search
2013-05-01
The 2010 Click It or Ticket (CIOT) mobilization followed a typical selective traffic enforcement program (TEP) sequence, involving paid media, earned media, and enforcement. A nationally representative telephone survey indicated that the mobilization wa...
SMARTE'S SITE CHARACTERIZATION TOOL
Site Characterization involves collecting environmental data to evaluate the nature and extent of contamination. Environmental data could consist of chemical analyses of soil, sediment, water or air samples. Typically site characterization data are statistically evaluated for thr...
RFID Reader Antenna with Multi-Linear Polarization Diversity
NASA Technical Reports Server (NTRS)
Fink, Patrick; Lin, Greg; Ngo, Phong; Kennedy, Timothy; Rodriguez, Danny; Chu, Andrew; Broyan, James; Schmalholz, Donald
2018-01-01
This paper describes an RFID reader antenna that offers reduced polarization loss compared to that typically associated with reader-tag communications involving arbitrary relative orientation of the reader antenna and the tag.
The White Adolescent's Drug Odyssey.
ERIC Educational Resources Information Center
Lipton, Douglas S.; Marel, Rozanne
1980-01-01
Presents a "typical" case history of a White middle-class teenager who becomes involved with marihuana and subsequently begins to abuse other drugs. Sociological findings from other research are interspersed in the anecdotal account. (GC)
... through the stages than do adults. Stages are: Experimental use. Typically involves peers, done for recreational use; ... Hostility when confronted about drug dependence; lack of control; ... secretive behavior to hide drug use; using drugs even when alone.
Brownfields Environmental Insurance and Risk Management Tools Glossary of Terms
This document provides a list of terms that are typically used by the environmental insurance industry, transactional specialists, and other parties involved in using environmental insurance or risk management tools.
1972-08-01
of public health hazards and may alter reuse approaches to de-emphasize the fertilizer uses of these sludges because of the heavy metals involved...materials are removed with organic sludges, or lime sludges where that process is used. Toxic solids would typically include phenols and heavy metals, 80 percent and 40 percent respectively being removable with the organic sludges.
Solomon, Olga; Heritage, John; Yin, Larry; Marynard, Douglas; Bauman, Margaret
2015-01-01
Conversation and discourse analyses were used to examine medical problem presentation in pediatric care. Healthcare visits involving children with ASD and typically developing children were analyzed. We examined how children’s communicative and epistemic capabilities and their opportunities to be socialized into a competent patient role are interactionally achieved. We found that medical problem presentation is designed to contain a ‘pre-visit’ account of the interactional and epistemic work that children and caregivers carry out at home to identify the child’s health problems; and that the intersubjective accessibility of children’s experiences that becomes disrupted by ASD presents a dilemma to all participants in the visit. The article examines interactional roots of unmet healthcare needs and foregone medical care of people with ASD. PMID:26463739
Magnetic Reconnection in Different Environments: Similarities and Differences
NASA Technical Reports Server (NTRS)
Hesse, Michael; Aunai, Nicolas; Kuznetsova, Masha; Zenitani, Seiji; Birn, Joachim
2014-01-01
Depending on the specific situation, magnetic reconnection may involve symmetric or asymmetric inflow regions. Asymmetric reconnection applies, for example, to reconnection at the Earth's magnetopause, whereas reconnection in the nightside magnetotail tends to involve more symmetric geometries. A combination of review and new results pertaining to magnetic reconnection is being presented. The focus is on three aspects: A basic, MHD-based, analysis of the role magnetic reconnection plays in the transport of energy, followed by an analysis of a kinetic model of time dependent reconnection in a symmetric current sheet, similar to what is typically being encountered in the magnetotail of the Earth. The third element is a review of recent results pertaining to the orientation of the reconnection line in asymmetric geometries, which are typical for the magnetopause of the Earth, as well as likely to occur at other planets.
NASA Astrophysics Data System (ADS)
Hark, Richard R.; East, Lucille J.
Forensic science is broadly defined as the application of science to matters of the law. Practitioners typically use multidisciplinary scientific techniques for the analysis of physical evidence in an attempt to establish or exclude an association between a suspect and the scene of a crime.
Guidelines to Facilitate the Evaluation of Brines for Winter Roadway Maintenance Operations.
DOT National Transportation Integrated Search
2017-09-19
This document presents guidelines to facilitate the evaluation of brines for winter weather roadway maintenance applications in Texas. Brines are used in anti-icing applications which typically consist of placing liquid snow and ice control chemicals...
The BioLexicon: a large-scale terminological resource for biomedical text mining
2011-01-01
Background Due to the rapidly expanding body of biomedical literature, biologists require increasingly sophisticated and efficient systems to help them to search for relevant information. Such systems should account for the multiple written variants used to represent biomedical concepts, and allow the user to search for specific pieces of knowledge (or events) involving these concepts, e.g., protein-protein interactions. Such functionality requires access to detailed information about words used in the biomedical literature. Existing databases and ontologies often have a specific focus and are oriented towards human use. Consequently, biological knowledge is dispersed amongst many resources, which often do not attempt to account for the large and frequently changing set of variants that appear in the literature. Additionally, such resources typically do not provide information about how terms relate to each other in texts to describe events. Results This article provides an overview of the design, construction and evaluation of a large-scale lexical and conceptual resource for the biomedical domain, the BioLexicon. The resource can be exploited by text mining tools at several levels, e.g., part-of-speech tagging, recognition of biomedical entities, and the extraction of events in which they are involved. As such, the BioLexicon must account for real usage of words in biomedical texts. In particular, the BioLexicon gathers together different types of terms from several existing data resources into a single, unified repository, and augments them with new term variants automatically extracted from biomedical literature. Extraction of events is facilitated through the inclusion of biologically pertinent verbs (around which events are typically organized) together with information about typical patterns of grammatical and semantic behaviour, which are acquired from domain-specific texts. 
In order to foster interoperability, the BioLexicon is modelled using the Lexical Markup Framework, an ISO standard. Conclusions The BioLexicon contains over 2.2 M lexical entries and over 1.8 M terminological variants, as well as over 3.3 M semantic relations, including over 2 M synonymy relations. Its exploitation can benefit both application developers and users. We demonstrate some such benefits by describing integration of the resource into a number of different tools, and evaluating improvements in performance that this can bring. PMID:21992002
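The core service such a resource provides to text mining pipelines can be shown with a toy example: many written variants of a biomedical term resolve to one canonical concept entry. The entries below are invented examples for illustration, not actual BioLexicon records or its API.

```python
# Toy variant-to-concept lookup of the kind a terminological resource
# like the BioLexicon enables for entity recognition.
lexicon = {
    "il-2": "interleukin-2",
    "il2": "interleukin-2",
    "interleukin 2": "interleukin-2",
    "p53": "tumor protein p53",
    "tp53": "tumor protein p53",
}

def normalize(term):
    """Map a surface variant to its canonical concept, or None if unknown."""
    return lexicon.get(term.lower().strip(), None)

print(normalize("IL-2"))   # both variants below resolve to the same entry
print(normalize("il2"))
```

A real resource additionally attaches grammatical and semantic frames to event-bearing verbs (e.g., "activates", "phosphorylates"), which is what supports event extraction beyond simple entity lookup.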
The BioLexicon: a large-scale terminological resource for biomedical text mining.
Thompson, Paul; McNaught, John; Montemagni, Simonetta; Calzolari, Nicoletta; del Gratta, Riccardo; Lee, Vivian; Marchi, Simone; Monachini, Monica; Pezik, Piotr; Quochi, Valeria; Rupp, C J; Sasaki, Yutaka; Venturi, Giulia; Rebholz-Schuhmann, Dietrich; Ananiadou, Sophia
2011-10-12
Due to the rapidly expanding body of biomedical literature, biologists require increasingly sophisticated and efficient systems to help them to search for relevant information. Such systems should account for the multiple written variants used to represent biomedical concepts, and allow the user to search for specific pieces of knowledge (or events) involving these concepts, e.g., protein-protein interactions. Such functionality requires access to detailed information about words used in the biomedical literature. Existing databases and ontologies often have a specific focus and are oriented towards human use. Consequently, biological knowledge is dispersed amongst many resources, which often do not attempt to account for the large and frequently changing set of variants that appear in the literature. Additionally, such resources typically do not provide information about how terms relate to each other in texts to describe events. This article provides an overview of the design, construction and evaluation of a large-scale lexical and conceptual resource for the biomedical domain, the BioLexicon. The resource can be exploited by text mining tools at several levels, e.g., part-of-speech tagging, recognition of biomedical entities, and the extraction of events in which they are involved. As such, the BioLexicon must account for real usage of words in biomedical texts. In particular, the BioLexicon gathers together different types of terms from several existing data resources into a single, unified repository, and augments them with new term variants automatically extracted from biomedical literature. Extraction of events is facilitated through the inclusion of biologically pertinent verbs (around which events are typically organized) together with information about typical patterns of grammatical and semantic behaviour, which are acquired from domain-specific texts. 
In order to foster interoperability, the BioLexicon is modelled using the Lexical Markup Framework, an ISO standard. The BioLexicon contains over 2.2 M lexical entries and over 1.8 M terminological variants, as well as over 3.3 M semantic relations, including over 2 M synonymy relations. Its exploitation can benefit both application developers and users. We demonstrate some such benefits by describing integration of the resource into a number of different tools, and evaluating improvements in performance that this can bring.
Springer-Wanner, C; Brauns, T
2017-06-01
Ocular manifestation of sarcoidosis occurs in up to 60% of patients with confirmed systemic sarcoidosis and represents one of the most common forms of noninfectious uveitis. In known pulmonary sarcoidosis, ocular involvement can occur in up to 80% of cases. Sarcoidosis can also present only in the eye, without a systemic manifestation (ocular sarcoidosis). Typically, ocular sarcoidosis shows bilateral granulomatous uveitis and can involve all parts of the eye. Apart from an acute anterior uveitis, chronic intermediate or posterior uveitis can be found. In order to prevent a severe reduction of visual acuity leading to blindness, early diagnosis and treatment are essential. For diagnosis, specific clinical signs involving the eye (bilateral granulomatous changes in all parts of the eye) and typical laboratory and imaging investigations (angiotensin-converting enzyme, ACE; lysozyme; soluble interleukin 2 receptor, sIL2R; chest X‑ray; chest CT) have to be taken into account, since biopsy to prove noncaseating granulomas is not performed with changes restricted to the eye due to the high risk of vision loss. Ocular sarcoidosis mostly responds well to local or systemic steroid treatment. If the therapeutic effect is insufficient, immunosuppressive agents and biologics can be applied.
Mullerian papilloma-like proliferation arising in cystic pelvic endosalpingiosis.
McCluggage, W Glenn; O'Rourke, Declan; McElhenney, Clodagh; Crooks, Michael
2002-09-01
This report describes an unusual epithelial proliferation occurring in pelvic cystic endosalpingiosis. A cystic mass lined by a layer of ciliated epithelial cells involved the posterior surface of the cervix and vagina. The epithelial proliferation within the wall resembled a mullerian papilloma with fibrous and fibrovascular cores lined by bland cuboidal epithelial cells. Other areas had a microglandular growth pattern resembling cervical microglandular hyperplasia, and focally there was a solid growth pattern. Foci of typical endosalpingiosis involved the surface of both ovaries and pelvic soft tissues. The cystic lesion recurred after partial cystectomy and drainage and was followed up radiologically and with periodic fine-needle aspiration. Part of the wall of the cyst removed 11 years after the original surgery showed an identical epithelial proliferation. MIB1 staining showed a proliferation index of less than 5%, contrasting with the higher proliferation index of a typical serous borderline tumor. The differential diagnosis is discussed. As far as we are aware, this is the first report of such a benign epithelial proliferation involving cystic endosalpingiosis. Copyright 2002, Elsevier Science (USA). All rights reserved.
ERIC Educational Resources Information Center
Forrest, Melanie D.
This curriculum guide is intended for Missouri teachers teaching a course in database applications for high school students enrolled in marketing and cooperative education. The curriculum presented includes learning activities in which students are taught to analyze database tables containing the types of data typically encountered by employees…
Industrial Applications of Low Temperature Plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bardsley, J N
2001-03-15
The use of low temperature plasmas in industry is illustrated by the discussion of four applications, to lighting, displays, semiconductor manufacturing and pollution control. The type of plasma required for each application is described and typical materials are identified. The need to understand radical formation, ionization and metastable excitation within the discharge and the importance of surface reactions are stressed.
Applicability of Domain-Specific Application Framework for End-User Development
ERIC Educational Resources Information Center
Chusho, Takeshi
2016-01-01
It is preferable for business professionals to develop web applications which must be modified frequently based on their needs. A website for matching is a typical example because various matching websites for C2C (Consumer to Consumer) have recently been opened in relation to the "sharing economy". In our case studies on end-user…
Recall and response time norms for English-Swahili word pairs and facts about Kenya.
Bangert, Ashley S; Heydarian, Nazanin M
2017-02-01
In the vast literature exploring learning, many studies have used paired-associate stimuli, despite the fact that real-world learning involves many different types of information. One of the most popular materials used in studies of learning has been a set of Swahili-English word pairs for which Nelson and Dunlosky (Memory, 2, 325-335, 1994) published recall norms two decades ago. These norms involved use of the Swahili words as cues to facilitate recall of the English translation. It is unclear whether cueing in the opposite direction (from English to Swahili) would lead to symmetric recall performance. Bilingual research has suggested that translation in these two different directions involves asymmetric links that may differentially impact recall performance, depending on which language is used as the cue (Kroll & Stewart, Journal of Memory and Language, 33, 149-174, 1994). Moreover, the norms for these and many other learning stimuli have typically been gathered from college students. In the present study, we report recall accuracy and response time norms for Swahili words when they are cued by their English translations. We also report norms for a companion set of fact stimuli that may be used along with the Swahili-English word pairs to assess learning on a broader scale across different stimulus materials. Data were collected using Amazon's Mechanical Turk to establish a sample that was diverse in both age and ethnicity. These different, but related, stimulus sets will be applicable to studies of learning, metacognition, and memory in diverse samples.
Zero in the brain: A voxel-based lesion-symptom mapping study in right hemisphere damaged patients.
Benavides-Varela, Silvia; Passarini, Laura; Butterworth, Brian; Rolma, Giuseppe; Burgio, Francesca; Pitteri, Marco; Meneghello, Francesca; Shallice, Tim; Semenza, Carlo
2016-04-01
Transcoding numerals containing zero is more problematic than transcoding numbers formed by non-zero digits. However, it is currently unknown whether this is due to zeros requiring brain areas other than those traditionally associated with number representation. Here we hypothesize that transcoding zeros entails visuo-spatial and integrative processes typically associated with the right hemisphere. The investigation involved 22 right-brain-damaged patients and 20 healthy controls who completed tests of reading and writing Arabic numbers. As expected, the most significant deficit among patients involved a failure to cope with zeros. Moreover, a voxel-based lesion-symptom mapping (VLSM) analysis showed that the most common zero-errors were maximally associated to the right insula which was previously related to sensorimotor integration, attention, and response selection, yet for the first time linked to transcoding processes. Error categories involving other digits corresponded to the so-called Neglect errors, which however, constituted only about 10% of the total reading and 3% of the writing mistakes made by the patients. We argue that damage to the right hemisphere impairs the mechanism of parsing, and the ability to set-up empty-slot structures required for processing zeros in complex numbers; moreover, we suggest that the brain areas located in proximity to the right insula play a role in the integration of the information resulting from the temporary application of transcoding procedures. Copyright © 2016 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Marsh, Herbert W.; Scalas, L. Francesca; Nagengast, Benjamin
2010-01-01
Self-esteem, typically measured by the Rosenberg Self-Esteem Scale (RSE), is one of the most widely studied constructs in psychology. Nevertheless, there is broad agreement that a simple unidimensional factor model, consistent with the original design and typical application in applied research, does not provide an adequate explanation of RSE…
PATHOGEN EQUIVALENCY COMMITTEE UPDATE: PFRP EQUIVALENCY DETERMINATIONS
This presentation will:
Review the mandate of the Pathogen Equivalency Committee
Review the PEC's current membership of 10
Discuss how a typical application is evaluated
Note where information can be found
List present deliberations/applications and describe t...
First Look: TRADEMARKSCAN Database.
ERIC Educational Resources Information Center
Fernald, Anne Conway; Davidson, Alan B.
1984-01-01
Describes database produced by Thomson and Thomson and available on Dialog which contains over 700,000 records representing all active federal trademark registrations and applications for registrations filed in United States Patent and Trademark Office. A typical record, special features, database applications, learning to use TRADEMARKSCAN, and…
Advances In High Temperature (Viscoelastoplastic) Material Modeling for Thermal Structural Analysis
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Saleeb, Atef F.
2005-01-01
Typical high temperature applications demand high performance materials: 1) complex thermomechanical loading; 2) complex material response requiring time-dependent/hereditary (viscoelastic/viscoplastic) models; and 3) comprehensive characterization (tensile, creep, relaxation) for a variety of material systems.
Application of laser anemometry in turbine engine research
NASA Technical Reports Server (NTRS)
Seasholtz, R. G.
1983-01-01
The application of laser anemometry to the study of flow fields in turbine engine components is reviewed. Included are discussions of optical configurations, seeding requirements, electronic signal processing, and data processing. Some typical results are presented along with a discussion of ongoing work.
Application of laser anemometry in turbine engine research
NASA Technical Reports Server (NTRS)
Seasholtz, R. G.
1982-01-01
The application of laser anemometry to the study of flow fields in turbine engine components is reviewed. Included are discussions of optical configurations, seeding requirements, electronic signal processing, and data processing. Some typical results are presented along with a discussion of ongoing work.
Environmental considerations for application of high Tc superconductors in space
NASA Technical Reports Server (NTRS)
Carlberg, I. A.; Kelliher, W. C.; Wise, S. A.; Hooker, M. W.; Buckley, J. D.
1993-01-01
The impact of the environmental factors on the performance of the superconductive devices during spaceflight missions is reviewed. Specific factors typical of spaceflight are addressed to evaluate superconductive devices for space-based applications including preflight storage, radiation, vibration, and thermal cycling.
ERIC Educational Resources Information Center
Lund, Alana; Roemmele, Christopher; Roetker, Lisa; Smith, Steven
2018-01-01
The study of earthquakes can help students build connections between theoretical analysis and real-world applications. However, units on earthquakes typically struggle to bridge that gap between theory and application. Traditional class activities focus on measuring earthquakes, such as triangulating epicenters by analyzing P and S wave arrival…
Full-Body CT Scans - What You Need to Know
... new service for health-conscious people: "Whole-body CT screening." This typically involves scanning the body from ...
Coherence, Charging, and Spin Effects in Quantum Dots and Point Contacts
2001-12-01
requires changing the direction of the external field. Considering the typical fields involved (several tesla) and the high-inductance superconducting ...
Genetics Home Reference: recombinant 8 syndrome
... with a change in chromosome 8 called an inversion. An inversion involves the breakage of a chromosome in two ... typically not lost as a result of this inversion in chromosome 8, so people usually do not ...
Planetary Image Geometry Library
NASA Technical Reports Server (NTRS)
Deen, Robert C.; Pariser, Oleg
2010-01-01
The Planetary Image Geometry (PIG) library is a multi-mission library used for projecting images (EDRs, or Experiment Data Records) and managing their geometry for in-situ missions. A collection of models describes cameras and their articulation, allowing application programs such as mosaickers, terrain generators, and pointing correction tools to be written in a multi-mission manner, without any knowledge of parameters specific to the supported missions. Camera model objects allow transformation of image coordinates to and from view vectors in XYZ space. Pointing models, specific to each mission, describe how to orient the camera models based on telemetry or other information. Surface models describe the surface in general terms. Coordinate system objects manage the various coordinate systems involved in most missions. File objects manage access to metadata (labels, including telemetry information) in the input EDRs and RDRs (Reduced Data Records). Label models manage metadata information in output files. Site objects keep track of different locations where the spacecraft might be at a given time. Radiometry models allow correction of radiometry for an image. Mission objects contain basic mission parameters. Pointing adjustment ("nav") files allow pointing to be corrected. The object-oriented structure (C++) makes it easy to subclass just the pieces of the library that are truly mission-specific. Typically, this involves just the pointing model and coordinate systems, and parts of the file model. Once the library was developed (initially for Mars Polar Lander, MPL), adding new missions ranged from two days to a few months, resulting in significant cost savings as compared to rewriting all the application programs for each mission. Currently supported missions include Mars Pathfinder (MPF), MPL, Mars Exploration Rover (MER), Phoenix, and Mars Science Lab (MSL). Applications based on this library create the majority of operational image RDRs for those missions. 
A Java wrapper around the library allows parts of it to be used from Java code (via a native JNI interface). Future conversions of all or part of the library to Java are contemplated.
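The layering the abstract describes, mission-neutral application code over per-mission subclasses, can be sketched as follows. The sketch is in Python for brevity (PIG itself is C++), and every class, method, and telemetry key below is hypothetical, not the actual PIG API.

```python
# Hypothetical sketch of PIG-style layering: applications such as
# mosaickers are written once against mission-neutral interfaces,
# and adding a mission means subclassing only the pointing model.

class CameraModel:
    """Mission-neutral camera: maps image coordinates to view vectors."""
    def __init__(self):
        self.orientation = (0.0, 0.0)   # (azimuth, elevation), degrees
    def image_to_vector(self, line, sample):
        # Placeholder mapping; a real model encodes the camera geometry.
        return (sample, line, 1.0)

class PointingModel:
    """Mission-neutral interface for orienting a camera from telemetry."""
    def point_camera(self, camera, telemetry):
        raise NotImplementedError

class MslPointingModel(PointingModel):
    """Only this subclass knows the (hypothetical) MSL telemetry keys."""
    def point_camera(self, camera, telemetry):
        camera.orientation = (telemetry["az_deg"], telemetry["el_deg"])

def build_mosaic(cameras, pointing, telemetry):
    """Application code: touches only the neutral interfaces above."""
    for cam in cameras:
        pointing.point_camera(cam, telemetry)
    return [cam.image_to_vector(0.0, 0.0) for cam in cameras]
```

Because `build_mosaic` never names a mission, it runs unchanged when a new `PointingModel` subclass is added, which is the cost saving the abstract reports.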
High efficiency digital cooler electronics for aerospace applications
NASA Astrophysics Data System (ADS)
Kirkconnell, C. S.; Luong, T. T.; Shaw, L. S.; Murphy, J. B.; Moody, E. A.; Lisiecki, A. L.; Ellis, M. J.
2014-06-01
Closed-cycle cryogenic refrigerators, or cryocoolers, are an enabling technology for a wide range of aerospace applications, mostly related to infrared (IR) sensors. While the industry focus has tended to be on the mechanical cryocooler thermo-mechanical unit (TMU) alone, implementation on a platform necessarily consists of the combination of the TMU and a mating set of command and control electronics. For some applications the cryocooler electronics (CCE) are technologically simple and low-cost relative to the TMU, but this is not always the case. The relative cost and complexity of the CCE for a space-borne application can easily exceed that of the TMU, primarily due to the technical constraints and cost impacts introduced by the typical space radiation hardness and reliability requirements. High-end tactical IR sensor applications also challenge the state of the art in cryocooler electronics, such as applications for which the temperature setpoint and drive frequency must be adjustable, or for which an informative telemetry set must be supported. Generally speaking, for both space and tactical applications, it is often the CCE that limits the rated lifetime and reliability of the cryocooler system. A family of high-end digital cryocooler electronics has been developed to address these needs. These electronics are readily scalable from 10 W to 500 W output capacity; experimental performance data for nominally 25 W and 100 W variants are presented. The combination of an FPGA-based controller and a dual H-bridge motor drive architecture yields high efficiency (>92% typical) and precision temperature control (+/- 30 mK typical) for a wide range of Stirling-class mechanical cryocooler types and vendors. This paper focuses on recent testing with the AIM INFRAROT-MODULE GmbH (AIM) SX030 and AIM SF100 cryocoolers.
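The precision temperature control cited above is, generically, a feedback loop that trims compressor drive against the cold-tip error. A toy proportional-integral (PI) loop with anti-windup illustrates the idea; the plant model and gains are assumptions made for this sketch, not parameters of the AIM coolers or of the actual controller firmware.

```python
# Illustrative PI temperature-control loop of the kind cryocooler
# control electronics implement: adjust a normalized compressor drive
# to hold a cold-tip setpoint. Plant and gains are assumed values.

def run_loop(setpoint_k=77.0, steps=2000, dt=0.1, kp=5.0, ki=0.8):
    temp = 90.0        # initial cold-tip temperature, K
    integral = 0.0
    for _ in range(steps):
        err = temp - setpoint_k
        u = kp * err + ki * integral
        drive = min(1.0, max(0.0, u))   # drive saturates at 0..1
        if u == drive:                  # anti-windup: only integrate
            integral += err * dt        # while the drive is unsaturated
        # Toy first-order plant: 0.2 K/s parasitic heat load versus
        # up to 1.0 K/s of refrigeration lift at full drive.
        temp += dt * (0.2 - 1.0 * drive)
    return temp

print(round(run_loop(), 3))  # settles at the 77 K setpoint
```

The anti-windup guard matters: without it, the integrator charges up during the initial full-drive cooldown and the loop overshoots the setpoint.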
NASA Astrophysics Data System (ADS)
Allani, Mouna; Garbinato, Benoît; Pedone, Fernando
An increasing number of peer-to-peer (P2P) Internet applications today rely on data dissemination as their cornerstone, e.g., audio or video streaming and multi-party games. These applications typically depend on some support for multicast communication, where peers interested in a given data stream can join a corresponding multicast group. As a consequence, the efficiency, scalability, and reliability guarantees of these applications are tightly coupled with those of the underlying multicast mechanism.
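The multicast-group abstraction these applications depend on can be sketched as a join/publish interface; this models only the abstraction itself, not the overlay protocol of the work described.

```python
# Minimal sketch of group-based dissemination: peers subscribe to a
# multicast group, and each published chunk reaches every current
# member. Group and peer names are purely illustrative.

from collections import defaultdict

class MulticastBus:
    def __init__(self):
        self.groups = defaultdict(set)   # group name -> member peer ids
        self.inbox = defaultdict(list)   # peer id -> delivered chunks

    def join(self, peer, group):
        self.groups[group].add(peer)

    def leave(self, peer, group):
        self.groups[group].discard(peer)

    def publish(self, group, chunk):
        for peer in self.groups[group]:
            self.inbox[peer].append(chunk)

bus = MulticastBus()
bus.join("p1", "video/stream-42")
bus.join("p2", "video/stream-42")
bus.publish("video/stream-42", b"frame-0")
print(bus.inbox["p1"])  # [b'frame-0']
```

In a real P2P system the `publish` fan-out is replaced by routing over an overlay of peers, which is exactly where the efficiency, scalability, and reliability trade-offs arise.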
Export Control Guide: Loose Parts Monitoring Systems for Nuclear Power Plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langenberg, Donald W.
2012-12-01
This report describes a typical loose parts monitoring system (LPMS), emphasizing its application to the reactor coolant system (RCS) of a modern nuclear power plant (NPP). The report also examines the versatility of acoustic emission (AE) monitoring technology by describing several nuclear applications other than loose parts monitoring, as well as some non-nuclear applications. In addition, LPMS implementation requirements are outlined, and LPMS suppliers are identified. Finally, U.S. export controls applicable to LPMSs are discussed.
Treatment of category generation and retrieval in aphasia: Effect of typicality of category items.
Kiran, Swathi; Sandberg, Chaleece; Sebastian, Rajani
2011-01-01
Purpose: Kiran and colleagues (Kiran, 2007, 2008; Kiran & Johnson, 2008; Kiran & Thompson, 2003) have previously suggested that training atypical examples within a semantic category is a more efficient treatment approach to facilitating generalization within the category than training typical examples. The present study extended our previous work by examining the notion of semantic complexity within goal-derived (ad hoc) categories in individuals with aphasia. Methods: Six individuals with fluent aphasia (range = 39-84 years) and varying degrees of naming deficits and semantic impairments were involved. Thirty typical and atypical items each from two categories were selected after an extensive stimulus norming task. Generative naming for the two categories was tested during baseline and treatment. Results: As predicted, training atypical examples in the category resulted in generalization to untrained typical examples in five out of the five patient-treatment conditions. In contrast, training typical examples (which was examined in three conditions) produced mixed results. One patient showed generalization to untrained atypical examples, whereas two patients did not. Conclusions: Results of the present study supplement our existing data on the effect of a semantically based treatment for lexical retrieval by manipulating the typicality of category exemplars. PMID:21173393
NASA Astrophysics Data System (ADS)
Zhu, Ronghua
An n-channel power vertical double-diffused metal-oxide-silicon field-effect transistor (VDMOSFET) with a new atomic-lattice layout (ALL) has been designed and fabricated. The performance of the VDMOSFET with the ALL has been studied experimentally and comprehensively for the first time. The experimental results with the ALL are compared with the square (SQ), hexagonal (HEX) and stripe (STR) layouts for different applications. For high-frequency applications of the VDMOSFET, the ALL is superior to the HEX and inferior to the STR. The optimum specific on-resistance and input capacitance product (R_ON,SP × C_iss,SP) and optimum specific on-resistance and output capacitance product (R_ON,SP × C_oss,SP) for the ALL are 44% and 36% lower than for the HEX, and 10% and 13% higher than for the STR, respectively. The ALL offers superior performance compared to the SQ for applications involving smart power feedback control using an integrated current sensor. For a typical sense resistance of 100 Ω, the sense current drops by 44% from its value at 0 Ω for the SQ, but by only 11% for the ALL. For high-voltage and high-current applications, such as a voltage-controlled current source, the ALL enters the quasi-saturation region at a lower gate voltage (V_G). Typically, quasi-saturation occurs at a V_G of 3 V above the threshold voltage (V_T) for the ALL, whereas this voltage is 5 and 6 V for the STR and HEX, respectively. Minority-carrier lifetime control by proton implantation has been successfully employed, for the first time, to improve the switching performance of the VDMOSFET built-in diode. A sevenfold reduction in reverse recovery charge has been achieved with a proton energy of 2.5 MeV and a dose of 3 × 10^11/cm^2. The impact of proton implantation on the diode forward voltage and on VDMOSFET characteristics such as V_T, leakage current, and on-resistance has been found to be negligible.
Proton implantation has also been found to significantly improve device ruggedness. The peak reverse current of the built-in diode is reduced to 17.6 A for a proton energy of 1.5 MeV, compared to 29.1 A for an un-implanted device at di/dt = 450 A/μs. The optimum location of the proton implant has been found to be at approximately the middle of the epi-layer.
Effects of Process Parameters on Ultrasonic Micro-Hole Drilling in Glass and Ruby
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schorderet, Alain; Deghilage, Emmanuel; Agbeviade, Kossi
2011-05-04
Brittle materials such as ceramics, glasses and oxide single crystals find increasing applications in advanced micro-engineering products. Machining small features in such materials represents a manufacturing challenge. Ultrasonic drilling constitutes a promising technique for realizing simple micro-holes of high depth-to-diameter ratio. The process involves abrasive particles, suspended in a liquid slurry between tool and workpiece, impacting the workpiece surface. Among the process performance criteria, the drilling time (productivity) is one of the most important quantities for evaluating the suitability of the process for industrial applications. This paper summarizes recent results pertaining to the ultrasonic micro-drilling process obtained with a semi-industrial 3-axis machine. The workpiece is vibrated at a 40 kHz frequency with an amplitude of several micrometers. A voice-coil actuator and a control loop based on the drilling force impose the tool feed. In addition, the tool is rotated at a prescribed speed to improve the drilling speed as well as the hole geometry. Typically, a WC wire serves as the tool to bore 200 μm diameter micro-holes of 300 to 1,000 μm depth in glass and ruby. The abrasive slurry contains B4C particles of 1 μm to 5 μm diameter in various concentrations. This paper discusses, on the basis of the experimental results, the influence of several parameters on the drilling time. First, the results show that the control strategy based on the drilling force allows higher feed rates to be reached while avoiding tool breakage. Typically, an 8 μm/s feed rate is achieved with glass and 0.9 μm/s with ruby. Tool rotation, even at values as low as 50 rpm, increases productivity and improves hole geometry. Drilling with 1 μm and 5 μm B4C particles yields similar productivity results. Our future research will focus on using the presented results to develop a model that can serve to optimize the process for different applications.
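The force-based feed control described above can be sketched as a loop that backs the feed off whenever the measured drilling force exceeds a limit, so the feed rate adapts instead of the tool breaking. The process model, gains, and numbers below are illustrative assumptions, not parameters of the machine in the paper.

```python
# Sketch of force-limited feed control for ultrasonic micro-drilling:
# a deliberately over-aggressive initial feed is throttled back by the
# force loop, then held near the nominal feed. All values are assumed.

def drill(depth_um=300.0, force_limit=1.0, dt=0.01):
    pos, t = 0.0, 0.0
    feed = 20.0                   # over-aggressive initial feed, um/s
    while pos < depth_um:
        force = 0.1 * feed        # toy model: force tracks feed rate
        if force > force_limit:
            feed *= 0.9           # back off before the tool breaks
        else:
            feed = min(8.0, feed * 1.02)  # creep toward nominal feed
        pos += feed * dt
        t += dt
    return t

# A 300 um hole at the ~8 um/s regulated feed takes on the order
# of 37-38 s with this toy model.
```

The point of the sketch is the control structure, not the numbers: force feedback lets the commanded feed start high and settle at whatever rate the material tolerates.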
SYNCHROTRON ORIGIN OF THE TYPICAL GRB BAND FUNCTION—A CASE STUDY OF GRB 130606B
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Bin-Bin; Briggs, Michael S.; Uhm, Z. Lucas
2016-01-10
We perform a time-resolved spectral analysis of GRB 130606B within the framework of a fast-cooling synchrotron radiation model in which the magnetic field strength in the emission region decays with time, as proposed by Uhm and Zhang. The data from all time intervals can be successfully fit by the model. The same data can be equally well fit by the empirical Band function with typical parameter values. Our results, which involve only minimal physical assumptions, offer one natural solution to the origin of the observed GRB spectra and imply that at least some, if not all, Band-like GRB spectra with typical Band parameter values can indeed be explained by synchrotron radiation.
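For reference, the empirical Band function invoked above (Band et al. 1993) is two power laws joined smoothly at a break energy. The sketch below uses generic typical parameter values, not the GRB 130606B fit results of the paper.

```python
# The empirical Band GRB photon spectrum: a low-energy power law with
# an exponential cutoff, smoothly joined to a high-energy power law at
# E_break = (alpha - beta) * E0. Demo parameters are typical values.

import math

def band(E, A=1.0, alpha=-1.0, beta=-2.3, E0=300.0):
    """Photon flux N(E) in arbitrary units; E and E0 in keV."""
    E_break = (alpha - beta) * E0
    if E < E_break:
        return A * (E / 100.0) ** alpha * math.exp(-E / E0)
    return (A * (E_break / 100.0) ** (alpha - beta)
            * math.exp(beta - alpha) * (E / 100.0) ** beta)

# For alpha > -2 the nu-F-nu spectrum peaks at E_peak = (2 + alpha) * E0,
# i.e. 300 keV for these demo parameters.
```

The prefactor on the high-energy branch is chosen so the two branches agree exactly at `E_break`, which is what makes the Band form "smoothly joined" rather than a broken power law.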